SQL primary key: integer vs varchar

Sql, Performance, Indexing

Sql Problem Overview


The team I'm working with decided to create a table with a varchar primary key. This table is referenced by another table on this primary key.

I'm in the habit of creating an integer primary key, following what I learned at university, and I've read that there is a performance boost from using an integer primary key.

The trouble is that I don't know of any other reason for creating an integer primary key. Do you have any tips?

Sql Solutions


Solution 1 - Sql

VARCHAR vs. INT doesn't tell us much on its own. What matters is the access pattern.

In absolute terms, a wider key will always be worse than a narrower key. The type itself carries no importance; it's the width that matters. That said, few types can beat INT in narrowness, so INT usually wins that argument simply by being only 4 bytes wide.
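To make the width point concrete, here is a minimal sketch (table and column names are invented purely for illustration): the INT key costs 4 bytes in every index entry and every referencing row, while a VARCHAR key repeats its full variable width in all of those places.

    -- Hypothetical tables to illustrate key width only.
    CREATE TABLE CustomerByInt (
        CustomerId INT NOT NULL PRIMARY KEY,           -- 4 bytes per key value
        Email      VARCHAR(320) NOT NULL
    );

    CREATE TABLE CustomerByVarchar (
        Email VARCHAR(320) NOT NULL PRIMARY KEY        -- up to 320 bytes (plus overhead) per key value
    );

    -- Every referencing table repeats the key width in its own rows and indexes:
    CREATE TABLE OrderByInt (
        OrderId    INT NOT NULL PRIMARY KEY,
        CustomerId INT NOT NULL REFERENCES CustomerByInt (CustomerId)
    );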

But what really matters is the choice of clustered key. Often confused with the primary key, the two represent different notions and are not required to overlap. Here is a more detailed discussion: Should I design a table with a primary key of varchar or int? The choice of the clustered key is just about the most important decision in table design, and mechanically applying an INT IDENTITY(1,1) to it may be the biggest mistake one can make. This is where the question of access patterns comes in:

  • what are the most frequent interrogations on the table?
    • what columns are projected?
    • what predicates are applied?
    • what ranges are searched?
    • what joins are performed?
    • what aggregations occur?
  • how is the data inserted into the table?
  • how is the data updated in the table?
  • how is old data purged from the table, if ever?
  • how many non-clustered indexes exist?
    • how often are the columns included in the NC indexes (key or leaf) updated?

Overall, there are many access patterns that can be ruined by using an INT IDENTITY clustered key. So before jumping to apply a cookie cutter solution, perhaps a little bit of analysis is required...
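As a sketch of how the two notions can be separated (SQL Server syntax, since the answer talks about IDENTITY and clustered keys; the table and column names are hypothetical), the surrogate INT can stay the primary key while the clustered index is chosen to serve the dominant access pattern:

    -- Keep the surrogate as the (nonclustered) primary key...
    CREATE TABLE dbo.Orders (
        OrderId    INT IDENTITY(1,1) NOT NULL,
        CustomerId INT NOT NULL,
        OrderDate  DATETIME2 NOT NULL,
        Total      DECIMAL(18,2) NOT NULL,
        CONSTRAINT PK_Orders PRIMARY KEY NONCLUSTERED (OrderId)
    );

    -- ...and cluster on the columns the frequent queries actually filter and
    -- range-scan on, e.g. "all orders for a customer in a date range":
    CREATE UNIQUE CLUSTERED INDEX CIX_Orders
        ON dbo.Orders (CustomerId, OrderDate, OrderId);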

Some more general guidelines:

You'll see there are no primary key design guidelines, because the primary key is not an issue of storage design but an issue of modeling, and is entirely domain driven.

Solution 2 - Sql

The primary key is supposed to represent the identity for the row and should not change over time.

I assume that the varchar is some sort of natural key - such as the name of the entity, an email address, or a serial number. If you use a natural key, it can sometimes happen that the key needs to change, for example because:

  • The data was incorrectly entered and needs to be fixed.
  • The user changes their name or email address.
  • The management suddenly decide that all customer reference numbers must be changed to another format for reasons that seem completely illogical to you, but they insist on making the change even after you explain the problems it will cause you.
  • Maybe even a country or state decides to change the spelling of its name - very unlikely, but not impossible.

By using a surrogate key you avoid problems caused by having to change primary keys.
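A minimal sketch of that point (SQL Server syntax; the names and values are invented): with a surrogate key, correcting the natural value is a single-row UPDATE, and nothing that references the row has to change.

    CREATE TABLE Customer (
        CustomerId INT IDENTITY(1,1) PRIMARY KEY,   -- surrogate: never changes
        Email      VARCHAR(320) NOT NULL UNIQUE     -- natural value: may need fixing
    );

    CREATE TABLE Invoice (
        InvoiceId  INT IDENTITY(1,1) PRIMARY KEY,
        CustomerId INT NOT NULL REFERENCES Customer (CustomerId)
    );

    -- Fixing a mistyped address touches one row; every Invoice row still
    -- points at the same CustomerId.
    UPDATE Customer SET Email = 'jane.doe@example.com' WHERE CustomerId = 42;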

Solution 3 - Sql

> I was a bit disappointed because I've the habit to create an integer primary key (following what some teacher told me at the university). I've read a lot of documentation on the performance boost using integer primary key.

There is a term for this: confirmation bias:

"also called confirmatory bias or myside bias) is a tendency for people to favor information that confirms their preconceptions or hypotheses, independently of whether they are true. This results in people selectively collecting new evidence, interpreting evidence in a biased way, or selectively recalling information from memory."

Of course, your first reaction will be to say, "But that's not true!" Yeah, you would say that 'cos you're biased ;) [tongue firmly embedded in cheek]

Here's a classic example: say you had been told by your zoology professor that all swans are white and, sure enough, all swans you and your friends have ever encountered are white. Now let's say later in life a colleague expressed the opinion that perhaps there is such a creature as a black swan. What?! That's not what you were taught. Your world is rocked! You immediately go out and conduct a swan survey and you count 1,000 white swans and zero black swans. Proof! If you'd found 10,000 white swans then the hypothesis 'All swans are white' would be ten times truer, right?

A different approach would be to forget about white swans for the moment and try to seek out a black swan. Perhaps take a holiday by the sea in sunny Dawlish?

I really don't mean to sound disrespectful; you admit to reading a lot about what you have been told and that indeed earns my respect. So here's a challenge: try to find cases where adding an integer column to a table is unnecessary.

Here are some hints and spoilers: tables that are not referenced by other tables; single column 'all key' lookup tables; 'small' tables that aren't queried much :)
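For instance, one of the hinted cases might look like this (a sketch with invented names, in SQL Server syntax): a single-column 'all key' lookup table, where the code itself is the key and an extra integer column would add nothing.

    -- The ISO code is the key; no integer surrogate is needed.
    CREATE TABLE CountryCode (
        IsoCode CHAR(2) NOT NULL PRIMARY KEY        -- e.g. 'FR', 'DE', 'US'
    );

    CREATE TABLE Address (
        AddressId INT IDENTITY(1,1) PRIMARY KEY,
        Street    VARCHAR(200) NOT NULL,
        IsoCode   CHAR(2) NOT NULL REFERENCES CountryCode (IsoCode)
    );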

Here are some other related topics you may like to investigate:

Does the word 'primary' in 'primary key' have much meaning or are all keys in a given table equal?

What are the qualities of a 'good' key? (e.g. should a key's values be immutable or is stability 'good' enough?)

Is an integer column added to the table as an artificial key (perhaps because the available natural key is not 'good' enough) or as a surrogate key (perhaps to boost performance of an otherwise 'good' natural key)?

When a surrogate key is added to a table on performance grounds, is this for actual measured effect or merely for perceived effect (i.e. premature optimization)?

Should surrogate keys appear in the logical business model or are they for implementation only?

Is it a good idea to always do something (e.g. add an integer column to a table) without engaging the brain each time? ;)

[Disclaimer: I'm a natural key advocate and shun surrogates. For me they are like denormalization: you only do it when you have to, usually for a performance issue (specific and demonstrable), where the fault lies elsewhere (lousy SQL product version, logical design flaw that cannot be fixed at this time, etc). Surrogates should never appear in the logical business model. I sometimes need an artificial identifier and have even exposed them in logical business models.]

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type     | Original Author | Original Content on Stackoverflow
Question         | frabiacca       | View Question on Stackoverflow
Solution 1 - Sql | Remus Rusanu    | View Answer on Stackoverflow
Solution 2 - Sql | Mark Byers      | View Answer on Stackoverflow
Solution 3 - Sql | onedaywhen      | View Answer on Stackoverflow