Best primary key format for a large table

I am developing an ASP.NET application with potentially large data tables, and I would like to know the best way to choose the primary key. I know this has been asked before, but since my situation is specific, I think the question is still valid.

I am using Entity Framework 4 in a SQL Server 2008 database.

What are the possibilities for the primary key, given the following constraints:

  1. There is a real possibility that over time the number of records will exceed the 32-bit boundary, so a 32-bit auto-increment integer will not be enough.
  2. The primary key cannot be derived from a combination of other columns in the table (there is no natural key).
  3. For data-synchronization reasons, an identifier generated by the application is preferable to one generated by the database. Besides, with EF a database-generated key would mean an extra round trip to the database to fetch the newly created identifier.
  4. For insert performance, a sequential key is preferable.
  5. Space requirements are a consideration.
  6. For string identifiers, case insensitivity is preferred.

What I have come up with so far is a custom algorithm that generates a datetime part plus a random part, converted to a hexadecimal string. That comes out a little shorter than a GUID. I could shorten it further with Base64, but that would go against point 6 (Base64 is case-sensitive).
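The scheme described above (timestamp part plus random part, hex-encoded) might be sketched like this; the field widths here are my own assumption, not the poster's:

```python
# Sketch of a "datetime part + random part" hex key (field widths are
# illustrative). Hex output is case-insensitive, satisfying point 6.
import secrets
from datetime import datetime, timezone

def make_key() -> str:
    # 100 ns ticks since the Unix epoch; 15 hex digits is enough headroom.
    ticks = int(datetime.now(timezone.utc).timestamp() * 10_000_000)
    time_part = format(ticks, "015x")  # zero-padded so string order = time order
    rand_part = secrets.token_hex(4)   # 32 random bits -> 8 hex digits
    return time_part + rand_part       # 23 hex characters total

print(make_key())
```

Because the timestamp is zero-padded and leads the string, keys sort in roughly insertion order, which helps point 4.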

Thanks for your suggestions.

+3

Use BIGINT (an 8-byte integer).

BIGINT works the same way as INT, only with a much larger range.
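For scale, a rough back-of-the-envelope calculation of how much headroom an 8-byte key gives (the insert rate is an assumed number, not from the question):

```python
# How long an 8-byte BIGINT identity lasts at a given insert rate.
BIGINT_MAX = 2**63 - 1          # 9,223,372,036,854,775,807
INT_MAX = 2**31 - 1             # the 32-bit boundary from the question

inserts_per_second = 100_000    # assumed sustained rate, purely illustrative
seconds_per_year = 365 * 24 * 3600
years = BIGINT_MAX / (inserts_per_second * seconds_per_year)
print(f"INT overflows after {INT_MAX:,} rows; "
      f"BIGINT lasts ~{years:,.0f} years at {inserts_per_second:,} inserts/s")
```

Even at an unrealistically high sustained rate, a BIGINT identity effectively never overflows.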

+12

Whichever type you choose, do not generate keys in the application as max(Id)+1: two concurrent clients can read the same maximum and produce duplicate keys. In .NET it is better to generate an identifier on the client that is unique by construction, so no extra trip to the database is needed.
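A toy illustration of why max(Id)+1 breaks down under concurrency, compared with client-generated GUIDs (the set of existing ids is made up for the example):

```python
# Two clients each read the current maximum before either one inserts,
# so both compute the same "next" key and collide.
import uuid

existing_ids = {1, 2, 3}

snapshot_a = max(existing_ids)   # client A reads max(Id)
snapshot_b = max(existing_ids)   # client B reads max(Id) at the same time
next_a, next_b = snapshot_a + 1, snapshot_b + 1
print("max(Id)+1 collision:", next_a == next_b)   # True: both get 4

# Client-generated GUIDs need no shared read at all:
guid_a, guid_b = uuid.uuid4(), uuid.uuid4()
print("GUID collision:", guid_a == guid_b)        # False, overwhelmingly likely
```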

+2

Use a GUID.

  • If you can overflow 32-bit keys, you will probably have to move to 64-bit keys anyway (you are unlikely to build 48-bit keys or something like that), and a 128-bit GUID then requires only twice the space.
  • Surrogate string keys feel somewhat unnatural to me, and I don't see any advantage they have over GUID keys.
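Since the question asks for keys that are both application-generated and roughly sequential, one common compromise is a COMB-style GUID whose leading bytes come from a timestamp. A minimal sketch, with a byte layout I chose for illustration (not a standard):

```python
# COMB-style sequential GUID: a 48-bit millisecond timestamp prefix
# followed by 80 random bits, so application-generated keys still
# insert in roughly increasing order.
import os
import time
import uuid

def sequential_guid() -> uuid.UUID:
    millis = int(time.time() * 1000)
    prefix = millis.to_bytes(6, "big")   # 48-bit timestamp, big-endian
    suffix = os.urandom(10)              # 80 random bits
    return uuid.UUID(bytes=prefix + suffix)
```

SQL Server 2008's NEWSEQUENTIALID() does something similar on the server side, but using it would reintroduce database-generated keys, which point 3 argues against.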

Source: https://habr.com/ru/post/1755728/

