How to process a large amount of data in a specific database table

I am working on a project in which I constantly insert rows into a table, and within a few days this table will be very large. That raised a question I cannot find an answer to: what happens when I have more rows than a "bigint" can hold in this table, given that I have an id column (which is an int)? Can my database (MySQL) handle this correctly? How do large companies cope with such problems and with joining large tables?

I do not know if there are short answers to such problems, but any pointers that help me answer this question will be welcome!

+3
6 answers

If you are worried about running out of values, change the id column to BIGINT.

Alternatively, use a UUID (a universally unique identifier) as the key instead of an auto-increment integer, although a UUID is not as compact as an integer id.
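A minimal MySQL sketch of both options (the table name events and its columns are hypothetical, not from the original answer):

    -- Widen an existing auto-increment INT id to unsigned BIGINT:
    ALTER TABLE events
        MODIFY id BIGINT UNSIGNED NOT NULL AUTO_INCREMENT;

    -- Alternative: a UUID primary key, generated in the INSERT itself.
    -- CHAR(36) keeps it readable; BINARY(16) would be more compact.
    CREATE TABLE events_uuid (
        id      CHAR(36) NOT NULL,
        payload TEXT,
        PRIMARY KEY (id)
    );
    INSERT INTO events_uuid (id, payload) VALUES (UUID(), 'example row');

Note that the ALTER TABLE rewrites the whole table, so on a table that is already huge it can take a long time and block writes.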

+1

You will hit other limits long before you run out of BIGINT values.

An unsigned BIGINT ranges from 0 to 18,446,744,073,709,551,615. Since each BIGINT takes 8 bytes, storing every possible value once would need (18,446,744,073,709,551,615 × 8) ÷ 1024^4 = 134,217,728 TB.

MySQL caps a table at 256 TB with MyISAM and 64 TB with InnoDB, so even the larger limit holds at most 256 × 1024^4 ÷ 8 ≈ 35 trillion rows, and that is if a row were nothing but the 8-byte id.

Oracle lets you use NUMBER(38) (stored in roughly 20 bytes) as a primary key, covering values from 0 up to about 1e38, but even there the maximum storage works out to roughly 4 billion blocks × 32 KB = 128 TB (with a 32 KB block size).
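If you want to sanity-check those numbers, the same back-of-the-envelope arithmetic can be run directly in MySQL (the column aliases are just labels I made up):

    -- TB needed just to store every possible unsigned BIGINT value once (8 bytes each):
    SELECT 18446744073709551615 / POW(1024, 4) * 8 AS tb_for_ids_alone;   -- ~134,217,728

    -- Rows that fit in a 256 TB MyISAM table if each row were only the 8-byte id:
    SELECT 256 * POW(1024, 4) / 8 AS max_rows_myisam;                     -- ~35 trillion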

+5

Think about how a company like Facebook handles this: do they keep everything in one SQL table at all? Many systems at that scale give up on a single relational database and move to distributed storage along the lines of Google AppEngine's datastore.

If you do stay relational, MSSQL copes with very large tables better than MySQL, and Oracle better still.

+1

You will not exhaust a BIGINT in any realistic lifetime: even inserting one row every millisecond (1,000 rows per second) adds only 31,536,000,000 rows per year.

An unsigned BIGINT tops out at 18,446,744,073,709,551,615, roughly 18 quintillion, so at that rate it would take on the order of 585 million years to overflow.
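The same estimate as a quick query, assuming a constant 1,000 inserts per second (a rate chosen only to match the answer above):

    -- Rows added per year at 1,000 inserts/second:
    SELECT 1000 * 60 * 60 * 24 * 365 AS rows_per_year;                            -- 31,536,000,000

    -- Years needed to exhaust an unsigned BIGINT at that rate:
    SELECT 18446744073709551615 / (1000 * 60 * 60 * 24 * 365.0) AS years_to_overflow;  -- ~585 million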

+1

An unsigned bigint can go up to 18,446,744,073,709,551,615, so you are not going to run out.

0

Consider DB2 or Oracle.

-1

Source: https://habr.com/ru/post/1780479/

