I am creating a database that will grow rapidly. Some tables will reach several million rows within the year. When should I start to worry about database size?
Is it even feasible to work with a table of 30 million rows? How is this usually handled?
Several million rows is not that large. With the appropriate indexes for your query workload, queries will run quickly.
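As a sketch of what "appropriate indexes for your query workload" means, here is a hypothetical example (the `Orders` table and its columns are illustrative, not from the question):

```sql
-- Hypothetical table: 30 million orders queried by customer and date.
CREATE TABLE dbo.Orders (
    OrderId     BIGINT IDENTITY PRIMARY KEY,
    CustomerId  INT NOT NULL,
    OrderDate   DATETIME2 NOT NULL,
    Total       DECIMAL(12, 2) NOT NULL
);

-- A nonclustered index whose key matches the WHERE clause lets
-- SQL Server seek directly instead of scanning all 30 million rows.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (Total);  -- covering column avoids key lookups

-- This query can now be satisfied by an index seek:
SELECT OrderDate, Total
FROM dbo.Orders
WHERE CustomerId = 42
  AND OrderDate >= '2023-01-01';
```

The key point is that the index is designed from the query, not the other way around: equality predicates first in the key, range predicates after, frequently selected columns as INCLUDEs.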
The loose rule of thumb with SQL Server is that you should start considering partitioning somewhere around the 20-30 million row mark (assuming you have SQL Server Enterprise Edition in production). But partitioning is not always the solution. See:
Partitioned tables and indexes: basic concepts
Partitioned table and index strategies using SQL Server 2008
SQL Server partitioning: not a best practice for everything
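For context, the basic shape of table partitioning in SQL Server looks like this (a minimal sketch with illustrative names, assuming partitioning by date on an Enterprise Edition instance):

```sql
-- Partition function: boundary values split rows into date ranges.
CREATE PARTITION FUNCTION pfOrdersByYear (DATETIME2)
    AS RANGE RIGHT FOR VALUES ('2022-01-01', '2023-01-01', '2024-01-01');

-- Partition scheme: maps each range to a filegroup
-- (all to PRIMARY here for simplicity).
CREATE PARTITION SCHEME psOrdersByYear
    AS PARTITION pfOrdersByYear ALL TO ([PRIMARY]);

-- The table is created on the scheme; the partitioning column
-- must be part of any unique index, including the primary key.
CREATE TABLE dbo.OrdersPartitioned (
    OrderId    BIGINT IDENTITY NOT NULL,
    OrderDate  DATETIME2 NOT NULL,
    Total      DECIMAL(12, 2) NOT NULL,
    CONSTRAINT PK_OrdersPartitioned
        PRIMARY KEY (OrderId, OrderDate)
) ON psOrdersByYear (OrderDate);
```

Queries that filter on `OrderDate` can then touch only the relevant partitions (partition elimination), and old partitions can be switched out cheaply. That benefit only materializes if your workload actually filters on the partitioning column, which is why it is not a best practice for everything.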
Properly designed tables can handle billions of rows, so don't worry 8-) I have several tables in my production project with > 1.5 billion rows each.
But yes, querying and maintaining such tables does take longer.
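As an illustration of the maintenance that grows with table size, typical routine jobs look like this (a sketch with a hypothetical `dbo.BigTable`; schedules and options vary by environment):

```sql
-- Reorganize is lightweight and always online; commonly used for
-- mildly fragmented indexes.
ALTER INDEX ALL ON dbo.BigTable REORGANIZE;

-- Rebuilds and statistics updates scale with row count, so on
-- billion-row tables they are usually scheduled off-peak.
-- ONLINE = ON requires Enterprise Edition.
ALTER INDEX ALL ON dbo.BigTable REBUILD WITH (ONLINE = ON);
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
```

On tables this size, even backups and integrity checks (`DBCC CHECKDB`) take proportionally longer, which is usually the practical cost rather than query performance itself.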
Source: https://habr.com/ru/post/1388251/