This question may be more appropriate for Programmers.StackExchange. If so, please migrate.
I am currently thinking about the complexity of typical data models. Everyone knows that data models should be normalized; on the other hand, a normalized data model requires quite a few joins to retrieve the data afterwards, and joins are potentially expensive operations, depending on the size of the tables involved. So the question I'm trying to figure out is: how is this trade-off usually handled? That is, in practice, how many joins do you consider acceptable in typical queries when designing a data model? This would be especially interesting where multiple joins occur in a single query.
As an example, let's say we have users who have houses, which contain rooms, which contain boxes, which contain items. It is trivial to normalize this in the sense described above, using tables for users, houses, rooms, boxes, and items, but retrieving all items belonging to a particular user would then require me to join all five tables (see the sketch below). That strikes me as a fairly heavy operation.
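For illustration, a query over that hypothetical schema might look something like this (table and column names are made up; I'm assuming each table references its parent via a foreign key):

```sql
-- Fetch every item owned by one user, walking the whole hierarchy.
SELECT items.*
FROM users
JOIN houses ON houses.user_id = users.id
JOIN rooms  ON rooms.house_id = houses.id
JOIN boxes  ON boxes.room_id  = rooms.id
JOIN items  ON items.box_id   = boxes.id
WHERE users.id = 42;  -- hypothetical user id
```

So even this simple, fully normalized model already needs four joins across five tables for one everyday query.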
Most likely, table size plays a role as well. Joining five tables that hold little data is probably not as bad as joining three tables with millions of rows each. Or is that wrong?