Is an n-tier system suitable for big data processing?

I recently joined the development team working on our flagship product. It is, first of all, a read-oriented web application (ASP.NET (C#) and Oracle) implemented as an n-tier system. Most writes to the database are performed through external services (not through the web app). Instead of scheduling ordinary batch jobs in the database for data aggregation, they pull all the data up through the tiers into the business layer (sometimes creating a hundred million objects). Although this keeps all the "business logic" in one place, it also takes about 200 times longer than running the equivalent query in the database. This seems like a terrible idea to me. Am I wrong here, and is this standard, accepted practice? Does anyone have any real research I can point my colleagues to (or that they can point me to, if I am the one who is wrong)?
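To make the contrast concrete, here is a simplified sketch (C#, with all entity and table names invented for illustration) of the in-memory pattern I am describing, next to the query the database could run instead:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // All names here are invented for illustration.
    enum OrderStatus { Pending, Completed }

    record Order(OrderStatus Status, decimal Amount);

    static class InMemoryAggregation
    {
        // Current pattern: every row is materialized as an object in the
        // business layer, and the sum runs in memory over all of them.
        public static decimal TotalCompleted(IEnumerable<Order> allOrders) =>
            allOrders.Where(o => o.Status == OrderStatus.Completed)
                     .Sum(o => o.Amount);   // touches (and allocates) every row

        // The database could answer the same question with one query:
        //   SELECT SUM(amount) FROM orders WHERE status = 'COMPLETED'
    }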

I am not asking whether n-tier in general is good or bad, only whether it is suitable for data aggregation and similar processing.

+3
1 answer

You are right about processing time (as well as resources, such as memory).

  • Best practice is to aggregate as close to the data as possible, ideally in the database; one hundred million objects in the middle tier sounds crazy (a minimal database-side sketch follows this list).
  • However, we all know that such code is harder to maintain, which means more development time and, in the end, more cost.
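A minimal sketch of the database-side version (assuming the Oracle managed ADO.NET provider; the orders table and its columns are invented):

    using System;
    using Oracle.ManagedDataAccess.Client;  // assumed Oracle ADO.NET provider

    static class DatabaseAggregation
    {
        // The database does the set-based work and ships back a single
        // value instead of one object per source row.
        public static decimal TotalCompleted(string connectionString)
        {
            using (var conn = new OracleConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                conn.Open();
                cmd.CommandText =
                    "SELECT NVL(SUM(amount), 0) FROM orders WHERE status = 'COMPLETED'";
                return Convert.ToDecimal(cmd.ExecuteScalar());
            }
        }
    }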

So this is a trade-off, and there is no single right answer.

In practice the decision depends on many factors: data volume, how often the aggregation runs, the team's SQL skills, how stable the business rules are, hardware costs, and so on.

In the end, an architecture decision like this comes down to ROI:

  • the processing time and hardware you save by aggregating in the database,
  • against the development and maintenance time you spend by splitting the logic across layers.

In your case:

  • aggregation at that volume (hundreds of millions of objects) belongs in the database,
  • set-based aggregation is exactly what SQL is designed for.

Given the factor of 200 (and the memory pressure), I would keep the business rules in the business layer but push the heavy aggregation down into the database, and let the application work on the much smaller aggregated result.
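To show what that compromise can look like (a sketch only; the schema, the GROUP BY, and the credit-limit rule are all assumptions), the database collapses the raw rows while the rule itself stays in the business layer:

    using System;
    using System.Collections.Generic;
    using Oracle.ManagedDataAccess.Client;  // assumed provider; schema invented

    static class HybridAggregation
    {
        // The database collapses the raw rows into one total per customer.
        public static List<(int CustomerId, decimal Total)> LoadTotals(string connectionString)
        {
            var totals = new List<(int, decimal)>();
            using (var conn = new OracleConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                conn.Open();
                cmd.CommandText =
                    "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id";
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        totals.Add((reader.GetInt32(0), reader.GetDecimal(1)));
            }
            return totals;
        }

        // The business rule stays in the business layer, but it now iterates
        // over thousands of aggregates instead of millions of raw rows.
        public static IEnumerable<int> CustomersOverLimit(
            IEnumerable<(int CustomerId, decimal Total)> totals, decimal limit)
        {
            foreach (var t in totals)
                if (t.Total > limit)
                    yield return t.CustomerId;
        }
    }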

+1

Source: https://habr.com/ru/post/1719460/

