I'm not asking for opinions so much as for documentation or references.
We have many data files (XML, CSV, plain text, etc.) that need to be processed and their data parsed out.
Our database lead has suggested using stored procedures for the job. Basically, we would have an intermediate (staging) table where each file is serialized and stored in a CLOB or XML column. From there, he suggested continuing to use stored procedures to process the file.
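For concreteness, here is a rough sketch of my understanding of the first step of that approach, seen from the application side: streaming a file into the CLOB column of a staging table via JDBC. The staging_files table and its columns are names I made up, not his actual design.

```java
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class StageFile {

    // Stream one source file into the intermediate table so the stored
    // procedures can process it afterwards.
    // Table and column names (staging_files, file_name, file_data) are illustrative only.
    static void stageFile(Connection conn, Path file) throws Exception {
        String sql = "INSERT INTO staging_files (file_name, file_data) VALUES (?, ?)";
        try (Reader reader = Files.newBufferedReader(file);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, file.getFileName().toString());
            // setCharacterStream lets the JDBC driver stream the content into the CLOB
            ps.setCharacterStream(2, reader);
            ps.executeUpdate();
        }
    }
}
```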
I am an application developer with a database background, but my emphasis is on application development, so I may be biased. Still, putting this kind of logic in the database seems like a bad idea to me, and I can't find any documentation that confirms or refutes that. To me it feels like putting the car on a train to haul a load of goods.
So my question is: how well do databases (Oracle, DB2, MySQL, SQL Server) perform at regular expression matching, find-and-replace on CLOB data, DOM traversal, and recursion, compared to a programming language such as Java, PHP, or C# applied to the same problems?
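To illustrate what I mean by "the same problems", here is a minimal sketch of one such task, a regex find-and-replace, done both ways. The staging_files table, its columns, and the use of Oracle's REGEXP_REPLACE are assumptions on my part:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.regex.Pattern;

public class RegexReplaceComparison {

    // Application-side version: pull the text into the JVM and run the regex there.
    static String replaceInJava(Path file, String pattern, String replacement) throws Exception {
        String text = Files.readString(file); // Java 11+
        return Pattern.compile(pattern).matcher(text).replaceAll(replacement);
    }

    // Database-side version: push the same work into the DBMS.
    // Oracle's REGEXP_REPLACE accepts CLOB input; other DBMSs differ in
    // function names and in their CLOB/regex support.
    static void replaceInDatabase(Connection conn, long fileId,
                                  String pattern, String replacement) throws SQLException {
        String sql = "UPDATE staging_files "
                   + "SET file_data = REGEXP_REPLACE(file_data, ?, ?) "
                   + "WHERE id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, pattern);
            ps.setString(2, replacement);
            ps.setLong(3, fileId);
            ps.executeUpdate();
        }
    }
}
```

The question is which of these holds up better once the files are large and there are many of them.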
Edit
To clarify, I am looking for documentation or analysis comparing the runtime performance of a general-purpose programming language with a DBMS for specific operations: string search and replace, regular expression search and replace, traversing an XML tree, and memory usage of recursive calls. In particular, I want to know how well each scales when dealing with 10-100 GB of data.