VBScript Out of Memory error

I have a classic ASP CRM that was created by a third-party company. Currently, I have access to the source code and I can make any changes.

At random points during the day, usually after prolonged use by my users, most of my pages start throwing an "Out of Memory" error.

The way the application is written, all pages and scripts pull their core functions from the Global.asp file. Other global files are also included from this file, but the error that comes back shows:

Out of Memory

WhateverScriptYouTriedToRun.asp, line 0

Line 0 is the include for the Global.asp file. Once the error starts happening, it keeps occurring for an indeterminate period of time, then subsides for a while, then starts repeating again. From how the application is written, which functions it uses, and the "diagnostics" I have already done, my best guess is that a commonly used function holds on to data, such as a recordset, and never releases it properly; other users then hit the same function, and memory eventually fills up until the error is thrown. The only way I have found to clear the error is to restart IIS, recycle the application pool, and restart the SQL Server services.
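To illustrate the kind of pattern I suspect, here is a made-up sketch; the function name, the table, and Application("ConnString") are all invented:

    'Hypothetical example of the suspected pattern - names are invented
    Function GetCustomer(id)
        Dim conn, rs
        Set conn = Server.CreateObject("ADODB.Connection")
        conn.Open Application("ConnString")   'connection string name is an assumption
        Set rs = conn.Execute("SELECT * FROM Customers WHERE ID = " & id)
        Set GetCustomer = rs
        'The recordset (and the connection behind it) is handed back to the caller,
        'but nothing ever calls rs.Close / conn.Close or sets them to Nothing,
        'so every call leaves objects behind until the process runs out of memory.
    End Function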

Needless to say, my users and I are getting annoyed...

I can't pinpoint the problem from the actual error message, since it points at line 0, and beyond that I have no idea where in the roughly 20K lines of code it might be choking. Any thoughts or ideas on how to isolate this, or at least a pointer in the right direction to start chasing it down? Is there a way to increase the memory allowance for VBScript? I know there is a limit, but is it set at 512K, and can you raise it to 1 GB?

Here is what I tried:

  • Moving inline SQL statements into views.
  • Going through several hundred scripts and ensuring that every OpenConnection and OpenRecordSet is matched with a corresponding Close (see the sketch after this list).
  • Going through the global file and commenting out any large SQL statements, such as ApplicationLog (a function that writes each executed query to a table).
  • Various smaller script changes.
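For reference, this is roughly the matched pattern I checked each script against. It is only a sketch with made-up table and connection-string names; the State checks are there so the cleanup still runs if the query itself fails:

    Dim conn, rs
    Set conn = Server.CreateObject("ADODB.Connection")
    Set rs = Server.CreateObject("ADODB.Recordset")

    On Error Resume Next
    conn.Open Application("ConnString")            'connection string name is an assumption
    rs.Open "SELECT ID, Name FROM Customers", conn 'table name is an assumption

    If Err.Number = 0 Then
        Do While Not rs.EOF
            Response.Write rs("Name") & "<br />"
            rs.MoveNext
        Loop
    End If

    'Cleanup runs whether the query succeeded or not (1 = adStateOpen)
    If rs.State = 1 Then rs.Close
    If conn.State = 1 Then conn.Close
    Set rs = Nothing
    Set conn = Nothing
    On Error GoTo 0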
3 answers

General memory leak

You say you close all your recordsets and connections, which is good.

But are you also destroying the objects?

For instance:

    Set adoCon = Server.CreateObject("ADODB.Connection")
    Set rsCommon = Server.CreateObject("ADODB.Recordset")

    'Do query stuff

    'You do this:
    rsCommon.close
    adocon.close

    'But do you do this?
    Set adoCon = nothing
    Set rsCommon = nothing

There is no garbage collection in classic ASP, so any objects that were not destroyed will remain in memory.

Also, make sure your Close / Set ... = Nothing calls run in every branch of your logic. For instance:

    adocon.open
    rscommon.open etc 'SQL query
    myData = rscommon("condition")

    if (myData) then
        response.write("ok")
    else
        response.redirect("error.asp")
    end if

    'close
    rsCommon.close
    adocon.close
    Set adoCon = nothing
    Set rsCommon = nothing

Nothing is closed or destroyed before the redirect, so over time this leaks memory, because not every branch of the logic reaches the cleanup code.
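One way to fix that example, shown here as a sketch, is to do the cleanup before branching, so every path has already released its objects:

    adocon.open
    rscommon.open etc 'SQL query
    myData = rscommon("condition")

    'Close and destroy everything first...
    rsCommon.close
    adocon.close
    Set adoCon = nothing
    Set rsCommon = nothing

    '...then branch, so even the redirect path has already cleaned up
    if (myData) then
        response.write("ok")
    else
        response.redirect("error.asp")
    end if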

Improved design

Unfortunately, it looks like the site was not well designed. I always structure my classic ASP as:

    <%
    Option Explicit

    'Declare all vars
    Dim this
    Dim that
    Dim i

    'Open connections
    Set adoCon...
    adocon.open()

    'Fetch required data
    rscommon.open strSQL, adoCon
    this = rsCommon.getRows()
    rsCommon.close

    'Fetch something else
    rscommon.open strSQL, adoCon
    that = rsCommon.getRows()
    rsCommon.close

    'Close connections and drop objects
    adoCon.close
    set adoCon = nothing
    set rscommon = nothing

    'Process redirects
    if (condition) then
        response.redirect(url)
    end if
    %>
    <html>
    <body>
    <%
    'Use data
    for i = 0 to ubound(this, 2)
        response.write(this(0, i) & " " & this(1, i) & "<br />")
    next
    %>
    </body>
    </html>

Hope this helps.


Have you looked at using a memory monitoring tool to find out how much memory fragmentation is occurring? My guess at a possible cause is that some object of a certain size gets created, but there is not enough contiguous space in memory to hold it. Imagine needing to store an object that takes 100 MB: even though several hundred megabytes may be free overall, if the largest contiguous chunk is only 90 MB, the allocation still fails.


The Debug Diagnostic Tool v1.1 would be one such tool, and Bernard's articles can help with understanding how to use it.

Another thought: how much string concatenation is going on in the code? I remember at a place where I worked we had problems with lots of string concatenation operations eating up memory, so that might be another avenue to check.
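For illustration, a minimal sketch of both the pattern and one common workaround (collecting the pieces in an array and calling Join once at the end); the loop body is just a made-up example:

    Dim i, html, parts(9999)

    'Repeated concatenation: the growing string is copied on every pass
    html = ""
    For i = 1 To 10000
        html = html & "<tr><td>" & i & "</td></tr>"
    Next
    Response.Write html

    'Gentler on memory: build the pieces, then Join once at the end
    For i = 0 To 9999
        parts(i) = "<tr><td>" & (i + 1) & "</td></tr>"
    Next
    Response.Write Join(parts, "")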


Yes, I could see some shock at a number like that the first few times you see it, but if you understand what the code is doing, it may make sense why so much space is reserved right off the bat.


I have not used this particular tool for debugging, but I did have a tool that took a snapshot of memory when pages hung, so I cannot say whether the tool has a performance impact or not. Granted, in my case I was using a similar tool back in 2004, so it has been several years since I had to dig into this kind of issue.


Just going to drop this here: the problem took a long time to solve, but here is a breakdown of what we did.

  • We took all of the inline SQL and turned it into SQL views; every SELECT is now handled through a VIEW (a sketch of items 1 and 2 follows this list).

  • I took each SQL INSERT and UPDATE (as far as I could without disrupting the system) and moved them into stored procedures.

    #2 was the one that really made the biggest difference.

  • We went through the several thousand scripts and ensured that variables were properly destroyed and that every Open Connection was matched with a Close Connection, and likewise for every Open / Close RecordSet.

  • One of the slow killers was doing something like:

    ID = Request.QueryString("ID")

at the top of the page. Now, before any redirect or at the end of the page, there is always:

 Set ID = Nothing 

or the assignment is removed entirely where it is not needed.
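As a rough illustration of items 1 and 2 (the view name, stored procedure name, columns, values, and the already-open adoCon connection are all assumptions here, not our actual code):

    'Item 1: the SELECT now goes through a view instead of a page full of inline joins
    Dim rs, strSQL
    strSQL = "SELECT CustomerID, Name, OrderTotal FROM vw_CustomerOrders WHERE CustomerID = 42"
    Set rs = adoCon.Execute(strSQL)

    'Item 2: INSERTs/UPDATEs go through a stored procedure via ADODB.Command
    Const adCmdStoredProc = 4, adInteger = 3, adVarChar = 200, adParamInput = 1, adExecuteNoRecords = 128
    Dim cmd
    Set cmd = Server.CreateObject("ADODB.Command")
    Set cmd.ActiveConnection = adoCon
    cmd.CommandType = adCmdStoredProc
    cmd.CommandText = "usp_UpdateCustomerName"
    cmd.Parameters.Append cmd.CreateParameter("@CustomerID", adInteger, adParamInput, , 42)
    cmd.Parameters.Append cmd.CreateParameter("@Name", adVarChar, adParamInput, 100, "New name")
    cmd.Execute , , adExecuteNoRecords

    'Same cleanup discipline as everywhere else
    rs.Close
    Set rs = Nothing
    Set cmd = Nothing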


Source: https://habr.com/ru/post/1340986/

