
Quick things to check if you are leaking memory

After a couple of years working in InternetDev support I’ve seen many different kinds of problems reported by customers (different environments, different combinations of Microsoft technologies and products, different application needs, different customer backgrounds and levels of technical knowledge, etc…), but as one of my favorite authors said, “Sometimes they come back”: so it happens that we still get incoming calls for well-known problems like memory leaks, high CPU and worker process crashes. Of course there can be many different causes for those problems, but some are more likely than others to affect your application; here is a list of the first things I check when I’m working on a memory leak problem (usually analyzing a memory dump of the faulting application). This is not a complete list (but I promise to update the post if I find something new and interesting), and if you talk to other Support Engineers they might give you a slightly different view on the subject; this is simply what I have found and learnt in my day-to-day support experience.

Application deployed in debug mode

This is something we must avoid in a production environment; debug mode is useful during development, but in production it prevents the runtime from applying some optimizations at the compilation and resource-management level (memory, CPU etc…): I personally saw a big eCommerce site stop working after a few minutes just because of this setting. To resolve it, check your web.config files and make sure compilation is set to debug="false", as shown below. See this nice post from Tess on the subject.
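Here is a minimal web.config fragment showing the setting (only the relevant elements are included):

    <configuration>
      <system.web>
        <!-- debug="true" is fine on a developer machine, but should never reach production -->
        <compilation debug="false" />
      </system.web>
    </configuration>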

Avoid duplicated assemblies (install them in the GAC)

If you have multiple applications running on the same server, and all of those applications use some common assemblies (maybe your own custom data access layer, some common controls with shared functionality across all of your applications, some sort of utility classes etc…), and you put those components in the /bin folder of every application, each copy will be loaded inside its corresponding AppDomain. As you can easily guess, you’ll end up with some (maybe lots of) identical assemblies loaded multiple times (one per AppDomain/application), thus wasting resources on your server (memory of course, but those components must also be JITted over and over); wouldn’t it be nice if you could load just one instance of those shared components and have all of your applications use that single copy in memory?
Well… it turns out that this is exactly what the Global Assembly Cache is meant for! 😊

· 315682 How To Install an Assembly in the Global Assembly Cache in Visual Basic http://support.microsoft.com/?id=315682
· 815808 How To Install an Assembly into the Global Assembly Cache in Visual C# http://support.microsoft.com/?id=815808
· Global Assembly Cache Tool (Gacutil.exe) http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cptools/html/cpgrfGlobalAssemblyCacheUtilityGacutilexe.asp
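As a minimal sketch (the file and assembly names below are made up for the example), the shared assembly needs a strong name before it can go into the GAC, and gacutil.exe then installs it once for all applications on the server:

    // AssemblyInfo.cs of the shared library
    using System.Reflection;

    [assembly: AssemblyVersion("1.0.0.0")]
    // key pair generated beforehand with: sn -k MyCompany.snk
    [assembly: AssemblyKeyFile(@"..\..\MyCompany.snk")]

    // After building, install the assembly in the Global Assembly Cache:
    //     gacutil /i MyCompany.Common.dll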

Large Object Heap

You should also pay attention to large objects, which can lead to memory fragmentation. When a .NET based application (such as ASP.NET) needs to allocate new memory, it does so by looking for (and reserving) chunks of 64 Mb of free and contiguous memory: that is the key of this matter. When the Garbage Collector does its job and frees memory for unused (better said, unreachable) objects, it also tries to compact the remaining memory so that the free space is contiguous for future use, but this is not always possible: when the GC runs it must temporarily stop all application threads, move chunks of memory around, update memory pointers, and only then can the application continue its job. For performance reasons this can’t be done with objects larger than about 85 Kb, which are seen as big objects from the CLR point of view and are allocated on a special heap, called the Large Object Heap. Objects in the LOH are still collected and their memory freed, but moving around such large chunks of memory is very expensive (and remember, during this operation the application is frozen and can’t respond to clients’ requests) and would require too much time and effort from the system, so the application would be too badly affected. The GC therefore frees that memory but does not compact it, and this leaves some holes in our memory (like a Swiss cheese, if you like 😊).
Multiply this process over the life of the application and you could end up having a lot of free memory that is so fragmented (I’ve seen dumps where we still had 80% of memory available, yet the biggest contiguous free chunk was just 50 Mb, too small for the 64 Mb needed) that the CLR can do nothing else but throw an OutOfMemoryException. Of course, the more big objects you have, the more likely you are to run into this problem.
This article has a quite detailed description of GC internals: “Garbage Collection: Automatic Memory Management in the Microsoft .NET Framework” http://msdn.microsoft.com/msdnmag/issues/1100/gci/ and part 2 http://msdn.microsoft.com/msdnmag/issues/1200/GCI2/default.aspx.
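Just to give an idea of the threshold at work, here is a minimal sketch (the sizes are purely illustrative):

    using System;

    class LohDemo
    {
        static void Main()
        {
            // Arrays of roughly 85 Kb or more go straight to the Large Object Heap;
            // smaller objects live on the normal heap and can be moved when the GC compacts it.
            byte[] small = new byte[80 * 1024];    // small object heap: compacted by the GC
            byte[] large = new byte[100 * 1024];   // Large Object Heap: freed, but never moved

            // Large objects report the highest generation, since the LOH is collected with gen 2
            Console.WriteLine(GC.GetGeneration(large));
        }
    }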

In this same category I include the Viewstate problem; Viewstate is certainly a useful and powerful feature and there are controls (like the DataGrid) which need Viewstate to work properly, but you must always be very careful with it… if you are running short on memory, try to disable Viewstate for every control (and for every page, if this is feasible) which does not explicitly need it. Sometimes, looking at the list of large objects in a dump, we find very big string objects which contain the Viewstate for your pages and controls… Disabling it whenever possible will reduce memory fragmentation and will also decrease the time a client needs to load your pages, because of the smaller HTML it will have to download from the server.
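Here is a small sketch of the idea in a code-behind (the page and control names are made up; the same can be done declaratively with EnableViewState="false" on the control or on the @ Page directive):

    using System;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public class ReportPage : Page
    {
        protected Label lblHeader;     // read-only text: does not need Viewstate
        protected DataGrid dgOrders;   // needs Viewstate to work properly, so leave it enabled

        protected override void OnInit(EventArgs e)
        {
            base.OnInit(e);
            lblHeader.EnableViewState = false;
        }
    }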

Dynamic assemblies

XML/XSLT processing going on inside your process is also a known cause of memory problems if not used correctly; the problem is related to the dynamic assemblies created by the runtime to handle those operations, which cannot be unloaded individually from the process. Here are a few articles which may help you on this (a small caching sketch follows the list):
· 321702 HOW TO: Use Extension Objects When You Execute XSL Transformations in Visual Basic .NET Applications
· 323370 HOW TO: Use Extension Objects When You Execute XSL Transformations in Visual C# .NET Applications
· 316775 PRB: Cannot Unload Assemblies That You Create and Load by Using Script
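Here is a minimal sketch of the caching idea, assuming .NET 2.0 and a made-up stylesheet path: if a stylesheet (especially one containing embedded script) is loaded on every request, each load can produce dynamic code that stays in the process, so loading the transform once and reusing it keeps memory usage flat:

    using System.Xml.Xsl;

    public static class TransformCache
    {
        // loaded once per AppDomain instead of once per request
        private static readonly XslCompiledTransform _transform = LoadTransform();

        private static XslCompiledTransform LoadTransform()
        {
            XslCompiledTransform transform = new XslCompiledTransform();
            transform.Load(@"C:\inetpub\wwwroot\MyApp\Orders.xslt");   // hypothetical path
            return transform;
        }

        public static XslCompiledTransform Transform
        {
            get { return _transform; }
        }
    }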

What about disposing objects?

As you may know, even if the Garbage Collector usually does an excellent job removing objects, releasing resources and freeing memory for you, this does not relieve you, as the developer, from taking care of the objects and resources you allocate and from releasing them when you’re done; this especially applies to unmanaged (COM) resources and objects, since the GC is not able to collect that memory and release those resources, and it’s the developer’s responsibility to take care of them.
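A minimal sketch of both cases (the connection string and the COM object are just placeholders): disposable managed resources go in using blocks, while COM objects are released explicitly because the GC cannot reclaim them by itself.

    using System;
    using System.Data.SqlClient;
    using System.Runtime.InteropServices;

    class CleanupExamples
    {
        static void QueryDatabase(string connectionString)
        {
            // Dispose is called even if an exception is thrown,
            // returning the connection to the pool right away.
            using (SqlConnection connection = new SqlConnection(connectionString))
            using (SqlCommand command = new SqlCommand("SELECT GETDATE()", connection))
            {
                connection.Open();
                command.ExecuteScalar();
            }
        }

        static void ReleaseCom(object comObject)
        {
            // Release the runtime callable wrapper as soon as you are done with it.
            if (comObject != null && Marshal.IsComObject(comObject))
            {
                Marshal.ReleaseComObject(comObject);
            }
        }
    }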

I’ve seen an extension of this particular circumstance with EventHandlers; this is the basic scenario: you create your own custom object to store information your application needs, that object exposes some events, and you put that object in your user’s session. Then from some other code in your application (a page or a custom component) you subscribe to one or more events exposed by that cached object, and you’re done, right? Well… nope! 😲 If you don’t explicitly remove the subscription to that event on the cached object, your subscriber code (the page or component) will always be reachable (there will always be a link between it and the cached object) and the GC will not be able to remove your subscriber object, thus wasting memory. If in a dump the output of a !gcroot command shows a long chain of EventHandlers, you have stumbled into this problem. Of course this applies to every kind of object, not just EventHandlers.
My colleague Tess has a very nice post on this (we worked together on one of these cases a few months ago… maybe that inspired the post? 😉).
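To make the scenario concrete, here is a minimal sketch with made-up type and member names: the page subscribes to an event on an object stored in Session, so it must also unsubscribe, otherwise the session-scoped object keeps the page reachable for as long as the session lives.

    using System;
    using System.Web.UI;

    public class UserData
    {
        public event EventHandler Changed;

        public void NotifyChanged()
        {
            if (Changed != null) Changed(this, EventArgs.Empty);
        }
    }

    public class ProfilePage : Page
    {
        private UserData _data;

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            _data = Session["UserData"] as UserData;
            if (_data != null)
            {
                _data.Changed += new EventHandler(Data_Changed);   // subscribe
            }
        }

        protected override void OnUnload(EventArgs e)
        {
            if (_data != null)
            {
                _data.Changed -= new EventHandler(Data_Changed);   // unsubscribe, or the page stays rooted
            }
            base.OnUnload(e);
        }

        private void Data_Changed(object sender, EventArgs e)
        {
            // react to the change
        }
    }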

Use /3Gb and multiple application pools

Applying the /3Gb switch usually helps to relieve memory pressure problems, but only if the application reaches a point of stability where the GC is able to free enough memory for the runtime to run properly; there might be circumstances where this is not enough, and in those cases we usually have a problem with the application design, either at code level or at architecture level. A sample boot.ini entry is shown after the links below.
· How to Set the /3GB Startup Switch in Windows http://www.microsoft.com/technet/prodtechnol/exchange/guides/E2k3Perf_ScalGuide/e834e9c7-708c-43bf-b877-e14ae443ecbf.mspx?mfr=true
· Memory Support and Windows Operating Systems http://www.microsoft.com/whdc/system/platform/server/PAE/PAEmem.mspx
· How to use the /userva switch with the /3GB switch to tune the User-mode space to a value between 2 GB and 3 GB http://support.microsoft.com/?kbid=316739
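For reference, on Windows Server 2003 the switch goes on the operating system line of boot.ini; the ARC path below is just an example from a typical single-disk installation (see the articles above before changing it):

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB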

Windows 2003 and IIS 6.0 give us greater flexibility and fine-tuning possibilities thanks to their architecture and internals (quite different from Windows 2000 and IIS 5.x); if your application’s design allows it, you can configure your application pool to be served by multiple worker processes (a web garden), or you can split your application to run under different application pools, each of them able to take advantage of the 2 (or 3, depending on your configuration) Gb of virtual memory available to Win32 processes. Of course this must be carefully tested before going live, but usually IIS 6.0 helps us sort out some of those problems, or at least provides some relief while you are working to fine-tune the solution (maybe with the help of a Microsoft PSS Engineer 😊).
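For example, a web garden can be enabled by raising the MaxProcesses property of the application pool; the command below is just a sketch and assumes the default AdminScripts location and an application pool named MyAppPool:

    cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs set W3SVC/AppPools/MyAppPool/MaxProcesses 2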

Oh, of course if you put a lot of data in your cache you could run out of memory…

There is also a nice KB article on this subject (actually I found it after I wrote this post…): http://support.microsoft.com/kb/893660/en-us

Update (18/07/2007):
Re-reading this post I noticed I had forgotten to add the EventHandler problem I have encountered a few times, and that Tess described here.

Cheers
Carlo
