Someone else will run into this problem, for sure, so I am posting here for community benefit. I had a serious memory leak with LINQ to SQL when loading thousands of rows from XML. This caused the hosting memory resource limit (100 MB) to be exceeded. To avoid the problem, dispose the DataContext and recreate it periodically.
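To illustrate the workaround, here is a minimal sketch. The `MyDataContext`, `Records` table, `Record` entity, `ReadRows()` helper, and batch size of 500 are all hypothetical placeholders, not from the original post. The idea is that a DataContext tracks every object it has seen for change tracking, so disposing and recreating it every N rows releases that object graph and keeps memory flat during a large import:

```csharp
using System;
using System.Collections.Generic;

// Sketch only: assumes a LINQ to SQL DataContext subclass called MyDataContext
// with a Records table, and a streaming ReadRows() source (e.g. an XmlReader loop).
const int BatchSize = 500; // assumed; tune to stay under your host's memory limit

MyDataContext db = new MyDataContext();
int count = 0;
foreach (Record row in ReadRows())
{
    db.Records.InsertOnSubmit(row);
    if (++count % BatchSize == 0)
    {
        db.SubmitChanges();
        db.Dispose();             // drop the tracked object graph
        db = new MyDataContext(); // start fresh with an empty identity cache
    }
}
db.SubmitChanges(); // flush the final partial batch
db.Dispose();
```

Without the periodic dispose/recreate, every inserted entity stays referenced by the DataContext until it is collected, which is what makes memory grow with row count.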
Thanks for posting. In general, if you pull a lot of data from a file (XML, txt, etc.), it gets loaded into memory, which can cause memory usage to rise. Bruce DiscountASP.NET www.DiscountASP.NET
In this case I was using LINQ to XML to scan the XML file, and LINQ to SQL to insert the data. The first problem was that LINQ to XML loads the entire XML file into memory. So I switched to XmlReader, which streams the file without caching it, and the problem still occurred... which left LINQ to SQL as the only possible culprit.
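For anyone making the same switch, here is a minimal sketch of reading a large file with XmlReader instead of XDocument. The element name "item" and attribute "id" are illustrative only, not from the original post:

```csharp
using System.Xml;

// XmlReader is forward-only and non-caching: it holds one node at a time
// instead of building the whole document tree in memory like XDocument does.
using (XmlReader reader = XmlReader.Create("data.xml"))
{
    while (reader.Read())
    {
        if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
        {
            string id = reader.GetAttribute("id");
            // Build your entity from the attribute/element values here
            // and hand it to LINQ to SQL for insertion.
        }
    }
}
```

Memory usage stays roughly constant regardless of file size, which is why switching readers isolated LINQ to SQL as the remaining source of growth.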
Hmm... I have not actually tested the memory footprint of LINQ. I'll give it a shot if I get some free time this weekend. Bruce DiscountASP.NET www.DiscountASP.NET
The memory footprint for LINQ to SQL is very small (unless a leak occurs). The LINQ to XML types, like XDocument, are a different story... the entire file is loaded into memory.
Eric White has been keeping on top of this. You might want to shoot him an e-mail: http://blogs.msdn.com/ericwhite/archive/2007/12/06/performance-of-linq-to-xml.aspx
I pretty much gave up on LINQ to SQL for this specific application. It's too heavy on memory and CPU. Still, it's not bad for a first stab and is very useful in some circumstances. Because of the (necessary) memory and CPU restrictions in the hosting environment, I set up a temporary staging table in SQL and use XmlReader with LINQ to SQL to dump the 'raw' data into it, then call a stored procedure to update/insert into the actual tables. Jobs that were taking 12+ hours now take only a few minutes, and most of that time is spent waiting for Windows Workflow delay activities, so as to avoid busting the CPU restrictions.
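A rough sketch of that staging approach, with hypothetical names throughout: `StagingDataContext`, the `StagingRows` table, the `ReadRowsWithXmlReader()` helper, and a stored procedure called `MergeStagingRows` are all assumptions for illustration, not the poster's actual schema:

```csharp
using System;

// Sketch only: stream rows into a staging table, then let SQL Server do the
// set-based merge into the real tables via a stored procedure.
using (var db = new StagingDataContext())
{
    foreach (var row in ReadRowsWithXmlReader()) // placeholder streaming XML reader
    {
        db.StagingRows.InsertOnSubmit(row);
    }
    db.SubmitChanges();

    // One round trip; the update/insert logic runs entirely inside SQL Server,
    // which is far cheaper than per-row LINQ to SQL change tracking.
    db.ExecuteCommand("EXEC MergeStagingRows");
}
```

The win comes from moving the update/insert decision out of the application and into the database, where it runs as a single set-based operation instead of thousands of tracked objects.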