Derar (Novice)
Joined: 09 Sep 2006 Posts: 44
Posted: Wed Aug 11, 2010 12:37 am
[3.22a] CMud Memory Management
So, today I made CMud eat up so much working memory that it ran out and stopped working. When I checked the task manager, CMud's memory consumption was 1.7GB.
Reproducing this from scratch is a pain I'd rather avoid, but I can set it up if really required. I'd much rather just send the package that exhibits the issue and tell you what commands to run, because I think a from-scratch scenario would end up halfway back to the full package anyway.
Basically, CMud doesn't seem to want to release memory once it's been allocated, so it eats more and more and more. This doesn't seem to be a general, ongoing issue; it's perhaps isolated to how it's dealing with db vars or massive json operations.
Here's the rundown of what I'm doing:
Prior to 3.22, I tracked level data (trains, hps, mana, moves, pracs, stats, etc.) and stored it. Aardwolf has 200 levels per remort. I would store each past remort's data in a var of its own. Each "remort" variable has 13 keys, each containing a stringlist of (more or less) 200 items. I have 25 remorts' worth of data. I also have similar per-remort variables storing quest information, etc.
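Schematically, each old remort variable looks something like this (names and numbers made up for illustration, using the old zMUD-style idiom of pipe-delimited stringlist values):

    #NOOP one db variable per remort: 13 keys, each holding a roughly 200-item stringlist
    #ADDKEY Remort25 trains "3|3|4|3|3"
    #ADDKEY Remort25 hps "21|18|25|19|22"
    #ADDKEY Remort25 pracs "5|5|6|5|5"
    #NOOP ...and so on for the remaining keys, out to 200 levels each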
3.22 made my tracker stop working. I figured it was a good time to redo it, and utilize nested lists to store everything in a nice easy to access way. Yeah, it was going to be a lot of data; I figured I'd see how much these puppies could handle.
One of the things I've done is write an alias to convert the old variables and add all their data into a single variable, with the structure Variable.Tier.#.Remort.#.Level.#. That works pretty much ok, or so I thought; the result of it is my post requesting a larger line count field, because the variable runs to 65000+ lines...
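The conversion is along these lines (a much-simplified sketch, not the real alias - the real one walks tiers and levels as well):

    #ALIAS ConvertRemorts {
      #NOOP sketch only: dotted assignment is assumed to create the nested tables as it goes
      #LOOP 1,25 {
        LevelData.Tier.1.Remort.%i = @{Remort%i}
      }
    }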
Anyway, here's why it doesn't actually work fine: before I convert the first remort variable into the main storage variable, CMud's memory usage is ~60MB... normal. After I've converted all 25, the memory usage has spiked to 500MB. It hit that point about 10 minutes ago, and is *still* there. If I then turn around and start converting, say, my quest variables into that variable... I run out of memory, because each conversion pass allocates more memory than the last.
Ignoring, for the moment, the potential issues with creating such a gigantic single variable record, there's definitely an issue here with memory not being released after use.
As I said, I'd be more than happy to share the package file and let you know what to do with it to recreate the issue.
Gonna go make it happen again so I can get a crash report, since this did generate one for me.
Tech (GURU)
Joined: 18 Oct 2000 Posts: 2733 Location: Atlanta, USA
Posted: Wed Aug 11, 2010 3:03 am
It definitely sounds like something weird is going on. I'm sure you've already done this, but I'll still recommend that you make liberal use of local variables for all intermediate calculations and derivations. Another thing to look into is whether using #UNVAR has any effect.
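For example (placeholder names):

    #NOOP local variables are released automatically when the script finishes
    $sum = %eval( @hps + @mana)
    #SHOW $sum
    #NOOP UNVAR deletes a variable outright instead of leaving it empty
    #UNVAR ScratchTable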
_________________ Asati di tempari!
Zugg (MASTER)
Joined: 25 Sep 2000 Posts: 23379 Location: Colorado, USA
Posted: Wed Aug 11, 2010 4:17 pm
Quote:
I'd much rather just send the package that exhibits the issue and tell you what commands to run
Definitely do that. Send it to sales@zuggsoft.com with your pkg file as an attachment (or put it into a zip archive). It definitely sounds like the kind of problem that I want to fix before a public release, and it definitely sounds related to the json stuff.
The json library that I'm using in CMUD treats json tables as "objects" with a reference count. A table is only freed from memory when its reference count drops to zero. While I haven't run into any memory leaks with this, it's possible something in your script is causing it to keep a reference to variables that it shouldn't.
Derar (Novice)
Joined: 09 Sep 2006 Posts: 44
Posted: Wed Aug 11, 2010 9:34 pm
Tech,
Yes, I try to use local variables wherever I can for calculations/adjustments. In fact, if I am updating multiple records in a DB variable, I will typically copy the DB variable onto a local variable, update all the rows I need to update on the local, and then copy the local back over the DB variable.
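The pattern is roughly this (placeholder names):

    #NOOP copy the db variable into a local
    $work = @Stats
    #NOOP update whatever rows need changing on the local copy
    $work.hps = 520
    $work.mana = 410
    #NOOP copy the local back over the db variable
    Stats = $work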
---
Zugg,
I sent the package. Here's some more to the story... when I went to attach the file, I noticed the package file was a whopping 6MB. I figured maybe it had gotten corrupted and that's actually what was causing the problem. So I deleted the big single DB var I had, and closed CMud - the file stayed at 6MB. I opened CMud again, exported the package to XML, then closed CMud, renamed the package file, reopened CMud, created a new package, and imported the XML. After closing CMud, the file size of the new package was a mere 450KB.
So I went back in and ran the conversion for Levels. Exited CMUD... the package file had shot up to 1.8MB or so. Went back in, converted Campaigns, exited. It was closer to 4MB. When I exported with the variable full of this data, the XML file was about 2MB. The working-memory snowballing also kept happening, again to quite high usage (500MB used on the level conversion; on a fresh tally, the campaign conversion finished just over 1GB).
Setting the variable back to "", or even deleting it completely, doesn't cause the .pkg file to get any smaller. Fortunately it did seem to reuse the space it thought it was already using, so the file only got a little bigger (just over 4MB) when I ran the level conversion again.
GeneralStonewall (Magician)
Joined: 02 Feb 2004 Posts: 364 Location: USA
Posted: Thu Aug 12, 2010 1:34 am
As far as I'm aware, packages never getting smaller is semi-by-design. When you delete a setting, it doesn't really get erased, but is simply marked as deleted. I'm not sure of the reasoning behind this, but if you're concerned about file size (even though hard-drive space is cheap and plentiful), you can export/import the XML into a fresh session like you pointed out. That said, I'm sure it's unrelated to your memory usage.
Derar (Novice)
Joined: 09 Sep 2006 Posts: 44
Posted: Thu Aug 12, 2010 2:11 pm
I suspect you're probably right that it's not related to the memory usage issues. But it does raise a red flag or two.
In itself, file size as it relates to disk consumption is not really an issue here - 5MB is paltry. But consider that all my other package files top out around 300 or 400 kilobytes.
Consider also that I'm not really creating any new data here, so much as copying existing data into a new format (JSON nesting). The existing level data comprises approximately 80kb of the freshly imported file size, based on a rough estimate of what portion of the package data it represents. Creating a nested json table for the same data then balloons it to 1300kb - 1300 / 80 ≈ 16, so over sixteen times its original size.
Now, the benefits of the JSON objects and such are pretty clear. If the package has to be that size to support them, then I guess it has to. But a point for consideration is what kind of impact this may have on CMud's performance, at least from a loading standpoint. Personally I haven't noticed any loading hit, but I've only got one big package like this at present. What happens when more hit the scene? What happens if you need to load 10 such packages? What about people who aren't inclined to work with multiple packages and just have one main package that gets crazy big, like 50MB? Are these sizes acceptable? Now that they're likely, is it worth looking at having package files downsize where appropriate?
I certainly haven't seen any downside to the file size as yet, except of course when it comes to e-mailing the package. I also think such packages are only likely to grow this large from storing user-specific data, meaning package sharing should be ok since that data can be excluded.
The answer to all these points may well be to maintain the current status, and that's totally fine.
But being a beta and all, I'd be remiss not to at least bring them up for a brief think. :)
Taz (GURU)
Joined: 28 Sep 2000 Posts: 1395 Location: United Kingdom
Posted: Thu Aug 12, 2010 4:49 pm
The package file is a SQLite database, and I'm guessing the marked free space gets reclaimed by SQLite when it writes more data to the database; the only thing that isn't happening, by the sound of it, is a compact (SQLite's VACUUM) after a big free-up. SQLite is FAST, and it won't read from that free space when it does a specific table read, as it knows exactly where to look anyway.
Feel free to use a third-party SQLite admin program to compact your package files; it shouldn't cause any issues.
_________________ Taz :)
Zugg (MASTER)
Joined: 25 Sep 2000 Posts: 23379 Location: Colorado, USA
Posted: Thu Aug 19, 2010 5:44 pm
I've asked Derar to update his scripts before I can really debug this to look for memory leaks.
It is generally a *bad* idea to make copies of tables into local variables. Each time you assign a list/table to any variable (local or otherwise), CMUD is making a copy of that table. The script in question was building a very large table structure and was making several copies of this table each time through the loop. So naturally the loop started taking longer and longer and using more and more memory.
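Schematically, the script was doing something like this (invented names, not his actual script):

    #NOOP costly: every pass copies the whole growing table twice
    #LOOP 1,25 {
      $copy = @BigTable
      BigTable = $copy
    }

versus adding keys in place with #ADDKEY, which should avoid the whole-table copies:

    #LOOP 1,25 {
      #ADDKEY BigTable %concat( "Remort", %i) @{Remort%i}
    }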
This was a rather unusual case of a specific script. If Derar can rewrite it to not make copies of the big list he is building, then I might be able to debug it a bit more to track down any memory issues.
Delgar (Beginner)
Joined: 22 Dec 2008 Posts: 21
Posted: Fri Aug 20, 2010 1:01 pm
I am seeing something very similar to what Derar describes, except with less memory usage over time (the most I've seen is 400 MB used after around 30 hours of use).
This issue was not present in 3.17, and I upgraded directly from there to 3.22a.
I use a lot of local variables, but few if any table copies (and none copied inside loops).
Just wanted you to know that Derar's isn't an isolated case.
Hope this helps make CMud an even better product.
-- Delgar
Zugg (MASTER)
Joined: 25 Sep 2000 Posts: 23379 Location: Colorado, USA
Posted: Fri Aug 20, 2010 4:55 pm
What I'll need somebody to do is reduce this down to a simpler script that I can run on my end to reproduce the problem. None of my sessions are growing past 50MB, so it's specific to your scripts. I'm sure it's related to the new list/table code, and it might be related to local variables not releasing their copies of these tables.
But I need somebody to play around and experiment with this to try and help pin down the problem while I work on fixing other bugs right now.
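If anyone wants a starting point, something along these lines might show the growth (an untested sketch - watch CMUD's memory in the task manager while it runs):

    #ALIAS StressTables {
      #NOOP build a large table, copying it through a local on every pass
      #LOOP 1,1000 {
        #ADDKEY BigTable %concat( "row", %i) "1|2|3|4|5"
        $copy = @BigTable
        BigTable = $copy
      }
      #NOOP once the loop ends, deleting the variable should release the memory
      #UNVAR BigTable
    }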