
MemoryError, please help

Posted: Thu Feb 01, 2007 1:04 am
In the Blender exporter:

File "<string>", line 825, in gui
File "<string>", line 603, in save_still
File "<string>", line 546, in save_indigo
File "<string>", line 307, in exportObject
File "<string>", line 162, in exportMesh

Any idea what's wrong in the scene??


Posted: Thu Feb 01, 2007 1:38 am
by zuegs
It's probably tooooo big ????
Please check the memory consumption while exporting.


Posted: Thu Feb 01, 2007 2:04 am
CPU at about 60%,
paging file ca. 1.54 GB...
I have 2 GB of RAM...

Posted: Thu Feb 01, 2007 2:09 am
I removed some complicated objects and it is working... :)
But I need those objects :(

Do I need more RAM? A better processor or graphics card??
Or just a simpler mesh?? :shock:
:idea: :?:

Posted: Thu Feb 01, 2007 2:38 am
by zuegs
I don't know if more RAM will help ... :roll:
... best is to simplify the mesh if possible :oops: 8)
... the exporter will need some deep optimisation to handle big scenes in an economic way :wink:

Posted: Mon Feb 19, 2007 10:48 pm
by tobak30
I too have trouble with memory when converting the scene to XML; it takes too much memory. I am making a kitchen and it's my most complex scene ever. My system peaks at 1.54 GB while Blender itself is using about 800 megabytes.

The exporter could use a rewrite, as you pointed out.

Posted: Sat Feb 24, 2007 2:26 am
by tobak30
It seems that I have gotten past the trouble by removing some subsurf settings.

Posted: Sun Mar 11, 2007 5:28 pm
by Wedge
I have looked into this quite a lot lately... It is somewhat difficult for me to find a solution. The 7t5 exporter follows the same path as the 6t6 exporter when it comes to writing the vertex positions / normals / UV coords.

As far as I can tell, this right here is the problem. If we could grab the vertex positions by themselves, then the normals by themselves, then the UV coords by themselves, and so on, it seems meshes could get very large: instead of creating one massive list holding verts, normals, and UV coords together, we would create three smaller lists.

I already have code that grabs the vertex positions separately from the normals, but it is useless as far as I can tell, because there is then no way to write them to the XML file. If I simply added the two lists back together, I think I would get the same large, memory-hogging list. So while I found a way to separate them, I see no way to bring them back together on the same line as the vertex position.
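For the record, the separate lists could be paired back up per vertex while writing, without ever joining them into one combined list or string. A minimal sketch of the idea (the <vert> element and its attributes here are made up for illustration, not the real Indigo XML):

```python
import io

def write_verts_streamed(out, positions, normals, uvs):
    # zip() walks the three separate lists in step, so each vertex line
    # can be written directly to the file without merging the lists
    # back into one large combined list or string.
    for (px, py, pz), (nx, ny, nz), (u, v) in zip(positions, normals, uvs):
        out.write('<vert pos="%f %f %f" normal="%f %f %f" uv="%f %f"/>\n'
                  % (px, py, pz, nx, ny, nz, u, v))

# usage: any file-like object works; StringIO keeps the demo self-contained
buf = io.StringIO()
write_verts_streamed(buf,
                     positions=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
                     normals=[(0.0, 0.0, 1.0), (0.0, 0.0, 1.0)],
                     uvs=[(0.0, 0.0), (1.0, 0.0)])
```

Whether this is fast enough per line is a separate question, but the peak memory never grows past one vertex line at a time.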

I wrote new code that grabs one line at a time, which also works as far as I can tell, but it is so slow that it is useless: it has to re-check the loop over and over and over just to write a single line... etc.

So.........I have no idea at this point.

Posted: Mon Mar 12, 2007 1:28 am
by nikolatesla20
Just one question remains. I have not taken the time to look at the Python script for any exporter yet (I used to write lots of Python scripts for Blender), but does the script output a single mesh at a time? That alone would still be fast and should be possible, I would think. It would be hard to do each vert, etc., separately, because then you need to tie them back to the mesh they belong to. Of course you can do other optimizations if you have those lists, but I don't think having separate lists is going to save memory. The objects are there no matter which way you do it.

That being said, remember that optimization is always a trade-off: you can get more speed but use more memory, or use less memory but go slower. The only other way to handle it is to have a script that can work either way and let the user decide which method to use.
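A user-selectable trade-off like that could look something like this sketch (not the actual exporter code; the `low_memory` flag and chunk size are made up here):

```python
import io

def export_vert_lines(out, verts, low_memory=False, chunk=1024):
    # Fast path: join everything into one big string (highest peak memory).
    # Low-memory path: flush every `chunk` lines, so the largest temporary
    # string stays bounded, at the cost of more write calls.
    lines = ("%f %f %f\n" % v for v in verts)
    if not low_memory:
        out.write("".join(lines))
        return
    block = []
    for line in lines:
        block.append(line)
        if len(block) >= chunk:
            out.write("".join(block))
            block = []
    if block:
        out.write("".join(block))

# both paths must produce identical output
fast, small = io.StringIO(), io.StringIO()
verts = [(float(i), 0.0, -1.0) for i in range(2500)]
export_vert_lines(fast, verts)
export_vert_lines(small, verts, low_memory=True, chunk=100)
```

The point is that only the buffering strategy changes; the bytes written to the file are the same either way, so the user can pick speed or memory without affecting the XML.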


Posted: Mon Mar 12, 2007 8:12 pm
by Wedge
From what I can tell, the exportMesh function puts all of the mesh data into one big string (or a list, but that list is quickly joined into a string). It is called from the exportObject function, where the data gets its <model> or <meshlight> tags and the data relevant to that area. That data is also appended (+=) to the giant string of mesh data, producing one massive, memory-error-causing string. This single string holds the data for one object/mesh.

It is the size of this single string that causes the memory error (or at least that is what my weekend of testing has shown me).

I have good news! I was finally able to clear my mind and rethink this. I have separated the mesh into about three pieces and saved them as three smaller strings; these are then written one by one to the file. I have just run a test exporting a 636,000 FA / 636,000 VE uvsphere and all went well. It took roughly 11 minutes, in a basic scene with only a plane and a cube. I am unsure what size will cause a memory error now.
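Roughly, the split-and-flush idea looks like this (the 512 KB threshold here is only a placeholder; as I said, the real safe limit still needs testing):

```python
import io

MAX_PIECE = 512 * 1024  # placeholder threshold, not a measured limit

def export_mesh_pieces(out, header, chunks, footer):
    # Instead of growing one giant string with +=, accumulate mesh text
    # only up to MAX_PIECE characters, then flush that piece to the file
    # and start collecting the next one.
    out.write(header)
    piece, size = [], 0
    for chunk in chunks:
        piece.append(chunk)
        size += len(chunk)
        if size >= MAX_PIECE:
            out.write("".join(piece))
            piece, size = [], 0
    out.write("".join(piece))  # flush whatever is left over
    out.write(footer)

# usage: 600 chunks of 1000 characters forces at least one mid-mesh flush
buf = io.StringIO()
export_mesh_pieces(buf, "<model>\n", ["a" * 1000 for _ in range(600)], "</model>\n")
```

No single in-memory string ever grows much past MAX_PIECE, no matter how big the mesh is.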

Unfortunately, the code is a mess now and also very slow, because I only solved this problem towards the end of a long trial-and-error session. After rewriting over and over trying new ideas, I started taking shortcuts and copy/pasting code all over to save time on failed attempts. So now it is doing things it doesn't need to, which makes it slower. I don't think 11 minutes is too bad, but I'm pretty sure it can be lower: my normal test scene took 60+ seconds to export instead of 30. I am hoping I can strip the unnecessary code out of the mess and bring the times back down to near 30.

Then, of course, after I make sure everything is OK, I'll post the updated exporter. I am hoping it is ready by late Monday or Tuesday. (my time: GMT-5) :)

I was thinking of adding a tag to mesh names to determine how each mesh gets exported, or maybe a button in the GUI turning on global slow (large-mesh) exporting. But for now I am hoping that, the way it is written, small meshes bypass the functions the big meshes use. Since I do not really know exactly when the memory error happens, it is a rough guess how big to let a mesh grow before it is separated into piece number two and then three. So future tweaking may also speed this up a little for meshes that are just over that threshold, if I find the threshold can be moved a bit towards the larger side.

I do not know any easy way to test this, and the mesh data can also be expanded by how many minus signs are inside it (like a -1.0000 vertex position or normal). Since positive values have no + sign and zero values have no sign at all, it is tough to test exactly what size of mesh throws the error. UV-wrapped meshes add to the problem. These extra variables make it questionable how high to let a mesh grow and how much headroom to give it...
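The minus-sign point is easy to check directly: a negative coordinate serialises one character longer than a positive one, so the text size of a mesh depends on the actual coordinate values, not just the vertex count.

```python
# %f-style formatting is typical for coordinates; a leading minus sign
# adds one character per negative value.
pos = "%f" % 1.0    # '1.000000', 8 characters
neg = "%f" % -1.0   # '-1.000000', 9 characters
```

Two meshes with the same vertex count can therefore serialise to noticeably different sizes, which is why any fixed size threshold needs headroom.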

Update from early Tuesday morning: I have fixed all the problems I found in my messy code. I was also able to bring the time back down to where it is the same speed with or without the split-mesh feature. So this will be included in the finished exporter as a hidden feature that is always on, waiting to split big meshes apart right before they would normally throw memory errors. I am going to run some more tests today and post this by late Tuesday. (my time) That way I can make sure the bugs are gone, take some timings, and see how big the meshes can get without throwing errors.