Compression of .igi

General questions about Indigo, the scene format, rendering etc...
mrCarnivore
Posts: 517
Joined: Sun Mar 04, 2007 6:20 am
Location: Stuttgart, Germany

Compression of .igi

Post by mrCarnivore » Thu Mar 22, 2007 2:45 am

@OnoSendai:

As .igi files easily exceed 50 MB even for simple scenes, I think lossless compression (even a simple scheme) could be very beneficial to the .igi format.

If you would rather spend your time improving Indigo itself, I would be happy to implement a lossless compression algorithm as an extension to the .igi format. Of course I would give you the code once it works, so that you can integrate it into Indigo.

What do you think about that?

zsouthboy
Posts: 1395
Joined: Fri Oct 13, 2006 5:12 am

Post by zsouthboy » Thu Mar 22, 2007 3:12 am

If that happens, send the framebuffer (for network rendering) the same way too, if possible. :)

Deus
Posts: 336
Joined: Sun Feb 04, 2007 3:47 am

Post by Deus » Thu Mar 22, 2007 10:15 am

Lossless compression would suck on floating-point (HDR) framebuffer values from a stochastic raytracer. I doubt it would compress anything at all besides the areas not hit by rays. (Not being aggressive here ;-) )
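For illustration, here is a minimal sketch (not Indigo code; the plain float-buffer layout is an assumption) that feeds noisy float samples straight into zlib's compress() and prints the ratio. The low mantissa bits of stochastic estimates are close to random, so most of any gain comes from the predictable sign/exponent bytes:

```cpp
// Quick experiment: zlib on a buffer of noisy floats standing in for
// stochastic per-pixel estimates. Build with: g++ test.cpp -lz
#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <vector>
#include <zlib.h>

int main() {
    const std::size_t n = 1 << 20;              // ~4 MB of float samples
    std::vector<float> pixels(n);
    for (std::size_t i = 0; i < n; ++i)         // noisy values in [0.5, 1.0)
        pixels[i] = 0.5f + 0.5f * (std::rand() / (RAND_MAX + 1.0f));

    const uLong srcLen = n * sizeof(float);
    uLongf dstLen = compressBound(srcLen);      // worst-case output size
    std::vector<Bytef> out(dstLen);
    if (compress(out.data(), &dstLen,
                 reinterpret_cast<const Bytef*>(pixels.data()), srcLen) == Z_OK)
        std::printf("compressed to %.1f%% of original\n",
                    100.0 * dstLen / srcLen);
    return 0;
}
```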

mrCarnivore
Posts: 517
Joined: Sun Mar 04, 2007 6:20 am
Location: Stuttgart, Germany

Post by mrCarnivore » Thu Mar 22, 2007 9:11 pm


OnoSendai
Developer
Posts: 6241
Joined: Sat May 20, 2006 6:16 pm
Location: Wellington, NZ

Post by OnoSendai » Thu Mar 22, 2007 9:28 pm

I'll try it with zlib first and see what kind of compression I get with that.

OnoSendai
Developer
Posts: 6241
Joined: Sat May 20, 2006 6:16 pm
Location: Wellington, NZ

Post by OnoSendai » Thu Mar 22, 2007 9:30 pm

Deus wrote: Lossless compression would suck on floating-point (HDR) framebuffer values from a stochastic raytracer. I doubt it would compress anything at all besides the areas not hit by rays. (Not being aggressive here ;-) )
Yeah... it's hard to say how much it will compress, but I think the exponents and the more significant mantissa bits will be predictable enough to get a reasonable amount of compression.
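One generic trick for cashing in on that predictability (a sketch only; the to_byte_planes helper is hypothetical, not anything Indigo actually does): reorder each float's bytes into planes, so the near-constant exponent bytes sit next to each other where deflate can model them:

```cpp
// Byte-plane transform before deflating. Assumes little-endian IEEE-754
// floats: byte 3 holds the sign and most of the exponent, byte 0 the
// noisiest mantissa bits.
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

std::vector<std::uint8_t> to_byte_planes(const std::vector<float>& v) {
    const std::size_t n = v.size();
    std::vector<std::uint8_t> planes(n * 4);
    for (std::size_t i = 0; i < n; ++i) {
        std::uint8_t b[4];
        std::memcpy(b, &v[i], 4);
        for (int p = 0; p < 4; ++p)
            planes[p * n + i] = b[p];  // plane p collects byte p of every float
    }
    return planes;                     // feed this buffer to compress()
}
```

The inverse just reverses the copy, so the transform stays lossless.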

Deus
Posts: 336
Joined: Sun Feb 04, 2007 3:47 am

Post by Deus » Thu Mar 22, 2007 9:39 pm

Why don't we just zip the darn file and stop bitching about theory? :P

You post some paper from 2006? I realize you don't work in the industry, Carnivore. You are probably some clever, naive young gun :) i.e. you don't build a simple format for a raytracer and then go implement some untested compression algorithm from scratch in the faint hope that it will compress the file. Even if it did compress the file 10:1, it would still not be a big deal, or even worth the hassle; 50 MB files are a piece of cake on normal computers. Carnivore, let me give you a piece of advice: if something is necessary, implement it. If it's not necessary, don't. Just being nice is not justification enough for existence (that's the reason why I am such an asshole). Most software is crap precisely because they added stuff that is not necessary (like the helper in Excel). This is just as unimportant as an API or optimized builds for Indigo. It feels good somewhere in that engineering heart and roleplaying, stat-crunching mind, but in practice it's really pointless. I am sorry. Sorry for bitching about this, but I know I am right about all of this. FTR: I LOVE compression algorithms.

Over and out ;-)

OnoSendai
Developer
Posts: 6241
Joined: Sat May 20, 2006 6:16 pm
Location: Wellington, NZ

Post by OnoSendai » Thu Mar 22, 2007 10:16 pm

Deus wrote: Why don't we just zip the darn file and stop bitching about theory? :P [...]
hehehe... nice rant :)
That's why I will probably use zlib... because I can compile + link with it, zip the data (with one function call), and you're done. And you know the person at the other end can do the same. (Unless they're running Java? :P)
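For reference, the "one function call" really is about this small. A minimal sketch around zlib's compress2() and compressBound(), which are the real zlib API; the wrapper name and error handling are assumptions of mine:

```cpp
// Deflate a buffer with zlib in one call.
#include <vector>
#include <zlib.h>

std::vector<Bytef> deflate_buffer(const Bytef* src, uLong srcLen) {
    uLongf dstLen = compressBound(srcLen);   // worst-case output size
    std::vector<Bytef> dst(dstLen);
    if (compress2(dst.data(), &dstLen, src, srcLen,
                  Z_BEST_COMPRESSION) != Z_OK)
        return {};                           // empty on failure
    dst.resize(dstLen);                      // shrink to actual size
    return dst;
}
```

Reading it back is the mirror image, uncompress(dst, &dstLen, src, srcLen), provided the uncompressed size was stored alongside the compressed data.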

mrCarnivore
Posts: 517
Joined: Sun Mar 04, 2007 6:20 am
Location: Stuttgart, Germany

Post by mrCarnivore » Thu Mar 22, 2007 10:19 pm

OnoSendai wrote: I'll try it with zlib first and see what kind of compression I get with that.
Yeah, I think that might be the path of least resistance. ;-)

No, honestly: although an image compression algorithm works WAY better on images than zlib, since it also exploits the correlation between the colour channels and between neighbouring pixels, you might still only get 30% compression out of an image-specific lossless scheme, so it might not be worth the hassle. Going with zlib might give the best cost/benefit...
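To illustrate the channel-correlation point, here is a sketch of one exactly invertible decorrelation step (the buffer layout and function name are hypothetical, not .igi's actual format): XOR the raw bit patterns of red and blue against green. Similar channel values share sign, exponent, and high mantissa bits, so the results have long runs of zero bits that deflate handles well; unlike float subtraction, XOR round-trips exactly, which keeps the transform lossless.

```cpp
#include <cstddef>
#include <cstdint>

// buf holds interleaved float bit patterns: r0 g0 b0 r1 g1 b1 ...
void decorrelate_rgb(std::uint32_t* buf, std::size_t nPixels) {
    for (std::size_t i = 0; i < nPixels; ++i) {
        buf[3 * i + 0] ^= buf[3 * i + 1];   // red  -> red XOR green
        buf[3 * i + 2] ^= buf[3 * i + 1];   // blue -> blue XOR green
    }
}
// XOR is its own inverse and green is left untouched, so running
// decorrelate_rgb() a second time restores the original pixels.
```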
Deus wrote: Even if it did compress the file 10:1, it would still not be a big deal, or even worth the hassle; 50 MB files are a piece of cake on normal computers.
Seems like you are either too affected by your nick, or you think that only something worthy of the Nobel Prize deserves to be implemented...
Deus wrote: If something is necessary, implement it. If it's not necessary, don't. Most software is crap precisely because they added stuff that is not necessary (like the helper in Excel). This is just as unimportant as an API or optimized builds for Indigo.
You really do have a funny opinion! So you think using -SSEn optimized builds is comparable to the helper in Excel?? Let me give you some advice: you should really think things through before you make up your mind! The helper is an annoying addition that is not necessary, you are right. But optimized builds (using -SSEn optimizations) don't have any negative side effects, other than your render finishing quicker and you getting less time to watch TV or drink coffee while waiting for it...

Also, I would like to know your definition of "necessary". If something adds value without affecting other functions, it is almost always worth adding. The Excel helper pops up when you don't need it and annoys the hell out of almost everybody, so it definitely hurts usability and is therefore not "necessary". Building a faster renderer with just another compiler flag (it takes about 10 seconds of your life to add the flag), without any functions being negatively affected, is definitely worth it!

A compression algorithm that compressed the file by a factor of 10 would also be very much worth it, because it would add value without affecting any other function...

So please stop talking down to me and start giving real reasons.

Deus
Posts: 336
Joined: Sun Feb 04, 2007 3:47 am

Post by Deus » Thu Mar 22, 2007 11:37 pm

I am done with you, Carnivore. I like that you bite back :)

Have you tried compressing it yet, Ono?

OnoSendai
Developer
Posts: 6241
Joined: Sat May 20, 2006 6:16 pm
Location: Wellington, NZ

Post by OnoSendai » Thu Mar 22, 2007 11:49 pm

No, not yet. Feel free to try it yourself, though :)

mrCarnivore
Posts: 517
Joined: Sun Mar 04, 2007 6:20 am
Location: Stuttgart, Germany

Post by mrCarnivore » Fri Mar 23, 2007 12:01 am

Deus wrote: I am done with you, Carnivore.
Sad that you rail against a lot of things and ramble on about how bad you think they are, with your know-it-all attitude, without giving any real reasons...

Good to know how much to value your other comments from now on, though.

IMSabbel
Posts: 3
Joined: Mon Jan 29, 2007 9:35 pm

Post by IMSabbel » Fri Mar 23, 2007 1:38 am

Sorry Carnivore, but I am with Deus.
First: maybe it's your paper, but the content isn't really great at all (meaning the algorithm is useless; probably just a result of publish-or-perish in action).
Second: image compression algorithms are only _marginally_ better than general-purpose ones (look at maximumcompression, for example).
Third: they will fail horribly on floating-point images. Really horribly. And even more so on Indigo-style ones.
Fourth: you don't have a monopoly on wisdom either.

manitwo
Posts: 1029
Joined: Wed Jul 05, 2006 4:50 am
Location: Tirol - Austria

Post by manitwo » Fri Mar 23, 2007 2:10 am

Hmm... with WinRAR I can pack them down to ~75% of the original .igi file size.

Deus
Posts: 336
Joined: Sun Feb 04, 2007 3:47 am

Post by Deus » Fri Mar 23, 2007 2:22 am

I don't need to prove my point. It's OK to be wrong. I used to be wrong a lot when I was younger. It's the only way to be right ;-)

Yeah. 25% reduction = pointless. Next useless feature, please.

EDIT: I am gonna be real, real nice right now. Carnivore, you seem to be dying to co-develop Indigo. I suggest you start by doing something you want yourself. Why don't you start by creating an API that can generate Indigo XML... then you kill two birds with one stone. :)
Last edited by Deus on Fri Mar 23, 2007 2:27 am, edited 1 time in total.
