Forum Bugs

Memory usage for large image renders

daneren2005
We just re-upped our support contract because we were hoping that an issue we have been running into was fixed. After updating I am still seeing the same issue. When generating large (e.g. 20k x 15k) renders, we are seeing huge memory usage, and it seems to climb quickly as the renders get larger. This is for doing large composites at 300 DPI. I tried just lowering the DPI when we go really big like this, but it caused quality issues at the printers. Is there any way to reduce the memory usage for giant renders like this so it isn't using a gig of memory when creating the jpg output?

As a note I am using Prince 12.5. I see this issue across both Heroku's Ubuntu 16.04 image and Debian Stretch.
mikeday
That might be tricky; at 32-bits per pixel a 20k x 15k image will take 1,200,000,000 bytes, so Prince is going to use over a gig of memory while creating the image.
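For reference, the arithmetic behind that estimate (assuming an uncompressed RGBA buffer, i.e. 4 bytes per pixel):

```python
# Uncompressed pixel buffer for a full-page raster at 32 bits per pixel
# (4 bytes, e.g. RGBA). Figures match the 20k x 15k render discussed above.
width, height = 20_000, 15_000
bytes_per_pixel = 4

buffer_bytes = width * height * bytes_per_pixel
print(buffer_bytes)                         # 1200000000 bytes
print(round(buffer_bytes / 1024 ** 3, 2))   # 1.12 GiB
```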

Is this rendering HTML/SVG to JPEG or applying filters?
daneren2005
This is converting some semi-complex HTML with embedded images into a rendered jpg. An example render can be downloaded at https://bit.ly/2qMqcFe

daneren2005
I am not sure if there is a way to generate the image in chunks or compress part of the memory while it is waiting for the rest of the image to be rendered. I am pretty sure that tools like libvips can operate on large images without grabbing a 1GB buffer. Obviously they can optimize processes like that a little more easily since that is their entire job, but I was just wondering if there is something we could do to limit this super high memory usage.
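To make the chunked idea concrete, here is a toy model of strip-based processing in the style libvips uses: the image is handled in fixed-height horizontal strips, so the peak buffer is a small fraction of the full raster. The 256-row strip height is an arbitrary illustration, not anything Prince or libvips is known to use.

```python
# Sketch: process a 20k x 15k, 4-bytes-per-pixel image in horizontal
# strips instead of one full-frame buffer. Purely illustrative numbers.
WIDTH, HEIGHT, BPP = 20_000, 15_000, 4
STRIP_ROWS = 256  # arbitrary strip height chosen for this example

def strip_ranges(height, strip_rows):
    """Yield (start_row, end_row) for each horizontal strip."""
    for start in range(0, height, strip_rows):
        yield start, min(start + strip_rows, height)

full_buffer = WIDTH * HEIGHT * BPP       # ~1.2 GB if held all at once
strip_buffer = WIDTH * STRIP_ROWS * BPP  # ~20 MB held at any moment

strips = list(strip_ranges(HEIGHT, STRIP_ROWS))
print(len(strips))    # 59 strips
print(strip_buffer)   # 20480000 bytes per strip
```

The catch, as noted below, is that this only works cleanly when each output strip depends on a bounded region of input, which is not guaranteed for arbitrary HTML/SVG content.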

PS: if you need it, I can also email you an example of the HTML to run exactly what I am running. I am not sure how easy it is to reproduce this issue without my exact HTML. I don't want to post it here because of the proprietary images involved in it.

mikeday
Yes, it is theoretically possible to page parts of the image to disk so that only portions of it need to be in memory at a time, although this would require some work to implement. It might be a bit more complicated given that we are not starting with an existing image on disk like libvips, and we have to consider awkward cases like rasterising complex content that can potentially touch the entire image.
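A rough illustration of the paging idea, using only the Python standard library: each strip is rasterised, spilled to a temporary file, and its in-memory buffer released before the next strip is produced. This is a toy model (the `render_strip` stand-in and all sizes are invented for the demo), and it sidesteps exactly the hard case mentioned above, where rendering one strip can touch pixels anywhere in the image.

```python
import os
import tempfile

# Toy model of paging raster strips to disk: only one strip's pixels
# live in memory at any moment; finished strips are spilled to a file.
WIDTH, BPP, STRIP_ROWS = 1_000, 4, 64  # small numbers for the demo
HEIGHT = 256

def render_strip(start_row, end_row):
    """Stand-in for the rasteriser: returns the strip's pixel bytes."""
    return bytes(WIDTH * (end_row - start_row) * BPP)

with tempfile.TemporaryFile() as spill:
    for start in range(0, HEIGHT, STRIP_ROWS):
        end = min(start + STRIP_ROWS, HEIGHT)
        spill.write(render_strip(start, end))  # spill strip, free its RAM
    spill.seek(0)
    total = os.fstat(spill.fileno()).st_size
    print(total)  # 1024000 bytes on disk, never all in memory at once
```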