...
Current implementation uses the existing download_large size derivatives. I thought it was good to include high-res images suitable for high-quality printing, but this leads to pretty large PDF sizes, and contributes to high RAM use. Try with download_medium and see if that alone lets us do very large PDFs without worrying about it? Probably good enough.
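For reference, roughly what the page-per-image loop looks like with prawn, with the derivative choice as the only thing that changes. This is a minimal sketch, not our actual code; `image_paths` is assumed to be whichever derivative files (download_medium vs download_large) have already been fetched to local disk.

```ruby
require "prawn"

# Sketch: build a one-page-per-image PDF from local image files.
def build_pdf(image_paths, out_path: "work.pdf")
  Prawn::Document.generate(out_path, margin: 0, skip_page_creation: true) do |pdf|
    image_paths.each do |path|
      pdf.start_new_page
      # Fit each image to the page; smaller source JPEGs shrink both the
      # output PDF and prawn's working memory.
      pdf.image path, fit: [pdf.bounds.width, pdf.bounds.height],
                      position: :center, vposition: :center
    end
  end
end
```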
Tried it: uses significantly less RAM, but our biggest works still use too much, so it doesn't get us all the way there, unless we're going to limit PDF generation to a 500-page max or something. (Smaller images may be better for users anyway; may do that regardless.) Numbers below, with a rough sketch of such a page-cap guard after them.
50-image work, qf85nc451. Large images: 99MB PDF, 440MB RAM. Smaller images: 20MB PDF, 284MB RAM.
100-image work, fx719n43f. Large images: 171MB PDF, 549MB RAM. Smaller images: 35MB PDF, 328MB RAM.
325-image work, 1831ck38c. Large images: 386MB PDF, 912MB RAM, with out-of-memory errors. Smaller images: 92MB PDF, 472MB RAM.
Ramelli, 694 items. Large images: 1.8GB PDF(!), did not measure RAM, far too much for heroku. Smaller images: 325MB PDF, 987MB RAM, still with heroku out-of-memory errors. So this is around the limit of what we can still fit on heroku (and we may have a few works even larger than this).
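If we do go the page-cap route, it could be as simple as a guard like this (a sketch only; the constant and method names are made up for illustration):

```ruby
# Hypothetical cap; 500 is just the number floated above.
MAX_PDF_PAGES = 500

def pdf_generation_allowed?(page_count)
  page_count <= MAX_PDF_PAGES
end

pdf_generation_allowed?(325)  # => true,  1831ck38c would still get a PDF
pdf_generation_allowed?(694)  # => false, Ramelli would not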
Ruby hexapdf instead of prawn
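Not measured yet, but a minimal sketch of the same page-per-image loop in hexapdf, for comparison (again `image_paths` is assumed to be an array of local derivative files; this is not our actual implementation):

```ruby
require "hexapdf"

# Sketch: one page per image, each page sized to its image.
def build_pdf_hexapdf(image_paths, out_path: "work.pdf")
  doc = HexaPDF::Document.new
  image_paths.each do |path|
    image = doc.images.add(path)   # embed the image XObject
    page  = doc.pages.add([0, 0, image.width, image.height])
    page.canvas.image(image, at: [0, 0], width: image.width, height: image.height)
  end
  doc.write(out_path, optimize: true)   # optimize: dedup/compress objects
end
```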
...