Inspired by the informative post from Jumping Rivers about selecting the correct image file type, I decided to optimise PNG file size as part of this blog’s CI pipeline.
OptiPNG
As suggested, I used `optipng`. Let's see how well this works on a large sample PNG file.
```
$ optipng moon.png
** Processing: moon.png
8192x4096 pixels, 3x8 bits/pixel, RGB
Input IDAT size = 55785027 bytes
Input file size = 55810891 bytes

Trying:
  zc = 9  zm = 8  zs = 0  f = 5   IDAT size = 49789959

Selecting parameters:
  zc = 9  zm = 8  zs = 0  f = 5   IDAT size = 49789959

Output IDAT size = 49789959 bytes (5995068 bytes decrease)
Output file size = 49795399 bytes (6015492 bytes = 10.78% decrease)
```
Not bad: a 10.78% decrease in file size, from roughly 53 MiB to 47 MiB.
I settled for the default level of optimisation (equivalent to `-o2`). As the package authors note, more aggressive optimisation is unlikely to yield a substantially smaller image, although it will take substantially longer.
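For reference, the optimisation level can be set explicitly with optipng's `-o` flag (levels run from `-o0` to `-o7`); the commands below are a sketch rather than the exact invocations used for this post:

```sh
# Default optimisation level; the same as running optipng with no -o flag.
optipng -o2 moon.png

# Most aggressive level: tries many more compression/filter combinations,
# so it runs far longer for (usually) only marginally smaller output.
optipng -o7 moon.png
```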
Automating
I wanted all of the “larger” PNG files on my blog to be optimised, and I obviously didn’t want to do this by hand, so I added it to the CI pipeline.
```yaml
optimise:
  stage: optimise
  variables:
    GIT_SUBMODULE_STRATEGY: none
  script:
    - find public/ -name "*.png" -size +50k -exec optipng {} \;
  only:
    - master
```
The `-size +50k` option filters out files which are smaller than 50 kiB. This is an arbitrary threshold, but it doesn’t seem worthwhile using resources to optimise files which are already fairly small.
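Before relying on the pipeline, it can be handy to check locally which files the job will touch. The commands below are an illustrative sketch; the parallel `xargs` variant is my own suggestion, not part of the actual pipeline:

```sh
# List the PNGs larger than 50 kiB that the CI job would optimise.
find public/ -name "*.png" -size +50k -print

# Optional: optimise them across 4 parallel processes instead of one
# at a time; the pipeline itself uses the simpler -exec form above.
find public/ -name "*.png" -size +50k -print0 | xargs -0 -P4 -n1 optipng
```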
Now each time the `master` branch is deployed, the larger PNG images are optimised.
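For completeness: the job's `stage: optimise` assumes that stage is declared in the pipeline's `stages:` list. A minimal sketch of how that might look in `.gitlab-ci.yml` (the `build` and `deploy` stage names are assumptions, not taken from this post):

```yaml
stages:
  - build      # build the site into public/
  - optimise   # run optipng on the larger PNGs
  - deploy     # publish the optimised site
```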