I'm aware of at least a few people using sharp for image optimisation.
Here's a technique I first saw almost 12(!) years ago that hasn't seen much love recently - http://www.websiteoptimization.com/speed/tweak/blur/
The linked article describes a manual process, but it could be automated:
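As a dependency-free sketch of why this works (in a real pipeline you would use sharp's `.blur(sigma)` before `.jpeg()`; the 1-D row, the sigma value, and the "high-frequency energy" proxy below are illustrative assumptions, not sharp internals):

```javascript
// Build a normalised 1-D Gaussian kernel for a given sigma.
function gaussianKernel(sigma, radius) {
  const weights = [];
  let sum = 0;
  for (let i = -radius; i <= radius; i++) {
    const w = Math.exp(-(i * i) / (2 * sigma * sigma));
    weights.push(w);
    sum += w;
  }
  return weights.map((w) => w / sum);
}

// Convolve a row of pixel values with the kernel, clamping at the edges.
function blur(row, sigma) {
  const radius = Math.ceil(3 * sigma);
  const kernel = gaussianKernel(sigma, radius);
  return row.map((_, x) => {
    let acc = 0;
    for (let i = -radius; i <= radius; i++) {
      const xi = Math.min(row.length - 1, Math.max(0, x + i));
      acc += row[xi] * kernel[i + radius];
    }
    return acc;
  });
}

// Crude proxy for JPEG cost: energy in adjacent-pixel differences.
// High-frequency detail is the expensive part of a DCT-based stream.
function highFreqEnergy(row) {
  let e = 0;
  for (let i = 1; i < row.length; i++) e += (row[i] - row[i - 1]) ** 2;
  return e;
}

// A smooth gradient with pixel-level noise layered on top.
const row = Array.from({ length: 64 }, (_, i) => i * 2 + ((i * 37) % 7) - 3);
const blurred = blur(row, 0.8);

// A mild blur strips the noise, so the "expensive" energy drops
// while the underlying gradient survives.
console.log(highFreqEnergy(row), highFreqEnergy(blurred));
```

With sharp itself the equivalent step is a one-liner along the lines of `sharp(input).blur(0.5).jpeg({ quality: 80 }).toFile(output)` (`.blur()` and `.jpeg()` are real sharp methods; the sigma and quality values here are placeholders to tune per image).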
This should make the frequency-based JPEG algorithm a little more efficient, and could also be used to remove noise.
Google's new Guetzli JPEG encoder appears to do something very similar:
https://github.com/google/guetzli/blob/547b45e2899069bb7084bd50daadd93f607d8c5d/guetzli/preprocess_downsample.cc#L238
It also sharpens dark red regions:
https://github.com/google/guetzli/blob/547b45e2899069bb7084bd50daadd93f607d8c5d/guetzli/preprocess_downsample.cc#L220
Guetzli apparently achieves quite impressive compression. Any plans to support it?
@sedubois Did you see the comments in the linked-to https://github.com/jcupitt/libvips/issues/623 ?
Thanks, I hadn't seen those.