End-to-end optimized image compression


Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its authors.


Authors: J. Ballé, V. Laparra, and E. P. Simoncelli
Journal/Conference: International Conference on Learning Representations (ICLR 2017)
Paper abstract: We describe an image compression system, consisting of a nonlinear encoding transformation, a uniform quantizer, and a nonlinear decoding transformation. The transforms are constructed in three successive layers of convolutional linear filters and nonlinear activation functions, but unlike most convolutional neural networks, we use a joint nonlinearity that implements a form of local gain control, inspired by those used to model biological neurons. Using a variant of stochastic gradient descent, we jointly optimize the entire system for rate–distortion performance over a database of training images, introducing a continuous proxy for the discontinuous loss function arising from the quantizer. The relaxed optimization problem resembles that of variational autoencoders, except that it must operate at any point along the rate–distortion curve, whereas the optimization of generative models aims only to minimize entropy of the data under the model. Across an independent set of test images, we find that the optimized coder generally exhibits better rate–distortion performance than the standard JPEG and JPEG 2000 compression systems. More importantly, we observe a dramatic improvement in visual quality for all images at all bit rates.
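The abstract's central trick is replacing the non-differentiable uniform quantizer with a continuous proxy during training: adding uniform noise on (-1/2, 1/2) stands in for rounding, so the rate–distortion loss becomes differentiable. The following is a minimal NumPy sketch of that idea, not the authors' MATLAB implementation; the density model `log_py` and trade-off weight `lam` are hypothetical stand-ins for the learned entropy model and the rate–distortion multiplier.

```python
import numpy as np

def quantize(y):
    # Hard uniform quantization used at test time (non-differentiable).
    return np.round(y)

def noisy_proxy(y, rng):
    # Training-time relaxation: additive uniform noise on (-0.5, 0.5)
    # stands in for rounding, making the loss continuous in y.
    return y + rng.uniform(-0.5, 0.5, size=y.shape)

def rd_loss(y_tilde, x, x_hat, log_py, lam):
    # Relaxed rate-distortion objective. The rate term is the negative
    # log-likelihood of the noisy codes under a density model `log_py`
    # (a hypothetical callable standing in for the learned entropy model);
    # distortion is mean squared error; `lam` sets the trade-off point
    # along the rate-distortion curve.
    rate = -np.sum(log_py(y_tilde))
    distortion = np.mean((x - x_hat) ** 2)
    return rate + lam * distortion

# Toy usage: codes from some encoder, a toy Gaussian-shaped log-density.
rng = np.random.default_rng(0)
y = np.array([0.2, 1.7, -0.6])
y_tilde = noisy_proxy(y, rng)
loss = rd_loss(y_tilde, y, quantize(y), log_py=lambda v: -0.5 * v**2, lam=0.1)
```

In the paper, the encoder/decoder transforms and the entropy model are trained jointly by stochastic gradient descent on this relaxed objective; the sketch above only isolates the quantizer relaxation.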
Date of publication: 2017
Code programming language: MATLAB

Copyright Researcher II 2021