A hybrid quantum enabled RBM advantage: convolutional autoencoders for quantum image compression and generative learning

Date

2020-05-20

Citation of Original Publication

Jennifer Sleeman, John Dorband, and Milton Halem, "A hybrid quantum enabled RBM advantage: convolutional autoencoders for quantum image compression and generative learning," Proc. SPIE 11391, Quantum Information Science, Sensing, and Computation XII, 113910B (20 May 2020); https://doi.org/10.1117/12.2558832

Rights

This item is likely protected under Title 17 of the U.S. Copyright Law. Unless on a Creative Commons license, for uses protected by Copyright Law, contact the copyright holder or the author.
©2020 Society of Photo-Optical Instrumentation Engineers (SPIE). One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited.

Abstract

Understanding how the D-Wave quantum computer could be used for machine learning problems is of growing interest. Our work explores the feasibility of using the D-Wave as a sampler for a machine learning task. We describe a hybrid method that combines a classical deep neural network autoencoder with a quantum annealing Restricted Boltzmann Machine (RBM), using the D-Wave for image generation. Our method overcomes two key limitations of the 2000-qubit D-Wave processor: the limited number of qubits available to accommodate typical problem sizes for fully connected quantum objective functions, and the restriction that samples are binary pixel representations. Working within these limitations, we show how we achieved nearly a 22-fold compression factor, reducing grayscale 28 x 28 images to binary 6 x 6 images with lossy recovery of the original 28 x 28 grayscale images. We further show that generating samples from the D-Wave after training the RBM produced 28 x 28 images that were variations of the original input data distribution, rather than recreations of the training samples. We evaluated the quality of this method with a downstream classification task: we formulated an MNIST classification problem using a deep convolutional neural network trained on samples from the quantum RBM, and compared the results with an MNIST classifier trained on the original MNIST training data set as well as one trained on classical RBM samples. We also explored a secondary dataset, the MNIST Fashion dataset, and demonstrated the first quantum-generated fashion. Our hybrid autoencoder approach indicates an advantage for the quantum RBM relative to a current classical RBM implementation for image-based machine learning, and even more promising results for the next-generation D-Wave quantum system. Our method for compression and image mapping is not constrained to RBMs; the autoencoder component could be coupled with other quantum-based algorithms.
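
To make the classical half of the pipeline concrete, the following is a minimal sketch (not the authors' code) of a convolutional autoencoder, written here in PyTorch, that compresses a 28 x 28 grayscale image to a 36-unit (6 x 6) binary latent code and reconstructs a lossy 28 x 28 image from it. Only the 28 x 28 input size and the 6 x 6 binary latent come from the abstract; the layer widths, kernel sizes, and the 0.5 binarization threshold are illustrative assumptions.

```python
# Minimal sketch of the hybrid pipeline's classical compression stage:
# a convolutional autoencoder that maps 28 x 28 grayscale images to a
# binary 6 x 6 (36-unit) code small enough to serve as the visible layer
# of an RBM sampled on a quantum annealer. Architectural details beyond
# the 28 x 28 input and 6 x 6 latent are assumptions, not the paper's.
import torch
import torch.nn as nn


class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: 1 x 28 x 28 image -> 36 logits (the 6 x 6 latent code).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 16 x 14 x 14
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32 x 7 x 7
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 7 * 7, 36),
        )
        # Decoder: 36 latent units -> reconstructed 1 x 28 x 28 grayscale image.
        self.decoder = nn.Sequential(
            nn.Linear(36, 32 * 7 * 7),
            nn.ReLU(),
            nn.Unflatten(1, (32, 7, 7)),
            nn.ConvTranspose2d(32, 16, kernel_size=4, stride=2, padding=1),  # 16 x 14 x 14
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=4, stride=2, padding=1),   # 1 x 28 x 28
            nn.Sigmoid(),
        )

    def encode_binary(self, x):
        # Threshold the sigmoid activations to obtain 0/1 codes for the RBM.
        return (torch.sigmoid(self.encoder(x)) > 0.5).float()

    def forward(self, x):
        # Keep the latent continuous during training so gradients can flow.
        z = torch.sigmoid(self.encoder(x))
        return self.decoder(z)


if __name__ == "__main__":
    model = ConvAutoencoder()
    images = torch.rand(8, 1, 28, 28)       # stand-in for a batch of MNIST images
    recon = model(images)                   # lossy 28 x 28 reconstructions
    codes = model.encode_binary(images)     # 8 x 36 binary codes (6 x 6 each)
    print(recon.shape, codes.shape)         # torch.Size([8, 1, 28, 28]) torch.Size([8, 36])
```

In the workflow described above, the binary codes from encode_binary would be used to train the quantum RBM on the D-Wave, and codes sampled from the trained RBM would be passed back through the decoder to produce new 28 x 28 images, such as those used to train the downstream MNIST classifier.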