May 19, 2024
Google Researchers Explore Quantum Computing Improvements to Image Classification
Quantum computing has attracted the attention of large technology companies, given its vast potential for tackling tasks at a scale that is difficult to achieve with traditional computers.

In a recently published paper, Thomas Fischbacher and Luciano Sbaiz, researchers at Google, presented a method that uses quantum computing techniques and resources to classify 28-by-28-pixel images illuminated by a single photon.

Before getting into the details of this work, if you are not familiar with some concepts of quantum computing, we have an introductory video available on our YouTube channel.

To recap the novelty: through this image-classification mechanism, in which the quantum state of that single photon is transformed, the researchers showed that at least 41.27% accuracy can be obtained on MNIST, a database commonly used to train image-processing systems. That is roughly double the 21.27% accuracy achievable with conventional computing approaches.

The model considers the highest classification accuracy achievable if an algorithm must make its decision after detecting the arrival of the first «quantum» of light (i.e., a single photon), much as an LCD screen reveals an image point by point, but applied here to a dataset.

Following the traditional approach to MNIST, classical computing can at best detect which pixel of the image the photon lands on and guess the digit from the light-intensity distribution, which is obtained by rescaling each image's brightness so that its pixels sum to one.
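
To make this baseline concrete, here is a minimal sketch of that classical strategy, assuming MNIST is already loaded into NumPy arrays; the variable names and the per-pixel best-guess classifier are our illustration, not code from the paper.

```python
import numpy as np

# Sketch of the classical single-photon baseline described above.
# Assumes MNIST arrays of shape (N, 28, 28) for images and (N,) for
# integer labels; all names here are placeholders, not the authors' code.

def photon_distribution(images):
    """Rescale each image so its pixels sum to one: the probability that
    the single photon lands on each of the 784 pixels."""
    flat = images.reshape(len(images), -1).astype(np.float64)
    return flat / flat.sum(axis=1, keepdims=True)

def best_guess_per_pixel(images, labels, n_classes=10):
    """For every pixel, accumulate the photon probability contributed by
    each digit class and keep the most likely digit for that pixel."""
    probs = photon_distribution(images)
    mass = np.zeros((n_classes, probs.shape[1]))
    for digit in range(n_classes):
        mass[digit] = probs[labels == digit].sum(axis=0)
    return mass.argmax(axis=0)  # shape (784,): one guessed digit per pixel

def expected_accuracy(images, labels, guesses):
    """Expected accuracy when only the landing pixel is observed: per image,
    sum the landing probabilities of pixels whose guess matches the true
    label, then average over images."""
    probs = photon_distribution(images)
    hits = guesses[None, :] == labels[:, None]
    return float((probs * hits).sum(axis=1).mean())

# Hypothetical usage, once train/test arrays are loaded:
# guesses = best_guess_per_pixel(train_images, train_labels)
# print(expected_accuracy(test_images, test_labels, guesses))
```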

The paradigm shift offered by this quantum-mechanical solution is based on a mechanism that employs beam splitters, phase shifters, and other optical elements to create a hologram-like interference pattern. The region of the interference pattern in which the photon is detected then reveals the classification of the image.
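
As a rough illustration of that idea, and not the authors' actual construction, such an optical network can be modeled as a unitary matrix acting on the photon's 784 mode amplitudes, with the output modes grouped into ten detector regions, one per digit. The unitary below is random; in the paper it would be designed or trained for the task.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n):
    """Placeholder for the trained interferometer: a Haar-random unitary
    from the QR decomposition of a complex Gaussian matrix. The paper's
    optical network would instead be optimized for the task."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # normalize column phases

def class_probabilities(image, U, n_classes=10):
    """The photon's input amplitudes are the square roots of the image's
    normalized intensities; the optics apply U, and the probability of
    detecting the photon in each of ten output regions (one per digit)
    gives the per-class probabilities."""
    amplitudes = np.sqrt(image.ravel() / image.sum()).astype(complex)
    detection = np.abs(U @ amplitudes) ** 2  # landing probability per mode
    return np.array([r.sum() for r in np.array_split(detection, n_classes)])

# A single photon lands in exactly one region, so one run yields one
# stochastic prediction; the expected accuracy is the average probability
# assigned to the correct region across images.
# U = random_unitary(28 * 28)
# probs = class_probabilities(some_image, U)  # some_image: (28, 28) array
```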

One conclusion that stands out from this research is that illuminating a scene with many photons simultaneously is not a requirement for producing interference. «Conceptually, exploiting interference to improve the probability that a quantum experiment will produce the desired result is the essential idea underlying all quantum computation», the researchers pointed out in their publication, where they stressed that the findings were presented in terms understandable to experts across the field and could be carried over to areas beyond research, such as physics education.

Quantum computing is shaping up to remain a great ally of the advances arising around artificial intelligence and machine learning, and recent progress suggests the field will keep moving in that direction. Its emergence has opened up a world of possibilities for new initiatives and, above all, has encouraged research. However, given the associated costs, work of this magnitude remains within the reach of very few at present.

According to its authors, this work aims to show how quantum technology can help solve certain problems by making use of AI. You can find all the details of the research in the document the authors have published with their findings.
