Quantum Embedding | QML

Amit Nikhade
6 min read · Jun 24, 2021

Yes, you. We are itinerants exploring quantum technology. Come along with me.

Visit: amitnikhade.com


In our last quantum ML post, we covered the fundamentals of quantum machine learning and how to get started with it. We will keep inspecting each piece of QML terminology, so here I'll explain how embedding actually happens in quantum ML and how it differs from the classical method.

Introduction

Quantum machine learning promises speed with precision, which is why it is widely seen as a future direction for artificial intelligence. We usually feed input to a neural network in embedded form, whether the raw data is text, images, or audio: a neural network only accepts input as embedding vectors. In the same way, a quantum neural network needs its input embedded before the data can enter the machine learning cycle.

Phenomenon


Quantum machine learning involves embedding (encoding), which is performed by transforming classical data points into quantum states in Hilbert space via a quantum feature map.

Let’s go a little deeper into the definition.

If you are not yet familiar with quantum states, I recommend going through the quick guide below first.

With that background, quantum states need little further explanation. A quantum state provides a probability distribution for the outcome of each possible measurement on a system. It can be represented as a wave function, a set of quantum numbers, or a vector in Hilbert space.

The two orthonormal basis states {∣0⟩, ∣1⟩} together are called the computational basis.

What is a Hilbert space?

In simple terms, a Hilbert space is a linear vector space with some additional structure. A vector space is a space consisting of a collection of vectors that can be added together and multiplied by scalars.

A Hilbert space is equipped with an inner product operation, and it is complete and separable. The inner product takes two vectors (ψ, Φ) and yields a scalar value. It is conjugate-linear in the first argument, satisfies conjugate symmetry (⟨ψ, Φ⟩ = ⟨Φ, ψ⟩*), and gives a self inner product ⟨ψ, ψ⟩ ≥ 0. For now, this picture of a Hilbert space is all you need.
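To make the inner product concrete, here is a minimal sketch in Python/NumPy (my own illustration, not part of the original post) showing the computational basis states as vectors and the properties listed above:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # |0>
ket1 = np.array([0, 1], dtype=complex)  # |1>

# an arbitrary normalized state (psi) and a basis state (phi)
psi = (ket0 + 1j * ket1) / np.sqrt(2)
phi = ket0

# np.vdot conjugates its first argument, matching <psi|phi>
print(np.vdot(psi, phi))                                        # a scalar value
print(np.isclose(np.vdot(psi, phi), np.conj(np.vdot(phi, psi))))  # conjugate symmetry
print(np.vdot(psi, psi).real)                                   # self inner product >= 0 (here 1.0)
```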

Quantum feature map


A quantum feature map is a feature map that transforms data into a space where it is easier to process, reducing the resources needed to describe a large amount of data. You can think of it as a change of representation, much like the classical kernel trick.

Consider any machine learning problem, say a classification task. In the early stages we restructure the data with techniques such as feature selection and removing outliers and null values, which makes the data easier for the model to process and can even improve accuracy. In the same way, a feature map puts the data into a form the quantum model can work with.

Figure: the quantum feature map (source: PennyLane)

The figure shows the feature map ϕ, which maps data from the original space into a vector space (a Hilbert space). The data points become state vectors: x is mapped to |ϕ(x)⟩. I hope I haven't confused you; it is pretty simple. The mapping is implemented by a unitary transformation U_ϕ(x), realized as a variational (parameterized) quantum circuit. The parameters passed to the circuit (as we saw in our last article on how a parameterized quantum circuit is constructed) depend on the data.
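As a rough illustration (not the exact circuit from the figure), here is a minimal PennyLane sketch of a data-dependent unitary U_ϕ(x): each feature sets a rotation angle, so the prepared state |ϕ(x)⟩ depends on the classical input. The specific gate choice is my own assumption, purely for illustration.

```python
import pennylane as qml

n_qubits = 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def feature_map(x):
    # U_phi(x): the circuit's angles are set by the classical data point x
    for i in range(n_qubits):
        qml.Hadamard(wires=i)
        qml.RZ(x[i], wires=i)
    return qml.state()  # the state vector |phi(x)> in Hilbert space

print(feature_map([0.1, 0.5, 0.9]))
```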

These concepts are needed to understand quantum embedding, so go through them once more if anything is unclear.

There are various embedding techniques in the quantum ecosystem; let's look at some of them.

Amplitude Embedding

Figure: amplitude embedding (source: PennyLane)

Amplitude embedding is also known as wave function embedding. In simple terms, an amplitude is the height of a wave; in quantum computing, it is the coefficient attached to each basis state. In this kind of embedding, the data points are encoded into the amplitudes of a quantum state. The figure above illustrates amplitude encoding.

Normalized data

Let's consider a dataset α = {a1, a2, …, an} with n dimensions. The dataset is first normalized to length 1, since the features may come in different numeric ranges: the squared moduli of the amplitudes of a quantum state must sum to 1, as shown in the figure. When a whole dataset is encoded, the number of amplitudes is the product of the number of dimensions and the number of samples, and encoding N amplitudes needs only about log2(N) qubits. This makes the encoding denser than basis or angle encoding.
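Here is a minimal sketch using PennyLane's built-in AmplitudeEmbedding template (assuming PennyLane is installed): four features are packed into the amplitudes of a two-qubit state, with normalize=True handling the length-1 requirement.

```python
import numpy as np
import pennylane as qml

features = np.array([0.5, 1.2, -0.3, 0.8])   # 4 features fit into 2 qubits (2**2 amplitudes)

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(f):
    # normalize=True rescales f so the squared amplitudes sum to 1
    qml.AmplitudeEmbedding(features=f, wires=range(2), normalize=True)
    return qml.state()

state = circuit(features)
print(state)                        # amplitudes proportional to the input features
print(np.sum(np.abs(state) ** 2))   # ~1.0
```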

Basis Embedding

In basis embedding, the data has to be in the form of a binary string before it can be embedded. The idea is to use the computational basis: approximate a scalar value by its binary form and then map that bit string onto a quantum state.

The procedure has two steps: first, approximate a number by a binary bit string; second, encode that string as a computational basis state. For instance, x = 1001 is represented by the 4-qubit quantum state |1001⟩.

Figure: basis embedding of a dataset (source: PennyLane)

A dataset transformed with basis embedding can be written as the equation above, where M is the number of samples and each sample is an n-dimensional binary vector. The number of quantum subsystems (qubits) must be at least N for an N-bit binary string. The amplitude vectors produced by basis embedding are sparse.
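A minimal sketch with PennyLane's BasisEmbedding template (again an illustrative choice, assuming PennyLane is installed): the bit string 1001 is prepared on four qubits, one qubit per bit.

```python
import pennylane as qml

dev = qml.device("default.qubit", wires=4)

@qml.qnode(dev)
def circuit(bits):
    # one qubit per bit: [1, 0, 0, 1] prepares the basis state |1001>
    qml.BasisEmbedding(features=bits, wires=range(4))
    return qml.probs(wires=range(4))

probs = circuit([1, 0, 0, 1])
print(probs)   # all probability mass sits on index 9, i.e. binary 1001
```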

Angle embedding

Angle embedding preparation

Angle encoding is a simple and effective method for embedding data; it is essentially one of the most basic ways of transforming classical data into a quantum state, although it is not particularly robust. Angle embedding is performed by applying rotations about the x-axis or y-axis with quantum gates, using the values to be encoded as rotation angles. If we apply angle embedding to a dataset, the number of rotations equals the number of features, so an n-dimensional sample takes n qubits to prepare the quantum state.
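A minimal sketch with PennyLane's AngleEmbedding template, assuming PennyLane is installed: three features become three X-rotations on three qubits.

```python
import pennylane as qml

features = [0.3, 1.1, 2.4]   # n features -> n rotations -> n qubits

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(x):
    # each feature is used as the angle of an X-rotation on its own qubit
    qml.AngleEmbedding(features=x, wires=range(3), rotation="X")
    return qml.probs(wires=range(3))

print(circuit(features))
```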


Quantum AI is an emerging trend in technology, and we should be prepared for it. I hope this has laid a foundation for moving deeper into quantum AI. If any of the concepts are still unclear, go through them once more and they will fall into place.
