This is a continuation of “A great alternative to qubits — PART 1”! Click 👉here👈 to read it first
You might have seen massive optical tables covered in a complex, interconnected network of optical components. If you have, welcome to the field of Free-space optics!
These components are used to realize various types of quantum states, protocols, and algorithms.
Of course, these apparatuses are unusable in real-life settings. They’re like the ENIAC and UNIVAC back in the early days of digital computing: built for lab experiments rather than as computing devices for the masses.
Free-space optics can easily be realigned and rearranged depending on the experiment you want to perform. Moreover, optical losses in a specialized free-space setup are much lower than in optical fibers.
However, imagine that, just like the ENIAC/UNIVAC, you were tasked with enabling these machines to be used by individuals globally. Creating, maintaining, and rearranging these setups, let alone reproducing them, is no easy task.
You could experiment with things like auto-alignment, but it’s infeasible at scale for so many kinds of apparatuses.
So, what kinds of cool apparatuses are we talking about here? Let’s dive into the architecture of a CV-based Quantum Computer to explore the tech behind it 🔎 ✨
The Hardware 💾
It starts with light sources such as lasers, which might be squeezed and/or produce single photons.
Once the initial state of the light has been prepared, the next step is to pass that light through an optical circuit (also called an interferometer), much like the transistors in a CPU on a classical computer, to determine what operations and protocols are to be implemented. We will cover some of the most important components in detail shortly.
Finally, to get the results of the computation, we need to measure the light. This is done by using detectors — my favorite kinds are SNSPDs, but SPADs are pretty cool too, even though they work at room temperature (…come on, that was funny!).
The standard way to build a practical optical system is with optical fibers and waveguides. If you don’t know what waveguides are, they’re essentially the optical analogue of traces on a PCB, guiding light across a chip the way copper traces guide electricity.
Some types of components are harder to integrate than others. And while the efficiency of optical fibers and waveguides may be considerably lower than that of their lab-grade counterparts, their obvious benefits include easy wiring and portability, making them far easier to build into products for the masses.
Yet, the efficiency of the components can’t be too low. So while you can’t expect lab-level performance from integrated optical components, they still need to meet some minimum standard.
What we mean by the “efficiency” of an optical component is how little light it loses to unwanted reflection, absorption, noise, and so on.
In the case of working with squeezed light, the total losses of the system need to be less than 1%, which is significantly lower than even products marketed as “low-loss” integrated optical components. But what exactly are these components, and how do they work?
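To see why a sub-1% total-loss budget is so punishing, here is a minimal sketch of how per-component losses compound in series. The loss figures are made up for illustration, not measured values for any real product:

```python
# Sketch: small per-component losses compound along an optical path.
# The 0.5%-per-component figure below is purely illustrative.

def total_loss(component_losses):
    """Total fractional loss after passing through components in series."""
    transmission = 1.0
    for loss in component_losses:
        transmission *= (1.0 - loss)
    return 1.0 - transmission

# Five components, each losing just 0.5% of the light:
losses = [0.005] * 5
print(f"Total loss: {total_loss(losses):.2%}")  # ≈ 2.48% — already over a 1% budget
```

Even a handful of individually excellent components blows through the budget, which is why squeezed-light experiments are so demanding.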
Let’s start by discussing interferometers. They are the building block of any linear optical quantum information processing system (whew!). They are made using a combination of beam splitters and phase shifters.
The former mixes two optical inputs (often called modes) to produce an output. They have two input ports, labeled a and b, as well as two output ports labeled c and d. The inputs are split between the two output ports depending on two complex parameters called r and t, reflectance and transmittance respectively.
The latter is used to shift the phase of the light — that is, where it sits in its oscillation cycle — in a single circuit path (mode).
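To make the r/t description concrete, here is a minimal NumPy sketch of both building blocks. The matrix conventions (signs, where the phase sits) are one common textbook choice among several, not tied to any specific hardware or library:

```python
import numpy as np

def beam_splitter(theta):
    """2x2 unitary mixing input modes a, b into outputs c, d.

    Here t = cos(theta) and r = sin(theta), so |r|^2 + |t|^2 = 1
    (a lossless splitter). theta = pi/4 gives a 50:50 split.
    """
    t, r = np.cos(theta), np.sin(theta)
    return np.array([[t, -r],
                     [r,  t]])

def phase_shifter(phi):
    """1x1 'unitary' that shifts the phase of a single mode by phi."""
    return np.array([[np.exp(1j * phi)]])

U = beam_splitter(np.pi / 4)                   # 50:50 beam splitter
print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: unitary, so no light is lost
print(np.abs(U[:, 0]) ** 2)                    # input a splits evenly: [0.5 0.5]
```

Cascading such matrices (embedded in larger, mostly-identity matrices) is exactly how an interferometer’s overall transformation is built up.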
You can create an arbitrary interferometer by combining beam splitters and phase shifters. In fact, for every N-input N-output interferometer, you can compose a configuration of ~N² variable beam splitters and phase shifters to realize it. Do you notice a problem here?
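To put a number on that scaling: the widely used mesh decompositions of an N-input interferometer (Reck et al., later refined by Clements et al.) need N(N−1)/2 variable beam splitters, each paired with a phase shifter. A quick sketch of the count:

```python
# Sketch: how many tunable beam splitters a universal N-mode
# interferometer needs under the standard mesh decompositions.

def num_beam_splitters(n_modes):
    """N(N-1)/2 units, i.e. roughly N^2 / 2 knobs to keep calibrated."""
    return n_modes * (n_modes - 1) // 2

for n in (4, 8, 16, 100):
    print(f"{n:3d} modes -> {num_beam_splitters(n):4d} beam splitters")
# 4 -> 6, 8 -> 28, 16 -> 120, 100 -> 4950
```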
Creating a fully programmable network of linear optics requires tuning a number of parameters that grows on the order of N² as N increases! Not to mention that the beam-splitter layout has to be manufactured specifically for each N. Fortunately, scientists have figured out how to work around this issue using GKP Qubits.
Next up we have Optical Delay Lines, which connect the components of an optical circuit with each other. When deciding on the kind of optical delay line to be used, three things need to be taken into consideration:
- the fiber used for the line,
- the length of the line, and
- the shape of the line.
When considering materials for the creation of an Optical Delay Line, optical fibers are often preferred over waveguides as they can be made longer with fewer manufacturing resources.
However, we are not free to simply make the lines as long as we want to. No, that would be too easy!
The maximum length of the line depends on something called propagation loss, defined as the decreased strength of the light signal as it travels through the line.
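As a rough sketch, propagation loss is usually quoted in dB per kilometer; around 0.2 dB/km is a typical ballpark for telecom fiber near 1550 nm, not a spec for any particular product. The surviving fraction of optical power falls off exponentially with length:

```python
# Sketch: propagation loss in a fiber delay line, using the usual dB/km figure.

def transmission(length_km, alpha_db_per_km=0.2):
    """Fraction of optical power surviving a fiber of the given length."""
    return 10 ** (-alpha_db_per_km * length_km / 10)

for L in (1, 10, 50):
    print(f"{L:3d} km -> {transmission(L):.1%} transmitted")
# 1 km -> 95.5%, 10 km -> 63.1%, 50 km -> 10.0%
```

For classical signals these numbers are workable; for fragile quantum states of light, they cap delay lines at far shorter lengths.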
Finally, the spatial mode (a fancy word for the distribution of light across a cross-section) of light inside a waveguide is usually different from that inside an optical fiber, so coupling between the two requires careful mode matching to avoid extra loss.
Moving forward, we have Single-Photon Sources. There are many ways to physically realize them, such as combining a Spontaneous Parametric Down Conversion (SPDC) source and optical fiber or building the SPDC source on the waveguide directly.
Another option is using quantum dots, which, among other properties, are semiconductor structures that emit single photons in a single direction when illuminated by laser light. Unfortunately, they require extremely low-temperature environments to operate — not cool!
After that, we have an essential component for CV Quantum Computation called a Squeezed Light Source. The idea behind squeezing light is reshaping its quantum noise so that measuring one specific property of the light gives us a much more accurate result.
And this component does exactly that. The light produced from this source has reduced noise in one of its two quadratures (position or momentum), at the cost of increased noise in the other. If you’re interested in knowing more, I suggest you read my article on Squeezed Light for Biosensing Applications.
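For a feel of the numbers: a common way to model squeezing is with a squeezing parameter r. With the vacuum variance normalized to 1, one quadrature’s variance shrinks to e^(−2r) while the other grows to e^(+2r), so the uncertainty product is preserved — the noise is redistributed, not destroyed. A small sketch:

```python
import math

# Sketch: noise trade-off between the position-like (q) and
# momentum-like (p) quadratures of squeezed light.

def quadrature_variances(r):
    """Variances of the squeezed and anti-squeezed quadratures."""
    return math.exp(-2 * r), math.exp(2 * r)

for r in (0.0, 0.5, 1.0):
    vq, vp = quadrature_variances(r)
    db = -10 * math.log10(vq)  # squeezing level as usually quoted, in dB
    print(f"r={r}: var(q)={vq:.3f}, var(p)={vp:.3f}, {db:.1f} dB of squeezing")
```

Losses mix un-squeezed vacuum back in, which is exactly why the sub-1% loss budget mentioned earlier matters so much here.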
And finally, we have photon detectors. There are many types of these, so I will stick to just describing the workings of an SNSPD.
As you might guess, they are used to detect the presence of individual photons. These detectors are based on highly sensitive superconducting nanowires that are cooled to very low temperatures, typically around −270 °C.
When a photon is absorbed by the nanowire, it causes a small disturbance in its superconducting state, which can be detected by a readout circuit.
Conclusion 🤔
If you have been following news about quantum computing advancements, you will know that fabricating single qubits that can implement operations with high fidelity is a relatively simple task.
The hard part is scaling these machines up: what the industry is focusing on right now is retaining fault tolerance and improving error correction capabilities as qubit counts grow.
As you saw, many developments of CV Quantum States were simply extensions of qubit results into the Continuous Variable realm. However, some developments make it more than just a trivial “alternative” to qubits; rather, it becomes an important concept in quantum computation as a whole.
One of the first examples is the quantum teleportation protocol demonstrated on CV Quantum States. This experiment showed that CV Quantum States are just as capable of implementing such quantum protocols as qubits.
Another example involves scientists experimenting with GKP Qubits to achieve universal, fault-tolerant quantum computation. This, combined with newer experiments that show how this system can reasonably scale up with the number of available inputs, suggests that it may be a much better approach than using qubits for computation.
A common method used for error correction is creating logical qubits out of multiple regular qubits. However, since we’re using qumodes instead of qubits, we only need a single qumode to create a logical qubit!
Yes, I did say logical “qubit.” Well, if we’re going to be using qubits anyway for error correction, what’s the point of CV computation?
The thing is, we are still going to benefit from many aspects, such as a more natural representation of states and improved fault tolerance.
This highlights the significance of combining discrete and continuous quantum states in the development of quantum computers. Blending these two approaches by “digitizing” CV Quantum Information marks a major leap forward, allowing us to harness the unique strengths of both systems for building quantum computers with better fault tolerance.
Let’s end it here, with a great quote from Collaboration and Team Science — From Theory to Practice:
“…Scientific collaborations are especially interesting because they occur not just among people whose areas of expertise are complementary but also among people who are competitors or potential competitors. To work together, competitors must give up some of their autonomy, and so must have confidence that their mutual interests will take precedence over their individual interests.”