AWS has launched a new fully managed quantum computing service that lets users get hands-on with the steadily improving technology’s arcane algorithms, without having to run a quantum machine themselves.
The new service has been dubbed “AWS Braket” and is available in three US regions. The hyperscaler said it is already being experimented with by US biotech firm Amgen, Italian utility Enel, and Germany’s VW. It includes access to both a classically powered quantum simulator and a range of actual quantum systems from Canada’s D-Wave, Washington DC-based startup IonQ, and Berkeley’s Rigetti.
(AWS is following Microsoft in offering cloud-based access to a curated portfolio of machines from third-party providers: Azure Quantum — currently in limited preview — offers access to machines from IonQ, QCI, and Honeywell and should be GA this year. IBM, meanwhile, has been offering cloud-based access to its quantum computers via the Quantum Experience programme since 2016; and claims work by its 200,000 users has resulted in over 200 published academic papers. Credit where it’s due…)
How does AWS Braket work?
“You can design and build your own quantum algorithms from scratch or choose from a set of pre-built algorithms. Once you have built your algorithm, Amazon Braket provides a choice of simulators to test, troubleshoot and run your algorithms,” said AWS late on Thursday. AWS CEO Andy Jassy noted on Twitter that it is “still early days, but has the potential to be a game changer in computing.”
Users can access the AWS quantum computing service via a “notebook-style” interface that customers can choose, if they wish, to run in a virtual private cloud (VPC): a logically isolated section of the AWS Cloud in a virtual network that you define as a user (i.e. with control over your virtual networking environment, including IP address range, creation of subnets, and configuration of route tables and network gateways).
The interface is based on Jupyter‘s open-source web application.
Remind me, is this going to be difficult?
While the building blocks of classical computing are “bits” that use the 0 and 1 vocabulary of binary code, quantum computers use “qubits” that draw on two-state quantum-mechanical systems. In theory, because qubits can hold superpositions of states, quantum computers can process multiple values simultaneously, making them hugely powerful for certain problems. They remain error-prone and hard to scale, however, and require novel error-correction schemes to compensate for external “noise”; although progress is happening.
They are also deeply challenging to programme: unlike classical computers, which have, at the lowest level, circuits built from binary gates such as AND, OR, NOT and XOR, quantum computers use different kinds of gates, like CNOTs and Hadamards, that require entirely different sets of instructions.
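For a flavour of what those unfamiliar gates do, here is a minimal sketch (plain NumPy linear algebra, not any vendor SDK) of a two-qubit circuit: a Hadamard on the first qubit followed by a CNOT turns the |00⟩ state into the entangled Bell state, an equal superposition of |00⟩ and |11⟩.

```python
import numpy as np

# Single-qubit Hadamard gate and the two-qubit CNOT gate as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>: both qubits are 0.
state = np.array([1, 0, 0, 0], dtype=float)

# Apply H to qubit 0 (tensored with identity on qubit 1), then CNOT.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Amplitudes of |00>, |01>, |10>, |11>: roughly [0.707, 0, 0, 0.707],
# i.e. an equal superposition of |00> and |11>.
print(state)
```

No classical sequence of AND/OR/NOT gates on two bits can produce that correlated-but-undetermined output, which is precisely why quantum circuits need a different instruction set.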
(“I’m going to go to night school, take some classes, get a doctorate, and THEN I’ll be ready to misuse this thing as a database!” quipped consultant Corey Quinn of AWS’s tutorial on the Quantum Approximate Optimization Algorithm: a “step-by-step walkthrough explaining the QAOA quantum algorithm and how to build the corresponding parametrized quantum circuit ansatz”.)
Is it going to be expensive, putting my team on the quantum spot?
The curious can run simulations of gate-based quantum algorithms locally on their own hardware, within a managed notebook on a chosen AWS instance type, or via the fully managed simulation capability provided by Amazon Braket.
AWS said: “The local simulator is provided for free as part of the Amazon Braket SDK and is suitable for running small and medium scale simulations (typically up to 25 qubits). For larger, more complex algorithms (up to 34 qubits) that require high-performance compute resources, you can submit simulation tasks to the Amazon Braket service. The cost of using the Amazon Braket simulator is based on the duration of each simulation task. You will be billed at an hourly rate, in increments of one second, for the time taken to execute your simulation. If you use the managed simulator, you will be billed for a minimum of 15 seconds. The simulator is billed at $4.50 per hour.”
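That billing rule is simple enough to write down. The sketch below assumes durations are rounded up to whole seconds (AWS says only “in increments of one second”), with the stated 15-second minimum and $4.50/hour rate:

```python
import math

# Managed-simulator billing, per AWS's description: $4.50/hour, billed in
# one-second increments, with a 15-second minimum per task.
RATE_PER_HOUR = 4.50
MINIMUM_SECONDS = 15

def managed_sim_cost(duration_seconds: float) -> float:
    """Cost in USD for one managed simulation task of the given duration."""
    billed_seconds = max(math.ceil(duration_seconds), MINIMUM_SECONDS)
    return billed_seconds * RATE_PER_HOUR / 3600

print(round(managed_sim_cost(10), 5))    # 0.01875 (10s is billed as the 15s minimum)
print(round(managed_sim_cost(3600), 2))  # 4.5 (a full hour costs $4.50)
```

So a quick 10-second test run costs under two cents; the rate only starts to bite on long-running, high-qubit-count simulations.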
Thanks to Amazon Braket, I finally have the necessary quantum computing capabilities to analyze the AWS bill which as we all know, occupies multiple probabilistic states until observed by your VP of engineering.
For access to the actual quantum computers, it gets confusing: “There are two pricing components when using a quantum processing unit (QPU) on Amazon Braket. You will be charged both a per-task and a per-shot fee”, AWS explains.
“A shot is a single execution of a quantum algorithm, such as a single pass through each stage of a complete quantum circuit on a gate-based quantum computer, or one result sample of quantum annealing problem. The per-shot pricing depends on the QPU used. The per-shot price is not affected by the number or type of gates used in a quantum circuit or the number of variables used in a quantum annealing problem. A task is a sequence of repeated shots based on the same circuit design or annealing problem.”
We’ll leave our readers to thrash that one out. (Costs, to be fair, look reasonable.)
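Stripped of the jargon, the QPU pricing model reduces to a flat fee per task plus a fee for each shot in it. The fee values in this sketch are illustrative placeholders, not AWS’s published prices:

```python
# Per-task plus per-shot QPU pricing, as described by AWS. The dollar
# figures below are hypothetical examples, not actual Braket prices.
def qpu_task_cost(per_task_fee: float, per_shot_fee: float, shots: int) -> float:
    """Total USD cost for one task: a flat task fee plus a fee per shot."""
    return per_task_fee + per_shot_fee * shots

# e.g. a hypothetical $0.30 task fee and $0.00035 per shot, run 1,000 times:
print(round(qpu_task_cost(0.30, 0.00035, 1000), 4))  # 0.65
```

Since real circuits are typically run for hundreds or thousands of shots to build up a statistical result, the per-shot fee usually dominates the task fee.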
How could quantum computing actually be applied, for example in financial services?
To those wondering how this could be put to use, an August 10 paper by the IBM Quantum team (“Quantum computing for Finance: state of the art and future prospects”) suggests that options pricing, risk modelling and more could all benefit from quantum computing. Quantum machines could also “allow for a more precise approach to incorporating market volatility into an institution’s Tier 1 reporting, optimizing risk weighted assets results through a much more accurate/precise calculation process,” the authors suggest.
Quantum algorithms assessed by IBM for use in financial services. Credit: IBM Quantum

As they write: “Financial risk, which comes in many forms such as credit risk, liquidity risk, and market risk, is often estimated using models and simulations.
“For instance, the capital requirements imposed on banks under the Basel accords depend on the accuracy of risk models. Therefore, banks with more accurate models can make better use of their capital. Value at risk (VaR), a quantile of the loss distribution, is a widely used risk metric… Monte Carlo simulations are the method of choice to determine VaR and CVaR. They are done by building a model and computing the loss/profit distribution for different realizations of the model input parameters.
Many different runs are needed to achieve a representative distribution of the loss/profit distribution. Classical attempts to improve the performance are variance reduction or Quasi-Monte Carlo techniques. The first aims at reducing the constants while not changing the asymptotic scaling; whereas, the latter improves the asymptotic behavior, but only works well for low-dimensional problems.”
By using a technique called “Quantum Amplitude Estimation”, banks could secure a “quadratic speed-up over classical Monte Carlo (MC) simulations”, they suggest.
Risk modelling, in short, could get significantly faster.
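The classical baseline being sped up is easy to sketch. The toy model below (normally distributed losses, a simplifying assumption purely for illustration) estimates VaR exactly as the paper describes: simulate many loss scenarios, then take a quantile of the loss distribution.

```python
import numpy as np

# Classical Monte Carlo estimate of Value at Risk (VaR): simulate many
# portfolio loss scenarios, then take a quantile of the loss distribution.
# A standard-normal loss model is a toy assumption for illustration only.
rng = np.random.default_rng(seed=42)

def monte_carlo_var(n_samples: int, confidence: float = 0.99) -> float:
    losses = rng.normal(loc=0.0, scale=1.0, size=n_samples)
    return float(np.quantile(losses, confidence))

# Monte Carlo error shrinks as O(1/sqrt(N)); Quantum Amplitude Estimation
# promises O(1/N), i.e. quadratically fewer samples for the same precision.
estimate = monte_carlo_var(100_000)
print(round(estimate, 2))  # close to the true 99% quantile, ~2.33
```

The quadratic speed-up is in the sample count: halving the error classically means quadrupling N, whereas with Quantum Amplitude Estimation it would only mean doubling it.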
It’s not that straightforward…
All very exciting, but deal-breaking caveats remain.
As the authors note: “For some algorithms, loading the data can become computationally as expensive as using a classical algorithm to solve the problem.”
This boils down to the complexity of loading data into quantum machines, which do not follow the Von Neumann model, in which a CPU performing computation is connected by a system bus to volatile memory (RAM) and non-volatile memory (such as a hard drive). There is no quantum equivalent of a hard drive at the current level of hardware technology, and state preparation is outlandishly complex and error-prone.
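To see why loading is the bottleneck, consider amplitude encoding, one common scheme: it packs 2^n classical values into the amplitudes of an n-qubit state, but preparing that state on hardware generally takes on the order of 2^n gates, which can erase the quantum speed-up. A sketch of the classical-side arithmetic (normalise the data into unit-norm amplitudes):

```python
import numpy as np

# Amplitude encoding: pack up to 2^n classical values into the amplitudes
# of an n-qubit state. The vector must have unit norm, and preparing the
# state on real hardware generally needs O(2^n) gates: the loading cost.
def amplitude_encode(data):
    data = np.asarray(data, dtype=float)
    n_qubits = int(np.ceil(np.log2(len(data))))
    padded = np.zeros(2 ** n_qubits)
    padded[:len(data)] = data
    return padded / np.linalg.norm(padded)

state = amplitude_encode([3.0, 4.0])  # 2 values -> 1 qubit
print(state)  # [0.6 0.8]: squared amplitudes sum to 1
```

The normalisation itself is trivial; turning that vector into an actual physical quantum state, shot after shot, is where the cost piles up.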