Practical quantum computing will require error rates well below those achievable with physical qubits. Quantum error correction [1,2] offers a path to algorithmically relevant error rates by encoding logical qubits within many physical qubits, for which increasing the number of physical qubits enhances protection against physical errors. However, introducing more qubits also increases the number of error sources, so the density of errors must be sufficiently low for logical performance to improve with increasing code size. Here we report the measurement of logical qubit performance scaling across several code sizes, and demonstrate that our system of superconducting qubits has sufficient performance to overcome the additional errors from increasing qubit number. We find that our distance-5 surface code logical qubit modestly outperforms an ensemble of distance-3 logical qubits on average, in terms of both logical error probability over 25 cycles and logical error per cycle ((2.914 ± 0.016)% compared to (3.028 ± 0.023)%). To investigate damaging, low-probability error sources, we run a distance-25 repetition code and observe a 1.7 × 10⁻⁶ logical error per cycle floor set by a single high-energy event (1.6 × 10⁻⁷ excluding this event). We accurately model our experiment, extracting error budgets that highlight the biggest challenges for future systems.

However, not every algorithm capable of an exponential speedup is necessarily well suited to quantum systems. The team notes that while linear algebra has an exponential speedup, this is negated by I/O bottlenecks as soon as the matrix is loaded into memory. "These considerations help with separating hype from practicality in the search for quantum applications and can guide algorithmic developments," the paper reads. "Our analysis shows it is necessary for the community to focus on super-quadratic speedups, ideally exponential speedups, and one needs to carefully consider I/O bottlenecks."
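The per-cycle rates quoted for the distance-3 and distance-5 codes can be turned into a logical error probability after many cycles with a simple two-state model that assumes independent cycles. This is an illustrative approximation, not the paper's exact fitting procedure; the function name and the model are mine.

```python
# Sketch: converting a per-cycle logical error rate into a logical error
# probability after t cycles, using a simple two-state (bit-flip) Markov
# model with independent cycles. Illustrative only; not the paper's fit.

def logical_error_prob(eps_per_cycle, cycles):
    """P(t) = (1 - (1 - 2*eps)**t) / 2, saturating at 1/2."""
    return 0.5 * (1.0 - (1.0 - 2.0 * eps_per_cycle) ** cycles)

eps_d5 = 0.02914  # distance-5 logical error per cycle, from the text
eps_d3 = 0.03028  # distance-3 ensemble average, from the text

for label, eps in [("d=3", eps_d3), ("d=5", eps_d5)]:
    print(f"{label}: per-cycle {eps:.4%}, after 25 cycles "
          f"{logical_error_prob(eps, 25):.1%}")

# Ratio of per-cycle error rates: the "modest" d=3 -> d=5 improvement.
print(f"error-rate ratio eps_d3/eps_d5 = {eps_d3 / eps_d5:.3f}")
```

Under this toy model the error probability saturates at 1/2 for long runs, which is why per-cycle rates (rather than raw probabilities over a fixed window) are the natural figure of merit for comparing code distances.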
That's not the only problem facing quantum architectures. Input and output (I/O) bandwidth is another limiting factor. "Our research revealed that applications that rely on large datasets are better served by classical computing, because the bandwidth is too low on quantum systems to allow for applications such as searching databases or training machine learning models on large datasets," Troyer explained. He added that this means workloads like drug design, protein folding, and weather and climate prediction are, given the current state of the tech, better suited to conventional systems. This doesn't mean that quantum computing is worthless; it simply means that, at least for the foreseeable future, the applications for quantum systems are likely to be narrower than the marketers would have you believe. "Generally, quantum computers will be practical for 'big compute' problems on small data, not big data problems," the researchers wrote. One field likely to benefit from quantum systems is chemical and material sciences, because many of these workloads rely on relatively small datasets. "Many problems facing the world today boil down to chemistry and material science problems," Troyer emphasized. "If quantum computers only benefited chemistry and material science, that would be enough." Better and more efficient electric vehicles rely on finding better battery chemistries. More effective and targeted cancer drugs rely on computational biochemistry. Cryptanalysis using Shor's algorithm presents similar challenges, the researchers note.
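The case for super-quadratic speedups can be made with a back-of-envelope crossover calculation: with a quadratic (Grover-style) speedup, the quantum machine does ~√n operations instead of ~n, but each logical operation runs far slower than a classical one. The clock rates below are assumed round numbers for illustration, not figures from the paper.

```python
# Illustrative crossover for a quadratic (Grover-like) speedup.
# Rates are assumptions for the sketch, not measured values.
import math

CLASSICAL_OPS_PER_SEC = 1e9  # assumed: ~1 GHz effective classical rate
QUANTUM_OPS_PER_SEC = 1e5    # assumed: ~100 kHz logical gate rate

def classical_time(n):
    """Brute-force search: ~n operations."""
    return n / CLASSICAL_OPS_PER_SEC

def quantum_time(n):
    """Grover-style search: ~sqrt(n) operations, each far slower."""
    return math.sqrt(n) / QUANTUM_OPS_PER_SEC

# Quantum wins only when sqrt(n)/q < n/c, i.e. n > (c/q)**2.
crossover = (CLASSICAL_OPS_PER_SEC / QUANTUM_OPS_PER_SEC) ** 2
print(f"quantum advantage only for n > {crossover:.0e}")
print(f"time at crossover: {classical_time(crossover):.3f} s")
```

With these assumed rates the quadratic speedup only pays off beyond n ≈ 10⁸, and the gap widens if classical hardware parallelizes; an exponential speedup, by contrast, overcomes any constant slowdown at modest problem sizes.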