Researchers have unveiled an entirely new technique for correcting errors in quantum computer calculations, with the potential to remove significant barriers to a powerful new field of computing. 

Error correction is a well-developed field in conventional computing. Every cell phone, for example, requires checks and fixes to send and receive data over messy airwaves. Quantum computers have enormous potential to solve complex problems that conventional computers cannot, but that ability depends on harnessing the extremely fleeting behavior of subatomic particles. These computational units, known as qubits, are highly prone to errors, and checking them for errors can destroy the quantum information itself.

In a conventional computer, an error can be as simple as a bit of memory accidentally flipping from a 1 to a 0, or as messy as one wireless router interfering with another. A common way to deal with such errors is to build in redundancy, so that each piece of data is compared against duplicate copies. But that approach multiplies the amount of data to handle and creates more opportunities for error, so it only works if the vast majority of the information is already correct. Otherwise, checking bad data against bad data only digs deeper into error. “Redundancy is a bad strategy if your baseline error rate is too high,” Thompson said. “Getting below that threshold is the biggest challenge.”
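To make the redundancy idea concrete, here is a minimal classical sketch, not taken from the paper: a logical bit is stored as three copies and recovered by majority vote, and the benefit of that vote shrinks as the baseline error rate climbs. The three-copy scheme, the vote, and the error rates tried below are illustrative assumptions only.

```python
import random

def majority_vote(copies):
    """Recover one logical bit from three noisy copies by taking the majority."""
    return 1 if sum(copies) >= 2 else 0

def logical_error_rate(p, trials=100_000):
    """Store a logical 0 as three copies, flip each copy independently with
    probability p, and estimate how often the majority vote gets it wrong."""
    failures = 0
    for _ in range(trials):
        copies = [1 if random.random() < p else 0 for _ in range(3)]
        failures += majority_vote(copies)  # vote returned 1, i.e. the wrong value
    return failures / trials

for p in (0.01, 0.1, 0.3):
    print(f"baseline error rate {p}: logical error rate ~ {logical_error_rate(p):.4f}")
```

With a 1 percent baseline error rate the voted result is wrong only a few times in ten thousand, but at 30 percent the improvement is marginal; and in a real quantum device the extra qubits and the checks themselves add errors of their own, which is what produces the hard threshold Thompson describes.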

Rather than focusing solely on reducing the number of errors, Thompson’s team worked to make errors more visible. The researchers dug into the actual physical causes of error and engineered their system so that the most common source of error effectively eliminates the damaged data rather than merely corrupting it. Thompson said this behavior represents a particular kind of error known as an erasure error, which is fundamentally easier to weed out than data that has been corrupted but still looks like all the other data.
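A toy classical analogy, meant to illustrate the general principle rather than the team’s quantum scheme, shows why knowing where the damage occurred helps so much. With three copies of a bit, a majority vote survives only one flip at an unknown location, but if the bad copies are flagged as erasures the decoder can simply discard them and read any surviving copy.

```python
def correct_unknown_error(copies):
    """Majority vote over three copies: corrects at most one flip whose
    location is unknown."""
    return 1 if sum(copies) >= 2 else 0

def correct_erasures(copies, erased):
    """Erasure decoding: flagged copies are discarded outright, so up to two
    of the three copies may be lost and the bit is still recovered."""
    survivors = [bit for bit, gone in zip(copies, erased) if not gone]
    return survivors[0] if survivors else None

# A logical 1 stored as three copies, with two of them corrupted.
damaged = [0, 0, 1]

# Case 1: the locations of the bad copies are unknown -- the vote fails.
print(correct_unknown_error(damaged))        # -> 0 (wrong)

# Case 2: the same damage, but each bad copy carries an erasure flag.
flags = [True, True, False]
print(correct_erasures(damaged, flags))      # -> 1 (correct)
```

The same asymmetry holds for error-correcting codes in general: a code of a given distance can typically correct about twice as many erasures as errors at unknown locations, which is why converting ordinary errors into erasures relaxes the demands on the hardware.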

This advance required combining insights into quantum computing hardware with the theory of quantum error correction, taking advantage of the interdisciplinary nature of the research team and its close collaboration. While the physical mechanism is specific to Thompson’s ytterbium atoms, he said the idea of designing qubits so that errors show up as erasures could be a useful goal for other systems, many of which are under development around the world, and is one the group continues to pursue.

In practice, he said, their proposed system could tolerate an error rate as high as 4.1 percent, which Thompson said is well within the capabilities of current quantum computers. By contrast, in previous systems state-of-the-art error correction could handle less than 1 percent error, which Thompson said is beyond the reach of any current quantum system with a large number of qubits.

Additional tools were needed to root out the errors. The team proposed pumping electrons in ytterbium up from their stable “ground state” into excited states called “metastable states,” which can be long-lived under the right conditions but are inherently fragile. Counterintuitively, the researchers proposed using these states to encode the quantum information.

“It’s like the electrons are on a tightrope,” Thompson said. The system is designed so that the same factors that cause errors also knock the electrons off the tightrope. As a bonus, once they have fallen to the ground state the electrons scatter light very visibly, so shining light on a cluster of ytterbium qubits lights up only the faulty ones. Those that light up should be written off as errors.
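As a rough sketch of how that readout could feed back into error correction, with all names, thresholds, and numbers below being hypothetical illustrations rather than the team’s actual software, the light-scattering check can be treated as one flag per qubit, and those flags are exactly the erasure information a decoder needs:

```python
def fluorescence_flags(brightness, threshold=0.5):
    """Qubits that have fallen out of the fragile excited states scatter light
    and show up bright; treat any bright site as an erasure flag.
    (Hypothetical threshold and units, for illustration only.)"""
    return [level > threshold for level in brightness]

def keep_trusted(readout, flags):
    """Pass only the unflagged qubits on to the error-correction step."""
    return [value for value, flagged in zip(readout, flags) if not flagged]

# Simulated check of three qubits: the second one lit up, so its data is suspect.
brightness = [0.02, 0.91, 0.05]
readout = [1, 0, 1]

flags = fluorescence_flags(brightness)   # [False, True, False]
print(keep_trusted(readout, flags))      # [1, 1] -- only the dark qubits are trusted
```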

