A quantum physicist, Michelle Simmons, has been awarded the Australian of the Year award. Professor Simmons is researching quantum computing with the aim of starting a quantum computing industry. Many people know that Silicon Valley is where a lot of technological innovation happens, but it might surprise many to see what is happening in the land down under.
Quantum computing across the seas
Professor Simmons and her team developed the first transistor made from a single atom. The research, carried out by Australian and American physicists, could change how transistors work. Conventional computers use bits that are either “1” or “0”, “on” or “off.” Quantum computing instead uses qubits, and a qubit can hold multiple values at once. Because a qubit can be “on” and “off” simultaneously, quantum machines promise far more computing power. The physicists hope this technology will eventually operate at the nanoscale inside everyday devices.
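The idea that a qubit can be “on” and “off” at the same time can be illustrated with a minimal sketch in plain Python. This is not Professor Simmons's hardware or any real quantum library, just a toy simulation assuming the standard textbook model: a qubit is a pair of complex amplitudes, and the Hadamard gate puts a definite 0 into an equal mix of 0 and 1.

```python
import math

# Toy qubit model (illustrative assumption, not from the article):
# a qubit's state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring gives 0 with probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Apply the Hadamard gate, which turns a definite 0 or 1
    into an equal superposition of both."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Chance of reading out 0 versus 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1.0, 0.0)       # qubit prepared as a definite "0"
plus = hadamard(zero)   # now "0" and "1" at the same time
print(probabilities(plus))  # each outcome comes up about half the time
```

A classical bit would print either a certain 0 or a certain 1; here both outcomes have probability roughly 0.5, which is the extra flexibility the article's “on and off simultaneously” refers to.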
“It shows that Moore’s Law can be scaled toward atomic scales in silicon,” said Gerhard Klimeck, a professor of electrical and computer engineering at Purdue and leader of the project there. Moore’s Law refers to technology improvements by the semiconductor industry that have doubled the number of transistors on a silicon chip roughly every 18 months for the past half-century. That has led to accelerating increases in performance and declining prices. “The technologies for classical computing can survive to the atomic scale,” Dr. Klimeck said.
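Dr. Klimeck's point about Moore's Law is easy to check with back-of-the-envelope arithmetic. The sketch below assumes an illustrative starting point (the roughly 2,300 transistors of Intel's first microprocessor in 1971, a figure not taken from this article) and applies the doubling-every-18-months rule the article describes.

```python
# Moore's Law as described in the article: transistor counts double
# roughly every 18 months. The starting count of 2,300 (Intel 4004,
# 1971) is an illustrative assumption, not a claim from the article.

def transistors(years, start=2300, doubling_months=18):
    """Estimated transistor count after the given number of years."""
    doublings = years * 12 / doubling_months
    return start * 2 ** doublings

# After one doubling period (1.5 years) the count has doubled:
print(transistors(1.5))   # 4600.0

# Half a century of doublings lands in the tens of trillions,
# which is why the industry is running up against atomic limits.
print(f"{transistors(50):.2e}")
```

The takeaway is the same as the quote's: after about 33 doublings, the only place left to shrink a transistor is down to individual atoms.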
What is next
It is possible that the current pace of semiconductor research will make quantum computing practical within two decades. The method Professor Simmons and her team used involved placing a single phosphorus atom with a scanning tunneling microscope. A silicon layer is scraped into a trench by the microscope's tip, a phosphorus atom is then deposited at a precise location using phosphine gas, and finally the atom is covered with silicon atoms.
Traditional computing methods look to be reaching the end of their possibilities, and some believe that conventional computing circuitry will hit a wall beyond which no further development can occur. That is why investing in quantum computing now could ease the transition from the end of one technology to the arrival of the next.
“It’s good science, but it’s complicated,” said Mike Mayberry, an Intel vice president and director of the company’s components research group. “By cooling it to very low temperatures, they’ve frozen out a lot of effects that might otherwise be there.”