Scientists develop theory for quantum computer transistors

Copenhagen (Denmark) - Scientists from the Niels Bohr Institute at the University of Copenhagen and from Harvard University say they have developed a new theory that describes what transistors for future quantum computers may look like.

Quantum computing is a topic we hear about more and more often, typically in bits and pieces describing how these super-fast computing devices will work. One of the key problems of quantum computing is how information will be passed between photons, which are expected to become the main carriers of information.

Data signals today are transferred via electrical current, but in a future of quantum computing, the signal is likely to be optical. Passing a signal from one photon to another isn't particularly easy, as the process requires two photons to meet and directly interact with each other. Photons are so small, however, that up to now it has been virtually impossible to aim two photons at each other so that they collide and interact.

The newly developed theory, however, could open the door to bringing photons together so they can interact. Instead of shooting two photons at each other from different directions and trying to get them to collide, Anders Søndberg Sørensen, a quantum physicist at the Niels Bohr Institute at the University of Copenhagen, said that he wants to use an atom as an intermediary: the atom can absorb one photon and then interact with a second photon.

The problem, however, is that an atom is still small and difficult to hit, according to Søndberg Sørensen. He noted that, in a previous experiment, researchers had discovered that microwaves could be focused onto an atom via a superconducting nano-wire, which gave the team the idea that the same could be done with visible light.

Søndberg Sørensen said that the theoretical model shows that the approach works. The next task is to prove it in an experiment.
