"If you bring a charged particle like an electron near the surface, because the helium is dielectric, it'll create a small image charge underneath in the liquid," said Pollanen. "A little positive charge, much weaker than the electron charge, but there'll be a little positive image there. And then the electron will naturally be bound to its own image. It'll just see that positive charge and kind of want to move toward it, but it can't get to it, because the helium is completely chemically inert, there are no free spaces for electrons to go."
Obviously, getting liquid helium in the first place requires extremely low temperatures. But it can actually remain liquid at temperatures up to about 4 Kelvin, which doesn't require the extreme refrigeration technologies needed for things like transmons. Those temperatures also provide a natural vacuum, since pretty much anything else will condense out onto the walls of the container.
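For a sense of scale, here is a rough back-of-the-envelope sketch (my own numbers, not from the article) of how weak that induced image charge is and how strongly it binds the electron, assuming the textbook picture of a point charge above an ideal dielectric half-space:

```python
# Image-charge binding of an electron above liquid helium, treating the
# surface as an ideal dielectric half-space. Rough estimate only.

eps_r   = 1.057      # relative permittivity of liquid helium (approximate)
rydberg = 13.606     # hydrogen Rydberg energy, eV
k_b_ev  = 8.617e-5   # Boltzmann constant, eV per K

# Induced image charge relative to the electron's own charge:
# q_image = -e * (eps_r - 1) / (eps_r + 1)
image_fraction = (eps_r - 1) / (eps_r + 1)

# The attraction V(z) = -(image_fraction / 4) * e^2 / (4*pi*eps0*z) is a
# hydrogen-like 1/z potential, so the ground-state binding energy is the
# Rydberg energy scaled by the square of that prefactor.
prefactor  = image_fraction / 4
binding_ev = rydberg * prefactor**2

print(f"image charge   ~ {image_fraction:.3f} of an electron charge")
print(f"binding energy ~ {binding_ev * 1e3:.2f} meV (~{binding_ev / k_b_ev:.0f} K)")
```

The image charge works out to only about 3 percent of an electron charge, and the vertical binding energy is well under a millielectronvolt (a few Kelvin), which matches the "much weaker than the electron charge" description above.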
Erbium68 Wise, Aged Ars Veteran
The trap and what they have achieved so far are very interesting. I have to say the mere 40 dB of gain from the amplifier (assuming that is voltage gain, not power gain) is remarkable for what is surely a very tiny signal (and that is microwatts out, not megawatts).
But, as a practical quantum computer?
It still has to run below 4 K, and there still has to be a transition to electronics at close to STP. The refrigeration is going to be bulky and power-consuming. Of course the answer to that is to run a lot of qubits in one envelope, but getting there is going to take a long time.
We seem to have had the easy technological hits. The steam engine, turbines, IC engines, dynamos and alternators all came with relatively simple fabrication techniques and ran at room temperature except for the hot bits. Early electronics began with a technical barrier - vacuum enclosures - but never needed to scale these beyond single or dual devices, and by the time that became a barrier to progress, transistors were already happening and it was then a matter of scaling size down and gates up. The electronics revolution happened at room temperature, maybe with some air cooling or liquid cooling for high powers.
Now we have the issue that getting a few gates to work needs a vacuum chamber below 4 K. Scaling is going to be expensive. And progress in conventional semiconductors will continue.
This approach may be wildly successful, like epitaxial silicon technology. But it may also flop like the Wankel engine - the existing technology advancing faster than the initially complex new technology can.
dmsilev Ars Tribunus Angusticlavius
Erbium68 said:
The trap and what they have achieved so far are very interesting. I have to say the mere 40 dB of gain from the amplifier (assuming that is voltage gain, not power gain) is remarkable for what is surely a very tiny signal (and that is microwatts out, not megawatts).
But, as a practical quantum computer?
It still has to run below 4 K, and there still has to be a transition to electronics at close to STP. The refrigeration is going to be bulky and power-consuming. Of course the answer to that is to run a lot of qubits in one envelope, but getting there is going to take a long time.
Compared to a datacenter computing system, it's actually not all that hugely power-consuming. In rough numbers, 10-12 kW of electricity will get you a pulse-tube cryocooler that can cool 50 or 100 kilograms of stuff down to about 4 K and keep it at that temperature with 1-2 W of heat load at the cold end. That's enough for a lot of 4 K qubits and first-stage electronics. Add in an extra kW for another pump and you can cool maybe 10 kg to ~1.5 K, with about 0.5 W of headroom. A couple more pumps at a kW or so each, some helium-3, and a lot of expensive plumbing, and you have a dilution refrigerator: 20 mK with about 20-40 µW of headroom.
Compare that 10-15 kW with the draw from a single rack of AI inference engines.
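As a very rough sanity check on those figures (my own Carnot-limit estimate, assuming heat rejection at ~300 K and stage loads in the ranges quoted above):

```python
# Compare the quoted electrical input at each stage with the Carnot minimum
# for pumping the stated heat load up to room temperature. Heat loads and
# (roughly cumulative) plug powers are my own rounded assumptions based on
# the figures above.

def carnot_input_w(q_cold_w: float, t_cold_k: float, t_hot_k: float = 300.0) -> float:
    """Minimum (Carnot) work to lift q_cold_w of heat from t_cold_k to t_hot_k."""
    return q_cold_w * (t_hot_k - t_cold_k) / t_cold_k

stages = [
    # (description,           heat load [W], cold temp [K], approx. plug power [W])
    ("4 K pulse-tube stage",   1.5,           4.0,           11_000),
    ("1.5 K stage",            0.5,           1.5,           12_000),
    ("20 mK mixing chamber",   30e-6,         0.02,          15_000),
]

for name, q_load, t_cold, p_plug in stages:
    w_min = carnot_input_w(q_load, t_cold)
    print(f"{name}: Carnot minimum ~{w_min:.1f} W, actual ~{p_plug / 1e3:.0f} kW "
          f"({100 * w_min / p_plug:.2f}% of Carnot)")
```

Real 4 K cryocoolers run at roughly a percent of the Carnot limit, and dilution refrigerators far below that, so a system total in the tens of kW is about what thermodynamics plus engineering reality gets you.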