[Editor: Dr. Rose's company announced the world's first quantum computer in Vancouver, Canada last week.]
Any answer would involve a lot of speculation, so take this with a grain of salt. An important thing to consider is that better processor and computer architectures, using the same components, can give impressive performance gains. So just finding better ways of laying out and using the devices we already have should add another 10 years to Moore's Law (witness Intel's 80-core chip--this is just the beginning of this type of progress). In parallel, people will be trying to develop smaller, lower-power devices.
As IC components continue to shrink, and their power consumption goes down, one thing you could do is to try to build "up". Circuits right now are primarily 2D structures. Being able to build 3D circuits allows a designer to pack a lot more stuff on a "chip" (cube?).
As an example, if a modern circuit is roughly 50 microns thick, you could double the number of device layers roughly 8 times before the chip became a cube. That's another 10 years or so of Moore's Law. You can't build "up" with today's devices, though: they generate too much heat, and the cube would melt.
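To make the arithmetic behind that estimate concrete, here is a minimal sketch. The 50-micron layer thickness comes from the text; the ~12.8 mm die edge and the 18-24 month doubling cadence are my assumptions, chosen as typical round numbers, not figures from the original answer.

```python
import math

# Assumptions: 50-micron active layer (from the text); a hypothetical
# ~12.8 mm die edge and an 18-24 month doubling period (my estimates).
layer_thickness_mm = 0.05   # 50 microns
die_edge_mm = 12.8

# How many times can the layer count double before height equals the edge,
# i.e. before the "chip" becomes a cube?
doublings = math.log2(die_edge_mm / layer_thickness_mm)
print(doublings)  # -> 8.0

# At one doubling every ~1.5 to 2 years, 8 doublings spans roughly a decade
# and a half, in line with the "another 10 years or so" estimate above.
print(doublings * 1.5, doublings * 2.0)  # -> 12.0 16.0
```

With these numbers, 12.8 mm / 0.05 mm = 256 = 2^8 layers, so the "8 doublings" figure falls out directly; a larger die or thinner layers would stretch it further.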
So if I had to predict how this will go: first, better processor and systems architectures--this gives us another 10 years of Moore's Law without changing the underlying devices much. In parallel, new devices (smaller and much lower power) get introduced and integrated into systems. This will extend Moore's Law another 10 years or so. Once the power dissipation of devices shrinks sufficiently, we can build up, giving another 10 years of Moore's Law.
In summary, I don't think the "atomic limit" on devices is going to slow down Moore's Law for a long, long time. What could stop it is economics: it may simply cost too much to keep Moore's Law going.