A Brief History of Modern Computing


Tomorrow, I will be doing a Tepper (and CMU broadly, I believe) alumni webinar, which has more than 500 folks registered!

The topic is:

Quantum and Quantum-inspired Algorithms with Applications.

Did I not just do a webinar on this topic a few weeks back? Yes. To PhDs at Cornell.

How do I gracefully introduce quantum computing to a non-specialist, general audience (more likely MBAs than engineering PhDs), one likely not familiar even with how classical computing actually works?

As I was preparing the slides, I recalled a book that I had read a while back, by Georges Ifrah, a follow-up to the magnificent The Universal History of Numbers:

The Universal History of Computing.

Why not borrow from Stephen Hawking the idea for a title? I created a slide:

Brief History of Modern Computing.

I view modern computing as having four threads that are braided together:

        Hardware, Computing model, Algorithms, Software.

The hardware that we use – like the MacBook Pro I am using to write this blog – is based on transistors (and integrated circuits), which rely on quantum mechanics.

That is, classical computing already uses quantum mechanics in how the hardware works.

Remember my previous post on Pulp Physics, where I mentioned John Bardeen? His first Nobel (in 1956) was for his contributions to the invention of the transistor.

His second Nobel (in 1972) was for his contributions to the theory of superconductivity, which forms the basis for a certain type of quantum computing hardware – the kind built by Google and IBM.

That is, quantum hardware also uses quantum mechanics, but a different aspect of it.

Now to the second of the four threads. I think it was Dijkstra who said:

Computer Science has as much to do with computers as astronomy has to do with telescopes.

What did he mean by that?

He was talking about the conceptual model of computing.

The one that we rely on for classical computing is the Turing machine, due to Alan Turing.

As I referred to in my blog on What would Steve Jobs do?, the Turing Award is given annually by the Association for Computing Machinery (ACM) for impressive contributions to computer science (broadly defined). The winners announced in 2020 are Pixar folks.😊

Algorithms are recipes. A series of steps to follow. They accept inputs and deliver outputs. A good algorithm provides desired outputs quickly. 

How do we analyze algorithms? That is, how do we decide how “good” they are? For that, we need a conceptual model of computing.

This is the area of computational complexity, a branch of theoretical computer science that, as Dijkstra’s quip suggests, has little to do with any specific piece of hardware or physical computer, and everything to do with what a “reasonable” classical computer can be expected to do.
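To make that concrete, here is a toy illustration of my own in Python (not something from the webinar slides): two recipes for the same task, finding a value in a sorted list. Both deliver the correct answer; one takes roughly n steps, the other roughly log2(n).

    # Two algorithms for the same task: find a target in a sorted list.
    # Both produce the correct output; they differ in how quickly they get there.

    def linear_search(values, target):
        """Check every entry in turn: about n steps for a list of length n."""
        for index, value in enumerate(values):
            if value == target:
                return index
        return -1

    def binary_search(values, target):
        """Halve the search range each step: about log2(n) steps."""
        low, high = 0, len(values) - 1
        while low <= high:
            mid = (low + high) // 2
            if values[mid] == target:
                return mid
            if values[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1

    sorted_values = list(range(0, 1_000_000, 2))    # 500,000 even numbers
    print(linear_search(sorted_values, 123_456))    # same answer (index 61728)...
    print(binary_search(sorted_values, 123_456))    # ...found in far fewer steps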

How do you “write” these algorithms so that “hardware” can execute its steps? This is the fourth thread: software.

Classical computing uses software to translate algorithms, conceived in Turing’s conceptual model of computing, into instructions that the hardware (built using ideas from quantum mechanics) can execute.

Quantum computing uses a different conceptual model.

Information is not stored in bits, which have a definite value of zero or one at any given instant, but in qubits – quantum bits – which store information as a superposition of zero and one, and so are, in a sense, both zero and one at the same time.

That is not how classical computing works.
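One way to picture the difference, as a toy numerical sketch of my own (plain Python and NumPy, no quantum toolkit involved):

    import numpy as np

    # A classical bit is definitely 0 or definitely 1 at any instant.
    classical_bit = 1

    # A qubit's state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
    ket_zero = np.array([1, 0], dtype=complex)          # definitely 0
    ket_one = np.array([0, 1], dtype=complex)           # definitely 1
    superposition = (ket_zero + ket_one) / np.sqrt(2)   # "both at once"

    # Measurement collapses the state; the outcome probabilities are |amplitude|^2.
    probabilities = np.abs(superposition) ** 2
    print(probabilities)    # [0.5 0.5]: equal chance of reading out 0 or 1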

Quantum computing also requires a different physical device as hardware.

One approach is to work in a supercooled environment where superconductivity occurs in certain materials: the electrons, which at higher temperatures are disorganized and feel resistance, now pair up (and act as bosons!) and feel no resistance.

If a circular coil made of this material is supercooled, then these electrons can circulate (clockwise or anti-clockwise) forever if needed, and now, this is the key:

we can (using microwave pulses) make this supercooled coil behave like a qubit, that is, as if the electron pairs are circulating both clockwise and anti-clockwise at the same time!

The physical device needs to be able to “hold” this superposition state for a period of time.

Quantum algorithms are conceived with this new conceptual model with qubits to run on quantum hardware.

This is not your father’s Turing machine!

Quantum algorithms take a classical input (in bits), transform it into an initial superposition state of qubits, then provide a series of steps that act on superposition states and transform them into other superposition states, and so on, until the very last step, when the final superposition state is “measured” and collapses to a classical state – bits – which is then delivered as the output (which is classical).
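Here is that whole pipeline as a toy single-qubit simulation of my own (NumPy on a classical machine, standing in for real quantum hardware):

    import numpy as np

    rng = np.random.default_rng()

    # 1. Classical input: a single bit.
    input_bit = 0

    # 2. Encode it as a qubit state (a vector of two complex amplitudes).
    state = np.array([1, 0], dtype=complex) if input_bit == 0 else np.array([0, 1], dtype=complex)

    # 3. Quantum steps: here, a single Hadamard gate, which creates a superposition.
    hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    state = hadamard @ state

    # 4. Measure: the superposition collapses to a classical bit, with
    #    probabilities given by the squared magnitudes of the amplitudes.
    probabilities = np.abs(state) ** 2
    output_bit = rng.choice([0, 1], p=probabilities)

    # 5. Classical output.
    print(output_bit)    # 0 or 1, each with probability 1/2 for this toy circuit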

The physical device needs to be able to execute the instructions of the quantum algorithm while maintaining the superposition states, at all times, until the final measurement.

It is really, really difficult to “hold” a superposition state as it is so fragile that any contact with the environment makes it collapse.

Software? Not surprisingly, new programming languages are cropping up that are tailored to “writing” quantum algorithms that quantum hardware can understand.
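For a flavor of what that looks like, here is the same one-qubit toy circuit written with Qiskit, one such open-source Python toolkit (this assumes Qiskit is installed; other toolkits look broadly similar):

    # The one-qubit circuit from above, expressed in Qiskit (pip install qiskit).
    from qiskit import QuantumCircuit

    circuit = QuantumCircuit(1, 1)    # one qubit, one classical bit for the result
    circuit.h(0)                      # Hadamard gate: put the qubit into superposition
    circuit.measure(0, 0)             # measure: collapse to a classical 0 or 1

    print(circuit)                    # text drawing of the circuit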

Welcome to the 21st Century!
