Superficial intelligence - the limits of A.I.

Last weekend, I posted all the letters I had sent to New Scientist magazine over the past three months. Although they weren't printed, I thought it would be nice for people to read them anyway, rather than leave them sitting, forgotten, on a shelf (sort of). They aren't very long and I think they make some interesting points - one of them even includes a joke! - so it seemed worthwhile to blog them.

This weekend (5th Sept), out of the blue, New Scientist has actually printed one of those letters: the one from the 4th August, entitled 'Superficial Intelligence'. This is what I wrote on the 4th August:

Dear New Scientist. In your Opinion page (issue 3032, 1st August 2015, pg22), Martin Rees states that biological brains will eventually be superseded by far superior, machine intelligences. This follows on from recent comments in the media by Stephen Hawking and others, warning of the dangers of runaway A.I. These are all surprising assertions, as digital computers, fundamentally, are no different from punch-card clocks. Also, A.I. and quantum computing have so far failed to live up to their initial hype; they're currently more Superficial Intelligence than Artificial Intelligence. How do Hawking and Rees think these automated sorters and calculators will reach such lofty goals? 

I'm pleased that New Scientist published it. They didn't publish the full letter - they removed the middle sentence - but it's still good to see it in the magazine. Thinking again on the topic, I would like to add a few more points. I did write a blog article in March explaining the fundamental limitations of computers, and that covers a lot, but here are three new points:

Firstly, a digital computer is a punch-card clock because it is, at its heart, a binary number cruncher. The heart of a computer is its processor; that's what makes the computer compute. A processor has registers, which hold the numbers being worked on. For example, after a clock tick, a new number enters a register. This number is a binary number (like 01010101110), which looks a bit exotic, but it's still just a number. The processor then changes that number according to an instruction stored in the machine, put there by a person. For example, an instruction may double the number in the register, take one off it, or add one to it. These sound like really, boringly simple operations, but the important thing is that they can be done very easily and very quickly on a binary number. Also, more complicated operations can be built up from a series of simple ones. Once the operation is done on the number in the register, the computer's clock ticks again and the new number is shunted out of the register. That's it; that's really all a computer does. It shoves the next number into the register, operates on it according to a stored instruction, and shunts it out again.
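The shove-in, operate, shunt-out cycle above can be sketched in a few lines of code. This is a toy illustration, not how any real processor is written - the instruction names are invented for the example:

```python
# A minimal sketch of the fetch-operate-shunt cycle described above:
# one register, a list of stored instructions, one instruction per clock tick.

def run(register, program):
    """Apply each stored instruction to the register, one per clock tick."""
    for instruction in program:
        if instruction == "DOUBLE":
            register = register * 2    # doubling is just a shift left by one bit
        elif instruction == "INC":
            register = register + 1
        elif instruction == "DEC":
            register = register - 1
        register &= 0b11111111111      # keep it an 11-bit binary number
    return register

start = 0b01010101110                  # the exotic-looking number from the text
result = run(start, ["DOUBLE", "INC", "DEC"])
print(bin(result))                     # prints 0b10101011100
```

Note that "double" really is just the original bits shifted one place to the left - which is why these operations are so easy and fast on binary numbers.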

The above explanation makes a computer sound a lot like a factory that makes plastic parts and, in essence, a computer is very much like a factory that makes plastic parts. Modern computers seem almost intelligent because they can perform an operation faster than it takes light to leave the surface of the screen and reach your eyes, but speed and intelligence aren't the same thing. Even though computers are amazing, they're still, fundamentally, punch-card clocks, albeit ones that go very fast indeed. A digital computer is a bit like a clockwork robot that runs at four billion cycles per second; it's an incredible clockwork robot, but it's still a clockwork robot. It may run at four hundred miles an hour, but that doesn't mean you can conclude it's alive or fall in love with it. It's just a box with some tiny levers inside, flipping backwards and forwards every picosecond.
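The screen-to-eye claim above holds up to rough arithmetic. Assuming a viewing distance of about half a metre (my assumption, not a figure from the text) and the four-billion-cycles-per-second processor mentioned:

```python
# Rough arithmetic behind the screen-to-eye claim.
# The 0.5 m viewing distance is an assumed typical figure.
SPEED_OF_LIGHT = 3.0e8    # metres per second (approximate)
SCREEN_TO_EYE = 0.5       # metres - assumed viewing distance
CLOCK_HZ = 4.0e9          # four billion cycles per second

light_time = SCREEN_TO_EYE / SPEED_OF_LIGHT   # ~1.7 nanoseconds
cycle_time = 1.0 / CLOCK_HZ                   # 0.25 nanoseconds

print(f"cycles completed while the light travels: {light_time / cycle_time:.1f}")
```

So the processor gets through roughly half a dozen clock ticks in the time the light takes to cross the gap - fast, but it's still just ticks of a clock.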

The second nugget to mention is quantum computing. Q.C. has been the new, exciting kid on the block for the last ten years or so. Quantum-computing researchers (and commercial companies) are keen to talk about it becoming something far beyond what we have at the moment with our digital computers. Quantum computers, they think, will be able to break encryption in seconds, compute vast equations instantaneously, and so on. Theoretically, this should be possible - they're not being irrational - but so far the only publicly available quantum computer has been made by D-Wave Systems, and there's currently a lot of argument about whether it actually does any quantum computation, including points made in a paper published in the journal Science. Quantum computing is therefore still in its infancy, and there's no strong practical evidence that it can even become a significant technological field. It therefore can't sensibly be used to support any 'A.I.s run wild' or 'A.I.s will rule the world' speculations.
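To show what's at stake in those arguments, here's a minimal sketch (plain Python, not any real quantum library) of what a classical computer must do to simulate even one qubit: track a pair of amplitudes and multiply matrices. Each extra qubit doubles the amount of bookkeeping, which is both why a genuine quantum computer would be powerful and why proving a machine is doing real quantum computation is so contentious:

```python
import math

# A single qubit, simulated classically: a state is two amplitudes,
# and a gate (here the Hadamard gate) is a simple linear transformation.

def hadamard(state):
    """Put a qubit into an equal superposition of 0 and 1."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """The chance of measuring 0 or 1 is the squared amplitude."""
    return tuple(abs(amp) ** 2 for amp in state)

qubit = (1.0, 0.0)          # starts definitely in state 0
qubit = hadamard(qubit)     # now in superposition
print(probabilities(qubit)) # roughly (0.5, 0.5) - fifty-fifty on measurement
```

A digital computer simulating n qubits this way needs 2^n amplitudes; a real quantum computer would hold all of that in its physics for free. That gap is the hype - and, so far, mostly still theory.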

The third point I really, really want to add is that our brains are not like computers. There is no scientific evidence at all that our brains function like digital computers, with their registers and buses and binary operations and so on. Our brains are a 3D network of electromagnetic activity involving precision chemicals and living cells. If a computer were a water pump, our brains would be coral reefs. If a computer were a bicycle, our brains would be antelopes… (you get the idea).

In conclusion, computers are brilliant and incredibly useful but, according to everything we know scientifically, ants are more likely to take over our planet than any computer (speaking of which, I still want to see the movie Phase IV, which looks great). In future, if it does look as if a computer is taking over our planet, I strongly recommend that everyone search carefully for the person or persons in the background intelligently guiding the computer's actions. They're the guilty party, and if they use an excuse like 'I didn't do it, it was the computer!', remember that it's about as factually correct as 'I didn't shoot the guy, it was the gun that did it!' - then feel free to laugh at them a lot, until your sides hurt.