Computers are already visual, we just forgot

“No, no, you don’t understand! Computers are basically about binary numbers and then we program them by putting layers on top of the raw binary to finally display text and visual images.”

Bullshit. Computers are really about spatial patterns of voltage-storing elements, which we then interpret as bits in the binary system.

When a computer is designed, somewhere some guy has a big chart on his wall showing the mapping from the spatial pattern into bits.

Which probably started out as a diagram on a whiteboard that was erased many times before they had the bit-storage elements far enough apart to prevent heat confounding.

[Image: CMOS screen capture]

So the computer started out as an analog map on somebody’s wall.

Computers are inherently spatial and then we cover it up.

Digital “ones and zeros” are an abstraction! Just a way of interpreting spatial patterns of voltage differences. Probably goes back to Alan Turing and his machine. And the damn tape, so easy to interpret in the bitwise way (great work on Enigma, Al, but the tape is crap).
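To make the point concrete, here is a toy sketch (not any real hardware, and the threshold value is made up) of how "ones and zeros" are just an interpretation imposed on analog voltage levels:

```python
def to_bits(voltages, threshold=0.9):
    """Interpret a spatial pattern of analog voltages as binary digits.

    The voltages are continuous, physical quantities; the "bit" only
    appears once we pick a threshold and read the pattern through it.
    """
    return [1 if v >= threshold else 0 for v in voltages]

cells = [0.02, 1.71, 1.68, 0.05, 1.80]  # hypothetical stored voltages, in volts
print(to_bits(cells))  # -> [0, 1, 1, 0, 1]
```

Same physical pattern, different threshold, different "digital" reading. The binary layer is a choice, not the substrate.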

Imagine instead if back at that crucial time in history someone had used the far more powerful* and already positional lambda calculus of Alonzo Church to directly interpret the spatial patterns and compute with them.
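For readers who haven't met Church's system: a minimal sketch of it, using Python lambdas as a stand-in. Church numerals encode a number as "apply a function that many times" — computation carried by the structure of the expression rather than by a tape of bits:

```python
# Church numerals: ZERO applies f zero times, SUCC adds one application.
ZERO = lambda f: lambda x: x
SUCC = lambda n: lambda f: lambda x: f(n(f)(x))
ADD  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral back into an ordinary integer."""
    return n(lambda k: k + 1)(0)

TWO = SUCC(SUCC(ZERO))
THREE = SUCC(TWO)
print(to_int(ADD(TWO)(THREE)))  # -> 5
```

Nothing in those definitions mentions a bit or a tape; arithmetic falls out of how the pieces are composed.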

Then computers would also be usefully and overtly analog computers. That is, electrical computers would be analog computers (which they already are (so deep is the mis-thinking it perverts the very terms of our discourse)).

If only. And then you wouldn’t have to be reading this silly diatribe. And visual thinkers wouldn’t exist in a world of pain when they try to use computers let alone program them.


Q: OK, but is spatial the same as visual?

A: Well, close enough for a rant. You bet.

Consider a creature who can see into the infrared really really fast and really really small. That guy can watch the patterning of the bit storage elements in real time as they are read and written.

* Q: Since Church and Turing proved that the lambda calculus and Turing machines are both universal computational engines, how can the lc be more powerful?

A: Usefully powerful to a human. Whenever someone says some system is “Turing-equivalent”, it means you don’t want to have to actually use it for anything.
