Could somebody please put this into simple words for me?
It seems to say that it is OK to produce defective chips as long as you have the software to deal with it.
But since the chip runs the software that can't be right.
Sorry to be a bit slow, but I'm finding it harder and harder to keep abreast, despite PCA's best efforts!
Or perhaps fuzzy logic ;-)
"Could somebody please put this into simple words for me?"
Imagine you are an expert at something - calibrating intricate surveying instruments, for example. All your working life you have been used to instruments that work more or less along the same lines - they contain the same set of parts, and the parts fulfil the same functions. They may vary slightly in colour, or in the speed at which they move, but you know exactly where you are with them - your world is predictable, and as long as you know that each part does what it has always done you can cope with what comes your way.
You're the software, and the instruments are the computer chips. You, the software, will work efficiently as long as those instruments are predictable - your stress levels are low, and your need to cope with unusual incidents is slight.
One day you open an instrument, and you notice that the parts are much smaller, and as you look at them you notice one or two are not moving as they should - they're behaving oddly now and then. You try to understand what's going on, but you can't, and because you can't you come to a halt...you crash, because you have no inbuilt ability to cope with errors in the hardware.
That's precisely what happens with most computer software when a CPU makes a mistake, and it's the reason that chip manufacturers go to endless lengths to make their chips function perfectly. So far that has all worked pretty well, but as the tiny switches on silicon chips become smaller and smaller - approaching the molecular level in terms of size - the potential for manufacturing errors has increased.
What's needed is software that is far more fault-tolerant, so it won't fall over the first time a CPU misses a beat. It all sounds terribly simple, but it's mind-numbingly complex.
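One common way for software to cope with an unreliable processor is to run the same calculation several times and accept the majority answer. Here's a minimal Python sketch of that idea - the function names and the simulated error rate are mine, purely for illustration:

```python
import random

def unreliable_add(a, b, error_rate=0.05):
    """Simulate an arithmetic unit that occasionally returns a wrong answer."""
    result = a + b
    if random.random() < error_rate:
        result += random.choice([-1, 1])  # inject a small glitch
    return result

def voted_add(a, b, error_rate=0.05):
    """Run the sum three times and accept the majority answer.

    If all three runs disagree there is no majority, so just try again.
    """
    while True:
        results = [unreliable_add(a, b, error_rate) for _ in range(3)]
        for r in results:
            if results.count(r) >= 2:
                return r
```

Of course, real fault-tolerant software has to do this without trusting the very CPU that's voting - which is part of why it's mind-numbingly complex.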
Yes, it's OK to produce chips that aren't perfect, as long as you produce software that can cope with errors. The world is becoming a far more complex place, just when we thought we had this computer thing nailed.
Did it help?
It's a convoluted story, and like all technology issues it's far from clear-cut. A computer's operating system plays an important part in all this, as we've all found out.
In the early days of computing, core memory was made from little doughnut-like magnetic 'cores' that were flipped on/off (1/0) by passing a current through the wire grid. The first mainframe I worked on had a whopping 8K of RAM.
The cores would stick and become unresponsive on a fairly regular basis, and the machine code we wrote was self-modifying in order to be fault tolerant. This was the only way we could get any viable uptime.
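In a modern language, the write-verify-and-relocate trick those sticky cores forced on us might look something like this. It's a toy model, not the original self-modifying machine code, and all the names are invented:

```python
class CoreMemory:
    """Toy model of early core memory where some cells are 'stuck'."""
    def __init__(self, size, stuck=()):
        self.cells = [0] * size
        self.stuck = set(stuck)   # addresses that silently ignore writes

    def write(self, addr, value):
        if addr not in self.stuck:
            self.cells[addr] = value

    def read(self, addr):
        return self.cells[addr]

def safe_store(mem, value, start=0):
    """Write, read back to verify, and move on to the next cell if it failed."""
    for addr in range(start, len(mem.cells)):
        mem.write(addr, value)
        if mem.read(addr) == value:
            return addr          # report where the value actually landed
    raise RuntimeError("no working cell found")
```

The self-modifying part was that the program patched its own addresses on the fly to route around the dead cells, rather than keeping a table like this sketch implies.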
Later in my career, when working on pressure sensors, the circuitry was potted (sealed into the sensor body with resin) so repair was impossible. These were very expensive components, and over time their performance would drift out of tolerance, so again they were designed with self-modifying code that would re-calibrate them. We even built redundancy into them.
I think it's a great idea to do it with CPUs. It would be nice if Microsoft could do it with their OS instead of giving a blue screen and telling you there has been a user error.
I think the problem today is that chip manufacturers are working at incredibly challenging levels of complexity. Imagine trying to get a crowd of jostling electrons to line up and stream through a molecular gate.
There may well be some stunning developments at the chip level, but fault-tolerant software would truly be a godsend. Microsoft has made big strides with its operating systems - I can't remember when I last saw a blue-screen incident (said he, sawing away at the branch he's sitting on) - and Windows 7 certainly hasn't missed a beat on any of my machines since I installed it.
That's largely because my CPU ticks away predictably, but if what Professor Kumar says is true (and I'm sure it is) I may not be able to rely on that for much longer.
FE - Yes, it did help, thanks, as did the further contributions from you and PaleoBill.