Without getting too technical… before the microprocessor (effectively a small computer on a chip) came along in the early 1970s, all my products comprised logic chips and various other components and, if I made a mistake, I would have to modify the circuit physically: cutting tracks, soldering in new components. All the timing products which I described in a previous chapter were built using discrete chips; software was still in the future.
And then, in 1974, Intel produced the 4040, a 4-bit microprocessor. Now the functionality of a product lay in software, a series of instructions in code, and I could change that without recourse to the scalpel. You think I’m joking! A scalpel was a vital tool of the trade.
And at the very same time, a project came along which was made for the microprocessor. In Dubai. I was commissioned to build a system which would monitor some 4,000 points around a shopping mall, control the lighting and fountains, and monitor security. And it all had to be run from a central control room. I completed the project around 1980, upgraded it when the PC came along, and went back regularly until 1999 to check on it.
This was the time of the early Apple computers and, when the IBM Personal Computer arrived in 1981, it created enormous wealth for Microsoft.
I
could write a book about this period in my life but I’ll just use it now to
illustrate how I came to be in Dubai in 1978 and how I decided to go back as a
tourist to see how it had changed.
Bill Gates and I shared a momentous time in the 1970s, but our paths have diverged somewhat since then. At that time there were computers, of course, but they were large and heavy and spoke a variety of different languages; some sat on desktops, but even those were bulky. You will probably think that I’m going to say that this period marked the arrival of the personal computer, but that came later, in 1981. What changed the world was the microprocessor, which first arrived in a very simple form in the early 70s.
So, at the beginning of the 70s, IBM was making computers (together
with many other manufacturers, of course) and I was designing and building
electronic devices, having aborted a career with the BBC as a sound engineer in
TV. My products were digital (in other words, controlling or displaying things)
as opposed to analogue products such as audio amplifiers. They usually
comprised many logic chips - building blocks - mounted on a circuit board. If,
on testing the product, I found a mistake, it would be “back to the drawing
board” and I would have to re-solder components or cut tracks on the circuit
board.
It would be wrong to say that the microprocessor changed all that; a design still involved lots of chips on a circuit board, but much of the functionality was now in software, which could be changed without hacking up the board. In the traditional designs, before the microprocessor, everything happened more or less at the same time in various chip neighbourhoods on the circuit board; with a processor, things happen one instruction at a time instead. Then, in 1971, a very simple microprocessor arrived on the scene: Intel produced the 4004, a 4-bit processor, followed by the 8080, the 8086 and the 80286, and now we have all-singing, all-dancing dual-core processors, but they are still based on the original Intel architecture. Forgive me if I get rather dewy-eyed. For an electronics engineer of that epoch, these are magic numbers.
I
could see the possibilities fairly quickly but I couldn’t get my head round
this new way of doing things. I pored over the Intel documents but I could see
no way in. At around the same time as Intel produced its first 8-bit processor, Motorola produced its own, the 6800, with a completely different architecture, and sold it in a “bubble pack” kit with one or two support chips. I soldered it all together, pressed Reset, and the terminal
printed an asterisk! By the way, at this time, I was using a mechanical
teletype machine with a punched-tape reader. It hardly seems credible in these
modern days but I was there.
(The photo shows the “computer” that I designed around the Motorola 6800 microprocessor, complete with an 8-inch floppy disk drive!)
So
where does Bill Gates fit into the picture? I guess he is a fair bit younger
than I am and he would have been a student during the later 70s. To me, his
great success was not so much technical as political in that he managed to
persuade various developers, all of whom had their own pet projects, to create
a common effort (this is rather an imperfect way of describing how it
happened). During the latter part of the 70s, the microprocessor had spawned
many personal computers which no longer exist (with the exception of one,
Apple, which pre-dated the PC).
Bill
Gates persuaded IBM to use his operating system, MSDOS, in their new personal computer, and Microsoft’s fortune was made. As we all know, MSDOS grew into Windows and
all personal computers, at least in the early days, used MSDOS. I know it’s a
bit daft comparing my career with that of Bill Gates but I feel an affinity
with him because I remember those formative years so well. And it was very
exciting because, even then, one could see the possibilities. Anyway, I
continued in electronic design as opposed to computer design but both used the
same device: the microprocessor. Later came the microcontroller, which had loads of legs that one could waggle up and down in order to control external devices. By the late 70s, I was producing what are called BMS, building management systems. This is a general term covering the control and monitoring of heating, air conditioning and so on; at heart, it is a loop that reads inputs and sets outputs, something like the sketch below. My systems went into several large shopping centres.
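To give a flavour of what that leg-waggling amounted to, here is a minimal sketch in C. Everything in it is invented for illustration: the port addresses, bit assignments and sensor names are hypothetical, and a real controller of that era would have been programmed in assembly language rather than C, but the shape of the program is the same.

```c
/* A sketch of "waggling the legs": poll an input port and switch
 * output pins accordingly. Addresses and bit masks are hypothetical;
 * a real part (such as a 6800-family PIA) defines its own memory map
 * and needs its direction registers set up first. */
#include <stdint.h>

#define INPUT_PORT   (*(volatile uint8_t *)0x8000u)  /* hypothetical sensor inputs   */
#define OUTPUT_PORT  (*(volatile uint8_t *)0x8001u)  /* hypothetical control outputs */

#define SMOKE_SENSOR 0x01u  /* input bit 0: one monitored "point"    */
#define DUSK_SENSOR  0x02u  /* input bit 1: a light-level switch     */
#define ALARM_BELL   0x01u  /* output bit 0: drives an alarm sounder */
#define LIGHTING     0x02u  /* output bit 1: switches a lighting relay */

int main(void)
{
    uint8_t out = 0;                 /* shadow copy of the output pins */

    for (;;) {                       /* a BMS is essentially this loop, forever */
        uint8_t in = INPUT_PORT;     /* read all eight points at once */

        if (in & SMOKE_SENSOR)       /* point active: sound the alarm */
            out |= ALARM_BELL;
        else
            out &= (uint8_t)~ALARM_BELL;

        if (in & DUSK_SENSOR)        /* getting dark: lights on */
            out |= LIGHTING;
        else
            out &= (uint8_t)~LIGHTING;

        OUTPUT_PORT = out;           /* waggle the legs */
    }
}
```

Scale those eight input bits up across many port chips around a building and you arrive at the thousands of points a shopping-centre system has to watch.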
Microsoft
comes in for a lot of flak on the grounds of monopolising the market, much of
which I feel is unjustified. Had Bill Gates not cornered the market by getting
MSDOS and Windows as the default operating system for the PC, we would now have
a plethora of different operating systems all talking different languages. A
similar situation occurred several years later when Tim Berners-Lee, an
Englishman working at CERN, banged heads together to persuade developers to
abandon their pet projects in favour of a common language for the World Wide
Web: HTML. Before that, the Internet existed but it was a network of computers
all talking different languages, initially government establishments and,
later, colleges in the USA. To define the Internet and WWW: the Internet, in my
view, is the network of cables and wireless links which (amazingly to my mind)
circle the earth. The WWW is the community that resides in the Internet. And
Tim Berners-Lee still works on the constant development of the web, which has to move with every new development that comes along, not the least of which is
mobile computing.
I continued to use Motorola microcontrollers in my products right up to 2003, when I retired. My company gradually lost its clients because regulations regarding products for buildings, especially fire alarms, were tightening up, and such clients could no longer deal with a one-man band. Incredible as it seems now, my BMS not only controlled the lighting in the shopping centre but was also the fire alarm system. And I still slept at night, possibly through naivety about what might happen if it failed.
(The photo shows the last version of my controller board, the SAMP4. You can see the original SAMP in the photo above, with its floppy drive; it required a second circuit board to provide memory.)
One product that I can hardly believe I designed and built was a modem. These are complex devices which use telephone lines to transmit data; nowadays the technology is far more advanced. So, here is a photo of my CM336 modem.