This part of my career as a development engineer extended from engineering supervision into management, although, this
being an engineering company, there were still many times I had to get my hands and mind into the technical detail. The internet
burst onto the public consciousness at about the time I left Ferranti, but this decade of my career saw the
groundwork from which the internet and a whole new selection of computer architectures emerged and displaced
the ones I had previously been working with.
So it was an interesting time in the development of computers, both civilian and military, and I was in the
thick of it. But my feet were pretty much nailed to the floor while I sent my engineers around the globe for
the installation work and trials, which means that this account is rather less colourful than the first 10 years.
It was after leaving Ferranti that I became deeply involved in the internet, particularly as a womb-to-tomb
provider, mainly for small and medium-sized companies. It was all very new and geeky at the start, and needed
people like me to make it work, but it soon became routine "Lego engineering" and of little retrospective interest.
The following selected highlights summarise the various activities that I was involved in, and I accumulated disparate
responsibilities as the years went by. But while my "core" Commissioning work declined as the company was awarded fewer
development contracts, my department remained a small, specialist unit with a mixture of talents that could be called
on to deal with a wide range of technical problems.
This was essentially a mirror of the department I worked for at Bracknell. But while the Bracknell people
were mainly concerned with naval contracts we were designated for land-based and airborne equipment. Land-based
contracts related to air traffic control (ATC) systems and armoured vehicles; airborne equipment was intended
to be flown, and had its own special design demands.
We remained small as the business was slow to develop, and in the end never did reach the heights that the company
was expecting. But this was now my own department and represented a double-jump in responsibility. This stream
persisted until I left the company, at which time the remnants of work were absorbed
into several other departments, and my engineers were able to move on.
This was our largest Commissioning project by far, and kept us busy for several years. In fact I believe
it was the biggest and most sophisticated ATC trainer system in the world at the time and for some years afterwards.
An air traffic control trainer system is a lot bigger from the computing aspect than a "live" system. For
example there are no "live" radars and no "live" communications, or real aircraft or pilots. All these things
have to be simulated. So, in addition to the normal functionality of an ATC system, it has to generate simulated
radar, with "targets" passing in various directions under the programmed control of the training officers;
and with simulated problems such as clutter, fade and dropouts; again all under control for training purposes.
Similarly there are no "real" pilots to talk to, so there is a room of so-called "blip drivers" who use computer
consoles to create simulated radar echoes of aircraft, doing what the training officers require (which is not
necessarily what the trainee ATC people have asked for!). And the simulated radio communications are controlled with
simulated dropouts, fade etc, all under planned exercise control by the training officers.
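To give a flavour of the principle, here is a toy sketch in Python of one simulated target track with controlled fade and dropouts. It is entirely my own illustration (the function name, parameters and figures are invented for the example), not the trainer's actual algorithms, which were far more elaborate and under full exercise control.

```python
import math
import random

def simulate_track(start, heading_deg, speed, steps, dropout_prob=0.1, seed=42):
    """Generate simulated radar 'echoes' for one target on a straight track.
    Each scan yields (x, y, amplitude), or None to represent a dropout.
    Illustrative toy only, not the real trainer code."""
    rng = random.Random(seed)
    heading = math.radians(heading_deg)
    x, y = start
    echoes = []
    for _ in range(steps):
        if rng.random() < dropout_prob:
            echoes.append(None)                    # simulated dropout: no echo this scan
        else:
            fade = 0.8 + 0.2 * rng.random()        # simulated fade (amplitude jitter)
            echoes.append((round(x, 1), round(y, 1), round(fade, 2)))
        x += speed * math.sin(heading)             # advance one scan along the track
        y += speed * math.cos(heading)
    return echoes

echoes = simulate_track(start=(0.0, 0.0), heading_deg=90.0, speed=5.0, steps=10)
print(sum(e is None for e in echoes), "dropouts in", len(echoes), "scans")
```

In the real system many such tracks ran at once, and the dropout, fade and clutter parameters were not fixed constants but were adjusted live by the training officers as part of the exercise plan.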
My engineers went out there to conduct trials, but I stayed at base organising other work.
At about the time I moved to Wales the company had seen the writing on the wall regarding bulky computers such
as our standard naval-issue range, and was developing microprocessors. One, the lower-power F100, was inexpensive
and destined for a wide market. It came as a single chip processor and needed to work with a variety of "support"
components to make up a complete computer. Ferranti supplied these as well as a development kit that consisted
of a mains-powered box with some PCBs for various functions, and some spare slots. The other, the higher-power
Argus M700, was a re-engineered version of the commercial Argus computers. These used hybrid construction, being
several chips encapsulated into a single component.
Both these came in commercial and militarised form with radiation hardening, different clock speeds, and temperature tolerances to
suit a range of needs. I got involved in production testing of these later on, but they featured very little in
the mainstream land-based or avionics contracts.
In my opinion these developments were perhaps a vanity project, as they were very quickly superseded by devices
made by Motorola (the 68000 series that found a foothold in the market through Apple personal computers) and
Intel (that powered the IBM-style personal computers). I suspect that at the time there was a very limited
prospect of these devices being militarised and Ferranti thought they could establish a niche foothold.
While this may in retrospect seem futile, we must remember that Ferranti was the first company to make a radio
on a chip. So it was not the lack of technical experience or ability but the size of the commercial backing available to the
global chip-makers that forced the market.
An unmentionable Middle East power had obtained a quantity of ex-Russian tanks and wanted a modern fire control
system fitted. But they were not in the best condition to start with, and did not provide a stable gun platform.
Nevertheless we designed a "black box" that was stuck to the base plate and performed the necessary calculations.
For this project we used a new type of processor and computer architecture because it had to be contained in
something about the size of a cubic shoe-box, and it was the first time my engineers had come across the new Ferranti M700
militarised microprocessor. The box was delivered from the labs to site and mounted by the local engineers, but
my engineers were needed to conduct trials. It would be entirely inappropriate for me to name
the location, but a senior manager, on return from contract negotiations, said "if the Good Lord requires the
Universe to be administered a suppository, we have found the very place for Him to put it".
Later production items did not require my engineers, and I would not have seen any of them except for a nearly
disastrous incident. Three of the engineers were in a parked vehicle that had just come back from a sortie, when some
fool decided to throw petrol at the fuel filler. The tank ran on diesel, and they didn't use a funnel. So the
fuel ignited on the hot engine and the whole thing went up like a bomb. My engineers got out and clear by the skin
of their teeth, and brought home some interesting pictures of a fireball only yards from their hiding place behind
a low ridge of earth.
My only view of the computer was of the salvaged item when they brought it home. It had retained its shape, but
the contents had been entirely incinerated.
Equipment intended to be flown in military aircraft had to conform to a variety of stringent standards
not generally applicable to land-based applications. Obviously light weight and compact bulk were a premium
requirement, but as a lot of electronic equipment had to be packed into racks, the equipment had to conform
to standardised overall physical dimensions very different from the 19" racks we were used to. The bulk and weight restrictions meant that our equipment had to be
microprocessor based, which led us away from the established computer designs that we were using for ATC systems, and
into the new generation of computers. Also, in order to save bulk, the modularised equipment was constructed in sealed
boxes which had to be cooled by conduction through the box itself rather than an airflow over the components inside.
This materially affected the design of the internal components and circuits.
A particular problem with equipment from various suppliers operating in close proximity concerned mutual interference
by radiation of the high frequency internal signals, as well as potential eavesdropping. So we had to be mindful of
TEMPEST and EMC standards as well. The former concerns compromising emanations, signal radiation that could be
intercepted and reconstructed by an eavesdropper; the latter concerns both the interference an equipment radiates
towards its neighbours and its own susceptibility to interference from other radiated signals (local or remote).
So, apart from our usual design testing facilities, we needed to set up a special walk-in cabinet, lined with solid
metal sheets, where we could set up calibrated broadband listening equipment at a variety of angles. Our department
thus came to look very different by the end of the decade than it had at the start, and my engineers progressed without demur
into a whole new, emerging technical world.
For some reason (perhaps desperation to get contracts as our military business was drying up) the company
decided to hitch a ride on the upcoming AI bandwagon. This was an unusual extension for me because it involved
taking charge of a newly formed department, where hitherto I had been asked to take on mainstream engineering work for my established team.
The current state of the art was known as Expert Systems. It was nothing like the later Artificial Intelligence systems
that used Neural Networks, but it provided a way of defining complex real-life systems so that they could be routinely
analysed and reported. Our activities required us to act as agents for a USA company to market and support their
product across Europe. We therefore had three things to do: sell the product; support the customers in developing
their applications; and train the customers in the technology. When I got involved the project was well under way and
we had customers ranging from Aldermaston (nuclear research) to Lyonnaise des Eaux (the French water company), and many other
"blue chip" organisations.
The product, from a company called Inference, was known as their Automated Reasoning Tool (ART), and would be technically
described as an Inference Engine (IE). An IE operates in a similar way to a database engine but does a very different
job: it is installed on a computer to process structured data, and has to be "programmed" using a special
notation called Predicate Logic (PL). There were several IEs on the market, using different formats for the Predicate Logic.
This one used Lisp but others used Prolog, which was very similar.
The training school, which was my operation, taught the customers how to express real-life situations in PL
and obtain sensible answers from the analysis. We did not get involved in their particular projects, which were invariably
very sensitive, but another department provided specific help where needed. This whole field, as you may already
suspect, is imbued with a vocabulary of its own, quite different from the terms commonly used by conventional
programmers. The engineers who helped customers define their problems (the equivalent of analysts) called themselves
Knowledge Engineers (KEs), and described their work as "extracting the contents of experts' minds through their mouths".
Typical training problems might include planning multi-hop, multi-modal journeys given start and end points and a set
of timetables; controlling shunting in a railyard, or scheduling bus/coach services; or even playing Cluedo, draughts
or chess, or solving the Rubik's cube puzzle. The most suitable problems were those where the "rules" are essentially quite simple
and the data is structured. Thus, if the rules and the data can be set out by a trained user, seemingly complex
problems can be routinely tackled without resorting to a full-blown programmed application, which would be expensive and
time-consuming to get working. The difficulty lies in expressing a real-life situation, such as the foregoing examples, in
PL notation, and teaching that was the task of my training school.
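As a flavour of the rules-plus-structured-data idea, here is a toy forward-chaining "inference engine" sketched in Python. I cannot reproduce the ART notation from memory, so the engine, the place names and the "reach" rule are all my own invented illustration of the principle rather than anything we actually taught.

```python
# Toy forward-chaining engine: keep applying rules until no new facts emerge.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for new_fact in rule(facts):
                if new_fact not in facts:
                    facts.add(new_fact)
                    changed = True
    return facts

# Facts: direct legs in a timetable, as ("leg", origin, destination) tuples.
timetable = {("leg", "Cardiff", "Bristol"),
             ("leg", "Bristol", "London"),
             ("leg", "London", "Paris")}

# Rule: every leg is reachable; and if A reaches B and a leg runs B -> C,
# then A reaches C (the classic transitive-closure rule).
def reachability(facts):
    derived = set()
    for f in facts:
        if f[0] == "leg":
            derived.add(("reach", f[1], f[2]))
    for r in [f for f in facts if f[0] == "reach"]:
        for l in [f for f in facts if f[0] == "leg"]:
            if r[2] == l[1]:
                derived.add(("reach", r[1], l[2]))
    return derived

result = forward_chain(timetable, [reachability])
print(("reach", "Cardiff", "Paris") in result)
```

The point of the exercise is that once the timetable facts and the one rule are written down, multi-hop journey questions answer themselves; no bespoke application program is needed, which is exactly the economy the Expert Systems approach promised.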
When delivering a technical product and the user training, customers understandably want them to happen at about the same
time. One without the other is of no use, and knowledge atrophies very quickly if it is not used. But there were serious
discrepancies and the training school was unable or unwilling to resolve them. That is why I was asked to take it over. I
used the existing training team, who helpfully put me on their next course so I could understand what they did, and we
turned things round very quickly. So after an initial flurry of excitement the team more or less ran itself.
As a matter of interest, after leaving Ferranti I discovered a freeware IE called BProlog, installed it on my Linux server
in the office and used it to play with a variety of problems. While Inference Engines have been overshadowed by the onslaught of Neural
Networks (now the conventional and commonplace AI), the latter are extremely resource-consuming because they usually need
many thousands of data samples to determine recognisable patterns. Once a pattern "chart" has eventually been determined for a particular problem
it can be easily copied and applied; so our mobile phones can recognise handwriting and voices, for example.
Prolog-based systems (AKA Expert Systems) are more efficiently used where the problems can be defined as a set of "rules" by an appropriate
expert (hence the name) and coded in PL comparatively quickly (perhaps by the experts themselves, with a bit of help from a Knowledge
Engineer), so that seemingly complex problems can be sorted out in a very short time.