The growth of technology has been at breakneck speed. Technology that was cutting-edge even a few decades ago is in many cases now passé, and has in some instances passed out of existence altogether. All this has been the result of the ushering in of the era of modern digital computers.
The history of modern digital computers is actually quite short. Many people have played a telling role in its growth and progress. One of them is American researcher George Stibitz, recognised as one of the fathers of the modern digital computer.
Experimenter at heart
Born in Pennsylvania in 1904, Stibitz spent his childhood in Dayton, Ohio. This is where his father taught as a professor of theology, while his mother worked as a maths teacher. An experimenter at heart with an inclination towards science and engineering, Stibitz started tinkering with electrical gadgets even as a child. On one occasion, he nearly set his house on fire by overloading the circuits with an electric motor his father had given him.
After earning his bachelor’s degree at Denison University in 1926, he was awarded his M.S. degree in 1927 from Union College in Schenectady, New York. Following a year working as a technician at General Electric, Stibitz began his doctoral programme at Cornell University and received his Ph.D. in mathematical physics in 1930.
Relays for computing
Working as a research mathematician at the Bell Telephone Laboratories in New York City, Stibitz was tasked with helping design and operate an increasingly complex system of telephones. Stibitz made his breakthrough in 1937, when he came up with the idea of using relays for automated computing, the discovery for which he is best known.
Relays are mechanical devices that take one of two positions – open or closed – depending on whether an electrical current passes through them. With this ability to control the flow of current, the relay functioned like a gate and was a common device for regulating telephone circuits.
In November 1937, Stibitz decided to find out if relays could be used to perform simple mathematical functions. Using devices borrowed from the Bell stockroom, Stibitz assembled a simple computing system on his kitchen table at home.
Using relays, a dry cell, flashlight bulbs, and metal strips cut from a can, Stibitz soon had a device that lit up to represent the binary digit “1” and stayed unlit to represent the binary digit “0”. The device could use binary mathematics to add and subtract, and was soon dubbed the “Model K” by Stibitz’s colleagues, as he had built it on a kitchen table.
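The principle behind the device can be sketched in modern code. Treating a closed relay as the binary digit 1 and an open relay as 0, chains of such two-state switches can add binary numbers. The sketch below is purely illustrative, using Python's bitwise operators to stand in for relay logic; it is not a reconstruction of Stibitz's actual wiring.

```python
# Illustrative sketch: binary addition with two-state "relay" values.
# Each bit is 0 (relay open, bulb unlit) or 1 (relay closed, bulb lit).

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_binary(x, y, width=4):
    """Ripple-carry addition: feed each bit pair through a full adder."""
    carry, bits = 0, []
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        bits.append(s)
    return sum(bit << i for i, bit in enumerate(bits))

print(add_binary(3, 5))  # → 8
```

Subtraction works on the same hardware by adding the complement of one operand, which is why a handful of relays sufficed for both operations.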
When Stibitz first demonstrated the device, Bell executives weren’t really impressed. But with increasing pressure to solve the complex mathematical problems confronting them, the executives changed their mind and decided to fund the construction of a large experimental model of Stibitz’s device.
Along with switching engineer Samuel Williams, Stibitz got to work and the Complex Number Computer (CNC) was ready by the end of 1939. First put into operation on January 8, 1940, the CNC was able to add, subtract, multiply, and divide complex numbers, the kind of calculations that were troublesome for Bell engineers.
Uses it remotely
By September the same year, Stibitz achieved another milestone in computer science with the CNC, by making it the first computing machine ever to be used remotely. In a demonstration to the American Mathematical Society at Dartmouth College, Stibitz sent commands to the CNC in New York over telegraph lines. When the correct answers were received less than a minute later, the audience were left dumbfounded.
Even though the demonstration was a success, it was another decade before further advances were made in this area, as resources were poured into efforts pertaining to World War II. As for Stibitz, he contributed to the war effort by working to improve the CNC for the National Defense Research Committee.
After the war, Stibitz moved to academia and focussed on using computers to solve biomedical problems. By the time he died aged 90 in 1995, digital computing had not only changed the medical landscape, but also communications, factories, and virtually every conceivable field.