A replica of the Model K built by George Stibitz is in the Computer History Museum. | Image source: Wikimedia Commons
Technology advances quickly. Tools that were cutting edge only a few decades ago are in many cases obsolete, and some have disappeared entirely. Much of this change was ushered in by the era of modern digital computers.
The history of the modern digital computer is actually quite short, yet many people played prominent roles in its growth and advancement. One of them was the American researcher George Stibitz, remembered as one of the fathers of the modern digital computer.
An experimenter at heart
Born in Pennsylvania in 1904, Stibitz spent his childhood in Dayton, Ohio, where his father taught theology and his mother worked as a mathematics teacher. An experimenter at heart with a bent for science and engineering, Stibitz began tinkering with electrical appliances as a child. On one occasion he even nearly set his house on fire by overloading an electrical circuit with an electric motor his father had given him.
After receiving his bachelor’s degree from Denison University in 1926, Stibitz earned a master’s degree from Union College in Schenectady, New York, in 1927. Following a year as a technician at General Electric, he began doctoral studies at Cornell University, receiving his Ph.D. in mathematical physics in 1930.
Relays for computing
Working as a research mathematician at Bell Telephone Laboratories in New York City, Stibitz was tasked with helping design and operate increasingly complex telephone systems. He made a breakthrough in 1937 when he came up with the idea of using relays for automatic computing, the discovery for which he is best known.
Relays are electromechanical devices that assume one of two positions – open or closed – when an electric current is passed through them. Because it can control the flow of current, a relay functions as a gate, and relays were common devices for regulating telephone circuits.
In November 1937, Stibitz decided to see whether relays could be used to perform simple mathematical functions. Using spare parts borrowed from Bell, he assembled a simple computing device on his kitchen table at home.
Built from relays, a dry-cell battery, flashlight bulbs, and metal strips cut from a tin can, the device lit a bulb to represent the binary digit “1” and left it unlit to represent “0”. It could add and subtract using binary arithmetic, and Stibitz’s colleagues soon dubbed it the “Model K”, after the kitchen table on which it was built.
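The idea behind the kitchen-table device can be sketched in a few lines of code: treat each relay as a switch that is either open or closed, and combine relay-like gates into a one-bit binary adder whose outputs light the “sum” and “carry” bulbs. This is only an illustrative sketch, not Stibitz’s actual circuit; the function names are invented for this example.

```python
# Sketch of relay logic: each relay is modeled as a boolean
# (True = closed/current flows, False = open/no current).
# These are illustrative names, not Stibitz's terminology.

def relay_and(a, b):
    # Current flows only if both relays are closed (series wiring).
    return a and b

def relay_xor(a, b):
    # Current flows if exactly one relay is closed.
    return a != b

def half_adder(a, b):
    # Adds two binary digits; returns (sum bulb, carry bulb).
    return relay_xor(a, b), relay_and(a, b)

# 1 + 1 in binary: sum bulb off, carry bulb on (binary "10").
print(half_adder(True, True))
```

Chaining such one-bit adders together, with each stage’s carry feeding the next, is how relay machines extended this trick to multi-digit binary arithmetic.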
When it was first shown, Bell executives weren’t particularly impressed. But under growing pressure to solve the complex mathematical problems facing the company, they changed their minds and decided to fund the construction of a full-scale experimental model of Stibitz’s device.
Together with switching engineer Samuel Williams, Stibitz got down to work and had the Complex Number Calculator (CNC) ready by the end of 1939. First run on January 8, 1940, the CNC could add, subtract, multiply, and divide complex numbers, the kind of calculations that were troublesome for Bell engineers.
Operating it remotely
Plaque commemorating the first remote operation of the Complex Number Calculator. | Image source: Wikimedia Commons
By September of the same year, Stibitz had reached another computing milestone with the CNC, making it the first computing machine to be operated remotely. In a demonstration for the American Mathematical Society at Dartmouth College, Stibitz sent commands to the CNC in New York over telegraph lines. When the correct answers came back less than a minute later, the audience was astonished.
Although the demonstration was a success, it was another decade before the field made much further progress, as resources were poured into efforts related to World War II. As for Stibitz, he contributed to the war effort by working on CNC improvements for the National Defense Research Committee.
After the war, Stibitz moved into academia and focused on using computers to solve biomedical problems. By the time he died in 1995 at the age of 90, digital computing had transformed not only medicine but also communications, manufacturing, and virtually every other field.