According to most accounts, the history of computing is a triumph of enterprise. This story starts in the 1950s and 1960s with commercial mainframe computers that, one stack of punch-cards at a time, assumed business tasks ranging from managing airline reservations to calculating betting odds. But the public’s day-to-day life looked much the same. Then, in the mid-1970s, geniuses like Steve Jobs and Bill Gates pioneered home computing. Personal computers, and later smartphones and the internet, became the defining technologies of our age. Nerdy men, often in their garages, had remade the world.
An appealing story, but it leaves out a lot. In fact, it might leave out the key parts. People were sending electronic messages all over New England in 1968. Around that time, professors and students at Dartmouth College pioneered the BASIC programming language, innovative for prioritising clarity over efficiency. Soon it was the lingua franca of hobbyists and students worldwide. In the early 1970s, the People’s Computing Company organised low-cost classes and school visits, and circulated publications featuring computer programs that readers could copy, modify, and redistribute. This social world was the fertile soil from which personal computing grew.