On March 12, 1989, computer programmer Tim Berners-Lee wrote a paper titled “Information Management: A Proposal.” Working at CERN, the European Organization for Nuclear Research, he faced a problem: information about particle accelerators and experiments was stored on too many different computers, with no convenient way to access the data from another computer and no good way to link data stored on one machine to data stored on another. His proposed solution contained early but recognizable descriptions of HTTP, HTML, and the URI.
Tim Berners-Lee didn’t invent the Internet. Kind of like Al Gore. But he did invent something, and his invention made the Internet infinitely easier to use. It also had many uses beyond his initial need to share information about nuclear science.
The dawn of the World Wide Web

His plan evolved into a framework for something more ambitious that could benefit not just CERN but the whole world: the World Wide Web. It wasn’t just useful for managing scientific data; any kind of data benefited from his idea.
It’s not really fair to say that March 12, 1989, is when the Web was born. There was still a great deal of work required to turn Berners-Lee’s ideas into something functional. By November 1990, he had a formal proposal on paper, and the month after that, he had a working web browser and a working web server running on NeXT hardware. The Web itself launched May 17, 1991. So it took about 26 months for Tim Berners-Lee to turn his ideas into a functioning system that CERN and others could use.
The next step was getting it running on other types of computers as well. Various organizations around the world contributed to that work, including the National Center for Supercomputing Applications (NCSA) in the United States. And it wasn’t long before college students were coding their own web pages and viewing other pages with NCSA Mosaic or an early version of Netscape. From there, the dotcom bubble wasn’t far away either.
On April 30, 1993, CERN placed its World Wide Web software into the public domain, ensuring it would forever be available to all.
Time magazine included Berners-Lee in its list of the 100 most important people of the 20th century. And Queen Elizabeth II knighted him in 2004.
Today we use Berners-Lee’s invention to do a lot of things that have very little to do with nuclear science. Like denying science altogether. Or ordering pizza. Or arguing with strangers and looking at pictures of cats. But it’s completely fair to say his invention changed the world. It may surprise some people that he worked on the idea for several years before it became mainstream.

David Farquhar is a computer security professional, entrepreneur, and author. He has written professionally about computers since 1991, so he was writing about retro computers when they were still new. He has worked in IT professionally since 1994 and has specialized in vulnerability management since 2013. He holds Security+ and CISSP certifications. Today he blogs five times a week, mostly about retro computers and retro gaming, covering the period from 1975 to 2000.
