The first electronic computer filled an entire room. But something the size of a fingernail put all that power in people’s hands: the Intel 4004. The forerunner of ever smaller, faster, cheaper microprocessors, the 4004 showed it was possible to put all of a computer’s processing onto a tiny slice of silicon. In doing so, it not only transformed existing devices but helped create new ones.

Oh, and it was all kind of an accident. A small startup in 1969, Intel needed revenue to fund its primary business: making memory chips. So it accepted an offer from Japanese electronics maker Busicom to build several custom chips for Busicom’s new line of calculators. Behind schedule and short on resources, Intel’s three-person team scrambled to deliver a family of four chips for the calculator, including the central processing unit chip that would become the 4004.

10,000 nanometers: circuit line width of the original Intel 4004 microprocessor

10 nanometers: circuit line width of today’s smallest Intel microprocessors

By the time it was completed in 1971, the chip was general-purpose: programmed with new instructions, it could run a range of devices beyond Busicom’s calculator, a pinball machine, for example. As Intel began to realize the possibilities of the technology, it made a deal with Busicom to secure the rights to develop the chip for other products and markets.

Andrew Grove, PhD, Intel’s third employee and eventual CEO, and his team continued to iterate on chip technology, and in 1981 IBM chose Intel’s 8088 chip to power its personal computer, the machine that would come to define the PC market. By 2011, Intel’s share of the microchip market for PCs hit 80 percent. Today, microprocessors are ubiquitous, from mobile devices to home appliances, livestock ear tags to industrial equipment. Companies are cranking out chips so small and so sophisticated that some can even be implanted in people, transforming their hands into virtual wallets.