The Internet of Things emerged as a technological infrastructure in the 2000s. Yet an everyday object was already able to report its status over a network before that: In 1982, computer science students at Carnegie Mellon University in Pittsburgh monitored the fill level of a beverage vending machine over ARPANET, the precursor to the Internet. The first connected home appliance followed in 1990, in the early days of the World Wide Web – even before the first website, which went live in 1991: The U.S. software and networking expert John Romkey and the Australian computer scientist Simon Hackett connected a toaster to the Internet during the Interop conference, where it could be switched on and off remotely. That toaster is now regarded as the first IoT device.
The term “Internet of Things” was coined by the British researcher Kevin Ashton back in 1999. Ashton, an expert in sensor and identification technologies at the Massachusetts Institute of Technology, used it in connection with passive RFID tags. RFID (radio-frequency identification) is a technology in which a reader retrieves data stored on a tag – and can write to it – without physical contact. Here, too, physical objects were linked to the virtual world, albeit within a narrowly defined domain. A little later, in 2000, the electronics company LG presented the idea of an Internet refrigerator that notifies its owners when the cheese, butter or eggs inside have run out.
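To make the principle concrete, here is a minimal sketch in Python of how a passive RFID tag can tie a physical object to a digital record, in the spirit of Ashton's vision and LG's refrigerator: a reader reports a tag's ID, software looks up the matching product, and a notification fires when the item runs out. All names, tag IDs and data below (Product, on_tag_removed, notify_owner, the inventory entries) are invented for illustration; a real system would sit on top of actual reader hardware and standards such as EPC Gen2.

# A minimal, hypothetical sketch of the RFID idea: each physical object
# carries a tag with a unique ID; software maps that ID to a record in
# the digital world. Tag IDs and product data are invented examples.

from dataclasses import dataclass

@dataclass
class Product:
    name: str
    quantity: int  # items currently in the fridge

# The "virtual world": tag ID -> object record (invented example data)
inventory: dict[str, Product] = {
    "E200-3411-B802": Product("cheese", 1),
    "E200-3411-B803": Product("butter", 2),
    "E200-3411-B804": Product("eggs", 6),
}

def notify_owner(message: str) -> None:
    # Stand-in for a real notification channel (app push, e-mail, ...)
    print(f"[fridge] {message}")

def on_tag_removed(tag_id: str) -> None:
    """Called when the (hypothetical) reader no longer sees a tag,
    i.e. the item has been taken out of the fridge."""
    product = inventory.get(tag_id)
    if product is None:
        return  # unknown tag, ignore
    product.quantity = max(0, product.quantity - 1)
    if product.quantity == 0:
        notify_owner(f"{product.name} has run out")

# Simulated reader event: the last piece of cheese is removed
on_tag_removed("E200-3411-B802")

The point of the sketch is the lookup step: the tag itself is passive and carries little more than an identifier, and it is the software behind the reader that connects the physical object to its digital representation.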
Connectivity has expanded significantly since then: The network equipment maker Cisco calculated that by 2008 more devices were connected to the Internet than there were people on Earth – not only smartphones and computers, but all sorts of everyday objects. And more and more devices are becoming smart: According to industry forecasts, around 75 billion devices worldwide will be connected to the Internet by 2025.