Wouldn’t it be convenient to tell your air-conditioning system to activate at a certain temperature before you head home from work? Or have your front door unlock automatically just when you need it to? These scenarios are examples of the Internet of Things (IoT) at work.
The Internet of Things is a system that connects any electronic device, gadget, machine, microchip, sensor, appliance, or building—just about anything, in fact—to the Internet. As a result, all these things can collect information and share it with each other. Such interconnection of devices and machines allows people to monitor, control, and improve their overall environment.
Over the last few years, we’ve seen the Internet of Things make practically everyone’s life more convenient. Even the remote controls we once thought so innovative now seem all but obsolete.
How Does the Internet of Things (IoT) Work?
Much like the computers that we use to access the Internet daily, each IoT-enabled or smart device gets a unique Internet Protocol (IP) address. That way, other Internet-connected devices will know how to communicate with each IoT device. A message or command meant for your smart TV won’t get sent to your smart refrigerator and vice versa.
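The addressing idea above can be illustrated with a toy sketch. The addresses and device names below are hypothetical, and real delivery is handled by network routers rather than application code, but the principle is the same: a command names a destination address, and only the device registered at that address receives it.

```python
# Toy sketch of address-based delivery on a home network.
# Each smart device is registered under a unique (hypothetical) IP address.
devices = {
    "192.168.1.20": "smart TV",
    "192.168.1.21": "smart refrigerator",
}

def deliver(dest_ip: str, command: str) -> str:
    """Hand a command only to the device registered at dest_ip."""
    device = devices.get(dest_ip)
    if device is None:
        return f"no device at {dest_ip}"
    return f"{device} received: {command}"

print(deliver("192.168.1.20", "volume up"))  # reaches the TV, not the fridge
```

Because every device has its own address, a misdirected command simply has nowhere to go: sending to an unregistered address fails cleanly instead of reaching the wrong appliance.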
IoT aims to enable smart devices to send data to one another without much human intervention. You can, for instance, configure your smart coffee maker to turn on at 6 A.M. every day, soon after you wake up. By the time you’re dressed for work, all you need to do is grab your cup of freshly brewed coffee on your way out the door.
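The coffee-maker scenario boils down to a simple daily schedule check. Here is a minimal sketch of that logic; the 6 A.M. trigger time matches the example above, while the function name and the once-per-day bookkeeping are assumptions for illustration.

```python
import datetime
from typing import Optional

BREW_TIME = datetime.time(6, 0)  # scheduled trigger: 6 A.M. daily

def should_brew(now: datetime.datetime,
                last_brewed: Optional[datetime.date]) -> bool:
    """Return True if it's at or past the trigger time and we
    haven't already brewed today."""
    return now.time() >= BREW_TIME and last_brewed != now.date()
```

A real smart device would run a check like this in a loop (or subscribe to a cloud scheduler) and then fire the hardware command, but the scheduling decision itself is just a time comparison plus a guard against repeating the action.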
How Did the Internet of Things (IoT) Originate?
Adding sensors and intelligence to everyday objects made its way into discussions as early as the 1980s. One of the earliest IoT projects was probably an Internet-connected vending machine. But the chips back then were too big and bulky, and there was no way for objects to communicate effectively.
The emergence of radio frequency identification (RFID) tags or low-power chips that can communicate wirelessly solved some issues. So did the increasing availability of broadband Internet and cellular and wireless networking. The last catalyst was the adoption of IPv6, which should provide enough IP addresses for every IoT device.
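A quick bit of arithmetic shows why IPv6 matters here: IPv4 addresses are 32 bits long, while IPv6 addresses are 128 bits, so the address space grows by a factor of 2^96.

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32    # about 4.3 billion
ipv6_addresses = 2 ** 128   # about 3.4 x 10^38

print(f"IPv4: {ipv4_addresses:,} addresses")
print(f"IPv6: {ipv6_addresses:.2e} addresses")
print(f"Growth factor: 2^{128 - 32}")
```

Roughly 4.3 billion IPv4 addresses could never cover billions of people each owning multiple connected devices; the IPv6 space is so large that exhausting it is not a practical concern.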
We owe the term “Internet of Things” to Kevin Ashton, who coined it in 1999. But neither the technology nor the term gained mainstream adoption until manufacturers began embedding RFID tags in expensive equipment to track its location. That was one of the first IoT applications in the manufacturing sector, and it gave birth to “machine-to-machine (M2M)” technology. Some suggested the name “blogjects” for Internet-connected devices, and “ubiquitous computing (ubicomp),” “invisible computing,” and “pervasive computing” for what we now know as the Internet of Things. But those names didn’t stick.
What Technologies Made the Internet of Things Possible?
While the concept behind IoT emerged decades ago, we had to wait for several technological advancements to turn it into reality. These developments include:
- Access to low-cost and low-power sensor technology
- Cloud computing platforms
- Machine learning (ML) and analytics
- Conversational artificial intelligence (AI), specifically neural networks and natural language processing (NLP), as evidenced by applications like Alexa and Siri
IoT has become one of the most important 21st-century technologies. It paved the way for connecting the appliances and machines we use daily (e.g., kitchen appliances, cars, and thermostats) to the Internet. As a result, we can seamlessly communicate with people, processes, and things.
IoT has made human-to-machine communication possible.