The tech industry is infamous for its buzzwords and platitudes. Whether it’s digital transformation, machine learning or smart technology, these terms quickly become fashionable, get bandied about, and then either go stale or grow increasingly nebulous in their definition.

There is currently no shortage of these words, and one of the biggest offenders at present is the Internet of Things, or IoT, which refers to interconnected physical devices across the world. At this point, however, IoT has come to refer to basically anything with an internet connection. The term has seen a rapid explosion in popularity over the last five years as it has become an industry standard in the new world of hyper-connectivity.

Gartner estimates that by 2020 there will be over 12 billion smart devices installed by consumers, four times the present amount. Across both the consumer and business markets there will be 20 billion devices, giving the market surrounding IoT devices a valuation of almost $3 trillion. However, these stats account for any and all smart devices connected to the internet, so at what point does the IoT just become the internet? And what does IoT actually mean?

Google search interest over time for IoT, Jan ’04 to present

The term ‘Internet of Things’ was first coined as far back as 1985 by Peter T. Lewis, despite claims from MIT that it originated there in ’99. In a speech delivered at an FCC-supported session at the Congressional Black Caucus in September 1985, Lewis described the IoT as: “The integration of people, processes and technology with connectable devices and sensors to enable remote monitoring, status, manipulation and evaluation of trends of such devices.”

For 1985 this definition seems sufficient, but in 2017 it could apply to almost anything we interact with on a daily basis. Has the IoT become the internet, or has the internet become the IoT?

Kevin Ashton, founder of the Auto-ID Center at MIT, mistakenly took credit for coining the term in 1999, but his definition is more or less the same as Lewis’s, describing the IoT as a world in which physical devices connect to the internet through ubiquitous sensors.

Ten years later, in a 2009 piece for RFID Journal, Ashton went into more depth about his earlier definition, arguing that the linchpin of the IoT is the collection, analysis and use of data. Computers rely on data created by human beings, but human beings have limited time available and therefore cannot input sensory data on the scale the IoT requires.

In the piece Ashton wrote: “We’re physical, and so is our environment. Our economy, society and survival aren’t based on ideas or information—they’re based on things. You can’t eat bits, burn them to stay warm or put them in your gas tank. Ideas and information are important, but things matter much more. Yet today’s information technology is so dependent on data originated by people that our computers know more about ideas than things.”

“If we had computers that knew everything there was to know about things—using data they gathered without any help from us—we would be able to track and count everything, and greatly reduce waste, loss and cost.”

It would seem then that, at its very base, the IoT is absolutely about the collection and use of data.

In 2017, however, this definition becomes tricky, as practically everything we can imagine is capable of collecting data. Smartphones are probably the most common consumer device, but there are also myriad other physical devices capable of gathering data, such as internet-connected doorbells, smart hoovers and even smart kettles.


In cities, governments are using traffic cameras and road sensors to work out how best to maximise the efficiency of traffic flow and the use of parking spaces. In several European cities, authorities have even introduced smart waste bins that can detect when the receptacle is full, so that pick-up times can be optimised, or that use pneumatic tubes to dispose of the refuse.

Technologies such as 5G are expected to bring even more physical devices online. So, as we move ever closer to the IoT as a concept and every device we own becomes ‘smart’, has the naming convention behind smart technology and the IoT become redundant?

When asked at this year’s CeBIT, Dr Joseph Reger, Global Business CTO at Fujitsu, told CBR: “People very quickly label things ‘smart’ but that’s actually not needed. If you know that there’s an identifiable tag on these things, the ‘smarts’ doesn’t have to be in the device; it can actually be in a database, so I know how that thing is operating and what it is doing.”

“I always emphasise that we shouldn’t make the mistake of thinking that for IoT everything needs to be smart, everything will be able to communicate because of 5G and it’s enough if we can just identify them. The sensors and the equipment prices are all coming down very much and we will have a situation where it is even easier to manage systems. That will be the time when the Internet of Things is just the Internet, and that’s it.”

Dr Reger gave the analogy of a smart monitoring system in a hospital, which could be used to track the availability of assets within healthcare systems. Something like a wheelchair or a bed does not need to be smart; it merely needs to be identifiable to the network and to send its data, which machines can then analyse to determine the most effective way to utilise those assets.
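
Dr Reger’s point lends itself to a simple illustration. The sketch below, written in Python with hypothetical class and field names, shows the ‘smarts in the database’ pattern he describes: the tagged asset reports nothing but an identifier and a location, and all of the reasoning about availability happens centrally.

```python
# Minimal sketch of the 'smarts in the database' pattern: the tagged asset
# only reports its ID and position; the analysis lives server-side.
# Class, field and location names are illustrative, not a real product API.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class AssetReport:
    asset_id: str       # e.g. the RFID tag on a wheelchair
    location: str       # ward or room the tag was last seen in
    seen_at: datetime


class AssetRegistry:
    """Central store that holds the 'smarts' instead of the device."""

    def __init__(self):
        self._last_seen: dict[str, AssetReport] = {}

    def ingest(self, report: AssetReport) -> None:
        # The device does nothing clever; it just phones home.
        self._last_seen[report.asset_id] = report

    def available_in(self, location: str, max_age: timedelta) -> list[str]:
        # The analysis happens here: which assets were recently seen where?
        cutoff = datetime.now() - max_age
        return [r.asset_id for r in self._last_seen.values()
                if r.location == location and r.seen_at >= cutoff]


# Usage: a wheelchair tag reports in, and a ward clerk queries the registry.
registry = AssetRegistry()
registry.ingest(AssetReport("wheelchair-17", "ward-3", datetime.now()))
print(registry.available_in("ward-3", timedelta(minutes=10)))
```

The wheelchair stays cheap and dumb; changing the registry’s query logic changes the behaviour of the whole fleet without touching a single device.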

Whilst this definition very much conforms to the original idea of the IoT as a network of sensors, it deviates slightly in suggesting that, because the concept is becoming so ubiquitous, it will eventually be an exercise in futility to continue referring to it as the IoT at all.

Dr Reger added: “In the era of 5G there will not be many devices that use electricity and are not connected to the network. And one of the few things that still doesn’t use electricity might be a brick or a rock.”


However, the idea that the IoT will progress to a point at which the smart technology is centralised in a hub is up for some debate. HPE believes that computing at the edge, on or near the devices themselves, will become increasingly important in order to reduce latency and protect sensitive data. This interpretation of the IoT means that devices will always be different enough from internet and IT devices to warrant their own separate categorisation.

Peter Widmer, Category Manager EMEA for Moonshot Edgeline IoT at HPE, told CBR: “The problem with that approach is that it’s not scalable, due to the physics. Adding some gears in between the process, bandwidth, wireless etc. creates latency. You also have to deal with the connection being lost: what do you do with the data? You have to cache it, ensure it’s not replicated or duplicated, maybe even prevent corruption if it’s not secure enough.”

“I predict two years from now around 45% of data will be computed at the edge, for obvious reasons. Why should I wait for a decision if I want to shut down a reactor in an automatic plant? Why should I wait for a non-secure connection? No, I think that will always be specifically IoT.”
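
Widmer’s caching and latency argument can also be sketched in code. The fragment below is a minimal, illustrative take on that edge-side behaviour rather than HPE’s implementation; the threshold, record format and send() stub are assumptions. Critical decisions are made locally with no round trip to the cloud, and every reading is cached under a unique key so a dropped connection loses nothing and a retried upload duplicates nothing.

```python
# Minimal sketch of edge-side handling: decide locally, cache everything,
# replay safely. The threshold, record layout and send() stub are invented
# for illustration and stand in for a real uplink and control system.

import time
import uuid

TEMP_SHUTDOWN_C = 600.0          # hypothetical safety threshold
_pending: dict[str, dict] = {}   # locally cached, not-yet-acknowledged readings


def send(record: dict) -> bool:
    """Stub for the uplink; returns False while the connection is down."""
    return False


def handle_reading(sensor_id: str, temperature_c: float) -> None:
    # 1. Decide locally: no waiting on the network for a safety shutdown.
    if temperature_c >= TEMP_SHUTDOWN_C:
        print(f"{sensor_id}: shutting down locally, temp={temperature_c}")

    # 2. Cache the reading under a unique key so retries never duplicate it.
    record = {"id": str(uuid.uuid4()), "sensor": sensor_id,
              "temp": temperature_c, "ts": time.time()}
    _pending[record["id"]] = record


def flush_cache() -> None:
    # 3. When the uplink returns, replay the cache; acknowledged records are
    #    dropped, everything else stays put for the next attempt.
    for key in list(_pending):
        if send(_pending[key]):
            del _pending[key]


handle_reading("reactor-temp-1", 612.0)
flush_cache()   # nothing is lost while the connection is down
```

The local threshold check removes the round-trip wait Widmer objects to, and the unique key on each cached record is what stops a replay from replicating or duplicating data when the uplink flaps.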

This definition asserts that the IoT will always remain firmly separate from the internet, and that the specific characteristics that categorise a device as part of the IoT will persist. Even with almost every device connected, the fact that the ‘smarts’ of the system will increasingly reside on the device itself means the IoT cements its independence.

So whilst the core definition of the IoT has remained essentially unchanged since its inception in 1985, the future of the term appears to be up for debate. Everyone can agree that the IoT is, at heart, a connected network of sensors that enables effective management; but whether the location of the smarts makes a difference, and whether the advent of 5G and hyper-connectivity will render the term redundant, remains to be seen.