Sometime in the 1990s, when the Web was young, the savants at Xerox PARC connected a drinking fountain to the Internet such that the height of the stream provided a real-time indicator of their parent company’s stock performance throughout the day. While this watery exercise might have seemed little more than a stunt by zany nerds with too much time on their hands, it was a harbinger of things to come, indeed of the Internet of Things.
The PARC people didn’t use that phrase back then, but their leader at the time, John Seely Brown, often spoke of ubiquitous or pervasive computing. The premise was that as microprocessors found their way into more “things,” we would soon live in a world where computation was everywhere. Like a lot of bright ideas from PARC, this one was slow to gain traction. After all, who needed a microprocessor-controlled, Internet-connected refrigerator to tell them the milk was past its sell date?
But now that microprocessors have proliferated, the Internet of Things has gone from vague hypothesis to strategic imperative. Networking giant Cisco, which prefers the term Internet of Everything, projects that the coming ubiquity of computation and connectivity will generate at least $613 billion in global corporate profits in calendar year 2013 alone. Over the next decade, the company estimates, $14.4 trillion of value (net profit) will be at stake globally, driven by connecting the unconnected—people-to-people, machine-to-people and machine-to-machine—via the Internet of Everything.
That would certainly be good news for Cisco, which as a leading purveyor of Internet infrastructure is positioned to reap a substantial portion of those profits, almost regardless of how the scenario shakes out. Of course, that depends on Cisco following a coherent strategy and executing with few fumbles, and many a dominant incumbent has stumbled at such an inflection point.
If Cisco and other tech-savvy prognosticators are correct, we are about to enter, or have already entered, an era of exponential growth accompanied by exponential change. New fortunes will be made on the right side of that equation, long-held hegemonies lost on the other. And if the Internet of Things got off to a slow start, it is now moving very fast.
FIRST THERE WAS RFID
The phrase Internet of Things dates to June 1999, when a British technologist named Kevin Ashton gave a presentation with that title at Procter & Gamble. Ashton went on to co-found the Auto-ID Center at the Massachusetts Institute of Technology (M.I.T.), which devised a system of global standards for radio-frequency identification (RFID) and other sensors. Linking the new idea of RFID in P&G’s supply chain to the then-red-hot topic of the Internet certainly caught the assembled executives’ attention, but Ashton believed that much more was at stake. “The fact that I was probably the first person to say ‘Internet of Things’ doesn’t give me any right to control how others use the phrase,” Ashton wrote in a 2009 article for RFID Journal. “But what I meant, and still mean, is this: Today computers—and, therefore, the Internet—are almost wholly dependent on human beings for information. Nearly all of the roughly 50 petabytes (a petabyte is 1,024 terabytes) of data available on the Internet were first captured and created by human beings—by typing, pressing a record button, taking a digital picture or scanning a bar code. Conventional diagrams of the Internet include servers and routers and so on, but they leave out the most numerous and important routers of all: people. The problem is, people have limited time, attention and accuracy.”
Ashton argued that our economy, indeed our survival as a society, depended less on the ideas generated by people and distributed on the Internet than it did on physical things. But today’s information technology, dependent as it is on data originated by people, knows much more about ideas than it does about things, which until recently had no way to communicate. He called for a new computing vision that went far beyond conventional uses of RFID, as a sort of bar code on steroids, to incorporate computation and communication in seemingly mundane things. Ashton’s initial presentation spawned dozens if not hundreds of papers on the Internet of Things, some laudatory and utopian, some projecting a deeply disturbing future in which technology imperceptibly influences moral decision-making and reduces human agency.
WHEN THINGS GAIN SMARTNESS
But many of the new nodes on the Internet of Things are benign, even friendly. Consider Nest, a smart thermostat. Nest, the company, was founded and is headed by Tony Fadell, who led the team at Apple that created the first 18 generations of the iPod and the first three generations of the iPhone. What those devices are to the Internet as we know it, the Nest may be to the Internet of Things. A Nest costs $250, but unlike the $20 thermostats sold at Home Depot, it programs itself in about a week by observing and interpreting how you use your heating and cooling systems. It creates a personalized schedule based on the temperature changes you’ve made and continually adapts to your needs, automatically balancing comfort and energy savings. When you leave the house, Nest senses you are gone and automatically adjusts the temperature to avoid heating or cooling an empty home. And it connects via Wi-Fi to the Internet so you can make remote adjustments with your smartphone or laptop.
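The self-programming behavior described above can be pictured as a simple learning loop: log the manual adjustments a user makes, then replay the pattern, falling back to an energy-saving setpoint when the house is empty. The sketch below is a hypothetical toy model, not Nest’s actual algorithm; the class name, temperatures and auto-away logic are all invented for illustration.

```python
from collections import defaultdict
from statistics import mean

class LearningThermostat:
    """Toy model of a self-programming thermostat (illustrative only)."""

    def __init__(self, away_temp_f=62):
        self.observations = defaultdict(list)  # hour of day -> setpoints the user chose
        self.away_temp_f = away_temp_f         # energy-saving setpoint for an empty home

    def record_adjustment(self, hour, setpoint_f):
        """Log a manual temperature change during the 'observation' week."""
        self.observations[hour].append(setpoint_f)

    def target(self, hour, occupied=True):
        """Pick a setpoint: the learned average if occupied, the away temp if not."""
        if not occupied:
            return self.away_temp_f
        if self.observations:
            # Fall back to the nearest hour for which we have data.
            nearest = min(self.observations, key=lambda h: abs(h - hour))
            return mean(self.observations[nearest])
        return 68  # sensible default before any learning has happened

t = LearningThermostat()
for day in range(7):             # a week of the same daily habit
    t.record_adjustment(7, 70)   # warm the house at 7 a.m.
    t.record_adjustment(22, 64)  # cool it down at bedtime
print(t.target(7))               # learned morning setpoint
print(t.target(7, occupied=False))  # auto-away kicks in
```

After a simulated week, the schedule emerges from the logged adjustments alone, which is the essence of the “programs itself” claim.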
That sounds like a neat toy for tech geeks, but Nest’s Web site is full of testimonials from consumers who have saved enough energy to recoup the device’s cost in a few months. Independent tests by organizations like CNET have confirmed the savings, and as with all devices based on microprocessor technology, the Nest will inevitably drop in price, gain in capability or both as time goes on. Fadell said his aim was to produce a game changer for the many, not a clever device for the few. “First, we wanted to make a great thermostat that people actually cared about, that helped them use less energy,” Fadell said upon winning the World Economic Forum’s Tech Pioneer award for 2014. “Then we wanted it to change the world. That’s always been the plan. Nest was created to disrupt an industry, to revolutionize the way people used energy, to start something big.”
Some of the things connecting to the Internet are even more personal than your home’s HVAC system. Devices from Fitbit, Nike+ and Garmin allow users to monitor their activity levels, calorie consumption and sleep habits, and to store and share that data over the Internet. Wonder how your speed over a favorite jogging or cycling route compares with other fitness addicts? It’s a mouse-click away. Does drinking less wine with dinner improve your sleep? Ditto. And the purveyors of these digital nags are openly seeking to influence users’ behavior. As Fitbit puts it: “The Fitbit family motivates you to stay active, live better and reach your goals.”
Still more personal and potentially life-altering are Internet of Things innovations in health care. In August 2012, Proteus Digital Health received U.S. Food and Drug Administration approval for the company’s “ingestible sensor.” The one-square-millimeter device—the size of a grain of sand—is embedded in a pill. Ingest it at the same time that you take your medication and it will go to work inside you, recording the time you took your dose. It transmits that information through your skin to a stick-on patch, which in turn sends the data to a mobile phone or other devices and on to your doctor or nurse.
The idea is to improve patient compliance—drugs don’t work if you don’t take them—but more advanced sensors could one day monitor how drugs are metabolized and enter the bloodstream, providing valuable data on safety and efficacy. While the current Proteus chip is embedded in a placebo pill taken along with an active medication, the company hopes to get its technology placed inside commonly used drugs. Proteus has partnered with major drug companies, including Novartis and Otsuka, to further develop what it calls digital medicines.
Swallow a sensor along with your medication and you can monitor compliance; attach a sensor to a tree or plant and you can monitor wind velocity, fire danger or even carbon uptake. Treesensor.com uses sensors to help ensure that trees remain securely rooted and healthy even when buffeted by winds. M.I.T. researchers are developing a power-scavenging system for small wireless sensors that detect forest fires. Each sensor’s battery is trickle-charged with the electricity generated by the imbalance in pH between the tree and the soil. The Internet of Things is also helping biologists in Australia determine which types of grain grow best in a wide variety of conditions. From more than a million plots all over the country, a wireless sensor network sends data to the High Resolution Plant Phenomics Centre in Canberra, which runs the experiments.
Automobiles have long depended on sensor and microprocessor technology to reduce emissions and enable systems like anti-lock brakes. New cars increasingly feature Internet connectivity as well, letting onboard microprocessors find the best route in current traffic conditions, locate a free parking space and even drive the car itself. While Google’s autonomous automobiles still seem a Silicon Valley novelty, in September Mercedes-Benz unveiled a specially equipped version of its S-Class sedan that it said could enter production as soon as regulatory changes permit. “Autonomous driving is here today; we just can’t quite give it to you yet,” Dieter Zetsche, Mercedes-Benz’s chief executive, said at the Frankfurt Auto Show. He said the company’s goal is “fully automated driving for all.”
LET A TRILLION NODES BLOOM
Adding microprocessors to things allows them to do all sorts of cool stuff, but that alone would not explain the inexorable growth of pervasive computing. There is a stronger imperative, as Peter Lucas, Joe Ballay and Mickey McManus explain in their 2012 book, “Trillions: Thriving in the Emerging Information Ecology”: it saves money. They note that as early as 2002, the world was producing more transistors than grains of rice, and producing them more cheaply. And they estimate that semiconductor manufacturers now produce 10 billion microprocessors a year, more than the total number of people alive. Only a tiny percentage go into computers, tablets or smartphones; the rest go everywhere else.
Consider washing machines. Most washers that are at least 10 years old have the familiar knob and pointer that you pull and turn to set the cycle, controls that are intuitive and easy to use. But “behind them is a complex series of cams, clockwork and switch contacts whose purpose is to turn on and off all the different valves, lights, buzzers and motors throughout the machine,” Lucas, Ballay and McManus write. “It even has a motor of its own, needed to keep things moving forward. That knob is the most complex single part in the appliance.” Newer washers have touch buttons and a digital readout, which are typically harder to use. But behind them sit a microprocessor and software, which are much cheaper to produce. Money-saving is a powerful engine for change. “We have pursued this notion of the trillion-node network, where literally every device has some capability of computation; that’s going to happen one way or another, because it’s just cheaper that way,” says Lucas, who co-founded MAYA Design Inc. in 1989 to “remove disciplinary boundaries that cause technology to be poorly suited to the needs of humanity,” according to his biography. “Trillions of computers is a done deal. We can do it well or we can do it poorly, but it’s going to happen.”
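The economics the authors describe come down to this: the cam-and-switch timer is a physical program, and software reproduces it for pennies. A minimal sketch of that idea, with invented cycle names and outputs standing in for the real valves and motors, might look like this:

```python
# Toy illustration of the cost argument: the mechanical cam-and-switch timer
# becomes a few lines of firmware. Phase names and output flags are made up.
CYCLE = [
    ("fill",  {"water_valve": True,  "motor": False, "drain": False}),
    ("wash",  {"water_valve": False, "motor": True,  "drain": False}),
    ("drain", {"water_valve": False, "motor": False, "drain": True}),
    ("spin",  {"water_valve": False, "motor": True,  "drain": True}),
]

def run_cycle():
    """Step through the cycle, switching the same valves and motors
    the clockwork knob once controlled."""
    log = []
    for phase, outputs in CYCLE:
        log.append((phase, outputs))  # in real firmware: drive output pins here
        # a real controller would also pause for each phase's duration
    return log

phases = [p for p, _ in run_cycle()]
print(phases)  # ['fill', 'wash', 'drain', 'spin']
```

Changing the wash program now means editing a table rather than machining a new cam, which is why, as Lucas puts it, trillions of computers are “a done deal.”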
WHAT COMES NEXT: FROM THINGS TO EVERYTHING
In late 2012, Dave Evans, Cisco’s chief futurist and chief technology officer, wrote a paper titled “The Internet of Everything: How More Relevant and Valuable Connections Will Change the World.” Evans’ thesis is that computation and connectivity are about to become even more ubiquitous, turning information into actions that create new capabilities, richer experiences and unprecedented economic opportunity for businesses, individuals and countries.
“In terms of phases or eras, Cisco believes that many organizations are currently experiencing the Internet of Things, the networked connection of physical objects and one of the many technology transitions creating greater value for organizations that embrace the Internet of Everything,” Evans wrote. “As things add capabilities like context awareness, increased processing power and energy independence, and as more people and new types of information are connected, the Internet of Things becomes an Internet of Everything—a network of networks where billions or even trillions of connections create unprecedented opportunities as well as new risks.”
Evans foresees a world in which people become nodes on the Internet, data becomes information and things become aware, helping people and machines make better decisions. He sees the Internet of Everything already transforming cities, as with the smart-screen technology that provides real-time information about public transit and other services. In the future, he sees the proliferation of sensors and microprocessors helping to solve intractable problems, like climate change, world hunger and the scarcity of drinkable water. But he adds that these advances may not come easily.
“Of course, IoE will face many hurdles as it comes to fruition over the next 10 years,” Evans writes. “Some of these challenges will be familiar, including security, privacy and reliability, while other problems will require us to have open social and political discussions. To overcome these challenges, government organizations, standards bodies, businesses and even citizens will need to come together with a spirit of cooperation. When the history of the Internet of Everything is written, its success or failure will be determined by answering one question: How did the Internet of Everything benefit humanity? In the end, nothing else matters.”