Robert High is an IBM Fellow serving as the company’s CTO of edge computing. We tapped High’s decades of technical leadership – spanning core IBM initiatives like Watson – to gain insight into the future of edge computing and how it can improve our lives and help tackle some of the biggest global challenges.
The world is currently entering the fourth industrial revolution (or so-called ‘Industry 4.0’) thanks to converging technologies including 5G, AI, IoT, AR/VR, and, of course, cloud and edge computing. IBM has doubled down on its Industry 4.0 work in recent years.
“We continue to work on our use cases around Industry 4.0 – things like production quality, worker insights, production optimisation – and bringing AIs into these environments,” High explains.
One recent example of IBM’s Industry 4.0 work is with the Sugar Creek Brewing Company. The brewer integrated AI and IoT technologies from IBM and Bosch to prevent waste and maintain quality by monitoring parameters such as fill time, temperature, pH, gravity, pressure, and carbonation.
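At its simplest, that kind of monitoring amounts to checking each sensor reading against an acceptable range and flagging anything that drifts outside it. The sketch below illustrates the idea; it is not the actual IBM/Bosch system, and the parameter ranges are hypothetical placeholders.

```python
# Illustrative sketch (not the actual IBM/Bosch system): threshold-based
# quality checks on bottling-line sensor readings. All ranges are
# hypothetical placeholder values.

# Acceptable range for each monitored parameter (hypothetical values)
LIMITS = {
    "fill_time_s": (2.0, 4.0),
    "temperature_c": (1.0, 4.0),
    "ph": (4.0, 4.6),
    "pressure_bar": (2.5, 3.5),
}

def check_reading(reading: dict) -> list[str]:
    """Return the names of any parameters outside their acceptable range."""
    alerts = []
    for name, (low, high) in LIMITS.items():
        value = reading.get(name)
        if value is not None and not (low <= value <= high):
            alerts.append(name)
    return alerts

# A bottle that took too long to fill triggers an alert
sample = {"fill_time_s": 4.8, "temperature_c": 2.2, "ph": 4.3, "pressure_bar": 3.0}
print(check_reading(sample))  # → ['fill_time_s']
```

Running checks like this on-site, next to the line, is what lets an operator intervene before an entire batch is wasted.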
Two key benefits of edge computing are reduced latency and improved privacy through on-device compute. Apple’s recent WWDC keynote demonstrated both when the company boasted of the speed and privacy improvements coming to Siri as it shifts towards on-device processing rather than relying on the cloud.
“We’re seeing lots of examples of where people want the advantages of having compute local to where they’re doing their work,” says High. “Lowering their latency, protecting their personal private information, and managing the cost of all the data that they’re producing.”
At this year’s IBM Think conference, the company unveiled the largest open-source dataset for code with the aim of helping AIs to better understand code and one day even write it.
IBM is an indisputable pioneer in AI and just last week announced a five-year, £210 million ($297 million) partnership with the UK’s Science and Technology Facilities Council (STFC) to help solidify that leadership.
“When we think about AI, we like to think of it not so much as artificial intelligence but more as augmented intelligence; which is the idea that AIs will benefit humans in the way that we think and help us to think about the things we weren’t thinking about on our own,” explains High.
“To be able to do that, the machine has to have a really engaging experience with us. It has to activate our ability to be inspired.”
High goes on to explain how AIs must have more human-like qualities to achieve such an engaging experience. This could mean sometimes answering questions with another question, or even imitating cues we pick up on subconsciously during human-to-human conversations, such as pupil dilation, which signals that a person is being receptive and/or thinking.
“That cue gives the other speaker a sense that who they’re talking to is following what they’re saying,” says High. “The very subtle fine-grained interactions are what influence humans.”
To boldly go… where no internet is available
Edge AI is unlocking new possibilities for what devices can achieve; potentially even where no internet connection is available. This could be due to a temporary outage or a deployment in a remote location.
Short of taking a shuttle into space, there are few places more remote than the middle of the ocean. IBM will soon launch a fully autonomous ship called Project Mayflower (named after the ship which carried pilgrim settlers from Plymouth, England to Plymouth, Massachusetts in 1620) that will make the same trip as the original over 400 years later, albeit without any onboard crew.
Mayflower 2.0 will use edge processing to take information from the onboard sensors and quickly act on that data to navigate the perilous and rapidly changing conditions of the ocean while conducting vital scientific research.
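To give a flavour of what “acting on sensor data locally” means, here is a deliberately toy sketch of an edge-side decision step. The class labels come from High’s own list of obstacles; the avoidance rule, function names, and detection format are invented for illustration and bear no relation to the Mayflower’s actual navigation software.

```python
# Illustrative toy sketch only, NOT the Mayflower's actual software:
# a single local decision step that reacts to detected obstacles
# without any round-trip to the cloud.

# Obstacle classes drawn from the interview; detection format is hypothetical
OBSTACLE_CLASSES = {"ship", "boat", "debris", "land", "marine_life"}

def steer(detections: list[dict], heading_deg: float) -> float:
    """Adjust heading away from the nearest detected obstacle (toy rule)."""
    obstacles = [d for d in detections if d["label"] in OBSTACLE_CLASSES]
    if not obstacles:
        return heading_deg  # clear water: hold course
    nearest = min(obstacles, key=lambda d: d["range_m"])
    # Toy avoidance rule: turn 20 degrees away from the obstacle's bearing
    if nearest["bearing_deg"] >= 0:
        return (heading_deg - 20) % 360
    return (heading_deg + 20) % 360

# Debris spotted 120 m away, 15 degrees to starboard of a 90-degree heading
detections = [{"label": "debris", "range_m": 120.0, "bearing_deg": 15.0}]
print(steer(detections, heading_deg=90.0))  # → 70.0
```

The point of the sketch is the latency argument: with no crew and an unreliable satellite link, decisions like this have to complete on the vessel itself.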
“It will be entirely responsible for its own navigation decisions as it progresses so it has very sophisticated software on it—AIs that we use to recognise the various obstacles and objects in the water, whether that’s other ships, boats, debris, land obstacles, or even marine life,” says High.
The Weather Company is advising on the departure window for Mayflower’s voyage.
IBM acquired The Weather Company in 2016 and continues to use technological advancements to improve the potentially lifesaving service in a world with an increasingly unpredictable climate. IBM and The Weather Company are using Mesh Network Alerts to alert billions of people with vital weather information.
High explains how mesh networks can be used in factories deploying 5G, which relies on higher-frequency spectrum to achieve its high bandwidth and low latency.
“The problem there is the higher the frequency, the more it has difficulty penetrating through walls and boxes and obstacles and whatnot,” says High. “There’s a technology company that we’ve been working with called GENXCOMM who will take that signal and relay it in a mesh network throughout the factory.”
Edge computing’s roots go back to at least the 1990s with content delivery networks, but emerging Industry 4.0 technologies and increasing on-device compute capabilities have caused a resurgence of excitement around the topic.
“Almost all of the devices we’re accustomed to using are now more often being built with compute—what we call software-defined products,” explains High. “This light, this microphone, this camera… all of these things actually have compute in them now and more often that compute is using general-purpose commodity hardware like x86 chips and ARM chips and so they’re quite capable.”
“That ability to bring software to that device kind of unlocks its features, its functions. It enhances its value by being able to update the software running on it and that idea I think has a lot of power.”
Tackling global challenges
As I write this, world leaders have convened at the G7 Summit in Cornwall, UK where top of the agenda is tackling climate change. I asked High how edge computing can help to tackle one of the biggest threats facing the world today.
“To return the world back to a more sustainable condition is going to require a lot of information, it’s going to require that we have a lot of awareness of what’s happening on a very microscopic scale,” High explains.
“All that awareness is going to require a lot of compute. All of that data that we collect, if we had to send it all back to a central compute complex – like a cloud or a data centre – could get quite expensive.”
High goes on to explain how edge computing allows that data to be processed where it is created, cutting the noise of irrelevant information so that what’s useful can be acted on more quickly.
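The principle High describes – keep the routine data local, upload only what matters – can be sketched in a few lines. The threshold logic and numbers below are hypothetical, chosen purely to show how dramatically edge-side filtering can shrink what gets sent to the cloud.

```python
# Illustrative sketch of edge-side filtering: only readings that deviate
# meaningfully from a baseline are uploaded. Threshold and values are
# hypothetical.

def filter_at_edge(readings: list[float], baseline: float, tolerance: float) -> list[float]:
    """Keep only readings that deviate from the baseline by more than tolerance."""
    return [r for r in readings if abs(r - baseline) > tolerance]

# 997 routine temperature readings plus three unusual ones
readings = [20.0] * 997 + [27.5, 19.9, 31.2]
to_upload = filter_at_edge(readings, baseline=20.0, tolerance=5.0)
print(len(readings), "collected ->", len(to_upload), "uploaded")  # → 1000 collected -> 2 uploaded
```

A thousand readings collapse to two worth transmitting, which is the cost and bandwidth saving High is pointing at.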
Some of the worst wildfires on record have occurred in recent years, killing wildlife, wrecking habitats, destroying homes, and wiping out large parts of the forests that help to reduce carbon dioxide levels. One project, from Dryad Networks, uses edge computing to detect fires in remote locations while they are still in their smouldering stage, when they can be contained before becoming full-scale wildfires.
“You’ve got both hopefully detecting the fire before it began – so you can preempt it, maybe even predict the potential for that fire before it begins – but once you’re on station and trying to fight the fire, you need compute to do a better job of leveraging the resources and getting the most efficiency and effectiveness out of those resources.”
You can watch our full interview with High below: