Rob High, VP & CTO, IBM Watson, on edge computing innovation

What kind of trends have you seen developing so far this year in edge computing?

I think the most important trend is people moving from playing, testing and validating to now really moving forward into serious production engagements. 

We’re seeing this in retail. We’ve seen it in manufacturing, although that’s probably been going on a little longer already. But one of the key things there is that even the OT vendors are now starting to get on board and embed edge computing within their OT, what I call proprietary technologies, meaning technologies that aren’t built on open IT standards.

So we’re seeing a large shift there, some real momentum, and a lot of uptake and interest in the edge computing marketplace in using and leveraging the compute capacity in these remote, on-premise locations where business is actually conducted, to help companies automate and transform their business to be much more digital.

And what about the benefits of using edge computing? Have they been evolving at all?

I think there’s been a little bit more emphasis on protecting personal and private information. We’ve always understood that edge computing would bring latency benefits. But now there’s a lot more focus on protecting personal and private information, because a lot of what people are trying to do at the edge involves collecting video and acoustic data, both of which have the propensity to capture information beyond the thing you’re mostly interested in.

In other words, if you have a video camera on your factory floor that you’re using for quality inspection, chances are you’re also picking up images of employees walking by. You’re picking up imagery that represents your inventory and current rate of progress or process, all of which is pretty sensitive information. And so the idea that information could be leaving the premises is becoming a more urgent concern. 

The other one is operational cost. And I think most people have come to understand how much of that data, when transmitted to the cloud to be processed, incurs a tremendous overhead in terms of network bandwidth, especially when you’re dealing with video data. And then, of course, there are the storage and processing costs in the cloud, some of which is unnecessary, especially if you don’t know what to do with the data, which is sometimes the case. And, in any case, you get a lot of redundant data. If you think about video analytics, if I’m doing quality inspection, I’ve got a video feed that’s generating 23 megabits per second of video data, and I only needed that one frame that showed the part with the quality issue. That’s the only thing I really needed. All the rest of the video data I’ve collected between the frames that contain that part is superfluous. So that’s a lot of overhead. I think that’s the second thing people have come to understand.
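
To make that concrete, here is a minimal sketch of the edge-side frame filtering he describes, assuming a hypothetical camera feed, defect detector and cloud endpoint: inference runs locally and only the flagged frames ever leave the premises.

```python
# Minimal sketch of frame filtering at the edge: run inference locally and only
# forward the few frames that actually show a defect, instead of streaming the
# full video feed to the cloud. The model and upload endpoint are placeholders.
import cv2          # pip install opencv-python
import requests

UPLOAD_URL = "https://example.com/inspection/frames"  # hypothetical cloud endpoint

def looks_defective(frame) -> bool:
    """Placeholder for a real edge inference model (e.g. a trained classifier)."""
    # A real deployment would run a model here; this stub just keeps everything local.
    return False

def inspect(stream_url: str) -> None:
    cap = cv2.VideoCapture(stream_url)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break                      # stream ended or camera dropped
            if looks_defective(frame):
                # Only this frame leaves the premises; everything else is discarded.
                _, jpeg = cv2.imencode(".jpg", frame)
                requests.post(UPLOAD_URL, data=jpeg.tobytes(),
                              headers={"Content-Type": "image/jpeg"})
    finally:
        cap.release()

if __name__ == "__main__":
    inspect("rtsp://camera.local/line1")   # hypothetical camera feed
```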

I think resilience is another one. And we’re actually seeing this a lot in retail right now. Not just resilience at the edge, but resilience in the cloud as well, which goes to another important trend. As an industry we’ve been focusing on some of the unique issues that surface at the edge: making placement decisions about where to put the software, how to keep it up to date, and how to adjust for differences in location and other operating conditions. People have come to understand that a degenerate version of those problems applies even to their cloud environment.

And that’s especially true for hybrid multicloud, for organisations that have made a commitment to two or more cloud environments. They’ve got issues like: I’m using Azure today, but I have a special deal with GCP, or with AWS or some other cloud, where I get a discount if I use it in a certain time period. I want the flexibility, agility and resilience to be able to move that workload around at different points throughout the day, or through the month, based on where the best, most efficient use of my resources is at that point in time.

And that turns out to be very similar to the kind of placement processing we do when we’re talking about the edge. And since a lot of these environments are homogeneous, all using cloud native, container-based execution platforms, typically Kubernetes, sometimes plain Docker, or perhaps Podman, and everything is containerised, people now have the flexibility to make late-binding decisions about where they’re going to run a container.
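
A rough sketch of what such a late-binding placement decision could look like, with invented site names and prices standing in for what a real system would pull from monitoring and billing APIs:

```python
# Hedged sketch of late-binding placement: score the candidate locations (edge
# sites and clouds) at deploy time and run the container wherever is cheapest
# and healthy right now, subject to a latency constraint.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    cost_per_hour: float   # current cost for the workload's resource profile
    latency_ms: float      # network latency to the data source
    healthy: bool          # is the site currently reachable and within capacity?

def choose_site(sites: list[Site], max_latency_ms: float) -> Site:
    """Pick the cheapest healthy site that satisfies the latency requirement."""
    candidates = [s for s in sites if s.healthy and s.latency_ms <= max_latency_ms]
    if not candidates:
        raise RuntimeError("no placement satisfies the constraints")
    return min(candidates, key=lambda s: s.cost_per_hour)

if __name__ == "__main__":
    sites = [
        Site("store-42-edge", cost_per_hour=0.02, latency_ms=2, healthy=True),
        Site("azure-westeurope", cost_per_hour=0.09, latency_ms=35, healthy=True),
        Site("gcp-europe-west4", cost_per_hour=0.05, latency_ms=40, healthy=True),
    ]
    target = choose_site(sites, max_latency_ms=50)
    # The actual scheduling would then be delegated to Kubernetes/OpenShift
    # or an edge agent; this only picks the target.
    print(f"schedule container on {target.name}")
```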

When companies start using edge computing, what do you think they should bear in mind? What kind of challenges might they have to overcome?

First of all, when you think about edge computing, it’s actually useful to look back at your legacy systems.

And there’s virtually no manufacturer out there, no retailer out there, that doesn’t already have compute capacity in their manufacturing sites or in their stores. It may just take a different form. It may be a bunch of Windows servers, or a bunch of VM-based compute runtimes sitting out there. So they may not recognise that as being edge computing, because, by any other name, it’s simply the way they did things in the past.

But with the emergence of cloud native development practices becoming the norm, the only real distinction we’re making between that legacy of on-premise computing and edge computing is that we’re now adopting cloud native development practices for the edge. If you’ve been doing any kind of processing in your retail store, and almost every store has one or two or 10 servers, PCs or industrial PCs sitting on a desk at the back of the store, often in the store manager’s office, maybe that was running VM-based applications, monolithic applications or Windows-based applications.

The only difference between that and edge computing is adopting Linux and a container runtime, and using cloud native development practices for building and deploying those applications. That means you can take all the skills and tools you’ve been developing for building cloud native applications in the hyperscale cloud environments, or even a private data centre, and bring those same practices out to the edge.
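
As an illustration, once the store application is packaged as a container, starting it on the in-store server looks much the same as starting it in the cloud. The image name, port and environment below are hypothetical, and in practice the rollout would be driven by Kubernetes/OpenShift or an edge agent rather than a hand-run script.

```python
# Minimal sketch: run a containerised store app on the in-store box using the
# same container image and practices used in the cloud.
import docker   # pip install docker; talks to Docker (or Podman's compatible socket)

client = docker.from_env()     # connect to the local container runtime
client.containers.run(
    "registry.example.com/store/pos-app:1.4.2",   # hypothetical containerised POS app
    name="pos-app",
    detach=True,
    ports={"8080/tcp": 8080},                     # expose the app on the store LAN
    restart_policy={"Name": "always"},            # survive reboots of the store server
    environment={"STORE_ID": "store-42"},         # hypothetical per-store configuration
)
```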

So with that in mind, it now becomes a lot simpler to think about what advantages you’re getting from edge. Well, you’re getting the same advantages you always got. It’s just now you’re getting the additional advantage of being able to normalise your skills and tool chains in that process. 

I think the need to distribute workload across a business is a pretty fundamental one. If you think back over the last 60-70 years of IT history, we’ve seen this pendulum swing between centralised and distributed computing surface in different forms across different generations. We had client-server computing, then mobile computing and SOA. Now we’ve got edge computing.

So you have to ask, why does this distributed computing idea keep coming back? Well, the answer is quite simple. It’s through distributing in various forms that you grow your business. If you want to scale your business, you have to distribute your business. If you’re a retailer, you grow by opening new retail stores and distributing your retail capabilities out into the neighbourhoods and locations where your clients are.

If you’re trying to grow your manufacturing site, you’re going to distribute work across multiple different production lines or across different factories. If you’re going to grow your distribution business, you’re going to do that by creating additional warehouses and distribution centres. 

And so distribution is a natural part of that. It’s almost a law of nature, if you will, for growing your business. Well, the same thing is true of computing, partly because you need the compute localised to wherever you’re distributing your business, but also because the fundamentals of computing are governed by the same properties that govern how you grow your business. So I think edge computing, in its most fundamental form, is simply about scaling your compute to coincide with the scaling of your business, and getting optimisation by localising the compute to where your business is actually being performed.

Are there any companies or sectors that have been using edge computing particularly well?

Yeah, of course, keeping in mind that edge computing is really a revision of classic legacy distributed computing. Retail banks have been doing distributed computing in their branch locations for decades now. Some of them have reverted to laptops running web applications in their branches, but they still have ATMs, and an ATM itself is an edge device in some sense. Edge computing is simply a refinement of that basic idea, with a focus on cloud native and containerised practices. The places where that kind of distributed computing is now being upgraded to make use of cloud native development practices, in the form we now refer to as edge computing, are where we’re seeing the great success.

So again, back to manufacturing: for decades now, manufacturing sites have had both OT and some limited IT capabilities, and those are now being upgraded to make use of containers. We’re seeing this pervasively across the entire manufacturing industry.

In retail, again, very much like retail banking, packaged goods and products retailers are subject to the same kind of evolutionary transformation. They started out with a lot of compute capacity within their stores, and that is now being upgraded and normalised around containers and cloud native development practices.

So very effective in all three of those – retail, retail banking, manufacturing. A little bit of early conversion now in the distribution sector, in logistics and distribution infrastructure. We’re starting to see a little bit of uptake there. 

I think the automotive industry aspires to that kind of normalisation, but that’s a pretty major transformation for the industry to go through, both in terms of the tier-one ECU providers, the Continentals and Harmans of the world, and Qualcomm for that matter, all providing next-generation ECUs with a container-native approach.

But then there’s also a tension within the automobile industry between trying to leverage general-purpose IT compute technologies while still maintaining the safety and security compliance requirements that are really important to that industry.

So I think it’s going to take the automobile industry a little bit longer to fully embrace what we now refer to as edge computing. But it’s another example of where an industry has had a tremendous growth of compute capacity within their vehicles. It’s just been in a proprietary form that doesn’t benefit from the flexibility and agility that we can bring when we bring a cloud native approach to solving problems there.

What kind of impact is the combination of robotics, AI, edge computing and 5G having on Industry 4.0?

This is actually a really interesting area to think about, because there’s a natural synergy between edge computing and networking in general. Obviously, anywhere you have an edge computer, you’re doing work that probably needs to communicate and connect back to other parts of your business, and most often that’s your data centres or your hyperscale cloud environments.

That connectivity is always an interesting challenge in a manufacturing environment. Wired or cabled connectivity is a really expensive proposition: bringing cables out to all of your equipment means stringing them across your factory floor.

You could be talking about miles of cabling inside your factory to connect up all the equipment and devices that you’re making use of in your production environment. 

So there’s a very strong desire to move to wireless technologies. The problem with classic WiFi has been that the spectrum it uses has a high probability of interfering with operational equipment, as do 4G and LTE for that matter. But with 5G we’re now seeing mid-band and high-band spectrum, which has a much lower propensity to interfere with operational technologies. So it’s now viable to consider, and when you factor in things like private 5G, or public 5G with network slicing, the idea that you can use that as the wireless connectivity onto your factory floor and into the equipment is really compelling.

So there’s a very nice synergistic effect there: edge computing can benefit from 5G, especially private 5G. But also, for a lot of the telcos that have invested heavily in 5G infrastructure, edge computing represents an opportunity to recoup and get a good return on that investment. Because the value that edge computing brings to the business can drive benefit from 5G, it’s a much more compelling argument to make than, for example, 5G in the consumer space. Yes, as consumers we all enjoy the extra benefits that 5G brings, but are we really willing to pay more for a 5G subscription than we would have for LTE?

Is the benefit so much more significant that we’re willing to pay a lot more? Chances are it’s not, so there’s not a lot of additional revenue to be generated that way. Whereas in the enterprise space, edge computing brings a tremendous benefit to 5G, so there are really strong synergies.

How can network agility and flexibility lead to edge computing innovation?

This is really key too, because for any kind of connectivity into the edge, we might be able to take advantage of classic fixed-configuration networking to secure and protect the trustworthiness of that network interconnect to the edge.

But remember, the edge is a pretty dynamic world, right? When we’re doing edge computing on the factory floor, we’ve got workloads moving around all over the place, either because the production processes are changing, or because the software is being updated, or because the equipment is being replaced. Or even just basic things like the factory manager switching out a production run, or the robots being given different tasks to perform.

So it’s a very dynamic world, and our network agility has to exhibit the same degree of dynamism. It has to, in some sense, go with the workload. It has to migrate with the workload as we move it around the factory floor, or the distribution centre, or the retail store.

So we need a fairly strong linkage between the decisions about where to place workload at the edge, the connectivity that workload needs, and how that might change dynamically over time. The connectivity associated with an edge workload has to be as dynamic as the workload itself, and as dynamic as the placement of that workload across all the different edge nodes in the locations where you’re operating.
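
One way to picture that linkage, as a sketch only: the placement engine that moves a workload also derives the connectivity that workload needs and pushes it to the network controller, so the network configuration migrates with the workload. The controller endpoint and policy schema below are hypothetical.

```python
# Rough sketch: whenever a workload is (re)placed on an edge node, translate the
# placement into a connectivity policy and push it to a (hypothetical) controller,
# so the network follows the workload instead of being fixed up front.
from dataclasses import dataclass
import requests

SDN_CONTROLLER = "https://sdn.example.com/api/policies"   # hypothetical controller API

@dataclass
class Workload:
    name: str
    node: str                 # edge node the workload was just placed on
    talks_to: list[str]       # endpoints it needs to reach (cloud services, PLCs, ...)
    min_bandwidth_mbps: int

def network_policy_for(w: Workload) -> dict:
    """Translate a placement decision into the connectivity that has to follow it."""
    return {
        "workload": w.name,
        "attach_to_node": w.node,
        "allow_egress": w.talks_to,
        "qos": {"min_bandwidth_mbps": w.min_bandwidth_mbps},
    }

def on_placement(w: Workload) -> None:
    # Called each time the placement engine moves the workload.
    requests.post(SDN_CONTROLLER, json=network_policy_for(w), timeout=5)

if __name__ == "__main__":
    on_placement(Workload("weld-inspection", node="line-3-gateway",
                          talks_to=["plc-17.local", "https://cloud.example.com"],
                          min_bandwidth_mbps=50))
```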

What do you think the future holds for edge computing?

The simple answer to this is scale. It’s everything we’ve seen in the past about distributed computing. We’ve been through these pendulum swings between distributed and centralised computing, but every time we come back to the distributed computing paradigm, we have orders of magnitude more devices and endpoints to deal with than in the prior generation.

And that’s the same again here. As we get into the edge computing space, and especially as the marketplace and the industry mature in their use of edge computing, the thing we’re seeing over and over again is that what organisations want to do is first go to their hundreds or thousands of stores with a single server in each, then move from that, within the store, to the five or 10 or 20 point-of-sale terminals they have in the store.

So that’s another order of magnitude more edge locations, or edge endpoints, you’re dealing with. Then you move into what we call the portable associate model, where I don’t have a fixed cashier point-of-sale terminal but rather tablets that I’m handing out to my store associates, and you just check out with whichever associate you encounter on the store floor.

So I’ve gone from 10 to, let’s say, 100, and now I want to go to all the equipment arrayed across the store, whether that’s intelligent refrigerators, or IoT devices connected to edge IoT gateways that I’m using to sense the humidity of my vegetable counter and decide how much water to spray on it, all the way through to how I manage my HVAC systems, and all the cameras I might use for stock and inventory verification, spillage and spoilage, loss prevention, and client and associate protection and safety. So I’m literally going from hundreds or thousands of stores, to tens of thousands of point-of-sale terminals, to hundreds of thousands of other edge devices.
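
A back-of-the-envelope illustration of that growth for a single retailer, with per-store counts invented purely for illustration:

```python
# Order-of-magnitude illustration of the scale described above for one retailer.
stores = 2_000                       # hundreds to thousands of stores, one server each
pos_terminals_per_store = 15         # 5-20 point-of-sale terminals per store
other_devices_per_store = 100        # cameras, fridges, IoT gateways, HVAC, tablets...

servers = stores
pos_terminals = stores * pos_terminals_per_store
other_devices = stores * other_devices_per_store

print(f"store servers:       {servers:>9,}")        #     2,000
print(f"point-of-sale units: {pos_terminals:>9,}")  #    30,000
print(f"other edge devices:  {other_devices:>9,}")  #   200,000
```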

And, of course, that’s just for one retail organisation. Multiply that across all the different retail organisations out there, all the manufacturing organisations out there. And the thing that we can predict quite safely is the numbers are going to be enormous.

What plans does IBM have for the future?

First of all, we are pretty heavily invested in solving the administrative problems I mentioned earlier. We think it’s key that we’re able to help customers manage the deployment of their workloads out to the edge safely, securely and at scale, without requiring any additional IT resources. So that’s an area we’re already quite heavily invested in.

And we’ve been trying to exhibit this commitment and investment through open source. We actually do our work in the open through the Linux Foundation’s LF Edge, in the Open Horizon project. We’ve been combining that effort with other things at LF Edge, including EdgeX Foundry and what used to be called the Secure Device Onboard, or SDO, project. There are several other components within the LF Edge space that we think are quite relevant to this edge computing requirement.

Then we’re combining that with the strength of our cloud native development practices, the tool chains and the skills, including our partnership with Red Hat, which brings a lot of container-native infrastructure to the table: Red Hat Enterprise Linux for edge, which has the embedded Podman runtime; the future Red Hat Device Edge, which is really the productisation of MicroShift; as well as Red Hat OpenShift.

All three of those, we think, have a lot of relevance across these edge computing spaces, depending on the footprint differences that might apply. And we’re also quite invested in solution capabilities, starting with our sustainability efforts and applications, where we bring in things like Maximo for asset management.

And then we couple that with our weather service applications, for introducing weather data into the analytics you might want to perform. You might want to estimate something about your requirements: knowing what the weather is going to be over the next few days is quite useful in determining what you might need and when, and how you might want to adjust your inventory and stocking requirements. If you’re in manufacturing, knowing something about the weather might influence how you anticipate your supply chain flowing, upstream or downstream, and so on.
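
As a toy example of that kind of weather-informed adjustment, with forecast values and the uplift rule invented purely for illustration:

```python
# Hedged sketch: nudge next week's stocking level for a weather-sensitive product
# using the forecast. A real deployment would pull from a weather API and a
# proper demand model; the numbers here are made up.
baseline_units = 500                             # normal weekly order, e.g. bottled water
forecast_high_c = [24, 27, 31, 33, 30, 28, 25]   # hypothetical daily highs, deg C

hot_days = sum(1 for t in forecast_high_c if t >= 30)
uplift = 0.10 * hot_days                         # assume +10% demand per very hot day
order = round(baseline_units * (1 + uplift))

print(f"hot days forecast: {hot_days}")          # 3
print(f"suggested order:   {order} units")       # 650
```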

So there are things we’re bringing to the table that I think extend the capabilities of the analytics you’d typically want out there at the edge.

Want to learn more about edge computing from industry leaders? Check out Edge Computing Expo taking place in Amsterdam, California and London. 

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Author

  • Duncan MacRae

    Duncan is an award-winning editor with more than 20 years experience in journalism. Having launched his tech journalism career as editor of Arabian Computer News in Dubai, he has since edited an array of tech and digital marketing publications, including Computer Business Review, TechWeekEurope, Figaro Digital, Digit and Marketing Gazette.
