Robots are on the verge of transforming businesses around the globe. They’re increasingly capable of performing the physical tasks done by humans. For example, Spot from Boston Dynamics is being used with the Rocos platform to inspect energy substations - increasing productivity and efficiency while reducing risks for workers.

But what does this look like in practice? How can businesses make a success of automating the physical world? In our recent webinar we put these questions to expert speakers from electricity company Transpower and leading robotics companies: 

  • Andrew Askinas (Sales Manager Industrial Applications, Boston Dynamics)
  • Jourdan Templeton (Chief Technology Officer, Aware Group)
  • Andrew Renton (Senior Principal Engineer, Transpower)
  • David Inggs (CEO & Co-Founder, Rocos)

Here are a few highlights from the Q&A. You can also watch the full webinar on-demand here.

Q: This one’s for Andrew Renton. How long will it be before you’re comfortable with a robot taking autonomous actions in your environment?

Andrew Renton (Transpower): It’s an interesting question. At Transpower, we have 173 sites throughout the country and they are all currently unmanned. We control the power system in real time, 24/7, 365 days a year, and we operate all our equipment remotely. So we don’t need Spot to actually interact with our gear, because we already handle all of those functions remotely - from opening and closing circuit breakers to turning transformers on and off. For us, the real value of Spot lies in performing fault response, truck rolls, picking up data, inspection and condition management on-site. And all sorts of other hidden business cases, like spotting a man down if there’s a health and safety incident.

Q: Are people being replaced with robots - and what happens to those employees?

Andrew Renton (Transpower): Our manned inspections are scheduled 4 to 8 weeks apart, because we can’t afford to do them every day or week - it’s just not commercially viable. But with the robot we can do a daily inspection, or respond to faults much more quickly, and with better data. At the same time, we’re using more people for things like targeted interventions, because we get an earlier heads-up and can plan things better. So we have more people employed in the industry doing work on the actual network, and that means we’re making better decisions.

Q: How do you think robots like Spot (from Boston Dynamics) are impacting the internal culture of organizations?

Andrew Askinas (Boston Dynamics): There’s a management challenge as the role of employees will change. The workers who used to look at gauges or take thermographic charts, they might become data scientists interpreting that information or working with advanced software. It’s the responsibility of management to appropriately handle and communicate that change to employees. It’s definitely a challenge and I’ve watched organizations struggle with it. But if it’s accomplished then you have happy workers who stick around. 

Q: If you’re working with a total R&D or technology budget, how do you split up budgeting for physical automation and robotics?

Jourdan Templeton (Aware Group): It’s a tough one, because at the moment robotics falls into that innovation budget. That means you have to include data storage and collection, and connectivity - especially if the robot is going somewhere remote. And there’s the cost of tying the whole solution together with artificial intelligence (AI) as well. So managing that entire budget is pretty tough. The only answer I can give is that it comes from experience. At Aware Group, we have a fairly good idea of how to estimate what portions will be the physical automation or hardware budget, and a good idea of what the AI’s going to cost as well. 

Q: When you attempt to get data from a robot into analytics and the cloud, what are the challenges you’re dealing with on the network side? And how do you solve them?

Jourdan Templeton (Aware Group): That’s a really good question. Everyone talks about the cloud as a solution for crunching data. But if your robot is in a power substation and there’s a lot of interference, it may not be realistic to get live data at high throughput. And what happens if you lose connectivity? So we usually include some kind of edge computing device on the back of Spot. That’s there to continually manage the mission that Spot is on, but also make decisions if connectivity disappears. If we’ve added additional sensors or cameras, we can have Spot making decisions based on those inputs, irrespective of its internet connection. So I think the key to making robots successful in remote environments is that edge computing factor.

David Inggs (Rocos): At Rocos, a lot of our technology was built for the mining and marine industries where connectivity is sometimes impossible. But even our customers with the best 5G implementations still have robots that move around and end up hiding behind big metal objects, so their connectivity can drop out. These robots have to be built to cache data and make as many decisions as possible on the edge. So I think that’s critical to any robotics strategy - the disconnected system has to be able to function for durations of time. 
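The pattern Jourdan and David describe - cache on the edge when the link drops, replay in order when it returns - can be sketched in a few lines. This is a minimal, hypothetical illustration only; the class and method names are invented here and are not part of the Rocos or Boston Dynamics APIs.

```python
from collections import deque

class EdgeTelemetryBuffer:
    """Caches telemetry readings while the uplink is down and flushes
    them in arrival order once connectivity returns. Illustrative only."""

    def __init__(self, max_entries=10_000):
        # Bounded queue: if the robot stays offline for a long mission,
        # the oldest readings are dropped rather than exhausting memory.
        self.pending = deque(maxlen=max_entries)

    def record(self, reading, connected, send):
        """Send immediately when online; otherwise cache locally.
        `send` is whatever uplink callable the deployment provides."""
        if connected:
            self.flush(send)   # replay anything cached during the outage
            send(reading)
        else:
            self.pending.append(reading)

    def flush(self, send):
        # Drain cached readings in order before any new data goes out.
        while self.pending:
            send(self.pending.popleft())

# Usage: simulate a connectivity dropout mid-mission.
sent = []
buf = EdgeTelemetryBuffer()
buf.record({"t": 1, "temp": 41.2}, connected=True, send=sent.append)
buf.record({"t": 2, "temp": 43.9}, connected=False, send=sent.append)  # cached
buf.record({"t": 3, "temp": 44.1}, connected=False, send=sent.append)  # cached
buf.record({"t": 4, "temp": 42.0}, connected=True, send=sent.append)   # flush, then send
```

After the reconnect, the uplink receives readings t=1 through t=4 in order - the disconnected system kept functioning, which is exactly the property David calls critical.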

Q: Have robots become the target of hackers yet? How much effort does a platform like Rocos have to put into cybersecurity?

David Inggs (Rocos): At Rocos, we built our cybersecurity model with Microsoft at their head office. They have a huge amount of pedigree around cybersecurity, especially on their Xbox team where people have their hands on the piece of hardware and understand how the data transmission works. 

Andrew Renton (Transpower): It’s all about understanding your business - what’s mission critical and what’s about efficiency. For us, robots are about efficiency. Therefore we have a completely self-contained, remote system that doesn’t get anywhere close to the stuff we do in the power system. That’s just the way we like it!

Q: What’s your view on the shortage of trained people in robotics? How can companies upgrade these skill sets as they move towards higher automation?

Andrew Askinas (Boston Dynamics): The good news is that Spot is probably the easiest thing on the planet to operate! I recently went up to headquarters to work with Spot. And I’m not someone who plays video games, I’m not big on joystick controllers, but within minutes I was sending Spot everywhere he needed to go. So in terms of virtual control of the physical robot, I think those skills are already present … You don’t have to be a rocket scientist to engage Spot, get it moving and acquire data. I think that’s good news for organizations. 

Q: I think we all dream about the ability to integrate any sensor, from any manufacturer, at any time. Is that realistic? What are the characteristics of sensors that you need to be able to integrate them into a physical automation solution?

Jourdan Templeton (Aware Group): That’s a good question. I’m going to use Spot as an example again. You’ll notice on the back of Spot there are a couple of rails you can attach payloads to. And these payloads can be any kind of sensor that you want to package up. The key is how you actually collect the data from those sensors. That’s where we come back to the idea of edge computing. As a closed system, Spot has all these amazing features - it can walk around and go upstairs - but it’s the responsibility of the payload to interact with the sensors, perform analysis and feed information back to Spot. I think that’s where the Rocos platform works really well, because the Rocos agent can speak to Spot and control it, but it can also connect the sensors and relay that information back to where it needs to go. And then our role [Aware Group] is basically to stick an AI model in the pipeline to connect it all up.

David Inggs (Rocos): With the Boston Dynamics core payload, it’s very easy to remotely flash software onto that device, add additional drivers or plug in sensors. But one challenge is the volume of data that comes off these sensors. If you have a LiDAR system that’s getting millions of data points a second, do you try to ship all that data remotely, or do you have an edge that compresses it or takes a sample of some of the data? So I think the volume of data is more of a problem than actually connecting to the sensor. It’s a question of what should be shared locally versus remotely. You should start with the business problem you’re trying to solve and set expectations around the work that needs to be done by the robot. That’s key to selecting the right payloads and robot form factor, and what you need in terms of cloud and the edge going forward.
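The trade-off David raises - ship millions of LiDAR points per second, or sample them on the edge first - can be illustrated with a toy downsampler. This is a hedged sketch, not any vendor’s pipeline; real deployments would more likely use structure-preserving filters such as voxel-grid downsampling rather than uniform random sampling.

```python
import random

def downsample(points, keep_ratio=0.01, seed=42):
    """Keep roughly keep_ratio of a point cloud before uplink.
    Uniform random sampling: simple, but discards structure, which is
    why voxel-grid filtering is the more common choice in practice."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return [p for p in points if rng.random() < keep_ratio]

# A sweep of 1,000,000 (x, y, z) points reduced ~100x for transmission.
cloud = [(i * 0.001, 0.0, 0.0) for i in range(1_000_000)]
sample = downsample(cloud)
print(len(sample))  # roughly 10,000 points
```

Whether 1% is enough is exactly the question David points back to: the business problem - what the robot is there to detect - should decide how much data the edge keeps versus ships.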

To learn more about how robots are automating the physical world in sectors like energy and utilities, watch the full webinar on demand.




Join our newsletter to stay up to date with the latest in robotics.
