SAP supply chain guru: 'Old tech relevant to Industry 4.0'
Andy Hancock is Global Vice President, Centre of Excellence, SAP Digital Supply Chain. He has a degree in aeronautical engineering and is an old-school computing fanatic from the 1980s. Here, he explains how these early passions have provided the perfect foundation for his role at SAP today.
Tell us about your role at SAP
The SAP Digital Supply Chain Centre of Excellence is like a global SWAT team of industry experts. I lead a small unit that can travel the globe to support knowledge transfer and customer-facing activities. We become experts in the product and transfer that knowledge internally, both to our stakeholders and to sales forces around the globe.
We support field sales teams, helping them with product and feature knowledge in the early part of a product’s life cycle. We also do internal knowledge enablement. We look after a lot of global projects, so it’s very interesting work.
What led you to SAP?
I did an aeronautics degree, but computer science was always a passion. I started off with a Commodore 64 and got my first PC in 1982. Then, at the end of my degree, an opportunity arose to go into computer science. The strict methodology of engineering is similar to that of computer science, because in both there is always a right or a wrong answer. It was a good fit for me.
My first couple of jobs were as an analyst programmer. In some ways what I was doing back then is similar to my role with the Centre of Excellence, in that I was helping clients plot out customer journeys. I got to understand the solutions in their portfolios very well, and was able to help clients reach their end goals.
You’re an engineer. Does this help at SAP?
It does, yes. My engineering background has helped me at SAP because my specialty is in asset-intensive industries and all the technology around them. My stakeholders or customers are very often chief engineers or heads of asset management, and they also understand the machine side of things. They’ve come up through a similar route to mine. I can always have a good conversation about pain points with them.
Can Industry 4.0 learn from old-style computing?
Absolutely. Although computing has changed a lot since the 1980s, the core thing is still data flow. Where was the data created? Where is it stored? Who needs to use it? What's it for?
Being a programmer means you want to be very efficient in the way you move data from one place to another.
Think back to the days of dial-up modems, when everyone minimised the amount of data transmitted, because if you didn’t, the whole thing just hung.
With the huge data capabilities of 5G, I think people can get lazy and end up throwing tons of information around just because they can. The trouble is, when you scale this approach up to enterprise level you soon end up with 50 million data points flooding the network, making it inefficient. Then you end up chucking more technology at the problem, when what you really need to do is come back to the fundamentals.
What’s the secret to working with big data?
You should always be looking out for exceptions. Think of a temperature gauge on a piece of equipment that is feeding back data. As long as everything is running okay, the equipment will always be at roughly the same temperature, so you don’t need to keep feeding back data about it. The only data you want to capture is when something changes, say the thermostat fails. With data, less is usually more.
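To make that report-by-exception idea concrete, here is a minimal sketch in Python. The deadband threshold, polling interval, and the read_temperature/send_reading stubs are illustrative assumptions for this article, not anything from SAP's products:

```python
# Minimal sketch of report-by-exception: only forward a sensor reading
# when it drifts outside a deadband around the last value we reported.
# The 2.0-degree deadband and the stub functions are illustrative
# assumptions, not SAP specifics.
import random
import time

DEADBAND_C = 2.0  # ignore fluctuations smaller than this


def read_temperature() -> float:
    """Stand-in for a real equipment sensor."""
    return 70.0 + random.uniform(-0.5, 0.5)


def send_reading(value: float) -> None:
    """Stand-in for pushing a data point onto the enterprise network."""
    print(f"reported temperature: {value:.1f} C")


def monitor(poll_seconds: float = 1.0, cycles: int = 60) -> None:
    last_reported = None
    for _ in range(cycles):
        reading = read_temperature()
        # Report only the exceptions: the first reading, or a change
        # big enough to matter. Steady-state readings never leave the edge.
        if last_reported is None or abs(reading - last_reported) >= DEADBAND_C:
            send_reading(reading)
            last_reported = reading
        time.sleep(poll_seconds)


if __name__ == "__main__":
    monitor()
```

The design point is that the network only ever sees the exceptions: steady-state readings are discarded at the edge, so scaling up to millions of quiet sensors adds almost no traffic.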