By Mark Begbie, Business Development Director, CENSIS

How do you know you’ve made the best decision possible with the information you have? In the vast majority of cases, you can’t say for sure. Despite what many people might think, nothing is 100%: very few things in life are that certain, and nothing performs perfectly all the time.
It’s just as well, then, that you often don’t need everything to work at its absolute optimum. A plant can deliver 80% of its capacity, still function well and turn a solid profit. Likewise, an athlete can perform to 70% of their potential, as Mark Viduka famously quipped about playing football in Scotland, and still be among the best.
But the fact is, components, plant, and equipment, like human bodies, degrade and change over time, particularly when exposed to extreme conditions. In the case of sensors, used to inform people about a range of data, from temperature and humidity to pollution levels and leakages, this can have a big impact on decision-making. What starts out as 90% accuracy can slip to 70%, and the data you’re drawing on looks very different to how it did before.
That’s the fundamental problem the £4.2 million EPSRC-funded Science of Sensor Systems Software (S4) project is looking to solve. In early 2016, we announced the launch of the project: a collaborative endeavour involving the Universities of Glasgow, St Andrews and Liverpool, and Imperial College London.
The project involves the commercial expertise of some of the world’s largest engineering firms, like ABB, NXP, Rolls-Royce, and Thales, with test-case requirements from the British Geological Survey and Transport Scotland.
By bringing together the broad coalition of knowledge and skills resident in these organisations, spanning computing, engineering and mathematics, the project aims to improve the reliability and accuracy of the decisions we base on data sensed by possibly unreliable sensors in uncertain environments.
With sensors so widespread in modern life, the results could be far-reaching. The enhanced scientific and mathematical models the project produces should find application in a range of use-cases, from more robust water networks and improved accuracy in air-quality monitoring through to precision manufacturing. They could even inform the development of more reliable autonomous vehicles.
Essentially, the project aims to capture new laws in the science and mathematics behind sensor networks, the software that controls them, and how we interpret and act on the data they provide. The outcome will not only be more resilient, responsive, reliable and robust sensor networks, but more assurance about decision-making – a challenge for businesses across the board.
That could result in the more efficient operation of plant. Imagine, for example, an industrial control setting with many probes and monitors producing data about the performance of a boiler. Such a chemically harsh environment naturally puts sensors under a great deal of stress and, after years of service, they can become unreliable witnesses to conditions.
The aim is to build this degradation process into the models used to analyse the data: accounting for the uncertainty introduced by inaccuracy and ensuring the decision taken is the best that could be made with the data available. With that, the plant could operate for longer, more efficiently, and with less chance of being shut down for maintenance based on erroneous information. That inevitably means cost savings and improved efficiency.
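To make the idea concrete, here is a minimal sketch, not the project’s actual method, of one simple way to fold degradation into an estimate: fusing readings from several ageing sensors by inverse-variance weighting, where each sensor’s assumed noise grows with its years in service. The drift model, readings and figures below are invented purely for illustration.

```python
import math

def fused_estimate(readings, base_sigma, ages, drift_rate=0.1):
    """Inverse-variance weighted fusion of sensor readings.

    Each sensor's noise standard deviation is assumed to grow
    linearly with its age in service -- a hypothetical degradation
    model chosen only to illustrate the principle.
    """
    weights = []
    for sigma0, age in zip(base_sigma, ages):
        sigma = sigma0 * (1.0 + drift_rate * age)  # degraded noise level
        weights.append(1.0 / sigma ** 2)           # trust newer sensors more
    total = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, readings)) / total
    uncertainty = math.sqrt(1.0 / total)           # std dev of fused estimate
    return estimate, uncertainty

# Three boiler temperature probes: one new, two older and noisier.
readings = [201.0, 204.5, 198.2]   # degrees C (illustrative values)
base_sigma = [0.5, 0.5, 0.5]       # noise when new
ages = [0, 6, 10]                  # years in service

est, unc = fused_estimate(readings, base_sigma, ages)
```

Here the newest probe dominates the fused estimate, and the reported uncertainty honestly reflects how degraded the fleet is, which is exactly the information a shutdown-or-continue decision needs.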
This project could herald a breakthrough in the science, not only of sensors, and sensor system software, but of decision-making too. Ultimately, the aim is to enable people to say that they took the best possible action in a particular scenario. That’s a strong thing to be able to say, with absolute certainty.