The following article appeared in the December 2019 issue of Trafik og Veje
How do you make financial and technical infrastructure data accessible in clear and intuitive dashboards? Is it possible to simulate the consequences of different choices and make strategic decisions based on data from roads, railways and similar transport infrastructure? The search for answers to these questions led Sund & Bælt to develop 360TBM – a platform which transforms the Total Cost of Ownership (TCO) concept into an active tool. Preliminary forecasts show that we can reduce TCO for our infrastructure by up to 25 per cent by opting for the optimal operating scenarios. Substantial value is often concealed in the data sources to which most of us already have access.
Not a new invention
Total Cost of Ownership (TCO) can be defined quite simply as the sum of all costs incurred during the lifetime of an asset. In other words, TCO is the result of the chosen approach to operations and maintenance and of the quality requirements that define the service level of the asset in question. There are many approaches to defining the cost content of TCO: which types of cost should be included, how shared costs for energy, vehicles and administration should be apportioned, and whether a discount rate should be applied. As there is no industry standard, it is important to be in control of assumptions and relevant data so as to ensure that comparisons are carried out on a consistent basis.
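As an illustration, the sum-of-costs definition – with the discount rate as an optional choice – can be sketched in a few lines of Python. The cost figures below are invented for the example and do not represent any real asset:

```python
def total_cost_of_ownership(costs, discount_rate=0.0):
    """Sum all lifetime costs; optionally discount them to present value.

    costs: list of (year, amount) pairs covering e.g. acquisition,
           operations, maintenance, reinvestment and disposal.
    discount_rate: 0.0 reproduces the simple undiscounted sum.
    """
    return sum(amount / (1 + discount_rate) ** year for year, amount in costs)

# Example: an acquisition of 100 now plus annual maintenance of 10 for 3 years
costs = [(0, 100.0), (1, 10.0), (2, 10.0), (3, 10.0)]
undiscounted = total_cost_of_ownership(costs)          # 130.0
discounted = total_cost_of_ownership(costs, 0.04)      # ~127.75
```

Whether to discount at all, and at what rate, is exactly the kind of assumption that must be kept explicit if comparisons between assets are to be consistent.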
Focus on optimising TCO is nothing new. Some people call it accountability and others call it good business. Our view is that the great majority of dedicated and experienced professionals from operating organisations have a good idea of the lifetime and cost level of their infrastructure. This is also the case at Sund & Bælt, where we have come a long way with an experience-based approach to TCO. From 2005 to 2019, the expected reinvestment requirement was reduced by 48 per cent. But as ambitions and expectations for financial efficiency increase, the use of data in structured form has become a fundamental prerequisite for all ambitious operating organisations, including Sund & Bælt.
Lower TCO is the target
Two years ago, Sund & Bælt embarked on a targeted journey towards digital transition. We now use drones and Artificial Intelligence to inspect the Storebælt Bridge’s concrete structures for damage, and we utilise data from our many sensors. The transition means that we are evolving from a traditional operating company with many manual processes to a digital company capable of utilising both new and historical data for process optimisation.
In 2017, it was decided to focus actively on TCO for Sund & Bælt’s physical assets, with two aims: first, to utilise data to reduce TCO; secondly, to make it accessible and feasible for an operating organisation to work with TCO in practice. In other words, the aim was to gain insight into what affects TCO by systematically working with operating history, focusing on future costs and getting an overview of the significance of the selected quality levels of the assets concerned.
Since then, Sund & Bælt has developed 360TBM (Transport Budgeting Management), a platform for data simulation which integrates financial and technical data and makes adjustments for external factors and service requirements. To date, 360TBM contains data on two railway lines. This allows us to compare TCO across the two assets and, through key figures on cost types, error ratios and traffic impact, analyse and identify the components which have the lowest TCO under the given conditions. At Sund & Bælt, we have been able to make comparisons across an entire railway line, which is particularly interesting for decision-makers, and right down to component level, which is valuable for those with operational responsibility. With 360TBM as a tool, and through systematic work with TCO optimisation, Sund & Bælt expects to be able to reduce the annual cost level on a specific rail line by 25 per cent – corresponding to DKK 6 billion over a 100-year period.
The visual overview
The overall objective of 360TBM is to make data easily accessible to users. This is solved by using an analysis tool (Power BI), in which maps are integrated with data known from the geographical information system (GIS). Many professionals, including those with operational responsibility, will probably recognise that briefing notes or analyses are frequently requested by the organisation’s management. Queries can be about operational performance, operating economy, the condition of the asset or the outlook for future economic development. Solving the task often requires data from different systems, and it sometimes ends with cutting and pasting between various Excel sheets to create a basis – a predominantly manual process. Should the same need recur, one typically starts thinking in terms of system integration or data warehouse solutions. The idea of structuring data in fixed transactions between various systems is sound but can quickly turn out to be an expensive solution – particularly if the need for how the data is to be used is constantly changing. In our experience, a number of system components now come with the option to monitor condition and performance. This offers new opportunities to gain insight into the system, but the new data sources can be difficult to integrate.
360TBM builds on an approach where, as far as possible, we gather information from data in the source systems. The data is normalised, i.e. organised in tables, and the necessary relationships between the data are created in order to ensure that the various types of costs are assigned to specific system components. This linking process is necessary to bring the different types of data together, and we have found that it needs to be repeated a number of times. The result is that data can be presented in easily accessible and needs-specific dashboards. Another significant benefit is that the data is active: it is possible to perform simulations, and there are countless possibilities for filtering and sorting the data.
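A minimal sketch of this linking step might look as follows in Python. The component IDs, work orders and amounts below are invented for illustration and do not reflect Sund & Bælt’s actual data model:

```python
# Hypothetical, simplified records; in practice each table would be
# extracted from a source system (finance, maintenance, GIS).
components = {"SW-01": "switch", "SW-02": "switch", "TR-07": "track section"}
work_orders = {1: "SW-01", 2: "SW-01", 3: "SW-02", 4: "TR-07"}  # order -> component
costs = [(1, 12_000), (2, 4_500), (3, 8_000), (4, 30_000)]       # (order, DKK)

# Relate each cost to a specific component via the work order it was
# booked on, then aggregate per component.
cost_per_component = {}
for order_id, amount in costs:
    component_id = work_orders[order_id]
    cost_per_component[component_id] = cost_per_component.get(component_id, 0) + amount

# cost_per_component -> {"SW-01": 16500, "SW-02": 8000, "TR-07": 30000}
```

Once costs sit on specific components like this, dashboards can slice them by component type, cost type or asset – and the same relationships can be reused for simulation.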
Machine Learning brings 20-year-old data into play
If our current knowledge about the value of structured data had been available twenty years ago, we would have demanded structured data collection from the start. This does not mean, however, that past data cannot be used. In fact, new technology enables us to benefit from unstructured data. As with most maintenance systems, Sund & Bælt – which uses Maximo – has access to large amounts of relevant historical data about the operation and maintenance activities carried out over the years. To meet our objective of reducing overall TCO, it is imperative that we use our maintenance history. We can do this by learning from the solutions that work and deliver a low TCO, but also by being curious about the assets that have relatively more errors. In Sund & Bælt’s historical work orders, error descriptions and remedial measures are formulated as descriptive free text. Over a 10-year period, 17,200 work orders with error descriptions in free text were created for a specific type of system, such as a points system. This is a representative level, and huge amounts of data are involved when all types of systems and 20 years of operating history come into play.
By using Natural Language Processing (NLP), a language technology within Machine Learning, we have managed to define an algorithm that is capable of decoding free text using language patterns and grammar. NLP enables large amounts of unstructured operating history to be translated into categorised error descriptions, which can be used for optimisation. Our experience with NLP is that the time spent training the algorithm determines the precision of the categorisation. However, compared to the hours consumed by the alternative of manual categorisation, investing in the use of NLP has been an overall saving.
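As a drastically simplified illustration of the principle – not the algorithm Sund & Bælt trained – a bag-of-words categoriser for free-text error descriptions can be sketched in Python. The labelled examples and categories are invented; a production model would be trained on thousands of work orders:

```python
import re
from collections import Counter

# Hypothetical training examples: free-text error descriptions from
# work orders, manually labelled with an error category.
labelled = [
    ("switch blade does not lock in end position", "mechanical"),
    ("point machine motor draws no current", "electrical"),
    ("heating element in switch defective", "electrical"),
    ("blade worn, gauge out of tolerance", "mechanical"),
]

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

# Build a bag-of-words profile per category from the labelled examples.
profiles = {}
for text, label in labelled:
    profiles.setdefault(label, Counter()).update(tokens(text))

def categorise(text):
    """Assign the category whose word profile overlaps most with the text."""
    words = tokens(text)
    return max(profiles, key=lambda label: sum(profiles[label][w] for w in words))

categorise("motor in point machine defective")  # -> "electrical"
```

The point of the sketch is the workflow – label a sample, train, categorise the rest – and it is the size and quality of the labelled sample that governs precision, as we also experienced.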
Great perspective in the use of new data sources
In addition to the factors that an operating organisation can itself influence – the approach to maintenance, quality level, etc. – there are often other factors that affect an asset’s overall TCO. These may be soil and weather conditions, or cable owners conducting excavations in the ground. Typically, it is complex enough to estimate TCO on the basis of one’s own data. Data on these external factors is, however, relatively easy to access, and some of it is even publicly available and free. A consistent and deliberate approach to how factors from new data sources are brought into play will make TCO more nuanced and accurate. An example of the use of such factors can be seen on our rail lines, where our TCO tool takes account of, for example, soil conditions, the number of trains, the permitted train speed and the weight of the trains in the predicted costs.
Each of the factors has defined outcomes and consequences, which can be either economic or an adjustment of the expected lifetime. We know that costs increase when a rail line passes through an area with soft subsoil. By using geological map data, we can predict which parts of the asset are likely to have increased costs because of the soil conditions. The same principle applies to the other types of factors, e.g. the traffic impact on the track and the inclination of the terrain. 360TBM enables us to activate the various factors and directly see the consequence reflected in TCO. This type of data simulation creates a more nuanced TCO calculation and allows for factors that we previously had difficulty calculating because they were too complex.
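The principle of activating factors and seeing the consequence in the predicted costs can be sketched as follows. The factor names and multipliers are invented for illustration, not Sund & Bælt’s calibrated values:

```python
# Hypothetical factor model: each external factor contributes a cost
# multiplier (>1 raises the predicted cost) which can be switched on
# or off when simulating scenarios.
FACTOR_MULTIPLIERS = {
    "soft_subsoil": 1.30,     # section crosses soft terrain
    "high_traffic": 1.15,     # trains per day above a threshold
    "high_axle_load": 1.10,   # heavy freight traffic
}

def predicted_annual_cost(base_cost_dkk, active_factors):
    """Scale a section's base maintenance cost by the active factors."""
    cost = base_cost_dkk
    for factor in active_factors:
        cost *= FACTOR_MULTIPLIERS[factor]
    return cost

baseline = predicted_annual_cost(1_000_000, [])
scenario = predicted_annual_cost(1_000_000, ["soft_subsoil", "high_traffic"])
# scenario ≈ 1,495,000 DKK: toggling factors shows their consequence directly
```

In the real tool the consequence of a factor could equally well be a lifetime adjustment rather than a multiplier; the essential idea is that each factor is an explicit, switchable assumption.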
The factors used were identified through work sessions with leading specialists from Sund & Bælt’s operating organisation and selected on the basis of which parameters have the greatest impact on maintenance costs. Determining exactly how the parameters impact the TCO calculation is important but not crucial, since the impact can also be modelled as variables in the consequence simulation. In this way, decision-makers can be closely involved in determining how different parameters impact the overall TCO.
What have we learned?
Sund & Bælt sees many perspectives in 360TBM. First and foremost, 360TBM has made the work to reduce TCO tangible: within a large variety of data, it has become easy to search and analyse through a visual approach. The development of 360TBM has been a learning process. For example, when data quality appears hopeless, new technology like NLP has proved that there is no need to wait for good data quality; instead, we have utilised existing historical data, and by integrating new factors, we can link the past with the future. Sund & Bælt is constantly developing and expanding 360TBM with more data. Moreover, 360TBM is a solution that can easily be adopted by other infrastructure owners; indeed, Sund & Bælt is keen to share its experience and advice on how to get started.
- The detail of your TCO calculation depends on data quality/data structure. Accept that you have to compromise on your ambitions to do everything precisely from the start. Instead choose a level from where you can get some quick results and use them as a starting point for your ongoing work.
- If an overview seems impossible, then start with a defined part of your asset, e.g. the road surface on the prioritised roads. You will probably be able to identify the 20 per cent of your asset that makes up 80 per cent of your cost base.
- If you lack financial data on specific areas, then ‘borrow’ comparable data from colleagues in the industry to fill the gaps until you get your own data. But remember to keep track of the assumptions.
- Get involved with some of the people who have experience with data coordination, normalisation and simulation.
- If you wish to investigate whether your economy is affected by other factors, then begin, for example, by overlaying different datasets on maps. This will undoubtedly arouse your curiosity to start investigating the connections. Remember to make use of the large amount of free information such as publicly available emergency statistics, geological data and other maps containing data.