Anyone reading forecasts of worldwide data growth in the near future will probably feel like a gold prospector in the Rocky Mountains. According to analysts from International Data Corporation, the digital data mountain is expected to grow forty to fifty-fold between 2010 and 2020, to 40 zettabytes. A zettabyte is a one followed by 21 zeros (10^21 bytes). To put it another way, six terabytes of data will be stored for each of the world’s inhabitants in 2020. This corresponds to the amount of text contained in three million books per person!
Most experts are convinced that veritable gems can be found in this mass of data. In a global survey of people from all sectors of the economy conducted by the University of Oxford, almost two-thirds of the respondents said that the use of data and analytical processes gives their companies a competitive edge. Two years earlier, the corresponding figure was only 37 percent. But how can today’s “gold prospectors” find nuggets in the mountains of data?
A term frequently used here is “big data,” which refers to new technologies for recording, storing, and analyzing large amounts of data, as well as for displaying the results in a suitable form. For example, a topic that is much discussed is the use of data generated by people who search or buy on the Web or the utilization of data from the world’s financial and communication networks. However, the bits and bytes from industrial facilities, buildings, energy systems, and hospitals are at least just as valuable — and big data, as the term is understood today, is inadequate here. That’s why big data has to evolve into smart data.
We have to understand the data in order to create added value
We have to understand the mass of data in order to correctly evaluate it. We have to know how the various devices and facilities function and what sensors and measuring technology we need to obtain the really relevant data. The decisive criterion here isn’t necessarily the amount of data (big), but valuable content (smart). In a large gas turbine, hundreds of sensors measure temperatures, pressures, flows, and gas compositions every second. If you have in-depth knowledge of the facility’s physical properties and thus know how to correctly analyze the data, you can give power plant operators valuable tips on how they can make the facility more efficient and cut pollutant emissions without reducing the output of electricity. The same applies to the optimization of electricity production at wind farms or the minimization of energy consumption in buildings, steel plants, and entire cities.
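As a rough illustration of the idea, and not of any actual plant software, the sketch below derives a simple thermal-efficiency figure from two hypothetical sensor readings (electrical output and fuel flow) and flags moments when efficiency falls below an assumed baseline. All names, units, baseline values, and thresholds are invented for the example:

```python
# Illustrative sketch only, not Siemens code: flag efficiency degradation
# in a gas turbine from basic sensor readings. Baseline and tolerance are
# hypothetical values a plant engineer would set from domain knowledge.

def thermal_efficiency(power_mw, fuel_flow_kg_s, fuel_lhv_mj_kg=50.0):
    """Electrical output divided by fuel energy input (both in MW)."""
    fuel_power_mw = fuel_flow_kg_s * fuel_lhv_mj_kg  # kg/s * MJ/kg = MW
    return power_mw / fuel_power_mw

def degradation_alerts(readings, baseline=0.38, tolerance=0.02):
    """Return (timestamp, efficiency) pairs below baseline - tolerance."""
    alerts = []
    for ts, power_mw, fuel_kg_s in readings:
        eff = thermal_efficiency(power_mw, fuel_kg_s)
        if eff < baseline - tolerance:
            alerts.append((ts, round(eff, 3)))
    return alerts

readings = [
    ("08:00", 190.0, 10.0),  # efficiency 0.38  -> at baseline
    ("08:01", 188.0, 10.0),  # efficiency 0.376 -> within tolerance
    ("08:02", 175.0, 10.0),  # efficiency 0.35  -> degraded
]
print(degradation_alerts(readings))  # [('08:02', 0.35)]
```

The point is not the arithmetic, which is trivial, but the baseline: knowing what efficiency a healthy turbine should deliver at a given load is exactly the physical domain knowledge the text describes.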
In all of these areas, you have to not only collect the data but also understand it. Here, people who have both equipment know-how and user know-how — domain know-how — will definitely be in the strongest position. Such people will not only know how gas turbines, steel stamping facilities, or power grids work, but also be familiar with the operators’ processes and needs. If they can also develop the right algorithms for evaluating the data, they will be able to provide their customers with real added value in the form of energy savings, more environmentally friendly operations, reduced costs, accelerated processes, and more reliable equipment.
Prophetic capabilities are needed
In the future, smart data will not only enable us to find out what is happening in our facilities at any given moment, but also why it is happening. Moreover, it might even tell us what will occur in the near future and what we can do about it. The first steps in this direction have already been taken. For example, Siemens operates remote-maintenance centers on several continents. Around 250,000 facilities are connected to these centers, which process more than 10 terabytes of data each month. This amount is expected to increase tenfold by 2020. The centers analyze the data from almost all of the systems, from traffic lights, traffic computers, trains, and ship engines to thousands upon thousands of buildings, steel plants, paper factories, wind and gas turbines, X-ray machines, and computed tomography scanners.
Take wind power plants, for example. In these systems, sensors also measure mechanical vibrations, which are compared with a database containing the measurement values of more than 6,000 wind turbines. If there is an anomaly, the service team can take immediate action before the system breaks down. Such anticipatory maintenance would also be extremely valuable for trains or medical equipment — and for power plants as well, of course. For example, if the drive unit of a power station’s coolant pump stops working, the plant will stop generating electricity and cause losses of hundreds of thousands of dollars each day.
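In miniature, the vibration comparison might look like the hypothetical function below, which flags a turbine reading that deviates strongly from fleet-wide statistics (a stand-in for the database of more than 6,000 turbines). The values, units, and threshold are invented for illustration:

```python
# Illustrative sketch only, not the actual monitoring system: flag a wind
# turbine vibration reading that is a statistical outlier relative to the
# rest of the fleet, so service teams can act before a breakdown.
import statistics

def is_anomalous(reading_mm_s, fleet_readings, z_threshold=3.0):
    """True if the reading lies more than z_threshold standard
    deviations away from the fleet mean."""
    mean = statistics.mean(fleet_readings)
    stdev = statistics.stdev(fleet_readings)
    z = abs(reading_mm_s - mean) / stdev
    return z > z_threshold

# Hypothetical fleet baseline: vibration velocity in mm/s RMS.
fleet = [2.0, 2.1, 1.9, 2.2, 2.0, 1.8, 2.1, 2.0]

print(is_anomalous(2.1, fleet))  # False: within normal fleet scatter
print(is_anomalous(3.5, fleet))  # True: far outside the fleet pattern
```

A real system would compare full vibration spectra rather than a single number, but the principle is the same: the fleet database defines "normal," and deviations from it trigger maintenance before the failure occurs.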
Smart data is revolutionizing business models in many sectors of the economy
However, real-time remote diagnosis combined with anticipatory maintenance is only one example of how smart data will change companies’ business models in the future. There are many more possibilities. For example, technicians could use tablet computers to obtain the help of specialists whenever they have problems operating equipment. In addition, doctors evaluating images from computed tomography or magnetic resonance imaging scanners, for example, could use smart algorithms to access a database containing many similar cases that have been rendered anonymous. This would allow the doctors to take previously accumulated information into account in their diagnoses. Moreover, the data obtained from the management of thousands of buildings could be used to come up with recommendations for saving electricity and reducing heating costs. What’s more, measurement data from a train’s operation could provide train engineers with tips on how they can drive their trains more energy-efficiently. The resulting savings would be split between the user and the provider of the smart data. It would be a real win-win situation and a good example of how nuggets can be mined in the data mountains.