Does this sound familiar in your company? An important new phase in business intelligence is upon us: the Internet of Things (IoT), Artificial Intelligence (AI) and Big Data comprise a constellation of new technologies capable of producing, manifesting, and leveraging data in new and different ways. Each of these technologies has immense potential, and together they are transforming the business processes of entire industries. Companies must embrace and grapple with these technologies to understand their capability to change operational realities, from back office functions to a firm’s interaction with customers. This can prove daunting: rushing in can be complex and costly, with an uncertain return on investment. Disruptive start-ups often have the luxury of weathering such challenges, but going concerns must get it right the first time or risk undermining current operations. That’s where DataValueThinking comes in.
Corporate leaders constantly grapple with such innovations, particularly technical ones. However, the demands of AI are creating a new dynamic. As a basic raw material, these applications need a reliable supply of high-quality data. Happily, the sensors that form the IoT provide this data in spades, and storage and evaluation tools from the realm of Big Data make it permanently available. Data is thus the raw material that will determine the quality, capabilities, and nature of AI, and it also plays a decisive role in the success and sustainability of a company choosing to implement these innovations.
A typical exercise in the DataValueThinking workshops is to rank and classify new technologies according to their importance to the company from a business perspective. Initial solution ideas and possible use cases are also discussed, as is the question of which data areas and data sources are of strategic importance.
Achieving quick wins while ensuring the future.
In our workshops, we repeatedly encounter mixed experiences with these new technologies. Often the pilots do not achieve the expected benefits or are abandoned before they have really begun. The reasons vary, ranging from uncertainty and scepticism about individual topics such as data protection to technical challenges during implementation. Often, however, even after the first successful pilots, widespread use falters because of incompatibility between the various solutions. When using new technologies, it is important to strike a balance between pragmatic pilot projects that deliver short-term benefits and build acceptance, and the strategic sustainability of the solutions. What matters most for this strategic sustainability is often not the solution itself, but rather its ability to be integrated into more extensive solutions.
A big hurdle for the universal adoption of new technologies is the co-existence of old and new assets, legacy infrastructure, and plants. What good is the adoption of sensor technology in production if only the new machines can make use of it? What is the use of Smart Facility applications if they can only be used in new buildings?
Another topic that emerges from our workshops is the difficulty of balancing technology, pragmatism and data analysis. Many ideas are only pursued on the condition that the data is 100% accurate, which forces expensive and time-consuming measurement even though the same information could be derived from existing statistical data. One example is the exact measurement of workplace usage by cameras, or the observation of customer behaviour by correspondingly complex sensor technology. Beyond the data protection (GDPR) issues that come along with such solutions, large quantities of simple data can deliver the same information through statistical estimation; this is more cost-effective and often more efficient in terms of data protection, as the sketch below illustrates.
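As a minimal, hypothetical sketch (not a DataValueThinking tool, and all desk counts, sampling intervals and utilisation figures are assumed for illustration): instead of tracking individual employees with cameras, many cheap binary presence readings can be aggregated to estimate workplace utilisation with a usable margin of error.

```python
"""
Hypothetical illustration: estimate workplace utilisation from simple binary
desk sensors instead of exact, camera-based measurement. All numbers below
are assumptions chosen only to show the statistical idea.
"""
import random
import statistics

random.seed(42)

DESKS = 200                 # desks equipped with simple presence sensors
SAMPLES_PER_DAY = 48        # one binary reading per desk every 30 minutes
TRUE_UTILISATION = 0.62     # the "exact" figure a camera system would measure


def sample_day() -> float:
    """Return the share of 'occupied' readings across all desks for one day."""
    readings = [
        1 if random.random() < TRUE_UTILISATION else 0
        for _ in range(DESKS * SAMPLES_PER_DAY)
    ]
    return sum(readings) / len(readings)


# Aggregate many days of cheap, coarse readings.
daily_estimates = [sample_day() for _ in range(20)]
mean_estimate = statistics.mean(daily_estimates)
stdev_estimate = statistics.stdev(daily_estimates)

# The aggregate converges on the "exact" figure without person-level tracking,
# which also keeps the data far less sensitive from a GDPR perspective.
print(f"Estimated utilisation: {mean_estimate:.1%} ± {2 * stdev_estimate:.1%}")
```

The point of the sketch is the design choice: simple, anonymous readings in large quantity replace precise but privacy-critical measurement, and the statistical aggregate is accurate enough for most planning decisions.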
Of course, it is not enough to point out ideas and observations in workshops. It is also important to provide suggestions for possible solutions based on real-world experience. Here we also make use of our technology partners, who in turn use the results and presentations of DataValueThinking for their own positioning. A good example is our partner Microshare, a US technology firm that offers a wide range of Sensing-as-a-Service solutions to integrate existing infrastructures into the digital world and capture their information as a Digital Twin. The sensors are already integrated with the application. At the same time, Microshare has a future-proof data integration and sharing platform, and its solutions operate on a LoRaWAN system that lives separately from sensitive corporate networks, mitigating cyber risk. The data collected with Microshare solutions can therefore also be easily integrated into your own applications. In addition, the Microshare architecture makes it possible to trace where the data comes from and with whom or which applications it may be shared.
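To make the idea of traceable, shareable sensor data more concrete, here is a simplified sketch of a reading that carries its own provenance and sharing policy. This is not the Microshare API or data format; the field names, identifiers and applications are purely hypothetical.

```python
"""
Hypothetical data model (not the Microshare API): a sensor reading that
records where it comes from and which applications may consume it.
"""
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class SensorReading:
    device_id: str            # e.g. a LoRaWAN device identifier
    measurement: str          # what is being measured
    value: float
    recorded_at: datetime
    source_site: str          # provenance: the building or plant of origin
    shared_with: List[str] = field(default_factory=list)  # permitted applications

    def may_be_used_by(self, application: str) -> bool:
        """Check the sharing policy before handing the reading to an application."""
        return application in self.shared_with


# Hypothetical usage: a desk-occupancy reading shared with two internal apps only.
reading = SensorReading(
    device_id="lora-desk-0042",
    measurement="desk_occupancy",
    value=1.0,
    recorded_at=datetime.now(timezone.utc),
    source_site="HQ Building A",
    shared_with=["facility-dashboard", "space-planning"],
)

print(reading.may_be_used_by("facility-dashboard"))   # True
print(reading.may_be_used_by("marketing-analytics"))  # False
```

Keeping provenance and sharing rules attached to the data itself is what makes it possible to reuse the same readings across applications without losing control over who may see them.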
Microshare illustrates very clearly, with a few examples, the importance of using data from the company's different areas and data zones: https://www.microshare.io/microshare-how-it-works-smart-iot/. And since Microshare can be deployed in existing infrastructures at short notice, and if required without involving busy Information Technology (IT) departments, the technology is also used for Covid-19 contact tracing solutions in production areas, as the following video shows: https://www.microshare.io/2020/06/18/animation-how-universal-contact-tracing-works/
Microshare represents just one way to use data sensibly to increase the value of the company and streamline its operations. In addition to technical solutions and products, a prerequisite for this is anchoring a data culture in the company. Use the methodological framework and creative approaches of DataValueThinking to unlock the hidden value of the data treasure in your company. Learn more about DataValueThinking.