AIOTI – What could be the architecture that unlocks the true value of IoT-enabled Data Marketplaces?

Reading Time: 4 minutes

 

Data Marketplaces enabling the exchange of data sets and data streams are analogous to Digital Marketplaces such as eBay, an analogy made interesting by the fact that few predicted the meteoric rise of eBay and its contemporaries. Stories told about eBay’s launch in 1995 quote prominent venture capitalist David Cowan as saying: “Stamps? Coins? Comic books? You’ve GOT to be kidding. No-brainer – pass.”

It is now estimated that Digital Marketplaces will impact 40 per cent of worldwide retail by 2020.

IoT-enabled Data Marketplaces face scepticism similar to that seen at the launch of e-commerce. Some have questioned their commercial viability. And much like the early days of eBay, existing Data Marketplaces have focused initially on niche opportunities. However, we believe there is huge potential for Data Marketplaces to succeed as first-movers establish themselves or scale up their efforts by consolidating adjacent opportunities.

Today we have a range of reasons to believe that the market drivers for IoT-enabled Data Marketplaces are solid, especially with the Internet of Things (IoT) becoming pervasive.

We see three market drivers as especially important:

1. Selling IoT data dramatically improves the business case for organisations’ digital transformation based on IoT

The newfound success of IoT technologies and applications is the result of recent advances in information and communication technology (ICT), advances that are making IoT affordable for a large number of use cases where previously the cost was prohibitive. Selling data is expected to become an integral part of any IoT business case, and this should result in IoT being deployed on a larger scale thanks to quicker return on investment. Industry now understands how ‘IoT plumbing’ works and is increasing its focus on the monetization potential of IoT data.

2. External and fresh IoT data complement data generated internally

Data generated inside an organisation is usually not enough on its own to remain competitive, enhance customer experience, and improve strategic decision-making.

Looking at IoT-enabled mobility, for example, a car equipped with LIDAR (light detection and ranging), a gyroscope and an accelerometer can accurately detect bumps and potholes on the road. Those data sets could be extremely useful for municipal governments as well as companies in fields such as car insurance, navigation applications, and road maintenance.

This, however, relies on there being an incentive to share the data: in this example, the entities that would benefit from access to the data are not in a position to collect it themselves.

Efforts are also ongoing to predict the development of potholes even before their formation. Provided that a sufficient number of cars are equipped with adequate sensors to generate the necessary data, local authorities could improve road safety, decrease road maintenance costs, and limit the need to compensate road users for damage following insurance claims.

3. Artificial Intelligence (AI) and Machine Learning (ML) will yield greater value if algorithms are trained on large volumes of representative data

There is widespread industry agreement that AI and ML algorithms, if trained with the largest possible volume of high-quality data, can create new business opportunities and revenue streams. Acquiring high-quality IoT data is thus comparable to acquiring the raw materials essential to production in conventional industries.

Recognizing that a strong business case for IoT Data Marketplaces is emerging, as elaborated in reports such as Western Digital and Accenture’s ‘Dawn of the Data Marketplace’, the figure below illustrates a possible ‘High-Level Architecture for an IoT-enabled Data Marketplace’ under development in the Alliance for Internet of Things Innovation (AIOTI).

We aim to provide a snapshot of what this reference architecture could look like to invite feedback and encourage new contributions to this work.

High-level architecture and key underlying concepts

Possible high-level architecture of an IoT-enabled Data Marketplace

The above figure provides a possible high-level architecture for an IoT-enabled Data Marketplace.

Certain concepts are fundamental to the successful development of IoT Data Marketplaces adopting the proposed high-level architecture.

Consistent data quality: A guide developed by McKinsey on the creation of Data Marketplaces lists six key enablers. We see one of these enablers – “achieving consistent data quality” – as perhaps the most important. Auditable and adequate service-level agreements that can ensure that marketplaces deliver data of consistently high quality will become a defining feature of sustainable Data Marketplaces. Quality assurance will come at an additional cost but would deliver a more sustainable model.
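To make the idea of an auditable service-level agreement concrete, here is a minimal sketch in Python. The field names and thresholds (completeness fraction, staleness in seconds) are illustrative assumptions, not part of any AIOTI specification:

```python
from dataclasses import dataclass

@dataclass
class QualitySLA:
    """Minimum quality levels a marketplace promises for a data stream."""
    min_completeness: float  # fraction of expected records actually delivered
    max_staleness_s: int     # maximum age of the newest record, in seconds

def meets_sla(expected: int, received: int, newest_age_s: int, sla: QualitySLA) -> bool:
    """Return True if a delivery batch satisfies the agreed SLA."""
    completeness = received / expected if expected else 1.0
    return completeness >= sla.min_completeness and newest_age_s <= sla.max_staleness_s

sla = QualitySLA(min_completeness=0.95, max_staleness_s=300)
print(meets_sla(expected=1000, received=980, newest_age_s=120, sla=sla))  # True
print(meets_sla(expected=1000, received=900, newest_age_s=120, sla=sla))  # False
```

An audit then amounts to replaying delivery logs through checks like this one, which is what makes the quality promise verifiable rather than merely contractual.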

Metadata – descriptions of the data assets up for sale by different stakeholders as well as the methods to transact in these assets – provide data buyers and sellers with a common understanding of ‘what the data is about’ and the commercial terms governing the subsequent transactions. Reaching this common understanding would only be possible with a standard or agreed ontology. The ITU Focus Group on data processing and management and the Open Geospatial Consortium could be two initiatives well placed to address this standards gap.
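As a sketch of what such a listing might contain, the record below describes the pothole-detection stream from the mobility example. The field names are our own illustrative assumptions; an agreed ontology would standardise them:

```python
# A minimal metadata record for a data asset offered on a marketplace.
# Field names are illustrative, not drawn from any standard ontology.
pothole_stream = {
    "asset_id": "urn:example:streams:pothole-detections",
    "title": "Road surface anomaly detections",
    "description": "Bump and pothole events derived from vehicle sensor data",
    "format": "JSON stream",
    "licence": "commercial, per-subscription",
    "price_eur_per_month": 250,
    "update_frequency": "real-time",
}

REQUIRED_FIELDS = {"asset_id", "title", "description", "format", "licence"}

def is_well_formed(record: dict) -> bool:
    """Check that a listing carries the fields buyers need to discover it."""
    return REQUIRED_FIELDS <= record.keys()

print(is_well_formed(pothole_stream))  # True
```

The point of the required-fields check is exactly the ‘common understanding’ argued for above: a buyer should be able to evaluate any listing, from any seller, against the same schema.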

Mirroring metadata – exposing a marketplace’s metadata in a third-party data lake – is the mechanism that allows for cross-domain data discoverability.

Cross-domain data discoverability facilitates the distributed, collaborative development of data-driven solutions in line with the principles put forward by the EU Digital Single Market, for example.
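The mirroring and discovery steps can be sketched as follows. The two marketplaces, their listings, and the search function are all hypothetical; the key property the sketch illustrates is that only metadata moves into the shared catalogue, never the raw data:

```python
# Hypothetical sketch: two domain marketplaces mirror their listing metadata
# into a shared catalogue (the "third-party data lake"), where a buyer can
# discover assets across domains with a single search.
mobility_marketplace = [
    {"asset_id": "mob-001", "title": "Pothole detections", "domain": "mobility"},
]
energy_marketplace = [
    {"asset_id": "nrg-042", "title": "Smart-meter load curves", "domain": "energy"},
]

shared_catalogue = []
for marketplace in (mobility_marketplace, energy_marketplace):
    shared_catalogue.extend(marketplace)  # the mirroring step

def discover(keyword: str) -> list[str]:
    """Cross-domain search over mirrored metadata only; no raw data moves."""
    return [m["asset_id"] for m in shared_catalogue
            if keyword.lower() in m["title"].lower()]

print(discover("pothole"))  # ['mob-001']
```

Because each marketplace keeps its data behind its own gates and only publishes descriptions, discoverability scales across domains without centralising the data itself.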

Blockchain and distributed ledger technologies provide means to build trust into every transaction without the need for central authorities. They are capable of enabling micropayments without transaction fees. They are also valuable in providing proof-of-origin for data sets as well as proof-of-integrity for data lakes.
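Proof-of-origin and proof-of-integrity both reduce to anchoring a content hash of the data set on a ledger. A minimal sketch, using only a SHA-256 fingerprint (a real deployment would additionally sign the record and let the ledger timestamp it):

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """Content hash used as proof-of-integrity for a data set."""
    return hashlib.sha256(data).hexdigest()

def origin_record(seller_id: str, data: bytes) -> dict:
    """A record a seller could anchor on a ledger as proof-of-origin.
    Illustrative only: field names and the seller identifier are assumptions."""
    return {
        "seller": seller_id,
        "sha256": fingerprint(data),
        "registered_at": int(time.time()),
    }

data = b'{"lat": 48.85, "lon": 2.35, "event": "pothole"}'
record = origin_record("seller-123", data)

# A buyer re-hashes the delivered data and compares it to the ledger entry:
print(record["sha256"] == fingerprint(data))  # True
```

If the delivered bytes differ from what the seller registered, the hashes no longer match, so tampering or substitution is detectable without trusting any central authority.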

Decentralized, yet federated: The proposed reference architecture describes a data economy without the need for a central entity or centralized powers, which could offer a foundation for a fair distribution of revenue streams. The federation is achieved through the mirroring process.

Governance presents some of the most complex problems in this space. It is difficult to define sustainable governance models for new technology solutions when new models appear continuously and the oldest model is only a few years old. The governance challenge is two-fold: