Dec 30, 2019

Privacy payoffs in smart cities

A few weeks ago, I spoke at the Connected Cities Privacy Summit (CCPS) in Washington DC. This was a first-of-a-kind event focusing on data privacy issues; other smart city events tend to feature pilot projects and technology demonstrators. CCPS drew speakers from Google’s Sidewalk Labs and public-sector officials from Canada. US presenters came from a range of academic, consultancy, legal and technology organizations [1].

Many of the CCPS presentations took a cautious approach to privacy protections. To some extent, this reflected the nature of the audience: roughly half of the attendees hailed from the legal, compliance and policy professions. I took a somewhat different approach. My presentation covered the opportunities arising from data sharing, drawing on lessons learned from one of my consulting projects over the past few years [2].

This project began life as a concept for sharing IoT and other data assets in smart city and intelligent transport applications. From the outset, its broad approach aimed to support many IoT applications. It also sought to enable interoperability across hardware suppliers and between application developers. This is often referred to as a horizontal approach. The approach differs from many of the single-purpose or proprietary platform strategies in the market. The horizontal approach implies data sharing across organizational silos. As a result, data- and privacy-management are early design issues rather than a systems integration afterthought.

Two Models for Managing Privacy 

The experience from early deployments reveals two models for managing privacy. One involves a private version of a data hub. Here, several organizations agree to share data within a neutral environment [3]. Businesses are notoriously wary of exposing data from their assets and operations. They fear the loss control over their data. They also don't want to miss out on commercial opportunities by allowing third parties to extract value from their data. Many organizations realize that they can improve their operations by using data from external sources. Some are willing to enter into mutually beneficial data sharing arrangements. An example might be a connected and autonomous vehicle (CAV) manufacturer. They can evaluate their road tests more effectively with access to data about the road network, data from roadside infrastructure and even local weather conditions.

A first requirement for the neutral data hub is to adhere to privacy regulations, notably GDPR. Another is to provide users with privacy policy controls, allowing data providers to control downstream access by partners. Usage monitoring is a further requirement, as it provides visibility into consumption patterns and, by extension, into market demand. Participants in large and open data ecosystems can gauge interest in different data sets and innovate accordingly.
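To make the idea of privacy policy controls concrete, here is a minimal sketch of how a hub might check a downstream request against a provider-defined policy. All of the names (Policy, can_access, the partner and purpose labels) are illustrative assumptions, not part of any real hub's API.

```python
# Hypothetical sketch: a data provider attaches an access policy to each
# data set, and the hub checks every downstream request against it.
from dataclasses import dataclass

@dataclass
class Policy:
    allowed_partners: set  # organizations granted access
    allowed_purposes: set  # e.g. {"road-test-evaluation", "research"}

@dataclass
class DataSet:
    name: str
    policy: Policy

def can_access(dataset: DataSet, partner: str, purpose: str) -> bool:
    """Grant access only if the partner and stated purpose match the policy."""
    p = dataset.policy
    return partner in p.allowed_partners and purpose in p.allowed_purposes

roadside = DataSet(
    "roadside-sensors",
    Policy(allowed_partners={"cav-oem"},
           allowed_purposes={"road-test-evaluation"}),
)

assert can_access(roadside, "cav-oem", "road-test-evaluation")
assert not can_access(roadside, "ad-broker", "profiling")
```

In practice the same check would also drive the usage-monitoring logs, since every evaluated request is a data point about demand.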

The second privacy model is the public data hub. This is where providers focus on data that contains no personally identifiable information (PII). In effect, participants focus on applications that do not involve personal data. This simplifies the requirement for privacy guards. However, the hub provider still has an obligation to check that data suppliers are not contributing PII.
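The hub provider's obligation to screen for PII can be sketched as a simple gate on incoming records. The patterns below (email addresses and phone-number-like digit runs) are illustrative assumptions only; a real deployment would use far more thorough detection.

```python
# Hypothetical sketch: a public hub screens incoming records for obvious
# personally identifiable information (PII) before accepting them.
import re

PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),      # email addresses
    re.compile(r"\b(?:\+?\d[\s-]?){9,14}\d\b"),  # phone-number-like digit runs
]

def contains_pii(record: str) -> bool:
    """Reject records that match any simple PII pattern."""
    return any(p.search(record) for p in PII_PATTERNS)

assert contains_pii("contact: jane.doe@example.com")
assert not contains_pii("junction J12 flow: 420 vehicles/hour")
```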

Data Monetization Factors 

"How much are businesses charging for their data?" is the most common question I come across on the topic of data monetization. The answer, to borrow a saying from the art world, is that "beauty is in the eye of the beholder". In other words, the value of data and its pricing depends on the buyer's perceptions and not on the cost of supply. That makes life complicated because we are at the very early stages of a market for data (leaving aside the market for personal data).

Some organizations are approaching the opportunity by targeting applications where they have a good understanding of their customers' needs. This is the basis for value pricing, but it is difficult to scale. Other organizations are taking a data catalog or marketplace approach, publishing a wide range of data. While this is scalable, the business model depends on driving up the number of transactions between many organizations who are reacting to data offers and sending purchasing and pricing signals.

Context is another driver of value. Knowing where data comes from, and its provenance along a data supply chain, has a bearing on the quality of solutions offered by downstream users and service providers. Dynamic richness also plays a part in the value of data. Think about two cameras, one monitoring a minor country road and the other monitoring a busy, city-center intersection. The latter corresponds to a much higher level of activity and relevance to the traveling public that use that junction. The quality of data supply is another factor. Intermittent feeds or data streams with erroneous values are unreliable. They are less valuable to downstream users and service innovators.

Data monetization requires three enablers to be in place. One is the technical infrastructure, such as a data exchange, to source data and make it accessible to many potential users. A second enabler is a licensing framework. This specifies the conditions under which data is supplied and whether there are any constraints on its downstream use. This framework defines a baseline set of rules that data producers and data consumers agree upon as a condition for participating in the data exchange. The third enabler is a set of commercial tools. These track requests for data (e.g. via data catalog searches). They also track usage patterns and administer payments between data suppliers and consumers.
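The third enabler, commercial tooling, amounts to a ledger of who requested what and what they owe. Here is a minimal sketch under assumed per-request tariffs; the dataset names and prices are invented for illustration.

```python
# Hypothetical sketch: log each data request and total what a consumer
# owes across all the data sets they accessed.
from collections import defaultdict

PRICE_PER_REQUEST = {"traffic-counts": 0.05, "weather-feed": 0.02}  # assumed tariffs

usage = defaultdict(int)  # (consumer, dataset) -> request count

def record_request(consumer: str, dataset: str) -> None:
    usage[(consumer, dataset)] += 1

def invoice(consumer: str) -> float:
    """Sum charges across all data sets the consumer accessed."""
    return sum(count * PRICE_PER_REQUEST[ds]
               for (c, ds), count in usage.items()
               if c == consumer)

record_request("city-ops", "traffic-counts")
record_request("city-ops", "traffic-counts")
record_request("city-ops", "weather-feed")

assert abs(invoice("city-ops") - 0.12) < 1e-9
```

The same request log doubles as the demand signal described earlier: catalog searches and purchases tell suppliers which data sets the market values.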

Open ecosystems and open standards 

A scalable approach to data management involves many participants supplying and consuming data. This is where an open ecosystem approach is a strategic design choice. With low barriers to entry, it encourages market entry and fosters innovation. Consider the example of data sharing in a well-established supply chain. There are greater prospects for identifying operational efficiencies and new uses of data by involving data specialists and even experts from outside the industry value chain.

Most businesses, however, are wary of sharing their data or seeing other organizations profit from its use. In some cases, businesses believe that they have a unique data set and monopolize that data. This limits the scope for innovation. Nevertheless, external data sharing is an approach that has yielded gold [4], literally. Organizations are starting to experiment with data sharing accompanied by appropriate measures to control access [5].

Standardization is a critical element in encouraging wide participation. Importantly, the standard needs to be horizontally oriented so that it works across application, organizational and industry silos. Scalable IoT presents many new challenges, and new standards should address these requirements while building on established, vertical-specific standards. In the case of the project, that standard is oneM2M [6].
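To give a flavor of what a horizontal standard looks like in practice, the sketch below assembles a request in the style of oneM2M's HTTP binding, where data is posted as a contentInstance resource into a container. The code only builds the request (no network call is made), and the CSE address, originator ID and resource names are assumptions for illustration.

```python
# Hedged sketch of a oneM2M-style request over its HTTP binding.
import json

def build_content_instance(cse_url: str, container: str,
                           originator: str, value: dict) -> dict:
    """Assemble an HTTP request that creates a oneM2M <contentInstance>."""
    return {
        "method": "POST",
        "url": f"{cse_url}/{container}",
        "headers": {
            "X-M2M-Origin": originator,              # who makes the request
            "X-M2M-RI": "req-0001",                  # request identifier
            "Content-Type": "application/json;ty=4"  # ty=4 = contentInstance
        },
        "body": json.dumps({"m2m:cin": {"con": json.dumps(value)}}),
    }

req = build_content_instance(
    "http://cse.example.org/cse-in", "traffic/junction-12",
    "C-city-app", {"flow": 420, "unit": "veh/h"},
)

assert req["headers"]["Content-Type"].endswith("ty=4")
```

Because the resource model is application-neutral, the same container/contentInstance pattern serves traffic sensors, parking systems or weather feeds alike, which is precisely the horizontal property discussed above.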

Success Factors

Data sharing depends on an underlying technical platform. This handles the tasks of collecting and distributing data between different endpoints. These endpoints might be parts of siloed solutions, or they might provide only a partial picture of a larger system. Users are becoming aware of the risks of locking into such arrangements. They also recognize that single-vendor solutions will limit their ability to make the best use of their data in the future. Take the example of a city that outsources the supervision of its road network, or its car parking facilities, to specialist service providers. It will incur contract-variance fees if it wishes to change performance reporting terms (e.g. from hourly to per-minute reporting), and even that presumes the city specified a non-proprietary, machine-readable format for data supply in the first place. Requirements for open-standard solutions and rules over data control are important data management success factors.

While users might begin with one or two priority use cases, they should plan on the scope of applications expanding. This approach ensures that solutions follow a horizontal architecture strategy and prepare for a high degree of re-use. A horizontal architecture also supports interoperability, both between IoT devices from many different suppliers and across many different data sources. These principles offer smart cities (and regions) a roadmap for breaking down the walls between silo solutions and sharing data across operational and organizational boundaries.

[1] Connected Cities Privacy Summit 2019

[2] Intelligent Transport Solutions for Smart Cities and Regions: Lessons Learned from an 18-month Trial, Industrial Internet Consortium, Journal of Innovation

[3] Fly by data: How service models drive data collaboration in aerospace

[4] Open Innovation: GoldCorp Challenge

[5] There are new markets for industrial IoT data

[6] oneM2M

Image Credit - John Barkiple
