Oct 13, 2021

Inevitable Innovation

In the mobile industry, some developments are just inevitable. It’s only a question of time before the right mix of technologies, demand, and agents of change turns inevitable innovations into market reality. I was reminded of this while attending a recent webinar [1] organized by the Transport Data Initiative Forum. It featured stories about pilot projects in several UK regions applying ‘IoT’ data to a variety of what might be termed smart city applications.

Several of these pilots involved applications built with mobile network operator data provided by the UK’s O2. Each participating municipality focused on a local need (e.g., road maintenance or environmental monitoring) and built applications using data from a range of its municipal sensors and data sources. In several cases, municipalities combined their local data assets with mobile network operator data.

I recall discussing this inevitability with a mobile operator in the UK, around 2017. The challenge in those days was to persuade different parties about the value of sharing data across organizational and commercial boundaries. A related challenge was the need to explore collaborative business models based on the equitable sharing of rewards. 

Cross silo uses of data 

Over the past few years, many smart city initiatives have failed to progress beyond the pilot stage. This is not surprising given that they set out to address a single priority issue within one department or operating unit. Unfortunately, the scale of such individual undertakings was insufficient to justify the ongoing support costs. When implementers take a more general approach, many more use-case ideas emerge. There is also greater scope for innovation as implementers explore cross-silo opportunities and ways of reusing common capabilities. This was the case with the ADEPT Live Labs initiative [2].

Among the use cases discussed was an application that uses machine learning to analyze historical traffic flow, combining it with current or forecast contextual information to provide more accurate predictions. This approach is relevant when forecasting pre- and post-pandemic travel patterns. Another application used cameras on waste collection trucks to identify potholes and prioritized repairs based on road-usage intensity as deduced from mobile network operator data. In another pilot, a combination of environmental sensing and traffic-pattern data, again drawing on mobile network sources, served to map air quality in different zones for short-term reporting and long-term planning. One final example to catch the eye combined public health data and mobile network data to help social services teams deal with loneliness, frailty, and obesity among the general population.
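The pothole example boils down to a simple idea: weight each detected defect by how heavily the road is used, then fix the highest-scoring ones first. A minimal sketch of that scoring, with made-up segment names and traffic figures standing in for the camera and mobile network data:

```python
from dataclasses import dataclass

@dataclass
class Pothole:
    location: str      # road segment identifier
    severity: float    # 0.0 (cosmetic) .. 1.0 (dangerous), from camera analysis

# Hypothetical road-usage intensities (trips per day per segment); in the
# pilot these would be derived from mobile network operator data.
usage_intensity = {"A40-west": 18000, "B481-north": 2400, "HighSt": 6500}

def repair_priority(p: Pothole) -> float:
    """Rank repairs by severity weighted by how heavily the road is used."""
    return p.severity * usage_intensity.get(p.location, 0)

detected = [
    Pothole("B481-north", 0.9),
    Pothole("A40-west", 0.4),
    Pothole("HighSt", 0.7),
]

# The worst pothole on a quiet road can rank below a milder one on a busy road.
for p in sorted(detected, key=repair_priority, reverse=True):
    print(p.location, round(repair_priority(p), 1))
```

Note how the most severe pothole (on the quiet B481) ends up last in the queue: the usage weighting, not severity alone, drives the repair schedule.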

Good enough for the price 

One of the interesting insights from the discussion portion of the webinar addressed the issue of implementation costs. One municipality had struck a deal with its local police force to access some of their live data for a fee of GBP 2,000.

There was also a comment about economizing on cloud service provider costs by opting for not-quite-real-time operation. Instead of paying a premium to build ‘real time’ applications, a willingness to work at a slower pace (i.e., on the order of minutes or tens of minutes) meant that this developer was paying about $500/month for cloud services. These data points show how the market is at an early stage of development. Data providers and data consumers are still learning about the value of data [3], while developers explore cost economics and ways to lower adoption barriers thanks to attractive (and market-building) offers from pay-as-you-go platforms.

Another inevitability 

As pilots and collaborative business-model experiments such as the ones above take root, many more applications and services will be deployed. Over time, these will develop into essential operational processes that municipalities and their citizens come to depend upon. One inevitability already appearing on the horizon relates to data quality, data reliability, and decision-making transparency. How will municipalities know when they are getting good data from remotely connected assets? In IoT terms, this depends on functionality to check the status of remote devices and sensors, as well as the ability to update firmware and, potentially, operational parameters over the air. These are examples of chargeable IoT platform capabilities.
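One basic form of that status check is simply noticing which sensors have gone quiet before trusting their readings. A minimal sketch, with hypothetical sensor names and timestamps standing in for what an IoT platform’s device-management API would actually return:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical last-report timestamps per sensor; in a real deployment these
# would come from the IoT platform's device-management layer.
now = datetime(2021, 10, 13, 12, 0, tzinfo=timezone.utc)
last_seen = {
    "air-quality-01": now - timedelta(minutes=8),
    "traffic-cam-07": now - timedelta(hours=5),
    "noise-03": now - timedelta(minutes=2),
}

# How long a device may stay silent before its data is considered suspect.
MAX_SILENCE = timedelta(minutes=30)

def stale_sensors(reports, at, max_silence=MAX_SILENCE):
    """Flag devices that have gone quiet, so their data is not trusted blindly."""
    return sorted(dev for dev, ts in reports.items() if at - ts > max_silence)

print(stale_sensors(last_seen, now))  # traffic-cam-07 has been silent too long
```

A real platform would layer firmware versions, battery levels, and over-the-air update state on top of this, but even a last-seen check like the one above answers the basic “is this data still good?” question.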

Other capabilities that will inevitably be required involve functionality to explain how and why automated decisions are made. This will take users into the domain of explainable AI. The IoT implications here involve greater use of digital twin techniques as well as interoperability technologies that allow diagnostic applications to query data sources. These will range from conventional databases to remotely connected assets, likely supplied by many different vendors. The inevitable innovation to enable interoperability and automated, fault-tree reasoning requires users and solution providers to anticipate design requirements sooner rather than later. The alternative is to defer the issue and face the inevitable cost and field-support burdens of re-engineering deployed systems.
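At its simplest, decision-making transparency means recording, alongside each automated decision, the inputs and the rule that produced it. A toy sketch of such an auditable decision record, using an invented road-closure rule and illustrative thresholds rather than anything from the pilots above:

```python
import json

def decide_road_closure(air_quality_index: int, traffic_count: int) -> dict:
    """Return a decision together with the inputs and rule that produced it,
    so the outcome can be audited and explained later."""
    threshold_aqi, threshold_traffic = 150, 10000  # illustrative thresholds
    close = air_quality_index > threshold_aqi and traffic_count > threshold_traffic
    return {
        "decision": "close" if close else "keep_open",
        "inputs": {"air_quality_index": air_quality_index,
                   "traffic_count": traffic_count},
        "rule": f"close if AQI > {threshold_aqi} and traffic > {threshold_traffic}",
    }

record = decide_road_closure(air_quality_index=162, traffic_count=12500)
print(json.dumps(record, indent=2))
```

Real explainable-AI systems go far beyond threshold rules, but the design principle is the same: every automated decision should carry enough provenance for a diagnostic application to reconstruct why it was made.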


[1] ADEPT LIVE LABS, INNOVATION IN THE LOCAL ROAD SECTOR https://transportdatainitiative.com/uncategorized/tdi-9-adept-live-labs-innovation-in-the-local-road-sector-invitation-to-attend/ 

[2] ADEPT SMART Places Live Labs https://www.adeptnet.org.uk/livelabs 

[3] There Are New Markets for Industrial IoT Data https://www.iiconsortium.org/news/joi-articles/2018-June-JoI-New-Markets-for-Tradeable-Data.pdf


1 comment:

  1. 14 October 2021 update

    What Happens When ‘If’ Turns to ‘When’ in Quantum Computing?

    https://www.bcg.com/publications/2021/building-quantum-advantage
