Feb 11, 2021

IoT Platforms and Digital Regulation

A couple of recent and seemingly unconnected publications provide food for strategic thought on the topic of IoT platforms. 

Platforms matter for the following reasons. As businesses deploy Internet of Things applications, many will turn to the service provider market for affordable, feature-rich, and well-engineered platforms. Platforms are also an important topic for the large cloud providers, such as Amazon Web Services, Google Cloud and Microsoft Azure, which understand the importance of platform strategies and data. 

The first publication that caught my attention is a short article on the IoT Agenda site. It outlines the issue of IoT technology fragmentation and discusses the trend towards concentration in the IoT platform market [1]. The second is a study by a group of economists with expertise in platform economics and competition policy. They studied the EU’s Digital Markets Act (DMA) and its regulatory implications for large and dominant digital platforms [2]. 

Platform Market Dynamics 

The IoT Agenda article describes the proliferation of niche IoT offerings and a market situation where interoperability practices are inconsistent. These have resulted in a fragmented IoT market, creating a barrier to IoT adoption. The author refers to the growing number of IoT platforms, numbering 620 in 2019 [6]. According to Deutsche Telekom, this estimate could be closer to 1600 [3].

The past few years have witnessed growing involvement by the major cloud providers in creating technology and partner ecosystems to help businesses adopt IoT. Their strategies aim to create a convenient on-ramp for connected devices, making it easy to import data into their cloud storage and analytics environments [4], [5]. This is likely to reinforce the trend toward concentration in the IoT platform market that IoT Analytics, an industry research firm, reported on [6]. 

Economics and Regulatory Implications 

To get a sense of the strategic implications for IoT adopters and the data they generate, there are some cross-over insights from the regulatory issues facing ‘gatekeeper’ platforms. For the present, this class of platforms centers on personal data. They include large online platforms (e.g., for online advertising or apps) with significant network effects. Their label derives from the fact that they act as gatekeepers in the digital economy. It is a label that could easily apply to some IoT platforms handling machine and mixed (personal and machine) data as IoT data volumes grow and market concentration increases. 

One issue facing businesses that have historically operated in the mobile and M2M/IoT industries is their mastery of business models that take advantage of economies of scale (price deflation of handsets and connectivity) and scope (a common operating model to deliver voice, entertainment, and data services). Platform business models, in contrast, derive their strategic advantage from learning-by-doing and network effects. Furthermore, the dynamic of increasing returns is directly linked to data accumulation. Practices such as tying through discount schemes, bundling, and pre-installed apps are some of the ways that platforms foreclose competition.

Cloud platforms compete with platform service providers that themselves rely on public cloud infrastructure. In this situation, the cloud provider is no longer a pure intermediary, and this poses another competitive threat in the form of self-preferencing business practices. 

Finally, the information asymmetry between a platform provider and its users gives the provider a privileged market overview and scope for competitive advantage over its business users. 

Managing Platform Dependency Risks 

Mobile network operators and systems integrators need to adapt their traditional models to deal with the competition posed by cloud providers. In the case of IoT adopters, there are different ways to manage their dependency on IoT platforms. One approach centers on data portability which makes it possible for a user to switch between platforms. This is an issue to address during procurement, contractual negotiation and in terms of the practicality and degree of friction in data migration. 

Platform users can also request access to the activity datasets that are generated in the use of the core platform service, subject to data protection law. This is a way to learn about the dynamics of data use and to offset some of the competitive advantage that accrues to the platform service provider. 

The pressure of user expectations, competitor lobbying and/or regulation might usher in an open-standards interoperability regime in the future. This would enable federated operating models across multiple platforms. Higher up the IoT stack, interoperability might be possible through arrangements that enable semantic interoperability. 
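Semantic interoperability of this kind can be illustrated with a small, hypothetical sketch: two platforms report the same sensor reading under different vendor-specific field names and units, and a mapping layer normalizes both into a common schema so that applications can move between platforms. All platform names, payloads, and field names below are invented for illustration.

```python
# Hypothetical sketch of semantic interoperability between two IoT platforms.
# Each platform reports the same temperature reading with its own field names
# and units; a mapping layer normalizes both into one common schema.

def normalize(payload: dict, platform: str) -> dict:
    """Map a platform-specific telemetry payload to a common schema."""
    if platform == "vendor_a":
        return {
            "device_id": payload["devId"],
            "temperature_c": payload["temp"],  # already in Celsius
            "timestamp": payload["ts"],
        }
    if platform == "vendor_b":
        return {
            "device_id": payload["deviceIdentifier"],
            "temperature_c": (payload["tempF"] - 32) * 5 / 9,  # Fahrenheit to Celsius
            "timestamp": payload["time"],
        }
    raise ValueError(f"unknown platform: {platform}")

# Both vendor payloads normalize to the same shape, decoupling the
# application from any single platform's data model.
a = normalize({"devId": "sensor-1", "temp": 21.0, "ts": 1613000000}, "vendor_a")
b = normalize({"deviceIdentifier": "sensor-2", "tempF": 69.8, "time": 1613000000}, "vendor_b")
```

In practice, this mapping role is what shared data models and ontologies (the "arrangements" referred to above) aim to standardize, so that each adopter does not have to maintain bespoke translations for every platform pair.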

Platform users also need to understand the implications of relying on seemingly ‘free’ technology building blocks, implementation services or a curated partner ecosystem. These are forms of bundling with the potential to tie a user into a proprietary solution or platform. While the components on offer can facilitate standardization, the downside is that they lock users into walled-garden, non-interoperable relationships. What looks like fast time-to-market convenience might end up being a millstone that hinders competitiveness and data-centered innovation in the long run. 


‘Bundling and Tying’ Image Credit: Matt Seymour via unsplash.com


[1] IoT market fragmentation complicates device deployment - https://internetofthingsagenda.techtarget.com/feature/IoT-market-fragmentation-complicates-device-deployment 

[2] The EU Digital Markets Act: A Report from a Panel of Economic Experts - jrc122910_external_study_report_-_the_eu_digital_markets_act.pdf (europa.eu) 

[3] Implementing oneM2M in products will allow more flexibility for customers and systems integrators - https://onem2m.org/insights/executive-viewpoints/365-implementing-onem2m-in-products-will-allow-more-flexibility-for-customers-and-systems-integrators 

[4] Integrating Sigfox IoT network with Google Cloud - https://cloud.google.com/community/tutorials/sigfox-gw 

[5] New AWS IoT Device Client simplifies onboarding to AWS IoT Core - New AWS IoT Device Client simplifies onboarding to AWS IoT Core, AWS IoT Device Management, and AWS IoT Device Defender (amazon.com) 

[6] IoT Platform Companies Landscape 2019/2020: 620 IoT Platforms globally - https://iot-analytics.com/iot-platform-companies-landscape-2020/ 




4 comments:

  1. 11 Feb 2021 update

    Sigfox migrates its infrastructure to Google Cloud to process billions of messages monthly and to develop new value–added services.

    Google Cloud and Sigfox today announced that Sigfox, the leading global 0G network and IoT platform services provider, has partnered with Google Cloud to scale its cloud infrastructure and extend its IoT services portfolio.

    This partnership will enable Sigfox to rapidly accelerate its “Massive IoT” agenda – processing billions of messages each month from objects connected to the internet using data stored in the cloud.

    https://iotbusinessnews.com/2021/02/11/40113-sigfox-collaborates-with-google-cloud-to-accelerate-its-global-iot-strategy

  2. 11 Feb 2021 update

    Proposal for a Regulation on European data governance (Data Governance Act)

    The Regulation includes:

    i) A number of measures to increase trust in data sharing, as a lack of trust is currently a major obstacle and results in high costs.

    ii) New EU rules on neutrality to allow novel data intermediaries to function as trustworthy organisers of data sharing.

    iii) Measures to facilitate the reuse of certain data held by the public sector. For example, the reuse of health data could advance research to find cures for rare or chronic diseases.

    iv) Means to give Europeans control over the use of the data they generate, by making it easier and safer for companies and individuals to voluntarily make their data available for the wider common good under clear conditions.

    https://www.standict.eu/news/data-governance-act-european-commission

  3. 19 Feb 2021 update

    Interoperability is a technical mechanism for computing systems to work together – even if they are from competing firms. An interoperability requirement for large online platforms has been suggested by the European Commission as part of an ex ante (up-front rule) mechanism in its forthcoming Digital Services Act (DSA), currently due to be proposed on 2 December 2020.

    Interoperability is the online equivalent of interconnection that the EU imposed on the telecoms sector in the early 1990s, and was fundamental to the successful opening up of these markets to competition. Interoperability has been fundamental to competitive communications markets since their inception, and underlies many technologies today, including email, digital TV, and indeed the Internet itself. Users can exchange calls, text messages and e-mails irrespective of their phone, network or e-mail service.

    An interoperability requirement would apply to the largest online platforms, such as social media (e.g. Facebook), search engines (e.g. Google), e-commerce marketplaces (e.g. Amazon), smartphone operating systems (e.g. Google’s Android and Apple’s iOS), and their ancillary services, such as payment systems and app stores.

    https://openforumeurope.org/publications/ofa-research-paper-interoperability-as-a-tool-for-competition-regulation/

    Where large operators dominate a market, they are unlikely to favour technical interoperability because they do not want competition from new market entrants able to connect to their existing users. Technical interoperability therefore needs to be imposed on those large platforms by regulation.

    https://www.ianbrown.tech/2020/10/01/interoperability-as-a-tool-for-competition-regulation-2/

  4. 19 Feb 2021 update

    Oil and gas sector dispenses with in-house IoT to splurge $712.7m on cloud analytics

    Spending on big data and analytics in the oil and gas industry is increasing at a rate of about 75 percent per year, as companies rapidly dispense with in-house IoT management to go with big cloud providers instead. Analyst house ABI Research said the oil and gas sector will invest $712.7 million on IoT analytics by the end of 2026, up from $156 million in 2020.

    The sector’s investment in third-party IoT expertise has been climbing steadily, already; the annual spending figure was about $90 million in 2018, and has increased by 36.8 percent per year since then. ABI Research explained the oil and gas sector remains “deeply challenged” in its mission to connect and transform operations by “complex system integrations, siloed data, and supervisory control and data acquisition (SCADA) management systems”.

    Kateryna Dubrova, research analyst at ABI Research, commented: “In-house analytics is no longer a sustainable and cost-effective IoT option, and oil and gas firms have widely recognized the expertise of IoT cloud platform- and software-as-a-service vendors… More and more enterprises are turning to suppliers [for] advanced analytics and AI as-a-service offerings enabled through extensive cross-industry collaborations.”


    https://enterpriseiotinsights.com/20210216/channels/news/oil-and-gas-sector-dispenses-with-in-house-iot-to-splurge-on-cloud
