What Makes IoT Different to the Regulator?

Andy Haire
November 9, 2015

 

This second of four articles, written through the lens of an Authority's mandate, addresses the policy issues that could become part of such an Authority's thinking – a departure from typical articles, whose focus is on interoperability or devices. Parenthetically, 'Authority' is used loosely here: it means anyone, or any organization, in a position to impose rules or laws on marketplace behavior – generally parts of governments, such as legislatures, competition authorities or sector regulators.

Why is this important? The communications sector remains regulated, and the societal reach of IoT demands deeper safeguards.

The characteristics

Broadly, how would IoT be characterized; what are its features? When the EU regulatory authority, DG Connect, painted this policy picture a few years back, its whitepaper1 identified these IoT characteristics: ubiquity; miniaturization – devices shrinking to the point of invisibility; ambiguity – it defies rational classification; identification – things are getting tagged; connectivity – given overall ease and efficiency; embedded intelligence – and rapidly growing; seamless information flow – sometimes absent the context in which it was collected; and distributed control – there is no controlling center, just edges. Considered together, this communications paradigm is radically different from anything an 'Authority' has previously seen.

But there is a new element that IoT stakeholders must consider. No longer is the regulator the only authority 'in town'; many are. The new world is now filled with interested, empowered and committed 'overseers'; policy setting has become multi-stakeholder, sometimes without coordination or harmonization, and bound by no geography, economic sector, political ideology, or even technology.

What sets this apart?

What sets IoT policy apart from earlier ICT initiatives; what makes it different? Why might it show up on an Authority's doorstep? With no intention of being all-inclusive, five characteristics are offered:

  1. Nature of the communication In IoT architecture we have objects (sensors, devices, analytics, intelligence, etc.). These objects grow opaque – both in what they collect and in how the collected information is used. Further, the architecture permits people controlling objects, objects controlling people, and objects controlling other objects. Direct 'people' control is waning; objects fill the gap, assuming greater control and bringing with them the incompetence trap2. All of this raises serious questions of responsibility and accountability, and eventually leads to a possible overreliance on technology. A tragic example is the crash of Air France Flight 447 in 20093: the pilots had come to rely so heavily on 'fly by wire' that, as the crisis unfolded, they were ill-trained and incapable of coping, and the plane crashed into the sea – all of which forced a reconsideration of management behavior toward automation4. The notion of 'de-skilling' is well documented5, and it raises a deep concern: highly skilled workers give way to automation operated by the semi-skilled; skill fades over time, and cost savings are gained at the expense of quality.
  2. Risks of profiling Data, especially given the ease of post-collection correlation, leads to profiling, and profiling results in discrimination – some good, some bad. This may drive discriminatory commercial behaviors, which have a history of driving societal exclusion. Governments worry.
  3. Intrusion on civil society The very pervasiveness of IoT, both present and foreseen, leads to societal divides – between those who understand the technology and those who are intimidated by it. It might also lead to false trust in that technology. Objects encompass an ever-growing part of daily life – and thus a malfunction or error has an ever-growing impact on society.
  4. Diffused control IoT architecture is by its very nature controlled not from the center but from the edge. Under such diffusion, who becomes accountable, and thus liable, for the errant sensor collecting data, the weak credential check that allows access, or the data point taken out of context that then fails to protect future actions?
  5. Legal norms are no longer norms Adequate notice, usually the pillar of consent, is leading to 'consent fatigue'6; people agree to share data with no idea what they are agreeing to. If the 'ordinary user' doesn't have time to understand, then who protects the unprotected – those with special needs, such as children and the elderly?

Taken together, these give the Authority ample justification to become involved; but how?

What do they want?

Traditionally an Authority worried about competition, fair play and growth. In a recent speech, the head of BEREC – the European regulators' group – focused7 on historical regulatory issues: the 'evolution of internet-driven services that will stimulate the market and […] attention for the foreseeable future'. This list expands when you take into account the reach presented by IoT.

Going forward, the worries will be societal justice, growing digital divides, trust, and the use of information8. And, given that Authorities arguably hold sway over market outcomes, these should become shared worries for IoT's developers. A simple example: while it might be exciting for your automobile to deliver benefits from its on-board technology, it would be less exciting if your car were prohibited from crossing a national border because it doesn't comply with your destination's rules on, say, individual privacy or radio-spectrum use.

Of the many policy considerations we'll stick with a few: protective rights – such as privacy, security and data protection; safety – which includes trust, national security and public safety; economic opportunity – which includes growth, innovation and anticompetitive conduct; and societal risk. Several of these topics, especially economic opportunity, will be further developed in future articles in this series.

Protective rights

While often used in the same breath, there are differences among 'privacy', 'data protection' and 'information security'. Very simply: privacy involves sharing only what you want to; data protection is sharing only with whom you want to; security is keeping away the unwanted.

We tend to limit privacy policy considerations to guarding personally identifiable information (PII). Furthermore, until recently, we mistakenly thought that if we suppressed the PII elements from captured data, we had protected the individual's identity. In fact, highly sophisticated algorithms in the world of data analytics simply correlate9 seemingly unrelated data to accurately determine identity10. Additionally, while some see opportunity in 'smart living', such as managing electricity grids, others see it as intrusive. What if the household meter were sensitive enough, and the related analytics smart enough, to determine – based on the occupant's moment-to-moment consumption – what the occupant was doing: perhaps what food was consumed, what television program was being viewed, which rooms were occupied? Incidentally, the upper house of the Dutch parliament shared these concerns.
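To make the smart-meter concern concrete, below is a minimal sketch of the kind of inference involved – a toy version of what is known as non-intrusive load monitoring. Everything in it is an illustrative assumption: the appliance wattages, the matching tolerance and the meter readings are invented for the example, not drawn from any real deployment.

```python
# Toy non-intrusive load monitoring: match step changes in a meter's
# power readings against assumed appliance signatures.
# All values below are illustrative assumptions, not real data.

APPLIANCE_SIGNATURES = {  # hypothetical step changes, in watts
    "kettle": 2000,
    "television": 150,
    "oven": 2400,
    "bedroom lamp": 60,
}

def infer_activity(readings, tolerance=40):
    """Return (minute, appliance, on/off) events inferred from readings."""
    events = []
    for t in range(1, len(readings)):
        delta = readings[t] - readings[t - 1]
        for appliance, watts in APPLIANCE_SIGNATURES.items():
            if abs(abs(delta) - watts) <= tolerance:
                state = "on" if delta > 0 else "off"
                events.append((t, appliance, state))
    return events

# One reading per minute, in watts (invented for the example):
meter = [300, 300, 2310, 2300, 305, 450, 455, 300]
for minute, appliance, state in infer_activity(meter):
    print(f"minute {minute}: {appliance} switched {state}")
```

Even this toy matcher reports when the kettle and the television switch on and off; with finer sampling and richer signatures, the inferences only get more personal – which is precisely what worried the Dutch parliament.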

Security, on the other hand, is about keeping unwanted intruders out of your data and away from your devices. If my home's door locks are controlled by automation, my home is only as secure as the least skilled hacker who can defeat them. Pressure to produce sensors at the lowest possible cost should not come at the expense of the integrity of the broader system. A 'flash crash' followed by a quick firmware reboot offers false comfort – the object returns to service, so little harm seems done – but it must not become the backdoor through which scoundrels with dishonest intentions enter.
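One common safeguard against exactly that backdoor is to accept a firmware image on reboot only after verifying it against a vendor-supplied digest. The sketch below is illustrative only: it uses a hypothetical shared key and an HMAC for brevity, where a real device would verify an asymmetric signature against a public key anchored in hardware.

```python
import hashlib
import hmac

# Hypothetical vendor key for this sketch; a real device would ship a
# public key and verify an asymmetric signature instead.
VENDOR_KEY = b"example-shared-secret"

def verify_firmware(image: bytes, signed_digest: bytes) -> bool:
    """Accept a firmware image only if its HMAC matches the vendor's."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    # compare_digest runs in constant time, blunting timing attacks
    return hmac.compare_digest(expected, signed_digest)

def reboot_into(image: bytes, signed_digest: bytes) -> str:
    """On reboot, flash the image only if it verifies; else roll back."""
    if not verify_firmware(image, signed_digest):
        return "rollback: unverified image rejected"
    return "flashed: image accepted"

firmware = b"door-lock-firmware-v2"  # illustrative payload
good_sig = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
print(reboot_into(firmware, good_sig))         # flashed
print(reboot_into(firmware + b"!", good_sig))  # rollback: tampered image
```

The design point is the rollback path: a device that silently reboots into whatever image it finds is exactly the 'false comfort' described above.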

Trust and data protection are tightly interwoven. Data collected – whether intentionally, incidentally, accidentally or inadvertently – is now under someone's stewardship. Should that data become compromised, in any fashion and for any reason – as seen in far too many recent headlines – the public's trust in accepting these intrusive IoT systems will be deeply damaged. Trust, once lost, will be difficult to restore. Jurisdictions worldwide are enacting legislation to make breaches 'painful'. Some now realize that such legislation tends to be ill-directed: it focuses on collection, not intended use.

Over the past few years, the World Economic Forum, an international institution committed to improving the state of the world through public-private cooperation, has reported11 several goals for data use:

  • From Transparency to Understanding: People need to understand how data is being collected, whether with their consent or without it – through observation and tracking mechanisms made possible by the low cost of gathering and analyzing data.
  • From Passive Consent to Engaged Individuals: Too often the organizations collecting and using data treat consent as a yes-no, on-off matter. New ways are needed for individuals to exercise more choice and control over the data that affect their lives.
  • From Black and White to Shades of Gray: The context in which data is collected and used matters significantly. How the data is used is what counts; much like money, it means little until it is used.

Before the dawn of networked data, individual data was generally used once, and for a specific purpose. Today, given the role of analytics, reuse of data is common, unlocking more value for others. But data reused away from its original context creates risks.

In the recent past, especially with the advent of data-mining technology, the line between public and private data use has become more opaque; people no longer know if, when or how their personal information is used or, worse, shared. Trust is considered the key challenge for the Future Internet12. Trust builds when an 'object' performs predictably and does not threaten a person's expectations.

Conclusion

The EU expert group noted above arrived at several conclusions13: there is a need for transparency, both in the vendor-supply relationship (how these systems function; independent certification with relevant metrics such as privacy and security) and in the public's understanding of what these objects are doing. This, in turn, leads to ensuring that access to information is governed by clear and understandable rules. A considered set of reports14 released in mid-2014 by the Executive Office of the US President recommended, among many other policy considerations, that the policy focus for this broader issue shift from collection to intended use.

Finally, it is crucial that the public and industry conversations take place alongside the technological development. It is insufficient to believe that the IoT community can self-regulate, or that the Authority can develop its rules alone.

 

References

1. Van den Hoven, Jeroen, 2012, EU DG Connect; Ethics subgroup IoT – Version 4.0; p4

2. Crabb, Peter B; June 2010; International Journal of Technoethics, p19

3. Langewiesche, W., The Human Factor, Vanity Fair, October 2014

4. BestRid.com Blog; What the Airline Industry Learned about Automation…, 7 October 2014

5. Lerner, S, Univ of Waterloo, Waterloo, Ontario, Canada; The Future of Work in North America, Futures, March 1994

6. Schermer, B, Your consent is overrated, Leiden Law Blog – University of Leiden, 11 April 2013

7. BEREC chair speech, http://www.contel.hr/2015/fatima-barros/, July 2015

8. Van den Hoven, Jeroen, EU DG Connect; Ethics subgroup IoT – Version 4.0; p19

9. Adam Sadilek, Henry Kautz and Jeffrey Bingham, "Finding Your Friends and Following Them to Where You Are", 5th ACM Conference on Web Search and Data Mining, 2012

10. MIT Technology Review, http://www.technologyreview.com/news/514351/has-big-data-made-anonymity-impossible/

11. WEF; Unlocking the Value of Personal Data: From Collection to Usage; http://www3.weforum.org/docs/WEF_IT_UnlockingValuePersonalData_CollectionUsage_Report_2013.pdf

12. De Paoli, S, Toward Trust as Result, tripleC 9(2): 702-714, 2011; ISSN 1726-670X

13. Van den Hoven, Jeroen, EU DG Connect; Ethics subgroup IoT – Version 4.0; pp20-21

14. Executive Office of the President; May 1, 2014; http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf (pdf accessed May 2014) and http://www.whitehouse.gov/sites/default/files/microsites/ostp/PCAST/pcast_big_data_and_privacy_-_may_2014.pdf (pdf accessed May 2014)

 


 

Andrew Haire, with more than 30 years of experience spanning four continents, has been associated with some of the industry's most successful telecom initiatives. He advises both governments and communication providers and is an expert in industry policy, market growth, strategy, technical opportunity, and economic structure. His portfolio has included architecting major policy frameworks in the telecoms, technology, and postal sectors, as well as serving as regulator and ICT policymaker for 10 years at Singapore's iDA, soon after its inception in the year 2000.

He serves on the Board of the International Institute of Communications in London, and is Chairman of its US Chapter. Mr Haire holds an engineering degree from the United States and attended the Advanced Management Program at Harvard University. He has delivered papers and speeches on policy and regulatory frameworks in Asia, Europe and North America.