By Martin Head, Programme Director, Communities and Director of Content
Our homes are becoming ever more connected, with ownership of smart devices more than doubling in the last two years. These figures, from a PwC report published earlier this year(1), reveal that almost 40% of people now enter the connected home market via smart entertainment devices.
The PwC survey estimates that the market for smart devices will be worth 10.8bn in 2019, whilst techUK(2) have recently found that use of smart speakers doubled between 2017 and 2018, and that the number of households owning more than three smart home products has grown by a quarter since 2017.
As Ronan O’Regan, PwC’s Digital Utilities lead, said at the report’s launch:
“While smart home assistants are relatively new to the market, we believe they could potentially be the ‘glue’ towards wider adoption. You could say they are having an ‘iPhone effect’ in the market.”(3)
The development of connected technology is still in its infancy with some devices offering solutions seemingly still in search of a problem (Bluetooth kettles anyone?). The real scope and ultimate power of connecting our homes in an integrated way is still a long way from being realised.
There are huge opportunities in how we might use the technology to support and protect people. However, these devices generate vast quantities of personal data – a fact that may not be fully understood by their users. A deeper understanding of data privacy is therefore needed, together with the ability to develop a trusted relationship with the providers and the uses of the technology. As techUK’s Sue Daley wrote recently in a piece for Corsham Institute’s Observatory for a Connected Society app, in regard to AI and ethics, “It is our job to continue to build the culture of data trust and confidence needed to ensure technology remains a force for good”(4).
The relevance and power of all the personal data gathered from a connected home when numerous devices are integrated together is going to take on new dimensions at an exponential rate. PwC in launching their recent report said that, “Tech giants are blurring lines and breaking down barriers, creating innovative products that capture data to provide differentiating insights, novel solutions and a seamless user experience”, but they recognised that trust in suppliers “could become a major battleground for traditional players over the next few years”(3).
For most consumers, the journey to understand the meaning and full implications of sharing their personal data is only just beginning. Corsham Institute’s Your Data, Your Rights survey earlier this year showed that while 60% of respondents cared a lot about the use of their personal data, only 18% knew a lot about its collection(5). The recent techUK ‘Connected Homes’ report shows that personal privacy, cited by 23% of consumers, was the second highest barrier to buying connected home products, after cost.(2)
Further, while the 2018 Data Protection Act gives an official definition of personal data as “any information relating to an identified or identifiable living individual… in particular by reference to, (a) an identifier such as a name, an identification number, location data or an online identifier, or (b) one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of the individual”(6), what we’re prepared to share isn’t a fixed concept: it will differ between age groups and other demographics, and with the value we perceive we are receiving from each device or app.
At a recent workshop held at the Digital Catapult by a multi-university, Petras-funded project into the ‘Value of Personal Data in the Internet of Things’(7), the ‘privacy paradox’ was highlighted: the tension between individuals’ stated desire to maintain privacy and their willingness to share their data online. It was also recognised that ‘little is understood about the value consumers place on keeping their data private’. The project’s work to date includes findings that individuals perceive a lack of choice about whether to share their personal data, care deeply about protecting it, and are even willing to pay to do so.
The debate over data ethics is increasingly fundamental, and ever more urgent as homes become more connected and devices could be listening to our everyday conversations and making decisions about our preferences, opinions and shopping habits(8). It’s not simply about finding more appropriate business models for using the data, nor about making people pay for the protection of data about themselves; it’s that a culture of data urgently needs to develop as rapidly as the technology is doing, and to grow hand in hand with it.
Individual citizens need to own and control third-party access to their personal data and, further still, once access has been granted, to have choices about how it is used on an ongoing basis and the ability to withdraw that permission if circumstances change. In this area, Corsham Institute will be working with partners to develop ethical frameworks and community-led test beds, to understand in greater depth the implications for all of us as our homes become ever more connected.
If Alexa and Cortana are to become invited guests in our living rooms, or even virtual members of the family, they must end up serving the real interests of their owners and working for us, not using their all-too-attractive functionality as a cover story for a massive data-mining exercise by those who market them.
(1) PwC Disrupting Utilities: www.pwc.co.uk/industries/power-utilities/insights/energy2020/connected-home.html
(2) techUK Report: State of the Connected Home, Edition Two, September 2018 www.techuk.org/insights/news/item/13914-connected-home-device-ownership-up-but-consumers-remain-sceptical
(4) Excerpt from blog by Sue Daley, Head of Programme, Cloud, Data, Analytics and AI at techUK: ‘AI, ethics and data: a watershed moment for UK tech’, written for the Observatory for a Connected Society: www.connectedobservatory.net/commentblog/sue-daley
(6) Data Protection Act 2018, Chapter 12, Part 1:3(2) www.legislation.gov.uk/ukpga/2018/12/enacted