Data protection and consumer trust are the key to unlocking AI


Research published to coincide with Data Privacy Day carries an unequivocal message for programmers and policymakers alike. With 93% of UK consumers apprehensive about how their data is used, and the advent of AI now raising additional concerns about personal data being used to train AI models, the conclusion is clear: trust has to be gained before it can be retained.

Almost 7 years on from the implementation of comprehensive data privacy laws in the UK (in the form of the General Data Protection Regulation), and 3 years on from the UK government hinting that it might think outside the box on data privacy regulation, the research found that a significant proportion (56%) of consumers want stricter regulation of personal data.

This could be interesting food for thought for policymakers, and it sits slightly at odds with both the latest data protection reform bill currently going through Parliament (the Data Use and Access Bill) and the Government’s recently published AI Action Plan, which adopts a pro-innovation, light-touch approach to AI regulation, in contrast to more heavily regulated regimes such as the EU’s.

Need for transparency

What is clear from the research is that there is work to do at all levels of the data ecosystem to build trust and alleviate fears. The Government’s focus in its Action Plan is firmly on unlocking data sets as a key to AI innovation, so its success is predicated on consumers being confident that their data will be protected when it is shared with third parties in both the public and private sectors. The research showed that consumers still don’t understand what is happening to their data, particularly in the context of AI, and this lack of understanding is likely what drives fears about data sharing. Although transparency principles lie at the heart of UK data protection law, an overwhelming majority (93%) of respondents still want more to be done to better inform people about how their data is being used.

For businesses, these trust issues are both a threat and an opportunity. With 78% of consumers saying they are likely to stop using a service or company if they discover it has suffered a data breach, strong data privacy standards and compliance give businesses a chance to differentiate themselves: those that are trusted are more likely to succeed in the race to harness the power of data in the evolving AI world.

As for consumers, they must also share the burden of responsibility for data protection. Whilst the majority (67%) of respondents confirmed that they have heeded years of advice to use strong passwords to protect their data, more sophisticated measures such as VPNs and other privacy-enhancing technologies remain much less widely used.

So, whilst policymakers and regulators can mould the metaphorical bricks that house our data, and software developers can install the windows, that alone is not enough to deliver robust security. Put another way: just as people are unlikely to leave home without first locking the front door, consumers must take some responsibility for protecting their own data.

Miriam Everett, partner and global head of data and privacy at Herbert Smith Freehills


