When was the last time you read all the terms and conditions for an online service? Do you ever click on the emails that inform you about an updated privacy policy? Didomi's Amy Arnell looks at compliance and privacy regulation
Is the phrase "I have read and agree to the terms and conditions" in fact the biggest lie on the internet? Often, the reality is that we have not even tried to read, let alone understand, the terms and conditions that apply.
We all know we don’t read the fine print. Some companies have put this to the test: F-Secure buried a clause deep in its lengthy terms and conditions stating that, in exchange for its WiFi service, “the recipient agreed to assign their first born child to us for the duration of eternity”. People still signed up.
Another example is Dima Yarovinsky’s art project which aims to “visualize how small and helpless users are against large corporations,” as Designboom writes. The designer printed out the T&Cs of leading online services on coloured, standard A4-size rolls.
Perhaps the logical conclusion to be drawn is that we just don’t care. But, actually, we do.
Recent Europe-wide research has shown that 85% of people across France, Great Britain, Ireland and Germany feel they understand the meaning of personal data, and a high proportion are worried about how it is being used (71% in Great Britain, for example).
If some 80% of us believe that transparency is important for trusting a company or brand, then why don’t we read the fine print? It’s not a difficult question to answer: excessive word counts, complicated legal language, and dry, impenetrable, inaccessible prose.
Plus, did we even really have the choice to begin with? Often there is no option but to “agree,” to “consent” or to “accept” if we want to interact socially or work professionally online, leaving little opportunity to challenge brands’ data practices.
It would appear that we are stuck in a trap: feeling uncomfortable about how our data is used, yet unable to fully understand what exactly it is being used for until it is too late.
One need only recall the Cambridge Analytica scandal of 2018. To hear more about Cambridge Analytica and what we can learn from it, join the Yes We Trust Summit on October 7th, where Brittany Kaiser (Cambridge Analytica whistleblower) will deliver a keynote on how the digital industry can gain the trust of consumers and companies.
The “Digital Buddy”: A companion to navigate a complex digital world
The “Digital Buddy” project by Field Systems proposes to leverage artificial intelligence (AI) and augmented reality (AR) to create a 3D avatar to help protect your interests and privacy online.
The Buddy would communicate the T&Cs in a way that is easy to understand. You could ask the Buddy questions such as “are my messages being read?” or “is my location being tracked?”. And, over time, your Buddy would learn about your values and preferences, acting independently to review T&Cs and quickly highlight any serious concerns.
But, if we think deeper, what does this tell us about our society?
How have we got to the stage where compliance tools, like cookie banners, T&Cs or privacy policies, that were built to protect and provide transparency have been manipulated to further obscure the way our data is being used? Why do we believe that we’re being tricked when, in reality, these tools are made to inform us?
An uncertain future: Do we trust in compliance?
Perhaps the internet that was supposed to “set us free” is an internet we have left behind. It’s time to reverse this cycle of distrust and opt for a citizen-led data industry.
The Digital Buddy seeks to educate and empower, and this is definitely a good thing. But wouldn’t it be more effective to try and solve the problem the Digital Buddy addresses? Rather than creating AI companions to help protect our data, it would surely be better for companies to reverse their thinking and design all privacy interfaces with the user in mind.
We have listed a couple of examples of cookie banners (some from our clients, some from creative sectors like gaming), but we would be lying if we said we were impressed. Even on awwwards, most of the “Creative Examples of Cookie Consent Experiences” are more about visual design than privacy-by-design.
Privacy policies could be made clearer by a theoretical virtual digital assistant, but they could also be made clearer through an ambitious effort to invest in trust and transparency. Interestingly, it’s a tech company, Apple, rather than privacy regulations like GDPR and CCPA, that has designed one of the most impactful consent notices. Does this mean that compliance has lost to big tech?
Four-fifths of consumers consider transparency important for trusting a brand or company. Users want to be informed about why their data is being collected, and they appreciate companies with transparent data practices.
Trust should be regarded as the single most important driver of success in business, and privacy professionals as real business enablers.
Consumer data has never been more valuable. How brands collect, use and protect it determines user experience and customer trust, which is why compliance professionals should think more like users than lawyers.
You can sign up to the 'Yes We Trust Summit' here to discuss these ideas further. The event is sponsored by Didomi and Securys.
This post was originally published on https://blog.didomi.io/en on July 27th here - https://blog.didomi.io/en/yes-we-trust-in-compliance
Posted on: Friday 20 August 2021