17 July 2018

Australian Consumer Privacy Perceptions

The Consumer Policy Research Centre report Consumer data and the digital economy: Emerging issues in data collection, use and sharing calls for placing consumers in the driver’s seat to ensure significant benefit and innovation can flow from open data.

Salient findings are:
  • 95% wanted companies to give options to opt out of certain types of information collected about them, how it can be used and/or what can be shared with others. 
  • 91% agreed that companies should only collect the information currently needed to provide the service. 
When asked ‘what data/information would you be uncomfortable with companies sharing with third parties for purposes other than delivering the product or service’, the four highest-ranking answers were:
  • Phone contacts (87%) 
  • Your messages (86%) 
  • Device ID (84%) 
  • Phone number (80%) 
Of the Australians surveyed who reported reading a Privacy Policy or Terms and Conditions for one or more services/products in the past 12 months, 67% indicated that they still signed up for one or more products even though they did not feel comfortable. The most common reason (73%) for accepting privacy policies with which consumers were not comfortable was that it was the only way to access the product or service. 
Consumers surveyed found it unacceptable for companies to:
  • Charge different consumers different prices based on their (data) profile (88%) 
  • Collect data about them without their knowledge to assess eligibility for, or exclude them from, a loan or insurance (87%) 
  • Use payment behaviour data to exclude them from certain essential products and services (82%)
  • 73% believe Government should ensure companies give consumers options to opt out of what data they provide, how it can be used and if it can be shared. 
  • 67% believe Government should develop protections to ensure consumers are not unfairly excluded from essential products or services based on their data or profile.
The authors note:
Australians are spending more of their lives online. 87% were active internet users in 2017, more than 17 million use social networking sites, and 84% of Australians are now buying products online. 
Big Data is big business. In 2018 alone, revenue from the Big Data software market was estimated at $42 billion. The introduction of the General Data Protection Regulation now provides EU consumers with new protections including greater transparency and control of data being collected about them by companies.
They argue that although the establishment of the Consumer Data Right is 'a step in the right direction, it currently falls short of economy-wide protections for Australian consumers whose data is being collected, shared and used on a daily basis'.

The report highlights several policy implications:
Building consumer trust and confidence to participate in the digital economy. 
Policy and regulatory settings must ensure that consumers can build trust through their participation in the digital economy. This will be central to the sustainable development of innovative technologies that are dependent on data collection. The United Kingdom’s Competition and Markets Authority provides useful guidance for businesses and government on the elements of consumer data use practices that support well-functioning markets, including:
  • Consumers know when and how their data is being collected and used; and have some control over whether and how they participate. 
  • Businesses are using the data to compete on issues that matter to the consumer. 
  • The use of consumer data benefits both consumers and businesses. 
  • Rights to privacy are protected through the regulation of data collection and use. 
  • There are effective ways to fairly manage non-compliance with regulation. 
Consumers need to be provided with genuine choice and control over collection, sharing and use 
Reforms to ensure that consumers are put in the driver’s seat when it comes to their own data are critical. Protections and regulations that are reliant on a consent model must ensure consumers genuinely comprehend and have choice over the type of data being collected, who it is being shared with and for what purpose. 
Comprehension testing and behavioural research should inform consent requirements. Essential elements of any consent regime must ensure that the consent provided is:
  • Expressed – the controller must be able to demonstrate that consent was given. 
  • Specific to purpose (unbundled from other matters). 
  • Easy to understand (written in clear and plain language). 
  • Easily accessible. 
  • Able to be withdrawn (as easily as it was given). 
  • Freely given (not conditional if the data is not necessary for the provision of the service). 
Ensuring consumers’ right to privacy is adequately protected 
Experts caution against a wholly self-management approach to privacy. 
A balanced approach includes: defining valid consent; developing practical mechanisms for people to manage their privacy across all entities, rather than micro-managing each one; adjusting the timing and focus of privacy law to provide guidance on types of uses at the time they are proposed; and developing a code of basic privacy norms in law. 
Privacy by Design principles can also be applied by businesses and regulators to protect consumer privacy during the design phase, and throughout the lifecycle of any product. 
Greater transparency of, and access to data and profiles 
Enabling greater transparency and access to the scores and profiles being built of consumers can help to avoid incorrect, biased and potentially discriminatory practices. Without transparency as to what data may have been used as an input to a company's decision, consumers can neither challenge the outcome nor change their behaviour in the future to achieve a different one. To ensure algorithms and scores are not discriminatory, regulators can increase monitoring, auditing and assessment powers. 
Examples of algorithmic auditing and assessment services emerging internationally include bias check services established by the Algorithmic Justice League and ORCAA established by mathematician Cathy O’Neil for companies to test the fairness of the algorithms they are using. 
Strengthening regulatory monitoring and intervention powers 
The evolution of technology and machine-learning practices requires a significant shift in capability, skills and monitoring powers within regulators. 
It is critical that regulators are adequately resourced and skilled to keep pace with new technologies and practices. They also need to be armed with sufficient discovery powers to identify potentially discriminatory or predatory lending behaviours, based on profiling practices, and to audit or assess algorithms. 
This will require investment in new skills, systems and people to keep pace with a fast-moving commercial environment. Ensuring the academic and community sector are also sufficiently resourced to engage in policy and regulatory development processes will be key during this economy-wide shift in the operation of our markets.