The 151-page Clearly Opaque: Privacy Risks of the Internet of Things report by Gilad Rosner and Erin Kenneally comments:
There have been many names for the
IoT over time: ubiquitous computing,
ambient intelligence, machine-to-machine
communications, pervasive
computing, and, most recently, cyber-physical
systems. The terms emerged
from various disciplines, but they all point
in the same direction. These persistent
attempts to find a suitable term for the
phenomenon reveal an awareness that
the world is in rapid transition towards
more comprehensive monitoring and
connectivity, that this will likely have a
profound impact on our lives, and that
it is important to start anticipating the
potential consequences. Our physical
and informational world is evolving,
and with it, the concept of privacy as
we know it.
The authors argue that
The IoT will expand the data collection
practices of the online world to the
offline world.
— The IoT will enable and normalize
preference and behavior tracking in
the offline world. This is a significant
qualitative shift, and a key reason to
evaluate these technologies for their
social impact and effect on historical
methods of privacy preservation. The
very notion of an offline world may
begin to decline.
The IoT portends a diminishment of
private spaces.
— The scale and proximity of sensors
being introduced will make it harder
to find reserve and solitude. The IoT
will make it easier to identify people in
public and private spaces.
The IoT will encroach upon emotional
and bodily privacy.
— The proximity of IoT technologies
will allow third parties to collect our
emotional states over long periods
of time. Our emotional and inner life
will become more transparent to data
collecting organizations.
Given the likelihood of ubiquitous
data collection throughout the human
environment, the notion of privacy
invasion may decompose, all the more so
as people’s expectation of being
monitored increases.
— Much of consumer IoT is predicated
on inviting these devices into our lives.
The ability to know who is observing us
in our private spaces may cease to exist.
The IoT will hasten the disintegration of
the ‘reasonable expectation of privacy’
standard as people become more
generally aware of smart devices in
their environments.
When IoT devices fade into the
background or look like familiar
things, we can be duped by them,
and lulled into revealing more
information than we might otherwise.
Connected devices are designed to be
unobtrusive, so people can forget that
there are monitoring devices in
their environment.
IoT devices challenge, cross and
destabilize boundaries, as well as
people’s ability to manage them.
— The home is in danger of becoming a
‘glass house,’ transparent to the makers
of smart home products. And, IoT
devices blur regulatory boundaries –
sectoral privacy governance becomes
muddled as a result.
As more and more products are
released with IoT-like features, there
will be an “erosion of choice” for
consumers – less ability to keep the
Things in their environment from
monitoring them.
Market shifts towards ‘smart’ features
that are intentionally unobtrusive
lead to less understanding of data
collection, and less ability to decline
those features.
The IoT entrenches the surveillance
society, further commodifies people,
and exposes them to manipulation.
The IoT makes gaining meaningful
consent more difficult.
The IoT is in tension with the principle
of Transparency.
The IoT threatens the Participation
rights embedded in the US Fair
Information Practice Principles and the
EU General Data Protection Regulation.
IoT devices are not neutral; they are
constructed with a commercial logic
encouraging us to share. The IoT
embraces and extends the logic of
social media – intentional disclosure,
social participation, and continued
investment in interaction.
The IoT will have an impact on
children, and therefore give parents
additional privacy management duties.
— Children today will become adults in a
world where ubiquitous monitoring by
an unknown number of parties will be
business as usual.
The report identifies ‘emerging frameworks and strategies’ regarding IoT privacy:
Having broad non-specialist social
conversations about data (use, collection,
effects, socioeconomic dimensions) is
essential to help the populace understand
the technological changes around them.
Privacy norms must evolve alongside
connected devices – discussion is
essential for this.
Human-Computer Interaction (HCI)
and Identity Management (IDM) are two
of the most promising fields for privacy
strategies for IoT.
A useful design strategy is the ‘least
surprise principle’ – don’t surprise users
with data collection and use practices.
Analyze the informational norms of
personal data collection, use and sharing
in given contexts.
Give people the ability to do fine-grained
selective sharing of the data collected by
IoT devices.
Three major headings for emerging
frameworks and strategies to address
IoT privacy:
— User Control and Management
— Notification
— Governance
User Control and
Management Strategies
— Pre-Collection
• Data Minimization – only collect data
for current, needed uses; do not collect
for future as-yet-unknown uses
• Build in Do Not Collect ‘Switches’
(e.g., mute buttons or software toggles)
• Build in wake words and manual
activation for data collection, rather
than truly always-on capture (both
strategies are sketched in the code
after this list)
• Perform Privacy Impact Assessments
to holistically understand what your
company is collecting and what would
happen if there was a breach
— Post-Collection
• Make it easy for people to delete
their data
• Make it easy to withdraw consent
• Encrypt everything to the maximum
degree possible
• IoT data should not be published on
social media or indexed by search
engines by default – users must review
and decide before publishing
• Raw data should exist for the shortest
time possible
— Identity Management
• Design strategies:
> Unlinkability – build systems that
can sever the links between users’
activities on different devices
or apps
> Unobservability – build or use
intermediary systems that are
blind to user activity
• Give people the option for
pseudonymous or anonymous
guest use
• Design systems that reflect the
sensitivity of being able to
identify people
• Use selective sharing as a
design principle
> Design for fine-grained control
of data use and sharing
> Make it easy to “Share with this
person but not that person”
• Create dashboards for users to see,
understand and control the data
that’s been collected about them
• Design easy ways to separate
different people’s use of devices
from one another
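To make the pre-collection strategies above more concrete, here is a minimal Python sketch, not drawn from the report itself, of how a device might combine a ‘Do Not Collect’ switch with wake-word activation so that nothing is captured unless the user has explicitly armed the device. The CollectionGate class, its method names and the wake word are hypothetical illustrations, not the API of any real product.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class CollectionGate:
    """Hypothetical device-side gate enforcing two pre-collection strategies:
    a user-facing 'Do Not Collect' switch and wake-word (manual) activation
    instead of always-on capture."""
    do_not_collect: bool = False   # hardware mute button or software toggle
    awake: bool = False            # set only after an explicit wake word
    wake_words: List[str] = field(default_factory=lambda: ["hey device"])

    def toggle_do_not_collect(self, enabled: bool) -> None:
        # A single switch that halts all capture.
        self.do_not_collect = enabled
        if enabled:
            self.awake = False

    def hear(self, utterance: str) -> None:
        # Only an explicit wake word arms collection; everything else is dropped.
        if not self.do_not_collect and utterance.lower().strip() in self.wake_words:
            self.awake = True

    def capture(self, sample: str, sink: Callable[[str], None]) -> bool:
        """Forward a sample to the data sink only when collection is enabled
        and the device has been explicitly woken; otherwise discard it."""
        if self.do_not_collect or not self.awake:
            return False           # sample is discarded, never stored
        sink(sample)
        self.awake = False         # one interaction per wake, not always-on
        return True


# Example: the mute switch always overrides the wake word.
if __name__ == "__main__":
    collected = []
    gate = CollectionGate()
    gate.hear("hey device")
    gate.capture("turn on the lights", collected.append)    # collected
    gate.toggle_do_not_collect(True)
    gate.hear("hey device")
    gate.capture("private conversation", collected.append)  # silently dropped
    print(collected)                                         # ['turn on the lights']
```

The design choice worth noting is that the mute switch overrides the wake word and each wake arms only a single capture, so the device never silently falls back into an always-on mode.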
Notification Strategies
• Timing has an impact on privacy
notice effectiveness.
• Emerging privacy notice types:
> Just-in-time
> Periodic
> Context-dependent
> Layered
• Test people’s comprehension
of privacy policies
• Researchers are exploring privacy
notification automation:
> Automated learning and setting
of privacy preferences
> Nudges to encourage users to think
about their privacy settings
> IoT devices advertising their
presence when users enter
a space
Governance Strategies
• Creation of baseline, omnibus privacy
laws for the US
• Regulations restricting certain uses
of IoT data
• Regulator guidance on acceptability
of privacy policy language and
innovation
• Requirement to test privacy policies
for user comprehension
• Expansion of “personally-identifiable
information” to include sensor data
in the US
• Policymaker discussions of the
collapse of the ‘reasonable expectation
of privacy’ standard
• Greater use of the ‘precautionary
principle’ in IoT privacy regulation
• More technologists embedded
with policymakers
• Trusted IoT labels and
certification schemes