Showing posts with label IoB. Show all posts

28 October 2022

Implantables and privacy

'When is the processing of data from medical implants lawful? The legal grounds for processing health-related personal data from ICT implantable medical devices for treatment purposes under EU data protection law' by Sarita Lindstad and Kaspar Rosager Ludvigsen in (2022) Medical Law Review states 

Medicine is one of the biggest use cases for emerging information technologies. Data processing brings huge advantages but forces lawmakers and practitioners to balance between privacy, autonomy, accessibility, and functionality. ICT-connected Implantable Medical Devices plant themselves firmly between traditional medical equipment and software that processes health-related personal data, and these implants face many data management challenges. It is essential that healthcare providers and others can identify and understand the legal grounds they rely on to process data. The European Union is currently updating its framework, and the special provisions in the GDPR, the current ePrivacy Directive, and the coming ePrivacy Regulation all provide enhanced thresholds for processing data. This article provides an overview and explanation of the applicability of the rules and the legal grounds for processing data. We find that only a cumulative application of the GDPR and the ePrivacy rules ensures adequate protection of this data and present the legal grounds for processing in these cases. We discuss the challenges in obtaining and maintaining valid consent and necessity as a legal ground for processing and offer use case-specific discussions of the role of consent long-term and the lack of an adequate ‘vital interest’ exception in the ePrivacy rules.

The authors comment 

 Medicine is an emerging field for information communication technologies (ICT). Data processing brings significant advantages, and medical technologies develop at record speeds. ICT-connected Implantable Medical Devices (ICTIMD) plant themselves firmly between traditional medical equipment and software processing health-related personal data. ICTIMD are medical devices implanted in the human body with software capable of communicating and transferring data to external devices. They allow healthcare providers to monitor the patient’s condition without being physically present and help medical industries go from reactive to predictive and proactive models of care. 

However, the rapid technological development is a two-edged sword, forcing lawmakers and practitioners to balance between privacy, data protection, autonomy, and accessibility. ICTIMD rely on the processing of data on a massive scale, and while they face many of the same data management challenges as other fields, there are some major distinguishing factors. Health data is one of the most sensitive types of personal data, and the impact of a data breach can have enormous consequences. ICTIMDs are also, in contrast to most other devices, collecting data automatically and constantly from sensors implanted in human subjects. The end-user and data subject, the patient, does not have the freedom to leave the device at home. These devices form a particularly sensitive part of the private sphere of the users, demanding high data protection standards. 

The European Union (EU) is in the process of updating its privacy and data protection framework. Having replaced the Data Protection Directive (DPD) with the General Data Protection Regulation (GDPR), the complementing ePrivacy Directive (PECD) will eventually be replaced by an ePrivacy Regulation (EPR) and future additional legislation. These instruments together implement enhanced thresholds for processing health data from terminal equipment. For efficient data protection, it is vital that all the actors in the value chain, the healthcare providers, and the patients can identify and understand the lawful grounds available for processing. Our sections II and III start by clarifying the applicability of the rules and provide an overview of the legal grounds for processing from ICTIMD. Sections IV and V dive deeper into consent and necessity as legal grounds for processing ICTIMD data before section VI discusses the framework’s suitability for ICTIMD processing. 

The article will focus on processing enabling medical treatment and exclude processing for research purposes or other public interests. It will be limited to data protection law and will not cover law enforcement access, criminal law issues of illegal access, product liability law, or health law specifically.

27 December 2020

UK Animal Tagging

The UK government has launched a consultation about mandatory microchipping of cats, alongside three separate mandatory scanning proposals. Tuk’s Law would make it mandatory for vets to scan cats and dogs for microchips before putting them down; Fern’s Law would require vets to microchip cats and dogs when brought into a vet practice for the first time; and Gizmo’s Legacy would make it mandatory to scan for microchips when a cat or dog is found dead by the roadside. 

The government states that over a quarter of the UK’s pet cats aren’t microchipped, 'meaning that up to 2.6 million cats will benefit from the new measures'. Since compulsory dog microchipping was introduced in 2016, around nine million dogs are now microchipped.

In discussing the dog regime the consultation paper states 

 All adverse reactions to microchips, including failed and migrated microchips (those that move within the animal) must be reported to the Secretary of State via the Veterinary Medicines Directorate. The latest data on adverse reactions in dogs from 2019 shows that there were a total of 354 reactions reported out of 540,000 dogs microchipped that year. 98% of reported cases were instances where the microchip had failed or migrated, rather than where implantation had caused health issues. 

In order that dogs are microchipped in a satisfactory way, minimum standards are set for microchips, and for the databases in which the keeper’s details are recorded (see paragraph 18). There are also minimum qualifications for those people who implant microchips. The 2015 Regulations require keepers to register their own details, including name, address, telephone number, and their dog’s details, including name, age and description of dog, and microchip number, on a compliant database. Failure by a keeper to register the dog on a compliant database or to keep their details up to date is an offence under the 2015 Regulations. 

The cost of microchipping a dog varies but it is generally between £15 and £30. Database operators offer different packages but may charge keepers to update a record. 

The 2015 Regulations also set requirements for the databases that register the dogs and their keepers. Databases must:

• have sufficient electronic capacity to store the keepers’ details; 

• back up the data to a secure off-site facility every day; 

• provide information to an authorised person (e.g. Defra, local authority, police); 

• provide information to a registered keeper about their dog; 

• have a system for identifying authorised persons; 

• have a system for identifying keepers of registered dogs; 

• maintain records to demonstrate that they are complying with the 2015 Regulations; 

• have a system for answering the telephone and responding to online requests; 

• be able to redirect online and telephone requests relating to dogs whose details are recorded on other databases; and 

• make available to other database operators the necessary information that allows other databases to determine which microchip numbers are recorded on their database. 

There are currently 15 compliant databases that register dogs in England. Anyone enquiring about the registration of a microchip number can simply type the microchip number into any one of the compliant databases’ internet-based search facilities, or ‘lookup tools’, and the result will display the name of the database to which the number is registered.

17 November 2020

Internet of Bodies

The Internet of Bodies: Opportunities, Risks, and Governance (RAND, 2020) by Mary Lee, Benjamin Boudreaux, Ritika Chaturvedi, Sasha Romanosky, Bryce Downing comments 

A wide variety of internet-connected “smart” devices now promise consumers and businesses improved performance, convenience, efficiency, and fun. Within this broader Internet of Things (IoT) lies a growing industry of devices that monitor the human body, collect health and other personal information, and transmit that data over the internet. We refer to these emerging technologies and the data they collect as the Internet of Bodies (IoB) (see, for example, Neal, 2014; Lee, 2018), a term first applied to law and policy in 2016 by law and engineering professor Andrea M. Matwyshyn (Atlantic Council, 2017; Matwyshyn, 2016; Matwyshyn, 2018; Matwyshyn, 2019).  

IoB devices come in many forms. Some are already in wide use, such as wristwatch fitness monitors or pacemakers that transmit data about a patient’s heart directly to a cardiologist. Other products that are under development or newly on the market may be less familiar, such as ingestible products that collect and send information on a person’s gut, microchip implants, brain stimulation devices, and internet-connected toilets. 

These devices have intimate access to the body and collect vast quantities of personal biometric data. IoB device makers promise to deliver substantial health and other benefits but also pose serious risks, including risks of hacking, privacy infringements, or malfunction. Some devices, such as a reliable artificial pancreas for diabetics, could revolutionize the treatment of disease, while others could merely inflate health-care costs with little positive effect on outcomes. Access to huge torrents of live-streaming biometric data might trigger breakthroughs in medical knowledge or behavioral understanding. It might increase health outcome disparities, where only people with financial means have access to any of these benefits. Or it might enable a surveillance state of unprecedented intrusion and consequence. There is no universally accepted definition of the IoB. For the purposes of this report, we refer to the IoB, or the IoB ecosystem, as IoB devices (defined next, with further explanation in the passages that follow) together with the software they contain and the data they collect. 

An IoB device is defined as a device that

• contains software or computing capabilities 

• can communicate with an internet-connected device or network and satisfies one or both of the following: 

• collects person-generated health or biometric data 

• can alter the human body’s function. 

The software or computing capabilities in an IoB device may be as simple as a few lines of code used to configure a radio frequency identification (RFID) microchip implant, or as complex as a computer that processes artificial intelligence (AI) and machine learning algorithms. A connection to the internet through cellular or Wi-Fi networks is required but need not be a direct connection. For example, a device may be connected via Bluetooth to a smartphone or USB device that communicates with an internet-connected computer. Person-generated health data (PGHD) refers to health, clinical, or wellness data collected by technologies to be recorded or analyzed by the user or another person. Biometric or behavioral data refers to measurements of unique physical or behavioral properties about a person. Finally, an alteration to the body’s function refers to an augmentation or modification of how the user’s body performs, such as a change in cognitive enhancement and memory improvement provided by a brain-computer interface, or the ability to record whatever the user sees through an intraocular lens with a camera. 

IoB devices generally, but not always, require a physical connection to the body (e.g., they are worn, ingested, implanted, or otherwise attached to or embedded in the body, temporarily or permanently). Many IoB devices are medical devices regulated by the U.S. Food and Drug Administration (FDA). Figure 1 depicts examples of technologies in the IoB ecosystem that are either already available on the U.S. market or are under development. 

Devices that are not connected to the internet, such as ordinary heart monitors or medical ID bracelets, are not included in the definition of IoB. Nor are implanted magnets (a niche consumer product used by those in the so-called bodyhacker community, described in the next section) that are not connected to smartphone applications (apps), because although they change the body’s functionality by allowing the user to sense electromagnetic vibrations, the devices do not contain software. Trends in IoB technologies and additional examples are further discussed in the next section. 

Some IoB devices may fall in and out of our definition at different times. For example, a Wi-Fi-connected smartphone on its own would not be part of the IoB; however, once a health app is installed that requires connection to the body to track user information, such as heart rate or number of steps taken, the phone would be considered IoB. Our definition is meant to capture rapidly evolving technologies that have the potential to bring about the various risks and benefits that are discussed in this report. We focused on analyzing existing and emerging IoB technologies that appear to have the potential to improve health and medical outcomes, efficiency, and human function or performance, but that could also endanger users’ legal, ethical, and privacy rights or present personal or national security risks. 

For this research, we conducted an extensive literature review and interviewed security experts, technology developers, and IoB advocates to understand anticipated risks and benefits. We had valuable discussions with experts at BDYHAX 2019, an annual convention for bodyhackers, in February 2019, and DEFCON 27, one of the world’s largest hacker conferences, in August 2019. In this report, we discuss trends in the technology landscape and outline the benefits and risks to the user and other stakeholders. We present the current state of governance that applies to IoB devices and the data they collect and conclude by offering recommendations for improved regulation to best balance those risks and rewards.

16 March 2018

Chipper

No great surprises in the report that Meow-Ludo Disco Gamma Meow-Meow has been unsuccessful after brouhaha over his bodyhacking of a Transport for NSW (TfNSW) travel card.

Mr Meow-Meow was noted here, here and in a piece for The Conversation.

TfNSW had taken action against him on two offences: using public transport without a valid ticket, and not producing a ticket to transport officers.

Despite hyperbole about 'cyborg rights' (does everyone with a stent, a pacemaker or joint implant count as a cyborg?), he today pleaded guilty to both offences at Newtown Local Court.

The ABC reports that Mr Meow-Meow
was fined $220 for breaching the Opal Card terms of use and was ordered to pay $1,000 in legal costs. 
The lawyer representing Mr Meow Meow argued that transport legislation had advanced to include methods of contactless payment through MasterCard and some smart phones. He said that the law should adapt to all available technologies including implantable tech. 
But Magistrate Michael Quinn said, while the legislation may catch up with technology in the future, the law of the day must be followed. 
Outside court, Mr Meow Meow said he was disappointed both offences were not dismissed and that he was ordered to pay legal costs. 
Despite the decision, Mr Meow Meow said he would continue to experiment with implanted technology. He said he was planning to push the boundary even further, replacing his Opal chip with one that will hold all of his personal information, including credit cards and memberships. 
DIY unauthorised modification of credit and membership cards will breach the terms and conditions of his accounts with the card providers, so he can expect to see those businesses restricting or cancelling the relevant accounts.

17 February 2018

Biohacking and travel cards

Given that Meow-Ludo Disco Gamma Meow-Meow - noted last year - is in the news again it was timely to read 'DIY Bio: Hacking Life in Biotech’s Backyard' by Lisa C. Ikemoto in (2017) 51 University of California Davis Law Review 539.

The peripatetic Meow-Meow - recurrent political candidate, cyborg advocate and biohacking enthusiast - has unsurprisingly had his OPAL near-field transit card cancelled after he extracted the chip for subcutaneous insertion. He appears to consider that the resulting litigation - contesting a $200 fine in 2017 for riding the train without a valid ticket and reportedly planning to launch legal action against TfNSW for unlawfully cancelling his cards - will advance cyborg rights.

Australian law does not recognise 'cyborgs' as such and his action would appear to be readily addressed under the terms and conditions for use of his card.

In the Australian Capital Territory, Regulation 49 of the Road Transport (Public Passenger Services) Regulation 2002 (ACT) prohibits travelling on an ACT government bus using a ticket that has been 'damaged or defaced in a material respect' or 'changed in a material particular', with 'ticket' including a card with a chip or magnetic strip.

In NSW use of the OPAL travel card is governed by the Passenger Transport (General) Regulation 2017 (NSW). The Cards 'are and remain' the property of TransportNSW, which may 'inspect, de-activate or take possession of an Opal Card or require its return at our discretion without notice at any time'.

Users are required to 'take proper care of the Opal Card, avoid damaging it, keep it flat and not bend or pierce it' and - saliently - 'not misuse, deface, alter, tamper with or deliberately damage or destroy the Opal Card'. Further, the user must not 'alter, remove or replace any notices (other than the activation sticker), trademarks or artwork on the Opal Card'. Additionally, they must not 'modify, adapt, translate, disassemble, decompile, reverse engineer, create derivative works of, copy or read, obtain or attempt to discover by any means, any (i) encrypted software or encrypted data contained on an Opal Card; or (ii) other software or data forming part of the Opal Ticketing System'.

Meow-Meow gained attention several years ago regarding 'biohacking' (centred on a DIY community DNA-modification lab) rather than 'bodyhacking'.

Ikemoto comments
DIY biologists set up home labs in garages, spare bedrooms, or use community lab spaces. They play with plasmids, yeast, and tools like CRISPR-cas9. Media stories feature glow-in-the-dark plants, beer, and even puppies. DIY bio describes itself as a loosely formed community of individualists, working separate and apart from institutional science. This Essay challenges that claim, arguing that institutional science has fostered DIY bio and that DIY bio has, thus far, tacitly conformed to institutional science values and norms. Lack of a robust ethos leaves DIY bio ripe for capture by biotech. Yet, this Essay suggests, DIY bio could serve as a laboratory for reformulating a relationship between science and society that is less about capital accumulation and more about knowledge creation premised on participation and justice.
 She goes on
Popular media depicts biohackers or Do-It-Yourself (“DIY”) biologists as the ultimate science geeks. “DIY bio” refers to noninstitutional science or science performed outside of professional laboratories.  DIY biologists set up home labs in garages, spare bedrooms, and closets or use community lab spaces. The people doing DIY bio range from the self-taught to PhDs. Instead of building computers or creating apps, DIYers play with plasmids, jellyfish, yeast, and polymerase chain reaction in genetic engineering experiments. Media stories and DIY bio websites often feature glow-in-the-dark plants, food, petri dish art, and even puppies.
DIY bio is an emerging set of activities. A range of players, with varied ideologies, are shaping DIY bio’s trajectories. DIY bio’s signature claim is that it exists apart from, and even in opposition to, institutional science. This Essay challenges that claim. Whether all DIY biologists know this or not, DIY bio serves the interests of institutional science and is well-situated for capture by biotechnology. Biotechnology refers not only to the life sciences-based industry, but also to the neoliberal epistemology that values the use of applied science to commercialize the transformation of life itself into technology. DIY bio’s origin stories do reflect resistance to the highly structured and bureaucratic nature of institutional science. Yet these accounts also indicate interest convergence between DIY bio and institutional science. Accounts that forecast DIY bio’s future show DIY bio conforming its practices to mainstream law, policy, and market concerns. Thus far, DIY bio has not crafted its own account of the relationship between science, society, and ethics, and is falling into a science-as-usual practice that situates DIY bio in biotech’s backyard.
Part II sets out a descriptive account of biohacking, and DIY bio, in particular. Part III identifies three overlapping explanations for DIY bio. The first two, explicitly political accounts and nostalgic accounts, are largely consistent with the DIY bio claim that DIY bio is different and apart from institutional science. The third account borrows from Frederick Jackson Turner’s frontier thesis and asserts that DIY bio sustains an ideology of bio-individualism embedded in biotechnology. Part IV reviews and critiques law and policy views of DIY bio and its prospects. These views apply the frames and standards applicable to biotech. Part V makes the case for biotech’s annexation of DIY bio. Part V elaborates on DIY bio’s failure, so far, to re-define the relationship between science and society, and suggests a few initial critical points of engagement for doing so.
She suggests that
As yet, DIY bio has not expressed a commitment to ethical science activity, nor developed a robust ethos. Perhaps, its tacit acceptance of the risk-benefit framework means that its view of ethics aligns with that of institutional science. That is, it conflates a risk-benefit weighing with ethical standards or views ethics as a compliance obligation.
The risk calculus is not devoid of ethical concerns. It maps onto a standard ethical test used in institutional science. The test highlights three criteria — safety, efficacy, and autonomy. That test derives from the Belmont Report’s principlist framework, the FDA’s drug and device approval standards, and neoliberalism’s effects on the life sciences and autonomy. The Belmont Report states four principles — autonomy, beneficence, non-maleficence, and distributive justice. Autonomy’s application is informed consent. The non-maleficence principle is addressed by weighing risk to human health against benefits. Benefits refer to efficacy or improvements to human health. The FDA uses safety and efficacy as its criteria in the drug and device testing requirements for market approval. Efficacy, like safety or risk to human health, is narrowly defined. The FDA requires that the product work, but does not require that it work well or better than existing therapeutics. Market thinking has infiltrated these criteria. Claims that individual choice should trump agency standards in determining access to drugs have gained credence. This indicates that traditional bioethics’ first principle, autonomy, may now be understood as a form of free market individualism. In addition, the pharmaceutical industry has leveraged that version of autonomy to maximize the role of drugs in medical care, and the sale of particular products. While big bio’s risk calculus is not the end-all and be-all of ethics in institutional science, it is part of an impoverished ethical framework.
In 2011, the North American and European DIYbio Congresses issued Draft Codes of Ethics. The codes incorporate principles of open science — open access, transparency, and education; and self-regulation — safety (adopt safe practices), environment (respect the environment), and peaceful purposes (biotechnology should only be used for peaceful purposes). As discussed, the North American Code has one more element — Tinkering. The Code elements are general. As my characterization suggests, the Code elements, like the Belmont Report principles, lend themselves to narrow or broad readings. Read more generously, safety, environment, and peaceful purposes might move DIY bio beyond the issue of forestalling regulation to situating science as a tool for social justice. On the other hand, open access could be read as a right to access, premised on free market individualism. Tinkering invokes the individual, as the nostalgic accounts show. If DIY bio is first and foremost an individualist vision of science, it stands little chance of evolving into a new understanding of science.
The open science principles suggest that DIY bio’s ethos differs from big bio’s, and that DIY bio is not bound by big bio’s norms. Yet, open science goals do not translate to an ethics of science. Open science can be used for different goals, including forms of commercial distribution that are exploitative. In addition, the Code states the elements as universal principles, which in itself is problematic. Typically, dominant readings of so-called universal principles are used to maintain boundaries, and identify the out-group as non-compliant. It is very possible that the universal principles may be used to undercut the inclusive goals that open science asserts.
My comments in the previous subparts suggest, without prescriptive detail, the possibility of using DIY bio to redefine the possible relationship between science and society. Contemporary accounts indicate that DIY bio projects are typically small-scale and are relatively unsophisticated. As such, DIY bio seems underpowered as a platform for re-thinking the political economy of the life sciences. What I suggest here is not that DIY biologists directly challenge or redesign institutional science. Rather, DIY bio might provide an opportunity to create, by deliberate experimentation, a set of practices that are ethos-based and originate from critical social inquiry. The most valorized explanatory accounts speak, in bits and pieces, of social justice goals. Using these as a starting point, DIY bio might craft ways of doing science that embed justice-based ethics into inquiry and practice. Ethics, then, could become not a compliance checklist, but constitutive of good science.
Ikemoto concludes
 DIY bio is many things to many people. That is, undoubtedly, part of its appeal. What it is not, however, is separate and apart from institutional science. Its location in biotech’s backyard, without a fence or substantive alternative vision of DIY bio’s role, makes it vulnerable to annexation. In that scenario, DIY bio and its dream of a new science by the people might disappear. This Essay maps the relationships between DIY bio and institutional science. The mapping also critiques aspects of biotechnology that are inconsistent with DIY bio’s stated goals of access and participatory knowledge formation. If DIY bio takes those goals seriously, this Essay suggests that it move beyond compliance-based thinking, and beyond experimentation using plasmids and pipettes. Acknowledging that science is a social practice, followed by scientific-social inquiry about how and why we engage with plasmids and pipettes, and willingness to experiment with new social methods of doing science, might move DIY bio out of biotech’s backyard, and into society.

27 June 2017

Biopunks

'“Let’s pull these technologies out of the ivory tower”: The politics, ethos, and ironies of participant-driven genomic research' by Michelle L. McGowan, Suparna Choudhury, Eric T. Juengst, Marcie Lambrix, Richard A. Settersten Jr and Jennifer R. Fishman in (2017) 1 BioSocieties 1 comments
This paper investigates how groups of ‘citizen scientists’ in non-traditional settings and primarily online networks claim to be challenging conventional genomic research processes and norms. Although these groups are highly diverse, they all distinguish their efforts from traditional university- or industry-based genomic research as being ‘participant-driven’ in one way or another. Participant-driven genomic research (PDGR) groups often work from ‘labs’ that consist of servers and computing devices as much as wet lab apparatus, relying on information-processing software for data-driven, discovery-based analysis rather than hypothesis-driven experimentation. We interviewed individuals from a variety of efforts across the expanding ecosystem of PDGR, including academic groups, start-ups, activists, hobbyists, and hackers, in order to compare and contrast how they relate their stated objectives, practices, and political and moral stances to institutions of expert scientific knowledge production. Results reveal that these groups, despite their diversity, share commitments to promoting alternative modes of housing, conducting, and funding genomic research and, ultimately, sharing knowledge. In doing so, PDGR discourses challenge existing approaches to research governance as well, especially the regulation, ethics, and oversight of human genomic information management. Interestingly, the reaction of the traditional genomics research community to this revolutionary challenge has not been negative: in fact, the community seems to be embracing the ethos espoused by PDGR, at the highest levels of science policy. As conventional genomic research assimilates the ethos of PDGR, the movement’s ‘democratizing’ views on research governance are likely to become normalized as well, creating new tensions for science policy and research ethics.
'Steve Jobs, Terrorists, Gentlemen and Punks: Tracing Strange Comparisons of Biohackers' by Morgan Meyer in Joe Deville, Michael Guggenheim and Zuzana Hrdlicková (eds) Practising Comparisons: Logics, Relations, Collaborations (Mattering Press, 2016) comments
In this paper, I want to reflect and shed new light on one of my current research topics: biohacking. While I have been researching biohacking for a few years now, to date I have not yet examined its comparative dimension. The themes I have investigated thus far revolve around the materiality, boundaries, and ethics of biohacking. However, so far I have not problematised or made visible the issue of comparison, despite the fact that comparisons abound in discussions about biohackers. This article is thus an opportunity to use a comparative optics to ‘make new discoveries’ (Yengoyan 2006) on a subject that I felt I already knew well. 
Biohackers are people who hack and tinker with biology. On the one hand, the phenomenon of biohacking can be easily localised (both temporally and spatially). The movement emerged in 2007/2008 and has largely developed in large US and European cities. On the other hand, in order to understand and analyse the phenomenon, comparisons with a wide and heterogeneous set of figures are made by science journalists and practitioners alike. For example, biohackers are concurrently compared to the following: seventeenth-century gentlemen amateurs; terrorists (whom Western powers usually locate in the East); the punk movement that emerged in the 1970s and their do-it-yourself ethics; and Steve Jobs and the Homebrew Computer Club. 
The term biohacking is used today to designate a wide array of practices including the hacking of expensive scientific equipment by building cheaper alternatives; producing biosensors to detect pollutants in food and in the environment; and genetically re-engineering yoghurt to alter its taste, make it fluorescent, or produce vitamin C. Biohacking mobilises and transforms both molecular biology techniques and the ethics of hacking/open source. As such, it can be seen as a recent phenomenon. Its emergence as a distinct and visible movement can be traced back to the past eight or nine years. In 2008, for instance, DIYbio (the first association dedicated to do-it-yourself biology) was created. Two years later, the Biopunk Manifesto (2010) was written by Meredith Patterson, one of the leading figures in the biohacking movement. In addition, at the time of writing this paper, there are a number of associations, laboratories, wikis, websites, and so on, dedicated to biohacking. 
The rise of the biohacker movement has caught the attention of journalists and academics alike. Academics have followed and analysed the movement since around 2008 (see Schmidt 2008a; Bennett et al. 2009; Ledford 2010), and two books dedicated to the subject have recently been published: Biohackers: The Politics of Open Science (2013), by science and technology studies (STS) scholar Alessandro Delfanti, and Biopunk: DIY Scientists Hack the Software of Life (2011), by science journalist Marcus Wohlsen. In one way or another, this body of work has examined the ethics, risks, potentials, and openness of the movement. 
The geographical spread of biohacking – like its temporal emergence – can also be delineated. According to the main website in the field (DIYbio.org), there are currently eighty-five DIY biology laboratories in the world, of which twenty-eight are located in Europe, and thirty-five are in the US on either the east or west coast. There are now biohacker labs and biohackers in cities like New York, Boston, Paris, San Francisco, Manchester, Vienna, and in recent years, initiatives have developed in places like Japan, Indonesia, and Singapore. The political geography of biohacking (and consequently, the arguments developed in this paper) thus needs to be emphasised. The biohacker movement is developing in Western and Westernised countries; laboratories are usually located in urban or suburban settings; and English is the lingua franca for the majority of the websites, articles, mailing lists, discussions, and wikis devoted to biohacking. 
This paper focuses on how, and to what, biohackers are compared. This is a challenging question, for as we will see below, biohackers are compared to rather unlikely bedfellows. Not only are plentiful comparisons being made, but they are also drawn between different cultures and times, and between different – sometimes opposing – values and ethics. Unlike the ‘comparator’ which needs to be actively assembled, fed, and calibrated in order to provide comparisons (Deville, Guggenheim, and Hrdličková 2013), in the case of biohackers, comparisons are ‘already there’ and they are omnipresent. The frequency and disparity of these comparisons are what caught my interest in comparison and what compelled me to write this chapter. Why are such comparisons mobilised and why are such unlikely figures put side by side? What kinds of effects do such comparisons afford? How should we analyse these comparisons?
It is not unusual for hackers and computer programmers to be compared. Computer hackers, for instance, have been compared to public watchdogs, whistle-blowers, elite corps of computer programmers, artists, vandals, and criminals (see Jordan and Taylor 1998), while recent hacker networks like the Anonymous group have been compared to industrial machine breakers, and to Luddites (Deseriis 2013). The Homebrew Computer Club (initially a group of ‘hobbyists’) eventually became a group of ‘business entrepreneurs’ (see Coleman 2012), and Steve Jobs is today being compared to people like Thomas Edison or Walt Disney. 
Using biohacking as a case study, I will reflect upon and problematise comparison. The list of potential benefits of comparison is long, and it is worth mentioning a few, such as how they help to explore new, unanticipated routes; move beyond national frameworks by varying scales of analysis; and identify social patterns while highlighting the singularity of the cases studied (de Verdalle et al. 2012). The practices, methods, and problems of comparison have been discussed in a number of academic texts over the past decade or so. For instance, Richard Fox and Andre Gingrich (2002) have made an important contribution by revisiting and (re)theorising comparison. Arguing that comparison is a basic human activity that deserves academic scrutiny, they lay out a specific programme for comparative approaches. Differentiating between weak or implicit comparison, and strong and explicit comparison, Fox and Gingrich push especially for the latter and highlight their plural nature (2002: 20). The explicit focus on comparison has now become increasingly common, so that people talk of a ‘comparative turn’ in the social sciences (see Ward 2010). In this sense, comparison is actively engaged with, problematised, and theorised. This interest is visible beyond the Anglo-Saxon world as well. In France, for instance, two collections of essays on comparison were published in 2012 alone: one is in the journal Terrains et Travaux (featuring on its cover an orange and an apple – a classic image that at once depicts sameness and difference, and evokes one of the chief challenges of comparison). The other is in an edited book called Faire des Sciences Sociales: Comparer (Remaud, Schaub, and Thireau 2012). 
In this article, I want to draw on this body of work in several ways. First, I am interested in several authors’ emphases on ‘thick’ and multidimensional comparisons. Ana Barro, Shirley Jordan, and Celia Roberts (1998) have argued that comparison should be explorative, thick, and multidimensional. Jörg Niewöhner and Thomas Scheffer – who also argue for a ‘thick’ comparison – further emphasise that comparisons are performative in that ‘they connect what would otherwise remain unconnected, specify what would otherwise remain unspecified, and emphasise what would otherwise remain unrecognised’ (2008: 281). In a related way, Joe Deville, Michael Guggenheim, and Zuzana Hrdličková (this volume) talk about approaches that actively ‘provoke’ comparisons, while Tim Choy (2011) examines what comparisons do. 
Second, I do not want to ‘solve’ the issue of comparison, nor to give a coherent account of what biohackers are and what they are not. I am, rather, exploring the problems that biohackers and their identities entail. In this sense, I follow Adam Kuper (2002) who reminds us that we have to ‘begin with a problem, a question, an intuition’ (2002: 161). He further writes:
I remain convinced that methodological difficulties are the least of our problems [...] We lack questions rather than the means to answer them. What we need in order to revive the comparative enterprise is not new methods but new ideas, or perhaps simply fresh problems (Ibid. 162).
I hold that biohackers are possibly such a ‘fresh problem’ since their identity is somewhat ambiguous and unclear, and since the probable risks and innovative potential of their activities are currently being debated. Discussions about biohacking reveal that there are many uncertainties and that it seems difficult to put their identity into neat categories. The questions that seem to drive most biohacking comparisons – Who are they? How can we make sense of them? Are they to be feared or hailed? – seem to have no clear answer. 
Third, I also draw on Donna Haraway’s and Marilyn Strathern’s ideas around ‘partial connections’ and positionality. In her discussion about situated knowledge, Haraway writes:
[h]ere is the promise of objectivity: a scientific knower seeks the subject position, not of identity, but of objectivity, that is, partial connection. There is no way to ‘be’ simultaneously in all, or wholly in any, of the privileged (i.e. subjugated) positions (1988: 586).
She continues:
I am arguing for politics and epistemologies of location, positioning, and situating, where partiality and not universality is the condition of being heard to make rational knowledge claims [...] Feminism loves another science: the sciences and politics of interpretation, translation, stuttering, and the partly understood (Ibid. 589).
In her book Partial Connections (1991), Strathern further draws on Haraway’s work and uses the term ‘partial’ to say that ‘for not only is there no totality, each part also defines a partisan position’ (1991: 39). The trope of ‘partial connections’ can be – and already has been – engaged with in work on comparisons. 
For instance, Endre Dányi, Lucy Suchman and Laura Watts (cited in Witmore 2009) have compared seemingly incompatible field sites (a renewable energy industry, the Hungarian Parliament, and a research centre in Silicon Valley) and noted that there can be a ‘remarkable repetitiveness’ when these sites are connected through specific themes (such as newness, centres/peripheries, place, and landscape). Others have talked about ‘partial comparisons’ (Jensen et al. 2011) as a way to think about multiplicities while still recognising that ‘there exists no single, stable, underlying nature on which all actors have their perspectives’ (Ibid. 15). In this paper, I want to use these ideas in order to avoid one pitfall: the depiction of biohackers as a coherent whole that can simply be summed up from the different parts and comparisons reported in this article. In other words, the comparisons made can only be ‘partially connected’. I will thus refrain from taking an analytical view ‘from above’, one that is detached from what takes place ‘on the ground’. Instead, I will follow the actors themselves and consider their comparisons and knowledge claims to be valid and legitimate. In the remainder of this paper, I look in turn at four comparisons of biohackers (Steve Jobs, punks, amateurs, and terrorists). I will think with biohackers about comparison, rather than think about biohackers’ comparisons. In doing so, I not only seek to examine what comparisons do and produce, but I will also be reflexive and critical about my own previous research.