Showing posts with label Biohacking. Show all posts

17 April 2021

Biohacking

'Prescribing unapproved medical devices? The case of DIY artificial pancreas systems' by Joseph T.F. Roberts, Victoria Moore and Muireann Quigley in (2021) 21(1) Medical Law International 42-68 comments 

In response to slow progress regarding technological innovations to manage type 1 diabetes, some patients have created unregulated do-it-yourself artificial pancreas systems (DIY APS). Yet both in the United Kingdom (UK) and internationally, there is an almost complete lack of specific guidance – legal, regulatory, or ethical – for clinicians caring for DIY APS users. Uncertainty regarding their professional obligations has led to them being cautious about discussing DIY APS with patients, let alone recommending or prescribing them. In this article, we argue that this approach threatens to undermine trust and transparency. Analysing the professional guidance from the UK regulator – the General Medical Council – we demonstrate that nothing within it ought to be interpreted as precluding clinicians from initiating discussions about DIY APS. Moreover, in some circumstances, it may require that clinicians do so. We also argue that the guidance does not preclude clinicians from prescribing such unapproved medical devices. 

The authors argue

Healthcare technology innovation in type 1 diabetes (T1D) management has until recently been a relatively slow process. Patients have become tired of waiting for commercial companies to produce effective, accessible technological solutions that fully meet their needs. As a result, some patients (sometimes called ‘loopers’) are taking matters into their own hands and constructing do-it-yourself (DIY) systems to better manage their diabetes (encapsulated by #WeAreNotWaiting, used to describe the movement on social media). Utilising two increasingly available technologies – continuous glucose monitors (CGMs) and insulin pumps – patients are creating hybrid closed-loop ‘artificial pancreas’ systems (APS). They do this by connecting their pumps to their CGMs using software installed on either a small computer or their smartphones. These systems calculate and deliver the required insulin doses automatically in real time. The main aims of ‘looping’ are to optimise blood glucose and insulin control and reduce the manual (and mental) input required by patients to manage their disease. For many patients, ‘looping’ represents a welcome step forward in the management of T1D. Users of DIY APS report improved ‘time in range’ (time spent with blood glucose in the optimal range), reduced anxiety surrounding sleep, and reduced time spent doing diabetes-related tasks such as checking blood glucose levels and calculating insulin doses. Nevertheless, looping raises a number of challenges for clinicians treating patients who loop or are thinking about looping. 

These challenges, which this article will outline in detail, are exacerbated by the lack of regulatory approval for these devices. Although this article focuses on the implications of this in the United Kingdom (UK) context, the issue is an international one. No regulatory body has approved the use of these DIY devices; indeed, two have issued statements actively discouraging their use. Both the French and US regulators warn patients of the safety implications and tell healthcare professionals to be vigilant (the latter’s statement followed the report of a serious adverse event in which a DIY APS user received an excess of insulin). As such, many of the arguments in this article will be of relevance to clinicians, patients, and regulators in other jurisdictions. 

Within the UK, there is an almost complete lack of ethical or regulatory guidance for clinicians who provide care to patients using DIY systems. This results in significant uncertainty with regard to their ethical and professional obligations in this respect. Practically speaking, this has led to clinicians adopting a precautionary approach in the clinic. Generally, even clinicians who are aware of the existence of DIY systems do not discuss them as an option unless the patient raises the issue themselves. 

In this article, we do three things. First, while we acknowledge clinicians’ concerns that legal or regulatory body actions could arise if they initiate discussions around DIY APS with patients, we argue that the current approach is ethically suboptimal and stems in part from a misinterpretation of regulatory guidance. In particular, we note that the current approach may be creating a lack of transparency in clinic. Such a lack of transparency is ethically undesirable since it inhibits both clinicians’ and patients’ abilities to openly discuss the availability and benefits, as well as the potential risks associated with looping. Secondly, we examine relevant guidance from the UK regulator – the General Medical Council (GMC) (including Good Medical Practice, new consent guidance, and prescribing guidance) – and demonstrate that there is nothing in it which ought to be interpreted as requiring clinicians to refrain from discussing DIY APS with, or recommending them to, their patients. Indeed, the latest iteration of the GMC’s consent guidance, published in September 2020, could be interpreted as requiring such discussions in some circumstances. Thirdly, we go one step further and argue that, although a high degree of caution might be needed (especially as the technology diffuses out from the current core of highly expert users), GMC guidance does not preclude or prohibit clinicians from prescribing medical devices which lack regulatory approval (‘unapproved medical devices’); and to conclude otherwise is a misinterpretation of the guidance. 

In making these arguments, it is important to note that we do not include either adults who lack capacity to make treatment decisions or children. While similar issues may arise for each of these groups, there are significant differences in relation to both the legal and regulatory landscape and the ethical arguments. For example, with regard to both of these groups, consideration of whether DIY APS is in the patient’s best interests is paramount. This is likely to further influence doctors’ decision-making processes and deserves careful consideration. As such, these patient groups are outside the scope of this piece. Our primary focus within this article is on regulatory matters surrounding prescribing; this particular focus reflects concerns raised among clinicians. It should, however, be noted that although GMC guidance is designed to be consistent with UK law, it is not intended to be a statement of legal principles. Nevertheless, we acknowledge that clinicians have concerns regarding legal liability and make some brief comments on this later in ‘Discussing DIY APS: What counts as a prescription?’ and ‘Professional judgement, clinical discretion and prescribing DIY APS’ sections.

17 November 2020

Internet of Bodies

The Internet of Bodies: Opportunities, Risks, and Governance (RAND, 2020) by Mary Lee, Benjamin Boudreaux, Ritika Chaturvedi, Sasha Romanosky, Bryce Downing comments 

A wide variety of internet-connected “smart” devices now promise consumers and businesses improved performance, convenience, efficiency, and fun. Within this broader Internet of Things (IoT) lies a growing industry of devices that monitor the human body, collect health and other personal information, and transmit that data over the internet. We refer to these emerging technologies and the data they collect as the Internet of Bodies (IoB) (see, for example, Neal, 2014; Lee, 2018), a term first applied to law and policy in 2016 by law and engineering professor Andrea M. Matwyshyn (Atlantic Council, 2017; Matwyshyn, 2016; Matwyshyn, 2018; Matwyshyn, 2019). 

IoB devices come in many forms. Some are already in wide use, such as wristwatch fitness monitors or pacemakers that transmit data about a patient’s heart directly to a cardiologist. Other products that are under development or newly on the market may be less familiar, such as ingestible products that collect and send information on a person’s gut, microchip implants, brain stimulation devices, and internet-connected toilets. 

These devices have intimate access to the body and collect vast quantities of personal biometric data. IoB devices promise to deliver substantial health and other benefits but also pose serious risks, including risks of hacking, privacy infringements, or malfunction. Some devices, such as a reliable artificial pancreas for diabetics, could revolutionize the treatment of disease, while others could merely inflate health-care costs with little positive effect on outcomes. Access to huge torrents of live-streaming biometric data might trigger breakthroughs in medical knowledge or behavioral understanding. It might increase health outcome disparities, where only people with financial means have access to any of these benefits. Or it might enable a surveillance state of unprecedented intrusion and consequence. There is no universally accepted definition of the IoB. For the purposes of this report, we refer to the IoB, or the IoB ecosystem, as IoB devices (defined next, with further explanation in the passages that follow) together with the software they contain and the data they collect. 

An IoB device is defined as a device that

• contains software or computing capabilities 

• can communicate with an internet-connected device or network 

and satisfies one or both of the following: 

• collects person-generated health or biometric data 

• can alter the human body’s function. 

The software or computing capabilities in an IoB device may be as simple as a few lines of code used to configure a radio frequency identification (RFID) microchip implant, or as complex as a computer that processes artificial intelligence (AI) and machine learning algorithms. A connection to the internet through cellular or Wi-Fi networks is required but need not be a direct connection. For example, a device may be connected via Bluetooth to a smartphone or USB device that communicates with an internet-connected computer. Person-generated health data (PGHD) refers to health, clinical, or wellness data collected by technologies to be recorded or analyzed by the user or another person. Biometric or behavioral data refers to measurements of unique physical or behavioral properties about a person. Finally, an alteration to the body’s function refers to an augmentation or modification of how the user’s body performs, such as the cognitive enhancement and memory improvement provided by a brain-computer interface, or the ability to record whatever the user sees through an intraocular lens with a camera. 

IoB devices generally, but not always, require a physical connection to the body (e.g., they are worn, ingested, implanted, or otherwise attached to or embedded in the body, temporarily or permanently). Many IoB devices are medical devices regulated by the U.S. Food and Drug Administration (FDA). Figure 1 depicts examples of technologies in the IoB ecosystem that are either already available on the U.S. market or are under development. 

Devices that are not connected to the internet, such as ordinary heart monitors or medical ID bracelets, are not included in the definition of IoB. Nor are implanted magnets (a niche consumer product used by those in the so-called bodyhacker community, described in the next section) that are not connected to smartphone applications (apps), because although they change the body’s functionality by allowing the user to sense electromagnetic vibrations, the devices do not contain software. Trends in IoB technologies and additional examples are further discussed in the next section. 

Some IoB devices may fall in and out of our definition at different times. For example, a Wi-Fi-connected smartphone on its own would not be part of the IoB; however, once a health app is installed that requires connection to the body to track user information, such as heart rate or number of steps taken, the phone would be considered IoB. Our definition is meant to capture rapidly evolving technologies that have the potential to bring about the various risks and benefits that are discussed in this report. We focused on analyzing existing and emerging IoB technologies that appear to have the potential to improve health and medical outcomes, efficiency, and human function or performance, but that could also endanger users’ legal, ethical, and privacy rights or present personal or national security risks. 

For this research, we conducted an extensive literature review and interviewed security experts, technology developers, and IoB advocates to understand anticipated risks and benefits. We had valuable discussions with experts at BDYHAX 2019, an annual convention for bodyhackers, in February 2019, and DEFCON 27, one of the world’s largest hacker conferences, in August 2019. In this report, we discuss trends in the technology landscape and outline the benefits and risks to the user and other stakeholders. We present the current state of governance that applies to IoB devices and the data they collect and conclude by offering recommendations for improved regulation to best balance those risks and rewards.

26 April 2020

COVID Cyborgs

'The COVID Cyborg and Protecting the Unaugmented Human' by Kate Galloway in (2020) Alternative Law Journal comments
This article examines the increasing tendency towards governance of people through their representation via data. In its most contemporary iteration, the COVID-19 pandemic has raised the prospect of contact tracing apps. While public discourse about the apps has focused principally on the important issue of data privacy, there are other possible effects whereby participation in such schemes might become a prerequisite to accessing services or basic rights—either from government or from corporations. The pathway to acceptability of applying our data in this way is already paved, through fitness monitors and other technologies by which we represent ourselves. This article sets out the foundation of such technologies and their application, before outlining their effect on the recognised boundaries of governance and the conception of the holder of rights and the substance of those rights. 
Galloway argues
In 2018, biohacker Meow-Ludo Disco Gamma Meow-Meow was found guilty of travelling on Sydney buses without a valid ticket. Rather than carrying Sydney transport’s Opal Card with him, he had instead implanted its chip into his hand. He had indeed tapped on when entering the bus—so had paid for his trip. However, Sydney transport authorities were not satisfied with this, alleging that he had breached the card’s terms of use. 
Meow-Meow claimed that his case was based on the principle of ‘cyborg rights’. The modification of his body through embedding technology-capable hardware is a feature of a posthuman evolution, a ‘leaky distinction between animal-human and machine’. As an activist pushing the boundaries of the definition of human, Meow-Meow was simultaneously pushing the boundaries of the rights held by an altered human before the law. 
The science fiction-like nature of body modification is occurring in more prosaic ways. A pacemaker, for example, might transmit data about its human operating system in the same way that Meow-Meow’s Opal card chip transmitted data concerning payment of a bus fare. Whether therapeutic interventions properly constitute a ‘cyborg’ remains an open question, however to the extent that they might, the pacemaker example certainly poses less of a challenge to our general conceptions of humanity than does a more extreme bodily modification, possibly undertaken by oneself. 
Machines and other hardware (and software) may be implanted within us, but more readily we are enhancing our physical capabilities by carrying them on our person. Smartphones are ubiquitous; they extend our intellectual capacity and ability to communicate, even provide biophysical feedback for life-giving treatments, and share myriad personal data with government and corporations alike. Fitness trackers worn on the wrist measure our physiological signs, not only re-presenting them to their wearer as a variety of metrics by way of graphs and icons, but also sharing them with other users and their corporate creators. Our devices also call for biometric data to unlock their features. We readily submit to fingerprints and facial recognition, granting global corporates the most intimate of insights into ourselves. 
At the same time as we have willingly released aspects of ourselves, through our data, in the private sphere our government has constructed a surveillance architecture affording security services wide scope for access to our telecommunications data and our biometric data. Although governments have pushed through the suite of legislation for over a decade, this has not come without a cost. The uptake of My Health Record, a putative personal database of one’s medical information, has been poor. And now, in the thrall of a pandemic, government is proposing a contact tracing app whereby a user’s proximity to another person (within 1.5m for more than 15 minutes) would be identified through Bluetooth technology, encrypted, and recorded in the app. If a contact is diagnosed with COVID-19, then all contacts would be notified of that. 
Critique of the app—at the time of writing not yet released—has generally been concerned with data privacy per se. This is, of course, important. However, independently of data privacy is a question the opposite to that encountered by Meow-Meow. For the foreseeable future, and in particular while we are in a declared public health emergency, our infection status regarding COVID-19 is central to our freedom, and indeed, to wider societal freedom. In that sense, a tracing app—and its data—effectively function as an extension of ourselves. They are a means of reassurance not only to public health officials running the program, but to wider society, that we, collectively, are safe. Meow-Meow exercised his freedom to extend the functioning of his body by inserting the Opal card chip. But will we be free from extending our corporeal body through the incorporeal data contained in a contact tracing app? Without making the app mandatory, there are multiple ways that it might entrench itself within society to create classes of people: those whose provenance is known (via the app) and those whose provenance is not. 
This article suggests that the COVID-19 pandemic will test the boundaries of our personhood in a new way. Despite the existing state/corporate data infrastructure whereby others are able to construct a picture of our most intimate lives, there is not yet a universally compelling basis for production of personal data as a threshold for acceptance into places or institutions. Contact tracing may present one. And if our data is to be carried with us as an integral and qualifying part of our interface with the world around us, it may be considered as part of our person. To the extent that our data engagement differentiates us from other humans, the question arises of the protections available at law. In particular, with an ‘extended’ human, the question arises about where recognised boundaries of governance lie, whether the extended human is the bearer of rights, and if so, what is the substance of those rights. 
Part II outlines the basis on which our data is effectively an extension of ourselves, and as such constitutes the extended human as a species of ‘cyborg’ following Haraway’s interpretation. Part III then hypothesises about a variety of social contexts that might prefer or demand what I call here a ‘COVID cyborg’—a person enhanced by their COVID tracing data—to the exclusion of those not so enhanced. It envisages our society comprising two classes of people: the COVID cyborg, and the unaugmented human. Unlike the experience of Meow-Meow, the COVID cyborg has the potential to be embraced, effectively affording them rights superior to those of the unaugmented human. If this is to be the case, the law needs to comprehend both cyborg and unaugmented status as equal subjects of protection.

31 March 2019

Implantables and biometrics

A slow news day at the Canberra Times, with a breathless item today announcing that 'More than a hundred Canberrans have implantable microchips'.

Yes, we are back in Meow-Ludo Disco Gamma Meow-Meow territory again, this time with reporting about an Australian entrepreneur who's an advocate of implanted tags.

The CT states
A convicted hacker is selling implantable microchips able to store data including credit card details, and more than 100 Canberrans have signed on to the new technology. 
Chip My Life, an Australian company which imports the technology and sells it domestically, has sold more than 100 microchips to ACT residents since operations began in 2016. 
The microchips are the size of a grain of rice and are implanted into the webbing between the thumb and forefinger in a procedure that takes less than a minute. The procedure can be carried out by specialists in Sydney, and a Canberra-based clinic is slated for the coming months. 
The company's co-founder, convicted hacker Skeeve Stevens, said interest in the technology had surged in the national capital in recent years. 
He said the microchip was inserted into the webbing between the finger and thumb because the area had fewer pain nerves. 
In 1995, Mr Stevens was sentenced to three years in prison for stealing and publishing the credit card numbers of 1200 AUSNet subscribers. ... 
Mr Stevens laughed off privacy concerns about his business, saying the conviction didn't seem to matter in other parts of his career where he speaks to government departments about the implications of new technology. 
He said he does not have access to the data being placed on the chips. While the company provided the microchip, users were responsible for inputting their personal information. 
"In the three years we've been going, we've sent out around 1600 microchips," Mr Stevens said. "We don't know what people do with the chips once we send it to them," he said. "It's more secure than the card in your hip pocket or your wallet if you're using it for the same type of purposes." 
Mr Stevens is among those who have an implanted microchip. His holds codes for his front door and garage at home. He said the majority of microchip users use the technology to store data normally found in swipe cards. 
"Some of the chips use the same technology as swipe-card technology in that it holds a single serial number to open something like a garage door. "The other type of chips can hold a lot more information and people use it store personal information and things like credit card details."
An irreverent friend commented that 100 implants (primarily in black t-shirt-wearing male geeks?) do not amount to a revolution, and wondered whether there had been a similar uptake of Prince Alberts and other things that make most people scratch their heads or go ouch.

'Use and acceptance of biometric technologies in 2017' (AIC Trends and Issues in Crime and Criminal Justice) by Russell G Smith, Alexandra Gannoni and Susan Goldsmid comments 
As part of the Australian Government’s National Identity Security Strategy (AGD 2013), a sample of Australians were surveyed about their experience of identity crime and misuse and how they responded to the problem. In addition to finding out how prevalent misuse of personal information is, the surveys asked respondents to indicate how willing they were to use various biometric technologies to protect their personal information (Emami, Brown and Smith 2016). 
This paper presents the findings of the surveys conducted in 2014, 2016 and 2017 that relate specifically to previous use of biometrics and willingness to use biometrics in the future. It provides updated information on the findings of the 2014 survey, which was the first to assess the willingness of Australian victims of identity crime to use biometrics to enhance the security of their personal information (Emami, Brown and Smith 2016). The other more general findings of the identity crime and misuse surveys have been published elsewhere (Goldsmid, Gannoni and Smith 2018; Smith, Brown and Harris-Hogan 2015; Smith and Hutchings 2014; Smith and Jorna 2018). 
The place of biometrics in identity security 
Biometric technologies use an individual’s unique physiological or behavioural attributes as a means of identification. They include fingerprint matching, signature analysis, or recognition of a person’s retina, iris, face or voice. Facial recognition is now considered to be the dominant biometric technology globally and the one most likely to increase in use over the next few years. The findings of the Biometrics Institute’s annual surveys of members since 2010 have shown that facial recognition has continued to grow in importance. These surveys are conducted annually and are sent by the Biometrics Institute to its 6,000 individual members and other stakeholders worldwide. In June 2018, 310 individuals responded to the survey, representing suppliers of biometrics (48%), users (38%) and other interested organisations and industries (14%; Biometrics Institute 2018). The 2018 survey found that 47 percent of respondents considered facial recognition to be the biometric technology most likely to be on the increase over the next few years (Biometrics Institute 2018). This was followed by iris recognition (8%), fingerprint recognition (7%) and voice recognition (6%). A further 19 percent of survey respondents considered that multi-modal approaches that combine various biometrics would be most likely to increase over the next few years (Biometrics Institute 2018). 
Biometric technologies are currently used by a range of organisations in Australia to verify the identities of the people with whom they deal. For example, the Department of Home Affairs collects biometric information including fingerprints and facial images from offshore visa applicants, onshore protection visa applicants, immigration detainees, and certain categories of airline passengers (Department of Home Affairs 2018). 
Australian airports have facial recognition capabilities, known as SmartGates, that enable travellers with ePassports from Australia, New Zealand, the United Kingdom, Switzerland, Singapore and the United States to process themselves rather than undergoing the customs and immigration checks that are usually conducted by Australian Border Force officers (Department of Home Affairs 2018). Standards for the interoperability of biometric systems have also been developed to promote the effective operation of biometric systems between various government agencies (AGD 2012). 
In addition, biometrics have been introduced to verify an individual’s identity in a range of other settings. For example, a number of financial institutions are considering using biometric technologies such as fingerprint recognition for payment card authentication and for mobile banking services instead of passwords and PINs (Saarinen 2017). Iris recognition has also been used for cardless ATM transactions (Kim 2015). In Australia, the National Australia Bank and Microsoft have collaborated to design a proof of concept ATM using biometrics, cloud and artificial intelligence technologies; this would enable customers to withdraw cash from ATMs using facial recognition technology and a PIN (Planet Biometrics 2018). 
Respondents to the Biometrics Institute’s survey in 2018 indicated that the most significant development in the use of biometrics during the last 12 months related to border control/security, accounting for 20 percent of responses. This was followed by online identity verification (12%), large-scale national identity deployments (9%), financial services (8%) and mobile payments/m-commerce, device access and surveillance (these last three types accounting for 7 percent each). Respondents also indicated that over the next five years the most important developments would occur in relation to online identity verification (20%), large-scale national identity deployments (11%) and border control/security (11%; Biometrics Institute 2018).
The authors conclude
As the use of digital technologies has become more widespread, and identity crime and misuse have continued to increase, the computer security industry has sought to improve avenues for the efficient and secure authentication of users’ identities. Existing systems that rely on username and password combinations have become problematic as criminals have become more adept at compromising passwords. The proliferation of username and password combinations has also made it impossible for users to manage this information without resorting to insecure ways of remembering their passwords, or having to purchase and use automated password management software (Emami, Brown and Smith 2016). 
Biometric technologies seek to solve this problem by enabling individuals to use their biological attributes as a means of identifying themselves. This report presents the findings of recent surveys that sought to quantify the extent to which a sample of Australians have made use of different biometrics in the past, and how willing they would be to use the selected biometrics in the future to minimise the risk of criminal misuse of personal information. 
With the rise of international security incidents, a balance must be struck between the need for personal security and the need for privacy and confidentiality of personal information. Prior research has found that concerns over privacy, data loss and spoofing (attempting to overcome biometric recognition systems) are important factors restraining the biometrics market. The present surveys confirmed that respondents were concerned about the privacy and confidentiality of personal information when using biometrics, particularly with systems operating outside government control. Understanding people’s perceptions of risk and their willingness to use technology as a security solution is of critical importance in devising appropriate policy measures that will be effective on the one hand, and accepted by the community on the other (Emami, Brown and Smith 2016).  
The current survey research showed that a relatively small percentage of respondents had used the specified biological biometrics in the past, but that use increased significantly between 2016 and 2017. It also showed that almost half (48%) of respondents in 2017 were willing to use one of the four biological biometrics to protect personal information in the future, and that this was a three percentage point increase on the same finding in 2016. Between 2016 and 2017 all the biometrics examined showed a statistically significant increase in user acceptance. In 2017, nine percent of respondents even reported being willing to use implanted chips to protect their personal information. It was also found that older respondents were significantly more willing to use any of the four biological biometrics than younger respondents, perhaps indicating greater concern among older Australians regarding the security of their personal information, or perhaps their need to guard their assets and life savings from theft. Alternatively, younger people might be reluctant to use technologies that appear to be complex and could be seen to impede their immediate access to information in the online world. Clearly, ongoing monitoring of these attitudes is needed to ensure that future generations of users are willing to use any biometric systems that are implemented. Respondents who reported recent victimisation were also significantly more likely than other respondents to report a willingness to use voice and iris recognition as well as chip implantation to protect their personal information, but not fingerprint or facial recognition systems. 
As the biometrics market continues to develop, further research is needed to understand users’ behaviour and willingness to use biometric technologies, particularly facial recognition and multimodal systems that combine various biometrics, which are developing strongly. Evidence is needed of the extent to which such systems are vulnerable to fraud and misuse and how individuals respond to victimisation and reinstate their personal information following compromise. In addition, evidence is needed of the crime displacement effects of introducing biometrics and how criminal behaviour adapts and changes as a result of enhanced user authentication processes. In particular, risks of violent crime and duress inflicted on users need to be examined and strategies developed to address any such problems.
The work is interesting both because it will be embraced by particular advocates and because much of the data is very discordant, with the authors recurrently noting inconsistencies regarding claimed exposure to specific technologies rather than mere wariness in stated attitudes. Like much professional research, it is an invitation to a deeper and more comprehensive study.

14 December 2018

Cultures

I occasionally allow myself to step away from writing about data protection (privacy, confidentiality, secrecy) and health sector regulation by taking a walk on the wild side. Here's the abstract from a presentation at Griffith Law School earlier this week and an associated book chapter.
Bullies, Blokes and Buggery: Homosociality, Justice and Male Rape through an Australian lens 
The depiction in Australian cinema of male-on-male sexual assault offers a lens for understanding homosociality and justice within Australia and across the globe. 
Male rape – an assault that objectifies the victim and valorises the perpetrator as both powerful and outside the rules – is a recurring but largely unrecognised feature of the Australian screen. It is evident in for example iconic works such as Wake in Fright (1971), The Chant of Jimmie Blacksmith (1978), Mad Max (1979) and Ghosts of the Civil Dead (1988). Those works often use a distinctly Australian landscape, one that is recognisably not the American West or Scandinavia. 
They involve brutality in an environment in which legal authority – conventions about rules and remedies – is absent, weak or indifferent. It is an environment in which bystanders, the homosocial ‘mates’ whose deepest emotional relationships are with each other, are contemptuous or even amused by the ‘unmanning’ of a victim through force or intoxication, placed outside their brotherhood and without the redemptive ending in for example The Shawshank Redemption (1994). 
The chapter suggests that the films offer a view of belonging, power and exclusion that is at odds with the celebration of difference in Priscilla, Queen of the Desert (1994) or Holding The Man (2015) and with adventures such as Deliverance (1972). If ‘mateship’ is a distinctively, although increasingly fictive, Australian value the films offer a dark view of complicity and violence within the sunburnt country, a land of sweeping plains, kangaroos and eyes wide shut to brutality. At a global level they tell us something interesting about anxieties at the heart of manhood and about the efficacy of law where victimisation excludes men from justice.
En route I caught up with 'Biohacking' by Ali K. Yetisen in (2018) 36(8) Trends in Biotechnology 744.

Yetisen comments
Biohacking is a do-it-yourself citizen science merging body modification with technology. The motivations of biohackers include cybernetic exploration, personal data acquisition, and advocating for privacy rights and open-source medicine. The emergence of a biohacking community has influenced discussions of cultural values, medical ethics, safety, and consent in transhumanist technology. 
Epidermal electronics, biosensors, and artificial intelligence have converged as healthcare technologies to monitor patients in point-of-care settings within the Internet of Things (IoT). These technologies have created a community of hobbyist software developers involved in the quantified-self movement. The self-experimentalist community is primarily interested in tracking their daily physical and biochemical activities to build a library of personal informatics in order to maintain a healthy lifestyle or improve body performance. The growing interest in this ‘tech-savvy’ community has motivated questioning the possibility of experimenting with implantable technologies. The emergence of implantables for biometric animal identification has encouraged self-experimentalists to chipify themselves in order to interact with computers in the IoT. Inspired by transhumanism, which advocates the enhancement of human body and intelligence by technology, the overlap between self-experimentation and medical implant domains has created a vision to modify the human body and document their experiences in social media for open-source medicine. 
The movement of biohacking began with a self-experimentation project (Cyborg 1.0, 1998) of Kevin Warwick, who implanted a radio frequency identification (RFID) tag in his arm in order to control electronic devices. In another experiment, a multielectrode array was implanted in Warwick’s arm to create a neural interface, which allowed controlling a robotic arm and establishing a telepathy system with another human implantee via the Internet. Self-experimentation with biomaterials has also been popularized with the performance art works of Stelarc, who had a scaffold implanted in his arm (Third Ear, 2007). The synergy of cybernetics, biopunk, and citizen science has led to the formation of a media-activist biohacking community. Figures in this transhumanist community include Amal Graafstra (tagger), Tim Cannon, Lepht Anonym, and Neil Harbisson. These technology activists, also known as grinders, implant chips in their bodies or have them implanted. Their primary motivations include human–electronic device communication, self-quantification, and cosmetic enhancement. Another overarching goal of this community is to increase scientific literacy as citizen scientists. The biohacking community is actively engaged in the development of off-the-shelf protocols at low cost, open access research and collaboration by creating individual pursuit of inquiry. Biohackers document and share their protocols, equipment designs, and experiences on the Internet.
The article has a useful inventory of implants.

'The Security Implications of Synthetic Biology' by Gigi Gronvall, a more insightful piece in (2018) 60(4) Survival: Global Politics and Strategy 165-180, comments
Advances in synthetic biology hold great promise, but to minimise security threats, national and international regulation will need to keep pace. Consumers have grown accustomed to personalised products. There are T-shirts made to order, books printed on demand, music-streaming services that cater to individual tastes, personalised news feeds and lists of suggested apps. 
This trend towards personalisation has even been extended to biology: genetic information and biological techniques can now be used by individuals to meet their personal needs. Biological information, such as the number of steps one takes in a day, one's heart rate or one's genetic code, has become trackable, and can be compiled for individualised purposes. Biological laboratory techniques, once the sole purview of scientific professionals, are likewise becoming increasingly accessible to amateurs, yielding information such as what a person eats or where they live. The trend towards the personalisation of biology would not be possible without synthetic biology, a growing technical field that aims to make biology easier to engineer. Synthetic biology is widely seen as an exciting new branch of the life sciences, but can be difficult to define. One group of researchers has described synthetic biology as ‘a) the design and fabrication of biological components and systems that do not already exist in the natural world and b) the re-design and fabrication of existing biological systems’. Others define synthetic biology in terms of what the field aims to do: make biology easier to engineer. While bioengineering has been around for a while, synthetic biology is more powerful: it has been described as ‘genetic engineering on steroids’ by one of its founding practitioners. Synthetic-biology tools, such as CRISPR (clustered regularly interspaced short palindromic repeats) for gene editing, gene synthesis and gene drives, are being used in a wide range of life sciences.
Scientists working in synthetic biology envision a time when biological traits, functions and products may be programmed like a computer. While there is a great deal of research yet to be done to allow for this, the convergence of high-speed computing power, intense research interest and some early commercial successes during the last decade has spurred the growth of the field. Publications about synthetic biology have increased from 170 per year in 2000–05 to more than 1,200 per year in 2015. More than 700 research organisations in over 40 countries are undertaking work in the field.
One major outcome of this growth is that biology is becoming industrialised. While biological processes have long been used in industrial settings – for example, to produce some medicines and vaccines, as well as certain consumer products such as beer and wine – they are increasingly being exploited for manufacturing, replacing the use of petrochemicals and resource-intensive harvesting from nature. Synthetic biology is now used to alter the internal machinery of microbes so that they produce a variety of desired molecules, from biofuels to flavour compounds to pharmaceuticals.  This has expanded the biological footprint of a range of industries including fuel, agriculture, medicines and mining, and of products such as construction materials, perfumes, fibres and adhesives. The economic implications of synthetic biology are vast and growing: the global market was valued at $3.9 billion in 2016, and is anticipated to grow at an annual rate of 24.4% to reach over $11bn by 2021. McKinsey and Company has reported that the total economic impact of synthetic biology, including applications in energy, agriculture and chemicals, could reach $700bn to $1.6 trillion annually by 2025.
While clearly useful on an industrial scale, synthetic biology can also be useful to individuals. It can yield information that would never merit a traditional research grant from the National Institutes of Health (NIH) or the Wellcome Trust. In contrast to the research funded by agencies like these, which is intended to foster benefits at a societal level, personalisation allows for the acquisition of information and products that are immediately useful to particular individuals. Scientific advances and the democratisation of synthetic biology should bring about an exciting future, but will also lead to changes in national and international security, the governance of biological research, and safety. 
Do-it-yourself biology 
Synthetic biology has already produced one of the most promising developments in cancer treatments for years, known as chimeric antigen receptor T-cell therapy, or CAR-T therapies.  In this treatment, a patient's own T cells are altered in a laboratory so that they will attack cancer cells. The Food and Drug Administration (FDA) has approved two CAR-T therapies, one to treat children with acute lymphoblastic leukaemia and the other to treat adults with advanced lymphomas. The complete remission rate in a trial of 100 adults with refractory or relapsed large B-cell lymphoma was 51%. 
The trend towards the personalisation of biology is not limited to FDA-approved therapies, but is also in the hands of individuals curious about their own bodies. There is intense public interest in harvesting and making sense of personal biological information from health-monitoring devices.  Services like 23andMe and Ancestry.com provide clients with detailed genetic information, including clues – and sometimes surprises – about their ancestry. Their users can find out whether they potentially have a higher likelihood of developing breast cancer (as established by the presence of BRCA genes) or Parkinson's disease. 
PatientsLikeMe is another example of a service generating personalised health information. On this for-profit site, people who suffer from one or more of 2,800 listed conditions share their medical data and reactions to investigational drugs. The company claims that patients who use their service will learn more about their medications and conditions, make connections with others who share their illnesses, and ultimately ‘change the future of personalised health’.  The data provided to this site has led to original published research, and to the development of an easier way to enrol patients in clinical studies.  
Non-traditional research environments, including home- or community-based laboratories, are becoming more common, an approach that has been called DIY Bio (do-it-yourself biology), bio-hacking or citizen science. Community laboratories where bio enthusiasts can gather and work together, alongside many more DIY communities that lack laboratory space, have been established in New York, Boston, Seattle, San Francisco and Baltimore – as well as in Budapest, Manchester, Munich, Paris and Prague.  According to DIYBio.org, a charitable organisation formed with the mission of ‘establishing a vibrant, productive and safe community of DIY biologists’, there were 44 DIY Bio groups across the US and Canada, 31 in Europe, and 17 in Asia, South America and Oceania as of early June 2018.  These laboratories, which typically charge membership fees to purchase equipment, are dedicated to making science accessible and frequently offer educational programmes. 
The Baltimore Underground Science Space (BUGSS), for example, recently held a class for people aged ten and up to learn about bioluminescence in bacteria, during which a gadget was built that puffs air into bacterial cultures to make the bacteria glow.  Participants were directed to take a stool sample at home and to quickly inactivate it so that no living microbes were brought into the laboratory. At the lab, the participants attempted to use polymerase chain reaction (PCR) to amplify the DNA of the microbes so as to identify them. Participants could also compare samples taken before and after embarking on a diet, or of two different people. 
In the hands of amateurs, straightforward ‘DNA-barcoding’ techniques can be used to determine whether purchased sushi is actually made from the species advertised.  Other techniques can be used to detect the presence of melamine, a poison, in baby formula.  The ease of use offered by such technologies has inspired new biological services as well. For instance, apartment-complex owners have required stool samples from tenants’ pets to genetically identify them, for the purpose of identifying and deterring those who do not pick up after them.
The pipeline for non-traditional biological exploration is expanding, thanks to iGEM, the International Genetically Engineered Machine competition. iGEM began more than a decade ago as a class offered at the Massachusetts Institute of Technology (MIT) in Cambridge, MA, that was modelled on robotics competitions intended to draw students into engineering fields.  In iGEM competitions, teams comprising undergraduates from around the world are given a kit of standard biological parts called BioBricks. Over a summer, and with the help of instructors, the teams use the parts and others they create to engineer biological systems and operate them within living cells. The competition has grown from involving fewer than two dozen undergraduates in its early years to drawing more than 6,000 undergraduates, high-school students, DIY Bio practitioners and ‘overgrads’ per year from more than 40 countries, with 30,000 alumni having already participated. Many of the projects aim to tackle real-world problems and to develop solutions that can be used in low-resource settings, such as a bacteria-produced blood substitute that may be stored for long periods. 
As people acquire more biological information about their environment, they will increasingly have the opportunity to make more personalised and biologically informed choices to improve their health, pursue new hobbies and even care for new types of pets. While these are positive outcomes, there is also the potential for negative outcomes, given the possibility that synthetic biology could be misused to cause deliberate harm. There will also be many new opportunities for quackery and dangerous self-experimentation that could spread via social media and thus become a contagious phenomenon. Biological safety practices will be challenged, and there could be some unwelcome surprises.

18 July 2018

Welfare Cards and Chippers

The ANAO report on The Implementation and Performance of the Cashless Debit Card Trial considers implementation and evaluation of the Cashless Debit Card trial by the Department of Social Services.

The ABC astringently reports 'Cashless welfare audit finds data on effectiveness severely flawed, but Government maintains scheme is working', going on to comment
The report said it was "difficult to conclude" whether there had been a reduction in social harm, such as alcoholism and violence, because there was a "lack of robustness in data collection". 
It pointed to missing data as part of the problem, such as hospital admission figures for Kununurra and Wyndham. 
The audit office also conducted some of its own analysis and came up with different figures to what the Social Services Minister was told. 
For example, the minister was advised that there were fewer ambulance call-outs in September 2016 compared to the previous year. However, when ANAO took seasonality into account and analysed the data over a longer period, it found a 17 per cent increase in call-outs between April and October 2016 compared to the previous year. 
It was a similar story with school attendance. The minister was told there was an increase, but ANAO analysis found it dropped for Indigenous students after the implementation of the trial.
The actual report states
Welfare quarantining, in the form of income management, was first introduced in 2007 as part of the Australian Government’s Northern Territory National Emergency Response. 
The aim of income management is to assist income support recipients to manage their fortnightly payments — such as Newstart/Youth Allowance, parenting or carer payments, and the Disability Support Pension — for essentials like food, rent and bills. 
On 1 December 2014, the Government agreed to trial a new approach to income management — the Cashless Debit Card (CDC), in Ceduna and the East Kimberley. The Cashless Debit Card Trial (CDCT or the trial) aimed to: test whether social harm caused by alcohol, gambling and drug misuse can be reduced by placing a portion (up to 80 per cent) of a participant’s income support payment onto a card that cannot be used to buy alcohol or gambling products or to withdraw cash; and inform the development of a lower cost welfare quarantining solution to replace current income management arrangements. 
On 14 March 2017, the Minister for Human Services and the Minister for Social Services announced the extension of the trial in Ceduna and the East Kimberley for a further 12 months. In addition, funding was allocated as part of the 2017–18 Budget to trial the CDC in two new locations with the Government announcing in September 2017 that the CDC would be delivered to the Goldfields region of Western Australia and also to the Hinkler Electorate (Bundaberg and Hervey Bay Region) in Queensland. 
Subsequently, the Social Services Legislation Amendment (Cashless Debit Card) Act 2018 received royal assent on 20 February 2018. The amendments restricted the expansion of the CDC, with the cashless welfare arrangements continuing to 30 June 2019 in the current trial areas of East Kimberley and Ceduna, with one new trial site in the Goldfields. 
Rationale for undertaking the audit 
Recent ANAO audits have highlighted the need for entities to articulate mechanisms to determine whether an innovation is successful and what can be learned to inform decision making regarding scaling up the implementation of that innovation. The CDCT was selected for audit to identify whether the Department of Social Services (Social Services) was well placed to inform any further roll-out of the CDC with a robust evidence base. Further, the audit aimed to provide assurance that Social Services had established a solid foundation to implement the trial including: consultation and communication with the communities involved; governance arrangements; the management of risks; and robust procurement arrangements. 
Audit objective and criteria 
The objective of the audit was to assess the Department of Social Services’ implementation and evaluation of the Cashless Debit Card Trial. 
To form a conclusion against the audit objective, the ANAO adopted the following high level audit criteria: Appropriate arrangements were established to support the implementation of the Cashless Debit Card Trial. The performance of the Cashless Debit Card Trial was adequately monitored, evaluated and reported on, including to the Minister for Social Services. 
Audit methodology 
The audit methodology included: examining and analysing documentation relating to the implementation, risk management, monitoring and evaluation for the Cashless Debit Card Trial; and interviews with key officials in the departments of Social Services and Prime Minister and Cabinet and with external stakeholders including Indue Limited (Indue), ORIMA Research (ORIMA), Community Leaders, Local Partners and others in the trial sites. 
Conclusion 
The Department of Social Services largely established appropriate arrangements to implement the Cashless Debit Card Trial, however, its approach to monitoring and evaluation was inadequate. As a consequence, it is difficult to conclude whether there had been a reduction in social harm and whether the card was a lower cost welfare quarantining approach. 
Social Services established appropriate arrangements for consultation, communicating with communities and for governance of the implementation of CDCT. Social Services was responsive to operational issues as they arose during the trial. However, it did not actively monitor risks identified in risk plans and there were deficiencies in elements of the procurement processes. 
Arrangements to monitor and evaluate the trial were in place although key activities were not undertaken or fully effective, and the level of unrestricted cash available in the community was not effectively monitored. Social Services established relevant and mostly reliable key performance indicators, but they did not cover some operational aspects of the trial such as efficiency, including cost. There was a lack of robustness in data collection and the department’s evaluation did not make use of all available administrative data to measure the impact of the trial including any change in social harm. Aspects of the proposed wider roll-out of the CDC were informed by learnings from the trial, but the trial was not designed to test the scalability of the CDC and there was no plan in place to undertake further evaluation.
Other findings are
Implementation of the Cashless Debit Card Trial 
Social Services conducted an extensive consultation process with industry and stakeholders in the trial sites. A communication strategy was developed and implemented which was largely effective, although Social Services identified areas for improvement in future rollouts. 
There were appropriate governance arrangements in place with clearly defined roles and responsibilities across key departments and stakeholders for reporting and oversight of the CDCT. 
Social Services demonstrated an integrated approach to risk management across the department linking enterprise, program and site-specific risk plans. While a CDCT program risk register was developed, the identified risks were not actively managed, some risks were not rated in accordance with the Risk Management Framework, there was inadequate reporting of risks and some key risks were not adequately addressed by the controls or treatments identified. In particular, treatments were inadequate to address evaluation data and methodology risks that were ultimately realised. Social Services managed and effectively addressed operational issues as they arose. 
Aspects of the procurement process to engage the card provider and evaluator were not robust. The department did not document a value for money assessment for the card provider’s IT build tender or assess all evaluators’ tenders completely and consistently. 
Social Services effectively established or facilitated arrangements to deliver local support to CDCT communities, although there were delays in the deployment of additional support services. As part of the CDCT, Social Services also trialled Community Panels and reviewed their effectiveness to inform broader implementation. 
Performance monitoring, evaluation and reporting 
A strategy to monitor and analyse the CDCT was developed and approved by the Minister. However, Social Services did not complete all the activities identified in the strategy (including the cost-benefit analysis) and did not undertake a post-implementation review of the CDCT despite its own guidance and its advice to the Minister that it would do a review. There was scope for Social Services to more closely monitor vulnerable participants who may participate in social harm and their access to cash. 
Key performance indicators (KPIs) developed to measure the performance of the trial were relevant, mostly reliable but not complete because they focused on evaluating only the effectiveness of the trial based on its outcomes and did not include the operational and efficiency aspects of the trial. There was no review of the KPIs during the trial and KPIs have not been established for the extension of the CDC. 
Social Services developed high level guidance to support its approach to evaluation, but the guidance was not fully operationalised. Social Services did not build evaluation into the CDCT design, nor did they collaborate and coordinate data collection to ensure an adequate baseline to measure the impact of the trial, including any change in social harm.   
Social Services regularly reported on aspects of the performance of the CDCT to the Minister but the evidence base supporting some of its advice was lacking. Social Services advised the Minister, after the conclusion of the 12 month trial, that ORIMA’s costs were greater than originally contracted and ORIMA did not use all relevant data to measure the impact of the trial, despite this being part of the agreed Evaluation Framework. 
Social Services undertook a review and reported to the Minister on a number of key lessons learned from the 12 month trial of the CDC. Learnings about the effectiveness of the Community Panels were based on the number of applications received and delays in decision making, rather than from the evaluation findings that noted a delay in the establishment of the Community Panels and a lack of communication with participants. The 12 month trial did not test the scalability of the CDC but tested a limited number of policy parameters identified in the development of the CDC. Many of the findings from the trial were specific to the cohort (predominantly indigenous) and remote location, and there was no plan in place to continue to evaluate the CDC to test its roll-out in other settings.
The ANAO makes the following recommendations
1 Social Services should confirm risks are rated according to its Risk Management Framework and ensure mitigation strategies and treatments are appropriate and regularly reviewed.  
2 Social Services should employ appropriate contract management practices to ensure service level agreements and contract requirements are reviewed on a timely basis. 
3 Social Services should ensure a consistent and transparent approach when assessing tenders and fully document its decisions. 
4 Social Services should undertake a cost-benefit analysis and a post-implementation review of the trial to inform the extension and further roll-out of the CDC. 
5 Social Services should fully utilise all available data to measure performance, review its arrangements for monitoring, evaluation and collaboration between its evaluation and line areas, and build evaluation capability within the department to facilitate the effective review of evaluation methodology and the development of performance indicators. 
6 Social Services should continue to monitor and evaluate the extension of the Cashless Debit Card in Ceduna, East Kimberley and any future locations to inform design and implementation.
The ABC has meanwhile reported that biohacker and 'chipper' enthusiast Meow-Ludo Disco Gamma Meow-Meow has had his 2018 Opal Card conviction overturned.

Meow-Meow had pleaded guilty to using public transport without a valid ticket and for not producing a ticket to transport officers. That conviction reflects his disassembly of the near-field Opal Card, with the hacker - in what he reportedly claimed was an advance for cyborg rights - inserting the Opal Card chip under his skin.

Not a major innovation, given work - legally and ethically problematical or otherwise - by a range of identification-by-chip enterprises over the past 15 years, noted for example in 'Ethics and indemnification regarding the VeriChip' by Virginia Ashby Sharpe in (2008) 8(8) The American Journal of Bioethics 49-50 and 'The security implications of VeriChip cloning' by John Halamka, Ari Juels, Adam Stubblefield and Jonathan Westhues in (2006) 13(6) Journal of the American Medical Informatics Association 601-607.

As noted earlier in this blog, Meow-Meow was fined $220 in the Local Court for breaching the Opal Card terms of use and was ordered to pay $1,000 in legal costs. On appeal to the NSW District Court the conviction was quashed, with Dina Yehia J reportedly taking into account Meow-Meow's good character and commenting that, although there were legal issues of general deterrence, she was of the view that the objective seriousness of the offence fell towards the lower end of the range, if not the bottom. He had no prior convictions and had not tampered with the Opal Card in order to avoid paying the fare.

Unsurprisingly Meow-Meow is reported as saying that he was pleased with the outcome, did not encourage anyone else to implant an Opal Card chip into their skin and would not do it again without permission from Transport New South Wales. On to the next appearance in the limelight?

16 March 2018

Chipper

No great surprises in the report that Meow-Ludo Disco Gamma Meow-Meow has been unsuccessful after the brouhaha over his bodyhacking of a Transport for NSW (TfNSW) travel card.

Mr Meow-Meow was noted here, here and in a piece for The Conversation.

TfNSW had taken action against him over two offences: using public transport without a valid ticket and failing to produce a ticket to transport officers.

Despite hyperbole about 'cyborg rights' (does everyone with a stent, a pacemaker or joint implant count as a cyborg?), he today pleaded guilty to both offences at Newtown Local Court.

The ABC reports that Mr Meow-Meow
was fined $220 for breaching the Opal Card terms of use and was ordered to pay $1,000 in legal costs. 
The lawyer representing Mr Meow Meow argued that transport legislation had advanced to include methods of contactless payment through MasterCard and some smart phones. He said that the law should adapt to all available technologies including implantable tech. 
But Magistrate Michael Quinn said, while the legislation may catch up with technology in the future, the law of the day must be followed. 
Outside court, Mr Meow Meow said he was disappointed both offences were not dismissed and that he was ordered to pay legal costs. 
Despite the decision, Mr Meow Meow said he would continue to experiment with implanted technology. He said he was planning to push the boundary even further, replacing his Opal chip with one that will hold all of his personal information, including credit cards and memberships. 
DIY unauthorised modification of credit and membership cards will breach the terms and conditions of his accounts with those providers, so he can expect to see the relevant businesses restricting or cancelling the accounts.

17 February 2018

Biohacking and travel cards

Given that Meow-Ludo Disco Gamma Meow-Meow - noted last year - is in the news again it was timely to read 'DIY Bio: Hacking Life in Biotech’s Backyard' by Lisa C. Ikemoto in (2017) 51 University of California Davis Law Review 539.

The peripatetic Meow-Meow - recurrent political candidate, cyborg advocate and biohacking enthusiast - has unsurprisingly had his Opal near-field transit card cancelled after he extracted the chip for subcutaneous insertion. He appears to consider that the resulting litigation - he is contesting a $200 fine imposed in 2017 for riding the train without a valid ticket and is reportedly planning legal action against TfNSW for unlawfully cancelling his cards - will advance cyborg rights.

Australian law does not recognise 'cyborgs' as such and his action would appear to be readily addressed under the terms and conditions for use of his card.

In the Australian Capital Territory, Regulation 49 of the Road Transport (Public Passenger Services) Regulation 2002 (ACT) prohibits travelling on an ACT government bus using a ticket that has been 'damaged or defaced in a material respect' or 'changed in a material particular', with 'ticket' including a card with a chip or magnetic strip.

In NSW use of the Opal travel card is governed by the Passenger Transport (General) Regulation 2017 (NSW) and the card's terms of use. The cards 'are and remain' the property of Transport for NSW, which may 'inspect, de-activate or take possession of an Opal Card or require its return at our discretion without notice at any time'.

Users are required to 'take proper care of the Opal Card, avoid damaging it, keep it flat and not bend or pierce it' and - saliently - 'not misuse, deface, alter, tamper with or deliberately damage or destroy the Opal Card'. Further, the user must not 'alter, remove or replace any notices (other than the activation sticker), trademarks or artwork on the Opal Card'. Additionally, they must not 'modify, adapt, translate, disassemble, decompile, reverse engineer, create derivative works of, copy or read, obtain or attempt to discover by any means, any (i) encrypted software or encrypted data contained on an Opal Card; or (ii) other software or data forming part of the Opal Ticketing System'.

Meow-Meow gained attention several years ago regarding 'biohacking' (centred on a DIY community DNA-modification lab) rather than 'bodyhacking'.

Ikemoto comments
DIY biologists set up home labs in garages, spare bedrooms, or use community lab spaces. They play with plasmids, yeast, and tools like CRISPR-cas9. Media stories feature glow-in-the-dark plants, beer, and even puppies. DIY bio describes itself as a loosely formed community of individualists, working separate and apart from institutional science. This Essay challenges that claim, arguing that institutional science has fostered DIY bio and that DIY bio has, thus far, tacitly conformed to institutional science values and norms. Lack of a robust ethos leaves DIY bio ripe for capture by biotech. Yet, this Essay suggests, DIY bio could serve as a laboratory for reformulating a relationship between science and society that is less about capital accumulation and more about knowledge creation premised on participation and justice.
 She goes on
Popular media depicts biohackers or Do-It-Yourself (“DIY”) biologists as the ultimate science geeks. “DIY bio” refers to noninstitutional science or science performed outside of professional laboratories.  DIY biologists set up home labs in garages, spare bedrooms, and closets or use community lab spaces. The people doing DIY bio range from the self-taught to PhDs. Instead of building computers or creating apps, DIYers play with plasmids, jellyfish, yeast, and polymerase chain reaction in genetic engineering experiments. Media stories and DIY bio websites often feature glow-in-the-dark plants, food, petri dish art, and even puppies.
DIY bio is an emerging set of activities. A range of players, with varied ideologies, are shaping DIY bio’s trajectories. DIY bio’s signature claim is that it exists apart from, and even in opposition to, institutional science. This Essay challenges that claim. Whether all DIY biologists know this or not, DIY bio serves the interests of institutional science and is well-situated for capture by biotechnology. Biotechnology refers not only to the life sciences-based industry, but also to the neoliberal epistemology that values the use of applied science to commercialize the transformation of life itself into technology. DIY bio’s origin stories do reflect resistance to the highly structured and bureaucratic nature of institutional science. Yet these accounts also indicate interest convergence between DIY bio and institutional science. Accounts that forecast DIY bio’s future show DIY bio conforming its practices to mainstream law, policy, and market concerns. Thus far, DIY bio has not crafted its own account of the relationship between science, society, and ethics, and is falling into a science-as-usual practice that situates DIY bio in biotech’s backyard.
Part II sets out a descriptive account of biohacking, and DIY bio, in particular. Part III identifies three overlapping explanations for DIY bio. The first two, explicitly political accounts and nostalgic accounts, are largely consistent with the DIY bio claim that DIY bio is different and apart from institutional science. The third account borrows from Frederick Jackson Turner’s frontier thesis and asserts that DIY bio sustains an ideology of bio-individualism embedded in biotechnology. Part IV reviews and critiques law and policy views of DIY bio and its prospects. These views apply the frames and standards applicable to biotech. Part V makes the case for biotech’s annexation of DIY bio. Part VI elaborates on DIY bio’s failure, so far, to re-define the relationship between science and society, and suggests a few initial critical points of engagement for doing so.
She suggests that
As yet, DIY bio has not expressed a commitment to ethical science activity, nor developed a robust ethos. Perhaps, its tacit acceptance of the risk-benefit framework means that its view of ethics aligns with that of institutional science. That is, it conflates a risk-benefit weighing with ethical standards or views ethics as a compliance obligation.
The risk calculus is not devoid of ethical concerns. It maps onto a standard ethical test used in institutional science. The test highlights three criteria — safety, efficacy, and autonomy. That test derives from the Belmont Report’s principlist framework, the FDA’s drug and device approval standards, and neoliberalism’s effects on the life sciences and autonomy. The Belmont Report states four principles — autonomy, beneficence, non-maleficence, and distributive justice. Autonomy’s application is informed consent. The non-maleficence principle is addressed by weighing risk to human health against benefits. Benefits refer to efficacy or improvements to human health. The FDA uses safety and efficacy as its criteria in the drug and device testing requirements for market approval. Efficacy, like safety or risk to human health, is narrowly defined. The FDA requires that the product work, but does not require that it work well or better than existing therapeutics. Market thinking has infiltrated these criteria. Claims that individual choice should trump agency standards in determining access to drugs have gained credence. This indicates that traditional bioethics’ first principle, autonomy, may now be understood as a form of free market individualism. In addition, the pharmaceutical industry has leveraged that version of autonomy to maximize the role of drugs in medical care, and the sale of particular products. While big bio’s risk calculus is not the end-all and be-all of ethics in institutional science, it is part of an impoverished ethical framework.
In 2011, the North American and European DIYbio Congresses issued Draft Codes of Ethics. The codes incorporate principles of open science — open access, transparency, and education; and self-regulation — safety (adopt safe practices), environment (respect the environment), and peaceful purposes (biotechnology should only be used for peaceful purposes). As discussed, the North American Code has one more element — Tinkering. The Code elements are general. As my characterization suggests, the Code elements, like the Belmont Report principles, lend themselves to narrow or broad readings. Read more generously, safety, environment, and peaceful purposes might move DIY bio beyond the issue of forestalling regulation to situating science as a tool for social justice. On the other hand, open access could be read as a right to access, premised on free market individualism. Tinkering invokes the individual, as the nostalgic accounts show. If DIY bio is first and foremost an individualist vision of science, it stands little chance of evolving into a new understanding of science.
The open science principles suggest that DIY bio’s ethos differs from big bio’s, and that DIY bio is not bound by big bio’s norms. Yet, open science goals do not translate to an ethics of science. Open science can be used for different goals, including forms of commercial distribution that are exploitative. In addition, the Code states the elements as universal principles, which in itself is problematic. Typically, dominant readings of so-called universal principles are used to maintain boundaries, and identify the out-group as non-compliant. It is very possible that the universal principles may be used to undercut the inclusive goals that open science asserts.
My comments in the previous subparts suggest, without prescriptive detail, the possibility of using DIY bio to redefine the possible relationship between science and society. Contemporary accounts indicate that DIY bio projects are typically small-scale and are relatively unsophisticated. As such, DIY bio seems underpowered as a platform for re-thinking the political economy of the life sciences. What I suggest here is not that DIY biologists directly challenge or redesign institutional science. Rather, DIY bio might provide an opportunity to create, by deliberate experimentation, a set of practices that are ethos-based and originate from critical social inquiry. The most valorized explanatory accounts speak, in bits and pieces, of social justice goals. Using these as a starting point, DIY bio might craft ways of doing science that embed justice-based ethics into inquiry and practice. Ethics, then, could become not a compliance checklist, but constitutive of good science.
Ikemoto concludes
DIY bio is many things to many people. That is, undoubtedly, part of its appeal. What it is not, however, is separate and apart from institutional science. Its location in biotech’s backyard, without a fence or substantive alternative vision of DIY bio’s role, makes it vulnerable to annexation. In that scenario, DIY bio and its dream of a new science by the people might disappear. This Essay maps the relationships between DIY bio and institutional science. The mapping also critiques aspects of biotechnology that are inconsistent with DIY bio’s stated goals of access and participatory knowledge formation. If DIY bio takes those goals seriously, this Essay suggests that it move beyond compliance-based thinking, and beyond experimentation using plasmids and pipettes. Acknowledging that science is a social practice, followed by scientific-social inquiry about how and why we engage with plasmids and pipettes, and willingness to experiment with new social methods of doing science, might move DIY bio out of biotech’s backyard, and into society.

27 June 2017

Biopunks

'“Let’s pull these technologies out of the ivory tower”: The politics, ethos, and ironies of participant-driven genomic research' by Michelle L. McGowan, Suparna Choudhury, Eric T. Juengst, Marcie Lambrix, Richard A. Settersten Jr and Jennifer R. Fishman in (2017) 1 BioSocieties 1 comments
This paper investigates how groups of ‘citizen scientists’ in non-traditional settings and primarily online networks claim to be challenging conventional genomic research processes and norms. Although these groups are highly diverse, they all distinguish their efforts from traditional university- or industry-based genomic research as being ‘participant-driven’ in one way or another. Participant-driven genomic research (PDGR) groups often work from ‘labs’ that consist of servers and computing devices as much as wet lab apparatus, relying on information-processing software for data-driven, discovery-based analysis rather than hypothesis-driven experimentation. We interviewed individuals from a variety of efforts across the expanding ecosystem of PDGR, including academic groups, start-ups, activists, hobbyists, and hackers, in order to compare and contrast how they relate their stated objectives, practices, and political and moral stances to institutions of expert scientific knowledge production. Results reveal that these groups, despite their diversity, share commitments to promoting alternative modes of housing, conducting, and funding genomic research and, ultimately, sharing knowledge. In doing so, PDGR discourses challenge existing approaches to research governance as well, especially the regulation, ethics, and oversight of human genomic information management. Interestingly, the reaction of the traditional genomics research community to this revolutionary challenge has not been negative: in fact, the community seems to be embracing the ethos espoused by PDGR, at the highest levels of science policy. As conventional genomic research assimilates the ethos of PDGR, the movement’s ‘democratizing’ views on research governance are likely to become normalized as well, creating new tensions for science policy and research ethics.
'Steve Jobs, Terrorists, Gentlemen and Punks: Tracing Strange Comparisons of Biohackers' by Morgan Meyer in Joe Deville, Michael Guggenheim and Zuzana Hrdlicková (eds) Practising Comparisons: Logics, Relations, Collaborations (Mattering Press, 2016) comments
In this paper, I want to reflect and shed new light on one of my current research topics: biohacking. While I have been researching biohacking for a few years now, to date I have not yet examined its comparative dimension. The themes I have investigated thus far revolve around the materiality, boundaries, and ethics of biohacking. However, so far I have not problematised or made visible the issue of comparison, despite the fact that comparisons abound in discussions about biohackers. This article is thus an opportunity to use a comparative optics to ‘make new discoveries’ (Yengoyan 2006) on a subject that I felt I already knew well. 
Biohackers are people who hack and tinker with biology. On the one hand, the phenomenon of biohacking can be easily localised (both temporally and spatially). The movement emerged in 2007/2008 and has largely developed in large US and European cities. On the other hand, in order to understand and analyse the phenomenon, comparisons with a wide and heterogeneous set of figures are made by science journalists and practitioners alike. For example, biohackers are concurrently compared to the following: seventeenth-century gentlemen amateurs; terrorists (whom Western powers usually locate in the East); the punk movement that emerged in the 1970s and their do-it-yourself ethics; and Steve Jobs and the Homebrew Computer Club. 
The term biohacking is used today to designate a wide array of practices including the hacking of expensive scientific equipment by building cheaper alternatives; producing biosensors to detect pollutants in food and in the environment; and genetically re-engineering yoghurt to alter its taste, make it fluorescent, or produce vitamin C. Biohacking mobilises and transforms both molecular biology techniques and the ethics of hacking/open source. As such, it can be seen as a recent phenomenon. Its emergence as a distinct and visible movement can be traced back to the past eight or nine years. In 2008, for instance, DIYbio (the first association dedicated to do-it-yourself biology) was created. Two years later, the Biopunk Manifesto (2010) was written by Meredith Patterson, one of the leading figures in the biohacking movement. In addition, at the time of writing this paper, there are a number of associations, laboratories, wikis, websites, and so on, dedicated to biohacking. 
The rise of the biohacker movement has caught the attention of journalists and academics alike. Academics have followed and analysed the movement since around 2008 (see Schmidt 2008a; Bennet et al. 2009; Ledford 2010), and two books dedicated to the subject have recently been published: Biohackers: The Politics of Open Science (2013), by science and technology studies (STS) scholar Alessandro Delfanti, and Biopunk: DIY Scientists Hack the Software of Life (2011), by science journalist Marcus Wohlsen. In one way or another, this body of work has examined the ethics, risks, potentials, and openness of the movement. 
The geographical spread of biohacking – like its temporal emergence – can also be delineated. According to the main website in the field (DIYbio.org), there are currently eighty-five DIY biology laboratories in the world, of which twenty-eight are located in Europe, and thirty-five are in the US on either the east or west coast. There are now biohacker labs and biohackers in cities like New York, Boston, Paris, San Francisco, Manchester, Vienna, and in recent years, initiatives have developed in places like Japan, Indonesia, and Singapore. The political geography of biohacking (and consequently, the arguments developed in this paper) thus needs to be emphasised. The biohacker movement is developing in Western and Westernised countries; laboratories are usually located in urban or suburban settings; and English is the lingua franca for the majority of the websites, articles, mailing lists, discussions, and wikis devoted to biohacking. 
This paper focuses on how, and to what, biohackers are compared. This is a challenging question, for as we will see below, biohackers are compared to rather unlikely bedfellows. Not only are plentiful comparisons being made, but they are also drawn between different cultures and times, and between different – sometimes opposing – values and ethics. Unlike the ‘comparator’ which needs to be actively assembled, fed, and calibrated in order to provide comparisons (Deville, Guggenheim, and Hrdličková 2013), in the case of biohackers, comparisons are ‘already there’ and they are omnipresent. The frequency and disparity of these comparisons are what caught my interest in comparison and what compelled me to write this chapter. Why are such comparisons mobilised and why are such unlikely figures put side by side? What kinds of effects do such comparisons afford? How should we analyse these comparisons?
It is not unusual for hackers and computer programmers to be compared. Computer hackers, for instance, have been compared to public watchdogs, whistle-blowers, elite corps of computer programmers, artists, vandals, and criminals (see Jordan and Taylor 1998), while recent hacker networks like the Anonymous group have been compared to industrial machine breakers, and to Luddites (Deseriis 2013). The Homebrew Computer Club (initially a group of ‘hobbyists’) eventually became a group of ‘business entrepreneurs’ (see Coleman 2012), and Steve Jobs is today being compared to people like Thomas Edison or Walt Disney. 
Using biohacking as a case study, I will reflect upon and problematise comparison. The list of potential benefits of comparison is long, and it is worth mentioning a few, such as how they help to explore new, unanticipated routes; move beyond national frameworks by varying scales of analysis; and identify social patterns while highlighting the singularity of the cases studied (de Verdalle et al. 2012). The practices, methods, and problems of comparison have been discussed in a number of academic texts over the past decade or so. For instance, Richard Fox and Andre Gingrich (2002) have made an important contribution by revisiting and (re)theorising comparison. Arguing that comparison is a basic human activity that deserves academic scrutiny, they lay out a specific programme for comparative approaches. Differentiating between weak or implicit comparison, and strong and explicit comparison, Fox and Gingrich push especially for the latter and highlight their plural nature (2002: 20). The explicit focus on comparison has now become increasingly common, so that people talk of a ‘comparative turn’ in the social sciences (see Ward 2010). In this sense, comparison is actively engaged with, problematised, and theorised. This interest is visible beyond the Anglo-Saxon world as well. In France, for instance, two collections of essays on comparison have been published in 2012 alone: one is in the journal Terrains et Travaux (featuring on its cover an orange and an apple – a classic image that at once depicts sameness and difference, and is one of the chief challenges of comparison). The other is in an edited book called Faire des Sciences Sociales: Comparer (Remaud, Schaub, and Thireau 2012). 
In this article, I want to draw on this body of work in several ways. First, I am interested in several authors’ emphases on ‘thick’ and multidimensional comparisons. Ana Barro, Shirley Jordan, and Celia Roberts (1998) have argued that comparison should be explorative, thick, and multidimensional. Jörg Niewöhner and Thomas Scheffer – who also argue for a ‘thick’ comparison – further emphasise that comparisons are performative in that ‘they connect what would otherwise remain unconnected, specify what would otherwise remain unspecified, and emphasise what would otherwise remain unrecognised’ (2008: 281). In a related way, Joe Deville, Michael Guggenheim, and Zuzana Hrdličková (this volume) talk about approaches that actively ‘provoke’ comparisons, while Tim Choy (2011) examines what comparisons do. 
Second, I do not want to ‘solve’ the issue of comparison, nor tell a coherent account of what biohackers are and what they are not. I am, rather, exploring the problems that biohackers and their identities entail. In this sense, I follow Adam Kuper (2002) who reminds us that we have to ‘begin with a problem, a question, an intuition’ (2002: 161). He further writes:
I remain convinced that methodological difficulties are the least of our problems [...] We lack questions rather than the means to answer them. What we need in order to revive the comparative enterprise is not new methods but new ideas, or perhaps simply fresh problems (Ibid. 162).
I hold that biohackers are possibly such a ‘fresh problem’ since their identity is somewhat ambiguous and unclear, and since the probable risks and innovative potential of their activities are currently being debated. Discussions about biohacking reveal that there are many uncertainties and that it seems difficult to put their identity into neat categories. The questions that seem to drive most biohacking comparisons – Who are they? How can we make sense of them? Are they to be feared or hailed? – seem to have no clear answer. 
Third, I also draw on Donna Haraway’s and Marilyn Strathern’s ideas around ‘partial connections’ and positionality. In her discussion about situated knowledge, Haraway writes:
[h]ere is the promise of objectivity: a scientific knower seeks the subject position, not of identity, but of objectivity, that is, partial connection. There is no way to ‘be’ simultaneously in all, or wholly in any, of the privileged (i.e. subjugated) positions (1988: 586).
She continues:
I am arguing for politics and epistemologies of location, positioning, and situating, where partiality and not universality is the condition of being heard to make rational knowledge claims [...] Feminism loves another science: the sciences and politics of interpretation, translation, stuttering, and the partly understood (Ibid. 589).
In her book Partial Connections (1991), Strathern further draws on Haraway’s work and uses the term ‘partial’ to say that ‘for not only is there no totality, each part also defines a partisan position’ (1991: 39). The trope of ‘partial connections’ can be – and already has been – engaged with in work on comparisons. 
For instance, Endre Dányi, Lucy Suchman and Laura Watts (cited in Witmore 2009) have compared seemingly incompatible field sites (a renewable energy industry, the Hungarian Parliament, and a research centre in Silicon Valley) and noted that there can be a ‘remarkable repetitiveness’ when these sites are connected through specific themes (such as newness, centres/peripheries, place, and landscape). Others have talked about ‘partial comparisons’ (Jensen et al. 2011) as a way to think about multiplicities while still recognising that ‘there exists no single, stable, underlying nature on which all actors have their perspectives’ (Ibid. 15). In this paper, I want to use these ideas in order to avoid one pitfall: the depiction of biohackers as a coherent whole that is able to be summated according to the different parts and comparisons reported in this article. In other words, the comparisons made can only be ‘partially connected’. I will thus refrain from taking an analytical view ‘from above’, one that is detached from what takes place ‘on the ground’. Instead, I will follow the actors themselves and consider their comparisons and knowledge claims to be valid and legitimate. In the remainder of this paper, I look in turn at four comparisons of biohackers (Steve Jobs, punks, amateurs, and terrorists). I will think with biohackers about comparison, rather than think about biohackers’ comparisons. In doing so, I not only seek to examine what comparisons do and produce, but I will also be reflexive and critical about my own previous research.