“When we think of drug discovery, we normally do not consider technology misuse potential. We are not trained to consider it, and it is not even required for machine learning research, but we can now share our experience with other companies and individuals.”
The paper’s authors
24 March 2022
Artificial intelligence could be repurposed to create new biochemical weapons
A new paper, co-authored by King’s academic Dr Filippa Lentzos, should act as a “wake-up call” to those using artificial intelligence (AI) technologies for drug discovery.
Drug discovery companies using artificial intelligence to search for new compounds need to be more sensitive to the risk that their technology could be repurposed to create biochemical weapons, a new paper warns.
The paper, published in Nature Machine Intelligence, describes how a thought experiment to deliberately optimize for harm turned into a computational proof. The paper’s authors, including Dr Filippa Lentzos of the Department of War Studies and Department of Global Health & Social Medicine at King’s, say their experience should serve as “a wake-up call” to those in the AI drug discovery community.
As part of a biennial arms control conference in Switzerland examining the implications of new technologies for chemical and biological weapons threats, the drug development company Collaborations Pharmaceuticals was invited to present on how AI in drug discovery could be misused. “The thought had never previously struck us,” the authors say. “We have spent decades using computers and AI to improve human health, not to degrade it.”
To prepare for their presentation, the company took a piece of its drug-discovery software and reversed one of its functions: instead of penalising toxicity, it rewarded it. Within hours, the technology had generated 40,000 molecules predicted to be highly toxic, including a known nerve agent so lethal that just a few salt-sized grains can kill a person. The process also generated new molecules that appeared even more toxic.
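The paper does not release the underlying code, but the inversion it describes is conceptually simple. The sketch below is an illustrative assumption of how such a scoring function might look; all names (score_candidate, predict_activity, predict_toxicity) are hypothetical and not from the paper:

```python
# Illustrative sketch only: the paper does not publish its code, and all
# names here are hypothetical stand-ins for a generative model's components.

def score_candidate(molecule, predict_activity, predict_toxicity,
                    penalise_toxicity=True):
    """Score a generated molecule inside a generative model's optimisation loop.

    In normal drug discovery the toxicity prediction lowers the score,
    steering the generator away from harmful compounds. Flipping the sign
    makes the same machinery steer toward them.
    """
    activity = predict_activity(molecule)   # desired therapeutic effect
    toxicity = predict_toxicity(molecule)   # predicted harm to humans
    if penalise_toxicity:
        return activity - toxicity  # standard objective: effective and safe
    return activity + toxicity      # inverted objective: toxicity rewarded
```

The striking point, and the one the authors stress, is how small the change is: a single sign in the objective, with the rest of the pipeline left untouched.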
While no physical molecules were made as part of the exercise, the authors point out that many companies offer chemical synthesis services and that “this area is poorly regulated, with few if any checks to prevent the synthesis of new, extremely toxic agents that could potentially be used as chemical weapons.”
The paper calls for a public discussion of repurposing potential among those in the AI drug discovery community, highlighting the substantial risk to the field if its technology were misused in this way: “it only takes one bad apple, such as an adversarial state or other actor looking for a technological edge, to cause actual harm.”
“We hope that by raising awareness of this technology, we will have gone some way toward demonstrating that although AI can have important applications in healthcare and other industries, we should also remain diligent against the potential for dual use.”
The paper’s authors
They outline several recommendations, including:
- Discussion of the topic at major scientific conferences, and the inclusion of broader-impact statements in applications to funding bodies and review boards.
- Continued efforts to raise awareness of the dual-use potential of cutting-edge technologies, so as to promote responsible conduct.
- Using a public-facing Application Programming Interface (API) for models, with code and data available upon request, to enhance security and control over how published models are used (a minimal sketch of this gating pattern follows the list).
- A reporting structure or hotline to alert authorities to any security lapse or to anyone developing toxic molecules for non-therapeutic uses.
- For universities to redouble their efforts in the ethical training of science students and those from other disciplines, especially computing students, so that they are aware of the potential for misuse of AI from an early stage of their careers and understand its potential for broader impact.
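On the API recommendation above: the idea is that the model stays on the provider’s servers, so users receive predictions but never the model itself. A minimal sketch of one way such gating might be implemented, assuming a FastAPI service and a hypothetical predict_toxicity wrapper (neither is specified in the paper):

```python
# Hypothetical sketch of a gated model-serving API: every prediction
# request is authenticated and logged. The framework (FastAPI) and all
# names are illustrative assumptions, not from the paper.
from fastapi import FastAPI, Header, HTTPException

app = FastAPI()
APPROVED_KEYS = {"key-issued-after-vetting"}  # keys issued to vetted users only

def predict_toxicity(smiles: str) -> float:
    """Placeholder for a served model; returns a dummy score here."""
    return 0.0

@app.post("/predict")
def predict(smiles: str, x_api_key: str = Header(...)):
    if x_api_key not in APPROVED_KEYS:
        raise HTTPException(status_code=403, detail="Unauthorised")
    # Logging every query supports the reporting structure the authors
    # propose: usage can be audited for misuse patterns after the fact.
    print(f"audit: key={x_api_key!r} query={smiles!r}")
    return {"toxicity": predict_toxicity(smiles)}
```

Because the model weights never leave the provider, an objective inversion like the one described above could not be reproduced simply by downloading the software, and the audit trail complements the reporting structure the authors also call for.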
Read the full paper here.
Sean Ekins and Filippa Lentzos also discussed this at the event ‘How are emerging technologies (re)-shaping the security landscape?’, held on 19 January 2022 as part of the ‘War Studies at 60’ series. You can watch it again here.