The European Holocaust Research Infrastructure (EHRI) is an EU-funded series of projects (running since 2010, with the current phase due to end in 2024) whose mission is to support the Holocaust research community by building a digital infrastructure and facilitating human networks. EHRI provides access to information about Holocaust-related sources through its Online Portal, together with tools that support researchers in working with such sources. EHRI also facilitates a network of researchers and archivists to increase cohesion among practitioners. EHRI seeks to overcome the hallmark challenge of Holocaust research: the global dispersal of the archival sources, and the concomitant fragmentation of Holocaust historiography. EHRI’s work is carried out by a consortium of 24 partner institutions, and it is now being established as a permanent European infrastructure.
https://www.ehri-project.eu/
|
Creative AI: machine learning as a medium in artistic and curatorial practice
This is a collaboration between DDH and the Serpentine Gallery, examining how new approaches to Artificial Intelligence and Machine Learning have emerged from artistic practices, with the aim of surfacing this ‘back-end’ knowledge and linking it to wider artistic and curatorial practices. Through the lens of art-making, the lab is producing knowledge for cultural institutions, artists, engineers and researchers on how to engage AI/ML as media, with the objective of developing institutional capacities to work with these media for the benefit of the wider cultural sector.
creative-ai.org/
|
AI assistants are increasingly integrated into everyday life, from the personal AI assistants running on our smartphones and in our homes, to health AI assistants. A crucial issue is how secure AI assistants are, as malicious actors may exploit vulnerabilities to make AI assistants behave in an insecure way, or security issues may be introduced accidentally through negligent use. SAIS is examining the AI assistant ecosystem, including its models and stakeholders, to propose methods for specifying, verifying and monitoring the security behaviour of AI assistants. This will provide a better understanding of security attacks on AI assistants and help to prevent or counter these attacks. The project is a cross-disciplinary collaboration between the Department of Informatics, the Department of Digital Humanities and The Policy Institute at King's College London, along with Imperial College London and non-academic partners including Microsoft, Humley, Hospify, and the general public. It is funded by the EPSRC.
https://secure-ai-assistants.github.io/
|
The Chinese initiative Digital Silk Road (DSR) is a set of loosely connected plans for China to invest in the digital sector in different countries, and it has raised fears that it will export the “Chinese internet” beyond China in order to create an alternative to the US-centred digital world. DIGISILK employs qualitative methods, digital methods, and document analysis to understand the emergence of the DSR from the ground up and from the comparative perspective of businesses, governments and ordinary people in China and three neighbouring countries: Cambodia, Myanmar and Kazakhstan. The project starts from a set of simple questions whose answers are still far from clear: What is the Digital Silk Road, exactly? What projects does it consist of, and how are they coordinated? Who is investing in it? To what extent are these projects aligned with, and directly supported by, the goals of the Chinese state, and to what extent do they deviate, following instead the goals of the corporations or people who materialise them? What specific consequences do they have for the daily lives of people in the three countries we focus on, as well as for those countries’ economies and societies?
https://www.digisilk.eu
|
Social Trust, Crisis Perceptions, and Viral Misinformation over the Course of the Covid-19 Emergency Period
Effective mitigation of the Covid-19 health crisis depends partly on the public’s trust that the measures imposed are worthwhile and that the people taking the decisions are trustworthy. This trust has come under pressure, partly because of the spread of online conspiracy theories, with various online actors exploiting the situation by spreading misinformation to generate distrust and undermine confidence in the measures. Governments and public health bodies need high-quality evidence about the dissemination of misinformation and the threats that it poses to public health and security. This project, in which the Department is collaborating with the University of Bristol and the King’s Policy Institute, is carrying out a detailed analysis of the online posts of both people who endorse conspiratorial views and those who do not, in order to identify whether endorsement of conspiratorial accounts of the pandemic undermines trust and compliance, or whether the relationship works the other way around.
https://gtr.ukri.org/projects?ref=ES%2FV015494%2F1
|
Our Data Ourselves
Our research project “Our Data Ourselves” was an AHRC-funded project at King’s College London (2013-15) that examined the personal data generated in the everyday lives of young people. Our aim was to increase our understanding of the nature and role of the data that young people produce when they use platforms and applications on their smartphones.
We co-researched this environment with members of Young Rewired State, aged between 14 and 18. Together with them, we produced an open environment for mobile big social data research, with tools, applications and an infrastructure predicated on an ethical framework for data sharing and available for widespread community use. The MobileMiner app allowed us to trace incoming and outgoing communications on mobile phones and to identify which data was leaked from the phone and to whom.
At the end of the project, we ran a series of master classes covering a range of topics in privacy and mobile phone usage as well as future possibilities for research.
King’s lead researchers: Tobias Blanke (PI); Mark Cote (CI); Jennifer Pybus.
|
PERICLES
January 2013 – March 2017
PERICLES received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement No. 601138, as part of the Digital Preservation theme.
Project Coordinator and King’s PI: Mark Hedges, Department of Digital Humanities.
PERICLES addressed the challenge of ensuring that digital content remains accessible in an environment that is subject to continual change. The project took a model-based approach that involved capturing and maintaining information about digital objects, their environments, and the processes and policies to which they are subject. PERICLES produced a portfolio of modelling languages, and tools for populating and managing the resulting models, as well as a range of examples, guidelines and best practice. The approaches were trialled in two domains: digital media artworks, and experimental scientific data from the International Space Station.
The project culminated in an international conference in London, organised jointly with the Digital Preservation Coalition, ‘Acting on Change: New Approaches and Future Practices in Long-term Digital Preservation’. The project also created a working group addressing the complex issues and challenges involved in preserving software-based artworks, which resulted in a report. Another result of the collaboration between King’s and Tate in this area was an AHRC-funded Collaborative Doctoral Partnership, for which King’s student Tom Ensom is producing a thesis on ‘Technical Narratives: Analysis, Description and Representation in the Conservation of Software-based Art’.
Image by: Rafael Lozano-Hemmer, "Subtitled Public", 2005. Sala de Arte Publico Siqueiros, Mexico City, Mexico. Photo by: Alex Dorfsman (right) Courtesy of NASA (left).
|
Read the Department of Digital Humanities’ full list of past projects