Inkvisible was a hybrid framework combining digital projected graffiti, game mechanics and event organising, applied within the interior space of an art gallery or museum. There were four main playtesting days for trialling the prototype, and Nikki Pugh blogged about each of them on her website.
Playtesting Day One: Getting to know the medium, June 2013
Nikki and Ben trialled Graffiti Research Lab's L.A.S.E.R Tag software in the interior space of Birmingham Museum and Art Gallery. Positive interactions with gallery visitors and staff provided support for the development of the project. Nikki observed that it was immediately obvious that the software was something that drew people of all ages in and could be used as a catalyst for conversation.
Playtesting Day Two: Provocations, June 2013
Nikki, Gretchen and Linda first met with Birmingham Museum and Art Gallery's conservator to discuss which art objects the laser technology could be used on, given that there was little published research regarding the use of laser pointers within museums. This was followed by various trials of the technology in different spaces and with different art objects throughout the gallery, and with a variety of visitors and staff. One visitor was drawn to the graffitied paintings, Nikki states, "as if to a car crash". She was horrified at the prospect of having marked over the paintings, but found the process very intriguing.
Nikki observed that, in contrast to the previous week, when the team had the projector and laptop out in plain view and people readily came up to ask what was happening, this week the mechanics were somewhat hidden and people tended to stay back and watch from a distance. It was almost as if visible equipment acted as an invitation for people to come up and ask what was happening and how it worked.
Playtesting Day Three: Associations, assumptions and frustrations, July 2013
The team aimed to undertake a final trial of the prototype technology in a particular space within the gallery, chosen following the previous playtesting day. However, a number of issues raised doubts about the reliability of the technology, and the team decided not to organise and promote a public event at which the prototype would be launched.
Nevertheless, during day three more visitors wanted to take photos of the graffitied paintings, as well as to have their photo taken with their own laser creation, including a member of staff at the museum.
Playtesting Day Four: Smooth moves and scribbles, July 2013
Nikki, Ben and Gretchen trialled a new technology using a Kinect body-tracking sensor and a modified version of BlitzTag, also by Graffiti Research Lab, in various spaces around the gallery. The technology seemed more reliable and offered a more embodied experience. Nikki observed that the projected line almost felt elastic at times and that participants seemed to get really absorbed in their movements. It was a different experience from the version that tracks a laser pointer, as it was more about the movement of the body. This final playtesting day was also the first time the team had been able to successfully project onto 3D sculpture (as seen below).
However, there were still a number of site- and user-specific bugs that would need to be resolved before serious use. Again, there were very positive interactions with a range of users, and much insight was gained into different possible event prototypes.
Project team
Nikki Pugh
Nikki is an artist who explores questions relating to how people perceive, move through and interact with their surroundings. She harnesses various tools and techniques adopted from walking-based practices, guided tours, physical computing, locative media, pervasive gaming, installation and collaboration.
Specific areas of research interest within her practice currently include: the use of making, prototyping and participatory playtesting as tools for—and sites of—knowledge production; and investigating the relationship of her practice to Mobilities Studies, non-representational theory and human+technology+place assemblages.
Dr Gretchen Larsen
Gretchen is currently a Senior Lecturer in Marketing at Durham University. Prior to joining Durham, Gretchen was a Lecturer at King's College London, where she was the Director of the MSc International Marketing. She has also worked at the University of Bradford and the University of Otago (New Zealand), and held visiting posts at EADA (Spain) and LKEAM (Poland).
Gretchen’s research is located within interpretive and critical consumer research, at the intersection of consumption, markets and the arts. In particular, she seeks to understand how the position of the consumer in a socio-cultural world is constructed, performed, interpreted and questioned through the arts.
Ben Eaton
Ben is a digital and interactive artist based in Leeds. He is one third of the interactive arts organisation Invisible Flock. Ben is interested in how we use technology as an empowering tool, both politically and personally. His work often explores how technology and new platforms allow us to tell new stories and allow for both ownership and authorship for participants. He uses technology in an accidental, haphazard and experimental way, constantly exploring new software and platforms rather than focusing on a single medium.
Linda Spurdle
As the Digital Development Manager at Birmingham Museums Trust, Linda's role includes managing web, social media, digitisation, photography, the Picture Library, the Planetarium, IT infrastructure and projects. She manages the digital team, which has recently expanded to include the IT Manager. She has extensive experience of managing and leading digital projects, and has produced award-winning websites. She is also the co-founder of Museum Camp, the innovative UK museums unconference.
Danny Birchall
Danny is the Digital Manager at the Wellcome Collection where he is responsible for the strategic digital presence of the Wellcome Collection including web, social media, games and commissions. He has written and presented about games and the importance of play for museums at conferences and in museum journals. He has also completed Birkbeck College's Museum Studies MA.
Developed by Paul Vetch and Michael Takeo Magruder from the Department of Digital Humanities, together with creative technologist Mairead Buchan and Helen Jeffrey at the London Review of Books, the app posts personalised cultural artefacts onto Facebook users’ walls based upon keywords extracted from their status updates. The principal aim of this project was to develop a Facebook app which takes personal data and, rather than using it for marketing purposes, uses it to serve up a little personalised ‘nugget’ of culture in the form of a photograph or museum object drawn from open data published by galleries, libraries, archives and museums across the UK.
Many UK holding institutions of all sizes now routinely make their collections data available online for use in other contexts, and an easy way to access this data is via the Collections Trust’s aggregated CultureGrid service. The data available here allowed the team to select and grab an image and description, together with a link back to the holding institution’s website, so that if something piques a particular user’s interest they can leave Facebook to go and find out more about the object on the related website.
The team's aim through CultureMe was to offer a pathway to ‘one of your cultural five a day’. Each object that the app shows could offer a little moment of pause and a cultural diversion from Facebook ‘life’. Equally the data might be weirdly irrelevant. But in any case the app ultimately gives users a glimpse at the incredible breadth of the cultural heritage in the UK and a welcome antidote to yet more online adverts.
The project team were keen to ensure something tangible came of the project, and this meant developing a working prototype. One of the team's first discoveries was that only five Facebook apps described themselves as having cultural content. In the absence of analogues, the first step was therefore to establish what was feasible using the Facebook Graph API and to understand what is involved in developing Facebook apps, in order to draw up a realistic specification for the application. The team then generated an initial set of user stories and prioritised them to determine what would be needed to test the initial concept.
Over the course of the project the team created a working prototype of the CultureMe app. Once a Facebook user installed the CultureMe app and agreed to let it access their personal information and post to their wall, a visit to the app page invoked a process during which:
- The user’s wall posts and status updates were parsed and the keywords they used were extracted, with a stopword list used to discard common words.
- A random word was selected from the extracted keyword list and used to generate a query against the CultureGrid API. The app attempted to return only objects from CultureGrid that had images associated with them.
- The image, together with its title, description and a link to the webpage for the object, was displayed back to the Facebook user, and the object and associated metadata were then published on the user’s Facebook wall.
An initial visual identity for CultureMe was developed that gave equal weight to the project’s text (keywords) and image (cultural assets) aspects. The app played on a sense of nostalgia and history through its font, Trajan Pro, and its greyscale and sepia colour space, while blending in an unapologetic sense of the digital, since the app’s purpose is to interface people with digital cultural space.
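The keyword-extraction and query-building steps described above can be sketched roughly as follows. This is a minimal illustration only: the stopword list, the endpoint URL and the query parameter names are assumptions made for the example, not the real CultureGrid API or the project's actual code.

```python
import random
import re
from urllib.parse import urlencode

# Illustrative stopword list; a real app would use a much longer one.
STOPWORDS = {"the", "a", "an", "and", "to", "of", "in", "is", "i", "my", "on", "for"}

def extract_keywords(posts, stopwords=STOPWORDS):
    """Tokenise a user's wall posts and discard common stopwords."""
    words = re.findall(r"[a-z']+", " ".join(posts).lower())
    return sorted({w for w in words if w not in stopwords})

def build_query(posts, seed=None):
    """Pick a random keyword and build a CultureGrid-style search URL.

    The endpoint and parameter names here are hypothetical placeholders;
    the 'media' parameter stands in for the app's restriction to objects
    that have images associated with them.
    """
    keywords = extract_keywords(posts)
    word = random.Random(seed).choice(keywords)
    url = "https://culturegrid.example/search?" + urlencode(
        {"query": word, "media": "image"}
    )
    return word, url

posts = [
    "Lovely walk along the canal in Birmingham today",
    "Reading about Victorian railways",
]
print(extract_keywords(posts))
```

The returned link back to the holding institution's page would then be attached to the image and description before posting to the user's wall.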
Project team
Paul Vetch
Paul formerly worked in the Department of Digital Humanities at King's as a Research Fellow and Head of Research Development & Delivery, where he was responsible for interface and interaction design, server infrastructure operations, management of the desktop support team and system administration, and undergraduate and graduate teaching. In 2014 he joined Torchbox as Client Services Director, where he manages the Account Direction and Project Management teams. He also has experience working as a consultant to, and collaborator with, many arts and cultural heritage organisations, including Tate, the South Bank Centre and the Royal Opera House, and ran his own digital agency for over ten years.
Michael Takeo Magruder
Michael Takeo Magruder (b.1974, US/UK) is a visual artist and researcher who works with digital and new media including real-time data, digital archives, immersive environments, mobile devices and virtual worlds. His practice explores concepts ranging from media criticism and aesthetic journalism to digital formalism and computational aesthetics, deploying Information Age technologies and systems to examine our networked, media-rich world. In the last 15 years, Michael’s projects have been showcased in over 250 exhibitions in 30 countries, and his art has been widely supported by numerous funding bodies and public galleries within the UK, US and EU. For further information about Michael's work, visit www.takeo.org.
Mairead Buchan
Mairead is a creative web developer based near Bristol, specialising in exciting interactive experiences. She is interested in any technology that pushes the boundaries of web development and is always looking for new ideas to harness what the internet can do for us. She specialises in front-end web development technologies: JavaScript, responsive and mobile design, and UI interactions. For more information on the projects she has been involved in, visit her website.
Helen Jeffrey
Currently the Associate Publisher at the London Review of Books, Helen is experienced in digital strategy, working with uncertainty, and creative thinking. She has run unusual and unique projects, both digital and real-world, providing vision and leadership. Key areas of her work have included digital projects, new digital products, change management, and consultancy. She is also interested in the creative use of data, collaborative working, and the potential of the internet and social media to effect social change.
For the Haptic Robotic Glove, the team created a prototype robotic glove, which allows the wearer to feel as if they are touching a museum object when actually they are following the outline of an image of the object in a virtual environment. The glove is able to mimic the feel of the object, by using micro-vibrators, and can also mimic the sound of a hand touching the object and the hand’s position in relation to it.
Haptic feedback of the mechanical properties of an object can enhance the accuracy of judgement in many applications, such as robot-assisted surgery, exploration of underwater objects, virtual exploration of untouchable exhibits in museums, and even tele-shopping. This project aimed to provide a solution for the remote exploration of unseen, partially seen, or archived virtual objects with multi-point whole-arm haptic feedback, using a lightweight wearable robotic exoskeleton and a haptic glove.
Schematic model of haptic feedback for spatial details via vibro-actuators in the silicone layer
The exoskeleton provides haptic feedback of the surface and mechanical properties of the object using both cutaneous feedback to fingers and mechanical impedance at the wrist, elbow, and shoulder. To solve the problem of partial or no visual feedback from the object being explored, the team developed an algorithm to interactively build a virtual object that contained not only geometrical features but also mechanical properties of an object.
The fingertips of the human hand possess large numbers of several types of mechanoreceptors that help us understand the tactile properties of an object. Small Merkel cells are very sensitive to low-frequency vibrations (less than 5 Hz). Meissner’s corpuscles respond to vibrations from 5 to 50 Hz and are located close to the skin surface. The most deeply located receptors, present at low density and called Pacinian corpuscles, respond to high-frequency vibrations and high pressure. The human tactile system thus perceives separate stimuli, such as different frequencies of vibration, with the help of these mechanoreceptors, strategically located in the skin. Using a haptic interface, such as a glove, therefore makes it possible to generate the desired tactile perception in a natural way. The design of the glove took into account the anatomy of the human hand and the features of haptic perception, and was created to be user-friendly and not to constrain the natural movement of the fingers.
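As a rough sketch of how these frequency ranges might inform the choice of drive frequency for the glove's vibro-actuators, the mapping can be expressed as a simple lookup. The boundaries below come from the ranges stated in the text; treating everything above 50 Hz as Pacinian territory is a simplification for illustration, not a detail of the project's actual design.

```python
def dominant_receptor(freq_hz: float) -> str:
    """Map a vibration frequency to the mechanoreceptor type most
    sensitive to it, using the ranges given in the text.

    Assumption: frequencies above 50 Hz are assigned to Pacinian
    corpuscles; the text gives no explicit upper boundary.
    """
    if freq_hz < 5:
        return "Merkel cell"          # very low-frequency vibration
    elif freq_hz <= 50:
        return "Meissner's corpuscle" # 5-50 Hz, near the skin surface
    else:
        return "Pacinian corpuscle"   # high frequency, deep in the skin
```

An actuator driven at, say, 30 Hz would thus primarily target Meissner's corpuscles, while a high-frequency buzz would recruit the deeper Pacinian corpuscles.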
This project was organised into three work packages. In the first, the team developed a novel sensor, a 'soft vector probe', to collect geometry and impedance information from objects to be archived; a low-mechanical-impedance haptic glove, with enhanced degrees of freedom at the thumb, to track the fingers and provide haptic feedback of the geometry of the object; and a wearable exoskeleton to track the position and orientation of the hand and provide haptic feedback of the impedance of objects. In the second package the team developed novel real-time algorithms to retrieve and play back location-specific archived information about the objects via the haptic glove and the exoskeleton, and in the third the team addressed the social aspects of individual and collaborative exploration of archived objects.
Schematic model of the vector probe
For the third part of the project, blindfolded participants wearing the haptic glove prototype explored objects, such as scissors, a doll, a cup, and pens, that were hidden in a dark box. The first participant was asked to explore the contents of the box through touch and report their experience to a second participant, who asked for further information and encouraged further investigation. The free movements of the hand were tracked using an existing Vicon 3D motion-tracking system. The second participant then used the reports of the first participant's experience to make sense of the objects' properties. The obligation to verbalise experience gave the first participant a clear goal for exploring the objects. The results informed the team about the classes of primitive movements used during exploration of different types of objects, providing a firm foundation for the design of the hardware in the first work package - the soft probe. Moreover, they allowed the team to develop behavioural models useful for designing real-time information-retrieval algorithms.
The haptic glove prototype revealed the potential of the research and the scope to develop it further, such as allowing users to feel differences in the temperature or texture of a virtual object. This could boost investigations into human tactile perception and the role sound feedback plays in how we understand touch. There is also potential for these haptic gloves to have wider applications in medicine, in museology and in education.
Project team
Dr. Thrish Nanayakkara
Thrish joined King's in 2009 and is a Senior Lecturer in the Department of Informatics. As well as being a member of the Society for Neuroscience (SFN), he is also Associate Editor for the Journal of Control and Intelligent Systems and the Founding General Chair for the International Conference of Information and Automation for Sustainability (ICIAfS). His research interests include: human-robot interaction, robotic interaction with uncertain environments, soft robots with controllable stiffness, and robotic learning based on demonstrations.
Dr. Dirk vom Lehn
Dirk is a Lecturer in Marketing, Interaction and Technology. He is a member of the Work, Interaction & Technology Research Centre. His research explores the interplay of technology and social interaction in different domains, including museums and galleries, optometric consultations, street markets and social networking sites. The research primarily uses video recordings of naturalistic situations and examines how tools and technologies, as well as objects and artefacts, are embedded within social interaction.
Dr. Milena Michalski
Milena has worked at King's since 2007 and is currently a Visiting Research Fellow and practising artist as a member of the Arts & Conflict Hub. Her arts practice interweaves academia and art, engaging particularly with place and perception, sight and site, using a range of media, including print-making, photography, analogue film, digital video and site-specific installation. As an artist she has won national and international juried competitions, and has been awarded bursaries and a residency.
Professor James Gow
James is a professor of International Peace and Security in the Department of War Studies at King's as well as being the Director of the International Peace and Security Programme. Between 1994 and 1998, he served as an expert advisor and witness for the Office of the Prosecutor at the UN International Criminal Tribunal for the former Yugoslavia, and in 2010 was involved in background preparations for the Strategic Defence and Security Review. His research interests include: international peace and security, UK and Euro-Atlantic security policy and war crimes, and in 2013 he was awarded a three-year Leverhulme Major Research Fellowship to examine the defining trial of the ICTY, that of Ratko Mladic, and its impact on the evolution of international criminal justice.
Others who worked on the project include: Dr Jelizaveta Konstantinova, Dr Min Li and Dr Anuradha Ranasinghe, research associates in the Department of Informatics, who all assisted in the research phase of the project.
Nick Stevens from the Crafts Council and Creative Technologists, Dan Tagg and Fleeta J. Chew Siegel, also assisted on the project.
The ‘People’s Republic of You’ project developed a prototype of an online portal to give audiences at home a new way to engage with live performance, specifically Coney's Early Days (of a better nation), a large-scale show with a dynamic combination of theatre, a playing audience, video projection and live-streamed media. The interface allows an online audience to connect to the drama being acted out in the theatre, giving online users voting rights, a live ticker newsreel and movement between the ‘virtual colonies’ of the show.
The portal is available online here.
Project team
Dr Adrian Gordon
Adrian is the co-founder and CTO of Mimosa Wireless Ltd, a company delivering innovative applications to mobile devices. He is also currently working on dimple, a platform for publishing digital media content to all mobile devices, via mobile-optimised websites, apps for all the major smartphone platforms, and Augmented Reality services such as Layar.
Adrian is an experienced IT professional in both research and development (he gained a PhD in an area of research related to Machine Learning) and commercial environments. He also has experience in Business Modelling and Agile Development.
Dr Ricarda Vidal
Ricarda holds a PhD in Cultural Studies (London Consortium/ Birkbeck). She is the author of Death and Desire in Car Crash Culture: A Century of Romantic Futurisms (Peter Lang, 2013) and co-editor of The Power of Death (Berghahn, 2014) and Alternative Worlds (Peter Lang, 2014). She has published on speed, the car and driving as cultural phenomena, Modernism (in particular Futurism), moving-image art, urban space and art in relation to gentrification (with a particular focus on contemporary London), Brutalism and cinematic architecture, as well as society’s fascination with death and murder, and most recently, alternative worlds and utopias. Further information is available on her website.
Annette Mees
Annette makes immersive theatre, interactive experiences and adventures. She is one of the four Runners of Coney, an award-winning agency of adventure and play. Coney tells stories where the audience can become the heroes, and makes play which is live and responsive, interactive, immersive and participatory. She also works as an independent artist and director, usually in collaboration with others. Work by Coney includes: A Small Town Anywhere at BAC; an interactive dance project in Paris; an adventure for the Dublin Fringe Festival; and Rabbit: NTT, an adventure in, around and about the National Theatre.