Interactive Artwork Overview
“King Lear: …yet you see how this world goes.
Gloucester: I see it feelingly”
-William Shakespeare, King Lear
Visual art is a broad and inclusive term, used to describe all forms of painting, design, photography and sculpture. Yet the term is inherently exclusive, as it implies that all creative output is to be appreciated purely visually. By this logic, those who cannot see cannot appreciate art.
Yet sight is not the only sense we use to perceive the world around us. In his famous work Ways of Seeing, Berger (1972) argues that while looking brings what we see into our reach, touching situates us in relation to it. Berger also argues that paintings are often “reproduced with words around them”, and that these words shape our perception and opinion of the work. In simpler terms, while art might be perceived visually, other factors, such as audio descriptions, shape that perception and create our overall mental image.
It has been shown that those who are blind from birth organise “visual information in much the same way as sighted people” (Pitt 1998). An experiment confirming this was conducted by Kennedy (1980), in which several blind subjects were asked to draw a table. Remarkably, all of the subjects recognised that the table would appear differently depending on its position relative to the viewer. Most also recognised that parts of the table would be obscured depending on their location. The subjects would also explain their drawings, telling the tester which angle they were drawing the table from, such as “from the top” or “from the side” (Kennedy 1980).
With these theories and studies in mind, the main question I attempted to answer with this study was whether haptic feedback combined with audio improves a user’s mental model, or image, of an artwork, or whether this can be achieved through auditory feedback alone. To answer this, I created an interactive artwork as a web-based application for use on a tablet or a large smartphone. The app displays a simplified line drawing of an artwork. The device vibrates when the user runs their finger over the on-screen outlines of the work, and audio descriptions are triggered depending on which paths their finger crosses. HTML5, JavaScript, CSS and SVG were the primary technologies I used to develop the application.
Research
The starting point for this interactive artwork study was to survey institutions to find out what accessibility resources they offered visitors. I surveyed over 70 galleries, museums and art institutions in Ireland, the UK and the rest of Europe. I contacted 42 institutions in Ireland for information about their resources and whether they had an application associated with the museum. Only ten replied, and of those only five had made provisions for blind visitors at the time of writing. I also examined the websites of the 42 institutions to see what resources they advertised. Again, the number of institutions with resources available was quite low.
In contrast to Ireland, accessibility provision is more extensive further afield, regardless of whether the institutions are large or small, private or public. In the UK, the Whitechapel Gallery in London provides patrons with large print guides, magnifying glasses and raised image booklets, which can be taken home after a tour of the gallery. Tate Liverpool has a Braille guide, large print guide, magnifier, coloured overlays and audio labels. Tate Modern and Tate Britain both have large print gallery plans and captions for the permanent collection and for special exhibitions. They also provide touch tours using raised images, direct handling, descriptions and discussions, and they have an online resource called I-Map, which delves into a work and can be used in classrooms as an education tool.
The computer terminals in the Victoria and Albert Museum in London include JAWS screen reading and MAGic screen magnification software. Around the museum spaces are desks with touchable objects, described in large text and in Braille. Elsewhere in Europe, the Louvre Museum in Paris offers a touch gallery with tours limited to eight spaces. The Van Abbemuseum in Eindhoven holds an interactive tour where original or replica artworks can be touched, and other sensory elements, such as smells, music, literature and even tastes, are incorporated into the tour.
From this survey, it is clear that some museums and galleries have developed resources for visually impaired (VI) patrons. Yet many of these must be pre-booked, meaning a loss of independence. Moreover, the limited number of resources means only a certain number of attendees can use them at any one time. Most institutions have not embraced technology and are using archaic methods, such as swell paper or Braille handouts. While these methods work, most people now use their smartphones as a means of interacting with the world around them. Creating resources, such as an interactive artwork, that complement what people already use daily is a better and more inclusive approach.
Development and Design
Discussions with a number of VI people about devices and screen readers indicated that smartphones, in particular the iPhone, and tablets are the most commonly used. Of the nine people I spoke to, eight used an iPhone while only one used an Android device. The Android user also had around ten percent vision and did not use a screen reader. A survey of devices and screen readers carried out by WebAIM in 2014 shows that, for screen reader use, mobile devices are used nearly as much as laptops and desktop computers. Respondents were also asked which mobile platform they used: just over 65% used Apple products, with only 16% using Android. The survey also asked which mobile screen readers were being used; VoiceOver was the most common at 60.5%, followed by TalkBack at 21.6%.
Based on this research into devices and screen readers, I developed my accessible interactive artwork. I experimented with different image formats, such as JPEG and PNG, but decided on the vector-based SVG format, which I identified as having advantages over the other forms. With SVG, the graphics are generated within the browser, reducing the server load and network response time generally associated with other forms of web imagery. Another key advantage is that SVG elements are part of the DOM and so respond to JavaScript. Using Inkscape, an open source vector graphics program, I created my SVG interactive artworks.
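For illustration, a minimal hand-written SVG of the kind Inkscape exports might look like the following; the path data, id, class and dimensions here are hypothetical placeholders rather than markup from the actual prototypes:

```xml
<!-- A single named outline path that JavaScript can query and style.
     All values are illustrative. -->
<svg xmlns="http://www.w3.org/2000/svg" width="800" height="600" viewBox="0 0 800 600">
  <path id="outline-hand" class="artwork-path"
        d="M 120 300 C 250 180, 550 180, 680 300"
        fill="none" stroke="black" stroke-width="8" />
</svg>
```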
Next, I experimented with the HTML5 Vibration API. The API is clearly targeted at mobile devices and can be used to generate alerts within a web application. To explore it, I created a simple SVG and combined the Vibration API with the Touch Events Web API. This resulted in the device successfully vibrating when a finger crossed an SVG path. I found that the stroke width of the SVG paths affected the vibration and the capturing of the touch events, so a width of at least 8 pixels was used for all subsequent interactive artworks.
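A minimal sketch of this combination might look as follows, assuming the outline paths carry a class such as artwork-path (a hypothetical name). document.elementFromPoint() is used because touchmove events continue to fire on the element where the touch started, not on the element currently under the finger:

```javascript
// Vibrate briefly whenever the finger crosses an outline path.
document.addEventListener('touchmove', function (event) {
  var touch = event.touches[0];
  var target = document.elementFromPoint(touch.clientX, touch.clientY);

  if (target && target.classList.contains('artwork-path')) {
    // Feature-detect: the Vibration API is absent on most desktop browsers.
    if ('vibrate' in navigator) {
      navigator.vibrate(50); // a short 50 ms pulse
    }
  }
});
```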
The next step was to add audio descriptions for different parts of the artwork. Since SVG is XML-based, title and desc elements can be added to the graphic; these are then parsed by a screen reader. To read them aloud directly, I used the Speech Synthesis interface of the HTML5 Web Speech API, triggering the descriptive text on touchmove events over the SVG paths.
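A sketch of this step, assuming each path contains a desc child element holding its description (the function name is mine):

```javascript
// Speak the <desc> of the path currently under the finger.
function speakDescription(pathElement) {
  var desc = pathElement.querySelector('desc');
  if (!desc) return;

  // Cancel any description still playing so the speech follows the finger.
  window.speechSynthesis.cancel();
  window.speechSynthesis.speak(new SpeechSynthesisUtterance(desc.textContent));
}
```

In practice this would be called from the same touchmove handler sketched above, once per newly entered path, so that a description is not restarted on every movement event.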
After achieving rudimentary tactile and audio feedback, the next stage was to adapt the application for those with partial sight. I added a high resolution image of the artwork to the SVG, along with a menu at the side of the work. This included options to adjust the appearance and colours of the artwork to suit those with colour blindness or partial vision. After an informal usability test, I adapted the SVG once more to include zoom functionality, with more detailed paths and audio descriptions when zoomed in and fewer when zoomed out. This recreates how one observes a painting: initially taking in the overall impression of the work, and then standing closer, focussing on different details.
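A sketch of how such a menu might be wired up, assuming element ids like artwork-svg and invert-option (all hypothetical names), with CSS filters doing the actual colour work:

```javascript
var artwork = document.getElementById('artwork-svg');
var background = document.getElementById('background-image');

// Toggle a CSS filter on the artwork, clearing it if already applied.
function toggleFilter(filterValue) {
  artwork.style.filter = (artwork.style.filter === filterValue) ? '' : filterValue;
}

document.getElementById('invert-option')
  .addEventListener('click', function () { toggleFilter('invert(100%)'); });

document.getElementById('greyscale-option')
  .addEventListener('click', function () { toggleFilter('grayscale(100%)'); });

// Hide or show the high resolution background image.
document.getElementById('hide-background-option')
  .addEventListener('click', function () {
    background.style.display = (background.style.display === 'none') ? '' : 'none';
  });
```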
Prototypes
Below are my final interactive artwork prototypes used for usability testing.
To interact with the prototypes below, hover the cursor over the SVG paths to activate the audio descriptions of the interactive artworks.
Crossing the green border of the interactive artwork will activate a beep sound.
Use the menu to the right to adjust the interactive artwork’s appearance, such as inverting the SVG path colours, hiding the background image, and changing the SVG to greyscale.
A section from The Creation of Adam by Michelangelo (1512)
The Starry Night by Vincent van Gogh (1889)
Girl with a Pearl Earring by Johannes Vermeer (c. 1665)
A Bar at the Folies-Bergère by Édouard Manet (1882)
Usability Testing
I began testing the interactive artwork prototypes. I researched usability test methods to decide which approach to take for my initial test, and settled on conducting a simple beta test with a few individuals, followed by a focus group. Each participant was then asked to answer a short questionnaire, consisting of six background questions to discover which devices and screen readers they used, followed by ten questions asking them to rate aspects of the application and give their opinions. With the usability tests, I wanted to determine whether there was a market for my application, whether it was easy to use, and whether it was effective at conveying an artwork through audio description and tactile feedback.
I first ran a beta test with three participants: one fully blind, one fully sighted and blindfolded, and one who has deuteranomaly colour blindness. The beta testers were all asked to simply explore the application. I also asked them to describe what they were feeling and hearing. Lastly, I asked when they felt they were competent in using the application, to stop and describe the artwork.
Feedback about the interactive artwork application was fairly consistent amongst the participants. First impressions were that there were some initial difficulties, but once these were overcome, the application was easy to use. The difficulties included understanding the screen division, and the overall concept that touching the interactive artwork outlines caused the device to vibrate.
When asked to rate the ease of use, responses ranged from 6.5 to 8 out of 10. All participants rated the audio description highly, with the consensus being that the descriptions gave the context of the work, informed users of what they were exploring, and developed their mental image of the artwork.
Following these beta tests, I trialled the interactive artwork application with a focus group. First impressions were that the concept was a good idea, the descriptions were interesting, and the application was quite similar to others on their phones. All rated the ease of use highly, with most giving it 10 out of 10 and only one giving it 9. The audio description received similar ratings, with most giving it a 10. Vibration, however, fared differently: two participants reported not feeling or noticing it at all, while those who did rated it at an average of 8.5 out of 10. Even so, the consensus was that both forms of feedback were needed.
Next, I ran a controlled task test, with the data feeding into a statistical hypothesis test to determine whether the vibration aided the use of the interactive artwork application. The test was structured around two groups, A and B, and two objects in the interactive artwork, object 1 and object 2. Group A would try to find object 1 with vibration off and object 2 with vibration on; group B would try to find object 1 with vibration on and object 2 with vibration off. This design gave me matched subjects, interval data in the form of task completion times, and two test conditions.
After researching statistical tests that suit these parameters, I chose the Wilcoxon signed-rank test. While a paired t-test might have offered more statistical power, I could not guarantee that the data would satisfy its parametric assumptions. For the controlled test, I would simply document background information and timed data, and use the results of the Wilcoxon test to determine whether the null hypothesis could be rejected. Rejecting it would indicate that the added vibration improves the application’s usability.
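In outline, the test works as follows: each participant’s difference in completion time between the two conditions is ranked by absolute value, and the test statistic is the smaller of the two signed-rank sums:

$$
W^{+} = \sum_{d_i > 0} R_i, \qquad W^{-} = \sum_{d_i < 0} R_i, \qquad W = \min\left(W^{+}, W^{-}\right)
$$

where $d_i$ is participant $i$’s time difference and $R_i$ is the rank of $|d_i|$. The null hypothesis is rejected only if $W$ falls at or below the critical value for the sample size.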
Analysis
I ran the controlled study with 16 subjects: half were sighted and blindfolded, the other half were visually impaired, and groups A and B each contained half sighted and half VI participants. From the data obtained, the sum of the positive ranks was 53 and the sum of the negative ranks was 83. Taking the smaller of the two sums gave a W-value of 53, while the mean difference was 0.81. The Z-value was calculated as -0.7756 and the P-value as 0.2177; the standard deviation was 19.34. The average time for the experimental condition (Treatment 1, vibration on) was 28.8 seconds, while the mean time for the control condition (Treatment 2) was 36.5 seconds, meaning users in the experimental condition were quicker on average.
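For reference, these figures are consistent with the standard normal approximation to the Wilcoxon statistic for a sample of $n = 16$:

$$
\mu_W = \frac{n(n+1)}{4} = 68, \qquad
\sigma_W = \sqrt{\frac{n(n+1)(2n+1)}{24}} = \sqrt{374} \approx 19.34
$$

$$
z = \frac{W - \mu_W}{\sigma_W} = \frac{53 - 68}{19.34} \approx -0.7756
$$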
However, according to the critical values of the Wilcoxon test for a sample size of 16 and a one-tailed hypothesis at a significance level of 0.05, 53 is too high: the critical value of W when N = 16 at p ≤ 0.05 is 35. The result is therefore not significant, and the null hypothesis that the vibration does not improve usability could not be rejected. Reasons for this result may include the small sample size. Furthermore, the fact that sighted participants had to be used could have affected the results. Finally, the task test itself may have been flawed, as some participants found the object seconds after starting simply because of the direction in which they explored the screen. Prior knowledge of the artwork might also have influenced the results.
Summary
“He who looks through an open window sees fewer things than he who looks through a closed window.”
-Charles Baudelaire.
The aim of this project was to investigate whether haptic feedback, together with audio, improved a visually impaired person’s mental model of an artwork. The various usability tests yielded results suggesting that both haptic and audio feedback were effective in creating a strong mental image. Since the interactive artwork prototypes mainly used traditional forms of imagery, adapting the application for contemporary or abstract artworks may produce different results. Similarly, given the results of the Wilcoxon test, more users would be needed to obtain sufficient data. In conclusion, while the aim was not fully achieved, the interactive artwork application can be seen as an important step in improving access to the visual arts for the visually impaired.