Mobile Narrative and Location-based Narrative
With the prevalence of mobile and tablet devices, text and narrative have been embedded into cities via location technologies. In the early stage, from the end of the 1990s to the early 2000s, mobile narrative meant portable narrative delivered through mobile devices such as portable audio players. Most of these mobile narratives were prepared by their creators and did not allow users to interact with the stories, although users could choose which fragments to hear.
After the invention of the smartphone, location-based narrative using GPS (Global Positioning System) technology appeared. This new generation of narrative became more interactive, because location-aware applications allow people to attach their own content to particular locations as well as read information attached to a place.
Audio Walks : “The Missing Voice” (1999)
Janet Cardiff, who is well known for her audio walks, wandered through London with her voice recorder in 1999, recording her voice at several locations. According to Cardiff, the story was influenced by detective series and mystery novels. The resulting mobile narrative prompts users to trace her trajectory and the mysterious story with a CD player. The work is a linear narrative that requires users to follow the creator’s navigation.
Janet Cardiff “The Missing Voice” : http://www.cardiffmiller.com/artworks/walks/missing_voice.html
There are also hybrids of audio walks and GPS technology, called mixed reality performances:
- Blast Theory and The Mixed Reality Lab, “Uncle Roy All Around You” (2003)
- Blast Theory and The Mixed Reality Lab, “Can You See Me Now?” (2003)
“[murmur]” (2003)
Shawn Micallef, James Roussel and Gabe Sawhney created [murmur], a mobile storytelling project that collects secret histories of the cityscape, accessed via cellular phone. In Toronto, [murmur] signs were placed at several spots, such as restaurants and intersections. The project later expanded to other cities, including São Paulo (Brazil), Geelong (Australia) and Dublin (Ireland). Like the audio walks, its stories were prepared in advance, but with the difference that other people, such as local residents and participants, created the stories. In other words, the project is more open to the public than the earlier audio walks.
Shawn Micallef : http://www.visiblecity.ca/index.php/artists/95-shawn-micallef
“Yellow Arrow” (2004-2006)
“Yellow Arrow” is a project similar to [murmur], created by members of Counts Media: Michael Counts, Christopher Allen, Brian House, and Jesse Shapins. Participants got a yellow arrow sticker marked with a unique code from the project’s website and placed it wherever they wanted to tell a story about a place. They then sent a text message tagged with that code, and each sticker and message were linked to each other through it. Other participants could retrieve the story by sending the unique code from their cellular phones.
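The linking mechanism described above, a unique code on each sticker with text messages keyed to that code, amounts to a simple code-to-story mapping. A minimal sketch in Python, with hypothetical class and method names (the actual service ran over an SMS gateway, not reproduced here):

```python
# Hypothetical sketch of Yellow Arrow's code-to-story linking.
import secrets

class StickerRegistry:
    def __init__(self):
        self.stories = {}  # unique sticker code -> story text

    def issue_sticker(self):
        """Generate a short unique code, as printed on each sticker."""
        code = secrets.token_hex(3).upper()  # e.g. 'A3F29B'
        self.stories[code] = None
        return code

    def post_story(self, code, message):
        """A participant texts a message tagged with their sticker's code."""
        if code not in self.stories:
            raise KeyError(f"unknown sticker code: {code}")
        self.stories[code] = message

    def read_story(self, code):
        """Another participant texts the code to retrieve the story."""
        return self.stories.get(code)

registry = StickerRegistry()
code = registry.issue_sticker()
registry.post_story(code, "This bench is where the old cinema queue formed.")
print(registry.read_story(code))
```

The single dictionary keyed by the sticker code is the whole trick: the physical sticker and the digital story never reference each other directly, only the shared code.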
Yellow Arrow project photos (Flickr) : https://www.flickr.com/photos/yellowarrow/collections/
“Cityspeak”
Jason Lewis and Obx Labs at Concordia University created “Cityspeak”, a public installation that combines mobile devices with a big screen in the city. The project focuses on converting private communication into a public display at a particular location. Participants use their mobile phones to send messages to a common server, and the messages appear on a public screen alongside other messages as a text stream. The text is processed using the NextText text-visualization software, which references real-time data from the location to drive the visual behavior of the text.
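The flow just described, private messages collected on a common server and rendered as a rolling public text stream, can be sketched as follows. This is an illustrative sketch only; all names are hypothetical, and it does not reproduce the NextText API:

```python
# Hypothetical sketch of a Cityspeak-style message stream.
from collections import deque

class MessageStream:
    """Collects SMS-style messages and exposes a rolling text stream
    for a public display."""
    def __init__(self, max_visible=5):
        self.visible = deque(maxlen=max_visible)  # oldest messages scroll off

    def receive(self, sender_id, text):
        # A private message arrives at the common server from a phone.
        self.visible.append(text)

    def render(self):
        # What the public screen shows: recent messages as one stream.
        return " | ".join(self.visible)

stream = MessageStream(max_visible=3)
for i, msg in enumerate(["hello city", "meet at the square",
                         "it's raining", "free concert tonight"]):
    stream.receive(sender_id=i, text=msg)
print(stream.render())  # only the three most recent messages remain
```

The bounded queue captures the installation's key property: the screen is a shared, constantly scrolling surface, so each private message is public only briefly.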
CitySpeak.net : http://cspeak.net/
Paul Notzold’s “TXTual Healing” is a similar interactive textual installation using a screen and mobile phones.
“Urban Tapestries” (2002-2007)
“Urban Tapestries” is a “Public Authoring in the Wireless City” project using location technologies, developed by Proboscis, an independent artist-led creative studio. The Urban Tapestries software platform enabled people to attach text, sound and video to locations using GPS technology. There were two functions for weaving tapestries: pockets and threads. Pockets were stories connected to specific locations; threads showed the thematic relationships between pockets and locations.
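The pockets-and-threads model can be sketched as a small data structure: pockets hold media at a location, and threads group pockets thematically. A hypothetical sketch (the real platform was a full client-server system; these names are illustrative):

```python
# Hypothetical sketch of Urban Tapestries' pockets-and-threads model.
from dataclasses import dataclass, field

@dataclass
class Pocket:
    """A story attached to a specific location."""
    lat: float
    lon: float
    title: str
    media: list = field(default_factory=list)  # text, sound, video items

@dataclass
class Thread:
    """A thematic link weaving pockets together across locations."""
    theme: str
    pockets: list = field(default_factory=list)

    def weave(self, pocket):
        self.pockets.append(pocket)

bench = Pocket(51.5226, -0.0795, "Old market bench", ["story.txt"])
mural = Pocket(51.5233, -0.0781, "Faded mural", ["photo.jpg", "memo.wav"])
memory = Thread("neighbourhood memory")
memory.weave(bench)
memory.weave(mural)
print([p.title for p in memory.pockets])
```

Separating location-bound pockets from location-free threads is what lets one place belong to many themes, and one theme span many places.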
“Rider Spoke” (2007)
“Rider Spoke” by Blast Theory and The Mixed Reality Lab is similar to “Urban Tapestries”. People cycled through the streets of the city equipped with a tablet device, which showed a map of the city and the locations where they could access and upload information. While they cycled, a narrator told stories about the city through headphones. The project allowed people to explore the city freely; there was no fixed route like in the early audio walks.
Rider Spoke : http://www.blasttheory.co.uk/projects/rider-spoke/
“7scenes”
“7scenes” is a mobile storytelling platform that provides tools to develop “scenes”. People can easily drag and drop their information onto particular locations marked with signs. The platform provides an interface for creating scenes without programming knowledge, and its apps take advantage of GPS technology and smartphone functions.
“SCVNGR”
“SCVNGR” is a game built on location-based information. Users go to places, complete quizzes and challenges, and earn points. Like 7scenes, it uses GPS technology and provides mobile apps.
SCVNGR : http://www.scvngr.com/
Silva, A. S. 2013. “Mobile Narratives: Reading and Writing Urban Space with Location-Based Technologies.” In Hayles, N. K. and Pressman, J. (eds.), Comparative Textual Media: Transforming the Humanities in the Postprint Era. Minneapolis: University of Minnesota Press.
Raley, R. 2013. “TXTual Practice.” In Hayles, N. K. and Pressman, J. (eds.), Comparative Textual Media: Transforming the Humanities in the Postprint Era. Minneapolis: University of Minnesota Press.