Memoir – Social Reminiscence

Aim

To create technological support for episodic memory to aid people in storytelling or reminiscing activities.

Summary

The approach we adopted was to explore the use of photographs annotated with context details, giving users a reference back to the context in which a photograph was taken so that they can remember the past more clearly and vividly. As a solution we designed Memoir. Memoir is a two-part application: a mobile application which automatically collects data, and a computer application which automatically annotates your digital photos with the collected metadata and displays them as a Google Maps mash-up on tabletop interfaces.
This project presented a new approach to collaborative memory sharing and dissemination with a tabletop photo-sharing device, moving beyond the classical desktop screen and mouse currently employed. The resulting collections of captured photos can be managed, searched and reminisced about more easily by taking advantage of the detailed information stored inside them as metadata.

My Role (What did I do, learn)

I was responsible for brainstorming the functionality of both the mobile and the computer applications, and for designing and developing both. The mobile application was developed in Python; the computer application was developed in Java.

Design and Methodology

One of the main objectives of our work was to create technological support for episodic memory to aid people in storytelling or reminiscing activities. Episodic memory refers to the memory of an experience, time, place or event. A challenge when dealing with episodic memory is that, with time, the fine details tend to fade or become more difficult to recall.

Our approach was to explore the use of different recording mediums to capture digital artifacts which we annotate with "context-coded" details. For our purposes, we were interested in the "context" of the individual (entity) who is capturing the digital artifact (e.g. a photograph).
Providing user access to these context-coded artifacts gives people a reference back to the context in which the artifact was created, which, we aim to show, allows them to remember the past more clearly and with greater fidelity.
Our method was to place such media in its correct context by processing the media itself and by correlating it with sensor and environmental data, as well as data collected from each person's digital existence (email, calendars). By combining these data sources one can offer recall technologies that assist people's memories in both the short and long term.

Context Coding Digital Artifacts

Digital pictures can be stored as files with extra information (metadata) embedded: the place and date the picture was taken, the names of the people in the picture, etc. Images are taken with an ordinary camera, which tags them with a date and time. Meanwhile, the mobile phone receives location coordinates from the GPS device and appends them to a file, along with the date and time at which each location was recorded. These features are implemented in a mobile phone application. The mobile application can also access and save the phone's calendar details and profile, which provide information about the people you are with at a given time (colleagues for a meeting, family on holidays, friends at a birthday party, etc.). The application also performs a Bluetooth scan to determine the Bluetooth devices (and hence people's phones) that are nearby at any particular time. The application runs in the background without affecting the user's normal interaction with the phone.
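A minimal sketch of this background logging loop is shown below, in the spirit of the original Python mobile application. The log file name, the sampling interval, and the read_gps_position() and scan_bluetooth() helpers are hypothetical placeholders for the platform-specific calls.

```python
import time
from datetime import datetime

# Hypothetical log file name and baseline sampling interval.
LOG_PATH = "context_log.txt"
SCAN_INTERVAL_SECONDS = 60


def read_gps_position():
    """Placeholder for the platform-specific location call
    (e.g. reading a fix from the GPS device paired with the phone)."""
    raise NotImplementedError


def scan_bluetooth():
    """Placeholder for a Bluetooth inquiry returning identifiers of
    nearby devices (and hence nearby people's phones)."""
    raise NotImplementedError


def log_context():
    """Background loop: periodically append a timestamped entry with the
    current location and the Bluetooth devices currently in range."""
    while True:
        timestamp = datetime.now().isoformat()
        latitude, longitude = read_gps_position()
        nearby = scan_bluetooth()
        with open(LOG_PATH, "a") as log:
            log.write("%s\t%f\t%f\t%s\n" % (timestamp, latitude, longitude, ",".join(nearby)))
        time.sleep(SCAN_INTERVAL_SECONDS)
```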
For our approach, the pictures with date information are stored in the camera, and date and location information are stored in the phone. Both the data files from the phone and the images can be uploaded to a PC, where an application we implemented retrieves and synchronizes the collected information. The synchronization is performed with respect to time: for each photo, the application takes the time it was taken and looks for the GPS and Bluetooth records closest in time to it. The matching results are stored in the picture itself as metadata, using the Exchangeable Image File Format (EXIF). If, due to some unavoidable circumstance (such as the GPS device being unable to connect to the satellite or to the mobile phone), it is not possible to collect data between two points A and B, then extrapolation techniques are used to estimate the missing locations. The frequency at which location information is collected also adapts to the speed at which the user moves. For instance, if the user does not change position for a while, the collection frequency is reduced; likewise, it increases if the user moves very fast.
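The computer application itself was written in Java; the short Python sketch below only illustrates the time-based matching step. The record structures and field names are illustrative assumptions, and in the real application the matched values are written into the photo's EXIF fields rather than returned as a dictionary.

```python
from datetime import datetime


def nearest_record(records, photo_time):
    """Return the log record whose timestamp is closest to the photo's capture time.
    Each record is assumed to be a dict with at least a 'time' field (a datetime)."""
    return min(records, key=lambda r: abs((r["time"] - photo_time).total_seconds()))


def build_photo_metadata(photo_time, gps_records, bluetooth_records):
    """Assemble the metadata that would be embedded in the photo as EXIF:
    the location fix and the Bluetooth scan closest in time to the capture."""
    gps = nearest_record(gps_records, photo_time)
    bt = nearest_record(bluetooth_records, photo_time)
    return {
        "capture_time": photo_time.isoformat(),
        "latitude": gps["lat"],
        "longitude": gps["lon"],
        "nearby_devices": bt["devices"],
    }


# Illustrative usage with made-up log entries.
gps_log = [{"time": datetime(2007, 6, 1, 12, 0), "lat": 53.3065, "lon": -6.2236}]
bt_log = [{"time": datetime(2007, 6, 1, 12, 1), "devices": ["phone-1", "phone-2"]}]
print(build_photo_metadata(datetime(2007, 6, 1, 12, 2), gps_log, bt_log))
```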

Visualizing data

Adding metadata to pictures in order to help users' memory is only the first step of the process. To be relevant, the approach has to provide a meaningful way to use the information.
For that reason, both spatial and temporal criteria are important in the design of an interface for browsing a user's photographs.
This approach leads us to provide the user with a map and a time scale. She can then select a location and access all the pictures that were taken at that location. She can also select a time or a period and then access all the pictures taken during that period.
The visual interface is a 3D representation of the Earth (see Figure), on which the user can select a location directly.
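As a rough illustration of the spatial and temporal filtering behind this interface (again in Python for brevity, with illustrative photo records whose fields mirror the embedded metadata):

```python
from datetime import datetime


def photos_in_region_and_period(photos, lat_bounds, lon_bounds, start, end):
    """Select photos whose embedded metadata places them inside the chosen
    map region (latitude/longitude bounds) and within the chosen time period."""
    selected = []
    for photo in photos:
        in_region = (lat_bounds[0] <= photo["latitude"] <= lat_bounds[1]
                     and lon_bounds[0] <= photo["longitude"] <= lon_bounds[1])
        in_period = start <= photo["capture_time"] <= end
        if in_region and in_period:
            selected.append(photo)
    return selected


# e.g. all photos taken around Dublin during June 2007:
# photos_in_region_and_period(photos, (53.2, 53.5), (-6.4, -6.0),
#                             datetime(2007, 6, 1), datetime(2007, 7, 1))
```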

Overview

Collaborators
David Swords, Aaron Quigley

Methods Used
Visualization, Extrapolation, Software design

Technology Used
Java, Python, Illustrator

Client

CASL - University College Dublin

Services

3 months (May 2007 - Jul 2007)

Skills

Mobile
Software Dev