Now that more than 20 years have passed since Mark Weiser identified this as the century of ubiquitous computing (ubicomp), it seems we are no closer to giving users a handle on what is happening with their data. As Weiser noted,
“The problem, while often couched in terms of privacy, is really one of control. If the computational system is invisible as well as extensive, it becomes hard to know what is controlling what, what is connected to what, where information is flowing, how it is being used, what is broken (as compared to what is working correctly, but not helpfully), and what are the consequences of any given action (including simply walking into a room).”
What is especially interesting in Weiser’s quote is the notion of future consequences. Ubicomp technologies offer the potential of capturing and storing large datasets of users’ behaviour, preferences, activities and movements. They allow us to look into the past: to check what we liked, where we have been or what we did. Company owners can track their assets using GPS devices, parents can track their children via mobile phones, supermarkets can monitor our purchases using loyalty cards, and banks can locate us when we use cash machines. CCTV, Google Maps Street View and webcam technologies allow us to access remote locations simply by observing those locations and the objects and people in them. Technological advances in the storage, aggregation and extraction of information, both online and offline, raise privacy concerns that affect the acceptance of new technologies.
While one reason for this problem is technological invention, we cannot blame the technology for all privacy problems in ubiquitous computing. Nguyen and Mynatt note that a ubicomp system is not limited to devices of different sizes connected through a wireless network. It encompasses three environments in which people live, work and interact with each other: technical, physical and social. A ubicomp system is an ecology of devices (technology layer) situated in physical space (physical layer), in which people are connected (social layer).
Ubicomp technologies are just the beginning of this new information society, in which humans and the technology co-exist. They change our culture and the ways we interact with information and other people. They open new ways for communication, in which sharing personal information becomes a part of everyday communications.
The privacy risk we see here is sharing without realizing the consequences – i.e. a lack of awareness. Here we describe one approach to mitigating this risk. We report on our investigation into the efficacy of real-time feedback – privacy-protection technology that helps people become aware of the future implications of their privacy-related choices.
Introducing Real-Time Feedback
Although significant attempts have been made to support privacy awareness [6, 10], the design of usable awareness interfaces remains a major challenge. In our work we therefore focused on user interaction design and users’ reactions to the technology.
We borrowed from Erickson and Kellogg’s concept of social translucence, which supports awareness through a shared understanding that enforces accountability by making things visible to one another. While their approach was aimed mainly at the computer-supported cooperative work domain, we see a strong benefit in incorporating social translucence into privacy-aware ubicomp systems for the following reasons:
- First, since ubicomp systems encompass technical, physical and social environments, social translucence offers a more natural approach towards communication, in which the information flow between the three environments can be more effective.
- Second, beyond its support for visibility and awareness, the third characteristic of social translucence – accountability – offers great promise for stimulating privacy-respecting behaviour and enforcing social norms in digital systems. It is a useful design feature, as it might minimize the practical burden of privacy management highlighted by Hong.
In our work, awareness is achieved through real-time feedback as the method of informing users about how their information is being used. We define feedback as the notification of information disclosure, where the notification specifies what information about the person is disclosed, when, and to whom. This definition is drawn from the work of Bellotti and Sellen.
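As an illustration, such a notification can be thought of as a simple record capturing the what, when and whom of a disclosure. The sketch below uses our own field names for illustration; it is not taken from the Buddy Tracker implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DisclosureNotification:
    """One real-time feedback event: what was disclosed, when, and to whom."""
    owner: str       # whose data was disclosed
    requester: str   # who received the data
    data_item: str   # what was disclosed, e.g. "location"
    timestamp: datetime

    def message(self) -> str:
        # Human-readable text for an on-screen or auditory notification.
        return (f"{self.requester} viewed your {self.data_item} "
                f"at {self.timestamp:%H:%M}.")

note = DisclosureNotification("Alice", "Bob", "location",
                              datetime(2011, 5, 4, 14, 30, tzinfo=timezone.utc))
print(note.message())  # "Bob viewed your location at 14:30."
```

The same record can back any of the delivery channels discussed below: the text rendering suits visual or spoken feedback, while a vibration or LED channel would use only the event itself.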
Usage scenario: Buddy Tracker
We decided to evaluate our approach in the domain of mobile location-sharing applications. To this end we developed Buddy Tracker, a social networking application that allows peers to share their location in real time. We see mobile technology as a challenging design domain for awareness systems and interfaces, mainly because of its context-awareness potential and interaction design opportunities: context awareness, the richness of input and output methods, and robust hardware offer great promise for designing novel interactions for feedback on mobile devices.
Real-Time Feedback: How It Works
Every time a user of Buddy Tracker checks another user’s location, the system automatically sends a notification to the data owner, informing them of every check made on their location. Because both the requester and the data owner are aware of this notification process, the system supports awareness. Buddy Tracker supports several sensory dimensions for representing feedback: a flashing screen, an LED, an auditory message in natural language, vibration, or a graphical element on the screen. Example visual notifications are presented.
To provide a richer experience and minimize the negative effect of interruptions caused by inappropriate feedback representation, we implemented a feedback adaptation mechanism. Our system is capable of sensing the user’s context and adapting its behaviour accordingly. For example, vibration is used when the phone is detected in a pocket, and a flashing LED is used when the user is watching a video.
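At its core, the adaptation mechanism is a mapping from sensed context to feedback modality. The sketch below illustrates the idea with hypothetical context flags and modality names; the actual Buddy Tracker logic is learned from users’ preferences rather than hard-coded:

```python
def choose_modality(in_pocket: bool, watching_video: bool,
                    silent_mode: bool = False) -> str:
    """Pick the least intrusive feedback channel for the sensed context."""
    if in_pocket:
        return "vibration"   # screen and LED are invisible inside a pocket
    if watching_video:
        return "led"         # avoid interrupting audio/video playback
    if silent_mode:
        return "on-screen"   # visual-only when sound is unwanted
    return "sound"           # default: a short auditory notification

print(choose_modality(in_pocket=True, watching_video=False))   # vibration
print(choose_modality(in_pocket=False, watching_video=True))   # led
```

The ordering of the rules encodes a priority: physical constraints (phone in pocket) trump activity-based preferences, which in turn trump the default channel.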
The basic scenario illustrating our approach is presented in Figure 2. A user of the client application (U1) sends a request to view the location of a fellow user (U2) to the Buddy Tracker server. The server generates a response containing U2’s location information and sends it to U1. Additionally, the server generates a feedback response, which is sent to U2, informing them that U1 viewed their location. Both the data requester (U1) and the data owner (U2) are users of the Buddy Tracker client application.
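The round trip can be sketched in a few lines of server-side pseudocode. This is a deliberately minimal illustration of the flow, not the real implementation, which would also consult U2’s privacy preferences before answering:

```python
class BuddyTrackerServer:
    """Minimal sketch of the request/feedback flow: every location
    request also produces a feedback notification for the data owner."""

    def __init__(self):
        self.locations = {}   # user -> last known location
        self.inbox = {}       # user -> pending feedback notifications

    def update_location(self, user, location):
        self.locations[user] = location

    def request_location(self, requester, owner):
        # 1. Look up the owner's location on behalf of the requester.
        location = self.locations.get(owner)
        # 2. Generate the feedback response for the data owner, so the
        #    owner learns who viewed their location.
        self.inbox.setdefault(owner, []).append(
            f"{requester} viewed your location")
        return location

server = BuddyTrackerServer()
server.update_location("U2", (51.5, -0.1))
print(server.request_location("U1", "U2"))   # (51.5, -0.1)
print(server.inbox["U2"])                    # ['U1 viewed your location']
```

The key design point is that the feedback response is generated by the server on every request, so the data owner is notified regardless of which client the requester uses.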
Examining Real-Time Feedback: Does It Work?
We conducted several studies (focus groups, interviews, field trials) to:
- explore users’ reactions to the concept of real-time feedback as a means of supporting awareness and enforcing social norms in privacy-sensitive systems; and
- examine the impact of a socially translucent system on the enforcement of social norms. We were also interested in assessing whether associating contextual factors with users’ preferences can minimize the intrusiveness of real-time notifications while maximizing their effectiveness.
A total of 27 participants used our technology for 3 weeks. They were split into smaller groups and asked to use the Buddy Tracker application in their daily routine. People could share their location, set privacy preferences and check who had viewed them. Depending on the user’s context, real-time feedback was delivered in the form of vibration, sound, a flashing light, or a textual or graphical interface presented to the tracked person immediately after they had been looked up.
We designed and built a privacy-awareness system capable of adapting to the user’s context, which improved the user experience and had a positive impact on the acceptance of this technology. Moreover, we observed that the introduction of real-time feedback had a definite effect on participants’ use of the system: it did not stop them from using it, but it did limit usage to situations in which they felt they had a legitimate reason to check the data owner’s location. In other words, real-time feedback helped us introduce social norms into digital system usage practice.
Our studies indicate that one’s privacy can be protected with little to no effort by making things visible to one another. We showed that visibility, represented in the form of real-time notifications, resulted in better awareness of how the system works. This shows that a socially translucent architecture successfully enforces accountability and limits the number of unmotivated and unreasonable location requests, which in consequence helps preserve one’s privacy.
There is no strong consensus in the HCI community as to how privacy-awareness interfaces should be built. The ideas presented in this work provide a new starting point for designers of privacy-aware systems and go some way towards addressing the problem of awareness interfaces, which has been recognized as one of the key challenges for future work on privacy in HCI.
We showed that incorporating feedback in digital systems could be used to enforce social norms. We hope this work contributes to a discussion about novel ways of achieving privacy, including those that nudge people towards privacy-respecting behaviour. More studies are needed to explore what other design features can help people make better privacy choices.
More details about this work can be found in [1, 2, 3].
- [1] Mancini, C., Rogers, Y., Thomas, K., Joinson, A. N., Price, B. A., Bandara, A. K., Jędrzejczyk, Ł., Nuseibeh, B. “In the Best Families: Tracking and Relationships.” In Proceedings of the 29th International Conference on Human Factors in Computing Systems (CHI ’11), ACM, 2011.
- [2] Jędrzejczyk, Ł., Price, B. A., Bandara, A. K., Nuseibeh, B. “On the Impact of Real-time Feedback on Users’ Behaviour in Mobile Location-sharing Applications.” In Proceedings of SOUPS ’10, Redmond, WA, USA, 2010.
- [3] Jędrzejczyk, Ł., Mancini, C., Corapi, D., Price, B. A., Bandara, A. K., Nuseibeh, B. “Learning from Context: A Field Study of Privacy Awareness System for Mobile Devices.” Technical Report 2011/07, 2011.
- [4] Weiser, M. “The Computer for the 21st Century.” Scientific American 265(3): 94–104, 1991.
- [5] Iachello, G., Hong, J. “End-User Privacy in Human-Computer Interaction.” Now Publishers Inc., 2007.
- [6] Langheinrich, M. “Personal Privacy in Ubiquitous Computing: Tools and System Support.” PhD thesis, Swiss Federal Institute of Technology (ETH Zürich), Switzerland, 2005.
- [7] Bellotti, V., Sellen, A. “Design for Privacy in Ubiquitous Computing Environments.” In Proceedings of ECSCW ’93, 77–92, Milan, Italy, Kluwer Academic Publishers, 1993.
- [8] Erickson, T., Kellogg, W. “Social Translucence: An Approach to Designing Systems That Support Social Processes.” ACM Transactions on Computer-Human Interaction (TOCHI) 7(1): 59–83, 2000.
- [9] Nguyen, D., Mynatt, E. “Privacy Mirrors: Making Ubicomp Visible.” In Human Factors in Computing Systems: CHI 2001 (Workshop on Building the User Experience in Ubiquitous Computing), 2001.
- [10] Hong, J. I. “An Architecture for Privacy-Sensitive Ubiquitous Computing.” PhD thesis, University of California, Berkeley, Computer Science Division, 2005.