Ubiquitous Computing and Environmental Interaction

Abstract

The way we interact with computing technology has changed dramatically over the last half-century. From their origins as stationary devices sitting on our desks, computers have evolved to embed themselves in our daily lives by permeating the spaces we live in. They now control our thermostats, track our heart rates from our wrists, run our cars, and collect sensory data through complex arrays out in the field. This shift is often referred to as ubiquitous computing: the idea that computers are invisibly all around us, passively observing and interacting with individuals, groups of people, or the broader environment. As we continue to embed computers into our daily lives, it’s important to understand how they affect our behavior, the way we perceive the world around us, and how we interact with others. Further, as developers of these new forms of technology, we must understand how to design embedded, ubiquitous computing systems that fit in with our pre-existing behaviors as we navigate our environments. This research review looks at how computers, when set within the broader context of an environment, influence the way we behave, and at the considerations designers need to take into account in order to integrate these systems in unobtrusive and intelligent ways.

Introduction

Several major themes emerge when looking at how researchers measure the effectiveness of ubiquitous computing devices (also known as embedded systems) as they interact with environments, individuals, and larger groups. Furthermore, designers of embedded systems face several unique problems that they would not encounter if they were developing a purely physical product or a purely digital one. Devices that are ubiquitous in nature bridge the digital-physical gap and occupy a hybrid space that demands a slightly different way of thinking about design and the design process than traditional products. This is intrinsically tied to the idea that ubiquitous computing artifacts occupy multiple spaces (physical/digital) simultaneously (1).

One of those themes has to do with how designers and users understand the relationship between ubiquitous computers and the contexts that they reside within. In ubiquitous computing (ubicomp), understanding the physical state of the individual, the emotional state of the individual, and the environment that they exist within is critical to designing a device that engages users in context (2). This involves understanding how space is divided up not only by location, but also by clusters within those locations. It also involves understanding the significance of space to the user as they interact within it. That is to say, space is only given meaning within the context of the actions that a user takes within it. Therefore, space, and the meaning that the user gives it, is a significant factor that a designer must account for if they want to create unobtrusive, useful artifacts.
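
To make this concrete, here is a minimal sketch (in Python, with entirely hypothetical names and data) of one way a system might treat the meaning of a zone as something derived from the actions users take within it, rather than something fixed to the location itself:

```python
from collections import defaultdict

# Hypothetical sketch: a space is divided into locations and zones,
# and a zone's "meaning" is inferred from the actions observed in it.
class SpatialContext:
    def __init__(self):
        # (location, zone) -> list of observed actions
        self.observations = defaultdict(list)

    def observe(self, location, zone, action):
        """Record an action a user performed in a given zone."""
        self.observations[(location, zone)].append(action)

    def meaning_of(self, location, zone):
        """A zone's meaning is whatever activity dominates it."""
        actions = self.observations[(location, zone)]
        if not actions:
            return "unknown"
        return max(set(actions), key=actions.count)

context = SpatialContext()
context.observe("kitchen", "counter", "preparing food")
context.observe("kitchen", "counter", "preparing food")
context.observe("kitchen", "counter", "reading mail")
print(context.meaning_of("kitchen", "counter"))  # -> "preparing food"
```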

Second, testing ubiquitous computing devices is incredibly challenging, as the devices allow for both explicit and implicit interaction (3)(4). In other words, users can interact explicitly and directly with a device, much as we interact with many digital and physical products today. Alternatively, however, users can interact passively with ubicomp devices through sensors that detect motion, body posture, eye gaze, and other factors. Implicit interaction creates challenges for both user and designer. For users, it’s often unclear how they are interacting with a system when they aren’t intentionally and directly engaging with it. For designers, understanding how people navigate space, and how to design systems that respond to a user’s behavior in such a way as to provide actionable, useful information as it’s needed, is a challenge that requires intensive, and often costly, research.

Finally, since ubiquitous computing devices operate in a hybrid nature (in that they bridge the digital-physical gulf), they consequently exist in a fragmented environment (5)(6). That is to say, instead of objects being connected directly with one another, they are connected to the Internet. This changes the feedback loop that users typically rely on to understand how they are affecting the state of a system, as the feedback is no longer directly and meaningfully mapped to the object of interaction. As a result, designers must consider how to provide feedback to a user that makes sense given the actions that they’re taking. Further, users must be able to reconcile their interactions with devices to real outcomes.

Taking all of these factors into account, we begin to see a picture take shape in which ubicomp devices challenge both the way we navigate space and the way we design for users who are navigating space. These problems require a deeper analysis of how changing contexts, modes of interaction, and fragmented environments affect these new forms of technology. Along the way, we’ll look into the methodologies that some designers are developing to try to address these challenges.

Review of Research

Contexts

One of the challenges for ubiquitous computing is understanding when the user is confused and providing meaningful feedback to put them back on track. This requires some awareness of the context (environment, state of mind, etc.) that the user is working in. The things that people interact with (both in their homes and in public environments) are situated within a larger context of other types of things, which must also be taken into account when researching how the user interacts with ubiquitous computing (7).

In addition to the types of things that surround ubicomp devices, when designing for ubiquitous computing and IoT devices, one must consider the order of things in the home. That is to say, one must consider the spaces in the home, the zones within those spaces, how a user navigates those spaces in their daily routines to create meaning, and how to complement those spaces and activities with ubicomp devices (8).

These things are further complicated by the physical and emotional contexts that users maintain as they engage with ubiquitous devices, and by how those devices sense things like state of mind, gaze, and other psychological and physiological responses. For instance, researchers have investigated using eye tracking as a way to determine what users are concentrating on at any given time. If the right time, place, and action can be discerned by a system through these forms of tracking, then the correct information for that time can be given to the user (9). Physiological recognition in the form of posture has also been explored, so that embedded systems can detect when it might be necessary to provide feedback to a user. In addition to locational data, the state of things can be measured by analyzing the body of the individual for physical cues that might indicate things like frustration, confusion, and delight (10).
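
As a rough illustration of the kind of rule such a system might encode, the sketch below combines a gaze target with a coarse posture-derived state to decide whether to surface help. The sensor fields, thresholds, and device names are hypothetical and are not drawn from the cited systems:

```python
from dataclasses import dataclass

# Hypothetical sensor readings; real systems would derive these from
# eye tracking and posture recognition, as in the cited work.
@dataclass
class UserState:
    gaze_target: str        # what the user is looking at
    gaze_duration_s: float  # how long the gaze has dwelt there
    posture: str            # e.g. "leaning_in", "slumped", "neutral"

def should_offer_help(state: UserState) -> bool:
    """Offer contextual help only when attention and posture suggest
    the user is stuck on a device, not merely passing by it."""
    dwelling = state.gaze_duration_s > 5.0
    signs_of_frustration = state.posture in {"leaning_in", "slumped"}
    return dwelling and state.gaze_target == "thermostat" and signs_of_frustration

print(should_offer_help(UserState("thermostat", 8.2, "leaning_in")))  # -> True
```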

Finally, context is important as it relates to how users expect an interaction to fit into their current working models. Bill Buxton’s story of the “door cam” at Xerox PARC is one such example. The door cam was an extension of a larger system of cameras connected to desktop computers throughout the offices of PARC at the time. The cameras could be accessed by anyone, at any time, to dial in and talk to someone at their desk. Problematically, this (to some) felt like an intrusion of space. So additional cameras were installed directly outside the door of each person’s office, simulating what we would do if we wanted to approach someone in their office and talk with them: we would first look in the office to see if the person was busy (11). By affording people the ability to do things that they would also do in real life – and by being aware of contexts and social standards – designers can create ubicomp devices that situate themselves within the current working models of users.

Context is critical to how we navigate space through ubiquitous devices, and to how designers create ubicomp devices that reside naturally within our environments. It is also, notably, incredibly expensive and time-intensive to design artifacts that meet these requirements, as they require real, physical spaces with individuals navigating them in order to provide valid tests. Nevertheless, understanding how people navigate spaces, the zones within those spaces, and their physical and mental states as they do so is important for embedded devices. Further, due to the challenging nature of designing these devices well, a substantial body of research suggests that current ubicomp devices don’t naturally fit in with the flows of user behaviors (12).

Explicit and Implicit Interaction

The way that users engage with devices can be explicit or implicit. Explicit interaction with a device involves intent, in that the user must actively and purposefully engage with the device. Implicit interaction, on the other hand, occurs when the user is simply going about their daily life while sensors collect information and respond to their behavior (13). As we continue to embed computers into our daily lives, the explicit ways in which we interact with things are shifting toward more implicit forms of interaction.

Implicit forms of interaction require the categorization and classification of the types of behaviors that can occur within a space, and then tracking where people are in that space and what they are doing while in it. Categories might include things like interacting with appliances, furniture, toys, media, and so forth (14). Perhaps surprisingly, when tracking an individual and analyzing what they are doing in a space, the majority of their activity defies categorization (15). This could include things like walking (moving about the space), standing and thinking, and other non-critical interactions. Designers, therefore, must be able to discern a critical interaction from a non-critical one, and design objects that fit these needs.
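
A minimal sketch of what such a categorization step might look like follows, with made-up categories and rules; the important property is that unrecognized behavior falls into an explicit "uncategorized" bucket rather than being forced into a category:

```python
# Hypothetical activity categorizer for implicitly sensed behavior.
# Categories and rules are illustrative only; real deployments would
# derive them from observation studies of the space in question.
CATEGORY_RULES = {
    "appliance": {"opening fridge", "starting dishwasher", "using stove"},
    "media":     {"turning on tv", "playing music"},
    "furniture": {"sitting on couch", "lying on bed"},
}

def categorize(observed_behavior: str) -> str:
    for category, behaviors in CATEGORY_RULES.items():
        if observed_behavior in behaviors:
            return category
    # Much of what people do (walking, pausing, thinking) won't match:
    # keep it visible as "uncategorized" rather than forcing a label.
    return "uncategorized"

for behavior in ["opening fridge", "walking to window", "playing music"]:
    print(behavior, "->", categorize(behavior))
```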

The challenges involved in examining implicit interaction lie in the processing and computational requirements placed on machines attempting to track eye movement, body posture, and other similar physiological indicators. As Jalaliniya et al. note, in order for ubicomp devices to wayfind a user to the action they need to take, a system requires strong connections between multiple devices that can read a user’s behavior and reinforce the kinds of subliminal cues necessary to aid them in completing a task (8, 9). This is not to mention the inherent risks in the assumptions and biases we hold about what a user’s actions mean, and how a computer’s interpretation of those actions might be erroneous and even dangerous.

Implicit interaction is another influencing factor in the way we perceive and design for space through ubicomp devices in that it requires a designer to create a model that they believe fits with a user’s expectations. In doing so, the designer is making assumptions about how users navigate space, and attempting to influence the user’s behavior through subliminal messaging or other means. It is, in effect, the main differentiating factor between ubicomp devices and other non-hybridized artifacts that users engage with, and presents a strong challenge when it comes to designing objects that can remain unseen while simultaneously providing meaningful feedback to their owners.

Fragmented Environments

Fragmentation is built into the very nature of physical computing devices, as it is into other hybridized forms of media, and it requires the user to reconcile that fragmentation. Fragmentation emerges when one does not directly receive feedback from the thing that one is interacting with, and therefore does not immediately see how a system or object is responding to one’s behavior (16). In other words, since ubicomp devices are not physically interconnected, but instead connected and mediated through the Internet, the nature of that connection is not whole but fragmented. This creates a seemingly disjointed system that both influences how users navigate space and places responsibility on the designer to consider how that user navigates through their space.

Fragmentation influences how users navigate physical space by putting the onus on them to both discern and seek meaningful feedback from the actions that they take. This could take the form of confirming that a light has turned off after giving a command to a speech recognition device, or checking the temperature of one’s home after activating a geo-fence that enables temperature control in that home. By widening the gulf between action and response, fragmentation forces a user to seek meaning that emerges from the ubiquitous computing systems that they interact with.
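
The sketch below illustrates this pattern with a hypothetical, greatly simplified cloud-connected light: because the command and the device’s actual state travel separate paths, the system closes the feedback loop by explicitly confirming the reported state before telling the user anything. None of the names correspond to a real API:

```python
import time

# Hypothetical cloud-mediated device: the command and the device's
# actual state are decoupled, so feedback must be sought explicitly
# rather than assumed from the act of issuing the command.
class CloudLight:
    def __init__(self):
        self._state = "on"

    def send_command(self, command: str) -> None:
        # In a real system this is a network call that may silently fail.
        self._state = "off" if command == "turn_off" else "on"

    def reported_state(self) -> str:
        return self._state

def turn_off_with_confirmation(light: CloudLight, timeout_s: float = 2.0) -> str:
    """Issue the command, then confirm the device's reported state
    before giving the user any feedback."""
    light.send_command("turn_off")
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        if light.reported_state() == "off":
            return "The light is off."
        time.sleep(0.1)
    return "I sent the command, but couldn't confirm the light turned off."

print(turn_off_with_confirmation(CloudLight()))
```

The design choice worth noting is that the confirmation, not the command, is what generates the feedback the user sees.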

This is only further complicated when we consider multiple users interacting within a fragmented environment. Testing ubicomp devices becomes even more complicated as multiple individuals both implicitly and explicitly interact with ubiquitous systems, requiring a system to mediate concurrent behavior and overlapping feedback (17). Because of fragmentation, concurrent users can also influence the behaviors of other users interacting with a system by altering the state of the system and, thus, confounding the expectations of others.
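
One common way to mediate this, sketched below with a hypothetical shared thermostat, is to version the shared state so that a command based on a stale view is rejected and surfaced to the user, rather than silently overwriting another user’s change:

```python
import threading

# Hypothetical shared-state mediator: each command carries the version
# of the state the user last saw, so a change made concurrently by
# someone else is surfaced instead of being silently overwritten.
class MediatedThermostat:
    def __init__(self, temperature: int = 20):
        self._lock = threading.Lock()
        self._version = 0
        self._temperature = temperature

    def read(self):
        with self._lock:
            return self._version, self._temperature

    def set_temperature(self, expected_version: int, new_temp: int) -> bool:
        with self._lock:
            if expected_version != self._version:
                return False  # someone else changed the state first
            self._temperature = new_temp
            self._version += 1
            return True

thermostat = MediatedThermostat()
version, _ = thermostat.read()
print(thermostat.set_temperature(version, 22))  # True: first user's change applies
print(thermostat.set_temperature(version, 18))  # False: second user saw stale state
```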

By using the Internet to mediate their connections, ubiquitous devices are capable of taking advantage of interconnection and information abundance. At the same time, the fragmentation that emerges from decoupling physical inputs from feedback creates new forms of interaction that influence how users navigate space, and how designers create for those spaces. Understanding how fragmentation changes the way a user (or users) perceives the space around them should strongly influence the design of any ubiquitous computing device.

Conclusion

Ubiquitous computing devices are inherently tied to our navigation of space, and the ways in which we navigate space are tied to how designers create embedded, ubiquitous artifacts. The three primary design challenges of ubiquitous computing devices – context, modes of interaction, and fragmentation – influence how a user navigates their space, and how a designer understands their work within the larger stage of the environment in which it sits. As designers and users continue to adopt ubicomp devices, they should pay close attention to the influence that computers have on their behaviors and attitudes.

As further models are developed to measure the contextual use of ubiquitous artifacts, and how people interact with them, designers should take care when approaching such sensory and wayfinding methodologies as gaze tracking, body posture, and subliminal perception. At the same time, users must be critical of the devices that they engage with, and strive to understand the underlying systems that these fragmented artifacts run on top of.


Jay Margalus is on Twitter at @jaymargalus

References

1. Crabtree, A., & Rodden, T. (2007). Hybrid ecologies: Understanding cooperative interaction in emerging physical-digital environments. Personal and Ubiquitous Computing, 12(7), 481-493. doi:10.1007/s00779-007-0142-7
2, 15. Okada, M., & Tada, M. (2013). Understanding spatial contexts of the real world under explicit or tacit roles of location. Proceedings of the 25th Australian Computer-Human Interaction Conference on Augmentation, Application, Innovation, Collaboration – OzCHI '13. doi:10.1145/2541016.2541018
3, 10. Tan, C. S., Schöning, J., Luyten, K., & Coninx, K. (2013). Informing intelligent user interfaces by inferring affective states from body postures in ubiquitous computing environments. Proceedings of the 2013 International Conference on Intelligent User Interfaces – IUI '13. doi:10.1145/2449396.2449427
4, 13. Gomes, T., Abade, T., Campos, J. C., Harrison, M., & Silva, J. L. (2014). Rapid development of first person serious games using the APEX platform. Proceedings of the 29th Annual ACM Symposium on Applied Computing – SAC '14. doi:10.1145/2554850.2554969
5, 7, 8, 12, 14. Crabtree, A., & Tolmie, P. (2016). A day in the life of things in the home. Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing, February 27 – March 2, 2016, San Francisco, CA, USA.
6, 16. Crabtree, A., & Rodden, T. (2009). Understanding interaction in hybrid ubiquitous computing environments. Proceedings of the 8th International Conference on Mobile and Ubiquitous Multimedia – MUM '09. doi:10.1145/1658550.1658551
9. Jalaliniya, S., Pederson, T., & Mardanbegi, D. (2017). Symbiotic attention management in the context of internet of things. Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2017 ACM International Symposium on Wearable Computers – UbiComp '17. doi:10.1145/3123024.3124559
11. Buxton, B. (1997). “Living in Augmented Reality: Ubiquitous Media and Reactive Environments,” 01 Nov 1997.
17. Zilz, R. (2011). Specifying concurrent behavior to evaluate ubiquitous computing environments. Proceedings of the 3rd ACM SIGCHI Symposium on Engineering Interactive Computing Systems – EICS '11. doi:10.1145/1996461.1996540