Transcendence: AR Viewports in Architecture


ABSTRACT

With the advent and integration of modern technologies in current smartphone designs, augmented reality devices and experiences are now commonplace. Artists, corporations, and academia have been using smartphone experiences to capitalize on the affordances that mobile AR has to offer. From selling home goods to tabletop gaming, these experiences are the most familiar to the general public. Concurrently, hardware and software companies are in their third generation of mass-produced AR headwear that provides a hands-free digital overlay augmenting the world around us. Although not yet consumer-facing products, devices like the Microsoft HoloLens 2 (HL2) and the Magic Leap 1 (ML1) are targeted at enterprise spatial computing markets due to their complexity, cost, and lack of social acceptance among consumers.

However, since the inception of the concept of “augmented reality”, many more examples of this technology have come to permeate our daily lives: heads-up displays (HUDs) in our cars, simple “smart glasses”, and digital video overlays in drone technology, to name a few. As new technologies such as transparent OLEDs, flexible displays, and micro-LEDs are developed and reach the consumer market, how can they be used to create augmented reality display systems that provide affordances beyond their original purpose as simple digital displays?

This thesis project explores the additional affordances that consumer display technologies have to offer the augmented reality space. AR viewports provide an alternative interaction model for augmented reality, adding to the myriad ways AR can be deployed in practical applications with unique design constraints. This exploration results in the development of applied prototypes intended for multidisciplinary use in storytelling and data visualization.

ACM Concepts

  • Applied computing ➝ Arts and humanities ➝ Media arts
  • Human-centered computing ➝ Human computer interaction (HCI) ➝ Interactive systems and tools

Keywords

Augmented Reality, Projection Mapping, Viewports, User Tracking.

INTRODUCTION

Introduced in the early 1990s, the term augmented reality was defined by Tom Caudell [5]. Since then, the technology has grown exponentially into the fields of medicine, entertainment, tourism, gaming, and especially education [1]. AR is only one instance of the multiple virtual and artificial digital environments later outlined by Milgram and Kishino [6], wherein the authors defined four different types of “reality media”: the real environment (RE), augmented reality (AR), augmented virtuality (AV), and virtual environments (VE) [2]. Milgram and Kishino [6] would codify their four definitions as a singular mixed reality (MR) ecosystem. More recently, these different modes have fallen under the banner of “XR”, where X stands in for V (virtual), A (augmented), and any other new reality technologies developed in the near and distant future.

I believe that augmented reality technologies are a part of the Technosphere with great potential to penetrate further into our day-to-day lives. This is, in part, due to the maturing technology for spatial computing becoming smaller, cheaper, and more socially acceptable to consumers. As companies strive to create socially acceptable products enabling consumers to wear devices that create an immersive digital overlay on our world, we are on the precipice of commonplace AR merging our digital and physical identities. However, AR is already socially accepted through current, albeit limited and novel, means.

In its broadest use, we already see AR applied in our vehicles through line-of-sight heads-up display systems that relay vehicle information to the driver, and in our smartphones, which overlay augmented physical/digital avatar data through applications like Snapchat and Instagram. Games have also introduced augmented reality to the mainstream with the ever-popular Pokémon Go. These examples of augmented reality often go unnoticed by the average consumer, who may only relate “AR” to HMDs, if at all. This “stealth ubiquity” of current forms of AR is due to the broad definition of the technology’s application. Unaware that these technologies are melding the physical and digital, users are consuming AR as more than a novelty; they rely on it every day to make themselves look more attractive through skin-smoothing camera algorithms and to apply nostalgic effects through digital photo filters emulating the aesthetics of vintage Kodachrome or Polaroid processes.

These applications, I believe, explore only a small slice of what is possible, not only with the technology but with the creative application of that technology. If we are to truly utilize the broad definition that Tom Caudell put forth, we must seek alternative forms of AR beyond those that have already been produced. New technologies such as micro-LED textiles, transparent OLED displays, and flexible displays can be repurposed, with an AR design mindset, to exploit their new affordances. These creative experiments may lead to new applications of AR for both industry and entertainment.

AR Viewports

The AR Viewport project uses commercially available technologies, and the affordances they provide, to create targeted user experiences for storytelling and data visualization. Initially started in the Spring of 2020, the AR Viewport prototype utilizes discarded LCD panels to provide a singular AR storytelling experience from a fixed-point perspective. Designed during the Spring 2020 semester in a collaboration between ARCH 4017 – Senior Design Studio – Augmented Reality Lightweight Structures Architecture and LMC 6340 – Reality Experience Design, the AR Viewport is used as a linear video and projection-mapped storytelling device. The design constraints of this device are outlined in the course description of this interdisciplinary collaboration:

This research studio lies at the intersection of design computation, architecture, and computational media, with projects developed by collaborative teams of architects and digital media designers. The goal of this course/studio is to design and build 2-3 lightweight, interactive pavilions for installation on the site of the Lynching Memorial (The National Memorial for Peace and Justice) in Montgomery, Alabama. These pavilions will serve as extensions of the memorial and our senses by including augmented reality displays as part of the visitor experience. These architectures and their AR displays are to be designed to travel the globe to educate different publics on America’s history, the present, and the future. The aim of this project is not to reinforce these acts of violence, but instead to transform them, enrich them, and tell the stories of those who have been silenced. These pavilions will be built using textiles. Just as histories and culture are embedded in architecture, so too are histories and stories woven into fabrics and textile practices. Architecture students will develop conceptual and material connections between textiles and architecture in the design and construction of lightweight, mobile pavilions, while computational media students (LMC) will develop digital stories and experiences that connect these pavilions to the site, memorial, people, and their stories. How might we transform textiles to create experiences, cement memories, and transform how we interact with space in architecture? How might architecture and computational media inform each other to make meaning, acknowledge racial terrorism, and advocate for social justice? How might this architecture link and circulate important social, technical, and economic values? How might the architecture pavilions engage or interact with their surroundings and visitors so that human action is highlighted? By mid-semester, all teams will have mock-ups and prototypes of their physical and digital artifacts.

AR Narrative

Additional design constraints center on integrating a traditional narrative into the textile pavilion using AR. This narrative chronicles the story of Jesse Washington, a man who was lynched in Waco, Texas, at the age of 17. In collaboration with Morgan Chin (MS LMC student), the narrative below was developed with the AR Viewport in mind.

Main AR section with AR Viewport (bolded words indicate POV images from Jesse Washington’s perspective at the time of his lynching):


Jesse Washington is being led somewhere. He woke up with what seemed to be a bag on his head. He is very afraid and confused. He doesn’t understand what is going on. He is a 17-year-old farmhand from Waco, Texas, and he has never been this afraid before. He can’t see anything at all.


Everything is dark. Suddenly, he begins to hear distant voices. He can’t quite make out what the voices are saying, but he can hear them. He begins to be led closer to the voices. He starts to hear what sounds like people saying “We are going to teach him a lesson” and “Let’s show him what respect means.” This confuses him, because he doesn’t know what these people are talking about. He doesn’t know who they are talking to. He begins to see a distant and blurred light. As he moves closer to the light, the voices begin to get louder. It sounds as if there are many people near him, and they are all yelling loudly. What is going on?


“Step up, boy.” He hears someone telling him to step up. He is being led up stairs, but where is he going? The lights become a little clearer and the voices are very loud now. He feels very afraid and very confused. Suddenly he is stopped and the bag is snatched off his head. He quickly orients himself to his surroundings and realizes that he is staring into a sea of people. There must be thousands of people standing in front of him, and they are all yelling at him! He looks down at his hands and realizes that they are tied together with rope. He looks to his right and sees a rope hanging from a tree branch above him. He realizes instantly what is going on. He is about to be lynched. He is terrified and feels as if he is going to be sick. All he can hear are the shouts of a crowd that looks like it contains thousands of people. He feels hopeless and shocked by this sudden turn of events.

The narrative above forms the basis of the traditional video montage used in the AR Viewport.

Shot Descriptions

1. Bag covering the camera, with a little bit of light showing through.
2. Bag covering the camera, showing a bit of the outdoor surroundings; yelling is heard.
3. Bag is removed; an angry crowd yells at the person.
4. Person looks down at their hands and sees them tied with rope.
5. Person sees a rope drop down in front of them; crowds are visible in their peripheral vision.
6. Person sees the people next to them who are about to lynch them; crowds are visible in their peripheral vision.
7. A man says something to the person, and the rope is put over them.
8. The hanging of the person is heard; darkness/blackout.
9. Shot of the person on the platform, standing in front of the rope with hands tied, white men surrounding.
10. The crowd’s yelling is heard as the shot fades to darkness.
11. Fade to black.

Given the design constraints of this project, a device needed to be created that could project traditional POV video footage for one interactor at a time, providing a singular emotional experience, while also integrating with the textile pavilion designed by our collaborators in architecture. After spending some time thinking about what could be done outside of traditional “smartphone AR” design methods, we arrived at a solution that meets the design constraints of the collaboration while also providing a novel experience for the interactor.

Design

A demonstration of the viewport, the pavilion, and the narrative in action can be found here: youtu.be/nrHcEgPEWXA (the AR pavilion section starts around 1:50) [12].

The AR Viewport (v1.0) concept uses transparent LCDs or traditional backlit LCDs to simulate projection-mapped visuals on the surface of the textile pavilion (Figure 1). Due to the low transmissivity of the LCD panels, the textile pavilion must be white and well-lit to provide enough backlight to project the media back to the interactor.
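To make the “fixed-point perspective” concrete: because the interactor’s eye position is known, the media can be pre-warped with a planar homography so that, from that single viewpoint, the on-screen image appears registered to the pavilion surface behind the panel. Below is a minimal sketch of such a pre-warp using OpenCV; the corner coordinates and file name are hypothetical placeholders, not values measured from the actual installation.

```python
# Minimal sketch: pre-warping video frames with a planar homography so the
# image lines up with the pavilion surface when seen from one fixed
# viewpoint. All pixel coordinates and the file name are hypothetical
# placeholders, not values measured from the actual installation.
import cv2
import numpy as np

# Corners of the source media frame (assuming 1280x720 video).
src = np.float32([[0, 0], [1280, 0], [1280, 720], [0, 720]])

# Where those corners should land on the LCD so they appear registered to
# the textile surface from the port position (calibrated by eye).
dst = np.float32([[80, 40], [1210, 90], [1180, 680], [110, 700]])

H = cv2.getPerspectiveTransform(src, dst)

cap = cv2.VideoCapture("narrative_montage.mp4")  # hypothetical file name
while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of the linear narrative
    warped = cv2.warpPerspective(frame, H, (1280, 720))
    cv2.imshow("AR Viewport", warped)
    if cv2.waitKey(33) & 0xFF == 27:  # ~30 fps; Esc aborts
        break
cap.release()
cv2.destroyAllWindows()
```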

To stress the singular experience, an outside audience looking at the AR Viewport sees nothing displayed on the screen. The media is revealed to only one interactor at a time as they peer through the viewport and the narrative starts. This is achieved by removing the LCD panel’s first polarizing filter and relocating it to the “port” section of the device. When the interactor looks through the port, the polarizers reveal the mapped image to them alone, creating a novel singular experience that capitalizes on the use of point-of-view (POV) media in the Viewport. Additionally, a PIR (passive infrared) sensor in the device triggers the media playback system for the interactor, as the experience is linear with a start and an end point. Playback cannot be interrupted and, at its end, cues the interactor to move on through the pavilion to engage in physical interaction [12].
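The trigger-and-playback behavior described above is simple to prototype on a single-board computer. The sketch below shows one way the presence-triggered, uninterruptible linear playback could be wired together, assuming a Raspberry Pi, a PIR motion sensor on GPIO pin 4, and the VLC command-line player; none of these specifics are confirmed details of the actual build.

```python
# Minimal sketch of the presence-triggered, linear playback loop.
# Assumptions (not confirmed details of the build): a Raspberry Pi,
# a PIR motion sensor wired to GPIO pin 4, and VLC's command-line
# player (`cvlc`) for full-screen playback.
import subprocess
from gpiozero import MotionSensor

pir = MotionSensor(4)             # PIR presence sensor at the port
MEDIA = "narrative_montage.mp4"   # hypothetical media file name

while True:
    pir.wait_for_motion()         # block until an interactor peers in
    # Playback is linear and uninterruptible: this call returns only
    # once the montage has played through to its end point.
    subprocess.run(["cvlc", "--play-and-exit", "--fullscreen", MEDIA])
    pir.wait_for_no_motion()      # wait for the interactor to move on
```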

Further Work

As of this writing, I have used the foundation of AR Viewports for data visualization at Oak Ridge National Laboratory (ORNL). Under the guidance of Chad Steed, Director of the Visual Informatics for Science and Technology Advances (VISTA) Lab (https://www.ornl.gov/vis), this project has gone beyond the concept and application stage in storytelling to target real-world, real-time visual data analytics for the Exploratory Visualization Environment for Research in Science and Technology (EVEREST) data visualization facility. This facility provides data visualization for the Oak Ridge Leadership Computing Facility (OLCF), home to SUMMIT, one of the fastest supercomputers in the world (https://www.olcf.ornl.gov/summit/). The proposed project uses newly available transparent OLED or micro-LED screens in conjunction with operational data to monitor the state and health of the lab from SUMMIT’s “Overlook Room”. Version 2.2 of this project has been fully realized using transparent OLED screens as the AR medium. This work can be found here: youtu.be/0o23vKJt-pQ [13].
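As an illustration of the data-visualization variant, a minimal monitoring loop might poll an operational-metrics endpoint and format a few health readings for the overlay display. The endpoint URL and JSON field names below are hypothetical stand-ins; the actual EVEREST/OLCF data interfaces are not described in this document.

```python
# Minimal sketch of a monitoring loop for the data-visualization variant.
# The endpoint URL and JSON field names are hypothetical stand-ins, not
# the actual EVEREST/OLCF interfaces.
import time
import requests

METRICS_URL = "http://example.org/api/system-health"  # hypothetical

def fetch_health():
    resp = requests.get(METRICS_URL, timeout=5)
    resp.raise_for_status()
    return resp.json()

while True:
    try:
        data = fetch_health()
        # Render to the transparent OLED overlay; printing stands in here.
        print(f"jobs running: {data['jobs_running']}  "
              f"power draw: {data['power_kw']:.0f} kW  "
              f"healthy nodes: {data['nodes_ok']}/{data['nodes_total']}")
    except requests.RequestException as exc:
        print(f"metrics fetch failed: {exc}")
    time.sleep(10)  # refresh cadence for the overlay
```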

ACKNOWLEDGMENTS

First and foremost, a big thank you to Dr. Jay Bolter and Dr. Vernelle A. A. Noel for their support and guidance from conception to implementation. Since this project started as a collaborative initiative between Architecture and Digital Media, several members contributed within the design constraints outlined in ARCH 4017 – Senior Design Studio – Augmented Reality Lightweight Structures Architecture and LMC 6340 – Reality Experience Design:

Daniel Phelps (DM) – Hardware lead in charge of designing AR Viewport and Media content creation.

Morgan Chin (DM) – Story lead in the research and development of the AR Viewport POV narrative.

Tia Calhoun (ARCH) – Senior design student for Machines Pavilion textile work.

Karen Tran (ARCH) – Senior design student for Machines Pavilion textile work.

Morgan Lee (ARCH) – Senior design student for Machines Pavilion textile work.

Montana Ray (ARCH) – Senior design student for Machines Pavilion textile work.

REFERENCES

[1] Akçayır, Murat, and Gökçe Akçayır. “Advantages and Challenges Associated with Augmented Reality for Education: A Systematic Review of the Literature.” Educational Research Review 20 (2017): 1-11.
[2] Altinpulluk, Hakan. “Determining the Trends of Using Augmented Reality in Education between 2006-2016.” Education and Information Technologies 24.2 (2019): 1089-1114.
[3] De Sorbier, F., Takaya, Uematsu, Daribo, and Saito. “Augmented Reality for 3D TV Using Depth Camera Input.” 2010 16th International Conference on Virtual Systems and Multimedia (2010): 117-123.
[4] Hsu, C. W., et al. “Transparent Displays Enabled by Resonant Nanoparticle Scattering.” Nature Communications 5:3152 (2014). doi:10.1038/ncomms4152.
[5] Lee, K. “Augmented Reality in Education and Training.” TechTrends 56.2 (2012): 13-21. doi:10.1007/s11528-012-0559-3.
[6] Milgram, P., and F. Kishino. “A Taxonomy of Mixed Reality Visual Displays.” IEICE Transactions on Information and Systems 77.12 (1994): 1321-1329. https://search.ieice.org/bin/summary.php?id=e77-d_12_1321.
[7] Schneider, David. “Augmented-Reality TV [Hands On].” IEEE Spectrum 48.10 (2011): 22-23.
[8] “Transparent OLEDs: Introduction and Market Status.” OLED-Info.com, 12 Jan. 2020, oled-info.com/transparent-oleds.
[9] Zhang, H., L. Lin, Y. Liu, J. Yu, J. Wang, H. Liang, J. Zhou, L. Chen, L. Li, Y. Mao, and D. Huang. “6.3: Glasses-free Virtual Reality for Rehabilitation of Stroke Survivors.” SID Symposium Digest of Technical Papers 49 (2018): 57-59. doi:10.1002/sdtp.12638.
[10] Murray, Janet Horowitz. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. New York: Free Press, 1997.
[11] Bibiloni, A., M. Mascaró, P. Palmer, and Oliver. “Hypervideo, Augmented Reality on Interactive TV.” Communications in Computer and Information Science 389 (2015): 17-31.
[12] Phelps, Daniel, director. Architecture & AR Viewports. YouTube, The Georgia Institute of Technology, 14 Apr. 2020, youtu.be/nrHcEgPEWXA.
[13] Phelps, Daniel, director. Augmented Intermediate Layers (AIL) Research. YouTube, Oak Ridge National Laboratory, The Georgia Institute of Technology, 24 Nov. 2020, youtu.be/0o23vKJt-pQ.


Master’s Thesis, Accepted Spring 2021
