What is an Augmented Intermediate Layer, or AIL?
As the physical and digital worlds begin to intertwine, many are looking to mixed realities to parse the information streams emanating from our ethereal and metaphysical lives. First coined as "mixed reality" in 1994, and now what some would, unfortunately, call the Metaverse™, virtual reality, augmented reality, and augmented virtuality have become intellectual property vehicles, spurring investment from media outlets, start-ups, and global conglomerates alike. As these technologies gain traction and support in the private sector, large investments have propelled "ancillary reality" technologies forward at a rapid pace. Continued research and development of both hardware and software for AR and VR creates new affordances in their application and acceptance across the consumer, artistic, and industrial sectors. As the research shows, however, this technology is still seen as a novelty, plagued by the growing pains of any new medium: complexity and inconsistency. As with any new technological medium, cultural acceptance and adoption are the keys to making an emergent and possibly disruptive technology ubiquitous.
As my academic interests orbit the realm of augmented reality and the affordances that its myriad contributing technologies allow us to create, the following exploration will not only discuss where we are technologically with this medium but also speculate as to how and why additional modalities need to be designed to push the form. The time to understand a new medium is when it is emerging.
Augmented Reality (AR) is a suite of hardware and software technologies that provide a digital overlay on the real world. Current AR modalities consist of hardware and software suites that function as head-mounted displays (HMDs), mobile device experiences (phones/tablets), heads-up displays (HUDs), or stereoscopic cameras using VR HMDs in a "see-through" mixed reality (MR) experience. Hardware and software combine to present digital information as a reality overlay/augmentation, incorporating entertainment media, technical knowledge, or data visualization. Currently in its infancy, Augmented Reality is thought to be the next outstanding personal computing achievement of the 21st century. With the development of mature software ecosystems to support more advanced and unobtrusive hardware, technologists and capitalists alike are keen to benefit from new consumer and industrial applications. Many in the research and investment community imagine the adoption of AR following a path akin to the mobile computing revolution spurred by the iPhone in the late 2000s, with revenue projections of over $300 billion by 2028.
A host of technologies are used to make augmented reality a, well… reality. Many of these technologies have been developed specifically for AR; many others are off-the-shelf components used in the entertainment industry, by the military, or shared among VR and MR applications. For simplification, I will not wade into the technical minutiae that make the myriad AR modalities work. I will cover the key components of these systems and refer to them as "ancillary" technologies, as these ancillary components will come into play as additional AR modalities are explored. Achieving functional AR relies on much of the same technology found in modern "self-driving" vehicles. Shared components include: computational imagery, networked databases, geolocation services, depth sensing, varifocal eye tracking, plate solving, machine learning, and mobile computing. AR is one of the more complex reality experiences to achieve, yet it is not as widely accepted or adopted in the consumer or industrial space as mobile computing devices. This is slowly changing, however, due to the rapid pace at which innovation is happening in this sector.
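To make the role of these shared components concrete, consider the registration step common to AR and self-driving stacks: once pose tracking (plate solving, geolocation, depth sensing) has located the device, a virtual overlay is anchored by projecting world coordinates through a camera model onto the screen. The sketch below is a minimal pinhole projection in Python; the function name, parameters, and numbers are illustrative assumptions for this essay, not part of any AR SDK.

```python
import numpy as np

def project_point(world_pt, cam_rotation, cam_translation, focal, center):
    """Project a 3D world point into 2D pixel coordinates
    using a simple pinhole camera model."""
    # Transform the point from world space into camera space.
    cam_pt = cam_rotation @ (world_pt - cam_translation)
    if cam_pt[2] <= 0:
        return None  # behind the camera; nothing to draw
    # Perspective divide, then scale and offset by the intrinsics.
    u = focal * cam_pt[0] / cam_pt[2] + center[0]
    v = focal * cam_pt[1] / cam_pt[2] + center[1]
    return (u, v)

# A camera at the origin looking down +Z, an 800 px focal length,
# and a principal point at the middle of a 1280x720 frame.
R = np.eye(3)
t = np.zeros(3)
pixel = project_point(np.array([0.5, 0.0, 2.0]), R, t, 800.0, (640.0, 360.0))
```

As the tracked pose (`R`, `t`) updates each frame, re-projecting the anchor keeps the overlay pinned to the same real-world spot, which is the essence of AR registration.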
As you can imagine, combining these emerging technologies creates complexity and apprehension for users without the technological know-how to engage in an AR activity. Wearing a headset, using a phone as a physical AR intermediary, or even donning a VR headset for a mixed reality experience all require the user to engage in the process actively. And even then, is it more than just a novelty? In exploring the complexity problem of AR, I asked a simple question: what would augmented reality look like if the user did not need to wear, hold, or deliberately interact with the technology? Preliminary explorations of this question yielded the creation of the Augmented Intermediate Layer (AIL), a subset of AR that bootstraps the modalities of the HUD, HMD, and transparent displays.
An Augmented Intermediate Layer is a catch-all term for an AR system that requires no deliberate input from the user and provides a digital augmented overlay that is both physically and consciously transparent. Think of a window in your kitchen: as you look out this window, digital media is presented in stereoscopic, parallax-accurate vision, transposed 1:1 onto the outside environment and projection-mapped in detail on the surrounding natural landscape. The media displayed could range from a recorded memory of an interaction with your children to the ETA of your grocery delivery. This curated digital vision relies on computational photography and user tracking to maintain the deception. What you see through this window blends into the real world seamlessly, automatically adjusting to your movements and position so as not to break the illusion of a true reality augmentation. Additionally, AILs can be upscaled for multiple interactors, larger spaces, and consumer products such as vehicles and amusements.
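The parallax-accurate, 1:1 transposition described above can be sketched geometrically: with head tracking supplying the viewer's eye position, the display computes which slice of the outside scene is visible through the window frame and renders exactly that. The Python sketch below models the window as a rectangle in the z = 0 plane and a flat scene plane behind it; all names and dimensions are illustrative assumptions for this speculative design, not an implementation of any shipping system.

```python
import numpy as np

def visible_region(eye, window_corners, scene_depth):
    """For a viewer at `eye`, cast rays through each corner of a
    window lying in the z = 0 plane and intersect them with a flat
    scene plane at z = scene_depth. The result is the rectangle of
    scene content the display must show to appear 1:1 real."""
    region = []
    for corner in window_corners:
        direction = corner - eye
        # Ray parameter where it crosses the scene plane.
        s = (scene_depth - eye[2]) / direction[2]
        region.append(eye + s * direction)
    return np.array(region)

# A 1 m x 1 m window centered at the origin; scene plane 5 m beyond it.
window = np.array([[-0.5, -0.5, 0.0], [0.5, -0.5, 0.0],
                   [0.5, 0.5, 0.0], [-0.5, 0.5, 0.0]])
head_on = visible_region(np.array([0.0, 0.0, -2.0]), window, 5.0)
stepped_left = visible_region(np.array([-1.0, 0.0, -2.0]), window, 5.0)
```

When the tracked viewer steps left, the visible slice of the scene shifts right, and updating the rendered region every frame is what preserves the illusion that the display is ordinary glass.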
Augmented intermediate layers are meant to be a departure from "traditional" AR. Without the need to interact with, wear, or manipulate the technology, the user experience shifts from an acute awareness of reality's augmentation to one in which the mediation recedes seamlessly into the background. I would argue that with the technology removed from the forefront of the AIL experience (compared to AR), the aesthetic presence of the artifact has changed. In the Benjaminian sense of the word, the user's perception of the window's aura creates an intersubjective experience, as if the window is gazing back at you. As there is a camera tracking the user, this is both physically and metaphysically true. The AIL "sees" you and collapses the distance between your digital self and the outside world. The object projected onto the outside world creates a relationship with the user looking out. Benjamin would see this subjectivity between the natural and digital image as the 'aura'. He states, "We define the aura of [natural objects] as the unique phenomenon of a distance, however close it may be. If, while resting on a summer afternoon, you follow with your eyes a mountain range on the horizon or a branch which casts its shadow over you, you experience the aura of those mountains, of that branch". The subjectivity of the foreground and background of one's gaze changes the experience, enhancing the perceived realness of the interaction.
Additionally, without the direct manipulation of a physical artifact such as an HMD, Janet Murray might suggest that the interactivity and immersion of an AIL would be a recursive experience, each reinforcing the other. She describes this "active creation of belief" as follows: "when we are immersed in a consistent environment, we are motivated to initiate actions that lead to the feeling of agency which deepens our sense of immersion". Dr. Murray is here referencing interactive narratives and agency. The digital narrative represented in the AIL would serve to entertain, inform, or reflect, which underscores the importance of user agency when designing the digital narrative the AIL would reveal. As the aforementioned AIL is a speculative design, this consideration would need to be thoughtfully addressed by its practitioners.
Furthermore, McLuhan's spectrum of participation best describes the main difference between an AIL and an AR HMD experience. According to McLuhan's theories on how new media change the perception of society, the traditional AR experience would be "cool media," as it requires a higher form of participation from the user. This participation could range from "providing missing information" to engaging more than one sense in ways that increase involvement. The HMD AR event also requires the user to manage many other things in context (gaze, user input, and motor skills) to facilitate its function. Augmented Intermediate Layers would most likely be considered "hot media". McLuhan defines hot media as those that present a depth of information, or high definition, to the user. AILs achieve this through integration with and transparency in the environment. Only the visual sense is engaged in this interaction, reducing the user's participation and increasing their commitment to the medium.
From a theoretical design standpoint, Augmented Intermediate Layers offer an exciting intervention point in developing other Augmented Reality experiences. At their core, AILs share the same DNA as the myriad AR implementations yet provide a drastically different experience and feel for the user. As the technology to bootstrap such experiences matures, there may be more opportunities to iterate and rethink what AR is and, more importantly, what it could be.
 Milgram, Paul & Kishino, Fumio. (1994). A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Information Systems. vol. E77-D, no. 12. 1321-1329.
 Stephenson, Neal, and Kodaj Dániel. Snow Crash. Metropolis Media, 1992.
 Di Serio, A., Ibáñez, M. B., & Kloos, C. D. (2013). Impact of an augmented reality system on students’ motivation for a visual art course. Computers & Education, 68, 586e596.
 Lin, T.-J., Duh, H. B.-L., Li, N., Wang, H.-Y., & Tsai, C.-C. (2013). An investigation of learners’ collaborative knowledge construction performances and behavior patterns in an augmented reality simulation system. Computers & Education, 68, 314– 321. doi:10.1016/j.compedu.2013.05.011
 Grand View Research. “Augmented Reality Market Size & Share Report, 2021–2028.” www.grandviewresearch.com/industry-analysis/augmented-reality-market.
 McLuhan, Marshall, and W. Terrence Gordon. Understanding Media: The Extensions of Man. Gingko Press, 2015.
 Sag, Anshel. “Why Microsoft Won the $22 Billion Army HoloLens 2 AR Deal.” Forbes, Forbes Magazine, 6 Apr. 2021, www.forbes.com/sites/moorinsights/2021/04/06/why-microsoft-won-the-22-billion-army-hololens-2-ar-deal/.
 Phelps, Daniel F. Affordances of an Augmented Intermediate Layer (AIL) in Scientific Applications, YouTube, 24 Nov. 2020, youtu.be/0o23vKJt-pQ.
 Benjamin, Walter, and J. A. Underwood. The Work of Art in the Age of Mechanical Reproduction. Penguin Books, 2008.
 Murray, Janet H. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. The MIT Press, 2017.
Camera Culture Group. “Tensor Displays: Compressive Light Field Synthesis Using Multilayer Displays with Directional Backlighting.” MIT Media Lab, web.media.mit.edu/~gordonw/TensorDisplays/.
MIT. “Layered 3D: Tomographic Image Synthesis for Attenuation-Based Light Field and High Dynamic Range Displays.” www.cs.ubc.ca/labs/imager/tr/2011/Wetzstein_SIG2011_Layered3D/.
Kimmel, Ron. “3D Shape Reconstruction From Autostereograms and Stereo.” Journal of Visual Communication and Image Representation, vol. 13, no. 1-2, 2002, pp. 324–333., doi:10.1006/jvci.2001.0486.
Johanna Drucker, Graphesis: Visual Forms of Knowledge Production (Cambridge, Mass.: Harvard University Press, 2014).
I coined the term Augmented Intermediate Layer in 2020 while exploring novel Augmented Reality systems used in art and industry today. This writing explains the concept's similarities to and differences from those systems through media theory and artistic practice.