Adventures in AR

During the making of Amplifying Feedback Loop I was eager to find more methods for connecting with people and communities on issues, to help educate or even provide an open format for folks to start speaking out, taking action, and getting informed about what they could do, or what was possible, to combat climate change.

Ultimately, I also used this as an opportunity to explore an emerging technology, Augmented Reality (AR), to bridge that gap: a two-birds-one-stone approach to solving a problem.

I partnered up with animation and tech wizard Tahnee Gehm, who had crafted interactive experiences with her CoastARs works and other XR technologies. Not knowing how to code myself, I had a vision for what the site and AR would do, and while I provided all the art, Tahnee was integral in crafting a functional platform.

We chose not to build on a third-party application such as Snapchat, Facebook, or Adobe Aero, to avoid potential data-mining issues as well as remove any paywall for the user. This meant that Tahnee was working with neutral applications such as Pictarize Studio and Mind AR.

Funding for this portion came from an Individual Artist's Grant through the Genesee Valley Council on the Arts, which aided me when I wanted to explore how to engage communities and foster conversations around climate change in more passive or "low-conflict" ways through the use of AR for Amplifying Feedback Loop.

Producing the interactive work was a feat. While both of us have backgrounds in animation, the components needed to create a functional application proved difficult to manage at first, simply because of a difference in how we thought about media. Early on, I found myself stymied by my own lack of vocabulary and my limited understanding of how AR worked.

I drew diagrams to show what I hoped to achieve with video elements, static drawn elements, and text, and how the markers themselves would have to be oriented to be visible. These notes back and forth were a boon, and helped keep miscommunications and misunderstandings to a minimum throughout the process.

The AR works like this: the camera reads the visual on an AR marker, a printed image containing a unique high-contrast shape, and that marker is linked to the code that plays the corresponding visual.
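For readers curious what that linkage looks like in practice, here is a minimal sketch of an image-tracking page using MindAR's A-Frame integration, the kind of tool named above. This is not our actual code: the file names (`targets.mind`, `animation.mp4`), the CDN version numbers, and the entity layout are all assumptions for illustration.

```html
<!-- Sketch: one AR marker triggers one video overlay with MindAR + A-Frame.
     "targets.mind" is a hypothetical compiled marker file; "animation.mp4"
     is a placeholder video. -->
<html>
  <head>
    <script src="https://cdn.jsdelivr.net/npm/aframe/dist/aframe.min.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/mind-ar/dist/mindar-image-aframe.prod.js"></script>
  </head>
  <body>
    <a-scene mindar-image="imageTargetSrc: targets.mind"
             vr-mode-ui="enabled: false">
      <a-assets>
        <video id="clip" src="animation.mp4" loop muted playsinline></video>
      </a-assets>
      <a-camera position="0 0 0" look-controls="enabled: false"></a-camera>
      <!-- targetIndex 0 refers to the first marker compiled into targets.mind;
           when the camera recognizes that printed image, the video appears. -->
      <a-entity mindar-image-target="targetIndex: 0">
        <a-video src="#clip" width="1" height="1" position="0 0 0"></a-video>
      </a-entity>
    </a-scene>
  </body>
</html>
```

The key idea is that the printed marker itself carries no data; it is only a recognizable image, and the page's code decides what to play when that image is detected.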

By using placeholder graphics for the AR markers, we could tweak the action bit by bit, but the unique imagery we ended up with for the markers themselves also needed to be solidified as soon as possible, so the work could be printed in time for the works-in-progress Rochester Fringe Fest exhibition held at RIT City Art Space.

Final AR Marker #3 - printed coaster version

Final AR Marker #3 - simplified AR-readable version

High-contrast imagery is necessary for AR: anything unclear, muddy, or too "gray tone" would break the camera's ability to read the markers. So each final image printed on the paper coasters corresponded to a much simpler version as well, and functionality hinged on whether the algorithm could interpret the printed image as matching the simplified "8-bit" pixelated version. This meant many iterations, tweaks, inversions, and so on for the art that would be printed on the coasters.

The project ended with a website (www.amplifyARs.com) that functioned on both phone and desktop. The desktop version introduced the works-in-progress film, discussed climate change facts, and shared resources (as of 2021).

The phone version of the website asked for camera permission and then let the user view the AR video when pointed at any of the three AR markers. Users could then tap the screen to transition to the next text slide. Ideally, we had hoped each tap would activate a new animation, but the animation itself was very data-heavy, and higher complexity seemed to break the application.

Completed AR- connected site and markers viewed through an Android phone.

The final output of the AR elements and coasters still had issues across different phone types, and the long-term need to reprogram or adjust for each device, and against any updates those phones may receive, suggests that unless there is unlimited funding, a third-party app does offer more stability.

The desire to build a uniquely functioning application was an attempt to avoid paywalls and barriers to access, and while this was ultimately an exciting foray into more interactivity, I do not think this method of engaging folks was successful. Comfort with AR or QR codes is still far from a given; it skews toward younger audiences and more urban settings where these tools are more widely used (such as in New York City). The novelty can be engaging, but I fear its usage is still just that, novelty, and does not create any particular advantage in improving the visibility of a message.

I think this also comes from an oversight (or better yet, an undersight), which is the main lesson I have taken away from creating this film: the issue is enormous, and the task of creating a simple, holistic breakdown of the intensely interwoven nature of an "amplifying loop" (how each cause of warming, such as additional carbon, methane, and other gases, multiplies the effects, which in turn intensify the initial cause) was simply out of reach for me at this time.
