tyvan-zaport-arsculpture

Tyvan and Zaport created an Augmented Reality application that allows users to become lost in a virtual leaf pile.

Here’s a video:

This immersive AR app gives users the perspective of a small insect navigating through a massive leaf pile. We chose to install this piece on the Carnegie Mellon campus outside of Hunt Library. The virtual leaves and bark are made from photographs of physical plants in this area. In this way, we’ve restored nature to a region that has become urbanized. The insect’s perspective creates a sense of wonder and allows users to imagine a reality immersed in nature.

Here are the source images (photos taken outside Hunt Library at CMU):

And here’s a photo from the app:

jackalope-kerjos-arsketch

Check out this awesome path-following gesture we made with Google’s Just a Line: make your own racecourses, and mimic the body motions of your friends!

This project could be further developed with the option of establishing trigger game objects in a game development platform like Unity. With something like a start and end platform in the virtual space, you could define and time a race between two people along the same path, and really have a racecourse. Alternatively, additional virtual objects could be placed along the path as special trail markers or Easter eggs for your followers to find.

tyvan-zaport-arsketch

Tyvan and Zaport created an app that allows you to reforest urban spaces.

Here’s what you see on the device’s screen:

IMAGE HERE

Here’s an over-the-shoulder shot.

By tapping the face of a smartphone or other handheld device, users can drop a tree onto the ground. The tree will grow over time. Adding trees creates the illusion of a forest, which can be walked through. You can build your own forest fantasy in any urban area!
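As a rough Unity sketch of the interaction described above (the script, prefab, and growth rate here are hypothetical placeholders, not the app’s actual code), tap-to-plant-and-grow might look like:

```csharp
using UnityEngine;

// Hypothetical sketch: each tap raycasts into the scene and drops a tree
// where it hits the ground, and every planted tree grows over time.
public class TreePlanter : MonoBehaviour
{
    public GameObject treePrefab;         // assumed tree model, tagged "Tree" in the editor
    public float growthPerSecond = 0.05f; // assumed growth rate

    void Update()
    {
        // Plant a tree where a new tap hits a surface.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                GameObject tree = Instantiate(treePrefab, hit.point, Quaternion.identity);
                tree.transform.localScale = Vector3.one * 0.1f; // start small
            }
        }

        // Let every planted tree grow a little each frame.
        foreach (GameObject tree in GameObject.FindGameObjectsWithTag("Tree"))
            tree.transform.localScale += Vector3.one * growthPerSecond * Time.deltaTime;
    }
}
```

This depends on Unity’s runtime (touch input, raycasting, tags), so it is a sketch of the behavior rather than something runnable on its own.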

rolerman-sheep-arsketch

We went to a graveyard and drew flowers on people’s graves.

Unfortunately we forgot to take an over-the-shoulder video until we had left the graveyard, and we didn’t have the opportunity to return to re-document. Sorry!

We also made some zombies.

ookey-creatyde-arsculpture

Judging a book by its cover ~creatyde & ookey

headphones recommended

For our project, we decided to extend books beyond just their covers.  Rather than having the books exist in 2D space, we extended them into 3D using models image-mapped with some unique book covers.  Further, we wanted to utilize the location of a library and the limitations it places on sound.  Although libraries are associated with silence, these books literally speak, through sound that grows louder the closer you get to them.  The sound is also spatialized, so it moves between the left and right channels depending on the phone’s orientation.  We also wanted to use the setting of a library and the various genres of books to make these books seem like hidden sculptures that could be found throughout the shelves.  In the context of a library, they appear just like any other book.  Through AR, however, they are revealed to be much more.  Despite the different genres of books we selected, each one brings about powerful mental images, whether it is disco or micro-organisms.

 

farcar-tesh – arsculpture

Have a new gallery opening? Figuring out where to hang family photos? Now you can do all the planning on your phone! Introducing the Studio for Creative Inquiry’s Exhibit Planner.

With each tap on the screen, artworks slam down onto the walls in front of you, allowing you to place artworks on walls and judge their composition before actually putting them up. The Studio for Creative Inquiry’s Exhibit Planner is user-friendly, allowing users to go into Unity and swap in their own artworks for display. The artworks cycle with each tap of the device. Load in as few or as many artworks as you like!

The development of this application required clever thinking. The basic package of Vuforia came only with table (horizontal surface) detection, but our application was designed to work on walls. To get around this, we hacked the system to display our objects at a 90-degree angle from the detected surface.
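In Unity terms, that workaround can be sketched roughly like this (the script name and its placement are a reconstruction for illustration, not the project’s actual code):

```csharp
using UnityEngine;

// Rough sketch of the workaround: Vuforia's plane detection anchors content
// flat against the detected horizontal surface, so tilting the artwork
// 90 degrees about the X axis makes it stand upright like a wall piece.
public class WallHack : MonoBehaviour
{
    void Start()
    {
        // Rotate this object (the framed artwork) perpendicular
        // to the horizontal surface it was anchored on.
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f) * transform.localRotation;
    }
}
```

Because the rotation is applied once at startup, the artwork keeps Vuforia’s anchor position but renders as if mounted on a vertical surface.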

We created animations for each game object (image, frame, and backdrop).

Animations were called on startup once game objects were instantiated, and halted once the game objects were destroyed. The images are 3D objects whose material is the 2D artwork. A script handles cycling the material on each tap.

A global accumulator variable is attached to an empty game object outside the scene. Each time the device is tapped, the accumulator is incremented and a new game object consisting of the image and frame is generated; a script attached to the image reads the current index from the empty game object (the accumulator mod the number of available materials) and sets the image’s material to the corresponding artwork.

For example, if we have 4 images loaded into Unity and have tapped the screen 5 times, then on our 6th tap we will generate a new image and frame using image 6 mod 4 = 2.

Below is the script CubeClick which is attached to each image object.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CubeClick : MonoBehaviour {

    public Material material_01;
    public Material material_02;
    public Material material_03;
    public Material material_04;
    public int indx;

    void Start () {
        // Read the shared click index from the empty "GameObject" holding the Click script.
        indx = GameObject.Find("GameObject").GetComponent<Click>().clickIndx;
        if (indx == 0)
            GetComponent<Renderer>().material = material_01;
        else if (indx == 1)
            GetComponent<Renderer>().material = material_02;
        else if (indx == 2)
            GetComponent<Renderer>().material = material_03;
        else
            GetComponent<Renderer>().material = material_04;
    }
}

And here is the script Click, which is attached to the empty game object outside the scene.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
 
public class Click : MonoBehaviour {
 
    public int clickCount;
    public int clickIndx;
 
    void Update() {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            if(touch.phase == TouchPhase.Began)
                clickCount++;
        }
        clickIndx = clickCount % 4;
    }
}

So yeah, that’s our AR app…

 

 

yup…

 

 

Nothing more…

 

 

 

 

 

 

 

 

 

 

 

EXCEPT WHEN YOU HAVE YOUR GALLERY UP AFTER USING EXHIBIT PLANNER

THEN YOU CAN BRING HOME THE BIG DOLLARS WITH OUR EXHIBIT PLANNER EXTENSION: EXHIBIT PAY DAY!

tesh-miyehn-arsketch

We originally wanted to use this tool to prototype a physical 3D AR object or space that you could see into, or even go inside.

Our initial prototype and ideas are documented here:

dechoes-fatik-arsketch

Projects made jointly with Fatik.

Just a Line

The idea behind our AR experiment was to extrude 2D architectural plans into physical and tangible space. By layering the path created by tracing the outside walls of the architectural plan vertically, we were able to achieve a 3D space representative of the theoretical floor plan.

 

 

ookey-creatyde-arsketch

Line ghosts haunting the studio – creatyde & ookey

Our concept is something that would be experienced in the Just a Line app, or something with a similar level of “glitchiness” in where objects appear, especially in low lighting.  It would generate line monsters or ghosts that jump from their original position to a different location in the room, scaring you from their new spot.


tesh-miyehn-arsketch

Our initial idea was to create something that looks completely different (including in scale) when viewed from inside and outside, which is impossible to achieve in the real world. We first thought about creating a virtual 3D space that one can walk into and out of, but it was hard to draw enough lines before they floated away… so we had to scale down our idea, from creating a space that one can walk into, to maybe just a window that one can look through. And we drew what’s shown in these videos.

Do the lines look like a viewport? I doubt it… But at least it’s something I can walk around or “look at” as if it’s really there.


To be honest, we don’t think Just A Line is a very effective 3D prototyping tool. But we had fun playing with it, and even though we don’t know how it works or why it produces the errors it does, trying out the newest tools is still pretty cool.

dechoes_LookingOutwards7

AR project

Unsurprisingly, I had a really hard time finding an AR project that had not already been presented in class (a good one, that is). I settled on a Google Creative Lab project that, although featuring a more “typical” and boring wormhole, at least had an interesting twist to it.

The idea was that, instead of stepping into a wormhole and discovering a new, virtual place, you would step into the near past. This “past” had been previously captured by a 360 camera and was then transferred back into the wormhole. The idea itself is pretty simple, but it raises some interesting sci-fi questions about what will be possible in the near future.

Ackso-LookingOutwards07

An art collective called “MoMAR” (Museum of Modern Augmented Reality?) recently released their first exhibition at MoMA. However, the museum was not informed of this, because they used AR to place their work directly over an existing Jackson Pollock exhibit. This format carries a similar message to pieces like Banksy hanging his own paintings over ones in a gallery, but in a less aggressive way. The aim of the exhibition was to counteract the fact that “art is owned, valued, and defined by ‘the elite'”. In particular, the artists felt that members of the general public have been reduced to the roles of passive observers because only the richest galleries are canonized. They subverted this trend by not only making fun of the situation, but also giving the public an active role in viewing the work through AR. One could criticize the content itself for being shallow, since it relies on the surprise and wonder people have for this fresh new technology. However, the content may be quickly understood, but that doesn’t make it any worse than the inaccessible, “deep”, and highly conceptual work it covers up. In fact, the work’s accessibility underlines their main argument with this show: art is for everyone.

farcar-conye – arsketch

Let artwork come to life with a quick glance of a phone! Created by Oscar & Connie

By looking at artwork with your phone, you can see hidden pieces that were not there before. See artworks hold hands, yell at each other, or grow in size! The algorithm searches for gaps between pictures and adds its own effects to connect images with each other. It’s not an ARt Gallery without AR.

ookey-lookingoutwards07

 

To be honest, I find the Metaverse Nails brand to be kind of overwhelming.  That being said, I really appreciate them exploring the different spaces in which AR can exist.  I think there is a lot of potential in the concept of wearable AR, and nails are a clever space in which to explore it.  Even though their aesthetic doesn’t match my own, I appreciate their desire to keep explicit femininity along with the tech.  They state on their Instagram that they are “100% FemmeTech” and “Artist-Made.”

tesh – UnityEssentials

Lynda Tute

It Begins…

Chapter 1:

Laid out my file structure and imported the Unity Standard Assets pack

Chapter 2:

Wow made some 3D Objects!

Chapter 3:

Importing objects into Unity

Chapter 4:

*Note – up until this step I was working from scratch, but I had some issues with the imported materials in my version, so I switched over to some of the tutorial’s Unity files to help follow along.

Messed with materials and animated my own slime object

Chapter 5:

Making prefabs and messing around with objects

Chapter 6:

Designing, organizing, and building the level and then populating it with prefabs

Chapter 7:

Character can walk, run, and see the different animations I built into the level so far. Also there’s physics!

Chapter 8:

COLLIDERS!

Chapter 9:

Audio and sound effects

Chapter 10:

Lighting and reflectivity

Chapter 11:

More Lighting

Chapter 12:

Fog and particle effects

Chapter 13:

Bloom, color correction, and other post-processing

Chapter 14:

Triggering timeline animations and sound

Chapter 15:

The Final Build

 

agusman-lookingoutwards07

Hanely Weng, CoreML in ARKit

A simple project to detect objects and display 3D labels above them in AR. It uses iOS 11 beta, ARKit, and CoreML.

I’m fascinated by how machine learning, when integrated with AR, could allow us to surface more meaningful data points and information about our environment and the things inside it. While this specific project simply displays a title atop the detected object, I could imagine CoreML enabling more advanced real-time search capabilities.

aahdee-LookingOutwards07

Consumption Cycle by Heavy


This is an 84′ × 32′ mural created in Austin, Texas. When viewers held a mobile device up to the mural, they would see its 4 figures become animated. Viewers could approach it from all angles and see that there is depth to the AR animation. The artwork plays with ideas of production, consumption, and advertisement. The mural is also interactive: viewers can touch the words “production” or “beauty,” and their answers are recorded to a database to study how people interact with urban environments.

zbeok-UnityEssentials

As I mentioned to Prof. Levin earlier, I did not follow the standard Unity Essentials training; rather, I branched off to other, more advanced tutorials. I ended up going through Lynda.com courses and taking a few, which are documented below:

This tutorial was on custom tools in Unity. It was a good reference for using the Editor, so perhaps I’ll look into fancier subjects to actually put that to use.

Basically a walkthrough of an implementation of a pathfinding algorithm with no optimization. It was more a demo of using scripts for purposes beyond Unity’s built-in features than anything else.

I felt a bit shaky on my knowledge of materials, which this tutorial brushed me up on. I also learned about light probes and post-processing, which may be helpful in an art-heavy scenario.

I had wondered about account management in the past, but never knew where to start. This tutorial not only opened my eyes to the process, but also walked me through how to do it. It might be pretty good for a multiplayer thing; I don’t think it’ll be very applicable to the class but it’s a fun thing to know.

I didn’t find any tutorials referencing the specific words “autonomous agent,” but I think this tutorial came pretty close by introducing me to Unity’s native AI tools. It set up a scene from start to finish, then demonstrated some new and exciting AI features. I would love to learn about the subject in more depth, however, which did not happen in this specific tutorial.

Fluvio experiments. Unfortunately the version was not quite compatible, but the fluid system worked somewhat. This was a personal experiment involving going through the documentation, and not an actual tutorial, but I am interested in simulations, so I wanted to explore that.

 

tesh – Unity2

Ternary operator: Shorthand for if-else, structured as { condition ? “If Outcome” : “Else Outcome” }.
Static members: Allow simultaneous updating of a variable by associating it with the class rather than any instance, without requiring or allowing separate instantiations.
Overloading: Allows multiple methods with the same name so long as they have different parameter lists. Useful for consolidating methods that act similarly but take in different data types or structures.
Generics: Use a generic type parameter to obtain type information for a method when you are unsure what type of data you will receive. The types can be constrained with class names, “struct” to ensure value types, “class” to ensure reference types, or “new()” to ensure a valid constructor.
Inheritance: Class B inherits from parent Class A, meaning Class B has access to any public features of Class A. Class B is always an iteration of Class A.
Upcasting and downcasting: Using inheritance, child classes can have multiple types. You can “upcast” a child class to treat it as its parent class by initializing it as the parent type; to treat it as the child class again, you must “downcast” it.
Method hiding: The reverse of overriding. Using the “new” keyword, you can force child classes that are upcast to their parent class to use the parent’s version of a method when the method shares a name with a child class method.
Overriding: Allows children to have variations of a parent’s methods, or even additions to them. Upcast children calling overridden methods use the child’s version of the method rather than the parent’s, and can also call the parent’s version of a method by prefacing it with “base.”.
Interfaces: Similar to children of a parent, but when classes implement Interface A, functionality from Interface A can be shared among many unrelated classes. While you can only inherit from one class at a time, there is no such limit on implementing multiple interfaces.
Extension methods: Allow you to add functionality to classes whose source code you can’t edit or derive a child from.
Namespaces: Allow you to use classes among different scripts, as seen in the “using UnityEngine;” and “using System.Collections;” lines at the head of most Unity scripts.
Coroutines: Allow you to evaluate and run incrementing code without using the Update function or timers, for better efficiency.
Quaternions: Allow you to store and update rotational values without succumbing to gimbal lock the way Euler angles do.
Delegates: Containers for functions that are treated the way variables are for data.
Events: Delegates that act as broadcast systems, alerting subscribed methods to be called when the event occurs.
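The delegate and event notes above can be sketched in a few lines of plain C# (the names here are illustrative, not from the tutorial):

```csharp
using System;

public class Alarm
{
    // A delegate type: a "container for functions" treated like a variable type.
    public delegate void AlarmHandler(string message);

    // An event built on that delegate: a broadcast system that calls
    // every subscribed method when the event fires.
    public event AlarmHandler OnRing;

    public void Ring()
    {
        // Notify all subscribers, if there are any.
        OnRing?.Invoke("Wake up!");
    }
}

public static class Demo
{
    public static void Main()
    {
        var alarm = new Alarm();

        // Subscribe two listeners; both run when Ring() fires the event.
        alarm.OnRing += msg => Console.WriteLine("Listener A heard: " + msg);
        alarm.OnRing += msg => Console.WriteLine("Listener B heard: " + msg);

        alarm.Ring();
    }
}
```

Note that subscribers are added with `+=`, so many unrelated methods can listen to the same event, while only the declaring class can actually invoke it.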

sheep-LookingOutward07

I found “Super Mario Bros AR,” which was super interesting because it unlocks a lot of potential in the sorts of movements AR can make you do. The game is a recreation of the first level of Super Mario Bros, except taking place from a first-person perspective. It uses Unity and took a month to complete. Abhishek Singh, the sole creator, used the HoloLens to film the video. All in all, it’s a pretty silly, nostalgic video, but I think it reveals some interesting powers of AR: mainly, what sorts of actions feel semi-natural in AR, like walking, moving, snapping, and jumping (whereas knocking your head into objects didn’t seem quite right). First-person recreations of Mario are super common, from this one to this one. I think that is its main downfall: its reliance on nostalgia, though it feels very unique and different by really changing the ways the player interacts with the world.