farcar – ARTech

I began using Google's AR Core after realizing Vuforia had a lot of bugs. But then I realized that AR Core has a lot of bugs too, such as:

  • Cannot track vertical surfaces
  • Cannot recognize horizontal surfaces that are far away
  • Cannot track vertical surfaces
  • Cannot track planes well in the dark
  • Cannot track vertical surfaces
  • Plane tracking does not work on angled surfaces (no hills or inclines)
  • Cannot track vertical surfaces
  • Cannot track vertical surfaces
  • Did I forget to mention it can’t track vertical surfaces?

Ever since AR Core's release, fans have been asking for vertical wall detection. Google responds that it's something they're working on, but nobody knows when it might ship. For now, some people have hacked together their own vertical wall detection algorithm. It goes as follows:

  • Read from the point-cloud data.
  • Filter the data (drop any point whose confidence falls below a chosen threshold).
  • Assemble the remaining points into a matrix and solve for the plane's normal vector.
  • Use the normal vector to create a tracking plane.
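The "solve for the normal vector" step can be sketched in plain C# with no Unity or ARCore dependency. All names here are mine, not ARCore's, and this is an outline under one assumption: the points have already been confidence-filtered. It builds the 3x3 covariance matrix of the points; for near-planar data that matrix is nearly rank-2, and its null direction (the plane normal) is approximated by the largest cross product of its rows.

```csharp
using System;

static class PlaneFit
{
    // Estimate the unit normal of the best-fit plane through pts,
    // where each point is a double[3] of { x, y, z }.
    public static double[] FitNormal(double[][] pts)
    {
        int n = pts.Length;

        // Centroid of the points.
        double cx = 0, cy = 0, cz = 0;
        foreach (var p in pts) { cx += p[0]; cy += p[1]; cz += p[2]; }
        cx /= n; cy /= n; cz /= n;

        // Symmetric 3x3 covariance of the centered points.
        double xx = 0, xy = 0, xz = 0, yy = 0, yz = 0, zz = 0;
        foreach (var p in pts)
        {
            double dx = p[0] - cx, dy = p[1] - cy, dz = p[2] - cz;
            xx += dx * dx; xy += dx * dy; xz += dx * dz;
            yy += dy * dy; yz += dy * dz; zz += dz * dz;
        }

        // For planar data the covariance rows all lie in the plane, so the
        // cross product of two independent rows points along the normal.
        // Pick the largest cross product for numerical stability.
        double[][] rows = {
            new[] { xx, xy, xz },
            new[] { xy, yy, yz },
            new[] { xz, yz, zz },
        };
        double[] best = null;
        double bestLen = -1;
        for (int i = 0; i < 3; i++)
        for (int j = i + 1; j < 3; j++)
        {
            double[] a = rows[i], b = rows[j];
            double[] c = {
                a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0],
            };
            double len = Math.Sqrt(c[0] * c[0] + c[1] * c[1] + c[2] * c[2]);
            if (len > bestLen) { bestLen = len; best = c; }
        }

        // Normalize to a unit vector.
        for (int k = 0; k < 3; k++) best[k] /= bestLen;
        return best;
    }
}
```

For points sampled on a vertical wall at constant x, for example, this returns a normal of approximately (±1, 0, 0), which is exactly what ARCore's horizontal-only detection never gives you.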

Here’s the catch: Google makes it hard, very hard, to generate planes. It is easy to read the Point Cloud and TrackedPlane data, but using that data to instantiate a new TrackedPlane is not an easy task. They bury this code deep in their scripts. I searched through more than 10 TrackedPlane scripts only to find no example of how to actually generate one, and nobody online has posted how to do it either.

As a result, I was unable to get vertical wall detection to work. The most I got out of the last week was being able to read the Point Cloud data. I didn’t want to spend all my time this week without any feasible product, so I put the wall detection algorithm aside and focused on another technical problem.
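For reference, the point-cloud reading I did get working looks roughly like this in the GoogleARCore Unity SDK. Method names have shifted between SDK releases (e.g. GetPoint vs. later variants), and the 0.5 confidence threshold is my own choice, so treat this as an outline rather than drop-in code:

```csharp
using System.Collections.Generic;
using GoogleARCore;
using UnityEngine;

public class PointCloudReader : MonoBehaviour
{
    const float MinConfidence = 0.5f;  // tune; points below this are dropped
    List<Vector3> _filtered = new List<Vector3>();

    void Update()
    {
        // Skip frames where the point cloud hasn't changed.
        if (!Frame.PointCloud.IsUpdatedThisFrame)
            return;

        _filtered.Clear();
        for (int i = 0; i < Frame.PointCloud.PointCount; i++)
        {
            // In the SDK version I used, xyz is the world-space position
            // and w is the confidence value.
            Vector4 p = Frame.PointCloud.GetPoint(i);
            if (p.w >= MinConfidence)
                _filtered.Add(new Vector3(p.x, p.y, p.z));
        }
        // _filtered is what would feed the plane-fitting step above.
    }
}
```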

Another, more feasible challenge was generating text that steps through a lyric-segment array, updating on each event call (a tap of the screen). I ended up using the default Android AR Core demo, replacing its Android icon prefab with a 3D text prefab. From there, I wrote a script that updates the contents of the text with each tap of the device. I stored all the lyric segments in an array, and an accumulator counting the total number of taps indicates which index of the array to use as the new text content.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class TextUpdate : MonoBehaviour {

    public int clickCount;
    public TextMesh textLayer;

    // The lyric segments to cycle through, one per tap.
    string[] content = new string[] { "How", "are", "you?" };

    void Update() {
        if (Input.touchCount > 0) {
            Touch touch = Input.GetTouch(0);
            // React only when a touch begins, not on every frame it is held.
            if (touch.phase == TouchPhase.Began) {
                // Show the next segment, wrapping around at the end.
                textLayer.text = content[clickCount % content.Length];
                clickCount++;
            }
        }
    }
}

This allows me to go out into the world and place pre-determined text onto surfaces, as follows:

Note that the product is not meant to be a real-time AR interaction. Rather, it is a way for videographers to record the AR interaction and edit the footage to the audio in an expensive (After Effects) or not-so-expensive (iMovie) editor. There is no official way to screen-record on the Google Pixel, though, so I downloaded a third-party application, AZ Screen Recorder. Although it records decently, it cannot reach HD quality.

Another challenge is timing the text. Because AR Core is slow to detect new surfaces, it is hard to place lyrics down fast enough for fast songs. After experimenting some more, I found it easier to place text at half speed, giving AR Core time to recognize tracking planes, and then speed the footage up in editing to match the actual audio.

I made a demo of this and saw drastic improvements over real-time text placement.

AR Core also does an undesirable thing where it picks a random color and paints each tracker plane that color to indicate the active area, and most of the time the color is distracting. I modified the code to paint tracker planes white only, so the overlay doesn't pull attention away from the text.
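The change itself is small. In the SDK's plane visualizer script, the line that picks a random entry from a color array can be replaced with a fixed white. The field name and the "_GridColor" shader property below follow the HelloAR example in the SDK version I used and may differ in other releases, so this is a sketch of the edit rather than exact code:

```csharp
using UnityEngine;

public class WhitePlaneVisualizer : MonoBehaviour
{
    private MeshRenderer m_MeshRenderer;

    void Awake()
    {
        m_MeshRenderer = GetComponent<MeshRenderer>();
        // Was: a random pick from the visualizer's plane-color array.
        m_MeshRenderer.material.SetColor("_GridColor", Color.white);
    }
}
```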