ngdon – telematic

emojia.glitch.me

An online multiplayer world made entirely of emojis. Check it out at https://emojia.glitch.me

^ Gameplay on Windows 10

^ Left: Apple Color Emoji, Right: ASCII

^ Comparison across 4 OSes.

Process

Emojis

I collected the useful emojis into a dictionary and built a system to find and draw them. Since the center and rotation of individual emoji graphics are arbitrary and differ between OSes, I built a per-OS mapping to rectify them.
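Below is a minimal sketch of what such a mapping might look like; the structure, values, and names are illustrative, not the actual data in emojia.

```js
// Hypothetical per-OS correction table: offsets are fractions of the emoji's
// size, rotations in radians. The values here are made up for illustration.
const EMOJI_FIXES = {
  mac:     { '🐟': { dx: 0,    dy: 0,    rot: 0           } },
  windows: { '🐟': { dx: 0.05, dy: -0.1, rot: Math.PI     } },
  android: { '🐟': { dx: 0,    dy: 0.08, rot: Math.PI / 2 } },
};

// Apply the correction before positioning the emoji's <div>.
function rectify(os, emoji, x, y, rot, size) {
  const fix = (EMOJI_FIXES[os] || {})[emoji] || { dx: 0, dy: 0, rot: 0 };
  return {
    x: x + fix.dx * size,
    y: y + fix.dy * size,
    rot: rot + fix.rot,
  };
}
```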

Rendering

Emojis are rendered as text in HTML divs. It turns out that Chrome and Safari on macOS are buggy when displaying emojis. Emojis cannot be rotated to axis-aligned angles (90°, 180°, 270°, etc.), or they glitch wildly, getting scaled to random sizes every frame. Emojis also cannot be scaled above 130 pixels in height, or they don't show up at all. I made some tricks and hacks so these problems are not visible.
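Here is a hedged sketch of the kind of hack I mean: keep rotations off exact multiples of 90° and cap the rendered font size, scaling the div instead. The constants and function names are my own illustration, not necessarily what emojia actually does.

```js
const MAX_EMOJI_PX = 130;  // above this height, emojis stopped showing up for me
const EPSILON = 0.001;     // radians; small enough to be invisible

function safeTransform(div, rot, heightPx) {
  // Avoid exact multiples of 90°, which trigger the random-scaling glitch.
  const nearest = Math.round(rot / (Math.PI / 2)) * (Math.PI / 2);
  if (Math.abs(rot - nearest) < EPSILON) {
    rot = nearest + EPSILON;
  }
  // Render at the capped font size, and scale the div up if more is needed.
  const scale = Math.max(1, heightPx / MAX_EMOJI_PX);
  div.style.fontSize = Math.min(heightPx, MAX_EMOJI_PX) + 'px';
  div.style.transform = 'rotate(' + rot + 'rad) scale(' + scale + ')';
}
```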

Another problem is that Chrome and Safari on macOS cannot render emojis very efficiently. If there are many bunches of emojis flying all over the screen, the render loop gets very slow. The current solution is to not generate too many items, and to delete them as soon as they are no longer in sight.
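A minimal culling sketch, with assumed entity fields:

```js
// Remove an entity's div once it leaves the viewport, so the slow browsers
// never have to touch offscreen emojis.
function cullOffscreen(entities, viewport) {
  for (let i = entities.length - 1; i >= 0; i--) {
    const e = entities[i];
    const visible =
      e.x + e.size > viewport.left && e.x - e.size < viewport.right &&
      e.y + e.size > viewport.top  && e.y - e.size < viewport.bottom;
    if (!visible) {
      e.div.remove();          // drop the DOM node
      entities.splice(i, 1);   // and forget the entity
    }
  }
}
```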

Server

I used node.js on glitch.com to make the server. In my previous networked projects, precise sync was not crucial, and “fairness” for all the players didn’t have to be enforced. For this project, however, I had to redesign my data structures multiple times to find ones that optimize both speed and fairness.
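One way to read “speed and fairness” is an authoritative server that applies all inputs itself and broadcasts one shared snapshot to everyone at a fixed tick. The sketch below assumes socket.io and made-up state names; the actual emojia protocol may well be structured differently.

```js
const io = require('socket.io')(3000);

const world = { players: {}, items: [] };  // single source of truth on the server

io.on('connection', (socket) => {
  world.players[socket.id] = { x: 0, y: 0, emoji: '🙂' };
  socket.on('input', (input) => {
    // Apply inputs on the server, so no client can cheat its own position.
    const p = world.players[socket.id];
    p.x += input.dx;
    p.y += input.dy;
  });
  socket.on('disconnect', () => delete world.players[socket.id]);
});

// Everyone receives the same snapshot at the same rate: fair, and cheap to send.
setInterval(() => io.emit('snapshot', world), 50);
```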

glitch.com seems to be limiting my resources, so from time to time there can be outrageous lag (around 5 seconds) in the communication between server and client. I also tried setting up a server on my MacBook using serveo/ngrok, and the lag there is much more consistent.

I also found another solution, which is to use a good noise filter. This time I’m using the One Euro Filter. I had known about this filter for a while, but for my previous projects a lerp sufficed. For this project, though, a lerp means that everything is always behind its actual position, which is quite unacceptable for a game. The One Euro Filter turns out to be magical, and the game no longer looks laggy even on Glitch.
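For reference, here is a compact One Euro Filter written from the published formulation (Casiez et al.), not copied from emojia’s source; the parameter values are guesses you would tune per game.

```js
class OneEuroFilter {
  constructor(minCutoff = 1.0, beta = 0.01, dCutoff = 1.0) {
    this.minCutoff = minCutoff; // lower = smoother when the value is steady
    this.beta = beta;           // higher = less lag when the value moves fast
    this.dCutoff = dCutoff;     // cutoff for the derivative's own low-pass
    this.xPrev = null;
    this.dxPrev = 0;
  }
  alpha(cutoff, dt) {
    const tau = 1 / (2 * Math.PI * cutoff);
    return 1 / (1 + tau / dt);
  }
  filter(x, dt) {
    if (this.xPrev === null) {
      this.xPrev = x;
      return x;
    }
    // Smooth the derivative, then use its magnitude to adapt the cutoff:
    // steady values get heavy smoothing, fast motion gets low latency.
    const dx = (x - this.xPrev) / dt;
    const aD = this.alpha(this.dCutoff, dt);
    const dxHat = aD * dx + (1 - aD) * this.dxPrev;
    const cutoff = this.minCutoff + this.beta * Math.abs(dxHat);
    const a = this.alpha(cutoff, dt);
    const xHat = a * x + (1 - a) * this.xPrev;
    this.xPrev = xHat;
    this.dxPrev = dxHat;
    return xHat;
  }
}

// e.g. smooth each networked player's x/y with its own pair of filters:
// player.x = fx.filter(serverX, dt); player.y = fy.filter(serverY, dt);
```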

Game Mechanics

Once you connect, the server puts you into the most populated room that is not yet full. The capacity of a room is currently 4. I was hoping to allow arbitrarily many players in a room, since it would look really cool when many people are online; however, testing this made the network communication unbearably laggy.
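The room-assignment rule, as a small sketch with hypothetical names:

```js
const ROOM_CAPACITY = 4;

// Most populated room that still has a free slot; open a new one otherwise.
function pickRoom(rooms) {
  const open = rooms.filter(r => r.players.length < ROOM_CAPACITY);
  if (open.length === 0) {
    const fresh = { players: [] };
    rooms.push(fresh);
    return fresh;
  }
  return open.reduce((a, b) => (b.players.length > a.players.length ? b : a));
}
```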

When there are fewer than 4 human players in a room, bots join the game to make things more interesting. They look almost human, but can easily be distinguished by their “🤖” faces. Once another user wants to connect, a bot self-destructs to make space for the human.
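A sketch of the bot slot-filling, again with hypothetical names:

```js
const ROOM_CAPACITY = 4;  // same constant as in the room sketch above

// Keep the room topped up with bots, and retire one whenever a human needs the seat.
function fillWithBots(room) {
  while (room.players.length < ROOM_CAPACITY) {
    room.players.push({ isBot: true, emoji: '🤖' });
  }
}

function admitHuman(room, human) {
  const botIndex = room.players.findIndex(p => p.isBot);
  if (botIndex !== -1) {
    room.players.splice(botIndex, 1);  // the bot "self-destructs"
  }
  room.players.push(human);
}
```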

The bots are only of average intelligence, but it is already fun to watch them kill each other while my human character hides in a corner. Watching them, I feel the bots are having a lot of fun themselves.

Currently players can pick up food and other items on the floor, or hunt small animals passing by for more food. But I have many more ideas that I haven’t had the time to implement.

Platform Specific Looks

Since emojis look different on different OSes and browsers, I made a small utility that lets me render the same gameplay video on different devices for comparison. It works like this: I first play the game on my Mac, and my code writes down the position (and other state) of everything on screen for each and every frame. I then download that file and upload it to different browsers on different OSes to replay the same playthrough, rendered with platform-specific emojis.
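A record/replay sketch under assumed names: during live play, each frame’s renderable state is appended to an array; on another OS, the saved file is loaded and drawn frame by frame instead of running the simulation.

```js
const recording = [];

function recordFrame(entities) {
  // Copy only what the renderer needs: emoji, position, rotation, size.
  recording.push(entities.map(e => ({ emoji: e.emoji, x: e.x, y: e.y, rot: e.rot, size: e.size })));
}

function downloadRecording() {
  const blob = new Blob([JSON.stringify(recording)], { type: 'application/json' });
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = 'playthrough.json';
  a.click();
}

// On another OS/browser, load playthrough.json and draw frame i instead of
// simulating; drawEntities() stands in for the real renderer.
function playback(frames, drawEntities, fps = 30) {
  let i = 0;
  const timer = setInterval(() => {
    if (i >= frames.length) return clearInterval(timer);
    drawEntities(frames[i++]);
  }, 1000 / fps);
}
```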

So far I’ve rendered on 4 platforms: macOS Mojave, Windows 10, Ubuntu 18, and Android. I got access to Windows 10 by connecting to Virtual Andrew in VMware. I have a USB-bootable Linux which I plugged into my Mac to get the Linux recording. Golan lent me a Pixel phone, which I used to make the Android recording.

Though I made a timer in my code so all the playbacks should run at exactly the same speed, each browser seems to have a mind of its own, and the resulting 4 videos are still a bit out of sync. It also took me all night to put them together.

You can find the recordings in a 2×2 grid view at the top of this post.

Todo

  • Mobile/touch support
  • More types of interaction: drawing with emojis, leaving marks on the landscape, changing facial expressions, etc.
  • Tutorial, instructions, and other GUI

Let’s Practice: Alexa Music Practice Tool

  1. I made an Alexa skill with Abby Lannan, a graduate student in euphonium performance at CMU, that helps you practice music in various ways. It lets you play along with other people’s recordings or upload your own music, and it helps with ear training, intonation, and rhythm accuracy. The project was pitched at Amazon’s Alexa Day as a future Alexa project.

It tracks users’ practice progress and helps them achieve their practice goals. It synthesizes sounds on a cloud server based on voice commands.

Basic Functions:

Skill Development: It has a metronome and drone that offer a variety of sounds, and it lets users upload recordings to play music with each other (a rough sketch of a metronome intent handler follows this list).

Tutoring: It has an ear-training game that teaches music theory.

Record Keeping: It saves user info in a cloud database and lets users track long-term practice progress.
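This is not our actual skill code, just a hedged sketch of what a metronome intent handler could look like using the ask-sdk-core library; the intent name, slot, and audio URL are hypothetical.

```js
const Alexa = require('ask-sdk-core');

// Hypothetical "MetronomeIntent" with a "tempo" slot; the real skill's
// intents, slots, and audio hosting are not shown here.
const MetronomeIntentHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
      && Alexa.getIntentName(handlerInput.requestEnvelope) === 'MetronomeIntent';
  },
  handle(handlerInput) {
    const tempo = Alexa.getSlotValue(handlerInput.requestEnvelope, 'tempo') || '60';
    // Stream a pre-rendered click track at the requested tempo via AudioPlayer.
    const url = 'https://example.com/clicks/' + tempo + 'bpm.mp3';  // placeholder URL
    return handlerInput.responseBuilder
      .speak('Starting the metronome at ' + tempo + ' beats per minute.')
      .addAudioPlayerPlayDirective('REPLACE_ALL', url, 'metronome-' + tempo, 0)
      .getResponse();
  },
};

exports.handler = Alexa.SkillBuilders.custom()
  .addRequestHandlers(MetronomeIntentHandler)
  .lambda();
```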

Detailed slides are attached here:

2. I also attempted to make an Alexa skill that sends you an email with a PDF of pictures that ‘will make you happy.’ First, you tell Alexa what makes you feel happy or grateful. Then she sends you a PDF to remind you of the happy things in the world. The results look pretty much like trash.

[hands, anime, butt, beach]

[spaghetti, good weather, chris johanson, music]