I'm Eugene Jahn
a software developer

I am an undergraduate researcher at the UW Reality Lab



At DefHack, my team developed a project called EzPen, an integrated web application that detects the motion of your mobile device and uses it as a mouse (pen) for drawing on a laptop.

I built a webpage that POSTs the phone's acceleration data; a Unity client GETs the readings, calculates the movement path, and draws the line on the laptop. The hardest part of this project was reducing drift and latency.
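The path calculation above can be sketched as a simple dead-reckoning step that double-integrates the acceleration samples. This is a minimal illustration, not the EzPen code; the class and method names are made up for the example.

```java
// Hypothetical sketch: integrate phone acceleration samples into a 2D path
// by Euler integration (acceleration -> velocity -> position).
public class PathIntegrator {
    private double vx = 0, vy = 0; // velocity in m/s
    private double x = 0, y = 0;   // position in m

    // Advance the state by one sample: ax, ay in m/s^2, dt in seconds.
    public void step(double ax, double ay, double dt) {
        vx += ax * dt;
        vy += ay * dt;
        x += vx * dt;
        y += vy * dt;
    }

    public double getX() { return x; }
    public double getY() { return y; }

    public static void main(String[] args) {
        PathIntegrator p = new PathIntegrator();
        // constant 1 m/s^2 along x for 1 s, sampled at 100 Hz
        for (int i = 0; i < 100; i++) p.step(1.0, 0.0, 0.01);
        System.out.println(p.getX()); // close to the analytic 0.5 m
    }
}
```

Because each step compounds sensor noise, errors grow quickly, which is exactly why drift reduction was the hard part of the project.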

AR Basketball Training

An AR application that simulates a basketball training scene to make shooting-accuracy training more efficient.

The application is built with Unity. It calculates the best force and angle to shoot the ball from the user's position and also records the force and angle of the user's actual shot, helping players improve their shooting skills. Future work is to port it to HoloLens.
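The "best force and angle" computation comes down to standard projectile motion: fixing a launch angle, you can solve for the speed needed to pass through the hoop. The sketch below shows only that physics, with illustrative names and example numbers that are not taken from the project.

```java
// Hypothetical sketch of the projectile math behind the trainer: from the
// ballistic trajectory y = x*tan(a) - g*x^2 / (2*v^2*cos^2(a)), solve for
// the launch speed v that reaches a target d metres away, h metres up.
public class ShotSolver {
    static final double G = 9.81; // gravity, m/s^2

    public static double requiredSpeed(double d, double h, double angleDeg) {
        double a = Math.toRadians(angleDeg);
        double denom = 2 * Math.cos(a) * Math.cos(a) * (d * Math.tan(a) - h);
        if (denom <= 0)
            throw new IllegalArgumentException("angle too flat to reach target");
        return Math.sqrt(G * d * d / denom);
    }

    public static void main(String[] args) {
        // free-throw-like numbers: 4.2 m out, rim 0.6 m above release, 52 deg
        System.out.printf("%.2f m/s%n", requiredSpeed(4.2, 0.6, 52.0));
    }
}
```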

IoT ShoppingRecord

A food app to keep track of what you bought

People often forget what they bought at the supermarket, which lets food spoil. If the cashier scans the item's QR code and sends the information to the user's account, users can check from their smartphone what they bought and when each item expires.

I built this app with Java and PHP. The cashier uses a scanner website to upload the item's data to the user's MySQL database, and users can then check their items from the Java app.
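The app-side expiry check can be sketched as below. The record type, field names, and shelf-life numbers are illustrative assumptions, not the project's actual schema.

```java
import java.time.LocalDate;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical sketch of the expiry check: each purchase gets an expiry
// date from its purchase date plus a shelf life, and the app filters for
// items that have already expired.
public class ShoppingRecord {
    record Item(String name, LocalDate purchased, int shelfLifeDays) {
        LocalDate expires() { return purchased.plusDays(shelfLifeDays); }
    }

    // Items whose expiry date is on or before the given day.
    static List<Item> expiredBy(List<Item> items, LocalDate today) {
        return items.stream()
                .filter(i -> !i.expires().isAfter(today))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Item> cart = List.of(
                new Item("milk", LocalDate.of(2024, 3, 1), 7),
                new Item("rice", LocalDate.of(2024, 3, 1), 365));
        // milk (expired 2024-03-08) is listed; rice is not
        System.out.println(expiredBy(cart, LocalDate.of(2024, 3, 10)));
    }
}
```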

Robotic Guide Dog

A robotic guide dog that can guide and assist people with visual impairments. The user controls the robot by voice or joystick, and the robot has three functions: 1. GPS navigation (guiding the user to a commanded destination), 2. obstacle avoidance, and 3. an intelligent personal assistant (answering questions about the weather and the time, setting alarms, etc.).

I built this robot with a Raspberry Pi and sensors such as a microphone, a GPS module, and an ultrasonic sensor. The user controls the car by voice: the Raspberry Pi's microphone records the sound and sends the audio file to the OLAMI service for NLP, and after receiving the result it analyzes the intent and responds to the user.
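The analyze-and-respond step amounts to routing the intent returned by the NLP service to one of the robot's functions. The sketch below shows only that dispatch idea; the intent names and replies are invented for illustration and do not come from the project.

```java
// Hypothetical sketch of intent dispatch after the NLP service replies:
// map the recognized intent string to one of the robot's behaviours.
public class CommandRouter {
    public static String handle(String intent) {
        switch (intent) {
            case "navigate":  return "starting GPS navigation";
            case "weather":   return "fetching today's weather";
            case "set_alarm": return "alarm set";
            default:          return "sorry, I didn't understand";
        }
    }

    public static void main(String[] args) {
        System.out.println(handle("navigate")); // prints "starting GPS navigation"
    }
}
```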

Carrot Top

Using AR (HoloLens) to tell people when to harvest

At the 48-hour AT&T AR/VR hackathon, my team created CarrotTop, an AR application for Microsoft HoloLens. CarrotTop uses the HoloLens's built-in camera to predict the length of the carrot buried underground from the shape of its leaves. Combining image recognition with augmented reality, it overlays a hologram on top of the real carrot with optimal harvest information, helping farmers and amateurs harvest the carrot at the right time.