GREENTRaiL:

 

figma, swift, terra api, ios healthkit, firebase


PROJECT ROLE: UX/UI DESIGNER, CONCEPTUAL RESEARCHER



Winner of Best Use of Terra API (Health Track) at HackHarvard 2023

A hiking app that synthesizes biometric data pulled from wearables through the Terra API with wildlife data to recommend hiking routes based on activity levels.
Created as part of HackHarvard 2023 with Luke Atkins, Ruth Lu, and Tingwei Shi.









inspiration:


Hiking has exploded in popularity since the pandemic, with more than 80 million Americans hiking in 2022 alone. Hiking offers significant mental and physical health benefits; however, selecting routes as a beginner can be daunting. It is difficult to imagine how a route will feel before going on it, especially for those without past experience to draw on.

In addition, hikers often don't take wildlife into account when choosing routes. Animals such as elk have been shown to change their behavior up to a mile away from hiking trails, which has far-reaching implications for the greater biosphere. With climate change already threatening traditional migration paths, increased human activity can further disrupt these fragile patterns.

GREENTRaiL is an app that gives users personalized recommendations and helps make hiking more eco-friendly.



how we built it:



UI/UX prototypes were first sketched traditionally, then brought into Procreate to develop the final color palette and brand identity. High-fidelity wireframing was then done in Figma, and the final UI/UX was refined from those prototypes.

GREENTRaiL was coded in Swift and integrates the Terra API to pull a user's wearable data and aggregate the data of everyone who has previously taken a given trail.
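To illustrate the data flow only (the app itself is written in Swift), here is a minimal sketch of the kind of REST request the Terra integration reduces to, shown in C++ with libcurl. The endpoint path, header names, and placeholder credentials are assumptions for illustration, not GREENTRaiL's production code.

```cpp
// Minimal sketch of a Terra-style REST pull, for illustration only.
// Endpoint, headers, and IDs below are assumptions, not the real integration.
#include <curl/curl.h>
#include <iostream>
#include <string>

// Collect the response body into a std::string.
static size_t writeBody(char* data, size_t size, size_t nmemb, void* userdata) {
    static_cast<std::string*>(userdata)->append(data, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    // Hypothetical request: daily activity summaries for one connected user.
    std::string url =
        "https://api.tryterra.co/v2/daily"
        "?user_id=EXAMPLE_USER&start_date=2023-10-20&end_date=2023-10-21";

    // Assumed auth scheme: a developer ID and API key sent as headers.
    struct curl_slist* headers = nullptr;
    headers = curl_slist_append(headers, "dev-id: EXAMPLE_DEV_ID");
    headers = curl_slist_append(headers, "x-api-key: EXAMPLE_API_KEY");

    std::string body;
    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeBody);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

    if (curl_easy_perform(curl) == CURLE_OK) {
        // The JSON summaries (steps, heart rate, calories, ...) are what
        // would feed the route-recommendation logic.
        std::cout << body << "\n";
    }

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```

In the app, the same request is made natively in Swift; the per-user summaries are then aggregated across everyone who has hiked a trail to match routes to activity levels.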













I’LL BYTE:

VR HAPTIC JAW
 






















unity, arduino, servo motors, vr, blender, singularity SDK, oculus quest
 
PROJECT ROLE: CONCEPTUAL RESEARCHER, HARDWARE TESTER



Winner of MIT RealityHack Hardware Track  
Winner of MIT RealityHack Technology Horizons For Human Interfacing


A futuristic Bluetooth VR jaw haptic device that interfaces with Unity to simulate eating in VR.
Created as part of MIT RealityHack 2023 with Pepi Ng, Grace Park, Julia Daser, and Beatriz Ribeiro,
with help from T Chen, Aubrey, and Lukas.


I'llByte - Haptic Jaw MIT from Julia Daser on Vimeo.

inspiration

The current state of VR technology focuses on creating an immersive experience through sight, hearing, and touch. However, to date there has been no VR device that stimulates the jaw, even though eating and food are such integral parts of the human experience.

We created a futuristic jaw haptic device that consists of two parts: an adjustable jaw brace and a harness. The jaw brace is made of aluminum wire and rubber bands and attaches to the harness via springs, whose tension is adjusted automatically by motors. When a user “eats” a food item in the XR experience, the spring tension is set according to that item's texture, simulating how difficult or easy different items are to chew.


how we built it 

I’llByte was built in Unity 2021.3.16f. When the VR controllers interact with a food item, the program sends a Bluetooth message via MIT’s The Singularity. The SDK then connects to the microcontroller on the back of the harness, which spins the servo motors attached to the front of the harness. As a result, the springs attached to the jaw brace are pulled, creating tension dependent on the VR experience. Three tension modes are set up for the prototype VR game, creating three different “textures” of food.
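As a sketch of the harness side, the microcontroller's control loop could look like the following Arduino C++, assuming the standard Servo library and a one-byte “texture mode” message; the pins, angles, and message format are illustrative assumptions (the real link runs over Bluetooth through The Singularity).

```cpp
// Illustrative harness-side loop: read a one-byte texture mode sent from the
// Unity side and pull the jaw springs to a matching tension. Pin numbers,
// angles, and the serial protocol are assumptions, not the actual device.
#include <Servo.h>

Servo leftSpring;   // servo pulling the left jaw spring
Servo rightSpring;  // servo pulling the right jaw spring

// Three texture modes used by the prototype game:
// 0 = soft, 1 = medium, 2 = tough/chewy.
const int TENSION_ANGLE[3] = {20, 70, 120};

void setup() {
  Serial.begin(9600);
  leftSpring.attach(9);    // assumed servo pins
  rightSpring.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    int mode = Serial.read();
    if (mode >= 0 && mode <= 2) {
      // More tension = the jaw is harder to move, so the food feels chewier.
      leftSpring.write(TENSION_ANGLE[mode]);
      rightSpring.write(180 - TENSION_ANGLE[mode]);  // mirrored mounting
    }
  }
}
```

The three entries in TENSION_ANGLE correspond to the three food “textures” in the prototype game.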
















24–09–2024

DEEP BRAIN EMULATOR

laser cut acrylic, cnc milled copper circuit board, arduino, adobe illustrator, programmable LEDs


Created for HYPERMEAT at Grace Exhibition Space, this is an emulator of a deep brain stimulator.



inspiration

The theme of HYPERMEAT was electronic interfacing for the human body: how can robots interact in symbiotic or parasitic ways with humans? I looked to real-world examples of machines embedded in humans, such as pacemakers and probes. One fascinating use is neuromodulation, which applies artificial electrical currents to treat severe epilepsy.

Deep brain stimulation is a relatively new treatment for epilepsy. Neurostimulators are implanted in the anterior thalamus, and current is generated regardless of seizure activity in order to mitigate symptoms.

This piece puts you in the reverse position: as a human, you must stop the seizure in the robot.


YOU ARE THE NEUROSTIMULATOR




how i built it


I first created the initial sketches of a 3-frame animation in Procreate, inspired by old medical illustrations. I retraced them in Adobe Illustrator and laser cut them onto separate sheets of acrylic. I then programmed the LEDs on an Arduino using the FastLED library, creating 3 rows for each face. Afterwards, I created the brain in Illustrator, CNC milled it in copper, and used the Arduino capacitive touch library to make the brain interactive, mapping capacitive touch values to different animations depending on the distance of the hand. Finally, the box was laser cut and engraved to provide easy access to the circuits.

Users tap a portion of the brain to trigger the stabilization animation (purple flashing), after which the LEDs return to white.
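As a minimal sketch of that interaction loop, the Arduino logic could look like the following, assuming the CapacitiveSensor and FastLED libraries; the pins, LED count, thresholds, and timings are illustrative assumptions rather than the exhibited wiring.

```cpp
// Illustrative interaction loop: a capacitive reading from the copper brain
// selects an animation; a direct tap triggers the purple "stabilization"
// flash before the LEDs settle back to white. Pins, counts, and thresholds
// are assumptions.
#include <CapacitiveSensor.h>
#include <FastLED.h>

const int NUM_LEDS = 30;              // assumed: 3 rows of 10 LEDs per face
CRGB leds[NUM_LEDS];

CapacitiveSensor brainPad(4, 2);      // send pin 4, receive pin 2 (assumed)

void setup() {
  FastLED.addLeds<WS2812B, 6, GRB>(leds, NUM_LEDS);  // data on pin 6 (assumed)
  fill_solid(leds, NUM_LEDS, CRGB::White);
  FastLED.show();
}

void loop() {
  long reading = brainPad.capacitiveSensor(30);  // 30 samples per reading

  if (reading > 1000) {
    // Direct tap: the "seizure" is stabilized with a purple flash...
    FastLED.setBrightness(255);
    for (int i = 0; i < 6; i++) {
      fill_solid(leds, NUM_LEDS, (i % 2) ? CRGB::Purple : CRGB::Black);
      FastLED.show();
      delay(120);
    }
    // ...then the LEDs settle back to white.
    fill_solid(leds, NUM_LEDS, CRGB::White);
    FastLED.show();
  } else if (reading > 200) {
    // Hand hovering nearby: map the capacitive value (i.e. hand distance)
    // to LED brightness so the piece reacts before it is touched.
    uint8_t level = map(constrain(reading, 200, 1000), 200, 1000, 255, 60);
    FastLED.setBrightness(level);
    fill_solid(leds, NUM_LEDS, CRGB::White);
    FastLED.show();
  }
}
```

Mapping the raw capacitive values to animations this way is what lets the piece distinguish a hovering hand from a deliberate tap.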












24–09–2024