Pison Experience
At Pison Technology, I contributed to the development of innovative user experiences centered on neural-based gesture control. As noted earlier, much of this work remains confidential under NDA; however, I have included publicly shareable details and images wherever possible.
Key contributions include:
- Gesture-Controlled VR Experience Suite
- Gesture Calibration Visual Guide
- Gesture-Based Spotify Controller
- AR Spell-Casting Game
Gesture-Controlled VR Experience Suite – Unity (VR/Prototype Wearable Integration)
Developed a collection of immersive VR mini-games and interactive demos to showcase the capabilities of gesture-based input using a prototype wrist-wearable device. The project focused on designing intuitive, enjoyable user experiences despite the inherent limitations of early gesture-tracking hardware. Each experience balanced technical feasibility with clear demonstrations of practical gesture control applications.
Temple Run Clone:
Redesigned the classic Temple Run gameplay to function fluidly with gesture-based controls, where fast-twitch touchscreen mechanics were not viable. Given the latency and limited responsiveness of the prototype wearable, I focused on improving visual feedback and player anticipation.
- Introduced vibrant color cues and clear on-screen indicators to help users plan actions in advance.
- Rebalanced game pacing to shift emphasis from reaction speed to predictive planning, preserving tension and engagement without demanding split-second input precision (see the timing sketch below).
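To make the anticipation cues concrete, here is a minimal sketch of the idea, with a hypothetical lead window (the actual tuning values are not public): an obstacle's cue fades in well before the required input, so the player plans the gesture rather than reacting to it.

```kotlin
// Minimal sketch of the cue-timing idea: an obstacle's cue fades in over a long
// lead window so the player can plan the gesture ahead of time. The lead value
// is hypothetical; the real tuning is not public.

const val CUE_LEAD_SECONDS = 2.5f // hypothetical lead window

/** Cue intensity in [0, 1]: 0 = not yet visible, 1 = obstacle is imminent. */
fun cueIntensity(secondsUntilObstacle: Float): Float =
    (1f - secondsUntilObstacle / CUE_LEAD_SECONDS).coerceIn(0f, 1f)

fun main() {
    // Cue ramps up as the obstacle approaches the required-input moment.
    listOf(3f, 2f, 1f, 0.25f).forEach { t ->
        println("t-minus %.2fs -> cue intensity %.2f".format(t, cueIntensity(t)))
    }
}
```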
Rhythm Game:
Designed and developed a rhythm-based gesture game with falling “ghost hands” that visually indicated required gestures in time with a beat.
- Initial prototype featured horizontal movement across four gesture columns, mimicking a musical conductor's motion.
- Conducted user testing with non-gamer participants (representative of target B2B clients), which revealed that the horizontal mechanic added unnecessary complexity.
- Iterated on the design by removing lateral movement, simplifying gameplay to enhance clarity and accessibility, which ultimately made the gesture concept easier to understand and more compelling for demonstration purposes.
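A minimal sketch of the beat-synchronized prompt timing described above, using hypothetical BPM and fall-speed values: each ghost hand is spawned early enough that it reaches the hit line exactly on its beat.

```kotlin
// Minimal sketch of beat-synchronized spawn timing. BPM, distance, and fall
// speed are hypothetical values; the real tuning is not public.

const val BPM = 100.0
const val FALL_DISTANCE = 5.0 // units from spawn point to hit line
const val FALL_SPEED = 2.5    // units per second

val secondsPerBeat = 60.0 / BPM
val fallLeadTime = FALL_DISTANCE / FALL_SPEED // seconds a ghost hand is falling

/** Song time (in seconds) at which to spawn the prompt for [beatIndex]. */
fun spawnTimeForBeat(beatIndex: Int): Double =
    beatIndex * secondsPerBeat - fallLeadTime

fun main() {
    (4..7).forEach { beat ->
        println("beat $beat: spawn ghost hand at %.2fs".format(spawnTimeForBeat(beat)))
    }
}
```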
Smart TV Demo:
Created a virtual smart TV environment that could be entirely controlled using hand gestures, demonstrating a potential real-world use case for gesture input beyond gaming.
- Implemented gesture-based interactions such as play/pause, volume control, channel switching, and movie selection (see the dispatch sketch below).
- Served as a compelling proof of concept for future household tech integration without relying on physical remotes or buttons.
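A minimal sketch of the gesture-to-command dispatch behind the virtual TV; the gesture names here are hypothetical placeholders, since the real gesture vocabulary is NDA-protected.

```kotlin
// Minimal sketch of gesture-to-TV-command dispatch. Gesture names are
// hypothetical placeholders for the confidential gesture vocabulary.

enum class TvGesture { PINCH, ROLL_UP, ROLL_DOWN, SWIPE_LEFT, SWIPE_RIGHT }

class VirtualTv {
    var playing = false
        private set
    var volume = 5
        private set
    var channel = 1
        private set

    fun handle(gesture: TvGesture) {
        when (gesture) {
            TvGesture.PINCH -> playing = !playing                         // play/pause toggle
            TvGesture.ROLL_UP -> volume = (volume + 1).coerceAtMost(10)   // volume up
            TvGesture.ROLL_DOWN -> volume = (volume - 1).coerceAtLeast(0) // volume down
            TvGesture.SWIPE_RIGHT -> channel += 1                         // next channel
            TvGesture.SWIPE_LEFT -> channel = (channel - 1).coerceAtLeast(1)
        }
    }
}

fun main() {
    val tv = VirtualTv()
    listOf(TvGesture.PINCH, TvGesture.ROLL_UP, TvGesture.SWIPE_RIGHT).forEach(tv::handle)
    println("playing=${tv.playing} volume=${tv.volume} channel=${tv.channel}")
}
```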


Pictured: the prototype wrist wearable used for these experiences.
Gesture Calibration Visual Guide – Unity + Blender (Android App)
Spearheaded the design and development of an Android-based Unity application to guide users through the calibration process for gesture input using the Pison wearable device. This interactive visual tutorial ensured accurate calibration by combining real-time feedback with intuitive, spatially aware UI elements.
Core Features & Implementation
- Developed a full Unity-based calibration interface for mobile (Android), walking users through every gesture supported by the wearable hardware.
- Designed a clean UI layout with the gesture name at the top, followed by a progress indicator: a radial fill circle surrounding a 3D model demonstrating the target gesture.
- As users performed each gesture, the system captured live data and filled the progress circle in real time; once the gesture detection criteria were met, it triggered the backend calibration logic seamlessly (see the sketch below).
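A minimal sketch of the progress logic, assuming a hypothetical sample-count criterion: live frames that match the target gesture fill the radial indicator, and hitting the threshold hands off to the backend calibration step.

```kotlin
// Minimal sketch of the radial-fill progress logic; the sample-count criterion
// is hypothetical. In the Unity app, this fraction drove the fill amount of the
// radial indicator drawn around the 3D hand model.

class GestureCalibrationStep(
    private val requiredSamples: Int = 60, // hypothetical detection criterion
    private val onCalibrated: () -> Unit,
) {
    private var validSamples = 0

    /** Feed one frame of live sensor data; returns the fill fraction in [0, 1]. */
    fun onSample(matchesTargetGesture: Boolean): Float {
        if (matchesTargetGesture && validSamples < requiredSamples) {
            validSamples++
            // Criteria met: hand off to the backend calibration logic.
            if (validSamples == requiredSamples) onCalibrated()
        }
        return validSamples.toFloat() / requiredSamples
    }
}

fun main() {
    val step = GestureCalibrationStep(requiredSamples = 3) { println("gesture calibrated") }
    repeat(4) { println("fill = %.2f".format(step.onSample(matchesTargetGesture = true))) }
}
```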
3D Modeling & Animation
- Self-taught Blender to create and animate a low-poly hand model capable of demonstrating each gesture.
- Built dynamic gesture transitions, allowing the hand model to animate fluidly between poses during calibration.
- Designed and animated a low-poly human figure for a fun and engaging intro sequence: a cinematic camera pan from a bird's-eye view into a close-up of the hand, helping users spatially contextualize the calibration process.

Gesture-Based Spotify Controller – Android (Kotlin + Spotify API)
Developed an Android application that integrates with Spotify and allows users to control playback entirely through custom gesture inputs. Built in Android Studio using Kotlin, this project combined third-party API integration with gesture-based interaction design to explore alternative, hands-free media control.
Core Functionality & Features
- Enabled gesture-based control for Spotify, supporting actions such as:
  - Play / Pause
  - Skip Forward / Reverse Track
  - Shuffle
  - Volume Up / Down
- Connected directly to Spotify using the official Spotify Android SDK and its remote control (App Remote) API.
- Built gesture recognition and control logic to interface with Spotify commands, offering a seamless, intuitive user experience without physical touch input (see the sketch below).
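A minimal sketch of the integration using the public Spotify App Remote API. The SDK calls shown are real; the gesture names and the recognition pipeline are hypothetical stand-ins for NDA-protected code, and volume handling is omitted.

```kotlin
// Minimal sketch of the Spotify integration. The Spotify App Remote calls shown
// are the SDK's real public API; the gesture enum is a hypothetical placeholder
// for the confidential recognition pipeline. Volume handling is omitted.

import android.content.Context
import com.spotify.android.appremote.api.ConnectionParams
import com.spotify.android.appremote.api.Connector
import com.spotify.android.appremote.api.SpotifyAppRemote

// Hypothetical gesture names; the real vocabulary is confidential.
enum class MediaGesture { TAP, SWIPE_RIGHT, SWIPE_LEFT, DOUBLE_TAP }

class SpotifyGestureController(private val remote: SpotifyAppRemote) {

    /** Maps a recognized gesture onto a Spotify playback command. */
    fun onGesture(gesture: MediaGesture) {
        val player = remote.playerApi
        when (gesture) {
            // Play/pause: read the current state, then toggle it.
            MediaGesture.TAP -> player.playerState.setResultCallback { state ->
                if (state.isPaused) player.resume() else player.pause()
            }
            MediaGesture.SWIPE_RIGHT -> player.skipNext()     // skip forward
            MediaGesture.SWIPE_LEFT -> player.skipPrevious()  // reverse track
            MediaGesture.DOUBLE_TAP -> player.toggleShuffle() // shuffle
        }
    }

    companion object {
        /** Connects to the Spotify app and hands back a ready controller. */
        fun connect(
            context: Context,
            clientId: String,
            redirectUri: String,
            onReady: (SpotifyGestureController) -> Unit,
        ) {
            val params = ConnectionParams.Builder(clientId)
                .setRedirectUri(redirectUri)
                .showAuthView(true)
                .build()
            SpotifyAppRemote.connect(context, params, object : Connector.ConnectionListener {
                override fun onConnected(appRemote: SpotifyAppRemote) =
                    onReady(SpotifyGestureController(appRemote))

                override fun onFailure(error: Throwable) = error.printStackTrace()
            })
        }
    }
}
```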
Technical & Design Challenges
- Self-taught Kotlin and gained hands-on experience with Android app architecture while implementing gesture-to-command mapping.
- Navigated API limitations and integrated complex gesture handling with internal device code (specifics protected under NDA).
- Designed and implemented an activation gesture system to prevent false positives (see the sketch below):
  - The app only began listening for gesture commands after detecting a specific activation gesture.
  - Gesture listening stopped automatically after a period of inactivity or upon repeating the activation gesture, ensuring reliability in everyday use.
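A minimal sketch of the activation gate, with a hypothetical timeout value: gestures pass through only while the gate is armed, and the gate disarms on inactivity or on a repeat of the activation gesture.

```kotlin
// Minimal sketch of the activation gate; the gesture names and the 10-second
// timeout are hypothetical stand-ins for confidential values.

enum class WristGesture { ACTIVATE, TAP, SWIPE_LEFT, SWIPE_RIGHT }

class ActivationGate(private val inactivityTimeoutMs: Long = 10_000) {
    private var listening = false
    private var lastGestureAtMs = 0L

    /** Returns the gesture to act on, or null if it should be ignored. */
    fun filter(gesture: WristGesture, nowMs: Long): WristGesture? {
        // Disarm automatically after a period of inactivity.
        if (listening && nowMs - lastGestureAtMs > inactivityTimeoutMs) listening = false

        return when {
            gesture == WristGesture.ACTIVATE -> {
                listening = !listening // activation gesture arms (or disarms) the gate
                lastGestureAtMs = nowMs
                null
            }
            listening -> {
                lastGestureAtMs = nowMs
                gesture // armed: pass the command through
            }
            else -> null // not armed: drop the gesture, preventing false positives
        }
    }
}

fun main() {
    val gate = ActivationGate(inactivityTimeoutMs = 5_000)
    println(gate.filter(WristGesture.TAP, nowMs = 0))        // null: not armed yet
    println(gate.filter(WristGesture.ACTIVATE, nowMs = 100)) // null: arms the gate
    println(gate.filter(WristGesture.TAP, nowMs = 200))      // TAP passes through
    println(gate.filter(WristGesture.TAP, nowMs = 9_000))    // null: timed out
}
```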


AR Spell-Casting Game – Unity (Android AR + Gesture Input)
2-Day Internal Hackathon Project
Built an interactive AR game using Unity and Android for a company-wide hackathon. The app combined spatial gesture recognition with augmented reality to deliver a fun, immersive spell-casting experience using a prototype Pison wearable.
Gameplay Overview
- Players are placed in an AR scene where a castle spawns behind them and five magical portals appear in front.
- Enemies emerge in waves from the portals, each displaying a unique gesture icon above its head.
- Players must point their hand toward an enemy and perform the correct gesture to cast a spell and destroy it.
- Players have three lives; missing an enemy costs a life, and the game ends after three misses.
Technical Challenges & Solutions
- Spatial Hand Tracking Without External Sensors:
  - At the time, the wearable device lacked positional awareness within the physical environment.
  - Solved this by guiding the user to place their hand at a specific position and orientation in front of the camera during startup.
  - Used this calibration point, along with velocity and gyroscopic data from the device, to map real-world hand movement into virtual space (see the sketch below).
  - Enabled intuitive aiming mechanics, where players could physically point and cast spells with natural gestures.
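A minimal sketch of the dead-reckoning idea behind that mapping: starting from the known calibration pose, per-frame velocity reported by the wearable is integrated to estimate hand position (orientation from the gyroscope is handled analogously). All names and units are hypothetical.

```kotlin
// Minimal sketch of dead reckoning from a calibrated startup pose: integrate
// per-frame velocity to estimate hand position. Gyroscope-based orientation is
// handled analogously and omitted here; all names and units are hypothetical.

data class Vec3(val x: Double, val y: Double, val z: Double) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun times(s: Double) = Vec3(x * s, y * s, z * s)
}

class HandPoseEstimator(calibrationOrigin: Vec3) {
    var position = calibrationOrigin // starts at the known startup pose
        private set

    /** Integrate one sensor frame: position += velocity * dt (drift accumulates). */
    fun onSensorFrame(velocity: Vec3, dtSeconds: Double) {
        position += velocity * dtSeconds
    }
}

fun main() {
    val estimator = HandPoseEstimator(calibrationOrigin = Vec3(0.0, 1.2, 0.4))
    // Hand moving right at 0.5 m/s across three ~16 ms frames.
    repeat(3) { estimator.onSensorFrame(Vec3(0.5, 0.0, 0.0), dtSeconds = 0.016) }
    println(estimator.position) // roughly Vec3(x=0.024, y=1.2, z=0.4)
}
```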
This project showcased how spatial interaction and gesture input could blend with AR to create engaging, hands-free experiences, all delivered in just two days under tight hackathon constraints.

Pictured: the device used in this project.