ARROR: An iOS Navigation App using Augmented Reality
(2017)
A fun side project built in my spare time.
RML Augmented Reality SDK
(2014 - 2015)
Designed a new control algorithm for our Augmented Reality (AR) library to achieve real-time camera tracking, and delivered it to our industry partner.
Increased the average frame rate of the AR system from 19 fps to 40 fps on iPhone 5.
Optimized the template-based tracking code in our AR library, increasing its speed by 45%.
A demo of in-car AR that makes navigation more immersive:
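The core idea behind template-based tracking can be sketched as a local search that minimizes the sum of squared differences (SSD) around the last known position instead of scanning the whole frame. This is an illustrative Python/NumPy sketch, not the shipped library code; the function and parameter names are my own:

```python
import numpy as np

def track_template(frame, template, prev_xy, search_radius=8):
    """Locate `template` in `frame` near the previous position by
    minimizing the sum of squared differences (SSD) over a small
    search window, rather than scanning the whole image."""
    th, tw = template.shape
    px, py = prev_xy
    best, best_xy = None, prev_xy
    for y in range(max(0, py - search_radius),
                   min(frame.shape[0] - th, py + search_radius) + 1):
        for x in range(max(0, px - search_radius),
                       min(frame.shape[1] - tw, px + search_radius) + 1):
            patch = frame[y:y + th, x:x + tw].astype(np.float64)
            ssd = np.sum((patch - template) ** 2)
            if best is None or ssd < best:
                best, best_xy = ssd, (x, y)
    return best_xy
```

Restricting the search to a small window around the previous estimate is one common way such trackers reach real-time frame rates on mobile hardware.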
vProjects: A series of Virtual Reality (VR) projects
(2012 - 2014)
Developed a series of Virtual Reality (VR) applications for the CAVE (Cave Automatic Virtual Environment) system.
Featured gesture recognition and natural HCI (Human-Computer Interaction) for navigation, system control, and object selection and manipulation.
Applicable to scenarios such as interior design, mechanical design, and urban planning.
Mini-CAVE for Microsoft Research Redmond (Seattle)
(2012-2013)
Built the first software demo and helped install the Mini-CAVE VR system for the MIX (Multimedia, Interaction and eXperience) team at Microsoft Research on their Redmond campus.
Implemented the 4-screen rendering system using VR Juggler, an open-source VR library.
Integrated an Xbox game controller with OptiTrack and VRPN.
A web-based fashion design survey with GUI
(April 2014)
Developed an interactive fashion design webpage that records user input for future analysis, using JavaScript/HTML/PHP.
Webpage available at jasminegong.com/survey (in Chinese).
Facial Expression Recognition
(2010-2012)
Research project aimed at recognizing human facial expressions.
UAVs and Aerial Photography
(Jan. 2011 - Jun. 2011)
Over two months, we built our quadcopters based on an open-source project named Mikrokopter.
Led the engineering team; my part was to design and rebuild the mechanical structure (CAD/CNC) and the radio control system (an AVR ATmega64 micro-controller serving as a PWM decoder for a 2.4 GHz radio T/R module).
Building on our quadcopter experience, we then secured funding and built a helium-filled airship for aerial photography, and shot a marketing video for Zhengzhou University.
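The PWM-decoding step on the AVR maps each measured pulse width (typically 1000-2000 microseconds on RC receivers) to a channel value. The firmware did this with timer input capture; the mapping itself can be illustrated in Python (the function name and normalization range are my own, shown here as a sketch rather than the original code):

```python
def decode_pwm_us(pulse_us, min_us=1000, max_us=2000):
    """Map an RC PWM pulse width (typically 1000-2000 microseconds)
    to a normalized channel value in [-1.0, 1.0].
    On the AVR, the width would come from a timer input-capture unit."""
    pulse_us = max(min_us, min(max_us, pulse_us))  # clamp out-of-range pulses
    mid = (min_us + max_us) / 2.0
    half = (max_us - min_us) / 2.0
    return (pulse_us - mid) / half
```

Clamping first guards against glitched or lost pulses, which otherwise would produce out-of-range commands.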
2-Wheel Robot Simulation
(2009-2011)
I worked in the robotics lab during my undergraduate study. We designed and implemented the control strategy for simulated 2-wheel robots, including PID control of the wheels and action strategies for football and combat games.
Our work won a second prize in the 2009 Robot Contest of China (Changsha).
The next year, we won a championship in the 2010 Robot Contest of China (Ordos).
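A wheel-level PID loop of the kind used above can be sketched as follows; this is a generic textbook formulation in Python, with illustrative gains rather than the values we actually tuned:

```python
class PID:
    """Simple PID controller, e.g. for a wheel speed loop."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """Return the control output for one time step of length dt."""
        error = setpoint - measured
        self.integral += error * dt
        # No derivative term on the very first call (no previous error yet).
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In simulation, driving a simple integrator plant with this controller brings the wheel speed toward the setpoint; the proportional term reacts to the current error, the integral removes steady-state offset, and the derivative damps overshoot.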
Collaborative Humanoid Robots
(Jul. 2011 - Sep. 2011)
We built two humanoid robots controlled by embedded Linux systems. They acquire data from the environment through various sensors, such as cameras, microphones, infrared distance sensors, electronic compasses, and inertial navigation sensors. They can complete a variety of tasks and work collaboratively by communicating over Wi-Fi.
This work won a second prize in the 2011 Robot Contest of China (Lanzhou).
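The collaboration channel between the robots can be illustrated with a minimal UDP status exchange; this is a generic sketch in Python, assuming a simple JSON message format of my own invention, not the robots' actual protocol:

```python
import json
import socket

def send_status(sock, addr, robot_id, pose):
    """Send this robot's id and pose as a small JSON datagram."""
    msg = json.dumps({"id": robot_id, "pose": pose}).encode("utf-8")
    sock.sendto(msg, addr)

def recv_status(sock):
    """Receive and decode one peer status message (blocking)."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))
```

Periodically broadcasting small state messages like this lets each robot keep a rough picture of its peers without a central coordinator.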
Automatic Volume Control
(2009-2010)
This is a device that adjusts the volume of any audio source in real time according to the ambient noise level.
For example, when there is too much traffic noise outside your room, the device automatically turns up the volume of your TV, phone, or music player.
Our implementation averaged sampled microphone data and adjusted a digital volume-control chip accordingly.
This project was supported by the undergraduate innovation program of China and won a third prize in a national competition.
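The averaging-and-mapping step can be sketched as a moving average over recent microphone samples mapped linearly onto the chip's volume steps; all thresholds and step ranges below are illustrative placeholders, not the device's actual calibration:

```python
from collections import deque

def make_volume_controller(window=32, quiet=0.05, loud=0.60, vol_min=8, vol_max=56):
    """Return a function mapping microphone samples (0..1 amplitude) to a
    volume step for a digital volume-control chip. The last `window`
    samples are averaged to smooth out transient noise spikes."""
    samples = deque(maxlen=window)

    def update(sample):
        samples.append(abs(sample))
        avg = sum(samples) / len(samples)
        # Clamp the average into [quiet, loud], then map linearly to steps.
        level = min(max(avg, quiet), loud)
        frac = (level - quiet) / (loud - quiet)
        return int(round(vol_min + frac * (vol_max - vol_min)))

    return update
```

The moving average keeps brief sounds (a door slam, a horn) from yanking the volume around, while sustained noise shifts the average and raises the output.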
Leon Ziyang Zhang - Projects
Last Update: December 2017