This is an interactive fashion design project. Coders can live-code visual effects for different markers using the three.js library, and the patterns they create are rendered superimposed on the vest. The system's design is inspired by Gibber. The application is built with HTML and JavaScript: the MindAR.js library handles the AR marker tracking, three.js renders the 3D objects, and CodeMirror provides the in-browser code editor. link>>
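A minimal sketch of how such a marker-anchored pattern might be wired up with MindAR.js and three.js (the target file path and the torus "pattern" are illustrative assumptions, not the project's actual code):

```js
import * as THREE from 'three';
import { MindARThree } from 'mindar-image-three';

// Set up MindAR with a pre-compiled image-target file (path is an assumption).
const mindarThree = new MindARThree({
  container: document.body,
  imageTargetSrc: './targets.mind',
});
const { renderer, scene, camera } = mindarThree;

// Anchor for marker #0; anything added to anchor.group tracks that marker.
const anchor = mindarThree.addAnchor(0);

// A stand-in for a live-coded pattern: a spinning wireframe torus knot.
const pattern = new THREE.Mesh(
  new THREE.TorusKnotGeometry(0.3, 0.1, 64, 16),
  new THREE.MeshBasicMaterial({ color: 0xff00ff, wireframe: true })
);
anchor.group.add(pattern);

await mindarThree.start();
renderer.setAnimationLoop(() => {
  pattern.rotation.y += 0.02; // animate the pattern each frame
  renderer.render(scene, camera);
});
```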
A marker is attached to the hat, with EL wires wrapped around it. The marker controls play and pause of the music, and the volume can be adjusted by changing the distance between the marker and the camera (the closer the marker, the higher the volume). The EL wires react to the sound. In AR, a music-visualization animation plays superimposed on the marker attached to the hat. link>>
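Concretely, the play/pause and distance-to-volume behavior could look like the following sketch. It reuses `anchor` and `camera` from a MindAR/three.js setup like the one above; the distance range and the HTML `<audio>` element are assumptions:

```js
import * as THREE from 'three';

// Assumes `anchor` (a MindAR image anchor) and `camera` from a setup like
// the previous sketch, plus an <audio> element holding the music track.
const audioEl = document.querySelector('audio');

// MindAR anchors expose visibility callbacks: use them for play/pause.
anchor.onTargetFound = () => audioEl.play();
anchor.onTargetLost = () => audioEl.pause();

const markerPos = new THREE.Vector3();
function updateVolume() {
  anchor.group.getWorldPosition(markerPos);
  const dist = markerPos.distanceTo(camera.position);
  // Assumed working range of roughly 0.2–2.0 world units; closer => louder.
  const t = THREE.MathUtils.clamp((dist - 0.2) / 1.8, 0, 1);
  audioEl.volume = 1 - t;
}
// Call updateVolume() inside the render loop, once per frame.
```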
This is an AR application designed for the “Jewels of the Nile: Ancient Egyptian Treasures” exhibition at the Worcester Art Museum. With the app, visitors can try on different pieces of jewelry on display in the exhibition, take pictures, and share them with friends. They can also learn about each jewel's background and locate the real pieces with the map inside the app. 3D scanning was used to import the real jewelry pieces into the AR environment. link>>
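The app's platform isn't specified here, but as an illustration in the web stack used elsewhere in this portfolio, a 3D-scanned jewel exported as glTF could be loaded with three.js's GLTFLoader (the file name and scale are placeholders):

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';

// Load a 3D-scanned jewelry piece; 'necklace.glb' is a placeholder name.
const loader = new GLTFLoader();
loader.load('models/necklace.glb', (gltf) => {
  const jewel = gltf.scene;
  jewel.scale.setScalar(0.5); // fit the scan to the wearer; value is a guess
  scene.add(jewel);           // `scene` comes from the surrounding AR setup
});
```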
This is an AR web application designed to visualize protein structures in augmented reality and help undergraduates better understand them. A structure model can be downloaded directly from the Protein Data Bank (PDB) by its PDB ID and loaded into the AR environment by scanning a 2D marker. There are multiple viewing styles for each structure, and two structures can be loaded at the same time. link>>
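The RCSB Protein Data Bank serves structure files at a stable per-ID URL, so the download step might look like this minimal sketch (the parsing/rendering layer the project uses isn't specified and is omitted):

```js
// Fetch a structure file from the RCSB PDB by its 4-character ID.
async function fetchPDB(pdbId) {
  const url = `https://files.rcsb.org/download/${pdbId.toUpperCase()}.pdb`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Failed to fetch PDB entry ${pdbId}`);
  return res.text(); // raw PDB text, ready for a molecular parser/renderer
}

// Two structures loaded at the same time, e.g. crambin and hemoglobin:
const [crambin, hemoglobin] = await Promise.all([
  fetchPDB('1crn'),
  fetchPDB('4hhb'),
]);
```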
We developed an MR program for building a flow chemistry setup. The application uses the HoloLens 2 to project holograms in a laboratory environment. The HoloLens 2 is a powerful AR device with built-in hand-gesture and voice recognition. Although the HoloLens 2 includes other features that may be relevant to future training programs, we focus on voice recognition in our system because hand gestures currently require training to achieve proficiency. link>>
In this project, NBA players' statistics for the 2020–21 season are visualized. The original data was collected from the NBA official website. The dataset is visualized with scatter plots, bar charts, spider charts, and pie charts, implemented with d3.js and react.js on VizHub. link>>
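As one example of the chart types, a minimal d3.js scatter plot might be built like this (the field names `pts` and `ast` are assumptions about the dataset's shape):

```js
import * as d3 from 'd3';

// Minimal D3 scatter plot: points per game vs. assists per game.
function scatterPlot(data, width = 600, height = 400) {
  const x = d3.scaleLinear()
    .domain(d3.extent(data, d => d.pts))
    .range([40, width - 10]);
  const y = d3.scaleLinear()
    .domain(d3.extent(data, d => d.ast))
    .range([height - 30, 10]);

  const svg = d3.create('svg').attr('width', width).attr('height', height);
  svg.append('g')
    .attr('transform', `translate(0,${height - 30})`)
    .call(d3.axisBottom(x));
  svg.append('g')
    .attr('transform', 'translate(40,0)')
    .call(d3.axisLeft(y));

  svg.selectAll('circle')
    .data(data)
    .join('circle')
    .attr('cx', d => x(d.pts))
    .attr('cy', d => y(d.ast))
    .attr('r', 4)
    .attr('fill', 'steelblue');

  return svg.node();
}
```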
This is a VR game built for the HTC Vive. A Leap Motion controller tracks the player's hands, so unlike typical VR games, the player plays with hand gestures instead of handheld controllers. In the game, the player acts as a journalist with a special camera: they can walk around the scene, step into the picture of a piece of fake news, and find out the truth. link>>
It is a VR project built for mobile phones; players can use a Google Cardboard to play. You can walk around the park, ride the Ferris wheel, and take the roller coaster. 3D sound effects enhance the immersive experience. link>>
It is a generator that can “paint” sound. Different volumes are depicted in different colors in real time: red lines are generated by high-volume sounds, blue lines represent medium volume, and green lines represent whispers. link>>
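A sketch of the volume-to-color mapping with the Web Audio API (the RMS thresholds are assumptions; the original cutoffs aren't stated):

```js
// Capture the microphone and estimate loudness per frame.
const audioCtx = new AudioContext();
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
const analyser = audioCtx.createAnalyser();
audioCtx.createMediaStreamSource(stream).connect(analyser);

const samples = new Float32Array(analyser.fftSize);

function volumeToColor() {
  analyser.getFloatTimeDomainData(samples);
  // RMS of the time-domain signal as a simple loudness estimate.
  const rms = Math.sqrt(samples.reduce((s, v) => s + v * v, 0) / samples.length);
  if (rms > 0.3) return 'red';    // loud => red line
  if (rms > 0.05) return 'blue';  // medium => blue line
  return 'green';                 // whisper => green line
}
```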