I was on a team of four that built a project called Innovation InSight, based on my idea of helping the visually impaired through object recognition and vibration feedback. I was the only high school student on the team; my teammates came from the University of Michigan, the University of Southern California, and the University of Toronto. The hackathon, MHacks 6, was a 36-hour event held at the University of Michigan, and our team won the “Best Overall Use of Microsoft Technology Award” with our idea and submission. We used a Kinect V2, OpenCV, Haar classifiers, voice recognition, object recognition, an Arduino, a Myo armband, and vibration motors, among a variety of other things. For more information, check out our submission at http://devpost.com/software/innovation-insight (I wrote this submission at 5 AM with no sleep, so please ignore the typos :D).

We are currently continuing our work on the project. The goal is to have the user perceive the objects around them, select one with a voice command, and then be guided to it by vibration feedback on the hand. We proved these concepts at the hackathon, and now we want to improve on them to help the blind and visually impaired. We also have another website with some more information about this at msight.co.
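To give a flavor of the vibration-guidance step, here is a minimal sketch of one way it could work: given the detected object's position and the hand's position (e.g. from the Kinect skeleton data), map the offset between them to intensities for four directional vibration motors. The function name, the four-motor layout, and the normalized 2D coordinates are all assumptions for illustration, not our actual implementation.

```python
def guidance_intensities(hand, target, max_offset=1.0):
    """Map the hand-to-target offset to motor intensities.

    hand, target: (x, y) positions in a normalized 2D plane
    (an assumption for this sketch; the real system tracks in 3D).
    Returns intensities in [0, 1] for (left, right, up, down) motors:
    the farther the target is in a direction, the stronger that motor
    vibrates, guiding the hand toward the object.
    """
    dx = target[0] - hand[0]
    dy = target[1] - hand[1]

    def scale(v):
        # Clamp the offset magnitude into [0, 1].
        return min(abs(v) / max_offset, 1.0)

    left = scale(dx) if dx < 0 else 0.0
    right = scale(dx) if dx > 0 else 0.0
    up = scale(dy) if dy > 0 else 0.0
    down = scale(dy) if dy < 0 else 0.0
    return (left, right, up, down)
```

In a real loop, these intensities would be sent over serial to the Arduino driving the motors, and all motors falling to zero would signal that the hand has reached the object.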