Tools Used
- Lens Studio
- TypeScript
- JavaScript

Overview
SnapSight is a visual aid built for the Snapchat Spectacles, a pair of AR glasses that can project images onto their lenses. We made an object detection app to help blind people navigate the world without fear: for example, if the wearer were approaching a wall, the glasses would detect it and warn them when they were getting too close. We also had ideas for having the glasses detect other things, like people and sidewalks.
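The core of that warning behavior is really just a distance threshold with a bit of hysteresis so the alert doesn't spam the wearer every frame. Here's a minimal sketch of that idea in plain TypeScript; the `ObstacleAlert` class, the `speak` callback, and the specific thresholds are all made up for illustration, not the code we actually shipped.

```typescript
// Minimal sketch of the proximity-warning idea (hypothetical, not the shipped code).
// Given a stream of estimated distances to the nearest obstacle, decide when to
// speak a warning, and only speak when the warning level changes.

type WarnLevel = "clear" | "caution" | "stop";

class ObstacleAlert {
  private lastLevel: WarnLevel = "clear";

  constructor(
    private speak: (msg: string) => void,   // hook into whatever audio/TTS output is available
    private cautionMeters = 2.0,            // start warning around ~2 m (made-up threshold)
    private stopMeters = 0.75               // urgent warning under ~0.75 m (made-up threshold)
  ) {}

  // Call this every frame with the current distance estimate (in meters).
  update(distance: number): void {
    const level: WarnLevel =
      distance <= this.stopMeters ? "stop" :
      distance <= this.cautionMeters ? "caution" : "clear";

    // Only speak on a level change, so the wearer isn't bombarded every frame.
    if (level !== this.lastLevel) {
      if (level === "caution") this.speak("obstacle ahead");
      if (level === "stop") this.speak("stop, wall very close");
      this.lastLevel = level;
    }
  }
}

// Example usage with a fake distance stream:
const alert = new ObstacleAlert((msg) => console.log(`AUDIO: ${msg}`));
[3.0, 2.5, 1.8, 1.2, 0.6, 0.6, 1.5].forEach((d) => alert.update(d));
```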
This was built at the DivHacks hackathon held at Columbia, which was, not... great... to say the least. We weren't allowed to leave campus during the event, and overnight we were crammed into a common room with around 100 other people. There were only two toilets and limited outlets, so we had to ration laptop charge as well. I think a lot of the security measures were due to the event falling a day before October 7th, with Columbia's security heightened for everyone, which made hosting an event like this difficult. That isn't really on the organizers, but what was on them was the way they kept pushing things back: one delayed event pushed everything else back thirty minutes, and this snowballed until we ended 2-3 hours later than scheduled, which was frustrating on zero sleep and knowing I'd have to commute an extra hour home.
It kind of sucks, but the event left a bad taste in my mouth regarding the school, and it dissuaded me from transferring there.

Development
With the Snapchat Spectacles we had to learn TypeScript and Snapchat's Lens Studio. Lens Studio is similar to most 3D modeling programs but with a much bigger emphasis on scripting, an area I had never touched in 3D software before. We used TypeScript to code the person detection, with help from our mentors. While most teams leaned on the actual AR visuals, we made greater use of the cameras and speakers instead, since a blind user wouldn't be able to see anything the AR displayed. I think the absence of a real AR implementation kind of killed the project for the judges, especially since the Spectacles were made by Snapchat, not a BME startup.
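Since audio was our whole output channel, a lot of the thinking went into deciding what to say when multiple things were detected at once. Here's a rough TypeScript sketch of how detections could be prioritized and turned into a single spoken cue; the `Detection` shape, the class list, and the cue wording are assumptions for illustration, not Lens Studio's actual detection API or our final code.

```typescript
// Rough sketch (not Lens Studio's real API): turn one frame of detections into
// a single spoken cue, preferring whatever is closest, with class priority as a tie-breaker.

interface Detection {
  label: "wall" | "person" | "sidewalk";  // the classes we talked about supporting
  distance: number;                        // estimated distance in meters
}

// Lower number = higher priority when distances are roughly equal.
const PRIORITY: Record<Detection["label"], number> = {
  wall: 0,
  person: 1,
  sidewalk: 2,
};

function pickCue(detections: Detection[]): string | null {
  if (detections.length === 0) return null;

  // Sort by distance first, then by class priority.
  const [closest] = [...detections].sort(
    (a, b) => a.distance - b.distance || PRIORITY[a.label] - PRIORITY[b.label]
  );

  // Only speak about things that are actually near the wearer (arbitrary 3 m cutoff).
  if (closest.distance > 3) return null;
  return `${closest.label} about ${closest.distance.toFixed(1)} meters ahead`;
}

// Example: a person at 2.1 m wins over a sidewalk edge at 2.8 m.
const cue = pickCue([
  { label: "sidewalk", distance: 2.8 },
  { label: "person", distance: 2.1 },
]);
if (cue) console.log(`AUDIO: ${cue}`);
```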
With this being my introduction to AR glasses, I found them really cool. They were limited and buggy, but it was exciting to get a glimpse of what this tech could do, and I see it going places in the future, just not quite yet. The hardware was pretty constrained in terms of viewing angle and the bulk of the product, and I can't help but feel it was made mostly to compete with Meta's glasses, which seem to be doing a lot better. Still cool to work with nonetheless.

Reflection
Despite the shoddy organization of the event, I made use of what I had. Since we couldn't leave the room, I took it as an opportunity to just go up to random people and talk: asking dumb questions about their projects, or commiserating about the unfortunate circumstances we found ourselves in. I talked to people because that's something I didn't do at my previous hackathon, and I'm glad I did; it really made the hours from 3 to 5 am less boring. I also remember being let out for a walk around campus at 5 am after asking the security guard. It was such a breath of fresh air, and the guard was a real one for letting us out for a bit. Yes, it was like prison and it sucked, but it's definitely not an experience I'm going to forget anytime soon.
Also, shoutout to the team: Samin, Sudiptto, Ayen, and (unofficially) Aaron. We were always five minutes away from crashing out, but we managed to keep each other as locked in as possible.