Designed an application to enable the visually impaired community to better identify ingredients in the kitchen.
Technologies Used: LLMs, Google Cloud, APIs, Object Detection Models, Front-End and Back-End Application Development.
"The visually impaired community needs a better system for identifying ingredients in the kitchen."
We engaged with several community members and visited multiple kitchens to ensure we were collaborating with, rather than imposing technology on, the community.
Lucy Greco (Electronic Accessibility Expert, UC Berkeley)
Bobbi Pompi (Independent Living Skills Instructor, LightHouse)
Ann Wei (Disability Culture Community Coordinator, UC Berkeley)
Ryan Dour (Accessibility Department, Apple)
Nathan Tilton (Lab Manager, UC Berkeley Disability Lab)
Eleanor Mayes (LightHouse SF Coordinator)
The visually impaired community faces significant barriers: more than half are unemployed, and only about 10% can read Braille. Most existing software is inaccurate, hard to navigate, and does not meet their needs.
This makes it important to build low-cost solutions that are accessible and easy to use.
Hands-free navigation: audio cues such as beeping and stop sounds guide the user, and images are captured automatically.
Customizable output: brand name, object name, expiration date range, and mold detection (a minimal backend sketch follows this list).
Compatible with screen readers and works on any device through a web interface.
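To make the customizable-output feature concrete, here is a minimal backend sketch. It assumes the Google Cloud Vision API (Google Cloud is among the technologies listed above) for label, logo, and text detection; the function name, returned fields, and date regex are illustrative assumptions rather than the project's actual code, and the mold-detection step is omitted.

```python
# Minimal sketch, assuming the Google Cloud Vision API for label, logo,
# and text detection. Field names and the date regex are hypothetical;
# mold detection (a separate model step) is not shown here.
import re
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def identify_ingredient(image_bytes: bytes) -> dict:
    """Return the customizable output fields for one captured image."""
    image = vision.Image(content=image_bytes)

    # Label and logo detection supply candidate object and brand names.
    labels = client.label_detection(image=image).label_annotations
    logos = client.logo_detection(image=image).logo_annotations

    # OCR the packaging text, then pull the first date-like string so the
    # app can announce an expiration date range.
    texts = client.text_detection(image=image).text_annotations
    full_text = texts[0].description if texts else ""
    date = re.search(r"\b\d{1,2}[/.-]\d{1,2}[/.-]\d{2,4}\b", full_text)

    return {
        "object_name": labels[0].description if labels else "unknown",
        "brand_name": logos[0].description if logos else "unknown",
        "expiration_date": date.group(0) if date else "not found",
    }
```

In the app, output like this would be filtered to the fields the user has enabled and read aloud through the screen-reader-compatible web interface.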