A year after winning the 2015 UpPrize social innovation competition, Pittsburgh-based Conversant Labs continues to add to its slate of voice-activated technology with a hands-free cooking app called Yes, Chef!
The voice-enabled app (free on iTunes) talks cooks through recipes, answering questions along the way such as “how do I make lasagna?” or “what should the oven temperature be?” Users simply say “Chef” before each question or command to trigger an action, much like invoking Siri, except that Yes, Chef! delivers a customizable, conversational cooking experience.
For example, users can ask for clarification mid-recipe, such as “Chef, was that baking powder or baking soda?” and the app will understand the question in context. If a step in the recipe is unclear, you can ask follow-up questions. Conversant Labs has been working extensively on algorithms that better comprehend the meaning of words and phrases.
Yes, Chef! is the first cooking app to make fully hands-free cooking possible, eliminating the need to go back and forth to a cookbook or a screen. It was originally developed through Conversant Labs’ efforts to create voice-based experiences for the blind and visually impaired, and it remains especially useful to them. But the broader applications of Yes, Chef! also offer a new kind of independence to any cook in the kitchen.
The app makes it possible to get through a recipe “without smearing cookie dough all over your iPad” as one of the developers noted. It also adds a speedy efficiency to the whole process of cooking once a user gets a feel for the commands. It’s like having a virtual recipe assistant reading directly from a cookbook.
Reactions to the app have been positive from people of all abilities. Audio-based human-computer interaction is growing at a significant pace: just ask Siri or Amazon’s Alexa, or try Google’s voice search. As people look to voice-activated programs to handle everyday tasks, cooking appears to be a natural fit. Local chefs Jamilka Borges of Spoon and Joey Hilty of The Vandal put Yes, Chef!’s efficiency to the test at a live cooking demo during Pittsburgh’s recent Thrival Innovation Festival. The chefs plunged hands-free into recipes with voice-guided assistance from the app. Both seemed to have an easy time mastering the audio commands, moving through each step to the finished product.
Conversant Labs worked with blind and visually impaired users to test the app before launch. Since its release on iTunes, the developers have started hearing feedback from those same groups about the finished product. Posters on AppleVis, a community-powered website for blind and low-vision users of Apple products, commented that they appreciate the total voice control and how rarely they need to interact with the screen.
“We did extensive testing with Yes Chef! at assisted learning centers locally while we were working on it,” says Chris Maury, founder of Conversant Labs. “Responses from these users helped us fine tune the design and create a better, almost voice-only experience that was our goal. Now, we are getting favorable reactions from blind and low-vision users who are able to access the finished app.”
It all started when Chris Maury learned he was going blind. During a routine eye doctor appointment in 2010, he was diagnosed with Stargardt Macular Degeneration and told that he would lose his vision within the next 5 to 10 years. As his vision became worse, he looked for assistive technology he could work with and was frustrated by what he found. So he set out to find ways to improve access to smartphones and computers.
After Maury established Conversant Labs in Lawrenceville, the company was selected for AlphaLab, an accelerator program for early-stage technology companies in the Pittsburgh area. Its first product, Say Shopping, an app that lets users search, browse, and buy items without typing a word, helped earn the company the UpPrize for social innovation, which included $200,000 in investments and $200,000 in grants. (UpPrize recently kicked off its second year.)
In addition to building apps, Conversant Labs is building tools to help developers integrate voice into their own applications. Its latest beta project, TinCan.ai, is a tool for developers creating voice-based apps for Amazon Alexa.