Empowering people to communicate with caretakers and loved ones.
Vocable AAC allows people with conditions such as MS, stroke, ALS, or spinal cord injuries to communicate using an app that tracks head movements, without the need to spend tens of thousands of dollars on dedicated hardware to do so.
Vocable uses ARKit to track the user's head movements and determine where on the screen the user is looking. This allows the app to be used completely hands-free: users can look around the screen and make selections by letting their gaze linger on a particular element.
For users with more mobility, the app can be operated by touch.
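To give a sense of how the hands-free interaction fits together, here is a minimal sketch of dwell-based head tracking built on ARKit's face tracking. The `GazeDelegate` protocol, the yaw/pitch-to-screen mapping, and the dwell and sensitivity constants are illustrative assumptions, not Vocable's actual implementation:

```swift
import ARKit
import UIKit

/// Illustrative sketch only: maps head rotation to a screen point and
/// treats a lingering gaze as a selection. Constants and the naive
/// yaw/pitch mapping are assumptions, not Vocable's real code.
protocol GazeDelegate: AnyObject {
    func gazeMoved(to point: CGPoint)
    func gazeSelected(at point: CGPoint)
}

final class HeadGazeTracker: NSObject, ARSessionDelegate {
    weak var delegate: GazeDelegate?
    private let session = ARSession()
    private let dwellDuration: TimeInterval = 1.0   // linger this long to "tap"
    private let sensitivity: CGFloat = 3.0          // radians-to-screen gain
    private var dwellStart: Date?
    private var lastPoint: CGPoint = .zero

    func start() {
        // Face tracking requires a TrueDepth camera (see Requirements).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              face.isTracked else { return }

        // Approximate yaw (left/right) and pitch (up/down) from the
        // head transform's z-axis column; a real implementation would
        // calibrate and smooth this.
        let t = face.transform
        let yaw = asin(-t.columns.2.x)
        let pitch = asin(t.columns.2.y)

        let bounds = UIScreen.main.bounds
        let point = CGPoint(
            x: bounds.midX + CGFloat(yaw) * sensitivity * bounds.width,
            y: bounds.midY - CGFloat(pitch) * sensitivity * bounds.height
        )
        DispatchQueue.main.async { [weak self] in self?.track(point) }
    }

    private func track(_ point: CGPoint) {
        delegate?.gazeMoved(to: point)

        // Restart the dwell timer whenever the gaze moves appreciably;
        // otherwise, fire a selection once it has lingered long enough.
        if hypot(point.x - lastPoint.x, point.y - lastPoint.y) > 40 {
            dwellStart = Date()
            lastPoint = point
        } else if let start = dwellStart,
                  Date().timeIntervalSince(start) >= dwellDuration {
            delegate?.gazeSelected(at: point)
            dwellStart = nil
        }
    }
}
```

The dwell timer is the key idea: a selection fires only when the projected point stays within a small radius for the full interval, which is what lets a lingering gaze stand in for a tap.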
Use a list of common phrases provided by speech-language pathologists, or create and save your own.
Type with your head or your hands.
For the current progress on features, please visit the project board.
For a high-level roadmap, see the Vocable Roadmap.
We'd love to translate Vocable into as many languages as possible. If you'd like to help translate, please visit our Crowdin project. Thanks for helping people communicate all around the world! 🌎🌍🌏
We love contributions! To get started, please see our Contributing Guidelines.
- iOS 13.0 or later
- An iOS device with a TrueDepth camera (required for head tracking)
- Run `fastlane` from the project command line.
- From the menu, select the option for `Add devices via...`.
- When prompted, enter a `device name` and press enter. This can be any name.
- When prompted, enter the `device UDID` and press enter. This can be found in Xcode -> Window -> Devices and Simulators.
- When prompted, enter `[email protected]` for `username` and press enter.
- Fastlane might ask you to enter a username again; use `[email protected]` (a non-interactive alternative is sketched below).
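If you'd rather skip the interactive menu, fastlane's built-in `register_device` action can perform the same registration in one command. The device name and UDID below are placeholders; substitute your own values, and use the same username as above when fastlane prompts for credentials:

```sh
# Register a single device with the developer portal.
# Name and UDID are placeholders; fastlane prompts for credentials.
fastlane run register_device name:"My iPhone" udid:"00008030-001234567890802E"
```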
Matt Kubota, Kyle Ohanian, Duncan Lewis, Ameir Al-Zoubi, and many more from WillowTree 💙.
vocable-ios is released under the MIT license. See LICENSE for details.
vocable-android is available on Google Play and is also open-source.