We want to fundamentally change how people interact with physical computer interfaces and close the gap between human and computer for individuals with disabilities, creatives, musicians, researchers, and device manufacturers.
The world is full of devices that serve a single purpose and offer limited cross-functionality, making it difficult for individuals with disabilities and mobility issues to use them efficiently.
Extended Expression (EE) is software that removes these limitations. It takes any input source, such as a sip-and-puff switch or joystick movement, and translates its output into multiple native computer and interface functions, including contextually aware menu and macro control, on desktop computers and mobile phones. In essence, EE acts as a device extension for existing wired and wireless controllers, or as an interpreter for common communication protocols, enabling these systems to communicate and control in ways they never could before.
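As a rough illustration of the idea, one input source driving many functions can be modeled as a mapping from (context, input event) pairs to output actions. This is a minimal sketch only; all names here are hypothetical and do not describe EE's actual implementation.

```python
# Hypothetical sketch of context-aware input-to-action translation.
# All class, method, event, and action names are illustrative only.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class ExpressionMapper:
    """Translates (context, input event) pairs into output actions.

    The same physical input (e.g. a sip-and-puff "hard sip") can trigger
    different actions depending on the active context, which is how a
    handful of inputs can drive many functions.
    """
    context: str = "desktop"
    bindings: dict = field(default_factory=dict)

    def bind(self, context: str, event: str, action: str) -> None:
        # "*" acts as a wildcard context that matches everywhere.
        self.bindings[(context, event)] = action

    def translate(self, event: str) -> Optional[str]:
        # Prefer a binding for the active context; fall back to wildcard.
        return self.bindings.get(
            (self.context, event),
            self.bindings.get(("*", event)),
        )


mapper = ExpressionMapper()
mapper.bind("desktop", "hard_sip", "open_contextual_menu")
mapper.bind("text_editor", "hard_sip", "run_macro:save_and_close")
mapper.bind("*", "long_puff", "switch_context")

print(mapper.translate("hard_sip"))   # binding for the "desktop" context
mapper.context = "text_editor"
print(mapper.translate("hard_sip"))   # same input, different context
print(mapper.translate("long_puff"))  # wildcard binding applies everywhere
```

The point of the sketch is the indirection: the physical device never changes, only the binding table, which is what would let one controller drive many applications.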
We take our inspiration from playing a musical instrument, where a vast range of expression and control can come from just three or four inputs that, used together, can produce every melody and harmony it is possible to play and hear.
By allowing the existing ecosystem of controllers and input devices to work in concert, and to do more, across computing platforms (PC, macOS, Android, iOS) and software applications, EE will make traditional control devices and accessibility technology more cost-effective, more compatible, and simpler to roll out, and, most importantly, will make computers and devices more accessible.
EE will empower research and support teams within organizations to create quick, custom access solutions for individuals with disabilities, combining new I/O technologies with off-the-shelf devices and components that, together with the EE software, are easy to adapt and apply to any software application or task.