
List:       mutt-dev
Subject:    Re: The future of mutt...
From:       "Eric S. Johansson" <esj () harvee ! org>
Date:       2013-10-04 12:18:37
Message-ID: 524EB21D.9050507 () harvee ! org


On 10/3/2013 6:49 PM, Will Fiveash wrote:
> That is unfortunate for sure. Isn't Accessibility a general issue for 
> many disabled Unix/Linux users? The reason I ask is to pin down 
> whether it's mutt that needs Accessibility improvements or is it the 
> platform that mutt runs on. 
Accessibility is an issue for users of any OS. It's also important to 
know that there's no one model for accessibility. What a blind person 
needs is different from what someone with upper extremity disabilities 
like mine needs, and both of these are different from what a paraplegic 
or quadriplegic needs. In all cases, the difficulty in adapting to 
disabled users comes from extracting the information for the alternative 
interface from the wrong part of the system, i.e. the UI itself.

As a historical note, creators of accessibility interfaces have headed 
down this wrong path of hooking into the existing user interface for 
over a decade, and the fact that accessibility hasn't improved was not 
enough of a clue that they needed to try something else.

Coming back to your question, in reality it's a mixture of both. There 
should be some sort of common bus interface that an accessibility 
service would connect to in order to access the API/services of an 
application engine. One way to think of this is as an extreme version of 
a web application: the client side is all of the code producing the user 
interface, and the server side is all Ajax-style exchanges, showing and 
changing state as well as supplying the data framed by the context the 
user interface is displaying.

What I'm proposing is fairly radical, and in order for it to become part 
of a bigger system, we would need to prove the concept. Proving the 
concept means trying it out in the small, such as part of a mutt 
reworking/refactoring. The most important thing to experiment with is 
the protocol and interface mechanism to a backend application service.
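To make the shape of that experiment concrete, here is a minimal sketch in Python of the kind of split I mean: a mail "engine" that exposes its operations through a small JSON message protocol, so that any frontend (curses UI, screen reader, speech interface) can drive it without scraping the rendered UI. All of the message names, field names, and the MailEngine class itself are invented for illustration; in practice the engine would sit on the other end of a socket or bus rather than in-process.

```python
import json

class MailEngine:
    """Toy backend holding mailbox state; it knows nothing about any UI."""

    def __init__(self):
        # Fake mailbox contents standing in for a real mail store.
        self.messages = [
            {"id": 1, "from": "alice@example.org", "subject": "hello", "read": False},
            {"id": 2, "from": "bob@example.org", "subject": "patch v2", "read": False},
        ]

    def handle(self, request: str) -> str:
        """Accept one JSON request string, return one JSON reply string."""
        req = json.loads(request)
        cmd = req.get("cmd")
        if cmd == "list":
            # Return structured data, not rendered screen text, so the
            # frontend decides how to present it (visually, spoken, etc.).
            return json.dumps({"ok": True, "messages": self.messages})
        if cmd == "mark_read":
            for m in self.messages:
                if m["id"] == req["id"]:
                    m["read"] = True
                    return json.dumps({"ok": True})
            return json.dumps({"ok": False, "error": "no such message"})
        return json.dumps({"ok": False, "error": "unknown command"})

# A frontend issues commands and renders the structured replies however
# suits the user -- here, trivially, as printed lines.
engine = MailEngine()
reply = json.loads(engine.handle(json.dumps({"cmd": "list"})))
for m in reply["messages"]:
    print(f'{m["id"]}: {m["from"]} - {m["subject"]}')
engine.handle(json.dumps({"cmd": "mark_read", "id": 1}))
```

The point of the exercise would be the wire protocol itself, not this toy engine: once every operation and every piece of state crosses such a boundary as structured data, an accessibility service becomes just another client.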

But it's hard for me to do this because my hands don't work well enough, 
and programming by speech recognition is dicey even with Python, so I 
need to count on other people's hands and time. This sucks, believe me. 
I hate to impose, but unfortunately that's my only option right now.

--- eric