Posts: 838 | Thanked: 292 times | Joined on Apr 2010
#21
Nothing like the male ego. It makes just about any thread on any forum anywhere eventually spin off into a completely off topic war.
 
mrsellout's Avatar
Posts: 889 | Thanked: 2,087 times | Joined on Sep 2010 @ Manchester
#22
I've done a quick search on Google for Braille and Linux, and there seems to be quite a bit of progress in this field.

My first thought was that one could map the keyboard and other buttons to launch certain apps, e.g. using shortcutd one could map the camera button to bring up the phone app. A lot of work has been done by the Maemo community to adapt the various keys, sensors and slides, and these could all be integrated into a 'maeccessibility' UI layer. This is along the lines of what TiagoTiago discusses in his thread (and I will duplicate this post there to try to keep any real ideas/development together).
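The remapping idea above can be sketched as a simple key-to-command table, roughly how a shortcutd-style daemon would dispatch hardware buttons. The key names and the D-Bus invocation below are hypothetical placeholders for illustration, not real shortcutd configuration or a confirmed Maemo service name:

```python
# Hypothetical sketch of shortcutd-style key remapping: hardware
# button names map to shell commands that bring up an application.
# The D-Bus destination below is illustrative, not a verified API.

KEY_CAMERA = "camera"
KEY_POWER = "power"

# One command per mapped button; unmapped buttons simply do nothing.
KEY_ACTIONS = {
    KEY_CAMERA: "run-standalone.sh dbus-send --type=method_call "
                "--dest=com.example.phoneui /com/example/phoneui "
                "com.example.phoneui.top_application",
}

def command_for_key(key):
    """Return the shell command bound to a hardware key, or None."""
    return KEY_ACTIONS.get(key)
```

A real daemon would read button events (e.g. from /dev/input) and pass the result of `command_for_key` to the shell; the table makes the bindings easy to extend per user.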

But then I remembered the vibration function of the N900 (obviously vibration feedback would have to be enabled), and the connection I made was with Braille. There have been great efforts made in the wider community to enable blind people to use computers, and the Linux community is no exception.
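The Braille-over-vibration connection can be sketched very simply: each letter is a standard six-dot Braille cell, and the cell could be played back as a sequence of long pulses (raised dots) and short pulses (flat dots). The pulse durations here are made-up values, and driving the actual motor would go through the Maemo vibration interface, which this sketch deliberately leaves out:

```python
# Sketch: render a Braille cell as a vibration pulse pattern.
# Each letter maps to its set of raised dots (1-6, standard Braille
# dot numbering); dots are played back in order as a long pulse for
# a raised dot and a short pulse for a flat one. Durations are
# illustrative, not tuned for a real motor.

BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

LONG_MS, SHORT_MS = 300, 80  # illustrative pulse lengths in ms

def vibration_pattern(letter):
    """Return pulse durations (ms) for dots 1..6 of one Braille cell."""
    dots = BRAILLE_DOTS[letter.lower()]
    return [LONG_MS if d in dots else SHORT_MS for d in range(1, 7)]
```

Whether users could actually read Braille this way at speed is an open question; the point is only that the mapping itself is trivial to implement.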

I had a cursory glance at some of the links from that search.

As alluded to earlier, there has been work in the Symbian world. The Kurzweil KNFB Mobile Reader actually seems to meet exactly the OP's requirements, but it is not open source. There's also open-source GPS software for blind users called Loadstone, built for the S60 platform. I bet these people will be looking to update their software for newer Nokia models, and which programming platform will they have to use? Qt 4.7, of course.

Going back to Braille, though, the really exciting find for me was SeebyTouch. This piece of kit does need a flat surface to work on, but it is lightweight and portable. It can work via USB, which the N900 can now do. It is completely free; they have even included a complete spec of the hardware, so anyone can build it. I wonder, if Nokia were to get involved, maybe offering the developer a few R&D N900 devices, whether they would be interested in porting and adapting it for a mobile device. If not, I'm sure the talent is out there to do it anyway.

I think the key thing here is that any progress made now can be ported across to future devices thanks to Qt 4.7. So even if it's not exactly what the OP wanted (although some progress can be made through the development of voice commands/eSpeak and by adapting the various keys, switches, sliders and sensors), there is the possibility of really making a difference for blind and partially sighted people through the open-source movement.
 

Posts: 385 | Thanked: 426 times | Joined on Dec 2009 @ Gothenburg, Sweden
#23
Hey, me say have grown up now! May me pleeez comment here once more. Cross my thumbs won't say or type any bad or mean or any word at all that has the letter after H and before K anymore...ever. Me thunk before now.

Last edited by Larswad; 2010-11-15 at 15:08.
 
Alb3rtO's Avatar
Posts: 76 | Thanked: 46 times | Joined on Sep 2010 @ Romania
#24
Uffff
 
benny1967's Avatar
Posts: 3,790 | Thanked: 5,718 times | Joined on Mar 2006 @ Vienna, Austria
#25
I find the whole thread somewhat disturbing actually. The very thought that a touchscreen-based device could be turned into anything useful for blind people is unrealistic. (Yes, I know about the iPhone 'solutions' here... and I find them unconvincing.)

Blind and visually impaired people can live with information being read to them, yes. But it's a lot easier and faster to use any kind of tactile information you can get. Of all the current smartphone OSs, good old Symbian S60v3 with its softkeys (and without a touchscreen) is the only one I would build a UI for blind/visually impaired users on.
 

Posts: 19 | Thanked: 9 times | Joined on Jun 2011
#26
Bumping this old thread because I'd love to be able to operate my phone without having to look at it all the time.

Anyone who doubts the usefulness of VoiceOver type solutions should pay more attention to the people who are actually using them:
http://behindthecurtain.us/2010/06/1...th-the-iphone/

I'm sure that a well-written, interruptible screen-reading interface is often faster to use than many of our graphical menus, with their animations and windows that have to be scrolled because only so much text fits at a given size.

I bet there are plenty of sight-impaired geeks who would prefer something more Linux-friendly than these closed-source offerings.
It's doable. It's worthwhile. We just need to build it.


Another cool thing: one of the most vision- and space-agnostic virtual keyboards I've seen:
http://youtu.be/G_QWtUFFAFQ

Last edited by octagonhead; 2012-06-09 at 08:29.
 

qwazix's Avatar
Moderator | Posts: 2,622 | Thanked: 5,447 times | Joined on Jan 2010
#27
A nice idea would be to use the touchscreen as a relative pointing device (like a laptop touchpad) instead of an absolute positioning device. Multitouch would help here, but we can do without it.

Let's assume you are in the Maemo menu. A swipe down would move to the next app down, a swipe right to the one on the right. A screen reader would read the icon names, and when you reach the one you need, you tap to activate it. The camera button could be mapped to cycle between desktop, dashboard and menu; shortcutd and the keyboard could be used for typing. A double tap could be used for other interactions, e.g. the back button or the send button.
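The relative-navigation scheme described above reduces to two small pieces of logic: classify a gesture as a tap or a directional swipe from its start/end deltas, then move a selection cursor through the icon grid and hand the icon name to a screen reader. The threshold, the grid, and the "speak:"/"activate:" action strings below are illustrative assumptions; a real implementation would hook into the window manager and a speech engine such as eSpeak:

```python
# Sketch of touchscreen-as-relative-device navigation: a gesture is
# classified by how far the finger moved, and the selection cursor
# walks an icon grid accordingly. Thresholds and actions are made up.

SWIPE_THRESHOLD = 30  # pixels; below this in both axes, it's a tap

def classify(dx, dy):
    """Classify a gesture from its end-minus-start pixel deltas."""
    if abs(dx) < SWIPE_THRESHOLD and abs(dy) < SWIPE_THRESHOLD:
        return "tap"
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

def navigate(grid, row, col, gesture):
    """Move the selection; return (row, col, action-for-screen-reader)."""
    moves = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
    if gesture == "tap":
        return row, col, "activate:" + grid[row][col]
    dr, dc = moves[gesture]
    row = min(max(row + dr, 0), len(grid) - 1)     # clamp at grid edges
    col = min(max(col + dc, 0), len(grid[0]) - 1)
    return row, col, "speak:" + grid[row][col]     # reader speaks the name
```

For example, starting at the top-left of a grid `[["Phone", "Web"], ["Camera", "Settings"]]`, a downward swipe selects and speaks "Camera", and a tap then activates it. Clamping at the edges (rather than wrapping) gives blind users a stable "home" corner to re-orient from.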
__________________
Proud coding competition 2012 winner: ρcam
My other apps: speedcrunch N9 N900 Jolla –– contactlaunch –– timenow

Nemo UX blog: Grog
My website: qwazix.com
My job: oob
 


Tags
accessibility, blind


 