Accessibility


Accessibility for Persons with Disabilities and Senior Citizens


IBM Research – Tokyo has been working on the following projects, focused mainly on accessibility research for persons with disabilities and senior citizens.



Eclipse Accessibility Tools Framework Project

Accessibility Tools Framework (ACTF) is a collection of tools and building blocks for accessibility technology that was originally developed by IBM Research and was donated to Eclipse.org in 2007. With the reusable components of ACTF, developers can quickly and easily build various accessibility tools, such as accessibility validation and usability visualization tools, tools for improving the accessibility of multimedia content, and alternative accessible interfaces for applications.

Accessibility tools developed on top of ACTF include: aDesigner, miChecker, and Eclipse ACTF Script Editor Lite (EASEL).

aDesigner and miChecker are tools that help designers ensure that their content and applications are accessible and usable by visually impaired people. These tools provide a visual representation of how usable content is for visually impaired users, helping developers identify real accessibility issues in their content and applications. EASEL aims to provide an environment for easily editing audio descriptions for online videos, and it also tries to reduce the cost of audio description by using speech synthesis.
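To illustrate the kind of check an accessibility validation tool performs, here is a minimal sketch that flags images without alternative text, one of the most common issues for screen-reader users. This is not ACTF or miChecker code; it is a toy example using Python's standard HTML parser.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collects <img> tags that have no alt attribute at all.

    Note: an empty alt="" is deliberately allowed here, since it is the
    standard way to mark purely decorative images as skippable.
    """
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            if "alt" not in attr_dict:
                # Record the image source so the author can find it.
                self.missing_alt.append(attr_dict.get("src", "(no src)"))

html = '<p><img src="logo.png" alt="Company logo"><img src="chart.png"></p>'
checker = AltTextChecker()
checker.feed(html)
print(checker.missing_alt)  # → ['chart.png']
```

Real validators check many more criteria (contrast, heading structure, ARIA roles), but the pattern is the same: parse the document and report elements that violate a rule.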



Efficient Book Transformation System

Electronic books whose textual content can be synthesized into spoken output are an appealing medium that can be enjoyed by everyone, including people who are blind or print-disabled. In this project, we are conducting research into generating electronic books from books that are currently available only in printed form. In particular, we are applying technologies such as inference algorithms that automatically reconstruct document structure, and crowdsourcing that enables numerous workers to collaborate in verifying and correcting the output of the automated process.
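As a rough illustration of what "reconstructing document structure" means, the toy heuristic below labels lines of scanned text as headings or body text. The project's actual inference algorithms operate on richer layout features (font size, position, spacing); this sketch uses simple textual cues only.

```python
import re

def classify_line(line: str) -> str:
    """Toy heuristic: label a line of OCR'd text as 'heading' or 'body'.

    Real structure-inference systems combine many layout features;
    here we only look at explicit chapter markers and short
    title-cased lines.
    """
    stripped = line.strip()
    if re.match(r"^(Chapter|Section)\s+\d+", stripped):
        return "heading"
    if stripped and len(stripped) < 40 and stripped == stripped.title():
        return "heading"
    return "body"

lines = ["Chapter 1", "It was a dark and stormy night...", "The Storm Breaks"]
print([classify_line(l) for l in lines])  # → ['heading', 'body', 'heading']
```

Output like this can then be passed to crowd workers, who verify and correct the automatically assigned labels instead of structuring the whole book by hand.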



Mobile Accessibility

Our research on Mobile Accessibility centers around two focus areas: 1) how to make mobile devices such as smartphones and tablets more accessible to people with disabilities and those with special needs, such as the elderly population, and 2) how to make information in the physical world more accessible through the use of such mobile devices.

Accessible mobile interfaces - Exploring eyes-free mobile interfaces, such as an auditory display system that uses non-speech audio to convey shapes, a photo application that helps visually impaired users capture and share photos with sighted people, and an evolving user interface that guides elderly and novice users in becoming familiar with mobile device capabilities.

Real world information access through mobile devices - Exploring the use of mobile devices as tools with which people with disabilities can access and interact with information in their physical environment, such as through our pedestrian navigation app that provides eyes-free guidance for blind pedestrians.
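A common convention in eyes-free pedestrian guidance is to announce directions as clock-face positions relative to the user's current heading. The small sketch below shows that computation; it is an illustrative example of the convention, not code from our navigation app.

```python
def clock_direction(user_heading: float, target_bearing: float) -> str:
    """Convert the angle from the user's heading to a target bearing
    into a clock-face direction (12 o'clock = straight ahead).

    Both angles are in degrees, measured clockwise from north.
    """
    diff = (target_bearing - user_heading) % 360
    # Each "hour" on the clock face spans 30 degrees.
    hour = round(diff / 30) % 12 or 12
    return f"{hour} o'clock"

print(clock_direction(0, 90))    # target to the user's right → 3 o'clock
print(clock_direction(350, 20))  # slightly to the right → 1 o'clock
```

Announcements like "the entrance is at 2 o'clock" can then be delivered by speech synthesis without requiring the user to look at a screen.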



Sasayaki: Auditory Assistant for Web Navigation

Sasayaki is an intelligent system that assists users through an audio channel. The system observes users' behavior and context to provide appropriate auditory feedback, such as guidance on how to use web pages and summaries of their content. Experiments involving blind users and elderly users show that they can navigate through web pages with more confidence with the audio support of Sasayaki. We are working to expand the coverage of Sasayaki to broader situations, such as using ticket machines or ATMs, or driving.
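The core idea of observing context and whispering a supplementary hint can be sketched as a simple event-to-hint mapping. The event names and hint texts below are purely illustrative assumptions, not taken from the actual Sasayaki system.

```python
# A minimal sketch of context-aware auditory feedback in the spirit of
# Sasayaki: an observed navigation event is mapped to a short spoken
# hint, which would be rendered by a speech synthesizer alongside the
# screen reader's primary output.
HINTS = {
    "page_loaded":  "Page loaded. Main content starts after the navigation links.",
    "form_focused": "You are in a search form. Press Enter to submit.",
}

def whisper(event: str) -> str:
    """Return the auxiliary hint for an observed event, or an empty
    string (silence) when no hint applies."""
    return HINTS.get(event, "")

print(whisper("page_loaded"))
print(repr(whisper("unknown_event")))  # silence for unrecognized events
```

A real system would derive events from richer context (cursor position, page structure, user history) rather than a fixed lookup table, but the separation between observation and feedback is the same.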