Abstract
Mobile phones pervade our daily lives and play ever-expanding roles in many contexts. Their ubiquity makes them pivotal in empowering disabled people. However, when no inclusive approaches are provided, they become a strong vehicle of exclusion. Even though current solutions try to compensate for the lack of sight, not all information reaches the blind user. Good spatial ability is still required to make sense of the device and its interface, along with the need to memorize on-screen positions, or the keys of a keypad and their associated actions. These problems are compounded by individual attributes such as age, age of blindness onset, or tactile sensitivity, which designers often overlook. Worse, the entire blind population is recurrently thought of as homogeneous (often stereotypically so). Thus all users face the same solutions, which ignore their specific capabilities and needs.
We usually ignore this diversity because we have the ability to adapt and become experts in interfaces that were probably maladjusted to begin with. This adaptation is not always within reach: interaction with mobile devices is highly visually demanding, which widens the gap among blind people. It is paramount to understand the impact of individual differences and their relationship with interface demands in order to deploy more inclusive solutions.
We explore individual differences among blind people and assess how they relate to mobile interface demands, in both low-level (e.g., performing an on-screen gesture) and high-level (e.g., text entry) tasks. Results confirmed that different ability levels have a significant impact on the performance attained by a blind person. In particular, otherwise-ignored attributes such as tactile acuity, pressure sensitivity, spatial ability, and verbal IQ were shown to match specific mobile demands and parameterizations. This confirms the need to account for individual characteristics and to provide room for personalization and adaptation, towards inclusive design.
Keywords
Blind, Individual Differences, Mobile User Interfaces, Abilities, Demands, Touch devices
Committee
- Doctor João Marques da Silva, TU Lisbon (President)
- Doctor Daniel Jorge Viegas Gonçalves, TU Lisbon (Supervisor)
- Doctor Joaquim Armando Pires Jorge, TU Lisbon (Co-Supervisor)
- Doctor Simon Harper, University of Manchester
- Doctor Luis Carriço, University of Lisbon
- Doctor João Brisson Lopes, TU Lisbon
Selected Publications
Blind People and Mobile Touch-based Text-Entry: Acknowledging the Need for Different Flavors (Best Student Paper Award)
Proceedings of ASSETS 2011 - 13th International ACM SIGACCESS Conference on Computers and Accessibility. Dundee, Scotland, October, 2011
Blind People and Mobile Keypads: Accounting for Individual Differences
Proceedings of INTERACT 2011 - 13th IFIP TC13 Conference on Human-Computer Interaction. Lisboa, Portugal, September, 2011
Assessing Mobile-wise Individual Differences in the Blind
MobileHCI 2010 - Doctoral Consortium: 12th International Conference on Human-Computer Interaction with Mobile Devices and Services. Lisboa, Portugal, September, 2010
Identifying the individual ingredients for a (in)successful non-visual mobile experience
ECCE 2010 - Proceedings of the European Conference on Cognitive Ergonomics, ACM DL. Delft, The Netherlands, August, 2010
Proficient blind users and mobile text-entry
ECCE 2010 - Proceedings of the European Conference on Cognitive Ergonomics, ACM DL. Delft, The Netherlands, August, 2010
NavTap: a Long Term Study with Excluded Blind Users
ASSETS 2009 - Eleventh International ACM SIGACCESS Conference on Computers and Accessibility. Pittsburgh, USA, October, 2009
From Tapping to Touching: Making touch screens accessible to blind users
IEEE Multimedia, vol. 15, no. 4, pp. 48-50, October-December 2008
Prototypes
Android 2.2 or later
Eyes-free touch input method that resorts to a simple navigational concept and a reorganization of the alphabet layout. Uses the OS-defined TTS application. Designed and developed in collaboration with Hugo Nicolau.
Android 2.2 or later
Eyes-free touch input method based on the graphical representation of the Braille alphabet. The screen functions as a slate where the user is required to select the desired dots. Designed and developed in collaboration with João Oliveira.
Android 2.2 or later
Eyes-free touch input method that imitates keypad-based MultiTap approaches. Users can search for a key (e.g., "abc") as they do with methods like VoiceOver but then they can split-tap multiple times to select the desired letter. Designed and developed in collaboration with João Oliveira.
Windows 95 or later
A simple demo application exemplifying the NavTap method. Developed by Paulo Lagoá.
Test Software & Scripts
Individual abilities and Mobile Touch Demands (in Chapter 6)
Android project
Application that manages the evaluation session. It communicates with the user via TTS and randomizes the order of the trials. To move from one trial to the next, the test monitor presses the stand-by button twice; a menu then appears that enables advancing to the next step. All buttons (except the Home button) are deactivated during the trials. Log files are automatically created in the device storage.
Python script
Reads a path with all user directories and creates a pickle file with all evaluation data.
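The aggregation step above can be sketched roughly as follows. This is a minimal illustration, not the original script: the directory layout (one sub-directory per user, one log file per trial) and the function names are assumptions.

```python
# Hypothetical sketch: walk a root path of per-user directories and
# bundle every trial log into a single pickle file.
import os
import pickle

def collect_logs(root):
    """Read each user's trial logs into {user: {filename: contents}}."""
    data = {}
    for user in sorted(os.listdir(root)):
        user_dir = os.path.join(root, user)
        if not os.path.isdir(user_dir):
            continue  # skip stray files next to the user directories
        data[user] = {}
        for fname in sorted(os.listdir(user_dir)):
            path = os.path.join(user_dir, fname)
            with open(path, "r", encoding="utf-8") as f:
                data[user][fname] = f.read()
    return data

def dump_pickle(root, out_path):
    """Serialize all collected evaluation data to one pickle file."""
    with open(out_path, "wb") as f:
        pickle.dump(collect_logs(root), f)
```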
Python script
Reads the data pickle file and generates an Excel file with various touch metrics, per participant and overall. Requires the win32com Python package.
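The metric computation can be illustrated as below. The original script writes Excel via win32com; this portable sketch only computes the numbers, and the data layout (per-user lists of touch coordinates) and metric names are assumptions.

```python
# Hypothetical sketch: derive simple touch metrics from the pickled data.
import math
import pickle

def touch_metrics(touches, target):
    """Mean Euclidean offset of (x, y) touch points from a target centre."""
    dists = [math.hypot(x - target[0], y - target[1]) for x, y in touches]
    return {"n": len(touches), "mean_offset": sum(dists) / len(dists)}

def metrics_per_user(pickle_path, target):
    """Compute metrics for every participant in the pickle file."""
    with open(pickle_path, "rb") as f:
        data = pickle.load(f)  # assumed layout: {user: [(x, y), ...]}
    return {user: touch_metrics(pts, target) for user, pts in data.items()}
```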
Python script
Reads the data pickle file and creates charts on demand (for one particular user or for all users). Requires the matplotlib Python package.
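A chart generator of this kind might look like the sketch below. The function name, the pickled data layout, and the scatter-plot choice are assumptions made for illustration; only the matplotlib dependency comes from the description above.

```python
# Hypothetical sketch: plot touch coordinates for one user or for all.
import pickle
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

def plot_touches(pickle_path, out_png, user=None):
    """Scatter-plot (x, y) touches for `user`, or for everyone if None."""
    with open(pickle_path, "rb") as f:
        data = pickle.load(f)  # assumed layout: {user: [(x, y), ...]}
    users = [user] if user else sorted(data)
    fig, ax = plt.subplots()
    for u in users:
        xs, ys = zip(*data[u])
        ax.scatter(xs, ys, label=u, s=10)
    ax.set_xlabel("x (px)")
    ax.set_ylabel("y (px)")
    ax.legend()
    fig.savefig(out_png)
    plt.close(fig)
```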
Python script
Reads the Excel files created by the log analyzer and creates an Excel file with the data prepared for statistical analysis in SPSS (wide format).
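The core of such a script is a long-to-wide reshape: one row per participant, one column per metric-condition pair, which is the layout SPSS expects for repeated-measures analysis. The sketch below shows that reshape only; the record fields and column-naming scheme are assumptions, not the original script's.

```python
# Hypothetical sketch: reshape long-format records into SPSS-style wide rows.
def long_to_wide(rows):
    """rows: dicts with 'participant', 'condition', 'metric', 'value'.
    Returns (header, wide_rows) with one row per participant."""
    conditions = sorted({(r["metric"], r["condition"]) for r in rows})
    header = ["participant"] + ["%s_%s" % mc for mc in conditions]
    by_part = {}
    for r in rows:
        key = (r["metric"], r["condition"])
        by_part.setdefault(r["participant"], {})[key] = r["value"]
    wide = []
    for p in sorted(by_part):
        # empty string marks a missing cell for that participant/condition
        wide.append([p] + [by_part[p].get(mc, "") for mc in conditions])
    return header, wide
```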
Test Data
Long-term study with excluded blind users (in Chapter 3)
Excel Document
Raw usage data and auxiliary tables.
Excel Document
Raw weekly text-entry data and auxiliary tables.
Expert Text-entry Comparison (MultiTap and NavTap) SPSS data
SPSS Data File
Text-Entry metrics data.
Interview Study: Identifying individual differences among blind people (in Chapter 4)
Word Document
Transcriptions of ten (10) interviews with professionals who work closely with blind people, on the individual differences found among the blind population and their relevance.
Excel Document
Table with the raw counts of references to individual attributes.
Assessing Individual Differences amongst Blind People (in Chapter 5)
SPSS Data File
Individual data of 51 blind people, including profile, tactile, cognitive, and functional assessments.
Individual abilities and Mobile Touch Demands (in Chapter 6)
Zip File
Archive with the raw XML trial log files for all three touch settings (tablet, touch phone and touch phone with physical border).
Zip file with Excel Documents
Touch metrics for all devices and settings.
Zip file with SPSS files
One SPSS file per touch metric, with data from all participants and device settings.
Individual Abilities and Mobile Touch Typing (in Chapter 7)
SPSS Data File
WPM and MSD data for the 15 participants