Expert User’s Perspective

Introduction

The importance of a user-centered design approach is perhaps best demonstrated when considering accessibility, where the user’s needs and preferences often vary greatly from those of the designer. It is imperative that the designer interacts directly with users, to see firsthand what it is like to use the particular assistive devices, software, and other tools or methods employed. We must also be careful to design according to everyone’s ability, and not to impose disability through poor design.

Kim Kilpatrick has abundant experience working with designers, software developers, and engineers, as well as with the end users of numerous technologies. Founder and director of the GTT (Get Together with Technology) program, Kim works with people with low vision, as well as with designers looking to create products accessible to them. Being blind herself, Kim has become an expert on a wide range of technology, from Apple’s trusted VoiceOver screen reader to more cutting-edge tech.

Kim had a lot to say about design and the future. I chose three topics to research in depth, and categorized them based on how long it would take before we saw them implemented in society: a near-future topic (within a few years), a mid-future topic (maybe within a decade), and a distant-future topic (which may not be available for a generation or two). I hoped that by organizing the topics chronologically, I would be able to pick up on a narrative, or perhaps a plan of action, which would ferry us from the present into the future of accessible design.


Near Future: Staff Training and Education

Walking through a Best Buy nowadays can be quite the escapade. The warehouse-sized stores boast wall-to-wall displays of all the latest technology, from 3D 4K Ultra HD TVs to sports video games costing more than a hockey ticket. The variety within any given category can also be daunting (Wikipedia, for example, lists 157 brands of mobile phones worldwide). Often, our first line of defense in these situations is the staff, who can help us navigate to the product best suited for us. This is no different for users who want to purchase a phone or tablet with the best screen reader, or one that is compatible with their own assistive devices.

Unfortunately, staff are often not familiar with these features. In fact, many of us have no experience with accessibility software like screen readers, because these things take time to learn. Taking the time to train staff to enable, configure, and use these features is a big step towards a more inclusive society. I read over Best Buy’s accessibility policy, which lists a number of goals, most with deadlines in 2014-2015. The goals include providing training to employees on accessibility laws and the Human Rights Code, determining how best to communicate information to customers, and ensuring the workspace is accessible to employees as well [1]. While this is certainly a good start, it is unclear how well the new policy is working. For example, Kim mentions that kiosk employees are often less well trained, due to high turnover. One solution she suggests is to focus training on more permanent employees, like managers.

Steve Jobs unveils the first ever iPhone. Apple products have excellent accessibility features, such as VoiceOver. Store staff should be able to recommend the best smartphones and help new customers activate features like VoiceOver [2].

So policies are in place, but we are not sure whether they work. It would be interesting to do an external evaluation of Best Buy’s and other tech stores’ training programs, but that is beyond the scope of this case study. Whether or not the policy is working, the key to keeping staff updated on accessibility is communication, i.e. developing awareness. This extends to society in general, where a baseline level of awareness is important in fostering a more inclusive world. The theme of awareness will come up again throughout this case study.


Mid Future: App and Software Accessibility

There are over 3,300,000 apps available in the Google Play Store. Some of them are pretty terrible. Apps can be based on flawed ideas, or can simply be poorly designed. Some aspects of poor design are not readily apparent unless you are using a screen reader. Much of what the screen reader dictates comes from text and descriptions added behind the scenes. These descriptions are optional, meaning it is possible for developers to create apps with no descriptions for their various buttons, links, and menus. I imagine using such an app would be akin to flying a plane with no instrumentation. Other accessibility features, like adequate colour contrast and communicating information by both audio and visual means, can also be forgotten. To understand what we can do to improve app accessibility, we must first understand what exactly needs improvement. I therefore made a basic app with elements that were not accessible (an unlabeled button, below-minimum colour contrast, and a button which is too small), then looked through the resources available for identifying and fixing these errors.

The first resource available to developers is the body of accessibility standards and suggestions, many of which are summarized in Google’s Material Design guidelines [3]. These include information on colour and contrast choices, correct labelling of elements, and the proper sizing of buttons. All of these suggestions are easy to implement and will make your app more widely accessible. The information there is not limited to standards; there are also suggestions for organization and higher-level design concepts.
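The contrast suggestion deserves a closer look, because it is precisely defined: web and mobile accessibility standards (WCAG 2.x) compute a contrast ratio from the relative luminance of the two colours, and require at least 4.5:1 for normal text (3:1 for large text). Below is a rough sketch of that calculation; the function names are my own, not from any official tool.

```python
# A sketch of the WCAG 2.x contrast-ratio check that accessibility
# tools apply. Function names here are illustrative, not an API.

def relative_luminance(rgb):
    """Relative luminance of an sRGB colour given as 0-255 channels."""
    def linearize(channel):
        c = channel / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colours, from 1.0 to 21.0."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def passes_aa(fg, bg, large_text=False):
    """WCAG AA requires 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

print(passes_aa((0, 0, 0), (255, 255, 255)))      # black on white: True
print(passes_aa((119, 119, 119), (255, 255, 255)))  # mid-grey on white: False
```

Running this shows why “looks readable to me” is not a reliable test: a mid-grey (#777777) on white comes out just under 4.5:1 and fails for normal text, even though it looks fine to many sighted users.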

The next resource can be found within Android Studio, the standard IDE for Android app development. Android Studio includes the Lint tool, which scans your code and highlights possible errors or poorly structured parts. Among Lint’s checks is an “Accessibility” category of warnings, which flags code that does not meet certain standards. As far as I know, this feature is unique to Android Studio.

A screenshot of the Android Lint feature, showing a warning for an element without an alt description

One final resource I found is an app called Accessibility Scanner, which can be downloaded for free from Google Play. Once activated, the scanner can view apps on your phone and identify features which do not meet standards. The examples below are taken from a simple app I created (available on GitHub) which purposefully includes designs that do not meet standards, to showcase the utility of Accessibility Scanner.


The above pictures show the Accessibility Scanner in action. It has taken a snapshot of a page in my app with three buttons of different sizes, and has highlighted the last, which is below the recommended size (Picture 1). Selecting the highlighted element gives a description of the problem and a possible solution, in this case making the button larger (Picture 2).
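The size check itself is simple. Android’s accessibility guidelines recommend touch targets of at least 48x48 dp (density-independent pixels), where 1 dp equals 1 physical pixel at the 160 dpi baseline density. The sketch below shows roughly what such a check involves; it is my own illustration, not Accessibility Scanner’s actual code.

```python
# Illustrative version of a touch-target size check. The 48 dp
# threshold follows Android's accessibility guideline; the function
# names are mine.

MIN_TARGET_DP = 48  # recommended minimum touch-target size

def px_to_dp(px, dpi):
    """Convert physical pixels to density-independent pixels (dp).
    1 dp equals 1 px at the 160 dpi baseline density."""
    return px * 160.0 / dpi

def target_large_enough(width_px, height_px, dpi):
    """True when both dimensions meet the 48 dp minimum."""
    return (px_to_dp(width_px, dpi) >= MIN_TARGET_DP and
            px_to_dp(height_px, dpi) >= MIN_TARGET_DP)

# A 96x96 px button on a 320 dpi ("xhdpi") screen is exactly 48x48 dp.
print(target_large_enough(96, 96, 320))   # True
# A 64x64 px button on the same screen is only 32x32 dp: too small.
print(target_large_enough(64, 64, 320))   # False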

Clearly there are many resources for app developers. However, it is uncertain how many designers take advantage of them. Accessibility Scanner has fewer than 100,000 downloads, and is only available on Android devices. Compare this with other tools, such as Google Play’s ‘Playbook for Developers’, which has closer to 1,000,000 downloads. The accessibility linting in Android Studio can be ignored throughout the coding process. And, going back to Kim’s original idea, Google Play does not check that your app meets accessibility standards before making it public. It is therefore important that designers be aware of all of these tools.


Distant Future: Autonomous Guides

Robots are becoming more advanced; consider bipedal robots like Boston Dynamics’ ATLAS, or Waymo’s self-driving cars. The question is, can we find a good use for these powerful new developments? When I asked Kim for her thoughts on the distant future of accessibility, she mentioned autonomous guides. She navigates most spaces using guide dogs and/or GPS. However, there are times when both of these are insufficient. Think of complex indoor spaces, like shopping malls, where GPS doesn’t work and where guide dogs can’t bring you to specific points. Another example is the hospital (and who hasn’t been lost in a hospital?). In these scenarios, it makes sense to design a robotic guide that could take you to your final destination.

This is not a new idea; in fact, guide robots already exist. Cice, shown below, gives tours of an archaeology museum in Italy. Robots like these have to navigate unstructured environments, and have to do so in close proximity to people. The creators have found that, while safety is obviously a primary concern, the perceived safety of the robot’s behaviour is also very important [4]. This can include ensuring the robot does not move into anyone’s personal space, that it moves at walking speed, and that it approaches others from the front so that it is visible. These considerations can be categorized as comfort, naturalness, and sociability [5]. Cice is also capable of carrying on conversations, so it can provide information about the various museum exhibits and answer questions. These extra constraints make designing the robot more difficult, but they yield a robot which is more pleasant to interact with.

Cice in action at a museum in Italy, surrounded by visitors.
Cice amidst some archaeology enthusiasts. Notice that unlike other robots, Cice must interact in close proximity to people. One day, robots could be used as guides in a variety of spaces, like hospitals and shopping malls [4].

Robots like these are not limited to museum settings. With a few refinements, a bot like Cice could be made to navigate a hospital or a shopping mall. It should be noted that Cice does not appear to be capable of guiding people with low vision – future autonomous guides will need to be built with inclusive design in mind.

A similar project a little closer to home is Carleton’s own telepresence robot, iTAD (intelligent telepresent assistive device). This robot is under development as a fourth-year mechanical engineering project. Its main purpose is to provide a controllable robot with cameras and manipulators so that someone off-site can remotely visit a hospital, whether they are a doctor doing rounds or a relative visiting a patient. Telepresence robots are already being used in healthcare settings [6]. A secondary purpose of iTAD is autonomous navigation. Using LiDAR (laser proximity sensors like those used in autonomous cars), along with knowledge of the layout of the hospital, iTAD could conceivably guide visitors to different parts of the building.
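To make the “knowledge of the layout” idea concrete, here is a toy sketch of route planning on a known floor plan: a grid map where walls block movement, searched with breadth-first search to find a shortest walking route. The map and code are purely illustrative; real guide robots like iTAD use LiDAR, continuous maps, and far more sophisticated planners.

```python
from collections import deque

FLOOR_PLAN = [          # '#' = wall, '.' = open corridor (toy map)
    "#########",
    "#...#...#",
    "#.#.#.#.#",
    "#.#...#.#",
    "#########",
]

def shortest_route(grid, start, goal):
    """Breadth-first search: returns the list of (row, col) cells on a
    shortest path from start to goal, or None if no route exists."""
    queue = deque([start])
    came_from = {start: None}  # maps each visited cell to its parent
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []          # walk the parent links back to start
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == "." and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

route = shortest_route(FLOOR_PLAN, (1, 1), (3, 5))
print(route)  # [(1, 1), (1, 2), (1, 3), (2, 3), (3, 3), (3, 4), (3, 5)]
```

The hard part for a real guide robot is not the route search, which is well understood, but everything around it: building the map, localizing within it, and moving along the route in a way that feels safe to the people nearby, exactly the comfort, naturalness, and sociability concerns raised above.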

It is clear from these examples that Kim’s idea of autonomous guides may not be as much science fiction as originally thought. Many robots already exist that are designed for similar purposes. However, these devices are not always designed specifically for guiding blind users. So, while the technology is out there, we must be sure to incorporate inclusive design principles so that it can benefit everyone.


Conclusion

We have looked at three different ideas for the future of accessibility: better training for staff, improving the accessibility of apps, and using automation to make everyone more independent. The purpose of arranging the ideas chronologically was to uncover a narrative: a sensible order in which these changes might unfold. The first step, better training for store staff, can be seen as part of the broader importance of awareness. Society must first be aware that people have differing abilities, and therefore different needs and methods for using products. Next, we saw that Android has many great tools and resources for building more accessible apps. Finally, complex technology can be used to increase independence, for example by using autonomous robots to guide people through indoor spaces. But it is important that inclusive design be integrated into their creation from the beginning.


[1] BestBuy (2013). Accessibility Policy. Available online: https://www.bestbuy.ca/en-ca/help/accessibility-policy/hc8336.aspx

[2] Business Insider (2017). Watch Steve Jobs unveil the first iPhone. Available online: http://uk.businessinsider.com/watch-steve-jobs-first-iphone-10-years-ago-legendary-keynote-macworld-sale-2017-6/#well-today-were-introducing-three-revolutionary-products-of-this-class-the-first-one-is-a-wide-screen-ipod-with-touch-controls-the-second-is-a-revolutionary-mobile-phone-and-the-third-is-a-breakthrough-internet-communications-device-3

[3] Google Material Design. Usability – Accessibility. Available online: https://material.io/guidelines/usability/accessibility.html#

[4] Chella, A., & Macaluso, I. (2009). The perception loop in CiceRobot, a museum guide robot. Neurocomputing, 72(4–6), 760–766. https://doi.org/10.1016/J.NEUCOM.2008.07.011

[5] Kruse, T., Pandey, A. K., Alami, R., & Kirsch, A. (2013). Human-aware robot navigation: A survey. Robotics and Autonomous Systems, 61(12), 1726–1743. https://doi.org/10.1016/J.ROBOT.2013.05.007

[6] Michaud, F., Boissy, P., Labonté, D., Corriveau, H., Grant, A., Lauria, M., & Royer, M.-P. (2007). Telepresence Robot for Home Care Assistance. AAAI Spring Symposium. Available online: http://www.aaai.org/Papers/Symposia/Spring/2007/SS-07-07/SS07-07-012.pdf