Melissa is a User Experience (UX) Researcher on Android, with a focus on Accessibility. Prior to Google, she worked at a London-based start-up, State.com. Melissa earned a bachelor's degree in Political Science from UC San Diego and a master's degree in Urbanization and Development from the London School of Economics.
Maya is a product manager on Android OS Updates, working to ensure devices get the latest Android release. In addition, Maya works on Accessibility efforts to make technology usable for everyone. Previously, she led the next generation of the Android Lock Screen and Fingerprint to give users better security without the hassle of entering passwords. Prior to Google, Maya worked for Microsoft as an on-campus representative and helped design their university strategy. Maya is a technology enthusiast.
Patrick is a product manager on the central Accessibility Engineering team at Google, where he works to develop services and technology that benefit users with disabilities. Prior to Google, Patrick was a software engineer at Lockheed Martin working on NASA-related projects. Patrick earned bachelor's and master's degrees in computer science from the University of Colorado at Boulder, and an MBA with a focus on new product development from Carnegie Mellon.
Brian joined Google in 2007 as one of Google Enterprise's first sales engineers, working on the Google Search Appliance and Google Apps. Brian bootstrapped Google's expansion of Search and Gmail in Myanmar (Burma). He created the Google News Next Billion program, launching 9 editions in underserved markets, and then went on to advance meeting experiences as a product manager on Chromebox for meetings. Brian currently manages accessibility product development for Android. He graduated from the George Washington University.
Victor works on the Android Accessibility Services team as a Technical Program Manager, focusing on the TalkBack screen reader and other accessibility features in the Android ecosystem. He previously contributed to accessibility efforts for Google+, Photos, Hangouts, and other social initiatives at Google. Prior to Google, Victor led accessibility teams at companies such as PayPal and Yahoo!, and has travelled extensively in South East Asia teaching computers to visually impaired students and trainers.
About the talk
Learn about the latest feature additions to Android P, get an update on accessibility testing and best practices, and hear about new APIs that developers can use to create more accessible app experiences. This session will also offer a unique look into how Google conducts UX research for users with disabilities.
Hello, everyone. My name is Victor, and I'm an Android accessibility technical program manager. We're super happy that you decided to spend the next 40 minutes of your time at this conference with us, learning about accessibility and about the related work we do at Google. So thank you for that.

So, why are we here today? In this session we would like to showcase how access to the surrounding environment, to information, and to technology can empower people with disabilities to be productive and independent in their lives. According to the World Health Organization, there are over 1 billion people with disabilities in the world. That's nine zeros. 115 million of these people have difficulty performing even the most basic of daily tasks: for example, getting a glass of water, or walking out of the house to a favorite restaurant, or listening to a video on YouTube, or finding something in low lighting conditions. Tasks that for other people are so easily achievable. Technology has an amazing power to change that.

In fact, if you think about it, for millennia humanity has been developing low- and high-tech gadgets. I'll give you a few examples: the wheel (sorry, no Nobel Prize for this one), fire, the clock, the typewriter, the smartphone. All of these things we have been creating so that we can improve the quality of our lives. For most people, this just improves the quality of their lives; for people with disabilities, these gadgets, these new technical advancements, in many cases made it possible to live their lives at all. I, for example, being completely blind, could not do my work at Google without assistive technology, or computers, or other tools that help me do very basic things in my life, you know, paying bills, taxes, what have you. So again, technology is just super important, and today we would like to talk to you about some of that technology.

Some of you in the audience have probably experienced low-light conditions where you were trying to find something, or a shaky train where you were trying to read something you had been waiting for so long to pick up and read. You might remember the discomfort and frustration that you find yourself grappling with in such circumstances. You probably wouldn't want this discomfort and frustration to be a companion for the rest of your life. But unfortunately, for many people with disabilities, it is; they have to live with this. Now, technology, as I mentioned before, has an amazing power to change, if not all of that, then certainly a lot of these things.
We at Google actually create some of that technology. I think we know something about technology at Google, right? We create technology, and we create it with you; that's why we have you here, so we can tell you about the things we've created over the last year. We've been working hard on creating new tools to help people with disabilities, but also on creating new APIs and frameworks to help developers like you create more accessible apps and more inclusive products. With that, I would like to invite Brian to kick off our marathon and talk to you about what we're working on. Alright, Brian.

Thank you, Victor. Thank you, everybody. My name is Brian Kemler, and I'm a product manager on Android accessibility. I'm excited to discuss our new accessibility services and the APIs that make them possible. Now, let's dive in and discuss how we're innovating in this space. First, I'm going to talk about what we're doing in Android P to make hearing conversations a little easier.

Listening is difficult, even in a quiet field with no background noise and a large set of ears. But the everyday reality is we live, work, and play with an increasing amount of environmental noise. In fact, unwanted sound is one of the most common environmental problems. It's not only annoying, but it prevents us from understanding our colleagues, friends, and loved ones. No matter how well we can hear, most of us can relate to the following acoustically challenging situations: listening to your dinner date in a loud restaurant, trying to listen to a caller in a loud airport lounge, listening to a presenter who is speaking too softly, like me right now. One could think about environmental noise as a form of situational disability. But what if you could turn down the noise and boost the signal on your smartphone, so you would never miss the conversation? Now you can, because today we're introducing Sound Amplifier.

Sound Amplifier is a new accessibility service that helps users focus on real-world conversations using only an Android smartphone and a set of headphones. Users can tune hundreds of personalized levels to optimize their listening experience for the current environment. Its two sliders, for loudness and tuning, dynamically adjust over 100 audio presets in the background. The settings can then be independently applied to each ear. Adjusting them improves ambient sound quality in a vast array of scenarios, including enhancing sound in loud, distant, or far away
spaces; increasing the volume of somebody who speaks softly; or turning up the TV to a volume that's understandable and acceptable for everybody in the room.

Now I'm going to briefly explain how it was built, using the new Dynamics Processing Effect. Sound Amplifier was built on the new Dynamics Processing Effect in Android's audio framework. The effect is based on a four-stage signal processing architecture. Stage one is pre-equalization. You can think about the pre-equalizer as a programmatic way to adjust any audio frequency, from bass to treble and everything in between. Stage two is a multi-band compressor. The multi-band compressor is the keystone of the Dynamics Processing Effect: it adjusts up soft sounds and adjusts down loud sounds to provide a much better listening experience. Stage three is post-equalization; similar to the first stage, it allows you to fine-tune any audio frequency. Lastly, stage four, the limiter, prevents additional gain above a developer-designated threshold, which prevents loud or unwanted noises. We're going to be doing a number of demos and a deep dive on the Dynamics Processing Effect and Sound Amplifier on Thursday at 3:30 on Stage 5, so please join us there if you want to learn more.

Switching topics, I'm going to discuss what we're doing to make Android easier for users with mobility and dexterity challenges. For example, we're going to do a little empathy exercise. So first, I'm going to ask everybody to take out their phones. All right. Now I want everybody to take a screenshot. This does have a purpose. Take a screenshot, please.
After you've done that, I want you to do it again, but with just one hand. Screenshot. Screenshot, on your phone. Not quite as easy with one hand, is it? For tens of millions of users, this is their everyday reality. These might be users with tremors or limited hand strength. For them, simple tasks such as holding the phone, pressing hardware buttons, or performing gestures can be difficult, painful, and time-consuming. Accessibility Menu is a new accessibility service to make common interactions easier for users with physical disabilities. In the US alone, this is a population of over 75 million users. Accessibility Menu simplifies the following functions: hardware shortcuts, including power off, lock screen, take screenshot, and volume control; and gestures, like swipe up and swipe down for recent apps, notifications, and quick settings. It also allows you to invoke the Assistant or go directly to accessibility settings instantly. Further, we designed it with very large touch targets located at the bottom of the screen; this facilitates reachability for those with smaller hands or limited reach. It works in both portrait and landscape mode. Lastly, Accessibility Menu is built on top of our accessibility framework: we added two new APIs, for screenshot and lock screen, so developers can start using them today. So now I'm going to pass the baton over to Patrick to talk a little bit about vision. Thank you, Brian. Alright.

Hello, everyone. My name is Patrick Clary. I'm a product manager on accessibility at Google, and I'm very excited today to announce, for the very first time here at I/O, a new app coming to the Play Store called Lookout. Lookout is a new app for users who are blind or low-vision, and it'll be useful for them in situations where they would otherwise require help from others. So, for example, one of these users might be entering a new space for the first time. They might not know if other people are present in that space. They might need assistance, for example, finding a chair or reading the office directory. Another example is when a user is cooking.
They might need assistance reading a recipe or locating different ingredients. So the goal of Lookout is to leverage the user's smartphone to make these physical spaces, and others, more accessible to users who are blind or have vision impairments. You can think of Lookout as a personal guide that sees the world and speaks items of importance that it detects in the environment. So when a user downloads Lookout and opens the app, this is what they'll see: the mode selection, where they can pick between four different modes. At launch it will support Work and Play, Home, Scanning (which is for scanning long-form text), and Experimental. These allow users to specify the activity they're partaking in, which tunes Lookout to provide more relevant results. After selecting the mode, the user will be presented with a live view, which will speak items of importance as they are detected. Next, I'd like to roll a short video that profiles one of our testers, an artist up in Saint Francisville, and describes the impact that an app like this can have.

My first experience with the labyrinth was this amazing piece of canvas art that we were being allowed to walk on, this swirling choreography. I was so excited to experience this new thing. My name is Maya Scott. I am a theater and art instructor. I was born visually impaired. I kind of feel like an impressionist painting. I'm often scooching around working with my supplies, and they don't always end up where I want them to be, and I can't find the scissors.
Both as an artist and a teacher, building a labyrinth was the next step for me to invite people to have a physical experience with art and not just observe it. So much in our lives we're invited to watch or listen, instead of witnessing or doing.

So that gives you a little idea of the type of experience that Lookout can provide. When we were building Lookout, we focused on what we call a passive experience, and we imagine users wearing their Android device on a lanyard like this, or perhaps in a shirt pocket. This is a big objective of ours, this passive experience: we don't want to force the user to interrupt the activity they're engaging in. For example, if a user is cooking, they can remain engaged in that activity without having to take out their device and interact with the screen. So while wearing the phone, Lookout provides these useful hints about what the user is holding in front of them, and to facilitate this, we've designed a set of external controls. You may have noticed in the video that a user can actually knock on the device to begin the live recognition that Lookout provides, and they can do other things, like cover the camera to pause recognition, or use the fingerprint sensor to change between modes.

Let me tell you a little bit about how this works. First, Lookout will collect live imagery of the scene around the user, and then we'll run a series of machine learning recognizers to do things like detect barcodes, detect whether people are present, detect objects, and read text in the scene. The next thing we do is score these items, first based on a personalization model, and then on the mode that the user has chosen. This way, we only speak the relevant and important items to the user. This is important because, as you can imagine, in a typical scene there could be many things we can detect; however, we only want to tell the user about the important things, based on what we know about the context they're currently in. It's important to note that we do most of this on the device, so that latency is low, and you can actually use this app completely without an internet connection. So that's Lookout: an app
for users who are blind or low-vision, that harnesses the power of machine learning to provide relevant results about the physical spaces around them. It's coming to the Play Store for users in the US on Pixel devices very soon, this summer. We're also starting to roll it out to trusted testers, so if you'd like to test this app, go to google.com/accessibility and sign up to be a trusted tester. You can come to our sandbox to see a demo as well. So next I'll hand it over to Maya, who will talk about what we're doing in the cognitive space.
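As an aside, the score-and-filter step Patrick described (recognizers emit candidate items, a personalization model and the active mode weight them, and only the most relevant items are spoken) can be illustrated with a small sketch. Everything here, the categories, the weights, and the threshold, is invented for illustration; this is not Lookout's actual model or code.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch of a Lookout-style score-and-filter pass.
// Categories, weights, and the threshold are made up for illustration.
public class ScoreAndFilter {

    static class Item {
        final String label;
        final double confidence; // recognizer confidence, 0..1
        final String category;   // e.g. "text", "person", "object"

        Item(String label, double confidence, String category) {
            this.label = label;
            this.confidence = confidence;
            this.category = category;
        }
    }

    // Stand-in for the personalization/mode model: in a "work" mode,
    // text and people are weighted higher than generic objects.
    static double modeWeight(String mode, String category) {
        if ("work".equals(mode)) {
            if ("text".equals(category) || "person".equals(category)) return 1.0;
            if ("object".equals(category)) return 0.5;
            return 0.3;
        }
        return 0.7; // flat weight for other modes in this sketch
    }

    // Keep only items whose weighted score clears the threshold,
    // most relevant first; these are the items that would be spoken.
    static List<String> itemsToSpeak(String mode, List<Item> detected, double threshold) {
        List<Item> kept = new ArrayList<>();
        for (Item i : detected) {
            if (i.confidence * modeWeight(mode, i.category) >= threshold) kept.add(i);
        }
        kept.sort(Comparator.comparingDouble(
                (Item i) -> -(i.confidence * modeWeight(mode, i.category))));
        List<String> labels = new ArrayList<>();
        for (Item i : kept) labels.add(i.label);
        return labels;
    }
}
```

The real pipeline runs on-device with learned scores, but the shape of the decision, weight by context, threshold, speak the top items, is the point of the sketch.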
Thank you, Patrick. Hi, I'm Maya Ben-Ari, and I'm a product manager on the Android framework. A year ago, we launched a service called Select to Speak, targeted at users with difficulty reading. Select to Speak allows users to make a selection on the screen, and the text will be read aloud. Today, we're happy to announce that we have added OCR capability to Select to Speak, in the camera and in pictures. This makes things even more accessible: a user can simply select text using the camera, or in a picture, and the text will be highlighted and read aloud.

Now, I want to show you a brief demo of that. I'm actually going to go off the stage. I'm going to open my camera. Can my camera be shown? So this is the live camera view, and this is a regular camera, and I have the accessibility button here in the corner. I'm going to turn it on. And next, I'm going to select the text. Select to Speak with OCR is available in the Android P beta today. Please try it out.
Thank you. Now, back to the stage. And where's my clicker? So, Select to Speak with OCR is in the P beta today, and we would love to have your feedback.

Now, I'm going to talk a bit about Android's framework. Our goal in the Android accessibility framework is to help get rid of the assumptions that devices are built on: for example, that users can interact with a touch screen, or read and view the UI, or even hear the sounds that the device is producing. We're supporting these goals in two ways. The first one is that we're developing APIs for apps and services, to make it easier to support a wide range of audiences; and second, we're implementing accessibility features into the framework.

Now, I will talk about three new API additions that enable developers to make their apps more accessible. The first one is to add a pane title. First, let me tell you what a pane is: a pane is a section of the window that is visually distinct for the user and changes, like a widget. An example of a pane you can see here, in the quick settings and the notification shade. Prior to Android P, accessibility services could not easily determine when panes on the screen had been updated. In Android P, we added a View attribute called accessibilityPaneTitle. Now accessibility services are automatically notified of changes to the pane, so users get more granular information about what has changed, with minimal input from the app developer. So, please do add a title to your panes.

The second API is called accessibilityHeading. This is an attribute so developers can mark whether something is a heading inside their app, and it is intended for apps with text-heavy views. It allows screen reader users to navigate more efficiently between different headings inside the app, without listening to the whole text.

The last API is for setting what is screen reader focusable, and it's designed to control the grouping of content. For example, say that you have sports scores, and you want to ensure that the number goes with the team, instead of TalkBack reading all the teams and then all the numbers, which can be very confusing for users who cannot see the screen. Now you can use screenReaderFocusable to improve how screen readers group the content. And by the way, we're also updating our documentation for accessibility focus and grouping, so please check it out.
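As a sketch, all three of these attributes can be set directly in layout XML on API 28 and above. The attribute names are the real framework additions; the view hierarchy, strings, and layout details below are invented for illustration (a real app would use string resources rather than hardcoded text):

```xml
<!-- Illustrative fragment: a pane title on a container, a heading on a
     section label, and screenReaderFocusable to group a team with its score. -->
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:orientation="vertical"
    android:accessibilityPaneTitle="Scores panel">

    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:accessibilityHeading="true"
        android:text="Today's scores" />

    <!-- Grouped: a screen reader focuses this row as one unit,
         reading the team name together with its number. -->
    <LinearLayout
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:orientation="horizontal"
        android:screenReaderFocusable="true">

        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="Home team" />

        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="3" />
    </LinearLayout>
</LinearLayout>
```

The same can be done programmatically with View#setAccessibilityPaneTitle, View#setAccessibilityHeading, and View#setScreenReaderFocusable.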
It's in the accessibility section on developer.android.com.

Now, I'm going to talk about three new additional features coming to the Android framework. The first one is adjust vibration. This is a new setting for users with sensory issues; it is used to adjust the ring and notification vibration, and also the touch vibration. The second setting is called remove animations: some users find animations distracting, so we basically added a new setting to turn those off. And the last feature: a year ago, we launched the volume key accessibility shortcut. Basically, to toggle an accessibility service on and off, you long-press the volume keys for three seconds. Now we have made two changes to it based on feedback. The first one is that we added new settings that you can toggle on and off, which are color correction and color inversion. And the second one is that we reduced the time to activate from three seconds to one second after the initial activation.

In the next section, I'm going to talk about testing for accessibility, and why it's important for everything that we build into the OS. We see that the power of the accessibility services is in the app ecosystem, and we want the ecosystem to be as accessible as possible. Now, there are multiple ways to test, including manual testing and automated testing; in general, we have found the combination of both provides better coverage for your app. Next, I'm going to cover different testing tools and how they can be integrated to discover accessibility issues.

First, let me talk a little bit about manual testing. It is the most reliable way to understand how your products are used by different users with disabilities. What you need to do is turn on TalkBack or Switch Access with your app, because you want to be in the shoes of the user, to feel how it is to use an accessibility service. However, we are aware that manual testing takes time, and you cannot do this for every change that you make to the app. For that, there is also automated testing. We developed the Android Accessibility Testing Framework, also called ATF for short, to quickly identify common accessibility issues early in the development cycle. ATF is an open-source library which we have already integrated into the Espresso and Robolectric testing frameworks, so you can leverage your existing tests to identify accessibility issues. ATF 3.0 is now available on GitHub. We have two automated tools that integrate the Accessibility
Testing Framework: the Play pre-launch report, and Accessibility Scanner. With the Play pre-launch report, we're making it even easier to get accessibility test results, by integrating them directly into the developer console. First, what is the Play pre-launch report? It is an automated testing solution which crawls apps on multiple physical devices and detects issues that developers can fix before they reach end users, including recognizing crashes and performance issues, when you publish to the alpha and beta channels. The accessibility test is just another example of how we integrated our core accessibility testing library: the crawl now detects common accessibility issues, including contrast ratio, touch target size, missing labels, and so on, and the results are automatically integrated into the test report. This gives the developer an opportunity to fix those issues before their app launches and to make their app more accessible.

Lastly, I'll mention Accessibility Scanner, which is an app that suggests accessibility improvements for Android apps, utilizing the Accessibility Testing Framework. And we're very happy that, as of today, Accessibility Scanner has discovered more than three and a half million opportunities for improving accessibility in apps. A new version of Accessibility Scanner is available today, with additional accessibility checks, such as traversal order for screen readers and incorrect element hierarchy. We also improved the UI for viewing suggestions. And finally, we now also allow changing the thresholds for contrast ratio and touch target size. To download Accessibility Scanner, please go to g.co/accessibilityscanner.

To sum up this section: we talked about manual testing, and we showed a number of tools to help with automated testing, but the most successful projects incorporate both manual and automated testing. So please do try all the different tools, and do develop an accessibility testing strategy for your
app. A big piece of the puzzle is to understand the user experience and your users' needs, and with that, I will hand it over to Melissa to talk about user research.

Thank you, Maya. My name is Melissa Barnhart, and I'm a user experience researcher on Android. So now that we've gotten a sneak peek at the accessibility features and services coming in Android P, I'd like to talk about the role of user research in the design and development process. But first, what exactly is user research? User research focuses on understanding user behaviors, needs, and motivations. It is the process of understanding the impact of design on an audience. As technologists, we may think we know what's best for our users, but oftentimes we don't: the user is truly the expert in their own experience. By seeking to understand their behaviors, needs, and motivations, we start to challenge our assumptions and reshape our perspectives.

Why do we do user research? You may have heard the stat that nine out of ten startups fail, but you may not know that the number one reason for failure is lack of a market need for the product. In those situations, people design and develop products and then look for an audience who might be interested in using them. User research advocates for the opposite: by putting users first, you can validate your ideas and inform your product's direction. Hopefully you're saying, that's great, but when do I get started? The short answer is now. User research involves a set of practices that run alongside your existing product development cycle, helping you make decisions at each stage. Methods like field studies, interviews, or competitive analysis can help you explore a specific area of technology and uncover opportunity areas. Once you've decided what you want to build and you have a prototype, methods like cognitive walkthroughs, usability tests, or participatory design exercises can help you make important design decisions. And once your product is out in the market, methods like log analysis, diary studies, or surveys can help you evaluate its performance.
And if you don't have a lot of time or experience, that's okay. Just take a few hours and talk to a few users: Jakob Nielsen of the Nielsen Norman Group found that the best results come from small, affordable studies with no more than five users.

So now I'd like to show you how Android uses user research to improve our accessibility offerings. Let's take the example of TalkBack vibration feedback. TalkBack, the Google screen reader, uses vibration feedback by default. We knew that some people turn this off, and we wanted to know why. Why do some people turn it off while others leave it on? Are there situations where vibration is more or less useful? And does the pattern or intensity of the vibration make a difference? Good research starts with good research questions. Because we were interested in perceptions of vibration feedback, we started with user interviews. We asked screen reader users questions like: Do you use vibration feedback, and why or why not? Have you ever turned it off? When did you turn it off? And how would you describe the intensity of vibration on your phone: is it too high, too low, or just right? After getting a sense of the participants' needs, we prompted them to compare vibration feedback across a range of Android phones, so these tasks added a usability component to our study.

In this research, we validated a real need for vibration feedback. We learned that vibration provides valuable navigational information to visually impaired users, especially in noisy environments, when they may not be able to hear spoken feedback from their device. We also learned that the more distinct the vibration pattern, the more useful it is. Finally, we learned that users want the ability to adjust vibration intensity according to their personal preferences. So in Android P, we're giving the user greater control over vibration feedback: anyone will be able to go to their accessibility settings and increase, decrease, or disable vibration. And this is just the beginning; we will continue to make improvements in this area as we do further research. So now we're going to hand it over to Victor for some final announcements and resources to help you going
forward. All right. Thanks, Melissa, for highlighting the importance of user research in the product decisions we make, as you have seen already. Next, we couldn't just let you go without a few more announcements, could we? First, I'd like to announce Voice Access, an accessibility service that helps someone with dexterity issues use their Android phone completely through their voice, with just simple voice commands. Voice Access is graduating from beta: after months of user research and development iterations, we are finally ready to launch it, coming soon this summer, with new UI enhancements and better Google Assistant integration for improved interaction. So look out for Voice Access.

The next thing I'd like to let you know about is that we decided to package a lot of our accessibility services, the ones you commonly find built into an Android device, into a single package we call the Android Accessibility Suite. Some of the services include TalkBack, a screen reader for people who are blind and vision-impaired; Switch Access, software that allows people with physical impairments to use their device with just a single button or a single switch; Select to Speak, which Maya just demoed a few minutes ago; Accessibility Menu, which Brian also showcased earlier; and more to come. With the new Android Accessibility Suite, other accessibility services,
not just TalkBack, will be more discoverable in the Play Store, and the Suite will become a platform for the new accessibility services we will be developing in the future.

But enough about us; I think it's time to talk about you a little bit, right? You guys deserve it; you build a lot of the cool stuff, too. Last evening, we announced the Google Play Awards, and I would just like to give a shout-out to a few amazing developers that have been working on apps to make our ecosystem better and more accessible. The first winner is Be My Eyes. Be My Eyes is a cross-platform solution that connects people who need help with people who are willing and excited to give it. The way the service works is that someone who is blind, or who otherwise needs help, requests it by declaring themselves as a person who needs help; other people can sign up as volunteers, and when such a help request comes in, they get a notification, they pick up the video link, and then they can help somebody in real time. That's Be My Eyes; check it out on the Play Store.

The next one I'd like to mention is Open Sesame. Open Sesame is already known to some people who have been with us during various Google I/O events. The coolness of the Open Sesame accessibility solution is that it allows people to control their Android device by just looking at it. It's amazing. So check out Open Sesame; the link is on the screen.

Last but not least, I'd like to mention that we really, really, really care about games, because while it's an important aspect of computing to be productive and independent, it's just as important to have fun. And so one of yesterday's winners is a company from New Zealand called Audio Game Hub, and what they've created is an experience for anyone to be able to play audio games with their eyes closed. You rely only on your ears, and you play all kinds of funky games. I'm not going to name those games; I hope you'll go to the Play Store and check them out for yourself.

And I'd like to leave you with a few resources. We already mentioned Accessibility Scanner: please download it and test with Accessibility Scanner. The website where you can find all the best accessibility practices for Android development is g.co/androidaccessibility. And of course, do user research. And last but not least, we have lots of sessions for you to explore here at Google I/O. It's a great opportunity for you to meet us, and for us to meet you, so that together we can create a more friendly, more accessible, and inclusive ecosystem. Thank you for coming here today.