ATU259 – Facebook Accessibility


Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Show Notes:
Matt King, an Accessibility Specialist with Facebook | www.facebook.com/accessibility

Apple Accessibility – All Accessories – Apple http://buff.ly/1OlvW97
Skull echoes could become the new passwords for augmented-reality glasses | KurzweilAI http://buff.ly/1TPfOAP
NYU Tandon Doctoral Student’s Cochlear Implant Technology Banishes Ambient Babble http://buff.ly/1UPlSfe
On the Hill with Audrey Busch | www.ATAPorg.org
——————————
Listen 24/7 at www.AssistiveTechnologyRadio.com
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

——-transcript follows ——

MATT KING: Hi, my name is Matt King. I am an accessibility specialist in user interface engineering at Facebook, and this is your Assistive Technology Update.

WADE WINGLER: Hi, this is Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana with your Assistive Technology Update, a weekly dose of information that keeps you up-to-date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

Welcome to episode number 259 of Assistive Technology Update. It’s scheduled to be released on May 13 of 2016.

I’m so excited to have as my guest today Matt King, who is with Facebook. We are going to talk about what Facebook is doing with artificial intelligence to create automatic alt tags, and about their accessibility initiatives in general. We’ve got a story about how the echoing of sound around your skull might be your new password; information about how new algorithms are helping cochlear implants do a better job of separating conversation from babble; and we get an update from Audrey Busch on what’s happening in the federal government as it relates to disabilities and assistive technology.

We hope you’ll check out our website at www.eastersealstech.com, send us a note on Twitter @INDATAproject, or call our listener line. We love to hear from you. That number is 317-721-7124.

Like this show? Check out another one of our shows: ATFAQ, Assistive Technology Frequently Asked Questions. ATFAQshow.com.

***

HumanWare Brailliant; AbleNet Spec Switch; BIGtrack trackball; Blue2 Bluetooth switch; the Skoog; and TrackerPros — those are all things that I see on the most popular product page when I look at Apple’s new accessibility store. You can buy all kinds of things from Apple.com, from computers to iPads and those kinds of things. Now they have a new section for accessibility products, with categories for vision, physical and motor skills, and learning and literacy, even noting compatibility with different Mac or other Apple products. I was fascinated to see that Apple did this. I will pop a link in the show notes, and you can now shop for your accessibility and assistive technology products right on Apple.com.

***

How many passwords do I have to remember? I don’t know about you, but for me that’s a real concern in my life. I know that I have a fingerprint reader on my iPhone, and I know that there are some optical retina scanners that can help you with some biometric things, but frankly we need a better answer to passwords. There are just too many to keep track of. Which is why I was fascinated when, in the Kurzweil Accelerating Intelligence newsletter, I read an article that says, “Skull echoes could become the new password for augmented reality glasses.” A group of German researchers has figured out that they can take devices like Google Glass or HoloLens, send a sound through the bone conduction speakers that are part of those devices, basically bounce that sound off of the inside of your skull, and then have that reflected, resonant sound come back through the system’s microphones. It is sort of like a fingerprint: everyone’s skull resonates in a different way in the system. They use something called the Mel Frequency Cepstral Coefficient, which is a way for them to measure that unique characteristic of the sound rattling around in your skull. Fascinating stuff. I had no idea that we were that unique and that we had this fingerprint sort of resonance in our skulls. There are lots of jokes to be made about noises happening in my head. It is a fascinating way to think about biometrics. Apparently in their early tests, they were about 97 percent accurate, with an error rate of less than seven percent, which isn’t quite as accurate as another technique that does a 100 percent accurate brain print, but it is low cost, portable, and doesn’t require complicated equipment. I’m fascinated to know what this might mean for folks with disabilities, especially where typing in a password is a big pain.
If we can simply rattle some sound around in your skull and measure that frequency response or unique pattern, I think it will be an interesting way to authenticate and get into different computer systems and those kinds of things. Fascinating article. I will pop a link in the show notes and you can check it out.
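The researchers’ actual system isn’t public here, but the basic idea of matching a fresh skull-echo measurement against an enrolled template can be sketched in a few lines. This is a minimal, hypothetical illustration: the feature vectors, threshold, and function names are all made up, and real systems would extract Mel Frequency Cepstral Coefficients from the recorded echo rather than use hand-written numbers.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled_template, candidate_features, threshold=0.95):
    """Accept the wearer only if the new echo's features closely match
    the enrolled template. The threshold is purely illustrative."""
    return cosine_similarity(enrolled_template, candidate_features) >= threshold

# A stored "skull print" and a fresh measurement from the same wearer,
# plus a measurement from a different person (all values invented).
template = [0.82, 0.41, 0.13, 0.55]
same_user = [0.80, 0.43, 0.12, 0.56]
impostor = [0.10, 0.90, 0.70, 0.05]
```

In practice the reported 97 percent accuracy versus a sub-seven-percent error rate corresponds to tuning exactly this kind of acceptance threshold: raise it and impostors are rejected more reliably, but the genuine wearer is occasionally rejected too.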

***

Out of the NYU Tandon School of Engineering (formerly the Polytechnic Institute of New York University), there is a doctoral student named Roozbeh Soleymani who has created a new algorithm that might help with the babble problem related to cochlear implants. Users of cochlear implants have often struggled with separating the speech they are interested in hearing from the background noise, or babble, that is often part of a room full of conversation. That cacophony can be distracting and sometimes gets in the way of communication. Traditional noise suppression algorithms have long been able to model steady background noise, like an air conditioner, and filter it out. However, when it is babble, people talking in the background, algorithms have had a hard time figuring out what is the spoken language the person wants to hear and what is that background noise.

These new algorithms look at those waveforms a little bit differently. It has to do with the frequency of oscillation, the number of oscillations per second in those noise patterns, and they can more accurately sort out a spoken voice from some of that background babble. Now, while the original idea is about making cochlear implants clearer for their users, it seems there might be an even bigger market related to cell phone noise reduction. I am sure you have been in a situation where you are using your cell phone in a busy train station or a room full of people who are talking. Imagine an algorithm that can sort that out so that you hear only the voice on the cell phone and block out the rest of that background babbling noise.
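The published algorithm is more sophisticated than anything we can show here, but one intuition behind separating speech from steady noise is that speech energy fluctuates, swelling and fading syllable by syllable, while a hum or an air conditioner stays roughly constant. As a rough, hypothetical sketch of that single idea (all names and numbers invented, not the NYU method itself):

```python
import math

def frame_energies(signal, frame_len=64):
    """Mean energy of consecutive non-overlapping frames."""
    return [sum(s * s for s in signal[i:i + frame_len]) / frame_len
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def modulation_score(signal, frame_len=64):
    """Relative variability of frame energy: speech-like signals
    fluctuate over time, steady noise does not. Illustrative metric."""
    e = frame_energies(signal, frame_len)
    mean = sum(e) / len(e)
    return math.sqrt(sum((x - mean) ** 2 for x in e) / len(e)) / (mean + 1e-12)

rate = 8000
t = [i / rate for i in range(rate)]  # one second of samples
# A constant 120 Hz tone, standing in for steady background noise.
steady_hum = [math.sin(2 * math.pi * 120 * x) for x in t]
# The same tone whose loudness swells and fades about 4 times per
# second, standing in for the syllable rhythm of speech.
speechy = [(0.5 + 0.5 * math.sin(2 * math.pi * 4 * x)) *
           math.sin(2 * math.pi * 120 * x) for x in t]
```

A denoiser built on this idea would keep the parts of the signal whose energy modulates at speech-like rates and suppress the rest; babble is hard precisely because it modulates much like the voice you want.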

It’s a fairly technical article, and very fascinating. It is from NYU.EDU, their engineering blog. I’ll pop a link in the show notes so that you can look at some charts and diagrams and wrap your brain around how these new algorithms for noise cancellation might really improve the experience for users of cochlear implants. Check our show notes.

***

It’s time for On the Hill with Audrey Busch. Audrey Busch is the Director of Policy and Advocacy for the Association of Assistive Technology Act Programs. In her update, she lets us know how the power of politics is impacting people with disabilities and the use of assistive technology. Learn more about Audrey and her work at ATAPorg.org.

AUDREY BUSCH: This is Audrey Busch, Policy and Advocacy Director for the Association of Assistive Technology Act Programs, coming to you with your monthly Washington Update. It is clearly budget season in Washington. Even with the possibility of passing a full FY 2017 budget languishing, the Senate and House Appropriations Committees are clearly trying to will the budget’s future and move forward with developing and passing appropriations bills as if there were no internal fighting within the GOP over a top line number to spend. So we have seen the Senate Appropriations Committee pass four out of the 12 appropriations bills, while the House Appropriations Committee approved three out of the 12. And both committees are charging forward to develop and pass the remaining funding bills. In fact, the Labor, Health and Human Services appropriations bill will likely be considered sometime in May in the Senate and sometime in June in the House. While it is unknown how disability programs will fare in this funding bill in either chamber, it is clear that the bill’s overall allocation is lower than last year’s, leading to an assumption that cuts are coming somewhere, though no one knows where yet.

And while you may not be hearing about progress on Capitol Hill due to the headlines being dominated by presidential candidates, progress was made to reauthorize the Older Americans Act. In fact, the president signed this bill into law in late April. This reauthorization improves the Aging and Disability Resource Centers, otherwise known as the ADRCs, and also updates these centers’ definition to be consistent with current practice and current law, including by emphasizing independent living and home and community-based services. Throughout the bill, there is an effort to promote evidence-based supports, improve nutrition services, align senior employment services with the workforce development system, and streamline and improve program administration. This is an accomplishment that both parties should be proud to tout during this election season. The hope by those in Washington is that more bipartisan success is emerging this year, and as always, this Washington Update will keep you posted on all future accomplishments.

***

WADE WINGLER: So Facebook is one of those tools that I have open on my computer all the time. I can’t get through a day without dealing with Facebook and the people I interact with and the relationships that show up on Facebook. So you can understand why I am super excited today to have Matt King, who is an accessibility specialist with Facebook, to talk with us about what is going on at Facebook when it comes to accessibility and users of assistive technology and that kind of thing. First and foremost, Matt, thank you so much for taking time out of your busy day and being on our show.

MATT KING: You’re very welcome. I appreciate the opportunity to share with you.

WADE WINGLER: Matt, tell us a little bit about you. Kind of set the stage so that people understand your perspective and background. How did you get into assistive technology and accessibility? How did you end up working for Facebook in this very important role?

MATT KING: Growing up, I actually had some sight. I was legally blind. I didn’t really think of myself as blind at that time, though. It was not until I got into my college years and started losing more of my vision because of retinitis pigmentosa that I had to really face the fact that, yes, indeed I was blind, and I was going to be totally blind pretty soon, so I’d better get with the program. Very fortunately, I was able to connect with people who served as mentors for me and were really able to help me face up to the fact that blindness doesn’t disable your entire life. It just affects your eyes. The rest of your life can go on and be happy and great.

I finished an engineering degree at Notre Dame, with a second major in music. I went on to start working as an electrical engineer at IBM. Later on I got into software development. But the whole way, from the very first day I started using a screen reader as a sophomore in college back in 1985, I have always been pretty frustrated with the technology, never felt that it was good enough or getting good enough as fast as I would like. So something I always did on the side was beta testing and providing feedback. When I got to IBM, I actually started contributing code. That’s how I got my start in assistive technology. In 1998, I had my first opportunity to work on accessibility full-time at IBM. That’s where I’ve been ever since. That has really been my passion, to make the world a better place for people with disabilities, not just visual disabilities. When I got into the field of assistive technology, I really expanded my focus to all disabilities.

I think it was a little over a year ago that Facebook contacted me and wanted to know if I was interested in talking about opportunities here. It turned out to be one of the most wonderful things that ever happened in my professional career. Here I am as a member of the Facebook Accessibility Team.

WADE WINGLER: We are so glad that you are there. I’m excited. By the way, thanks for the Notre Dame plug. We always love to hear Indiana references on our show here. Go Fighting Irish. So, Matt, talk to me a little bit about Facebook’s accessibility initiatives over the years. I would love to hear about some of the technical things, and then also, culturally, what is happening with Facebook in terms of accessibility.

MATT KING: The team started about five years ago, when my manager, Jeff, was working in user research and was getting feedback from people who were trying to use the platform, saying, I’m using a screen reader or a screen magnifier or Dragon NaturallySpeaking or some other assistive technology, and it is not working very well for me. He started digging into it and found, wow, we really need to do some work in this space, and it’s not going to be just a simple little fix. He made a proposal to engineering management and was able to establish a team. That was in 2011. It started chipping away at the very basics back then. Since then, the team has grown and developed quite a broad array of skills, not just web but mobile, iOS and Android, and even in the fields of data science and now artificial intelligence, user research, etc. It is now a much broader initiative with a wider scope.

The focus, of course, is making sure that every single one of Facebook’s teams has the ability to serve all of the people in their target market, regardless of whether or not they can see or hear or use their hands or have hands. To do that, of course, as you are really wise to point out, it is not just technical but also cultural. There have been a lot of culture-shifting initiatives started by the accessibility team, such as our Empathy Labs. I’m not exactly sure how many are in place right now, because the number keeps growing, but I think there are roughly four or five of them, possibly six, that are up and running at different sites. This is where engineers and others can sit down and experience Facebook in different ways, with a braille display or a screen reader or a switch device. It also helps people understand other aspects of empathy as well. What is it like to use Facebook on a 2G connection? What is it like to experience a Safety Check? Things like that. So the Empathy Labs are broader than just accessibility, but they got their start with accessibility.

WADE WINGLER: I think that is an amazing concept, to put developers and engineers in that sort of situation so they kind of walk in the boots of people who are dealing with those different sorts of situations, including disability or assistive technology. Matt, tell me a little bit about some of the technical things that are happening at Facebook. There has been a lot of press recently about the alt text kind of thing. I want to save that for a minute. But are there other kinds of accessibility things happening with Facebook that folks might not be aware of?

MATT KING: I can summarize a few things. If they are not yet aware, we would really like them to be aware. Some time ago, I don’t know the exact dates because this was before I was on the team, we started supporting the ability for people to upload caption files for their videos. Last year we finally completed, and this turned out to be pretty difficult, actually, initial support for font scaling across all of the apps on iOS. So if you choose larger fonts within your iOS settings, now all of the Facebook apps will respect that. Similarly with high contrast mode.

For VoiceOver users, we made quite a few changes on iOS within the last year. A couple that come to mind: now when you’re going through News Feed, you can like, react, save, and do other things without using that magic tap gesture, which will sometimes start music playing or do other things you don’t expect. You can just use the rotor. The actions rotor is supported in News Feed as well as some other places within the app. Similarly, if you open a post in News Feed and go to what we call the post view page, or the permalink page, and you get to the like button: the way non-VoiceOver users access other reactions is they hold their finger down on the like button, and the other reactions, like sad and love and so forth, pop up. Well, we have done the equivalent of that for VoiceOver users: you can double tap and hold the like button and that menu will appear. There is a VoiceOver hint for that; however, if you have hints turned off like most of us do, you won’t hear it. You would hear it, of course, if you did that three-finger tap on the like button. We are constantly trying to work on ways of making the experience more enjoyable.

WADE WINGLER: It seems to me that you gather a lot of input from your users. You are not working in a vacuum there and people have the opportunity to give you that kind of feedback.

MATT KING: Yes. We enjoy getting feedback and suggestions and do everything we can to act on those.

WADE WINGLER: So one of the things you all have done recently that has garnered a lot of press and news, at least in my world, is this thing about artificial intelligence and automatic alt text, or descriptions of images. Can you describe to me what is going on there and tell me why this is important?

MATT KING: We are using computer vision to recognize objects that are in photos and then generating alt text from that information. Right now, we are recognizing roughly 100 different concepts. Not everything is an object. For example, whether the photo is indoors or outdoors, that’s what we call a scene. People are objects until we recognize their face, and then it falls into a different system, which is not object recognition but actual facial recognition. We are not yet including the names of people in the automatic alt text, although that is something we hope to have on the horizon once we have privacy settings and so forth appropriately set up for that. Concepts can be objects, scenes, activities, and so forth. Right now we are recognizing about 100 of those. We are introducing more, slowly and carefully, because we want to make sure that people can trust the automatic alt text and will not skip by it because they don’t think it is reliable. So our threshold for what we call precision coming out of the object recognition system is pretty high. It varies depending on the concept, but it is usually not less than 0.8 on a 0 to 1 scale.
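Facebook’s actual pipeline isn’t shown here, but the confidence gating Matt describes is easy to picture: the recognizer emits (concept, score) pairs, and only concepts above the precision threshold make it into the alt text. The function name, phrasing, and scores below are invented for illustration only.

```python
def build_alt_text(detections, threshold=0.8):
    """Keep only concepts the recognizer is confident about, then join
    them into a short alt-text string. A hypothetical sketch: the
    wording and the 0.8 default mirror the interview, nothing more."""
    confident = [concept for concept, score in detections if score >= threshold]
    if not confident:
        return "No description available."
    return "Image may contain: " + ", ".join(confident)

# Invented recognizer output for a photo of a man and a dog under a tree.
# "dog" falls below the threshold, so it is withheld from the description.
detections = [("one person", 0.97), ("smiling", 0.91),
              ("outdoor", 0.88), ("tree", 0.85), ("dog", 0.62)]
```

This is also why a marginal detection like a puppy that resembles a baby simply disappears from the description rather than risking a wrong, trust-eroding guess.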

WADE WINGLER: That matches my experience with it. There are times when it will say something to the effect of, no description available. Then there are times when it says something pretty intuitive. We don’t have time for a demo today, but for example, if there were a photo that were correctly recognized, and it were a man and a dog standing under a tree outside, can you give me some idea of how Facebook might describe that? Would it say that?

MATT KING: It would say, one person smiling outdoors, tree. Pretty soon it would say dog, but dog is not in the list yet. We’re being really careful about animals because sometimes when people are holding a puppy, it can look like a baby.

WADE WINGLER: Danger, danger.

MATT KING: We are being careful, especially careful on that one. It would say something like that. Over time, of course, we hope to provide a lot more detailed information. We have a number of ideas on how we are going to do that. One concept that we are pretty excited about, and that we think would work really well on mobile devices in particular, is the ability to touch the photo and explore it. As you go around the photo, it could give you some haptic feedback when moving from object to object and tell you what it is. I have this idea that maybe when you get to a person and it says it’s Wade, you could maybe even double tap, and now you are on Wade’s face, hand, foot, leg, something like that, or be able to find out something about the color of your clothing. Longer term, down the road, we would really like it so that you can ask questions about the photo. The AI team has shown us some prototypes of this kind of capability where you can literally ask it almost anything and it comes back with some type of answer. You can ask what somebody is doing, what they are wearing, what color something is. What color are their eyes? What color is their hair? That kind of thing.

WADE WINGLER: That’s great. I love those longer-term, futuristic things. I think that’s amazing stuff. A little bit more of a shorter term question we had from our listeners. My understanding is alt text is currently working on the iOS app right now. Is that the case? Are there near future plans to push it out to PC, Mac, Android platforms as well?

MATT KING: There certainly are. Right now it is only in certain products on iOS and in certain countries. We are in seven countries right now, and we are adding more almost weekly. We are also making it available on Android; it should be today. If not today, definitely before the end of the week. So the next update of your Android app will add it to more places. Anyone who has a fairly recent version of the Facebook app for Android will already have it in News Feed, but if you update your app you will get it in a lot more places. That is this week. Again, that will be in the same seven countries that we are in right now: US, Canada, UK, Ireland, Australia, New Zealand, and South Africa.

WADE WINGLER: We are recording a little bit earlier than the release, so by the time you hear this interview, those things might be in place already. That’s going to be great. Matt, I know we are running short on time here. Quick question: if people want to know more about the accessibility team, or more about the Empathy Labs and the things we talked about today, where would they go online to learn more, get more information, or even continue the conversation?

MATT KING: Facebook.com/accessibility is our Facebook page for the accessibility team. You can get regular updates from there as well as comment. We also have a feedback form that you can access from there. If you want to learn more specifically on how to use Facebook products with any assistive technology, or contact us through our accessibility specific feedback form, you can go to Facebook.com/help/accessibility.

WADE WINGLER: Matt King is an accessibility specialist with Facebook and has given us a lot of insight into the accessibility features that are built into and being added to Facebook all the time. Matt, thank you so much for spending your time with us.

MATT KING: You are very welcome, Wade. I enjoyed it.

WADE WINGLER: Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? Call our listener line at 317-721-7124, shoot us a note on Twitter @INDATAProject, or check us out on Facebook. Looking for a transcript or show notes from today’s show? Head on over to www.EasterSealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. Find more shows like this plus much more over at AccessibilityChannel.com. That was your Assistive Technology Update. I’m Wade Wingler with the INDATA Project at Easter Seals Crossroads in Indiana.

 

