My Own Accessibility Experience

“So, what happened to your blog?”

That’s a question I got on Friday at the office from one of my associates. She asked why I had posted fewer and fewer posts per month until about a year ago, and then nearly nothing at all, save a few posts that were more general updates than insights or preaching in front of the fence with a bucket full of brushes. (Some of you will understand that reference in relation to how I spend many days at the office.)

Let me take you back, I explained. Bear with me for a moment as I get this out. Some of you won’t find this as informative or engaging as Scott Hanselman’s latest post this weekend on a continuous integration system with AppVeyor, but the preface will help you better understand where this is going.

One afternoon about a year ago, I was in my office on our main campus in Redmond preparing for a meeting the next day with a number of folks in our company to talk about our work on the customer experience. It was like any other day: gathering meeting notes and presentation materials, and doing last-minute work on the agenda and attendee list. In my office I had a number of reference materials and tablets spread out on my desk and onto the table-height bookshelf that sits beneath a heavy, wall-mounted shelf. I have a standing desk and am often in the habit of using all the available surfaces in my work, which to some appears unkempt and perhaps even in some disarray. But it all serves a purpose.

I reached for one of the tablet computers on the shelf, to ensure the system was up to date and ready to take notes, and accidentally brushed a folder on the table top, which went slipping to the floor. Without thinking, I knelt down to reach for the fallen stack of paper and… stopped. The next thing I remember, I was seated in the guest chair in my office, a little dazed and confused (not out of the ordinary, some may argue), but none the worse for wear. Collecting the paperwork, I got back up and went back to work, slogging through the presentation slides and related emails. One of the folks I work with came in to ask a question or share some amusing anecdote – whatever it was, it was cut short.

“You ok?”

Feeling like one does when informed that one is flying low, I responded that I was fine, thinking there must be some of my lunch left on my face. But no, it was pointed out that I had a small crimson trickle snaking down my forehead, which I confirmed by wiping my brow and seeing the smear across my fingers. I don’t like the sight of blood, much less my own, and sat down on my stool to steady myself. After mopping up the damage around my hairline, I applied pressure until the bleeding stopped, bandaged it with whatever passes for a generic Band-Aid in our office, and then returned to my business of looking for typos. I continued to do so for several hours, until noting that it was late and that it would be good to head home so as not to alienate my wife and family.

Brushing off the throbbing in my forehead and the headache forming behind my eyes, I got in the car and headed home, usually a fairly nondescript, uneventful trip. I was tired and chalked up the headache to my busy day; my head was still sore, but my thinking seemed quite clear. As I neared the highway, the headache turned to nausea and I realized that something was not as it should be. I began to put some of the pieces together as my vision blurred when I looked at the horizon and I caught myself wondering whether I’d left my messenger bag at the office: having spent time on ski patrol years ago, I recognized the signs of a concussion.

Surely, you jest. I hadn’t been in an accident, I thought. But then I recalled the bump earlier in the day, the lost time and the growing pain between my ears. I also remembered that people don’t always understand or notice when they’re having difficulty. And as I had been a recluse for the last few hours of the day, putting the finishing touches on the prep for our marathon discussion to come, I hadn’t been working around others in the office, folks who might’ve noticed the symptoms or called attention to them.

So, instead of following the straight shot home, I turned and headed for what seemed the more prudent destination: the emergency room.

In the several hours that followed, I found that I slurred my speech while describing my symptoms several times to different attendants, nurses, doctors and technicians. I waited, then sat, then poked, prodded, and scanned my way through the system, finally recalling the doctors telling the nurse after viewing my films (do they still use film?) that I was “TBI”: hospital-speak for Traumatic Brain Injury, or a concussion. Either way, it had been quite a bump to the head in my case.

I don’t remember the doctor’s name, but I do recall finally getting home that weekend, and eventually sleeping after having been awake for I don’t know how long. I may’ve even had the presence of mind before drifting off to have my wife contact folks in my office to let them know that I wasn’t abandoning them, but that the doctor had been quite clear that I was not to drive, watch television and certainly not use a computer. But I wasn’t sure.

What followed was a week or more at home, taking time to recover from what had been a simple bump to the head, one that did more than slur and stammer my speech (resulting in some aphasia): it gave me a lovely headache and increased the volume and tone of my chronic tinnitus. I learned (as I’d clearly forgotten my basic medical training) that the primary recommendation was rest, to allow the brain to heal. Rest is not something that comes easily to me: years ago, I spent two weeks directing a national product launch tour from my bedroom after a spill at the office laid me up, working the phone and effortlessly typing away on my laptop at what was then my standard 80 words (or more) per minute. So certainly this wouldn’t be any different.

But I’m a poor patient, and after attempting to read through and respond to some email, I quickly realized that this was not a good idea: watching television and reading text on the screen was tiring and even made me a little nauseated. My doctor said it would be some time, at least a couple of weeks, before I could or should return to my daily routine at work. Rest and recovery, I thought – particularly in our busiest month – were a luxury. But I agreed, and waited to feel better, recovering enough after a couple of weeks away to get back to work.

After lots of rest, I began to ease back into life at the office, first logging on from home and gradually getting back to my old routine. I found it more difficult to return to old habits of running through pages and pages of email, spending hours on the phone, and reading through stacks of reports. What was most evident and troublesome was my inability to easily and quickly speak my thoughts: it seemed as if there was a disconnect, the words in my brain only eventually making their way to my vocal cords and out of my mouth. I appreciate my very patient comrades at Microsoft for their understanding.

To someone who has spent a lifetime relying on their people and verbal skills, not being able to speak clearly and in rapid-fire sentences is more than frustrating; it’s infuriating. And when you’re used to keeping any number of useful (and useless) bits of information in your head, you find it challenging to return to a notepad to jot down the things you might forget. Doubly challenging when you happen to rely so much on OneNote, given the move away from paper.

As I returned to work, I knew that I would have to find ways to cope with the limitations on how I used the computer: I was lucky if I could type ten or fifteen words a minute, much slower than I could think, and somewhat slower than I could get the words out after a month of recovery. I’ve been fortunate enough to work closely with the folks in our product groups and our cross-company accessibility team, as well as with some of our amazing people who use our technology to accomplish basic productivity tasks after suffering much more debilitating injuries. I knew the technology and applications existed. I had even purchased some of these software solutions to help my youngest son dictate his work when he experienced challenges early on with typing on the computer. But I hadn’t used them myself beyond the basic dogfooding we do on new product releases, given that several of the core capabilities are offered in Windows.

My hand-eye coordination hadn’t been impacted as much as my speech, concentration and typing, so I was able to mouse around and use my Surface tablet and Sony all-in-one touchscreens without much of an issue. But I found that I needed to enlarge what was on the screen to make it easier to see my work and to navigate around the desktop.

So, I decided to use some of our resources to determine what I could do out of the box with our products and services to get my job done. Being somewhat stubborn and independent, I decided to do this myself, leveraging what was available in our office, on our websites and from our partners.

Having worked in the Windows group, and with Rob Sinclair and his team on accessibility solutions, I knew that Windows 7, Windows 8 and Office 2013 (where I spend most of my day) had built-in capabilities and companion programs to make it easier to use the computer. And I am a big fan of folks like Jenny Lay-Flurrie, Kelly Ford, (the late) Michael Kaplan and many others at Microsoft who are not only making great contributions at the office and in our industry, but are strong advocates and adamant voices for the customers we serve.

I have close to 20/20 vision, so I didn’t need to take advantage of Narrator, our bundled screen reader, which reads aloud the text on the screen. But I did use the Magnifier in Windows 7 to make it easier to see parts of the screen as I moved my mouse over a document. And I found I regularly used the zoom feature in Internet Explorer and Office 2013 to magnify the documents I worked on and the pages I visited. I also optimized my display, changing the options for dialog boxes and the attributes of the cursor to make them easier to see and notice on large displays. Changing the desktop screen resolution and adding a second, larger 40” screen to my all-in-one computer at home made it easier to view materials and work in the same familiar flow, without the need to squint.
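
(For the tinkerers in the audience: most of these built-in tools can be scripted as well as clicked. Here’s a minimal Python sketch of my own that simply launches Magnifier from a script, assuming a Windows machine where magnify.exe is on the path, as it normally is. It’s just an illustration of the idea, not anything we ship.)

```python
# Minimal sketch: launch the built-in Windows Magnifier from a script.
# Assumes it runs on Windows, where magnify.exe ships with the OS.
import subprocess


def start_magnifier() -> subprocess.Popen:
    # Once running, Magnifier responds to Win + Plus / Win + Minus to zoom
    # and Win + Esc to close.
    return subprocess.Popen(["magnify.exe"])


if __name__ == "__main__":
    proc = start_magnifier()
    print(f"Magnifier started (pid {proc.pid}); press Win+Esc to close it.")
```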

I also found myself using the keyboard more and more, rather than the mouse: Mouse Keys to move the pointer from the keyboard, and Sticky Keys to enter multi-key combinations one press at a time rather than holding a modifier key down while remembering the rest of the combination to unlock Windows or log on. I also made it easier to use Windows by preventing the automagical arrangement of windows when they’re moved close to the edge of the screen.
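
(That last tweak lives in the Ease of Access Center, but if you prefer to script it, here’s a rough Python sketch using the standard winreg module. The registry location, the WindowArrangementActive value under HKEY_CURRENT_USER\Control Panel\Desktop, is where I believe Windows keeps this setting; treat the names as my assumption, back up your registry first, and expect to sign out and back in before the change takes effect.)

```python
# Rough sketch: turn off automatic window arrangement (Snap) via the registry.
# The key and value names below are an assumption about where Windows stores
# this setting; back up your registry before experimenting.
import winreg

DESKTOP_KEY = r"Control Panel\Desktop"


def disable_window_snap() -> None:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, DESKTOP_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        # "0" disables the automatic arrangement; "1" re-enables it.
        winreg.SetValueEx(key, "WindowArrangementActive", 0,
                          winreg.REG_SZ, "0")


def snap_enabled() -> bool:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, DESKTOP_KEY) as key:
        value, _ = winreg.QueryValueEx(key, "WindowArrangementActive")
        return value == "1"


if __name__ == "__main__":
    disable_window_snap()
    print("Snap enabled:", snap_enabled())
```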

One of the biggest challenges for me was typing. I had tested and used some of the capabilities in Windows 7 and Windows 8 to use my voice to control the computer with basic commands and simple dictation. It takes a while and some training before you can start using Speech Recognition reliably. But after a while of becoming familiar with the basic commands, controlling the computer and moving around a document during dictation are relatively easy, but rudimentary in my experience.
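
(Speech Recognition itself is set up through its own wizard rather than code, but for the curious, here’s a rough idea of what dictation looks like programmatically. This sketch uses the third-party SpeechRecognition package for Python, not the built-in Windows feature I describe above, and it assumes you’ve installed that package along with PyAudio and are comfortable sending a snippet of audio to its free web recognizer.)

```python
# Rough illustration of dictation in code, using the third-party
# SpeechRecognition package (pip install SpeechRecognition pyaudio).
# This is NOT Windows Speech Recognition; it just sketches the idea.
import speech_recognition as sr


def dictate_once() -> str:
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # brief calibration
        print("Speak now...")
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio)  # free web recognizer
    except sr.UnknownValueError:
        return "(could not understand the audio)"
    except sr.RequestError as err:
        return f"(recognition service unavailable: {err})"


if __name__ == "__main__":
    print("You said:", dictate_once())
```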

Then I remembered the copy of Dragon NaturallySpeaking I’d purchased for my son. I’d found the software to be very good at dictation and voice recognition (kudos to our robust set of third-party developers!), better than our own solution, particularly at parsing what I’d said and transcribing it automatically into Word or Outlook on the screen. The user interface is intuitive, and overall the application works quite well and is extremely fast and accurate – the key reasons I purchased version 11 (and happily upgraded to 11.5). Recognition also improves with use, and it transcribed what I said faster than I could get the words out, and with great accuracy. The biggest challenge was remembering how to enter punctuation, mark the ends of sentences and handle overall formatting. One of the first public results was my blog post last July on dotless domains: Dragon handled the words like a champ, with a minor amount of typing and editing. The same was true for my last two posts: although I liberally leveraged past posts and materials, I was able to edit and add to them with relative ease.

What I found myself using more and more were the Speech features on my Windows Phone. Almost by accident, I found that not only could I use the large tiles on the phone to make a call, send a text or search for something on Bing, I could also initiate the action with my voice. So much easier than hunting and pecking for an app or a tile. It made it so much easier to initiate a phone call to my wife, get my voice mails, open my emails or send a text. But the real win for me was using my Windows Phone to dictate email messages instead of typing with my thumbs. I was impressed that even with my stammered speech, my Windows Phone correctly interpreted what I said with incredible accuracy. So much so that I used it to easily and effortlessly create documents in OneNote, which would then be available the next time I got in front of any of my connected devices. Further, I set up my Windows Phone so that it would read my incoming text messages aloud, saving me from having to read them on screen… a feature I wish I had for email, too. (I learned recently that https://www.drivesafe.ly/ is coming to Windows Phone.)

A year after my injury, I still find myself using many of the features I found last summer while recuperating. I now type about forty to fifty words a minute (still below my norm), and regularly use my Windows Phone for dictation, and Dragon on my Windows devices to dictate emails and documents. (That’s how I drafted much of this post today.) In all, I find that I’m more productive in many ways, plus it’s much faster and less tiring – one of the reasons I save my focus for my work, and less for long blog posts and (thankfully, I’m sure many recipients think) emails. I find it easier to post on social media (Twitter publicly and Yammer internally) on walks or on the shuttle between meetings, using my phone to dictate comments in the Messaging or Outlook app, which I then easily copy and paste into the social app. I also still have my standing desk (although I take more walking and seated breaks) with multiple large screens, using IE and Office to magnify my work, all with a desktop screen resolution that’s more to my liking. For the tinnitus, I stream and play music from my Xbox Music account over headphones, and I use a headset or the integrated mic on my Surface Pro with speakers to sync with people over Lync.

I was fortunate enough to recover from the event generally unscathed. I’ve since moved the items around in my office, and I rarely place anything on top of my bookshelf anymore for fear of repeating my uncoordinated move of a year ago. And I have a deeper appreciation for the accessibility features and technology we build into our products at Microsoft and services that many rely upon.

I’ve read that seven out of ten people in the world will experience either permanent or temporary disability at some point in their life, and learned that we have a lot to live up to when it comes to Bill Gates’ vision “to create innovative technology that is accessible to everyone and that adapts to each person's needs.” Having worked closely and first-hand with the requirements of people with disabilities over the last decade, I like to think that I have a good appreciation for the need, but my own personal experience was something more than what I had seen and heard third-hand through family members and friends. We have a great deal of work to do to make our devices and services more transparent and easy to use. I’m happy to know that our devices and services teams are dedicated to and focused on knocking down the barriers that people with disabilities encounter and helping them make the most of the tools we offer.

And I’ll get better about posting here, in addition to my updates on Twitter.

Additional links

Post also available at https://aka.ms/M3-060114 .

Comments

  • Anonymous
    June 02, 2014
    This was eye opening. How does the technology get in the way of or impede the user, considering the different limitations individuals find in everyday, off the shelf products? What needs to be done to make services more universally accessible?

  • Anonymous
    June 25, 2014
    Humbling. Many do not understand the challenge of accessibility until faced with it themselves or through an acquaintance. www.microsoft.com/enable: accessibility of Microsoft products, assistive technology, keyboard shortcuts, disability types, demos, tutorials, news, and resources for educators.