Imagine being blind and trying to attend a virtual event. Try that next time you stage one.

How do you make a virtual event accessible for people who are blind or visually impaired?

When I started work on Sight Tech Global back in June this year, I was confident that we would find the answer to that question pretty quickly. With so many virtual event platforms and online ticketing options available to organizers, we were sure at least one would meet a reasonable standard of accessibility for people who use screen readers or other assistive technologies to navigate the Web.

Sadly, I was wrong about that. As I did my due diligence and spoke to CEOs at a variety of platforms, I heard a lot of “we’re studying WCAG [Web Content Accessibility Guidelines] requirements” or “our developers are going to re-write our front-end code when we have time.” In other words, these operations, like many others on the Web, had not taken the trouble to code their sites for accessibility at the start, which is the least costly and fairest approach, not to mention the one compliant with the ADA.

This realization was a major red flag. We had announced our event dates – Dec 2-3, 2020 – and there was no turning back. Dmitry Paperny, our designer, and I did not have much time to figure out a solution. No less important than the dates was the imperative that the event’s virtual experience work well for blind attendees, given that our event was really centered on that community.

We decided to take Occam’s razor to the conventions surrounding virtual event experiences and answer a key question: What was essential? Virtual event platforms tend to be feature heavy, which compounds accessibility problems. We ranked what really mattered, and the list came down to three things:

  • live-stream video for the “main stage” events
  • a highly navigable, interactive agenda
  • interactive video for the breakout sessions

We debated adding a social or networking element as well, and decided that was optional unless an easy, compelling solution turned up.

The next question was which third-party tools we could use. The very good news was that YouTube and Zoom get great marks for accessibility. People who are blind are familiar with both, and many know the keyboard commands to navigate the players. We learned this largely by word of mouth at first and then found ample supporting documentation from YouTube and Zoom. So we chose YouTube for our main stage programming and Zoom for our breakouts. It helps, of course, that both are very easy to incorporate into a website, which became our plan.
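To give a sense of how simple that integration can be, here is a minimal sketch of an accessible embed; the IDs are placeholders, not our actual streams. The title attribute on the iframe gives screen reader users a meaningful name for the player, and breakout sessions can simply link out to Zoom, which handles its own keyboard and screen reader support.

    <!-- Main stage: YouTube live player. The title attribute names the
         iframe for screen reader users. VIDEO_ID is a placeholder. -->
    <iframe
      src="https://www.youtube.com/embed/VIDEO_ID"
      title="Sight Tech Global main stage (live stream)"
      width="960" height="540"
      allow="autoplay; encrypted-media"
      allowfullscreen>
    </iframe>

    <!-- Breakouts: a plain link out to Zoom. MEETING_ID is a placeholder. -->
    <a href="https://zoom.us/j/MEETING_ID">Join the breakout session in Zoom</a>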

Where to host the overall experience was the next question. We wanted to be able to direct attendees to a single URL to join the event. Luckily, we had already built an accessible website to market the event. Dmitry had learned a lot in the course of designing and coding that site, including the importance of thinking about both blind and low-vision users. So we decided to add the event experience to the site itself, rather than using a third-party event platform, by adding two elements to the site navigation: Event (no longer live on the site) and Agenda.

The first amounted to a “page” (in WordPress parlance) that contained the YouTube live player embed and, beneath that, text descriptions of the current and upcoming sessions, along with prominent links to the full Agenda. Some folks might ask: why place the agenda on a separate page? Doesn’t that make it more complicated? Good question, and the answer was one of many revelations that came from our partner Fable, which specializes in usability testing for people with disabilities. The answer, as we found time and again, was to imagine navigating with a screen reader, not your eyes. If the agenda were beneath the YouTube player, it would create a cacophonous experience: imagine trying to listen to the programming and at the same time “read” (as in “listen to”) the agenda below. A separate page for the agenda was the right idea.
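For readers who want to picture the markup, here is a rough, simplified sketch of how an Event page along those lines can be structured; it is illustrative, not our actual WordPress template. Semantic sections and headings let screen reader users jump straight to the part they want.

    <main>
      <h1>Sight Tech Global: Main Stage</h1>

      <!-- YouTube live player embed goes here (see the iframe sketch above) -->

      <section aria-labelledby="now-heading">
        <h2 id="now-heading">Now playing</h2>
        <p>Current session title, speakers, and a one-line description.</p>
      </section>

      <section aria-labelledby="next-heading">
        <h2 id="next-heading">Up next</h2>
        <p>The following session and its start time.</p>
      </section>

      <p><a href="/agenda/">View the full agenda</a></p>
    </main>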

The Agenda page was our biggest challenge because it contained a lot of information, required filters and also, during the show, had different “states,” as in which agenda items were “playing now” versus upcoming versus already concluded. Dmitry learned a lot about the best approach to drop-downs for filters and other details to make the agenda page navigable, and we reviewed it several times with Fable’s experts. We nonetheless decided to take the fairly unprecedented step of inviting our registered blind event attendees to join us for a “practice event” a few days before the show in order to get more feedback. Nearly 200 people showed up for two sessions. We also invited blind screen reader experts, including Fable’s Sam Proulx and Facebook’s Matt King, to join us to answer questions and sort out the feedback.
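A couple of the patterns that came out of those reviews can be sketched in markup. The session titles and times below are placeholders; the point is that native form controls need no custom keyboard handling, and that a session’s state reads out as plain text rather than being conveyed only by color or position.

    <!-- A native select is announced correctly by JAWS, NVDA and VoiceOver
         without any custom scripting. -->
    <label for="day-filter">Filter sessions by day</label>
    <select id="day-filter" name="day">
      <option value="all">Both days</option>
      <option value="dec-2">December 2</option>
      <option value="dec-3">December 3</option>
    </select>

    <ul>
      <!-- The state ("Playing now") is part of the heading text, so a screen
           reader user hears it immediately; aria-current also flags the
           active item to assistive technology. -->
      <li aria-current="true">
        <h3>Playing now: Opening session</h3>
        <p>10:00 AM Pacific. <a href="/agenda/opening-session/">Session details and Q&amp;A</a></p>
      </li>
      <li>
        <h3>Up next: Breakout discussions</h3>
        <p>10:45 AM Pacific. <a href="/agenda/breakouts/">Session details and Q&amp;A</a></p>
      </li>
    </ul>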

It’s worth noting that there are three major screen readers: JAWS, which is used mostly by Windows users; VoiceOver, which is built into all Apple products; and NVDA, which is open source and works on PCs running Microsoft Windows 7 SP1 and later. They don’t all work in the same way, and the people who use them range from experts who know hundreds of keyboard commands to occasional users with more basic skills. For that reason, it’s really important to have expert interlocutors who can help separate good suggestions from simple frustrations.

The format for our open house (session one and session two) was a Zoom meeting, where we provided a briefing about the event and how the experience worked. Then we provided links to a working Event page (with a YouTube player active) and the Agenda page and asked people to give it a try and return to the Zoom session with feedback. Like so much else in this effort, the result was humbling. We had the basics down well, but we had missed some nuances, such as the best way to order information in an agenda item for someone who can only “hear” it versus “see” it. Fortunately, we had time to tune the agenda page a bit more before the show.

The practice session also reinforced that we had made a good move to offer live customer support during the show as a buffer for attendees who were less sophisticated in the use of screen readers. We partnered with Be My Eyes, a mobile app that connects blind users to sighted helpers, who use the blind person’s phone camera to help troubleshoot issues. It’s like having a friend look over your shoulder. We recruited 10 volunteers and trained them to be ready to answer questions about the event, and Be My Eyes put them at the top of the list for any calls related to Sight Tech Global, which was listed under the Be My Eyes “event” section. Our event host, the incomparable Will Butler, who happens to be a vice president at Be My Eyes, regularly reminded attendees to use Be My Eyes if they needed help with the virtual experience.

A month out from the event, we were feeling confident enough that we decided to add a social interaction feature to the show. Word on the street was that Slido’s basic Q&A features worked well with screen readers, and in fact Fable used the service for its own projects. So we added Slido to the program. We did not embed a Slido widget beneath the YouTube player, which might have been a good solution for sighted participants; instead, we added to each agenda session a link to a standalone Slido page, where attendees could add comments and ask questions without getting tangled in the agenda or the livestream. The solution ended up working well, and we had more than 750 comments and questions on Slido during the show.

When Dec. 2 finally arrived, we were ready. But the best-laid plans often go awry: we were only minutes into the event when our live closed captioning broke. We decided to halt the show until we could bring it back up live, for the benefit of deaf and hard-of-hearing attendees. After much scrambling, captioning came back. (See more on captioning below.)

Otherwise, the production worked well from a programming standpoint as well as an accessibility one. How did we do? Of the 2,400+ registered attendees, 45% said they planned to use screen readers. When we surveyed those attendees immediately after the show, 95 replied, and they gave the experience a 4.6/5 score. As for the programming, attendees overall (we asked everyone this time; 157 replied) gave us a score of 4.7/5. Needless to say, we were delighted by those outcomes.

One other note concerned registration. At the outset, we also “heard” that one of the event registration platforms was “as good as it gets” for accessibility. We took that at face value, which was a mistake. We should have tested, because comments from people trying to register, along with a low turnout of registrations from blind people, revealed after a few weeks that the registration site may have been better than the rest but was still really disappointing. It was painful, for example, to learn from one of our speakers that alt tags were missing from images (and there was no way to add them) and that screen reader users had to tab through mountains of information in order to get to actionable links, such as “register.”
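For anyone unfamiliar with those two failures, here is a generic illustration (not the vendor’s code) of what getting them right looks like: images carry alt text, and a skip link lets screen reader and keyboard users jump straight to the actionable part of the page instead of tabbing through everything above it. The image and name below are made up.

    <!-- Without alt text, a screen reader announces only the file name,
         or nothing at all. -->
    <img src="speaker-headshot.jpg" alt="Portrait of keynote speaker Jane Doe">

    <!-- A skip link near the top of the page... -->
    <a href="#register">Skip to registration</a>

    <!-- ...jumps past everything else to the registration form. -->
    <form id="register" aria-label="Event registration">
      <!-- registration fields -->
    </form>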

As we did with the website, we decided that the best course was to simplify. We added a Google Form, which is highly accessible, as an alternative registration option. We instantly saw registrations increase strongly, particularly among blind people. We were chagrined to realize that our first choice for registration had been excluding the very people our event intended to include.

We were able to use the Google Forms option because the event was free. Had we been trying to collect payment of registration fees, a Google Form would not have been an option. Why did we make the event free to all attendees? There were a couple of reasons. First, given our ambitions to make the event global and easily available to anyone interested in blindness, it was difficult to arrive at a universally acceptable price point. Second, adding payment as well as a log-in requirement to access the event itself would have created another accessibility headache. With our approach, anyone with the link to the Agenda or Event page could attend without any log-in or registration demand. We knew this would create some leakage in terms of knowing who attended the event (quite a lot, in fact, because we had 30% more attendees than registrants), but given the nature of the event we thought that losing out on names and emails was an acceptable price to pay for the accessibility benefit.

If there is an overarching lesson from this exercise, it’s simply this: event organizers have to roll up their sleeves and really get to the bottom of whether the experience is accessible or not. It’s not enough to trust platform or technology vendors, unless they have standout reputations in the community, as YouTube and Zoom do. It’s as important to ensure that the site or platform is coded appropriately (to WCAG standards, and using a tool like Google’s Lighthouse) as it is to do real-world testing to ensure that the actual, observable experience of blind and low-vision users is a good one. At the end of the day, that’s what counts the most.

A final footnote: although our event focused on accessibility issues for people who are blind or have low vision, we were committed from the start to include captions for people who would benefit. We opted for the best quality outcome, which is still human (versus AI) captioners, and we worked with VITAC to provide captions for the live Zoom and YouTube sessions and 3Play Media for the on-demand versions and the transcripts, which are now part of the permanent record. We also heard requests for “plain text” (no mark-up) versions of the transcripts in an easily downloadable format for people who use braille readers. We supplied those as well. You can see how all those resources came together on pages like this one, which contain all the information on a given session and are linked from the relevant section of the agenda.

 
