
Customer Interaction Provides Insight into Addressing LexisNexis Product Accessibility

May 19, 2020 (4 min read)


By David Lovell, Senior Content Visual Designer, Strategic UX & Product Design

LexisNexis re-emphasized its commitment to accessibility in 2017 when Min Xiong, the head of Content UX within the User Experience team, added this responsibility to her role. Within a short amount of time, the accessibility email inbox was refreshed and an updated LexisNexis Commitment to Accessibility was posted. As a member of her team, I worked with her to monitor customer requests and open dialogues with customers who require accessibility assistance with our products. These one-on-one sessions have been informative and have deepened our understanding of how our customers actually experience our products.

In preparation for these sessions, I researched the experiences of customers with disabilities. I wanted to understand what we might encounter so I could step my way through the discussions. Leading my first session was an entirely different experience. I began to understand what it means to navigate using only the keyboard, or to listen for a cue spoken by a screen reader (which is remarkably fast, by the way).

I also learned that a disability is often only temporary. For instance, a broken arm can make it difficult to type or use a mouse to navigate, but once it heals the customer can return to their normal workflow.

A disability is very likely something a person is either already living with or will experience within their lifetime. As such, it is far better, and less expensive, to design to the Web Content Accessibility Guidelines (WCAG) 2.0 Level A and AA as a product is developed than to retrofit them after the product is released.

Fortunately, LexisNexis understands this and already incorporates these guidelines into our product line. The issues our team engages with customers on tend to be items introduced as updates are released. These sessions provide helpful feedback on items that ordinarily would not be detected in our standard pre-release testing.

In one case study, a customer who is blind contacted us via email to discuss some issues she was experiencing in Lexis Advance Research, and we arranged a meeting to have those issues demonstrated to us. One was the difficulty she had working with the Client ID menu, where a customer can select the case or ID associated with their current research. She activated her screen reader and navigated to the menu, which was read as “Client ID menu.” Once the menu was activated, however, the screen reader did not announce that the menu was open. She could hear she was in the menu, but as she navigated the list of items the screen reader supplied only “Client” and “Edit Client” for each entry, without any identifiable information. She was hearing the generic labels for each item in the list. The menu had not triggered any accessibility concerns in our automated testing, and this experience showed how important it is to include manual testing as well.
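To make the first failure concrete, here is a minimal sketch in TypeScript of the disclosure pattern involved. The markup, element IDs, and handler are assumptions for illustration; the article does not show the actual Lexis Advance code. The point is that the aria-expanded attribute is what a screen reader uses to announce that a menu has opened.

```typescript
// Hypothetical markup, for illustration only:
//   <button id="client-id-button" aria-haspopup="true" aria-expanded="false"
//           aria-label="Client ID menu">Client ID</button>
//   <ul id="client-id-list" hidden>…</ul>

const button = document.querySelector<HTMLButtonElement>('#client-id-button')!;
const list = document.querySelector<HTMLUListElement>('#client-id-list')!;

button.addEventListener('click', () => {
  const isOpen = button.getAttribute('aria-expanded') === 'true';
  // Keeping aria-expanded in sync with the visual state is what lets a
  // screen reader announce "Client ID menu, expanded" when it opens.
  // If the attribute is missing or stale, the customer hears nothing.
  button.setAttribute('aria-expanded', String(!isOpen));
  list.hidden = isOpen;
});
```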

The automated test had confirmed the labels were there, but they lacked the detail the customer needed; the labels would have to be adjusted to include the specific record data. Understanding her frustration, and discussing what she expected to happen, allowed us to communicate that expectation back to the development team and get a recommended fix added to the development backlog.
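What such a fix might look like in practice: the sketch below contrasts the generic accessible name with one that carries the record data. The ClientRecord shape and label wording here are hypothetical. Notably, an automated checker would pass both versions, since a label is present either way; only a person can judge whether the label is meaningful.

```typescript
// Hypothetical record shape, for illustration only.
interface ClientRecord {
  id: string;
  name: string;
}

function renderClientRow(record: ClientRecord): HTMLLIElement {
  const row = document.createElement('li');

  const select = document.createElement('button');
  select.textContent = record.name;
  // Before the fix, every row carried the same generic label:
  //   select.setAttribute('aria-label', 'Client');
  // Including the record data gives the screen reader something
  // identifiable to read for each entry.
  select.setAttribute('aria-label', `Client: ${record.name} (${record.id})`);

  const edit = document.createElement('button');
  edit.textContent = 'Edit';
  edit.setAttribute('aria-label', `Edit Client: ${record.name} (${record.id})`);

  row.append(select, edit);
  return row;
}
```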

Another session with a customer who is blind involved an issue with running a search from the Lexis Advance home page. He was frustrated because characters were missing from the search terms he entered, which meant the results he received had nothing to do with his current work. As we watched him demonstrate the issue, we saw that the word wheel feature, which suggests possible terms, publications, documents, and more, was updating dynamically with each keystroke. As the system worked to receive his input, read out what had been entered, and update the entire list below the search field, it would occasionally drop a keystroke, and the incorrect search would be submitted.

We explained to him what we were witnessing. He wasn’t aware the terms below the search field were being updated dynamically; the feature wasn’t being announced by the screen reader, and since he had no interest in using it, we helped him turn it off. That adjustment improved his experience but didn’t completely resolve the issue: some keystrokes were still being missed. After moving to a different browser that was also supported on his system, he found the issue completely resolved.
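For readers curious about the underlying pattern, here is a rough sketch of how a dynamically updating suggestion list can coexist with a screen reader: debounce the updates and announce them through a polite live region. This is a generic illustration under those assumptions, not the actual word wheel implementation, and the IDs and helper function are invented for the example.

```typescript
const input = document.querySelector<HTMLInputElement>('#search-terms')!;
const status = document.querySelector<HTMLElement>('#suggestion-status')!;
// aria-live="polite" queues announcements until the screen reader is idle,
// so they do not collide with the echo of the customer's keystrokes.
status.setAttribute('aria-live', 'polite');

// Hypothetical stand-in for the real suggestion service.
function fetchSuggestions(term: string): string[] {
  return term ? [`${term} cases`, `${term} statutes`] : [];
}

let timer: number | undefined;
input.addEventListener('input', () => {
  // Debouncing means the list (and its announcement) updates only after
  // the customer pauses typing, rather than on every keystroke.
  window.clearTimeout(timer);
  timer = window.setTimeout(() => {
    const suggestions = fetchSuggestions(input.value);
    status.textContent = `${suggestions.length} suggestions available`;
  }, 300);
});
```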

In both of these instances, and others not mentioned, the customers expressed their gratitude that we reached out to them and listened. Neither issue could have been resolved as successfully without the customer demonstrating it on their own system. Fortunately, screen-sharing software makes this possible remotely over the internet.

LexisNexis benefited because we came to understand the issues our customers were experiencing. These meetings also allowed us to communicate that experience back to the product development teams while answering additional related questions. And finally, once a fix has been implemented, it is easier for us to confirm that the code change resolves the issue.

Engaging with customers as we seek to improve our products has a dramatic influence on our success in serving them.

____________________________

David has worked across a variety of LexisNexis products for 20+ years and joined the User Experience team in 2010. He has a Bachelor of Arts in Design from Brigham Young University.

For more on UX design and problem solving at LexisNexis, visit the LexisNexis Medium Design Blog.