© 2019 by Rishma Mendhekar



How can we design an online ordering system that works for everyone?


Our industry partner's online ordering system is confusing and inflexible. How do we redesign it for accessibility?


High-fidelity prototype of an accessible online ordering system

My role

  • Designed website flow and structure
  • Authored chatbot script
  • Conducted interviews, cognitive walkthroughs, heuristic evaluation, and other methods


Task analysis, cognitive walkthroughs, semi-structured interviews, personas, affinity mapping, wireframing, Wizard of Oz


August - December 2018


Adobe XD + Illustrator + Photoshop, Sketch, Treejack, VoiceOver


The Ask

My team and I received our prompt, creating an accessible online ordering system, from Schlotzsky's, a Southern sandwich chain, in late August. Although their main website meets WCAG AA accessibility standards, the bulk of online ordering happens on a third-party website. Our industry partner had two main problems with this:
  1. Customers find the third-party platform confusing
  2. The third-party platform is not accessible, and our industry partner does not have access to its code
That's where we came in: our task was not only to design an integrated, accessible online ordering webpage up to WCAG AA standards, but also to think outside the box (or outside the webpage) to create novel online ordering alternatives.

Evaluating the present


The Problem

We used a mix of designer-led evaluation techniques (task analysis and cognitive walkthroughs) and user-led evaluation techniques (semi-structured interviews, tree testing, think-aloud observations, affinity mapping, empathy maps, and journey maps) to gain a holistic understanding of the existing website's problems.
Along with usability, we also needed to evaluate the system for accessibility. The team created a shortlist of WCAG 2.1 guidelines based on W3C's most recent recommendation to familiarize ourselves with the standards we would consider while creating design alternatives. To assess the accessibility of our industry partner's online ordering system, we conducted a competitive analysis against five competing restaurants.
A journey map of the original online ordering flow.
We acknowledge that automated accessibility testing has its flaws and cannot replace accessibility testing with a real user; however, given our time constraints and the fact that we would be redesigning the entire system from scratch, automated testing tools provided an appropriate baseline.
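For readers unfamiliar with WCAG AA, many of the criteria on a shortlist like ours translate directly into markup decisions. The fragment below is an invented illustration (the sandwich-menu markup is not taken from the actual site) of three criteria an automated checker can flag:

```html
<!-- Non-text content (WCAG 1.1.1): images need meaningful alt text -->
<img src="original-sandwich.jpg"
     alt="The Original: ham, salami, and melted cheese on sourdough">

<!-- Info and relationships (WCAG 1.3.1): label form controls explicitly -->
<label for="bread">Choose a bread</label>
<select id="bread" name="bread">
  <option>Sourdough</option>
  <option>Wheat</option>
</select>

<!-- Name, role, value (WCAG 4.1.2): use real buttons rather than clickable
     divs, so keyboard and screen-reader users can find and activate them -->
<button type="submit">Add to Cart</button>
```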

Evaluating the present

The Conclusions

By synthesizing our results from the methods discussed above, we came to nine broad conclusions that would serve as a jumping-off point for our design alternatives.
  1. The switch to the unbranded third-party platform confuses users.
  2. Users are frustrated by the third-party platform because they lose their progress without meaning to.
  3. Language used on the main website and third-party platform relies too heavily on jargon.
  4. Too many similar options are presented, which users find difficult to distinguish.
  5. Customization of order items is useful, but is presented in a confusing way.
  6. Users want tailored, concise suggestions at the right moment.
  7. Navigation is tedious and has many redundant steps and loops.
  8. The inability to search or filter items hinders users.
  9. Appropriate feedback for user actions is not provided.




After brainstorming based on our nine conclusions, we chose three design alternatives to move forward with: an updated website, a chatbot, and a voice interface.
To narrow down which prototype we would move forward with, we planned evaluations for each. For the chatbot, controlled by me, we created a Wizard-of-Oz Facebook account that our participants would message to place an order. To test the voice interface, we wrote a script that one of our team members would recite as our participants placed an order.
Chatbot and webpage prototypes.
One of our team members built the accessible website in jQuery based on our sketches. Although a live website might be considered too high-fidelity for this stage, the team agreed it would be the best way to test how compatible our basic design was with assistive technologies.
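The case study doesn't include the prototype code, but one pattern that makes a dynamic ordering page work with screen readers is announcing cart changes through an ARIA live region. The sketch below is hypothetical (the function name, message wording, and element id are invented for illustration, not taken from the actual prototype):

```javascript
// Builds the text a screen reader should announce after a cart change.
// Kept as a pure function so the wording can be tested without a browser.
function cartAnnouncement(itemName, quantity, cartTotal) {
  const items = quantity === 1 ? "1 item" : quantity + " items";
  return itemName + " added to cart. Cart now has " + items +
         ", total $" + cartTotal.toFixed(2) + ".";
}

// In the page, the message would be written into a polite live region so
// assistive technology announces it without stealing focus, e.g.:
//   <div id="cart-status" aria-live="polite" class="visually-hidden"></div>
//   $("#cart-status").text(cartAnnouncement("The Original", 2, 17.98));
```

Using `aria-live="polite"` (rather than `assertive`) lets the screen reader finish what it is currently saying before reporting the update.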


We conducted two stages of evaluation with these prototypes. The first stage was conducted with two participants with visual impairments and one without. After ordering a sandwich and thinking aloud while using each of the prototypes, participants answered a series of questions about their comfort and satisfaction levels, accessibility, and which prototype they preferred most.


For the second stage of the evaluation, we decided to move forward with a hybrid of the chatbot and website based on feedback from the first stage. To simulate the chatbot as part of the website, we presented the website on one half of the computer screen and Facebook Messenger on the other half.
Initial sketch for the second stage of evaluation.
Once again, our three new participants (one with visual impairments and two without) ordered a sandwich using the prototype while thinking aloud and answered questions about accessibility and their comfort and satisfaction levels.
All three participants interviewed in stage two felt that the chatbot and website would be better as separate platforms, so we decided to develop the website as our final deliverable.



Our Design

Before conducting the third round of evaluations, the team made a number of changes based on the first two rounds of feedback, including making promos more visible, creating a favorites section for returning customers, clarifying language, and adding visible prices to add-on ingredients. Our next step was to evaluate this updated prototype with experts and participants so that we could shape it into our final deliverable.
With our updated design in hand, we conducted three expert interviews and six participant interviews. We conducted cognitive walkthroughs with our two accessibility experts to pinpoint problems customers would face while ordering a sandwich; we conducted a heuristic evaluation with our usability expert to ensure our design was up to usability standards.
An accessibility expert evaluating our prototype
With our participants, we conducted a three-part evaluation: a semi-structured interview and think-aloud, a benchmark test with the website of one of Schlotzsky's competitors, and a System Usability Scale (SUS) survey. These methods allowed us to understand users' internal journeys while ordering a sandwich on our website. The benchmark test, conducted with one of the most accessible websites we found during our initial competitive analysis, allowed us to understand whether our design stood up against competitor sites in terms of usability and accessibility. One of our participants had visual impairments while five did not.
Initial wireframe.

Presenting the deliverable

The Final Product

After implementing the changes from our final round of evaluation, we were ready to present our final recommendation to the Schlotzsky's team.

With our updated design, we solved the main problems we found on the original website. Not only did we use clear language on the site, but we also worked with our participants to create a logical ordering flow on the Schlotzsky's website without transporting customers to a third-party site. We also streamlined the item customization process: we provide users with a customization summary and add-on ingredient prices so that they can review their changes at any point. The "Add to Cart" button lets customers skip the customization page entirely when they don't want to customize.
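The customization summary described above is essentially a running receipt for one item. A minimal sketch of the idea in plain JavaScript (the function name, item names, and prices are invented for illustration, not taken from the final design):

```javascript
// Builds the customization summary shown alongside an order item, listing
// the base item and each add-on with its price plus a running total, so
// customers can review their changes at any point.
function customizationSummary(baseName, basePrice, addOns) {
  const lines = [baseName + ": $" + basePrice.toFixed(2)];
  let total = basePrice;
  for (const addOn of addOns) {
    lines.push("+ " + addOn.name + ": $" + addOn.price.toFixed(2));
    total += addOn.price;
  }
  lines.push("Total: $" + total.toFixed(2));
  return lines.join("\n");
}
```

Rendering the summary as plain text lines also keeps it straightforward for a screen reader to read back in order.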
Most importantly, we created a fully accessible ordering flow. By starting the design process through the lens of accessibility, the team was able to make design and code choices that allowed for maximum ease of use with assistive technologies like screen readers and braille displays. By refining the ordering flow and using clear language, we achieved cognitive accessibility as well.
The final prototype presented to the Schlotzsky's team can be found here: 
I had the opportunity to present on this project and the importance of web accessibility at World Information Architecture Day 2019 in Atlanta, GA, with my team member James McDowell. By sharing our experiences, we hope to inspire others to incorporate accessibility into their design process.
A list of resources for web accessibility can be found on the following page: http://bit.ly/WIADaccess