



Jane recently passed her interpreter exam and is now officially a certified interpreter.
Jane receives her first interpreting job, scheduled for the next morning at the University of Washington: a deaf student has requested an interpreter for office hours about her physical computing class.
Jane is a bit concerned: despite her interpreting training, she has no idea what physical computing is, and she wants to do well on her first job.
The next day, Jane gets ready to leave for work. She puts on her augmented reality (AR) contact lenses, which will assist her with the interpreting job. She hopes they will help her interpret even an unfamiliar topic.




Jane arrives on campus and enters the building where the job is scheduled.
This is Isabelle. She is the deaf student who requested an interpreter.
Jane and Isabelle have a conversation in ASL, and Isabelle tells Jane that she is stuck on one of her physical computing projects.
Once Jane has a basic understanding of the context, she inputs her preferences so that the lenses' artificial intelligence can make better predictions and suggestions during interpretation.



This is Professor Harrington. She is finally ready for Isabelle to come in.
During the conversation, Jane does not know the technical language and terms being used. Luckily, she is wearing her AR lenses, which process what is being said in real time.
Jane sees captions with unfamiliar specialized terms highlighted, along with a corresponding visual that helps her understand what each highlighted term means and how it is being used in context.
Isabelle and Professor Harrington end up having a fruitful conversation. Isabelle now understands her homework, making Jane's first interpreting job a success.