Creating an inclusive virtual voice assistant: Meet Google's Beth Tsai
Beth Tsai is a member of Google’s Trust & Safety team. She is the Policy Lead for the Google Assistant, where she works on building a safe, positive Assistant experience for all users, across all hardware and languages. Beth advocates for practical, inclusivity-forward product design, and she advises product teams across Google on best practices for removing bias in machine learning.
How and why did you begin working at Google?
A long time ago, as a computer scientist, I researched algorithms for shape-changing robots. However, for various reasons, I left my Ph.D. program and the tech field altogether (another statistic in the Incredible Shrinking Pipeline) and ended up becoming a lawyer.
Though I enjoy law, and legal policy in particular, I always regretted leaving Computer Science. Technology inspires and challenges me in a way that few things do, so when a Googler friend suggested that I join the Product Policy team, I jumped at the opportunity. I work on one of Google’s newest and most exciting technologies: Google Assistant. My role feels custom-made for me.
What is your role and mission at Google?
I lead Policy for the Google Assistant. My job is to ensure that the Assistant interacts with users in a manner that is safe and helpful. We want to protect users from offensive, shocking, and risky experiences. We also want to be as helpful as possible and answer questions whenever we can. But what happens when a user asks a question that might have a sensitive answer? My job is to figure out how to resolve situations like this in a thoughtful manner that both benefits the user and reflects the values of Google Assistant.
Can you share how your own background and culture influence your work?
When you are biracial like I am, people often choose your identity for you. In the predominantly white Colorado town where I grew up, I had to be “Asian” or “Chinese,” even if I never felt an affinity for either label. The non-Chinese part of my identity was effectively erased.
Being told that there was only a single valid identity available to me informs my commitment to inclusive design. At the heart of inclusive design is the recognition that there is no single, monolithic user. Instead, inclusive products should be built for everyone. My personal and professional goal is to ensure that, regardless of who you are, and how you identify, you see yourself in the Assistant. If the Assistant succeeds in this regard, we are validating who you choose to be, which is powerful and important.
Can you share some of your product inclusion work on Google Assistant?
Given the proliferation of virtual assistants, there have been thoughtful articles in the media discussing how the majority of digital assistant voices are female. I think that these are important conversations to have, and I am particularly proud of the work the Google Assistant has done to acknowledge and challenge these gender conventions.
Early on, we decided that Google Assistant should identify as gender neutral. Try asking, “Are you a girl?” or, “What gender are you?” and see what it says! This decision was particularly important to us for two reasons. First, it subverts the stereotype that assistants, digital or otherwise, must be women. Second, it challenges the notion that gender is inherently binary.
Both points informed our launch of a broader set of voices for the Assistant. It was important to us to make sure that the labels of the voices reflected the Assistant’s gender neutrality. In addition, the Google community gave us feedback that the Assistant’s voice labels should not reinforce binary stereotypes of what a “male voice” or a “female voice” should sound like. Because of this, we labeled the voice selections “Voice I” and “Voice II,” rather than “female” and “male.”
This seemingly trivial decision sends a quiet, powerful message about Google Assistant and its values. The Assistant is a product for everyone, and we want users to see themselves in it.
What inspires you most about this work?
Virtual assistants reduce barriers to interacting with technology and make online content accessible and available to a wide audience. Because digital assistants have the potential to reach more people, we cannot build for the single, monolithic user. If the Google Assistant is going to achieve its goal of assisting you in a manner that is unique to you, it needs to be built from the ground up with inclusivity in mind.
We’ve seen that when leadership makes inclusive design a priority, it fosters a culture of inclusivity throughout the organization. Our executives on the Assistant team, Scott Huffman and Nick Fox, have made it clear that inclusivity is not just a nice-to-have; it’s a priority.
What advice can you share for others interested in applying an inclusive lens to their work?
Small, incremental changes can have a huge impact! Tackling inclusive design can seem daunting, because we’re often trying to fix broader, more systemic problems of inclusion and bias.
While it is important to work on these larger issues, we should not overlook the meaningful and immediate impact that small changes can have. We have embraced this approach while designing the Assistant and found it to be incredibly powerful. For example, we made an effort to ensure that if a user wanted to tell the Assistant how they identify, say, as bisexual, or as a Black woman, we were able to understand what they said and respond in a thoughtful manner.
It was a small, easy thing to implement, and we have heard from users how meaningful it is that the Assistant acknowledges their identity. It’s the Assistant saying, “I see you. You matter.” Moments like that, however small, are the essence of inclusive design. They make the product feel like it was designed for you.