Working for gender equity online: Meet Googler Nithya Sambasivan

By Bill Reeve | Staff Writer | Product Inclusion

Nithya Sambasivan is a senior user experience (UX) researcher working in the fields of Artificial Intelligence and Next Billion Users. She has led research with underserved communities in emerging markets, bringing their needs to the fore of Google’s product development. Her research has covered gender equity, Internet access, grid electrification, civic engagement, and more, and has influenced large-scale real-world products and infrastructures. Her work has won best paper awards and nominations at top-tier computer science conferences.

She and her team are researching gender equity online and in STEM (Science, Technology, Engineering and Mathematics) fields.


How and why did you begin working at Google?

I have been at Google for more than six years now, and I have always been a part of teams doing impactful work for people who live in emerging markets.

My research trajectory started over 12 years ago, when I first began working in the area of technology design for emerging markets, with a focus on underserved communities. I focused my PhD dissertation on technology design for low-income communities in India, such as slum communities, sex workers, and micro-enterprises. For example, in one project (with Microsoft Research India), I observed that slum communities in Bangalore had much greater access to mobile phones than ownership numbers suggested, due to the process of intermediation--asking someone else, usually a community member familiar with technology, to use the device on their behalf [1]. Such usage models challenge core paradigms in computing, like the identity model of “one user, one device”, and lead to new questions around privacy, usability, and metrics.

In another project (also with MSRI), I studied and designed a system for low-income sex workers in Bangalore, a highly stigmatized and economically poor population [2]. Back in 2010, when just over half the country had access to phones, the sex worker community had nearly 100% phone access. They kept their professional identities separate from their family identities--family members were told that they worked as domestic help or in the garment industry, not sex work--and these identity norms carried over to technology through dual-SIM or multiple phones. We built a phone broadcasting system to send voice reminders about sexually transmitted disease testing and loan payments, needs that emerged from my ethnographic research. The system was designed to bypass literacy constraints through voice recordings, respect community trust by using the voices of senior sex workers for the recordings, avoid revealing personally identifying information, and keep calls short to suit busy lives. The research led to new principles for designing technology for stigmatized and vulnerable communities that respects their privacy and lifestyle considerations.

When I finished my PhD, I knew that I wanted to continue working with and designing for underserved communities, on technology projects with an ethical backbone. While I was considering various options, I received an opportunity to come to Google.org to do user research for products in the areas of Internet access, electrification, and civic engagement. It was exciting to apply my work at Google scale and conduct it in a socially impactful way, while working with some of the smartest people in the world.


What is your role and mission at Google?

My work brings expertise and insight to solving hard, socially important design and technical problems for people in emerging markets, especially around increasing fairness and autonomy. I conduct fundamental research on emerging technology areas and tie the implications back to our core DNA of engineering and products. I am currently studying user practices, perceptions, and implications around AI technologies among people in emerging markets.

A part of my work involves identifying the right problems to research, using a combination of Google product team inputs, understanding of socio-economic landscapes, and technology trends. I then create a research approach to answer the core research questions, typically using qualitative and mixed empirical methods. Research methodologies for various underserved communities require special care and attention. The challenge is in ensuring deep, rigorous research that can produce long-lasting insights to impact new and existing efforts at scale. I often collaborate with university partners and external research groups, in addition to internal Google partners.

Then I work with various product initiatives at Google to incorporate global user goals, practices, and challenges into their roadmaps, strategies and designs. I regularly translate my research to various communities of practice (in collaboration with Google teams), such as academic publications for the research community, APIs and engineering frameworks for developers, and design guidelines for designers.


How did your own background influence your work?

I was born and raised in a middle-class family in Chennai, India, a coastal city famous for its dosas, cultural arts, political engagement, and hot weather. My dad was a government employee, and my mom was a housewife. My sister and I are their two daughters, and my parents valued free-thinking and independence. We were often financially constrained, and my parents prioritized education as a lifelong asset.

When I came to the U.S. to attend Georgia Tech for my Master’s after finishing my undergraduate electronics engineering degree in India, I started working on a kiosk project for rural communities in south India. Looking back, this was an important moment when I realized that technology can be designed specifically for communities, for their context and needs. Until then, it had always seemed like an amalgam of broad infrastructure that was agnostic to people.

Coming from an engineering background, I found the social sciences fascinating. I learned the craft and technique of the ethnographic method, such as interviews, surveys, participant observation, and shadowing (my doctoral advisor is an anthropologist). I decided to pursue a doctorate in HCI and expand into the interdisciplinary domains of anthropology, sociology, political ecology, and critical theory to inform my computer science research. These lenses helped me understand the big picture of people and computers in context, and showed me ways to more effectively gather, understand, and analyze data on how technology influences society.

These experiences shaped my interests and career pursuits, and I saw that I could bring what I learned back to the technology community through design, scholarship, and eventually through Google.



Nithya conducting an interview with an informal sector employee in Jinja, Uganda, on the river Nile.
Photo credit: George Conard


Can you share some of the discoveries from your gender equity research?

Half the world is already online. However, women, despite making up half the world's population, are underrepresented on the Internet. Over the next few years, access gaps will close and more women will come online. The question is: how can we build equitable designs, algorithms, and policies so the Internet is relevant and meaningful to more women?

To help answer that question, I (along with my team and university collaborators) spoke to women in urban, peri-urban, and rural areas of seven countries: Brazil, Bangladesh, India, Indonesia, Mexico, Nigeria, and Pakistan. We interviewed 363 participants, and followed up with a large-scale survey of 3,500 participants. We spoke to a wide spectrum of women, including new mothers, rural farm workers, call-center workers, and PhD students, across sexual identities, religions, incomes, and age groups. We also spoke to the women's male partners, friends, brothers, or fathers, in order to understand how they viewed women's access to technology.

Our research concluded that in order to understand and build equitable technology, we need to pay attention to four pillars: access, content and community, privacy, and safety [3].

First, are women able to access the Internet equitably? In other words, instead of framing access as cables and networks, we should be asking whether women are able to go online on their own terms. The Internet often carries a risky reputation for women, due to fears of harassment, exposure to strangers, and associations with promiscuity. Factors like reduced agency to go online, limited free time due to housework duties, and limited physical mobility affect women’s ability to go online on their own terms.

Second, when women go online looking for content and community, is what they find relevant and meaningful? In addition to gender gaps in access, women limit online content creation, such as posts and comments, due to anticipated harassment and judgement. Women want locally relevant content on money, education, skill development, health, and relationships, but it is hard to find online.

Third, do women feel in control of their privacy online and on devices? Due to social and economic norms, women’s phones were more commonly shared than men’s devices in our study, in the form of shared use, mediated use (relying on someone else to make use of technology), and device monitoring [4]. In some situations, it may even be considered impolite for a woman to refuse to share her device, due to social relations with those of higher power or authority.

Our participants used a repertoire of covert and overt privacy management techniques--these practices were performative because they allowed women to preserve privacy in line with social expectations. For example, app locks (third-party apps that hide apps, content, or folders behind passwords or PINs) were commonly used with family members and provided granular control over what to hide, but their passwords were quite easy for people nearby to observe. App locks also usually involve a password or PIN prompt, which leads to questions like, ‘what are you hiding from me?’. Deleting content, such as drop-down search history or received messages, was an effective technique to remove content without worrying about traces when devices traveled freely across hands. Using private modes was more effective at hiding content, but participants found these affordances hard to discover. Simply avoiding installing or using specific applications to guard privacy was more commonly employed, but this practice limited technology use. We also saw divergent views on the term privacy itself, even though all participants had ways to achieve privacy. It is important for us as technology creators to understand current practices and needs, and to facilitate privacy in culturally relevant ways.

Finally, do women feel safe online (and offline)? Safety is often viewed as an edge case in technology, but it can be an everyday issue for women. Online abuse is frequent and often results in severe consequences in both the physical and online worlds. Abuse can occur in the form of cyberstalking, impersonation, and more; women cope with it through informal support channels like family and friends rather than formal options like law enforcement and technology reporting.



After an interview with a data entry student and micro-entrepreneur in a village in Uttar Pradesh, India.
Photo credit: Garen Checkley


What are your research findings on women in STEM?

To have equitable representation and participation from women in computing, we need to comprehensively understand why and how women may be discouraged from CS, and how we can create solutions to address the current disparities globally. In collaboration with the University of Michigan, we conducted research on the motivating and dissuading factors for women in CS across various life stages, from undergraduates to senior researchers [5]. In India, women represent 45% of total CS enrollment in universities, about 2.5 times the rate in the United States. Similarly, women account for 30% of engineers entering the IT workforce, higher than the rate in the U.S. Despite these promising enrollment numbers, there is a rapid decline in women’s participation in advanced computing degrees and higher-level positions (80% of women in tech are in entry-level positions).

We found that computing is an aspirational undergraduate degree for women, valued for its financial stability, workplace safety, and improved marital prospects, especially compared to physically demanding professions like civil or mechanical engineering. However, at the advanced-degree level, women’s ambitions start to clash with familial duties and societal expectations, due to the common perception that highly educated women can be ‘over-educated’, making it harder to find an equal or superior groom. Housework and childcare duties fall disproportionately on women (in an OECD survey, Indian women spend 5 hours a day on housework; Indian men, 20 minutes), even among the well-to-do women we interviewed. The lack of paternity leave policies, despite an exemplary maternity leave policy, means that women take longer breaks or quit jobs, and face great difficulty re-entering the workforce.

The lack of relatable role models from cultural contexts similar to those of Indian women was cited as a major barrier. Relatable role models matter as much for the entire family as they do for the female scholar, especially if the scholar is from a village and is the first to break patriarchal norms around education. The presence of a role model from a similar community helps start a dialogue with parents or partners about independence and careers.

Building on this insight, we created role model videos of some of the most accomplished Indian scientists. We interviewed nine scientists across various backgrounds and professions, ranging from statisticians to astrophysicists to computer scientists to microbiologists, about their journeys into STEM, how they navigated challenges, and their advice for aspiring young women and their families in India.


What inspires you most about this work?

What inspires me most is technology’s potential to benefit society, particularly underserved communities, if done right. We are facing an important moment in history, with accelerated and unprecedented technological growth. Technology has the potential to amplify or even exacerbate existing biases and inequities from the physical world. But it can also be leveraged as a force for good--though this does not happen by default; it requires intentional attention to our design and engineering choices. We have the opportunity to understand existing disparities and make equity and opportunity possible here.


What advice can you share for others interested in applying an inclusive lens to their work?

Educate yourself. Find ways to create an informed perspective, and check your biases. Speak to the communities you aspire to design for--this is a great way to understand a new topic, identify meaningful opportunities, and talk intelligently and respectfully about the community within your organization. Cultivating a focus on data and goals can help position an initiative as rigorous, thoughtful, and core to your organization’s values.

Frame the goals at the right strategic level to have the most impact and sustainability. For example, treating gender equity solely as a moral obligation can feel vague and hard to prioritize against other organizational interests. One can instead make a case for technical opportunities, business opportunities, or ethical purpose. Using an equity lens in technology also allows us to make improvements that benefit more communities. For example, making privacy improvements for women benefits the elderly, new technology users, and survivors of abuse.

If we all just reflected a little on our work’s effects on various communities, we’d make much faster progress as a society.

Watch a video interview with scientist Anuradha TK from the Women in STEM series based on Nithya's team's research:



[1] Sambasivan, N., Cutrell, E., Toyama, K., & Nardi, B. (2010, April). Intermediated technology use in developing communities. SIGCHI (pp. 2583-2592). ACM.

[2] Sambasivan, N., Weber, J., & Cutrell, E. (2011, May). Designing a phone broadcasting system for urban sex workers in India. SIGCHI (pp. 267-276). ACM.

[3] Sambasivan, N., Checkley, G., Ahmed, N., & Batool, A. (2017). Gender equity in technologies: considerations for design in the global south. Interactions, 25(1), 58-61.

[4] Sambasivan, N., Checkley, G., Batool, A., Gaytán-Lugo, L. S., Matthews, T., Consolvo, S., & Churchill, E. (2018, August). “Privacy is not for me, it’s for those rich women”: Performative Privacy Practices on Mobile Phones by Women in South Asia. SOUPS 2018. USENIX Association.

[5] Thakkar, D., Sambasivan, N., Kulkarni, P., Sudarshan, P. K., & Toyama, K. (2018, April). The Unexpected Entry and Exodus of Women in Computing and HCI in India. SIGCHI (p. 352). ACM.
