Applying industrial design to AI: Meet Kirsten Climer, UX Lead for Google's AIUX team
If you have an OXO product — known for their universal design — in your home, chances are that Kirsten Climer has had a hand in designing it. Prior to coming to Google, Kirsten was a design director and industrial designer at Smart Design, a design & innovation consultancy in New York, where she designed more than 50 OXO products. She now leads a team of UX researchers, designers, and prototypers to help make Google’s artificial intelligence technology more accessible to a wide audience. To that end, Google recently launched Coral, a platform for on-device AI aimed at professional prototypers and engineers who want to incorporate AI into their products.
Kirsten’s team also works on AIY Projects, which are affordable, do-it-yourself kits that contain all of the materials you need to build a simple AI-powered device, like a smart speaker or camera. We sat down with Kirsten to learn more about how she and her team help make Google’s most advanced technologies more human-centered and helpful for everyone.
Why is product inclusion so important to the development of AI/ML technologies?
When our UX team thinks about inclusion, we think about how we can make our products available to the widest range of people. I love this quote from Susan Goltsman, an internationally recognized expert in designing and programming inclusive environments for children, youth and families: “Inclusive design doesn’t mean you’re designing one thing for all people. You’re designing a diversity of ways to participate so that everyone has a sense of belonging.” (Note: this quote appeared in the book Mismatch by Google UX Director Kat Holmes.)
As we’ve developed AIY and Coral, we’ve designed with students, makers, and developers of a wide variety of backgrounds and skill levels so that they can each find success with our products based on their own unique goals. This is especially important in the tech industry because many developers are self-taught. It’s easy for our cross-functional team to design products for ourselves, but it’s been our UX team’s role to incorporate the perspectives of developers outside Google’s walls.
Coral products are used by teams at companies and institutions working on custom, proprietary applications of machine learning. Those teams can include software engineers, hardware engineers, electrical engineers, and everyone in between. One method we used in our UX research was testing in these group contexts, which really highlighted how the user experience of our product needs to meet each user where they are.
Other Coral users are individual entrepreneurs, or people working within a startup to build a proof of concept. They’re likely not experts in software, hardware, and machine learning all at once, so they wear many hats. They’re scrappy, cost-constrained, and want to move quickly. We’ve also thought about inclusivity in how we can help them along that journey: an easy setup experience, models and examples to start from, and hardware that scales from prototype to production.
How are AIY Projects and Coral being applied to build and expand access to technology for everyone?
The on-device aspect of Coral unlocks a lot of potential for AI to be used in places where speed and offline capabilities are requirements. Let’s take the offline aspect. This is helpful in environments where connectivity is poor, so you can imagine the interesting applications that become possible in emerging markets where connectivity is limited or internet costs are high.
While Coral has been available in Beta for just a few months, we’re already seeing companies use Coral in unique ways. For example, an agricultural company is using AI to detect plant health in order to control water and light. Fermilab is applying our tools to its high-energy physics experiments to look for particles. And automotive companies are exploring how to bring pedestrian detection capabilities to large vehicles. Just today I learned about a healthcare company using Coral in a proof of concept for polyp detection in endoscopy applications.
Affordability is also key to expanding access to technology. Let’s take the Coral USB Accelerator. Compared to a graphics card (a GPU), which is often used to execute machine learning models and can cost several hundred dollars, our USB Accelerator is only $75 and offers even faster performance.
Coral gives developers a creative palette to build solutions for everyone. We’re giving people a set of tools to tackle challenges in industries that Google isn’t currently addressing. We believe that AI will help create a better world, but only when we explore, learn, and build it together.
Your background is in industrial design — how does that experience influence your work today?
Prior to Google, I spent the formative majority of my career at Smart Design in NYC. There, I worked with clients across many industries, including healthcare, beauty, tech, and housewares. I helped clients get a deep, real-world understanding of who their users are, created long-term product visions and roadmaps, and developed design strategies to help brands enter new markets.
Each product we designed was an exercise in designing with users. For a single product, I probably made over 20 prototypes and tested them in the hands of 50+ users before we were satisfied with the design intent.
While I don’t practice the craft of industrial design too much today, that background shapes how I approach UX at Google. Industrial designers solve tough, technical problems, and are highly constrained by cost and manufacturing processes. They deliver products that first and foremost serve the needs of the user, but also balance simplicity, beauty, and appropriateness for the context they live in.
I decided to come to Google because I believe a user-centered design approach should be able to solve any problem, large or small, consumer-facing or B2B. I have little-to-no background or training in computer science or machine learning. My personal motto, adopted from my time at Smart, is that design is about people, not things.
Working within Google Research on a team called AIUX, I get thrown problems that are technically complex and ambiguous. There is generally no existing precedent for the projects we tackle. Machine learning is still in its early days; it’s hard, and it’s easy to get wrong, even for experienced practitioners. Amid all this, it’s our team’s role to highlight the voice of the user and provide a guiding light for how we can be inclusive of the widest range of people.
Illustration by Morgane Sanglier
Photo of Kirsten by Neil Buchan Grant