Designing a chatbot free of
gender bias and stereotypes
Chuhms - A chatbot designed to help people working from home get a healthy dose of movement.
by Teja Srinivas
Solo Project
The Timeline
UX, UI
Ideation
Prototyping
Usability Testing
Research - 2 weeks
Design - 2 weeks
Prototyping & testing - 1 week


Background
In a world filled with AI-based voice assistants, I stumbled upon an intriguing realization: these virtual assistants were often biased and reinforced gender stereotypes. It struck me that there was a common data bias against women in design and research. Women's voices were rarely considered in the development process, despite their unique perspectives. This realization led me to explore the intersection of feminism and technology.
To deepen my understanding, I embarked on a short course on feminism and its history. The knowledge I gained ignited a spark within me, and I decided to create a chatbot that would be free of any bias and stereotypes, specifically targeting people who work from home. I believed that this chatbot could encourage them to incorporate a healthy dose of movement into their daily routines.
The challenge
The challenge I faced was clear: How might we design a chatbot that promotes movement without perpetuating gender bias and stereotypes? With this question in mind, I embarked on a journey of designing a chatbot that would challenge traditional norms and foster inclusivity.
Steps I followed to design feminist tech
Recognising inherent biases & stereotypes
Stakeholder research
Purpose
Conversation design
To begin, I recognized the inherent biases and stereotypes that exist within the technology industry. It was crucial to acknowledge these biases in order to actively counteract them during the design process.





What do the stakeholders say?
Stakeholder research played a vital role in shaping my design approach. I reached out to female content creators on Instagram, seeking their insights and opinions. Through a simple direct message, I asked them about their general viewpoint on exercise, whether a chatbot could motivate them to move, and their thoughts on the potential of a well-designed chatbot to promote equality, well-being, and positive change.
The responses I received were varied. Some felt that a chatbot could indeed be motivating, while others were hesitant about technology dictating their actions. However, the majority agreed that technology, when designed appropriately, had the potential to foster equality and should be embraced more often.


Difficult questions tackled
The chatbot
If we reinforce the belief that the assistant is just a chatbot, in line with the Feminist Internet's PIA standards, how can a stakeholder believe that the chatbot understands them?
Gender requirements
If men's and women's physical requirements are not the same, how can I ensure my bot is unbiased?
Personal biases
Personally, I am someone who does not differentiate between men and women. But in this context that could itself be a bias, because women's physical exercise needs might differ from men's due to different biologies, and those needs might change with age and circumstance.
Other genders
Women need to get up more often than men; they are more prone to backache and dehydration. Given this, how do I help people who don't identify with any specific gender?
Language
What kind of language should be used for the conversation?
Answers
Millennials usually put their trust in technology, but they would only know after using the chatbot whether they can trust it. So the experience is very important.
Make it genderless: give general facts about both genders, and use gender-neutral third-person pronouns when describing them.
Take an objective view, as I am designing a feminist chatbot: ask women before designing, and take feedback from them after prototyping.
Find out which issues are specific to men, which are specific to women, and which are specific to people who don't identify with either; or design for the gender with more physical issues, which will then also cover the other genders.
Keep it simple: stick to first- and second-person pronouns like "I" and "you", and avoid gendered pronouns completely.
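The language rule above can be made concrete in code. The sketch below is hypothetical (not Chuhms' actual implementation): reply templates address the user only as "I" and "you", and a simple check flags any gendered pronoun before a message ships.

```python
# Minimal sketch of keeping chatbot copy gender-neutral.
# Templates, function names, and messages here are illustrative assumptions.
import re

GENDERED_PRONOUNS = {"he", "him", "his", "she", "her", "hers"}

def is_gender_neutral(message: str) -> bool:
    """Return True if the message contains no gendered pronouns."""
    words = re.findall(r"[a-z']+", message.lower())
    return GENDERED_PRONOUNS.isdisjoint(words)

# Templates speak in first and second person only ("I" and "you").
REMINDER_TEMPLATES = [
    "You've been sitting for {minutes} minutes. How about a quick stretch?",
    "I noticed it's been a while. A short walk could help you refocus.",
]

def build_reminder(minutes: int) -> str:
    """Build a movement reminder, refusing any copy with gendered pronouns."""
    msg = REMINDER_TEMPLATES[0].format(minutes=minutes)
    assert is_gender_neutral(msg), "gendered pronoun slipped into the copy"
    return msg
```

A check like this could run over every template at build time, so gendered language is caught before it ever reaches a user.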