Demographics do not make Personas – but neither do stereotypes

They might look different, and we might project different motivations and attitudes onto them, but we might also be very wrong. Both may well travel frequently (so they share a behaviour), and they might share the same expectations or questions about a hotel room. So we should be very reluctant to infer different motivations, goals, and attitudes based on appearances.

pic.twitter.com/EQpdIA4yfV — Jon Davie (@jondavie), August 18, 2015

Q – a genderless voice

A cool and interesting project, because it is well established that voice-based digital assistants like Siri convey social roles through their voice.
When we listen to a voice-based assistant, we apply implicit assumptions such as the social gender roles we have learned and internalised over a long period of time, even though we know it is a machine. And because technology uses female voices by default for assistant-like roles ("How can I help?"), it reinforces classical gender stereotypes: women being perceived as "warm", "helpful" and "cooperative" rather than "dominant", "competitive" and "independent", traits that correspond more closely to male gender stereotypes.

+++ https://www.genderlessvoice.com
