Join us as we celebrate Ethics Awareness + Women’s History Month!
Conversations for Change is partnering with Professor Laney Strange and Professor Vance Ricks to spotlight the work they lead through the CS + Ethics Research Fellowship. You will hear from students currently in the fellowship, learn about the research they are pursuing, and have the opportunity to engage through a Q&A.
This panel will be open to all students, staff, faculty, and community members of Khoury College and Northeastern.
Registration is required – Register here today!
Shriya Dhaundiyal (she/her) – “Ethical Implications of Representation and Diversity in the Metaverse”
ABSTRACT: The metaverse’s expansion presents new opportunities and challenges for representation and diversity. Without diverse representation, harmful biases and stereotypes may persist, leading to exclusion and discrimination against marginalized communities. Prioritizing representation and diversity is vital to ensuring the inclusion of all identities in the metaverse and to creating a more equitable and inclusive digital world. Addressing ethical concerns surrounding representation in the metaverse is critical to avoid reinforcing existing inequalities, and ongoing discussion and exploration are necessary to address these concerns and promote a more diverse and representative metaverse.
Lauryn Fluellen (she/her) – “Examining Racial Biases in Speech Recognition Devices: Impacts and Solutions”
ABSTRACT: This paper investigates racial biases in speech recognition devices, which stem from reliance on algorithms that produce higher Word Error Rates (WER) for marginalized racial groups. The study examines the relationship between WER and sociolinguistic factors such as accent and dialect, which affect accuracy and can lead to psychological harm. Researchers collected spoken commands and queries from speakers across several racial groups and ran experiments using speech recognition models. Results show significant accuracy variations across racial groups, with higher WER for African American speakers than for white speakers. This research highlights these racial biases and calls for more inclusive and equitable speech recognition technologies to improve accessibility and user experience.
Emerson Johnston (she/her) – “Social Media: A Tool for Information Warfare”
ABSTRACT: This paper investigates the use of social network systems as a tool for information warfare, focusing on how these systems can be leveraged to spread false information, manipulate public opinion, and undermine trust in institutions. I examine case studies from several countries, including Russia’s interference in the 2016 US election and China’s use of social media to influence public opinion on issues such as Hong Kong and Taiwan. These cases highlight the potential dangers of social network systems as a tool of war, as well as the need for increased awareness and countermeasures to mitigate their impact.
Amanda Lee (she/her) – “The Gender Bias inside your Female Voice Assistant”
ABSTRACT: As society uses more kinds of assistive technology, the lines between the characteristics of robots and humans blur. Voice assistants such as Alexa, Siri, and Cortana all evoke gender bias through explicit cues such as their names and indirect cues such as their conversational responses. Using the frameworks of Computers as Social Actors, Sociophonetic Theory, Dishonest Anthropomorphism, and Objectification Theory, and applying them to existing voice assistant frameworks and to the Q “genderless voice,” I show how female voice assistants have negative implications for women and for society itself. I propose design solutions and guiding questions to create a world closer to gender justice.
This event is part of the Conversations for Change (diversity, equity, inclusion, accessibility, and belonging) Speaker Series. The goal is to collaboratively raise and discuss topics of equity and inclusion, fostering a sense of community and belonging, particularly for those who are historically and presently marginalized in computing. CS is for everyone, and this series aims to build toward the social and cultural changes needed to ensure that all who pursue the field feel welcome.
Questions? Please email Chelsea Smith at c.smith@northeastern.edu.