Proponents of this bill say it’ll protect kids online. But for queer kids, it may do just the opposite.
Wed 09.13.23 / Madelaine Millar
Keeping children safe online sounds like a political slam dunk, so it’s no real surprise that the Kids Online Safety Act (KOSA) has faced little resistance since it came out of committee at the beginning of August. The bill’s provisions, proponents claim, will keep children safe on social media by restricting access to minors’ personal data, providing parents with tools to supervise their children’s social media use, and creating liability for platforms that allow minors to encounter inappropriate content, among other measures.
But Michael Ann DeVito, assistant professor of computer science and communication studies, thinks the bill deserves more scrutiny. DeVito has spent the last decade researching how queer and trans people interact with computers, social media, and online communities — first as a doctoral student at Northwestern University, then as a postdoctoral research fellow at the University of Colorado Boulder, and now at Northeastern University as a newly arrived joint appointee between Khoury College and the College of Arts, Media and Design.
“KOSA is one of a family of bills that’s been proposed in the last couple of years that all center around the rhetorical premise that we have to protect children,” she said. “You can slip anything under that radar once you’ve got such an appealing premise; people will stop reading, and they will sign on to something that actually will harm their kids.”
Per DeVito, KOSA’s potential harm derives from three elements: how queer and trans online communities function, how social media platforms navigate questions of legal liability, and the techniques bad-faith actors use to silence queer and trans people online. So, with DeVito’s help, let’s break it down.
How do queer and trans online communities function?
When DeVito was growing up in the 1990s and 2000s, information about being queer was buried in what she called the “nasty corners of the internet,” mixed haphazardly with misinformation and bigotry. Over many years, queer and trans people built alternatives, determined to make sure the next generation of kids didn’t have to go through the same confusion and isolation. Today, the communities DeVito studies often materialize through influencers and educators on Instagram or TikTok who create videos about what to expect when talking to your doctor about hormones, experiences with different surgeries, and what went into figuring out their own identities, as well as influencer staples like makeup tutorials and vlogs.
“One characteristic of these spaces is the provision of high-quality information — information that has scientific backing, and that is also community-based and vetted,” said DeVito, who spent a year becoming a trans TikTok educator as part of her study of the phenomenon.
She also found that queer and trans influencers tend to be vigilant about self-policing, attaching warnings to adult content and moderating their own comments sections for bigotry.
“They will sacrifice all kinds of their own goals and their own time to stay on top of that comment section and make sure no kid who watches their videos sees nasty, hateful stuff,” DeVito added.
Finally, queer and trans online communities are characteristically labors of love.
“These are volunteers who are doing this,” DeVito explained. “They’re trying to create places where people can actually explore who they are, ask questions, and get a sense of what’s going on with their peers.”
And the research is decisive: the availability of supportive online communities correlates with a host of improved health outcomes, including notably lower rates of youth suicide.
“It’s not one or two studies; I’m talking about the last ten years of studies on all different kinds of queer youth populations, trans youth populations. It is so clear: when you have a supportive online space, you do way better,” DeVito said. “If you don’t make kids feel super isolated about who they are at their core, they are less likely to harm themselves.”
How does legal liability work with social media platforms?
Social media platforms are governed by Section 230, a provision of the federal Communications Decency Act that generally absolves them of liability for what their users post. That means if someone posted a hateful screed about DeVito to TikTok, she could potentially sue the user for defamation, but not the platform for hosting hate speech. Platforms moderate harmful content out of an economic incentive to remain a pleasant place for users, not out of liability concerns.
KOSA would change that, making platforms liable if children encounter content that could cause them harm. DeVito acknowledges that additional review of harmful content could be a good thing if undertaken by an impartial review board. Instead, KOSA allows individual state governments and attorneys general to determine what constitutes harm, and some states have a very different idea of what “harming children” means than others do.
“For example, Texas’s government has been very explicit that they plan to use this law, should it pass, to censor any queer and trans content coming into the state of Texas,” DeVito said.
If the bill became law and a child in Texas watched one of DeVito’s TikToks about her experiences as a trans woman, Texas could hold TikTok itself liable should the child “come to harm” — however Texas chooses to define “harm.” Even if her videos are factual, well-vetted educational material presented in an age-appropriate way, the threat of a lawsuit incentivizes TikTok to take a “better safe than sorry” approach and remove DeVito’s content.
“(KOSA) gives additional control over what is inappropriate to political entities, and then takes that pressure and lets it sit on platforms that are already inclined to over-moderate,” DeVito said. “That’s how you wind up getting rid of all queer and trans content, not just the stuff directed at kids. It’s like book banning on steroids.”
What techniques do bad-faith actors use to silence queer and trans people online?
While researching queer and trans online communities, DeVito observed many creators undergoing a common, upsetting experience: mass reporting campaigns. If their work caught the attention of a bigot who wanted to silence them, they would suddenly find their posts reported to the platform hundreds of times for hate speech and sexually explicit material, regardless of what the posts actually contained. The reports were submitted either by a group of politically aligned people coordinating through another platform like Discord or Reddit, or by hundreds of fake accounts directed by a single person, a phenomenon called a “bot swarm.”
DeVito found that platforms generally took down mass-reported posts, sometimes via automatic action rather than moderator review. Two hundred reports saying a post contains sexually explicit material sounds like a sure thing, and reviewing posts one by one requires financial, psychological, and temporal investments many platforms won’t — or can’t — make.
Creators can appeal these decisions, but DeVito found that some queer and trans creators who reported harassment saw their own content or accounts pulled down instead of the hateful comments. Rather than going to the platform, many creators spent hours figuring out how to tag their content so it was concealed from bad actors but still findable by other queer people, a process called folk theorization, all while transphobic and homophobic harassment continued to rain down.
“Part of what I’ve been studying recently is the moments when creators say ‘This isn’t worth it; I can’t do this anymore,’” DeVito said. “People drop out, and communities are harmed by not having that positive influence and that positively curated space.”
That’s the real goal of mass reporting campaigns — not to get individual videos taken down, but to overburden creators until they burn out and quit. Because KOSA would expand what content could be reported and decrease platforms’ incentive to stand with marginalized creators, it’s an ideal tool for bad-faith actors to harass queer creators into silence.
“This law will wind up harming the exact creator that you as a parent probably hope your kids find … the creator who was going to give your kid the piece of information that made them think ‘Oh, I might be okay actually; maybe I can have a good future,’” DeVito said. “I’m afraid that a lot of these spaces that we see queer teens really benefiting from are just going to go silent.”
What’s next for KOSA?
DeVito encourages people who support queer and trans youth to contact their legislators to voice their feelings about the bill, which is currently awaiting the scheduling of a floor vote. Because it’s packaged so appealingly, ensuring that congresspeople understand the policy they’re voting on is a major hurdle; another is building empathy for the people who will be impacted. DeVito particularly encourages those who can to share stories about how online queer communities have bettered their lives.
“When I was acting as a trans influencer, I was blown away by the number of people who got deep in the comments … It’s touching to get the comments like ‘This made me feel seen, this made me feel safe, this made me feel like I’m not alone,’” DeVito said. “I wish there were people like this when I was young … It would have made it so much easier to get to the good place I’m in now. It was a struggle because I did it largely alone, and they don’t have to — at least, if we don’t pass this law.”