Why We Must Change the Terms
I keep thinking back to the steps of the South Carolina state capitol in 2015.
I was there to deliver petitions calling on legislators to remove the Confederate flag from state grounds.
I was there because just a week earlier a man had walked into the Emanuel African Methodist Episcopal Church in Charleston and gunned down nine people.
I was shaking so hard on those steps as I told the story of my great-grandfather, a sharecropper from Clarksdale, Mississippi, who disappeared after getting into a disagreement with the employer who refused to pay him for his work, an employer we later found was a Klansman.
As I stood there, I talked about the Confederate flag that flew over my grandmother’s head when she was forced out of town and separated from the rest of her family at the age of nine. It was the same flag this mass murderer in Charleston had waved around. This psychopath’s manifesto continues to fester online.
I remember looking out and seeing neo-Nazis holding up their Confederate flags and watching me with hatred as I told that story.
I knew they would be there because I had seen the Facebook event posted about it. Still, it was one of the few times that I’ve genuinely feared for my life.
I didn’t know then how desensitized I’d become to explicit and implicit threats delivered online.
The family of the man who murdered nine worshippers in that Charleston church told the public what would become an increasingly familiar story — a story of someone pushed to extremes by overexposure to violent ideologies online.
This was something I would put to the test myself when I started using a computer exclusively for my research on white nationalists. I found that when the computer began to read me as a conservative white man, it fundamentally changed my user experience.
“Hey, you follow Mike Cernovich,” Twitter would ask, “but have you tried following David Duke or David Horowitz?” Amazon would see I was looking up books about the far-right activist Lauren Southern and make sure I knew I could also buy Mein Kampf and a plethora of neo-Nazi paraphernalia and have it shipped to me by the next day.
I was bombarded with articles on Yahoo and other places talking about dangerous “Blacks” killing each other in Chicago. I found that my world was profoundly shaped by who the internet thought I was. Once I was relegated to that bubble, it felt like there was no way to get out.
Many of the white-supremacist groups we were fighting against at Color Of Change were actively organizing online, often through gateway content. The platforms’ loose and often unenforced policies give these groups a veneer of legitimacy that allows them to operate unchecked.
They’re able to peddle video and audio content on YouTube and SoundCloud, sell paraphernalia and self-published books on Amazon, sell tickets to their events on Eventbrite, harass and seek to intimidate Black thought leaders, disproportionately women, on places like Twitter, and organize violent encounters on Facebook.
At Color Of Change, we went after these groups’ financial streams through our “Blood Money” campaign, working closely with financial institutions like PayPal to stop violent factions from being able to process credit-card payments.
But often this was like playing Whack-a-Mole, with billion-dollar companies depending on us to point them in the right direction rather than investing resources and expanding their enforcement policies to create a new normal.
It felt like we were fighting against an invisible clock, waiting for the next violent encounter, and we wondered how we could put forth solutions to slow down, if not stop, the never-ending countdown to tragedy that would restart after Charlottesville, Tree of Life, Christchurch, and on and on.
And we noticed something else.
Too often white supremacists and those openly threatening us with hostile, hateful and violent speech are given a pass while Black voices speaking out against them are silenced. Platforms have censored a number of visible Black activists, their accounts shut down or suspended. And we know from the hundreds of stories we’ve collected that the platforms ban everyday Black people — Black women in particular — for speaking out against the racism, misogyny and harassment they’re enduring.
So this isn’t just about stopping anti-Black hate; it’s about looking at all the ways we could make these tech platforms less hostile places for Black people.
Now more than ever, we need to draw a line in the sand for the corporations that hold the most power. We need these companies to rein in online terrorism and deprive it of the digital oxygen it needs to thrive. These platforms need to understand that staying neutral in these times is, in fact, making a choice.
This is what brought us at Color Of Change to the table to work on the Change the Terms campaign, which I see as one tool in a toolbox of necessary interventions to ensure that the values of equity, justice and security can exist for all of us, in both our online and offline worlds.
Brandi Collins-Dexter is the senior campaign director for media, democracy and economic justice at Color Of Change. This post is adapted from remarks she gave during a March 26 webinar on the impacts of online hate organized by the Change the Terms coalition. Learn more and sign on to support the Change the Terms campaign.