The new face of control
Persecution in the modern world rarely looks like it once did. The chains are subtler now. They are coded into policy, technology, and perception. Through surveillance, predictive policing, biased algorithms, and digital disinformation, systems of control have evolved to match the times. What was once enforced by visible power is now maintained through invisible data, shaping who gets hired, who gets stopped, and whose voice is believed.
VICTIMS OF PERSECUTION
11/12/2025 · 3 min read
The New Chains: Persecution in the Age of Data
The modern state no longer needs whips or walls to control populations. It has metadata, social scoring, and predictive analytics. Every online interaction leaves a trace: every search, purchase, or location ping becomes a data point that can be used to build a profile, anticipate behavior, or deny access. Facial recognition software is known to disproportionately misidentify Black and brown faces. Social media content moderation algorithms flag and suppress political movements. AI-powered hiring tools can encode bias and discrimination. This is persecution repackaged: not as explicit violence, but as algorithmic bias; not through brute force, but through silent exclusion.
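To make the point concrete, here is a minimal sketch of how raw metadata becomes a profile. The pings, tower names, and thresholds below are all invented for illustration; the technique itself, inferring home and workplace from timestamped location records alone, requires no message content at all.

```python
# Hypothetical sketch: from (hour_of_day, cell_tower_id) pings alone,
# a trivial aggregation infers where a person likely sleeps and works.
from collections import Counter

def infer_profile(pings):
    """pings: list of (hour_of_day, cell_tower_id) tuples."""
    # Most frequent tower late at night -> likely home.
    night = Counter(tower for hour, tower in pings if hour < 6 or hour >= 22)
    # Most frequent tower during business hours -> likely workplace.
    day = Counter(tower for hour, tower in pings if 9 <= hour < 17)
    return {"home": night.most_common(1)[0][0],
            "work": day.most_common(1)[0][0]}

# Invented pings for one person over a day or two.
pings = [(23, "tower_A"), (2, "tower_A"), (5, "tower_A"),
         (10, "tower_B"), (14, "tower_B"), (16, "tower_B"), (19, "tower_C")]
profile = infer_profile(pings)
print(profile)  # {'home': 'tower_A', 'work': 'tower_B'}
```

A few dozen lines of aggregation is all it takes; at scale, the same logic applied to millions of people is what turns "anonymous" metadata into a surveillance instrument.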
Predictive Policing and the Illusion of Objectivity
Across the United States and other nations, police departments use predictive policing tools that claim to forecast where crimes will occur and who is likely to commit them. But because these systems are built on historical crime data, they reproduce the very racial disparities already embedded in law enforcement.
In cities like Los Angeles, Chicago, and London, data-driven policing has led to increased surveillance in predominantly Black and poor neighborhoods, reinforcing cycles of criminalization. ProPublica's 2016 investigation found that COMPAS, a risk-assessment tool used in sentencing, was biased against Black defendants, and audits of predictive policing software like PredPol show that it disproportionately targets minority communities. Technology doesn’t erase human bias; it scales it.
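The feedback loop described above can be sketched in a few lines. The numbers here are invented, and the allocation rule (send patrols wherever recorded crime is highest) is a deliberate simplification of real dispatch policies, but it shows the core mechanism: two neighborhoods with identical true crime rates diverge, because the one with an inflated historical record attracts the patrols that generate the next round of records.

```python
# Hypothetical sketch of the predictive-policing feedback loop:
# patrols follow *recorded* crime, and patrols are what produce records.

def run_feedback_loop(recorded, true_rate, rounds=5, patrols=100):
    """recorded: initial recorded-crime counts per neighborhood.
    true_rate: actual incidents observed per patrol in each area."""
    recorded = list(recorded)
    for _ in range(rounds):
        # Send all patrols to the area with the most recorded crime so far.
        target = max(range(len(recorded)), key=lambda i: recorded[i])
        # More patrols there means more incidents observed and logged there.
        recorded[target] += patrols * true_rate[target]
    return recorded

# Both areas have the SAME true rate; area 0 merely starts with an
# inflated record from historical over-policing.
final = run_feedback_loop(recorded=[60, 40], true_rate=[0.1, 0.1])
share_0 = final[0] / sum(final)
print(final, round(share_0, 2))
```

Area 0's share of recorded crime climbs from 60% toward 73% even though nothing about the underlying neighborhoods differs; the data "confirms" the bias that produced it.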
The Algorithmic Gaze
Persecution today also lives in the private sector: in hiring platforms, housing applications, and social media feeds. Automated systems now decide who gets a job interview, a loan, or even visibility online. When those systems are trained on biased data, they perpetuate discrimination invisibly, without a human ever needing to act unjustly. This has surfaced repeatedly: the Amazon hiring tool, scrapped in 2018, that downgraded resumes containing the word "women's"; studies showing racial bias in mortgage algorithms; and evidence of social media algorithms promoting hate speech and political manipulation. The result is a new digital hierarchy, one in which access and opportunity are filtered through machine logic that mirrors human prejudice.
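How does a screening model "learn" discrimination no one programmed in? A toy sketch, with an invented training set, shows the mechanism: when past hiring decisions correlate with a gender-proxy word, even a naive word-scoring model absorbs that correlation as a penalty, despite identical skill keywords across resumes.

```python
# Hypothetical sketch: a resume screener trained on biased labels learns
# to penalize a proxy token (e.g., "womens", as in "women's chess club").
from collections import defaultdict

def train_word_scores(resumes):
    """Score each word: hire rate among resumes containing it, minus base rate."""
    hires = defaultdict(int)
    seen = defaultdict(int)
    for words, hired in resumes:
        for w in set(words):
            seen[w] += 1
            hires[w] += hired
    base = sum(hired for _, hired in resumes) / len(resumes)
    return {w: hires[w] / seen[w] - base for w in seen}

# Invented training set: skill words are comparable; only the historical
# (biased) hire/reject decisions differ.
data = [
    (["python", "captain", "chess"], 1),
    (["python", "chess"], 1),
    (["python", "womens", "chess"], 0),
    (["python", "womens", "captain"], 0),
]
scores = train_word_scores(data)
print(scores["womens"], scores["python"])  # -0.5 0.0
```

The skill word "python" scores neutral, while the proxy token gets a strong penalty, purely because of who was hired in the past. Real models use far richer features, but the failure mode is the same: biased labels in, biased filter out.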
Surveillance as Social Control
From smart cameras on city streets to data-sharing agreements between corporations and governments, surveillance has become a quiet constant. Communities once targeted by visible forms of state violence now face an invisible one: constant observation. In many urban centers, "predictive policing" merges with real-time camera networks and facial recognition databases, turning entire neighborhoods into zones of perpetual scrutiny. The American Civil Liberties Union has documented how facial recognition in Detroit and Baltimore has led directly to discrimination and false arrests. Questionable management and sharing of what should be private data has raised concerns on a global scale, as U.S. and Chinese firms lead the spread of these risky technologies into African and Latin American countries. Governments claim the goal is to build "safe cities," yet the technology is increasingly associated with digital authoritarianism. When privacy becomes a privilege, freedom becomes conditional.
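Why do dragnet face searches produce false arrests even when vendors advertise high accuracy? A back-of-envelope sketch, with invented numbers, shows the base-rate effect: when a system scans a huge population for a handful of real targets, false matches can vastly outnumber true ones.

```python
# Hypothetical base-rate sketch for a face-recognition dragnet.

def match_outcomes(population, true_targets, tpr, fpr):
    """tpr: chance a real target is flagged; fpr: chance an innocent person is."""
    true_hits = true_targets * tpr
    false_hits = (population - true_targets) * fpr
    return true_hits, false_hits

# Invented scenario: 1,000,000 faces scanned, 100 genuine targets,
# 99% hit rate, and a seemingly tiny 1% false-positive rate.
true_hits, false_hits = match_outcomes(1_000_000, 100, tpr=0.99, fpr=0.01)
precision = true_hits / (true_hits + false_hits)
print(f"flagged people who are actual targets: {precision:.1%}")  # about 1.0%
```

Roughly 99 real targets are flagged alongside nearly 10,000 innocent people, so about 99% of flags are wrong. When error rates are also higher for darker-skinned faces, the burden of those thousands of false matches falls unevenly.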
Disinformation as Modern Persecution
Digital persecution doesn’t always come from governments; sometimes it comes from viral lies. Disinformation campaigns have targeted activists, journalists, and entire communities, spreading racial fear, political division, and false narratives designed to erode trust. Researchers at the Stanford Internet Observatory and the Oxford Internet Institute have documented a pattern of disinformation, including bots and troll farms used to spread racial division in the U.S. during elections. The goal is psychological control: keep people angry, exhausted, and divided enough, and systemic change seems impossible.
Resistance in the Age of Algorithms
Despite the sophistication of these systems, resistance persists. Activists, technologists, and ethicists are fighting for algorithmic transparency, digital privacy, and equitable design. Community groups such as the Algorithmic Justice League (founded by Joy Buolamwini), the Stop LAPD Spying Coalition, and African digital rights collectives like Paradigm Initiative and the African Digital Rights Network are building local networks that reclaim data ownership and advocate for tech justice. Their work reminds us that oppression evolves, but so does liberation.
Seeing the Invisible
Modern persecution hides in plain sight: in dashboards, databases, and data centers. It thrives on our comfort with convenience and our faith in technology’s neutrality. But once we recognize that code is never neutral, we begin to see what’s really happening: a digital caste system designed to automate inequality.
To dismantle it, we must demand transparency, push for ethical regulation, and reclaim the human right to privacy as a cornerstone of freedom.
The modern chains are harder to see, but they still bind, and breaking them will require not just reform but a reimagining of what justice looks like in a world run by code.
@ 2025 International Fellowship of Black Americans and Africans ®









