As I explained in Keep Calm and Log On, the algorithms that shape our “smart” technologies, social media, and powerful corporate and government systems are kind of like recipes. For the most part, the companies that make them aren’t yet required to tell us what their ingredients are. And this means we don’t have a clear picture of whether some of those “ingredients” are harmful.
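To make the recipe analogy concrete, here is a minimal, made-up sketch (in Python) of the kind of scoring "recipe" a lender or employer might run behind the scenes. Nothing here comes from any real company's system; the inputs and weights are invented purely to show how a harmful "ingredient" can hide inside an algorithm nobody outside the company gets to inspect.

```python
# A made-up "recipe" for scoring loan applicants. The weights below are the
# hidden "ingredients": applicants never see them, and one of them (a zip code
# score, often a stand-in for race and income) quietly does most of the damage.

HIDDEN_INGREDIENTS = {
    "income": 0.4,          # plausible-sounding ingredient
    "years_employed": 0.2,  # plausible-sounding ingredient
    "zip_code_score": 0.4,  # the harmful one: penalizes whole neighborhoods
}

def score_applicant(applicant: dict) -> float:
    """Mix the ingredients into a single score. Only the score is ever shown."""
    return sum(weight * applicant[feature]
               for feature, weight in HIDDEN_INGREDIENTS.items())

# Two applicants with identical finances but different neighborhoods:
same_finances = {"income": 0.7, "years_employed": 0.8}
print(round(score_applicant({**same_finances, "zip_code_score": 0.9}), 2))  # 0.8
print(round(score_applicant({**same_finances, "zip_code_score": 0.2}), 2))  # 0.52
```

The people affected see only the final score. Without access to the weights, there's no way to tell that neighborhood, not creditworthiness, is driving the decision.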
But we don’t have to take this sitting down. Check out the following resources to learn more about algorithms and take action for your community.
One good starting point is Algorithmic Accountability: A Primer.
If you’d prefer a video, check out Cathy O’Neil talking about her book, Weapons of Math Destruction.
Check out these articles on how artificial intelligence might be keeping you from getting a job.
Get to know how algorithms are used to set bail in the legal system, and how that can harm defendants.
Follow Virginia Eubanks’s work on how flawed data and algorithms keep Americans from getting the services they’re entitled to.
Follow the work of the AI Now Institute.
If you work in the tech industry, get familiar with the Association for Computing Machinery’s guidelines for algorithmic transparency, and follow these guidelines to assess the social impact of the algorithms you create (there’s a rough sketch of one simple check at the end of this list).
The Our Data Bodies project developed exercises to help communities understand data profiling and how it may negatively impact them. You can run activities yourself based on their materials, or get in touch with them for assistance.
If you live in the UK, you can find out how political parties may be abusing your data in their campaigns and help the Open Rights Group understand this abuse better. You can also get involved in one of their local groups.
In the US, Senators Ron Wyden and Cory Booker introduced the Algorithmic Accountability Act. Follow its progress here; you can also use that page to contact your representatives about the bill.
In the UK and internationally, follow and support the work of Privacy International.
Follow and support the ACLU’s campaigns on privacy and technology, which are increasingly concerned with algorithms.
Follow and support the EFF’s work on artificial intelligence and machine learning.
Follow and support EPIC’s work on algorithmic transparency.
Speak out against bias in these systems by joining the Poor People’s Campaign.
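If you do build these systems, one concrete place to start, alongside the ACM guidelines mentioned above, is checking your algorithm’s outcomes for disparate impact. The sketch below (in Python) applies the well-known “four-fifths” rule of thumb: if one group’s selection rate falls below 80 percent of the highest group’s rate, the system deserves a much closer look. The group names and numbers are invented; treat this as a starting point, not a complete audit.

```python
from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, was_selected) pairs -> {group: selection rate}."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in decisions:
        total[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / total[g] for g in total}

def four_fifths_check(decisions, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the highest rate."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items() if rate / best < threshold}

# Invented example: hiring decisions tagged with an applicant group.
decisions = ([("group_a", True)] * 60 + [("group_a", False)] * 40
             + [("group_b", True)] * 30 + [("group_b", False)] * 70)
print(four_fifths_check(decisions))  # flags group_b at roughly half the top rate
```

Passing a check like this doesn’t make an algorithm fair, and failing it doesn’t prove intent; it’s simply the kind of question the transparency guidelines above push you to ask before a system goes live.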
For more, order a copy of Keep Calm and Log On.