Today’s data industry is built on social site scraping, API access, sometimes-questionable data downloads and don’t-worry promises that “it’s anonymized.” Much of that same grab-first-ask-questions-later mentality appears to guide the use of data in academic research settings just as it does in the corporate world. But there’s a growing movement to change how corporations and academics gather and employ data for research.
When mainstream media outlets including NBC spotlight the use of images from photo-sharing platforms in facial recognition training data, and phrases such as “ethically-sourced data” pop up in industry marketing language, the time is ripe to take a look at the traditional research ethics review process. Scandals such as the one that erupted from Google’s short-lived ethics board attempt, and the failure of Boeing’s ethics board to address 737 Max software and pilot training flaws, indicate that current approaches to corporate ethics reviews in particular need serious scrutiny.
Axon, best known for making Taser guns, launched its ethics board in 2018 and it’s still a work in progress, said Mike Wagers, VP of Axon Ecosystem. “You have to understand you will make mistakes and not get everything right,” he told RedTail’s Kate Kaye for a story published in IAPP’s Privacy Tech.
Read the IAPP story for detail on Axon’s ethics board approach and projects intended to update corporate and academic data research ethics review processes.
What needs to change? It’s not just about better data privacy and security protections. The Future of Privacy Forum and others, for instance, want more accountability and equity in data gathering and use, including consideration of impacts on marginalized communities.
“The traditional privacy rules and laws and debates don’t really take into account concerns that are really driven by imbalances in power,” Jules Polonetsky, Future of Privacy Forum CEO, said in the IAPP story.
So much of the data floating around the massive information economy ecosystem reflects personal attributes and potentially identifiable information. For Jessica Vitak, assistant professor at the College of Information Studies at the University of Maryland, that demands a new ethics review process for the academic setting, one that goes above and beyond the Institutional Review Board (IRB) approach.
Vitak is a principal investigator in the National Science Foundation-funded PERVADE project, which examines ethics approaches related to pervasive data: data containing rich personal information that is generated through digital interaction and accessible for computational analysis.
Legislators are stepping in, too. In April, Sens. Mark Warner (D-VA) and Deb Fischer (R-NE) introduced the DETOUR Act, which would require large digital firms to establish an IRB for any behavioral or psychological research conducted with user data.
Learn more about these efforts to update an obsolete ethics review process for corporate and academic research in the IAPP story.