Sara M. Watson: In Good Company
Data & Society / saved 2014-07-10
Summary:
I got pretty excited when people I admire and respect cited my recent articles about data science, Facebook, and the uncanny this week. Beyond the not-so-humble brag, I'm more excited by the growing chorus of voices calling for accountability and ethical approaches to our data and its uses. And I was even more excited to overhear older family members at a friend's wedding this past weekend discussing the Facebook study over breakfast. I think we're starting to get somewhere.
Om Malik, who has supported some of the most extensive industry coverage of data on GigaOM, wrote this week about Silicon Valley's collective responsibility to use its power wisely:
While many of the technologies will indeed make it easier for us to live in the future, what about the side effects and the impacts of these technologies on our society, its fabric, and the economy at large? It is rather irresponsible that we are not pushing back by asking tougher questions of companies that are likely to dominate our future, because if we don't, we will fail to have a proper public discourse, and will deserve the bleak future we fear the most... Silicon Valley and the companies that control the future need to step back and become self-accountable, and develop a moral imperative. My good friend and Stanford D.School professor Reilly Brennan points out that it is all about consumer trust. Read more.
And danah boyd, who is starting up the Data & Society Research Institute, summed up what we've learned from the Facebook emotional contagion study, echoing my point that it's not just about the Facebook study, it's about the data practices:
This paper provided ammunition for people’s anger because it’s so hard to talk about harm in the abstract... I’m glad this study has prompted an intense debate among scholars and the public, but I fear it’s turned into a simplistic attack on Facebook over this particular study, rather than a nuanced debate over how we create meaningful ethical oversight in research and practice. The lines between research and practice are always blurred and information companies like Facebook make this increasingly salient. No one benefits by drawing lines in the sand. We need to address the problem more holistically. And, in the meantime, we need to hold companies accountable for how they manipulate people across the board, regardless of whether or not it’s couched as research. If we focus too much on this study, we’ll lose track of the broader issues at stake. Read more.
Both are great reads, and align with a lot of the things I've been exploring in my own work. I'm honored to be in such good company.