Ok, so we would need an additional role/work stream:
- yes to storytellers
- but we also need network weavers/partnership builders who go out into the world and partner the initiative with others doing good work, e.g. Data & Society
- to this end we can probably recruit a couple of people who are already working at the EC aggregating information, e.g. Karel Van Der Putte at DG GROW.
Here is a newsletter from Data & Society, who are actually calling for partnerships around this issue:
APRIL 1, 2020
This week we’ve posted new work on health misinformation and shaped broader conversations on how COVID-19 is affecting the intersection of technology and society, from humans and AI to the protection and valuation of human lives amid accelerated supply chains.
For our community, we gathered better practices for ethical reporting on pandemics, and organized virtual power hours to support holistic digital community security while we’re all hyper-online. We’ve also taken pause to take care of each other, and ourselves.
Spot an opportunity for collaboration? Reach out to us at info[at]datasociety.net. The Data & Society team is here to help break down disciplinary silos, illuminate truths, and connect ideas.
AROUND THE INSTITUTE
- Who Benefits from Health Misinformation?
“A public too fragmented to collectively trust health experts can’t hold an administration accountable for its lies. The grifters and snake oil salesmen are profiting now, but the uncertainty sowed today paves way for an oppressive power to take advantage of a fragmented society much more vulnerable to misinformation in the future.” — Data & Society Affiliate Erin McAweeney, Points
- RSVP: NEW INC x Data & Society: Coping Through Precarious Work
On Wednesday, April 8 at 5 p.m. ET / 2 p.m. PT, we’re teaming up with NEW INC for a virtual discussion about how creative practices evolve during precarious times. NEW INC members will give lightning talks about their projects at NEW INC and attendees will be invited to join the artists in a discussion about work, precarity, and art-making. Featuring members: Foreign Objects, Heidi Boisvert, Mark Ramos, and Ziyang Wu. RSVP here.
- After Supply Chain Capitalism
“The illusion of a contractor’s independence as plausible deniability and the right to be exploited by someone further up the chain needs to be undone. The insistence that dignity and life aren’t as essential as efficiency and market performance needs to be undermined with a social safety net that protects all people, including and especially workers most at risk, whether faced with a pandemic or simply faced with run-of-the-mill cruelties of capitalism.” — Data & Society Affiliate Ingrid Burrington, Points
“This is the moment for combining public health best practice with empirical research on misinformation, and to explore new ways to create public health messaging that is compelling, persuasive, and effective. It is also a moment of unprecedented mismanagement and misinformation being disseminated from the highest levels of government, and the problem requires a radically different approach to save lives and minimize harm.” — Data & Society Newsroom Outreach Lead Smitha Khorana, Points
Bonus: Khorana also put together this guide for journalists on ethics and best practices for reporting on COVID-19.
- Bored Techies Being Casually Racist: Race as Algorithm
“Race-as-algorithm in the present day is tied to the long history of creating migrant casual workers in colonial and later periods where casual labor was used to replace slave labor on colonial plantations in the British Empire, and as quick labor to reconstruct bombed-out German cities through the guest worker program. Focusing on the historical relationship between casual labor and racialization shows that firms that value race as a source of creative vitality remain complicit in racism against Indian tech workers within and beyond their walls.” — Data & Society Director of Research Sareeta Amrute, Science, Technology, & Human Values
- As misinformation surges, coronavirus poses AI challenge
“‘It’s very hard to use automation when you’re in an information environment where the information is always incomplete and changing,’ Robyn Caplan, a researcher at the nonprofit Data & Society, told The Hill, about how shifting to automated content moderation may make the problem of misinformation worse.” — Chris Mills Rodrigo, The Hill
Ping @noemi re: the points above on work. Do you want us to get in touch with Data & Society? If yes, maybe @thornet can connect us? I seem to recall one of the foundation’s fellows was working with them at some point.