image used courtesy of Nesta
Hi everyone,
Yesterday I participated in “Shaping the Future of Digital Social Innovation in Europe”, a conference organised by Nesta at the European Commission in Brussels. This post is an attempt to share the main discussions that connected different areas at the intersection of technology and social innovation. It is not exhaustive, but it focuses on some key issues which social innovators, entrepreneurs, hacktivists, cultural practitioners and policymakers are currently tackling in isolation, but which require more aligned efforts.
If you want to do background reading, do check out:
1. Nesta’s big report on Europe’s digital social innovation ecosystem, covering four technology trends: open hardware, open data, open knowledge and open networks. Pay special attention to the policy recommendations, as they were endorsed by several policymakers including Markkula and Campolargo: http://www.nesta.org.uk/publications/growing-digital-social-innovation-ecosystem-europe
2. An article on key issues around data entitlement in the internet of things: https://hbr.org/2015/02/managing-privacy-in-the-internet-of-things
3. The Manifesto for smart citizens: http://waag.org/nl/blog/manifesto-smart-citizens
What was not discussed, but should have been
My own observation is that an important component missing from the discussion is the overlap between the economic viability of digital social innovation initiatives and the ownership of physical assets. James mentions ownership of the means of production, distribution and sales in this comment on the limitations of timebanking and other p2p collaboration initiatives. But there are also the broader issues of permanently affordable housing, food security, etc. that affect many.
Another topic raised in the Twitter feeds was that we need to have a discussion about privacy and security infrastructure as public goods. Why is this relevant? As an illustrative example, you may want to read this story about Werner Koch and think about the implications for a minute.
What was discussed
A key discussion revolved around policymakers and others placing emphasis on technological innovation rather than focusing on technological arbitrage. Firstly, a purely engineering focus leads to the self-indulgent idea that the act of making is sufficient in itself. It also contributes to the misidentification of problems through the fetishisation of data and technology as “optimisation”. But even if we restrict the discussion to one about technology, there is a huge political question that is not being addressed, but has to be.
“Algorithms are instructions. Not ‘acts of God’. Algorithms are subjective. What are we doing about this to democratize debates?” - Renata Avila
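To make this point concrete, here is a minimal, invented sketch (all names, indicators and weights are made up for illustration, not taken from any real system) of how a seemingly technical scoring function embeds value judgments:

```python
# Hypothetical sketch: a "neutral" algorithm for prioritising funding applications.
# Every weight below is a value judgement, not a fact of nature: changing it
# changes who gets funded. The numbers are invented for illustration only.

APPLICANTS = [
    {"name": "A", "revenue": 90_000, "volunteers": 3,  "reach": 500},
    {"name": "B", "revenue": 5_000,  "volunteers": 40, "reach": 200},
]

# Choice 1: what counts at all. Choice 2: how much each thing counts.
WEIGHTS = {"revenue": 0.7, "volunteers": 0.2, "reach": 0.1}

def score(applicant: dict) -> float:
    """Weighted sum of normalised indicators: subjective choices all the way down."""
    return (
        WEIGHTS["revenue"] * applicant["revenue"] / 100_000
        + WEIGHTS["volunteers"] * applicant["volunteers"] / 50
        + WEIGHTS["reach"] * applicant["reach"] / 1_000
    )

if __name__ == "__main__":
    for a in sorted(APPLICANTS, key=score, reverse=True):
        print(a["name"], round(score(a), 3))
    # With revenue-heavy weights, A ranks first; weight volunteers instead and B wins.
```

The code runs either way; which outcome is “right” is a political and philosophical question, not an engineering one.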
As Caspar Bowden points out, when it comes to digital social innovation the European Commission assumes that Big Data and privacy are compatible, but recent research shows that preserving the utility of data sacrifices its anonymity. If you reduce everything to information rather than to social relations (as Google does), you end up in a very undemocratic place. One reason is a key point made by Marleen Stikker: data, and the algorithms that define how those data are processed, are not neutral. They are based on subjective interpretations of reality and embed values that need to be debated on a philosophical level rather than a technical one alone. There are already indications that this view is paving the way for the erosion of civil liberties.
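A toy illustration of the utility-versus-anonymity trade-off (all records below are invented): stripping names is not anonymisation, because the very attributes that make a dataset useful are often enough to re-identify people when linked with another source.

```python
# Toy linkage attack (all data invented). A "de-identified" health dataset keeps
# age, postcode and gender because that is what makes it useful for research,
# and exactly those quasi-identifiers let it be joined back to a public register.

deidentified_health = [
    {"age": 34, "postcode": "1000", "gender": "F", "diagnosis": "diabetes"},
    {"age": 61, "postcode": "1050", "gender": "M", "diagnosis": "asthma"},
]

public_register = [
    {"name": "An Peeters", "age": 34, "postcode": "1000", "gender": "F"},
    {"name": "Jan Claes",  "age": 61, "postcode": "1050", "gender": "M"},
]

def reidentify(health_rows, register_rows):
    """Join the two datasets on the quasi-identifiers they share."""
    matches = []
    for h in health_rows:
        for r in register_rows:
            if (h["age"], h["postcode"], h["gender"]) == (r["age"], r["postcode"], r["gender"]):
                matches.append((r["name"], h["diagnosis"]))
    return matches

print(reidentify(deidentified_health, public_register))
# [('An Peeters', 'diabetes'), ('Jan Claes', 'asthma')]
# Removing the quasi-identifiers would restore anonymity, but it would also
# destroy most of the dataset's research utility: that is the trade-off.
```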
‘Europe needs a moral & intellectual re-think on if it is right to consider data a commodity’ - Evgeny Morozov
Usman Haque made the point that the key issue we need to address in the internet of things is not discoverability (as it was for the web), but entitlement. How do citizens get involved as producers of data or of entitlement structures? A person is the moral owner of their data. How do we get citizens to decide which are the meaningful decisions to be made? Especially when you really do need digital literacy to understand the effect of algorithms on your future.
In Edgeryders we are discussing this issue from a different angle: “a massive case of market failure occurs around personal data, where people simply cannot understand the value of their personal data, and the way it aggregates to become incredibly powerful.”
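A small, invented sketch of what “aggregates to become incredibly powerful” can mean in practice: individually mundane data points, combined over time, reveal something sensitive.

```python
# Invented example: location pings that look trivial one by one, but together
# reveal where someone lives. This value is largely invisible to the person
# generating the data point by data point.

from collections import Counter

pings = [
    # (hour_of_day, approximate_location)
    (2, "Rue Exemple 12"), (3, "Rue Exemple 12"), (4, "Rue Exemple 12"),
    (9, "Office Park 4"),  (14, "Office Park 4"), (23, "Rue Exemple 12"),
]

# Whoever aggregates the pings can infer a home address: it is simply the
# place where the device most often sits in the middle of the night.
night_locations = Counter(loc for hour, loc in pings if hour < 6 or hour >= 22)
probable_home, _ = night_locations.most_common(1)[0]
print("Probable home address:", probable_home)  # Rue Exemple 12
```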
One approach proposed was to measure the costs of the disenfranchisement that comes with solutions made for us by something or someone other than ourselves. The reasoning is that if something is not measured, it doesn’t get factored into institutional decision-making.
The big takeaway was that we have to look critically at the gatekeepers of the internet, as big (mostly American) data monopolies are taking over all kinds of services and locking out smaller actors in social innovation.
“Look around you - who is capturing and prepping to capture all the data points? #digiSI needs to go beneath surface of service” @cased
“As government services are outsourced to Silicon Valley, how can smaller, local SMEs build services on the data?”
“The major actors can offer for nothing a service or product it took smaller actors a lot of investment to develop” (David Cuartielles, Arduino).
Addressing this requires us to better equip institutions with the appropriate competences, as the lack of smart laws leaves vacuums that are filled by the rules of technology giants. It also requires us to get behind an agenda of decentralisation, public investment in distributed architectures, and putting fair and open technologies at the heart of public procurement.
P.S. If you found this post useful and would like more, let me know so I feel motivated to produce and share more of them. Write to nadia@edgeryders.eu