Resilience principles for ER cities

This is a list of principles to inform the different applications from ER cities. They are intentionally clear-cut and may need to be nuanced. The group is asked to

  • validate them for inclusion
  • if validated, provide references that anchor them in the resilience debate.

The purpose of the exercise is to inform applications that are cutting edge while communicating mastery of the contemporary debate on societal resilience.

  1. The goal of any investment in resilience is survival for the greatest number (as opposed to "sustainability").
  2. Fat-tailed probability distributions imply risk can't be accurately predicted. (Black Swan argument).
  3. Greatly increased societal connectivity implies most probability distributions are fat-tailed (the probability of any system somewhere failing is no longer independent of the probability of other systems failing elsewhere).
  4. Consequently, we focus on the direst scenarios.
  5. Man-made (societal and economic) threats carry potentially greater impact than natural ones and deserve attention.
  6. Planning is not preparedness, so building up resilience entails spreading awareness of threats and skilling up the populace.
  7. Resilience implies redundancy implies decentralization.

1 is rejected. Applications must accept the Rockefeller Foundation definition. However, it is OK to mention priorities (first, protecting lives; second, fighting and preventing disease; third, protecting assets and property).

2 is accepted. Analysis is clearly central – especially analysis of vulnerabilities (example: in Finland water shortages are unlikely to be a problem; in the Canary Islands, maintaining a transportation network can be difficult when circumstances are bad). Ranking risks is also very useful. But fat-tailed distributions mean there is little to gain in trying to estimate risks in terms of probability.
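The practical consequence of fat tails can be sketched numerically. The toy simulation below is illustrative only (the distributions and parameters are invented for the example): an average loss estimated from thin-tailed data stabilizes quickly, while the fat-tailed estimate is dominated by rare huge draws and keeps jumping around even with large samples.

```python
import random

random.seed(42)

def sample_mean(draw, n):
    """Average of n independent draws from the given sampler."""
    return sum(draw() for _ in range(n)) / n

# Thin-tailed baseline: exponential losses with true mean 1.0.
thin = lambda: random.expovariate(1.0)

# Fat-tailed losses: Pareto with shape alpha = 1.1. The true mean exists
# (alpha / (alpha - 1) = 11) but the sample mean converges extremely slowly,
# because a single extreme draw can dominate the whole average.
fat = lambda: random.paretovariate(1.1)

for n in (1_000, 10_000, 100_000):
    print(f"n={n:>7}: thin-tailed mean ~ {sample_mean(thin, n):6.2f}, "
          f"fat-tailed mean ~ {sample_mean(fat, n):8.2f}")
```

Running this a few times with different seeds makes the point: the thin-tailed column barely moves, the fat-tailed one does — which is why ranking vulnerabilities is more useful than quoting probability estimates.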

3 is accepted. Interdependencies are a thorny problem in the resilience debate, because authorities build resilience around organizational structures, and that almost always means silos. Example: Sweden is the only country to have formally addressed the systemic risk associated with antibiotics usage, though everyone knows it is there. So you get people in charge of responding to earthquakes who will tend to think that a major economic collapse or a fuel shortage is someone else's problem. Events that are difficult to classify may result in slow or indecisive action. We make the case that this is a mistake, and try to factor interdependencies into the equation.

4 is accepted, and it follows from 3. Point risks (e.g. earthquakes) and systemic risks (e.g. flu, financial collapse) need to be kept separate. Point events can be very acute, but they don't spread, because of stochastic independence (the probability of an earthquake here is independent of an earthquake happening over there; the joint probability is the product of the individual probabilities, hence essentially zero). Systemic risks attack the whole area or system at the same time, so nobody can come to help. Some systemic risks are not dire (e.g. a fuel shortage); we take into account their full range, not just collapse scenarios (consistent with the rejection of 1).
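The "essentially zero" arithmetic can be made concrete with made-up numbers (the probabilities below are purely illustrative, not estimates for any real city):

```python
# Point risks: independent events in different places.
# Illustrative annual probability of a severe earthquake in each of two
# distant cities:
p_here, p_there = 0.01, 0.01

# The probability that BOTH are hit in the same year is the product of the
# two - two orders of magnitude smaller, so outside help is almost always
# available after a point event.
p_both = p_here * p_there   # ~1e-4

# A systemic risk (pandemic, financial collapse) hits the whole system as a
# single event, so "everyone hit at once" carries the full single-event
# probability:
p_systemic = 0.01           # 100x more likely than both point events together

print(f"both point events: {p_both:.4%}, one systemic event: {p_systemic:.2%}")
```

This is why the two classes of risk call for different planning: mutual aid works for point events, while systemic events require the area to cope on its own.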

5 is accepted, and it follows from 3 and 4.

6 is accepted, and it follows from 3. In turn, it implies 7 – knowledge is duplicated and taught redundantly to as many people as possible. It implies that policy looks like this:

  • skill up the populace
  • build a core of experts
  • build fast ways to teach newbies skills to deploy when bad stuff happens (this would naturally include teaching newbies from other cities, thus growing the network, once a foothold is gained in the initial city)
  • teach concepts like SCIM to enable the local people to have a conversation about resilience and contingencies. Don't keep it just to the experts!
  • incorporate the knowledge in evolving artifacts (local Appropedias as deliverables). 
  • incorporate the artifact in keeper institutions like the unMonastery to prevent degradation of the signal. A lively, ongoing local debate on resilience could go a long way toward reducing signal degradation – think StackExchange, in which the knowledge is not written down in an artifact but embodied in a community of people that, collectively, know all the answers. Note: historical monasteries played a role as keepers of knowledge – manuscripts in libraries, gene banks in the herb gardens.
  • invent and deploy communication networks (not just digital) that stay up. However, cast unMonasteries as labs or hubs.
  • foster hacker culture.
  • mention the involvement of young people in the whole scheme.