The negative effects of tech are usually what I think about first:
- be it creating a government-sponsored internet as a public good (but what about China and Russia?)
- developing smart cities (but what about authoritarian regimes?)
- building decentralized messaging apps (but what about hate speech, child predators, and terrorism?)
- or how well-intentioned research on facial recognition is being used by bad actors, and what should be done about it.
But sometimes the tech is either used as intended, as the rise of mesh networks during the Hong Kong protests shows, or it finds a purpose beyond its design: TikTok, for instance, isn't only home to snappy, short videos cut to pop music.
For this report, my colleague Isobel Cockerell at Coda Story spent months in the weird corners of the internet and talked with Uyghurs in exile who scour Chinese TikTok for clues about what is happening in Xinjiang.
A little bit of backstory: Xinjiang, China, is an information vacuum. With reports of a million Uyghurs in detention, plus censorship, surveillance, and a blackout on outside communication, it's hard to know what's happening there. But some TikTok users have started using the app to share visual clues about life under surveillance in Xinjiang. That's why international Uyghur activists are digging deep into TikTok and similar apps. They've found videos showing China's ongoing destruction of Uyghur and other Muslim architecture, checkpoints with long lines of Uyghurs waiting to pass through, and clips of people crying in front of pictures of their relatives.

Douyin, the Chinese version of TikTok, sits behind China's firewall, so to gain entry the activists have to use Chinese phones. Searching isn't an option, because location-based search results are cleaned up by moderators. Instead, to surface compromising content from Xinjiang, the activists "game" the app's recommendation algorithm, which serves them content according to how they respond to each video. Still, videos make it through; the activists download them and reshare them on other social media platforms like Twitter and Facebook so they don't disappear. Read the full thread here.
TikTok obviously wasn't made for sharing evidence of human rights violations, but for some it has become the only tool to let the world know what's happening.
Do you have any other examples of a “positive” unintended use of an app or website?