Future Session #7: Exclusion by Technology

On September 25th, 2018, Hivos, HumanityX and The Spindle hosted Future Session #7 at the Humanity Hub, on the theme of exclusion by technology. Tin Geber, Hivos's new Social Innovation Expert, guided this inspiring session.

Technology & You

Tin started off by showing that we are exposed to an enormous amount of information nowadays, and that this will only increase in the future. Information exchange, which he called an 'evolutionary trait', has reached sky-high levels, and intermediaries use algorithms to filter and curate information for us. To illustrate this incredible rise, Tin showed the ratio between the minutes of available content and the minutes of our time: it grew from 82:1 in 1960 to 884:1 in 2005. After 2005, YouTube, Facebook and other social media channels became increasingly popular, boosting information exchange even further.

Algorithms are used to sort this information. However, they are hidden (they are trade secrets and black boxes): no one actually knows how they work or who made them, which makes them opaque and potentially dangerous. An algorithm is never neutral; it always has a purpose. It is owned by someone, who uses it for their own particular benefit or reasons. Algorithms can therefore amplify bias, prejudice and inequality. A 'myth of efficiency' and a 'myth of neutrality' surround algorithms.
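To make this concrete, here is a small illustrative sketch (not from the session itself): a toy ranking algorithm that sorts items by engagement. On its face it is "neutral", yet because top-ranked items receive more exposure and therefore more clicks, a tiny initial gap grows into a large one. All names and numbers below are invented for illustration.

```python
# Toy example of how a "neutral" engagement-ranking algorithm can
# amplify an existing bias through a feedback loop. Illustrative only.

def rank_by_engagement(items):
    """Sort items by click count, descending -- seemingly neutral."""
    return sorted(items, key=lambda item: item["clicks"], reverse=True)

def simulate_feedback(items, rounds=5, top_slots=2):
    """Each round, only the top-ranked items receive new clicks:
    exposure begets engagement."""
    for _ in range(rounds):
        ranked = rank_by_engagement(items)
        for item in ranked[:top_slots]:
            item["clicks"] += 10
    return rank_by_engagement(items)

# Hypothetical starting data: the 'minority voice' is barely behind.
items = [
    {"name": "majority voice A", "clicks": 12},
    {"name": "majority voice B", "clicks": 10},
    {"name": "minority voice", "clicks": 9},
]

final = simulate_feedback(items)
# The minority voice never reaches a top slot, so it never gains clicks:
# the algorithm turned a 1-click gap into a 51-click gap.
```

The point is not the code but the feedback loop: no step in it is malicious, yet the outcome systematically excludes whoever started behind.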

Technology & Development

Tin used the terms 'technocolonialism' and, subsequently, 'technosolutionism' to describe the technological era and environment in which we live today. We discussed three development challenges within technocolonialism:
1) Recipient–provider dichotomy:
Projects, technology and the like are (often) created by the provider, and the direction and flow of means is towards the recipient. The provider defines the recipients.
2) Defining identities:
The provider defines who gets what, why, and how much; this is provider-driven.
3) Digital simulacra/avatars:
The provider defines 'ideal' recipients, who are in this way reduced to a 'role' in spreadsheets. Context is lost in collecting the (big) data, while new complexity is gained from being in 'the system'. Implicit information is lost through aggregation.
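The last point, losing implicit information through aggregation, can be sketched in a few lines (an invented example, not from the session): once individual records are collapsed into a summary figure, the context each record carried is gone.

```python
# Illustrative sketch of information loss through aggregation.
# Households and notes are hypothetical.

records = [
    {"household": "A", "income": 100, "note": "seasonal farm work only"},
    {"household": "B", "income": 100, "note": "stable salaried job"},
    {"household": "C", "income": 100, "note": "income shared by ten people"},
]

# The aggregate that ends up in 'the system':
average_income = sum(r["income"] for r in records) / len(records)

# In the aggregate, all three households look identical (average = 100),
# yet the notes describe very different realities. That implicit
# information never survives the spreadsheet.
```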

Online Tools & Information

At the end of the session, we looked at several websites containing tools and information on the responsible use of data, technology, online censorship and more. Furthermore, we explored some of the places where one can read true stories about 'exclusion by technology'. Several of these tools and books were (co-)developed by Tin himself. To read more about Tin and his work, see https://tin.fyi/about/. Tin is happy to answer any questions you may still have. The following link will direct you to his presentation: http://www.tiny.cc/techexcludes

Below you can find the links of the websites visited during this session:

https://www.theengineroom.org/
http://responsibledata.io/resources/handbook/ (free downloads)
https://responsibledata.io/
https://tingeber.gitbooks.io/the-new-gatekeepers/content/ (free downloads)
https://decoders.amnesty.org/
https://onlinecensorship.org/
https://databasic.io/en/
http://www.makingallvoicescount.org/
https://alidade.tech/
https://5stardata.info/en/
https://atlas.oxfamnovib.nl/#/
