This week I have been quite busy, so I decided to watch another TED talk. It’s a quick and easy way to find interesting topics and take them in.
This TED talk was from techno-sociologist Zeynep Tufekci, and was all about the data we give away to big online corporations such as Facebook, Google and Amazon.
Tufekci discussed how companies such as Facebook use algorithms to make us think and view things a certain way. For example, most people are aware that if you are browsing a website for a new dress, that dress will then follow you around on Facebook and other websites in the form of targeted ads.
The interesting thing that Tufekci was warning about is the power that algorithms hold. If you are constantly feeding a company data about yourself — everything from your profile information to the conversations you have over the platform — they can really build up a picture of the kind of person you are.
One example she gave truly struck me: algorithms can detect when somebody with bipolar disorder is about to enter a manic phase (when they are likely to act recklessly) and target that person with adverts for plane tickets.
Now, I feel as though we are living in an interesting time in terms of human rights online. Obviously, when the main human rights declarations were written, the internet didn’t exist. Tufekci made the point that the people developing algorithms that can discover even the most nuanced parts of your personality may have the best intentions. However, we cannot expect everyone, be it big business or even the state, not to exploit this information at some point.
She spoke a lot about an experiment Facebook did which involved people declaring that they had voted in the US election. It was discovered that people were far more likely to vote if the little prompt asking them to declare this suggested that their friends had also done so. Could states exploit information about us in order to influence our voting decisions in the future?
Surely the power behind algorithms could be used to break Article 2 of the Universal Declaration of Human Rights?
‘No distinction shall be made on the basis of the political, jurisdictional or international status of the country or territory to which the person belongs, whether it be independent, trust, non-self-governing or under any other limitation of sovereignty.’
If states could use algorithms to determine what kind of voter someone is, surely they could then target that person with ads to sway their view? It’s all so interesting!