With the Tracking Exposed project, we want to analyze and discuss personalization algorithms.
With Tracking Exposed, we focus on transparency in the realm of social media by developing technologies. We face three interrelated questions: 1) How do social media algorithms concretely shape the experience of users? 2) How can users be given tools to monitor their user experience (UX) and reflect on the ‘information diet’ that algorithms fabricate for them? 3) How can users be put in control of the data transactions involved in the process? Meanwhile, the Feminist Internet is a fascinating protest movement emerging from traditional feminism, and it points toward an inclusive, harm-reducing, aware, and human-centered development of technology. We want to merge these forces by bringing transparency into the realm of personalization algorithms. Companies actively play with our perceptions by segmenting us and by imposing themselves on our choices; we want to take agency back by empowering the network peripheries.
Analyzing and criticizing social network technologies might raise some awareness, but it does not by itself offer solutions. A solution would be a mixture of policy changes on platforms, new technology development, enforcement of current legal frameworks, political relevance, and education. Given our skills, we started by developing technology to enable independent analyses. We then run algorithm accountability experiments with communities and research teams, and publish articles, software, and data. We want to work in this direction with a global community, where groups can analyze how algorithms affect them, redefine quality, measure impact, integrate new platforms, confront companies, and demand their rights. We propose to start with practical experiments that are 1) explained and 2) defined together. Through data collection and analysis, we might produce educational material, redefine the quality of the online experience, and find out whether feminist algorithms can exist.