luxzia0

Feminist Search (Code Critique)

Originally published on the Critical Code Studies Working Group site in early 2020. Co-authored with Christine Meinders, Sarah Ciston, and Catherine Griffiths. The technical work is purely my own.


A few notes from almost a year later:

1) I chose a really simple model for explainability, since that was part of the idea: make AI that can be explained line by line, rather than using a more sophisticated deep learning model (which might be the choice for production, but not for the point here). A sketch of what this can look like follows these notes.

2) I came up with the term "Feminist Search" in early 2020, inspired by a conversation and by Safiya Umoja Noble's Algorithms of Oppression, which I highly recommend.

3) Our approach here was early days. My own ideas on what this could be have changed, but we started with a simple binary, safe versus dangerous, as labels for image collection.
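
As a minimal sketch of what "explainable line by line" can mean in practice, here is a shallow decision tree whose learned rules print as plain text. It assumes scikit-learn, the feature names are invented for illustration, and it is not the original Feminist Search code:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical hand-labeled features for donated images; the names are
# invented for illustration so each split in the tree reads plainly.
feature_names = ["limb_overhang", "powerline_proximity"]
X = np.array([[0.8, 0.9], [0.7, 0.8], [0.1, 0.2], [0.2, 0.1]])
y = np.array(["dangerous", "dangerous", "safe", "safe"])

model = DecisionTreeClassifier(max_depth=2).fit(X, y)

# export_text prints the learned rules, so the model can be read and
# questioned line by line rather than treated as a black box.
print(export_text(model, feature_names=feature_names))
```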



This tree was a dangerous one near my house in San Francisco. Its limbs kept falling onto power lines and cars, and nearly onto nearby residents.


Approaches to Co-Creation


In this example, community-sourced data can be traced both visually and in code, and can be used to inform the very model used to process this information. Rather than simply coding, we incorporate the prototyping process into the code from a critical perspective. This process is guided by the Cultural AI Design Tool, which refocuses the design process so that questions of creator, data origin, and rule-creation are centered rather than marginally examined or ignored. Using these questions as a basis for this particular critical code context, contributors are credited, while the prototype stays open for co-creation and reformulation by the community.


Modeling Binaries


There are several pieces that contribute to Feminist Search: personal data donation, interface design, and the use of binaries in data collection and model creation.   


The Feminist Search project explores what is safe and what is dangerous. Binary notions of safety and danger are just the starting point. Within the last five years, dangerous rhetoric has once again become socially acceptable, with a corresponding rise in violent acts globally. Beyond this, the pressures of misogyny, racism, and other forms of bigotry force individuals and communities into a constant awareness of what they must do to keep themselves safe. What makes people feel safe? Safety can be categorized in different ways, such as physical, emotional, and professional safety.


These binary definitions can be expanded by examining the grey spaces through the questions in the personal data donation. By having people discuss what safety means to them, and the semantics of this term and related concepts, models can be built that reflect these spectrums. This allows for exciting design and technical challenges but, more importantly, for creating technology that serves the people who contribute their data. Feminist Search explores the challenges of search from a community perspective, with the goal of reflecting the shared data of communities in Los Angeles and San Francisco.


It is worth highlighting that computation is fundamentally binary, as are labels in machine learning; the data donation portion of Feminist Search uses the labels safe and dangerous. However, the goal is to move beyond a true/false dichotomy, because truth value is subjective, particularly in categorizations of feelings and sentiments.
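
As a concrete illustration of moving past that hard dichotomy, a classifier can report a probability for each label instead of a bare true/false. A minimal sketch, assuming scikit-learn and hypothetical numeric features (not the project's actual code):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature vectors for donated images, labeled by contributors.
X = np.array([[0.9, 0.1], [0.8, 0.3], [0.2, 0.7], [0.1, 0.9]])
y = np.array([0, 0, 1, 1])  # 0 = "safe", 1 = "dangerous"

model = LogisticRegression().fit(X, y)

# predict_proba keeps the grey space visible: a score near 0.5 signals a
# contested or ambiguous image rather than forcing a true/false answer.
print(model.predict_proba(np.array([[0.5, 0.5]])))
```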


For those who are not familiar with the details of machine learning: fundamentally, machine learning works over mathematical representations of geometric spaces that have distance functions as part of their definition. In defining classes geometrically, there will be a division between classes in an n-dimensional space (as with the linear decision boundary of a linear classifier), or perhaps something such as a centroid in a clustering algorithm that is the most representative point of a cluster. Predictions as to the class or type of an image depend on the item's geometric location in the vector space.
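
A minimal sketch of that geometric picture, using a nearest-centroid rule with Euclidean distance (the points and labels are hypothetical, not the project's data):

```python
import numpy as np

# Hypothetical points for each label in a 2-dimensional feature space.
safe_points = np.array([[1.0, 1.2], [0.8, 1.0], [1.1, 0.9]])
dangerous_points = np.array([[3.0, 3.1], [2.8, 3.3], [3.2, 2.9]])

# Each class is summarized by its centroid, the most representative point.
centroids = {
    "safe": safe_points.mean(axis=0),
    "dangerous": dangerous_points.mean(axis=0),
}

def predict(point):
    # The prediction depends only on where the point sits in the space:
    # it takes the label of the nearest centroid under Euclidean distance.
    return min(centroids, key=lambda label: np.linalg.norm(point - centroids[label]))

print(predict(np.array([1.0, 1.1])))  # -> "safe"
```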


The interesting problems in data science and machine learning aren't in churning out mathematically good predictions, however. The outcome of an algorithm is only as good as the data given to it and how the person(s) constructing it use that data in creating a model. In constructing models, outliers are often thrown out or drowned out in the majority vote of the more "normal" data points. This leads to models in which a literal tyranny of the majority can occur, since the majority of opinions carry more statistical weight, instead of all the data being treated equally.
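
One common counter, sketched below with scikit-learn's class_weight option (a hypothetical example, not the project's code), is to reweight the data so that a minority of opinions is not simply outvoted:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical donations: only two contributors labeled images "dangerous".
X = np.array([[0.10], [0.20], [0.15], [0.25], [0.30], [0.20], [0.90], [0.95]])
y = np.array([0, 0, 0, 0, 0, 0, 1, 1])  # the minority label is easily outvoted

# class_weight="balanced" scales each label inversely to its frequency,
# giving the minority of opinions equal statistical voice instead of
# letting the majority drown them out.
unweighted = LogisticRegression().fit(X, y)
balanced = LogisticRegression(class_weight="balanced").fit(X, y)

point = np.array([[0.60]])
print(unweighted.predict_proba(point), balanced.predict_proba(point))
```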


In this approach, the simple act of search can be used to understand the binary decisions that form a model, and users can donate information that makes visible who is contributing to search and data collection. This is the central starting point: it prioritizes visualization and creates a space in which to develop a community search engine. In Feminist Search, communities create and provide contexts for evaluation, with the goal of sharing these decisions, along with donated personal data and the "why" in the search results.
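
A minimal sketch of what traceable donation could look like, where each labeled example keeps its contributor and the "why" that can be surfaced with a result (the field names and values are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class Donation:
    image_path: str   # the donated image
    label: str        # "safe" or "dangerous"
    contributor: str  # credited, in keeping with the co-creation approach
    reason: str       # the "why" that can be shown alongside a result

donations = [
    Donation(
        image_path="tree.jpg",
        label="dangerous",
        contributor="example_contributor",
        reason="Limbs kept falling onto power lines and cars.",
    ),
]

# A search result can then surface who labeled the image and why.
for d in donations:
    print(f"{d.image_path}: {d.label} (per {d.contributor}: {d.reason})")
```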


An additional goal of Feminist Search is to highlight thoughtful data donation and model weighting processes, while also showing how search is used, thus incorporating Feminist.AI approaches: exploring the act of search with embodied, multi-sensory (movement, sound, and image) methods through critical prototyping. Feminist Search is a way to solidify and continually honor the work of feminist communities.


Here is the code for Feminist Search (Thompson, 2020, Python):



