Let’s talk about sex and data

Synopsis, 1st draft

My research question:
How can AI methods be used to enlighten everyone about sexual diversity among women?

The term “sexual diversity” builds on Lev Manovich’s thoughts on “cultural diversity” in image style, art, and music taste. In his paper “Automating Aesthetics: Artificial Intelligence and Image Culture” (2017), he writes: “I believe that we can study cultural diversity without assuming that it is caused by variations from some types or structures” (p. 1, my italics), and he argues that one can do this “using AI methods” (ibid., p. 10). In the same way, I will argue that we can use AI methods to study sexual diversity among women.

From a literary history perspective, women’s role in sex has been either that of “the virgin” or “the whore”, and in critical analyses, both roles are typically described as something that exists only within male fantasy. Some feminists would argue that the recent social media movement #metoo is a step towards giving women all over the world a voice of their own to tell how they do not want to be treated in sex. I agree that the public female voice saying “no” was nearly absent until the movement started. However, in my opinion, the voice of the movement represents only one side of female sexuality, which means that the movement maintains an unnuanced image of women’s role in sex. Because what about, for instance, all the women who want to say “yes” to sex? In my opinion, the movement can be seen as a sign that a more diverse image of women’s roles in sex is needed.

Therefore, I have decided to make a design fiction film about an interactive wearable called the Vagina Helmet. Based on data collected from women, the Vagina Helmet is a tool for anyone who wants more experience with fingering a girl/woman. The most important aspect of the Vagina Helmet is the social experience, reflected in the fact that the helmet is not passive: it always gives the person fingering feedback on how it feels. Most importantly, the Vagina Helmet project is about listening to/reading your sex partner.

The use of the Vagina Helmet involves two people. A girl/woman puts the helmet on. The person fingering then starts to touch the helmet however he/she wants to, and soon gets a response from an LCD screen on the girl’s/woman’s forehead. It could for instance say: “Not in the mood” or “That’s really nice” or “Try something else”. The experience is supposed to reflect diversity not just between different women, but also within each single woman: the same woman can be “in the mood” one day and “not in the mood” the day after, or have varying preferences in fingering pattern.
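To make the interaction concrete, here is a minimal Python sketch of how the helmet’s feedback loop could behave. Everything in it is my own assumption for illustration: the `HelmetSession` class, the pressure scale, and the thresholds are invented, while the actual prototype uses an mpr121 touch sensor and an LCD screen.

```python
import random

# Hypothetical simulation of the Vagina Helmet feedback loop.
# Sensor scale, thresholds, and message logic are invented for illustration.

FEEDBACK = ["Not in the mood", "That's really nice", "Try something else"]

class HelmetSession:
    """One wearing of the helmet; preferences vary from session to session,
    reflecting diversity within the same woman over time."""

    def __init__(self, seed=None):
        rng = random.Random(seed)
        # Each session draws its own preferred touch pattern and mood.
        self.preferred_pressure = rng.uniform(0.2, 0.8)
        self.in_the_mood = rng.random() > 0.3

    def feedback(self, pressure):
        """Map a normalized touch reading (0.0-1.0) to an LCD message."""
        if not self.in_the_mood:
            return "Not in the mood"
        if abs(pressure - self.preferred_pressure) < 0.15:
            return "That's really nice"
        return "Try something else"
```

The point of the per-session randomness is exactly the diversity described above: the same touch can get a different answer on a different day.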

At first sight, collecting data about women’s sexual behavior seems different from collecting, for instance, music listening patterns (e.g. through Spotify) or faces (e.g. on the streets of Shanghai or through YouTube’s faces dataset), since in many countries and cultures women’s role in sex is a taboo. Maybe this is part of the reason why existing knowledge about the anatomy and functions of the female genitals is still limited. But in the end, they all belong to the same category of personal data, and female sexuality is ready for exploration. Nevertheless, I want to express an awareness that I am working with personal data, which is vulnerable.

The Vagina Helmet is *not* meant to compute a smart, artificially intelligent average fingering path for the guy/girl fingering it to follow. Instead, the data from all the women will serve as inspiration and function as a database of possible fingering paths that the Vagina Helmet can want at any given moment.
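The difference between averaging and drawing from a database can be sketched in a few lines of Python. The paths and their encoding here are entirely hypothetical placeholders, not collected data:

```python
import random

# Illustrative contrast: the helmet draws a whole path from the database
# rather than collapsing all paths into one "smart" average trajectory.

collected_paths = [
    ["slow circles", "light pressure", "pause"],
    ["fast strokes", "firm pressure"],
    ["slow circles", "firm pressure", "light taps"],
]

def averaged_path(paths):
    # What the project deliberately avoids: reducing diversity to an
    # average (here crudely: the most common opening move).
    firsts = [p[0] for p in paths]
    return max(set(firsts), key=firsts.count)

def desired_path(paths, rng=random):
    # What the project does instead: the helmet "wants" one of the
    # collected paths, whole, at this given moment.
    return rng.choice(paths)
```

The averaging function erases the variation the project is about; the choice function keeps each woman’s path intact and simply varies which one the helmet wants right now.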

I will show how the Vagina Helmet works through a design fiction film focusing on two aspects: 1) the collection of data for the database about women’s wishes and wants in sex, and 2) the interaction between two people using the Vagina Helmet. It is my wish that the film will raise questions about data and women in relation to sex, open a path for talking more about sex from women’s perspective, and not least reflect a more diverse image of women’s thoughts, feelings and desires in and about sex.

Vagina Helmet 1.0 – Made in Berlin July 2018 with touch sensor (mpr121) and LCD-screen.

Model in candle wax by me November 13th 2018.

Anatomy of the “clitoral hood”.

Potentially, the Vagina Helmet can become a broader metaphor for sexual diversity among everyone, not just women. There are also several stereotypes about guys/men, such as “they are always interested in sex”, which is not the whole truth either. A Penis Helmet could therefore also be a relevant contribution to the project.

Term clarification: The name “Vagina Helmet” is in a way misleading, since “vagina” in medical terms refers to the canal a baby can be born through, and not the whole “thing”. In fact, from an anatomical perspective, the Vagina Helmet consists of every part of a pussy except the vagina: the helmet has no entrance/exit, it is just a surface. The reason I have chosen to call it the Vagina Helmet anyway is that “vagina” strikes me as the cleanest word, and at the same time one that is immediately understood as serious. To me, the words “pussy” and “cunt” seem more loaded with negative values, and “vulva” seems unfamiliar, so they would steal attention from the actual subject. So: Vagina Helmet it is.

Literature:
Ingrid Hoelzl and Rémi Marie, “The Operative Image (Google Street View: The World as Database)”, in SoftImage, 2015.
Lev Manovich, “Automating Aesthetics: Artificial Intelligence and Image Culture”, 2017.
Marie Louise Juul Søndergaard and Lone Koefoed Hansen, “Intimate Futures: Staying with the Trouble of Digital Personal Assistants through Design Fiction”, 869–80, ACM Press, 2018.
Ida Tin, “Why data will revolutionize global female health”, helloclue.com, 2015. ( https://helloclue.com/articles/about-clue/why-data-will-revolutionize-global-female-health )

Looking for literature on: design fiction and sex robots!

Giving voice to data

(artifact analysis, mini assignment 4.1)

The art project “ne.me.quittes.pas”, made by Audrey Samson and Jonathan Kemp in 2014, is, in Samson’s own words, a “digital data funeral” (Samson, 2015, p. 6). It plays on issues around the erasure of things that are usually not so easy to erase digitally: selfies, browser histories and such. The promo video for the project wishfully says “if only it could melt away” (video link). The name of the work carries a reference to the song “Ne me quitte pas” (1959) by Jacques Brel. The song is about the relationship between a man and a woman; the man is sad that the woman (his mistress) has left him. But two things about this reference are bugging me. First, why did Samson add an ‘s’ after ‘quitte’? In some versions of the title of the work it is written “ne.me.quitte(s).pas”, with parentheses around the extra ‘s’, but usually the extra ‘s’ is just there with no further explanation. (I am not going to go further into this in my analysis.) Second, when we think of the relationship between the human and its data, we logically assume that the human is the one who would want its data files to “melt away”. But isn’t that the opposite of Brel’s song, since the subject of the song is sad about being left and doesn’t want the woman to “melt away”? Maybe Samson uses the reference ironically, as in: “Please leave me, dear data”. But if she doesn’t, what does it mean for the relationship between human and data?

In the Digital Culture class where we discussed this text, a question was raised about where our sympathy lay: was it with the man who is being left, or with the woman, who has left the man because he wasn’t man enough to be there for her when she got pregnant with their kid? Some people in class said that they had more sympathy with the woman than with the man. I would still argue, though, that we understand the story from the crying man’s point of view, and since he is admitting how cowardly he has behaved, my sympathy is with him. After all, it was the woman’s own choice to leave him and afterwards have an abortion to resolve the pregnancy. My question is now: how is the relation between the human and the data in “ne.me.quittes.pas” the same as the relation between man and woman in “Ne me quitte pas”? And then: who is the man and who is the woman? Who doesn’t want to be left by whom? Who admits his/her cowardice? To answer this, we must have a closer look at Samson’s project. There is a series of actions that the participant has to perform as part of the art project, explained by Samson like this:

1. Take a USB key home with you.
2. Think about what data you would like to ritually erase.
3. Transfer the data to the USB key (delete original).
4. Send the USB key in the pre-addressed envelope and remember to include your return address.
5. You will receive your data remains in the post.

still photo from promo video

As we can see, the human has to actively do a lot of things to get its data erased. In Brel’s song, the one who takes action is the woman, who decides to leave the man. Since it apparently is the human itself who decides to get its data erased, it seems like the data, not the human, is the crying one in the art project. No matter whether the human chooses to erase nudes or school notes that he/she wanted to burn anyway, the only one in Samson’s project who says that he/she/it doesn’t want to be left is the data itself. Suppose the human put its most important files, such as wedding photos or insurance papers, on the USB key: files we would not expect the human to want to be left by. *Then* the human would be the one crying to the data: “don’t leave me!” But since the human is erasing its data on purpose, of its own will, I would argue that the data is the one who is sad to lose its human.

In addition, the USB key that you get in the project has these words written on it: “ne.me.quittes.pas” (see still photo above, where the words are being drowned because the human decided to have the key drowned in chemicals). This can be seen as a final yell from the data to the human: “don’t leave me, human!”, as the human glimpses the intact USB key one last time before dropping it into the envelope. Just as the woman chooses to leave the man, the human coldly puts its data on the USB key, places it in the envelope, sends it away and gets it drowned and erased. In that way, one could argue that Samson actually uses the reference to Brel to give a voice to our data and to tell us that we should try to see the case from more perspectives and start sympathizing with the data that we wish to erase.

Reference: Audrey Samson, “Erasure, an attempt to surpass datafication”, APRJA (4)1, 2015.

More data transparency for humans

(sketch, mini assignment 4.2)

Humans forget, computers don’t. It is so difficult to erase your digital traces, we are told (e.g. by Samson, 2015), and the knowledge of every click, every touch, every deleted file will be saved in the computer’s memory, forever.

The notion that humans forget easily implies that they cannot have an overview of the exact road they have taken: what clicks they have made, what things they have erased. The history of what has been erased is, in other words, invisible to the user. This means that there is an unequal relationship between the data the computer has about the user’s digital actions and the user’s own “data” about those actions. However, I argue that this relationship can be challenged if we design user platforms differently, with more transparency for humans in general. It is no wonder that the human user has a harder time remembering things than the computer, when her actions are not visibly engraved into her the same way they are into the program logs of her computer. This leads me to the speculative creative idea I had for this assignment, which came to me while I was sitting reading my homework on my computer.

Last year, I worked as a student assistant on a research project at the university, which meant that I could print as many articles on physical paper as I wanted: it was on the university’s bill. But now I don’t work there anymore, and all the printing is on my own bill. Since I am just a poor student, this means that I now read most of my university literature on my computer. I use the program Preview, an application made by Apple Inc. It allows me to highlight text in different colors in the .pdfs that I read, almost like I would do on physical paper. But only almost. One crucial difference between the physical paper and the .pdf in Preview is that on the physical paper, I cannot change or erase my highlights once they are on the paper. Figure 1 shows a text that already had underlined words in it when I got it from my professor. It shows that I got a copy of a physical version; one could say that it is historically transparent. Whoever left the traces of his/her previous reading could not erase those traces. By contrast, my highlights in Preview leave no trace if I remove them, and the history of the document is therefore invisible to me.

figure 1

Then I thought: if the black pen on the “Computers that roar” text had been a pencil, the lines would have been erasable with a rubber, and the erasure would have been just as non-transparent as the erasure of highlighting in Preview. But then I thought: even with a pencil, there would probably have been subtle leftovers, traces of graphite or the engraving from the writer’s push, depending on how hard the writer had pressed and how good the rubber was. My first exploration sketch is therefore an illustration of how pencil erasure works (see gif). From it I saw that you can still see subtle traces of where the pencil has been engraved after erasure.

gif

Therefore, my idea for a speculative creative project is an add-on to Preview that traces your work within the .pdf you are working in. At its core, this means that when I, for instance, want to erase the pink highlight over “ameliorate” because I have now looked the word up and gotten an idea of what it means, Preview will leave some kind of subtle pink glow (more subtle than in my sketch) around the word, indicating that I once had it highlighted. The idea is thus a combination of the inherent qualities of pencil and rubber (you can draw something that can be erased) applied to the highlighter pen (which highlights in colors) in a digital format in Preview. This makes the data about, and the history of, the user’s reading of a .pdf document as visible to the human reader as it already is to the computer. I have drawn a sketch of how it could look (see figure 2). I used Instagram to create the pink glow around the words that were “un-highlighted”, but my idea was to have an even subtler glow, inspired more strongly by the subtleness of erased graphite and the engraving marks left by the force behind the writing.

figure 2 (sketch/model)
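As a thought experiment, the behaviour of such a Preview add-on could be modelled like this in Python. The `TracedHighlighter` class and its rendering format are my own invention: a sketch of the idea, not of Apple’s Preview or any real API.

```python
# Hypothetical model of the proposed add-on: removing a highlight
# leaves a faint "glow" record instead of erasing the history.

class TracedHighlighter:
    def __init__(self):
        self.active = {}    # word -> current highlight color
        self.glow = {}      # word -> faded trace of a removed highlight

    def highlight(self, word, color):
        self.active[word] = color

    def unhighlight(self, word):
        # Like erased pencil: the mark goes, a subtle trace remains.
        color = self.active.pop(word, None)
        if color is not None:
            self.glow[word] = f"faint-{color}"

    def render(self, word):
        # Brackets mark an active highlight, parentheses the residual glow.
        if word in self.active:
            return f"[{self.active[word]}]{word}"
        if word in self.glow:
            return f"({self.glow[word]}){word}"
        return word
```

For example, highlighting “ameliorate” in pink and then un-highlighting it would leave a `faint-pink` trace on the word, which is exactly the historical transparency that physical paper has and Preview currently lacks.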

The reader of this idea may think that this speculative comment operates on quite a small scale, and that student highlighting in homework readings is hardly the juiciest data one leaves as a digital trace. But when the idea is translated to a bigger scale, it signals that the data that computers have about us and our digital behaviour could be made more transparent to us humans, and that “who knows what about whom” could be more balanced between computers (and their companies) and users.

Short documentary on infrastructural images

Our film
https://drive.google.com/file/d/1f-0fFc8dVRbC0WZP8M2jN514RenFdc8a/view

Photos from the process

Reflections on the process of making the short film

Our idea is to show how Facebook images can function as infrastructural images, leading us from person to person through images and the tags on them. It addresses the phenomenon of browsing through random profiles on Facebook, based on subjective interests. We chose Facebook because it allows you to look at profiles without being observed by other people, unlike, for example, LinkedIn, where people can see when someone has visited their profile.

We discussed using Instagram as well, but during our research we concluded that tagging people on Instagram is not as common as on Facebook. Instagram tagging seems to be used almost exclusively as a marketing ploy, and tagging friends is more often done with an @ mention than by tagging a profile in a photo. We also talked about how Instagram as a medium is more commonly used to “show off”: the pictures posted are intentionally made to be aesthetically pleasing and are deliberately made to look a certain way. We see Facebook as a more private and exclusive medium.

We decided to use the profile of a young person (under age 20) as a starting point, because we have seen a tendency in the younger generation to tag each other in profile pictures, which makes it easier to browse through younger people’s profiles. We also decided to start with a person we weren’t Facebook friends with, so that we could only see the public photos.

We spent some time discussing what to call the film. We first talked about the title “voyeur”, as in one who “watches” or “observes” other people through Facebook, but we decided that it might have too many sexual connotations. Another suggestion was “stalker”, because that is what some of us in the group (girls in our mid-20s) call ourselves when we browse through Facebook profiles. However, as one of the group members pointed out, “stalker” can have extreme and unlawful connotations to the older generation, since stalking is an actual act that can be punished by law, and that was not what we wanted to explore with our film.

We chose to show not only the browsing through profiles but also the face of the person doing it, to make the style of our film more scenario-like, more like a real documentary, so that spectators had the possibility of identifying with the person sitting and browsing, and to highlight that being a “Facebook voyeur” or “Facebook stalker” is completely normal and not something only weirdos do.

Comparison between writing and coding

Based on the chapter “Material Infrastructures of Writing and Programming” in Annette Vee’s book Coding Literacy: How Computer Programming is Changing Writing (2017), I have created this timeline, which argues that history repeats itself. More specifically, it argues that coding can be understood as a new kind of literacy because, like writing, it is a material infrastructure, and that, with reference to Vee’s chapter, coding is going through many of the same steps in its societal development as writing did.

In the model, both writing and coding are referred to as “it” or “the technology”.
