User:Eleanorg/Thematic1.1/Anonymitiy


29 Nov 2011 Thematic Project

Wiki page for today: http://pzwart3.wdka.hro.nl/wiki/A_failed_coup_attempt_with_folk_songs_(Part_V)

Seda Guerses is a researcher working in the group COSIC/ESAT at the Department of Electrical Engineering in K. U. Leuven, Belgium. She is interested in the topics of privacy technologies, participatory design, feminist critique of computer science, and online social networks. Seda is particularly excited about the topic of anonymity in technical as well as cultural contexts, the spectrum being anywhere between anonymous communications and anonymous folk songs.

Beyond her academic work, she also had the pleasure of collaborating with artistic initiatives including Constant vzw, Bootlab, De-center, ESC in Brussels, Graz and Berlin.

http://www.esat.kuleuven.be/~sguerses

Guy from the University of Münster - master's in geoinformatics; interested in location privacy. Builds target-tracking systems using RFID etc. Is making a game where people voluntarily share their location; the aim of the game is to identify other users from their location. Draws on Skinner's theory that rewarding people at random intervals is the most effective encouragement to keep doing a menial task (see the sketch below).
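A rough sketch of that reinforcement idea, with made-up numbers (the actual game's reward logic isn't described here): every action has the same small chance of triggering a reward, so rewards arrive at unpredictable intervals.

```python
import random

REWARD_PROBABILITY = 0.2  # on average one reward per five actions, at unpredictable points

def reward_due():
    """Variable-ratio reinforcement: every action has the same small chance
    of triggering a reward, so payouts arrive at random intervals."""
    return random.random() < REWARD_PROBABILITY

# Hypothetical game loop: a player repeatedly shares their location.
since_last = 0
for action in range(20):
    since_last += 1
    if reward_due():
        print(f"reward after {since_last} location shares")
        since_last = 0
```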


  • 'officially' computer scientist
  • studied privacy for 15 yrs
  • Talking to security engineers - for them security = confidentiality
  • The notion of 'pervasive computing' has shifted into the present-day reality of mobile computing. Inherent in this idea is the constant collection of data - about our movements, desires, etc. There is a disconnect between this development and the idea of security
  • Michelle - an artist who used an antenna to capture CCTV camera signals and watch their footage. She projected indoor footage of a carwash onto its external wall, 'removing' the wall as something which conceals the interior. Confrontation with the carwash owner, who asserted that "it's my signal" - questions about ownership
  • John McGrath - "Loving Big Brother" - became their 'bible' - questions the heteronormative idea of 'privacy' based on straight home life. Performative uses of cameras - from queer shows incorporating CCTV footage from a changing room, to filming cops harassing queers.
  • 1980s - cryptographers aimed to fight Big Brother with technical means. Their solution to surveillance was based on anonymity.
  • "The only thing I know of that is anonymous are folk songs" - interesting take on anonymity

Examples of 'anonymity': Personal anonymity

  • Nicknames/pseudonyms - solo anonymity, doesn't rely on others
  • changing your name to unlink different works
  • Wearing uniform - being indistinguishable within a group, hiding in numbers - relies on others
  • Anonymous Ltd - legal company structure that allows individuals to avoid responsibility

Anonymous artefacts:

  • The Bible
  • Wall paintings
  • Folk songs, mythologies
  • Mass-produced goods

Anonymity provides a protective layer, but it is always context-specific. E.g., a black mask protects you only if everyone else wears one too. Surveillance (Bentham's panopticon, etc.) vs. sousveillance - the watched watch back. What is it that makes you anonymous? Can you remain anonymous while being watched?

What does it mean to have identified somebody? What is identity/the 'truth' of a person? In information society, your identity is your data. You can be linked with a trail of other data. What is the relationship of our data trail to us as individuals?

Statistical surveillance

Collect information about a population and run statistical analysis on that data - there is no need to identify individuals. Then decide which sections fit into desirable norms/certain categories. This method is called "social sorting". The problem is that some people will always fall into the 'undesirable' category, so they accumulate disadvantage over time (see the sketch below).
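A minimal sketch of how such sorting can compound disadvantage, using an entirely hypothetical population, score, and threshold: people sorted into the favoured category gain a little each round, people sorted out lose a little.

```python
import random

# Hypothetical population: each person has an opportunity "score".
population = [{"id": i, "score": random.gauss(50, 10)} for i in range(1000)]

THRESHOLD = 45  # arbitrary cut-off separating "desirable" from "undesirable"

for year in range(5):
    for person in population:
        if person["score"] >= THRESHOLD:
            person["score"] += 2   # sorted into the favoured category: more opportunities
        else:
            person["score"] -= 2   # sorted out: disadvantage accumulates over time

below = sum(p["score"] < THRESHOLD for p in population)
print(f"{below} of {len(population)} people now fall below the threshold")
```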

Where is this sorting happening on the net today? Example: a train between Paris & Brussels which is used by a lot of rich people has its prices set so that lower-income people can't afford it.

The idea that there is a 'norm' into which outliers are pushed. E.g., Facebook closing the accounts of trans people for not using their given names.

Resistance to surveillance(?)

  • Withhold data
  • Give wrong data
  • Generate noise (see the sketch after this list)
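A small sketch of the "generate noise" tactic, with a made-up decoy vocabulary and a stand-in search function: bury one real query among randomly chosen dummy queries so an observer can't tell which one mattered.

```python
import random
import time

# Hypothetical decoy vocabulary; a real tool would draw from a much larger corpus.
DECOY_TERMS = ["weather", "recipes", "football scores", "train times", "film reviews"]

def noisy_search(real_query, search_fn, decoys=3):
    """Surround one real query with a few random decoy queries,
    in random order, so an observer can't tell which one mattered."""
    queries = [real_query] + random.sample(DECOY_TERMS, decoys)
    random.shuffle(queries)
    for q in queries:
        search_fn(q)
        time.sleep(random.uniform(0.2, 1.0))  # irregular timing, to avoid an obvious pattern

# Usage with a stand-in search function that just prints the query:
noisy_search("protest meeting point", print)
```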

Is using the term 'surveillance' an implied value judgement? Is surveillance always bad? Surely it depends on who is watching whom and what the power relationship is.

Many people take the 'I don't have anything to hide' line. (But there's an implied privilege in that statement - that you live under a regime that doesn't condemn you for who you are.) Having something to hide != doing wrong. Examples: being gay or trans, being persecuted, etc...

How do you find out if a given form of surveillance is 'good for you'? Especially if surveillance is good for one section of the population and bad for another. (Or good for you in one sense but bad in another - e.g., as a woman, as an activist.)

  • Find out what data they have on you
  • Find out how it's being used: what algorithms are being applied, what patterns are looked for?
  • Who has it? Who is it being shared with?
  • Who's paying for it?
  • What results does it give back?

All of this comes under the idea of 'transparency'. But transparency is not enough - we need accountability/control as well.

How do we help people develop an understanding of the algorithms being used on their data?

  • Record the ads you get shown alongside particular kinds of content (see the sketch after this list)
  • Expose the mechanism: many people don't see beyond the surface content
  • Perform like the algorithm.
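A rough sketch of the first idea on this list - recording which ads appear alongside which kinds of content and counting the co-occurrences to expose the targeting pattern. The observations here are invented.

```python
from collections import Counter, defaultdict

# (page keywords, ads shown) pairs, as a browser extension or manual diary might record them.
observations = [
    (["pregnancy", "health"], ["baby clothes", "vitamins"]),
    (["mortgage", "health"],  ["loans", "vitamins"]),
    (["pregnancy", "names"],  ["baby clothes", "prams"]),
]

ads_by_keyword = defaultdict(Counter)
for keywords, ads in observations:
    for kw in keywords:
        ads_by_keyword[kw].update(ads)

# Which ads co-occur most often with a given kind of content?
for kw, counts in ads_by_keyword.items():
    print(kw, "->", counts.most_common(2))
```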

Think of an algorithm as a performance - it gathers info and gives you something back. (Just as a protocol is a 'performance'/simulation of a real-life conversation.) Security engineers ask: how can I subvert the performance of the algorithm? What does it do that might give me data I'm not supposed to find out?

Reactive vs. preventative surveillance. Can you try to prevent deviance before it happens? E.g., a system in the UK to track youths who are deemed more likely to become criminals & keep them under the state's gaze.


A common theme behind surveillance ideology: the data will give you the truth. Examples given by John McGrath - CCTV distorts proportions, so 8-year-olds are identified as teenagers. He discusses artworks - e.g., a camera that looks like it's filming a toilet cubicle but is actually filming a miniature model. Originality, liveness, etc... What kind of truth do we read into surveillance information?

At an Institute of Media Archaeology event in Hainburg, a group on privacy created a manifesto on databodies - http://databodies.net/doku.php :

"our flesh and data bodies are one and many.

this inherent fluidity and the flows of our data bodies are allergic to any authoritarian attempt to fix them or prove them authentic.

those who try to fix identities to our data bodies for their security needs create opportunities for identity theft."

Anonymous artefacts

What happens when you de-couple the artefact from its author?

  • Can be positive - eg auditioning behind a screen to avoid gender discrimination
  • Can be claimed by someone else. Is this good or bad? Who owns it? Person who stores it?
  • Given a new meaning by someone else
  • Also allows for radical commodification
  • A laptop is designed by an individual, then assembled by an anonymous collective of workers, then given a unique serial number - it passes in and out of identification & anonymisation


Whose is this song?

http://www.youtube.com/watch?v=NGCURBHF2Ss - a documentary by Adela Peeva

A folk song is claimed by people from various countries, each insisting that the song comes from their own country. Peeva tries to find out where it actually comes from. What happens when the origins of a song are unknown?

  • Can be used for any purpose - lovesong, religious, nationalistic... freedom to change it
  • Everyone tries to claim origination - resistance to its anonymity
  • Who has the right to 'authenticate' it? Musicological approach - eg, identifying region of origin by the beat pattern

Failed coup attempt with folk songs

More anonymous objects:

  • Path through grass is an anonymous, collaborative object
  • Memes, theories
  • A mass demo; fans at a soccer match - protects individuals
  • Military organization/uniformity - protects the military rather than the individuals

Anonymity steps - security based on anonymity; the aim is to decouple the data trail from the individual who left it. Dresden researchers in the 90s worked on encrypting cell phone traffic. The problem is that cell phones each have unique electromagnetic signatures. "Imagine an anonymous city" - everyone concealed by a box. Effective, but they concluded that it's not an appealing way to live.

Anonymous Comms

  • Tor - conceals your IP address
  • Controversial - seen as a tool for terrorists, child porn, etc.
  • But a powerful tool for censorship resistance
  • Mix zones - designate a busy place where individuals can meet and swap identities, to foil trackers.
  • Crowds - an anonymous comms protocol: when you receive a message you don't know whether the 'sender' is the original sender or just a proxy. You then send it on, or keep it, with unpredictable probability (see the sketch after this list).
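A minimal sketch of the Crowds-style forwarding rule described above, assuming a simplified model with no encryption and made-up node names: each node holding the message forwards it to a randomly chosen crowd member with some probability, otherwise it delivers the message, so no node knows whether its predecessor was the original sender.

```python
import random

FORWARD_PROBABILITY = 0.66  # chance of passing the message on rather than delivering it

def route(message, nodes, deliver):
    """Crowds-style routing (simplified, no encryption): each node that holds
    the message either forwards it to a random member of the crowd or delivers
    it to the final recipient."""
    current = random.choice(nodes)          # sender hands the message to a random node
    path = [current]
    while random.random() < FORWARD_PROBABILITY:
        current = random.choice(nodes)      # forward to another (possibly the same) member
        path.append(current)
    deliver(message)
    return path

nodes = ["alice", "bob", "carol", "dave"]
print(route("hello", nodes, lambda m: print("delivered:", m)))
```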

Privacy-preserving dataminers: people who want to enable gathering data, but in a way that preserves individual anonymity. Latanya Sweeney gathered Boston hospital data and crossed it with another database, showing that even anonymised data can be linked with other data to identify individuals. You only need a few bits of info - e.g., date of birth, zipcode, gender - to identify who someone is. She proposed "k-anonymity": make sure at least k people (e.g. at least 2) share each combination of these attributes, for instance by removing the last digits of the zipcode and storing age as a range rather than a number of years (see the sketch below). Even this was broken, though; each new, more complex scheme was also broken. Someone eventually proved that you will always be able to cross anonymised data with another database to identify individuals. Anonymisation doesn't work.
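A toy sketch of the generalisation step described above, with made-up records and k=2: coarsen the quasi-identifiers (truncate the zipcode, bucket the age) and check that every resulting combination is shared by at least k people.

```python
from collections import Counter

records = [
    {"zip": "10115", "age": 34, "gender": "f"},
    {"zip": "10117", "age": 36, "gender": "f"},
    {"zip": "10115", "age": 52, "gender": "m"},
    {"zip": "10119", "age": 55, "gender": "m"},
]

def generalise(r):
    """Coarsen the quasi-identifiers: drop the last zipcode digits, bucket the age."""
    decade = r["age"] // 10 * 10
    return (r["zip"][:3] + "**", f"{decade}-{decade + 9}", r["gender"])

def is_k_anonymous(rows, k=2):
    """True if every combination of generalised quasi-identifiers occurs at least k times."""
    counts = Counter(generalise(r) for r in rows)
    return all(c >= k for c in counts.values())

print([generalise(r) for r in records])
print("2-anonymous?", is_k_anonymous(records, k=2))
```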

Data protection laws use legal definitions of anonymisation, not mathematical ones. They allow free use of anonymised data - even though it can easily be de-anonymised. Governments like to anonymise data because it delinks you from your data, so you lose your right to access your own data. The data protection directive is as much meant to create an economy of data which can be freely shared (since it's 'safely anonymised').

Three main paradigms in computer science for approaching anonymity:

  • Privacy by concealing or anonymising data
  • Privacy as control paradigm - reveal data but retain control over it or at least have a transparency policy
  • Privacy as practice - use datamining as counter-surveillance; use tools to discover and participate in the data collection on you. Rejects the idea that decisions on disclosure are made purely individually; we base our decisions on seeing what others do, and online most of your data (e.g. Facebook behaviour) is actually collective - you've posted it on someone else's wall, or they on yours. Or, even if you "have nothing to hide", revealing your own data allows larger sampling which could impact other people.

Batch Mixer Game

3 senders, 3 receivers, one proxy in the middle, two observers. The observers have to guess who has sent which message to whom.

  • Round 1 - messages sent in plain view from senders to receivers
  • Round 2 - messages encrypted
  • Round 3 - messages batch mixed (see the sketch after this list)
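A minimal sketch of what the batch mix in round 3 does, assuming a toy setup with no encryption: the mix holds messages until a full batch has arrived, drops the sender identities, and releases the batch in shuffled order so the observers can't match incoming messages to outgoing ones.

```python
import random

class BatchMixer:
    """Toy batch mix: hold messages until a full batch arrives, then release
    them in random order so arrival order can't be linked to departure order.
    (Encryption of the message contents is omitted here.)"""
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.pool = []

    def submit(self, sender, recipient, text):
        self.pool.append((recipient, text))    # sender identity is dropped at the mix
        if len(self.pool) == self.batch_size:
            batch, self.pool = self.pool, []
            random.shuffle(batch)
            return batch                       # flushed batch, in shuffled order
        return None

mixer = BatchMixer()
mixer.submit("sender1", "receiver2", "hi")
mixer.submit("sender2", "receiver3", "hello")
print(mixer.submit("sender3", "receiver1", "hey"))  # third message triggers the flush
```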


Traces game

Thore Fechner

http://giw-traces.uni-muenster.de:8080/secUser

  • Written with Grails & a PostgreSQL database; jQuery Mobile frontend
  • Uses OpenStreetMap with the CloudMade library
  • Looks & feels like Foursquare - log in & share your location
  • You can add tags to locations to make notes of where you've spotted others

Institute links:

  • ifgi.de - Institute for Geoinformatics
  • sitcom.de - situated computing lab

Gamification; self-tracking. See http://quantifiedself.com