XPUB2 Research Board / Martin Foucaut
=<p style="font-family:helvetica">Pads</p>=
===<p style="font-family:helvetica">Manetta / Michael </p> ===
* Group meeting Michael - https://pad.xpub.nl/p/2021_sandbox
* Group discussion Michael/Manetta - https://pad.xpub.nl/p/20210928_xpub2
* 2nd Group discussion Michael/Manetta - https://pad.xpub.nl/p/2021-10-05-xpub2
* Aquarium — https://pad.xpub.nl/p/aquarium
* Post-Aquarium — https://pad.xpub.nl/p/2021-10-12-postaquarium
* Prototyping — https://pad.xpub.nl/p/2021-11-09-xpub2
===<p style="font-family:helvetica">Steve / Marloes</p> ===
* Graduate Seminar Session 1 — https://pad.xpub.nl/p/GRS_session1_20_21
* Graduate Seminar Session 2 — https://pad.xpub.nl/p/GRS7Oct21
* https://pad.xpub.nl/p/LB2_%26_XPUB2_introduction_to_the_Graduate_Research
* https://pad.xpub.nl/p/GRS_session_3_14_Oct_21
* https://pad.xpub.nl/p/Thesi_OutlinePlanSteve
===<p style="font-family:helvetica">Eleanor Greenhalgh</p> ===
* Collaboration, Conflict & Consent - part 2 — https://pad.xpub.nl/p/2021-10-XPUB2-Nor
=<p style="font-family:helvetica">Links</p>=


*[[Martin (XPUB)-project proposal]]
*[[Martin (XPUB)-thesis outline]]
*[[Martin (XPUB)-thesis]]


=<p style="font-family:helvetica">Seminars ([[Graduate_Seminar_2021-2022|source]])</p>=

===<p style="font-family:helvetica">Key Dates and Deadlines</p>===

'''These are the key dates for 2021-22'''

* 19 November - Graduate Proposal Deadline
Last year's Graduate Proposals: [[Graduate Proposals 2020-2021|UPLOAD YOUR PROPOSAL HERE!]]
* 19 November - Thesis Outline Deadline
Last year's Thesis Outlines: [[Thesis Outlines 2020-2021|UPLOAD YOUR THESIS OUTLINE HERE!]]
* 3 December - Deadline First Chapter
* 18 February - Deadline First Draft Thesis
* 18 March - Deadline Second Draft Thesis (texts to 2nd readers)
* 1 April - Deadline Second readers' comments
* 14 April - DEADLINE THESIS

====<p style="font-family:helvetica">Guides and Guidelines</p>====

*[[Graduate_proposal_guidelines]]
*[[Second Readers Guidelines]]
*[[A Guide to Essay Writing]] (including a guide to the Harvard method)
*[[Handbook details- thesis and final project]]
*[[Thesis Guidelines]]
*[[Criteria for evaluation (Thesis)]]
*LB Code link (in progress)
*https://pad.xpub.nl/p/LB-groupcritprotocals
*[https://libguides.elmira.edu/research How to do research]

===<p style="font-family:helvetica">About thesis</p>===

====<p style="font-family:helvetica">Thesis criteria</p>====

# Intelligibly express your ideas, thoughts and reflections in written English.
# Articulate in writing a clear direction for your graduate project by being able to identify complex and coherent questions, concepts and appropriate forms.
# Clearly structure and analyse an argument.
# Use relevant source material and references.
# Research texts and practices and reflect upon them analytically.
# Synthesize different forms of knowledge in a coherent, imaginative and distinctive way.
# Position one's own views within a broader context.
# Recognize and perform the appropriate mode of address within a given context.
# Engage in active dialogue about your written work with others.

====<p style="font-family:helvetica">Thesis format</p>====

# A report on your research and practice.
# An analytical essay exploring related artistic, theoretical, historical and critical issues and practices that inform your practice, without necessarily referring to your work directly.
# The presentation of a text as a body of creative written work.

====<p style="font-family:helvetica">Thesis Outline (guideline)</p>====

Don't make it more than 1500 words.

<b>What is your question?</b>

Break the proposed text down into parts. Think of the separate sections as "containers" (this may change as you progress with the text, but try to make a clear plan with a word count in place).

Thesis Outline (consider the following before writing the outline; include all these points in the intro to the outline):

Conceptual Outline (what is your question? Try to be as specific as possible, more specific than identifying a subject or general interest. It helps to ask: "what questions does the work I make generate?")

Why do you want to write this text?

Outline of Methodology (for example: "I would like to structure my thesis in relation to a series of interviews I will conduct for my proposed project" OR "I will make a 'close reading' of three of my past projects")

Timeline (how will you plan your time between now and April?)

* Introduction - overview [500 words]
* Chapter 1 [2000 words]
* Chapter 2 [2000 words]
* Chapter 3 [2000 words]
* Conclusion [500 words]
= 7000 words in total

===<p style="font-family:helvetica">Bibliography</p>===

Annotated bibliography (five texts max): make a synopsis of 5 texts that will be central to your thesis.

*Example of an annotated bibliography: https://pzwiki.wdka.nl/mediadesign/Mia/Thesis
*Examples of thesis outlines:
**https://pzwiki.wdka.nl/mw-mediadesign/images/f/f3/Thesis_outline_final_Yuching.pdf
**https://pzwiki.wdka.nl/mediadesign/User:Zpalomagar/THESIS_OUTLINE/FIFTH_DRAFT

=== Referencing System ===

*Harvard Referencing system [https://library.aru.ac.uk/referencing/files/QuickHarvardGuide2019.pdf PDF]

=<p style="font-family:helvetica">Draft Thesis</p>=

===What do you want to make?===

My project is a data collection installation that monitors people's behaviors in public physical spaces while explicitly encouraging them to help the algorithm collect more information. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.
<br><br>
The device is not designed to pretend to give any beneficial outcome to the subject; it only makes visible the benefits that the machine gets from collecting their data. Yet the device presents this collected data, visually or verbally, in a grateful way, which might be stimulating for the subject. In that sense, the subject, despite knowing that their actions serve only to satisfy the device, could become intrigued, involved or even addicted to a mechanism that deliberately uses them as a commodity. In that way, I intend to trigger conflicting feelings in the visitor's mind, situated between a state of awareness of the ongoing monetization of their physical behaviors and a state of engagement/entertainment/stimulation with the interactive value of the installation.
<br><br>
My first desire is to make the mechanisms by which data collection is carried out, marketized and legitimized both understandable and accessible. The array of sensors, the Arduinos and the screen are the main technological components of this installation. Rather than using an already existing and complex tracking algorithm, the program is built from scratch, kept open source and limits itself to the conversion of a restricted range of physical actions into interactions. These include the detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific area of detection. Optionally, they may also include the detection of the subject's smartphone or the subject logging on to a local Wi-Fi hotspot.
<br><br>
In terms of mechanics, the algorithm creates feedback loops starting from: <br>
_the subject's behaviors being converted into information; <br>
_the translation of this information into written/visual feedback; <br>
_and the effects of this feedback on the subject's behavior; and so on. <br>
By doing so, it tries to shape visitors into free data providers inside their own physical environment, and to stimulate their engagement by converting each piece of collected information into points/money, feeding a user score within a global ranking.
<br><br>
On the screen, displayed events can be:

_ “subject [] currently located at [ ]” <br>
[x points earned/given]<br>
_ “subject [] entered the space” <br>
[x points earned/given]<br>
_ “subject [] left the space”<br>
[x points earned/given]<br>
_ “subject [] moving/not moving”<br>
[x points earned/given]<br>
_ “subject [] distance to screen: [ ] cm” <br>
[x points earned/given]<br>
_ “subject [] stayed at [ ] for [ ] seconds” <br>
[x points earned/given]<br>
_ “subject [] device detected” <br>
[x points earned/given] (optional)<br>
_ “subject logged onto local Wi-Fi” <br>
[x points earned/given] (optional)<br>
<br>
Added to these come the instructions and comments from the device in reaction to the subject's behaviors:<br>
<br>
_ “Congratulations, you have now given the monitor 25 % of all possible data to collect!” <br>
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”<br>
[if the subject stands still in a specific location]<br>
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”<br>
[unlocked at x points earned/given]<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”<br>
[if the subject stands still in a specific location]<br>
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you!”<br>
[if the subject is at the edge of the detection range]<br>
_ “You are only 93860 pieces of information away from being the top one data-giver!”<br>
[unlocked at x points earned/given]<br>
_ “Statistics show that people staying for more than 5 minutes will, on average, benefit me 10 times more!”<br>
[randomly appears]<br>
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”<br>
[if the subject stands still for a long time in any location]<br>
<br>
Responding positively to the monitor's instructions unlocks special achievements and extra points:<br>
<br>
—Accidental data-giver badge <br>
[unlocked if the subject has passed the installation without deliberately wishing to interact with it] + [x points earned/given]<br>
—Lazy data-giver badge <br>
[unlocked if the subject has been standing still for at least one minute] + [x points earned/given]<br>
—Novice data-giver badge <br>
[unlocked if the subject has successfully completed 5 missions from the monitor] + [x points earned/given]<br>
—Hyperactive data-giver badge <br>
[unlocked if the subject has never stood still for 10 seconds within a 2-minute lapse of time] + [x points earned/given]<br>
—Expert data-giver badge <br>
[unlocked if the subject has successfully completed 10 missions from the monitor within 10 minutes] + [x points earned/given]<br>
—Master data-giver badge <br>
[unlocked if the subject has successfully logged on to the local Wi-Fi hotspot] + [x points earned/given] (optional)<br>
<br>
On the top left side of the screen, a user score displays the number of points generated by the collected pieces of information and by the unlocking of special achievements instructed by the monitor.<br>
<br>
—Given information: 298 pieces <br>
[displays the number of collected events]<br>
—Points: 312000 <br>
[conversion of collected events and achievements into points]<br>
<br>
On the top right of the screen, the user is ranked among the x previous visitors, and the most prestigious badge recently earned is displayed below.<br>
<br>
—subject global ranking: 3/42 <br>
[compares the subject's score to all final scores from previous subjects]<br>
—subject status: expert data-giver<br>
[displays the most valuable reward unlocked by the subject]<br>
<br>
When leaving the detection range, the subject gets a warning message and a countdown starts, encouraging them to make the quick decision to come back:<br>
<br>
—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”<br>
[displayed as long as the subject remains completely undetected]<br>
<br>
If the subject definitely stays out of the detection range for more than 5 seconds, the monitor also addresses a thankful message, and the amount of money gathered, the achievements, the ranking, the complete list of collected information and a QR code are printed as a receipt with the help of a thermal printer. The QR code links to my thesis.<br>
<br>
—“Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”<br>
[displayed after 5 seconds of being undetected]<br>
<br>
In order to collect, read and/or use that piece of information, the visitor inevitably has to come back within the range of detection and, intentionally or not, reactivate the data tracking game. It is therefore impossible to leave the area of detection without leaving at least one piece of your own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets give future subjects a precise trace of the behavior and actions of previous subjects. <br>

===Why do you want to make it?===

When browsing online and/or using connected devices in the physical world, even the most innocent action/information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques implicitly political, playing on dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.
 
 
==<p style="font-family:helvetica">Graduate proposal guidelines</p>==
 
In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as Edward Snowden's revelations, the Cambridge Analytica scandal and the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals/groups/governments/associations have taken legal action, surveillance capitalism remains generally accepted, often because it is ignored and/or misunderstood.
<br><br>
Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have also been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual push by the same companies to monitor the physical world and our human existence in a wide array of contexts: for example, with satellite and street photography (Google Earth, Street View), geolocalization systems, simulated three-dimensional environments (augmented reality, virtual reality or the metaverse) or extensions of our brains and bodies (voice assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data-sharing apps, which legitimize and encourage the datafication of the body for capitalistic purposes.
<br><br>
For the last 15 years, self-tracking tools have made their way to consumers. I believe this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. Indeed, to adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment or even active engagement with the tools/platforms through which this surveillance is operated/allowed. By doing so, I want to ask how these companies still manage to get our consent, and what human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effect of gamification, gambling and reward systems, as well as the estheticization of data/self-data, as means to hook our attention, create ever more interactions and orient our behaviors.

===How do you plan to make it and on what timetable?===


[[File:Simu part 01.gif|right|800px|thumb|https://www.dropbox.com/s/zyv3c2ypit8qo1y/Simulation_MartinFoucaut.mp4?dl=0 full HD video]]
I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.<br><br>
[[File:Simu part 02.gif|right|800px|thumb]]
[[File:Simu part 03.gif|right|800px|thumb]]


<b>How does it work?</b>
<br><br>
The ultrasonic sensors detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound pulse and receiving back its echo. The Arduino Uno/Mega boards are microcontrollers which receive this information and run it through a program that converts these values into mm/cm/m, but also maps the space into an invisible grid. Ultimately, values collected on the Arduino's serial monitor can be sent to P5.js through p5.serialcontrol. P5.js then allows greater freedom in the way the information can be displayed on the screens.
<br><br>
Process:
<br><br>
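The distance conversion described above can be sketched in a few lines of JavaScript, the language of the P5.js side of the project. This is a minimal illustration under assumptions, not the installation's actual program: the 0.0343 cm/µs speed of sound and the grid-cell helper assume a generic HC-SR04-style ultrasonic sensor, and all names are invented here.

```javascript
// Illustrative sketch (not the installation's actual code): convert an
// ultrasonic echo round-trip time in microseconds into centimetres.
// Sound travels ~0.0343 cm/us and the pulse covers the distance twice
// (out and back), hence the division by 2.
function echoToCm(durationMicros) {
  return (durationMicros * 0.0343) / 2;
}

// Map an (x, y) position in centimetres onto the "invisible grid"
// mentioned above; the cell size is an arbitrary choice.
function toGridCell(xCm, yCm, cellSizeCm) {
  return {
    col: Math.floor(xCm / cellSizeCm),
    row: Math.floor(yCm / cellSizeCm),
  };
}
```

For example, an echo of about 2915 µs corresponds to roughly 50 cm.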
<b>1st semester: Building a monitoring device, converting human actions into events, and events into visual feedback</b>
<br><br>
During the first semester, I am focused on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and simplest prototype and gradually increase its scale/technicality until reaching human/architectural scale. Prototypes are subject to testing, documentation and comments, helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate screen monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated three-dimensional simulations, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop: human behaviors are converted into information; this information is translated into written/visual feedback; this feedback affects human behavior; and so on.
<br><br>
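The behavior → information → feedback loop could be sketched schematically as below. This is a toy illustration only: the event names, point values and message wording are invented here, not taken from the actual program.

```javascript
// Schematic feedback loop: a detected behaviour becomes an event, the
// event becomes points plus a line of on-screen feedback, and that
// feedback is meant to influence the subject's next behaviour.
const POINTS = { entered: 10, left: 5, moving: 1, still: 2 }; // invented values

function collectEvent(state, type, detail = '') {
  const points = POINTS[type] ?? 0;
  state.events.push({ type, detail, points });
  state.score += points;
  // The written feedback shown on the screen for this event.
  state.feedback =
    `subject [${state.id}] ${type}${detail ? ' ' + detail : ''} ` +
    `[${points} points earned/given]`;
  return state;
}

// One pass of the loop for an anonymous subject.
const state = { id: 0, events: [], score: 0, feedback: '' };
collectEvent(state, 'entered');
collectEvent(state, 'still', 'at zone 2');
```

Each detected event both raises the score and rewrites the feedback line, which is what closes the loop on the subject's side.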
<b>2nd semester: Implementing gamification with the help of collaborative filtering, a point system and ranking.</b>
<br><br>
During the second semester, it is all about building and implementing a narration with the help of gaming mechanics that will encourage humans to feed the data-gathering device with their own help. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.
<br><br>
To summarize the storyline, the subject positioned in the detection zone finds herself/himself unwillingly embodied as the main actor of a data collection game. Her/his mere presence generates a number of points/dollars displayed on a screen, growing as she/he stays within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data collector. This is done by setting clear goals/rewards for the subject, comparing their performance to that of all previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.
<br><br>
The mechanism is based on a sample of physical events already explored in the first semester of prototyping (detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific area of detection). Every single event detected in this installation is stored in a data bank and, with the help of collaborative filtering, will allow the display of custom recommendations such as:
<br><br>
_ “Congratulations, you have now given the monitor 25 % of all possible data to collect!” <br>
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”<br>
[if the subject stands still in a specific location]<br>
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”<br>
[unlocked at x points earned/given]<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”<br>
[if the subject stands still in a specific location]<br>
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you!”<br>
[if the subject is at the edge of the detection range]<br>
_ “You are only 93860 pieces of information away from being the top one data-giver!”<br>
[unlocked at x points earned/given]<br>
_ “Statistics show that people staying for more than 5 minutes will, on average, benefit me 10 times more!”<br>
[randomly appears]<br>
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”<br>
[if the subject stands still for a long time in any location]<br>
<br>
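As a toy illustration of how such a recommendation could be derived from the data bank, the sketch below reduces the collaborative-filtering idea to nudging the subject toward the least-documented spot; the cell names, counts and wording are invented here, not taken from the actual program.

```javascript
// Toy recommendation trigger: pick the grid cell the installation has
// the least data about and nudge the subject toward it, echoing the
// "only 3 visitors in this spot" style of message above.
function recommendCell(visitCounts) {
  // visitCounts: { cellName: numberOfRecordedVisits }
  const entries = Object.entries(visitCounts);
  entries.sort((a, b) => a[1] - b[1]); // least-visited cell first
  const [cellName, count] = entries[0];
  return `Are you sure you don't want to move to ${cellName}? ` +
    `The monitor has only collected data from ${count} visitors so far in this spot!`;
}
```

A fuller version would compare the current subject's event history to those of previous visitors, but the principle of steering behavior toward whatever data is still missing stays the same.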
The guideline is set out here, but will be constantly updated with the help of experiments and the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio) and will allow anyone who deliberately wishes to interact with it to do so. Beyond the voluntary interactions, my interest is also in seeing what can be extracted from people simply passing in front of the installation. In addition, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral and organized experiences with the participants. (e.g. 15 February 2022 with Louisa)
<br><br>
This semester will also include the creation of a definite set of illustrations helping to engage the participants of the installation in a more emotional way; the illustrations will be made by an illustrator/designer with whom I usually collaborate.
<br><br>
<b>3rd semester: Building the final installation for the final assessment and graduation show. Test runs, debugging and final touches.</b>
<br><br>
During the third semester, the installation should be set up in the school, in the alumni area next to the XPUB studio, for the final assessment, and ultimately set up again at WORM for the graduation show. I am interested in putting this installation into privileged spaces of human circulation (such as hallways) that would more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narration, mechanics, illustrations and graphic aspects should be finalized at the beginning of the 3rd semester, and subject to intense test runs during all that period until meeting the deadline.
===Relation to larger context===

As GAFAM companies face more and more legal issues and are held accountable for a growing number of social and political issues around the world, the pandemic context has greatly contributed to making all of us more dependent than ever on the online services provided by these companies, and to somehow forcing our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made the use of new technologies even more necessary for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In a lot of countries, from one day to the next and for an undetermined time, it has become necessary to carry a smartphone (or a printed QR code) in order to access transport, entertainment, cultural and catering services, but also in order to do simple things such as look at the menu in a bar/restaurant or place an order. Thus, this project takes place in a context where techno-surveillance has definitely taken a determining place in the way we can access spaces and services related to the physical world. <br><br>
Data Marketisation / Self Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics
===Relation to previous practice===

During my previous studies in graphic design, I started engaging with new media by making a small online reissue of Raymond Queneau's book Exercices de Style. In this issue, called Incidences Médiatiques (2017), the user/reader was encouraged to explore the 99 different versions of the same story written by the author in a less linear way. The idea was to consider each graphical user interface as a unique reading context: it would determine which story could be read, depending on the device used by the reader, and the user could navigate through these stories by resizing the Web window, by changing browser or by using a different device.
<br><br>
As part of my graduation project, Media Spaces (2019), I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a Web-to-Print website. Subsequently, this website was translated into physical space as a printed book and a series of installations displayed in an exhibition space, following the online structure of my thesis (home page, index, parts 1-2-3-4). In that way, I was interested in inviting visitors to physically experience some aspects of the Web.
<br><br>
As a first-year student of Experimental Publishing, I continued to work in that direction by creating a meta-website called Tense (2020), which displays the invisible HTML <meta> tags inside an essay in order to affect our interpretation of the text. In 2021, I worked on a geocaching pinball game highlighting invisible Web events, and a Web oscillator whose amplitude and frequency range were directly related to the user's cursor position and browser screen size.
<br><br>
While it has always been clear to me that these works were motivated by the desire to define media as context, subject and/or content, the projects presented here have often made use of surveillance tools to detect and translate user information into feedback, participating in the construction of an individualized narrative and/or a unique viewing/listening context (interaction, screen size, browser, mouse position). The current work aims to take a critical look at the effect of these practices in the context of techno-surveillance.
<br><br>
Similarly, projects such as Media Spaces have sought to explore the growing confusion between human and web user, physical and virtual space, or online and offline spaces. This project will demonstrate that these growing confusions will eventually lead us to be tracked in all circumstances, even in our most innocuous daily human activities/actions.
 
===Selected References===
 
<b>Works:</b>
 
* M. DARKE, fairlyintelligent.tech (2021) https://fairlyintelligent.tech/
« invites us to take on the role of an auditor, tasked with addressing the biases in a speculative AI »Alternatives to techno-surveillance
<br>
<br>
—Given information: 298 pieces <br>
* MANUEL BELTRAN, Data Production Labor (2018) https://v2.nl/archive/works/data-production-labour/
[displays number of collected events]<br>
Expose humans as producers of useful intellectual labor that is benefiting to the tech giants and the use than can be made out of that labor.
—Points: 312000 <br>
[conversion of collected events and achievement into points]<br>
<br>
<br>
On the top right of the screen, the user is ranked among x number of previous visitors and the prestigious badge recently earned is displayed bellow<br>
* TEGA BRAIN and SURYA MATTU, Unfit-bits (2016) http://tegabrain.com/Unfit-Bits
Claims that that technological devices can be manipulated easily and hence, that they are fallible and subjective. They do this by simply placing a self-tracker (connected bracelet) in another context, such as on some other objects, in order to confuse these devices.
<br>
<br>
—subject global ranking: 3/42 <br>
* JACOB APPELBAUM, Autonomy Cube (2014), https://www.e-flux.com/announcements/2916/trevor-paglen-and-jacob-appelbaumautonomy-cube/
[compares subject’s score to all final scores from previous subjects]<br>
Allows galleries to enjoy encrypted internet access and communications, through a Tor Network
—subject status: expert data-giver<br>
<br>  
[display the most valuable reward unlocked by the subject]<br>
* STUDIO MONIKER, Clickclickclick.click (2016) https://clickclickclick.click/
You are rewarded for exploring all the interactive possibilities of your mouse, revealing how our online behaviors can be monitored and interpretated by machines.
<br>
<br>
When leaving the detection range, the subject gets a warning message and a countdown starts, and encouraging it to take the quick decision to come back<br>
* RAFAEL LOZANO-HEMMER, Third Person (2006) https://www.lozano-hemmer.com/third_person.php
Portrait of the viewer is drawn in real time by active words, which appear automatically to fill his or her silhouette https://www.lozano-hemmer.com/third_person.php
<br>
<br>
—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”<br>
* JONAL LUND, What you see is what you get (2012) http://whatyouseeiswhatyouget.net/
[displayed as long as the subject remains completely undetected]<br>
«Every visitor to the website’s browser size, collected, and played back sequentially, ending with your own.»
<br>
<br>
If the subject definitely stands out of the detection range for more than 5 seconds, the monitor will also address a thankful message and the amount of money gathered, achievements, ranking, complete list of collected information and a qr code will be printed as a receipt with the help of a thermal printer. The QR will be a link to my thesis.<br>
* USMAN HAQUE, Mischievous Museum (1997) https://haque.co.uk/work/mischievous-museum/
Readings of the building and its contents are therefore always unique -- no two visitors share the same experience. https://haque.co.uk/work/mischievous-museum/
<br><br>
<b>Books & Articles:</b>
 
* SHOSHANA ZUBOFF, The Age of Surveillance Capitalism (2020)
Warns against this shift towards a «surveillance capitalism». Her thesis argues that, by appropriating our personal data, the digital giants are manipulating us and modifying our behavior, attacking our free will and threatening our freedoms and personal sovereignty.
<br>
<br>
* “Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”<br>
* EVGENY MOROZOV, Capitalism’s New Clothes (2019)
[displayed after 5 seconds being undetected]<br>
Extensive analysis and critic of Shoshana Zuboff research and publications.
<br>
* BYRON REEVES AND CLIFFORD NASS, The Media Equation, How People Treat Computers, Television, and New Media Like Real People and Places (1996)
Precursor study of the relation between humans and machine, and how do you human relate to them.
<br>
* OMAR KHOLEIF, Goodbye, World! — Looking at Art in the digital Age (2018)
Authors shares it’s own data as a journal in a part of the book, while on another part, question how the Internet has changed the way we perceive and relate, and interact with/to images.
<br>
<br>
In order to collect, read or/and use that piece of information, the visitor will inevitably have to come back within the range of detection, and intentionally, or not, reactivate the data tracking game. It is therefore impossible to leave the area of detection without leaving at least one piece of your own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets give a precise trace of the behavior and actions of previous subjects for future subjects. <br>
* KATRIN FRITSCH, Towards an emancipatory understanding of widespread datafication (2018)
Suggests that in response to our society of surveillance, artists can suggest activist response that doesn’t necessarily involve technological literacy, but instead can promote strong counter metaphors or/and counter use of these intrusive technologies.


===Why do you want to make it?===

When browsing online or using connected devices in the physical world, even the most innocent action or piece of information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques as implicitly political, playing with dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.
<br><br>
In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as the revelations of Edward Snowden, the Cambridge Analytica scandal and the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals, groups, governments and associations have taken legal action, surveillance capitalism remains generally accepted, often because it is ignored or misunderstood.
<br><br>
Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual push by the same companies to monitor the physical world and our human existences in a wide array of contexts: satellite and street photography (Google Earth, Street View), geolocalization systems, simulated three-dimensional environments (augmented reality, virtual reality, the metaverse) and extensions of our brains and bodies (voice assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data-sharing apps, which legitimize and encourage the datafication of the body for capitalistic purposes.
<br><br>
For the last 15 years, self-tracking tools have made their way to consumers. I believe this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. To adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment, or even active engagement, with the tools and platforms through which this surveillance is operated and allowed. In doing so, I want to ask how these companies still manage to get our consent and which human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effects of gamification, gambling and reward systems, as well as the aestheticization of data and self-data, as means to hook our attention, create ever more interactions and orient our behaviors.

===How do you plan to make it and on what timetable?===

I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.<br><br>
How does it work?
<br><br>
The ultrasonic sensors can detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound pulse and receiving its echo. The Arduino Uno/Mega boards are microcontrollers which receive this information and run it through a program in order to convert these values into mm/cm/m, but also to map the space into an invisible grid. Ultimately, values collected on the Arduino’s serial monitor can be sent to P5.js through p5.serialcontrol. P5.js then allows greater freedom in the way the information can be displayed on the screens.
<br><br>
Process:
<br><br>
1st semester: Building a monitoring device, converting human actions into events, and events into visual feedback.
<br><br>
During the first semester, I am focused on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and simplest prototype and gradually increase its scale and technicality until reaching human/architectural scale. Prototypes are subject to testing, documentation and comments, helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated three-dimensional simulations, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop: human behaviors are converted into information; this information is translated into written/visual feedback; this feedback affects human behavior; and so on.
<br><br>
2nd semester: Implementing gamification with the help of collaborative filtering, a point system and a ranking.
<br><br>
During the second semester, it is all about building and implementing a narration with the help of gaming mechanics that will encourage humans to feed the data-gathering device of their own accord. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.
<br><br>
To summarize the storyline, the subject positioned in the detection zone finds herself/himself unwillingly embodied as the main actor of a data collection game. Her/his mere presence generates a number of points/dollars displayed on a screen, growing as she/he stays within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data collector. This is done by setting clear goals/rewards for the subject, comparing their performance to that of all previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.
<br><br>
The mechanism is based on a sample of physical events that have already been explored in the first semester of prototyping (detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific area of detection). Every single detected event in this installation is stored in a data bank and, with the help of collaborative filtering, will allow the display of custom recommendations such as:
<br><br>
_ “Congratulations, you have now given the monitor 12 % of all possible data to collect”<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot”<br>
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers”<br>
_ “Leaving already? The monitor still has 304759 crucial pieces of information to collect from you”<br>
_ “You are only 93860 actions away from being the top one data-giver”<br>
_ “Statistics are showing that people staying for more than 5 minutes on average will be 10 times more beneficial for me”<br>
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”<br>
<br><br>
The guideline is set out here, but will be constantly updated with the help of experiments and the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio) and will allow anyone who deliberately wishes to interact with it to do so. Beyond the voluntary interactions, my interest is also to see what can be extracted from people simply passing in front of this installation. In addition, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral and organized experiences with the participants (e.g. 15 February 2022 with Louisa).
<br><br>
This semester will also include the creation of a definite set of illustrations helping to engage the participants of the installation in a more emotional way; the illustrations will be made by an illustrator/designer with whom I usually collaborate.
<br><br>
3rd semester: Building the final installation for the final assessment and graduation show. Test runs, debugging and final touches.
<br><br>
During the third semester, the installation should be set up in the school, in the alumni area next to the XPUB studio, for the final assessment, and ultimately set up again at WORM for the graduation show. I am interested in putting this installation into privileged spaces of human circulation (such as hallways) that would more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narration, the mechanics, the illustrations and the graphic aspect should be finalized at the beginning of the 3rd semester, and subjected to intense test runs during all that period until meeting the deadline.

=<p style="font-family:helvetica">Prototyping</p>=

==<p style="font-family:helvetica">Arduino</p>==

Early sketch comparing and questioning our spectator experience of a physical exhibition space (where everything is often fixed and institutionalized) with our user experience of a Web space (where everything is far more elastic, unpredictable and obsolete). I’m interested in how differently the same Web page can be rendered for different users depending on technological contexts (device, browser, IP address, screen size, zoom level, default settings, updates, luminosity, add-ons, restrictions, etc.). I would like to create a physical exhibition space/installation inspired by the technology of a Web user window interface, in order to play with exhibition parameters such as the distance between the spectator and the artwork, the circulation in space, the luminosity/lighting of the artwork(s), the sound/acoustics, etc.
<br><br>
The distance between the wall behind the spectator and the artwork has to be translated into a variable that can affect sound or light in the room.
<br><br>
The wall position could be connected to the dimensions of a user interface in real time with an Arduino and a motor.
<br><br>

===<p style="font-family:helvetica">Create a connected telemeter with an Arduino, an ultrasonic sensor (HC-SR04) and an ESP8266 module connected to the Internet</p>===

It seems possible to create your own telemeter with an Arduino by implementing an [https://components101.com/sensors/ultrasonic-sensor-working-pinout-datasheet ultrasonic sensor HC-SR04].<br>
By doing so, the values captured by the sensor could potentially be translated directly into a variable.<br> Then, with the ESP8266 module, the values could be sent to a database on the internet. I could then visit that website, see the values from anywhere and use them to control light, sound or anything else I wish.
<br><br>
* https://pzwiki.wdka.nl/mediadesign/Arduino101#Anatomy_of_a_sketch

===<p style="font-family:helvetica">Tool/Material list:</p>===

* Telemeter (used to get the distance between the device and an obstacle)<br>
* Rails
* Handles
* Wheels
* Movable light wall
* Fixed walls
* USB cable
* Connection cables
* [https://www.arduino.cc/ Arduino]
* ESP8266


[[File:Connexioncable.jpg|200px|thumb|left|Connection cables (Arduino)]]
[[File:Usb cable1.jpg|200px|thumb|center|USB cable]]
[[File:Arduino1.png|200px|thumb|left|Arduino]]
[[File:HC-SR04 Ultrasonic Sensor.jpg|200px|thumb|center|HC-SR04 ultrasonic sensor]]
[[File:Plywood.jpg|200px|thumb|left|Plywood x 3]]
[[File:Handle.jpg|200px|thumb|center|Handle]]
[[File:ESP8266.jpg|200px|thumb|left|ESP8266]]
[[File:Rail.jpg|200px|thumb|center|Rail]]
<br><br>

===<p style="font-family:helvetica">About the ultrasonic sensor (HC-SR04)</p>===

====<p style="font-family:helvetica">Characteristics</p>====

Here are a few of the technical characteristics of the HC-SR04 ultrasonic sensor: <br>

*    Power supply: 5 V.
*    Consumption in use: 15 mA.
*    Distance range: 2 cm to 5 m.
*    Resolution or accuracy: 3 mm.
*    Measuring angle: < 15°.

[https://elec13.wordpress.com/2016/10/15/realisez-un-telemetre-avec-le-hc-sr04-et-une-carte-arduino/ Ref]
More info about the sensor [https://www.robot-maker.com/shop/img/cms/datasheet-capteur-ultrasons-hc-sr04.pdf here] and [https://www.aranacorp.com/fr/mesure-de-distance-avec-un-capteur-hc-sr04/ here].

====<p style="font-family:helvetica">Where to buy the ultrasonic sensor (HC-SR04)</p>====

*  1 piece = 9,57 € - https://fr.shopping.rakuten.com/offer/buy/7112482554/module-de-mesure-a-ultrasons-hc-sr04-capteur-de-mesure-de-distance-5v-pour.html?t=7036&bbaid=8830210388
*  20 pieces = 34,22 € - https://fr.shopping.rakuten.com/offer/buy/7112482554/module-de-mesure-a-ultrasons-hc-sr04-capteur-de-mesure-de-distance-5v-pour.html?t=7036&bbaid=8830210388

===Relation to larger context===

As GAFAM companies face more and more legal issues and are held accountable in a growing number of social and political controversies around the world, the pandemic context has greatly contributed to making all of us more dependent than ever on the online services provided by these companies, and to somehow forcing our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made the use of new technologies even more necessary for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In many countries, from one day to the next, and for an undetermined time, it has become necessary to carry a smartphone (or a printed QR code) in order to get access to transport, entertainment, cultural and catering services, but also in order to do simple things such as having a look at the menu in a bar or restaurant, or placing an order. Thus, this project takes place in a context where techno-surveillance has definitely taken a determining place in the way we can access spaces and services related to the physical world. <br><br>

Data Marketisation / Self-Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics

===Who can help you?===

<b>Prototyping</b><br>
Manetta Berends<br>
Michael Murtaugh<br>
<br>
<b>Writing</b> <br>
Luke Williams<br>
<br>
<b>Arduino</b><br>
XPUB Arduino knowledge sharing group<br>
Dennis de Bel <br>
Aymeric Mansoux<br>
<br>
<b>Installation building</b><br>
Wood station<br>
Interaction station <br>
<br>
<b>Illustrations</b><br>
Adrien Jacquemet (illustrator/graphic designer)<br>
<br>
<b>Installation location</b> <br>
Leslie Robbins<br>
<br>
<b>Extra help for game narratives</b><br>
Rosa Zangenberg (writer/artist)<br>

=== Prototype 1 : Arduino + Resistor ===

During a workshop, we started with a very basic fake Arduino kit, an LED, a motor and a sensor.
After making a few connections, we began to understand a bit of how it works.


    #include <Servo.h>
    Servo myservo;  // create servo object to control a servo
    int pos = 0;    // variable to store the servo position
    int ldr = 0;    // variable to store light intensity
    void setup() {
      Serial.begin(9600); // begin serial communication; NOTE: set the same baud rate in the serial monitor/plotter
      myservo.attach(D7); // attaches the servo on pin D7 to the servo object
    }


    void loop() {
      // let's put the LDR value in a variable we can reuse
      ldr = analogRead(A0);
      // the value of the LDR is between 400-900 at the moment,
      // but the servo can only go from 0-180,
      // so we need to translate 400-900 to 0-180.
      // Also, the LDR value might change depending on the light of day,
      // so we need to 'constrain' the value to a certain range first
      ldr = constrain(ldr, 400, 900);
      // now we can translate
      ldr = map(ldr, 400, 900, 0, 180);
      // let's print the LDR value to the serial monitor to see if we did a good job
      Serial.println(ldr); // read voltage on analog pin 0, print the value to the serial monitor
      // now we can move the servo according to the light/our hand!
      myservo.write(ldr);   // tell the servo to go to the position in variable 'ldr'
      delay(15);
    }

==<p style="font-family:helvetica">Reading Sources</p>==

* [ Bootleg]
* [https://aaaaarg.fail/ Aaaarg]
* [https://www.jstor.org/ JSTOR]

==<p style="font-family:helvetica">Themes (keywords)</p>==

* Interfaced Reality
* Museum Display vs Screen display
* Exhibition space vs User interface
* Web Elasticity vs Physical Rigidity
* Museology / Curation / Gallery and Museum display
* Technological context
* Mediatization of Media / Meta Art
[[File:Servo-breadboard.jpg|300px|thumb|left|How to make a engine work<br>credits: Dennis de Bel]]
[[File:Ldrbreadboard.jpg|300px|thumb|center|How to make a sensor work<br>Credits: Dennis de Bel]]
[[File:Combined-breadboard2.jpg|300px|thumb|left|How to make both sensor and engine works together<br>Credits: Dennis de Bel]]
[[File:Sensortest.gif|300px|thumb|center|Sensortest during workshop]]


=<p style="font-family:helvetica">Draft Thesis</p>=
=== Split Screen Arduino + Sensor + Serial Plotter + Responsive Space ===


===What do you want to make?===
Trying here to show the simutaneous responses between the sensor, the values, and the simualtion.


My project is a data collection installation that monitors people's behaviors in public physical spaces while explicitly encouraging them to help the algorithm collect more information. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.  
[[File:Splitscreen.gif|800px|thumb|left|Splitscreen Arduino + Sensor + Serial Plotter + Responsive Space]]
<br><br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
The way the device is designed doesn't pretend to offer any beneficial outcome to the subject; it only makes visible the benefits that the machine gets from collecting their data. Yet the device presents this collected data, visually or verbally, in a grateful way, which might be stimulating for the subject. In that sense the subject, despite knowing that their actions serve only to satisfy the device, could become intrigued, involved, or even addicted by a mechanism that deliberately uses them as a commodity. In that way, I intend to trigger conflicting feelings in the visitor's mind, situated between a state of awareness regarding the ongoing monetization of their physical behaviors, and a state of engagement/entertainment/stimulation regarding the interactive value of the installation.
<br><br>
My first desire is to make the mechanisms by which data collection is carried out, marketized and legitimized both understandable and accessible. The array of sensors, the Arduinos and the screen are the main technological components of this installation. Rather than using an already existing and complex tracking algorithm, the program is built from scratch, kept open source, and limits itself to converting a restricted range of physical actions into interactions. These include the detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific area of detection. Optionally, they may also include the detection of the subject's smartphone or the subject logging on to a local Wi-Fi hotspot.
<br><br>
In terms of mechanics, the algorithm creates feedback loops starting from: <br>
_ the subject's behaviors being converted into information; <br>
_ the translation of this information into written/visual feedback; <br>
_ and the effects of this feedback on the subject's behavior; and so on. <br>
By doing so, it tries to shape the visitors as free data providers inside their own physical environment, and stimulates their engagement by converting each piece of collected information into points/money, feeding a user score within a global ranking.
<br><br>
On the screen, displayed events can be:
<br>
_ “subject [] currently located at [ ]” <br>
[x points earned/given]<br>
_ “subject [] entered the space” <br>
[x points earned/given]<br>
_ “subject [] left the space”<br>
[x points earned/given]<br>
_ “subject [] moving/not moving”<br>
[x points earned/given]<br>
_ “subject [] distance to screen: [ ] cm” <br>
[x points earned/given]<br>
_ “subject [] stayed at [ ] for [ ] seconds” <br>
[x points earned/given]<br>
_ “subject [] device detected” <br>
[x points earned/given] (optional)<br>
_ “subject logged onto local Wi-Fi”<br>
[x points earned/given] (optional)<br>
<br>
Added to that come the instructions and comments from the device in reaction to the subject’s behaviors:<br>
<br>
_ “Congratulations, you have now given the monitor 25% of all possible data to collect!” <br>
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”<br>
[if the subject stands still in a specific location]<br>
_ “Congratulations, the monitor has collected 1000 pieces of information from you!”<br>
[unlocked at x points earned/given]<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”<br>
[if the subject stands still in a specific location]<br>
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you!”<br>
[if the subject is at the edge of the detection range]<br>
_ “You are only 93860 pieces of information away from being the top one data-giver!”<br>
[unlocked at x points earned/given]<br>
_ “Statistics show that people staying for more than 5 minutes will on average benefit me 10 times more!”<br>
[randomly appears]<br>
_ “The longer you stay on this spot, the more chance you have to win a ‘Lazy data-giver’ badge”<br>
[if the subject stands still for a long time in any location]<br>
<br>
Responding positively to the monitor’s instructions unlocks special achievements and extra points:<br>
<br>
—Accidental data-giver badge <br>
[unlocked if the subject has passed the facility without deliberately wishing to interact with it] + [x points earned/given]<br>
—Lazy data-giver badge <br>
[unlocked if the subject has been standing still for at least one minute] + [x points earned/given]<br>
—Novice data-giver badge <br>
[unlocked if the subject has successfully completed 5 missions from the monitor] + [x points earned/given]<br>
—Hyperactive data-giver badge <br>
[unlocked if the subject has never stood still for 10 seconds within a 2-minute lapse of time] + [x points earned/given]<br>
—Expert data-giver badge <br>
[unlocked if the subject has successfully completed 10 missions from the monitor within 10 minutes] + [x points earned/given]<br>
—Master data-giver badge <br>
[unlocked if the subject has successfully logged on to the local Wi-Fi hotspot] + [x points earned/given] (optional)<br>
<br>
On the top left side of the screen, a user score displays the number of points generated by the collected pieces of information and by the unlocking of special achievements instructed by the monitor.<br>
<br>
—Given information: 298 pieces <br>
[displays the number of collected events]<br>
—Points: 312000 <br>
[conversion of collected events and achievements into points]<br>
<br>
On the top right of the screen, the user is ranked among the x previous visitors, and the most prestigious badge recently earned is displayed below.<br>
<br>
—subject global ranking: 3/42 <br>
[compares the subject’s score to all final scores from previous subjects]<br>
—subject status: expert data-giver<br>
[displays the most valuable reward unlocked by the subject]<br>
<br>
When leaving the detection range, the subject gets a warning message and a countdown starts, encouraging them to take the quick decision to come back:<br>
<br>
—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”<br>
[displayed as long as the subject remains completely undetected]<br>
<br>
If the subject stays out of the detection range for more than 5 seconds, the monitor will also address a thankful message, and the amount of money gathered, achievements, ranking, the complete list of collected information and a QR code will be printed as a receipt with the help of a thermal printer. The QR code will be a link to my thesis.<br>
<br>
—“Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”<br>
[displayed after 5 seconds of being undetected]<br>
<br>
In order to collect, read or/and use that piece of information, the visitor will inevitably have to come back within the range of detection and, intentionally or not, reactivate the data tracking game. It is therefore impossible to leave the area of detection without leaving at least one piece of your own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets give future subjects a precise trace of the behavior and actions of previous subjects. <br>

===<p style="font-family:helvetica">Prototype 2: Arduino + Ultrasonic sensor</p> ===

For this very simple first sketch, and for later ones, I include the NewPing library, which greatly improves the ultrasonic sensor's capacities.

[[File:ArduinoUno Sensor Sketch1.jpg|400px|thumb|left|Sketch 1: Arduino Uno + Sensor]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 #include <NewPing.h>
 
 int echoPin = 10;
 int trigPin = 9;
 
 NewPing MySensor(trigPin, echoPin); // defines a new NewPing object
 
 void setup() {
   Serial.begin(9600); // starts the serial communication
 }
 
 void loop() {
   int duration = MySensor.ping_median();        // median echo time, in microseconds
   int distance = MySensor.convert_cm(duration); // converts the echo time to centimeters
   Serial.print(distance);
   Serial.println("cm");
   delay(250);
 }

===<p style="font-family:helvetica">Prototype 3: Arduino Uno + Sensor + LCD (+ LED)</p> ===

Put together by following https://www.youtube.com/watch?v=GOwB57UilhQ

[[File:Sketch2ArduinoUnoSensorLCD.jpg|400px|thumb|left|Sketch 2: Arduino Uno + Sensor + LCD]]
[[File:Sketch3ArduinoUnoSensorLCDLED.jpg|400px|thumb|right|Sketch 3: Arduino Uno + Sensor + LCD + LED]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 #include <LiquidCrystal.h>
 
 LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration;
 int distance;
 
 void setup() {
   analogWrite(6,100);       // drives the LCD backlight via PWM on pin 6
   lcd.begin(16,2);
   pinMode(trigPin, OUTPUT); // sets the trigPin as an output
   pinMode(echoPin, INPUT);  // sets the echoPin as an input
   Serial.begin(9600);       // starts the serial communication
 }
 
 void loop() {
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(1000);
   digitalWrite(trigPin, LOW);
   duration = pulseIn(echoPin, HIGH);
   distance = (duration/2)/29.1; // converts the echo time to centimeters
 
   // prints the distance on the Serial Monitor
   Serial.print("Distance: ");
   Serial.print(distance);
   Serial.println(" cm");
 
   lcd.clear();
   lcd.setCursor(0,0);
   lcd.print("Distance = ");
   lcd.setCursor(11,0);
   lcd.print(distance);
   lcd.setCursor(14,0);
   lcd.print("CM");
 
   delay(500);
 }
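The points/ranking mechanic described in the proposal above can be sketched in plain C++, so it can be tested on a computer before being folded into an Arduino sketch. All names and point values here (EventType, pointsFor, DataGiver, rankAmong) are hypothetical placeholders of my own and are not part of the existing prototypes.

```cpp
#include <vector>

// Hypothetical event types, mirroring the list of displayed events above.
enum EventType { ENTERED, LEFT, MOVING, DISTANCE_READ, STAYED, DEVICE_DETECTED };

// Invented point values: each collected piece of information earns points.
int pointsFor(EventType e) {
    switch (e) {
        case ENTERED:         return 100;
        case LEFT:            return 50;
        case MOVING:          return 10;
        case DISTANCE_READ:   return 5;
        case STAYED:          return 20;
        case DEVICE_DETECTED: return 500; // optional event
    }
    return 0;
}

// One subject's running score, as shown on the top left of the screen:
// "Given information: n pieces" / "Points: n".
struct DataGiver {
    int pieces = 0;
    int points = 0;
    void collect(EventType e) { pieces += 1; points += pointsFor(e); }
};

// "subject global ranking: r/n": count how many previous final scores beat us.
int rankAmong(int score, const std::vector<int>& previousScores) {
    int rank = 1;
    for (int s : previousScores)
        if (s > score) rank += 1;
    return rank;
}
```

Keeping the scoring separate from the sensor code like this would make it easy to tune the (invented) point values once the real events start coming in.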


From this sketch, I started considering that the distance value could be sent directly to a computer and render a Web page depending on its value.
<br>Note: it looks like this sensor's maximum range is 119 cm, which is almost 4 times less than the 4-meter maximum range stated in the component description.


===Why do you want to make it?===

When browsing online or/and using connected devices in the physical world, even the most innocent action or piece of information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques as implicitly political, playing with dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.
<br><br>
In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as the revelations of Edward Snowden, the Cambridge Analytica scandal or the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals/groups/governments/associations have taken legal action, surveillance capitalism still remains generally accepted, often because it is ignored or/and misunderstood.
<br><br>
Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have also been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual involvement from the same companies in monitoring the physical world and our human existences in a wide array of contexts: for example, with satellite and street photography (Google Earth, Street View), geolocalization systems, simulated three-dimensional environments (augmented reality, virtual reality or the metaverse) or extensions of our brains and bodies (vocal assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data sharing apps, which legitimize and encourage the datafication of the body for capitalistic purposes.
<br><br>
For the last 15 years, self-tracking tools have made their way to consumers. I believe that this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. Indeed, to adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment or even active engagement with the tools/platforms through which this surveillance is operated/allowed. By doing so, I want to ask how these companies still manage to get our consent and what human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effect of gamification, gambling and reward systems, as well as the aestheticization of data/self-data, as means to hook our attention, create ever more interactions and orient our behaviors.


===How do you plan to make it and on what timetable?===

I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.<br><br>
How does it work?
<br><br>
The ultrasonic sensors can detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound and receiving it back. The Arduino Uno/Mega boards are microcontrollers which can receive this information and run it through a program in order to convert these values into mm/cm/m, but also map the space into an invisible grid. Ultimately, values collected on the Arduino's serial monitor can be sent to P5.js through p5.serialcontrol. P5.js will then allow a greater freedom in the way the information can be displayed on the screens.
<br><br>
Process:
<br><br>
1st semester: Building a monitoring device, converting human actions into events, and events into visual feedback.
<br><br>
During the first semester, I am focused on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and most simple prototype and gradually increase its scale/technicality until reaching human/architectural scale. Prototypes are subject to testing, documentation and comments helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate screen monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated simulations in three dimensions, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop from the human behaviors being converted into information; the translation of this information into written/visual feedback; and the effects of this feedback on human behavior; and so on.
<br><br>
2nd semester: Implementing gamification with the help of collaborative filtering, a point system and ranking.
<br><br>
During the second semester, it is all about building and implementing a narration with the help of gaming mechanics that will encourage humans to feed the data gathering device with their own help. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.
<br><br>
To summarize the storyline: the subject positioned in the detection zone finds herself/himself unwillingly embodied as the main actor of a data collection game. Her/his mere presence generates a number of points/dollars displayed on a screen, growing as she/he stays within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data-collector. This can be done by setting clear goals/rewards for the subject, putting their performance in comparison with all the previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.
<br><br>
The mechanism is based on a sample of physical events that have already been explored in the first semester of prototyping (detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific area of detection). Every single detected event in this installation is stored in a data bank and, with the help of collaborative filtering, will allow the display of custom recommendations such as:
<br><br>
_ “Congratulations, you have now given the monitor 12% of all possible data to collect”<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot”<br>
_ “Congratulations, the monitor has collected 1000 pieces of information from you!”<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers”<br>
_ “Leaving already? The monitor has yet 304759 crucial pieces of information to collect from you”<br>
_ “You are only 93860 actions away from being the top one data-giver”<br>
_ “Statistics show that people staying for more than 5 minutes will on average be 10 times more beneficial to me”<br>
_ “The longer you stay on this spot, the more chance you have to win a ‘Lazy data-giver’ badge”<br>
<br><br>
The guideline is set out here, but will be constantly updated with the help of experiments and the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio) and will allow anyone who deliberately wishes to interact with it to do so. Beyond the voluntary interactions, my interest is also to see what can be extracted from people simply passing in front of this installation. In addition to this, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral and organized experiences with the participants (e.g. 15 February 2022 with Louisa).
<br><br>
This semester will also include the creation of a definite set of illustrations helping to engage the participants of the installation in a more emotional way; the illustrations will be made by an illustrator/designer with whom I usually collaborate.
<br><br>
3rd semester: Building the final installation for the final assessment and graduation show. Test runs, debugging and final touches.
<br><br>
During the third semester, the installation should be set up in the school, in the alumni area next to the XPUB studio, for the final assessment, and ultimately set up again at WORM for the graduation show. I am interested in putting this installation into privileged spaces of human circulation (such as hallways) that would more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narration, the mechanics, the illustrations and the graphic aspect should be finalized at the beginning of the 3rd semester, and subject to intense test runs during all that period until meeting the deadline.


===Relation to larger context===

As GAFAM companies face more and more legal issues and are held accountable for a growing number of social and political issues around the world, the pandemic context has greatly contributed to making all of us more dependent than ever on the online services provided by these companies, and to somehow forcing our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made even more necessary the use of new technologies for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In a lot of countries, from one day to the next and for an undetermined time, it has become necessary to carry a smartphone (or a printed QR code) in order to get access to transport, entertainment, cultural and catering services, but also in order to do simple things such as looking at the menu in a bar/restaurant or placing an order. Thus, this project takes place in a context where techno-surveillance has definitely taken a determining place in the way we can access spaces and services related to the physical world. <br><br>

Data Marketisation / Self Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics


===Who can help you?===

<b>Prototyping</b><br>
Manetta Berends<br>
Michael Murtaugh<br>
<br>
<b>Writing</b> <br>
Luke Williams<br>
<br>
<b>Arduino</b><br>
XPUB Arduino knowledge sharing group<br>
Dennis de Bel <br>
Aymeric Mansoux<br>
<br>
<b>Installation building</b><br>
Wood station<br>
Interaction station <br>
<br>
<b>Illustrations</b><br>
Adrien Jacquemet (illustrator/graphic designer)<br>
<br>
<b>Installation location</b> <br>
Leslie Robbins<br>
<br>
<b>Extra help for game narratives</b><br>
Rosa Zangenberg (writer/artist)<br>

===<p style="font-family:helvetica">Prototype 4: Arduino Uno + Sensor + LCD + 2 LED = Physical vs Digital Range detector</p> ===

Using in-between values to activate the green LED.<br>
Once again, putting together the simulation and the device in use.

[[File:SensorSpace.gif|400px|thumb|left|Sensor Test VS Elastic Space]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 #include <LiquidCrystal.h>
 #include <LcdBarGraph.h>
 #include <NewPing.h>
 
 LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int LED1 = 13;
 const int LED2 = 8;
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration; // travel time
 int distance;
 int screensize;
 
 void setup() {
   analogWrite(6,100);       // drives the LCD backlight via PWM on pin 6
   lcd.begin(16,2);
   pinMode(trigPin, OUTPUT); // sets the trigPin as an output
   pinMode(echoPin, INPUT);  // sets the echoPin as an input
   Serial.begin(9600);       // starts the serial communication
   pinMode(LED1, OUTPUT);
   pinMode(LED2, OUTPUT);
 }
 
 void loop() {
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(1000);
   digitalWrite(trigPin, LOW);
   duration = pulseIn(echoPin, HIGH);
   distance = (duration/2)/29.1; // converts the echo time to centimeters
   screensize = distance*85;    // maps the distance to a screen width in pixels
 
   // green LED inside the 15-20 cm band, red LED otherwise
   if ((distance >= 15) && (distance <= 20)) {
     digitalWrite(LED2, HIGH);
     digitalWrite(LED1, LOW);
   } else {
     digitalWrite(LED1, HIGH);
     digitalWrite(LED2, LOW);
   }
 
   // prints the distance and screen size on the Serial Monitor
   Serial.print("Distance: ");
   Serial.print(distance);
   Serial.println(" cm");
   Serial.println(screensize);
 
   lcd.clear();
   lcd.setCursor(0,0);
   lcd.print("ROOM");
   lcd.setCursor(6,0);
   lcd.print(distance);
   lcd.setCursor(9,0);
   lcd.print("cm");
   lcd.setCursor(0,1); // second row of the 16x2 LCD
   lcd.print("SCR");
   lcd.setCursor(6,1);
   lcd.print(screensize);
   lcd.setCursor(9,1);
   lcd.print("x1080px");
 
   delay(500);
 }
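The "invisible grid" idea mentioned in the plan (converting echo times into centimeters, then mapping two perpendicular readings onto grid cells) can be sketched as two pure C++ functions, testable outside the Arduino. The function names and the cell size are illustrative assumptions of mine, not taken from the existing sketches.

```cpp
// Same conversion as in the sketches above: echo travel time (microseconds) to cm.
int toCentimeters(long durationMicros) {
    return (int)((durationMicros / 2) / 29.1);
}

// Hypothetical grid mapping: two perpendicular distance readings (x and y,
// in cm) fall into a coarse cell of the invisible grid.
struct Cell { int col; int row; };

Cell locate(int xDistanceCm, int yDistanceCm, int cellSizeCm) {
    Cell c;
    c.col = xDistanceCm / cellSizeCm;
    c.row = yDistanceCm / cellSizeCm;
    return c;
}
```

With a second row of sensors (as planned for Prototypes 5-6), one reading per axis would be enough to call a function like this and name the subject's cell.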
===Relation to previous practice?===

During my previous studies in graphic design, I first engaged with new media by making a small online reissue of Raymond Queneau's book Exercices de Style. In this issue, called Incidences Médiatiques (2017), the user/reader was encouraged to explore the 99 different versions of the same story written by the author in a less linear way. The idea was to consider each graphical user interface as a unique reading context: it would determine which story could be read, depending on the device used by the reader, and the user could navigate through these stories by resizing the Web window, by changing browser or by using a different device.
<br><br>
As part of my graduation project, called Media Spaces (2019), I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a Web-to-print website. Subsequently, this website was translated into physical space as a printed book and a series of installations displayed in an exhibition space, following the online structure of my thesis (home page, index, parts 1-2-3-4). In that way, I was interested in inviting visitors to physically experience some aspects of the Web.
<br><br>
As a first-year student of Experimental Publishing, I continued to work in that direction by creating a meta-website called Tense (2020), which displays the invisible HTML <meta> tags inside an essay in order to affect our interpretation of the text. In 2021, I worked on a geocaching pinball game highlighting invisible Web events, and a Web oscillator whose amplitude and frequency range were directly related to the user's cursor position and browser screen size.
<br><br>
While it has always been clear to me that these works were motivated by the desire to define media as context, subject or/and content, the projects presented here have often made use of surveillance tools to detect and translate user information into feedback, participating in the construction of an individualized narrative or/and a unique viewing/listening context (interaction, screen size, browser, mouse position). The current work aims to take a critical look at the effect of these practices in the context of techno-surveillance.
<br><br>
Similarly, projects such as Media Spaces have sought to explore the growing confusion between human and Web user, physical and virtual space, or online and offline spaces. This project will demonstrate that these growing confusions will eventually lead us to be tracked in all circumstances, even in our most innocuous daily human activities/actions.




I brought a second Arduino, 2 long breadboards, black cables and another LCD screen, and remade the setup in this format.
For some reason the new LCD screen does not fit in the breadboard, and I need more male-to-female cables in order to connect it correctly.
With this longer breadboard, I want to extend the range value system and make it visible with LEDs and sounds.

[[File:Arduino Setup V3.jpg|400px|thumb|left|Upgrade]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">How to get more digital pins [not working]</p> ===

* How to use analog pins as digital pins https://www.youtube.com/watch?v=_AAbGLBWk5s
* Up to 60 more pins with the Arduino Mega https://www.tinytronics.nl/shop/en/development-boards/microcontroller-boards/arduino-compatible/mega-2560-r3-with-usb-cable

I tried 4 different tutorials but still didn't find a way to make it work, which is very weird, so I will just give up and take an Arduino Mega =*(

[[File:ArduinoExtraDigitalPins.jpg|300px|thumb|left|ArduinoExtraDigitalPins]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Prototype 5: Arduino Uno + 3 Sensors + 3 LEDs</p> ===

With a larger breadboard, connecting 3 sensors all together. The next step will be to define different ranges of in-between values for each sensor in order to make a grid.
<br>
To accomplish this grid I will make a second row of sensors like this one, in order to get x and y values in space.

===<p style="font-family:helvetica">Prototype 6: Arduino Uno + 3 Sensors + 12 LEDs</p> ===

With 3 sensors added on 2 long breadboards, and with a different set of range values, we can start mapping a space.

[[File:SensorMediaQueries 01.gif|300px|thumb|left|SensorMediaQueries]]
[[File:Physical Space Mapping.png|500px|thumb|center|Physical Space Mapping]]

===Selected References===

<b>Works:</b>

* M. DARKE, fairlyintelligent.tech (2021) https://fairlyintelligent.tech/
« Invites us to take on the role of an auditor, tasked with addressing the biases in a speculative AI. » Alternatives to techno-surveillance.
<br>
* MANUEL BELTRAN, Data Production Labour (2018) https://v2.nl/archive/works/data-production-labour/
Exposes humans as producers of useful intellectual labor that benefits the tech giants, and the use that can be made of that labor.
<br>
* TEGA BRAIN and SURYA MATTU, Unfit-bits (2016) http://tegabrain.com/Unfit-Bits
Claims that technological devices can be manipulated easily and hence that they are fallible and subjective. They do this by simply placing a self-tracker (connected bracelet) in another context, such as on some other object, in order to confuse these devices.
<br>
* JACOB APPELBAUM, Autonomy Cube (2014) https://www.e-flux.com/announcements/2916/trevor-paglen-and-jacob-appelbaumautonomy-cube/
Allows galleries to enjoy encrypted internet access and communications through a Tor network.
<br>
* STUDIO MONIKER, Clickclickclick.click (2016) https://clickclickclick.click/
You are rewarded for exploring all the interactive possibilities of your mouse, revealing how our online behaviors can be monitored and interpreted by machines.
<br>
* RAFAEL LOZANO-HEMMER, Third Person (2006) https://www.lozano-hemmer.com/third_person.php
A portrait of the viewer is drawn in real time by active words, which appear automatically to fill his or her silhouette.
<br>
* JONAS LUND, What You See Is What You Get (2012) http://whatyouseeiswhatyouget.net/
« Every visitor to the website's browser size, collected, and played back sequentially, ending with your own. »
<br>
* USMAN HAQUE, Mischievous Museum (1997) https://haque.co.uk/work/mischievous-museum/
Readings of the building and its contents are therefore always unique -- no two visitors share the same experience.
<br><br>
<b>Books & Articles:</b>

* SHOSHANA ZUBOFF, The Age of Surveillance Capitalism (2020)
Warns against the shift towards a « surveillance capitalism ». Her thesis argues that, by appropriating our personal data, the digital giants are manipulating us and modifying our behavior, attacking our free will and threatening our freedoms and personal sovereignty.
<br>
* EVGENY MOROZOV, Capitalism’s New Clothes (2019)
An extensive analysis and critique of Shoshana Zuboff's research and publications.
<br>
* BYRON REEVES and CLIFFORD NASS, The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places (1996)
A precursor study of the relation between humans and machines, and of how humans relate to them.
<br>
* OMAR KHOLEIF, Goodbye, World! — Looking at Art in the Digital Age (2018)
The author shares his own data as a journal in one part of the book, while in another part he questions how the Internet has changed the way we perceive, relate and interact with/to images.
<br>
* KATRIN FRITSCH, Towards an emancipatory understanding of widespread datafication (2018)
Suggests that, in response to our society of surveillance, artists can propose activist responses that don't necessarily involve technological literacy, but instead can promote strong counter-metaphors or/and counter-uses of these intrusive technologies.


=<p style="font-family:helvetica">What is my work, What do I want to tell, What is my position</p>=
<br><br><br><br><br><br><br>


Translated from [https://pad.xpub.nl/p/2021-11-09-xpub2 discussion] with Michael
===<p style="font-family:helvetica">Prototype 7:  Arduino Uno + 12 LEDS + 3 Sensor + Buzzer + Potentiometer + LCD</p> ===


People have now more concrete experiences of the digital/Web interface than the physical space. Museums, hotels, houses, cars interiors, restaurants are themselves becoming more and more comparable to digital interface where everything is optimized, and where our behaviours, actions and even inactions are being detected and converted into commands in order to offer a more customized (and profitable) experience to each of us. In that sense, we are getting closer from becoming users of our own interfaced physical reality. By creating a exhibition spaces explicitly inspired from a desktop Web interface, I wish to question what could be the future of exhibition space, what are the limits of this interfaced and individualized reality and  how could it affect our own experience and understanding of art.
For this prototype, I implemented a buzzer that emits a specific sound depending on the distance of the obstacle detected by the sensor.
I also put back an LCD displaying the three sensor values. The screen brightness can be adjusted via a potentiometer.
<br>
<br>
Resources:
* https://samsneatprojectblogcode.blogspot.com/2016/06/piezo-buzzer-code-and-fritzing.html
* https://www.youtube.com/watch?v=m7bbfzZ2UNo
* https://www.youtube.com/watch?v=K8AnlUT0ng0


What could we learn from interface design?
[[File:ArduinoMegaSensorBuzzerLCD.jpg|300px|thumb|left|ArduinoMegaSensorBuzzerLCD]]
What could be the future of exhibition space?


"Bring attention to the systems underlying artistic productions" both on the Web and the physical world<br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
"reversal of the desktop metaphor" (using the virtual as "concrete" to metaphorically understand a physical exhibition space), what will be the future of an exhibition space... (some exhibition spaces are already working with sensors)
scary and fascinating at the same time...<br>
"my embracing/use of sensors isn't about proposing these techniques as a solution / ideal / about control... interfaces requiring less and less from us but paradoxically extracting more and more from us"<br>
every small unconsidered behaviour is being used (or someone is trying to use it)...<br>
there is unpredictability.... because of all the factors; want unexpected things to happen...<br>
the reality of digital isn't all about precision and control, this notion of surprise is key for an experience.<br>
Exploring the fullness of digital / programmed / computational media, including those "edge" cases / the "glitch" ... the surprise...<br>
Examples from museums: (for instance Brussels has the MIM, Musical Instruments Museum; sadly the old, now retired interface was a system with headphones that were activated in the space, so as you approached vitrines with a violin you would hear a performance of that instrument)... <br>
How a mistake can create something else. / Bugs / Glitch. Letting an accident/surprise/the unexpected exist, exploring the fullness of digital programming <br>
My position seems to fit with Researcher/Designer<br>
Digital is not precise and omnipotent; it has its faults, flaws and vulnerabilities.


===<p style="font-family:helvetica">Prototype 8:  Arduino Uno + 12 LEDs + 3 Sensors on mini breadboards + Buzzer + Potentiometer + LCD</p> ===


Same code, but a new setup detaching the sensors from each other and allowing them to be placed anywhere.


To check:
[[File:ArduinoMegaSensorBuzzerLCDMinibreadboard.jpg|300px|thumb|left]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
===<p style="font-family:helvetica">Prototype 9:  Arduino Uno + 21 LEDs + 7 Sensors + Buzzer + Potentiometer + LCD</p> ===


[https://arvindguptatoys.com/arvindgupta/mindstorms.pdf Mindstorms  Seymour Papert]  
[[File:Sensor Wall 01.png|300px|thumb|left|Sensor Wall 01]]
*  Serendipity
[[File:PhysicalMapping3.png|500px|thumb|right|PhysicalMapping2]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


===<p style="font-family:helvetica">Sketch 10:  Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js </p> ===


[[File:P5.js sensor.gif|300px|thumb|left|P5.js and ultrasonic sensor]]


==<p style="font-family:helvetica">Software Art</p>==
The goal here was to create a first communication between the physical setup and a P5.js web page


Software creation, or the use of software concepts, for the representation of artworks.
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
Commonly puts the spectator in the role of a user.


==<p style="font-family:helvetica">Internet Art</p>==
===<p style="font-family:helvetica">Sketch 11:  Arduino Mega + UltraSonicSensor + LCD TouchScreen </p> ===


Elements from the Internet brought outside of the Internet, promoting the Internet as part of both virtual and physical realities.
[[File:LCDArduinoVariableposter.gif|300px|thumb|left|LCD Arduino Variable poster]]
* John Ippolito


==<p style="font-family:helvetica">Post-Internet Art vs Internet 2.0</p>==
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


Post-Internet Art: Literally, art after the Internet. Can consist of using online material for later use in offline works, or can relate to the effect of the Internet on various aspects of culture, aesthetics and society.
==<p style="font-family:helvetica">Semester 2</p> ==
* Olia Lialina
VS <br>
Internet 2.0: Assuming that a world without the Internet doesn't exist anymore
* Zach Blas


==<p style="font-family:helvetica">Net Art</p>==
[[File:Simu part 02.gif|left|1000px]]


Started in the late 70s and nowadays associated with an outdated era of the Internet (1.0?)<br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
Closely related to Network Art
* Olia Lialina, My Boyfriend Came Back From the War, 1996


==<p style="font-family:helvetica">New Aesthetics</p>==
===<p style="font-family:helvetica">Sketch 12:  Arduino Uno + P5serialcontrol + P5.js web editor = Code descrypter </p> ===


Confronting/merging the virtual and the physical, or humans and machines, etc.
[[File:Codeword 01.png|300px|thumb|left|P]]
* James Bridle
[[File:Codeword 02.png|300px|thumb|center|I]]
[[File:Codeword 03.png|300px|thumb|left|G]]
[[File:Codeword 04.png|300px|thumb|center|E]]
[[File:Codeword 05.png|300px|thumb|left|O]]
[[File:Codeword 06.png|300px|thumb|center|N]]


==<p style="font-family:helvetica">Funware</p>==
<br><br>
<br><br>


Gamification of non-game platforms in order to encourage certain actions, behaviors or transactions with the help of various reward systems.
===<p style="font-family:helvetica">Sketch 13:  Arduino Uno + P5serialcontrol + P5.js web editor = Game </p> ===


=<p style="font-family:helvetica">Connections to XPUB1</p>=
[[File:01 Screen .png|300px|thumb|left|Stage 0<br>The subject is located too far away]]
[[File:02 Screen.png|300px|thumb|center|Stage 0<br>The subject is well located and will hold position to reach next stage]]
[[File:03 Screen.png|300px|thumb|left|Stage 1<br>The subject unlocked Stage 1 and will hold position to reach next stage ]]
[[File:04 Screen.png|300px|thumb|center|Stage 2<br>The subject unlocked Stage 2 and is located too close]]
[[File:06 Screen.png|300px|thumb|left|Stage 3<br>The subject unlocked Stage 3 and needs to get closer]]
[[File:07 Screen.png|300px|thumb|center|Transition Stage<br>The subject unlocked all stages and needs to wait for the countdown before the following steps]]
<br><br>


==<p style="font-family:helvetica">User viewing contexts (on the Web) from special issue 13</p>==
===<p style="font-family:helvetica">Sketch 14:  Arduino Uno + P5serialcontrol + P5.js web editor = Simplified interface</p> ===
 
[[File:Data Collector Sample 01.gif|400px|thumb|left]]


===<p style="font-family:helvetica">Description</p>===
<br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br>
<br><br>


Create motion from the diffraction of the user interface, which offers flexible and almost infinite possible renders of a same Web page. The perceptible variety of user viewing contexts speaks to the plasticity of the user interface, which results from the wide range of user devices, window or screen sizes and Web browsers (among many other parameters). A first motion capture and montage of this user interface plasticity can become part of the post-production of my interpretation of the essay "[https://issue.xpub.nl/13/TENSE/ Tense]", part of the [https://issue.xpub.nl/13/ Special Issue 13].
====<p style="font-family:helvetica">How to split serial data values when using more than one sensor</p>====


==== Capturing and putting into motion the user interface plasticity ====
* Use Split: function https://p5js.org/reference/#/p5/split
* Pad example: https://hub.xpub.nl/soupboat/pad/p/Martin
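As a sketch of the split approach above: assuming the Arduino prints all sensor readings on a single comma-separated serial line (the example line and value names below are made up), p5.js's split() behaves like the standard String split used here.

```javascript
// Hypothetical parser for a serial line such as "12,34,56" coming from
// an Arduino that prints several sensor values separated by commas.
// p5.js's split(line, ",") is equivalent to line.split(",") below.
function parseSensorLine(line) {
  // trim the trailing newline, split on commas, convert to numbers
  return line.trim().split(",").map(Number);
}

// e.g. three ultrasonic distances in cm
const [left, center, right] = parseSensorLine("12,34,56\n");
console.log(left, center, right); // 12 34 56
```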


Trying to play around with browser window resizing in order to create a playful animation intended to be a thumbnail of the project.
The first two screen captures will be the basis of the upcoming motion. I will first try to smooth the window movement and make the two captures fit together before synchronizing and looping them.


[[File:Debug Martin 01.png|500px|thumb|left]]
[[File:Debug Martin 05.png|500px|thumb|center]]
[[File:Debug Martin 02.png|500px|thumb|left]]
[[File:Debug Martin 03.png|500px|thumb|center]]
[[File:Debug Martin 04.png|500px|thumb|left]]
[[File:Debug Martin 06.png|500px|thumb|center]]


[[File:TENSE Motion Rectangle.gif|thumb|left|TENSE Motion Rectangle Format Loop in the loop]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br>
[[File:TENSEMOTIONInitialCapture.gif|thumb|right|TENSE MOTION Initial Screen Capture 1]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


Notes: Add Web Oscillator


=<p style="font-family:helvetica">Prototyping</p>=
===<p style="font-family:helvetica">Installation update</p>===


[[File:Installation Update 01.jpg|300px|thumb|left]]
[[File:Installation Update 02.jpg|300px|thumb|center]]


==<p style="font-family:helvetica">Arduino</p>==
<br><br><br><br><br><br><br><br>


Early sketch comparing and questioning our spectator experience of a physical exhibition space (where everything is often fixed and institutionalized) with our user experience of a Web space (where everything is far more elastic, unpredictable and obsolete). I'm interested in how differently the same Web page can be rendered for different users depending on the technological context (device type, browser, IP address, screen size, zoom level, default settings, updates, brightness, add-ons, restrictions, etc.). I would like to create a physical exhibition space/installation inspired by the technology of a Web user window interface, in order to then play with exhibition parameters such as the distance between the spectator and the artwork, the circulation in the space, the lighting of the artwork(s), the sound/acoustics, etc.
===<p style="font-family:helvetica">To do</p> ===
<br><br>
The distance between the wall behind the spectator and the artwork has to be translated into a variable that can affect sound or light in the room.
The wall position could be connected to the dimensions of a user interface in real time with an Arduino and a motor.
<br>
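A minimal sketch of that translation, with assumed numbers: a wall-to-artwork distance in centimeters is remapped to a 0-255 brightness value, mirroring Arduino's constrain()/map() pair.

```javascript
// Remap a value from one range to another, clamping it first
// (JavaScript equivalent of Arduino's constrain() + map()).
function mapRange(value, inMin, inMax, outMin, outMax) {
  const clamped = Math.min(Math.max(value, inMin), inMax);
  return outMin + ((clamped - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Assumed sensor range of 2-400 cm driving a light brightness of 0-255
console.log(mapRange(201, 2, 400, 0, 255));  // 127.5 (mid-distance -> mid-brightness)
console.log(mapRange(1000, 2, 400, 0, 255)); // 255 (clamped to max)
```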


===<p style="font-family:helvetica">Create a connected telemeter with an Arduino, an ultrasonic sensor (HC-SR04) and an ESP8266 module connected to the Internet </p>===
* Manage to store the data with WEB STORAGE API
** https://www.w3schools.com/JS/js_api_web_storage.asp
* Import Live data or Copy data from
** https://www.worldometers.info/
* Import Live data from stock market
** https://money.cnn.com/data/hotstocks/index.html
** https://www.tradingview.com/chart/?symbol=NASDAQ%3ALIVE
** https://www.google.com/finance/portfolio/972cea17-388c-4846-95da-4da948830b03
* Make a Bar graph
** https://openprocessing.org/sketch/1152792


It seems possible to create your own telemeter with an Arduino by implementing an [https://components101.com/sensors/ultrasonic-sensor-working-pinout-datasheet ultrasonic sensor HC-SR04].<br>
===<p style="font-family:helvetica">Stages Design</p> ===
By doing so, the values captured by the sensor could potentially be translated directly into a variable.<br> Then, with the ESP8266 module, the values could be pushed to a database on the Internet.
I could then visit that website, see the values from anywhere and use them to control light, sound or anything else I wish.
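The web side of this idea could then read the stored values back, for instance as JSON. This is only a sketch: the record shape (a `distance` field per reading) is an assumption, not an existing API.

```javascript
// Hypothetical: the ESP8266 appends readings to a web database; the
// controlling page fetches them as JSON and keeps the latest distance.
function latestDistance(jsonBody) {
  const readings = JSON.parse(jsonBody); // e.g. [{"t":1,"distance":90}, ...]
  return readings[readings.length - 1].distance;
}

const body = '[{"t":1,"distance":90},{"t":2,"distance":142}]';
console.log(latestDistance(body)); // 142
```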


* https://pzwiki.wdka.nl/mediadesign/Arduino101#Anatomy_of_a_sketch
Many stages (mini-levels) are being designed. They are all intended to evoke the different contexts and pretexts in which we perform daily micro-tasks that collect data.
<br>
The visitor can unlock the next step by successfully completing one or more tasks in a row. After a while, even if the action is not successful, a new step will appear with a different interface and instructions.  
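The unlock rule described above (advance on success, or fall through after a while even without success) can be sketched like this; the timeout value is an arbitrary assumption.

```javascript
// Advance to the next stage when the task is completed, or
// automatically once a timeout has elapsed (the fallback described above).
function nextStage(stage, taskDone, elapsedMs, timeoutMs = 30000) {
  return taskDone || elapsedMs >= timeoutMs ? stage + 1 : stage;
}

console.log(nextStage(2, true, 1000));   // 3: task completed
console.log(nextStage(2, false, 45000)); // 3: timeout fallback
console.log(nextStage(2, false, 1000));  // 2: still waiting
```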


===<p style="font-family:helvetica">Tool/Material list: </p>===
The list below details the different stages being produced; they will follow each other randomly during the session:


* Telemeter (used to get the distance between the device and an obstacle)<br>
** Money/Slot Machine
* Rails
** Well-Being
* Handles
** Healthcare
* Wheels
** Yoga
* Movable light wall
** Self-Management
* Fixed walls
** Stock Market Exchange
* USB Cable
** Military interface
* Connexion cables
 
* [https://www.arduino.cc/ Arduino]
The visuals below illustrate their design.
* ESP8266


[[File:Connexioncable.jpg|200px|thumb|left|Connexion cables (Arduino)]]
[[File:Captcha 01.png|thumb|left|Captcha:<br>one action needed
[[File:Usb cable1.jpg|200px|thumb|center|USB Cable]]
moving forward, backward or standing
[[File:Arduino1.png|200px|thumb|left|Arduino]]
still; next stage unlocks after the action is done
[[File:HC-SR04 Ultrasonic Sensor.jpg|200px|thumb|center|HC-SR04 Ultrasonic Sensor]]
and a short animation]]
[[File:Plywood.jpg|200px|thumb|left|Plywood x 3]]
[[File:Self Track 01.png|thumb|center|Self Tracking:<br>no interaction needed
[[File:Handle.jpg|200px|thumb|center|Handle]]
visitor must stand still until
[[File:ESP8266.jpg|200px|thumb|left|ESP8266]]
one of the goals is completed]]
[[File:Rail.jpg|200px|thumb|center|Rail]]
[[File:Self Track 02.png|thumb|left|Self Tracking:<br>no interaction needed
visitor must stand still until
one of the goals is completed]]
[[File:Slot Machine 01.png|thumb|center|Slot Machine:<br>no interactions needed
transition between 2 stages
determines randomly the next stage
displayed when nobody detected]]
[[File:Social Live 01.png|thumb|left|Social Live:<br>no interaction needed
visitor must stand still until
money goal is completed]]
[[File:Stock Ticket 01.png|thumb|center|Stock Ticket:<br>no interactions needed
displayed when nobody detected]]
<br><br><br><br><br><br><br>
<br><br><br><br><br><br><br>


===<p style="font-family:helvetica">Stages Design with P5.js</p> ===
[[File:AllStages HomoData.png|400px|thumb|left]]
[[File:Homo Data 02.gif|400px|thumb|left|6 levels in a row then randomized, more to come]]
[[File:Consolelog 01.gif|400px|thumb|center]]
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>


===<p style="font-family:helvetica"> About the ultrasonic Sensor (HC-SR04)</p> ===
===<p style="font-family:helvetica">Grad Show: Worm</p> ===
 
[[File:CountonMephoto 01 RD.png|left|thumb|702x702px|Count on Me - Worm - 01]]
====<p style="font-family:helvetica">Characteristics</p> ====
[[File:CountonMephoto 03 RD.png|thumb|702x702px|Count on Me - Worm - 02]]
 
[[File:CountonMephoto 04 RD.png|left|thumb|702x702px|Count on Me - Worm - 03]]
Here are a few of the technical characteristics of the HC-SR04 ultrasonic sensor: <br>
[[File:CountonMephoto 05 RD.png|thumb|702x702px|Count on Me - Worm - 04]]
[[File:CountonMephoto 06 RD.png|left|thumb|702x702px|Count on Me - Worm - 05]]
[[File:CountonMephoto 07 RD.png|thumb|702x702px|Count on Me - Worm - 06]]
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br>


*    Power supply: 5v.
==<p style="font-family:helvetica"> Prototyping Resources</p> ==
*    Consumption in use: 15 mA.
*    Distance range: 2 cm to 5 m.
*    Resolution or accuracy: 3 mm.
*    Measuring angle: < 15°.


[https://elec13.wordpress.com/2016/10/15/realisez-un-telemetre-avec-le-hc-sr04-et-une-carte-arduino/ Ref]
===Do It Yourself Resources (from Dennis de Bel)===
More info about the sensor [https://www.robot-maker.com/shop/img/cms/datasheet-capteur-ultrasons-hc-sr04.pdf here] and [https://www.aranacorp.com/fr/mesure-de-distance-avec-un-capteur-hc-sr04/ here]
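The sensor reports an echo round-trip time in microseconds; since sound travels roughly 29.1 µs per centimeter, halving the duration and dividing by 29.1 gives the distance. This is the same formula used in the Arduino telemeter sketches on this page.

```javascript
// Convert an HC-SR04 echo duration (µs, round trip) to centimeters:
// distance = (duration / 2) / 29.1
function durationToCm(durationMicros) {
  return durationMicros / 2 / 29.1;
}

console.log(Math.round(durationToCm(5820))); // 100 cm
```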


====<p style="font-family:helvetica"> Where to buy the ultrasonic Sensor (HC-SR04)</p> ====
* [https://www.instructables.com/DIY-Sensors/ Instructables] is a huge source of (written) tutorials on all kinds of topics. Keep in mind it's more quantity than quality. Interesting for you might be 'diy sensors'
* [http://loliel.narod.ru/DIY.pdf Handmade Electronic Music]: Great resource for cheap DIY electronics projects focusing on sound/music (pdf findable online)
* [https://www.n5dux.com/ham/files/pdf/Make%20-%20Electronics.pdf Make: Electronics:] Amazing, complete guide to everything 'electronics' (Warning, HUGE pdf)
* [https://www.thingiverse.com/ Thingiverse:] The place to find 3d printable mechanics, enclosures, parts etc.


*  1piece = 9,57 € - https://fr.shopping.rakuten.com/offer/buy/7112482554/module-de-mesure-a-ultrasons-hc-sr04-capteur-de-mesure-de-distance-5v-pour.html?t=7036&bbaid=8830210388
===Electronic Shops (physical)===
*  20 pieces = 34,22 € - https://fr.shopping.rakuten.com/offer/buy/7112482554/module-de-mesure-a-ultrasons-hc-sr04-capteur-de-mesure-de-distance-5v-pour.html?t=7036&bbaid=8830210388


=== Prototype 1 : Arduino + Resistor ===
* [https://www.radiotwenthe.nl/ Radio Twenthe] (Den Haag)
* [https://radiopiet.nl/ Radio Piet] (Arnhem)
LIST OF SHOPS (also more physical NL ones)
* https://www.circuitsonline.net/shops


During a workshop, we started with a very basic clone Arduino kit, an LED, a motor and a sensor.
===Electronic Webshops (NL)===
After making a few connections, we began to understand a bit of how it works.


* https://www.tinytronics.nl/ (semi physical, pickup only in Eindhoven)
* http://www.newtone-online.nl/catalog/
* https://www.brigatti.nl/


===Electronic Webshops (Rest)===


    #include <Servo.h>
* [https://www.conrad.nl/ Conrad] (Germany, expensive)
    Servo myservo;  // create servo object to control a servo
* [https://www.tme.eu/en/ TME] (Poland, cheap, ridiculously difficult website)
    int pos = 0;    // variable to store the servo position
* [https://www.segor.de/#/index Segor] (Germany)
    int ldr = 0;    // variable to store light intensity
* [http://mouser.de/ Mouser] (Germany)
 
* [https://www.reichelt.de/ Reichelt] (Germany)
    void setup() {
* [https://www.farnell.com/ Farnell] (Germany/UK)
    Serial.begin(9600); // begin serial communication, NOTE:set the same baudrate in the serial monitor/plotter
* [https://www.digikey.com/ Digi-key] (USA, fast but expensive shipping + tax)
    myservo.attach(D7); // attaches the servo object to pin D7
* [https://www.taydaelectronics.com/ Tayda] (Thailand/USA, 2-3 weeks shipping)
    }
* [https://nl.aliexpress.com/ Aliexpress] (China, 2-3 weeks shipping)


    void loop() {
===PCB making EU (Expensive)===
    //lets put the LDR value in a variable we can reuse
    ldr = analogRead(A0);
   
    //the value of the LDR is between 400-900 at the moment
    //the servo can only go from 0-180
    //so we need to translate 400-900 to 0-180
    //also the LDR value might change depending on the light of day
    //so we need to 'contrain' the value to a certain range
    ldr = constrain(ldr, 400, 900);


    //now we can translate
* [https://www.eurocircuits.com/ Eurocircuits] (Germany)
    ldr = map(ldr, 400, 900, 0, 180);
* [https://aisler.net/ Aisler] (Germany, 1 week from design upload to in your hands, very high quality)
* [https://www.leiton.de/ Leiton] (Germany)


    //lets print the LDR value to serial monitor to see if we did a good job
===PCB making China (Cheap but import tax)===
    Serial.println(ldr); // read voltage on analog pin 0, print the value to serial monitor


    //now we can move the servo according to the light/our hand!
* [https://jlcpcb.com/ JLCPCB] (1 week from design upload to in your hands, low quality solder mask)
    myservo.write(ldr);      // tell servo to go to the position stored in 'ldr'
* [https://www.pcbway.com/ PCBWAY] (1 week from design upload to in your hands)
    delay(15);   
* [https://www.allpcb.com/ ALLPCB] (1 week from design upload to in your hands)
    }


===Arduino and Sensors===


[[File:Servo-breadboard.jpg|300px|thumb|left|How to make a motor work<br>credits: Dennis de Bel]]
* https://www.floris.cc/shop/en/19-starter-kits
[[File:Ldrbreadboard.jpg|300px|thumb|center|How to make a sensor work<br>Credits: Dennis de Bel]]
* https://www.tinytronics.nl/shop/nl/arduino/kits/arduino-starter-kit
[[File:Combined-breadboard2.jpg|300px|thumb|left|How to make the sensor and motor work together<br>Credits: Dennis de Bel]]
* https://www.conrad.nl/p/makerfactory-beginnersset-voor-arduino-1612782
[[File:Sensortest.gif|300px|thumb|center|Sensortest during workshop]]


=== Split Screen Arduino + Sensor + Serial Plotter + Responsive Space ===
===Sensor only Kit===


Trying here to show the simultaneous responses between the sensor, the values and the simulation.
* [https://nl.aliexpress.com/item/4000238240904.html?spm=a2g0o.search0304.0.0.7bca74dbu2i6KH&algo_pvid=59637d40-1368-41ba-bc30-63b1517303e9&aem_p4p_detail=2021092303503115630656510062050000317987&algo_exp_id=59637d40-1368-41ba-bc30-63b1517303e9-0 45-in-1] (aliexpress) Example sensor you will find in such a kit documented here


[[File:Splitscreen.gif|800px|thumb|left|Splitscreen Arduino + Sensor + Serial Plotter + Responsive Space]]
===Arduino Starter Projects===
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


===<p style="font-family:helvetica">Prototype 2: Arduino + Ultrasonic sensor</p> ===
* https://www.makerspaces.com/simple-arduino-projects-beginners/
 
or slightly more complex:
For this very simple first sketch, and for later, I will include the NewPing library, which greatly improves the ultrasonic sensor's capabilities.
* http://www.makeuseof.com/tag/10-great-arduino-projects-for-beginners/
or in videos:
*[https://www.youtube.com/playlist?list=PLT6rF_I5kknPf2qlVFlvH47qHvqvzkkn%20d youtube playlist]
or just many different ideas:
*http://playground.arduino.cc/Projects/Ideas
or - of course - on Instructables if you want to have a complete course:
*https://www.instructables.com/class/Arduino-Class/
or this course:
*https://arduino.tkkrlab.space/
ARDUINO + PROCESSING (visualizing sensors)
*https://create.arduino.cc/projecthub/projects/tags/processing
MISCELANIOUS KEYWORDS and LINKS
*citizen science, for example: https://www.meetjestad.net/
*https://forensic-architecture.org/
 
====<p style="font-family:helvetica"> About the ESP8266 module</p> ====
 
The ESP8266 is a microcontroller IC with Wi-Fi connectivity; it will allow us to connect the Arduino to the Internet so that the values obtained from sensors can be received directly on a self-hosted webpage. From this same web page, it would also be possible to control LEDs, motors, LCD screens, etc.
 
====<p style="font-family:helvetica"> Resources about the ESP8266 module</p>  ====


[[File:ArduinoUno Sensor Sketch1.jpg|400px|thumb|left|Sketch 1: Arduino Uno + Sensor]]
Kindly forwarded by Louisa:<br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
  #include <NewPing.h>
  int echoPin = 10;
  int trigPin = 9;
 
  NewPing MySensor(trigPin, echoPin); // create a NewPing object for the sensor
 
  void setup() {
    // put your setup code here, to run once:
    Serial.begin(9600);
  }
 
  void loop() {
    // put your main code here, to run repeatedly:
  int duration = MySensor.ping_median();
  int distance = MySensor.convert_cm(duration); // convert the echo time to centimeters
 
  Serial.print(distance);
  Serial.println("cm");
  delay(250);
  }


===<p style="font-family:helvetica">Prototype 3: Arduino Uno + Sensor + LCD (+ LED)</p> ===
* https://www.youtube.com/watch?v=6hpIjx8d15s
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
* https://www.youtube.com/watch?v=dWM4p_KaTHY
* https://randomnerdtutorials.com/esp8266-web-server/
* https://electronoobs.com/eng_arduino_tut101.php
* http://surveillancearcade.000webhostapp.com/index.php (interface)


All together from https://www.youtube.com/watch?v=GOwB57UilhQ
====<p style="font-family:helvetica">Which ESP8266 to buy</p> ====


[[File:Sketch2ArduinoUnoSensorLCD.jpg|400px|thumb|left|Sketch 2: Arduino Uno + Sensor + LCD]]
* https://makeradvisor.com/tools/esp8266-esp-12e-nodemcu-wi-fi-development-board/
[[File:Sketch3ArduinoUnoSensorLCDLED.jpg|400px|thumb|right|Sketch 3: Arduino Uno + Sensor + LCD + LED]]
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
* https://www.amazon.nl/-/en/dp/B06Y1ZPNMS/ref=sr_1_5?crid=3U8B6L2J834X0&dchild=1&keywords=SP8266%2BNodeMCU%2BCP2102%2BESP&qid=1635089256&refresh=1&sprefix=sp8266%2Bnodemcu%2Bcp2102%2Besp%2Caps%2C115&sr=8-5&th=1
  #include <LiquidCrystal.h>
 
 
==<p style="font-family:helvetica">Installation</p>==
  LiquidCrystal lcd(10,9,5,4,3,2);
 
 
===<p style="font-family:helvetica">Resources</p> ===
  const int trigPin = 11;
 
  const int echoPin = 12;
* Movable walls build out for Art Museum of West Virginia University [https://www.youtube.com/watch?v=E3SrGzAsRBE link]
 
* Gallery Wall System (GWS) [https://www.youtube.com/watch?v=uJ-yFkJ9ykA link]
  long duration;
* CASE-REAL installs movable walls inside a basement art gallery in tokyo [https://www.youtube.com/watch?v=kHTgAeN4XDg link]
  int distance;
 
 
=<p style="font-family:helvetica">Venues</p> =
  void setup() {
 
    // put your setup code here, to run once:
==<p style="font-family:helvetica">Venue 1: Aquarium </p>==
      analogWrite(6,100);
 
      lcd.begin(16,2);
===<p style="font-family:helvetica">Description</p>===
      pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
<br>
  pinMode(echoPin, INPUT); // Sets the echoPin as an Input
<b>AQUARIUM 1.0</b><br>
  Serial.begin(9600); // Starts the serial communication
<br><br>
 
A Small Ecosystem for Living Thoughts<br>
     
<br>
  }
Monday, 11th October<br>
 
19:30 – 21:30<br>
  void loop() {
Leeszaal Rotterdam West<br>
  long duration, distance;
Rijnhoutplein 3, 3014 TZ Rotterdam<br>
    digitalWrite(trigPin,HIGH);
<br>
    delayMicroseconds(1000);
with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni,  
    digitalWrite(trigPin, LOW);
Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann
    duration=pulseIn(echoPin, HIGH);
<br><br>
    distance =(duration/2)/29.1;
It’s oh-fish-ial! Students of the Experimental Publishing Master invite you to dive into their small ecosystem of living thoughts. Join us for an evening of conversation, discussion and new view points. If you look closely, you might even see some early thesis ideas hatching. Let's leave no rock unturned.
    Serial.print(distance);
 
    Serial.println("CM");
==<p style="font-family:helvetica">Observation questionnaire</p>==
    delay(10);
 
  // Prints the distance on the Serial Monitor
This exercise is a very small, humble and almost 100% analog exercise questioning representation in two small steps.
  Serial.print("Distance: ");
 
  Serial.println(distance);
===<p style="font-family:helvetica">1st step</p>===
 
 
      lcd.clear();
[[File:Brick.jpg|300px|thumb|left|photo of a brick]]
      lcd.setCursor(0,0);
<br><br><br><br><br><br><br><br><br><br><br>
      lcd.print("Distance = ");
* <b>1st step:</b> I hand a sheet of paper to people during the event and ask them to answer a series of questions concerning the object (a brick) displayed in the middle of the room on a podium. They are told that they can observe this brick from anywhere and in any position. Here are the questions:
      lcd.setCursor(11,0);
<br>
      lcd.print(distance);
      lcd.setCursor(14,0);
      lcd.print("CM");
     
      delay(500);
     
  }


From this sketch, I started to consider that the distance value could be sent directly to a computer and used to render a Web page depending on its value.
* <b>Please write down your first name:
<br>Note: it looks like this sensor's max range is 119 cm, which is almost 4 times less than the 4-meter max range stated in the component description.
<br>


===<p style="font-family:helvetica">Prototype 4: Arduino Uno + Sensor + LCD + 2 LED = Physical vs Digital Range detector </p> ===
* Describe your position (sitting/standing/other):
<br>


Using in-between values to activate the green LED<br>
* Describe your location in the room:
Once again, putting together the simulation and the device in use.
<br>


[[File:SensorSpace.gif|400px|thumb|left|Sensor Test VS Elastic Space]]
* Describe what you are seeing while looking at the screen:
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<br>
* Describe how you feel mentally/emotionally:</b>
<br>
<br>
===<p style="font-family:helvetica">2nd step</p>===
 
[[File:Brickphoto.png|300px|thumb|left|photo of brick displayed inside a computer screen]]
<br><br><br><br><br><br><br><br><br><br><br>
 
* <b>2nd step:</b> I collect the answers, wait a round, and then hand a new sheet of paper to the same people with the exact same questions, this time concerning the representation of the object (brick) displayed in the middle of the room on a computer screen on the same podium.
 
===<p style="font-family:helvetica">Answer Samples</p>===
 
1.0 <b>Object on a podium</b>


  #include <LiquidCrystal.h>
* 1.1 Sitting on corner stairs  —> Want to see it from different angles —> Feeling trapped, frustrated
  #include <LcdBarGraph.h>  
* 1.2 Sitting on stairs —> a rock looking dead —> Feeling sad
  #include <NewPing.h>  
* 1.3 Sitting on the left, close to the columns —> rational observation —> Nostalgic memories from having participated in the creation of the object as it looks right now
 
* 1.4 Sitting in front of the object —> Calm and slightly confused
    LiquidCrystal lcd(10,9,5,4,3,2);
* 1.5 Sitting on the floor next to stairs in between the side and the middle —> Looking at the object from the bottom —> Feeling a bit confused and inspired
 
<br><br>
  const int LED1 = 13;
2.0 <b>Photo of the object displayed on a computer screen placed on a podium</b>
  const int LED2 = 8; 
 
  const int trigPin = 11;
* 2.1 Sitting on a chair, seeing the brick from a bird's-eye perspective —> Feeling more in control of the situation
  const int echoPin = 12;
* 2.2 Sitting very close to the brick —> Seeing a flat and almost abstract picture —> Feeling drawn to the picture, aesthetically pleasing, feeling less sad about the object
 
* 2.3 Sitting under a table very far away —> Looking abstract but identifiable —> Excited about the unusual and childish observation position
  long duration; //travel time
* 2.4 Sitting on stairs —> and seeing the brick in 2D —> Feeling fine
  int distance;
* 2.5 Sitting on the stairs —> Seeing a side of the screen with a top-view photo of the object —> Feeling comfortable
  int screensize;
<br><br>
 
[[File:Answers1 RepresentationQuestionnaire.png|300px|thumb|left|Answers1_RepresentationQuestionnaire]]
  void setup() {
[[File:Answers2 RepresentationQuestionnaire.png|300px|thumb|center|Answers2_RepresentationQuestionnaire]]
    // put your setup code here, to run once:
[[File:Answers3 RepresentationQuestionnaire.png|300px|thumb|left|Answers3_RepresentationQuestionnaire]]
      analogWrite(6,100);
[[File:Answers4 RepresentationQuestionnaire.png|300px|thumb|center|Answers4_RepresentationQuestionnaire]]
      lcd.begin(16,2);
[[File:Answers5 RepresentationQuestionnaire.png|300px|thumb|left|Answers5_RepresentationQuestionnaire]]
      pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
 
      pinMode(echoPin, INPUT); // Sets the echoPin as an Input
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
      Serial.begin(9600); // Starts the serial communication
 
 
=<p style="font-family:helvetica">Venues</p> =
      pinMode(LED1, OUTPUT);
 
      pinMode(LED2, OUTPUT);
==<p style="font-family:helvetica">Venue 2: Aquarium 2.0 </p>==
  }
 
 
===<p style="font-family:helvetica">Description</p>===
  void loop() {
<br>
  long duration, distance;
Date 29th Nov — 4th Dec 2021 <br>
    digitalWrite(trigPin,HIGH);
Time 15h — 18h <br>
    delayMicroseconds(1000);
29th Nov — 4th Dec 2021 (all day)<br>
    digitalWrite(trigPin, LOW);
Location: De Buitenboel, Rosier Faassenstraat 22 3025 GN Rotterdam, NL<br>
    duration=pulseIn(echoPin, HIGH);
<br><br>
    distance =(duration/2)/29.1; //convert to centimers
AQUARIUM 2.0 <br>
    screensize = distance*85;
<br>
    Serial.print(distance);
An ongoing window exhibition with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann<br>
    Serial.println("CM");
<br>
    Serial.print(screensize);
Tap upon the glass and peer into the research projects we are currently working on.
    delay(10);
From Monday 29th of November until Saturday 4th of December we put ourselves on display in the window of De Buitenboel as an entry point into our think tank. Navigating between a range of technologies, such as wireless radio waves, virtual realities, sensors, ecological and diffractive forms of publishing, web design frameworks, language games, and an ultra-territorial residency; we invite you to gaze inside the tank and float with us. Welcome back to the ecosystem of living thoughts.<br>
 
 
    if ((distance >= 15) && (distance<=20))
==<p style="font-family:helvetica">Aquarium LCD Portal (29 Nov – 4th Dec)</p>==
    {
      digitalWrite(LED2, HIGH);
      digitalWrite(LED1, LOW);
    }
    else
    {
      digitalWrite(LED1, HIGH);
      digitalWrite(LED2, LOW);   
    }
 
  // Prints the distance on the Serial Monitor
  Serial.print("Distance: ");
  Serial.println(distance);
 
      lcd.clear();
      lcd.setCursor(0,0);
      lcd.print("ROOM");
      lcd.setCursor(6,0);
      lcd.print(distance);
      lcd.setCursor(9,0);
      lcd.print("cm");   
      lcd.setCursor(0,2);
      lcd.print("SCR");
      lcd.setCursor(6,2);
      lcd.print(screensize);
      lcd.setCursor(9,2);
      lcd.print("x1080px");
         
      delay(500);
     
  }


This interactive micro-installation, composed of an LCD screen and ultrasonic sensor(s), invites users/visitors to change the color of the screen and the displayed messages by moving closer to or further from the window.
[https://www.pzwart.nl/blog/2021/11/30/aquarium-2-0/ Link]


I brought a second Arduino, 2 long breadboards, black cables and another LCD screen, and remade the setup in this format.
[[File:ScreenPortalFront.jpg|300px|thumb|left|ScreenPortalFront]]
For some reason the new LCD screen does not fit into the breadboard, and I need more male-to-female cables in order to connect it correctly.
[[File:ScreenPortalback.jpg|300px|thumb|right|ScreenPortalback]]
With this longer breadboard, I want to extend the range-value system and make it visible with LEDs and sounds.
[[File:LCDScreenTest.gif|600px|thumb|center|LCDScreenTest]]


[[File:Arduino Setup V3.jpg|400px|thumb|left|Upgrade]]
<br><br><br><br><br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


=<p style="font-family:helvetica">Readings (new)(english)(with notes in english) </p>=

==<p style="font-family:helvetica">About Institutional Critique</p>==

===<p style="font-family:helvetica">To read</p>===
→&nbsp;1.&nbsp;[[Art and Contemporary Critical Practice: Reinventing Institutional Critique]] [https://aaaaarg.fail/thing/51c584186c3a0ed90ba30900 Doc]<br>
→&nbsp;2.&nbsp;[[From the Critique of Institutions to an Institution of Critique - Andrea Fraser]] [https://aaaaarg.fail/upload/andrea-fraser-from-the-critique-of-institutions-to-an-institution-of-critique-1.pdf Doc]<br>
→&nbsp;3.&nbsp;[[Institutional critique, an anthology of artists writings - Alexander Alberro]] [https://aaaaarg.fail/upload/alexander-alberro-institutional-critique-an-anthology-of-artists-writings.pdf Doc]

==<p style="font-family:helvetica">About Techno-Solutionism</p>==

===<p style="font-family:helvetica">To read</p>===
→&nbsp;1.&nbsp;The Folly of Technological Solutionism: An Interview with Evgeny Morozov - Natasha Dow Schüll

==<p style="font-family:helvetica">About Meta</p>==

===<p style="font-family:helvetica">To read</p>===
→&nbsp;1.&nbsp;[https://www.tandfonline.com/doi/full/10.3402/jac.v6.23009 The meta as an aesthetic category] Bruno Trentini (2014)<br>
→&nbsp;2.&nbsp;[[File:RMZ ARTIST WRITING(2).pdf|thumb|left|Rosa Maria Zangenberg]] The eye tells the story by Rosa Maria Zangenberg (2017)<br>
→&nbsp;3.&nbsp;[https://brill.com/view/title/2075 Leonardo Da Vinci - Paragone by Claire Farago]

==<p style="font-family:helvetica">About exhibition space</p>==

===<p style="font-family:helvetica">To read</p>===
→&nbsp;1.&nbsp;Kluitenberg, Eric, ed. Book of Imaginary Media: Excavating the Dream of the Ultimate Communication Medium. Rotterdam: NAi Publishers, 2006.<br>
→&nbsp;2.&nbsp;[http://nt2.uqam.ca/fr/biblio/wall-and-canvas-lissitzkys-spatial-experiments-and-white-cube The wall and the canvas: Lissitzky’s spatial experiments and the White Cube]<br>
→&nbsp;3.&nbsp;[https://eastofborneo.org/articles/decorative-arts-billy-al-bengston-and-frank-gehry-discuss-their-1968-collaboration-at-lacma/ Decorative Arts: Billy Al Bengston and Frank Gehry discuss their 1968 collaboration at LACMA by Aram Moshayedi]<br>
→&nbsp;4.&nbsp;[[File:Resonance and Wonder STEPHEN GREENBLATT.pdf|thumb]] Resonance and Wonder - Stephen Greenblatt<br>
→&nbsp;5.&nbsp;A Canon of Exhibitions - Bruce Altshuler [[File:A Canon of Exhibitions - Bruce Altshuler.pdf|thumb]]<br>
→&nbsp;6.&nbsp;Documenta - [[File:A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar.pdf|thumb]] A Brief History of an Exhibition and its Contexts - Klaus Siebenhaar<br>
→&nbsp;7.&nbsp;Pallasmaa - The Eyes of the Skin [[File:Pallasmaa - The Eyes of the Skin.pdf|thumb]]<br>
→&nbsp;8.&nbsp;Venturi - Learning from Las Vegas [[File:Venturi - Learning from Las Vegas.pdf|thumb]]<br>
→&nbsp;9.&nbsp;Preserving and Exhibiting Media Art: Challenges and Perspectives - Julia Noordegraaf, Cosetta G. Saba, Barbara Le Maître, Vinzenz Hediger. Amsterdam University Press, Framing Film series, 2013.

===<p style="font-family:helvetica">Reading/Notes</p>===
→&nbsp;1.&nbsp;[[After the White Cube]] [http://nt2.uqam.ca/fr/biblio/after-white-cube doc] [https://www.lrb.co.uk/the-paper/v37/n06/hal-foster/after-the-white-cube ref] (2015) NOTES INSIDE<br>
* How and why the White Cube rose and became democratized
* White Cube // Consumerism = Art consumerism?
* Exhibition Space > Artworks
* Experience of interpretation = Entertainment of art?
* Museum vs Mausoleum
<br>
→&nbsp;2.&nbsp;[[Spaces of Experience: Art Gallery Interiors from 1800 – 2000]] [http://nt2.uqam.ca/fr/biblio/spaces-experience-art-gallery-interiors-1800-2000 ref] NOTES INSIDE<br>
* Art vs 1950s consumerism / Choreography of desire?
* Check theorists Hermann von Helmholtz and Wilhelm Wundt
<br>
→&nbsp;3.&nbsp;[https://www.e-flux.com/announcements/262138/colour-critique/ Colour Critique: A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space] NOTES INSIDE<br>
May 24, 2019 - Noise! Frans Hals, Otherwise, Frans Hals Museum<br>
→&nbsp;4.&nbsp;[[Colour_Critique_A_Symposium_on_Colour_as_an_Agent_for_Taste,_Experience_and_Value_in_the_Exhibition_Space|Noise! Frans Hals, Otherwise]] NOTES INSIDE<br>
* Role of colours in the viewer's experience of an exhibition
* Institutional Critique
* Institutionalised space / White Cube
<br>
→&nbsp;5.&nbsp;[[Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze]] (course for ArtScience 2007/8) [http://www.joostrekveld.net/wp/?page_id=590 doc] NOTES INSIDE<br>
* About perspective
* About space-time
* About cyberspace
<br>
→&nbsp;6.&nbsp;[[THE  DEVIL  IS IN THE  DETAILS: MUSEUM - Displays  and the Creation of Knowledge]] [https://pzwiki.wdka.nl/mw-mediadesign/images/5/57/The_Devil_is_in_the_details-_DETAIT-_MUSEUM_Displays_and_the_Creation_of_Knowledge.pdf Doc] NOTES INSIDE<br>
Stephanie Moser, Southampton University (Museum Anthropology), 2010<br>
* Architecture (neoclassical buildings)
* Big vs small exhibition space
* Lined-up objects vs non-systematic display
* Architecture/Design
* Gallery interiors (ceiling/interior design elements/furniture)
* Colors
* Individual lighting of objects vs global lighting
* Dark vs bright lighting
* Chronological vs thematic arrangement
* Academic vs journalistic writing
* Busy layout vs minimal layout
* Exhibition seen vs other exhibitions
* Themed/idea-oriented vs object-led exhibitions
* Didactic vs discovery exhibitions
* Contextual, immersive, or atmospheric exhibitions
* Audience vs reception
<br>
→&nbsp;7.&nbsp;[[Fantasies of the Library - Etienne Turpin (ed.), Anne-Sophie Springer (ed.)]] [https://hub.xpub.nl/bootleglibrary/category/new/326 Ref]; publisher: The MIT Press; publication date: 1 Sept. 2018
* How the physical organization of a bookshelf can influence its digital version
* The book as a miniature gallery/exhibition space
* The library as a public place of reading
* Library vs exhibition space = use vs display
* Book-themed exhibitions

===<p style="font-family:helvetica">How to get more digital pins [not working]</p> ===

* How to use analog pins as digital pins https://www.youtube.com/watch?v=_AAbGLBWk5s
* Up to 60 more pins with an Arduino Mega https://www.tinytronics.nl/shop/en/development-boards/microcontroller-boards/arduino-compatible/mega-2560-r3-with-usb-cable

I tried 4 different tutorials but still didn't find a way to make it work, which is very weird, so I will just give up and get an Arduino Mega =*(

[[File:ArduinoExtraDigitalPins.jpg|300px|thumb|left|ArduinoExtraDigitalPins]]

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Prototype 5:  Arduino Uno + 3 Sensors + 3 LEDs </p> ===

With a larger breadboard, I connected 3 sensors together. The next step will be to define different ranges of in-between values for each sensor in order to make a grid.
To accomplish this grid I will add a second row of sensors like this one, in order to get x and y values in space.

===<p style="font-family:helvetica">Prototype 6:  Arduino Uno + 3 Sensors + 12 LEDs</p> ===

With 3 sensors added on 2 long breadboards, and with a different set of range values, we can start mapping a space.

[[File:SensorMediaQueries 01.gif|300px|thumb|left|SensorMediaQueries]]
[[File:Physical Space Mapping.png|500px|thumb|center|Physical Space Mapping]]

<br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Prototype 7:  Arduino Uno + 12 LEDs + 3 Sensors + Buzzer + Potentiometer + LCD</p> ===

For this prototype, I implemented a buzzer that emits a specific sound depending on the distance of the obstacle detected by the sensor.
I also put back an LCD displaying the 3 sensor values. The screen luminosity can be changed via a potentiometer.

Resources:
* https://samsneatprojectblogcode.blogspot.com/2016/06/piezo-buzzer-code-and-fritzing.html
* https://www.youtube.com/watch?v=m7bbfzZ2UNo
* https://www.youtube.com/watch?v=K8AnlUT0ng0

[[File:ArduinoMegaSensorBuzzerLCD.jpg|300px|thumb|left|ArduinoMegaSensorBuzzerLCD]]

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Prototype 8:  Arduino Uno + 12 LEDs + 3 Sensors on mini breadboards + Buzzer + Potentiometer + LCD</p> ===

Same code, but a new setup detaching the sensors from each other and allowing them to be placed anywhere.

[[File:ArduinoMegaSensorBuzzerLCDMinibreadboard.jpg|300px|thumb|left]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Prototype 9:  Arduino Uno + 21 LEDs + 7 Sensors + Buzzer + Potentiometer + LCD</p> ===

[[File:Sensor Wall 01.png|300px|thumb|left|Sensor Wall 01]]
[[File:PhysicalMapping3.png|500px|thumb|right|PhysicalMapping2]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Sketch 10: Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js </p> ===


[[File:P5.js sensor.gif|300px|thumb|left|P5.js and ultrasonic sensor]]

The goal here was to create a first communication between the physical setup and a P5.js web page.

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Sketch 11:  Arduino Mega + Ultrasonic Sensor + LCD Touchscreen </p> ===

[[File:LCDArduinoVariableposter.gif|300px|thumb|left|LCD Arduino Variable poster]]

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

====<p style="font-family:helvetica"> About the ESP8266 module</p> ====

The ESP8266 is a microcontroller IC with built-in Wi-Fi. It will allow us to connect the Arduino to the internet, so that the values obtained from the sensors can be received directly on a self-hosted web page. From this same web page, it would also be possible to control LEDs, motors, LCD screens, etc.

====<p style="font-family:helvetica"> Resources about the ESP8266 module</p>  ====

Kindly forwarded by Louisa:<br>

* https://www.youtube.com/watch?v=6hpIjx8d15s
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
* https://www.youtube.com/watch?v=dWM4p_KaTHY
* https://randomnerdtutorials.com/esp8266-web-server/
* https://electronoobs.com/eng_arduino_tut101.php
* http://surveillancearcade.000webhostapp.com/index.php (interface)

====<p style="font-family:helvetica">Which ESP8266 to buy</p> ====

* https://makeradvisor.com/tools/esp8266-esp-12e-nodemcu-wi-fi-development-board/
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
* https://www.amazon.nl/-/en/dp/B06Y1ZPNMS/ref=sr_1_5?crid=3U8B6L2J834X0&dchild=1&keywords=SP8266%2BNodeMCU%2BCP2102%2BESP&qid=1635089256&refresh=1&sprefix=sp8266%2Bnodemcu%2Bcp2102%2Besp%2Caps%2C115&sr=8-5&th=1

==<p style="font-family:helvetica">About User vs Visitor, or the user in exhibition space</p>==

[[Designing the user experience in exhibition spaces - Elisa Rubegni, Maurizio Caporali, Antonio Rizzo, Erik Grönvall]]
* What are the GUI intentions
* What is the WIMP interaction model
* What are the post-WIMP models
* About the Memex

==<p style="font-family:helvetica">About User Interface</p>==

=== Readings/Notes===

→&nbsp;1.&nbsp;[[Alexander R. Galloway - The Interface Effect]] [https://hub.xpub.nl/bootleglibrary/search?query=+The+Interface+Effect bootleg] 1st ed. Malden, USA: Polity Press.<br>
* The interface paradox
* The less they do, the more they achieve and the more they become invisible and unconsidered
* The interface as a "significant surface"
* The interface as a gateway
* The interface as "the place where information moves from one entity to another"
* The interface as the media itself
* The interface as "agitation or generative friction between different formats"
* The interface as "an area" that "separates and mixes the two worlds that meet together there"
<br>
→&nbsp;2.&nbsp;[[Nick Srnicek - Navigating Neoliberalism: Political Aesthetics in an Age of Crisis]] [https://hub.xpub.nl/bootleglibrary/search?query=Navigating+Neoliberalism bootleg] NOTES INSIDE<br>
publisher: medium.com; publication date: 20 Oct. 2016<br>
* From an aesthetics of the sublime to an aesthetics of the interface
* Cognitive mapping
<br>
→&nbsp;3.&nbsp;[[Program Or Be Programmed - Ten Commands For A Digital Age  Douglas Rushkoff]] [https://hub.xpub.nl/bootleglibrary/search?query=+Program+or+be+programmed bootleg] NOTES INSIDE<br>
Rushkoff, D., 2010. Program or Be Programmed: Ten Commands for a Digital Age. 1st ed. Minneapolis, USA: OR Books.<br>
* "Instead of learning about our technology, we opt for a world in which our technology learns about us."
* Programmed by the interfaces
* From a transparent to an opaque medium
<br>
→&nbsp;4.&nbsp;[[The Best Interface Is No Interface - Golden Krishna]] [https://hub.xpub.nl/bootleglibrary/search?query=The+Best+Interface+Is+No+Interface bootleg] NOTES INSIDE<br>
Krishna, G., 2015. The Best Interface Is No Interface: The Simple Path to Brilliant Technology (Voices That Matter). 1st ed. New Riders Publishing.
* "Screen-obsessed approach to design"
* UI vs UX
<br>
→&nbsp;5.&nbsp;[[Plasticity of User Interfaces:A Revised Reference Framework]] NOTES INSIDE<br>
Gaëlle Calvary, Joëlle Coutaz, David Thevenin, Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt<br>
* About the term 'Plasticity'
<br>
→&nbsp;6.&nbsp;[[Interface Critique- Beyond UX - FLORIAN HADLER, ALICE SOINÉ; DANIEL IRRGANG]] [https://hub.xpub.nl/bootleglibrary/search?query=interface DOC]<br>
* The interface as a "historical artifact", a "space of power"
* The interface as a human-machine boundary
* What is interface critique
* The interface in computer science
* The screen for Lev Manovich


=== More to read/see===

→&nbsp;1.&nbsp;Bickmore, T.W., Schilit, B.N., Digestor: Device-Independent Access to the World Wide Web, in Proc. of 6th Int. World Wide Web Conf. WWW’6 (Santa Clara, April 1997)

→&nbsp;2.&nbsp;Bouillon, L., Vanderdonckt, J., Souchon, N., Recovering Alternative Presentation Models of a Web Page with VAQUITA, Chapter 27, in Proc. of 4th Int. Conf. on Computer-Aided Design of User Interfaces CADUI’2002 (Valenciennes, May 15-17, 2002)

→&nbsp;3.&nbsp;Calvary, G., Coutaz, J., Thevenin, D., Supporting Context Changes for Plastic User Interfaces: a Process and a Mechanism, in “People and Computers XV – Interaction without Frontiers”, Joint Proceedings of AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI’2001 (Lille, 10-14 September 2001)

→&nbsp;4.&nbsp;Cockton, G., Clarke, S., Gray, P., Johnson, C., Literate Development: Weaving Human Context into Design Specifications, in “Critical Issues in User Interface Engineering”, P. Palanque & D. Benyon (eds), Springer-Verlag, London, 1995.

→&nbsp;5.&nbsp;Graham, T.C.N., Watts, L., Calvary, G., Coutaz, J., Dubois, E., Nigay, L., A Dimension Space for the Design of Interactive Systems within their Physical Environments, in Proc. of Conf. on Designing Interactive Systems DIS’2000 (New York, August 17-19, 2000), ACM Press, New York, 2000.

→&nbsp;6.&nbsp;Lopez, J.F., Szekely, P., Web Page Adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’2001 (New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001.

→&nbsp;7.&nbsp;Thevenin, D., Coutaz, J., Plasticity of User Interfaces: Framework and Research Agenda, in Proc. of 7th IFIP International Conference on Human-Computer Interaction Interact’99 (Edinburgh, August 30 - September 3, 1999), Chapman & Hall, London, pp. 110-117.

→&nbsp;8.&nbsp;Thevenin, D., Adaptation en Interaction Homme-Machine: Le cas de la Plasticité [Adaptation in Human-Machine Interaction: The Case of Plasticity], Ph.D. thesis, Université Joseph Fourier, Grenoble, 21 December 2001.

→&nbsp;9.&nbsp;[[Graspable interfaces (Fitzmaurice et al., 1995)]] [https://www.dgp.toronto.edu/~gf/papers/PhD%20-%20Graspable%20UIs/Thesis.gf.html link]

==<p style="font-family:helvetica">About User Condition</p>==

=== Readings ===

→&nbsp;1.&nbsp;[[The User Condition 04: A Mobile First World - Silvio Lorusso]] [https://networkcultures.org/entreprecariat/mobile-first-world/ Doc]
* Most web users are smartphone users
* How "mobile first" affects global web design
* How "mobile first" affects the way we use computers

==<p style="font-family:helvetica">Semester 2</p> ==
<br><br>
[[File:Simu part 02.gif|left|1000px]]

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

===<p style="font-family:helvetica">Sketch 12:  Arduino Uno + P5serialcontrol + P5.js web editor = Code decrypter </p> ===

[[File:Codeword 01.png|300px|thumb|left|P]]
[[File:Codeword 02.png|300px|thumb|center|I]]
[[File:Codeword 03.png|300px|thumb|left|G]]
[[File:Codeword 04.png|300px|thumb|center|E]]
[[File:Codeword 05.png|300px|thumb|left|O]]
[[File:Codeword 06.png|300px|thumb|center|N]]

<br><br>

===<p style="font-family:helvetica">Sketch 13:  Arduino Uno + P5serialcontrol + P5.js web editor = Game </p> ===

[[File:01 Screen .png|300px|thumb|left|Stage 0<br>The subject is located too far away]]
[[File:02 Screen.png|300px|thumb|center|Stage 0<br>The subject is well located and will hold position to reach the next stage]]
[[File:03 Screen.png|300px|thumb|left|Stage 1<br>The subject unlocked Stage 1 and will hold position to reach the next stage]]
[[File:04 Screen.png|300px|thumb|center|Stage 2<br>The subject unlocked Stage 2 and is located too close]]
[[File:06 Screen.png|300px|thumb|left|Stage 3<br>The subject unlocked Stage 3 and needs to get closer]]
[[File:07 Screen.png|300px|thumb|center|Transition Stage<br>The subject unlocked all stages and needs to wait for the countdown before the following steps]]
<br><br>

===<p style="font-family:helvetica">Sketch 14: Arduino Uno + P5serialcontrol + P5.js web editor = Simplified interface</p> ===

[[File:Data Collector Sample 01.gif|400px|thumb|left]]

<br><br>
<br><br>

===<p style="font-family:helvetica">Things to try</p> ===

* Connect multiple Arduinos (if necessary): https://www.youtube.com/watch?v=tU6jIoQ6M_E
* Connect a lamp to the Arduino: https://www.youtube.com/watch?v=F-yk4Tyc44g

==<p style="font-family:helvetica"> Prototyping Resources</p> ==


===Do it Yourself Ressources (from Dennis de Bel)===
=<p style="font-family:helvetica">Readings (old)(mostly french)(with notes in french)</p>=


* [https://www.instructables.com/DIY-Sensors/ Instructables] is a huge source of (written) tutorials on all kinds of topics. Keep in mind it's more quantity than quality. Interesting for you might be 'diy sensors'
===<p style="font-family:helvetica">Books (old)</p>===
* [http://loliel.narod.ru/DIY.pdf Hand Made Electronic (Music)]: Great resource for cheap, diy electronics project focussing on
sound/music (pdf findable online)
* [https://www.n5dux.com/ham/files/pdf/Make%20-%20Electronics.pdf Make: Electronics:] Amazing, complete guide to everything 'electronics' (Warning, HUGE pdf)
* [https://www.thingiverse.com/ Thingiverse:] The place to find 3d printable mechanics, enclosures, parts etc.


===Electronic Shops (physical)===
<br>
 
→&nbsp;1.&nbsp; [[L'art comme expérience — John Dewey]] (french) ⚠️(yet to be filled)⚠️<br>
* [https://www.radiotwenthe.nl/ Radio Twenthe] (Den Haag)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Gallimard (1934)<br>
* [https://radiopiet.nl/ Radio Piet] (Arnhem)
→&nbsp;2.&nbsp; [[L'œuvre d'art à l'époque de sa reproductibilité technique — Walter Benjamin]] (french<br>
LIST OF SHOPS (also more physical NL ones)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Alia (1939)<br>
* https://www.circuitsonline.net/shops
→&nbsp;3.&nbsp; [[La Galaxie Gutemberg — Marshall McLuhan]] (french)<br>
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: University of Toronto Press (1962)<br> 
===Electronic Webshops (NL)===
→&nbsp;3.&nbsp; [[Pour comprendre les médias — Marshall McLuhan]] (french)<br>
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: McGraw-Hill Education (1964)<br> 
* https://www.tinytronics.nl/ (semi physical, pickup only in Eindhoven)
→&nbsp;4.&nbsp; [[Dispositif — Jean-Louis Baudry]] (french) <br>
* http://www.newtone-online.nl/catalog/
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Raymond Bellour, Thierry Kuntzel et Christian Metz (1975)<br> 
* https://www.brigatti.nl/
→&nbsp;5.&nbsp; [[L’Originalité de l’avant-garde et autres mythes modernistes — Rosalind Krauss]] (french) ⚠️(yet to be filled)⚠️<br>
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Macula (1993)<br>
===Electronic Webshops (Rest)===
→&nbsp;6.&nbsp; [[L'art de l'observateur: vision et modernité au XIXe siècle — Jonathan Crary]] (french)<br>
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Jacqueline Chambon (Editions) (1994)<br>
* [https://www.conrad.nl/ Conrad] (Germany, expensive)
→&nbsp;7.&nbsp; [[Inside the White Cube, the Ideology of Gallery Space — Brian O'Doherty]] (english) ⚠️(yet to be filled)⚠️<br>
* [https://www.tme.eu/en/ TME] (Poland, cheap, ridiculously difficult website)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Les presses du réel (2008)<br>
* [https://www.segor.de/#/index Segor] (Germany)
→&nbsp;8.&nbsp; [[Préçis de sémiotique générale — Jean-Marie Klinkenbeg]] (french) ⚠️(yet to be filled)⚠️<br>
* [http://mouser.de/ Mouser] (Germany)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Point (2000)<br>
* [https://www.reichelt.de/ Reichelt] (Germany)
→&nbsp;9.&nbsp; [[Langage des nouveaux médias — Lev Manovitch]] (french) ⚠️(yet to be filled)⚠️<br> 
* [https://www.farnell.com/ Farnell] (Germany/UK)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Presses du Réel (2001)<br>
* [https://www.digikey.com/ Digi-key] (USA, fast but expensive shipping + tax)
→&nbsp;10.&nbsp;[[L'empire cybernétique — Cécile Lafontaine]] (french)<br> 
* [https://www.taydaelectronics.com/ Tayda] (Thailand/USA, 2-3 weeks shipping)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Seuil (2004)<br>
* [https://nl.aliexpress.com/ A liexpress] (China, 2-3 weeks shipping)
→&nbsp;11.&nbsp; [[La relation comme forme — Jean Louis Boissier]] (french)<br> 
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Genève, MAMCO(2004)<br>
===PCB making EU (Expensive)===
→&nbsp;12.&nbsp; [[Le Net Art au musée — Anne  Laforêt]] (french)<br> 
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Questions Théoriques(2011)<br>
* [https://www.eurocircuits.com/ Eurocircuits] (Germany)
→&nbsp;13.&nbsp; [[Narrative comprehension and Film communication — Edward Branigan]] (english)<br>
* [https://aisler.net/ Aisler] (Germany, 1 week from design upload to in your hands, very high quality)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Routledge (2013) <br>
* [https://www.leiton.de/ Leiton] (Germany)
→&nbsp;14.&nbsp;[[Statement and counter statement / Notes on experimental Jetset — Experimental Jetset]] (english)<br> 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Roma (2015)<br>
→&nbsp;15.&nbsp;[[Post Digital Print — Alessandro Ludovico]] (french) ≈<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: B42 (2016)<br>
→&nbsp;16.&nbsp;[[L'écran comme mobile — Jean Louis Boissier]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Presses du réel (2016)<br>
→&nbsp;17.&nbsp;[[Design tactile — Josh Clark]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Eyrolles (2016)<br>
→&nbsp;18.&nbsp;[[Espaces de l'œuvre, espaces de l'exposition — Pamela Bianchi]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Eyrolles (2016)<br>
→&nbsp;19.&nbsp;[[Imprimer le monde]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Éditions HYX et les Éditions du Centre Pompidou (2017)<br>
→&nbsp;20.&nbsp;[[Version 0 - Notes sur le livre numérique]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: ECRIDIL (2018)<br>


===PCB making China (Cheap but import tax)===
===<p style="font-family:helvetica">Articles (old)</p>===


* [https://jlcpcb.com/ JLCPCB] (1 week from design upload to in your hands, low quality solder mask)
→&nbsp;1.&nbsp;[[Frederick Kiesler — artiste- architecte]] ⚠️(yet to be filled)⚠️<br>
* [https://www.pcbway.com/ PCBWAY] (1 week from design upload to in your hands)
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;(communiqué de presse)
* [https://www.allpcb.com/ ALLPCB] (1 week from design upload to in your hands)
Centre pompidou; source : centrepompidou.fr (1996)<br>
→&nbsp;2.&nbsp;[[Oublier l'exposition]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Artpress special numéro 21 (2000)<br>
→&nbsp;3.&nbsp;[[Composer avec l’imprévisible: Le questionnaire sur les médias variables]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Jon Ippolito; source : variablemedia.net/pdf/Permanence (2003)<br>
→&nbsp;4.&nbsp;[[Esthétique du numérique : rupture et continuité]] <br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Fred Forest; source : archives.icom.museum (2010)<br>
→&nbsp;5.&nbsp;[[La narration interactive]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Dragana Trgovčević source : ensci.com/file_intranet/mastere_ctc/etude_Dragana_Trgovcevic.pdf (2011)<br>
→&nbsp;6.&nbsp;[[Des dispositifs aux appareils - L'Espacement d'un calcul]]<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Anthony Masure source :  anthonymasure.com (2013)<br>
→&nbsp;7.&nbsp;[https://www.cairn.info/revue-cahiers-philosophiques1-2011-1-page-9.html?contenu=article Le musée n'est pas un dispositif - Jean-Louis Déotte] p.9 - 22 (2011)<br>
→&nbsp;8.&nbsp;[http://nt2.uqam.ca/fr/entree-carnet-recherche/apogee-et-perigee-du-white-cube Apogée et périgée du White Cube] Loosli, Alban<br>


===Arduino and Sensors===

* https://www.floris.cc/shop/en/19-starter-kits
* https://www.tinytronics.nl/shop/nl/arduino/kits/arduino-starter-kit
* https://www.conrad.nl/p/makerfactory-beginnersset-voor-arduino-1612782

===Sensor only Kit===

* [https://nl.aliexpress.com/item/4000238240904.html?spm=a2g0o.search0304.0.0.7bca74dbu2i6KH&algo_pvid=59637d40-1368-41ba-bc30-63b1517303e9&aem_p4p_detail=2021092303503115630656510062050000317987&algo_exp_id=59637d40-1368-41ba-bc30-63b1517303e9-0 45-in-1] (AliExpress): an example of the sensors you will find in such a kit, documented here

===Arduino Starter Projects===

* https://www.makerspaces.com/simple-arduino-projects-beginners/
or slightly more complex:
* http://www.makeuseof.com/tag/10-great-arduino-projects-for-beginners/
or in videos:
* [https://www.youtube.com/playlist?list=PLT6rF_I5kknPf2qlVFlvH47qHvqvzkkn%20d youtube playlist]
or just many different ideas:
* http://playground.arduino.cc/Projects/Ideas
or, of course, on Instructables if you want to have a complete course:
* https://www.instructables.com/class/Arduino-Class/
or this course:
* https://arduino.tkkrlab.space/

ARDUINO + PROCESSING (visualizing sensors)
* https://create.arduino.cc/projecthub/projects/tags/processing

MISCELLANEOUS KEYWORDS and LINKS
* citizen science, for example: https://www.meetjestad.net/
* https://forensic-architecture.org/

==<p style="font-family:helvetica">Installation</p>==

===<p style="font-family:helvetica">Creating an elastic exhibition space</p>===

[[File:ResponsiveSpaceSquare.gif|400px|thumb|left|Responsive Space Installation Simulation]]
[[File:Responsive Space (detail).png|400px|thumb|center|Responsive Space (detail)]]
[[File:Spectator friendly physical exhibition space.png|600px|thumb|left]]
[[File:Moving Wall Structure Shema 1.gif|400px|thumb|center|Moving Wall Structure Schema 1]]

<br><br><br><br>

===<p style="font-family:helvetica">Resources</p>===

* Movable walls build-out for the Art Museum of West Virginia University [https://www.youtube.com/watch?v=E3SrGzAsRBE link]
* Gallery Wall System (GWS) [https://www.youtube.com/watch?v=uJ-yFkJ9ykA link]
* CASE-REAL installs movable walls inside a basement art gallery in Tokyo [https://www.youtube.com/watch?v=kHTgAeN4XDg link]

=<p style="font-family:helvetica">Venues</p>=

==<p style="font-family:helvetica">Venue 1: Aquarium</p>==

===<p style="font-family:helvetica">Description</p>===

<b>AQUARIUM 1.0</b><br>
<br>
A Small Ecosystem for Living Thoughts<br>
<br>
Monday, 11th October<br>
19:30 – 21:30<br>
Leeszaal Rotterdam West<br>
Rijnhoutplein 3, 3014 TZ Rotterdam<br>
<br>
with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann
<br><br>
It’s oh-fish-ial! Students of the Experimental Publishing Master invite you to dive into their small ecosystem of living thoughts. Join us for an evening of conversation, discussion and new viewpoints. If you look closely, you might even see some early thesis ideas hatching. Let's leave no rock unturned.

==<p style="font-family:helvetica">Observation questionnaire</p>==

This exercise is a small, humble and almost entirely analog experiment questioning representation, in two steps.

===<p style="font-family:helvetica">1st step</p>===

[[File:Brick.jpg|300px|thumb|left|photo of a brick]]
<br><br><br><br><br><br><br><br><br><br><br>
* <b>1st step:</b> I give a sheet of paper to people during the venue and ask them to answer a series of questions about the object (a brick) displayed on a podium in the middle of the room. They are told that they may observe the brick from anywhere in the room, in any position. Here are the questions:
<br>
* <b>Please write down your first name:
<br>
* Describe your position (sitting/standing/other):
<br>
* Describe your location in the room:
<br>
* Describe what you are seeing while looking at the object:
<br>
* Describe how you feel mentally/emotionally:</b>
<br>
<br>

===<p style="font-family:helvetica">2nd step</p>===

[[File:Brickphoto.png|300px|thumb|left|photo of brick displayed inside a computer screen]]
<br><br><br><br><br><br><br><br><br><br><br>
* <b>2nd step:</b> I collect the answers, wait a round, and then give the same people a new sheet of paper with the exact same questions, this time concerning the representation of the object (the brick), displayed on a computer screen placed on the same podium in the middle of the room.

===<p style="font-family:helvetica">Answer Samples</p>===

1.0 <b>Object on a podium</b>

* 1.1 Sitting on corner stairs —> Wants to see it from different angles —> Feeling trapped, frustrated
* 1.2 Sitting on stairs —> A rock looking dead —> Feeling sad
* 1.3 Sitting on the left, close to the columns —> Rational observation —> Nostalgic memories, having participated in the creation of the object as it looks right now
* 1.4 Sitting in front of the object —> Calm and slightly confused
* 1.5 Sitting on the floor next to the stairs, between the side and the middle —> Looking at the object from the bottom —> Feeling a bit confused and inspired
<br><br>
2.0 <b>Photo of the object displayed on a computer screen placed on a podium</b>

* 2.1 Sitting on a chair, seeing the brick from a bird's-eye perspective —> Feeling more in control of the situation
* 2.2 Sitting very close to the brick —> Seeing a flat and almost abstract picture —> Feeling drawn to the picture, aesthetically pleasing, feeling less sad about the object
* 2.3 Sitting under a table very far away —> Looking abstract but identifiable —> Excited about the unusual and childish observation position
* 2.4 Sitting on stairs —> Seeing the brick in 2D —> Feeling fine
* 2.5 Sitting on the stairs —> Seeing a side of the screen with a top-view photo of the object —> Feeling comfortable
<br><br>
[[File:Answers1 RepresentationQuestionnaire.png|300px|thumb|left|Answers1_RepresentationQuestionnaire]]
[[File:Answers2 RepresentationQuestionnaire.png|300px|thumb|center|Answers2_RepresentationQuestionnaire]]
[[File:Answers3 RepresentationQuestionnaire.png|300px|thumb|left|Answers3_RepresentationQuestionnaire]]
[[File:Answers4 RepresentationQuestionnaire.png|300px|thumb|center|Answers4_RepresentationQuestionnaire]]
[[File:Answers5 RepresentationQuestionnaire.png|300px|thumb|left|Answers5_RepresentationQuestionnaire]]

<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>

==<p style="font-family:helvetica">Venue 2: Aquarium 2.0</p>==

===<p style="font-family:helvetica">Description</p>===

Date: 29th Nov — 4th Dec 2021<br>
Time: 15h — 18h<br>
29th Nov — 4th Dec 2021 (all day)<br>
Location: De Buitenboel, Rosier Faassenstraat 22, 3025 GN Rotterdam, NL<br>
<br><br>
AQUARIUM 2.0<br>
<br>
An ongoing window exhibition with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann<br>
<br>
Tap upon the glass and peer into the research projects we are currently working on. From Monday 29th of November until Saturday 4th of December we put ourselves on display in the window of De Buitenboel as an entry point into our think tank. Navigating between a range of technologies, such as wireless radio waves, virtual realities, sensors, ecological and diffractive forms of publishing, web design frameworks, language games, and an ultra-territorial residency, we invite you to gaze inside the tank and float with us. Welcome back to the ecosystem of living thoughts.<br>

==<p style="font-family:helvetica">Aquarium LCD Portal (29 Nov – 4th Dec)</p>==

This interactive micro-installation, composed of an LCD screen and one or more sensors, invites visitors to change the color of the screen and the messages it displays by moving closer to or further from the window.
[https://www.pzwart.nl/blog/2021/11/30/aquarium-2-0/ Link]

[[File:ScreenPortalFront.jpg|300px|thumb|left|ScreenPortalFront]]
[[File:ScreenPortalback.jpg|300px|thumb|right|ScreenPortalback]]
[[File:LCDScreenTest.gif|600px|thumb|center|LCDScreenTest]]
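The distance-to-screen mapping behind the Aquarium LCD Portal described above can be sketched in a few lines. This is a hypothetical illustration, not the installation's actual code (which runs on a microcontroller driving the LCD): the thresholds, colors and messages are invented for the sake of the example, and the sensor reading is simulated.

```python
# Hypothetical sketch of the LCD Portal logic: a visitor's distance from
# the window (as a distance sensor would report it) is mapped to a screen
# color and a displayed message. All values below are made up.

def portal_state(distance_cm: float) -> tuple[str, str]:
    """Map a visitor's distance from the window to (color, message)."""
    if distance_cm < 50:
        return ("#ff0055", "Hello, you are very close!")
    elif distance_cm < 150:
        return ("#ffaa00", "Come a little closer...")
    else:
        return ("#0033ff", "Welcome to the aquarium")

# Simulate a visitor walking toward the window
for d in (300, 120, 30):
    color, message = portal_state(d)
    print(d, color, message)
```

In the physical piece the same mapping would run in the sensor loop, with the color sent to the screen backlight and the message printed to the LCD.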


<br><br><br><br><br><br><br><br><br><br><br><br><br>
<div style='
width: 75%; 
font-size:16px;
background-color: white;
color:black;
float: left;
border:1px black;
font-family: helvetica;
'>


=<p style="font-family:helvetica">Readings (new)(english)(with notes in english) </p>=
==<p style="font-family:helvetica">About Institutional Critique</p>==

===<p style="font-family:helvetica">To read</p>===

→&nbsp;1.&nbsp;[[Art and Contemporary Critical Practice: Reinventing Institutional Critique]] [https://aaaaarg.fail/thing/51c584186c3a0ed90ba30900 Doc]<br>
→&nbsp;2.&nbsp;[[From the Critique of Institutions to an Institution of Critique - Andrea Fraser]] [https://aaaaarg.fail/upload/andrea-fraser-from-the-critique-of-institutions-to-an-institution-of-critique-1.pdf Doc]<br>
→&nbsp;3.&nbsp;[[Institutional critique, an anthology of artists writings - Alexander Alberro]] [https://aaaaarg.fail/upload/alexander-alberro-institutional-critique-an-anthology-of-artists-writings.pdf Doc]
==<p style="font-family:helvetica">About Techno-Solutionism</p>==
 
===<p style="font-family:helvetica">To read</p>===
 
→&nbsp;1.&nbsp;The Folly of Technological Solutionism: An Interview with Evgeny Morozov - Natasha Dow Schüll
 
==<p style="font-family:helvetica">About Meta</p>==
 
===<p style="font-family:helvetica">To read</p>===
→&nbsp;1.&nbsp; [https://www.tandfonline.com/doi/full/10.3402/jac.v6.23009  The meta as an aesthetic category] Bruno Trentini (2014)<br>
→&nbsp;2.&nbsp; [[File:RMZ ARTIST WRITING(2).pdf|thumb|left|Rosa Maria Zangenberg]] The eye tells the story by Rosa Maria Zangenberg (2017)<br>
→&nbsp;3.&nbsp; [https://brill.com/view/title/2075 Leonardo Da Vinci - Paragone by Claire Farago]
 
==<p style="font-family:helvetica">About exhibition space</p>==
 
===<p style="font-family:helvetica">To read</p>===
 
→&nbsp;1.&nbsp;Kluitenberg, Eric, ed. Book of imaginary media. Excavating the dream of the ultimate communication medium. Rotterdam: NAi Publishers, 2006.<br>
→&nbsp;2.&nbsp;[http://nt2.uqam.ca/fr/biblio/wall-and-canvas-lissitzkys-spatial-experiments-and-white-cube The wall and the canvas: Lissitzky’s spatial experiments and the White Cube]<br>
→&nbsp;3.&nbsp;[https://eastofborneo.org/articles/decorative-arts-billy-al-bengston-and-frank-gehry-discuss-their-1968-collaboration-at-lacma/ Decorative Arts: Billy Al Bengston and Frank Gehry discuss their 1968 collaboration at LACMA by Aram Moshayedi]<br>
→&nbsp;4.&nbsp;[[File:Resonance and Wonder STEPHEN GREENBLATT.pdf|thumb]] Resonance and Wonder - Stephen Greenblatt<br>
→&nbsp;5.&nbsp;A Canon of Exhibitions - Bruce Altshuler [[File:A Canon of Exhibitions - Bruce Altshuler.pdf|thumb]]<br>
→&nbsp;6.&nbsp;Documenta - [[File:A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar.pdf|thumb]] A Brief History of an Exhibition and Its Contexts - Klaus Siebenhaar<br>
→&nbsp;7.&nbsp;Pallasmaa - The Eyes of the Skin [[File:Pallasmaa - The Eyes of the Skin.pdf|thumb]]<br>
→&nbsp;8.&nbsp;Venturi - Learning from Las Vegas [[File:Venturi - Learning from Las Vegas.pdf|thumb]]<br>
→&nbsp;9.&nbsp;Preserving and Exhibiting Media Art: Challenges and Perspectives - Julia Noordegraaf, Cosetta G. Saba, Barbara Le Maître, Vinzenz Hediger; Amsterdam University Press, 2013; series: Framing Film
 
===<p style="font-family:helvetica">Reading/Notes</p>===
 
→&nbsp;1.&nbsp;[http://nt2.uqam.ca/fr/biblio/after-white-cube [[After the White Cube.]]] [https://www.lrb.co.uk/the-paper/v37/n06/hal-foster/after-the-white-cube ref] 2015 NOTES INSIDE<br>
* How and why the White Cube rose and became democratized
* White Cube // Consumerism = Art Consumerism?
* Exhibition Space > Artworks
* Experience of interpretation = Entertainment of Art?
* Museum vs Mausoleum
<br>
→&nbsp;2.&nbsp;[http://nt2.uqam.ca/fr/biblio/spaces-experience-art-gallery-interiors-1800-2000 [[Spaces of Experience: Art Gallery Interiors from 1800 – 2000]]] [http://nt2.uqam.ca/fr/biblio/spaces-experience-art-gallery-interiors-1800-2000 ref] NOTES INSIDE<br>
* Art vs 50's consumerism / Choreography of desire?
* Check theorists Hermann von Helmholtz and Wilhelm Wundt
<br>
→&nbsp;3.&nbsp;[https://www.e-flux.com/announcements/262138/colour-critique/ [[Colour Critique A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space]]] NOTES INSIDE<br>
May 24, 2019 - Noise! Frans Hals, Otherwise, Frans Hals Museum<br>
→&nbsp;4.&nbsp; [[Colour_Critique_A_Symposium_on_Colour_as_an_Agent_for_Taste,_Experience_and_Value_in_the_Exhibition_Space|Noise! Frans Hals, Otherwise]] NOTES INSIDE<br>
* Role of colours in the viewer's experience of an exhibition
* Institutional Critique
* Institutionalised Space / White cube
<br>
→&nbsp;5.&nbsp;[[Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze]] NOTES INSIDE<br>(course for Artscience 2007/8)
[http://www.joostrekveld.net/wp/?page_id=590 doc]<br>
* About perspective
* About Space time
* About Cyber Space
<br>
→&nbsp;6.&nbsp; [[THE  DEVIL  IS IN THE  DETAILS: MUSEUM - Displays  and the Creation of Knowledge]] [https://pzwiki.wdka.nl/mw-mediadesign/images/5/57/The_Devil_is_in_the_details-_DETAIT-_MUSEUM_Displays_and_the_Creation_of_Knowledge.pdf Doc] NOTES INSIDE<br>
Stephanie Moser SOUTHAMPTON UNIVERSITY (MUSEUM ANTHROPOLOGY) 2010<br>
* Architecture (Neoclassical buildings)
* Big vs Small exhibition Space
* Lined up objects vs non systematic display
* Architecture/Design
* Gallery interiors (ceiling/interior design elements/furniture)
* Colors
* Individual lighting of objects vs global lighting
* Dark vs Bright lighting
* Chronological vs Thematic arrangement
* Academic vs Journalistic writing
* Busy layout vs Minimal layout
* Exhibition seen vs other exhibitions
* Themed/idea-oriented vs object-led exhibitions
* Didactic vs discovery exhibition
* Contextual, immersive, or atmospheric exhibitions
* Audience vs Reception
<br>
→&nbsp;7.&nbsp;[[Fantasies of the Library - Etienne Turpin (ed.), Anne-Sophie Springer (ed.)]] [https://hub.xpub.nl/bootleglibrary/category/new/326 Ref]; publisher: The MIT Press; publication date: 1 Sept. 2018
* How the physical organization of a bookshelf can influence its digital version
* The book as a miniature gallery/exhibition space
* The library as a public place of reading
* Library vs Exhibition Space = Use vs Display
* Book-theme exhibitions
 
==<p style="font-family:helvetica">About User vs Visitor, or user in exhibition space</p>==
 
[[Designing the user experience in exhibition spaces - Elisa Rubegni, Caporali Maurizio, Antonio Rizzo, Erik Grönvall]]
* What are the GUI intentions
* What is the WIMP interaction model
* What are the post-WIMP models
* About Memex
 
==<p style="font-family:helvetica">About User Interface</p>==
 
=== Readings/Notes===
 
→&nbsp;1.&nbsp;[https://hub.xpub.nl/bootleglibrary/search?query=+The+Interface+Effect bootleg][[Alexander R. Galloway - The Interface Effect]]  1st ed. Malden, USA: Polity Press.<br>
* The interface paradox
* The less they do, the more they achieve and the more they become invisible & unconsidered
* The interface as a "significant surface"
* The interface as a gateway
* The interface as "the place where information moves from one entity to another"
* The interface as the media itself
* The interface as "agitation or generative friction between different formats"
* The interface as "an area" that "separates and mixes the two worlds that meet together there"
<br>
→&nbsp;2.&nbsp;[https://hub.xpub.nl/bootleglibrary/search?query=Navigating+Neoliberalism bootleg] [[Nick Srnicek - Navigating Neoliberalism: Political Aesthetics in an Age of Crisis]] NOTES INSIDE<br>
Publisher: medium.com, publication date: 20 Oct. 2016<br>
* From an aesthetics of the sublime to an aesthetics of the interface
* Cognitive mapping
<br>
→&nbsp;3.&nbsp;[https://hub.xpub.nl/bootleglibrary/search?query=+Program+or+be+programmed bootleg] [[Program Or Be Programmed - Ten Commands For A Digital Age  Douglas Rushkoff]] NOTES INSIDE<br>
Rushkoff, D., 2010. Program Or Be Programmed - Ten Commands For A Digital Age. 1st ed. Minneapolis, USA: OR Books.<br>
*"Instead of learning about our technology, we opt for a world in which our technology learns about us."
* Programmed by the interfaces
* From a transparent to an opaque medium
<br>
→&nbsp;4.&nbsp;[https://hub.xpub.nl/bootleglibrary/search?query=The+Best+Interface+Is+No+Interface bootleg][[The Best Interface Is No Interface - Golden Krishna]] NOTES INSIDE<br>
Krishna, G., 2015. The Best Interface Is No Interface: The simple path to brilliant technology (Voices That Matter). 1st ed. unknown: New Riders Publishing.
* "Screen Obsessed Approach to Design"
* UI vs UX
<br>
→&nbsp;5.&nbsp;[[Plasticity of User Interfaces:A Revised Reference Framework]] NOTES INSIDE<br>
Gaëlle Calvary, Joëlle Coutaz, David Thevenin, Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt<br>
* About the term 'Plasticity'
<BR>
→&nbsp;6.&nbsp;[[Interface Critique- Beyond UX - FLORIAN HADLER, ALICE SOINÉ; DANIEL IRRGANG]] [https://hub.xpub.nl/bootleglibrary/search?query=interface DOC]<br>
* The interface as a "historical artifact", a "space of power"
* The interface as a human-machine boundary
* What is interface critique
* Interface in computer science
* The screen for Lev Manovich
 
<br><br>
 
=== More to read/see===
 
→&nbsp;1.&nbsp;Bickmore, T.W., Schilit, B.N., Digestor: Device-Independent Access To The World Wide Web, in Proc. of 6th Int. World Wide Web Conf. WWW’6 (Santa Clara, April 1997)<br>
→&nbsp;2.&nbsp;Bouillon, L., Vanderdonckt, J., Souchon, N., Recovering Alternative Presentation Models of a Web Page with VAQUITA, Chapter 27, in Proc. of 4th Int. Conf. on Computer-Aided Design of User Interfaces CADUI’2002 (Valenciennes, May 15-17, 2002)<br>
→&nbsp;3.&nbsp;Calvary, G., Coutaz, J., Thevenin, D., Supporting Context Changes for Plastic User Interfaces: a Process and a Mechanism, in “People and Computers XV – Interaction without Frontiers”, Joint Proceedings of AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI’2001 (Lille, 10-14 September 2001)<br>
→&nbsp;4.&nbsp;Cockton, G., Clarke, S., Gray, P., Johnson, C., Literate Development: Weaving Human Context into Design Specifications, in “Critical Issues in User Interface Engineering”, P. Palanque & D. Benyon (eds), Springer-Verlag, London, 1995<br>
→&nbsp;5.&nbsp;Graham, T.C.N., Watts, L., Calvary, G., Coutaz, J., Dubois, E., Nigay, L., A Dimension Space for the Design of Interactive Systems within their Physical Environments, in Proc. of Conf. on Designing Interactive Systems DIS’2000 (New York, August 17-19, 2000), ACM Press, New York, 2000<br>
→&nbsp;6.&nbsp;Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’2001 (New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001<br>
→&nbsp;7.&nbsp;Thevenin, D., Coutaz, J., Plasticity of User Interfaces: Framework and Research Agenda, in Proc. of 7th IFIP International Conference on Human-Computer Interaction Interact’99 (Edinburgh, August 30 - September 3, 1999), Chapman & Hall, London, pp. 110-117<br>
→&nbsp;8.&nbsp;Thevenin, D., Adaptation en Interaction Homme-Machine: Le cas de la Plasticité, Ph.D. thesis, Université Joseph Fourier, Grenoble, 21 December 2001<br>
→&nbsp;9.&nbsp;[[Graspable interfaces (Fitzmaurice et al., 1995)]] [https://www.dgp.toronto.edu/~gf/papers/PhD%20-%20Graspable%20UIs/Thesis.gf.html link]
 
==<p style="font-family:helvetica">About User Condition</p>==
 
=== Readings ===
 
→&nbsp;1.&nbsp;[[The User Condition 04: A Mobile First World - Silvio Lorusso]] [https://networkcultures.org/entreprecariat/mobile-first-world/ Doc]
* Most web users are smartphone users
* How "mobile first" affects global web design
* How "mobile first" affects the way we use computers
 
=<p style="font-family:helvetica">Readings (old)(mostly french)(with notes in french)</p>=
 
===<p style="font-family:helvetica">Books (old)</p>===
 
<br>
→&nbsp;1.&nbsp; [[L'art comme expérience — John Dewey]] (french) ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Gallimard (1934)<br>
→&nbsp;2.&nbsp; [[L'œuvre d'art à l'époque de sa reproductibilité technique — Walter Benjamin]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Alia (1939)<br>
→&nbsp;3.&nbsp; [[La Galaxie Gutemberg — Marshall McLuhan]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: University of Toronto Press (1962)<br>
→&nbsp;4.&nbsp; [[Pour comprendre les médias — Marshall McLuhan]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: McGraw-Hill Education (1964)<br>
→&nbsp;5.&nbsp; [[Dispositif — Jean-Louis Baudry]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Raymond Bellour, Thierry Kuntzel et Christian Metz (1975)<br>
→&nbsp;6.&nbsp; [[L’Originalité de l’avant-garde et autres mythes modernistes — Rosalind Krauss]] (french) ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Macula (1993)<br>
→&nbsp;7.&nbsp; [[L'art de l'observateur: vision et modernité au XIXe siècle — Jonathan Crary]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Jacqueline Chambon (Editions) (1994)<br>
→&nbsp;8.&nbsp; [[Inside the White Cube, the Ideology of Gallery Space — Brian O'Doherty]] (english) ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Les presses du réel (2008)<br>
→&nbsp;9.&nbsp; [[Préçis de sémiotique générale — Jean-Marie Klinkenbeg]] (french) ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Point (2000)<br>
→&nbsp;10.&nbsp;[[Langage des nouveaux médias — Lev Manovitch]] (french) ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Presses du Réel (2001)<br>
→&nbsp;11.&nbsp;[[L'empire cybernétique — Cécile Lafontaine]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Seuil (2004)<br>
→&nbsp;12.&nbsp;[[La relation comme forme — Jean Louis Boissier]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Genève, MAMCO (2004)<br>
→&nbsp;13.&nbsp;[[Le Net Art au musée — Anne  Laforêt]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Questions Théoriques (2011)<br>
→&nbsp;14.&nbsp;[[Narrative comprehension and Film communication — Edward Branigan]] (english)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Routledge (2013)<br>
→&nbsp;15.&nbsp;[[Statement and counter statement / Notes on experimental Jetset — Experimental Jetset]] (english)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Roma (2015)<br>
→&nbsp;16.&nbsp;[[Post Digital Print — Alessandro Ludovico]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: B42 (2016)<br>
→&nbsp;17.&nbsp;[[L'écran comme mobile — Jean Louis Boissier]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Presses du réel (2016)<br>
→&nbsp;18.&nbsp;[[Design tactile — Josh Clark]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Eyrolles (2016)<br>
→&nbsp;19.&nbsp;[[Espaces de l'œuvre, espaces de l'exposition — Pamela Bianchi]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Eyrolles (2016)<br>
→&nbsp;20.&nbsp;[[Imprimer le monde]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: Éditions HYX et les Éditions du Centre Pompidou (2017)<br>
→&nbsp;21.&nbsp;[[Version 0 - Notes sur le livre numérique]] (french)<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;publisher: ECRIDIL (2018)<br>
 
===<p style="font-family:helvetica">Articles (old)</p>===
 
→&nbsp;1.&nbsp;[[Frederick Kiesler — artiste- architecte]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;(press release) Centre Pompidou; source : centrepompidou.fr (1996)<br>
→&nbsp;2.&nbsp;[[Oublier l'exposition]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Artpress special numéro 21 (2000)<br>
→&nbsp;3.&nbsp;[[Composer avec l’imprévisible: Le questionnaire sur les médias variables]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Jon Ippolito; source : variablemedia.net/pdf/Permanence (2003)<br>
→&nbsp;4.&nbsp;[[Esthétique du numérique : rupture et continuité]] <br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Fred Forest; source : archives.icom.museum (2010)<br>
→&nbsp;5.&nbsp;[[La narration interactive]] ⚠️(yet to be filled)⚠️<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Dragana Trgovčević source : ensci.com/file_intranet/mastere_ctc/etude_Dragana_Trgovcevic.pdf (2011)<br>
→&nbsp;6.&nbsp;[[Des dispositifs aux appareils - L'Espacement d'un calcul]]<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;Anthony Masure source :  anthonymasure.com (2013)<br>
→&nbsp;7.&nbsp;[https://www.cairn.info/revue-cahiers-philosophiques1-2011-1-page-9.html?contenu=article Le musée n'est pas un dispositif - Jean-Louis Déotte] p.9 - 22 (2011)<br>
→&nbsp;8.&nbsp;[http://nt2.uqam.ca/fr/entree-carnet-recherche/apogee-et-perigee-du-white-cube Apogée et périgée du White Cube] Loosli, Alban<br>
 
=<p style="font-family:helvetica">References</p>=
 
===<p style="font-family:helvetica">Exhibition space</p>===
 
→&nbsp;&nbsp;[https://vanabbemuseum.nl/en/collection/details/collection/?lookup%5B1673%5D%5Bfilter%5D%5B0%5D=id%3AC969 Proun Spaces — El Lissitzky] (1920)<br>
→&nbsp;&nbsp;[https://thecharnelhouse.org/2013/11/19/frederick-kiesler-city-of-space-1925/ City in Space — Frederick Kiesler] (1925)<br>
→&nbsp;&nbsp;[https://fr.wikipedia.org/wiki/Air-Conditioning_Show The Air-Conditioning Show — Terry Atkinson & Michael Baldwin] (1966–67)<br>
→&nbsp;&nbsp;[https://www.google.com/search?q=Sans+titre+%E2%80%94+Michael+Asher&client=firefox-b-e&sxsrf=ALeKk002R7DoF2Y8QZ5Dp0GCnNVIJLEs5w:1603360434290&source=lnms&tbm=isch&sa=X&ved=2ahUKEwiDzIeJ98fsAhXxyIUKHUB-DVUQ_AUoAXoECAQQAw&biw=1406&bih=759 Sans titre — Michael Asher] (1973)<br>
→&nbsp;&nbsp;[https://www.centrepompidou.fr/cpv/resource/cXjGAE/rREaz7 Serra Corner prop n°7 (for Nathalie) Richard Serra] (1983)<br>
→&nbsp;&nbsp;[https://www.muhka.be/programme/detail/1405-shilpa-gupta-today-will-end/item/30302-speaking-wall Speaking Wall]  (2009 - 2010) <br>
 
===<p style="font-family:helvetica">Nothingness with Media</p>===
 
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=JTEFKFiXSx4 4′33″ — John Cage] (1952)<br>
→&nbsp;&nbsp;[https://www.saatchigallery.com/artists/artpages/tom_friedman_8.htm Untitled (A Curse) — Tom Friedman] (1992)<br>
→&nbsp;&nbsp;[https://fr.wikipedia.org/wiki/Air-Conditioning_Show The Air-Conditioning Show — Terry Atkinson & Michael Baldwin] (1966–67)<br>
→&nbsp;&nbsp;[https://www.google.com/search?q=Sans+titre+%E2%80%94+Michael+Asher&client=firefox-b-e&sxsrf=ALeKk002R7DoF2Y8QZ5Dp0GCnNVIJLEs5w:1603360434290&source=lnms&tbm=isch&sa=X&ved=2ahUKEwiDzIeJ98fsAhXxyIUKHUB-DVUQ_AUoAXoECAQQAw&biw=1406&bih=759 Sans titre — Michael Asher] (1973)<br>
 
===<p style="font-family:helvetica">Mediatization of Media</p>===
 
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=JTEFKFiXSx4 4′33″ — John Cage] (1952)<br>
→&nbsp;&nbsp;[https://www.guggenheim.org/artwork/9537 TV Garden — Nam June Paik] (1974)<br>
→&nbsp;&nbsp;[https://re-voir.com/shop/en/michael-snow/71-michael-snow-presents.html Presents — Michael Snow] (soon to be translated) <br>
→&nbsp;&nbsp;[https://www.experimentaljetset.nl/archive/lostformats Lost Formats Preservation Society — Experimental Jetset] (2000) <br>
→&nbsp;&nbsp;[https://www.experimentaljetset.nl/archive/lost-formats-winterthur Lost Formats Winterthur — Experimental Jetset] (2000) <br>
→&nbsp;&nbsp;[http://indexgrafik.fr/latlas-critique-dinternet/ L’atlas critique d’Internet] Louise Drulhe (2014-2015) <br>
 
===<p style="font-family:helvetica">Flags</p>===
 
→&nbsp;&nbsp;[https://www.guggenheim.org/artwork/10703 Netflag — Mark Napier] (2002) <br>
→&nbsp;&nbsp;[https://019-ghent.org/flags/flagpole/sd025/ 019 - Flag show] (2015)
 
===<p style="font-family:helvetica">User perspective</p>===
 
→&nbsp;&nbsp;[http://whatyouseeiswhatyouget.net/ What you see is what you get — Jonas Lund] (2012) <br>
 
===<p style="font-family:helvetica">Media Time perception</p>===
 
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=aLNfUB7JtA4&list=PLDBIRMMxLGqOnvwJGO5kge8RMESFmA0MX&index=3 Present Continuous Past — Dan Graham's] (1974)
 
===<p style="font-family:helvetica">Experimental cinema</p>===
 
→&nbsp;&nbsp;[https://re-voir.com/shop/en/michael-snow/71-michael-snow-presents.html Presents — Michael Snow] (soon to be translated) <br>
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=bMDr_CFFgWE&list=PLDBIRMMxLGqOnvwJGO5kge8RMESFmA0MX&index=4 Displacements — Michael Naimark] (1980) <br>
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=J2-VQFPYftM&list=PLDBIRMMxLGqOnvwJGO5kge8RMESFmA0MX&index=5 BE NOW HERE — Michael Naimark] (1995)
 
===<p style="font-family:helvetica">CSS composition</p>===
 
→&nbsp;&nbsp;[http://sebastianlyserena.dk/ Sebastianly Serena]<br>
→&nbsp;&nbsp;[http://www.scrollbarcomposition.com/ Scrollbar Composition]<br>
→&nbsp;&nbsp;[http://www.intotime.com/ into time .com - Rafael Rozendaal]<br>
→&nbsp;&nbsp;[https://www.nicolassassoon.com/RIDGE_11.html Ridge 11 - Nicolas Sassoon]<br>
→&nbsp;&nbsp;[https://www.associationpaste.com/rectangulaire/ Rectangulaire - Claude Closky]<br>
→&nbsp;&nbsp;[https://jacksonpollock.org/ Jacksonpollock.org - Miltos Manetas]<br>
→&nbsp;&nbsp;[http://aabrahams.free.fr/movpaint/frame5.htm Moving Paintings - Annie Abrahams]
 
===<p style="font-family:helvetica">Media deterioration</p>===
 
→&nbsp;&nbsp;[https://img214270416jpg.tumblr.com/ Img214270417]<br>
→&nbsp;&nbsp;[https://www.youtube.com/watch?v=mjnAE5go9dI&list=LLfu-Fy4NjlpiIYJyE447UDA&index=49&t=832s William Basinski - The Disintegration Loops]
 
===<p style="font-family:helvetica">Undefined</p>===
 
→&nbsp;&nbsp;[http://untitledsans.com/ Untitled Sans]
 
===<p style="font-family:helvetica">User friendliness and anti-user friendliness</p>===
 
→&nbsp;&nbsp;[https://www.juhavaningen.com/websafe/1.html Web-Safe - Juha van Ingen]
 
===<p style="font-family:helvetica">Media Art conservation</p>===
 
→&nbsp;&nbsp;[https://www.guggenheim.org/conservation/the-variable-media-initiative The Variable Media Initiative] 1999<br>
→&nbsp;&nbsp;[https://www.eai.org/resourceguide/ EAI Online Resource Guide for Exhibiting, Collecting & Preserving Media Art]<br>
→&nbsp;&nbsp;[http://mattersinmediaart.org/ Matters in Media Art]<br>
→&nbsp;&nbsp;[https://www.incca.org/ The International Network for the Preservation of Contemporary Art (INCCA)]<br>
→&nbsp;&nbsp;[https://www.tandfonline.com/doi/full/10.1080/19455224.2019.1604398 Archiving complex digital artworks - Dušan Barok]
 
===<p style="font-family:helvetica">Emulation</p>===
 
→&nbsp;&nbsp;[https://www.guggenheim.org/exhibition/seeing-double-emulation-in-theory-and-practice Seeing Double: Emulation in Theory and Practice]<br>
 
===<p style="font-family:helvetica">Technological Timeline</p>===
 
→&nbsp;&nbsp;[https://www.docam.ca/en/technological-timeline.html Technological Timeline]<br>
 
===<p style="font-family:helvetica">Media Art Online Archive</p>===
 
→&nbsp;&nbsp;[https://digitalartarchive.siggraph.org/ ACM SIGGRAPH Art Show Archives]<br>
→&nbsp;&nbsp;[https://www.digitalartarchive.at/nc/home.html Archive of Digital Art (ADA)]<br>
→&nbsp;&nbsp;[https://ars.electronica.art/about/en/archiv/ Ars Electronica Archive]<br>
→&nbsp;&nbsp;[https://archive-it.org/collections/4388 Digital Art Web Archive (collected by Cornell)]<br>
→&nbsp;&nbsp;[https://monoskop.org/Recent Monoskop]<br>
→&nbsp;&nbsp;[https://rhizome.org/art/artbase/ The Rhizome ArtBase]<br>
 
===<p style="font-family:helvetica">Music/Sound</p>===
 
→&nbsp;&nbsp;[https://wabi-sabi-tapes.bandcamp.com/album/the-end-of-music The end of music]
 
===<p style="font-family:helvetica">HTML Quines</p>===
 
→&nbsp;&nbsp;https://hugohil.github.io/dedans/<br>
→&nbsp;&nbsp;https://secretgeek.github.io/html_wysiwyg/html.html<br>
→&nbsp;&nbsp;http://all-html.net/?<br>
 
<div style='  
width: 75%;   
font-size:16px;
background-color: white;
color:black;
float: left;
border:1px black;
font-family: helvetica;
'>
 
<div style='
width: 75%; 
font-size:16px;
background-color: white;
color:black;
float: left;
border:1px black;
font-family: helvetica;
'>
 
=<p style="font-family:helvetica"> Current Thesis</p> =
 
==<p style="font-family:helvetica">Introduction (700 words)</p>==
<br><br>
Have you ever been anxiously waiting for feedback on your most recent online publication? Have you ever been relieved to see your notification bar filled up with likes and comments? Have you ever felt proud to be followed by that person you've been admiring for so long? Have you ever found yourself passively numbed by the infinite flow of text and images displayed to you? Have you ever found yourself almost unconsciously pressing some interaction buttons or checking the upper right corner of your screen? Have you ever felt envious when confronted with the successful image of your friends and idols? Have you ever felt discriminated against by the beauty standards promoted on some platforms? Have you ever had the sensation that your smartphone vibrated when it didn’t? Have you ever felt that you could have disappointed a machine?
<br><br>
If you have ever felt this way, you may be like me and like some of the other billions of online users who find themselves constantly interacting with technological devices. When we are connected, or using connected devices, even the most innocent action or piece of information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. In an attempt to capitalize on the last remnants of our attention and existence, tech giants have gone as far as creating addiction machines by exploiting our deepest desires and biases. And thanks to the huge amounts of money earned from the data we freely provide every day, their influence continues to take up our time, our sleep, our bodies and our spaces, and to shape our behavior, our perception and our identities.
That being said, the next questions I would like to ask (you) are:
<br><br>
Is being aware of this reality enough to start individually or collectively subverting, bypassing or boycotting these tools? Or are we, on the contrary, fully consenting to and participating in our own alienation?
<br><br>
At the time of writing this thesis, I still find myself in the position of investigating the effects of tools from which I am not yet able to distance myself. As a matter of fact, my connected devices, creative software and social media accounts still appear as essential vectors of my daily entertainment, professional practice and social interactions. This paradoxical position is defined by a state of rejection, concern or disapproval coexisting with a state of consent, entertainment and interaction. In a sense, these conflicting feelings could be compared to the drive of some drug addicts or gamblers to get high or to bet all their savings with full knowledge of the risks involved.
<br><br>
“The problem is, widespread knowledge of the dangers of addiction doesn’t stop it from happening. Likewise, we know by now that if social industry platforms get us addicted, they are working well. The more they wreck our lives, the better they’re functioning. Yet we persist.” (Seymour, 2019)
<br><br>
To understand this paradox and the dilemmas it leads to, we will first take a look at the economic model that lies between us and the interfaces that we use, and that use us, every day.
<br><br>
The first step is to understand the model of surveillance and attention on which many of the tools and services we use are based, and to situate this economy within the broader historical framework of the internet. We will see how and when this business model emerged, how it evolved, and who we are inside this economy. As a clue to the widespread datafication and the culture of surveillance that emerged from this economy, we will observe how tech companies have tried to legitimize practices of surveillance and self-surveillance through the promotion of tracking apps and data-driven advertisements.
<br><br>
Secondly, we will investigate the perverse effects of interfaces built to mobilize our attention, stimulate our interactions, obtain our consent and create addictions. In doing so, we will observe the more recent implementation of gamification, gambling and lottery systems within some online platforms and question their effects on users.
<br><br>
By closely following the expansionist and invasive logic of these companies, we will also observe the intrusion of these surveillance tools into our physical environments, as much as onto our bodies, and speculate on the consequences of a datafied society.
<br><br>
Finally, we will consider solutions to the culture of surveillance and self-surveillance that we live in, notably through the exploration of counter-practices, alternative tools and critical and activist works.
<br><br><br><br>
==<p style="font-family:helvetica">Chapter 1: The marketisation of user data and its legitimization (1400 words)</p>==
<br><br>
Have algorithms taken control of our brains? Have we become the willing slaves of platforms which, under the pretext of entertaining and educating us, do not hesitate to manipulate us in order to convert our precious data into dollars?
<br><br>
According to some studies, the main online platforms mobilize our attention for an average of 2.5 hours a day (Stewart, 2016), a figure that is constantly increasing among all age groups and is especially high among the youngest generations. This situation is the result of an economic strategy based on mobilizing a maximum of users' attention in order to collect and resell their data. In this attention economy, the tech giants have aimed to develop increasingly addictive and distracting tools, notably by emotionally stimulating us with false rewards (likes, thumbs up, badges, followers), keeping us engaged with endless flows of recommended content and bringing us back with persistent notifications.
<br><br>
“(…) they miss you, they love you, they just want to make you laugh: please come back.” (Seymour, 2019)
<br><br>
Before going further into the study of the mechanisms used to maximize our attention online, let’s head back two decades. During its first years of existence, between 1998 and 2001, Google was already in possession of an overwhelming amount of data collected from its early users. At the time, as the company had initially positioned itself publicly against the presence of advertisements in its search engine, this information was mainly used to improve its referencing/indexing system. Later on, however, this growing quantity of informational units found its true lucrative potential through the study and sale of users’ behavioral data for advertising purposes.
<br><br>
“Ironically, it was contempt for advertising (on the part of the founders and chief engineers) that would ultimately pave the way to the company’s unrivaled success as an attention merchant. The key was in renegotiating the terms under which the public was asked to tolerate ads. It presented what seemed a reasonable trade-off. So unintrusive was AdWords that some people didn’t even realize that Google was ad-supported.” (Wu, 2016)
<br><br>
Thus, by remaining free while subtly inserting advertisements into its search engine, Google went from gaining almost no financial benefit to outrageous profits that continue to this day. While this business model was already used in other industries such as television and printed newspapers (Wu, 2016), the democratization of the Web, American liberalism and the lack of rules concerning internet users’ privacy allowed Google, as well as a handful of other companies such as Baidu, Facebook, Microsoft, Twitter, Yahoo and Verizon, to steadily acquire a dominant status in their respective sectors.
<br><br>
“Life became, thanks to changes in audiovisual culture, a patchwork of jagged, broken states of attending, of being riveted by a sequence of stimuli. Advertising, movies, news cycles, all relied on their growing ability to force attention.” (Crary, 2001)
<br><br>
Until about ten years ago, when the first real signs of discontent started to grow, it was still widely considered that these services aimed to make our lives better, improve our working and living places, and connect us. In fact, even for the minority of geeks, journalists and researchers writing about the subject, such as Shoshana Zuboff, their main objective long appeared to be led by the desire to allow users/humans to get what they want, on their own terms. (Morozov, 2019)
<br><br>
Since events such as the Cambridge Analytica scandal, the January 6 U.S. Capitol riot, and the recent revelations made by ex-employees of big tech firms such as Frances Haugen (Slotnik, 2021), we have witnessed a growing mediatization of, and awareness about, the toxic and intrusive effects of the services provided by the tech giants. Among the long list of issues addressed are notably political interference linked to the proliferation of false information, the resale of personal data to obscure third parties, the abuse of a dominant position towards competing companies, the creation of precarious and non-unionized jobs, and also the significant increase of psychological disorders among Web users.
<br><br>
While experts do not yet widely agree on the pathological effects of digital media overuse, it is worth considering that such effects may vary depending on the tools in question as well as on how much they are used. While it is often considered that limited use can even have beneficial effects, it seems increasingly clear that users struggle to use such tools in moderation. Many experts consider that digital media overuse can lead to psychological disorders such as depression, anxiety, dependence symptoms or attention deficit hyperactivity disorder (ADHD). Beyond the studies and research, it may also be interesting to engage in an introspective exercise concerning our own use of these media, or to notice that some of our friends and family members may also be affected by one or more of the effects described here. (see personal experience/diary at the end of this document)
<br><br>
From a legal point of view, we are also witnessing an increasing number of legal actions brought by governments and organizations in order to regulate the leaders of this industry and limit their power. In the USA, several trials have begun, with the clear goal of regulating these companies’ business model. However, the sharp political division in the American landscape, as well as the great influence of tech lobbies, has not yet allowed any significant sanctions or regulations, but the fight continues. In the European Union, a bit more has been done, such as the introduction of the General Data Protection Regulation (GDPR) in 2018 and upcoming regulations such as the Digital Markets Act (DMA) and the Digital Services Act (DSA) in 2023. However, with the exception of a few symbolic actions, no significant change has been observed yet. Instead, we observe that these companies make use of their dominant position to effectively negotiate their sanctions, or even to blackmail governments with the threat of potential shutdowns in some countries. As a result, it appears such platforms have simply become too important for media, people and businesses around the world to be truly worried yet.
<br><br>
Today, despite growing disillusion and awareness about the real intentions of these platforms, we observe a quite paradoxical willingness, or even enthusiasm, from most users to share information about themselves and to continue using these tools. In the most common cases, users share personal information of their own free will through likes, publications, tweets, comments and photos/videos, allowing them to build their online alter egos, publicly confess and represent themselves to the online world. But this enthusiasm doesn’t stop there: it can be further exemplified by the emergence of self-tracking tools and wearable devices which, over the last fifteen years, have allowed users to track their own bodies and have legitimized the datafication and marketisation of the body.
<br><br>
For instance, sport and health apps have been among the first to promote the visualization and sharing of daily performance on social networks. In doing so, they allow humans to regulate their own behavior based on their data, but also encourage them to give away some of their most private information, which can then be sold to third parties such as advertisers or insurers.
<br><br>
“The recent proliferation of wearable self-tracking devices intended to regulate and measure the body has brought contingent questions of controlling, accessing and interpreting personal data. Given a socio-technical context in which individuals are no longer the most authoritative source on data about themselves, wearable self-tracking technologies reflect the simultaneous commodification and knowledge-making that occurs between data and bodies.” (Crawford et al., 2015)
<br><br>
Beyond self-tracking practices and the quantified self, to which we will return extensively later in this essay (chapter 3), we also witness a tendency among some companies to use data-driven methods as a public marketing tool, such as in the audiovisual streaming industry. A few years ago, for example, Spotify launched an advertising campaign revealing the musical habits and personality of its users in the form of humorous messages: “Dear Person who played ‘Sorry’ 42 times on Valentine’s Day – What did you do?” (fig.1) (Kholeif, 2018) While these messages do not necessarily reflect how data mining actually works, it could be said that this advertising strategy normalises the practice of user monitoring in the eyes of society. As part of that same campaign, Spotify also encourages users to visualize and share their listening habits on social networks with “your Spotify year wrapped”, an individualized storyline featuring a variety of personalized, data-driven and editorially curated content.
<br><br>
The emergence of such communication strategies and the globally positive feedback from consumers are signs of our ambivalent feelings towards these tools, which we seem to worry about as much as we confide in and depend on. This contradiction is characterized, on one side, by a state of concern about the intrusive surveillance practices operated by the Web giants, and on the other, by a state of entertainment and even active engagement with the tools and platforms through which this surveillance is mainly operated.
<br><br>
In the next chapter, we will investigate how tech companies still manage to obtain our consent, attention and engagement despite growing concerns about the effects of such tools on society, our privacy and our mental health. Furthermore, we will take a closer look at the involvement of these same companies in monitoring the physical world and our human existences/experiences through a wide array of tools, pretexts and contexts.
<br><br><br><br>
==<p style="font-family:helvetica">Chapter 2: The mechanisms of addiction and consent (1500 words)</p>==
<br><br>
“How can we be devoted to a technology that is marketed as our servant?” (Seymour, 2019)
<br><br>
“Is it possible that in their voluntary communication and expression, in their blogging and social media practices, people are contributing to instead of contesting repressive forces?” (Hardt & Negri, 2012)
<br><br>
Have you ever found yourself compulsively checking your phone at random times of the day and night? Have you ever feared missing something when not checking your feed for a long time? Have you been afraid to lose all your online followers forever? Have you ever tried to delete or suspend your social media account(s) before ultimately restoring it/them? Have you ever given up on the idea of actually living without these tools? Have you ever felt that you could be fully consenting to your own mental alienation?
<br><br>
By 2022, just over half of the world's population has access to the internet. Among them, a large majority (93.33%) also have their own social media accounts, with an average of 8 platforms per user. Among the most popular platforms, 2.9 billion users are currently registered on Facebook [Meta], 2.5 billion on YouTube [Google], 2 billion on WhatsApp [Meta], 1.3 billion on Facebook Messenger, 1.2 billion on Instagram [Meta], 1.2 billion on WeChat, 1 billion on TikTok, 740 million on LinkedIn [Microsoft], and 353 million on Twitter (Statista, 2016). While these numbers may impress, it should be noted that access to the internet, as well as the use of certain platforms, can be affected by various economic and geopolitical factors within each country or region. As a result of a lack of means, some parts of the world, such as Central Africa, do not yet have regular access to the internet. In countries such as China, Russia or Iran, governments have also created their own social platforms and messengers, or taken over existing ones, in order to apply more governmental control and censorship. In some dictatorships, such as North Korea, or during periods of conflict, such as currently in Ukraine, access to the internet can also be restricted in the shorter or longer term.
<br><br>
Nowadays, most of the leading online platforms and tech brands rely on our willingness to share information about ourselves. Given this, one of the central questions of this thesis is to understand how such companies encourage our participation in an economy that paradoxically exploits us.
<br><br>
As argued by Richard Seymour in The Twittering Machine (Seymour, 2019), it is first essential to understand that when we do such innocent actions as searching, looking, clicking, scrolling or purchasing products online, we are collectively writing to the machines.
<br><br>
“The nuance added by social industry’s platforms is that they don’t necessarily have to spy on us. They have created a machine for us to write to. The bait is that we are interacting with other people: our friends, professional colleagues, celebrities, politicians, royals, terrorists, porn actors – anyone we like. We are not interacting with them, however, but with the machine. We write to it, and it passes on the message for us, after keeping a record of the data. The machine benefits from the ‘network effect’: the more people write to it, the more benefits it can offer, until it becomes a disadvantage not to be part of it.” (Seymour, 2019)
<br><br>
Because of this invisible layer, it is mostly unconsciously or unwillingly that we participate in a social industry where platforms take the shape of giant virtual laboratories with millions, if not billions, of guinea pigs. Our participation is never forced, but we quickly find ourselves navigating interfaces that persuasively stimulate our desires, twist our emotions and keep us hooked by all means.
<br><br>
As argued by Seymour, social media platforms have become “rigged lottery systems”, giving users/gamblers an impression of constant wins and objective randomness by feeding them intermittent variable rewards. In reality, the users/gamblers always lose, but their losses, disguised as wins, stimulate compulsive play and persuasively keep them playing again and again.
<br><br>
“Something similar happens when we post a tweet or a status or an image, where we have little control over the context in which it will be seen and understood. It’s a gamble.” (Seymour, 2019)
<br><br>
As in many gambling games, the idea remains that supposedly anyone can win big almost instantly, no matter how ephemeral or artificial that fame may be. But as with many types of addiction, it is also important to consider that the real pleasure is not necessarily found in the moment of winning (if that is ever possible), but when players dislocate themselves from time and from their own bodies and find themselves numbed by the very idea of winning.
<br><br>
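The “rigged lottery” described above can be caricatured in a few lines of code. The following is a hypothetical sketch (all names, probabilities and payout ranges are invented for illustration) of an intermittent variable-reward schedule, the reinforcement pattern that slot machines and notification feeds share: most pulls pay out almost nothing, a few pay out a lot, and the player can never predict which.

```python
import random

def variable_reward_schedule(pulls, hit_probability=0.3, seed=42):
    """Simulate an intermittent variable-reward schedule: each 'pull'
    (posting, refreshing, checking notifications) unpredictably pays
    out a burst of social feedback; most pulls pay out almost nothing."""
    rng = random.Random(seed)  # fixed seed so the simulation is repeatable
    outcomes = []
    for pull in pulls:
        if rng.random() < hit_probability:
            # the rare, variable-sized jackpot of likes
            outcomes.append((pull, rng.randint(50, 500)))
        else:
            # the common near-miss: next to no feedback
            outcomes.append((pull, rng.randint(0, 2)))
    return outcomes

for pull, likes in variable_reward_schedule([f"post {i}" for i in range(8)]):
    print(f"{pull}: {likes} likes")
```

What matters psychologically is not the average payout but its unpredictability: because reward and near-miss are indistinguishable before the pull, every pull feels like it might be the jackpot.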
In some cases, we observe that the ways these interfaces work are becoming even more explicitly related to gambling mechanisms. As one of the most obvious examples, over the last decade we have witnessed platforms such as Facebook implementing “social media games” (games within social media platforms), allowing users to play for free, but also encouraging them to purchase tokens, chips and items with real money.
<br><br>
In a more subtle way, Instagram's recommendation page now also features something very similar to a slot machine. By scrolling up while at the top of the home page, the user can refresh the page and display a new set of randomly recommended content, for as long as needed. (fig.2) As it turns out, this mechanism quickly became a reflex for many users/gamblers such as myself.
<br><br>
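The pull-to-refresh gesture can be sketched in the same caricatural way. This is not Instagram's actual recommendation logic, only a minimal illustration (the pool and page size are invented) of why the gesture resembles a lever pull: each refresh discards the current page and deals a fresh random hand.

```python
import random

# a stand-in catalogue of recommendable clips (names invented)
CANDIDATE_POOL = [f"recommended clip #{n}" for n in range(100)]

def pull_to_refresh(pool=CANDIDATE_POOL, page_size=6, rng=random):
    """One downward pull: discard the current page and deal a fresh
    random hand of recommendations, structurally the same gesture,
    and the same uncertainty, as pulling a slot-machine lever."""
    return rng.sample(pool, page_size)

first_pull = pull_to_refresh()
second_pull = pull_to_refresh()  # almost certainly a different hand
```

The design choice to make the refresh manual rather than automatic is the point: the user's own hand triggers the redraw, which is exactly the embodied loop the slot-machine comparison captures.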
On these platforms, the fact that mostly “popular” content and people are highlighted in the recommendation feed actively entertains the fantasy of becoming popular or famous oneself. As Guy Debord exposed over fifty years ago, the post-war consumerist big bang gave birth to a “celebrity culture” or a “society of the spectacle” where “the spectacular representation of a living human being embodies this banality by embodying the image of a possible role” (Debord, 2014)
<br><br>
“The status of celebrity offers the promise of being showered with ‘all good things’ that capitalism has to offer. The grotesque display of celebrity lives (and deaths) is the contemporary form of the cult of personality; those ‘famous for being famous’ hold out the spectacular promise of the complete erosion of an autonomously lived life in return for an apotheosis as an image. The ideological function of celebrity (and lottery systems) is clear – like a modern ‘wheel of fortune’ the message is ‘all is luck; some are rich, some are poor, that is the way the world is... it could be you!’” (Debord, 2014)
<br><br>
Online, the obsession with visibility leads numerous people to adapt their ways of interacting and/or posting to ever-changing algorithmic logics. Within this new algorithmic governance, each update leads communities of users, influencers and content creators to recalibrate their behaviors and content to a new set of mystified parameters. On YouTube, for example, countless videos will guide you through the best ways to get your content displayed on recommendation pages, often by trying to catch viewers’ attention for as long as possible (fig.3). On Instagram, some will suggest that you post, like and comment on a daily basis or at peak traffic hours. On Spotify, numerous song-makers are even willing to collaborate with search engine optimization companies in order to know which artist, song or album names and formats stand the best chance of being highlighted by the algorithms. As a symptom of this “fame rush”, we are witnessing a narrowing of the variety of content being recommended on these platforms, often revealing recurrent patterns among the most popular content. However, making content that achieves a significant amount of visibility or feedback (likes, comments, subscriptions, etc.) remains quite unpredictable.
<br><br>
“Whatever we write has to be calibrated for social approval. Not only do we aim for conformity among our peers but, to an extent, we only pay attention to what our peers write insofar as it allows us to write something in reply, for the ‘likes’. Perhaps this is what, among other things, gives rise to what is often derided as ‘virtue-signalling’, not to mention the ferocious rows, overreactions, wounded amour propre and grandstanding that often characterize social industry communities.” (Seymour, 2019)
<br><br>
By extension, the ways in which users consume content, navigate through interfaces or interact with them also become subject to the same limitations. Users' behaviors are industrially automated in order to keep them almost passively stuck in the loop of watching, scrolling, swiping and liking for as long as possible.
As the exchange currency for their labor, users are rewarded with likes/hearts, comments/responses and followers/subscribers/fans/friends, symbolized by stimulating visual feedback such as hearts, thumbs up, stars and stats. In order to work, these visual stimuli play on our psychological vulnerabilities by feeding our instinctive need for social validation and self-display, and by extension creating the fantasy of an accessible and quantifiable fame.
<br><br>
Since its first appearance on the video sharing site Vimeo (2005), its popularization on Facebook (2009) and its subsequent implementation on most other online social media platforms, the “like button” has become one of the main vectors of most social platforms' economies. As the ultimate symbol of popularity and approval, the like button became one of the main reasons for Facebook's long-term success, by giving users a truly motivating reason to log in, be active, interact and come back. In 2015, when the “like button” started to feel too limited for its users, Facebook implemented additional interaction buttons in the form of emojis (heart, sad, happy, angry, supportive, surprised). This update would not only respond to user demand but also allow the company to know more accurately how users react to one content or another, and to sell such information to third parties.
<br><br>
As argued by Adam Alter, author of the book Irresistible (Alter, 2017), another determining factor in the way we consume online content and become addicted is the absence of stopping cues in the content feed. With more traditional media such as newspapers, books, television or radio, content usually has a transition or a stopping point naturally leading readers/viewers/listeners to do something else. However, on many 21st century social media and streaming/video platforms (such as Netflix, Youtube, TikTok, Snapchat), auto-play is either enabled by default (Youtube, Netflix) or part of the entire concept of the platform (TikTok, Instagram). By doing so, interface designers appeal to another human cognitive bias commonly referred to as the ‘default choice’, putting users in the middle of an endless stream of recommended content and encouraging their passivity.
<br><br>
“In August 2012, Netflix introduced a subtle new feature called “post-play.” With post-play, a thirteen-episode season of Breaking Bad became a single, thirteen-hour film. As one episode ended, the Netflix player automatically loaded the next one, which began playing five seconds later. If the previous episode left you with a cliffhanger, all you had to do was sit still as the next episode began and the cliffhanger resolved itself. Before August 2012 you had to decide to watch the next episode; now you had to decide to not watch the next episode.” (Alter, 2017)
<br><br>
Stuck inside these feedback loops, it is no longer a surprise that such patterns can lead some users to digital overuse, to different degrees of addiction, and eventually to mental health issues. While the links between social media addiction and mental illness are still being intensely studied and debated, symptoms such as depression, anxiety, bipolarity, eating disorders or attention deficits appear more and more clearly linked to problematic social media use. In 2022, a research paper in the Journal of Affective Disorders (Mahalingham et al., 2022)
studied the relationship between social media use and psychological distress. Ultimately the research found that heavy social media use may have problematic mental health consequences for those who experience difficulties with attention control. Among the symptoms experienced by these subjects, it was noticed that they were more susceptible to forming unattainable ideals and experiencing exacerbated feelings of depression or anxiety.
<br><br>
As early evidence of the toxicity of these tools, it is worth recalling that some of the creators of these tools and platforms often prevent their own children from using them, as if to keep them from becoming as addicted as most consumers.
<br><br>
“For ninety minutes, Jobs explained why the iPad was the best way to look at photos, listen to music, take classes on iTunes U, browse Facebook, play games, and navigate thousands of apps. He believed everyone should own an iPad. But he refused to let his kids use the device. It seemed as if the people producing tech products were following the cardinal rule of drug dealing: never get high on your own supply.” (Alter, 2017)
<br><br>
In other cases, some of the earliest designers of these tools have themselves chosen to denounce the harmful effects of their creations.
<br><br>
“Sean Parker, the Virginia-born billionaire hacker and inventor of the file-sharing site Napster, was an early investor in Facebook and the company’s first president. Now he’s a ‘conscientious objector’. Social media platforms, he explains, rely on a ‘social validation feedback loop’ to ensure that they monopolize as much of the user’s time as possible. This is ‘exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology’.” (Seymour, 2019)
<br><br>
Work on transition here
<br><br>
Faced with this observation, several key questions arise: How far are big tech companies willing to go in order to make us addicted and engaged? Where could this business model extend to? What are some examples of surveillance-capitalist practices that apply to the physical world? How does the promotion of self-tracking practices allow tech giants to gather and sell even more personal information about their users?
<br><br><br><br>
<b>Chapter 3: The datafication of the world (to be continued)</b>
<br><br>
Have you ever felt comforted by the possibility of checking your daily step count, your heart rate and/or your followers on a daily basis? Have you ever adapted your actions/behavior on behalf of this data? Have you ever felt like your body was a device?
Have you ever wondered who could benefit from this kind of information, and for what purpose?
<br><br>
Thanks to the gigantic profit made from the massification and marketization of our data online, tech giants such as Google, Amazon or Facebook have been able to secure their dominant position, become part of every aspect of our daily lives and commodify the last remains of our existences. In this hyperconnected reality, the Internet has become more than something exclusive to digital platforms: it can now also be found in our physical environments, inside our cities, our cars, our homes, our bathrooms, around our wrists, in our pockets, next to our beds. In other words, the Internet is everywhere and has created a datafied society where we have become the users of our own bodies and of our own physical environments.
<br><br>
“Especially for the younger generation, the Internet is not some standalone, separate domain where a few of life’s functions are carried out. It is not merely our post office and our telephone. Rather, it is the epicenter of our world, the place where virtually everything is done. It is where friends are made, where books and films are chosen, where political activism is organized, where the most private data is created and stored. It is where we develop and express our very personality and sense of self.”
— (Greenwald, 2015)
<br><br>
In the past two decades, we have witnessed a long-term effort from the GAFAM to extend their practices into an ever wider array of contexts, such as satellite and street photography (Google Earth, Street View), geolocation systems, simulated three-dimensional environments (augmented reality, virtual reality, metaverse) or extensions of our bodies, homes and cities (wearable devices, vocal assistants, smart cities). Slowly, these technologies have created a world where the distinctions between online and offline spaces, private and public information, physical and virtual environments, or users and humans are increasingly blurred. A world where the technologies originally created and used in the military context of surveillance are now fully devoted to capitalist purposes. A world where more than half of the population owns a smartphone and can be tracked in all circumstances.
<br><br>
The individual finds themselves permanently communicating, interfacing and engaging with technological devices (Couldry & Powell, 2014).
<br><br>
More than ten years ago, animated billboards or “smart boards” gradually replaced traditional billboards in many streets, highways, metro stations and train stations in Western countries. But soon enough, some people realized that these devices were in fact often equipped with sensors, cameras and/or microphones allowing some companies to collect, study and resell a maximum of biometric data for advertising purposes. With the help of these screens, it would be possible to determine whether a subject is located in front of the screen, whether they are looking at it and for how long, but also to speculate on their age, gender, physical shape, the color of their clothes, the brand of their car, etc.
<br><br>
Based on our location in space, some shops and retailers can also send notifications and shopping promotions when their registered consumers pass next to a physical shop.
To be continued
<br><br><br><br>
==<p style="font-family:helvetica">Journal (1st draft)</p>==
<br><br>
My name is Martin (he), I’m a 26-year-old student and freelance graphic designer from France, currently living in the Netherlands. My interests are focused on art, design, music, video games, geopolitics and cinema. As part of my degree, I am writing a thesis on the effects of an economic model based on the extraction and resale of human personal data. As a complement to my research, my wish is to engage in an introspective exercise relating to my own experiences with digital tools, but also to interview a few of my friends.
<br><br>
In my case, I find it interesting to note that my writing work can often be disturbed by the very attention mechanisms that I am trying to study. Thus, on a regular basis, the course of my thoughts can be interrupted by a desire to glance at my social platform feeds, as if out of a desire to take a break, to be distracted, or to put myself in a situation where my brain won’t be as solicited as when I write.
<br><br>
A few years ago, I decided to disable the push notifications linked to most of my accounts (except mail/messengers) in order to try to visit them less often. Although this was at first relatively effective, the opposite effect started to happen with time. As if out of a fear of missing out on something, the fact that I simply could no longer know whether someone had sent me a message, subscribed to my account or published something just led me to check my accounts even more frequently.
<br><br>
Youtube, Facebook and Instagram are the three platforms that I undeniably use the most. However, my motivations for using them, as well as my ways of using them, remain quite different, in terms of the frequency of my visits, the time spent and the degree of interaction.
<br><br>
In 10 years of use, Youtube remains the platform I have used for the longest time. As a spectator, my use isn’t motivated by a desire to upload content or to post comments, but more simply to access information or entertainment. My subscriptions are currently limited to 78, among which are a majority of information channels and a smaller number of channels related to video games, cinema or art/design. In contrast to some other platforms/tools, my use of Youtube does not have a particularly guilt-inducing effect on me and does not exacerbate any desire for comparison or visibility. However, the vast majority of the videos I watch do not come from my own subscriptions, or even from custom searches, but mostly from what the algorithms simply recommend me to watch. Because of this randomness, I also find myself encouraged to come back to the recommendation page very often, or to refresh the page until I find something interesting to watch. Youtube mostly suggests content related to the interests I listed earlier, such as art, design, music, video games, geopolitics and cinema. For some time now, I have also been in the habit of clicking on “not interested” or “do not recommend this channel” when the algorithms display content that seems irrelevant to me. As it turns out, recommended content can now be filtered by users in order to teach the algorithm a bit more about ourselves.
<br><br>
As for most of its early users, my use of Facebook is now very limited, leaving much more space for an older generation to take over the platform. Because of my almost nonexistent degree of interaction/engagement with other users, Facebook randomly sends me notifications such as “You have a new friend suggestion”, “This person shared a link”, “This person shared an event that might interest you”, or “You might like this page” from time to time. Among all platforms, Facebook now feels to me like the most useless social media, but as it turns out, I still find myself compulsively opening and closing the page with a remarkable rapidity and consistency pretty much every day.
<br><br>
Among all the platforms I use, Instagram remains the one that most obviously shows undesirable side effects. Instagram is a photo and video sharing social network that allows users to upload content to audiences. Today the platform perfectly represents the society of the spectacle that Guy Debord described 50 years ago: a society where everyone becomes the entrepreneur of their own “fame” and tries to live the life of people who are seemingly more famous than them. As a matter of fact, the platform is highly used among young creators, businesses and graphic designers as a way of promoting their work and networking with other people/clients/designers. As a result, Instagram has become increasingly important to me while undeniably affecting my way of designing, showcasing my work and interacting with others. From being a graphic designer essentially creating fixed images, my practice has progressively shifted towards motion design (animated images), a popular format on the platform. In the same way, my rather small but growing audience of followers has led me to feel more and more pressure to publish something. As a consequence, I have been showing less and less content over time, with an average of 3 posts/year over these past 3 years. While my use of Instagram could be judged as quite healthy, it remains important to understand that this tool has reached an importance in my mind that no other platform has ever achieved, by providing me a relatively free and mainstream space of expression.
<br><br>
To be continued and to reformulate
 
<b>Reference list</b>

*Alter, A., 2017. Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, 1st edition, London: Penguin Press.

*Crary, J., 2001. Suspensions of Perception: Attention, Spectacle, and Modern Culture, Cambridge, MA: The MIT Press.

*Couldry, N., Powell, A., 2014. Big Data from the bottom up, Big Data & Society [e-journal]. Available through: https://journals.sagepub.com [Accessed 11 March 2022].

*Crawford, K., Lingel, J., Karppi, T., 2015. Our metrics, ourselves: A hundred years of self-tracking from the weight scale to the wrist wearable device, European Journal of Cultural Studies [e-journal]. Available through: https://www.dhi.ac.uk [Accessed 11 March 2022].

*Debord, G., 2014. The Society of the Spectacle, 4th edition, Berkeley, CA: Bureau of Public Secrets.

*Greenwald, G., 2015. No Place to Hide: Edward Snowden, the NSA, and the U.S. Surveillance State. Re-publication, New York: Picador.

*Hardt, M., Negri, A., 2012. Declaration, 1st ed., New York: Argo Navis.

*Kholeif, O., 2018. Goodbye, World! — Looking at Art in the Digital Age. 1st ed., Berlin: Sternberg Press, pp. 183-184.

*Mahalingham, T., Howell, J. and Clarke, P., 2022. Attention control moderates the relationship between social media use and psychological distress, Journal of Affective Disorders [e-journal]. Available through: https://www.sciencedirect.com [Accessed 13 March 2022].

*Morozov, E., 2019. Capitalism’s New Clothes. The Baffler [e-journal]. Available through: https://thebaffler.com [Accessed 11 March 2022].

*Seymour, R., 2019. The Twittering Machine. 1st ed. London: The Indigo Press.

*Slotnik, D., 2021. Whistle-Blower Unites Democrats and Republicans in Calling for Regulation of Facebook. New York Times [e-journal]. Available through: https://www.nytimes.com/ [Accessed 11 March 2022].

*Spotify, 2021. The Wait Is Over. Your Spotify 2021 Wrapped Is Here. [online] Available at: https://newsroom.spotify.com/2021-12-01/the-wait-is-over-your-spotify-2021-wrapped-is-here/

*Stewart, J., 2016. Facebook Has 50 Minutes of Your Time Each Day. It Wants More. New York Times [e-journal]. Available through: https://www.nytimes.com/ [Accessed 11 March 2022].

*Wu, T., 2016. The Attention Merchants: The Epic Scramble to Get Inside Our Heads. 1st edition. New York: Knopf.
 
</div>

Latest revision as of 20:53, 26 February 2024

Links

Draft Thesis

What do you want to make?

My project is a data collection installation that monitors people's behaviors in public physical spaces while explicitly encouraging them to help the algorithm collect more information. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.

The way the device is designed doesn’t pretend to give any beneficial outcome to the subject; it only makes visible the benefits that the machine gets from collecting their data. Yet, the way the device visually or verbally presents this collected data is gratifying, which might be stimulating for the subject. In that sense, the subject, despite knowing that their actions are done solely to satisfy the device, could become intrigued, involved, or even addicted to a mechanism that deliberately uses them as a commodity. In that way, I intend to trigger conflicting feelings in the visitor’s mind, situated between a state of awareness regarding the ongoing monetization of their physical behaviors, and a state of engagement/entertainment/stimulation regarding the interactive value of the installation.

My first desire is to make the mechanisms by which data collection is carried out, marketized and legitimized both understandable and accessible. The array of sensors, the Arduinos and the screen are the main technological components of this installation. Rather than using an already existing and complex tracking algorithm, the program is built from scratch, kept open source and limits itself to the conversion of a restricted range of physical actions into interactions. These include the detection of movements, positions, the lapse of time spent standing still or moving, and entry into or exit from a specific area of detection. Optionally, they may also include the detection of the subject’s smartphone or the subject logging on to a local Wi-Fi hotspot.

In terms of mechanics, the algorithm creates feedback loops starting from:
_the subject’s behaviors being converted into information;
_the translation of this information into written/visual feedback;
_and the effects of this feedback on the subject’s behavior; and so on.
By doing so, it tries to shape the visitors as free data providers inside their own physical environment, and to stimulate their engagement by converting each piece of collected information into points/money, feeding a user score among a global ranking.
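As a rough sketch, one pass of this loop, from detected behavior to stored information, points and a written feedback line, could look like the following. All event names and point values here are hypothetical placeholders, not a fixed design:

```javascript
// Hypothetical point values per detected event type (placeholders).
const POINTS = {
  "entered the space": 50,
  "left the space": 10,
  "moving": 5,
  "not moving": 5,
  "device detected": 100, // optional smartphone detection
};

function createSession(subjectId) {
  return { subjectId, events: [], points: 0 };
}

// One pass of the loop: behavior -> stored information -> points -> feedback.
function recordEvent(session, type) {
  const earned = POINTS[type] || 1; // unknown events still earn a token point
  session.events.push({ type, earned });
  session.points += earned;
  return `subject [${session.subjectId}] ${type} [${earned} points earned/given]`;
}

const session = createSession("01");
recordEvent(session, "entered the space");
recordEvent(session, "moving");
// session.points is now 55; the returned strings feed the on-screen display.
```

The returned strings are what would end up on the screen, while the growing `events` array is the data bank feeding the score and ranking.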

On the screen, displayed events can be:

_ “subject [] currently located at [ ]”
[x points earned/given]
_ “subject [] entered the space”
[x points earned/given]
_ “subject [] left the space”
[x points earned/given]
_ “subject [] moving/not moving”
[x points earned/given]
_ “subject [] distance to screen: [ ] cm”
[x points earned/given]
_ “subject [] stayed at [ ] since [ ] seconds”
[x points earned/given]
_ “subject [] device detected”
[x points earned/given] (optional)
_ “subject [] logged onto local Wi-Fi”
[x points earned/given] (optional)

Added to that come the instructions and comments from the device in reaction to the subject’s behaviors:

_ “Congratulations, you have now given the monitor 25 % of all possible data to collect!”
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”
[if the subject stands still in a specific location]
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”
[unlocked at x points earned/given]
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”
[if the subject stands still in a specific location]
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you!”
[if the subject is at the edge of the detection range]
_ “You are only 93860 pieces of information away from being the top one data-giver!”
[unlocked at x points earned/given]
_ “Statistics show that people staying for more than 5 minutes will benefit me on average 10 times more!”
[randomly appears]
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”
[if the subject stands still for a long time in any location]

Responding positively to the monitor’s instructions unlocks special achievements and extra points:

—Accidental data-giver badge
[unlocked if the subject has passed by the installation without deliberately wishing to interact with it] + [x points earned/given]
—Lazy data-giver badge
[unlocked if the subject has been standing still for at least one minute] + [x points earned/given]
—Novice data-giver badge
[unlocked if the subject has successfully completed 5 missions from the monitor] + [x points earned/given]
—Hyperactive data-giver badge
[unlocked if the subject has never stood still for 10 seconds within a 2-minute lapse of time] + [x points earned/given]
—Expert data-giver badge
[unlocked if the subject has successfully completed 10 missions from the monitor within 10 minutes] + [x points earned/given]
—Master data-giver badge
[unlocked if the subject has successfully logged on to the local Wi-Fi hotspot] + [x points earned/given] (optional)
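A minimal sketch of how these unlock conditions could be checked against a subject's session statistics; the stat field names (`stillSeconds`, `missions`, `elapsedMinutes`, etc.) are assumptions made for illustration:

```javascript
// Badge rules mirroring the bracketed conditions above.
// stats fields (assumed): accidental — passed by without intending to interact;
// stillSeconds — longest time spent standing still; missions — completed
// monitor missions; elapsedMinutes — time spent in the detection zone;
// wifiLogin — logged on to the local hotspot (optional).
function unlockedBadges(stats) {
  const badges = [];
  if (stats.accidental) badges.push("Accidental data-giver");
  if (stats.stillSeconds >= 60) badges.push("Lazy data-giver");
  if (stats.missions >= 5) badges.push("Novice data-giver");
  if (stats.elapsedMinutes >= 2 && stats.stillSeconds < 10)
    badges.push("Hyperactive data-giver");
  if (stats.missions >= 10 && stats.elapsedMinutes <= 10)
    badges.push("Expert data-giver");
  if (stats.wifiLogin) badges.push("Master data-giver"); // optional
  return badges;
}
```

For the display, the last (most demanding) badge in the returned list could serve as the subject's current status.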

On the top left side of the screen, a user score displays the number of points generated by the collected pieces of information, and the unlocking of special achievements instructed by the monitor.

—Given information: 298 pieces
[displays number of collected events]
—Points: 312000
[conversion of collected events and achievement into points]

On the top right of the screen, the user is ranked among x number of previous visitors, and the prestigious badge recently earned is displayed below:

—subject global ranking: 3/42
[compares subject’s score to all final scores from previous subjects]
—subject status: expert data-giver
[displays the most valuable reward unlocked by the subject]

When leaving the detection range, the subject gets a warning message and a countdown starts, encouraging them to quickly decide to come back:

—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”
[displayed as long as the subject remains completely undetected]

If the subject remains outside the detection range for more than 5 seconds, the monitor also displays a thankful message, and the amount of money gathered, the achievements, the ranking, the complete list of collected information and a QR code are printed as a receipt with the help of a thermal printer. The QR code links to my thesis.

— “Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”
[displayed after 5 seconds being undetected]

In order to collect, read and/or use that piece of information, the visitor will inevitably have to come back within the range of detection and, intentionally or not, reactivate the data tracking game. It is therefore impossible to leave the area of detection without leaving at least one piece of one’s own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets give future subjects a precise trace of the behavior and actions of previous subjects.
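The text of such a receipt could be assembled from the finished session roughly as follows. The field names and layout are hypothetical, and the QR code itself would be rendered by the thermal printer's own commands rather than by this sketch:

```javascript
// Assemble the printable receipt text from a finished session (sketch only).
function buildReceipt(session) {
  const lines = [
    "Thank you for helping today!",
    `Given information: ${session.events.length} pieces`,
    `Points: ${session.points}`,
    `Ranking: ${session.rank}/${session.totalVisitors}`,
    `Status: ${session.badge}`,
    "[QR code linking to the thesis]",
  ];
  // Append the complete list of collected information, one line per event.
  session.events.forEach((e) => lines.push(`- ${e.type}`));
  return lines.join("\n");
}

const receipt = buildReceipt({
  events: [{ type: "entered the space" }, { type: "moving" }],
  points: 312000,
  rank: 3,
  totalVisitors: 42,
  badge: "Expert data-giver",
});
// receipt now holds the full text to send to the thermal printer.
```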

Why do you want to make it?

When browsing online and/or using connected devices in the physical world, even the most innocent action/information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques as implicitly political, playing on dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.

In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as the revelations of Edward Snowden, the Cambridge Analytica scandal or the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals/groups/governments/associations have taken legal action, surveillance capitalism still remains generally accepted, often because it is ignored and/or misunderstood.

Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have also been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual push from the same companies to monitor the physical world and our human existences in a wide array of contexts: for example, with satellite and street photography (Google Earth, Street View), geolocation systems, simulated three-dimensional environments (augmented reality, virtual reality or the metaverse) or extensions of our brains and bodies (vocal assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data sharing apps, which legitimize and encourage the datafication of the body for capitalist purposes.

For the last 15 years, self-tracking tools have made their way to consumers. I believe that this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. Indeed, to adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment or even active engagement with the tools/platforms through which this surveillance is operated/allowed. By doing so, I want to ask how these companies still manage to obtain our consent and what human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effects of gamification, gambling and reward systems, as well as the aestheticization of data/self-data, as means to hook our attention, create ever more interactions and orient our behaviors.


How do you plan to make it and on what timetable?

I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.

How does it work?

The ultrasonic sensors can detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound and receiving it back. The Arduino Uno/Mega boards are microcontrollers which can receive this information and run it through a program in order to convert these values into mm/cm/m, but also to map the space into an invisible grid. Ultimately, the values collected on the Arduino’s serial monitor can be sent to P5.js through p5.serialcontrol. P5.js then allows a greater freedom in the way the information can be displayed on the screens.
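As an illustration of the conversion step, assuming an HC-SR04-type sensor that reports the round-trip echo time in microseconds, the distance and grid-cell mapping could be computed as follows (the 10 cm cell size is an arbitrary example, not the installation's actual grid):

```javascript
// Sound travels at roughly 343 m/s, i.e. about 0.0343 cm per microsecond.
// The echo time covers the distance twice (out and back), hence the /2.
function echoToCm(durationMicroseconds) {
  return (durationMicroseconds * 0.0343) / 2;
}

// Map a distance onto a cell of the "invisible grid" (10 cm cells assumed).
function gridCell(distanceCm, cellSizeCm = 10) {
  return Math.floor(distanceCm / cellSizeCm);
}

// An echo of ~5830 µs corresponds to an obstacle about 100 cm away.
```

On the real installation this arithmetic would run on the Arduino side in C, with the resulting cm values streamed over serial to P5.js; the logic is identical.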

Process:

1st semester: Building a monitoring device, converting human actions into events, and events into visual feedback

During the first semester, I am focused on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and simplest prototype and gradually increase its scale/technicality until reaching a human/architectural scale. Prototypes are subject to testing, documentation and comments helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate screen monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated three-dimensional simulations, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop from the human behaviors being converted into information; the translation of this information into written/visual feedback; and the effects of this feedback on human behavior; and so on.

2nd semester: Implementing gamification with the help of collaborative filtering, a point system and ranking.

During the second semester, it is all about building and implementing a narration, with the help of gaming mechanics, that will encourage humans to willingly feed the data gathering device. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.

To summarize the storyline, the subject positioned in the detection zone finds themselves unwillingly cast as the main actor of a data collection game. Their mere presence generates a number of points/dollars displayed on a screen, growing as they stay within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data-collector. This can be done by setting clear goals/rewards for the subject, comparing their performance to that of all the previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.

The mechanism is based on a sample of physical events that have already been explored in the first semester of prototyping (detection of movements, positions, lapse of time spent standing still or moving, and entry into or exit from a specific area of detection). Every single event detected in this installation is stored in a data bank and, with the help of collaborative filtering, will allow the display of custom recommendations such as:

_ “Congratulations, you have now given the monitor 12 % of all possible data to collect”
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot”
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers”
_ “Leaving already? The monitor has yet 304759 crucial pieces of information to collect from you”
_ “You are only 93860 actions away from being the top one data-giver”
_ “Statistics show that people staying for more than 5 minutes will benefit me on average 10 times more”
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”
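One loose way to produce such recommendations (closer to a simple popularity comparison than to full collaborative filtering) is to compare the current subject's position with the data bank of previous visitors and nudge them toward the least-documented spot. The data shapes and spot names below are assumptions for illustration:

```javascript
// Data bank (assumed shape): how many previous visitors were recorded
// in each spot of the invisible grid.
function leastDocumentedSpot(dataBank) {
  // Return the spot with the fewest recorded visitors.
  return Object.keys(dataBank).reduce((a, b) =>
    dataBank[a] <= dataBank[b] ? a : b
  );
}

// Build a nudge message of the kind listed above.
function recommendMove(dataBank) {
  const spot = leastDocumentedSpot(dataBank);
  return (
    `Are you sure you don't want to move to the ${spot}? ` +
    `The monitor has only collected data from ${dataBank[spot]} visitors so far in this spot`
  );
}

const bank = { left: 3, center: 27, right: 12 };
// recommendMove(bank) nudges the subject toward "left", the least documented spot.
```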


The guideline is set out here, but will be constantly updated with the help of experiments and the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio) and will allow anyone who deliberately wishes to interact with it to do so. Beyond the voluntary interactions, my interest is also in seeing what can be extracted from people simply passing in front of the installation. In addition to this, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral and organized experiences with the participants (e.g. 15 February 2022 with Louisa).

This semester will also include the creation of a definitive set of illustrations intended to engage the participants of the installation in a more emotional way. The illustrations will be made by an illustrator/designer with whom I usually collaborate.

3rd semester: build the final installation for the final assessment and graduation show. Test runs, debugging and final touches.

During the third semester, the installation should be set up in the school, in the alumni area next to the XPUB studio, for the final assessment, and ultimately installed again at WORM for the graduation show. I am interested in placing this installation in privileged spaces of human circulation (such as hallways) that would more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narration, mechanics, illustrations and graphic aspects should be finalized at the beginning of the 3rd semester and subjected to intense test runs throughout that period until the deadline.

Relation to larger context

As GAFAM companies face more and more legal issues and are held accountable for a growing number of social and political issues around the world, the pandemic has made all of us more dependent than ever on the online services these companies provide, and has in some ways forced our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made new technologies even more necessary for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In many countries, from one day to the next and for an undetermined time, it became necessary to carry a smartphone (or a printed QR code) in order to access transport, entertainment, cultural and catering services, but also to do simple things such as look at the menu in a bar or restaurant or place an order. This project thus takes place in a context where techno-surveillance has taken a determining place in how we access spaces and services in the physical world.

Data Marketisation / Self Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics


Relation to previous practice?

During my previous studies in graphic design, I first engaged with new media by making a small online reissue of Raymond Queneau’s book Exercices de Style. In this issue, called Incidences Médiatiques (2017), the user/reader was encouraged to explore the 99 different versions of the same story written by the author in a less linear way. The idea was to consider each user’s graphical interface as a unique reading context: it would determine which story could be read, depending on the device used, and the user could navigate through these stories by resizing the browser window, changing browsers or switching to a different device.

As part of my graduation project Media Spaces (2019), I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a web-to-print website. Subsequently, this website was translated into physical space as a printed book and a series of installations displayed in an exhibition space that followed the online structure of my thesis (home page, index, parts 1-2-3-4). In that way, I was interested in inviting visitors to physically experience some aspects of the Web.

As a first-year student of Experimental Publishing, I continued to work in that direction, creating a meta-website called Tense (2020) that displays the invisible HTML <meta> tags inside an essay in order to affect our interpretation of the text. In 2021, I worked on a geocaching pinball game highlighting invisible Web events, and on a Web oscillator whose amplitude and frequency range were directly related to the user’s cursor position and browser screen size.

While it has always been clear to me that these works were motivated by the desire to define media as context, subject and/or content, the projects presented here have often made use of surveillance tools to detect and translate user information into feedback, participating in the construction of an individualized narrative and/or a unique viewing/listening context (interaction, screen size, browser, mouse position). The current work aims to take a critical look at the effect of these practices in the context of techno-surveillance.

Similarly, projects such as Media Spaces have sought to explore the growing confusion between human and web user, physical and virtual space, online and offline. This project will demonstrate that these growing confusions may eventually lead us to be tracked in all circumstances, even in our most innocuous daily activities.


Selected References

Works:

« invites us to take on the role of an auditor, tasked with addressing the biases in a speculative AI » — an alternative to techno-surveillance

Exposes humans as producers of useful intellectual labor that benefits the tech giants, and the uses that can be made of that labor.

Claims that technological devices can be manipulated easily and are hence fallible and subjective. They do this by simply placing a self-tracker (connected bracelet) in another context, such as on other objects, in order to confuse these devices.

Allows galleries to enjoy encrypted internet access and communications, through a Tor Network

You are rewarded for exploring all the interactive possibilities of your mouse, revealing how our online behaviors can be monitored and interpreted by machines.

A portrait of the viewer is drawn in real time by active words, which appear automatically to fill his or her silhouette. https://www.lozano-hemmer.com/third_person.php

«Every visitor to the website’s browser size, collected, and played back sequentially, ending with your own.»

Readings of the building and its contents are therefore always unique -- no two visitors share the same experience. https://haque.co.uk/work/mischievous-museum/

Books & Articles:

  • SHOSHANA ZUBOFF, The Age of Surveillance Capitalism (2020)

Warns against the shift towards «surveillance capitalism». Her thesis argues that, by appropriating our personal data, the digital giants manipulate us and modify our behavior, attacking our free will and threatening our freedoms and personal sovereignty.

  • EVGENY MOROZOV, Capitalism’s New Clothes (2019)

An extensive analysis and critique of Shoshana Zuboff’s research and publications.

  • BYRON REEVES AND CLIFFORD NASS, The Media Equation, How People Treat Computers, Television, and New Media Like Real People and Places (1996)

A precursor study of the relation between humans and machines, and of how humans relate to them.

  • OMAR KHOLEIF, Goodbye, World! — Looking at Art in the digital Age (2018)

The author shares his own data as a journal in one part of the book, while in another part he questions how the Internet has changed the way we perceive, relate to and interact with images.

  • KATRIN FRITSCH, Towards an emancipatory understanding of widespread datafication (2018)

Suggests that, in response to our society of surveillance, artists can propose activist responses that do not necessarily involve technological literacy, but instead promote strong counter-metaphors and/or counter-uses of these intrusive technologies.

Prototyping

Arduino

An early sketch comparing and questioning our spectator experience of a physical exhibition space (where everything is often fixed and institutionalized) with our user experience of a Web space (where everything is far more elastic, unpredictable and obsolete). I’m interested in how differently the same Web page can be rendered for different users depending on technological context (device, browser, IP address, screen size, zoom level, default settings, updates, luminosity, add-ons, restrictions, etc.). I would like to create a physical exhibition space/installation inspired by the technology of a Web user’s window interface, in order to play with exhibition parameters such as the distance between the spectator and the artwork, the circulation in space, the luminosity/lighting of the artwork(s), the sound/acoustics, etc.

The distance between the wall behind the spectator and the artwork has to be translated into a variable that can affect sound or light in the room. The wall position could be connected to the dimensions of a user interface in real time with an Arduino and a motor.
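To illustrate the translation step, here is a small off-board sketch re-implementing Arduino's constrain()/map() arithmetic; the 30-300 cm wall range and the 8-bit brightness output are assumptions for the example:

```cpp
#include <cassert>

// Clamp a value into [lo, hi], like Arduino's constrain().
long constrainValue(long x, long lo, long hi) {
    return x < lo ? lo : (x > hi ? hi : x);
}

// Linearly rescale x from [inLo, inHi] to [outLo, outHi], like Arduino's map().
long mapRange(long x, long inLo, long inHi, long outLo, long outHi) {
    return (x - inLo) * (outHi - outLo) / (inHi - inLo) + outLo;
}

// Translate a wall distance in cm (assumed usable range 30-300 cm)
// into an 8-bit brightness value for a light in the room.
long distanceToBrightness(long cm) {
    cm = constrainValue(cm, 30, 300);
    return mapRange(cm, 30, 300, 0, 255);
}
```

On the board, the result would go straight into `analogWrite()` for a light, or into a tone/volume parameter for sound.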

Create a connected telemeter with an Arduino, an ultrasonic sensor (HC-SR04) and an ESP8266 module connected to the Internet.

It seems possible to create your own telemeter with an Arduino by implementing an HC-SR04 ultrasonic sensor.
By doing so, the values captured by the sensor could potentially be used directly as a variable.
Then, with the ESP8266 module, the values could be pushed to a database on the Internet. I could then visit that website, see the values from anywhere, and use them to control light, sound or anything else I wish.
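The core of such a telemeter is the conversion from echo time to distance. A minimal, testable sketch of that arithmetic (the same formula the prototypes below use):

```cpp
#include <cassert>

// The HC-SR04 reports the round-trip time of an ultrasonic pulse in
// microseconds. Sound travels roughly 29.1 us per centimeter, so we
// halve the round trip (out and back) and divide by 29.1 to get
// the one-way distance in centimeters.
int echoToCentimeters(long durationMicros) {
    return (durationMicros / 2) / 29.1;
}
```

On the Arduino, `durationMicros` would come from `pulseIn(echoPin, HIGH)`.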

Tool/Material list:

  • Telemeter (used to get the distance between the device and an obstacle)
  • Rails
  • Handles
  • Wheels
  • Movable light wall
  • Fixed walls
  • USB Cable
  • Connexion cables
  • Arduino
  • ESP8266
Connexion cables (Arduino)
USB Cable
Arduino
HC-SR04 Ultrasonic Sensor
Plywood x 3
Handle
ESP8266
Rail


About the ultrasonic Sensor (HC-SR04)

Characteristics

Here are a few of the technical characteristics of the HC-SR04 ultrasonic sensor:

  • Power supply: 5v.
  • Consumption in use: 15 mA.
  • Distance range: 2 cm to 5 m.
  • Resolution or accuracy: 3 mm.
  • Measuring angle: < 15°.

Ref: more info about the sensor here and here

Where to buy the ultrasonic Sensor (HC-SR04)

Prototype 1 : Arduino + Resistor

During a workshop, we started with a very basic Arduino-clone kit, an LED, a motor and a sensor. After making a few connections, we began to understand a bit how it works.


   #include <Servo.h>
   Servo myservo;  // create servo object to control a servo
   int pos = 0;    // variable to store the servo position
   int ldr = 0;    // variable to store light intensity
   void setup() {
   Serial.begin(9600); // begin serial communication, NOTE: set the same baudrate in the serial monitor/plotter
   myservo.attach(D7);  // attaches the servo on pin D7 to the servo object
   }
   void loop() {
   // let's put the LDR value in a variable we can reuse
   ldr = analogRead(A0);
   
   // the value of the LDR is between 400-900 at the moment
   // the servo can only go from 0-180
   // so we need to translate 400-900 to 0-180
   // also the LDR value might change depending on the light of day
   // so we need to 'constrain' the value to a certain range
   ldr = constrain(ldr, 400, 900); 
   // now we can translate
   ldr = map(ldr, 400, 900, 0, 180);
   // let's print the mapped LDR value to the serial monitor to see if we did a good job
   Serial.println(ldr);
   // now we can move the servo according to the light/our hand!
   myservo.write(ldr);      // tell servo to go to the position in variable 'ldr'
   delay(15);    
   }


How to make an engine work
credits: Dennis de Bel
How to make a sensor work
Credits: Dennis de Bel
How to make both sensor and engine work together
Credits: Dennis de Bel
Sensortest during workshop

Split Screen Arduino + Sensor + Serial Plotter + Responsive Space

Trying here to show the simultaneous responses between the sensor, the values and the simulation.

Splitscreen Arduino + Sensor + Serial Plotter + Responsive Space

Prototype 2: Arduino + Ultrasonic sensor

For this very simple first sketch, and for later ones, I will include the NewPing library, which greatly improves the ultrasonic sensor's capabilities.

Sketch 1: Arduino Uno + Sensor

 #include <NewPing.h> 
 int echoPin = 10;
 int trigPin = 9;
 
 NewPing MySensor(trigPin, echoPin); // create a NewPing object for the sensor
 
 void setup() {
   Serial.begin(9600);
 }
 
 void loop() {
  int duration = MySensor.ping_median(); // echo time in microseconds (median of several pings)
  int distance = MySensor.convert_cm(duration); // convert to centimeters (convert_in would give inches)
 
  Serial.print(distance);
  Serial.println("cm");
  delay(250);
 }

Prototype 3: Arduino Uno + Sensor + LCD (+ LED)

All together from https://www.youtube.com/watch?v=GOwB57UilhQ

Sketch 2: Arduino Uno + Sensor + LCD
Sketch 3: Arduino Uno + Sensor + LCD + LED

  #include <LiquidCrystal.h>
  
  LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration; // echo travel time in microseconds
 int distance;  // distance in centimeters
 
 void setup() {
     analogWrite(6,100);       // set LCD backlight brightness via pin 6
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // sets the trigPin as an output
     pinMode(echoPin, INPUT);  // sets the echoPin as an input
     Serial.begin(9600);       // starts the serial communication
 }
 
 void loop() {
   // send a short trigger pulse
   digitalWrite(trigPin, LOW);
   delayMicroseconds(2);
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
 
   duration = pulseIn(echoPin, HIGH); // echo round-trip time
   distance = (duration/2)/29.1;      // convert to centimeters
 
   // prints the distance on the serial monitor
   Serial.print("Distance: ");
   Serial.print(distance);
   Serial.println("CM");
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("Distance = ");
     lcd.setCursor(11,0);
     lcd.print(distance);
     lcd.setCursor(14,0);
     lcd.print("CM");
     
     delay(500);
 }

From this sketch, I started considering that the distance value could be sent directly to a computer and render a Web page depending on its value.
Note: it looks like this sensor's max range is 119 cm, which is almost 4 times less than the 4-meter max range stated in the component description.
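A sketch of how the distance value could drive a page dimension, using the distance-to-pixels factor (85 px per cm) that appears in the prototype code below; the clamping to the 2-119 cm range observed in practice is an assumption:

```cpp
#include <cassert>

// Map a physical distance (cm) to a browser width (px): every
// centimeter becomes 85 horizontal pixels, clamped to the sensor's
// usable range (2 cm minimum, 119 cm observed maximum).
int distanceToScreenWidth(int cm) {
    if (cm < 2) cm = 2;
    if (cm > 119) cm = 119;
    return cm * 85;
}
```

The computer side would then resize or re-render the page to this width on every serial reading.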

Prototype 4: Arduino Uno + Sensor + LCD + 2 LED = Physical vs Digital Range detector

Using in-between values to activate the green LED.
Once again, putting together the simulation and the device in use.

Sensor Test VS Elastic Space

 #include <LiquidCrystal.h>
 #include <LcdBarGraph.h> 
 #include <NewPing.h> 
 
   LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int LED1 = 13; 
 const int LED2 = 8;   
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration; // echo travel time in microseconds
 int distance;
 int screensize;
 
 void setup() {
     analogWrite(6,100);       // LCD backlight brightness
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // sets the trigPin as an output
     pinMode(echoPin, INPUT);  // sets the echoPin as an input
     Serial.begin(9600);       // starts the serial communication
 
     pinMode(LED1, OUTPUT);
     pinMode(LED2, OUTPUT);
 }
 
 void loop() {
   // send a short trigger pulse
   digitalWrite(trigPin, LOW);
   delayMicroseconds(2);
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
 
   duration = pulseIn(echoPin, HIGH);
   distance = (duration/2)/29.1; // convert to centimeters
   screensize = distance*85;    // each centimeter maps to 85 px of screen width
 
   Serial.print("Distance: ");
   Serial.print(distance);
   Serial.println("CM");
   Serial.println(screensize);
 
   // light the green LED only inside the 15-20 cm range
   if ((distance >= 15) && (distance <= 20))
   {
      digitalWrite(LED2, HIGH);
      digitalWrite(LED1, LOW);
   }
   else
   {
      digitalWrite(LED1, HIGH);
      digitalWrite(LED2, LOW);    
   }
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("ROOM");
     lcd.setCursor(6,0);
     lcd.print(distance);
     lcd.setCursor(9,0);
     lcd.print("cm");    
     lcd.setCursor(0,1); // second LCD row is index 1 on a 16x2 display
     lcd.print("SCR");
     lcd.setCursor(6,1);
     lcd.print(screensize);
     lcd.setCursor(9,1);
     lcd.print("x1080px");
         
     delay(500);
 }


I brought a second Arduino, 2 long breadboards, black cables and another LCD screen, and remade the setup in this format. For some reason the new LCD screen does not fit into the breadboard, and I need more male-to-female cables in order to connect it correctly. With this longer breadboard, I want to extend the range-value system and make it visible with LEDs and sounds.

Upgrade

How to get more digital pins [not working]

I tried 4 different tutorials but still didn't find a way to make it work, which is very weird, so I will just give up and get an Arduino Mega =*(

ArduinoExtraDigitalPins

Prototype 5: Arduino Uno + 3 Sensor + 3 LEDS

With a larger breadboard, connecting 3 sensors together. The next step will be to define different ranges of in-between values for each sensor in order to make a grid. To accomplish this grid, I will add a second row of sensors like this one, in order to get x and y values in space.
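A sketch of how two distance readings (one per sensor row) could be quantized into grid coordinates; the 20 cm cell size is an assumption:

```cpp
#include <cassert>

// A grid cell identified by its x/y index.
struct Cell { int x; int y; };

// One sensor row gives a distance along x, the second row a distance
// along y; integer division by the cell size yields the grid coordinate.
Cell locate(int xDistanceCm, int yDistanceCm, int cellCm = 20) {
    return { xDistanceCm / cellCm, yDistanceCm / cellCm };
}
```

Each cell could then drive its own LED, as in the prototypes below.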

Prototype 6: Arduino Uno + 3 Sensor + 12 LEDS

With 3 sensors added on 2 long breadboards, and with a different set of range values, we can start mapping a space.

SensorMediaQueries
Physical Space Mapping

Prototype 7: Arduino Uno + 12 LEDS + 3 Sensor + Buzzer + Potentiometer + LCD

For this prototype, I implemented a buzzer that emits a specific sound depending on the distance of the obstacle detected by the sensor. I also put back an LCD displaying the 3 sensors' values. The screen luminosity can be changed via a potentiometer.
Resources:

ArduinoMegaSensorBuzzerLCD
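A testable sketch of the distance-to-pitch idea (the frequency range and the linear mapping are assumptions; on the board the result would be passed to tone(buzzerPin, frequency)):

```cpp
#include <cassert>

// Map obstacle distance to buzzer pitch: closer obstacles produce a
// higher frequency. Assumed range: 5 cm -> 2000 Hz, 100 cm -> 200 Hz,
// linear in between, clamped outside that range.
int distanceToFrequency(int cm) {
    if (cm < 5) cm = 5;
    if (cm > 100) cm = 100;
    return 2000 - (cm - 5) * 1800 / 95;
}
```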

Prototype 8: Arduino Uno + 12 LEDS + 3 Sensor on mini breadboards + Buzzer + Potentiometer + LCD

Same code, but a new setup detaching the sensors from each other and allowing them to be placed anywhere.

ArduinoMegaSensorBuzzerLCDMinibreadboard.jpg

Prototype 9: Arduino Uno + 21 LEDS + 7 Sensor + Buzzer + Potentiometer + LCD

Sensor Wall 01
PhysicalMapping2

Sketch 10: Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js

P5.js and ultrasonic sensor

The goal here was to create a first communication between the physical setup and a P5.js web page.

Sketch 11: Arduino Mega + UltraSonicSensor + LCD TouchScreen

LCD Arduino Variable poster

Semester 2

Simu part 02.gif

Sketch 12: Arduino Uno + P5serialcontrol + P5.js web editor = Code decrypter

PIGEON

Sketch 13: Arduino Uno + P5serialcontrol + P5.js web editor = Game

Stage 0
The subject is located too far away
Stage 0
The subject is well located and will hold position to reach the next stage
Stage 1
The subject unlocked Stage 1 and will hold position to reach the next stage
Stage 2
The subject unlocked Stage 2 and is located too close
Stage 3
The subject unlocked Stage 3 and needs to get closer
Transition Stage
The subject unlocked all stages and needs to wait for the countdown before the following steps

Sketch 14: Arduino Uno + P5serialcontrol + P5.js web editor = Simplified interface

Data Collector Sample 01.gif

How to split serial data values with more than one sensor
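One common approach (an assumption here, mirroring what serial-communication examples usually do) is to print all sensor values on a single serial line separated by commas, then split that line on the receiving side:

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Parse one serial line like "12,34,56" into a list of sensor values.
// On the Arduino side, the matching sender would Serial.print each
// value with a comma between them and end with Serial.println().
std::vector<int> splitSerialLine(const std::string& line) {
    std::vector<int> values;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ','))
        values.push_back(std::stoi(field));
    return values;
}
```

In the p5.js sketch, the equivalent is splitting the received string on ',' before converting each field to a number.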


Debug Martin 01.png
Debug Martin 05.png
Debug Martin 02.png
Debug Martin 03.png
Debug Martin 04.png
Debug Martin 06.png

Installation update

Installation Update 01.jpg
Installation Update 02.jpg


To do

Stages Design

Many stages (mini-levels) are being designed. They are all intended to evoke the different contexts and pretexts in which we perform daily micro-tasks that collect data.
The visitor can unlock the next stage by successfully completing one or more tasks in a row. After a while, even if the action is not successful, a new stage will appear, with a different interface and instructions.
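The unlocking logic just described can be sketched as a small state machine (the names, the 3-task streak and the 60-second timeout are assumptions, not the actual values used):

```cpp
#include <cassert>

// Hypothetical stage controller: a stage advances after enough
// successful tasks in a row, or automatically after a timeout
// even when the visitor never succeeds.
struct StageMachine {
    int stage = 0;   // current stage index
    int streak = 0;  // consecutive successful tasks

    // Register one task outcome; advance after `needed` successes in a row.
    void taskResult(bool success, int needed = 3) {
        streak = success ? streak + 1 : 0;
        if (streak >= needed) { stage++; streak = 0; }
    }

    // Called periodically: force-advance after `timeout` idle seconds.
    void tick(int secondsIdle, int timeout = 60) {
        if (secondsIdle >= timeout) { stage++; streak = 0; }
    }
};
```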

The list below details the different stages being produced; they will follow each other randomly during the session:

    • Money/Slot Machine
    • Well-Being
    • Healthcare
    • Yoga
    • Self-Management
    • Stock Market Exchange
    • Military interface

The visuals below illustrate their design.

Captcha:
one action needed (moving forward, backward or standing still); the next stage unlocks after the action is done and a short animation
Self Tracking:
no interaction needed; the visitor must stand still until one of the goals is completed
Self Tracking:
no interaction needed; the visitor must stand still until one of the goals is completed
Slot Machine:
no interaction needed; a transition between 2 stages that randomly determines the next stage; displayed when nobody is detected
Social Live:
no interaction needed; the visitor must stand still until the money goal is completed
Stock Ticker:
no interaction needed; displayed when nobody is detected


Stages Design with P5.js

AllStages HomoData.png
6 levels in a row, then randomized; more to come
Consolelog 01.gif

Grad Show: Worm

Count on Me - Worm - 01
Count on Me - Worm - 02
Count on Me - Worm - 03
Count on Me - Worm - 04
Count on Me - Worm - 05
Count on Me - Worm - 06

Prototyping Resources

Do-it-Yourself Resources (from Dennis de Bel)

  • Instructables is a huge source of (written) tutorials on all kinds of topics. Keep in mind it's more quantity than quality. Interesting for you might be 'diy sensors'.
  • Hand Made Electronic (Music): great resource for cheap, DIY electronics projects focusing on sound/music (pdf findable online).
  • Make: Electronics: amazing, complete guide to everything 'electronics' (warning: HUGE pdf).
  • Thingiverse: the place to find 3D-printable mechanics, enclosures, parts, etc.

Electronic Shops (physical)

LIST OF SHOPS (also more physical NL ones)

Electronic Webshops (NL)

Electronic Webshops (Rest)

PCB making EU (Expensive)

PCB making China (Cheap but import tax)

  • JLCPCB (1 week from design upload to in your hands, low quality solder mask)
  • PCBWAY (1 week from design upload to in your hands)
  • ALLPCB (1 week from design upload to in your hands)

Arduino and Sensors

Sensor only Kit

  • 45-in-1 (aliexpress) Example sensor you will find in such a kit documented here

Arduino Starter Projects

or slightly more complex:

or in videos:

or just many different ideas:

or - of course - on Instructables if you want to have a complete course:

or this course:

ARDUINO + PROCESSING (visualizing sensors)

MISCELLANEOUS KEYWORDS and LINKS

About the ESP8266 module

The ESP8266 is a microcontroller IC with Wi-Fi connectivity; it will allow us to connect the Arduino to the Internet so the values obtained from the sensors can be received directly on a self-hosted webpage. From this same web page, it would also be possible to control LEDs, motors, LCD screens, etc.
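As an illustration only, this is the kind of request URL the ESP8266 might build before pushing a reading to the self-hosted page; the host name and parameter names are assumptions, not an actual endpoint:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Build a hypothetical logging URL carrying one sensor reading.
// On the ESP8266, this string would be sent as an HTTP GET request.
std::string buildReadingUrl(const std::string& host, int sensorId, int cm) {
    std::ostringstream url;
    url << "http://" << host << "/log?sensor=" << sensorId
        << "&distance=" << cm;
    return url.str();
}
```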

Resources about the ESP8266 module

Kindly forwarded by Louisa:

Which ESP8266 to buy

Installation

Resources

  • Movable walls build out for Art Museum of West Virginia University link
  • Gallery Wall System (GWS) link
  • CASE-REAL installs movable walls inside a basement art gallery in tokyo link

Venues

Venue 1: Aquarium

Description


AQUARIUM 1.0


A Small Ecosystem for Living Thoughts

Monday, 11th October
19:30 – 21:30
Leeszaal Rotterdam West
Rijnhoutplein 3, 3014 TZ Rotterdam

with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann

It’s oh-fish-ial! Students of the Experimental Publishing Master invite you to dive into their small ecosystem of living thoughts. Join us for an evening of conversation, discussion and new view points. If you look closely, you might even see some early thesis ideas hatching. Let's leave no rock unturned.

Observation questionnaire

This exercise is a very small, humble and almost 100% analog exercise questioning representation in two small steps.

1st step

photo of a brick



  • 1st step: I give a sheet of paper to people during the venue and ask them to answer a series of questions concerning the object (a brick) displayed in the middle of the room on a podium. It is specified that they can be anywhere and in any position while observing the brick. Here are the questions:


  • Please write down your first name:


  • Describe your position (sitting/standing/other):


  • Describe your location in the room:


  • Describe what you are seeing while looking at the screen:


  • Describe how you feel mentally/emotionally:



2nd step

photo of brick displayed inside a computer screen

  • 2nd step: I take the answers, wait a round, and then give the same people a new sheet of paper with the exact same questions, this time concerning the representation of the object (brick) displayed in the middle of the room on a computer screen on the same podium.

Answer Samples

1.0 Object on a podium

  • 1.1 Sitting on corner stairs —> Wants to see it from different angles —> Feeling trapped, frustrated
  • 1.2 Sitting on stairs —> a rock looking dead —> Feeling sad
  • 1.3 Sitting on the left, close to the columns —> rational observation —> Nostalgic memories, having participated in the creation of the object as it looks right now
  • 1.4 Sitting in front of the object —> Calm and slightly confused
  • 1.5 Sitting on the floor next to the stairs, between the side and the middle —> Looking at the object from the bottom —> Feeling a bit confused and inspired



2.0 Photo of the object displayed on a computer screen placed on a podium

  • 2.1 Sitting on a chair, seeing the brick from a bird's perspective —> Feeling more in control of the situation
  • 2.2 Sitting very close to the brick —> Seeing a flat and almost abstract picture —> Feeling drawn to the picture, aesthetically pleasing, feeling less sad about the object
  • 2.3 Sitting under a table very far away —> Looking abstract but identifiable —> Excited about the unusual and childish observation position
  • 2.4 Sitting on stairs —> seeing the brick in 2D —> Feeling fine
  • 2.5 Sitting on the stairs —> Seeing a side of the screen with a top-view photo of the object —> Feeling comfortable



Answers1_RepresentationQuestionnaire
Answers2_RepresentationQuestionnaire
Answers3_RepresentationQuestionnaire
Answers4_RepresentationQuestionnaire
Answers5_RepresentationQuestionnaire


Venues

Venue 2: Aquarium 2.0

Description


Date 29th Nov — 4th Dec 2021
Time 15h — 18h
29th Nov — 4th Dec 2021 (all day)
Location: De Buitenboel, Rosier Faassenstraat 22 3025 GN Rotterdam, NL


AQUARIUM 2.0

An ongoing window exhibition with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann

Tap upon the glass and peer into the research projects we are currently working on. From Monday 29th of November until Saturday 4th of December we put ourselves on display in the window of De Buitenboel as an entry point into our think tank. Navigating between a range of technologies, such as wireless radio waves, virtual realities, sensors, ecological and diffractive forms of publishing, web design frameworks, language games, and an ultra-territorial residency; we invite you to gaze inside the tank and float with us. Welcome back to the ecosystem of living thoughts.

Aquarium LCD Portal (29 Nov – 4th Dec)

This interactive micro-installation, composed of an LCD screen and sensor(s), invites users/visitors to change the color of the screen and the displayed messages by moving closer to or further from the window. Link
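A sketch of one possible distance-to-color mapping for such a screen; the 10-200 cm range and the red-to-blue ramp are assumptions for illustration:

```cpp
#include <cassert>

// An RGB color for the screen backlight.
struct Color { int r; int g; int b; };

// The closer the visitor stands to the window, the warmer the color:
// red at 10 cm or less, blue at 200 cm or more, blended in between.
Color distanceToColor(int cm) {
    if (cm < 10) cm = 10;
    if (cm > 200) cm = 200;
    int t = (cm - 10) * 255 / 190; // 0 when close, 255 when far
    return { 255 - t, 0, t };
}
```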

ScreenPortalFront
ScreenPortalback
LCDScreenTest


Readings (new)(english)(with notes in english)

About Institutional Critique

To read

→ 1. Art and Contemporary Critical Practice: Reinventing Institutional Critique Doc
→ 2. From the Critique of Institutions to an Institution of Critique - Andrea Fraser Doc
→ 3. Institutional Critique, an Anthology of Artists' Writings - Alexander Alberro Doc

About Techno-Solutionism

To read

→ 1. The Folly of Technological Solutionism: An Interview with Evgeny Morozov - Natasha Dow Schüll

About Meta

To read

→ 1.  The meta as an aesthetic category Bruno Trentini (2014)
→ 2.  File:RMZ ARTIST WRITING(2).pdf The eye tells the story by Rosa Maria Zangenberg (2017)
→ 3.  Leonardo Da Vinci - Paragone by Louise Farago

About exhibition space

To read

→ 2. Kluitenberg, Eric, ed. Book of imaginary media. Excavating the dream of the ultimate communication medium. Rotterdam: NAi Publishers, 2006.
→ 3. The wall and the canvas: Lissitzky’s spatial experiments and the White Cube
→ 6. Decorative Arts: Billy Al Bengston and Frank Gehry discuss their 1968 collaboration at LACMA by Aram Moshayedi
→ 8.  File:Resonance and Wonder STEPHEN GREENBLATT.pdf Resonance and Wonder - STEPHEN GREENBLATT
→ 9.  A Canon of Exhibitions - Bruce Altshuler File:A Canon of Exhibitions - Bruce Altshuler.pdf
→ 10. Documenta - File:A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar.pdf A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar
→ 11. Pallasmaa - The Eyes of the Skin File:Pallasmaa - The Eyes of the Skin.pdf
→ 12. Venturi - Learning from Las Vegas File:Venturi - Learning from Las Vegas.pdf
→ 13. Preserving and Exhibiting Media Art: Challenges and Perspectives - JULIA NOORDEGRAAF, COSETTA G. SABA; BARBARA LE MAÎTRE; VINZENZ HEDIGER Copyright: 2013 - Publisher: Amsterdam University Press Series: Framing Film

Reading/Notes

→ 1. After the White Cube. ref 2015 NOTES INSIDE

  • How and why the White Cube rose and became democratized
  • White Cube // Consumerism = Art Consumerism?
  • Exhibition Space > Artworks
  • Experience of interpretation = Entertainment of Art?
  • Museum vs Mausoleum


→ 2. Spaces of Experience: Art Gallery Interiors from 1800 – 2000 ref NOTES INSIDE

  • Art vs 50's consumerism / Choreography of desire?
  • Check theorists Hermann von Helmholtz and Wilhelm Wundt


→ 3. Colour Critique A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space NOTES INSIDE
May 24, 2019 - Noise! Frans Hals, Otherwise, Frans Hals Museum
→ 4.  Noise! Frans Hals, Otherwise NOTES INSIDE

  • Role of colours in the viewer's experience of an exhibition
  • Institutional Critique
  • Institutionalised Space / White cube


→ 5. Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze NOTES INSIDE
(course for Artscience 2007/8) doc

  • About perspective
  • About Space time
  • About Cyber Space


→ 6.  THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge Doc NOTES INSIDE
Stephanie Moser SOUTHAMPTON UNIVERSITY (MUSEUM ANTHROPOLOGY) 2010

  • Architecture (Neoclassical buildings)
  • Big vs Small exhibition Space
  • Lined up objects vs non systematic display
  • Architecture/Design
  • Gallery interiors (Ceiling/Interior Design elements/Furniture
  • Colors
  • Individual lighting of objects vs global lighting
  • Dark vs Bright lighting
  • Chronological vs Thematic arrangement
  • Academic vs Journalistic writing
  • Busy layout vs Minimal Layout
  • Exhibition seen vs other exhibitions
  • Themed/idea-oriented vs objectled exhibitions
  • Didactic vs discovery exhibition
  • Contextual, immersive, or atmospheric exhibitions
  • Audience vs Reception


→ 7. Fantasies of the Library - Etienne Turpin (ed.), Anne-Sophie Springer (ed.) Ref; Editeur: The MIT Press; Date de publication: 1 sept. 2018

  • How the physical organization of a bookshelf can influence its digital version
  • The book as a miniature gallery/exhibition space
  • The library as a public place of reading
  • Library vs Exhibition Space = Use vs Display
  • Book-theme exhibitions

About User vs Visitor, or user in exhibition space

Designing the user experience in exhibition spaces - Elisa Rubegni, Caporali Maurizio, Antonio Rizzo, Erik Grönvall

  • What are the GUI intentions
  • What is the WIMP interaction model
  • What are the post-WIMP models
  • About Memex

About User Interface

Readings/Notes

→ 1. bootleg Alexander R. Galloway — The Interface Effect. 1st ed. Malden, USA: Polity Press.

  • The interface paradox
  • The less they do, the more they achieve and the more they become invisible & unconsidered
  • The interface as a "significant surface"
  • The interface as a gateway
  • The interface as "the place where information moves from one entity to another"
  • The interface as the media itself
  • The interface as "agitation or generative friction between different formats"
  • The interface as "an area" that "separates and mixes the two worlds that meet together there"


→ 2. bootleg Nick Srnicek - Navigating Neoliberalism: Political Aesthetics in an Age of Crisis NOTES INSIDE
Publisher: medium.com; publication date: 20 Oct. 2016

  • From an aesthetics of the sublime to an aesthetics of the interface
  • Cognitive mapping


→ 3. bootleg Program or Be Programmed: Ten Commands for a Digital Age — Douglas Rushkoff NOTES INSIDE
Rushkoff, D., 2010. Program or Be Programmed: Ten Commands for a Digital Age. 1st ed. Minneapolis, USA: OR Books.

  • "Instead of learning about our technology, we opt for a world in which our technology learns about us."
  • Programmed by the interfaces
  • From a transparent to an opaque medium


→ 4. bootleg The Best Interface Is No Interface — Golden Krishna NOTES INSIDE
Krishna, G., 2015. The Best Interface Is No Interface: The simple path to brilliant technology (Voices That Matter). 1st ed. unknown: New Riders Publishing.

  • "Screen Obsessed Approach to Design"
  • UI vs UX


→ 5. Plasticity of User Interfaces: A Revised Reference Framework NOTES INSIDE
Gaëlle Calvary, Joëlle Coutaz, David Thevenin Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt

  • About the term 'Plasticity'


→ 6. Interface Critique: Beyond UX — Florian Hadler, Alice Soiné, Daniel Irrgang Doc

  • The interface as an "historical artifact", a "space of power"
  • The interface as a human-machine boundary
  • What is interface critique
  • Interface in computer science
  • The screen for Lev Manovich



More to read/see

→ 1. Bickmore, T.W., Schilit, B.N., Digestor: Device- Independent Access To The World Wide Web, in Proc. of 6th Int. World Wide Web Conf. WWW’6
         (Santa Clara, April 1997)

→ 2. Bouillon, L., Vanderdonckt, J., Souchon, N., Recovering Alternative Presentation Models of a Web Page with VAQUITA, Chapter 27, in Proc. of 4th Int. Conf. on Computer- Aided Design of User Interfaces CADUI’2002
         (Valenciennes, May 15-17, 2002)

→ 3. Calvary, G., Coutaz, J., Thevenin, D., Supporting Context Changes for Plastic User Interfaces: a Process and a Mechanism, in “People and Computers XV –
         Interaction without Frontiers”, Joint Proceedings of AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI’2001(Lille, 10-14 September 2001)

→ 4. Cockton, G., Clarke S., Gray, P., Johnson, C., Literate Development: Weaving Human Context into Design Specifications, in “Critical Issues in User Interface Engineering”,
         P. Palanque & D. Benyon (eds), Springer-Verlag, London, 1995.

→ 5. Graham, T.C.N., Watts, L., Calvary, G., Coutaz, J., Dubois, E., Nigay, L., A Dimension Space for the Design of Interactive Systems within their Physical Environments, in Proc. of Conf. on Designing Interactive Systems DIS’2000
          (New York, August 17-19, 2000), ACM Press, New York, 2000.

→ 6. Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’ 2001
         (New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001.

→ 7. Thevenin, D., Coutaz, J., Plasticity of User Interfaces: Framework and Research Agenda, in Proc. of 7th IFIP International Conference on Human-Computer Interaction Interact' 99
         (Edinburgh, August 30 - September 3, 1999), Chapman & Hall, London, pp. 110-117.

→ 8. Thevenin, D., Adaptation en Interaction Homme-Machine: Le cas de la Plasticité, Ph.D. thesis, Université Joseph Fourier,
          Grenoble, 21 December 2001.

→ 9. Graspable interfaces (Fitzmaurice et al., 1995) link

About User Condition

Readings

→ 1. The User Condition 04: A Mobile First World - Silvio Lorusso Doc

  • Most web users are smartphone users
  • How "mobile first" affects global web design
  • How "mobile first" affects the way we use computers

Readings (old) (mostly French, with notes in French)

Books (old)


→ 1.  L'art comme expérience — John Dewey (french) ⚠️(yet to be filled)⚠️
         publisher: Gallimard (1934)
→ 2.  L'œuvre d'art à l'époque de sa reproductibilité technique — Walter Benjamin (french)
         publisher: Alia (1939)
→ 3.  La Galaxie Gutenberg — Marshall McLuhan (french)
         publisher: University of Toronto Press (1962)
→ 4.  Pour comprendre les médias — Marshall McLuhan (french)
         publisher: McGraw-Hill Education (1964)
→ 5.  Dispositif — Jean-Louis Baudry (french)
         publisher: Raymond Bellour, Thierry Kuntzel et Christian Metz (1975)
→ 6.  L'Originalité de l'avant-garde et autres mythes modernistes — Rosalind Krauss (french) ⚠️(yet to be filled)⚠️
         publisher: Macula (1993)
→ 7.  L'art de l'observateur: vision et modernité au XIXe siècle — Jonathan Crary (french)
         publisher: Jacqueline Chambon (Editions) (1994)
→ 8.  Inside the White Cube, the Ideology of Gallery Space — Brian O'Doherty (english) ⚠️(yet to be filled)⚠️
         publisher: Les presses du réel (2008)
→ 9.  Précis de sémiotique générale — Jean-Marie Klinkenberg (french) ⚠️(yet to be filled)⚠️
         publisher: Point (2000)
→ 10. Langage des nouveaux médias — Lev Manovich (french) ⚠️(yet to be filled)⚠️
         publisher: Presses du Réel (2001)
→ 11. L'empire cybernétique — Céline Lafontaine (french)
         publisher: Seuil (2004)
→ 12. La relation comme forme — Jean-Louis Boissier (french)
         publisher: Genève, MAMCO (2004)
→ 13. Le Net Art au musée — Anne Laforêt (french)
         publisher: Questions Théoriques (2011)
→ 14. Narrative comprehension and Film communication — Edward Branigan (english)
         publisher: Routledge (2013)
→ 15. Statement and counter statement / Notes on Experimental Jetset — Experimental Jetset (english)
         publisher: Roma (2015)
→ 16. Post-Digital Print — Alessandro Ludovico (french)
         publisher: B42 (2016)
→ 17. L'écran comme mobile — Jean-Louis Boissier (french)
         publisher: Presses du réel (2016)
→ 18. Design tactile — Josh Clark (french)
         publisher: Eyrolles (2016)
→ 19. Espaces de l'œuvre, espaces de l'exposition — Pamela Bianchi (french)
         publisher: Eyrolles (2016)
→ 20. Imprimer le monde (french)
         publisher: Éditions HYX and Éditions du Centre Pompidou (2017)
→ 21. Version 0 - Notes sur le livre numérique (french)
         publisher: ECRIDIL (2018)

Articles (old)

→ 1. Frederick Kiesler — artiste-architecte ⚠️(yet to be filled)⚠️
        (press release) Centre Pompidou; source: centrepompidou.fr (1996)
→ 2. Oublier l'exposition ⚠️(yet to be filled)⚠️
        Artpress special issue 21 (2000)
→ 3. Composer avec l'imprévisible: Le questionnaire sur les médias variables ⚠️(yet to be filled)⚠️
        Jon Ippolito; source: variablemedia.net/pdf/Permanence (2003)
→ 4. Esthétique du numérique : rupture et continuité
        Fred Forest; source: archives.icom.museum (2010)
→ 5. La narration interactive ⚠️(yet to be filled)⚠️
        Dragana Trgovčević; source: ensci.com/file_intranet/mastere_ctc/etude_Dragana_Trgovcevic.pdf (2011)
→ 6. Des dispositifs aux appareils - L'Espacement d'un calcul
        Anthony Masure; source: anthonymasure.com (2013)
→ 7. Le musée n'est pas un dispositif — Jean-Louis Déotte, pp. 9-22 (2011)
→ 8. Apogée et périgée du White Cube — Alban Loosli

References

Exhibition space

→  Proun Spaces — El Lissitzky (1920)
→  City in Space — Frederick Kiesler (1925)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Untitled — Michael Asher (1973)
→  Corner Prop No. 7 (For Nathalie) — Richard Serra (1983)
→  Speaking Wall — Shilpa Gupta (2009-2010)

Nothingness with Media

→  4′33″ — John Cage (1952)
→  Untitled (A Curse) — Tom Friedman (1992)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Untitled — Michael Asher (1973)

Mediatization of Media

→  4′33″ — John Cage (1952)
→  TV Garden — Nam June Paik (1974)
→  Presents — Michael Snow (soon to be translated)
→  Lost Formats Preservation Society — Experimental Jetset (2000)
→  Lost Formats Winterthur — Experimental Jetset (2000)
→  L'atlas critique d'Internet — Louise Drulhe (2014-2015)

Flags

→  Netflag — Mark Napier (2002)
→  019 - Flag show (2015)

User perspective

→  What you see is what you get — Jonas Lund (2012)

Media Time perception

→  Present Continuous Past(s) — Dan Graham (1974)

Experimental cinema

→  Presents — Michael Snow (soon to be translated)
→  Displacements — Michael Naimark (1980)
→  BE NOW HERE — Michael Naimark (1995)

CSS composition

→  Sebastianly Serena
→  Scrollbar Composition
→  into time .com - Rafael Rozendaal
→  Ridge 11 - Nicolas Sassoon
→  Rectangulaire - Claude Closky
→  Jacksonpollock.org - Miltos Manetas
→  Moving Paintings - Annie Abrahams

Media deterioration

→  Img214270417
→  William Basinski - The Disintegration Loops

Undefined

→  Untitled Sans

User friendliness and anti-user friendliness

→  Web-Safe - Juha van Ingen

Media Art conservation

→  The Variable Media Initiative 1999
→  EAI Online Resource Guide for Exhibiting, Collecting & Preserving Media Art
→  Matters in Media Art
→  The International Network for the Preservation of Contemporary Art (INCCA)
→  Archiving complex digital artworks - Dušan Barok

Emulation

→  Seeing Double: Emulation in Theory and Practice

Technological Timeline

→  Technological Timeline

Media Art Online Archive

→  ACM SIGGRAPH Art Show Archives
→  Archive of Digital Art (ADA)
→  Ars Electronica Archive
→  Digital Art Web Archive (collected by Cornell)
→  Monoskop
→  The Rhizome ArtBase

Music/Sound

→  The end of music

HTML Quines

→  https://hugohil.github.io/dedans/
→  https://secretgeek.github.io/html_wysiwyg/html.html
→  http://all-html.net/?