XPUB2 Research Board / Martin Foucaut: Difference between revisions

From XPUB & Lens-Based wiki
=<p style="font-family:helvetica">Pads</p>=
===<p style="font-family:helvetica">Manetta / Michael </p> ===
* Group meeting Michael - https://pad.xpub.nl/p/2021_sandbox
* Group discussion Michael/Manetta - https://pad.xpub.nl/p/20210928_xpub2
* 2nd Group discussion Michael/Manetta - https://pad.xpub.nl/p/2021-10-05-xpub2
* Aquarium — https://pad.xpub.nl/p/aquarium
* Post-Aquarium — https://pad.xpub.nl/p/2021-10-12-postaquarium
* Prototyping — https://pad.xpub.nl/p/2021-11-09-xpub2
===<p style="font-family:helvetica">Steve / Marloes</p> ===
* Graduate Seminar Session 1 — https://pad.xpub.nl/p/GRS_session1_20_21
* Graduate Seminar Session 2 — https://pad.xpub.nl/p/GRS7Oct21
* https://pad.xpub.nl/p/LB2_%26_XPUB2_introduction_to_the_Graduate_Research
* https://pad.xpub.nl/p/GRS_session_3_14_Oct_21
* https://pad.xpub.nl/p/Thesi_OutlinePlanSteve
===<p style="font-family:helvetica">Eleanor Greenhalgh</p> ===
* Collaboration, Conflict & Consent - part 2 — https://pad.xpub.nl/p/2021-10-XPUB2-Nor
=<p style="font-family:helvetica">Links</p>=


*[[Martin (XPUB)-project proposal]]
*[[Martin (XPUB)-thesis outline]]
*[[Martin (XPUB)-thesis]]


=<p style="font-family:helvetica">Draft Thesis</p>=


===What do you want to make?===


My project is a data collection installation that monitors people's behaviors in public physical spaces while explicitly encouraging them to help the algorithm collect more information. An overview of how it works is presented here in the project proposal and will be subject to further development in practice.
<br><br>
The device is not designed to pretend to offer any beneficial outcome to the subject; it only makes visible the benefits that the machine gains from collecting their data. Yet the device presents this collected data, visually or verbally, in a grateful way, which might be stimulating for the subject. In that sense, the subject, despite knowing that their actions solely satisfy the device, could become intrigued, involved, or even addicted to a mechanism that deliberately uses them as a commodity. In this way, I intend to trigger conflicting feelings in the visitor’s mind, situated between a state of awareness of the ongoing monetization of their physical behaviors, and a state of engagement/entertainment/stimulation produced by the interactive value of the installation.
<br><br>
My first desire is to make the mechanisms by which data collection is carried out, marketized and legitimized both understandable and accessible. The array of sensors, the Arduinos and the screen are the main technological components of this installation. Rather than using an existing and complex tracking algorithm, the program is built from scratch, kept open source, and limits itself to converting a restricted range of physical actions into interactions. These include the detection of movements, positions, the lapse of time spent standing still or moving, and entry into or exit from a specific detection area. Optionally, they may also include the detection of the subject's smartphone or the subject logging on to a local Wi-Fi hotspot.
<br><br>
In terms of mechanics, the algorithm creates feedback loops starting from: <br>
_ the subject's behaviors being converted into information; <br>
_ the translation of this information into written/visual feedback; <br>
_ and the effects of this feedback on the subject's behavior; and so on. <br>
By doing so, it tries to shape visitors into free data providers inside their own physical environment, and stimulates their engagement by converting each piece of collected information into points/money, feeding a user score within a global ranking.
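This loop can be sketched in a few lines (JavaScript here for readability; in the installation the detection happens on the Arduino and the display in P5.js). The event names and point values below are illustrative placeholders, not final design decisions:

```javascript
// Hypothetical point values per detected event type (made-up numbers).
const POINT_VALUES = {
  entered_space: 50,
  left_space: 10,
  moving: 5,
  standing_still: 5,
  distance_reading: 1,
};

function collectEvent(state, eventType) {
  // 1. the subject's behavior is converted into information
  state.events.push(eventType);
  // 2. the information is converted into points, feeding the user score
  state.points += POINT_VALUES[eventType] || 0;
  // 3. a written feedback is produced, which may in turn affect behavior
  return `subject event "${eventType}" [${POINT_VALUES[eventType] || 0} points earned/given]`;
}

const state = { events: [], points: 0 };
collectEvent(state, "entered_space");
collectEvent(state, "moving");
console.log(state.points); // 55
```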
<br><br>
On the screen, displayed events can be:


_ “subject [] currently located at [ ]” <br>
[x points earned/given]<br>
_ “subject [] entered the space” <br>
[x points earned/given]<br>
_ “subject [] left the space”<br>
[x points earned/given]<br>
_ “subject [] moving/not moving”<br>
[x points earned/given]<br>
_ “subject [] distance to screen: [ ] cm” <br>
[x points earned/given]<br>
_ “subject [] stayed at [ ] since [ ] seconds” <br>
[x points earned/given]<br>
_ “subject [] device detected” <br>
[x points earned/given] (optional)<br>
_ “subject [] logged onto local Wi-Fi”<br>
[x points earned/given] (optional)<br>
<br>
Added to these are the instructions and comments from the device in reaction to the subject’s behaviors:<br>
<br>
_ “Congratulations, you have now given the monitor 25 % of all possible data to collect!” <br>
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”<br>
[if the subject stands still in a specific location]<br>
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”<br>
[unlocked at x points earned/given]<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”<br>
[if the subject stands still in a specific location]<br>
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you!”<br>
[if the subject is at the edge of the detection range]<br>
_ “You are only 93860 pieces of information away from being the top one data-giver!”<br>
[unlocked at x points earned/given]<br>
_ “Statistics show that people staying for more than 5 minutes benefit me on average 10 times more!”<br>
[randomly appears]<br>
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”<br>
[if the subject stands still for a long time in any location]<br>
<br>
Responding positively to the monitor's instructions unlocks special achievements and extra points.<br>
<br>
—Accidental data-giver badge <br>
[unlocked if the subject has passed by the installation without deliberately wishing to interact with it] + [x points earned/given]<br>
—Lazy data-giver badge <br>
[unlocked if the subject has been standing still for at least one minute] + [x points earned/given]<br>
—Novice data-giver badge <br>
[unlocked if the subject has successfully completed 5 missions from the monitor] + [x points earned/given]<br>
—Hyperactive data-giver badge <br>
[unlocked if the subject has never stood still for 10 seconds within a 2-minute lapse of time] + [x points earned/given]<br>
—Expert data-giver badge <br>
[unlocked if the subject has successfully completed 10 missions from the monitor within 10 minutes] + [x points earned/given]<br>
—Master data-giver badge <br>
[unlocked if the subject has successfully logged on to the local Wi-Fi hotspot] + [x points earned/given] (optional)<br>
<br>
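A minimal sketch of how a subset of these unlock rules could be checked (the field names are hypothetical, and the thresholds are the ones named above; the Hyperactive rule is omitted for brevity):

```javascript
// Hedged sketch of the badge system: returns the badges a subject has
// unlocked, given a summary of their tracked behavior.
function earnedBadges(subject) {
  const badges = [];
  if (!subject.deliberateInteraction) badges.push("Accidental data-giver");
  if (subject.longestStillSeconds >= 60) badges.push("Lazy data-giver");
  if (subject.completedMissions >= 5) badges.push("Novice data-giver");
  if (subject.completedMissions >= 10 && subject.minutesElapsed <= 10)
    badges.push("Expert data-giver");
  if (subject.loggedOnWifi) badges.push("Master data-giver"); // optional
  return badges;
}

const badges = earnedBadges({
  deliberateInteraction: true,
  longestStillSeconds: 75,
  completedMissions: 6,
  minutesElapsed: 12,
  loggedOnWifi: false,
});
console.log(badges); // ["Lazy data-giver", "Novice data-giver"]
```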
On the top left side of the screen, a user score displays the number of points generated by the collected pieces of information and by the unlocking of the special achievements set by the monitor.<br>
<br>
—Given information: 298 pieces <br>
[displays the number of collected events]<br>
—Points: 312000 <br>
[conversion of collected events and achievements into points]<br>
<br>
On the top right of the screen, the user is ranked among the x previous visitors, and the most prestigious badge recently earned is displayed below.<br>
<br>
—subject global ranking: 3/42 <br>
[compares subject’s score to all final scores from previous subjects]<br>
—subject status:  expert data-giver<br>
[displays the most valuable reward unlocked by the subject]<br>
<br>
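The ranking can be computed by comparing the subject's running score to the stored final scores of previous subjects; a sketch (the score values are made up):

```javascript
// Illustrative global ranking: rank 1 = highest score; the total
// includes the current subject alongside all previous visitors.
function globalRanking(subjectScore, previousScores) {
  const better = previousScores.filter((s) => s > subjectScore).length;
  return { rank: better + 1, total: previousScores.length + 1 };
}

const { rank, total } = globalRanking(312000, [450000, 380000, 120000, 90000]);
console.log(`subject global ranking: ${rank}/${total}`); // 3/5
```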
When leaving the detection range, the subject gets a warning message and a countdown starts, encouraging them to quickly decide to come back.<br>
<br>
—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”<br>
[displayed as long as the subject remains completely undetected]<br>
<br>
If the subject remains outside the detection range for more than 5 seconds, the monitor will address a thankful message, and the amount of money gathered, the achievements, the ranking, the complete list of collected information and a QR code will be printed as a receipt with the help of a thermal printer. The QR code will link to my thesis.<br>
<br>
_ “Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”<br>
[displayed after 5 seconds of being undetected]<br>
<br>
In order to collect, read and/or use that piece of information, the visitor will inevitably have to come back within the detection range and, intentionally or not, reactivate the data tracking game. It is therefore impossible to leave the detection area without leaving at least one piece of your own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets leave a precise trace of the behaviors and actions of previous subjects for future subjects. <br>
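The receipt content sent to the thermal printer could be assembled like this (a sketch; the field names are assumptions and the QR value is a placeholder, not the real thesis link):

```javascript
// Hypothetical receipt text, combining the score, ranking, badge and a
// placeholder QR payload, one field per printed line.
function buildReceipt(subject) {
  return [
    "Thank you for helping today!",
    `Given information: ${subject.events} pieces`,
    `Points: ${subject.points}`,
    `Global ranking: ${subject.rank}/${subject.total}`,
    `Status: ${subject.badge}`,
    "QR: <link to thesis>",
  ].join("\n");
}

const receipt = buildReceipt({
  events: 298,
  points: 312000,
  rank: 3,
  total: 42,
  badge: "expert data-giver",
});
console.log(receipt);
```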


===Why do you want to make it?===


When browsing online and/or using connected devices in the physical world, even the most innocent action or piece of information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques implicitly political, playing on dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.
<br><br>
In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as Edward Snowden's revelations, the Cambridge Analytica scandal and the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals/groups/governments/associations have taken legal action, surveillance capitalism still remains generally accepted, often because it is ignored and/or misunderstood.
<br><br>
Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual involvement from the same companies in monitoring the physical world and our human existences in a wide array of contexts: for example, with satellite and street photography (Google Earth, Street View), geolocation systems, simulated three-dimensional environments (augmented reality, virtual reality or the metaverse) or extensions of our brains and bodies (vocal assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data-sharing apps, which legitimize and encourage the datafication of the body for capitalistic purposes.
<br><br>
For the last 15 years, self-tracking tools have made their way to consumers. I believe that this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. Indeed, to adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment or even active engagement with the tools/platforms through which this surveillance is operated/allowed. By doing so, I want to ask how these companies still manage to get our consent and what human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effects of gamification, gambling and reward systems, as well as the aestheticization of data/self-data, as means to hook our attention, create ever more interactions and orient our behaviors.




===How do you plan to make it and on what timetable?===


I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.<br><br>


<b>How does it work?</b>
 
<br><br>
The ultrasonic sensors can detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound and receiving it back. The Arduino Uno/Mega boards are microcontrollers which can receive this information and run it through a program in order to convert these values into mm/cm/m, but also to map the space into an invisible grid. Ultimately, the values collected on the Arduino's serial monitor can be sent to P5.js through p5.serialcontrol. P5.js then allows a greater freedom in the way the information can be displayed on the screens.
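The duration-to-distance conversion the Arduino performs can be illustrated in a few lines (written in JavaScript here for readability; the constant assumes an HC-SR04-style sensor and a speed of sound of roughly 343 m/s, i.e. 0.0343 cm per microsecond, with the echo covering the distance twice, out and back):

```javascript
// Convert an ultrasonic echo pulse length (microseconds) into a
// distance in centimeters: time × speed of sound, halved because the
// sound travels to the obstacle and back.
function pulseToCm(echoMicroseconds) {
  return (echoMicroseconds * 0.0343) / 2;
}

console.log(pulseToCm(580)); // ≈ 9.9 cm
```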
 
<br><br>
Process:
<br><br>
<b>1st semester: Building a monitoring device, converting human actions into events, and events into visual feedbacks</b>
<br><br>
During the first semester, I am focusing on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and simplest prototype and gradually increase its scale/technicality until reaching human/architectural scale. Prototypes are subject to testing, documentation and comments, helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate screen monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated three-dimensional simulations, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop from the human behaviors being converted into information; the translation of this information into written/visual feedback; and the effects of this feedback on human behavior; and so on.
<br><br>
<b>2nd semester: Implementing gamification with the help of collaborative filtering, a point system and ranking.</b>
<br><br>
During the second semester, it is all about building and implementing a narration with the help of gaming mechanics that will encourage humans to feed the data-gathering device themselves. An overview of how it works is presented here in the project proposal and will be subject to further development in practice.
<br><br>
To summarize the storyline: positioned in the detection zone, the subject finds herself/himself unwillingly embodied as the main actor of a data collection game. Her/his mere presence generates a number of points/dollars displayed on a screen, growing as she/he stays within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data-collector. This can be done by setting clear goals/rewards for the subject, putting their performance in comparison with all the previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.
<br><br>
The mechanism is based on a sample of physical events that have already been explored in the first semester of prototyping (detection of movements, positions, lapse of time spent standing still or moving, and entry into or exit from a specific detection area). Every single detected event in this installation is stored in a data bank and, with the help of collaborative filtering, will allow the display of custom recommendations such as:
<br><br>
_ “Congratulations, you have now given the monitor 12 % of all possible data to collect”<br>
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot”<br>
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”<br>
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers”<br>
_ “Leaving already? The monitor has yet to collect 304759 crucial pieces of information from you”<br>
_ “You are only 93860 actions away from being the top one data-giver”<br>
_ “Statistics show that people staying for more than 5 minutes benefit me on average 10 times more”<br>
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”<br>
<br><br>
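A sketch of how the stored events could drive messages like these: every detected event goes into the data bank, and simple counts over that bank (a deliberately simplified stand-in for the collaborative filtering mentioned above; names and thresholds are hypothetical) select a recommendation:

```javascript
// A minimal event bank: each record stores who, what, and where.
const dataBank = [];

function recordEvent(subjectId, type, position) {
  dataBank.push({ subjectId, type, position });
}

// Pick a message for a subject standing at a given spot, based on how
// many distinct visitors have already produced data there.
function recommend(subjectId, spot) {
  const visitorsAtSpot = new Set(
    dataBank.filter((e) => e.position === spot).map((e) => e.subjectId)
  ).size;
  if (visitorsAtSpot < 5)
    return `The monitor has only collected data from ${visitorsAtSpot} visitors so far in this spot!`;
  const given = dataBank.filter((e) => e.subjectId === subjectId).length;
  return `Congratulations, you have now given the monitor ${given} pieces of information!`;
}

recordEvent("A", "standing_still", "left");
recordEvent("B", "moving", "left");
console.log(recommend("A", "left")); // only 2 visitors at "left" so far
```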
The guideline is set out here, but will be constantly updated with the help of experiments and the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio) and will allow anyone who deliberately wishes to interact with it to do so. Beyond these voluntary interactions, my interest is also in seeing what can be extracted from people simply passing in front of the installation. In addition, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral and organized experiences with participants (e.g. 15 February 2022 with Louisa).
<br><br>
This semester will also include the creation of a definitive set of illustrations that engage the participants of the installation in a more emotional way; the illustrations will be made by an illustrator/designer with whom I usually collaborate.
<br><br>
<b>3rd semester: Building the final installation for the final assessment and the graduation show. Test runs, debugging and final touches.</b>
<br><br>
During the third semester, the installation should be set up in the school, in the alumni area, next to the XPUB studio, for the final assessment, and ultimately set up again at WORM for the graduation show. I am interested in putting this installation in privileged spaces of human circulation (such as hallways) that would more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narration, mechanics, illustrations and graphic aspects should be finalized at the beginning of the 3rd semester and subjected to intense test runs throughout that period until the deadline.


===Relation to larger context===


As GAFAM companies are facing more and more legal issues and are being held accountable for a growing number of social and political issues around the world, the pandemic context has greatly contributed to making all of us more dependent than ever on the online services provided by these companies, and to somehow forcing our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made the use of new technologies even more necessary for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In a lot of countries, from one day to the next, and for an undetermined time, it has become necessary to carry a smartphone (or a printed QR code) in order to access transport, entertainment, cultural and catering services, but also in order to do simple things such as look at the menu in a bar/restaurant or place an order. Thus, this project takes place in a context where techno-surveillance has definitely taken a determining place in the way we can access spaces and services related to the physical world. <br><br>


Data Marketisation / Self Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics 




===Relation to previous practice?===


During my previous studies in graphic design, I started engaging with new media by making a small online reissue of Raymond Queneau's book Exercices de Style. In this issue, called Incidences Médiatiques (2017), the user/reader was encouraged to explore the 99 different versions of the same story written by the author in a less linear way. The idea was to consider each graphical user interface as a unique reading context: it would determine which story could be read, depending on the device used by the reader, and the user could navigate through these stories by resizing the Web window, by changing browser or by using a different device.
 
 
===<p style="font-family:helvetica">About thesis</p>===
 
====<p style="font-family:helvetica">Thesis criteria</p>====
 
# Intelligibly express your ideas, thoughts and reflections in written English.
# Articulate in writing a clear direction of your graduate project by being able to identify complex and coherent questions, concepts and appropriate forms.
# Clearly structure and analyse an argument.
# Use relevant source material and references.
# Research texts and practices and reflect upon them analytically.
# Synthesize different forms of knowledge in a coherent, imaginative and distinctive way.
# Position one's own views within a broader context.
# Recognize and perform the appropriate mode of address within a given context.
# Engage in active dialogue about your written work with others.
 
====<p style="font-family:helvetica">Thesis format</p>====
 
# A report on your research and practice.
 
# An analytical essay exploring related artistic, theoretical, historical and critical issues and practices that inform your practice, without necessarily referring to your work directly.
 
# The presentation of a text as a body of creative written work.
 
====<p style="font-family:helvetica">Thesis Outline (guideline)</p>====
 
Don't make it more than 1500 words
 
<b>What is your question?</b>
 
Break the proposed text down into parts. 
Think of the separate sections as "containers"
(this may change as you progress with the text but try to make a clear plan with a word count in place)
 
Thesis Outline (consider the following before writing the outline. Include all these points in the intro to the outline)
 
Conceptual Outline (what is your question? Try to be as specific as possible. More specific than identifying a subject or general interest. It helps to ask: "what questions does the work I make generate?")
 
Why do you want to write this text?
 
Outline of Methodology 
(for example: "I would like to structure my thesis in relation to a series of interviews I will conduct for my proposed project" 
OR 
"I will make a 'close reading' of three of my past projects")
 
Timeline
(how will you plan your time between now and April)
 
 
 
* Introduction- overview 
[500 words]
 
* Chapter 1 
[2000 words]
 
* Chapter 2 
[2000 words]
 
* Chapter 3 
[2000 words]
 
* Conclusion [500 words] 
 
= 7000 words total
 
===<p style="font-family:helvetica">Bibliography</p>===
 
Annotated bibliography (five texts max). Make a synopsis of 5 texts  that will be central to your thesis.
 
*Example of annotated bibliography 
https://pzwiki.wdka.nl/mediadesign/Mia/Thesis
 
*Examples of thesis outlines:
    https://pzwiki.wdka.nl/mw-mediadesign/images/f/f3/Thesis_outline_final_Yuching.pdf
    https://pzwiki.wdka.nl/mediadesign/User:Zpalomagar/THESIS_OUTLINE/FIFTH_DRAFT
 
=== Referencing System ===
 
*Harvard Referencing system [https://library.aru.ac.uk/referencing/files/QuickHarvardGuide2019.pdf PDF]
 
 
==<p style="font-family:helvetica">Graduate proposal guidelines</p>==
 
===<p style="font-family:helvetica">What do you want to make?</p>===
 
I want to build a dystopian cybernetic exhibition space reflecting on the increasing presence of the digital/virtual in our culture. This work will speculate on how the modes of representation inside exhibition spaces, as well as the agencies, behaviors and circulations of their visitors, could be affected by the growing translation of our physical/digital behaviors into informational units. The idea is to make use of digital technologies (ultrasonic sensors, microcontrollers, screens) and to take inspiration from the inherent mechanisms of Web interfaces (responsiveness, Web events, @media queries) in order to create an exhibition space explicitly willing to offer a customizable perspective to its visitors. In this regard, the number of visitors, their position within the space, their actions or inactions, as well as their movements and trajectories, will be mapped (made interdependent) to various settings of the space itself, such as the room size, the lighting, the audio/sound, the information layout and format, etc.
<br><br>
In order to enlighten the invisible, silent and often poorly considered dynamics that can co-exist between digital and physical spaces, the data captured inside this space will be displayed on screens and be the main content of the exhibition. Ultimately, the graphic properties of this data (typography, layout, font-size, letter-spacing, line-height, screen luminosity) will also be affected by the indications given by these same information units.
As part of my graduation project, called Media Spaces (2019), I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a Web-to-Print website. Subsequently, this website was translated into physical space as a printed book and a series of installations displayed in an exhibition space following the online structure of my thesis (home page, index, parts 1-2-3-4). In that way, I was interested in inviting visitors to make a physical experience of some aspects of the Web.
<br><br>
Far from wanting to glorify the use of technology or to represent it as an all-powerful evil, this space will also be subject to accidents, bugs, dead zones and glitches, some of which might have been intentionally left there. This is about a "critical look at and experience of the architecture and digital infrastructure used to observe, control and facilitate contemporary existence" (quoted from https://v2.nl/events/reasonable-doubt).
As a first-year student of Experimental Publishing, I continued to work in that direction, eventually creating a meta-website called Tense (2020), which aimed to display the invisible HTML <meta> tags inside an essay in order to affect our interpretation of the text. In 2021, I worked on a geocaching pinball game highlighting invisible Web events, and a Web oscillator whose amplitude and frequency range were directly related to the user's cursor position and browser screen size.
 
===<p style="font-family:helvetica">How do you plan to make it?</p>===
 
While working with Arduino Mega and P5.js, my aim is to start from the smallest and simplest prototype and gradually increase its scale/technicality until reaching human/architectural scale (see: prototyping).
<br><br> 
Once an exhibition space has been determined for the assessment, I will introduce my project to the wood and metal stations of the school in order to get help building at least one mobile wall fixed on a rail system. This wall will include handle(s) on the interior side, allowing visitors to reduce or expand the size of the space (by pushing or pulling the wall) between a minimum and a maximum range (estimated between 5 m² and 15 m²). On the exterior side of this wall, at least one ultrasonic sensor will be implemented in order to determine the surface of the room in real time (see schema). With the help of an array of ultrasonic sensors placed on the interior of the 4 surrounding walls, the space will be mapped into an invisible grid that will detect the exact position of the visitor(s) in real time, as well as their number. With an extensive use of other sensors, such as temperature sensors, light sensors and motion sensors, more information will be gathered and assigned to specific parameters of the exhibition display.
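The grid mapping can be sketched simply: two perpendicular wall-mounted sensors give a visitor's x/y distance, and dividing by a cell size yields a grid cell (JavaScript for illustration; the 50 cm cell size is an assumption, not a final parameter):

```javascript
// Hypothetical mapping from two distance readings (cm) to a cell in the
// invisible grid laid over the room.
const CELL_CM = 50; // one grid cell = 50 x 50 cm (assumed)

function toGridCell(xDistanceCm, yDistanceCm) {
  return {
    col: Math.floor(xDistanceCm / CELL_CM),
    row: Math.floor(yDistanceCm / CELL_CM),
  };
}

console.log(toGridCell(130, 220)); // { col: 2, row: 4 }
```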
<br><br> 
One or several screens placed in the space itself will display the data captured by the various sensors. Serial communication will allow the transfer of the information gathered by the Arduinos to P5.js, allowing variable displays of the units. Resizing the space will specifically affect the lighting of the space, the luminosity of the screens and the size of the information displayed. The number of visitors will affect the number of active screens as well as the room temperature display. The position in the room will trigger different voice and/or textual instructions if the visitor is not placed in a meaningful way toward the displayed contents. (Ref: [https://www.muhka.be/programme/detail/1405-shilpa-gupta-today-will-end/item/30302-speaking-wall Speaking wall] - Shilpa Gupta, 2009-2010)
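Mapping a captured value onto a display parameter works like p5.js's map(): a linear rescale from the sensor range to the output range. A sketch (the ranges below are assumptions for illustration, tied to the estimated 5-15 m² room size):

```javascript
// Linear rescale of a value from one range to another, in the spirit of
// p5.js map(value, inMin, inMax, outMin, outMax).
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) * (outMax - outMin)) / (inMax - inMin);
}

const roomSize = 10; // m², somewhere between the 5 and 15 m² bounds
const luminosity = mapRange(roomSize, 5, 15, 20, 100); // screen luminosity, %
const fontSize = mapRange(roomSize, 5, 15, 12, 48); // displayed text size, px

console.log(luminosity, fontSize); // 60 30
```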
<br><br> 
In order to allow myself to take a step back from the making of this project, I will take advantage of the different venues organized by XPUB2 and create mini-workshops that will relate more closely to it (see venue1) (see: simulation)
 
[[File:SensorSpace.gif|200px|thumb|left|Sensor Test VS Elastic Space<br><b>Click to watch</b>]]
[[File:Sensor Wall 01.png|200px|thumb|right|Sensor Wall 01]]
[[File:SensorMediaQueries 01.gif|200px|thumb|center|SensorMediaQueries<br><b>Click to watch</b>]]
<br><br><br>
 
===<p style="font-family:helvetica">What is your timetable?</p>===
 
1st semester: Prototyping mainly with Arduino, connecting Arduino to P5.js, finding a space to set up the installation for the final assessment
* 1st prototype: mini Arduino + light sensor (understanding Arduino basics / connecting a sensor to a servo motor) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_1_:_Arduino_.2B_Resistor|prototype]]
* 2nd prototype: Arduino Uno + ultrasonic sensor (working with ultrasonic sensors / displaying the sensor's values on the serial monitor) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_2:_Arduino_.2B_Ultrasonic_sensor|prototype]]
* 3rd prototype: Arduino Uno + ultrasonic sensor + LCD screen (displaying values on a small digital screen) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_3:_Arduino_Uno_.2B_Sensor_.2B_LCD_.28.2B_LED.29|prototype]]
* 4th prototype: Arduino Uno + ultrasonic sensor + 2 LEDs (creating distance-range detection, and triggering different lights depending on the distance detected) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_4:_Arduino_Uno_.2B_Sensor_.2B_LCD_.2B_2_LED_.3D_Physical_vs_Digital_Range_detector|prototype]]
* 5th prototype: Arduino Uno + 3 ultrasonic sensors + 3 LEDs (mapping distance-range values in a simple 3x3 grid) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_5:_Arduino_Uno_.2B_3_Sensor_.2B_3_LEDS|prototype]]
* 6th prototype: Arduino Uno + 3 ultrasonic sensors + 12 LEDs (assigning a signal to each position of a person inside the grid by adding more LEDs) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_6:_Arduino_Uno_.2B_3_Sensor_.2B_12_LEDS|prototype]]
* 7th prototype: Arduino Uno + 3 ultrasonic sensors + 1 buzzer + 1 LCD + 1 potentiometer (adding audio signals to the range-value detection / changing the luminosity of the screen with a potentiometer) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_7:_Arduino_Uno_.2B_12_LEDS_.2B_3_Sensor_.2B_Buzzer_.2B_Potentiometer_.2B_LCD|prototype]]
* 8th prototype: Arduino Uno + 3 ultrasonic sensors + 1 buzzer + 1 LCD + 1 potentiometer + mini breadboard (separating sensors from each other) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_8:_Arduino_Uno_.2B_12_LEDS_.2B_3_Sensor_on_mini_breadboards_.2B_Buzzer_.2B_Potentiometer_.2B_LCD|prototype]]
* 9th prototype: Arduino Mega + 21 LEDs + 7 sensors + buzzer + potentiometer + LCD (expanding the prototype to human scale with a 7x3 grid / assigning each position within the grid to a specific LED and buzzer signal) [[XPUB2_Research_Board_/_Martin_Foucaut#Prototype_9:_Arduino_Uno_.2B_21_LEDS_.2B_7_Sensor_.2B_Buzzer_.2B_Potentiometer_.2B_LCD|prototype]]
* 10th prototype: Arduino Mega + 7 sensors + LCD + 3 buzzers + P5.js (allowing multiple sound signals at the same time if 2 or more people are in the grid) [[XPUB2_Research_Board_/_Martin_Foucaut#Sketch_10:_Arduino_Mega_.2B_7_Sensors_.2B_LCD_.2B_3_buzzers_.2B_P5.js|prototype]]
* 11th prototype: Arduino Mega + 7 Sensors + LCD + P5.js (connecting the prototoype to a Web page via serial communications, changing the size of a circle with distance sensors)
——————————— NOW —————————————
* Upcoming - Arduino Mega + 7 sensors + P5.js (displaying live values on a screen and changing the display parameters depending on the values themselves)
* Upcoming - Arduino Mega + 7 sensors + P5.js (creating voice commands/instructions depending on the visitor's position)
* Upcoming - Arduino Mega + 7 sensors + LCD + P5.js (playing sounds and affecting pitch/tone depending on position, on one Web page)
* Optional: Arduino Uno + ESP8266 (Wi-Fi) (transmitting and/or controlling values from the Arduino to the computer and vice versa via a Wi-Fi transmitter / no longer necessary, since I found another way to do this via USB serial communication)
2nd semester: find the graduation space, build the mobile wall, implement the screen displays, and continue working with the Arduinos
* Show the prototype and schemas of the wall to the wood and metal workshops in order to get advice, up to final validation to build (starting to build the physical elements)
* Search for, find and validate the space that will be used for the installation during graduation.
* Start building the movable wall, taking into account the characteristics of the graduation space.
* Implement the sensors inside the movable wall, and the other devices in the fixed space
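The range detection that recurs through these prototypes (from the 4th onward) can be sketched as a single classification step. Thresholds and names here are illustrative assumptions, not the values used on the Arduino.

```javascript
// Classify an ultrasonic reading (cm) into a detection zone, as in the
// 4th prototype where a different LED lights up per range. On the Arduino
// the same branches would drive digitalWrite() calls instead of returning.
function detectZone(distanceCm, near = 50, far = 150) {
  if (distanceCm < near) return "near"; // e.g. first LED on
  if (distanceCm < far) return "far";   // e.g. second LED on
  return "out";                         // no one detected in this lane
}
```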
 
===<p style="font-family:helvetica">Why do you want to make it?</p>===
 
At the origin of this project, and of previous works over the past years, lies the desire to make the invisible visible. In my opinion, the better a medium mediates, the more it becomes invisible and unconsidered. This paradox stimulates a need to reflect on and highlight the crucial role of media in the way we create, perceive, receive and interpret a content, a subject or a work, but also in the way we behave and circulate in relation to them. It is probably not so common to appreciate an artwork for its frame or for the quality of the space in which it is displayed. It is, however, more common to let ourselves (as spectators/observers) be absorbed by the content itself and to naturally abstract away all mediating technologies. This is why I often try to « mediate the media » (see: Mediatizing the media), which means putting the media at the center of our attention, transforming it into the subject itself. In that sense, my graduation project, as well as some of my previous works, could be considered meta-works. I want to give users/visitors/spectators occasions to reflect on what is containing, surrounding, holding or hosting a representation.
<br><br>
<br><br>
On the other hand, I have more recently been attached to the idea of reversing the desktop metaphor. The desktop metaphor refers to the terms and objects that the Web borrowed from the physical world in order to make its own concepts more familiar and intelligible to its users. Now that these concepts are largely democratized and widely spread in modern society, people may have a clearer understanding and more concrete experiences of digital/Web interfaces. Museums, hotels, houses, car interiors and restaurants are themselves becoming more and more comparable to digital interfaces, where everything is optimized and where our behaviors, actions and even inactions are detected and converted into commands in order to offer a more customized (and lucrative) experience to each of us. In that sense, we are getting closer to becoming users of our own interfaced/augmented physical realities. By creating an exhibition space explicitly merging the concepts of the digital Web interface with the concept of the exhibition space, I wish to create a specific space dedicated to the experience of cybernetics, and to questioning what the future of the exhibition space could be. It is also about asking and displaying what the vulnerabilities of such technologies are, technologies that we sometimes tend to glorify or demonize. In that sense, the restitution of this exhibition space will intentionally leave in the bugs, glitches and other accidents that may have been encountered in the making of this work.
While it has always been clear to me that these works were motivated by the desire to define media as context, subject and/or content, the projects presented here have often made use of surveillance tools to detect and translate user information into feedback, participating in the construction of an individualized narrative and/or a unique viewing/listening context (interaction, screen size, browser, mouse position). The current work aims to take a critical look at the effect of these practices in the context of techno-surveillance.
<br><br>
<br><br>
Finally, it is about putting together two layers of reality that are too often clearly opposed/separated (IRL vs. online). It is about making the experience of their ambiguities, similarities and differences; about reconsidering their modalities by making them reflect on each other, and making the user/spectator/visitor reflect on their own agency inside them.
Similarly, projects such as Media Spaces have sought to explore the growing confusion between human and web user, physical and virtual space, or online and offline spaces. This project will demonstrate that this growing confusion will eventually lead us to be tracked in all circumstances, even in our most innocuous daily human activities/actions.


===<p style="font-family:helvetica">Who can help you?</p>===


About the overall project
# Stephane Pichard, ex-teacher and ex-production-tutor, for advice and help on scenography/exhibition space
# Emmanuel Cyriaque, my ex-teacher and writing tutor, for advice and help to contextualize my work
# Manetta
# Michael
About Arduino
# XPUB Arduino Group (knowledge sharing)
# Dennis de Bel (ex-XPUB)
# Aymeric Mansoux
About the physical elements of the exhibition:
# Wood station (for movable walls)
# Metal station (for rail system)
# Interaction station (for arduino/P5.js)
About theory/writing practice:
# Rosa: former student in art history and media at Leiden University.
# Yael: former student in philosophy, getting started with curatorial practice and writings about the challenges/modalities of the exhibition space. Philosophy of the media (?)
About finding an exhibiting space:
# Leslie Robbins


===<p style="font-family:helvetica">Relation to previous practice</p>===
<b>Works:</b>


During the first part of my previous studies, I really started engaging with questioning the media by making a small online reissue of Raymond Queneau's book Exercices de Style. In this issue, called [https://pzwiki.wdka.nl/mediadesign/Incidences_M%C3%A9diatiques_%E2%80%94_Martin_Foucaut Incidences Médiatiques], the user/reader was encouraged to explore the author's 99 different ways of telling the same story by putting themselves in different online reading contexts. In order to suggest a more non-linear reading experience, reflecting on the notions of context, perspective and point of view, the user could unlock and read these stories by zooming in or out of the Web window, resizing it, changing the browser, going to a different device, etc. As part of my previous graduation project, called [https://pzwiki.wdka.nl/mediadesign/Media_Spaces_%E2%80%94_Martin_Foucaut Media Spaces], I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a Web-to-Print website. Subsequently, this website was translated into physical space as a printed book, a set of meta flags, and a series of installations displayed in a set of exhibition rooms that followed the online structure of the thesis (home page, index, parts 1-2-3-4). It was my first attempt to create a physical interface inside an exhibition space, but one focused on structure and non-linear navigation. As a first-year student of Experimental Publishing, I continued to work in that direction by creating a [https://issue.xpub.nl/13/TENSE/ meta-website] making visible the html <meta> tags in an essay. I also worked on a geocaching pinball game highlighting invisible Web events and inviting users to drift through the city of The Hague to find hotspots.
More recently, I conceived and performed with a [https://hub.xpub.nl/sandbot/~foucaut/SI15/martin.html Web oscillator] inspired by the body sizes of analog instruments, whose amplitude and frequency range were directly related to the user's device screen size.
* M. DARKE, fairlyintelligent.tech (2021) https://fairlyintelligent.tech/
 
« Invites us to take on the role of an auditor, tasked with addressing the biases in a speculative AI »; an alternative to techno-surveillance.
[[File:Incidences médiatiques .gif|250px|thumb|left|Incidences Médiatiques <br><b>click to watch GIF</b>]]
<br>
[[File:Tense Button Display 2.png|250px|thumb|right|Special issue 13 - Wor(l)ds for the Future<br> Screen recording montage of <i>Tense</i><br><b>click to watch GIF</b>]]
* MANUEL BELTRAN, Data Production Labor (2018) https://v2.nl/archive/works/data-production-labour/
[[File:ESPACESMEDITATIQUE MARTINFOUCAUT 07.jpg|250px|thumb|left|Media Spaces - graduation project]]
Exposes humans as producers of useful intellectual labor that benefits the tech giants, and the use that can be made of that labor.
[[File:ESPACESMEDITATIQUE MARTINFOUCAUT 06.jpg|250px|thumb|right|Media Spaces - graduation project]]
<br>
 
* TEGA BRAIN and SURYA MATTU, Unfit-bits (2016) http://tegabrain.com/Unfit-Bits
[[File:ESPACESMEDITATIQUE MARTINFOUCAUT 09.jpg|250px|thumb|left|Media Spaces - graduation project]]
Claims that technological devices can easily be manipulated and, hence, that they are fallible and subjective. They do this by simply placing a self-tracker (connected bracelet) in another context, such as on other objects, in order to confuse these devices.
[[File:ESPACESMEDITATIQUE MARTINFOUCAUT 04.jpg|250px|thumb|right|Media Spaces - graduation project]]
<br>
 
* TREVOR PAGLEN and JACOB APPELBAUM, Autonomy Cube (2014), https://www.e-flux.com/announcements/2916/trevor-paglen-and-jacob-appelbaumautonomy-cube/
[[File:ESPACESMEDITATIQUE MARTINFOUCAUT 01.jpg|250px|thumb|left|Media Spaces - graduation project]]
Allows galleries to enjoy encrypted internet access and communications through the Tor network.
[[File:OscillatorGifWiki.gif|250px|thumb|right|Web oscillator]]
<br>  
 
* STUDIO MONIKER, Clickclickclick.click (2016) https://clickclickclick.click/
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
You are rewarded for exploring all the interactive possibilities of your mouse, revealing how our online behaviors can be monitored and interpreted by machines.
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
* RAFAEL LOZANO-HEMMER, Third Person (2006) https://www.lozano-hemmer.com/third_person.php
<br><br><br>
A portrait of the viewer is drawn in real time by active words, which appear automatically to fill his or her silhouette.
 
<br>
===<p style="font-family:helvetica">Relation to a larger context</p>===
* JONAS LUND, What you see is what you get (2012) http://whatyouseeiswhatyouget.net/
 
«Every visitor to the website's browser size is collected and played back sequentially, ending with your own.»
With the growing presence of digital tools in all aspects of our lives, people may now have more concrete experiences of digital/Web interfaces than of physical space. The distinctions between the physical and virtual worlds are being blurred, as they gradually tend to affect and imitate each other, create interdependencies, and translate our behaviors into informational units (data). Public spaces, institutions and governments are gradually embracing these technologies and explicitly promoting them as ways to offer us more efficient, easier-to-use, safer and more customizable services. However, we could also see these technologies as implicit political tools, playing on dynamics of visibility and invisibility in order to assert power and influence over publics and populations.
<br>
In a context where our physical reality is turning into a cybernetic reality, my aim is to observe and speculate on how mediating technologies could affect our modes of representation inside exhibition spaces, as much as to ask how they could redefine the agencies, behaviors and circulations of their visitors. In order to do so, it will also be important to place this project in the historical framework of the exhibition space and user interfaces, and to observe at what point they might be merging.
* USMAN HAQUE, Mischievous Museum (1997) https://haque.co.uk/work/mischievous-museum/
Readings of the building and its contents are therefore always unique -- no two visitors share the same experience.
<br><br>
<br><br>
Curatorial Practice / New Media Art / Information Visualization / Software Art / Institutional Critique / Human Sciences / Cybernetics
<b>Books & Articles:</b>
 
===<p style="font-family:helvetica">Selected References</p>===
 
* [https://clickclickclick.click/ Clickclickclick.click] -  VPRO Medialab & Moniker
*Stéphanie Moser, 2010. [[THE_DEVIL_IS_IN_THE_DETAILS:_MUSEUM_-_Displays_and_the_Creation_of_Knowledge|THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge]]. 1st ed. Southampton, England
* Ananda Mitra and Rae Lynn Schwartz, 2001. [https://academic.oup.com/jcmc/article/7/1/JCMC713/4584229 From Cyber Space to Cybernetic Space: Rethinking the Relationship between Real and Virtual Spaces]. Journal of Computer-Mediated Communication, Volume 7, Issue 1, 1 October 2001
*Alexander R. Galloway - [[Alexander_R._Galloway_-_The_Interface_Effect|The Interface Effect]] 1st ed. Malden, USA: Polity Press.
*Jonas Lund, 2012. [http://whatyouseeiswhatyouget.net/ What you see is what you get]
*Frederick Kiesler, 1925, [https://thecharnelhouse.org/2013/11/19/frederick-kiesler-city-of-space-1925/ City of space]
*Brendan Howell, 2017(?) - [http://screenl.es/ The screenless office]
 
==<p style="font-family:helvetica"> Reading Sources</p> ==
 
* [ Bootleg]
* [https://aaaaarg.fail/ Aaaarg]
* [https://www.jstor.org/ JSTOR]
 
==<p style="font-family:helvetica">Themes (keywords)</p>==
 
* Interfaced Reality
* Museum Display vs Screen display
* Exhibition space vs User interface
* Web Elasticity vs Physical Rigidity
* Museology / Curation / Gallery and Museum display
* Technological context
* Mediatization of Media / Meta Art
 
=<p style="font-family:helvetica">Draft Thesis</p>=
 
 
==<p style="font-family:helvetica">Introduction</p>==
 
With the growing presence of digital tools in all aspects of our lives, people may now have more concrete experiences of digital/Web interfaces than of physical space. The distinctions between the physical and virtual worlds are being blurred, as they gradually tend to affect and imitate each other, create interdependencies, and translate our behaviors into informational units (data). Public spaces, institutions and governments are gradually embracing these technologies and explicitly promoting them as ways to offer us more efficient, easier-to-use, safer and more customizable services. However, we could also see these technologies as implicit political tools, playing on dynamics of visibility and invisibility in order to assert power and influence over publics and populations.
In a context where our physical reality is turning into a cybernetic reality, my aim is to observe and speculate on how mediating technologies could affect our modes of representation inside exhibition spaces, as much as to ask how they could redefine the agencies, behaviors and circulations of their visitors.
Through digital and analogical comparisons, we will first try to find out what the status of visitors inside these spaces is (what is a user or a visitor?), what they have to agree on (terms, conditions, agreements vs. rules, safety, regulations), what is expected from them, and how some behaviors and circulations are encouraged or required while others are minimized or prohibited.
In a second phase, we will study these spaces themselves, by first contextualizing the exhibition space and web/digital interfaces in a historical framework, then considering how a set of spatial, technological and political factors defines a context in itself. In order to do so, we will identify the elements defining, communicating or giving structure to the contents; in which kind of architecture or system they exist or are displayed; and how this affects their sustainability or the way they can be perceived.
Thirdly, we will speculate on and experiment with possible implementations of cybernetics in the exhibition space, by formulating and producing various combinations of concepts belonging to both the physical exhibition space and the virtual/digital interface.
In complement to the writing of this thesis, an exhibition space will be conceived, inviting readers to experience the above-mentioned speculations. Among them, we will for example explore and experience the conceptual notions of « architectural devices », « physical events », « programmed physical space » or « exhibition user ».
 
==<p style="font-family:helvetica">I. Agencies and factors within the spaces of representation</p>==
 
===<p style="font-family:helvetica">1. AGENCIES</p>===
 
What are the status, conditions and agencies of users on the Web, in comparison to being a visitor/spectator inside an exhibition space? What does it mean to be a user, a visitor or a spectator? What behaviors are allowed, promoted, limited or prohibited? When and how are the conditions for entering and using these spaces stated? Can these conditions be set in detail by the user/viewer?
 
====<p style="font-family:helvetica">1.1 Terms, conditions, agreements — The user’s agencies through the Web interfaces</p>====
 
What does it mean to be a user? Does it necessarily involve interactivity? What are the status, conditions of use and agencies of users on the World Wide Web? What are the user's agencies when visiting a specific website? We will go through terms and agreements; cookies, privacy settings, legal uses, advertisement, copyrights, licenses, etc.
 
* [[The_User_Condition_04:_A_Mobile_First_World_-_Silvio_Lorusso|The User Condition 04: A Mobile First World]] - Silvio Lorusso
 
====<p style="font-family:helvetica">1.2 Rules, safety, regulations — The spectator’s agencies through the physical exhibition spaces</p>====
 
What does it mean to be an exhibition visitor? Does it necessarily involve spectating? What are the status, conditions and agencies of a visitor inside a museum, gallery or any other exhibition space? We will talk about artwork safety, public safety, prohibited items, public speaking, photography, equipment, behavior, circulation, etc.
 
* Louvre visitor rules: https://www.louvre.fr/en/visit/museum-rules
 
===<p style="font-family:helvetica">2. CONTEXTS</p>===
 
What are the spatial, technological and/or political factors defining the context in which the user(s) or visitor(s) is/are situated? Are the users/visitors and the content situated within the same space? What are the elements defining, communicating or giving structure to the contents? In which kind of architecture or system do these parameters exist or get displayed? How can the technologies used to support and display contents affect their sustainability, or the way they can be perceived/experienced?
 
====<p style="font-family:helvetica">2.1 Technological context of the Web</p>====
 
=====<p style="font-family:helvetica">2.1.1 Historical framework of the user interfaces</p>=====
 
From batch computing [the IBM 029 Card Punch]; to command-line interfaces (CLIs); to video display terminals; to the Graphical User Interface (GUI) [Xerox, Windows 1.0, Apple Lisa Office System 1, VisiCorp Visi On, Mac OS System 1], which introduced pointing systems (mouse/cursor) but also window systems with icons (folders, bins, etc.); further improved by [Amiga Workbench 1.0, Windows 2.0 and 3.0, Mac OS System 7, Windows 95]. Then the smartphone shifted the way UI design is conceived and democratized the concept of phone apps, which in turn influenced the way desktop interfaces are conceived (Windows 10). Current user interfaces seem to give more and more space to voice, touch inputs, augmented reality, virtual reality, etc. From this historical framework, we could speculate that user interfaces will be less and less embodied inside the devices themselves, and more and more projected into the physical space itself, or into conceived virtual spaces. This is how, in my opinion, the concept of cybernetic space is becoming a reality. (We will also evoke UI and UX.)
 
*[https://hub.xpub.nl/bootleglibrary/search?query=+The+Interface+Effect bootleg][[Alexander R. Galloway - The Interface Effect]]  1st ed. Malden, USA: Polity Press.<br>
*[https://hub.xpub.nl/bootleglibrary/search?query=Navigating+Neoliberalism bootleg] [[Nick_Srnicek_-_Navigating_Neoliberalism:_Political_Aesthetics_in_an_Age_of_Crisis|Srnicek, L., 2016. Navigating Neoliberalism: Political Aesthetics in an Age of Crisis]], medium.com
*[https://hub.xpub.nl/bootleglibrary/search?query=+Program+or+be+programmed bootleg][[Program_Or_Be_Programmed_-_Ten_Commands_For_A_Digital_Age_Douglas_Rushkoff|Douglas Rushkoff, A., 2010. Program Or Be Programmed - Ten Commands For A Digital Age Douglas Rushkoff]]. 1st ed. Minneapolis, USA: OR Books.
*[https://hub.xpub.nl/bootleglibrary/search?query=The+Best+Interface+Is+No+Interface bootleg][[The_Best_Interface_Is_No_Interface_-_Golden_Krishna|Krishna, G., 2015. The Best Interface Is No Interface: The simple path to brilliant technology (Voices That Matter)]]. 1st ed. unknown: New Riders Publishing.
*[https://hub.xpub.nl/bootleglibrary/search?query=interface bootleg][[Interface Critique- Beyond UX - FLORIAN HADLER, ALICE SOINÉ; DANIEL IRRGANG]] Florian Hadler, Alice Soiné, Daniel Irrgang
 
=====<p style="font-family:helvetica">2.1.2 An infinite array of individualized, elastic and obsolete perspectives/renders</p>=====
 
The Web's digital interfaces offer each of their users a custom point of view based on an innumerable and ever-changing array of technological factors. To list only a few of them: the device; the browser; the operating system; the screen size; the resolution; the user configurations and default settings; the IP address; etc. Users have the choice to change most of these settings, often without having to refresh their web page (e.g. resizing the user interface). Added to that, the display/render of a website is also affected by the constant evolution of the Web itself, with patches, updates, expired and added elements that contribute to the ephemerality and unpredictability of what can be seen.
How can these differences be made visible, and why would that be important? How does this ever-changing technology introduce unpredictability and obsolescence in the way contents can be rendered? How could the plastic property of the Web's digital interfaces be emulated in the exhibition space? How did this constraint slowly democratize the implementation of responsive mechanics on the Web?
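A few of these individualizing factors can be read directly from standard browser APIs. A minimal sketch: the properties used are standard Web APIs, but the selection, grouping and function name are mine.

```javascript
// Collect a few of the technological factors that individualize a render.
// Takes the window object as a parameter so the sketch stays testable.
function viewingContext(win) {
  return {
    userAgent: win.navigator.userAgent,          // browser + operating system
    viewport: [win.innerWidth, win.innerHeight], // current window size
    screen: [win.screen.width, win.screen.height],
    pixelRatio: win.devicePixelRatio,            // resolution / zoom factor
    language: win.navigator.language             // a user default setting
  };
}
```

In a browser one would call `viewingContext(window)`; each visitor of a page would see a different object, which is precisely the diversity of renders discussed above.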
 
 
* [http://whatyouseeiswhatyouget.net/ What you see is what you get] — Jonas Lund (2012)
* [[Plasticity_of_User_Interfaces:A_Revised_Reference_Framework|Plasticity of User Interfaces:A Revised Reference Framework]] - Gaëlle Calvary, Joëlle Coutaz, David Thevenin Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt 

* Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’ 2001

 
====<p style="font-family:helvetica">2.2 Technological contexts in the museum/exhibition space</p>====
 
=====<p style="font-family:helvetica">2.2.1 Historical framework of the exhibition spaces</p>=====
 
What is the purpose of an exhibition space? Exhibition spaces are meant to be where art becomes public.
From private galleries (owned by individuals and exhibited to other individuals) to public galleries (owned by institutions and exhibited to public audiences); from condensed displays (when frames used to be very close to each other, put everywhere including ceilings and corners) to spaced displays, to spatialized displays (assigning a space to an artwork or a few artworks); from physical exhibitions to virtual exhibitions. I need to find my notes on this topic again, and I definitely need more documentation.
 
* [http://nt2.uqam.ca/fr/biblio/after-white-cube [[After the White Cube.]]] [https://www.lrb.co.uk/the-paper/v37/n06/hal-foster/after-the-white-cube doc] 2015
* [http://nt2.uqam.ca/fr/biblio/spaces-experience-art-gallery-interiors-1800-2000 [[Spaces of Experience: Art Gallery Interiors from 1800 – 2000]]] [http://nt2.uqam.ca/fr/biblio/spaces-experience-art-gallery-interiors-1800-2000 doc]
* [https://www.e-flux.com/announcements/262138/colour-critique/ [[Colour Critique A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space]]]
* [[Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze]] (course for Artscience 2007/8) [http://www.joostrekveld.net/wp/?page_id=590 doc]
 
=====<p style="font-family:helvetica">2.2.2 Spaces and agents of the production of knowledge</p>=====
 
What are the elements involved in the museum display? Why do they matter? How do they orient our circulation, affect our perception, and define an object/subject as an artwork? We will consider the maximum number of parameters that can be controlled by the curator, such as architecture, scale, size, interior design, colors, temperature, layout, writing, arrangement, lighting, display, etc. We will also talk about some of the parameters that can escape the control of a curator, such as the number of visitors inside the space, the surrounding environment of an exhibition, the possible occurrence(s) of external constraints and restrictions, etc.
 
* [[THE_DEVIL_IS_IN_THE_DETAILS:_MUSEUM_-_Displays_and_the_Creation_of_Knowledge|Stéphanie Moser, 2010. THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge Doc. 1st ed. Southampton, England]]
* [[From_the_Critique_of_Institutions_to_an_Institution_of_Critique_-_Andrea_Fraser|From the Critique of Institutions to an Institution of Critique]] - Andrea FraserDoc
 
==<p style="font-family:helvetica">II. Reversing the desktop metaphor</p>==
 
This second part directly evokes the concepts surrounding the exhibition space being built for the graduation.
The desktop metaphor was invented in the early ages of computers to facilitate the use and understanding of digital interfaces, by making mental associations with domains from the physical world. Now that it is democratized, widely used and often replaces our need to converge in physical spaces (especially in times of pandemic), I would like to reverse this process by drawing inspiration from the concepts of digital interfaces in order to suggest a singular experience and understanding of the exhibition space.
 
===<p style="font-family:helvetica">1. CONCEPTS OF THE CYBERNETIC EXHIBITION SPACE</p>===
 
Conceiving the exhibition space as a user interface and exploring concepts that bring together notions from both digital and physical world.
 
* Ananda Mitra and Rae Lynn Schwartz, 2001. [https://academic.oup.com/jcmc/article/7/1/JCMC713/4584229 From Cyber Space to Cybernetic Space: Rethinking the Relationship between Real and Virtual Spaces]. Journal of Computer-Mediated Communication, Volume 7, Issue 1, 1 October 2001
* [http://screenl.es/ The screenless office] - Brendan Howell
 
====<p style="font-family:helvetica">1.1 "Architectural Device"</p>====
 
Conceiving the architecture as a spatial, technological and political device composed of a set of factors and parameters that can be configured.
 
====<p style="font-family:helvetica">1.2 "Physical Events"</p>====
 
On the Web, our actions and inactions can be converted into (silent and invisible) events that can activate things and be converted into valuable information for advertisers, algorithms, etc. How could such a thing be conceptualized inside an exhibition space?
 
* [https://clickclickclick.click/ Clickclickclick.click] - VPRO Medialab & Moniker
 
====<p style="font-family:helvetica">1.3 "Programmed physical space"</p>====
 
Comparing the programming of an interface with the curation of an exhibition space. Could an exhibition space be programmed? Does that make the visitor a user of the space?
 
====<p style="font-family:helvetica">1.4 "Exhibition User"</p>====
 
Conceiving the Spectator as a User or a performer of the physical space
 
* [https://www.muhka.be/programme/detail/1405-shilpa-gupta-today-will-end/item/30302-speaking-wall Speaking wall] Shilpa Gupta, 2009 - 2010
 
====<p style="font-family:helvetica">1.5 "Variable Display"</p>====
 
Conceiving the physical space as an elastic/variable and potentially unpredictable display, in order to diffract the range of viewing contexts offered by the Web.
 
====<p style="font-family:helvetica">1.6 "Meta space"</p>====
 
Conceiving a cybernetic exhibition space that captures information, translates this information into intra-actions, and displays it as the content of the exhibition itself.
 
*  [https://www.tandfonline.com/doi/full/10.3402/jac.v6.23009  The meta as an aesthetic category] Bruno Trentini (2014)<br>
 
==<p style="font-family:helvetica">Conclusion</p>==
[...]
 
==<p style="font-family:helvetica">Selected References</p>==
 
* [https://clickclickclick.click/ Clickclickclick.click] -  VPRO Medialab & Moniker
*Stéphanie Moser, 2010. [[THE_DEVIL_IS_IN_THE_DETAILS:_MUSEUM_-_Displays_and_the_Creation_of_Knowledge|THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge]]. 1st ed. Southampton, England
*Ananda Mitra, Rae Lynn Schwartz, 2001. [https://academic.oup.com/jcmc/article/7/1/JCMC713/4584229 From Cyber Space to Cybernetic Space: Rethinking the Relationship between Real and Virtual Spaces]. Journal of Computer-Mediated Communication, Volume 7, Issue 1, 1 October 2001
*Alexander R. Galloway - [[Alexander_R._Galloway_-_The_Interface_Effect|The Interface Effect]] 1st ed. Malden, USA: Polity Press.
*Jonas Lund, 2012. [http://whatyouseeiswhatyouget.net/ What you see is what you get]
*Frederick Kiesler, 1925, [https://thecharnelhouse.org/2013/11/19/frederick-kiesler-city-of-space-1925/ City of space]
*Brendan Howell, 2017(?) - [http://screenl.es/ The screenless office]


* SHOSHANA ZUBOFF, The Age of Surveillance Capitalism (2020)
Warns against the shift towards "surveillance capitalism". Her thesis argues that, by appropriating our personal data, the digital giants manipulate us and modify our behavior, attacking our free will and threatening our freedoms and personal sovereignty.
<br>
* EVGENY MOROZOV, Capitalism’s New Clothes (2019)
An extensive analysis and critique of Shoshana Zuboff's research and publications.
<br>
* BYRON REEVES AND CLIFFORD NASS, The Media Equation, How People Treat Computers, Television, and New Media Like Real People and Places (1996)
A precursor study of the relation between humans and machines, and of how humans relate to them.
<br>
<br>
More [[XPUB2_Research_Board_/_Martin_Foucaut#Readings_.28new.29.28english.29.28with_notes_in_english.29|here]]
* OMAR KHOLEIF, Goodbye, World! — Looking at Art in the digital Age (2018)  
 
In one part of the book, the author shares his own data as a journal, while in another part he questions how the Internet has changed the way we perceive, relate and interact with images.
 
=<p style="font-family:helvetica">What is my work, What do I want to tell, What is my position</p>=
 
Translated from [https://pad.xpub.nl/p/2021-11-09-xpub2 discussion] with Michael
 
People now have more concrete experience of the digital/Web interface than of physical space. Museums, hotels, houses, car interiors and restaurants are themselves becoming more and more comparable to digital interfaces, where everything is optimized and where our behaviours, actions and even inactions are detected and converted into commands in order to offer a more customized (and profitable) experience to each of us. In that sense, we are getting closer to becoming users of our own interfaced physical reality. By creating an exhibition space explicitly inspired by a desktop Web interface, I wish to question what the future of the exhibition space could be, what the limits of this interfaced and individualized reality are, and how it could affect our own experience and understanding of art.
<br>
<br>
 
* KATRIN FRITSCH, Towards an emancipatory understanding of widespread datafication (2018)
What could we learn from interface design?
Suggests that, in response to our society of surveillance, artists can propose activist responses that do not necessarily involve technological literacy, but can instead promote strong counter-metaphors and/or counter-uses of these intrusive technologies.
What could be the future of exhibition space?
 
"Bring attention to the systems underlying artistic productions" both on the Web and in the physical world<br>
"reversal of the desktop metaphor" (using the virtual as "concrete" to metaphorically understand a physical exhibition space), what will be the future of an exhibition space... (exhibition spaces are already working with sensors)
scary and fascinating at the same time...<br>
"my embracing/use of sensors isn't about proposing these techniques as a solution / ideal / about control... interfaces requiring less and less from us but paradoxically extracting more and more from us"<br>
every small unconsidered behaviour is being used (or someone is trying to use it)...<br>
there is unpredictability... because of all the factors; want unexpected things to happen...<br>
the reality of digital isn't all about precision and control; this notion of surprise is key for an experience.<br>
Exploring the fullness of digital / programmed / computational media, including those "edge" cases / the "glitch" ... the surprise...<br>
Examples from museums: (for instance Brussels has the MIM, the Musical Instruments Museum; sadly the old, now retired interface was a system with headphones that were activated in the space, so as you approached vitrines with a violin you would hear a performance of that instrument)...<br>
How a mistake can create something else. / Bugs / Glitches. Letting an accident/surprise/the unexpected exist, exploring the fullness of digital programming<br>
My position seems to fit with Researcher/Designer<br>
Digital is not precise and omnipotent; it has its faults, flaws and vulnerabilities.
 
 
 
To check:
 
*  [https://arvindguptatoys.com/arvindgupta/mindstorms.pdf Mindstorms  Seymour Papert]
*  Serendipity
 
 
 
==<p style="font-family:helvetica">Software Art</p>==
 
The creation of software, or the use of software concepts, for artworks.
Commonly puts the spectator in the role of a user.
 
==<p style="font-family:helvetica">Internet Art</p>==
 
Elements from the Internet brought outside of the Internet, promoting the Internet as part of both virtual and physical realities.
* John Ippolito
 
==<p style="font-family:helvetica">Post-Internet Art vs Internet 2.0</p>==
 
Post-Internet Art: literally, art after the Internet. Can consist of using online material for later use in offline works, or can reflect on the effects of the Internet on various aspects of culture, aesthetics and society.
* Olia Lialina
VS <br>
Internet 2.0: assuming that a single world Internet no longer exists
* Zach Blas
 
==<p style="font-family:helvetica">Net Art</p>==
 
Started in the late 70's and nowadays associated with an outdated era of the Internet (1.0?)<br>
Closely related to Network Art
* Olia Lialina, My Boyfriend Came Back From the War, 1996
 
==<p style="font-family:helvetica">New Aesthetics</p>==
 
Confronting/merging virtual and physical, or humans and machine, etc
* James Bridle
 
==<p style="font-family:helvetica">Funware</p>==
 
Gamification of non-game platforms in order to encourage some actions, behaviors, transactions with the help of various rewarding systems.
 
=<p style="font-family:helvetica">Connections to XPUB1</p>=
 
==<p style="font-family:helvetica">User viewing contexts (on the Web) from special issue 13</p>==
 
===<p style="font-family:helvetica">Description</p>===
 
Create motion from the diffraction of the user interface, which offers flexible and almost infinite possible renders of the same Web page. The sensible variety of user viewing contexts tells of the plasticity of the user interface. It results from the wide range of user devices, window or screen sizes and Web browsers (among many other parameters). A first motion capture and montage of this plasticity of the user interface was made as part of the post-production of my interpretation of the essay "[https://issue.xpub.nl/13/TENSE/ Tense]", part of the [https://issue.xpub.nl/13/ Special Issue 13].
 
==== Capturing and putting into motion the user interface plasticity ====
 
Trying to play around with browser window resizing in order to create a playful animation intended to be a thumbnail of the project.
The first two screen captures will be the basis of the upcoming motion. I will first try to smooth the window movement and make the two screen captures fit together, before synchronizing and looping them.
 
 
[[File:TENSE Motion Rectangle.gif|thumb|left|TENSE Motion Rectangle Format Loop in the loop]]
[[File:TENSEMOTIONInitialCapture.gif|thumb|right|TENSE MOTION Initial Screen Capture 1]]
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 
Notes: Add Web Oscillator


=<p style="font-family:helvetica">Prototyping</p>=
[[File:Rail.jpg|200px|thumb|center|Rail]]
<br><br><br><br><br><br><br>


===<p style="font-family:helvetica"> About the ultrasonic Sensor (HC-SR04)</p> ===


===<p style="font-family:helvetica">Sketch 10:  Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js </p> ===
[[File:P5.js sensor.gif|300px|thumb|left|P5.js and ultrasonic sensor]]


The goal here was to create a first communication between the physical setup and a P5.js web page.


Here is the code used:
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 
    //LIBRARIES
     
      #include "pitches.h" //PITCH
      #include <LiquidCrystal.h> //LCD
      //#include <LiquidCrystal_I2C.h> //LCD
      #include <LcdBarGraph.h> //LCD
      #include <Wire.h> // LCD
      #include <NewPing.h> //SENSOR ACCURACY
     
      LiquidCrystal lcd(34, 35, 32, 33, 30, 31);
     
      //LCD
      //const int SDAPin = A4; //Data pin
      //const int SCLPin = A5; //Clock pin
     
      //BUZZER
      const int BUZZER1 = 2;
      const int BUZZER2 = 3;
      const int BUZZER3 = 4;
   
      //A
     
      int trigPinA= A0;
      int EchoPinA= A1;
   
      //B
     
      int trigPinB= A2;
      int EchoPinB= A3;
   
      //C
     
      int trigPinC= A4;
      int EchoPinC= A5;
     
      //D
     
      int trigPinD = A6;                                 
      int EchoPinD= A7;   
   
                               
      //E
     
      int trigPinE = A8;                                 
      int EchoPinE= A9; 
   
      //F
     
      int trigPinF = A10;                                 
      int EchoPinF= A11; 
       
      //G
     
      int trigPinG= A12;
      int EchoPinG= A13;
     
      //A RANGE OF LEDS
     
      int LED_A1_ping= 25;
      int LED_A2_ping= 26;
      int LED_A3_ping= 27;
     
      //B RANGE OF LEDS
     
      int LED_B1_ping= 28; 
      int LED_B2_ping= 29;
      int LED_B3_ping= 36;
     
      //C RANGE OF LEDS
     
      int LED_C1_ping= 37; 
      int LED_C2_ping= 38;
      int LED_C3_ping= 39;
     
      //D RANGE OF LEDS
     
      int LED_D1_ping= 40; 
      int LED_D2_ping= 41;
      int LED_D3_ping= 42;
   
      //E RANGE OF LEDS
     
      int LED_E1_ping= 43 ; 
      int LED_E2_ping= 44;
      int LED_E3_ping= 45;
   
      //F RANGE OF LEDS
     
      int LED_F1_ping= 46; 
      int LED_F2_ping= 47;
      int LED_F3_ping= 48;
   
      //G RANGE OF LEDS
     
      int LED_G1_ping= 49; 
      int LED_G2_ping= 50;
      int LED_G3_ping= 51;
     
      //LCD DISPLAY
      //LiquidCrystal_I2C lcd = LiquidCrystal_I2C (0x3F,16,2);
     
      long duration, distance, UltraSensorA, UltraSensorB, UltraSensorC, UltraSensorD, UltraSensorE, UltraSensorF,  UltraSensorG;
      char data;
      String SerialData="";
     
      void setup()
      { // START SETUP FUNCTION
     
      lcd.begin(16, 2);
      //lcd.init();
     
      Serial.begin (9600);
                               
      pinMode(BUZZER1, OUTPUT);
      pinMode(BUZZER2, OUTPUT);
      pinMode(BUZZER3, OUTPUT);
   
      //setup pins sensor A
      pinMode(trigPinA, OUTPUT);
      pinMode(EchoPinA, INPUT);
      pinMode(LED_A1_ping, OUTPUT);
      pinMode(LED_A2_ping, OUTPUT);
      pinMode(LED_A3_ping, OUTPUT);
   
      //setup pins sensor B
      pinMode(trigPinB, OUTPUT);
      pinMode(EchoPinB, INPUT);
      pinMode(LED_B1_ping, OUTPUT);
      pinMode(LED_B2_ping, OUTPUT);
      pinMode(LED_B3_ping, OUTPUT);
   
      //setup pins sensor C
      pinMode(trigPinC, OUTPUT);
      pinMode(EchoPinC, INPUT);
      pinMode(LED_C1_ping, OUTPUT);
      pinMode(LED_C2_ping, OUTPUT);
      pinMode(LED_C3_ping, OUTPUT);
     
      // setup pins sensor D
      pinMode(trigPinD, OUTPUT);                       
      pinMode(EchoPinD, INPUT);                       
      pinMode(LED_D1_ping, OUTPUT);                 
      pinMode(LED_D2_ping, OUTPUT);                 
      pinMode(LED_D3_ping, OUTPUT);                 
   
      //setup pins sensor E
      pinMode(trigPinE, OUTPUT);
      pinMode(EchoPinE, INPUT);
      pinMode(LED_E1_ping, OUTPUT);                 
      pinMode(LED_E2_ping, OUTPUT);                 
      pinMode(LED_E3_ping, OUTPUT);                 
     
      //setup pins sensor F
      pinMode(trigPinF, OUTPUT);
      pinMode(EchoPinF, INPUT);
      pinMode(LED_F1_ping, OUTPUT);                 
      pinMode(LED_F2_ping, OUTPUT);                 
      pinMode(LED_F3_ping, OUTPUT);                 
       
      //setup pins sensor G
      pinMode(trigPinG, OUTPUT);
      pinMode(EchoPinG, INPUT);
      pinMode(LED_G1_ping, OUTPUT);                 
      pinMode(LED_G2_ping, OUTPUT);                 
      pinMode(LED_G3_ping, OUTPUT);                 
     
      //initialize LED status
      digitalWrite(LED_A1_ping,LOW);
      digitalWrite(LED_A2_ping,LOW);
      digitalWrite(LED_A3_ping,LOW);
     
      digitalWrite(LED_B1_ping,LOW);
      digitalWrite(LED_B2_ping,LOW);
      digitalWrite(LED_B3_ping,LOW);
     
      digitalWrite(LED_C1_ping,LOW);
      digitalWrite(LED_C2_ping,LOW);
      digitalWrite(LED_C3_ping,LOW);
     
      digitalWrite(LED_D1_ping,LOW);
      digitalWrite(LED_D2_ping,LOW);
      digitalWrite(LED_D3_ping,LOW);
   
      digitalWrite(LED_E1_ping,LOW);
      digitalWrite(LED_E2_ping,LOW);
      digitalWrite(LED_E3_ping,LOW);
   
      digitalWrite(LED_F1_ping,LOW);
      digitalWrite(LED_F2_ping,LOW);
      digitalWrite(LED_F3_ping,LOW);
   
      digitalWrite(LED_G1_ping,LOW);
      digitalWrite(LED_G2_ping,LOW);
      digitalWrite(LED_G3_ping,LOW);
   
      }
     
      void loop()
      {
     
       
      // START THE LOOP FUNCTION
      SonarSensor(trigPinA,EchoPinA);             
      UltraSensorA = distance;
      SonarSensor(trigPinB,EchoPinB);             
      UltraSensorB = distance;
      SonarSensor(trigPinC,EchoPinC);             
      UltraSensorC = distance;
      SonarSensor(trigPinD, EchoPinD);             
      UltraSensorD = distance;                                         
      SonarSensor(trigPinE,EchoPinE);             
      UltraSensorE = distance;
      SonarSensor(trigPinF,EchoPinF);             
      UltraSensorF = distance;
      SonarSensor(trigPinG,EchoPinG);             
      UltraSensorG = distance;
       
      //Serial.print("A: ");
      //Serial.print(UltraSensorA);
      //Serial.println(" cm");
   
      //Serial.print("B: ");
      //Serial.print(UltraSensorB);
      //Serial.println(" cm");
   
      //Serial.print("C: ");
      //Serial.print(UltraSensorC);
      //Serial.println(" cm");
   
      //Serial.print("D: ");
      //Serial.print(UltraSensorD);
      //Serial.println(" cm");
   
      //Serial.print("E: ");
      //Serial.print(UltraSensorE);
      //Serial.println(" cm");
   
      //Serial.print("F: ");
      //Serial.print(UltraSensorF);
      //Serial.println(" cm");
     
      //Serial.print("G: ");
      //Serial.print(UltraSensorG);
      //Serial.println(" cm");
     
      lcd.setCursor(0,0);
      lcd.print("A");
      lcd.setCursor(1,0);
      lcd.print(UltraSensorA);
 
      lcd.setCursor(5,0);
      lcd.print("B");
      lcd.setCursor(6,0);
      lcd.print(UltraSensorB);
     
      lcd.setCursor(10,0);
      lcd.print("C");
      lcd.setCursor(11,0);
      lcd.print(UltraSensorC);
 
      lcd.setCursor(0,1);
      lcd.print("D");
      lcd.setCursor(1,1);
      lcd.print(UltraSensorD);
 
      lcd.setCursor(5,1);
      lcd.print("E");
      lcd.setCursor(6,1);
      lcd.print(UltraSensorE);
     
      lcd.setCursor(10,1);
      lcd.print("F");
      lcd.setCursor(11,1);
      lcd.print(UltraSensorF);
 
        //int mappedPot = map(UltraSensorD, 0, 1023, 0, 255);
        // print it out the serial port:
        Serial.write(UltraSensorD);
        delay(1);                                           
   
      // A SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
      // A1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorA <=30)// if distance is 30 cm or less, turn the LED on
        {
 
          digitalWrite(LED_A1_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_A1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15); 
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_A1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
        }
        // A2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorA >=31) && (UltraSensorA <=60))
        {
          digitalWrite(LED_A2_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_A2_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_A2_ping,LOW);
        }
        // A3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorA >=61) && (UltraSensorA <=100))
        {
          digitalWrite(LED_A3_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_A3_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else             
        {
          digitalWrite(LED_A3_ping,LOW);
        }
        // A4———————————————————————————————————————————————————————————————————————————————————————————————
    //    if((UltraSensorA >=31) && (UltraSensorA <=10000))
    //    {
    //      digitalWrite(LED_A4_ping,HIGH);
    //      lcd.setCursor(0,1);
    //      lcd.print("4:");
    //    }
    //    else               
    //    {
    //      digitalWrite(LED_A4_ping,LOW);
    //    }
        // B SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
        // B1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorB <=30)
        {
          digitalWrite(LED_B1_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_B1_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15); 
        }
        else
        {
          digitalWrite(LED_B1_ping,LOW);
        }
        // B2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorB >=31) && (UltraSensorB <=60))
        {
          digitalWrite(LED_B2_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_B2_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15);  }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_B2_ping,LOW);
        }
        // B3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorB >=61) && (UltraSensorB <=100))
        {
          digitalWrite(LED_B3_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_B3_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15);   
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_B3_ping,LOW);
        }
        // B4———————————————————————————————————————————————————————————————————————————————————————————————
    //    if((UltraSensorB >=31) && (UltraSensorB <=10000))
    //    {
    //    digitalWrite(LED_B4_ping,HIGH);
    //    lcd.setCursor(10,2);
    //    lcd.print("4:");
    //    }
    //    else                // else turn the LED OFF
    //    {
    //      digitalWrite(LED_B4_ping,LOW);
    //    }
        // C SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
        // C1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorC <=30)
        {
          digitalWrite(LED_C1_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_C1_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15);
        }
        else
        {
          digitalWrite(LED_C1_ping,LOW);
        }
        // C2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorC >=31) && (UltraSensorC <=60)) // if distance is between 31 and 60 cm, turn the LED on
        {
          digitalWrite(LED_C2_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_C2_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15);  }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_C2_ping,LOW);
        }
        // C3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorC >=61) && (UltraSensorC <=100)) // if distance is between 61 and 100 cm, turn the LED on
        {
          digitalWrite(LED_C3_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_C3_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15);  }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_C3_ping,LOW);
        }
        // C4———————————————————————————————————————————————————————————————————————————————————————————————
    //    if((UltraSensorC >=31) && (UltraSensorC <=10000)) // turn the LED on when in range
    //    {
    //      digitalWrite(LED_C4_ping,HIGH);
    //      lcd.setCursor(5,0);
    //      lcd.print("4:");
    //    }
    //    else                // else turn the LED OFF
    //    {
    //      digitalWrite(LED_C4_ping,LOW);
    //    }
   
      // D SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
    // D1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorD <=30)// if distance is 30 cm or less, turn the LED on
        {
          digitalWrite(LED_D1_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_D1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15); 
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_D1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
        }
        // D2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorD >=31) && (UltraSensorD <=60))
        {
          digitalWrite(LED_D2_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_D2_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_D2_ping,LOW);
        }
        // D3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorD >=61) && (UltraSensorD <=100))
        {
          digitalWrite(LED_D3_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_D3_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else             
        {
          digitalWrite(LED_D3_ping,LOW);
        }
        // E SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
      // E1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorE <=30)// if distance is 30 cm or less, turn the LED on
        {
          digitalWrite(LED_E1_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_E1_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15); 
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_E1_ping,LOW);
          digitalWrite(BUZZER2, LOW);
        }
        // E2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorE >=31) && (UltraSensorE <=60))
        {
          digitalWrite(LED_E2_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_E2_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15);
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_E2_ping,LOW);
        }
        // E3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorE >=61) && (UltraSensorE <=100))
        {
          digitalWrite(LED_E3_ping,HIGH);
          digitalWrite(BUZZER2, HIGH);
          delay(15);
          digitalWrite(LED_E3_ping,LOW);
          digitalWrite(BUZZER2, LOW);
          delay(15);
        }
        else             
        {
          digitalWrite(LED_E3_ping,LOW);
        }
        // F SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
      // F1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorF <=30)// if distance is 30 cm or less, turn the LED on
        {
          digitalWrite(LED_F1_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_F1_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15); 
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_F1_ping,LOW);
          digitalWrite(BUZZER3, LOW);
        }
        // F2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorF >=31) && (UltraSensorF <=60))
        {
          digitalWrite(LED_F2_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_F2_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15);
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_F2_ping,LOW);
        }
        // F3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorF >=61) && (UltraSensorF <=200))
        {
          digitalWrite(LED_F3_ping,HIGH);
          digitalWrite(BUZZER3, HIGH);
          delay(15);
          digitalWrite(LED_F3_ping,LOW);
          digitalWrite(BUZZER3, LOW);
          delay(15);
        }
        else             
        {
          digitalWrite(LED_F3_ping,LOW);
        }
        // G SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
      // G1———————————————————————————————————————————————————————————————————————————————————————————————
        if(UltraSensorG <=30)// if distance is 30 cm or less, turn the LED on
        {
          digitalWrite(LED_G1_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_G1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15); 
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_G1_ping,LOW);
          digitalWrite(BUZZER1, LOW);
        }
        // G2———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorG >=31) && (UltraSensorG <=60))
        {
          digitalWrite(LED_G2_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_G2_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else                // else turn the LED OFF
        {
          digitalWrite(LED_G2_ping,LOW);
        }
        // G3———————————————————————————————————————————————————————————————————————————————————————————————
        if((UltraSensorG >=61) && (UltraSensorG <=200))
        {
          digitalWrite(LED_G3_ping,HIGH);
          digitalWrite(BUZZER1, HIGH);
          delay(15);
          digitalWrite(LED_G3_ping,LOW);
          digitalWrite(BUZZER1, LOW);
          delay(15);
        }
        else             
        {
          digitalWrite(LED_G3_ping,LOW);
        }
      }
     
      void SonarSensor(int trigPinSensor,int echoPinSensor)//it takes the trigPIN and the echoPIN
      {
        //generate the ultrasonic wave
      digitalWrite(trigPinSensor, LOW);// put trigpin LOW
      delayMicroseconds(2);// wait 2 microseconds
      digitalWrite(trigPinSensor, HIGH);// switch trigpin HIGH
      delayMicroseconds(10); // wait 10 microseconds
      digitalWrite(trigPinSensor, LOW);// turn it LOW again
     
      //read the distance
      duration = pulseIn(echoPinSensor, HIGH);// pulseIn returns how long the configured pin stays at the given level; here, how long echoPinSensor stays HIGH
      distance= (duration/2) / 29.1; // halve the round-trip duration, then divide by 29.1 µs/cm to get centimeters
      }
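The distance conversion at the end of SonarSensor() can be sanity-checked outside the Arduino. Here is a minimal JavaScript sketch of the same arithmetic, together with the LED bands used above (the function names are mine, not part of the sketch):

```javascript
// Convert an HC-SR04 echo pulse duration (microseconds) to distance (cm).
// The pulse covers the round trip, so halve it; sound travels roughly
// 1 cm per 29.1 µs at room temperature, matching the Arduino divisor.
function pulseToCm(durationMicros) {
  return (durationMicros / 2) / 29.1;
}

// Map a distance to the LED band used by the sketch for each sensor
// (e.g. A1/A2/A3 thresholds: <=30, 31-60, 61-100 cm).
function distanceBand(cm) {
  if (cm <= 30) return 1;   // near band: first LED + buzzer
  if (cm <= 60) return 2;   // middle band
  if (cm <= 100) return 3;  // far band
  return 0;                 // out of range: all LEDs off
}
```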


===<p style="font-family:helvetica">Sketch 11:  Arduino Mega + UltraSonicSensor + LCD TouchScreen </p> ===
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>


==<p style="font-family:helvetica">Semester 2</p> ==
 
[[File:Simu part 02.gif|left|1000px]]
 
<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 
===<p style="font-family:helvetica">Sketch 12:  Arduino Uno + P5serialcontrol + P5.js web editor = Code decrypter </p> ===
 
[[File:Codeword 01.png|300px|thumb|left|P]]
[[File:Codeword 02.png|300px|thumb|center|I]]
[[File:Codeword 03.png|300px|thumb|left|G]]
[[File:Codeword 04.png|300px|thumb|center|E]]
[[File:Codeword 05.png|300px|thumb|left|O]]
[[File:Codeword 06.png|300px|thumb|center|N]]
 
<br><br>
<br><br>
 
===<p style="font-family:helvetica">Sketch 13:  Arduino Uno + P5serialcontrol + P5.js web editor = Game </p> ===
 
[[File:01 Screen .png|300px|thumb|left|Stage 0<br>The subject is located too far away]]
[[File:02 Screen.png|300px|thumb|center|Stage 0<br>The subject is well located and will hold position to reach next stage]]
[[File:03 Screen.png|300px|thumb|left|Stage 1<br>The subject unlocked Stage 1 and will hold position to reach next stage ]]
[[File:04 Screen.png|300px|thumb|center|Stage 2<br>The subject unlocked Stage 2 and is located too close]]
[[File:06 Screen.png|300px|thumb|left|Stage 3<br>The subject unlocked Stage 3 and needs to get closer]]
[[File:07 Screen.png|300px|thumb|center|Transition Stage<br>The subject unlocked all stages and needs to wait for the countdown before the following steps]]
<br><br>
 
===<p style="font-family:helvetica">Sketch 14:  Arduino Uno + P5serialcontrol + P5.js web editor = Simplified interface</p> ===
 
[[File:Data Collector Sample 01.gif|400px|thumb|left]]
 
<br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br>
<br><br>
 
====<p style="font-family:helvetica">How to split serial data values when using more than one sensor</p>====
 
* Use Split: function https://p5js.org/reference/#/p5/split
* Pad example: https://hub.xpub.nl/soupboat/pad/p/Martin
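A minimal sketch of the split() approach linked above, assuming the Arduino prints one line per reading in the form "120,87" (two ultrasonic sensors; the format and the function name are assumptions, not the final protocol):

```javascript
// Parse one serial line such as "120,87" into an array of numbers.
// p5.js offers split(); plain String.prototype.split() behaves the same here.
function parseSensorLine(line) {
  return line.trim().split(',').map(Number); // split on commas, convert each piece
}

const [leftDistance, rightDistance] = parseSensorLine('120,87');
```

The same pattern extends to any number of sensors, as long as the Arduino sketch prints the readings comma-separated on a single line.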
 
 
[[File:Debug Martin 01.png|500px|thumb|left]]
[[File:Debug Martin 05.png|500px|thumb|center]]
[[File:Debug Martin 02.png|500px|thumb|left]]
[[File:Debug Martin 03.png|500px|thumb|center]]
[[File:Debug Martin 04.png|500px|thumb|left]]
[[File:Debug Martin 06.png|500px|thumb|center]]
 
<br><br><br><br><br><br><br><br><br><br><br><br><br><br>
<br><br><br><br><br><br><br><br><br><br><br><br><br><br>
 
 
===<p style="font-family:helvetica">Installation update</p>===
 
[[File:Installation Update 01.jpg|300px|thumb|left]]
[[File:Installation Update 02.jpg|300px|thumb|center]]
 
<br><br><br><br><br><br><br><br>


===<p style="font-family:helvetica">To do</p> ===


* Manage to store the data with WEB STORAGE API
** https://www.w3schools.com/JS/js_api_web_storage.asp
* Import Live data or Copy data from
** https://www.worldometers.info/
* Import Live data from stock market
** https://money.cnn.com/data/hotstocks/index.html
** https://www.tradingview.com/chart/?symbol=NASDAQ%3ALIVE
** https://www.google.com/finance/portfolio/972cea17-388c-4846-95da-4da948830b03
* Make a Bar graph
** https://openprocessing.org/sketch/1152792
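The Web Storage item on the to-do list could start from something like the sketch below: persisting the visitor's score between sessions. In the browser this would use `window.localStorage` (as in the w3schools link); a tiny in-memory shim stands in elsewhere. The key name `score` is an assumption.

```javascript
// Use localStorage when available (browser), otherwise a minimal in-memory shim.
const storage = (typeof localStorage !== 'undefined')
  ? localStorage
  : { _d: {}, setItem(k, v) { this._d[k] = String(v); }, getItem(k) { return this._d[k] ?? null; } };

function saveScore(points) {
  storage.setItem('score', String(points)); // Web Storage only stores strings
}

function loadScore() {
  return Number(storage.getItem('score') ?? 0); // default to 0 on first visit
}
```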


===<p style="font-family:helvetica">Stages Design</p> ===


Many stages (mini-levels) are being designed. They are all intended to evoke the different contexts and pretexts for which we perform daily micro-tasks to collect data.
<br>
The visitor can unlock the next step by successfully completing one or more tasks in a row. After a while, even if the action is not successful, a new step will appear with a different interface and instructions.  


The list below details the different stages being produced; they will follow each other randomly during the session:


** Money/Slot Machine
** Well-Being
** Healthcare
** Yoga
** Self-Management
** Stock Market Exchange
** Military interface
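The random stage sequencing described above can be sketched as follows (stage names are taken from the list; the rule of never repeating the current stage is an assumption for illustration):

```javascript
// The pool of mini-levels that follow each other randomly during a session.
const stages = ['Money/Slot Machine', 'Well-Being', 'Healthcare', 'Yoga',
                'Self-Management', 'Stock Market Exchange', 'Military interface'];

// Pick a random next stage, avoiding an immediate repeat of the current one.
function nextStage(current) {
  let next;
  do {
    next = stages[Math.floor(Math.random() * stages.length)];
  } while (next === current && stages.length > 1);
  return next;
}
```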


The visuals below illustrate their design.


[[File:Captcha 01.png|thumb|left|Captcha:<br>one action needed:
moving forward, backward or standing still;
next stage unlocks after the action is done,
with a short animation]]
[[File:Self Track 01.png|thumb|center|Self Tracking:<br>no interaction needed;
visitor must stand still until
one of the goals is completed]]
[[File:Self Track 02.png|thumb|left|Self Tracking:<br>no interaction needed;
visitor must stand still until
one of the goals is completed]]
[[File:Slot Machine 01.png|thumb|center|Slot Machine:<br>no interaction needed;
transition between 2 stages
that randomly determines the next stage;
displayed when nobody is detected]]
[[File:Social Live 01.png|thumb|left|Social Live:<br>no interaction needed;
visitor must stand still until
the money goal is completed]]
[[File:Stock Ticket 01.png|thumb|center|Stock Ticket:<br>no interaction needed;
displayed when nobody is detected]]
<br><br><br><br><br><br><br>
 
===<p style="font-family:helvetica">Stages Design with P5.js</p> ===
[[File:AllStages HomoData.png|400px|thumb|left]]
[[File:Homo Data 02.gif|400px|thumb|left|6 levels in a row, then randomized; more to come]]
[[File:Consolelog 01.gif|400px|thumb|center]]
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>
<br><br>


==<p style="font-family:helvetica"> Prototyping Resources</p> ==
*https://forensic-architecture.org/


==<p style="font-family:helvetica">Installation</p>==
====<p style="font-family:helvetica"> About the ESP8266 module</p> ====
 
The ESP8266 is a microcontroller IC with built-in Wi-Fi. It will allow us to connect the Arduino to the internet, so that the values obtained from the sensors can be received directly on a self-hosted webpage. From this same webpage, it would also be possible to control LEDs, motors, LCD screens, etc.
 
====<p style="font-family:helvetica"> Resources about the ESP8266 module</p>  ====
 
Kindly forwarded by Louisa:<br>
 
* https://www.youtube.com/watch?v=6hpIjx8d15s
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
* https://www.youtube.com/watch?v=dWM4p_KaTHY
* https://randomnerdtutorials.com/esp8266-web-server/
* https://electronoobs.com/eng_arduino_tut101.php
* http://surveillancearcade.000webhostapp.com/index.php (interface)


====<p style="font-family:helvetica">Which ESP8266 to buy</p> ====

* https://makeradvisor.com/tools/esp8266-esp-12e-nodemcu-wi-fi-development-board/
* https://randomnerdtutorials.com/getting-started-with-esp8266-wifi-transceiver-review/
* https://www.amazon.nl/-/en/dp/B06Y1ZPNMS/ref=sr_1_5?crid=3U8B6L2J834X0&dchild=1&keywords=SP8266%2BNodeMCU%2BCP2102%2BESP&qid=1635089256&refresh=1&sprefix=sp8266%2Bnodemcu%2Bcp2102%2Besp%2Caps%2C115&sr=8-5&th=1

===<p style="font-family:helvetica">Creating an elastic exhibition space</p> ===

[[File:ResponsiveSpaceSquare.gif|400px|thumb|left|Responsive Space Installation Simulation]]
[[File:Responsive Space (detail).png|400px|thumb|center|Responsive Space (detail)]]
[[File:Spectator friendly physical exhibition space.png|600px|thumb|left]]
[[File:Moving Wall Structure Shema 1.gif|400px|thumb|center|Moving Wall Structure Shema 1]]

<br><br><br><br>


===<p style="font-family:helvetica">Resources</p> ===


=<p style="font-family:helvetica">Venues</p> =
==<p style="font-family:helvetica"> Introduction </p>==
* [https://pad.xpub.nl/p/20210928_xpub2 PAD]
We will organize two moments of shared work-in-progress: one in October, one in November.
These are not necessarily presentations, but rather conversation-based sessions to practice the act of "making public":
speaking about our work, having conversations with people about it, and hearing people talk about our work.


==<p style="font-family:helvetica">Venue 1: Aquarium </p>==


<br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>
==<p style="font-family:helvetica">Venue 2: Aquarium 2.0 </p>==
===<p style="font-family:helvetica">Description</p>===
<br>
Date 29th Nov — 4th Dec 2021 <br>
Time 15h — 18h (window visible all day)<br>
Location: De Buitenboel, Rosier Faassenstraat 22 3025 GN Rotterdam, NL<br>
<br><br>
AQUARIUM 2.0 <br>
<br>
An ongoing window exhibition with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann<br>
<br>
Tap upon the glass and peer into the research projects we are currently working on.
From Monday 29th of November until Saturday 4th of December we put ourselves on display in the window of De Buitenboel as an entry point into our think tank. Navigating between a range of technologies, such as wireless radio waves, virtual realities, sensors, ecological and diffractive forms of publishing, web design frameworks, language games, and an ultra-territorial residency; we invite you to gaze inside the tank and float with us. Welcome back to the ecosystem of living thoughts.<br>
==<p style="font-family:helvetica">Aquarium LCD Portal (29 Nov – 4th Dec)</p>==
This interactive micro-installation, composed of an LCD screen and sensor(s), invites users/visitors to change the color of the screen and the displayed messages by moving closer to or further from the window.
[https://www.pzwart.nl/blog/2021/11/30/aquarium-2-0/ Link]
[[File:ScreenPortalFront.jpg|300px|thumb|left|ScreenPortalFront]]
[[File:ScreenPortalback.jpg|300px|thumb|right|ScreenPortalback]]
[[File:LCDScreenTest.gif|600px|thumb|center|LCDScreenTest]]
<br><br><br><br><br><br><br><br><br><br><br><br><br>


=<p style="font-family:helvetica">Readings (new)(english)(with notes in english) </p>=
* Library vs Exhibition Space = Use vs Display
* Book-theme exhibitions
==<p style="font-family:helvetica">About User vs Visitor, or user in exhibition space</p>==
[[Designing the user experience in exhibition spaces - Elisa Rubegni, Caporali Maurizio, Antonio Rizzo, Erik Grönvall]]
* What are the GUI intentions
* What is the WIMP interaction model
* What are the post-Wimp models
* About Memex


==<p style="font-family:helvetica">About User Interface</p>==
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
Grenoble, 21 December 2001.
→&nbsp;9.&nbsp;[[Graspable interfaces (Fitzmaurice et al., 1995)]] [https://www.dgp.toronto.edu/~gf/papers/PhD%20-%20Graspable%20UIs/Thesis.gf.html link]


==<p style="font-family:helvetica">About User Condition</p>==
→&nbsp;&nbsp;http://all-html.net/?<br>


===<p style="font-family:helvetica">More references to check (from  THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge) </p>===
<div style='
width: 75%; 
font-size:16px;
background-color: white;
color:black;
float: left;
border:1px black;
font-family: helvetica;
'>


* Alexander, Edward P.<br>
1997 The Museum in America. Walnut Creek,<br>
CA: AltaMira Press.<br>
<br>
* Ambrose, Timothy, and Crispin Paine<br>
2006 Museum Basics. 2nd edition. London:<br>
Routledge.<br>
<br>
* Ames, Kenneth L., Barbara Franco, and L. Thomas Frye<br>
1997 Ideas and Images: Developing Interpretive History Exhibits. Walnut Creek,<br>
CA: AltaMira Press.<br>
<br>
* Ames, Michael<br>
1992  Cannibal  Tours and Glass  Boxes:  The Anthropology of Museums.  2nd edition.<br>
Vancouver:  University  of British Columbia Press.<br>
<br>
* Barringer, Tim, and Tom Flynn, eds.<br>
1997 Colonialism and  the Object: Empire,  <br>
Material Culture  and the  Museum. London:<br>
Routledge.<br>
<br>
* Belcher,  Michael<br>
1991 Exhibitions in Museums. <br>
Leicester:  Leicester Museum Studies.<br>
<br>
* Bennett,  Tony<br>
1995 The  Birth  of the Museum.<br> 
London:  Routledge.<br>
<br>
* Black, Graham<br>
2005 The Engaging  Museum. London: Routledge.<br>
<br>
* Bouquet, Mary, ed.<br>
2001 Academic Anthropology and the Museum.<br>
New York: Berghahn Books.<br>
<br>
* Caulton,  Tim<br>
1998 Hands  on Exhibitions: Managing  Interactive Museums  and Science  Centres.<br>
London:  Routledge.<br>
<br>
*Coombes,  Annie<br>
1994  Reinventing  Africa: Museums, Material Culture  and  Popular Imagination  in Late Victorian  and Edwardian  England.<br>
New Haven: Yale  University  Press.<br>
<br>
*Dean,  David<br>
1997  Museum  Exhibition:  Theory and  Practice.<br>
London:  Routledge.<br>
<br>
*Dubin, Steven<br>
1999 Displays of Power: Memory and Amnesia in the American Museum. New York: New York University Press.<br>
<br>
*2006  Transforming  Museums: Mounting  Queen Victoria  in a Democratic  South  Africa.<br>
New York: Palgrave Macmillan.<br>
<br>
*Falk, John H.,  and Lynn Dierking<br>
2000 Learning from Museums:  Visitor Experiences  and  the  Making of Meaning.<br> 
Walnut Creek, CA: AltaMira  Press.<br>
<br>
*Fienup-Riordan,  Anne<br>
2005 Yup'ik Elders  at the  Ethnologisches Museum<br>
Berlin: Fieldwork Turned on Its Head. Seattle: University of Washington Press.<br>
<br>
*Hein,  George<br>
1998 Learning  in the  Museum.  London:  Routledge.<br>
<br>
* Henderson, Amy, and Adrienne Kaeppler<br>
1997 Exhibiting  Dilemmas:  Issues  of Representation at the Smithsonian.<br>
Washington, DC:  Smithsonian Institution Press.<br>
<i>"In twelve essays on such diverse Smithsonian Institution holdings as the Hope Diamond, the Wright Flyer, wooden Zuni carvings, and the Greensboro, North Carolina Woolworth lunch counter that became a symbol of the Civil Rights movement, Exhibiting Dilemmas explores a wide range of social, political, and ethical questions faced by museum curators in their roles as custodians of culture."</i>
<br>
*Hooper-Greenhill,  Eileen<br>
1991 Museum  and  Gallery Education.<br>
Leicester:Leicester  University  Press.<br>
<br>
*1992  Museums and the Shaping of Knowledge.<br>
London:  Routledge.<br>
*1994 Museums  and Their Visitors.<br> 
London: Routledge.<br>
<br>
*2001 Cultural Diversity:  Developing  Museum Audiences  in Britain.<br>
Leicester:  Leicester University  Press.<br>
*Kaplan,  Flora  E. S.<br>
1995 Museums  and  the Making of 'Ourselves.'<br>
Leicester:  Leicester  University  Press.<br>
<br>
*Karp,  Ivan, and Steven  D.  Lavine<br>
1991 Exhibiting  Cultures:  The  Poetics  and  Politics  of Museum  Display.<br>
Washington,  DC: Smithsonian Institution Press.<br>
<br>
* Kreps, Christina  F.<br>
2003 Liberating  Culture:  Cross-Cultural Perpectives on Museums,  Curation,  and heritage  Preservation.<br>
London: Routledge.<br>
<br>
* Lindauer,  Margaret<br>
2006 The  Critical Museum  Visitor. <br>
In New Museum Theory and Practice: An Introduction. J. Marstine, ed. Pp. 203-225.<br>
Malden: Wiley-Blackwell.<br>
<br>
* Lord, Barry, and Gail Lord, eds.<br>
2002 The Manual of Museum Exhibitions. <br>
Walnut Creek, CA: AltaMira Press.<br>
<br>
* Macdonald,  Sharon,  ed.<br>
1998 The Politics of Display.  London:  Routledge.<br>
<br>
* Macdonald,  Sharon,  and Gordon  Fyfe<br>
1996  Theorizing  Museums.  Oxford: Blackwell.<br>
* MacGregor,  Arthur<br>
<br>
2007 Curiosity  and Enlightenment:  Collecting and  Collections  from the  Sixteenth to the Nineteenth  century.<br>
New Haven:  Yale University  Press.<br>
<br>
* Macleod,  Suzanne,  ed.<br>
2005 Reshaping  Museum Space: Architecture,Design, Exhibitions.  London:  Routledge.<br>
<br>
* Mcloughlin,  Moira<br>
1999 Museums  and  the Representation  of Native Canadians.<br>
New York: Garland Publishing.<br>
<br>
* Metzler,  Sally<br>
2008 Theatres  of Nature:  Dioramas  at the Field Museum.<br>
Chicago:  Field  Museum of Natural History.<br>
<br>
* Moore,  Kevin<br>
1997 Museums and  Popular  Culture.<br>
London: Cassell.<br>
<br>
* Moser,  Stephanie<br>
1999 The Dilemma of Didactic Displays: Habitat Dioramas, Life-Groups and Reconstructions of the Past. In Making Early Histories in Museums.<br>
N. Merriman,  ed.  Pp. 65-116.<br>
London: Cassell/Leicester University  Press.<br>
<br>
* 2001 Archaeological Representation: The Visual Conventions for Constructing Knowledge about the Past. In Archaeological Theory Today.<br>
I. Hodder, ed. Pp. 262-283.<br>
Cambridge: Polity Press.<br>
<br>
* 2003 Representing Human Origins:  Constructing  Knowledge  in Museums  and Dismantling  the Display  Canon.<br>
Public Archaeology 3(1): 1-17.<br>
<br>
* 2006 Wondrous Curiosities: Ancient Egypt at the British Museum. Chicago: University of Chicago Press.<br>
<br>
* 2008 Archaeological Representation: The Consumption and Creation of the Past.<br>
In Oxford Handbook of Archaeology. B. Cunliffe and C. Gosden, eds. Pp. 1048-1077.<br>
Oxford: Oxford University Press.<br>
<br>
* Pearce, Susan M., ed.<br>
1994 Interpreting Objects and Collections.<br>
London: Routledge, Leicester Readers in Museum Studies.<br>
<br>
* Pearce, Susan M.<br>
1998 Museums,  Objects  and Collections.<br>
Leicester: Leicester  University  Press.<br>
<br>
* Peers,  Laura,  and Alison  K. Brown,  eds.
<br>
2003  Museums  and  Source  Communities: A Routledge  Reader.  London:  Routledge.<br>
<br>
* Quinn, Stephen  C.<br>
2006 Windows on Nature: The Great Habitat Dioramas of the American Museum of Natural History.<br>
New York: Harry N. Abrams.<br>
<br>
* Roberts,  Lisa C.<br>
1997 From Knowledge to Narrative: Educators and the Changing Museum. Washington,<br>
DC: Smithsonian Institution Press.<br>
<br>
* Sandell, Richard,  ed.<br>
2002 Museums, Society, Inequality.<br>
London: Routledge.<br>
<br>
* Scott,  Monique<br>
2007 Rethinking  Evolution  in the  Museum:  Envisioning African Origins.<br>
London: Routledge.<br>
<br>
* Serrell, Barbara<br>
1996 Exhibit  Labels: An Interpretive Approach.<br>
Walnut Creek, CA: AltaMira Press.<br>
<br>
2006 Judging Exhibitions: A Framework for Excellence. Walnut Creek,<br>
CA: Left Coast Press.<br>
<br>
* Sheets-Pyenson,  Susan<br>
1988 Cathedrals of Science: The Development of Colonial Natural History Museums During the Late Nineteenth Century.<br>
Ontario: McGill-Queen's University Press.<br>
* Simpson,  Moira<br>
1996 Making  Representations:  Museums  in the Post-Colonial  Era. <br>
London:  Routledge.<br>
<br>
* Spalding, Julian<br>
2002 The Poetic  Museum:  Reviving  Historic Collections. <br>
London:  Prestel.<br>
<br>
* Swain, Hedley<br>
2007 An Introduction  to Museum  Archaeology.<br>
Cambridge: Cambridge University Press.<br>
<br>
* Vergo, Peter, ed.<br>
1990  The  New Museology. 
London:  Reaktion Books.<br>
<br>
* Walsh,  Kevin<br>
1992 The Representation of the Past: Museums and  Heritage  in the Post-Modern  World.<br>
London:  Routledge.<br>
<br>
* Witcomb,  Andrea<br>
2003 Re-Imagining  the Museum: Beyond  the Mausoleum.<br>
London:  Routledge.<br>
<br>
* Yanni,  Carla<br>
2005 Nature's Museums: Victorian Science and the Architecture of Display.<br>
Princeton: Princeton Architectural  Press<br>
<br>
</div>

Revision as of 10:56, 1 May 2022

Links

Draft Thesis

What do you want to make?

My project is a data collection installation that monitors people's behaviors in public physical spaces while explicitly encouraging them to help the algorithm collect more information. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.

The way the device is designed doesn't pretend to offer any beneficial outcome to the subject; it only makes visible the benefits that the machine gets from collecting their data. Yet the device presents this collected data, visually and verbally, in a gratifying way, which might be stimulating for the subject. In that sense the subject, despite knowing that their actions serve solely to satisfy the device, could become intrigued, involved, or even addicted by a mechanism that deliberately uses them as a commodity. In this way, I intend to trigger conflicting feelings in the visitor's mind, situated between a state of awareness regarding the ongoing monetization of their physical behaviors, and a state of engagement/entertainment/stimulation regarding the interactive value of the installation.

My first desire is to make the mechanisms by which data collection is carried out, marketized and legitimized both understandable and accessible. The array of sensors, the Arduinos and the screen are the main technological components of this installation. Rather than using an already existing and complex tracking algorithm, the program is built from scratch, kept open source, and limits itself to converting a restricted range of physical actions into interactions. These include the detection of movements, positions, the lapse of time spent standing still or moving, and entry into or exit from a specific detection area. Optionally, they may also include the detection of the subject's smartphone, or the subject logging on to a local Wi-Fi hotspot.

In terms of mechanics, the algorithm creates feedback loops starting from:
_the subject's behaviors being converted into information;
_the translation of this information into written/visual feedback;
_and the effects of this feedback on the subject's behavior; and so on.
By doing so, it tries to shape visitors into free data providers inside their own physical environment, and to stimulate their engagement by converting each piece of collected information into points/money, feeding a user score within a global ranking.

On the screen, displayed events can be:

_ “subject [] currently located at [ ]”
[x points earned/given]
_ “subject [] entered the space”
[x points earned/given]
_ “subject [] left the space”
[x points earned/given]
_ “subject [] moving/not moving”
[x points earned/given]
_ “subject [] distance to screen: [ ] cm”
[x points earned/given]
_ “subject [] stayed at [ ] since [ ] seconds”
[x points earned/given]
_ “subject [] device detected”
[x points earned/given] (optional)
_ “subject [] logged onto local Wi-Fi”
[x points earned/given] (optional)

Added to that come the instructions and comments from the device in reaction to the subject's behaviors:

_ “Congratulations, you have now given the monitor 25 % of all possible data to collect!”
[when 25-50-75-90-95-96-97-98-99% of the total array of events has been detected at least once]
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot!”
[if the subject stands still in a specific location]
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”
[unlocked at x points earned/given]
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers!”
[if the subject stands still in a specific location]
_ “Leaving already? The monitor still has 304759 crucial pieces of information to collect from you!”
[if the subject is at the edge of the detection range]
_ “You are only 93860 pieces of information away from being the top one data-giver!”
[unlocked at x points earned/given]
_ “Statistics show that people staying for more than 5 minutes average will benefit me on average 10 times more!”
[randomly appears]
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”
[if the subject stands still for a long time in any location]

Responding positively to the monitor's instructions unlocks special achievements and extra points:

—Accidental data-giver badge
[unlocked if the subject has passed the facility without deliberately wishing to interact with it] + [x points earned/given]
—Lazy data-giver badge
[unlocked if the subject has been standing still for at least one minute] + [x points earned/given]
—Novice data-giver badge
[unlocked if the subject has successfully completed 5 missions from the monitor] + [x points earned/given]
—Hyperactive data-giver badge
[unlocked if the subject has never stood still for 10 seconds within a 2-minute lapse of time] + [x points earned/given]
—Expert data-giver badge
[unlocked if the subject has successfully completed 10 missions from the monitor within 10 minutes] + [x points earned/given]
—Master data-giver badge
[unlocked if the subject has successfully logged on to the local Wi-Fi hotspot] + [x points earned/given] (optional)
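The badge rules listed above can be sketched as a small check function. The thresholds mirror the text (1 minute standing still, 5 and 10 completed missions, Wi-Fi login); the input data shape is an assumption for illustration:

```javascript
// Return the badges earned from a summary of the subject's session.
function earnedBadges({ stillSeconds = 0, missionsDone = 0, missionMinutes = Infinity, wifiLogged = false }) {
  const badges = [];
  if (stillSeconds >= 60) badges.push('Lazy data-giver');          // 1+ minute standing still
  if (missionsDone >= 5) badges.push('Novice data-giver');         // 5 completed missions
  if (missionsDone >= 10 && missionMinutes <= 10) badges.push('Expert data-giver'); // 10 in 10 minutes
  if (wifiLogged) badges.push('Master data-giver');                // optional Wi-Fi login
  return badges;
}
```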

On the top left side of the screen, a user score displays the number of points generated by the collected pieces of information and by the unlocking of special achievements instructed by the monitor.

—Given information: 298 pieces
[displays number of collected events]
—Points: 312000
[conversion of collected events and achievement into points]

On the top right of the screen, the user is ranked among x number of previous visitors, and the most prestigious badge recently earned is displayed below:

—subject global ranking: 3/42
[compares subject’s score to all final scores from previous subjects]
—subject status: expert data-giver
[displays the most valuable reward unlocked by the subject]
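The global ranking described above (the subject's score compared to all final scores from previous subjects) can be sketched as follows; the data shape is an assumption:

```javascript
// Rank a score against the final scores of previous subjects.
// rank 1 = highest score; ties rank above the current subject's score.
function globalRanking(score, previousScores) {
  const higher = previousScores.filter(s => s > score).length;
  return { rank: higher + 1, outOf: previousScores.length + 1 };
}
```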

When leaving the detection range, the subject gets a warning message and a countdown starts, encouraging them to quickly decide to come back:

—“Are you sure you want to leave? You have 5-4-3-2-1-0 seconds to come back within the detection range”
[displayed as long as the subject remains completely undetected]

If the subject remains outside the detection range for more than 5 seconds, the monitor will address a thankful message, and the amount of money gathered, the achievements, the ranking, the complete list of collected information and a QR code will be printed as a receipt with the help of a thermal printer. The QR code will link to my thesis.

—“Thank you for helping today, don’t forget to take your receipt in order to collect and resume your achievements”
[displayed after 5 seconds being undetected]

In order to collect, read and/or use that piece of information, the visitor will inevitably have to come back within the range of detection and, intentionally or not, reactivate the data-tracking game. It is therefore impossible to leave the detection area without leaving behind at least one piece of one's own information printed in the space. Because of this, the physical space should gradually be invaded by tickets scattered on the floor. As in archaeology, these tickets leave a precise trace of the behavior and actions of previous subjects for future subjects.

Why do you want to make it?

When browsing online and/or using connected devices in the physical world, even the most innocent action or piece of information can be invisibly recorded, valued and translated into informational units, subsequently generating profit for monopolistic companies. While social platforms, brands, public institutions and governments explicitly promote the use of monitoring practices in order to better serve or protect us, we could also consider these techniques implicitly political, playing on dynamics of visibility and invisibility in order to assert new forms of power over targeted audiences.

In the last decade, a strong mistrust of new technologies has formed in public opinion, fueled by events such as Edward Snowden's revelations, the Cambridge Analytica scandal or the proliferation of fake news on social networks. We have also seen many artists take up the subject, sometimes with activist purposes. But even if a small number of citizens have begun to consider the social and political issues related to mass surveillance, and some individuals/groups/governments/associations have taken legal action, surveillance capitalism still remains generally accepted, often because it is ignored and/or misunderstood.

Thanks to the huge profits generated by the data that we freely provide every day, big tech companies have been earning billions of dollars from the sale of our personal information. With that money, they have also been able to further develop deep machine learning programs and powerful recommendation systems, and to broadly expand their range of services in order to track us in all circumstances and secure their monopolistic status. Even if we might consider this realm specific to the online world, we have seen a gradual push by the same companies to monitor the physical world and our human existences in a wide array of contexts: for example, with satellite and street photography (Google Earth, Street View), geolocalization systems, simulated three-dimensional environments (augmented reality, virtual reality or the metaverse), or extensions of our brains and bodies (voice assistants and wearable devices). Ultimately, this reality has seen the emergence not only of a culture of surveillance but also of self-surveillance, as evidenced by the popularity of self-tracking and data-sharing apps, which legitimize and encourage the datafication of the body for capitalistic purposes.

For the last 15 years, self-tracking tools have made their way to consumers. I believe that this trend shows how ambiguous our relationship can be with tools that allow such practices. Through my work, I do not wish to position myself as a whistleblower, a teacher or an activist. Indeed, to adopt such positions would be hypocritical, given my daily use of tools and platforms that resort to mass surveillance. Instead, I wish to propose an experience that highlights the contradictions in which you and I, internet users and human beings, can find ourselves. This contradiction is characterized by a paradox between our state of concern about the intrusive surveillance practices operated by the Web giants (and their effects on societies and humans) and a state of entertainment, or even active engagement, with the tools/platforms through which this surveillance is operated/allowed. By doing so, I want to ask how these companies still manage to get our consent, and what human biases they exploit in order to do so. That is how my graduation work and my thesis will investigate the effects of gamification, gambling and reward systems, as well as the aestheticization of data/self-data, as means to hook our attention, create ever more interactions and orient our behaviors.


How do you plan to make it, and on what timetable?

I am developing this project with Arduino Uno/Mega boards, an array of ultrasonic sensors, P5.js and screens.

How does it work?

The ultrasonic sensors can detect obstacles in a physical space and measure the distance between the sensor and the obstacle(s) by sending an ultrasound pulse and receiving its echo. The Arduino Uno/Mega boards are microcontrollers which can receive this information and run it through a program in order to convert these values into mm/cm/m, but also to map the space into an invisible grid. Ultimately, values collected on the Arduino's serial monitor can be sent to P5.js through P5.serialcontrol. P5.js then allows greater freedom in the way the information can be displayed on the screens.
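The conversion chain described here can be sketched in JavaScript. The 29.1 µs/cm constant matches the Arduino sketch earlier on this page (the echo time is a round trip, hence the division by two); the 50 cm zone width for the invisible grid is an assumption:

```javascript
// Convert an echo round-trip time (microseconds) into a distance in cm.
function echoToCm(durationMicros) {
  return (durationMicros / 2) / 29.1; // half the round trip, at ~29.1 µs of sound travel per cm
}

// Map a distance onto a coarse zone grid: 0 = closest band, 1 = next, ...
function cmToZone(cm, zoneWidthCm = 50) {
  return Math.floor(cm / zoneWidthCm);
}
```

In the installation, the Arduino side would do the first conversion and P5.js (fed via P5.serialcontrol) the zone mapping, but the arithmetic is the same either way.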

Process:

1st semester: Building a monitoring device, converting human actions into events, and events into visual feedback

During the first semester, I am focused on exploring monitoring tools that can be used in the physical world, with specific attention to ultrasonic sensors. Being new to Arduino programming, my way of working is to start from the smallest and simplest prototype and gradually increase its scale/technicality until reaching human/architectural scale. Prototypes are subject to testing, documentation and comments, helping to define which direction to follow. The first semester also allows me to experiment with different kinds of screens (LCD screens, touch screens, computer screens, TV screens) until finding the most adequate screen monitor(s) for the final installation. Before building the installation, the project is subject to several sketches and animated 3D simulations, exploring different scenarios and narrations. At the end of the semester, the goal is to be able to convert a specific range of human actions into events and visual feedback, creating a feedback loop from the human behaviors being converted into information; the translation of this information into written/visual feedback; the effects of this feedback on human behavior; and so on.

2nd semester: Implementing gamification with the help of collaborative filtering, a point system and ranking.

During the second semester, the focus is on building and implementing a narrative with the help of gaming mechanics that encourage humans to feed the data-gathering device themselves. An overview of how it works is presented here in the project proposal and will be subject to further developments in the practice.

To summarize the storyline: the subject positioned in the detection zone finds herself/himself unwillingly cast as the main actor of a data collection game. Her/his mere presence generates a number of points/dollars displayed on a screen, growing as she/he stays within the area. The goal is simple: to get a higher score/rank and unlock achievements by acting as recommended by a data collector. This is done by setting clear goals/rewards for the subject, comparing her/his performance to that of all previous visitors, giving unexpected messages/rewards, and giving an aesthetic value to the displayed information.

The mechanism is based on a sample of physical events already explored during the first semester of prototyping (detection of movements, positions, lapses of time spent standing still or moving, and entry into or exit from a specific detection area). Every detected event in this installation is stored in a data bank and, with the help of collaborative filtering, allows the display of custom recommendations such as:

_ “Congratulations, you have now given the monitor 12 % of all possible data to collect”
_ “Are you sure you don’t want to move to the left? The monitor has only collected data from 3 visitors so far in this spot”
_ “Congratulations, the monitor has reached 1000 pieces of information from you!”
_ “If you stay there for two more minutes, there is a 99% chance you will be in the top 100 of ALL TIME data-givers”
_ “Leaving already? The monitor still has 304759 crucial pieces of information to collect from you”
_ “You are only 93860 actions away from being the top one data-giver”
_ “Statistics show that people staying for more than 5 minutes on average will be 10 times more beneficial to me”
_ “The longer you stay on this spot, the more chance you have to win a “Lazy data-giver” badge”
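A minimal sketch of the point/ranking mechanics behind such messages (the weights and the comparison rule are hypothetical, not the final design):

```javascript
// Hypothetical scoring: time in the zone dominates, each logged event adds a point.
function scoreFor(secondsInZone, eventsLogged) {
  return secondsInZone * 10 + eventsLogged;
}

// Rank 1 = best: count how many previous visitors scored higher.
function rankAmong(myScore, previousScores) {
  return previousScores.filter(s => s > myScore).length + 1;
}
```

Messages like the “top 100 of ALL TIME data-givers” line above would be generated by comparing the live rank against thresholds in the stored visitor history.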


The guideline is set out here, but will be constantly updated based on experiments and on the results observed during the various moments of interaction between the students and the algorithm. For this purpose, the installation under construction will be left active and autonomous in its place of conception (the studio), allowing anyone who deliberately wishes to interact with it to do so. Beyond the voluntary interactions, my interest is also in seeing what can be extracted from people simply passing in front of the installation. In addition, some of the mechanics of the installation will be further explored by collaborating with other students and setting up more ephemeral, organized experiences with participants (e.g. 15 February 2022 with Louisa).

This semester also includes the creation of a definite set of illustrations that engage the participants of the installation in a more emotional way; the illustrations will be made by an illustrator/designer with whom I usually collaborate.

3rd semester: Building the final installation for the final assessment and graduation show. Test runs, debugging and final touches.

During the third semester, the installation should be set up in the school, in the alumni area next to the XPUB studio, for the final assessment, and ultimately installed again at WORM for the graduation show. I am interested in placing this installation in privileged spaces of human circulation (such as hallways) that more easily involve the detection of people and highlight the intrusive aspect of such technologies. The narrative, mechanics, illustrations and graphic aspects should be finalized at the beginning of the 3rd semester, and then subjected to intense test runs throughout that period until the deadline.

Relation to larger context

As GAFAM companies face more and more legal issues and are held accountable for a growing number of social and political issues around the world, the pandemic context has greatly contributed to making all of us more dependent than ever on the online services provided by these companies, and to somehow forcing our consent. While two decades of counter-terrorism measures legitimized domestic and public surveillance techniques such as online and video monitoring, the current public health crisis has made the use of new technologies even more necessary for regulating access to public spaces and services, but also for socializing, working together, accessing culture, etc. In many countries, from one day to the next and for an undetermined time, it has become necessary to carry a smartphone (or a printed QR code) in order to access transport, entertainment, cultural and catering services, but also to do simple things such as looking at the menu in a bar/restaurant or placing an order. Thus, this project takes place in a context where techno-surveillance has definitely taken a determining place in the way we can access spaces and services in the physical world.

Data Marketisation / Self Data: Quantified Self / Attention Economy / Public Health Surveillance / Cybernetics


Relation to previous practice

During my previous studies in graphic design, I first engaged with new media by making a small online reissue of Raymond Queneau’s book Exercices de Style. In this issue, called Incidences Médiatiques (2017), the user/reader was encouraged to explore the 99 different versions of the same story written by the author in a less linear way. The idea was to consider each graphical user interface as a unique reading context: it determined which story could be read, depending on the device used by the reader, and the user could navigate through these stories by resizing the Web window, changing browser or using a different device.

As part of my graduation project, Media Spaces (2019), I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a web-to-print website. Subsequently, this website was translated into physical space as a printed book and a series of installations displayed in an exhibition space that followed the online structure of my thesis (home page, index, parts 1-2-3-4). In that way, I was interested in inviting visitors to physically experience some aspects of the Web.

As a first-year student of Experimental Publishing, I continued to work in that direction by creating a meta-website called Tense (2020), which aimed to display the invisible HTML <meta> tags inside an essay in order to affect our interpretation of the text. In 2021, I worked on a geocaching pinball game highlighting invisible Web events, and on a Web oscillator whose amplitude and frequency range were directly related to the user’s cursor position and browser screen size.

While it has always been clear to me that these works were motivated by the desire to define media as context, subject and/or content, the projects presented here have often made use of surveillance tools to detect and translate user information into feedback, participating in the construction of an individualized narrative and/or a unique viewing/listening context (interaction, screen size, browser, mouse position). The current work aims to take a critical look at the effect of these practices in the context of techno-surveillance.

Similarly, projects such as Media Spaces have sought to explore the growing confusion between human and web user, physical and virtual space, or online and offline spaces. This project will demonstrate that these growing confusions may eventually lead us to be tracked in all circumstances, even in our most innocuous daily activities.


Selected References

Works:

« Invites us to take on the role of an auditor, tasked with addressing the biases in a speculative AI » — alternatives to techno-surveillance.

Exposes humans as producers of useful intellectual labor that benefits the tech giants, and the use that can be made of that labor.

Claims that technological devices can be manipulated easily and, hence, that they are fallible and subjective. They do this by simply placing a self-tracker (a connected bracelet) in another context, such as on other objects, in order to confuse these devices.

Allows galleries to enjoy encrypted internet access and communications, through a Tor Network

You are rewarded for exploring all the interactive possibilities of your mouse, revealing how our online behaviors can be monitored and interpreted by machines.

A portrait of the viewer is drawn in real time by active words, which appear automatically to fill his or her silhouette. https://www.lozano-hemmer.com/third_person.php

« Every visitor’s browser size, collected and played back sequentially, ending with your own. »

Readings of the building and its contents are therefore always unique -- no two visitors share the same experience. https://haque.co.uk/work/mischievous-museum/

Books & Articles:

  • SHOSHANA ZUBOFF, The Age of Surveillance Capitalism (2020)

Warns against the shift towards «surveillance capitalism». Her thesis argues that, by appropriating our personal data, the digital giants manipulate us and modify our behavior, attacking our free will and threatening our freedoms and personal sovereignty.

  • EVGENY MOROZOV, Capitalism’s New Clothes (2019)

An extensive analysis and critique of Shoshana Zuboff’s research and publications.

  • BYRON REEVES AND CLIFFORD NASS, The Media Equation, How People Treat Computers, Television, and New Media Like Real People and Places (1996)

A precursor study of the relation between humans and machines, and of how humans relate to them.

  • OMAR KHOLEIF, Goodbye, World! — Looking at Art in the digital Age (2018)

The author shares his own data as a journal in one part of the book, while in another part he questions how the Internet has changed the way we perceive, relate to and interact with images.

  • KATRIN FRITSCH, Towards an emancipatory understanding of widespread datafication (2018)

Suggests that, in response to our society of surveillance, artists can propose activist responses that do not necessarily involve technological literacy, but can instead promote strong counter-metaphors and/or counter-uses of these intrusive technologies.

Prototyping

Arduino

An early sketch comparing and questioning our spectator experience of a physical exhibition space (where everything is often fixed and institutionalized) with our user experience of a Web space (where everything is far more elastic, unpredictable and obsolete). I am interested in how differently the same Web page can be rendered to different users depending on technological contexts (device type, browser, IP address, screen size, zoom level, default settings, updates, luminosity, add-ons, restrictions, etc.). I would like to create a physical exhibition space/installation inspired by the technology of a Web user window interface, in order to play with exhibition parameters such as the distance between the spectator and the artwork, the circulation in space, the luminosity/lighting of the artwork(s), the sound/acoustics, etc.

The distance between the wall behind the spectator and the artwork has to be translated into a variable that can affect sound or light in the room. The wall position could be connected to the dimensions of a user interface in real time with an Arduino and a motor.

Create a connected telemeter with an Arduino, an ultrasonic sensor (HC-SR04) and an ESP8266 module connected to the Internet.

It seems possible to create your own telemeter with an Arduino by implementing an HC-SR04 ultrasonic sensor.
By doing so, the values captured by the sensor could potentially be translated directly into a variable.
Then, with the ESP8266 module, the values could be sent to a database on the Internet. I could then open that website, see the values from anywhere, and use them to control light, sound or anything else I wish.
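No real service is decided yet; as a sketch only, a reading could be serialized into a request URL that the connected module fetches. The host and parameter names below are placeholders:

```javascript
// Build the URL a connected module could request to log one reading.
// Host and query parameters are hypothetical, not a real endpoint.
function buildLogUrl(sensorId, distanceCm) {
  return "http://example.org/log?sensor=" + sensorId + "&cm=" + distanceCm;
}
```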

Tool/Material list:

  • Telemeter (used to get the distance between the device and an obstacle)
  • Rails
  • Handles
  • Wheels
  • Movable light wall
  • Fixed walls
  • USB Cable
  • Connexion cables
  • Arduino
  • ESP8266
Connexion cables (Arduino)
USB Cable
Arduino
HC-SR04 Ultrasonic Sensor
Plywood x 3
Handle
ESP8266
Rail

About the ultrasonic Sensor (HC-SR04)

Characteristics

Here are a few of the technical characteristics of the HC-SR04 ultrasonic sensor:

  • Power supply: 5v.
  • Consumption in use: 15 mA.
  • Distance range: 2 cm to 4 m.
  • Resolution or accuracy: 3 mm.
  • Measuring angle: < 15°.

More info about the sensor here and here.

Where to buy the ultrasonic Sensor (HC-SR04)

Prototype 1 : Arduino + Resistor

During a workshop, we started with a very basic fake Arduino kit, an LED, a motor and a sensor. After making a few connections, we began to understand a bit how it works.


   #include <Servo.h>
   Servo myservo;  // create servo object to control a servo
   int pos = 0;    // variable to store the servo position
   int ldr = 0;    // variable to store light intensity
   void setup() {
   Serial.begin(9600); // begin serial communication, NOTE: set the same baudrate in the serial monitor/plotter
   myservo.attach(D7);  // attaches the servo on pin D7 to the servo object
   }
   void loop() {
   // let's put the LDR value in a variable we can reuse
   ldr = analogRead(A0);
   
   // the value of the LDR is between 400-900 at the moment 
   // the servo can only go from 0-180
   // so we need to translate 400-900 to 0-180
   // also the LDR value might change depending on the light of day
   // so we need to 'constrain' the value to a certain range
   ldr = constrain(ldr, 400, 900); 
   // now we can translate
   ldr = map(ldr, 400, 900, 0, 180);
   // let's print the LDR value to the serial monitor to see if we did a good job
   Serial.println(ldr);
   // now we can move the servo according to the light/our hand!
   myservo.write(ldr);      // tell servo to go to the position stored in 'ldr'
   delay(15);    
   }
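The constrain()/map() arithmetic used in the sketch above can be mirrored in JavaScript, which is handy when the same scaling is later needed on the P5.js side:

```javascript
// JavaScript equivalents of Arduino's constrain() and map().
function constrain(x, lo, hi) {
  return Math.min(Math.max(x, lo), hi);
}

// Arduino's map() uses integer arithmetic; Math.floor mirrors that for positive values.
function map(x, inMin, inMax, outMin, outMax) {
  return Math.floor((x - inMin) * (outMax - outMin) / (inMax - inMin)) + outMin;
}
```

For instance, an LDR reading of 650 in the 400-900 range maps to a servo angle of 90.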


How to make an engine work
Credits: Dennis de Bel
How to make a sensor work
Credits: Dennis de Bel
How to make both sensor and engine work together
Credits: Dennis de Bel
Sensor test during workshop

Split Screen Arduino + Sensor + Serial Plotter + Responsive Space

Trying here to show the simultaneous responses between the sensor, the values, and the simulation.

Splitscreen Arduino + Sensor + Serial Plotter + Responsive Space

Prototype 2: Arduino + Ultrasonic sensor

For this first, very simple sketch and for later ones, I will include the NewPing library, which greatly improves the ultrasonic sensor's capacities.

Sketch 1: Arduino Uno + Sensor

 #include <NewPing.h> 
 int echoPin = 10;
 int trigPin = 9;
 
 NewPing MySensor(trigPin, echoPin); // This defines a new NewPing object
 
 void setup() {
   // put your setup code here, to run once:
   Serial.begin(9600);
 }
 
 void loop() {
   // put your main code here, to run repeatedly:
  int duration = MySensor.ping_median(); // echo time in microseconds (median of several pings)
  int distance = MySensor.convert_cm(duration); // convert_cm, not convert_in, since we print "cm"
 
  Serial.print(distance);
  Serial.println("cm");
  delay(250);
 }

Prototype 3: Arduino Uno + Sensor + LCD (+ LED)

Putting it all together, from https://www.youtube.com/watch?v=GOwB57UilhQ

Sketch 2: Arduino Uno + Sensor + LCD
Sketch 3: Arduino Uno + Sensor + LCD + LED

  #include <LiquidCrystal.h>
  
  LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration;
 int distance;
 
 void setup() {
   // put your setup code here, to run once:
     analogWrite(6,100); // PWM on pin 6 drives the LCD contrast
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
     pinMode(echoPin, INPUT); // Sets the echoPin as an Input
     Serial.begin(9600); // Starts the serial communication
 }
 
 void loop() {
   // Send a clean 10 µs trigger pulse
   digitalWrite(trigPin, LOW);
   delayMicroseconds(2);
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   duration = pulseIn(echoPin, HIGH);
   distance = (duration/2)/29.1; // microseconds to centimeters
 
 // Prints the distance on the Serial Monitor
 Serial.print("Distance: ");
 Serial.println(distance);
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("Distance = ");
     lcd.setCursor(11,0);
     lcd.print(distance);
     lcd.setCursor(14,0);
     lcd.print("CM");
     
     delay(500);
     
 }

From this sketch, I started considering that the distance value could be sent directly to a computer and used to render a Web page depending on its value.
Note: it looks like this sensor's max range is 119 cm, which is almost 4 times less than the 4-meter max range stated in the component description.
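Given that readings beyond ~119 cm proved unreliable with this setup, a simple guard can discard implausible values before they are used. The thresholds come from the observation above; this is a sketch, not the final filtering:

```javascript
// Return the reading if it falls in the usable window, otherwise null.
// 2 cm is the sensor's stated minimum; 119 cm is the ceiling observed above.
function validReading(cm, minCm = 2, maxCm = 119) {
  return cm >= minCm && cm <= maxCm ? cm : null;
}
```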

Prototype 4: Arduino Uno + Sensor + LCD + 2 LED = Physical vs Digital Range detector

Using in-between values to activate the green LED.
Once again, putting together the simulation and the device in use.

Sensor Test VS Elastic Space

 #include <LiquidCrystal.h>
 #include <LcdBarGraph.h> 
 #include <NewPing.h> 
 
   LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int LED1 = 13; 
 const int LED2 = 8;   
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration; // travel time
 int distance;
 int screensize;
 
 void setup() {
   // put your setup code here, to run once:
     analogWrite(6,100); // PWM on pin 6 drives the LCD contrast
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
     pinMode(echoPin, INPUT); // Sets the echoPin as an Input
     Serial.begin(9600); // Starts the serial communication
 
     pinMode(LED1, OUTPUT);
     pinMode(LED2, OUTPUT);
 }
 
 void loop() {
   // Send a clean 10 µs trigger pulse
   digitalWrite(trigPin, LOW);
   delayMicroseconds(2);
   digitalWrite(trigPin, HIGH);
   delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   duration = pulseIn(echoPin, HIGH);
   distance = (duration/2)/29.1; // convert to centimeters
   screensize = distance*85; // map 1 cm of room depth to 85 px of screen width
 
   if ((distance >= 15) && (distance <= 20))
   {
      digitalWrite(LED2, HIGH);
      digitalWrite(LED1, LOW);
   }
   else
   {
      digitalWrite(LED1, HIGH);
      digitalWrite(LED2, LOW);    
   }
 
 // Prints the distance and screen size on the Serial Monitor
 Serial.print("Distance: ");
 Serial.println(distance);
 Serial.print("Screen: ");
 Serial.println(screensize);
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("ROOM");
     lcd.setCursor(6,0);
     lcd.print(distance);
     lcd.setCursor(9,0);
     lcd.print("cm");    
     lcd.setCursor(0,1); // second row of a 16x2 LCD is row 1
     lcd.print("SCR");
     lcd.setCursor(6,1);
     lcd.print(screensize);
     lcd.setCursor(9,1);
     lcd.print("x1080px");
         
     delay(500);
     
 }


I brought a second Arduino, 2 long breadboards, black cables and another LCD screen, and remade the setup in this format. For some reason the new LCD screen does not fit in the breadboard, and I need more male-to-female cables in order to connect it correctly. With this longer breadboard, I want to extend the range value system and make it visible with LEDs and sounds.

Upgrade


How to get more digital pins [not working]

I tried 4 different tutorials but still didn't find a way to make it work, which is very weird, so I will just give up and get an Arduino Mega =*(

ArduinoExtraDigitalPins

Prototype 5: Arduino Uno + 3 Sensor + 3 LEDS

With a larger breadboard, I am connecting 3 sensors together. The next step will be to define different ranges of in-between values for each sensor in order to make a grid. To accomplish this grid I will add a second row of sensors like this one, in order to get x and y values in space.

Prototype 6: Arduino Uno + 3 Sensor + 12 LEDS

With 3 sensors added on 2 long breadboards, and with a different set of range values, we can start mapping a space.
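The grid idea can be sketched like this: each sensor's index gives the x coordinate, and the distance band its reading falls into gives y. The 30 cm band size is an assumption for illustration:

```javascript
// Map one sensor reading onto a cell of the invisible grid.
// sensorIndex = which sensor fired (x); distance bands of bandCm give y.
function gridCell(sensorIndex, distanceCm, bandCm = 30) {
  return { x: sensorIndex, y: Math.floor(distanceCm / bandCm) };
}
```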

SensorMediaQueries
Physical Space Mapping

Prototype 7: Arduino Uno + 12 LEDS + 3 Sensor + Buzzer + Potentiometer + LCD

For this prototype, I implemented a buzzer that emits a specific sound depending on the distance of the obstacle detected by the sensor. I also put back an LCD displaying the 3 sensors' values. The screen luminosity can be changed via a potentiometer.
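As a sketch of the distance-to-pitch mapping (the frequency range is an assumption, not the values used in the prototype), closer obstacles produce a higher tone:

```javascript
// Hypothetical mapping: 2000 Hz at 0 cm falling linearly to 200 Hz at maxCm.
function buzzerFreq(distanceCm, maxCm = 120) {
  const clamped = Math.min(Math.max(distanceCm, 0), maxCm);
  return Math.round(2000 - (clamped / maxCm) * 1800);
}
```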
Resources:

ArduinoMegaSensorBuzzerLCD

Prototype 8: Arduino Uno + 12 LEDS + 3 Sensor on mini breadboards + Buzzer + Potentiometer + LCD

Same code, but a new setup detaching the sensors from each other and allowing them to be placed anywhere.

ArduinoMegaSensorBuzzerLCDMinibreadboard.jpg

Prototype 9: Arduino Uno + 21 LEDS + 7 Sensor + Buzzer + Potentiometer + LCD

Sensor Wall 01
PhysicalMapping2

Sketch 10: Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js

P5.js and ultrasonic sensor

The goal here was to create a first communication between the physical setup and a P5.js web page.
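On the P5.js side, each serial line (e.g. "42cm", as printed by the Arduino sketches above) has to be parsed back into a number. A minimal parser that ignores malformed lines could look like this (a sketch, not the exact code used):

```javascript
// Parse one serial line such as "42cm" into an integer, or null if malformed.
function parseSerialLine(line) {
  const m = /^(\d+)\s*cm\s*$/i.exec(line.trim());
  return m ? parseInt(m[1], 10) : null;
}
```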

Sketch 11: Arduino Mega + UltraSonicSensor + LCD TouchScreen

LCD Arduino Variable poster

Semester 2

Simu part 02.gif

Sketch 12: Arduino Uno + P5serialcontrol + P5.js web editor = Code decrypter

P
I
G
E
O
N

Sketch 13: Arduino Uno + P5serialcontrol + P5.js web editor = Game

Stage 0
The subject is located too far away
Stage 0
The subject is well located and will hold position to reach the next stage
Stage 1
The subject unlocked Stage 1 and will hold position to reach the next stage
Stage 2
The subject unlocked Stage 2 and is located too close
Stage 3
The subject unlocked Stage 3 and needs to get closer
Transition Stage
The subject unlocked all stages and needs to wait for the countdown before the following steps



Sketch 14: Arduino Uno + P5serialcontrol + P5.js web editor = Simplified interface

Data Collector Sample 01.gif

How to split serial data values with more than one sensor
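One common approach (a sketch, assuming the Arduino prints one comma-separated line per cycle, e.g. "12,87,45"): split the line on the P5.js side to recover one value per sensor:

```javascript
// Split a comma-separated serial line into per-sensor numeric values.
function splitSensorLine(line) {
  return line.trim().split(",").map(Number);
}
```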


Debug Martin 01.png
Debug Martin 05.png
Debug Martin 02.png
Debug Martin 03.png
Debug Martin 04.png
Debug Martin 06.png

Installation update

Installation Update 01.jpg
Installation Update 02.jpg

To do

Stages Design

Many stages (mini-levels) are being designed. They are all intended to evoke the different contexts and pretexts for which we perform daily micro-tasks that collect data.
The visitor can unlock the next stage by successfully completing one or more tasks in a row. After a while, even if the action is not successful, a new stage will appear with a different interface and instructions.
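The unlock rule described above (task completed, or a timeout fallback) can be sketched as follows; the 60-second timeout is a hypothetical value:

```javascript
// Advance to the next stage when the task is done or the stage times out.
function nextStage(stage, taskDone, secondsOnStage, timeoutS = 60) {
  return (taskDone || secondsOnStage >= timeoutS) ? stage + 1 : stage;
}
```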

The list below details the different stages being produced; they will follow one another randomly during the session:

    • Money/Slot Machine
    • Well-Being
    • Healthcare
    • Yoga
    • Self-Management
    • Stock Market Exchange
    • Military interface

The visuals below illustrate their design.

Captcha:
one action needed (moving forward, backward or standing still); the next stage unlocks after the action is done and a short animation
Self Tracking:
no interaction needed; the visitor must stand still until one of the goals is completed
Self Tracking:
no interaction needed; the visitor must stand still until one of the goals is completed
Slot Machine:
no interaction needed; the transition between 2 stages randomly determines the next stage; displayed when nobody is detected
Social Live:
no interaction needed; the visitor must stand still until the money goal is completed
Stock Ticket:
no interaction needed; displayed when nobody is detected

Stages Design with P5.js

AllStages HomoData.png
6 levels in a row, then randomized; more to come
Consolelog 01.gif

Prototyping Resources

Do It Yourself Resources (from Dennis de Bel)

  • Instructables is a huge source of (written) tutorials on all kinds of topics. Keep in mind it's more quantity than quality. 'DIY sensors' might be interesting for you.
  • Hand Made Electronic (Music): a great resource for cheap, DIY electronics projects focusing on sound/music (PDF findable online)

  • Make: Electronics: Amazing, complete guide to everything 'electronics' (Warning, HUGE pdf)
  • Thingiverse: The place to find 3d printable mechanics, enclosures, parts etc.

Electronic Shops (physical)

LIST OF SHOPS (also more physical NL ones)

Electronic Webshops (NL)

Electronic Webshops (Rest)

PCB making EU (Expensive)

PCB making China (Cheap but import tax)

  • JLCPCB (1 week from design upload to in your hands, low quality solder mask)
  • PCBWAY (1 week from design upload to in your hands)
  • ALLPCB (1 week from design upload to in your hands)

Arduino and Sensors

Sensor only Kit

  • 45-in-1 (aliexpress) Example sensor you will find in such a kit documented here

Arduino Starter Projects

or slightly more complex:

or in videos:

or just many different ideas:

or - of course - on Instructables if you want to have a complete course:

or this course:

ARDUINO + PROCESSING (visualizing sensors)

MISCELANIOUS KEYWORDS and LINKS

About the ESP8266 module

The ESP8266 is a microcontroller IC with a Wi-Fi connection; it will allow us to connect the Arduino to the Internet so we can get the values obtained from the sensors directly on a self-hosted webpage. From this same web page, it would also be possible to control LEDs, motors, LCD screens, etc.

Resources about the ESP8266 module

Kindly forwarded by Louisa:

Which ESP8266 to buy

Installation

Resources

  • Movable walls build out for Art Museum of West Virginia University link
  • Gallery Wall System (GWS) link
  • CASE-REAL installs movable walls inside a basement art gallery in tokyo link

Venues

Venue 1: Aquarium

Description


AQUARIUM 1.0


A Small Ecosystem for Living Thoughts

Monday, 11th October
19:30 – 21:30
Leeszaal Rotterdam West
Rijnhoutplein 3, 3014 TZ Rotterdam

with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann

It’s oh-fish-ial! Students of the Experimental Publishing Master invite you to dive into their small ecosystem of living thoughts. Join us for an evening of conversation, discussion and new view points. If you look closely, you might even see some early thesis ideas hatching. Let's leave no rock unturned.

Observation questionnaire

This exercise is a very small, humble and almost 100% analog exercise questioning representation in two small steps.

1st step

photo of a brick

  • 1st step: I give a sheet of paper to people during the venue and ask them to answer a series of questions concerning the object (a brick) displayed in the middle of the room on a podium. It is specified that they can be anywhere while observing this brick, and in any position. Here are the questions:


  • Please write down your first name:


  • Describe your position (sitting/standing/other):


  • Describe your location in the room:


  • Describe what you are seeing while looking at the screen:


  • Describe how you feel mentally/emotionally:



2nd step

photo of brick displayed inside a computer screen

  • 2nd step: I take the answers, wait a round, and then give a new sheet of paper to the same people with the exact same questions, this time concerning the representation of the object (brick) displayed in the middle of the room on a computer screen on the same podium.

Answer Samples

1.0 Object on a podium

  • 1.1 Sitting on corner stairs —> Wants to see it from different angles —> Feeling trapped, frustrated
  • 1.2 Sitting on stairs —> a rock looking dead —> Feeling sad
  • 1.3 Sitting on the left, close to the columns —> rational observation —> Nostalgic memories from having participated in the creation of the object as it looks right now
  • 1.4 Sitting in front of the object —> Calm and slightly confused
  • 1.5 Sitting on the floor next to the stairs, between the side and the middle —> Looking at the object from the bottom —> Feeling a bit confused and inspired



2.0 Photo of the object displayed on a computer screen placed on a podium

  • 2.1 Sitting on a chair, seeing the brick from a bird's perspective —> Feeling more in control of the situation
  • 2.2 Sitting very close to the brick —> Seeing a flat and almost abstract picture —> Feeling drawn to the picture, aesthetically pleasing, feeling less sad about the object
  • 2.3 Sitting under a table very far away —> Looking abstract but identifiable —> Excited about the unusual and childish observation position
  • 2.4 Sitting on stairs —> seeing the brick in 2D —> Feeling fine
  • 2.5 Sitting on the stairs —> Seeing a side of the screen with a top-view photo of the object —> Feeling comfortable



Answers1_RepresentationQuestionnaire
Answers2_RepresentationQuestionnaire
Answers3_RepresentationQuestionnaire
Answers4_RepresentationQuestionnaire
Answers5_RepresentationQuestionnaire

Venue 2: Aquarium 2.0

Description


Date 29th Nov — 4th Dec 2021
Time 15h — 18h
29th Nov — 4th Dec 2021 (all day)
Location: De Buitenboel, Rosier Faassenstraat 22 3025 GN Rotterdam, NL


AQUARIUM 2.0

An ongoing window exhibition with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann

Tap upon the glass and peer into the research projects we are currently working on. From Monday 29th of November until Saturday 4th of December we put ourselves on display in the window of De Buitenboel as an entry point into our think tank. Navigating between a range of technologies, such as wireless radio waves, virtual realities, sensors, ecological and diffractive forms of publishing, web design frameworks, language games, and an ultra-territorial residency; we invite you to gaze inside the tank and float with us. Welcome back to the ecosystem of living thoughts.

Aquarium LCD Portal (29 Nov – 4th Dec)

This interactive micro-installation, composed of an LCD screen and sensor(s), invites users/visitors to change the color of the screen and the displayed messages by moving closer to or further from the window. Link

ScreenPortalFront
ScreenPortalback
LCDScreenTest

Readings (new)(english)(with notes in english)

About Institutional Critique

To read

→ 1. Art and Contemporary Critical Practice: Reinventing Institutional Critique Doc
→ 2. From the Critique of Institutions to an Institution of Critique - Andrea Fraser Doc
→ 3. Institutional Critique, an Anthology of Artists' Writings - Alexander Alberro Doc

About Techno-Solutionism

To read

→ 1. The Folly of Technological Solutionism: An Interview with Evgeny Morozov - Natasha Dow Schüll

About Meta

To read

→ 1.  The meta as an aesthetic category Bruno Trentini (2014)
→ 2.  File:RMZ ARTIST WRITING(2).pdf The eye tells the story by Rosa Maria Zangenberg (2017)
→ 3.  Leonardo Da Vinci - Paragone by Louise Farago

About exhibition space

To read

→ 2. Kluitenberg, Eric, ed. Book of imaginary media. Excavating the dream of the ultimate communication medium. Rotterdam: NAi Publishers, 2006.
→ 3. The wall and the canvas: Lissitzky’s spatial experiments and the White Cube
→ 6. Decorative Arts: Billy Al Bengston and Frank Gehry discuss their 1968 collaboration at LACMA by Aram Moshayedi
→ 8.  File:Resonance and Wonder STEPHEN GREENBLATT.pdf Resonance and Wonder - STEPHEN GREENBLATT
→ 9.  A Canon of Exhibitions - Bruce Altshuler File:A Canon of Exhibitions - Bruce Altshuler.pdf
→ 10. Documenta - File:A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar.pdf A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar
→ 11. Pallasmaa - The Eyes of the Skin File:Pallasmaa - The Eyes of the Skin.pdf
→ 12. Venturi - Learning from Las Vegas File:Venturi - Learning from Las Vegas.pdf
→ 13. Preserving and Exhibiting Media Art: Challenges and Perspectives - Julia Noordegraaf, Cosetta G. Saba, Barbara Le Maître, Vinzenz Hediger. Amsterdam University Press, Framing Film series, 2013

Reading/Notes

→ 1. After the White Cube. ref 2015 NOTES INSIDE

  • How and why the White Cube rose and became democratized
  • White Cube // Consumerism = Art Consumerism?
  • Exhibition Space > Artworks
  • Experience of interpretation = Entertainment of Art?
  • Museum vs Mausoleum


→ 2. Spaces of Experience: Art Gallery Interiors from 1800 – 2000 ref NOTES INSIDE

  • Art vs 1950s consumerism / Choreography of desire?
  • Check theorists Hermann von Helmholtz and Wilhelm Wundt


→ 3. Colour Critique A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space NOTES INSIDE
May 24, 2019 - Noise! Frans Hals, Otherwise, Frans Hals Museum
→ 4.  Noise! Frans Hals, Otherwise NOTES INSIDE

  • Role of colours in the viewer's experience of an exhibition
  • Institutional Critique
  • Institutionalised Space / White cube


→ 5. Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze NOTES INSIDE
(course for Artscience 2007/8) doc

  • About perspective
  • About Space time
  • About Cyber Space


→ 6.  THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge Doc NOTES INSIDE
Stephanie Moser SOUTHAMPTON UNIVERSITY (MUSEUM ANTHROPOLOGY) 2010

  • Architecture (Neoclassical buildings)
  • Big vs Small exhibition Space
  • Lined up objects vs non systematic display
  • Architecture/Design
  • Gallery interiors (Ceiling/Interior design elements/Furniture)
  • Colors
  • Individual lighting of objects vs global lighting
  • Dark vs Bright lighting
  • Chronological vs Thematic arrangement
  • Academic vs journalistic writing
  • Busy layout vs Minimal layout
  • Exhibition seen vs other exhibitions
  • Themed/idea-oriented vs object-led exhibitions
  • Didactic vs discovery exhibition
  • Contextual, immersive, or atmospheric exhibitions
  • Audience vs Reception


→ 7. Fantasies of the Library - Etienne Turpin (ed.), Anne-Sophie Springer (ed.) Ref; Publisher: The MIT Press; Publication date: 1 Sept 2018

  • How the physical organization of a bookshelf can influence its digital version
  • The book as a miniature gallery/exhibition space
  • The library as a public place of reading
  • Library vs Exhibition Space = Use vs Display
  • Book-theme exhibitions

About User vs Visitor, or user in exhibition space

Designing the user experience in exhibition spaces - Elisa Rubegni, Caporali Maurizio, Antonio Rizzo, Erik Grönvall

  • What are the GUI intentions
  • What is the WIMP interaction model
  • What are the post-Wimp models
  • About Memex

About User Interface

Readings/Notes

→ 1. bootleg Alexander R. Galloway - The Interface Effect. 1st ed. Malden, USA: Polity Press.

  • The interface paradox
  • The less they do, the more they achieve and the more they become invisible & unconsidered
  • The interface as a "significant surface"
  • The interface as a gateway
  • The interface as "the place where information moves from one entity to another"
  • The interface as the media itself
  • The interface as "agitation or generative friction between different formats"
  • The interface as "an area" that "separates and mixes the two worlds that meet together there"


→ 2. bootleg Nick Srnicek - Navigating Neoliberalism: Political Aesthetics in an Age of Crisis NOTES INSIDE
Publisher: medium.com, Publication date: 20 Oct 2016

  • From an aesthetics of the sublime to an aesthetics of the interface
  • Cognitive mapping


→ 3. bootleg Program or Be Programmed: Ten Commands for a Digital Age - Douglas Rushkoff NOTES INSIDE
Rushkoff, D., 2010. Program or Be Programmed: Ten Commands for a Digital Age. 1st ed. Minneapolis, USA: OR Books.

  • "Instead of learning about our technology, we opt for a world in which our technology learns about us."
  • Programmed by the interfaces
  • From a transparent to an opaque medium


→ 4. bootleg The Best Interface Is No Interface - Golden Krishna NOTES INSIDE
Krishna, G., 2015. The Best Interface Is No Interface: The simple path to brilliant technology (Voices That Matter). 1st ed. unknown: New Riders Publishing.

  • "Screen Obsessed Approach to Design"
  • UI vs UX


→ 5. Plasticity of User Interfaces:A Revised Reference Framework NOTES INSIDE
Gaëlle Calvary, Joëlle Coutaz, David Thevenin, Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt

  • About the term 'Plasticity'


→ 6. Interface Critique - Beyond UX - Florian Hadler, Alice Soiné, Daniel Irrgang Doc

  • The interface as an "historical artifact", a "space of power"
  • The interface as a human-machine boundary
  • What is interface critique
  • Interface in computer science
  • The screen for Lev Manovich



More to read/see

→ 1. Bickmore, T.W., Schilit, B.N., Digestor: Device- Independent Access To The World Wide Web, in Proc. of 6th Int. World Wide Web Conf. WWW’6
         (Santa Clara, April 1997)

→ 2. Bouillon, L., Vanderdonckt, J., Souchon, N., Recovering Alternative Presentation Models of a Web Page with VAQUITA, Chapter 27, in Proc. of 4th Int. Conf. on Computer- Aided Design of User Interfaces CADUI’2002
         (Valenciennes, May 15-17, 2002)

→ 3. Calvary, G., Coutaz, J., Thevenin, D., Supporting Context Changes for Plastic User Interfaces: a Process and a Mechanism, in “People and Computers XV –
         Interaction without Frontiers”, Joint Proceedings of AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI’2001(Lille, 10-14 September 2001)

→ 4. Cockton, G., Clarke S., Gray, P., Johnson, C., Literate Development: Weaving Human Context into Design Specifications, in “Critical Issues in User Interface Engineering”,
         P. Palanque & D. Benyon (eds), Springer-Verlag, London, 1995.

→ 5. Graham, T.C.N., Watts, L., Calvary, G., Coutaz, J., Dubois, E., Nigay, L., A Dimension Space for the Design of Interactive Systems within their Physical Environments, in Proc. of Conf. on Designing Interactive Systems DIS’2000
          (New York, August 17-19, 2000,), ACM Press, New York, 2000,

→ 6. Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’ 2001
         (New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001,

→ 7. Thevenin, D., Coutaz, J., Plasticity of User Interfaces: Framework and Research Agenda, in Proc. of 7th IFIP International Conference on Human-Computer Interaction Interact' 99
         (Edinburgh, August 30 - September 3, 1999), Chapman & Hall, London, pp. 110-117.

→ 8. Thevenin, D., Adaptation en Interaction Homme-Machine: Le cas de la Plasticité, Ph.D. thesis, Université Joseph Fourier,
          Grenoble, 21 December 2001.

→ 9. Graspable interfaces (Fitzmaurice et al., 1995) link

About User Condition

Readings

→ 1. The User Condition 04: A Mobile First World - Silvio Lorusso Doc

  • Most web users are smartphone users
  • How "mobile first" affects global web design
  • How "mobile first" affects the way we use computers

Readings (old)(mostly french)(with notes in french)

Books (old)


→ 1.  L'art comme expérience — John Dewey (french) ⚠️(yet to be filled)⚠️
         publisher: Gallimard (1934)
→ 2.  L'œuvre d'art à l'époque de sa reproductibilité technique — Walter Benjamin (french)
         publisher: Alia (1939)
→ 3.  La Galaxie Gutenberg — Marshall McLuhan (french)
         publisher: University of Toronto Press (1962)
→ 4.  Pour comprendre les médias — Marshall McLuhan (french)
         publisher: McGraw-Hill Education (1964)
→ 5.  Dispositif — Jean-Louis Baudry (french)
         publisher: Raymond Bellour, Thierry Kuntzel et Christian Metz (1975)
→ 6.  L’Originalité de l’avant-garde et autres mythes modernistes — Rosalind Krauss (french) ⚠️(yet to be filled)⚠️
         publisher: Macula (1993)
→ 7.  L'art de l'observateur: vision et modernité au XIXe siècle — Jonathan Crary (french)
         publisher: Jacqueline Chambon (Editions) (1994)
→ 8.  Inside the White Cube, the Ideology of Gallery Space — Brian O'Doherty (english) ⚠️(yet to be filled)⚠️
         publisher: Les presses du réel (2008)
→ 9.  Précis de sémiotique générale — Jean-Marie Klinkenberg (french) ⚠️(yet to be filled)⚠️
         publisher: Point (2000)
→ 10.  Langage des nouveaux médias — Lev Manovich (french) ⚠️(yet to be filled)⚠️
         publisher: Presses du Réel (2001)
→ 11. L'empire cybernétique — Céline Lafontaine (french)
         publisher: Seuil (2004)
→ 12.  La relation comme forme — Jean-Louis Boissier (french)
         publisher: Genève, MAMCO (2004)
→ 13.  Le Net Art au musée — Anne Laforêt (french)
         publisher: Questions Théoriques (2011)
→ 14.  Narrative Comprehension and Film Communication — Edward Branigan (english)
         publisher: Routledge (2013)
→ 15. Statement and Counter Statement / Notes on Experimental Jetset — Experimental Jetset (english)
          publisher: Roma (2015)
→ 16. Post Digital Print — Alessandro Ludovico (french)
          publisher: B42 (2016)
→ 17. L'écran comme mobile — Jean-Louis Boissier (french)
          publisher: Presses du réel (2016)
→ 18. Design tactile — Josh Clark (french)
          publisher: Eyrolles (2016)
→ 19. Espaces de l'œuvre, espaces de l'exposition — Pamela Bianchi (french)
          publisher: Eyrolles (2016)
→ 20. Imprimer le monde (french)
          publisher: Éditions HYX et les Éditions du Centre Pompidou (2017)
→ 21. Version 0 - Notes sur le livre numérique (french)
          publisher: ECRIDIL (2018)

Articles (old)

→ 1. Frederick Kiesler — artiste-architecte ⚠️(yet to be filled)⚠️
        (press release) Centre Pompidou; source: centrepompidou.fr (1996)
→ 2. Oublier l'exposition ⚠️(yet to be filled)⚠️
        Artpress special issue no. 21 (2000)
→ 3. Composer avec l’imprévisible: Le questionnaire sur les médias variables ⚠️(yet to be filled)⚠️
        Jon Ippolito; source: variablemedia.net/pdf/Permanence (2003)
→ 4. Esthétique du numérique : rupture et continuité
        Fred Forest; source: archives.icom.museum (2010)
→ 5. La narration interactive ⚠️(yet to be filled)⚠️
        Dragana Trgovčević; source: ensci.com/file_intranet/mastere_ctc/etude_Dragana_Trgovcevic.pdf (2011)
→ 6. Des dispositifs aux appareils - L'Espacement d'un calcul
        Anthony Masure; source: anthonymasure.com (2013)
→ 7. Le musée n'est pas un dispositif - Jean-Louis Déotte, p. 9-22 (2011)
→ 8. Apogée et périgée du White Cube - Alban Loosli

References

Exhibition space

→  Proun Spaces — El Lissitzky (1920)
→  City in Space — Frederick Kiesler (1920)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Sans titre — Michael Asher (1973)
→  Corner Prop No. 7 (For Nathalie) — Richard Serra (1983)
→  Speaking Wall (2009 - 2010)

Nothingness with Media

→  4′33″ — John Cage (1952)
→  Untitled - A Curse — Tom Friedman (1965)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Sans titre — Michael Asher (1973)

Mediatization of Media

→  4′33″ — John Cage (1952)
→  TV Garden — Nam June Paik (1974)
→  Presents — Michael Snow (soon to be translated)
→  Lost Formats Preservation Society — Experimental Jetset (2000)
→  Lost Formats Winterthur — Experimental Jetset (2000)
→  L'Atlas critique d'Internet — Louise Drulhe (2014-2015)

Flags

→  Netflag — Mark Napier (2002)
→  019 - Flag show (2015)

User perspective

→  What you see is what you get — Jonas Lund (2012)

Media Time perception

→  Present Continuous Past(s) — Dan Graham (1974)

Experimental cinema

→  Presents — Michael Snow (soon to be translated)
→  Displacements — Michael Naimark (1980)
→  BE NOW HERE — Michael Naimark (1995)

CSS composition

→  Sebastianly Serena
→  Scrollbar Composition
→  into time .com - Rafael Rozendaal
→  Ridge 11 - Nicolas Sassoon
→  Rectangulaire - Claude Closky
→  Jacksonpollock.org - Miltos Manetas
→  Moving Paintings - Annie Abrahams

Media deterioration

→  Img214270417
→  William Basinski - The Disintegration Loops

Undefined

→  Untitled Sans

User friendliness and anti-user friendliness

→  Web-Safe - Juha van Ingen

Media Art conservation

→  The Variable Media Initiative 1999
→  EAI Online Resource Guide for Exhibiting, Collecting & Preserving Media Art
→  Matters in Media Art
→  The International Network for the Preservation of Contemporary Art (INCCA)
→  Archiving complex digital artworks - Dušan Barok

Emulation

→  Seeing Double: Emulation in Theory and Practice

Technological Timeline

→  Technological Timeline

Media Art Online Archive

→  ACM SIGGRAPH Art Show Archives
→  Archive of Digital Art (ADA)
→  Ars Electronica Archive
→  Digital Art Web Archive (collected by Cornell)
→  Monoskop
→  The Rhizome ArtBase

Music/Sound

→  The end of music

HTML Quines

→  https://hugohil.github.io/dedans/
→  https://secretgeek.github.io/html_wysiwyg/html.html
→  http://all-html.net/?