XPUB2 Research Board / Martin Foucaut


Pads

Manetta / Michael

Steve / Marloes

Eleanor Greenhalgh

Links

Seminars (source)

Key Dates and Deadlines

These are the key dates for 2021-22

  • 19 November - Graduate Proposal Deadline

Last year's Graduate Proposals UPLOAD YOUR PROPOSAL HERE!

  • 19 November - Thesis Outline Deadline

Last year's Thesis Outlines UPLOAD YOUR THESIS OUTLINE HERE!

  • 3 Dec - Deadline First Chapter
  • 18 Feb - Deadline First Draft Thesis
  • 18 March - Deadline Second Draft thesis (texts to 2nd readers)
  • 1 April - Deadlines Second readers' comments
  • 14 April - DEADLINE THESIS

Guides and Guidelines

LB Code link (in progress)

About thesis

Thesis criteria

  1. Intelligibly express your ideas, thoughts and reflections in written English.
  2. Articulate in writing a clear direction of your graduate project by being able to identify complex and coherent questions, concepts and appropriate forms.
  3. Clearly structure and analyse an argument.
  4. Use relevant source material and references.
  5. Research texts and practices and reflect upon them analytically.
  6. Synthesize different forms of knowledge in a coherent, imaginative and distinctive way.
  7. Position one's own views within a broader context.
  8. Recognize and perform the appropriate mode of address within a given context.
  9. Engage in active dialogue about your written work with others.

Thesis format

  1. A report on your research and practice.
  2. An analytical essay exploring related artistic, theoretical, historical and critical issues and practices that inform your practice, without necessarily referring to your work directly.
  3. The presentation of a text as a body of creative written work.

Thesis Outline (guideline)

Don't make it more than 1500 words

What is your question?

Break the proposed text down into parts.  Think of the separate sections as "containers" (this may change as you progress with the text but try to make a clear plan with a word count in place)

Thesis Outline (consider the following before writing the outline. Include all these points in the intro to the outline)

Conceptual Outline (what is your question? Try to be as specific as possible. More specific than identifying a subject or general interest. It helps to ask: "what questions does the work I make generate?")

Why do you want to write this text?

Outline of Methodology (for example: "I would like to structure my thesis in relation to a series of interviews I will conduct for my proposed project" OR "I will make a 'close reading' of three of my past projects")

Time line (how will you plan your time between now and April)


  • Introduction- overview 

[500 words]

  • Chapter 1 

[2000 words]

  • Chapter 2 

[2000 words]

  • Chapter 3 

[2000 words]

  • Conclusion [500 words] 

= 7000 words

Bibliography

Annotated bibliography (five texts max). Make a synopsis of 5 texts  that will be central to your thesis.

  • Example of annotated bibliography 

https://pzwiki.wdka.nl/mediadesign/Mia/Thesis

  • Example of a thesis outline:

    1. https://pzwiki.wdka.nl/mw-mediadesign/images/f/f3/Thesis_outline_final_Yuching.pdf
    2. https://pzwiki.wdka.nl/mediadesign/User:Zpalomagar/THESIS_OUTLINE/FIFTH_DRAFT

Referencing System

  • Harvard Referencing system PDF


Graduate proposal guidelines

What do you want to make?

I want to explore the key roles of digital and physical architectures in the representation and transmission of knowledge by merging them in the form of a physical exhibition interface. To be more specific, I wish to build an exhibition space inspired by the elastic (plastic) display/render of online content, which mostly adapts to the user's/viewer's perspective. In that sense, I want to build an exhibiting device that puts the spectator in the position of being the curator and user of their own physical exhibition space, and allows a single representation/content/artwork to be displayed under a wide range of possible settings. Much like a pointer or a cursor, the spectator/user can move inside the space (SpectatorX/SpectatorY = MouseX/MouseY) and resize this same space (SpaceWidth & SpaceHeight = WindowWidth & WindowHeight) by pushing or pulling a movable wall fixed on rails/wheels. The number of spectators, their position inside the space, as well as the interactions engaged with it, will affect various display factors such as the lighting, sound, projection format, information layout, etc. Such interactions could also give space to accidents, the unforeseen, bugs, deadzones and glitches that might have been intentionally left there. This is an attempt to speculate on how cybernetics(?) affects all aspects of our lives and could also transform the exhibition space, curatorial practice, our experience of art and the nature of representation itself.

  • I am still questioning whether this space should be completely empty of information or not. In my mind, the live values of the variable exhibition space (width/length; spectator position; luminosity; information font; sound frequency, etc.) should be displayed inside the space with the help of one or several beamers (or screens), in order to enhance the idea of being inside an exhibiting device. The way these values are displayed (format, font, layout) should also change depending on the values themselves (I will make some sketches about that).

How do you plan to make it?

While working with the Arduino Mega and Raspberry Pi, my aim is to start from the smallest and most simple prototype and gradually increase its scale/technicality until reaching human scale and getting closer to emulating the properties of a Web window (see: prototyping). Once an exhibition space is found/determined for the graduation, I will build a custom mobile wall, or use a pre-existing one, add some handles to it, and fix it on a rail system that will allow the size of the space to be reduced or expanded between a minimum and a maximum range. This wall will include, on its back, at least one sensor that will help determine the size of the room in real time (see schema). With the help of an array of sensors placed on the 3 static walls of the exhibition space, the space will be mapped into a grid, making it possible to know the real-time position of the spectator(s) inside of it.

In order to better define my audience, the issues, and the direction of this work, I will take advantage of the different venues organized by XPUB to involve people in testing and reflecting on various prototypes of this work in progress. (see venue1) (see: simulation)

Sensor Test VS Elastic Space
Click to watch
Sensor Wall 01
SensorMediaQueries
Click to watch




What is your timetable?

  • 1st semester: Prototyping with Arduino all along, getting started with Raspberry Pi, and finding a space to set up
  • 1st prototype: mini Arduino + light sensor (understanding Arduino basics) [[1]]
  • 2nd prototype: Arduino Uno + ultrasonic sensor (working with ultrasonic sensors) [[2]]
  • 3rd prototype: Arduino Uno + ultrasonic sensor + LCD screen (working with values display) [[3]]
  • 4th prototype: Arduino Uno + ultrasonic sensor + 2 LEDs (working with in-between distance range values detection) [[4]]
  • 5th prototype: Arduino Uno + 3 ultrasonic sensors + 3 LEDs (mapping range values detection in a grid and attributing signals with LEDs) [[5]]
  • 6th prototype: Arduino Uno + 3 ultrasonic sensors + 12 LEDs (mapping range values detection in a grid and attributing signals with more LEDs) [[6]]
  • 7th prototype: Arduino Uno + 3 ultrasonic sensors + 1 buzzer + 1 LCD + 1 potentiometer (adding audio signals to the range value detection) [[7]]
  • 8th prototype: Arduino Uno + 3 ultrasonic sensors + 1 buzzer + 1 LCD + 1 potentiometer + mini breadboards (separating the sensors from each other) [[8]]
  • 9th prototype: Arduino Mega + 21 LEDs + 7 sensors + buzzer + potentiometer + LCD (expanding the prototype to human scale)
  • 10th prototype: Arduino Mega + 7 sensors + LCD + 3 buzzers + P5.js (connecting the prototype to a Web page)

——————————— NOW —————————————————————————————————————————

  • Upcoming - 11th prototype: Arduino Mega + 7 sensors + LCD + P5.js (change the size of a Web page while moving the sensor wall)
  • Upcoming - 12th prototype: Arduino Mega + 7 sensors + LCD + P5.js (play sounds and affect pitch/tone depending on the position on a Web page)
  • Upcoming: Arduino Uno + 3 ultrasonic sensors + ESP8266 (WIFI) + Raspberry Pi (self-hosted website) (transmit and/or control values from the Arduino to a computer and vice versa via a WIFI transmitter)
  • Upcoming: small room + Arduino Uno + 8 ultrasonic sensors + ESP8266 (WIFI) + Raspberry Pi (connect to a Web page)


  • 2nd semester: Find what will be the graduation space, build the mobile wall, and translate the setup/installation to human/spectator scale.
  1. Show the prototype and schemas of the wall to the wood and metal workshops in order to get advice until final validation to build (starting to build the physical elements)
  2. Search, find and validate what will be the space used for the installation during the graduation.
  3. Start building the movable wall, taking into account the characteristics of the space used for graduation.
  4. Implement the sensors inside the movable wall, and the other devices in the fixed space

Why do you want to make it?

In opposition to the physical exhibition space, the Web offers each of its users/visitors a custom point of view based on an innumerable and ever-changing array of technological factors. I like to call this the technological context. Among these, we could list: the browser, the device, the operating system, the screen size, the resolution, the user configuration and default settings, the updates, the IP address, etc. This technological complexity diffracts the possible renders of the same Web page into an almost infinite array of user perspectives. Therefore, it is the nature and meaning of representation itself that is redefined by the Web. Web representations are plastic/elastic in a way: they multiply and transform themselves as much as needed in order to be rendered in an optimal way through our own user perspective/interface. By implementing these notions and properties into the physical exhibition space, I would like to take a step into curatorial practice.

From our own user perspective/point of view, behind our own screen, this technological complexity and the infinite spectrum of perspectives that it leads to can hardly be considered (except here, for example). This brings us to unconsciously forget about the singularity and fragility of what is being seen/experienced/interpreted. By creating a physical interface conceived on the model of a responsive Web page, I want to give visitors the power to manipulate and diffract this spectrum of perspectives with their own hands, and to consider the role and effects of these mediating technologies on the visitors' behaviours and perception of an artwork/content.

On the other hand, I am attached to the idea of reversing the desktop metaphor. The desktop metaphor refers to the terms and objects that the Web borrowed from the physical world in order to make its own concepts more familiar and intelligible to its users. Now that it is largely democratized and widespread in modern society, people may have more concrete experiences of the digital/Web interface than of the physical space. Museums, hotels, houses, car interiors and restaurants are themselves becoming more and more comparable to digital interfaces where everything is optimized, and where our behaviours, actions and even inactions are being detected and converted into commands in order to offer a more customized (and lucrative) experience to each of us. In that sense, we are getting closer to becoming users of our own interfaced/augmented physical realities. By creating an exhibition space explicitly inspired by a desktop Web interface, I wish to question what the future of the exhibition space could be, and what the vulnerabilities of such technologies are.

Conceiving the exhibition space as a Web interface, and the spectator as a user, is also about putting together two layers of reality that are too often clearly opposed/separated (IRL vs online). This is about experiencing their ambiguities, similarities, and differences. It is about reconsidering their modalities by making them reflect on each other, and making the user/spectator reflect on their own agency inside of them. (see: Reversing the desktop metaphor)

More generally, it is about reflecting on media itself, and dealing with a paradox that I've always found interesting: the better a medium mediates, the more it becomes invisible and unconsidered (see: Mediatizing the media). This observation makes me want to mediate the media and to give spectators more occasions to focus on what is containing, surrounding, holding or hosting a representation, instead of giving all our focus to the representation itself.

Who can help you?

  • About the overall project
  1. Stephane Pichard, ex-teacher and ex-production tutor, for advice and help with scenography
  2. Emmanuel Cyriaque, my ex-teacher and writing tutor, for advice and help contextualizing my work
  • About Arduino
  1. XPUB Arduino Group
  2. Dennis de Bel
  3. Aymeric Mansoux
  • About Raspberry Pi
  1. XPUB2 students (Jacopo, Camillo, Federico, etc.)
  2. Michael Murtaugh
  • About creating the physical elements:
  1. Wood station (for movable walls)
  2. Metal station (for rails)
  3. Interaction station (for Arduino/Raspberry Pi assistance)
  • About theory/writing practice:
  1. Rosa Zangenberg: ex-student in art history and media at Leiden University.
  2. Yael: ex-student in philosophy, getting started with curatorial practice and writing about the challenges/modalities of the exhibition space. Philosophy of the media (?)
  • About finding an exhibiting space:
  1. Leslie Robbins

Relation to previous practice

During the first part of my previous studies, I really started engaging with questioning the media by making a small online reissue of Raymond Queneau's book Exercices de Style. In this work, called Incidences Médiatiques, the user/reader was encouraged to explore the 99 different ways Queneau tells the same story by putting themselves in at least 99 different reading contexts. In order to suggest a more non-linear reading experience, reflecting on the notions of context, perspective and point of view, the user could unlock these stories by zooming in or out of the Web window, resizing it, changing the browser, going on a different device, etc. As part of my previous graduation project, I wanted to reflect on the status of networked writing and reading by programming my thesis in the form of a Web-to-Print website. Subsequently, this website was translated into physical space as a printed book, a set of meta flags, and a series of installations displayed in a set of exhibition rooms that followed the online structure of the thesis (home page, index, parts 1-2-3-4) Project link. It was my first attempt to create a physical interface inside an exhibition space, but it was focused on structure and non-linear navigation rather than on the elastic property of Web interfaces. As a first-year student of Experimental Publishing, I continued to work in that direction by eventually creating a meta-website making visible the HTML <meta> tags in the middle of an essay. I also worked on a geocaching pinball highlighting invisible Web events, as well as a Web oscillator inspired by the body size of analog instruments, whose amplitude and frequency range were directly related to the user's device screen size.

Incidences Médiatiques
click to watch GIF
Special issue 13 - Wor(l)ds for the Future
Tense screen recording montage of Tense
click to watch GIF
Media Spaces - graduation project
Media Spaces - graduation project
Media Spaces - graduation project
Media Spaces - graduation project







































Relation to a larger context

With the growing presence of digital tools in all aspects of our lives, the distinctions between the physical and the virtual world are being blurred as they tend to affect each other, create interdependencies, and transform our environments into cybernetic spaces where our behaviours tend to be translated into informational units. In that context, my aim is to observe and speculate on the ways cybernetics could considerably change and affect modes of representation inside exhibition spaces, and the agencies or status of the visitors themselves.

Curatorial Practice / New Media Art / Information Visualization / Software Art / Institutional Critique / Human Sciences / Cybernetics

Key References

Reading Sources

Themes (keywords)

  • Interfaced Reality
  • Museum Display vs Screen display
  • Exhibition space vs User interface
  • Web Elasticity vs Physical Rigidity
  • Museology / Curation / Gallery and Museum display
  • Technological context
  • Mediatization of Media / Meta Art

Draft Thesis

Notes:

  • For what concerns the form/share of the thesis itself, I am considering creating a Web-to-Print thesis that would display a different amount/arrangement of content (text and images) depending on the user's device (context). If this thesis has to be printed, each user should also get a different physical book (format, layout, etc.).
  • I know what I want to talk about, and how to connect it to my production, but the structure (parts, sub-parts, and order) is definitely not fixed.
  • The reasons why I connect the Web/digital interfaces specifically to the physical exhibition space should be clarified
  • Focus on the cybernetic empire, and its influence on different aspects of society (beyond the exhibition space)
  • Dystopian space?
  • How systems work, how they use you, how they structure you as you
  • Where are these sensors used outside of these exhibition spaces?

Introduction


[...]
People may now have more concrete experiences of digital/Web interfaces than of the physical space. Museums, galleries, restaurants, hotels, houses or cars are themselves commonly taking the shape of interfaced physical environments. Both online and in physical reality, our (past and present) behaviours (and non-behaviours) can be detected and used, through various technologies, in order to provide us with an individualized/customizable perception/experience of the same environment. With the embodiment of user interfaces in many layers of physical reality, the so-called "reality" becomes not only augmented but also optimized for our very own perception. In that sense, "what you see is what you get" may also define new modalities for the spaces of representation (exhibition spaces), the agencies of their visitors, and the experience of art itself.

Through digital and analogical comparisons between Web interfaces and exhibition spaces, I wish to find out which elements define, communicate or give structure to content inside both of these spaces, how they relate to each other, and what the effects of merging their concepts/properties could be.

I. Agencies and factors within the spaces of representation

1. THE AGENCIES OF USERS & SPECTATORS

What are users and spectators allowed or expected to do, what should they agree on, what is the meaning of being a spectator or a user, what is the purpose of an exhibition space or a Web interface, and how do both differ from and relate to each other on all these aspects?

1.1 The user agency through the Web interfaces

1.1.1 Terms, conditions, agreements

Cookies, privacy, legal uses, advertisement, copyrights, etc.

1.2 The spectator agency through the Exhibition Spaces/Museums/Galleries

1.2.1 Rules, safety, regulations

Artwork safety, public safety, prohibited items, public speaking, photography, equipment, behaviour, circulation, etc. Maybe even more than on the Web, being a gallery/museum visitor implies agreeing to terms and conditions.

2. SPATIAL/TECHNOLOGICAL FACTORS

What are the spatial, technological and political factors defining the context/situation in which the representation is being displayed and experienced?

2.1 Technological context of the Web

2.1.1 An infinite array of individualized/customized perspectives

In opposition to the physical exhibition space, the Web offers each of its users/visitors a custom point of view based on an innumerable and ever-changing array of technological factors. I like to call this array of factors: a (technological) context. Among these factors, we could list: the browser, the device, the operating system, the screen size, the resolution, the user configuration and default settings, the updates, the IP address, etc. This technological complexity diffracts the possible renders of the same Web page into an almost infinite array of user perspectives. From our own user perspective/point of view, behind our own screen, this technological complexity and the infinite spectrum of perspectives that it leads to can hardly be considered (except here, for example). This brings us to unconsciously forget about the singularity and fragility of what is being seen/experienced/interpreted.

Ref:

2.1.2 Elasticity, obsolescence and unpredictability / Responsive technology

Web representations are plastic/elastic in a way: they multiply and transform themselves as much as needed in order to be rendered in an optimal way through our own user perspective/interface. Added to that, the display/render of a website is also affected by the constant evolution of the Web itself, with patches, updates, expired and added elements that contribute to the ephemerality and unpredictability of what can be seen. In order to overcome the unpredictability of rendering online interfaces across the incredible diversity of connected devices, a technology of flexibility/responsiveness/elasticity has been developed, improved and democratized on the Web, aiming to offer an optimal render in most technological contexts.

Ref:

Gaëlle Calvary, Joëlle Coutaz, David Thevenin Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt

See more:

  • Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’ 2001

(New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001,

2.2 Technological contexts in the museum/exhibition space

2.2.1 Public space and agents of the production of knowledge

Architecture, scale, size, interior design, colors, layout, writing, arrangement, lighting, display, etc

2.2.2 Institutional critique (optional)

Questioning and redefining the exhibition space and the heritage of the White Cube through the practice of institutional critique (?)

II. Reversing the desktop metaphor

The desktop metaphor was invented in the early days of computing to facilitate the use and understanding of digital interfaces, by making mental associations with domains from the physical world. Now that it is democratized, widely used and quite often replaces our need to converge in physical spaces, I would like to reverse the process by drawing inspiration from the concepts of Web interfaces in order to suggest a singular experience and understanding of the physical exhibition space, which is as well another space of representation.

1. CONCEPTS OF THE INTERFACED REALITY

Conceiving the exhibition space as a digital Web interface and exploring concepts that bring together notions from both digital and physical world.

1.1 "Architectural Device"

Conceiving the architecture as a technological and political device made of a set of factors and parameters that can be configured

1.2 "Physical Events"

On the Web, our actions and inactions can be converted into (silent and invisible) events that can activate things and be converted into valuable information for advertisers, algorithms, etc. How could such a thing be conceptualized inside an exhibition space?

1.3 "Programmed physical space"

Comparing the programming of an interface with the curation of an exhibition space. Could an exhibition space be programmed?

1.4 "Exhibition User"

Conceiving the Spectator as a User of the physical space

1.5 "Variable Display"

Conceiving the physical space as an elastic/variable and potentially unpredictable display, in order to diffract the range of viewing contexts offered by the Web.

Conclusion

[...]

References


More here

What is my work, What do I want to tell, What is my position

Translated from discussion with Michael

People now have more concrete experiences of the digital/Web interface than of the physical space. Museums, hotels, houses, car interiors and restaurants are themselves becoming more and more comparable to digital interfaces where everything is optimized, and where our behaviours, actions and even inactions are being detected and converted into commands in order to offer a more customized (and profitable) experience to each of us. In that sense, we are getting closer to becoming users of our own interfaced physical reality. By creating an exhibition space explicitly inspired by a desktop Web interface, I wish to question what the future of the exhibition space could be, what the limits of this interfaced and individualized reality are, and how it could affect our own experience and understanding of art.

What could we learn from interface design? What could be the future of exhibition space?

"Bring attention to the systems underlying artistic productions" both on the Web and the physical world
"reversal of the desktop metaphor" (using the virtual as "concrete" to metaphorically understand a physical exhibition space), what will be the future of an exhibition space... (is already working exhibition spaces working with sensors) scary and fascinating at the same time...
"my embracing/use of sensors isn't about proposing these techniques as a solution / ideal / about control... interfaces requiring less and less from us but paradoxically extracting more and more from us"
every small unconsidered behaviour is being used (or someone is trying to use it)...
there is unpredictability.... because of all the factors, want unexpected things to happen...
the reality of digital isn't all about precision and control, this notion of surprise is key for an experience.
Exploring the fullness of digital / programmed / computational media, including those "edge" cases / the "glitch" ... the surprise...
Examples from museums: (for instance Brussels has the MIM, the Musical Instruments Museum; sadly the old, now retired interface was a system with headphones that were activated in the space, so as you approached vitrines with a violin you would hear a performance of that instrument)...
How a mistake can create something else. / Bugs / Glitch. Letting an accident/surprise/the unexpected exist, exploring the fullness of digital programming
My position seems to fit with Researcher/Designer
Digital is not precise and omnipotent; it has its faults, flaws and vulnerabilities.


To check:


Software Art

Software creation, or the use of software concepts for the representation of artworks. Commonly puts the spectator in the role of a user.

Internet Art

Elements from the Internet brought outside of the Internet, promoting the Internet as part of both virtual and physical realities.

  • John Ippolito

Post-Internet Art vs Internet 2.0

Post-Internet Art: literally, art after the Internet. Can consist of using online material for later use in offline works, or can reflect on the effect of the Internet on various aspects of culture, aesthetics and society.

  • Olia Lialina

VS
Internet 2.0: assuming that a world without the Internet doesn't exist anymore

  • Zach Blas

Net Art

Started in the late '70s and nowadays associated with an outdated era of the Internet (1.0?).
Closely related to Network Art

  • Olia Lialina, My Boyfriend Came Back From the War, 1996

New Aesthetics

Confronting/merging virtual and physical, or humans and machine, etc

  • James Bridle

Funware

Gamification of non-game platforms in order to encourage some actions, behaviors, transactions with the help of various rewarding systems.

Connections to XPUB1

User viewing contexts (on the Web) from special issue 13

Description

Create motion from the diffraction of the user interface, which offers flexible and almost infinite possible renders of the same Web page. The perceptible variety of user viewing contexts tells about the plasticity of the user interface. This is the result of the wide range of user devices, window or screen sizes and Web browsers (among many other parameters). A first movement capture and montage of the user interface's plasticity was made as part of the post-production of my interpretation of the essay "Tense", part of Special Issue 13.

Capturing and putting into motion the user interface's plasticity

Trying to play around with browser window resizing in order to create a playful animation dedicated to being a thumbnail of the project. The first two screen captures will be the basis of the upcoming motion. I will first try to smooth the window movement and make the two screen captures fit together before synchronizing and looping them.


TENSE Motion Rectangle Format Loop in the loop
TENSE MOTION Initial Screen Capture 1




























Notes: Add Web Oscillator

Prototyping

Arduino

Early sketch about comparing and questioning our spectator experience of a physical exhibition space (where everything is often fixed and institutionalized) with our user experience of a Web space (where everything is far more elastic, unpredictable and obsolescent). I'm interested in how differently the same Web page can be rendered to different users depending on technological contexts (device nature, browser, IP address, screen size, zoom level, default settings, updates, luminosity, add-ons, restrictions, etc.). I would like to try to create a physical exhibition space/installation inspired by the technology of a Web user window interface, in order to then play with exhibition parameters such as the distance between the spectator and the artwork, the circulation in space, the luminosity/lighting of the artwork(s), the sound/acoustics, etc.

The distance between the wall behind the spectator and the artwork has to be translated into a variable that can affect sound or light in the room. The wall position could be connected to the dimensions of a user interface in real time with an Arduino and a motor.
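As a first sketch of that translation, here is a minimal Arduino example, assuming an HC-SR04 on pins 11/12 and a light (LED) on a PWM pin; the pin numbers, the 85 px-per-cm factor and the brightness range are placeholder assumptions of mine, not measured values.

 #include <NewPing.h>
 
 const int trigPin = 11;
 const int echoPin = 12;
 const int lightPin = 6;                      // PWM pin standing in for a light in the room
 
 NewPing wallSensor(trigPin, echoPin, 400);   // HC-SR04, max range set to 400 cm
 
 void setup() {
   Serial.begin(9600);
   pinMode(lightPin, OUTPUT);
 }
 
 void loop() {
   // median of several pings, converted to centimeters
   int distance = wallSensor.convert_cm(wallSensor.ping_median());
 
   // hypothetical translation: 1 cm of wall travel = 85 px of window width
   int windowWidth = distance * 85;
 
   // hypothetical translation: the further the wall, the dimmer the light
   int brightness = map(constrain(distance, 20, 200), 20, 200, 255, 0);
   analogWrite(lightPin, brightness);
 
   // stream both values so a computer (or later a Web page) can pick them up
   Serial.print("width:");
   Serial.print(windowWidth);
   Serial.print(" light:");
   Serial.println(brightness);
   delay(250);
 }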

Create a connected telemeter with an Arduino, an ultrasonic sensor (HC-SR04) and an ESP8266 module connected to the Internet

It seems possible to create your own telemeter with an Arduino by implementing an HC-SR04 ultrasonic sensor.
By doing so, the values captured by the sensor could potentially be directly translated into a variable.
Then, with the ESP8266 module, the values could be pushed to a database on the Internet. I could then open that website, see the values from anywhere, and use them to control light, sound or anything else I wish.
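A minimal sketch of that idea, assuming the ESP8266 is programmed directly with the Arduino ESP8266 core (rather than used as an AT-command modem behind an Uno), and that the sensor is wired to NodeMCU-style pins D5/D6; the network credentials and the "example.org/log" endpoint are placeholders, not an existing service.

 #include <ESP8266WiFi.h>
 
 const int trigPin = D5;                      // NodeMCU-style pin names
 const int echoPin = D6;
 
 long readDistanceCm() {
   digitalWrite(trigPin, LOW);  delayMicroseconds(2);
   digitalWrite(trigPin, HIGH); delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   return pulseIn(echoPin, HIGH) / 58;        // echo time (microseconds) to centimeters
 }
 
 void setup() {
   Serial.begin(9600);
   pinMode(trigPin, OUTPUT);
   pinMode(echoPin, INPUT);
   WiFi.begin("my-network", "my-password");   // placeholder credentials
   while (WiFi.status() != WL_CONNECTED) delay(500);
 }
 
 void loop() {
   long distance = readDistanceCm();
   WiFiClient client;
   if (client.connect("example.org", 80)) {   // placeholder server collecting the values
     client.print(String("GET /log?distance=") + distance + " HTTP/1.1\r\n" +
                  "Host: example.org\r\nConnection: close\r\n\r\n");
     client.stop();
   }
   delay(1000);
 }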

Tool/Material list:

  • Telemeter (used to get the distance between the device and an obstacle)
  • Rails
  • Handles
  • Wheels
  • Movable light wall
  • Fixed walls
  • USB Cable
  • Connection cables
  • Arduino
  • ESP8266
Connection cables (Arduino)
USB Cable
Arduino
HC-SR04 Ultrasonic Sensor
Plywood x 3
Handle
ESP8266
Rail









About the ultrasonic Sensor (HC-SR04)

Characteristics

Here are a few of the technical characteristics of the HC-SR04 ultrasonic sensor:

  • Power supply: 5v.
  • Consumption in use: 15 mA.
  • Distance range: 2 cm to 4 m.
  • Resolution or accuracy: 3 mm.
  • Measuring angle: < 15°.

Ref: More info about the sensor here and here

Where to buy the ultrasonic Sensor (HC-SR04)

Prototype 1 : Arduino + Resistor

During a workshop, we started with a very basic fake Arduino kit, an LED, a motor, and a sensor. After making a few connections, we got to understand a bit of how it works.


   #include <Servo.h>
   Servo myservo;  // create servo object to control a servo
   int pos = 0;    // variable to store the servo position
   int ldr = 0;    // variable to store light intensity
   void setup() {
   Serial.begin(9600); // begin serial communication, NOTE: set the same baudrate in the serial monitor/plotter
   myservo.attach(D7);  // attaches the servo on pin D7 to the servo object
   }
   void loop() {
   // let's put the LDR value in a variable we can reuse
   ldr = analogRead(A0);
   
   // the value of the LDR is between 400-900 at the moment
   // the servo can only go from 0-180
   // so we need to translate 400-900 to 0-180
   // also the LDR value might change depending on the light of day
   // so we need to 'constrain' the value to a certain range
   ldr = constrain(ldr, 400, 900); 
   // now we can translate
   ldr = map(ldr, 400, 900, 0, 180);
   // let's print the LDR value to the serial monitor to see if we did a good job
   Serial.println(ldr); // read voltage on analog pin 0, print the value to the serial monitor
   // now we can move the servo according to the light/our hand!
   myservo.write(ldr);      // tell the servo to go to the position stored in 'ldr'
   delay(15);    
   }


How to make an engine work
Credits: Dennis de Bel
How to make a sensor work
Credits: Dennis de Bel
How to make both the sensor and the engine work together
Credits: Dennis de Bel
Sensortest during workshop

Split Screen Arduino + Sensor + Serial Plotter + Responsive Space

Trying here to show the simultaneous responses between the sensor, the values, and the simulation.

Splitscreen Arduino + Sensor + Serial Plotter + Responsive Space




















Prototype 2: Arduino + Ultrasonic sensor

For this very simple first sketch and for the later ones, I will include the NewPing library, which greatly improves the ultrasonic sensor's capabilities.

Sketch 1: Arduino Uno + Sensor






















 #include <NewPing.h> 
 int echoPin = 10;
 int trigPin = 9;
 
 NewPing MySensor(trigPin, echoPin); // This defines a new sensor object
 
 void setup() {
   // put your setup code here, to run once:
   Serial.begin(9600);
 }
 
 void loop() {
   // put your main code here, to run repeatedly:
  int duration = MySensor.ping_median();        // median echo time in microseconds
  int distance = MySensor.convert_cm(duration); // convert the echo time to centimeters
 
  Serial.print(distance);
  Serial.println("cm");
  delay(250);
 }

Prototype 3: Arduino Uno + Sensor + LCD (+ LED)

All together from https://www.youtube.com/watch?v=GOwB57UilhQ

Sketch 2: Arduino Uno + Sensor + LCD
Sketch 3: Arduino Uno + Sensor + LCD + LED






















  #include <LiquidCrystal.h>
  
  LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration;
 int distance;
 
 void setup() {
   // put your setup code here, to run once:
     analogWrite(6,100);
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
 pinMode(echoPin, INPUT); // Sets the echoPin as an Input
 Serial.begin(9600); // Starts the serial communication
 
     
 }
 
 void loop() {
 long duration, distance;
   digitalWrite(trigPin,HIGH);
   delayMicroseconds(1000);
   digitalWrite(trigPin, LOW);
   duration=pulseIn(echoPin, HIGH);
   distance =(duration/2)/29.1;
   Serial.print(distance);
   Serial.println("CM");
   delay(10);
 // Prints the distance on the Serial Monitor
 Serial.print("Distance: ");
 Serial.println(distance);
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("Distance = ");
     lcd.setCursor(11,0);
     lcd.print(distance);
     lcd.setCursor(14,0);
     lcd.print("CM");
     
     delay(500);
     
 }

From this sketch, I started considering that the distance value could be sent directly to a computer and used to render a Web page depending on its value.
Note: it looks like this sensor's max range is 119 cm, which is almost 4 times less than the 4-meter max range stated in the component description.
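The short range might simply come from single noisy readings and the library's default maximum distance; the snippet below is an assumption of mine (not a verified fix) that sets NewPing's maximum distance explicitly and takes the median of more pings, reusing the trig/echo pins from prototype 2.

 #include <NewPing.h>
 
 #define MAX_DISTANCE 400                 // let NewPing wait for echoes up to 400 cm
 NewPing MySensor(9, 10, MAX_DISTANCE);   // trig = 9, echo = 10, as in prototype 2
 
 void setup() {
   Serial.begin(9600);
 }
 
 void loop() {
   // median of 10 pings: slower, but filters out stray zero/short readings
   unsigned int echoTime = MySensor.ping_median(10);
   Serial.print(MySensor.convert_cm(echoTime));
   Serial.println(" cm");
   delay(250);
 }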

Prototype 4: Arduino Uno + Sensor + LCD + 2 LED = Physical vs Digital Range detector

Using in-between values to activate the green LED
Once again, putting together the simulation and the device in use.

Sensor Test VS Elastic Space




































 #include <LiquidCrystal.h>
 #include <LcdBarGraph.h> 
 #include <NewPing.h> 
 
   LiquidCrystal lcd(10,9,5,4,3,2);
 
 const int LED1 = 13; 
 const int LED2 = 8;   
 const int trigPin = 11;
 const int echoPin = 12;
 
 long duration; //travel time
 int distance;
 int screensize;
 
 void setup() {
   // put your setup code here, to run once:
     analogWrite(6,100);
     lcd.begin(16,2);
     pinMode(trigPin, OUTPUT); // Sets the trigPin as an Output
     pinMode(echoPin, INPUT); // Sets the echoPin as an Input
     Serial.begin(9600); // Starts the serial communication
 
     pinMode(LED1, OUTPUT);
     pinMode(LED2, OUTPUT);
 }
 
 void loop() {
 long duration, distance;
   digitalWrite(trigPin,HIGH);
   delayMicroseconds(1000);
   digitalWrite(trigPin, LOW);
   duration=pulseIn(echoPin, HIGH);
   distance =(duration/2)/29.1; //convert to centimers
   screensize = distance*85;
   Serial.print(distance);
   Serial.println("CM");
   Serial.print(screensize);
   delay(10);
 
   if ((distance >= 15) && (distance<=20))
   {
      digitalWrite(LED2, HIGH);
      digitalWrite(LED1, LOW);
   }
   else
   {
      digitalWrite(LED1, HIGH);
      digitalWrite(LED2, LOW);    
   }
 
 // Prints the distance on the Serial Monitor
 Serial.print("Distance: ");
 Serial.println(distance);
 
     lcd.clear();
     lcd.setCursor(0,0);
     lcd.print("ROOM");
     lcd.setCursor(6,0);
     lcd.print(distance);
     lcd.setCursor(9,0);
     lcd.print("cm");    
     lcd.setCursor(0,1);   // the second row of a 16x2 LCD is index 1
     lcd.print("SCR");
     lcd.setCursor(6,1);
     lcd.print(screensize);
     lcd.setCursor(9,1);
     lcd.print("x1080px");
         
     delay(500);
     
 }


I brought a second Arduino, 2 long breadboards, black cables and another LCD screen, and remade the setup in this format. For some reason the new LCD screen does not fit into the breadboard, and I need more male-to-female cables in order to connect it correctly. With this longer breadboard, I want to extend the range value system and make it visible with LEDs and sounds (a minimal sketch of that idea follows below).
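A minimal sketch of such a range value system, assuming one HC-SR04 on pins 11/12, four LEDs on pins 2-5 and a buzzer on pin 8; the pin numbers and the 10 cm step per range are placeholder choices of mine.

 const int trigPin = 11, echoPin = 12;
 const int buzzerPin = 8;
 const int ledPins[4] = {2, 3, 4, 5};       // one LED per distance range
 
 void setup() {
   pinMode(trigPin, OUTPUT);
   pinMode(echoPin, INPUT);
   pinMode(buzzerPin, OUTPUT);
   for (int i = 0; i < 4; i++) pinMode(ledPins[i], OUTPUT);
 }
 
 void loop() {
   // same measurement as the earlier prototypes
   digitalWrite(trigPin, LOW);  delayMicroseconds(2);
   digitalWrite(trigPin, HIGH); delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   int distance = pulseIn(echoPin, HIGH) / 58;          // centimeters
 
   // every 10 cm of distance lights the next LED in the row
   int index = constrain(distance / 10, 0, 3);
   for (int i = 0; i < 4; i++) digitalWrite(ledPins[i], i == index ? HIGH : LOW);
 
   // and selects a higher pitch (tone() needs a piezo/passive buzzer)
   tone(buzzerPin, 220 * (index + 1), 100);
   delay(200);
 }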

Upgrade























How to get more digital pins [not working]

I tried 4 different tutorials but still didn't find a way to make the thing work, which is very weird, so I will just give up and take an Arduino Mega =*(

ArduinoExtraDigitalPins





























Prototype 5: Arduino Uno + 3 Sensor + 3 LEDS

With a larger breadboard, connecting 3 sensors together. The next step will be to define different ranges of in-between values for each sensor in order to make a grid. To accomplish this grid I will add a second row of sensors like this, in order to get x and y values in space (see the sketch below).
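As a sketch of that grid idea, here is a minimal example with just two perpendicular sensors, one giving the x value and one the y value, each distance divided into 30 cm cells; the pin numbers and the cell size are placeholder assumptions of mine.

 const int trigX = 2, echoX = 3;     // sensor looking along the X axis
 const int trigY = 4, echoY = 5;     // sensor looking along the Y axis
 const int CELL_CM = 30;             // size of one grid cell
 
 long readDistanceCm(int trigPin, int echoPin) {
   digitalWrite(trigPin, LOW);  delayMicroseconds(2);
   digitalWrite(trigPin, HIGH); delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   return pulseIn(echoPin, HIGH) / 58;        // echo time (microseconds) to centimeters
 }
 
 void setup() {
   Serial.begin(9600);
   pinMode(trigX, OUTPUT); pinMode(echoX, INPUT);
   pinMode(trigY, OUTPUT); pinMode(echoY, INPUT);
 }
 
 void loop() {
   int cellX = readDistanceCm(trigX, echoX) / CELL_CM;  // which column the visitor is in
   int cellY = readDistanceCm(trigY, echoY) / CELL_CM;  // which row the visitor is in
   Serial.print("cell: ");
   Serial.print(cellX);
   Serial.print(",");
   Serial.println(cellY);
   delay(250);
 }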

Prototype 6: Arduino Uno + 3 Sensor + 12 LEDS

With 3 sensors added on 2 long breadboards, and with a different set of range values, we can start mapping a space.

SensorMediaQueries
Physical Space Mapping








Prototype 7: Arduino Uno + 12 LEDS + 3 Sensor + Buzzer + Potentiometer + LCD

For this prototype, I implement a buzzer that will emit a specific sound depending on the distance of the obstacle detected by the sensor (a minimal sketch of that mapping follows below). I also put back an LCD displaying the 3 sensors' values. The screen luminosity can be changed via a potentiometer.
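A minimal sketch of the audio part alone, assuming one sensor on pins 11/12, a piezo/buzzer on pin 8 driven with tone(), and the potentiometer on A0 dimming the LCD backlight through a PWM pin; all pin choices and ranges are placeholder assumptions of mine.

 const int trigPin = 11, echoPin = 12;
 const int buzzerPin = 8;
 const int backlightPin = 6;      // LCD backlight anode through a resistor on a PWM pin
 
 void setup() {
   pinMode(trigPin, OUTPUT);
   pinMode(echoPin, INPUT);
   pinMode(buzzerPin, OUTPUT);
   pinMode(backlightPin, OUTPUT);
 }
 
 void loop() {
   // measure the distance as in the earlier prototypes
   digitalWrite(trigPin, LOW);  delayMicroseconds(2);
   digitalWrite(trigPin, HIGH); delayMicroseconds(10);
   digitalWrite(trigPin, LOW);
   int distance = pulseIn(echoPin, HIGH) / 58;                      // centimeters
 
   // the closer the obstacle, the higher the pitch
   int pitch = map(constrain(distance, 5, 100), 5, 100, 1500, 200);
   tone(buzzerPin, pitch, 80);
 
   // the potentiometer (0-1023) sets the backlight brightness (0-255)
   analogWrite(backlightPin, analogRead(A0) / 4);
   delay(100);
 }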
Resources:

ArduinoMegaSensorBuzzerLCD





























Prototype 8: Arduino Uno + 12 LEDS + 3 Sensor on mini breadboards + Buzzer + Potentiometer + LCD

Same code, but a new setup detaching the sensors from each other and allowing them to be placed anywhere.

ArduinoMegaSensorBuzzerLCDMinibreadboard.jpg





























Prototype 9: Arduino Mega + 21 LEDS + 7 Sensors + Buzzer + Potentiometer + LCD

Sensor Wall 01
PhysicalMapping2

























 //LIBRARIES 
   
   #include "pitches.h" //PITCH 
   #include <LiquidCrystal.h> //LCD 
   //#include <LiquidCrystal_I2C.h> //LCD 
   #include <LcdBarGraph.h> //LCD 
   #include <Wire.h> // LCD
   #include <NewPing.h> //SENSOR ACCURACY 
   
   LiquidCrystal lcd(34, 35, 32, 33, 30, 31);
   
   //LCD 
   const int SDAPin = A4; //Data pin
   const int SCLPin = A5; //Clock pin
   
   //BUZZER
   const int BUZZER = 24; 
 
   //A
   
   int trigPinA= 2;
   int EchoPinA= 3;
 
   //B 
   
   int trigPinB= 4;
   int EchoPinB= 5;
 
   //C 
   
   int trigPinC= 6;
   int EchoPinC= 7;
   
   //D
   
   int trigPinD = 8;                                  
   int EchoPinD= 9;     
 
                              
   //E
   
   int trigPinE = 10;                                  
   int EchoPinE= 11;   
 
   //F
   
   int trigPinF = 12;                                  
   int EchoPinF= 13;   
      
   //G
   
   int trigPinG= 22;
   int EchoPinG= 23;
   
   //A RANGE OF LEDS
   
   int LED_A1_ping= 25; 
   int LED_A2_ping= 26;
   int LED_A3_ping= 27;
   
   //B RANGE OF LEDS
   
   int LED_B1_ping= 28;  
   int LED_B2_ping= 29;
   int LED_B3_ping= 36;
   
   //C RANGE OF LEDS
   
   int LED_C1_ping= 37;  
   int LED_C2_ping= 38;
   int LED_C3_ping= 39;
   
   //D RANGE OF LEDS
   
   int LED_D1_ping= 40;  
   int LED_D2_ping= 41;
   int LED_D3_ping= 42;
 
   //E RANGE OF LEDS
   
   int LED_E1_ping= 43 ;  
   int LED_E2_ping= 44;
   int LED_E3_ping= 45;
 
   //F RANGE OF LEDS
   
   int LED_F1_ping= 46;  
   int LED_F2_ping= 47;
   int LED_F3_ping= 48;
 
   //G RANGE OF LEDS
   
   int LED_G1_ping= 49;  
   int LED_G2_ping= 50;
   int LED_G3_ping= 51;
   
   //LCD DISPLAY
   //LiquidCrystal_I2C lcd = LiquidCrystal_I2C (0x3F,16,2);
   
   long duration, distance, UltraSensorA, UltraSensorB, UltraSensorC, UltraSensorD, UltraSensorE, UltraSensorF,  UltraSensorG; 
   char data;
   String SerialData="";
   
   void setup()
   { // START SETUP FUNCTION
   
   lcd.begin(16, 2);
   //lcd.init();
   
   Serial.begin (9600); 
                             
   pinMode(BUZZER, OUTPUT);
 
   //setup pins sensor A
   pinMode(trigPinA, OUTPUT);
   pinMode(EchoPinA, INPUT);
   pinMode(LED_A1_ping, OUTPUT);
   pinMode(LED_A2_ping, OUTPUT);
   pinMode(LED_A3_ping, OUTPUT);
 
   //setup pins sensor B
   pinMode(trigPinB, OUTPUT);
   pinMode(EchoPinB, INPUT);
   pinMode(LED_B1_ping, OUTPUT);
   pinMode(LED_B2_ping, OUTPUT);
   pinMode(LED_B3_ping, OUTPUT);
 
   //setup pins sensor C
   pinMode(trigPinC, OUTPUT);
   pinMode(EchoPinC, INPUT);
   pinMode(LED_C1_ping, OUTPUT);
   pinMode(LED_C2_ping, OUTPUT);
   pinMode(LED_C3_ping, OUTPUT);
   
   // setup pins sensor D
   pinMode(trigPinD, OUTPUT);                        
   pinMode(EchoPinD, INPUT);                         
   pinMode(LED_D1_ping, OUTPUT);                  
   pinMode(LED_D2_ping, OUTPUT);                  
   pinMode(LED_D3_ping, OUTPUT);                  
 
   //setup pins sensor E
   pinMode(trigPinE, OUTPUT);
   pinMode(EchoPinE, INPUT);
   pinMode(LED_E1_ping, OUTPUT);                  
   pinMode(LED_E2_ping, OUTPUT);                  
   pinMode(LED_E3_ping, OUTPUT);                  
   
   //setup pins sensor F
   pinMode(trigPinF, OUTPUT);
   pinMode(EchoPinF, INPUT);
   pinMode(LED_F1_ping, OUTPUT);                  
   pinMode(LED_F2_ping, OUTPUT);                  
   pinMode(LED_F3_ping, OUTPUT);                  
     
   //setup pins sensor G
   pinMode(trigPinG, OUTPUT);
   pinMode(EchoPinG, INPUT);
   pinMode(LED_G1_ping, OUTPUT);                  
   pinMode(LED_G2_ping, OUTPUT);                  
   pinMode(LED_G3_ping, OUTPUT);                  
   
   //initialize LED status 
   digitalWrite(LED_A1_ping,LOW);
   digitalWrite(LED_A2_ping,LOW);
   digitalWrite(LED_A3_ping,LOW);
   
   digitalWrite(LED_B1_ping,LOW);
   digitalWrite(LED_B2_ping,LOW);
   digitalWrite(LED_B3_ping,LOW);
   
   digitalWrite(LED_C1_ping,LOW);
   digitalWrite(LED_C2_ping,LOW);
   digitalWrite(LED_C3_ping,LOW);
   
   digitalWrite(LED_D1_ping,LOW);
   digitalWrite(LED_D2_ping,LOW);
   digitalWrite(LED_D3_ping,LOW);
 
   digitalWrite(LED_E1_ping,LOW);
   digitalWrite(LED_E2_ping,LOW);
   digitalWrite(LED_E3_ping,LOW);
 
   digitalWrite(LED_F1_ping,LOW);
   digitalWrite(LED_F2_ping,LOW);
   digitalWrite(LED_F3_ping,LOW);
 
   digitalWrite(LED_G1_ping,LOW);
   digitalWrite(LED_G2_ping,LOW);
   digitalWrite(LED_G3_ping,LOW);
  
   }
   
   void loop() 
   {
   
     
   // START THE LOOP FUNCTION
   SonarSensor(trigPinA,EchoPinA);              
   UltraSensorA = distance; 
   SonarSensor(trigPinB,EchoPinB);              
   UltraSensorB = distance; 
   SonarSensor(trigPinC,EchoPinC);              
   UltraSensorC = distance; 
   SonarSensor(trigPinD, EchoPinD);              
   UltraSensorD = distance;                                           
   SonarSensor(trigPinE,EchoPinE);              
   UltraSensorE = distance; 
   SonarSensor(trigPinF,EchoPinF);              
   UltraSensorF = distance; 
   SonarSensor(trigPinG,EchoPinG);               
   UltraSensorG = distance; 
     
   Serial.print("A: ");
   Serial.print(UltraSensorA);
   Serial.println(" cm");
 
   Serial.print("B: ");
   Serial.print(UltraSensorB);
   Serial.println(" cm");
 
   Serial.print("C: ");
   Serial.print(UltraSensorC);
   Serial.println(" cm");
 
   Serial.print("D: ");
   Serial.print(UltraSensorD);
   Serial.println(" cm");
 
   Serial.print("E: ");
   Serial.print(UltraSensorE);
   Serial.println(" cm");
 
   Serial.print("F: ");
   Serial.print(UltraSensorF);
   Serial.println(" cm");
   
   Serial.print("G: ");
   Serial.print(UltraSensorG);
   Serial.println(" cm");
   
   
   
   lcd.setCursor(2,2);
   lcd.print(UltraSensorD);
   
   lcd.setCursor(7,0);
   lcd.print(UltraSensorA);
   
   lcd.setCursor(11,2);
   lcd.print(UltraSensorC);
       
   // A SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
   // A1———————————————————————————————————————————————————————————————————————————————————————————————
     if(UltraSensorA <=10)// if distance is less than 10 cm turn the LED ON
     {
       lcd.setCursor(0,1);
       lcd.print("1:");
       digitalWrite(LED_A1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_A1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
     else                // else turn the LED OFF
     {
       digitalWrite(LED_A1_ping,LOW);
       digitalWrite(BUZZER, LOW);
     }
     // A2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorA >=11) && (UltraSensorA <=20))
     {
       lcd.setCursor(0,1);
       lcd.print("2:");
       digitalWrite(LED_A2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_A2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50); 
     }
     else                // else turn the LED OFF
     {
       digitalWrite(LED_A2_ping,LOW);
     }
     // A3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorA >=21) && (UltraSensorA <=30))
     {
       lcd.setCursor(0,1);
       lcd.print("3:");
       digitalWrite(LED_A3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_A3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100); 
     }
     else               
     {
       digitalWrite(LED_A3_ping,LOW);
     }
     // A4———————————————————————————————————————————————————————————————————————————————————————————————
 //    if((UltraSensorA >=31) && (UltraSensorA <=10000)) 
 //    {
 //      digitalWrite(LED_A4_ping,HIGH);
 //      lcd.setCursor(0,1);
 //      lcd.print("4:");
 //    }
 //    else                
 //    {
 //      digitalWrite(LED_A4_ping,LOW);
 //    }
     // B SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
     // B1———————————————————————————————————————————————————————————————————————————————————————————————
     if(UltraSensorB <=10)
     {
       lcd.setCursor(10,2);
       lcd.print("1:");
       digitalWrite(LED_B1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_B1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
     else
     {
       digitalWrite(LED_B1_ping,LOW);
     }
     // B2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorB >=11) && (UltraSensorB <=20)) 
     {
       lcd.setCursor(10,2);
       lcd.print("2:");
       digitalWrite(LED_B2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_B2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50);   }
     else                // else turn the LED OFF
     {
       digitalWrite(LED_B2_ping,LOW);
     }
     // B3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorB >=21) && (UltraSensorB <=30)) 
     {
       lcd.setCursor(10,2);
       lcd.print("3:");
       digitalWrite(LED_B3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_B3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100);     
     }
     else                // else turn the LED OFF
     {
       digitalWrite(LED_B3_ping,LOW);
     }
     // B4———————————————————————————————————————————————————————————————————————————————————————————————
 //    if((UltraSensorB >=31) && (UltraSensorB <=10000)) 
 //    {
 //    digitalWrite(LED_B4_ping,HIGH);
 //    lcd.setCursor(10,2);
 //    lcd.print("4:");
 //    }
 //    else                // else turn the LED OFF
 //    {
 //      digitalWrite(LED_B4_ping,LOW);
 //    }
     // C SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
     // C1———————————————————————————————————————————————————————————————————————————————————————————————
     if(UltraSensorC <=10)
     {
       lcd.setCursor(5,0);
       lcd.print("1:");
       digitalWrite(LED_C1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_C1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10); 
     }
     else
     {
       digitalWrite(LED_C1_ping,LOW);
     }
     // C2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorC >=11) && (UltraSensorC <=20)) // if distance is between 11 and 20 cm turn the LED ON
     {
       lcd.setCursor(5,0);
       lcd.print("2:");
       digitalWrite(LED_C2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_C2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);   }
     else                // else turn the LED OFF
     {
       digitalWrite(LED_C2_ping,LOW);
     }
     // C3———————————————————————————————————————————————————————————————————————————————————————————————
      if((UltraSensorC >=21) && (UltraSensorC <=30)) // if distance is between 21 and 30 cm, turn the LED on
     {
       lcd.setCursor(5,0);
       lcd.print("3:");
       digitalWrite(LED_C3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_C3_ping,LOW);
       digitalWrite(BUZZER, LOW);
        delay(10);
      }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_C3_ping,LOW);
     }
     // C4———————————————————————————————————————————————————————————————————————————————————————————————
  //    if((UltraSensorC >=31) && (UltraSensorC <=10000)) // if distance is 31 cm or more, turn the LED on
 //    {
 //      digitalWrite(LED_C4_ping,HIGH);
 //      lcd.setCursor(5,0);
 //      lcd.print("4:");
 //    }
  //    else                // else turn the LED OFF
 //    {
 //      digitalWrite(LED_C4_ping,LOW);
 //    }
 
   // D SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
 // D1———————————————————————————————————————————————————————————————————————————————————————————————
      if(UltraSensorD <=10) // if distance is 10 cm or less, turn the LED on
     {
       digitalWrite(LED_D1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_D1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_D1_ping,LOW);
       digitalWrite(BUZZER, LOW);
     }
     // D2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorD >=11) && (UltraSensorD <=20))
     {
       digitalWrite(LED_D2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_D2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50); 
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_D2_ping,LOW);
     }
     // D3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorD >=21) && (UltraSensorD <=30))
     {
       digitalWrite(LED_D3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_D3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100); 
     }
     else               
     {
       digitalWrite(LED_D3_ping,LOW);
     }
     // E SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
   // E1———————————————————————————————————————————————————————————————————————————————————————————————
      if(UltraSensorE <=10) // if distance is 10 cm or less, turn the LED on
     {
       digitalWrite(LED_E1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_E1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_E1_ping,LOW);
       digitalWrite(BUZZER, LOW);
     }
     // E2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorE >=11) && (UltraSensorE <=20))
     {
       digitalWrite(LED_E2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_E2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50); 
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_E2_ping,LOW);
     }
     // E3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorE >=21) && (UltraSensorE <=30))
     {
       digitalWrite(LED_E3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_E3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100); 
     }
     else               
     {
       digitalWrite(LED_E3_ping,LOW);
     }
     // F SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
   // F1———————————————————————————————————————————————————————————————————————————————————————————————
      if(UltraSensorF <=10) // if distance is 10 cm or less, turn the LED on
     {
       digitalWrite(LED_F1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_F1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_F1_ping,LOW);
       digitalWrite(BUZZER, LOW);
     }
     // F2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorF >=11) && (UltraSensorF <=20))
     {
       digitalWrite(LED_F2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_F2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50); 
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_F2_ping,LOW);
     }
     // F3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorF >=21) && (UltraSensorF <=30))
     {
       digitalWrite(LED_F3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_F3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100); 
     }
     else               
     {
       digitalWrite(LED_F3_ping,LOW);
     }
     // G SENSOR ———————————————————————————————————————————————————————————————————————————————————————————————
   // G1———————————————————————————————————————————————————————————————————————————————————————————————
      if(UltraSensorG <=10) // if distance is 10 cm or less, turn the LED on
     {
       digitalWrite(LED_G1_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(10);
       digitalWrite(LED_G1_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(10);  
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_G1_ping,LOW);
       digitalWrite(BUZZER, LOW);
     }
     // G2———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorG >=11) && (UltraSensorG <=20))
     {
       digitalWrite(LED_G2_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(50);
       digitalWrite(LED_G2_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(50); 
     }
      else                // else turn the LED OFF
     {
       digitalWrite(LED_G2_ping,LOW);
     }
     // G3———————————————————————————————————————————————————————————————————————————————————————————————
     if((UltraSensorG >=21) && (UltraSensorG <=30))
     {
       digitalWrite(LED_G3_ping,HIGH);
       digitalWrite(BUZZER, HIGH);
       delay(100);
       digitalWrite(LED_G3_ping,LOW);
       digitalWrite(BUZZER, LOW);
       delay(100); 
     }
     else               
     {
       digitalWrite(LED_G3_ping,LOW);
     }
   }
   
    void SonarSensor(int trigPinSensor, int echoPinSensor) // takes the trigger pin and the echo pin of one sensor
   {
     //generate the ultrasonic wave
   digitalWrite(trigPinSensor, LOW);// put trigpin LOW 
   delayMicroseconds(2);// wait 2 microseconds
   digitalWrite(trigPinSensor, HIGH);// switch trigpin HIGH
   delayMicroseconds(10); // wait 10 microseconds
   digitalWrite(trigPinSensor, LOW);// turn it LOW again
   
   //read the distance
    duration = pulseIn(echoPinSensor, HIGH); // pulseIn() returns how long the configured pin stays at the given level; here, how long echoPinSensor stays HIGH, i.e. the round-trip time of the pulse in microseconds
    distance = (duration / 2) / 29.1; // divide by two because the pulse travels to the object and back, then by 29.1 us/cm to convert microseconds into centimetres
   }

Sketch 10: Arduino Mega + 7 Sensors + LCD + 3 buzzers + P5.js

The goal here was to establish a first communication between the physical setup and a P5.js web page.
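
As a small sketch of what the Arduino side of this serial link could look like (the pin numbers and the single-sensor setup below are illustrative assumptions, not the actual Sketch 10 wiring), the board can simply print each distance reading as a line of text; the p5.serialcontrol app then forwards that stream to the p5.js sketch in the browser, where it can be read line by line (e.g. with the p5.serialport library) and drawn on screen.

    // Minimal serial-output sketch: one ultrasonic sensor, distance printed once per loop.
    // Assumption: trigPin/echoPin numbers are placeholders, not the wiring used in Sketch 10.
    const int trigPin = 9;
    const int echoPin = 10;

    void setup() {
      pinMode(trigPin, OUTPUT);
      pinMode(echoPin, INPUT);
      Serial.begin(9600); // the same baud rate has to be selected on the p5.js side
    }

    void loop() {
      // send a 10 microsecond ultrasonic pulse
      digitalWrite(trigPin, LOW);
      delayMicroseconds(2);
      digitalWrite(trigPin, HIGH);
      delayMicroseconds(10);
      digitalWrite(trigPin, LOW);

      // measure the echo and convert the round-trip time into centimetres
      long duration = pulseIn(echoPin, HIGH);
      long distance = (duration / 2) / 29.1;

      // one value per line; with several sensors the values could be joined with commas
      Serial.println(distance);
      delay(100);
    }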

About the ESP8266 module

The ESP8266 is a microcontroller IC with built-in Wi-Fi. It allows us to connect the Arduino to the internet, so that the values obtained from the sensors can be received directly on a self-hosted web page. From this same web page, it would also be possible to control LEDs, motors, LCD screens, etc.
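
As a rough idea of how this could work (a minimal sketch assuming the standard ESP8266WiFi library, placeholder network credentials, and a stand-in analog reading instead of the real ultrasonic sensors), the module can join a Wi-Fi network and serve the latest sensor value on a tiny self-hosted page:

    // Minimal ESP8266 sketch: connect to Wi-Fi and serve one sensor value over HTTP.
    // Assumptions: "my-network"/"my-password" are placeholders, analogRead(A0) stands in for the sensors.
    #include <ESP8266WiFi.h>

    const char* ssid = "my-network";
    const char* password = "my-password";

    WiFiServer server(80); // simple web server on port 80

    void setup() {
      Serial.begin(115200);
      WiFi.begin(ssid, password);
      while (WiFi.status() != WL_CONNECTED) { // wait until the module has joined the network
        delay(500);
        Serial.print(".");
      }
      Serial.println(WiFi.localIP()); // address of the self-hosted page
      server.begin();
    }

    void loop() {
      WiFiClient client = server.available(); // check for an incoming browser request
      if (!client) return;

      int sensorValue = analogRead(A0); // stand-in for the ultrasonic readings

      // answer with a very small plain-text HTTP response containing the value
      client.println("HTTP/1.1 200 OK");
      client.println("Content-Type: text/plain");
      client.println("Connection: close");
      client.println();
      client.println(sensorValue);
      delay(10);
    }

Visiting the printed IP address in a browser on the same network would then show the current reading; the same mechanism could be extended to receive commands back, for controlling LEDs, motors or an LCD as described above.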

Resources about the ESP8266 module

Kindly forwarded by Louisa:

Which ESP8266 to buy

Things to try

How to connect Arduino to P5.js

Serial Control Port ID through P5.serialcontrol app

1. Download and install the p5.serialcontrol app
2. Get your serial port ID through the app (see thumbnail)
3. Use https://editor.p5js.org/ for fewer issues with libraries, preferably in Chrome
4. References to try: test 1; test 2












Prototyping Resources

Do It Yourself Resources (from Dennis de Bel)

  • Instructables is a huge source of (written) tutorials on all kinds of topics. Keep in mind it's more quantity than quality. Interesting for you might be 'DIY sensors'
  • Hand Made Electronic Music: Great resource for cheap DIY electronics projects focusing on sound/music (PDF findable online)
  • Make: Electronics: Amazing, complete guide to everything 'electronics' (Warning, HUGE pdf)
  • Thingiverse: The place to find 3d printable mechanics, enclosures, parts etc.

Electronic Shops (physical)

LIST OF SHOPS (also more physical NL ones)

Electronic Webshops (NL)

Electronic Webshops (Rest)

PCB making EU (Expensive)

PCB making China (Cheap but import tax)

  • JLCPCB (about 1 week from design upload to having it in your hands; low-quality solder mask)
  • PCBWAY (about 1 week from design upload to having it in your hands)
  • ALLPCB (about 1 week from design upload to having it in your hands)

Arduino and Sensors

Sensor only Kit

  • 45-in-1 (AliExpress). An example of a sensor you will find in such a kit is documented here

Arduino Starter Projects

or slightly more complex:

or in videos:

or just many different ideas:

or - of course - on Instructables if you want to have a complete course:

or this course:

ARDUINO + PROCESSING (visualizing sensors)

MISCELLANEOUS KEYWORDS and LINKS

Installation

Creating an elastic exhibition space

Responsive Space Installation Simulation
Responsive Space (detail)
Spectator friendly physical exhibition space.png
Moving Wall Structure Schema 1





Resources

  • Movable walls build-out for the Art Museum of West Virginia University link
  • Gallery Wall System (GWS) link
  • CASE-REAL installs movable walls inside a basement art gallery in Tokyo link

Venues

Introduction

We will organize two moments of shared work in progress: one in October, one in November. These are not necessarily presentations, but rather conversation-based sessions to practice the "making public" act: speaking about our work, having conversations with people about it, and hearing people talk about our work.


Venue 1: Aquarium

Description


AQUARIUM 1.0


A Small Ecosystem for Living Thoughts

Monday, 11th October
19:30 – 21:30
Leeszaal Rotterdam West
Rijnhoutplein 3, 3014 TZ Rotterdam

with Clara Gradel, Floor van Meeuwen, Martin Foucaut, Camilo Garcia, Federico Poni, Nami Kim, Euna Lee, Kendal Beynon, Jacopo Lega and Louisa Teichmann

It’s oh-fish-ial! Students of the Experimental Publishing Master invite you to dive into their small ecosystem of living thoughts. Join us for an evening of conversation, discussion and new viewpoints. If you look closely, you might even see some early thesis ideas hatching. Let's leave no rock unturned.

Observation questionnaire

This exercise is a very small, humble, and almost 100% analog exercise questioning representation in two small steps.

1st step

photo of a brick












  • 1st step: I give a sheet of paper to people during the event and ask them to answer a series of questions concerning the object (a brick) displayed in the middle of the room on a podium. It is specified that they can be anywhere in the room, and in any position, while observing the brick. Here are the questions:


  • Please write down your first name:


  • Describe your position (sitting/standing/other):


  • Describe your location in the room:


  • Describe what you are seeing while looking at the screen:


  • Describe how you feel mentally/emotionally:



2nd step

photo of brick displayed inside a computer screen












  • 2nd step: I take the answers, wait one round, and then hand a new sheet of paper to the same people with the exact same questions, this time concerning the representation of the object (brick), displayed in the middle of the room on a computer screen placed on the same podium.

Answer Samples

1.0 Object on a podium

  • 1.1 Sitting on corner stairs —> Want to see it from different angles —> Feeling trapped, frustrated
  • 1.2 Sitting on stairs —> a rock looking dead —> Feeling sad
  • 1.3 Sitting on the left, close to the columns —> Rational observation —> Nostalgic memories, because they participated in the creation of the object as it looks right now
  • 1.4 Sitting in front of the object —> Calm and slightly confused
  • 1.5 Sitting on the floor next to stairs in between the side and the middle —> Looking at the object from the bottom —> Feeling a bit confused and inspired



2.0 Photo of the object displayed on a computer screen placed on a podium

  • 2.1 Sitting on a chair, seeing the brick from a bird's-eye perspective —> Feeling more in control of the situation
  • 2.2 Sitting very close to the brick —> Seeing a flat and almost abstract picture —> Feeling drawn to the picture, aesthetically pleasing, feeling less sad about the object
  • 2.3 Sitting under a table very far away —> Looking abstract but identifiable —> Excited about the unusual and childish observation position
  • 2.4 Sitting on stairs —> Seeing the brick in 2D —> Feeling fine
  • 2.5 Sitting on the stairs —> Seeing a side of the screen with a top-view photo of the object —> Feeling comfortable



Answers1_RepresentationQuestionnaire
Answers2_RepresentationQuestionnaire
Answers3_RepresentationQuestionnaire
Answers4_RepresentationQuestionnaire
Answers5_RepresentationQuestionnaire


























Readings (new) (English) (with notes in English)

About Institutional Critique

To read

→ 1. Art and Contemporary Critical Practice: Reinventing Institutional Critique Doc
→ 2. From the Critique of Institutions to an Institution of Critique - Andrea Fraser Doc
→ 3. Institutional Critique, an anthology of artists' writings - Alexander Alberro Doc

About Techno-Solutionism

To read

→ 1. The Folly of Technological Solutionism: An Interview with Evgeny Morozov - Natasha Dow Schüll

About Meta

To read

→ 1.  The meta as an aesthetic category Bruno Trentini (2014)
→ 2.  File:RMZ ARTIST WRITING(2).pdf The eye tells the story by Rosa Maria Zangenberg (2017)
→ 3.  Leonardo Da Vinci - Paragone by Louise Farago

About exhibition space

To read

→ 2. Kluitenberg, Eric, ed. Book of imaginary media. Excavating the dream of the ultimate communication medium. Rotterdam: NAi Publishers, 2006.
→ 3. The wall and the canvas: Lissitzky’s spatial experiments and the White Cube
→ 6. Decorative Arts: Billy Al Bengston and Frank Gehry discuss their 1968 collaboration at LACMA by Aram Moshayedi
→ 8.  File:Resonance and Wonder STEPHEN GREENBLATT.pdf Resonance and Wonder - STEPHEN GREENBLATT
→ 9.  A Canon of Exhibitions - Bruce Altshuler File:A Canon of Exhibitions - Bruce Altshuler.pdf
→ 10. Documenta - File:A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar.pdf A BRIEF HISTORY OF AN EXHIBITION AND ITS CONTEXTS - Klaus Siebenhaar
→ 11. Pallasmaa - The Eyes of the Skin File:Pallasmaa - The Eyes of the Skin.pdf
→ 12. Venturi - Learning from Las Vegas File:Venturi - Learning from Las Vegas.pdf
→ 13. Preserving and Exhibiting Media Art: Challenges and Perspectives - JULIA NOORDEGRAAF, COSETTA G. SABA; BARBARA LE MAÎTRE; VINZENZ HEDIGER Copyright: 2013 - Publisher: Amsterdam University Press Series: Framing Film

Reading/Notes

→ 1. After the White Cube. ref 2015 NOTES INSIDE

  • How and why the White Cube rose and became democratized
  • White Cube // Consumerism = Art Consumerism?
  • Exhibition Space > Artworks
  • Experience of interpretation = Entertainment of Art?
  • Museum vs Mausoleum


→ 2. Spaces of Experience: Art Gallery Interiors from 1800 – 2000 ref NOTES INSIDE

  • Art vs 1950s consumerism / Choreography of desire?
  • Check theorists Hermann von Helmholtz and Wilhelm Wundt


→ 3. Colour Critique: A Symposium on Colour as an Agent for Taste, Experience and Value in the Exhibition Space NOTES INSIDE
May 24, 2019 - Noise! Frans Hals, Otherwise, Frans Hals Museum
→ 4.  Noise! Frans Hals, Otherwise NOTES INSIDE

  • Role of colours in the viewer's experience of an exhibition
  • Institutional Critique
  • Institutionalised Space / White cube


→ 5. Mental Spaces - Joost Rekveld/Michael van Hoogenhuyze NOTES INSIDE
(course for Artscience 2007/8) doc

  • About perspective
  • About Space time
  • About Cyber Space


→ 6.  THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge Doc NOTES INSIDE
Stephanie Moser SOUTHAMPTON UNIVERSITY (MUSEUM ANTHROPOLOGY) 2010

  • Architecture (Neoclassical buildings)
  • Big vs Small exhibition Space
  • Lined up objects vs non systematic display
  • Architecture/Design
  • Gallery interiors (ceiling / interior design elements / furniture)
  • Colors
  • Individual lighting of objects vs global lighting
  • Dark vs Bright lighting
  • Chronological vs Thematic arrangement
  • Academic vs Journalistic writing
  • Busy layout vs Minimal Layout
  • Exhibition seen vs other exhibitions
  • Themed/idea-oriented vs object-led exhibitions
  • Didactic vs discovery exhibition
  • Contextual, immersive, or atmospheric exhibitions
  • Audience vs Reception


→ 7. Fantasies of the Library - Etienne Turpin (ed.), Anne-Sophie Springer (ed.) Ref; Publisher: The MIT Press; publication date: 1 Sept. 2018

  • How the physical organization of a bookshelf can influence its digital version
  • The book as a miniature gallery/exhibition space
  • The library as a public place of reading
  • Library vs Exhibition Space = Use vs Display
  • Book-theme exhibitions

About User Interface

Readings/Notes

→ 1. bootleg Alexander R. Galloway - The Interface Effect. 1st ed. Malden, USA: Polity Press.

  • The interface paradox
  • The less they do, the more they achieve and the more they become invisible & unconsidered
  • The interface as a "significant surface"
  • The interface as a gateway
  • The interface as "the place where information moves from one entity to another"
  • The interface as the media itself
  • The interface as "agitation or generative friction between different formats"
  • The interface as "an area" that "separates and mixes the two worlds that meet together there"


→ 2. bootleg Nick Srnicek - Navigating Neoliberalism: Political Aesthetics in an Age of Crisis NOTES INSIDE
Publisher: medium.com, publication date: 20 Oct. 2016

  • From an aesthetics of the sublime to an aesthetics of the interface
  • Cognitive mapping


→ 3. bootleg Program Or Be Programmed - Ten Commands For A Digital Age Douglas Rushkoff NOTES INSIDE
Rushkoff, Douglas, 2010. Program Or Be Programmed: Ten Commands For A Digital Age. 1st ed. Minneapolis, USA: OR Books.

  • "Instead of learning about our technology, we opt for a world in which our technology learns about us."
  • Programmed by the interfaces
  • From a transparent to an opaque medium


→ 4. bootleg The Best Interface Is No Interface - Golden Krishna NOTES INSIDE
Krishna, G., 2015. The Best Interface Is No Interface: The simple path to brilliant technology (Voices That Matter). 1st ed. unknown: New Riders Publishing.

  • "Screen Obsessed Approach to Design"
  • UI vs UX


→ 5. Plasticity of User Interfaces: A Revised Reference Framework NOTES INSIDE
Gaëlle Calvary, Joëlle Coutaz, David Thevenin, Quentin Limbourg, Nathalie Souchon, Laurent Bouillon, Murielle Florins, Jean Vanderdonckt

  • About the term 'Plasticity'


→ 6. Interface Critique - Beyond UX - Florian Hadler, Alice Soiné, Daniel Irrgang DOC

  • The interface as an "historical artifact", a "space of power"
  • The interface as a human-machine boundary
  • What is interface critique
  • Interface in computer science
  • The screen for Lev Manovich



More to read/see

→ 1. Bickmore, T.W., Schilit, B.N., Digestor: Device- Independent Access To The World Wide Web, in Proc. of 6th Int. World Wide Web Conf. WWW’6
         (Santa Clara, April 1997)

→ 2. Bouillon, L., Vanderdonckt, J., Souchon, N., Recovering Alternative Presentation Models of a Web Page with VAQUITA, Chapter 27, in Proc. of 4th Int. Conf. on Computer- Aided Design of User Interfaces CADUI’2002
         (Valenciennes, May 15-17, 2002)

→ 3. Calvary, G., Coutaz, J., Thevenin, D., Supporting Context Changes for Plastic User Interfaces: a Process and a Mechanism, in “People and Computers XV –
         Interaction without Frontiers”, Joint Proceedings of AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI’2001(Lille, 10-14 September 2001)

→ 4. Cockton, G., Clarke S., Gray, P., Johnson, C., Literate Development: Weaving Human Context into Design Specifications, in “Critical Issues in User Interface Engineering”,
         P. Palanque & D. Benyon (eds), Springer-Verlag, London, 1995.

→ 5. Graham, T.C.N., Watts, L., Calvary, G., Coutaz, J., Dubois, E., Nigay, L., A Dimension Space for the Design of Interactive Systems within their Physical Environments, in Proc. of Conf. on Designing Interactive Systems DIS’2000
          (New York, August 17-19, 2000,), ACM Press, New York, 2000,

→ 6. Lopez, J.F., Szekely, P., Web page adaptation for Universal Access, in Proc. of Conf. on Universal Access in HCI UAHCI’ 2001
         (New Orleans, August 5-10, 2001), Lawrence Erlbaum Associates, Mahwah, 2001,

→ 7. Thevenin, D., Coutaz, J., Plasticity of User Interfaces: Framework and Research Agenda, in Proc. of 7th IFIP International Conference on Human-Computer Interaction Interact' 99
         (Edinburgh, August 30 - September 3, 1999), Chapman & Hall, London, pp. 110-117.

→ 8. Thevenin, D., Adaptation en Interaction Homme-Machine: Le cas de la Plasticité, Ph.D. thesis, Université Joseph Fourier,
          Grenoble, 21 December 2001.

About User Condition

Readings

→ 1. The User Condition 04: A Mobile First World - Silvio Lorusso Doc

  • Most web users are smartphone users
  • How "mobile first" affects global web design
  • How "mobile first" affects the way we use computers

Readings (old) (mostly French) (with notes in French)

Books (old)


→ 1.  L'art comme expérience — John Dewey (french) ⚠️(yet to be filled)⚠️
         publisher: Gallimard (1934)
→ 2.  L'œuvre d'art à l'époque de sa reproductibilité technique — Walter Benjamin (french)
         publisher: Alia (1939)
→ 3.  La Galaxie Gutenberg — Marshall McLuhan (french)
         publisher: University of Toronto Press (1962)
→ 3.  Pour comprendre les médias — Marshall McLuhan (french)
         publisher: McGraw-Hill Education (1964)
→ 4.  Dispositif — Jean-Louis Baudry (french)
         publisher: Raymond Bellour, Thierry Kuntzel et Christian Metz (1975)
→ 5.  L’Originalité de l’avant-garde et autres mythes modernistes — Rosalind Krauss (french) ⚠️(yet to be filled)⚠️
         publisher: Macula (1993)
→ 6.  L'art de l'observateur: vision et modernité au XIXe siècle — Jonathan Crary (french)
         publisher: Jacqueline Chambon (Editions) (1994)
→ 7.  Inside the White Cube, the Ideology of Gallery Space — Brian O'Doherty (english) ⚠️(yet to be filled)⚠️
         publisher: Les presses du réel (2008)
→ 8.  Précis de sémiotique générale — Jean-Marie Klinkenberg (french) ⚠️(yet to be filled)⚠️
         publisher: Point (2000)
→ 9.  Langage des nouveaux médias — Lev Manovitch (french) ⚠️(yet to be filled)⚠️
         publisher: Presses du Réel (2001)
→ 10. L'empire cybernétique — Cécile Lafontaine (french)
         publisher: Seuil (2004)
→ 11.  La relation comme forme — Jean Louis Boissier (french)
         publisher: Genève, MAMCO(2004)
→ 12.  Le Net Art au musée — Anne Laforêt (french)
         publisher: Questions Théoriques(2011)
→ 13.  Narrative comprehension and Film communication — Edward Branigan (english)
         publisher: Routledge (2013)
→ 14. Statement and counter statement / Notes on experimental Jetset — Experimental Jetset (english)
          publisher: Roma (2015)
→ 15. Post Digital Print — Alessandro Ludovico (french) ≈
          publisher: B42 (2016)
→ 16. L'écran comme mobile — Jean Louis Boissier (french)
          publisher: Presses du réel (2016)
→ 17. Design tactile — Josh Clark (french)
          publisher: Eyrolles (2016)
→ 18. Espaces de l'œuvre, espaces de l'exposition — Pamela Bianchi (french)
          publisher: Eyrolles (2016)
→ 19. Imprimer le monde (french)
          publisher: Éditions HYX et les Éditions du Centre Pompidou (2017)
→ 20. Version 0 - Notes sur le livre numérique (french)
          publisher: ECRIDIL (2018)

Articles (old)

→ 1. Frederick Kiesler — artiste- architecte ⚠️(yet to be filled)⚠️
        (press release) Centre Pompidou; source: centrepompidou.fr (1996)
→ 2. Oublier l'exposition ⚠️(yet to be filled)⚠️
        Artpress special numéro 21 (2000)
→ 3. Composer avec l’imprévisible: Le questionnaire sur les médias variables ⚠️(yet to be filled)⚠️
        Jon Ippolito; source : variablemedia.net/pdf/Permanence (2003)
→ 4. Esthétique du numérique : rupture et continuité
        Fred Forest; source : archives.icom.museum (2010)
→ 5. La narration interactive ⚠️(yet to be filled)⚠️
        Dragana Trgovčević source : ensci.com/file_intranet/mastere_ctc/etude_Dragana_Trgovcevic.pdf (2011)
→ 6. Des dispositifs aux appareils - L'Espacement d'un calcul
        Anthony Masure source :  anthonymasure.com (2013)
→ 7. Le musée n'est pas un dispositif - Jean-Louis Déotte p.9 - 22 (2011)
→ 8. Apogée et périgée du White Cube Loosli, Alban

References

Exhibition space

→  Proun Spaces — El Lissitzky (1920)
→  City in Space — Frederick Kiesler (1920)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Sans titre — Michael Asher (1973)
→  Serra Corner prop n°7 (for Nathalie) Richard Serra (1983)
→  Speaking Wall (2009 - 2010)

Nothingness with Media

→  4′33″ — John Cage (1952)
→  Untitled - A Curse — Tom Friedman (1965)
→  The Air Conditioning Show — Terry Atkinson & Michael Baldwin (1966-67)
→  Sans titre — Michael Asher (1973)

Mediatization of Media

→  4′33″ — John Cage (1952)
→  TV Garden — Nam June Paik (1974)
→  Presents — Michael Snow (soon to be translated)
→  Lost Formats Preservation Society — Experimental Jetset (2000)
→  Lost Formats Winterthur — Experimental Jetset (2000)
→  L’atlas critique d’Internet Louise Drulhe (2014-2015)

Flags

→  Netflag — Mark Napier (2002)
→  019 - Flag show (2015)

User perspective

→  What you see is what you get — Jonas Lund (2012)

Media Time perception

→  Present Continuous Past — Dan Graham (1974)

Experimental cinema

→  Presents — Michael Snow (soon to be translated)
→  Displacements — Michael Naimark (1980)
→  BE NOW HERE — Michael Naimark (1995)

CSS composition

→  Sebastianly Serena
→  Scrollbar Composition
→  into time .com - Rafael Rozendaal
→  Ridge 11 - Nicolas Sassoon
→  Rectangulaire - Claude Closky
→  Jacksonpollock.org - Miltos Manetas
→  Moving Paintings - Annie Abrahams

Media deterioration

→  Img214270417
→  William Basinski - The Disintegration Loops

Undefined

→  Untitled Sans

User friendliness and anti-user friendliness

→  Web-Safe - Juha van Ingen

Media Art conservation

→  The Variable Media Initiative 1999
→  EAI Online Resource Guide for Exhibiting, Collecting & Preserving Media Art
→  Matters in Media Art
→  The International Network for the Preservation of Contemporary Art (INCCA)
→  Archiving complex digital artworks - Dušan Barok

Emulation

→  Seeing Double: Emulation in Theory and Practice

Technological Timeline

→  Technological Timeline

Media Art Online Archive

→  ACM SIGGRAPH Art Show Archives
→  Archive of Digital Art (ADA)
→  Ars Electronica Archive
→  Digital Art Web Archive (collected by Cornell)
→  Monoskop
→  The Rhizome ArtBase

Music/Sound

→  The end of music

HTML Quines

→  https://hugohil.github.io/dedans/
→  https://secretgeek.github.io/html_wysiwyg/html.html
→  http://all-html.net/?

More references to check (from THE DEVIL IS IN THE DETAILS: MUSEUM - Displays and the Creation of Knowledge)

  • Alexander, Edward P.

1997 The Museum in America. Walnut Creek,
CA: AltaMira Press.

  • Ambrose, Timothy, and Crispin Paine

2006 Museum Basics. 2nd edition. London:
Routledge.

  • Ames, Kenneth L., Barbara Franco, and L. Thomas Frye

1997 Ideas and Images: Developing Interpretive History Exhibits. Walnut Creek,
CA: AltaMira Press.

  • Ames, Michael

1992 Cannibal Tours and Glass Boxes: The Anthropology of Museums. 2nd edition.
Vancouver: University of British Columbia Press.

  • Barringer, Tim, and Tom Flynn, eds.

1997 Colonialism and the Object: Empire,
Material Culture and the Museum. London:
Routledge.

  • Belcher, Michael

1991 Exhibitions in Museums.
Leicester: Leicester Museum Studies.

  • Bennett, Tony

1995 The Birth of the Museum.
London: Routledge.

  • Black, Graham

2005 The Engaging Museum. London: Routledge.

  • Bouquet, Mary, ed.

2001 Academic Anthropology and the Museum.
New York: Berghahn Books.

  • Caulton, Tim

1998 Hands on Exhibitions: Managing Interactive Museums and Science Centres.
London: Routledge.

  • Coombes, Annie

1994 Reinventing Africa: Museums, Material Culture and Popular Imagination in Late Victorian and Edwardian England.
New Haven: Yale University Press.

  • Dean, David

1997 Museum Exhibition: Theory and Practice.
London: Routledge.

  • Dubin, Steven

1999 Displays of Power: Memory and Amnesia in the American Museum. New York: New York University Press.

  • 2006 Transforming Museums: Mounting Queen Victoria in a Democratic South Africa.

New York: Palgrave Macmillan.

  • Falk, John H., and Lynn Dierking

2000 Learning from Museums: Visitor Experiences and the Making of Meaning.
Walnut Creek, CA: AltaMira Press.

  • Fienup-Riordan, Anne

2005 Yup'ik Elders at the Ethnologisches Museum
Berlin: Fieldwork Turned on Its Head. Seattle: University of Washington Press.

  • Hein, George

1998 Learning in the Museum. London: Routledge.

  • Henderson, Amy, and Adrienne Kaeppler

1997 Exhibiting Dilemmas: Issues of Representation at the Smithsonian.
Washington, DC: Smithsonian Institution Press.
"In twelve essays on such diverse Smithsonian Institution holdings as the Hope Diamond, the Wright Flyer, wooden Zuni carvings, and the Greensboro, North Carolina Woolworth lunch counter that became a symbol of the Civil Rights movement, Exhibiting Dilemmas explores a wide range of social, political, and ethical questions faced by museum curators in their roles as custodians of culture."

  • Hooper-Greenhill, Eileen

1991 Museum and Gallery Education.
Leicester:Leicester University Press.

  • 1992 Museums and the Shaping of Knowledge.

London: Routledge.

  • 1994 Museums and Their Visitors.

London: Routledge.

  • 2001 Cultural Diversity: Developing Museum Audiences in Britain.

Leicester: Leicester University Press.

  • Kaplan, Flora E. S.

1995 Museums and the Making of 'Ourselves.'
Leicester: Leicester University Press.

  • Karp, Ivan, and Steven D. Lavine

1991 Exhibiting Cultures: The Poetics and Politics of Museum Display.
Washington, DC: Smithsonian Institution Press.

  • Kreps, Christina F.

2003 Liberating Culture: Cross-Cultural Perspectives on Museums, Curation, and Heritage Preservation.
London: Routledge.

  • Lindauer, Margaret

2006 The Critical Museum Visitor.
New Museum Theory and Practice: An Introduction. J. Marstine, ed. Pp. 203-225.
Malden: Wiley-Blackwell.

  • Lord, Barry, and Gail Lord, eds.

2002 The Manual of Museum Exhibitions.
Walnut Creek, CA: AltaMira Press.

  • Macdonald, Sharon, ed.

1998 The Politics of Display. London: Routledge.

  • Macdonald, Sharon, and Gordon Fyfe

1996 Theorizing Museums. Oxford: Blackwell.

  • MacGregor, Arthur


2007 Curiosity and Enlightenment: Collecting and Collections from the Sixteenth to the Nineteenth century.
New Haven: Yale University Press.

  • Macleod, Suzanne, ed.

2005 Reshaping Museum Space: Architecture,Design, Exhibitions. London: Routledge.

  • Mcloughlin, Moira

1999 Museums and the Representation of Native Canadians.
New York: Garland Publishing.

  • Metzler, Sally

2008 Theatres of Nature: Dioramas at the Field Museum.
Chicago: Field Museum of Natural History.

  • Moore, Kevin

1997 Museums and Popular Culture.
London: Cassell.

  • Moser, Stephanie

1999 The Dilemma of Didactic Displays: Habitat Dioramas, Life-Groups and Reconstructions of the Past. In Making Early Histories in Museums.
N. Merriman, ed. Pp. 65-116.
London: Cassell/Leicester University Press.

  • 2001 Archaeological Representation: TheVisual Conventions for Constructing Knowledge about the Past. In Archaeological Theory Today.


I. Hodder, ed. Pp. 262-283.
Cambridge: Polity Press.

  • 2003 Representing Human Origins: Constructing Knowledge in Museums and Dismantling the Display Canon.

Public Archaeology 3(1): 1-17.

  • 2006 Wondrous Curiosities: Ancient Egypt at the British Museum. Chicago: Chicago University Press.


  • 2008 Archaeological Representation: The Consumption and Creation of the Past.

In Oxford Handbook of Archaeology. B. Cunliffe and C. Gosden, eds. pp. 1048- 1077.
Oxford: Oxford University Press.

  • Pearce, Susan M., ed.

1994 Interpreting Objects and Collections.
Routledge: Leicester Readers in Museum Studies.

  • Pearce, Susan M.

1998 Museums, Objects and Collections.
Leicester: Leicester University Press.

  • Peers, Laura, and Alison K. Brown, eds.


2003 Museums and Source Communities: A Routledge Reader. London: Routledge.

  • Quinn, Stephen C.

2006 Windows on Nature: The Great Habitat
Dioramas of the American Museum of Natural History. New York: Harry N. Abrams.

  • Roberts, Lisa C.

1997 From Knowledge to Narrative: Educators and the Changing Museum. Washington,
DC: Smithsonian Institute Press.

  • Sandell, Richard, ed.

2002 Museums, Society, Inequality.
London: Routledge.

  • Scott, Monique

2007 Rethinking Evolution in the Museum: Envisioning African Origins.
London: Routledge.

  • Serrell, Barbara

1996 Exhibit Labels: An Interpretive Approach.
Walnut Creek, CA: AltaMira Press.

2006 Judging Exhibitions: A Framework for Excellence. Walnut Creek,
CA: Left Coast Press.

  • Sheets-Pyenson, Susan

1988 Cathedrals of Science: The Development ofColonial Natural History Museums During the Late Nineteenth Century.
Ontario: McGill-Queen's University Press.

  • Simpson, Moira

1996 Making Representations: Museums in the Post-Colonial Era.
London: Routledge.

  • Spalding, Julian

2002 The Poetic Museum: Reviving Historic Collections.
London: Prestel.

  • Swain, Hedley

2007 An Introduction to Museum Archaeology.
Cambridge: Cambridge University Press.

  • Vergo, Peter, ed.

1990 The New Museology. London: Reaktion Books.

  • Walsh, Kevin

1992 The Representation of the Past: Museums and Heritage in the Post-Modern World.
London: Routledge.

  • Witcomb, Andrea

2003 Re-Imagining the Museum: Beyond the Mausoleum.
London: Routledge.

  • Yanni, Carla

2005 Nature's Museums: Victorian Science and the Architecture of Display.
Princeton: Princeton Architectural Press.