Wang/GraduationProject

Draft project proposal

What do you want to make?

Steve'sCommentsWangPP

How do we adapt to nature and transform the language of nature?

  • Humans can accurately predict the regularity of tides and harness the kinetic energy of waves.
  • Humans use geomagnetism to determine direction and to make compasses.
  • Humans use wind energy to generate electricity and measure wind direction and speed with different instruments.

We are constantly obtaining energy and dynamics from nature.

While we adapt to and use natural energy, we continue to produce and invent various machines to transform it. This is a physical energy transformation.

I want to apply this method of energy conversion to this project: to create one or several devices that control digital devices, sounds, or web-page interactions through these natural variables.

In my past projects, I established a foundation for this project by tackling technical challenges, gathering ideas, and conducting focused research.

  • Rain Receiver was the first step, a wearable device that let sound interact with rain, exploring how machines could respond to natural elements.
  • With Sound Quilt, I moved into web-based interaction, using JavaScript and HTML to test out practical methods for digital performance through workshops and live demonstrations.
  • Finally, Solar Beep marked a significant step forward in hardware development, where I gained hands-on experience with machine design and production.

Here are descriptions of the three previous projects:

SI22: Rain Receiver - a device that interacts with weather/nature

https://pzwiki.wdka.nl/mediadesign/Wang_SI22#Rain_Receiver
https://pzwiki.wdka.nl/mediadesign/Express_lane#Rain_receiver

The Rain Receiver project started with a simple question during a picnic: what if an umbrella could do more than just protect us from the rain? Imagined as a device to capture and convert rain into digital signals, the Rain Receiver is designed to archive nature's voice, capturing the interactions between humans and the natural world. Inspiration for the project came from the "About Energy" skills classes, where I learned about ways to use natural energy—like wind power for flying kites. This got me thinking about how we could take signals from nature and use them to interact with our devices.

The Rain Receiver came together through experimenting with Max/MSP and Arduino. Using a piezo sensor to detect raindrops, I set it up to translate each drop into MIDI sounds. By sending these signals through Arduino to Max/MSP, I could automatically trigger the instruments and effects I had preset. Shifting away from an umbrella, I designed the Rain Receiver as a wearable backpack so people could experience it hands-free outdoors.
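
The original signal chain ran from the piezo sensor through Arduino into Max/MSP. As a rough illustration of the same idea in the browser stack I later worked with, the sketch below treats each sensor spike as a raindrop and turns it into a note via the Web MIDI API; the threshold, the note range, and the onSensorReading() hook are placeholders, not parts of the original patch.

```javascript
// Hedged sketch: turn raindrop-like sensor spikes into MIDI notes in the
// browser via the Web MIDI API (Chromium only). The threshold, note range,
// and the onSensorReading() hook are placeholders, not the original patch.
let midiOut = null;

navigator.requestMIDIAccess().then((access) => {
  // Take the first available MIDI output, e.g. a virtual port into a DAW.
  midiOut = access.outputs.values().next().value || null;
});

const THRESHOLD = 200; // a piezo reading above this counts as a raindrop

function onSensorReading(value) {
  if (!midiOut || value < THRESHOLD) return;
  // Map spike strength to a note between 48 (C3) and 72 (C5).
  const note = 48 + Math.min(24, Math.round((value - THRESHOLD) / 32));
  midiOut.send([0x90, note, 100]);                       // note on, velocity 100
  setTimeout(() => midiOut.send([0x80, note, 0]), 150);  // note off after 150 ms
}
```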

With help from Zuzu, we connected the receiver to a printer, creating symbols like "/" marks alongside apocalyptic-themed words to archive each raindrop as a kind of message. After fixing a few bugs, the Rain Receiver was showcased at WORM’s Apocalypse event, hinting at how this device could be a meaningful object in the future.

SI23: sound experimentation tools based on HTML

https://pzwiki.wdka.nl/mediadesign/Express_lane#Javasript
https://pzwiki.wdka.nl/mediadesign/JavaScriptClub/04
During my experiments with Tone.js, I found its sound effects quite limited. This led me to explore other JavaScript libraries with more interesting sound capabilities, such as Pizzicato.js. Pizzicato.js offers effects like ping pong delay, fuzz, flanger, tremolo, and ring modulation, which allow for much more creative sound experimentation. Since everything runs online, there is no need to download software, and users can experience sound together with visual and interactive web elements.
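
To give a sense of the effect-stacking Pizzicato.js allows, here is a minimal sketch that generates a sine wave and chains a ping pong delay and a tremolo onto it. It assumes the Pizzicato library is already loaded on the page, and the #play/#stop buttons are placeholder elements; option names follow the Pizzicato documentation but may differ slightly between versions.

```javascript
// Hedged sketch: layering Pizzicato.js effects on a generated wave.
// Assumes the Pizzicato library is loaded on the page; #play and #stop
// are placeholder buttons that satisfy the browser's autoplay policy.
const tone = new Pizzicato.Sound({
  source: 'wave',
  options: { type: 'sine', frequency: 220 }
});

const pingPong = new Pizzicato.Effects.PingPongDelay({
  feedback: 0.6,
  time: 0.4,
  mix: 0.5
});

const tremolo = new Pizzicato.Effects.Tremolo({
  speed: 5,
  depth: 0.8,
  mix: 0.7
});

tone.addEffect(pingPong); // effects stack in the order they are added
tone.addEffect(tremolo);

document.querySelector('#play').addEventListener('click', () => tone.play());
document.querySelector('#stop').addEventListener('click', () => tone.stop());
```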

Inspired by deconstructionism, I created a project that presents all these effects on one webpage, experimenting with ways to layer and combine them like a "sound quilt."

To introduce the project and gather feedback, I hosted a workshop and a performance. I started a JavaScript Club to introduce sound-related JavaScript libraries and show people how to experiment with sound on the web. At the end of the workshop, in the form of an "examination", I gave out a zine hidden inside a pen, called Script Partner. Each zine included unique code snippets for creating instruments or sound effects.

For the performance, I used an HTML interface I designed, displayed on a large TV screen. The screen showed a wall of bricks, each brick linked to a frequency. By interacting with different bricks and sound-effect sliders, the interface created a blend of sound and visuals: a strange, wobbly, and mysterious soundscape.
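
The brick logic was essentially "one brick, one frequency, click to toggle". The sketch below is a stripped-down reconstruction of that idea with plain Web Audio, not the original interface; the frequency list and the generated buttons are placeholders.

```javascript
// Hedged sketch of the brick-to-frequency idea using plain Web Audio.
// The frequency list and the generated <button> bricks are placeholders,
// not the original performance interface.
const ctx = new AudioContext();
const freqs = [110, 146.83, 196, 246.94, 329.63, 440];
const playing = new Map(); // brick index -> running oscillator

freqs.forEach((freq, i) => {
  const brick = document.createElement('button');
  brick.className = 'brick';
  brick.textContent = `${freq} Hz`;
  document.body.appendChild(brick);

  brick.addEventListener('click', () => {
    if (ctx.state === 'suspended') ctx.resume(); // autoplay policy
    if (playing.has(i)) {
      playing.get(i).stop(); // toggle off: stop this brick's oscillator
      playing.delete(i);
      return;
    }
    const osc = ctx.createOscillator();
    osc.type = 'square';
    osc.frequency.value = freq;
    osc.connect(ctx.destination);
    osc.start();
    playing.set(i, osc);
  });
});
```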

SI24: MIDI device

https://pzwiki.wdka.nl/mediadesign/Wang_SI24#Midi_Controller
Rain Receiver is, in fact, a MIDI controller, just one driven by a sensor. The core logic of a MIDI controller is to help users map their own values/parameters in the DAW, using potentiometers, sliders, buttons for sequencer steps (momentary), or buttons for notes (latching). The layout of the controls on these devices usually features a tight square grid of knobs, a common design for controllers intended for efficient control over parameters such as volume, pan, filters, and effects. Over a period of using pedals and MIDI controllers for performance, I have organized most of my mappings into groups, with each group containing 3 or 4 parameters.

For example, the Echo group includes the Dry/Wet, Time, and Pitch parameters, while the ARP group includes Frequency, Pitch, Fine, Send C, and Send D.

In such cases, a controller with a 3x8 grid of knobs is not well suited to mapping effects with five parameters, like the ARP. A tree-like or radial structure would fit the ARP's more complex configuration better. Structures such as planets and satellites show how a large star can be radially mapped to an entire galaxy; a tiny satellite orbiting a planet may follow an elliptical path rather than a circular one; comets move in irregular patterns. The logic behind all these phenomena is similar to the logic I use when producing music.
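
To make the grouping concrete, a tree or radial layout can be represented as a nested object mapping each group to its MIDI CC numbers, so one gesture can sweep a whole branch instead of a single knob. This is only an illustrative sketch; the CC numbers are invented and do not correspond to Solar Beep's actual mapping.

```javascript
// Hedged sketch: a tree of parameter groups mapped to MIDI CC numbers.
// The CC numbers are invented placeholders, not Solar Beep's real mapping.
const paramTree = {
  Echo: { 'Dry/Wet': 20, Time: 21, Pitch: 22 },
  ARP:  { Frequency: 30, Pitch: 31, Fine: 32, 'Send C': 33, 'Send D': 34 }
};

// Send one gesture value to every parameter in a group (midiOut is a
// Web MIDI output), scaling it per position along the branch.
function sweepGroup(midiOut, groupName, value /* 0..127 */) {
  const ccs = Object.values(paramTree[groupName]);
  ccs.forEach((cc, i) => {
    const scaled = Math.round(value * (1 - i / (2 * ccs.length)));
    midiOut.send([0xB0, cc, scaled]); // control change on channel 1
  });
}
```

Calling sweepGroup(out, 'ARP', 96) would then move all five ARP parameters in one gesture, which a 3x8 knob grid cannot do.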

"Solar Beep" is the result of this search, built to elevate creative flow with a uniquely flexible design. It offers three modes—Mono, Sequencer, and Hold—alongside 8 knobs, a joystick, and a dual-pitch control for its 10-note button setup. With 22 LEDs providing real-time visual feedback, "Solar Beep" simplifies complex mappings and gives users an intuitive experience, balancing precision and adaptability for a more responsive and engaging production tool.

How do you plan to make it?

Following the same formula I used in SI24, the steps will be similar:

  • Research
  • Draft
  • Shopping list based on the draft
  • Process
  • Debug

I want to create a device that captures natural dynamics—like wind, tides, and direction—and converts them into MIDI signals. These signals can then control musical instruments or interact with web pages and software, offering new experimental methods for sound and visual design.
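
As a first sketch of what that conversion could look like in the web stack, assume the sensor board simply prints one numeric reading per line over USB serial: the browser reads the stream with the Web Serial API, scales each reading to 0-127, and forwards it as a MIDI control change with the Web MIDI API. Both APIs are Chromium-only, and the input range, CC number, and #connect button are placeholders until real sensor tests exist.

```javascript
// Hedged sketch: read one numeric sensor value per line over Web Serial and
// forward it as a MIDI control change (both APIs are Chromium only). The
// 0-1023 input range, CC number 1, and the #connect button are placeholders.
async function bridgeSensorToMidi() {
  const midi = await navigator.requestMIDIAccess();
  const out = midi.outputs.values().next().value;

  const port = await navigator.serial.requestPort();
  await port.open({ baudRate: 9600 });

  const reader = port.readable
    .pipeThrough(new TextDecoderStream())
    .getReader();

  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep the incomplete last line for the next chunk
    for (const line of lines) {
      const reading = parseFloat(line);
      if (Number.isNaN(reading)) continue;
      const cc = Math.max(0, Math.min(127, Math.round((reading / 1023) * 127)));
      out.send([0xB0, 1, cc]); // control change #1 on channel 1
    }
  }
}

// Web Serial must be requested from a user gesture, e.g. a "connect" button.
document.querySelector('#connect').addEventListener('click', bridgeSensorToMidi);
```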

To support this, I’ll design dedicated web pages and visual elements to enhance the interactive experience with the device.

In the initial testing phase, I’ll focus on three natural variables:

  • waves
  • direction
  • wind

Each variable brings unique characteristics:

  • wind, for instance, is unpredictable and unstable;
  • tides are relatively regular and consistent;
  • direction requires intentional human input, like a compass.

Through sensor-based testing, I’ll gather and analyze data from each of these factors to determine which is most suitable for this project. By integrating these natural variables, I hope to introduce new ways of interaction in both sound and visual media.
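
A first pass at that analysis could be as simple as logging a window of readings per variable and comparing their spread, so that "unpredictable" (wind) and "regular" (tides) become numbers rather than impressions. The sketch below computes mean and standard deviation over placeholder logs; the actual data would come from the sensor tests.

```javascript
// Hedged sketch: compare logged readings from each natural variable by their
// spread, as a rough first measure of how stable or erratic each source is.
function summarize(readings) {
  const mean = readings.reduce((sum, v) => sum + v, 0) / readings.length;
  const variance =
    readings.reduce((sum, v) => sum + (v - mean) ** 2, 0) / readings.length;
  return { mean, stdDev: Math.sqrt(variance) };
}

// Placeholder logs; real data would come from the sensor tests.
const logs = {
  wind:      [2.1, 7.8, 0.4, 5.5, 9.2, 1.3],
  tides:     [1.0, 1.2, 1.4, 1.3, 1.1, 1.0],
  direction: [180, 182, 181, 250, 251, 249]
};

for (const [name, readings] of Object.entries(logs)) {
  const { mean, stdDev } = summarize(readings);
  console.log(`${name}: mean ${mean.toFixed(2)}, spread ${stdDev.toFixed(2)}`);
}
```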

What is your timetable?

From September to December, I will continuously run workshops, exercises, and performances related to the main subject in order to develop more concrete content, including:

  • a JavaScript Club about using web audio tools for sound experiments;
  • the INC zine event;
  • a public event: a performance in the kitchen using a microwave to count time;
  • a colloquium workshop;
  • a performance presenting the devices I have made.

Why do you want to make it?

I think nature is a broad subject. It could relate to using nature as a language to communicate through various machines, or as a means of prediction, to predict fortune, though not in a typical divination function. Factors such as direction, temperature, magnetism, gravity... could become connections to the device, controlling it in an unstable way, just as these natural factors themselves behave.

Who can help you and how?

Xpub tutors will help me a lot to clarify my logical structure and assist me with the technological issues I may encounter during the process. I will also try to contact relevant engineers, product designers, sound designers, or relevant communities for more inspiration. During the testing phase, I will reach out to different communities to organize workshops and performances for testing the devices and gathering feedback from users.

Relation to previous practice

Building on my experiences with SI22, SI23, and SI24, I will combine my technical skills and creative exploration from those projects. The process of researching, drafting, and debugging will support my development approach, allowing me to incorporate previous learnings into this new project and create a seamless integration of sound and nature.

Relation to a larger context

This project connects to broader themes of environmental awareness and the fusion of technology with nature. By exploring the interplay between natural phenomena and digital soundscapes, it also uses unstable elements from nature to drive digital systems. This could be an important way to expand creativity in sound design.

References/bibliography

https://www.ableton.com/en/packs/inspired-nature/
https://www.lovehulten.com/

Fortune

Created by the Shanghai-based design studio automato.farm, 'BIY™ - Believe it Yourself' is a series of real-fictional belief-based computing kits for making and tinkering with vernacular logics and superstitions. The team worked with experts in fortune telling from Italy, geomancy from China, and numerology from India to translate their knowledge and beliefs into three separate kits – BIY.SEE, BIY.MOVE and BIY.HEAR. They invite users to tinker with cameras that can see luck*, microphones that interpret your destiny*, and compasses that can point you to harmony and balance*.

http://automato.farm/portfolio/believe_it_yourself/

Other inspirations:

Plants


https://www.datagarden.org/technology

https://design-milk.com/love-hultens-desert-songs-sounds-like-a-blast-from-the-chloroplast/

Gravities

Bouncy Notes

https://dillonbastan.com/inspiredbynature_manuals/Bouncy%20Notes%20User%20Manual.pdf
https://www.youtube.com/watch?v=C2hQ-WbKBhU

Droplets

https://finneganeganegan.xyz/works/droplets

Thesis Outline

This thesis will explore how natural forces like temperature, direction, magnetism, and gravity can be used to control sound devices such as MIDI controllers and synthesizers. Rather than relying on traditional, stable inputs (like pressing keys or turning knobs), it investigates how unpredictable, changing elements from nature can be used to influence sound production. The idea is to treat nature as a kind of "language" that machines can understand and respond to in real time, creating a more dynamic and organic way to control devices.

Beyond just controlling sound, this project aims to experiment with how different technologies can interact with each other. By using natural forces to control devices, we can create new and unexpected results. For example, changes in wind direction might not only affect sound but also trigger visual changes or interact with other devices, creating a chain reaction. The goal is to explore how these connections between sound, natural data, and other forms of technology can open up creative possibilities that go beyond traditional methods.
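
One way to organize such a chain reaction is a small event bus: a reading is published once, and any number of listeners (a synth parameter, a visual element, another device) subscribe to it. The sketch below only shows the structure; the listener bodies and the #dial element are placeholders.

```javascript
// Hedged sketch: a minimal event bus so one natural reading can fan out to
// sound, visuals, and other devices. Listener bodies are placeholders.
const listeners = [];
const onReading = (fn) => listeners.push(fn);
const publish = (reading) => listeners.forEach((fn) => fn(reading));

// Sound: scale the reading into a (placeholder) filter cutoff.
onReading((r) => console.log('set filter cutoff to', 200 + r * 20, 'Hz'));

// Visuals: rotate a placeholder on-screen element by the reading.
onReading((r) => {
  const el = document.querySelector('#dial');
  if (el) el.style.transform = `rotate(${r}deg)`;
});

// Another device: forward the reading as a MIDI CC value (0-127).
onReading((r) => console.log('forward as MIDI CC value', Math.min(127, Math.round(r))));

// One wind-direction update then triggers the whole chain.
publish(87);
```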

To take this experiment further, the thesis will also explore how these ideas can be combined into a website or an interactive HTML platform. This would allow users to experience the sound, visuals, and interactions online, making the project more immersive. By connecting natural forces with sound devices and digital platforms, this project aims to create an evolving system where the physical and digital worlds blend together in new and interesting ways.

Chapter 1: Natural Forces

Introduction to Natural Forces

The Concept of Nature as Language

Chapter 2: Interactivity Connections

Chapter 3: Digital Platforms and Future Applications

Conclusion