Wang/GraduationProject
Wang ziheng
=Draft project proposal=
==What do you want to make?==
[[Steve'sCommentsWangPP]]

How do we adapt to nature and transform the language of nature?
*''Humans can accurately predict the regularity of tides and harness the kinetic energy of waves.''
*''Humans use geomagnetism to determine direction and make compasses.''
*''Humans use wind energy to generate electricity and measure wind direction and speed with different instruments.''
We are constantly obtaining energy and dynamics from nature.
While we adapt to and use natural energy, we keep producing and inventing machines that transform it. This is a physical transformation of energy.
I want to apply this method of energy conversion to this project, creating one or several devices that control digital devices, sounds, or web-page interactions through these natural variables.
In my past projects, I established a foundation for this project by tackling technical challenges, gathering ideas, and conducting focused research.
*''Rain Receiver was the first step: a wearable device that let sound interact with rain, exploring how machines could respond to natural elements.'' <br>
*''With Sound Quilt, I moved into web-based interaction, using JavaScript and HTML to test practical methods for digital performance through workshops and live demonstrations.'' <br>
*''Finally, Solar Beep marked a significant step forward in hardware development, where I gained hands-on experience with machine design and production.''<br>
Here are descriptions of the three previous projects:
===SI22: Rain Receiver, a device that interacts with weather/nature===
https://pzwiki.wdka.nl/mediadesign/Wang_SI22#Rain_Receiver<br>
https://pzwiki.wdka.nl/mediadesign/Express_lane#Rain_receiver

The Rain Receiver project started with a simple question during a picnic: what if an umbrella could do more than just protect us from the rain? Imagined as a device to capture and convert rain into digital signals, the Rain Receiver is designed to archive nature's voice, capturing the interactions between humans and the natural world.
Inspiration for the project came from the "About Energy" skills classes, where I learned about ways to use natural energy—like wind power for flying kites. This got me thinking about how we could take signals from nature and use them to interact with our devices.<br><br>
The Rain Receiver came together through experimenting with Max/MSP and Arduino. Using a piezo sensor to detect raindrops, I set it up to translate each drop into MIDI sounds. By sending these signals through Arduino to Max/MSP, I could automatically trigger the instruments and effects I had preset. Shifting away from an umbrella, I designed the Rain Receiver as a wearable backpack so people could experience it hands-free outdoors.<br><br>
With help from Zuzu, we connected the receiver to a printer, creating symbols like "/" marks alongside apocalyptic-themed words to archive each raindrop as a kind of message. After fixing a few bugs, the Rain Receiver was showcased at WORM’s Apocalypse event, hinting at how this device could be a meaningful object in the future.<br>
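The drop-to-note step can be sketched in plain JavaScript (the actual patch used Arduino and Max/MSP; the threshold, note number, and scaling here are illustrative assumptions, not the values from the patch):

```javascript
// Convert a raw piezo reading (0-1023, as from Arduino's analogRead)
// into a MIDI note-on message [status, note, velocity].
// THRESHOLD and the default note are illustrative, not the patch's values.
const THRESHOLD = 40; // ignore sensor noise below this level

function raindropToNoteOn(reading, note = 60) {
  if (reading < THRESHOLD) return null; // no drop detected
  // Map the remaining sensor range onto MIDI velocity 1-127.
  const velocity = Math.max(1, Math.min(127,
    Math.round(((reading - THRESHOLD) / (1023 - THRESHOLD)) * 127)));
  return [0x90, note, velocity]; // 0x90 = note-on, channel 1
}
```

In the actual setup, bytes like these travelled over serial from the Arduino into Max/MSP, which triggered the preset instruments and effects.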
===SI23: sound experimentation tools based on HTML===
https://pzwiki.wdka.nl/mediadesign/Express_lane#Javasript<br>
https://pzwiki.wdka.nl/mediadesign/JavaScriptClub/04<br>

During my experiments with Tone.js, I found its sound effects quite limited. This led me to explore other JavaScript libraries with more interesting sound capabilities, like Pizzicato.js. Pizzicato.js offers effects such as ping pong delay, fuzz, flanger, tremolo, and ring modulation, which allow for much more creative sound experimentation. Since everything is online, there’s no need to download software, and it also lets users experience sound with visual and interactive web elements.<br><br>
Inspired by deconstructionism, I created a project that presents all these effects on one webpage, experimenting with ways to layer and combine them like a "sound quilt."<br><br>
To introduce the project and gather feedback, I hosted a workshop and a performance. I started a JavaScript Club to introduce sound-related JavaScript libraries and show people how to experiment with sound on the web. At the end of the workshop, in the form of an “Examination”, I gave out a zine hidden inside a pen called Script Partner. Each zine included unique code snippets for creating instruments or sound effects.<br><br>
For the performance, I used an HTML interface I designed, displaying it on a large TV screen. The screen showed a wall of bricks, each brick linked to a frequency. By interacting with different bricks and sound-effect sliders, the HTML page could create a blend of sound and visuals: a strange, wobbly, and mysterious soundscape.
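The brick-to-frequency link can be expressed as a small equal-temperament mapping. This is a sketch only: the base note and one-semitone spacing are assumptions, not necessarily the tuning used in the performance.

```javascript
// Map a brick's index on the wall to a frequency in Hz using equal
// temperament (A4 = 440 Hz). BASE_MIDI_NOTE and the per-brick spacing
// are illustrative choices.
const BASE_MIDI_NOTE = 48; // first brick sounds C3

function brickToFrequency(brickIndex, semitonesPerBrick = 1) {
  const midiNote = BASE_MIDI_NOTE + brickIndex * semitonesPerBrick;
  return 440 * Math.pow(2, (midiNote - 69) / 12);
}
```

Each resulting frequency could then drive an oscillator, for instance through Pizzicato.js or the Web Audio API.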
===SI24: MIDI device===
https://pzwiki.wdka.nl/mediadesign/Wang_SI24#Midi_Controller<br>
Rain Receiver is actually a MIDI controller, just one driven by a sensor.
The core logic of a MIDI controller is to help users map their own values/parameters in the DAW, using potentiometers, sliders, buttons for sequencer steps (momentary), or buttons for notes (latching). The layout of the controls on these devices usually features a tight square grid with knobs, a common design for controllers intended for efficient control over parameters such as volume, pan, filters, and effects.
During a period of using pedals and MIDI controllers for performance, I organized most of my mappings into groups, with each group containing three or four parameters.<br><br>
For example, the Echo group includes the Dry/Wet, Time, and Pitch parameters, while the ARP group includes Frequency, Pitch, Fine, Send C, and Send D.<br><br>
In such cases, a controller with a 3x8 grid of knobs is not suitable for mapping effects with five parameters, like the ARP. A tree-like or radial structure would be more suitable for the ARP's complex configuration.
Structures in nature, such as planets and satellites, demonstrate this: a large star can be radially mapped to an entire galaxy; a tiny satellite orbiting a planet may follow an elliptical path rather than a circular one; comets exhibit irregular movement patterns. The logic behind all these phenomena is similar to the logic I use when producing music.<br><br>
"Solar Beep" is the result of this search, built to elevate creative flow with a uniquely flexible design. It offers three modes—Mono, Sequencer, and Hold—alongside 8 knobs, a joystick, and a dual-pitch control for its 10-note button setup. With 22 LEDs providing real-time visual feedback, "Solar Beep" simplifies complex mappings and gives users an intuitive experience, balancing precision and adaptability for a more responsive and engaging production tool.
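The grouped, tree-like mapping described above can be modelled as nested data rather than a flat knob grid. A minimal sketch, assuming placeholder CC numbers (the real Solar Beep assignments may differ):

```javascript
// A radial/tree mapping: each group is a hub with its own parameter
// spokes, instead of one flat 3x8 knob grid. CC numbers are placeholders.
const mappings = {
  Echo: { "Dry/Wet": 20, Time: 21, Pitch: 22 },
  ARP: { Frequency: 23, Pitch: 24, Fine: 25, "Send C": 26, "Send D": 27 },
};

// Resolve a (group, parameter, value) gesture into a MIDI CC message
// [status, controller, value], clamping the value to the 0-127 range.
function groupToCC(group, param, value) {
  const cc = mappings[group]?.[param];
  if (cc === undefined) return null; // unmapped spoke
  return [0xB0, cc, Math.max(0, Math.min(127, Math.round(value)))];
}
```

Because a group can hold any number of spokes, a five-parameter effect like the ARP fits naturally, where a fixed 3-knob row would not.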
==How do you plan to make it?==
With the same formula I used in SI24, it will follow similar steps:
Research;<br>
Draft;<br>
Shopping list based on the draft;<br>
Process;<br>
Debug;<br>
I want to create a device that captures natural dynamics—like wind, tides, and direction—and converts them into MIDI signals.
These signals can then control musical instruments or interact with web pages and software, offering new experimental methods for sound and visual design.
To support this, I’ll design dedicated web pages and visual elements to enhance the interactive experience with the device.
In the initial testing phase, I’ll focus on three natural variables:
*''waves''
*''direction''
*''wind''
Each variable brings unique characteristics:
*''wind, for instance, is unpredictable and unstable;''
*''tides are relatively regular and consistent;''
*''direction requires intentional human input, like a compass.''
Through sensor-based testing, I’ll gather and analyze data from each of these factors to determine which is most suitable for this project. By integrating these natural variables, I hope to introduce new ways of interaction in both sound and visual media.
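As a sketch of how such readings might become MIDI data, here is one way to scale a compass heading onto a CC and to tame unstable wind readings with smoothing. The sensor ranges and the smoothing factor are assumptions, not measured values:

```javascript
// Scale a compass heading (degrees) onto the MIDI CC range 0-127.
function headingToCC(degrees) {
  const wrapped = ((degrees % 360) + 360) % 360; // normalise to 0-359
  return Math.round((wrapped / 360) * 127);
}

// Wind is unpredictable, so smooth raw readings with an exponential
// moving average; alpha controls how jumpy the resulting control feels.
function makeWindSmoother(alpha = 0.2) {
  let state = null;
  return (reading) => {
    state = state === null ? reading : alpha * reading + (1 - alpha) * state;
    return state;
  };
}
```

Tides, being regular, might instead be sampled slowly and mapped directly; the contrast between the three behaviours is exactly what the testing phase would compare.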
==What is your timetable?==
From September to December, I will continuously run workshops, exercises, and performances related to the main subject in order to develop more concrete content.
Including: <br>
a JavaScript Club, about using web audio tools for sound experiments;<br>
an INC zine event;<br>
a public event: a performance in the kitchen using a microwave to count time;<br>
a Colloquium workshop;<br>
a performance using the devices I made.<br>
==Why do you want to make it?==
Nature is a broad subject. It can be treated as a language to communicate through various machines, or as a means of prediction, a way of telling fortunes without a typical divination function. Factors such as direction, temperature, magnetism, and gravity could connect to the device, so that the device is controlled in an unstable way, by natural factors.
==Who can help you and how?==
Xpub tutors will help me clarify my logical structure and assist me with the technological issues I may encounter during the process. I will also try to contact relevant engineers, product designers, sound designers, or relevant communities for more inspiration. During the testing phase, I will reach out to different communities to organize workshops and performances for testing the devices and gathering feedback from users.
==Relation to previous practice==
Building on my experiences with SI22, SI23, and SI24, I will combine my technical skills and creative exploration from those projects. The process of researching, drafting, and debugging will support my development approach, allowing me to incorporate previous learnings into this new project and create a seamless integration of sound and nature.
==Relation to a larger context==
This project connects to broader themes of environmental awareness and the fusion of technology with nature. By exploring the interplay between natural phenomena and digital soundscapes, it will also use unstable elements from nature to engage with digital media. This could be an important way to enhance creativity in sound design.
==References/bibliography==
https://www.ableton.com/en/packs/inspired-nature/<br>
https://www.lovehulten.com/
===Fortune===
Created by Shanghai-based design studio automato.farm, 'BIY™ - Believe it Yourself' is a series of real-fictional belief-based computing kits to make and tinker with vernacular logics and superstitions. The team worked with experts in fortune telling from Italy, geomancy from China, and numerology from India to translate their knowledge and beliefs into three separate kits – BIY.SEE, BIY.MOVE and BIY.HEAR. They invite users to tinker with cameras that can see luck*, microphones that interpret your destiny*, and compasses that can point you to harmony and balance*.
http://automato.farm/portfolio/believe_it_yourself/

<img src="https://speculativeedu.eu/wp-content/uploads/2019/07/03_BIY-Hear-Training-768x512.jpg" width='700px'>
<img src="https://s3files.core77.com/blog/images/942225_33852_89014_UPgf7T8qH.jpg" width='700px'>
<img src="https://s3files.core77.com/blog/images/942223_33852_89014_iz4mAJm3s.jpg" width='700px'>

other inspirations:
<img src="https://robu.in/wp-content/uploads/2023/06/1628494-4.jpg" width='300px'>
<img src="https://i-bosity-com.oss-cn-hongkong.aliyuncs.com/product_img/265/61016712/61016712_3_image.jpg?x-oss-process=image/resize,p_100/watermark,image_d2F0ZXJtYXJrX2ltZy8xNzExMTQwMS9kZWZhdWx0LnBuZz94LW9zcy1wcm9jZXNzPWltYWdlL3Jlc2l6ZSxQXzk5,g_nw,x_0,y_0" width='300px'>
===Plants===
<img src="https://plantwave.com/cdn/shop/files/slide-img-2.jpg?v=1676814383&width=1070" width='600px'>
<br>
https://www.datagarden.org/technology
<img src="https://i.pinimg.com/736x/1d/3c/0d/1d3c0dbefebca88bf8adbbf76e601ee3.jpg" width='600px'>
<img src="https://i.pinimg.com/736x/f1/42/b2/f142b25ef904a472a32d13c32ef45e3c.jpg" width='600px'>
<img src="https://i.pinimg.com/736x/1c/6d/96/1c6d969cb3e1d6237ea9a630cffcb4bf.jpg" width='600px'>
https://design-milk.com/love-hultens-desert-songs-sounds-like-a-blast-from-the-chloroplast/
===Gravities===
====Bouncy Notes====
https://dillonbastan.com/inspiredbynature_manuals/Bouncy%20Notes%20User%20Manual.pdf<br>
https://www.youtube.com/watch?v=C2hQ-WbKBhU
====Droplets====
https://finneganeganegan.xyz/works/droplets
=Thesis Outline=
This thesis explores how natural forces like temperature, direction, magnetism, and gravity can be used to control sound devices such as MIDI controllers and synthesizers. Rather than relying on traditional, stable inputs (like pressing keys or turning knobs), it investigates how unpredictable, changing elements from nature can be used to influence sound production. The idea is to treat nature as a kind of "language" that machines can understand and respond to in real time, creating a more dynamic and organic way to control devices.

Beyond just controlling sound, this project aims to experiment with how different technologies can interact with each other. By using natural forces to control devices, we can create new and unexpected results. For example, changes in wind direction might not only affect sound but also trigger visual changes or interact with other devices, creating a chain reaction. The goal is to explore how these connections between sound, natural data, and other forms of technology can open up creative possibilities that go beyond traditional methods.

To take this experiment further, the thesis will also explore how these ideas can be combined into a website or an interactive HTML platform. This would allow users to experience the sound, visuals, and interactions online, making the project more immersive. By connecting natural forces with sound devices and digital platforms, this project aims to create an evolving system where the physical and digital worlds blend together in new and interesting ways.
==chapter 1: Natural Forces==
===Introduction to Natural Forces===
===The Concept of Nature as Language===
==chapter 2: Interactivity Connections==
==chapter 3: Digital Platforms and Future Applications==
==Conclusion==
Latest revision as of 14:51, 17 November 2024