User:Francg/expub/specialissue2/dev
Motion Dialogues
Initially influenced by an interest in the current migration flows reaching the West, I will explore physical movement in specific delimited spaces and the reactions/consequences that this physical action involves. The project aims to generate audio feedback loops using motion and color detection techniques in Pure Data (Pd).
* * Audio Feedback Loop * *
* The loop stops and restarts whenever new movement is detected inside the square's area.
1- Webcam + Pd: a square region of the camera image detects color change / motion
2- Pd: this activates oscillators to generate sound, or plays an imported audio file
3- Pd: sends the signal to the speakers
4- Speakers: reproduce the Pd audio
5- Microphone: a mic connected to the Raspberry Pi, placed near the speakers, picks up the sound (plus the surroundings, making a mix?)
6- Pd: live-records a few seconds into a table (array) with [tabwrite~]
7- Pd: plays it back automatically after a few seconds with [tabplay~] (see the sketches after this list)
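Below is a minimal Pd sketch of steps 2-3, not the final patch: it assumes the detection stage of step 1 sends a bang to a [send motion] whenever the square sees a color change or movement; the receive name "motion", the 440 Hz pitch and the envelope times are placeholder choices. The bang opens a short [vline~] envelope on an oscillator and routes it to the soundcard; saved as a .pd text file it can be opened directly in Pd:
 #N canvas 0 0 520 300 12;
 #X text 20 10 motion-triggered oscillator (steps 2-3);
 #X obj 20 50 r motion;
 #X msg 20 90 1 10 \, 0 2000 1000;
 #X obj 20 130 vline~;
 #X obj 140 130 osc~ 440;
 #X obj 20 170 *~;
 #X obj 20 210 dac~;
 #X connect 1 0 2 0;
 #X connect 2 0 3 0;
 #X connect 3 0 5 1;
 #X connect 4 0 5 0;
 #X connect 5 0 6 0;
 #X connect 5 0 6 1;
To play an imported audio file instead (the other option in step 2), the same bang could trigger a [tabplay~] reading a table pre-loaded with [soundfiler], or start a [readsf~].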
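And a sketch of steps 5-7, again with placeholder values: the table holds 132300 samples (3 seconds at 44.1 kHz) and playback is delayed by 4000 ms. On each motion bang, [tabwrite~] re-records the mic input until the table is full, then [tabplay~] replays it through the speakers; since the mic sits next to the speakers, the replay itself gets picked up on the next trigger, which is what closes the feedback loop:
 #N canvas 0 0 560 340 12;
 #X text 20 10 mic loop: record a few seconds on motion and replay them later (steps 5-7);
 #X obj 20 50 table loopbuf 132300;
 #X obj 20 90 r motion;
 #X obj 20 130 t b b;
 #X obj 140 170 delay 4000;
 #X obj 20 170 adc~;
 #X obj 20 210 tabwrite~ loopbuf;
 #X obj 140 210 tabplay~ loopbuf;
 #X obj 140 250 dac~;
 #X connect 2 0 3 0;
 #X connect 3 0 4 0;
 #X connect 3 1 6 0;
 #X connect 4 0 7 0;
 #X connect 5 0 6 0;
 #X connect 7 0 8 0;
 #X connect 7 0 8 1;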
* * Meeting notes / Feedback * *
- How can a body be represented in a score?
- "Biovision Hierarchy" (BVH) = a file format for motion-capture data.
- Femke Snelting reads the Biovision Hierarchy Standard
- Systems of notation and choreography - see Johanna's thesis in the wiki
* * Raspberry Pi * *
- Floppy disk: contains a Pd patch.
- Box: floppy drive, camera, mic...
- Server: Documentation such as images, video, prototypes, resources...
- There are two different research paths that would be more interesting to explore further as separate threads:
1 - on the one hand, * motion capture *, employing tools/software like "Kinect", the "Synapse" app, "Max MSP", "Ableton", etc.
2 - on the other hand, data / information reading. * This can be further developed and simplified. * However, the two could be efficiently linked by using motion capture in Pd with an ordinary webcam to make audio effects.