Platforms shape practice session
Today
Today we will make standalone web pages based on Mixcloud or MediaWiki API calls.
Picking up from last session... we will make prototypes as platform studies: how can you make a web page to emphasize present and non-present features of Mixcloud or MediaWiki? If practice shapes tools shapes practice..., how does Mixcloud/MediaWiki shape the practice of sound sharing?
The proposal is to use material related to this Special Issue that is uploaded to Mixcloud or MediaWiki. In this way, we can explore these platforms while working with materials we made or are close to. You can work, for example, with Radio WORM's account on Mixcloud and the radio shows you did that were uploaded there, or with the broadcast pages on the wiki, the alphabet soup, personal wiki pages you made with notes, prototypes, etc.
And standalone is an advised constraint, because it's a useful skill to learn how to make standalone web pages!
At the end of the day, we will upload all pages to: /var/www/html/SI25/
Context/references
- MIT Software Studies book, 2008 + MIT Software Studies revisited, Wendy Hui Kyong Chun, Winnie Soon, Noah Wardrip-Fruin, Jichen Zhu, 2022
- Queer Motto API, Winnie Soon & Helen Pritchard
- Shell Song, Everest Pipkin (ref to text adventure games)
- DiVersions, Constant/OSP (generated from a wiki)
- Volumetric Regimes, Possible Bodies/Manetta Berends (generated from a wiki)
Prepared prototypes
Mixcloud
https://hub.xpub.nl/cerealbox/~manetta/platforms/mixcloud-room/room.html
└── mixcloud-once-upon
    ├── extratonality.json
    ├── room.html
    └── xpubweek1.json
This page displays the metadata of a radio show in a custom way. To load this metadata it uses the Mixcloud API. To make an API call, first open the Mixcloud page of one single radio show. Edit the URL: change www into api and add ?metadata=1 at the end. You should get a response back, formatted in JSON. Copy this URL, the API call, and paste it in line 30 of this page to explore the metadata of that show.
To make a standalone version: you can download the API response as a JSON file with $ curl URL > filename.json and change the API call URL in line 30 to point to this JSON file.
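For example, a show URL like https://www.mixcloud.com/RadioWORM/extratonality/ (a hypothetical username and show name) would become https://api.mixcloud.com/RadioWORM/extratonality/?metadata=1. Below is a minimal sketch, not the exact contents of room.html, of how a page could fetch such a URL in the browser and show the metadata; the URL and the name field are examples to adapt to your own show.

// minimal sketch: fetch Mixcloud metadata and show it on the page
// the URL is an example; replace it with your own API call or a local JSON file
const url = "https://api.mixcloud.com/RadioWORM/extratonality/?metadata=1"; // hypothetical show

fetch(url)
  .then(response => response.json())
  .then(data => {
    const heading = document.createElement("h1");
    heading.textContent = data.name; // the show's title; adjust to the fields you need
    const dump = document.createElement("pre");
    dump.textContent = JSON.stringify(data, null, 2); // full response, to explore what is there
    document.body.append(heading, dump);
  });

For the standalone version, point url to the downloaded file instead, for example const url = "extratonality.json";.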
MediaWiki authors
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors.html + https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors_with-input-field.html
This page dives into the revisions of a wiki page and lists up to 500 authors.
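A minimal sketch of the kind of call such a page could make, using the MediaWiki query API for revisions; the page title below is an example, and the actual authors.html may be written differently.

// minimal sketch: list the authors (revision users) of one wiki page
const api = "https://pzwiki.wdka.nl/mw-mediadesign/api.php";
const params = new URLSearchParams({
  action: "query",
  prop: "revisions",
  titles: "Platforms shape practice session", // example page title
  rvprop: "user",
  rvlimit: "500",   // the API returns at most 500 revisions per request for regular accounts
  format: "json",
  origin: "*"       // needed for cross-origin requests from a standalone page
});

fetch(`${api}?${params}`)
  .then(response => response.json())
  .then(data => {
    const page = Object.values(data.query.pages)[0];
    const authors = [...new Set(page.revisions.map(rev => rev.user))]; // unique usernames
    document.body.textContent = authors.join(", ");
  });

The 500-revision cap per request is likely where the limit of 500 authors comes from.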
MediaWiki links
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-links/links.html
This page lists all internal wiki links on a page...
See get-json.sh for how you can save the JSON data to a file.
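A minimal sketch of one way to get those links, using prop=links in the MediaWiki query API; the page title is an example and links.html may take a different route (for instance via the parse API).

// minimal sketch: list the internal wiki links of one page
const api = "https://pzwiki.wdka.nl/mw-mediadesign/api.php";
const params = new URLSearchParams({
  action: "query",
  prop: "links",
  titles: "Platforms shape practice session", // example page title
  pllimit: "500",
  format: "json",
  origin: "*"
});

fetch(`${api}?${params}`)
  .then(response => response.json())
  .then(data => {
    const page = Object.values(data.query.pages)[0];
    const list = document.createElement("ul");
    (page.links || []).forEach(link => {
      const item = document.createElement("li");
      item.textContent = link.title; // each link object has a namespace and a title
      list.appendChild(item);
    });
    document.body.appendChild(list);
  });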
MediaWiki crowd
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-crowd/crowd.html
https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=feedrecentchanges&origin=*
This page parses the RSS feed of recent changes to filter out the usernames.
See get-crowd.sh for how you can save the RSS feed to a file.
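A minimal sketch of how a page could parse that feed in the browser; it assumes each feed item carries the username in a dc:creator element, which is how MediaWiki's RSS feeds usually expose it, and crowd.html may do this differently.

// minimal sketch: fetch the recent changes RSS feed and collect usernames
// assumes each <item> carries the username in a <dc:creator> element
const feed = "https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=feedrecentchanges&origin=*";

fetch(feed)
  .then(response => response.text())
  .then(text => {
    const doc = new DOMParser().parseFromString(text, "text/xml");
    const creators = doc.getElementsByTagNameNS("http://purl.org/dc/elements/1.1/", "creator");
    const usernames = [...new Set(Array.from(creators).map(node => node.textContent))];
    document.body.textContent = usernames.join(", ");
  });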