Platforms shape practice session
==Today==
Today we will make '''standalone web pages''' based on '''Mixcloud or MediaWiki API calls'''.
Picking up from [[Cloudmix session|last session]], we will make prototypes as platform studies: how can you make a web page that emphasizes present and non-present features of Mixcloud or MediaWiki? If practice shapes tools shapes practice..., how does Mixcloud/MediaWiki shape the practice of '''sound sharing'''?
The proposal is to use material related to this Special Issue that is uploaded to Mixcloud or MediaWiki. In this way, we can explore these platforms while working with materials we made or are close to. You can work, for example, with Radio WORM's account on Mixcloud, the radio shows you did that were uploaded there, the broadcast pages on the wiki, the alphabet soup, personal wiki pages you made with notes, prototypes, etc.
And '''standalone''' is an advised constraint, because it's a useful skill to learn how to make standalone web pages!
At the end of the day, we will upload all pages to: <code>/var/www/html/SI25/</code>
==Context/references==
* [https://siusoon.net/projects/queermottoapi Queer Motto API, Winnie Soon & Helen Pritchard]
* [https://everest-pipkin.com/projects/shellsong Shell Song, Everest Pipkin] (ref to text adventure games)
* [https://calibre.constantvzw.org/book/180 DiVersions, Constant/OSP] (generated from a wiki)
* [http://data-browser.net/db08.html Volumetric Regimes, Possible Bodies/Manetta Berends] (generated from a wiki)
==Prepared prototypes==
===Mixcloud===
https://hub.xpub.nl/cerealbox/~manetta/platforms/mixcloud-room/room.html
API call: https://api.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/?metadata=1
└── mixcloud-once-upon
    ├── extratonality.json
    ├── room.html
    └── xpubweek1.json
This page displays the metadata of a radio show in a custom way. To load this metadata it uses the Mixcloud API. To make an API call, first open the Mixcloud page of a single radio show. Edit the URL: change <code>www</code> into <code>api</code> and add <code>?metadata=1</code> at the end. You should get a response back, formatted in JSON. Copy this URL (the API call) and paste it in line 30 of this page to explore the metadata of that show.
To make a '''standalone''' version: you can download the API response as a JSON file with <code>$ curl URL > filename.json</code> and change the API call URL to this JSON file in line 30.
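The URL edit described above can be sketched as a small function. This is a sketch, not part of the prototype itself: it assumes the show URL starts with <code>https://www.mixcloud.com/</code>, and the <code>name</code> field in the commented-out fetch is an assumption about one field of the JSON response.

```javascript
// Turn a regular Mixcloud show URL into an API call:
// swap "www" for "api" and append "?metadata=1".
function mixcloudApiUrl(showUrl) {
  return showUrl.replace("://www.", "://api.") + "?metadata=1";
}

const apiUrl = mixcloudApiUrl(
  "https://www.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/"
);
console.log(apiUrl);
// → https://api.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/?metadata=1

// In a web page, the JSON response could then be loaded and displayed, e.g.:
// fetch(apiUrl)
//   .then((response) => response.json())
//   .then((data) => { document.body.textContent = data.name; });
```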
===MediaWiki authors===
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors.html + https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors_with-input-field.html
API call: https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=query&prop=revisions&rvlimit=500&titles=Radio_WORM:_Protocols_for_Collective_Performance&format=json&origin=*
This page dives into the revisions of a wiki page and lists up to 500 authors.
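A minimal sketch of how a page could pull the unique author names out of such a revisions response. The <code>response</code> object below is a hand-made stand-in for the real API reply (same shape, invented usernames):

```javascript
// Stand-in for a MediaWiki revisions reply
// (action=query&prop=revisions&format=json).
const response = {
  query: {
    pages: {
      "123": {
        revisions: [
          { user: "Manetta" },
          { user: "Charlie" },
          { user: "Manetta" },
        ],
      },
    },
  },
};

function listAuthors(data) {
  // Take the first (and only) page in the reply.
  const page = Object.values(data.query.pages)[0];
  const users = page.revisions.map((rev) => rev.user);
  return [...new Set(users)]; // de-duplicate, keep first-seen order
}

console.log(listAuthors(response)); // → [ 'Manetta', 'Charlie' ]
```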
===MediaWiki links===
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-links/links.html
API call: https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=parse&page=Radio_WORM:_Protocols_for_Collective_Performance&prop=links&format=json&origin=*
This page lists all internal wiki links on a page.
See <code>get-json.sh</code> for how you can save the JSON data to a file.
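A sketch of extracting the link titles from such a parse response. The <code>response</code> object is a hand-made stand-in; with <code>format=json</code> each link's title sits under the <code>"*"</code> key:

```javascript
// Stand-in for a MediaWiki parse reply (action=parse&prop=links&format=json).
const response = {
  parse: {
    links: [
      { ns: 0, exists: "", "*": "Cloudmix session" },
      { ns: 0, "*": "Radio WORM" },
    ],
  },
};

// Collect the page titles of all internal links.
const titles = response.parse.links.map((link) => link["*"]);
console.log(titles); // → [ 'Cloudmix session', 'Radio WORM' ]
```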
===MediaWiki crowd===
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-crowd/crowd.html
API call: https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=feedrecentchanges&origin=*
This page parses the RSS feed of recent changes to filter out the usernames.
See <code>get-crowd.sh</code> for how you can save the RSS feed to a file.
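One way to filter the usernames out of the recent-changes feed, assuming (as in MediaWiki's RSS output) that each item carries a <code>&lt;dc:creator&gt;</code> tag. The <code>feed</code> string is a shortened hand-made stand-in, and the regex approach is a sketch (a real page might use <code>DOMParser</code> instead):

```javascript
// Shortened stand-in for the recent-changes RSS feed.
const feed = `
<rss><channel>
  <item><title>Page A</title><dc:creator>Manetta</dc:creator></item>
  <item><title>Page B</title><dc:creator>Kim</dc:creator></item>
  <item><title>Page C</title><dc:creator>Manetta</dc:creator></item>
</channel></rss>`;

// Pull out every <dc:creator>...</dc:creator> value...
const creators = [...feed.matchAll(/<dc:creator>(.*?)<\/dc:creator>/g)]
  .map((match) => match[1]);
// ...and keep each username once.
const crowd = [...new Set(creators)];
console.log(crowd); // → [ 'Manetta', 'Kim' ]
```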
===Charlie's prototype===
https://hub.xpub.nl/cerealbox/~charlie/naenie_udra.html
API calls: https://api.wikimedia.org/core/v1/wikisource/en/search/title?q=quest&limit=10 + https://api.wikimedia.org/core/v1/wikisource/en/page/quest/html
===Kim's prototype===
https://hub.xpub.nl/cerealbox/~kim/mediawiki-api2/
https://hub.xpub.nl/cerealbox/~kim/mediawiki-api3/
API call: https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=feedrecentchanges&origin=*
==To start==
 $ ssh cerealbox
 $ cd public_html
 $ mkdir FOLDERNAMEFORTHESEPROTOTYPES
 $ cd FOLDERNAMEFORTHESEPROTOTYPES
 $ wget URL-OF-EXAMPLE-YOU-WANT-TO-WORK-WITH
Tip: open the API call you want to work with in the browser, so you can inspect the data of the API response!
Latest revision as of 11:52, 5 November 2024