Platforms shape practice session

Today we will make '''standalone web pages''' based on '''Mixcloud or MediaWiki API calls'''.


Picking up from [[Cloudmix session|last session]]... we will make prototypes as platform studies: how can you make a web page to emphasize present and non-present features of Mixcloud or MediaWiki? If practice shapes tools shapes practice..., how does Mixcloud/MediaWiki shape the practice of '''sound sharing'''?


The proposal is to use pages related to this Special Issue. You can work for example with Radio WORM's account on Mixcloud, the radio shows you did that were uploaded there, it could be the broadcast pages on the wiki, the alphabet soup, personal wiki pages you made with notes, prototypes, etc.


And '''standalone''' is an advised constraint, because it's a useful skill to learn how to make standalone web pages!


At the end of the day, we will upload all pages to: <code>/var/www/html/SI25/</code>
 
==Context/references==
 
* https://mitpress.mit.edu/9780262062749/software-studies/ + http://computationalculture.net/software-studies-revisited/
* [https://siusoon.net/projects/queermottoapi Queer Motto API, Winnie Soon & Helen Pritchard]
* [https://everest-pipkin.com/projects/shellsong Shell Song, Everest Pipkin] (ref to text adventure games)
* [https://calibre.constantvzw.org/book/180 DiVersions, Constant/OSP] (generated from a wiki)
* [http://data-browser.net/db08.html Volumetric Regimes, Possible Bodies/Manetta Berends] (generated from a wiki)


==Prepared prototypes==
===Mixcloud===


https://hub.xpub.nl/cerealbox/~manetta/platforms/mixcloud-room/room.html
 
https://api.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/?metadata=1


  └── mixcloud-once-upon
      ├── extratonality.json
      ├── room.html
      └── xpubweek1.json


This page displays the metadata of a radio show in a custom way. To load this metadata it uses the Mixcloud API. To make an API call, first open a Mixcloud page of one single radio show. Edit the URL: change <code>www</code> into <code>api</code> and add <code>?metadata=1</code> at the end. You should get a response back, formatted in JSON. Copy this URL, the API call, and paste it in line 30 of this page to explore the metadata of that show.
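The URL edit above can be sketched as a small JavaScript helper (the function name <code>toApiUrl</code> is just an illustration, not part of the prototype):

```javascript
// Turn a Mixcloud show URL into its API metadata URL:
// swap the leading "www." for "api." and append ?metadata=1.
function toApiUrl(showUrl) {
  const url = new URL(showUrl);
  url.hostname = url.hostname.replace(/^www\./, "api.");
  url.searchParams.set("metadata", "1");
  return url.toString();
}

console.log(toApiUrl("https://www.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/"));
// → https://api.mixcloud.com/radiowormrotterdam/protocols-for-collective-performance-w-xpub-300924/?metadata=1
```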


To make a '''standalone''' version: you can download the API response as a JSON file with: <code>$ curl URL > filename.json</code> and change the API call URL to this JSON file in line 30.
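A minimal sketch of why this swap works, assuming the page loads its data with <code>fetch</code> (the function name <code>loadMetadata</code> is hypothetical): the same loading code can point at either the live API call or the saved JSON file.

```javascript
// Load show metadata either from the live Mixcloud API or from a
// local JSON file saved with curl; the rest of the page code stays
// the same either way.
async function loadMetadata(source) {
  const response = await fetch(source); // an API URL or a local file path
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return response.json();
}

// Live:       loadMetadata("https://api.mixcloud.com/...?metadata=1")
// Standalone: loadMetadata("extratonality.json")
```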


===MediaWiki authors===
 
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors.html + https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-authors/authors_with-input-field.html
 
https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=query&prop=revisions&rvlimit=500&titles=Radio_WORM:_Protocols_for_Collective_Performance&format=json&origin=*
 
This page dives into the revisions of a wiki page and lists up to 500 authors.
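A sketch of what such a page does with the response, assuming the default JSON shape of <code>action=query&prop=revisions</code>, where revisions sit under <code>query.pages</code> (the function name <code>listAuthors</code> is illustrative):

```javascript
// Collect the unique author names from a MediaWiki revisions query
// (action=query&prop=revisions&format=json). Each revision object
// carries the editing user in its "user" field.
function listAuthors(data) {
  const pages = Object.values(data.query.pages);
  const users = pages.flatMap(p => (p.revisions || []).map(r => r.user));
  return [...new Set(users)]; // deduplicate: one entry per author
}
```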
 
===MediaWiki links===
 
https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-links/links.html
 
https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=parse&page=Radio_WORM:_Protocols_for_Collective_Performance&prop=links&format=json&origin=*
 
This page lists all internal wiki links on a page...
 
See <code>get-json.sh</code> for how you can save the JSON data to a file.
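Extracting the link titles can be sketched like this, assuming the default JSON format of <code>action=parse&prop=links</code>, where each link object keeps its title under the <code>"*"</code> key (the function name <code>listLinks</code> is illustrative):

```javascript
// List the internal link titles from a MediaWiki parse query
// (action=parse&prop=links&format=json). In the default JSON format
// each link object stores its page title under the "*" key.
function listLinks(data) {
  return data.parse.links.map(link => link["*"]);
}
```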
 
===MediaWiki crowd===


https://hub.xpub.nl/cerealbox/~manetta/platforms/mediawiki-crowd/crowd.html


https://pzwiki.wdka.nl/mw-mediadesign/api.php?action=feedrecentchanges&origin=*
 
This page parses the RSS feed of recent changes to filter out the usernames.


See <code>get-crowd.sh</code> for how you can save the RSS feed to a file.
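Filtering the usernames out of the feed can be sketched as below, assuming the feed marks each item's author with a <code>&lt;dc:creator&gt;</code> element, as MediaWiki RSS feeds do (the function name <code>crowdFromRss</code> is illustrative; in a browser, <code>DOMParser</code> would be the sturdier way to read the XML):

```javascript
// Pull the usernames out of a recent-changes RSS feed by matching
// its <dc:creator> elements (a rough regex sketch).
function crowdFromRss(rssText) {
  const matches = [...rssText.matchAll(/<dc:creator>([^<]*)<\/dc:creator>/g)];
  const names = matches.map(m => m[1].trim());
  return [...new Set(names)]; // each contributor once
}
```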

Revision as of 08:46, 5 November 2024
