User:Lidia.Pereira/SDR/SF
November/December 2013
The idea of generating slash fiction from RSS feeds brings into play ideas of promiscuity as deeply connected to power structures. Also, as part of my interest in our participation in culture and in how that participation, even when explicit, is implicitly monetized, I tried to reflect on how that explicit participation can remain in the user's domain. This exercise provides me with an opportunity to develop Python skills within a more research-based frame.
(see "Bastard Culture" and "Return of the Crowds: Mechanical Turk and Neoliberal States of Exception")
Steps to take (a minimal sketch of this pipeline follows the list):
1. Filter the RSS feed so that it only keeps items related to figures in power.
2. Filter the names in the Harry Potter slash fiction text file and replace them with the names of figures in power currently in the news.
3. Generate!
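A minimal sketch of these three steps, separate from the actual scripts further down; the helper names here are placeholders, and it assumes feedparser and nltk are installed and that a harrypotterslash.txt source file is at hand:

import feedparser, nltk, random, re

POWER_WORDS = ["president", "minister", "chancellor", "pm"]

def filter_feed(url):
    # step 1: keep only feed summaries that mention a figure in power
    feed = feedparser.parse(url)
    return " ".join(entry["summary"].lower() for entry in feed.entries
                    if any(w in entry["summary"].lower() for w in POWER_WORDS))

def swap_names(slash_text, news_text):
    # step 2: crude heuristic - replace every capitalised word in the slash
    # fiction with a "title + following word" phrase found in the news text
    officials = re.findall(r"(?:president|minister|chancellor)\s\w+", news_text)
    if not officials:
        return slash_text
    return re.sub(r"\b[A-Z]\w+\b", lambda m: random.choice(officials), slash_text)

def generate_mashup(news_text, slash_text, length=300):
    # step 3: tokenize the combined text and let NLTK's Text.generate() run
    # (in NLTK 2.x, generate() prints the output itself)
    combined = swap_names(slash_text, news_text) + " " + news_text
    nltk.Text(nltk.word_tokenize(combined)).generate(length)

# usage (assumes a local harrypotterslash.txt):
# news = filter_feed("http://feeds.bbci.co.uk/news/rss.xml")
# generate_mashup(news, open("harrypotterslash.txt").read())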
Small mockup for RSS feed slash fiction:
French Foreign Minister Laurent Fabius muttered quietly that some EU sanctions on Iran could be lifted next month, as part of a nuclear deal with world powers. "What?" Japan PM Shinzo Abe asked, while smoothly caressing his partner's nipples. Ukrainian President Viktor Yanukovych sat on a couch, with both sitting either side of him kissing his neck, each with one hand stroking his inner thighs. Meanwhile, in the room next door, Prince Harry claimed his brother the Duke of Cambridge is "jealous" of his charity trek to the South Pole. To blow off some steam, he begins to stroke the large bulge he found in Angela Merkel's trousers.
First Experiences:
import feedparser, nltk

url = "http://feeds.bbci.co.uk/news/rss.xml"
searchwords = ["president", "minister", "cameron", "presidential", "european",
               "obama", "angela", "barroso", "pm", "chancellor"]

rawFeed = feedparser.parse(url)

def convert(humpty):
    # tokenize a raw string and wrap it in an nltk.Text object
    tokens = nltk.word_tokenize(humpty)
    return nltk.Text(tokens)

# collect every feed summary that mentions one of the search words
superSumario = " "
for i in rawFeed.entries:
    sumario = i["summary"].lower()
    for w in searchwords:
        if w in sumario:
            superSumario = superSumario + sumario
            break  # append each matching summary only once

# write the filtered summaries to disk as plain text and close the file,
# so the content is flushed before it is read back below
out = open("filtertest.txt", "w")
out.write(superSumario)
out.close()

textrss = open('filtertest.txt').read()
textslash = open('harrypotterslash.txt').read()

both = textrss.strip() + " " + textslash.strip()
puff = convert(both)
# in NLTK 2.x, Text.generate() prints the generated words itself
puff.generate(300)
Small extract from the result:
David cameron's panties which were becoming damp and got wetter with every slow movement . Tentatively he moved the lacy fabric
aside and his mind drifted and it was Harry she was after " and "Great game Ron " as they passed in the glasgow helicopter crash both
feature . david cameron's whole body shivered with pleasure.
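Both scripts rely on NLTK's default tokenizer; if the Punkt models are not yet installed, a one-off download is needed (assuming a standard NLTK setup):

import nltk
nltk.download("punkt")  # tokenizer models used by nltk.word_tokenize()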
Most Successful So Far:
import feedparser, nltk
import re, random

url = "http://feeds.bbci.co.uk/news/rss.xml"
searchwords = ["president", "minister", "cameron", "presidential",
               "obama", "angela", "barroso", "pm", "chancellor"]

rawFeed = feedparser.parse(url)

def convert(humpty):
    # tokenize a raw string and wrap it in an nltk.Text object
    tokens = nltk.word_tokenize(humpty)
    return nltk.Text(tokens)

# collect every feed summary that mentions one of the search words
superSumario = " "
for i in rawFeed.entries:
    sumario = i["summary"].lower()
    for w in searchwords:
        if w in sumario:
            superSumario = superSumario + sumario
            break  # append each matching summary only once

# write the filtered summaries to disk as plain text and close the file,
# so the content is flushed before it is read back below
out = open("afiltertest.txt", "w")
out.write(superSumario)
out.close()

textrss = open('afiltertest.txt').read()
textslash = open('harrypotterslash.txt').read()

# heuristic: a capitalised word preceded by a lowercase word is probably a
# character name (this skips words that merely start a sentence)
names = re.compile(r"[a-z]\s([A-Z]\w+)")
pi = names.findall(textslash)

# keep each name only once
lista = []
for name in pi:
    if name not in lista:
        lista.append(name)

# build an alternation pattern such as (Harry|Ron|Hermione) from the names
variavel = ""
for name in lista:
    variavel = variavel + name + "|"
ze = "(" + variavel.rstrip("|") + ")"
nomes = re.compile(ze)

# pull "title + following word" phrases for figures in power out of the news text
office1 = re.compile(r"(president\s\w+\w+)")
office2 = re.compile(r"(minister\s\w+\w+)")
office3 = re.compile(r"(chancellor\s\w+\w+)")
repls = office1.findall(textrss) + office2.findall(textrss) + office3.findall(textrss)

def r(m):
    # replace every matched character name with a randomly chosen official
    return random.choice(repls)

yes = nomes.sub(r, textslash)

both = yes.strip() + " " + textrss.strip()
puff = convert(both)
# in NLTK 2.x, Text.generate() prints the generated words itself
puff.generate(500)
"minister david cameron leads politicians in tributes to the common room if you want to just. Suddenly a wild thought entried his head and thoughts of lay beneath them. Every now and then handed them to acknowledge president sepp blatter said minister david cameron reached the bottom of the ladder and only looked to see were he was not always looked upon as favourably by the british political establishment.in the course of making a two-part bbc profile of angela merkel put her panties down , which soon lay disguarded on the floor simultaneously."