User:Joca/word-embeddings

From XPUB & Lens-Based wiki
= Algolit @ Varia =


I participated in the Algolit session of March 17th and learned about word embeddings. This is a form of unsupervised machine learning in which an algorithm turns words into numerical vectors and places them in a multidimensional space. The relative distance between two words reflects how often they appear close to each other in the original text.
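The core idea can be sketched with a toy example. This is not the word2vec algorithm used in the session (which trains a small neural network to learn dense vectors); it is a minimal stdlib-only illustration of the same principle, that words appearing in similar contexts end up close together in the space. The corpus and window size are made up for the demonstration.

```python
# Toy illustration of the idea behind word embeddings: represent each
# word as a vector of co-occurrence counts with its neighbours, so that
# words used in similar contexts end up close together.
# (Hypothetical example; not the word2vec scripts from the session.)
from collections import Counter, defaultdict
from math import sqrt

def cooccurrence_vectors(tokens, window=2):
    """Map each word to a Counter of words seen within `window` positions."""
    vectors = defaultdict(Counter)
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                vectors[word][tokens[j]] += 1
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

corpus = "the cat sat on the mat the dog sat on the rug".split()
vecs = cooccurrence_vectors(corpus)

# "cat" and "dog" share contexts ("the ... sat on"), so their vectors are
# more similar to each other than "cat" is to "mat".
print(cosine(vecs["cat"], vecs["dog"]))
print(cosine(vecs["cat"], vecs["mat"]))
```

Word2vec refines this by learning compact vectors that predict context words rather than counting them, but the geometric intuition (proximity in the space encodes proximity in usage) is the same.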


[https://gitlab.constantvzw.org/algolit/algolit/tree/master/algologs Scripts used during the session]
[https://pad.constantvzw.org/p/180317_algolit_word2vec Pad of the day]

Revision as of 10:18, 28 March 2018

Word embeddings in my reader for Special Issue 5
