User:Joca/word-embeddings
= Algolit @ Varia =
I participated in the Algolit session of March 17th and learnt about word embeddings. This is a form of unsupervised machine learning in which an algorithm turns the words of a text into numbers and places them in a multidimensional space. The relative distance between specific words reflects how often they appear close to each other in the original text.
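As a rough sketch of that idea (this is not the code from the session; it assumes the gensim library and uses a tiny made-up corpus), training a word2vec model and looking at the resulting space could look like this:

<syntaxhighlight lang="python">
# Minimal sketch, assuming gensim 4.x; the corpus below is a toy stand-in,
# not the text used during the Algolit session.
from gensim.models import Word2Vec

# A tiny tokenised corpus: one list of words per sentence.
sentences = [
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "sits", "on", "the", "rug"],
    ["a", "cat", "and", "a", "dog", "play"],
]

# Train word embeddings: every word becomes a vector in a multidimensional space.
model = Word2Vec(sentences, vector_size=50, window=5, min_count=1, seed=1)

# The numbers that now stand for a single word.
print(model.wv["cat"])

# Words that end up close to "cat" because they occur in similar contexts.
print(model.wv.most_similar("cat"))
</syntaxhighlight>

Words that share contexts in the training text ("cat" and "dog" here) end up with nearby vectors, which is what makes the relative distances meaningful.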
[https://gitlab.constantvzw.org/algolit/algolit/tree/master/algologs Scripts used during the session]
[https://pad.constantvzw.org/p/180317_algolit_word2vec Pad of the day]