You shall know a piece by the company it keeps. Chess plays as a data for word2vec models
- In this paper, I apply linguistic methods of analysis to non-linguistic data – chess games – metaphorically equating one with the other and looking for analogies. Chess game notations are also a kind of text: the records of moves or positions of pieces can be treated as the words and statements of a certain language. In this article I show how word embeddings (word2vec) can work on chess game texts instead of natural language texts. I don’t see how this representation of chess data can be used productively; it’s unlikely that these vector models will help engines or people choose the best move. But in a purely academic sense, it’s clear that such methods of information representation capture something important about the very nature of the game – something that doesn’t necessarily lead to a win.
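
The paper's setup is easy to sketch with gensim: treat each game as a "sentence" and each move as a "word". This is a minimal sketch on my part, not the author's actual pipeline – the `games.txt` file, the SAN tokenization, and all the hyperparameters are assumptions:

```python
# Sketch: treating chess games as "sentences" of moves for word2vec.
# Assumes a hypothetical file games.txt with one game per line in SAN,
# e.g. "e4 e5 Nf3 Nc6 Bb5 a6 ...". File name and settings are placeholders.
from gensim.models import Word2Vec

with open("games.txt") as f:
    games = [line.split() for line in f]  # each move token acts as a "word"

# Skip-gram (sg=1) with a small context window, mirroring typical word2vec use.
model = Word2Vec(games, vector_size=100, window=5, min_count=5, sg=1)

# Moves that occur in similar contexts should land near each other in the
# vector space - e.g. castling and common developing moves.
print(model.wv.most_similar("O-O", topn=5))
```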

t-SNE visualisation of endgame moves
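
A plot like the one above could be produced by projecting the learned move vectors down to 2-D. A minimal sketch, assuming the `model` from the previous snippet – the perplexity and the plot styling are arbitrary choices of mine:

```python
# Sketch: projecting the learned move vectors to 2-D with t-SNE.
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

moves = model.wv.index_to_key      # vocabulary of move tokens, most frequent first
vectors = model.wv[moves]          # (n_moves, 100) embedding matrix

xy = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(vectors)

plt.figure(figsize=(8, 8))
plt.scatter(xy[:, 0], xy[:, 1], s=3)
for move, (x, y) in zip(moves[:50], xy[:50]):  # label only the most frequent moves
    plt.annotate(move, (x, y), fontsize=7)
plt.title("t-SNE projection of chess move embeddings")
plt.show()
```
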
Applying word2vec to Recommenders and Advertising
Tasks
- Doctor Appt 8:40 – done! PT too. Love Kaiser Permanente.
- Trash
- Roll in edits for book
SBIRs
- Roll in more changes to the quarterly report – done for now. See how far we get tomorrow
- Start reading up on the W2V algorithm – looks very promising
- Tutorial for TensorFlow
- Word2Vec Tutorial – The Skip-Gram Model from Chris McCormick, who more recently wrote this: Reading and Writing with Projections. He also has a W2V book that you can buy: The Inner Workings of word2vec. (A quick sketch of the skip-gram pair generation is after this list.)
- Start up a new trajectory generator project with Ron – basics are set up
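
The core of the skip-gram model that McCormick's tutorial walks through is just sliding a window over the text and emitting (center, context) training pairs. A minimal sketch – the sentence and window size below are arbitrary illustration values, not taken from the tutorial:

```python
# Sketch: generating (center, context) training pairs for the skip-gram
# model described in McCormick's tutorial.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        # every token within `window` positions of the center is a context word
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the quick brown fox jumps over the lazy dog".split()
for center, context in skipgram_pairs(sentence)[:8]:
    print(center, "->", context)
```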
