Graduation today starting at 10:00!
D20
- Looks like ASRC actually wanted us to build a completely different marketing app, one that three people could finish in 3-4 Fridays. So no sponsorship there
GPT2 Agents
- Adding promotion – done. I have to say that I’m pretty pleased with how the parser handles it. We go from:
62. a8=Q+ Kg1 63. Qa7 Kg2
To:
[62] expanded: white: F Caruana moves white pawn from a7 to a8. White pawn is promoted to white queen. Check. black: Ding Liren moves black king from g2 to g1. [63] expanded: white: F Caruana moves white queen from a8 to a7. black: Ding Liren moves black king from g1 to g2.
- Had a bit of trouble figuring out how to deal with the end of a file with a potentially incomplete game.
- Everything works!
- Now I need to write games out to files, do something with the introductions, etc.
- Also add some variability in the language later
- Also need to handle queenside castling (O-O-O)
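As a sketch of the promotion handling above: a small regex-based expander that turns a SAN token like "a8=Q+" into the narrated form. This is a hypothetical reconstruction, not the project's actual parser; `expand_promotion` and its arguments are illustrative names.

```python
import re

# Hypothetical sketch: expand a SAN promotion token (e.g. "a8=Q+")
# into narrated text like the parser output shown above.
PROMO = re.compile(r"^(?P<to>[a-h][18])=(?P<piece>[QRBN])(?P<check>\+?)$")
PIECE_NAMES = {"Q": "queen", "R": "rook", "B": "bishop", "N": "knight"}

def expand_promotion(token: str, color: str, player: str) -> str:
    m = PROMO.match(token)
    if not m:
        raise ValueError(f"not a promotion move: {token}")
    to = m.group("to")
    # a promoting pawn always comes from the same file, one rank behind
    from_rank = "7" if to[1] == "8" else "2"
    frm = to[0] + from_rank
    piece = PIECE_NAMES[m.group("piece")]
    text = (f"{player} moves {color} pawn from {frm} to {to}. "
            f"{color.capitalize()} pawn is promoted to {color} {piece}.")
    if m.group("check"):
        text += " Check."
    return text
```

A capture like "bxa8=Q" would need the source file from the capture prefix, which this sketch ignores.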
GOES
- See how the 20,000 epoch run went, and start on Wasserstein Loss
- The 20,000 epoch run really wasn’t any better. Rerunning the 10k version and saving the model
- The 10k version seems to be wider and fits better within the bounds. Looking at the two charts, it looks like following the loss/accuracy of the fake data may be best in this case:
- Reading the GAN chapter in Advanced Deep Learning with Keras (github)
- Adding a plot of plots to show the appearance of the generator’s output over the epochs. It should help in understanding the evolution of the generator
- Bless StackOverflow
side = 4
fig = plt.figure(1)
axs = fig.subplots(side, side)
fig.set_figheight(15)
fig.set_figwidth(15)
for i in range(side * side):
    X = gen_data()
    plot_grid(axs, X.T, i, side=side)
fig = plt.figure(2)
plt.plot([1, 2, 3, 4, 3, 2, 1, 2, 3], label="GREEN", color="green")
plt.plot([1, 2, 3, 4, 5, 6, 7, 8, 9], label="RED", color="red")
plt.legend(loc="upper left")
plt.show()
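For the Wasserstein Loss work coming next: a minimal sketch of the loss as it is usually written in Keras-style WGANs (e.g. the Advanced Deep Learning with Keras treatment), assuming the convention of labels +1 for real and -1 for fake and an unclipped critic output. Shown in numpy for clarity; in Keras the body would be `K.mean(y_true * y_pred)`.

```python
import numpy as np

# Sketch of Wasserstein loss, assuming labels of +1 (real) / -1 (fake).
# With these labels, minimizing the mean of y_true * y_pred pushes the
# critic score up on real samples and down on fake ones.
def wasserstein_loss(y_true, y_pred):
    return float(np.mean(y_true * y_pred))
```

Note the critic (not "discriminator") has no sigmoid on its output, and the classic WGAN recipe also needs weight clipping or a gradient penalty to keep the critic Lipschitz.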
- 2:00 Meeting
- Write up a code walkthrough proposal and send to Erik
- Set up a 10:00 meeting with Vadim for tomorrow
- 3:00 Log file meeting (T & Isaac)
- Demo’d LMN and RB
#COVID
- Good discussion about data and paper writing
- Tried to see if the annotated twitter data is in the DB. Some is, some is not
- Tried to get the transformers translate example running again, including downloading the files explicitly. Still didn’t work. Submitted a ticket