Phil 10.16.20

Ping UMBC about onboarding and connecting my accounts

Today I learned of living tree root bridges

Measuring Gendered Correlations in Pre-trained NLP Models

  • In “Measuring and Reducing Gendered Correlations in Pre-trained Models” we perform a case study on BERT and its low-memory counterpart ALBERT, looking at correlations related to gender, and formulate a series of best practices for using pre-trained language models. We present experimental results over public model checkpoints and an academic task dataset to illustrate how the best practices apply, providing a foundation for exploring settings beyond the scope of this case study. We will soon release a series of checkpoints, Zari, which reduce gendered correlations while maintaining state-of-the-art accuracy on standard NLP task metrics.
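As a quick illustration of what a gendered-correlation probe can look like (my own sketch, not the paper's actual method), the snippet below compares the scores a public BERT checkpoint assigns to gendered pronouns in masked templates. The model name, templates, and target tokens are all assumptions for illustration.

```python
# Minimal sketch: probe a public BERT checkpoint for gendered associations
# by comparing masked-token scores for "he" vs. "she". Illustrative only;
# model name and templates are assumptions, not the paper's setup.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

templates = [
    "[MASK] is a nurse.",
    "[MASK] is an engineer.",
]

for t in templates:
    # Restrict the fill-mask scores to the two gendered pronouns and compare.
    results = {r["token_str"]: r["score"] for r in fill(t, targets=["he", "she"])}
    print(t, results)
```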

#COVID/GPT-2 Agents

  • Going to generate a new set of text for the Saudi tweets, then do the same for the US set, and then start training runs over the weekend (rough generation sketch below)
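A minimal sketch of the generation step I have in mind, assuming a Hugging Face GPT-2 checkpoint fine-tuned on the tweet corpus; the checkpoint directory, prompt, and sampling parameters here are placeholders, not the actual project configuration.

```python
# Sketch: sample a batch of synthetic tweets from a fine-tuned GPT-2 model.
# The model directory and prompt are hypothetical placeholders.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

model_dir = "./gpt2-saudi-tweets"  # hypothetical local checkpoint
tokenizer = GPT2Tokenizer.from_pretrained(model_dir)
model = GPT2LMHeadModel.from_pretrained(model_dir)

prompt = "covid"  # hypothetical seed text
inputs = tokenizer(prompt, return_tensors="pt")

# Sample several continuations with nucleus/top-k sampling.
outputs = model.generate(
    **inputs,
    do_sample=True,
    max_length=100,
    top_k=50,
    top_p=0.95,
    num_return_sequences=10,
    pad_token_id=tokenizer.eos_token_id,
)

for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```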

GOES

  • 10:00 Meeting with Vadim

Book

  • More on cults
  • Looking at how alpha males influence chimpanzee travel patterns