Phil 6.12.20

Hey! My dissertation is online now!

Optimizing Multiple Loss Functions with Loss-Conditional Training

  • The idea behind our approach is to train a single model that covers all choices of coefficients of the loss terms, instead of training a model for each set of coefficients. We achieve this by (i) training the model on a distribution of losses instead of a single loss function, and (ii) conditioning the model outputs on the vector of coefficients of the loss terms. This way, at inference time the conditioning vector can be varied, allowing us to traverse the space of models corresponding to loss functions with different coefficients.
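
Here's a minimal sketch of the two ingredients in PyTorch, just to make the idea concrete. The tiny model, the two loss terms, and the way the coefficient vector is sampled are all made up for illustration; they are not taken from the paper.

    import torch
    import torch.nn as nn

    class ConditionedNet(nn.Module):
        """Small regressor that takes the loss-coefficient vector as extra input."""
        def __init__(self, in_dim=8, cond_dim=2, hidden=32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim + cond_dim, hidden),
                nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x, lam):
            # (ii) Condition the model on the coefficient vector by concatenation.
            return self.net(torch.cat([x, lam], dim=-1))

    model = ConditionedNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    l1 = nn.MSELoss(reduction="none")  # two example loss terms
    l2 = nn.L1Loss(reduction="none")

    for step in range(100):
        x = torch.randn(16, 8)
        y = torch.randn(16, 1)
        # (i) Sample a coefficient vector per example instead of fixing one.
        lam = torch.rand(16, 2)
        lam = lam / lam.sum(dim=-1, keepdim=True)
        pred = model(x, lam)
        # Weight each loss term by its sampled coefficient.
        per_sample = lam[:, :1] * l1(pred, y) + lam[:, 1:] * l2(pred, y)
        loss = per_sample.mean()
        opt.zero_grad()
        loss.backward()
        opt.step()

    # At inference time, sweep the conditioning vector to traverse the family
    # of models that would otherwise each need their own training run.
    with torch.no_grad():
        out_a = model(torch.randn(1, 8), torch.tensor([[1.0, 0.0]]))
        out_b = model(torch.randn(1, 8), torch.tensor([[0.0, 1.0]]))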

GPT-2 Agents

  • Applied to get on the OpenAI API waitlist
  • Started figuring out igraph. Welp, it doesn’t plot because it cannot load library ‘libcairo-2.dll’ (error 0x7e), and there doesn’t seem to be a good fix. It’s a shame, because igraph seems to be great for analyzing graphs mathematically. Removing everything.
  • Looks like I can use networkx combined with networkx_viewer (pypi)(github) instead; look into that next (a rough sketch follows this list). Upgraded networkx from 2.1 to 2.4.
  • Pulled my NetworkxGraphing.py class over from Antibubbles and verified that it still works!
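
A rough sketch of what the networkx + networkx_viewer route could look like. The graph contents are placeholders, and the Viewer call follows my reading of the networkx_viewer docs, so treat that part as an assumption.

    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")])

    # The mathematical analysis works without any plotting backend,
    # so no cairo dependency is needed here.
    print(nx.degree_centrality(G))
    print(nx.shortest_path(G, "a", "d"))

    # Interactive viewing via networkx_viewer (Tkinter-based).
    from networkx_viewer import Viewer
    app = Viewer(G)
    app.mainloop()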

[Figure: networkx]

GOES

  • Send Jason my download code
  • Work on GVSETS paper
    • Added formatting changes and moved footnotes to citations
    • Adding a figure for the pipeline. Hmmm. It’s um… big

[Figure: pipeline]
