Phil 1.19.2023

GPT Agents

  • Updated the roadmap project. Also had a thought: we could train a model on a corpus with all the periods replaced by probability markers (1%, 2%, 3%, 4%, 6%, 8%, 16%, 25%, 35%) and then evaluate how well the model learned those patterns as an indicator of overall learning (rough sketch of the substitution step below). Maybe train on the complete works of Shakespeare, since that’s all in the public domain and easy to get. Compare finetuning a GPT-2 against training one up from scratch.
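    A quick sketch of the substitution step, under my own assumptions (each sentence-ending period gets swapped for a marker sampled in proportion to its label, so the trained model's marker frequencies can be checked against the known distribution; the exact rule is still open):

    ```python
    # Minimal sketch, not a settled design: replace every period with a probability
    # marker, sampling each marker in proportion to its stated percentage.
    import random
    import re

    MARKERS = ["1%", "2%", "3%", "4%", "6%", "8%", "16%", "25%", "35%"]
    WEIGHTS = [1, 2, 3, 4, 6, 8, 16, 25, 35]  # proportional to the marker labels

    def mark_periods(text: str, seed: int = 0) -> str:
        """Swap each period for a weighted-random probability marker."""
        rng = random.Random(seed)
        return re.sub(r"\.", lambda _: " " + rng.choices(MARKERS, weights=WEIGHTS)[0], text)

    if __name__ == "__main__":
        sample = "To be, or not to be. That is the question."
        print(mark_periods(sample))
    ```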

SBIRs

  • More on resilience in the Conclusions section. Show two network diagrams. One is a star model where the AI is the central node, with a human-on-the-loop and weapons on each ray. The other is more complex, with AI operators and models as the central nodes connecting to the weapons. Will need to write some generator code for Gephi – done! (A sketch of the star generator is after this list.)
  • 9:15 standup
  • 11:00 Artemis meeting
  • 11:30 CSC touchpoint
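    The generator is already written; this is just a minimal sketch of how the star topology could be built with networkx and exported as GEXF for Gephi. The AI -> human-on-the-loop -> weapon layout of each ray is my reading of the description, not necessarily how the real generator does it:

    ```python
    # Minimal sketch of a Gephi-ready generator (the actual generator code may differ).
    # Builds a star: central AI node, each ray running AI -> human-on-the-loop -> weapon,
    # then writes GEXF for Gephi to lay out.
    import networkx as nx

    def star_model(num_rays: int = 8) -> nx.Graph:
        g = nx.Graph()
        g.add_node("AI", role="model")
        for i in range(num_rays):
            human = f"human_{i}"
            weapon = f"weapon_{i}"
            g.add_node(human, role="human-on-the-loop")
            g.add_node(weapon, role="weapon")
            g.add_edge("AI", human)
            g.add_edge(human, weapon)
        return g

    if __name__ == "__main__":
        nx.write_gexf(star_model(), "star_model.gexf")  # open in Gephi
    ```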

Book

  • So far, nothing back about the permissions log! Or Disney, for that matter.