Phil 10.6.2023

Had a good discussion with Shimei and Jimmy yesterday about Language Models Represent Space and Time. Basically, the idea is that the model itself should have a relative representation of the information in it, and that representation could be made available. The token embeddings are a kind of direction, after all.
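
Quick sketch of what I mean by "embeddings are directions" — GPT-2 and the first-subtoken trick are just placeholder choices for the sketch, not anything from the paper:

```python
# Sketch: treat token embeddings as directions and compare their angles.
# Assumes GPT-2 via Hugging Face; any causal LM's input embeddings would do.
import torch
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")
emb = model.get_input_embeddings().weight  # (vocab_size, d_model)

def direction(word: str) -> torch.Tensor:
    # Use the first sub-token's embedding, normalized to a unit vector.
    ids = tok(" " + word)["input_ids"]
    v = emb[ids[0]].detach()
    return v / v.norm()

for a, b in [("Paris", "London"), ("Paris", "Tuesday")]:
    cos = torch.dot(direction(a), direction(b)).item()
    print(f"cos({a}, {b}) = {cos:.3f}")
```

If the paper's claim holds, the angle between related tokens (two cities) should be consistently tighter than between unrelated ones (a city and a weekday).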

Tasks

  • Call Jim Donnie’s – done
  • Call Nathan – done
  • Chores
  • Load Garmin (done) and laptop (on thumb drive)
  • Pack!
  • Bennie note

SBIRs

  • Cancelled group dinner

GPT Agents

  • Wrote up some thoughts about mapping using the LLM itself here
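
Roughly the kind of thing I'm imagining (not the actual write-up — the model, mean pooling, and PCA are placeholder choices):

```python
# Hypothetical sketch of "mapping using the LLM itself": embed some city
# names with GPT-2 and project to 2D to see if relative geography survives.
import torch
from sklearn.decomposition import PCA
from transformers import AutoModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModel.from_pretrained("gpt2")

def embed(text: str) -> torch.Tensor:
    # Mean-pool the last hidden state over tokens as a crude text vector.
    ids = tok(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**ids)
    return out.last_hidden_state.mean(dim=1).squeeze(0)

cities = ["Seattle", "Denver", "Chicago", "New York", "Miami"]
vecs = torch.stack([embed(c) for c in cities]).numpy()
xy = PCA(n_components=2).fit_transform(vecs)
for city, (x, y) in zip(cities, xy):
    print(f"{city:10s} ({x:+.2f}, {y:+.2f})")
```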