Tasks
- Review AI/Ethics paper
- Review book requests
SBIRs
- Set up app. I want to create a series of components in their own files so the App file isn’t so unmanageable
- Try compressing 2-n sentences into one and build a hierarchical synthesis of a book. Moby-Dick, at about 10,000 lines, would compress at a 4:1 ratio like this (=B$1/POWER($E$1,A2)):
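- Working that formula out (a sketch, assuming B$1 = 10,000 source lines and E$1 = the 4:1 ratio): about seven passes take the whole book down to a single summary.

```python
# Sketch of the spreadsheet formula =B$1/POWER($E$1,A2):
# lines remaining at each compression level, assuming
# B$1 = 10,000 source lines and E$1 = a 4:1 ratio per level.
SOURCE_LINES = 10_000  # B$1
RATIO = 4              # E$1

level, lines = 0, float(SOURCE_LINES)
while lines >= 1:
    print(f"level {level}: ~{lines:,.0f} lines")
    lines /= RATIO  # same as B$1 / RATIO**level for the next row
    level += 1
# level 0: ~10,000 ... level 6: ~2, so ~7 passes reach one summary
```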

- Trying that out with text-davinci-003 gives good results (temp 0, presence penalty 0.8):
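- A minimal sketch of that call, assuming the legacy (pre-1.0) OpenAI Python SDK; the max_tokens value is a guess, not from these notes:

```python
import openai  # legacy pre-1.0 SDK; assumes OPENAI_API_KEY is set

def summarize(chunk: str) -> str:
    """Compress one chunk of source text into a single sentence."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=(
            "Summarize the following into a single sentence:\n"
            f"{chunk}\n"
            "Summary:"
        ),
        temperature=0,         # deterministic output
        presence_penalty=0.8,  # discourage parroting the source wording
        max_tokens=128,        # assumed; not recorded in these notes
    )
    return response.choices[0].text.strip()
```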

- That should keep drift at bay and give us a better ability to look back than just using the closest prompts. We use each level of compression in the prompt, with the last line of the prompt taken from the uncompressed source, and then trace our sources through the levels of indirection.
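- One way that lookback prompt could be assembled (a rough sketch of my reading of the notes; the function name and indexing scheme are hypothetical):

```python
RATIO = 4  # chunks of one level covered by a summary one level up

def build_prompt(levels: list[list[str]], position: int) -> str:
    """Hypothetical lookback prompt: one covering summary from each
    compression level (coarsest first), ending with the line at
    `position` in the uncompressed source (levels[0])."""
    parts = []
    for k in range(len(levels) - 1, 0, -1):
        idx = position // (RATIO ** k)  # level-k summary covering `position`
        if idx < len(levels[k]):
            parts.append(levels[k][idx])
    parts.append(levels[0][position])  # last line: uncompressed source
    return "\n".join(parts)
```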
- Didn’t like the line splits; they make the training text inconsistent. Switching to words (see the chunking sketch after the example below). That seems to be working!
Summarize the following into a single sentence:
ETYMOLOGY. Supplied by a Late Consumptive Usher to a Grammar School. The pale Usher—threadbare in coat, heart, body, and brain; I see him now. He was ever dusting his old lexicons and grammars, with a queer handkerchief, mockingly embellished with all the gay flags of all the known nations of the world.
Summary: A pale, threadbare Usher at a Grammar School was often seen dusting his old lexicons and grammars with a handkerchief decorated with flags from different nations.
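- The word-based chunking could look like this (a sketch; the words-per-chunk size is an assumption):

```python
def chunk_by_words(text: str, words_per_chunk: int = 100) -> list[str]:
    """Split on word count rather than line breaks, so chunk boundaries
    stay consistent regardless of the source's line wrapping."""
    words = text.split()
    return [
        " ".join(words[i : i + words_per_chunk])
        for i in range(0, len(words), words_per_chunk)
    ]
```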
- The recursive summarization is crazy good. Need to run it on the full book. I think I need to add an int that shows the “level” of the abstraction, and tagging each summary with its number out of the total summaries at that level would be good too.
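- A sketch of that level/position metadata on each summary (field names are my own):

```python
from dataclasses import dataclass

@dataclass
class Summary:
    """One node in the hierarchical synthesis (hypothetical fields)."""
    text: str
    level: int  # 0 = uncompressed source; higher = more abstract
    index: int  # which summary this is at its level
    total: int  # number of summaries at this level

    def label(self) -> str:
        return f"[level {self.level}: summary {self.index + 1} of {self.total}]"
```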
- 9:15 standup
- 11:00 ChatGPT working session – went well!