Phil 2.18.21

Book

  • Finished the Google Doodle. Next is the balloon challenge.

GOES

SBIR

  • Meeting on NN architecture

GPT Agents

  • Language Models are Few-Shot Learners (video). Lots of good stuff about how to build probes (a minimal probe sketch is below, after the ecco code).
  • Was able to get the ecco library to work with my local model!
from transformers import AutoModelForCausalLM, AutoTokenizer
from ecco.lm import LM

# What to collect/return during the forward passes
activations = False            # FFN neuron activations (ecco)
attention = False              # attention weights (transformers)
hidden_states = True           # per-layer hidden states (transformers)
activations_layer_nums = None  # None = all layers, if collecting activations

model_str = '../models/chess_model'
tokenizer = AutoTokenizer.from_pretrained(model_str)
model = AutoModelForCausalLM.from_pretrained(
    model_str,
    output_hidden_states=hidden_states,
    output_attentions=attention)

# Wrap the local model in ecco's LM so its collection/visualization tools work
lm_kwargs = {
    'collect_activations_flag': activations,
    'collect_activations_layer_nums': activations_layer_nums}
lm = LM(model, tokenizer, **lm_kwargs)

# Input text
text = "Check."

# Generate 100 tokens to complete the input text.
output = lm.generate(text, generate=100, do_sample=True)

print(output)
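
  • The object that generate() returns is what drives ecco's visualizations. If I'm reading the ecco README right, the next step in a notebook would be something like:

# Interactive input-saliency view over the generated tokens
# (per the ecco README; untested with this local model)
output.saliency()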

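  • For reference, a probe in this sense is just a simple classifier trained on the model's internal representations. Below is a minimal sketch of the idea (my own illustration, not from the video): fit a logistic regression on one layer's last-token hidden state, given a set of texts with binary labels. The probe_layer name and setup are hypothetical.

import numpy as np
import torch
from sklearn.linear_model import LogisticRegression

def probe_layer(model, tokenizer, texts, labels, layer=-1):
    """Fit a linear probe on one layer's last-token hidden state."""
    feats = []
    for t in texts:
        enc = tokenizer(t, return_tensors='pt')
        with torch.no_grad():
            out = model(**enc, output_hidden_states=True)
        # out.hidden_states is a tuple of (n_layers + 1) tensors,
        # each [1, seq_len, dim]; take the final token's vector
        # from the requested layer
        feats.append(out.hidden_states[layer][0, -1].numpy())
    X = np.array(feats)
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    return clf, clf.score(X, labels)  # training accuracy, as a sanity check

A high score at some layer suggests the labeled property is linearly decodable there, e.g. running probe_layer over positions labeled check/not-check for the chess model, sweeping the layer argument.
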
  • Had a nice chat with Antonio about an introduction for the special issue.