Monthly Archives: June 2021

Phil 6.7.21

Retractable Pergola Covers & Awnings

  • Unlike drop awnings, the Verona is a traditional horizontal awning or angled awning that provides excellent shade coverage. The Verona is designed to be installed on top of a pergola or trellis, providing a natural and effective means for light and temperature control while still allowing open air spaces. The Verona can also be installed over traditional construction such as conservatories, glass ceilings, atriums, solariums and skylights to control interior light, ultraviolet rays, glare, and heat. The box awning frame of the Verona uses compact mounting hardware that makes it simple to install over almost any kind of frame.

SocialSens 2021

  • Keynote: Cecile Paris, CSIRO, Australia, “Mapping Emotions on Social Media”
  • 10:45 session
  • Done with my presentation! This person looks interesting: Adriana Iamnitchi
    • My research is rooted in distributed systems, with emphasis on characterizing cyber-social systems and designing, implementing and experimenting with algorithms, services and applications for large-scale networked-systems. In a typical project cycle, in our group we quantitatively characterize socio-technical phenomena at scale, model them, apply new understandings to the design of distributed systems, and experimentally measure the performance differences. In the process we often rely on, and contribute to, research from other fields. Recently we have used research from sociology, psychology and political science to build better understandings of quantitative observations or to inform my design and experiments. While my recent work is related mainly to online social interactions and big data processing, the same research practice (of quantitatively evaluating socio-technical environments and then applying observations to the design of distributed systems or services) defines my early work in scientific grids and peer-to-peer systems. For more details, please refer to my research statement.
  • Had to bail to frantically assemble 3 near-useless quad charts by 4:00

SBIR

  • Had to assemble 3 near-useless quad charts by COB because someone realized that LM needed them today. First time I seriously thought about quitting this company.

Phil 6.4.21

Tesla sees a truck carrying traffic lights (via Twitter):

Ping Tim!

Send David money!

GPT Agents

  • Finish slides
  • 3:30 Walkthrough

Book

  • Started the “Do you see yourself here” section. Thought a lot about John 1:1
  • 2:00 Meeting with Michelle

Phil 6.3.21

Decision Transformer: Reinforcement Learning via Sequence Modeling

  • We present a framework that abstracts Reinforcement Learning (RL) as a sequence modeling problem. This allows us to draw upon the simplicity and scalability of the Transformer architecture, and associated advances in language modeling such as GPT-x and BERT. In particular, we present Decision Transformer, an architecture that casts the problem of RL as conditional sequence modeling. Unlike prior approaches to RL that fit value functions or compute policy gradients, Decision Transformer simply outputs the optimal actions by leveraging a causally masked Transformer. By conditioning an autoregressive model on the desired return (reward), past states, and actions, our Decision Transformer model can generate future actions that achieve the desired return. Despite its simplicity, Decision Transformer matches or exceeds the performance of state-of-the-art model-free offline RL baselines on Atari, OpenAI Gym, and Key-to-Door tasks.
  • I think this means that the backwards transformer could be trained to write questions that are most likely to result in a particular answer.
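To pin the idea down for myself, here's a minimal sketch of the return-conditioned setup the abstract describes: interleave (return-to-go, state, action) tokens, push them through a causally masked Transformer, and read the next action off the state positions. The class name, dimensions, and shapes here are placeholders for illustration, not the authors' implementation.

```python
# Minimal, hypothetical sketch of return-conditioned sequence modeling.
import torch
import torch.nn as nn

class TinyDecisionTransformer(nn.Module):
    def __init__(self, state_dim, act_dim, d_model=64, n_heads=2, n_layers=2, max_len=64):
        super().__init__()
        # Embed return-to-go, state, and action into a shared token space.
        self.embed_rtg = nn.Linear(1, d_model)
        self.embed_state = nn.Linear(state_dim, d_model)
        self.embed_action = nn.Linear(act_dim, d_model)
        self.pos = nn.Embedding(3 * max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.predict_action = nn.Linear(d_model, act_dim)

    def forward(self, rtg, states, actions):
        # rtg: (B, T, 1), states: (B, T, state_dim), actions: (B, T, act_dim)
        B, T, _ = states.shape
        tokens = torch.stack(
            [self.embed_rtg(rtg), self.embed_state(states), self.embed_action(actions)],
            dim=2,
        ).reshape(B, 3 * T, -1)                       # interleave (R, s, a) per timestep
        tokens = tokens + self.pos(torch.arange(3 * T))
        # Causal mask: each token only attends to itself and earlier tokens.
        mask = torch.triu(torch.full((3 * T, 3 * T), float("-inf")), diagonal=1)
        h = self.encoder(tokens, mask=mask)
        return self.predict_action(h[:, 1::3, :])     # predict actions from the state tokens

# Conditioning on a high desired return steers which actions get generated.
model = TinyDecisionTransformer(state_dim=4, act_dim=2)
rtg = torch.full((1, 10, 1), 90.0)          # desired return-to-go
states, actions = torch.randn(1, 10, 4), torch.randn(1, 10, 2)
print(model(rtg, states, actions).shape)    # torch.Size([1, 10, 2])
```

The "backwards" idea above would be the same trick with the conditioning target swapped: condition on the desired answer and generate candidate questions.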

Book

  • Did a little fixing of the maps and chapters when I realized that the government is not like a large company. Companies are much more tied up in money, which makes sense. The government is about the power to protect, punish, and hide knowledge. It’s much closer to Greek/Roman gods?
  • Need to respond to Uprenda today

SBIR

  • More final report writing
  • 9:15 standup
  • 10:30 proposal meeting

GPT-Agents

  • More slide re-working
  • At 600k updates, so this will take about 2 weeks

Phil 6.2.21

GPT Agents

  • Gave up on the SQL update because I think it was taking more time for each line. I’m now using the primary key for a select/update pair, and it seems to take the same time for each (see the sketch after this list). Looks like around 25k updates/hour?
  • Updating my slides to show the GPT generating text
  • And many, many more fixes
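For my own notes, a rough sketch of that per-row select/update pattern, assuming a SQLite-style table. The file, table, and column names (experiments.db, table_runs, id, raw_text, parsed_text) are made up for illustration.

```python
# Hypothetical sketch of the select/update-by-primary-key loop described above.
import sqlite3

conn = sqlite3.connect("experiments.db")   # placeholder database file
cur = conn.cursor()

cur.execute("SELECT id, raw_text FROM table_runs WHERE parsed_text IS NULL")
rows = cur.fetchall()

for row_id, raw_text in rows:
    parsed = raw_text.strip().lower()      # stand-in for the real processing step
    # Each UPDATE touches exactly one row through the primary-key index,
    # so the cost per row stays roughly constant instead of growing.
    cur.execute("UPDATE table_runs SET parsed_text = ? WHERE id = ?", (parsed, row_id))

conn.commit()
conn.close()
```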

Book

  • More conspiracy article/chapter

SBIR

  • Starting final report
  • 10:00 meeting

JuryRoom

  • 7:00 Meeting

Phil 6.1.21

June!

This looks quite interesting:

An Attention Free Transformer

  • We introduce Attention Free Transformer (AFT), an efficient variant of Transformers that eliminates the need for dot product self attention. In an AFT layer, the key and value are first combined with a set of learned position biases, the result of which is multiplied with the query in an element-wise fashion. This new operation has a memory complexity linear w.r.t. both the context size and the dimension of features, making it compatible to both large input and model sizes. We also introduce AFT-local and AFT-conv, two model variants that take advantage of the idea of locality and spatial weight sharing while maintaining global connectivity. We conduct extensive experiments on two autoregressive modeling tasks (CIFAR10 and Enwik8) as well as an image recognition task (ImageNet-1K classification). We show that AFT demonstrates competitive performance on all the benchmarks, while providing excellent efficiency at the same time.
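A rough sketch of the basic AFT operation as the abstract describes it: keys and values are combined with learned position biases, then gated element-wise by the query. This is the naive, non-causal form with an explicit T×T bias just to make the shapes concrete; the paper's linear-memory claim comes from factorized/local variants, and all names and shapes here are illustrative.

```python
# Illustrative, non-causal sketch of the attention-free operation.
import torch

def aft_full(q, k, v, w):
    # q, k, v: (T, d) projections of the input; w: (T, T) learned position biases.
    weights = torch.exp(k.unsqueeze(0) + w.unsqueeze(-1))   # (T, T, d): K combined with biases
    num = (weights * v.unsqueeze(0)).sum(dim=1)             # weighted sum of values, (T, d)
    den = weights.sum(dim=1)                                 # normalizer, (T, d)
    return torch.sigmoid(q) * (num / den)                    # element-wise gating by the query

T, d = 8, 16
q, k, v = (torch.randn(T, d) for _ in range(3))
w = torch.zeros(T, T)            # zero biases reduce this to a global weighted pooling
print(aft_full(q, k, v, w).shape)   # torch.Size([8, 16])
```

Note there is no T×T dot-product attention map over heads anywhere; everything is element-wise in the feature dimension.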

SBIR

  • More writing. It turns out that the conference I was aiming for had a (required) early submission deadline for US authors that I missed. Sigh
  • Wrote a description of cloud computing for big science for Eric H
  • Worked on 2 proposal overviews for Orest

Book

  • Working on conspiracy article/chapter

GPT-Agents

  • Still running the statement that I put together Saturday
  • 3:00 – ICWSM rehearsal – lots of good comments, which means lots of revisions. Another walkthrough this Friday at 3:30