Feeling like the inauguration will go smoothly, but holding my breath anyway
Fixing disinformation won’t save us – Ethan Zuckerman
- There have been countless fact-checking and other efforts designed to rid social media of misinformation. They’re not going to work until the party and the major ideological amplifiers start explicitly renouncing these points of view. The signs are not good – while Fox News was willing to declare that Joe Biden had won the election, they are still providing platforms for people denying the facts of the victory. And a majority of Republican representatives voted to overturn a democratic election. Until there are consequences for perpetuating those falsehoods, don’t count on changes to the media to solve this problem
The end of the Trump-Fox feedback loop
- Twitter’s January 8 decision to permanently suspend Trump’s account closed a rare window into a president’s mindset and policymaking that we are unlikely to ever see again. For the past four years, I documented the sources of the president’s grievances and obsessions, matching Trump’s tweets to the television segments he was watching. The president’s TV addiction inspired at least 1,375 tweets dating back to September 1, 2018. The vast majority came in response to his favorite programs on the pro-Trump Fox News and Fox Business networks.
But if there ever was a coda for the Trump years, this has got to be it:


- In this article, we will focus on the hidden state as it evolves from one model layer to the next. By looking at the hidden states produced by every transformer decoder block, we aim to glean information about how a language model arrived at a specific output token. This method is explored by Voita et al. [1]. Nostalgebraist [2] presents compelling visual treatments showcasing the evolution of token rankings, logit scores, and softmax probabilities for the evolving hidden state through the various layers of the model.
- [1]: The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives
- [2]: interpreting GPT: the logit lens
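- The logit-lens idea above can be sketched on a toy decoder (a hypothetical stand-in model, not GPT-2; all names here are illustrative): apply the final output head to the hidden state after every layer and watch how the next-token distribution evolves.

```python
# Minimal "logit lens" sketch on a toy decoder. The model here is a
# hypothetical stand-in (random weights, generic transformer blocks);
# the point is the technique: reuse the same output head at every layer.
import torch
import torch.nn as nn

vocab, d_model, n_layers = 100, 32, 4
embed = nn.Embedding(vocab, d_model)
blocks = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
    for _ in range(n_layers)
)
unembed = nn.Linear(d_model, vocab, bias=False)  # shared output head

tokens = torch.tensor([[1, 5, 9]])               # one short sequence
h = embed(tokens)
per_layer_logits = [unembed(h[:, -1])]           # lens on the raw embedding
for block in blocks:
    h = block(h)                                 # hidden state after this layer
    per_layer_logits.append(unembed(h[:, -1]))   # same head, intermediate state

lens = torch.stack(per_layer_logits)             # (n_layers + 1, 1, vocab)
print(lens.shape)                                # torch.Size([5, 1, 100])
```

With a real pretrained model you would instead request `output_hidden_states` and project each returned hidden state through the model's own unembedding matrix, which is what the visualizations in [2] are built from.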
Book
- Start on diversity injection section
- Research note: Examining false beliefs about voter fraud in the wake of the 2020 Presidential Election
- The 2020 U.S. Presidential Election saw an unprecedented number of false claims alleging election fraud and arguing that Donald Trump was the actual winner of the election. Here we report a survey exploring belief in these false claims that was conducted three days after Biden was declared the winner. We find that a majority of Trump voters in our sample – particularly those who were more politically knowledgeable and more closely following election news – falsely believed that election fraud was widespread, and that Trump won the election. Thus, false beliefs about the election are not merely a fringe phenomenon. We also find that Trump conceding or losing his legal challenges would likely lead a majority of Trump voters to accept Biden’s victory as legitimate, although 40% said they would continue to view Biden as illegitimate regardless. Finally, we found that levels of partisan spite and endorsement of violence were equivalent between Trump and Biden voters.
MDS
- Meeting with Aaron today to discuss next steps and how to combine with his project?
- Still need to be able to access the VPN – more paperwork. Wheee!
GOES
- Continue with the new TopController
- Reading in and stepping through the script. Now I need to slew through the points and return a "done" when the L2 distance is within a threshold
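- A minimal sketch of that slew loop (all names here are hypothetical placeholders, not the actual TopController API): step toward each scripted point until the L2 distance drops under the threshold, then report "done".

```python
# Hypothetical sketch of slewing through scripted points until the L2
# distance to each target is within a threshold. move_toward() and
# get_pose() are stand-ins for the real controller interface.
import numpy as np

def within_threshold(current, target, threshold=1e-3):
    """True when the L2 distance between poses is under the threshold."""
    return np.linalg.norm(np.asarray(current) - np.asarray(target)) < threshold

def slew_through(points, move_toward, get_pose, threshold=1e-3, max_steps=1000):
    """Slew through each point in order; return 'done' once all are reached."""
    for target in points:
        for _ in range(max_steps):
            if within_threshold(get_pose(), target, threshold):
                break                  # this point reached, go to the next
            move_toward(target)        # take one slew step
        else:
            return "timeout"           # never converged on this point
    return "done"
```

The real controller would replace the inner loop with whatever stepping the script drives; the shape of the check (L2 distance against a threshold, per point, in order) is the part the note describes.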
GPT Agents
- Fit More and Train Faster With ZeRO via DeepSpeed and FairScale
- If you use the Hugging Face Trainer, as of transformers v4.2.0 you have experimental support for DeepSpeed’s and FairScale’s ZeRO features. The new --sharded_ddp and --deepspeed command-line Trainer arguments provide FairScale and DeepSpeed integration respectively. Here is the full documentation.
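- Roughly how those flags slot into an existing Trainer-based run (a hypothetical invocation; run_clm.py, the file names, and the DeepSpeed config path are illustrative):

```shell
# FairScale sharded DDP via the new Trainer flag (transformers >= 4.2.0):
python run_clm.py \
  --model_name_or_path gpt2 \
  --train_file train.txt \
  --output_dir out \
  --sharded_ddp

# Or DeepSpeed ZeRO, pointing --deepspeed at a JSON config:
# python run_clm.py ... --deepspeed ds_config.json
```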
- Send data – done!
- 3:30 Meeting
- Finish paper and submit?
- Need to export embeddings to the TensorBoard embedding projector. Here’s how: tensorflow.org/tensorboard/tensorboard_projector_plugin
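- Since the rest of this project is PyTorch, the export can also go through torch.utils.tensorboard rather than the TensorFlow path in that link. A sketch (the random matrix and labels are stand-ins for the actual GPT-agent embeddings):

```python
# Write an embedding matrix plus labels for the TensorBoard projector.
# The tensors here are placeholders for the real agent embeddings.
import torch
from torch.utils.tensorboard import SummaryWriter

embeddings = torch.randn(50, 16)                 # stand-in: 50 vectors, dim 16
labels = [f"token_{i}" for i in range(50)]       # one metadata label per row

writer = SummaryWriter("runs/embedding_demo")
writer.add_embedding(embeddings, metadata=labels, tag="gpt_agents")
writer.close()
```

Then `tensorboard --logdir runs` and open the Projector tab to browse the vectors.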