Not much time to work on it this morning, but added some text about Stephens’ neural coupling. Tomorrow I’ll talk about the agents a bit and in particular how increasing dimensions makes it harder to have a stampede
There have been countless fact-checking and other efforts designed to rid social media of misinformation. They’re not going to work until the party and the major ideological amplifiers start explicitly renouncing these points of view. The signs are not good – while Fox News was willing to declare that Joe Biden had won the election, they are still providing platforms for people denying the facts of the victory. And a majority of Republican representatives voted to overturn a democratic election. Until there are consequences for perpetuating those falsehoods, don’t count on changes to the media to solve this problem
Twitter’s January 8 decision to permanently suspend Trump’s account closed a rare window into a president’s mindset and policymaking that we are unlikely to ever see again. For the past four years, I documented the sources of the president’s grievances and obsessions, matching Trump’s tweets to the television segments he was watching. The president’s TV addiction inspired at least 1,375 tweets dating back to September 1, 2018. The vast majority came in response to his favorite programs on the pro-Trump Fox News and Fox Business networks.
But if there ever was a coda for the Trump years, this has got to be it:
In this article, we will focus on the hidden state as it evolves from one model layer to the next. By looking at the hidden states produced by every transformer decoder block, we aim to glean information about how a language model arrived at a specific output token. This method is explored by Voita et al.[1]. Nostalgebraist [2] presents compelling visual treatments showcasing the evolution of token rankings, logit scores, and softmax probabilities for the evolving hidden state through the various layers of the model.
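The core of the "logit lens" idea can be sketched in a few lines: take the hidden state after each decoder block, project it through the (shared) unembedding matrix, and softmax to get a per-layer token distribution. The vocabulary, hidden states, and weights below are toy numbers of my own invention, purely to illustrate the mechanics:

```python
import math

# Toy "logit lens": project each layer's hidden state through the
# unembedding matrix and softmax to get a per-layer token distribution.
# All numbers here are made up for illustration only.

VOCAB = ["cat", "dog", "car"]

# Hypothetical hidden states after each of three decoder blocks (dim 2)
hidden_states = [
    [0.1, 0.9],
    [0.6, 0.5],
    [1.2, 0.1],
]

# Hypothetical unembedding matrix: one row of weights per vocab token
W_U = [
    [1.0, 0.0],  # "cat"
    [0.0, 1.0],  # "dog"
    [0.7, 0.7],  # "car"
]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def logit_lens(h):
    logits = [sum(w * x for w, x in zip(row, h)) for row in W_U]
    return softmax(logits)

for layer, h in enumerate(hidden_states):
    probs = logit_lens(h)
    top = VOCAB[probs.index(max(probs))]
    print(f"layer {layer}: top token = {top}")
```

Running this on the toy values shows the top-ranked token changing from layer to layer, which is exactly the kind of evolution the visualizations in [2] chart at full scale.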
The 2020 U.S. Presidential Election saw an unprecedented number of false claims alleging election fraud and arguing that Donald Trump was the actual winner of the election. Here we report a survey exploring belief in these false claims that was conducted three days after Biden was declared the winner. We find that a majority of Trump voters in our sample – particularly those who were more politically knowledgeable and more closely following election news – falsely believed that election fraud was widespread, and that Trump won the election. Thus, false beliefs about the election are not merely a fringe phenomenon. We also find that Trump conceding or losing his legal challenges would likely lead a majority of Trump voters to accept Biden’s victory as legitimate, although 40% said they would continue to view Biden as illegitimate regardless. Finally, we found that levels of partisan spite and endorsement of violence were equivalent between Trump and Biden voters.
MDS
Meeting with Aaron today to discuss next steps and how to combine with his project?
Still need to be able to access the VPN – more paperwork. Wheee!
GOES
Continue with the new TopController
Reading in and stepping through the script. Now I need to slew through the points and return a done when the l2 dist is within a threshold
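The slew-and-check loop is simple enough to sketch directly: step toward each target and report done once the L2 distance is inside the threshold. The function names, step rule, and threshold value here are my own placeholders, not the actual controller code:

```python
import math

# Sketch of the slew loop described above: move toward the target each
# step and return "done" when the L2 distance falls within a threshold.
# Names and the fractional step rule are placeholders.

def l2_dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def slew_to(current, target, rate=0.5, threshold=0.01, max_steps=1000):
    """Move `current` toward `target` a fraction `rate` per step."""
    pos = list(current)
    for _ in range(max_steps):
        if l2_dist(pos, target) <= threshold:
            return pos, True  # within threshold -> report done
        pos = [p + rate * (t - p) for p, t in zip(pos, target)]
    return pos, False  # never converged within max_steps

pos, done = slew_to([0.0, 0.0, 0.0], [1.0, 2.0, 3.0])
print(done, [round(p, 3) for p in pos])
```

With a fractional step like this the distance halves each iteration, so convergence to any positive threshold is guaranteed; a real controller would clamp to a maximum slew rate instead.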
If you use the Hugging Face Trainer, as of transformers v4.2.0 you have experimental support for DeepSpeed’s and FairScale’s ZeRO features. The new --sharded_ddp and --deepspeed command line Trainer arguments provide FairScale and DeepSpeed integration respectively. Here is the full documentation.
The thing is, contemporary Hasidic sects are designed for authoritarian control. Each Hasidic sect, from Bobov to Viznitz to Satmar to Skver, is run by what is called a “grand rabbi.” These rabbis are demanding patriarchs. They expect women to wear particular shades of stockings, men to dress identically, congregants to receive their blessings before making any personal life decisions, and they believe in a world where Hasids are the only Jews worth mentioning. Most importantly, Hasidic grand rabbis center their congregants’ worlds around themselves. They are populist leaders of miniature nations. Congregants have paintings and photographs of grand rabbis around their homes, sacrifice family time for tisches (Friday night gatherings) with their leaders, and would do anything to protect the power of their particular grand rabbi.
Book
Working on Making better Human-Computer Interfaces for Populations. Finished my first pass at the “The signature of dangerous misinformation” section
2:00 Meeting with Michelle
GOES
Decided to build out a sandbox ScriptReaderScratch RCS controller to work out the file loading and playback. Rather than AngleController, I’ll have a method that interpolates to the newest target. That should be enough to let me work out the details without breaking anything
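A minimal version of that sandbox idea might look like the following: keep a single "newest target," let a new target simply replace the old one mid-move, and interpolate toward it each tick. The class name echoes the one above, but everything else is a placeholder sketch, not actual GOES code:

```python
# Sandbox sketch: instead of a full AngleController, hold the newest
# target and interpolate a fixed fraction toward it each tick. All
# names and the rate value are placeholders.

class ScriptReaderScratch:
    def __init__(self, pos, rate=0.2):
        self.pos = list(pos)
        self.target = list(pos)
        self.rate = rate

    def set_target(self, target):
        # A new target replaces the old one; no reset needed mid-move
        self.target = list(target)

    def tick(self):
        # Move a fixed fraction of the remaining distance each update
        self.pos = [p + self.rate * (t - p)
                    for p, t in zip(self.pos, self.target)]
        return self.pos

ctrl = ScriptReaderScratch([0.0, 0.0])
ctrl.set_target([10.0, 0.0])
for _ in range(3):
    ctrl.tick()
ctrl.set_target([10.0, 5.0])  # retarget while still in motion
for _ in range(3):
    ctrl.tick()
print([round(p, 2) for p in ctrl.pos])
```

The nice property for working out file loading and playback is that retargeting is free: the interpolation just bends toward whatever the script says next, so nothing else in the pipeline breaks.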
ParametricUMAP allows users to train a neural network to optimize the embedding, resulting in a direct neural net based mapping from source data to embedding. This allows for extremely fast inference (embedding of new data points), orders of magnitude faster than standard UMAP. It also provides facilities for an inverse transform, mapping from the embedding space to the original data space, that is both far faster and more robust than that provided by standard UMAP. Since network architectures can be user-provided, this also allows for CNN and RNN based UMAP embeddings for images or sequences.
Book
Continue Making better Human-Computer Interfaces for Populations
GOES
Add in mapping to script reader, verify by adding legends
MDS
Status meeting. Maybe produce a spreadsheet to walk through that shows a time series of inputs and a calculation for each set? I think the inputs can be a column of six (for now?) variables as a set of rows, with the prediction calculations shown below that. Make a DataFrame and see what that looks like.
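A quick first cut at that DataFrame might look like this. The variable names (v1–v6) and the stand-in "prediction" (just a row sum) are my placeholders; the point is only the shape: one row per timestep, six input columns, prediction computed alongside:

```python
import pandas as pd

# Sketch of the walkthrough spreadsheet: each row is a timestep, six
# (placeholder) input variables, and a stand-in prediction column.

inputs = pd.DataFrame({
    "t":  [0, 1, 2],
    "v1": [0.1, 0.2, 0.3],
    "v2": [1.0, 0.9, 0.8],
    "v3": [0.5, 0.5, 0.6],
    "v4": [0.0, 0.1, 0.1],
    "v5": [2.0, 2.1, 2.2],
    "v6": [-1.0, -1.0, -1.0],
})

# Stand-in "prediction": some function of the inputs per row
cols = ["v1", "v2", "v3", "v4", "v5", "v6"]
inputs["pred"] = inputs[cols].sum(axis=1)
print(inputs)
```

Printed, this already reads like the spreadsheet: inputs across, time down, prediction as the last column, which should be enough to walk through in the status meeting.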
The playbook for the Maga invasion of the nation’s Capitol building on Wednesday has been developing for years in plain sight, at far-right rallies in cities like Charlottesville, Berkeley and Portland, and then, in the past year, at state capitols across the country, where heavily armed white protesters have forced their way into legislative chambers to accuse politicians of tyranny and treason.
Here’s what seems to have happened with the Parler hack. The data may be available for research
Nice paper on training a model to generate synthetic data for better classification training: Reducing AI bias with Synthetic data. It uses Gretel’s gretel-synthetics library. It’s free to use during the beta period; not sure about after, or what the pricing will be. They are hiring, with about seven openings at the moment, so they are burning through someone’s money.
GPT Agents
Finish abstract submission – done
Make an Overleaf project for qualitative paper?
GOES
Finish up the ManeuverReader – done! Here’s the original, with some large number of points that is subsampled to 100 points and stored as a JSON file
Here’s a reconstructed version that uses 1/3 (33) of the steps through the file. You can see a little roughness, but with more points it’s indistinguishable from the original pulled off InfluxDB:
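The subsample-to-JSON step is straightforward to sketch: keep an evenly spaced subset of the full series and serialize it. The point format and function names below are placeholders, but the index arithmetic is the part worth getting right:

```python
import json

# Sketch of the ManeuverReader subsampling: keep n evenly spaced points
# from a long series and round-trip them through JSON. The point data
# here is synthetic; real points come off InfluxDB.

def subsample(points, n):
    if len(points) <= n:
        return list(points)
    step = (len(points) - 1) / (n - 1)  # include first and last points
    return [points[round(i * step)] for i in range(n)]

points = [[i, i * i] for i in range(1000)]  # stand-in maneuver data
kept = subsample(points, 100)

serialized = json.dumps(kept)
loaded = json.loads(serialized)
print(len(loaded), loaded[0], loaded[-1])
```

Spacing the step as `(len - 1) / (n - 1)` guarantees the first and last points survive the subsampling, which matters for reconstructing the ends of a maneuver.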
Working on Hierarchies, Networks, and Technology. New technologies may have the same arc as writing and printing: an initial hierarchy that produces influence networks that counter, to a degree, the more aggressive aspects of a dominance hierarchy
And just so we remember that the pandemic is not going well here. For comparison, the battle that took the most American lives was Antietam, where there were 3,675 fatalities if you count both sides.
Voting in Georgia today. I am pessimistic but hopeful about the outcome
GPT Agents
I’m not sure if the meeting is today at 3:30 or Friday at 4:00?
It was today. Continuing on trying to figure out the best way to understand the behavior of the model. One of the interesting findings for today was that if the data isn’t in the dataset, then the model will start generating tokens at the meta wrapper.
More coding
Book
Working on what’s become Hierarchies, Networks, and Technology, and I think I’m now happy with where it’s going. It makes sense to use as the end of the chapter as well
I’m going to start on a script-reading capability for TopController. I’m thinking of a JSON or XML file that contains the following elements:
Absolute or relative move
axis name
Target (HPR or XYZ)
Timestamp
Required accuracy
So a move could be a series of HPR coordinates that ‘play’. The first step is a MOVE command which includes the filename. The TopController opens the file (or fails and reports it), loads the move into memory and begins to step through it based on the timestamp. On reaching the end of the file and when the AngleController reports success/failure, the TopController reports DONE and is ready for the next MOVE
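Here's one way that file and the stepping loop might look. Every field name below is a guess at the elements listed above (move type, axis, target, timestamp, accuracy), and the gating logic is a stand-in; in the real TopController the loop would block on the AngleController's success/failure report:

```python
import json

# Sketch of a possible script format plus a minimal stepping loop keyed
# on the timestamp. All field names are guesses, not a final format.

script_text = """
[
  {"move": "absolute", "axis": "platform",
   "target": {"heading": 10.0, "pitch": 0.0, "roll": 0.0},
   "timestamp": 0.0, "accuracy": 0.1},
  {"move": "absolute", "axis": "platform",
   "target": {"heading": 20.0, "pitch": 5.0, "roll": 0.0},
   "timestamp": 1.0, "accuracy": 0.1}
]
"""

def run_script(text, clock):
    """Execute each move whose timestamp the clock has passed."""
    moves = json.loads(text)
    executed = []
    for move in moves:
        # The real controller would wait here for AngleController
        # success/failure; this sketch just gates on the timestamp.
        if clock >= move["timestamp"]:
            executed.append(move)
    done = len(executed) == len(moves)  # end of file -> report DONE
    return executed, done

executed, done = run_script(script_text, clock=2.0)
print(len(executed), "moves executed, DONE =", done)
```

A failed file open would be the other exit path: report the failure instead of DONE and stay ready for the next MOVE, exactly as described above.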
Working on the section about displaying. I found Mike, the chimp that used the kerosene cans. There’s apparently a paper as well, so I put in a request
Loading data about democracies from here (ourworldindata.org/democracy) into my db for better queries and charts. I want to look at recent changes in authoritarian systems as social technologies have changed in the last couple of decades
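A minimal sketch of that load step, assuming a local SQLite db and placeholder table/column names (the real rows would come from the Our World in Data CSV):

```python
import sqlite3

# Hedged sketch: load regime rows into a local db for querying.
# Table and column names are placeholders; example rows are for
# illustration of the query pattern, not the actual dataset.

rows = [
    ("CountryA", 2010, "democracy"),
    ("CountryA", 2019, "electoral autocracy"),
]

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE regimes
               (country TEXT, year INTEGER, regime TEXT)""")
con.executemany("INSERT INTO regimes VALUES (?, ?, ?)", rows)

# Example query: regime classifications in the last several years,
# the kind of "recent changes" slice mentioned above
for country, year, regime in con.execute(
        "SELECT * FROM regimes WHERE year >= 2015"):
    print(country, year, regime)
```

Once the rows are in SQLite, the recent-changes question becomes a self-join or window query over `(country, year)`, which is much easier than re-slicing the CSV for every chart.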
GOES
11:00 Meeting with Vadim
More sparring with Biruh?
MDA
Need some kind of kickoff with the technical folks?
And here are the worst performing states over the duration of the epidemic. Georgia continues to be a mess. Those states at the bottom are coming up fast…
Working on importing and transcribing the debate. Since the original won’t upload, I pulled the video into Adobe Premiere and cut off the head and tail, then exported as an AVI. We’ll see how that works. Nope – it’s ENORMOUS! Trying other formats and getting progressively more annoyed. Aaaaand never got it to work. At least not today.
I did start editing the whole video down to just the displays
GPT Agents
Need to start coding. Going to talk to Stacey about that before I start.
Got some good advice and started.
As I’m coding, it looks like I’m making a nice set of tags for a training set. I wonder how small a set could be used to train something like BERT. Here’s an article:
In this tutorial, we will take you through an example of fine tuning BERT (as well as other transformer models) for text classification using Huggingface Transformers library on the dataset of your choice.
Other work on interpreting transformer internals has focused mostly on what the attention is looking at. The logit lens focuses on what GPT “believes” after each step of processing, rather than how it updates that belief inside the step.
GOES
Sent a note to Biruh asking how the servers will handle interactive video. He said that I could keep the server at home. So he just hates workstations? Anyway, lots of back and forth. Not sure where it’s going.