Discovered State Azure, which is some very nice chill music
My thoughts on the debate last night:
Creating a list of distinct content that translated to “elderman”. Going to see if I can get the Google Translate API to deal with these problem children
Installing the Python libraries for Google Translate
Because I love pain, upgraded TensorFlow. Let’s see if anything still works! It does! At least for translation and GPT, which is good enough for me at the moment
Working on getting the translate API running
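Before sending the bad posts back through the API, it's worth having a filter for them. This is a minimal sketch of that first step, assuming a hypothetical post format and marker list (the actual schema and the API call itself are not shown here):

```python
# Sketch: flag posts whose stored translation collapsed to a known-bad
# token (e.g. "elderman") so they can be re-sent through the Translate
# API. The marker set and the post dict shape are assumptions.

BAD_MARKERS = {"elderman"}

def needs_retranslation(translated_text):
    """True if the translation contains one of the known-bad outputs."""
    tokens = translated_text.lower().split()
    return any(marker in tokens for marker in BAD_MARKERS)

def collect_problem_posts(posts):
    """Return the posts whose stored translation should be redone."""
    return [p for p in posts if needs_retranslation(p["translation"])]

posts = [
    {"id": 1, "translation": "elderman"},
    {"id": 2, "translation": "a perfectly fine sentence"},
]
print([p["id"] for p in collect_problem_posts(posts)])  # [1]
```

The filtered ids would then drive the multiple re-translation passes.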
The paper’s submitted!
Had a good chat with Shimei and Sim last night. The db has been uploaded, and we talked about next steps. I also showed how to install the Huggingface transformers library from source. That involved uninstalling the Typing library for some reason. Seems there are conflicts?
Updated my DaysToZero code to put charts into the spreadsheet, which was pretty straightforward using xlsxwriter. There do seem to be three basic patterns:
The first is steady growth:
So no curve flattening here. The disease is moving pretty steadily through the population. The USA is mostly on this track, as are countries like India and Chile. I think the difference that we see is related to a first, faster wave among the more vulnerable populations.
The second pattern is the ‘flattened curve’. Ireland shows this really well, as does New York state:
The last pattern is the ‘second wave’ pattern. Japan seems to be having one now:
So it looks like we are far from out of the woods on this, and letting your guard down is soundly punished.
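The three patterns above can be sketched as a toy classifier over a daily-count series. The thirds-based averaging and the comparison rules are illustrative assumptions, not the actual DaysToZero logic:

```python
# Toy classifier for the three case-curve patterns: steady growth,
# flattened curve, and second wave. Purely illustrative heuristics.

def classify_curve(daily_cases):
    """Label a series of daily counts by comparing the mean of its
    early, middle, and late thirds."""
    n = len(daily_cases)
    third = n // 3
    early = sum(daily_cases[:third]) / third
    mid = sum(daily_cases[third:2 * third]) / third
    late = sum(daily_cases[2 * third:]) / (n - 2 * third)
    if late > mid > early:                 # rising throughout
        return "steady growth"
    if mid > late and late < early * 1.5:  # peaked, then fell
        return "flattened curve"
    if late > mid and mid < early:         # fell, then rose again
        return "second wave"
    return "unclear"

print(classify_curve([1, 2, 4, 8, 16, 32, 64, 128, 256]))  # steady growth
print(classify_curve([5, 20, 40, 50, 40, 20, 10, 5, 2]))   # flattened curve
print(classify_curve([10, 40, 60, 30, 10, 5, 20, 50, 80])) # second wave
```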
Started the db fixes for the “elderman” posts. Running. I’ve got about 100k bad posts. The fixes seem to be taking care of some of them. This is going to require multiple passes
2:00 Meeting with Vadim
Updated the document. Antonio’s going to submit. Fingers crossed! It would be nice to go to London in May
Need to backup and save the db. Done. It’s almost 11GB! Compressing. Backed up the compressed version, which is (only!) 3GB.
Adding to these thoughts about using three stories to frame nomad, flock, and stampede behaviors.
Listening to BBC Business Daily on London’s dirty financial secrets. In the episode, Tom Burgis, author of a new book, Kleptopia: How Dirty Money is Conquering the World, discusses money laundering. It’s making me think about how, though money is a dimension-reduction process, it’s a very peculiar one. The ability to simplify transactions and “store” the profits means that power accumulates with money. That money lets its owner pay people to add dimensions in other areas. This can be good, as with scientific research, or it can be bad, as with the creation of the byzantine dimensions of money laundering. In the middle somewhere are activities like high-speed trading.
The thing is, any increase in the ability to create social realities that exist independently of the environmental reality creates the conditions for stampedes. A lot of crime (scams, cons, embezzlement) depends on the creation of a social reality that overwhelms trustworthy information coming in through other channels.
At 7.5M tweets so far
Got Antonio’s comments back. Need to roll them in
More rotations. I think we found the problems. The first is that the angles being fed to the absolute angle calculations were wrong. The second was that the sign of the pitch and yaw vectors flip near 180 degrees and we were not compensating for that.
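The 180-degree sign flip is the classic angle-wrapping problem. A minimal sketch of the usual fix, wrapping deltas into (-180, 180] so a step across the boundary keeps the right sign (the function names here are mine, not the project's):

```python
import math

def wrap_angle_deg(a):
    """Wrap an angle in degrees into the half-open interval (-180, 180]."""
    a = math.fmod(a, 360.0)
    if a > 180.0:
        a -= 360.0
    elif a <= -180.0:
        a += 360.0
    return a

def angle_delta(prev, curr):
    """Smallest signed step from prev to curr, immune to the 180-degree
    sign flip (e.g. 179 -> -179 is a +2 degree step, not -358)."""
    return wrap_angle_deg(curr - prev)

print(angle_delta(179.0, -179.0))  # 2.0
print(angle_delta(-179.0, 179.0))  # -2.0
```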
Need to try a runthrough for the GVSETS slides. Too tired. Maybe over the weekend.
Need to fix “elderman” translations, though there are some other bad/partial translations as well
Create logger for DDict – done!
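A hypothetical sketch of what a per-instance logger on a data-dictionary class might look like; the DDict class, its methods, and the logger naming are assumptions, not the real implementation:

```python
import logging

logging.basicConfig(level=logging.INFO)

class DDict:
    """Toy data dictionary that logs every addition."""
    def __init__(self, name):
        # One named logger per dictionary instance.
        self.log = logging.getLogger("DDict.{}".format(name))
        self.entries = {}

    def add(self, key, value):
        self.log.info("adding %s = %r", key, value)
        self.entries[key] = value

d = DDict("demo")
d.add("rows", 100)
print(d.entries)  # {'rows': 100}
```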
Start rwheel coding for incremental rotations – started
2:00 Meeting and demo. It went well, I think. We can run 100x speedup now!
Vadim has this thought: at the next demo meeting, we should mention that this is adaptable to many different situations, including simulating the Launch Orbit Raising scenarios, specifically for the upcoming GOES-T launch, since it’s a physics sim and we can make the various pieces move, such as deploying the solar panels.
Today the USA passed 200,000 dead from COVID-19. That triggered a memory from the earlier days of the pandemic. Italy handled the virus poorly, and I remember thinking that Italy would be the standard against which the USA would be judged. It should have been relatively easy to do better than Italy.
In the chart above, I scaled the deaths for selected countries so that they could be compared to the US directly. As you can see, we are now worse than any country in the EU, and staggeringly worse than South Korea.
Not sure what to do about that other than just be angry.
Reading more on the money book, and coming to the conclusion that money provides consistent mathematical rules for transactions that promote coordination
10:00 Meeting with Vadim. Walked through a lot and fixed a few things. There seems to be some problem related to 120 degrees, which is the angular spacing of the reaction wheels. I think that we are hitting a singularity. Working on a short-term fix, though I think the long-term fix is to simply use the reference frame code I’m working on
Speaking of which, I need a case for handling overlapping vectors (angle = 0), avoiding angles that are too small, and using the last good angle if possible. If no angles are available, then wait until the next timestep and get larger angles? Done!
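That gating rule can be sketched in a few lines. The threshold value and function shape here are illustrative assumptions, not the project's actual code:

```python
# Sketch of the angle-gating rule: ignore overlapping vectors
# (angle == 0), skip angles below a noise threshold, and fall back
# to the last good angle when the current one is unusable.
MIN_ANGLE_DEG = 1.0  # illustrative threshold, not a measured value

def usable_angle(current, last_good):
    """Return the angle to use this timestep, or None to wait."""
    if current is not None and current > MIN_ANGLE_DEG:
        return current      # current measurement is fine
    if last_good is not None:
        return last_good    # reuse the previous good angle
    return None             # nothing usable: wait for a larger angle

print(usable_angle(0.0, 12.5))   # 12.5
print(usable_angle(7.0, 12.5))   # 7.0
print(usable_angle(0.0, None))   # None
```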
I would love to see a timeline of all the things that we’ve done in response to 9-11 and how they’ve worked out
Parse timestamps – done
Hand-annotate the schema?
datetime – done!
Start running analysis? Wrote a tweet to the db. Calling it a day
2:30 Meeting with Shimei and Sim – started a google doc for tasking
10:00 Meeting with Vadim. I think the goal should be to tune the rwheels so that the yaw flip curves start to look more realistic, and then see how it works at speed. One of the issues that we need to think about is the role of mass in high-timestep physics. Would lower mass make better behavior?
2:00 Meeting with Michelle – tweaked the dimension reduction section
Graph neural networks exploit relational inductive biases for data that come in the form of a graph. However, in many cases we do not have the graph readily available. Can graph deep learning still be applied in this case? In this post, I draw parallels between recent works on latent graph learning and older techniques of manifold learning.
Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I’ll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.
Made good progress on the table builder. I have a MySQL implementation that’s pretty much done
Made a pitch for IRAD funding
Working on the tweet parsing
I have an issue with the user table. Tweets have a many-to-one relationship with user, so it’s a special case.
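A minimal sketch of that many-to-one relationship, shown here with sqlite3 for portability; the table and column names are assumptions, not the real schema:

```python
import sqlite3

# Hypothetical minimal schema: many tweets reference one user row.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE users (
    user_id     INTEGER PRIMARY KEY,
    screen_name TEXT
);
CREATE TABLE tweets (
    tweet_id INTEGER PRIMARY KEY,
    user_id  INTEGER NOT NULL REFERENCES users(user_id),
    text     TEXT
);
""")
con.execute("INSERT INTO users VALUES (1, 'example_user')")
con.executemany("INSERT INTO tweets VALUES (?, 1, ?)",
                [(10, "first"), (11, "second")])

# Many tweets resolve to one user row:
rows = con.execute("""
    SELECT u.screen_name, COUNT(*)
    FROM tweets t JOIN users u ON u.user_id = t.user_id
    GROUP BY u.user_id
""").fetchall()
print(rows)  # [('example_user', 2)]
```

The special case is just that the ingest has to upsert the user once rather than once per tweet.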
Added “object” and “array” to the cooked dictionary so that I can figure out what to do
Got the main pieces working, but the arrays can contain objects and the schema generator doesn’t handle that.
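Handling arrays that contain objects usually means recursing into them. A sketch of that recursion, assuming parsed-JSON input; the output format (a nested dict of type names) is an illustrative choice, not the generator's real one:

```python
# Sketch: recurse into dicts AND into lists, so arrays of objects
# produce a nested description instead of breaking the generator.

def infer_schema(value):
    """Map a parsed-JSON value to a nested description of its types."""
    if isinstance(value, dict):
        return {k: infer_schema(v) for k, v in value.items()}
    if isinstance(value, list):
        # Recurse into the first element as a representative sample.
        return [infer_schema(value[0])] if value else []
    return type(value).__name__

tweet = {"id": 123,
         "entities": {"hashtags": [{"text": "demo", "indices": [0, 4]}]}}
print(infer_schema(tweet))
```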
I think I’m going to add DATETIME processing for now and call it a day. I can start ingesting over the weekend
Didn’t make the progress I needed to on translating the text, so I asked for a week extension
Downloaded the CSV file. Looks the same as the other formats with a “Label” addition. Should be straightforward
Looks like Vadim fixed the transforms, so I’m off the hook
Registered for M&S Affinity Group. Looks like I’ll be speaking at 12:20 on Monday
10:00 Meeting with Vadim
Updated the DataDictionary to sys.exit(-1) on a name redefinition
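A sketch of what exiting on a name redefinition might look like; the class shape and method names are assumptions based on the note above:

```python
import sys
import logging

class DataDictionary:
    """Toy version: refuses to silently redefine a name."""
    def __init__(self):
        self.names = {}

    def define(self, name, spec):
        if name in self.names:
            logging.error("name %r redefined", name)
            sys.exit(-1)  # hard stop on redefinition
        self.names[name] = spec

dd = DataDictionary()
dd.define("tweet_id", "BIGINT")
# dd.define("tweet_id", "VARCHAR(32)")  # would exit with status -1
```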
11:00 Slides with T
ML-seminar (3:30 – 5:30)
My nomination for Adjunct Assistant Research Professor has been approved! Now I need to wait for the chain of approvals
JuryRoom (5:30 – 7:00)
Alex was the only one on. We discussed HTML, CSS, and LaTeX