Didn’t get a chance to write a post yesterday, so I’ll just include yesterday’s progress.
- Graph neural networks exploit relational inductive biases for data that come in the form of a graph. However, in many cases we do not have the graph readily available. Can graph deep learning still be applied in this case? In this post, I draw parallels between recent works on latent graph learning and older techniques of manifold learning.
- Through this post, I want to establish links between Graph Neural Networks (GNNs) and Transformers. I’ll talk about the intuitions behind model architectures in the NLP and GNN communities, make connections using equations and figures, and discuss how we could work together to drive progress.
- Made good progress on the table builder. I have a MySQL implementation that’s pretty much done
- Made a pitch for IRAD funding
- Working on tweet parsing
- I have an issue with the user table. Tweets have a many-to-one relationship with users, so it’s a special case
- Added “object” and “array” to the cooked dictionary so that I can figure out what to do
- Got the main pieces working, but the arrays can contain objects and the schema generator doesn’t handle that.
- I think I’m going to add DATETIME processing for now and call it a day. I can start ingesting over the weekend
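Since the actual table builder isn’t shown here, a minimal sketch of the kind of schema inference involved — assuming hypothetical function and table names, and Twitter’s classic `created_at` timestamp format — might look like this. It illustrates the three issues above: nested objects like `user` become their own table referenced by a foreign key in the parent (many tweets, one user), arrays of objects become child tables pointing back at the parent, and DATETIME strings are detected by attempting a parse:

```python
from datetime import datetime

TWITTER_FMT = "%a %b %d %H:%M:%S %z %Y"  # e.g. "Wed Oct 10 20:19:24 +0000 2018"

def infer_type(value):
    """Map a scalar JSON value onto a SQL column type (simplified)."""
    if isinstance(value, bool):          # bool before int: bool subclasses int
        return "BOOLEAN"
    if isinstance(value, int):
        return "BIGINT"
    if isinstance(value, float):
        return "DOUBLE"
    if isinstance(value, str):
        try:
            datetime.strptime(value, TWITTER_FMT)
            return "DATETIME"
        except ValueError:
            pass
    return "TEXT"

def infer_schema(name, record, schemas=None):
    """Return {table_name: {column: sql_type}} for one JSON record.

    Nested objects become their own table, referenced by a <key>_id
    column in the parent (the many-to-one user case). Arrays of objects
    become a child table carrying a <parent>_id column back to the parent.
    """
    if schemas is None:
        schemas = {}
    table = schemas.setdefault(name, {})
    for key, value in record.items():
        child_name = f"{name}_{key}"
        if isinstance(value, dict):
            infer_schema(child_name, value, schemas)
            table[f"{key}_id"] = "BIGINT"                  # FK: parent -> child
        elif isinstance(value, list):
            for item in value:
                if isinstance(item, dict):
                    infer_schema(child_name, item, schemas)
                else:
                    schemas.setdefault(child_name, {})["value"] = infer_type(item)
            if child_name in schemas:
                schemas[child_name][f"{name}_id"] = "BIGINT"  # FK: child -> parent
        else:
            table[key] = infer_type(value)
    return schemas
```

Running `infer_schema("tweet", record)` on a sample tweet would then yield a `tweet` table with a `user_id` foreign key, a `tweet_user` table for the user fields, and a `tweet_hashtags` child table pointing back at `tweet` — roughly the structure the schema generator needs to produce once arrays of objects are handled.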
- Didn’t make the progress I needed to on translating the text, so I asked for a week extension
- Downloaded the CSV file. Looks the same as the other formats, just with a “Label” added. Should be straightforward
- Looks like Vadim fixed the transforms, so I’m off the hook
- Registered for M&S Affinity Group. Looks like I’ll be speaking at 12:20 on Monday
- 10:00 Meeting with Vadim
- Updated the DataDictionary to call sys.exit(-1) on a name redefinition
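The actual DataDictionary isn’t shown here, but the change described above could be sketched as follows — a hypothetical registry that refuses to silently overwrite an existing definition, since a redefined name usually means two sources disagree about the schema:

```python
import sys

class DataDictionary:
    """Minimal sketch of a name registry that hard-fails on redefinition."""

    def __init__(self):
        self._entries = {}

    def define(self, name, spec):
        if name in self._entries:
            # Silently overwriting would hide a schema conflict between
            # two sources, so bail out loudly instead.
            print(f"DataDictionary: '{name}' redefined", file=sys.stderr)
            sys.exit(-1)
        self._entries[name] = spec

    def lookup(self, name):
        return self._entries[name]
```

Failing fast here is a deliberate trade-off: an ingest run that dies immediately on a conflicting definition is easier to debug than one that quietly keeps the last definition it saw.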
- 11:00 Slides with T
- Write letter
ML-seminar (3:30 – 5:30)
- My nomination for Adjunct Assistant Research Professor has been approved! Now I need to wait for the chain of approvals
JuryRoom (5:30 – 7:00)
- Alex was the only one on. We discussed HTML, CSS, and LaTeX