Anonymous 4: Alleluia: Gratulemur et letemur is lovely
PyTorch BigGraph is an Open Source Framework for Processing Large Graphs
- Graphs are one of the fundamental data structures in machine learning applications. Specifically, graph-embedding methods are a form of unsupervised learning, in that they learn representations of nodes from the native graph structure. Training data in mainstream scenarios such as social media predictions, Internet of Things (IoT) pattern detection, or drug-sequence modeling is naturally represented using graph structures. Any one of those scenarios can easily produce graphs with billions of interconnected nodes. While the richness and intrinsic navigation capabilities of graph structures are a great playground for machine learning models, their complexity poses massive scalability challenges. Not surprisingly, support for large-scale graph data structures in modern deep learning frameworks is still quite limited. Recently, Facebook unveiled PyTorch BigGraph, a new framework that makes it much faster and easier to produce graph embeddings for extremely large graphs in PyTorch models.
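- Rough illustration of the core idea (a hypothetical toy sketch in plain PyTorch, not the PyTorch-BigGraph API): learn one vector per node so that observed edges score higher than randomly sampled node pairs.
```python
import torch
import torch.nn as nn

# Toy graph-embedding sketch (hypothetical, not the PBG API): one embedding
# per node, trained so real edges out-score negative-sampled pairs.
num_nodes, dim = 1000, 64
emb = nn.Embedding(num_nodes, dim)
opt = torch.optim.Adagrad(emb.parameters(), lr=0.1)

# Fake edge list of [source, destination] node ids, shape (num_edges, 2)
edges = torch.randint(0, num_nodes, (5000, 2))

for step in range(100):
    batch = edges[torch.randint(0, len(edges), (256,))]
    src, dst = emb(batch[:, 0]), emb(batch[:, 1])
    neg = emb(torch.randint(0, num_nodes, (256,)))          # random negative nodes
    pos_score = (src * dst).sum(dim=1)                      # dot-product comparator
    neg_score = (src * neg).sum(dim=1)
    loss = torch.relu(1.0 - pos_score + neg_score).mean()   # margin ranking loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```
- As I understand it, PBG scales this pattern out by partitioning the node set and streaming edge buckets, but the score-and-rank training loop is the same basic idea.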
GOES
- Add composite rotation vector to ddict output. It’s kind of doing what it’s supposed to
- Think about an NN to find optimal contributions? Or a simultaneous solution for the scalars that produces the best approximation of the line? I think this is the way to go. I found pymoo: Multi-objective Optimization in Python
- Our framework offers state-of-the-art single- and multi-objective optimization algorithms and many more features related to multi-objective optimization, such as visualization and decision making. Going to ask Vadim whether it can be used for our needs. A minimal usage sketch is below, after this list.
- MORS talk, headshots, slides, etc
- 11:00 meeting with Vadim
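- Minimal pymoo sketch (assuming a recent pymoo release; the two objectives here are hypothetical stand-ins for the scalar-contribution fit, not our actual fitness functions):
```python
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class ScalarFitProblem(ElementwiseProblem):
    """Toy stand-in: find scalar weights x that trade off two errors."""
    def __init__(self):
        super().__init__(n_var=3, n_obj=2, xl=-1.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # Placeholder objectives; swap in the real line-approximation error
        # and whatever second criterion (e.g., size of contributions) matters.
        f1 = np.sum((x - 0.5) ** 2)   # "fit" error
        f2 = np.sum(np.abs(x))        # "magnitude of contributions"
        out["F"] = [f1, f2]

res = minimize(ScalarFitProblem(), NSGA2(pop_size=50), ("n_gen", 100),
               seed=1, verbose=False)
print(res.F)   # Pareto front over the two objectives
print(res.X)   # candidate scalar weight sets
```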
GPT-2 Agents
- BERT, ELMo, & GPT-2: How Contextual are Contextualized Word Representations?
- Incorporating context into word embeddings – as exemplified by BERT, ELMo, and GPT-2 – has proven to be a watershed idea in NLP. Replacing static vectors (e.g., word2vec) with contextualized word representations has led to significant improvements on virtually every NLP task. But just how contextual are these contextualized representations?
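- One way to poke at this myself (a sketch, not the post's methodology; uses Hugging Face transformers and BERT only because they're easy to grab): pull the hidden state for the same word in different sentences and compare cosine similarities. A static embedding like word2vec would give identical vectors; a contextual model shouldn't.
```python
import torch
from transformers import AutoTokenizer, AutoModel

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence, word):
    """Return the contextual hidden state of `word` (a single-token word) in `sentence`."""
    enc = tok(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (seq_len, hidden_dim)
    idx = (enc.input_ids[0] == tok.convert_tokens_to_ids(word)).nonzero()[0, 0]
    return hidden[idx]

v1 = word_vector("The river bank was muddy.", "bank")
v2 = word_vector("She deposited cash at the bank.", "bank")
v3 = word_vector("He sat on the river bank.", "bank")
cos = torch.nn.functional.cosine_similarity
print(cos(v1, v2, dim=0), cos(v1, v3, dim=0))   # same-sense pair should score higher
```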