Post this ride for tomorrow
Word embeddings quantify 100 years of gender and ethnic stereotypes (2018 paper)
- Word embeddings are a popular machine-learning method that represents each English word by a vector, such that the geometry between these vectors captures semantic relations between the corresponding words. We demonstrate that word embeddings can be used as a powerful tool to quantify historical trends and social change. As specific applications, we develop metrics based on word embeddings to characterize how gender stereotypes and attitudes toward ethnic minorities in the United States evolved during the 20th and 21st centuries starting from 1910. Our framework opens up a fruitful intersection between machine learning and quantitative social science.
Book
- Made some good progress on influence and started dominance
- 2:00 meeting with Michelle
GOES
- Sent Vadim a note about catching up. Looks like 2:00 on Monday
- More Plotly
- I know it’s dumb, but I figured out how to do a favicon and webpage title. It’s quite simple. Put the favicon file in a folder called “assets” in the same directory as the Dash code. Setting the title is even easier: add title='my title' to the dash.Dash() call:
app = dash.Dash(__name__, title='Interactive!')
- It looks like it is very important to load the data before any interaction starts. Updating global data from inside a callback is bad (see the sketch below).
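- A minimal sketch pulling the title/favicon and load-the-data-up-front notes together. The data.csv file, the column dropdown, and the layout are placeholders, and Dash 2-style imports are assumed:

import pandas as pd
from dash import Dash, dcc, html, Input, Output

# Load the data once, up front, before any callback can fire.
# 'data.csv' is a placeholder for the real file.
DF = pd.read_csv('data.csv')

# Dash serves ./assets/favicon.ico automatically; the page title is set in the constructor.
app = Dash(__name__, title='Interactive!')

app.layout = html.Div([
    dcc.Dropdown(id='col', options=[{'label': c, 'value': c} for c in DF.columns]),
    html.Pre(id='summary'),
])

@app.callback(Output('summary', 'children'), Input('col', 'value'))
def summarize(col):
    # Treat DF as read-only here: derive values, never assign back into the global frame.
    if col is None:
        return 'pick a column'
    return str(DF[col].describe())

if __name__ == '__main__':
    app.run_server(debug=True)

Since Dash can run with multiple workers, anything a callback writes to a global only changes in that one process, which is why the frame gets treated as read-only after the initial load.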
GPT-Agents
- I’m really thinking about the “Language Through a Prism: A Spectral Approach for Multiscale Language Representations” paper as a basis for topic extraction
- Installed dash-cytoscape for network graphs. Pretty cool:

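- For reference, a minimal dash-cytoscape sketch with a toy three-node network (not the real graph; Dash 2-style imports assumed):

import dash_cytoscape as cyto
from dash import Dash, html

# Toy elements list: node dicts carry an id/label, edge dicts a source/target.
elements = [
    {'data': {'id': 'a', 'label': 'alpha'}},
    {'data': {'id': 'b', 'label': 'beta'}},
    {'data': {'id': 'c', 'label': 'gamma'}},
    {'data': {'source': 'a', 'target': 'b'}},
    {'data': {'source': 'b', 'target': 'c'}},
]

app = Dash(__name__)
app.layout = html.Div([
    cyto.Cytoscape(
        id='network',
        elements=elements,
        layout={'name': 'cose'},          # built-in force-directed layout
        style={'width': '100%', 'height': '500px'},
    )
])

if __name__ == '__main__':
    app.run_server(debug=True)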
- Got 3D scatterplots working as well. Next I need to do some 3D embeddings in Gensim and display them. After that, try doing a least-squares rotation to align two embeddings with each other. If that works, try seeing whether embeddings from each month can be aligned in a reasonable way and what they look like (rough sketch of the alignment step below).

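- A rough sketch of that alignment step: toy corpora standing in for two months of text, tiny 3D Gensim embeddings, SciPy’s orthogonal_procrustes for the least-squares rotation, and a Plotly 3D scatter of the result. The corpora, seeds, and gensim-4 vector_size argument are all assumptions, not the real pipeline:

import numpy as np
import plotly.express as px
from gensim.models import Word2Vec
from scipy.linalg import orthogonal_procrustes

# Two toy corpora standing in for two months of text.
corpus_a = [['the', 'ship', 'sails', 'at', 'dawn'], ['the', 'ship', 'is', 'fast']]
corpus_b = [['the', 'ship', 'sails', 'at', 'dusk'], ['a', 'fast', 'ship', 'sails']]

# Train two tiny 3D embeddings (gensim 4 uses vector_size; gensim 3 used size).
m_a = Word2Vec(corpus_a, vector_size=3, min_count=1, seed=1)
m_b = Word2Vec(corpus_b, vector_size=3, min_count=1, seed=1)

# Least-squares alignment on the shared vocabulary: find the rotation R
# that minimizes ||A R - B|| (orthogonal Procrustes).
shared = sorted(set(m_a.wv.index_to_key) & set(m_b.wv.index_to_key))
A = np.array([m_a.wv[w] for w in shared])
B = np.array([m_b.wv[w] for w in shared])
R, _ = orthogonal_procrustes(A, B)
A_rot = A @ R

# 3D scatter of the rotated embedding, labeled by word.
fig = px.scatter_3d(x=A_rot[:, 0], y=A_rot[:, 1], z=A_rot[:, 2], text=shared)
fig.show()

If the monthly alignments look reasonable, the same rotation R can be applied to the rest of the first vocabulary, not just the shared words.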
- Also, set up matrix distance code for Monday