Work some more on deploying Keyword Explorer – done! Here’s how you deploy using JetBrains PyCharm.
First, make sure you have up-to-date versions of setuptools and twine. Also make sure your directory looks something like this:
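(The layout below is a sketch inferred from the packages list in setup.py; your exact files may differ.)
KeywordExplorer/
    README.md
    setup.cfg
    setup.py
    keyword_explorer/
        __init__.py
        Apps/
        OpenAI/
        TwitterV2/
        tkUtils/
        utils/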
Create your setup.cfg file
[metadata]
description-file = README.md
name = keyword_explorer
Then, your setup.py file. It is important to explicitly list the subdirectories in the packages list. Here’s my example. (Note that “long_description” is set from a variable, long_s. It is what PyPI uses to create its description, and it needs to be updated before creating the wheel. See here for an example):
from distutils.core import setup

setup(
    name='keyword_explorer',
    version="0.0.3.dev",
    packages=['keyword_explorer',
              'keyword_explorer.utils',
              'keyword_explorer.TwitterV2',
              'keyword_explorer.tkUtils',
              'keyword_explorer.OpenAI',
              'keyword_explorer.Apps'],
    url='https://github.com/pgfeldman/KeywordExplorer',
    license='MIT',
    author='Philip Feldman',
    author_email='phil@philfeldman.com',
    description='A tool for producing and exploring keywords',
    long_description=long_s,
    install_requires=[
        'pandas~=1.3.5',
        'matplotlib~=3.2.2',
        'numpy~=1.19.5',
        'sklearn~=0.0',
        'scikit-learn~=0.24.2',
        'requests~=2.27.1',
        'wikipedia~=1.4.0',
        'openai~=0.11.5',
        'networkx~=2.6.2',
        'tkinterweb~=3.12.2'],
    classifiers=[  # Optional
        # How mature is this project? Common values are
        #   3 - Alpha
        #   4 - Beta
        #   5 - Production/Stable
        'Development Status :: 3 - Alpha',
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
)
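The long_s variable has to be defined above the setup() call. A minimal sketch of one way to populate it (an assumption, not the actual code from this project) is to read the README directly:
# Sketch: read the README into long_s so PyPI shows the current description.
# This goes above the setup() call; "README.md" is the file named in setup.cfg.
with open("README.md", "r", encoding="utf-8") as f:
    long_s = f.read()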
NEW! As of June 1, 2023, PyPI requires 2FA and tokens to upload:
Set your username to __token__
Set your password to the (long!) token value, including the pypi- prefix (e.g. pypi-AgEIcHl…)
Where you edit or add these values will depend on your individual use case. For example, some users may need to edit their .pypirc file, while others may need to update their CI configuration file (e.g. .travis.yml if you are using Travis).
Advanced users may wish to inspect their token by decoding it with base64, and checking the output against the unique identifier displayed on PyPI.
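For the .pypirc route, the relevant section ends up looking something like this (the token value below is a placeholder, not a real token):
[pypi]
username = __token__
password = pypi-<your-long-token-value>
With that in place, twine upload dist/* picks up the credentials automatically.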
Moved interview with a biased machine to the front of the book since it could be a nice attention grabber
Started on the Egalitarianism section
Added some stuff to the technology section – Paleolithic and early modern human technologies. I’ll probably move the Paleolithic tech to the Egalitarianism section, since it explains a lot about that behavior and the amount of evolutionary adaptation that took place around it. We do not look like the other great apes. This is why.
SBIRs
Good chat with Steve. He suggested how to create and visualize multiple attribute maps to see how the loss function is working.
Worked out next steps with Rukan:
try predicting more than one point
try sine wave
loss function over 256 collecting loss, but maybe try 16 or some smaller number first
try to train for better convergence to see if it fixes the indexing issue
fix the indexing issue
try a PyTorch hyperparameter optimizer, probably just for the learning rate (see the sketch after this list)
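The notes don’t pin down a specific tool for that last item; as one possibility, here is a minimal sketch that uses Optuna to tune only the learning rate of a small PyTorch model predicting the next few points of a sine wave. Every name and size in it is illustrative, not from the project:
import math
import optuna
import torch
import torch.nn as nn

SEQ_LEN, PRED_LEN = 16, 4  # feed 16 points, predict the next 4

def make_batch(n=256):
    # Random windows of a sine wave: inputs are the first SEQ_LEN samples,
    # targets are the following PRED_LEN samples.
    starts = torch.rand(n, 1) * 2 * math.pi
    t = torch.arange(SEQ_LEN + PRED_LEN) * 0.1
    wave = torch.sin(starts + t)
    return wave[:, :SEQ_LEN], wave[:, SEQ_LEN:]

def objective(trial):
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)  # only the learning rate is searched
    model = nn.Sequential(nn.Linear(SEQ_LEN, 64), nn.ReLU(), nn.Linear(64, PRED_LEN))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(200):
        x, y = make_batch()
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    x, y = make_batch()
    return loss_fn(model(x), y).item()

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)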
What is morality? And to what extent does it vary around the world? The theory of “morality-as-cooperation” argues that morality consists of a collection of biological and cultural solutions to the problems of cooperation recurrent in human social life. Morality-as-cooperation draws on the theory of non-zero-sum games to identify distinct problems of cooperation and their solutions, and it predicts that specific forms of cooperative behavior—including helping kin, helping your group, reciprocating, being brave, deferring to superiors, dividing disputed resources, and respecting prior possession—will be considered morally good wherever they arise, in all cultures. To test these predictions, we investigate the moral valence of these seven cooperative behaviors in the ethnographic records of 60 societies. We find that the moral valence of these behaviors is uniformly positive, and the majority of these cooperative morals are observed in the majority of cultures, with equal frequency across all regions of the world. We conclude that these seven cooperative behaviors are plausible candidates for universal moral rules, and that morality-as-cooperation could provide the unified theory of morality that anthropology has hitherto lacked.
In brief: Reality Team runs ads on Instagram designed to limit the influence of disinformation. We developed a method to run randomized control trials to test the impact on knowledge and opinions about climate and covid vaccines. We saw very significant increases in knowledge 24–72 hours post exposure to a single viewing of a 10 second video ad, and shifts in opinions 7–18 days later within a specific audience of Passive Information Consumers.
Book
Writing a bit more on Age Dominance
While reading Hierarchy in the Forest, I realized that Egalitarianism in bands is probably a form of Nash Equilibrium. Which is wild, since my coevolution of weapons and aggression turned out to be the iterated prisoner’s dilemma.
SBIRs
More work on the DARPA abstract. Check to see if the LAIC Phase II proposal has any good text.
How is the BigScience 176B model trained: a visual overview of the hardware and parallelism setup https://t.co/jX68m4UGVp — BigScience Large Model Training (@BigScienceLLM), March 23, 2022
Book
Finished the first pass at age bias. I still want to add examples of age dominance like
Continue with the code generator. I think I need to set up the hmodule class explicitly, rather than having the nodes store a count. This will allow multiple nodes to have multiple children and generate the correct connections.
Good progress. Starting to create modules and connect them in bdmon
1:00 Dev meeting
Look at resumes and send Orest an example of what we’re looking for
Time series analysis has proven to be a powerful method to characterize several phenomena in biological, neural and socio-economic systems, and to understand their underlying dynamical features. Despite a plethora of methods having been proposed for the analysis of multivariate time series, most of them do not investigate whether signals result from independent, pairwise, or group interactions. Here, we propose a novel framework to characterize the temporal evolution of higher-order dependencies within multivariate time series. Using network analysis and topology, we show that, unlike traditional tools, our framework robustly differentiates various spatiotemporal regimes of coupled chaotic maps, including chaotic dynamical phases and various types of synchronization. By analysing fMRI signals, we find that, during rest, the human brain mainly oscillates between chaotic and few partially intermittent states, with higher-order structures reflecting sensorimotor areas. Similarly, in financial and epidemic time series, instead, higher-order information efficiently discriminates between radically different coordination and spreading regimes. Overall, our approach sheds new light on the higher-order organization of multivariate time series, allowing for a better characterization of dynamical group dependencies inherent to real-world systems.
SBIRs
8:30 Meeting
9:15 standup + went over generator concept
2:00 meeting with Ron
Need to set up the Overleaf project and add a meeting notes section – in progress
Continue on code generator
Here’s my fancy piece of code for the class that sets attributes from a dict:
from typing import List

class HierarchyModule:
    quantity: int
    name: str
    parent: str
    commands: List
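Only the field declarations survive in that snippet; the part that actually sets the attributes from a dict isn’t shown. One plausible shape for it (an assumption, not the original code) looks like this:
from typing import List

class HierarchyModule:
    quantity: int
    name: str
    parent: str
    commands: List

    def __init__(self, d: dict):
        # Copy every key/value pair from the dict onto the instance as attributes.
        for key, val in d.items():
            setattr(self, key, val)

# Example with made-up values:
hm = HierarchyModule({"quantity": 1, "name": "controller", "parent": "root", "commands": []})
print(hm.name, hm.quantity, hm.parent, hm.commands)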
Worked with Rukan on the RCSNN test implementation. You CANNOT have two enum classes with some of the same elements and get equality between them: in Python, members of different Enum classes never compare equal, even when the names and values match.
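A quick illustration in plain Python; the enum names below are made up, not the RCSNN ones:
from enum import Enum, IntEnum

class CommandsA(Enum):
    INIT = 0
    RUN = 1

class CommandsB(Enum):
    INIT = 0
    RUN = 1

print(CommandsA.INIT == CommandsB.INIT)              # False: members of different Enum classes never compare equal
print(CommandsA.INIT.value == CommandsB.INIT.value)  # True: compare the underlying values instead

# IntEnum is one workaround, since its members also behave like ints:
class CommandsC(IntEnum):
    INIT = 0
    RUN = 1

print(CommandsC.INIT == 0)  # True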
Chat with Loren about the stunt fom and how the various pieces work together. We’re going to need some kind of table that describes the behavior of each of the agents.
TriMap is a dimensionality reduction method that forms a low-dimensional embedding of data by minimizing a contrastive loss over a set of triplets. The triplets are sampled from the original high-dimensional data representation and are weighted based on the distances between the (closer and farther) pairs of points. Although t-SNE and UMAP are excellent methods for forming low-dimensional embeddings, TriMap provides an alternative view of the data which is more representative “globally”. Specifically, TriMap is able to:
reflect the relative placement of the clusters in high-dimension,
reveal possible outliers and anomalies in the data,
generate embeddings that are more robust to certain transformations (see here for more details).
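A minimal usage sketch, assuming the trimap package from PyPI and random data standing in for a real dataset:
import numpy as np
import trimap

# Stand-in data: 1,000 points in 50 dimensions.
high_dim = np.random.default_rng(0).normal(size=(1000, 50))

# TRIMAP follows the usual fit_transform pattern and returns a 2-D embedding by default.
embedding = trimap.TRIMAP().fit_transform(high_dim)
print(embedding.shape)  # expected: (1000, 2)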