7:00 – 4:00 ASRC PhD
- Regression with Probabilistic Layers in TensorFlow Probability
- At the 2019 TensorFlow Dev Summit, we announced Probabilistic Layers in TensorFlow Probability (TFP). Here, we demonstrate in more detail how to use TFP layers to manage the uncertainty inherent in regression predictions.
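- For my own reference, a minimal sketch in the style of that post: the last Dense layer emits a mean and a spread, and tfp.layers.DistributionLambda turns them into a Normal so the fit maximizes log-likelihood. The toy data and hyperparameters here are illustrative, not taken from the post.

```python
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Toy 1-D regression data (illustrative only).
x = np.linspace(-1., 1., 100, dtype=np.float32)[:, None]
y = 3. * x + 0.3 * np.random.randn(100, 1).astype(np.float32)

# The final Dense layer emits two numbers per example: one for the mean,
# one (after softplus) for the standard deviation of a Normal.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(1 + 1),
    tfp.layers.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=1e-3 + tf.math.softplus(0.05 * t[..., 1:]))),
])

# Train by maximizing the log-likelihood of y under the predicted Normal.
negloglik = lambda y_true, rv_y: -rv_y.log_prob(y_true)
model.compile(optimizer=tf.optimizers.Adam(learning_rate=0.01), loss=negloglik)
model.fit(x, y, epochs=500, verbose=False)

# The output is a distribution, so mean and uncertainty come for free.
yhat = model(x)
print(yhat.mean().numpy()[:3], yhat.stddev().numpy()[:3])
```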
- Calculate the cosine similarity between all posts and populate a matrix to view and analyze. See if BERT makes sense for this or start with Word2Vec?
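- Rough sketch of the similarity-matrix step, assuming the posts have already been turned into fixed-length vectors (the `post_vectors` array and its shape are placeholders):

```python
import numpy as np

def cosine_similarity_matrix(vectors: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity for an (n_posts, dim) array of post embeddings."""
    norms = np.linalg.norm(vectors, axis=1, keepdims=True)
    unit = vectors / np.clip(norms, 1e-12, None)  # guard against zero vectors
    return unit @ unit.T  # (n_posts, n_posts), entries in [-1, 1]

# Placeholder: 200 posts embedded into 300-dim vectors (e.g. averaged Word2Vec).
post_vectors = np.random.randn(200, 300)
sim = cosine_similarity_matrix(post_vectors)
print(sim.shape, sim[0, 0])  # (200, 200), 1.0 on the diagonal
```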
- Looking at my Word2Vec notes as a place to get started, since all the embedding approaches look about the same
- Put W2V in a class
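- Rough shape for that class, assuming gensim 4.x Word2Vec underneath (class and method names are my own placeholders):

```python
from gensim.models import Word2Vec


class W2VEmbedder:
    """Thin wrapper so the rest of the pipeline doesn't touch gensim directly."""

    def __init__(self, vector_size: int = 100, window: int = 5, min_count: int = 2):
        self.vector_size = vector_size
        self.window = window
        self.min_count = min_count
        self.model = None

    def fit(self, sentences):
        """sentences: list of token lists, one per post."""
        self.model = Word2Vec(sentences, vector_size=self.vector_size,
                              window=self.window, min_count=self.min_count)
        return self

    def embed_post(self, tokens):
        """Average the word vectors in a post; returns None if nothing is in-vocab."""
        vecs = [self.model.wv[t] for t in tokens if t in self.model.wv]
        return sum(vecs) / len(vecs) if vecs else None
```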
- Started reading the db and creating a postanalyzer for each user in each dungeon. Right now that's easy because the user names are the same across dungeons. Changed the key to be a combination of the channel and the user (sketch below)
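- A minimal sketch of the keying fix; the class and function names are hypothetical, not the actual code:

```python
class PostAnalyzer:
    """Hypothetical per-(channel, user) analyzer; collects that user's posts."""

    def __init__(self, channel: str, user: str):
        self.channel = channel
        self.user = user
        self.posts = []

    def add_post(self, text: str):
        self.posts.append(text)


analyzers = {}

def get_analyzer(channel: str, user: str) -> PostAnalyzer:
    # Keying on (channel, user) keeps identically named users in
    # different dungeons from sharing one analyzer.
    key = (channel, user)
    if key not in analyzers:
        analyzers[key] = PostAnalyzer(channel, user)
    return analyzers[key]
```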
- TF Embedding models (https://tfhub.dev/s?module-type=text-embedding)
- Elmo model in TF (https://tfhub.dev/google/elmo/2)
- Elmo looks promising. Here’s a tutorial (https://github.com/PrashantRanjan09/Elmo-Tutorial)
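- A minimal sketch of pulling sentence embeddings from that module. The elmo/2 module uses the TF 1.x hub.Module pattern, so this assumes TF 1.x (or compat.v1) and an older tensorflow_hub release; the example sentences are placeholders.

```python
import tensorflow as tf  # TF 1.x style; the elmo/2 module predates TF 2
import tensorflow_hub as hub

elmo = hub.Module("https://tfhub.dev/google/elmo/2", trainable=False)

sentences = ["disk quota exceeded on node 7", "user joined the channel"]

# The "default" signature takes untokenized sentences. The "elmo" output is
# per-token (batch, max_tokens, 1024); "default" is a mean-pooled (batch, 1024).
embeddings = elmo(sentences, signature="default", as_dict=True)["default"]

with tf.Session() as sess:
    sess.run([tf.global_variables_initializer(), tf.tables_initializer()])
    vecs = sess.run(embeddings)
    print(vecs.shape)  # (2, 1024)
```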
- Map can be a combination of height and color, sort of like the clustering work
- More work on iConf slides/presentation
- Does embedding make sense for log files?
- Three Things We Learned About Applying Word Vectors to Computer Logs
- Experience Report: Log Mining using Natural Language Processing and Application to Anomaly Detection
- DeepLog: Anomaly Detection and Diagnosis from System Logs through Deep Learning
- Wrote up a proposal for embedding to analyze log files