Phil 4.18.18

7:00 – 6:30 ASRC MKT/BD

  • Meeting with James Foulds. We talked about building an embedding space over a body of literature (the works of Jack London, for example) that agents can then navigate across. At the same time, train an LSTM on the same corpus so that the ML system, when given a vector of terms from the embedding (with probabilities/similarities?), produces a line that incorporates those terms and could plausibly come from the work. This provides a much more realistic model of the agent output that could be used for mapping. A nice paper to continue the current work while JuryRoom comes up to speed. (A minimal sketch of the two-part pipeline appears after this list.)
  • Recurrent Neural Networks for Multivariate Time Series with Missing Values
    • Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
    • (A quick sketch of the masking and time-interval inputs appears after this list.)
  • The fall of RNN / LSTM
    • We fell for Recurrent neural networks (RNN), Long-short term memory (LSTM), and all their variants. Now it is time to drop them!
  • JuryRoom
  • Back to proposal writing
  • Done with section 5! LaTeX FTW!
  • Clean up Abstract, Exec Summary and Transformative Impact tomorrow
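
Sketching what the embedding-plus-LSTM idea might look like, assuming gensim for the embedding space and Keras for the language model. The two-sentence "corpus", layer sizes, and greedy rollout are all toy placeholders, not a real design:

```python
# Rough sketch: gensim's Word2Vec stands in for the navigable embedding
# space, and a small Keras LSTM is the language model trained on the same
# corpus. The toy corpus stands in for the works of Jack London.
import numpy as np
import tensorflow as tf
from gensim.models import Word2Vec

corpus = [
    "buck did not read the newspapers".split(),
    "the wild still lingered in him".split(),
]

# 1) Embedding space: agents navigate it by nearest-neighbor hops.
w2v = Word2Vec(corpus, vector_size=32, window=3, min_count=1, seed=1)
print(w2v.wv.most_similar("wild", topn=3))  # terms near "wild", with similarities

# 2) LSTM language model over the same corpus (ids offset by 1; 0 = padding).
vocab = {w: i + 1 for i, w in enumerate(w2v.wv.index_to_key)}
inv = {i: w for w, i in vocab.items()}
X = tf.keras.preprocessing.sequence.pad_sequences(
    [[vocab[w] for w in s[:-1]] for s in corpus])
y = tf.keras.preprocessing.sequence.pad_sequences(
    [[vocab[w] for w in s[1:]] for s in corpus])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab) + 1, 32, mask_zero=True),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.Dense(len(vocab) + 1, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=50, verbose=0)

# 3) Seed with a term pulled from the embedding and roll out a "line".
seq = [vocab["wild"]]
for _ in range(5):
    probs = model.predict(np.array([seq]), verbose=0)[0, -1]
    seq.append(int(np.argmax(probs[1:])) + 1)  # skip the padding id
print(" ".join(inv[i] for i in seq))
```

The open question is whether a line sampled this way actually tracks the agent's position in the embedding well enough to be useful for mapping.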
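The masking and time-interval inputs from the GRU-D abstract are straightforward to compute. A small sketch, following the interval recurrence as I read it from the paper; the learned decay inside the cell is not reproduced here:

```python
# Sketch of the two missing-pattern inputs GRU-D consumes: a binary mask
# (observed vs. missing) and the time interval since each variable was
# last observed. The GRU-D cell's internal decay terms are not shown.
import numpy as np

x = np.array([[1.0, np.nan],   # t = 0
              [np.nan, 2.0],   # t = 1
              [3.0, np.nan]])  # t = 2
timestamps = np.array([0.0, 0.5, 1.5])

mask = (~np.isnan(x)).astype(float)  # m_t: 1 if variable observed at t
delta = np.zeros_like(x)             # delta_t: time since last observation
for t in range(1, len(x)):
    gap = timestamps[t] - timestamps[t - 1]
    # the interval resets to the gap when the variable was observed at
    # t-1, and accumulates when it was missing
    delta[t] = gap + delta[t - 1] * (1 - mask[t - 1])

print(mask)
print(delta)
```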
