Category Archives: Machine Learning

Phil 3.12.19

7:00 – 4:00 ASRC PhD

TFK


Phil 3.11.19

7:00 – 10:00 ASRC PhD. Fun, long day.

Phil 1.29.19

7:00 – 5:30 ASRC IRAD

  • Theories of Error Back-Propagation in the Brain
    • This review article summarises recently proposed theories on how neural circuits in the brain could approximate the error back-propagation algorithm used by artificial neural networks. Computational models implementing these theories achieve learning as efficient as artificial neural networks, but they use simple synaptic plasticity rules based on activity of presynaptic and postsynaptic neurons. The models have similarities, such as including both feedforward and feedback connections, allowing information about error to propagate throughout the network. Furthermore, they incorporate experimental evidence on neural connectivity, responses, and plasticity. These models provide insights on how brain networks might be organised such that modification of synaptic weights on multiple levels of cortical hierarchy leads to improved performance on tasks.
  • Interactive Machine Learning by Visualization: A Small Data Solution
    • Machine learning algorithms and traditional data mining process usually require a large volume of data to train the algorithm-specific models, with little or no user feedback during the model building process. Such a “big data” based automatic learning strategy is sometimes unrealistic for applications where data collection or processing is very expensive or difficult, such as in clinical trials. Furthermore, expert knowledge can be very valuable in the model building process in some fields such as biomedical sciences. In this paper, we propose a new visual analytics approach to interactive machine learning and visual data mining. In this approach, multi-dimensional data visualization techniques are employed to facilitate user interactions with the machine learning and mining process. This allows dynamic user feedback in different forms, such as data selection, data labeling, and data correction, to enhance the efficiency of model building. In particular, this approach can significantly reduce the amount of data required for training an accurate model, and therefore can be highly impactful for applications where large amount of data is hard to obtain. The proposed approach is tested on two application problems: the handwriting recognition (classification) problem and the human cognitive score prediction (regression) problem. Both experiments show that visualization supported interactive machine learning and data mining can achieve the same accuracy as an automatic process can with much smaller training data sets.
  • Shifted Maps: Revealing spatio-temporal topologies in movement data
    • We present a hybrid visualization technique that integrates maps into network visualizations to reveal and analyze diverse topologies in geospatial movement data. With the rise of GPS tracking in various contexts such as smartphones and vehicles there has been a drastic increase in geospatial data being collected for personal reflection and organizational optimization. The generated movement datasets contain both geographical and temporal information, from which rich relational information can be derived. Common map visualizations perform especially well in revealing basic spatial patterns, but pay less attention to more nuanced relational properties. In contrast, network visualizations represent the specific topological structure of a dataset through the visual connections of nodes and their positioning. So far there has been relatively little research on combining these two approaches. Shifted Maps aims to bring maps and network visualizations together as equals. The visualization of places shown as circular map extracts and movements between places shown as edges, can be analyzed in different network arrangements, which reveal spatial and temporal topologies of movement data. We implemented a web-based prototype and report on challenges and opportunities about a novel network layout of places gathered during a qualitative evaluation.
    • Demo!
  • More TkInter.
    • Starting Modern Tkinter for Busy Python Developers
    • Spent a good deal of time working through how to get an image to appear. There are two issues:
      • Loading file formats:
        from tkinter import *
        from tkinter import ttk
        from PIL import Image, ImageTk
      • This is because Tkinter doesn’t natively know how to load much beyond GIF, it seems. However, there is the Python Imaging Library (PIL), which does. Since the original PIL is no longer maintained, install Pillow instead. It looks like the import and bindings are the same.
      • Dealing with garbage collection (“self” keeps the reference alive):
        image = Image.open("hal.jpg")
        self.photo = ImageTk.PhotoImage(image)
        ttk.Label(mainframe, image=self.photo).grid(column=1, row=1, sticky=(W, E))
      • The issue is that if the only reference lives in a local variable, it goes out of scope and Python’s garbage collector scoops up the PhotoImage before the picture can even appear, so Tk ends up trying to draw a None. If you hang the reference off the instance instead (i.e. self.xxx), it stays alive and everything works (a minimal working sketch is at the end of this list).
    • The relevant Stack Overflow post.
    • A pretty picture of everything working (screenshot: app).
  • The 8.6.9 Tk/Ttk documentation
  • Looks like there are some WYSIWYG tools for building pages. PyGubu looks like it’s got the most recent activity.
  • Now my app resizes on grid layouts (screenshot: app2).
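  • Pulling the two TkInter fixes together, a minimal sketch of the pattern (this assumes Pillow is installed and a hal.jpg file sitting next to the script; it is not the actual app code above):
from tkinter import *
from tkinter import ttk
from PIL import Image, ImageTk


class App:
    def __init__(self, root):
        mainframe = ttk.Frame(root, padding="3 3 12 12")
        mainframe.grid(column=0, row=0, sticky=(N, W, E, S))

        image = Image.open("hal.jpg")           # Pillow handles jpg, png, and more
        self.photo = ImageTk.PhotoImage(image)  # kept on self so it isn't garbage collected
        ttk.Label(mainframe, image=self.photo).grid(column=1, row=1, sticky=(W, E))


root = Tk()
App(root)
root.mainloop()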

Phil 1.27.19

The first group is through the test dungeon! Sooooooooooooooooooo much good data! Here’s a taste.

“Huffing a small breath out she did her best to figure out if the beings they were seeing matched up to the outlines they’d seen in the mist previously and if anything about the pair seemed off or odd. She does the same for the dragon though less familiar with the beasts than normal humans, it’s deal… seemed like too easy of a solution and it seemed highly unlikely that it was going to just let them run off with part of its hoard – which in her mind meant it was likely some sort of trick. Figuring out what the trick of it all was currently was her main focus.”

Continuing on my intro to TkInter, which is looking a lot like FLTK from my C++ GUI days. I am not complaining. FLTK was awesome.

Phil 1.22.19

9:00 – 5:00 – ASRC PhD/NASA

  • Google AI proposal is due today! DONE!
  • Next steps for financial analytics
    • Get the historical data to Aaron’s code. Need to look at Pandas’ read_json (a small sketch is at the end of this list)
    • Get the predictions and intervals back
    • Store the raw data
    • update and insert the lineitems – nope
    • populate PredictedAvailableUDO table
  • Big five personality test (For players and characters) Github
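  • A small sketch of what the Pandas read_json step might look like (the file name and exact JSON layout are assumptions; the column names follow the fiscalmonth/fiscalyear/value/balance fields used elsewhere in these notes):
import pandas as pd

# hypothetical export of the disbursement history as a JSON array of records
df = pd.read_json("contract_history.json")
df = df.sort_values(["fiscalyear", "fiscalmonth"])
print(df[["fiscalyear", "fiscalmonth", "value", "balance"]].head())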

Phil 1.21.19

9:00 – 3:30 ASRC NASA

Starting the day off right (image: woodgrain)

    • Less than you think: Prevalence and predictors of fake news dissemination on Facebook
      • So-called “fake news” has renewed concerns about the prevalence and effects of misinformation in political campaigns. Given the potential for widespread dissemination of this material, we examine the individual-level characteristics associated with sharing false articles during the 2016 U.S. presidential campaign. To do so, we uniquely link an original survey with respondents’ sharing activity as recorded in Facebook profile data. First and foremost, we find that sharing this content was a relatively rare activity. Conservatives were more likely to share articles from fake news domains, which in 2016 were largely pro-Trump in orientation, than liberals or moderates. We also find a strong age effect, which persists after controlling for partisanship and ideology: On average, users over 65 shared nearly seven times as many articles from fake news domains as the youngest age group.
  • Working with Aaron on plumbing up the analytic pieces
  • Getting interpolation to work. Done! Kind of tricky: I had to iterate in reverse over the range so that I didn’t step on my indexes. By the way, this is how Python code looks if you’re not a Python programmer:
def interpolate(self):
    num_entries = len(self.data)
    # we step backwards so that inserts don't mess up our indexing
    for i in reversed(range(0, num_entries - 1)):
        current = self.data[i]
        next_entry = self.data[i + 1]  # 'next_entry' rather than 'next', to avoid shadowing the builtin
        next_month = current.fiscal_month + 1
        if next_month > 12:
            next_month = 1
        if next_month != next_entry.fiscal_month: # NOTE: This will not work if there is exactly one year between disbursements
            # we need to make some entries and insert them in the list
            target_month = next_entry.fiscal_month
            if next_entry.fiscal_month < current.fiscal_month:
                target_month += 12
            #print("interpolating between current month {} and target month {} / next fiscal {}".format(current.fiscal_month, target_month, next_entry.fiscal_month))
            for fm in reversed(range(current.fiscal_month+1, target_month)):
                new_entry = PredictionEntry(current.get_creation_query_result())
                new_entry.override_dates(current.fiscal_year, fm)
                self.data.insert(i+1, new_entry)
                #print("\tgenerating fiscal_month {}".format(fm))
  • So this:
tuple = 70/1042/402 contract expires: 2018-12-30 23:59:59
fiscalmonth = 9, fiscalyear = 2017, value = 85000.0, balance = 85000.0
fiscalmonth = 12, fiscalyear = 2017, value = -11041.23, balance = 73958.77
fiscalmonth = 1, fiscalyear = 2018, value = 0.0, balance = 73958.77
fiscalmonth = 2, fiscalyear = 2018, value = -28839.7, balance = 45119.07
fiscalmonth = 3, fiscalyear = 2018, value = 171490.55, balance = 216609.62
fiscalmonth = 4, fiscalyear = 2018, value = -14539.61, balance = 202070.01
fiscalmonth = 5, fiscalyear = 2018, value = -15608.09, balance = 186461.92
fiscalmonth = 9, fiscalyear = 2018, value = -60967.36, balance = 125494.56
fiscalmonth = 10, fiscalyear = 2018, value = -14211.78, balance = 111282.78
fiscalmonth = 1, fiscalyear = 2019, value = -23942.68, balance = 87340.1
fiscalmonth = 2, fiscalyear = 2019, value = -35380.81, balance = 51959.29
  • Gets expanded to this:
tuple = 70/1042/402 contract expires: 2018-12-30 23:59:59
fiscalmonth = 9, fiscalyear = 2017, value = 85000.0, balance = 85000.0
fiscalmonth = 10, fiscalyear = 2017, value = 85000.0, balance = 85000.0
fiscalmonth = 11, fiscalyear = 2017, value = 85000.0, balance = 85000.0
fiscalmonth = 12, fiscalyear = 2017, value = -11041.23, balance = 73958.77
fiscalmonth = 1, fiscalyear = 2018, value = 0.0, balance = 73958.77
fiscalmonth = 2, fiscalyear = 2018, value = -28839.7, balance = 45119.07
fiscalmonth = 3, fiscalyear = 2018, value = 171490.55, balance = 216609.62
fiscalmonth = 4, fiscalyear = 2018, value = -14539.61, balance = 202070.01
fiscalmonth = 5, fiscalyear = 2018, value = -15608.09, balance = 186461.92
fiscalmonth = 6, fiscalyear = 2018, value = -15608.09, balance = 186461.92
fiscalmonth = 7, fiscalyear = 2018, value = -15608.09, balance = 186461.92
fiscalmonth = 8, fiscalyear = 2018, value = -15608.09, balance = 186461.92
fiscalmonth = 9, fiscalyear = 2018, value = -60967.36, balance = 125494.56
fiscalmonth = 10, fiscalyear = 2018, value = -14211.78, balance = 111282.78
fiscalmonth = 11, fiscalyear = 2018, value = -14211.78, balance = 111282.78
fiscalmonth = 12, fiscalyear = 2018, value = -14211.78, balance = 111282.78
fiscalmonth = 1, fiscalyear = 2019, value = -23942.68, balance = 87340.1
fiscalmonth = 2, fiscalyear = 2019, value = -35380.81, balance = 51959.29
  • Next steps
    • Get the historical data to Aaron’s code
    • Get the predictions and intervals back
    • Store the raw data
    • update and insert the lineitems

 

Phil 1.18.19

7:00 – ASRC PhD/NASA

  • Finalized the Google AIfSG proposal with Don yesterday evening. Here’s hoping it goes in!
  • Worked on the PHP code to show the story. Converting from BBCode is a pain
  • Now that I have sign-off on the charts and have data to work with, I’m building the history ingestor that works on a set of tuples and interpolates across gaps in months. Once that code’s working, I’ll output to Excel for a sanity check (see the sketch at the end of this list)
    • Got the tuples extracted.
    • Do I need to back project to the beginning of the contract? No.
    • Discussion with Heath about how I’m just basing off the analytic_contractdata and producing predictions ONLY as lineitems. I’ll then modify the lineitems per tuple and new predictions.
  • Discussed using my toy NN to calculate hyperparameters for ARIMA
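  • A rough sketch of the Excel sanity check with pandas (the entry fields mirror the interpolation code in the 1.22.19 entry above; the stand-in data and output file name are made up):
from collections import namedtuple

import pandas as pd

# stand-in for the ingestor's interpolated entries
Entry = namedtuple("Entry", ["fiscal_year", "fiscal_month", "value", "balance"])
entries = [Entry(2017, 9, 85000.0, 85000.0), Entry(2017, 10, 85000.0, 85000.0)]

rows = [e._asdict() for e in entries]
pd.DataFrame(rows).to_excel("history_sanity_check.xlsx", index=False)  # needs openpyxl installed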

Phil 1.17.19

7:00 – 3:30 ASRC PhD, NASA

  • Lyrn.AI – Deep Learning Explained
  • Re-learning how to code in PHP, which is easier if you’ve been doing a lot of C++/Java and not so much if you’ve been doing Python. Anyway, I wrote a small class:
    class DbIO2 {
        protected $connection = NULL;
    
        function connect($db_hostname, $db_username, $db_password, $db_database){
            $toReturn = array();
            $this->connection = new mysqli($db_hostname, $db_username, $db_password, $db_database);
            if($this->connection->connect_error){
                $toReturn['connect_successful'] = false;
                $toReturn['connect_error'] = $this->connection->connect_error;
            } else {
                $toReturn['connect_successful'] = true;
            }
            return $toReturn;
        }
    
    
        function runQuery($query) {
            $toReturn = array();
            if($query == null){
                $toReturn['query_error'] = "query is empty";
                return $toReturn;
            }
            $result = $this->connection->query($query);
    
            if (!$result) {
                $toReturn['database_access'] = $this->connection->error;
                return $toReturn;
            }
    
            $numRows = $result->num_rows;
    
            for ($j = 0; $j < $numRows; ++$j) {
                $result->data_seek($j);
                $row = $result->fetch_assoc();
                $toReturn[$j] = $row;
            }
            return $toReturn;
        }
    }
  • And exercised it
    require_once '../../phpFiles/ro_login.php';
    require_once '../libs/io2.php';
    
    $dbio = new DbIO2();
    
    $result = $dbio->connect($db_hostname, $db_username, $db_password, $db_database);
    
    printf ("%s\n",json_encode($result));
    
    $result = $dbio->runQuery("select * from post_view");
    
    foreach ($result as $row)
        printf ("%s\n", json_encode($row));
  • Which gave me some results
    {"connect_successful":true}
    {"post_id":"4","post_time":"2018-11-27 16:00:27","topic_id":"4","topic_title":"SUBJECT: 3 Room Linear Dungeon Test 1","forum_id":"14","forum_name":"DB Test","username":"dungeon_master1","poster_ip":"71.244.249.217","post_subject":"SUBJECT: 3 Room Linear Dungeon Test 1","post_text":"POST: dungeon_master1 says that you are about to take on a 3-room linear dungeon."}
    {"post_id":"5","post_time":"2018-11-27 16:09:12","topic_id":"4","topic_title":"SUBJECT: 3 Room Linear Dungeon Test 1","forum_id":"14","forum_name":"DB Test","username":"dungeon_master1","poster_ip":"71.244.249.217","post_subject":"SUBJECT: dungeon_master1's introduction to room_0","post_text":"POST: dungeon_master1 says, The party now finds itself in room_0. There is a troll here."}
    (repeat for another 200+ lines)
  • So I’m well on my way to being able to show the stories (both from phpBB and Slack) on the Antibubbles “stories” page.

4:00 – 5:00 Meeting with Don

Phil 1.16.19

7:00 – 5:00 ASRC NASA

  • Starting to take a deep look at Slack as another Antibubbles RPG dungeon. From yesterday’s post
    • You can download conversations as JSON files, and I’d need to build (or find) a dice bot.
    • Created Antibubbles.slack.com
    • Ok, getting at the data is trivial. An admin can just go to antibubbles.slack.com/services/export. You get a nice zip file that contains everything that you need to reconstruct users and conversations (screenshot: slack).
    • The data is pretty straightforward too. Here’s the JSON file that has my first post in test-dungeon-1:
      {
          "client_msg_id": "41744548-2c8c-4b7e-b01a-f7cba402a14e",
          "type": "message",
          "text": "SUBJECT: dungeon_master1's introduction to the dungeon\n\tPOST: dungeon_master1 says that you are about to take on a 3-room linear dungeon.",
          "user": "UFG26JUS3",
          "ts": "1547641117.000400"
      }

      So we have the dungeon (the directory/file), unique ids for the message and user, the text, and a timestamp. I’m going to do a bit more reading and then look into getting the Chat & Slash App (a minimal parsing sketch is at the end of this list).

    • Looking at the Workspace Admin page. Trying to see where the IRB can be presented.
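    • A minimal sketch of walking the export with Python, under the assumptions above (one directory per channel, per-day JSON files of message objects with "user", "text", and "ts"; the unzip location is made up):
      import json
      from datetime import datetime
      from pathlib import Path

      export_dir = Path("antibubbles-slack-export")  # hypothetical unzip location
      for channel_dir in sorted(p for p in export_dir.iterdir() if p.is_dir()):
          for day_file in sorted(channel_dir.glob("*.json")):
              for msg in json.loads(day_file.read_text(encoding="utf-8")):
                  if msg.get("type") != "message":
                      continue
                  when = datetime.fromtimestamp(float(msg["ts"]))
                  print(channel_dir.name, when, msg.get("user"), msg.get("text", "")[:60])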
  • More work on getting the historical data put into a reasonable format. Put together a spreadsheet with the charts for all permutations of fundcode/project/contract for discussion tomorrow.
  • Updated the AI for social good proposal. Need to get the letter signed by myself and Aaron tomorrow.
  • Pytorch tutorial, with better variable names than usual

Phil 1.15.19

7:00 – 3:00 ASRC NASA

  • Cool antibubbles thing (image: artboard 1)
  • Also, I looked into a Slack version of Antibubbles. You can download conversations as JSON files, and I’d need to build (or find) a dice bot.
  • Fake News, Real Money: Ad Tech Platforms, Profit-Driven Hoaxes, and the Business of Journalism
    • Following the viral spread of hoax political news in the lead-up to the 2016 US presidential election, it’s been reported that at least some of the individuals publishing these stories made substantial sums of money—tens of thousands of US dollars—from their efforts. Whether or not such hoax stories are ultimately revealed to have had a persuasive impact on the electorate, they raise important normative questions about the underlying media infrastructures and industries—ad tech firms, programmatic advertising exchanges, etc.—that apparently created a lucrative incentive structure for “fake news” publishers. Legitimate ad-supported news organizations rely on the same infrastructure and industries for their livelihood. Thus, as traditional advertising subsidies for news have begun to collapse in the era of online advertising, it’s important to understand how attempts to deal with for-profit hoaxes might simultaneously impact legitimate news organizations. Through 20 interviews with stakeholders in online advertising, this study looks at how the programmatic advertising industry understands “fake news,” how it conceptualizes and grapples with the use of its tools by hoax publishers to generate revenue, and how its approach to the issue may ultimately contribute to reshaping the financial underpinnings of the digital journalism industry that depends on the same economic infrastructure.
  • The structured backbone of temporal social ties
    • In many data sets, information on the structure and temporality of a system coexists with noise and non-essential elements. In networked systems for instance, some edges might be non-essential or exist only by chance. Filtering them out and extracting a set of relevant connections is a non-trivial task. Moreover, methods put forward until now do not deal with time-resolved network data, which have become increasingly available. Here we develop a method for filtering temporal network data, by defining an adequate temporal null model that allows us to identify pairs of nodes having more interactions than expected given their activities: the significant ties. Moreover, our method can assign a significance to complex structures such as triads of simultaneous interactions, an impossible task for methods based on static representations. Our results hint at ways to represent temporal networks for use in data-driven models.
  • Brandon Rohrer – Data Science and Robots
  • Physical appt?
  • Working on getting the histories calculated and built
    • Best contracts are: contract 4 = 6, contract 5 = 9,  contract 12 = 10, contract 18 = 140
    • Lots of discussion on how exactly to do this. I think at this point I’m waiting on Heath to pull some new data that I can then export to Excel and play with to see the best way of doing things

Phil 1.14.19

7:00 – 5:00 ASRC NASA

  • Artificial Intelligence in the Age of Neural Networks and Brain Computing
    • Artificial Intelligence in the Age of Neural Networks and Brain Computing demonstrates that existing disruptive implications and applications of AI is a development of the unique attributes of neural networks, mainly machine learning, distributed architectures, massive parallel processing, black-box inference, intrinsic nonlinearity and smart autonomous search engines. The book covers the major basic ideas of brain-like computing behind AI, provides a framework to deep learning, and launches novel and intriguing paradigms as future alternatives.
  • Sent Aaron Mannes the iConference and SASO papers
  • Work on text analytics
    • Extract data by groups, group, user and start looking at cross-correlations (a rough sketch is at the end of this list)
      • Continued modifying post_analyzer.py
      • Commenting out TF-IDF and coherence for a while?
  • Registered for iConference
  • Renew passport!
  • Current thinking on the schema (diagram: db_diagram).
  • Making progress on the python to write lineitems and prediction history entries
  • Meeting with Don
    • Got most of the paperwork in line and then went over the proposal. I need to make changes to the text based on Don’s suggestions.
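  • A rough sketch of the cross-correlation idea from the text analytics item above (the post data and user names are toy stand-ins; days with no posts are simply absent in this version):
import numpy as np
import pandas as pd

# toy stand-in for the phpbb post data: who posted when
posts = pd.DataFrame({
    "username": ["dungeon_master1", "player_1", "player_1",
                 "player_1", "dungeon_master1", "dungeon_master1"],
    "post_time": pd.to_datetime(["2018-11-27 16:00", "2018-11-27 18:10",
                                 "2018-11-27 19:00", "2018-11-28 09:30",
                                 "2018-11-28 10:00", "2018-11-28 11:15"]),
})

# per-day post counts, one column per user
counts = (posts.groupby([posts["post_time"].dt.date, "username"])
               .size()
               .unstack(fill_value=0))

a = counts["dungeon_master1"].to_numpy(dtype=float)
b = counts["player_1"].to_numpy(dtype=float)
xcorr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
print(xcorr)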

Phil 1.10.19

7:00 – 4:00 ASRC

  • The fragility of decentralised trustless socio-technical systems
    • The blockchain technology promises to transform finance, money and even governments. However, analyses of blockchain applicability and robustness typically focus on isolated systems whose actors contribute mainly by running the consensus algorithm. Here, we highlight the importance of considering trustless platforms within the broader ecosystem that includes social and communication networks. As an example, we analyse the flash-crash observed on 21st June 2017 in the Ethereum platform and show that a major phenomenon of social coordination led to a catastrophic cascade of events across several interconnected systems. We propose the concept of “emergent centralisation” to describe situations where a single system becomes critically important for the functioning of the whole ecosystem, and argue that such situations are likely to become more and more frequent in interconnected socio-technical systems. We anticipate that the systemic approach we propose will have implications for future assessments of trustless systems and call for the attention of policy-makers on the fragility of our interconnected and rapidly changing world.
  • Realized this morning that the weight matrix is a connectivity matrix between the neurons. That means there are some very interesting things we could do with partially connected layers, like sending signals just to adjacent downstream nodes in 2D – nD (a quick sketch is at the end of this list).
  • More DNN post. Need to incorporate neuron graphs with the weight graphs, and update the sections of code about graphing. Done! And yet, somehow I’m still tweaking…
  • Working on the NoI. It’s a grind… Done? Sent off to John D.
  • Back to Docker
    • docker build -t friendlyhello . # Create image using this directory’s Dockerfile
    • docker run -p 4000:80 friendlyhello # Run “friendlyhello” mapping port 4000 to 80
    • docker run -d -p 4000:80 friendlyhello # Same thing, but in detached mode
    • docker container ls # List all running containers
    • docker container ls -a # List all containers, even those not running
    • docker container stop <hash> # Gracefully stop the specified container
    • docker container kill <hash> # Force shutdown of the specified container
    • docker container rm <hash> # Remove specified container from this machine
    • docker container rm $(docker container ls -a -q) # Remove all containers
    • docker image ls -a # List all images on this machine
    • docker image rm <image id> # Remove specified image from this machine
    • docker image rm $(docker image ls -a -q) # Remove all images from this machine
    • docker login # Log in this CLI session using your Docker credentials
    • docker tag <image> username/repository:tag # Tag <image> for upload to registry
    • docker push username/repository:tag # Upload tagged image to registry
    • docker run username/repository:tag # Run image from a registry
  • Ok, let’s see how to integrate with IntelliJ – Nope, reworking the data structures for better queries (and best practices as well). Sigh.
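  • A quick numpy sketch of the masked-weight idea above (sizes and band width are arbitrary; this is just the connectivity-matrix view, not the code from the DNN post):
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, band = 8, 8, 1

# dense weights, then a 1D "adjacent downstream nodes" connectivity mask
weights = rng.normal(size=(n_in, n_out))
mask = np.zeros_like(weights)
for i in range(n_in):
    lo, hi = max(0, i - band), min(n_out, i + band + 1)
    mask[i, lo:hi] = 1.0

weights *= mask                                   # partially connected layer
activations = np.tanh(rng.normal(size=(1, n_in)) @ weights)
print(activations)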

Phil 1.9.19

ASRC NASA(?) 7:00 – 6:30

  • Selective Exposure to Misinformation: Evidence from the consumption of fake news during the 2016 U.S. presidential campaign
    • Though some warnings about online “echo chambers” have been hyperbolic, tendencies toward selective exposure to politically congenial content are likely to extend to misinformation and to be exacerbated by social media platforms. We test this prediction using data on the factually dubious articles known as “fake news.” Using unique data combining survey responses with individual-level web traffic histories, we estimate that approximately 1 in 4 Americans visited a fake news website from October 7-November 14, 2016. Trump supporters visited the most fake news websites, which were overwhelmingly pro-Trump. However, fake news consumption was heavily concentrated among a small group — almost 6 in 10 visits to fake news websites came from the 10% of people with the most conservative online information diets. We also find that Facebook was a key vector of exposure to fake news and that fact-checks of fake news almost never reached its consumers.
  • Need to write justifications for Don – Done
  • More DNN from scratch
    • Added plotting of neurons converging to values. Now I need to change the writeup
  • Aaron’s sick. Not sure what the task for today should be. Antibubbles?
  • Downloading and installing Docker
    • Has to run as admin
    • Got Hello world running after getting this error:
      • C:\Windows\System32>docker run hello-world
      • docker: error during connect: Post http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.39/containers/create: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
      • See ‘docker run --help’.
    • You have to run the “Docker” app
    • Created a “Hello World” in python and containerized it. It runs!
    • Had to set up virtualization on the laptop
  • Connected to the ASRC gitlab and set up the IDE to use it
  • Write up a 250 word Notice of Intent
    • Notice of Intent (NOI) to Propose: Material in a NOI is confidential and will be used for NASA planning purposes only, unless otherwise stated in the FA. An NOI is submitted by logging into NSPIRES at http://nspires.nasaprs.com. Space is provided for the applicant to provide, at a minimum, the following information, although additional special requests may also be indicated:
      • A Short Title of the anticipated proposal (50 characters or less);
      • A Full Title of the anticipated proposal (which should not exceed 254 characters and is of a nature that is understandable by a scientifically trained person);
      • A brief description of the primary research area(s) and objective(s) of the anticipated work (the information in this item does not constrain in any way the proposal summary that must be submitted with the final proposal); and
      • The names of any Co-Is and/or Collaborators as known at the time the NOI is submitted. In order to enter these names, those team members must have previously accessed and registered in NSPIRES themselves; a Principal Investigator (PI) cannot do this for them.
  • Meeting with Shimei. Long! Discussed the NN code, RPGs and D&D, mapmaking
    • Send list of map quality markers from dissertation
    • Send some links about D&D and Play-by-post

Phil 1.8.19

7:00 – ASRC NASA

  • Software meeting at 9:00 in Beltsville
    • Products group
    • Attach an adder to the overhead?
    • 12.5% per bill on contract, so 10 contracts support one person
    • Currently covered for 3 months
    • AIMS 2019, TACLAMBDA have been approved (for the next 3 months?)
    • $300k from corporate across all groups.
    • Tasking for the next three months
    • Taking 2 modules out of A2P and making them compatible with AIMS.
    • Erik Velte runs TACLAMBDA
    • Evaluate the modules within A2P and migrate to TACLAMBDA (90% Phil)
    • Some kind of machine learning for visa applications (RFP)?
    • Machine learning BAA?
    • We’re all ASTS, with TS signed by Eric/T
    • JPSS/NPP – changing from instrument data to telemetry
  • Sprint planning Meeting
  • Working on nn blog post

Phil 1.7.19

7:00 – 5:00 ASRC

  • Call Tim – The week looks dry
  • Schedule Physical – try tomorrow?
  • Continue with A guided tour through a dirt-simple “deep” neural network. Finished learning, started graphing
  • Downloaded the latest antibubbles and ran processing
  • More financial forecasting?
  • Sprint review?
    • Prepping by adding in all the things that I wound up doing
  • Worked on getting Aaron’s code working, which required installing MSVC 2017, which in turn required redistributing apps to clear up space on the SSD drive.