Spending the day at a Navy-sponsored miniconference on AI, ethics, and the military (no wifi at Annapolis, so I’ll put up notes later). This was an odd mix of higher-level execs in suits, retirees, and midshipmen, with a few technical folks sprinkled in. It is clear that for these people, the technology is viewed as AI/ML. The idea that AI is a thing that we don’t do yet does not emerge at this level. Rather, AI is being implemented using machine learning, and in particular deep learning.
pyforest lazy-imports all popular Python Data Science libraries so that they are always there when you need them. If you don’t use a library, it won’t be imported. When you are done with your script, you can export the Python code for the import statements.
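The mechanism behind this kind of lazy importing can be sketched in a few lines. This is a hypothetical `LazyModule` class for illustration, not pyforest's actual internals: the wrapped module is only imported the first time one of its attributes is actually used.

```python
import importlib

class LazyModule:
    """Defer importing a module until an attribute is first accessed."""

    def __init__(self, name):
        self._name = name
        self._module = None  # nothing imported yet

    def __getattr__(self, attr):
        # Only called for attributes not found on the instance itself,
        # so the first real use (e.g. json.dumps) triggers the import.
        if self._module is None:
            self._module = importlib.import_module(self._name)
        return getattr(self._module, attr)

# 'json' stands in here for a heavyweight data-science library
json = LazyModule("json")
print(json.dumps({"imported": "on first use"}))
```

If the module is never touched, `importlib.import_module` never runs, which is what makes an "import everything up front" convenience layer cheap.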
Ping Antonio about TAAS. Important points are round-tripping ABS and enabling navigation as a means of prediction
Transition text from TAAS to Dissertation
Mission Drive – nope, couldn’t get in
Show Bruce model and control
pip3 install openpyxl, for some reason
Meeting with Will this evening
After a moderate amount of flailing, got his Slack message files into a database
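The gist of that load can be sketched with stdlib `json` and `sqlite3`. This assumes the simplified shape of Slack's per-day channel export files (a JSON list of message dicts with `ts`, `user`, and `text` keys; real exports carry many more fields), and uses an inline sample in place of the actual files:

```python
import json
import sqlite3

# Stand-in for one exported Slack channel file (simplified fields)
sample_export = json.dumps([
    {"ts": "1558000000.000100", "user": "U123", "text": "hello"},
    {"ts": "1558000060.000200", "user": "U456", "text": "hi back"},
])

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (ts TEXT PRIMARY KEY, user TEXT, text TEXT)"
)
# Slack's "ts" is unique per channel, so it serves as the primary key
for msg in json.loads(sample_export):
    conn.execute(
        "INSERT INTO messages (ts, user, text) VALUES (?, ?, ?)",
        (msg["ts"], msg.get("user"), msg.get("text")),
    )
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0])
```

A real export would loop over the per-channel directories and per-day files, but the insert logic is the same.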
Deep convolutional networks have become a popular tool for image generation and restoration. Generally, their excellent performance is imputed to their ability to learn realistic image priors from a large number of example images. In this paper, we show that, on the contrary, the structure of a generator network is sufficient to capture a great deal of low-level image statistics prior to any learning. In order to do so, we show that a randomly-initialized neural network can be used as a handcrafted prior with excellent results in standard inverse problems such as denoising, super-resolution, and inpainting. Furthermore, the same prior can be used to invert deep neural representations to diagnose them, and to restore images based on flash-no flash input pairs.