7:00 –
- Learning: Neural Nets, Back Propagation
- Synaptic weights are higher for some synapses than others
- Cumulative stimulus
- All-or-none threshold for propagation.
- Once we have a model, we can ask what we can do with it.
- Now I’m curious about the MIT approach to calculus. It’s online too: MIT 18.01 Single Variable Calculus
- Back-propagation algorithm. Starts from the end (the output) and works backward toward the input, so that each new calculation depends only on its local information plus values that have already been calculated.
- Overfitting and under-/over-damping are also considerations.
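The lecture ideas above (synaptic weights, cumulative stimulus, a threshold-like activation, and backprop working backward from the output) can be sketched numerically. This is a minimal toy, not the lecture's actual code: a 2-input, 1-hidden-unit, 1-output net with arbitrary weights, using a sigmoid as a smooth stand-in for the all-or-none threshold.

```python
import math

def sigmoid(z):
    # Smooth stand-in for the all-or-none propagation threshold
    return 1.0 / (1.0 + math.exp(-z))

# Tiny 2-input -> 1-hidden -> 1-output network; weights are arbitrary.
w_h = [0.5, -0.3]   # input -> hidden synaptic weights
w_o = 0.8           # hidden -> output weight
x = [1.0, 2.0]
target = 1.0

def forward(w_h, w_o, x):
    h_in = sum(w * xi for w, xi in zip(w_h, x))  # cumulative stimulus
    h = sigmoid(h_in)
    y = sigmoid(w_o * h)
    return h, y

def backward(w_h, w_o, x, target):
    # Backprop: start at the output and work backward, so each step
    # reuses only local values plus quantities already computed downstream.
    h, y = forward(w_h, w_o, x)
    err = y - target                   # d(loss)/dy for loss = (y - t)^2 / 2
    d_out = err * y * (1 - y)          # delta at the output node
    g_wo = d_out * h                   # gradient for the output weight
    d_hid = d_out * w_o * h * (1 - h)  # delta pushed back to the hidden node
    g_wh = [d_hid * xi for xi in x]    # gradients for the input weights
    return g_wh, g_wo

g_wh, g_wo = backward(w_h, w_o, x, target)
print("input-weight gradients:", g_wh, "output-weight gradient:", g_wo)
```

Subtracting a small multiple of these gradients from the weights and repeating is the usual training loop; a finite-difference check confirms the backward pass matches the true slope of the loss.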
- Scrum meeting
- Remember to bring a keyboard tomorrow!!!!
- Checking that my home dev code is the same as what I pulled down from the repository
- No change in DefinitelyTyped
- No change in the other files either, so those were real bugs. Don’t know why they didn’t get caught. But that means the repo is good and the bugs are fixed.
- Validate that PHP runs and debugs in the new dev env. Done
- Add a new test that inputs large numbers (thousands to millions) of unique ENTITY entries with small-ish star networks of partially shared URL entries. Time view retrieval with SELECT COUNT(*) FROM tn_view_network_items WHERE network_id = 8;
- Computer: 2008 Dell Precision M6300
- System: Intel(R) Core(TM)2 Duo CPU T7500 @ 2.20GHz, 2201 MHz, 2 cores, 2 logical processors, 611 MB available physical memory
- 100 rows: 0.09 sec
- 1,000 rows: 0.14 sec
- 10,000 rows: 0.84 sec
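The real timings were taken against the project's database; as a self-contained sketch of the same kind of measurement, here is a Python harness using an in-memory SQLite database. The table name and network_id follow the notes, but the schema is hypothetical (the real tn_view_network_items is a view, not a plain table).

```python
import sqlite3
import time

# Hypothetical stand-in for the project's view: a plain table we can populate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tn_view_network_items (network_id INTEGER, item TEXT)")

def time_count(n):
    # Load n rows for network 8, then time the COUNT(*) from the notes.
    conn.execute("DELETE FROM tn_view_network_items")
    conn.executemany(
        "INSERT INTO tn_view_network_items VALUES (8, ?)",
        ((f"item-{i}",) for i in range(n)),
    )
    start = time.perf_counter()
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM tn_view_network_items WHERE network_id = 8"
    ).fetchone()
    elapsed = time.perf_counter() - start
    return count, elapsed

for n in (100, 1000, 10000):
    count, elapsed = time_count(n)
    print(f"{n:>6} rows: {elapsed:.4f} sec")
```

Absolute numbers from SQLite won't match the figures above, but the same load-then-time loop is how the scaling data was gathered.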
- Using OpenOffice's linear regression function, I get the equation t = 0.00007657x + 0.0733 with an R squared of 0.99948.
- That means 1,000,000 view entries can be processed in about 77 seconds, as long as things don't get IO bound
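The spreadsheet fit can be double-checked by hand. This redoes the least-squares regression on the three measured timings and extrapolates to a million rows; the slope and R squared come out matching the OpenOffice result.

```python
# Least-squares fit of time vs. row count from the three measurements above.
xs = [100, 1000, 10000]   # row counts
ts = [0.09, 0.14, 0.84]   # measured seconds

n = len(xs)
mx = sum(xs) / n
mt = sum(ts) / n
sxx = sum((x - mx) ** 2 for x in xs)
sxt = sum((x - mx) * (t - mt) for x, t in zip(xs, ts))
slope = sxt / sxx
intercept = mt - slope * mx

# Coefficient of determination (R squared)
ss_res = sum((t - (slope * x + intercept)) ** 2 for x, t in zip(xs, ts))
ss_tot = sum((t - mt) ** 2 for t in ts)
r2 = 1 - ss_res / ss_tot

pred_million = slope * 1_000_000 + intercept
print(f"t = {slope:.8f}x + {intercept:.4f}, R^2 = {r2:.5f}")
print(f"predicted for 1,000,000 rows: {pred_million:.1f} sec")
```

With three points the fit is dominated by the slope term, so the million-row extrapolation is essentially slope times a million: roughly 77 seconds, assuming the linear trend holds and nothing goes IO bound.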
- Got the PHP interpreter and debugger working. In this case, it was just refreshing in settings->languages->php
