# Phil 11.2.18

7:00 – 2:30 ASRC PhD (feeling burned out – went home early for a nap)

• Continuing with my 810 assignment. Just found out about finite semiotics, which could be useful for trustworthiness detection (e.g., variance in terminology and the speed of term adoption)
• I like this! Creating a Perceptron From Scratch
• In order to gain more insight into how Neural Networks (NNs) are created and used, we must first understand how they work. It is important to build a solid foundation for why you are doing something, instead of navigating blindly. With the ubiquity of TensorFlow and Keras, it is easy to forget what you are actually building and how best to develop your NN. For this project I will use Python to create a simple perceptron that implements the basics of back-propagation to optimize our synapse weights. I'll explain everything along the way, and I encourage you to reach out if you have any questions! I will assume no prior knowledge of NNs, but you will need some fundamentals of Python programming, basic calculus, and a bit of linear algebra. If you aren't quite sure what a NN is and how they are used in the field of AI, I encourage you to first read my article covering that topic before tackling this project. So let's get to it!
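The quoted walkthrough could be sketched roughly like this: a single sigmoid neuron trained with the chain-rule weight update. The toy dataset, learning rate, and iteration count are my own assumptions, not the article's.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy dataset (an assumption for this sketch): the target simply equals
# the first input feature, so a single neuron can learn it.
X = [[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]]
y = [0, 1, 1, 0]

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(3)]

for _ in range(10000):
    for inputs, target in zip(X, y):
        out = sigmoid(sum(w * i for w, i in zip(weights, inputs)))
        # Chain rule for squared error through the sigmoid:
        # dE/dw_j = (out - target) * out * (1 - out) * input_j
        error = out - target
        for j in range(3):
            weights[j] -= 0.5 * error * out * (1 - out) * inputs[j]

pred = [sigmoid(sum(w * i for w, i in zip(weights, inputs))) for inputs in X]
```

After training, `pred` rounds to the targets `[0, 1, 1, 0]`; the one-layer update above is the same derivative bookkeeping that back-propagation repeats layer by layer in deeper networks.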
• And this is very interesting:
• SHAP (SHapley Additive exPlanations) is a unified approach to explain the output of any machine learning model. SHAP connects game theory with local explanations, uniting several previous methods [1-7] and representing the only possible consistent and locally accurate additive feature attribution method based on expectations (see the SHAP NIPS paper for details).
• OK, back to generators. Here are several generated representations of *The Call of the Wild*:
• Tokens
```
index, token
0, quivering
1, scraped
2, introspective
3, confines
4, restlessness
5, pug
6, mandate
7, twisted
8, part
9, error
10, thong
11, resolved
12, daunted
13, spray
14, trees
15, caught
16, fearlessly
17, quite
18, soft
19, sounds
20, slaying
```
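A token table like the one above could be built by assigning each distinct word an integer index. Assigning indices in order of first appearance is an assumption here; the notes don't show the actual indexing scheme.

```python
def build_token_table(words):
    """Map each distinct token to a small integer index."""
    table = {}
    for w in words:
        if w not in table:
            table[w] = len(table)
    return table

# A few tokens from the listing above, just for illustration.
words = "quivering scraped introspective confines restlessness".split()
table = build_token_table(words)
```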
• Text sequences
```
# config: {"sequence_length": 10, "step": 1, "type": "words"}
buck, did, not, read, the, newspapers, or, he, would, have
did, not, read, the, newspapers, or, he, would, have, known
not, read, the, newspapers, or, he, would, have, known, that
read, the, newspapers, or, he, would, have, known, that, trouble
the, newspapers, or, he, would, have, known, that, trouble, was
newspapers, or, he, would, have, known, that, trouble, was, brewing
or, he, would, have, known, that, trouble, was, brewing, not
he, would, have, known, that, trouble, was, brewing, not, alone
would, have, known, that, trouble, was, brewing, not, alone, for
have, known, that, trouble, was, brewing, not, alone, for, himself
known, that, trouble, was, brewing, not, alone, for, himself, but
that, trouble, was, brewing, not, alone, for, himself, but, for
trouble, was, brewing, not, alone, for, himself, but, for, every
was, brewing, not, alone, for, himself, but, for, every, tidewater
brewing, not, alone, for, himself, but, for, every, tidewater, dog
not, alone, for, himself, but, for, every, tidewater, dog, strong
alone, for, himself, but, for, every, tidewater, dog, strong, of
for, himself, but, for, every, tidewater, dog, strong, of, muscle
himself, but, for, every, tidewater, dog, strong, of, muscle, and
```

• Index sequences
```
# config: {"sequence_length": 10, "step": 1, "type": "integer"}
4686, 1720, 283, 1432, 1828, 1112, 4859, 3409, 3396, 379
1720, 283, 1432, 1828, 1112, 4859, 3409, 3396, 379, 4004
283, 1432, 1828, 1112, 4859, 3409, 3396, 379, 4004, 3954
1432, 1828, 1112, 4859, 3409, 3396, 379, 4004, 3954, 4572
1828, 1112, 4859, 3409, 3396, 379, 4004, 3954, 4572, 4083
1112, 4859, 3409, 3396, 379, 4004, 3954, 4572, 4083, 3287
4859, 3409, 3396, 379, 4004, 3954, 4572, 4083, 3287, 283
3409, 3396, 379, 4004, 3954, 4572, 4083, 3287, 283, 1808
3396, 379, 4004, 3954, 4572, 4083, 3287, 283, 1808, 975
379, 4004, 3954, 4572, 4083, 3287, 283, 1808, 975, 532
4004, 3954, 4572, 4083, 3287, 283, 1808, 975, 532, 973
3954, 4572, 4083, 3287, 283, 1808, 975, 532, 973, 975
4572, 4083, 3287, 283, 1808, 975, 532, 973, 975, 4678
4083, 3287, 283, 1808, 975, 532, 973, 975, 4678, 3017
3287, 283, 1808, 975, 532, 973, 975, 4678, 3017, 2108
283, 1808, 975, 532, 973, 975, 4678, 3017, 2108, 984
1808, 975, 532, 973, 975, 4678, 3017, 2108, 984, 1868
975, 532, 973, 975, 4678, 3017, 2108, 984, 1868, 3407
```
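A generator producing both listings could look roughly like this: a sliding window of `sequence_length` tokens advanced by `step`, emitted either as words or as vocabulary indices. The function name and the first-seen index assignment are my own assumptions; the config keys mirror the `# config` lines above.

```python
def sequences(tokens, sequence_length=10, step=1, kind="words"):
    """Yield sliding windows over tokens, as words or integer indices."""
    vocab = {}
    if kind == "integer":
        # Assumption: indices assigned in first-seen order.
        for t in tokens:
            vocab.setdefault(t, len(vocab))
    for start in range(0, len(tokens) - sequence_length + 1, step):
        window = tokens[start:start + sequence_length]
        yield [vocab[t] for t in window] if kind == "integer" else window

text = ("buck did not read the newspapers or he would have "
        "known that trouble was brewing")
rows = list(sequences(text.split(), sequence_length=10, step=1))
# rows[0] is the first window: buck, did, not, ..., have
```

Because it's a generator, the full corpus never has to be materialized as overlapping sequences in memory; each window is produced on demand, which matters when every token starts a new sequence (`step=1`).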