Author Archives: pgfeldman

Phil 12.6.2025

Time, space, memory and brain–body rhythms

  • Time and space are crucial concepts in neuroscience, because our personal memories are tied to specific events that occur ‘in’ a particular space and on a ‘timeline’. Thus, we seek to understand how the brain constructs time and space and how these are related to episodic memory. Place cells and time cells have been identified in the brain and have been proposed to ‘represent’ space and time via single-neuron or population coding, thus acting as hypothetical coordinates within a Newtonian framework of space and time. However, there is a fundamental tension between the linear and unidirectional flow of physical time and the variable nature of experienced time. Moreover, modern physics no longer views space as a fixed container and time as something in which events occur. Here, I articulate an alternative view: that time (physical and experienced) is an abstracted relational measure of change. Physical time is measured using arbitrary units and artificial clocks, whereas experienced time is linked to a hierarchy of brain–body rhythms that provide a range of reference scales that reflect the full span of experienced time. Changes in body and brain circuits, tied to these rhythms, may be the source of our subjective feeling of time.

Neurophysiology of Remembering

  • By linking the past with the future, our memories define our sense of identity. Because human memory engages the conscious realm, its examination has historically been approached from language and introspection and proceeded largely along separate parallel paths in humans and other animals. Here, we first highlight the achievements and limitations of this mind-based approach and make the case for a new brain-based understanding of declarative memory with a focus on hippocampal physiology. Next, we discuss the interleaved nature and common physiological mechanisms of navigation in real and mental spacetime. We suggest that a distinguishing feature of memory types is whether they subserve actions for single or multiple uses. Finally, in contrast to the persisting view of the mind as a highly plastic blank slate ready for the world to make its imprint, we hypothesize that neuronal networks are endowed with a reservoir of neural trajectories, and the challenge faced by the brain is how to select and match preexisting neuronal trajectories with events in the world.

If I’m reading this right, bias is a function of neurophysiological alignment. Which is wild, but makes sense.

Tasks

  • Email to hotel – done
  • Chores – done
  • Laundry – done
  • Groceries – done
  • And a COLD, short ride.

Phil 12.5.2025

Tasks

  • Checked the keybox, and it appears to be broken. Good nibble yesterday though
  • Email to hotel
  • Bills – done
  • Pay Barbara – done
  • Chores
  • NO YARDWORK BECAUSE IT’S SNOWING
  • Leave for Suz’s at 1:00-ish – done! Fun! Yum!

SBIRs

  • Struggled to get the VPN working so I can check on the instance. Nope. Put in a ticket at around 8:30. If this goes on, I’m going to have to rework my dev environment and will probably need to put together a story for that. Got a response at 4:09 pm on how to get a new cert installed. Not the best use of a day.

Phil 12.4.2025

Tasks

  • Bennie and Phil’s trip to Terry’s
  • Driver’s license?
  • Ping KP at 8:00

SBIRs

  • See if I can get the clustering trajectories to run along the time axis as well
  • Monitor Gutenberg download – seems to have broken on the AWS instance? Restarted.
  • The index2vec looks pretty similar to the straight embeddings. Not sure if this is the way I want to go or not:

Index2Vec embeddings look a lot like sentence embeddings, but narrower. Maybe.
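
One hypothetical way to put a number on “looks pretty similar”: check whether pairwise distances between cluster centroids in the original embedding space track pairwise distances between the same clusters’ Index2Vec vectors. A minimal sketch with stand-in arrays; nothing here is the project’s actual code:

```python
# Sketch: do the index2vec vectors preserve the geometry of the original embedding space?
# `centroids` = per-cluster mean sentence embeddings, `i2v` = index2vec vectors for the
# same clusters in the same order. Both are random stand-ins here.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
centroids = rng.normal(size=(40, 384))   # stand-in for cluster centroids in embedding space
i2v = rng.normal(size=(40, 3))           # stand-in for index2vec vectors (e.g., a 3D model)

d_embed = pdist(centroids, metric="cosine")   # pairwise distances, embedding space
d_i2v = pdist(i2v, metric="euclidean")        # pairwise distances, index2vec space

rho, p = spearmanr(d_embed, d_i2v)
print(f"rank correlation between the two distance structures: {rho:.3f} (p={p:.3g})")
```

A high rank correlation would mean the index2vec space is mostly reproducing the sentence-embedding geometry; a low one would mean it is capturing something different, such as narrative adjacency rather than semantic similarity.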

Phil 12.3.2025

Tasks

  • 10:30 – 11:00 showing

SBIRs

  • I’m trying to download the Gutenberg corpus, but my instance appears to be broken. First, I couldn’t SSH in at all. After restarting, I can, but all my project directories are missing.
  • Because of that, I decided to download locally and figure out how to move the data later. Since rsync isn’t available on Windows, I’m running the download in WSL. Which works, but not over the company VPN. And since I need the VPN to log into my instance, I can’t run it on my company box. Instead, I set it up on my dev laptop, which is happily cooking along. JFC.

LLM stuff

  • 2:30 Biweekly meeting – done. Shimei suggested using the 2D embeddings with the Z-axis for time, which looks fantastic! (A sketch of that kind of plot follows the figures below.)

Paragraph-level embedding

Sentence-level embedding
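
A minimal sketch of that Z-axis-for-time plot with matplotlib, assuming the 2D reduced coordinates are already computed. The array names and the random stand-in data are placeholders, not project code:

```python
# Sketch: plot 2D embedding coordinates with the Z-axis as (paragraph) time.
# `xy` stands in for an (N, 2) array of reduced embeddings in document order.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
xy = rng.normal(size=(50, 2)).cumsum(axis=0)   # stand-in for 2D UMAP output
t = np.arange(len(xy))                         # time = position in the story

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(xy[:, 0], xy[:, 1], t, alpha=0.5)                # trajectory line
ax.scatter(xy[:, 0], xy[:, 1], t, c=t, cmap="viridis")   # points colored by time
ax.set_xlabel("dim 1")
ax.set_ylabel("dim 2")
ax.set_zlabel("time (paragraph index)")
plt.show()
```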

Phil 12.2.2025

Got a lot done on P33 on my cruise!

Tasks

  • Showing 5:00 – 6:00

SBIRs

  • Pinged Tivern about dates
  • Chatted with Aaron about the project
  • Need to see if I can pull down the Gutenberg corpora – started! Big!

ZZZzzzzZZZZzzzz

Phil 12.1.2025

Cycling in Mauritius is perfect for cyclists who love beautiful, tropical islands and warm year-round cycling conditions.

Tasks

  • A bit of house cleaning and demo prep – and leaving for hours! Got to see the first episode of Andor though, and worked out some travel plans
  • Organize conclusions

SBIRs

  • Figure out where I was!

Phil 11.27.2025

Happy Tday to those who celebrate!

Early science acceleration experiments with GPT-5

  • AI models like GPT-5 are an increasingly valuable tool for scientists, but many remain unaware of the capabilities of frontier AI. We present a collection of short case studies in which GPT-5 produced new, concrete steps in ongoing research across mathematics, physics, astronomy, computer science, biology, and materials science. In these examples, the authors highlight how AI accelerated their work, and where it fell short; where expert time was saved, and where human input was still key. We document the interactions of the human authors with GPT-5, as guiding examples of fruitful collaboration with AI. Of note, this paper includes four new results in mathematics (carefully verified by the human authors), underscoring how GPT-5 can help human mathematicians settle previously unsolved problems. These contributions are modest in scope but profound in implication, given the rate at which frontier AI is progressing.

CIFAR10 hyperlightspeedbench is a neural network implementation of a very speedily-training network that originally started as a painstaking reproduction of David Page’s original ultra-fast CIFAR-10 implementation on a single GPU, but written nearly from the ground-up to be extremely rapid-experimentation-friendly. Part of the benefit of this is that we now hold the world record for single GPU training speeds on CIFAR10, for example.

What we’ve added:

  • custom architecture that is somehow even faster
  • way too much hyperparameter tuning
  • miscellaneous architecture trimmings (see the patch notes)
  • memory format changes (and more!) to better use tensor cores/etc
  • dirac initializations on non-depth-transitional layers (information passthrough on init)
  • and more!

What we’ve removed:

  • explicit residual layers. yep.

This code, in comparison to David’s original code, is in a single file and extremely flat, but is not as durable for long-term production-level bug maintenance. You’re meant to check out a fresh repo whenever you have a new idea. It is excellent for rapid idea exploring — almost everywhere in the pipeline is exposed and built to be user-friendly. I truly enjoy personally using this code, and hope you do as well! 😀 Please let me know if you have any feedback. I hope to continue publishing updates to this in the future, so your support is encouraged. Share this repo with someone you know that might like it!
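
Two items in that list are easy to illustrate in isolation. A hedged PyTorch sketch (not the repo’s actual code) of a Dirac initialization, which makes a same-width conv layer start out as a pure passthrough, and of the channels-last memory format change that helps tensor-core utilization:

```python
# Sketch only, not hlb-CIFAR10 code: Dirac init makes a same-width conv an identity map
# at init (information passthrough), and channels_last reorders memory so convolutions
# map better onto tensor cores.
import torch
import torch.nn as nn

conv = nn.Conv2d(64, 64, kernel_size=3, padding=1, bias=False)
nn.init.dirac_(conv.weight)   # center tap of input channel i -> output channel i, all else zero

x = torch.randn(8, 64, 32, 32)
print(torch.allclose(conv(x), x, atol=1e-5))   # True: the layer is a no-op at init

# channels_last memory format (and matching input layout)
conv = conv.to(memory_format=torch.channels_last)
x = x.to(memory_format=torch.channels_last)
y = conv(x)
```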

Phil 11.22.2025

I did a Big Deal Thing. Anyone want a nice house in Catonsville, MD?

Tasks

  • Print tags – done
  • Pack – done
  • Moar paperwork
  • Get a ride in? Yes!

Phil 11.21.2025

I think the Aztecs had it right about winter. Their year was 18 months of 20 days, with 5 days at the winter solstice to try to get the sun to start rising earlier. Their methods were horrific, but I can appreciate the sentiment.

Tasks

  • Bills – done
    • Ricardo, Sande, and Edwin’s – done
  • Print tags
  • Chores – done
  • Dishes – done
  • Lawn – done
  • Keys – done
  • Shelving – done
  • Storage run
  • Recycling – done
  • Light groceries

Phil 11.20.2025

What Donald Trump Has Taught Us about American Political Institutions | Political Science Quarterly | Oxford Academic

  • Generations of political scientists have viewed the American constitutional system and its surrounding pluralist civil society as stable touchstones that safeguard against the threat of authoritarian leadership. Capitalizing on changes that go back several decades—the rise of nationalized polarization, the development of the unitary executive theory, and the growing sway of populist conservatives within the Republican Party—Donald Trump has demonstrated that the sources of countervailing power in the U.S. political system are far more fragile than previously understood. Trump has prevailed upon congressional Republicans to surrender their core constitutional responsibilities, has eviscerated critical foundations of the modern administrative state, and upended the relationship between the federal government and major civil society actors. Political scientists did not anticipate the potential for democratic breakdown that has emerged; we must now direct our energies to understanding this new constellation of power, as well as the pathways available for opponents to respond.

SBIRs

  • 9:00 Standup – done
  • 11:00 Phase2+ – done
  • 4:00 MDA – done
  • Create walk sequences of cluster trajectories (ignore -1) and make an index2vec model. Let’s see what it looks like!
    • Wrote a create_csv_from_story_embedding.py script that creates a walk-sequence CSV file and a model card
    • Trained a 2D and 3D model!
    • Got everything working! The embeddings are different from the topic embeddings; they appear to be more linear-ish. I think I need a good deal more data, because the 2D model seems to be better than the 3D one, and multiple points land on the same coordinate. So: 1) try smaller clusters, which should give me 20% more data right there, and 2) generate more scenarios. Looking at Gutenberg, just to see what re-embedding means, is also an appropriate next step. (A minimal sketch of the walk-sequence/index2vec idea is below.)
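
A minimal sketch of the walk-sequence → index2vec idea, assuming per-sentence HDBSCAN cluster labels in story order. The toy data and parameter values are placeholders, not what create_csv_from_story_embedding.py actually does:

```python
# Sketch of "index2vec": treat each story's sequence of cluster indices as a sentence
# and train word2vec on those walks. Toy data and parameters are placeholders.
from gensim.models import Word2Vec

# cluster label per sentence, in story order; -1 (HDBSCAN noise) gets dropped
stories = [
    [3, 3, 7, -1, 2, 2, 5],
    [1, 3, 2, 2, -1, 5, 5, 7],
]
walks = [[str(c) for c in story if c != -1] for story in stories]

# small vector_size (2 or 3) so the resulting embedding can be plotted directly
model = Word2Vec(sentences=walks, vector_size=3, window=3, min_count=1, sg=1, epochs=50)

print(model.wv["2"])              # 3D "index2vec" vector for cluster 2
print(model.wv.most_similar("2")) # clusters that tend to occur near cluster 2 in the walks
```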

Phil 11.19.2025

Need to try this: Generative UI: A rich, custom, visual interactive user experience for any prompt

  • We introduce a novel implementation of generative UI, enabling AI models to create immersive experiences and interactive tools and simulations, all generated completely on the fly for any prompt. This is now rolling out in the Gemini app and Google Search, starting with AI Mode.

Scammers net nearly $100k in Chesapeake catfish – The Baltimore Banner

  • The phone numbers checked out. The emails seemed to come from McCain Foods. Even the name on the order matched an executive at the multinational frozen foods company.

How to disable all AI stuff in Visual Studio Code

SBIRs

  • Add an “unclustered” count – done. There are 1,091 unclustered points out of a total of 5,326.
  • Work on scenario trajectories. Working! These are the raw embeddings for Scenario 2. They seem to show that even though the topics are close together, the trajectory through the space is somewhat chaotic (one way to quantify that is sketched after this list). It’ll be interesting to see what the index2vec training does:
  • Added some detail:
  • Maybe start on walk sequences – not quite.
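
One hypothetical way to quantify “somewhat chaotic” for a scenario trajectory: compare the total path length to the net start-to-end displacement (tortuosity), and look at the turning angles between successive steps. A sketch with stand-in data; this is not the project’s code:

```python
# Sketch: tortuosity and turning angles for a trajectory through embedding space.
# `traj` stands in for an (N, 2) array of reduced embeddings in story order.
import numpy as np

rng = np.random.default_rng(2)
traj = rng.normal(size=(30, 2)).cumsum(axis=0)   # stand-in trajectory

steps = np.diff(traj, axis=0)
step_len = np.linalg.norm(steps, axis=1)
path_len = step_len.sum()
net_disp = np.linalg.norm(traj[-1] - traj[0])
print(f"tortuosity (path length / net displacement): {path_len / net_disp:.2f}")

# mean turning angle between consecutive steps (~90 degrees for a random walk,
# near 0 for a straight-line trajectory)
unit = steps / step_len[:, None]
cos_turn = np.clip((unit[:-1] * unit[1:]).sum(axis=1), -1.0, 1.0)
print(f"mean turning angle: {np.degrees(np.arccos(cos_turn)).mean():.1f} degrees")
```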

GPT Agents

  • 2:30 meeting – done

Phil 11.18.2025

White nationalist talking points and racial pseudoscience: welcome to Elon Musk’s Grokipedia | Elon Musk | The Guardian

  • “Grokipedia is a copy of Wikipedia but one where in each instance that Wikipedia disagrees with the richest man in the world, it’s ‘rectified’ so that it’s congruent with them.”

Tasks

  • Send Nellie a response on lead and keys – done
  • Vanessa xfer – done!
  • See if there is anything to pull from the garden

SBIRs

  • Did some clustering on the sentence embedding data to see what it looks like. It seems as though the lowest number of dimensions results in the best clustering. Not surprising, but good to know the curse-of-dimensionality intuition holds. (A sketch of the sweep follows this list.)
  • Here’s a set of screenshots for each of the UMAP/HDBSCAN variations:
  • Write a class that reads in one scenario and then plots lines for each version. See how to animate dots that move along the lines.
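
A hedged sketch of that kind of dimensionality sweep. The umap-learn and hdbscan calls are the real library APIs, but the parameter values and the stand-in embedding array are illustrative only:

```python
# Sketch of the UMAP/HDBSCAN sweep: reduce the sentence embeddings to several
# dimensionalities and compare cluster counts and noise fractions.
import numpy as np
import umap
import hdbscan

rng = np.random.default_rng(3)
sentence_embeddings = rng.normal(size=(5326, 384))   # stand-in matching the point count mentioned above

for n_dims in (2, 3, 5, 10):
    reduced = umap.UMAP(n_components=n_dims, n_neighbors=15, min_dist=0.0,
                        metric="cosine").fit_transform(sentence_embeddings)
    labels = hdbscan.HDBSCAN(min_cluster_size=25).fit_predict(reduced)
    n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
    noise_frac = np.mean(labels == -1)
    print(f"dims={n_dims:2d}  clusters={n_clusters:3d}  unclustered={noise_frac:.1%}")
```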

Phil 11.17.2025

Disrupting the first reported AI-orchestrated cyber espionage campaign \ Anthropic

  • This campaign has substantial implications for cybersecurity in the age of AI “agents”—systems that can be run autonomously for long periods of time and that complete complex tasks largely independent of human intervention. Agents are valuable for everyday work and productivity—but in the wrong hands, they can substantially increase the viability of large-scale cyberattacks.

Tasks

  • 4:00 Meeting with Nellie – done

SBIRs

  • Slides! Done!
  • 9:00 Sprint demos – done
  • 3:00 Sprint planning – done
  • Got sentence-level embeddings done
  • Now I need to see how clustering looks. Definitely some different regions, though there may just be a big blob too.
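
For reference, a minimal sketch of how sentence-level embeddings like these are typically produced with sentence-transformers. The model name and the naive sentence splitting are assumptions, not necessarily what the project uses:

```python
# Sketch of sentence-level embedding; model choice and splitting are assumptions.
from sentence_transformers import SentenceTransformer

paragraph = ("Sentence embeddings place similar sentences near each other. "
             "Clustering the embedded sentences can surface topic-like regions. "
             "HDBSCAN labels points it can't cluster as -1.")
sentences = [s.strip() for s in paragraph.split(". ") if s.strip()]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings.shape)   # (n_sentences, 384) for this model
```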