This is an incredible list of research, TYSM! In my spare time at work I've been building a small tool that tries to accomplish what #2 describes. I haven't clicked through and read it yet, but now I'm going to read everything.
I played around with implementing the recursive language model paper, and that actually turned out pretty well https://git.sr.ht/~yogthos/matryoshka
Basically, I spin up a JS REPL in a sandbox, and the agent can feed files into it and run commands against them. Normally the agent has to ingest the whole file into its context, but now it can just shove files into the REPL and run operations on them, almost like querying a database. It can also create variables: if it searches for something in a file, it can bind the result to a variable and keep track of it, and if it needs to filter that search later, it just references the variable it already made. This saves a huge amount of tokens and also helps the model stay more focused.
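To make that concrete, here's roughly the kind of thing the agent ends up running inside the sandboxed REPL. This is just an illustrative sketch, not the tool's actual API; the file name and the specific calls are made up for the example:

```javascript
// Hypothetical REPL session: the file is loaded as a variable instead of
// being pasted into the model's context window.
const fs = require('fs');

// "app.log" is a placeholder file name for the sake of the example.
const log = fs.readFileSync('app.log', 'utf8').split('\n');

// First query: bind the search result to a variable and keep it around.
const errors = log.filter(line => line.includes('ERROR'));

// Later query: refine the earlier result by referencing the variable,
// without re-reading the file. Only this small summary goes back to the model.
const dbErrors = errors.filter(line => line.includes('database'));
console.log(`${dbErrors.length} database errors, first: ${dbErrors[0]}`);
```

The point is that only the final console output has to re-enter the model's context; the raw file contents and intermediate results stay in the REPL as variables.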
About how large are the codebases you've used this RLM with?
Around 10k lines or so. I use it as an MCP tool that the agent calls when it decides it needs to. The whole codebase doesn't get loaded into the REPL, just individual files as it searches through them.