Eliezer Yudkowsky’s explanation of the Free Energy Principle (FEP) / Active Inference:
Our incredibly badly designed bodies do insane shit like repurposing superoxide as a metabolic signaling molecule. Our incredibly badly designed brains have some subprocesses that take a bit of predictive machinery lying around and repurpose it to send a control signal, which is even crazier than the superoxide thing, which is pretty crazy.

Prediction and planning remain incredibly distinct as structures of cognitive work, and the people who try to deeply tie them together by writing wacky equations that sum them both together plus throwing in an entropy term, are nuts. It’s like the town which showed a sign with its elevation, population, and year founded, plus the total of those numbers.

But one reason why the malarky rings true to the knowlessones is that the incredibly badly designed human brain actually is grabbing some bits of predictive machinery and repurposing them for control signals, just like the human metabolism has decided to treat insanely reactive molecular byproducts as control signals. The other reason of course is the general class of malarky which consists of telling a susceptible person that two different things are the same.
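[For readers unfamiliar with the target of the jab: the "equations that sum them both together plus throwing in an entropy term" presumably refers to the variational free energy of FEP/Active Inference. In its standard textbook decomposition (not anything specific to this comment), it bundles a prediction-error-like term and an entropy term into a single quantity:

```latex
% Variational free energy of an approximate posterior q(x) over hidden states x,
% given observations o and generative model p(o, x):
F[q] = \underbrace{\mathbb{E}_{q(x)}\big[-\ln p(o, x)\big]}_{\text{expected energy}}
     \;-\; \underbrace{H\big[q(x)\big]}_{\text{entropy}}
% Equivalently, rearranged as complexity minus accuracy:
     = \underbrace{D_{\mathrm{KL}}\big[q(x)\,\|\,p(x)\big]}_{\text{complexity}}
     \;-\; \underbrace{\mathbb{E}_{q(x)}\big[\ln p(o \mid x)\big]}_{\text{accuracy}}
```

Active Inference then proposes that action selection minimizes an expected version of this same functional, which is the sense in which prediction and planning get "summed together" into one objective, and the move being mocked above. – Ed.]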
Original context: https://www.lesswrong.com/posts/isSBwfgRY6zD6mycc/eliezer-s-unteachable-methods-of-sanity?commentId=FTemSvm7m54JeuRAM
– Richard Kennaway