

Rationalists 🤝 Postrationalists Writing screeds about how following the advice of Internet posts and self-medicating with controlled substances can be good if you are very smart


Also had a beef with Aella when they were both in Austin!
They are not beating the allegations that a Postrationalist is a Rationalist who admits Yud is just a dude and their goals are religious goals.
Update: There is a partially cached Facebook post where MacDonald states what he claims is Eigenrobot’s government name and says that Eigenrobot’s “(self admitted) BPD wife used to want to fuck me” (Google and Bing saved snippets, but do not share the full cache). That could have been what got him kicked out of VibeCamp for doxxing.


The simplenote post from a VibeCamp attendee claimed that “prior to Scott Alexander’s articles on Desoxyn, virtually no one talked about microdosing methamphetamine as a substitute for Adderall.” Any idea which post he was thinking of? “Know Your Amphetamines” (archive) was from January 2021, so about the time VibeCamp got started.
The post is undated but mentions Hereticon in January 2022. Update: it links several more simplenote posts with statements like “It’s 2022 as of the time of this writing” and “as of Mar 30 2022.” The author lived in Austin when Aella was there. So it seems likely that the post about VibeCamp was written in 2022.
Update Again: the author posted to Substack under a meatspace name and linked “Know Your Amphetamines”. I don’t recommend reading the whole post; it’s just an unpleasant person with logorrhea unloading a stream of consciousness at unpleasant people.


If you are broadly invested in US stocks, you are already invested in the chatbot bubble and the defense industry. If you are worried about that, an easy solution is to move some of that money elsewhere.


You would need a non-self-published source that says u/TPO = Lasker.


GeneSmith, who told LessWrong “How to Make Superbabies”, also has no bioscience background. This essay in Liberal Currents argues that a lot of right-wing media personalities are using synthetic testosterone now (but don’t call it gender-affirming care!). Roid rage may be hard to separate from Twitter brain-rot and slop-chugging.


CFAR lists nine employees with six-figure salaries plus a president. Oliver Habryka is one of those employees, at the lower end of the pay scale. Lightcone lists Habryka with a $3,000 honorarium and $110,000 in other salaries and expenses, which looks like one or two system administrators or IT technicians. In 2024, Lightcone Infrastructure paid most of its expenses to something called Lightcone Research, which actually operates LessWrong, and I predict that in 2026 Lightcone will give most of the money raised to CFAR to pay the mortgage on the Rose Garden property and be very worried about Robot God.


In December, Lightcone raised $1.6 million in donations plus a 12.5% matching donation from the Survival and Flourishing Fund. They threatened to shut down if they didn’t raise $1.4 million and wanted at least $2 million.
Jaan and SFC (Jaan Tallinn and the Survival and Flourishing Corp) helped us fund the above-mentioned settlement with the FTX estate (providing $1.7M in funding). This was structured as a virtual “advance” against future potential donations, where Jaan expects to only donate 50% of future recommendations made to us via things like the SFF, until the other 50% add up to $1.29M in “garnished” funding. This means for the foreseeable future, our funding from the SFF is cut in half.
Lightcone Infrastructure did not list any large liabilities like this on its 2024 Form 990, but CFAR listed several items that could cover it if the settlement happened in 2024.
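For a sense of scale on the “advance” terms in the quote above, here is a minimal sketch of my reading of the mechanism (an assumption on my part, not Lightcone’s own math): if half of each future SFF recommendation is withheld until the withheld halves total $1.29M, then roughly $2.58M in recommendations has to accumulate before full funding resumes.

```python
# My reading of the quoted "advance" terms; the repayment mechanics are an assumption.
garnish_target = 1.29e6    # withheld ("garnished") funding that must accumulate
withheld_fraction = 0.5    # Jaan donates only 50% of SFF recommendations until then

recommendations_needed = garnish_target / withheld_fraction
print(f"${recommendations_needed / 1e6:.2f}M in SFF recommendations before full funding resumes")
# -> $2.58M
```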
In December, MIRI raised $1.6 million in donations plus a 100% matching donation from SFF. They wanted a total of $6 million. The donations grew from $1 million to $1.6 million in the last few days, suggesting that they talked a few of their upper-middle-class supporters into chipping in amounts in the high tens or low hundreds of thousands of dollars to capture the matching donation. Both fundraisers reached their minimum targets but not their goals.
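For anyone checking that last claim, here is a quick back-of-the-envelope sketch, assuming the match applies proportionally to the full amount raised (my assumption; the exact match terms aren’t spelled out here):

```python
# Back-of-the-envelope totals for the two December fundraisers.
# The proportional-match assumption is mine; exact match terms were not published.

def total_with_match(donations, match_rate):
    """Donations plus a proportional matching donation."""
    return donations * (1 + match_rate)

lightcone = total_with_match(1.6e6, 0.125)  # $1.6M + 12.5% SFF match
miri = total_with_match(1.6e6, 1.00)        # $1.6M + 100% SFF match

print(f"Lightcone: ${lightcone / 1e6:.1f}M (shutdown floor $1.4M, goal $2M)")
print(f"MIRI:      ${miri / 1e6:.1f}M (goal $6M)")
# -> Lightcone $1.8M: clears the floor, misses the goal.
# -> MIRI $3.2M: well short of the $6M target.
```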


I could one day, but Nitter, the Wayback Machine, and public tools have gotten me this far!


Jax Romana accused Yud of using his polycule as servants, and the Nonlinear Fund openly used an intern as a cheap maid-of-all-work so they could focus on their important{CitationNeeded} work.


MIRI mentioned receiving “a two-year $7.7M grant from Open Philanthropy, in partnership with Ben Delo, co-founder of the BitMEX cryptocurrency trading platform” in April 2020. Good luck to anyone who wants to track down that $15.6 million donation.


Sounds like a typical young male seeker (with a bit of épater les bourgeois). Not the classic Red Guard personality, but it served Melon Husk’s needs.


Two of the bsky posts are login-only. Huh, Killian is into Decentralized Autonomous Organizations (blockchain), high-frequency trading (like our friends at Jane Street), veganism, and Effective Altruism?


Does anyone have an explainer on the supposed DOGE/EA connection? All I can find is this dude with a blog wobbling back and forth with LessWrong-flavoured language https://www.statecraft.pub/p/50-thoughts-on-doge (he quotes Venkatesh Rao and Dwarkesh Patel, who are part of the LessWrong Expanded Universe).


The February 2024 Medium post by Moskovitz objects to cognitive decoupling as an excuse to explore eugenics and says that Eliezer Yudkowsky seems unreasonably confident in imminent AI doom. It also notes that Utilitarianism can lead to ugly places such as longtermism and Derek Parfit’s repugnant conclusion. In the comments he mentions no longer being convinced that it’s as useful to spend on insect welfare as on “chicken, cow, or pig welfare.” He quotes Julia Galef several times. A choice quote from his comments on forum.effectivealtruism.org:
If the (Effective Altruism?) brand wasn’t so toxic, maybe you wouldn’t have just one foundation like us to negotiate with, after 20 years?


Max Read argues that LessWrongers and longtermists are specifically trained to believe “I can’t call BS, I must listen to the full recruiting pitch then compose a reasoned response of at least 5,000 words or submit.”


A few weeks ago, David Gerard found this blog post quoting a LessWrong post from 2024 in which a staffer frets that:
Open Phil generally seems to be avoiding funding anything that might have unacceptable reputational costs for Dustin Moskovitz. Importantly, Open Phil cannot make grants through Good Ventures to projects involved in almost any amount of “rationality community building”
So keep whistleblowing and sneering; it’s working.
Sailor Sega Saturn found a deleted post on https://forum.effectivealtruism.org/users/dustin-moskovitz-1 where Moskovitz says that he has moral concerns with the Effective Altruism / Rationalist movement, not reputation concerns (he is a billionaire executive, so don’t get your hopes up).


In November 2024, Habryka also said “we purchased a $16.5M hotel property, renovated it for approximately $6M and opened it up … under the name Lighthaven.” So the disconnect between what Lightcone says to the taxman (we are small bois, CFAR owns the real estate) and what it says to believers (we own the real estate) was already there.


Over on Buttcoin, David Gerard found this Substack post, which quotes Lightcone staff in early 2025:
Open Phil generally seems to be avoiding funding anything that might have unacceptable reputational costs for Dustin Moskovitz. Importantly, Open Phil cannot make grants through Good Ventures to projects involved in almost any amount of “rationality community building”
So I think Dustin told Open Phil “when I read about an organization I funded, I don’t want to read about phygs, sexual harassment, or pseudoscientific racism.”
Argentina and Russia are the usual examples of countries that had great futures in 1913 and threw them away with a series of bad decisions.