

I’d go for Motoko Kusanagi’s prosthetic body, myself, as long as I could afford the upkeep. That whole “don’t darken your Soul Gem” thing would go terribly for me.


Whatever marginal utility genAI has in mathematics, like being a shitty version of a semantic search engine, is outweighed by the damage it is doing to critical thought at large. “Ooh, the bus to the math conference runs so smoothly on this leaded gasoline!”


Reasoning With Machines doesn’t work on reasoning, really. It’s almost entirely large language models — chatbots. Because that’s where the money — sorry, the industry interest — is. But this paper got into the NeurIPS 2025 conference.
A reminder that the NeurIPS FAQ for reviewers says that “interactions with LLMs” are an acceptable way “to enhance your understanding of certain concepts”. What other big conferences are there… AAAI 2026, you say?
AAAI-26 will follow a two-phase reviewing process as in previous years, with two additions: an additional AI-generated review in Phase 1, and an AI-generated summary of the discussions at the end of the discussion phase. The AI-generated content is being used as part of a pilot program to evaluate the ability of AI tools to assist in the peer review process.
I’m gonna say it: The entire “artificial intelligence”/“machine learning” research field is corrupt. They have institutionally accepted the bullshit fountain as a tool. It doesn’t matter if they’re only using chatbots as a “pilot program”; they’ve bought into the ideology. They’ve granted fashtech a seat at the bar and forced all the other customers to shake its hand.


Was there ever, like, a push by Falun Gong to whitewash their articles? I seem to recall gossip from somebody (maybe in a skeptics’ group) about that, but I have no idea where in Wikipedia’s deep drama holes to look for evidence of it.


How do you write like this?
The first step is not to have an editor. The second step is to marinate for nearly two decades in a cult growth medium that venerates you for not having an editor.


Hasn’t Falun Gong had beef with Wikipedia for a long time? I have a vague recollection of reading about that, but I do not know where.


The only nice feeling here is the validation of every joke we science students ever made about the management school.


Goertzel is a fan of Chris Langan.


muted colors
A lot of it looks like it was pissed on.


The computer-science section of the arXiv has declared that they can’t put up with all your shit any more.
arXiv’s computer science (CS) category has updated its moderation practice with respect to review (or survey) articles and position papers. Before being considered for submission to arXiv’s CS category, review articles and position papers must now be accepted at a journal or a conference and complete successful peer review. When submitting review articles or position papers, authors must include documentation of successful peer review to receive full consideration. Review/survey articles or position papers submitted to arXiv without this documentation will be likely to be rejected and not appear on arXiv.


“I can read HTML but not CSS” —Eliezer Yudkowsky, 2021 (and since apparently scrubbed from the Internet, to live only in the sneers of fond memory)


Oreo: the cookie that doesn’t need a marketing department.


That post requires signing in to view; it’s a link to this: https://futurism.com/science-energy/trump-altman-plutonium-oklo


But Sam can build the AGI! Sort of — OpenAI and Microsoft recently redefined “artificial general intelligence” as OpenAI making $100 billion in profit. Yes, really.


I’m pretty sure that nobody has ever lain on their deathbed wishing they’d spent more time going into online communities and disagreeing with everything being said.


Level design, color palette, continuity, assets and physics by a bowl of salvia


Do you remember / Unending and boundless September / Flames were catchin’ the threads of the first-years


Emotional Exploitation As A Service™
Via Reddit!SneerClub: “Investors’ ‘dumb transhumanist ideas’ setting back neurotech progress, say experts”