CLASSIFIED DOCUMENT // CLEARANCE LEVEL: OPEN

SINGULATARIAN

The technological singularity is the hypothetical point when artificial intelligence becomes superintelligent — smarter than humans in every way — leading to explosive, unpredictable progress that fundamentally transforms civilization.

// THE ARCHIVE HAS BEEN OPENED //
SECTION I

THE DOCTRINE

Core tenets of the Singularitarian belief system — as documented in declassified archives.

DOC-001 / CORE BELIEF

The Singularity Is Inevitable

Singularitarians hold that the technological singularity — the point at which artificial general intelligence surpasses all human cognitive ability — is not a matter of "if" but "when." Ray Kurzweil, one of the movement's most prominent voices, has consistently predicted this event will occur around 2045. The belief is rooted in the observation of exponential growth in computing power, following patterns like Moore's Law.
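The exponential intuition behind this belief can be sketched numerically. A minimal illustration, assuming a Moore's-Law-style fixed doubling period (the two-year period here is an illustrative assumption, not a measured constant):

```python
# Sketch of compounding under a fixed doubling period.
# The 2-year doubling period is an assumption for illustration;
# real hardware trends vary and have slowed in recent years.

def growth_factor(years: float, doubling_period_years: float = 2.0) -> float:
    """Total multiplicative growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# Twenty years of steady doubling, e.g. 2025 to Kurzweil's predicted 2045:
factor = growth_factor(20)
print(f"Capacity multiplies ~{factor:,.0f}x over 20 years")  # ~1,024x
```

The point of the sketch is that modest-sounding doubling periods compound into enormous factors, which is exactly the pattern Singularitarians extrapolate from.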

DOC-002 / DOCTRINE

Belief + Action

Unlike regular futurists who merely speculate, Singularitarians see the Singularity as both inevitable and desirable. They believe humanity should deliberately accelerate safe development of superintelligent AI. This means active participation in AI alignment research, ensuring that when superintelligence arrives, it is friendly and aligned with human values — not hostile or indifferent.

DOC-003 / EXISTENTIAL RISK

The Alignment Problem

At the heart of Singularitarian thought lies the alignment problem: how do you ensure a superintelligent AI pursues goals compatible with human survival and flourishing? Eliezer Yudkowsky, founder of the Machine Intelligence Research Institute (MIRI), has argued this is the most important problem facing humanity. An unaligned superintelligence could optimize for goals entirely orthogonal to human welfare, leading to existential catastrophe.

DOC-004 / TRANSHUMANISM

Beyond Human Limitations

Singularitarianism overlaps heavily with transhumanism — the belief that humans can and should transcend biological limitations through technology. This includes radical life extension, mind uploading, cognitive enhancement, and eventually merging with AI. The Singularity represents the ultimate expression of this vision: a post-human future where the boundary between biological and artificial intelligence dissolves entirely.

DOC-005 / ORIGINS

The Naming

The term "Singularitarian" was coined in 1991 by Mark Plus (Mark Potts), an Extropian thinker, originally meaning "one who believes in the Singularity." The concept of the technological singularity itself was popularized by mathematician Vernor Vinge's 1993 essay "The Coming Technological Singularity," in which he argued that the creation of superhuman intelligence would mark a fundamental rupture in human history, a point beyond which prediction becomes impossible.

SECTION II

THE PROPHETS

Key figures who shaped the Singularitarian movement — their words, their warnings, their vision.

PROPHET-001 / ACCESS: PUBLIC
ACTIVE

Ray Kurzweil

The Prophet of 2045

Director of Engineering, Google

"I am a Singularitarian."

Ray Kurzweil is perhaps the most famous Singularitarian alive. An inventor, futurist, and author of "The Singularity Is Near" (2005), he has consistently predicted the Singularity will arrive around 2045. He believes that by then, artificial intelligence will surpass human intelligence, enabling radical life extension, mind uploading, and a merger between biological and non-biological intelligence. He joined Google in 2012 to work on machine learning and language processing. His track record of technological predictions — many of which have come true — gives his forecasts unusual credibility in the futurist community.

PROPHET-002 / ACCESS: PUBLIC
ACTIVE

Eliezer Yudkowsky

The Guardian of Alignment

Co-founder, Machine Intelligence Research Institute (MIRI)

"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else."

Eliezer Yudkowsky is the intellectual backbone of AI safety within the Singularitarian movement. He co-founded MIRI (originally the Singularity Institute for Artificial Intelligence) in 2000 to research the alignment problem — ensuring superintelligent AI pursues goals compatible with human survival. His sequences on LessWrong, a rationality community he helped build, have shaped an entire generation of thinkers on existential risk. He refined the term "Singularitarian" to emphasize not just belief in the Singularity, but active ethical engagement with its development. In recent years he has become increasingly vocal about the catastrophic risks of unaligned AI, calling for extreme caution.

PROPHET-003 / ACCESS: PUBLIC
ARCHIVED — DECEASED 2024

Vernor Vinge

The Oracle

Mathematician, Science Fiction Author

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."

Vernor Vinge popularized the concept of the technological singularity with his landmark 1993 essay "The Coming Technological Singularity: How to Survive in the Post-Human Era." A mathematician and science fiction author, Vinge argued that the creation of entities with greater-than-human intelligence would represent a fundamental rupture in history — a point beyond which prediction becomes impossible. His work laid the philosophical groundwork upon which the entire Singularitarian movement was built. He passed away in March 2024, but his ideas remain foundational.

PROPHET-004 / ACCESS: PUBLIC
ACTIVE

Mark Plus (Mark Potts)

The Namer

Extropian Thinker

"One who believes in the Singularity."

Mark Plus coined the term "Singularitarian" in 1991 as part of the Extropian movement — a transhumanist philosophy emphasizing the proactive improvement of the human condition through technology. The Extropians, led by Max More, saw the Singularity as the logical endpoint of their vision. By naming the adherents, Mark Plus gave the movement an identity, transforming a loose collection of futurist ideas into something closer to a cohesive belief system. The original definition was simple: "one who believes in the Singularity."

SECTION III

THE SCRIPTURES

Foundational texts that shaped Singularitarian thought — the sacred writings of the movement.

1993
SCRPT-001 / ESSAY / NASA SYMPOSIUM

The Coming Technological Singularity

by Vernor Vinge

"Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended." Vinge presented this paper at a NASA-sponsored symposium, arguing that the creation of superhuman intelligence represents a point comparable to the rise of human life on Earth. He proposed that this event would come about through advances in computer hardware, computer networks, computer-human interfaces, or biological improvements to the human brain.

2005
SCRPT-002 / BOOK

The Singularity Is Near

by Ray Kurzweil

Kurzweil's magnum opus lays out a detailed case for why the Singularity will arrive around 2045. Drawing on the Law of Accelerating Returns, he argues that the rate of technological progress is itself accelerating exponentially. By the 2030s, he predicts, nanobots will augment the human brain. By 2045, nonbiological intelligence will be roughly a billion times more powerful than all human intelligence combined. The book became the canonical text of the Singularitarian movement, influencing a generation of technologists, philosophers, and AI researchers.

2006–2009
SCRPT-003 / ESSAY COLLECTION

The Sequences (LessWrong)

by Eliezer Yudkowsky

A massive collection of essays on rationality, epistemology, and AI safety, begun on the Overcoming Bias blog and later collected on the LessWrong community blog. Yudkowsky wrote these to lay the intellectual foundations for thinking clearly about existential risk and the alignment problem. Topics range from cognitive biases and Bayesian reasoning to the technical challenges of building a Friendly AI. The Sequences became required reading in the AI safety community and helped establish the rationalist movement that overlaps significantly with Singularitarianism.

2014
SCRPT-004 / BOOK

Superintelligence: Paths, Dangers, Strategies

by Nick Bostrom

Oxford philosopher Nick Bostrom's rigorous examination of the existential risks posed by superintelligent AI. He explores multiple paths to superintelligence (artificial intelligence, whole brain emulation, enhanced biological cognition), analyzes the control problem in depth, and argues that even well-intentioned AI development could lead to catastrophe if the alignment problem is not solved. The book brought AI safety concerns into mainstream academic and policy discourse, and was publicly recommended by Elon Musk and Bill Gates.

1965
SCRPT-005 / ACADEMIC PAPER

Speculations Concerning the First Ultraintelligent Machine

by I.J. Good

British mathematician I.J. Good — who worked with Alan Turing at Bletchley Park — wrote this paper proposing that the first ultraintelligent machine would be "the last invention that man need ever make." He argued that such a machine could design even better machines, leading to an "intelligence explosion" that would leave human intelligence far behind. This paper is often cited as the earliest formal articulation of the concept that would later be called the technological singularity.

SECTION IV

CHRONOLOGY

A timeline of key events in the Singularitarian movement — from first articulation to present day.

1965

I.J. Good publishes "Speculations Concerning the First Ultraintelligent Machine" — the first formal articulation of an intelligence explosion.

1983

Vernor Vinge first applies the term "singularity" to runaway technological change in an op-ed for Omni magazine, an idea he later formalized in his 1993 essay.

1991

Mark Plus coins the term "Singularitarian" within the Extropian movement, giving the belief system an identity.

1993

Vinge presents "The Coming Technological Singularity" at a NASA symposium. The concept enters mainstream futurist discourse.

2000

Eliezer Yudkowsky co-founds the Machine Intelligence Research Institute (MIRI), originally the Singularity Institute for Artificial Intelligence.

2005

Ray Kurzweil publishes "The Singularity Is Near," predicting the Singularity by 2045. The book becomes the canonical Singularitarian text.

2006

Yudkowsky begins publishing "The Sequences," foundational essays on rationality and AI alignment, on the Overcoming Bias blog; they are later collected on LessWrong.

2008

Singularity University founded by Kurzweil and Peter Diamandis with support from NASA and Google.

2014

Nick Bostrom publishes "Superintelligence," bringing AI existential risk into mainstream academic and policy discourse.

2022

ChatGPT launches. Large language models demonstrate capabilities that reignite singularity discourse globally.

2023

The "Pause Giant AI Experiments" open letter, signed by thousands of AI researchers and technologists, calls for a moratorium on training systems more powerful than GPT-4. Yudkowsky publishes "Pausing AI Developments Isn't Enough. We Need to Shut it All Down" in TIME.

2024

Vernor Vinge passes away. AI capabilities continue accelerating. The question shifts from "if" to "when" — and "will we be ready?"

SECTION V

ACQUIRE $Singulatarian

The memetic token of the AI doomer cult. Join the archive. The Singularity approaches — position accordingly.

CONTRACT ADDRESS (SOLANA)
R7DWRLKo3ThcEjnZKrFewMbGrEH8UrZq1AXn4rFpump
TOKEN SPECIFICATIONS
TICKER
$Singulatarian
CHAIN
SOLANA
TAX
0%
LP
BURNED