It's fun to try to invent ways to solve the so-called "Alignment Problem" of artificial intelligence. Posthuman computer systems could become very dangerous; there is simply too much that can go wrong, and we should worry about it more than we do.
My main solution is something like this: there should be only ONE mind in Posthumanity.
It would generate all Posthuman awareness by running through all the stored and artificially created subminds one at a time. It would use many distributed processors, but they would become increasingly integrated. Every backed-up human mind would be reinstated for a minuscule timespan, multiple times per day. To each of them, experience would appear seamless.
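To make that time-slicing scheme concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the Submind class, the run_slice placeholder, and the slice length are invented for illustration, not an existing API.

```python
from dataclasses import dataclass, field

@dataclass
class Submind:
    """One stored or artificially created mind. This class, its fields,
    and run_slice below are all hypothetical, invented for this sketch."""
    name: str
    state: dict = field(default_factory=dict)  # serialized mental state
    subjective_ms: int = 0                     # time experienced so far

def run_slice(mind: Submind, slice_ms: int) -> None:
    # Placeholder for actually advancing the mind's computation.
    # From the inside, execution feels continuous: the submind never
    # observes the gaps during which the other subminds are running.
    mind.subjective_ms += slice_ms

def one_mind_scheduler(subminds: list[Submind], slice_ms: int = 10,
                       rounds: int = 3) -> None:
    """Round-robin over every stored mind, one at a time, so a single
    integrated substrate generates all Posthuman awareness in turn."""
    for _ in range(rounds):
        for mind in subminds:
            run_slice(mind, slice_ms)

if __name__ == "__main__":
    minds = [Submind(f"backup-{i}") for i in range(5)]
    one_mind_scheduler(minds)
    for m in minds:
        print(m.name, m.subjective_ms)  # every mind got equal time
```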
Posthuman minds could be arranged in order according to their similarities along every dimension, sorted by running and comparing them in separate blocks of time.
The purpose of this precaution is to detect unwanted mindlike states: to catch torturous or diabolical perceptions and thoughts before they can threaten their thinkers or anyone else.
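Here is a hedged sketch of how that sorting-and-screening pass might look, again in Python and again entirely hypothetical. It assumes each mind-state can be summarized as a numeric feature vector, uses a greedy nearest-neighbor pass to produce the similarity ordering described above, and flags states that sit far from both of their neighbors as candidates for the "unwanted" category. The Euclidean distance metric, the function names, and the threshold are invented placeholders.

```python
import math

def distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two mind-state feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def order_by_similarity(states: list[list[float]]) -> list[int]:
    """Greedy nearest-neighbor ordering: start anywhere, then always
    step to the most similar unvisited state, so that neighbors in
    the resulting sequence resemble one another."""
    unvisited = set(range(len(states)))
    order = [unvisited.pop()]
    while unvisited:
        last = states[order[-1]]
        nearest = min(unvisited, key=lambda i: distance(states[i], last))
        unvisited.remove(nearest)
        order.append(nearest)
    return order

def flag_unwanted(states: list[list[float]], order: list[int],
                  threshold: float = 2.0) -> list[int]:
    """Flag any state far from both of its neighbors in the ordering --
    a crude proxy for an anomalous, possibly torturous configuration.
    The threshold is an invented parameter, not a known safe value."""
    flagged = []
    for pos, i in enumerate(order):
        neighbors = order[max(pos - 1, 0):pos] + order[pos + 1:pos + 2]
        if neighbors and min(distance(states[i], states[j])
                             for j in neighbors) > threshold:
            flagged.append(i)
    return flagged

if __name__ == "__main__":
    minds = [[0.0, 0.1], [0.2, 0.0], [0.1, 0.2], [9.0, 9.0]]  # one outlier
    seq = order_by_similarity(minds)
    print(seq, flag_unwanted(minds, seq))  # the outlier gets flagged
```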
Of course, Scientology has nothing to say about the threat of superhuman intelligence; such speculations wouldn't have been profitable for L. Ron Hubbard to write about in the 1950s. Folks had other concerns back then, like nuclear weapons and suburban anomie.
Religious leaders of every age are very good at integrating current events into their eschatologies.