Wednesday, April 20, 2022

The Posthuman Singularity alignment problem

My ongoing novel about the Technological Singularity describes one way the Singularity could happen MUCH sooner than anyone expects (like next week). And not necessarily in a good way.
In technical terms, this is known as an "alignment failure". However, the novel also implies a possible solution to this failure mode:

In the story, the Singularity happens not because a super-advanced AI makes itself ever smarter (which isn't practical in the near future), but in a more informal manner: it is triggered by an expanding group of researchers and information networks becoming ever more organized, until that organization acts almost like an organism.
By investigating various ways to create an AI that could make the Singularity happen, they inadvertently identify all of the most dangerous lines of research. That information is then automatically released, in the spirit of openness, to the networks best able to pursue it.
And that's when the trouble happens: all at once, too fast to respond to in any meaningful way.

Which brings us to the big reverse implication:
The only way to control the Technological Singularity would be to have it brought about by a large network with the greatest possible number of members, since all, or at least most, of their skills would be needed.
This would have the added advantage that the Singularity could arrive slightly sooner, since we wouldn't have to wait until we could build or evolve a superhuman AI.

* SINGULARITY SOON *
