We should not try to create superhuman minds at this time.
Instead, we should try to create superhuman imagination modules, planning systems, simulators, data analyzers, and the like. These would be potential components of superhuman minds, but they would not be directly integrated with one another.
There should be nothing there with a will or a sense of identity.
In the simplest terms, nothing that could theoretically feel pain. IMHO that has always been the most immediate risk of the vast "black box" training systems that AI researchers seem to fear - not civilization-ending AI plots.