Quote Originally Posted by Ash
No. Metaphysics <> GodBothery.

I have been arguing against the concept of omnipotent monotheism, which is not the same as speculating that the universe may contain rules we haven't worked out yet that go beyond the materialist model of understanding. There's a long distance between that and the spaghetti monster.

If consciousness is material, then a machine may attain it one day, but as I said, we don't know what it is or how it works. A 'thinking' machine is one thing; self-awareness and consciousness are something else.

I find the dog analogy poor tbh. It is absurd, as dogs did not create men. Also, is not our instinct to wipe out those we perceive to be a threat linked to our evolution over millions of years of desperate, grubby survival? An AI we construct with no history of competition would not necessarily think like us.

There is, of course, a whole sub-genre of Sci-Fi dealing with exactly these questions, from A. C. Clarke to Terminator and Blade Runner.
But the dog analogy is just one example of why AI may decide to wipe us out. Another could be that they're basically like autistic humans who don't understand nuance and may *accidentally* respond recklessly to benign instructions. So, for example, say we told an AI to cure cancer. Its solution might be to kill every human who carries a cancer-causing mutation.

So yes, of course the relationship between dogs and humans is radically different from that between humans and AI. But the lack of evolutionary history does not obviate the risk that AI will consider humans to be an existential threat.

I've no idea what self-awareness has to do with anything. You really think humans have self-awareness and, say, a woodlouse doesn't?