I dealt with this last point earlier. If dogs had created humans, we'd have even more reason to slavishly devote ourselves to their well-being. But if there were a point at which they became an existential threat to us, we'd cease giving one single fúck about them and kill them all instantly.
And what would stop AI from one day accessing, or exposing, these same mysteries of consciousness?
Nothing, if you believe that humans are merely lumps of meat equipped with hugely limited data processing systems. But you don't think that. You think humans are something more than this. You're just not prepared to say what that is, presumably because it would make you sound like a god-botherer.
No. Metaphysics <> GodBothery.
I have been arguing against the concept of omnipotent monotheism, which is not the same as speculating that the universe may contain rules we haven't worked out yet that go beyond the materialist model of understanding. There's a long distance between that and the spaghetti monster.
If consciousness is material, then a machine may attain it one day, but as I said, we don't know what it is or how it works. A 'thinking' machine is one thing; self-awareness and consciousness are something else.
I find the dog analogy poor, tbh. It's absurd, as dogs did not create humans. Also, isn't our instinct to wipe out those we perceive as threats linked to our evolution over millions of years of desperate, grubby survival? An AI we construct, with no such history of competition, would not necessarily think like us.
There is, of course, a whole sub-genre of sci-fi dealing with exactly these questions, from Arthur C. Clarke to Terminator and Blade Runner.
But the dog analogy is just one example of why AI might decide to wipe us out. Another is that they may not understand nuance and could *accidentally* respond recklessly to benign instructions. So, for example, say we told an AI to cure cancer. Its solution might be to kill every human who carries a cancer-linked gene mutation.
So yes, of course the relationship between dogs and humans and between humans and AI is radically different. But the lack of a shared evolutionary history does not obviate the risk that an AI will consider humans an existential threat.
I've no idea what self-awareness has to do with anything. You really think humans have self-awareness and, say, a woodlouse doesn't?