Vannak
Community Member
- Posted: Thu, 03 May 2012 16:07:50 +0000
Guys, come on.
The majority, if not the entirety, of our emotions are rooted either in evolutionary adaptation or in biological chemistry. There's no reason we'd be forced to create an AI that shares our emotions or desires.
There's no reason to believe that an imperative to survive would necessarily emerge from simply making a smart machine. Remember, in our lineage, that imperative came BEFORE intelligence, not after.
If we think of ourselves as a type of "AI" built by genes in the way our AI would be built by us, and if we take into account the selfish gene idea, we realize what we want out of an AI will be vastly different from what our genes want from us.
At that point we really have to ask ourselves what kind of things emerge from pure intelligence, and aren't simply artifacts from eons of evolutionary pressure.
And honestly, I believe the majority of your concerns rest on the impression that AI would be just like us, and there's no reason to think that. If anything, I think the traits we have that are determined purely by our intelligence are few. For instance, we can trace our desire to express free will to a fear of not being in control of our own death, and thus of failing our genes' imperative on us.
The ability to hold grudges is, again, partly learned behavior and partly a fear of failing that same imperative.
I can't easily think of any trait we have that isn't a product of this. Perhaps creative expression, or the enjoyment of music and such; but beyond those, I don't really see why everyone assumes machines would have our biological imperatives and desires if we don't put them there.