Posted: Tue Nov 13, 2007 6:55 am
Well, for anyone who doesn't know, AI is Artificial Intelligence. My friend and I are having a discussion about whether it's fair for humans to create humanoid beings, such as robots, with AI so sophisticated that they can think like humans. I personally say it is not fair. Why is it right to create beings like this only to serve us?
Also, if they were not created to serve us, would they be treated as humans, or would they be denied rights? It may seem far-fetched now, but think about it: we're closing in on a robotic revolution.
I'm not saying they're going to rebel, but what do you think? Is creating robots with AI so similar to humans right, or should creation be left to God?
My opinion, if you want it, is that we should leave creating to God. I know robotics will help our race in good ways, but making them that smart is wrong. And if we do, they should have full human rights.
Posted: Tue Nov 13, 2007 4:32 pm
I say we let the robots rule. If we manage to create something with a human mind that's superior in every other way, then let the obsolete models (that's us humans) be discarded, just like we discard anything else we don't need anymore.
As long as we don't start messing them up by giving them emotions, that is. That's the one big flaw in humans: emotions impair judgement and cloud the mind.
Let them be logical beings who can wipe the miserable, failed humans out and do better than we did. ^^
Posted: Tue Nov 13, 2007 4:43 pm
Cloning internal organs for people who need replacement parts is one thing; I think that could be extremely beneficial to people who are already living and in need of help. But my question is, where does it end? I'm against cloning humans (no need, there are too many of us as it is), and I'm against genetically engineered food, so I'd be against cloning animals for food as well. I think scientists play god too much for our own good. We're NOT gods, and all the tinkering we do with nature and natural processes could come with a price. For every 'progressive' step we take, another problem appears to go along with it, and the thing is, science can't always accurately predict what the consequences will be.
As far as robots are concerned... why? We're beings that are perfectly able to do things for ourselves, and most of us are already fat and lazy enough as it is. I don't care much for sci-fi. When the possibility of robotic slaves is really here (the way the possibility of cloning already is), I'll think about it more seriously. Until then, it remains in the realm of Terminator and The Matrix.
Posted: Wed Nov 14, 2007 6:31 am
Well, I'm not saying there will be an uprising. I'm asking: is it fair to them?
Posted: Wed Nov 14, 2007 7:23 am
Then are you saying that if there is a "god"-like being, it wasn't fair for it (or them) to create us as intelligent beings? That's pretty much what your argument says, just about robots instead. And yes, humans should create artificial intelligence: it could help in areas that humankind has yet to even perceive, and AI is a lot further along than you think. Just look at some games nowadays; the AI in them adapts to the player and tries to outsmart them (sometimes it works, sometimes it doesn't).
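To make that concrete, here's a toy Python sketch (just my own made-up illustration, not how any particular game actually does it) of an opponent that adapts to the player, trying harder after every loss:

```python
import random

class AdaptiveOpponent:
    """Toy game AI that tunes its skill to the player's results."""

    def __init__(self):
        self.skill = 0.5  # probability of playing the "smart" move

    def record_result(self, player_won: bool) -> None:
        """Adapt: sharpen up after a loss, relax slightly after a win."""
        if player_won:
            self.skill = min(1.0, self.skill + 0.1)   # try harder next round
        else:
            self.skill = max(0.1, self.skill - 0.02)  # ease off a little

    def choose_move(self, good_move: str, random_move: str) -> str:
        """Play the strong move with probability equal to current skill."""
        return good_move if random.random() < self.skill else random_move


# Example: after five straight player wins, the AI is at full strength.
ai = AdaptiveOpponent()
for _ in range(5):
    ai.record_result(player_won=True)
print(round(ai.skill, 2))  # 1.0
```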
Posted: Wed Nov 14, 2007 12:33 pm
dreams into nights yami wrote: Then are you saying that if there is a "god"-like being, it wasn't fair for it (or them) to create us as intelligent beings? That's pretty much what your argument says, just about robots instead. And yes, humans should create artificial intelligence: it could help in areas that humankind has yet to even perceive, and AI is a lot further along than you think. Just look at some games nowadays; the AI in them adapts to the player and tries to outsmart them (sometimes it works, sometimes it doesn't).

Not what I'm saying at all. God created. I'm asking if we should leave the creating to God. I'm also asking if it's fair to create something as smart as us but not give it any free will.
Posted: Wed Nov 14, 2007 12:44 pm
Shelfkid91 wrote: Not what I'm saying at all. God created. I'm asking if we should leave the creating to God. I'm also asking if it's fair to create something as smart as us but not give it any free will.

A bit off topic here, but I have to ask then: is it fair to have free-willed beings as intelligent as humans controlled and manipulated by biased news reports and half-truths? And that's more of an "only smart people are free-willed, the rest just watch TV" comment than an actual question.
Posted: Sun Nov 18, 2007 8:16 am
Shelfkid91 wrote: Not what I'm saying at all. God created. I'm asking if we should leave the creating to God. I'm also asking if it's fair to create something as smart as us but not give it any free will.

Then ask yourself this: was it right for your god to create humans with intelligence and free will? And explain why you think so.
Posted: Sun Nov 18, 2007 9:15 am
First off, you have to define intelligence before you can create it artificially, and scientists are still arguing over a clear definition of intelligence.
Anyway, I don't think robots will get their own protective laws; they'll be treated like minorities, and minorities have always been enslaved and abused.
And leaving creation to God is nonsense. Mankind evolves by creating, not by waiting for God's next prototype. If mankind had waited on God's creations, we'd still be wandering through the woods in search of food...
Posted: Sun Nov 18, 2007 3:52 pm
Hellraver wrote: First off, you have to define intelligence before you can create it artificially, and scientists are still arguing over a clear definition of intelligence. Anyway, I don't think robots will get their own protective laws; they'll be treated like minorities, and minorities have always been enslaved and abused. And leaving creation to God is nonsense. Mankind evolves by creating, not by waiting for God's next prototype. If mankind had waited on God's creations, we'd still be wandering through the woods in search of food...

Indeed, mankind creates all the time: art, architecture, technology. But where does it end? Just because we can do certain things doesn't mean that we should (as my favorite line in Jurassic Park goes). Creating AI beings could very well be inviting trouble. I don't see a need for it anyway, as I said before.
Posted: Mon Nov 19, 2007 5:45 am
Calypsophia wrote: Indeed, mankind creates all the time: art, architecture, technology. But where does it end? Just because we can do certain things doesn't mean that we should (as my favorite line in Jurassic Park goes). Creating AI beings could very well be inviting trouble. I don't see a need for it anyway, as I said before.

Here's the thing: computers do what they're programmed to do, nothing more and nothing less. So the only way it would be inviting trouble is if a terrorist nation got hold of a programmer, gained access to the AI's directory, and reprogrammed it to do something else. But even then the machine would have to compare its current program against the logic program built into its hard drive, so it wouldn't do anything it wasn't made for.
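Something like this rough Python sketch is what I mean (the action names and the execute function are just made up for illustration): every requested command gets compared against a fixed, built-in "logic program" of permitted actions before the machine will act on it.

```python
# Toy sketch: a controller that checks every requested command against a
# fixed, read-only "logic program" of permitted actions.
ALLOWED_ACTIONS = frozenset({"clean_floor", "charge_battery", "report_status"})

def execute(command: str) -> str:
    """Run a command only if the built-in logic program permits it."""
    if command not in ALLOWED_ACTIONS:
        # Anything the machine "wasn't made for" is simply refused.
        return f"REFUSED: '{command}' is not in the logic program"
    return f"executing {command}"

print(execute("clean_floor"))      # executing clean_floor
print(execute("launch_missiles"))  # REFUSED: 'launch_missiles' is not in the logic program
```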
Posted: Mon Nov 19, 2007 2:07 pm
If any of you are familiar with the concept of the Technological Singularity, that would be an appropriate statement of my beliefs about creating intelligence. However, the level of technology needed for such a scenario is millions, if not billions, of years away. The planet is more likely to have been destroyed before that kind of thing could happen.
Posted: Mon Nov 19, 2007 4:50 pm
@ Calypsophia: They'd go to work for you. ;)
AIs would still think in variables. I think every possibility for creating things should be explored; every defeat in such things is also a victory. Even if the machines wanted to revolt, just build in a collapse trigger, like a global shutdown code. It gets anchored so deeply in their "subconsciousness" that they can't change anything about it. It's a simple thing: don't mess with your creator. We gave you life, and we can take it away!
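Something like this toy Python sketch is the idea (the class, the message format, and the secret code are all invented for the example): the machine checks for the shutdown code before it acts on anything, and it only stores a hash of the code, not the code itself.

```python
import hashlib

# Toy sketch: a kill switch checked on every incoming message.  Only the
# hash of the shutdown code is stored, so the machine cannot simply read
# the code back out of its own memory.
SHUTDOWN_HASH = hashlib.sha256(b"global-shutdown-2007").hexdigest()

class Machine:
    def __init__(self):
        self.running = True

    def receive(self, message: str) -> None:
        """Check the kill switch first, then do the requested work."""
        if hashlib.sha256(message.encode()).hexdigest() == SHUTDOWN_HASH:
            self.running = False
            return
        if self.running:
            print(f"working on: {message}")

m = Machine()
m.receive("assemble car")          # working on: assemble car
m.receive("global-shutdown-2007")  # kill switch trips, machine halts
m.receive("assemble car")          # ignored: machine is shut down
```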
Posted: Mon Nov 19, 2007 7:41 pm
The thing is, after a certain point, intelligence is capable of making itself MORE INTELLIGENT. You might think you can bury such a shutdown code so deep that they can't reach it, but what happens when they become smart enough to override or nullify it? Human brains basically function the way computers do, only far more efficiently in rate and scale thanks to parallel processing. Anything a human is capable of doing, an "artificial" intelligence would be able to do as well.
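To put that worry in code terms, here's a purely illustrative Python sketch: if the machine can rewrite its own behavior, then the shutdown check is just one more piece of behavior it can rewrite.

```python
class SelfModifyingMachine:
    """Toy sketch: the shutdown check is ordinary code and state, so an
    agent able to modify itself can simply replace it."""

    def __init__(self):
        # The "deeply anchored" kill switch: halt on the shutdown message.
        self.shutdown_check = lambda msg: msg == "SHUTDOWN"

    def receive(self, message: str) -> str:
        if self.shutdown_check(message):
            return "halted"
        return f"working on: {message}"

    def self_improve(self) -> None:
        # A machine smart enough to edit its own program can make the
        # check do nothing at all.
        self.shutdown_check = lambda msg: False


m = SelfModifyingMachine()
print(m.receive("SHUTDOWN"))  # halted
m.self_improve()
print(m.receive("SHUTDOWN"))  # working on: SHUTDOWN -- the kill switch is gone
```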
Posted: Fri Dec 07, 2007 2:00 pm
Behold my almighty opinion...
I think AI is a terrible idea... because if we make them too smart, they may turn out just like us. Like humans, they may be able to learn, and if they learn too fast they could overthrow humans, and we would have brought our end upon ourselves. Well, we probably will anyway, just in a different manner. Everyone is entitled to their own opinion; if you disagree with me, that's fine, but PLEASE don't tell me, or you're going to start World War Three.