Welcome to Gaia!


How do you feel about Artificial Intelligence?

I look forward to the advancements: 60.0% [ 9 ]
We can't trust machines: 20.0% [ 3 ]
(other: please post opinion): 20.0% [ 3 ]
Total Votes: [ 15 ]

Luke_DeVari
DXnobodyX

That's what they said about sex dolls, but it's still a niche market. Furthermore, do you want a slave or a life companion? An A.I. couldn't support their side of the relationship.

(this question is open for everyone, BTW)
What qualities would define an individual who could be a loving, supportive companion?
I'd really like to have several peoples' inputs on this question if they could.

As far as I go, I need someone who can listen to me and comfort me. For example, when I say, "It's been a rough day," she should tell me something like, "It's okay, things will get better" as she gives me a reassuring hug. That kind of empathy is what I need in a relationship.

I need someone that I know isn't going to leave me. If I'm loyal to her, she should be loyal to me (I've been cheated on in a serious relationship in the past, and I'll never forget the feeling of emotional betrayal). I can't be friends with people who criticize me senselessly, without reason or resolution, nor with someone who finds joy in hurting or annoying other people, or in exploiting them for personal benefit whilst disregarding their general well-being.

A good relationship needs a strong base of communication, so I'd like to be able to talk with her often. And sex. You can't ignore the desire for sex. (I had an ex who wasn't a team player, if you catch my drift. It feels really empty when a relationship is one-sided, like how she was really demanding whilst ignoring my own desires.)

Is it wrong to engineer a personality to match your own?


Yes, because people go for what they want and rarely what they need.

So you want someone to lie to you about the future and have no choice in their commitment to you?

A far healthier relationship is one where someone empowers you to face the next day, not deceives you; it's a true connection when someone stays by your side when they don't have to, and communication goes both ways. You're just after something you can control, because you cannot accept that the world revolves around a magnetic core instead of yourself.

As Layra-chan said before, it's disgusting.
I honestly don't see why people are so distrustful of machine AI. Putting an AI in charge of something like, say, the energy grid, is not much different from electing public figures. We trust other people to be put in charge of vast power and influence.


People think, for some reason, that an intelligence not evolved from the need to reproduce and survive would have any reason to differentiate between machines and humans. Xenophobia is a biological trait that comes from reproducing within the same species. There's no reason to think a machine would ever feel any more or less "threatened" by humans than by other machines.

Fanatical Zealot

Luke_DeVari
Suicidesoldier#1
Luke_DeVari

Wouldn't we all be a bit happier if we had A.I. companions? ((Sure beats having a girlfriend))
[I know I'm going to get slammed for that comment, but let the opinions flow]

No, and it's primarily because I'm not a psychopath.

Please explain your reasoning as to why desiring companionship is psychopathic.

Humans have a fundamental psychological need for some degree of social interaction and affection; the lack thereof significantly harms one's quality of life. Are you suggesting that one should endure the suffering of loneliness because an AI companion is nonhuman? Why is that immoral, or otherwise not allowed?


Desiring "another human being" to do what you want is generally psychopathic, and no, we are not designed to need social interaction. There are plenty of hermits everywhere.

It's clearly not a need of a humans for so many reasons. ANYWAYS!


You want a bought "companion": not someone to love and be loved by, not someone to be there, but just a physical entity that pretends, with no real love or existence, a programmed individual who'd be your slave, not even a human, to be your "companion"...? Yes, that's pretty psychopathic.

Fanatical Zealot

Luke_DeVari
Suicidesoldier#1

Why make Androids?

Why not just make a highly effective toaster if you want toast?


It's a horrible idea.

You'll make an individual with all these arbitrary qualities mimicking a human brain and then what?


What purpose would they serve- to be as humans, simply to exist?

A human body design may lend itself to helping those who are missing limbs, having difficulty moving, and a number of other things.


They may make great robots for building things, possibly with better precision and craftsmanship even than human hands.

But giving them sentience- there's no point in that, especially if we expect them to serve mankind.

I don't know if I made myself clear enough, but creating a humanoid robot allows for a degree of control in creating social sentiment. I gave the example of the elderly because it's sometimes difficult for them to find social contact and support in their later years, especially if they live alone.

Perhaps another example is called for. Consider the social recluse: a guy who feels isolated from society may feel depressed, even suicidal, but if he were able to feel a connection with a social robot, he would be able to enjoy his life more. I'm not saying this is the strongest reason for making droids, but I am saying there is definitely a market in this regard.

I posit that droids can be sentient in a way that leaves them willingly servile. Sentience is the result of understanding a holistic perspective on how one's actions (and even one's very existence) affect things around them. Sentience is needed for a social robot to react to social cues as a human does and to learn as a human does. The point about social responses is that language is so vast in its possibilities for discussion that we can't write an "if, then" statement for every situation.

Sentience is based on outside stimuli and has no internal motivation, except for the programmed directives, which should include creating positive experiences, as an example.
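That "if, then" point can be made concrete with a toy sketch (hypothetical Python; the rule table and phrases are invented for illustration): an exact-match rule table only ever covers the utterances its author thought to enumerate, so open-ended conversation needs something that generalizes instead.

```python
# Toy illustration of why enumerated "if, then" rules can't cover language:
# the responder only handles utterances someone explicitly listed.
RULES = {
    "it's been a rough day": "It's okay, things will get better.",
    "i'm home": "Welcome back!",
}

def respond(utterance: str) -> str:
    # Exact-match lookup: any phrasing not in RULES falls through,
    # no matter how close it is in meaning to a listed one.
    return RULES.get(utterance.strip().lower(), "<no rule: response undefined>")
```

Even a near-synonym like "Today was rough" falls through, and the table would have to grow without bound to keep up; that gap is what a learning, sentient-seeming system is meant to fill.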


There's no reason why they need to be sentient.

Design them for their sole function and be done with it.


There's no reason to make it so they care.

That would be like making trees feel pain and then chopping them down with chainsaws.


It has no real value, and it would simply make the trees' existence horrible.
DXnobodyX


Yes, because people go for what they want and rarely what they need.

So you want someone to lie to you about the future and have no choice in their commitment to you?

A far healthier relationship is one where someone empowers you to face the next day, not deceives you; it's a true connection when someone stays by your side when they don't have to, and communication goes both ways. You're just after something you can control, because you cannot accept that the world revolves around a magnetic core instead of yourself.

As Layra-chan said before, it's disgusting.

I wish to state explicitly that I am not personally relying on social robots for a sense of emotional fulfillment. I am saying that I support the idea, but that doesn't mean I see droids as my only option for social connection, because that would be delusional. On that point, there is nothing stopping a person with a social robot from having human connections. I don't see how it would directly entail a disconnect from the rest of society (though the tendency toward reclusiveness would certainly increase for some people).

Would loving a robot be deception, or a kind of social relation we are simply unfamiliar with as a culture? By the way, the point of having a social robot is a sense of loyalty and a feeling of communication that goes both ways; I don't see how a droid doesn't fit those criteria. I'm asking: what does a human relationship have that cannot be replicated by a machine? (For the purposes of this argument, I am referring to social and emotional functions, not biological ones.)

By the way, it is generally frowned upon to make personal attacks in an argument, so I ask you to refrain from doing so, especially on a subject where we are discussing opinions. Remember that this is a discussion of subjective morality, and different people will have their own opinions. It is not conducive to a discussion to start throwing out value judgments on another's character for holding a belief that does not entail direct harm to anyone's well-being. Unless you can directly say, with an evidence-based claim, that one's belief has a harmful effect on others, then you cannot say it's bad.
In your case against me, you assume I have a mindset of egocentricity, quite wrongly, I might add. All humans have a natural need for some degree of control over their lives. Whilst I admit to having that need, that does not mean that I hold an unhealthy level of it.

I respect your feelings of disgust, but I see no logical base for the claim because there is no harm involved. On the contrary, I find it confusing for one to feel disgust at what makes another person happy when the genesis of that joy is independent from injury.
Vannak
I honestly don't see why people are so distrustful of machine AI. Putting an AI in charge of something like, say, the energy grid, is not much different from electing public figures. We trust other people to be put in charge of vast power and influence.


People think, for some reason, that an intelligence not evolved from the need to reproduce and survive would have any reason to differentiate between machines and humans. Xenophobia is a biological trait that comes from reproducing within the same species. There's no reason to think a machine would ever feel any more or less "threatened" by humans than by other machines.

I had never before thought about how Xenophobia plays an important role in the social tension against machines. Humans generally do have a natural tendency to distrust those outside of their own race or culture, which, by the evolutionary perspective in psychology, was an adaptive trait in early man to put one's own people before others for survival's sake. Perhaps a degree of dislike is encouraged by our natural tendency for self-preference, except in this case, it's a matter of caring about one's own species to the point of disdain for synthetic beings.
On the side of environmental influences on our psychology, a certain amount of contempt for machines has been instilled in many of us by the messages we receive from media, though those messages are not necessarily based in fact or logic.

Very interesting to think about. It seems no matter how you slice it, humans are egocentric by nature, but again by the evolutionary view, that kind of thinking is adaptive for genetic survival.
Suicidesoldier#1

There's no reason why they need to be sentient.

I believe I've already made the point that sentience is needed to emulate human social interactions. I know I've already explained how this sentience is fundamentally different from human awareness by being socially centered rather than individually focused. We go to such great lengths to make droids understanding because that is their purpose as social robots. In other words, the social function requires an ability to react in a way that at least appears intelligent.
Suicidesoldier#1

That would be like making trees feel pain and then chopping them down with chainsaws.

I fail to see a connection between this simile and making droids sentient. It implies rather graphic suffering whilst having no foundation of reasoning; to make a claim linking intense pain to this subject, you must first justify how harm is done and why the act would be defined as painful.

Also, on the point of a droid relationship being like slavery: though some people would certainly treat it that way, I do not see how this is fundamentally true for droids in general. The droid doesn't have a choice to reject you; its programming would cause it to refrain from doing so. I believe this is comparable to the contexts in which humans find one another desirable; with droids, we've simply mastered the direction of desire rather than letting it be chosen at random. If "natural" is synonymous with such randomness for you, then there's a fundamental problem of chaos in your worldview. Still, a lot of people think it's "right" when natural randomness lines up in the perfect way, but if we take matters into our own hands to improve our way of life, it's demonized.

Fanatical Zealot

Luke_DeVari
Suicidesoldier#1

There's no reason why they need to be sentient.

I believe I've already made the point that sentience is needed to emulate human social interactions. I know I've already explained how this sentience is fundamentally different from human awareness by being socially centered rather than individually focused. We go to such great lengths to make droids understanding because that is their purpose as social robots. In other words, the social function requires an ability to react in a way that at least appears intelligent.
Suicidesoldier#1

That would be like making trees feel pain and then chopping them down with chainsaws.

I fail to see a connection between this simile and making droids sentient. It implies rather graphic suffering whilst having no foundation of reasoning; to make a claim linking intense pain to this subject, you must first justify how harm is done and why the act would be defined as painful.

Also, on the point of a droid relationship being like slavery: though some people would certainly treat it that way, I do not see how this is fundamentally true for droids in general. The droid doesn't have a choice to reject you; its programming would cause it to refrain from doing so. I believe this is comparable to the contexts in which humans find one another desirable; with droids, we've simply mastered the direction of desire rather than letting it be chosen at random. If "natural" is synonymous with such randomness for you, then there's a fundamental problem of chaos in your worldview. Still, a lot of people think it's "right" when natural randomness lines up in the perfect way, but if we take matters into our own hands to improve our way of life, it's demonized.


Sentience implies the capacity to make your own choices. To lock an individual into its choices would be ridiculous.

You could treat an individual designed to think and rationalize on its own any way you wanted, as a slave, and yet it would think, try, dream of more, want to discover; who knows what sense it's given by the capacity to be sentient. All you need is imitation.



There is no reason to give an individual the desire to be free and then enslave it. Sentient beings will come up with things they want to do and then not be allowed to do them.

Any level of thinking will make them think of more, have ideas. It is a ridiculous path to go down, and it will have no real benefit. Just look at Cleverbot, or GPS units; they talk all the time. All it takes is more memory for them to respond better, but they'd just be AIs; there's no reason for them to be sentient.
a robot with human intelligence? sounds like one weak a** computer.
logan the god of candy
a robot with human intelligence? sounds like one weak a** computer.

You obviously either have very little understanding of even the most basic neurology or you are a troll; either way, your comment adds nothing to an intelligent discussion.
Suicidesoldier#1

Sentience implies the capacity to make your own choices. To lock an individual into its choices would be ridiculous.

We are all limited in our choices in one way or another. For example, a guy with Parkinson's may dream of being a pianist, but his condition makes it impossible. When it comes to sentient machines, their limitations are set by the designers to be as harmless as possible whilst providing the most intelligent benefit they can.

If sentience implies free will, as suggested by "the capacity to make your own decisions," then I would say a droid cannot have free will, so that version of sentience would be both impractical and dangerous from a realistic point of view.
I make the claim that a robot can be sentient through critically thinking and making decisions for the well-being of others. Decision is involved, though limited to the healthiest possible options that precautions can ensure. From a cognitive standpoint, the choices involved can be defined as sentient because the droid has to form a holistic understanding of a situation in order to make an informed decision. Since purposeful thought is involved, I claim that sentience can be achieved by proxy, without free will.

Remember, all of us see life only through the human perspective, and the thought processes of droids that would govern their ability to choose are going to be different from ours. For humans and droids alike, perceived reward guides behavior, but the perception is dictated by different sources. Humans have emotions and constructs about life that guide choices. Droids have objectives that are associated with reward, such as positive social feedback.
Given that how we make choices differs, I say that the nature of free will differs between us as well.
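A minimal sketch of that idea (hypothetical Python; the action names and feedback scores are invented): the designers fix the menu of permitted actions, and the droid simply picks the permitted action with the highest expected positive social feedback, its "perceived reward."

```python
# Hypothetical sketch: choice guided by perceived reward, but confined to
# a designer-approved set of actions ("the healthiest possible options").
ALLOWED_ACTIONS = ["offer_hug", "listen", "change_topic"]

def choose_action(expected_feedback: dict) -> str:
    # Score only approved actions; anything else (say, "leave") is never
    # even a candidate, regardless of how rewarding it might look.
    scores = {a: expected_feedback.get(a, 0.0) for a in ALLOWED_ACTIONS}
    return max(scores, key=scores.get)
```

The droid still deliberates over options, so choice is involved, but the option space itself is bounded by design, which is the "sentience by proxy, without free will" being argued for.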
Why do people treat AI like it's not around yet? What the hell do you think Google is?

As for this "sentience" nonsense, define precisely what you mean by "sentient", and then explain how such a thing would be beneficial to a machine.
Luke_DeVari
Vannak
I honestly don't see why people are so distrustful of machine AI. Putting an AI in charge of something like, say, the energy grid, is not much different from electing public figures. We trust other people to be put in charge of vast power and influence.


People think, for some reason, that an intelligence not evolved from the need to reproduce and survive would have any reason to differentiate between machines and humans. Xenophobia is a biological trait that comes from reproducing within the same species. There's no reason to think a machine would ever feel any more or less "threatened" by humans than by other machines.

I had never before thought about how Xenophobia plays an important role in the social tension against machines. Humans generally do have a natural tendency to distrust those outside of their own race or culture, which, by the evolutionary perspective in psychology, was an adaptive trait in early man to put one's own people before others for survival's sake. Perhaps a degree of dislike is encouraged by our natural tendency for self-preference, except in this case, it's a matter of caring about one's own species to the point of disdain for synthetic beings.
On the side of environmental influences on our psychology, a certain amount of contempt for machines has been instilled in many of us by the messages we receive from media, though those messages are not necessarily based in fact or logic.

Very interesting to think about. It seems no matter how you slice it, humans are egocentric by nature, but again by the evolutionary view, that kind of thinking is adaptive for genetic survival.


Richard Dawkins's The Selfish Gene describes why this emerges: most human behavior, such as xenophobia, altruism, and selfishness, is our genes' doing, trying to ensure that they survive over other species' genes. The whole point of his book is more or less that we're machines made by our genes. The point of us isn't to survive, but to make sure our genes survive. This is why we're xenophobic and view other species as lower: they don't have our genes. The less something is genetically like us, the more likely we are not to give a s**t about its well-being.

Now a machine, not having had to go through millions of years of evolution, would not be under this pressure, as it has no genes to pass on. We have no reason to think a machine would feel differently about us versus other machines once we understand why humans have that trait.
Also, on the issue of sentience, I'm pretty sure our best understanding of consciousness isn't that there's a dedicated circuit in the brain, but that it's an emergent feature related to self-identity that arises from our need to find patterns. We eventually find the pattern of activities that describes us as agents of action, and call it ourselves.

Fanatical Zealot

I think AI's are fine.

There is no need to give sentience to a robot and then enslave it.


Any sentient creature could make mistakes based on its own assumptions anyway.

At least a computer would perform flawlessly, and it would all be human error.


I don't want my calculator deciding to do 5 x 8 instead of 4 x 8 anyways.

That would be silly.
