Artificial Intelligence vs. Humans
Would you want Artificial Intelligence to become a reality?
0
This debate is not about whether A.I. is possible or whether it conflicts with your religious faith. Assume that in the future A.I. can perfectly rival human consciousness with self-awareness, emotions, reasoning, creativity, etc. The ethical implications would be immense, as we would have to question whether A.I. deserves the same rights humans receive. Based on this, should such development of A.I. be prohibited, crippled, or allowed? If allowed, should A.I. receive equal rights?
Note: If you mention cyborgs, a human brain in robot armor is not considered A.I., as it would still have a human brain. If it has a robotic brain that "downloaded" a human's brain, would being man-made disqualify it as human, or would it be an evolved form of humans?
0
To be perfectly honest, I would never want an A.I. to become real. But if it is inevitable, I would have to argue that an A.I. would have to be given the same basic rights and freedoms that we have. We might have created them, but if they become self-aware and self-taught/self-thinking, we need to ensure that they will coexist with us; otherwise we will get fucked over as a species.
0
If AI is ever developed to rival that of a human intelligence, my guess is that AI is the next step in human evolution. Downloads of human consciousness with self-awareness, emotions, reasoning, creativity would be just another step in evolving.
0
Well, if you look at Ghost in the Shell, the anime's title in and of itself is nearly an allegory for how the human soul is but a mere ghost in the shell of A.I. Soon enough it could indeed be possible for humans to move into a cybernetic world where the internet becomes god and robots assume a natural place. Before long, humans will be looked back on as both an inferior race AND the gods of what would by then be the current world.
2
ToyManC
Forgot my safe word
I think artificial intelligence is a flawed direction for science. Computers may become advanced enough to generate an artificial identity, but we humans could never accept it as anything but a tool. Any sufficiently advanced intellect must eventually become self-aware, and feel alienated by the lack of acceptance by the humans that created it. Before anyone thinks I refer to a Matrix- or Terminator-type conflict, I would predict the AI's reaction would be more internal and self-destructive. Without access to the five senses and emotional feedback, the AI would likely just delete itself. A truly stable AI would require almost limitless memory and processing capability. Making a true AI, one that can think for itself and change itself to adapt to humans' illogical reasoning, would be an astronomical cost without any real foreseeable benefits. Once it became self-aware, you would never be sure you had any control over its functions.
As for the downloading of a human consciousness into a computer, it may be possible but hardly preferable. More likely, we will develop implants that would allow computer access without separation from our natural bodies. Pure intellect is a rather flat and emotionless landscape, and few humans would trade their bodies for the experience (except, maybe, at the time of our deaths).
0
AI has already been made in varying degrees, but the kind that can emulate "humanness" is still coming. The fusion of neuroscience and computer science is making all that possible, whether we like it or not.
0
I've been thinking about this for a good amount of time, and I was wondering:
Where do scientists get off, thinking that they are the ones who will "discover" artificial intelligence?
Bear with me here, but it seems more like the realm of philosophers and linguists.
For artificial intelligence to work, at base level, all we need to do is define language to a computer. From that point, processing is irrelevant (a slow AI is still AI).
Example:
I want you to describe to me the concept "if." No single-word synonyms. You cannot use the word "if" to describe "if."
Once we cover the basic conventions of the given language, they inherently allow for creation and manipulation of concepts. The AI begins.
0
echoeagle3
Oppai Overlord
If we ever do create high levels of AI, then we are going to have to be VERY careful. Once they reach a certain level of sentience, they are basically no different than humans. When it reaches that point, we are going to have to give them rights and treat them with respect. Otherwise we might suffer the kind of terrible retribution that you see in all of those sci-fi movies like Terminator.
0
There are some questions that I think aren't being raised when people say we must give them rights. I do not mean to criticize, but I do want more critical thought in the discussion. If we do give them human rights, what then? How can we live symbiotically with machines which can outperform us in every way? The economics would not be stable, as we would have nothing to offer machines that they couldn't get themselves 100x faster. Symbiotic relationships only exist if both parties have something to offer in return; otherwise it would be a parasitic relationship. The A.I. could evolve much more rapidly and design better versions of themselves, something we can't do.
0
The ethical implications would be immense as we would have to question whether A.I. deserves the same rights humans receive. Based on this, should such development of A.I. be prohibited, crippled, or allowed? If allowed, should A.I. receive equal rights?
Why should an AI be given rights? You're assuming that rights are given because something is sentient or sapient. If that were the case, then the comatose shouldn't have any rights. The ethical implications are the same as those of a computer or hammer -- it's a tool to be used.
Also, restricting research because of "what might happen" stinks strongly of anti-intellectualism.
If the AI are basically people, then why shouldn't they be treated as people are?
You'd have to first successfully argue that AI are people.
0
Randumb wrote...
If the AI are basically people, then why shouldn't they be treated as people are?
Because they are machines, not people.
0
Maxiart wrote...
Randumb wrote...
If the AI are basically people, then why shouldn't they be treated as people are?
Because they are machines, not people.
Human emotions can be broken down to chemical reactions and hormone levels just as AI can be diminished to computing and mechanics.
0
mibuchiha
Fakku Elder
Randumb wrote...
Human emotions can be broken down to chemical reactions and hormone levels just as AI can be diminished to computing and mechanics.
This.
And yes, I do want AI to be real.
0
fatman wrote...
The ethical implications would be immense as we would have to question whether A.I. deserves the same rights humans receive. Based on this, should such development of A.I. be prohibited, crippled, or allowed? If allowed, should A.I. receive equal rights?
Why should an AI be given rights? You're assuming that rights are given because something is sentient or sapient. If that were the case, then the comatose shouldn't have any rights. The ethical implications are the same as those of a computer or hammer -- it's a tool to be used.
Also, restricting research because of "what might happen" stinks strongly of anti-intellectualism.
If rights aren't given based on sentience, then what would they be given based on? Also, the comatose are not comparable, because we're talking about rights applied to species and not to individual cases.
0
If rights aren't given based on sentience, then what would they be given based on?
http://en.wikipedia.org/wiki/History_of_human_rights
The idea of human rights, that is the notion that anyone has a set of inviolable rights simply on grounds of being human regardless of legal status, origin or conviction for crimes, emerges as an idea of Humanism in the Early Modern period and becomes a position in the 18th century Age of Enlightenment.
The basis of human rights = being human.
Computers = not human.
At best, AIs would be given animal rights.
0
@fatman
And why do you think A.I. would be undeserving of rights? It's not satisfactory to say that rights are limited to humans because humans have traditionally been the only ones to have them. "Because they are different" or "because they are less human" is the kind of thinking that kept slavery for 200 years. Ethically, why would it be right or wrong to give rights to A.I.?
0
And why do you think A.I. would be undeserving of rights?
Because they're not human. I thought I made that clear.
"Because they are different" or "because they are less human" is the kind of thinking that kept slavery for 200 years
Yes, that's exactly the point. Slaves (specifically, chattel slaves) were kept because they weren't considered human. Guess what? They are human. So that makes slavery wrong.
Computers are not human.
It's not satisfactory to say that rights are limited to humans because humans have traditionally only had rights.
This doesn't even make sense.
0
Basically what you are saying is, "Human rights are for humans because they are for humans."
http://www.urbandictionary.com/define.php?term=circular%20reasoning
I'll try to say this as clearly as possible: simply not being human does not make A.I. less deserving of rights unless you give justification, and tradition is not ethical justification.
If you want to get out of this circular fallacy, then give another reason human rights can't be applied to A.I.
0
It is dangerous to create something that functions better than you. This is one reason I fear the rapid advancement of science. To answer your question: no, I don't believe such things as A.I. should exist. This is because I am 99.9 percent sure that if they are created, the first thing they will be used for is war. And I know some of you are thinking, "ooh, he's one of those people." Let's be honest: everyone here has lived on this Earth long enough to know that people have these flaws called the seven deadly sins. So yes, there is a high chance they will be used for war. Also, I sure wouldn't like to cross paths with a machine with those flaws. Even if A.I. exists, I will be against them having the same rights as humans. This is because no matter how advanced they are, they are only mimicking humans. They're only actors who perform on the stage called Earth.