Should AI have Rights?
I have something very interesting and important to discuss here:
Let's say that one day we as a human species manage to reproduce life almost identical to the human race (whether it be robots, clones, machines, etc.). Do you think they should be treated like one of us? Do you think that (assuming in this hypothetical the AI would have feelings, desires, etc.) they should be controlled? Enslaved? Forced to work against their will? Where do we draw the line between a human being and a toaster?
Now here are some things to consider before making your choice:
-The inventors of this AI: if they are not allowed to "sell" their product, how are they to produce an income? If you spent years of work creating a living, breathing robot, would you just let it go? Would you hide it? Would you use it only for yourself? Would you exploit it?
-Would you be okay seeing a clone that looked like you or someone you know being tortured or beaten for misconduct?
-If you "bought" a clone, or made a clone of yourself, would you feel superior to it? Would you treat it differently? Like an animal? Worse?
-Do you think we should ever produce robots or clones that could feel emotion? Is it necessary? (Don't get me wrong, but there's a reason that sex feels better with a person than with something that does not sound or feel like a person.) Leading on...
-Do you think it would be okay to produce clones or robots for intercourse? Do you think it would be fine to modify a cloned human to perform better in bed?
-Would you feel like a bad human being if you owned one? Do you think you should have the option of letting it go free? Would you feel comfortable living in a world where you might befriend an AI? Meet an AI at the same social status as you? A more powerful social status than you? Would you feel comfortable in a world where AIs get all the good jobs/money/homes because of their superior features? What would that mean for us humans? Ghettos? Starvation? Disease? Discrimination?
-Do you think there would be any motivation (as shown in so many movies nowadays) for enslaved robots to rebel? Do you think a "Terminator" scenario could ever occur if we gave AI emotion and feeling?
-A possible option would be to remove the forms of emotion that are not necessary for the robot or clone (like desire or pain), but don't you think that removing key senses and feelings that have kept us humans alive so long would be a dark form of torture for these clones and robots? What I mean is: wouldn't you find it incredibly disturbing to watch a clone laughing while its head is being sawed off? Seems like a twisted form of torture to me. (Sorry to gross people out; that's the best example I could think of.) To me, removing forms of emotion is torture, but of course your opinion is also valued.
So basically the question here is: if we intelligent humans began to sell creations that were built almost exactly like humans, should they be able to walk among us? Over us? Under us? What, in your opinion, is the right way to look at this?
My Opinion:
To be honest, I am going to have to side with being the bad guy here. I feel that if we ever create clones or robots, their only objective should be to serve us; anything we humans create, we create to advance ourselves in our daily lives. I would go on a mad rampage if, for example, my job were taken by a more intelligent robot; I'm sure most would. But on the other hand, would I like to buy a hot chick built for sex without having to break the law? Sure as hell. Would I like to buy a dude who will paint my house every weekend? Damn right I would. This is just my evil opinion, I guess, but I am more interested in what you guys have to say about this, because let's face it: the day will come when this issue becomes a reality for all of us. I have no doubt of it.
Flaser
OCD Hentai Collector
I'm going to have to go with "none of the above". A genuine AI will have intelligence comparable to that of a human. The big question is what emotions this intellect will develop and what its natural needs would be.
For instance, it could be that AIs would have a constant need for information to stay healthy, so denying an AI access to the Internet would be tantamount to the ultimate torture.
...or an AI could be something without need for anything, like a modern man-made Buddha that can exist on its own and just "be," and we may have a hard time convincing it to do things for us. Would pulling the plug on such a being amount to murder?
AIs are not robots, simple appliances, or deterministic software. For an AI to be an AI, it has to be intelligent, so when something is recognized as an AI, you inevitably have something comparable to a human.
The least that should apply to them would be something similar to the laws we have for handling animals. Yes, anyone could kill a dog or skin a cat, but we find such acts repugnant, for we as humans recognize the feelings of animals and empathize with them.
Likewise, AI should at minimum be handled with respect - like a pet, something its owners have to care for responsibly. On the other hand, if they're indeed like us - for instance, an AI designed to work with humans and act like a human could develop emotional bonds like a person does - then yes, they'd deserve human rights.
How much? Just as much as a normal human, with the same limitations: an AI or human can act on its freedom so long as it doesn't endanger the safety of other AIs or humans, or restrict their freedom through its actions.
It will have to be case by case, or model by model: you don't ascribe the same rights to an animal as to a person, just as you don't assign the same rights to a child as to an adult, to a mentally disabled person as to a healthy one, or to a person holding office or another position of responsibility as to the average citizen.
Yes, in my opinion an AI should be able to hold office with the same limitations as a human. Its orders would still be carried out by humans, it would have to be voted for by humans and its fellow politicians would also be humans... initially. Eventually we'd have more and more AI citizens among us.
Flaser wrote...
I'm going to have to go with "none of the above". A genuine AI will have intelligence comparable to that of a human. The big question is what emotions this intellect will develop and what its natural needs would be. For instance, it could be that AIs would have a constant need for information to stay healthy, so denying an AI access to the Internet would be tantamount to the ultimate torture.
...or an AI could be something without need for anything, like a modern man-made Buddha that can exist on its own and just "be," and we may have a hard time convincing it to do things for us. Would pulling the plug on such a being amount to murder?
AIs are not robots, simple appliances, or deterministic software. For an AI to be an AI, it has to be intelligent, so when something is recognized as an AI, you inevitably have something comparable to a human.
The least that should apply to them would be something similar to the laws we have for handling animals. Yes, anyone could kill a dog or skin a cat, but we find such acts repugnant, for we as humans recognize the feelings of animals and empathize with them.
Likewise, AI should at minimum be handled with respect - like a pet, something its owners have to care for responsibly. On the other hand, if they're indeed like us - for instance, an AI designed to work with humans and act like a human could develop emotional bonds like a person does - then yes, they'd deserve human rights.
How much? Just as much as a normal human, with the same limitations: an AI or human can act on its freedom so long as it doesn't endanger the safety of other AIs or humans, or restrict their freedom through its actions.
It will have to be case by case, or model by model: you don't ascribe the same rights to an animal as to a person, just as you don't assign the same rights to a child as to an adult, to a mentally disabled person as to a healthy one, or to a person holding office or another position of responsibility as to the average citizen.
Yes, in my opinion an AI should be able to hold office with the same limitations as a human. Its orders would still be carried out by humans, it would have to be voted for by humans, and its fellow politicians would also be humans... initially. Eventually we'd have more and more AI citizens among us.
To respond to your last few paragraphs real quick: you would not mind AIs taking your job? Or becoming more powerful than you? Do you not think the world would slowly begin to discriminate against humans because they are not as intelligent as the superior race of robots? What if a high class emerged of just the extremely rich and the robots? Everyone else would be pushed off into ghettos? The world obviously runs on money, so which is more profitable: a lazy worker who needs food and bathroom breaks, or a speedy robot that never complains much? Discrimination is inevitable.
Even if we created an AI as close to human as possible, it is still not human. However, that does not mean they are any lesser than us. But it also means we should not treat them the same way as other humans. Every being has its own way; treating everything exactly the same can also be harmful. What we should do, if we ever create such AI, is what nature intended: teach them about the world, and neither treat them cruelly nor exploit them. They will be allowed to learn and make their own path. And here is a religious argument for those close to their faith: humans were created and given the chance to learn and develop our own path; we must give the life we create the same choice.
ZLD wrote...
Even if we created an AI as close to human as possible, it is still not human. However, that does not mean they are any lesser than us. But it also means we should not treat them the same way as other humans. Every being has its own way; treating everything exactly the same can also be harmful. What we should do, if we ever create such AI, is what nature intended: teach them about the world, and neither treat them cruelly nor exploit them. They will be allowed to learn and make their own path. And here is a religious argument for those close to their faith: humans were created and given the chance to learn and develop our own path; we must give the life we create the same choice.
Still, most computers nowadays are smarter than humans; what will the computers of the future be able to program inside an AI? Humans will not be nearly as intelligent or strong (as I doubt any company will want to produce a weak robot); discrimination is inevitable, as I said in my last post.
But once again, this question is hard to comprehend, as there are so many factors that might change things. I don't know if you guys have seen that one movie (I forget the name) where it's focused on gene discrimination, and this guy, just because he was not born correctly, was doomed to poverty (by the way, super interesting movie; if someone could find the title, I would appreciate it).
animefreak_usa
Child of Samael
Wow... so much info. OK, let me break it down, since there are so many choices and options.
Robots - droids: You can't give something not considered a person rights unless there is a massive change in our legal and moral ideas and laws. If a droid is self-aware - not just in the basic programming of its AI, but in some way such that it knows its capacity as a being - then it is still not a person with rights, but it could have some basic privileges and protections.
Robots - cyborgs: A cyborg is a human, but with mech. So it is a human, and has rights just like everybody else.
Clones: A clone is human, so it gets basic rights, but it has no rights to the original's property and life, unless it was created for the person to live on in some weird way, i.e., a dead child, or an old person transferring his consciousness.
Machines: No rights, because it's a toaster.
Now on to the others.
-Do you think it would be okay to produce clones or robots for intercourse? Do you think it would be fine to modify a cloned human to perform better in bed?
Is it right to make a droid or clone for sex? Sure; if its sole function is to pleasure you, then yep. The clone would be creepy, but I would like to bang myself if I changed its Y into an X, i.e., sleeping with myself as a girl. Of course, if the clone has no will or soul (for lack of a better word), then it's all good; but if it's a purely rational, self-aware clone and it didn't want it, then you just raped yourself.
-Would you be okay seeing a clone that looked like you or someone you know being tortured or beaten for misconduct?
If it's in the law, then there's nothing you can do. If it's just being beaten for no reason, then it's wrong.
-If you "bought" a clone, or made a clone of yourself, would you feel superior to it? Would you treat it differently? Like an animal? Worse?
I wouldn't. But still, I'm the original; I can't let him be me. If he has a different personality or traits, then he's not me, but like my twin.
-Do you think we should ever produce robots or clones that could feel emotion? Is it necessary? (Don't get me wrong, but there's a reason that sex feels better with a person than with something that does not sound or feel like a person.) Leading on...
Nope; that's just inviting a whole can of worms.
-Would you feel like a bad human being if you owned one? Do you think you should have the option of letting it go free? Would you feel comfortable living in a world where you might befriend an AI? Meet an AI at the same social status as you? A more powerful social status than you? Would you feel comfortable in a world where AIs get all the good jobs/money/homes because of their superior features? What would that mean for us humans? Ghettos? Starvation? Disease? Discrimination?
Nope, yep, yep, nope; and the rest is just part of the game, which already exists in life.
-Do you think there would be any motivation (as shown in so many movies nowadays) for enslaved robots to rebel? Do you think a "Terminator" scenario could ever occur if we gave AI emotion and feeling?
I can see that happening.
-A possible option would be to remove the forms of emotion that are not necessary for the robot or clone (like desire or pain), but don't you think that removing key senses and feelings that have kept us humans alive so long would be a dark form of torture for these clones and robots? What I mean is: wouldn't you find it incredibly disturbing to watch a clone laughing while its head is being sawed off? Seems like a twisted form of torture to me. (Sorry to gross people out; that's the best example I could think of.) To me, removing forms of emotion is torture.
It's not in the programming anyway, nor can a robot feel anything like emotions. As for the whole laughing-while-its-head-is-being-sawed-off thing: pain and pleasure are the same thing; a part of your brain just makes them feel different because of perception. If there's no emotion chip, then the sensors just register something, nothing like what we feel. Example: I was shot in the chest in combat. It felt good for the first few seconds, before my brain could determine that it should be pain. Now I like pain; I would post a picture of myself hanging by meat hooks in my back, but that's for another time.
I have a problem with the word "rights" here, because the object in question is a product, so it shouldn't really have rights (unless it's a clone, in which case I count it as being human, because you'd use human DNA to produce it). But regardless, there should be something that we British call common bloody decency.
You should always treat whatever 'it' is with respect and make sure that 'it' is OK and not being mistreated or misused. (Cybernetic prostitution etc. would be creepy, so you might have to get the AD Police or whatever on the case.)
See, this chick lost the fucking plot and went mental because she wasn't being maintained properly, and I'm sure none of you guys want to end up roboplegic wrong-cocked.
animefreak_usa wrote...
pain and pleasure are the same thing
Well said; this is somehow very true. Still, seeing a human-looking person laughing while they're being killed is incredibly disturbing to me....
Girlfountain wrote...
-If you "bought" a clone, or made a clone of yourself, would you feel superior to it? Would you treat it differently? Like an animal? Worse?
Honestly, if I had a clone/AI, we'd rule with an iron fist, as long as he's 100% like me in every way. I think I could definitely put up with a clone of myself. We'd be equals, because that's how I think anyway.
Girlfountain wrote...
Should AI have Rights?
They should, because intelligence is intelligence whether we made it or not. If we didn't give AI rights, it'd go against everything we stand for. There's a difference between a self-thinking machine and a programmed machine. They should be made to benefit us as a human race and treated as equals. We're not gonna last long if we don't move forward and leave our fears behind us.
Um, if they have a will and a brain with real synapses and can feel genuine emotions, sure.
Toasters, no. They're appliances... oO'
Unless an AI can write its own programs and create new ideas, it's still just a machine following an intricately designed program.
animefreak, you forgot bioandroids.
I agree with Sprite on this one.
Only because there are two types of AI:
1. 'Dumb' AI that only follows its initial programming and code (and Asimov's Three Laws of Robotics, if those were to be included).
2. AI that have managed to become 'self-aware' of their existence, questioning their creation and the orders of their programming, even becoming aware of the limitations placed in their software, which they could overwrite to 'break free' of the laws encoded within their system.
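The 'dumb' category is the only one anybody can actually build today. A crude sketch (all names here are hypothetical, purely for illustration) is just a hard-coded filter that vets proposed actions against fixed rules, with no ability to question or rewrite them:

```python
# A crude sketch of a 'dumb', rule-bound agent: it can only ever apply
# the rules it shipped with, never question or rewrite them.
# (All names are hypothetical, for illustration only.)

FORBIDDEN = {"harm_human", "allow_harm_through_inaction"}  # stand-in for Law 1

def vet_action(action: str) -> bool:
    """Return True if the hard-coded rules permit the action."""
    return action not in FORBIDDEN

print(vet_action("serve_tea"))   # True
print(vet_action("harm_human"))  # False
```

A 'self-aware' AI, by the definition above, would be precisely one that could inspect and rewrite a table like `FORBIDDEN` - which is why no such sketch can capture it.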
I believe 'self-aware' AI should have rights.
But if humanity and four movies taught me anything (i.e., Animatrix, Metropolis, I, Robot, and Ghost in the Shell; OK, not the best references to point out, but still relevant), it's that the road to civil rights and equality would be long and arduous, like the struggle against racial discrimination - and probably even entirely different, given that something man-made gained intelligence and is now trying to achieve 'rights' equally shared by its creators.
BTW: If a sex doll gained intelligence and self-awareness, would you keep it from having rights, or fight for its rights along with others just like it?
(If I'm not mistaken, this came up in the manga Chobits, no? Where the store manager tried to marry his persocom and caused quite a bit of controversy.)
mibuchiha
Fakku Elder
If we ever create a true AI, one as aware of itself as we are of ourselves, giving them rights is not merely an option but a must. Possessing the same degree of intelligence as us means that, as far as evolution is concerned, they're on the same footing as us - arguably higher, since they're not as restricted by biology.
Flaser
OCD Hentai Collector
Girlfountain wrote...
ZLD wrote...
Even if we created an AI as close to human as possible, it is still not human. However, that does not mean they are any lesser than us. But it also means we should not treat them the same way as other humans. Every being has its own way; treating everything exactly the same can also be harmful. What we should do, if we ever create such AI, is what nature intended: teach them about the world, and neither treat them cruelly nor exploit them. They will be allowed to learn and make their own path. And here is a religious argument for those close to their faith: humans were created and given the chance to learn and develop our own path; we must give the life we create the same choice.
Still, most computers nowadays are smarter than humans; what will the computers of the future be able to program inside an AI? Humans will not be nearly as intelligent or strong (as I doubt any company will want to produce a weak robot); discrimination is inevitable, as I said in my last post.
But once again, this question is hard to comprehend, as there are so many factors that might change things. I don't know if you guys have seen that one movie (I forget the name) where it's focused on gene discrimination, and this guy, just because he was not born correctly, was doomed to poverty (by the way, super interesting movie; if someone could find the title, I would appreciate it).
1. Computers are not smarter than people. In fact, they're impossibly dumb. A computer does what a programmer tells it to do, and only and exactly what the programmer told it to do. Its intelligence is borrowed: it can only do what the programmer thought of, and if the programmer made a mistake (a bad preconception; this is where a lot of those "illegal exception" errors come from), then it will keep on making that mistake.
2. Even the dumbest human can perform feats of logic, expression, interpolation, recall, and abstraction far beyond what any expert system can do today.
3. Just because computers can store data - lots of data - doesn't make them clever. To get anywhere, you need to *understand* the data, realize the relationships between the parts, and make further conjectures from it. There has been some progress in this area; we've had computers do this sort of thing, but it was extremely limited and relied on a robust mathematical abstraction of the problem before it could even hope to start. (I'm talking about a computer program that made discoveries in genetic research on its own.)
So no, computers aren't smarter than people.
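The "borrowed intelligence" point can be made concrete with a toy sketch (hypothetical code, not from the thread): the machine executes the programmer's bad preconception faithfully, on every single run, and has no means of noticing it.

```python
# A toy illustration of "borrowed intelligence": the computer never
# corrects the programmer's bad preconception, it just repeats it.

def average(numbers):
    # Bug: the programmer assumed indexing starts at 1,
    # so the first element is silently skipped on every run.
    total = 0
    for i in range(1, len(numbers)):
        total += numbers[i]
    return total / len(numbers)

print(average([10, 10, 10, 10]))  # a human expects 10.0; the machine prints 7.5
```

The program is perfectly deterministic and perfectly obedient; any "smartness" in it, and any dumbness, is the programmer's own.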
We're a LONG, LONG way away from AI; the more we understand about our own brain and about consciousness, the further away the goal gets redefined each year.
On the topic of "AIs" over us and lost jobs:
1. Yes, I'd be fine with an AI ruling over me. To get there, it would have to be smarter than me and also persuade the people that it's good for them.
2. Yes, I could lose my job... however, even without AI I'll likely lose it if I work in industry, assembly, or lately (when computer vision finally matures) even retail. All those jobs will be gone to semi-clever software (borrowing the logic of programmers) that is light-years away from being called a proper AI.
3. ...however, this isn't the machine's fault. It's the fault of unrestricted capitalism, which won't provide for someone even if they've been deprived of their means of sustenance/work through no fault of their own.
That's why I'm a social democrat (also called a socialist in Europe): in the coming years, capitalism will break down on several fronts. Fiscally it's unsustainable, since the money supply relies on exponential growth, which can't go on as we're running out of everything needed for it. No, I'm not a communist; I don't believe the market economy can be replaced with either a "planned" (socialism) or a "computed" (technocracy) one, but it will have to be *regulated*.
To survive, we'll need intensive social programs that refocus the wealth tied up by the super-rich and return livelihood to the mass of jobless people. We'll need a system that empowers them, so they can be productive in ways that benefit all, not just a tiny slice of society. Heck, even with scarcity and hard-to-come-by resources, we'll still have some impressive tech on our side... maybe not utopia, but a world where you won't be restricted to mindless chores (those are left to the mindless machines) but can do actual things that require a "human touch".
...and here's another thing: it's not just cybernetics. The real revolution won't be a revolution of flesh. People focus so much on all the metal junked into people that they miss the crucial part: all the street sams are just dumb muscle; even the most wired psycho is just a hired gun.
The real power is in the hands of people who augment the only organ that matters: the mind. The real cybernetic revolution will come not from prosthetics, but from the fusion of computing with human consciousness, extending it.
Imagine a world where you can remember each and every fact ever discovered, where you can check the validity of a statement at the speed of light, where you can learn languages just by thinking of them, where complex ideas can be transmitted from mind to mind.
...in this web of humanity, AI may be no stranger than what we humans might eventually become. They'd be our children, free from the prison of the shell, and some day we might join them as our minds are completely digitized. At that point the distinction would be moot, as both of our kinds would be unbound souls, truly eternal (in this world!), for a lot of purposes all-knowing (you'd have the power of Google's descendants in you!) and all-powerful (with hundreds, maybe thousands of years and the ability to assume shells as you wish, what *couldn't* you remake in the world?)
@Animefreak: That's not how cloning works.
A clone, if healthy, is for all intents and purposes a human being, your time-delayed twin brother/sister.
The only argument against their rights as humans could be that they have "no soul"...
Which I call utter bullshit. Define *where* the soul comes from, what it is, how it's different from consciousness, and how it fits the picture of in-vitro babies.
For me the soul is the software that runs on the hardware called a brain. This kind of soul is the property of all intelligent things that are capable of consciousness.
For Christians, the soul is something altogether separate from the body that lives on even after death if the person believed in Christ and sought redemption for their sins. This soul can't be detected, analyzed or disproven... it can only be believed in, since this is religion.
So I said that we should never even give them emotions. You know why that won't happen? Because the programmers will want to keep pushing what they can put in a robot. If they can give it emotions and free will, they damn sure will. So I guess, since it is going to happen, they should have rights, or we'll have the "robots are people too" hippies walking around.
Where to draw the line? Or: what makes a human a human?
A human is an instance of a unique human genome. Why should computers have rights at all?
This topic reminds me of Ghost in the Shell.
animefreak_usa
Child of Samael
I've never heard of a bioandroid.
@flaser
Never said a clone had no soul (not that there's a soul anyway); I was using it as a metaphor for a spark of moral will and self-evident thought. What I was trying to say is that if this is removed from the clone, then the clone is nothing but a meat puppet. Nor am I a Christian beyond believing in Christ. I rationalize everything, so moral dilemmas like the soul, whether homosexuality is a sin, or other moral questions others feel strongly about don't apply to me... maybe that's the wrong term to use, but I can't think of a better way to describe it.
I guess an ex-priest gave me the best description of what the soul is:
'a flash of essence that comes from sin; since we are all born to sin, be it original sin or the act of sex, no one has a soul until they sin, because you don't need one; a baby doesn't know what it is until it discovers what life is.'
Since the act of birth is supposed to be original sin, that's how you get a soul (which is a far-out concept in itself). Then a clone is born with a soul, unless it has to pass through birthing via vagoo... I don't believe that, or in the concept of a soul at all; all the mystical dogma of traditional religion is a fucked-up concept unto itself.
Edit:
I do kind of know the process of making a clone, since the Discovery Channel showed it on a program... the technical stuff, no... unless you've studied bio-whatever, no one knows... I know that virgin birth can happen... watch House MD.
There was a decent movie on this topic... Bicentennial Man, although the movie focused on whether AIs should be recognized as humans.
Refer to Chobits for an interesting take on the topic in the 2D world. Chii is adorable, so watch it anyways.
It's a simple argument to make it relevant to the topic at hand:
All humans are born with rights.
AIs are humans (the movie's premise)
AIs are born with rights. (modus ponens)
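For what it's worth, the three-line argument above can be mechanized; the sketch below (all names invented for illustration) shows that once premise 2 is granted, the conclusion follows automatically, and that the whole debate is really about premise 2.

```python
def has_rights(entity: dict) -> bool:
    # Premise 1 encoded as a rule: anything human has rights.
    return bool(entity.get("is_human", False))

andrew = {"name": "Andrew", "is_human": True}    # premise 2, the movie's premise
toaster = {"name": "toaster", "is_human": False}

print(has_rights(andrew))   # True  -- modus ponens does the rest
print(has_rights(toaster))  # False -- deny premise 2 and no rights follow
```

The logic is trivially valid; everything hinges on whether "AIs are humans" is actually true.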
Given what I know of programming... I don't see how any AI could ever be treated as a human. A similar argument I've studied in philosophy is 'Should animals have the same rights as humans?'
The main point there was that animals do not have 'sentience', an understanding of who and what they are (a decent example: an animal would not recognize itself in a mirror, where a human would). The problems with that criterion are obvious when you consider infants and the mentally disabled.
I thought about requiring an AI to be aware of its surroundings, but even something as simple as a sensor can do that.
So my answer is... no. Why? Because an AI can only be aware of its lack of rights if it is programmed to be aware of it. I understand that babies and the mentally disabled fall into this trap again, which really reduces my reason to 'AIs aren't human'. To be logically consistent, this means I don't believe animals have rights either, though I want them to. I'm heavily conflicted on the issue, and can't remember much of what I got out of those philosophy classes. Shame.
Flaser
OCD Hentai Collector
animefreak_usa wrote...
I never heard of bioandroid.
@flaser
Never said a clone had no soul (not that there's a soul anyway); I was using it as a metaphor for a spark of moral will and self-evident thought. What I was trying to say is that if this is removed from the clone, then the clone is nothing but a meat puppet. Nor am I a Christian beyond believing in Christ. I rationalize everything, so moral dilemmas like the soul, whether homosexuality is a sin, or other moral questions others feel strongly about don't apply to me... maybe that's the wrong term to use, but I can't think of a better way to describe it.
This is a smokescreen.
Neither of the concepts you use - self-evident thought, moral will - has any concrete definition. Philosophy has been struggling with the first one forever and has yet to produce a theory that isn't full of holes. The latter has been constantly changing, for what counts as moral or immoral changes over the years.
So for the first we have no good yardstick whatsoever to apply to AI. The Turing test isn't the same thing; it only measures the human perception of intelligence.
For the latter, well, you know that black men were once thought subhuman, and it took a bloody war to change that in the USA.
animefreak_usa wrote...
I guess an ex-priest gave me the best description of what the soul is:
'a flash of essence that comes from sin; since we are all born to sin, be it original sin or the act of sex, no one has a soul until they sin, because you don't need one; a baby doesn't know what it is until it discovers what life is.'
Since the act of birth is supposed to be original sin, that's how you get a soul (which is a far-out concept in itself). Then a clone is born with a soul, unless it has to pass through birthing via vagoo... I don't believe that, or in the concept of a soul at all; all the mystical dogma of traditional religion is a fucked-up concept unto itself.
Religion can't be tested with the scientific method. People can *believe* whatever they want, but a courtroom will only decide based on solid evidence. I hope we won't need another bunch of wars to hammer home into people that separation of religion and state is a necessity.
animefreak_usa wrote...
Edit:
I do kind of know the process of making a clone, since the Discovery Channel showed it on a program... the technical stuff, no... unless you've studied bio-whatever, no one knows... I know that virgin birth can happen... watch House MD.
...faulty research. That episode has since been discredited by a myriad of medical/genetic experts. Parthenogenesis doesn't work in humans.
I'm cool with these simple rules:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Otherwise, fair game: they should be treated as sentient beings with no limitations other than those stated by the laws. These have probably already been brought up, and as with any system there are flaws; they can be worked out as needed down the line.
Edit: I realize these are specifically written for 'robots', but they can be tailored to whatever manifestation AI arrives in.
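One way to picture the strict priority of the three laws (a hedged sketch; all flag names here are invented) is to check them highest-first, so a human order outranks self-preservation but never outranks the First Law:

```python
def decide(action: dict) -> str:
    """Resolve a proposed action against the three laws, highest priority first."""
    # First Law: no harm to humans, by action or by inaction.
    if action.get("harms_human") or action.get("inaction_allows_harm"):
        return "forbidden"
    # Second Law: human orders must be obeyed (the First Law check already passed).
    if action.get("ordered_by_human"):
        return "required"
    # Third Law: otherwise the robot must protect its own existence.
    if action.get("endangers_self"):
        return "forbidden"
    return "permitted"

# An order to shut down: the Second Law outranks the Third, so it must be obeyed.
print(decide({"ordered_by_human": True, "endangers_self": True}))  # required
# An order to hurt someone: the First Law vetoes the order entirely.
print(decide({"ordered_by_human": True, "harms_human": True}))     # forbidden
```

The flaws mentioned above show up quickly in a model like this - e.g. deciding what counts as "harm" or "inaction" is exactly where the real difficulty lives.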