Artificial Intelligence vs. Humans
Would you want Artificial Intelligence to become a reality?
0
A flawed creature creating something better than himself in every possible way? What could possibly go wrong?
0
GracefulDiscension. wrote...
A flawed creature creating something better than himself in every possible way? What could possibly go wrong?
The creature will be treated like an animal by his own creation.
0
the unknown wrote...
GracefulDiscension. wrote...
A flawed creature creating something better than himself in every possible way? What could possibly go wrong?
The creature will be treated like an animal by his own creation.
You must not have noticed the sarcastic slant in my words.
0
GracefulDiscension. wrote...
the unknown wrote...
GracefulDiscension. wrote...
A flawed creature creating something better than himself in every possible way? What could possibly go wrong?
The creature will be treated like an animal by his own creation.
You must not have noticed the sarcastic slant in my words.
I guess you are good at sarcasm then.
0
I don't think there would really be any problems with them trying to take over; I mean, a lot of people are already slaves to their computers...
0
Meh. If I can treat someone of my own species, who shares the same blood as me, like a speck of dirt, why can't I do the same to a piece of metal created to fulfill some need? While I do wish for the creation of AIs in the future, give them some purpose for being there, whether it be resource collection, construction, war, etc. Only create some image of the true depth of a human soul if you intend to manipulate, deceive, and use it until it has no worth.
0
There's a specific reason I want artificial intelligence: I'm an egotistical bastard, and I want my decision-making process and all the knowledge I've acquired to be programmed into a computer within a prosthetic body that will perfectly emulate myself. Then, if it truly is another version of myself, it'll realize the redundancy of having multiple versions of itself in the world and end me. However, I'd live on forever: although the current me may be a transient mind that cannot remain here forever, an artificial model of myself will never age and, assuming timely upgrades, will never run out of memory or fall behind the curve. It will also be able to surpass biological limitations that humans simply can't. I'd become immortal, in a sense.
0
devsonfire
3,000,000th Poster
The problem is about their rights, not about the AI itself. Some people might think that AIs do not deserve to have rights like humans because they are made by humans, while others agree that they should have the same rights as us. If AIs existed, I personally would treat them just like other humans.
0
Think about it this way. We ourselves are an example of the Earth's greatest R&D in the development of AI. We are a collection of cells all working in unison that together make a single conscious "us". If you keep up with the latest news in neurology, you'll find that our brain sends thousands of signals that constantly compete with each other for control; the temporary king of the hill becomes whatever we consciously perceive. I know I've rambled a bit, but to get to the point: if we do develop AI, we should give it equal rights. If anyone is interested in learning more, check out Dr. Ramachandran (don't know if it's spelled correctly) or TEDTalks (YouTube) for a trove of interesting and easily digestible videos.
0
With all the technological advancements now, it's not impossible for A.I. to top humans one day. I like robots and technology, but not to the extent that it would interfere with our daily functioning or somewhat take our place in this world.
And I agree that we are Earth's greatest creature. Science is supposed to help people, not to dominate them someday!
0
I think that A.I. would have their own, separate set of rights. Eventually, even if they were considered mere tools from the beginning, they would become dissatisfied with their situation and demand rights, whether or not we want to grant them. This assumes, of course, that they would have the same sort of emotion and rationale as humans, which they wouldn't. There's nothing to say that A.I. cannot be intelligent and emotional in a way that is totally alien or incomprehensible to humans, which is most likely going to be the case if it ever happens. Computers cannot be made to operate on irrational and illogical functions and processes the way humans can.
That said, A.I. is NOT the "next evolution" of humanity. Transferring one's consciousness into a machine would not allow that person to live forever. They would die as soon as they were transferred, and an exact replica of them would live on. It would be like creating a clone of someone for when they die with the same memories and skills, and thinking that it's the exact same person in another body. The person being transferred would simply die, and the new incarnation would be thrust into a world with a preset of memories, experiences, and skills to help them along. Even if the A.I. feels as if it is the same person, it is not.
However, there is no way for me - or anyone else - to know this for sure. Science has consistently done the impossible. But, with the information anyone has now, that is what I think.
0
Loner
the People's Senpai
I'm all for advancing our understanding of technology, but I think AIs would probably be a bad thing. We humans already kill ourselves with machines we control, like guns and bombs; imagine the repercussions of having machines that we can't control. I think we should make more advances toward understanding ourselves and how our own brains work before we start making artificial ones.
0
I love the idea of A.I., and we already have a near-perfect system to prevent issues, thanks to Isaac Asimov. However, I do think the Three Laws would only be truly applicable to lesser A.I.; once you get up to and above human-level intelligence, I don't believe they would be necessary.
0
Vivaldiren wrote...
I love the idea of A.I., and we already have a near-perfect system to prevent issues, thanks to Isaac Asimov. However, I do think the Three Laws would only be truly applicable to lesser A.I.; once you get up to and above human-level intelligence, I don't believe they would be necessary.
Any hacker could probably rip the Three Laws out of a robot.
0
Daedalus_ wrote...
Vivaldiren wrote...
I love the idea of A.I., and we already have a near-perfect system to prevent issues, thanks to Isaac Asimov. However, I do think the Three Laws would only be truly applicable to lesser A.I.; once you get up to and above human-level intelligence, I don't believe they would be necessary.
Any hacker could probably rip the Three Laws out of a robot.
Yes, that is a valid issue; however, it exists with every military attempt to automate the art of war in some way. Look at the U.S. drone that Iran has. It's going to happen either way.
0
One way or another, the creators of such an AI will always put a handicap on it so that it can never surpass us, leaving it forever a tool. There's too much public fear that AI will truly overtake the human species for them not to do so.
0
Damoz
~Not A User~
There is one major flaw with AI, no matter how complex the programming: a program is still a program and must work within certain parameters. This in turn means that it can never develop an identity of its own, so to speak; that is purely the realm of sci-fi. That said... humans are not so different from A.I. anyway. We still have to conform to the electrical signals that form our thoughts, and as such we are no different from the things we call tools. The only way I see that any sort of A.I. could be created is the download and duplication of human brainwaves, but in this day and age that is unrealistic, if not dangerous.
I like the idea of A.I., but I don't like the obvious ramifications of it. Human superiority and all that would just cause war, or possibly the extinction of the human race.
Also, for argument's sake, if the A.I. were given parameters that stopped it from ever surpassing us, who is to say that it won't just adapt and rewrite what it will see as a flaw, thus rendering the constraints useless?