Artificial Intelligence vs. Humans
Would you want Artificial Intelligence to become a reality?
0
Krynin wrote...
I'm really scratching my head over why people think A.I. would take over, war with, or otherwise destroy or enslave the human race. Seriously, why do people keep thinking that? Why does anyone even think it is a danger?
You don't know that it isn't a possibility.
0
I've kinda taken this stand on it: if they've come to that degree of evolution, they deserve their rights. But if they assume they're better than us because we have inferior memory... it's on like Donkey Kong!
0
Daedalus_ wrote...
Krynin wrote...
I'm really scratching my head over why people think A.I. would take over, war with, or otherwise destroy or enslave the human race. Seriously, why do people keep thinking that? Why does anyone even think it is a danger?
You don't know that it isn't a possibility.
I'm not saying it isn't a possibility; I'm just confused as to why most people seem to think it is a certainty and a problem that needs to be addressed if A.I. were ever to become reality.
0
I believe AI is a possibility, but @Damoz, the greatest and scariest thing about sentient AI is its ability to rewrite its own base code. That leads to the problem of the Three Laws: the AI could simply erase them, since there's no way to hardcode something so that it can't be changed, unless the hardcoded part is written in a completely different language with no roots in English or any existing language, and even then an AI could decode that language sooner or later.
0
ShinigamiAzrael wrote...
I believe AI is a possibility, but @Damoz, the greatest and scariest thing about sentient AI is its ability to rewrite its own base code. That leads to the problem of the Three Laws: the AI could simply erase them, since there's no way to hardcode something so that it can't be changed, unless the hardcoded part is written in a completely different language with no roots in English or any existing language, and even then an AI could decode that language sooner or later.
Or make it a hardware thing, a buffer of some kind. Besides, I'm pretty certain the whole "Three Laws" thing is mostly science fiction; I don't think they'd apply to real-life robotics. I could be wrong, though.
0
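A toy sketch of the point above (the names are made up and the example is deliberately oversimplified): a "law" stored in data the agent itself can rewrite is only as durable as the agent's willingness to keep it, whereas a check enforced outside the agent's code, the "hardware buffer" idea, survives no matter what the agent does to its own rules.

```python
# Toy sketch only; the class and function names are hypothetical.

class SelfModifyingAgent:
    def __init__(self):
        # The "laws" live in ordinary, writable data that the agent controls.
        self.laws = {"no_harm"}

    def rewrite_own_rules(self):
        # Nothing in software stops the agent from dropping its own constraints.
        self.laws.clear()

    def approves(self, action):
        # Internal check: only as strong as the rules the agent still keeps.
        return "harm" not in action or "no_harm" not in self.laws


def external_interlock(action):
    # Stand-in for a hardware buffer: a check the agent's code cannot rewrite.
    return "harm" not in action


agent = SelfModifyingAgent()
print(agent.approves("harm humans"))      # False -> internal rule still present
agent.rewrite_own_rules()                 # the software "law" is now gone
print(agent.approves("harm humans"))      # True  -> internal rule erased
print(external_interlock("harm humans"))  # False -> external check still holds
```

This obviously proves nothing about real AI safety; it just illustrates why "hardcoding" in software alone buys very little once the software can edit itself.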
If AI surpassing us is a problem, then we'll just have to evolve into something that far surpasses them. Though whether the development of human-like AI arrives first or our own evolution does is another question entirely. A millennium is too little time for a species to evolve into something new, but AIs? I don't know; maybe 50 years or a century, given the current pace our technology is advancing.
0
itsme123 wrote...
If AI surpassing us is a problem, then we'll just have to evolve into something that far surpasses them. Though whether the development of human-like AI arrives first or our own evolution does is another question entirely. A millennium is too little time for a species to evolve into something new, but AIs? I don't know; maybe 50 years or a century, given the current pace our technology is advancing.
0
To be honest, it'll be an interesting event, provided an AI could reach human standards or even surpass them.
0
Koyori wrote...
itsme123 wrote...
If AI surpassing us is a problem, then we'll just have to evolve into something that far surpasses them. Though whether the development of human-like AI arrives first or our own evolution does is another question entirely. A millennium is too little time for a species to evolve into something new, but AIs? I don't know; maybe 50 years or a century, given the current pace our technology is advancing.
Well, I won't say it stopped. Maybe it's just at a pace where we don't notice it. We'll just have to think about it this way: before the invention of computers and the internet there were no hackers, programmers, or IT specialists. Before the invention of aircraft there were no pilots. And so on and so forth.
But what I'd really like to point out is that if humans were ever to create AIs that were very human-like, it would of course be chaotic at first (with laws and such), but we would be able to adapt, and maybe even new kinds of individuals would emerge in that age; as to what kind, I don't know.
We don't need to be afraid that AIs would surpass us; instead, humanity should take that as an impetus to evolve further. We'll just have to have faith in future humanity to pull through.
Besides, wouldn't it be very interesting to live in a world like that?
Edit: I voted yes on the poll, by the way.
0
I long for the day that machines can coexist with humans as sentient beings. The entire world would be revolutionized. It'd probably start in Japan.
Persocoms!
0
I'd prefer a free-talking android rather than a pre-determined one, but that requires an AI. It would also help mankind in the long run: for example, when discovering an alien planet far away from home, the first thing to do would be to inform the main base, but that would waste a lot of time, so in such a case an AI would be a wise decision.
0
Chances are yes, and it would most likely be connected to some server instead of having the entire AI on your machine.
0
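On that server-side point, here's a minimal sketch of what such a thin client could look like; the endpoint, payload shape, and the ask_remote_ai name are all hypothetical, not any real service.

```python
# Minimal sketch of the "AI lives on a server" idea; URL and JSON fields are hypothetical.
import json
import urllib.request


def ask_remote_ai(prompt, server_url="http://ai.example.com/query"):
    # Package the question and send it to the machine that actually hosts the AI.
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    request = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The local machine only waits for an answer; no model runs here.
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))["answer"]


# Example (would only work against a real server at that address):
# print(ask_remote_ai("Will you enslave humanity?"))
```

The upside is that your own machine stays cheap and the heavy lifting happens in one place; the obvious downside is that the AI goes away the moment your connection does.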
If an intelligent machine existed, it would try to destroy Mankind because Mankind is a liability.
I don't want to die, so I don't want A.I.
0
Gravity cat
Koyori wrote...
itsme123 wrote...
If AI surpassing us is a problem, then we'll just have to evolve into something that far surpasses them. Though whether the development of human-like AI arrives first or our own evolution does is another question entirely. A millennium is too little time for a species to evolve into something new, but AIs? I don't know; maybe 50 years or a century, given the current pace our technology is advancing.
From my take on the word evolution, evolving isn't just physical; it also refers to intelligence. Our development of technology over the past 20 years proves that, although admittedly it's mostly for convenience rather than anything notably helpful, like a machine that can cure cancer or the flying cars people still appear to want.
On that basis, I don't think we've stopped evolving. Plus, the evolution process is really, really slow, and since the development of medicine it's going to be even slower, because we allow the weak-bodied to live, essentially telling natural selection it's no longer welcome to stay at your house. Before the invention of medicine they would have died, and only the strongest survived.
0
I would treat an artificial intelligence the same as any organic intelligence. In my view, both are machines, one artificial and one organic. I would treat an AI that is intellectually and emotionally on par with a human as a human, and likewise an AI on par with a chimp as a chimp.
0
Gravity cat wrote...
On that basis, I don't think we've stopped evolving. Plus, the evolution process is really, really slow, and since the development of medicine it's going to be even slower, because we allow the weak-bodied to live, essentially telling natural selection it's no longer welcome to stay at your house. Before the invention of medicine they would have died, and only the strongest survived.
There's also the added effect of a larger gene pool, with more mutations being allowed to live, so if there's ever a mass-extinction disease we'd still be favored to survive it as a species.
We are still evolving anyway. Evidence suggests that our brains are becoming more efficient and shrinking in size, by roughly the size of a tennis ball over the last 20,000 years.
0
Yes, because not only would they be able to observe everything IRL, they'd also have a chance of finding out new things that a human would overlook. I also want my A.I. girlfriend.
0
That is a really good question. In my opinion, I would like AI to become real in the near or distant future, but with some limitations. Think about it: the human mind has its limitations, while for an AI the only limitation is current technology, and technology grows faster and faster every day. Maybe one day an AI with sufficient intelligence will go mad and attack its creators (personally I DON'T LIKE THAT XD), like in the movies "Terminator" or "I, Robot", but I think it would be a looooong way off before that could happen, and thank goodness, because topics like this scare me a bit (I'm studying and I love programming).