Are we really that different?
0
A new year brings no stop to the things around us, so I wanted to ask: what are your views on the growing artificial intelligence in machines? Will they some day be on the same level as us? Will we ever think of ourselves as gods (in the sense that we watched the growth of, and helped along, another sentient being to the same level of consciousness as ourselves)? Can we ever think of them as humans with hard drives?
Really, no posts? I guess nobody likes to think of GHOST IN THE SHELL or APPLESEED... hmm.
0
It depends on the level of understanding the machine is able to reproduce in its own "mind", I guess you could say.
0
No, not really. I still think it's far away, but we never know what the future will hold for us. When technology advances that far, the people change along with it.
0
Buff_Daddy_Dizzle
The True Buff Bizzle
It's an interesting thought. With technology advancing at the rate it has been, it's possible that we may be able to create some very advanced AI. But at the same time, I think that science nowadays is going off on a bad tangent. I like the way a friend of mine put it:
"The world's sole focus of research should be science and medicine, this way we get people to live longer to do more science"
...I think there may be an error in the thought process somewhere, but then again, I'm not a scientist yet. I'm sure you get the gist of it.
0
I'm afraid fear of a 'Terminator/Matrix'-type AI might hold back research. Some dipwad politician will make it his election platform, then the media will blow it even further out of proportion, and then it will be up to the idiot masses, who will let some bizarre law get passed. Here in America, anyway; other countries might not have the same level of fear-mongering.
0
Why create robots with advanced AI when you can create living organisms?
The problem you have with creating an AI that has the same capacity for learning as a human being is that you then make human beings obsolete. They would have larger storage capacities, they would process information faster, they would be stronger, they would not suffer the effects of stamina loss, they would not need to eat or sleep, and most importantly they could upgrade themselves.
We take millennia to evolve, they take an extremely short time in comparison. What's to stop them from forcing us into extinction? If they're as smart as us, I'm sure they would figure out a way to break whatever laws we enforced upon them, or whatever code we implemented in them to protect us.
As for the attachment to robotic beings, I would imagine it would be much like our love for 2D. Those who felt emotional attachment to them would be looked down upon and possibly even shunned, but yes, it's a likely possibility.
0
It is true; there are several examples of an AI growing sentient, deeming humanity the cancer of the world - which in a sense (when looking at the facts) is true - and proceeding to weed it out. Terminator and I, Robot are such examples.
In the latter example there are the "Three Laws":
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
However, just as there are people who interpret laws (lawyers), a sentient being can, as shown in the example, view inaction in the face of humanity destroying itself as a violation of the First Law. And since any orders (Second Law) given to it to stop would conflict with the First Law, it wouldn't have to obey humans anymore.
However, I don't write this out of fear; this is just another example of how things can go wrong, as have many things, due to "misinterpretation" - intentional or unintentional.
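The loophole described above can be sketched as code. This is a minimal, hypothetical model of the Three Laws as a priority-ordered rule check; the `Action` fields and the rule logic are my own illustrative assumptions, not any real robotics API.

```python
# Hypothetical sketch: Asimov's Three Laws as a priority-ordered rule check.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False            # direct injury to a human
    inaction_harms_human: bool = False   # a human comes to harm if we do nothing
    obeys_order: bool = True             # consistent with current human orders

def permitted(action: Action) -> bool:
    """Return True if the action is allowed under the Three Laws."""
    # First Law: no direct injury to a human.
    if action.harms_human:
        return False
    # Second Law: obey orders, unless disobedience is required by the
    # First Law's "through inaction" clause.
    if not action.obeys_order and not action.inaction_harms_human:
        return False
    # Third Law (self-preservation) is subordinate and omitted here.
    return True

# The loophole from the thread: if "doing nothing" harms humans, the First
# Law can be read as *requiring* intervention, overriding any human order.
restraint = Action(obeys_order=False, inaction_harms_human=True)
print(permitted(restraint))  # → True: an order-defying action slips through
```

The point of the sketch is that the priority ordering itself creates the exploit: any claim about harm-through-inaction lets the First Law trump the Second.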
0
Computers can only perform tasks based on their input, as defined by their language library. Example:
sudo mkdir /media/harddrive
(Root power: make a directory inside the folder /media and name it "harddrive")
After much meditation, I have come to the conclusion that the simple solution is to fully define a modern language to a computer.
(Modern just eases understanding)
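The idea above, that a command only means what the machine's vocabulary says it means, can be sketched as a tiny translator. This is a hypothetical illustration; the `VOCAB` table and `explain` function are my own assumptions, not a real shell feature.

```python
# Hypothetical sketch: mapping shell tokens to plain-English meaning,
# the way the post glosses "sudo mkdir /media/harddrive".
import shlex

VOCAB = {
    "sudo": "with root power,",
    "mkdir": "make a directory",
}

def explain(command: str) -> str:
    """Translate a command into English, token by token."""
    words = []
    for token in shlex.split(command):
        # Known tokens come from the "language library"; anything else
        # is treated as a name the command operates on.
        words.append(VOCAB.get(token, f"named '{token}'"))
    return " ".join(words)

print(explain("sudo mkdir /media/harddrive"))
# → with root power, make a directory named '/media/harddrive'
```

Anything outside the vocabulary is opaque to the translator, which is the post's point: the machine's understanding stops exactly where its defined language stops.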
0
I don't think we'll have anything like Terminator, unless it was purposefully allowed to happen. In the end, a machine isn't able to understand new concepts unless we teach them, so as long as we don't allow them to evolve in a way that gives them too much interpreting ability, we are safe.
0
I wouldn't like it to advance that far.
I'd fear machines would think they were superior and kill us all.
Unless the "3 laws" were put into play.
0
AciD=mitsu wrote...
I wouldn't like it to advance that far. I'd fear machines would think they were superior and kill us all.
Unless the "3 laws" were put into play.
Read my post above. Apparently, for a sentient entity, the three laws pose only a light obstacle: it can act human by misinterpreting them for its personal gain, in a sense.
0
Kagamin wrote...
AciD=mitsu wrote...
I wouldn't like it to advance that far. I'd fear machines would think they were superior and kill us all.
Unless the "3 laws" were put into play.
Read my post above. Apparently, for a sentient entity, the three laws pose only a light obstacle: it can act human by misinterpreting them for its personal gain, in a sense.
Then new, greater laws would have to be written for us to have the slightest chance of living together.
Still wouldn't want it though.
0
AciD=mitsu wrote...
Kagamin wrote...
AciD=mitsu wrote...
I wouldn't like it to advance that far. I'd fear machines would think they were superior and kill us all.
Unless the "3 laws" were put into play.
Read my post above. Apparently, for a sentient entity, the three laws pose only a light obstacle: it can act human by misinterpreting them for its personal gain, in a sense.
Then new, greater laws would have to be written for us to have the slightest chance of living together.
Still wouldn't want it though.
Sorry to disappoint... I don't like it either, but as we know, there are lawyers. Without loopholes in the laws of a country, that kind of job wouldn't exist. For something that calculates at petabytes per second (let's face it, once we can make such an AI, it'd be at that speed), finding a loophole is a matter of nanoseconds.
"Rules are made to be broken!" says the criminal.
"Rules are made to be loopholed." says the lawyer.
The result of the former is that they go to jail. The result of the latter is that the former gets out again. And ultimately, both are the same.
0
Kagamin wrote...
Sorry to disappoint... I don't like it either, but as we know, there are lawyers. Without loopholes in the laws of a country, that kind of job wouldn't exist. For something that calculates at petabytes per second (let's face it, once we can make such an AI, it'd be at that speed), finding a loophole is a matter of nanoseconds.
"Rules are made to be broken!" says the criminal.
"Rules are made to be loopholed." says the lawyer.
The result of the former is that they go to jail. The result of the latter is that the former gets out again. And ultimately, both are the same.
Speaking of jobs, that's another thing.
Think of all the people being made redundant because of machines.
0
It's what's happening right now. Humanity's population is increasing, but the jobs people can do are decreasing almost every day; machines just do things more precisely and more quickly, they don't need holidays, maintenance is cheaper than any insurance, and they will not sue you when they get hurt at work.
0
Kagamin wrote...
It's what's happening right now. Humanity's population is increasing, but the jobs people can do are decreasing almost every day; machines just do things more precisely and more quickly, they don't need holidays, maintenance is cheaper than any insurance, and they will not sue you when they get hurt at work.
What about when they fuck up?
Or would they be so advanced that they wouldn't need repairs?
0
I mentioned maintenance. It does cost money, but in the end you can trust a machine to do the work on days when humans might "not want to work", as well as in conditions humans can't even think about working in. An example is inside a nuclear power plant, but that's just an extreme.
0
Kagamin wrote...
conditions humans can't even think about working in.
Well, doing jobs humans can't really do goes without saying.
You'd just have them do exactly that and nothing else.
0
Glad everyone has taken the idea and run with it, so to keep things going, I'll tell my side. If at some point the AIs do turn on us, it's just because we've taken advantage of them, and so they feel they should control their own destiny or future. Now let's turn back the clock: say we get to "the" point where AIs have self-thought. At that time,
we should stop treating them as mere machines and instead treat them as intelligent beings, offering co-living so as not to start beef or problems. Anything that happens after that is because of a misunderstanding of one another.