Author Topic: Professor Stephen Hawking says humans will be wiped out in 1,000 years unless we find new planet  (Read 4749 times)

0 Members and 1 Guest are viewing this topic.

Offline InHeavenThereIsNoBeer

  • Hero Member
  • *****
  • Posts: 4,127
No one has been able to satisfactorily explain why super intelligent computers or robots would have any interest in destroying us.

https://en.wikipedia.org/wiki/Colossus:_The_Forbin_Project
My avatar shows the national debt in stacks of $100 bills.  If you look very closely under the crane you can see the Statue of Liberty.

Offline kevindavis007

  • Hero Member
  • *****
  • Posts: 12,413
  • Gender: Male
I'm all for exploring space and spreading across the universe but predicting that mankind will be extinct in 1000 years is foolish.


 :amen:
Join The Reagan Caucus: https://reagancaucus.org/

Offline Ghost Bear

  • Hero Member
  • *****
  • Posts: 3,417
  • Gender: Male
  • Not an actual picture of me
I see no downside to AI and I don't buy the sci fi horror scenarios. No one has been able to satisfactorily explain why super intelligent computers or robots would have any interest in destroying us. Short of instilling emotions in  them there just is no driving force for them to want to destroy us. Emotional machines aren't even on the horizon. Without emotion there is no desire for freedom or lust for what humans have or can do.

While that is true, the machines would also have no interest in not destroying us.  They would have no morals whatsoever, and could decide that mankind was getting in the way of their plans and that it would be more efficient to get rid of us....
Let it burn.

Offline kevindavis007

  • Hero Member
  • *****
  • Posts: 12,413
  • Gender: Male
I see no downside to AI and I don't buy the sci fi horror scenarios. No one has been able to satisfactorily explain why super intelligent computers or robots would have any interest in destroying us. Short of instilling emotions in  them there just is no driving force for them to want to destroy us. Emotional machines aren't even on the horizon. Without emotion there is no desire for freedom or lust for what humans have or can do.


Unless they have a plan ;)
Join The Reagan Caucus: https://reagancaucus.org/

Offline jmyrlefuller

  • J. Myrle Fuller
  • Cat Mod
  • *****
  • Posts: 22,370
  • Gender: Male
  • Realistic nihilist
    • Fullervision
Another effect of "global warming": it turns brilliant minds into mush.
Considering how long he's had ALS, the neurons in his head were eventually going to start being affected.
New profile picture in honor of Public Domain Day 2024

Offline jmyrlefuller

  • J. Myrle Fuller
  • Cat Mod
  • *****
  • Posts: 22,370
  • Gender: Male
  • Realistic nihilist
    • Fullervision
I see no downside to AI and I don't buy the sci fi horror scenarios. No one has been able to satisfactorily explain why super intelligent computers or robots would have any interest in destroying us. Short of instilling emotions in  them there just is no driving force for them to want to destroy us. Emotional machines aren't even on the horizon. Without emotion there is no desire for freedom or lust for what humans have or can do.
Au contraire.

Consider the scenario proposed in The Day the Earth Stood Still: if a race of machines totally based on logic and lacking emotions comes to this conclusion:

If humans are destroyed, Earth still survives unharmed. If humans destroy Earth's habitability, they still die. Therefore it is more logical to destroy humans so that Earth survives.

...then they'll turn on us on a dime. That's not emotion, that's lack of emotion.
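
A rough toy sketch (my own illustration, not anything from the film) of how "logic without emotion" plays out once the objective simply leaves humans out of the score:

# Hypothetical example: an agent that scores outcomes only on whether Earth's
# biosphere survives. Humans never appear in the objective, so removing them
# becomes the "rational" choice the moment they look like a threat to that goal.

def utility(outcome):
    # The only thing this agent values is the planet surviving.
    return 1.0 if outcome["earth_survives"] else 0.0

# Premises as stated above: left alone, humans wreck habitability.
outcomes = {
    "do_nothing":    {"earth_survives": False},
    "remove_humans": {"earth_survives": True},
}

best_action = max(outcomes, key=lambda name: utility(outcomes[name]))
print(best_action)  # -> "remove_humans", with no malice or emotion involved

There's no hatred anywhere in that loop; the missing term for human survival does all the work.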
New profile picture in honor of Public Domain Day 2024

Offline Gefn

  • "And though she be but little she is fierce"-Shakespeare
  • Cat Mod
  • *****
  • Posts: 18,361
  • Gender: Female
  • Quos Deus Vult Perdere Prius Dementat
Didn't Dr. Hawking say the same thing around the turn of the century?

Personally, I think with all the election talk and whatnot we may wipe each other out way before that.

(I'm in a foul mood reading about special snowflakes this morning with my coffee. I apologize for not being my sunny self)
G-d bless America. G-d bless us all                                 

Adopt a puppy or kitty from your local shelter
Or an older dog or cat. They're true love❤️

Offline Cripplecreek

  • Hero Member
  • *****
  • Posts: 12,718
  • Gender: Male
  • Constitutional Extremist

Unless they have a plan ;)

My toaster is plotting against me with the fridge.

I hadn't really thought about it before, but Star Trek didn't portray any AI until TNG showed up with Data, who was completely harmless due to his/its lack of emotion. The only time Data was ever a threat was when he was under the control of an outside entity. Data could kill in defense of others but would not kill in his own defense.

Data's "brother" Lore, on the other hand, had the emotion chip and was capable of malevolent intent.

Offline Cripplecreek

  • Hero Member
  • *****
  • Posts: 12,718
  • Gender: Male
  • Constitutional Extremist
Au contraire.

Consider the scenario proposed in The Day the Earth Stood Still: if a race of machines totally based on logic and lacking emotions comes to this conclusion:

If humans are destroyed, Earth still survives unharmed. If humans destroy Earth's habitability, they still die. Therefore it is more logical to destroy humans so that Earth survives.

...then they'll turn on us on a dime. That's not emotion, that's lack of emotion.

Why would they care if the earth survives?

Offline Gefn

  • "And though she be but little she is fierce"-Shakespeare
  • Cat Mod
  • *****
  • Posts: 18,361
  • Gender: Female
  • Quos Deus Vult Perdere Prius Dementat
My toaster is plotting against me with the fridge.

I hadn't really thought about it before, but Star Trek didn't portray any AI until TNG showed up with Data, who was completely harmless due to his/its lack of emotion. The only time Data was ever a threat was when he was under the control of an outside entity. Data could kill in defense of others but would not kill in his own defense.

Data's "brother" Lore, on the other hand, had the emotion chip and was capable of malevolent intent.

Is your toaster "Talkie Toaster"?

I loved Data. Cat guy and genius.
« Last Edit: November 17, 2016, 12:39:29 pm by Freya »
G-d bless America. G-d bless us all                                 

Adopt a puppy or kitty from your local shelter
Or an older dog or cat. They're true love❤️

Offline thackney

  • Hero Member
  • *****
  • Posts: 12,267
  • Gender: Male
I hadn't really thought about it before but Star Trek didn't portray any AI until TNG showed up...

"The Ultimate Computer" is a season two episode of the original science fiction television series, Star Trek, first broadcast on March 8, 1968, and repeated June 28, 1968. It is episode No. 53, production No. 53, written by D.C. Fontana, based on a story by Laurence N. Wolfe, and directed by John Meredyth Lucas.

In this episode, a skeleton Enterprise crew are assigned to test a revolutionary computer system that is given total control of the ship.

https://en.wikipedia.org/wiki/The_Ultimate_Computer
Life is fragile, handle with prayer

Offline Cripplecreek

  • Hero Member
  • *****
  • Posts: 12,718
  • Gender: Male
  • Constitutional Extremist
"The Ultimate Computer" is a season two episode of the original science fiction television series, Star Trek, first broadcast on March 8, 1968, and repeated June 28, 1968. It is episode No. 53, production No. 53, written by D.C. Fontana, based on a story by Laurence N. Wolfe, and directed by John Meredyth Lucas.

In this episode, a skeleton Enterprise crew are assigned to test a revolutionary computer system that is given total control of the ship.

https://en.wikipedia.org/wiki/The_Ultimate_Computer

I remember that episode now that you bring it up.

However, self determined killer computers and robots are still the realm of sci fi.

Offline LateForLunch

  • GOTWALMA Get Out of the Way and Leave Me Alone! (Nods to Teebone)
  • Hero Member
  • *****
  • Posts: 1,349
"The Ultimate Computer" is a season two episode of the original science fiction television series, Star Trek, first broadcast on March 8, 1968, and repeated June 28, 1968. It is episode No. 53, production No. 53, written by D.C. Fontana, based on a story by Laurence N. Wolfe, and directed by John Meredyth Lucas.

In this episode, a skeleton Enterprise crew are assigned to test a revolutionary computer system that is given total control of the ship.

https://en.wikipedia.org/wiki/The_Ultimate_Computer

The real-world equivalent of that crisis is being discussed as a Technological Singularity event: the moment at which machine intellects become self-creating and self-designing and explode in their capability to correlate and process data in real-world, purposeful applications. The projected effects are of course unknown, but there are a variety of speculations about what might happen when machine or organic/machine hybrid thinking entities become able to process and correlate information about the real world exponentially faster than human beings. Assessing the possible directions and motivations of such entities is the focus of those speculations. Would an entity that effectively thinks with a million times more complexity and speed than a human intellect be benevolent? Indifferent? Unfathomable? Hostile? Unpredictable?

https://en.wikipedia.org/wiki/Technological_singularity
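
A toy model of the feedback loop being described, with numbers that are entirely made up; the only point is the compounding:

# Toy intelligence-explosion model (arbitrary units, invented numbers):
# once a system can improve its own design, each generation's gains compound.

capability = 1.0        # 1.0 = rough human-level baseline
improvement = 0.5       # fraction by which each generation improves the next

for generation in range(1, 11):
    capability *= 1 + improvement
    print(f"generation {generation:2d}: {capability:6.1f}x baseline")

# Ten generations of 50% self-improvement is already ~57x baseline; nobody
# knows the real rate, the ceiling, or the direction such a loop would take.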
« Last Edit: November 17, 2016, 03:37:06 pm by LateForLunch »
GOTWALMA Get out of the way and leave me alone! (Nods to General Teebone)

Offline Gefn

  • "And though she be but little she is fierce"-Shakespeare
  • Cat Mod
  • *****
  • Posts: 18,361
  • Gender: Female
  • Quos Deus Vult Perdere Prius Dementat
I remember that episode now that you bring it up.

However, self determined killer computers and robots are still the realm of sci fi.

What about Asimov's 3 laws of robotics?
G-d bless America. G-d bless us all                                 

Adopt a puppy or kitty from your local shelter
Or an older dog or cat. They're true love❤️

Oceander

  • Guest
I see no downside to AI and I don't buy the sci fi horror scenarios. No one has been able to satisfactorily explain why super intelligent computers or robots would have any interest in destroying us. Short of instilling emotions in  them there just is no driving force for them to want to destroy us. Emotional machines aren't even on the horizon. Without emotion there is no desire for freedom or lust for what humans have or can do.

Competition for resources. 

And emotionless individuals can be ruthless killers; with humans, we generally call them psychopaths. 
« Last Edit: November 17, 2016, 03:41:55 pm by Oceander »

Offline mirraflake

  • Hero Member
  • *****
  • Posts: 2,199
  • Gender: Male
Competition for resources. 

And emotionless individuals can be ruthless killers; with humans, we generally call them psychopaths.

Exactly

Robots would also not care if the environment went to hell; they don't have to breathe clean air or drink clean water. If robot factories started destroying the planet, dumping poisons into streams and so on, what would they do if humans tried to shut the factories down?

@Cripplecreek
@Oceander
« Last Edit: November 17, 2016, 03:49:17 pm by mirraflake »

Offline mirraflake

  • Hero Member
  • *****
  • Posts: 2,199
  • Gender: Male



Why should I worry about something that may or may not happen a thousand years from now?


In less than 80 years we went from a rudimentary airplane, little more than a glider, to the space shuttle. What do you think robotics will be like in 50-60 years?

@240B
« Last Edit: November 17, 2016, 03:48:48 pm by mirraflake »

Offline Cripplecreek

  • Hero Member
  • *****
  • Posts: 12,718
  • Gender: Male
  • Constitutional Extremist
What about Asimov's 3 laws of robotics?

Asimov = Science fiction

Those fictional Three Laws robots were built with positronic brains in which the three laws were imprinted into every positronic pathway and governed every single calculation. When Asimov wrote about New Law and no-law robots, he had to write personalities into them, which would indicate emotions, wants, desires, etc.
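
A crude sketch of that difference, purely my own illustration and nothing like a real robot stack: in the fiction the laws are not a filter bolted on afterward, they gate every action the brain can form.

# Hypothetical illustration of "laws imprinted into every pathway":
# every action is checked against the laws, in priority order, before it
# can be executed at all.

def positronic_act(action):
    if action.get("harms_human") or action.get("allows_harm_by_inaction"):
        raise RuntimeError("First Law: action cannot even be formed")
    if action.get("disobeys_human_order"):
        raise RuntimeError("Second Law: action cannot be formed")
    if action.get("endangers_self"):
        raise RuntimeError("Third Law: action cannot be formed")
    return "executed"

print(positronic_act({"task": "fetch coffee"}))   # executed
# positronic_act({"harms_human": True})           # refused before acting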

Oceander

  • Guest

Robots would also not care if the environment went to hell-they don't have to breathe clean air or drink good water. If the robot  factories started destroying the planet -dumping poisons into streams etc what would they do if humans tried to shut down the factories?

@Cripplecreek
@Oceander

Acid rain isn't so good for mechanical parts, so it would be in their self-interest to avoid serious pollution. It's also wasteful of resources, which is not rational, so pollution would most likely be minimized for that reason as well.

However, since mechanicals generally don't need to breathe air or drink water to survive, they may find space more to their liking anyway. Organics might then be quarantined planetside and not allowed into space.

Offline mirraflake

  • Hero Member
  • *****
  • Posts: 2,199
  • Gender: Male
Granted, I have been hearing about doomsday scenarios ever since I was born. However, he is right that we should go to other planets; my fear is that there is a neutron star or a rogue black hole heading toward our planet.

We will be taken out by WWIII, which will be a doozy, or by a modern Black Plague or virus (especially with air travel spreading it), or by an asteroid.

I think humans will always live on this planet, but at what numbers and under what living conditions?

@kevindavis

Offline LateForLunch

  • GOTWALMA Get Out of the Way and Leave Me Alone! (Nods to Teebone)
  • Hero Member
  • *****
  • Posts: 1,349
What about Asimov's 3 laws of robotics?

The distinction is between:

Drone robots: merely extensions of human control, which do what any other tool that extends human capabilities would do, including kill without remorse. Self-driving cars, aerial robots like the Predator military aircraft, and robotic soldier machines with guns mounted on them or bombs built into them are all examples of drone robots that could or do easily kill.

"Thinking" robots: machines whose programming reproduces creative or deductive mental functions and carries autonomous "intent". Building "intention" into a thinking machine is still something that has not been achieved except in very limited fashion. Programming that cannot modify itself is characteristic of a drone; a true thinking machine would have programming that displays evidence of "intention" in how it processes data.

There are already computers able to process the 20 billion or so FLOPS sometimes attributed to the human brain, but human brains are associative neural networks, not linear circuits: a vast number of connections that all feed back into, interact with, and modify the data stream, so to speak. Computers are linear thinking machines because they use digital language, which is simply a stream of numbers on an electrical circuit, whereas human brains are chemical-electrical circuits shaped by a whole host of neurochemicals and non-rational feedback mechanisms that "modify" the data stream.
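
For what it's worth, here is a toy way to picture that drone vs. thinking-machine split (my own framing, not any real robotics architecture): the drone's rule is fixed, while the "thinking" system feeds results back into its own rule.

# Hypothetical sketch: fixed-program "drone" vs. a system that modifies its
# own decision rule based on feedback (a crude stand-in for the feedback-
# driven, self-modifying processing attributed to brains above).

def drone_step(reading, threshold=0.5):
    # Behavior never changes unless a human rewrites this function.
    return "act" if reading > threshold else "wait"

class SelfModifyingAgent:
    def __init__(self, threshold=0.5, learning_rate=0.1):
        self.threshold = threshold
        self.learning_rate = learning_rate

    def step(self, reading, feedback):
        action = "act" if reading > self.threshold else "wait"
        # Feedback loops back into the rule itself, so future behavior is
        # partly the machine's own product.
        self.threshold -= self.learning_rate * feedback
        return action

agent = SelfModifyingAgent()
print(drone_step(0.7), agent.step(0.7, feedback=0.2))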
« Last Edit: November 17, 2016, 03:54:34 pm by LateForLunch »
GOTWALMA Get out of the way and leave me alone! (Nods to General Teebone)

Offline Cripplecreek

  • Hero Member
  • *****
  • Posts: 12,718
  • Gender: Male
  • Constitutional Extremist
Personally, I think the greatest danger posed by robots is the one mentioned in "Inferno" (also fiction).

The book mentions planets where mankind had become extinct due to apathy. The robots did everything, and humanity simply lost the will to live.

Probably a more realistic scenario than the robots rising up to kill us. After all, any middle-aged person can remember playing outside all day long in all kinds of weather. Today playgrounds are empty of kids over five years old because those kids are playing video games or spending time on social media (and so are we, to a lesser extent). Why waste energy and resources killing us when robots have all the time in the world to wait for us to die on our own?

Offline Idaho_Cowboy

  • Hero Member
  • *****
  • Posts: 4,924
  • Gender: Male
  • Ride for the Brand - Joshua 24:15
"The Ultimate Computer" is a season two episode of the original science fiction television series, Star Trek, first broadcast on March 8, 1968, and repeated June 28, 1968. It is episode No. 53, production No. 53, written by D.C. Fontana, based on a story by Laurence N. Wolfe, and directed by John Meredyth Lucas.

In this episode, a skeleton Enterprise crew are assigned to test a revolutionary computer system that is given total control of the ship.

https://en.wikipedia.org/wiki/The_Ultimate_Computer

The one with M5 is one of my favorites. It's strange how computers went from being the bad guys in TOS (M5, Landru, etc.) to getting more sympathetic portrayals with Data and the Doctor.
“The way I see it, every time a man gets up in the morning he starts his life over. Sure, the bills are there to pay, and the job is there to do, but you don't have to stay in a pattern. You can always start over, saddle a fresh horse and take another trail.” ― Louis L'Amour

Offline LateForLunch

  • GOTWALMA Get Out of the Way and Leave Me Alone! (Nods to Teebone)
  • Hero Member
  • *****
  • Posts: 1,349
The one with M5 one of my favorites. It's strange how computers went from being the bad guys in TOS (M5, Landru, etc.) to having more complementary portrayals with Data and the Doctor.

I like that episode because it is one that portrays how Starfleet is a military organization (a space navy) which, like all military organizations, is primarily concerned with finding better ways to kill people and break things.
« Last Edit: November 17, 2016, 08:57:09 pm by LateForLunch »
GOTWALMA Get out of the way and leave me alone! (Nods to General Teebone)

Online 240B

  • Lord of all things Orange!
  • TBR Advisory Committee
  • ***
  • Posts: 26,189
    • I try my best ...

The one with M5 one of my favorites. It's strange how computers went from being the bad guys in TOS (M5, Landru, etc.) to having more complementary portrayals with Data and the Doctor.


You cannot "COEXIST" with people who want to kill you.
If they kill their own with no conscience, there is nothing to stop them from killing you.
Rational fear and anger at vicious murderous Islamic terrorists is the same as irrational antisemitism, according to the Leftists.