Understanding through Discussion




Author Topic:   Grey Goo: Legitimate concern or hysteria?
Hyroglyphx
Member
Posts: 5622
From: Austin, TX
Joined: 05-03-2006


Message 1 of 7 (560764)
05-17-2010 1:19 PM


Grey Goo is a theoretical scenario based on unintended consequences associated with molecular nanotechnology. The fear is one of growing dependence upon technology that will lead to a technological singularity.

More specifically, is it possible for nanobots, which are theoretically designed for good causes like seeking out cancer cells and destroying them, to self-replicate and mimic something biological like bacteria through recursive self-improvement?

Computers are designed to do only what the programmer instructs them to do. However, with the advent of artificial intelligence, machines are designed to problem-solve. Given that these machines are now smart and would feed off of biological material in order to replicate, could they consume biological matter?

Is this fear of ecophagy through an evolving artificial intelligence more like science fiction or science fact?


"Men never do evil so completely and cheerfully as when they do it from mistaken conviction." Blaise Pascal
Replies to this message:
 Message 3 by nwr, posted 05-18-2010 10:13 AM Hyroglyphx has not yet responded
 Message 4 by Larni, posted 05-18-2010 10:36 AM Hyroglyphx has not yet responded
 Message 5 by Rahvin, posted 05-18-2010 11:31 AM Hyroglyphx has responded
 Message 7 by tesla, posted 05-26-2010 10:59 AM Hyroglyphx has not yet responded

    
Admin
Director
Posts: 12579
From: EvC Forum
Joined: 06-14-2002
Member Rating: 2.9


Message 2 of 7 (560909)
05-18-2010 5:01 AM


Thread Copied from Proposed New Topics Forum
Thread copied here from the Grey Goo: Legitimate concern or hysteria? thread in the Proposed New Topics forum.
    
nwr
Member
Posts: 5585
From: Geneva, Illinois
Joined: 08-08-2005


Message 3 of 7 (560970)
05-18-2010 10:13 AM
Reply to: Message 1 by Hyroglyphx
05-17-2010 1:19 PM


Hyroglyphx writes:
Is this fear of ecophagy through an evolving artificial intelligence more like science fiction or science fact?

It's more like fiction.

They aren't going to have working artificial intelligence any time soon, not even by accident.


This message is a reply to:
 Message 1 by Hyroglyphx, posted 05-17-2010 1:19 PM Hyroglyphx has not yet responded

  
Larni
Member
Posts: 3975
From: Liverpool
Joined: 09-16-2005


Message 4 of 7 (560975)
05-18-2010 10:36 AM
Reply to: Message 1 by Hyroglyphx
05-17-2010 1:19 PM


Crush, kill, destroy!
It is a fact that all robots' default setting is to crush, kill and destroy, but what you are suggesting is a nanobot functioning like a virus of sorts.

What scenario are you proposing? When would a beneficial nanobot revert to its primary directives of crush, kill and destroy?

The nanobots don't need to be smart at all. They do the job they are designed for and then stop working. They don't really need to reproduce. If you need more, get another dose.

Science fact? Science fiction, more like.

This seems to boil down to a Frankenstein's monster effect where we fear what we build will one day default to crush, kill and destroy.


This message is a reply to:
 Message 1 by Hyroglyphx, posted 05-17-2010 1:19 PM Hyroglyphx has not yet responded

    
Rahvin
Member (Idle past 1265 days)
Posts: 3964
Joined: 07-01-2005


Message 5 of 7 (560992)
05-18-2010 11:31 AM
Reply to: Message 1 by Hyroglyphx
05-17-2010 1:19 PM


A few points...
1) You don't need AI for a "gray goo" scenario. All it takes is a self-replicating nano-disassembler that is able to reconstruct extremely varied source materials into energy and copies of itself. Intelligence doesn't factor into it, any more than a natural virus or bacterium needs to be smart to eat everything it's able to eat (yes, I know viruses don't actually consume; the point still remains).

This is fortunate, because a nanomachine like that is going to be difficult enough to make without having to add some method of intelligent control. Fitting a method of communication for each nanobot to send and receive information to participate in and receive instructions from a communal intelligence into something that small while still allowing it to perform its self-replication function boggles the mind.

2) Nanotech is slow. When you're only a few nanometers at most, you aren't going to be moving at high speed. In fact, it's difficult to conceive of a form of propulsion that could be used by a nanobot other than the examples we've been given by bacteria and the like. If the nanobots are actually in a "gray goo" scenario, then they aren't even going to be spending most of their time and energy moving - they'll be consuming whatever is close at hand and making copies of themselves. And ripping apart molecules and reassembling them an atom at a time as proposed in most nanotech scare fiction is not fast.

Yes, they'd be replicating at an exponential rate. But they're not exactly going to cover the Earth within days, or even weeks or months, any faster than bacteria grow in a petri dish. This leaves them vulnerable to any form of countermeasure we can come up with.
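How fast "exponential" actually is here depends entirely on the per-bot doubling time, which nobody knows. A back-of-the-envelope sketch in Python makes the point; the biomass figure and the doubling times below are hypothetical round numbers for illustration, not sourced values:

```python
import math

# Hypothetical round numbers, for illustration only:
BIOMASS_ATOMS = 2.8e40   # rough order of magnitude for carbon atoms in Earth's biomass
SEED_BOTS = 1            # start from a single replicator

# Number of doublings for the population to reach BIOMASS_ATOMS-many bots
doublings = math.log2(BIOMASS_ATOMS / SEED_BOTS)  # about 134

for label, hours_per_doubling in [("1 hour", 1), ("1 day", 24), ("1 week", 24 * 7)]:
    total_days = doublings * hours_per_doubling / 24
    print(f"doubling time {label}: ~{total_days:.0f} days to match the biomass")
```

With an hour-per-copy replicator the naive math gives days; with a day per copy it gives months. Since disassembling and reassembling matter an atom at a time would plausibly put the doubling time at the slow end, and the naive math ignores energy, heat dissipation, and material transport entirely, the slower timescales seem the more defensible guess.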

Worrying about nanobots being poorly designed and attacking unintended targets for self-replication (attacking healthy cells along with the cancer cells, for example) is a very legitimate concern, and it's something that engineers will be stressing over when we reach the point where such technology is really feasible.

Worrying about a "gray goo" covering the Earth and replicating until there's nothing left except nanobots is hysteria. Like most scenarios that involve the End of Life As We Know It (tm).


This message is a reply to:
 Message 1 by Hyroglyphx, posted 05-17-2010 1:19 PM Hyroglyphx has responded

Replies to this message:
 Message 6 by Hyroglyphx, posted 05-18-2010 1:16 PM Rahvin has not yet responded

  
Hyroglyphx
Member
Posts: 5622
From: Austin, TX
Joined: 05-03-2006


Message 6 of 7 (561020)
05-18-2010 1:16 PM
Reply to: Message 5 by Rahvin
05-18-2010 11:31 AM


Re: A few points...
You don't need AI for a "gray goo" scenario. All it takes is a self-replicating nano-disassembler that is able to reconstruct extremely varied source materials into energy and copies of itself. Intelligence doesn't factor into it, any more than a natural virus or bacterium needs to be smart to eat everything it's able to eat

When I say intelligence, I'm not talking about cyborgs and terminators able to make complex decisions without the ability to feel empathy. It doesn't have to be that extreme, but your point stands. You're right: all that really matters is self-replication.

I believe it is more hysteria than anything else, but I also believe that if you don't brainstorm the possibility of unintended consequences, you foolishly run headlong into problems.

That we can even make molecular nanotechnology is in and of itself amazing. What we know of biological replicators, like viruses and bacteria, is that they have the ability to wreak havoc. Even still, they haven't consumed the world, so why would a nanobot?

Even then, this is placing the cart way before the horse. We don't even have true nanobots yet.


"Men never do evil so completely and cheerfully as when they do it from mistaken conviction." Blaise Pascal
This message is a reply to:
 Message 5 by Rahvin, posted 05-18-2010 11:31 AM Rahvin has not yet responded

    
tesla
Member (Idle past 2136 days)
Posts: 1198
Joined: 12-22-2007


Message 7 of 7 (562183)
05-26-2010 10:59 AM
Reply to: Message 1 by Hyroglyphx
05-17-2010 1:19 PM


If there is a concern:
I'm not particularly afraid of technology taking over the world or destroying mankind directly.

I believe that our reliance on technology is the downfall, and that at some point a catastrophe will probably wipe out a lot of the technology upon which we rely. This means a breakdown in the subsystems we count on. If a society is too reliant on them and cannot obtain food and medicine, many would die.

I also find genetically modifying our crops and such a concern. There are things we do not understand about our food and DNA; we could unawares produce a food or genetic stimulant that causes a mutation of our DNA, and bring about a scenario similar to what bees and bats have faced. In the case of bees, they had (still have?) a problem breaking down proteins, and many died of starvation.

But I also realize fear of unknowns can hinder growth. It's just that meddling with the base components that our existence relies on is scary to me.


keep your mind from this way of enquiry, for never will you show that not-being is
~Parmenides
This message is a reply to:
 Message 1 by Hyroglyphx, posted 05-17-2010 1:19 PM Hyroglyphx has not yet responded

  

Copyright 2001-2018 by EvC Forum, All Rights Reserved
