Author Topic:   The Social Implications Of "The Singularity Moment"
Straggler
Member
Posts: 10333
From: London England
Joined: 09-30-2006


Message 46 of 169 (604694)
02-14-2011 11:57 AM
Reply to: Message 44 by New Cat's Eye
02-14-2011 11:49 AM


Re: "Absorb Technological Change" - Huh?
CS writes:
Every American does not have to understand how they work to say that America absorbed atomic bombs.
Yeah obviously.
So to what extent, or which aspects of the culture, need to understand it for it to qualify as "absorbed"?
This seems very vague.
CS writes:
Its about mainstream utilization of the technology rather than it simply having been invented but just sitting on a shelf (so to speak).
In that case it seems Betamax was a step too far.

This message is a reply to:
 Message 44 by New Cat's Eye, posted 02-14-2011 11:49 AM New Cat's Eye has not replied

  
Theodoric
Member
Posts: 9076
From: Northwest, WI, USA
Joined: 08-15-2005
Member Rating: 3.7


Message 47 of 169 (604695)
02-14-2011 12:02 PM
Reply to: Message 44 by New Cat's Eye
02-14-2011 11:49 AM


Re: "Absorb Technological Change" - Huh?
So what is your idea of what the "singularity" is? Neither your idea nor Crash's has anything to do with the "singularity" of the OP.
Yeah, and it's only going to get worse.
Why is this worse? Why is it any different from any other time in the history of technological innovation? You and Crash both may have sound reasoning for this belief, but as of yet Crash refuses to give any. Maybe you can explain your reasoning.
You don't like me; that is clear. How about just trying to provide an argument to support this belief?
Its about the culture absorbing the technology, not the individuals of that culture.
Still don't get what that means. Does everyone need to understand the nitty-gritty of all technologies? Hell, most people don't understand how POTS (plain old telephone service) or even an internal combustion engine works. Seems like the criteria for cultural absorption are a little flawed.
What the hell does any of this have to do with the "singularity"?

Facts don't lie or have an agenda. Facts are just facts

This message is a reply to:
 Message 44 by New Cat's Eye, posted 02-14-2011 11:49 AM New Cat's Eye has replied

Replies to this message:
 Message 49 by New Cat's Eye, posted 02-14-2011 12:12 PM Theodoric has not replied

  
New Cat's Eye
Inactive Member


Message 48 of 169 (604697)
02-14-2011 12:05 PM
Reply to: Message 45 by Theodoric
02-14-2011 11:54 AM


Re: "Absorb Technological Change" - Huh?
Based upon Crash's comments, he must think the singularity has already happened. If he thinks that we haven't "absorbed" the things we already have, then isn't the singularity already upon us?
No, I don't think the singularity is simply the existence of unabsorbed technologies.
Why shouldn't I expect evidence for something that is "clearly true"?
So that you will be less annoying.
but the notion that the rate of technological change will increase past our ability to culturally absorb the changes is clearly true.
It is an unevidenced assertion that, as Straggler pointed out, needs to be explained. I want to know what he means by the comment and why it is clearly true. His explanation that technology always increases has nothing to do with my comment and does nothing to support his assertion.
Its a simple deduction from the fact that if the *rate* of technological advancement only increases, then our culture's ability to absorb it will keep falling farther and farther behind.
Absorption has to lag behind advancement, obviously, so if the *rate* of advancement keeps increasing then eventually absorption will just get left in the dust.
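The deduction above can be sketched as a toy model. The assumptions are purely illustrative, not from the thread or from any real data: advancement is taken to compound at a fixed rate while absorption grows linearly.

```python
# Toy model of the argument (illustrative assumptions only): if
# cumulative advancement compounds exponentially while cumulative
# absorption grows only linearly, the gap between them widens
# without bound, i.e. absorption "gets left in the dust".

def advancement(t: int, rate: float = 0.05) -> float:
    """Cumulative advancement after t periods, compounding at `rate`."""
    return (1 + rate) ** t

def absorption(t: int, speed: float = 0.05) -> float:
    """Cumulative absorption after t periods, growing linearly."""
    return 1 + speed * t

# Sample the gap every 25 periods out to t = 100.
gaps = [advancement(t) - absorption(t) for t in range(0, 101, 25)]

# The gap grows at every sampled period: absorption keeps falling
# farther and farther behind advancement.
assert all(earlier < later for earlier, later in zip(gaps, gaps[1:]))
```

Of course, whether absorption really is linear and advancement really exponential is exactly the assumption being questioned elsewhere in the thread; the model only shows that *if* those hold, the widening gap follows.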

This message is a reply to:
 Message 45 by Theodoric, posted 02-14-2011 11:54 AM Theodoric has replied

Replies to this message:
 Message 50 by Theodoric, posted 02-14-2011 12:29 PM New Cat's Eye has replied

  
New Cat's Eye
Inactive Member


Message 49 of 169 (604698)
02-14-2011 12:12 PM
Reply to: Message 47 by Theodoric
02-14-2011 12:02 PM


Re: "Absorb Technological Change" - Huh?
So what is your idea of what the "singularity" is? Neither your idea or Crash's has anything to do with the "singularity" of the OP.
I thought crash's first post was clear:
quote:
Ray Kurzweil's pretty much predicting science fiction, but the notion that the rate of technological change will increase past our ability to culturally absorb the changes is clearly true. The rate of technological change has never decreased throughout human history. The people who are predicting that it will are the ones making predictions utterly at odds with history, not Kurzweil.
Does that mean AI, teleportation, living forever in virtual worlds? Who knows? The point of the "singularity" is that it's the point at which technological change is happening so fast the results can't be predicted.
And it just can't be argued that that is {not} going to someday be the case.
Although I think he missed placing that NOT, which I added in braces, and I added the bold.
Why is this worse? Why is it any different than any other time in history of technological innovation?
Because the *rate* of advancement is increasing, i.e. technology is accelerating, the displacement between advancement and absorption will keep growing, i.e. get "worse".
What the hell does any of this have to do with the "singularity"?
Essentially, it's when that displacement spirals out of control, or, as crash put it, the results of it can't be predicted.
Its about the culture absorbing the technology, not the individuals of that culture.
Still don't get what that means. Everyone needs to understand the nitty gritty of all technologies? Hell most people don't understand how POTS or even an internal combustion engine works. Seems like the criteria for cultural absorption is a little flawed.
Maybe it is. No, not everybody has to understand it for it to be absorbed.
Edited by Catholic Scientist, : No reason given.

This message is a reply to:
 Message 47 by Theodoric, posted 02-14-2011 12:02 PM Theodoric has not replied

  
Theodoric
Member
Posts: 9076
From: Northwest, WI, USA
Joined: 08-15-2005
Member Rating: 3.7


Message 50 of 169 (604700)
02-14-2011 12:29 PM
Reply to: Message 48 by New Cat's Eye
02-14-2011 12:05 PM


Re: "Absorb Technological Change" - Huh?
Its a simple deduction from the fact that if the *rate* of technological advancement only increases, then our culture's ability to absorb it will keep falling farther and farther behind.
But hasn't that always been happening? Why is now so different? Seems unevidenced to me.
So that you will be less annoying.
So I should let you and others just make unevidenced assertions without question? I think you are on the wrong forum if that is what you expect.

Facts don't lie or have an agenda. Facts are just facts

This message is a reply to:
 Message 48 by New Cat's Eye, posted 02-14-2011 12:05 PM New Cat's Eye has replied

Replies to this message:
 Message 53 by New Cat's Eye, posted 02-14-2011 12:57 PM Theodoric has replied
 Message 63 by crashfrog, posted 02-14-2011 3:35 PM Theodoric has not replied

  
xongsmith
Member
Posts: 2578
From: massachusetts US
Joined: 01-01-2009
Member Rating: 6.8


Message 51 of 169 (604703)
02-14-2011 12:51 PM
Reply to: Message 32 by crashfrog
02-13-2011 11:16 PM


crashfrog writes:
This is an interpretation of someone's emotions. It is not a physical or concrete thing. To me it would not be his grandfather's axe.
I know, I'm just trying to get a handle on your opinion, here. It's clearly his grandfather's axe when it has the original head and handle, right? And your opinion is that it has stopped being his grandfather's axe once the head and handle have been replaced.
So where's the boundary? Be specific. When his father replaces the head? When he replaces the handle?
Well, me - I'd say all of the above.
And also I'd say none of the above.
Simultaneously.
But also lying.
Maybe I would answer "Yes, the apple blossoms are indeed going to be particularly good this spring."

- xongsmith, 5.7d

This message is a reply to:
 Message 32 by crashfrog, posted 02-13-2011 11:16 PM crashfrog has not replied

Replies to this message:
 Message 52 by xongsmith, posted 02-14-2011 12:56 PM xongsmith has seen this message but not replied

  
xongsmith
Member
Posts: 2578
From: massachusetts US
Joined: 01-01-2009
Member Rating: 6.8


Message 52 of 169 (604704)
02-14-2011 12:56 PM
Reply to: Message 51 by xongsmith
02-14-2011 12:51 PM


42
BTW, the Singularity has already occurred. We missed a great party...

- xongsmith, 5.7d

This message is a reply to:
 Message 51 by xongsmith, posted 02-14-2011 12:51 PM xongsmith has seen this message but not replied

  
New Cat's Eye
Inactive Member


Message 53 of 169 (604706)
02-14-2011 12:57 PM
Reply to: Message 50 by Theodoric
02-14-2011 12:29 PM


Re: "Absorb Technological Change" - Huh?
But hasn't then always been happening. Why is now so different? Seems unevidenced to me.
It's the nature of exponential growth: the further along you get, the more it grows. We're about 200 years past the Industrial Revolution, and the last 50 years have seen a lot more advancement than the previous 150. The next 50 years should see even more than that.
So I should let you and others just make unevidenced assertions without question?
Yes. It's no big deal, really. Nothing bad is going to happen if you don't.
I think you are on the wrong forum if that is what you expect.
We're talking about science fiction in the COFFEE HOUSE FORUM!

This message is a reply to:
 Message 50 by Theodoric, posted 02-14-2011 12:29 PM Theodoric has replied

Replies to this message:
 Message 54 by Theodoric, posted 02-14-2011 1:02 PM New Cat's Eye has replied
 Message 55 by xongsmith, posted 02-14-2011 1:09 PM New Cat's Eye has not replied

  
Theodoric
Member
Posts: 9076
From: Northwest, WI, USA
Joined: 08-15-2005
Member Rating: 3.7


Message 54 of 169 (604707)
02-14-2011 1:02 PM
Reply to: Message 53 by New Cat's Eye
02-14-2011 12:57 PM


Re: "Absorb Technological Change" - Huh?
We're talking about science fiction in the COFFEE HOUSE FORUM!
I meant EVC in general.
I would rather be an irritating poster that asks for evidence than someone that gets peeved every time they are asked to support their argument.
It doesn't matter where we are on EvC; everyone should be able to support their assertions.
ABE
So you think the whole singularity thing is SciFi? Then why are you so adamant in your arguments that we are reaching this point?
Edited by Theodoric, : No reason given.
Edited by Theodoric, : replaced particular with general

Facts don't lie or have an agenda. Facts are just facts

This message is a reply to:
 Message 53 by New Cat's Eye, posted 02-14-2011 12:57 PM New Cat's Eye has replied

Replies to this message:
 Message 56 by New Cat's Eye, posted 02-14-2011 1:31 PM Theodoric has replied

  
xongsmith
Member
Posts: 2578
From: massachusetts US
Joined: 01-01-2009
Member Rating: 6.8


Message 55 of 169 (604708)
02-14-2011 1:09 PM
Reply to: Message 53 by New Cat's Eye
02-14-2011 12:57 PM


Re: "Absorb Technological Change" - Huh?
Catholic Scientist writes:
We're talking about science fiction in the COFFEE HOUSE FORUM!
As Ernest Borgnine said back to William Holden in The Wild Bunch, in the late glow of the campfire, "I wouldn't have it any other way..."

- xongsmith, 5.7d

This message is a reply to:
 Message 53 by New Cat's Eye, posted 02-14-2011 12:57 PM New Cat's Eye has not replied

  
New Cat's Eye
Inactive Member


Message 56 of 169 (604712)
02-14-2011 1:31 PM
Reply to: Message 54 by Theodoric
02-14-2011 1:02 PM


Re: "Absorb Technological Change" - Huh?
So you think the whole singularity thing is SciFi? Then why are you so adamant in your arguments that we are reaching this point?
Well, the OP brings up "super-intelligent cyborgs", so we are talkin' SciFi, but the principle behind it, that technological advancement outruns a culture's ability to absorb it, is something that can be discussed as non-fiction.
I meant EVC in general.
Oh, EvC has many forums, some of which pertain to religion and beliefs. You're wrong to think that every assertion here requires evidence.
I would rather be an irritating poster that asks for evidence, then someone that gets peeved every time they get asked to support their argument.
I would rather be someone who actually contributes to discussions with something interesting to read.
In doesn't matter where we are on EVC, everyone should be able to support their assertions.
Yeah, but any asshat can scan through threads looking for unsubstantiated claims and respond with: "Ya got any evidence for that assertion?"
It's one thing to seek new information about something that interests you, but just taking potshots from the sidelines at everything unsupported annoyingly junks up the threads. You can just ignore people...

This message is a reply to:
 Message 54 by Theodoric, posted 02-14-2011 1:02 PM Theodoric has replied

Replies to this message:
 Message 59 by Theodoric, posted 02-14-2011 2:04 PM New Cat's Eye has not replied

  
AZPaul3
Member
Posts: 8513
From: Phoenix
Joined: 11-06-2006
Member Rating: 5.2


Message 57 of 169 (604714)
02-14-2011 1:40 PM


Kurzweil's "Singularity" may seem hooey to a lot of people but it already has started and will continue to grow.
Ray has fallen off the wagon in taking this concept to absurd extremes but his initial view is quite viable.
The singularity is not about technology take-over or cultural absorption problems. It is, as defined by Kurzweil, the evolution of human intelligence augmented by technology (read his book).
The "Singularity" has already begun. This computer and this Internet are extensions and augment human intelligence today. No big surprise here. Where Kurzweil takes this (before jumping off his bean) is that future enhancements of human intelligence will be via nano-tech directly implanted in our bodies.
Nano-tech is already a viable technology, though it is young and still quite specialized. But as the rate of technological change continues to speed up (a phenomenon I think we can all agree is real), the nano-tech parts start entering the human body and eventually our brains.
Nano-bots will be used to clean up the plaque in our arteries and help stave off disease. Nano-bots will slowly build tissues, replacing our own tissues with less-degradable organics. Who knows? Maybe replacing our bones with carbon composites, our muscles with "memory metals," our stomachs with acid-resistant polymers.
Then as more technology and knowledge becomes available these nano things will be directly interfaced into our brains and connected to the vast libraries of human knowledge.
If Straggler in London and I in Arizona are having a real-time face-to-face (via an implanted Virtual Interactive Visitation link) discussion that devolves into a syntax tussle over the definition of Bill Clinton's favorite word "is" neither of us will have to consult a dictionary. The Interactive Robotic Virtual Intelligence Network Gateway (IRVING) will find it for us and instantly pop it into our consciousness like we know it all along. Then, of course, we can argue whether my definition from the GT283.234 node is more apropos to the theme than his which came from the LM921.077 node. Such is the nature of discussion.
Anyway.
We will not become machines, nor will we be controlled by machines, any more than we can say we, today, are controlled by computers and the Internet. Our humanity or spirit or soul or whatever the religionists are wont to call it will not disappear or be subjugated to a superior non-human intellect. We will have arrested the human genome, replaced degradable body organics with non-degradable counterparts, and augmented our innate intellect with nano-technologies, with the prospect of living happily ever after. So goes the idea of the Singularity.
I won't be here to see it if it happens. Some of you may ... if it happens.
This is all doable with future technologies. That's not to say it will be done, but there isn't anything but a lack of knowledge and technology standing in the way, both of which will come in time.
I think Kurzweil greatly overestimates the speed of this timeline, but not the viability of such technology and its integration into our bodies.

  
Rahvin
Member
Posts: 4032
Joined: 07-01-2005
Member Rating: 9.2


(1)
Message 58 of 169 (604715)
02-14-2011 1:44 PM
Reply to: Message 1 by Phat
02-12-2011 10:39 AM


Recently, I read an article in Time Magazine titled 2045: The Year Man Becomes Immortal. In it, they discuss the rapid advance in artificial intelligence, and have popularized the phrase "The Singularity" as the moment when computers become capable of themselves designing more intelligent computers, ad infinitum.
Time Magazine did not coin the term "Singularity" as it pertains to AI.
So-called "post-humanism" has driven a large amount of fiction as authors speculate as to the social impact of the introduction of an Artificial General Intelligence (the term "general" is important; artificially intelligent programs already exist, but they are very specific in their application and not adaptable to multiple or unforeseen situations). We're all familiar with the Terminator-style apocalyptic visions of the effects of AGI, for example.
As to whether a Singularity event will happen at all...it seems inevitable that AGI will eventually be developed. After all, we know that a general intelligence is possible - every human brain is an example of an adaptable intellect. It seems logically foolish to claim that while a natural general intelligence is possible, it is impossible for humans to artificially create one.
It's far, far easier to modify a computer program or computer hardware than it is to modify the living brain of a human being.
It's far, far easier to keep track of what specifically is going on inside of a computer program, what task is currently being processed, and how specifically it works, than it is to do the same with a living human brain.
Computer processing happens far faster than human thought, and further is not limited to a specific amount of space. A human brain needs to fit within the confines of a human skull, and has to work with the finite energy and nutritional resources of a human body. Electronic computers are far more easily up-scaled, adding additional processors or memory or storage, etc.
Can you imagine, then, an artificial intelligence that is capable of altering itself? Of analyzing its own thought processes, assessing its own performance pursuant to its goals, and modifying its core programming to be more efficient? Human beings are flawed intelligences, with cognitive faults like confirmation bias; imagine being able to actually, literally, reprogram those faults out of your thought process. An AGI would likely be able to do that, and depending on the hardware allocated and processing requirements for running the program, might well be able to do it faster than we can track.
Some of the implications of AGI are difficult to predict; others are easy.
Some potential pathways to AGI involve trying to duplicate a human brain in a computer environment; one proposed technique involves attempting to simulate every neuron in a human brain within a computer. This might make a more human-like AGI, but has significant downsides - indirect simulation processing is relatively inefficient, and simulating human neurons is rather like copying Chinese when you can't read the language - you still have very little idea what's going on inside (conversely, one of the benefits is that such a simulation, since we could pause the simulation and trace the activity of every neuron, might actually help us gain a better understanding of our own brains). The religious reaction to such a human-like AGI seems easy enough to predict - there would be debates over whether such constructs have "souls," whether they are an example of "playing God," and so on.
Imagine that neuron-simulation AGI paves the way for human brain uploads - being able to transfer a human consciousness from a biological body to a computer. Would "uploaded" humans still retain human rights? If, once uploaded, such intelligences can modify themselves, would they tend to remain particularly "human" in their thought processes? Would they simply simulate for themselves a Utopian dreamworld like the Matrix and leave the rest of us behind? If survival simply means an available power source and an industrial base capable of replacing defective components, and if backups are possible, would this then not be a route to human immortality?
Other potential pathways involve actually creating an AGI from the ground up, not trying to copy the human brain at all. Some of these methods carry the benefit of being able to better understand what's happening in the AGI and how it works from day one, but may carry the downside of being wholly alien to the way we think. There's no requirement that an AGI be sentient, or contain self-preservation as a goal, and so on. Interacting with a machine that is quite literally smarter than you but is not self-aware in the way that you or I am, or which has such drastically different goals than humans generally follow (self-preservation, procreation, entertainment, etc) would be an interesting experience, to say the least. An "alien" AGI would likely face a steeper battle for any sort of rights; an AGI that isn't sentient would likely not even trigger much ethical or religious debate at all.
If an AGI is simply a complex computer program, it would be trivially simple to copy an AGI. Creating just one means that it should be possible to create as many as you like provided you can obtain the hardware to run them.
AGIs, not being bound to the requirements of flesh-and-blood bodies, could open up new horizons for space travel. A 1000-year long trip sounds less troublesome for an AGI whose lifespan would be measured in terms of available power and the occasional replacement part. Food, water and air would be unnecessary, as would exercise, space to move around, some radiation shielding, etc. An AGI could directly interface with all manner of sensors, not be limited to the senses of a human being; it could directly see x-rays rather than using false-color intermediary photographs, for example. A single AGI on a ship equipped with telepresence utility bodies (repair bots, rovers, probes, etc) could hypothetically carry out an entire interplanetary mission on its own. All it needs is enough fuel for transportation and power.
The Singularity, at its heart, represents the question, "what happens when we build something that's smarter than we are?" Will we be left behind as our intellectual children outpace us? Will we "upgrade" ourselves with cybernetic implants or upload our brains to keep up? Will the machines be friendly towards us, or will they try to kill us off as potential threats?
It's easy to predict things like easier space exploration given AGI. It's easy to predict some cultural challenges, like rights for AGIs, their place in society, religious reactions, etc.
It's difficult to predict exactly what happens to us. Many post-humanists base their predictions on the assumption that AGI would be able to solve many of our basic problems, like the energy crisis, through an exponential increase in technological advancement. While in general that assumption isn't too unlikely, the specifics are where they start to fail. The development of an AGI tomorrow doesn't necessarily put us on a 3-year plan to working fusion power reactors. If human "uploads" become possible, that says nothing about the cost of such a procedure or the hardware to run it.
I don't think we can accurately make predictions about the social consequences, simply because there are too many massively important variables. What is the AGI like? What are its core goals, and how do we fit into them? Can it change its goals? Would it? Can we reasonably pull the plug? What's the economic cost of running one? Will we be limited to only a few in the world because they need building-sized data centers and lots of power, or will they be as common as personal computers? Can human beings relate to them in our interactions, or are they so much different in their thought processes that they feel alien to us? Are they sentient?
That's why opinions of the Singularity are so varied. It's very likely that an AGI will be developed; it's very likely that an AGI could be our intellectual superior in every way, thinking faster, recalling data more accurately, being able to modify itself to overcome flaws without lengthy 12-step programs, and able to integrate new information quickly and easily. Everything beyond that, from HAL-9000 to the Culture to Terminator to the Matrix to a post-humanist Utopia is nearly pure speculation.

This message is a reply to:
 Message 1 by Phat, posted 02-12-2011 10:39 AM Phat has not replied

  
Theodoric
Member
Posts: 9076
From: Northwest, WI, USA
Joined: 08-15-2005
Member Rating: 3.7


Message 59 of 169 (604720)
02-14-2011 2:04 PM
Reply to: Message 56 by New Cat's Eye
02-14-2011 1:31 PM


Re: "Absorb Technological Change" - Huh?
Its one thing to seek new information for something that interests you, but to just take potshots from the sidelines at everything unsupported is just annoyingly junking up the threads. You can just ignore people...
Maybe you should actually try reading my posts. I clearly am trying to find out how Crash feels he can make such absolute statements. There have been no potshots, just an attempt to get clarifications and valid arguments. All I have gotten are lame responses like "it's obvious" or "it's clearly true". Evidently it ain't.
You have yet to answer simple questions I have posed.
For example, a reasonable explanation of "cultural absorption".
Or this
Its a simple deduction from the fact that if the *rate* of technological advancement only increases, then our culture's ability to absorb it will keep falling farther and farther behind.
But hasn't that always been happening? Why is now so different?
Instead your dislike for me has started to even prohibit you from responding in an intelligent and reasonable manner.
The point of the "singularity" is that it's the point at which technological change is happening so fast the results can't be predicted.
This is from Crash's original comment. Which I would like to point out again, has nothing to do with the "singularity" as presented in the OP.
Why is now any different than any other time in history? At what point in time were people able to predict the results of technological change? We can do this now? Could it be done in 1960? 1800's? The year 1100?
This is what I am trying to get at. If you would quit being such an asshole, maybe you could take the time to tell me how that comment of crash's even makes sense. I am still trying to get someone to explain why now and the near future are any different. Some explanation, other than just an assertion, that technological advancement is exponential and "cultural absorption" (whatever the hell that is; still waiting for a clear answer) is linear.
Finally, what is so freaking special about the singularity if all it is is
The point of the "singularity" is that it's the point at which technological change is happening so fast the results can't be predicted.
I am just trying to understand what the big deal is. Kurzweil is obviously out in left field, but I and it seems others think this whole singularity idea is bunk.
Maybe you and Crash need to decide on what is meant by the "singularity" and start a new thread. There seem to be many ideas about what the "singularity" even is.
Tech Luminaries Address Singularity - IEEE Spectrum
Maybe we can then address the Singularity posited in the OP, which is the Kurzweilian singularity.

Facts don't lie or have an agenda. Facts are just facts

This message is a reply to:
 Message 56 by New Cat's Eye, posted 02-14-2011 1:31 PM New Cat's Eye has not replied

  
crashfrog
Member (Idle past 1466 days)
Posts: 19762
From: Silver Spring, MD
Joined: 03-20-2003


Message 60 of 169 (604728)
02-14-2011 3:24 PM
Reply to: Message 35 by Theodoric
02-14-2011 9:12 AM


Re: Maybe read some of what Kurzweil writes?
I am arguing that you have provided no support to your assertion that we are nearing the limit of society to absorb technological changes
I've never made such an assertion. I said nothing about nearing the limit; only that the limit exists.
Which, mathematically, it must. QED.
You misrepresent what someone says then attack that strawman.
Whatever you say (Dronester actually had made precisely the claims I said he did, as I repeatedly showed), but currently you're the one attacking a strawman. I never claimed we're nearing the limit of society to absorb technological changes.
This actually is a lot like the embassy thread - complete with your general level of dishonesty and spurious accusation.
Maybe your concept of this technological singularity is different than what I understand.
Well, I've only explained the concept four times, now. Maybe you'd like to go back and read for comprehension, instead of with an eye towards misleading and dishonest misstatements of the positions I've articulated.
You have shown no evidence for the first.
I'm not under an obligation to. Those who, like you, assert that society's capacity to absorb technological change is growing exponentially are the ones who are obligated to provide evidence for their views. Can you?

This message is a reply to:
 Message 35 by Theodoric, posted 02-14-2011 9:12 AM Theodoric has not replied

  
Copyright 2001-2023 by EvC Forum, All Rights Reserved
