Why only geeks and hippies can save the world - Embracing Chaos


Why only geeks and hippies can save the world

[Here is the full text of what I practiced for my talk at Ignite Seattle last night.  I didn’t manage to cram it all into the 5 minute presentation, largely because the audience was reacting a bit too loudly in places.  IMHO that’s a good thing.  You can download my slides (slightly edited from the presentation).  Or you can watch it on video.]

I’m here to talk about a system of morality that’s based on the upcoming end of society as we know it.  I’ll explain why only geeks and hippies can save the world.  I’m serious — I’m talking about the possible destruction of everything we know and care about.

Let’s look forward to the next 1,000 years.  What’s life going to be like?  Are we going to be flying around in spaceships visiting other planets like in Star Trek?  I don’t think so.  Or will we be killing each other over the last few gallons of gasoline like in Mad Max?  Maybe, and this is what I’m really scared of.  Or will the machines have risen up to try to destroy us like in Terminator?  Again maybe, but I’m not really worried about this, and I’ll explain why.

Now look back.  A billion years ago, life first showed up.  A million years ago, humans showed up.  Just a thousand years ago we had printing presses, a hundred years ago we had cars, and ten years ago we had Google.  Progress is speeding up exponentially and it’s not going to stop.

What’s happening is that people are getting smarter and more capable of solving complex problems both by themselves and by collaborating with others using tools like e-mail and text messaging.  Our brains are slowly starting to merge with computers.  Look at cell phones: who here actually remembers any phone numbers any more?  And who cares?  We don’t need to.

We’re heading towards what’s known as The Information Singularity.  This is where human brains and computers actually merge into the same thing.  When this happens, technology will progress so fast that unaided humans will be completely unable to keep up.  This is where all of our technology is heading.  But you know, we might never get there.

What if there was a nuclear war?  How far back would that set us?  100 years?  100,000 years?  Would we ever be able to get back to where we are?  Maybe not.  That could be the complete end to evolution as we know it.  Nuclear war’s not the only way this could happen either.

Imagine that somebody got so pissed off that they bio-engineered a super-virus to kill all white people.  And it accidentally killed all people.  Or what if global warming got to the point where the weather is so bad that advanced society just can’t exist?  The ecosystem could collapse.  We could run out of energy resources.  Gray goo.

I believe that in the next thousand years something is going to render our planet uninhabitable to life as we know it.  And the question is, when that day comes, will we be ready for it?  Will technology have advanced to the point where we don’t need life as we know it in order to preserve what we really care about?

Well what is it that we really care about?  This is the critical question facing our society right now.  We can’t close our eyes and hope it just goes away — it won’t.  Now some will say “EARTH FIRST!  People made this problem and we need to back off and let nature fix itself.”  But I don’t buy that.  I say we embrace the chaos and push forwards.  Here’s why.

I believe that the most valuable thing in the world is complex thought, information, ideas, memes, logic, reason, discussion, art, emotion.  All of these things are way more important to me than things like birds.  Or plants.  Or even humans.  Because we don’t need bodies to listen to music.  Or to tell stories. Or to fall in love.

We can achieve salvation through technology.  When the upcoming robot revolution arrives, I say we let the robots win.  Don’t fight them — join them!  Let’s cast off these weak unreliable human bodies and transcend to a society of pure thoughts and ideas.

We can do it!  We can build a network of computers powerful enough to hold all of us at once.  We can upload our consciousnesses into these computers by simulating the human brain in software.  It’s an incredibly hard problem, way harder than, say, simulating the weather.  But we can do it.  Computers are getting faster all the time, and likewise our understanding of the brain is getting better and better.  Someday soon we will be able to simulate an entire brain in software down to the very last neuron, and when that happens, that computer will actually have the personality of a real human being.  It’ll work because there is no quantum soul.  We are nothing but our neuronal structure.

Some people will miss having bodies.  They’ll miss things like kayaking and eating food.  But they won’t miss dying.  Just like nobody misses having a warm fire to come home to in their cave.

You know, our lives are pretty darned good here and now.  So I gotta ask: What are you going to do with this?  Are you just going to play?  Be a hedonist?  Or do you want to do something that matters with your life?  Do you want to work to preserve complex thought and information into the next millennium?  It’s up to you.

But if you do want to help, listen to Avi.  Install compact fluorescent bulbs.  Shop at Madison Market and support sustainable agriculture.  Get political and try to calm down the crazies who want to blow everything up.  In other words, be a hippie.  We might not be able to stop the fall, but we can definitely postpone it.  Hopefully for long enough.

Or work from the other side to speed up technology.  Talk to Bre about building robots.  Write educational software to make people smarter.  Work on communication tools.  Research how the brain works and how to connect it directly to computers.  In other words, be a geek.

Because it’s the geeks and the hippies who are going to preserve what’s really important into the next millennium.  If you ask me, to not do so is to act immorally. This system of morality is based on two axiomatic assumptions:

1) We cannot keep going like this forever.

2) Complex thought and information are more valuable than nature and life.

If you’d like to read more about this, Kurzweil has written lots of good books on the singularity.  My good buddy Mez has written a fabulous book on relevant technology trends.  Or you can read my blog at embracingchaos.com.  Thanks.

  1. Andres Colon says:

    Hello Leo, thank you for sharing your video and for pondering the future. :)

    I have posted my response to your video, which was shared by a member at Thoughtware.TV, the transhumanist multimedia repository. Your video is at the following url:


    The video links back to your page, so you'll be seeing traffic from us.

    If you do respond, please respond at Thoughtware or email me at my address which I have provided for you.

    My response is as follows:

    Great video. This was the first time I've heard about the Igniters in Seattle. The talk was both humorous and serious. I liked it and laughed quite a bit.

    Here are my thoughts on it:

    Leo must really know his audience. I'd skip the talk on any form of 'salvation' when talking to people about transhumanism.

    Enhancement, great; happiness, yes; progress, excellent; adaptation, definitely… but… salvation? Erm. That term doesn't seem right in there.

    I also don't understand why he says: "I believe that in the next thousand years something is going to render our planet uninhabitable to life as we know it." I assume he bases this belief on his idea of our planet's history of threats and extinctions, plus added threats arising out of our technological advancements. It is true life has gone through hell and back throughout its 3 billion year history on this planet, yet we're here, in great diversity and numbers.

    Don't get me wrong, I am not playing down the risks; I just don't *believe* it is inevitable that such events will take place. While I do not have any *faith* that they have to occur, I can consider the possibilities. It is possible for life and intelligence to prevent things from happening. And even if some of the worst comes to pass, there is a possibility we can contribute to overcoming such difficulties.

    I do favor uploading, and I'm looking forward to it, but doing so is a choice, and it has to remain that way. All transhumanists should be quick to emphasize that doing so *is not a requirement for salvation*.

    The word salvation has a lot of theological baggage. I don't think it should have anything to do with transhumanism.

    …anyway, his view on the inevitability of such an event makes Leo believe it is a moral thing to "cast off these weak unreliable human bodies and transcend to a society of pure thoughts and ideas".

    Let's not forget even uploads as we conceive them will have to rely on a physical space for computation, regardless of their ability to drift from one computational environment to the next. I cannot yet picture them without limitations. Regardless of backups, uploads as conceived are not yet just 'pure thoughts and ideas'. They will be bound to physical mediums, regardless of whether they implement distributed computing.

    If we talk about human minds on earth right now, a human consciousness could be said to be computed in a distributed fashion, via massive neural networks. As a collective, the consciousness is less vulnerable even if some neurons fail; it has its failovers, but it is still extremely vulnerable.

    So minds will reside in objects at a given moment, regardless whether the shell is biological or not.

    Our collective biology has given rise to intelligence on this planet. From microorganisms to vast colonies of bacteria, to multi-cellular organisms and conscious beings, it has taken billions of years and much sacrifice for life and intelligence to make it this far. Perhaps we, as transhumanists, should not be so quick to wish to "cast off" biology or allow it to meet its end due to its lack of adaptability in a mechanically driven world.

    We owe that very precious part of us, our consciousness, to trillions of hard-working cells that have come and died, mutated and suffered and adapted: collectives that have seen life through the worst of times. Thanks to them, we are here.

    If we believe in our intelligent capacity to do good with our technology, there is a possibility biology will still have its role to play. I do not care if biology turns out to be the weakest link. We should be mindful not to cast off our predecessors, who in reality are also ourselves. We should also be mindful of our actions as transhumanists, because humans are conscious entities that emerge out of a less capable collective. The actions we take towards our biological predecessors might set a precedent.

    Consider this: If we were to so easily discard our biology to continue transcending, it would be awfully ironic if whatever emerges from a potential massively interconnected network of uploads, such as a world wide mind (or a post-Singularity AI) eventually ends up 'casting off' all of us, everything, in order to preserve itself in some better suited shell, without any regard for our well being.

    We should never limit the complexity from which we emerge in order to transcend; we should embrace it. As intelligent beings, transhumanists must set this precedent, not because we have to but because we can. And if we are to leave a shell behind, such as the biological one we're now considering for uploading, we should take care to give it the tools to continue its progress of evolution and discovery, for it may yet have a role to play in the future. As far more capable collectives, we should care for biological existence even better than it has cared for us, and it has, for billions of years.

  2. leodirac says:

    Glad you liked it, Joe. For everybody else's convenience, I've linked to the online video at http://www.embracingchaos.com/2007/03/ignite_video_on.html

  3. Joe Duck says:


    Really enjoyed the ignite talk I just viewed online. You have another vote for the blending with the machines…before it's too late.

  4. Ramez Naam says:

    I loved your slides! So sorry I had to miss the talk itself. I can't wait to see the video.


  5. Clay Shentrup says:

     Speaking of Madison Market, I'm trying to get them to use Range Voting for their internal council elections. Range Voting might save the world, if anything will.

     Clay Shentrup
     Seattle, WA

  6. Adam Rakunas says:

    So, does this mean you and your buddies are going to start taking over Seattle politics? 'Cause, frankly, it'd be about bloody time.

  7. Hari Jayaram says:

    While I am scared of what nuclear war can do, even scarier is the looming danger of a catastrophic decline in global bio-diversity.

    Articles detailing a catastrophic decline in global fish stocks and plateauing agricultural productivity all make me think of how, in our struggle to stay alive, we will destroy the world we live in.

    I often think we will out-compete every living thing on this planet long before something like nuclear war or a global pandemic wipes us out.

    Of course there is enough reason to believe that around the time we find ourselves almost alone on this planet we will start to kill each other to stay alive.

    While I agree hippies have a lot of this right, most geeks bury their heads in their byte jungles while the real stuff is fast disappearing.

  8. Nancy White says:

    I am sorry I missed your talk. It sounds great and I look forward to the video. I was a wimp and left early. (I'm hearing "all work and no play" running through my head!)

  9. leodirac says:

    Thanks! I'm glad you liked it. I'm more glad that people seem to have gotten something out of it — that it made at least a little sense. I was really nervous that I was trying to cram too much into 5 minutes and that everybody would be lost.

  10. Riana says:

    Loved the talk! So did everyone else – I was hanging out with Igniters until after 2 in the morning, and everybody agreed yours was one of the highlights of the night's talks.

  1. […] now as I write about things like the fate of humanity, the nature of consciousness and how to save the world, I see two huge gaps in what science can explain.  For context, here’s a quote that I […]

  2. […] week I’ll be giving a talk at Ignite Seattle about Transhumanist Morality.  It’s going to be a fun challenge to […]

  3. […] posted videos for the rest of our talks from the second Ignite night, including my presentation on Why only Geeks and Hippies can save the world.  Watching it, I see that it’s a lot rougher than I remember.  The text as I […]

  4. […] I’m preparing my slides for my Ignite Seattle talk tomorrow night (tonight? Tuesday night) and I go over to my friends’ place to practice with […]