
Singularity Summit, part four

Continuing where I left off in part three, the next few speakers at the summit were...

Eric Drexler:

Drexler is the author of Engines of Creation, the book which first introduced the world to "nanotechnology", the idea that we'll one day be able to build tiny little machines that operate on the scale of molecules. Later, the word sort of got warped into meaning "any tiny little machines", even ones much bigger than molecular (nanometer) scales... which is kind of silly, but it has caused the terminology today to shift towards calling real nanotech "molecular assemblers" instead. I enjoyed listening to Drexler; but the thing I found most interesting is how much trouble he went to to emphasize that he wasn't talking about "nanobots"... that is, nanoscale self-replicating machines. It seems fairly natural to me to assume that not too long after we have molecular assemblers, we should be able to make ones that can self-replicate... but he went out of his way several times to emphasize that that wasn't what he thought would happen, and he even said once "that's not a part of anybody's vision today." I think that's an exaggeration, but maybe he just means that nobody is working on it yet since it's too far off. Or maybe he's just tired of people wanting to ban molecular assemblers based on the assumption that they would lead to swarms of nanobots, grey goo, etc.

Max More:

Max More is the founder of the notorious Extropy Institute, which has recently shut down. The Extropy Institute was one of the two largest Transhumanist organizations (the other being the WTA, which James Hughes currently runs). I remember being personally inspired by the idea of extropy a decade or so ago when I first found their website (I was 18, my first year out of high school). I started occasionally referring to myself as an "entropy warrior" for a while, but it soon got old so I gave it up. It's always seemed to me that the idea of extropy is a bit loose and, aside from sounding neat, the whole thing involves enough different buzzwords that I've never bothered to really get into it enough to figure out what exactly they're saying or doing with it. Nevertheless, Max More seemed like a cool guy, and I got to say hi afterwards and asked him why they recently decided to shut down. (Although his answer was diplomatic enough that I had to go to troyworks for the real scoop... who, incidentally, I really enjoyed finally meeting.) What did I think of his actual talk? Hmmm... I guess I liked it, although I can't remember all the things he said. Basically, general agreement on my part, but I saw a lot of the kind of ethical philosophizing that I find a little bit pointless. I like thinking about meta-ethics sometimes, but when it comes to ethics I think it's just too subjective (and boring), and about all you can say is "to each her own". Basically, I like him more than John Smart, but not nearly as much as Bostrom, Yudkowsky, or Kurzweil. (I mention these 5 together because they were the 5 most hardcore transhumanists there.)

Christine Peterson

She is Eric Drexler's ex-wife and runs the Nanotech Foresight Institute, an organization he originally founded. I don't remember everything she said, but she emphasized the need for people to design secure operating systems and improve internet/network security tools in order for us to have a safe and positive future for humanity, especially once we get closer to the singularity. I think this is a good point, and I'm glad somebody made it. Security is extremely important as we increasingly rely on technology, especially if we're to merge with it or become submerged in it. I give her the thumbs up, even though she reminds me way too much of a high school principal (the same problem I had with Janeway, which caused me not to like Voyager). :)

John Smart:

I wish I could like him more. After all, he started the Future Salon Network which I'm affiliated with, and I think it's a wonderful way to get people together to talk about technological impacts and I'm grateful for his contribution to that. But he used so many buzzwords that I have a hard time believing they mean much. Example: "MEST = Matter Energy Space Time". Something about the way he talks about these things just gives me the "pseudoscience" vibe even though I should probably give him more of a chance than just hearing one talk and visiting his webpage a couple times. A friend of mine pointed out to me yesterday that the comment about "intelligent" black holes on his website does make sense in a roundabout way, under very speculative assumptions that I won't go into. Nevertheless, I don't see the point of worrying about these things now when we can't possibly predict what things will be like that far in the future. He referred to half a dozen different popular science books, some of which I am apathetic or neutral to and others which I put squarely in the "crap" category (*cough* James Gardner). I'm sure he's a nice guy, and I have to respect what he's done with Future Salon, but I really wasn't very impressed with his talk... my least favorite of the 12 (well, aside from Bill McKibben's of course... boy is he a piece of work). Smart came across to me as what I half expected Ray Kurzweil to sound like. But instead, Ray sounded completely down to earth and John Smart sounded a bit "out there" to me.

Well, it appears that once again I'm going to have to save the rest for later. In part five, I'll finish off the last two speakers and then summarize the whole event. When I started this summary I labelled it "succinct". I apologize for not living up to that in any way, shape, or form. I guess I just have a lot more to say about the summit than I thought :)


( 4 comments — Leave a comment )
May. 22nd, 2006 05:44 am (UTC)
I read "Nanosystems" cover to cover, and even stopped to learn a lot of the math and physics required to understand it along the way. If you gave this man a billion dollars, he'd have a working molecular manufacturing system within 5 years. It's a matter of economics, not scientific know-how.

The world will be a very strange place soon. Consider yourself lucky you're one of the few people who will be able to comprehend it.
May. 22nd, 2006 06:23 am (UTC)
Smart - I had the same reaction.
Aug. 21st, 2008 05:47 pm (UTC)
Drexler's reasons for expecting assemblers not to be autonomous self-replicators are the risks of grey goo type problems, and the limited advantages to creating fully autonomous self-replicators. See http://www.iop.org/EJ/abstract/0957-4484/15/8/001/.
I'm a bit concerned that there's more of a slippery slope toward self-replicators than he admits.

I find it a bit odd that shortly after a favorable summary of Cory Doctorow's ideas you link to a dead tree version of Engines of Creation rather than the html version.

Your opinion of John Smart is about right. He consistently strikes me as mostly a salesman, not an intellectual. I like much of what he's selling, but he seems blind to the problems that may be associated with new technologies.
Aug. 21st, 2008 08:12 pm (UTC)
At the time I wrote this, I had never met John. Now I am a lot more closely involved in the same network of friends as him, and have talked to him a few times about stuff. Also, he came and gave a talk at my Future Salon in Santa Cruz (which has since dissolved) which cleared a lot of things up for me. I like him a lot more now than the first impressions I gave here. But I still think his ideas are very vague and may not hold a lot of content. I've told him that I like the spirit of his ideas, but I feel like they need to be sharpened up a lot more before academics take them seriously.


