[personal profile] beetiger
Although I really enjoy hanging out and dreaming with people who are transhumanists, I'm definitely not one myself.

I think one of the reasons probably overlaps with one of the reasons that I'm not a Christian. I'm wary of any philosophy that focuses on the "next world" while not paying enough attention to this one. I'm afraid of feeling disconnected from the things that are right in front of me. I love the warmth of the sun and the stars in the sky and the smell of ozone from rain on pavement and the sound of human voices singing, not just because of my experience of them, but because of the mundane conceptual background that tells me that they are there, and real. From this spot in my life, I like the way reality matters right now, because we don't have forever. I still really think this is one of the things that drives us to do the work that does, in fact, move our world into the future.

This pregnancy makes me feel very biological. It's draining, in some ways, and I certainly am not happy about the temporary loss of the ability to think as critically as usual, but essentially it's a marvel. There's a groundedness about knowing that my body knows how to make a person, a person who is currently squirming and kicking and is going to do something that I've never dreamed of, in a world I'm never going to see. Check in with me in a few decades and ask again, but right now, I think I'm going to be ready to give the world to him to do that.

Maybe I'm not a transhumanist because I don't think I'm likely to be as smart as the people who come after me, if I'm willing to step aside and leave them the world to play in. Maybe I'm not a transhumanist because I want to love this life to its fullest, and I really don't believe we are ready to succeed in time for me to realize those dreams, and that fills me with frustration rather than anticipation. Maybe I'm just lucky enough to be living in a body that mostly works. Maybe I'm just not very ambitious.

But I don't see much of a point in living forever if we haven't even figured out yet how to fully live right here, right now.

Date: 2003-08-11 11:21 am (UTC)
From: [identity profile] en-ki.livejournal.com
I want to live forever precisely because it's such a big job to figure out how to live right here, right now. Why waste the effort by dying before you have time to make any real progress?

Date: 2003-08-11 11:52 am (UTC)
From: [identity profile] postrodent.livejournal.com
You took the words out of my audio output device. Also, transhumanism holds out the thought that we might actually become _smarter_, or that at least our machines might, and then perhaps we'd do a better job of running our lives and correcting all the horrible damage we've done to ourselves and the biosphere.
But on the other hand, my faith in transhumanism/singularity/the power of technology in general has taken a hit lately. It's nice to have insanely advanced hardware, but most of the important decisions are made at the software level. We have tons of resources, technology and smarts right now, but those riches are mostly squandered on trivialities or disastrous folly. For the most part, we aren't even _looking_ for the answers to the most important questions about our future -- and that's because of our society's misplaced priorities and bad memes. What we really need is _wisdom_, another highly subjective term, but one that, regardless, isn't likely to show up in implantable chip form anytime soon.
This leaves me in the disquieting position of putting my hopes for the future on _human beings_. Yikes.

Date: 2003-08-11 12:18 pm (UTC)
From: [identity profile] en-ki.livejournal.com
[I think I ended up just restating what you said in my own terms.]

Right: hardware, which has improved and will improve according to Moore's law until it doesn't anymore, can only make processes faster and more reliable, not more correct. Wisdom is in software.

But where does wisdom come from? Experience, integrated with existing wisdom. Faster I/O means experience comes in faster and faster CPUs mean it can be integrated faster. So to me it's still credible that we can have Really Wise AI by starting with something as wise as we can write (perhaps even just an image of a human mind) and giving it an imperial arsetonne of bandwidth and CPU cycles with which to advance itself. We just need to bootstrap things up to that level, either by augmenting the brain or by improving our programming techniques, and a lot of these processes are still hardware-limited.

This is where the priorities and the memes that set them come in, of course. I could rant about this for a long time, but my priorities are misplaced; I must resume extracting resources from my small organization, which exists to extract resources from large organizations programmed to extract resources from my countrymen and destroy them in order to destroy or control resources belonging to other people. Because, you know, actually using resources to accomplish things other than the acquisition of more resources is bad. Maybe the right chip in my brain would give me the sense to quit my job and write the software I want to write.

Date: 2003-08-11 12:47 pm (UTC)
From: [identity profile] neillparatzo.livejournal.com
This is so true. Life is hard. Give me some time to figure it out, for Christ's sake!
