This has been sitting in my draft box for a while.
I read Flowers for Algernon (FoA) not too long ago, and then read Ender's Shadow (ES) two weeks or so ago, and realized that both authors are playing off the same theme: that of giving up long life for extreme intelligence. There were some side effects too - a decreased ability to understand social motivations, isolation, etc. I'm not sure if this is what the authors see happening in real life, or if it's just coincidence that both of them used the same events.
I am really more interested in whether I, given the choice, would choose extreme intelligence over life. Charlie in FoA didn't really know what he was getting into, whereas Bean in ES had no choice at all. For Charlie the outcome was unavoidable; by the end of the book he had already fallen back to his previous state. Bean, on the other hand, went on a relativistic journey so that others could discover a cure.
If I had the choice, it wouldn't take me long to decide at all: I would go through with the surgery. I'm not sure what it is that compels me to do so. Part of it certainly has to do with the fame - not fame from the surgery itself, as with Charlie, but from discovering something in the short time that you are brilliant. The career peak of many mathematicians and scientists is in their 20s, after which they simply don't have the mental agility to do anything great. Is such a surgery not simply a way to be briefly brilliant at will, and then die off?
The other compelling factor is anticipating how much you would learn and know. Here I am reminded of Charles Babbage, who said that he "would gladly give up [his] life if [he] could have lived three days five hundred years hence." This is a deal that I would make, too. Being in a religion class now makes me think of transcendence, a plateau of intellectual nirvana at the end of the surgery. But think also of how mankind would advance! If I'm not the only person willing to do this - and I'm pretty sure I'm not - then science would grow in leaps, led by consecutive short generations of super-geniuses. Even without a cure for the aging side effect, progress would not start and stop the way it does now.
I see why this is attractive now: it's a very transhumanist idea. Since I value intelligence above many things and don't mind if we are displaced by others, the leadership of super-geniuses doesn't bother me at all.
My only regret would probably be the uselessness I would feel after being brilliant for such a short period. Like Charlie I would probably contemplate suicide, and unlike him I wouldn't be bound by observation. By then death wouldn't bother me either.
"For God doth know that in the day ye eat thereof, then your eyes shall be opened, and ye shall be as gods..."