swinging with the big j

I’ve been away, but while in the Great Lakes region, I came across this framed picture and snapped a photo of it. I emailed it to my friend Liz and expressed my amazement at this image. Her response: “Are you saying you were never pushed on a swing by Jesus or a robed Jesus-like man in your childhood? I am very sorry for you.”

Dang. I didn’t know everyone except me got pushed on a swing by the white-man Jesus. My experience was of some guy with horns and a tail pushing me off the merry-go-round.

[image: jesus_tire_swing2]

perfect cult recipe

I haven’t posted for quite a while, so in desperation I’m putting up this long, somewhat dry message. One night I was wandering from site to site and came across a link to this interview transcript from 2010. Though it wasn’t intended to do so, the interview lays out the distilled essence of a cult.

For some background: the site the interview comes from is lesswrong.com, and many of the people who post there believe in the probability of a technological singularity. By one common definition, the singularity is the point at which artificial intelligence is created, rapidly leapfrogs human intelligence, and becomes unimaginably smart and powerful. That belief is the basis for all that comes after. The interviewee is Eliezer Yudkowsky, a research fellow at an organization he helped create, the Machine Intelligence Research Institute (MIRI), and a big promoter of the singularity concept.

Step one in the development of a cult group is a shared belief in something that inspires people. In this case, it’s the assumed probability of the singularity. Step two is elevated importance: the assumption that the most important project in the world is developing this super-AI and making sure it does not damage or destroy humanity. Step three is a consequence of step two: the assumption that giving money to this project is the most useful and important thing you can do with your money. Here’s one of the interview questions.

I would therefore like to ask Eliezer whether he in fact believes that the only two legitimate occupations for an intelligent person in our current world are (1) working directly on Singularity-related issues, and (2) making as much money as possible on Wall Street in order to donate all but minimal living expenses to SIAI/Methuselah/whatever.

The SIAI is the Singularity Institute for Artificial Intelligence (now MIRI, I believe), and Methuselah is a project to extend human lifespan. Yudkowsky’s response:

So, first, why restrict it to intelligent people in today’s world?  Why not everyone?  And second… the reply to the essential intent of the question is yes, with a number of little details added.

In other words, he agrees with the questioner. There are only two “legitimate” occupations. He adds some fine points about how much of one’s earnings might realistically be donated. A bit later, he goes on to really amp up the urgency.

This is crunch time for the entire human species.

… and it’s crunch time not just for us, it’s crunch time for the intergalactic civilization whose existence depends on us.

This is starting to sound like a science fiction dream: saving the world from unfriendly AI, enabling our intergalactic future. Yes, but if you read a lot of science fiction, and if you’re looking for something cool to believe in, this might get your blood going. If you go beyond that and believe that Yudkowsky’s organization is supremely important, and that Yudkowsky himself is supremely important, you have step four: a great and wise leader.

Someone asks, what would happen if you, Yudkowsky, were no longer on the scene? Who would carry on? Part of his response:

And Marcello Herreshoff would be the one who would be tasked with recognizing another Eliezer Yudkowsky if one showed up and could take over the project, but at present I don’t know of any other person who could do that, or I’d be working with them.

Only one man can save the world, and it’s me! And with that, the making of a cult is finished. If you wonder whether there is really a potential cult of personality built around this person, consider another of the questions put to him:

Could you please tell us a little about your brain? For example, what is your IQ, at what age did you learn calculus, do you use cognitive enhancing drugs or brain fitness programs ….?

We have the most important cause in the world, in all of history, present and future. Therefore give most of your spare money to this cause. By the way, I’m the only guy who can do it. It could be L. Ron Hubbard talking, or almost any other cult leader, but I’ve never seen the belief, the money, and the “guru” personality encapsulated so neatly in one interview and tied with a bow.

Singularitans love graphs like this:

[graph: singularity1]


Singularitans with a good sense of humor love graphs like this:

[graph: singularity2]

spectral zoology – a preview

OMG. ROFL. Holy Toledo.  A new website is born. And it presages the greatest animal-spirit-guide-oracle-card event in history. 

Spectral Zoology

The card decks, stuffed with magic and packed with celestial animal soul power, should be available by the end of October. Check back frequently at this location:

Spectral Zoology

The card deck of the future, and of the deep past. These 35 cards will rock the very foundation of the animal-spirit-guide-oracle-card industry.

Spectral Zoology

The invisible hand of the market will place a deck in your pocket when you aren’t looking.