Tag Archives: Nagel

Thomas Nagel and materialism

I was reading a post yesterday on Signs of the Times about Thomas Nagel, and it got me thinking that artificial intelligence is one way to look at the claim that there's no God and that all we are is physical material. Nagel critiques the pure materialist point of view and has made himself a bunch of enemies because of it.

Artificial General Intelligence (AGI) dealing with new situations (Photo credit: brewbooks)

Basically, the question is: can you get more intelligence out of a machine than the intelligence you put into it? Think about all the people working on, say, the closest thing we have to real artificial intelligence in a computer: a huge team of programmers, engineers, mathematicians and so on. On top of that, there are all the theories, findings and prior work, just in computer science alone, that this team is building on, not to mention that all of these people are collectively working on top of ideas from all sorts of fields. It's the sum total of their experience and education that gets them to their best AI.

So imagine you could quantify that: all the intelligence put into the creation of the most advanced AI developed to date. There would be a huge amount of intelligence banked up; if you put it to use, it would be an extreme super-brain, capable of God knows what superhuman mental feats.

So that's the intelligence put in. If you compare it to the intelligence coming out of this AI, well, the best AI machines as far as I'm aware have difficulty just mimicking human walking movement. Not a lot of intelligence coming out, compared to the vast amount that had to go in to create it.

So say we look at the atomic, energetic soup that was the base material of our universe. There's not much intelligence put in there. Yet it supposedly randomly produced complex beings that get inspired, have feelings and can ponder all this stuff we're pondering now.

The randomness of that ever happening is, for me, unfathomable. It's like saying that if I threw a bucket of matches in the air enough times, eventually they would fall into place as a toy-sized house rather than just a pile of matches.

For me, anyway, the materialist reductionist view seems the spookiest of all.