Thought — 2 Min Read

Little Lot Life

by Case Greenfield, March 31st, 2024

Probably the biggest mystery for us is how life ever started. How can it be that a complexity of seemingly dead atoms and molecules ultimately leads to something that we call life? How do molecules in a cell know what to do? How do cells in a body know what to do? How does it get from little to a lot to life?

A few days ago, Microsoft and OpenAI announced that they plan to build a hundred-billion-US-dollar datacenter, mainly for developing AI, Artificial Intelligence. Apart from the truly incredible size of the investment, this really struck me. Why? Well, the basic philosophy behind the investment is that more data and more complexity create more intelligent systems.

Artificial Intelligence

This may require some explanation. The development of AI started some seventy years ago. The basic idea was to copy the workings of the human brain, especially its basic building blocks, the neurons. So-called ANNs, Artificial Neural Networks, were built. But until roughly 2012, they didn’t work very well. So, many scientists believed the entire idea of ANNs would never work, even if you made them much, much bigger. The philosophy was simple: if you have something that doesn’t work, and you multiply it by a thousand, a million, a billion or even more, then it still won’t work!
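To make that building block concrete: an artificial neuron simply takes a weighted sum of its inputs and squashes it through an activation function. Here is a minimal sketch in Python, purely my own illustration rather than code from any particular framework.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of inputs plus a bias,
    passed through a sigmoid activation to give a value between 0 and 1."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# A neuron with two inputs; an ANN is nothing more than
# many of these wired together in layers.
print(neuron([0.5, 0.8], [0.4, -0.6], 0.1))
```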

Well, it turns out now, they could not have been more wrong! I believe it was Stanford computer scientists Andrew Ng and Fei-Fei Li who first developed a method to process data in ANNs much faster with computer graphics cards (Graphics Processing Units, GPUs) rather than with the normal central processing units (CPUs). The basic principle is that ANNs require a lot of simple computations that can easily be executed in parallel. And GPUs are specifically designed for parallel computing. So, it worked! They were able to process much bigger amounts of data in much shorter timeframes. This resulted in much more accurate ANNs, initially for image recognition.
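To see why that parallelism matters: applying one layer of an ANN to a whole batch of inputs is essentially one big matrix multiplication, and every element of the result is an independent multiply-and-add. Below is a small NumPy sketch of my own, run on the CPU just for illustration; on a GPU (via libraries such as CuPy or PyTorch) the very same operation is spread across thousands of cores at once.

```python
import numpy as np

# One ANN layer applied to a batch of 1024 input vectors in a single shot.
# Every output element is an independent multiply-and-add, which is exactly
# the kind of work a GPU's thousands of parallel cores are built for.
batch = np.random.rand(1024, 784)    # e.g. 1024 flattened 28x28 images
weights = np.random.rand(784, 128)   # weights of a 128-neuron layer
bias = np.random.rand(128)

activations = np.maximum(0.0, batch @ weights + bias)  # ReLU activation
print(activations.shape)             # (1024, 128)
```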

Since then, people around the world have built larger and faster computers with unbelievable numbers of GPUs in parallel. And the algorithms also improved, of course. Backpropagation was a big step forward. And today, transformers, token-based LLMs, Large Language Models, such as ChatGPT and GPT-4 from OpenAI, already seem to show a form of intelligence. It is expected that still faster systems with even bigger data sets will produce even more intelligent AIs.

That quest has led Microsoft and OpenAI to decide on this totally out-of-this-world investment of more than 100 billion US dollars for a new datacenter, dubbed ‘Stargate’. And the underlying assumption is simple: more complexity leads to more intelligence.

The expectation among techno-optimists is that within a few years, or a few decades, some parties will succeed in building AIs that are at least as intelligent as a human being. That is called AGI, Artificial General Intelligence. One step beyond that, ASI, Artificial Super Intelligence, seems to be the holy grail in this competition. At that point, AI would be much smarter than any human being.

The origin of life

Now, what I find fascinating here is the idea that complexity leads to intelligence. It made me think about the unsolved question of how life ever emerged from dead atoms and molecules through increasing complexity. The origin of life, yes. Scientists call it abiogenesis.

How did we ever get from little atoms and molecules to a lot of atoms and molecules to … life? And not only life, conscious life.

And, what will be the next step, after conscious life as we know it today? What will happen if we add even more complexity than the complexity of the human brain? What comes after consciousness?

One thing seems clear to me. Apparently, it has something to do with complexity. (Despite the second law of thermodynamics, haha.) Contrary to what those scientists said, that more of the same would never give a different result, something does happen when you increase the complexity of systems. It is not 1 + 1 = 2. It is not linear.

One plus one is three, after all!

Something additional arises. 1 + 1 = 3, after all! You see it with the LLMs. You see it with life. Will we see it with consciousness?

Of course, this idea has led to lots of speculative theories, such as Gaia and the self-conscious internet. The Gaia idea assumes that the complexity of all matter and life on planet Earth together has led to a conscious organism called Gaia. Similarly, the complexity of all data and interactions on the global internet would have led to a conscious system. You see the same with AGI and ASI. Some people believe that AGI and ASI systems will develop a form and degree of consciousness.

Typically, the origin-of-life mystery is one of those questions that we simply have no answer to.

Live with it!
