Look at the picture above. It is the motherboard (the main printed circuit board) from a computer. How do you think it came to be? It could have been formed by an earthquake at a spare electronics warehouse. After all, it’s at least possible that in the midst of the warehouse shaking and being destroyed, lots of spare electronic parts started crashing into each other and, as a result, just happened to produce what you see above. Of course, I hope that no one is silly enough to believe such a story. It is obvious that the motherboard is the result of careful design and craftsmanship.
As someone who started programming computers with punch cards, used DOE grant money to buy a 40-Megabyte hard disc for $1,200 in 1989, and stored his unimaginably vast (800 Megabytes) thesis experiment data on sixteen separate 10.5-inch magnetic tape reels, I have witnessed a lot of progress in the area of computers. I marvel at how technology can produce a hand-held device that is more powerful than the VAX-11/750 (pictured on the left, without the keyboard and monitor) that I used to analyze my thesis experiment. Nevertheless, with all of the amazing progress that human engineering has produced, it doesn’t come close to what we see in the natural world.
Consider, for example, this article’s comparison of a personal computer to a mouse brain:
A personal computer simulates a mouse-scale cortex model (2.5×10⁶ neurons) 9000 times slower than a real mouse brain operates, while using 40,000 times more power (400 W versus 10 mW). [reference marks omitted for clarity]
In case you don’t understand the lingo, 2.5×10⁶ is 2.5 million, and neurons are the cells that carry out brain-related functions. There are other support cells in the brain, but the neurons are the brain’s “workhorses.” So a mouse brain has about 2.5 million operational cells, which is close to the complexity of a personal computer. However, the mouse brain operates 9,000 times faster than the computer, and it is 40,000 times more efficient when it comes to energy usage!
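If you want to check the arithmetic yourself, here is a quick sketch in Python (a minimal check using only the figures quoted above):

```python
# Sanity check of the quoted figures: 400 W for the computer simulation
# versus 10 mW for a real mouse brain.
pc_power_watts = 400
mouse_power_watts = 0.010  # 10 mW

print(pc_power_watts / mouse_power_watts)  # 40000.0, the "40,000 times" figure
```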
Imagine deciding to use a computer that is 9,000 times slower than the computer you use now and consumes 40,000 times as much power. What would you think? You would think it’s a piece of garbage, right? Compared to a mouse brain, one of humankind’s major technological achievements is like garbage! Of course, if you compare technology to the human brain, it gets much, much worse. The same paper indicates:
Simulating a human-scale cortex model (2×10¹⁰ neurons), the Human Brain Project’s goal, is projected to require an exascale supercomputer (10¹⁸ flops) and as much power as a quarter-million households (0.5 GW). [reference marks omitted for clarity]
Once again, to decode the lingo, you need to know that “flops” (floating-point operations per second) is a measure of computer speed, and 10¹⁸ is a 1 followed by 18 zeroes. Currently, the fastest supercomputer clocks in at about 10¹⁷ flops, so at this point, we can’t even build a computer that could be compared to the human brain. Our computers need to become 10 times faster to even start such a project.
In addition, look at the power required: 0.5 Gigawatts, which is 500 million watts. Guess how much power the human brain requires? About 12 Watts. So even if we could build a computer that at least has the human brain’s level of complexity, it would require about 42,000,000 times as much power as the brain. Of course, with all that, it would almost certainly be much slower than the brain.
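The same kind of quick check works for the human-scale numbers (again, a sketch using only the figures quoted above):

```python
# Checking the human-scale figures: a 0.5 GW supercomputer versus a ~12 W brain,
# and the projected 10^18 flops versus today's ~10^17 flops machines.
supercomputer_power_watts = 0.5e9  # 0.5 GW
brain_power_watts = 12

print(supercomputer_power_watts / brain_power_watts)  # ~41,666,667, about 42 million

required_flops = 1e18
current_best_flops = 1e17
print(required_flops / current_best_flops)  # 10.0, the speedup still needed
```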
Some scientists would have you believe that random mutations filtered by natural selection produced biological technology that makes our best human technology seem like garbage. Not this scientist!
A correction for you, Jay.
All biological computational systems run at speeds many orders of magnitude slower than silicon-based computational systems. If we use comparative clock speeds, biological computational systems run at under a kilohertz, whereas most silicon-based systems nowadays run 1,000 to 1,000,000 times faster. That’s by no means the whole story, but that is for another day.
The inherent difference is that biological systems appear to be mostly parallel, whereas silicon-based systems are mostly serial (even in their parallel mode). Our biological systems are orders of magnitude slower than the silicon-based machines, but their inherent parallelism seems to make up for it. As well, biological systems are not binary in nature; there are features of continuous analogue processing as well as various multi-state processing (2, 3, 4, and up to n states, where n is not known for certain).
Your other points about system complexity and power consumption are good. As a recent video discussion on silicon-based computation pointed out, binary systems are not actually binary; they are analog, and we consider voltage levels below a certain threshold as off and above a certain threshold as on so as to give us binary. The internal circuits are always on and, as such, have relatively high power requirements. If we could make actual on/off circuits, we could reduce the energy requirements considerably. But still not anywhere near the level of biological systems.
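To illustrate that point, here is a rough sketch in Python (the threshold voltages are made-up placeholders, not real logic-family specifications):

```python
# Hypothetical illustration of threshold-based logic levels: the circuit is
# analog underneath, and "binary" comes from how we interpret the voltage.
V_LOW_MAX = 0.8   # voltages at or below this read as logical 0 (placeholder value)
V_HIGH_MIN = 2.0  # voltages at or above this read as logical 1 (placeholder value)

def logic_level(voltage: float):
    """Map an analog voltage to a logic level (None = undefined region)."""
    if voltage <= V_LOW_MAX:
        return 0
    if voltage >= V_HIGH_MIN:
        return 1
    return None  # between the thresholds: neither off nor on

print(logic_level(0.3), logic_level(2.7), logic_level(1.4))  # 0 1 None
```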
A little while ago, there was an interesting article on the Hacker News site related to an AI beating humans at various games (the example being Go). A number of the responses focused on the fact that the human involved was doing far more processing of various I/O-related information, as well as playing the game, at speeds far lower than what the computer was running at. The computer was solely focused on the game and was only just able to beat the human player.
As someone who has been designing and developing systems at various levels over a period of 35+ years, I would say that our computer and associated technology is still at a fairly primitive level compared to the various biological systems in existence.
The other aspect that arises is that we do NOT know how the biological systems function, especially the computational varieties. There are various ideas but no actual understanding. For everyone who declares that they know how the biological computational systems work, we can find others in the same field with just as much expertise or even more who will dispute that claim.
In addition, all silicon-based computational systems are based on a systematic logic (even those that use probability functions) which is essentially Boolean. The logic of biological computational systems is, at present, unknown. It sometimes appears to be based on some understandable logical framework, but at other times, we have no idea what is going on.
I will have to disagree with your comment about speed, Bruce. As the source clearly states, a computer has the complexity of a mouse brain but runs 9,000 times more slowly. Thus, for the same level of complexity, biological systems are orders of magnitude faster than silicon-based systems. Now, you may be right that they accomplish this speed with more parallel processing than silicon-based systems. I really don’t know. However, the fact is that however they do it, the biological systems are orders of magnitude faster than silicon-based systems.
Please note that supercomputers do parallel processing, and yet our best supercomputer is still ten times too slow to simulate the performance of a human brain, at least according to this source. Thus, if the difference in speed between biological and silicon-based systems is parallel processing, then the biological systems do that better as well.
Good evening or morning to you, Jay.
Irrespective of what or how it may appear to you, the firing of a nerve impulse in biological computational machinery is slow. In terms of comparison to silicon firing rates, it is glacially slow.
Where the difference lies is in the incredibly complex parallel interconnectivity between neurons. Supercomputer parallelism is so minor in comparison that, for all intents and purposes, it is serial. This does not mean that the biological systems are running faster.
The second aspect is that how and why a neuron may fire is not Boolean. In silicon, the effective logic is, on the whole, Boolean: two states.
There are other factors that come into play in biological computational systems that depend on various chemical processes as well. All of these make the “clock speed” of biological computational systems slow.
When trying to simulate a biological computational system in silicon, most of the simulated processes are performed serially, irrespective of how much parallelism has been created in the silicon-based computational device.
For example, if the supercomputer is able to handle, say, 100,000 parallel processes, that is still minute in comparison to the parallel processes being undertaken by the biological computational system (the mouse brain). The simulation has to account for all known processes occurring, and most of these will be undertaken serially, whereas, in the biological system, most of these are done in parallel, other than the triggering of one neuron by another. Designing parallel computational algorithms is hard, very hard. The more complex the algorithm, the harder it is.
What you have to understand is that there are apples-to-oranges comparisons being made when talking about each system. If we use an analogy of “clock speed,” all biological computational systems are slow because of the inherent slowness of nerve conduction along nerve fibres. Certain biological features of neurons are there to speed the electrical impulses along the cell. But these transmissions are still very slow.
Don’t make the mistake of thinking that just because something responds quickly, it is running faster than something that responds more slowly. We can look at it this way: a process that takes 100,000 steps to complete using a serial algorithm running at a clock speed of 100 kHz will take 1 second to complete. Another process uses a highly parallel algorithm of 20,000 processes, each taking 5 steps, where each step takes 0.2 seconds (a 5 Hz clock). Both will complete the task in one second, yet the first system is running at a clock speed 20,000 times faster. Think of silicon systems as the first and biological as the second.
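Here is that example worked out in a quick Python sketch (the numbers are the hypothetical ones from the paragraph above):

```python
# A fast serial system and a slow, highly parallel system can finish the
# same task in the same wall-clock time.
serial_steps = 100_000       # steps the serial algorithm needs
serial_clock_hz = 100_000    # 100 kHz clock

parallel_steps = 5           # steps each parallel process performs
parallel_clock_hz = 5        # 5 Hz clock, i.e. 0.2 s per step
parallel_processes = 20_000  # all run simultaneously, so they add no extra time

serial_time = serial_steps / serial_clock_hz        # 1.0 second
parallel_time = parallel_steps / parallel_clock_hz  # 1.0 second

print(serial_time, parallel_time)           # 1.0 1.0
print(serial_clock_hz / parallel_clock_hz)  # 20000.0, the clock-speed ratio
```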
The problem for AI is that biological computational systems are so complex we have no handle on how to simulate those systems in silicon. We make lots of guesses as to what is happening and have been doing so for many decades, but they are still only guesses. These guesses change every time some researcher sneezes (I know that’s hyperbole, but AI research gives that distinct impression all the time).
It is why I have no concern over AI at all. God’s creation is of such complexity and magnificence that mankind will only ever poorly match what He has done. From my perspective, too many of mankind’s ideas of how the various parts of the universe work are based on the concept that God does not exist and, as such, will not succeed in gaining a proper understanding of the universe around us.
In every technological area, including computational developments, we try to emulate what we see in nature. But we do it poorly and very inefficiently and commonly with serious side effects on our living environment.
Finally, every so often, we see much ado about specific advances being made in computational systems, including AI etc. Yet, when the dust settles, there has been fundamentally little progress made. Technological developments may mean faster and smaller machines, but many times the lessons of the past have been forgotten and the advancements are not as significant as they have been publicised.
God is great and may His blessings bountifully cover you and your family in the days and weeks ahead.
Thanks for posting, definitely an interesting article!
I’d go as far as to say a single cell in our body is far more advanced than our best computational creations. CLEARLY a sign of intelligent design. A single cell contains machinery that we can only dream of producing. 3D DNA origami is the closest thing I can think of, and so far we’ve only been able to make fragile smiley faces and boxes. Most recent biophysical advances have relied on exploiting enzymes built by nature to produce the biology we need.
Regardless, I’m not against the idea that humans could produce a machine far more capable than the human brain some day. A computer built in 1998 is well over 9,000x slower (in FLOPS) than one built today, and you would most certainly consider a 1998 computer “garbage.”
I agree with Bruce on speed. Biology is VERY slow, but it operates in a massively parallel manner, letting it complete tasks quickly and efficiently. The polymerase responsible for DNA replication only runs at a few thousand (if that) cycles per second in vivo (in vitro it’s even slower), and neuron signals only travel on the order of 100 m/s at best. I remember hearing somewhere that at this rate of exponential computational increase, by 2035 we’ll have a computer capable of simulating a human brain, and by 2040 we’ll have general AI. Who knows?
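For what it’s worth, that 9,000x figure is easy to sanity-check with a rough sketch in Python (the 1.5-year doubling period is just a rule-of-thumb assumption on my part, not a figure from the article):

```python
# Back-of-the-envelope check of the "well over 9,000x since 1998" claim.
# Assumption (not from the article): FLOPS roughly double every 1.5 years.
years = 2017 - 1998          # 1998 to roughly the time of writing
doubling_period_years = 1.5
speedup = 2 ** (years / doubling_period_years)
print(f"~{speedup:,.0f}x")   # ~6,502x; a faster 1.3-year doubling gives ~25,000x
```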
Good morning, Seth,
Late last night (my time), as I was writing my response to Jay, I took some time to refresh my memory about neuron transmission times, and the fastest is only 200 m/s. It can be under 1 m/s for some neurons.
What makes it so fascinating is that God has so designed the systems in question that they operate so well even at these incredibly slow speeds.
As far as silicon computational speeds are concerned, we are at, or close to, the top speeds possible. Our only real options are increasing parallelism in computational devices or actually making real breakthroughs in quantum computational devices. Designing highly parallel computational devices is very hard to get right. We know how to do some things well in this area, but most of our algorithms are serial.
Quantum computing is something that we will need to see demonstrated first. There are lots of ideas in this space, but we haven’t progressed very far yet.
Any AIs that we produce are very simple (no matter how complex they may appear to be) when compared to the existing biological systems. An active, intelligent AI can make for a good science fiction story, but the reality of creating one is very different.
One simple area that is rarely considered is how such a construct would operate on faulty infrastructure. Biological systems handle failures of components in ways we do not understand (as a complete picture) and can route around damage in the infrastructure. We design our silicon-based computational systems to simply be thrown away when damage occurs. This is an open problem.
May God’s grace cover all today.
It seems you’re getting famous at RationalWiki, Dr. Jay! ^^
https://rationalwiki.org/wiki/Jay_Wile
Hehe. I guess I have arrived!
I guess, according to RationalWiki, no one is able to talk about evolution except people holding a doctorate in evolutionary biology from certain universities. I am still not sure what process they use to determine which universities make the cut.
A PhD in nuclear chemistry tells me more about what you can’t do than what you can do. You can’t read a biology textbook, put together a pre-fab bookcase, or operate your lawnmower, because you don’t have a biology degree, are not a licensed contractor, and don’t hold a CDL.
In their attempt to joke about and ridicule people, they have made themselves look idiotic.
Charles,
What I find interesting is that their description of Jay is a not-so-subtle ad hominem attack on him. This alone diminishes anything they have to say. At various times, I have followed links to this site and found the material less than convincing. In many of the cases that I looked at, it appeared to come from the “trust us, we know what we are saying” position instead of presenting the evidence required.
Dr. James Tour has an open invitation to anyone (including evolutionary biologists) worldwide to debate him on the problems in evolutionary biology. He will argue from the organic chemistry side of the problem. No one has yet taken up his challenge. As he is ranked among the top 10 organic chemists worldwide, are we then able to conclude anything from the non-acceptance of his challenge?
I found his discussion of pre-biotic conditions to be marvelous. There were aspects I couldn’t follow, as I haven’t studied organic chemistry for 40 years, but I could follow most of what he was saying.
May blessings be upon all this beautiful day.
Yes, I fully agree with your assessment of that site. You could not have said it any better.
I have heard of several debaters with open invitations to atheists and the like, which no one is taking up. Their position seems to be the biggest cop-out I’ve ever seen. I believe Richard Dawkins started the whole “I’m so great that I am not going to share a stage with you; otherwise, I lend credibility to your position” routine.
I think it is exactly as Dr. Wile says: “You can pick to go first or last because you have the weaker position.” They know this, so they just choose not to go at all, and then they make sites like RationalWiki and basically cyberbully.
As a former microprocessor circuit designer and now long-time software developer, I totally agree: our technology doesn’t even compare to God’s technology.
This article focuses on comparing computing hardware with biological nervous systems, but comparing software would show God’s work comes out even further ahead of ours.
Looking at biological systems, there isn’t such a clear boundary between hardware and software as there is in human technology. But it is clear that there is data storage and processing going on. Our memories of sights, smells, sounds, and thoughts since childhood indicate huge amounts of data are being stored. Also, I have read that for people who are blind, the parts of their brains that are normally used for vision processing are re-purposed for better auditory processing and spatial awareness. Human-designed systems just don’t have this kind of successful flexibility. Systems that we do attempt to design to be flexible end up being overly complex, cumbersome, and unreliable. The best we can do is stick to one task and hope we do somewhat well at it.
Our human-designed systems just can’t stand up in the face of software bugs that cause them to fail. We humans have to intervene, fix the bugs, and prop them up again constantly. Often bug fixes introduce more bugs, and the process must be repeated. If biological software has “bugs” in it (something that has never been demonstrated unambiguously, as far as I’m aware), they seem to fix themselves, such that the overall system keeps going for thousands of years. In contrast, no human data center can operate very long without patches, reboots, and periodic complete overhauls. All with constant intervention and monitoring by intelligent agents, I might add.
Oh yes, and let’s not forget that these biological information processing systems reproduce their hardware without multi-billion dollar corporations and large centralized fabrication plants. As for our software, it takes thousands of hard-working programmers to come up with it (bugs and all), but somehow biological software wrote itself by accident, without any bugs. Hmm.
Everyone knows that each and every one of these inferior human-designed systems has a creator with a specific purpose in mind. But evolutionists think that the infinitely superior biological systems have no purpose and no creator. Strange concept, with no evidence or logic behind it.
The study of nature should make every software developer more humble.