Let's Discuss Aging



We are not there right now, but we will be 40-50 years from now. People far more qualified than either of us think it will happen in even less time (and although you may disagree, the technological singularity isn't some crazy theory with no chance of happening).

It's a cool theory, but it's also kind of crazy. There's a chance of almost anything happening; rationality comes into play when you think about how realistic that chance is.

That doesn't mean that such intuition will never be replicated. As I said before, whether a technological singularity can occur depends largely on our ability to build such a thing.

Why do you think computers (binary machines) can accurately simulate intuition? To me, it's natural that advances in computing power lead to "smarter" computers in computational terms, but not in terms of human-like processing.

Trust me, I love ideas about supercomputers generating consciousness and the idea that we might all be simulated consciousnesses... but it's a crazy idea at best. It's just not realistic.

I already defined technological singularity for you (I'll rephrase it for you: When an affordable computer is able to mimic most intelligent human behavior). I am not talking about any other type of singularity.

Well, the singularity has nothing to do with the price of computers, but a singularity is a singularity whether technological or not. It's a concept similar to infinity.

Technological singularity is not just "affordable computers that think like humans"; it's a theoretical point at which the outcome of technology becomes incomprehensible and unpredictable to the human brain. Again, this is all unproven theory. And again, I hope for it, but I certainly don't expect it.
 
Why do you think computers (binary machines) can accurately simulate intuition? To me, it's natural that advances in computing power lead to "smarter" computers in computational terms, but not in terms of human-like processing.

Trust me, I love ideas about supercomputers generating consciousness and the idea that we might all be simulated consciousnesses... but it's a crazy idea at best. It's just not realistic.

Because the human brain is nothing more than a fairly advanced computer.
 
Because the human brain is nothing more than a fairly advanced computer.

Quite a religious statement. For starters, I think "fairly advanced computer" is a bit of an understatement for the human brain. The processing power of the brain is at least equal to that of any current supercomputer, and that's not even taking into account the biological aspect, which leads into my second point.

Second, emotions and intuition can't be reduced to sheer processing power. Neurons are fundamentally different from tiny metal conductors. This drags you into a clusterfuck of a philosophical argument about the origin of morality, emotions, etc., all of which is unknown. The closest thing I've seen is when scientists built a DNA-based computer that could solve a few simple square-root problems, and it took the computer hours to do so.

So there's some prospect that in the future we'll be able to create more advanced biological computers, but I see no reason to believe that this growth will lead to a (theoretical) singularity, or that it will happen within our lifetime.
 
Quite a religious statement. For starters, I think "fairly advanced computer" is a bit of an understatement for the human brain. The processing power of the brain is at least equal to that of any current supercomputer, and that's not even taking into account the biological aspect, which leads into my second point.
The brain is estimated to be able to compute 10^15 calculations per second (a petaflop). Current supercomputers have speeds around 10^16 calculations per second (ten petaflops). Looking at past trends in the growth of computing and what new technologies are being worked on by Intel, it is estimated that we'll reach 10^15 calculations per second in a $1,000 computer by about 2025.
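That projection can be sanity-checked with some back-of-the-envelope math. The starting figure and the doubling time below are assumptions chosen for illustration, not measured values:

```python
# Rough projection: when does $1,000 of compute reach 1e15 FLOPS?
# Assumed (illustrative only): $1,000 buys ~1e13 FLOPS in 2012, and
# price-performance doubles roughly every two years.
import math

start_year = 2012
start_flops = 1e13       # assumed FLOPS per $1,000 in 2012
target_flops = 1e15      # rough estimate of the brain's throughput
doubling_years = 2.0     # assumed doubling time

doublings = math.log2(target_flops / start_flops)
year = start_year + doublings * doubling_years
print(round(year))       # → 2025 under these assumed numbers
```

Under these (debatable) assumptions the arithmetic lands right around the 2025 figure; a longer doubling time pushes the date out accordingly.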

This brings you into a clusterfuck philosophical argument about the origin of morality, emotions, etc; all of which is unknown.
These are not all unknown. They are all the result of evolution, and there is no way you could possibly argue otherwise. We consider something to be moral because, in our ancestral history, creatures who viewed that action as moral were more likely to survive and reproduce than creatures who didn't. The same is true for emotion. Unless you are a hardcore Christian who doesn't believe in evolution, the roots of morality and emotion are pretty universally understood.

So there's some prospect that in the future we'll be able to create more advanced biological computers, but I see no reason to believe that this growth will lead to a (theoretical) singularity, or that it will happen within our lifetime.
I'm not sure where you're getting the idea that you cannot simulate a biological neuron in silicon. The issue is simply that we do not have the processing power to simulate the massive parallelism with which neurons compute, nor have we completely reverse-engineered the brain yet. We will definitely have computers fast enough to mimic the brain within the next 20-25 years (see above), and I see no reason why silicon chips cannot mimic all of the processes that go on in the human brain.
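On the point that a biological neuron can be simulated in silicon: standard computational-neuroscience models do exactly that at a coarse level. Below is a minimal leaky integrate-and-fire neuron, a textbook abstraction (the parameter values are illustrative, and real neurons are far richer than this model):

```python
# Minimal leaky integrate-and-fire neuron (textbook abstraction).
# Euler-integrates the membrane voltage and records a spike whenever
# the voltage crosses threshold, then resets it.

def simulate_lif(current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0):
    """Return spike times (ms) for a list of input drive samples."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(current):
        # dv/dt = (-(v - v_rest) + i_in) / tau
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(step * dt)
            v = v_reset
    return spikes

# 100 ms of constant drive produces regular spiking.
spike_times = simulate_lif([20.0] * 1000)
print(len(spike_times))  # → 6
```

This obviously doesn't settle whether a full brain can be simulated; it only illustrates that "neuron" is not a magic category that silicon cannot model at all.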
 
The current worldwide life expectancy average is 67.2 (2010 data) and if you're living in a civilized country, the numbers obviously look even better (over 82 in Israel, Switzerland, Hong Kong and Japan).

The average worldwide life expectancy was around 31 in the early 20th century, so yeah, we're probably going to see improvements. But has the human who will end up living, say, 200+ years been born yet? Probably not.

Oh, and there are also events that could actually make the worldwide life expectancy numbers drop. You know... nuclear wars, unprecedented natural disasters and so on.
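For a sense of scale on the 200+ claim, here's a crude linear extrapolation from the two figures cited above (real gains have slowed as they shifted from reducing infant mortality to extending old age, so this is, if anything, optimistic):

```python
# Back-of-the-envelope: if life expectancy kept rising at its
# 20th-century linear rate, how long until the average hits 200?
# Crude extrapolation for illustration only.

le_1900, le_2010 = 31.0, 67.2                # figures cited above
rate = (le_2010 - le_1900) / (2010 - 1900)   # ~0.33 years of life per year
years_to_200 = (200 - le_2010) / rate
print(2010 + round(years_to_200))            # → 2414
```

Even at the full 20th-century pace, a 200-year worldwide average sits roughly four centuries away.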
 
The brain is estimated to be able to compute 10^15 calculations per second (a petaflop). Current supercomputers have speeds around 10^16 calculations per second (ten petaflops). Looking at past trends in the growth of computing and what new technologies are being worked on by Intel, it is estimated that we'll reach 10^15 calculations per second in a $1,000 computer by about 2025.

Do you have references for this? I'm just curious; I couldn't really find anything aside from vague articles. The problem, as you mentioned below, is that the biological aspect increases the overall parallel processing power, and to replicate that we first have to fully understand it. I know it's reddit, but there's an interesting discussion here:

If the human brain were a computer, what would its specs look like? : askscience

They are all the result of evolution, and there is no way you could possibly argue otherwise. We consider something to be moral because, in our ancestral history, creatures who viewed that action as moral were more likely to survive and reproduce than creatures who didn't. The same is true for emotion. Unless you are a hardcore Christian who doesn't believe in evolution, the roots of morality and emotion are pretty universally understood.

Fair enough. Going beyond this would make things religious, but to me it's always interesting to ask whether there is a fundamental consciousness that is "God"... basically Einstein's God. If that's true, deriving human consciousness may be trickier than it seems, and it may be impossible given our mental capacity. That raises the question: how can we create fully conscious computers without the capability to understand our own consciousness?

nor have we completely reverse-engineered the brain yet. We will definitely have computers fast enough to mimic the brain within the next 20-25 years (see above), and I see no reason why silicon chips cannot mimic all of the processes that go on in the human brain.

Because of that first sentence. While we're making progress every day on reverse-engineering the brain, we're not close to having definitive answers. I'm not saying we won't eventually find them, but I'm not confident enough to throw out numbers like 20-25 years when we don't know the ceiling. We don't fully understand what consciousness is, and we need to understand that before we can fully reverse-engineer the brain.

I would say it's likely to happen, but I can't put any timeframe on it. Science has always shown us that answers to questions typically raise more questions than they resolve. Just when we think we can't break something down any further, it gets broken into ten different bits that each get broken into their own microscopic bits. We can theorize about these things with ideas like string theory, but we're nowhere close to fully understanding them.