What would our world be today if humans had started off with a purely rational intelligence?
It seems as though a dominant aspect of rationality deals with risk management. For example, an irrational person might feel that the thrill of riding a zip line for a few seconds is well worth the risk of injuring themselves, contracting a flesh-eating bug, and losing a leg along with both hands (sorry, but that story has been freaking me out the past few days; I in no way mean to trivialize the woman’s situation). A purely rational person would (I’m making an assumption here because I am certainly not a rational person) recognize the high probability of something going wrong and determine that the risks were too steep when compared with the minimal gain of a short-lived thrill.
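To put that weighing in concrete terms (with numbers I am inventing purely for illustration, not taken from anywhere), the comparison is just an expected-utility calculation:

$$EU(\text{ride}) = p_{\text{fine}} \cdot U(\text{thrill}) + p_{\text{disaster}} \cdot U(\text{disaster})$$

If the thrill is worth $U(\text{thrill}) = 1$ and the disaster is worth $U(\text{disaster}) = -10{,}000$, then even a tiny $p_{\text{disaster}} = 0.001$ gives $EU(\text{ride}) \approx 0.999 \cdot 1 + 0.001 \cdot (-10{,}000) \approx -9$, and the rational agent stays on the ground. The whole question turns on how lopsided the stakes are relative to the odds.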
But how does a purely rational intelligence—even an intelligence at the current human level with a limited ability to analyze probabilities—impact the advancement of technology? As an example, would humanity have moved forward with the combustion engine and motor vehicles as purely rational beings? History shows us that humans tend to leap headlong into technological advancements with very little thought about the potential damage they may cause. Every technological advancement of note has had negative impacts whose probability-weighted costs might have been judged too steep from a purely rational perspective.
Would pure rationality have severely limited the advancement of technology?
Taken further, would a purely rational intelligence far beyond human levels be so burdened by risk probabilities as to render it paralyzed… suspended in a state of infinite stagnation? OR, would a purely rational mind simply ensure that more cautious advancement take place (which would certainly have slowed things down)?
Many of humanity’s great success stories begin as highly irrational ventures that had extremely low chances for positive results. Humans, being irrational and not all that intelligent, are very capable of ignoring risk or simply not recognizing the level of risk inherent in any given situation. But to what extent would a purely rational approach limit a being’s willingness to take action?
*I apologize if these questions have already been asked and/or discussed at length. I did do some searches but did not find anything that seemed specifically related to this line of thought.*
"A purely rational person would be nigh omniscient"
Even at current human intelligence levels? I don't see how pure rationality without the ability to crunch massive amounts of data extremely fast would make someone omniscient, but I may be missing something.
"If a combustible engine does more good than bad (which it does)"
Of course, I'm playing devil's advocate with this post a bit, but I do have some uncertainty about... well, your certainty about this :)
What if a purely rational mind decides that while there is a high probability that the combustion engine would bring about more "good" than "bad", the probable risks compel it to reject production in favor of first improving the technology into something with a better reward/risk ratio? A purely rational mind would certainly recognize that, over time, reliance on gasoline derived from oil would lead to shortages and potential global warfare. That is a rather high-probability risk. Perhaps a purely rational mind would opt to continue development until a more sustainable technology could be mass-produced, greatly reducing the potential for war/pollution/etc. Keep in mind, we have yet to see the final aftermath of our reliance on the combustion engine.
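One way to make that trade-off explicit (a toy framing of my own, with made-up symbols rather than anything from the discussion) is to compare deploying now against waiting for a refined version:

$$EU(\text{now}) = p_g U_g + p_b U_b \qquad\text{vs.}\qquad EU(\text{wait}) = d \cdot \left( p'_g U'_g + p'_b U'_b \right)$$

where $d < 1$ discounts the delayed benefit and the primed terms reflect the improved reward/risk ratio. Waiting is the rational choice whenever the improvement outweighs the discount, so a purely rational mind need not reject the technology outright, only re-time it.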
"The lives lost due to technological advancements have been dwarfed by the lives saved."
How does a purely rational mind feel about the inevitable overpopulation issue that will occur if more and more lives are saved and/or extended by technology? How many people lead very low-quality lives today due to overpopulation? Would a purely rational mind make decisions to limit population rather than help it explode?
Does a purely rational mind value life less or more? Are humans MORE expendable to a purely rational mind so long as it is 51% beneficial, or is there a rational reason to value each individual life more passionately?
I feel that we tend to associate pure rationality with a rather sci-fi notion of robotic intelligence. In other words, pure rationality is cold and mathematical and would consider compassion a weakness. While this may be true, a purely rational mind may have reasons other than compassion to value individual life MORE rather than less, even when measured against a potential benefit.
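To see why rationality alone doesn't settle this (a toy example I'm constructing, not something from the thread): suppose an action saves 51 people and costs 49. An agent whose utility function simply counts heads gets

$$U_1 = 51 - 49 = +2$$

and acts; an agent whose utility function penalizes each life it actively sacrifices a hundredfold gets

$$U_2 = 51 - 100 \cdot 49 = -4849$$

and refuses. Both are flawlessly rational maximizers of their respective functions. The probabilities are identical; only the valuations differ, which is exactly the point made below about the utility function not being up for grabs.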
The questions seem straightforward at first, but is it possible that we lean toward the easy answers that may or may not be highly influenced by very irrational cultural assumptions?
Overpopulation isn't caused by technology. It's caused by having too many kids, and not using resources well enough. Technology has drastically increased our efficiency with resources, allowing us to easily grow enough to feed everyone.
The utility function is not up for grabs. Specifying that a mind is rational does not specify how much it values anything in particular.