
I have to admit that I greatly enjoyed this topic because it introduced me to new concepts. When I clicked on this discussion I hadn't a clue what Neo-Reactionaries were. I knew what a political reactionary is, but I knew nothing about this particular movement.

The thing I have found fascinating is that the fundamental concept of the movement (and please correct me if I am wrong) is that they want a way out. The current system is horribly flawed and eventually doomed, and they want to strike a new deal that would fix things once and for all. The recognition is that even if governments were abolished, they would form again. As such they hope to devise a government that is no longer a sham, one that structurally has the best interest of the people at its heart instead of selfishness.

What fascinates me about this is some of the discussions about AGI here. Plenty of people apparently feel that eventually AGI will rule over us. They are essentially interested in building "a better tyrant." I don't know, give me a thumbs down on this comment if you want, but I found the parallel interesting. Of course many ideologies are more alike than people care to admit. For example, communism is supposed to be economic and social power sharing, ensuring at the very least that everyone's material needs are met. Capitalism and the corporate structure actually aim for the same thing.

The problem is that you cannot be absolutely certain that someone will in fact fail. You can say that the likelihood of them amounting to anything other than "normal" or "average" is frighteningly small, but that is not quite the same as an absolute fact that they will never succeed, nor does any of it mean that the effort to reach their goal wouldn't make them happy on some level even if they never succeed. The effort to reach that goal can also be very socially and economically productive.

I think the better advice is "Dream of victory, but prepare for defeat." The idea is that if they are truly passionate about something they should push towards it, but prepare themselves to fail again and again. That means they shouldn't abandon all family and stable work for those goals, but instead maintain them in preparation for the likely event that they fail in each attempt. This matters because no one goes through life without taking a blow, so to speak. Everyone takes their own share of lumps, and preparing for that, instead of living in a fantasy world in which nothing can go wrong, is important.

I suppose it's a fundamental disagreement of basic philosophy here. You are arguing the Buddhist and Epicurean thought that "Unhappiness is caused by unnecessary desire." My observation and platform are based on the idea that "True depression is stillness born from a lack of worthwhile purpose and objectives in life." It's the recognition that some people at least (such as myself) need fantastic goals and an overriding purpose in life to be happy, even if the chance of success is quite low.

The problem is: what is "correct thinking"? Is "correct" telling people to never try? Is "correct" always sticking to safe, sure bets? Is "correct" giving up on something because the challenge will be great and the odds long? What kind of world would we live in if everyone took that mentality? I would argue that ambition is powerful; it shapes this world and builds monumental things. It's irrational to expect people to be completely rational; that can only result in depression, stagnation and death. This all reminds me of a story with an important message.

Long ago in the Arizona desert, there was a scout for the garrison at Fort Huachuca. In his spare time he ranged off into the desert searching for veins of silver or gold. The land was wild back then and very dangerous. When a friend learned what he was doing, the friend told him, "The only rock you will find out there is your own tombstone." Undeterred, he continued searching, with only thirty cents in his pockets, and eventually found a vein of silver. He staked the claim, got the ore appraised, and founded the settlement "Tombstone" in honor of what everyone told him he would find. Edward Lawrence Schieffelin became a millionaire due to his discovery in the late 1870s.

Were the odds actually in favor of Ed finding only his grave in that desert? Yes. Did most people setting out west fail in whatever ambitions they had? Yes. Does that mean it was not worth trying? No, it does not. The fact is that the vast majority of those setting out west to settle the land would fail utterly in some respect. That doesn't mean they should have stayed home and never dreamed at all. Where would we be if everyone had said, "You know what? I think I'll just play it safe."?

Statement retracted: I should sit and think on this a bit more just to be sure I am posting the correct response.

[This comment is no longer endorsed by its author]

Human alteration certainly won't magically improve human beings' mental capabilities all on its own. That's why I added the qualifier that education "is and will be the primary means of improving the human mind."

I was pointing out that when faced with an artificial intelligence that can continually upgrade itself, the only way the human mind can compete is to upgrade as well. At some point current human physical limitations will be too limiting, and human beings will fall by the wayside of uselessness in the face of artificial intelligence.

A weapon is no more than a mere tool. It is a thing that, when controlled and used properly, magnifies the force the user is capable of. Due to this relationship I point out that an AGI that is subservient to man is not a weapon, for a weapon is a tool with which to do violence, that is, to apply physical force upon another. Instead an AGI is a tool that can be transformed into many types of tools. One possible tool it can be transformed into is in fact a weapon, but as I have pointed out, that does not mean the AGI will always be a weapon.

Power is neither good nor ill. Uncontrolled, uncontested power, however, is dangerous. Would you start a fire without anything to contain it? For sentient beings we possess social structures, laws and reprisals to manage and regulate the behavior of that powerful force which is man. If even man is managed and controlled in the most intelligent manner we can muster, then why would an AGI be free of any such restraint? If sentient being A, due to its power, cannot be trusted to operate without rules, then how can we trust sentient being B, who is much more powerful, to operate without any constraints? It's a logic hole.

A gulf of power is every bit as dangerous. When power between two groups is too disparate, there is a situation of potentially dangerous instability. As such it's important for mankind to seek to improve itself so as to shrink this gap in power. Controlling an AGI and using it as a tool to improve man is one potential option for shrinking this potential gulf in power.

My nightmare was a concept of how things would rationally be likely to happen, not how they ideally would happen. I had envisioned an AGI that was subservient to us and was everything that mankind hopes for. However, I also took into account human sentiment, which would not tolerate the AGI simply taking nuclear weapons away, or really the AGI forcing us to do anything.

As soon as the AGI made any visible move to command and control people, the population of the world would scream out about the AGI trying to "enslave" humanity. Efforts to destroy the machine would begin almost instantly.

Human sentiment and politics need always be taken into account.

Education can certainly give someone access to a platform to stand upon. I was unconcerned because even if you spend thirty years educating someone, they are still limited by their own intelligence when it comes to discovery, creativity, and decision making.

Spending time studying philosophy has greatly improved my ability to understand logic structures and has helped me make better decisions. However, there are still limits set upon me by my own biological design. More than that, I am limited in how much education I can receive and still be able to work off the debt in a single lifetime. Even with state-funded education it's an investment: the student must generate more value in a lifetime than the cost of the education for the education to be worth it.

The pace of education is limited by a great number of variables, including the student's IQ. Therefore we cannot simply solve that problem by trying to educate at a faster pace. The other solution is a form of transhumanism, that is, altering my body so that I may live longer in order to be worth the cost of a longer education. However, postulating about such a long and substantial education ignores whether there is a point at which education has no further effect and the only remaining option is actual hands-on experience in life.

We can logically see that we cannot magically educate every problem away. Education is and will be the primary means of improving the human mind. However, if we need to improve our natural limitations on how quickly we can learn and so forth, physical alteration of the human body may be necessary.

The existence of a super intelligent AGI would not somehow magic the knowledge of nuclear ordnance out of existence, nor would that AGI magically make the massive stockpiles of currently existing ordnance disappear. Getting governments to destroy those stockpiles is, for the foreseeable future, a political impossibility. The existence of a grand AGI doesn't change the nature of humanity, nor does it change how politics works.

The same goes for the rich and the working classes: the existence of a super intelligent AGI does not mean that the world will magically transform overnight into a communist paradise. Of course you do have a sound point if you state that once the AGI has reached a certain level, and its working machines are sophisticated and common enough, such a paradise becomes possible to create. That does not mean it would be politically expedient enough to actually form.

However, let's assume that a communist paradise is formed, and it is at this point that mankind realizes the AGI is doing everything and as such we have very little meaning in our own existence. If at this point we begin to go down the path of transhumanism with cybernetics, there would still be a period in which these technologies are quite rare and therefore rationed. What many don't realize is that in the end a communist system and a capitalist system behave similarly when there is a resource or production shortfall. The only difference is that in a capitalist system money determines who gets the limited resource, while in any communist system politics would determine who gets the limited resource.

So in the end, even in a world in which we laze about, money doesn't exist, and the AGI builds everything for us, new technologies that are still limited in number mean that there will be people who have more than others. More than that, I do not see people submitting to an AGI to determine who gets what. As such, the distribution of the product of the AGI's work would be born of a human political system, and clearly there would be people who game the system better, gaining far more resources than everyone else, just as some people are better at business in our modern capitalist world.

You actually hit the nail on the head in terms of understanding the AGI I was referencing.

I thought about problems such as: why would a firm researching crop engineering to solve world hunger bother paying a full and very expensive staff? Wouldn't an AGI that not only crunches the numbers but also manages mobile platforms for physical experimentation be more cost-effective? The AGI would be smarter and would run around the clock testing, postulating and experimenting. Researchers would quickly find themselves out of a job if the ideal AGI were born for this purpose.

Of course, if men took on artificial enhancements, their own cognitive abilities could improve to compete. They could even potentially network ideas digitally, or manage mobile robotic platforms with their minds as well. It seems, therefore, that the best solution to the potential labor competition problem with AGI is simply to use the AGI to help with, or outright conduct, research into methods of making men mentally and physically better.
