TimS comments on Can the Chain Still Hold You? - Less Wrong
I don't think I understand.
A new technology is useful if it serves a specific purpose for human manipulation of territory. The more unknown the technology, the more dangerous it is to human survival, and thus it can no longer be seen as progressive. Furthermore, the introduction of new technology reshapes the social topography of a territory. If erosion/alteration of social topography happens at too fast a rate, it becomes impossible to navigate based on the experiences of others. Just as if all the currents and depths of a channel suddenly changed, the built-up knowledge of generations of fishers would become irrelevant.
Whether technological/scientific advancement is progress or just impact depends on these two factors:
1.) The degree of unknowns involved with the technology
2.) The extent to which social topography is eroded/altered
If we look at cell phones and other information technologies, they have completely reconstructed the social topography of the world, and they continue to develop at an astonishing rate. As to the degree of unknowns, cell phones have already been completely integrated into everyday life, despite their relatively short lifespan. What happens when a person lives 70 years with a cell phone in their pocket, or an iPad? We have no idea, because they have not been around long enough to have any cases. There is still a huge degree of unknowns with these new technologies, yet we are already completely dependent on them.
I am not saying that this is not progress; it is not possible to say at this point. But I will say that we are walking a fine line between true progress and unrestrained impact.
Here is a genuine disagreement between us.
I don't think increasing our ability to control the world is an inherently good or bad thing (somewhat like how concepts like equality don't have a particular political affiliation). The Spaniards did terrible things to the natives of the New World, but the proximate cause of their behavior was their extreme aversion to Otherness (like Orientalism, but worse). Spain's technological superiority made their oppressive behavior possible, but it is insufficient to explain what happened.
To your specific point about cell phones, the data are pretty clear that they are fairly safe. We have a good understanding of what radiation of various kinds can and can't do. And social topography has nothing to do with this risk.
I don't think he means the biological effects of radiation, but the psychological/sociological effects of always being available for conversation. (Being unable to talk to me for one freakin' day would bother the living crap out of my mother, for example. I'm not sure that's a healthy thing.)
I didn't thumbs down you, just saying.
I agree that our ability to control the world is not inherently good or bad. What I am saying is that the rate at which we use this ability can be beneficial or harmful. In my mind it is analogous to a person running through a forest to win a race. There is no path, but they have a pretty good idea of the general direction they want to go. The faster they run, the quicker they close the distance between themselves and their objective; but at the same time, if they run too fast they risk stumbling into a pitfall, shooting off a sudden drop, tripping, or building up too much momentum on a downhill run. All these things are potentially dangerous. Cell phones causing cancer was the wrong point to focus on. But it cannot be denied that cell phones in general have changed the structure of society at an alarming pace. Again, I am not saying this is inherently good or bad. It could be that our barreling through the forest brings us to our destination in the least possible time. I guess I am just a somewhat pessimistic person. I think that rather than getting there faster, it would be better to minimize any chance of tragedy.
I think these two sentences are in quite a bit of tension. The speed at which we get better at controlling the world can best be judged by whether we should be trying to control the world at all.
I deny. Cell phones have changed the structure of society at a very high pace. Alarming? That's a value judgment that needs a fair amount of justification. Even assuming that it isn't possible to live "how things used to be" because of widespread expectations of cell phone usage (and I'm not sure this is true), why is this worse?
I don't think there is a tension. It is kind of like coffee: I do not think coffee is inherently good or bad. It is the rate of use that defines it as good or bad to me. Drinking 10 cups a day (a very high rate of use) I find to be bad for you, whereas a cup of coffee a day (a slower rate of use) is good for you. I think the same principle is true for technology. Developing too fast, without regard for the societal impact or potential dangers of what you are creating, is negative in my opinion.
I don't really understand this sentence; could you explain it more? What I get from reading it is: "if it does not seem feasible it should be abandoned?"
Mobile phones have changed social interaction, how people think (through texting), and the structure of business and economics; they have become a status symbol. Do I need to keep going?
Coffee isn't such a good analogy. That's got a certain finite set of effects on a well-known neurotransmitter system, and while not all of the secondary or more subtle effects are known we can take a pretty good stab at describing what levels are likely to be harmful given a certain set of parameters. Social change and technology don't have a well-defined set of effects at all: they're not definitive terms, they're descriptive terms encompassing any deltas in our culture or technical capabilities respectively.
Speaking of technology as if it's a thing with agency is obviously improper; I doubt we'd disagree on that point. But I'd actually go farther than that and say that speaking of technology as a well-defined force (and thus something with a direction that we can talk about precisely, or can or should be retarded or encouraged as a whole) isn't much better. It may or may not be reasonable to accept a precautionary principle with regard to particular technologies; there's a decent consensus here that we should adopt one for AGI, for example. But lumping all technology into a single category for that purpose is terribly overgeneral at best, and very likely actively destructive when you consider opportunity costs.
When I talk about technology, what I am really talking about is a rate of technological innovation. Technological innovation is inevitably going to change the dynamics of a society in some way. The slower that change, the more predictable and manageable it is. If that change continues to accelerate, it will eventually reach a point where it moves beyond the limitations of existing tracking technology. At that point, it becomes purely a force. That force could result in positive impacts, but it could also result in negative ones. However, to determine or manage whether it is positive or negative is impossible for us, since it moves beyond our capacity to track. Do you disagree with this idea?
This is essentially a restatement of the accelerating change model of a technological singularity. I suspect that most of that model's weak predictions kicked in several decades ago: aside from some very coarse-grained models along the lines of Moore's Law, I don't think we've been capable of making accurate predictions about the decade-scale future since at least the 1970s and arguably well before. If we can expect technological change to continue to accelerate (a proposition dependent on the drivers of technological change, and which I consider likely but not certain), we can expect effective planning horizons in contexts dependent on tech in general to shrink proportionally. (The accelerating change model also offers some stronger predictions, but I'm skeptical of most of them for various reasons, mainly having to do with the misleading definitivism I allude to in the grandparent.)
Very well; the next obvious question is: should this worry me? To which I'd answer yes, a little, but not as much as the status quo should. With the arguable exception of weapons, the first-order effects of any new technology are generally positive. It's second-order effects that worry people; in historical perspective, though, the second-order downsides of typical innovations don't appear to have outweighed their first-order benefits. (They're often more famous, but that's just availability bias.) I don't see any obvious reason why this would change under a regime of accelerating innovation; shrinking planning horizons are arguably worrisome given that they provide incentive to ignore long-term downsides, but there are ways around this. If I'm right, broad regulation aimed at slowing overall innovation rates is bound to prevent more beneficial changes than harmful ones; it's also game-theoretically unstable, as faster-innovating regions gain an advantage over slower-innovating ones.
And the status quo? Well, as environmentalists are fond of pointing out, industrial society is inherently unsustainable. Unfortunately, the solutions they tend to propose are unlikely to be workable in the long run for the same game-theoretic reasons I outline above. Transformative technologies usually don't have that problem.
I was not familiar with the theory of technological singularity, but from reading your link I feel that there is a big difference between it and what I am saying. Namely, it states that "Technological change follows smooth curves, typically exponential. Therefore we can predict with fair precision when new technologies will arrive...", whereas I am saying that such prediction is impossible beyond a certain point. I would agree with you that we have already passed that point (perhaps in the 70s).
This I disagree with. If you continue reading my discussion with TimS, you will see that I suggest (well, Jean Baudrillard suggests) a shift in technological production from purely economic and function-based production to symbolic and sign-based production. There are technologies whose first-order effects are generally positive, but I would argue that there are many novel technological innovations that provide no new functional benefit. At best, they work to superimpose symbolic or semiotic value upon existing functional properties; at worst, they create dysfunctional tools that are masked with illusory social benefits. I agree that these second-order effects, as you call them, are slower acting, but that is not an argument to ignore them, especially since, as you say, they have been building up since the 70s.
I agree that the status quo is a problem, but I do not see it as more of a problem than the subtle amassment of second-order technological problems. I think both are serious dangers to our society that need to be addressed as soon as possible. The former is an open wound, the latter a tumor. Treating the wound is necessary, but if one does not deal with the latter as early as possible, it will grow beyond the point of remedy.
Really nice post. I apologize about my analogy. Truthfully I picked it not for its accuracy, but its ability to make my point. After recently reading Eliezer's essay about sneaking connotations I am afraid it is a bad habit I have. I completely agree it is a bad analogy.
As to your second point: it is a really interesting question that honestly I have never thought about. If you don't mind, I would like a little more time to think about it. I agree that it is improper to speak of technology as a thing with agency, but I am not sure I agree that speaking of technology as a well-defined force is just as bad.
My point is that the factors that are relevant to deciding how fast to research new technology are the same factors that are relevant in deciding whether to use technology at all.
The word I was disputing in your prior post was alarming. Cell phones have caused and are causing massive social change.
What do you see as the primary factors determining how fast to research new technology? Ideally, technology would be driven by necessity or efficiency, but that is an ideal. In my opinion, the driving factor for new technologies is profit. For example, my uncle installs home entertainment systems for the rich. He tells me that he gets sent dozens of new types of wire, new routers, and new systems for free that some engineer is hoping to make it big off of. Looking at the development of new audio/video media, drugs, and TVs, honestly I feel like in most fields there is a constant push for innovation for the sake of entrepreneurship alone, and I don't think that is relevant to the actual use of the technology.
P.S. When I say technology, I am using it as an extremely broad term for any tool used to manipulate the physical world.
I'm not saying you are wrong (although I don't agree with the normative implications), but what is the difference between efficiency and profit?
Efficiency has to do with the use of the tool being created: an efficient ax is sharp and will not break easily. Profit has to do with the producer maximizing their revenue and minimizing their costs.