The implications for AI are a stretch.
This being the case, the importance of the whole thing is a stretch.
It depends on the time scales you are thinking of.
Even on a shorter time scale, something like the EU deciding to fund a singularity-like institute - or not fund it - or fiddle around with funding regulations - would have a large effect that's hard to predict.
It seems to be big and powerful, but the implications are opaque. What similar treaties have existed? Can we take an "external view"?
The original US constitution, the Swiss confederacy maybe?
But the interesting fact here is that the EU represents such a large share of the world's economy and power that the number of poles in the world is dramatically reduced. It also provides a template for the creation of more large blocs throughout the world.
US constitution
Egads, Euros, shouldn't you have learned from our mistake! Thankfully, it's not an actual constitution and you may be able to get out of it.
Not being European, I probably haven't been paying as close attention to this as I should. What does the Treaty do besides adding an extra crunchy bureaucratic layer on top of the European "Community"?
Mainly it centralises power. Reduces the number of veto points. Makes it significantly easier for the EU to speak as one voice.
If neither is the case... why not?
It's not going to affect me for a while, and I couldn't have affected it had I wanted to. I find it easy and natural to ignore it on an emotional level.
Indeed. All the effects that guarantee emotional disconnect are there (that's why I chose that example). The treaty itself is long and tedious; the process of ratification was drawn out and unexciting; the consequences are somewhat hazy, and very much in the future.
Yet it's the sort of thing we should be getting emotional about. And we should also be shutting up and multiplying with our emotions; the chances of us affecting the treaty are very low, but multiplied by the consequences, our expected change could be quite high.
You're talking about affecting it before it had passed, right? I understand there's not much to do about it now that it's ratified.
In general, there are just too many good causes, worrisome dangers and all-around Important Things for me to actively worry about more than a few of them. Just learning about all the things whose potential for good or ill is comparable to that of the Lisbon Treaty would probably take up all my time.
I'm going with 'eh'. I don't see the Lisbon treaty adding much to the effective power of the EU bureaucracy, and it may have weakened the EU by exposing fault-lines and offering something formal to attack.
(A specific clear law, or even a giant treaty as long as it goes by one easily-remembered name, is much easier to attack than a thousand creeping fees & regulations & legal precedents & partnerships & shifting mores.)
Though if it did help the EU, I'd probably be on the nay side. Some close cooperation is fine. Too much can be toxic (do you really want a large country overseeing FAI research? remember Gresham's Law.) Multiple competing states, even if they are frequently going to war, are good for the arts & sciences.
(This is a general statistical correlation from a few studies like Human Accomplishment, but examples are easy to come by: the part of Greek history anyone cares about; the Renaissance; the Warring States period in China; the Industrial Revolution.)
Multiple competing states, even if they are frequently going to war, are good for the arts & sciences.
Not so good for the people in there. How do you figure that the benefit to scientific research and artistic legacy outweighs the human cost of frequent war?
If you think the science will allow the human race to fend off extinction for longer than if we had stayed at a pre-industrial level, it might be justified. It is not a pleasant justification to make.
The idea of science fending off extinction requires some serious justification. At pre-industrial technology levels, none of today's biggest extinction threats would exist (high-tech war, singularity-class technology, climate change, etc.).
Think of longer-term threats such as nearby supernovae, large meteors, etc. I rate meteors as a bigger extinction threat than climate change; even runaway climate change is unlikely to make us extinct.
Think of science as a gamble that can pay off big, if we manage to get off this rock, but might just backfire.
All of this implies you assign significant utility to the indefinite survival of humanity regardless of your personal survival, the survival of any particular persons you know, or your personal legacy and influence on that future.
I assign little utility to this. For instance I'd choose a 50% chance of extinction of humanity, with guaranteed survival of myself and friends in the event humanity survives; over a 20% chance of extinction, with another 50% chance of my death even if humanity survives (which sums to a total 60% chance of my death).
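The totals in that comparison can be sanity-checked in a few lines (assuming the extra 50% personal risk in the second gamble is independent of the extinction risk):

```python
# Gamble A: 50% chance humanity goes extinct; I'm guaranteed to
# survive in the branch where humanity survives.
p_ext_a = 0.50
p_my_death_a = p_ext_a  # I die only if humanity does: 0.50

# Gamble B: 20% extinction risk, plus a 50% chance of my death
# even in the branch where humanity survives (assumed independent).
p_ext_b = 0.20
p_my_death_b = p_ext_b + (1 - p_ext_b) * 0.50  # 0.20 + 0.40 = 0.60

print(p_my_death_a, p_my_death_b)
```

So preferring the first gamble means accepting a 50% rather than 60% chance of personal death, at the cost of a 50% rather than 20% chance of human extinction.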
Do you have different preferences here and does that relate to our differences on the war question?
I assign a higher value to the survival of humanity than to my own personal survival/survival of my legacy. So yes, we have a value disconnect.
If you value personal survival you should also value science fairly highly, as the longevity research program is a product of that as well, not to mention the basic useful health care.
Anyway we are getting fairly far off-topic. If you want to follow this further we should probably go to the open thread.
There are species we probably wouldn't have been able to make extinct without high tech. But we've almost never needed to deliberately make another species extinct at all, so we weren't trying all that hard.
For starters, the casualties are sunk costs. And the long-term gains are plausibly far greater. With several billion people alive at any given moment directly thanks to things like the Industrial Revolution, that vastly outweighs the endless little European wars that encouraged it. (How many died in the Thirty Years' War? 3-11 million? Global population growth adds that many in a few weeks.)
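The scale comparison is easy to check, taking as an assumption a recent net global population growth of roughly 80 million people per year:

```python
# Rough scale check (assumptions: Thirty Years' War death toll of
# 3-11 million; net global population growth of ~80 million/year).
growth_per_day = 80_000_000 / 365  # roughly 219,000 net new people/day

days_to_replace = [round(deaths / growth_per_day)
                   for deaths in (3_000_000, 11_000_000)]
print(days_to_replace)  # roughly two weeks to two months
```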
The problem with arguing that way is that for you, living now, the wars were a good thing insofar as they enabled your higher tech level and quality of life today. But if you lived in Europe during the Thirty Years' War, you would have certainly preferred to have peace even if it meant no Industrial Revolution later on.
For broadly similar reasons, wars now which might involve you are bad for you, even if they're good for your future self under the assumption you survive the war.
You seem to be shifting gears; I've explained exactly how the gains outweigh the costs of the wars - what the few people at the time preferred is irrelevant except as a few pebbles on one side of the scale, and I don't know why you're belaboring the point. It's nothing new to say that the optimal long-term course of action may be suboptimal in the short-term.
I read your original comment to mean you also thought wars today were good (i.e. for us, who live today) because they advance science.
To be clearer then: Wars today are different from wars then. The positive effects of competing states are clear enough from history, but lately the side-effects have started to reach into the unacceptable range. Ideally, Europe and Asia would be filled with actively jostling, competing states, so we get the benefits of whatever Renaissance or Industrial Revolution or Hundred Schools of Thought would happen in our era, but with enough of an international structure to prevent actual military operations (and particularly use of nukes or worse); the EU seems to me to have gone well beyond the salutary conflict-prevention point, and into stifling-Imperial-China territory.
I don't consider this to be vastly important, in the long run. The EU states don't have much force projection military power, or the will to use it. I don't think the Lisbon treaty portends much for AI, really. The nations within (and subsets of the whole) still have their own agendas.
Crucially, the EU still doesn't really have sovereignty over "EU territory". They won't have sovereignty until they have a coherent centralized military capable of controlling every EU nation-state. This is unlikely to happen in the next 50 years. If a member state wanted to leave, they could.
The EU beats the US, but NAFTA beats the EU, if I recall correctly. Of course, the EU is only rich because its constituents are rich; our new unhappy lords haven't contributed much to GDP since the free trade area.
The Lisbon treaty was finally ratified last Tuesday, in a most wonderfully disdainful signing ceremony.
I take it that everyone on the list is emotionally overwhelmed by this, one of the most important political events in recent history. The world's largest economy has taken a firm step towards statehood; the ramifications of this will be felt across the world. People will die who would have lived; people will live who would have died: the body count is much affected. The potential implications for AI alone (think political singleton, research funding priorities) are huge. Depending on your opinion of the consequences, you are probably dumped into a dark ditch of despair or swimming in a limitless ocean of triumphant glee.
If neither is the case... why not?