Dom Polsinelli

Honestly, I'm not sure. I read about the Biosphere 2 experiments a while ago, and they pretty much failed to make a self-sustaining colony with only a few people and far more mass than we could practically get into space. I really want us as a species to keep working on that, so we can solve any potential problems in parallel with our development of rockets and other launch systems. I could see a space-race-esque push getting us there in under a decade, but there currently isn't any political or commercial motivation to do that.

I don't know if it would necessarily need a military. I could easily be very wrong, but there's so much room in space and so much stuff on Earth that trying to conquer a habitat with a few thousand people on it seems a little unnecessary. Italy won't take over Vatican City, not because it can't, but because there really isn't a good reason to.

As for political freedom, that's the most speculative part of all, since I understand it less than the technology. My intuition is that colonists could be free simply because a self-sustaining colony doesn't need to produce any surplus a government would be interested in taxing. If you set up an asteroid-mining operation, I can see every government wanting a cut of the profits, but if all you wanted was to get away from an implicit surveillance state, that state would have to be truly dystopian to keep you from leaving. As long as you don't launch anything dangerous toward Earth, you aren't growing exponentially to the point where you might rival the power of a country, and you aren't engaging in incredibly lucrative trade, the only motivation left to govern you would be control for control's sake, and I guess I'm just optimistic enough to think that there will always be at least one high-tech place on Earth that isn't that dystopian.

I don't know about inevitable, but I imagine it is such an attractive option to governments that, if the technology gets there, it will be enacted before any laws preventing it are passed, if any ever are. I would also include a version of this where it is practically mandatory through incentives: greatly increased insurance costs, near inability to defend yourself in court or to cross borders without it, or it simply becoming the social norm to give up as much data about yourself as possible.

That said, I also think that if things go well, we will have good space technology allowing relatively small communities to live in self-sustaining habitats or colony ships, which would kind of break any meaningful surveillance.

This is a very off-the-cuff remark; I hadn't given this topic a great deal of thought before reading this post, so make of that what you will.

I have a vague sense that these two people live in my brain and are constantly arguing, and that the argument is fundamentally unproductive and actively harmful to whichever one should be dominant, if either.

I am very interested in mind uploading.

I want to do a PhD in a related field and comprehensively go through "Whole Brain Emulation: A Roadmap", taking notes on what has changed since it was published.

If anyone knows relevant papers or researchers that would be useful for that, or that would help me make an informed decision about where to apply to grad school next year, please let me know.

Maybe someone has already done a comprehensive update on brain emulation; if so, I would like to know about it, and I would still like to read more papers before I apply to grad school.

My interpretation of that was: whenever you're forming an opinion or having a discussion in which facts are relevant, make sure you actually know the statistics. An example is an argument (discussion?) my whole family had mid-COVID. Some people claimed that, generally, COVID was only as bad as the flu. Relevant statistics were readily available for things like mortality rate and total deaths, and some of the people making that claim were ignorant of them (off by orders of magnitude). With COVID it seems obvious, but for other things maybe not. Things people frequently have strong opinions about but rarely look up might include: the return on investment of college, the number of deaths due to firearms, the cost of alternative energy sources, and how much taxes will actually change under a given bill.
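The kind of quick sanity check I mean can be sketched in a few lines. The figures below are illustrative, rough US numbers (a typical flu season kills on the order of 40,000 people; COVID killed roughly 350,000 in the US in 2020 alone), not authoritative statistics:

```python
import math

# Illustrative, approximate US figures; real numbers vary by source and year.
flu_deaths_typical_season = 40_000   # rough order of a typical flu season
covid_deaths_2020 = 350_000          # rough US total for 2020

ratio = covid_deaths_2020 / flu_deaths_typical_season
ooms = math.log10(ratio)

print(f"COVID killed ~{ratio:.0f}x as many people that year "
      f"(~{ooms:.1f} orders of magnitude)")
```

Even with very rough inputs, the check settles the "only as bad as the flu" claim in under a minute, which is the whole point: look the number up before arguing about it.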

I agree with a lot of what you said, but I am generally skeptical of any emulation that does not work from a bottom-up simulation of neurons. We really don't know how and what causes consciousness, and I think it can't be ruled out that something with the same inputs and outputs at a high level misses something important that generates consciousness. I don't necessarily believe in p-zombies, but if they are possible, it seems they would be built by creating something that copies the high-level behavior but not the low-level functions.

Also, on a practical level, I don't know how you could verify that a high-level recreation is accurate. If my brain is scanned into a supercomputer that can run molecular dynamics on every molecule in my brain, then I think there is very little doubt that it is an accurate reflection of my brain. It might not be me in the sense that I wouldn't have any continuity of consciousness, but it is me in the sense that it would behave like me in every circumstance. Conversely, a high-level copy could miss some subtle behavior that is not accounted for in the abstraction used to form the model of the brain. If an LLM were trained on everything I ever said, it could imitate me for a good long while, but it wouldn't think the same way I do. A more complex model would be better, but not certain to be perfect. How could we ever be sure that our understanding was good enough that we didn't miss something subtle that emerges from the fundamentals?

Maybe I'm misinterpreting the post, but I don't see a huge benefit from reverse engineering the brain in terms of simulating it. Are you suggesting something other than an accurate simulation of each neuron and its connections? If you are, I think that method is liable to miss something important. If you are not, I think understanding each piece but not the whole is sufficient to emulate a brain.