Hi, I'm helping organise the Stockholm LW meetup, but I need more karma to be able to post. Upboats plz.
Update on LW 2.0: user interviews are scheduled for this week, and work on the design, as well as some extra features, is underway. The broad plan is something like the following: user interviews / alpha testing to find the breaking UX bugs and get the design squared away; a closed beta to find more bugs and make sure the experience with multiple people doing stuff on the site is good / how we expect it to be; and then an open beta to give the broader community a chance to see it and find things for us to fix before it goes live at lesswrong.com. A core part of this process is making sure there's consensus that it's actually worth switching.
Some random barely-edited thoughts on my experience with weight loss:
I'm in the midst of a diet in which I will lose 15 lbs (15.9 lb, from 185.8 lb to 169.9 lb, to be exact) in 40 days.
I have 95% certainty I will reach this goal in the appointed time. Even if I don't reach exactly 169.9 lb, I'll be close, so whether or not I hit the exact number is immaterial for my purposes. (I'm losing some weight to see if it helps a lingering back injury.)
I'm just eating a disciplined diet and working out according to a consistent schedule.
My diet is simple and not starvation-y at all. Most people wouldn't do it because it's repetitive (I literally eat the same thing nearly every day, so I know my calorie intake without any counting).
My workout isn't hard, but most people wouldn't do it because... I don't know why; it's just my experience that people won't. It's 4-5 days per week of 30-60 minutes of cardio and 30-60 minutes of weight training. I have a back injury that's limiting me, so it's nothing terribly rigorous.
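As a back-of-the-envelope check on the numbers above (using the common ~3500 kcal-per-pound rule of thumb, which is only a rough approximation and not from the original comment):

```python
# Rough CICO arithmetic for the diet above, using the common (and only
# approximate) rule of thumb of ~3500 kcal per pound of body fat.
KCAL_PER_LB = 3500          # rough approximation, not a precise constant
loss_lb = 185.8 - 169.9     # = 15.9 lb
days = 40

daily_deficit = loss_lb * KCAL_PER_LB / days
print(round(daily_deficit))  # ~1391 kcal/day
```

That is a steep but achievable daily deficit for someone combining a fixed diet with 4-5 workouts a week.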
...
In my years at health clubs, talking to health-club-going people, I've seen all the evidence I'll ever need to believe, basically, the Calories In / Calories Out model of weight loss is corre...
Why does patternism [the position that you are only a pattern in physics and any continuations of it are you/you'd sign up for cryonics/you'd step into Parfit's teleporter/you've read the QM sequence]
not imply
subjective immortality? [you will see people dying, other people will see you die, but you will never experience it yourself]
(contingent on the universe being big enough for lots of continuations of you to exist physically)
I asked this on the official IRC, but only feep was kind enough to oblige (and had a unique argument that I don't think everyone ...
There's a free market idea that the market rewards those who provide value to society. I think I've found a simple counterexample.
Imagine a loaf of bread is worth 1 dollar to consumers. If you make 100 loaves and sell them for 99 cents each, you've provided 1 dollar of value to society, but made 99 dollars for yourself. If you make 100 loaves and give them away to those who can't afford it, you've provided 100 dollars of value to society, but made zero for yourself. Since the relationship is inverted, we see that the market doesn't reward those who provide...
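The arithmetic in the example above can be sketched as follows (the numbers are the comment's hypotheticals; prices are in cents to avoid floating-point noise, and production costs are ignored):

```python
# The loaf-of-bread counterexample, in cents. "Consumer surplus" is the
# value consumers get above what they pay; "revenue" is what the seller
# captures. Production costs are ignored for simplicity.
VALUE_CENTS = 100   # each loaf is worth $1 (100 cents) to its buyer
LOAVES = 100

def surplus_and_revenue(price_cents):
    """Return (consumer surplus, seller revenue), both in cents."""
    consumer_surplus = (VALUE_CENTS - price_cents) * LOAVES
    revenue = price_cents * LOAVES
    return consumer_surplus, revenue

print(surplus_and_revenue(99))  # (100, 9900): society nets $1, seller $99
print(surplus_and_revenue(0))   # (10000, 0): society nets $100, seller $0
```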
I often feel like upvotes on LW correspond more to the "insightfulness" of a post than to its perceived instrumental value. Unsure how I feel about this, because if I'm relying on upvotes as a social incentive to write things, this shapes what I write in directions that might not be directly useful (IMO) to most people.
Another week, another Open thread, another problem:
https://protokol2020.wordpress.com/2017/05/14/chesslike-problem/
Is there a good reason, that I am not seeing, why there isn't a society for AGI risk?
It would do various meta things around AGI risk, like:
Outreach to AI students to inform them and measure the spread of AI safety ideas
Coordinate with the research institutes to provide experts for the media/government
Provide opsec advice for researchers to keep their dangerous results hidden.
Is there some nice game-theoretic solution that deals with the 'free rider problem', in the sense of making everyone pay in proportion to their honest valuation? Like how Vickrey auctions reveal honest prices, or Sperner's lemma can help with envy-free rent division?
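For reference, here is a minimal sketch of the sealed-bid second-price (Vickrey) auction mentioned above: the highest bidder wins but pays only the second-highest bid, which is what makes bidding one's honest valuation a dominant strategy. The bidder names and valuations are made up for illustration.

```python
# Sealed-bid second-price (Vickrey) auction: the winner pays the
# second-highest bid, so truthful bidding is a dominant strategy.
def vickrey_outcome(bids):
    """bids: dict of bidder -> bid. Returns (winner, price_paid)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]   # winner pays the second-highest bid
    return winner, price

print(vickrey_outcome({"alice": 120, "bob": 100, "carol": 90}))
# ('alice', 100)
```

The generalization of this idea to public goods is the VCG mechanism, which elicits honest valuations but is, in general, not budget-balanced, so it doesn't fully solve the free-rider problem either.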
I've watched Stuart Russell's TED talk on AI risk, and my gut reaction to it was "do you want to be paperclips? this is how you become paperclips!". It goes completely against the grain of the view that was expressed on this blog as of a few years ago. But, then again, AI is hard, and there might be some recent developments that I have missed. What is the current state of the research? What do EY and his camarilla think about the state of the problem as of now?
I'm searching for a quote. It goes something like this:
"In nearly every contest there comes a point where one competitor has decided that they are going to lose. Sometimes it's near the end; sometimes it's right at the start. After that point, everything they do will be aimed at bringing that result to pass."
And then it continues in that vein for a bit. I don't remember the wording closely enough for Google to get me what I'm looking for, though. And I could swear I've seen it quoted here before. Does someone else remember the source?
The last big Windows update left my computer without a driver for the GPU. Hardware acceleration on websites like https://human.biodigital.com/index.html didn't work. Unfortunately, it took a few months to notice the specific problem. I installed the open-source tool Snappy Driver Installer and it fixed the issue. It also installed proper drivers for other hardware.
Does anybody here know any personally successful techniques or strategies for handling regret? I have some regrets from my past that occasionally come and bother me whenever I study.
It also happens when you have stuff to sell, but the folks who would normally buy it from you can't pay you enough to survive today, because they don't have anything to sell.
It seems unlikely that you would have a skillset that allows you to produce valuable stuff for poor people, but not for anyone else.
Okay, thinking hard, I can make up some situations like that, for example that you are a skilled translator into some indigenous language, where all speakers of the language are too poor to actually pay you for the translations (even if they would enjoy reading them a lot). Or that your services are limited to your local area, e.g. you can provide accommodation, but only poor people live in that area, and there are zero tourists.
and the problem can start randomly and build on itself
Something like... a million people living on an island, where most of them can provide some valuable service to their neighbors (but not to anyone outside the island), but some critical skill is missing on the whole island... like, all of them are genius teachers or movie producers, but none of them can grow food... so they are all going to starve, despite being so skilled that an average inhabitant of the island would be a rich person if they were teleported into our society?
In the short term, this certainly can happen, especially if the situation changes overnight. Like, yesterday there were a hundred specialized food producers, but by some miracle, all of them were killed by lightning during the night. To make it sound more likely, all of them were in the same place (the annual food-producer conference), and something exploded there and killed them all.
But... I don't see how any other economic system would deal with the fact that, no matter how you distribute the money, there is not going to be any food on the island anyway. With a free market, at least all the professors and movie producers now see the opportunity to become millionaires overnight if they succeed in reinventing, e.g., the lost art of picking fruit. Even if they are great movie producers but quite lousy fruit pickers.
(Actually, such a situation would be made worse by an unfree market, for example if the government of the island insisted that the wannabe professor-turned-fruit-picker is legally not allowed to pick fruit because he doesn't have a diploma from Fruit Picking University, and any attempt to illegally do a job he is not qualified for would get him arrested.)
Now, let's assume that the island actually is okay, able to grow its own food, etc. It's just that the money flow happens to be hopelessly unidirectional. No one outside the island wants to buy anything from the island. (Let's suppose they are not interested in the island's stuff, and it can't gain customers even by selling really cheaply, because the cost of ship fuel will still make everything more expensive than anyone is willing to pay.) On the other hand, people on the island sometimes buy something from the outside, e.g. because they cannot produce their own iPhones. Thus, money only ever goes out of the island, but never in. The island is constantly losing its global PageRank, ahem, money reserve. What happens now?
If I understand it correctly, the standard market outcome will be that -- assuming the island uses its own currency -- the exchange rate will gradually approach "1 out-of-island currency = infinity island currency". The people on the island will stop being able to buy stuff from outside, because it will become astronomically expensive for them.
Yet, within the island, people will still be able to sell to each other, because both sides will pay in island currency. And there will be things to sell, for example the locally grown food. No one will be able to buy iPhones anymore, and that sucks, but the island will still be no worse off than if the rest of the world simply stopped existing.
And if someone comes from the outside and uses their infinitely valuable out-of-island money to buy the local food, then the assumption of a unidirectional flow of money is no longer true; we now have money flowing in both directions.
Etc, economics 101.
However, one possible solution for "people who have nothing to sell" is generally known as Basic Income. Not universally accepted, of course, but it is a way to make sure everyone can buy stuff, at the cost of relatively small damage to the economy. By relatively small I mean: of course entrepreneurs will complain about the higher tax rate, but as far as I know, they usually complain much more about regulation, bureaucracy, or unpredictability, and Basic Income doesn't create much of those compared with the usual government interventions.
Essentially, Basic Income + market profit seems like a plausible approximation of our model of terminal + instrumental value, where we assign approximately the same terminal value to each human (expressed as Basic Income), and more instrumental value to people doing useful stuff for others (expressed as market profit).
one possible solution for "people who have nothing to sell" is generally known as Basic Income
A solution for "people who have nothing to sell" is generally known as welfare. It exists in all the developed world and consumes large fractions of government budgets.
Basic income is more of a solution for people who are capable of, but don't want to make something to sell.
If it's worth saying, but not worth its own post, then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should start on Monday, and end on Sunday.
4. Unflag the two options "Notify me of new top level comments on this article" and "