MealSquares (the company I'm starting with fellow LW user RomeoStevens) is searching for nutrition experts to join our advisory team. The ideal person has a combination of formally recognized nutrition expertise & at least a casual interest in things like study methodology and effect sizes (this unfortunately seems to be a rare combination). Advising us will be an opportunity to improve the diets of many people; it should not be much work, you'll get a small stake in our company, and you'll help us earn money for effective giving. Please get in touch with us (ideally using this page) if you or someone you know might be interested!
MealSquares are nutritionally complete--5 MealSquares contain all the vitamins & minerals you need to survive for a day, in the amounts you need them. In principle you could eat only MealSquares and do quite well, although we don't officially recommend this. It's more about having an easy "default meal" that you can eat with confidence once or twice a day when you don't have something more interesting to do like get dinner with friends.
MealSquares are made from a variety of whole foods, and almost all of the vitamins and minerals are from whole food sources (as opposed to competing products like Soylent that use dubious vitamin powders). Virtually every nutrition expert in the past century has recommended eating a variety of whole foods, and MealSquares stuffs more than 10 whole food ingredients into a single convenient package, including 3 different fruits and 3 different vegetables.
We've put a lot of research into MealSquares to make it better for you than most or all competing products on the market. For example, the first ingredient in Clif Bar is brown rice syrup (basically a glorified form of sugar), and they get their protein from rice and soy (not a...
More data on Kepler star KIC 8462852.
http://www.nasa.gov/feature/jpl/strange-star-likely-swarmed-by-comets
After going back through Spitzer Space Telescope infrared images, the star did not show an infrared excess as recently as earlier in 2015, meaning there was no event that generated huge amounts of persistent dust between the last spectral measurements and the Kepler dataset showing the dips in brightness. This bolsters the 'comet storm / icy body breakup' theory: such a breakup would generate dust close to the star that rapidly dissipates, and we would be positioned to see a large fraction of it while it is being generated near the star, rather than a tiny fraction of dust farther away.
(This comes after the Allen Telescope Array, failing to detect anything interesting, put an upper limit on radio emission from the system at 'weaker than 400x the strength we could put out with Arecibo in narrow bands, or 5,000,000x in wide bands', for what that's worth.)
Why is my karma so low? Is there something I'm consistently doing wrong that I can do less wrong? I'm sorry.
The first association I have with your username is "spams Open Threads with not really interesting questions".
Note that there are two parts to that objection. Posting a boring question in an Open Thread is not a problem per se -- I don't really want to discourage people from doing that. It's just that when I open any Open Thread and there are at least five boring top-level comments by the same user, instead of simply ignoring them I feel annoyed.
Many of your comments are very general debate-openers, where you expect others to entertain you, but don't provide anything in return. Choosing your recent downvoted question as an example:
How do you estimate threats and your ability to cope; what advice can you share with others based on your experiences?
First, how do you estimate "threats and your ability to cope"? If you ask other people to provide their data, it would be polite to provide your own.
Second, what is your goal here? Are you just bored and want to start a debate that could entertain you? Or are you thinking about a specific problem you are trying to solve? Then maybe being more specific in the question could help you get a more relevant answer. But the thing is, your not being specific seems like evidence for the "I am just bored and want you to entertain me" variant.
You use LW as a dumping ground for whatever crosses your mind at the moment, and that is usually random and transient noise.
By "transient" I mean that you mention a topic once and then never show any interest in it again. By "noise" I mean random pieces of text which neither contain useful information nor are interesting.
As I said before, I think it would be good if you got into the habit of trying to predict beforehand the votes your posts will get, and then not posting when you think a post would produce negative karma.
One way to do this: whenever you write a post, keep it in a text file and wait a day. The next day, ask yourself whether there is anything you can do to improve it. If you feel you can improve it, do it. Then estimate a confidence interval for the karma you expect the post to get and note it in a spreadsheet. If you think it will be positive, post your comment.
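To make the habit concrete, here is a minimal sketch of that spreadsheet-and-calibration loop as a script (the CSV file name and column names are just assumptions for illustration, not an existing tool):

```python
# Hypothetical calibration tracker for karma predictions.
# Assumes a CSV with columns: date, title, predicted_low, predicted_high, actual.
import csv

def calibration_report(path="karma_predictions.csv"):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    hits = sum(
        1 for r in rows
        if float(r["predicted_low"]) <= float(r["actual"]) <= float(r["predicted_high"])
    )
    # Well-calibrated 90% intervals should contain the actual karma ~90% of the time.
    print(f"{hits}/{len(rows)} posts landed inside your predicted interval")

if __name__ == "__main__":
    calibration_report()
```

If the hit rate comes out much lower than the confidence level you intended, that itself is useful feedback on the prediction habit.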
If you train that skill I would expect you to raise your karma and learn a generally valuable skill.
If at the end of writing a post you think "I’m not sure where I was going with this anymore," as in http://lesswrong.com/r/discussion/lw/mzx/some_thoughts_on_decentralised_prediction_markets/, don't publish the post. If you yourself don't see the point of your writing, it's unlikely that others will consider it valuable.
Thank you for asking. I've been trying to figure out what to say to you, but couldn't figure out quite what the issue is. One possibility in terms of karma is to bundle a number of comments into a single comment, but this doesn't address how the comments could be better.
A possible angle to work on is being more specific. It might be like the difference between a new computer user and a more sophisticated one. The new user says "My computer doesn't work!", and there is no way to help that person from a distance until they say what sort of computer it is, what they were trying to do, and some detail about what happened.
Being specific doesn't come naturally to all people on all subjects, but it's a learnable skill, and highly valued here.
I think it's that you post a lot of questions and not a lot of content. Less Wrong is predisposed to upvoting high-content responses. I haven't had an account for very long, but I have lurked for ages. That's my impression, anyways. I recognize that since I haven't actually pulled comment karma data from the site and analyzed it, I could be totally off-base.
Maybe when you ask questions, use this form:
[This is a general response to the post] and [This is what is confusing me] but [I thought about it and I think I have the answer, is this correct?] or [I thought about it, came up with these conclusions, but rejected them for reasons listed here, I'm still confused]
EDIT: I just looked at your submitted history. You do post content in Main, apparently, but your posts seem to run counter to the popular ideas here. There is bias, and LessWrong has a lot of ideas deemed "settled." Effective Altruism appears to be one, and you have posted arguments against it. I've also seen some of your posts jump to conclusions without explaining your explicit reasons. LWers seem to appreciate having concepts reduced as much as possible to make reasoning more explicit.
What is the optimal amount of attention to pay to political news? I've been trying to cut down to reduce stress over things I can't control, but ignoring it entirely seems a little dangerous. For an extreme example, consider the Jews in Nazi Germany - I'd imagine those who kept an eye on what was going on were more likely to leave the country before the Holocaust. Of course something that bad is unlikely, but it seems like it could still be important to be aware of impactful new laws that are passed - e.g. anti-privacy laws, internet piracy becoming much more heavily punishable, etc.
So what's the best way to keep up on things that might have an impact on one's life, without getting caught up in the back-and-forth of day-to-day politics?
Some things to think about:
Are there actual political threats to you in your own polity (nation, state, etc.)? Do you belong to groups that there's a history of official repression or large-scale political violence against? Are there notable political voices or movements explicitly calling for the government to round you up, kill you, take away your citizenship or your children, etc.? (To be clear: An entertainer tweeting "kill all the lawyers" is not what I mean here.)
Are you engaged in fields of business or hobbies that are novel, scary, dangerous, or offensive to a lot of people in your polity, and that therefore might be subject to new regulation? This includes both things that you acknowledge as possibly harmful (say, working with poisonous chemicals that you take precautions against, but which the public might be exposed to) and things that you don't think are harmful but that other people might. (Examples: Internet; fossil fuels; drones; guns; gambling; recreational drugs; pornography)
Internationally — In the past two hundred years, how often has your country been invaded or conquered? How many civil wars, coups d'état, or failed wars of independence have there been; especially ones sponsored by foreign powers? How much of your country's border is disputed with neighboring nations?
So, it seems like lots of people advise buying index funds, but how do I figure out which specific ones I should choose?
Short version: try something like Vanguard's online recommendation, or check out Wealthfront or Betterment. Probably you'll just end up buying VTSMX.
Long version: The basic argument for index funds over individual stocks is that you think a broad pool of stocks is going to outperform any individual stock you'd pick, because of general economic growth and reduced risk through pooling. So if you apply the same logic to index funds, what that argues is that you should find the index fund that covers the largest possible pool.
But it also becomes obvious that this logic only stretches so far--one might think that meta-indexing requires having a stock index fund and a bond index fund that are both held in proportion to the total value of stocks and bonds. So let's start looking at the factors that push in the opposite direction.
First, historically stocks have returned more than bonds long-term, with higher variability. It makes sense to balance your holdings based on your time and risk preferences, rather than the total market's time and risk preferences. (If you're young, preferentially own stocks.)
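As a rough illustration of tilting by your own time horizon rather than the market's (a sketch only; the "110 minus age" rule and the example fund names are common heuristics I'm assuming here, not something from the comment above):

```python
# Sketch: choose a stock/bond split from age instead of total-market proportions.
# The "110 minus age" heuristic and example fund names are illustrative assumptions.
def suggested_allocation(age: int) -> dict:
    stock_pct = max(0, min(100, 110 - age))
    return {
        "stocks (e.g. a total-market index fund like VTSMX)": stock_pct,
        "bonds (e.g. a total bond-market index fund)": 100 - stock_pct,
    }

if __name__ == "__main__":
    for age in (25, 45, 65):
        print(age, suggested_allocation(age))
```

The point is only that the split is driven by your own risk and time preferences, not by the relative market capitalization of stocks and bonds.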
As well, you might live in the US, for example, and find it more legally convenient to own US stocks than international stocks. The co...
Meta-research: Evaluation and Improvement of Research Methods and Practices, by John P. A. Ioannidis, Daniele Fanelli, Debbie Drake Dunne, and Steven N. Goodman.
...As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes thematic areas of methods, reporting, reproducibility, evaluation, and incentives...
Are there any studies that highlight which biases become stronger when someone "falls in love"? (Assume the love is reciprocated.) I am mainly interested in biases that affect short- and medium-term decisions, since the state of mind in question usually doesn't last long.
One example is the apparently overblown use of the affect heuristic when judging the goodness of the new partner's perceived characteristics and actions (the halo effect on steroids).
Further possible evidence for a Great Filter: a recent paper suggests that as long as the probability of an intelligent species arising on a habitable planet is not tiny (at least about 10^-24), then with very high probability humans are not the only civilization to have ever existed in the observable universe; a similar result holds for the Milky Way with around 10^-10 as the relevant probability. An article about the paper is here and the paper is here.
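A back-of-the-envelope sketch of why such tiny per-planet probabilities can still make company likely (this is the generic reasoning, not the paper's own calculation): if there are N habitable planets besides our own and each independently produces a civilization with probability p, then

$$P(\text{at least one other civilization}) = 1 - (1 - p)^{N} \approx 1 - e^{-pN},$$

which approaches 1 once pN is much larger than 1. The quoted thresholds of roughly 10^-24 (observable universe) and 10^-10 (Milky Way) presumably come from plugging in the paper's estimates of N at those two scales.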
Do transhumanist types tend to value years of life lived past however long they'd expect to live anyway linearly (i.e., if they'd pay a maximum of exactly n to live an extra year, would they also be willing to pay a maximum of exactly 100n to live 100 extra years)?
If so, the cost effectiveness of cryonics (in terms of added life years lived) could be compared with the cost effectiveness of other implementable health interventions that would-be cryonicists are on the fence about. What's the marginal disutility that a given transhumanist might get from forcing ...
I'm going to guess that English language proficiency is far higher in Europe than it is in China. But Asian Americans seem underrepresented on LW relative to the fields that LW draws heavily from, so that seems unlikely to be a complete explanation.
Nothing to do with IQ, but with modes of thinking. According to Nisbett, Eastern thinking is more holistic and concrete vs. the Western formal and abstract approach. He says that Easterners often make fewer thinking mistakes when dealing with other people, where a more holistic approach is needed (for example, Easterners are much less prone to the Fundamental Attribution Error). But at the same time they tend to make more thinking mistakes when it comes to thinking about scientific questions, as that often requires formal, abstract thinking. Nisbett also speculates that this is why science developed only in the west even though China was way ahead of the west in (concrete-thinking-based) technological progress.
In general there's very little if any correlation between IQ and rationality. A lot of Keith Stanovich's work is on this.
Facebook question:
I have different types of 'friends' on Facebook, such as "Family", "Rationalists", "English-speaking", etc. Different materials I post are interesting for different groups. There is an option to select visibility of my posts, but that seems not exactly what I want.
What I'd like is to make my posts so that they are available to everyone, including people I don't know (e.g. if anyone clicks on my name, they will see everything I ever posted), but I don't want all my posts to appear automatically on all of my 'f...
I don't typically read a lot of sci-fi, but I did recently read Perfect State, by Brandon Sanderson (because I basically devour everything that guy writes) and I was wondering how it stacks up to typical post-singularity stories.
Has anyone here read it? If so, what did you think of the world presented there? Would this be a good outcome of a singularity?
For people who haven't read it, I would recommend it only if you are either a sci-fi fan who wants to try something by Brandon Sanderson, or if you have read some Cosmere novels and would like a story that touches on some slightly more complex (and more LWish) themes than usual (and don't mind it being a bit darker than usual).
I just found out about the “hot hand fallacy fallacy” (Dan Kahan, Andrew Gelman, the Miller & Sanjuro paper) as a type of bias that more numerate people are likely more susceptible to, and for whom it's highly counterintuitive. It's described as a specific failure mode of the intuition used to get rid of the gambler's fallacy.
I understand the correct statement like this. Suppose we’re flipping a fair coin.
*If you're predicting future flips of the coin, the next flip is unaffected by the results of your previous flips, because the flips are independent. So ...
I think this is not quite right, and it's not-quite-right in an important way. It really isn't true in any sense that "it's more likely that you'll alternate between heads and tails". This is a Simpson's-paradox-y thing where "the average of the averages doesn't equal the average".
Suppose you flip a coin four times, and you do this 16 times, and happen to get each possible outcome once: TTTT TTTH TTHT TTHH THTT THTH THHT THHH HTTT HTTH HTHT HTHH HHTT HHTH HHHT HHHH.
What's going on here isn't any kind of tendency for heads and tails to alternate. It's that an individual head or tail "counts for more" when the denominator is smaller, i.e., when there are fewer heads in the sample.
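A quick simulation of the selection effect being described (a sketch, not the Miller & Sanjuro analysis itself): for each short sequence of flips, compute the fraction of flips following a head that are themselves heads, then average those per-sequence fractions.

```python
# Sketch of the short-sequence selection effect behind the "hot hand fallacy fallacy".
import random

def avg_heads_after_heads(n_flips=4, n_sequences=100_000, seed=0):
    rng = random.Random(seed)
    per_sequence_rates = []
    for _ in range(n_sequences):
        flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
        followers = [flips[i + 1] for i in range(n_flips - 1) if flips[i]]
        if followers:  # sequences with no heads before the last flip contribute nothing
            per_sequence_rates.append(sum(followers) / len(followers))
    return sum(per_sequence_rates) / len(per_sequence_rates)

if __name__ == "__main__":
    # Prints roughly 0.40 for length-4 sequences, even though each individual flip
    # after a head is heads with probability 0.5: the per-sequence average weights
    # heads-poor sequences as heavily as heads-rich ones.
    print(avg_heads_after_heads())
```

This matches the enumeration above: averaging the 14 defined per-sequence rates gives about 0.405, an "average of averages" effect rather than any real tendency to alternate.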
This week on the slack: http://lesswrong.com/r/discussion/lw/mpq/lesswrong_real_time_chat/
Business and startups - CACE (Changing Anything Changes Everything) with respect to startups and machine learning; prediction.io; meetings: each person speaks, so the length of the meeting is O(n), and there are n people, so the total meeting cost is O(n^2). On the margin, adding one person to the standup means they listen to n peo...
Introverts, Extroverts, and Cooperation
As usual, a small hypothetical social science study, but I'm willing to play with the conclusion, which is that extroverts are more likely to cheat unless they're likely to get caught. It wouldn't surprise the hell out of me if introverts are more likely to internalize social rules (or are people on the autism spectrum getting classified as introverts?).
Could "publicize your charity" be better advice for extroverts and/or majority extrovert subcultures than for introverts?
I've heard the Beatles have some recorded songs they never released because they were too low quality. I think it would be worthwhile to study their material in its full breadth, mediocrity included, to get a sense for the true nature of the minds behind some greatness.
I've saved writings and poetry and raw, potentially embarrassing past creations for the sake of a similar understanding. I wish I had recordings of my initial fumblings with the instruments I now play rather better.
So it is in this general context of seeking fuller understanding that I ask if anyone knows where to find these legendary old writings from Eliezer Yudkowsky, reputed to be embarrassing in their hubris, etc.
The "legendary old writings from Eliezer Yudkowsky" are probably easy to find, but I am not going to help you.
I do not like the idea of people (generally, not just EY) being judged for what they wrote many years ago. (The "sense for the true nature" phrasing sounds like a judgement is being prepared.)
Okay, I would make an exception in some situations; the rule of thumb being "more extreme things take longer to forget". For example, if someone advocated genocide, or organized the murder of a specific person, then I would be suspicious of them even ten years later. But "embarrassing in their hubris"? Come on.
I'm wary of digging into people's pasts only to laugh that as teenagers they had the usual teenage hubris (and maybe, as highly intelligent people, kept it for a few more years)... and then using it to hint that even today, 'deep inside', they are 'essentially the same', i.e. not worth taking seriously.
What exactly are we punishing here; what exactly are we rewarding?
Ten or more years ago I also had a few weird ideas. My advantage is that I didn't publish them in visible places in English, and I didn't become famous enough for people to now spend their time digging into my past. Also, I kept most of my ideas to myself, because I didn't try to organize people into anything. I didn't keep a regular diary, and when I find some old notes, I usually just cringe and quickly destroy them.
(So no, I don't care about any of Eliezer's flaws reflecting on me, or anything like that. Instead I imagine myself in a parallel universe, where I was more agenty and perhaps less introverted, so I started to spread my ideas sooner and wider, had the courage to try changing the world, and now people are digging up similar kinds of my writings. Generally, this is a mechanism for ruining si...
The Guardian had an interesting article on biases. Makes a similar point as http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_people/
I recall a tool, by Wei Dai if I'm not mistaken, which will display all of a user's posts and comments on one page. I was wondering if anyone had the link. Perhaps we could get a wiki page listing all of the LessWrong widgets like this for reference? I am not authorised to make Wiki pages myself.
If anybody is interested in Moscow postrationality meetup, please comment here or pm me. Thanks!
Do you know of any remedy or prevention for hiccups? I can't get anything trustworthy out of the internet or out of friends and family. All just anecdotes.
The other day I met a woman [common first name redacted, per a commenter's recommendation] near the train station. I was just sitting and eating lunch, and she came over to chat. She had recently been in hospital with lithium toxicity. She attends the same (mental) health complex as me. She was lovely, lonely, and dated younger guys. She mentioned that her money is controlled by a State Trust to an extent, and that her last boyfriend continues to abuse her financially, and occasionally physically. She mentioned the police have recommended she break up with him, but she says that she loves him. We swapped numbers. Anything I can do for her?
'Noisy text analytics': has anyone trialed applying those algorithms in their head to human conversations or text messaging (say, through Facebook) to filter information in real life? Was it more efficient than your default or non-volitional approach?
How do you estimate threats and your ability to cope; what advice can you share with others based on your experiences?
I have a student email account that forwards messages to my personal gmail account. Sometimes I have to send messages from my student gmail account. Can these get automatically moved to my personal gmail sent folder so that I can find them with one search?
What, other than an interest in the commercial success of the car lot business, normative social influence, and scrupulosity (all tenuous), stops someone from taking a second ticket (on foot) from a gated car park and then immediately paying that one off when leaving, rather than paying off the original entry ticket?
Any US lawyers here?
A woman who once worked in a law office told me that clients come and go (she used the word "ephemeral") so the real allegiance for a lawyer is to other lawyers. Because they will see them again and again.
And Game Theory has something to say about how to treat a person that you are not likely to see again.
Please, folks, do not ask me to justify this "hearsay". I found her credible, so please take this woman's word as gospel, as an axiom, and go from there.
Please confirm, deny, explain or comment on her statement.
TIA.
What would happen if an altcoin were developed where users had to precommit to not forking that coin?
If it's worth saying, but not worth its own post (even in Discussion), then it goes here.
Notes for future OT posters:
1. Please add the 'open_thread' tag.
2. Check if there is an active Open Thread before posting a new one. (Immediately before; refresh the list-of-threads page before posting.)
3. Open Threads should be posted in Discussion, and not Main.
4. Open Threads should start on Monday, and end on Sunday.