If there's some other weird attractor state of beliefs that also fulfills those requirements, I guess I risk falling into it. But then again, so do you - such beliefs would have to predict experience as successfully as the truth, which means they would have to give you the same widget-making capacity as true beliefs.
There are plenty of things like this -- engineering models, heuristics, etc. You don't have to have a "true" map to have a "useful" map. An idealized, right-angled, not-to-scale map of a city which nonetheless allowed you to logically navigate from point A to point B would be "useful" even if not "true" or "accurate" in certain senses.
Meanwhile, if you wait around for a "true" map, you're not going anywhere.
But such maps are only useful insofar as they are true. For example, the London Tube Map claims to be a useful representation of which stations are on which lines. It's useful in doing that because it is correct in its domain - every station it says is on the Piccadilly Line really is on the Piccadilly Line. It doesn't claim to accurately represent distance, and anyone who tried to use it to determine distances would quickly get some surprises.
There the danger doesn't seem to be getting something that isn't the truth; the danger is stopping at something that works well enough and never pushing on toward the truth.
Robin wrote about how being rational can harm you. Let's look at the other side: what significant benefits does rationality give?
The community here seems to agree that rationality is beneficial. Well, obviously people need common sense to survive, but does an additional dose of LessWrong-style rationality help us appreciably in our personal and communal endeavors?
Does LessWrong make us WIN?
(If we don't WIN, our evangelism rings a little hollow. Science didn't spread due to evangelism, science spread because it works. Art spreads because people love it. I want to hold my Art to this standard. Push-selling a solution while it's still inferior might be the locally optimal decision but it corrupts long-term, as many of us have seen in the IT industry. That's if the example of all religions and political movements isn't enough for you. Beware the Evangelism Death Spiral!)
We may claim internal benefits such as improved clarity of thought from each new blog insight. But religious people claim similar internal benefits that actually spill over into the measurable world, such as happiness and charitability. This makes us envious, and we try to parlay our internal changes into group efforts at world-benefiting tasks. To my mind this puts the cart before the horse: why compete with religion on its own terms when we have utility functions of our own to satisfy?
No, feelings won't do. If feelings turn you on, do drugs or get religion. Rationalism needs to verifiably bring external benefit. Don't help me become pure of racism or some such. Help me WIN, and the world will beat a path to our door.
Okay, interpersonal relationships are out. The most obvious area where rationalism could help, then, is business. And the most obvious community-beneficial application (riffing on some recent posts here) would be scientists banding together to run a profitable part-time business that funds their own research. I can see how many techniques taught here could help, e.g. Prisoner's Dilemma cooperation techniques. If a "rationalism case study" of this sort ever gets launched, I for one will gladly offer my effort. Of course this is just one suggestion; anything's possible.
One thing's definite for me: rationalism needs to be grounded in real-world victories for each one of us. Otherwise what's the point?