If all we have is observations and models and predictions, and the whole enterprise of trying to reason about whatever might underlie those observations is misguided, presumably abstract assertions like "there exists a single X" or "there exists more than one X" are also misguided except insofar as they can be grounded out in differential predictions about future events.
That is, I don't see how the distinction you're drawing here between those two superficially distinct accounts is at all meaningful on your ontology.
If all we have is observations and models and predictions, and the whole enterprise of trying to reason about whatever might underlie those observations is misguided
That is the minimal model I currently prefer, yes. And "the whole enterprise of trying to reason about whatever might underlie those observations" is not misguided; it's useful for coming up with better models, but that's all it is useful for. Assigning it any ontology is unnecessary.
...That is, I don't see how the distinction you're drawing here between those two superficially distinct accounts is at all meaningful on your ontology.
Today's post, When Science Can't Help, was originally published on 15 May 2008. A summary (taken from the LW wiki):
Discuss the post here (rather than in the comments to the original post).
This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Science Doesn't Trust Your Rationality, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.
Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.