You need to clarify your intentions/success criteria. :) Here's my What Actually Happened technique to the rescue:
(a) You argued with some (seemingly) conventional philosophers on various matters of epistemology.
(b) You asked LessWrong-type philosophers (presumably having little overlap with the aforementioned conventional philosophers) how to do epistemology.
(c) You outlined some of the conventional philosophy arguments on the aforementioned epistemological matters.
(d) You asked for neuroscience pointers to be able to contribute intelligently.
(e) Most of the responses here used LessWrong philosophy counterarguments against arguments you outlined.
(f) You gave possible conventional philosophy countercounterarguments.
This is largely a failure of communication because the counterarguers here are playing the game of LessWrong philosophy, while you've played, in response, the game of conventional philosophy, and the games have very different win conditions that lead you to play past each other. From skimming over the thread, I am as usual most inclined to agree with Eliezer: Epistemology is a domain of philosophy, but conventional philosophers are mostly not the best at—or necessarily t...
Give up on justifying answers and just try to figure out what the answers really actually are, i.e., are you really actually inside an Evil Demon or not. Once you learn to quantify the reasoning involved using math, the justification thing will seem much more straightforward when you eventually return to it. Meanwhile you're asking the wrong question. Real epistemology is about finding correct answers, not justifying them to philosophers.
Without a justification, I cannot rationally believe in the truth of the senses.
Yeah you can. Like, are you wearing socks? Yes, you're wearing socks. People were capable of this for ages before philosophy. That's not about what's useful, it's about what's true. How to justify it is a way more complex issue. But if you lose sight of the fact that you are really actually in real life wearing socks, and reminding you of this doesn't help, you may be beyond my ability to rescue by simple reminders. I guess you could read "The Simple Truth", "Highly Advanced Epistemology 101 for Beginners", and if that's not enough the rest of the Sequences.
Externalism is always the answer! Accept that some unlucky people who are in sceptical scenarios would be doomed; but that doesn't mean that you, who are not in a sceptical scenario, are doomed too, even though the two situations are subjectively indistinguishable.
Warning: I am not a philosophy student and haven't the slightest clue what any of your terms mean. That said, I can still answer your questions.
1) Occam's Razor to the rescue! If you distribute your priors according to complexity and update on evidence using Bayes' Theorem, then you're entirely done. There's nothing else you can do. Sure, if you're unlucky then you'll get very wrong beliefs, but what are the odds of a demon messing with your observations? Pretty low, compared to the much simpler explanation that what you think you see correlates well to th...
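As a toy illustration of the "complexity-weighted priors plus Bayes' Theorem" move (my own sketch, not anything from the thread; the two hypotheses and their bit-counts are made-up assumptions):

```python
# Toy sketch of "complexity-weighted priors + Bayes' Theorem".
# The hypotheses and bit-counts below are illustrative assumptions only.

def posterior(prior_h, like_h, prior_alt, like_alt):
    """Bayes' Theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    p_evidence = prior_h * like_h + prior_alt * like_alt
    return (prior_h * like_h) / p_evidence

# Occam-style prior: weight each hypothesis by 2 ** -(description length in bits).
bits_ordinary = 10   # "my senses roughly track an external world"
bits_demon = 40      # "a demon fabricates exactly these experiences"

w_ordinary = 2.0 ** -bits_ordinary
w_demon = 2.0 ** -bits_demon
prior_ordinary = w_ordinary / (w_ordinary + w_demon)
prior_demon = w_demon / (w_ordinary + w_demon)

# Both hypotheses predict our observations equally well (likelihood 1.0 each),
# so updating leaves the ratio untouched: the demon stays wildly improbable.
print(posterior(prior_demon, 1.0, prior_ordinary, 1.0))  # ~9.3e-10
```

The point the sketch makes is the one in the comment above: when two hypotheses predict your experiences equally well, the simpler one keeps its huge prior advantage, so "demon" never wins.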
Finally, how do I speak intelligently on the Contextualist vs. Invariantist problem? I can see in broad terms that it is an empirical problem and therefore not part of abstract philosophy, but that isn't the same thing as having an answer. It would be good to know where to look up enough neuroscience to at least make an intelligent contribution to the discussion.
Invariantism, in my opinion, is rooted precisely in the failure to recognize that this is an empirical and ultimately linguistic question. I'm not sure how neuroscience would enter into it, actually. ...
Not a philosophy student, but it seems to me that your question is basically this:
If everything is uncertain (including reality, state of my brain, etc.), how can I become certain about anything?
And the answer is:
Taking your question literally, you can't.
In real life, we don't take it literally. We don't start by feeling uncertain about literally everything at the same time. We take some things as granted and most people don't examine them (which is functionally equivalent to having axioms); and some people examine them step by step, but not all at the same time (which is functionally equivalent to circular reasoning).
To combat skepticism, or at least solipsism, you just need to realise that there are no certainties, but that this does not mean you know nothing. You can work probabilistically (see the sketch after the links below).
Consider:
http://lesswrong.com/lw/mn/absolute_authority/
http://lesswrong.com/lw/mo/infinite_certainty/
http://lesswrong.com/lw/mp/0_and_1_are_not_probabilities/
http://wiki.lesswrong.com/wiki/Absolute_certainty
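A minimal sketch of what "working probabilistically" without certainties can look like (my own illustration, with made-up likelihood ratios; the point is the one the links above make, that finite evidence moves beliefs by a finite amount, so probabilities of exactly 0 or 1 are never reached):

```python
import math

def update(log_odds, likelihood_ratio):
    """One Bayesian update in log-odds form: evidence adds log(likelihood ratio)."""
    return log_odds + math.log(likelihood_ratio)

def probability(log_odds):
    return 1.0 / (1.0 + math.exp(-log_odds))

# Illustrative numbers: start at even odds (probability 0.5), then see three
# independent pieces of evidence, each favouring the hypothesis 100:1.
log_odds = 0.0
for _ in range(3):
    log_odds = update(log_odds, 100.0)

# Each finite update shifts log-odds by a finite amount, so reaching
# probability exactly 1 (or 0) would require infinitely strong evidence.
print(probability(log_odds))  # ~0.999999: very close to 1, never exactly 1
```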
This book might be what you are looking for. It's Evidence and Inquiry by Susan Haack. I have it, but I've only done a few very cursory skims of it (ETA: It's on my summer reading list, though). It has two very positive reviews on Amazon. Also, she calls out the Gettier "paradoxes" for what they are (for the most part, pointless distractions).
I doubt people are actually still interested, but just in case: I've actually managed to solve this problem.
If the Correspondence Theory of Truth is assumed (defining "Truth" as that which corresponds to reality), and the assumption is made that philosophy should pursue truth rather than what is pragmatically useful, then for any non-Strong-Foundationalist method of determining truth the objection could be made that it could easily have no correlation with reality, and there would be no way of knowing.
Probabilistic arguments fall apart because they...
Not trying to answer your questions, sorry. Just wanted to mention that different philosophical camps pattern-match to different denominations of the same religion. They keep arguing without any hope of agreeing. Occasionally some denominations prevail and others die out, or get reborn when a new convincing guru or a prophet shows up. If you have a strong affinity for theism, err, mainstream philosophy, just pick whichever denomination you feel like, or whichever gives you the best chance of advancement. If you care about that real world thing, consider de...
(Then again, it has been argued, if a Coherentist were deceived by an evil demon they could be deceived into thinking data coheres when it doesn't. Since their belief rests upon the assumption that their beliefs cohere, should they not discard it if they can't know whether it coheres or not? The "seems to cohere" formulation has its own problems.)
Doesn't the Coherentist idea say that even if the knowledge is incorrect, it is still "true" for the observer because it coheres with the rest of their beliefs?
The opinion Eliezer expresses is essentially that yes, you c...
I have naturally read the material here, but am still not sure how to act on three questions.
1: I've been arguing out the question of Foundationalism vs. Coherentism vs. other similarly basic methods of justifying knowledge (e.g. infinitism, pragmatism). The discussion left off with two problems for Foundationalism.
a: The Evil Demon argument, particularly the problem of memory. When following any piece of reasoning, an Evil Demon could theoretically fool my reason into thinking it had reasoned correctly when it hadn't, or fool my memory into thinking I'd reasoned properly before with reasoning I'd never actually done. Since a Foundationalist either is a weak Foundationalist (and runs into severe problems) or must discard all but self-evident and incorrigible assumptions (of which memory is not one), I'm stuffed.
(Then again, it has been argued, if a Coherentist were deceived by an evil demon they could be deceived into thinking data coheres when it doesn't. Since their belief rests upon the assumption that their beliefs cohere, should they not discard it if they can't know whether it coheres or not? The "seems to cohere" formulation has its own problems.)
b: Even if that's discarded, there is still the problem of how Strong Foundationalist beliefs are justified within a Strong Foundationalist system. Strong Foundationalism is neither self-evident nor incorrigible, after all.
I know myself well enough to know I have an unusually strong (even for a non-rationalist) irrational emotive bias in favour of Foundationalism, and even I begin to suspect I've lost the argument (though some people arguing on my side would disagree). Just to confirm, though: have I lost? What should I do now, either way?
2: What should I say on the question of skepticism (on which so far I've technically said nothing)? If I remember correctly, Eliezer has spoken of philosophy as how to act in the world, but I'm arguing with somebody who maintains as an axiom that the purpose of philosophy is to find truth, whether useful or useless, in whatever area is under discussion.
3: Finally, how do I speak intelligently on the Contextualist vs. Invariantist problem? I can see in broad terms that it is an empirical problem and therefore not part of abstract philosophy, but that isn't the same thing as having an answer. It would be good to know where to look up enough neuroscience to at least make an intelligent contribution to the discussion.