Comment author: KPier 29 December 2011 02:49:00AM *  6 points [-]

I'm 17 and just got into a top U.S. college, where I want to major in math and economics. I am a bit worried that I haven't learned good work habits and that I waste too much time on the internet, since high school was mostly a breeze for me. I've heard from a lot of people that kids like me get hit hard in college when they have to work hard for the first time, and while I can think of lots of reasons I'm different, this is probably a good situation in which to take the outside view.

So in short, I'd love a mentor. How does this work, exactly?

Comment author: wuthefwasthat 29 December 2011 12:50:42PM *  0 points [-]

I was in a similar situation to yours 4 years ago. I worked a fair bit harder, learned WAY more, and had a WAY better time in college. To be fair, my work ethic is still not very good, but I pretty much get A's in the classes I care about and B's in the ones I don't.

I suspect that you will be fine - you're probably smart enough that college won't be as hard as you might think, and you'll also be more motivated to dig up good work habits if you really need/want to.

Comment author: wuthefwasthat 29 December 2011 12:36:27PM *  3 points [-]

I'm in my last year of studying CS/Math as an undergraduate at MIT (I'm going to do a Master's next year though). I'd really like some advice about what I should do after I graduate - Grad school? Industry? Any alternative?

I care a fair amount about reducing xrisk, but I am also fairly skeptical that there is much I can do about it right now.

I have job offers from Google and some tech start-ups, and I suspect I could get a job in finance if I tried. I personally have some desire to start a tech company one day. I'm not sure what the tradeoff between doing good work and making money is, but I suspect my main goal should be maximizing expected income. I'd try to use most of my money to support people doing good things. (Though I'm not sure money is the limiting factor here. Perhaps discovering what to do would be better...)

I'm not sure whether I can get into a top graduate school - I have a 5.0 technical GPA (4.8 overall), but no research record or particularly good recommendations at the moment. I believe I am much better than most mathematicians/computer scientists, but I am also not sure what sort of research I could do that I would consider worthwhile. Realistically, I am at least 2 huge leaps down from being as good as, say, John von Neumann (that is, people far from as good as him are far better than me). I also have a really bad attention span, and tend not to think about problems for extended periods of time. I'm not sure this is worth factoring in, as it can probably be remedied. Even if I am not suited for solving the really hard theory problems out there, at minimum I can code and do math better than most people in most labs. I'm basically open to going to graduate school in any field, as long as I'd make an impact, and hopefully have a comparative advantage.

I've talked to a number of people about this, but I'm still pretty uncertain about what to do. I'd love to hear people's thoughts. Thanks in advance!

Comment author: wuthefwasthat 27 December 2011 11:05:44PM 1 point [-]

I'm in the Cambridge area, and haven't been attending meetups, but this seems like the most awesome possible way to start. What is the mailing list/point of contact to work things out with the Cambridge meet-up guys?

In response to The Allais Paradox
Comment author: GreedyAlgorithm 19 January 2008 11:02:34AM 1 point [-]

I am intuitively certain that I'm being money-pumped all the time. And I'm very, very certain that transaction costs of many forms money-pump people left and right.

Comment author: wuthefwasthat 26 December 2011 10:03:04AM 1 point [-]

Comment author: Anatoly_Vorobey 17 December 2011 09:43:05AM 0 points [-]

Are you confusing soundness with consistency? Omega-consistency is much weaker than soundness.

Consistency: a syntactic claim that it's impossible to derive a contradiction. Doesn't require a notion of truth to be useful.

Omega-consistency: a syntactic claim that it's impossible to prove certain statements together. Doesn't need a notion of truth, but is motivated by the standard model of natural numbers.

Soundness: a semantic claim that given a specific notion of "true" as applies to a statement, e.g. truth in the model N of natural numbers, all the axioms of the theory are true. Automatically implies both consistency and omega-consistency. Requires a notion of the "intended model" or a "standard model" for the theory in which we consider the truth of propositions. For example, soundness is meaningless to talk about in the case of ZFC, which doesn't have an intended model.
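The three notions can be stated formally (a sketch; T is a first-order theory of arithmetic, and \bar{n} is the numeral for the natural number n):

```latex
% Sketch of the three notions for a first-order arithmetic theory T.
\begin{align*}
\text{Consistency:}\quad & T \nvdash \bot \\
\omega\text{-consistency:}\quad & \text{no } \varphi \text{ has both } T \vdash \exists x\, \varphi(x)
  \text{ and } T \vdash \neg\varphi(\bar{n}) \text{ for every } n \in \mathbb{N} \\
\text{Soundness:}\quad & \mathbb{N} \models T, \text{ so } T \vdash \varphi \text{ implies } \mathbb{N} \models \varphi
\end{align*}
```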

I think this argument holds as long as the other formal system is recursively enumerable, and PA is omega-consistent.

Look at this sentence from the argument:

"If the oracle says yes, you know that the statement is true for standard integers because they're one of the models of PA, therefore N is a standard integer, therefore T halts."

If the oracle for a formal system S says "yes" on a statement that encodes the proposition "a Turing machine T will halt on this input", this means that S proves this proposition. It does not mean that the proposition is true and T will in fact halt! For that to be true, you need the formal system S to be sound. S could easily be omega-consistent and not sound, in which case it'll lie to you (example, assuming you believe PA to be consistent: the formal system…)
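The reduction under discussion can be sketched in code. This is purely illustrative: `prove_oracle` is a hypothetical black box answering provability questions for a sound, recursively enumerable theory extending PA, and sentences are modeled as plain strings.

```python
# Sketch: a provability oracle for a SOUND r.e. theory extending PA yields a
# halting oracle. `prove_oracle(sentence)` is a hypothetical black box that
# returns True iff the theory proves `sentence`.

def halts(machine_desc, prove_oracle):
    """Decide halting, given a provability oracle for a sound theory.

    If the machine halts, PA already proves it (Sigma_1-completeness), so
    the oracle answers True. If it never halts, soundness (or, as in the
    argument above, omega-consistency) guarantees the theory cannot prove
    the false halting statement, so the oracle answers False. Without
    soundness, a True answer would only mean "provable", not "T halts".
    """
    return prove_oracle(f"the Turing machine {machine_desc} halts")
```

If the theory were merely consistent rather than sound, this function could return True for a machine that never halts, which is exactly the failure mode the comment describes.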

Comment author: wuthefwasthat 17 December 2011 08:00:12PM *  0 points [-]

Soundness: a semantic claim that given a specific notion of "true" as applies to a statement, e.g. truth in the model N of natural numbers, all the axioms of the theory are true. Automatically implies both consistency and omega-consistency. Requires a notion of the "intended model" or a "standard model" for the theory in which we consider the truth of propositions. For example, soundness is meaningless to talk about in the case of ZFC, which doesn't have an intended model.

I looked it up, and it seems like what you're referring to as soundness is called "arithmetic soundness." The soundness I know doesn't require a notion of standard model. It simply says that anything provable syntactically is also true in every model/interpretation. This is automatically true for any theory in first order logic. (Note that my version of soundness is the correct analogue of completeness, which is also true for any FOL theory, as Gödel showed, less trivially.)
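The contrast between the two notions, sketched formally:

```latex
% Logical soundness holds for every first-order theory; arithmetic
% soundness additionally presupposes the standard model \mathbb{N}.
\begin{align*}
\text{Logical soundness:}\quad & T \vdash \varphi \;\Rightarrow\; T \models \varphi
  \quad (\varphi \text{ true in every model of } T) \\
\text{Arithmetic soundness:}\quad & T \vdash \varphi \;\Rightarrow\; \mathbb{N} \models \varphi
\end{align*}
```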

"If the oracle says yes, you know that the statement is true for standard integers because they're one of the models of PA, therefore N is a standard integer, therefore T halts."

So here, the oracle says something of the form "exists x such that T halts after x steps". Omega-consistency guarantees that there is some actual standard number N so that "T halts after N steps" is provable/true. The existence of a standard model implies omega-consistency, so I might as well have gone with that, but I was just trying to be minimalist.
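The omega-consistency step, spelled out (a sketch; H(x) abbreviates the \Delta_0 formula "T halts within x steps"):

```latex
% If T never halts, each \neg H(\bar{n}) is a true \Delta_0 sentence, and
% PA proves every true \Delta_0 sentence:
\forall n \in \mathbb{N}:\quad \mathrm{PA} \vdash \neg H(\bar{n})
% Omega-consistency then rules out \mathrm{PA} \vdash \exists x\, H(x).
% Contrapositively, an affirmative oracle answer has a standard witness:
\mathrm{PA} \vdash \exists x\, H(x) \;\Rightarrow\; \exists N \in \mathbb{N}:\ \mathbb{N} \models H(\bar{N})
```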

Sorry for the miscommunication. Are we on the same page now? I do think that saying "soundness" generally refers to the notion I said, though.

Comment author: cousin_it 17 December 2011 10:08:41AM 0 points [-]

If they were, the oracle would never be more helpful than a halting oracle, no?

But we want the oracle to be less helpful than the halting oracle...

Anyway, the question is settled now, thanks a lot :-)

Comment author: wuthefwasthat 17 December 2011 12:26:56PM *  0 points [-]

Oops sorry! Ignore what I said there. Anyways, the axioms aren't necessarily r.e., but as far as I can tell, they don't need to be.

Comment author: cousin_it 17 December 2011 12:18:19AM *  1 point [-]

I'm a little out of my depth here, so sorry if my comments don't make sense.

Then just include the axiom that says R(x) for all x in L, and not R(x) for all x not in L.

That's supposed to be an r.e. set of axioms, not a single axiom, right? I can easily imagine the program that successively prints the axioms R(x) for all x in L, but how do you enumerate the axioms not R(x) for all x not in L, given that L is only r.e. and not recursive? Or am I missing some easy way to have the whole thing as a single axiom without pulling in the machinery for running arbitrary programs and such?

On second thought, maybe we don't need the second part. Just having R(x) for all x in L could be enough.

It seems pretty clear that the only things this theory proves are things that FOL proves, silly things about the successor function, and silly things about R, like "forall x, R(x) -> R(x)" and "(R(14) and R(12)) -> R(17)" where R(14) is false. So an oracle for the logical implications of this theory is basically the same as an oracle for L.

I don't completely understand why there won't be an accidental smart thing among all the silly things...

Comment author: wuthefwasthat 17 December 2011 06:35:43AM *  1 point [-]

I'm a little out of my depth here, so sorry if my comments don't make sense.

I'm not an expert either, so I'm probably just being unclear.

That's supposed to be an r.e. set of axioms, not a single axiom, right? I can easily imagine the program that successively prints the axioms R(x) for all x in L, but how do you enumerate the axioms not R(x) for all x not in L, given that L is only r.e. and not recursive? Or am I missing some easy way to have the whole thing as a single axiom without pulling in the machinery for running arbitrary programs and such?

The axioms don't need to be r.e. If they were, the oracle would never be more helpful than a halting oracle, no?

I don't completely understand why there won't be an accidental smart thing among all the silly things...

I don't either. It's just a strong intuition which I'm not sure I can justify, and which might be wrong.

ETA: By silly, I don't necessarily mean as simple as the examples I gave. Basically if you have a formula phi(S(x), T, F), which holds for arbitrary sentences S(x), provably true T, and provably false F, then you can replace S(x) with R(x), T with R(x in L), and F with R(x not in L). Not sure if that was well explained, but yeah.

At least it looks like my answer is correct :). Also my proof should generalize, if it does work. So I would have guessed that Feferman's (stronger) result was true, and I wouldn't be surprised if the argument was along these lines, though maybe the details are harder.

Comment author: wuthefwasthat 16 December 2011 11:57:00PM *  2 points [-]

I believe the answer to your question is yes. I'm going to just interpret "formal system" as "first order theory", and then try to do the most straightforward thing.

Take a language L of intermediate degree, as constructed via the priority method. I'd like to just take the strings (or numbers) in this language to be the theory's axioms. So let the theory have some 1-ary relation, call it R, as well as +, and constants 0 and 1. Assert that everything has a successor, just to get the "natural numbers" (without having multiplication though). Then just include the axiom that says R(x) for all x in L, and not R(x) for all x not in L.

It seems pretty clear that the only things this theory proves are things that FOL proves, silly things about the successor function, and silly things about R, like "forall x, R(x) -> R(x)" and "(R(14) and R(12)) -> R(17)" where R(14) is false. So an oracle for the logical implications of this theory has the same degree as an oracle for L.

Don't feel like thinking about how to say/prove this part formally, but maybe someone can help (or correct) me. Also, for reference, Presburger arithmetic is basically arithmetic without multiplication, and is decidable.

Comment author: cousin_it 16 December 2011 11:14:40PM *  0 points [-]

Right. Also there's always the freaky possibility that there's no such thing as the "standard integers". After all, we don't really have any formal grounds to believe that they exist, only unreliable evolved intuitions about counting apples, and these have already let us down many times (e.g. Russell's paradox).

Comment author: wuthefwasthat 16 December 2011 11:23:42PM *  0 points [-]

Sure. You actually need something a bit stronger than soundness, in that you want omega-consistency, right?

I still don't agree with/understand what you two are saying about having the standard integers as a model, or interpreting PA with its own axioms, though (or anything along the lines of needing to contain PA). I think this argument holds as long as the other formal system is recursively enumerable, and PA is omega-consistent.

Comment author: Anatoly_Vorobey 16 December 2011 10:35:10PM 0 points [-]

So having a provability oracle for PA, or any other formal system that has a model containing the standard integers (like ZFC), is equivalent to having a halting oracle

No, you need guarantees on the formal system's complexity, similar to the ones in Godel's incompleteness theorem(s). You also need the formal system to be sound for your argument to carry through. This is stronger than your "has a model containing the standard integers", and is equivalent to "has the standard integers as a model".

In other words, if you knew all about the logical implications of PA, then you would also know all about the logical implications of ZFC

Note that you're assuming here that PA is sound and in particular consistent.

Is there a formal system (not talking about the standard integers, I guess) whose provability oracle is strictly weaker than the halting oracle, but still uncomputable?

The halting oracle's uncomputability degree is the smallest possible uncomputable degree, so no.

Comment author: wuthefwasthat 16 December 2011 10:40:00PM *  2 points [-]

The halting oracle's uncomputability degree is the smallest possible uncomputable degree, so no.

What? That's false. See http://en.wikipedia.org/wiki/Turing_degree#Post.27s_problem

ETA: Also, not sure what you are saying about soundness... =/
