Rationality requires intelligence, and the kind of intelligence that we use (for communication, progress, FAI, etc.) runs on language.
It seems that the place to start is optimizing language for intelligence and rationality. One of SIAI's proposals includes using Lojban to interface between humans and an FAI. And of course, I should hope the programming language used to build an FAI would be "rational". But it seems to me that the human-generated priors, correct epistemic rationality, decision theory, metaethics, etc. all depend on using a language that maps sufficiently rigorously onto our territory.
Are "naturally evolved" languages such as English sufficient, with EY-style taboos and neologisms? Or are they sick to the core?
Please forgive me, and point me towards any previous discussion or sequences on this topic.
There's certainly a lot of complexity being glossed over, but I think it's manageable. Natural languages borrow words from each other all the time, and while there are issues and ambiguities with how to do it, they develop rules that seem to cover them: forbidden phonemes and clusters get replaced in predictable ways, affixes get stripped off and replaced with local versions, and the really hard cases - highly irregular verbs, prepositions, and articles - form closed sets, so they don't need to be borrowable.
If I'm translating a math paper from English to an artificial language, and the author makes up a new concept and calls it blarghlizability, I should be able to find a unique, non-conflicting, invertible translation by replacing the -izability affix and either leaving the rest the same or applying simple phonetic transforms to it. More importantly, this translation process should determine most of the language's vocabulary. It's the difference between a language with O(n) things to memorize and a language with O(1) things to memorize.
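To make that concrete, here's a minimal Python sketch of the kind of translation process I mean. Everything in it is invented for illustration - the affix table (e.g. -izability → -izeblo) and the single sound rule are hypothetical, not drawn from any actual proposal - but it shows how a small rule set can translate an open-ended vocabulary and round-trip back:

```python
# Toy sketch of rule-based, invertible borrowing into a hypothetical
# artificial language. All affixes and sound rules below are invented
# for illustration.

# Affix table: English suffix -> artificial-language suffix.
AFFIXES = {
    "izability": "izeblo",
    "ization": "izado",
    "ness": "eco",
}

# A toy phonetic rule: replace a "forbidden" cluster with a legal one.
# Real rules would need to be chosen so they never collide (here, a stem
# that already contained a bare "g" would break invertibility).
PHONEMES = [("gh", "g")]

def transliterate(stem: str, reverse: bool = False) -> str:
    for src, dst in PHONEMES:
        if reverse:
            src, dst = dst, src
        stem = stem.replace(src, dst)
    return stem

def translate(word: str) -> str:
    for eng, art in AFFIXES.items():
        if word.endswith(eng):
            return transliterate(word[: -len(eng)]) + art
    return transliterate(word)  # no known affix: sound rules only

def back_translate(word: str) -> str:
    for eng, art in AFFIXES.items():
        if word.endswith(art):
            return transliterate(word[: -len(art)], reverse=True) + eng
    return transliterate(word, reverse=True)

word = "blarghlizability"
borrowed = translate(word)            # -> "blarglizeblo"
assert back_translate(borrowed) == word  # round-trips: the mapping inverts
print(word, "->", borrowed)
```

The point of the sketch is the memorization claim: the speaker learns the affix table and the sound rules once (the O(1) part), and every new coinage like blarghlizability translates mechanically instead of becoming one more word to memorize.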
(EDIT: Deleted a half-finished sentence that I'll finish in a separate reply downthread)
Yes, that's true about natural language borrowing, to some extent. Note that calques (borrowing of a phrase with the vocabulary translated but the syntactic structure of the source language retained) are also common; presumably the artificial language would want to avoid these.
Also, some very high percentage of natural language borrowings are nouns. This clearly has a lot to do with the fact that if you encounter a new object, a natural way to label it is to adopt the existing term of the people who've already encountered it, but I think there are other fa...