“There’s something odd about the experience of talking to [Singularity Institute researcher] Carl Shulman,” I said.
“He never blinks?” said my friend.
“No. I mean: Yes, but that’s not what I was thinking of.”
“He speaks only facts.”
I paused.
“Yes,” I said. “That is what I meant.”
Normally, when I ask someone “Do you think human-level AI will arrive in the next 30 years?” or “Should we encourage faster development of whole brain emulation?” I get answers like “Yes” or “No, I don’t think so.”
When I ask Carl a question like “Do you think human-level AI will arrive in the next 30 years?” he instead begins to state known facts relevant to answering the question, such as facts about the history of Moore’s law, progress in algorithms, trends in scientific progress, past examples of self-improving systems, and so on.
Maybe this is a bit rude. Carl didn’t answer my question about his opinion. He answered a different question instead, about facts.
But it never feels rude. Carl goes out of his way to make his answer more useful to me. His testimony alone would have been helpful, but argument screens off authority, so Carl’s “flood of facts” way of answering questions gives me more evidence about what’s true than his bare opinion would.
Why isn’t this more common? For one thing, most people don’t know many facts. I’ve read a lot of facts, but do I remember most of them? Hell no. If I forced myself to respond to questions only by stating facts, I suspect I’d discover that I have fewer facts on hand than I’d like to admit. I often have to tell people: “I can’t remember the details in that paper, but I remember thinking his evidence was weak.”
But it’s worth a try. I think I’ve noticed that when I answer with facts more often, my brain is primed to remember them better, as if it’s thinking: “Oh, I might actually use this fact in conversation, so I should remember it.” But I haven’t measured this, so I could be fooling myself.
;)
(That is a winky-face.)