Daniel_Burfoot comments on Open thread, 3-8 June 2014 - Less Wrong

Post author: David_Gerard 03 June 2014 08:57AM


Comment author: Daniel_Burfoot 03 June 2014 05:06:07PM 1 point

I would guess strongly (75%) that the answer is yes. There are remarkable accounts of brains adapting to new inputs. One paper in the neuroscience literature showed that if you reroute visual input into a ferret's auditory cortex, that brain region develops neural structures normally associated with vision (such as edge and orientation detectors).

Comment author: Error 03 June 2014 05:16:29PM 0 points

This makes me wonder what could be done with, say, a Bluetooth earbud and a smartphone, both of which are rather less conspicuous than Google Glass. Not quite as good as connecting straight to the auditory cortex, but still. The first thing that comes to mind is trying to get GPS navigation to work on a System 1 rather than a System 2 level, through subtle cues rather than interpreted speech.

[Edit: or positional cues rather than navigational ones. Not just knowing which way north is, but knowing which way home is.]
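The positional-cue idea above could be prototyped with very little machinery: compute the great-circle bearing from the phone's GPS fix to a fixed "home" point, subtract the user's compass heading, and map the result onto a stereo pan for the earbud. The sketch below is purely illustrative (the function names and the sine-based pan mapping are my own assumptions, not anything proposed in the thread):

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def stereo_pan(bearing_deg, heading_deg):
    """Map a target bearing, relative to the user's heading, onto a stereo
    pan in [-1.0, 1.0]: -1.0 is full left, 1.0 is full right, 0.0 is ahead
    (or directly behind -- pure panning cannot disambiguate front/back; a
    real app might vary pitch or volume for that)."""
    relative = math.radians((bearing_deg - heading_deg) % 360.0)
    return math.sin(relative)
```

For example, with home due east of the current fix and the user facing north, `initial_bearing` returns 90 degrees and `stereo_pan(90.0, 0.0)` returns 1.0, i.e. the cue plays fully in the right ear.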