Brain's Semantic Mapping System Decoded

Posted 21 Dec 2012 at 21:04 UTC (updated 22 Dec 2012 at 05:11 UTC) by steve

Yet another brain-mapping project has announced some impressive new findings. Researchers at UC Berkeley's Gallant Lab have succeeded in decoding the semantic space in which the brain organizes the information we take in. They've mapped this space both as abstract, multi-dimensional graphics and as locations on the physical brain where the information is represented. Along the way they've learned all sorts of new things about how the brain categorizes the world. For example, one semantic dimension (a principal component, abbreviated PC) separates things that move - cars, motorcycles, people - from things that don't - buildings, cities, and the sky. Another dimension distinguishes things involved in social interaction (people, verbs, furniture) from things involved in less interactive outdoor scenes (geological formations, animals, vehicles). The researchers have identified four semantic dimensions so far, but believe that higher-resolution scans and further work will reveal many more.

"Across the cortex, semantic representation is organized along smooth gradients that seem to be distributed systematically. Functional areas defined using classical contrast methods are merely peaks or nodal points within these broad semantic gradients. Furthermore, cortical maps based on the group semantic space are significantly smoother than expected by chance. These results suggest that semantic representation is analogous to retinotopic representation, in which many smooth gradients of visual eccentricity and angle selectivity tile the cortex (Engel, Glover, & Wandell, 1997; Hansen, Kay, & Gallant, 2007). Unlike retinotopy, however, the relevant dimensions of the space underlying semantic representation are not known a priori, and so must be derived empirically"
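As the quote notes, the semantic dimensions are not known in advance and must be derived empirically. A common way to do that (and the one suggested by the "PC" abbreviation above) is principal component analysis over per-voxel category weights. The sketch below is purely illustrative, not the authors' code: the matrix sizes, the random data, and the variable names are all assumptions standing in for the real fMRI-derived regression weights.

```python
# Illustrative sketch (not the Gallant Lab pipeline): deriving shared
# "semantic dimensions" from a voxel-by-category weight matrix via PCA.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 voxels x 50 object/action categories. In the real
# study, each voxel's row would come from regressing its response against
# the categories present in natural movies.
weights = rng.standard_normal((200, 50))

# Center each category column, then take the singular value decomposition;
# the right singular vectors are the principal components (PCs), i.e.
# candidate semantic dimensions shared across voxels.
centered = weights - weights.mean(axis=0)
_, singular_values, components = np.linalg.svd(centered, full_matrices=False)

# Keep the first four PCs, mirroring the four dimensions reported so far.
semantic_dims = components[:4]            # shape: (4, 50)

# Each voxel can then be projected into this low-dimensional semantic space,
# which is what gets painted onto the cortical surface in the group maps.
voxel_coords = centered @ semantic_dims.T  # shape: (200, 4)
print(semantic_dims.shape, voxel_coords.shape)
```

With real data, each row of `semantic_dims` would be interpreted by inspecting which categories load on it most strongly (e.g. mobile versus stationary things), and each voxel's four coordinates would determine its color in the cortical map.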

The mapping of the semantic space onto the brain reveals that as much as 20% of the cortex, including parts of the somatosensory and frontal cortices, is devoted to these highly organized semantic maps. Less surprisingly, the maps confirm the locations of previously established specialized areas. Information about humans, for example, overlaps the fusiform face area (FFA), which is known to be involved in face recognition. For more, see the paper "A continuous semantic space describes the representation of thousands of object and action categories across the human brain" (PDF format), which appears in Neuron Vol. 76, Issue 6. If you're using a browser that supports WebGL graphics, such as Google's Chrome, you can explore an interactive version of the researchers' semantic brain map. And read on to see examples of the semantic space mapped onto the physical brain as well as a short video describing the research.
