

Title: Brain-reading  
Author: World Heritage Encyclopedia
Language: English
Subject: Neuroinformatics, Neuroprosthetics, Exploratory engineering, 3D printing, Neurotechnology
Collection: Computational Neuroscience, Emerging Technologies, Neurotechnology, Visual Perception
Publisher: World Heritage Encyclopedia

Brain-reading

Brain-reading uses the responses of multiple voxels in the brain, evoked by a stimulus and detected by fMRI, to decode the original stimulus. Brain-reading studies differ in the type of decoding employed (i.e., classification, identification, or reconstruction), the target (e.g., visual patterns, auditory patterns, or cognitive states), and the decoding algorithm used (linear classification, nonlinear classification, direct reconstruction, Bayesian reconstruction, etc.).

Contents

  • Classification
  • Reconstruction
  • Natural images
  • Other types
  • Accuracy
  • Limitations
  • Applications
  • See also
  • References
  • External links

Classification

In classification, a pattern of activity across multiple voxels is used to determine the particular class from which the stimulus was drawn.[1] Many studies have classified visual stimuli, but this approach has also been used to classify cognitive states.
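
As a concrete sketch of this idea (using simulated data and a deliberately simple correlation-based classifier, since the article does not name a specific algorithm), each trial's multi-voxel pattern can be assigned to the class whose mean training pattern it correlates with best:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated experiment (illustrative sizes): 40 trials, 50 voxels,
# two stimulus classes with distinct mean activity patterns.
n_trials, n_voxels = 40, 50
labels = np.repeat([0, 1], n_trials // 2)
class_means = rng.normal(0, 1, (2, n_voxels))
responses = class_means[labels] + rng.normal(0, 0.5, (n_trials, n_voxels))

def fit_centroids(X, y):
    """Mean voxel pattern for each class (the 'training' step)."""
    return np.stack([X[y == c].mean(axis=0) for c in np.unique(y)])

def classify(X, centroids):
    """Assign each trial to the class centroid it correlates with best."""
    Xz = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    Cz = (centroids - centroids.mean(axis=1, keepdims=True)) / centroids.std(axis=1, keepdims=True)
    corr = Xz @ Cz.T / X.shape[1]
    return corr.argmax(axis=1)

centroids = fit_centroids(responses, labels)
accuracy = (classify(responses, centroids) == labels).mean()
```

With the clean simulated patterns above, accuracy approaches 1.0; real fMRI classification faces far noisier signals and must be evaluated on held-out trials.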

Reconstruction

In reconstruction, the aim is to create a literal picture of the image that was presented. Early studies used voxels from early visual cortex areas (V1, V2, and V3) to reconstruct geometric stimuli made up of flickering checkerboard patterns.[2][3]

Natural images

More recent studies used voxels from both early visual cortex and the anterior visual areas forward of it (V3A, V3B, V4, and the lateral occipital complex), together with Bayesian inference techniques, to reconstruct complex natural images. This brain-reading approach uses three components:[4] a structural encoding model that characterizes responses in early visual areas; a semantic encoding model that characterizes responses in anterior visual areas; and a Bayesian prior that describes the distribution of structural and semantic scene statistics.[4]
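
A toy version of the prior-based step might look like the following (the sizes and the plain linear encoding model are illustrative assumptions, not the published model). Under a Gaussian noise assumption, the most probable image from the prior set is the one whose predicted voxel response best matches the measurement:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: 30 voxels, 20 image features, 500 images in the prior.
n_voxels, n_features, n_prior = 30, 20, 500
W = rng.normal(0, 1, (n_voxels, n_features))            # assumed-known linear encoding model
prior_images = rng.normal(0, 1, (n_prior, n_features))  # stand-in for a natural image prior

# A scan evoked by one of the prior images, plus measurement noise.
true_idx = 123
measured = W @ prior_images[true_idx] + rng.normal(0, 0.1, n_voxels)

# Under Gaussian noise, log-likelihood is (up to constants) the negative
# squared distance between predicted and measured voxel responses.
predicted = prior_images @ W.T                   # (n_prior, n_voxels)
log_lik = -((predicted - measured) ** 2).sum(axis=1)
reconstruction = prior_images[log_lik.argmax()]  # MAP choice under a flat prior
```

The published approach is far richer (separate structural and semantic encoding models, a prior over millions of images), but the selection principle is the same: combine an encoding model's likelihood with a prior over candidate images.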

Experimentally, the procedure is for subjects to view 1750 black-and-white natural images while voxel activation in their brains is recorded. The subjects then view another 120 novel target images, and information from the earlier scans is used to reconstruct them. Natural images used include pictures of a seaside cafe and harbor, performers on a stage, and dense foliage.[4]
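
The identification logic behind such a two-phase procedure can be sketched as follows (synthetic data and a plain least-squares encoding model stand in for the real image features and fMRI recordings): fit the model on the training scans, then pick whichever of the 120 candidates best predicts the new scan:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative sizes: 1750 training images, 120 novel candidates,
# 40 image features, 60 voxels (the real study's features differ).
n_train, n_test, n_features, n_voxels = 1750, 120, 40, 60

train_X = rng.normal(0, 1, (n_train, n_features))  # training image features
W_true = rng.normal(0, 1, (n_features, n_voxels))  # unknown "brain" mapping
train_Y = train_X @ W_true + rng.normal(0, 0.5, (n_train, n_voxels))

# Fit a linear encoding model (features -> voxel responses) by least squares.
W_hat, *_ = np.linalg.lstsq(train_X, train_Y, rcond=None)

# A new scan evoked by one of 120 novel candidate images.
test_X = rng.normal(0, 1, (n_test, n_features))
shown = 42
scan = test_X[shown] @ W_true + rng.normal(0, 0.5, n_voxels)

# Identify the candidate whose predicted response best matches the scan.
errors = ((test_X @ W_hat - scan) ** 2).sum(axis=1)
identified = int(errors.argmin())
```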

Other types

It is possible to track, from fMRI signals, which of two forms of rivalrous binocular illusion a person is subjectively experiencing.[5] The category of event a person freely recalls can be identified from fMRI before they say what they remembered.[6] Statistical analysis of EEG brainwaves has been claimed to allow the recognition of phonemes,[7] and, at a 60% to 75% level, of color and visual-shape words.[8] It has also been shown that brain-reading can be achieved in a complex virtual environment.[9]

Accuracy

Brain-reading accuracy is increasing steadily as the quality of the data and the complexity of the decoding algorithms improve. In one recent experiment it was possible to identify which single image was being seen from a set of 120.[10] In another, it was possible to tell 90% of the time which of two categories the stimulus came from, and to identify the specific semantic category (out of 23) of the target image 40% of the time.[4]

Limitations

It has been noted that so far brain reading is limited. "In practice, exact reconstructions are impossible to achieve by any reconstruction algorithm on the basis of brain activity signals acquired by fMRI. This is because all reconstructions will inevitably be limited by inaccuracies in the encoding models and noise in the measured signals. Our results demonstrate that the natural image prior is a powerful (if unconventional) tool for mitigating the effects of these fundamental limitations. A natural image prior with only six million images is sufficient to produce reconstructions that are structurally and semantically similar to a target image."[4]

Applications

Brain reading has been suggested as an alternative to polygraph machines as a form of lie detection.[11] One neuroimaging method proposed as a lie detector is EEG "brain fingerprinting", in which event-related potentials are supposedly used to determine whether a stimulus is familiar or unfamiliar.[12] The inventor of brain fingerprinting, Lawrence Farwell, has attempted to demonstrate its use in a legal case, Harrington v. State of Iowa, although the state objected on the basis that the probes used by Farwell were too general for familiarity or unfamiliarity with them to prove innocence.[11] Another alternative to polygraph machines is blood-oxygen-level-dependent functional MRI (BOLD fMRI). This technique interprets local changes in the concentration of oxygenated hemoglobin in the brain, although the relationship between this blood flow and neural activity is not yet completely understood.[11]

A number of concerns have been raised about the accuracy and ethical implications of brain reading for this purpose. Laboratory studies have found accuracy rates of up to 85%; however, there are concerns about what this means for false-positive results among non-criminal populations: "If the prevalence of 'prevaricators' in the group being examined is low, the test will yield far more false-positive than true-positive results; about one person in five will be incorrectly identified by the test."[11] Ethical problems in the use of brain reading as lie detection include misapplications due to adoption of the technology before its reliability and validity can be properly assessed, misunderstanding of the technology, and privacy concerns due to unprecedented access to individuals' private thoughts.[11] However, it has been noted that polygraph lie detection carries similar concerns about the reliability of the results[11] and the violation of privacy.[13]
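
The base-rate effect behind that quotation can be reproduced with a short Bayes-rule calculation (the 5% prevalence figure below is an illustrative assumption, and the test is assumed to be 85% accurate in both directions):

```python
# Base-rate sketch for a lie detector with assumed symmetric 85% accuracy.
sensitivity = 0.85   # P(test positive | deceptive)
specificity = 0.85   # P(test negative | truthful)
prevalence = 0.05    # assumed base rate of deception in the examined group

true_pos = sensitivity * prevalence               # deceptive and flagged
false_pos = (1 - specificity) * (1 - prevalence)  # truthful but flagged

# Positive predictive value: of those flagged, what fraction are deceptive?
ppv = true_pos / (true_pos + false_pos)
```

Under these assumptions, false positives (about 14% of everyone tested) far outnumber true positives (about 4%), and fewer than a quarter of those flagged are actually deceptive.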

Brain-reading has also been proposed as a method of improving human-machine interfaces, using EEG to detect relevant brain states of a human.[14] In recent years there has been a rapid increase in patents for technology involved in reading brainwaves, rising from fewer than 400 during 2009-2012 to 1600 in 2014.[15] These include proposed ways to control video games via brain waves and "neuro-marketing" to determine someone's thoughts about a new product or advertisement.
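
One simple way such an interface might detect a relevant brain state from EEG is by comparing band power in a frequency range of interest against a baseline band; the simulated signal, sampling rate, and threshold below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two seconds of simulated single-channel EEG at an assumed 250 Hz:
# background noise plus a strong 10 Hz (alpha-band) oscillation.
fs = 250
t = np.arange(0, 2, 1 / fs)
eeg = rng.normal(0, 1, t.size) + 3 * np.sin(2 * np.pi * 10 * t)

def band_power(signal, fs, low, high):
    """Mean spectral power between low and high Hz."""
    power = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

alpha = band_power(eeg, fs, 8, 12)      # band of interest
baseline = band_power(eeg, fs, 20, 40)  # comparison band
alpha_detected = alpha > 2 * baseline   # illustrative detection threshold
```

Practical EEG interfaces add windowing, artifact rejection, and per-user calibration, but band-power features of this kind remain a common starting point.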

See also

References


  1. ^
  2. ^
  3. ^
  4. ^ a b c d e
  5. ^
  6. ^
  7. ^
  8. ^
  9. ^
  10. ^
  11. ^ a b c d e f Wolpe, P. R., Foster, K. R., & Langleben, D. D. (2005). Emerging neurotechnologies for lie-detection: promises and perils. The American Journal Of Bioethics: AJOB, 5(2), 39-49.
  12. ^ Farwell, L.A., & Donchin, E. (1991). The Truth Will Out: Interrogative Polygraphy ("Lie Detection") with Event-Related Brain Potentials. Psychophysiology, 28(5), 531-547.
  13. ^ Arstila, V., & Scott, F. (2011). Brain reading and mental privacy. Trames: A Journal of the Humanities & Social Sciences, 15(2), 204-212. doi: 10.3176/tr.2011.2.08
  14. ^ Kirchner, E. A., Kim, S. K., Straube, S., Seeland, A., Wöhrle, H., Krell, M. M., . . . Fahle, M. (2013). On the Applicability of Brain Reading for Predictive Human-Machine Interfaces in Robotics. PLoS ONE, 8(12), e81732. doi: 10.1371/journal.pone.0081732
  15. ^ Surge in U.S. ‘brain-reading’ patents. (2015, May 7) BBC.com. Retrieved from http://www.bbc.com/news/technology-32623063

External links

  • Brain scanners can tell what you're thinking about (New Scientist article on brain-reading, 28 October 2009)
  • 2007 Pittsburgh Brain Activity Interpretation Competition: Interpreting subject-driven actions and sensory experience in a rigorously characterized virtual world