
Tuesday, 16 March 2010

And now, the end is near...

Somewhat unbelievably, we are now in the last two weeks of formal lectures, although as usual there is still an array of optional and supplementary talks going on at the various affiliated institutions. The taught aspect of the MSc is now all but over, which leaves only the prospect of the research project looming, lumbering into sight like some gigantic beast from a 1950s B-movie, pulverising anything that dares get in its way.

Tomorrow I meet with my supervisor/collaborator to really get the process started, and we will draw up exactly what will happen, when, and who will be responsible for what.

A few weeks ago I posted here a simplified explanation of our research proposal. I will now attempt to explain, in as accessible a form as possible, what this research has got to do with the real world. But first, a very quick recap of the basics.

The field is auditory psychophysics. At first the name alone was enough to make me want to run for the hills, but it's not half as scary as it sounds. As is so often the case in neuroscience the intimidating nomenclature masks a surprisingly simple concept. Psychophysics, it turns out, is just the study of how the brain processes the information we get from the senses, in this case, sound. Therefore the grand old title 'auditory psychophysics' really just means 'what the brain does with everything you hear'.

There is now a wealth of evidence to suggest that the structure of the brain, i.e. which cells connect with which and how strong the connections are, is constantly changing, and that this change is driven by the importance of the sensory information we receive.

In the case of sound, incoming information is processed mainly in the primary auditory cortex, or A1. I have explained before the concept of tonotopic organisation, but as a little refresher, imagine that a little part of the surface of the brain is like the keys of a piano. When you hear a high-pitched sound the brain cells at the top of the piano are activated, and as the pitch gets deeper the activity moves further down the keyboard.

As a result of this type of organisation, each cell in A1 has what is termed a best frequency, or BF. This is the frequency to which the cell responds most strongly, and as the frequency moves away from the BF the response gets smaller, until there is no response at all.

So, what happens if a certain frequency suddenly becomes very important to your behaviour? Consider the sound of a screeching predator, which would be a very good indicator that you should make yourself scarce as soon as possible. It would be very helpful if you processed these behaviourally relevant sounds more quickly than irrelevant background sounds.

Well, when a sound like this suddenly becomes very important we find that more of the auditory neurons will change their BF to the frequency in question, making the animal more sensitive to that sound.
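
If you like to think in code, here is a toy model of that idea. It is entirely my own illustration, with made-up numbers and a hypothetical Gaussian tuning curve, not something taken from any of the studies: each neuron responds most strongly at its BF, and conditioning nudges the population's BFs towards the important frequency.

```python
import numpy as np

def response(freq, bf, bandwidth=0.5):
    # Toy tuning curve: strongest response at the best frequency (BF),
    # falling off as a Gaussian with distance (in octaves) from the BF.
    octaves = np.log2(freq / bf)
    return np.exp(-(octaves ** 2) / (2 * bandwidth ** 2))

def condition(bfs, target, shift=0.3):
    # Crude model of conditioning: every BF moves a fraction of the way
    # (in log-frequency space) towards the behaviourally relevant target.
    return bfs * (target / bfs) ** shift

rng = np.random.default_rng(0)
bfs = 2 ** rng.uniform(np.log2(200), np.log2(8000), 100)  # 100 neurons, 200 Hz to 8 kHz
target = 1000.0  # the frequency paired with, say, a predator's screech (Hz)

before = response(target, bfs).sum()
after = response(target, condition(bfs, target)).sum()
print(f"population response to {target:.0f} Hz: {before:.1f} -> {after:.1f}")
```

Running this, the summed population response to the target frequency goes up after 'conditioning', which is the sensitisation described above in miniature.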

It sounds fairly straightforward, but we are talking about a small cluster of cells amongst tens of billions, a great many of which show a similar adaptability for whatever property they are specialised in. So this will be going on not just for sound frequency, but also for the other properties of sound, such as volume. Additionally, plasticity has been shown in other domains such as vision, touch and smell. And that is just the senses; our own internal states are also constantly being monitored in a similar fashion.

The bigger picture is one of a brain that is constantly adapting to perform at its peak in whatever environment it is placed.

This plasticity is greatest in infancy. Babies are born with far more connections between brain cells than are present in adults, perhaps as many as double, because most of our adaptation to our environment happens in the first few years of life. Once the infant is adapted to its environment, the irrelevant brain connections are pruned away, left if not dead then largely dormant.

This extreme early adaptability has a few intriguing implications. For example, if a human baby is exposed to enough monkey faces early in development, it will be able to distinguish monkey faces just as well as human faces (presumably into adulthood), although for an adult this would be almost impossible to learn. Another example of this early adaptability and pruning is seen in language. Babies can initially learn all the speech sounds found in languages around the world, even sounds that are almost indistinguishable to Western adults, such as the click consonants used in some African languages. This flexibility does not last long, and beyond the first couple of years of life we become locked into the sound categories of our first language (which, incidentally, is why native Japanese speakers find it so hard to distinguish between R and L, a contrast that is not present in Japanese).

However, as I said, the connections that are pruned after infancy remain dormant rather than dead, and plasticity experiments suggest that with appropriate training they can be revived to some degree.

Plasticity, therefore, is like Darwinism happening in real time. It takes many generations for a species to physically adapt to its environment, but the clever old brain can do it in a matter of hours.

Friday, 18 December 2009

Research Proposal

Today I had to submit my research proposal for my thesis project. Here it is, written by myself and Dr Christian Kluge:


Rapid plastic changes in auditory cortex: a classical conditioning paradigm

Chris Fassnidge, Dr Christian Kluge and Professor Jon Driver

Aim

This study seeks to determine whether detection and/or discrimination of a pure auditory tone can be improved by classical conditioning, pairing a target frequency with an electric shock.

Literature Review

Work by Merzenich, Weinberger, Irvine and others has shown that receptive field properties of neurons in primary auditory cortex (AI) can undergo rapid plastic changes in response to behavioural learning in animals (reviewed in Weinberger, 2004; Weinberger, 2007; Irvine, 2007). Remarkably, these changes occur within minutes. During learning, when a target frequency acquires behavioural relevance, a large number of AI pyramidal cells shift their best frequency towards this distinct frequency. This effect has been shown to depend on attention, i.e. behavioural relevance (Polley et al., 2006). Two groups of rats underwent operant conditioning with identical stimulus sets. One group responded to a target frequency and demonstrated tonotopic changes resulting in an increased representation of that frequency, while the second group performed the task (with exactly the same stimuli) in response to a target loudness, which led to changes in the topographic organisation of neurons' preferred loudness. In non-human primates, Blake et al. (2006) demonstrated that active cognitive control and engagement are needed for tonotopic re-mapping to occur.

Subsequent mechanistic work has revealed that the neurotransmitter acetylcholine (ACh) is crucially involved in these plastic processes. Pairing brief ACh infusions with purely passive presentation of tones induced changes in the AI tonotopic maps similar to those observed in the experiments described above. In addition, stimulation of the nucleus basalis, the main source of corticopetal cholinergic projections, led to identical remapping. These findings are intriguing because they strongly argue against the long-held view that primary sensory cortices are merely passive input structures in which plastic changes of receptive fields occur only during early ontogeny. Instead, the studies summarised above indicate that cortical sensitivity, and perhaps even local network resonance patterns, can be dynamically adapted to current behavioural requirements.

Very little work has been done on this subject in humans. We therefore aim to determine behaviourally whether conditioning of one or another frequency can lead to improved performance in the detection and/or discrimination of pure tones, as would be predicted if human receptive fields show plasticity similar to that documented in animals.

Materials and Method

In a within-subject design (with conditioned frequencies counterbalanced over subjects), we will compare the detection (experiment A) as well as the discrimination (experiment B) of pure tones. The detection task will employ a two-alternative forced-choice (2AFC) scheme in which subjects must decide which of two successively presented white noise stimuli actually contained a pure tone. In the discrimination experiment, participants will be required to decide whether the second of two successively presented pure tones was higher or lower than the first. In both experiments, tones across a range of frequencies will be used, and this part of the experiment will last about 15 minutes.
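
To make the trial structure concrete, here is a rough sketch of what a single detection trial might look like in code. The function names, durations and levels are illustrative only, not the actual experiment script (which has yet to be written):

```python
import numpy as np

RATE = 44100  # sample rate in Hz

def make_interval(duration=0.5, tone_freq=None, tone_level=0.1, rng=None):
    # One observation interval: white noise, with an optional pure tone mixed in.
    if rng is None:
        rng = np.random.default_rng()
    t = np.arange(int(duration * RATE)) / RATE
    noise = rng.normal(0.0, 0.2, t.size)
    if tone_freq is not None:
        noise += tone_level * np.sin(2 * np.pi * tone_freq * t)
    return noise

def detection_trial(tone_freq, rng):
    # 2AFC detection: the tone goes into interval 1 or 2 at random;
    # the subject must report which interval contained it.
    target = rng.integers(1, 3)  # 1 or 2
    intervals = [make_interval(tone_freq=tone_freq if i == target else None, rng=rng)
                 for i in (1, 2)]
    return intervals, target  # play both intervals, then compare response to target

rng = np.random.default_rng(42)
stimuli, answer = detection_trial(tone_freq=1000, rng=rng)
print(f"tone was in interval {answer}")
```

The discrimination version would be analogous, with two clean tones and the correct answer being whether the second was higher or lower.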

After this initial detection/discrimination block, subjects will undergo classical conditioning, pairing one distinct target frequency with an electric shock to the forearm. Once this association is established, the detection/discrimination 2AFC routines are repeated, interleaved with further conditioning blocks (“topping up”). After 40 minutes, the detection/discrimination task will cease to be interrupted by further conditioning. The absence of reinforcement of the target frequency will then lead to extinction of the association between frequency and shock.

A number of potential follow-up studies are conceivable. First, the work by Blake and colleagues (2006) suggests that operant conditioning might be more effective in inducing tonotopic changes; modifications of the paradigm employing reward or punishment depending on performance are therefore possible. There are also potential MEG versions of all the experiments described, which would, through analysis of early-latency auditory components of the evoked magnetic fields, allow a direct assessment of the underlying neurophysiological principles.

Predicted Outcomes

This series of experiments allows for three possible outcomes:
1. Conditioning may improve tone detection performance but not tone discrimination. This situation would allow for the conclusion that a greater number of neurons are responding to the target frequency after conditioning, but that this improvement does not involve a sharpening of best frequency tuning curves.
2. Conditioning may improve tone discrimination performance but not tone detection. This outcome could be interpreted as an increased local signal-to-noise ratio. This situation seems somewhat unlikely, however, since previous studies reported best frequency shifts in large numbers of cells rather than sharpening of existing tuning curves.
3. Finally, if conditioning leads to performance improvements in both detection and discrimination, our interpretation would be that the number of neurons responding to the target frequency increased without this coming at the expense of the frequencies around it. In this situation it would be interesting to study the underlying compensatory mechanisms in a later MEG experiment.

Analysis

The data will be analysed with a random-effects ANOVA using the SPSS statistics software package. Further analysis may be required depending on results.
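
Although the proposal specifies SPSS, the equivalent repeated-measures analysis is easy to sketch in Python. The data frame below is dummy data with hypothetical column names, purely to show the shape of the analysis:

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Illustrative long-format data: one accuracy score per subject per phase.
# A real analysis would also include task (detection vs discrimination)
# and conditioned vs control frequencies as within-subject factors.
data = pd.DataFrame({
    "subject": [s for s in range(1, 9) for _ in range(2)],
    "phase": ["pre", "post"] * 8,
    "accuracy": [0.71, 0.78, 0.65, 0.72, 0.70, 0.74, 0.68, 0.75,
                 0.73, 0.80, 0.66, 0.69, 0.72, 0.79, 0.69, 0.76],
})

# Within-subject (repeated-measures) ANOVA: does accuracy change
# from pre- to post-conditioning?
result = AnovaRM(data, depvar="accuracy", subject="subject", within=["phase"]).fit()
print(result)
```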

Timetable

Preparatory work: January - March 2010
(generation of stimuli, programming of the actual experiment, pilot measurements)
Data collection: March - June 2010
(16 to 20 subjects in each counterbalancing group)
Analysis and write up: June - July 2010

Budget

Participants will be reimbursed for their time and effort using existing research grants of the ICN attention group. No investment in equipment or software will be necessary.

Ethics

Full ethical approval will be sought from the Graduate School Research Ethics Committee prior to pilot data collection. The ethics application will be submitted in early January 2010.

References

Blake, D. T., Heiser, M. A., Caywood, M., & Merzenich, M. M. (2006). Experience-dependent adult cortical plasticity requires cognitive association between sensation and reward. Neuron, 52(2), 371-381.

Irvine, D. R. F. (2007). Auditory cortical plasticity: Does it provide evidence for cognitive processing in the auditory cortex? Hearing Research, 229(1-2), 158-170.

Polley, D. B., Steinberg, E. E., & Merzenich, M. M. (2006). Perceptual Learning Directs Auditory Cortical Map Reorganization through Top-Down Influences. The Journal of Neuroscience, 26(18), 4970–4982.

Weinberger, N. M. (2004). Specific long-term memory traces in primary auditory cortex. Nature Reviews Neuroscience, 5(4), 279-290.

Weinberger, N. M. (2007). Associative representational plasticity in the auditory cortex: a synthesis of two disciplines. Learning & Memory, 14(1-2), 1-16.

Wednesday, 25 November 2009

Neuroplasticity.

Not much to report over the last couple of weeks. I had a reading week recently and actually managed to get rather a lot done, spending several hours in the library making notes on this and that. It was a much needed 'break' of sorts, if only from the timetabled activities.

I am currently trying to think of a research project to form my thesis. I have a pretty good idea who will be supervising me: a guy over at the Institute of Cognitive Neuroscience who is researching plasticity in the auditory cortex.

Plasticity is something that fascinates me greatly, and one book in particular, 'The Brain That Changes Itself' by Norman Doidge, was more or less the main reason why I applied for this particular course.

Essentially, plasticity (or neuroplasticity) is the ability of the brain to 'rewire' itself, to make new connections. For the century or so since the 'neuron doctrine' came to the forefront of thinking about the brain, it was believed that the structure of the brain was more or less fixed from adolescence onward. Scientists thought that no new brain cells could be formed, and that if a part of the brain was damaged then it was lost for life.

Of course, there has to be some degree of plasticity in the brain, otherwise we couldn't form new memories, but we now know that the brain is far, far more plastic than previously thought.

The most exciting research being done in this field that I am aware of is in the area of sensory substitution, most notably the work carried out by a man named Paul Bach-y-Rita.

Bach-y-Rita developed a device which could allow the brain to recover one lost sense through another. If I wanted to sensationalise this, I would say he made blind people see again. But that wouldn't quite do him justice: he made them see out of their tongue.

Sounds insane, but it is actually possible. The key concept here is that we don't see with our eyes, just as we don't hear with our ears. All of our senses are essentially electrical information carried to and processed in the brain. For example, the actual physical image of what you see gets no further than the back of the retina, where it becomes a complex series of electrical pulses that are then carried to the visual areas at the back of the brain.

When somebody is blind, it is generally because of a problem with their eyes rather than with the visual areas of the brain. Therefore, if an alternative pathway can be found to get these electrical pulses to the visual cortex, you have vision.

What Bach-y-Rita invented was a small device that sits on your tongue and converts the picture from a video camera into electrical information. Imagine a strip of plastic with hundreds of tiny electrical points covering its surface. Each of these points is like a pixel in a digital image: the brighter the pixel, the stronger the electrical pulse. Multiply this over the entire surface of the tongue and you can build up a crude image of the visual field. Apparently, when the device is activated it feels like those old 'popping candy' sweets that fizz in your mouth.
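
At its core this is a downsampling problem, and a rough sketch of the idea fits in a few lines of Python. This is my own illustration of the principle, with a made-up grid size, and has nothing to do with the actual device's electronics:

```python
import numpy as np

def frame_to_electrodes(frame, grid=(12, 12), max_intensity=255):
    # Downsample a grayscale camera frame to a small electrode grid, so
    # brighter image regions drive stronger pulses on the matching patch
    # of the tongue.
    h, w = frame.shape
    gh, gw = grid
    # Average the pixels falling into each electrode's cell of the image.
    cells = frame[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    intensities = cells.mean(axis=(1, 3))
    # Scale to the device's drive range (0 = off, max = strongest tingle).
    return (intensities / 255 * max_intensity).astype(np.uint8)

# A fake 120x160 'camera frame' with a bright square in the middle.
frame = np.zeros((120, 160))
frame[40:80, 60:100] = 255.0
print(frame_to_electrodes(frame))
```

The bright square shows up as a block of strong 'pulses' in the middle of the 12x12 grid, which is all the image the tongue ever receives.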

The amazing thing is that, over time, the brain learns to process these signals as images, and slowly the signal becomes a usable, if slightly basic, black and white image. Brain scans using MRI machines have even confirmed that this information is being processed in the visual parts of the brain.

So this now becomes a philosophical question - if visual information is being processed in the visual cortex, but it just so happens it is relayed via the tongue by a video camera - is this still eyesight?

Take a look at these two videos and decide for yourself.