Wednesday, 25 November 2009


Not much to report over the last couple of weeks, I had a reading week recently and actually managed to get rather a lot done, having spent several hours in the library making notes on this and that. It was a much needed 'break' of sorts, if only from the time-tabled activities.

I am currently trying to think of a research project which will form my thesis. I have a pretty good idea who will be supervising me: a guy over at the Institute of Cognitive Neuroscience who is researching plasticity in the auditory cortex.

Plasticity is something that fascinates me greatly, and one book in particular, 'The Brain That Changes Itself' by Norman Doidge, was more or less the main reason why I applied for this particular course.

Essentially, plasticity (or neuroplasticity) is the ability of the brain to 'rewire' itself, to make new connections. For the century or so since the 'neuron doctrine' came to the forefront of thinking about the brain, it was believed that the structure of the brain was more or less fixed from adolescence onward. Scientists thought that no new brain cells could be formed, and that if a part of the brain was damaged then it was lost for life.

Of course, there has to be some degree of plasticity in the brain, otherwise we couldn't form new memories, but we now know that the brain is far, far more plastic than previously thought.

The most exciting research being done in this field that I am aware of is in the area of sensory substitution, most notably the work carried out by a guy by the name of Paul Bach-y-Rita.

Bach-y-Rita developed a device which could allow the brain to recover one lost sense through another. If I wanted to sensationalise this, I would say he made blind people see again. But that wouldn't quite do him justice - he made them see with their tongue.

Sounds insane, but it is actually possible. The key concept here is that we don't see with our eyes, just as we don't hear with our ears. All of our senses are essentially electrical information carried to and processed in the brain. For example, the actual physical image of what you see doesn't get any further than the back of the retina, before it becomes a complex series of electrical pulses that are then carried to the visual areas at the back of the brain.

When somebody is blind, it is generally because of a problem with their eyes, rather than the visual areas of the brain. Therefore, if an alternative pathway could be found to get these electrical pulses to the visual cortex, you would have vision.

What Bach-y-Rita invented was a small device that sits on your tongue and converts a picture from a video camera into electrical information. Imagine a strip of plastic with hundreds of tiny electrical points covering its surface. Each of these points acts like a pixel in a digital image: the brighter the pixel, the stronger the electrical pulse. Multiply this over the entire surface of the tongue and you can make up a crude image of your visual field. Apparently, when this device is activated it feels like those old 'popping candy' sweets you can get that fizz in your mouth.
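The brightness-to-pulse idea can be sketched in a few lines of code. This is purely my own illustration of the principle, not the actual device: the 20x20 electrode grid, the 0-255 brightness range, and the linear scaling are all assumptions I've made for the sake of the example.

```python
# Illustrative sketch only: map a grayscale camera frame onto a grid
# of tongue electrodes, with brighter pixels producing stronger pulses.
# The 20x20 grid and 0-255 brightness range are my own assumptions,
# not the real device's specifications.

def frame_to_pulses(frame, grid=20, max_intensity=1.0):
    """Downsample a 2D list of 0-255 pixel values into a grid x grid
    array of pulse intensities in the range [0, max_intensity]."""
    h, w = len(frame), len(frame[0])
    pulses = []
    for r in range(grid):
        row = []
        for c in range(grid):
            # Average the block of pixels feeding this electrode.
            y0, y1 = r * h // grid, (r + 1) * h // grid
            x0, x1 = c * w // grid, (c + 1) * w // grid
            block = [frame[y][x] for y in range(y0, y1)
                                 for x in range(x0, x1)]
            mean = sum(block) / len(block)
            # Brighter pixel -> stronger pulse.
            row.append(max_intensity * mean / 255)
        pulses.append(row)
    return pulses
```

Each electrode ends up carrying the average brightness of its patch of the camera image, which is roughly the 'pixels on the tongue' idea described above.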

The amazing thing is, over time your brain learns how to process the signals as images, and slowly the signal becomes a valid, if slightly basic, black and white image. Brain scans using MRI machines have even confirmed that this information is being processed in the visual parts of the brain.

So this now becomes a philosophical question - if visual information is being processed in the visual cortex, but it just so happens it is relayed via the tongue by a video camera - is this still eyesight?

Take a look at these two videos and decide for yourself.

Monday, 9 November 2009

A little fact.

Hello everyone,

I realise that my last entry was pretty dull, so here's an interesting little fact to keep your interest.

There are 100-150 billion neurons in the human brain.
Each neuron may connect with around 10,000 other neurons.
If each neuron connected with every other single neuron, our brain would be 12.5 miles in diameter (Nelson & Bower, 1990). This is the size of Greater London.

Taken from Jamie Ward's book 'The Student's Guide to Cognitive Neuroscience', chapter 2.
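As a quick back-of-envelope check on the first two figures (the 12.5-mile claim depends on wiring geometry from Nelson & Bower that I won't try to reproduce), multiplying them out gives the rough total number of connections:

```python
# Back-of-envelope arithmetic from the figures above. These are
# round-number estimates, not exact counts.
neurons = 100e9            # lower bound: 100 billion neurons
connections_each = 10_000  # roughly 10,000 connections per neuron

total_connections = neurons * connections_each
print(f"{total_connections:.0e}")  # on the order of 1e15 connections
```

That is about a quadrillion connections, which helps explain why a fully connected brain would need to be so absurdly large.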

Sunday, 8 November 2009

Weeks 4 & 5

Over a month into the course now, and my predominant feeling is exhaustion. I can't particularly explain why, but I am bloody tired. How handy, then, that next week is a reading week; it couldn't have been better timed.

There isn't a great deal to report about the last fortnight. We had some interesting lectures and some achingly dull ones. We learnt a little about how computational modelling can help us to understand the brain, had another case study at the hospital, and a whole load more stats.

I got my marks back for the first statistics test as I was handing in the second assignment. I did reasonably well, considering the conditions under which I wrote the last one (see my previous entry on my late-night rewrite). The marks I dropped were, I think, due mostly to the length of time it has been since I had to do any statistical analysis, and a couple of silly mistakes. I feel a lot more confident about the second test, and it would seem that I am back into the swing of things.

My main worry now is the exam in January (the only exam on this course), which is on the neurophysiological side of the course. Of course, this stuff is bloody hard, and Marty has just put a few example questions on his website, which really put the fear into me. As I was rereading my notes on how brain cells communicate with one another, I was struck by just how easy this would be if the brain were intrinsically self-aware. My neurons are firing right now, in many different parts of my brain, as are yours, as are all of ours, all the time, many many times over. We should be experts in this. Given the frequency with which our neurons fire, you could argue that it is the single most practised act any human has ever performed.

So why then is this exam shaping up to be such a struggle?