Last night I attended my first Cafe Scientifique, the second to be held here in Velcro City, put on in a combined effort by the University and the Council. The guest speaker was Professor Bob Nichol of the Institute of Cosmology here in Velcro City; holder of the only Marie Curie Excellence chair in Cosmology, no less. His talk was entitled ‘Where is the Universe?’
And very interesting it was, too. He started off by explaining what cosmology actually is, namely the science of the cartography of the universe. When his kids ask him what he does, he tells them, “I count stars!”
He discussed the boom in cosmology over the last ten years, which can be largely attributed to the advances in digital camera technology, and then moved on to talking about measurements, and how they are made ‘in the field’. Parallax was the first method employed for working out the distance to other stars, but it’s pretty useless once you get beyond the nearest hundred stars or so – the baseline (the diameter of the Earth’s orbit) stays fixed while the distance grows, so the angle you’re trying to measure soon becomes too tiny to detect. Cosmology now uses the inverse square law to work out distances, with standard reference light sources used for calibration purposes.
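For the mathematically curious, both methods boil down to one-liners. Here’s a quick Python sketch of each – the Proxima Centauri parallax is my own illustrative figure, not a number from the talk:

```python
import math

def parallax_distance_pc(parallax_arcsec):
    """Distance in parsecs from a parallax angle in arcseconds: d = 1 / p."""
    return 1.0 / parallax_arcsec

def inverse_square_distance_m(luminosity_watts, flux_watts_m2):
    """Inverse square law: F = L / (4 * pi * d^2), rearranged for d."""
    return math.sqrt(luminosity_watts / (4 * math.pi * flux_watts_m2))

# Proxima Centauri's parallax is roughly 0.77 arcseconds:
print(parallax_distance_pc(0.77))  # about 1.3 parsecs
```

The second function is the calibration trick in miniature: if you know a source’s true luminosity L, the flux F you measure tells you how far away it is.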
It was using this method that first allowed Hubble (the man, not the telescope) to discover the expansion of the universe. It was he who first realised (less than a hundred years ago, in 1925) that other galaxies were not part of our own galaxy, and that the furthest galaxies were retreating from us the fastest. This discovery changed the science of cosmology – it could be argued it created it in its modern form.
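That relationship – the further away a galaxy is, the faster it retreats – is what we now call Hubble’s law, v = H₀ × d. A minimal sketch, using a roughly modern value for the Hubble constant (my assumption, not a figure from the talk):

```python
H0 = 70.0  # Hubble constant in km/s per megaparsec (assumed round value)

def recession_velocity_km_s(distance_mpc):
    """Hubble's law: v = H0 * d. More distant galaxies recede faster."""
    return H0 * distance_mpc

# A galaxy 100 megaparsecs away recedes at roughly:
print(recession_velocity_km_s(100))  # 7000.0 km/s
```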
Cosmologists look for supernovae, and measure their brightness against the calculated energy output of a simulated reference standard (the ‘Type Ia’ supernova, produced by the detonation of a carbon-oxygen white dwarf star). It is impossible to predict exactly where and when these will happen, so they have to watch a lot of sky all the time to get the results they need. But statistics mean that there will always be one somewhere within a certain timeframe. You can then use the brightness of the light emitted to make certain predictions and calculations.
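The ‘standard candle’ trick can be sketched numerically: astronomers compare the supernova’s apparent magnitude with the known peak absolute magnitude of a Type Ia (around −19.3) via the distance modulus formula. The observed magnitude below is an invented example of mine, not data from the talk:

```python
M_TYPE_IA = -19.3  # approximate peak absolute magnitude of a Type Ia supernova

def distance_parsecs(apparent_mag, absolute_mag=M_TYPE_IA):
    """Distance modulus: m - M = 5 * log10(d / 10 pc), solved for d."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# A Type Ia observed peaking at apparent magnitude 15.7:
print(distance_parsecs(15.7) / 1e6)  # 100.0 megaparsecs
```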
Once the cosmologists started doing this, however, they found they weren’t getting the results they expected – the light was almost always 25% fainter than they had calculated it would be. This meant that either their analysis was wrong, or the supernovae were further away than they had previously thought. Using Occam’s razor, they settled for option B. The conclusion: the universe’s expansion has been accelerating over the past few billion years.
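The arithmetic behind that conclusion is pleasingly simple: if the flux is 25% lower than predicted, the inverse square law says the source must be about 15% further away than assumed. A quick check:

```python
import math

# Observed flux is 25% fainter than predicted:
flux_ratio = 0.75

# Flux falls off as 1/d^2, so d_actual / d_expected = 1 / sqrt(flux_ratio):
distance_factor = 1 / math.sqrt(flux_ratio)
print(round(distance_factor, 3))  # 1.155 - about 15% further than expected
```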
The only force that was known to operate at galactic scales at this point was gravity. But gravity is an attractive force, and the universe appears to have a large amount of repulsive force in action throughout it. A debate opened up as to what was causing this discrepancy.
The particle physics people explained it away with the phenomenon of vacuum energy, which they claim pervades space and causes the repulsion, and hence the expansion. The problem with this answer is that their theoretical figure for the amount of energy involved differed from the value observed by cosmologists quite considerably – in fact, by 120 orders of magnitude! Other hypotheses picked holes in gravitational theory, suggesting that gravity may ‘leak’ into dimensions above and beyond the four that make up space-time as we understand it. The problem there is that no-one knows where to look for these extra dimensions.
And so the concept of dark energy was born – the name given to whatever repulsive component is driving the acceleration, and it is thought to make up the bulk of the universe. Alongside it sits dark matter, which acts the same as normal matter from a gravitational point of view, but there the resemblance ends; that accounts for nearly another 30% of the universe. The ordinary matter we can actually see makes up only a fraction of a percentage. It all gets weird from here on in!
The rest of the evening was given over to discussion of the topics raised, with some great questions from the audience that shed light on a few points that were passed over in the initial presentation. Professor Nichol was a genial and entertaining speaker, and seemed to genuinely enjoy talking about his work to a receptive crowd. The atmosphere in general was very pleasant, and it was a lot of fun to go out for a bit of scientific exploration of a Tuesday evening. I’m looking forward to next month’s Cafe (on conscious machines and artificial intelligence), and I can see it becoming a regular fixture on my monthly calendar. If there’s a Cafe Scientifique in your town or city, and you like a bit of science geekery (if you read this blog, you must at least like it a little), I heartily recommend going along to one and seeing what it’s like. It dispels the notion of science being a dull and dry subject, that’s for certain.
PS: As an attempt to further my citizen-journalist credentials (and just because I’d like to ask him more questions), I’m going to see if I can get an interview with Professor Nichol at some point in the future, to ask him some science-fiction-related questions about his field. Watch this space! And sorry for the lack of pictures, but my phone’s camera wasn’t up to the challenge. I shall plan ahead next time.