Some good news about the ozone layer
Kim Strong gets at the roots of climate change
by Paul Fraumeni
Let’s begin with something positive — over the next 20 years or so, the global ozone layer should recover to the state it was in before the infamous Antarctic ozone hole was first sighted in the 1980s. That’s assuming we humans continue to phase out the harmful chlorofluorocarbons (CFCs) that created the ozone hole in the first place.
How are scientists able to predict this recovery?
In the same way that they were able to detect the depletion of the ozone layer — with spectrometers and computers.
Over the past 15 years, U of T physicist Kim Strong has built a reputation as a leading innovator in the use of spectrometers to measure trace gases, including CFCs, in the atmosphere. “Specifically, we measure the absorption or emission of light in the atmosphere by gases. By looking at the strength of the absorption or emission, we can then work out the concentrations of the gases.”
And it’s the chlorine released from CFCs that has caused the depletion of the ozone layer, nature’s protective shield that keeps the sun’s harmful ultraviolet rays from burning up life on Earth. The ozone layer protects humans against such harmful effects as skin cancer and cataracts, while also preventing UV radiation from damaging ecosystems and slowing the degradation of materials used in our physical infrastructure.
Meanwhile, the warming climate is melting ice at the poles, changing everything from the loss of polar bears’ natural home on the Arctic ice to a possible eventual rise in sea levels and flooding of low-lying locations such as Manhattan. Climate change is also linked to the recovery of ozone, and unravelling the interactions between ozone and climate is a topic of considerable interest in the global scientific community.
Strong and her colleagues use a variety of spectrometers in various settings — at the Atmospheric Observatory on U of T’s downtown St. George campus and 4,000 kilometres to the north at the Polar Environment Atmospheric Research Laboratory at Eureka, Nunavut.
Her team has also put spectrometers on balloons that have traversed much of the atmosphere above the Earth (a recent grant to physics professor Kaley Walker will enable her and Strong to resume the balloon program over the next few years). And, finally, Strong utilizes spectrometer measurements of the atmosphere from satellites, which provide a global picture of atmospheric composition.
But how do the scientists control their spectrometers?
“Computers are essential to our measurement research. Some of our spectrometers at Eureka, for example, are fully automated. So we can reset the commands for what we need the instrument to do by way of a computer in Toronto.”
The Strong team also relies on computing to analyze the vast amounts of data the spectrometers measure over spans of many years. “When we use these instruments, it’s not like taking a small sample and doing a chemical analysis. We’re looking at radiation from the sun and atmosphere and how it interacts with gases. This is complex physics and you need a good model to see how the interaction happens. Computers are essential for this level of complexity. You can’t use a paper and pencil to retrieve gas concentrations from atmospheric spectra.”
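The principle behind such a retrieval can be sketched with the Beer-Lambert law, which relates how much light a gas absorbs to how much of the gas the light passed through. This is only a toy illustration of the idea, not the team's retrieval code; the real problem involves full radiative transfer models, and the cross-section value here is invented:

```python
import math

def retrieve_column(i_observed, i_reference, cross_section):
    """Invert the Beer-Lambert law, I = I0 * exp(-sigma * N),
    to recover a gas column amount N from measured intensities."""
    optical_depth = math.log(i_reference / i_observed)
    return optical_depth / cross_section

# Illustrative numbers only: a 10% dip in sunlight at one wavelength,
# with a made-up absorption cross-section of 1e-19 cm^2 per molecule.
column = retrieve_column(i_observed=0.9, i_reference=1.0,
                         cross_section=1e-19)
```

A real retrieval fits many wavelengths at once and accounts for temperature, pressure and overlapping absorbers, which is where the computing power comes in.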
And computers enable the scientists in Strong’s lab to interpret their measurements. The pollutants the Strong team analyzes come from all over the world, borne on winds that might take carbon monoxide from a forest fire in Siberia or a coal-fired plant in Ohio to the various levels of the atmosphere. “So, you need more than just measurements in Toronto or Eureka. That’s when it’s valuable to compare measurements with simulations from various models. You need powerful computing to do that.”
Back to that ozone recovery. While scientists predict that it will happen during the 21st century, the road there may be bumpy. Last year, for example, Arctic ozone depletion was greater than ever before observed. While this was something of an anomaly, resulting from an unusual mix of conditions, Strong notes that continued atmospheric measurement of gases remains essential for understanding the forces acting on the ozone layer and monitoring its recovery.
“We’re not out of the woods yet. The atmosphere can surprise us sometimes. That’s why I want to be doing these measurements using the instruments and computing power we have. I want to know if the atmosphere is going to do what we expect it to do — or is it going to do something different? If we’re not watching, how will we know?”
Our fragile ecosystem needs help
And with a computer, Benjamin Gilbert is doing his part
by Dana Yates
Global warming is well-known for its effect on the climate. But it also poses a threat to the world’s ecosystems. University of Toronto researcher Benjamin Gilbert wants to know more about that process.
Gilbert, an assistant professor in the Department of Ecology and Evolutionary Biology, examines the factors that enable species to co-exist in certain regions. “This has great importance in the context of climate change, acid rain and the arrival of invasive species,” he says. “We’re trying to get a handle on how things work and what is happening.”
To understand biodiversity, however, one must go to diverse areas. That’s because the mechanisms that drive one ecosystem may be unique to it or may offer important lessons about other ecological communities. So Gilbert has travelled from northern boreal regions to the southern tropics, studying organisms as varied as tree seedlings and mosquito larvae. He has looked at environmental adaptation, the interaction of species and the spatial processes that promote or reduce diversity.
Studying complex ecosystems, though, requires sophisticated data management and processing technologies. And for that reason, computing plays a critical role in Gilbert’s research. In fact, without the aid of statistical models, he could very well spend all his time sorting data.
For example, in an ongoing project involving Jonathan Levine of the University of California, Santa Barbara, Gilbert is quantifying the effect of invasive grasses on native annual plants. Following a biological invasion, native species often experience reduced numbers and are forced to take refuge in small, sub-optimal areas. Using models and simulations, the researchers are generating estimates about whether these environmental changes will eventually lead to the extinction of native flora and, if so, how long it may take.
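The flavour of such a simulation can be conveyed with a toy population model that projects a declining species year by year until it disappears. This is only a sketch of the approach, not the researchers' actual model, and every number in it is invented:

```python
import random

def years_to_extinction(n0, growth_rate, variability,
                        max_years=2000, seed=1):
    """Project a population forward one year at a time until it
    falls below one individual; return the year of extinction,
    or None if the population survives the whole projection."""
    rng = random.Random(seed)
    n = n0
    for year in range(1, max_years + 1):
        # Each year the population multiplies by a noisy growth rate.
        n *= rng.gauss(growth_rate, variability)
        if n < 1:
            return year
    return None

# Illustrative only: 500 plants declining about 1% per year on
# average, with some year-to-year variability.
projected = years_to_extinction(n0=500, growth_rate=0.99,
                                variability=0.02)
```

Real models layer in refuge habitat, competition with the invader and many replicate runs, but the logic is the same: simulate forward and ask when, if ever, the native species winks out.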
In some research projects, Gilbert is able to compare the computer’s forecasts to actual data as a means of determining whether the models are accurate. But that’s not always possible, especially when the simulations suggest that a species will become extinct in 600 to 800 years.
“Without any way to test or verify that information in the short term, you have to take it with a grain of salt,” Gilbert says. “But it does provide insight into what species we should be keeping an eye on, and gives us a testable hypothesis for the future.”
Is there hope for native species? Gilbert believes so. To that end, he would like his work to lead to more careful monitoring of species at risk of extinction. And, as a result of the increased scrutiny, governments may be prompted to take action. That could mean applying herbicides to invasive species or creating tree corridors to connect the fragmented, vulnerable forests that are left behind by fires, logging and agriculture.
“There are many ideas about how we can protect native species by reducing invaders or supporting fragile ecosystems. But without computer models, we have no way of predicting how well these interventions should work. We are beginning to use these models to make testable predictions — this will allow us to better apply the scientific method to conservation.”
Canada’s supercomputing pioneer
Dick Peltier is a climate modeller extraordinaire
by Jenny Hall
U of T physicist Dick Peltier is an acclaimed scientist who has spent his career as a tireless advocate for high performance computing (HPC) infrastructure. He is known for his research on the dynamics of Earth’s interior and ice-age surface processes, including climate modelling. His work on past climate states illustrates the way in which our climate has evolved over the past 750 million years, and suggests what global warming has in store for us. He has been celebrated with a host of accolades including the 2011 Gerhard Herzberg Gold Medal for Science and Engineering, the 2010 Bower Award and Prize and the 2004 Vetlesen Prize.
You’re known as an advocate of high performance computing at U of T and across Canada. What is the history of your interest?
I became vitally interested in this area as a postdoctoral fellow, working at the National Centre for Atmospheric Research in Boulder, Colorado, where they had one of the first HPC machines. When I returned to U of T as a faculty member in 1973, I began advocating strongly for the university to invest in work of this kind. It was already clear then that so many aspects of science and engineering were going to be strongly influenced by computing capability.
I was one of those responsible for starting up the Ontario Centre for Large-Scale Scientific Computation (OCLSC). We began that centre by purchasing a CRAY vector computer that had previously been employed at one of the U.S. energy laboratories. We installed it on the top floor of the north wing of the physics department, where the university’s main machine room is still located. Later, on two occasions, I applied to the Canada Foundation for Innovation on behalf of many groups in the physical sciences at the university to acquire successive sets of three computers.
And today we have SciNet.
Yes, SciNet is the result of a group of scientists from these different research areas coming together to build a single facility that could be shared by the community as a whole. We’ve installed two major computing systems in the SciNet data centre and we’re about to install a third system, a BlueGene/Q system, which will come to us through the Southern Ontario Smart Computing and Innovation Platform (SOSCIP). (Editor’s note — see pages 6–7 for a feature on SOSCIP.)
Why is HPC important for research?
There isn’t a quantitative field of science or engineering that hasn’t been hugely impacted by our increasing ability to calculate the evolution of extremely complicated systems. We used to speak of science as involving a two-legged stool, consisting of theory and experiment. The third leg of the stool, the one that really stabilizes it, is computational science.
Is HPC important outside universities?
The government clearly realizes that HPC is going to be a mainstay for Canada’s leadership position going forward. One of the main issues we’re working on now is how to engage the private sector — in particular small- and medium-sized enterprises (SMEs) — in high performance computing so that individual companies can improve their competitive position.
SMEs wouldn’t normally have access to HPC facilities?
That’s right. They’re increasingly becoming aware of how much they could accelerate their product-to-market cycle with HPC tools.
What role has HPC played in your own research?
In one particular application my students and I are using high performance computing capability to regionalize global warming projections in order to provide the kind of detailed information about specific regions that is required before policy makers can decide what needs to be done.
We have major papers that are about to appear, for example, on the impact of the Great Lakes system on climate change futures for Ontario. To do this kind of work you have to start out with a global model of climate change projection. In the next step we take these global predictions and downscale them for a particular region, in this case the Great Lakes Basin. This requires embedding very high resolution models inside the global model.
Even though we might think the Great Lakes cover a very large area, the global models of climate change are of such a low spatial resolution that they effectively don’t “see” the lakes at all. The same thing is true for other regions including the mountains of western Canada, and the northeastern Arctic and the Queen Elizabeth archipelago. All of these regions are making an enormous contribution to their own climate futures but to understand these contributions requires very high resolution climate change projections.
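A toy calculation shows why a coarse model can fail to "see" a lake at all: averaging fine-scale land and water into big grid cells makes a small water body vanish. This is only a sketch of the resolution problem, not an actual climate model grid:

```python
def coarsen(grid, factor):
    """Average a fine land/water map (1 = water, 0 = land) into
    coarse cells, as a low-resolution climate model effectively does."""
    size = len(grid)
    coarse = []
    for i in range(0, size, factor):
        row = []
        for j in range(0, size, factor):
            block = [grid[a][b]
                     for a in range(i, i + factor)
                     for b in range(j, j + factor)]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

# A toy 4x4 map with one lake cell: at 4x coarser resolution the
# lake becomes a single cell that is only 1/16 water -- effectively
# land, so the model no longer "sees" any lake.
fine = [[0, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
coarse = coarsen(fine, 4)  # [[0.0625]]
```

Nesting a high-resolution regional model inside the global one restores the fine grid exactly where it matters, which is the downscaling strategy described above.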
The computer vs. the brain
Geoffrey Hinton and the ultimate technology challenge
by Patchen Barss
People tend to learn a word like this one long after they’re familiar with its meaning.
“So that’s what you call it!” we say. “I knew there had to be a word for it.” The fact that such reactions are possible says much about how people learn. It means they understand concepts, objects and feelings independently of the labels they assign them.
This phenomenon is particularly relevant for researchers like U of T computer scientist Geoffrey Hinton, who strive to create intelligent machines that can learn about and make sense of the world around them. His work in artificial intelligence has not only led to smarter machines, but it has also cemented his beliefs about how human beings learn.
“Many psychologists believe that you learn to recognize objects by someone telling you the name of that object. Then, when you later see something similar, you can assign the same label. This is called ‘supervised learning’ because someone needs to tell you the name of the object. This is a completely hopeless theory.”
Instead, our brains learn to recognize patterns, to notice when one object is similar to another, to create categories and to predict the future based on what we’ve experienced in the past. All of this can be done through “unsupervised learning,” in which the only teacher is our own experiences. It explains how a child knows that dogs are different from cats even before learning the words “dog” and “cat.”
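A classic, minimal example of unsupervised learning is k-means clustering, which groups unlabeled data into categories without any teacher naming them. This sketch only illustrates the general idea; it is not one of the neural-network methods Hinton works with:

```python
def kmeans(points, k, iters=20):
    """Group unlabeled 2-D points into k clusters. Categories
    emerge from the data itself -- no labels are ever supplied."""
    centers = list(points[:k])  # naive initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center.
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                  + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # Move each center to the mean of its assigned points.
        new_centers = []
        for i, cl in enumerate(clusters):
            if cl:
                new_centers.append((sum(p[0] for p in cl) / len(cl),
                                    sum(p[1] for p in cl) / len(cl)))
            else:
                new_centers.append(centers[i])
        centers = new_centers
    return centers

# Two obvious groups of unlabeled points; the algorithm finds both
# without ever being told what "group" either point belongs to.
data = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10), (10, 11)]
centers = kmeans(data, 2)
```

In the same spirit, a child's brain separates "dog-like" from "cat-like" experiences long before the words arrive to label the clusters.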
By building artificial neural networks with a capacity for unsupervised learning, Hinton is getting closer to the holy grail of his field: a universal learning algorithm — a simple set of rules that would allow a computer to learn with as much flexibility and self-direction as a human being. Hinton has made great progress toward smarter machines, but he knows he’s still far from his ultimate goal.
“Matching human intelligence in a machine is a long way away,” says Hinton, who won the 2010 Gerhard Herzberg Canada Gold Medal for Science and Engineering. “We can’t even do a pigeon. If we had a system as good as a pigeon we’d be ecstatic.”
That’s not to say the state of the art is primitive by any means. Hinton’s progress toward a universal learning algorithm has already contributed to a long list of applications: advances in unsupervised machine learning have led to better data compression, visual search engines, voice and character recognition software and many other pattern-sensitive applications. The advance of computing technology has contributed greatly to Hinton’s research. Hinton began working on one particular learning algorithm in 1982. Had he set it running on the computer he owned then, and left it running continuously ever since, one of his current computers could catch up to the entire computation in only 15 minutes.
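Taking that anecdote at face value, the implied hardware speedup is easy to work out; this back-of-the-envelope arithmetic is ours, not a figure from Hinton:

```python
# Rough arithmetic behind the anecdote: if roughly 30 years of
# continuous computation on a 1982 machine can be redone in
# 15 minutes today, the implied speedup is about a million-fold.
years = 2012 - 1982
minutes_then = years * 365 * 24 * 60  # ~30 years, in minutes
speedup = minutes_then / 15           # roughly 1.05 million
```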
“You need fantastic hardware to compete with the human brain and we’re getting it,” he says. “But it’s no use building great big brains in hardware if you don’t know how to make them learn.”
EDGE · SUMMER 2012 · VOL.14, NO.1