RESEARCH AND INNOVATION AT THE UNIVERSITY OF TORONTO
Summer 2012 · VOL.14, NO.1
Big data, big impact
Ontario’s new $210 million supercomputing network aims at boosting research innovation, job creation and economic prosperity
By Paul Fraumeni

 

As Ontario grapples with the difficult transition from a resource-based economy to one that is driven by knowledge and high technology, government, business and university leaders are collaborating to jumpstart a new high performance computing initiative.

It’s called the Southern Ontario Smart Computing Innovation Platform — SOSCIP for short. The partners who developed the initiative are banking on its potential to take university research to a new level on key issues such as health care, water, energy and cities, as well as to create jobs and raise prosperity throughout the province.

It’s a complex program that took almost two years to design and finalize.

Building a better supercomputer
U of T researchers are working on the next generation of greener, smarter machines
By Jenny Hall


Computing performance is measured in flops (floating-point operations per second), the number of operations a system can perform every second. The first supercomputer, built in 1976, was an 80-megaflop machine, capable of performing 80 million operations per second. Modern-day desktops (and phones) are in the giga range, capable of performing billions of operations per second. Today’s most powerful supercomputers are petaflop machines, performing quadrillions of operations per second (that’s 15 zeros).
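To make those prefixes concrete, here is a minimal Python sketch, illustrative only, that compares the scales using the figures above:

```python
# Orders of magnitude in computing performance, using the figures above.
FLOPS = {
    "80 megaflops (1976 supercomputer)": 80e6,
    "1 gigaflop (modern desktop or phone)": 1e9,
    "1 petaflop (today's fastest machines)": 1e15,
    "1 exaflop (the next frontier)": 1e18,
}

exa = FLOPS["1 exaflop (the next frontier)"]
for label, flops in FLOPS.items():
    # Show each scale and how far it sits below exascale.
    print(f"{label}: {flops:.0e} ops/s, {exa / flops:,.0f}x below exascale")
```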

The next frontier in computing is what’s called “exascale.” Here we’re into quintillions of operations per second (that’s 18 zeros, or a billion billion). The problem is that we can no longer simply scale up existing technology. The major barriers are power consumption, the physical operating speeds of the components that will be required, and how to design and use an exaflop machine. A petascale computer uses about $3 million of electricity per year. Reaching exascale with today’s technology would push that annual bill to roughly $3 billion.
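The jump in the electricity bill follows directly from the prefixes: exa is a thousand times peta, so if power (and therefore cost) scaled linearly with performance, the bill would grow a thousandfold. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the electricity claim, assuming power
# (and therefore cost) scales linearly with performance.
peta_to_exa = 1e18 / 1e15              # 1,000x more operations per second
annual_cost_petascale = 3_000_000      # ~$3 million per year, per the article
annual_cost_exascale = annual_cost_petascale * peta_to_exa
print(f"${annual_cost_exascale:,.0f} per year")  # $3,000,000,000 -> ~$3 billion
```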

The exaflop frontier is especially compelling because neuroscientists believe that the computational power of a human brain is roughly equivalent to an exaflop. This means, for the first time in human history, we might be able to build a computer that could replicate the complexities of a human brain.

Bianca Schroeder

Can data centres run hotter?

Data centres and high performance computing installations involve thousands of components, and they’re energy hogs. “Data centres worldwide use one per cent of the planet’s power,” says Professor Bianca Schroeder of the Department of Computer Science and the Department of Computer and Mathematical Sciences at the University of Toronto Scarborough. “That’s more carbon emissions than all of Argentina.” A large part of the electricity bill is cooling: components that overheat will literally melt. Conventional wisdom holds that 20°C is the ideal operating temperature, but this, says Schroeder, is folk wisdom, not hard evidence. Her work, based on data from large companies like Google and from major U.S. research labs like Los Alamos, suggests that raising the temperature as high as 40°C would have little effect on reliability while requiring far less energy for cooling.
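A minimal sketch of the kind of analysis this involves: bin component failure records by operating temperature and compare failure rates across bins. The data and field names below are invented for illustration; this is not Schroeder’s actual dataset or pipeline.

```python
# Toy analysis: does failure rate climb with operating temperature?
from collections import defaultdict

records = [  # (mean operating temperature in Celsius, failed within a year?)
    (18, False), (22, False), (27, True), (31, False),
    (35, False), (38, True), (41, False), (44, True),
]

bins = defaultdict(lambda: [0, 0])  # temperature bin -> [failures, total]
for temp_c, failed in records:
    b = (temp_c // 5) * 5           # group into 5-degree bins
    bins[b][0] += failed
    bins[b][1] += 1

for b in sorted(bins):
    failures, total = bins[b]
    print(f"{b}-{b + 4} C: failure rate {failures / total:.0%} ({total} drives)")
```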

Joyce Poon

Harnessing the power of light

A large-scale computing system is built from many computer chips and components that must work together. Vast amounts of data are transmitted within and among the components to support the system’s performance. Today, data is transmitted mostly as electrical signals along metal wires within and between electronic chips. Professor Joyce Poon of the Edward S. Rogers Sr. Department of Electrical and Computer Engineering is inventing tiny photonic devices — some have features 1/500 the width of a human hair — that can be densely integrated with electronics. These devices have the potential to replace traditional electronic connections with optical ones. Exascale systems are expected to rely heavily on optical connections to shuttle large volumes of data around at very high rates in an energy-efficient manner.

Paul Chow

Reconfigurable computing

For most of the history of computing, processors kept getting faster. Around 2004, though, clock speeds hit a physical limit: processor chips were getting too hot. Manufacturers compensated by releasing multicore processors; instead of making processors faster, they started giving us more of them. However, many programs are written for single-core processors, and they don’t scale well to a multicore context. The result is that programs are no longer getting faster. Professor Paul Chow of the Edward S. Rogers Sr. Department of Electrical and Computer Engineering is working on speedups through reconfigurable computing. “The microprocessor you use,” he says, “is designed to be general. You can write any program for it. This is fine for most of us, but what if you could build hardware specially designed to solve a specific problem?”
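Why more cores don’t automatically mean faster programs can be made precise with Amdahl’s law (a standard result, not specific to Chow’s work): if only a fraction p of a program can run in parallel, n cores yield a speedup of 1 / ((1 - p) + p/n), which flattens out quickly.

```python
# Amdahl's law: the speedup from n cores when only a fraction p of a
# program can run in parallel. The serial part becomes the bottleneck.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 8, 64):
    print(f"{cores:>2} cores, 50% parallel code: {amdahl_speedup(0.5, cores):.2f}x")
# Even with 64 cores, a half-serial program tops out below 2x.
```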


 

THE PROJECT

THE PARTNERS

THE METHOD

Scientists from academia and industry will be able to work together in a collaborative model unique in Canada.
Small and medium-sized enterprises (SMEs) will be invited into the network, giving them access to university and IBM scientists and to computing facilities that wouldn’t otherwise be available to them.
IBM research staff will also be located directly at campus computing facilities with university scientists.

THE RESEARCH

cities
With 69 per cent of Canada’s population now living in urban centres, cities are stymied by budget constraints, rapidly accelerating growth, aging infrastructure and networks of unconnected, complex systems. The initiative will help channel research toward breakthroughs that address these challenges and improve our cities.


healthcare
Rising health care costs associated with chronic diseases and the lengthy development cycle for new medicines cost taxpayers millions of dollars. Duplication of information across multiple systems is rampant, often rendering deep wells of lifesaving information inaccessible. The new investments will help researchers begin to address these issues. For example, over one million people in Ontario have neurological disorders, and the impact of brain disorders on the Ontario economy is estimated at $39 billion per year. Researchers will use the new resources to tackle this challenge and accelerate brain research.


energy
Ontario is a leader in smart grid and alternative energy distribution. Continued efficiency gains in distribution will save consumers money and reduce waste: inefficient distribution can lose roughly 40 per cent of energy between source and use. The initiative intends to advance research on smart grid technologies to increase energy distribution efficiency for Ontario and Canada. New weather modeling capabilities can also help pinpoint trends for precision agriculture and other initiatives that make sun, rain, wind and plant growth patterns more predictable, increasing crop yields and overall productivity.


water
Almost nine per cent of Canada’s total area is fresh water, yet health problems related to water pollution are estimated to cost Canadians $300 million per year, and municipalities commonly lose about 20 per cent of their water supply to infrastructure leaks. By automating and more efficiently managing city services around water use, municipalities can compare the volume of sewage leaving homes and businesses against the volume of water coming in from groundwater or rainfall; problem areas can then be targeted for detailed inspection. Reducing water loss and sewer inflow helps keep consumer rates as low as possible.
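As a rough illustration of the comparison described above, the sketch below flags districts where measured sewage volume diverges sharply from supplied water plus estimated inflow. All district names and figures are invented.

```python
# Toy inflow/outflow comparison: flag districts for detailed inspection
# when sewage out diverges sharply from water supplied plus inflow.
districts = {
    # name: (water supplied, rain/groundwater inflow, sewage out), m^3/day
    "north":   (1000, 200, 1150),
    "harbour": (800, 150, 1400),   # suspicious: far more out than in
    "west":    (1200, 250, 1500),
}

for name, (supplied, inflow, sewage_out) in districts.items():
    expected = supplied + inflow
    excess = (sewage_out - expected) / expected
    flag = " <- inspect" if abs(excess) > 0.25 else ""
    print(f"{name}: sewage {excess:+.0%} vs expected{flag}")
```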


computing
Through SOSCIP, researchers will focus on software modeling innovation for exascale and high performance computing platforms to help strengthen Canada’s digital infrastructure and digital advantage. Faster high performance computing systems will harness massive volumes of data to solve critical challenges in our cities, water, energy and health care more quickly. The platform will also allow new advances from IBM’s global research team to be tested in Canada.


 
