RESEARCH AND INNOVATION AT THE UNIVERSITY OF TORONTO
Summer 2012 · VOL.14, NO.1
How do political parties influence you?
And how does Chris Cochrane figure that out? by Paul Fraumeni

 

When U.S. President Barack Obama declared his support for same-sex marriage on May 9 of this year, the Twitterverse went into overdrive. Social media analysts reported that Twitter saw 1.6 million #gaymarriage tweets immediately after Obama’s announcement. The pace peaked at 7,347 tweets per minute that afternoon.

That’s a lot of talk about an issue that could well have a tangible impact on the 2012 U.S. presidential campaign.

But how do political scientists like U of T Scarborough professor Chris Cochrane make sense of the actual messages being articulated in this wave of opinion?

Welcome to political analysis in the age of big data.

“Computing has changed the nature of research in any discipline and it’s making a huge impact on how we keep track of what’s going on in politics,” says Cochrane, who specializes in the left-right divide and how that plays out with issues such as abortion, capital punishment and same-sex marriage.

One of Cochrane’s key interests is in how political parties get their manifestos into the minds of the broad population and then into the conversations that influence how people vote. “Twitter has become an excellent source of data about those political conversations. It’s just staggering how much data is available. Analyzing it and then storing it requires computational power that wouldn’t have been possible only a few years ago.”

The power of high-performance computing has also enabled researchers to “expand the horizons of what we are able to investigate.” Cochrane points to a project, underway for a number of years, that has gathered and analyzed the election platforms of every political party from every democratic country since World War II. “The sheer amount of data this kind of a project generates and the extent of coordination it demands wouldn’t be possible without contemporary computing. And this analysis is providing us with superb detail on politics worldwide.”

Cochrane notes that modern content analysis software enables political analysts to look at surveys or masses of tweets and determine the zeitgeist on a certain subject by examining such factors as how frequently certain words appear or how words are used in combination with one another.
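As a rough illustration of the kind of analysis Cochrane describes, the Python sketch below counts word frequencies and within-tweet word pairings across a handful of invented tweets. It is a toy stand-in for specialized content analysis software, not the tools his team uses.

```python
from collections import Counter
from itertools import combinations
import re

# Invented tweets standing in for a harvested #gaymarriage stream.
tweets = [
    "Proud of the president for supporting marriage equality today",
    "Marriage should be defined by the states, not the president",
    "Equality for all families, a historic day",
]

def tokenize(text):
    """Lower-case a tweet and keep only simple word tokens."""
    return re.findall(r"[a-z']+", text.lower())

word_counts = Counter()
pair_counts = Counter()

for tweet in tweets:
    words = set(tokenize(tweet))
    word_counts.update(words)
    # Count each unordered pair of distinct words used in the same tweet.
    pair_counts.update(combinations(sorted(words), 2))

print(word_counts.most_common(5))  # most frequent words overall
print(pair_counts.most_common(5))  # most frequent word combinations
```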

Still, as powerful as computers can be, at a certain point in political analysis the human element is needed to look at the subtleties computers can’t detect. “There’s a term, ‘dog whistle politics,’ which is about words that indicate meaning to political party supporters but don’t actually say anything. These are difficult to pick up using computing. But it’s tedious and expensive to hire people to do this work.”

Cochrane’s current work is focused on making it easier to nail those subtleties. “Professional survey software enables us to ask questions which structure their answers in very particular directions and that I can code into the categories I want to look at with comparable answers from any number of people. This is called ‘crowdsourcing.’ And it wouldn’t be possible without modern computing technology.”
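To picture how answers coded by many people can be pulled together, here is a minimal Python sketch that combines several crowd workers’ category labels by majority vote. The responses, categories and agreement measure are invented for illustration and are not taken from Cochrane’s survey software.

```python
from collections import Counter

# Invented example: several crowd workers code the same open-ended
# survey answer into one of a few pre-defined categories.
codings = {
    "response_17": ["economy", "economy", "environment", "economy"],
    "response_18": ["capital punishment", "capital punishment", "economy"],
}

def majority_label(labels):
    """Return the most common category and its share of the votes."""
    label, votes = Counter(labels).most_common(1)[0]
    return label, votes / len(labels)

for response_id, labels in codings.items():
    category, agreement = majority_label(labels)
    print(f"{response_id}: {category} (agreement {agreement:.0%})")
```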

As an example, Cochrane points to a key question in his research — do political parties actually disagree with each other on issues such as capital punishment? “Is it the case that the right and left compete because the right supports capital punishment and the left opposes it? Or do the right and left compete because they really talk about entirely different issues, such as economic growth on the right and environmental protection on the left? Computing enables me to first gather the content and store it and then have people analyze that content and understand exactly how political messaging is working.”

Inside public transit
Eric Miller’s models answer a lot of questions about one of society’s toughest challenges by Dana Yates

 

Subways or light rail transit? That question has been a source of much debate in Toronto recently. But which mode of public transportation is the best remedy for the city’s traffic troubles? University of Toronto researcher Eric Miller believes the solution can be found in computer models.

Miller is a professor in the Department of Civil Engineering and former director of the University of Toronto Cities Centre, a multidisciplinary research institute that focuses on cities and a wide range of urban policy issues. As a transportation engineer, Miller is clear about the need for computing in his field: “It’s absolutely essential.”

Indeed, computers help Miller see the proverbial forest for the trees, or in his case, the congestion for the cars. And for the record, there is a great deal of gridlock in Toronto. According to the Paris-based Organisation for Economic Co-operation and Development, traffic jams cost Toronto $3.3 billion in lost productivity each year.

“I’m interested in creating more sustainable transportation systems to benefit our economy, environment and society,” Miller says. “It’s important to experiment with models of the transportation system before we make transit decisions that will affect thousands of people’s lives and will mean billions of dollars in spending.”

Miller has developed several transportation-based computer models. The Greater Toronto Area (GTA) model, for instance, forecasts overall travel demand and various versions of the simulation are used by the municipal governments of Toronto, Mississauga and Brampton, as well as the Region of Durham. The GTA model predicts the number of trips that commuters will make, as well as where, when, by what mode and how fast they will travel.
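Travel demand models of this kind typically include a mode-choice step. As a rough illustration only, and not the GTA model itself, the Python sketch below uses a standard multinomial logit formula with invented utility parameters to split one hypothetical commute among car, subway and light rail.

```python
import math

# Hypothetical parameters, not taken from the GTA model:
# utility = mode_constant - 0.05 * travel_time_minutes - 0.3 * cost_dollars
MODE_CONSTANTS = {"car": 0.0, "subway": -0.4, "light_rail": -0.5}

def mode_shares(options):
    """Multinomial logit: probability of choosing each mode from its utility."""
    utilities = {
        mode: MODE_CONSTANTS[mode] - 0.05 * time_min - 0.3 * cost
        for mode, (time_min, cost) in options.items()
    }
    total = sum(math.exp(u) for u in utilities.values())
    return {mode: math.exp(u) / total for mode, u in utilities.items()}

# One hypothetical trip: (travel time in minutes, out-of-pocket cost in dollars).
trip = {"car": (40, 8.00), "subway": (50, 3.25), "light_rail": (55, 3.25)}
for mode, share in mode_shares(trip).items():
    print(f"{mode}: {share:.1%}")
```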

The model also demonstrates how different types of public transit will affect the entire transportation system. That information, in fact, formed the basis of a March 2012 expert panel report that concluded light rail transit is a better option for Toronto’s Sheppard Avenue than an expansion of the existing subway line. The panel, which included Miller, was established by Toronto’s city council.

But Miller’s models don’t just offer a big-picture perspective of transportation. He has also developed programs that offer more nuanced views. For example, his “agent-based micro-simulation” models use population survey data, including age and household income, to make statistical estimates about the behaviour of individuals and families within a region. That to-ing and fro-ing may include the daily commute to work or the pre-dinner dash to the grocery store.
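The flavour of agent-based micro-simulation can be conveyed with a toy example. The sketch below, with invented attributes and rule-of-thumb probabilities, generates a day’s trips for a few synthetic people; it illustrates the general approach rather than Miller’s actual models.

```python
import random

random.seed(1)  # make the toy simulation repeatable

# Invented synthetic population built from survey-style attributes.
people = [
    {"id": 1, "age": 34, "employed": True,  "household_income": 72000},
    {"id": 2, "age": 67, "employed": False, "household_income": 41000},
    {"id": 3, "age": 19, "employed": True,  "household_income": 72000},
]

def simulate_daily_trips(person):
    """Very rough rule-of-thumb trip generation for one agent."""
    trips = []
    if person["employed"]:
        trips.append("commute to work")
    # In this toy model, higher-income households run more discretionary errands.
    errand_chance = 0.3 + min(person["household_income"] / 200000, 0.4)
    if random.random() < errand_chance:
        trips.append("grocery run before dinner")
    return trips

for person in people:
    print(person["id"], simulate_daily_trips(person))
```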

Currently, Miller is developing general software that will serve as the framework for future transportation models. This research, in turn, will make it easier to create and adjust complex, user-friendly modelling systems.

“These models inform us of the costs and benefits of our decisions and for that reason, they need to be a voice at the policy-making table,” says Miller. “Transit decisions will always be very political, but we need to think them through systematically using the best evidence available.”

Picturing the past
Gary Crawford is using powerful computers to track and display ancient artifacts by Dana Yates

 

Computing and archaeology are unrelated. Or are they? Could an invention of the modern age actually benefit a field that focuses on the past? The answer is yes, says a University of Toronto Mississauga (UTM) researcher. And he’s proving it, artifact by artifact.

Gary Crawford, a professor in the Department of Anthropology, is interested in the long-gone settlements of East Asia. And whereas some archaeologists hunt for tools and pottery to understand past civilizations, Crawford looks for fruits, seeds and grains.

Crawford studies the relationships between people and the plants they once consumed, cultivated and collected. He sees more in charred millet seeds, for example, than the average person. Specifically, he pictures how well and whether communities sustained themselves and their ecosystems millennia ago. An important component of his research is exploring how agriculture developed.

“Plant remains tell us about life at various times,” says Crawford. “We gain an ecological view of human settlements and can discern what the plants were used for or how plants responded to human intervention. Was the agricultural system sustainable? How did it begin and how did it evolve?”

Crawford has learned of societies that successfully balanced population growth with the need to produce food. He points to China as an example. Between 3,500 and 10,000 years ago, the country underwent rapid urbanization, yet its agricultural system kept pace with the changes, making essential adjustments over millennia.

Studying such agrarian systems, however, requires careful record-keeping, and Crawford, like many archaeologists, relies on computers to manage complex data. Computers are used to document where and when an object was found, its age, its use and its possible connections to other items. Without this vital information, artifacts have no context, a problem with which Crawford is all too familiar.
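As a rough sketch of the kind of record a computer catalogue might hold, the Python example below defines a hypothetical artifact entry. The field names and sample identifiers are invented for illustration and are not drawn from Crawford’s or UTM’s actual databases.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ArtifactRecord:
    """One catalogue entry: where and when an object was found, plus its context."""
    catalogue_id: str
    site: str
    excavation_unit: str                   # e.g. grid square or stratum
    date_recovered: str                    # date of excavation (ISO format)
    estimated_age_years: Optional[int] = None
    interpreted_use: Optional[str] = None
    related_ids: List[str] = field(default_factory=list)  # associated finds

# Invented example entry; the identifiers are not real catalogue numbers.
sherd = ArtifactRecord(
    catalogue_id="YAGI-0412",
    site="Yagi",
    excavation_unit="Trench 3, Layer II",
    date_recovered="1981-07-14",
    estimated_age_years=4000,
    interpreted_use="cooking vessel fragment",
    related_ids=["YAGI-0413"],
)
print(sherd)
```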

In 2002, a fire in a Japanese archaeology centre resulted in the loss of 80,000 artifacts and destroyed many of the archival records of the Yagi Site Collection, a set of rare, ancient artifacts that was retrieved in the early 1980s by a team that included Crawford. Now, UTM is the keeper of the world’s only subset of the Yagi Collection that still has documentation available.

But computers do more than keep track of archaeological discoveries — they can also bring them to life. With support from the Henry Luce Foundation, and through a partnership with Arius 3D Inc., Crawford, the UTM Library and Department of Anthropology have created full-colour, 3-D images of objects from two of U of T’s teaching and research collections, including the Yagi Collection. Now students, researchers and the public can view exact digital replicas of many Yagi pottery fragments online (see http://library.utm.utoronto.ca/faculty/yagi-gallery).

It’s important to provide online access to the world’s archaeological findings, says Crawford. First, it protects irreplaceable artifacts from further deterioration. Second, it enables the public to see more objects and to view them in greater detail than could possibly be achieved in museums.

“Finally, we are protecting cultural heritage, the legacy of a nation or culture, so that its heritage can continue to be meaningful in the future,” Crawford says. “Archaeology isn’t just an intellectual pursuit. It’s part of a nation’s identity. It’s part of how we see ourselves.”


 

