CAUT Bulletin Archives
1996-2016

April 2013

Funding shortfall threatens HPC

By James Wadsley
High performance computing (HPC) has become an indispensable tool for research in all fields. Faculty members and research teams in universities, colleges and research hospitals across the country now rely on computers to sift through increasing amounts of data and run simulations with remarkable fidelity that complement, and sometimes exceed, what can be done with experiments.

Over the past decade, Canada built up a national network of high-end research computing facilities and staff to support their use. Four years ago we had systems peaking in the top 20 in the world. Now Canada’s most recent foray into internationally competitive research computing is about to crash disastrously.

At a time when most countries are making sustained, long-term investments in computing infrastructure to support research and innovation, Canada is set to pull the plug.

HPC is computing that significantly exceeds what you can do on your desktop or laptop. Canada’s current top academic system, GPC (the General Purpose Cluster at the University of Toronto’s SciNet), first available in 2009, was then number 16 in the world and equal to about 30,000 desktops. But computers depreciate rapidly. New systems are typically twice as fast as those of the previous year.

GPC is now number 94 and will fall out of the top 500 within a couple of years, at which point it will not be cost-effective to keep it running.

Big systems also require two to three years for planning and installation. GPC resulted from a special 2006 Canada Foundation for Innovation grant competition: the National Platforms Fund. However, no new funding has been announced, so we cannot expect new systems before 2016. By then, Canada will be a research computing wasteland.

Some fields have a long history with computing, such as computer science, astrophysics, weather and climate modelling, and engineering. Computational fluid dynamics has now reached a point where companies like Bombardier and Pratt & Whitney are massively dependent on HPC and much of their design is done virtually rather than in wind tunnels. Computational modelling is now everywhere, including medicine (e.g. epidemiology such as models of Swine Flu H1N1), chemistry, pharmaceuticals, business (e.g. finance) and biology (e.g. protein folding and molecular dynamics).

In recent years there has also been a massive increase in digital experimental data so that applied scientists and humanists now depend on HPC to process it. Canadian researchers are dealing with diverse datasets from sources such as medical and neuro-imaging (e.g. CBRAIN), the Large Hadron Collider (ATLAS), the Sudbury Neutrino Observatory, the Ontario Cancer Biomarker Network, Genome Canada, GeoChronos (Earth observations), telescope arrays (SKA and ALMA) and the Digital Humanities.

“Computationally intensive methods abound in the humanities and social sciences (think natural language processing and ‘big data’ applications in economics),” says Jeff Racine, an economics professor at McMaster University. “Academics in countries lacking cutting edge compute facilities will be relegated to the back of the pack, while active researchers will migrate to computationally greener pastures.”

Canada’s most recent period of high performance computing investment began around 1995 with the incorporation of an advocacy organization, c3.ca, by university researchers and the National Research Council. In 1996 a Major Facilities Access grant from the Natural Sciences and Engineering Research Council provided for technical staff support. The creation of the Canada Foundation for Innovation in 1997 provided a mechanism to fund large scale academic computing. This led to the formation of several regional computing consortia and HPC systems being installed across Canada.

In 2005, the National Research Council, c3.ca, CANARIE, the Canada Foundation for Innovation and the granting councils supported the development of a long-range plan for HPC in Canada (computecanada.ca/cc_files/publications/lrp/LRP.pdf). The plan provided a compelling case for HPC investment that was favourably received by politicians across the spectrum. This led the Canada Foundation for Innovation to institute a National Platforms Fund competition in 2006. A new national organization, Compute Canada, was created to oversee a national network of systems, available to all Canadian researchers and matched to their needs.

During this period, funding for research computing across Canada was a factor of three below the average of other G8 countries. But even this modest investment was a huge improvement over the decades before. Our current standing versus our G8 peers is shown in the accompanying graph. In this graph a value of 1 is the G8 average over the past five years which corresponds to about 0.01% of GDP spent on HPC.

However, Canada’s systems are now old. Most have fallen out of warranty, and as parts fail they can no longer be replaced. The computing centres continue to run, training early career researchers and working with research groups to help them run bigger computations on more data, but the researchers are then left to run on systems which are stagnant or failing.

This has happened before: Canada’s first “electronic computer” in 1952, FERUT, was never properly used; a large Cyber 205 computer was installed in Calgary in 1985, only to be taken down in 1991; the Ontario Centre for Large Scale Computing in Toronto had a computer installed in 1988, only to have the entire centre disbanded in 1992; a large Fujitsu computer, installed in Calgary by 1994 with a combined commercial and research mandate, was gone by 1996.

Canada’s history in large-scale research computing has been one of investment, followed by optimism, followed by abandonment. But today, with simulations and data sets growing in both size and importance, abandonment would be especially damaging.

Richard Peltier, climate researcher and director of Toronto’s Centre for Global Change Science, summed it up, “Access to state-of-the-art HPC infrastructure was once considered a research luxury. It has become a must-have capability that underpins research competitiveness in an ever-widening range of disciplines. In Canada, the progress achieved over the past 10 years in developing a national HPC platform is in serious danger of being reversed through inattention of the funding bodies responsible for investment in major components of research infrastructure.”

A major impact will be on our students and graduate students. Most students find careers outside academia. The growth areas are increasingly in compute intensive industries from aerospace (e.g. Bombardier) to finance. Peltier notes: “If we are to continue to be successful in the hunt for and development of the Highly Qualified Persons that form the nucleus of Canadian innovation we must have the means to compete.”

A recent development that is compounding the problem this time is the curtailing of programs like NSERC’s Research Tools and Instruments grants used to fund local compute clusters. Canada Foundation for Innovation matching requirements also act to soak up provincial funds. Other nations separately fund computer systems for research groups and universities, maintaining a flexible base of computing infrastructure. Instead, both the Canada Foundation for Innovation and the cash-strapped granting councils are redirecting computing requests to Compute Canada without providing new funds.

In recent years the Canada Foundation for Innovation has sought to reorganize rather than reinvest. At the foundation’s request, Canada’s research institutions set up a new, incorporated Compute Canada last fall. Dr. Bill Appelbe, its first CEO, started in January of 2013 and is rapidly getting up to speed: “I travelled across Canada and I’ve been struck by both the enthusiasm and dedication of staff, and the close engagement of the research community. Nevertheless much of the HPC infrastructure in Canada is ageing and no longer internationally competitive. Thus it is an urgent priority of Compute Canada to work with research communities to develop a strong and sustained national advocacy program for new investment in HPC infrastructure.”

The only way we can ensure things get better is to get involved and make the case. My colleagues within the Canadian Astronomical Society and I have estimated our community’s projected needs for both simulations and data processing. We already know that Canada will be unable to meet even the minimum data storage and processing requirements related to its telescope investments.

All disciplines will encounter this bottleneck as Canada’s HPC capability crashes over the next few years. It is imperative that academic and professional societies come together and present a united front on the need for immediate and sustained HPC investment in Canada. A direct approach is to work with Compute Canada to lobby governments. The timing is critical and we will be in Ottawa making the case over the next few months.

Over the longer term, Compute Canada will set up a research advisory committee to provide direct representation for researchers from across Canada and hire a vice-president for research. You can find out how you and your disciplinary society can get involved at info@computecanada.ca. You can also talk to your local research computing consortium: ACEnet (www.ace-net.ca) in Atlantic Canada; Calcul Québec (www.calculquebec.ca); SciNet (www.scinethpc.ca), SHARCNET (www.sharcnet.ca) or HPCVL (www.hpcvl.org) in Ontario; and WestGrid (www.westgrid.ca) in the western provinces.

With ever more data coming, with our students graduating into a world where computing is increasingly important, and with our international colleagues having easy access to the very latest computing technology, Canada can’t afford to have research computing abandoned yet again by the funding agencies.

---------------------------------------------------------------
James Wadsley is an associate professor of physics and astronomy at McMaster University and chair of the computing, data and networks committee of the Canadian Astronomical Society.

The views expressed are those of the author and not necessarily those of CAUT.
