Cross-sectoral input for the potential role of science in Canada’s environmental assessment

Publication: FACETS
7 May 2018

Abstract

Since being elected in 2015, Canada’s federal Liberal government has taken steps to overhaul major environment-related laws and policies, including federal environmental assessment (EA) and regulatory processes. During 2016–2017, a government-appointed panel toured Canada and received >1000 suggestions from diverse sectors of society regarding EA reform. Yet, different sectors of society may have different views concerning scientific components of EA. We analyzed written submissions during public consultation (categorized into five sectors) regarding five key scientific components of EA: (1) openly sharing information, (2) evaluating cumulative effects, (3) scientific rigour, (4) transparency in decision-making, and (5) independence between regulators and proponents. On the whole, submissions from Indigenous groups, non-governmental organizations, and individuals/academics supported strengthening all five components. In contrast, most contributions from industry/industry associations, and, to a lesser extent, government bodies or agencies, suggested that there was no need for increased scientific rigour or increased independence. These findings indicate that there is cross-sectoral support for strengthening some scientific aspects of EA. However, the degree to which the Government of Canada strengthens the scientific rigour and independence of EA will indicate whether environmental decision-making in Canada is aligned with preferences from industry or the rest of Canada.

Introduction

The environmental assessment (EA) process is used to predict and prevent environmental harm caused by industrial and economic development (MacKinnon 2017). Often enshrined in national legislation and regulatory structures, EAs rely heavily on putative scientific input at all stages, including scoping, collecting baseline data, estimating or predicting impacts, planning mitigation measures, evaluating risk, designing and implementing monitoring programs, and reviewing relevant technical and scientific reports. Scientific inputs (inclusive of the natural, social, and health sciences) are used in the recursive activity of improving the methods of environmental review processes (Jones and Greig 1985; MacKinnon 2017).
Canada is one of many countries that have experienced a weakening of laws and policies related to the environment and scientific integrity in the last decade, including those concerning EA (Reynolds et al. 2012; Carroll et al. 2017). Some form of EA has been taking place in Canada since the 1930s (Couch et al. 1983), with formal EA legislation adopted at the federal (national) level in 1995. Although the Canadian Environmental Assessment Act 1995 (hereafter CEAA 1995) was criticized for lacking provisions for robust science (Greig and Duinker 2011), reforms made in the Canadian Environmental Assessment Act 2012 (hereafter CEAA 2012) were widely and strongly criticized by many experts (including scientists, legal scholars, and former politicians) for reducing scientific rigour, independence, and public participation (Doelle 2012; Gibson 2012; Reynolds et al. 2012; Schindler et al. 2012; Siddon et al. 2012; Hutchings and Post 2013).
In concert with criticism from the Canadian scientific community, public confidence in the federal EA process has been low since at least CEAA 2012. For example, half of Canadians felt that the federal government was doing a poor or very poor job of building confidence in how decisions are made about energy, whereas only 17% felt it was doing a good or very good job (Nanos 2017). Furthermore, 80% of Canadians agreed or somewhat agreed that there needs to be better management of cumulative effects of multiple projects (Nanos 2017). When asked in a government-led questionnaire on EA reform to rank the top three elements (of eight) that should be considered when making environmental regulatory decisions, 74% of respondents (the top-ranked element) answered “Science, facts and evidence have been used to support decisions” (Nielsen et al. 2016). The second- and third-ranked elements were “Environmental benefits/impacts have been considered” (66%) and “Expert knowledge/input has been gathered and considered” (42%). Only 25% of respondents selected “Economic benefits/impacts have been considered” as one of three top-ranked concerns.
In 2015, the Canadian government signaled intent to reform four environment-related laws, including CEAA 2012. Recently elected Prime Minister Justin Trudeau pledged that the federal government would ensure “robust oversight and thorough environmental assessments”, including that “decisions are based on science, facts, and evidence, and serve the public’s interest” (Trudeau 2015). Working towards this mandate, in 2016 a four-person expert panel was appointed to review current federal EA processes. This review included engagement with subject-matter experts, former project review panel members, and nationwide consultation with the public, Indigenous peoples, provinces and territories, and key stakeholders. The goal of the EA review was “to develop new, fair processes that are robust, incorporate scientific evidence, protect our environment, respect the rights and title of Indigenous peoples, and support economic growth” (Government of Canada 2016a).
Following four months of written, in-person, and online contributions from industry, government, Indigenous, and non-governmental organization (NGO) sectors, as well as individuals/academics not acting on behalf of particular groups or organizations, the expert panel released a report in April 2017. The panel recommended a number of changes to federal EA processes, including more meaningful engagement with Indigenous peoples at all stages of the process, a holistic focus on sustainability, and strategies to ensure evidence-based decision-making (Expert Panel Review of Environmental Assessment Processes 2017). The federal government introduced proposed EA legislation to Parliament in February 2018: the Impact Assessment Act (part of omnibus Bill C-69).
Canada’s federal government has previously solicited public input when reviewing EA legislation. In 1999, a legislatively-mandated five-year review of CEAA 1995 was launched alongside a discussion paper to establish context (Canadian Environmental Assessment Agency 1999). At this time, Sinclair and Fitzpatrick (2002) qualitatively evaluated public input received during the review and subsequently compared it to proposed legislative changes. We repeat, in part, the exercise of Sinclair and Fitzpatrick (2002), which focused on general aspects of EA review such as consultation and regulatory oversight. Here, we focus on support for science in federal EA rather than consultation processes, in part because of previously identified concerns over scientific rigour in the current EA processes.
We provide a synoptic synthesis of input for the panel on EA reform. We ask the question—to what degree do different sectors of Canada express support for or against different scientific components of EA reform? The data for our analysis were gathered from a public registry of written submissions to the expert panel during the 2016 national public consultations on EA reform. Written submissions were analyzed to determine the extent to which sectors of society expressed support for or against five key scientific components of an evidence-based approach to EA (see Materials and Methods): (1) openly sharing information; (2) evaluating cumulative effects; (3) scientific rigour; (4) transparency in decision-making; and (5) independence, e.g., between industrial proponents and government regulators. In addition, we provide social context for forthcoming changes to EA legislation, policy, and regulation in Canada. Although other components of impact assessment exist and are critical to an effective and fair process (e.g., appropriate and meaningful inclusion of Indigenous knowledge, socio-economic evaluation, and public participation), our focus relates to our collective experience regarding a scientifically rigorous approach to EA.

Materials and methods

Materials

The Canadian Government invited any interested person or party (e.g., groups, organizations, associations) to participate in the Expert Panel Review of Environmental Assessment Processes. Participation took three forms: (1) written documents (e.g., letters, reports in English or French) electronically submitted between 19 September and 23 December 2016 and posted in an online public registry between 19 October 2016 and 30 March 2017; (2) in-person presentations at public and Indigenous consultation sessions in 21 cities across Canada between 19 September and 15 December 2016; and (3) responses to an online questionnaire called ChoiceBook between 20 June and 31 August 2016 (Nielsen et al. 2016). The expert panel received 530 written submissions and 397 in-person presentations (Expert Panel Review of Environmental Assessment Processes 2017). Written submissions were not categorized by sector (A. Jacob, personal communication to CEAA, 2017). In-person presentations were categorized into five sectors: 11.6% (n = 46) government body or government agency; 32.5% (n = 129) Indigenous groups; 9.1% (n = 36) industry and industry associations; 25.7% (n = 102) NGO; and 21.2% (n = 84) individuals/academics (Expert Panel Review of Environmental Assessment Processes 2017). In addition, there were 2673 responses to the online questionnaire called ChoiceBook (Expert Panel Review of Environmental Assessment Processes 2017), with participants categorized into eight sectors, but with only proportions reported: 61% general public, 15% NGO, 15% industry, 13% academic, 12% government, 5% youth, 4% Indigenous group, and 8% other (Nielsen et al. 2016). All written submissions, in-person presentations, and questionnaire responses were voluntary contributions of individuals and parties with an interest in EA processes in Canada; as such, the views expressed therein do not represent a random or statistically representative sample of Canadians.
We restricted our analysis to written submissions for two reasons: (i) some presentation materials lacked appropriate context to determine support for or against the five scientific components (e.g., slides contained only photographs or fragments of text), and (ii) written submissions uploaded through the online portal often duplicated the speaking notes, slides, and (or) transcripts used during in-person presentations (e.g., groups or individuals submitted a written submission in addition to one representative delivering an in-person presentation), and we wished to avoid counting these twice.
Approximately one-fifth of the documents posted to the online registry were additional files submitted by the same contributor(s) (e.g., addenda, cover letters); in such cases, we grouped the documents together and counted them as one unique submission from one contributor. We also excluded written submissions authored by members of the expert panel (e.g., requests for costing information). Our total sample for analysis was 421 unique written submissions.

Key scientific components

For the five key scientific components, we drew from current best practices in scientific publishing, impact assessment, the role of environmental science and ecology in EA, and criticisms and recommendations from Canadian scientists (Schindler 1976; Beanlands and Duinker 1983; Greig and Duinker 2011; Gibson 2012; Ford et al. 2016; Jacob et al. 2016 and references therein; Moore 2016; MacKinnon 2017) regarding elements necessary for robust, defensible science and the role of science in governance. Below we briefly describe each of the five key scientific components.

Component 1—Open information

Open access to data has emerged as an important trend in modern scientific practice, including publishing. Many high-profile journals, granting agencies, and foundations now require researchers to make their data publicly available to any user in near perpetuity (McNutt 2014; Government of Canada 2016b; Nature 2016). Such commitments to “open science” can decrease scientific fraud, allow the scientific process and results to be verified and reproduced, enable integration into larger-scale studies, and provide opportunities for ongoing learning. Sharing information is critical to the modern practice of science, particularly to inform and evaluate decisions made in the public interest. For example, open science in EA would help provide baseline biodiversity data required for systematic conservation planning and to track trends in ecosystem and human health. Some jurisdictions and groups are moving to this model, such as the Alberta Biodiversity Monitoring Institute (abmi.ca/home.html), USGS Landsat imagery (landsat.usgs.gov/landsat-data-access), and many geospatial clearinghouses (e.g., the Province of British Columbia’s R package bcmaps (cran.r-project.org/web/packages/bcmaps/index.html)).
Although many EA documents are available in the Canadian Environmental Assessment Registry (ceaa.gc.ca/050/evaluations/index), the underlying empirical data, model parameters, and methods are generally either not posted or are posted with insufficient detail to facilitate validation (Ford et al. 2016). Furthermore, existing data are not sufficient for statistically rigorous evaluations of important environmental risks (e.g., pipeline failures, Belvederesi et al. 2017). Access to raw data, methods in sufficient detail for reproduction, and results in formats available for other users is typically limited. Barring certain sensitive information such as individual, family, and (or) community-held knowledge, or information with national security implications, permanent, public, and free access to the various knowledge products of environmental assessment could strengthen the credibility and rigour of the process, reducing redundant studies and the expenses associated with them.

Component 2—Evaluating cumulative effects

Cumulative effects assessment considers potential project-related impacts at multiple spatial and temporal scales, often including past, present, and planned future development. Cumulative effects began receiving serious attention in the mid-1990s (Connelly 2011). Since then, cumulative effects assessment has continued to evolve to include evaluating how proposed human activities interact with species, ecosystem processes, functions, and services, and the total effects of multiple plans, projects, and activities on a region (Sinclair et al. 2017). In principle, evaluating cumulative effects can help determine if the proposed project impacts are likely to exceed thresholds identified at larger scales (e.g., the amount of habitat fragmented, total greenhouse gas emissions, concentration of carcinogenic chemicals) and, where present, meet local, regional, and national sustainability objectives and commitments (Sinclair et al. 2017; Westwood et al. 2017).
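To make the threshold logic described above concrete, the following is a minimal sketch with entirely hypothetical numbers (not drawn from this article or any real assessment): a proposed project’s greenhouse gas emissions are added to existing and reasonably foreseeable emissions in a region, and the cumulative total is compared against a regional limit.
```python
# Minimal sketch of cumulative effects threshold logic; all values are hypothetical.
existing_projects_tco2e = [120_000, 85_000, 40_000]  # past and present activities in the region
planned_projects_tco2e = [60_000]                    # reasonably foreseeable future projects
proposed_project_tco2e = 75_000                      # the project under assessment
regional_threshold_tco2e = 350_000                   # hypothetical regional emissions limit

cumulative_total = (sum(existing_projects_tco2e)
                    + sum(planned_projects_tco2e)
                    + proposed_project_tco2e)

print(f"Cumulative total: {cumulative_total} t CO2e")
if cumulative_total > regional_threshold_tco2e:
    print("Proposed project would push the region past the threshold")
else:
    print("Cumulative total remains within the regional threshold")
```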

Component 3—Scientific rigour

Evidence is a key ingredient for making robust, sound, and credible public policy decisions (Science Integrity Project 2014), including EA. Public agencies are under heightened scrutiny to deliver more rigorous assessments of environmental change and the status of natural resources (Reed 2008; Artelle et al. 2018). Whether relating to field data collection, the identification of indicator species, or determining the likelihood of project-related impacts, EA must be underpinned by rigorously evaluating, testing, and ultimately drawing conclusions from the best available evidence (Treweek 1996). A key component of the scientific effort is to quantify the impact of a proposed project, to predict the effectiveness of proposed mitigation prior to project approval, and then to measure the effectiveness of mitigation during and after the operational lifespan of the project. Such predictive evidence requires data derived through science. Current scientific practices in an EA framework have faced criticism because of persistent knowledge gaps, uncertainty, and because they are often solely reliant on static, desk-based literature reviews or incomplete field studies, with many inferences and assumptions remaining untested or unvalidated (Greig and Duinker 2011; Ford et al. 2016; Jacob et al. 2016 and references therein; Expert Panel Review of Environmental Assessment Processes 2017; Smith et al. 2017).

Component 4—Transparent decision-making

Related to open information and scientific rigour is the issue of transparency, a component that straddles the science–policy interface. Multiple sources and types of information, including scientific evidence, are used to inform decision-making during the EA process, in addition to social, environmental, and economic project costs and benefits. Lack of trust has been cited as a weakness in current EA processes (Expert Panel on the Modernization of the National Energy Board 2017), including questions about the legitimacy of decision-making for project approvals (Expert Panel Review of Environmental Assessment Processes 2017). Given these challenges, future decisions could be explained by including an explicit justification that accounts for the information relied upon, the costs and benefits weighed, and the full implications of the decision (e.g., costs, benefits, and risks to threatened species, economy, human or ecosystem health at different scales, where known or estimated). Explaining the rationale for, and the roles that evidence played in, project approval (or rejection) may help stakeholders better understand the decision, regardless of whether they agree with it (MacKinnon 2017).

Component 5—Independence

Here, “independence” refers to the relationship between project proponents and the parties hired to prepare EA studies, as well as the relationship between proponents and the regulators and (or) decision-makers who review those studies. The current EA process is typically reliant on groups and individuals hired by, and under contract with, project proponents. In turn, these assessments are reviewed by ostensibly independent government bodies such as the National Energy Board. This system has been criticized at provincial and federal levels in Canada (Beanlands and Duinker 1983; Haddock 2015; Cleland and Gattinger 2017; Smith et al. 2017) and the United States (Caldwell et al. 1982), including recently for loss of public trust (Expert Panel on the Modernization of the National Energy Board 2017).

Methods

For each written submission, we categorized the contributor (i.e., the person(s) listed as the document author in the registry and on the document itself) into one of the five sectors used by the Expert Panel to categorize in-person presentations (see fig. 5 in Expert Panel Review of Environmental Assessment Processes 2017): (1) industry or industry associations (e.g., private consultants, industry associations, project proponents); (2) Indigenous groups (e.g., Indigenous governments, organizations, and communities, or official representatives writing on their behalf); (3) NGO; (4) individuals and academics; or (5) government bodies and agencies. If the written submission(s) did not include sufficient information to categorize the contributor, we determined the sector through an internet search. When a contributor was deemed not to be writing on behalf of their organization or group (e.g., not using official letterhead, not identifying themselves as a representative of that group, or being a retired government official), we categorized them in the individual/academic group.
The support for each of the five key scientific components of strengthened EA (Table 1, described further below) in each written contribution was coded as “yes” (affirmative, recommend for), “no” (negative, recommend against, oppose), or “N/A” (did not address or neutral). Four of the authors (ALJ, ATF, CHF, JWM) developed the rubric for how text in the written submissions would be interpreted (and coded) as expressing affirmative, negative, or neutral support for each scientific component (see Table 1 for examples) and trained the three evaluators. One author (EJS) evaluated all English submissions and another author (DG) evaluated all French submissions posted before 10 February 2017. Although that date was intended to be the original cut-off, an additional 27 written submissions were posted after this deadline; these were evaluated in English and French by a third author (ARW). To assess potential bias and the reproducibility of how the submissions were evaluated, 40 random submissions were independently interpreted and coded by a fourth author (CHF) and compared post hoc; the mean accuracy of sector categorization and coding of responses across all five components was 93% ± 2% (SE).
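As a rough illustration of the reliability check described above (a sketch, not the authors’ actual workflow), the snippet below computes percent agreement between two coders for each scientific component on a double-coded subset of submissions; the file name and column names are hypothetical.
```python
import pandas as pd

# Hypothetical file: one row per double-coded submission, with columns
# coder1_c1 ... coder1_c5 and coder2_c1 ... coder2_c5 holding "yes", "no", or "na".
codes = pd.read_csv("double_coded_subset.csv")

agreement = {}
for c in range(1, 6):
    a = codes[f"coder1_c{c}"]
    b = codes[f"coder2_c{c}"]
    agreement[c] = (a == b).mean()  # proportion of submissions on which both coders agree

mean_agreement = sum(agreement.values()) / len(agreement)
print(agreement)
print(f"Mean agreement across components: {mean_agreement:.2f}")
```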
Table 1. Questions used to interpret whether written submissions (n = 421) posted on the Expert Panel Review of Environmental Assessment Processes website supported (Yes) or did not support (No) the five key scientific components of environmental assessment.

Component 1—Open information
Question used to interpret written submissions: Does the submission recommend that all information(a) from environmental assessments be made publicly available?
Example text interpreted in support of component (Yes):
English: “Data must be open access… Monitoring approaches and methodology must be presented to allow for public evaluation of compliance”. (Environmental Planning and Assessment Caucus)
French: “Toute information pertinente doit être facilement accessible au public; elle doit être partagée entre les différents niveaux d’évaluation et rester disponible pour une utilisation future. En lien avec nos recommandations précédentes, l’information relative à un projet et son évaluation doit respecter systématiquement les critères suivants : transparence : toute l’information doit être rendue disponible au public; accessibilité : information vulgarisée et disponible systématiquement dans les deux langues officielles canadiennes; mise à disposition suffisamment longtemps à l’avance”. (Regroupement national des conseils régionaux de l’environnement du Québec)
Example text interpreted as against component (No):
English: “The proponent, its consultants and independent experts should provide all relevant information to the project team […] (e.g., a requirement that all drafts and raw data are to be shared with the joint project team, not just the proponent)”. (Tsilhqot’in National Government)
French: “[…] encouragent les promoteurs à faire une divulgation proactive, tout en permettant cependant la protection des secrets industriels, des dispositifs de sécurité et autres renseignements confidentiels. À ce sujet, nous notons que les règles de transparence établies en 2012 sont adéquates et qu’elles devraient être maintenues”. (Jean Piette, Conseil Patronal de l’Environnement du Québec)

Component 2—Cumulative effects
Question used to interpret written submissions: Does the submission recommend that there be stronger evaluation of cumulative effects?
Example text interpreted in support of component (Yes):
English: “[F]ederal assessment should… provide oversight for assessment of cumulative environmental effects for indicators that fall under federal jurisdiction”. (Alberta Energy Regulator)
French: “Le cadre actuel d’évaluation environnementale reconnait en théorie la nécessité d’évaluer les risques cumulatifs liés aux projets. En pratique toutefois, selon notre expérience, ce type d’études n’est pas nécessairement disponible. Tout au contraire, nous avons observé que les projets sont présentés à la pièce, souvent sur la base d’unités de mesure disparates qui brouillent la vue d’ensemble. Il s’agit là d’une lacune importante car une évaluation des risques cumulatifs effectuée en temps opportun pourrait tuer dans l’œuf des projets qui seraient éventuellement jugés irrecevables de toute façon à cause de ce facteur”. (Carole Dupuis, Regroupement Vigilance Hydrocarbures Québec)
Example text interpreted as against component (No):
English: “The environmental outcomes of individual projects should not be assessed against broader national goals (i.e., GHG emissions)”. (SaskPower)
French: “Finalement, en ce qui a trait aux évaluations environnementales à l’échelle régionale, la FCCQ ne croit pas que les effets cumulatifs devraient être l’élément central sur lequel une telle évaluation pourrait être lancée mais plutôt sur la portée régionale des impacts environnementaux potentiels. En effet, le promoteur ne doit pas faire les frais d’une situation particulière survenant dans une région donnée à cause d’entreprises avec qui il n’a pas affaires. De plus, une telle situation particulière peut évoluer au fil du temps (fermeture d’entreprises, meilleures technologies environnementales, etc.) et donc pénaliser le promoteur si l’on considère comme immuables ces effets cumulatifs”. (Fédération des chambres de commerce du Québec)

Component 3—Scientific rigour
Question used to interpret written submissions: Does the submission recommend that the rigour, strength, and (or) quality of scientific studies underlying environmental assessments be improved?
Example text interpreted in support of component (Yes):
English: “There is a gap between best practices in the literature and what is being applied in the field”. (Karthikeshwar Sankar)
English: “CEAA should require that post-project completion monitoring be conducted, at the highest scientific standard possible and at the expense of the proponent, to ensure accurate determination of impacts and effectiveness of implemented mitigation and (or) compensation strategies”. (Lake Babine Nation)
French: “Notamment, la complexité technique, scientifique, y compris écologique, des projets soumis, leur ampleur spatiale, le nombre de ministères et de parties impliquées, l’importance des impacts appréhendés, tous ces facteurs devraient militer pour que ce 365 jours soit un seuil minimum, et non un délai butoir. C’est la qualité des analyses qui peut être en jeu”. (Premières Nations des Pekuakamiulnuatsh, des Innus Essipit et des Innus de Nutashkuan)
Example text interpreted as against component (No):
English: “Repsol believes that the rigor incorporated into the existing CEAA process is appropriate and adequate enough to ensure that the environmental and social aspects of any given project are taken into consideration”. (Repsol)
French: No French examples.

Component 4—Transparent decision-making
Question used to interpret written submissions: Does the submission recommend that environmental assessment decisions be explicit, transparent, and clearly communicated?
Example text interpreted in support of component (Yes):
English: “There needs to be clear and explicit decision-making criteria in the Act. Specifically, there needs to clear and explicit decision-making criteria for the ‘significance’ and ‘justification’ determinations”. (Maliseet Nation of New Brunswick)
French: “Être crédible et transparent, ancré dans la science et les connaissances autochtones, pour permettre au public canadien de s’assurer de la bonne administration gouvernementale”. (Grand Conseil de la Nation Waban-Aki)
Example text interpreted as against component (No):
English: “[…] decision makers should weigh public input as they see fit”. (Christy Ngan)
French: No French examples.

Component 5—Independence
Question used to interpret written submissions: Does the submission recommend that there be greater independence, e.g., between project proponents and the preparation of the EA?
Example text interpreted in support of component (Yes):
English: “A separate independent body should have decision-making authority, as many participants spoke about the real and perceived conflict of interest concerns that arise when the regulator is also the party responsible for leading the EA”. (Abby Schwartz)
French: “Toutes les évaluations environnementales devront être effectuées par l’ACÉE ou par une organisation indépendante, partiale, transparente, et neutre qui la remplacera”. (Regroupement national des conseils régionaux de l’environnement du Québec (RNCREQ))
Example text interpreted as against component (No):
English: “Our recommendation assumes the proponent remains responsible for conducting all Environmental Assessment (EA) studies while the Responsible Authority reviews the EA pursuant to the existing CEAA 2012 process”. (Canadian Hydropower Association)
French: No French examples.

Note: Examples of responses in English and French are provided. Questions were applied post hoc to written submissions.
(a) Excluding certain sensitive information, such as individual, family, and (or) community-held Indigenous knowledge, or information with national security implications.
We summed affirmative and negative responses by sector; “N/A” responses (i.e., where the submission was deemed to not address that scientific component) were excluded from subsequent analyses (Table 2). A generalized linear model (GLM; binomial distribution with a logit link) was applied to these data using submission responses (binary Y/N) as the dependent variable and sector (categorical; 1–5) and question (categorical; 1–5) as the independent variables in SPSS v24 (IBM Corporation, Armonk, New York, USA).
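For readers who do not use SPSS, the model described above can be sketched in Python with statsmodels; this is a minimal illustration rather than the authors’ analysis, and it assumes the coded responses are available in long format with hypothetical column names support (1 = yes, 0 = no), sector, and component.
```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per submission-component pair coded Yes or No.
df = pd.read_csv("coded_submissions_long.csv")

# Binomial GLM with a logit link (the default for Binomial), with sector and
# component as categorical predictors, mirroring the model described in the text.
model = smf.glm("support ~ C(sector) + C(component)",
                data=df,
                family=sm.families.Binomial()).fit()

print(model.summary())           # coefficient estimates
print(model.wald_test_terms())   # Wald chi-square test per term, comparable to the statistics reported
```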
Table 2. Proportion and number of written submissions (n = 421) interpreted to express support for (Yes) or against (No) each of the five scientific components.

Sector                              Yes proportion (n)   No proportion (n)   N/A (n)

Component 1—Open information
Government body or agency           1.00 (4)             0.00 (0)            22
Indigenous group                    0.86 (25)            0.14 (4)            109
Individual/academic                 1.00 (37)            0.00 (0)            96
Industry or industry association    0.88 (15)            0.12 (2)            36
Non-governmental organization       1.00 (38)            0.00 (0)            32

Component 2—Cumulative effects
Government body or agency           1.00 (19)            0.00 (0)            7
Indigenous group                    1.00 (120)           0.00 (0)            18
Individual/academic                 1.00 (76)            0.00 (0)            57
Industry or industry association    0.87 (27)            0.13 (4)            22
Non-governmental organization       1.00 (58)            0.00 (0)            12

Component 3—Scientific rigour
Government body or agency           0.71 (12)            0.29 (5)            9
Indigenous group                    1.00 (74)            0.00 (0)            64
Individual/academic                 0.98 (58)            0.02 (1)            74
Industry or industry association    0.40 (12)            0.60 (18)           23
Non-governmental organization       0.98 (49)            0.02 (1)            20

Component 4—Decision-making
Government body or agency           1.00 (17)            0.00 (0)            9
Indigenous group                    1.00 (87)            0.00 (0)            51
Individual/academic                 0.98 (52)            0.02 (1)            80
Industry or industry association    0.94 (32)            0.06 (2)            19
Non-governmental organization       1.00 (55)            0.00 (0)            15

Component 5—Independence
Government body or agency           0.65 (13)            0.35 (7)            6
Indigenous group                    1.00 (97)            0.00 (0)            41
Individual/academic                 0.93 (57)            0.07 (4)            72
Industry or industry association    0.27 (8)             0.73 (22)           23
Non-governmental organization       0.98 (50)            0.02 (1)            19

Note: Proportions were calculated based on Yes and No responses only.

Results

A total of 421 unique written submissions were made to the expert panel. Of these, submissions from or on behalf of Indigenous peoples (n = 138, 32.8%) and members of the public and academics (n = 133, 31.6%) were the most common, followed by NGO (n = 69, 16.6%), industry/industry associations (n = 53, 12.6%), and government bodies/agencies (n = 26, 6.2%) (Fig. 1). Of the written submissions, 368 (87.4%) included text interpreted to directly express support for or against at least one of the five scientific components (n = 63 referred to one component, n = 93 to two, n = 63 to three, n = 90 to four, and n = 59 to five; Table 2). Here, we highlight patterns across and between the five components, overall and by each sector, focusing on submissions that explicitly expressed support for (yes) or against (no) at least one of the five components (Table 2; Fig. 2).
Fig. 1. Number of unique written submissions to the 2016 Expert Panel Review of Environmental Assessment Processes (n = 421), categorized by sector. NGO, non-governmental organization.
Fig. 2. Proportion of written submissions from each sector that were interpreted to affirm support for five key scientific components of federal environmental assessment. Here, lighter shades indicate that more written submissions were explicitly in favour of that component; darker shades indicate that more submissions from that sector explicitly did not agree with that component. Proportions were calculated based on the number of submissions that explicitly discussed that component; submissions that did not discuss that aspect (i.e., not applicable) were not included in the calculations. NGO, non-governmental organization.
When analyzed collectively (i.e., not split by sector), the majority of written submissions that were interpreted to express an opinion supported assessing cumulative effects (component 2; n = 300 out of 304, 99%), transparent decision-making (component 4; n = 243 out of 246, 99%), greater independence (component 5; n = 226 out of 260, 87%), increased scientific rigour (component 3; n = 205 out of 230, 89%), and open information (component 1; n = 119 out of 125, 95%) (Fig. 2, Table 2). In other words, the five scientific components were supported by nearly nine out of 10 written submissions that expressed an opinion.
When analyzed by sector, each of the five sectors expressed strong support for open information (86%–100%), cumulative effects (87%–100%), and transparent decision-making (94%–100%; Fig. 2, Table 2). The proportion of support for open information from industry/industry associations and Indigenous groups (88% and 86%, respectively) was slightly lower than from the other three sectors (Table 2, Fig. 2). However, the industry/industry association and government body/agency sectors differed from the other three sectors regarding increased scientific rigour and independence. First, 40% of industry/industry association and 71% of government body/agency submissions supported greater scientific rigour, whereas support among the other three sectors was near unanimous (98%–100%). Second, 27% of industry/industry association and 65% of government body/agency submissions explicitly supported increased independence, compared with 93%–100% support from the other three sectors. Sector and component had a significant influence on written submission responses (Sector: Wald χ2 = 120.00, df = 4, p < 0.001; Component: Wald χ2 = 44.31, df = 4, p < 0.001; Table S1). This indicates that the level of support for the five scientific components is associated with the contributor’s sector.
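The sector-level percentages reported above follow directly from the Yes/No counts in Table 2; as a simple arithmetic check, the sketch below reproduces the values for the two components on which sectors diverged (counts transcribed from Table 2).
```python
# Yes/No counts from Table 2 for scientific rigour (component 3) and independence (component 5).
counts = {
    ("scientific rigour", "industry/industry association"): (12, 18),
    ("scientific rigour", "government body/agency"): (12, 5),
    ("independence", "industry/industry association"): (8, 22),
    ("independence", "government body/agency"): (13, 7),
}

for (component, sector), (yes, no) in counts.items():
    support = yes / (yes + no)  # proportion of Yes among submissions expressing an opinion
    print(f"{sector:32s} {component:17s} support = {support:.2f}")
```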

Discussion

We analyzed the degree to which different sectors of society expressed support for, or discouragement of, an increased role for science in Canada’s EA regime based on the official, publicly available written submissions (421 unique submissions) to the 2016 Expert Panel Review of Environmental Assessment Processes. Our results demonstrate that, across all sectors, contributors engaging in the official federal EA reform process in Canada showed strong support for three key scientific components of EA: open access to EA-related information, evaluating cumulative effects, and transparent decision-making. For the other two components (increased scientific rigour and increased independence between project proponents, assessors, regulators, and (or) decision-makers), the stated preference of industry/industry associations was towards maintaining the status quo. We note, however, that some submissions from industry/industry associations supported increased scientific rigour and greater independence (Table 2, Fig. 2). These differences within the industrial sector may indicate issues with confidence in the decision-making process and social license, and further exploration of these issues is warranted.
Our results can be compared with two other exercises evaluating public comments about EA reform in Canada: one conducted in 2000 (a legislatively-mandated five-year review of CEAA 1995) and one conducted in 2016 (assessing results of the online questionnaire ChoiceBook associated with the same EA expert panel and public consultation that we evaluated). First, we evaluated more than twice as many unique written submissions as the exercise conducted by Sinclair and Fitzpatrick (2002), in which 26% of submissions were classified as industry, 25% as environmental NGO, 23% as NGO, 13% as public, 21% as Aboriginal, 11% as academic, 4% as consultants, and 3% as government. Key themes identified in the 2000 public consultation were improvement in public participation, Aboriginal involvement, and access to information. The latter included the use of “new technology”, a centralized registry, and all material related to EA (Sinclair and Fitzpatrick 2002). When comparing recent efforts to engage Canadians on EA reform (i.e., in 2016) with earlier efforts (i.e., 2000), it is possible that (1) interested individuals and parties are more motivated now to share their views on EA-related issues, (2) the 2016 public consultations were more effective at soliciting public participation, and (or) (3) improved available technology (e.g., online submission, raising awareness) better facilitated public participation. With the exception of improved data transparency, our key scientific components for EA were not identified as recurring themes by Sinclair and Fitzpatrick (2002); that the 2016 review showed more frequent concern about these components may reflect that perceptions of, attitudes towards, and practices in EA are changing (e.g., major technological and cultural shifts in sharing scientific information, knowledge and concern about cumulative effects like climate change) and (or) that revisions made in CEAA 2012 attracted public interest. By design, EA methods and regimes adapt to reflect changing knowledge (Jones and Greig 1985), and thus it is not surprising that the scientific concerns highlighted by many contributors in 2016 were not shared in prior decades.
Second, when evaluating the online questionnaire ChoiceBook that accompanied the 2016 EA review, Nielsen et al. (2016) found that the most common response about how to improve the federal EA process was to base “decisions on science, facts, and evidence” (31%, n = 189), followed by the importance of public participation and consultation (27%, n = 167), and the importance of credibility, independence, and lack of bias and influence (19%, n = 113). However, of the 340 respondents who answered the question “What is working well in the current federal environmental assessment process under CEAA 2012?”, only 33% felt that the consideration of science was working well. Nielsen et al. (2016) did not report on the share of comments mentioning science or scientific rigour in their summary of >2000 online questionnaire responses solicited during the government’s EA review process; we did not have access to their dataset for comparison.
Aspects of all five scientific components that we evaluated were also discussed by the 21-member Multi-Interest Advisory Committee that advised the expert panel (MIAC 2016), including representatives from national Indigenous organizations, industry associations, environmental groups, and federal government departments and agencies. For instance, making EA data accessible, including “fully disclos[ing] methods, data quality, assumptions and other pertinent considerations” and allowing independent evaluation (component 1, open information), was recommended as a means to ensure continuous learning from EA information, including monitoring during and after project operation (MIAC 2016). Another recommendation was that EA legislation should explicitly refer to considering the best available information, with policy guidance about the quality and rigour of evidence (component 3, scientific rigour) (MIAC 2016). Explicitly explaining the rationale for decisions (component 4, transparent decision-making) was recommended to enhance the credibility of EA processes (MIAC 2016). These issues are not new; environmental scientists with expertise in EA have raised concerns about aspects of the five scientific components that we evaluated, and other related issues, for more than 40 years (e.g., Schindler 1976; Caldwell et al. 1982; Beanlands and Duinker 1983).

Implications

Although the five scientific components we evaluated are supported by the literature on the scientific basis of EA (e.g., Beanlands and Duinker 1983; Duinker and Greig 2006; Greig and Duinker 2011; MacLean et al. 2015; MacKinnon 2017), different sectors may have different perspectives on science in EA. In cases of full cross-sectoral agreement, such as improving the treatment of cumulative effects and open information (Fig. 2), revisions to EA may incur low political cost. Reform of Canada’s EA processes understandably makes some people, groups, and sectors of society apprehensive; it has the potential to alter how industrial development is evaluated and regulated over large spatial scales, including hydroelectric dams, fossil fuel extraction, pipelines, transmission lines, mines, and their associated infrastructure (Gibson 2012). For instance, changes in regulatory regimes will likely affect the energy sector, which in 2010 accounted for 6.8% of Canada’s economy and 2.4% of employment (Natural Resources Canada 2016). The degree to which potential legislative reform aligns with the publicly or privately expressed perspectives of different sectors may illuminate the relative influence of different societal sectors on government legislative reform and policy-making.
Strengthening the five scientific components of EA could have a range of foreseeable impacts and considerations. Inaccessibility and poor quality of information (components 1 and 3) have been critiqued as weaknesses of Canadian EA processes for decades (e.g., Schindler 1976), including information either withheld or not delivered within appropriate timelines (Sinclair and Fitzpatrick 2002; Cizek 2014). “Open science” has the potential to confer significant national competitive advantages (Government of Canada 2014) and clarify knowledge gaps in baseline information and (or) potential environmental risk (e.g., Green et al. 2017). For the potential disclosure of sensitive information, such as traditionally- or culturally-held knowledge or intellectual property, privacy experts and community members could help develop best practices that still maximize open data (e.g., the Department of National Defence scored relatively high in scientific transparency; Magnuson-Ford and Gibbs 2014).
Although the assessment of cumulative effects (component 2) has been a legal requirement in Canada since 1995, the process has been criticized for being insufficient (Duinker and Greig 2006; Sinclair et al. 2017). Our analysis and other studies indicate that there are high levels of support for improving cumulative effects assessments: 8 in 10 Canadians agreed or somewhat agreed that Canada needs to better manage cumulative effects (Nanos 2017) and nearly half of ChoiceBook respondents felt EA processes should “completely” address Canada’s climate change commitments (Nielsen et al. 2016).
Although requiring a higher level of scientific rigour (component 3) may be perceived as placing an unnecessary regulatory or cost burden on proponents, maximizing opportunities to set national standards could reduce some costs. The final two components, transparent decision-making (component 4) and independence (component 5), both emphasize the importance of limiting bias in the decision-making process, whether that bias is related to scientific information, conflict of interest, or the credibility of the overall process and ultimate project decisions. According to the EA Expert Panel report, “One of the most critical issues identified by participants is a lack of transparency in current assessment processes, especially in decision-making” (Expert Panel Review of Environmental Assessment Processes 2017); addressing this issue could assist in achieving the stated goals of regaining public trust and introducing new, fair EA processes (Trudeau 2015).

Limitations

Although our synoptic analysis reveals key commonalities and divergences across diverse sectors of Canada, the contributors of the source material are not a random selection of the public. Recent nationwide, publicly representative surveys on natural resource issues were conducted by Nanos (2017) and the Environics Research Group (2017), and these present valuable data sources for analysis of broader public trends. The ChoiceBook survey also reached a wide audience, though it is not statistically projectable to a broader population (Nielsen et al. 2016). Indeed, it showed that most Canadians have little awareness of the EA process and do not feel sufficiently informed to judge whether it works well or needs to be modified. The sample we evaluated is not a random sample of the Canadian public, but rather self-selected contributors with enough interest in EA and its potential reform to participate in the public consultation process. Although this is a drawback in terms of understanding broad national trends, it is advantageous in that we were able to assess the opinions and recommendations of interested and engaged Canadians across sectors. The aggregated views of contributors should not be interpreted as a quantitative metric of support among Canadians; many contributions were made by a single person, whereas others had hundreds of signatories or represented government agencies or large, multi-corporation industry groups.
Furthermore, not all written submissions addressed one or more of the five key scientific components of EA (Table 2). The individuals and parties who engaged in the public consultation process regarding federal EA reform may have been motivated by other considerations (e.g., enabling public participation, the consent and rights of Indigenous peoples, triggers for assessments, timelines, socio-economic evaluation, etc.). Such other components of an effective and fair EA process were outside the scope of the present science-focused exercise and beyond our expertise to interpret, but they are very important and deserve attention and analysis.

Conclusions

From an environmental protection perspective, CEAA 2012 has been declared by experts to be “ineffective” (MacLean et al. 2015) and most Canadians believe the government’s role should be to regulate industry and provide environmental protection (Environics Research Group 2017). Our results indicate that there is full cross-sectoral support for strengthening some scientific aspects of EA—namely the treatment of cumulative effects, making scientific information more open, and improving the transparency of decision-making. Our analysis revealed recommendations regarding the scientific components of EA that spanned diverse sectors, meaning that there is clear social license to implement changes in upcoming legislative reform and accompanying policy and regulations. Yet, there is a history of actual legislative implementation differing from solicited public perspectives. For example, of the issues raised during public consultations in 1999/2000 for the five-year comprehensive review of CEAA 1995, only five of 15 were included in Bill C-19 to reform the Act (Sinclair and Fitzpatrick 2002). Here, we found that industry/industry association contributions differed from those of other Canadian sectors in that most did not support more rigorous science or greater independence between industry and decision-makers, regulators, and (or) assessors. The degree to which the Canadian government strengthens the scientific rigour and independence of EA will indicate whether environmental decision-making in Canada is aligned with the preferences of industry or of the rest of Canada. Regardless, by quantitatively evaluating publicly expressed recommendations about the role of scientific aspects of environmental impact assessment, our results indicate opportunities for the federal government to achieve its stated goals of strengthening the role of science in environmental decision-making (Trudeau 2015) while simultaneously meeting the recommendations of Canadians across a diverse spectrum of sectors and views.

Acknowledgements

Funding for this research was provided by the Liber Ero Fellowship program to ALJ and ATF. Funders had no role in the design, analysis, or reporting. We thank Isabelle Côté, Katie Gibbs, Martin Olszynski, Sarah Otto, Justina Ray, John Reynolds, anonymous reviewers, and the editor for helpful discussions and suggestions that greatly improved this research.

References

Artelle KA, Reynolds JD, Treves A, Walsh JC, Paquet PC, and Darimont CT. 2018. Hallmarks of science missing from North American wildlife management. Science Advances, 4: eaao0167.
Beanlands GE, and Duinker PN. 1983. An ecological framework for environmental impact assessment in Canada. Institute for Resource and Environmental Studies, Dalhousie University, Halifax, Nova Scotia, and Federal Environmental Assessment Review Office, Ottawa, Ontario. 128 p.
Belvederesi C, Thompson MS, and Komers PE. 2017. Canada’s federal database is inadequate for the assessment of environmental consequences of oil and gas pipeline failures. Environmental Reviews, 25(4): 415–422.
Caldwell LK, Bartlett RV, Parker DE, and Keys DL. 1982. A study of ways to improve the scientific content and methodology of environmental impact analysis. School of Public and Environmental Affairs, Indiana University, Bloomington, Indiana. 453 p.
Canadian Environmental Assessment Agency. 1999. Review of the Canadian Environmental Assessment Act: a discussion paper for public consultation. Minister of Environment, Ottawa, Ontario. 71 p. [online]: Available from publications.gc.ca/site/eng/87789/publication.html.
Carroll C, Hartl B, Goldman GT, Rohlf DJ, Treves A, Kerr JT, et al. 2017. Defending the scientific integrity of conservation-policy processes. Conservation Biology, 31: 967–975.
Cizek P. 2014. Visualising the industrial north: exploring new ways to engage and inform the public on the physical footprint and scale of very large resource extraction projects such as the Alberta tar sands open pit mines and associated pipelines. Ph.D. thesis, University of British Columbia, Vancouver, British Columbia. 491 p. [online]: Available from open.library.ubc.ca/media/download/pdf/24/1.0166032/2.
Cleland M, and Gattinger M. 2017. System under stress: energy decision-making in Canada and the need for informed reform. Institute for Science, Society, and Politics, University of Ottawa, Ottawa, Ontario [online]: Available from energyregulationquarterly.ca/articles/system-under-stress-energy-decision-making-in-canada-and-the-need-for-informed-reform#sthash.xArLlRcS.dpbs.
Connelly RB. 2011. Canadian and international EIA frameworks as they apply to cumulative effects. Environmental Impact Assessment Review, 31: 453–456.
Couch WJ, Herity JF, and Munn RE. 1983. Environmental impact assessment in Canada. In Environmental impact assessment. NATO ASI Series (Series D: Behavioural and Social Sciences). Edited by PADC Environmental Impact Assessment and Planning Unit. Springer, Dordrecht, the Netherlands. Vol. 14, pp. 41–59.
Doelle M. 2012. CEAA 2012: the end of federal EA as we know it? Journal of Environmental Law and Practice, 24: 1–17.
Duinker PN, and Greig LA. 2006. The impotence of cumulative effects assessment in Canada: ailments and ideas for redeployment. Environmental Management, 37: 153–161.
Environics Research Group. 2017. Public opinion research on natural resource issues 2017. Natural Resources Canada, Toronto, Ontario.
Expert Panel on the Modernization of the National Energy Board. 2017. Forward, together—enabling Canada’s clean, safe and secure energy future. Report of the Expert Panel on the Modernization of the National Energy Board. 90 p. [online]: Available from nrcan.gc.ca/19667.
Expert Panel Review of Environmental Assessment Processes. 2017. Building common ground: a new vision for impact assessment in Canada. The Final Report of the Expert Panel for the Review of Environmental Assessment Processes. ISBN: 978-0-660-08091-8. 120 p. [online]: Available from canada.ca/en/services/environment/conservation/assessments/environmental-reviews/environmental-assessment-processes/building-common-ground.html.
Ford AT, Coristine L, Davies K, Flockhart T, Jacob AL, Palen W, et al. 2016. Improving environmental assessment in Canada. Submission to the Expert Panel Review of Environmental Assessment Processes, 27 October. 10 p. [online]: Available from eareview-examenee.ca/view-submission/?id=1482353579.1985.
Gibson RB. 2012. In full retreat: the Canadian government’s new environmental assessment law undoes decades of progress. Impact Assessment and Project Appraisal, 30: 179–188.
Government of Canada. 2014. Seizing Canada’s moment: moving forward in science, technology and innovation. 68 p. [online]: Available from ic.gc.ca/eic/site/icgc.nsf/eng/h_07472.html.
Government of Canada. 2016a. About the review of environmental assessment processes [online]: Available from canada.ca/en/services/environment/conservation/assessments/environmental-reviews/environmental-assessment-processes.html.
Government of Canada. 2016b. Tri-agency open access policy on publications [online]: Available from science.gc.ca/default.asp?lang=En&n=F6765465-1.
Green SJ, Arbeider M, Palen WJ, Salomon AK, Sisk TD, Webster M, et al. 2017. Oil sands and the marine environment: current knowledge and future challenges. Frontiers in Ecology and the Environment, 15: 74–83.
Greig LA, and Duinker PN. 2011. A proposal for further strengthening science in environmental impact assessment in Canada. Impact Assessment and Project Appraisal, 29: 159–165.
Haddock M. 2015. Professional reliance and environmental regulation in British Columbia. Environmental Law Centre, University of Victoria, Victoria, British Columbia. 69 p. [online]: Available from elc.uvic.ca/wordpress/wp-content/uploads/2015/02/Professional-Reliance-and-Environmental-Regulation-in-BC_2015Feb9.pdf.
Hutchings JA, and Post JR. 2013. Gutting Canada’s Fisheries Act: no fishery, no fish habitat protection. Fisheries, 38: 497–501.
Jacob AL, Fox CH, Gerwing TG, Muñoz N, Pitman K, and Price M. 2016. Young researchers’ open letter to Prime Minister Trudeau. Submission to the Expert Panel Review of Environmental Assessment Processes. 2 p. [online]: Available from eareview-examenee.ca/wp-content/uploads/uploaded_files/openletter_earlycareerresearchers_dec23.pdf.
Jones ML, and Greig LA. 1985. Adaptive environmental assessment and management: a new approach to environmental impact assessment. In New directions in environmental impact assessment in Canada. Edited by V MacLaren and J Whitney. Methuen, Toronto, Ontario. pp. 21–42.
MacKinnon AJ. 2017. Implementing science in environmental assessment: a review of theory. Master’s thesis, Dalhousie University, Halifax, Nova Scotia. 180 p.
MacLean J, Doelle M, and Tollefson C. 2015. The past, present, and future of Canadian environmental law: a critical dialogue. Lakehead Law Journal, 1: 79–104 [online]: Available from ssrn.com/abstract=2708144.
Magnuson-Ford K, and Gibbs K. 2014. Can scientists speak? Evidence for Democracy; Simon Fraser University, Vancouver, British Columbia. 24 p. [online]: Available from evidencefordemocracy.ca/en/research/reports/canscientistsspeak.
McNutt M. 2014. Reproducibility. Science, 343: 229.
Moore JW. 2016. Written submission for Nanaimo, Dec 14 2016. Submission to the Expert Panel Review of Environmental Assessment Processes, 8 December. 8 p. [online]: Available from eareview-examenee.ca/view-submission/?id=1482267930.1003.
Multi-Interest Advisory Committee (MIAC). 2016. Advice to the Expert Panel reviewing environmental assessment processes from the Multi-Interest Advisory Committee. 64 p. [online]: Available from eareview-examenee.ca/view-submission/?id=1481330791.1676.
Nanos. 2017. Canadians more negative than positive about energy decision-making. University of Ottawa Positive Energy Summit. 5 p. [online]: Available from nanosresearch.com/sites/default/files/POLNAT-S15-T763.pdf.
Natural Resources Canada. 2016. Additional statistics on energy [online]: Available from nrcan.gc.ca/publications/statistics-facts/1239.
Nature. 2016. Availability of data, material and methods [online]: Available from nature.com/authors/policies/availability.html.
Nielsen, Delaney + Associates, and Publivate. 2016. Review of Canada’s environmental and regulatory process—questionnaire report (final draft). Prepared for the Government of Canada, 23 December. Nielsen, Montreal, Quebec. 50 p. [online]: Available from eareview-examenee.ca/wp-content/uploads/uploaded_files/167205_questionnaire-summary-report-dec-23docx.docx.
Reed MS. 2008. Stakeholder participation for environmental management: a literature review. Biological Conservation, 141: 2417–2431.
Reynolds JD, Côté IM, and Favaro B. 2012. Canada: a bleak day for the environment. Nature, 487: 171.
Schindler DW. 1976. The impact statement boondoggle. Science, 192: 509.
Schindler DW, Smol JP, Peltier WR, Miall AD, Dillon P, Hecky RE, et al. 2012. Potential amendments to section 35 of the Fisheries Act [online]: Available from sfu.ca/~amooers/scientists4species/FA_letter_2012.pdf.
Science Integrity Project. 2014. Statement of principles for sound decision-making in Canada [online]: Available from scienceintegrity.ca.
Siddon T, Anderson D, Fraser J, and Dhaliwal H. 2012. An open letter to Stephen Harper on fisheries. The Globe and Mail [online]: Available from theglobeandmail.com/opinion/an-open-letter-to-stephen-harper-on-fisheries/article4224866.
Sinclair AJ, and Fitzpatrick P. 2002. Provisions for more meaningful public participation still elusive in proposed Canadian EA bill. Impact Assessment and Project Appraisal, 20: 161–176.
Sinclair AJ, Doelle M, and Duinker P. 2017. Looking up, down, and sideways: reconceiving cumulative effects assessment as a mindset. Environmental Impact Assessment Review, 62: 183–194.
Smith T, Gibbs K, Westwood A, Taylor S, and Walsh K. 2017. Oversight at risk: the state of government science in British Columbia. Evidence for Democracy, Ottawa, Ontario. 26 p. [online]: Available from evidencefordemocracy.ca/en/research/reports/bc.
Treweek J. 1996. Ecology and environmental impact assessment. Journal of Applied Ecology, 33: 191–199.
Trudeau J. 2015. Minister of Environment and Climate Change Mandate Letter [online]: Available from pm.gc.ca/eng/minister-environment-and-climate-change-mandate-letter.
Westwood AR, Jacob AL, Boyd DR, Chan KMA, Cooke SJ, Daigle RM, et al. 2017. Strong foundations: recap and recommendations from scientists regarding the federal environmental and regulatory reviews. Prepared in response to the Government of Canada’s Environmental and Regulatory Reviews Discussion Paper. 24 p. [online]: Available from y2y.net/strongfoundations.

Supplementary material

Supplementary Material 1 (DOCX / 15 KB)

Information & Authors

Information

Published In

FACETS
Volume 3, Number 1, October 2018
Pages: 512–529
Editor: Nicole L. Klenk

History

Received: 31 August 2017
Accepted: 23 March 2018
Version of record online: 7 May 2018

Data Availability Statement

All relevant data are within the paper and in the Supplementary Material.

Key Words

  1. public consultation
  2. environmental law
  3. environmental science
  4. impact assessment
  5. scientific integrity
  6. science–policy interface

Authors

Affiliations

Aerin L. Jacob
Yellowstone to Yukon Conservation Initiative, 200-1350 Railway Avenue, Canmore, AB T1W 1P6, Canada
Liber Ero Fellowship Program, Biodiversity Research Centre, University of British Columbia, 141-2212 Main Mall, Vancouver, BC V6T 1Z4, Canada
Jonathan W. Moore
Earth to Ocean Research Group, Simon Fraser University, 8888 University Drive, Burnaby, BC V5A 1S6, Canada
Caroline H. Fox
Department of Oceanography, Dalhousie University, 6299 South Street, Halifax, NS B3H 4R2, Canada
Emily J. Sunter
Department of Biology, University of British Columbia—Okanagan Campus, 1177 Research Road, Kelowna, BC V1V 1V7, Canada
Danielle Gauthier
Department of Biology, University of British Columbia—Okanagan Campus, 1177 Research Road, Kelowna, BC V1V 1V7, Canada
Alana R. Westwood
Yellowstone to Yukon Conservation Initiative, 200-1350 Railway Avenue, Canmore, AB T1W 1P6, Canada
Department of Biology, Dalhousie University, 1355 Oxford Street, Halifax, NS B3H 4R2, Canada
Adam T. Ford
Department of Biology, University of British Columbia—Okanagan Campus, 1177 Research Road, Kelowna, BC V1V 1V7, Canada

Author Contributions

ALJ, JWM, CHF, and ATF conceived and designed the study.
EJS, DG, and ARW performed the experiments/collected the data.
CHF analyzed and interpreted the data.
ALJ and ATF contributed resources.
All drafted or revised the manuscript.

Competing Interests

The authors have declared that no competing interests exist.
