Conservation Certification is designed to be both credible and accessible, both of which are reflected in the evaluation of applications. Additional information about these tenets can be found in the “Setting the New Standard” posts of the WHC President's Blog.
Reviewers evaluate all applications based on the established rubrics provided in each Scoring Sheet. Points are not awarded to or deducted from a score outside the framework of the scoring rubrics – there are no bonus points or penalties. Reviewers use their professional knowledge, paired with the additional information below, to inform their decisions as to how the information provided in the application addresses each criterion.
Overall Considerations for Project Evaluation
Content Over Form
Evaluation focuses on the quality of efforts being reported in the application. Grammatical errors or sub-optimal image quality will not adversely impact scores as long as the reviewer is able to fully and clearly understand all of the material presented in the application.
Nested Questions
Application questions utilize conditional logic and nesting of questions to ensure the applicant only needs to answer relevant questions. As a result, some application questions will not be displayed in some applications. For example, if an applicant selected “no” to the question about whether baseline data was collected, the application would skip over the questions asking for a detailed description and to upload the baseline data.
As a result of this nesting of application questions, some criteria will correspond to questions that do not appear in an application. If an application question does not appear in the application, the answer from the preceding application question will inform the scoring of those criteria.
Credibility Considerations
In order to ensure projects are credible, Reviewers must find sufficient information and documentation of ongoing activity to award scores using the rubrics in the Scoring Sheets. There are several key aspects that Reviewers take into consideration to ensure credibility:
Current Information: Applications evaluate recent efforts. For programs applying for initial certification, all information and documentation is considered, but the evaluation focuses on information from the past 1-3 years if the project has been active for longer than that. Programs applying for renewal must provide updated information and documentation for each project to describe and demonstrate what has been done since the applicant last applied (2-3 years previously).
Documentation: Unless noted as optional, all upload fields are required. Uploaded files (e.g., monitoring logs, photos, receipts) serve as documentation to support the other information provided by the applicant. Documentation in other languages can be included in applications as long as enough is translated into English for reviewers to fully assess the information for scoring. If the applicant does not provide the required supporting documentation, points will not be awarded.
Applicants are encouraged to submit all applicable documentation, but if the volume would be overwhelming (e.g., hundreds of files), applicants may submit a representative sample. This sample should support the information provided elsewhere in the application. For example, if an applicant provides a monitoring protocol that specifies recording date, time, number of individuals, etc. weekly, enough monitoring documentation should be submitted for the reviewer to confirm that monitoring captured the specified data and took place weekly throughout the certification term.
The following document icon is used to highlight criteria that require documentation.
Insufficient Information: Reviewers evaluate every project that has been filled out as part of an application. If the answers to application questions for a project do not fully address the criteria in the scoring rubric, the reviewer will not award points. Reviewers do not infer details; applications are scored only on the information provided. For example, if a plant list consists of unspecific names such as “lily, rose, dogwood”, the reviewer will not assume these refer to lilies, roses, or dogwoods native to the region (there are multiple species, and the exact species may or may not be native).
Inconsistent Information: Information presented in the application may sometimes contain inconsistencies. For example, the applicant may have selected a checkbox that does not correspond to the details written in the associated long text field. Reviewers score based on the most detailed fields, as more detailed fields provide more information and additional credibility.
Level of Detail    Question Type
Low                Checkbox, radio button, yes/no
Medium             Text boxes
High               Uploaded documents or photos
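One way to picture the detail-based precedence described above is a small helper that keeps the answer from the most detailed field. The detail ranking mirrors the table above; the function and data structures are illustrative assumptions, not part of the actual review tooling.

```python
# Hedged sketch of the detail-precedence rule: when fields disagree,
# the answer from the highest-detail field wins. The DETAIL ranking
# mirrors the Level of Detail table; everything else is illustrative.

DETAIL = {
    "checkbox": 0, "radio": 0, "yes_no": 0,  # low detail
    "text": 1,                               # medium detail
    "upload": 2,                             # high detail
}

def most_detailed_answer(answers):
    """answers: list of (field_type, value) pairs for the same topic.
    Return the value from the field with the highest level of detail."""
    return max(answers, key=lambda pair: DETAIL[pair[0]])[1]

# A checked box gives only "native plants", but the text field names species,
# so the reviewer would score against the text field's content:
answers = [
    ("checkbox", "native plants"),
    ("text", "Cornus florida, Lilium canadense"),
]
print(most_detailed_answer(answers))  # Cornus florida, Lilium canadense
```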
Applicant Understanding: Some criteria are evaluated based specifically on the applicant’s answer to the question. For example, if an applicant does not acknowledge that their project aligns with a large-scale initiative, but the reviewer has knowledge that it does, the reviewer cannot award points for alignment because the applicant answered “no” to the question.
The following icon is used to highlight criteria that evaluate the applicant’s understanding.
Scoring Sheets and Criteria
Each of the 24 project types has a corresponding Scoring Sheet with a defined scoring rubric, all of which are publicly accessible.
Reviewers review each project in a program using the designated Scoring Sheet for the project type. The reviewer assigns scores for a series of criteria based on the rubric provided. These completed Scoring Sheets determine the outcome of the application.
Point Values
The Scoring Sheets consist of a number of specific criteria. WHC developed the levels of achievement for each criterion, as well as the weight assigned to each criterion, drawing on input from several Advisory Committees that included external conservation experts.
Reviewer scores provide a simplified way to evaluate individual criteria. Although some criteria are scored yes/no or with specific values (such as acreage), most are evaluated against defined levels of achievement. For example, a reviewer can score a monitoring protocol as 0, 1, or 2 based on the levels defined in the Scoring Sheet:
0 = No monitoring protocol or a protocol that is not relevant
1 = Monitoring protocol that is relevant but not scientifically rigorous
2 = Monitoring protocol that is both relevant and scientifically rigorous
Each selected score is then input into calculations to determine the final point value awarded for that criterion. The sum of the point values awarded across all criteria results in the Project Score.
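The calculation described above can be sketched as a weighted sum: each raw rubric score is normalized by its maximum achievement level, multiplied by the criterion's weight, and the results are summed. The criterion names, weights, and maximum levels below are made-up examples, not the actual WHC rubric values.

```python
# Hedged sketch of how raw rubric scores might be combined into a Project
# Score. Criterion names, weights, and maximum levels are illustrative
# assumptions, not the actual WHC scoring rubric.

def project_score(scores, rubric):
    """Normalize each raw score by its maximum achievement level, apply the
    criterion's weight, and sum the weighted point values."""
    total = 0.0
    for criterion, raw in scores.items():
        max_level, weight = rubric[criterion]
        total += (raw / max_level) * weight
    return total

# Example rubric: criterion -> (maximum level, weight in points)
rubric = {
    "monitoring_protocol": (2, 10),  # 0/1/2 levels, as in the example above
    "baseline_data": (1, 5),         # a yes/no (0/1) criterion
}

scores = {"monitoring_protocol": 2, "baseline_data": 1}
print(project_score(scores, rubric))  # 15.0
```

A protocol scored 1 (relevant but not scientifically rigorous) would earn half of that criterion's weight under this scheme.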