Credentialing Excellence in Health Education

Exam Background and Scoring

MCHES® Exam Background and Scoring 

The MCHES® examination contains 165 items: 150 are scored, and 15 are unscored pilot items that do not contribute to the final score. Examinees are informed that the exam contains pilot items; however, candidates are not told which items are being piloted and which are being scored. NCHEC pilots new items on the MCHES® exam to ensure the high statistical integrity of the certification examination. Piloting is used to determine the psychometric properties of an item before it is included as a scored item on an examination, and it allows for the removal of items that do not perform at acceptable levels for a certifying examination.

Determining the Passing Point on the MCHES® Exam

Determining the passing point for a multiple-choice examination involves conducting a standard-setting study, typically performed each time the content outline for the exam is updated. A representative panel of subject-matter experts develops a standard of performance for minimally qualified candidates (also known as borderline candidates: those who know just enough to have earned the certification). Panel members then rate how well minimally qualified candidates will perform on each test question. Questions with disparate ratings are discussed at length, and the panel comes to agreement on a recommended cut score for the organization's decision makers to review and finalize.

The goal of the standard-setting study is not to determine how many candidates will pass and how many will fail. The goal is to determine how much of the test must be answered correctly in order to pass, taking into account the difficulty of the questions for minimally qualified candidates. With the cut score determined in this way, all candidates who meet the minimally-qualified standard will pass. There is no pre-determined "curve" in which a certain percentage of candidates must either pass or fail. Technically all candidates could pass, or all candidates could fail, but the usual passing rate falls somewhere between those extremes and can vary from testing window to testing window. The standard of performance on which the cut score is based remains the same until it is changed by a new standard-setting study, usually conducted 5-7 years after the previous one. Any variation in the passing rate from one testing window to the next reflects the ability of the group of candidates taking the test in that window to meet the minimally-qualified standard.
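The description above does not name the specific standard-setting method used; a widely used approach for multiple-choice credentialing exams is the modified Angoff method, in which each panelist estimates, for each item, the probability that a borderline candidate would answer it correctly. A minimal sketch, with entirely hypothetical ratings, shows how such ratings combine into a recommended cut score:

```python
# Illustrative sketch of a modified Angoff-style cut score calculation.
# The source does not state which method NCHEC uses; all numbers here
# are hypothetical.

def recommended_cut_score(ratings):
    """ratings: one list per item, each holding panelists' estimates
    (0-1) of the probability that a minimally qualified candidate
    answers that item correctly."""
    # Average the panelists' ratings to get each item's expectation.
    item_expectations = [sum(r) / len(r) for r in ratings]
    # The recommended cut score is the expected raw score of a
    # borderline candidate: the sum of the item expectations.
    return sum(item_expectations)

# Hypothetical panel of three judges rating a four-item test:
ratings = [
    [0.70, 0.80, 0.75],  # item 1
    [0.60, 0.65, 0.55],  # item 2
    [0.90, 0.85, 0.95],  # item 3
    [0.50, 0.45, 0.55],  # item 4
]
print(round(recommended_cut_score(ratings), 2))  # 2.75 of 4 items
```

On a real 150-item exam, the same calculation yields a raw cut score on the 0-150 scale, which decision makers then review before adoption.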

Reliability and Validity of the MCHES® Exam 

Item analyses are conducted and the results are reviewed for each examination form administered. Reliability of the examination is calculated using the Kuder-Richardson Formula 20 (KR20). Reliability coefficients above 0.80 are considered satisfactory for credentialing exams. The MCHES® exam reliability coefficient, as determined by the KR20, has consistently met or exceeded the standard over the years.
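The KR-20 statistic mentioned above is a standard internal-consistency formula for dichotomously scored (right/wrong) tests. A self-contained sketch of the calculation, using a small made-up response matrix rather than actual MCHES® data:

```python
# Kuder-Richardson Formula 20 (KR-20) for 0/1-scored items:
#   KR20 = (k / (k - 1)) * (1 - sum(p_j * q_j) / var(total scores))
# where k is the number of items, p_j the proportion answering item j
# correctly, and q_j = 1 - p_j.

def kr20(responses):
    """responses: list of candidates' item-score vectors (0 or 1)."""
    n = len(responses)
    k = len(responses[0])
    totals = [sum(r) for r in responses]
    mean_total = sum(totals) / n
    # Population variance of total scores (some texts use the
    # sample variance instead; the choice matters little for large n).
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    sum_pq = 0.0
    for j in range(k):
        p = sum(r[j] for r in responses) / n
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)

# Four hypothetical candidates on a three-item test:
print(round(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]), 2))  # 0.75
```

Values near 1.0 indicate that the items consistently measure the same construct; the 0.80 threshold cited above is a common benchmark for credentialing exams.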

Statistical Information Regarding the MCHES® April 2016 and October 2016 Exams

2016 MCHES® Exam Statistical Information

                                      April 2016 Examination   October 2016 Examination
Number of Items                       150                      150
Pass Point                            106                      99
Average Raw Score                     114.85                   108.11
Standard Deviation                    13.83                    14.20
Range of Raw Scores                   75 - 139                 73 - 134
Average Percent Score                 77%                      72%
Number of Candidates Tested           72                       63
Number of Candidates Passed (Rate)    53 (73.61%)              47 (74.60%)
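The derived figures in the table above follow directly from the raw numbers: the average percent score is the average raw score divided by the 150 scored items, and the pass rate is the number passing divided by the number tested. A quick check, using the table's own values:

```python
# Recomputing the table's derived figures from its raw numbers.
exams = {
    "April 2016":   {"pass_point": 106, "avg_raw": 114.85, "tested": 72, "passed": 53},
    "October 2016": {"pass_point": 99,  "avg_raw": 108.11, "tested": 63, "passed": 47},
}
for name, e in exams.items():
    avg_pct = 100 * e["avg_raw"] / 150            # average percent score
    pass_rate = 100 * e["passed"] / e["tested"]   # percent of candidates passing
    print(f"{name}: average {avg_pct:.0f}%, pass rate {pass_rate:.2f}%")
```

This reproduces the table's 77%/72% average percent scores and 73.61%/74.60% pass rates. Note that the raw pass point differs between the two forms (106 vs. 99) even though the performance standard is the same, which is consistent with adjusting the cut score for differences in form difficulty.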

Confidentiality Policy

The exam score is confidential and will not be disclosed unless NCHEC receives a written request to do so from a candidate or is directed to do so by subpoena or court order. A candidate wanting scores released to another entity must indicate in writing which particular scores may be disclosed and identify specifically the person or organization to which the scores should be revealed. No candidate scores will be given by telephone, facsimile or e-mail for any reason.

MCHES® Exam Analysis: April 2016 - October 2016

The table below shows the average score for each Area of Responsibility on the April and October 2016 MCHES® exams.

Area of Responsibility             Max Score   April 2016 Cohort   October 2016 Cohort
                                               National Avg        National Avg
I. Assess Needs                    15          78.40%              76.40%
II. Plan Programs                  24          79.79%              73.63%
III. Implement Programs            23          77.35%              68.26%
IV. Evaluate Programs              30          74.17%              73.87%
V. Administer Programs             26          81.15%              73.38%
VI. Act as a Resource Person       18          66.33%              69.56%
VII. Communicate / Advocate        14          77.50%              68.00%
Total Score                        150         76.57%              72.07%
Number Tested                                  72                  63
Number Passed                                  53                  47
Percent Passed                                 73.61%              74.60%

Ready to learn more? Request our presentation kit that illustrates the benefits of NCHEC certification or join our mailing list for the latest NCHEC news.