
You may also be interested in the How To Apply For A Cognitive Exam page or the Cognitive Exam Policies page.

About the National Registry Cognitive Exams

Candidates seeking National EMS Certification at the Emergency Medical Responder, Emergency Medical Technician, and Paramedic levels take a Computerized Adaptive Test (CAT). The advantage of using CAT is that it allows for the construction of a shorter, more precise test that is individualized to each candidate's level of knowledge and skill. A passing standard, identical for all candidates at a given level of certification, is used to determine whether a candidate passed or failed the cognitive exam.

Candidates seeking National EMS Certification as an Advanced EMT take a Computer Based Linear Test (CBT). A linear CBT exam is a fixed-length, computerized version of a paper-and-pencil exam.

The same method is used to develop all National Registry test items, regardless of exam format. First, an item (or question) is drafted. Then it is pilot tested in a high-stakes atmosphere by being placed in live exam test pools. A test pool is a 'bank' of test questions that the computer can draw from when delivering an exam. Pilot items are placed in test pools to be calibrated, which determines where on the scale of difficulty they will be placed. While a drafted item is being pilot tested, it does not count toward the pass/fail score of the candidate being examined. In order for an item to be placed in a "live" test pool (where the item counts toward pass/fail), it must meet strict calibration requirements. The difficulty statistic of an item identifies the "ability" necessary to answer it correctly. Some items require a low ability to answer correctly, while others require a moderate or high level of ability.
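
The relationship between item difficulty and candidate ability can be illustrated with a one-parameter (Rasch-style) item response model, a common approach in computer adaptive testing. The National Registry does not publish its exact model or scale, so the model choice and the numbers below are illustrative assumptions only.

```python
import math

def prob_correct(ability: float, difficulty: float) -> float:
    """Rasch-style probability that a candidate of a given ability
    answers an item of a given difficulty correctly."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# Hypothetical items calibrated onto a common scale:
# negative = easier, positive = harder.
easy_item, hard_item = -1.0, 1.5

for ability in (-1.0, 0.0, 1.0):
    print(f"ability {ability:+.1f}: "
          f"P(easy) = {prob_correct(ability, easy_item):.2f}, "
          f"P(hard) = {prob_correct(ability, hard_item):.2f}")
```

As the sketch shows, a low-ability candidate is still likely to answer an easy item correctly, while a hard item requires substantially more ability.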

About the Minimum Passing Standard

The minimum passing standard is the level of knowledge or ability that a competent EMS provider must demonstrate. The minimum passing standard is set by the National Registry Board of Directors and reviewed at least every three years. This Board action is based on the recommendation of a panel of experts and providers from the EMS community. The panel is facilitated by psychometricians (experts in testing) and uses a variety of recognized methods (such as the Angoff method) to assess how a minimally competent provider would respond to examination items. These judgments are combined to form a recommendation on the minimum passing standard for the exam. The Board considers this recommendation and its impact on the community when setting the minimum passing standard.
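
As a rough illustration of the Angoff method named above, each panelist estimates the probability that a minimally competent provider would answer each item correctly; averaging those judgments yields a recommended minimum passing score. The ratings below are hypothetical, and the actual procedure involves additional rounds of review.

```python
# Hypothetical Angoff ratings: for each item, each panelist's estimate of the
# probability that a *minimally competent* provider answers it correctly.
ratings = {
    "item_1": [0.70, 0.65, 0.75],
    "item_2": [0.55, 0.60, 0.50],
    "item_3": [0.85, 0.80, 0.90],
}

# Average across panelists for each item, then sum across items to obtain the
# expected raw score of a minimally competent provider (the recommended cut).
item_means = {item: sum(r) / len(r) for item, r in ratings.items()}
recommended_cut = sum(item_means.values())

print(item_means)
print(f"Recommended cut score: {recommended_cut:.2f} of {len(ratings)} items")
```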

Pilot Questions

During National Registry exams, every candidate receives pilot questions that are indistinguishable from 'live' items. However, pilot questions are not factored into a candidate's performance. The number of pilot items included on each exam is detailed below:

  • EMR: 30 items
  • EMT: 10 items
  • AEMT: 35 items
  • Paramedic: 20 items

Pearson VUE: The National Registry Test Provider

The National Registry utilizes Pearson VUE as its exclusive test provider. Pearson VUE is one of the world's largest assessment and testing businesses, with an extensive network of testing centers and deep experience in delivering high-stakes computer-based tests. You will take the exam at one of two types of Pearson VUE testing centers:

  • Pearson VUE Professional Centers (PPCs) are testing centers that are owned and operated by Pearson. They are located in most urban areas.
  • Pearson VUE Testing Centers (PVTCs) have a contractual relationship with Pearson VUE and are typically found in smaller towns and rural areas. PVTCs are used to increase the access to EMS testing in rural areas.

In some cases, the closest Pearson VUE test center is located in another state. Candidates can test at any authorized Pearson VUE test center in the United States at a convenient date, time and location. The exam delivery process is the same regardless of where it is taken.

Computer Adaptive Tests

Since CAT exams are delivered in a completely different manner than paper-and-pencil exams, they will "feel" more difficult. Candidates should not be concerned about the difficulty level of any individual item, because their ability is being 'measured' in a different manner: all items are placed on a standard scale, and the exam identifies where the candidate falls on that scale. As a result, candidates should answer all items to the best of their ability. Let's use an example to explain this:

Suppose that a middle-school athlete is trying out to be a member of the high jump squad of the track team. The coach, after many years of experience as a middle-school coach, knows that in order to score any points at a middle-school track meet, his individual jumpers need to jump over a bar placed four feet above the ground. This is the "competency" standard. If he enters jumpers who can jump three feet, he knows these jumpers will rarely, if ever, score points for his team during a track meet. Those who jump four feet on the first day of try-outs can not only clear four feet (the minimum) but may later, through additional training and coaching, learn to jump five or more feet. The coach knows that it will be worth his time and effort to coach these try-out jumpers to greater heights. Therefore, he tells those who jump over four feet at try-outs that they are members of the high jump team (because they have met the entry-level, or competency, standard).

Since the coach knows the competency standard, he can hold a try-out to see who meets the entry-level competency. The coach will likely set the bar at or near 3 feet 6 inches for the first jump attempt. Those who make it over this bar will then progress to perhaps 3 feet 9 inches to test their ability at that height. After a group has passed 3 feet 9 inches, the coach will again raise the bar to 4 feet and have the successful jumpers attempt to clear it. A smart coach will likely not tell the team the necessary height so that he can learn the maximum ability of each try-out jumper. At the 4-foot level, the coach may find that seven of ten athletes clear the bar. He will then raise the bar to 4 feet 3 inches and later to 4 feet 6 inches. He will increase the height of the bar until he determines the maximum individual ability of each try-out jumper. If he has four slots on his team, he will select the top four or five jumpers and begin the coaching process to help them reach even greater heights. In this manner, the coach has learned about the ability of the try-out jumpers based upon a standard scale (feet and inches). The coach then sets a standard (4 feet) for membership on the team, based upon his knowledge of what is necessary to score points at track meets (the competency standard).
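
The coach's try-out procedure amounts to an adaptive search: keep raising the bar until a jumper misses, then compare the best height cleared to the four-foot standard. A minimal sketch of that logic, with heights in inches and hypothetical jumper abilities:

```python
def best_cleared_height(max_jump_in: float, start_in: float = 42.0, step_in: float = 3.0) -> float:
    """Raise the bar in fixed steps until the jumper misses; return the last height cleared."""
    bar, cleared = start_in, 0.0
    while bar <= max_jump_in:      # the jumper clears any bar at or below their maximum
        cleared = bar
        bar += step_in
    return cleared

STANDARD_IN = 48.0  # the 4-foot competency standard from the example

# Hypothetical maximum jump heights for three try-out athletes.
for athlete, max_jump in {"A": 50.0, "B": 46.0, "C": 39.0}.items():
    cleared = best_cleared_height(max_jump)
    verdict = "makes the team" if cleared >= STANDARD_IN else "does not make the team"
    print(f"Jumper {athlete}: cleared {cleared:.0f} in -> {verdict}")
```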

How A CAT Exam Works

The above high-jump illustration can describe the way a CAT exam works. Every item within a live item pool has been calibrated to determine its level of difficulty. As each candidate takes an exam, the computer adaptive test must learn the ability level of the candidate.

The test typically starts by administering an item that is slightly below the passing standard in difficulty. The item may come from any subject area in the test plan:

  • Airway, Respiration & Ventilation
  • Cardiology & Resuscitation
  • Trauma
  • Medical/Obstetrics/Gynecology
  • EMS Operations

After the candidate gets a short series of items correct, the computer will choose items of a higher ability, perhaps near entry-level competency. These items will also be taken from a variety of content areas of the test plan. If the candidate answers most of the questions in this series of items correctly, then the computer will choose new items that are at a higher ability level. Again, if the candidate answers many of these items correctly the computer will again present the candidate with items of an even higher ability level. Eventually, every candidate will reach his or her maximum ability level (and answer questions incorrectly). The computer then determines whether or not the individual is above the standard (entry-level competency) in these content areas, and the examination ends.
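
In code terms, the selection loop described above can be sketched as follows: after each response, adjust the estimate of the candidate's ability and pick the unused item whose calibrated difficulty is closest to that estimate. The simple up/down adjustment and the item bank below are illustrative assumptions; real CAT engines use maximum-likelihood or Bayesian ability estimation.

```python
import random

def next_item(item_bank: dict[str, float], used: set[str], ability_estimate: float) -> str:
    """Pick the unused item whose calibrated difficulty is closest to the current estimate."""
    candidates = {name: diff for name, diff in item_bank.items() if name not in used}
    return min(candidates, key=lambda name: abs(candidates[name] - ability_estimate))

# Hypothetical calibrated item bank: item name -> difficulty on a common scale.
bank = {f"item_{i}": d for i, d in enumerate([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0])}

ability, used, step = -0.5, set(), 0.4   # start slightly below the passing standard (0.0 here)
for _ in range(6):
    item = next_item(bank, used, ability)
    used.add(item)
    answered_correctly = random.random() < 0.6          # stand-in for the candidate's response
    ability += step if answered_correctly else -step    # crude up/down adjustment for illustration
    print(f"{item} (difficulty {bank[item]:+.1f}) -> "
          f"{'correct' if answered_correctly else 'incorrect'}, new estimate {ability:+.2f}")
```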

95% Confidence Is Necessary to Pass or Fail a CAT Exam

The high achiever who is able to answer most of the questions correctly will find that the computer ends the exam quickly. A candidate may worry that something is wrong because the exam was so short, when in reality, the computer was able to determine that the candidate jumped far higher than the standard level, or was well above the level of competency, on the CAT exam.

The computer stops the exam when it is 95% confident that the individual candidate has reached the level of competency, is 95% confident the individual cannot reach the level of competency, or has reached the maximum allotted time. Thus, the length of a CAT exam is variable. Sometimes a candidate can demonstrate a level of competency in as few as 60 test items. Sometimes, after 60 questions, the candidate has shown themselves to be close to entry-level competency, but the computer has not determined within the 95% confidence requirement whether the candidate is above or below the entry-level competency standard. In cases where the computer is not 95% confident, the test continues to deliver additional items. Each additional test item provides more information to determine whether or not a candidate meets the entry-level of competency. Regardless of the length of the test, items will still vary over the content domain (Airway/Oxygenation/Ventilation, Cardiology, Medical, Trauma, and Operations).
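
One way to picture the stopping rule is a confidence-interval check: the exam ends when the interval around the ability estimate lies entirely above or entirely below the passing standard, or when the maximum length is reached. The standard-error model, cut score, and maximum length below are illustrative assumptions, not the National Registry's actual parameters.

```python
import math

PASSING_STANDARD = 0.0      # cut score on the calibrated scale (illustrative)
Z_95 = 1.96                 # two-sided 95% confidence multiplier
MAX_ITEMS = 120             # hypothetical maximum exam length

def decision(ability_estimate: float, standard_error: float, items_given: int) -> str:
    """Return 'pass', 'fail', or 'continue' under a 95% confidence stopping rule."""
    lower = ability_estimate - Z_95 * standard_error
    upper = ability_estimate + Z_95 * standard_error
    if lower > PASSING_STANDARD:
        return "pass"          # 95% confident the candidate is above the standard
    if upper < PASSING_STANDARD:
        return "fail"          # 95% confident the candidate is below the standard
    if items_given >= MAX_ITEMS:
        return "pass" if ability_estimate >= PASSING_STANDARD else "fail"
    return "continue"          # not yet confident: deliver another item

# The standard error shrinks roughly with the square root of the number of items answered,
# which is why candidates near the standard see longer exams.
for n, estimate in [(60, 0.9), (60, 0.1), (120, 0.1)]:
    se = 1.0 / math.sqrt(n)
    print(n, estimate, decision(estimate, se, n))
```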

When (and if) the candidate reaches the maximum length of an examination, the ability estimate of that candidate will be most precise. Using the high jumper example, the computer will be able to distinguish those who jump 3 feet 11 inches from those who jump 4 feet 1 inch. Those who clear 4 feet more times than they miss 4 feet will pass. Those who jump 3 feet 11 inches but fail to clear 4 feet enough times will fail and are required to repeat the test. Some candidates won't even be able to jump close to four feet; these candidates are below or well below the entry-level of competency. This too can be determined fairly quickly, and these candidates may have their examination ended early. When the examination is near 70 questions (for EMTs, or 90 for EMRs) and a candidate fails, he or she has demonstrated within 95% confidence that he or she cannot reach the entry-level of competency.

In a CAT exam, it is important that candidates answer every question to the best of their ability. The CAT exam provides candidates with more than adequate opportunity to demonstrate their ability, and it delivers the precision, efficiency, and confidence needed to ensure that a successful candidate meets the definition of entry-level competency and can become a Nationally Certified EMS provider.

Computer Based Linear Tests

Computer based linear tests are fixed length exams. The Advanced-EMT exam is currently a Computer Based Linear Test.

Unlike CAT exams, candidates who take a linear exam can skip questions, mark questions for review, and go back and change their answers provided time has not expired. There is no penalty for guessing. Any questions that are left blank are scored as incorrect.
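
Because unanswered questions are simply scored as incorrect and there is no guessing penalty, scoring a linear exam amounts to counting correct responses. A minimal sketch with a hypothetical answer key:

```python
def score_linear_exam(answer_key: dict[str, str], responses: dict[str, str]) -> int:
    """Count correct responses; skipped (missing) items simply earn no credit."""
    return sum(1 for item, correct in answer_key.items() if responses.get(item) == correct)

key = {"q1": "B", "q2": "D", "q3": "A"}      # hypothetical answer key
responses = {"q1": "B", "q3": "C"}           # q2 left blank, q3 answered incorrectly
print(score_linear_exam(key, responses))     # -> 1: blanks and wrong answers score the same
```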


Exam Results

Exam results are posted on the National Registry's password-secure website through an individual's login account, usually within two business days following the completion of the examination, provided all other registration requirements have been met. Candidates who pass the exam are sent National EMS Certification credentials by the National Registry. Candidates who successfully demonstrate entry-level competency do not receive specific details regarding their examination results, as these details are not necessary.

Candidates who fail to meet entry-level competency are provided information regarding their testing experience. Studying examination items to prepare to do the job of an EMT is not helpful; studying the tasks and the job of an EMT provides the best preparation. Candidates who memorize items in hopes of "getting them right" the next time are wasting their time, because item masking prevents them from seeing the same item again.

What is included on the exam? The "Test Plan"

National Registry examinations are developed to measure the important aspects of out-of-hospital care practice. Examination items are developed in relation to the tasks identified in the practice analysis. The domain of practice that limits the therapy addressed in an item is based upon the National EMS Scope of Practice Model and the National Registry Practice Analysis.

EMS education programs are encouraged to review the current National Registry practice analysis when teaching courses and as part of the final review of students' ability to properly perform the tasks necessary for competent patient care.


Based on the 2014 Practice Analysis, the current National EMS Certification Examinations cover five content areas:

  • Airway, Respiration & Ventilation
  • Cardiology & Resuscitation
  • Trauma
  • Medical/Obstetrics/Gynecology
  • EMS Operations
All sections, except EMS Operations, have a content distribution of 85% adult and 15% pediatrics.
 
Content Area                        EMR              EMT              Advanced EMT     Paramedic
                                    (90-110 items)   (70-120 items)   (135 items)      (80-150 items)
Airway, Respiration & Ventilation   18%-22%          18%-22%          18%-22%          18%-22%
Cardiology & Resuscitation          20%-24%          20%-24%          21%-25%          22%-26%
Trauma                              15%-19%          14%-18%          14%-18%          13%-17%
Medical/Obstetrics/Gynecology       27%-31%          27%-31%          26%-30%          25%-29%
EMS Operations                      11%-15%          10%-14%          11%-15%          10%-14%
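
The table above is the exam blueprint: every delivered form must draw items so that each content area falls within its percentage range. The sketch below checks a hypothetical 100-item EMT form against the EMT column; the form counts are invented for illustration.

```python
# EMT blueprint ranges from the table above: content area -> (min %, max %).
EMT_BLUEPRINT = {
    "Airway, Respiration & Ventilation": (18, 22),
    "Cardiology & Resuscitation": (20, 24),
    "Trauma": (14, 18),
    "Medical/Obstetrics/Gynecology": (27, 31),
    "EMS Operations": (10, 14),
}

def meets_blueprint(item_counts: dict[str, int], blueprint: dict[str, tuple[int, int]]) -> bool:
    """True if every content area's share of the form falls within its blueprint range."""
    total = sum(item_counts.values())
    for area, (low, high) in blueprint.items():
        share = 100 * item_counts.get(area, 0) / total
        if not low <= share <= high:
            return False
    return True

# Hypothetical 100-item EMT form.
form = {
    "Airway, Respiration & Ventilation": 20,
    "Cardiology & Resuscitation": 22,
    "Trauma": 16,
    "Medical/Obstetrics/Gynecology": 29,
    "EMS Operations": 13,
}
print(meets_blueprint(form, EMT_BLUEPRINT))   # -> True
```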

National EMS Practice Analysis

The goal of licensure and certification is to assure the public that individuals who work in a particular profession have met certain standards and are qualified to engage in practice (American Educational Research Association, American Psychological Association, and National Council on Measurement in Education, 1999). To meet this goal, the requirements for certification and licensure must be based on the ability to practice safely and effectively (Kane, 1982). The practice analysis is a critical component in the development of a legally defensible and psychometrically sound credentialing process.

The primary purpose of a practice analysis is to develop a clear and accurate picture of the current practice of a job or profession, in this case the provision of emergency medical care in the out-of-hospital environment. The results of the practice analysis are used throughout the entire National Registry of Emergency Medical Technicians (National Registry) examination development process, which helps to ensure a connection between the examination content and actual practice. The practice analysis helps to answer the questions, "What are the most important aspects of practice?" and "What constitutes safe and effective care?" It also enables the National Registry to develop examinations that reflect contemporary, real-life practice of out-of-hospital emergency medicine.

The National Registry conducted its first practice analysis in 1994 and has repeated the process at five-year intervals since. The most recent practice analysis was conducted in 2014. After the data are collected, an analysis is conducted that accounts for both frequency and criticality. The resulting weighted importance scores are then combined for each of the five domains on the National Registry examinations: Airway, Respiration & Ventilation; Cardiology & Resuscitation; Medical & OB/Gyn; Trauma; and EMS Operations. After the weighted importance scores are calculated, the proportion represented by each area is used to set the blueprint for the next five years of National Registry examinations. A copy of the 2014 Practice Analysis is available for free download here.
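
The frequency-and-criticality weighting described above can be sketched as follows: each task's importance combines how often it is performed with how critical it is, and the domain totals are normalized into the proportions that feed the blueprint. The combination rule (a simple product) and the task ratings below are hypothetical; the published practice analysis describes the actual methodology.

```python
# Hypothetical practice-analysis data: task -> (domain, frequency rating, criticality rating).
tasks = {
    "open the airway":       ("Airway, Respiration & Ventilation", 4.5, 4.8),
    "control bleeding":      ("Trauma", 3.8, 4.6),
    "obtain a 12-lead ECG":  ("Cardiology & Resuscitation", 4.0, 4.2),
    "complete a run report": ("EMS Operations", 4.9, 2.5),
}

# Combine frequency and criticality into a weighted importance score per domain.
domain_weight: dict[str, float] = {}
for domain, freq, crit in tasks.values():
    domain_weight[domain] = domain_weight.get(domain, 0.0) + freq * crit

# Normalize to proportions, which would feed the exam blueprint.
total = sum(domain_weight.values())
for domain, weight in domain_weight.items():
    print(f"{domain}: {100 * weight / total:.1f}% of the blueprint")
```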

Example Items

To help you understand what to expect from exam questions, we have provided examples of the types of questions entry-level providers are expected to answer on the exam. These three questions range in difficulty, but all are at or above the passing standard.

  1. Your patient fell while skateboarding and has a painful, swollen, deformed lower arm. You are unable to palpate the radial pulse. You should immediately
    1. apply cold packs.
    2. align the arm with gentle traction.
    3. splint the arm in the position found.
    4. ask the patient to try moving the arm.
  2. Your patient is a disoriented 86 year old male who has a history of terminal brain cancer. He fell out of bed and complains of severe right hip pain. His wife called 9-1-1 for assistance in putting him back to bed. She tells you that he has DNR orders and that she does not want him transported. You should initially
    1. explain why he needs to be transported.
    2. ask to see the DNR orders.
    3. have her sign a refusal form.
    4. call for law enforcement back-up.
  3. You are called to a scene where law enforcement officers have detained a man who they thought was drunk. They called you because he has a history of diabetes. You administer oral glucose and within a minute the patient becomes unresponsive. You should
    1. remain at the scene under law enforcement authority and request ALS back-up.
    2. leave the glucose in place, complete a primary survey and transport.
    3. open his airway with an oropharyngeal airway and ventilate him.
    4. suction his mouth and begin transport.

Where do the test questions (items) come from?

The National Registry of EMTs (National Registry) follows an extensive process to develop cognitive exam items (questions). Each exam question takes approximately one year to create, at an estimated cost of approximately $1,600 per valid test question. The item development process is the same for all four levels of National EMS Certification: Emergency Medical Responder (EMR), Emergency Medical Technician (EMT), Advanced Emergency Medical Technician (AEMT), and Paramedic (NRP).

Computer based cognitive examinations consist of items drawn from the National Registry's item banks. National Registry computer based exams are constructed to ensure that each candidate receives a distribution of items from five categories:

  • Airway, Respiration & Ventilation
  • Cardiology & Resuscitation
  • Trauma
  • Medical/Obstetrics/Gynecology
  • EMS Operations

In all categories except EMS Operations, fifteen percent (15%) of the items cover pediatric emergency care. The number of items from each category is determined by an examination test plan (also known as a blueprint), which has been approved by the National Registry Board of Directors.

Process for Item Creation

Individual examination items are developed by members of the EMS community serving on Item Writing Committees convened by the National Registry. Item Writing Committees typically have nine to ten national EMS experts as members (physicians, state regulators, educators, and providers). They meet over several days to review, rewrite, and reconstruct drafted items, and the committee must reach consensus on each item. This ensures that every question is in direct reference to the tasks in the practice analysis; that the correct answer is the one and only correct answer; that each distracter option has some plausibility; and that the answer can be found within commonly available EMS textbooks. Controversial questions are discarded and not placed within the pilot item pools. Items are also reviewed for the appropriate reading level and to ensure no bias exists related to race, gender, or ethnicity.

Following completion of the item-writing phase, all items are pilot tested. Pilot items are administered to all candidates during the computer-based cognitive exams. To candidates, pilot items are indistinguishable from scored items; however, they do not count for or against the candidate. Extensive analysis of the pilot items' performance under these high-stakes testing conditions is then conducted. When the item analysis is complete, items determined to be functioning properly and psychometrically sound are placed in "live" item pools.
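
The pilot-item analysis described above typically examines statistics such as the proportion of candidates answering the item correctly (a difficulty index) and how well the item separates stronger from weaker candidates (a discrimination index such as a point-biserial correlation). The data below, and the idea of fixed acceptance thresholds, are illustrative assumptions, not the National Registry's actual criteria.

```python
from statistics import mean, pstdev

def item_stats(item_scores: list[int], total_scores: list[float]) -> tuple[float, float]:
    """Return (p-value, point-biserial) for one pilot item.

    item_scores: 1 if the candidate answered the pilot item correctly, else 0.
    total_scores: each candidate's score on the scored (non-pilot) portion of the exam.
    """
    p = mean(item_scores)
    correct = [t for s, t in zip(item_scores, total_scores) if s == 1]
    sd = pstdev(total_scores)
    if not correct or len(correct) == len(total_scores) or sd == 0:
        return p, 0.0
    point_biserial = ((mean(correct) - mean(total_scores)) / sd) * ((p / (1 - p)) ** 0.5)
    return p, point_biserial

# Hypothetical pilot data for one item across eight candidates.
item = [1, 0, 1, 1, 0, 1, 0, 1]
totals = [82, 60, 75, 90, 55, 78, 62, 88]
p, rpb = item_stats(item, totals)
print(f"p-value {p:.2f}, point-biserial {rpb:.2f}")  # retain only items meeting preset criteria
```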

The National Registry conducts differential statistical analysis of items in pre-test item pools and live item pools on an annual basis. Panels are then convened to review the items that show differential statistics and decisions are made regarding maintenance of any items within the pools.

COGNITIVE EXAM ONSCREEN CALCULATOR

Beginning in 2018, an onscreen calculator will be available to candidates throughout the cognitive examination. The onscreen calculator will be released by certification level, in accordance with the following schedule:

  • Paramedic– March 2018
  • AEMT – April 2018
  • EMT– May 2018
  • EMR – September 2018

Below, you will see a preview of the calculator and a list of its functions.

[Image: Calculator preview]

[Image: Calculator functions]

HELPFUL INFORMATION ABOUT THE EXAM

All exam items evaluate your ability to apply knowledge from your course and textbook to the types of tasks and situations that are expected of entry-level EMS professionals. A question answered incorrectly on the exam suggests you could choose the wrong assessment or treatment in the field. There are some general concepts to remember about the cognitive exam:

  • There is only one best answer. The items are written to determine how you would respond when providing patient care. Incorrect responses may be misunderstandings, common mistakes or inefficient approaches that represent less-than-optimal care.
  • Examination content reflects the National EMS Education Standards, not local or state protocols. The National Registry avoids questions on specific details that have regional differences. Some topics in EMS are controversial, and experts disagree on the single best approach to some situations. The National Registry avoids testing on controversial areas.
  • National Registry exams focus on what providers should do in the field. The exam is not taken from any single textbook or source. The exams are intended to reflect the current accepted practices of EMS. Fortunately, most textbooks are up-to-date and written to a similar standard; however, no single source thoroughly prepares you for the exam. You are encouraged to consult multiple references, especially in areas in which you are having difficulty.
  • You do not need to be an experienced computer user or be able to type to take the computer based exam. The computer testing system has been designed so that it can be used by those with even minimal computer experience and typing skills. A tutorial is available to each candidate at the testing center prior to taking the examination.

PREPARING FOR THE EXAM

Here are a few simple suggestions that will help you to perform to the best of your ability on the examination:

  • Study your textbook thoroughly and consider using the accompanying workbooks to help you master the material.
  • Thoroughly review the current American Heart Association’s Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. You will be tested on this material at the level of the exam you are taking.
  • The National Registry does not recommend a particular study guide but recognizes that they can be useful. Study guides may help you identify your weaknesses, but they should be used carefully. Some study guides contain many easy questions, leading some candidates to believe they are prepared for the exam when more study is warranted. If you choose to use a study guide, we suggest that you do so a few weeks before your actual exam. You can obtain one from your local bookstore or library. Use the score to identify your areas of strength and weakness, then re-read and study your notes and materials for the areas in which you did not do well.
  • The National Registry is not able to provide candidates information about their specific deficiencies.

The Night Before The Exam

  • Do not try to study the night before the exam. If you come across a topic you do not think you know well, there will not be enough time to study it, which will only create a stressful situation.
  • Get a good night's sleep.

The Day of the Exam

  • Eat a well-balanced meal.
  • Arrive at the test center at least 30 minutes before the scheduled testing time. The identification and examination preparation process takes time. You may also need this time to review the tutorial on taking a computer based test. Arriving early will reduce stress.
  • Be sure to have the proper identification, as outlined in your confirmation materials, before you head to the test center.
  • You will not be able to take the exam if you do not have the proper form of identification.
  • Relax. Thorough preparation and confidence are the best ways to reduce test anxiety.

During the Exam

  • Take your time and read each question carefully. The exam is constructed so most people will have plenty of time to finish. Most successful candidates spend about 30 – 60 seconds per item reading each question carefully and thinking it through.
  • Less than 1% of the candidates are unable to finish the exam. Your risk of misreading a question is far greater than your risk of running out of time.
  • Don’t get frustrated. Because of the adaptive nature of the exam, everyone will think their exam is difficult. The CAT algorithm is adjusting the exam to your maximum ability level, so you may feel that all the items are difficult. Focus on one question at a time, do your best on that question and move on.

After The Exam

  • Examination results are not released at the test center or over the telephone.
  • Your examination results will be posted to your National Registry account usually within 2 business days following the completion of the examination provided you have met all other requirements of registration.
  • Log into your account and click on "Dashboard" or "My Application -> Application Status" to view your exam results.

AMERICANS WITH DISABILITIES ACT INFORMATION

The National Registry complies with the Americans with Disabilities Act (ADA) of 1990 and offers reasonable accommodations for individuals with disabilities. Pearson VUE test centers are also ADA compliant. Complete information about the National Registry Accommodations Disability Policy can be found here.