Issues impeding satisfactory outcomes with the competency-based medical education program for undergraduates, and some possible solutions
[To cite: Ananthakrishnan N. Issues impeding satisfactory outcomes with the competency-based medical education program for undergraduates, and some possible solutions. Natl Med J India 2026;39:184-7. DOI: 10.25259/NMJI_1023_2023]
Abstract
The competency-based medical education (CBME) curriculum for undergraduates, announced by the National Medical Commission (NMC) in India in 2019 and subsequently implemented by all medical colleges, has by and large produced unsatisfactory outcomes because of the many difficulties colleges have experienced in implementing it. The program lists over 2800 competencies in which students are to be trained, over 800 skills, and over 100 skills requiring mandatory certification before the summative examination. The greatest reason for failure, however, is that the assessment process has undergone no major change with CBME. Workplace-based assessment methods applied on a continuous basis are well known and widely used in other countries, but they are yet to be widely accepted and notified by the regulatory agency for the Indian undergraduate medical student. We focus on the reasons for this unsatisfactory state of affairs in CBME assessment, which has led to deficiencies in the intended outcome. Major policy changes in assessment, such as revision of the lists of competencies and skills along with adoption of a combined formative-summative assessment process based on learning place-based assessment methods, could ensure the creation of an adequately trained Indian medical graduate.
INTRODUCTION
The introduction of competency-based medical education (CBME) for undergraduate education in 2019 was an important milestone for medical education in India.1 In 2018, for the first time, the concept of an Indian medical graduate (IMG) was clearly defined, requiring 5 attributes, viz. (i) a clinician, (ii) a leader and member of the healthcare team, (iii) a communicator, (iv) a lifelong learner, and (v) a professional.2 These attributes are similar to those of the medical graduate described elsewhere, as in the Accreditation Council for Graduate Medical Education (USA) or Canadian Medical Graduate Competencies framed by the Royal College of Physicians and Surgeons of Canada. To facilitate the transformation of a medical student to an IMG, the mandatory acquisition of 35 sub-competencies under these 5 heads was published.
Simultaneously, a detailed subject-wise list of ‘competencies’ was released by the National Medical Commission (NMC), amounting to 2884 competencies in 23 subjects, along with a subject-wise list of skills amounting to 873 in the same 23 departments, including dentistry. Of these, 48 skills were mandated to be performed a total of 153 times before being eligible for certification as achieved prior to graduation.1 Strangely, no skills were required for certification in many departments such as community medicine, radiodiagnosis, forensic medicine, orthopaedics, psychiatry, dermatology, venereology and leprology, and general surgery. The conduct of 10 normal labours was the sole mandatory skill required in Obstetrics and Gynaecology.
Realizing the importance of soft skills such as attitude, ethics, and communication, the NMC released the AETCOM module, which focuses on these skills as a vertical curricular thread across the full Bachelor of Medicine and Bachelor of Surgery program.2 Besides operational constraints, implementation of the CBME program has run into many issues due to its intrinsic deficiencies and incomplete nature.
ISSUES WITH THE CBME AND CONSTRAINTS TO ITS IMPLEMENTATION
Several concerns have been raised regarding the CBME as currently notified. Many of the 2884 competencies listed across the various subjects do not truly qualify as competencies in the accepted sense, that is, as multi-domain descriptions of the attributes, abilities, and qualities expected of a graduating student that can be objectively measured and certified. Some were objectives of intent; others were long-answer questions. Very few conformed to the definition of a competency as enunciated by Epstein and Hundert.3 None were stated in a measurable format. The list of 873 skills was too long to be taught and matched neither the needs of the IMG nor the duration of teaching/learning (T/L) in the corresponding subjects. Some subjects, as mentioned earlier, such as forensic medicine, were ignored and had no listed skills to be acquired, although some outgoing graduates may be employed as physicians in primary care settings and may need to carry out basic medicolegal functions such as issuing a wound certificate or a certificate of drunkenness. One would have thought that skills such as splinting and transfer of an individual with a fracture, or passage of a nasogastric tube or a urinary catheter, are really necessary for the day-to-day functioning of the IMG. Although the 48 skills (153 performances) were mandated to be pre-certified before qualification, no protected curricular time was provided for this purpose. Most importantly, the prescribed assessment methods were not changed to match the requirements of the CBME; they remained essentially the same as before its introduction.
There were also other constraints in implementing CBME. Some of these were to be expected with the rapid expansion of the number of medical colleges across the country to over 700 at present. A major issue was the existence of large batches of 250 students per year in many colleges. This resulted in the implementation of the so-called small group T/L activities with 40–50 students per group. Since the requirement of faculty per medical college did not proportionately increase with student intake, there was a shortage of faculty for small group teaching, which the new curriculum mandated. Even existing requirements of faculty per subject could not be met due to the paucity of available, qualified, and eligible personnel.
Other problems were a shortage of clinical resources, owing to an inadequate number of patients and a growing unwillingness among patients to serve as teaching material for undergraduates. Inadequate facilities for small group teaching, such as space, simulation facilities and trained manpower, and the limited time embedded for continuous assessment were major constraints. Inadequate training and experience among teachers in workplace-based assessment (WPBA) methods compounded the problem. Regulatory norms providing only a cursory role for formative assessment, with limited weightage, and the overarching importance of a summative assessment process largely unchanged from the pre-CBME phase did not help matters.
CURRENT ASSESSMENT PROCESS
The deficiencies of the current assessment process are well known but still merit documentation. The internal assessment (IA), with limited weightage, consists of periodic theory examinations modelled on the current university pattern, ward-leaving clinical examinations, objective structured clinical examinations (OSCEs), and a pre-final model examination based on the university pattern. The summative examination comprises theory, consisting of one or two 3-hour papers with long-answer questions, short-answer questions, very short-answer questions, and multiple choice questions (MCQs; usually only about 20 in number) testing only recall. This is followed by a clinical examination with ‘long’ cases, ‘short’ cases, and ‘spotters’. The viva voce is generally unstructured. The weightage for IA in the overall process is 20%. Such a system obviously cannot function effectively in ensuring competency.
REQUIRED CHANGES IN ASSESSMENT POLICY
One of the principal reasons why CBME has not had the desired effect is the lack of concurrent changes in assessment, the principal driving force for learning. In view of the obvious unsuitability of the current assessment process for CBME, there is a need to evolve a workable continuous formative model that ensures validity, feasibility, reliability, and objectivity, and meets the desired educational outcome. In this context, several changes appear necessary in the assessment process.
There needs to be greater emphasis on a continuous formative process with complete alignment of the T/L and assessment process. The theoretical assessment should focus on knowledge and higher-order thinking skills. Such a continuous, regular, objective, theoretical assessment will be in synchrony with the proposed single national exit examination (NExT) and prepare students for that exit examination.
Simultaneously, there should be continuous WPBA focusing on clinical/laboratory skills and the associated cognitive requirements. Teaching and assessment of ethics, professionalism, and communication skills should be aligned with the AETCOM module. Special teaching and learning methods are required to focus attention on IMG requirements such as leadership and lifelong learning skills. As in many other countries, all undergraduate students should be given basic training in research methodology and statistics, and be encouraged to do small projects; this will foster in them the habit of lifelong learning.
To make all this feasible, a combined formative-summative process needs to be created for all skills. This process should be continuous and count towards summative certification. Unless this is ensured, students will not take continuous assessment seriously, going by the adage ‘they don’t respect what you don’t inspect’.4 Assessment tools must be matched continuously with the competency being evaluated, to ensure that what is measured is what needs to be measured, much as WPBA methods are used for postgraduate residents in other countries. For want of a better name, this is referred to in this paper as ‘learning place-based assessment’ (LPBA), since it is used for undergraduate students at their learning place.
A system of continuous, reliable, and certified record-keeping of scores obtained in the continuous formative process, enabling confident certification of skill attainment at the end, is best achieved with an e-portfolio, which is dynamic and provides for two-way interaction, unlike a static logbook. It also allows continuous feedback and promotes reflective learning.
AVAILABLE ASSESSMENT TOOLS
These are numerous and well known.5,6 From this toolbox of assessments, the following are suggested for their ease of use, utility, validity, and objectivity; the list is intended for immediate application across all colleges. Each method should be used with standardized checklists approved by the regulatory agency.
- Knowledge and application of basic sciences: scenario-based case discussion, structured oral examination, MCQs, simulated patient management problems
- Clinical skills: mini clinical evaluation exercise (mini-CEX), standardized patients, blinded patient encounters (like bedside clinics), OSCE, manikins for demonstration of clinical skills
- Analytical skills: scenario-based case discussion, bedside patient-based case discussion, problem-based group work, structured viva voce
- Procedural skills: direct observation of procedural skills, OSCE/objective structured practical examination, standardized patients, manikins for procedural skills
- Soft skills, communication, and professionalism: mini-CEX, OSCE, faculty feedback with standardized patients, peer and near-peer feedback
- Leadership: group exercises with faculty feedback, peer and near-peer feedback
- Lifelong learning: mini projects, assignments requiring students to retrieve information, exercises in evidence-based medicine and changing management
Some of these are more useful in the Indian context, being simpler and requiring less training to implement. Once the process has been set in place, other methods may be introduced after reviewing performance. As mentioned earlier, several other assessment methods are used in other countries for this purpose. These are not detailed here, as there are excellent references pointing out their utility and the need for a different type of assessment for competency-based programs.7,8 The purpose of this article is not to dwell on WPBA methods and the merits and demerits of individual methods, but to highlight the urgent need for a transformative change in the assessment pattern of the medical undergraduate program, so as to create an IMG with the requisite competencies as intended by the 2019 curriculum of the NMC.
ADVANTAGES OF LPBA
The advantages of LPBA are that it is trainee-led, focuses on global skills including soft skills, and tests at the highest level of Miller’s pyramid. Broader, representative sampling yields high validity and reliability, and objectivity is built in through multi-source encounters and the use of structured checklists. It is feasible in most situations, provides the richest feedback to promote reflective learning, and offers an opportunity for longitudinal assessment.
For LPBA to be used continuously, it is necessary that a record is maintained in an e-portfolio which is real-time, allows for feedback on a continuous basis, promotes reflective learning, and keeps a longitudinal, continuous record of performance which can be used for combined formative-summative assessment.
RESPONSIBILITIES OF FACULTY AND STUDENTS
The responsibilities of faculty and students are very important in this respect. The roles of faculty are to identify/define competencies and standards, describe what constitutes acceptable evidence of accomplishment of competence, determine timelines and guidelines for evaluation, monitor progress, mentor, provide feedback, and provide intervention where required for students requiring additional curricular support. The roles of students in this process are to collect evidence, document their learning progress, including reflection on the learning process, defend evidence of accomplishment, and receive and respond to feedback.
Likely challenges to implementation are time constraints, the faculty-intensive nature of the process, limited facilities and case load, absence of standardized patients, absence of a good simulation laboratory, and the need for frequent assessment and continuous monitoring. Faculty may be motivated by enrolling them as willing partners in progress rather than reluctant dragons: convincing them of the need for change, bringing out the advantages of the new system, emphasizing the quality of output and the benefits to candidates, reducing undue workload, recognizing and rewarding them, and making the process as simple as possible.
PROPOSALS FOR CONSIDERATION BY THE NMC
None of the required changes is feasible for universal implementation across the country without direct regulatory approval and mandatory guidelines. The following steps are suggested for consideration by the NMC to ensure uniformity.
Revision of the competency list
It is necessary to re-examine the list of ‘competencies’ in the 2019 notification and rewrite them in a measurable format as per the definition of a competency. Since competencies are multi-domain, broad but descriptive attributes, there should be no need for more than 20–25 competencies per subject, with the overall number not exceeding 500 if this definition is adopted. The NMC should focus on recasting them as higher-order cognitive skills requiring analysis and application of knowledge for problem-solving, and eliminate from the original list all those that are not really competencies.
Revision of skill list
Simultaneously, it is necessary to re-examine the list of skills, since they match neither the needs of the IMG nor the period over which each subject is taught. The number needs to be reduced to not more than about 200; considering the duration of the programme, it is certainly not possible to teach and evaluate the nearly 900 skills listed in the CBME. At the same time, skills that are required by the IMG but find no place in the list should be introduced, for example in forensic medicine, where no skills are listed at all. All listed skills should then be mandated for certification, without a separate certifiable-skills list.
Due to the paucity of teaching material, and to allow students to acquire skills without disturbing patients in the early stages of learning, standardized patients and simulation pedagogy must be embedded in the curriculum and implemented continuously. This, of course, implies identification and training of standardized patients, training of teachers to use standardized patients for teaching and assessment, procurement of simulation equipment, and training in simulation pedagogy. It goes without saying that the curriculum must be modified to give small group activities a major role instead of focusing on large group teaching. Faculty numbers must be adequate for this major conversion of the curriculum from a didactic to a small group activity-based exercise to succeed.
Major revision of the assessment process
The first step, logically, to enable continuous measurement and certification of competencies and skills is to approve the principle, process, and weightage of the LPBA method for combined formative-summative assessment. While a whole toolbox of assessment methods may be available for T/L and feedback purposes, the regulator must prescribe, on the basis of effectiveness, a single tool (and a standardized checklist) for the summative-formative process to ensure standardization between schools; the others may be used for T/L and feedback.
All theoretical assessments from semester 1 should be based on MCQs, particularly items testing higher-order thinking skills (HOTS). NExT should be the single summative theoretical examination. There should be no summative practical examination; it should be replaced by periodic summative-formative LPBA exercises. All clinical assessment should be based on LPBA, with purely formative clinical examinations used for T/L and feedback only. Records of summative-formative clinical examinations are to be maintained in an e-portfolio. There need be no summative clinical examination, as is the practice in countries such as the USA. However, if considered necessary, a single exit OSCE, with a weightage not exceeding 25% of total clinical marks, can serve as the final summative examination.
Faculty development
All these measures would succeed only with extensive and mandated faculty development focused on LPBA, like that conducted for implementing the AETCOM module. Except to pedagogists, the process, types, and usage of WPBA are unfamiliar to the large majority of faculty, since these methods have not been in place earlier and have not been covered with any regularity or focus in previous faculty development programmes, even in institutions where they have been used. This is because of the lack of emphasis by the regulatory agency in the past. The summative examination pattern is entirely controlled by the NMC, and no institution has hitherto been allowed to make any changes; hence, the emphasis must come from the top. Since WPBA (or, in this case, LPBA) is new, a detailed training curriculum has to be prepared in which faculty, in groups, are introduced to the concept of WPBA, the different methods already in existence for various competencies, the steps of each method, and its utility, along with a detailed discussion of their relevance for ensuring competency. This is best done in small groups, as is the practice for the AETCOM module. A massive effort is required to cover all medical colleges and the existing faculty, and it may take a couple of years before the NMC can notify the change in assessment as a universal practice. Faculty development programmes should also emphasize the need to maintain a continuous, valid, longitudinal record of student performance. Faculty must be introduced to the concept of e-portfolios and their advantages over a logbook for this purpose, and institutions must invest in resources and faculty training to make this feasible. Only if a continuous, satisfactory record is maintained can it serve as a valid, reliable, and objective combined summative-formative process.
Simultaneously, there should be training in the construction and pre- and post-validation of MCQs, focusing on items testing HOTS. Such regular examinations will simultaneously prepare students for NExT, which is due to be implemented as the single summative theory examination for the undergraduate programme. Both measures have to be undertaken on a war footing, since 4 years have already passed since the introduction of the new CBME curriculum.
CONCLUSION
The above measures, if implemented in full, would go a long way in ensuring the success of the CBME program and creating the IMG we desire, as per the attributes and sub-competencies mentioned in the CBME document. It has been 7 years since the introduction of CBME, and immediate action is required to avoid further deterioration in the quality of medical education and to achieve the laudable objectives of the CBME. A system of assessment needs to be introduced that is in tune with the intent of the CBME in creating the IMG, that can be practised continuously over the full duration of education, and that permits teaching and assessment of all the listed competencies in the learning place. For this, a combined formative-summative assessment process needs to be put in place, with the record of continuous assessment maintained in a dynamic e-portfolio. It is time to move on from the all-encompassing single summative examination.
Conflicts of interest
None declared
References
1. National Medical Commission. Competency based undergraduate curriculum. 2019. Available at www.nmc.org.in/information-desk/for-colleges/ug-curriculum/ (accessed on 10 Aug 2023).
2. National Medical Commission. Attitude, Ethics and Communication (AETCOM). 2018. Available at https://www.nmc.org.in/wp-content/uploads/2020/01/AETCOM_book.pdf (accessed on 10 Aug 2023).
3. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002;287:226–35.
4. Combined formative and summative professional behaviour assessment approach in the bachelor phase of medical school: A Dutch perspective. Med Teach 2010;32:e517–e531.
5. Norcini J, Burch V. Workplace-based assessment as an educational tool. Med Teach 2007;29:855–71.
6. Toolbox for assessment of clinical competence. Chandigarh:Unistar Books Pvt Ltd; 2023.
7. Core principles of assessment in competency-based medical education. Med Teach 2017;39:609–16.
8. Enhanced requirements for assessments in a competency-based time-variable medical education system. Acad Med 2018;93:S17–S21.