Helping students bridge their cognitive competence gap: Effectiveness of a faculty development workshop on ‘giving feedback’. A mixed methods study
Correspondence to JYOTSNA AGARWAL; jyotsnaagarwal.micro@gmail.com
[To cite: Agarwal J, Singh V, Singh MK, Chacko TV. Helping students bridge their cognitive competence gap: Effectiveness of a faculty development workshop on ‘giving feedback’. A mixed methods study. Natl Med J India. DOI: 10.25259/NMJI_720_2023]
Abstract
Background
Although the National Medical Commission (NMC) mandates regular feedback for undergraduate students, it is not often practised. Lack of time and of knowledge on giving feedback are reasons often cited by teachers. In a mixed methods study, we assessed the effectiveness of a training workshop for faculty on the ‘feedback process’ and the effect of giving constructive feedback to students in microbiology on bridging learning gaps.
Methods
A core team of nine facilitators was trained in the importance of feedback and methods to provide feedback. The Kirkpatrick model was used for measuring training effectiveness. Sixty-two consenting students attended a regular teaching session for a pre-decided competency, followed by a reasoning-based assessment. Students scoring <60% (group A) received individual feedback, and those scoring >60% (group B) received feedback in groups. This was followed by another regular teaching session for a related competency and an assessment. Reflections were noted from both students and facilitators. Themes were generated and satisfaction indices calculated.
Results
Facilitators were happy with the workshop and felt satisfied with the feedback session conducted by them. There was a significant improvement in the performance of students after the feedback session, especially for group A (median score pre-feedback=4.5; post-feedback=7.5; out of 10). Most (93.5%) students strongly agreed that the feedback session was helpful in making them aware of their learning gaps (satisfaction index=93.54) and facilitated the bridging of the gaps. Students wanted feedback sessions to continue across all competencies and other subjects as well.
Conclusion
It is possible to improve student performance by conducting a faculty development workshop emphasizing the need for and the process of giving feedback.
INTRODUCTION
Cognitive theorists have shown that feedback assists learners in reconstructing their knowledge, changing performance, and feeling motivated for future learning.1,2 On the teacher’s part, it also conveys an attitude of concern for the learner’s progress and development. During feedback, the gap between desired performance and the present state of knowledge, skill, or understanding can be recognized, and actions to be taken by learners to close that gap can be suggested.3–5 The new competency-based medical education (CBME) curriculum, as mandated by the National Medical Commission (NMC), also includes incorporating feedback and formative assessments. However, medical students are often not given feedback in their teaching/learning sessions.6 Lack of time and motivation are often the reasons cited by the facilitators/teachers, besides the lack of formal training on how to give feedback.5–7 Some previous studies have focused on ‘structured feedback’ in various clinical and pre-clinical setups.8–11 However, none speak about training the teachers/facilitators on the process of giving feedback. The most widely accepted model for giving feedback in medical education is based on Pendleton’s rules,5,7,8 which is a modified ‘Feedback Sandwich’ where students are actively involved through structured reflections on their performance. It is done in a way that positives are highlighted first, to create a safe environment.5,8–10,12 These can be practised well even by beginners and less experienced teachers.
We tried to elicit the utility of a formal training workshop on the feedback process for young facilitators in microbiology, and assessed the effect of giving structured constructive feedback to undergraduate students on improving their cognitive competence. We also assessed the impact of the faculty development programme beyond the workshop, at the workplace, in terms of improved learning by students in the new CBME curricular paradigm. This will encourage others to similarly generate evidence that critical new processes, such as feedback for learners in the CBME curriculum, actually work and can be implemented elsewhere.
METHODS
We evaluated the effectiveness of a faculty development programme (workshop on giving feedback) based on the Kirkpatrick model framework13 using convergent parallel mixed methods for evidence generation in two stages: first on the participants of the workshop (reaction, learning) and then testing the effectiveness of their transfer of learning at their workplace to improve the learning of 62 phase II MBBS students (Fig. 1).

FIG. 1. Using the Kirkpatrick model of measuring training effectiveness: Indicators and data sources
The study was conducted from July to November 2022 in the microbiology department of a teaching hospital in northern India. Necessary approvals were obtained from the institutional ethics committee. Two motivated microbiology faculty members and 7 senior residents (post-MD microbiology) were chosen as the core team of facilitators; the terms of 3 trained senior residents ended before the study could begin, so the core team ultimately consisted of 6 members. Written informed consent was taken from the phase II MBBS students, and only those who consented participated in the study. Students were told that the assessment done as part of this study would not be included in the internal assessment. The methodology is explained in a flow chart (Fig. 2).

FIG. 2. Flowchart of steps used in the intervention
A workshop for the core team of facilitators was conducted by a peer expert from the institute’s medical education unit (MEU) to create awareness of the importance of giving feedback in the new CBME curriculum paradigm and of how it should be delivered. The workshop content was validated through discussions involving the principal investigator and other experts from the MEU. Pre- and post-workshop knowledge, attitude, and perceptions about ‘giving feedback’ were elicited from all members of the core team on a predesigned and validated questionnaire based on a 5-point Likert scale (1=strongly disagree to 5=strongly agree), with a few open-ended questions to elicit participants’ perspectives in their own words for qualitative data. Various principles of giving feedback to students, as proposed by Ende7 and by Pendleton’s rules,5 were deliberated in this 3-hour-long interactive workshop with role plays; a case-based example from society was taken to emphasize the importance of feedback in enhancing learners’ performance. Resource faculty gave valuable suggestions and discussed various doubts shared by the participants regarding the method of giving feedback.
Participating students were sensitized to the ‘feedback sessions’ through an interactive meeting. From a total of 200 students, with a 10% margin of error and a 5% level of significance, the required sample size was 62; allowing for a 5% loss of student information during the study period, the target sample size was 65. The structured feedback programme was then introduced to 62 consenting students and involved routine didactic lectures on two related topics in microbiology (influenza virus and coronavirus, including Covid-19), each followed by assessment with reasoning-based short questions. After the first assessment, the trained core team held verbal feedback sessions in small groups of 5 to 6 students for those who scored more than 60%, and one-on-one feedback sessions for those who scored <60%.
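A sample size of this order can be arrived at with a standard finite-population calculation; the sketch below assumes Cochran’s formula with a finite-population correction (an assumption — the authors do not state which formula they used, so the numbers here are illustrative rather than a reproduction of the reported calculation):

```python
def cochran_fpc(N, e=0.10, z=1.96, p=0.5):
    """Cochran's sample-size formula with finite-population correction.

    N: population size, e: margin of error,
    z: z-value for the confidence level (1.96 for 5% significance),
    p: assumed proportion (0.5 maximizes the required size).
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)  # infinite-population estimate
    return n0 / (1 + (n0 - 1) / N)          # correct for the finite population

n = cochran_fpc(200)  # 200 phase II MBBS students
print(round(n))       # prints 65
```

Under these assumptions the formula lands close to the study’s final target of 65 students.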
The feedback sessions were carried out within 2 days of the first assessment. Students were informed in advance about the time, place, and name of the facilitator, and all sessions were planned within the time frame of microbiology tutorials. Two related competencies at the ‘know how’ level were chosen to pilot the process and see the effect of feedback on cognitive competence. The second competency was taught within 20 days of the feedback session and was followed by another assessment. Reflections on the structured feedback sessions were noted from participating students and the facilitators using questionnaires based on a 5-point Likert scale (1=strongly disagree to 5=strongly agree), with a few open-ended questions; separate forms were designed for each group, both of which were peer and expert validated. These were administered after the 2nd assessment scores had been evaluated by the core team and shown to the students.
Data entry and analysis
Collected data were entered into Microsoft Excel 2016. All responses from the facilitator and students were analysed in terms of percentages, satisfaction index (SI), and medians, and represented in graphs. SI was calculated for each item (SI=cumulative score achieved/maximum possible score ×100). The rubric scoring format was used to evaluate the short, structured questions attempted by students for the competencies assessed. Further, transcripts were prepared for qualitative data (responses to open-ended questions), themes were identified, and a thematic map was prepared.
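The satisfaction-index formula above can be sketched for a single Likert item as follows; the respondent counts and scores here are illustrative examples, not the study’s raw data:

```python
def satisfaction_index(scores, max_per_response=5):
    """SI for one Likert item: cumulative score achieved over the
    maximum possible score, expressed as a percentage."""
    return sum(scores) / (len(scores) * max_per_response) * 100

# Hypothetical item: 50 respondents answer 5 (strongly agree),
# 10 answer 4 (agree), and 2 answer 3 (neutral) on a 1-5 scale.
responses = [5] * 50 + [4] * 10 + [3] * 2
print(round(satisfaction_index(responses), 2))  # prints 95.48
```

An item answered ‘strongly agree’ by every respondent would yield SI=100, matching how the maximum is defined in the formula.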
RESULTS
Workshop on ‘feedback’ for the facilitators
A 20-point Yes/No/Don’t know questionnaire was given to all 9 facilitators before and after the workshop. Most facilitators present at the workshop did not know about Pendleton’s method of providing feedback. On analysis of the transcripts, the biggest limitations expressed by the facilitators were the choice of words and feedback delivery. After the workshop, most were able to answer the questions satisfactorily.
Effect of ‘feedback workshop’ on cognitive competence of students (reflecting Kirkpatrick level 4: teacher’s transfer of learning to practice)
Group A. Students scoring <60% in the pre-feedback test: The post-feedback assessment (conducted after the 2nd competency was taught) showed a significant increase in the marks obtained by students, with the median score rising from 4.5 to 7.5 out of 10 (6±1.5).
Group B. Students scoring >60% in the pre-feedback test: The 32 students who scored >60% in their pre-feedback assessment (median 7.5 marks) scored a median of 8.5 marks in the post-feedback assessment (done after the 2nd competency was taught). Five students who scored good marks in the first assessment did not appear for the feedback session; hence, data for 27 students were analysed (7±0.5).
Students’ reflections about feedback
Most (93.5%) students strongly agreed that the feedback session made them aware of their learning gaps (SI=93.54) and facilitated bridging them. Similarly, 82% of students strongly agreed that feedback helped in better understanding of the topic (SI=95.4) and in better learning and retention (SI=94.5), and provided a guide for further improvement (SI=95.4). A significant number of students perceived that feedback made them aware of the right way to attempt answering the questions (SI=94.88), prompted them to look for more resource material, and made them feel more confident in facing the examination. Most (95%–97%) students strongly agreed that they felt satisfied with the feedback delivery (including the friendly manner, sufficient time, and positive construct; Fig. 3).

FIG. 3. Students’ reflections on the role of feedback
The main themes identified from the open-ended question ‘Why do you want to attend more such sessions?’ in the student feedback questionnaire were: improved understanding of the topic (23%), improvement in answer writing (25%), mentor-mentee centric (19%), understanding weak points (14%), self-motivation (11%), and making the learning process more logical and scientific (8%). The majority (58%) of the students believed that the correct time to receive feedback was immediately post-assessment, so that they could correlate it with the answers they wrote. While 77% of students said feedback sessions were needed for both theory and practical classes, 69% stated that feedback is needed not only for incorrect responses but also for correct ones, to further improve their understanding.
Workshop participants’ reflections
All participants found the feedback workshop to be a very interesting and unique experience that helped them learn various principles of giving feedback. All 6 facilitators strongly agreed that the feedback session was helpful in identifying and bridging the learning gaps of students. Further, they agreed or strongly agreed that the response of the students to the feedback session was enthusiastic (SI=100). The facilitators disagreed that the feedback session was time-consuming or that it added an extra burden to the routine schedule.
DISCUSSION
Feedback is known to be one of the most important forms of interaction between the ‘teacher’ and the ‘learner’. Ours is a small educational intervention study generating evidence about the usefulness of a feedback training workshop for facilitators and the effect of feedback in bridging students’ learning gaps. We also looked at the perceptions of facilitators and students regarding the feasibility and acceptability of ‘feedback’ as a tool to improve cognitive competence, as this is important for the educational intervention to be accepted and institutionalized.
The initial step in the study was to record the facilitators’ knowledge, attitude, and perception of ‘feedback’ before and after the training workshop. The facilitators acknowledged the necessity of feedback but did not know how to give it, including the choice of words. Although new faculty and young tutors have attuned themselves to the CBME curriculum and its demands, most lack the training and confidence to provide feedback to students. There is a paucity of studies that report training of facilitators and recording their baseline knowledge and perceptions.14 The improvement in scores on feedback delivery methods after the workshop, and the increase in confidence in giving feedback to students, indicate that the faculty development workshop is needed, feasible, effective, and replicable in achieving institutional goals as per the NMC.
Among the 62 student participants, 32 scored >60% marks and received feedback in groups of 5–6 students. An increase of 1 mark, from a median of 7.5/10 (pre-feedback) to 8.5/10 (post-feedback), was observed. At the same time, a significant increase in marks (from a pre-feedback median score of 4.5/10 to a post-feedback median of 7.5/10) was seen for the 30 students who scored <60% in the assessment before the feedback. This is also the group that received individual feedback. Alkhateeb et al., however, reported that among fifth-year medical students, a single formative assessment does not necessarily lead to better performance in subsequent sessions.15
Though giving feedback has proven to be an important component of learning, a recent study by Karol and Pugh found that many factors, such as the emotional reactions that feedback evokes, may impact its effect.16 In our study, none of the students felt that they experienced negative emotional reactions such as embarrassment or anxiety about the verbal feedback received, or that it negatively impacted them in any way; in fact, students appreciated that within a day or two of the assessment they got to know the deficiencies in their learning process and clarified what they had misunderstood. The various themes generated are depicted in Fig. 4.

FIG. 4. Themes that emerged from thematic analysis of student reflections on the feedback process
Workshop participants who gave feedback to students felt satisfied with the feedback sessions. These results are similar to those of Brazeau et al., where faculty members found the process of giving feedback to students educationally satisfying.17 Force-field analysis of the reflections questionnaire administered to facilitators highlighted themes such as ‘improved teaching skills’, ‘improved confidence in giving feedback to students’, and ‘first time understood that in feedback, provision of a plan and blueprint is also important’. These findings are similar to those of a previous study by Sulaiman et al., wherein both students and clinical tutors valued the experience.18 In our study, both students and facilitators wanted the feedback process to be continued and expressed that it is necessary in other subjects as well.
We feel that to study the long-term beneficial effect of the training workshop and the feedback process, and to substantiate the themes generated in this study, a longer follow-up with a larger sample size, across various competencies in the subject, will be needed. The study’s strengths were the effective training of facilitators to make them understand the importance of appropriately given feedback in developing a better educational environment, and the choice of two competencies to truly reflect feedback’s effect on the learning process. Our challenge was to identify a receptive core team that was willing to be part of this ‘change’.
To conclude, although not practised often by faculty, feedback is an invaluable component of CBME as proposed by NMC. A definite change in knowledge, attitude, and perceptions of facilitators was noticed after they attended the workshop on feedback. We need to focus more on training our resource faculty/facilitators on the importance of feedback and its delivery methods for a more impactful and sustainable feedback programme, which is well integrated in the curriculum.
Conflicts of interest
None declared
References
- Self-assessment in the health professions: A reformulation and research agenda. Acad Med. 2005;80:S46-S54.
- 'I'll never play professional football' and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28:14-19.
- Effective feedback in medical education. In: Bhuiyan PS, Rege NN, Supe A, eds. The art of teaching medical students (3rd ed). New Delhi: Elsevier; 2015. p. 248-60.
- Giving feedback in the new CBME curriculum paradigm: Principles, models and situations where feedback can be given. J Educ Technol Health Sci. 2021;8:76-82.
- Giving feedback. In: Dent JA, Harden RM, eds. A practical guide for medical teachers (5th ed). Edinburgh: Churchill Livingstone, Elsevier; 2009. p. 281-8.
- The consultation: An approach to learning and teaching. Oxford: Oxford University Press; 1984. p. 101-4.
- Giving feedback in medical education: Verification of recommended techniques. J Gen Intern Med. 1998;13:111-16.
- A five-step 'microskills' model of clinical teaching. J Am Board Fam Pract. 1992;5:419-24.
- A method of providing engaging formative feedback to large cohort first-year physiology and anatomy students. Adv Physiol Educ. 2016;40:393-7.
- Effectiveness of structured feedback after formative tests on first-year MBBS students' performance in summative examination. Int J Adv Med Health Res. 2021;8:70-4.
- Evaluating training programs: The four levels. San Francisco: Berrett-Koehler Publishers; 2006. p. 259-61.
- Introduction of structured feedback to medical undergraduate students in the first professional. Int J App Basic Med Res. 2021;11:21-6.
- Effect of a formative objective structured clinical examination on the clinical performance of undergraduate medical students in a summative examination: A randomized controlled trial. Indian Pediatr. 2019;56:745-8.
- Potential of feedback during objective structured clinical examination to evoke an emotional response in medical students in Canada. J Educ Eval Health Prof. 2020;17:5.
- Changing an existing OSCE to a teaching tool: The making of a teaching OSCE. Acad Med. 2002;77:932.
- Group OSCE (GOSCE) as a formative clinical assessment tool for preclerkship medical students at the University of Sharjah. J Taibah Univ Med Sci. 2018;13:409-14.