By Anna Lama
I recently presented a workshop on the assessment of professionalism at the Southern Group on Educational Affairs (SGEA) conference. I planned to cover the elements of assessment: developing a framework to define professionalism, discussing successful assessment practices, and reviewing the various tools available to assess professionalism.1 Much to my surprise, the discussion quickly moved into deeper inquiry on student participation, perceptions, and self-identity through the use of peer evaluations on professionalism…
What do your students say about their feedback when they receive it?
Do you talk to students about overly critical reviews? Overly glowing reviews?
How do you deal with initial reactions to those comments?
Are they anonymous?
How do you know they are reflecting and “getting real” with themselves?
Inquiring minds wanted to know! Thankfully, two of my medical students happened to be in the room with me. “I really value the content of the peer evals, and the scale was not difficult to use. This evaluation tool can help students with their teamwork skills which translate really well to working with patients.” Gasp! My heart stopped beating for a moment as my student offered his unscripted reflection. He was able to share a perspective in the most honest and intimate way.
As the assessment director for the West Virginia University School of Medicine, I read hundreds, if not thousands, of pieces of evaluation data per year. While all are important, the most interesting are our student peer-to-peer evaluation and self-evaluation, which specifically address professionalism. The two evaluations are similar in structure: through specific questions, students identify strengths and weaknesses within nine domains, such as honesty and integrity, accountability, and responsibility.2 Unlike most of our standard Likert evaluation scales, this particular evaluation uses a bipolar scale (1 through 7), where the extreme ends of the rubric suggest too much (“7”) or too little (“1”) demonstration of a particular skill. The middle number, “4,” is the ideal rating for the domain.
Students complete the trainee self-evaluation after reviewing their individual peer-to-peer evaluations, which are aggregated and coalesced into one report. All student evaluator information is anonymous. In the trainee self-evaluation, students are free to reflect on their personal assessment of themselves as learners and, hopefully, as future physicians. Here are a few samples:
“I felt that my strengths were in integrating/learning material and participating in group discussions. Though I felt improvement in my communication skills, I want to continue developing them throughout medical school and practice. I hope to provide comfortability and openness in my patient encounters.”
“In order to be the best physician that I can be, I must continually practice and improve upon my communication skills in group clinical settings. From this semester in PBL and moving forward, I must strive to improve at listening and nonverbal communication (body language, posture, facial expressions).”
“One week, I was assigned two learning objectives and only did one of them. It was not intentional; I merely forgot that I had been assigned two. Though this was minor, slipups are largely unacceptable in healthcare, and should always be addressed. I need to be more meticulous when doing relatively mundane tasks.”
“I really enjoyed working with the group we had in PBL and I think I learned from each of them ways to improve my performance next semester and ultimately my role in a team as a physician (one day).”
I know what you’re thinking. No, not all comments are this raw, insightful, and eloquent. “I think I am as good as my peers,” and my personal favorite, “Why do I have to do this?” pop up every year from one or two students. But as a seasoned educator, I know how to use these data as an effective learning tool to model professional self-reflection and purposeful change. It is a balancing act, I tell students, which requires taking the good with the bad.
So, how are you talking to your students about feedback and self-reflection?
Anna Lama, MA, is the Director of Assessment at West Virginia University School of Medicine (WVU SOM) for both undergraduate and graduate medical education. Her interests include curricular development, assessment, and student professionalism, as well as instructor coaching, mentoring, and professional development.
1. Jordan J. Cohen (2006). Professionalism in Medical Education, an American Perspective: From Evidence to Accountability. Medical Education 40: 607-617.
2. Scott Cottrell, Sebastian Diaz, Anne Cather, and James Shumway (2006). Assessing Medical Student Professionalism: An Analysis of a Peer Assessment. Medical Education Online 11(8): 1-8.