Co-authors: Helle-Mai Lenk, Emiri Oda, Diane Maratta
This post, co-authored by McGill instructor Helle-Mai Lenk, her former student Emiri Oda, and Diane Maratta, a Learning Technology Consultant with McGill’s Teaching and Learning Services, describes the implementation of Perusall, a tool for engaging students with course readings by having them write online, asynchronous, in-context annotations to which peers can respond.
Helle-Mai: When asked what they find particularly challenging about their first year at McGill, the students in my McGill Writing Centre (MWC) courses inevitably mention the amount of reading. They feel overwhelmed not only by the volume but also by the complexity of academic discourse. They look up unfamiliar vocabulary, pick apart complex sentence structures, and highlight what they consider key points, but it’s a long and arduous process. Often they’re bored; sometimes they get lost. In the courses I teach at the MWC, I aim to help students develop more effective strategies for reading academic texts, for example by providing an active reading environment, which I feel improves engagement and comprehension. But how to tackle the challenge of getting students not only to read, but also to engage with the course readings? I sought help from Teaching and Learning Services (TLS).
Diane: During class time, many instructors use teaching strategies, such as inquiry and discussion, to engage students with course materials. However, instructors are often uncertain about what strategies to use outside of class time. Common questions resurface, such as “How can I get students to generate and share knowledge outside of class?” or “How can I get students to engage with the content?” So, I often meet with instructors to guide them through the process of selecting and implementing online strategies that engage students outside of class time. Helle-Mai came to TLS searching for such a strategy: to have students actively engage with course readings. I suggested Perusall. Perusall is a collaborative annotation tool that lets students read and mark up texts interactively online in small groups. Pedagogically, the tool has the potential to add value to students’ learning. It can enhance critical inquiry and foster social learning, and it offers the potential for debating perspectives and interpretations.
Since my role as a learning technology consultant involves guiding instructors in choosing educational technologies and implementing them effectively, I have to ensure that the technology does what it claims to do, both as a piece of software and, pedagogically, as a teaching or learning tool. At TLS, we tested Perusall to determine the learning curve required to use it both for instructors and students, the assessment value of the tool, and the potential for social learning. Such testing allows us to have an informed conversation with the instructor about the best way to implement the tool. Prior to implementation, Helle-Mai and I had some additional conversations about the way she would use the tool with her students, such as the size of the student groupings in Perusall.
Helle-Mai: In Fall 2016, I piloted a collaborative assignment with Perusall in one of my academic writing courses for students whose first language is not English. Academic English 2 (CESL 300) usually attracts students who are in their first semester. One of the assignments in this course is a 750-word problem-solution essay where the problem to be addressed is plagiarism. As sources for this essay, students are restricted to a list of ten readings on this topic, both scholarly and lay, from which they must cite a minimum of five. At least two of the sources cited must be scholarly articles on the list. For many students, this is their first exposure to peer-reviewed academic articles. Although we prepare for reading by examining the structure of scholarly texts and going over some pre-reading strategies, the actual reading of a specific scholarly article is done online using Perusall. Since the article chosen is one of the readings on the list, the students are motivated to read it because it will help them complete the essay assignment. Also, as plagiarism is a problem that engages the students personally, they’re interested in the subject matter. Finally, since they’ve already had an in-class discussion about the problem of plagiarism, they’re interested in comparing the solutions they came up with to the ones proposed in the reading by two university professors.
Intentionally, I kept the instructions for the collaborative annotation assignment broad and inviting.
For this assignment, you will annotate a text (Teh & Paull, 2013) collaboratively with two of your peers. The purpose of collaborative annotation is to promote critical reading and to stimulate discussion about the topic of Essay 2.
Today I will send you an access code. Log on to Perusall.com using Facebook, Google or Twitter. Enter the access code. Open the document and begin to mark up the text with marginal notes.
You can argue with the text, raise questions, make connections to other readings, evaluate it, extrapolate from it, ask for help if it is difficult to understand, assess its relevance to the essay topic, connect it to personal experience, express excitement at discovering new or surprising ideas, or any other reaction.
In order to get 3 marks, your annotations should be more or less evenly distributed throughout the text and the time period. They should include new comments and questions as well as replies and comments to existing threads.
The annotation period will close on Thur. Oct. 20 at 9 pm.
My goal was for students to read the entire 16-page article, engage with the content and develop knowledge by sharing ideas. I think I succeeded. Over the three-day assignment period, all 24 students in my class participated, and the academic article was marked up with almost 600 comments! Many of these comments illustrated insightful thinking, and the comment threads suggested that students’ ideas were evolving as a result of peers’ comments.
Example annotations on the Perusall site (https://perusall.com/downloads/scoring-examples.pdf)
Emiri: Perusall definitely supported my learning, as I could access the readings collaboratively and annotate documents with ease. My peers and I could share opinions, discuss, and ask questions of one another. Since users can express their opinions and thoughts about whatever topics intrigue them, I discovered new viewpoints and problems through others. Furthermore, the discussions I had with my classmates showed me different approaches to controversial issues, which brought new perspectives and solutions to mind. Also, asking and answering questions about the reading helped our learning and deepened our understanding of the material in the document.
Even though it was my first time using Perusall, I found it rather user-friendly. Firstly, logging into the software with Google, Facebook, or Twitter is highly convenient. Secondly, adding comments to the document is fairly easy: users only have to select the sentences they want to comment on, and a comment box automatically appears. It was never difficult to keep up with new comments, since Perusall sends notifications. Lastly, the design of the software is very simple. On the reading screen, there are only a few function buttons beside the document; they are easy to use and never distract users from reading.
With Perusall, I can underline and take personal notes on the document on screen. It reduces the time and cost of printing, so it’s environmentally friendly. If more courses used Perusall, I would be able to keep all my assigned readings online; this would make it easier for me to revisit the readings and my notes whenever I want. In addition, the auto-save function prevents me from losing my study notes and highlights to sudden system errors. Overall, Perusall has enabled me to learn more efficiently with my peers, and I would like to keep on using such an easy-to-use learning platform.
Diane: If you’re a McGill instructor interested in engaging your students in their learning outside of class time, take a look at this Educational Technology Ecosystem for inspiration. Thinking of adopting a new educational technology? Use the Technology Implementation Plan (TIP), a thinking guide for implementing new educational technology in your class. For additional guidance on implementing educational technologies, request a consultation with a learning technology consultant at TLS. Finally, check out other strategies for getting students to engage with course readings on TLS’ How do I …? page.
References
Teh, E. C., & Paull, M. (2013). Reducing the prevalence of plagiarism: A model for staff, students and universities. Issues in Educational Research, 23(2). Retrieved from http://www.iier.org.au/iier23/teh.pdf
How do you engage your students with course readings?
Maryellen Weimer published a short article, “Getting Students to Take Responsibility for Learning,” on the Faculty Focus blog.
I’ve been writing for years that we need to teach in ways that encourage students to take more responsibility for their learning. Recently, it became clear that my thinking on this needed more detail and depth. I’ve been saying that it means students should be doing the learning tasks that make them stronger learners. They should be figuring out what’s important in the reading, rather than having the teacher tell them. They should be taking notes rather than expecting to get the teacher’s slides and notes.
It was a question in a workshop that made me realize my answer wasn’t wrong, just incomplete. “In a formal learning situation, like a course, what responsibilities do students have?” After further reflection, my answer to that question is that the responsibilities exist across three areas.
Students do have a responsibility in the teaching and learning process, and Weimer provides some insightful ideas on how to think this through.
Do you have ideas on getting students to take responsibility for learning in your class? Post them below!
On May 5, 2017, McGill’s Assessment and Feedback Group held an event entitled Getting students to focus on the questions, not the answers as part of its Brown Bag Series. To an audience of peers, two instructors described assignments they use in their courses that call upon students to create questions as a means for engaging them with course content and getting them to think about how they learn.
Below, Penelope Kostopoulos, a Faculty Lecturer in the Department of Psychology, describes her assignment. Carolyn Samuel, formerly a Senior Faculty Lecturer at the McGill Writing Centre, describes her assignment in a post called What’s the prof gonna ask?
Turning the tables: Getting students to ask the questions
Let me introduce myself. My name is Penelope and I’m a member of the Assessment and Feedback Group @ McGill. I’m part of this group because I teach large introductory classes—200 to just over 650 students—and I love it!
What? You love teaching in ginormous auditoriums where students don’t know each other?
Yes! As daunting as it might sound, I love the challenge this represents. I want to harvest the great potential that lies in my students. I want students to actively engage with the course material and support each other’s learning.
Sure, I can ask students to pair with their neighbor to work on assignments during class time to encourage peer learning, but anyone who has attempted such learning activities in a large classroom will tell you that half the class will pretend not to understand where to find their neighbor and will stare into space. In small classes, professors can go to students and facilitate interactions but, let’s face it, the vision of me jumping over seats to get students who are sitting silently next to each other to start conversing is not one I would like to foster.
The idea for this assignment came to me while I was reading articles on teaching undergraduate psychology classes: I discovered PeerWise (Denny et al., 2008), a free web-based platform that allows students to create, share and evaluate multiple choice questions related to course material. I decided then to design an assignment where I turned the tables: instead of me giving students questions to answer, I assigned students the task of creating questions for each other to answer.
Assignment description – The devil is in the details
Counting towards 10% of their final grade, the assignment asks students to create multiple choice questions on the course material on two separate dates and then answer each other’s questions prior to the midterm and final exam dates. Students receive grades both for creating and for answering and evaluating each other’s questions. Simple to set up and implement, and a piece of cake to grade.
Students create six multiple choice questions related to the course material. Each question has to include one correct answer and a minimum of three distractors. Students provide an explanation for the choice of correct answer and reasons why the distractors are incorrect.
Students then answer 20 questions created by their peers, rate them in terms of difficulty (easy, medium, hard) and quality (with a numeric rating), and provide constructive comments to the author. All of this is done anonymously, which promotes broader student participation in conversations. Note that, as the instructor, I can identify the students and intervene in conversations where students are clearly struggling with concepts or where questions are inappropriate. The platform allows students to filter questions: they can choose which ones to answer based on popularity among peers, level of difficulty, and number of comments received. Most importantly, students can filter questions by topic in order to hone in on areas where they feel they need more practice.
Students by and large create the required number of questions plus or minus a couple, but many students answer more and a handful of students have come close to—or even surpassed—answering 1000 questions! Overly eager? Maybe!
I think the gaming interface that awards badges to students for different actions helps. The gamification of something as dreaded as studying can be extremely motivating for students. As an instructor, I’m just happy to see students actively involved with the course material.
As an additional motivator, I let students know that I include at least one student-created question in each exam. How do they respond to that? I will let one of my students answer: “One of my questions got used on the final exam! It was amazing!!!”
Student reaction to the assignment
Change in the classroom can be a challenge and is not always well received by students. I wanted to know how students felt having the tables turned with this assignment. So I surveyed them. To my delight, students’ responses have been overwhelmingly positive. They find the assignment:
- easy to complete on the online platform (92%)
- worth the marks allocated (86%)
- helpful in increasing their understanding of the course material (86%)
- helpful in encouraging them to think critically about the material (83%)
Fewer students, however, find the assignment helpful in preparing them for the course exams (66%) … But more on this later.
In addition to the positive survey responses, students indicated that they gained a better appreciation of how difficult it is to create good multiple choice questions. They told me that they learned to value instructors’ efforts in designing their exams. Okay, okay … they might not have used those exact words. Turning the tables did, however, give them the opportunity to see things from the instructor’s perspective, and they appreciated that.
Is the assignment effective?
Let’s recap: I designed the assignment to get students to actively engage with the study material and to encourage peer learning. I also wanted students to develop their skills in forming questions. Most importantly, I wanted to cultivate students’ metacognitive skills by having them reflect on their learning.
So, while you’ll probably agree that, in theory, this assignment is great for achieving these outcomes, you might still wonder: Does it work? Does it really engage students in learning? To my delight, students do get involved in conversations about the choice of distractors or the applicability of a question to course material, and they do so in a constructive and respectful manner. It is not uncommon for a question to receive over thirty comments in a class of 200 students. These conversations definitely encourage peer learning. Students also indicate having a better grasp on their learning and how they are doing in their progress with this assignment. But you don’t have to take my word for it. Here is a comment from one of my students: “I think that it’s definitely a great study tool that is made by the students through a collaborative effort. It’s a great idea.”
And do students use the assignment to prepare for course exams? The graph below shows PeerWise activity for one of my classes. Students clearly used the assignment to prepare for exams. However, in their feedback, many students indicated that it did not actually help them obtain better grades on the exams. With proper ethics approval, it would be worthwhile to find out whether or not that was really the case.
Lessons learned and words of wisdom
The assignment has gone through some revisions over the course of three years and five classes in which I have used it. Live and learn. Here are some of the updates that I think are worth sharing:
- It’s important to not only explain the assignment and show students how to write multiple choice questions, but also to provide students with detailed assignment instructions and written guidelines on what constitutes a good and a bad multiple choice question and how to go about creating good questions. Teaching assistants have always been helpful with this part.
- Simply stating assignment requirements is not enough to get students to complete the task according to the instructions, especially given the multiple deadlines. I now use many different ways to highlight information in the instructions document: I use BOLD, underline, change fonts, increase font sizes, and repeat, repeat, repeat anything related to grading.
- Following a discussion I had with the always helpful staff at Teaching and Learning Services (TLS), I have added a twist to the assignment: I ask students to create questions at different levels of learning. Levels of learning are explained to students using Bloom’s Taxonomy. Students have to create four questions that tap into lower level learning (i.e., questions that involve remembering and understanding) and two questions that tap into higher level learning (i.e., questions that involve applying, analyzing and evaluating).
And my final words of wisdom: If you decide to use an assignment like this in your course, encourage your students to be creative and I promise you will be rewarded. Usernames like one student’s choice of “afreudtolove” for my introductory psychology course make me smile. And I have a collection of Donald Trump questions from Fall 2016 that are extremely entertaining!
- Harry asked his friend Sally who she is voting for in the Presidential election. Sally said all her friends say Donald Trump is the smartest man alive, which must mean he is the best, so she will vote for him. What would we call the thinking trap Sally is caught in based on her reasoning?
a) Not me fallacy
b) Bandwagon fallacy
c) Bias blind spot
d) Emotional reasoning fallacy
- Trump is now the President of the United States! All his life, he possessed a strong desire to lead and dominate over others; from being a class monitor as a child, to a business mogul, and finally as President. The personality theory which best fits with Trump’s personality is:
a) Carl Jung’s collective unconsciousness
b) Henry Thoreau’s civil disobedience
c) Sigmund Freud’s Oedipus complex
d) Alfred Adler’s inferiority complex
References
Denny, P., Luxton-Reilly, A., & Hamer, J. (2008). The PeerWise system of student contributed assessment questions. Proceedings of the Tenth Conference on Australasian Computing Education (Vol. 78, pp. 69–74). Wollongong, NSW, Australia.
PhysPort, a blog supporting physics teaching with research-based resources, posted a great article entitled “How can I get students to have productive discussions of clicker questions?”
Clicker questions are increasingly being used to stimulate student discussion and provide faculty and students with timely feedback. Research suggests that discussing clicker questions can lead to increased student learning, and that students exchanging constructive criticism can generate conceptual change.
What can you do as an instructor to encourage all students to have productive discussion? We conducted studies of what students say to each other during clicker discussions when instructors use different instructional techniques. Here’s what we and others have learned and how you can apply it in your classroom:
Clickers have been a very useful strategy for engaging students in class at many universities (including McGill), even in large class environments. In-class feedback can help students focus on what is important, practice problems or ideas in class, and engage with their fellow classmates in discussion.
Polling@McGill can be used for free by any instructor, TA or student on campus. Students can use their own smartphones, tablets or laptops to respond in real-time to questions in class. If you are interested in using the system, just sign up on the Polling@McGill website.
Are you using Polling@McGill in your courses? Do you have any stories you would like to share? Let us know!
On the first day of classes, I, like other instructors, share either a hard copy or electronic copy of the course outline with students. (Actually, at McGill, the course outline must be provided to students during the first week of classes according to the McGill Charter of Students’ Rights (Chapter One, Article 10.2 – amended by McGill Senate 21 January 2009 – of the Handbook on Student Rights and Responsibilities, available as a PDF). I hope all students will be motivated to read it attentively on their own because it has information that is important for them to succeed at the course. But my hope has been repeatedly dashed. So, I tried a more directive approach: orally “walking” students through salient points of the course outline (can you say tedious?) and asking students to pose questions about anything that’s unclear. No questions. Great. It’s confirmation that I write clear course outlines. Probably not. More likely, students don’t have enough time to take in the content of this truly important document.
So, I switched approaches again. On the first day of class, students now have to engage in an awareness-raising activity whereby they have to find important information in the course outline. I’ve coupled this activity with another that allows students to learn to navigate the course website. I call the activity Find It in myCourses. It’s like an online scavenger hunt. From a list I’ve compiled over the years, I select 6-8 search questions that will draw students’ attention to important information in the course outline and to main features of the course website. Using a mobile device (e.g., laptop or tablet), students work in pairs or small groups to search for the answers. The activity takes 10-15 minutes of class time.
Find It in myCourses
Work in pairs or small groups to find the answers to the questions below. When you’ve finished, each pair or group should post the answers to the discussion forum in myCourses entitled Find It in myCourses. Include all group members’ names in the posting.
[Remember: I select only 6-8 questions.]
Find the course outline. Search it for the answers to these questions:
- Where are the assignments and assessments described? [I post the details and example assignments in myCourses, not in the course outline.]
- How should you submit your assignments?
- On which days will there be in-class quizzes?
- Will hard copy assignments be accepted?
- What’s the policy in this course for submitting late assignments?
- Where are the textbook and course pack sold?
- What does the policy on Academic Integrity say? Summarize it in 140 characters.
- How quickly will I (the instructor) reply to your e-mail or phone messages?
- What’s the policy in this course for the use of electronic devices in the classroom?
Return to the myCourses course home page and find the answers to these questions:
- What are the instructions for the first assignment?
- You will have online quizzes in this course. Find the Surprise Quiz. What’s the first question in the quiz? You don’t have to do the quiz.
- Find the supplementary course readings. What’s the title of the first reading?
- Which of the Quick Links are you most likely to use on a regular basis? [I’ve created a Quick Links widget for the home page.]
- What information is in the calendar under today’s date? [Assuming you’ve posted some information.]
- What information appears under Announcements? [Assuming you’ve posted some information.]
- Find the Email icon. Send me an email that says, “We’re really enjoying this activity.” [Humour can help establish rapport.]
I skim students’ answers before the next class to check that they’ve done the activity the way I’d hoped. To date, my hopes have not been dashed. If I do notice gaps in students’ ability to find the requisite information, I can address them in the next class or online by posting an “answer key.” I still state explicitly to students that it remains their responsibility to read the entire course outline, and I still provide students the opportunity to ask questions about the course outline and myCourses content.
While this approach to getting students to attend to important course information can benefit all students, it might be especially beneficial for students who are new to the university and/or unfamiliar with myCourses. This approach also encourages students to manage their course responsibilities. It makes it harder for students to say “I didn’t know which text to read” or “I couldn’t find the assignment instructions” or “I didn’t know we had a quiz today.” My hope is that students will be motivated and able to find the course information they need, thereby improving their chances of succeeding at the course.
How do you get students to pay attention to your course outline?
What features of your course website do you want students to pay attention to?
When preparing your course outline, check out McGill’s Course Outline Guide, which offers a template that you can use as you develop or revise your own course outline.
Interested in learning about the different myCourses features that are available? Check out McGill’s IT Knowledge Base: Index of Documentation for Instructors
A number of instructors at McGill have been integrating peer assessment (PA) in their courses and have generously shared some of their reflections on the experience.
Lawrence Chen teaches Introduction to the Engineering Profession (FACC 100), a required course for all first-year students in the Faculty of Engineering. During a conversation about his experience with PA, he shared how he implemented PA in this course of approximately 400 students (across two course sections), and shared feedback from his students about their experience.
What was the PA assignment?
Students had to propose how they would resolve an ethical issue described in a particular scenario. They had to justify their answer in view of theories we had covered in class. For the PA exercise, I gave students specific criteria to look for and instructions on how to assess the paper. Criteria included things like clarity and proper application of the ethical theories.
What did you hope students would get out of the PA experience?
I wanted them to be exposed to PA, to learn how to critically analyze work, and to learn how to give and receive useful feedback.
I wanted to sensitize students to the notion and process of PA. In academic or corporate settings, you’re assessed by peers when applying for a grant or submitting a paper for publication. When it’s time for merit in companies, you’re assessed through a process involving several people. I shared a personal example of a journal paper that didn’t get the most glowing assessment, and explained that less-positive assessments can happen even when you think you’ve done everything properly.
I also wanted the students to develop their ability to analyze and critically assess written work subject to specific criteria. This involved providing not only a numerical assessment but also written feedback. I wanted them to learn how to provide useful feedback to each other. Feedback like “great job” with a numerical assessment of 10/10 is nice to receive but not so helpful if you don’t know what the peer is referring to. Similarly, feedback that just says “poorly written” without further explanation won’t help you understand what you could’ve done better. I wanted students to learn to accept criticism on their work and to figure out what to do with the feedback they received. You can be frustrated by the feedback, but maybe there’s an element of truth to it. Students have to look at their papers more objectively and think, “What should I do with the feedback I got?”
How did you help students prepare to assess their peers’ work?
I explained to the students that they have two roles: they’re authors and they’re reviewers. As authors and reviewers, I wanted them to develop their critical thinking, analysis and assessment skills. As authors specifically, I wanted them to learn how to accept feedback and deal with criticism.
I did a 30-40 minute in-class “calibration exercise” to help students prepare for PA. I gave them a couple of sample papers to read. I also gave them a rubric with scoring criteria. Students assessed the papers. They then discussed their assessments in pairs or small groups. After the discussion, I asked them, “How many of you assessed this paper within one mark of each other? How many of you assessed this paper more than two marks apart?” It became clear that the majority of the students had assessed within one mark of each other. This calibration exercise allowed students to see that they were assessing the sample papers in a relatively similar manner. Then I explained, “This is how I would have assessed this paper…” Generally, the assessments students made were not that far off from how I would have assessed the papers.
Could you talk about how technology supported the PA assignment?
The first time I did a PA assignment in this course (a couple of years ago), I had to do it manually. The logistics were challenging with such a large class. Now I use Peerceptiv, a software tool designed to support PA in large classes. It’s a phenomenal tool to help manage the process. I’ll explain.
The PA was a double-blind process, so students didn’t know who they were evaluating or who was evaluating them (although I could see this information). There were five steps:
- Students wrote their papers and submitted them through Peerceptiv.
- Each student was assigned five papers to review. They entered their written feedback and a numerical assessment into Peerceptiv.
- Students received the assessments from their peers and subsequently provided written feedback—and a numerical assessment—to the reviewers on how useful the written feedback was. Providing feedback on feedback in this way is called “back evaluation.”
- The students then revised their papers and resubmitted them.
- The revised papers were assigned to five other students for review.
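Peerceptiv handles the double-blind matching automatically; the sketch below is a hypothetical illustration of the underlying idea, not Peerceptiv’s actual algorithm. A circular-shift scheme assigns every student five papers, guarantees no one reviews their own, and gives every paper exactly five reviews:

```python
import random

def assign_reviewers(student_ids, k=5):
    """Hypothetical double-blind review matching: shuffle the students,
    then have each one review the next k students in the circle.
    No one reviews their own paper, and every paper gets exactly
    k reviews."""
    order = list(student_ids)
    random.shuffle(order)  # randomize neighbours so pairings vary
    n = len(order)
    assignments = {s: [] for s in order}
    for i, reviewer in enumerate(order):
        for offset in range(1, k + 1):
            author = order[(i + offset) % n]
            assignments[reviewer].append(author)
    return assignments

# Example: 8 students, 5 reviews each
papers = assign_reviewers(["s%d" % i for i in range(8)], k=5)
```

Because the shift offsets are symmetric, the reviews-given and reviews-received counts balance automatically, which is one reason schemes like this scale to large classes.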
The assignment rolled out over a nine-week period: two weeks to submit the first paper, two weeks to do the reviews, a week to do the back evaluation, a week to revise the paper, two more weeks to do the second round of reviews, and a week to do the second round of back evaluation.
(Image: Peer review exercise timeline)
How was the assignment grade calculated?
The assignment was worth 20% of the final course grade, broken down as follows:
- 25% was based on timeliness – students had to submit drafts, peer assessments, revisions, and back evaluations on time.
- 45% was based on the quality of the written feedback they provided on peers’ work.
- 30% was based on the numerical assessments from the 10 reviews of their paper. Peerceptiv has a number of algorithms built in, so the grade wasn’t just the raw average of all 10 student evaluations: if one evaluation was really far from the others, the algorithms gave it less weight in the averaging.
Some students expressed concern about a portion of their assignment grade being decided by peers, but 6% of the overall course grade (that is, 30% of the 20% assignment grade) is not a “make or break” kind of scenario.
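Peerceptiv’s weighting algorithms are proprietary, so the sketch below is a hypothetical stand-in for the grade arithmetic described above: `robust_average` down-weights a peer score that sits far from the rest, and `assignment_grade` combines the three components (scores assumed to be on a 0–100 scale for illustration).

```python
def robust_average(scores, z_cut=1.5):
    """Down-weight scores far from the rest before averaging.
    Hypothetical: Peerceptiv's actual algorithm is not public."""
    mean = sum(scores) / len(scores)
    sd = (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5
    if sd == 0:
        return mean
    # full weight for typical scores, half weight for outliers
    weights = [0.5 if abs(s - mean) / sd > z_cut else 1.0 for s in scores]
    return sum(w * s for w, s in zip(weights, scores)) / sum(weights)

def assignment_grade(timeliness, feedback_quality, peer_scores):
    """25% timeliness, 45% quality of feedback given, 30% robust
    average of the 10 peer reviews received."""
    return (0.25 * timeliness
            + 0.45 * feedback_quality
            + 0.30 * robust_average(peer_scores))

# The peer-scored portion is 30% of a 20% assignment,
# i.e. 6% of the overall course grade.
```

With nine peers scoring 80 and one scoring 20, the down-weighted average lands above the raw mean of 74, which is the “less weight” effect described above.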
What did the students think about the PA experience?
I explicitly asked about the assignment and Peerceptiv in the course evaluation: “The peer assessment assignment and using a peer assessment tool had a positive impact on my learning.” Here are the results (out of approximately 200 students per section):

| Response | Section 1 (n=93) | Section 2 (n=84) |
|---|---|---|
| Strongly agree/agree | 63 | 44 |
| Neutral | 23 | 21 |
| Disagree/strongly disagree | 7 | 19 |
Students also wrote comments in the course evaluation. Here are some examples:
- “Peer assessment is really interesting and made me learn a lot.”
- “It was interesting to see how other students view writing assignments and how they think and how they approach them.”
- “People responded poorly to criticism.”
- “I liked the honest feedback that people gave me. Some were easy graders but some were pickier over certain details.”
- “Forced me to have a strong understanding of the theory from the class.”
While there was a wide range of comments, most of the students said it was a useful exercise and they would do it again. However, the ones who commented more negatively on it were clearly concerned about their grades. So in my mind, students’ feedback validates the experience. I’ll continue to have students do PA using Peerceptiv.
What would you say to an instructor interested in trying PA for the first time?
Go for it! Be clear in your explanations to students of what you want them to get out of the exercise. Explain to them that they can benefit from carefully considering the assessments from their peers, even if they don’t agree with them at first, as there may be aspects of their written work that they overlooked initially.
Also, ask for your students’ comments on their experience. It was some of their comments that made me realize that I would like to educate the students a little more about the PA process the next time we do this exercise.
If you do a peer assessment activity in your course, how do you (or how might you) gather feedback from students about their experience?
Marie Norman from Faculty Focus just posted a very interesting article online, “Synchronous Online Classes: 10 Tips for Engaging Students.”
There’s a widely circulated YouTube video you may have seen called “A Conference Call in Real Life.” To spoof the strange, stilted dynamics of conference calls, it replicates them in a face-to-face setting. Participants stiffly announce their names at the door of a meeting room, are suddenly interrupted by bizarre background noises, and find themselves inexplicably locked out of a room they were just in.
If you haven’t watched it, do. You’ll recognize the familiar awkwardness of virtual meetings, where the rhythm of conversational interaction is thrown wildly askew by technological hiccups and the absence of visual cues.
Virtual space is not always easy.
Yet, virtual meetings are increasingly common, not only for geographically distributed work teams, but also for online courses.
So how do you teach in this odd virtual space? How do you keep participants from descending into that peculiar passivity characteristic of conference calls? And how do you help students fight the constant temptation of momentarily clicking away from class? While virtual classes are not without challenges, there are, in fact, concrete steps you can take to run class sessions that are energetic, interactive, and productive. Here are a few suggestions…
…view the rest of the article on their site: Synchronous Online Classes: 10 Tips for Engaging Students
Are you interested in developing a synchronous online class? Come speak with an educational developer in TLS who can help you think about redesigning your class for a new environment.
A number of instructors at McGill have integrated peer assessment (PA) in their courses and have generously shared some of their reflections on the experience.
Carolyn Samuel taught the course Academic English II (CESL 300) for several years through the McGill Writing Centre. In a conversation about her experience implementing PA, she described an assignment, explained how students learn to assess their peers, and offered advice for instructors considering implementing PA in their classes.
Q: How do you use PA in your course?
I like students to get into the habit of doing outlines for their writing. So, for one essay assignment, I have students give one another feedback on an essay outline. I place students in private groups of three in a designated myCourses discussion forum. Students post their essay outlines to the discussion forum, and outside of class time, each student gives feedback to the other two students by posting comments to the discussion forum. I can access all the posts.
Students use the feedback to tweak their outlines. When students submit their essay drafts to me, they’re also required to submit the revised outline. I provide feedback on the revised outline and on the draft essay. Students use this feedback to revise their essays for a final submission.
Q: How do you prepare your students to give one another feedback?
Students learn outlining in class—we look at examples of outlines and critique them. I make a point of sharing examples of well-written outlines as well as some that could be improved. It’s important for students to see and reflect on ranges of assignment quality.
When we do the critiquing, I encourage students to reflect on the kinds of comments that would be helpful. So, when a student says something like, “I don’t understand that,” I ask, “Well, what can a writer do with that piece of feedback? How can a writer make it better just knowing that you don’t understand it?” Students are encouraged to develop an ability to provide feedback that is “actionable” and that will help writers improve their work.
(Image: An excerpt from the instructions students receive for the PA assignment)
We also talk about basic feedback practices, like, if you have something negative to say, make it constructive, and also say something positive about the peer’s writing. Balancing favourable and less favourable comments is important.
Q: What can students comment on when they give feedback?
Typically, students are skeptical about their ability to comment on peers’ work. So, through discussions in class, we address what students can reasonably comment on and what kinds of feedback are helpful for their peers. I’m less interested in students addressing spelling mistakes, or grammar and syntax, areas that bear less on one’s ability to express and support an argument and that students might not be able to address. In this particular assignment, we focus on features like the quality of the thesis statement and the strength of the support for it. Students can also comment on the organization of ideas and the cohesion of ideas from one paragraph to another, even though the outlines are in bullet form. My goal is for students to read their peers’ work in a way they should be reading their own.
It’s worth mentioning that the PA assignment is graded. It’s not worth very much – just 5% of the grade, so it’s 2.5 marks for each of two pieces of feedback. Marks are awarded for constructive feedback, examples of which students have seen in class. I’m not overly strict, though. Essentially, if students have done it, and it’s constructive, they get the marks. If students provide feedback that’s unhelpful, like, “Your spelling isn’t very good” or “I don’t understand what you’re saying,” I don’t award marks.
Q: What would you suggest to instructors interested in trying PA in their class for the first time?
The first time you do it, make it very low-stakes. It can be extremely challenging to implement PA. It’s challenging for the instructor in terms of logistics and pedagogy—you want to get them just right. And it’s often challenging for students, too, because many haven’t done it before. So start out with low-stakes, and try the same assignment more than one time. As you develop your expertise with PA, and if you feel it’s appropriate, then think about raising the stakes, or maybe increasing the complexity of the assignment. Just be kind to yourself and to your students as you experiment because PA takes some planning.
Join the conversation! What experiences have you had with PA in your courses?
by Diane Maratta and Carolyn Samuel
Have you been thinking of using Peer Assessment (PA) in any of your courses? We, at Teaching and Learning Services (TLS), have added a new “PA” section to our website with resources to support instructors who wish to implement PA in their courses.
As we’ve indicated on the website, “Peer Assessment refers to students providing feedback on other students’ assignments to help them improve their work.”
The new web content includes a best practices resource document to facilitate the instructor’s role in implementing PA. The site also has links to examples of guiding questions, rubrics, checklists, and rating scales for doing PA with a variety of written and oral assignments. If you’re wondering how PA actually plays out with students, check out some blog posts that highlight actual cases of PA implementation in McGill courses.
TLS recently launched these PA materials at a pilot workshop entitled Designing Effective Peer Assessment Assignments. The number of participants who attended suggested that instructors have a keen interest in PA.
The workshop was designed for participants to become familiar with best practices for implementing PA; leave with a plan for implementing PA in one of their courses; and be aware of resources to support PA at McGill.
Participants had the opportunity to begin drafting a PA assignment for one of their courses, and they got to engage in some peer feedback themselves!
Feedback from participants* suggests that the pilot workshop and companion resource materials were helpful:
I’m so happy, I’m leaving with a draft plan. In response to a comment made during the session, I actually like how you acknowledge that there isn’t a cookie cutter approach but we need to be able to justify our decisions.
I learned a lot about student buy-in. I think it’s a great idea to include the students in the development of the assessment criteria. I also appreciated the exercise of creating my own PA assignment. It helped me see the details I was missing.
Peer reviewing is worth doing even though it is challenging – it takes careful and intentional planning.
If you didn’t attend the workshop, you’ll find guidelines at the website on how to implement strategies that will foster student buy-in and support students with providing one another with constructive feedback. Instructors are also welcome to request a one-on-one consultation to discuss implementing PA in their courses.
Be sure to check the TLS calendar for future workshop offerings.
How have you used PA with your students?
*Feedback comments used with permission.
A number of instructors at McGill have been integrating peer assessment (PA) in their courses and have generously shared some of their reflections on the experience.
Barry Eidlin teaches Sociological Inquiry (SOCI 211) in the Faculty of Arts. In a conversation about his experience implementing PA, he shared his rationale for using PA, some thoughts about the PA technology he used, and he offered suggestions for instructors who are considering implementing PA in their classes.
Can you describe your PA assignment?
In Sociological Inquiry, students develop a research proposal over the course of the semester, which happens in three steps: a preliminary research proposal, a complete first draft of the research proposal, and then the final version of the research proposal. The first two steps are peer assessed using a software program called Peerceptiv; the TAs and I grade the last step by hand.
Peerceptiv, software specifically designed to support implementing PA when many students are involved, also has important pedagogical benefits: students learn to see writing as a process, and get much more feedback on their work than they would if only the TAs and I were providing feedback. While the interface took some getting used to, Peerceptiv made it possible for my 120 students to receive individual, detailed feedback on their writing throughout the semester, something that wouldn’t have been possible for me to do alone – there’s simply not enough time.
Students provide feedback on three peers’ work and receive feedback from three peers, during each of the first two steps. In their peer feedback, students provide quantitative feedback (numerical ratings from 1-7) and qualitative feedback (written comments) on three different dimensions, or categories, of the assignment: the research question, competing explanations, and hypotheses. I give them a detailed grading rubric that describes these.
Why do you use PA?
I started using PA because I wanted students to grapple in-depth with the core ideas in my classes. These ideas are not well-suited to a multiple-choice exam form of evaluation. Using PA allows students to develop a research proposal in stages over the course of the semester in a large lecture class, via thoughtful assignments that get students more deeply engaged.
I didn’t want a final paper at the end without any sort of evaluation along the way, where students might just throw something together at the last minute. I wanted sustained engagement over the course of the semester, and the PA activity and software gave me a way to do that.
Working on an assignment over time is probably a better way of engaging with the material, and the three stages force students to spread out the workload, so I hope it reduces their end of semester stress levels.
I also thought that at a pedagogical level, it was important to expose students to other people’s writing process, in the sense that most undergrads typically encounter only two forms of writing: their own jumbled mess of ideas that they struggle through as they’re coming up with their own assignments, and the highly polished, revised, peer reviewed work that they read in their classes. They don’t immediately see anything connecting the two. And so it’s hard for them to understand that the polished writing they encounter in their classes started off as a jumbled mess of ideas.
Having students do PA exposes them to the idea of writing as a process—they see concrete examples, and it can help them see that they’re not alone in working through the writing process.
What would you suggest to an instructor interested in trying PA in their course for the first time?
They should do it! Especially with a large class, it allows the students to engage with the material in much greater depth.
Think very carefully about how you will guide students’ peer assessment, for example, by developing a grading rubric for students to use. Make sure to provide sufficient detail and guidance about the different dimensions (categories) of the rubric. Spend time in class explaining these to the students and walking students through what the process is going to look like – explain how the grade is distributed, make the evaluation process clear to them.
You’re always going to get some students who are fearful about what their classmates are going to think of their writing. You have to accept that and do some hand-holding to reassure them. One way to support students is by giving them hints for the review process. Here’s an e-mail I send students to help reduce their anxiety (which other instructors are welcome to adopt for their own students). As long as the students understand the process, it seems to work.
Want to explore PA further? Join us on May 25, 2017 for a workshop on Designing Successful Peer Assessments.
Join the conversation! Why might you consider using PA?
Global Climate Models for The Classroom: Improving Science Education on Today’s Complex Socioscientific Issues
By Drew Bush and Renee Sieber
Each week, we discussed how technology-based learning with a global climate model (GCM) impacted students. Most mornings, Drew also rode the bus to John Abbott College. Over the course of the winter term in 2014, he collaborated with a Geology instructor there to teach 39 students how to conduct research with an actual GCM from the United States National Aeronautics and Space Administration (NASA).
Many of the students were shocked by their findings. They had been taught how to design appropriate modeling experiments, run simulations, post-process data, conduct visual analyses and interpret results. One student reported dismay at changes to ice cover at the poles. Others calculated an alarming estimate of global sea-level rise. More than a few realized that a favorite animal, tree or vintage could suffer with climatic changes. These findings were made despite the fact that few of our students had ever worked with computer models beyond “toy” models used to teach basic physics or those generated through statistical programs/Microsoft Excel.
(Image: The Educational Global Climate Model (EdGCM) enables students to learn about climate change through inquiry instruction with a real United States National Aeronautics and Space Administration (NASA) global climate model (GCM). It was the subject of interdisciplinary research in McGill University’s Department of Geography, School of Environment and Faculty of Education. Image courtesy of The Educational Global Climate Modeling Project, Columbia University, NASA/GISS, New York, NY.)
A challenge of climate change is that it tends to be spatially and temporally distant. It is difficult for the majority of students to tangibly experience climate change, and most incorrectly associate weather events with it. To compound the problem, climate change is often represented in conflicting ways in politics and public discourse. Even graduate students have trouble understanding the topic. Research has shown that most educational models are based on the idea of a deficit, where students are considered empty buckets to be filled with more and more information. Yet this instructional approach simply hardens positions on scientific issues that are politically controversial.
Our research reviewed educational theory and the literature on teaching science/climate change with science education technology to determine how best to overcome these obstacles. We found that instructional approaches that combine strongly guided student inquiry with scientific technology can impart deeper understandings and develop student higher order thinking abilities. Inquiry instruction emphasizes students posing their own research questions, evaluating evidence-based answers or explanations and communicating findings. What we still didn’t know after this review was what instructional approaches or technologies would prove most effective for climate change.
Drew’s doctoral work in the Department of Geography and McGill School of Environment examined the advantages and shortcomings of specific instructional approaches and scientific technologies for teaching science. Conducted in Dr. Sieber’s lab, the aim of this research was to determine an effective means for improving student comprehension of physical climate science and related policy. Our work also involved an interdisciplinary group of researchers located at NASA’s Goddard Institute for Space Studies (GISS) in New York, NY and McGill University’s Faculty of Education.
This research embraced the techniques of educational research to compare learning gains between a treatment group that worked with Columbia University-NASA GISS’s Educational Global Climate Model (EdGCM) and a control that listened to a lecture on GCMs and worked with climate education technologies suggested by the American Association of Geographers. These included the University Corporation for Atmospheric Research’s Very, Very Simple Climate Model, NASA GISS’s Surface Temperature Analysis Page and data/visualizations from sites like the National Snow and Ice Data Center.
Our hypothesis was that by working with the technology and processes of climate scientists, our treatment students would better understand climate science and related policy. To measure learning gains and analyze the impact of our otherwise identical curricula, we used pre/post diagnostic exams, exit interviews and the minute-by-minute analysis of 535 minutes of class and lab video footage—among other research instruments. This approach allowed deeper interrogation of the technologies used.
EdGCM is based on a real GISS research GCM. Dr. James Hansen first wrote about it in 1983 when he used it, GISS Model II, to make predictions of global change. EdGCM itself consists of a suite of user interfaces that allows students to design experiments by manipulating inputs, running the model and post-processing and visualizing more than 80 different variables. Other graduate students in Dr. Sieber’s lab have designed newer generations of this technology that work online and possess more intuitive user interfaces.
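To see the gap between EdGCM and the “toy” models mentioned earlier, it helps that the simplest kind of climate model fits in a few lines. The sketch below is an illustration only (a standard zero-dimensional energy-balance model with textbook parameter values, not part of EdGCM): it balances absorbed sunlight against emitted infrared to solve for a planet’s equilibrium temperature.

```python
def equilibrium_temp(solar=1361.0, albedo=0.3, emissivity=0.612):
    """Zero-dimensional energy-balance model: the 'toy' end of the
    spectrum, in contrast to a full 3-D GCM like EdGCM.
    Balances absorbed solar flux against emitted infrared:
        (S/4) * (1 - albedo) = emissivity * sigma * T**4
    and returns the equilibrium temperature T in kelvin."""
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    absorbed = (solar / 4.0) * (1.0 - albedo)
    return (absorbed / (emissivity * sigma)) ** 0.25
```

With the default values this returns roughly Earth’s observed mean surface temperature (about 288 K), and lowering the emissivity (a stronger greenhouse effect) raises the equilibrium temperature. A real GCM like GISS Model II instead resolves the atmosphere in three dimensions over time, which is why running one is a research exercise rather than a one-liner.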
The control students showed us the power of a well-organized and clear lecture. On the post exam, they out-scored treatment students on five multiple-choice questions that tested recall of facts about GCMs. Yet only those students who had worked with EdGCM appeared highly motivated in their work and demonstrated critical thinking about the work of climate scientists and the issue of climate change.
All but one of our 12 treatment student groups posed climate research questions that interrogated the spatial components of climate impacts or relationships between human actions today and regional/global conditions in possible future climates. As a whole, these students also demonstrated significantly greater learning gains on pre to post exams than those in the control group.
The implications of this work are clear. More students understood the complex science of climate change when exposed to actual research processes. More importantly, these students better understood scientific research on the topic, a key tool of researchers (the GCM) and how their own behaviors and social interactions can contribute to solutions.
Renee Sieber is an Associate Professor in McGill University’s Department of Geography and School of Environment and Head of Geothink, an interdisciplinary research Partnership Grant funded by the Social Science and Humanities Research Council of Canada. Contact her @re_sieber
Drew Bush is a doctoral student in McGill University’s Department of Geography and School of Environment. His doctoral work was supported through a Richard H. Tomlinson Fellowship in University Science Teaching and his efforts instructing graduate teaching workshops as a Tomlinson Project in University-Level Science Teaching Fellow. Contact him @drewfbush

References
See Kearney, A. (1994). Understanding global change: A cognitive perspective on communicating through stories. Climatic Change, 27(4), 419–441.
See Krosnick, J. A., Holbrook, A. L., Lowe, L., & Visser, P. S. (2006). The origins and consequences of democratic citizens’ policy agendas: A study of popular concern about global warming. Climatic Change, 77(1), 7–43 and Akerlof, K., Maibach, E. W., Fitzgerald, D., Cedeno, A. Y., & Neuman, A. (2013). Do people “personally experience” global warming, and if so how, and does it matter? Global Environmental Change, 23(1), 81-91.
See Sterman, J. D. (2008). Risk communication on climate: Mental models and mass balance. Science, 322(5901), 532-533.
See Cooper, C.B. (2011). Media literacy as a key strategy toward improving public acceptance of climate change science. BioScience, 61(3), 231–237 and Pidgeon, N. F., & Fischhoff, B. (2011). The role of social and decision sciences in communicating uncertain climate risks. Nature Climate Change, 1(1), 35–41.
See Hansen, J., Russell, G., Rind, D., Stone, P., Lacis, A., Lebedeff, S., Ruedy, R. & Travis, L. (1983). Efficient three-dimensional global models for climate studies: Models I and II. Monthly Weather Review, 111(4), 609-662.
A number of instructors at McGill have been integrating peer assessment (PA) in their courses and have generously shared some of their reflections on the experience.
Rhonda Amsel teaches Statistics for Experimental Design (PSYCH 305) in the Faculty of Science. During a conversation about her experience with PA, she shared how she implemented it for the first time in a 100-student summer course. Rhonda also offered suggestions for instructors who are considering implementing PA in their classes.
Q: How did you introduce PA to your students?
I explained to them very transparently that I had never done PA in a course before. I thought that PA might help students more easily see what I was looking for when I assessed them and see how other people answered the same questions, for clarity.
When you look at your own work, it’s very hard to see where you are being unclear; it’s much more obvious when you look at someone else’s work.
I also asked students to monitor if they were actually learning anything, so they were engaged as they tried to figure out whether the PA was helping them.
I started in the summer because it’s easier to try something out with a hundred students than with three hundred.
Because I wanted buy-in from the students, I explained what the purpose of PA is in terms of benefits to them. I think that students buy in easily when we try new learning strategies because they know that such strategies are aimed at getting them to succeed in the course.
Q: How do you use PA in your course?
The course has three assignments and two exams. The purpose of the assignments is for students to keep up with the work and force them to prepare for the exams. I started by trying PA with the first assignment. This assignment calls for very brief answers, some small calculations, and a tiny report. For the PA, the assignment was submitted twice, first as a draft for feedback from peers, and then as a final version that the TA and I graded.

Students had to bring a draft of their assignment to class on the due date. I collected the assignments from the front half of the class and my TA collected the assignments from the back half of the class. Then, we switched assignments and redistributed them to the students so that they received an assignment that belonged to a peer not sitting near them. It took five minutes to switch the assignments. We had the students put an identifying number and their initials on the assignment so that it wasn’t clear whose assignment they had gotten.

I prepared a cover sheet that had detailed criteria to look for, along with questions for peer assessors to address, such as: “Is this present and correct?” “Is this idea in the report?” “Is it clearly expressed?” In class, students used the cover sheet to respond “yes” or “no” to each question as we reviewed the assignment. And any “no” would require a resubmission to me and the TA. There was a place at the bottom of the form for peer assessors to put their own identifying number and initials because I wanted the peer assessors to be graded a little bit, too. The peer assessors each received half a point for completing the peer assessment.
After the students had provided feedback, my TA collected the assignments. She took them into the hallway, marked that the assignments had been received, checked for indications of lack of clarity (because the peer assessors noted when an assessment criterion was unclear to them) and then clarified, as necessary. Finally, she indicated whether resubmission was necessary. The TA also used the students’ initials to mark a last name on the assignment so that we could quickly hand back the assignments at the end of the class with no overnight turnaround. It was fast, which is nice in the summer when courses are short.
Q: What did you and the students think about the PA experience?
During the peer assessment activity, we heard from the students themselves as they tried to clarify points and as they brought up typical problems. To me, it’s that discussion that has value: it’s being in the class and hearing other students ask questions and getting something out of that, and hearing the responses, and then carrying on the discussion from there. It was exactly what I had hoped. I think this made it very clear to them that certain things are not of concern and certain things are of concern – the difference between a calculation mistake and a conceptual mistake, for instance. And the exams were very good at the end of the course.
We also had a discussion about the PA experience itself. It’s always interesting to see how the students react and it’s more helpful to learn about their reactions during the course rather than after. I had the class vote on whether or not to do another PA assignment. I asked, “How many of you really would rather not go on with it?” Only two students raised concerns about their ability to assess peers, and these concerns were easily assuaged. So, it was agreed we would try it again because the students felt that it had value. In anticipation of the next assignment, we talked about what problems they had had doing PA and how the activity could be amended for the next time, such as what we could fix on the cover sheet.
Q: What would you suggest to an instructor interested in trying PA for the first time?
Be familiar with your course: I wouldn’t try PA the first time teaching a course. I would teach it several times with more traditional assessment methods so that I could think about where PA would have the most use.
After doing the PA, ask the students what you can do better the next time, or even if you should do it a next time. It’s really a matter of assessing whether PA is having the desired impact. And who better to tell you than the people who are experiencing it?
Want to explore PA further? Join us on May 25, 2017 for a workshop on Designing Successful Peer Assessments.
Join the conversation! What experiences have you had with PA in your courses?
As a practicum student at McGill’s Teaching and Learning Services, I have been examining the role of reflective journals in post-secondary classrooms. Throughout the course of my research, it has come to my attention that, while they are used frequently in the instruction of disciplines like English and Theatre, reflective journals can actually be a helpful learning tool for a much wider range of subjects (Fenwick & Parsons, 2000; Stevens & Cooper, 2009). In fact, they are becoming more popular in law schools, and even in science classrooms (Fenwick & Parsons, 2000; Ogilvy, 1996). Skeptics insist that journal writing is nothing more than busy work for students and a lot of unnecessary extra effort for instructors. However, those who view journals as constructive have demonstrated that, when properly implemented, engaging students in the exercise of journal writing can be beneficial to both students and their instructors.
Journal writing allows students to reflect on new knowledge learned in class and to solidify their learning by recording their evolving thought process as they progress through the course, learn new material, and form new conclusions (Stevens & Cooper, 2009, p. 3). It can also teach them to formulate new opinions and perspectives, and it gives them a risk-free venue to explore, think, and practice skills learned in class (Stevens & Cooper, 2009, p. 9; Fenwick & Parsons, 2000, p. 155). Students who write regularly in a journal consistently see improvements in their writing skills, as well as in their creative and reflective thinking (Stevens & Cooper, 2009, pp. 15-16, 33).
When students write journals for class, the practice helps not only them but also their instructors. Instructors who assign journal writing often see an increase in participation: having to respond to class material in writing encourages students to do the readings and to participate more in class discussions (Stevens & Cooper, 2009, p. 11). In addition, by reading journal entries, instructors can see which concepts their students understood and which may need revisiting (Mills, 2008). Finally, through assigned journal writing topics, instructors can guide and focus their students' learning, emphasize important concepts from lectures, and challenge students to employ their critical thinking skills (Mills, 2008).
While such potential benefits can be appealing, it is not always clear how to go about developing and implementing a reflective journal assignment. Here are a few things to keep in mind when introducing journal writing to a class:
- Be clear about the journal’s purpose
Whether the purpose is to voice personal feelings and responses, to develop and apply critical thinking skills, or some combination of these, communicating it to students is essential. The purpose should also be reflected in how the journal is evaluated and in the type of writing involved (Fenwick & Parsons, 2000).
- Offer personal examples to help students understand what is expected of them
One of the best ways to communicate to students what is expected of them is to provide an example. Having a concrete idea of what their instructor is looking for gives students more confidence that they are capable of creating an acceptable product, and takes some of the ambiguity away from journal writing (Fenwick & Parsons, 2000).
- Evaluate only journal content, not form, spelling, or grammar
Insisting that students revise, rewrite, or edit their journal entries may effectively defeat the purpose of writing them in the first place. It could cause students to be afraid of making mistakes, thus restricting their creativity, curiosity, and honesty. This could in turn have a negative effect on the development of reflective writing skills. Errors made in a journal setting occur because the journal is doing its job of encouraging students to try new things (Fenwick & Parsons, 2000; Marsh, 1998).
Journal writing may be new to many instructors, while others have been using it for years.
- To those instructors who are interested in integrating reflective journals in their courses: what questions do you have? What interests you about this exercise?
- To those instructors who have experience with the use of journal writing: Which aspects of this activity worked well, and which could use some fine-tuning? Why did you decide to incorporate reflective journals? What suggestions do you have for instructors who are considering this activity for the first time?
If this is a topic that interests you, stay tuned for our next blog post that will discuss common concerns regarding journal writing, and how to minimize them. For more information on best practices for journal writing:
Stevens, D., & Cooper, J. (2009). Journal keeping: How to use reflective writing for effective learning, teaching, professional insight, and positive change. Sterling, VA: Stylus Publications. WorldCat: http://mcgill.worldcat.org/oclc/646821096

References
Fenwick, T., & Parsons, J. (2000). Toolbox 2: Assessing learner journals. From The Art of Evaluation: A handbook for educators and trainers. Toronto: Thompson Educational Publishing, Inc. pp. 155-161. WorldCat: http://mcgill.worldcat.org/oclc/243514524
Marsh, S. (1998). Widening the lens of diversity: Motivating reflective journal writing. Paper presented at the meeting of the American Educational Research Association (AERA), San Diego, CA. Retrieved from http://files.eric.ed.gov/fulltext/ED418425.pdf
Mills, R. (2008). “It’s just a nuisance”: Improving college student reflective journal writing. College Student Journal, 42(2), 684-690. Retrieved from http://search.proquest.com/docview/61951506?accountid=12339
Ogilvy, J. (1996). The use of journals in legal education: A tool for reflection. Clinical Law Review, 3, 55-107. Columbus School of Law. The Catholic University of America. Retrieved from http://scholarship.law.edu/scholar/264
Stevens, D., & Cooper, J. (2009). Journal keeping: How to use reflective writing for effective learning, teaching, professional insight, and positive change. Sterling, VA: Stylus Publications. WorldCat: http://mcgill.worldcat.org/oclc/646821096
Thank a Prof. Thank a Prof? Yeah, Thank a Prof! McGill’s Teaching and Learning Services (TLS) recently launched an initiative that encourages students to thank profs (i.e., any instructor at McGill) who have made a difference in their lives.
How does it work? It’s simple. Students log in to the Thank a Prof website. They type a message in the textbox that explains why they’d like to thank a prof and then click “Submit”. TLS receives the thank you messages and sends a congratulatory letter to the prof by email that includes the student’s message but not the student’s name. Students can choose whether or not they want their comments to be published on the “Thanked profs” web page (coming soon).
Judging by the student messages submitted to date, McGill has some pretty passionate and dedicated profs! Check out these example student messages:
These brief yet sincere messages have impact. Profs have been delighted to receive thanks from students, and they've let TLS know with messages of their own, such as:
Having been the recipient of one of the student messages myself, I’d like to add my own thanks to TLS for launching the Thank a Prof initiative. Teaching is exhilarating and fun; it’s also hard work and challenging. Occasionally, it gives me knots in my stomach, and it can make me feel incredibly frustrated. Trite as it might sound, a thank you from a student makes it all worthwhile.
Have you made a point of thanking a prof who made a difference in your life? What inspired you to thank him or her?
Check the Thank a Prof website later this month for a list of McGill profs who have been thanked by students.
One afternoon last fall, I went to the washroom in the McLennan Library. Unexpectedly, I heard sobbing coming from one of the stalls. I bent over to look for the shoes that would indicate which stall the sobbing was coming from. I saw the shoes; I also saw a bum in jeans. Someone was sitting on the floor of the stall sobbing uncontrollably. I knocked on the door and asked, “Do you need help?” No response other than more sobbing. I knocked again. This time I said, “My name is Carolyn Samuel. I work down the hall at Teaching and Learning Services (TLS). Can you open the door?” There was no vocal reply, but I heard the latch click. Gently, I pushed the door open. She was a student. She sat sobbing and didn’t even look up when I opened the door. I asked if I could put my hand on her shoulder. She nodded. I was hoping the human touch would provide her with some comfort in what was clearly a time of despair. “Can you tell me your name? Your first name only.” She did. With some coaxing, we went together to my office. She continued to sob. I asked only a few questions. She was an undergraduate student from Toronto. It was her first semester. She felt she was falling behind. She agreed to walk with me to the Office of the Dean of Students. On the way, she stopped suddenly. Still sobbing, she blurted, “I can’t go! I have to be in class now or I’ll fall behind even more!” She was in no condition to go to class. With a little more coaxing, we made it to the Brown Building, where I left her in the hands of the staff at the Office of the Dean of Students.
Have you ever thought about what you would do if you found a student in distress on campus? If you’ve never thought about it, you probably should. A 2014 study of McGill students’ psychological wellbeing reveals data on the percentages of respondents who reported taking a prescribed medication for a mental health concern and who indicated that they seriously considered attempting suicide while at university. The numbers suggest that there are students in distress around us.
TLS and Counselling and Mental Health Services recently co-facilitated a Mental Health 101 workshop for instructors, advisors, and staff in the Faculty of Management. One message we hope participants took away is that helping a student in distress doesn't mean you have to be the problem-solver who makes everything better. You can be the person who leads the student to the people who are in a position to help, whether by giving them the phone number for Counselling and Mental Health Services, showing them the website of counselling services offered, or walking them to the Office of the Dean of Students.
Check the Helping Students in Difficulty document (formerly known as the “Red Folder”) to find out what to do and who to contact in emergencies, crises and worrisome or difficult situations. If you’re an instructor and you believe a student is in distress, use the Early Alert System in myCourses. And with final exams coming up, let students know to look for therapy dogs on campus.
How have you helped a student in distress?

References
DiGenova, L., & Romano, V. (2014). Student psychological wellbeing at McGill University: A report of findings from the 2012 and 2014 Counselling and Mental Health Benchmark Study. Retrieved from https://www.mcgill.ca/counselling/files/counselling/student_psychological_well-being_at_mcgill_december_2014_final_3.pdf
Taking audiences' cultural and linguistic backgrounds into consideration when communicating at McGill
A recent publication entitled Twelve tips for promoting learning during presentations in cross cultural settings provides “tips for educators to consider when planning and delivering formal presentations (e.g. lectures and workshops) in cross cultural settings” (Saiki, Snell, & Bhanji, 2017, p. 1). I’d like to highlight the relevance of these tips to communication at McGill—through classroom instruction, meeting presentations, Town Hall talks, etc.—in light of the cultural and linguistic diversity at this institution.
International Student Services (ISS) reports that McGill's 2016-2017 student population comes from 147 different countries. Enrolment Services (ES) reports that as of Fall 2016, 53.9% of students reported a mother tongue other than English. These data point to considerable cultural and linguistic diversity among students on campus.
This diversity extends to faculty members. For my doctoral research, which addresses university instructors' perceptions of their ability to teach in their second or other language (Samuel, 2017), I contacted McGill's Academic Personnel Office (APO) in 2013 to find out what McGill faculty members' first languages are. I learned that McGill does not collect such data; however, the APO does collect other relevant data and provided me with the following information:
- Faculty members’ country of birth: 76 distinct countries including Canada
- Citizenship at hire: 43 unique citizenships including Canada, excluding dual citizenships
- Recruitment country: 30 distinct countries including Canada
- Countries where recruited faculty members earned their PhDs: 26 distinct countries including Canada
While McGill does not collect data on faculty members' first languages, the federal government does. (Well, it used to: the Canadian government has since ceased to systematically collect this information.) Per 2006 national data, 44% of university teachers reported mother tongues other than English, and nearly 25% reported a mother tongue other than English or French (Canadian Association of University Teachers, 2013-2014, p. 35). These national data (which likely still apply, perhaps in even greater numbers), along with the McGill-specific data from the ISS, ES and the APO, point to a rich diversity of cultural and linguistic backgrounds at McGill. The Twelve tips for promoting learning during presentations in cross cultural settings are, therefore, a worthwhile read for McGill community members.

References
Canadian Association of University Teachers. (2013-2014). CAUT Almanac of post-secondary education in Canada. Ottawa. Retrieved from http://www.caut.ca/docs/default-source/almanac/almanac_2013-2014_print_finalE20A5E5CA0EA6529968D1CAF.pdf?sfvrsn=2
Saiki, T., Snell, L., & Bhanji, F. (2017). Twelve tips for promoting learning during presentations in cross cultural settings. Medical Teacher, 1-5. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/0142159X.2017.1288860
Samuel, C. (2017). University instructors’ perceptions of their ability to teach in their second or other language: An exploratory study. (Unpublished doctoral dissertation.) McGill University, Montreal, QC.
When we hear academic integrity, we often think about the student code of conduct, which contains policies on plagiarism and cheating. These policies provide explicit boundaries to help guide students toward ethical behaviour, and they give instructors clear definitions with which to teach students the nuances of academic writing, research, and ethical work. When students cross the boundaries, however, these policies become the foundation of disciplinary action. But what about professors and researchers? Their research and publishing are not always confined to an institution; they more commonly take place in the global ether of academic publishing, where journals and publishers set the boundaries. Who monitors their publishing and research, and what happens when they cross the line? Enter Dr. Ivan Oransky, vice president and global editorial director of MedPage Today and co-founder of the blog Retraction Watch. Oransky visited McGill as part of Academic Integrity Day on Feb. 3. His talk, Retractions, Post-Publication Peer Review, and Fraud: Scientific Publishing's Wild West, attracted over 150 profs, graduate students, and staff from four Montreal universities.
When Ivan Oransky and Adam Marcus founded Retraction Watch in 2010, retractions had grown ten-fold over the previous decade. During his talk, Oransky discussed reasons for that increase, the growth of post-publication peer review, and other trends he and Marcus have seen as they've built a site that is now viewed by roughly 150,000 people per month.
Retraction Watch publishes retractions from peer reviewed journals and online publications. The blog follows the stories of egregious researcher behaviours including scientific misconduct, lying, cheating, falsification, and fraud. The blog even has a leaderboard, showcasing the top 30 academic integrity perpetrators.
Oransky claims “… scientific publishing is becoming more unpredictable, and yes, more dangerous. From predatory publishers to sophisticated ruses — including authors submitting fake email addresses for reviewers so they can review their own papers — designed to either subvert existing peer-review processes, or expose their flaws, it’s a wild time” (Ivan Oransky on publication practices and academic fraud, McGill Reporter, Jan. 2017).
Other publications from the event:
- Québec Science: L’homme qui traque les scientifiques hors-la-loi, Feb. 08, 2017
- McGill Tribune: The academic journal detectives behind Retraction Watch, Feb. 13, 2017
- CKUT Radio: Dr. Ivan Oransky of Retraction Watch, Feb. 03, 2017
Professor Madhukar Pai, Canada Research Chair in Epidemiology & Global Health, wrote an insightful post, 'How can we become better teachers', in the February issue of Nature Microbiology. It's always great to see posts about teaching and learning appearing in mainstream research journals within a discipline. Professor Pai talks about starting small, thinking about your students, focusing on reflection, and much more. He also makes many of his teaching resources available for free on his teaching epidemiology website.
While from the bounded level of our mind
Short views we take, nor see the lengths behind,
But, more advanced, behold with strange surprise
New distant scenes of endless science rise!
– Alexander Pope, An Essay on Criticism, 1709
Alexander Pope may have been addressing an audience of literary critics, but his message is just as applicable to both faculty and students. To judge fairly and wisely – he wrote – be humble, follow nature, and study deeply. So what does it mean to study deeply?
Professor Richard Hovey, Oral Health and Society, makes a fair point: teaching and learning can occur through presentation, but not always. You can explain the science, the theory, the concept to your students, but will they be able to apply this information? Not necessarily. Of course, this also depends on the type of information. In fields like Medicine and Dentistry, sometimes practice is the best way to learn. To show students that there is more to learning than sitting passively and absorbing information, Prof. Hovey decided to teach his students how to juggle. He asked a colleague to give a PowerPoint presentation on juggling. The presentation contained information about the physics of juggling, with lines and arrows depicting the necessary movements and text describing the action. Following the presentation, Prof. Hovey asked his students to try juggling. They had the knowledge, right? They should have been able to juggle perfectly. But learning isn't always about transferring information to the brain with a click à la The Matrix. Knowing about a skill doesn't automatically translate into learning the skill, as Prof. Hovey's students realized when they tried to juggle and failed. Once he showed them how to juggle and they tried, tried and tried again, the students eventually learned to juggle, a lesson that can be transferred from juggling to any skill: hands-on experience and trial and error can be effective routes to learning.
However, taking the time to truly learn something may be daunting, especially when there is a demand for quickly processing large volumes of information. When mainstream media conveys the message that speed is the ultimate ace up your sleeve when it comes to processing information, then you may be reluctant to slow down and learn something in depth. What Prof. Hovey’s example teaches us is that taking the time to learn something may well lead to deeper long-term understanding.
Dr. Carolyn Samuel also helps her students engage in deep learning by focusing on improving students' reading and writing skills. She does so by helping students identify key disciplinary-specific features of academic writing: how a particular field states the problem, describes a study, defines key concepts, and supports claims with evidence. Students are asked to read scholarly articles and identify these features, repeating the exercise with their own writing. By developing an acute awareness of how problems, arguments and stories in their respective fields unfold through writing, students not only learn how to structure their own work so that it corresponds to the field, but also, and perhaps most importantly, how to really dissect and analyze the knowledge that their fields produce.
In sum, to study deeply is to immerse oneself in the subject at hand. This takes different forms in different disciplines but almost always leads to the opening of new vistas.
Check out the other posts in the Aspirations to Action blog series:
A great summary of the closing plenary from Andrew Hendry of our Learn to Teach Day. Thanks Ethan for the great summary!
Ethan’s 3rd post:
In the closing plenary of yesterday's Learning to Teach workshop, Dr. Andrew Hendry, professor of Evolutionary Ecology at McGill, gave a terrific example of what he called an 'inspirational class'.
According to him, since information is easy to access nowadays, what distinguishes a good teacher from a mediocre one is the ability to inspire students and make them feel sad when the class is over. He surely can do that. In his lecture, he demonstrated how to pass on hands-on learning, how to use social media to inspire students, and how to 'perform' in front of the class. By the end of his lecture, I could sense the energy throughout the audience and feel that the spirit of the entire hall had been lifted. A picture says a thousand words, and here is a YouTube link showing how he teaches evolution: