Online & Alternative Assessment Ideas

The following assessments could provide an alternative to your face-to-face final exam. When selecting an alternative assessment, consider whether it allows students to demonstrate the core learning outcomes of your course.

Concept Maps

Description

Use the creation of concept maps or flowcharts if you want students to:

  • Represent knowledge in graphic form
  • Organize and categorize knowledge
  • Identify connections and relationships between abstract or complex ideas.

Example forms of digital submission:

  • Written response created in an Office 365 document (e.g., Word SmartArt or a PowerPoint slide) and submitted to OWL Assignments
  • Visual representation made with an application such as Canva or LucidChart, saved as a PDF, and submitted to OWL Assignments or an OWL Student Page

Critical Reflection

Description

Use this form of assessment if you want students to:

  • Apply theory to practice or real world situations
  • Explore alternative perspectives
  • Assess the impact of the course experience on their own personal growth

Critical reflection prompts might ask students to:

  • Prepare a written paper similar to a short or long-form essay question typically seen on a face-to-face final exam. For example, students might be asked to synthesize ideas from across weeks. What are the themes of your course? How can you form synthesis or analysis questions around those themes?
  • Explore their assumptions/biases, strengths, weaknesses, skills, identity, or emotions in relation to a course theme
  • Articulate the assumptions embedded in arguments, social media posts, or any form of communication
  • Engage on a metacognitive level by explaining how they have met the learning outcomes of the course and how they will apply this learning in the future.

Example forms of digital submission:

  • Written response submitted to OWL Assignments as a Word document or as links to blog posts
  • Video submitted to OWL Assignments, shared on an OWL Student Page, or submitted as a link to a vlog response
  • Narrated PowerPoint submitted to OWL Assignments or VoiceThread presentation
  • Visual representation made with an application such as Canva or LucidChart, saved and submitted as a PDF

Possible evaluation rubrics:

Discussion Boards

Description

An online forum where students can make posts and reply to one another.

Use this form of assessment if you want students to:

  • Participate in whole-class or small-group discussion about course content.
  • Construct logical arguments and articulate ideas based on their understandings.
  • Develop a critical stance towards content knowledge.
  • Critically think about course concepts.
  • Gain multiple perspectives from personal and professional experiences of their peers. 

Strengths

  • Strong source of social presence for your course.
    Discussion boards are one of the most common ways to harness social interactions in online courses. 
  • Teaches students how to discuss course content succinctly.
    When students communicate their ideas about course content, they learn to articulate those ideas. This is especially true when instructors pose more challenging prompts that give students the opportunity to analyze data, consider alternative interpretations, explain complex phenomena, or debate an issue.
  • Provides an avenue by which students are exposed to multiple perspectives.  
    Discussion of course content with peers exposes different perspectives and personal experiences related to course content.
  • Promotes higher-level thinking.
    Discussions can enhance skills such as analysis and evaluation of arguments and positions, critical examination of one’s attitudes, beliefs, behaviors, and values, problem-solving, and citizenship. 

Limitations

Without a strong prompt, discussion boards can fall flat. To prevent superficial discussion, create topics that are genuinely discussable; avoid closed-ended questions. Also, prompt students to debate an issue, solve a problem, persuade their peers, or evaluate a case study.

Structure is important. Discussion boards need clear structure, instructions, and expectations. See “Preparation, implementation, grading/feedback” below for question types that provide structure to online discussions. Without structure, students can contribute fragmented conversations that lack rich flow.

Without strong facilitation, the learning benefits of discussions are not realized. Instructors need to plan and lead discussions effectively. See “Preparation, implementation, grading/feedback” below for facilitation tips.


Specific academic integrity issues
 

The most common academic integrity issue in discussion forums is plagiarism. Some students commit plagiarism inadvertently by not properly acknowledging sources, while other instances are intentional. Combat plagiarism by using originality-checking software (such as Turnitin), by increasing students’ knowledge of proper citation practices, and by levying steep penalties (awarding a zero grade, submitting the issue to the university’s academic integrity department, etc.). Visit Western’s academic integrity policies regarding appeals for undergraduate and graduate students to learn more.

Best practices online

  1. Students must be prepared for discussion.
    1. Because discussion is tied to interpreting, explaining, evaluating, arguing about, and solving problems related to course topics, students need to know something about the discussion topic. Make sure to tell students why the discussion is meaningful and how preparing for participation is helpful.
  2. Students need to feel safe.
    1. Course climate, or the intellectual, social, emotional, and physical environments in which students learn, impacts student motivation and learning.
    2. The climate of your course is influenced by teaching presence, social presence, course tone, peer demographics, and course content.
    3. Establish discussion norms with your class (see Preparation, implementation, and grading/feedback section below for considerations) to help facilitate a respectful course environment.
  3. Students respond well to a variety of structured discussion formats.
    1. Consider structuring discussions differently, such as a debate in Week 3, asking students to solve a critical problem in your field in Week 5, and asking students to share their personal and professional experiences with course topics in Week 7.
  4. Students do well with multiple good answers.
    1. Good discussion questions ask for complex and varied answers, moving beyond simple recall. They demand higher-level thinking.
    2. Good discussion questions should be open-ended and invite creativity (see Preparation, implementation, and grading/feedback section for question types).
  5. Students benefit from time to think before contributing.
    1. Students need time to craft a response they are comfortable sharing in public. Time to process the question, review relevant content, and assemble an answer using the language of the discipline is important.
    2. Consider asking for mid-week posts to give students the time they need.
  6. Students can benefit from graphical expression.
    1. When appropriate, asking students to contribute graphical representations of their understanding is a strong method for developing conceptual thinking and examination of the interrelationships among course content.
    2. Flowcharts, concept maps, and Venn Diagrams are examples of graphical representations.
    3. Graphics also align with universal design by crossing cultural and language barriers.
  7. Students discuss better in small groups (especially in large enrolment courses; see links below for creating groups in OWL).
    1. Create groups within which students can dive into deeper, more meaningful discussion.
    2. In a large enrolment course, this also helps facilitate instructor or TA moderation of discussions.


Tools @ Western

The Forums Tool in OWL allows instructors to create an unlimited number of discussion forums, and is integrated closely with other tools such as Resources and Gradebook.  

Creating Groups in OWL

The VoiceThread Tool in OWL for discussions that are video, audio, and/or text-based.

The Collaborate UE Tool in OWL or Using Zoom in OWL for synchronous discussions.
 

Preparation, implementation, grading/feedback 

Creating Group Forums:

  • Create a Groups in OWL
  • Create a Forum Topic
  • To create multiple, private group topic areas, create a new Forum topic and choose Automatically create multiple topics for groups. Then select each group for which a topic should be created. Each group member will be set to "Contributor" in their group's topic and "None" in other automatically created topics.
  • Grade the Forum by selecting either to grade across topics or for each topic.

Considerations for Course Discussion Norms:

  • Discussion norms are a set of ground rules for discussions within your course.
  • It is worth establishing ground rules early in the semester and co-authoring with students.
  • Consider addressing the following:
    • Treating others with respect
    • Maintaining confidentiality for information shared within the course
    • Making attempts to not stereotype, label, blame, or judge peers
    • Challenging ideas, not people
    • Speaking only for yourself (using “I” language)
    • Supporting statements with evidence

 Types of Discussion Questions that foster higher-level thinking:

  • Hypothetical questions – these are what-if situations where you ask students to propose plausible ramifications by drawing on research, course content, and prior knowledge.
  • Comparative questions – these questions ask students to analyze similarities and differences between theories, literary works, research studies, phenomena, events, etc.
  • Critical questions – these questions guide students to evaluate the effectiveness of a given approach, procedure, solution, or recommendation.
  • Summary/synthesis questions – these questions ask students to reflect on course content. What was surprising? How might they apply content to their practice?

Question stems to guide students in evaluating course information:

  • What criteria would you use to assess …
  • How would you determine …
  • How could you verify …
  • How would you generate a plan to …
  • What alternative would you suggest for …
  • What would happen if … 

Concrete examples:

  • What is the most important part of Moby Dick? Why? Give evidence to support your stance.
  • Why do you think this composer may have chosen to use whole notes at the end of each bar in this song? Elaborate on your reason.
  • How could an athlete improve their muscular endurance to become a better football player?

 Tips for Facilitating Discussions:  

  • Ask follow-up questions such as:
    • “How might you add validity to your claim(s)?”
    • “Can you think of additional evidence to contribute?”
    • “How did your peers’ responses shape/change your understanding?”
  • Supplement/correct students when you witness faulty information or see a need to refocus groups.
  • Consider issuing a trigger warning, that is, a warning that the discussion may contain content some students find offensive or disturbing (if applicable). Example trigger warnings:
    • “Our classroom provides an open space for the critical and civil exchange of ideas. Some readings and other content in this course will include topics that some students may find offensive and/or traumatizing. I’ll aim to forewarn students about potentially disturbing content, and I ask all students to help to create an atmosphere of mutual respect and sensitivity.”
    • “Next class our discussion will probably touch on _______. This content is disturbing, so I encourage you to prepare yourself emotionally beforehand. If you believe that you will find the discussion to be traumatizing, you may choose to not participate in the discussion. You will still, however, be responsible for material that you miss, so if you choose not to participate, please arrange to get notes from another student or see me individually.”
    • “The following reading includes a discussion of __________. This content is disturbing, so I encourage everyone to prepare themselves emotionally before proceeding. If you believe that the reading will be traumatizing for you, then you may choose to forgo it. You will still, however, be responsible for material that you miss, so please arrange to get notes from another student or see me individually.”
  • Address microaggressions (usually unintentional comments anchored in stereotypes) as they happen.
    • Stay calm.
    • Acknowledge the moment and immediately take the lead in addressing the situation (slow down or stop the conversation).
    • Acknowledge the emotions in the room, both visible and invisible.
    • Return to the ground rules. Hold students accountable for their actions and ask for clarification.
    • Explain why the incident is problematic. Support students in critical reflection on the situation.
    • While acknowledging the impact, make sure to validate and support those who have been targeted.
    • Follow up as needed, e.g. revisit in next class and/or see individuals after class.
    • Continue to model inclusive language and behaviors.

Assessment Considerations:

  • Require a mid-week and end-of-week post. The mid-week post gives students time to consider their responses and is early enough in the week to facilitate discussion so peers can reply later in the week.
  • Require an original post before students can reply to their peers.
  • Require a minimum number of replies.
  • Consider your grading criteria. You should not burden students with more than six criteria.
    • Meeting staggered deadlines for discussion
    • Quantity of contributions
    • Quality of contributions (length requirement, formatting, grammar/spelling)
    • Accuracy of content
    • Relevancy to discussion prompt
    • Evidence utilized in contributions
    • Professionalism displayed in contributions
    • Demonstration of critical thinking, analysis, synthesis, or creative thought (as applicable)
  • Consider whether you will Grade the Forum by overall contribution or by topic.

References

Centre for Teaching Excellence, University of Waterloo. (n.d.). Trigger warnings. Retrieved from https://uwaterloo.ca/centre-for-teaching-excellence/trigger

Center for Teaching and Learning, University of Washington. (2020). Addressing microaggressions in the classroom. Retrieved from https://www.washington.edu/teaching/topics/inclusive-teaching/addressing-microaggressions-in-the-classroom/

Darby, F. & Lang, J. (2019). Small teaching online: Applying learning science in online classes. San Francisco, CA: Jossey-Bass.

Nilson, L., & Herman, J. (2018). Creating engaging discussions: Strategies for “avoiding crickets” in any size classroom and online. Sterling, VA: Stylus Publishing, LLC.

Whiter, K. (2019). Strategies for engaging students in the online environment. In E. Alqurashi (Ed.), Fostering student engagement with instructional technology in higher education. IGI Global.

Find the Error/Flaw

Description

Students receive calculations or problem-solving questions that have already been solved but contain an error or flaw. Ask students to identify (and possibly correct) the error.

Use this form of assessment if you want students to:

  • Demonstrate their ability to find errors in sets of data, problem solving questions, or arguments

Example forms of digital submission:

  • Written response submitted in an Office 365 document (e.g., Word or Excel) and submitted to OWL Assignments

Group Exams

Description

When students work together in pairs or small groups to complete an exam.

Use this form of assessment if you want students to:

  • Gain important professional skills around collaborating toward a group consensus.
  • Construct logical arguments and articulate ideas based on their understandings.
  • Develop a critical stance towards content knowledge.
  • Critically think about course concepts.
  • Achieve better content retention and greater understanding.
  • Be more interested in course content.

Strengths

  • Students learn and retain better with group exams.
    Most research on group exams demonstrates that test scores are higher when students work together, and that these higher scores reflect better learning and retention of course material. The process of articulating ideas supports the transfer of information from short-term to long-term memory.
  • Group exams are excellent for helping students achieve higher-order learning.
    When students communicate their ideas about course content, they learn to articulate those ideas and collaborate to tackle complex problems. Group exams allow instructors to pose more challenging questions that give students the opportunity to analyze data, consider alternative interpretations, explain complex phenomena, or design experiments.
  • Test anxiety is reduced with group exams.
    Group exams provide immediate feedback to students about their learning through group discussion, which many students find more comfortable than instructor feedback.
  • Group exams help students find course content more interesting.
    When group exams are used to present critical problems for students to solve, students see the real-world application of course content. Discussion of course content with peers also exposes different perspectives and personal experiences related to the material.

Limitations

While not a necessity, group exams work best when your course contains other collaborative learning elements. It is important that students understand how to work effectively in a group. Tips for developing this comfort level are listed below.

Peer contingencies, or the fact that a grade will hinge in part on the knowledge of others, are a real fear for students. It is a tricky balance between individual accountability and group dependence. Tips for grading structures to tackle this challenge are listed below.

Specific academic integrity issues 

Because groups will be completing exams in an unproctored environment, there is little control over the external resources they may draw upon to answer questions. Recognizing this heightens the need to create questions that assess higher-order thinking skills (e.g., application, analysis, evaluation), which reduces the possibility of simply looking up an answer online.

Informing students of the pedagogical rationale for selecting this form of assessment can also reduce cheating.

Best practices online 

  1. Orient students to the group exam.
    1. Articulate your reasoning for using a group exam by providing an overview of course learning objectives and how the group exam will prepare students to meet them (e.g., critical thinking, collaboration, communication).
    2. Provide clear instructions, expectations, and time requirements for the group exam.
    3. Communicate the required technology for completing the group exam.
  2. Formulate groups as early as possible.
    1. It is important that you give students time to connect with their group, especially when using asynchronous methods.
    2. Groups can begin to build rapport before you release the exam questions.
  3. Decide on asynchronous and synchronous tools for groups to utilize.
    1. Consider using Forums, VoiceThread, or Teams for groups to connect and collaborate asynchronously.
    2. For synchronous connection, groups can utilize Collaborate UE or Zoom through your course OWL site.
  4. Decide on Group Exam method (see Preparation, implementation, and grading/feedback section).
  5. To help groups stay focused and demonstrate their learning, present group exams that contain more long-answer type questions where groups will need to present justifications for their answers.


Tools @ Western

The Tests and Quizzes Tool in OWL allows you to create online quizzes with a variety of question formats, including those addressed above. For more information about using the Tests and Quizzes Tool for creating, administering, and grading your online quizzes, please see the OWL help page for the Tests and Quizzes Tool.

Creating Groups in OWL

The Forums Tool in OWL

The VoiceThread Tool in OWL

The Collaborate UE Tool in OWL

Using Zoom in OWL
 

Preparation, implementation, grading/feedback 

Method 1: Group Exam as part of an individual exam

  • In this method, students take an exam where a portion is answered on an individual basis, then 1-2 questions are posed to a group.
  • The entire exam is graded as one complete exam, but the group answers might be 20-50% of the exam total.
  • Each student will be turning in their own complete exam after collaborating on answers to the group-posed question(s).

To set this up in OWL Test and Quizzes:

  1. Create a new Test including all questions. Set the open date/time and close date/time for a timeframe that allows enough time for students to collaborate and discuss question(s).
  2. Create groups in OWL and set up Group Forums to share the group question(s) with each group.
  3. Once students have collaborated, they will complete the exam individually, including questions you intended to be completed alone and the question(s) that they discussed as a group.

Method 2: Group Exam as follow-up to an individual exam (also known as the Two-Stage Exam)

  • For this method, students take an exam alone, then are placed in groups to re-take the same exam.
  • Two-stage exams have been linked to greater student retention of course material (Gilley & Clarkston, 2014) and group scores are often higher than those of any individual within the group.
  • Two-stage exams can significantly improve student performance on multiple choice and long-answer questions (Gilley & Clarkston, 2014), but long-answer questions are more conducive for testing higher-order thinking skills and provide a more reflective evaluation of student learning.
  • Grading schemes often include (see the worked sketch after this list):
    • Averaging scores from individual and group attempts.
    • Adding the scores from both attempts together.
    • Deciding on a percentage split between the attempts (e.g. 60% individual/40% group attempt).
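
To make the arithmetic behind these schemes concrete, here is a minimal Python sketch. The function name, the 60/40 split, and the sample scores are hypothetical illustrations, not part of any OWL tool; use whatever weighting you announce in your syllabus.

```python
def combined_exam_grade(individual, group, scheme="weighted", weight_individual=0.6):
    """Illustrative two-stage exam grade calculation (hypothetical example).

    individual, group: scores on the same scale (e.g., percentages out of 100).
    scheme: "average", "sum", or "weighted".
    """
    if scheme == "average":
        # Average the individual and group attempts.
        return (individual + group) / 2
    if scheme == "sum":
        # Add both attempts together (e.g., each attempt marked out of 50).
        return individual + group
    if scheme == "weighted":
        # Percentage split, e.g., 60% individual / 40% group.
        return weight_individual * individual + (1 - weight_individual) * group
    raise ValueError(f"Unknown scheme: {scheme}")

# Example: individual score 70%, group score 90%, 60/40 split -> 78%
print(combined_exam_grade(70, 90))             # 78.0
print(combined_exam_grade(70, 90, "average"))  # 80.0
```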

To set this up in OWL:

  1. Create groups in OWL.
  2. Create an assignment in the OWL Assignments tool and duplicate for group attempt. Give the Group Exam a distinguishing name by adding “G” or “Group” to the assignment name.
  3. First, set a due date for the individual attempt and assign to all students.
  4. Next, edit the Group assignment with a due date for the group attempt.
  5. Assign the Group assignment to each group. Each group will only submit one assignment. Anyone from the group can submit, so the group will need to decide on a “submitter”.
  6. Set up OWL Gradebook to reflect the grading scheme you choose (e.g., using the Categories and Weighting section, create a category for the exams, either to include both exams in the weighting or to drop the lowest grade).

Method 3: Peer Coaching during an individual exam

  • For this method, students take an exam independently, then participate in a group discussion/study session, then re-take the exam independently.
  • This is another form of a two-stage exam and comes with all the benefits.
  • Grading schemes often include (see the sketch after this list):
    • Averaging score of both attempts.
    • Taking the higher score.
    • Using the second attempt as a bonus:
      • Students receive a bonus equal to a percentage of their own score increase on the second attempt (e.g., at a 10% bonus rate, a student who rises 10 points gets 1 point added to their original grade).
      • Students receive a bonus equal to a percentage of the class’s average score increase on the second attempt (e.g., at a 10% bonus rate, if the class rises 15 points on average, everyone gets 1.5 points added to their original grade).
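
The two bonus options differ only in whether the bonus is based on a student’s own improvement or the class’s average improvement. The short Python sketch below illustrates both calculations, assuming a hypothetical 10% bonus rate to match the examples above:

```python
def bonus_from_own_rise(original, second_attempt, bonus_rate=0.10):
    """Bonus based on the student's own improvement (hypothetical 10% rate)."""
    rise = max(second_attempt - original, 0)  # no penalty if the second score is lower
    return original + bonus_rate * rise

def bonus_from_class_rise(original, class_average_rise, bonus_rate=0.10):
    """Bonus based on the class's average improvement."""
    return original + bonus_rate * max(class_average_rise, 0)

# A student who rises 10 points receives 10% of that rise, i.e., 1 bonus point.
print(bonus_from_own_rise(70, 80))    # 71.0
# If the class rises 15 points on average, everyone receives 1.5 bonus points.
print(bonus_from_class_rise(70, 15))  # 71.5
```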

To set this up in OWL:

  1. Create the test in OWL and specify two attempts in the settings to account for the second attempt.
  2. First, set a due date that accounts for both the individual and group attempt and assign to all students.
  3. Communicate to your students when you want each attempt completed within the timeframe that the test is open.
  4. Create groups in OWL and give them a due date by which they need to connect and discuss/study.
  5. Set up OWL Gradebook to reflect the grading scheme you choose (e.g. average both attempts, take higher score) or manually adjust the grades after the second attempt to reflect addition of bonus points from the grading section within Tests and Quizzes. 


References

Clark et al. (2018). Off to on: Best practices for online team-based learning. Team-Based Learning Collaborative. Retrieved from http://www.teambasedlearning.org/wp-content/uploads/2018/08/Off-to-On_OnlineTBL_WhitePaper_ClarkEtal2018_V3.pdf

Gilley, B. H., & Clarkston, B. (2014). Collaborative testing: Evidence of learning in a controlled in-class study of undergraduate students. Journal of College Science Teaching, 43(3), 83-91.

Hodges, L. (2004). Group exams in science courses. New Directions for Teaching and Learning, 100, 89-93.

Parkes, J. & Zimmaro, D. (2016). Learning and assessing with multiple-choice questions in college classrooms. Routledge, Taylor & Francis Group.

Online Quizzes

Description

Short unproctored assessments that students can complete on any computer.

When to use:

  • you want more frequent, lower-stakes assessments for students to engage with course content.
  • you want students to prepare for a larger assessment.
  • you want to try out different question formats so students can practice providing information in different ways.

Strengths and Limitations 

Strengths

  • Allows for frequent, lower stakes assessment
  • Allows for a range of question types
  • Depending on the question types used, can assess a range of learning outcomes and varying cognitive levels (i.e., remember, understand, apply, analyze, evaluate)
  • Can sample a large number of content domains
  • Scoring for most forms of questions is easy
  • Can easily provide feedback for most types of questions
  • Can be completed on any computer or mobile device
  • OWL Tests & Quizzes supports a variety of question types and flexible administration of questions

Limitations 

  • Writing good questions can be difficult, particularly ones assessing higher-order thinking, and takes considerable time and skill
  • Frequent assessment requires considerable planning, can induce assessment fatigue for faculty and students, and can raise academic integrity issues
  • As these are unproctored assessments, students will have access to available resources when answering the questions (e.g., classmates, the internet, textbooks, course notes)

Many of the strengths and limitations of online quizzes are specific to the format of the questions. The strengths and limitations of multiple choice, true/false, and short answer items are addressed on their respective web pages.

Specific academic integrity issues

Because online quizzes are unproctored, it is possible that students are drawing on external resources to answer the question (e.g., calculators, the internet, classmates). Below are a number of strategies that one might employ to try to address these issues. OWL has the functionality to support all of these options.

  • Create questions that assess higher order thinking skills (e.g., application, analysis, evaluation) as this will reduce the possibility of students simply looking up the answer online.

  • Randomly assign equivalent versions of a quiz to your students to reduce unpermitted collaboration. It is important that the versions are equivalent in terms of characteristics such as difficulty.

  • Randomly order questions, or blocks of questions, within a quiz so that students are not receiving the questions in the same order.

  • For multiple-choice questions (MCQ), randomize the response options so that students are not receiving the response options in the same order. This should not be employed when using response options such as “All of the Above” or “None of the Above” as it will likely cause confusion.

  • Have a larger number of lower stakes quizzes to reduce the impact of any one quiz. Of course, it is important to consider the workload involved for yourself and your students. You do not want to overwhelm them or yourself with the volume of assessments.

  • You could only count a subset of the quizzes, such as the best 10 of 12 or 5 of 6, reducing the stakes of any one quiz (cf. Parkes & Zimmaro, 2018).

  • Administer one question on a page at a time with no ability to go back to a previous question. It would be important to be very transparent with students about this up front as this may change how they approach the quizzes.

  • Set limits on the time that students have to complete a quiz once they start it. Important considerations include that students may live in different time zones or only have access to a computer at different times. Also, it would be important to ensure that students who require extra time to complete assessments are given the appropriate time to do so.

  • Inform students of the pedagogical rationale for selecting this form of assessment as students are more likely to cheat if they think that assessments are simply busy-work (McCabe et al., 2012).

Research on quizzing indicates that students learn content better when they are tested on that content, including self-testing, rather than just re-reading it as testing requires students to process that information more deeply (e.g., retrieving it, analyzing it, applying it; Roediger & Butler, 2011). Also, the quizzes serve a diagnostic function for students – highlighting areas that they need to emphasize in their studying.

Tools @ Western

The Tests and Quizzes Tool in OWL allows you to create online quizzes with a variety of question formats, including multiple choice, true/false, and short answer. For more information about using the Tests and Quizzes Tool for creating, administering, and grading your online quizzes, please see the OWL help page for the Tests and Quizzes Tool.  
 

Preparation, implementation, grading/feedback 

Preparation, implementation, and grading issues will vary in part depending on the types of questions asked. You can access relevant guidelines for different question types by selecting the question types: Multiple choice, true/false, and short answer questions.

General principles are addressed below, unless covered in an earlier section above.

  • Good questions take time for you to write, especially ones assessing higher order thinking. Be sure to give yourself enough time to write your questions.

  • As with all assessments, it is important to link your questions to specific learning outcomes at varying levels of cognitive complexity. To help in that process, you could create a test blueprint similar to the one linked here.

  • Seek feedback from colleagues on your items to ensure they are well-written questions assessing the constructs and at the cognitive level you intend.

  • Good questions take time for your students to answer, especially ones assessing higher-order thinking. Be sure to give your students enough time to complete the assessment.

Parkes and Zimmaro (2016) suggest, for multiple-choice items, that questions usually take students 1 to 2 minutes to complete, with more complex questions (e.g., one’s involving reading scenarios, interpreting graphs) taking longer. They suggest timing yourself taking the test and multiplying that time by a factor of 4 to get an estimate of the time your students will need.
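
For example, if working through the quiz yourself takes 12 minutes, plan for most students to need roughly 48 minutes.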

  • Provide clear instructions on how to take the quiz. How much time do students have to take the quiz? How many items are on the quiz? Are there any aids students are allowed to use/need for the quiz (e.g., a calculator)? Are there any particular procedures in place for presenting the items (e.g., 1 item on a page, no ability to return to earlier items; Parkes & Zimmaro, 2016)?

  • Provide practice questions to allow students to become familiar with the quiz format and the types of questions you will be asking.

  • Many assessment types in OWL Tests & Quizzes allow for automatic grading (e.g., multiple choice, true/false, fill in the blank) and can upload those grades directly to OWL Gradebook.

  • Enable students to receive feedback on their answers so that the quizzes are a learning experience.

The timing of the feedback is important; giving feedback in a timely manner increases the likelihood that students will use it, but it is important not to provide the answers until all of the students have completed the quiz. OWL allows you to control when individual feedback is released to students, allowing you to share the feedback at an appropriate time. It also has some flexibility in terms of the types of feedback provided.

References

McCabe, D. L., Butterfield, K. D., & Trevino, L. K. (2012). Cheating in college: Why students do it and what educators can do about it. JHU Press.

Parkes, J., & Zimmaro, D. (2016). Learning and assessing with multiple-choice questions in college classrooms. Routledge.

Parkes, J., & Zimmaro, D. (2018). The college classroom assessment compendium: A practical guide to the college instructor’s daily assessment life. Routledge.

Roediger III, H. L., & Butler, A. C. (2011). The critical role of retrieval practice in long-term retention. Trends in Cognitive Sciences, 15(1), 20-27.

Open Book Exams or Take-Home Tests

Description

An open book test is any examination in which students can consult resources (e.g., textbooks, notes, online resources) in completing the examination. This can be done synchronously (e.g., during class time or an exam time) or asynchronously (i.e., in a take home format). 

Use this form of assessment if you want students to:

  • Have access to notes, texts, or other information during their exam
  • Apply knowledge from available sources to specific problems, questions, or case studies
  • (Optionally) work in groups to collaboratively access and apply information to specific problems, questions, or case studies (Johnson et al., 2015)

Strengths and Limitations 

Strengths

  • Allows for assessment of higher order learning (e.g., application, analysis, evaluation, creation)
  • Develops information literacy skills
  • Mimics actual professional activities where access to information is not intentionally limited
  • More realistic demonstration of learning than standard exams
  • Less anxiety provoking for some students than standard exams
  • Flexible in terms of the form of assessment (e.g., essay questions, analysis of cases, analysis and interpretation of data) 

Limitations

  • Writing good questions can be difficult
  • Generally involve fewer, more involved questions and may not allow as comprehensive an assessment of student learning as closed book exams
  • Students may not be familiar with this form of assessment
  • Grading can be time consuming
  • Grades may be higher than with closed book exams (Parkes & Zimmaro, 2018)

Many of the strengths and limitations of open book and take-home exams are specific to the format of the questions [e.g., see Boye (2019) for the strengths and limitations of essay exams].

Specific academic integrity issues 

One of the major concerns with take-home exams in particular, as they are unproctored, is academic integrity. Below are a number of strategies that one might employ to help address these issues. These are also considerations for open book tests.

  • Create questions that assess higher order thinking skills (e.g., application, analysis, evaluation, creation) as this will reduce the possibility of students simply looking up the answer online.

  • Require students to explicitly draw on course materials (e.g., course notes, the textbook) and properly cite these sources, as this will also reduce students simply looking up the answers online.

  • Clearly communicate expectations concerning the exam (e.g., are students allowed to collaborate and to what extent; what are the page or word limits; how much time do they have; how, in what format, and using what naming conventions should the exams be submitted; how should they cite sources) and provide an opportunity for students to ask questions about the process.

  • Inform students of the pedagogical rationale for selecting this form of assessment as students are more likely to cheat if they think that assessments are simply busy-work (McCabe et al., 2012).

  • Use Turnitin, a plagiarism detection software, in OWL’s Assignments tool to determine the originality of students’ submissions (click on these links for more information on enabling Turnitin for an assignment and generating a Turnitin originality report). 

Tools @ Western

Example forms of digital submission:

Preparation, implementation, grading/feedback

  • Write clear, unambiguous questions and instructions to avoid confusion and reduce cognitive load.

  • Create questions that assess higher order thinking skills (e.g., application, analysis, evaluation, creation).

    For example, you may want to structure your question around a problem, case, or other real world scenario or provide qualitative or quantitative data for students to analyze and write up (see Centre for Teaching and Learning, University of Newcastle, n.d.; Centre for Teaching and Learning, Queen’s University, n.d.; Teaching at UNSW-Sydney, 2018; Williams, 2004; also see Campbell, 2020 for example questions in Medical Sciences).

  • Require students to utilize the information that they are accessing in the resources (e.g., apply, analyze) rather than just find and restate it.

  • Consider randomly assigning equivalent versions of the exam questions to your students to reduce unpermitted collaboration. It is important that the versions of the questions are equivalent in terms of characteristics such as difficulty and that they adequately assess your learning outcomes. Dickinson and Knabe (2020) illustrate this method with their example: “...perhaps there are 8 quotations and 4 essay questions in total, but each student gets 2 quotes and 1 essay from that pool. This would reduce the extent of the collaboration as each person’s question combination would be different” (p. 1).

  • Set guidelines and/or limits on the length of responses to ensure that students do not submit longer or shorter than necessary responses (e.g., word limits, page limits).

  • Set reasonable time limits that allow students to complete the requested task (or adjust the task to fit the allocated time). Answering questions for open book and take home exams takes longer than for closed book exams.

    Dickinson and Knabe (2020) provide an example in which students are asked to provide a take home exam that includes “…two 750 word responses that should take most students about 4 hours each to complete. In this case you might provide a window of 3 days to complete the assessment” (p. 1).

    They also suggest that a short time frame to complete the assessment may, in essence, be a timed open-book test and “…you may find that the need for accommodation/consideration for disability-related requests and conflicts increases” (p. 1). A longer time frame will give students more flexibility in completing the exam at a time that works with their schedules.

  • For open book exams, it is important when setting the time for the assessment to keep in mind that students may live in different time zones or only have access to a computer at different times. Also, have contingency plans in place for what to do if students experience an issue during your assessment (e.g., unreliable internet access, an emergency).

  • Outline your expectations in terms of how much time specific questions or tasks should take.

  • Discuss with students how to prepare, particularly for open book exams. Students may assume that they will not need to prepare but can simply look up the answers.
    Parkes and Zimmaro (2018) recommend discussing the importance of students organizing and familiarizing themselves with the resources they will be bringing to the open book exam. They also suggest telling students to review the questions, provide answers, and rate their confidence in the responses (e.g., “no idea”, “not sure”, “sure”, p. 165) before opening any resources. Drawing on the resources, they can then address the questions for which they were “not sure”, as these should take less time than the “no idea” questions.

    Western’s Learning Development & Success (n.d.) has developed a handout for students that provides general guidelines for preparing for open book online exams, which you could share with your students (also see Academic Skills at Trent University’s web site on Preparing for an Online, Open-book Exam).

  • Provide students with example exam questions, exemplary answers, the grading rubric, and any other resources that will help them understand what is required of them with this form of assessment (for more information on grading with rubrics, click here). Perhaps provide a low-stakes version of the assessment to give students experience with the process.

  • As outlined above, clearly communicate expectations concerning the exam (e.g., are students allowed to collaborate and to what extent; what are the page limits; how much time do they have; how, in what format, and following what naming convention should the exams be submitted; how should they cite sources).

  • Provide an opportunity for students to ask questions about the process to dispel misconceptions (e.g., take home exams are easy).

  • Grade using a rubric (see above).

  • To reduce your grading time, you could audio or video record your feedback or employ other grading strategies.
     

References

Academic Skills, Trent University (n.d.). Preparing for an online, open-book exam. https://www.trentu.ca/academicskills/how-guides/how-study/prepare-and-write-exams/preparing-online-open-book-exam

Boye, A. P. (2019). Writing better essay exams (IDEA Paper #76). IDEA Center. https://www.ideaedu.org/idea_papers/writing-better-essay-exams/

Campbell, N. (2020, March). Take home exams. https://teaching.uwo.ca/elearning/presentations/Take%20home%20exam%20overview%20and%20example.pdf

Centre for Teaching and Learning, Queen’s University (n.d.). Case-based learning. https://www.queensu.ca/ctl/teaching-support/instructional-strategies/case-based-learning

Centre for Teaching and Learning, Queen’s University (n.d.). Problem-based learning. https://www.queensu.ca/ctl/teaching-support/instructional-strategies/problem-based-learning

Centre for Teaching and Learning, University of Newcastle (n.d.). A guide for academics: Open book exams. University of Newcastle, Australia. https://www.newcastle.edu.au/__data/assets/pdf_file/0006/268980/Open-Book-Exams.pdf

Dickinson, W., & Knabe, S. (2020). Take home exams. [Unpublished document]. Western University.

Johnson, C. M., Green, K. A., Galbraith, B. J., & Anelli, C. M. (2015). Assessing and refining group take-home exams as authentic, effective learning experiences. Journal of College Science Teaching, 44(5). https://doi.org/10.2505/4/jcst15_044_05_61

Learning Development & Success, Western University (n.d.). Online exams. https://www.uwo.ca/sdc/learning/onlineexams.pdf

Parkes, J., & Zimmaro, D. (2018). The college classroom assessment compendium: A practical guide to the college instructor’s daily assessment life. Routledge.

Teaching at UNSW-Sydney (2018, October). Assessment by case studies and scenarios. https://teaching.unsw.edu.au/assessment-case-studies-and-scenarios

Williams, J. B. (2004). Creating authentic assessments: A method for the authoring of open book open web examinations. In R. Atkinson, C. McBeath, D. Jonas-Dwyer & R. Phillips (Eds.), Beyond the comfort zone: Proceedings of the 21st ASCILITE Conference (pp. 934-937). Perth, 5-8 December. https://eprints.qut.edu.au/13072/1/13072.pdf

Problem-Based Learning

Description

Problem-based learning (PBL) is part of a larger group of learning activities and assessments under the umbrella of inquiry-based learning. Other forms of inquiry-based learning include design labs, case studies, moot court in law schools, and medical rounds in the health sciences.

In PBL, learners are tasked with exploring solutions to a problem derived from real-world situations, with limitations and structures that are as authentic as possible. With time to build content knowledge and discipline-specific skills toward solving the problem, learners make judgments, interpret, and synthesize information in meaningful ways. Given appropriate support structures and a sufficiently complex problem, PBL is more representative of authentic learning and can be highly engaging for students.

Examples of problem-based learning (with a digital framework):

Example 1 (Social Sciences):

  • Social Science students are broken into groups and given specific locations in specific cities in Google Maps. Each group is tasked with exploring their assigned neighborhoods using Google Street View. They need to create an inventory of resources they find in the area to answer the perennial question, “How can we combat urban poverty?” Over the next several weeks, groups must apply what they are learning in class to their neighborhood inventory and analysis. The topics include access to healthcare facilities, public transportation, food deserts, race/class segregation, and urban infrastructure support. The final product students are working to complete is a group presentation of their inventory and social scientific analysis applying their knowledge and competency in mastering the course’s learning objectives. Students will engage with their peers in other groups about the challenges each neighborhood faces (in synchronous discussions via Zoom or Blackboard Collaborate). Individually, they keep assignment logs and must complete a timed weekly quiz on course subject matter that is automatically graded in OWL. They submit group assignments using what they are learning in class to analyze and propose 5 theory-based suggestions on what next steps might augment and/or create further deficits within the community.

Example 2 (Ethics/Engineering):

  • Senior students are given readings on the Genoa bridge collapse and the Volkswagen emissions scandal. In short essay assessments, students are asked to identify how ethics played a role in the lead-up to these disasters. They are given opportunities to do research and are tasked with answering how, given the high level of government regulation, these situations were able to occur. They discuss the question in OWL forums. They are also tasked with using VoiceThread to make comments on photos of the equipment and structural problems related to the problem. The final task is to apply their knowledge of both engineering and ethics to come up with suggestions for how to prevent subsequent breaches like these. As a class, they come up with a list of six well-defined ethics breaches and recommendations. This is completed via peer-reviewed assignments. As individuals, they analyze the bridge problem and write a technical report via the Assignments tool on where the engineering failure occurred. Students also submit an individual reflection on how they might have addressed the ethics gap if they were in a similar situation.

Example 3 (STEM/Biology):

  • Biology students are tasked with answering the question, “What is the biology of love?” Over the course of the semester, they will need to analyze how various behaviors and biological functions play a role in the reproduction of various species. There are three subjects (mating habits, reproductive habits, and biochemistry) that students may sign up for in OWL groups. The end goal of each group is to create a wiki in OWL that can act as a resource for other biology students who are learning about reproductive functions. These groups will remain consistent for the semester; students will interact in group-assigned forums and they will present to each other in peer-reviewed assignments (using a rubric provided by the instructor). The forums can be graded based on quick reviews of student participation level and/or through faculty evaluation of the quality of posts. To ensure students stay on task, there is a weekly quiz that automatically grades students on course content. Students are required to contribute to their own group wiki and they must submit a contribution to at least one other group’s wiki that makes a connection between their wiki and the others’. For the final, students need to draw upon the class wikis to submit either a mechanistic (proximate) or functional (ultimate) explanation of the biology of love.

Example 4 (Engineering):

  • Engineering students are given a series of cases over the semester that become increasingly complex as the course progresses. The first case requires them to submit a technical report that outlines chemical reactions in a manufacturing process. The second case requires them to prepare a presentation for a civic board concerned about the possible dangers of the construction of a nearby plastics factory. The presentation must make clear references to technical analysis of data provided to students. The third case is an interview with a petroleum company wherein students are required to outline a component of a manufacturing system in which they demonstrate their understanding of a subunit’s processes and governing protocols for production and control. Each case builds from new content introduced in the course, and later cases depend on the knowledge and skills learned from prior cases. Using the tools in OWL, students learn, practice, and submit their materials for grading. To add increased authenticity, students could make presentations of their work to industry representatives.

Use this form of assessment if you want students to:

  • Practice concrete applications of theory and content.
  • Develop authentic learning and application beyond rote content memorization.
  • Engage with course content in a much more interactive manner.
  • Increase awareness of application and relevance of theory and discipline-specific knowledge to situations beyond the classroom.
  • Promote soft skills, including working with others, communication, time management.
  • Scaffold learning from multiple units into a more continuous learning experience.

Strengths and Limitations 

Strengths

  • Shifts away from classroom practices of short, isolated, content-centered lessons.
  • Impactful because PBL assignments are longer-term and encourage developmental growth over the semester.
  • Interdisciplinary and open-ended, allowing students to develop more organic solutions over time.
  • Integrates real-world issues and practices.
  • Fosters the use of concrete applications to abstract, intellectual tasks wherein learners can explore complex issues with the knowledge and skills they are developing in a course.
  • Encourages the development of critical reflection and self-awareness.
  • Promotes the development of peer learning within the larger context of independent discovery.

Limitations

  • Requires clearly defined problems and a well-established structure that allows for student success. If the problem is too simple or too difficult, students will become frustrated, bored, and disengaged.
  • Can be difficult to manage in larger classes.
  • Specific skills can be difficult to single out for assessment.

Specific academic integrity issues

PBL tends to have fewer issues with academic integrity because of the open-ended nature of the problems and the pathways to solve them. If a problem is too simple or its solution is easily found online, then academic integrity can become an issue. For example, asking students to develop plans for an electric vehicle that can hit a speed of 25 kph for 15 kilometers may be open-ended, but solutions are easily found with a Google search. Likewise, if a problem is too open-ended (e.g., solve the problem of a food desert) or does not have enough levels of support to guide student inquiry, then the temptation to breach academic integrity increases.

Tools @ Western

Example forms of digital submission:

Preparation, implementation, grading/feedback

Starting and Preparing a PBL Assessment

  • Choose a central idea, concept, or principle that is always taught
  • Think of a typical end-of-chapter problem, assignment, or homework that is usually assigned to students
  • List the learning objectives that students should meet when they work through the problem.
  • Think of a real-world context for the concept under consideration.
  • Develop a storytelling aspect to an end-of-unit problem or research an actual case that can be adapted. This will help you separate the problem into smaller units that will help students develop skills and gather information to apply to solving the problem.

Designing PBL

  • The problem must motivate students to seek out a deeper understanding of concepts.
  • Authentic problems are more motivating for students to solve.
  • More complex problems will challenge students to go beyond simple fill-in-the-blank answers or checking boxes.
  • The problem should require students to make reasoned decisions and defend them.
  • The problem should incorporate the content objectives in such a way as to connect it to previous courses/knowledge.
  • If used for a group project, the problem needs a level of complexity that ensures the students must work together to solve it. To assist students with organizing their thoughts and work, it is sometimes helpful to use the so-called TRAP model. With TRAP, students have distributed Tasks, they have individual Roles, they have an Audience in mind, and a Presentation format to fulfill.
  • If used for a multistage project that can incorporate several learning units over time, the initial steps of the problem should be open-ended and engaging to draw students into the problem.

Implementing your PBL Assessment

  • Create chunks or sprints within the project that will allow the students to tackle smaller or simpler aspects of the problem with gains in content and practice.
  • Coach the students and encourage them to come to you regularly. Try to avoid giving direct answers; instead, ask them questions about their thinking processes and point them to resources to find information.
  • As a coach, remember to offer learners commendations and recommendations instead of correcting their work: commendations on their successes, curiosity, and creativity, for example; recommendations on how they might build upon their work, where to look next, and what they might consider before moving forward.
  • Encourage students to be reflective and identify learning issues. Metacognition is especially important for success in PBL.
    • Have students continually ask, “What?”, “So what?”, and “Now what?” about their learning and experiences. That is:
      • “What have you learned, practiced, experienced?”
      • “So what does this mean for you, your learning, the team, and the project?”
      • “Now what can you now do that you could not do before? Now what does this mean about the course and the project?”
    • Have students keep an assignment log that tracks their activities and contributions to the project.
  • Lead them towards independent research by offering them places to start searching for relevant information. Use class time for brainstorming and strategy formation.
  • Let them acquire and master targeted concepts and skills by allowing for enough time to work on and successfully complete the project. Provide milestones for expected levels of completion over the course of the project.

Assessing PBL

Clear expectations and frequent (formative) assessment are helpful in PBL. Even for shorter assignments, allowing the students to check in on their work at earlier stages will help them understand where their strengths and challenges are.

    • Student-guided assessment:
      • Check-ins – let the students assess where they are in the process and compare that with where you suggest they should be at a given point.
      • Confidence or knowledge surveys – ask the students what they know or how confident they are about particular aspects of the solution. This encourages learner metacognition. Group confidence surveys can increase teamwork when students decide to tackle multiple fronts to address lower measures of confidence. These surveys are also a way for you to model the types of thinking and knowledge that students should have along the way; that is, try to create confidence surveys that prompt investigation rather than only measure current attitudes.
      • Rubrics – provide the students with clear expectations and offer structure as to what they are working to fulfill in the problem’s solution.
    • Formative assessment: Should be regular to motivate and inform students’ progress.
      • Some examples include:
        • Quizzes/assignments in OWL on content that is relevant to the problem/project;
        • Submissions of drafts, regularly submitted sample code, plans, diagrams of components of the solution, etc;
        • Q&A session with the faculty/TA.
    • Summative assessment: Because of frequent formative assessment and a clear rubric, final summative assessment of the solution/final submission tends to be fairly straightforward. Final assessment can take the form of a project report, a presentation, a tangible product, a solution portfolio, or a poster presentation with supporting documents.


References

Boston University, Center for Teaching and Learning (n.d.). Problem-based learning: Teaching guide.

Morris, H., & Warman, G. (2015, January 12). Using design thinking in higher education. Educause Review. https://er.educause.edu/articles/2015/1/using-design-thinking-in-higher-education

Nilson, L. B. (2010). Teaching at its best: A research-based resource for college instructors (2nd ed.). Jossey-Bass.

Nuhfer, E., & Knipp, D. (2003). The knowledge survey: A tool for all reasons. To Improve the Academy, 21, 59-78. https://doi.org/10.1002/j.2334-4822.2003.tb00381.x

University of Delaware, Institute for Transforming University Education. (n.d.). Problem Based Learning Clearinghouse. https://www.itue.udel.edu/pbl/problems (Good place to start to look for disciplinary examples)

University of Illinois, Center for Teaching and Learning (n.d.). Problem based learning assignment log. https://citl.illinois.edu/citl-101/teaching-learning/resources/teaching-strategies/problem-based-learning-(pbl)

Qualtrics

Description

Qualtrics, Western’s supported survey platform, can be used to administer assessments (e.g., multiple choice, true/false, short answer, essay), and the Qualtrics guide supports its use. The software has an array of functionality, and it is important to be experienced with Qualtrics and its various functions before using it for course assessments. Assessment errors can be very difficult, if not impossible, to address after the fact and can happen easily when using software with which we have limited experience.

Also, as outlined in the Plan for Success, it is also important to recognize that students may have limited, inconsistent, or no access to the internet, and online tests may pose serious difficulties. Having contingency plans in place in case a student has difficulty accessing your assessment is necessary.     

Dr. Erin Heerey, with the Department of Psychology, has created a series of videos on Qualtrics to support faculty in:

A number of these functions may help reduce academic integrity offences.

You may want to let students know in advance if you’ve employed certain options (e.g., time limits on pages, preventing multiple attempts at the test, opening and closing times for the test) so that they can plan their test taking strategy accordingly. 

If you have questions after reviewing the videos and Qualtrics guide, please contact the WTS Helpdesk.

Video Presentation or Performance Test

Description

Use these forms of assessment if you want students to:

  • Move in-person oral presentations to digital formats
  • Demonstrate a skill- or performance-based learning outcome (such as theatre, dance, etc.)
  • Apply their knowledge and skills - independently or in groups - by performing a complex procedure or creating a product.

Example forms of digital submission:

  • .mp4 file submitted via the OWL Assignments Tool or on an OWL Student Page
  • Create and post a video to a website such as YouTube or Vimeo and share the link in OWL assignments (suggest that students set these videos to “unlisted” rather than make them publicly available)
  • Have students use Student Pages in OWL to share their videos
  • Use VoiceThread to create an oral presentation that will allow students to leave comments and questions for each other.

More Ideas

If you are looking for more ideas, you may find inspiration from these resources from other universities: