Overview

The Koehler Center for Teaching Excellence (CTE), Professor Suzy Lockwood at Texas Christian University (TCU), and Pearson partnered to improve the use of social tools in online learning. In this exploratory study, the TCU/Pearson team found that doctoral students can be guided towards more substantive, focused discussion contributions through the use of a structured grading rubric combined with limits on student response length.

In the process of studying the impact of grading changes on discussion thread responses, the team developed an innovative approach to quantifying student engagement and contribution based on the topical content of each post and its relationship to the content of earlier posts. This new model for evaluating discussion posts provides a means of assessing discussion thread participation that can give instructors a better idea of how students have achieved learning outcomes and may additionally provide course designers with a way to improve discussion thread design and structure.

Research design

The team selected a course from TCU’s Doctor of Nursing Practice (DNP) program for which the instructor had introduced a structured grading rubric with the goal of making student contributions at once more focused and more substantive. The TCU/Pearson team compared thread posts from the course before and after the introduction of the rubric. This provided an experiment-like setting that allowed the team to evaluate whether the redesigned thread grading had a beneficial impact on student discussion thread posts.

Custom content analysis approach

After reviewing the research literature on content analysis of discussion thread transcripts and undertaking some preliminary content analysis using a variety of schemes, the team developed a custom content analysis approach based on the pattern of introduction and repetition of topics appearing in a post.

This approach proposed that there are four levels of discussion thread contribution, representing increasing degrees of “topic spread” (a rough sketch of the corresponding coding logic follows the list):

  • Participation occurs when a student posts a response that does not cover topics relevant to the discussion but merely states agreement or disagreement or offers social conversation.
  • Explanation occurs when a student posts a response that covers topics that have already been introduced in a thread.
  • Elaboration occurs when a student posts a response that provides new topics that are closely related to topics already introduced in a particular top-level threaded response.
  • Expansion occurs when a student posts a response that connects topics already introduced into the discussion to distantly related topics.
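The assignment of these levels can be pictured as a simple decision procedure over the topics in a post. The sketch below is only an illustration of that logic, not the study’s coding protocol; the function name, its inputs, and the idea of representing topics as sets are assumptions made for the example.

    def topic_spread_level(post_topics, thread_topics, closely_related_topics):
        # post_topics: topics extracted from the new post
        # thread_topics: topics already introduced earlier in the thread
        # closely_related_topics: topics judged closely related to the thread topics
        if not post_topics:
            # No substantive topics: agreement, disagreement, or social conversation.
            return "Participation"
        new_topics = post_topics - thread_topics
        if not new_topics:
            # Only repeats topics already introduced in the thread.
            return "Explanation"
        if new_topics <= closely_related_topics:
            # Adds new topics, but all closely related to existing ones.
            return "Elaboration"
        # Connects the discussion to distantly related topics.
        return "Expansion"

In the study itself, deciding which topics counted as closely related was a human judgment; an automated version would need a thesaurus, domain ontology, or similarity measure to stand in for that judgment.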

Content analysis may give an evaluator or researcher a much better idea of how students are engaged in the learning and knowledge construction process, but coding transcripts is time-consuming. Content analysis also requires a coding scheme that different raters can apply reliably. Part of the purpose of the current evaluation study was to test a content analysis coding scheme to see whether it is useful and reliable for analyzing discussion forum content.
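The article does not say which reliability statistic was used to check whether different raters apply the scheme consistently. As one common option, agreement between two coders on the four categories could be summarized with Cohen’s kappa (here via scikit-learn); the ratings below are made up purely for illustration.

    from sklearn.metrics import cohen_kappa_score  # chance-corrected agreement

    # Hypothetical codes assigned to the same eight posts by two independent raters.
    rater_a = ["Explanation", "Elaboration", "Expansion", "Participation",
               "Elaboration", "Explanation", "Expansion", "Elaboration"]
    rater_b = ["Explanation", "Elaboration", "Elaboration", "Participation",
               "Elaboration", "Explanation", "Expansion", "Explanation"]

    # Cohen's kappa corrects raw percent agreement for agreement expected by chance.
    print(round(cohen_kappa_score(rater_a, rater_b), 2))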

The study involved the following steps:

  1. Course review
  2. Threaded discussion content analysis
  3. Threaded discussion database-query-based analysis
  4. Quantitative analysis/comparison of content-based and database-based measures

Each of these is discussed in more detail below.

Course review

Additional grading requirements for the rubric course, as stated in the course instructions, were as follows:

Students are required to respond to threaded discussions by midnight (MN) on the date designated on the course schedule.

Each student’s original posting should be limited to no more than 350 words; your response should be no more than 150 words. Your citations are not included in the word count. Failure to stay within this limit will be taken into consideration when grading.

There are several reasons for this, the primary one being I want you to demonstrate understanding/synthesis… plus encourage you to make sure you are answering the question or focus of the thread.

The control course did not use a structured rubric for discussion forum grading.

In order to provide context for the later steps in the evaluation, Suzy Lockwood was interviewed. Discussion points and questions included:

  • Confirm that the two sections used different explicit grading instructions/rubrics.
  • Were there any other important differences between how the two sections were graded or organized?
  • What motivated you to introduce the more structured rubric?
  • Did you sense a difference in student engagement or learning in the section with the structured rubric?
  • Did you find the rubric helpful in grading?
  • Do you have additional ideas around making discussion threads useful for online learning?

Threaded discussion content analysis

The rubric used in the intervention course called for students to “add significantly to the discussion.” Students may add to the discussion in a number of ways:

  • Explanation – simple discussion of topics that have already been introduced. Describing or quoting what’s in the reading. Giving personal examples that illustrate concepts and ideas in the original topic or points that have already been made.
  • Elaboration – providing additional details that are directly related to topics already under discussion. Extending discussion to closely related areas.
  • Expansion – introducing distantly related topics that can shed light on the topic at hand. Making cross-disciplinary connections. Providing external evidence from research, history, or current news.

In the preliminary content analysis, all of these activities were observed. By extracting the concepts in each post and noting the use of examples, the coders could classify posts as explanation, elaboration, or expansion of the discussion. Appendix B presents the coding instructions, refined on the basis of the coding practice exercise.

Conclusion

Discussion threads are one of the key interactive and social components of online courses today, yet instructors and course designers do not have adequate tools and methods for analyzing how well students are contributing and whether threads are structured appropriately. The purpose of this study was to develop a new approach for analyzing discussion thread content and then apply it to an experiment-like setting, with the goal of evaluating whether a structured grading rubric for discussion threads promoted more substantive student posts.

The Explain/Elaborate/Expand model developed for this study identified significant differences in substantive contribution across the rubric and non-rubric course sections, with the rubric section showing greater topic spread. This provides evidence that a more structured rubric that explicitly calls for substantive contribution can drive greater elaboration and expansion of the topics under discussion.

Basic summary statistics around discussion threads, such as posts per student and thread depth, did not differ across the rubric and non-rubric sections, which suggests that simple comparisons of database summary statistics may not identify actual differences across sections. A content-based approach seems more likely to produce useful information about what’s actually happening in discussion threads. While the approach outlined here relied on human coders, it could relatively easily be automated using text processing techniques: concept extraction, for example with stemmers, could identify the topics in each post, and posts could then be coded as explanation, elaboration, or expansion by considering the successive repetition and introduction of topics. Pearson hopes to explore such automated content analysis of discussion threads and eventually perhaps incorporate it into the platform.
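As a hedged sketch of what such automation might look like (the study itself used human coders, and the library choice, stopword list, and helper names below are assumptions for the example), a stemmer can reduce each post to a set of topic terms, after which the repeated and newly introduced terms drive the explain/elaborate/expand coding:

    import re
    from nltk.stem import PorterStemmer  # one readily available stemmer

    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "that",
                 "it", "this", "for", "on", "with", "as", "are", "be", "was", "were"}
    stemmer = PorterStemmer()

    def extract_topics(post_text):
        # Reduce a post to a set of stemmed content words -- a crude stand-in
        # for the concepts identified by the human coders.
        words = re.findall(r"[a-z]+", post_text.lower())
        return {stemmer.stem(w) for w in words if w not in STOPWORDS and len(w) > 2}

    def topic_overlap(new_post, earlier_posts):
        # Split a post's topics into those repeating earlier thread topics and
        # those newly introduced; these two sets feed the coding decision.
        thread_topics = set().union(*(extract_topics(p) for p in earlier_posts))
        post_topics = extract_topics(new_post)
        return post_topics & thread_topics, post_topics - thread_topics

Deciding whether newly introduced terms are closely related (elaboration) or distantly related (expansion) would still require some measure of semantic relatedness, such as a thesaurus, domain ontology, or corpus-based similarity score.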

The post hoc exploratory analysis of the data found that posts with greater topic spread were associated with higher numbers of responses, controlling for other important factors such as date of post relative to due date, level of post, and whether the post was made in the rubric or non-rubric section. This finding suggests that posts with higher topic spread attract more attention and participation from other students. Future work might explore the hypothesis that driving greater topic spread via a rubric calling explicitly for substantive contribution increases student engagement overall.
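The article does not specify the statistical model behind this analysis. One hedged way to express it, assuming a per-post data file with the hypothetical column names below, is a Poisson regression of response counts on topic spread plus the stated controls, fit here with statsmodels:

    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical export: one row per post, with the number of responses it
    # received, its topic-spread code (0-3), days posted before the due date,
    # its level (depth) in the thread, and a rubric/non-rubric section flag.
    posts = pd.read_csv("posts.csv")

    model = smf.glm(
        "n_responses ~ topic_spread + days_before_due + post_level + rubric_section",
        data=posts,
        family=sm.families.Poisson(),
    ).fit()
    print(model.summary())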


 

This article was written by Suzy Lockwood, Ph.D., Nursing, Romana Hughes, Koehler Center, and Ann Zelenka, Pearson, for the Spring 2013 Issue of Insights Magazine.