Adapting Literature Critique Engagement Activities for Online Learning Due to COVID-19: Use of Online Learning Groups to Promote Scientific Literacy Capabilities in Undergraduate Nutrition Education

Maintaining scientific literacy (SL) skill development in undergraduate science education while transitioning courses from the in-person to the online learning environment due to the COVID-19 pandemic requires adaptation of some teaching practices. This study assessed the effectiveness of small online learning groups as the active engagement strategy (replacing in-person breakout groups) to promote SL skill development in fourth year undergraduate nutritional science students in the online learning environment (Fall 2020 semester). As a secondary outcome, SL skill development in the online learning environment (Fall 2020, n=178) was compared to that of the in-person course format (Fall 2019, n=144). Students were surveyed at the start and end of the semester to assess their i) scientific literature comprehension, ii) SL skill perceptions, and iii) practical SL skills. The use of online learning groups contributed to improvements in both literature comprehension and SL skill perceptions (P<0.05); however, practical SL skills remained unchanged (P>0.05). There was no difference in the magnitude of improvement in students' SL skill perceptions or their practical SL skills between course formats (P>0.05). The ability to think critically about the scientific literature increased in both course formats, with greater improvements observed in the online course format (P=0.02). Additionally, only students in the online course format had improved comprehension of scientific methods (P=0.05). Collectively, these data demonstrate that adapting an in-person course to an online learning environment using small online learning groups can similarly promote the development of SL in undergraduate nutrition education.


Introduction
Due to the COVID-19 pandemic, courses that had previously been taught in an in-person format were suspended and forced to transition to online delivery to continue student skill development (Pather et al., 2020). These changes led to a significant disruption in lectures and assessments, as some previous methods became difficult to use in an online format (Iglesias-Pradas, Hernández-García, Chaparro-Peláez, & Prieto, 2021). Online learning, when properly planned, can serve as a viable alternative to in-person learning (Driscoll, Jicha, Hunt, Tichavsky, & Thompson, 2012; Jensen, 2011; Kemp & Grieve, 2014), with the potential to enhance active learning among students through group work assignments or online discussion activities. Student satisfaction with online learning can be lower than with in-person courses, as students have reported lower engagement with online material and lectures and an increased need for self-discipline due to the lack of structure (Jensen, 2011; Wang, Shannon, & Ross, 2013). The implementation of active learning strategies (e.g., discussion activities, case-based learning or problem-based learning) in online courses has been shown to alleviate the aforementioned difficulties students report with online learning (Brindley, Blaschke, & Walti, 2009). Adapting the practice of small group discussions to an online learning environment may therefore be a useful strategy to engage students and promote skill development.
Group learning can be an effective method to increase engagement and promote active learning in both in-person and online courses (Brindley et al., 2009; Hadimani, 2014; Xie, Hensley, Law, & Sun, 2017). In-person small group learning has been shown to improve students' familiarity with one another, increase their ability to ask questions, and improve understanding of the course material when compared to large, in-person discussions (Hamann et al., 2012). Small group learning activities have also been shown to stimulate active learning and improve communication skills among undergraduates (Hadimani, 2014). Similarly, small breakout groups used during lecture have been shown to increase student satisfaction with a course (Lougheed et al., 2012) and improve students' understanding of course material (Hamann et al., 2012; Lougheed et al., 2012). These breakout groups have also been shown to improve students' test scores compared to traditional, passive lecture formats (Lyon & Lagowski, 2008). Among online courses, small group learning is often conducted in the form of online discussion activities (Hamann et al., 2012; Thomas & Thorpe, 2019). These online discussion activities result in higher student participation and allow students to apply their knowledge to a greater degree (Hamann et al., 2012), something which can be challenging in large, in-person lectures (Kemp & Grieve, 2014). Participation in online discussion activities has also been shown to improve collaboration and communication skills (Anderson & Simpson, 2004; Brindley et al., 2009). With regard to student skill development, online discussions allow students to discuss their thoughts with many other students, leading to a greater exchange of ideas and comprehension of course concepts (Driscoll et al., 2012; Hamann et al., 2012). As such, online discussion within an online learning group can serve as an effective vehicle for critical skill development in undergraduate biological science education.
Scientific literacy (SL) is an essential component of the skills gained over the course of an undergraduate degree in science (De Boer, 2000). SL encompasses many skills and can be described as the ability to critically evaluate and understand scientific literature (Britt, 2014; Porter et al., 2017) and to apply this information in a greater context (Britt, 2014; De Boer, 2000). Ensuring that students' SL skills are developed during an undergraduate degree program is important for their future success (Britt, 2014; Newton, Bettger, Buchholz, Kulak, & Racey, 2015). Interactive engagement and learning during lecture can be an effective way to develop students' SL skills (Wyckoff, 2001).
Our previous research, conducted prior to the COVID-19 pandemic, demonstrated the effectiveness of literature critique activities in breakout group and instructor-led discussions during in-person lectures on the development of students' SL and critical thinking skills (Cartwright, Liddle, Arceneaux, Newton, & Monk, 2020). To accommodate the switch to online course formats, we adapted these breakout group and instructor-led literature critique activities from the in-person course format (Fall 2019) into small online learning group discussion activities in the online format of the same course (Fall 2020). The objective of this study was to determine the effectiveness of small online learning groups on students' SL skill development in an online learning environment in a fourth-year undergraduate nutritional toxicology course. A secondary objective was to compare students' SL skill perceptions and practical skill acquisition between the in-person and online course formats.

Participants
Participants in the study were undergraduate students enrolled at the University of Guelph in the course Toxicology, Nutrition and Food (NUTR*4510) in the Fall 2020 semester. The course was taught in a 12-week online format, with two weekly asynchronous lecture videos uploaded to the course website in Courselink. Assessment in the course included i) participation in written online group discussions (via a discussion board) critiquing the assigned scientific literature and providing peer feedback (4 x 2.5%), ii) two scientific literature critique assignments that also included an online learning group written discussion with peer feedback via the discussion board (5% each), iii) an online literature critique test (20%), iv) a midterm exam (25%), and v) a final exam (35%). Of the 187 students enrolled in the course, 178 (95%) completed both the beginning and end of semester surveys. In the Fall 2020 semester, 64% of students were in their fourth year of study, 21% were in their fifth year or above, and the remaining 15% were in their third year. All participants gave informed consent to participate in the study, and the project was approved by the University of Guelph Research Ethics Board (REB#20-05-016).

Online Learning Groups and Scientific Literature Critique Activities
Students were randomly assigned to an online learning group consisting of 8 students per group at the start of the semester. The intention of the online learning groups was to help foster a sense of community in the course and connect students together while learning remotely. Students engaged in literature critique activities in the course in a scaffolded manner, which included i) instructor-led literature critique demonstrations in a lecture video, ii) small online learning group critiques and discussions of the assigned studies via the discussion board, iii) individual literature critique assignments with online learning group peer feedback provided via the discussion board, iv) an individual literature critique online test, and v) literature critique questions on the final examination.

Online Surveys
Students were invited by email to participate in two online surveys administered at the beginning of the semester (week 1, Survey 1) and at the end of the semester (week 12, Survey 2). Each survey consisted of two categories of questions. The first category focussed on SL and assessed students' i) perceptions of their SL skill competency, using a non-validated set of survey questions developed by the research team and published previously (Cartwright et al., 2020; Monk & Newton, 2018), ii) perceptions and comprehension of the scientific literature, using questions developed by the research team and published previously (Cartwright et al., 2020), and iii) practical SL skills, using the validated Test of Scientific Literacy Skills (TOSLS) survey (Gormally, Brickman, & Lutz, 2012). To avoid testing bias, only 17 (out of a possible 28) TOSLS questions were randomly selected from the 9 skill categories for inclusion in either Survey 1 or Survey 2, as conducted previously (Cartwright et al., 2020). Questions on student perceptions of their SL capabilities and scientific literature comprehension used a scale of 0-10, progressing from 0 "strongly disagree", 1-2 "disagree", 3-4 "somewhat disagree", 5 "neither agree nor disagree", 6-7 "somewhat agree", 8-9 "agree", to 10 "strongly agree", as published previously (Cartwright et al., 2020; Monk & Newton, 2018).
The second category of questions measured students' learning approach and engagement with a given task using the validated Revised Study Process Questionnaire (R-SPQ) (Biggs, Kember, & Leung, 2001). The R-SPQ questions were only assessed at the end of the semester on Survey 2, as conducted previously (Monk & Newton, 2018). Students were invited via their University email to complete the surveys using a private link. Both surveys were administered using the Qualtrics Insight Platform (Provo, UT, USA). Only students who completed both Surveys 1 and 2 were included in the analysis (n=178). As an incentive for participation in the study, students who completed Survey 1 received a 2% bonus on their midterm exam, and students who completed Survey 2 received a 2% bonus on their final exam. Alternative assignments were available for students who did not want to complete the online surveys but still wanted to earn the exam bonus.

Comparison between Online and In-Person Formats
To determine any potential differences in student outcomes (i.e. SL skill perceptions, scientific literature comprehension and practical SL skill levels using TOSLS) between the in-person and online course formats, the results from the online Fall 2020 semester were compared to the in-person Fall 2019 semester using the same survey questions. A detailed analysis of the SL data from the in-person Fall 2019 semester has been published previously (Cartwright et al., 2020).

Statistical Analysis
Statistical analysis was conducted using IBM SPSS (IBM). For all data, the predefined upper limit of probability for statistical significance was P<0.05. Values are presented as means ± SEM. Paired t-tests were used to determine changes during the semester (i.e., Survey 1 versus Survey 2) in students' SL perceptions, scientific literature comprehension, and TOSLS scores. Unpaired t-tests were used to determine differences between the in-person and online course formats. GraphPad Prism was used to determine correlations between students' R-SPQ scores and either TOSLS scores or questions assessing critical thinking or scientific literature comprehension.
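For readers who wish to reproduce this style of analysis, the three tests described above (paired t-tests for within-semester change, unpaired t-tests for between-format comparisons, and Pearson correlations between R-SPQ and SL scores) can be sketched with scipy.stats. The study itself used SPSS and GraphPad Prism; the numbers below are synthetic placeholders, not the study's data.

```python
# Illustrative sketch of the analyses described above, using synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Paired design: the same students' overall SL perception scores at
# weeks 1 and 12 (Survey 1 vs. Survey 2).
survey1 = rng.normal(6.0, 1.0, size=178)
survey2 = survey1 + rng.normal(0.8, 0.5, size=178)  # simulated improvement
t_paired, p_paired = stats.ttest_rel(survey2, survey1)

# Unpaired design: change scores from two independent cohorts
# (online Fall 2020, n=178, vs. in-person Fall 2019, n=144).
change_online = rng.normal(0.8, 1.0, size=178)
change_inperson = rng.normal(0.7, 1.0, size=144)
t_unpaired, p_unpaired = stats.ttest_ind(change_online, change_inperson)

# Correlation: R-SPQ deep-approach score vs. overall SL perception score.
deep_scores = rng.normal(30, 5, size=178)
sl_scores = 40 + 1.2 * deep_scores + rng.normal(0, 8, size=178)
r, p_corr = stats.pearsonr(deep_scores, sl_scores)

print(f"paired t-test:   t={t_paired:.2f}, p={p_paired:.4g}")
print(f"unpaired t-test: t={t_unpaired:.2f}, p={p_unpaired:.4g}")
print(f"Pearson r:       r={r:.2f}, p={p_corr:.4g}")
```

Because the simulated paired improvement is large relative to its variability, the paired test is significant, whereas the unpaired comparison of similar cohort means need not be, mirroring the pattern of results reported in this study.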

Effect of Online Learning Group Literature Critique Activities on Students' SL Perceptions and Practical Skills
Changes in students' perceptions of their SL capabilities during the academic semester are shown in Table 1. Students' SL skill perceptions significantly increased over the course of the semester in all 10 skill categories assessed (P<0.05). These skills included defining and searching for literature, information evaluation, data interpretation, and scientific literature comprehension. Among these perceptions, students reported the greatest gains in their perceived abilities to integrate information from the literature to address an existing problem (Question 8) or to identify a new problem or research question (Question 9) (P<0.05). Student perceptions of their scientific literature comprehension capabilities are shown in Table 2. Nearly all measures assessed improved over the semester, with significant improvement in students' perceived abilities to think critically and to evaluate the validity of research study designs (P<0.05). The greatest gains were seen in students' confidence in determining the strengths and weaknesses of research studies (P<0.05). Students also reported significant improvements in their comprehension of the scientific literature, including interpreting the results and understanding the methods utilized and the content in the discussion section (P<0.05). The only parameter that did not improve during the semester was students' reliance on the content provided in the abstract in favour of reading the entire paper (Question 9; P>0.05). Although students' perceptions of their SL skills increased between the beginning and end of the semester, their practical SL skills assessed using TOSLS did not change (P>0.05), as shown in Figure 1. The average TOSLS score remained unchanged over the course of the semester, with the average score on both Survey 1 and Survey 2 being 13 correct out of 17 total questions (76%).
At the end of the semester, after engaging in scientific literature critique activities in online learning groups, 91% of students reported a positive experience with online group learning. With respect to overall perceptions of skill development using online learning groups, 88% of students reported improvements in their critical thinking capabilities and 91% of students indicated that their scientific literature analysis skills had improved through these activities in the course.

Figure 1. TOSLS scores at the beginning (Survey 1) and end (Survey 2) of the Fall 2020 semester. Values are means ± SEM. Data were analyzed by a paired t-test.

Learning Approach Results and Correlations with SL Skill Development
Students' learning approach was assessed only at the end of the semester (part of Survey 2) using the Revised Study Process Questionnaire (R-SPQ) (Biggs et al., 2001). Total scores for surface and deep learning approaches were correlated with students' i) overall SL perceptions score (out of 100), ii) TOSLS score, and iii) scientific literature comprehension. Overall SL skill perception score was negatively correlated with a surface learning approach (r = -0.30, P<0.0001; Figure 2A) and positively correlated with a deep learning approach (r = 0.39, P<0.0001; Figure 2B), indicating that higher overall SL skill perception scores were associated with a deeper learning approach, whereas lower SL skill perception scores were associated with a surface approach to learning.

Note for Table 1: Data are presented as means (SEM). A maximum score of 10 was possible for each of the 10 questions that constitute the SL skill perceptions scores, for a maximum overall score of 100 on both the start and end of semester surveys. (*) denotes statistically significant differences (P≤0.05) in the mean change in SL during the Fall 2020 academic semester (Survey 2 score − Survey 1 score). The survey scale was from 0-10, wherein 0 indicated the lowest level of agreement and 10 indicated the highest level of agreement.

Note for Table 2: Data are presented as means (SEM). (*) denotes statistically significant differences (P≤0.05) in the mean change in students' responses during the academic semester (Survey 2 score − Survey 1 score). The survey scale was from 0-10, wherein 0 indicated the lowest level of agreement and 10 indicated the highest level of agreement.

The degree of improvement in students' perceptions of their SL capabilities was not associated with their approach to learning, as there was no relationship between the change in students' SL skill perceptions over the academic semester (i.e., the difference in the overall SL score between Survey 1 and Survey 2) and learning approach (P>0.05, results not shown). Conversely, the change in students' perceptions of their SL skills was positively correlated with their perceived ability to critically assess the validity of research study designs (r=0.43, P<0.001) and to think critically about the scientific literature (r=0.33, P<0.0001), indicating that students who perceived greater development of their SL capabilities also had more confidence in their ability to think critically about the scientific literature. Moreover, the change in students' perceptions of their SL skills was negatively correlated with their perceived difficulty in interpreting the results of a scientific study (r= -0.20, P=0.014), indicating that students who demonstrated the greatest improvement in their SL skills experienced less difficulty interpreting scientific results. Practical SL skills assessed using TOSLS scores were negatively correlated with a surface learning approach (r = -0.27, P=0.004; Figure 3A), indicating that students with lower practical SL skills were more likely to take a surface approach to learning. There was no association between TOSLS scores and a deep approach to learning (r = -0.03, P=0.70; Figure 3B). Additionally, there was no association between students' approach to learning and the change in their scientific literature comprehension during the academic semester for any of the parameters assessed (P>0.05).

Comparison in SL Outcomes between Online Learning During COVID-19 (Fall 2020) and In-Person (Fall 2019) Course Formats
The changes in students' SL capability perceptions, scientific literature comprehension, and practical SL skills acquired during the academic semester in the online course format (Fall 2020) were compared to the same SL outcomes assessed during the in-person course format in the Fall 2019 semester, which have been published previously (Cartwright et al., 2020). The in-person course format included the same scientific literature critique assignments and literature critique test questions (included in the midterm exam rather than in a separate test as in the Fall 2020 semester); however, the in-person course utilized breakout group discussions during lecture instead of online learning groups for literature critique practice activities. The overall changes in students' SL skill perceptions during the semester in the in-person and online versions of the course are shown in Table 3. There was no difference between the online and in-person course formats in students' perceptions of their SL skill capabilities (P>0.05), with similar improvements in all 10 skill parameters assessed. The overall change in students' scientific literature comprehension over the course of the semester between the online and in-person course formats is shown in Table 4. In the online course format, there was a greater degree of improvement in students' ability to think critically about the scientific literature (Question 4; P=0.02) and to comprehend the methods used (Question 5; P=0.05), compared to the in-person course format. All other scientific literature comprehension outcomes did not differ between course formats (P>0.05).
Additionally, the overall TOSLS score at the end of the semester, reflective of practical SL skills, did not differ between course formats (P>0.05; Figure 4), and the magnitude of change in TOSLS score over the academic semester (i.e., TOSLS score in Survey 2 − Survey 1) was not different between the in-person and online course formats (P=0.28).

Note for Tables 3 and 4: Data are presented as means (SEM). (*) denotes statistically significant differences (P<0.05) between the mean degree of change in student responses over the Fall 2019 and Fall 2020 semesters. The survey scale was from 0-10, wherein 0 indicated the lowest level of agreement and 10 indicated the highest level of agreement.

Discussion
This study aimed to determine the effectiveness of literature critique activities using small online learning groups on SL skill perceptions, scientific literature comprehension, and practical SL skills among undergraduate students in a fourth-year nutritional science course. Additionally, students' learning approach was measured using the R-SPQ to determine any differences between surface and deep learners in either their SL skill perceptions or practical SL skills. As a secondary objective, we compared the SL outcomes from the asynchronous online version of the course that utilized small online learning groups and written discussion board forums (Fall 2020 semester) to the traditional face-to-face in-person version of the course that utilized similar assignments but had in-person instructor-led and breakout group discussion formats (Fall 2019 semester). During the semester, students showed significant improvements in their perceived SL skills and comprehension of the scientific literature (Table 1 and Table 2), whereas practical SL skills remained unchanged (Figure 1). The significant differences between the online and in-person course formats included students' ability to think critically about the content in a scientific study and their understanding of the methods used, with students in the online course format showing greater improvement in both areas compared to the in-person course format (Table 3 and Table 4). Overall, these data demonstrate the effectiveness of small online learning groups in the development of SL skills among undergraduate students, and that SL capabilities are equally attained in the online and in-person versions of this course.
SL is an essential skill for undergraduate students in biological science, and ensuring that the associated skills are developed during an undergraduate degree is necessary for success (Monk & Newton, 2018; Wyckoff, 2001). Skills such as critical thinking and SL have previously been developed through the use of active learning strategies such as breakout groups or interactive engagement in class (Lougheed et al., 2012; Wyckoff, 2001). Small online learning groups, and written discussion board interactions within these groups, were utilized as an active learning approach to promote development of SL skills following the COVID-19 pandemic-associated switch to online learning. The in-person version of the course in the Fall 2019 semester (Cartwright et al., 2020) utilized instructor-led activities and in-person breakout groups that were not possible in an online asynchronous course format. Both course formats utilized the same literature critique activities [in-person breakout group discussions (Fall 2019) versus written discussion board communication in online learning groups (Fall 2020)] and assessments. In these literature critique discussions, students discussed the assigned scientific study, identified the strengths and weaknesses of the research design, critically analyzed the results, developed a follow-up research study, and provided critical feedback to their group members.
Throughout the semester, students engaged in literature critique activities in small online learning groups and improved their perceived abilities in all measured SL areas, including the abilities to differentiate primary and secondary scientific literature; search for scientific literature; evaluate, extract, and integrate relevant information from scientific studies for a research question; interpret information from tables and figures; use information from the scientific literature to address unfamiliar or new problems; and translate scientific literature into understandable terms (Table 1). Among perceptions of scientific literature comprehension, students demonstrated improvements in several measured domains, including critically analyzing the research study design, determining the strengths and weaknesses of research studies, thinking critically about paper content, understanding the methods, results, and discussion sections of scientific literature, and independently drawing conclusions about the study findings (Table 2). The only measure that remained unchanged over the semester was students' reliance on abstracts instead of reading the entire paper. TOSLS scores remained stable between Surveys 1 and 2, with the average score remaining at 13 out of 17 correct (76%). Despite the lack of change in practical SL skill development, students' perceptions and self-evaluation of their own skills are an important part of SL (Gormally, Brickman, Hallar, & Armstrong, 2009); moreover, a four month period between TOSLS assessments may be too short to observe changes in practical skill development. Additionally, it is important to note that students' practical SL competency assessed by TOSLS was already high and was sustained over the course of the semester (Figure 1), indicating that students maintained previously developed skills in their academic program.
Only 17 of the original 28 TOSLS questions were used on each survey to i) prevent survey fatigue and recall bias between surveys, and ii) permit comparison to the survey outcomes from the Fall 2019 semester. Including the complete set of TOSLS questions on future surveys may yield a more accurate assessment of practical SL skills. Moreover, since TOSLS scores were consistent between the start and end of the semester, use of the full TOSLS survey may help identify the specific SL competencies that students need additional support in developing.
The current study utilized small online learning groups as an active learning strategy to promote development of students' SL skills in an asynchronous online course format. The switch from in-person to online learning due to COVID-19 caused a significant disruption to class scheduling and assessment in most postsecondary institutions (Iglesias-Pradas et al., 2021; Pather et al., 2020). Many courses that had previously existed only as traditional face-to-face offerings were forced into online formats with very different lecture and assessment plans. Previous research has demonstrated that students' academic performance, skill development, and satisfaction are similar in online and in-person courses (Holmes & Reid, 2017; Monk & Newton, 2018; Nichols, Shaffer, & Shockey, 2003; Wang et al., 2013), although some students may prefer in-person lectures (Jensen, 2011). However, these studies report student outcomes from courses that were specifically designed to be conducted entirely or partially online, which is not the case for the majority of courses affected by the COVID-19 pandemic. Online courses can require a higher degree of student self-motivation and focus (Cho & Shen, 2013; Wang et al., 2013), both of which have been negatively impacted by the COVID-19 pandemic (Kalman, Esparza, & Weston, 2020). This is complicated by the pandemic's negative effect on students' social support from both peers and instructors (Elmer, Mepham, & Stadtfeld, 2020; Kalman et al., 2020). Social support from these groups is important to ensure student self-motivation and success in the classroom (Kalman et al., 2020; Zumbrunn, McKim, Buhs, & Hawley, 2014). Many students have been forced to move back home or to other environments that may not be conducive to learning and are isolated from the friends or peers who would normally provide this support (Gelles, Lord, Hoople, Chen, & Mejia, 2020; Kalman et al., 2020).
Ensuring student satisfaction with online courses during the pandemic, by facilitating communication between peers and improving student motivation, is essential to maintaining education quality (Gelles et al., 2020). Peer contribution through the use of small online learning groups has previously been demonstrated to improve student satisfaction with online learning (Kurucay & Inan, 2017), and both synchronous and asynchronous discussion can contribute to this sense of community within a course (Young & Bruce, 2011). The results from the current study demonstrate that discussion board communication and scientific literature critique activities in small online learning groups can increase students' engagement in the course and improve their SL and critical thinking skill perceptions (Table 1 and Table 2).
Written discussion board communication in small online learning groups was intended to develop students' SL skills through an active learning approach, in a similar manner to the breakout group activities used previously in the in-person version of the course (Fall 2019) (Cartwright et al., 2020). Active learning approaches and interactive engagement during lecture have been demonstrated to improve student comprehension and scientific reasoning skills beyond what is possible using passive learning strategies such as traditional lecture (Allen & Tanner, 2017; Wyckoff, 2001). Students in both course formats demonstrated similar improvements in all areas of their SL skill perceptions (Table 3). Scientific literature comprehension was similar in most areas measured, although students in the online course format showed greater improvements in their ability to think critically about study content and to understand the methods used in a scientific study (Table 4). These findings are consistent with previous research showing similar degrees of skill development in online and in-person courses (Holmes & Reid, 2017; Jensen, 2011). Peer discussions, whether online or in-person, have been shown to increase students' comprehension of course material (Hamann et al., 2012; Smith et al., 2009). Active involvement in online discussions has been shown to promote student understanding and application of content and to improve student test performance (Palmer, Holt, & Bray, 2008).
However, there are differences between online discussions and the instructor-led and breakout group discussions used in person. In large classes conducted face-to-face, instructor-led discussions can result in lower overall student participation compared to online discussions, as some students may not feel comfortable speaking in front of a large class (Hamann et al., 2012). Breakout groups used in face-to-face classes can improve participation and student satisfaction compared to large class discussions (Hamann et al., 2012), allowing students to discuss and exchange ideas instantly and pool existing knowledge when answering a question, which has been shown to improve individual student success on similar questions asked afterwards (Smith et al., 2009). Breakout groups also facilitate improved understanding of course material and can support critical thinking and student engagement (Lougheed et al., 2012). Additionally, breakout groups allow students to directly interact with other students, something which is lacking in online discussion forums (McCarthy, 2017). Student engagement can be higher in these small group learning environments due to the classroom setting (Kemp & Grieve, 2014). However, both large-group and small-group discussions suffer from students coming to class without adequate preparation or showing little interest in the material provided (Hamann et al., 2012). This is mitigated in online discussion boards, where students have time to reflect and think critically about the content of the discussion they are engaging in (McCarthy, 2017). Online discussion boards also maintain a permanent record of messages accessible to students at all times, which provides opportunities for reflection on both their own ideas and their peers' (Ng & Cheung, 2007).
Additionally, online discussion boards provide a much greater time frame in which to develop ideas and responses to other students' comments, something that is difficult in the in-person learning environment, where discussion time can be limited by the instructor (Anderson & Simpson, 2004; Kemp & Grieve, 2014; McCarthy, 2017). The effectiveness of online discussion groups depends on proper design and student participation (Brindley et al., 2009; Palmer et al., 2008), which can be difficult to achieve without prior planning and high student engagement. This study ensured participation on the discussion board by assigning a participation grade to each discussion activity. Mandatory discussion participation can be a source of student dissatisfaction with online discussion boards (Hamann et al., 2012), but discussion boards reliant on voluntary participation have significantly lower student involvement and can yield skewed outcomes because high-achieving students are more likely to use them (Cheng, Paré, Collimore, & Joordens, 2011; Palmer et al., 2008). However, the similar degree of SL skill development between the course format that used in-person breakout group verbal discussions for literature critique activities (Fall 2019) and the format that used written online discussion boards (Fall 2020) suggests that each approach was similarly effective in providing an opportunity for students to practice and develop SL skills (Table 3 and Table 4). Thus, online learning groups conducted in this manner could be used in the future as an active learning strategy in an online learning environment to support the development of other critical skills.
Students' learning approach (deep versus surface) may impact their success in an online learning environment and their SL skill development. The R-SPQ was designed to determine whether students take a deep or surface approach to learning and can be used to evaluate the efficacy of a given classroom intervention or to measure student engagement (Biggs et al., 2001). The R-SPQ results from this study indicated that a deep learning approach was correlated with higher final SL skill perceptions, while higher surface learning scores were associated with lower SL skill perception scores (Figure 2A). With regard to practical SL skill development, high surface learning scores were associated with lower final TOSLS scores (Figure 3A). Previously, learning approaches were shown to be correlated with improved learning outcomes when students are engaged in active, independent learning rather than rigid, passive learning (Trigwell & Prosser, 1991; Yew et al., 2016). Deeper learning approaches, specifically, are cultivated through active learning strategies such as discussion or case-based learning (Yew et al., 2016). Another factor influencing student learning approach is self-discipline, which is also essential for success in online learning environments and activities (Platow, Mavor, & Grace, 2013; Wang et al., 2013). As deep learning scores were associated with higher student SL skill perceptions (Figure 2B), student self-discipline may be an important factor in success in online course formats. Self-motivation has been shown to be lower in students with high surface learning scores (Everaert, Opdecam, & Maussen, 2016), which may explain their lower perceived and practical SL skill scores. These students may struggle with the increased self-discipline needed to succeed in online courses (Wang et al., 2013), which may in turn impact their SL capabilities, although further study is required.
It is important to note that the R-SPQ was administered only at the end of the semester; therefore, it is not possible to determine whether the use of small online learning groups influenced students' approach to learning over the academic semester. This should be assessed in future studies.
Collectively, the current study shows that the use of small online learning groups as an active learning intervention in an online course is similarly effective in promoting students' SL skills compared to the instructor-led and breakout group activities used in the face-to-face course format. Including these types of activities in either online or in-person course formats can provide valuable opportunities to promote the development of SL capabilities among undergraduate biological science students.