The Asian Journal of the Scholarship of Teaching and Learning (AJSoTL) is an international, peer-reviewed, open-access, online journal. AJSoTL seeks to create and nurture a global network of academics and educators who will discuss ongoing changes and future trends in tertiary education.
1 Department of Biological Sciences, Faculty of Science, National University of Singapore.
2 Centre for English Language Communication, National University of Singapore.
3 Centre for Development of Teaching and Learning, National University of Singapore.
Name: Dr. LAM Siew Hong Phone: (65)-65167379 Address: Department of Biological Sciences, S3-Level 5, 14 Science Drive 4, National University of Singapore, Email:
Lam, S. H., Mok, J. C. H., & Gan, M. J. S. (2018). Enhancing undergraduate students’ confidence in research and communication through integrated course design. Asian Journal of the Scholarship of Teaching and Learning, 8(1), 48-81.
Knowledge and skills related to scientific thinking, practices and communication are essential, yet they are rarely taught explicitly together in undergraduate research training. A course entitled “LSM3201 Research and Communication in Life Sciences” was introduced to teach these principles and skills explicitly to students engaged in ongoing research, thereby coupling course instruction with the process of scientific inquiry in order to enhance their confidence and abilities in research and communication. This study was conducted to determine the impact of the course on students who took it (the experimental group) compared with those who did not (the control group), using a quasi-experimental design of pre- and post- surveys and tests. The study found that students’ confidence in their knowledge of and abilities in research and communication improved significantly after they had taken the course. The improvements were significantly greater in the experimental group than in the control group, and were consistent with significant gains in specific knowledge. The control group showed minimal improvements in perception and levels of confidence in their abilities over the same period. The course was found to have a medium to large effect on many of the items measuring scientific thinking, practices and communication within the process of scientific inquiry. The findings indicate that by providing concurrent instruction on scientific thinking, practices and communication to students engaged in ongoing research, the course, in its design, delivery and assessment, enhanced students’ confidence in their knowledge and abilities in research and communication.
Research and communication skills are perhaps the most important skills to be acquired by a science undergraduate. Students having acquired these skills would know how scientific knowledge is generated and communicated (Alberts, 2009; Coil et al., 2010; Handelsman et al., 2004), and in turn understand scientific content better and perform better academically (Gormally et al., 2009; Dirks & Cunningham, 2006; Yeoman & Zamorski, 2008). These skills are also highly transferable and they enhance employability (Bennett, 2002). It has been suggested that equipping undergraduates with these skills would be one of the hallmarks of successful undergraduate programmes in science (Alberts, 2009; Coil et al., 2010; Handelsman et al., 2004).
Scientific research skills include having the knowledge and ability to actually do science or perform scientific inquiry in order to understand nature better (Kuhn & Pease, 2008; Timmerman et al., 2011). Scientific research involves highly creative and ongoing iterative processes consisting of systematic observations, formulation of the research question, testing of a hypothesis, design and execution of the experiment, collection and analysis of data, followed by the formulation of an evidence-based conclusion and a discussion of the findings that could lead to the proposal of new research questions and hypotheses. These processes are important in knowledge discovery and should be viewed as general principles of scientific thinking and practices, rather than a series of fixed steps (Gauch, 2003). Communication skills would then be required to translate such scientific thinking and practices together with their tentative ‘knowledge product’ into writing, as represented in the four major sections (Introduction, Methods, Results and Discussion) commonly found in primary research publications, thus making them accessible and assessable by the scientific community. Therefore, having the knowledge and ability to apply scientific thinking and practices in an inquiry process, as well as being able to communicate them effectively, are essential to doing good science.
Despite their importance, scientific research and communication skills are rarely taught together explicitly in an undergraduate curriculum. This is because they are perceived to be better learned while the student is engaged in research rather than taught as a standalone course in a classroom setting (Coil et al., 2010; Emerson, 2017). Therefore, students are usually left on their own to pick up these skills informally through interactions with, and resources provided by, their supervisor or mentor when they engage in research. However, this can become a “catch-22” situation, given that research and communication skills are essential for research. Without being equipped with such skills, students are often ill-prepared to do research and to communicate their research findings effectively, and they might find it challenging to pick up these skills on their own when engaged in research projects (Brownell et al., 2013; Zimbardi et al., 2013). Without knowledge and skills in scientific thinking, practices and communication, students as novice researchers have less confidence to do research, and even less to communicate their research findings. Consequently, students and faculty may not benefit as much as they would like to by the end of the research training.
As part of the undergraduate curriculum in the Faculty of Science, National University of Singapore, students have the opportunity to do research under the Undergraduate Research Opportunity Programme in Science (UROPS) during the second and third year of their undergraduate programme. However, undergraduates engaged in UROPS research projects under the supervision of faculty do not receive any systematic classroom instruction on research skills; they are expected to learn these skills by themselves during the course of their research training. Nor are students taught the scientific thinking and practices applied in research, or how to communicate these, together with their findings and conclusions, in the form of a scientific research report. It is noteworthy that the skills and requirements for writing a laboratory (lab) report, which undergraduates have learned in laboratory training, differ from those required for producing a scientific research report; students have therefore found communicating a research study to be a much more challenging task than writing a lab report (Emerson, 2017).
In an attempt to bridge this gap, an undergraduate course entitled “LSM3201 Research and Communication in Life Sciences” was introduced to teach both research and communication skills. A strategic feature of this course was that the students enrolled in it had to be involved in an ongoing research project supervised by another faculty member. In this way, the course provides formal instruction and training in scientific research and communication skills to students who are concurrently engaged in research.
Framework for course design
This study recognised the importance of inquiry-based curricula for supporting students to develop scientific research knowledge and skills. The course adopted an inquiry-based teaching approach by focusing on developing students’ 1) scientific thinking, 2) practices, and 3) communication skills as they engaged in research. The development of this course was aligned to the research skills development framework (Willison & O’Regan, 2007). The framework’s conceptualisation was based on two dimensions—facets of inquiry and levels of student autonomy. Facets of inquiry encompass students’ research skills or competencies from scientific thinking, practices, and communication involving information search, organisation, and processing (e.g. analysis, synthesis and communication) which can occur from low to higher order levels of thinking. In terms of levels of student autonomy, this was conceptualised as the degree of engagement in research work, as determined by structure and guidance, moving from “closed” (lecturer specified) to “open” (student specified). The two dimensions allowed for the design of course materials, learning activities, and assessment tasks that support and complement students’ inquiry process in their research projects by progressively moving students through the levels of autonomy, while at the same time, taking into consideration the demands of research skills development as characterised by the facets of inquiry (Tables 1A and 1B).
Curriculum design for LSM3201 as aligned to the research skills development framework – Facets 1 to 4 (Willison & O’Regan, 2007)
Curriculum design for LSM3201 as aligned to the research skills development framework – Facets 5 to 8 (Willison & O’Regan, 2007)
* More details in Appendix A.
This is an exploratory study which aims to examine the relationship between the course (LSM3201) and students’ perceptions and confidence in their knowledge and abilities in scientific research. The study also sought to determine if there were any differences in the perception and levels of confidence in students’ scientific research skills before (pre-) and after (post-) taking this course (henceforth known as the experimental group) compared to students who did not take this course (henceforth known as the control group). Finally, the study was also interested in trying to determine if the relationship and differences observed were associated with gains in specific content knowledge in scientific research. Specifically, this study investigated the following research questions (RQs):
RQ1: Did students’ perceptions and levels of confidence in their knowledge of and abilities in research and communication improve after taking the course?
RQ2: Were the improvements greater in students who took the course than in those who did not?
RQ3: Were the improvements consistent with gains in specific content knowledge?
The students who participated in this study were from one cohort of third year Life Sciences students enrolled in UROPS as part of an undergraduate elective research training programme. LSM3201 is an elective course and hence not compulsory for UROPS students. In this study, the experimental group consisted of UROPS students enrolled in LSM3201 (n=16 for pre- and post-surveys and tests), while the control group consisted of UROPS students not enrolled in LSM3201 (n=18 for pre-survey and test; n=16 for post-survey and test). Both the experimental and control groups therefore consisted of students who had completed similar foundational disciplinary courses in their first and second years of the NUS life sciences undergraduate curriculum and were engaged in their respective ongoing UROPS research projects throughout the present study. Both groups were invited to voluntarily participate in our survey and test questions through an online invitation via the university’s virtual learning portal known as the Integrated Virtual Learning Environment (IVLE) at the onset (pre-) and end (post-) of the study. Participants comprised 57% of the third year UROPS cohort (34 out of 59 students).
This one-semester study adopted a quasi-experimental pre- and post- survey and test design, and it was conducted at the beginning and end of the semester which corresponded respectively to the onset (pre-) and end (post-) of the study. The study was set up to investigate students’ confidence in their knowledge and abilities in research and communication for third year life science UROPS students enrolled in LSM3201 (experimental group) and those who did not (control group).
Design of the course LSM3201 “Research and Communication in Life Sciences”
The course is taught within a 13-week semester where 1.5-hour classes are held twice a week. In total, the course contained 18 lecture sessions, 13 class exercises, 6 major assignments and 1 quiz (refer to Appendix A for details). Among the exercises, students were required to perform weekly reflections on their learning (Item 2 of Appendix A) and to submit them as reflective learning exercises through the IVLE. Samples of the students’ reflections that revealed qualitative insights into the specific learning and gains in knowledge and/or abilities are presented in Appendix B.
In this course, scientific thinking, practices and communication are taught to allow students to gain a better understanding of their interconnectivity and inextricable relationship within the process of scientific inquiry. In scientific thinking, epistemology as well as the philosophy of science and research are introduced to help students gain an understanding of how scientific knowledge is discovered, constructed and communicated through the inquiry process. Students are also taught a comprehensive range of relevant research practices and their underlying scientific thinking, from literature research, problem formulation, and generation of a hypothesis to the design and ethical execution of experiments, data collection, analysis, interpretation, and the generalisation of findings. In research communication, students are taught how to effectively communicate the inquiry process through the major sections of a research paper, namely the “Abstract”, “Introduction”, “Methodology”, “Results”, and the “Discussion”. The essentials of each section are taught through various class exercises and writing assignments, including having students participate in a peer review and critique as well as giving oral presentations. The course makes use of each student’s own research problem(s), methods and findings for the class exercises and assignments. By adopting an inquiry-based approach, students have the opportunity to work on exercises and assignments centred on scientific thinking, practices, and communication in relation to their respective research projects. This provides students with an authentic learning experience that allows them to connect theory with practice, and to transfer lessons from the classroom into the research world, where the knowledge and skills learnt are applied with immediacy to each student’s specific research problems.
This approach would further motivate students’ learning by giving them greater ownership and autonomy, which in turn would enhance their confidence in their knowledge of and abilities in research and communication.
Three forms of measurements were conducted: a) pre- and post- surveys on the control and experimental groups, b) a retrospective survey on the experimental group, and c) pre- and post- tests on the control and experimental groups.
The pre- and post- survey questionnaires contained 27 ‘knowledge’ and ‘ability’ items. These items were constructed in part based on the criteria used for assessing scientific reasoning skills through scientific writing as developed by Timmerman et al. (2011), and skills development for scientific thinking and communication as reported by Yeoman and Zamorski (2008) as well as Zimbardi et al. (2013). These items were aligned with the aim and intended learning outcomes of the course, and were grouped into two categories, namely “scientific thinking and practices” and “scientific communication” (Table 2). The sub-categories comprised topics taught which consisted of content knowledge and abilities. The surveys used a Likert scale (1=“Very Poor” to 7=“Very Good”) to rate the different items. The 27 knowledge and ability survey items were posed in two different ways (Sets A and B) to ensure item validity of the dimensions measured. The Cronbach’s α values for the items measuring each of the sub-categories suggest that the responses for all the items surveyed had good internal consistency (Table 2).
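As a side note on how such a reliability coefficient is obtained: Cronbach’s α for a sub-category can be computed directly from respondents’ Likert ratings. The sketch below is a minimal, illustrative re-implementation (the function name and sample data are hypothetical, not from the study, which used SPSS):

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for one sub-category of Likert items.

    item_scores: one row per respondent; each row holds that respondent's
    ratings (e.g. 1-7) for every item in the sub-category.
    """
    k = len(item_scores[0])                    # number of items
    columns = list(zip(*item_scores))          # transpose to per-item columns
    sum_item_vars = sum(variance(col) for col in columns)
    total_var = variance([sum(row) for row in item_scores])
    # alpha rises as items co-vary (shared variance dominates item variance)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)
```

With perfectly consistent ratings (every respondent rating all items identically), α evaluates to 1; values of about 0.7 or above are conventionally read as good internal consistency.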
Set A (see Appendix C) consisted of 13 items (Q1-13), while Set B (see Appendix D) consisted of 14 items (Q1-14). The retrospective survey involved only survey items from Set A, where students from the experimental group were asked, at the end of the study, to rate their abilities before and after taking the course. In studies of programme impact, retrospective data have been shown to decrease the "response-shift bias" that tends to occur in pre- and post- surveys (Howard et al., 1979; Pratt et al., 2000).
Number and type of self-reporting items measuring knowledge and abilities with respective Cronbach’s α values
The multiple-choice test items were intended to assess students’ knowledge in research and communication. The pre- and post- tests contained 12 items each (see Appendix E), organised into the two categories representing scientific thinking and practices and scientific communication (Table 3). The test items were based on general knowledge related to the research process as well as specific instructional content that was provided for the training of research skills during the course.
Number and type of test items measuring knowledge of control and experimental groups
The questionnaire and test items were administered twice: once at the onset (pre-) of the study, at the beginning of the semester, and again 13 weeks later at the end (post-) of the semester. All the surveys were conducted online via IVLE. Students from both the experimental and control groups were invited to participate via general email announcements sent to their respective email accounts. With the exception of the retrospective survey that was conducted only on the experimental group at the end of the study, all the items in the pre- and post- surveys and tests for the experimental and control groups were identical to provide consistency in the understanding of the standard of measurement for the dimension being measured (Cronbach & Furby, 1970).
SPSS software (IBM, USA) was used to generate frequencies and means, and Cronbach’s α coefficients were computed to establish the reliability of the items. Student’s t-test was employed to compare mean ratings and mean test scores for all items and to infer statistical significance. Cohen’s d value was calculated to determine the effect size, a measure of the magnitude of the course’s impact.
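For readers less familiar with these statistics, the two comparison measures can be expressed compactly. The following is an illustrative sketch (not the study’s SPSS procedure) of the independent-samples Student’s t statistic and Cohen’s d with a pooled standard deviation, using hypothetical data:

```python
from math import sqrt
from statistics import mean, variance

def pooled_variance(a, b):
    """Pooled sample variance for two independent groups."""
    na, nb = len(a), len(b)
    return ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)

def t_statistic(a, b):
    """Student's t for two independent samples (equal variances assumed)."""
    return (mean(a) - mean(b)) / sqrt(pooled_variance(a, b) * (1 / len(a) + 1 / len(b)))

def cohens_d(a, b):
    """Cohen's d: mean difference scaled by the pooled standard deviation."""
    return (mean(a) - mean(b)) / sqrt(pooled_variance(a, b))
```

By Cohen’s (1988) benchmarks, d values of about 0.2, 0.5, and 0.8 mark small, medium, and large effects respectively; for instance, `cohens_d([2, 3, 4], [1, 2, 3])` evaluates to 1.0, a large effect.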
RESULTS AND DISCUSSION
Finding 1: Students’ perceptions and levels of confidence in their knowledge of and abilities in research and communication improved after taking the course.
For RQ1, the study analysed two sets of data: a) the comparison of pre- and post- mean ratings (Figure 1A), and b) the comparison of retrospective mean ratings of the experimental group (Figure 1B).
Figure 1A. Difference in ratings [mean + standard deviation (SD)] from pre- and post- surveys of the experimental group. The ratings for each survey item were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 1B. Difference in ratings [mean + standard deviation (SD)] from “Before” and “After” taking the course in the retrospective survey for the experimental group. Ratings for each survey item were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
The comparison of pre- and post- ratings indicated significant improvements of the mean ratings in all items (see Figure 1A). The increase in mean ratings of the items, ranging from M=1.20 to 1.97, corresponded respectively to an increment in ratings from 34.1% to 49.6%. The items which showed the highest levels of improvement were “Awareness of Ethical Issues”, “Communicating Methodology”, and “Evaluating Scientific Communication and Thinking”.
A retrospective survey was also conducted for the experimental group at the end of the course as it can provide a more accurate view of the self-perception of participants (Raidl et al., 2004). Retrospective analysis showed a more marked improvement in the mean ratings of the items, ranging from M=1.75 to 3.09, which corresponded to an increment in ratings from 42.7% to 107% (Figure 1B). The retrospective analysis showed that the three items with the highest levels of improvement were “Communicating Methodology”, “Communicating Introduction”, and “Evaluating Scientific Communication and Thinking”. The retrospective analysis also showed that the experimental group perceived that their levels of initial knowledge and abilities before taking the course were lower than what they had first thought during the pre- survey. The larger mean differences for the same items in the retrospective survey compared to the pre- and post- surveys (Figure 1A) suggest that students may not have realised the extent of what they did not know prior to taking the course but were more cognisant of their knowledge gaps after taking the course (Figure 1B). This retrospective analysis not only reiterates the positive relationship shown in the mean differences for the pre- and post- surveys, but also presents a more robust measurement of self-reported behavioural change (Raidl et al., 2004).
In addition to the quantitative data, the reflection exercises which students participated in revealed the qualitative aspects of their learning with respect to scientific thinking, practices, and communication (see Appendix B). The findings revealed that specific learning took place and there was a change in perceptions relating to scientific research thinking and process, literature research, research question and hypothesis, experimental design, analysis of data, peer review and critique, communicating the research inquiry process via “Introduction”, “Materials and Methods”, “Results” and “Discussion”, as well as via an oral presentation. The analyses of the data collected from the pre- and post-surveys as well as the retrospective survey together with the students’ reflections of their learning, provided a triangulation of quantitative and qualitative evidence that indicated an improvement in students’ perceptions and levels of confidence in their knowledge of and abilities in research and communication after taking the course.
Finding 2: Improvements were greater in students who had taken the course than those who did not take the course.
To determine if the improvements in perception and level of confidence observed in Finding 1 were a consequence of participating in the course or were qualities students developed on their own (without the course) when they carried out their research projects, the study analysed the following for RQ2: a) comparison of post- mean ratings between the control and experimental groups (Figure 2A); b) comparison of the pre- mean ratings between the control and experimental groups (Figure 2B); c) comparison of the pre- and post- mean ratings of the control group (Figure 2C) and experimental group (Figure 1A).
Figure 2A. Difference in ratings [mean + standard deviation (SD)] from post- surveys between the control and experimental groups with the respective Cohen’s d-value for comparison of the survey items. The ratings for each survey item were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 2B. Difference in ratings [mean + standard deviation (SD)] from pre- surveys between the control and experimental groups. The ratings for each survey item were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 2C. Difference in ratings [mean + standard deviation (SD)] from pre- and post- surveys of the control group. The ratings for each survey item were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
In comparing the post- ratings of the control and experimental groups, it was found that the experimental group had significantly higher mean ratings for all the items measured except for “Presentation Skills” (Figure 2A). The Cohen's d value (Cohen, 1988) reflects the medium to large effect size of the course impact. Five items, i.e. “Communicating Introduction”, “Communicating Discussion”, “Awareness of Ethical Issues”, “Designing and Executing [the] Experiment”, and “Communicating [the] Methodology”, had d=0.8 or greater, suggesting that the course had a large effect on these items, while the item with the lowest d value, “Presentation Skills”, registered d=0.5, which suggests a medium effect. To further enhance confidence in “Presentation Skills”, students could be given another opportunity to improve their oral presentations after reviewing their recorded presentations and considering the feedback provided by their peers and the instructor.
Interestingly, when comparing the mean ratings of the pre- survey between the control and experimental groups, it was found that seven items (“Awareness of Ethical Issues”, “Writing [the] Abstract”, “Evaluating Scientific Communication and Thinking”, “Preparing Presentation Aids”, “Communicating [the] Methodology”, “Communicating [the] Results”, and “Communicating [the] Discussion”) had marginally to significantly higher mean ratings in the control group than the experimental group (Figure 2B). The mean ratings for the control group were 14% to 29% higher than the experimental group at the onset of this study. The findings suggest that students in the control group had greater confidence in their research and communication abilities, and may therefore not have seen the need for this course, when compared to students in the experimental group, who had lower levels of confidence at the onset of this study.
To determine if students in the control group experienced improvements in their perceptions and levels of confidence over time, an analysis was done of their pre- and post- mean ratings. There was a small increase in mean ratings, ranging from M=0.14 to 0.56, in all items except “Awareness of Ethical Issues” (M=-0.03) (Figure 2C). The items “Writing [the] Abstract” (M=0.55) and “Communicating [the] Methodology” (M=0.56) had slightly larger increments of about 11% in the pre- and post- mean ratings, while the remaining items had increments of less than 10%. In contrast, the pre- and post- mean ratings of the experimental group clearly indicated significant improvements (M=1.20 to 1.97), of about 34.1% to 49.6%, in all of the items measured (Figure 1A). This suggests that improvements in perception and levels of confidence over time in the control group, if any, occurred in very few items and were minimal.
Taken together, the analyses indicated that the marked pre- and post- improvements in perception and levels of confidence in knowledge of and abilities in research and communication observed in students in the experimental group (Figures 1A and 1B) were mainly attributable to the course. Despite the lower levels of confidence observed in the experimental group at the onset of the study (Figure 2B), their ratings were significantly higher than the control group’s after taking the course (Figure 2A), indicating a marked improvement in levels of confidence. The students in the control group may have experienced a slight increase in pre- and post- confidence, but the increment was minimal, if not insignificant (Figure 2C). This finding may add to the notion that inquiry-based learning, specifically for novice students in the context of an inquiry-based science and research environment, does not markedly enhance students’ levels of confidence in their knowledge and abilities over time (Kirschner et al., 2006).
Finding 3: Improvements were consistent with gains in specific knowledge.
To determine if the levels of improvement were consistent with specific knowledge gained in research and communication, the study analysed the following for RQ3: a) comparison of post-test results between the control and experimental groups (Figure 3A); b) comparison of pre-test results between the control and experimental groups (Figure 3B); and c) comparison of pre- and post- test results of the experimental group (Figure 3C) and control group (Figure 3D).
Figure 3A. Difference in scores [mean + standard deviation (SD)] from the post-test results between the control and experimental groups. The scores for each test category were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 3B. Difference in scores [mean + standard deviation (SD)] from pre-test results between the control and experimental groups. The scores for each test category were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 3C. Difference in scores [mean + standard deviation (SD)] from pre- and post- test results of the experimental group. The scores for each test category were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
Figure 3D. Difference in scores [mean + standard deviation (SD)] from pre- and post- test results of the control group. The scores for each test category were subjected to Student’s t-test; *** p<0.0005, ** p<0.005, * p<0.05.
The mean post-test scores of the experimental group for “Scientific Thinking and Practices” and “Scientific Communication” were 108.6% and 69.9% higher, respectively, than those of the control group (Figure 3A). The mean total test score of the experimental group was 88.5% (4.08 points) higher than that of the control group. The analysis suggests that the experimental group had gained more specific knowledge than the control group with respect to what was tested.
The mean pre-test scores did not show any significant differences between the control and experimental groups for the two categories and the total test scores (Figure 3B), suggesting that students from both groups had similar levels of knowledge with respect to what was tested at the onset of this study, despite the difference in confidence levels (Figure 2B). This concurs with the improvements in perception and levels of confidence in knowledge and abilities observed in the post-survey comparison between the experimental and control groups (Figure 2A), suggesting that the improvements were consistent with the gain in specific knowledge in the experimental group when compared with the control group.
The gain in specific knowledge in the experimental group was evident based on the significant increment of 146.3% and 75.8% in the mean scores for “Scientific Thinking and Practices” and “Scientific Communication” respectively, and 107.4% for the mean total test scores between the pre- and post- test results of the experimental group (Figure 3C). This further suggests that the experimental group had gained specific knowledge related to research and communication through the course. This result was consistent with students’ qualitative reflections which revealed examples of their learning with respect to scientific thinking, practices, and communication (see Appendix B). The finding was also consistent with the improvements in perception and levels of confidence in knowledge and abilities observed in the pre- and post- surveys (Figure 1A) and the retrospective survey (Figure 1B) of the experimental group, suggesting that the improvements correlated with the gain in specific knowledge.
In contrast, the analysis of the pre- and post- test results of the control group revealed no significant difference in mean scores for the two categories and the total test scores (Figure 3D). The analysis suggests that students in the control group did not gain specific knowledge with respect to what was tested, even after having been involved in research. The finding was consistent with the pre- and post- survey results of the control group (Figure 2C), which indicated little or no improvement in perception and levels of confidence in knowledge and abilities. This further corroborates the finding that the gain in specific knowledge observed in the experimental group was attributable to the course.
The main limitation of this study was the small sample size, as only a small number of students had enrolled in the course. Although the sample was small, it is worth noting that more than half of the UROPS cohort participated in this study (see the sub-section “Participants” under the section “Method”). Moreover, an enrolment limit of 20 students was set for this course to maintain the effectiveness of the iterative learning activities that involved writing exercises, peer reviews, evaluations and feedback. Nevertheless, the positive and significant effect sizes, as indicated by the Cohen’s d values (Figure 2A), suggest that the results may scale to larger groups (Cohen, 1988).
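Cohen’s d, the effect size reported here, is the difference between two group means divided by their pooled standard deviation, with d ≈ 0.5 and d ≈ 0.8 conventionally read as medium and large effects (Cohen, 1988). A minimal sketch, where the sample scores are hypothetical 7-point confidence ratings and not the study’s data:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    # Sample variances (n - 1 denominator)
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def interpret(d):
    """Conventional benchmarks from Cohen (1988)."""
    d = abs(d)
    if d >= 0.8:
        return "large"
    if d >= 0.5:
        return "medium"
    if d >= 0.2:
        return "small"
    return "negligible"

# Hypothetical 7-point confidence ratings for two groups
d = cohens_d([6, 6, 5, 7, 6], [4, 5, 4, 5, 4])
print(round(d, 2), interpret(d))  # → 2.53 large
```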
Another limitation concerns the instruments. The increase in levels of confidence was measured through an online survey based on self-perception, which can be subjective, and the online test provided only a somewhat narrow perspective of students’ knowledge of research and communication. While the online test measured specific content knowledge, it could not measure students’ skills and performance during the process of scientific inquiry. Given that there were external variables that could not be controlled, such as differences between research projects (including their levels of difficulty and challenge) and differences in the laboratory practices, resources, support and supervision experienced by students, it was a challenge to develop fair, effective and feasible tests to evaluate performance on specific tasks. However, with steps taken to ensure the internal reliability of the measured items (e.g. the Cronbach α values in Table 2) and the triangulation of three separate data sources (Sets A and B, and the test items) to corroborate the findings, the methodology of this study was designed to give the results a fair degree of internal validity and reliability.
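Cronbach’s α, the internal-reliability statistic cited above, can be computed from the variance of each survey item and the variance of respondents’ total scores: α = k/(k−1) × (1 − Σ item variances / total-score variance). A minimal sketch (the function and scores are illustrative, not the study’s instrument data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha; item_scores is a list of items, each a list of
    the same respondents' scores on that item."""
    k = len(item_scores)            # number of items
    n = len(item_scores[0])         # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_var = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_var / variance(totals))

# Hypothetical scores: 2 items rated by 4 respondents
print(round(cronbach_alpha([[1, 2, 3, 4], [2, 1, 4, 3]]), 2))  # → 0.75
```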
To improve the robustness of the instruments for future studies, a rubric for assessing scientific thinking, practices and communication, similar to that developed by Timmerman et al. (2011), could be employed by external raters to conduct a “blind” assessment of the quality of the work produced by the experimental group and compare it to that of the control group. This would provide an important measure of the difference in work quality between the two groups that the current study did not capture. As for the instrument used to measure gains in knowledge, expanding the online test from the current 12 items to 24 may provide better representation and coverage of the specific content, and in doing so enable investigators to gauge students’ gains in specific content knowledge more effectively.
With respect to further improving the course, it could be re-designed as a blended learning environment, leveraging the affordances of both online and face-to-face modes. Some of the instruction could be delivered online to free up more classroom time for activities such as working on corrections for exercises and providing feedback on students’ reflections, which can include highlighting and discussing the strengths, weaknesses and errors of submitted work with specific examples. Such interaction and feedback are especially important for the learning of skills and abilities, but they are often rushed due to limited classroom time. Moreover, students could be given another opportunity to revise their writing, such as the Introduction section of their scientific research report, so that they could compare and experience the improvements made before and after revision based on the critique and feedback received from peers and instructors. This could also serve as additional evidence of students’ learning and of the impact on their performance in terms of scientific thinking, practices and communication.
The study clearly found that students’ perception and levels of confidence in their knowledge of and abilities in research and communication improved significantly after they had taken the course. These improvements were significantly greater in students who had taken the course than in those who had not. They were consistent with the gain in specific content knowledge and were also evident in students’ reflections on their specific learning. Students who did not take the course showed minimal improvements in perception and levels of confidence in their knowledge and abilities, and no significant gain in specific content knowledge. Therefore, the improved confidence and the gain in specific knowledge related to research and communication in the experimental group were not merely due to their involvement in research or other possible influences over time, but were a result of their participation in the course. This study demonstrated that the concurrent teaching of scientific thinking, practices and communication skills, by tying course instruction and exercises to students’ ongoing research projects, resulted in a marked improvement in students’ knowledge and levels of confidence.
Alberts, B. (2009). Making a science of education. Science, 323, 15. http://dx.doi.org/10.1126/science.1169941
Bennett, R. (2002). Employers' demands for personal transferable skills in graduates: A content analysis of 1000 job advertisements and an associated empirical study. Journal of Vocational Education & Training, 54, 457-476. http://dx.doi.org/10.1080/13636820200200209
Brownell, S.E., Price, J.V., & Steinman, L. (2013). A writing-intensive course improves biology undergraduates’ perception and confidence of their abilities to read scientific literature and communicate science. Advances in Physiology Education, 37(1), 70-79. http://dx.doi.org/10.1152/advan.00138.2012
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Coil, D., Wenderoth, M.P., Cunningham, M., & Dirks C. (2010). Teaching the process of science: faculty perceptions and an effective methodology. CBE Life Sciences Education, 9(4), 524-535. http://dx.doi.org/10.1187/cbe.10-01-0005
Cronbach, L.J., & Furby, L. (1970). How we should measure "change": Or should we? Psychological Bulletin, 74(1), 68-80. http://dx.doi.org/10.1037/h0029382
Dirks, C., & Cunningham, M. (2006). Enhancing diversity in science: is teaching science process skills the answer? CBE Life Sciences Education, 5(3), 218–226. http://dx.doi.org/10.1187/cbe.05-10-0121
Emerson, L. (2017). Writing science: Implications for the classroom. Asian Journal of the Scholarship of Teaching and Learning, 7(1), 23-36. Retrieved from http://www.cdtl.nus.edu.sg/ajsotl/article/writing-science-implications-for-the-classroom/index.html
Gauch, H.G., Jr. (2003). Scientific method in practice. Cambridge, UK: Cambridge University Press.
Gormally, C., Brickman, P., Hallar, B., & Armstrong, N. (2009). Effects of inquiry-based learning on students’ science literacy skills and confidence. International Journal for the Scholarship of Teaching and Learning, 3(2), 1-22. https://dx.doi.org/10.20429/ijsotl.2009.030216
Handelsman, J., Beichner, R., Bruns, P., Chang, A., DeHaan, R., Gentile, J., Lauffer, S., Stewart, J., Tilghman, S.M., & Wood, W.B. (2004). Scientific teaching. Science, 304, 521-522. http://dx.doi.org/10.1126/science.1096022
Howard, G.S., Ralph, K.M., Gulanick, N.A., Maxwell, S.E., Nance, D., & Gerber, S.L. (1979). Internal invalidity in pretest-posttest self-report evaluations and the re-evaluation of retrospective pretests. Applied Psychological Measurement, 3(1), 1-23. https://dx.doi.org/10.1177/014662167900300101
Kirschner, P.A., Sweller, J., & Clark, R.E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86. http://dx.doi.org/10.1207/s15326985ep4102_1
Kuhn, D., & Pease, M. (2008). What needs to develop in the development of inquiry skills? Cognition & Instruction, 26(4), 512–559. http://dx.doi.org/10.1080/07370000802391745
Pratt, C.C., McGuigan, W.M., & Katzev, A.R. (2000). Measuring program outcomes: Using retrospective pretest methodology. American Journal of Evaluation, 21(3), 341-349. http://dx.doi.org/10.1016/S1098-2140(00)00089-8
Raidl, M., Johnson, S., Gardiner, K., Denham, M., Spain, K., Lanting, R., Jayo, C., Liddil, A., & Barron, K. (2004). Use retrospective surveys to obtain complete data sets and measure impact in extension programs. Journal of Extension, 42(2). Retrieved from http://www.joe.org/joe/2004april/rb2.php.
Timmerman, B.E.C., Strickland, D.C., Johnson, R.L., & Payne, J.R. (2011). Development of a ‘universal’ rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment & Evaluation in Higher Education, 36(5), 509-547. http://dx.doi.org/10.1080/02602930903540991
Willison, J., & O’Regan, K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, 26(4), 393–409. http://dx.doi.org/10.1080/07294360701658609
Yeoman, K.H., & Zamorski, B. (2008). Investigating the impact of skill development of an undergraduate scientific research skills course. Bioscience Education, 11(5), 1-14. https://dx.doi.org/10.3108/beej.11.5
Zimbardi, K., Bugarcic, A., Colthorpe, K., Good, J.P., & Lluka, L.J. (2013). A set of vertically integrated inquiry-based practical curricula that develop scientific thinking skills for large cohorts of undergraduate students. Advances in Physiology Education, 37(4), 303-315. http://dx.doi.org/10.1152/advan.00082.2012
Appendix A. LSM3201 Content, Exercises, Assignments & Assessments
1. CONTENT OUTLINE (18 lecture sessions)
1.1. Scientific Thinking, Inquiry Practices & Communication [3 lectures]
1.2. Process of Scientific Inquiry in the ‘Introduction’ [2 lectures]
1.3. Process of Scientific Inquiry in the ‘Materials & Methods’ [2 lectures]
1.4. Process of Scientific Inquiry in the ‘Results’ [2 lectures]
1.5. Scientific Inquiry Process in the ‘Discussion’ [2 lectures]
1.6. Summarizing, Presenting & Communicating [4 lectures]
1.7. Peer Review & Critiquing [2 lectures]
1.8. How to Know & How to Make Known: Life, Science and Beyond [1 lecture]
2. CLASS EXERCISES & PARTICIPATION
2.1. Characterizing Your Research Project
2.2. SMART Planner & Reflective Learning Journal
2.3. Deconstruction of Primary Research Article
2.4. Literature Research
2.5. Assembling the ‘Introduction’ section
2.6. Assembling the ‘Materials & Methods’ section
2.7. Assembling the ‘Results’ section
2.8. Assembling the ‘Discussion’ section
2.9. Critiquing the ‘Introduction’ section
2.10. Critiquing the ‘Materials & Methods’ section
2.11. Critiquing the ‘Results’ section
2.12. Critiquing the ‘Discussion’ section
2.13. Peer-feedback for Oral/Poster Presentation
3. ASSIGNMENTS
3.1. Writing & Critique Assignment on ‘Introduction’ section
3.2. Writing & Critique Assignment on ‘Materials & Methods’ section
3.3. Writing & Critique Assignment on ‘Results’ section
3.4. Writing & Critique Assignment on ‘Discussion’ section
3.5. Abstract writing, Outline & Title Construction of Research Report
3.6. Oral/Poster Presentation
4. MODE OF ASSESSMENT: 100% Continual Assessment (No Final Examination)
4.1. Writing & Critique Assignment on ‘Introduction’ section: 15%
4.2. Writing & Critique Assignment on ‘Materials & Methods’ section: 15%
4.3. Writing & Critique Assignment on ‘Results’ section: 15%
4.4. Writing & Critique Assignment on ‘Discussion’ section: 15%
4.5. Abstract writing, Outline & Title Construction of Research Report: 10%
4.6. Oral Presentation: 10%
4.7. Quiz: 10%
4.8. Class exercises and participation: 10%
Appendix B. Samples of Students’ Reflection on Learning in LSM3201
Scientific Thinking & Practices
[Critical evaluation of scientific literature]
I learned to not take things at the face value and to engage in deeper critical thinking, especially when I do literature reviews for my project. Also, instead of taking a passive stance and accepting things as they are, I also want to become more active in finding out the underlying justifications or reasons that lead conclusions. In other words, to be unafraid to ask questions or clarify any doubts that I have.
[Use of scientific literature; Research justification and hypothesis formulation]
Through the assembling the introduction sample and assignment, I am better aware of how a secondary source like a review paper and a tertiary source such as a textbook can be used in literature review just as well as a primary source. I now know the elements behind a hypothesis and in justifying one’s project. I understand that hypothesis is not just an educated guess, but a tentative solution since it still needs to be tested, with the results, variables and operations needing to be measurable and well-defined. Similarly, having a rationale for doing one’s project is not for structure sake, but it actually has bigger implications during results analysis and conclusion.
Prior to the lecture, I had a vague understanding on the importance of ensuring the reproducibility of a scientific research. I failed to see that actually each finding is an important stepping stone that will help in future discoveries. This realisation is important as my project is about providing baseline information for future planning. I also had the idea that once a finding is published, it is finalised and can be taken as definite truth. However, I was deeply mistaken and I saw a bigger picture of how science is dynamic and will constantly be self-correcting.
[Scientific thinking and process]
Another thing that I realized are the gaps in my thought process ever since I started my UROPS project. Until very recently, I have not thought explicitly of the project’s question or its hypothesis. Even when I have thought of it recently, I could not formulate an idea of what it is. I have a clearer picture now after seeing the thought process for scientific research, and I can better appreciate the flow of my project. All in all, I think the most important thing that I’ve learnt this week is the importance to think, plan, and communicate, something which I did not do before, at the expense of many failed experiments and feeling very, very confused for the past few months. As such, I have talked to my prof and lab mates more this week, and I have learnt so much more, and in a much more pleasant way as compared to before.
Immediately after the discussion on controls for experiments, I started thinking about my own experiments. I started asking myself if it was possible for me to include positive controls, or have I been including negative controls with the experiments I have conducted. I started worrying about the reliability and validity of the results I have achieved. I think that these questions prompted me to constantly go back to my hypotheses and ask if the experiments were answering the problem question. At the same time, I realised that the inclusion of controls, especially negative ones, can significantly improve the results I have obtained so far.
[Designing and executing experiment]
I initially thought that because I am working in lab, the extraneous variables are already well taken care of. I mean, compared to doing fieldwork, there are a lot more factors that can affect a person’s experiment; whereas in the lab, the environment is rather constant. There are also other extraneous variables to be taken care of such as my own skills (e.g. pipetting) and instrumentation. Furthermore, as there is a possibility of working with mice in the future, I need to make sure that my treatment and control groups are homogenous groups (i.e. same source, age, sex) while taking into consideration the number of replicates, the limitation of budget, and time.
[Data handling and analysis]
I had the misconception that outliers should be readily discarded as they tend to make the data look awkward and difficult to explain… Outliers can mean potential rival hypotheses or novel findings and should be given attention unless it can be certain that it arose due to an error, otherwise they should be discussed rather than omitted… I will pay closer attention to outliers and unexpected trends during both bench work and report-writing as those are potential sources of novel observations and findings.
[Evaluation of scientific communication; Review, critique and feedback]
Today’s lesson is one of the most helpful lessons we had so far because the critique and feedback are very personalised for my project. In a way, I know what exactly are good and what are not. Some of the comments given by my peers from another field of specialisation are also important to take note of because I feel that it is important for my report to be so clear such that even people not from the same specialisation could understand what my project is about. I also learned how to take in criticisms and always realise the potential to keep improving!
I am now clearer about the purpose and requirements of the introduction in a scientific report, and how to clearly state the hypothesis and define the variables in research. I better appreciate the role of literature review in writing the introduction in a scientific report, and can make better evaluation of which literary references to use for justification.
I think that the four objectives of the introduction is something truly new to me. I am used to writing from a general perspective before moving straight into the hypothesis and the overview of the paper. I think that, after reading through the samples of introductions, I now have a better sense of how to drastically improve my introduction to make it more complete and concise at the same time. I feel that one of the more important points made during the lesson was that it is not necessary to include everything unless the content is truly relevant to the report.
[Materials & Methods (M&M/Methodology)]
I also did not think about the need for an actual structure to the M&M. I always thought that as long as the M&M has defined subheadings, the content and sequence of each experiment should not matter… Again using the template, my M&M became really clear to me in terms of both answering my research questions as well as organizing this section in terms of procedures done.
I have always thought that the M&M section is simply a ‘cut and paste’ section using information from manufacturer’s protocols. I have never thought about the importance of this section as I always think that it’s so generalized that no one will bother to look at it…I have learnt what to include in the section and what not to include. The M&M section is not where one lists down a step-by-step of what he did but rather, a clear and concise summary of all the necessary details that can allow another person to reproduce it. I know now that checking the validity (internal and external) and reliability is crucial in this section.
I used to think that the results section was just a mere listing of the results that were obtained. Never did I knew there was actually a structure to the results section... There is so much more than just listing the results. The results had to make sense and be presented in an easy way to be read and digested by the reader… I will put more effort into constructing the results with respect to the elements, form and function for the results section so that it will be coherent and readable. It has also led me to think about how I should present my data in a format that best represent my results.
I now have a reinforced understanding of results presentation, especially as to when to use graphs and tables, how to label figures and the importance of organizing the results in a way that addresses the mentioned research aims.
I learned that besides only discussing the internal and external validity of results, it is important for the discussion section address the context of the results in relation to research question/problem/hypothesis and in relation to the wider field of knowledge. Even negative results can have importance and be appreciated by the scientific community if put into the proper context.
I now have a better understanding of what needs to be in the discussion section and how to structure it. I am more aware how in-depth one needs to go in discussing one’s results (it isn’t just about explaining it, but whether it addresses the research question, the strengths and limitations/challenges of the experimental design, how much can it contribute/how significant is one’s findings to the broader scope of knowledge and suggestions on how to improve the procedure and identify gaps that could be addressed in future studies). I also have a greater awareness on how to improve cohesion and coherence in my writing.
The assertion-evidence structure was a new concept that I found to be extremely useful - by clearly stating the assertion (or the point you are trying to make in each slide) and then supporting it with relevant evidence in the same slide, each slide is made self-sufficient. A presentation must be able to clearly convey the key points in a logical flow…Key points should be repeated or reinforced near the end of the presentation especially, where attention levels become higher again. As attention levels naturally drop towards the middle of the presentation, extra steps such as varying pitch/tone of voice can be used to maintain contact with the audience.
Common errors in oral presentation, such as the use of filler words, colloquial terms and nervous/informal body language became more apparent to me as I watched my classmates present their study. The lack of transiting/ linking between sections, study context and bigger implications were also major errors that were made by many of my classmates (even me), and this made me realized how little things such as not use a signposting word, slide, or phrase, and making the effort to link back to the study context and research field could affect the clarity of the presentation.
Appendix C. Survey Items (Set A)
Please rate your current understanding of your ability for the following items in the context of research and communication in Life Sciences using the following scale:
[For the retrospective survey: “Please rate your current understanding of your ability before and after you have taken this module for the following items in the context of research and communication in Life Sciences using the following scale:”]
1–Very Poor, 2–Poor, 3–Below Mean, 4–Mean, 5–Above Mean, 6–Good, 7–Very Good
Q1. Awareness of research and communication ethics. [Awareness of Ethical Issues]*
Q2. Writing materials and methods section. [Communicating Methodology]
Q3. Engaging my audience in my scientific presentation. [Presentation Skills]
Q4. Writing introduction section. [Communicating Introduction]
Q5. Conducting literature research. [Literature Research]
Q6. Reviewing and critiquing a scientific paper/report. [Evaluating Scientific Thinking]
Q7. Designing experiment(s) to address the research problem/question and to test the research hypothesis. [Designing & Executing Experiment]
Q8. Preparing visual slides and materials for scientific presentation. [Preparing Presentation Aids]
Q9. Writing results section. [Communicating Results]
Q10. Identifying a research problem/question and formulating a research hypothesis. [Formulating Research Question/Hypothesis]
Q11. Writing abstract section. [Writing Abstract]
Q12. Evaluating a scientific paper/report. [Evaluating Scientific Thinking]
Q13. Writing discussion section. [Communicating Discussion]
*[Bold phrases in parentheses representing the item category do not appear in survey]
Appendix D. Survey Items (Set B)
Please indicate whether you agree or disagree with the following statements in a scientific research and communication setting using the following scale:
1–Strongly Disagree (Absolutely Not)
3–Mildly Disagree (Maybe Not)
4–Neither Agree nor Disagree (I Do Not Know/I Cannot Decide)
5–Mildly Agree (Maybe Yes)
7–Strongly Agree (Absolutely Yes)
Q1. I know the important elements needed in an abstract of a scientific report. [Writing Abstract]*
Q2. I know how to identify gaps or missing important information/elements in a scientific report related to my research project. [Evaluating Scientific Thinking]
Q3. I know the important elements needed in the introduction section of a scientific report. [Communicating Introduction]
Q4. I know the important elements involved in designing an experiment. [Designing & Executing Experiments]
Q5. I know how to source and cite important works related to my research project. [Literature Research]
Q6. I know how to explain, compare and generalise research findings. [Communicating Discussion]
Q7. I can deliver an effective and successful scientific presentation. [Presentation Skills]
Q8. I know what to include and exclude in the materials and methods section. [Communicating Methodology]
Q9. I know how to organise, analyse and interpret data. [Communicating Results]
Q10. I can develop effective powerpoint slides. [Preparing Presentation Aids]
Q11. I can tell a good research study from a bad one, especially if it is related to my research project. [Evaluating Scientific Thinking]
Q12. I can present data and findings appropriately and effectively. [Communicating Results]
Q13. I know the general ethical issues surrounding research work and communication. [Awareness of Ethical Issues]
Q14. I can clearly state the research problem/question/hypothesis related to my research project. [Formulating Research Question/Hypothesis]
*[Bold phrases in parentheses representing the item category do not appear in survey]
Appendix E. Test Items
1) Please answer the following questions based on your current knowledge. There is no need to seek out the answer.
Question Type: “True”, “False” or “I Don’t Know”
Q1. ‘Refine’ is one of the ‘3Rs’ in ethical practices for the use of animals in research. [Awareness of Ethical Issues]
Q2. Reproducible data must be valid data. [Evaluating Scientific Thinking]
Q3. It is equally important to present results in chronological and logical order of the experiments and findings. [Communicating Results]
Q4. In general, a good oral presentation should allocate 30% to Introduction, 30% to Materials and Methods, 40% to Results and Discussion. [Preparing Presentation Aids/Presentation Skills]
Question Type: Multiple Choice
Q5. Which of the following options is/are considered primary sources of research information in literature research? [Literature Research]
A. Journal research articles
B. Theses and dissertations
C. Conference Proceedings
D. A and B only.
E. A, B and C.
Q6. Which of the following options is/are the assumption(s) of science? [Evaluating Scientific Thinking]
A. Assumes order (uniformity) & consistency
B. Assumes principles of parsimony.
C. Assumes all truths can be discovered.
D. A and B only.
E. A, B and C.
Q7. Which of the following options is/are characteristics of a good research hypothesis? [Formulating Research Question/Hypothesis]
C. Fit existing observation.
D. A and B only.
E. A, B and C.
Q8. Which of the following options is/are TRUE in an experimental design? [Designing & Executing Experiment]
A. Operational definition is important for defining variables into measurable factors.
B. Extraneous variables are variables that can affect an independent variable.
C. A dependent variable is a measurement caused solely by the change of an independent variable.
D. A and B only.
E. A, B and C.
Q9. Which of the following is TRUE regarding an ‘Abstract’ of a research article? [Writing Abstract]
A. An ‘Abstract’ can be used to attract and compete for online attention.
B. An ‘Abstract’ can contain new information that is not already in the manuscript.
C. The largest proportion of an ‘Abstract’ should be dedicated to the conclusion of the research.
D. A and B only.
E. A, B and C.
Q10. What makes an observation scientific? [Communicating Methodology]
D. A and B only.
E. A, B and C only.
Q11. Which of the following categories is/are concerned with the external validity of a finding? [Communicating Discussion]
A. Statistical significance.
B. Sample source.
C. Presence of negative control.
D. A and B only.
E. A, B and C.
Q12. What does it mean to ‘generalize’ in the discussion section? [Communicating Discussion]
A. To reconcile the findings with the research problem/hypothesis.
B. To identify general similarities of the findings with other studies.
C. To provide a general description of future related studies.
D. To apply the findings broader into what is established in the field.
E. To provide a general conclusion to the findings.
LAM Siew Hong is a senior lecturer at the Department of Biological Sciences, NUS. He is currently the co-chair of the Department Teaching Committee and is involved in several continuing professional development programmes. He is also interested in equipping students with transferable, essential generic skills for lifelong learning and employability. He has published in various forms; more information can be found at http://www.dbs.nus.edu.sg/staff/lamsh.html.