
Direction of collaborative problem solving-based STEM learning by learning analytics approach

Abstract

The purpose of this study was to explore the factors that might affect learning performance and collaborative problem solving (CPS) awareness in science, technology, engineering, and mathematics (STEM) education. We collected and analyzed data on important factors in STEM education, including learning strategy and learning behaviors, and examined their interrelationships with learning performance and CPS awareness, respectively. Multiple data sources, including learning tests, questionnaire feedback, and learning logs, were collected and examined following a learning analytics approach. Significant positive correlations were found for the learning behavior of using markers with learning performance and CPS awareness in group discussion, while significant negative correlations were found for some factors of STEM learning strategy and learning behaviors in pre-learning with some factors of CPS awareness. The results imply the importance of an efficient approach to using learning strategies and functional tools in STEM education.

Introduction

In the twenty-first century, international concern over science, technology, engineering, and mathematics (STEM) education has increased. We currently face many global issues, including climate change, overpopulation and wellbeing, resource management, health, and biodiversity, that put great pressure on institutions involved in developing science and technology and that require continued development of STEM education (Gough, 2015; Thomas & Watters, 2015). Considering the complexity and diversity of these issues and the great need for the ability to integrate knowledge and skills in science, technology, engineering, and mathematics to solve real problems (Newhouse, 2016), science learning seems to be a powerful way of thinking and understanding the basis of these issues (Thomas & Watters, 2015).

According to Kelley and Knowles (2016), the foundation of the STEM learning framework is situated STEM learning, which is grounded in situated cognition theory. Situated cognition theory emphasizes that learners’ knowledge is constructed within authentic activities and contexts (Brown, Collins, & Duguid, 1989; Lave & Wenger, 1991), and that what matters is not only the knowledge and skills themselves but also how they can be applied in authentic contexts (Kelley & Knowles, 2016). Therefore, as one of the critical themes of STEM, acquiring knowledge and skills by solving problems set in real-world scenarios is crucial in STEM learning, because it helps students prepare for real life and solve everyday and workplace problems using science, technology, engineering, and mathematics-related knowledge and skills (Holmlund, Lesseig, & Slavit, 2018).

Moreover, situated cognition theory recognizes that not only the cognitive aspect but also the social aspect of learning activities is critical to the learning process. It emphasizes that, rather than constructing knowledge on their own, people construct knowledge through social communication and exchange with others (Lave & Wenger, 1991). In situated STEM learning, knowledge is considered to be organized around ideas, concepts, or themes and to evolve through social discourse; thus, as one of the key elements of situated learning, a community of practice is regarded as the rope that binds all dimensions of STEM, connecting science inquiry, technological literacy, mathematical thinking, and engineering design (Kelley & Knowles, 2016). This indicates that STEM education should consider not only the acquisition of knowledge and skills themselves but also the process of acquiring them through authentic contexts and the exchange of ideas, and of using them to solve authentic problems, a process that involves both cognitive and social aspects.

Considering the integrative nature of STEM education, it has been indicated that collaboration for problem solving provides some benefits, such as a more effective division of labor and the incorporation of differing perspectives, knowledge, and experiences (Organisation for Economic Cooperation and Development [OECD], 2017). In this regard, collaborative problem solving (CPS) is considered effective in STEM education, especially when dealing with real and complex problems, because of its coordinated nature (Hesse, Care, Buder, Sassenberg, & Griffin, 2015).

According to Hesse et al. (2015), CPS is not a uniform skill but rather a complex, coordinated one that comprises several sub-factors. In order to develop CPS skills in STEM education and improve learning effectiveness, it is important to identify the potential factors that influence learning performance and CPS awareness in STEM learning.

In many prior studies, learning behaviors have proved to be important indicators of students’ learning performance (Hwang, Shadiev, Wang, & Huang, 2012; Lowes, Lin, & Kinghorn, 2015) and individual learning awareness of instructional practice (Artino Jr. & Jones II, 2012; Yamada et al., 2017). Since STEM learning allows students to combine theory and practice in real situations, it is necessary to explore the aspects of cognition and behaviors in STEM, including how students comprehend and apply integrated knowledge (Liu & Cavanaugh, 2011). Thus, we took learning behaviors into consideration as important factors that could affect the improvement of learning performance and cultivation of CPS awareness.

Since it is not easy to capture students’ learning behaviors during lessons, instructional experiments that focus on learning behaviors have mostly targeted online courses in higher education or long-term courses in secondary education. However, not all instructors in secondary education adopt online instruction in their classes, due to time constraints, equipment limitations, and other issues. Therefore, we examined students’ learning behaviors that reflect their individual thinking and investigated their influence on learning performance and CPS skill utilization.

In addition to learning behaviors, cognitive learning strategy is also considered a factor affecting learning scores (Yamada et al., 2016). Since STEM education is an integrated learning approach (Kelley & Knowles, 2016; Lou, Liu, Shih, & Tseng, 2011), it is important for both educators and students to understand the coherence, integration, and learning approaches of STEM so that they can apply an effective learning strategy in STEM teaching and learning. Alongside learning behaviors, it is therefore necessary to make sense of students’ cognitive learning strategies in STEM education, such as whether and how they use learning strategies during STEM learning and when solving STEM problems (Griese, Lehmann, & Roesken-Winter, 2015).

Based on insights from these prior studies and the needs they revealed for understanding the factors that might affect learning performance and learning awareness, our study aimed to clarify the correlations of students’ learning performance and learning awareness with their learning strategy and learning behaviors, in order to determine the factors that might affect STEM CPS learning and to provide researchers and instructors with an effective way to develop and improve instructional methods in STEM CPS education.

Literature review

Collaborative problem solving in STEM education

STEM includes scientific study, technology, engineering design, and mathematical analysis (Lou et al., 2011). STEM teaching is commonly conceptualized as a multidisciplinary approach that typically begins with a discipline or multidiscipline, on the basis of which instructors prepare problems for students to solve (Herro, Quigley, Andrews, & Delacruz, 2017). For STEM education to succeed, it is important not only to embed knowledge and skills in the curriculum but also to assess them in real situations or in problem-solving practice, and to focus on the links between knowledge across the four STEM domains (Newhouse, 2016). STEM teaching allows students to examine and apply theories and knowledge to improve their problem-solving skills, as well as to integrate the comprehension and application of complicated knowledge across STEM areas (Lou et al., 2011).

Although it is not suggested that all four STEM domains must be embedded in one STEM curriculum or learning experience, it is still necessary to understand the relationships among these domains and seek coherency in STEM education (Kelley & Knowles, 2016). Thus, Kelley and Knowles emphasized the importance of STEM integration and defined integrated STEM education as “the approach to teaching the STEM content of two or more STEM domains, bound by STEM practices within an authentic context for the purpose of connecting these subjects to enhance student learning” (p. 3). Based on this definition, we emphasize the solution of authentic science problems across the STEM domains in order to enhance learners’ integrated STEM-related knowledge and skills. To achieve that goal, the application of scientific concepts, the individual’s interaction with technology, applied mathematics, and engineering design (Kelley & Knowles, 2016; Lou et al., 2011) are considered the important factors of STEM instruction design in our study.

In STEM education, situated learning is considered the foundation for integrating the four domains (Kelley & Knowles, 2016): students can build an increasingly rich understanding of what they are learning by actively applying knowledge and skills to authentic situations, rather than merely acquiring them (Brown et al., 1989). Authentic activities help students act meaningfully and purposefully, since such activities give them experience in representing and describing knowledge or concepts and in revising their understanding and actions based on that experience and its results (Brown et al., 1989). Building on situated STEM learning, an engineering design approach provides the opportunity for students to make connections among STEM domains and to apply scientific knowledge and inquiry in an authentic context. Students can construct new knowledge and enhance their learning through engineering practice and scientific inquiry (Kelley & Knowles, 2016). They treat technology and engineering as cognitive tools, apply mathematical and scientific approaches to solve problems, generalize key concepts, and accumulate procedural knowledge (Lou et al., 2011). Thus, students are expected to develop and use integrated knowledge and cognitive skills, such as problem-solving skill, through authentic contexts. Additionally, in light of the importance of the social aspect of STEM learning, Kelley and Knowles (2016) point out that a community of practice (Lave & Wenger, 1991) is an important element in integrating the four STEM domains in situated learning, since students construct their understanding by expressing and interpreting their thinking and enrich it through the exchange of ideas and through communication and negotiation with others (Brown et al., 1989). Therefore, it is important to take not only the cognitive but also the social aspect into consideration in STEM learning.

According to the OECD (2017), collaboration for problem solving affords potential advantages over individual problem solving, including a more effective division of labor; the incorporation of information from group members with multiple perspectives, experiences, and sources of knowledge; and enhanced creativity and quality of solutions through mutual feedback. Therefore, in order to improve integrated STEM learning, it is necessary to consider both social and cognitive factors when tackling STEM challenges. In this regard, collaborative problem solving (CPS) is a promising area in STEM education because of its advantages in inculcating the understanding of scientific knowledge and others’ ideas, training in scientific investigation, and solving applied problems (Hesse et al., 2015; Hogan, 1999; Lin et al., 2015). Lin, Yu, Hsiao, Chang, and Chien (2018) compared the effectiveness of a web-based CPS system with classroom-based hands-on activities, both including CPS scenarios drawn from daily life, for developing students’ CPS skills in STEM learning. They found the virtual STEM learning environment to be more effective in developing CPS skills than traditional classroom-based hands-on activities, and that the effectiveness of the system could be further enhanced with teachers’ involvement and guidance (Lin et al., 2018).

According to Hesse et al. (2015), CPS combines two domains, the social domain and the cognitive domain. The social domain, which refers to collaboration, focuses on managing the interaction and contributions of individuals, while the cognitive domain, which refers to problem solving, requires effort in task regulation and the application of skills (Care, Scoular, & Griffin, 2016). Due to the complex and coordinated nature of CPS, it is important to identify the factors that might influence all the sub-skills of CPS and to understand how those sub-skills develop. Our previous study indicated that students’ behavioral factors affect learning performance and CPS awareness (Chen et al., 2018). Therefore, in the present study, in order to improve the learning effectiveness of STEM CPS lessons, we explored the relationships of learning effectiveness with learning behaviors and cognitive learning strategies to see how individual cognition and thinking affect CPS learning and to determine the factors that might impact the improvement of knowledge and CPS awareness.

Learning analytics and online behaviors

Considering the social and cognitive domains of CPS, it is important to clarify how students engage in collaborative activities and how they act and think in individual and collaborative learning (Chen et al., 2018). In this regard, it is important to understand students’ individual thinking behaviors and how they use learning strategies in individual and group learning. In light of the difficulty of collecting behavioral data during traditional face-to-face lectures, many studies have focused on educational data collected and analyzed technologically using a learning analytics approach.

According to Dunbar, Dingel, and Prat-Resina (2014), it is important to incorporate educational data and analysis relevant to student learning into course and curriculum design, and they also indicated a need for methods and tools that curriculum designers can use to explore data on instructional practices. In science education, the analytics and mining of educational data are useful in evaluating and improving educational design (Monroy, Rangel, & Whitaker, 2014). Over the past decades, researchers have worked to explore the potential of analytics and data mining techniques and methodologies for extracting valuable and actionable information from large datasets. When applied to education, these methodologies are referred to as learning analytics (LA) and educational data mining (EDM).

Although LA and EDM share many similarities in their general methods and procedures, including gathering, processing, reporting, and acting on machine-readable data in order to improve the educational environment, they differ in many aspects, such as their origins, types of discovery, and approaches to adaptation and personalization (Baker & Siemens, 2014; Liñán & Pérez, 2015). For example, EDM research originates in educational software and student modeling; it is more interested in automated discovery and looks for automated adaptation to support learners, such as identifying a learner’s needs and automatically personalizing the learner’s experience through educational software. LA research, by contrast, is rooted in curriculum, outcome prediction, and systemic interventions, and favors human-led methods of exploring data to build more interpretable and understandable models. Rather than automated adaptation, LA research looks for ways to inform and empower instructors and learners, such as informing instructors, or learners themselves, about specific learners’ learning situations and processes.

The present study aims at exploring the relationships of learning performance and CPS awareness with behavioral and STEM strategy factors. To achieve this aim, the experiment was conducted on the basis of the instructional design, with interventions from the designers and the instructor. The educational data included not only the learning logs collected and generated by the system but also psychological data collected through surveys. The methods of data collection and analysis were human-led, guided by learning theories and the changing conditions of the learning environment.

The purpose of the educational data analysis in this experiment was to clarify the relationships between various factors, identify potential factors that may influence learning outcomes, inform instructors and learners of these results to help them understand their teaching or learning, and provide researchers and educators with implications for CPS-based STEM learning. We therefore adopted an LA approach, rather than an EDM approach, to achieve these research purposes.

Learning analytics (LA) is an effective way of using academic data that allows us to understand and improve learning in various fields. For example, Bazelais, Lemay, and Doleck (2018) investigated the relationship between students’ prior achievement (high school average) and performance in a pre-university physics program using data from 9877 students in a pre-university physics course. They found that prior high school achievement was a strong predictor of college physics course performance. In higher education, technological support makes it possible to collect LA data from new sources such as learning management systems (LMS). Through an LMS, students’ learning logs can be collected during online learning and used to define learning behavior variables, such as the number of logins, pages accessed, and time spent in the system, which indicate the frequency and duration of students’ participation (Morris, Finnegan, & Wu, 2005). Moreover, some studies have found that certain types of online participation behaviors, such as “page hits,” are correlated with grades (Ramos & Yudko, 2008; Wang & Newlin, 2000). With an e-book system, students can use multiple functions such as moving to the next or previous page, adding bookmarks, underlining, annotating, and keyword searching, and all of these operations can be logged by the system. The system also records when and for which course an e-book was used, which is very useful information for analyzing students’ learning activities (Ogata et al., 2015). Oi, Okubo, Shimada, Yin, and Ogata (2015) analyzed students’ learning behaviors by collecting e-book logs before and after the main content learning in class to investigate the relationships of preview and review behaviors with academic achievement. Their findings indicate that previewing is more deeply relevant to academic achievement and assessment than reviewing.
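To make this kind of log-based variable construction concrete, the following minimal sketch (Python with pandas) shows how simple behavior variables such as login counts, pages accessed, and time spent might be derived from an event log. The column names, event labels, and data are invented for illustration and do not reflect any particular LMS schema.

```python
# Illustrative sketch (hypothetical log format, not a real LMS schema):
# derive login count, pages accessed, and a rough time-on-system measure
# from an event log with columns student_id, event, page, timestamp.
import pandas as pd

logs = pd.DataFrame({
    "student_id": ["s01", "s01", "s01", "s02", "s02"],
    "event":      ["LOGIN", "PAGE_VIEW", "LOGOUT", "LOGIN", "PAGE_VIEW"],
    "page":       [None, 3, None, None, 1],
    "timestamp":  pd.to_datetime([
        "2018-11-05 09:00", "2018-11-05 09:05", "2018-11-05 09:30",
        "2018-11-06 20:00", "2018-11-06 20:10",
    ]),
})

behavior = logs.groupby("student_id").agg(
    logins=("event", lambda e: (e == "LOGIN").sum()),         # number of logins
    pages_accessed=("page", lambda p: p.notna().sum()),       # pages viewed
    time_spent=("timestamp", lambda t: t.max() - t.min()),    # crude duration proxy
)
print(behavior)
```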

Although LA is considered an effective method for assessing how online behaviors are associated with learning performance in secondary education (Lowes et al., 2015), there are still fewer studies using LA data at the secondary education level than in higher education.

Chang et al. (2017) collected and analyzed multiple data sources, including group discourse, test scores, questionnaire feedback, and problem-solving activity logs, to understand students’ learning effectiveness and CPS patterns, and then examined how students solved problems using individual-based and collaborative simulations to understand the effects on science learning. However, the activities in Chang et al.’s study lasted only 60 min, which is hardly representative of other learning settings. Considering the limitations in student numbers and subjects, multi-subject studies at the secondary education level should be conducted that look at large numbers of students across settings, which would add the individual subjects as a complicating factor (Lowes et al., 2015). Liu and Cavanaugh (2011) collected learning logs for one academic year in high school biology courses and showed that certain variables could affect student academic achievement; for example, the time students spent in the LMS positively and significantly affected their final scores in biology courses. Lowes et al. (2015) also explored LMS data for one academic year to examine the link between in-course behaviors and course outcomes, concluding that the level of online behaviors associated with attendance and interactivity positively influenced final grades.

However, as it is difficult to conduct continuous online courses in secondary education due to time restrictions, limitations of equipment, and other issues, in our present study we collected students’ learning logs during short-term online learning in order to revise and improve our next short-term design accordingly.

Based on the insights of previous studies, the present study aimed at exploring the relationships of learning performance and CPS awareness with STEM learning strategy and online learning behaviors in order to investigate potential influencing factors on the learning effectiveness of STEM CPS learning and find an effective method to improve STEM CPS learning.

Research questions

As one of the goals of STEM learning, learners are expected to acquire knowledge through activities that involve solving problems in authentic contexts, or to apply that knowledge to authentic problems (Holmlund et al., 2018). To achieve this goal, it is important to understand how learners acquire and construct the related knowledge. The literature reviewed above showed that the learning behaviors students engage in during individual work with learning materials, which are related to their scientific thinking, influence learning performance and learning awareness. Therefore, in the present study, we examined the relationships between learning performance and the learning behaviors involved in reading and understanding scientific content, in order to find out which learning behaviors should be recommended or attended to during STEM learning. Meanwhile, in order to understand how to instruct students to use effective learning strategies to acquire knowledge, it is necessary to identify which STEM learning strategies influence learning performance.

Moreover, since this was not the students’ first encounter with CO2-related content or with group work, they might have had different levels of prior knowledge and CPS awareness, which could influence the final results. Given the differences in students’ pre-test and CPS pre-questionnaire scores, it was not appropriate to consider only the final values of learning performance and CPS awareness. Therefore, for the first research question, we examined the differences between the pre- and post-tests and questionnaires, and then investigated the relationships of the changes in learning performance and CPS awareness with learning behaviors and STEM learning strategies, in order to identify the potential factors related to the change in learning performance.

Additionally, some of the previous research cited above showed that when developing CPS skills it is important to identify the characteristics of all the sub-factors of CPS, such as the relationships these sub-factors have with other learning factors. Therefore, the second research question examined the relationships of the improvement in CPS awareness with behavioral and strategic factors.

Moreover, in order to understand how students come to perform these behaviors, it is necessary to relate the learning behaviors to the learning strategies students used during STEM learning. Students and instructors could then be given suggestions about the learning behaviors related to improvements in learning performance and CPS awareness, and about how to encourage students to perform these behaviors by using related learning strategies.

In this study, we set three research questions:

  • Research question 1: Which factors of STEM learning strategy and learning behaviors have relationships with the change in learning performance in STEM learning?

  • Research question 2: Which factors of STEM learning strategy and learning behaviors have relationships with the change in CPS awareness in STEM learning?

  • Research question 3: What are the relationships between STEM learning strategy and learning behaviors in STEM learning?

Methods

Procedure of the instructional design

In this study, we designed a STEM CPS lesson based on the CPS framework proposed by Hesse et al. (2015). The theme of this STEM lesson is the same as the science lesson conducted in our prior study (Chen, Uemura, Goda, et al., 2018), which involved determining the reason for the Limnic Eruption, a natural disaster that occurred in Cameroon.

In order to facilitate pre-learning and group discussion, and to collect and analyze participants’ learning behaviors during pre-learning and their individual thinking behaviors during group work, we integrated the BookRoll system into our design.

The BookRoll system is an e-book reader system used to store lecture materials such as slides or notes (Ogata et al., 2017). Students can access these learning materials both in class and at home, which makes it possible to collect learning logs from the time students prepare or review their lessons and to understand their learning conditions. In addition, since students can use additional functions such as highlighting, annotating, and keyword searching, these learning logs can all be collected for further analysis and instructional improvement. The BookRoll interface used in this study is presented in Fig. 1.

Fig. 1
figure 1

The BookRoll system interface of the Limnic Eruption lesson

The instructional design of the STEM lessons is presented in Appendix 1.

In order to make the tasks proceed more smoothly, we designed five questions for students to discuss with each other: (1) Where did the CO2 in Lake Nyos come from? (2) Where did more CO2 dissolve and accumulate in Lake Nyos, at the surface or at the bottom? Why? (3) According to Wikipedia, about 90 million tons of CO2 are dissolved in Lake Nyos (Wikipedia: Lake Nyos, Japanese version). How much pressure does it take for 90 million tons of CO2 to dissolve in Lake Nyos at 20 °C? (5) Could a limnic eruption possibly occur in Japan? (4) In summary, what is the mechanism of limnic eruptions?
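As an illustration of the reasoning behind question (3), the sketch below sets up the calculation with Henry’s law (c = kH · P). This is not the lesson’s official solution: the Henry’s constant is an approximate textbook value for CO2 near 20 °C, and the lake’s water volume is left as a parameter because it is not given here.

```python
# Hedged sketch of one way question (3) could be set up with Henry's law.
# kH is an approximate value for CO2 at ~20 C; the lake volume must be
# supplied by the student (it is not hard-coded here).
def pressure_for_dissolution(mass_co2_tons, lake_volume_L,
                             kH=0.039, M_CO2=44.01):
    """Pressure (atm) required to keep the given mass of CO2 dissolved in the
    given volume of water, using Henry's law c = kH * P (kH in mol/(L*atm))."""
    moles = mass_co2_tons * 1e6 / M_CO2       # metric tons -> grams -> moles
    concentration = moles / lake_volume_L     # mol/L that must stay dissolved
    return concentration / kH                 # rearranged Henry's law

# Usage (V is the lake's water volume in litres, looked up by the students):
# pressure_for_dissolution(90e6, V)
```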

The process of the Limnic Eruption lesson is presented in Fig. 2. Before each lesson, students were required to read the related learning materials and to highlight and record the contents they did not understand or thought were important. In the first week, the teacher introduced the Limnic Eruption disaster and explained the difficult contents according to what students had highlighted (pre-learning). After that, students worked on questions 1 and 2 by searching for and analyzing information and discussing the questions with others. Students were asked to talk about the highlights and annotations they had added during pre-learning. Then all groups made a presentation on their conclusions and received feedback from the teacher and the other students. During the second and third weeks, students performed pre-learning and group discussion in the same form and worked on questions 3~5. In the fourth week, students were asked to use their integrated knowledge and skills to design a disaster mitigation manual and present it to the whole class.

Fig. 2
figure 2

The process of the lesson Limnic Eruption

Design and procedure

This study was conducted in a tenth-grade science class at a private senior high school in Japan with the participation of 12 students. The period of this study was between November and December 2018 and included four lessons (50 min per lesson) over 4 weeks (one lesson per week). In addition to the teaching hours, students were also asked to read the provided learning materials on the BookRoll system, and to finish the assignments.

Before the lesson, students were required to take a Collaborative Problem Solving questionnaire (hereinafter, the “CPS Questionnaire”), which concerned their prior awareness of whether and how to use CPS skills in typical science classes as the pre-questionnaire, and a pre-test to check their prior knowledge. After the completion of the STEM lesson, students received the same CPS Questionnaire and a new STEM Learning Strategy Questionnaire (hereinafter, the “SLS Questionnaire”) as post-questionnaires. The CPS post-questionnaire was conducted to assess the change in students’ CPS awareness before and after the STEM lesson, while the SLS questionnaire checked the kind of learning strategy students used during the STEM lesson, and a post-test was also conducted to see whether their related knowledge had changed.

Data collection

In order to investigate factors that might affect the cultivation of CPS skills, we examined the relationships between CPS awareness and learning performance, the STEM learning strategy (SLS) used, and the learning behaviors in learning scientific materials during individual pre-learning and collaborative work.

Therefore, we collected data from four sources: questionnaires, tests, learning logs, and the dialogue during group discussion. The CPS Questionnaire was designed with reference to the CPS framework proposed by Hesse et al. (2015) and contains 17 items in total (see Appendix 2). The CPS pre- and post-questionnaires cover two dimensions, social skills and cognitive skills, and include five factors: Participation, Perspective Taking, Social Regulation, Task Regulation, and Learning and Knowledge Building. These factors reflect how students perceived the quality of collaborative activities and their cognitive processes when carrying out tasks. The CPS Questionnaire has been used in previous studies (Chen et al., 2018) to examine students’ awareness of collaborative and cognitive activities and was shown to be reliable. We also assessed its reliability: the overall Cronbach’s α of the pre-CPS Questionnaire was 0.77 (the reliability of Participation, Perspective Taking, Social Regulation, Task Regulation, and Learning and Knowledge Building was 0.83, 0.79, 0.79, 0.71, and 0.73, respectively), and that of the post-CPS Questionnaire was 0.79 (0.78, 0.84, 0.75, 0.84, and 0.74 for the same factors, respectively).
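For readers unfamiliar with the reliability check, the following minimal sketch (Python, with made-up responses) shows the standard Cronbach’s α computation for one questionnaire factor; it is illustrative only and does not reproduce the study’s data.

```python
# Minimal sketch of Cronbach's alpha: rows = students, columns = Likert items
# belonging to one questionnaire factor. Responses below are invented.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array of shape (n_students, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

responses = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2]])
print(round(cronbach_alpha(responses), 2))
```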

The SLS Questionnaire developed by Griese et al. (2015) was used as a post-questionnaire concerning students’ learning strategy during STEM learning. It contains nine factors (Organizing, Elaborating, Repeating, Effort, Attention, Time Management, Learning Environment, Peer Learning, and Using References) across 27 items (see Appendix 3). We translated the SLS Questionnaire into Japanese and made minor changes to the items to make them more suitable for senior high school students. The overall Cronbach’s α of the SLS Questionnaire was 0.78 (the reliability of Organizing, Elaborating, Repeating, Effort, Attention, Time Management, Learning Environment, Peer Learning, and Using References was 0.86, 0.73, 0.78, 0.71, 0.82, 0.84, 0.88, 0.70, and 0.71, respectively).

The contents of the CPS Questionnaire and SLS Questionnaire are shown in Table 1. Both questionnaires were rated on a Likert scale from 1 to 5 (1 = strongly disagree, 2 = slightly disagree, 3 = neither, 4 = slightly agree, 5 = strongly agree). Free-text space was also provided on the post-questionnaire to collect students’ individual reflections and their impressions of the STEM lesson.

Table 1 Contents of the CPS and SLS questionnaires

The pre- and post-tests contained the same ten questions on students’ acquisition of CO2-related knowledge. The tests included seven conceptual multiple-choice questions concerning the nature of CO2 (two questions) and its solubility (five questions), one calculation question, and two application questions that required students to solve problems related to CO2 solubility and disaster reduction awareness. Thus, the test results can reflect changes in students’ conceptual understanding of CO2-related problems and in their ability to transfer their knowledge to solve problems. The conceptual questions were at the same level as their final examination, while the calculation and application questions were more difficult.

The learning logs of the operations students performed when reading and understanding the digital learning materials through the BookRoll system were collected, yielding data on the frequency with which they turned to the next/previous page and added/deleted markers and annotations. Both in-class and out-of-class learning logs were collected in this study.
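The sketch below illustrates, with invented data and assumed column names (not the actual BookRoll log schema), how raw reading-operation logs of this kind could be aggregated into the per-student frequency variables analyzed in the following sections, split by learning phase.

```python
# Illustrative sketch: turn raw e-book operation logs into per-student
# frequency variables (Next/Prev page, Add/Delete Marker, Add Annotation),
# separated by phase. Column names and values are hypothetical.
import pandas as pd

logs = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s02", "s02"],
    "operation":  ["NEXT", "ADD_MARKER", "PREV", "ADD_MARKER", "DELETE_MARKER"],
    "phase":      ["pre_learning", "group_discussion", "pre_learning",
                   "group_discussion", "group_discussion"],
})

frequencies = (logs
               .groupby(["student_id", "phase", "operation"])
               .size()
               .unstack(fill_value=0))   # one column per operation type
print(frequencies)
```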

Results and discussion

Research question 1: What factors of STEM learning strategy and learning behaviors have relationships with the change in learning performance in STEM learning?

Changes in learning performance

The pre- and post-tests consisted of ten questions (ten marks in total) about knowledge related to CO2 and disaster reduction. Because of the small sample size, we inspected histograms of students’ pre- and post-test scores with a normal curve superimposed; the histograms were clearly asymmetric with moderate tails, indicating non-normal distributions. Therefore, we adopted the non-parametric Wilcoxon signed-rank test to assess the significance of the change between the pre- and post-tests with respect to students’ related knowledge.
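As a concrete illustration of this procedure, the sketch below (Python with scipy and matplotlib, using invented scores rather than the study’s data) shows how the distributional check and the paired Wilcoxon signed-rank test might be run.

```python
# Hedged sketch (scores are invented): visual distribution check followed by
# the paired, non-parametric Wilcoxon signed-rank test on pre/post scores.
import matplotlib.pyplot as plt
from scipy.stats import wilcoxon

pre  = [3, 2, 4, 3, 5, 2, 4, 3, 4, 3, 5, 3]   # hypothetical pre-test scores (0-10)
post = [6, 5, 7, 5, 8, 4, 6, 6, 7, 5, 8, 4]   # hypothetical post-test scores (0-10)

plt.hist(pre, alpha=0.5, label="pre")          # quick visual normality check
plt.hist(post, alpha=0.5, label="post")
plt.legend()
plt.show()

stat, p = wilcoxon(pre, post)                  # paired signed-rank test
print(f"W = {stat}, p = {p:.3f}")
```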

As shown in Table 2, the mean score improved from 3.42 (SD = 1.24) to 5.92 (SD = 1.51), significant at the 0.01 level, which shows a statistically significant difference in students’ learning performance over the STEM lesson.

Table 2 Wilcoxon signed-rank test results of pre-posttests

In order to investigate whether learning strategy and the learning behaviors involved in reading scientific materials influenced learning performance, we used Spearman’s rank correlation coefficient to assess the correlations between learning performance (from the test data) on the one hand and learning strategy (from the SLS Questionnaire) and the learning logs of reading the digital learning materials on the other.
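A minimal sketch of this correlation step is shown below (Python with scipy); the variable names and values are invented placeholders for a score-gain vector and one behavior variable, not the study’s data.

```python
# Minimal sketch: Spearman's rho between each student's score gain (post - pre)
# and one behavior/strategy variable. All values below are hypothetical.
from scipy.stats import spearmanr

score_gain = [3, 1, 4, 2, 3, 1, 2, 4, 3, 2, 3, 1]   # hypothetical post - pre gains
add_marker = [5, 1, 8, 2, 6, 0, 3, 9, 5, 2, 6, 1]   # e.g. Add Marker frequency in discussion

rho, p = spearmanr(score_gain, add_marker)
print(f"rho = {rho:.2f}, p = {p:.3f}")
```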

Correlations between changes in learning performance and STEM learning strategy

First, we analyzed the correlations between changes in learning performance and the SLS (see Table 3); however, no correlation was found.

Table 3 Spearman’s rank correlation coefficients between learning performance and STEM learning strategy

Lou et al. (2011) suggest that students should be guided efficiently into immersion in STEM learning. However, in this lesson the instructor played the role of a facilitator who only controlled the flow of the lesson and provided advice or gave answers directly when necessary, which suggests that the guidance for STEM learning in this instructional design was not sufficiently efficient.

As Kelley and Knowles (2016) pointed out, an important factor in STEM education is that both educators and learners put emphasis on the integration of STEM subjects. In order to make the integrated approach more effective in conveying to students how STEM knowledge can be applied to real-world problems, it is necessary for students to think about and understand the relevant ideas in the individual disciplines and their multidisciplinary integration (Kelley & Knowles, 2016). However, the students’ free-text questionnaire responses suggest that they focused only on scientific or mathematical knowledge and on how to use it to solve the specific problem provided, rather than thinking about relevant ideas or their integration. This is one possible reason why students’ STEM learning strategy failed to help them improve their learning performance.

Correlations between changes in learning performance and learning behaviors

Regarding the correlations between changes in learning performance and the learning behaviors involved in reading digital learning materials, we divided learning behaviors into two parts, pre-learning and group discussion. The results are presented in Table 4.

Table 4 Spearman’s rank correlation coefficients between learning performance and learning behaviors

During pre-learning, there was no correlation between changes in learning performance and learning behaviors; we consider the same reasons to be involved here as above, namely the lack of understanding of and thinking about STEM learning methods, as well as of guidance from instructors in STEM learning.

Turning to learning behaviors in group discussion, these included students’ operations when reading the digital learning materials. Functional tools such as markers and annotations capture behaviors associated with students’ ways of thinking (for example, highlighting content they do not understand) and changes in their thinking (deleting markers when they change an idea). The results in Table 4 show a moderate positive correlation between changes in learning performance and Add Marker (ρ = 0.66, p < 0.05) and a strong positive correlation between changes in learning performance and Delete Marker (ρ = 0.76, p < 0.01).

Students frequently add or delete markers during group discussion because they take others’ contributions into account when reconsidering problems (Chen, Uemura, Goda, et al., 2018), which was also confirmed by our classroom observation. The marker tool is effective in facilitating deeper processing and retrieval if instruction is provided on what to mark and on questioning oneself when re-reading (Yue, Storm, Kornell, & Bjork, 2015); otherwise, it may negatively affect performance on higher-level tasks that require students to make inferences (Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013).

In this STEM instructional design, students were required to highlight the content they did not understand in yellow and the important content in red before the lesson, and to discuss that content during the lesson. They were also asked to delete the markers if they changed their ideas after the discussion. Since the correlation between changes in learning performance and marker-using behaviors was found during group discussion but not during pre-learning, it can be inferred that marker-using behaviors may facilitate the integration of others’ contributions into one’s own thoughts when reconsidering problems (perspective taking). However, we have no direct observations supporting this. Thus, dialogue analysis was conducted to examine whether students exhibited perspective-taking behaviors during discussion.

Research question 2: What factors of STEM learning strategy and learning behaviors have relationships with the change in CPS awareness in STEM learning?

Changes in CPS awareness

As with the tests, we checked histograms of students’ CPS pre- and post-questionnaire scores with a normal curve superimposed and found non-normal distributions. Therefore, a Wilcoxon signed-rank test was used to assess the significance of the change between the CPS pre- and post-questionnaires concerning students’ awareness of whether and how to use CPS skills. The results are shown in Table 5.

Table 5 Wilcoxon signed-rank test results for the CPS pre-postquestionnaires

With respect to Social Skills, Perspective Taking increased from 13.25 (SD = 2.18) to 13.67 (SD = 1.37), while Participation decreased from 12.17 (SD = 2.08) to 11.92 (SD = 1.88) and Social Regulation decreased from 12.17 (SD = 2.44) to 11.58 (SD = 2.19).

Concerning Cognitive Skills, Task Regulation improved from 14.50 (SD = 3.40) to 15.08 (SD = 3.09), while Learning and Knowledge Building declined from 14.25 (SD = 2.70) to 13.75 (SD = 2.18). However, no statistically significant difference was found for any factor, indicating that CPS awareness did not improve through this STEM lesson.

This might be because we designed this STEM lesson according to the CPS process (Hesse et al., 2015), which includes the steps Identifying the problem, Representing the problem, Planning and executing, and Monitoring and reflecting, as applied in our earlier studies (Chen, Uemura, Hao, et al., 2018). However, according to Hesse et al. (2015), the key points of these CPS processes include understanding other group members’ states for Identifying the problem and understanding the group’s state for Monitoring and reflecting. In light of these points, in earlier studies we used a Moodle system to help students and the instructor understand each other’s states. However, in the present study we used only the BookRoll system, because we focused on the effect of learning behaviors on CPS cultivation.

In order to determine factors that might affect the cultivation of CPS skills, we used Spearman’s rank correlation coefficient to assess the correlations of CPS awareness (from the CPS Questionnaire data) with STEM learning strategy (SLS Questionnaire) and learning logs (reading the digital learning materials).

Correlations between changes in CPS awareness and STEM learning strategy

From the results in Table 6, we can see that there were strong negative correlations of Social Regulation in CPS awareness with Organizing (ρ = − 0.74, p < 0.01) and Using References (ρ = − 0.77, p < 0.01) in SLS, and moderate negative correlations of Social Regulation with Elaborating (ρ = − 0.50, p < 0.1) and Time Management (ρ = − 0.52, p < 0.1) in SLS.

Table 6 Spearman’s rank correlation coefficients between CPS awareness and STEM learning strategy

Since the Social Regulation factor refers to strategies for recognizing the diversity of group members and negotiating with them, the point of this factor is communication with others. When conducting STEM education, it is necessary for educators to provide students with multidisciplinary, multi-perspective viewpoints and a collaborative approach that links them with a broader community (Kelley & Knowles, 2016; Kennedy & Odell, 2014). However, the SLS in this study focused on individual learning (except for Peer Learning), and we found that individual SLS negatively affected communication during group work. Furthermore, we did not provide training or guidance on how to learn integrated STEM subjects, so future research should consider how to guide students to master STEM learning well and to integrate individual learning strategies, especially these four factors, efficiently into collaboration.

Correlations between changes in CPS awareness and learning behaviors

Next, we assessed the correlations between CPS awareness and the learning behaviors involved in reading the digital STEM learning materials. The results are presented in Table 7.

Table 7 Spearman’s rank correlation coefficients between CPS awareness and learning behaviors

During pre-learning, moderate negative correlations were found for Learning and Knowledge Building in CPS awareness with Prev (ρ = − 0.55, p < 0.1), Add Marker (ρ = − 0.60, p < 0.05), and Delete Marker (ρ = − 0.54, p < 0.1) of learning behaviors.

In an earlier study, we concluded that students’ behaviors of frequently changing pages or turning back to the previous page imply a lack of familiarity with the learning contents, which makes it difficult for them to construct knowledge. Students added markers to what they thought was important or thought they understood, and deleted markers when they changed their ideas, from which it can also be inferred that they understood the contents poorly and thus changed their minds easily. However, the Add Marker logs we collected included both yellow (not understood) and red (important) highlights, so we could not identify which of these behaviors actually negatively affected their knowledge building.

In the group discussions of the STEM lessons, moderate positive correlations were found between Social Regulation in CPS awareness and the Add Marker (ρ = 0.60, p < 0.05) and Delete Marker (ρ = 0.64, p < 0.05) learning behaviors.

During group discussion, we observed that when students discussed the problems and the contents they did not understand, they deleted old markers when they accepted others’ opinions and added new markers. The behaviors of adding and deleting markers therefore indicate that students paid attention to communication and negotiation in group work, showing that effective use of the marker tool could facilitate social regulation in STEM lessons. Students’ social regulation behaviors during discussion were further examined through dialogue analysis.

Research question 3: What are the relationships between STEM learning strategy and learning behaviors in STEM learning?

In this study, we used questionnaires to investigate how students used the SLS but did not provide instruction or guidance on how to use STEM learning strategies efficiently. Therefore, we assessed the correlations between the SLS and the learning behaviors of reading scientific materials to find out how students’ learning strategy related to their actual learning behaviors, which could inform our future instructional design regarding ways of using the SLS.

According to the results in Table 8, in students’ pre-learning, a moderate positive correlation was found between Attention of SLS and Add Annotation learning behavior (ρ = 0.54, p < 0.1), while there were moderate negative correlations between Learning Environment of SLS and Add Marker (ρ = − 0.54, p < 0.1), Delete Marker (ρ = − 0.52, p < 0.1), and Add Annotation (ρ = − 0.51, p < 0.1).

Table 8 Spearman’s rank correlation coefficients between STEM learning strategy and learning behaviors

Since students were required to take notes about the problems provided when they read the materials, it is understandable that annotation would help them to concentrate on the contents related to problems and solutions in STEM learning.

However, note-taking can hurt learning performance if students only transcribe word by word rather than taking notes with conceptual understanding and thinking (Mueller & Oppenheimer, 2014). The lack of guidance in using the functional tools seems to be one reason for the negative correlations between the use of the marker and annotation tools and students’ expected learning environment, in which it is easy to concentrate and find references.

As for group work, moderate positive correlations were found between the Repeating strategy and the Next Page (ρ = 0.51, p < 0.1) and Previous Page (ρ = 0.61, p < 0.05) behaviors, between the Effort strategy and the Next Page behavior (ρ = 0.56, p < 0.1), and between the Learning Environment strategy and Add Bookmark (ρ = 0.58, p < 0.1), along with a strong positive correlation between Effort and Previous Page (ρ = 0.75, p < 0.01).

On the other hand, moderate negative correlations were found between the Organizing strategy and the Add Marker behavior (ρ = − 0.58, p < 0.05) and between the Elaborating strategy and the Delete Marker behavior (ρ = − 0.55, p < 0.1), together with a strong negative correlation between the Organizing strategy and the Delete Marker behavior (ρ = − 0.74, p < 0.01).

Besides guidance on how to use the functional tools of the BookRoll system, some learning strategies such as organizing/summarizing, elaborating/application, and repeating should also be taught with efficient design under certain learning conditions (Dunlosky et al., 2013).

Dialogue analysis

The Spearman’s rank correlation coefficient revealed significant positive correlations between the utilization of marker tool with changes in learning performance (RQ1) and CPS social regulation awareness (RQ2) both during the group discussion. In order to investigate whether students had the behaviors of perspective taking and social regulation, dialogue analysis was conducted to understand how students displayed these skills.

We collected the dialogue data of all groups during the discussions and categorized the dialogue threads related to the perspective taking and social regulation factors with reference to Hesse et al. (2015).

According to the CPS framework proposed by Hesse et al. (2015), the perspective taking factor contains two elements, adaptive responsiveness and audience awareness (mutual modeling), and the social regulation factor contains four elements: negotiation, self-evaluation (meta-memory), transactive memory, and responsibility initiative. The elements and indicators of perspective taking and social regulation are listed in Table 9.

Table 9 Elements and indicators in perspective taking and social regulation

Perspective taking—adaptive responsiveness

Example 1

  • 95. Student 9 (S9): The change of the temperature has increased.

  • 96. S7: You mean the temperature has increased?

  • 97. S9: Because there is difference in the temperature (between two place).

  • 98. S7: Yeah, yeah, yeah, I got it.

  • 99. S9: In the lake.

  • 100. S7: So difficult to accumulate (CO2)?

  • 101. S9: Because water constantly circulates through.

  • 102. S7: Yes, yes, yes.

  • (…)

  • 118. S10: So back to the question, what about in Japan?

  • 119. S9: See the beginning this chapter (of the learning materials on BookRoll system), there is huge feature in Cameroon, that is landslide. Especially in summer, it rains almost every day.

  • 120. S10: Yeah, yeah, yeah.

  • 121. S9: Landslides occur easily (in Cameroon), but landslides don’t occur often in Japan.

  • 122. S10: Yes, that’s true.

Adaptive responsiveness includes the indicator “ignoring, accepting, or adapting contributions of others.” As shown in Example 1, in Line 98, S7 used “I got it” to express his agreement with S9’s explanations, which means he accepted the other’s viewpoint and reconsidered the problem (Line 100). Similarly, in Line 122, S10 used “Yes, that’s true” to show his agreement with S9’s viewpoint. Since S9 provided his explanations based on the learning materials in the BookRoll system (Line 119), it can be inferred that S9 exhibited adaptive responsiveness while using the BookRoll system, which is consistent with the relationship found between marker tool use and changes in learning performance (RQ1).

Perspective taking—audience awareness (mutual modeling)

Example 2

  • 63. S4: Is Japan different from Cameroon?

  • 64. S5: Of course different, like precipitation.

  • 65. S6: Completely different.

  • 66. S5: (3s) Look at this (the learning materials on BookRoll system), this is different from this, but in summer, precipitation is not that different.

  • 67. S4: Yeah, I see.

  • 68. S6: Because Japan’s temperature is similar to Cameroon’s? Like June

  • 69. S4: That’s true.

  • (…)

  • 76. S6: As for Cameroon, where is its location? Around the sea?

  • 77. S5: No, it isn’t. (15s) (Searching the information on the Internet). Here.

  • 78. S4: Yeah, that’s the point. The locations are different.

  • 79. S5: And there is information on Cameroon’s temperature here.

  • 80. S4: Is Cameroon above the equator?

  • 81. S6: About the location, (8s) look at the first page (of the learning materials).

  • 82. S7: (20s) Yeah, I see. I think we should work on the problem now.

According to the CPS framework, the indicator of audience awareness (mutual modeling) is “awareness of how to adapt behavior to increase suitability for others.” In Example 2, when others had difficulty understanding certain contents, some students chose to use additional references or information to explain them.

For example, in Lines 64 and 65, S5 and S6 answered S4’s question but did not receive any feedback, which suggests that S4 did not fully accept their answers. So S5 waited for 3 s and chose to use the learning materials to explain the question (Line 66). He also searched for additional information in response to S6’s question (Line 77). This indicates that he adapted his own behavior based on feedback and his understanding of the recipient.

Based on these behaviors of providing additional information through the BookRoll system (for example, Lines 66 and 81), it can be inferred that students used the BookRoll tools with audience awareness.

Social regulation—negotiation

Example 3

  • 25. S2: Usually, the water in that lake, doesn’t circulate.

  • 26. S1: I think it does.

  • 27. S2: No, it doesn’t.

  • 28. S1: Why?

  • 29. S2: The precipitation is high.

  • 30. S1: That’s why I think it circulates. Because even the precipitation is low, the water would circulates in the lake. I understand what you are taking about, but it is strange that the water doesn’t circulate, even the precipitation is high.

Example 3 shows a complete negotiation episode during the group discussion around the issue of whether water circulates in the lake. Negotiation was conducted here in order to reach a compromise. In Lines 25~27, S1 and S2 expressed their respective opinions. After that, S1 asked for the reason (Line 28), commented on the difference between their viewpoints, and proposed reasons to persuade S2 and reach agreement (Line 30).

Although students’ behaviors of negotiation were found during CPS learning, it was not clarified whether their negotiation behaviors had relationship with the utilization of BookRoll tools.

Social regulation—self-evaluation (meta-memory)

Example 4

  • 10. S1: (Searching the information on the Internet) What about Cameroon…There is only English version, I am over.

  • (…)

  • 17. S1: It is too terrible.

  • 18. S3: About what?

  • 19. S1: My English.

Self-evaluation behavior was not executed often; it was found in only one group, concerning a student’s evaluation of his own English ability.

Social regulation—transactive memory

No transactive memory behavior was found in any group.

Social regulation—responsibility initiative

Example 5

  • 10. S1: (Searching the information on the Internet) What about Cameroon…(10s) There is only English version, I am over.

  • 11. S1: (Reading out the contents on the BookRoll system) (15s) Yes, I agree.

  • 12. S2: How dangerous would it be? The limnic eruption.

  • 13. S1: (11s) It is written here.

  • (…)

  • 46. S2: What about searching on the Internet?

  • 47. S3: Good idea.

  • 48. S1: Let’s search for the history of Cameroon on Wikipedia.

The indicator of responsibility initiative is “assuming responsibility for ensuring parts of the task are completed by the group,” such as conducting activities and reporting to others, and taking group responsibility as one’s own. In Lines 11 and 13, S1 investigated certain information and reported it to the other members. In Lines 46 and 48, S2 proposed an activity to be conducted by the group members, and S1 accepted that responsibility and took it on as his own. Moreover, responsibility initiative behavior was found during use of the BookRoll system (Line 11), which is consistent with the results of RQ2 (a relationship was found between marker tool use and CPS social regulation).

Conclusion

The purpose of this study was to investigate the effect of several variables, including students’ STEM learning strategy and their learning behaviors when reading scientific materials online, on their learning performance and the cultivation of CPS skills.

The results of this study showed that different SLS and learning behavior variables affect students’ learning performance and CPS awareness in different ways. We summarize the relationships among the factors of this study in two figures.

As shown in Fig. 3, concerning changes in learning performance, the results imply that the frequency of the Add and Delete Marker behaviors in group discussion has a positive influence on students’ learning scores. Since we only analyzed correlations between variables, it can equally be inferred that students whose learning performance improved more tended to use the marker tools more frequently. Either way, the results indicate that marker tools can be used effectively to improve STEM learning performance.

Fig. 3 The results of Research Questions 1 and 2

As for CPS awareness, some SLS factors, including Organizing, Elaborating, Time Management, and Using References, showed negative correlations with the Social Regulation factor of CPS social skills. Moreover, negative correlations were also found between the Learning and Knowledge Building factor of CPS cognitive skills and the Add Marker, Delete Marker, and Previous Page behaviors in students’ pre-learning. These results imply both a deficiency in, and the need for, guidance on how to use STEM learning strategies and functional tools efficiently and integrate them into collaborative activities. The positive correlations found between the Add Marker and Delete Marker behaviors in group discussion and the Social Regulation factor of CPS social skills further indicate the effectiveness of marker tools.

Furthermore, in order to determine how to support students in utilizing their SLS, we assessed the correlations between students’ SLS and their actual learning behaviors (the results are summarized in Fig. 4). In pre-learning, we found a positive correlation between the Attention strategy and the Add Annotation behavior, and negative correlations between the Learning Environment strategy and the Add Marker, Delete Marker, and Add Annotation behaviors.

Fig. 4 The results of Research Question 3. Factors without a significant relationship with other factors are omitted from this figure. **p < 0.01, *p < 0.05, p < 0.1

In group discussion, positive correlations were found between the Repeating and Effort strategies and the Next Page and Previous Page behaviors, respectively, while negative correlations were found between the Organizing strategy and the Add Marker and Delete Marker behaviors, and between the Elaborating strategy and the Delete Marker behavior. These results suggest the value of teaching these SLSs together with the functional tools.
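A pairwise correlation table with the significance thresholds shown in Fig. 4 can be generated in much the same way; the sketch below uses illustrative variable names (one row per student, with SLS subscale scores alongside behavior counts) and is only meant to show the shape of the computation, not the study’s exact procedure.

```python
# A sketch of the pairwise correlation table behind a figure like Fig. 4.
# Column names (organizing, elaborating, add_marker, ...) are illustrative.
from itertools import combinations

import pandas as pd
from scipy import stats

def correlation_table(data: pd.DataFrame) -> pd.DataFrame:
    """Spearman correlations for every pair of columns, with significance flags."""
    rows = []
    for a, b in combinations(data.columns, 2):
        rho, p = stats.spearmanr(data[a], data[b])
        stars = "**" if p < 0.01 else "*" if p < 0.05 else ""
        rows.append({"pair": f"{a} x {b}", "rho": round(float(rho), 2),
                     "p": round(float(p), 3), "sig": stars})
    return pd.DataFrame(rows)

# Example usage (hypothetical input file with one row per student):
# data = pd.read_csv("sls_and_behaviors.csv", index_col="user_id")
# print(correlation_table(data))
```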

Based on the results above, we offer some suggestions for CPS-based STEM learning, linked to the research questions. From the results of RQ1, when students participate in collaborative activities in STEM learning, the marker tool appears to be effective for improving learning performance, owing to advantages such as helping students focus on the discussion topic and integrate contributions from others into their own thinking.

From the results of RQ2, certain STEM learning strategy factors, such as organizing, elaborating, time management, and using references, appear better suited to individual execution. The results also suggest that the marker tool is useful in students’ group work, including communication and negotiation.

From the results of RQ3, although the marker tool was shown to be useful in collaborative activities, it does not help students build, organize, or elaborate knowledge without detailed guidance on how to use it.

Limitations and future work

The first limitation of our study is the small number of participants; as mentioned above, only 12 senior high school students participated in the STEM lesson, which makes the results difficult to generalize. This study therefore aims to provide direction for taking behavioral and strategic factors into consideration in CPS-based STEM learning. In future work, we should increase the sample size to obtain more generalizable conclusions and to build a specific model of CPS-based STEM learning. Furthermore, other statistical approaches, such as structural equation modeling (SEM) or path analysis, could be used to further explore cause-effect relationships of learning behaviors and learning strategies with learning performance and CPS awareness in STEM education.
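As a rough illustration of the path-analysis direction mentioned above, a small path model can be approximated as a chain of ordinary regressions before committing to a full SEM package; the variable names and data file in the sketch below are hypothetical, and the sketch is meant only to show the shape of such an analysis, not to report any result.

```python
# A sketch of path analysis as a chain of OLS regressions (a lightweight
# alternative to full SEM for a small sample). All variable names and the
# input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("stem_study_data.csv")

# Path 1: do learning strategies predict tool-use behavior?
behavior_model = smf.ols("marker_count ~ organizing + elaborating", data=data).fit()

# Path 2: do behavior and strategies predict the learning outcomes?
performance_model = smf.ols("score_gain ~ marker_count + organizing", data=data).fit()
cps_model = smf.ols("cps_social ~ marker_count + score_gain", data=data).fit()

for name, model in [("behavior", behavior_model),
                    ("performance", performance_model),
                    ("CPS awareness", cps_model)]:
    print(f"--- {name} ---")
    print(model.params.round(3))
    print(model.pvalues.round(3))
```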

Second, we conducted two previous studies on instructional design for improving CPS skills following the CPS process; both studies indicated improvement in some factors of CPS awareness, although not all experiments showed the same improvements. In the present study, although we designed the STEM lesson following the same CPS process, the process was not supported with technology as in our prior studies, which may have led to our finding no statistically significant improvement across the factors of CPS awareness. Therefore, consideration should be given to how to support students’ CPS learning, for example by expanding the sample size to obtain more representative results or by supporting the CPS process with the help of technology.

Finally, as many results in the present study indicate, it is important to provide students with training or guidance in applying STEM learning strategies and functional tools, especially marker and annotation tools, which should also be taken into consideration in our future research.

Availability of data and materials

All data generated or analyzed during this study are included in this published article.

Abbreviations

CPS: Collaborative problem solving

LA: Learning analytics

LMS: Learning management system

SLS: STEM learning strategy

STEM: Science, technology, engineering, mathematics

References

  • Artino Jr., R., & Jones II, K. D. (2012). Exploring the complex relations between achievement emotions and self-regulated learning behaviors in online learning. The Internet and Higher Education, 15(3), 170–175.

  • Baker, R., & Siemens, G. (2014). Educational data mining and learning analytics. In R. Sawyer (Ed.), The Cambridge Handbook of the Learning Sciences (Cambridge Handbooks in Psychology, pp. 253–272). Cambridge: Cambridge University Press.

  • Bazelais, P., Lemay, D. J., & Doleck, T. (2018). Examining the link between prior achievement in secondary education and performance in college: using data from pre-university physics courses. Journal of Formative Design in Learning, 1–7.

  • Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

  • Care, E., Scoular, C., & Griffin, P. (2016). Assessment of collaborative problem solving in education environments. Applied Measurement in Education, 29(4), 250–264.

  • Chang, C. J., Chang, M. H., Liu, C. C., Chiu, B. C., Fan Chiang, S. H., Wen, C. T., et al. (2017). An analysis of collaborative problem-solving activities mediated by individual-based and collaborative computer simulations. Journal of Computer Assisted Learning, 33, 649–662.

  • Chen, L., Uemura, H., Goda, Y., Okubo, F., Taniguchi, Y., Oi, M., Konomi, S., Ogata, H., & Yamada, M. (2018). Instructional Design and Evaluation of Science Education to Improve Collaborative Problem Solving Skills. Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 1306-1311). Washington, D.C., United States: Association for the Advancement of Computing in Education (AACE).

  • Chen, L., Uemura, H., Hao, H., Goda, Y., Okubo, F., Taniguchi, Y.,…, Yamada, M. (2018). Relationships between collaborative problem solving, learning performance and learning behavior in science education. Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE) (pp. 17–24). Wollongong, NSW, Australia.

  • Dunbar, R. L., Dingel, M. J., & Prat-Resina, X. (2014). Connecting analytics and curriculum design: process and outcomes of building a tool to browse data relevant to course designers. Journal of Learning Analytics, 1(3), 223–243.

  • Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4–58.

  • Gough, A. (2015). STEM policy and science education: Scientistic curriculum and sociopolitical silences. Cultural Studies of Science Education, 10(2), 445–458.

  • Griese, B., Lehmann, M., & Roesken-Winter, B. (2015). Refining questionnaire-based assessment of STEM students’ learning strategies. International Journal of STEM Education, 2, 12.

  • Herro, D., Quigley, C., Andrews, J., & Delacruz, G. (2017). Co-measure: developing an assessment for student collaboration in STEAM activities. International Journal of STEM Education, 4(1), 26.

  • Hesse, F., Care, E., Buder, J., Sassenberg, K., & Griffin, P. (2015). A framework for teachable collaborative problem solving skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 37–56). New York: Springer.

  • Hogan, K. (1999). Thinking aloud together: a test of an intervention to foster students’ collaborative scientific reasoning. Journal of Research in Science Teaching, 36(10), 1085–1109.

  • Holmlund, T. D., Lesseig, K., & Slavit, D. (2018). Making sense of “STEM education” in K-12 contexts. International Journal of STEM Education, 5(1), 32.

  • Hwang, W. Y., Shadiev, R., Wang, C. Y., & Huang, Z. H. (2012). A pilot study of cooperative programming learning behavior and its relationship with students’ learning performance. Computers & Education, 58(4), 1267–1281.

  • Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3(1), 11.

  • Kennedy, T. J., & Odell, M. R. L. (2014). Engaging students in STEM education. Science Education International, 25(3), 246–258.

  • Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

  • Lin, K. Y., Yu, K. C., Hsiao, H. S., Chang, Y. S., & Chien, Y. H. (2018). Effects of web-based versus classroom-based STEM learning environments on the development of collaborative problem-solving skills in junior high school students. International Journal of Technology and Design Education, 1–14.

  • Lin, K. Y., Yu, K. C., Hsiao, H. S., Chu, Y. H., Chang, Y. S., & Chien, Y. H. (2015). Design of an assessment system for collaborative problem solving in STEM education. Journal of Computers in Education, 2(3), 301–322.

  • Liñán, L. C., & Pérez, Á. A. J. (2015). Educational data mining and learning analytics: differences, similarities, and time evolution. International Journal of Educational Technology in Higher Education, 12(3), 98–112.

  • Liu, F., & Cavanaugh, C. (2011). Success in online high school biology: factors influencing student academic performance. Quarterly Review of Distance Education, 12(1), 37–55.

  • Lou, S. J., Liu, Y. H., Shih, R. C., & Tseng, K. H. (2011). The senior high school students’ learning behavioral model of STEM in PBL. International Journal of Technology and Design Education, 21(2), 161–183.

  • Lowes, S., Lin, P., & Kinghorn, B. (2015). Exploring the link between online behaviours and course performance in asynchronous online high school courses. Journal of Learning Analytics, 2(2), 169–194.

  • Monroy, C., Rangel, V. S., & Whitaker, R. (2014). A strategy for incorporating learning analytics into the design and evaluation of a K-12 science curriculum. Journal of Learning Analytics, 1(2), 94–125.

  • Morris, L. V., Finnegan, C., & Wu, S.-S. (2005). Tracking student behavior, persistence, and achievement in online courses. Internet and Higher Education, 8, 221–231.

  • Mueller, P. A., & Oppenheimer, D. M. (2014). The pen is mightier than the keyboard: Advantages of longhand over laptop note taking. Psychological Science, 25(6), 1159–1168.

  • Newhouse, C. P. (2016). STEM the boredom: engage students in the Australian curriculum using ICT with problem-based learning and assessment. Journal of Science Education and Technology, 26(1), 44–57.

  • Ogata, H., Taniguchi, Y., Suehiro, D., Shimada, A., Oi, M., Okubo, F., …, Kojima, K. (2017). M2B System: A digital learning platform for traditional classrooms in university. Practitioner Track Proceedings of the Seventh International Learning Analytics & Knowledge Conference (LAK 2017), paper presented at Simon Fraser University, Vancouver, Canada (pp. 155–162).

  • Ogata, H., Yin, C., Oi, M., Okubo, F., Shimada, A., Kojima, K., & Yamada, M. (2015). E book based learning analytics in university education. In the 23rd International Conference on Computer in Education (ICCE 2015), paper presented at Hangzhou, China (pp. 401–406).

  • Oi, M., Okubo, F., Shimada, A., Yin, C., & Ogata, H. (2015). Analysis of preview and review patterns in undergraduates’ e-book logs. In the 23rd International Conference on Computer in Education (ICCE 2015), paper presented at Hangzhou, China (pp. 166–171).

  • Organisation for Economic Cooperation and Development. (2017). PISA 2015: collaborative problem-solving framework. https://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf

  • Ramos, C., & Yudko, E. (2008). “Hits” (not “Discussion Posts”) predict student success in online courses: a double cross-validation study. Computers & Education, 50(4), 1174–1182.

  • Thomas, B., & Watters, J. (2015). Perspectives on Australian, Indian and Malaysian approaches to STEM education. International Journal of Educational Development, 45, 42–53.

  • Wang, A. Y., & Newlin, M. H. (2000). Characteristics of students who enroll and succeed in psychology web-based classes. Journal of Educational Psychology, 92(1), 137–143.

  • Yamada, M., Okubo, F., Oi, M., Shimada, A., Kojima, K., & Ogata, H. (2016). Learning analytics in ubiquitous learning environments: Self-regulated learning perspective. In the 24th International Conference on Computers in Education (ICCE2016), paper presented at IIT Bombay, Mumbai India (pp. 306–314).

  • Yamada, M., Shimada, A., Okubo, F., Oi, M., Kojima, K., & Ogata, H. (2017). Learning analytics of the relationships among self-regulated learning, learning behaviors, and learning performance. Research and Practice in Technology Enhanced Learning, 12, 1–17.

  • Yue, C. L., Storm, B. C., Kornell, N., & Bjork, E. L. (2015). Highlighting and its relation to distributed study and students’ metacognitive beliefs. Educational Psychology Review, 27, 69–78.


Funding

This study was funded by the Japan Society for the Promotion of Science (JSPS) (Grant Numbers JP16H06304, JP16H03080, and JP19H01716) and by the Cross-Ministerial Strategic Innovation Promotion Program from the Cabinet Office.

Author information

Authors and Affiliations

Authors

Contributions

LC and MY designed the overall research. LC and YG designed the instruction, and YG contributed to modifying the instructional design. NY, a science teacher at the high school, conducted the class. YG was also engaged in designing the evaluation method of this study. FO, YT, AS, SK, and HO developed and deployed the learning analytics platform. MO, YG, and MY advised on improving the instructional design from the viewpoint of cognitive science. MY supervised this research. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Li Chen.

Ethics declarations

Competing interests

Drs. Ogata and Yamada received research grants from JSPS for this research project. Drs. Goda, Okubo, Taniguchi, Oi, Konomi, Shimada, and Yamada received research grants from JSPS for other research projects. Drs. Ogata and Shimada received research grants from the Cross-Ministerial Strategic Innovation Promotion Program from the Cabinet Office. Ms. Li Chen and Mr. Yoshimatsu declare that they have no conflict of interest regarding this research.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1

Instructional design of the Limnic Eruption STEM lesson

The description of the case

There is a crater lake named Lake Nyos in Cameroon. In 1986, massive amounts of dissolved CO2 suddenly erupted from the bottom of the lake, causing a limnic eruption. The eruption led to the death by asphyxiation of around 1,700 people in a nearby village. There is still concern about such gas hazards recurring.

The purpose of the lesson

Investigating the mechanism (reason) of the limnic eruption and designing a manual of disaster mitigation, including knowledge of natural disasters, the relationship between the environment and human beings, and what to do when faced with such natural disasters.

Four STEM domains in this lesson

Science (S): Students need to understand CO2-related scientific knowledge, including the nature and generation process of CO2 and its solubility, and use scientific knowledge and skills to solve the applied problems.

Engineering (E): Students are asked to design a disaster mitigation manual in order to integrate their multidisciplinary knowledge and skills into problem solving and to deepen their understanding of scientific and mathematical knowledge by applying it.

Mathematics (M): Students are required to use their mathematical knowledge and skills to perform the calculations in the CO2-solubility problems and to judge whether information found on the Internet is correct.

Technology (T): Students are asked to use the M2B (Moodle, Mahara, and BookRoll) system to support individual pre-learning and group discussion, and the Internet for information retrieval. Using technology and making decisions about Internet information are expected to improve students’ technological literacy.
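As a concrete illustration of the mathematics and science domains above, the sketch below shows the kind of CO2-solubility estimate students might carry out with Henry’s law; the Henry’s law constant, the pressure gradient, and the 200 m depth are rounded, assumed values for illustration, not figures taken from the lesson materials.

```python
# Illustrative CO2-solubility estimate using Henry's law, C = kH * P.
# All numbers are rounded assumptions for the sake of the example.
K_H = 0.034            # mol/(L*atm), Henry's law constant for CO2 near 25 deg C
MOLAR_MASS_CO2 = 44.0  # g/mol
ATM_PER_METRE = 0.097  # approx. hydrostatic pressure gradient of water, atm/m

def dissolved_co2_g_per_litre(depth_m: float) -> float:
    """Estimate how much CO2 (g/L) can stay dissolved at a given depth."""
    pressure_atm = 1.0 + ATM_PER_METRE * depth_m   # air pressure + water column
    return K_H * pressure_atm * MOLAR_MASS_CO2

surface = dissolved_co2_g_per_litre(0)     # about 1.5 g/L under 1 atm of pure CO2
bottom = dissolved_co2_g_per_litre(200)    # about 30 g/L near a 200 m deep bottom
print(f"surface: {surface:.1f} g/L, 200 m depth: {bottom:.1f} g/L")
```

Comparing the two values makes the lesson’s core point quantitative: deep water can hold far more dissolved CO2 than surface water, so bringing deep water upward releases gas.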

Appendix 2

Table 10 Items of the Collaborative Problem Solving (CPS) Questionnaire

Appendix 3

Table 11 Items of the STEM Learning Strategy (SLS) Questionnaire

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Chen, L., Yoshimatsu, N., Goda, Y. et al. Direction of collaborative problem solving-based STEM learning by learning analytics approach. RPTEL 14, 24 (2019). https://doi.org/10.1186/s41039-019-0119-y


Keywords