
Geneticus Investigatio: a technology-enhanced learning environment for scaffolding complex learning in genetics

Abstract

Bioscientists such as geneticists and molecular biologists routinely integrate domain concepts and science inquiry practices/skills while explaining a natural phenomenon. The complexity of these concepts and skills increases manifold at the tertiary undergraduate level, and they are known to be challenging for learners. Learners encounter them in silos as part of theory classes, practical labs, and tutorial sessions, whereas in industry they will be required to integrate and apply them in an authentic context. To support learners in this process, we have designed and developed Geneticus Investigatio (GI), a technology-enhanced learning (TEL) environment for scaffolding complex learning in the context of Mendelian genetics. GI facilitates this complex learning, i.e. the integration of domain concepts and science inquiry practices, through inquiry-driven reflective learning experiences: inquiry-based learning steps in an authentic context, interspersed with metacognitive reflection. In this paper, we present two cycles of iterative design, development, and evaluation of GI, based on the design-based research (DBR) approach. In the first DBR cycle, we identified the pedagogical design features and learning activities of GI for facilitating complex learning, based on an exploratory study with bio-science instructors. We then report a pre-post classroom study (N = 37) in which we investigated learning with GI and perceptions of its usability and usefulness. The results indicate high learning gains after interacting with GI, and learners perceived that the activities in GI help them learn concepts and inquiry practices as well as their integration. We then identified the interaction and other difficulties faced by learners, triangulated across different data sources, which provided insights into the pedagogical and design changes required in GI. The revised version of GI was evaluated with a quasi-experimental classroom study (N = 121).
The results indicate that the drawbacks of the previous version of GI were addressed. The main contributions of this research are a pedagogical design for facilitating complex learning and its implementation in the form of the GI TEL environment.

Introduction

Bioscientists regularly evaluate the effects of a phenomenon across biological levels, understand inheritance patterns, and study the structure, function, and growth of living organisms (Hoskinson et al. 2013). For example, patterns of inheritance may be explained as Mendelian or as deviations from Mendelian inheritance and encompass a variety of concepts related to the breeding of plants and animals. This involves relating events across different levels of the biological organisation hierarchy, from the molecular to the sub-cellular to the organismic level. An example of such a problem in Mendelian genetics is as follows: ‘A plant geneticist has two pure lines, one with purple petals and one with blue. She hypothesises that the phenotypic difference is due to two alleles of one gene. To test this idea, she aims to look for a 3:1 ratio in the F2. She crosses the lines and finds that all the F1 progeny are purple. The F1 plants are selfed, and 400 F2 plants are obtained. Of these F2 plants, 320 are purple, and 80 are blue. Do these results fit her hypothesis well? If not, suggest why.’ To solve such a problem, a scientist requires an understanding of basic concepts of genetics, knowledge of statistical tests, and the science inquiry practices of hypothesis testing and revision. One has to understand, apply, and integrate these by performing complex cognitive processes, which is also known as complex learning. Examples of such cognitive processes are sense-making and interpretation, performed while understanding the problem context and hypothesis. Learners have to state the assumptions while testing the hypothesis and declare the dependent and independent variables; they also reason while designing an experiment and evaluate by comparing the expected and observed values. This is followed by the cognitive process of decision-making, where one has to conclude about the hypothesis based on the evaluation.
Besides these, learners perform the cognitive processes of metacognition and transfer of learning, where they reflect on planning and solving similar problems.
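To make the statistical step of the worked problem concrete, the following illustrative sketch (in Python, not part of GI itself) computes the chi-square statistic for the purple/blue cross described above:

```python
# Observed F2 counts from the worked problem: 320 purple, 80 blue.
observed = {"purple": 320, "blue": 80}
total = sum(observed.values())  # 400 plants

# The hypothesis predicts a 3:1 phenotypic ratio in the F2.
expected = {"purple": total * 3 / 4, "blue": total * 1 / 4}  # 300 and 100

# Chi-square statistic: sum over classes of (observed - expected)^2 / expected.
chi_square = sum(
    (observed[p] - expected[p]) ** 2 / expected[p] for p in observed
)
print(round(chi_square, 2))  # 5.33

# With two phenotype classes there is 1 degree of freedom; the critical
# value at the 0.05 significance level is 3.841. Since 5.33 > 3.841, the
# observed counts deviate significantly from the 3:1 hypothesis.
print(chi_square > 3.841)  # True
```

This mirrors the reasoning the problem demands: the geneticist's 3:1 hypothesis does not fit the data well, so she would need to consider revising it.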

Such problems are also part of the undergraduate curriculum: a learner learns the concepts of genetics in theory classes, knowledge about statistical tools in tutorial sessions, and science inquiry practices in practical labs. Moreover, such integration of concepts and skills is expected of learners while doing research projects or working in labs or the biotech industry. Hence, they must learn to integrate concepts and skills as part of their undergraduate training. Such authentic scientific practices are common in many STEM domains. To perform such scientific exercises, an undergraduate learner is required to perform the complex cognitive process of integrating domain concepts and skills, also known as complex learning (van Merriënboer & Dolmans, 2015). Besides this, he/she is also expected to transfer and apply this complex learning to novel scenarios. Learners in undergraduate programmes learn the concepts and skills in silos as part of theory classes, practical labs, and tutorial sessions. Hence, it is difficult for them to connect these, and there are few instances in existing curricula where they are explicitly asked to do so (Hester et al. 2018). Demonstrating complex learning is especially challenging at the tertiary level because of the complexity and variety of domain concepts and skills. Also, adapting the curriculum to facilitate complex learning is difficult because a teacher often does not have enough flexibility to change the curriculum (Guan et al. 2014). There is thus a need for a solution that facilitates complex learning while remaining aligned with the curriculum content.

Research suggests utilising the affordances of technology-enhanced learning (TEL) environments to facilitate complex learning by providing an overall structure for the learning activities, immediate and personalised feedback, reflective and evaluative question prompts, and so on. Several inquiry-based learning environments focus on developing such learning, for example, WISE (Slotta 2002) and Geniverse (Concord Consortium 2010). What is required is a TEL environment in which a learner simultaneously engages with inquiry practices, statistical procedures, and domain concepts, working on activities that explicitly and deliberately integrate all three, individually as well as in an integrated manner. Our proposed TEL environment, Geneticus Investigatio (GI), is designed for tertiary-level biology undergraduates with a focus on facilitating complex learning by applying concepts of genetics and the basics of statistics integrated with science inquiry practices. Modules in GI are related to the domain of basic genetics, which is commonly studied by undergraduates of all bio-science disciplines. Learning activities in GI emphasise the integration of concepts across the required topics, along with science practices, through inquiry-driven reflective learning experiences.

One design consideration in the development of GI was to make it easily accessible to learners in college classrooms or outside, and to make it adaptable to different topics if instructors wished to add or edit content, learning activities, or problems. GI has thus been developed using Google Sites and H5P (Joubel, 2013) as a web-based learning environment, which makes digital content convenient to access for learners and teachers alike (Yin et al., 2017). It is browser-based and works on laptops, tablets, or mobile phones. Learners can use it easily in classroom settings or anywhere else, needing only a device and a wireless connection. Teachers can adapt it quickly since it does not require advanced computer knowledge.

In this paper, we present a pedagogical design for facilitating complex learning in the context of genetics, which is a compulsory foundational course for undergraduate bioscience learners, and its implementation in the GI TEL environment through two design-based research cycles (McKenney & Reeves, 2014). In the first DBR cycle, we identified the pedagogical design features and learning activities of GI based on a literature review and an exploratory study with bio-science instructors. Next, we report a pre-post classroom study (N = 37) with the research goal of investigating the complex learning and perceptions of usability and usefulness of GI. The results indicate high learning gains with GI and learner perceptions that the learning activities in GI help them learn concepts and science inquiry practices. We present our reflections on the interaction and other difficulties faced by learners in the first cycle. These provided insights into the pedagogical and design changes required in GI, and the revised version of GI was evaluated in a quasi-experimental classroom study (N = 121). The results indicate that the drawbacks of the previous version of GI were addressed.

Literature review

Complex learning through problem-solving

One of the principal goals of science education has been training learners to demonstrate the complex cognitive processes of integrating concepts and skills. Science and engineering graduates are expected to demonstrate and apply these as prescribed by bodies such as the Next Generation Science Standards (Nextgenscience.org, 2020) (https://www.nextgenscience.org/) and the Accreditation Board for Engineering and Technology (ABET) (Shuman, Besterfield-Sacre, & McGourty, 2005). Applying such complex learning has been deemed important not only by educators but also by industries while hiring (Lang et al., 1999). Researchers and theorists have made remarkable progress in identifying and characterising these complex cognitive processes, including identifying learners' difficulties in diverse contexts and proposing instructional design phases and associated learning activities (van Merriënboer & Dolmans, 2015). The driving force of this complex learning is authentic learning tasks that integrate concepts and skills to facilitate the transfer of what has been learned to new and often unique tasks and problem situations (van Merriënboer, 2007). Learning tasks for applying complex learning can take the form of projects, tasks, cases, or problems.

An example in genetics is the following: while breeding for desired characteristics in offspring, breeders need to know which parents to choose, i.e. which parental characteristics to cross, and they conduct genetic experiments while planning any breeding. To identify the characteristics of the parents, one has to find the underlying reason behind the inheritance patterns of characters. So one has to integrate genetics concepts with science practices and related statistical procedures, such as calculating the chi-square value and identifying the degrees of freedom. As part of the three-dimensional Next Generation Science Standards (NGSS), science inquiry practices include forming testable questions, carrying out experiments, analysing/interpreting data, warranting claims with evidence, and communicating findings. However, in typical undergraduate curricula, learners encounter the required concepts and practices in silos. Thus, learners lack an integrated perspective while solving problems. An important difficulty reported among undergraduate learners is that the complex learning of genetics occurs via rote application of procedural steps without a comprehensive conceptual understanding (Karagoz & Cakir 2011). Another example is finding inheritance patterns, where learners have to solve problems that are either cause-effect problems (closed problems) or effect-cause problems (open problems) (Orcajo & Aznar 2005). Many such scenarios in genetics deal with multiple possible reasons underlying the observations.

Besides this, complex learning is essential for learners to understand and do science, which is a core practice in science education. It helps in establishing the feasibility/correctness of a hypothesis, eliminating candidate hypotheses or sets of results, and comparing predicted and observed results. It also allows learners to develop an in-depth understanding of the subject (Cooper, Hanmer, & Cerbin, 2006). Solving such problems requires learners to integrate concepts and practices, which becomes a daunting task. Hence, there is a need to scaffold learners during complex learning.

TEL for scaffolding complex learning during problem-solving

Researchers and educators have developed TEL environments that address one or more aspects of complex learning. Decades of research into science inquiry learning have given us insights into the nature of learning and challenges, design of learning environments to support learners (Bransford, Brown, & Cocking, 2000), and principles for scaffolds (Quintana et al., 2004). Researchers have identified numerous tools and guidelines to support teaching-learning in technology-rich classrooms (Demetriadis, Papadopoulos, Stamelos, & Fischer, 2008). Some examples of guidelines are related to learners’ control over the goals, content, actions, strategies, opportunities for reflection, and opportunities for monitoring their learning progress. Besides these, technology tools are available to facilitate direct instruction (procedural or meta-cognitive guidance), access to content (fixed or dynamic), data collection (cooperative or collaborative), peer-to-peer communication (asynchronous or synchronous), and contextual support (augmented or immersive experience).

Affordances of TEL environments have been used to develop scientific inquiry skills in ways similar to scaffolding complex learning during problem-solving. Among them are WISE (Slotta 2002), Go-Lab (De Jong et al., 2014), Apple Tree (Chen et al. 2013), and Geniverse (Concord Consortium 2010). Most of this research has been on topics at the middle and high school levels. These technology solutions have been shown to be effective for learning inquiry skills (Lazonder & Harmsen, 2016; Suárez et al. 2018). Learners require scaffolds at various points in the problem-solving process. These scaffolds can be in the form of feedback, access to domain concepts, etc. Feedback could take the form of guiding questions for reflection or identification of possible mistakes. Existing learning environments in genetics which meet some of these requirements include Genetics with Jean (Thompson & McGill 2017), an affective tutoring system for teaching the concepts of genetics, and a case-based laboratory simulation built for learning core concepts and skills in medical genetics (Makransky et al. 2016). These environments are either used online or downloaded, and most have to be installed on a PC or laptop. In addition, the teacher has to have advanced computer knowledge to interact and troubleshoot on the go while using these learning environments.

Some features from these learning environments, such as scaffolds and question prompts (Xun & Land, 2004), could meet our needs; an example is interactive learning activities, like drag and drop, that engage learners while interacting with the content. Our broad goal is solving an open-ended problem, which requires the integration of concepts and practices during problem-solving. In an undergraduate course, an instructor might not be an expert in the two diverse topics of genetics and statistics because of the variety and complexity of concepts. Hence, what is required is a TEL environment in which a learner simultaneously engages with inquiry practices, statistical procedures, and domain concepts, working on activities that explicitly and deliberately integrate all three, individually as well as in an integrated manner. Besides, there is a need to provide an overall structure and inquiry-driven reflective learning experiences. A potential solution to the problem related to domain-specific concepts is that technology can customise prompts to account for differences in prior knowledge. Also, responses to open-ended transfer questions and reflections on inquiry will help in transferring the problem-solving process to a novel context (Barab & Plucker, 2002). The problem-solving activities may be scaffolded through interactive cycles of investigation (National Research Council, 2012). Besides this, the learning environment should be easily accessible to learners in college classrooms through a laptop, tablet, or mobile phone using a wireless connection (Yin et al., 2017). It should be browser-based so that no additional software installation is needed. For a teacher, the TEL environment's learning curve should be shallow; it should enable teachers to adapt learning activities to different topics without requiring advanced computer knowledge (Guan et al. 2014).
To address this gap, we designed and developed the GI pedagogy and learning environment through a design-based research approach, which is discussed in the next section.

Design-based research

Design-based research (DBR) is an approach that guides the development of learning theories, the improvement of instructional designs, and the exploration of new design possibilities. DBR consists of iterative cycles of design, enactment, analysis, and redesign. We have followed the DBR approach of McKenney and Reeves (2014), which has the following phases: analysis and exploration, design and development, and evaluation and reflection. In the first phase, there is a detailed analysis of the problem, the context, and the participants, including analysis of existing solutions to the problem, along with exploratory studies with novices or experts to understand the requirements. The design and development phase follows, in which the designers or researchers create a preliminary learning design that is evaluated with qualitative, quantitative, or mixed methods to understand the mechanisms by which learning happens in the learning environment. In the evaluation and reflection phase, the researchers reflect on these learning mechanisms to identify how the learning effectiveness of the design could be improved.

We adopted the DBR methodology in our research, firstly to identify the pedagogical features and learning activities of a TEL environment that scaffolds complex learning, and secondly to understand the underlying mechanisms of interaction that lead to complex learning. Figure 1 shows our adapted version of the DBR methodology.

Fig. 1

Design-based research approach for developing our TEL environment

DBR cycle 1: analysis and exploration phase

We first explored the topics and application contexts suitable for learning the integration of domain concepts and science inquiry practices. From the literature, we identified topics within genetics, such as the monohybrid and dihybrid cross, that are challenging for learners (Bahar 1999) and that form the basis for scientific inquiry in further genetics topics (Orcajo & Aznar 2005). These topics are typically taught in the context of the pea plant as the model organism. We designed a set of semi-structured questions on these topics which require the integration of concepts and science inquiry practices. We then conducted a study with bioscience instructors to validate these questions and to gather insight into the detailed concepts required to solve them, where learners encounter these concepts in their curriculum, and the difficulties learners may face in answering such questions.

Study 1: participants, context, and procedure

Participants were three instructors of the Bachelor of Science in Biotechnology course. The context of the learning material was a breeding cross following Mendelian inheritance. Problem-solving in this topic requires applying science practices, integrating concepts related to the basics of Mendelian genetics, and understanding and deciding on the appropriate statistical calculation and inference. Participants were given six questions on this topic. Some questions were related to the procedural application of genetics and statistics, like predicting the result of a breeding experiment and performing the statistical comparison, while another question on the transfer of learning required a learner to connect concepts of genetics and statistics. The question on understanding the hypothesis and designing the breeding experiment required a learner to integrate genetics and statistical concepts with science inquiry practices. The instructors were asked to solve the questions and indicate their perceptions of learner difficulties. In a focus group interview, they were asked about the concepts required to solve those questions and whether the questions were accurate and sufficient for complex learning. The instructors were also asked to give feedback if we had missed any crucial area of the topic which needed to be covered.

Result and discussion

The thematic analysis of the focus group interview revealed the concepts required to solve the semi-structured questions in genetics, along with their curriculum alignment. To solve them, the genetics concepts required are dominant and recessive genes and Mendel's laws of inheritance, along with the phenotypic and genotypic ratios, which a learner learns in the basic genetics course of the first semester. Learners also learn about various model organisms in their first and second semesters. Another set of concepts required for problem-solving comes from biostatistics, like the null hypothesis, p value, and degrees of freedom, which a learner learns in the fourth semester. Learners perform the science inquiry practices of hypothesis testing and revision in their practical labs. Learners' difficulties in solving such problems, as identified by the instructors, lie in understanding a hypothesis, determining the degrees of freedom and p value, and concluding about the hypothesis. Similar difficulties were seen in the subsequent studies when learners were given the same questions in their pre-test. Many of these difficulties have been reported in the literature (de Jong et al., 1998).

Design of GI

Based on the research goal of scaffolding complex learning through problem-solving, along with the learners' difficulties identified from the literature and the study with instructors, we formulated the learning objectives of GI. After interacting with the system, learners should be able to integrate concepts of genetics and statistics with science inquiry practices. They should be able to understand the breeding context and corresponding hypotheses, which requires a learner to consider the context and identify whether the given hypothesis explains the phenomenon and is testable through experimentation. This is followed by designing a feasible experiment and predicting the result, which requires learners to consider all dependent and independent variables. The learner should also be able to calculate the result of the experiment based on the designed experiment and the hypothesis under investigation, followed by making a statistical comparison (comparing prediction and observation) and concluding about the hypothesis. Once a learner has solved a breeding scenario, he/she reflects and plans how to solve a new context through the transfer of their learning. In GI, the learning activities have been designed based on the components of instructional design for complex learning (Frerejean et al., 2019; Kirschner & van Merriënboer, 2008; van Merriënboer, 1997), i.e. the learning task, supportive information, procedural information, and part-task practice, which have been operationalised as follows. In GI, the learning task is the context of a genetics problem. It aims at the integration of knowledge and skills learnt across the curriculum through a whole-task experience based on an authentic real-life scenario. The supportive information, like access to domain concepts, supports the learning and performance of non-recurrent aspects of the learning tasks.
It acts as a bridge between the learners' previous knowledge and the knowledge required to solve the tasks, and supports cognitive processes like reasoning. Procedural information, like the process of calculating the chi-square value, is presented just in time through step-by-step instructions; it relates to the recurrent aspects of the learning tasks, which are always performed in the same way. In GI, part-task practice is provided through interaction with multiple modules; these give additional practice for achieving a high level of automaticity in the recurrent aspects.

GI is based on inquiry-based learning along with reflection in an authentic scenario. Inquiry learning mimics authentic inquiry; the two are closely related and share constitutive cognitive processes (Pedaste et al., 2015). The phases and sub-phases of the synthesised inquiry-based learning framework are orientation, conceptualisation, investigation, conclusion, and discussion. The emphasis on science inquiry practices by science educators reflects the fact that traditional forms of classroom learning are often disconnected from the authentic practices of science (American Association for the Advancement of Science, 2011; National Research Council [NRC], 2012). Such authentic research experiences can increase learners' confidence in studying biology and interest in pursuing biological research (Brownell et al., 2012; Rodenbusch et al., 2016).

Along with authentic context, the importance of encouraging learners to reflect has been highlighted in various studies, as reflection is critical for the effective integration of concepts and inquiry practices and is instrumental in learning (Davis 2003). Any activity which involves remembering the past and analysing it in context requires metacognitive thinking, which is known as reflection. Schön (1983) describes reflection-in-action thus: ‘When someone reflects-in-action, he becomes a researcher in the practice context.’ Reflection-on-action, in contrast, involves looking back on the actions performed (Schön 1983): reflecting after the event to review, analyse, and evaluate the situation and gain insight for improved practice in the future.

Pedagogical features and learning activities

The overall pedagogy of GI comprises inquiry-driven reflective learning experiences focused on the integration of genetics concepts and practices in an authentic context. The key features of GI are presented in Fig. 2.

Fig. 2

Key features of Geneticus Investigatio

Integration of domain concepts, science practices, and statistical tools

The overall structure of the learning environment ensures that learners are able to integrate science practices with their knowledge of genetics and statistics. In GI, learners are required to perform most of the science inquiry practices, such as carrying out experiments in the form of designing breeding experiments and solving the corresponding questions in the tests. Learners are also required to analyse data by computing the chi-square value, which requires them to compare predicted and observed results, and to warrant claims with evidence by concluding about the hypothesis. Learners begin by understanding the problem and the given hypothesis to test. This requires the learner to understand the context, for example, the scientific phenomenon which is to be explained. To explain the context by the hypothesis, learners have to state the assumptions while testing this hypothesis and declare the dependent and independent variables. The system displays the hypothesis and a drag and drop activity for identifying the variables (Fig. 3).

Fig. 3

Learning activities in GI to understand the breeding context and corresponding hypothesis

This is followed by the hypothesis-testing phase, which includes designing an experiment and reasoning from the hypothesis to predict the result. Here, learning activities require learners to decide on the cross to be made, design the steps of the breeding experiment, and calculate the predicted value (Fig. 4). The system displays activities related to determining the cross made and calculating the ratio, providing editing boxes for stating the laws of inheritance, creating a Punnett square, and calculating the ratio of offspring. Besides this, learners interactively design the steps of breeding experiments and watch a lab demo video of an actual experiment done in practical labs.
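The prediction step that learners carry out with a Punnett square can also be sketched programmatically. The following illustrative Python snippet (not part of GI; the allele symbols P/p for purple/blue petals are our assumption for the example) crosses two heterozygous F1 plants and recovers the familiar 3:1 phenotypic ratio:

```python
from collections import Counter
from itertools import product

def punnett_square(parent1, parent2):
    """Enumerate offspring genotypes from each parent's two alleles."""
    # Each offspring inherits one allele from each parent; sorting makes
    # 'pP' and 'Pp' count as the same genotype.
    return Counter(
        "".join(sorted(pair)) for pair in product(parent1, parent2)
    )

# Monohybrid cross of two heterozygous F1 plants (Pp x Pp).
genotypes = punnett_square("Pp", "Pp")
print(dict(genotypes))  # {'PP': 1, 'Pp': 2, 'pp': 1}

# Phenotype: any genotype carrying the dominant allele 'P' is purple.
phenotypes = Counter(
    "purple" if "P" in g else "blue"
    for g, n in genotypes.items()
    for _ in range(n)
)
print(dict(phenotypes))  # {'purple': 3, 'blue': 1}  -> the 3:1 ratio
```

The same enumeration extends to a dihybrid cross by crossing gamete combinations, which yields the 9:3:3:1 ratio.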

Fig. 4

Learning activities in GI for designing an experiment and predicting result to test the hypothesis

The evaluation and summarisation phase is for revising the hypothesis, if required, by comparing the expected and observed values. The last learning objective has two goals, namely ‘Evaluating’ and ‘Summarising’. Once the experiment is designed and results are collected, the results have to be statistically compared with the predicted outcome to reach a conclusion. In the evaluate phase, learners learn interactively about the chi-square test, calculate the chi-square value, compare it with the critical value, and conclude on that basis (Fig. 5). The system displays an interactive video with reflective question prompts on the what, why, and how of the chi-square test, and learners calculate the chi-square value using an editable table. Besides this, learners reflect on planning and solving similar problems through a drag and drop activity.
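The concluding step of the evaluate phase, comparing the computed chi-square value against the critical value for the relevant degrees of freedom, can be sketched as follows. This is an illustration of the statistical procedure, not GI's actual implementation; the critical values are the standard chi-square table at the 0.05 significance level:

```python
# Critical values of the chi-square distribution at the 0.05
# significance level, indexed by degrees of freedom (standard table).
CRITICAL_05 = {1: 3.841, 2: 5.991, 3: 7.815, 4: 9.488}

def conclude(chi_square, num_classes):
    """Decide about the hypothesis from the chi-square statistic.

    Degrees of freedom = number of phenotype classes - 1.
    """
    df = num_classes - 1
    critical = CRITICAL_05[df]
    if chi_square > critical:
        return f"reject: {chi_square} > critical value {critical} (df={df})"
    return f"fail to reject: {chi_square} <= critical value {critical} (df={df})"

# Worked problem: chi-square = 5.33 with two phenotype classes (df = 1),
# so the 3:1 hypothesis is rejected.
print(conclude(5.33, 2))
# A dihybrid 9:3:3:1 cross has four classes (df = 3); a chi-square of
# 2.5 falls below the critical value 7.815.
print(conclude(2.5, 4))
```

This is exactly the decision learners rehearse in GI: determine the degrees of freedom, look up the critical value, and warrant the conclusion about the hypothesis with that comparison.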

Fig. 5

Learning activities in GI for hypothesis conclusion and transfer to a new context

Question prompts

The integration of concepts, inquiry practices, and use of statistics is interspersed with question prompts. There are prompts for self-assessment or to strengthen the application of conceptual knowledge, e.g. ‘The chi-square test is used to...’. Another example of a question prompt from GI appears when a learner is introduced to the context and hypothesis and is then asked to reflect on their understanding, e.g. ‘Did you think about the following while identifying the parts of the hypothesis?’ There is also a set of reflective question prompts, e.g. ‘What steps will you perform while solving a similar problem?’ Feedback on these question prompts is customised based on the learner's chosen option, and learners are allowed to revise their responses. These question prompts facilitate explanation construction and justification.

Reflection activities

Learning activities in GI conclude with a planning and summarising activity. This activity requires learners to perform reflection-on-action, aimed at integrating the concepts and inquiry practices. Learners reflect upon the abstract integration steps to be performed while solving a similar scenario, and the learning activity requires them to arrange the abstract steps and sub-steps in the correct order. This activity prompts learners to articulate their thinking.

Scaffolds

Learners are provided with immediate feedback throughout the learning activities. Along with the feedback, hints are provided to scaffold learners in the integration process. Learners are asked to state their reasoning explicitly in many places, which ensures that they make informed decisions during the interaction. In addition, resources related to concepts of genetics and statistics are provided in the form of videos, PDFs, and solved examples, which learners can access at any time during problem-solving.

Development of GI

Content development

GI contains modules on basic genetics topics taught in the high-school curriculum or a first-year undergraduate course. Each topic involves multiple possible underlying processes, which learners must identify by designing an experiment and testing their hypotheses. Example modules include monohybrid and dihybrid crosses with Mendelian and non-Mendelian inheritance as the underlying processes, in different model organisms. GI also provides additional resources on these concepts for learners who wish to refer to them.

Technology development

The user interface of GI is designed and implemented with Google Sites, a free website builder. The learning activities of GI are designed in H5P, a free, open-source HTML5 toolkit for developing interactive content. H5P supports the creation of interactive learning activities in which learners manipulate artefacts available in the environment. Users access GI through a standard web browser on any device. In most learning activities, learners navigate back and forth through interspersed drag-and-drop interactions. In addition, group-level learning behaviours can be monitored in real time on the Google Analytics platform, which helps teachers provide timely feedback. This approach creates a seamless transition from guided problem-solving, as done in a traditional classroom, to a personalised web-based learning environment.

Evaluation of GI: study 2

We implemented a single-group pre-post research design to understand the effect of interacting with GI on learners' learning. The study focused on two research questions:

  • Do learners who interact with GI develop an integrated understanding of concepts and skills in genetics?

  • What are the user perceptions of usability and usefulness of GI?

Participants, study context, and procedure

Participants in this study were thirty-seven undergraduate bio-science learners. The context of the learning material was a monohybrid cross following Mendelian inheritance. The study was conducted as part of a workshop for bio-science learners (Fig. 6). After an introductory registration to inform participants about the objectives and gather prior academic details, participants took a pre-test similar to the questions in study 1. This was followed by interaction with the learning material in GI, after which participants took a post-test similar to the pre-test. The workshop concluded with a survey questionnaire on participants' perceptions of the usability and usefulness of GI. The survey instrument was the 10-item System Usability Scale (SUS) (Brooke, 1996), widely used for assessing the usability of a wide variety of learning environments. We added open-ended questions to capture which features of GI participants found useful or challenging, and what learning and value they perceived from GI.

Fig. 6

Procedure of the study
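The 10-item SUS used in the final survey has a standard scoring rule (Brooke, 1996): odd-numbered (positively worded) items contribute response − 1, even-numbered (negatively worded) items contribute 5 − response, and the sum is scaled by 2.5 to give a 0-100 score. A minimal sketch, with a hypothetical participant's responses:

```python
def sus_score(responses):
    """Standard SUS scoring: ten 1-5 Likert responses; odd-numbered items
    are positively worded (contribute response - 1), even-numbered items
    are negatively worded (contribute 5 - response); the sum is scaled
    by 2.5 to yield a 0-100 score."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical responses from a single participant
score = sus_score([4, 2, 4, 2, 3, 2, 4, 2, 3, 3])
```

A group-level SUS score, such as the one reported below, is the mean of the individual participants' scores.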

Data sources and analysis technique

To answer the first research question on learning gains after interacting with GI, we evaluated learners' pre- and post-test responses using an adapted scientific abilities rubric (Etkina et al., 2006) (Fig. 7).

Fig. 7

Rubric item and corresponding question statement

In this research work, the intended purpose of the rubric was to assess integration ability during problem-solving within the defined domains. The rubric was tested for validity and reliability. Content validity was established by discussing the rubric items and their scoring descriptions with three experts, one after another: two senior bio-science faculty with experience teaching genetics and biostatistics courses, followed by a learning-sciences expert. Changes suggested by each expert were incorporated before validation with the next. Construct validity was established by scoring learners' responses to an open-ended genetics problem with the rubric, along with instructors' responses to the same question. For the rubric to demonstrate construct validity, instructors were expected to score higher than learners, and learners' solutions were expected to show a range of scores reflecting their differing abilities. Rubric scores allotted by different raters to the same problem should also be consistent; this was established through inter-rater reliability, which showed substantial agreement between two coders (Cohen's kappa = 0.774, p < 0.001). Sample rubric items and the evaluation of a student's response are given in Fig. 8 and Table 1.

Fig. 8

Sample rubric item

Table 1 Evaluation of sample student response
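The chance-corrected agreement statistic used above can be computed as follows. This is a minimal sketch with hypothetical coder scores, not the study's coding data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters who
    assign categorical scores to the same items."""
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal distribution
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c]
                 for c in set(counts_a) | set(counts_b)) / n ** 2
    return (observed - chance) / (1 - chance)

# Hypothetical rubric scores (0-3) from two coders on ten responses
coder1 = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
coder2 = [3, 2, 1, 1, 0, 3, 2, 1, 2, 2]
kappa = cohen_kappa(coder1, coder2)
```

For these toy scores kappa is about 0.71, which, like the 0.774 reported above, falls in the conventional "substantial agreement" band (0.61-0.80).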

For the second research question on the usability and usefulness of GI, we calculated the SUS score and performed a thematic analysis of responses to the four open-ended questions. We also captured screen recordings with Kazam, a screen-recording tool for Linux, to analyse participants' on-screen activities and identify interaction difficulties.

Results

Effect of GI on learners’ learning performance

The mean values and standard deviations of scores on the complex-learning rubric are shown in Table 2. A paired-sample t test between post- and pre-test scores was statistically significant for many rubric items.

Table 2 Rubric item-wise statistics of the pre- and post-test scores

The perception of usability and usefulness of GI

The SUS score was 59.23, indicating that the product is usable (Brooke, 1996). Thematic analysis of the responses to the open-ended questions on the usefulness and usability of GI showed that participants found the interactive video, question prompts for reflection, drag-and-drop learning activities, and domain understanding to be useful features, and that GI was valuable for learning hypothesis testing and revision as well as genetics concepts (Table 3).

Table 3 Themes and respective sample excerpts

Reflection of DBR cycle 1

Our findings from the evaluation study showed that learners who learned with GI developed an integrated understanding of concepts and skills. However, we also noticed problems when learners interacted with the system. Analysis of the open-ended questions provided insight into learners' difficulties. Some were conceptual; for example, learners had difficulty understanding the context and could not comprehend and calculate the predicted result of the designed experiment. Other difficulties related to user-interface issues. After analysing the open-ended feedback responses and the screen recordings, we identified the likely reasons as pedagogical and interactional, and used these as the basis for redesigning GI.

DBR cycle 2: redesign of GI

Based on the findings and reflection from DBR cycle 1, we identified the problems in the design of the first version of GI, potential reasons for each problem, and the corresponding redesign steps (Table 4).

Table 4 Problems seen in DBR cycle 1, a potential reason for the problem, and redesign step
Fig. 9

Comparison of the interface of predicting the result of the breeding experiment

Fig. 10

Comparison of the interface of doing the statistical calculation

Evaluation of GI-2.0: study 3

We conducted a quasi-experimental classroom study with undergraduate bio-science learners, in the context of Mendelian genetics, using the revised version of GI. The pre-post study in DBR cycle 1 had already established that learners who interacted with GI demonstrated complex learning. In this study, we compared the learning gains of learners who interacted with GI against those of a control group. The two research questions for this study were:

  • Do learners who interact with GI develop an integrated understanding of concepts and skills in genetics?

  • What are user perceptions of usability and usefulness of GI?

Participants, study context, and procedure

Participants in this study were 121 undergraduate bio-science learners, randomly assigned to a control or an experimental group. The context of the learning material was a dihybrid cross following Mendelian inheritance. The study was conducted as part of a workshop, in a supervised setting, to capture data on complex-learning gains and participants' perceptions of the usability of GI. The study had five steps, as presented in Fig. 11.

Fig. 11

Procedure of the study

After an introductory registration to inform participants about the objectives and gather prior academic details, participants took a pre-test. The pre-test questions were similar to those of the second study, with two minor revisions to the first and last questions for better comprehensibility. The pre-test was followed by interaction with the learning material. Participants in the experimental group worked on the learning activities in GI. Participants in the control group were given learning material on genetics concepts, the importance of model organisms, hypothesis formation, and how to calculate the chi-square statistic and compare it with the critical value; they also went through worked examples. The main difference between the two groups' learning materials was the inquiry-driven reflective learning experiences in GI. The control group's materials took the form of videos, PDFs, and Google Slides organised as a Google site. The videos were the same as in the experimental group but lacked the scaffolds and prompts present in GI. Afterwards, participants in both groups took a post-test similar to the pre-test. The workshop concluded with a perception survey similar to that of the second study.

Data analysis and results

The effect of GI on learners’ learning performance

We performed a paired-sample t test between post- and pre-test scores for both the control and experimental groups to evaluate learning gains after interacting with the learning material. For the experimental group, the difference between means was statistically significant (p < 0.001) for most rubric items (Table 5), with a large standardised effect size (1.32).

Table 5 Rubric item-wise statistics of the pre- and post-test scores for control and experimental groups

For the control group, the difference between means was statistically significant (p < 0.001) for achieving an integrated understanding of concepts and skills (Table 5), but not for understanding the breeding context or designing the breeding experiment. The standardised effect size for the control group was medium (0.67).

We also performed an independent-sample t test on normalised gain (Hake, 1998) to determine whether the performance of the experimental group learners, who learned via GI, differed from that of the control group. The difference between the means of the experimental and control groups was statistically significant (p < 0.001) for achieving an integrated understanding of concepts and skills.
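The per-learner quantities behind these comparisons, Hake's normalised gain, the paired t test, and a standardised effect size, can be sketched as follows. The pre/post rubric totals and the maximum score of 10 are hypothetical, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post rubric totals for six learners; the maximum
# attainable rubric score is assumed to be 10.
pre = np.array([3, 4, 2, 5, 3, 4])
post = np.array([7, 8, 6, 9, 6, 8])
max_score = 10

# Hake (1998) normalised gain: the fraction of the possible improvement
# that each learner actually achieved.
gain = (post - pre) / (max_score - pre)

# Paired-sample t test of post vs. pre scores, and a standardised effect
# size (Cohen's d for paired samples: mean difference / SD of differences).
t_stat, p_value = stats.ttest_rel(post, pre)
diff = post - pre
effect_size = diff.mean() / diff.std(ddof=1)
```

The between-group comparison reported above would then be an independent-sample t test on the two groups' gain arrays, e.g. `stats.ttest_ind(gain_experimental, gain_control)`.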

The perception of usability and usefulness of GI

We performed a thematic analysis of responses to the open-ended questions about the usefulness and usability of GI. As in study 2, participants found the interactive video, question prompts for reflection, drag-and-drop learning activities, and domain understanding to be useful features. Analysis of the two questions on usefulness revealed that GI supports complex learning by integrating science inquiry practices with the learning of genetics concepts. The SUS score was 66.01, indicating that the product is usable.

Overall discussion and conclusion

In this paper, we presented the GI pedagogy for facilitating complex learning by integrating concepts of genetics and statistics with science inquiry practices. The GI pedagogy was developed and implemented through two DBR cycles. In the analysis and exploration phase of the first DBR cycle, we identified the required concepts and learners' difficulties. We then designed and developed the first version of the GI learning environment, based on recommendations from the literature and the teacher study, and evaluated it with 37 undergraduate bio-science learners. Rubric-wise analysis of learning gains revealed that learners had difficulty understanding the breeding context and the corresponding hypothesis. Analysis of the open-ended questions further revealed difficulties in interacting with GI when predicting the result of the experiment and making the statistical comparison. These findings were analysed, and corresponding design changes were made in the revised version of GI. In the second DBR cycle, we evaluated the revised version through a quasi-experimental study. The experimental group demonstrated higher learning gains in complex learning than the control group. Rubric-wise analysis revealed that interaction with both the control and experimental learning materials was effective for learning the application of domain-related steps, such as predicting the result of the breeding experiment and making the statistical comparison. Stated differently, participants seemed to understand the application of domain concepts. This is not entirely unexpected, as they are used to a teaching method in which the teacher demonstrates the steps and learners mechanically apply them to similar problems. In GI, instead of a teacher, participants watched an interactive video explaining the steps of the chi-square calculation.

In contrast to the learning gains for applying domain steps, the learning gains for integrating the domain with science inquiry practices, e.g. designing breeding experiments, stating assumptions, and drawing hypothesis conclusions based on statistical analysis, were not significant for either group. This result is in line with findings in the existing literature that learning such skills requires multiple interactions over time (Kim & Hannafin, 2011). Based on this result, we conjecture that multiple interactions with GI across contexts will lead to significant learning gains. A significant difference between the experimental and control groups was found for transferring the concepts and skills learned to a new authentic scenario: GI facilitated metacognition and the transfer of abstract processes to a novel, authentic context. This result is worth discussing, as the high learning gain on this question can be attributed to the summarising and reflection activities that learners completed in the individual phases of GI. Because participants had to reflect upon the overall learning activity and its sub-activities, they were able to abstract the integration of concepts and skills. The learning activity gave them the flexibility to arrange the steps sequentially and to access hints as and when required. We conjecture that this is why they were able to demonstrate the transfer of complex learning explicitly in their answers to the post-test questions.

Thematic analysis of the open-ended responses also revealed learner perceptions that GI is valuable for the complex learning of integrating concepts with the science inquiry practices of hypothesis testing and revision. Participants further perceived that interacting with GI would support a better conceptual understanding of genetics and statistical tools and would help in advanced studies. Based on the SUS score, GI was rated a usable product, and repeated use of the tool is likely to boost learners' confidence in interacting with it. As a web-based learning environment, it makes learning flexible, portable, and attractive (Hashemi, Azizinezhad, Najafi, & Nesari, 2011).

The GI pedagogy is designed to scaffold learners' cognitive processes of complex learning during problem-solving. It supports learners in integrating domain knowledge with science inquiry practices through inquiry-driven reflective learning experiences. These experiences place the learner in an authentic context in which they perform inquiry-based activities reflecting the actual practices of a bioscientist, interspersed with reflection activities and with scaffolds available as and when required. Learning-environment developers and instructors who wish to operationalise this pedagogy can do so in self-paced, face-to-face, or blended learning modes. Since GI is built with Google Sites and H5P as a web-based learning environment, it is convenient for learners and teachers alike to access. Being browser-based and usable on laptops, tablets, and mobile phones, GI can be used in classroom settings or anywhere else with only a device and an internet connection. Undergraduate bio-science learners who wish to improve their complex cognitive processes during problem-solving can access it directly, even in the absence of a teacher. Instructors who want to train their learners in complex learning on a new topic can adapt and implement the GI pedagogy in a new scenario, since creating new modules does not require advanced computer knowledge.

An important limitation of this work is that the domain in GI was restricted to genetics, with topics specific to Mendelian and non-Mendelian inheritance. We conjecture that the GI pedagogy and learning environment can be adapted to other topics that require an integration of domain concepts and inquiry practices, but more research is needed to confirm this conjecture about generalisability. Another limitation is the assumption that learners are motivated to self-learn; since GI is a self-paced learning environment, we conjecture that a learner with low motivation will have difficulty in problem-solving after interacting with it. Future work could address these limitations to ensure that GI is useful and effective for a wide range of learners. As part of future work, we would also like to add a feedback component to GI, such as a rubric or self-evaluation checklist, that spots faulty operationalisation of the breeding ratio and hints at corrective measures.

Availability of data and materials

The dataset supporting the conclusions of this article is included in the article.

Abbreviations

GI:

Geneticus Investigatio

TEL:

Technology-enhanced learning

DBR:

Design-based research

References

  1. American Association for the Advancement of Science (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC.

  2. Bahar, M., Johnstone, A. H., & Hansell, M. H. (1999). Revisiting learning difficulties in biology. Journal of Biological Education, 33(2).

  3. Barab, S. A., & Plucker, J. A. (2002). Smart people or smart contexts? Cognition, ability, and talent development in an age of situated approaches to knowing and learning. Educational psychologist, 37(3), 165–182.

  4. Bransford, J. D., Brown, A. L., & Cocking, R. R. (2000). How people learn, (vol. 11). Washington, DC: National academy press.

  5. Brooke, J. (1996). SUS-A quick and dirty usability scale. Usability evaluation in industry, 189(194), 4–7.

  6. Brownell, S. E., Kloser, M. J., Fukami, T., & Shavelson, R. (2012). Undergraduate biology lab courses: Comparing the impact of traditionally based “cookbook” and authentic research-based courses on student lab experiences. Journal of College Science Teaching, 41(4), 36.

  7. Chen, W., Looi, C., Xie, W., & Wen, Y. (2013). Empowering argumentation in the science classroom with a complex CSCL environment. In Proceedings of the 21st International Conference on Computers in Education. Indonesia: Asia-Pacific Society for Computers in Education.

  8. Concord Consortium (2010). Geniverse [Online]. Available: https://geniverse-lab.concord.org (Accessed: 16 May 2019).

  9. Cooper, S., Hanmer, D., & Cerbin, B. (2006). Problem-solving modules in large introductory biology lectures. The American Biology Teacher, 68(9), 524–530.

  10. Davis, E. A. (2003). Prompting middle school science students for productive reflection: Generic and directed prompts. The Journal of the Learning Sciences, 12(1), 91–142.

  11. De Jong, T., Sotiriou, S., & Gillet, D. (2014). Innovations in STEM education: The Go-Lab federation of online labs. Smart Learning Environments, 1(1), 3.

  12. De Jong, T., & Van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of educational research, 68(2), 179–201.

  13. Demetriadis, S. N., Papadopoulos, P. M., Stamelos, I. G., & Fischer, F. (2008). The effect of scaffolding students’ context-generating cognitive activity in technology-enhanced case-based learning. Computers & Education, 51(2), 939–954.

  14. Etkina, E, Van Heuvelen, A, White-Brahmia, S, Brookes, DT, Gentile, M, Murthy, S, Warren, A. (2006). Scientific abilities and their assessment. Phys. Rev. ST Phys. Educ. Res, 2(2), 020103.

  15. Frerejean, J., van Merriënboer, J. J., Kirschner, P. A., Roex, A., Aertgeerts, B., & Marcellis, M. (2019). Designing instruction for complex learning: 4C/ID in higher education. European Journal of Education.

  16. Guan, J., Su, X., Qian, D., & Yu, E. (2014). Current situation and strategy of the promotion of e-schoolbag application area—Based on analysis of content of teachers’ interviews. E-Education Research, 35(10), 53–59.

  17. Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74.

  18. Hashemi, M., Azizinezhad, M., Najafi, V., & Nesari, A. J. (2011). What is mobile learning? Challenges and capabilities. Procedia-Social and Behavioral Sciences, 30, 2477–2481.

  19. Hester, S. D., Nadler, M., Katcher, J., Elfring, L. K., Dykstra, E., Rezende, L. F., & Bolger, M. S. (2018). Authentic Inquiry through Modeling in Biology (AIM-Bio): An introductory laboratory curriculum that increases undergraduates’ scientific agency and skills. CBE—Life Sciences Education, 17(4), ar63.

  20. Hoskinson, A. M., Caballero, M. D., & Knight, J. K. (2013). How can we improve problem-solving in undergraduate biology? Applying lessons from 30 years of physics education research. CBE—Life Sciences Education, 12(2), 153–161.

  21. Joubel (2013). H5P [Online]. Available: https://h5p.org/ (Accessed: 20 May 2019).

  22. Karagoz, M., & Çakir, M. (2011). Problem solving in genetics: Conceptual and procedural difficulties. Educational Sciences: Theory and Practice, 11(3), 1668–1674.

  23. Kim, M. C., & Hannafin, M. J. (2011). Scaffolding problem-solving in technology-enhanced learning environments (TELEs): Bridging research and theory with practice. Computers & Education, 56(2), 403–417.

  24. Kirschner, P. A., & Van Merriënboer, J. (2008). Ten steps to complex learning: A new approach to instruction and instructional design.

  25. Lang, J. D., Cruse, S., McVey, F. D., & McMasters, J. (1999). Industry expectations of new engineers: A survey to assist curriculum designers. Journal of Engineering Education, 88(1), 43–51.

  26. Lazonder, A. W., & Harmsen, R. (2016). Meta-analysis of inquiry-based learning: Effects of guidance. Review of educational research, 86(3), 681–718.

  27. Makransky, G., Bonde, M. T., Wulff, J. S., Wandall, J., Hood, M., Creed, P. A., … Nørremølle, A. (2016). Simulation-based virtual learning environment in medical genetics counselling: An example of bridging the gap between theory and practice in medical education. BMC medical education, 16(1), 98.

  28. McKenney, S., & Reeves, T. C. (2014). Educational design research. In Handbook of research on educational communications and technology (pp. 131–140). New York, NY: Springer.

  29. NRC (2012). A framework for K–12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

  30. Orcajo, T., & Aznar, M. (2005). Solving problems in genetics II: Conceptual restructuring. International Journal of Science Education, 27(12), 1495–1519.

  31. Pedaste, M., Mäeots, M., Siiman, L. A., De Jong, T., Van Riesen, S. A., Kamp, E. T., … Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational research review, 14, 47–61.

  32. Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., … Soloway, E. (2004). A scaffolding design framework for software to support science inquiry. The journal of the learning sciences, 13(3), 337–386.

  33. Rodenbusch, S. E., Hernandez, P. R., Simmons, S. L., & Dolan, E. L. (2016). Early engagement in course-based research increases graduation rates and completion of science, engineering, and mathematics degrees. CBE—Life Sciences Education, 15(2).

  34. Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

  35. Shuman, L. J., Besterfield-Sacre, M., & McGourty, J. (2005). The ABET “professional skills”—can they be taught? Can they be assessed? Journal of Engineering Education, 94(1), 41–55.

  36. Slotta, J. (2002). Designing the web-based inquiry science environment (WISE). Educational technology, 42(5), 15–20.

  37. Suárez, Á., Specht, M., Prinsen, F., Kalz, M., & Ternier, S. (2018). A review of the types of mobile activities in mobile inquiry-based learning. Computers & Education, 118, 38–55.

  38. Thompson, N., & McGill, T. J. (2017). Genetics with Jean: The design, development and evaluation of an affective tutoring system. Educational Technology Research and Development, 65(2), 279–299.

  39. Van Merriënboer, J. J. (1997). Training complex cognitive skills: A four-component instructional design model for technical training. Educational Technology.

  40. Van Merriënboer, J. J., & Dolmans, D. H. J. M. (2015). Research on instructional design in the health sciences: From taxonomies of learning to whole-task models. Researching medical education, 193-206.

  41. Van Merriënboer, J. J. G. (2007). Alternate models of instructional design: Holistic design approaches and complex learning. Trends and issues in instructional design and technology, 72–81.

  42. Xun, G. E., & Land, S. M. (2004). A conceptual framework for scaffolding ill-structured problem-solving processes using question prompts and peer interactions. Educational Technology Research and Development, 52(2), 5–22.

  43. Yin, C. J., Uosaki, N., Chu, H. C., Hwang, G. J., Hwang, J. J., Hatono, I., & Tabata, Y. (2017). Learning behavioural pattern analysis based on students’ logs in reading digital books. In Proceedings of the 25th International Conference on Computers in Education (pp. 549-557).

Acknowledgements

The authors would like to thank everyone who contributed to this research work.

Funding

Not applicable

Author information

Affiliations

Authors

Contributions

Anurag Deep has carried out the development and research of Geneticus Investigatio under the guidance of Prof. Sahana Murthy and Prof. Jayadeva Bhat. Anurag Deep drafted the manuscript; Prof. Sahana Murthy and Prof. Jayadeva Bhat checked it. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Anurag Deep.

Ethics declarations

Ethics approval and consent to participate

All participants in this study signed a consent form stating that their participation was voluntary and would bear no impact on their university marks. All participants were awarded a certificate of participation.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Deep, A., Murthy, S. & Bhat, J. Geneticus Investigatio: a technology-enhanced learning environment for scaffolding complex learning in genetics. RPTEL 15, 24 (2020). https://doi.org/10.1186/s41039-020-00145-5

Keywords

  • Complex learning
  • Problem-solving
  • Geneticus Investigatio
  • Design-based research