Portland Community College | Portland, Oregon

Core Outcomes: Critical Thinking and Problem Solving

Think Critically and Imaginatively

  • Engage the imagination to explore new possibilities.
  • Formulate and articulate ideas.
  • Recognize explicit and tacit assumptions and their consequences.
  • Weigh connections and relationships.
  • Distinguish relevant from non-relevant data, fact from opinion.
  • Identify, evaluate and synthesize information (obtained through library, world-wide web, and other sources as appropriate) in a collaborative environment.
  • Reason toward a conclusion or application.
  • Understand the contributions and applications of associative, intuitive and metaphoric modes of reasoning to argument and analysis.
  • Analyze and draw inferences from numerical models.
  • Determine the extent of information needed.
  • Access the needed information effectively and efficiently.
  • Evaluate information and its sources critically.
  • Incorporate selected information into one’s knowledge base.
  • Understand the economic, legal, and social issues surrounding the use of information, and access and use information ethically and legally.

Problem-Solve

  • Identify and define central and secondary problems.
  • Research and analyze data relevant to issues from a variety of media.
  • Select and use appropriate concepts and methods from a variety of disciplines to solve problems effectively and creatively.
  • Form associations between disparate facts and methods, which may be cross-disciplinary.
  • Identify and use appropriate technology to research, solve, and present solutions to problems.
  • Understand the roles of collaboration, risk-taking, multi-disciplinary awareness, and the imagination in achieving creative responses to problems.
  • Make a decision and take actions based on analysis.
  • Interpret and express quantitative ideas effectively in written, visual, aural, and oral form.
  • Interpret and use written, quantitative, and visual text effectively in presentation of solutions to problems.
Core Outcomes Rubric: Sample Indicators

Limited demonstration or application of knowledge and skills.

  • Identifies the main problem, question at issue, or the source’s position.
  • Identifies implicit aspects of the problem and addresses their relationship to each other.

Basic demonstration and application of knowledge and skills.

  • Identifies one’s own position on the issue, drawing support from experience and from information not available from assigned sources.
  • Addresses more than one perspective, including perspectives drawn from outside information.
  • Clearly distinguishes between fact and opinion, and acknowledges value judgments.

Demonstrates comprehension and is able to apply essential knowledge and skills.

  • Identifies and addresses the validity of key assumptions that underlie the issue.
  • Examines the evidence and the sources of evidence.
  • Relates cause and effect.
  • Illustrates existing or potential consequences.
  • Analyzes the scope and context of the issue, including an assessment of the audience of the analysis.

Demonstrates thorough, effective, and/or sophisticated application of knowledge and skills.

  • Identifies and discusses conclusions, implications, and consequences of issues, considering context, assumptions, data, and evidence.
  • Objectively reflects upon one’s own assertions.

  • AB: Auto Collision Repair Technology
  • ABE: Adult Basic Education
  • AD: Addiction Studies
  • AM: Automotive Service Technology
  • AMT: Aviation Maintenance Technology
  • APR: Apprenticeship
  • ARCH: Architectural Design and Drafting
  • ASL: American Sign Language
  • ATH: Anthropology
  • AVS: Aviation Science
  • BA: Business Administration
  • BCT: Building Construction Technology
  • BI: Biology
  • BIT: Bioscience Technology
  • CADD: Computer Aided Design and Drafting
  • CAS/OS: Computer Applications & Web Technologies
  • CG: Counseling and Guidance
  • CH: Chemistry
  • CHLA: Chicano/ Latino Studies
  • CHN: Chinese
  • CIS: Computer Information Systems
  • CJA: Criminal Justice
  • CMET: Civil and Mechanical Engineering Technology
  • COMM: Communication Studies
  • Core Outcomes: Communication
  • Core Outcomes: Community and Environmental Responsibility
  • Core Outcomes: Cultural Awareness
  • Core Outcomes: Professional Competence
  • Core Outcomes: Self-Reflection
  • CS: Computer Science
  • CTT: Computed Tomography
  • DA: Dental Assisting
  • DE: Developmental Education – Reading and Writing
  • DH: Dental Hygiene
  • DS: Diesel Service Technology
  • DST: Dealer Service Technology
  • DT: Dental Lab Technology
  • DT: Dental Technology
  • EC: Economics
  • ECE/HEC/HUS: Child and Family Studies
  • ED: Paraeducator and Library Assistant
  • EET: Electronic Engineering Technology
  • ELT: Electrical Trades
  • EMS: Emergency Medical Services
  • ENGR: Engineering
  • ESOL: English for Speakers of Other Languages
  • ESR: Environmental Studies
  • Exercise Science (formerly FT: Fitness Technology)
  • FMT: Facilities Maintenance Technology
  • FN: Foods and Nutrition
  • FOT: Fiber Optics Technology
  • FP: Fire Protection Technology
  • GD: Graphic Design
  • GEO: Geography
  • GER: German
  • GGS: Geology and General Science
  • GRN: Gerontology
  • HE: Health Education
  • HIM: Health Information Management
  • HR: Culinary Assistant Program
  • HST: History
  • ID: Interior Design
  • INSP: Building Inspection Technology
  • Integrated Studies
  • ITP: Sign Language Interpretation
  • J: Journalism
  • JPN: Japanese
  • LAT: Landscape Technology
  • LIB: Library
  • Literature (ENG)
  • MA: Medical Assisting
  • MCH: Machine Manufacturing Technology
  • MLT: Medical Laboratory Technology
  • MM: Multimedia
  • MP: Medical Professions
  • MRI: Magnetic Resonance Imaging
  • MSD: Management/Supervisory Development
  • MT: Microelectronic Technology
  • MTH: Mathematics
  • MUC: Music & Sonic Arts (formerly Professional Music)
  • NRS: Nursing
  • OMT: Ophthalmic Medical Technology
  • OST: Occupational Skills Training
  • PCC Core Outcomes/Course Mapping Matrix
  • PE: Physical Education
  • PHL: Philosophy
  • PHY: Physics
  • PL: Paralegal
  • PS: Political Science
  • PSY: Psychology
  • Race, Indigenous Nations, and Gender (RING)
  • RAD: Radiography
  • RE: Real Estate
  • RUS: Russian
  • SC: Skill Center
  • SOC: Sociology
  • SPA: Spanish
  • TA: Theatre Arts
  • TE: Facilities Maintenance
  • VP: Video Production
  • VT: Veterinary Technology
  • WLD: Welding Technology
  • Writing/Composition
  • WS: Women’s and Gender Studies

Creating Learning Outcomes

A learning outcome is a concise description of what students will learn and how that learning will be assessed. Having clearly articulated learning outcomes can make designing a course, assessing student learning progress, and facilitating learning activities easier and more effective. Learning outcomes can also help students regulate their learning and develop effective study strategies.

Defining the terms

Educational research uses a number of terms for this concept, including learning goals, student learning objectives, session outcomes, and more. 

In alignment with other Stanford resources, we will use learning outcomes as a general term for what students will learn and how that learning will be assessed. This includes both goals and objectives. We will use learning goals to describe general outcomes for an entire course or program. We will use learning objectives when discussing more focused outcomes for specific lessons or activities.

For example, a learning goal might be “By the end of the course, students will be able to develop coherent literary arguments.” 

Whereas a learning objective might be, “By the end of Week 5, students will be able to write a coherent thesis statement supported by at least two pieces of evidence.”

Learning outcomes benefit instructors

Learning outcomes can help instructors in a number of ways by:

  • Providing a framework and rationale for making course design decisions about the sequence of topics and instruction, content selection, and so on.
  • Communicating to students what they must do to make progress in learning in your course.
  • Clarifying your intentions to the teaching team, course guests, and other colleagues.
  • Providing a framework for transparent and equitable assessment of student learning. 
  • Making outcomes concerning values and beliefs, such as dedication to discipline-specific values, more concrete and assessable.
  • Making inclusion and belonging explicit and integral to the course design.

Learning outcomes benefit students 

Clearly articulated learning outcomes can also help guide and support students in their own learning by:

  • Clearly communicating the range of learning students will be expected to acquire and demonstrate.
  • Helping learners concentrate on the areas that they need to develop to progress in the course.
  • Helping learners monitor their own progress, reflect on the efficacy of their study strategies, and seek out support or better strategies. (See Promoting Student Metacognition for more on this topic.)

Choosing learning outcomes

When writing learning outcomes to represent the aims and practices of a course or even a discipline, consider:

  • What is the big idea that you hope students will still retain from the course even years later?
  • What are the most important concepts, ideas, methods, theories, approaches, and perspectives of your field that students should learn?
  • What are the most important skills that students should develop and be able to apply in and after your course?
  • What would students need to have mastered earlier in the course or program in order to make progress later or in subsequent courses?
  • What skills and knowledge would students need if they were to pursue a career in this field or contribute to communities impacted by this field?
  • What values, attitudes, and habits of mind and affect would students need if they are to pursue a career in this field or contribute to communities impacted by this field?
  • How can the learning outcomes span a wide range of skills that serve students with differing levels of preparation?
  • How can learning outcomes offer a range of assessment types to serve a diverse student population?

Use learning taxonomies to inform learning outcomes

Learning taxonomies describe how a learner’s understanding develops from simple to complex when learning different subjects or tasks. They are useful here for identifying any foundational skills or knowledge needed for more complex learning, and for matching observable behaviors to different types of learning.

Bloom’s Taxonomy

Bloom’s Taxonomy is a hierarchical model and includes three domains of learning: cognitive, psychomotor, and affective. In this model, learning occurs hierarchically, as each skill builds on previous skills towards increasingly sophisticated learning. For example, in the cognitive domain, learning begins with remembering, then understanding, applying, analyzing, evaluating, and lastly creating. 

Taxonomy of Significant Learning

The Taxonomy of Significant Learning is a non-hierarchical and integral model of learning. It describes learning as a meaningful, holistic, and integral network. This model has six intersecting domains: knowledge, application, integration, human dimension, caring, and learning how to learn. 

See our resource on Learning Taxonomies and Verbs for a summary of these two learning taxonomies.

How to write learning outcomes

Writing learning outcomes can be made easier by using the ABCD approach. This strategy identifies four key elements of an effective learning outcome: audience, behavior, condition, and degree.

Consider the following example: Students (audience) will be able to label and describe (behavior), given a diagram of the eye at the end of this lesson (condition), all seven extraocular muscles and at least two of their actions (degree).

Audience 

Define who will achieve the outcome. Outcomes commonly include phrases such as “After completing this course, students will be able to...” or “After completing this activity, workshop participants will be able to...”

Keeping your audience in mind as you develop your learning outcomes helps ensure that they are relevant and centered on what learners must achieve. Make sure the learning outcome is focused on the student’s behavior, not the instructor’s. If the outcome describes an instructional activity or topic, then it is too focused on the instructor’s intentions and not the students’.

Try to understand your audience so that you can better align your learning goals or objectives to meet their needs. While every group of students is different, certain generalizations about their prior knowledge, goals, motivation, and so on might be made based on course prerequisites, their year-level, or majors. 

Behavior

Use action verbs to describe observable behavior that demonstrates mastery of the goal or objective. Depending on the skill, knowledge, or domain of the behavior, you might select a different action verb. Particularly for learning objectives, which are more specific, avoid verbs that are vague or difficult to assess, such as “understand”, “appreciate”, or “know”.

The behavior usually completes the audience phrase “students will be able to…” with a specific action verb that learners can interpret without ambiguity. We recommend beginning learning goals with a phrase that makes it clear that students are expected to actively contribute to progressing towards a learning goal. For example, “through active engagement and completion of course activities, students will be able to…”

Example action verbs

Consider the following examples of verbs from different learning domains of Bloom’s Taxonomy. Generally speaking, items listed at the top under each domain are more suitable for advanced students, and items listed at the bottom are more suitable for novice or beginning students. Using verbs and associated skills from all three domains, regardless of your discipline area, can benefit students by diversifying the learning experience.

For the cognitive domain:

  • Create, investigate, design
  • Evaluate, argue, support
  • Analyze, compare, examine
  • Solve, operate, demonstrate
  • Describe, locate, translate
  • Remember, define, duplicate, list

For the psychomotor domain:

  • Invent, create, manage
  • Articulate, construct, solve
  • Complete, calibrate, control
  • Build, perform, execute
  • Copy, repeat, follow

For the affective domain:

  • Internalize, propose, conclude
  • Organize, systematize, integrate
  • Justify, share, persuade
  • Respond, contribute, cooperate
  • Capture, pursue, consume

Often we develop broad goals first, then break them down into specific objectives. For example, if a goal is for learners to be able to compose an essay, break it down into several objectives, such as forming a clear thesis statement, coherently ordering points, following a salient argument, gathering and quoting evidence effectively, and so on.

Condition

State the conditions, if any, under which the behavior is to be performed. Consider the following conditions:

  • Equipment or tools, such as using a laboratory device or a specified software application.
  • Situation or environment, such as in a clinical setting, or during a performance.
  • Materials or format, such as written text, a slide presentation, or using specified materials.

The level of specificity for conditions within an objective may vary and should be appropriate to the broader goals. If the conditions are implicit or understood as part of the classroom or assessment situation, it may not be necessary to state them. 

When articulating the conditions in learning outcomes, ensure that they are sensorily and financially accessible to all students.

Degree 

Degree states the standard or criterion for acceptable performance. The degree should be related to real-world expectations: what standard should the learner meet to be judged proficient? For example:

  • With 90% accuracy
  • Within 10 minutes
  • Suitable for submission to an edited journal
  • Obtain a valid solution
  • In a 100-word paragraph

The specificity of the degree will vary. You might take into consideration professional standards, what a student would need to succeed in subsequent courses in a series, or what is required by you as the instructor to accurately assess learning when determining the degree. Where the degree is easy to measure (such as pass or fail) or accuracy is not required, it may be omitted.

Characteristics of effective learning outcomes

The acronym SMART is useful for remembering the characteristics of an effective learning outcome.

  • Specific : clear and distinct from others.
  • Measurable : identifies observable student action.
  • Attainable : suitably challenging for students in the course.
  • Related : connected to other objectives and student interests.
  • Time-bound : likely to be achieved and keep students on task within the given time frame.

Examples of effective learning outcomes

These examples generally follow the ABCD and SMART guidelines. 

Arts and Humanities

Learning goals

Upon completion of this course, students will be able to apply critical terms and methodology in completing a written literary analysis of a selected literary work.

At the end of the course, students will be able to demonstrate oral competence with the French language in pronunciation, vocabulary, and language fluency in a 10-minute in-person interview with a member of the teaching team.

Learning objectives

After completing lessons 1 through 5, given images of specific works of art, students will be able to identify the artist, artistic period, and describe their historical, social, and philosophical contexts in a two-page written essay.

By the end of this course, students will be able to describe the steps in planning a research study, including identifying and formulating relevant theories, generating alternative solutions and strategies, and applying them to a hypothetical case in a written research proposal.

At the end of this lesson, given a diagram of the eye, students will be able to label all of the extraocular muscles and describe at least two of their actions.

Using chemical datasets gathered at the end of the first lab unit, students will be able to create plots and trend lines of that data in Excel and make quantitative predictions about future experiments.

  • How to Write Learning Goals, Evaluation and Research, Student Affairs (2021).
  • SMART Guidelines, Center for Teaching and Learning (2020).
  • Learning Taxonomies and Verbs, Center for Teaching and Learning (2021).

  • Review Article
  • Open access
  • Published: 11 January 2023

The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature

  • Enwei Xu (ORCID: orcid.org/0000-0001-6424-8169)
  • Wei Wang
  • Qingxia Wang

Humanities and Social Sciences Communications, volume 10, Article number: 16 (2023)

  • Science, technology and society

Collaborative problem-solving has been widely embraced in classroom instruction on critical thinking, which is regarded as the core of curriculum reform based on key competencies in the field of education, as well as a key competence for learners in the 21st century. However, the effectiveness of collaborative problem-solving in promoting students’ critical thinking remains uncertain. This research presents the major findings of a meta-analysis of 36 pieces of literature published in worldwide educational periodicals during the 21st century, conducted to identify the effectiveness of collaborative problem-solving in promoting students’ critical thinking and to determine, based on evidence, whether and to what extent collaborative problem-solving can result in a rise or decrease in critical thinking. The findings show that (1) collaborative problem-solving is an effective teaching approach for fostering students’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]); (2) with respect to the dimensions of critical thinking, collaborative problem-solving can significantly and successfully enhance students’ attitudinal tendencies (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas it falls short in improving students’ cognitive skills, having only an upper-middle impact (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]); and (3) the teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) all have an impact on critical thinking and can be viewed as important moderating factors that affect how critical thinking develops. On the basis of these results, recommendations are made for further study and instruction to better support students’ critical thinking in the context of collaborative problem-solving.

Introduction

Although critical thinking has a long history in research, the concept of critical thinking, which is regarded as an essential competence for learners in the 21st century, has recently attracted more attention from researchers and teaching practitioners (National Research Council, 2012 ). Critical thinking should be the core of curriculum reform based on key competencies in the field of education (Peng and Deng, 2017 ) because students with critical thinking can not only understand the meaning of knowledge but also effectively solve practical problems in real life even after knowledge is forgotten (Kek and Huijser, 2011 ). The definition of critical thinking is not universal (Ennis, 1989 ; Castle, 2009 ; Niu et al., 2013 ). In general, the definition of critical thinking is a self-aware and self-regulated thought process (Facione, 1990 ; Niu et al., 2013 ). It refers to the cognitive skills needed to interpret, analyze, synthesize, reason, and evaluate information as well as the attitudinal tendency to apply these abilities (Halpern, 2001 ). The view that critical thinking can be taught and learned through curriculum teaching has been widely supported by many researchers (e.g., Kuncel, 2011 ; Leng and Lu, 2020 ), leading to educators’ efforts to foster it among students. In the field of teaching practice, there are three types of courses for teaching critical thinking (Ennis, 1989 ). The first is an independent curriculum in which critical thinking is taught and cultivated without involving the knowledge of specific disciplines; the second is an integrated curriculum in which critical thinking is integrated into the teaching of other disciplines as a clear teaching goal; and the third is a mixed curriculum in which critical thinking is taught in parallel to the teaching of other disciplines for mixed teaching training. Furthermore, numerous measuring tools have been developed by researchers and educators to measure critical thinking in the context of teaching practice. These include standardized measurement tools, such as WGCTA, CCTST, CCTT, and CCTDI, which have been verified by repeated experiments and are considered effective and reliable by international scholars (Facione and Facione, 1992 ). In short, descriptions of critical thinking, including its two dimensions of attitudinal tendency and cognitive skills, different types of teaching courses, and standardized measurement tools provide a complex normative framework for understanding, teaching, and evaluating critical thinking.

Cultivating critical thinking in curriculum teaching can start with a problem, and one of the most popular critical thinking instructional approaches is problem-based learning (Liu et al., 2020 ). Duch et al. ( 2001 ) noted that problem-based learning in group collaboration is progressive active learning, which can improve students’ critical thinking and problem-solving skills. Collaborative problem-solving is the organic integration of collaborative learning and problem-based learning, which takes learners as the center of the learning process and uses problems with poor structure in real-world situations as the starting point for the learning process (Liang et al., 2017 ). Students learn the knowledge needed to solve problems in a collaborative group, reach a consensus on problems in the field, and form solutions through social cooperation methods, such as dialogue, interpretation, questioning, debate, negotiation, and reflection, thus promoting the development of learners’ domain knowledge and critical thinking (Cindy, 2004 ; Liang et al., 2017 ).

Collaborative problem-solving has been widely used in the teaching practice of critical thinking, and several studies have attempted to conduct a systematic review and meta-analysis of the empirical literature on critical thinking from various perspectives. However, little attention has been paid to the impact of collaborative problem-solving on critical thinking. Therefore, the best approach for developing and enhancing critical thinking throughout collaborative problem-solving is to examine how to implement critical thinking instruction; however, this issue is still unexplored, which means that many teachers are incapable of better instructing critical thinking (Leng and Lu, 2020 ; Niu et al., 2013 ). For example, Huber ( 2016 ) provided the meta-analysis findings of 71 publications on gaining critical thinking over various time frames in college with the aim of determining whether critical thinking was truly teachable. These authors found that learners significantly improve their critical thinking while in college and that critical thinking differs with factors such as teaching strategies, intervention duration, subject area, and teaching type. The usefulness of collaborative problem-solving in fostering students’ critical thinking, however, was not determined by this study, nor did it reveal whether there existed significant variations among the different elements. A meta-analysis of 31 pieces of educational literature was conducted by Liu et al. ( 2020 ) to assess the impact of problem-solving on college students’ critical thinking. These authors found that problem-solving could promote the development of critical thinking among college students and proposed establishing a reasonable group structure for problem-solving in a follow-up study to improve students’ critical thinking. Additionally, previous empirical studies have reached inconclusive and even contradictory conclusions about whether and to what extent collaborative problem-solving increases or decreases critical thinking levels. As an illustration, Yang et al. ( 2008 ) carried out an experiment on the integrated curriculum teaching of college students based on a web bulletin board with the goal of fostering participants’ critical thinking in the context of collaborative problem-solving. These authors’ research revealed that through sharing, debating, examining, and reflecting on various experiences and ideas, collaborative problem-solving can considerably enhance students’ critical thinking in real-life problem situations. In contrast, collaborative problem-solving had a positive impact on learners’ interaction and could improve learning interest and motivation but could not significantly improve students’ critical thinking when compared to traditional classroom teaching, according to research by Naber and Wyatt ( 2014 ) and Sendag and Odabasi ( 2009 ) on undergraduate and high school students, respectively.

The above studies show that there is inconsistency regarding the effectiveness of collaborative problem-solving in promoting students’ critical thinking. Therefore, it is essential to conduct a thorough and trustworthy review to detect and decide whether and to what degree collaborative problem-solving can result in a rise or decrease in critical thinking. Meta-analysis is a quantitative analysis approach that is utilized to examine quantitative data from various separate studies that are all focused on the same research topic. This approach characterizes the effectiveness of its impact by averaging the effect sizes of numerous qualitative studies in an effort to reduce the uncertainty brought on by independent research and produce more conclusive findings (Lipsey and Wilson, 2001 ).

This paper used a meta-analytic approach and carried out a meta-analysis to examine the effectiveness of collaborative problem-solving in promoting students’ critical thinking in order to make a contribution to both research and practice. The following research questions were addressed by this meta-analysis:

What is the overall effect size of collaborative problem-solving in promoting students’ critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills)?

If the impacts of the various experimental designs in the included studies are heterogeneous, how do the different moderating variables account for the disparities between the study conclusions?

This research followed the strict procedures (e.g., database searching, identification, screening, eligibility, merging, duplicate removal, and analysis of included studies) of Cooper’s ( 2010 ) proposed meta-analysis approach for examining quantitative data from various separate studies that are all focused on the same research topic. The relevant empirical research that appeared in worldwide educational periodicals within the 21st century was subjected to this meta-analysis using Rev-Man 5.4. The consistency of the data extracted separately by two researchers was tested using Cohen’s kappa coefficient, and a publication bias test and a heterogeneity test were run on the sample data to ascertain the quality of this meta-analysis.
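As a concrete illustration of the consistency check mentioned above, the following minimal Python sketch computes Cohen’s kappa for two coders’ categorical decisions; the include/exclude labels and the ten records are hypothetical and are not taken from this study.

from collections import Counter

def cohens_kappa(coder_a, coder_b):
    # Chance-corrected agreement between two coders over the same items.
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions for ten screened records.
coder_a = ["include", "exclude", "include", "include", "exclude",
           "include", "exclude", "exclude", "include", "include"]
coder_b = ["include", "exclude", "include", "exclude", "exclude",
           "include", "exclude", "exclude", "include", "include"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # 0.8 for this toy data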

Data sources and search strategies

There were three stages to the data collection process for this meta-analysis, as shown in Fig. 1 , which shows the number of articles included and eliminated during the selection process based on the statement and study eligibility criteria.

Figure 1. This flowchart shows the number of records identified, included, and excluded in the article.

First, the databases used to systematically search for relevant articles were the journal papers of the Web of Science Core Collection and the Chinese Core source journal, as well as the Chinese Social Science Citation Index (CSSCI) source journal papers included in CNKI. These databases were selected because they are credible platforms that are sources of scholarly and peer-reviewed information with advanced search tools and contain literature relevant to the subject of our topic from reliable researchers and experts. The search string with the Boolean operator used in the Web of Science was “TS = (((“critical thinking” or “ct” and “pretest” or “posttest”) or (“critical thinking” or “ct” and “control group” or “quasi experiment” or “experiment”)) and (“collaboration” or “collaborative learning” or “CSCL”) and (“problem solving” or “problem-based learning” or “PBL”))”. The research area was “Education Educational Research”, and the search period was “January 1, 2000, to December 30, 2021”. A total of 412 papers were obtained. The search string with the Boolean operator used in the CNKI was “SU = (‘critical thinking’*‘collaboration’ + ‘critical thinking’*‘collaborative learning’ + ‘critical thinking’*‘CSCL’ + ‘critical thinking’*‘problem solving’ + ‘critical thinking’*‘problem-based learning’ + ‘critical thinking’*‘PBL’ + ‘critical thinking’*‘problem oriented’) AND FT = (‘experiment’ + ‘quasi experiment’ + ‘pretest’ + ‘posttest’ + ‘empirical study’)” (translated into Chinese when searching). A total of 56 studies were found throughout the search period of “January 2000 to December 2021”. From the databases, all duplicates and retractions were eliminated before exporting the references into Endnote, a program for managing bibliographic references. In all, 466 studies were found.

Second, the studies that matched the inclusion and exclusion criteria for the meta-analysis were chosen by two researchers after they had reviewed the abstracts and titles of the gathered articles, yielding a total of 126 studies.

Third, two researchers thoroughly reviewed each included article’s whole text in accordance with the inclusion and exclusion criteria. Meanwhile, a snowball search was performed using the references and citations of the included articles to ensure complete coverage of the articles. Ultimately, 36 articles were kept.

Two researchers worked together to carry out this entire process, and a consensus rate of almost 94.7% was reached after discussion and negotiation to clarify any emerging differences.

Eligibility criteria

Since not all the retrieved studies matched the criteria for this meta-analysis, eligibility criteria for both inclusion and exclusion were developed as follows:

The publication language of the included studies was limited to English and Chinese, and the full text could be obtained. Articles that did not meet the publication language and articles not published between 2000 and 2021 were excluded.

The research design of the included studies must be empirical and quantitative studies that can assess the effect of collaborative problem-solving on the development of critical thinking. Articles that could not identify the causal mechanisms by which collaborative problem-solving affects critical thinking, such as review articles and theoretical articles, were excluded.

The research method of the included studies must feature a randomized control experiment or a quasi-experiment, or a natural experiment, which have a higher degree of internal validity with strong experimental designs and can all plausibly provide evidence that critical thinking and collaborative problem-solving are causally related. Articles with non-experimental research methods, such as purely correlational or observational studies, were excluded.

The participants of the included studies were only students in school, including K-12 students and college students. Articles in which the participants were non-school students, such as social workers or adult learners, were excluded.

The research results of the included studies must mention definite signs that may be utilized to gauge critical thinking’s impact (e.g., sample size, mean value, or standard deviation). Articles that lacked specific measurement indicators for critical thinking and could not calculate the effect size were excluded.

Data coding design

In order to perform a meta-analysis, it is necessary to collect the most important information from the articles, codify that information’s properties, and convert descriptive data into quantitative data. Therefore, this study designed a data coding template (see Table 1 ). Ultimately, 16 coding fields were retained.

The designed data-coding template consisted of three pieces of information. Basic information about the papers was included in the descriptive information: the publishing year, author, serial number, and title of the paper.

The variable information for the experimental design had three variables: the independent variable (instruction method), the dependent variable (critical thinking), and the moderating variable (learning stage, teaching type, intervention duration, learning scaffold, group size, measuring tool, and subject area). Depending on the topic of this study, the intervention strategy, as the independent variable, was coded into collaborative and non-collaborative problem-solving. The dependent variable, critical thinking, was coded as a cognitive skill and an attitudinal tendency. And seven moderating variables were created by grouping and combining the experimental design variables discovered within the 36 studies (see Table 1 ), where learning stages were encoded as higher education, high school, middle school, and primary school or lower; teaching types were encoded as mixed courses, integrated courses, and independent courses; intervention durations were encoded as 0–1 weeks, 1–4 weeks, 4–12 weeks, and more than 12 weeks; group sizes were encoded as 2–3 persons, 4–6 persons, 7–10 persons, and more than 10 persons; learning scaffolds were encoded as teacher-supported learning scaffold, technique-supported learning scaffold, and resource-supported learning scaffold; measuring tools were encoded as standardized measurement tools (e.g., WGCTA, CCTT, CCTST, and CCTDI) and self-adapting measurement tools (e.g., modified or made by researchers); and subject areas were encoded according to the specific subjects used in the 36 included studies.

The data information contained three metrics for measuring critical thinking: sample size, average value, and standard deviation. It is vital to remember that studies with various experimental designs frequently adopt different formulas to determine the effect size; this paper used the standardized mean difference (SMD) formula proposed by Morris (2008, p. 369; see Supplementary Table S3).
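For reference, a commonly cited form of Morris’s (2008) standardized mean difference for pretest–posttest–control designs is sketched below; whether this is the exact variant the authors applied (their formula appears in Supplementary Table S3) is an assumption. T and C denote the treatment and control groups.

\[
d_{ppc} = c_p \,\frac{(\bar{X}_{\mathrm{post},T}-\bar{X}_{\mathrm{pre},T})-(\bar{X}_{\mathrm{post},C}-\bar{X}_{\mathrm{pre},C})}{SD_{\mathrm{pre,pooled}}},
\qquad
c_p = 1-\frac{3}{4\,(n_T+n_C-2)-1},
\]
\[
SD_{\mathrm{pre,pooled}} = \sqrt{\frac{(n_T-1)\,SD_{\mathrm{pre},T}^{2}+(n_C-1)\,SD_{\mathrm{pre},C}^{2}}{n_T+n_C-2}}.
\]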

Procedure for extracting and coding data

According to the data coding template (see Table 1 ), the 36 papers’ information was retrieved by two researchers, who then entered them into Excel (see Supplementary Table S1 ). The results of each study were extracted separately in the data extraction procedure if an article contained numerous studies on critical thinking, or if a study assessed different critical thinking dimensions. For instance, Tiwari et al. ( 2010 ) used four time points, which were viewed as numerous different studies, to examine the outcomes of critical thinking, and Chen ( 2013 ) included the two outcome variables of attitudinal tendency and cognitive skills, which were regarded as two studies. After discussion and negotiation during data extraction, the two researchers’ consistency test coefficients were roughly 93.27%. Supplementary Table S2 details the key characteristics of the 36 included articles with 79 effect quantities, including descriptive information (e.g., the publishing year, author, serial number, and title of the paper), variable information (e.g., independent variables, dependent variables, and moderating variables), and data information (e.g., mean values, standard deviations, and sample size). Following that, testing for publication bias and heterogeneity was done on the sample data using the Rev-Man 5.4 software, and then the test results were used to conduct a meta-analysis.
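To make the coding template more concrete, the sketch below shows how a single coded record might be represented in Python; the field names loosely follow the descriptive, variable, and data information described above, and the example values are entirely hypothetical.

from dataclasses import dataclass

@dataclass
class CodedStudy:
    # Descriptive information
    author: str
    year: int
    title: str
    # Moderating variables, using the category groupings described above
    learning_stage: str         # e.g., "higher education", "high school"
    teaching_type: str          # "mixed", "integrated", or "independent"
    intervention_duration: str  # "0-1 weeks", "1-4 weeks", "4-12 weeks", ">12 weeks"
    group_size: str             # "2-3", "4-6", "7-10", ">10"
    learning_scaffold: str      # "teacher", "technique", or "resource" supported
    measuring_tool: str         # "standardized" or "self-adapted"
    subject_area: str
    # Data information for the experimental (E) and control (C) groups
    n_e: int
    mean_e: float
    sd_e: float
    n_c: int
    mean_c: float
    sd_c: float

# A purely hypothetical record, not taken from the 36 included studies.
record = CodedStudy(
    author="Hypothetical Author", year=2015, title="Example study",
    learning_stage="higher education", teaching_type="integrated",
    intervention_duration="4-12 weeks", group_size="4-6",
    learning_scaffold="teacher", measuring_tool="standardized",
    subject_area="science",
    n_e=42, mean_e=3.9, sd_e=0.52, n_c=40, mean_c=3.4, sd_c=0.61,
)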

Publication bias test

Publication bias arises when the sample of studies included in a meta-analysis does not accurately reflect the general status of research on the relevant subject, and it can affect the reliability and accuracy of the meta-analysis. For this reason, the sample data need to be checked for publication bias (Stewart et al., 2006). A popular method for this check is the funnel plot: publication bias is unlikely when the data points are evenly dispersed on either side of the average effect size and concentrated in the upper region of the plot. In the funnel plot for this analysis (see Fig. 2), the data are evenly dispersed in the upper portion of the funnel, indicating that publication bias is unlikely in this situation.

Figure 2. This funnel plot shows the result of the publication bias test of the 79 effect quantities across the 36 studies.
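A funnel plot of this kind can be reproduced with a few lines of Python; the sketch below uses simulated effect sizes and standard errors (the mean of 0.82 is borrowed from the reported overall effect, everything else is made up) rather than the study data.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Simulated standard errors and SMDs scattered around an assumed mean effect.
se = rng.uniform(0.05, 0.45, size=79)
d = 0.82 + rng.normal(0.0, se)

mean_d = np.average(d, weights=1.0 / se ** 2)   # inverse-variance weighted mean

fig, ax = plt.subplots()
ax.scatter(d, se, s=12)
ax.axvline(mean_d, linestyle="--")
# Pseudo 95% limits that widen as the standard error grows.
se_line = np.linspace(0.0, se.max(), 100)
ax.plot(mean_d - 1.96 * se_line, se_line, color="grey")
ax.plot(mean_d + 1.96 * se_line, se_line, color="grey")
ax.invert_yaxis()                               # most precise studies at the top
ax.set_xlabel("Standardized mean difference")
ax.set_ylabel("Standard error")
plt.show()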

Heterogeneity test

To select the appropriate effect model for the meta-analysis, one can use the results of a heterogeneity test on the effect sizes. In a meta-analysis, it is common practice to gauge the degree of data heterogeneity using the I² value, and I² ≥ 50% is typically understood to denote medium-to-high heterogeneity, which calls for the adoption of a random effect model; otherwise, a fixed effect model ought to be applied (Lipsey and Wilson, 2001). The findings of the heterogeneity test in this paper (see Table 2) revealed that I² was 86% and displayed significant heterogeneity (P < 0.01). To ensure accuracy and reliability, the overall effect size was therefore calculated using the random effect model.
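The heterogeneity statistics reported here can be illustrated with a short Python sketch that computes Cochran’s Q and Higgins’ I² from a set of effect sizes and variances; the five effect sizes below are hypothetical and only show how the model choice would follow from I².

import numpy as np

def heterogeneity(d, var):
    # Cochran's Q and Higgins' I^2 under fixed-effect (inverse-variance) weights.
    w = 1.0 / var
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)
    df = len(d) - 1
    i2 = max(0.0, (q - df) / q) * 100.0   # share of variance beyond sampling error
    return q, i2

# Hypothetical effect sizes (SMD) and variances, for illustration only.
d = np.array([0.5, 0.9, 1.4, 0.3, 0.7])
var = np.array([0.04, 0.06, 0.05, 0.03, 0.07])
q, i2 = heterogeneity(d, var)
model = "random effect" if i2 >= 50 else "fixed effect"
print(f"Q = {q:.2f}, I^2 = {i2:.0f}% -> use a {model} model")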

The analysis of the overall effect size

This meta-analysis utilized a random effect model to examine the 79 effect quantities from the 36 studies after accounting for heterogeneity. In accordance with Cohen’s criterion (Cohen, 1992), the analysis results, shown in the forest plot of the overall effect (see Fig. 3), make clear that the overall effect size of collaborative problem-solving is 0.82, which is statistically significant (z = 12.78, P < 0.01, 95% CI [0.69, 0.95]) and indicates that it can encourage learners to practice critical thinking.

Figure 3. This forest plot shows the analysis result of the overall effect size across the 36 studies.
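The pooled effect, z value, and confidence interval under a random effect model can be obtained with a DerSimonian–Laird estimator, one common way to fit such a model; the sketch below uses the same hypothetical effect sizes as above, so its output will not match the reported ES = 0.82.

import numpy as np
from scipy import stats

def random_effects_pool(d, var):
    # DerSimonian-Laird pooled effect with a 95% CI and z test.
    w = 1.0 / var
    d_fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - d_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)      # between-study variance
    w_star = 1.0 / (var + tau2)
    pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    z = pooled / se
    p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
    return pooled, z, p, (pooled - 1.96 * se, pooled + 1.96 * se)

d = np.array([0.5, 0.9, 1.4, 0.3, 0.7])          # hypothetical SMDs
var = np.array([0.04, 0.06, 0.05, 0.03, 0.07])   # hypothetical variances
es, z, p, ci = random_effects_pool(d, var)
print(f"ES = {es:.2f}, z = {z:.2f}, p = {p:.3f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")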

In addition, this study examined two distinct dimensions of critical thinking to better understand the precise contributions that collaborative problem-solving makes to the growth of critical thinking. The findings (see Table 3) indicate that collaborative problem-solving improves both cognitive skills (ES = 0.70) and attitudinal tendency (ES = 1.17), with significant intergroup differences (χ² = 7.95, P < 0.01). Although collaborative problem-solving improves both dimensions of critical thinking, it is essential to point out that the improvement in students’ attitudinal tendency is much more pronounced, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]), whereas the gains in learners’ cognitive skills are more modest and just above average (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).

The analysis of moderator effect size

The whole forest plot’s 79 effect quantities underwent a two-tailed test, which revealed significant heterogeneity (I² = 86%, z = 12.78, P < 0.01), indicating differences between effect sizes that may have been influenced by moderating factors other than sampling error. Therefore, subgroup analysis was used to explore the possible moderating factors that might produce this considerable heterogeneity, namely the learning stage, learning scaffold, teaching type, group size, intervention duration, measuring tool, and subject area included in the 36 experimental designs, in order to further identify the key factors that influence critical thinking. The findings (see Table 4) indicate that the various moderating factors have advantageous effects on critical thinking. In this situation, the subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), learning scaffold (χ² = 9.03, P < 0.01), and teaching type (χ² = 7.20, P < 0.05) are all significant moderators that can be applied to support the cultivation of critical thinking. However, since the learning stage and the measuring tools did not show significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05, and χ² = 0.08, P = 0.78 > 0.05), we are unable to explain why these two factors are crucial in supporting the cultivation of critical thinking in the context of collaborative problem-solving. The precise outcomes are as follows:

Various learning stages influenced critical thinking positively, without significant intergroup differences (χ² = 3.15, P = 0.21 > 0.05). High school was first on the list of effect sizes (ES = 1.36, P < 0.01), followed by higher education (ES = 0.78, P < 0.01) and middle school (ES = 0.73, P < 0.01). These results show that, despite the learning stage’s beneficial influence on cultivating learners’ critical thinking, we are unable to explain why it is essential for cultivating critical thinking in the context of collaborative problem-solving.

Different teaching types had varying degrees of positive impact on critical thinking, with significant intergroup differences (χ² = 7.20, P < 0.05). The effect sizes were ranked as follows: mixed courses (ES = 1.34, P < 0.01), integrated courses (ES = 0.81, P < 0.01), and independent courses (ES = 0.27, P < 0.01). These results indicate that the most effective way to cultivate critical thinking through collaborative problem-solving is the mixed-course teaching type.

Various intervention durations significantly improved critical thinking, with significant intergroup differences (χ² = 12.18, P < 0.01). The effect sizes for this variable tended to increase with longer intervention durations, and the improvement in critical thinking reached a significant level (ES = 0.85, P < 0.01) after more than 12 weeks of training. These findings indicate that intervention duration and the impact on critical thinking are positively correlated, with longer interventions having greater effects.

Different learning scaffolds influenced critical thinking positively, with significant intergroup differences (χ² = 9.03, P < 0.01). The resource-supported learning scaffold (ES = 0.69, P < 0.01) and the technique-supported learning scaffold (ES = 0.63, P < 0.01) attained medium-to-high levels of impact, while the teacher-supported learning scaffold (ES = 0.92, P < 0.01) displayed a high level of significant impact. These results show that the teacher-supported learning scaffold has the greatest impact on cultivating critical thinking.

Various group sizes influenced critical thinking positively, and the intergroup differences were statistically significant (χ² = 8.77, P < 0.05). Critical thinking showed a general declining trend with increasing group size. The overall effect size for groups of 2–3 people was the largest (ES = 0.99, P < 0.01), and when the group size was greater than 7 people, the improvement in critical thinking fell to the lower-middle level (ES < 0.5, P < 0.01). These results show that the impact on critical thinking is negatively associated with group size: as the group grows, the overall effect declines.

Various measuring tools influenced critical thinking positively, without significant intergroup differences (χ² = 0.08, P = 0.78 > 0.05). In this situation, the self-adapted measurement tools obtained an upper-medium level of effect (ES = 0.78), whereas the overall effect size of the standardized measurement tools was the largest, reaching a significant level of effect (ES = 0.84, P < 0.01). These results show that, despite the beneficial influence of the measuring tool on cultivating critical thinking, we are unable to explain why it is crucial for fostering the growth of critical thinking through collaborative problem-solving.

Different subject areas had varying impacts on critical thinking, and the intergroup differences were statistically significant (χ² = 13.36, P < 0.05). Mathematics had the greatest overall impact, reaching a significant level of effect (ES = 1.68, P < 0.01), followed by science (ES = 1.25, P < 0.01) and medical science (ES = 0.87, P < 0.01), both of which also reached significant levels. Programming technology was the least effective (ES = 0.39, P < 0.01), having only a medium-low degree of effect compared with education (ES = 0.72, P < 0.01) and other fields such as language, art, and the social sciences (ES = 0.58, P < 0.01). These results suggest that scientific fields (e.g., mathematics, science) may be the most effective subject areas for cultivating critical thinking through collaborative problem-solving.
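The intergroup chi-square values reported in this subgroup analysis are typically computed as a between-subgroup heterogeneity statistic (Q_between); the following Python sketch shows one common form of that test with a hypothetical two-level moderator, and is not a reproduction of the authors’ computation.

import numpy as np

def q_between(subgroups):
    # Q_between: heterogeneity of pooled subgroup effects around the grand mean.
    # 'subgroups' maps a label to a tuple of (effect sizes, variances).
    pooled, weights = [], []
    for d, var in subgroups.values():
        w = 1.0 / np.asarray(var)
        pooled.append(np.sum(w * np.asarray(d)) / np.sum(w))
        weights.append(np.sum(w))
    pooled, weights = np.array(pooled), np.array(weights)
    grand = np.sum(weights * pooled) / np.sum(weights)
    return np.sum(weights * (pooled - grand) ** 2)   # chi-square with k-1 df

# Hypothetical moderator with two levels (values are illustrative only).
groups = {
    "mixed courses": ([1.1, 1.4, 1.3], [0.05, 0.06, 0.04]),
    "independent courses": ([0.2, 0.4, 0.3], [0.05, 0.04, 0.06]),
}
print(round(q_between(groups), 2))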

The effectiveness of collaborative problem solving with regard to teaching critical thinking

According to this meta-analysis, using collaborative problem-solving as an intervention strategy in critical thinking teaching has a considerable impact on cultivating learners’ critical thinking as a whole and has a favorable promotional effect on both dimensions of critical thinking. Several studies have reported that collaborative problem-solving, the most frequently used critical thinking teaching strategy in curriculum instruction, can considerably enhance students’ critical thinking (e.g., Liang et al., 2017; Liu et al., 2020; Cindy, 2004). This meta-analysis provides convergent data support for these views. Thus, the findings not only address the first research question regarding the overall effect of collaborative problem-solving on critical thinking and its impact on the two dimensions of critical thinking (i.e., attitudinal tendency and cognitive skills), but also strengthen our confidence in cultivating critical thinking through the collaborative problem-solving intervention approach in the context of classroom teaching.

Furthermore, the associated improvements in attitudinal tendency are much stronger, whereas the corresponding improvements in cognitive skill are only marginally better. According to certain studies, cognitive skill differs from attitudinal tendency in classroom instruction: the cultivation and development of the former, as a key ability, is a process of gradual accumulation, while the latter, as an attitude, is affected by the context of the teaching situation (e.g., a novel and exciting teaching approach, challenging and rewarding tasks) (Halpern, 2001; Wei and Hong, 2022). Collaborative problem-solving as a teaching approach is exciting and interesting, as well as rewarding and challenging, because it places learners at the center and examines ill-structured problems in real situations, and it can inspire students to fully realize their potential for problem-solving, which significantly improves their attitudinal tendency toward solving problems (Liu et al., 2020). Just as collaborative problem-solving influences attitudinal tendency, attitudinal tendency in turn affects cognitive skill when attempting to solve a problem (Liu et al., 2020; Zhang et al., 2022), and stronger attitudinal tendencies are associated with improved learning achievement and cognitive ability in students (Sison, 2008; Zhang et al., 2022). It can be seen that the two specific dimensions of critical thinking, as well as critical thinking as a whole, are affected by collaborative problem-solving, and this study illuminates the nuanced links between cognitive skills and attitudinal tendencies across these two dimensions. To fully develop students’ capacity for critical thinking, future empirical research should pay closer attention to cognitive skills.

The moderating effects of collaborative problem solving with regard to teaching critical thinking

In order to further explore the key factors that influence critical thinking, subgroup analysis was used to examine the possible moderating effects that might produce the considerable heterogeneity. The findings show that the moderating factors, namely the teaching type, learning stage, group size, learning scaffold, intervention duration, measuring tool, and subject area included in the 36 experimental designs, could all support the cultivation of critical thinking through collaborative problem-solving. Among them, the effect size differences between levels of the learning stage and the measuring tool are not significant, which does not explain why these two factors are crucial in supporting the cultivation of critical thinking utilizing the approach of collaborative problem-solving.

In terms of the learning stage, various learning stages influenced critical thinking positively without significant intergroup differences, indicating that we are unable to explain why it is crucial in fostering the growth of critical thinking.

Although higher education accounts for 70.89% of all the included empirical studies, high school may be the appropriate learning stage at which to foster students’ critical thinking by utilizing the approach of collaborative problem-solving, since it has the largest overall effect size. This phenomenon may be related to students’ cognitive development, which needs to be further studied in follow-up research.

With regard to teaching type, mixed course teaching may be the best teaching method to cultivate students’ critical thinking. Relevant studies have shown that, in the actual teaching process, if students are trained in thinking methods alone, the methods they learn are isolated and divorced from subject knowledge, which is not conducive to the transfer of thinking methods; conversely, if students’ thinking is trained only in subject teaching without systematic method training, it is challenging to apply to real-world circumstances (Ruggiero, 2012; Hu and Liu, 2015). Teaching critical thinking as a mixed course in parallel to other subject teaching can achieve the best effect on learners’ critical thinking, and explicit critical thinking instruction is more effective than less explicit critical thinking instruction (Bensley and Spero, 2014).

In terms of the intervention duration, the overall effect size shows an upward tendency with longer intervention times; thus, intervention duration and the impact on critical thinking are positively correlated. Critical thinking, as a key competency for students in the 21st century, is difficult to improve meaningfully within a brief intervention; instead, it must be developed over a lengthy period of time through consistent teaching and the progressive accumulation of knowledge (Halpern, 2001; Hu and Liu, 2015). Therefore, future empirical studies ought to take these restrictions into account and plan for a longer period of critical thinking instruction.

With regard to group size, a group size of 2–3 persons has the highest effect size, and the comprehensive effect size decreases with increasing group size in general. This outcome is in line with some research findings; for example, a group composed of two to four members is most appropriate for collaborative learning (Schellens and Valcke, 2006). However, the meta-analysis results also indicate that once the group size exceeds 7 people, smaller groups do not necessarily produce better interaction and performance than larger groups. This may be because learning scaffolds of technique support, resource support, and teacher support improve the frequency and effectiveness of interaction among group members, and a collaborative group with more members may increase the diversity of views, which is helpful for cultivating critical thinking utilizing the approach of collaborative problem-solving.

With regard to the learning scaffold, the three different kinds of learning scaffolds can all enhance critical thinking. Among them, the teacher-supported learning scaffold has the largest overall effect size, demonstrating the interdependence of effective learning scaffolds and collaborative problem-solving. This outcome is in line with some research findings; as an example, a successful strategy is to encourage learners to collaborate, come up with solutions, and develop critical thinking skills by using learning scaffolds (Reiser, 2004 ; Xu et al., 2022 ); learning scaffolds can lower task complexity and unpleasant feelings while also enticing students to engage in learning activities (Wood et al., 2006 ); learning scaffolds are designed to assist students in using learning approaches more successfully to adapt the collaborative problem-solving process, and the teacher-supported learning scaffolds have the greatest influence on critical thinking in this process because they are more targeted, informative, and timely (Xu et al., 2022 ).

With respect to the measuring tool, although standardized measurement tools (such as the WGCTA, CCTT, and CCTST) have been acknowledged as reliable and valid by experts worldwide, only 54.43% of the studies included in this meta-analysis adopted them for assessment, and the results indicated no intergroup differences. These results suggest that not all teaching circumstances are appropriate for measuring critical thinking with standardized tools. As Simpson and Courtney (2002, p. 91) note, “The measuring tools for measuring thinking ability have limits in assessing learners in educational situations and should be adapted appropriately to accurately assess the changes in learners’ critical thinking.” As a result, in order to gauge more fully and precisely how learners’ critical thinking has evolved, standardized measuring tools must be properly adapted to collaborative problem-solving learning contexts.

With regard to subject area, the comprehensive effect size for science disciplines (e.g., mathematics, science, and medical science) is larger than that for language arts and the social sciences. Some recent international education reforms have noted that critical thinking is a basic part of scientific literacy. Students with scientific literacy can justify their judgments according to accurate evidence and reasonable standards when they face challenges or poorly structured problems (Kyndt et al., 2013), which makes critical thinking crucial for developing scientific understanding and applying it to practical problems related to science, technology, and society (Yore et al., 2007).

Suggestions for critical thinking teaching

Beyond the points raised in the discussion above, the following suggestions are offered for critical thinking instruction using the approach of collaborative problem-solving.

First, teachers should place special emphasis on the two core elements, collaboration and problem-solving, and design real problems based on collaborative situations. This meta-analysis provides evidence that collaborative problem-solving has a strong synergistic effect on promoting students’ critical thinking. Asking questions about real situations and allowing learners to take part in critical discussions of real problems during class are key ways to teach critical thinking, rather than simply reading speculative articles without practice (Mulnix, 2012). Furthermore, improvement in students’ critical thinking is realized through cognitive conflict with other learners in the problem situation (Yang et al., 2008). Consequently, it is essential for teachers to emphasize these two core elements, design real problems, and encourage students to discuss, negotiate, and argue within collaborative problem-solving situations.

Second, teachers should design and implement mixed courses to cultivate learners’ critical thinking using the approach of collaborative problem-solving. Critical thinking can be taught through curriculum instruction (Kuncel, 2011; Leng and Lu, 2020), with the goal of cultivating learners’ critical thinking for flexible transfer and application in real problem-solving situations. This meta-analysis shows that mixed course teaching has a highly substantial impact on the cultivation and promotion of learners’ critical thinking. Therefore, in conventional teaching, teachers should design and implement mixed courses that combine real collaborative problem-solving situations with the knowledge content of specific disciplines, teach the methods and strategies of critical thinking through poorly structured problems, and provide practical activities in which students interact with one another to develop knowledge construction and critical thinking through collaborative problem-solving.

Third, teachers, particularly preservice teachers, should receive more training in critical thinking, and they should be conscious of the ways in which teacher-supported learning scaffolds can promote it. The learning scaffold supported by teachers had the greatest impact on learners’ critical thinking, in addition to being more directive, targeted, and timely (Wood et al., 2006). Critical thinking can only be taught effectively when teachers recognize its significance for students’ growth and use appropriate approaches when designing instructional activities (Forawi, 2016). Therefore, to enable teachers to create learning scaffolds that cultivate learners’ critical thinking through collaborative problem-solving, it is essential to concentrate on teacher-supported learning scaffolds and to strengthen the teaching of critical thinking to teachers, especially preservice teachers.

Implications and limitations

There are certain limitations in this meta-analysis that future research can address. First, the search languages were restricted to English and Chinese, so pertinent studies written in other languages may have been overlooked, limiting the number of articles reviewed. Second, some data from the included studies are missing, such as whether teachers were trained in the theory and practice of critical thinking, the average age and gender of learners, and the differences in critical thinking among learners of various ages and genders. Third, as is typical for review articles, additional studies were published while this meta-analysis was being conducted and therefore fall outside its time window. As relevant research develops, future studies focusing on these issues are highly relevant and needed.

Conclusions

This study addressed the question of how effective collaborative problem-solving is in promoting students’ critical thinking, a topic that had received scant attention in earlier research. The following conclusions can be made:

Regarding the results obtained, collaborative problem-solving is an effective teaching approach for fostering learners’ critical thinking, with a significant overall effect size (ES = 0.82, z = 12.78, P < 0.01, 95% CI [0.69, 0.95]). With respect to the dimensions of critical thinking, collaborative problem-solving significantly and effectively improves students’ attitudinal tendency, with a significant comprehensive effect (ES = 1.17, z = 7.62, P < 0.01, 95% CI [0.87, 1.47]); its effect on students’ cognitive skills is smaller, though still in the upper-middle range (ES = 0.70, z = 11.55, P < 0.01, 95% CI [0.58, 0.82]).
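
For readers who want to check the arithmetic, the reported z statistics and confidence intervals are mutually consistent under the usual normal approximation. The short sketch below is illustrative only (not the authors’ analysis code): it backs out the standard error from ES and z and reconstructs each 95% CI.

```python
# Illustrative only: reconstruct the 95% CI for each pooled effect size from the
# reported ES and z statistic, assuming the normal approximation z = ES / SE.

def ci_from_effect(es, z, crit=1.96):
    se = es / z                      # z = ES / SE  =>  SE = ES / z
    return es - crit * se, es + crit * se

for label, es, z in [("overall", 0.82, 12.78),
                     ("attitudinal tendency", 1.17, 7.62),
                     ("cognitive skills", 0.70, 11.55)]:
    lo, hi = ci_from_effect(es, z)
    print(f"{label}: ES = {es}, 95% CI [{lo:.2f}, {hi:.2f}]")
    # prints intervals matching those reported above, e.g. [0.69, 0.95]
```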

As demonstrated by both the results and the discussion, all seven moderating factors, examined across the 36 included studies, have varying degrees of beneficial effect on students’ critical thinking. Teaching type (χ² = 7.20, P < 0.05), intervention duration (χ² = 12.18, P < 0.01), subject area (χ² = 13.36, P < 0.05), group size (χ² = 8.77, P < 0.05), and learning scaffold (χ² = 9.03, P < 0.01) show significant intergroup differences and can be viewed as important moderating factors that affect how critical thinking develops. Because the learning stage (χ² = 3.15, P = 0.21 > 0.05) and measuring tool (χ² = 0.08, P = 0.78 > 0.05) did not demonstrate significant intergroup differences, we cannot determine from these data what role these two factors play in the cultivation of critical thinking in the context of collaborative problem-solving.
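
The subgroup p-values above come from comparing each between-group heterogeneity statistic against a chi-square distribution. The sketch below is illustrative only; the degrees of freedom are assumptions chosen to be consistent with the reported p-values, not figures taken from the paper.

```python
# Illustrative sketch: converting between-group heterogeneity statistics into
# p-values. The df values are assumptions for demonstration purposes.
from scipy.stats import chi2

moderators = {
    "teaching type":         (7.20, 2),
    "intervention duration": (12.18, 3),
    "learning stage":        (3.15, 2),
    "measuring tool":        (0.08, 1),
}

for name, (q, df) in moderators.items():
    p = chi2.sf(q, df)   # upper-tail probability P(X >= q)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{name}: chi2 = {q:.2f}, df = {df}, p = {p:.3f} ({verdict})")
```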

Data availability

All data generated or analyzed during this study are included within the article and its supplementary information files, which are available in the Dataverse repository: https://doi.org/10.7910/DVN/IPFJO6.

References

Bensley DA, Spero RA (2014) Improving critical thinking skills and meta-cognitive monitoring through direct infusion. Think Skills Creat 12:55–68. https://doi.org/10.1016/j.tsc.2014.02.001

Castle A (2009) Defining and assessing critical thinking skills for student radiographers. Radiography 15(1):70–76. https://doi.org/10.1016/j.radi.2007.10.007

Chen XD (2013) An empirical study on the influence of PBL teaching model on critical thinking ability of non-English majors. J PLA Foreign Lang College 36 (04):68–72

Cohen A (1992) Antecedents of organizational commitment across occupational groups: a meta-analysis. J Organ Behav. https://doi.org/10.1002/job.4030130602

Cooper H (2010) Research synthesis and meta-analysis: a step-by-step approach, 4th edn. Sage, London, England

Cindy HS (2004) Problem-based learning: what and how do students learn? Educ Psychol Rev 51(1):31–39

Duch BJ, Gron SD, Allen DE (2001) The power of problem-based learning: a practical “how to” for teaching undergraduate courses in any discipline. Stylus Educ Sci 2:190–198

Ennis RH (1989) Critical thinking and subject specificity: clarification and needed research. Educ Res 18(3):4–10. https://doi.org/10.3102/0013189x018003004

Facione PA (1990) Critical thinking: a statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Eric document reproduction service. https://eric.ed.gov/?id=ed315423

Facione PA, Facione NC (1992) The California Critical Thinking Dispositions Inventory (CCTDI) and the CCTDI test manual. California Academic Press, Millbrae, CA

Forawi SA (2016) Standard-based science education and critical thinking. Think Skills Creat 20:52–62. https://doi.org/10.1016/j.tsc.2016.02.005

Halpern DF (2001) Assessing the effectiveness of critical thinking instruction. J Gen Educ 50(4):270–286. https://doi.org/10.2307/27797889

Hu WP, Liu J (2015) Cultivation of pupils’ thinking ability: a five-year follow-up study. Psychol Behav Res 13(05):648–654. https://doi.org/10.3969/j.issn.1672-0628.2015.05.010

Huber K (2016) Does college teach critical thinking? A meta-analysis. Rev Educ Res 86(2):431–468. https://doi.org/10.3102/0034654315605917

Kek MYCA, Huijser H (2011) The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow’s digital futures in today’s classrooms. High Educ Res Dev 30(3):329–341. https://doi.org/10.1080/07294360.2010.501074

Kuncel NR (2011) Measurement and meaning of critical thinking (Research report for the NRC 21st Century Skills Workshop). National Research Council, Washington, DC

Kyndt E, Raes E, Lismont B, Timmers F, Cascallar E, Dochy F (2013) A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educ Res Rev 10(2):133–149. https://doi.org/10.1016/j.edurev.2013.02.002

Leng J, Lu XX (2020) Is critical thinking really teachable?—A meta-analysis based on 79 experimental or quasi experimental studies. Open Educ Res 26(06):110–118. https://doi.org/10.13966/j.cnki.kfjyyj.2020.06.011

Liang YZ, Zhu K, Zhao CL (2017) An empirical study on the depth of interaction promoted by collaborative problem solving learning activities. J E-educ Res 38(10):87–92. https://doi.org/10.13811/j.cnki.eer.2017.10.014

Lipsey M, Wilson D (2001) Practical meta-analysis. International Educational and Professional, London, pp. 92–160

Liu Z, Wu W, Jiang Q (2020) A study on the influence of problem based learning on college students’ critical thinking-based on a meta-analysis of 31 studies. Explor High Educ 03:43–49

Morris SB (2008) Estimating effect sizes from pretest-posttest-control group designs. Organ Res Methods 11(2):364–386. https://doi.org/10.1177/1094428106291059

Mulnix JW (2012) Thinking critically about critical thinking. Educ Philos Theory 44(5):464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x

Naber J, Wyatt TH (2014) The effect of reflective writing interventions on the critical thinking skills and dispositions of baccalaureate nursing students. Nurse Educ Today 34(1):67–72. https://doi.org/10.1016/j.nedt.2013.04.002

National Research Council (2012) Education for life and work: developing transferable knowledge and skills in the 21st century. The National Academies Press, Washington, DC

Niu L, Behar HLS, Garvan CW (2013) Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educ Res Rev 9(12):114–128. https://doi.org/10.1016/j.edurev.2012.12.002

Peng ZM, Deng L (2017) Towards the core of education reform: cultivating critical thinking skills as the core of skills in the 21st century. Res Educ Dev 24:57–63. https://doi.org/10.14121/j.cnki.1008-3855.2017.24.011

Reiser BJ (2004) Scaffolding complex learning: the mechanisms of structuring and problematizing student work. J Learn Sci 13(3):273–304. https://doi.org/10.1207/s15327809jls1303_2

Ruggiero VR (2012) The art of thinking: a guide to critical and creative thought, 4th edn. Harper Collins College Publishers, New York

Schellens T, Valcke M (2006) Fostering knowledge construction in university students through asynchronous discussion groups. Comput Educ 46(4):349–370. https://doi.org/10.1016/j.compedu.2004.07.010

Sendag S, Odabasi HF (2009) Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Comput Educ 53(1):132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Sison R (2008) Investigating Pair Programming in a Software Engineering Course in an Asian Setting. 2008 15th Asia-Pacific Software Engineering Conference, pp. 325–331. https://doi.org/10.1109/APSEC.2008.61

Simpson E, Courtney M (2002) Critical thinking in nursing education: literature review. Int J Nurs Pract 8(2):89–98

Stewart L, Tierney J, Burdett S (2006) Do systematic reviews based on individual patient data offer a means of circumventing biases associated with trial publications? Publication bias in meta-analysis. John Wiley and Sons Inc, New York, pp. 261–286

Tiwari A, Lai P, So M, Yuen K (2010) A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 40(6):547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Wood D, Bruner JS, Ross G (2006) The role of tutoring in problem solving. J Child Psychol Psychiatry 17(2):89–100. https://doi.org/10.1111/j.1469-7610.1976.tb00381.x

Wei T, Hong S (2022) The meaning and realization of teachable critical thinking. Educ Theory Practice 10:51–57

Xu EW, Wang W, Wang QX (2022) A meta-analysis of the effectiveness of programming teaching in promoting K-12 students’ computational thinking. Educ Inf Technol. https://doi.org/10.1007/s10639-022-11445-2

Yang YC, Newby T, Bill R (2008) Facilitating interactions through structured web-based bulletin boards: a quasi-experimental study on promoting learners’ critical thinking skills. Comput Educ 50(4):1572–1585. https://doi.org/10.1016/j.compedu.2007.04.006

Yore LD, Pimm D, Tuan HL (2007) The literacy component of mathematical and scientific literacy. Int J Sci Math Educ 5(4):559–589. https://doi.org/10.1007/s10763-007-9089-4

Zhang T, Zhang S, Gao QQ, Wang JH (2022) Research on the development of learners’ critical thinking in online peer review. Audio Visual Educ Res 6:53–60. https://doi.org/10.13811/j.cnki.eer.2022.06.08

Acknowledgements

This research was supported by the graduate scientific research and innovation project of Xinjiang Uygur Autonomous Region named “Research on in-depth learning of high school information technology courses for the cultivation of computing thinking” (No. XJ2022G190) and the independent innovation fund project for doctoral students of the College of Educational Science of Xinjiang Normal University named “Research on project-based teaching of high school information technology courses from the perspective of discipline core literacy” (No. XJNUJKYA2003).

Author information

Authors and affiliations

College of Educational Science, Xinjiang Normal University, 830017, Urumqi, Xinjiang, China

Enwei Xu, Wei Wang & Qingxia Wang

Corresponding authors

Correspondence to Enwei Xu or Wei Wang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary tables

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Xu, E., Wang, W. & Wang, Q. The effectiveness of collaborative problem solving in promoting students’ critical thinking: A meta-analysis based on empirical literature. Humanit Soc Sci Commun 10, 16 (2023). https://doi.org/10.1057/s41599-023-01508-1

Received: 07 August 2022

Accepted: 04 January 2023

Published: 11 January 2023

DOI: https://doi.org/10.1057/s41599-023-01508-1

Lessons in learning

Sean Finamore ’22 (left) and Xaviera Zime ’22 study during a lecture in the Science Center.

Photos by Kris Snibbe/Harvard Staff Photographer

Peter Reuell

Harvard Staff Writer

Study shows students in ‘active learning’ classrooms learn more than they think

For decades, there has been evidence that classroom techniques designed to get students to participate in the learning process produce better educational outcomes at virtually all levels.

And a new Harvard study suggests it may be important to let students know it.

The study, published Sept. 4 in the Proceedings of the National Academy of Sciences, shows that, though students felt as if they learned more through traditional lectures, they actually learned more when taking part in classrooms that employed so-called active-learning strategies.

Lead author Louis Deslauriers, the director of science teaching and learning and senior physics preceptor, knew that students would learn more from active learning. He published a key study in Science in 2011 that showed just that. But many students and faculty remained hesitant to switch to it.

“Often, students seemed genuinely to prefer smooth-as-silk traditional lectures,” Deslauriers said. “We wanted to take them at their word. Perhaps they actually felt like they learned more from lectures than they did from active learning.”

In addition to Deslauriers, the study is authored by director of sciences education and physics lecturer Logan McCarty, senior preceptor in applied physics Kelly Miller, preceptor in physics Greg Kestin, and Kristina Callaghan, now a physics lecturer at the University of California, Merced.

The question of whether students’ perceptions of their learning match how well they’re actually learning is particularly important, Deslauriers said, because while students eventually see the value of active learning, initially it can feel frustrating.

“Deep learning is hard work. The effort involved in active learning can be misinterpreted as a sign of poor learning,” he said. “On the other hand, a superstar lecturer can explain things in such a way as to make students feel like they are learning more than they actually are.”

To understand that dichotomy, Deslauriers and his co-authors designed an experiment that would expose students in an introductory physics class to both traditional lectures and active learning.

For the first 11 weeks of the 15-week class, students were taught using standard methods by an experienced instructor. In the 12th week, half the class was randomly assigned to a classroom that used active learning, while the other half attended highly polished lectures. In a subsequent class, the two groups were reversed. Notably, both groups used identical class content and only active engagement with the material was toggled on and off.

Following each class, students were surveyed on how much they agreed or disagreed with statements such as “I feel like I learned a lot from this lecture” and “I wish all my physics courses were taught this way.” Students were also tested on how much they learned in the class with 12 multiple-choice questions.

When the results were tallied, the authors found that students felt as if they learned more from the lectures, but in fact scored higher on tests following the active learning sessions. “Actual learning and feeling of learning were strongly anticorrelated,” Deslauriers said, “as shown through the robust statistical analysis by co-author Kelly Miller, who is an expert in educational statistics and active learning.”
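
As a purely illustrative sketch (synthetic numbers, not the study’s data), the anticorrelation described here is the kind of quantity obtained by correlating each student’s survey rating with their score on the 12-item test:

```python
# Synthetic illustration only -- these are NOT the study's data. Computes the
# correlation between self-reported "feeling of learning" and actual test score.
import numpy as np

feeling_of_learning = np.array([4.6, 4.3, 4.1, 3.2, 2.9, 3.0, 3.8, 2.7])  # survey ratings (hypothetical)
test_score          = np.array([6,   7,   6,   9,   10,  9,   7,   11])   # out of 12 (hypothetical)

r = np.corrcoef(feeling_of_learning, test_score)[0, 1]
print(f"Pearson r between felt and measured learning: {r:.2f}")  # negative => anticorrelated
```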

Those results, the study authors are quick to point out, shouldn’t be interpreted as suggesting students dislike active learning. In fact, many studies have shown students quickly warm to the idea, once they begin to see the results. “In all the courses at Harvard that we’ve transformed to active learning,” Deslauriers said, “the overall course evaluations went up.”

Co-author Kestin, who in addition to being a physicist is a video producer with PBS’ NOVA, said, “It can be tempting to engage the class simply by folding lectures into a compelling ‘story,’ especially when that’s what students seem to like. I show my students the data from this study on the first day of class to help them appreciate the importance of their own involvement in active learning.”

McCarty, who oversees curricular efforts across the sciences, hopes this study will encourage more of his colleagues to embrace active learning.

“We want to make sure that other instructors are thinking hard about the way they’re teaching,” he said. “In our classes, we start each topic by asking students to gather in small groups to solve some problems. While they work, we walk around the room to observe them and answer questions. Then we come together and give a short lecture targeted specifically at the misconceptions and struggles we saw during the problem-solving activity. So far we’ve transformed over a dozen classes to use this kind of active-learning approach. It’s extremely efficient — we can cover just as much material as we would using lectures.”

A pioneer in work on active learning, Balkanski Professor of Physics and Applied Physics Eric Mazur hailed the study as debunking long-held beliefs about how students learn.

“This work unambiguously debunks the illusion of learning from lectures,” he said. “It also explains why instructors and students cling to the belief that listening to lectures constitutes learning. I recommend every lecturer reads this article.”

Dean of Science Christopher Stubbs, Samuel C. Moncher Professor of Physics and of Astronomy, was an early convert. “When I first switched to teaching using active learning, some students resisted that change. This research confirms that faculty should persist and encourage active learning. Active engagement in every classroom, led by our incredible science faculty, should be the hallmark of residential undergraduate education at Harvard.”

Ultimately, Deslauriers said, the study shows that it’s important to ensure that neither instructors nor students are fooled into thinking that lectures are the best learning option. “Students might give fabulous evaluations to an amazing lecturer based on this feeling of learning, even though their actual learning isn’t optimal,” he said. “This could help to explain why study after study shows that student evaluations seem to be completely uncorrelated with actual learning.”

This research was supported with funding from the Harvard FAS Division of Science.

Center for Teaching Innovation

Problem-Based Learning

Problem-based learning (PBL) is a student-centered approach in which students learn about a subject by working in groups to solve an open-ended problem. This problem is what drives the motivation and the learning.

Why Use Problem-Based Learning?

Nilson (2010) lists the following learning outcomes that are associated with PBL. A well-designed PBL project provides students with the opportunity to develop skills related to:

  • Working in teams.
  • Managing projects and holding leadership roles.
  • Oral and written communication.
  • Self-awareness and evaluation of group processes.
  • Working independently.
  • Critical thinking and analysis.
  • Explaining concepts.
  • Self-directed learning.
  • Applying course content to real-world examples.
  • Researching and information literacy.
  • Problem solving across disciplines.

Considerations for Using Problem-Based Learning

Rather than teaching relevant material and subsequently having students apply the knowledge to solve problems, the problem is presented first. PBL assignments can be short, or they can be more involved and take a whole semester. PBL is often group-oriented, so it is beneficial to set aside classroom time to prepare students to work in groups and to allow them to engage in their PBL project.

Students generally must:

  • Examine and define the problem.
  • Explore what they already know about underlying issues related to it.
  • Determine what they need to learn and where they can acquire the information and tools necessary to solve the problem.
  • Evaluate possible ways to solve the problem.
  • Solve the problem.
  • Report on their findings.

Getting Started with Problem-Based Learning

  • Articulate the learning outcomes of the project. What do you want students to know or be able to do as a result of participating in the assignment?
  • Create the problem. Ideally, this will be a real-world situation that resembles something students may encounter in their future careers or lives. Cases are often the basis of PBL activities. Previously developed PBL activities can be found online through the University of Delaware’s PBL Clearinghouse of Activities.
  • Establish ground rules at the beginning to prepare students to work effectively in groups.
  • Introduce students to group processes and do some warm up exercises to allow them to practice assessing both their own work and that of their peers.
  • Consider having students take on different roles or divide the work up amongst themselves. Alternatively, the project might require students to assume various perspectives, such as those of government officials, local business owners, etc.
  • Establish how you will evaluate and assess the assignment. Consider making the self and peer assessments a part of the assignment grade.

Nilson, L. B. (2010). Teaching at its best: A research-based resource for college instructors (2nd ed.). San Francisco, CA: Jossey-Bass.

Med Sci Educ. 2021 Jun; 31(3)

Effective Learning Behavior in Problem-Based Learning: a Scoping Review

Azril Shahreez Abdul Ghani

1 Department of Basic Medical Sciences, Kulliyah of Medicine, Bandar Indera Mahkota Campus, International Islamic University Malaysia, Kuantan, 25200 Pahang Malaysia

2 Department of Medical Education, School of Medical Sciences, Health Campus, Universiti Sains Malaysia, Kubang Kerian, Kota Bharu, 16150 Kelantan Malaysia

Ahmad Fuad Abdul Rahim

Muhamad Saiful Bahri Yusoff, Siti Nurma Hanim Hadie

3 Department of Anatomy, School of Medical Sciences, Health Campus, Universiti Sains Malaysia, Kubang Kerian, 16150 Kota Bharu, Kelantan Malaysia

Problem-based learning (PBL) emphasizes learning behavior that leads to critical thinking, problem-solving, communication, and collaborative skills in preparing students for a professional medical career. However, learning behavior that develops these skills has not been systematically described. This review aimed to unearth the elements of effective learning behavior in a PBL context, using the protocol by Arksey and O’Malley. The protocol identified the research question, selected relevant studies, charted and collected data, and collated, summarized, and reported results. We discovered three categories of elements—intrinsic empowerment, entrustment, and functional skills—proven effective in the achievement of learning outcomes in PBL.

Introduction

Problem-based learning (PBL) is an educational approach that utilizes the principles of collaborative learning in small groups, first introduced by McMaster Medical University [ 1 ]. The shift of the higher education curriculum from traditional, lecture-based approaches to an integrated, student-centered approach was triggered by concern over the content-driven nature of medical knowledge with minimal clinical application [ 2 ]. The PBL pedagogy uses a systematic approach, starting with an authentic, real-life problem scenario as a context in which learning is not separated from practice as students collaborate and learn [ 3 ]. The tutor acts as a facilitator who guides the students’ learning, while students are required to solve the problems by discussing them with group members [ 4 ]. The essential aspect of the PBL process is the ability of the students to recognize their current knowledge, determine the gaps in their knowledge and experience, and acquire new knowledge to bridge the gaps [ 5 ]. PBL is a holistic approach that gives students an active role in their learning.

Since its inception, PBL has been used in many undergraduate and postgraduate degree programs, such as medicine [ 6 , 7 ], nursing [ 8 ], social work education [ 9 ], law [ 10 ], architecture [ 11 ], economics [ 12 ], business [ 13 ], science [ 14 ], and engineering [ 15 ]. It has also been applied in elementary and secondary education [ 16 – 18 ]. Despite its many applications, its implementation is based on a single universal workflow framework that contains three elements: problem as the initiator for learning, tutor as a facilitator in the group versions, and group work as a stimulus for collaborative interaction [ 19 ]. However, there are various versions of PBL workflow, such as the seven-step technique based on the Maastricht “seven jumps” process. The tutor’s role is to ensure the achievement of learning objectives and to assess students’ performance [ 20 , 21 ].

The PBL process revolves around four types of learning principles: constructive, self-directed, collaborative, and contextual [ 19 ]. Through the constructive learning process, the students are encouraged to think about what is already known and integrate their prior knowledge with their new understanding. This process helps the student understand the content, form a new opinion, and acquire new knowledge [ 22 ]. The PBL process encourages students to become self-directed learners who plan, monitor, and evaluate their own learning, enabling them to become lifelong learners [ 23 ]. The contextualized collaborative learning process also promotes interaction among students, who share similar responsibilities to achieve common goals relevant to the learning context [ 24 ]. By exchanging ideas and providing feedback during the learning session, the students can attain a greater understanding of the subject matter [ 25 ].

Dolmans et al. [ 19 ] pointed out two issues related to the implementation of PBL: dominant facilitators and dysfunctional PBL groups. These problems inhibit students’ self-directed learning and reduce their satisfaction level with the PBL session. A case study by Eryilmaz [ 26 ] that evaluated engineering students’ and tutors’ experience of PBL discovered that PBL increased the students’ self-confidence and improved essential skills such as problem-solving, communications, critical thinking, and collaboration. Although most of the participants in the study found PBL satisfactory, many complained about the tutor’s poor guidance and lack of preparation. Additionally, it was noted that 64% of the first-year students were unable to adapt to the PBL system because they had been accustomed to conventional learning settings and that 43% of students were not adequately prepared for the sessions and thus were minimally involved in the discussion.

In a case study by Cónsul-Giribet [ 27 ], newly graduated nursing professionals reported a lack of perceived theoretical basic science knowledge at the end of their program, despite learning through PBL. The nurses perceived that this lack of knowledge might affect their expertise, identity, and professional image.

Likewise, a study by McKendree [ 28 ] reported the outcomes of a workshop that explored the strengths and weaknesses of PBL in an allied health sciences curriculum in the UK. The workshop found that problems related to PBL were mainly caused by students, the majority of whom came from conventional educational backgrounds either during high school or their first degree. They felt anxious when they were involved in PBL, concerned about “not knowing when to stop” in exploring the learning needs. Apart from a lack of basic science knowledge, the knowledge acquired during PBL sessions remains unorganized [ 29 ]. Hence, tutors must guide students in overcoming this situation by instilling appropriate insights and essential skills for the achievement of the learning outcomes [ 30 ]. It was also evident that the combination of intention and motivation to learn and desirable learning behavior determined the quality of learning outcomes [ 31 , 32 ]. However, effective learning behaviors that help develop these skills have not been systematically described. Thus, this scoping review aimed to unearth the elements of effective learning behavior in the PBL context.

Scoping Review Protocol

This scoping review was performed using a protocol by Arksey and O’Malley [ 33 ]. The protocol comprises five phases: (i) identification of research questions, (ii) identification of relevant articles, (iii) selection of relevant studies, (iv) data collection and charting, and (v) collating, summarizing, and reporting the results.

Identification of Research Questions

This scoping review was designed to unearth the elements of effective learning behavior that can be generated from learning through PBL instruction. The review aimed to answer one research question: “What are the effective learning behavior elements related to PBL?” For the purpose of the review, an operational definition of effective learning behavior was constructed, whereby it was defined as any learning behavior that is related to PBL instruction and has been shown to successfully attain the desired learning outcomes (i.e., cognitive, skill, or affective)—either quantitatively or qualitatively—in any intervention conducted in higher education institutions.

The positive outcome variables include student viewpoint or perception, student learning experience and performance, lecturer viewpoint and expert judgment, and other indirect variables that may be important indicators of successful PBL learning (i.e., attendance to PBL session, participation in PBL activity, number of interactions in PBL activity, and improvement in communication skills in PBL).

Identification of Relevant Articles

An extensive literature search was conducted on articles published in English between 2015 and 2019. Three databases—Google Scholar, Scopus, and PubMed—were used for the literature search. Seven search terms with the Boolean combination were used, whereby the keywords were identified from the Medical Subject Headings (MeSH) and Education Resources Information Center (ERIC) databases. The search terms were tested and refined with multiple test searches. The final search terms with the Boolean operation were as follows: “problem-based learning” AND (“learning behavior” OR “learning behaviour”) AND (student OR “medical students” OR undergraduate OR “medical education”).
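
As an illustrative sketch only: the review does not describe scripted searching, but the Boolean string above can be expressed in executable form and submitted to the public NCBI E-utilities endpoint for the PubMed portion of the search. The query text comes from the review; everything else is an assumption for illustration.

```python
# Illustrative sketch (assumed tooling, not the authors' procedure): assemble
# the review's Boolean search string and query PubMed's public esearch API.
from urllib.parse import urlencode
from urllib.request import urlopen

query = (
    '"problem-based learning" AND '
    '("learning behavior" OR "learning behaviour") AND '
    '(student OR "medical students" OR undergraduate OR "medical education")'
)

params = urlencode({"db": "pubmed", "term": query, "retmax": 0})
url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?{params}"

with urlopen(url) as resp:
    # The response is XML containing a <Count> of matching PubMed records.
    print(resp.read()[:200])
```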

Selection of Relevant Articles

The articles from the three databases were exported manually into Microsoft Excel. The duplicates were removed, and the remaining articles were reviewed based on the inclusion and exclusion criteria. These criteria were tested on titles and abstracts to ensure their robustness in capturing the articles related to learning behavior in PBL. The shortlisted articles were reviewed by two independent researchers, and a consensus was reached either to accept or reject each article based on the set criteria. When a disagreement occurred between the two reviewers, the particular article was re-evaluated independently by the third and fourth researchers (M.S.B.Y and A.F.A.R), who have vast experience in conducting qualitative research. The sets of criteria for selecting abstracts and final articles were developed. The inclusion and exclusion criteria are listed in Table 1.

Table 1 Inclusion and exclusion criteria

Criteria for abstract selection
  • Inclusion: (1) describes at least one effective learning behaviour in a PBL setting in higher education; (2) provides evidence of a robust study design, not limited to randomized controlled trials; (3) provides evidence of an evaluation of PBL; (4) study outcomes are measurable either quantitatively or qualitatively
  • Exclusion: (1) primary and secondary student populations; (2) primary and secondary education contexts

Criteria for full article selection
  • Inclusion: (1) provides elaboration on the elements of effective learning behaviour; (2) clear methodology for measuring the outcome; (3) PBL context; (4) functional element that has been proven to promote learning; (5) well-designed research intervention
  • Exclusion: (1) review articles, published theses, books, research reports, editorials, and letters are excluded from the search process

Data Charting

The selected final articles were reviewed, and several important data were extracted to provide an objective summary of the review. The extracted data were charted in a table, including the (i) title of the article, (ii) author(s), (iii) year of publication, (iv) aim or purpose of the study, (v) study design and method, (vi) intervention performed, and (vii) study population and sample size.
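
As a minimal sketch of what one charting record could look like if the step were scripted (the review charted data in a table; the field names, placeholder values, and output file name below are hypothetical):

```python
# Illustrative data-charting record; field names follow the items listed above,
# and the example row is a placeholder, not data from an included study.
from dataclasses import dataclass, asdict
import csv

@dataclass
class ChartedStudy:
    title: str
    authors: str
    year: int
    aim: str
    design_method: str
    intervention: str
    population_sample: str

rows = [
    ChartedStudy(
        title="<article title>", authors="<author list>", year=2018,
        aim="<aim or purpose>", design_method="<study design and method>",
        intervention="<intervention performed>",
        population_sample="<study population and sample size>",
    ),
]

with open("charting_table.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(rows[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(r) for r in rows)
```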

Collating, Summarizing, and Reporting the Results

A content analysis was performed by A.S.A.G and S.N.H.H, who have experience in conducting qualitative studies, to identify the elements of effective learning behavior in the literature. The initial step of the content analysis was to read the selected articles thoroughly to gain a general understanding of them; the elements of learning behavior that fulfilled the inclusion criteria were then extracted. Elements related to each other through their content or context were grouped into subtheme categories, and combinations of subthemes expressing similar underlying meanings were grouped into themes. Each theme and subtheme was given a name and operationally defined based on its underlying elements. The selected themes and subthemes were presented to the independent researchers on the team (M.S.B.Y and A.F.A.R), and a consensus was reached either to accept or to reformulate each of them. The flow of the scoping review methods for this study is illustrated in Fig. 1.

Fig. 1 The flow of literature search and article selection

Literature Search

Based on the keyword search, 1750 articles were obtained. Duplicate records of the same article found in different databases and resources were removed. The eligibility of the 1750 records was then evaluated against the inclusion and exclusion criteria for title selection; articles that did not fulfil the criteria were removed, leaving 328 articles for abstract screening. A total of 284 articles were screened against the eligibility criteria for abstract selection, and the articles meeting these criteria were screened against the eligibility criteria for full article selection. Fourteen articles were selected for the final review. The information about these articles is summarized in Table 2.

Table 2 Characteristics of the included studies

Arana-Arexolaleiba et al. (Spain)
  • Design/method: quasi-experimental (one-group pretest–posttest); questionnaire only
  • Subjects: 97 undergraduate engineering students and 20 tutors
  • Intervention: assessing the PBL learning environment and supervision on students’ learning approach
  • Outcome: environments with more constructive variables and supervisor formative assessment stimulate a deeper learning approach in students

Khoiriyah et al. (Indonesia)
  • Design/method: quasi-experimental (one-group posttest-only) and semi-structured interview; questionnaire and interview protocol
  • Subjects: 310 undergraduate students, 10 tutors, and 15 content experts
  • Intervention: evaluating a self-assessment scale for active learning and critical thinking (SSACT) in PBL
  • Outcome: the SSACT improves students’ critical thinking and self-directed learning

Khumsikiew et al. (Thailand)
  • Design/method: quasi-experimental (one-group pretest–posttest); questionnaire only
  • Subjects: 36 undergraduate pharmacy students
  • Intervention: assessing student competence in PBL within a clinical environment
  • Outcome: students’ clinical skills performance and satisfaction increased significantly in PBL with a clinical environment

Rakhudu (South Africa)
  • Design/method: sequential explanatory mixed-method design and focus group discussion; questionnaire
  • Subjects: 135 undergraduate nursing students (2011–2013 academic years); 21 participated in the focus group discussion and 114 in the questionnaire
  • Intervention: evaluating the effect of a PBL scenario on quality improvement in a healthcare unit for nursing students
  • Outcome: the PBL scenario was effective in promoting interdisciplinary and interinstitutional collaboration

Tarhan et al. (Turkey)
  • Design/method: quasi-experimental (one-group pretest–posttest) and semi-structured interview; questionnaire and interview protocol
  • Subjects: 36 undergraduate biochemistry course students
  • Intervention: evaluating the effect of PBL on student interest in a biochemistry course
  • Outcome: PBL improved students’ investigative process, association of information, collaborative skills, responsibility, and expression of ideas

Chou et al. (China)
  • Design/method: sequential explanatory mixed-method design; observation checklist and post-PBL homework reflections
  • Subjects: 45 undergraduate medical students and 44 undergraduate nursing students; all students participated, but only the interprofessional groups were analyzed
  • Intervention: assessing the effect of interprofessional PBL in learning clinical ethics
  • Outcome: interprofessional learning through PBL improved mutual respect and avoided the development of stereotyped behavior

Chung et al. (China)
  • Design/method: quasi-experimental (one-group pretest–posttest) and action research; observation, instructional journal, interview protocol, and questionnaire
  • Subjects: 51 undergraduate business students
  • Intervention: evaluating the effect of PBL on students’ learning outcomes for industry-oriented competences
  • Outcome: significantly enhanced students’ learning motivation, learning outcomes, and development of instructional knowledge and capability

Geitz et al. (Netherlands)
  • Design/method: semi-structured interview; interview protocol
  • Subjects: 62 undergraduate business administration students and 4 tutors; 8 randomly selected students and all 4 tutors took part in the qualitative study
  • Intervention: evaluating the effect of sustainable feedback on self-efficacy and goal orientation given during PBL sessions
  • Outcome: participants valued the feedback positively; personal characteristics, previous experience with feedback, and concomitant perceptions appeared to greatly influence both tutors’ and students’ individual behavior and responses

Dawilai et al. (Thailand)
  • Design/method: quasi-experimental (one-group posttest-only) and interview; questionnaire and interview protocol
  • Subjects: 29 English foreign language students; all completed the questionnaire, and 10 students who improved in the writing course were selected for interviews
  • Intervention: evaluating self-regulated learning in problem-based blended learning (PBBL)
  • Outcome: PBBL students reported applying cognitive strategies and using their time and study environment effectively

Gutman (Israel)
  • Design/method: quasi-experimental (non-equivalent control group, posttest-only); questionnaire only
  • Subjects: 62 pre-service teachers
  • Intervention: evaluating achievement goal motivation (AGM) and research literacy (RL) skills between PBL process scaffolding with moderator-based learning (OLC + M) and social-based learning (OLC + S)
  • Outcome: PBL participants showed significant improvement in AGM; only OLC + S showed significant improvement in RL

Li (China)
  • Design/method: semi-structured interview; interview protocol
  • Subjects: 14 students
  • Intervention: evaluating student learning outcomes and attitudes between a single-discipline PBL course and lecture
  • Outcome: PBL participants reported better outcomes in interdisciplinary learning, self-directed learning, problem solving, creative thinking, communication, and knowledge retention; they viewed PBL positively, recognizing its effectiveness in skill development rather than exam orientation

Asad et al. (Saudi Arabia)
  • Design/method: cross-sectional study (period cross-sectional); questionnaire only
  • Subjects: 120 undergraduate medical students
  • Intervention: evaluating student opinion on the effectiveness of PBL and interactive lectures
  • Outcome: PBL participants reported better outcomes in modes of learning facilitation, professional development, learning behavior, and environment

Hursen (Cyprus)
  • Design/method: quasi-experimental (one-group pretest–posttest) and interview; questionnaire and interview protocol
  • Subjects: 25 students
  • Intervention: evaluating the effect of using Facebook in PBL on adults’ self-efficacy perception for research inquiry
  • Outcome: PBL participants reported a positive increase in perceived self-efficacy for sustaining research

William et al. (Singapore)
  • Design/method: quasi-experimental (non-equivalent control group, posttest-only); questionnaire only
  • Subjects: 149 students
  • Intervention: evaluating the effect of a supply chain game in a PBL environment
  • Outcome: game-based PBL increased scores on metacognition and motivation functions and showed a significant correlation between motivation and positive game experience and students’ perceived learning

Study Characteristics

The final 14 articles were published between 2015 and 2019. The majority of the studies were conducted in Western Asian countries (n = 4), followed by China (n = 3), European countries (n = 2), Thailand (n = 2), Indonesia (n = 1), Singapore (n = 1), and South Africa (n = 1). Apart from traditional PBL, some studies incorporated other pedagogic modalities into their PBL sessions, such as online learning, blended learning, and gamification. The majority of the studies targeted a single-profession learner group, and one study was performed on mixed interprofessional health education learners.

Results of Thematic Analysis

The thematic analysis yielded three main themes of effective learning behavior: intrinsic empowerment, entrustment, and functional skills. Intrinsic empowerment comprises four subthemes: proactivity, organization, diligence, and resourcefulness. Entrustment has four underlying subthemes: students as assessors, students as teachers, feedback-giving, and feedback-receiving. The functional skills theme contains four subthemes: time management, digital proficiency, data management, and collaboration.
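
As a small illustrative sketch (not part of the published review), the resulting coding structure can be represented as a nested mapping of themes to subthemes, populated here with a few of the behavior elements listed in Tables 3–5:

```python
# Illustrative representation of the review's theme/subtheme structure;
# the element lists are abbreviated examples drawn from Tables 3-5.
themes = {
    "intrinsic empowerment": {
        "proactive": ["analyze problems and learning needs", "seek guidance"],
        "being organized": ["organize the PBL team by assigning roles"],
        "being diligent": ["keep track of plans"],
        "resourceful": ["use evidence-based resources"],
    },
    "entrustment": {
        "student as assessor": ["evaluate individual and group performance"],
        "student as teacher": ["prepare teaching materials"],
        "give feedback": ["suggest measures for future improvement"],
        "receive feedback": ["request feedback from peers and teachers"],
    },
    "functional skills": {
        "time management": ["create a learning schedule"],
        "digital proficiency": ["use digital tools"],
        "data management": ["collect and analyze data"],
        "collaboration": ["discuss professionally"],
    },
}

for theme, subthemes in themes.items():
    print(f"{theme}: {len(subthemes)} subthemes")
```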

Theme 1: Intrinsic Empowerment

Intrinsic empowerment enforces student learning behavior that can facilitate the achievement of learning outcomes. By empowering the development of these behaviors, students can become lifelong learners [ 34 ]. The first element of intrinsic empowerment is proactive behavior. In PBL, the students must be proactive in analyzing problems [ 35 , 36 ] and their learning needs [ 35 , 37 ], and this can be done by integrating prior knowledge and previous experience through a brainstorming session [ 35 , 38 ]. The students must be proactive in seeking guidance to ensure they stay focused and confident [ 39 , 40 ]. Finding ways to integrate content from different disciplines [ 35 , 41 ], formulate new explanations based on known facts [ 34 , 35 , 41 ], and incorporate hands-on activity [ 35 , 39 , 42 ] during a PBL session are also proactive behaviors.

The second element identified is “being organized,” which reflects the ability of students to systematically manage their roles [ 43 ], ideas, and learning needs [ 34 ]. The students also need to understand the task for each learning role in PBL, such as chairperson or leader, scribe, recorder, and reflector. These roles need to be assigned appropriately to ensure that all members take part in the discussion [ 43 ]. Similarly, when discussing ideas or learning needs, the students need to follow the steps in the PBL process and organize and prioritize the information to ensure that the issues are discussed systematically and all aspects of the problems are covered accordingly [ 34 , 37 ]. This team organization and systematic thought process is an effective way for students to focus, plan, and finalize their learning tasks.

The third element of intrinsic empowerment is “being diligent.” Students must consistently conduct self-revision [ 40 ] and keep track of their learning plan to ensure the achievement of their learning goal [ 4 , 40 ]. The students must also be responsible for completing any given task and ensuring good understanding prior to their presentation [ 40 ]. Appropriate actions need to be undertaken to find solutions to unsolved problems [ 40 , 44 ]. This effort will help them think critically and apply their knowledge for problem-solving.

The fourth element identified is “being resourceful.” Students should be able to acquire knowledge from different resources, which include external resources (i.e., lecture notes, textbooks, journal articles, audiovisual instructions, the Internet) [ 38 , 40 , 45 ] and internal resources (i.e., students’ prior knowledge or experience) [ 35 , 39 ]. The resources must be evidence-based, and thus should be carefully selected by evaluating their cross-references and appraising them critically [ 37 ]. Students should also be able to understand and summarize the learned materials and explain them using their own words [ 4 , 34 ]. The subthemes of the intrinsic empowerment theme are summarized in Table 3.

Table 3 Intrinsic empowerment subthemes with the learning behavior elements

  • Proactive: analyze problems and learning needs; seek guidance; integrate subjects from different disciplines; incorporate hands-on activities
  • Being organized: organize the PBL team by assigning roles; organize discussed ideas or learning needs; prioritize ideas or learning needs
  • Being diligent: be consistent in self-study; keep track of plans; take responsibility for completing the task; take responsibility for understanding the learning materials
  • Resourceful: use various resources; appraise the resources; use evidence-based resources; paraphrase the resources

Theme 2: Entrustment

Entrustment emphasizes the various roles of students in PBL that can promote effective learning. The first entrusted role identified is “student as an assessor.” This means that students evaluate their own performance in PBL [ 46 ]. The evaluation of their own performance must be based on the achievement of the learning outcomes and reflect actual understanding of the content as well as the ability to apply the learned information in problem-solving [ 46 ].

The second element identified in this review is “student as a teacher.” To ensure successful peer teaching in PBL, students need to comprehensively understand the content of the learning materials and summarize the content in an organized manner. The students should be able to explain the gist of the discussed information using their own words [ 4 , 34 ] and utilize teaching methods to cater to differences in learning styles (i.e., visual, auditory, and kinesthetic) [ 41 ]. These strategies help capture their group members’ attention and evoke interactive discussions among them.

The third element of entrustment is to “give feedback.” Students should try giving constructive feedback on individual and group performance in PBL. Feedback on individual performance must reflect the quality of the content and task presented in the PBL. Feedback on group performance should reflect the ways in which the group members communicate and complete the group task [ 47 ]. To ensure continuous constructive feedback, students should be able to generate feedback questions beforehand and immediately deliver them during the PBL sessions [ 44 , 47 ]. In addition, the feedback must include specific measures for improvement to help their peers to take appropriate action for the future [ 47 ].

The fourth element of entrustment is “receive feedback.” Students should listen carefully to the feedback given and ask questions to clarify the feedback [ 47 ]. They need to be attentive and learn to deal with negative feedback [ 47 ]. Also, if students do not receive feedback, they should request it either from peers or teachers and ask specific questions, such as what aspects to improve and how to improve [ 47 ]. The data on the subthemes of the entrustment theme are summarized in Table 4.

Table 4. Entrustment subthemes with the learning behavior elements

Student as assessor
• Evaluate individual performance
• Evaluate group performance

Student as teacher
• Prepare teaching materials
• Use various learning styles

Give feedback
• Give feedback on the individual task
• Give feedback on the group learning process
• Prepare feedback questions beforehand
• Suggest measures for future improvement

Receive feedback
• Clarify feedback
• Request feedback from peers and teachers

Theme 3: Functional Skills

Functional skills refer to essential skills that can help students learn independently and competently. The first element identified is time management skills. In PBL, students must know how to prioritize learning tasks according to the needs and urgency of the tasks [ 40 ]. To ensure that students can self-pace their learning, a deadline should be set for each learning task within a manageable and achievable learning schedule [ 40 ].

The second functional skill is digital proficiency, the ability to utilize digital devices to support learning [ 38 , 40 , 44 ]. Students need to know how to operate basic software (e.g., Word and PowerPoint) and basic digital tools (i.e., social media, cloud storage, simulations, and online community learning platforms) to support their learning [ 39 , 40 ]. These skills are important for peer learning activities, which may require information sharing, information retrieval, online peer discussion, and online peer feedback [ 38 , 44 ].

The third functional skill identified is data management, the ability to collect key information in the PBL trigger and analyze that information to support the solution in a problem-solving activity [ 39 ]. Students need to work either individually or in a group to collect the key information from a different trigger or case format such as text lines, an interview, an investigation, or statistical results [ 39 ]. Subsequently, students also need to analyze the information and draw conclusions based on their analysis [ 39 ].

The fourth element of functional skill is collaboration. Students need to participate equally in the PBL discussion [ 41 , 46 ]. Through discussion, confusion and queries can be addressed and resolved by listening, respecting others’ viewpoints, and responding professionally [ 35 , 39 , 43 , 44 ]. In addition, the students need to learn from each other and reflect on their performance [ 48 ]. Table 5 summarizes the data on the subthemes of the functional skills theme.

Table 5. Functional skills subthemes with the learning behavior elements

Time management
• Create a learning schedule
• Set a deadline for each task
• Prioritize work for each task

Digital proficiency
• Use digital devices
• Use digital tools

Data management
• Collect data
• Analyze data

Collaborative skill
• Discuss professionally
• Learn from each other

This scoping review outlines three themes of effective learning behavior elements in the PBL context: intrinsic empowerment, entrustment, and functional skills. Hence, it is evident from this review that successful PBL instruction demands students’ commitment to empower themselves with value-driven behaviors, skills, and roles.

In this review, intrinsic empowerment is viewed as the reinforcement of students’ internal strength in performing positive learning behaviors related to PBL. This theme requires the student to proactively engage in the learning process, organize their learning activities systematically, persevere in learning, and be intelligently resourceful. One of the elements of intrinsic empowerment is the identification and analysis of problems related to complex scenarios. This element is aligned with a study by Meyer [ 49 ], who observed students’ engagement in problem identification and clarification prior to problem-solving activities in PBL sessions related to engineering design. Through semi-structured interviews, Rubenstein and colleagues [ 50 ] likewise highlighted the importance of undergoing a problem identification process before proposing a solution during learning. It has also been reported that the problem identification process in PBL may enhance the attainment of learning outcomes, specifically in the domain of concept understanding [ 51 ].

The ability of students to acquire and manage learning resources is essential for building their understanding of the learned materials and enriching discussion among team members during PBL. This is aligned with a study by Jeong and Hmelo-Silver [ 52 ], who studied the use of learning resources by students in PBL. The study concluded that in a resource-rich environment, students first need to learn how to access and understand the resources to ensure effective learning. Secondly, they need to process the content of the resources, integrate various resources, and apply them in problem-solving activities. Finally, they need to use the resources in collaborative learning activities, such as sharing resources and relating them to peers’ resources.

Wong [ 53 ] documented that excellent students spent considerably more time managing academic resources than low achievers. The ability of students to identify and utilize their internal learning resources, such as prior knowledge and experience, is also important. A study by Lee et al. [ 54 ] showed that participants with high domain-specific prior knowledge displayed a more systematic approach and higher accuracy in visual and motor reactions when solving problems compared to novice learners.

During the discussion phase in PBL, organizing ideas (e.g., arranging relevant information gathered from the learning resources into appropriate categories) is essential for communicating the idea clearly [ 34 ]. This finding is in line with a typology study conducted by Larue [ 55 ] on second-year nursing students’ learning strategies during a group discussion. The study found that although the content presented by the students was adequate, they were unable to make further progress in the group discussion until the tutor instructed them on how to organize the information into categories [ 55 ].

Hence, empowering students’ intrinsic behavior may enhance their learning in PBL by allowing them to make decisions about their learning objectives and instilling the confidence to achieve their goals. A study conducted by Kirk et al. [ 56 ] found that highly empowered students obtained better grades, participated more in learning, and held higher educational aspirations.

Entrustment refers to the learning roles given to students so that they engage actively and identify gaps in their learning. This theme requires the student to engage in self-assessment, prepare to teach others, give constructive feedback, and value the feedback received. One of the elements of entrustment is the ability to self-assess. In a study by Mohd et al. [ 57 ] examining the factors in PBL that can strengthen the capability of IT students, the ability to perform self-assessment in PBL was identified as one of the critical contributing factors. As Daud, Kassim, and Daud [ 58 ] note, self-assessment may be more reliable if it is performed against objectives set beforehand and if the assessment criteria are understood by the learner. This is important so that the result of the self-assessment reflects the students’ true performance rather than their perception of themselves. However, assessment based only on the learning objectives focuses on the immediate learning requirements of the PBL. To foster lifelong learning skills, it should be balanced with a longer-term focus, such as using assessment to foster the application of knowledge in solving real-life situations. This is aligned with the review by Boud and Falchikov [ 59 ], who suggest that students need to become assessors within the concept of participation in practice, that is, assessment situated in the context of real life and work.

The second subtheme of entrustment is “student as a teacher” in PBL. In our review, students need to be well prepared with their teaching materials. A cross-sectional study by Charoensakulchai and colleagues found that student preparation is among the important factors in PBL success, alongside factors such as “objective and contents,” “student assessment,” and “attitude towards group work” [ 60 ]. This is also aligned with a study by Sukrajh [ 61 ], who used focus group discussions with fifth-year medical students to explore their perception of preparedness before conducting a peer teaching activity. Students in the focus groups expressed that preparation made them more confident in teaching others: preparing stimulated them to activate and revise prior knowledge, discover their knowledge gaps, construct new knowledge, reflect on their learning, and improve their memory, and it inspired them to search several resources and motivated them to learn the topics.

The next element of “student as a teacher” is using various learning styles to teach the other members of the group. A study by Almomani [ 62 ] showed that high school students most preferred the visual learning pattern, followed by the auditory and then the kinesthetic pattern. In the university setting, however, Hamdani [ 63 ] found that students prefer a combination of the three learning styles. Anbarasi [ 64 ] further explained that incorporating teaching methods based on students’ preferred learning styles promotes active learning and significantly improves long-term retrieval of knowledge. Among the three learning style groups, the kinesthetic group taught with a kinesthetic teaching method showed a significantly higher post-test score than the traditional group taught with a didactic method, which the author attributed to the greater amount of active learning involved in the kinesthetic group.

The ability of students to give constructive feedback on individual tasks is an important element in promoting student contribution in PBL, because feedback from peers or teachers reassures students that they are on the right track in the learning process. Kamp et al. [ 65 ] studied the effectiveness of midterm peer feedback on students’ individual cognitive, collaborative, and motivational contributions in PBL. The experimental group that received midterm peer feedback combined with goal-setting and face-to-face discussion showed an increase in individual contributions in PBL. Another element of effective feedback is that it is given immediately after the observed behavior. Parikh and colleagues surveyed student feedback in PBL environments among 103 final-year medical students at five Ontario schools: the University of Toronto, McMaster University, Queen’s University, the University of Ottawa, and the University of Western Ontario. They found a marked difference between McMaster University and the other universities in the immediacy of feedback: 70 percent of students at McMaster reported receiving immediate feedback in PBL, compared with less than 40 percent of students from the other universities, most of whom received feedback one week or several weeks after the PBL had been conducted [ 66 ]. Another study, conducted among students of the International Medical University in Kuala Lumpur and examining student expectations of feedback, found that immediate feedback is effective if it is in written form, simple but focused on the area of improvement, and delivered by a content expert. If the feedback is delivered by a non-expert using a model answer, it must be supplemented with teacher dialogue sessions to clarify the feedback received [ 67 ].

Requesting feedback from peers and teachers is an important element of the PBL learning environment, enabling students to discover their learning gaps and ways to fill them. This is aligned with a study by de Jong and colleagues [ 68 ], who found that high-performing students are more motivated to seek feedback than low-performing students: high performers seek feedback as a tool to learn from, whereas low performers do so as an academic requirement, and as a result high performers collect more feedback. A study by Bose and Gijselaers [ 69 ] examined the factors that promote feedback-seeking behavior in medical residency and found that such behavior can be encouraged by providing residents with high-quality feedback, which motivates them to ask for further feedback for improvement.

By assigning students active roles as teachers, assessors, and feedback providers, teachers give them ownership of and responsibility for crafting their learning. Learners then develop the skills to monitor and reflect on their learning in order to achieve academic success. Furthermore, an active role encourages students to become evaluative experts in their own learning and promotes deep learning [ 70 ].

Functional skills refer to the essential abilities needed to perform tasks competently in PBL. This theme requires the student to organize and plan time for specific learning tasks, be digitally literate, use data effectively to support problem-solving, and work together efficiently to achieve agreed objectives. One of the elements in this theme is having a schedule of learning tasks with deadlines. Tadjer and colleagues [ 71 ] found that setting deadlines with a restricted time period in a group activity improved students’ cognitive abilities and soft skills. Although a deadline may initially cause anxiety, coping with it encourages students to become more creative and energetic in applying various learning strategies [ 72 , 73 ]. Ballard et al. [ 74 ] reported that students tend to work harder to complete learning tasks when they face multiple deadlines.

The students also need to be digitally literate—i.e., able to demonstrate the use of technological devices and tools in PBL. Taradi et al. [ 75 ] discovered that incorporating technology in learning—blending web technology with PBL—removes time and place barriers in the creation of a collaborative environment. It was found that students who participated in web discussions achieved a significantly higher mean grade on a physiology final examination than those who used traditional methods. Also, the incorporation of an online platform in PBL can facilitate students to develop investigation and inquiry skills with high-level cognitive thought processes, which is crucial to successful problem-solving [ 76 ].

In PBL, students need to work collaboratively with their peers to solve problems. A study by Hidayati et al. [ 77 ] demonstrated that effective collaborative skills improve cognitive learning outcomes and problem-solving ability among students who undergo PBL integrated with digital mind maps. To ensure successful collaborative learning in PBL, professional communication among students is essential. Research by Zheng and Huang [ 78 ] showed that co-regulation (i.e., warm and responsive communication that provides support to peers) improved collaborative effort and group performance among undergraduate and master’s students majoring in education and psychology. This is also in line with a study by Maraj and colleagues [ 79 ], which showed that strong team interaction within a PBL group leads to high levels of team efficacy and academic self-efficacy. Moreover, strengthening communication competence, such as by developing negotiation skills among partners during discussion sessions, improves student scores [ 80 ].

PBL also includes opportunities for students to learn from each other (i.e., peer learning). A study by Maraj et al. [ 79 ] discovered that the majority of the students in their study perceived improvement in their understanding of the learned subject when they learned from each other. Another study by Lyonga [ 81 ] documented the successful formation of cohesive group learning, where students could express and share their ideas with their friends and help each other. It was suggested that each student should be paired with a more knowledgeable student who has mastered certain learning components to promote purposeful structured learning within the group.

From this scoping review, it is clear that functional skills equip the students with abilities and knowledge needed for successful PBL. Studies have shown that strong time management skills, digital literacy, data management, and collaborative skills lead to positive academic achievement [ 77 , 82 , 83 ].

Limitation of the Study

This scoping review aimed to capture recent effective learning behaviors in problem-based learning; therefore, literature published before 2015 was not included. Without denying the importance of publications before 2015, we rely on Okoli and Schabram [ 84 ], who highlighted the impossibility of retrieving all published articles when conducting a literature search. On this basis, we decided to focus on the time frame between 2015 and 2019, which is aligned with the concept of study maturity (i.e., the more mature the field, the greater the number of published articles and therefore the more topics investigated) described by Kraus et al. [ 85 ]. Within this time frame, a significant number of articles relevant to PBL and to effective learning behavior were found. Nevertheless, our time frame did not cover the coronavirus disease 2019 (COVID-19) pandemic outbreak, which began at the end of 2019. Hence, we might have missed some important elements of learning behavior that are required for the successful implementation of PBL during the COVID-19 pandemic.

Even so, the results of this study are also applicable to the administration of PBL sessions during the COVID-19 pandemic, as one of the functional skills identified is digital proficiency, a skill that is important for the successful implementation of online PBL sessions.

This review identified the essential learning behaviors required for effective PBL in higher education and clustered them into three main themes: (i) intrinsic empowerment, (ii) entrustment, and (iii) functional skills. These learning behaviors must coexist to ensure the achievement of the desired learning outcomes. The findings indicate two important implications for future practice. Firstly, the identified learning behaviors can be incorporated as functional elements in PBL frameworks and implementation. Secondly, changes in and adaptation of these learning behaviors can be considered a new domain of formative assessment in PBL. Notably, these learning behaviors could help foster the development of lifelong learning skills for future workplace challenges. Nevertheless, considerably more work is needed to design solid guidelines on how to systematically adopt these learning behaviors in PBL sessions, especially during the COVID-19 pandemic.

This study was supported by Postgraduate Incentive Grant-PhD (GIPS-PhD, grant number: 311/PPSP/4404803).

Declarations

The study received ethical approval from the Human Research Ethics Committee of Universiti Sains Malaysia.

No informed consent was required for this scoping review.

The authors declare no competing interests.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Writing and Assessing Student Learning Outcomes

By the end of a program of study, what do you want students to be able to do? How can your students demonstrate the knowledge the program intended them to learn? Student learning outcomes are statements developed by faculty that answer these questions. Typically, student learning outcomes (SLOs) describe the knowledge, skills, attitudes, behaviors, or values students should be able to demonstrate at the end of a program of study. A combination of methods may be used to assess student attainment of learning outcomes.

Characteristics of Student Learning Outcomes (SLOs)

  • Describe what students should be able to demonstrate, represent or produce upon completion of a program of study (Maki, 2010)

[Diagram: sample learning outcomes paired with the action verbs that demonstrate learning.]

Student learning outcomes also:

  • Should align with the institution’s curriculum and co-curriculum outcomes (Maki, 2010)
  • Should be collaboratively authored and collectively accepted (Maki, 2010)
  • Should incorporate or adapt professional organizations’ outcome statements when they exist (Maki, 2010)
  • Can be quantitatively and/or qualitatively assessed during a student’s studies (Maki, 2010)

Examples of Student Learning Outcomes

The following examples of student learning outcomes are too general and would be very hard to measure (T. Banta, personal communication, October 20, 2010):

  • will appreciate the benefits of exercise science.
  • will understand the scientific method.
  • will become familiar with correct grammar and literary devices.
  • will develop problem-solving and conflict resolution skills.

The following examples, while better, are still general and would again be hard to measure (T. Banta, personal communication, October 20, 2010):

  • will appreciate exercise as a stress reduction tool.
  • will apply the scientific method in problem solving.
  • will demonstrate the use of correct grammar and various literary devices.
  • will demonstrate critical thinking skills, such as problem solving as it relates to social issues.

The following examples are specific and would be fairly easy to measure when using the correct assessment measure (T. Banta, personal communication, October 20, 2010):

  • will explain how the science of exercise affects stress.
  • will design a grounded research study using the scientific method.
  • will demonstrate the use of correct grammar and various literary devices in creating an essay.
  • will analyze and respond to arguments about racial discrimination.

Importance of Action Verbs and Examples from Bloom’s Taxonomy

  • Action verbs result in overt behavior that can be observed and measured (see list below).
  • Verbs that are unclear, and verbs that relate to unobservable or unmeasurable behaviors, should be avoided (e.g., appreciate, understand, know, learn, become aware of, become familiar with). View Bloom’s Taxonomy Action Verbs

Assessing SLOs

Instructors may measure student learning outcomes directly, assessing student-produced artifacts and performances; instructors may also measure student learning indirectly, relying on students’ own perceptions of learning.

Direct Measures of Assessment

Direct measures of student learning require students to demonstrate their knowledge and skills. They provide tangible, visible and self-explanatory evidence of what students have and have not learned as a result of a course, program, or activity (Suskie, 2004; Palomba & Banta, 1999). Examples of direct measures include:

  • Objective tests
  • Presentations
  • Classroom assignments

This example of a Student Learning Outcome (SLO) from psychology could be assessed by an essay, case study, or presentation: Students will analyze current research findings in the areas of physiological psychology, perception, learning, abnormal and social psychology.

Indirect Measures of Assessment

Indirect measures of student learning capture students’ perceptions of their knowledge and skills; they supplement direct measures of learning by providing information about how and why learning is occurring. Examples of indirect measures include:

  • Self assessment
  • Peer feedback
  • End of course evaluations
  • Questionnaires
  • Focus groups
  • Exit interviews

Using the SLO example from above, an instructor could add questions to an end-of-course evaluation asking students to self-assess their ability to analyze current research findings in the areas of physiological psychology, perception, learning, abnormal and social psychology. Doing so would provide an indirect measure of the same SLO.

Using more than one measure to assess a learning outcome:

  • Balances the limitations inherent when using only one method (Maki, 2004).
  • Provides students the opportunity to demonstrate learning in an alternative way (Maki, 2004).
  • Contributes to an overall interpretation of student learning at both institutional and programmatic levels.
  • Values the many ways student learn (Maki, 2004).

Bloom, B. (1956). A taxonomy of educational objectives: The classification of educational goals, handbook I: Cognitive domain. New York: McKay.

Maki, P. L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.

Maki, P. L. (2010). Assessing for learning: Building a sustainable commitment across the institution (2nd ed.). Sterling, VA: Stylus.

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.

Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker Publishing.

Revised by Doug Jerolimov (April, 2016)

Helpful Links

  • Revised Bloom's Taxonomy Action Verbs
  • Fink's Taxonomy

Related Guides

  • Creating a Syllabus
  • Assessing Student Learning Outcomes

Recommended Books

Assessing for Learning by Peggy L. Maki


Assessing Student Learning Outcomes Across a Curriculum

  • First Online: 20 April 2016


  • Marcia Mentkowski
  • Jeana Abromeit
  • Heather Mernitz
  • Kelly Talley
  • Catherine Knuteson
  • William H. Rickards
  • Lois Kailhofer
  • Jill Haberman
  • Suzanne Mente

Part of the book series: Innovation and Change in Professional Education (ICPE, volume 13)


Disciplinary and professional competence in postsecondary education is made up of complex sets of constructs and role performances that differ markedly across the disciplines and professions. These often defy definition as learning outcomes because they are multidimensional and holistic. Even so, instructors who teach and assessors who evaluate competence in many fields may engage their colleagues in processes, usually within disciplines and professions, to capture enough breadth and depth of constructs and performances that are essential for particular roles. The question is whether students can integrate and transfer their learning across a curriculum and over time. Authors report on the design of an assessment technique for integration of knowledge constructs and role performances and their use, and adaptation and transfer across math and science prerequisite coursework. This assessment requires students to demonstrate scientific reasoning, quantitative literacy, analysis, and problem solving across these disciplines and over time, on demand, and in a setting outside of their regular coursework. During training of faculty assessors, independent evaluators recorded and categorized faculty questions re validity and reliability of their judgments and of assessment policies and procedures. A subgroup resolved them through action research. The authors conclude that each of the validity and reliability issues, also identified by the subgroup of multidisciplinary faculty and educational researchers, was also raised by faculty members as they were being trained as assessors. These faculty assessors were from across the disciplines and professions. Thus, faculties experienced in performance assessments who also serve as assessors of broad learning outcomes are likely to continue to develop assessment techniques with appropriate considerations of validity, reliability, and especially consequential validity. At this college, contextual and consequential validity for demonstration of individual student learning outcomes on assessments of integration and transfer imply achievement of complex, multidimensional learning outcomes, so students who were unsuccessful had further opportunity for instruction and reassessment.



Notes

1. Rubric authors identify some other purposes as well, for example, using the Rubrics to shape learning outcomes for a series of courses.

2. Statements were checked by participants from the meeting to ensure accuracy.


Author information

Authors and affiliations.

Faculty of Psychology, Alverno College, Milwaukee, WI, USA

Marcia Mentkowski

Faculty of Sociology, Alverno College, Milwaukee, WI, USA

Jeana Abromeit

Faculty of Natural Sciences, Alverno College, Milwaukee, WI, USA

Heather Mernitz

Assessment Center, Alverno College, Milwaukee, WI, USA

Kelly Talley & Jill Haberman

Faculty of Nursing, Alverno College, Milwaukee, WI, USA

Catherine Knuteson

Research and Evaluation Department, Alverno College, Milwaukee, WI, USA

William H Rickards

Faculty of Mathematics, Alverno College, Milwaukee, WI, USA

Lois Kailhofer

Faculty of Instructional Services, Alverno College, Milwaukee, WI, USA

Suzanne Mente


Corresponding author

Correspondence to Marcia Mentkowski.

Editor information

Editors and affiliations.

University of California, Los Angeles, California, USA

Paul F. Wimmers

Alverno College, Milwaukee, Wisconsin, USA


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Mentkowski, M. et al. (2016). Assessing Student Learning Outcomes Across a Curriculum. In: Wimmers, P., Mentkowski, M. (eds) Assessing Competence in Professional Performance across Disciplines and Professions. Innovation and Change in Professional Education, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-319-30064-1_8


DOI: https://doi.org/10.1007/978-3-319-30064-1_8

Published: 20 April 2016

Publisher Name: Springer, Cham

Print ISBN: 978-3-319-30062-7

Online ISBN: 978-3-319-30064-1

eBook Packages: Education (R0)


Improving Students' Problem Solving Ability and High Level Thinking in Mathematics Through Problem Based Learning Model in The Covid-19 Pandemic

  • January 2021

Evlin Minarista Limbong, State University of Medan






COMMENTS

  1. Core Outcomes: Critical Thinking and Problem Solving

    Core Outcomes. Sample Indicators. Level 1. Limited demonstration or application of knowledge and skills. Identifies the main problem, question at issue or the source's position. Identifies implicit aspects of the problem and addresses their relationship to each other. Level 2. Basic demonstration and application of knowledge and skills.

  2. Creating Learning Outcomes

    A learning outcome is a concise description of what students will learn and how that learning will be assessed. Having clearly articulated learning outcomes can make designing a course, assessing student learning progress, and facilitating learning activities easier and more effective. Learning outcomes can also help students regulate their learning and develop effective study strategies.

  3. Problem-Based Learning: An Overview of its Process and Impact on

    In this review, we provide an overview of the process of problem-based learning (PBL) and the studies examining the effectiveness of PBL. We also discuss a number of naturalistic and empirical studies that have examined the process of PBL and how its various components impact students' learning. We conclude that the studies comparing the ...

  4. The effectiveness of collaborative problem solving in promoting

    On the basis of these results, recommendations are made for further study and instruction to better support students' critical thinking in the context of collaborative problem-solving.

  5. Study shows that students learn more when taking part in classrooms

Study shows students in 'active learning' classrooms learn more than they think. For decades, there has been evidence that classroom techniques designed to get students to participate in the learning process produce better educational outcomes at virtually all levels.

  6. PDF What Are Student Learning Outcomes?

What Are Student Learning Outcomes? Learning outcomes are statements of the knowledge, skills and abilities individual students should possess and can demonstrate upon completion of a learning experience or sequence of learning experiences. Before preparing a list of learning outcomes, consider the following recommendations: ...

  7. Full article: Fostering student engagement through a real-world

    Ample research has identified several features of a learning experience likely to enhance student learning, including collaboration, open-ended exploration, and problem-based learning in real-life ...

  8. Problem-Based Learning

    Problem-based learning (PBL) is a student-centered approach in which students learn about a subject by working in groups to solve an open-ended problem. This problem is what drives the motivation and the learning.

  9. How do students'roles in collaborative learning affect collaborative

    Highlights • Collaborative learning manifests the inherently social nature of learning through peer-directed interaction. • The role a student takes during group work influences that student's behavior and determines group interactions. • The student's roles that facilitate the development of collaborative problem-solving competency were fluid and subject to change during group work ...

  10. Collaborative Learning to Improve Problem-Solving Skills: A Relation

Research on the effectiveness of collaborative learning approaches usually concentrates on individual performance as the primary indicator for a successful learning outcome. However, inconsistent success has been demonstrated for students' outcomes after...

  11. Student-Centered Learning: Practical Application of Theory in Practice

    EUGENE F. TRESTER has 30+ years of experience in employing student-centered learning in classrooms and in assisting the development of facilitators in the practice of student-centered learning nationally and internationally. Trester has received recognition and awards internationally, nationally, and locally, including multiple students ...

  12. Effective Learning Behavior in Problem-Based Learning: a Scoping Review

    Problem-based learning (PBL) emphasizes learning behavior that leads to critical thinking, problem-solving, communication, and collaborative skills in preparing students for a professional medical career. However, learning behavior that develops these ...

  13. Writing and Assessing Student Learning Outcomes

    View this resource to learn more about the characteristics of well-written student learning outcome statements, see examples of how to strengthen learning outcomes, and consider how to assess learning outcomes directly or indirectly.

  14. PDF Fostering Student Engagement: Creative Problem-Solving in Small Group

... in small group facilitations to support peer learning. Introduction: Creative Problem-Solving (CPS) is a powerful teaching method that can support a pedagogical shift in the classroom and foster both student engagement and motivation to learn. Caswell (2006) describes it as an approach to finding workable answers to problems that exist in ...

  15. (PDF) Improving student problem-solving skill and cognitive learning

    This study aimed to improve student problem-solving skill and cognitive learning outcome through the implementation of problem-based learning (PBL) model in learning biology at the senior high school.

  16. Effects of a problem posing instructional interventions on student

    Therefore, this study aims to explore the effects of a problem posing instructional intervention on student learning outcomes at the cognitive and non-cognitive levels from 2000 to 2023, using a three-level meta-analysis. 32 studies and 4,068 participants were included to compare the classrooms with and without problem posing instructional ...

  17. Assessing Student Learning Outcomes Across a Curriculum

    Disciplinary and professional competence in postsecondary education is made up of complex sets of constructs and role performances that differ markedly across the disciplines and professions. These often defy definition as learning outcomes because they are...

  18. PDF The Effect of Problem-Solving Instructional Strategies on Students

The results revealed that students taught using problem-solving performed significantly better than those taught through the lecture method. Based on the findings, chemistry teachers are encouraged to attend seminars/workshops on problem-solving in order to facilitate the teaching and learning of chemistry in schools.

  19. (PDF) Improving Students' Problem Solving Ability and High Level

    The results showed that the problem-based learning approach to mathematics learning was able to improve students' high-level thinking and problem-solving skills in mathematics.

  20. Problem solving skills improvement and the impact on students' learning

The experiment class received treatment using e-project-based learning: the experiment classes were given the e-project to improve problem-solving skills, while the control classes learned conventionally. Data were collected through pretest and posttest results from students in both classes.

  21. Productive Problem-Solving Behaviors of Students with Learning Disabilities

    ABSTRACT. The purpose of this study was to explore the problem-solving behaviors of middle-school students with learning disabilities (SLD). Think-aloud interviews were performed with 20 seventh- and eighth-grade students who had learning disabilities to observe their behaviors while solving mathematical word problems (i.e., behaviors and patterns of behaviors).