
Yes, We Can Define, Teach, and Assess Critical Thinking Skills


Jeff Heyck-Williams (He, His, Him), Director of the Two Rivers Learning Institute in Washington, DC


Today’s learners face an uncertain present and a rapidly changing future that demand far different skills and knowledge than were needed in the 20th century. We also know so much more about enabling deep, powerful learning than we ever did before. Our collective future depends on how well young people prepare for the challenges and opportunities of 21st-century life.


While the idea of teaching critical thinking has been bandied about in education circles since at least the time of John Dewey, it has taken on greater prominence in education debates with the advent of the term “21st century skills” and discussions of deeper learning. There is increasing agreement among education reformers that critical thinking is an essential ingredient in the long-term success of all of our students.

However, there are still those in the education establishment and in the media who argue that critical thinking isn’t really a thing, or that these skills aren’t well defined and that, even if they could be defined, they can’t be taught or assessed.

To those naysayers, I have to disagree. Critical thinking is a thing. We can define it; we can teach it; and we can assess it. In fact, as part of a multi-year Assessment for Learning Project, Two Rivers Public Charter School in Washington, D.C., has done just that.

Before I dive into what we have done, I want to acknowledge that some of the criticism has merit.

First, there are those who argue that critical thinking can only exist when students have a vast fund of knowledge; that is, a student cannot think critically if they don’t have something substantive to think about. I agree. Students do need a robust foundation of core content knowledge to effectively think critically. Schools still have a responsibility for building students’ content knowledge.

However, I would argue that students don’t need to wait to think critically until after they have mastered some arbitrary amount of knowledge. They can start building critical thinking skills when they walk in the door. All students come to school with experience and knowledge which they can immediately think critically about. In fact, some of the thinking that they learn to do helps augment and solidify the discipline-specific academic knowledge that they are learning.

The second criticism is that critical thinking skills are always highly contextual. In this argument, the critics make the point that the types of thinking that students do in history are categorically different from the types of thinking they do in science or math. Thus, they argue, teaching broadly defined, content-neutral critical thinking skills is impossible. I agree that there are domain-specific thinking skills that students should learn in each discipline. However, I also believe that there are several generalizable skills that elementary school students can learn that have broad applicability to their academic and social lives. That is what we have done at Two Rivers.

Defining Critical Thinking Skills

We began this work by first defining what we mean by critical thinking. After a review of the literature and looking at the practice at other schools, we identified five constructs that encompass a set of broadly applicable skills: schema development and activation; effective reasoning; creativity and innovation; problem solving; and decision making.


We then created rubrics to provide a concrete vision of what each of these constructs looks like in practice. Working with the Stanford Center for Assessment, Learning and Equity (SCALE), we refined these rubrics to capture clear and discrete skills.

For example, we defined effective reasoning as the skill of creating an evidence-based claim: students need to construct a claim, identify relevant support, link their support to their claim, and identify possible questions or counterclaims. Rubrics provide an explicit vision of the skill of effective reasoning for students and teachers. By breaking the rubrics down for different grade bands, we have been able not only to describe what reasoning is but also to delineate how the skills develop in students from preschool through 8th grade.


Before moving on, I want to freely acknowledge that in narrowly defining reasoning as the construction of evidence-based claims, we have disregarded some elements of reasoning that students can and should learn. For example, the difference between constructing claims through deductive versus inductive means is not highlighted in our definition. However, by privileging a definition that has broad applicability across disciplines, we are able to gain traction in developing the roots of critical thinking: in this case, formulating well-supported claims or arguments.

Teaching Critical Thinking Skills

The definitions of critical thinking constructs were only useful to us insofar as they translated into practical skills that teachers could teach and students could learn and use. Consequently, we found that to teach a set of cognitive skills, we needed thinking routines that defined the regular application of these critical thinking and problem-solving skills across domains. Building on Harvard’s Project Zero Visible Thinking work, we named routines aligned with each of our constructs.

For example, with the construct of effective reasoning, we aligned the Claim-Support-Question thinking routine to our rubric. Teachers then were able to teach students that whenever they were making an argument, the norm in the class was to use the routine in constructing their claim and support. The flexibility of the routine has allowed us to apply it from preschool through 8th grade and across disciplines from science to economics and from math to literacy.


Kathryn Mancino, a 5th grade teacher at Two Rivers, has deliberately taught three of our thinking routines to students using anchor charts. Her charts name the components of each routine and have a place for students to record when they’ve used it and what they have figured out about the routine. By using this structure with a chart that can be added to throughout the year, students see the routines as broadly applicable across disciplines and are able to refine their application over time.

Assessing Critical Thinking Skills

By defining specific constructs of critical thinking and building thinking routines that support their implementation in classrooms, we have operated under the assumption that students are developing skills that they will be able to transfer to other settings. However, we recognized both the importance and the challenge of gathering reliable data to confirm this.

With this in mind, we have developed a series of short performance tasks around novel discipline-neutral contexts in which students can apply the constructs of thinking. Through these tasks, we have been able to provide an opportunity for students to demonstrate their ability to transfer the types of thinking beyond the original classroom setting. Once again, we have worked with SCALE to define tasks where students easily access the content but where the cognitive lift requires them to demonstrate their thinking abilities.

These assessments demonstrate that it is possible to capture meaningful data on students’ critical thinking abilities. They are not intended to be high-stakes accountability measures. Instead, they are designed to give students, teachers, and school leaders discrete formative data on hard-to-measure skills.

While it is clearly difficult, and we have not solved all of the challenges to scaling assessments of critical thinking, we can define, teach, and assess these skills. In fact, knowing how important they are for the economy of the future and our democracy, it is essential that we do.

Jeff Heyck-Williams (He, His, Him)

Director of the Two Rivers Learning Institute.

Jeff Heyck-Williams is the director of the Two Rivers Learning Institute and a founder of Two Rivers Public Charter School. He has led work around creating school-wide cultures of mathematics, developing assessments of critical thinking and problem-solving, and supporting project-based learning.




Performance assessment of critical thinking: conceptualization, design, and implementation.

Henry I. Braun*

  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas that entail ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT, with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) the storyline describes a carefully curated version of a complex, real-world situation; (2) the challenge frames the task to be accomplished; (3) a portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics; and (4) the scoring rubric comprises a set of scales, each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program, which involve various refinements and extensions of the assessment framework and a number of empirical studies, along with linkages to current work in online reading and information processing.

Introduction

In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010 ; Hyytinen et al., 2019 ). The importance of CT is echoed by business leaders ( Association of American Colleges and Universities [AACU], 2018 ), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018 ). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents ( Indiana University, 2019 ). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard ( Arum and Roksa, 2011 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019 ). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017 ); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose ( Leu et al., 2020 ).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment.

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills ( Pellegrino and Hilton, 2012 ) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct ( Liu et al., 2014 ). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument ( Stolzenberg et al., 2019 ).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003 ; Kosslyn and Nelson, 2017 ; Shavelson et al., 2018 ). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information ( Oser and Biedermann, 2020 ).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis, is the most complex of the three. Critical Analysis requires both knowledge in a specific discipline (conceptual) and procedural analytical (deduction, induction, etc.) knowledge. The second level is Critical Reflection, which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one needs not only to apply analytic reasoning, but also to adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of various actors involved in the dilemma of interest. The third level, Critical Alertness, involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types of settings in which they are exercised: (i) solving problems and making decisions in professional and everyday life, for instance related to civic affairs and the environment; and (ii) situations in which various mental processes (e.g., comparing, evaluating, and justifying) are developed through formal instruction, usually in a discipline. In both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments (Nagel et al., 2020). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solve a problem, decide on a course of action, find an answer to a given question or reach a conclusion) ( Shavelson et al., 2019 ). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of actions; and communicating clearly and concisely decisions and actions. The order in which activities are carried out can vary among individuals and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also from the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – to which individuals have to apply these skills (Kegan, 1994; Tessier-Lavigne, 2020). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, and civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second-generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about and acting in learning, professional, and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. For those interested in current approaches to the measurement of CT that are not the focus of this paper, consult Zlatkin-Troitschanskaia et al. (2018) .

In this paper, we have singled out performance assessment as it offers important advantages for measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer, and scenario-based tasks (for an overview, see Liu et al., 2014). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct such as perspective taking and communication. High-fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses that are more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items (Messick, 1994; Braun, 2019). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, by traditional assessments (Lane and Stone, 2006; Braun, 2019; Shavelson et al., 2019). However, these assertions must be empirically validated, and the measures should be subjected to psychometric analyses. Evidence on the reliability, validity, and interpretative challenges of performance assessment (PA) is extensively detailed in Davey et al. (2015).

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment ( Davey et al., 2015 , p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019). In this paper, we refer to both individual performance- and constructed-response tasks as performance tasks (PTs; for an example, see Table 1 in section “iPAL Assessment Framework”).


Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL 1 ). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth in higher education of research and practice in measuring CT with performance tasks ( Shavelson et al., 2018 ). In this section, we present iPAL’s assessment framework as the basis of measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council for Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication (Klein et al., 2007; Shavelson, 2010). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PT’s: one in which students critique an argument and one in which they propose a solution to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document that elicits an open-ended response. The SRQ section (which is not linked substantively to the PT scenario) was added to increase the number of student responses and thus obtain more reliable estimates of performance at the student level than could be achieved with a single PT (Zahner, 2013; Davey et al., 2015).

iPAL Assessment Framework

Methodological foundations.

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007) . It was also informed by the results from the AHELO pilot study ( Organisation for Economic Co-operation and Development [OECD], 2012 , 2013 ), as well as the KoKoHs research program in Germany (for an overview see, Zlatkin-Troitschanskaia et al., 2017 , 2020 ). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) ( Mislevy et al., 2003 ; Mislevy and Haertel, 2006 ; Haertel and Fujii, 2017 ).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct (Mislevy et al., 2003). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to PT developers. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative such as behavioral samples from real-world situations of interest (criterion-sampling; McClelland, 1973), as well as the intended use(s) (for an example, see Shavelson et al., 2019). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument (Messick, 1994).

An assessment framework can be specified at different levels of granularity: an assessment battery (“omnibus” assessment, for an example see below), a single performance task, or a specific component of an assessment ( Shavelson, 2010 ; Davey et al., 2015 ). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model , which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability) and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems, and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rests on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010 , 2013 ). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019 ).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: a storyline, a challenge, a document library, and a scoring rubric. Table 1 displays these aspects, brief descriptions of each, and corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019). Storylines are drawn from various domains, for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto, which adapted for purposes of the assessment, and which discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends in large part on the nature and extent of the information provided in the document library, the amount of scaffolding included, and the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to play to judgmental errors due to fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit.
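Because the four aspects are well delineated, a task specification lends itself to a simple machine-readable representation. The sketch below is illustrative only; the class and field names are our own assumptions and are not part of the published iPAL framework.

from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Document:
    """One item in the document library."""
    title: str
    source: str                 # e.g., news article, blog post, official report
    doc_format: str             # text, table, photograph, audio, video
    trust_cues: List[str] = field(default_factory=list)  # features meant to probe evaluation of evidence

@dataclass
class RubricDimension:
    """One scale of the scoring rubric, linked to a facet of CT."""
    facet: str                  # e.g., "evaluating evidence"
    anchors: Dict[int, str]     # score point -> behavioral anchor description

@dataclass
class PerformanceTask:
    """A performance task specified under the four-aspect framework."""
    storyline: str              # curated version of a complex, real-world situation
    challenge: str              # what the respondent is asked to accomplish
    library: List[Document]
    rubric: List[RubricDimension]
    time_limit_minutes: int = 60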

The dimensions of the scoring rubric are derived from the Task Model and Student Model ( Mislevy et al., 2003 ) and signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model . More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric; namely, the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria ( Braun, 2019 ) as indicated by interrater agreement coefficients.
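As a concrete illustration of such monitoring, the fragment below computes exact agreement and a quadratic-weighted kappa for two raters scoring the same set of responses on an ordinal scale. The ratings are invented and the choice of statistic is ours; the iPAL program does not prescribe a particular agreement coefficient.

from sklearn.metrics import cohen_kappa_score

# hypothetical ratings of ten responses on one rubric dimension (1-6 scale)
rater_a = [4, 5, 2, 3, 6, 1, 4, 5, 3, 2]
rater_b = [4, 4, 2, 3, 5, 1, 4, 5, 3, 3]

exact_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
weighted_kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

print(f"exact agreement: {exact_agreement:.2f}")          # 0.70 for these ratings
print(f"quadratic-weighted kappa: {weighted_kappa:.2f}")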

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks ( Lane and Stone, 2006 ). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019 , p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).
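A rough sketch of how such an analytic dimensional scoring might be computed is given below. The grouping of the 23 indicators into the four dimensions and the simple averaging rule are assumptions made for illustration; the published procedure relies on behaviorally anchored scales and trained raters, as described above.

from statistics import mean

# hypothetical mapping of dimensions to indicator ids (23 indicators in total)
DIMENSIONS = {
    "D1-Info":     ["i01", "i02", "i03", "i04", "i05", "i06"],
    "D2-Decision": ["i07", "i08", "i09", "i10", "i11", "i12", "i13"],
    "D3-Conseq":   ["i14", "i15", "i16", "i17", "i18"],
    "D4-Writing":  ["i19", "i20", "i21", "i22", "i23"],
}

def dimension_scores(ratings):
    """Average the 1-6 indicator ratings within each dimension."""
    return {dim: round(mean(ratings[i] for i in indicators), 2)
            for dim, indicators in DIMENSIONS.items()}

# one student's (hypothetical) indicator ratings from a single rater
example = {f"i{k:02d}": score for k, score in zip(
    range(1, 24),
    [4, 3, 5, 4, 4, 3, 2, 3, 3, 4, 2, 3, 3, 2, 2, 3, 4, 3, 5, 4, 4, 5, 4])}
print(dimension_scores(example))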

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” (Shavelson et al., 2019, p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019, p. 478), three score profiles (low-, middle-, and high-performer) were identified for students. Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores, such as poor CT skills, a lack of a disposition to engage with the challenge, or the two attributes jointly. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short-constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence (Davey et al., 2015; Zlatkin-Troitschanskaia et al., 2019). These kinds of concerns are less critical when PT’s are used in classroom settings. The instructor can draw on other sources of evidence, including direct discussion with the student.
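The reported gain from two to four raters is roughly consistent with the Spearman–Brown prophecy formula applied to the rater facet, as the short calculation below shows. This is only a back-of-the-envelope check; a full generalizability analysis works from estimated variance components rather than this shortcut.

def spearman_brown(rho, k):
    """Projected reliability when the measurement facet is lengthened by a factor k."""
    return k * rho / (1 + (k - 1) * rho)

g_two_raters = 0.74
g_four_raters = spearman_brown(g_two_raters, k=2)          # doubling the number of raters
print(f"projected G with 4 raters: {g_four_raters:.2f}")   # ~0.85, close to the reported 0.84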

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a certain discipline (e.g., economics, law, or medicine), for example, then the framework must specify characteristics of the narrative and the complementary documents as to the breadth and depth of disciplinary knowledge that is represented.

To date, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally an inadequate amount of time for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019). Accordingly, it would be helpful to develop modified frameworks for PT’s that require substantially less time. For an example, see a short performance assessment of civic online reasoning, requiring response times from 10 to 50 min (Wineburg et al., 2016). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT, and specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PT’s, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PT’s could represent all facets of the construct while affording instructors and students more specific insights on strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short-answer and/or multiple-choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills, such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills. The parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT, taking 90 min or more, could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014 ), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing some flexibility for faculty to tailor the assessment to meet their unique needs. In so doing, it addresses a key weakness of the AAC&U’s VALUE initiative 2 (retrieved 5/7/2020) that has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains including CT, problem-solving and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language that is intended to clearly differentiate among levels. 3 Faculty are asked to submit student work products from a senior level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge, nor any control on task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses. This also causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures ( Braun, 2019 ).

Concluding Reflections

Challenges to interpretation and implementation.

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010; Shavelson et al., 2019). The attraction mainly rests on the assumption that elaborated PT’s are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a “promissory note” that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric quality such as reliability (Davey et al., 2015).

One reason for Messick’s (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence (American Educational Research Association et al., 2014). Following the ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes an important type of evidence. Further, as Leighton (2019) argues, response process data (“cognitive validity”) are needed to validate claims regarding the cognitive complexity of PT’s. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye-tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity, but also valuable information to guide refinements of the PT.

Going forward, iPAL PT’s must be subjected to validation studies as recommended in the Standards for Psychological and Educational Testing by American Educational Research Association et al. (2014) . With a particular focus on the criterion “relationships to other variables,” a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators’ relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the necessity of evaluating construct validity, there is the need to consider potential sources of construct-irrelevant variance (CIV). One pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, then their performance is likely to be impacted by factors unrelated to their (construct-relevant) ability ( Lane and Stone, 2006 ; Braun et al., 2011 ; Shavelson, 2013 ). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT is administered in the context of a course with the promise of generating useful feedback on students’ skill profiles.

Construct-irrelevant variance can also occur when students are not equally prepared for the format of the PT or fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT’s. Finally, the use of novel forms of documentation, such as those from the Internet, can potentially introduce CIV due to differential familiarity with forms of representation or contents. Interestingly, this suggests that there may be a conflict between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and usage of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of the students’ responses or the frequency of grammatical errors ( Lane and Stone, 2006 ). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to reliably measure student performance ( Messick, 1994 ; Davey et al., 2015 ). Indeed, it is well known that more than one performance task is needed to obtain high reliability ( Shavelson, 2013 ). This is due to both student-task interactions and variability in scoring. Sources of student-task interactions are differential familiarity with the topic ( Hyytinen and Toom, 2019 ) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For use in formative assessment as part of an instructional program, reliability can be lower than use for summative purposes. In the former case, other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.
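The point about needing several tasks can be made concrete with a small decision-study calculation for a person x task x rater design. The variance components below are invented for illustration; in practice they would be estimated from a G-study of actual score data.

def g_coefficient(var_p, var_pt, var_pr, var_res, n_tasks, n_raters):
    """Relative generalizability coefficient for a fully crossed p x t x r design."""
    error = var_pt / n_tasks + var_pr / n_raters + var_res / (n_tasks * n_raters)
    return var_p / (var_p + error)

# hypothetical variance components: person, person-x-task, person-x-rater, residual
components = dict(var_p=0.35, var_pt=0.30, var_pr=0.05, var_res=0.20)

for n_tasks in (1, 2, 3, 4):
    g = g_coefficient(**components, n_tasks=n_tasks, n_raters=2)
    print(f"{n_tasks} task(s), 2 raters: G = {g:.2f}")
# With these illustrative components, a single task yields G of about 0.45,
# while four tasks are needed to approach 0.75.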

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required for many psychometric models might be untenable for performance tasks ( Davey et al., 2015 ). Davey et al. (2015) provide the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students’ reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT, but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multi-dimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic ( Lane and Stone, 2006 ). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories ( Zlatkin-Troitschanskaia et al., 2019 ).
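As a stand-in for the latent class analysis mentioned above, the sketch below groups synthetic dimension-score profiles into three classes with a Gaussian mixture model. A genuine latent class model for ordinal rubric data would use dedicated software; the example is meant only to show the shape of the workflow from score profiles to performer classes.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# synthetic four-dimension score profiles (1-6 scale) for 300 students
low    = rng.normal([2.0, 2.2, 1.8, 2.5], 0.4, size=(100, 4))
middle = rng.normal([3.5, 3.4, 3.2, 3.8], 0.4, size=(100, 4))
high   = rng.normal([5.0, 4.8, 4.6, 5.2], 0.4, size=(100, 4))
profiles = np.clip(np.vstack([low, middle, high]), 1, 6)

model = GaussianMixture(n_components=3, random_state=0).fit(profiles)
classes = model.predict(profiles)
for c in range(3):
    members = profiles[classes == c]
    print(f"class {c}: n = {len(members)}, mean profile = {members.mean(axis=0).round(2)}")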

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PT’s and their adaptation to different languages and cultures must continue. To this point, there are a number of examples: The refugee crisis PT (cited in Table 1 ) was translated and adapted from Finnish to US English and then to Colombian Spanish. A PT concerning kidney transplants was translated and adapted from German to US English. Finally, two PT’s based on ‘legacy admissions’ to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation ( Zlatkin-Troitschanskaia et al., 2019 ). In addition, more intensive study of response processes through cognitive laboratories and the like are needed to strengthen the evidential argument for construct validity ( Leighton, 2019 ). We are currently conducting empirical studies, collecting data on both iPAL PT’s and other measures of CT. These studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support different ways CT PT’s might be used—i.e., use cases—especially those that call for formative use of PT’s. Incorporating formative assessment into courses can plausibly be expected to improve students’ competency acquisition ( Zlatkin-Troitschanskaia et al., 2017 ). With suitable choices of storylines, appropriate combinations of (modified) PT’s, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PT’s (as is the case with the CLA+), loosely coupled with the PT’s (as in drawing on the same storyline), or tightly linked to the PT’s (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students’ generic CT skills. Core curriculum courses or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at a population level. For another example, these PA’s could be used to assess the competence profiles of students entering Bachelor’s or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding with respect to the extensive preliminary work required by a task and the time required to properly complete it. Thus, it would be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents. The challenge to the student should be suitably reduced.

Some members of the iPAL collaborative have developed PT’s that are embedded in disciplines such as engineering, law and education ( Crump et al., 2019 ; for teacher education examples, see Jeschke et al., 2019 ). These are proving to be of great interest to various stakeholders and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. It is both a conceptual and an empirical question as to whether a single framework can guide development in different domains.

Performance Assessment in Online Learning Environment

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models of the new literacies that attempt to capture some key characteristics of these activities. A prominent example is a model proposed by Leu et al. (2020). The model frames online reading as a process of problem-based inquiry that calls on five practices during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest there may be benefits to closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, Supplementary Materials can now include archival photographs, audio recordings, or videos. Additional tasks might include the online search for relevant documents, though this would add considerably to the time demands. This online search could occur within a simulated Internet environment, as is the case for the IEA’s ePIRLS assessment ( Mullis et al., 2017 ).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting. Yet it can also add ambiguity and information overload. Increased authenticity, then, should be weighed against validity concerns and the time required to absorb the content in these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students’ responses could be automatically scored, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Therefore, taking advantage of the online environment to incorporate new types of supplementary documents should be a high priority and, perhaps, to introduce new response formats as well. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. 201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ https://www.ipal-rd.com/
  • ^ https://www.aacu.org/value
  • ^ When test results are reported by means of substantively defined categories, the scoring is termed “criterion-referenced”. This is in contrast to results reported as percentiles; such scoring is termed “norm-referenced”.

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.


Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is value? Available online at: https://www.aacu.org/value (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: https://www.aacu.org/research/2018-future-of-work (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274


Braun, H. I., Kirsch, I., and Yamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019

Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: http://fsse.indiana.edu/pdf/FSSE_IR_2019/summary_tables/FSSE19_Frequencies_(FSSE_2019).pdf (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MAL: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MA: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for “intelligence.” Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4-9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at: http://timssandpirls.bc.edu/pirls2016/international-results/ (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO: Feasibility Study Report, Vol. 1: Design and Implementation. Paris: OECD.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO: Feasibility Study Report, Vol. 2: Data Analysis and National Experiences. Paris: OECD.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for life and work: Developing Transferable Knowledge and Skills in the 21st Century. Washington DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Educ. Available online at: https://www.insidehighered.com/views/2009/10/16/limitations-portfolios

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity–CLA+. Council for Aid to Education. Available online at: https://pdfs.semanticscholar.org/91ae/8edfac44bce3bed37d8c9091da01d6db3776.pdf

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords : critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.

Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Henry I. Braun, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

How to Evaluate a Job Candidate’s Critical Thinking Skills in an Interview

  • Christopher Frank,
  • Paul Magnone,
  • Oded Netzer

It’s not about how they answer your questions — it’s about the kind of questions they ask you.

The oldest and still the most powerful tactic for fostering critical thinking is the Socratic method, developed over 2,400 years ago by Socrates, one of the founders of Western philosophy. The Socratic method uses thought-provoking question-and-answer probing to promote learning. It focuses on generating more questions than answers, where the answers are not a stopping point but the beginning of further analysis. Hiring managers can apply this model to create a different dialogue with candidates in a modern-day organization.

Hiring is one of the most challenging competencies to master, yet it is one of the most strategic and impactful managerial functions. A McKinsey study found that superior talent is up to eight times more productive, a dramatic illustration of the relationship between talent quality and business performance. Organizations seeking growth, or simply survival during difficult times, must successfully recruit A-list talent, thought leaders, and subject matter experts. This is often done under time constraints, as you must quickly fill a key position. Essentially, you are committing to a long-term relationship after a few very short dates.

  • Christopher Frank is the coauthor of “Decisions Over Decimals: Striking the Balance between Intuition and Information” (Wiley) and “Drinking from the Fire Hose: Making Smarter Decisions Without Drowning in Information” (Portfolio). He is the Vice President of Research and Analytics at American Express.
  • Paul Magnone is the coauthor of “Decisions Over Decimals: Striking the Balance between Intuition and Information” (Wiley) and “Drinking from the Fire Hose: Making Smarter Decisions Without Drowning in Information” (Portfolio). He currently serves as the head of global strategic alliances for Google.
  • Oded Netzer is the coauthor of “Decisions Over Decimals: Striking the Balance between Intuition and Information” (Wiley). He is the Vice Dean for Research and the Arthur J. Samberg Professor of Business at Columbia Business School, an affiliate of the Columbia Data Science Institute, and an Amazon Scholar.

  • Open access
  • Published: 09 March 2020

Rubrics to assess critical thinking and information processing in undergraduate STEM courses

  • Gil Reynders 1, 2,
  • Juliette Lantz 3,
  • Suzanne M. Ruder 2,
  • Courtney L. Stanford 4 &
  • Renée S. Cole (ORCID: 0000-0002-2807-1500) 1

International Journal of STEM Education, volume 7, Article number: 9 (2020)

Process skills such as critical thinking and information processing are commonly stated outcomes for STEM undergraduate degree programs, but instructors often do not explicitly assess these skills in their courses. Students are more likely to develop these crucial skills if there is constructive alignment between an instructor’s intended learning outcomes, the tasks that the instructor and students perform, and the assessment tools that the instructor uses. Rubrics for each process skill can enhance this alignment by creating a shared understanding of process skills between instructors and students. Rubrics can also enable instructors to reflect on their teaching practices with regard to developing their students’ process skills and facilitating feedback to students to identify areas for improvement.

Here, we provide rubrics that can be used to assess critical thinking and information processing in STEM undergraduate classrooms and to provide students with formative feedback. As part of the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project, rubrics were developed to assess these two skills in STEM undergraduate students’ written work. The rubrics were implemented in multiple STEM disciplines, class sizes, course levels, and institution types to ensure they were practical for everyday classroom use. Instructors reported via surveys that the rubrics supported assessment of students’ written work in multiple STEM learning environments. Graduate teaching assistants also indicated that they could effectively use the rubrics to assess student work and that the rubrics clarified the instructor’s expectations for how they should assess students. Students reported that they understood the content of the rubrics and could use the feedback provided by the rubric to change their future performance.

The ELIPSS rubrics allowed instructors to explicitly assess the critical thinking and information processing skills that they wanted their students to develop in their courses. The instructors were able to clarify their expectations for both their teaching assistants and students and provide consistent feedback to students about their performance. Supporting the adoption of active-learning pedagogies should also include changes to assessment strategies to measure the skills that are developed as students engage in more meaningful learning experiences. Tools such as the ELIPSS rubrics provide a resource for instructors to better align assessments with intended learning outcomes.

Introduction

Why assess process skills?

Process skills, also known as professional skills (ABET Engineering Accreditation Commission, 2012 ), transferable skills (Danczak et al., 2017 ), or cognitive competencies (National Research Council, 2012 ), are commonly cited as critical for students to develop during their undergraduate education (ABET Engineering Accreditation Commission, 2012 ; American Chemical Society Committee on Professional Training, 2015 ; National Research Council, 2012 ; Singer et al., 2012 ; The Royal Society, 2014 ). Process skills such as problem-solving, critical thinking, information processing, and communication are widely applicable to many academic disciplines and careers, and they are receiving increased attention in undergraduate curricula (ABET Engineering Accreditation Commission, 2012 ; American Chemical Society Committee on Professional Training, 2015 ) and workplace hiring decisions (Gray & Koncz, 2018 ; Pearl et al., 2019 ). Recent reports from multiple countries (Brewer & Smith, 2011 ; National Research Council, 2012 ; Singer et al., 2012 ; The Royal Society, 2014 ) indicate that these skills are emphasized in multiple undergraduate academic disciplines, and annual polls of about 200 hiring managers indicate that employers may place more importance on these skills than on applicants’ content knowledge when making hiring decisions (Deloitte Access Economics, 2014 ; Gray & Koncz, 2018 ). The assessment of process skills can provide a benchmark for achievement at the end of an undergraduate program and act as an indicator of student readiness to enter the workforce. Assessing these skills may also enable instructors and researchers to more fully understand the impact of active learning pedagogies on students.

A recent meta-analysis of 225 studies by Freeman et al. ( 2014 ) showed that students in active learning environments may achieve higher content learning gains than students in traditional lectures in multiple STEM fields when comparing scores on equivalent examinations. Active learning environments can have many different attributes, but they are commonly characterized by students “physically manipulating objects, producing new ideas, and discussing ideas with others” (Rau et al., 2017 ) in contrast to students sitting and listening to a lecture. Examples of active learning pedagogies include POGIL (Process Oriented Guided Inquiry Learning) (Moog & Spencer, 2008 ; Simonson, 2019 ) and PLTL (Peer-led Team Learning) (Gafney & Varma-Nelson, 2008 ; Gosser et al., 2001 ) in which students work in groups to complete activities with varying levels of guidance from an instructor. Despite the clear content learning gains that students can achieve from active learning environments (Freeman et al., 2014 ), the non-content-gains (including improvements in process skills) in these learning environments have not been explored to a significant degree. Active learning pedagogies such as POGIL and PLTL place an emphasis on students developing non-content skills in addition to content learning gains, but typically only the content learning is assessed on quizzes and exams, and process skills are not often explicitly assessed (National Research Council, 2012 ). In order to fully understand the effects of active learning pedagogies on all aspects of an undergraduate course, evidence-based tools must be used to assess students’ process skill development. The goal of this work was to develop resources that could enable instructors to explicitly assess process skills in STEM undergraduate classrooms in order to provide feedback to themselves and their students about the students’ process skills development.

Theoretical frameworks

The incorporation of these rubrics and other currently available tools for use in STEM undergraduate classrooms can be viewed through the lenses of constructive alignment (Biggs, 1996 ) and self-regulated learning (Zimmerman, 2002 ). The theory of constructivism posits that students learn by constructing their own understanding of knowledge rather than acquiring the meaning from their instructor (Bodner, 1986 ), and constructive alignment extends the constructivist model to consider how the alignment between a course’s intended learning outcomes, tasks, and assessments affects the knowledge and skills that students develop (Biggs, 2003 ). Students are more likely to develop the intended knowledge and skills if there is alignment between the instructor’s intended learning outcomes that are stated at the beginning of a course, the tasks that the instructor and students perform, and the assessment strategies that the instructor uses (Biggs, 1996 , 2003 , 2014 ). The nature of the tasks and assessments indicates what the instructor values and where students should focus their effort when studying. According to Biggs ( 2003 ) and Ramsden ( 1997 ), students see assessments as defining what they should learn, and a misalignment between the outcomes, tasks, and assessments may hinder students from achieving the intended learning outcomes. In the case of this work, the intended outcomes are improved process skills. In addition to aligning the components of a course, it is also critical that students receive feedback on their performance in order to improve their skills. Zimmerman’s theory of self-regulated learning (Zimmerman, 2002 ) provides a rationale for tailoring assessments to provide feedback to both students and instructors.

Zimmerman’s theory of self-regulated learning defines three phases of learning: forethought/planning, performance, and self-reflection. According to Zimmerman, individuals ideally should progress through these three phases in a cycle: they plan a task, perform the task, and reflect on their performance, then they restart the cycle on a new task. If a student is unable to adequately progress through the phases of self-regulated learning on their own, then feedback provided by an instructor may enable the students to do so (Butler & Winne, 1995 ). Thus, one of our criteria when creating rubrics to assess process skills was to make the rubrics suitable for faculty members to use to provide feedback to their students. Additionally, instructors can use the results from assessments to give themselves feedback regarding their students’ learning in order to regulate their teaching. This theory is called self-regulated learning because the goal is for learners to ultimately reflect on their actions to find ways to improve. We assert that, ideally, both students and instructors should be “learners” and use assessment data to reflect on their actions, although with different aims. Students need consistent feedback from an instructor and/or self-assessment throughout a course to provide a benchmark for their current performance and identify what they can do to improve their process skills (Black & Wiliam, 1998 ; Butler & Winne, 1995 ; Hattie & Gan, 2011 ; Nicol & Macfarlane-Dick, 2006 ). Instructors need feedback on the extent to which their efforts are achieving their intended goals in order to improve their instruction and better facilitate the development of process skills through course experiences.

In accordance with the aforementioned theoretical frameworks, tools used to assess undergraduate STEM student process skills should be tailored to fit the outcomes that are expected for undergraduate students and be able to provide formative assessment and feedback to both students and faculty about the students’ skills. These tools should also be designed for everyday classroom use to enable students to regularly self-assess and faculty to provide consistent feedback throughout a semester. Additionally, it is desirable for assessment tools to be broadly generalizable to measure process skills in multiple STEM disciplines and institutions in order to increase the rubrics’ impact on student learning. Current tools exist to assess these process skills, but they each lack at least one of the desired characteristics for providing regular feedback to STEM students.

Current tools to assess process skills

Current tests available to assess critical thinking include the Critical Thinking Assessment Test (CAT) (Stein & Haynes, 2011 ), California Critical Thinking Skills Test (Facione, 1990a , 1990b ), and Watson Glaser Critical Thinking Appraisal (Watson & Glaser, 1964 ). These commercially available, multiple-choice tests are not designed to provide regular, formative feedback throughout a course and have not been implemented for this purpose. Instead, they are designed to provide summative feedback with a focus on assessing this skill at a programmatic or university level rather than for use in the classroom to provide formative feedback to students. Rather than using tests to assess process skills, rubrics could be used. Rubrics are effective assessment tools because they can be quick and easy to use, they provide feedback to both students and instructors, and they can evaluate individual aspects of a skill to give more specific feedback (Brookhart & Chen, 2014 ; Smit & Birri, 2014 ). Rubrics for assessing critical thinking are available, but they have not been used to provide feedback to undergraduate STEM students nor were they designed to do so (Association of American Colleges and Universities, 2019 ; Saxton et al., 2012 ). The Critical Thinking Analytic Rubric is designed specifically to assess K-12 students to enhance college readiness and has not been broadly tested in collegiate STEM courses (Saxton et al., 2012 ). The critical thinking rubric developed by the Association of American Colleges and Universities (AAC&U) as part of its Valid Assessment of Learning in Undergraduate Education (VALUE) Institute and Liberal Education and America’s Promise (LEAP) initiative (Association of American Colleges and Universities, 2019 ) is intended for programmatic assessment rather than specifically giving feedback to students throughout a course. As with tests for assessing critical thinking, current rubrics to assess critical thinking are not designed to act as formative assessments and give feedback to STEM faculty and undergraduates at the course or task level. Another issue with the assessment of critical thinking is the degree to which the construct is measurable. A National Research Council report (National Research Council, 2011 ) has suggested that there is little evidence of a consistent, measurable definition for critical thinking and that it may not be different from one’s general cognitive ability. Despite this issue, we have found that critical thinking is consistently listed as a programmatic outcome in STEM disciplines (American Chemical Society Committee on Professional Training, 2015 ; The Royal Society, 2014 ), so we argue that it is necessary to support instructors as they attempt to assess this skill.

Current methods for evaluating students’ information processing include discipline-specific tools such as a rubric to assess physics students’ use of graphs and equations to solve work-energy problems (Nguyen et al., 2010 ) and assessments of organic chemistry students’ ability to “[manipulate] and [translate] between various representational forms” including 2D and 3D representations of chemical structures (Kumi et al., 2013 ). Although these assessment tools can be effectively used for their intended context, they were not designed for use in a wide range of STEM disciplines or for a variety of tasks.

Despite the many tools that exist to measure process skills, none has been designed and tested to facilitate frequent, formative feedback to STEM undergraduate students and faculty throughout a semester. The rubrics described here have been designed by the Enhancing Learning by Improving Process Skills in STEM (ELIPSS) Project (Cole et al., 2016 ) to assess undergraduate STEM students’ process skills and to facilitate feedback at the classroom level with the potential to track growth throughout a semester or degree program. The rubrics described here are designed to assess critical thinking and information processing in student written work. Rubrics were chosen as the format for our process skill assessment tools because the highest level of each category in rubrics can serve as an explicit learning outcome that the student is expected to achieve (Panadero & Jonsson, 2013 ). Rubrics that are generalizable to multiple disciplines and institutions can enable the assessment of student learning outcomes and active learning pedagogies throughout a program of study and provide useful tools for a greater number of potential users.

Research questions

This work sought to answer the following research questions for each rubric:

Does the rubric adequately measure relevant aspects of the skill?

How well can the rubrics provide feedback to instructors and students?

Can multiple raters use the rubrics to give consistent scores?

Methods

This work received Institutional Review Board approval prior to any data collection involving human subjects. The sources of data used to construct the process skill rubrics and answer these research questions were (1) peer-reviewed literature on how each skill is defined, (2) feedback from content experts in multiple STEM disciplines via surveys and in-person, group discussions regarding the appropriateness of the rubrics for each discipline, (3) interviews with students whose work was scored with the rubrics and teaching assistants who scored the student work, and (4) results of applying the rubrics to samples of student work.

Defining the scope of the rubrics

The rubrics described here and the other rubrics in development by the ELIPSS Project are intended to measure process skills, which are desired learning outcomes identified by the STEM community in recent reports (National Research Council, 2012 ; Singer et al., 2012 ). In order to measure these skills in multiple STEM disciplines, operationalized definitions of each skill were needed. These definitions specify which aspects of student work (operations) would be considered evidence for the student using that skill and establish a shared understanding of each skill by members of each STEM discipline. The starting point for this work was the process skill definitions developed as part of the POGIL project (Cole et al., 2019a ). The POGIL community includes instructors from a variety of disciplines and institutions and represented the intended audience for the rubrics: faculty who value process skills and want to more explicitly assess them. The process skills discussed in this work were defined as follows:

Critical thinking is analyzing, evaluating, or synthesizing relevant information to form an argument or reach a conclusion supported with evidence.

Information processing is evaluating, interpreting, and manipulating or transforming information.

Examples of critical thinking include the tasks that students are asked to perform in a laboratory course. When students are asked to analyze the data they collected, combine data from different sources, and generate arguments or conclusions about their data, we see this as critical thinking. However, when students simply follow the so-called “cookbook” laboratory instructions that require them to confirm pre-determined conclusions, we do not think students are engaging in critical thinking. One example of information processing is when organic chemistry students are required to re-draw molecules in different formats. The students must evaluate and interpret various pieces of one representation, and then they recreate the molecule in another representation. However, if students are asked to simply memorize facts or algorithms to solve problems, we do not see this as information processing.

Iterative rubric development

The development process was the same for the information processing rubric and the critical thinking rubric. After defining the scope of the rubric, an initial version was drafted based upon the definition of the target process skill and how each aspect of the skill is defined in the literature. A more detailed discussion of the literature that informed each rubric category is included in the “Results and Discussion” section. This initial version then underwent iterative testing in which the rubric was reviewed by researchers, practitioners, and students. The rubric was first evaluated by the authors and a group of eight faculty from multiple STEM disciplines who made up the ELIPSS Project’s primary collaborative team (PCT). The PCT was a group of faculty members with experience in discipline-based education research who employ active-learning pedagogies in their classrooms. This initial round of evaluation was intended to ensure that the rubric measured relevant aspects of the skill and was appropriate for each PCT member’s discipline. This evaluation determined how well the rubrics were aligned with each instructor’s understanding of the process skill including both in-person and email discussions that continued until the group came to consensus that each rubric category could be applied to student work in courses within their disciplines. There has been an ongoing debate regarding the role of disciplinary knowledge in critical thinking and the extent to which critical thinking is subject-specific (Davies, 2013 ; Ennis, 1990 ). This work focuses on the creation of rubrics to measure process skills in different domains, but we have not performed cross-discipline comparisons. This initial round of review was also intended to ensure that the rubrics were ready for classroom testing by instructors in each discipline. Next, each rubric was tested over three semesters in multiple classroom environments, illustrated in Table 1 . The rubrics were applied to student work chosen by each PCT member. The PCT members chose the student work based on their views of how the assignments required students to engage in process skills and show evidence of those skills. The information processing and critical thinking rubrics shown in this work were each tested in at least three disciplines, course levels, and institutions.

After each semester, the feedback was collected from the faculty testing the rubric, and further changes to the rubric were made. Feedback was collected in the form of survey responses along with in-person group discussions at annual project meetings. After the first iteration of completing the survey, the PCT members met with the authors to discuss how they were interpreting each survey question. This meeting helped ensure that the surveys were gathering valid data regarding how well the rubrics were measuring the desired process skill. Questions in the survey such as “What aspects of the student work provided evidence for the indicated process skill?” and “Are there edits to the rubric/descriptors that would improve your ability to assess the process skill?” allowed the authors to determine how well the rubric scores were matching the student work and identify necessary changes to the rubric. Further questions asked about the nature and timing of the feedback given to students in order to address the question of how well the rubrics provide feedback to instructors and students. The survey questions are included in the Supporting Information . The survey responses were analyzed qualitatively to determine themes related to each research question.

In addition to the surveys given to faculty rubric testers, twelve students were interviewed in fall 2016 and fall 2017. In the United States of America, the fall semester typically runs from August to December and is the first semester of the academic year. Each student participated in one interview which lasted about 30 min. These interviews were intended to gather further data to answer questions about how well the rubrics were measuring the identified process skills that students were using when they completed their assignments and to ensure that the information provided by the rubrics made sense to students. The protocol for these interviews is included in the Supporting Information . In fall 2016, the students interviewed were enrolled in an organic chemistry laboratory course for non-majors at a large, research-intensive university in the United States. Thirty students agreed to have their work analyzed by the research team, and nine students were interviewed. However, the rubrics were not a component of the laboratory course grading. Instead, the first author assessed the students’ reports for critical thinking and information processing, and then the students were provided electronic copies of their laboratory reports and scored rubrics in advance of the interview. The first author had recently been a graduate teaching assistant for the course and was familiar with the instructor’s expectations for the laboratory reports. During the interview, the students were given time to review their reports and the completed rubrics, and then they were asked about how well they understood the content of the rubrics and how accurately each category score represented their work.

In fall 2017, students enrolled in a physical chemistry thermodynamics course for majors were interviewed. The physical chemistry course took place at the same university as the organic laboratory course, but there was no overlap between participants. Three students and two graduate teaching assistants (GTAs) were interviewed. The course included daily group work, and process skill assessment was an explicit part of the instructor’s curriculum. At the end of each class period, students assessed their groups using portions of ELIPSS rubrics, including the two process skill rubrics included in this paper. About every 2 weeks, the GTAs assessed the student groups with a complete ELIPSS rubric for a particular skill, then gave the groups their scored rubrics with written comments. The students’ individual homework problem sets were assessed once with rubrics for three skills: critical thinking, information processing, and problem-solving. The students received the scored rubric with written comments when the graded problem set was returned to them. In the last third of the semester, the students and GTAs were interviewed about how rubrics were implemented in the course, how well the rubric scores reflected the students’ written work, and how the use of rubrics affected the teaching assistants’ ability to assess the student skills. The protocols for these interviews are included in the Supporting Information .

Gathering evidence for utility, validity, and reliability

The utility, validity, and reliability of the rubrics were measured throughout the development process. The utility is the degree to which the rubrics are perceived as practical to experts and practitioners in the field. Through multiple meetings, the PCT faculty determined that early drafts of the rubric seemed appropriate for use in their classrooms, which represented multiple STEM disciplines. Rubric utility was reexamined multiple times throughout the development process to ensure that the rubrics would remain practical for classroom use. Validity can be defined in multiple ways. For example, the Standards for Educational and Psychological Testing (Joint Committee on Standards for Educational Psychological Testing, 2014 ) defines validity as “the degree to which all the accumulated evidence supports the intended interpretation of test scores for the proposed use.” For the purposes of this work, we drew on the ways in which two distinct types of validity were examined in the rubric literature: content validity and construct validity. Content validity is the degree to which the rubrics cover relevant aspects of each process skill (Moskal & Leydens, 2000 ). In this case, the process skill definition and a review of the literature determined which categories were included in each rubric. The literature review was finished once the data was saturated: when no more new aspects were found. Construct validity is the degree to which the levels of each rubric category accurately reflect the process that students performed (Moskal & Leydens, 2000 ). Evidence of construct validity was gathered via the faculty surveys, teaching assistant interviews, and student interviews. In the student interviews, students were given one of their completed assignments and asked to explain how they completed the task. Students were then asked to explain how well each category applied to their work and if any changes were needed to the rubric to more accurately reflect their process. Due to logistical challenges, we were not able to obtain evidence for convergent validity, and this is further discussed in the “Limitations” section.

Adjacent agreement, also known as “interrater agreement within one,” was chosen as the measure of interrater reliability due to its common use in rubric development projects (Jonsson & Svingby, 2007 ). The adjacent agreement is the percentage of cases in which two raters agree on a rating or are different by one level (i.e., they give adjacent ratings to the same work). Jonsson and Svingby ( 2007 ) found that most of the rubrics they reviewed had adjacent agreement scores of 90% or greater. However, they noted that the agreement threshold varied based on the number of possible levels of performance for each category in the rubric, with three and four being the most common numbers of levels. Since the rubrics discussed in this report have six levels (scores of zero through five) and are intended for low-stakes assessment and feedback, the goal of 80% adjacent agreement was selected. To calculate agreement for the critical thinking and information processing rubrics, two researchers discussed the scoring criteria for each rubric and then independently assessed the organic chemistry laboratory reports.
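
As a worked illustration of this calculation, the following sketch computes adjacent agreement for two raters scoring the same set of reports on a 0–5 rubric category. The score lists are invented for the example, not data from this study.

```python
# Adjacent agreement ("interrater agreement within one") for two raters
# scoring the same student work on a 0-5 rubric category.
# The score lists below are illustrative, not data from the study.

rater_a = [5, 4, 3, 5, 2, 4, 1, 5]
rater_b = [4, 4, 1, 5, 3, 2, 1, 4]

def adjacent_agreement(a, b, tolerance=1):
    """Percentage of cases in which the two ratings differ by at most `tolerance`."""
    assert len(a) == len(b)
    within = sum(abs(x - y) <= tolerance for x, y in zip(a, b))
    return 100 * within / len(a)

pct = adjacent_agreement(rater_a, rater_b)
print(f"Adjacent agreement: {pct:.1f}%")  # compare against the 80% target described above
```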

Results and discussion

The process skill rubrics to assess critical thinking and information processing in student written work were completed after multiple rounds of revision based on feedback from various sources. These sources include feedback from instructors who tested the rubrics in their classrooms, TAs who scored student work with the rubrics, and students who were assessed with the rubrics. The categories for each rubric will be discussed in terms of the evidence that the rubrics measure the relevant aspects of the skill and how they can be used to assess STEM undergraduate student work. Each category discussion will begin with a general explanation of the category followed by more specific examples from the organic chemistry laboratory course and physical chemistry lecture course to demonstrate how the rubrics can be used to assess student work.

Information processing rubric

The definition of information processing and the focus of the rubric presented here (Fig. 1 ) are distinct from cognitive information processing as defined by the educational psychology literature (Driscoll, 2005 ). The rubric shown here is more aligned with the STEM education construct of representational competency (Daniel et al., 2018 ).

Figure 1. Rubric for assessing information processing

Evaluating

When solving a problem or completing a task, students must evaluate the provided information for relevance or importance to the task (Hanson, 2008 ; Swanson et al., 1990 ). All the information provided in a prompt (e.g., homework or exam questions) may not be relevant for addressing all parts of the prompt. Students should ideally show evidence of their evaluation process by identifying what information is present in the prompt/model, indicating what information is relevant or not relevant, and indicating why information is relevant. Responses with these characteristics would earn high rubric scores for this category. Although students may not explicitly state what information is necessary to address a task, the information they do use can act as indirect evidence of the degree to which they have evaluated all of the available information in the prompt. Evidence for students inaccurately evaluating information for relevance includes the inclusion of irrelevant information or the omission of relevant information in an analysis or in completing a task. When evaluating the organic chemistry laboratory reports, the focus for the evaluating category was the information students presented when identifying the chemical structure of their products. For students who received a high score, this information included their measured value for the product’s melting point, the literature (expected) value for the melting point, and the peaks in a nuclear magnetic resonance (NMR) spectrum. NMR spectroscopy is a commonly used technique in chemistry to obtain structural information about a compound. Lower scores were given if students omitted any of the necessary information or if they included unnecessary information. For example, if a student discussed their reaction yield when discussing the identity of their product, they would receive a low Evaluating score because the yield does not help them determine the identity of their product; the yield, in this case, would be unnecessary information. In the physical chemistry course, students often did not show evidence that they determined which information was relevant to answer the homework questions and thus earned low evaluating scores. These omissions will be further addressed in the “Interpreting” section.

Interpreting

In addition to evaluating, students must often interpret information using their prior knowledge to explain the meaning of something, make inferences, match data to predictions, and extract patterns from data (Hanson, 2008 ; Nakhleh, 1992 ; Schmidt et al., 1989 ; Swanson et al., 1990 ). Students earn high scores for this category if they assign correct meaning to labeled information (e.g., text, tables, graphs, diagrams), extract specific details from information, explain information in their own words, and determine patterns in information. For the organic chemistry laboratory reports, students received high scores if they accurately interpreted their measured values and NMR peaks. Almost every student obtained melting point values that were different than what was expected due to measurement error or impurities in their products, so they needed to describe what types of impurities could cause such discrepancies. Also, each NMR spectrum contained one peak that corresponded to the solvent used to dissolve the students’ product, so the students needed to use their prior knowledge of NMR spectroscopy to recognize that peak did not correspond to part of their product.

In physical chemistry, the graduate teaching assistant often gave students low scores for inaccurately explaining changes to chemical systems such as changes in pressure or entropy. The graduate teaching assistant who assessed the student work used the rubric to identify both the evaluating and interpreting categories as weaknesses in many of the students’ homework submissions. However, the students often earned high scores for the manipulating and transforming categories, so the GTA was able to give students specific feedback on their areas for improvement while also highlighting their strengths.

Manipulating and transforming (extent and accuracy)

In addition to evaluating and interpreting information, students may be asked to manipulate and transform information from one form to another. These transformations should be complete and accurate (Kumi et al., 2013 ; Nguyen et al., 2010 ). Students may be required to construct a figure based on written information, or conversely, they may transform information in a figure into words or mathematical expressions. Two categories for manipulating and transforming (i.e., extent and accuracy) were included to allow instructors to give more specific feedback. It was often found that students would either transform little information but do so accurately, or transform much information and do so inaccurately; the two categories allowed for differentiated feedback to be provided. As stated above, the organic chemistry students were expected to transform their NMR spectral data into a table and provide a labeled structure of their final product. Students were given high scores if they converted all of the relevant peaks from their spectrum into the table format and were able to correctly match the peaks to the hydrogen atoms in their products. Students received lower scores if they were only able to convert the information for a few peaks or if they incorrectly matched the peaks to the hydrogen atoms.
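
To illustrate how category-level scores can be turned into the kind of differentiated feedback described above, here is a minimal sketch that stores the information processing categories with one-line reminders and flags low-scoring categories. The descriptor text, threshold, and example scores are invented for this sketch; the actual descriptors are those in Fig. 1.

```python
# Sketch: representing an ELIPSS-style rubric as a data structure and turning
# per-category scores (0-5) into brief formative feedback.
# Category labels follow the information processing rubric; the one-line
# descriptors, threshold, and example scores are invented for illustration.

RUBRIC = {
    "Evaluating": "Identifies which provided information is relevant to the task",
    "Interpreting": "Assigns correct meaning to text, tables, graphs, and diagrams",
    "Manipulating and transforming (extent)": "Transforms all relevant information into the new form",
    "Manipulating and transforming (accuracy)": "Transforms information without errors",
}

def formative_feedback(scores, threshold=3):
    """Flag categories below the threshold as areas for improvement."""
    notes = []
    for category, descriptor in RUBRIC.items():
        score = scores[category]
        status = "strength" if score >= threshold else "area for improvement"
        notes.append(f"{category} ({score}/5, {status}): {descriptor}")
    return notes

# Example scores echoing the pattern reported for the physical chemistry homework,
# where evaluating/interpreting scores were low but transforming scores were high.
student_scores = {
    "Evaluating": 2,
    "Interpreting": 2,
    "Manipulating and transforming (extent)": 5,
    "Manipulating and transforming (accuracy)": 4,
}
for line in formative_feedback(student_scores):
    print(line)
```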

Critical thinking rubric

Critical thinking can be broadly defined in different contexts, but we found that the categories included in the rubric (Fig. 2 ) represented commonly accepted aspects of critical thinking (Danczak et al., 2017 ) and suited the needs of the faculty collaborators who tested the rubric in their classrooms.

Figure 2. Rubric for assessing critical thinking

Evaluating

When completing a task, students must evaluate the relevance of information that they will ultimately use to support a claim or conclusions (Miri et al., 2007 ; Zohar et al., 1994 ). An evaluating category is included in both critical thinking and information processing rubrics because evaluation is a key aspect of both skills. From our previous work developing a problem-solving rubric (manuscript in preparation) and our review of the literature for this work (Danczak et al., 2017 ; Lewis & Smith, 1993 ), we saw overlap between information processing, critical thinking, and problem-solving. Additionally, while the Evaluating category in the information processing rubric assesses a student’s ability to determine the importance of information to complete a task, the evaluating category in the critical thinking rubric places a heavier emphasis on using the information to support a conclusion or argument.

When scoring student work with the evaluating category, students receive high scores if they indicate what information is likely to be most relevant to the argument they need to make, determine the reliability of the source of their information, and determine the quality and accuracy of the information itself. The information used to assess this category can be indirect as with the Evaluating category in the information processing rubric. In the organic chemistry laboratory reports, students needed to make an argument about whether they successfully produced the desired product, so they needed to discuss which information was relevant to their claims about the product’s identity and purity. Students received high scores for the evaluating category when they accurately determined that the melting point and nearly all peaks except the solvent peak in the NMR spectrum indicated the identity of their product. Students received lower scores for evaluating when they left out relevant information because this was seen as evidence that the student inaccurately evaluated the information’s relevance in supporting their conclusion. They also received lower scores when they incorrectly stated that a high yield indicated a pure product. Students were given the opportunity to demonstrate their ability to evaluate the quality of information when discussing their melting point. Students sometimes struggled to obtain reliable melting point data due to their inexperience in the laboratory, so the rubric provided a way to assess the student’s ability to critique their own data.

Analyzing

In tandem with evaluating information, students also need to analyze that same information to extract meaningful evidence to support their conclusions (Bailin, 2002 ; Lai, 2011 ; Miri et al., 2007 ). The analyzing category provides an assessment of a student’s ability to discuss information and explore the possible meaning of that information, extract patterns from data/information that could be used as evidence for their claims, and summarize information that could be used as evidence. For example, in the organic chemistry laboratory reports, students needed to compare the information they obtained to the expected values for a product. Students received high scores for the analyzing category if they could extract meaningful structural information from the NMR spectrum and their two melting points (observed and expected) for each reaction step.

Synthesizing

Often, students are asked to synthesize or connect multiple pieces of information in order to draw a conclusion or make a claim (Huitt, 1998 ; Lai, 2011 ). Synthesizing involves identifying the relationships between different pieces of information or concepts, identifying ways that different pieces of information or concepts can be combined, and explaining how the newly synthesized information can be used to reach a conclusion and/or support an argument. While performing the organic chemistry laboratory experiments, students obtained multiple types of information such as the melting point and NMR spectrum in addition to other spectroscopic data such as an infrared (IR) spectrum. Students received high scores for this category when they accurately synthesized these multiple data types by showing how the NMR and IR spectra could each reveal different parts of a molecule in order to determine the molecule’s entire structure.

Forming arguments (structure and validity)

The final key aspect of critical thinking is forming a well-structured and valid argument (Facione, 1984 ; Glassner & Schwarz, 2007 ; Lai, 2011 ; Lewis & Smith, 1993 ). It was observed that students can earn high scores for evaluating, analyzing, and synthesizing, but still struggle to form arguments. This was particularly common in assessing problem sets in the physical chemistry course.

As with the manipulating and transforming categories in the information processing rubric, two forming arguments categories were included to allow instructors to give more specific feedback. Some students may be able to include all of the expected structural elements of their arguments but use faulty information or reasoning. Conversely, some students may be able to make scientifically valid claims but not necessarily support them with evidence. The two forming arguments categories are intended to accurately assess both of these scenarios. For the forming arguments (structure) category, students earn high scores if they explicitly state their claim or conclusion, list the evidence used to support the argument, and provide reasoning to link the evidence to their claim/conclusion. Students who do not make a claim or who provide little evidence or reasoning receive lower scores.

For the forming arguments (validity) category, students earn high scores if their claim is accurate and their reasoning is logical and clearly supports the claim with provided evidence. Organic chemistry students earned high scores for the forms and supports arguments categories if they made explicit claims about the identity and purity of their product and provided complete and accurate evidence for their claim(s) such as the melting point values and positions of NMR peaks that correspond to their product. Additionally, the students provided evidence for the purity of their products by pointing to the presence or absence of peaks in their NMR spectrum that would match other potential side products. They also needed to provide logical reasoning for why the peaks indicated the presence or absence of a compound. As previously mentioned, the physical chemistry students received lower scores for the forming arguments categories than for the other aspects of critical thinking. These students were asked to make claims about the relationships between entropy and heat and then provide relevant evidence to justify these claims. Often, the students would make clearly articulated claims but would provide little evidence to support them. As with the information processing rubric, the critical thinking rubric allowed the GTAs to assess aspects of these skills independently and identify specific areas for student improvement.

Validity and reliability

The goal of this work was to create rubrics that can accurately assess student work (validity) and be consistently implemented by instructors or researchers within multiple STEM fields (reliability). The evidence for validity includes the alignment of the rubrics with literature-based descriptions of each skill, review of the rubrics by content experts from multiple STEM disciplines, interviews with undergraduate students whose work was scored using the rubrics, and interviews of the GTAs who scored the student work.

The definitions for each skill, along with multiple iterations of the rubrics, underwent review by STEM content experts. As noted earlier, the instructors who were testing the rubrics were given a survey at the end of each semester and were invited to offer suggested changes to the rubric to better help them assess their students. After multiple rubric revisions, survey responses from the instructors indicated that the rubrics accurately represented the breadth of each process skill as seen in each expert’s content area and that each category could be used to measure multiple levels of student work. By the end of the rubrics’ development, instructors were writing responses such as “N/A” or “no suggestions” to indicate that the rubrics did not need further changes.

Feedback from the faculty also indicated that the rubrics were measuring the intended constructs by the ways they responded to the survey item “What aspects of the student work provided evidence for the indicated process skill?” For example, one instructor noted that for information processing, she saw evidence of the manipulating and transforming categories when “students had to transform their written/mathematical relationships into an energy diagram.” Another instructor elicited evidence of information processing during an in-class group quiz: “A question on the group quiz was written to illicit [sic] IP [information processing]. Students had to transform a structure into three new structures and then interpret/manipulate the structures to compare the pKa values [acidity] of the new structures.” For this instructor, the structures written by the students revealed evidence of their information processing by showing what information they omitted in the new structures or inaccurately transformed. For critical thinking, an instructor assessed short research reports with the critical thinking rubric and “looked for [the students’] ability to use evidence to support their conclusions, to evaluate the literature studies, and to develop their own judgements by synthesizing the information.” Another instructor used the critical thinking rubric to assess their students’ abilities to choose an instrument to perform a chemical analysis. According to the instructor, the students provided evidence of their critical thinking because “in their papers, they needed to justify their choice of instrument. This justification required them to evaluate information and synthesize a new understanding for this specific chemical analysis.”

Analysis of student work indicates multiple levels of achievement for each rubric category (illustrated in Fig. 3 ), although there may have been a ceiling effect for the evaluating and the manipulating and transforming (extent) categories in information processing for organic chemistry laboratory reports because many students earned the highest possible score (five) for those categories. However, other implementations of the ELIPSS rubrics (Reynders et al., 2019 ) have shown more variation in student scores for the two process skills.

Fig. 3: Student rubric scores from an organic chemistry laboratory course. The two rubrics were used to evaluate different laboratory reports. Thirty students were assessed for information processing and 28 were assessed for critical thinking.

To provide further evidence that the rubrics were measuring the intended skills, students in the physical chemistry course were interviewed about their thought processes and how well the rubric scores reflected the work they performed. During these interviews, students described how they used various aspects of information processing and critical thinking skills. The students first described how they used information processing during a problem set where they had to answer questions about a diagram of systolic and diastolic blood pressure. Students described how they evaluated and interpreted the graph to make statements such as “diastolic [pressure] is our y-intercept” and “volume is the independent variable.” The students then demonstrated their ability to transform information from one form to another, from a graph to a mathematical equation, by recognizing “it’s a linear relationship so I used Y equals MX plus B” and “integrated it ’cause it’s the change, the change in V [volume].” For critical thinking, students described their process on a different problem set. In this problem set, the students had to explain why the change in Helmholtz energy and the change in Gibbs free energy were equivalent under a certain given condition. Students first demonstrated how they evaluated the relevant information and analyzed what would and would not change in their system. One student said, “So to calculate the final pressure, I think I just immediately went to the ideal gas law because we know the final volume and the number of moles won’t change and neither will the temperature in this case. Well, I assume that it wouldn’t.” Another student showed evidence of their evaluation by writing out all the necessary information in one place and stating, “Whenever I do these types of problems, I always write what I start with, which is why I always have this line of information I’m given.” After evaluating and analyzing, students had to form an argument by claiming that the two energy values were equal and then defending that claim. Students explained that they were not always as clear as they could be when justifying their claim. For instance, one student said, “Usually I just write out equations and then hope people understand what I’m doing mathematically” but they “probably could have explained it a little more.”

Student feedback throughout the organic chemistry course and near the end of the physical chemistry course indicated that the rubric scores were accurate representations of the students’ work with a few exceptions. For example, some students felt like they should have received either a lower or higher score for certain categories, but they did say that the categories themselves applied well to their work. Most notably, one student reported that the forms and supports arguments categories in the critical thinking rubric did not apply to her work because she “wasn’t making an argument” when she was demonstrating that the Helmholtz and Gibbs energy values were equal in her thermodynamics assignment. We see this as an instance where some students and instructors may define argument in different ways. The process skill definitions and the rubric categories are meant to articulate intended learning outcomes from faculty members to their students, so if a student defines the skills or categories differently than the faculty member, then the rubrics can serve to promote a shared understanding of the skill.

As previously mentioned, reliability was measured by two researchers assessing ten laboratory reports independently to ensure that multiple raters could use the rubrics consistently. The average adjacent agreement scores were 92% for critical thinking and 93% for information processing. The exact agreement scores were 86% for critical thinking and 88% for information processing. Additionally, two different raters assessed a statistics assignment that was given to sixteen first-year undergraduates. The average pairwise adjacent agreement scores were 89% for critical thinking and 92% for information processing for this assignment. However, the exact agreement scores were much lower: 34% for critical thinking and 36% for information processing. In this case, neither rater was an expert in the content area. While the exact agreement scores for the statistics assignment are much lower than desirable, the adjacent agreement scores do meet the threshold for reliability as seen in other rubrics (Jonsson & Svingby, 2007 ) despite the disparity in expertise. Based on these results, it may be difficult for multiple raters to give exactly the same scores to the same work if they have varying levels of content knowledge, but it is important to note that the rubrics are primarily intended for formative assessment that can facilitate discussions between instructors and students about the ways for students to improve. The high level of adjacent agreement scores indicates that multiple raters can identify the same areas to improve in examples of student work.
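To make the agreement figures concrete, the short sketch below shows one way to compute exact and adjacent agreement between two raters. The score lists and the ±1 definition of “adjacent” are illustrative assumptions, not data or code from the study.

```python
# Illustrative sketch: exact and adjacent (within +/- 1) agreement between two
# raters scoring the same set of reports on a 1-5 rubric scale.
# The scores below are invented for illustration, not data from the study.

def agreement(rater_a, rater_b, tolerance=0):
    """Fraction of items on which the two raters differ by at most `tolerance`."""
    assert len(rater_a) == len(rater_b), "Both raters must score the same items"
    matches = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

rater_1 = [5, 4, 3, 5, 2, 4, 3, 5, 4, 1]
rater_2 = [5, 3, 3, 4, 2, 4, 4, 5, 4, 2]

print(f"Exact agreement:    {agreement(rater_1, rater_2, tolerance=0):.0%}")
print(f"Adjacent agreement: {agreement(rater_1, rater_2, tolerance=1):.0%}")
```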

Instructor and teaching assistant reflections

The survey responses from faculty members determined the utility of the rubrics. Faculty members reported that when they used the rubrics to define their expectations and be more specific about their assessment criteria, the students seemed to be better able to articulate the areas in which they needed improvement. As one instructor put it, “having the rubrics helped open conversations and discussions” that were not happening before the rubrics were implemented. We see this as evidence of the clear intended learning outcomes that are an integral aspect of achieving constructive alignment within a course. The instructors’ specific feedback to the students, and the students’ increased awareness of their areas for improvement, may enable the students to better regulate their learning throughout a course. Additionally, the survey responses indicated that the faculty members were changing their teaching practices and becoming more cognizant of how assignments did or did not elicit the process skill evidence that they desired. After using the rubrics, one instructor said, “I realize I need to revise many of my activities to more thoughtfully induce process skill development.” We see this as evidence that the faculty members were using the rubrics to regulate their teaching by reflecting on the outcomes of their practices and then planning for future teaching. These activities represent the reflection and forethought/planning aspects of self-regulated learning on the part of the instructors. Graduate teaching assistants in the physical chemistry course indicated that the rubrics gave them a way to clarify the instructor’s expectations when they were interacting with the students. As one GTA said, “It’s giving [the students] feedback on direct work that they have instead of just right or wrong. It helps them to understand like ‘Okay how can I improve? What areas am I lacking in?’” A more detailed account of how the instructors and teaching assistants implemented the rubrics has been reported elsewhere (Cole et al., 2019a ).

Student reflections

Students in both the organic and physical chemistry courses reported that they could use the rubrics to engage in the three phases of self-regulated learning: forethought/planning, performing, and reflecting. In an organic chemistry interview, one student was discussing how they could improve their low score for the synthesizing category of critical thinking by saying “I could use the data together instead of trying to use them separately,” thus demonstrating forethought/planning for their later work. Another student described how they could use the rubric while performing a task: “I could go through [the rubric] as I’m writing a report…and self-grade.” Finally, one student demonstrated how they could use the rubrics to reflect on their areas for improvement by saying that “When you have the five column [earn a score of five], I can understand that I’m doing something right” but “I really need to work on revising my reports.” We see this as evidence that students can use the rubrics to regulate their own learning, although classroom facilitation can have an effect on the ways in which students use the rubric feedback (Cole et al., 2019b ).

Limitations

The process skill definitions presented here represent a consensus understanding among members of the POGIL community and the instructors who participated in this study, but these skills are often defined in multiple ways by various STEM instructors, employers, and students (Danczak et al., 2017 ). One issue with critical thinking, in particular, is the broadness of how the skill is defined in the literature. Through this work, we have evidence via expert review to indicate that our definitions represent common understandings among a set of STEM faculty. Nonetheless, we cannot claim that all STEM instructors or researchers will share the skill definitions presented here.

There is currently a debate in the STEM literature (National Research Council, 2011 ) about whether the critical thinking construct is domain-general or domain-specific, that is, whether or not one’s critical thinking ability in one discipline can be applied to another discipline. We cannot make claims about the generality of the construct based on the data presented here because the same students were not tested across multiple disciplines or courses. Additionally, we did not gather evidence for convergent validity, which is “the degree to which an operationalized construct is similar to other operationalized constructs that it theoretically should be similar to” (National Research Council, 2011 ). In other words, evidence for convergent validity would be the comparison of multiple measures of information processing or critical thinking. However, none of the instructors who used the ELIPSS rubrics also used a secondary measure of the constructs. Although the rubrics were examined by a multidisciplinary group of collaborators, this group consisted primarily of chemists and included only eight faculty members from other disciplines, so the content validity of the rubrics may be somewhat limited.

Finally, the generalizability of the rubrics is limited by the relatively small number of students who were interviewed about their work. During their interviews, the students in the organic and physical chemistry courses each said that they could use the rubric scores as feedback to improve their skills. Additionally, as discussed in the “Validity and Reliability” section, the processes described by the students aligned with the content of the rubric and provided evidence of the rubric scores’ validity. However, the data gathered from the student interviews only represents the views of a subset of students in the courses, and further study is needed to determine the most appropriate contexts in which the rubrics can be implemented.

Conclusions and implications

Two rubrics were developed to assess and provide feedback on undergraduate STEM students’ critical thinking and information processing. Faculty survey responses indicated that the rubrics measured the relevant aspects of each process skill in the disciplines that were examined. Faculty survey responses, TA interviews, and student interviews over multiple semesters indicated that the rubric scores accurately reflected the evidence of process skills that the instructors wanted to see and the processes that the students performed when they were completing their assignments. The rubrics showed high inter-rater agreement scores, indicating that multiple raters could identify the same areas for improvement in student work.

In terms of constructive alignment, courses should ideally have alignment between their intended learning outcomes, student and instructor activities, and assessments. By using the ELIPSS rubrics, instructors were able to explicitly articulate the intended learning outcomes of their courses to their students. The instructors were then able to assess and provide feedback to students on different aspects of their process skills. Future efforts will be focused on modifying student assignments to enable instructors to better elicit evidence of these skills. In terms of self-regulated learning, students indicated in the interviews that the rubric scores were accurate representations of their work (performances), could help them reflect on their previous work (self-reflection), and the feedback they received could be used to inform their future work (forethought). Not only did the students indicate that the rubrics could help them regulate their learning, but the faculty members indicated that the rubrics had helped them regulate their teaching. With the individual categories on each rubric, the faculty members were better able to observe their students’ strengths and areas for improvement and then tailor their instruction to meet those needs. Our results indicated that the rubrics helped instructors in multiple STEM disciplines and at multiple institutions reflect on their teaching and then make changes to better align their teaching with their desired outcomes.

Overall, the rubrics can be used in a number of different ways to modify courses or for programmatic assessment. As previously stated, instructors can use the rubrics to define expectations for their students and provide them with feedback on desired skills throughout a course. The rubric categories can be used to give feedback on individual aspects of student process skills to provide specific feedback to each student. If an instructor or department wants to change from didactic lecture-based courses to active learning ones, the rubrics can be used to measure non-content learning gains that stem from the adoption of such pedagogies. Although the examples provided here for each rubric were situated in chemistry contexts, the rubrics were tested in multiple disciplines and institution types. The rubrics have the potential for wide applicability to assess not only laboratory reports but also homework assignments, quizzes, and exams. Assessing these tasks provides a way for instructors to achieve constructive alignment between their intended outcomes and their assessments, and the rubrics are intended to enhance this alignment to improve student process skills that are valued in the classroom and beyond.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

American Association of Colleges and Universities

Critical Thinking Assessment Test

Comprehensive University

Enhancing Learning by Improving Process Skills in STEM

Liberal Education and America’s Promise

Nuclear Magnetic Resonance

Primary Collaborative Team

Peer-led Team Learning

Process Oriented Guided Inquiry Learning

Primarily Undergraduate Institution

Research University

Science, Technology, Engineering, and Mathematics

Valid Assessment of Learning in Undergraduate Education

ABET Engineering Accreditation Commission. (2012). Criteria for Accrediting Engineering Programs . Retrieved from http://www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2016-2017/ .

American Chemical Society Committee on Professional Training. (2015). Undergraduate Professional Education in Chemistry: ACS Guidelines and Evaluation Procedures for Bachelor's Degree Programs. Retrieved from https://www.acs.org/content/dam/acsorg/about/governance/committees/training/2015-acs-guidelines-for-bachelors-degree-programs.pdf

Association of American Colleges and Universities. (2019). VALUE Rubric Development Project. Retrieved from https://www.aacu.org/value/rubrics .

Bailin, S. (2002). Critical Thinking and Science Education. Science and Education, 11 , 361–375.


Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32 (3), 347–364.

Biggs, J. (2003). Aligning teaching and assessing to course objectives. Teaching and learning in higher education: New trends and innovations, 2 , 13–17.


Biggs, J. (2014). Constructive alignment in university teaching. HERDSA Review of higher education, 1 (1), 5–22.

Black, P., & Wiliam, D. (1998). Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5 (1), 7–74.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63 (10), 873–878.

Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC: American Association for the Advancement of Science.

Brookhart, S. M., & Chen, F. (2014). The quality and effectiveness of descriptive rubrics. Educational Review , 1–26.

Butler, D. L., & Winne, P. H. (1995). Feedback and Self-Regulated Learning: A Theoretical Synthesis. Review of Educational Research, 65 (3), 245–281.

Cole, R., Lantz, J., & Ruder, S. (2016). Enhancing Learning by Improving Process Skills in STEM. Retrieved from http://www.elipss.com .

Cole, R., Lantz, J., & Ruder, S. (2019a). PO: The Process. In S. R. Simonson (Ed.), POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners (pp. 42–68). Sterling, VA: Stylus Publishing.

Cole, R., Reynders, G., Ruder, S., Stanford, C., & Lantz, J. (2019b). Constructive Alignment Beyond Content: Assessing Professional Skills in Student Group Interactions and Written Work. In M. Schultz, S. Schmid, & G. A. Lawrie (Eds.), Research and Practice in Chemistry Education: Advances from the 25 th IUPAC International Conference on Chemistry Education 2018 (pp. 203–222). Singapore: Springer.


Danczak, S., Thompson, C., & Overton, T. (2017). ‘What does the term Critical Thinking mean to you?’A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking. Chemistry Education Research and Practice, 18 , 420–434.

Daniel, K. L., Bucklin, C. J., Leone, E. A., & Idema, J. (2018). Towards a Definition of Representational Competence. In Towards a Framework for Representational Competence in Science Education (pp. 3–11). Switzerland: Springer.

Davies, M. (2013). Critical thinking and the disciplines reconsidered. Higher Education Research & Development, 32 (4), 529–544.

Deloitte Access Economics. (2014). Australia's STEM Workforce: a survey of employers. Retrieved from https://www2.deloitte.com/au/en/pages/economics/articles/australias-stem-workforce-survey.html .

Driscoll, M. P. (2005). Psychology of learning for instruction . Boston, MA: Pearson Education.

Ennis, R. H. (1990). The extent to which critical thinking is subject-specific: Further clarification. Educational researcher, 19 (4), 13–16.

Facione, P. A. (1984). Toward a theory of critical thinking. Liberal Education, 70 (3), 253–261.

Facione, P. A. (1990a). The California Critical Thinking Skills Test--College Level. Technical Report #1: Experimental Validation and Content Validity.

Facione, P. A. (1990b). The California Critical Thinking Skills Test—College Level. Technical Report #2: Factors Predictive of CT Skills.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111 (23), 8410–8415.

Gafney, L., & Varma-Nelson, P. (2008). Peer-led team learning: Evaluation, dissemination, and institutionalization of a college level initiative (Vol. 16). Netherlands: Springer Science & Business Media.

Glassner, A., & Schwarz, B. B. (2007). What stands and develops between creative and critical thinking? Argumentation? Thinking Skills and Creativity, 2 (1), 10–18.

Gosser, D. K., Cracolice, M. S., Kampmeier, J. A., Roth, V., Strozak, V. S., & Varma-Nelson, P. (2001). Peer-led team learning: A guidebook. Upper Saddle River, NJ: Prentice Hall.

Gray, K., & Koncz, A. (2018). The key attributes employers seek on students' resumes. Retrieved from http://www.naceweb.org/about-us/press/2017/the-key-attributes-employers-seek-on-students-resumes/ .

Hanson, D. M. (2008). A cognitive model for learning chemistry and solving problems: implications for curriculum design and classroom instruction. In R. S. Moog & J. N. Spencer (Eds.), Process-Oriented Guided Inquiry Learning (pp. 15–19). Washington, DC: American Chemical Society.

Hattie, J., & Gan, M. (2011). Instruction based on feedback. Handbook of research on learning and instruction , 249-271.

Huitt, W. (1998). Critical thinking: an overview. In Educational psychology interactive Retrieved from http://www.edpsycinteractive.org/topics/cogsys/critthnk.html .

Joint Committee on Standards for Educational Psychological Testing. (2014). Standards for Educational and Psychological Testing : American Educational Research Association.

Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2 (2), 130–144.

Kumi, B. C., Olimpo, J. T., Bartlett, F., & Dixon, B. L. (2013). Evaluating the effectiveness of organic chemistry textbooks in promoting representational fluency and understanding of 2D-3D diagrammatic relationships. Chemistry Education Research and Practice, 14 , 177–187.

Lai, E. R. (2011). Critical thinking: a literature review. Pearson's Research Reports, 6 , 40–41.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32 , 131–137.

Miri, B., David, B., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: a case of critical thinking. Research in Science Education, 37 , 353–369.

Moog, R. S., & Spencer, J. N. (Eds.). (2008). Process oriented guided inquiry learning (POGIL) . Washington, DC: American Chemical Society.

Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: validity and reliability. Practical Assessment, Research and Evaluation, 7 , 1–11.

Nakhleh, M. B. (1992). Why some students don't learn chemistry: Chemical misconceptions. Journal of Chemical Education, 69 (3), 191.

National Research Council. (2011). Assessing 21st Century Skills: Summary of a Workshop . Washington, DC: The National Academies Press.

National Research Council. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century . Washington, DC: The National Academies Press.

Nguyen, D. H., Gire, E., & Rebello, N. S. (2010). Facilitating Strategies for Solving Work-Energy Problems in Graphical and Equational Representations. 2010 Physics Education Research Conference, 1289 , 241–244.

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31 (2), 199–218.

Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: a review. Educational Research Review, 9 , 129–144.

Pearl, A. O., Rayner, G., Larson, I., & Orlando, L. (2019). Thinking about critical thinking: An industry perspective. Industry & Higher Education, 33 (2), 116–126.

Ramsden, P. (1997). The context of learning in academic departments. The experience of learning, 2 , 198–216.

Rau, M. A., Kennedy, K., Oxtoby, L., Bollom, M., & Moore, J. W. (2017). Unpacking “Active Learning”: A Combination of Flipped Classroom and Collaboration Support Is More Effective but Collaboration Support Alone Is Not. Journal of Chemical Education, 94 (10), 1406–1414.

Reynders, G., Suh, E., Cole, R. S., & Sansom, R. L. (2019). Developing student process skills in a general chemistry laboratory. Journal of Chemical Education , 96 (10), 2109–2119.

Saxton, E., Belanger, S., & Becker, W. (2012). The Critical Thinking Analytic Rubric (CTAR): Investigating intra-rater and inter-rater reliability of a scoring mechanism for critical thinking performance assessments. Assessing Writing, 17 , 251–270.

Schmidt, H. G., De Volder, M. L., De Grave, W. S., Moust, J. H. C., & Patel, V. L. (1989). Explanatory Models in the Processing of Science Text: The Role of Prior Knowledge Activation Through Small-Group Discussion. J. Educ. Psychol., 81 , 610–619.

Simonson, S. R. (Ed.). (2019). POGIL: An Introduction to Process Oriented Guided Inquiry Learning for Those Who Wish to Empower Learners . Sterling, VA: Stylus Publishing, LLC.

Singer, S. R., Nielsen, N. R., & Schweingruber, H. A. (Eds.). (2012). Discipline-Based education research: understanding and improving learning in undergraduate science and engineering . Washington D.C.: The National Academies Press.

Smit, R., & Birri, T. (2014). Assuring the quality of standards-oriented classroom assessment with rubrics for complex competencies. Studies in Educational Evaluation, 43 , 5–13.

Stein, B., & Haynes, A. (2011). Engaging Faculty in the Assessment and Improvement of Students' Critical Thinking Using the Critical Thinking Assessment Test. Change: The Magazine of Higher Learning, 43 , 44–49.

Swanson, H. L., Oconnor, J. E., & Cooney, J. B. (1990). An Information-Processing Analysis of Expert and Novice Teachers Problem-Solving. American Educational Research Journal, 27 (3), 533–556.

The Royal Society. (2014). Vision for science and mathematics education. London, England: The Royal Society Science Policy Centre.

Watson, G., & Glaser, E. M. (1964). Watson-Glaser Critical Thinking Appraisal Manual . New York, NY: Harcourt, Brace, and World.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into Practice, 41 (2), 64–70.

Zohar, A., Weinberger, Y., & Tamir, P. (1994). The Effect of the Biology Critical Thinking Project on the Development of Critical Thinking. Journal of Research in Science Teaching, 31 , 183–196.


Acknowledgements

We thank members of our Primary Collaboration Team and Implementation Cohorts for collecting and sharing data. We also thank all the students who have allowed us to examine their work and provided feedback.

Supporting information

• Product rubric survey

• Initial implementation survey

• Continuing implementation survey

Funding

This work was supported in part by the National Science Foundation under collaborative grants #1524399, #1524936, and #1524965. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Author information

Authors and affiliations.

Department of Chemistry, University of Iowa, W331 Chemistry Building, Iowa City, Iowa, 52242, USA

Gil Reynders & Renée S. Cole

Department of Chemistry, Virginia Commonwealth University, Richmond, Virginia, 23284, USA

Gil Reynders & Suzanne M. Ruder

Department of Chemistry, Drew University, Madison, New Jersey, 07940, USA

Juliette Lantz

Department of Chemistry, Ball State University, Muncie, Indiana, 47306, USA

Courtney L. Stanford


Contributions

RC, JL, and SR performed an initial literature review that was expanded by GR. All authors designed the survey instruments. GR collected and analyzed the survey and interview data with guidance from RC. GR revised the rubrics with extensive input from all other authors. All authors contributed to reliability measurements. GR drafted all manuscript sections. RC provided extensive comments during manuscript revisions; JL, SR, and CS also offered comments. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Renée S. Cole .

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Supporting Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Reynders, G., Lantz, J., Ruder, S.M. et al. Rubrics to assess critical thinking and information processing in undergraduate STEM courses. IJ STEM Ed 7 , 9 (2020). https://doi.org/10.1186/s40594-020-00208-5


Received : 01 October 2019

Accepted : 20 February 2020

Published : 09 March 2020

DOI : https://doi.org/10.1186/s40594-020-00208-5


Keywords

  • Constructive alignment
  • Self-regulated learning
  • Process skills
  • Professional skills
  • Critical thinking
  • Information processing



Critical Thinking Testing and Assessment









The purpose of assessment in instruction is improvement. The purpose of assessing instruction for critical thinking is improving the teaching of discipline-based thinking (historical, biological, sociological, mathematical, etc.). It is to improve students’ abilities to think their way through content using disciplined skill in reasoning. The more particular we can be about what we want students to learn about critical thinking, the better we can devise instruction with that particular end in view.


The Foundation for Critical Thinking offers assessment instruments which share in the same general goal: to enable educators to gather evidence relevant to determining the extent to which instruction is teaching students to think critically (in the process of learning content). To this end, the Fellows of the Foundation recommend:

that academic institutions and units establish an oversight committee for critical thinking, and

that this oversight committee utilizes a combination of assessment instruments (the more the better) to generate incentives for faculty, by providing them with as much evidence as feasible of the actual state of instruction for critical thinking.

The following instruments are available to generate evidence relevant to critical thinking teaching and learning:

Course Evaluation Form : Provides evidence of whether, and to what extent, students perceive faculty as fostering critical thinking in instruction (course by course). Machine-scoreable.

Online Critical Thinking Basic Concepts Test : Provides evidence of whether, and to what extent, students understand the fundamental concepts embedded in critical thinking (and hence tests student readiness to think critically). Machine-scoreable.

Critical Thinking Reading and Writing Test : Provides evidence of whether, and to what extent, students can read closely and write substantively (and hence tests students' abilities to read and write critically). Short-answer.

International Critical Thinking Essay Test : Provides evidence of whether, and to what extent, students are able to analyze and assess excerpts from textbooks or professional writing. Short-answer.

Commission Study Protocol for Interviewing Faculty Regarding Critical Thinking : Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Based on the California Commission Study . Short-answer.

Protocol for Interviewing Faculty Regarding Critical Thinking : Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Short-answer.

Protocol for Interviewing Students Regarding Critical Thinking : Provides evidence of whether, and to what extent, students are learning to think critically at a college or university. Can be adapted for high school. Short-answer.

Criteria for Critical Thinking Assignments : Can be used by faculty in designing classroom assignments, or by administrators in assessing the extent to which faculty are fostering critical thinking.

Rubrics for Assessing Student Reasoning Abilities : A useful tool in assessing the extent to which students are reasoning well through course content.  

All of the above assessment instruments can be used as part of pre- and post-assessment strategies to gauge development over various time periods.

Consequential Validity

All of the above assessment instruments, when used appropriately and graded accurately, should lead to a high degree of consequential validity. In other words, the use of the instruments should cause teachers to teach in such a way as to foster critical thinking in their various subjects. In this light, for students to perform well on the various instruments, teachers will need to design instruction so that students can perform well on them. Students cannot become skilled in critical thinking without learning (first) the concepts and principles that underlie critical thinking and (second) applying them in a variety of forms of thinking: historical thinking, sociological thinking, biological thinking, etc. Students cannot become skilled in analyzing and assessing reasoning without practicing it. However, when they have routine practice in paraphrasing, summarizing, analyzing, and assessing, they will develop skills of mind requisite to the art of thinking well within any subject or discipline, not to mention thinking well within the various domains of human life.


Critical Thinking Tests (2024 Guide)

What Is Critical Thinking?


Updated November 18, 2023

Nikki Dale

Critical thinking is the ability to scrutinize evidence using intellectual skills. Reflective skills are employed to reach clear, coherent and logical conclusions – rather than just accepting information as it is provided.

Critical thinking tests measure the candidate’s understanding of logical connections between ideas, the strength of an argument, alternate interpretations and the significance of a particular claim.

A major facet of critical thinking is the ability to separate facts from opinions and work against any subconscious bias.

In critical thinking tests, employers are looking for people who can think critically about information, showing they are open-minded, good problem-solvers and excellent decision-makers.

Who Uses Critical Thinking Tests and Why?

Critical thinking tests assess how well a candidate can analyze and reason when presented with specific information.

They are used as part of the application process in several industries, most commonly for professions where employees would need to use advanced judgment and analysis skills in decision-making.

For example:

Academic applications – In some instances, critical thinking tests are used to assess whether prospective students have the skills required to be successful in higher education.

Law – Critical thinking assessments are often used in the legal sector as part of the application process. In many law positions, facts are more important than opinion, subconscious bias or pre-existing ideas so an applicant needs to be skilled in critical thinking.

Finance – In financial institutions, decisions often need to be made based on facts rather than emotion or opinion. Judgments made in banking need to be skilled decisions based on logic and the strength of data and information – so to be successful, candidates need to demonstrate that they will not accept arguments and conclusions at face value.

Graduate roles – In some sectors, critical thinking tests are used in graduate recruitment because they are considered to be predictors of ability.

With several different tests available, suited to different industries, many top-level jobs are likely to include critical thinking assessments as part of the application process.

Critical Thinking Tests Explained

Critical thinking tests are usually presented in a similar format no matter who the publisher is. A paragraph of information and data is given, with a statement that is under scrutiny.

Multiple-choice answers are presented for each statement, and there may be more than one question about the same paragraph.

While each question is presented in the same way, different aspects of critical thinking are assessed throughout the test.

Assessing Assumptions

For this type of question, there may be something ‘taken for granted’ in the information provided – and it might not be explicitly stated.

The candidate needs to evaluate the scenario and decide whether any assumptions are present. The statement beneath the scenario proposes an assumption, and the answer selection is about whether that assumption is or is not made in the scenario.

Example Question for Assessing Assumptions


The mainstream media presents information that is supported by the political party in power.

Assumption: The information that the mainstream media presents is always correct.

a) Assumption made b) Assumption not made

Determining Inferences

Following a paragraph of information containing evidence, you will be presented with an inference and need to assess whether the inference is absolutely true, possibly true, possibly false, absolutely false, or it is not possible to reach a decision.

An inference is a conclusion that can be reached based on logical reasoning from the information. Although all the evidence to support (or not support) the inference is included in the passage, it will not be obvious or explicitly stated, which makes the inference harder to conclude.

Example Question for Determining Inferences

It has been snowing all night and there is thick snow on the ground. Today’s weather is sunny and bright.

Inference: The snow will melt today.

a) Possibly true b) Absolutely true c) Possibly false d) Absolutely false e) Not possible to reach a decision

Making Deductions

For this type of question, the information presented will be a set of factual statements and the candidate will need to decide if the deduction applies or does not apply.

This logical thinking is a top-down exercise where all the information is provided and needs to be read in the order it is presented.

If the statements establish that A = B and B = C, does it follow that A = C? There should be no grey areas – the deduction either follows or it does not (a short illustrative sketch of this chain-checking appears after the example below).

Example Question for Making Deductions

All plants have leaves. All leaves are green.

Proposed deduction: All plants are green.

a) Deduction follows b) Deduction does not follow
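As a minimal illustration of that top-down chain, the sketch below treats each premise “All X are Y” as a link from X to Y and reports that a deduction follows only if a chain of links connects the two terms of the conclusion. The abstract labels are placeholders for illustration only, not part of any real test.

```python
# Illustrative sketch: each premise "All X are Y" becomes a link X -> Y.
# A proposed deduction "All X are Z" follows only if a chain of links
# leads from X to Z. The labels A, B, C are abstract placeholders.

def deduction_follows(premises, conclusion):
    """premises: iterable of (x, y) pairs meaning 'All x are y'.
    conclusion: an (x, y) pair. True if a chain of premises links x to y."""
    links = {}
    for x, y in premises:
        links.setdefault(x, set()).add(y)

    start, target = conclusion
    seen, frontier = set(), {start}
    while frontier:                      # walk outward along the premise links
        current = frontier.pop()
        if current == target:
            return True
        seen.add(current)
        frontier |= links.get(current, set()) - seen
    return False

print(deduction_follows([("A", "B"), ("B", "C")], ("A", "C")))  # True: deduction follows
print(deduction_follows([("A", "B"), ("B", "C")], ("C", "A")))  # False: does not follow
```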


Interpretation of Conclusions

Presented with information, the candidate needs to assess whether a given conclusion is correct based on the evidence provided.

For the purposes of the test, we need to believe that all the information provided in the paragraph is true, even if we have opinions about the correctness of the statement.

Example Question for Interpretation of Conclusions

When cooking a meal, one of the most important things to get right is the balance between major food groups. Satisfaction from a good meal comes from getting the most nutrition and can therefore be attributed to a wide variety of flavors, including vegetables, a good source of protein and carbohydrates. A balanced diet is about more than just everything in moderation and should be considered a scientific process with measuring of ingredients and efficient cooking methods.

Proposed conclusion: The best meals are those that are scientifically prepared.

a) Conclusion follows b) Conclusion does not follow

Evaluation of Arguments (Analysis of Arguments)

In this analysis section, the candidate is presented with a scenario and an argument that might be in favor of the scenario or against it.

The candidate needs to evaluate whether the argument itself is weak or strong. This needs to be based on the relevance to the scenario and whether it accurately addresses the question.

Example Question for Evaluation of Arguments

Should all drugs be made legal?

Proposed argument: No, all drugs are dangerous to everyone.

a) Argument is strong b) Argument is weak

Most Common Critical Thinking Tests in 2024

Watson Glaser Test

The Watson Glaser is the most commonly used critical thinking assessment and is used across many industries.

When sitting a Watson Glaser test, your results will be compared against a sample group of over 1,500 test-takers who are considered representative of graduate-level candidates.
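As a rough illustration of what comparison against a norm group involves, the sketch below converts a raw score into a percentile rank against a set of hypothetical norm-group scores. The scores and the tie-handling convention are invented for illustration; real norm tables come from the test publisher.

```python
# Illustrative sketch: percentile rank of a candidate's raw score against a
# hypothetical norm group. The norm-group scores are invented; real norm
# tables are supplied by the test publisher.
from bisect import bisect_left, bisect_right

def percentile_rank(score, norm_scores):
    """Percentage of the norm group scoring at or below `score`,
    counting ties as half (one common convention)."""
    ordered = sorted(norm_scores)
    below = bisect_left(ordered, score)
    ties = bisect_right(ordered, score) - below
    return 100 * (below + 0.5 * ties) / len(ordered)

norm_group = [22, 25, 27, 28, 29, 30, 31, 31, 33, 35]  # hypothetical raw scores out of 40
print(f"Percentile rank: {percentile_rank(31, norm_group):.0f}")  # prints 70
```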

The test is usually 40 questions long, with 30 minutes to answer, but there is a longer version that asks 80 questions with a time limit of an hour.

Who Uses This Test?

The Watson Glaser Test is used in a wide variety of industries for different roles, especially in the legal and banking sectors. Some employers that use the Watson Glaser Test are:

  • Bank of England
  • Irwin Mitchell
  • Simmons & Simmons

What Is the RED model?

The Watson Glaser Test is based on something called the ‘RED model’. The questions in the test are based on:

  • Recognizing assumptions
  • Evaluating arguments
  • Drawing conclusions

The science behind the Watson Glaser Test shows that candidates who show strong critical thinking skills in these areas are more likely to perform well in roles where logical decisions and judgments have to be made.

Where to Take a Free Practice Test

Watson Glaser Tests have a specific layout and format. If you are going to be completing one of the assessments as part of your application, it’s best to practice questions that match the test format.

You can find Watson Glaser practice tests at JobTestPrep as well as a prep pack to give you all the tips, tricks and information you need to make the most of your practice time.

Take a Practice Watson Glaser Test

SHL Critical Reasoning Battery Test

The SHL Critical Reasoning Battery Test includes questions based on numerical, verbal and inductive reasoning. This test is usually used for managerial and supervisory roles, and can include mechanical comprehension if needed for the job role (usually in engineering or mechanical roles).

You can find out more on JobTestPrep’s SHL Critical Reasoning Battery pages .

Take a Practice SHL Test

GMAT

The Graduate Management Admission Test (GMAT) is an online adaptive test – it uses algorithms to adjust the difficulty of the questions according to the answers already provided.

Questions include integrated, quantitative and verbal reasoning as well as an analytical writing assessment. The GMAT is widely used to predict performance in business or management programs in more than 1,700 universities and organizations.
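To give a feel for what “adaptive” means in practice, the sketch below uses a simple staircase rule: difficulty moves up after a correct answer and down after an incorrect one. This is an illustrative simplification only, with an invented answer model; it is not the GMAT’s actual algorithm, which is far more sophisticated.

```python
# Illustrative sketch of an adaptive-difficulty loop using a simple staircase
# rule: difficulty rises after a correct answer and falls after an incorrect
# one. This is a simplification for illustration, not the GMAT's algorithm.
import random

def run_adaptive_test(num_questions=10, min_level=1, max_level=7, ability=5):
    difficulty = (min_level + max_level) // 2      # start near the middle
    history = []
    for _ in range(num_questions):
        # Hypothetical answer model: correct answers are more likely when
        # the candidate's ability exceeds the question's difficulty.
        p_correct = 1 / (1 + 2 ** (difficulty - ability))
        correct = random.random() < p_correct
        history.append((difficulty, correct))
        difficulty = min(max_level, difficulty + 1) if correct else max(min_level, difficulty - 1)
    return history

for difficulty, correct in run_adaptive_test():
    print(f"difficulty {difficulty}: {'correct' if correct else 'incorrect'}")
```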

Take a Practice GMAT

How to Prepare for a Critical Thinking Test in 2024

Preparation is key to success in any pre-employment assessment. While some people think critical reasoning is not a skill you can practice, there are some steps you can take to perform at your best.

Critical thinking tests are straightforward but not necessarily easy.

Step 1. Consider Buying a Preparation Pack

If you can determine who the publisher is for the test you will take, it may be worthwhile investing in a prep pack from that particular publisher.

JobTestPrep offers prep packs for many major test publishers. These packs include realistic practice tests as well as study guides, tips and tricks to help you build your own question-solving strategies.

Step 2. Use Practice Tests

Even if you decide not to purchase a prep pack, taking practice tests will help you focus on the areas where you need to improve to be successful.

It is important to find out the publisher of the test you will take because not all critical thinking tests are at the same level and they may not follow the same structure. Timings, answering methodologies and the number of questions will vary between publishers.

You can usually find out the test publisher before you take the assessment by asking the recruiter or searching online.

Step 3. Practice Under Test Conditions

Critical thinking tests are timed. To give yourself the best chance of achieving a high score, you need to answer the questions quickly and efficiently.

Practicing under test conditions – including the time limit – will help you to understand how much time you need to spend on each question and will help you to develop efficient time management skills for the assessment.

Practicing under test conditions will also help you focus so you can make the most of the session.

Step 4. Practice Abstract Reasoning

Abstract reasoning is a form of critical thinking that uses logic to form a conclusion. Some abstract reasoning tests are presented as word problems.

Practicing these is a good way to flex critical thinking muscles. You can find practice questions on the Psychometric Success website .

Step 5. Practice Critical Thinking in Everyday Life

Reading widely, especially non-fiction, is a good way to practice your critical thinking skills in everyday life.

Newspaper articles, scientific or technical journals, and other sources of information present an opportunity to think about:

  • The strength of arguments
  • The perspective of the author
  • Whether there are enough facts presented to draw the conclusion given
  • Whether other conclusions could be drawn from the same information

Step 6. Revise Logical Fallacies

Knowledge of logical fallacies will help you to judge the effectiveness of an argument. Fallacy describes ‘faulty reasoning’ in an argument and is often seen in hyperbole or opinion pieces in newspapers and magazines.

There are many types of fallacy that you might come across, such as:

  • Strawman – An argument that doesn’t address the statement.
  • False cause – An argument based on a connection that doesn’t exist.
  • Ambiguity – An argument using a phrase that is unclear or that may have different meanings.
  • Appeal to popularity – An argument that states it must be true because many people believe it.

There are many others, including red herrings, appeal to authority and false dichotomy. Learning these will help you to identify a weak argument.

Step 7. Focus on Long-Term Practice

Cramming and panicking about a critical thinking assessment is rarely conducive to great performance.

If you are looking for a career in a sector where critical thinking skills are necessary, then long-term practice will have better results when you come to be assessed. Make critical thinking a part of life – so that every day can be a chance to practice recognizing assumptions.

Key Tips for Critical Thinking Test Success

Understand the format of the test and each question type.

Familiarity is important for any assessment, and in critical thinking tests, it is essential that you can recognize what the question is looking for. As mentioned above, this is usually one of the following:

  • Assessing assumptions
  • Determining inferences
  • Making deductions
  • Interpreting conclusions
  • Evaluating arguments

Practice tests will help you become comfortable with the structure and format of the test, including ways to answer, and will also demonstrate what the question types look like.

Read Test Content Carefully

Taking time to read and understand the content provided in the question is important to ensure that you can answer correctly.

The information you need to determine the correct answer will be provided although it might not be explicitly stated. Careful reading is an important part of critical thinking.

Only Use the Information Provided

While some of the information provided in the critical thinking test might be related to the role you are applying for, or about something that you have existing knowledge of, you mustn't use this knowledge during the test.

A facet of critical thinking is avoiding subconscious bias and opinion, so only use the information that is provided to answer the question.

Look Out for Facts and Fallacies

Throughout the critical thinking test, look out for facts and fallacies in the information and arguments provided.

Identifying fallacies will help you decide if an argument is strong and will help you answer questions correctly.

Final Thoughts

Critical thinking tests are used as pre-employment assessments for jobs that require effective communication, good problem-solving and great decision-making, such as those in the legal sector and banking.

These tests assess the ability of candidates to question and scrutinize evidence, make logical connections between ideas, find alternative interpretations and decide on the strength of an argument.

Not all critical thinking tests are the same, but they do have similar question types. Learning what these are and how to answer them will help you perform better. Practicing tests based on the specific publisher of your test will give you the best results.

You might also be interested in these other Psychometric Success articles:

The Watson Glaser Critical Thinking Appraisal

Or explore the Aptitude Tests / Test Types sections.

Critical Thinking Assessment: 4 Ways to Test Applicants

Post Author - Juste Semetaite

In the current age of information overload, critical thinking (CT) is a vital skill to sift fact from fiction. Fake news, scams, and disinformation can have a negative impact on individuals as well as businesses. Ultimately, those with finer CT skills can help to lead their team with logical thinking, evidence-based motivation, and smarter decisions.

Today, most roles require critical thinking skills. And understanding how to test and evaluate critical thinking skills can not only help to differentiate candidates but may even predict job performance .

This article will cover:

  • What is critical thinking?

  • Critical thinking vs problem-solving
  • 5 critical thinking sub-skills
  • The importance of assessing critical thinking skills
  • 4 ways to leverage critical thinking assessments

What is critical thinking?

Critical thinking is the process of analyzing and evaluating information in a logical way. Though it has been a valued skill since the era of the early philosophers, it is just as vital today. For candidates to succeed in the digital economy , they need modern thinking skills that help them think critically.

Whether we realize it or not, we process tons of data and information on a daily basis. Everything from social media to online news, data from apps like Strava – and that’s on top of all the key metrics in relation to our professional role.

Without a shadow of a doubt, correctly interpreting information — and recognizing disinformation — is an essential skill in today’s workplace and everyday life. And that’s also why teaching critical thinking skills in education is so important to prepare the next generation for the challenges they will face in the modern workplace.

Critical thinking isn’t about being constantly negative or critical of everything. It’s about objectivity and having an open, inquisitive mind. To think critically is to analyze issues based on hard evidence (as opposed to personal opinions, biases, etc.) in order to build a thorough understanding of what’s really going on. And from this place of thorough understanding, you can make better decisions and solve problems more effectively. Bernard Marr | Source

Today, candidates with CT skills think and reason independently, question data, and use their findings to contribute actively to their team rather than passively taking in or accepting information as fact.

Why are critical thinking skills important?

In the workplace, those with strong CT skills no longer rely on their gut or instinct for their decisions. They’re able to problem-solve more effectively by analyzing situations systematically.

With these skills, they think objectively about information and other points of view and look for evidence to support their findings rather than simply accepting opinions or conclusions as facts.

When employees can turn critical thinking into a habit, it ultimately reduces personal bias and helps them be more open to their teammates’ suggestions — improving how teams collaborate and collectively solve problems.

Critical thinking vs. Problem solving – what’s the difference?

Let’s explore the difference between these two similar concepts in more detail.

Critical thinking is about processing and analyzing information to reach an objective, evidence-based conclusion. Let’s take a look at an example of critical thinking in action:

  • A member of the team suggests using a new app they’ve heard about to automate and speed up candidate screening . Some like the idea, but others in the team share reasons why they don’t support the idea. So you visit the software website and look at the key features and benefits yourself, then you might look for reviews about it and ask your HR counterparts what they think of it. The reviews look promising, and a few of your fellow practitioners say it’s worked well for them. Next, you look into the costs compared to the solution your team is already using and calculate that the return on investment (ROI) is good. You arrive at the conclusion that it’d be worth testing the platform with the free trial version and recommend this to your team.

On the other hand, problem solving can involve many of the same skills as critical thinking, such as observing and evaluating. Still, it focuses on identifying business obstacles and coming up with solutions. So, let’s return to the example of the candidate screening software and see how it might work differently in the context of problem-solving :

  • For weeks, the talent acquisition team has complained about how long it takes to screen candidates manually. One of the team members decides to look for a solution to their problem. They assess the team’s current processes and resources and how to best solve the issues. In their research, they discover the new candidate screening platform and test out its functionality for a few days. They feel it would benefit the team and suggest it at the next meeting. Great problem solving, HR leader!


What are the 5 sub-skills that make up critical thinking?


Now that we’ve established what CT is, let’s break it down into the 5 core sub-skills that make up a critical thinking mindset .

  • Observation : Being observant of your environment is the first step to thinking critically. Observant employees can even identify a potential problem before it becomes a problem.
  • Analysis : Once you’ve observed the issue or problem, you can begin to analyze its parts. It’s about asking questions, researching, and evaluating the findings objectively. This is an essential skill, especially for someone in a management role.
  • Inference : The ability to draw a conclusion from limited information. To do this effectively may require in-depth knowledge of a field. Candidates with this skill can contribute a lot of value to a startup, for instance, where initially, there may be little data available for information processing.
  • Communication : This pertains to expressing ideas and your reasoning clearly and persuasively, as well as actively listening to colleagues’ suggestions or viewpoints. When all members of a team or department can communicate and consider different perspectives, it helps tasks (and, well, everything) move along swiftly and smoothly.
  • Problem solving : Once you begin implementing a chosen solution, you may still encounter teething problems. At that point, problem solving skills will help you decide on the best solution and how to overcome the obstacles to bring you closer to your goal.

What is a critical thinking assessment test?

Though there are a few different ways to assess critical thinking, such as the Collegiate Learning Assessment, one of the most well-known tests is the Watson Glaser™ Critical Thinking Appraisal .

Critical thinking tests, or critical reasoning tests, are psychometric tests used in recruitment at all levels, graduate, professional and managerial, but predominantly in the legal sector. However, it is not uncommon to find companies in other sectors using critical thinking tests as part of their selection process. This is an intense test, focusing primarily on your analytical, or critical thinking, skills.

These tests are usually timed and typically include multiple choice items, short answers or short scenario-based questions to assess students or prospective candidates. They test candidates’ ability to interpret data without bias, find logical links between information, and separate facts from false data .


But how do these tests measure critical thinking?

In addition to educational and psychological testing, many employers today use critical thinking tests to assess a person’s ability to question information — to ask What , Why , and How of the data. A standard critical thinking test breaks down this aptitude by examining the following 5 components:

  • assumption – analyzing a scenario to determine if there are any assumptions made
  • deduction – the ability to choose which deductions are logical
  • evaluating evidence – in support of and against something
  • inference – conclusions, drawn from observed facts
  • interpretation – interpreting the accuracy of a stated conclusion (based on a scenario)
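
To make the scoring side of this concrete, here is a minimal Python sketch of how per-component results might be rolled up into one overall score. The component names, the equal weights, and the 0-100 scale are all assumptions for illustration; commercial instruments such as the Watson Glaser define their own scoring norms and percentile bands.

```python
# A minimal sketch (not tied to any specific test publisher) of combining
# raw scores on the five components above into a single overall score.
# Component names and weights are illustrative assumptions, not a real rubric.

COMPONENT_WEIGHTS = {
    "assumption_recognition": 0.2,
    "deduction": 0.2,
    "evaluating_evidence": 0.2,
    "inference": 0.2,
    "interpretation": 0.2,
}

def overall_score(component_scores: dict) -> float:
    """Weighted average of per-component scores (each expected on a 0-100 scale)."""
    total = 0.0
    for component, weight in COMPONENT_WEIGHTS.items():
        total += weight * component_scores.get(component, 0.0)
    return round(total, 1)

if __name__ == "__main__":
    candidate = {
        "assumption_recognition": 72,
        "deduction": 85,
        "evaluating_evidence": 64,
        "inference": 78,
        "interpretation": 81,
    }
    print(overall_score(candidate))  # 76.0
```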

Why is it important to assess critical thinking skills during the recruitment process?

Critical thinking may be considered a soft skill, but it has become a prerequisite in certain industries, like software, and for many roles. Marketing managers, project managers, accountants, and healthcare professionals, for example, all require a degree of CT skills to perform their roles.

The kinds of businesses that require critical thinking include technology , engineering , healthcare , the legal sector , scientific research, and education . These industries are typically very technical and rely on data . People working in these fields research and use data to draw logical conclusions that help them work smarter and more efficiently.

In the hiring process, test takers with good critical thinking skills stand out . Why? Because they are able to demonstrate their ability to collaborate, problem-solve, and manage pressure in a rational, logical manner. As a result, they’re more likely to make the right business decisions that boost efficiency and, ultimately, a business’s bottom line.


Examples of jobs that rely on critical thinking skills

Critical thinking is not rocket science, but it is an important skill when making decisions — especially when the correct answer is not obvious. Here are a few examples of job roles that rely on critical thinking dispositions:

  • computer programmers or developers : may use critical thinking and other advanced skills in a variety of ways, from debugging code to analyzing the problem, finding potential causes, and coming up with suitable solutions. They also use CT when there is no clear roadmap to rely on, such as when building a new app or feature.
  • criminologists : must have critical thinking abilities to observe criminal behavior objectively and to analyze the problem in such a way that they can be confident in the conclusions they present to the authorities.
  • medical professionals : need to diagnose their patients’ condition through observation, communication, analysis and solving complex problems to decide on the best treatment.
  • air traffic controllers: need a super clear, calm head to deal with their high-stress job. They observe traffic, communicate with pilots, and constantly problem-solve to avoid airplane collisions.
  • legal professionals : use logic and reasoning to analyze various cases – even before deciding whether they’ll take on a case – and then use their excellent communication skills to sway people over to their reasoning in a trial setting.
  • project managers : have to deal with a lot of moving parts at the same time. To successfully keep projects on time and budget, they continually observe and analyze the progress of project components, communicate continually with the team and external stakeholders and work to solve any problems that crop up.

What are the risks of not testing for critical thinking?

By not evaluating critical thinking beforehand, you may end up hiring candidates with poor CT skills. Especially when hiring business leaders and for key positions, this has the potential to wreak havoc on a business. Their inaccurate assumptions are more likely to lead to bad decisions , which could cost the company money .

Weak critical thinking can result in a number of issues for your organization and justifies the expense or added effort of asking your candidate to complete critical thinking tests in the hiring process. For example, poor CT skills may result in:

  • making mistakes
  • not being able to take action when needed
  • working off false assumptions
  • unnecessary strain on work relationships

4 ways to assess critical thinking skills in candidates

Now that we’ve seen how important it is for most candidates today to have strong critical thinking skills, let’s take a look at some of the assessment instruments the talent acquisition team can use.

#1 – A homework assignment

A homework assignment is a task that assesses whether test takers have the right skills for a role. If critical thinking is essential for a particular job, you could provide candidates with a homework assignment that specifically tests their ability to:

  • accurately interpret relevant evidence
  • reach logical conclusions
  • judge information skeptically
  • communicate their own and others’ viewpoints, backed by facts

Top tips to enlarge those brains

Tip : use Toggl Hire’s skills screening tests to quickly surface the strongest candidates first and speed up your hiring process.

#2 – Behavioral and situational interview questions

Ask the candidate to provide examples of situations when they used CT for solving problems or making a decision. This can provide insight into the candidate’s ability to analyze information and make informed decisions. For example:

Critical thinking example questions:

  • Tell me about a time when you had to make a really difficult decision at work.
  • What would you do in a situation where your manager made a mistake in a presentation or report?
  • How would you respond if a colleague shared a new idea or solution with you?
  • How do you evaluate the potential outcomes of different actions or decisions?
  • Can you describe a situation where you had to think on your feet and come up with a creative solution to a problem?
  • How do you ensure that your decision-making is based on relevant and accurate information?


#3 – Discuss the candidate’s critical thinking skills with their references

Additionally, the hiring manager can ask the candidate’s references about how the candidate demonstrated CT skills in the past.

  • Can you recall a time when (the candidate) had to convince you to choose an alternative solution to a problem?
  • Tell me about a time when (the candidate) had to solve a team disagreement regarding a project.

#4 – Critical thinking tests

Ask the candidate to complete a critical thinking test and score it against a critical thinking rubric. You can then share feedback on their test scores with them and explore their willingness to improve their score, if necessary. Or compare their score to other applicants’ and prioritize those with higher scores if the role truly requires a critical thinker.
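
As a rough illustration of that last step, here is a small Python sketch that shortlists applicants by their overall test score. The cutoff value and the candidate data are invented; in practice you would calibrate any threshold against the role and the norms of the specific test you use.

```python
# A hypothetical sketch of step #4: score candidates against a rubric cutoff
# and prioritize the strongest critical thinkers. The cutoff and the data are
# assumptions; real tests define their own norms and percentiles.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    ct_score: float  # overall critical thinking score, 0-100

def shortlist(candidates, cutoff=70.0):
    """Keep candidates at or above the cutoff, best score first."""
    passing = [c for c in candidates if c.ct_score >= cutoff]
    return sorted(passing, key=lambda c: c.ct_score, reverse=True)

if __name__ == "__main__":
    pool = [Candidate("A", 82.5), Candidate("B", 64.0), Candidate("C", 91.0)]
    for c in shortlist(pool):
        print(c.name, c.ct_score)  # C 91.0, then A 82.5
```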

Create your next critical thinking assessment with Toggl Hire

Assessing critical thinking skills is becoming a key component in the hiring process, especially for roles that require a particularly advanced skillset. Critical thinking is a sign of future performance. Candidates that clearly demonstrate these skills have a lot to offer companies, from better decision-making to more productive relationships and cost savings.

If your team needs help automating the screening process, and creating custom skills tests based on specific roles, try Toggl Hire’s skills test questions engine or the Custom Test Builder to create the exact questions you want from scratch.

Juste Semetaite

Juste loves investigating through writing. A copywriter by trade, she spent the last ten years in startups, telling stories and building marketing teams. She works at Toggl Hire and writes about how businesses can recruit really great people.


Tara Well Ph.D.

How to Improve Your Critical Thinking Skills

Traditional tools and new technologies.

Posted September 29, 2023 | Reviewed by Lybi Ma


Technology provides access to vast information and makes daily life easier. Yet, too much reliance on technology potentially interferes with the acquisition and maintenance of critical thinking skills in several ways:

1. Information Overload : The constant influx of data can discourage deep critical thinking as we may come to rely on quick, surface-level information rather than delving deeply into a subject.

2. Shortened Attention Span: Frequent digital distractions can disrupt the sustained focus and concentration that critical thinking requires.

3. Confirmatory Bias and Echo Chambers: Technology, including social media and personalized content algorithms, can reinforce confirmation bias . People are often exposed to information that aligns with their beliefs and opinions, making them less likely to encounter diverse perspectives and engage in critical thinking about opposing views.

4. Reduced Problem-Solving Opportunities: Technology often provides quick solutions to problems. While this benefits efficiency, it may discourage individuals from engaging in complex problem-solving, a fundamental aspect of critical thinking.

5. Loss of Research Skills: The ease of accessing information online can diminish traditional research skills, such as library research or in-depth reading. These skills are essential for critical thinking, as they involve evaluating sources, synthesizing information, and analyzing complex texts.

While technology can pose challenges to developing critical thinking skills, it's important to note that technology can also be a valuable tool for learning and skill development. It can provide access to educational resources, facilitate collaboration , and support critical thinking when used thoughtfully and intentionally. Balancing technology use with activities that encourage deep thinking and analysis is vital to lessening its potential adverse effects on critical thinking.

Writing is a traditional and powerful tool to exercise and improve your critical thinking skills. Consider these ways writing can help enhance critical thinking:

1. Clarity of Thought: Writing requires that you articulate your thoughts clearly and coherently. When you need to put your ideas on paper, you must organize them logically, which requires a deeper understanding of the subject matter.

2. Analysis and Evaluation: Critical thinking involves analyzing and evaluating information. When you write, you often need to assess the validity and relevance of different sources, arguments, or pieces of evidence, which hone your critical thinking skills.

3. Problem-Solving: Writing can be a problem-solving exercise in itself. Whether crafting an argument, developing a thesis, or finding the right words to express your ideas, writing requires thinking critically about approaching these challenges effectively.

4. Research Skills: Good writing often involves research, and research requires critical thinking. You need to assess the credibility of sources, synthesize information, and draw conclusions based on the evidence you gather.

5. Argumentation: Constructing a persuasive argument in writing is a complex process requiring critical thinking. You must anticipate counterarguments, provide evidence to support your claims, and address potential weaknesses in your reasoning.

6. Revision and Editing: To be an influential writer, you must learn to read your work critically. Editing and revising requires evaluating your writing objectively, identifying areas that need improvement, and refining your ideas and arguments.

7. Problem Identification: In some cases, writing can help you identify problems or gaps in your thinking. As you write, you might realize that your arguments are not as strong as you initially thought or that you need more information to support your claims. This recognition of limitations is a crucial aspect of critical thinking.

Writing is a dynamic process that engages multiple facets of critical thinking. It has been a valuable tool used in education , business, and personal development for centuries.

Yet, this traditional approach of self-generated written thought is rapidly being supplanted by AI-generated writing tools like ChatGPT (Generative Pre-trained Transformer). With over 100 million users of ChatGPT alone, we cannot ignore its potential impact. How might the increasing reliance on AI-generated writing tools influence our critical thinking skills? The impact can vary depending on how the tools are used and the context in which they are employed.


Critical thinking involves evaluating information sources for credibility, relevance, and bias. If individuals consistently trust the information provided by chatbots without critically assessing its quality, it can hinder their development of critical thinking skills. This is especially true if they depend on the chatbot to provide answers without questioning or verifying the information. Relying solely on chatbots for answers may also reduce people's effort in problem-solving. Critical thinking often requires wrestling with complex problems, considering multiple perspectives, and generating creative solutions. If we default to chatbots for quick answers, we may miss opportunities to develop these skills.

However, it's essential to note that the impact of chatbots on critical thinking skills may not be entirely negative. These tools can also have positive effects:

1. Chatbots provide quick access to vast information, which can benefit research and problem-solving. When used as a supplement to critical thinking, they can enhance the efficiency of information retrieval.

2. Chatbots can sometimes assist in complex tasks by providing relevant data or suggestions. When individuals critically evaluate and integrate this information into their decision-making process, it can enhance their critical thinking.

3. Chatbots can be used as learning aids. They can provide explanations, examples, and guidance, which can support skill development and, when used effectively, encourage critical thinking.

In summary, the impact of chatbots on critical thinking skills depends on how we use them. The effect will be harmful if they become a crutch to avoid independent thought or analysis. However, they can be valuable resources when used as tools to facilitate and augment critical thinking and writing processes. Individuals must balance leveraging the convenience of chatbots and actively engaging in independent critical thinking and problem-solving to maintain and enhance their cognitive abilities. You can do that effectively through writing regularly.

Copyright 2023 Tara Well, PhD

Tara Well Ph.D.

Tara Well, Ph.D. , is a professor in the department of psychology at Barnard College of Columbia University.

How to Assess Critical Thinking

Assessing Critical Thinking

October 11, 2008, by The Critical Thinking Co. Staff

Developing appropriate testing and evaluation of students is an important part of building critical thinking practice into your teaching. If students know that you expect them to think critically on tests, and the necessary guidelines and preparation are given beforehand, they are more likely to take a critical thinking approach to learning all course material. Design test items that require higher-order thinking skills such as analysis, synthesis, and evaluation, rather than simple recall of facts; ask students to explain and justify all claims made; instruct them to make inferences or draw conclusions that go beyond the given data. Essays and problems are the most obvious item formats for testing these skills, but well-constructed multiple-choice items can also work well. Consider carefully how you will evaluate and grade tests that require critical thinking, and develop clear criteria that can be shared with the students.

In order to make informed decisions about student critical thinking and learning, you need to assess student performance and behavior in class as well as on tests and assignments. Paying careful attention to signs of inattention or frustration, and asking students to explain them, can provide much valuable information about what may need to change in your teaching approach; similarly, signs of strong engagement or interest can tell you a great deal about what you are doing well to get students to think. Brief classroom assessment instruments, such as asking students to write down the clearest and most confusing points for them in a class session, can be very helpful for collecting a lot of information quickly about student thinking and understanding.


How to develop critical thinking skills



A client requests a tight deadline on an intense project. Your childcare provider calls in sick on a day full of meetings. Payment from a contract gig is a month behind. 

Your day-to-day will always have challenges, big and small. And no matter the size and urgency, they all ask you to use critical thinking to analyze the situation and arrive at the right solution. 

Critical thinking includes a wide set of soft skills that encourage continuous learning, resilience , and self-reflection. The more you add to your professional toolbelt, the more equipped you’ll be to tackle whatever challenge presents itself. Here’s how to develop critical thinking, with examples explaining how to use it.

Critical thinking skills are the skills you use to analyze information, imagine scenarios holistically, and create rational solutions. It’s a type of emotional intelligence that stimulates effective problem-solving and decision-making . 

When you fine-tune your critical thinking skills, you seek beyond face-value observations and knee-jerk reactions. Instead, you harvest deeper insights and string together ideas and concepts in logical, sometimes out-of-the-box , ways. 

Imagine a team working on a marketing strategy for a new set of services. That team might use critical thinking to balance goals and key performance indicators , like new customer acquisition costs, average monthly sales, and net profit margins. They understand the connections between overlapping factors to build a strategy that stays within budget and attracts new sales. 

Looking for ways to improve critical thinking skills? Start by brushing up on the following soft skills that fall under this umbrella: 

  • Analytical thinking: Approaching problems with an analytical eye includes breaking down complex issues into small chunks and examining their significance. An example could be organizing customer feedback to identify trends and improve your product offerings. 
  • Open-mindedness: Push past cognitive biases and be receptive to different points of view and constructive feedback . Managers and team members who keep an open mind position themselves to hear new ideas that foster innovation . 
  • Creative thinking: With creative thinking , you can develop several ideas to address a single problem, like brainstorming more efficient workflow best practices to boost productivity and employee morale . 
  • Self-reflection: Self-reflection lets you examine your thinking and assumptions to stimulate healthier collaboration and thought processes. Maybe a bad first impression created a negative anchoring bias with a new coworker. Reflecting on your own behavior stirs up empathy and improves the relationship. 
  • Evaluation: With evaluation skills, you tackle the pros and cons of a situation based on logic rather than emotion. When prioritizing tasks , you might be tempted to do the fun or easy ones first, but evaluating their urgency and importance can help you make better decisions. 

There’s no magic method to change your thinking processes. Improvement happens with small, intentional changes to your everyday habits until a more critical approach to thinking is automatic. 

Here are 12 tips for building stronger self-awareness and learning how to improve critical thinking: 

1. Be cautious

There’s nothing wrong with a little bit of skepticism. One of the core principles of critical thinking is asking questions and dissecting the available information. You might surprise yourself at what you find when you stop to think before taking action. 

Before making a decision, use evidence, logic, and deductive reasoning to support your own opinions or challenge ideas. It helps you and your team avoid falling prey to bad information or resistance to change .

2. Ask open-ended questions

“Yes” or “no” questions invite agreement rather than reflection. Instead, ask open-ended questions that force you to engage in analysis and rumination. Digging deeper can help you identify potential biases, uncover assumptions, and arrive at new hypotheses and possible solutions. 

3. Do your research

No matter your proficiency, you can always learn more. Turning to different points of view and information is a great way to develop a comprehensive understanding of a topic and make informed decisions. You’ll prioritize reliable information rather than fall into emotional or automatic decision-making. 


4. Consider several opinions

You might spend so much time on your work that it’s easy to get stuck in your own perspective, especially if you work independently on a remote team . Make an effort to reach out to colleagues to hear different ideas and thought patterns. Their input might surprise you.

If or when you disagree, remember that you and your team share a common goal. Divergent opinions are constructive, so shift the focus to finding solutions rather than defending disagreements. 

5. Learn to be quiet

Active listening is the intentional practice of concentrating on a conversation partner instead of your own thoughts. It’s about paying attention to detail and letting people know you value their opinions, which can open your mind to new perspectives and thought processes.

If you’re brainstorming with your team or having a 1:1 with a coworker , listen, ask clarifying questions, and work to understand other peoples’ viewpoints. Listening to your team will help you find fallacies in arguments to improve possible solutions.

6. Schedule reflection

Whether waking up at 5 am or using a procrastination hack, scheduling time to think puts you in a growth mindset . Your mind has natural cognitive biases to help you simplify decision-making, but squashing them is key to thinking critically and finding new solutions besides the ones you might gravitate toward. Creating time and calm space in your day gives you the chance to step back and visualize the biases that impact your decision-making. 

7. Cultivate curiosity

With so many demands and job responsibilities, it’s easy to seek solace in routine. But getting out of your comfort zone helps spark critical thinking and find more solutions than you usually might.

If curiosity doesn’t come naturally to you, cultivate a thirst for knowledge by reskilling and upskilling . Not only will you add a new skill to your resume , but expanding the limits of your professional knowledge might motivate you to ask more questions. 

8. Play strategic games

You don’t have to develop critical thinking skills exclusively in the office. Whether on your break or finding a hobby to do after work, playing strategic games or filling out crosswords can prime your brain for problem-solving. 


9. Write it down

Recording your thoughts with pen and paper can lead to stronger brain activity than typing them out on a keyboard. If you’re stuck and want to think more critically about a problem, writing your ideas can help you process information more deeply.

The act of recording ideas on paper can also improve your memory . Ideas are more likely to linger in the background of your mind, leading to deeper thinking that informs your decision-making process. 

10. Speak up

Take opportunities to share your opinion, even if it intimidates you. Whether at a networking event with new people or a meeting with close colleagues, try to engage with people who challenge or help you develop your ideas. Having conversations that force you to support your position encourages you to refine your argument and think critically. 

11. Stay humble

Ideas and concepts aren’t the same as real-life actions. There may be such a thing as negative outcomes, but there’s no such thing as a bad idea. At the brainstorming stage , don’t be afraid to make mistakes.

Sometimes the best solutions come from off-the-wall, unorthodox decisions. Sit in your creativity , let ideas flow, and don’t be afraid to share them with your colleagues. Putting yourself in a creative mindset helps you see situations from new perspectives and arrive at innovative conclusions. 

12. Embrace discomfort

Get comfortable feeling uncomfortable . It isn’t easy when others challenge your ideas, but sometimes, it’s the only way to see new perspectives and think critically.

By willingly stepping into unfamiliar territory, you foster the resilience and flexibility you need to become a better thinker. You’ll learn how to pick yourself up from failure and approach problems from fresh angles. 


Thinking critically is easier said than done. To help you understand its impact (and how to use it), here are two scenarios that require critical thinking skills and provide teachable moments. 

Scenario #1: Unexpected delays and budget

Imagine your team is working on producing an event. Unexpectedly, a vendor explains they’ll be a week behind on delivering materials. Then another vendor sends a quote that’s more than you can afford. Unless you develop a creative solution, the team will have to push back deadlines and go over budget, potentially costing the client’s trust. 

Here’s how you could approach the situation with creative thinking:

  • Analyze the situation holistically: Determine how the delayed materials and over-budget quote will impact the rest of your timeline and financial resources . That way, you can identify whether you need to build an entirely new plan with new vendors, or if it’s worth it to readjust time and resources. 
  • Identify your alternative options: With careful assessment, your team decides that another vendor can’t provide the same materials in a quicker time frame. You’ll need to rearrange assignment schedules to complete everything on time. 
  • Collaborate and adapt: Your team has an emergency meeting to rearrange your project schedule. You write down each deliverable and determine which ones you can and can’t complete by the deadline. To compensate for lost time, you rearrange your task schedule to complete everything that doesn’t need the delayed materials first, then advance as far as you can on the tasks that do. 
  • Check different resources: In the meantime, you scour through your contact sheet to find alternative vendors that fit your budget. Accounting helps by providing old invoices to determine which vendors have quoted less for previous jobs. After pulling all your sources, you find a vendor that fits your budget. 
  • Maintain open communication: You create a special Slack channel to keep everyone up to date on changes, challenges, and additional delays. Keeping an open line encourages transparency on the team’s progress and boosts everyone’s confidence. 


Scenario #2: Differing opinions 

A conflict arises between two team members on the best approach for a new strategy for a gaming app. One believes that small tweaks to the current content are necessary to maintain user engagement and stay within budget. The other believes a bold revamp is needed to encourage new followers and stronger sales revenue. 

Here’s how critical thinking could help this conflict:

  • Listen actively: Give both team members the opportunity to present their ideas free of interruption. Encourage the entire team to ask open-ended questions to more fully understand and develop each argument. 
  • Flex your analytical skills: After learning more about both ideas, everyone should objectively assess the benefits and drawbacks of each approach. Analyze each idea's risk, merits, and feasibility based on available data and the app’s goals and objectives. 
  • Identify common ground : The team discusses similarities between each approach and brainstorms ways to integrate both ideas, like making small but eye-catching modifications to existing content or using the same visual design in new media formats. 
  • Test new strategy: To test out the potential of a bolder strategy, the team decides to A/B test both approaches (a minimal sketch of this comparison follows the list). You create a set of criteria to evenly distribute users by different demographics to analyze engagement, revenue, and customer turnover. 
  • Monitor and adapt: After implementing the A/B test, the team closely monitors the results of each strategy. You regroup and optimize the changes that provide stronger results after the testing. That way, all team members understand why you’re making the changes you decide to make.
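
The A/B comparison in this scenario can be reduced to a very small calculation. Below is a hedged Python sketch that compares the two strategies by engagement rate; the variant names and numbers are made up, and a real test would also check sample sizes and statistical significance before declaring a winner.

```python
# A rough sketch of the A/B comparison described in the scenario above.
# Variant names and metrics are invented for illustration.

def engagement_rate(sessions: int, engaged_sessions: int) -> float:
    """Share of sessions where users actually engaged with the content."""
    return engaged_sessions / sessions if sessions else 0.0

def pick_winner(results: dict) -> str:
    """Return the variant with the higher engagement rate."""
    rates = {
        variant: engagement_rate(data["sessions"], data["engaged_sessions"])
        for variant, data in results.items()
    }
    return max(rates, key=rates.get)

if __name__ == "__main__":
    ab_results = {
        "small_tweaks": {"sessions": 5000, "engaged_sessions": 1450},
        "bold_revamp": {"sessions": 5100, "engaged_sessions": 1680},
    }
    print(pick_winner(ab_results))  # bold_revamp
```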

You can’t think your problems away. But you can equip yourself with skills that help you move through your biggest challenges and find innovative solutions. Learning how to develop critical thinking is the start of honing an adaptable growth mindset. 

Now that you have resources to increase critical thinking skills in your professional development, noticing whether you embrace change or routine, are open or resistant to feedback, and turn to research or emotion will build self-awareness. From there, tweak and incorporate techniques to be a critical thinker when life presents you with a problem.


Elizabeth Perry, ACC

Elizabeth Perry is a Coach Community Manager at BetterUp. She uses strategic engagement strategies to cultivate a learning community across a global network of Coaches through in-person and virtual experiences, technology-enabled platforms, and strategic coaching industry partnerships. With over 3 years of coaching experience and a certification in transformative leadership and life coaching from Sofia University, Elizabeth leverages transpersonal psychology expertise to help coaches and clients gain awareness of their behavioral and thought patterns, discover their purpose and passions, and elevate their potential. She is a lifelong student of psychology, personal growth, and human potential as well as an ICF-certified ACC transpersonal life and leadership Coach.


How to build your critical thinking skills in 7 steps (with examples)

By Julia Martins

Critical thinking is, well, critical. By building these skills, you improve your ability to analyze information and come to the best decision possible. In this article, we cover the basics of critical thinking, as well as the seven steps you can use to implement the full critical thinking process.

Critical thinking comes from asking the right questions to come to the best conclusion possible. Strong critical thinkers analyze information from a variety of viewpoints in order to identify the best course of action.

Don’t worry if you don’t think you have strong critical thinking abilities. In this article, we’ll help you build a foundation for critical thinking so you can absorb, analyze, and make informed decisions. 

What is critical thinking? 

Critical thinking is the ability to collect and analyze information to come to a conclusion. Being able to think critically is important in virtually every industry and applicable across a wide range of positions. That’s because critical thinking isn’t subject-specific—rather, it’s your ability to parse through information, data, statistics, and other details in order to identify a satisfactory solution. 

Definitions of critical thinking

Various scholars have provided definitions of critical thinking, each emphasizing different aspects of this complex cognitive process:

Michael Scriven , an American philosopher, defines critical thinking as "the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication as a guide to belief and action."

Robert Ennis , professor emeritus at the University of Illinois, describes critical thinking as "reasonable, reflective thinking focused on deciding what to believe or do."

Diane Halpern , a cognitive psychologist and former president of the American Psychological Association, defines it as "the use of cognitive skills or strategies that increase the probability of a desirable outcome."


Top 8 critical thinking skills

Critical thinking is essential for success in everyday life, higher education, and professional settings. The handbook "Foundation for Critical Thinking" defines it as a process of conceptualization, analysis, synthesis, and evaluation of information.

In no particular order, here are eight key critical thinking abilities that can help you excel in any situation:

1. Analytical thinking

Analytical thinking involves evaluating data from multiple sources in order to come to the best conclusions. Analytical thinking allows people to reject cognitive biases and strive to gather and analyze intricate subject matter while solving complex problems. Analytical thinkers who thrive at critical thinking can:

Identify patterns and trends in the data

Break down complex issues into manageable components

Recognize cause-and-effect relationships

Evaluate the strength of arguments and evidence

Example: A data analyst breaks down complex sales figures to identify trends and patterns that inform the company's marketing strategy.

2. Open-mindedness

Open-mindedness is the willingness to consider new ideas, arguments, and information without prejudice. This critical thinking skill helps you analyze and process information to come to an unbiased conclusion. Part of the critical thinking process is letting your personal biases go, taking information at face value and coming to a conclusion based on multiple points of view .

Open-minded critical thinkers demonstrate:

Willingness to consider alternative viewpoints

Ability to suspend judgment until sufficient evidence is gathered

Receptiveness to constructive criticism and feedback

Flexibility in updating beliefs based on new information

Example: During a product development meeting, a team leader actively considers unconventional ideas from junior members, leading to an innovative solution.

3. Problem-solving

Effective problem solving is a cornerstone of critical thinking. It requires the ability to identify issues, generate possible solutions, evaluate alternatives, and implement the best course of action. This critical thinking skill is particularly valuable in fields like project management and entrepreneurship.

Key aspects of problem-solving include:

Clearly defining the problem

Gathering relevant information

Brainstorming potential solutions

Evaluating the pros and cons of each option

Implementing and monitoring the chosen solution

Reflecting on the outcome and adjusting as necessary

Example: A high school principal uses problem-solving skills to address declining student engagement by surveying learners, consulting with higher education experts, and implementing a new curriculum that balances academic rigor with practical, real-world applications.

4. Reasoned judgment

Reasoned judgment is a key component of higher order thinking that involves making thoughtful decisions based on logical analysis of evidence and thorough consideration of alternatives. This critical thinking skill is important in both academic and professional settings. Key aspects of reasoned judgment include:

Objectively gathering and analyzing information

Evaluating the credibility and relevance of evidence

Considering multiple perspectives before drawing conclusions

Making decisions based on logical inference and sound reasoning

Example: A high school science teacher uses reasoned judgment to design an experiment, carefully observing and analyzing results before drawing conclusions about the hypothesis.

5. Reflective thinking

Reflective thinking is the process of analyzing one's own thought processes, actions, and outcomes to gain deeper understanding and improve future performance. Good critical thinking requires analyzing and synthesizing information to form a coherent understanding of a problem. It's an essential critical thinking skill for continuous learning and improvement.

Key aspects of reflective thinking include:

Critically examining one's own assumptions and cognitive biases

Considering diverse viewpoints and perspectives

Synthesizing information from various experiences and sources

Applying insights to improve future decision-making and actions

Continuously evaluating and adjusting one's thinking processes

Example: A community organizer reflects on the outcomes of a recent public event, considering what worked well and what could be improved for future initiatives.

6. Communication

Strong communication skills help critical thinkers articulate ideas clearly and persuasively. Communication in the workplace is crucial for effective teamwork, leadership, and knowledge dissemination. Key aspects of communication in critical thinking include:

Clearly expressing complex ideas

Active listening and comprehension

Adapting communication styles to different audiences

Constructing and delivering persuasive arguments

Example: A manager effectively explains a new company policy to her team, addressing their concerns and ensuring everyone understands its implications.

7. Research

Critical thinkers with strong research skills gather, evaluate, and synthesize information from various sources of information. This is particularly important in academic settings and in professional fields that require continuous learning. Effective research involves:

Identifying reliable and relevant sources of information

Evaluating the credibility and bias of sources

Synthesizing information from multiple sources

Recognizing gaps in existing knowledge

Example: A journalist verifies information from multiple credible sources before publishing an article on a controversial topic.

8. Decision-making

Effective decision making is the culmination of various critical thinking skills that allow an individual to draw logical conclusions and generalizations. It involves weighing options, considering consequences, and choosing the best course of action. Key aspects of decision-making include:

Defining clear criteria for evaluation

Gathering and analyzing relevant information

Considering short-term and long-term consequences

Managing uncertainty and risk

Balancing logic and intuition

Example: A homeowner weighs the costs, benefits, and long-term implications before deciding to invest in solar panels for their house.

7 steps to improve critical thinking

Critical thinking is a skill that you can build by following these seven steps. The seven steps to critical thinking help you ensure you’re approaching a problem from the right angle, considering every alternative, and coming to an unbiased conclusion.

First things first: When to use the seven-step critical thinking process

There’s a lot that goes into the full critical thinking process, and not every decision needs to be this thought out. Sometimes, it’s enough to put aside bias and approach a process logically. In other, more complex cases, the best way to identify the ideal outcome is to go through the entire critical thinking process. 

The seven-step critical thinking process is useful for complex decisions in areas you are less familiar with. Alternatively, the seven critical thinking steps can help you look at a problem you’re familiar with from a different angle, without any bias. 

If you need to make a less complex decision, consider another problem-solving strategy instead. Decision matrices are a great way to identify the best option between different choices. Check out our article on 7 steps to creating a decision matrix.
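As a rough illustration of that idea, here is a minimal decision-matrix sketch in Python. The options, criteria, weights, and scores are all hypothetical; the point is only to show how weighting criteria and summing scores surfaces the strongest option.

    # Minimal weighted decision matrix: score each option against each
    # criterion (1-5), multiply by the criterion's weight, and sum.
    # Options, criteria, weights, and scores are hypothetical.
    criteria_weights = {"cost": 0.4, "speed": 0.3, "risk": 0.3}

    scores = {
        "Option A": {"cost": 4, "speed": 2, "risk": 5},
        "Option B": {"cost": 3, "speed": 5, "risk": 3},
        "Option C": {"cost": 5, "speed": 3, "risk": 2},
    }

    totals = {
        option: sum(criteria_weights[c] * s for c, s in row.items())
        for option, row in scores.items()
    }

    # Print options from highest to lowest weighted score.
    for option, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{option}: {total:.2f}")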

1. Identify the problem or question

Before you put those critical thinking skills to work, you first need to identify the problem you’re solving. This step includes taking a look at the problem from a few different perspectives and asking questions like: 

What’s happening? 

Why is this happening? 

What assumptions am I making? 

At first glance, how do I think we can solve this problem? 

A big part of developing your critical thinking skills is learning how to come to unbiased conclusions. In order to do that, you first need to acknowledge the biases that you currently have. Does someone on your team think they know the answer? Are you making assumptions that aren’t necessarily true? Identifying these details helps you later on in the process. 

2. Gather relevant information

At this point, you likely have a general idea of the problem—but in order to come up with the best solution, you need to dig deeper. 

During the research process, collect information relating to the problem, including data, statistics, historical project information, team input, and more. Make sure you gather information from a variety of sources, especially if those sources go against your personal ideas about what the problem is or how to solve it.

Gathering varied information is essential for your ability to apply the critical thinking process. If you don’t get enough information, your ability to make a final decision will be skewed. Remember that critical thinking is about helping you identify the objectively best conclusion. You aren’t going with your gut—you’re doing research to find the best option.

3. Analyze and evaluate data

Just as it’s important to gather a variety of information, it is also important to determine how relevant the different information sources are. After all, just because there is data doesn’t mean it’s relevant. 

Once you’ve gathered all of the information, sift through the noise and identify what information is relevant and what information isn’t. Synthesizing all of this information and establishing significance helps you weigh different data sources and come to the best conclusion later on in the critical thinking process. 

To determine data relevance, ask yourself:

How reliable is this information? 

How significant is this information? 

Is this information outdated? Is it specialized in a specific field? 

4. Consider alternative points of view

One of the most useful parts of the critical thinking process is coming to a decision without bias. In order to do so, you need to take a step back from the process and challenge the assumptions you’re making. 

We all have bias—and that isn’t necessarily a bad thing. Unconscious biases (also known as cognitive biases) often serve as mental shortcuts to simplify problem solving and aid decision making. But even when biases aren’t inherently bad, you must be aware of your biases in order to put them aside when necessary. 

Before coming to a solution, ask yourself:

Am I making any assumptions about this information? 

Are there additional variables I haven’t considered? 

Have I evaluated the information from every perspective? 

Are there any viewpoints I missed?

5. Draw logical conclusions

Finally, you’re ready to come to a conclusion. To identify the best solution, draw connections between causes and effects. Use the facts you’ve gathered to evaluate the most objective conclusion. 

Keep in mind that there may be more than one solution. Often, the problems you’re facing are complex and intricate. The critical thinking process doesn’t necessarily lead to a cut-and-dry solution—instead, the process helps you understand the different variables at play so you can make an informed decision. 

6. Develop and communicate solutions

Communication is a key skill for critical thinkers. It isn’t enough to think for yourself—you also need to share your conclusion with other project stakeholders. If there are multiple solutions, present them all. There may be a case where you implement one solution, then test to see if it works before implementing another solution. 

This process of communicating and sharing ideas is key in promoting critical thinking within a team or organization. By encouraging open dialogue and collaborative problem-solving, you create an environment that fosters the development of critical thinking skills in others.

7. Reflect and learn from the process

The seven-step critical thinking process yields a result—and you then need to put that solution into place. After you’ve implemented your decision, evaluate whether or not it was effective. Did it solve the initial problem? What lessons—whether positive or negative—can you learn from this experience to improve your critical thinking for next time? 

By engaging in this metacognitive reflective thinking process, you're essentially teaching critical thinking to yourself, refining your methodology with each iteration. This reflective practice is fundamental in developing a more robust and adaptable approach to problem-solving.

Depending on how your team shares information, consider documenting lessons learned in a central source of truth. That way, team members who are making similar or related decisions in the future can understand why you made the decision you made and what the outcome was.

Example of critical thinking in the workplace

Imagine you work in user experience design (UX). Your team is focused on pricing and packaging and ensuring customers have a clear understanding of the different services your company offers. Here’s how to apply the critical thinking process in the workplace in seven steps: 

Step 1: Start by identifying the problem

Your current pricing page isn’t performing as well as you want. You’ve heard from customers that your services aren’t clear, and that the page doesn’t answer the questions they have. This page is really important for your company, since it’s where your customers sign up for your service. You and your team have a few theories about why your current page isn’t performing well, but you decide to apply the critical thinking process to ensure you come to the best decision for the page. 

Gather information about how the problem started

Part of identifying the problem includes understanding how the problem started. The pricing and packaging page is important—so when your team initially designed the page, they certainly put a lot of thought into it. Before you begin researching how to improve the page, ask yourself: 

Why did you design the pricing page the way you did? 

Which stakeholders need to be involved in the decision making process? 

Where are users getting stuck on the page?

Are any features currently working?

Step 2: Then gather information and research

In addition to understanding the history of the pricing and packaging page, it’s important to understand what works well. Part of this research means taking a look at what your competitors’ pricing pages look like.

Ask yourself: 

How have our competitors set up their pricing pages?

Are there any pricing page best practices? 

How do color, positioning, and animation impact navigation? 

Are there any standard page layouts customers expect to see? 

Step 3: Organize and analyze information

You’ve gathered all of the information you need—now you need to organize and analyze it. What trends, if any, are you noticing? Is there any particularly relevant or important information that you have to consider? 

Step 4: Consider alternative viewpoints to reduce bias

In the case of critical thinking, it’s important to address and set bias aside as much as possible. Ask yourself: 

Is there anything I’m missing? 

Have I connected with the right stakeholders? 

Are there any other viewpoints I should consider? 

Step 5: Determine the most logical solution for your team

You now have all of the information you need to design the best pricing page. Depending on the complexity of the design, you may want to design a few options to present to a small group of customers or A/B test on the live website.
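If you do run an A/B test, the follow-up question is whether the difference you observe is large enough to trust. The sketch below (Python, with invented visitor and signup counts) compares two page variants with a standard two-proportion z-test; it is an illustration of the kind of check you might run, not a prescription for any particular tool.

    import math

    # Hypothetical results: visitors and signups for each pricing-page variant.
    visitors_a, signups_a = 5_000, 240   # current page (assumed numbers)
    visitors_b, signups_b = 5_000, 295   # redesigned page (assumed numbers)

    rate_a = signups_a / visitors_a
    rate_b = signups_b / visitors_b

    # Two-proportion z-test using the pooled conversion rate.
    pooled = (signups_a + signups_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err

    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

    print(f"Variant A: {rate_a:.2%}   Variant B: {rate_b:.2%}")
    print(f"z = {z:.2f}, p = {p_value:.4f}")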

Step 6: Communicate your solution to stakeholders

Critical thinking can help you in every element of your life, but in the workplace, you must also involve key project stakeholders. Stakeholders help you determine next steps, like whether you’ll A/B test the page first. Depending on the complexity of the issue, consider hosting a meeting or sharing a status report to get everyone on the same page. 

Step 7: Reflect on the results

No process is complete without evaluating the results. Once the new page has been live for some time, evaluate whether it did better than the previous page. What worked? What didn’t? This also helps you make better critical decisions later on.

Tools and techniques to improve critical thinking skills

As the importance of critical thinking continues to grow in academic and professional settings, numerous tools and resources have been developed to help individuals enhance their critical thinking skills. Here are some notable contributions from experts and institutions in the field:

Mind mapping for better analysis

Mind mapping is a visual technique that helps organize and structure information. It's particularly useful for synthesizing complex ideas and identifying connections between different concepts. The benefits of mind mapping include:

Enhancing creativity by encouraging non-linear thinking

Improving memory and retention of information

Facilitating brainstorming and idea generation

Providing a clear overview of complex topics

To create a mind map:

Start with a central idea or concept.

Branch out with related subtopics or ideas.

Use colors, symbols, and images to enhance visual appeal and memorability.

Draw connections between related ideas across different branches.

Mind mapping can be particularly effective in project planning , content creation, and studying complex subjects.
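For people who prefer working in text or code rather than on paper, a mind map can also be captured as a simple nested data structure. The sketch below (Python, with a hypothetical topic and branches) stores a map as nested dictionaries and prints it as an indented outline.

    # A mind map as nested dictionaries: each key is a node, its value
    # holds that node's sub-branches. Topic and branches are hypothetical.
    mind_map = {
        "Website redesign": {
            "Audience": {"New visitors": {}, "Returning customers": {}},
            "Content": {"Pricing page": {}, "FAQ": {}},
            "Constraints": {"Budget": {}, "Launch date": {}},
        }
    }

    def print_outline(node, depth=0):
        """Print the map as an indented outline, one branch per line."""
        for label, children in node.items():
            print("  " * depth + "- " + label)
            print_outline(children, depth + 1)

    print_outline(mind_map)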

The Socratic Method for deeper understanding

The Socratic Method, named after the ancient Greek philosopher Socrates, involves asking probing questions to stimulate critical thinking and illuminate ideas. This technique is widely used in higher education to teach critical thinking. Key aspects of the Socratic Method include:

Asking open-ended questions that encourage deeper reflection

Challenging assumptions and preconceived notions

Exploring the implications and consequences of ideas

Fostering intellectual curiosity and continuous inquiry

The Socratic Method can be applied in various settings:

In education, to encourage students to think deeply about subject matter

In business, to challenge team members to consider multiple points of view

In personal development, to examine one's own beliefs and decisions

Example: A high school teacher might use the Socratic Method to guide students through a complex ethical dilemma, asking questions like "What principles are at stake here?" and "How might this decision affect different stakeholders?"

SWOT analysis for comprehensive evaluation

SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis is a strategic planning tool that can be applied to critical thinking. It helps in evaluating situations from multiple angles, promoting a more thorough understanding of complex issues. The components of SWOT analysis are:

Strengths: Internal positive attributes or assets

Weaknesses: Internal negative attributes or limitations

Opportunities: External factors that could be beneficial

Threats: External factors that could be harmful

To conduct a SWOT analysis:

Clearly define the subject of analysis (e.g., a project, organization, or decision).

Brainstorm and list items for each category.

Analyze the interactions between different factors.

Use the analysis to inform strategy or decision-making.

Example: A startup might use SWOT analysis to evaluate its position before seeking investment, identifying its innovative technology as a strength, limited capital as a weakness, growing market demand as an opportunity, and established competitors as a threat.
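The same startup example can be organized as a small data structure so the four categories sit side by side during a review. This is only a sketch; the entries simply restate the example above.

    # Organize the startup example's SWOT analysis for a side-by-side review.
    swot = {
        "Strengths":     ["Innovative technology"],
        "Weaknesses":    ["Limited capital"],
        "Opportunities": ["Growing market demand"],
        "Threats":       ["Established competitors"],
    }

    for category, items in swot.items():
        print(category)
        for item in items:
            print(f"  - {item}")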

Critical thinking resources

The Foundation for Critical Thinking: Based in California, this organization offers a wide range of resources, including books, articles, and workshops on critical thinking.

The National Council for Excellence in Critical Thinking: This council provides guidelines and standards for critical thinking instruction and assessment.

University of Louisville: Its Critical Thinking Initiative offers various resources and tools for developing critical thinking skills.

The New York Times Learning Network: Provides lesson plans and activities to help develop critical thinking skills through current events and news analysis.

Critical thinking frameworks and tools

Paul-Elder Critical Thinking Framework: Developed by Dr. Richard Paul and Dr. Linda Elder, this framework provides a comprehensive approach to developing critical thinking skills.

Bloom's Taxonomy: While not exclusively for critical thinking, this classification system is widely used in education to promote higher-order thinking skills.

The California Critical Thinking Disposition Inventory (CCTDI): This assessment tool measures the disposition to engage problems and make decisions using critical thinking.

The Ennis-Weir Critical Thinking Essay Test: Developed by Robert Ennis, this test assesses a person's ability to appraise an argument and to formulate a written argument.

By incorporating these tools and techniques into regular practice, individuals can significantly enhance their critical thinking capabilities, leading to more effective problem-solving, decision-making, and overall cognitive performance.

Critically successful 

Critical thinking takes time to build, but with effort and patience you can apply an unbiased, analytical mind to any situation. Critical thinking is one of many soft skills that make you an effective team member, manager, and worker. If you’re looking to hone your skills further, read our article on the 25 project management skills you need to succeed.


Yes, We Can Define, Teach, and Assess Critical Thinking Skills


By Jeff Heyck-Williams, Director of Curriculum and Instruction at Two Rivers Public Charter School


Defining Critical Thinking Skills

We began this work by first defining what we mean by critical thinking. After a review of the literature and looking at the practice at other schools, we identified five constructs that encompass a set of broadly applicable skills: schema development and activation; effective reasoning; creativity and innovation; problem solving; and decision making.


We then created rubrics to provide a concrete vision of what each of these constructs looks like in practice. Working with the Stanford Center for Assessment, Learning and Equity (SCALE), we refined these rubrics to capture clear and discrete skills.

For example, we defined effective reasoning as the skill of creating an evidence-based claim: students need to construct a claim, identify relevant support, link their support to their claim, and identify possible questions or counter claims. Rubrics provide an explicit vision of the skill of effective reasoning for students and teachers. By breaking the rubrics down for different grade bands, we have been able not only to describe what reasoning is but also to delineate how the skills develop in students from preschool through 8th grade.


Before moving on, I want to freely acknowledge that in narrowly defining reasoning as the construction of evidence-based claims we have disregarded some elements of reasoning that students can and should learn. For example, the difference between constructing claims through deductive versus inductive means is not highlighted in our definition. However, by privileging a definition that has broad applicability across disciplines, we are able to gain traction in developing the roots of critical thinking: in this case, the ability to formulate well-supported claims or arguments.

Teaching Critical Thinking Skills

The definitions of critical thinking constructs were only useful to us in as much as they translated into practical skills that teachers could teach and students could learn and use. Consequently, we have found that to teach a set of cognitive skills, we needed thinking routines that defined the regular application of these critical thinking and problem-solving skills across domains. Building on Harvard’s Project Zero Visible Thinking work, we have named routines aligned with each of our constructs.

For example, with the construct of effective reasoning, we aligned the Claim-Support-Question thinking routine to our rubric. Teachers then were able to teach students that whenever they were making an argument, the norm in the class was to use the routine in constructing their claim and support. The flexibility of the routine has allowed us to apply it from preschool through 8th grade and across disciplines from science to economics and from math to literacy.


Kathryn Mancino, a 5th grade teacher at Two Rivers, has deliberately taught three of our thinking routines to students using anchor charts. Her charts name the components of each routine and have a place for students to record when they’ve used it and what they have figured out about the routine. By using this structure with a chart that can be added to throughout the year, students see the routines as broadly applicable across disciplines and are able to refine their application over time.

Assessing Critical Thinking Skills

By defining specific constructs of critical thinking and building thinking routines that support their implementation in classrooms, we have operated under the assumption that students are developing skills that they will be able to transfer to other settings. However, we recognized both the importance and the challenge of gathering reliable data to confirm this.

With this in mind, we have developed a series of short performance tasks around novel discipline-neutral contexts in which students can apply the constructs of thinking. Through these tasks, we have been able to provide an opportunity for students to demonstrate their ability to transfer the types of thinking beyond the original classroom setting. Once again, we have worked with SCALE to define tasks where students easily access the content but where the cognitive lift requires them to demonstrate their thinking abilities.

These assessments demonstrate that it is possible to capture meaningful data on students’ critical thinking abilities. They are not intended to be high-stakes accountability measures. Instead, they are designed to give students, teachers, and school leaders discrete formative data on hard-to-measure skills.

While it is clearly difficult, and we have not solved all of the challenges to scaling assessments of critical thinking, we can define, teach, and assess these skills. In fact, knowing how important they are for the economy of the future and our democracy, it is essential that we do.

The opinions expressed in Next Gen Learning in Action are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


13 Easy Steps To Improve Your Critical Thinking Skills


With the sheer volume of information that we’re bombarded with on a daily basis – and with the pervasiveness of fake news and social media bubbles – the ability to look at evidence, evaluate the trustworthiness of a source, and think critically is becoming more important than ever. This is why, for me, critical thinking is one of the most vital skills to cultivate for future success.

Critical thinking isn’t about being constantly negative or critical of everything. It’s about objectivity and having an open, inquisitive mind. To think critically is to analyze issues based on hard evidence (as opposed to personal opinions, biases, etc.) in order to build a thorough understanding of what’s really going on. And from this place of thorough understanding, you can make better decisions and solve problems more effectively.

To put it another way, critical thinking means arriving at your own carefully considered conclusions instead of taking information at face value. Here are 13 ways you can cultivate this precious skill:

1. Always vet new information with a cautious eye. Whether it’s an article someone has shared online or data that’s related to your job, always vet the information you're presented with. Good questions to ask here include, "Is this information complete and up to date?” “What evidence is being presented to support the argument?” and “Whose voice is missing here?”

2. Look at where the information has come from. Is the source trustworthy? What is their motivation for presenting this information? For example, are they trying to sell you something or get you to take a certain action (like vote for them)?


3. Consider more than one point of view. Everyone has their own opinions and motivations – even highly intelligent people making reasonable-sounding arguments have personal opinions and biases that shape their thinking. So, when someone presents you with information, consider whether there are other sides to the story.

4. Practice active listening. Listen carefully to what others are telling you, and try to build a clear picture of their perspective. Empathy is a really useful skill here since putting yourself in another person's shoes can help you understand where they're coming from and what they might want. Try to listen without judgment – remember, critical thinking is about keeping an open mind.

5. Gather additional information where needed. Whenever you identify gaps in the information or data, do your own research to fill those gaps. The next few steps will help you do this objectively…

6. Ask lots of open-ended questions. Curiosity is a key trait of critical thinkers, so channel your inner child and ask lots of "who," "what," and "why" questions.

7. Find your own reputable sources of information, such as established news sites, nonprofit organizations, and education institutes. Try to avoid anonymous sources or sources with an ax to grind or a product to sell. Also, be sure to check when the information was published. An older source may be unintentionally offering up wrong information just because events have moved on since it was published; corroborate the info with a more recent source.

8. Try not to get your news from social media. And if you do see something on social media that grabs your interest, check the accuracy of the story (via reputable sources of information, as above) before you share it.

9. Learn to spot fake news. It's not always easy to spot false or misleading content, but a good rule of thumb is to look at the language, emotion, and tone of the piece. Is it using emotionally charged language, for instance, and trying to get you to feel a certain way? Also, look at the sources of facts, figures, images, and quotes. A legit news story will clearly state its sources.

10. Learn to spot biased information. Like fake news, biased information may seek to appeal more to your emotions than logic and/or present a limited view of the topic. So ask yourself, “Is there more to this topic than what’s being presented here?” Do your own reading around the topic to establish the full picture.

11. Question your own biases, too. Everyone has biases, and there’s no point pretending otherwise. The trick is to think objectively about your likes and dislikes, preferences, and beliefs, and consider how these might affect your thinking.

12. Form your own opinions. Remember, critical thinking is about thinking independently. So once you’ve assessed all the information, form your own conclusions about it.

13. Continue to work on your critical thinking skills. I recommend looking at online learning platforms such as Udemy and Coursera for courses on general critical thinking skills, as well as courses on specific subjects like cognitive biases.

Read more about critical thinking and other essential skills in my new book, Future Skills: The 20 Skills & Competencies Everyone Needs To Succeed In A Digital World . Written for anyone who wants to surf the wave of digital transformation – rather than be drowned by it – the book explores why these vital future skills matter and how to develop them.

Bernard Marr


Critical Thinking: What It Is and Why It Counts

2023 Update, Peter A. Facione, Ph.D.

The late George Carlin worked “critical thinking” into one of his comedic monologue rants on the perils of trusting our lives and fortunes to the decision-making of people who were gullible, uninformed, and unreflective. Had he lived to experience the economic collapse of 2008 and 2009, he would have surely added more to his caustic but accurate assessments regarding how failing to anticipate the consequences of one’s decisions often leads to disastrous results not only for the decision maker, but for many other people as well.

After years of viewing higher education as more of a private good which benefits only the student, we are again beginning to appreciate higher education as being also a public good which benefits society. Is it not a wiser social policy to invest in the education of the future workforce, rather than to suffer the financial costs and endure the fiscal and social burdens associated with economic weakness, public health problems, crime, and avoidable poverty? Perhaps that realization, along with its obvious advantages for high level strategic decision making, is what led the Chairperson of the Joint Chiefs of Staff to comment on critical thinking in his commencement address to a graduating class of military officers.


Teach people to make good decisions and you have equipped them to improve their own futures and become contributing members of society, rather than burdens on society. Becoming educated and practicing good judgment does not absolutely guarantee a life of happiness, virtue, or economic success, but it surely offers a better chance at those things. And it is clearly better than enduring the consequences of making bad decisions and better than burdening friends, family, and all the rest of us with the unwanted and avoidable consequences of those poor choices.

Defining “Critical Thinking”

Yes, surely, we have all heard business executives, policy makers, civic leaders, and educators talking about critical thinking. At times we found ourselves wondering exactly what critical thinking was and why it is considered so useful and important. This essay takes a deeper look at these questions.

But rather than beginning with an abstract definition – as if critical thinking were about memorization, which is not the case – give this thought experiment a try: Imagine you were invited to a movie by a friend. But it is not a movie you want to see. So, your friend asks you why. You give your honest reason. The movie offends your sense of decency. Your friend asks you to clarify your reason by explaining what bothers you about the film. You reply that it is not the language used or the sexuality portrayed, but you find the violence in the film offensive.

Sure, that should be a good enough answer. But suppose your friend, perhaps being a bit philosophically inclined or simply curious or argumentative, pursues the matter further by asking you to define what you mean by “offensive violence.”

Take a minute and give it a try. How would you define “offensive violence” as it applies to movies? Can you write a characterization which captures what this commonly used concept contains? Take care, though, we would not want to make the definition so broad that all movie violence would be automatically “offensive.” And check to be sure your way of defining “offensive violence” fits with how the rest of the people who know and use English would understand the term. Otherwise, they will not be able to understand what you mean when you use that expression.

Did you produce a definition that works? How do you know?

What you just did with the expression “offensive violence” is very much the same as what had to be done with the expression “critical thinking.” At one level we all know what “critical thinking” means — it means good thinking, almost the opposite of illogical, irrational, thinking. But when we test our understanding further, we run into questions. For example, is critical thinking the same as creative thinking, are they different, or is one part of the other? How do critical thinking and native intelligence or scholastic aptitude relate? Does critical thinking focus on the subject matter or content that you know or on the process you use when you reason about that content?

It might not hurt at all if you formed some tentative preliminary ideas about the questions we just raised. We humans learn better when we stop frequently to reflect, rather than just plowing from the top of the page to the bottom without coming up for air.


Back to critical thinking – let us ask ourselves to generate possible examples of strong critical thinking? How about the adroit and clever questioning of Socrates or a good attorney or interviewer? Or, what about the clever investigative approaches used by police detectives and crime scene analysts? Would we not want to also include people working together to solve a problem as they consider and discuss their options? How about someone who is good at listening to all sides of a dispute, considering all the facts, and then deciding what is relevant and what is not, and then rendering a thoughtful judgment? And maybe too, someone who can summarize complex ideas clearly with fairness to all sides, or a person who can come up with the most coherent and justifiable explanation of what a passage of written material means? Or the person who can readily devise sensible alternatives to explore, but who does not become defensive about abandoning them if they do not work? And the person who can explain exactly how a particular conclusion was reached, or why certain criteria apply?

Or, considering the concept of critical thinking from the opposite direction, we might ask what the consequences of failing to use our critical thinking might be. Imagine for a moment what could happen when a person or a group of people decides important matters without pausing first to think things through.


Expert Opinion

An international group of experts was asked to try to form a consensus about the meaning of critical thinking. One of the first things they did was to ask themselves the question: Who are the best critical thinkers we know and what is it about them that leads us to consider them the best? So, who are the best critical thinkers you know? Why do you think they are strong critical thinkers? Can you draw from those examples a description that is more abstract? For example, consider effective trial lawyers, apart from how they conduct their personal lives or whether their client is guilty or innocent, just look at how the lawyers develop their cases in court. They use reasons to try to convince the judge and jury of their client’s claim of guilt or innocence. They offer evidence and evaluate the significance of the evidence presented by the opposition lawyers. They interpret testimony. They analyze and evaluate the arguments advanced by the other side.

Now, consider the example of a team of people trying to solve a problem. The team members, unlike the courtroom’s adversarial situation, try to collaborate. The members of an effective team do not compete against each other. They work together, like colleagues, for the common goal. Unless they solve the problem, none of them has won. When they find the way to solve the problem, they all have won. So, from analyzing just two examples we can generalize something especially important: critical thinking is thinking that has a purpose (proving a point, interpreting what something means, solving a problem), but critical thinking can be a collaborative, noncompetitive endeavor. And, by the way, even lawyers collaborate. They can work together on a common defense or a joint prosecution, and they can also cooperate with each other to get to the truth so that justice is done.

We will come to a more precise definition of critical thinking soon enough. But first, there is something else we can learn from paradigm examples. When you were thinking about “offensive violence,” did you come up with any examples that were tough to classify? Borderline cases, as it were — an example that one person might consider offensive, but another might reasonably regard as non-offensive. Yes, well, so did we. This is going to happen with all abstract concepts. It happens with the concept of critical thinking as well. There are people of whom we would say, on certain occasions, this person is a good thinker, clear, logical, thoughtful, attentive to the facts, open to alternatives, but, wow, at other times, look out! When you get this person on such-and-such a topic, well it is all over then. You have pushed some kind of button, and the person does not want to hear what anybody else has to say. The person’s mind is made up ahead of time. New facts are pushed aside. No other point of view is tolerated.

Do you know any people that might fit that general description?


Now, formulate a list of cases — people that are clearly strong critical thinkers and clearly weak critical thinkers and some who are on the borderline. Considering all those cases, what is it about them that led you to decide which were which? Suggestion: What can the strong critical thinkers do (what mental abilities do they have), that the weak critical thinkers have trouble doing? What skills or approaches do the strong critical thinkers habitually seem to exhibit which the weak critical thinkers seem not to possess?

Core Critical Thinking Skills

Above we suggested you look for a list of mental skills and habits of mind. The experts, when faced with the same problem you are working on, refer to their lists as including cognitive skills and dispositions.


Quoting from the consensus statement of the national panel of experts: interpretation is “to comprehend and express the meaning or significance of a wide variety of experiences, situations, data, events, judgments, conventions, beliefs, rules, procedures, or criteria.” [1] Interpretation includes the sub-skills of categorization, decoding significance, and clarifying meaning. Can you think of examples of interpretation? How about recognizing a problem and describing it without bias? How about reading a person’s intentions in the expression on her face; distinguishing a main idea from subordinate ideas in a text; constructing a tentative categorization or way of organizing something you are studying; paraphrasing someone’s ideas in your own words; or, clarifying what a sign, chart or graph means? What about identifying an author’s purpose, theme, or point of view? How about what you did above when you clarified what “offensive violence” meant?

Again, from the experts: analysis is “to identify the intended and actual inferential relationships among statements, questions, concepts, descriptions, or other forms of representation intended to express belief, judgment, experiences, reasons, information, or opinions.” The experts include examining ideas, detecting arguments, and analyzing arguments as sub-skills of analysis. Again, can you come up with some examples of analysis? What about identifying the similarities and differences between two approaches to the solution of a given problem? What about picking out the main claim made in a newspaper editorial and tracing back the reasons the editor offers in support of that claim? Or, what about identifying unstated assumptions; constructing a way to represent a main conclusion and the reasons given to support or criticize it; sketching the relationship of sentences or paragraphs to each other and to the main purpose of the passage? What about graphically organizing this essay, in your own way, knowing that its purpose is to give a preliminary idea about what critical thinking means?

The experts define evaluation as meaning “to assess the credibility of statements or other representations which are accounts or descriptions of a person’s perception, experience, situation, judgment, belief, or opinion; and to assess the logical strength of the actual or intended inferential relationships among statements, descriptions, questions or other forms of representation.” Your examples? How about judging an author’s or speaker’s credibility, comparing the strengths and weaknesses of alternative interpretations, determining the credibility of a source of information, judging if two statements contradict each other, or judging if the evidence at hand supports the conclusion being drawn? Among the examples the experts propose are these: “recognizing the factors which make a person a credible witness regarding a given event or a credible authority with regard to a given topic,” “judging if an argument’s conclusion follows either with certainty or with a high level of confidence from its premises,” “judging the logical strength of arguments based on hypothetical situations,” “judging if a given argument is relevant or applicable or has implications for the situation at hand.”

Do the people you regard as strong critical thinkers have the three cognitive skills described so far? Are they good at interpretation, analysis, and evaluation? What about the next three? And your examples of weak critical thinkers, are they lacking in these cognitive skills? All, or just some?

To the experts, inference means “to identify and secure elements needed to draw reasonable conclusions; to form conjectures and hypotheses; to consider relevant information and to reason to the consequences flowing from data, statements, principles, evidence, judgments, beliefs, opinions, concepts, descriptions, questions, or other forms of representation.” As sub-skills of inference the experts list querying evidence, conjecturing alternatives, and drawing conclusions. Can you think of some examples of inference? You might suggest things like seeing the implications of the position someone is advocating. Or drawing out or constructing meaning from the elements in a reading. You may suggest predicting what will happen next based on what is known about the forces at work in a given situation. Or formulating a synthesis of related ideas into a coherent perspective. How about this: after judging that it would be useful to you to resolve a given uncertainty, developing a workable plan to gather that information? Or, when faced with a problem, developing a set of options for addressing it. What about conducting a controlled experiment scientifically and applying the proper statistical methods to attempt to confirm or disconfirm an empirical hypothesis?

Beyond being able to interpret, analyze, evaluate, and infer, strong critical thinkers can do two more things. They can explain what they think and how they arrived at that judgment. And they can apply their powers of critical thinking to themselves and improve on their previous opinions. These two skills are called “explanation” and “self-regulation.”

The experts define explanation as being able to present in a cogent and coherent way the results of one’s reasoning. This means to be able to give someone a full look at the big picture: both “to state and to justify that reasoning in terms of the evidential, conceptual, methodological, criteriological, and contextual considerations upon which one’s results were based; and to present one’s reasoning in the form of cogent arguments.” The sub-skills under explanation are describing methods and results, justifying procedures, proposing and defending with good reasons one’s causal and conceptual explanations of events or points of view, and presenting full and well-reasoned arguments in the context of seeking the best understandings possible. Your examples first, please… Here are some more: to construct a chart which organizes one’s findings, to write down for future reference your current thinking on some important and complex matter, to cite the standards and contextual factors used to judge the quality of an interpretation of a text, to state research results and describe the methods and criteria used to achieve those results, to appeal to established criteria as a way of showing the reasonableness of a given judgment, to design a graphic display which accurately represents the subordinate and super-ordinate relationship among concepts or ideas, to cite the evidence that led you to accept or reject an author’s position on an issue, to list the factors that were considered in assigning a final course grade.

Maybe the most remarkable cognitive skill of all, however, is this next one. This one is remarkable because it allows strong critical thinkers to improve their own thinking. In a sense this is critical thinking applied to itself. Because of that some people want to call this “meta-cognition,” meaning it raises thinking to another level. But “another level” really does not fully capture it, because at that next level up what self-regulation does is look back at all the dimensions of critical thinking and double check itself. Self-regulation is like a recursive function in mathematical terms, which means it can apply to everything, including itself. You can monitor and correct an interpretation you offered. You can examine and correct an inference you have drawn. You can review and reformulate one of your own explanations. You can even examine and correct your ability to examine and correct yourself! How? It is as simple as stepping back and saying to yourself, “How am I doing? Have I missed anything important? Let me double check before I go further.”

The experts define self-regulation to mean “self-consciously to monitor one’s cognitive activities, the elements used in those activities, and the results educed, particularly by applying skills in analysis, and evaluation to one’s own inferential judgments with a view toward questioning, confirming, validating, or correcting either one’s reasoning or one’s results.” The two sub-skills here are self-examination and self-correction. Examples? Easy — to examine your views on a controversial issue with sensitivity to the possible influences of your personal biases or self-interest, to check yourself when listening to a speaker in order to be sure you are understanding what the person is really saying without introducing your own ideas, to monitor how well you seem to be understanding or comprehending what you are reading or experiencing, to remind yourself to separate your personal opinions and assumptions from those of the author of a passage or text, to double check yourself by recalculating the figures, to vary your reading speed and method mindful of the type of material and your purpose for reading, to reconsider your interpretation or judgment in view of further analysis of the facts of the case, to revise your answers in view of the errors you discovered in your work, to change your conclusion in view of the realization that you had misjudged the importance of certain factors when coming to your earlier decision. [2]

• What does this mean?
• What is happening?
• How should we understand that (e.g., what he or she just said)?
• What is the best way to characterize/categorize/classify this?
• In this context, what was intended by saying/doing that?
• How can we make sense out of this (experience, feeling, or statement)?
• Please tell us again your reasons for making that claim.
• What is your conclusion/What is it that you are claiming?
• Why do you think that?
• What are the arguments pro and con?
• What assumptions must we make to accept that conclusion?
• What is your basis for saying that?
• Given what we know so far, what conclusions can we draw?
• Given what we know so far, what can we rule out?
• What does this evidence imply?
• If we abandoned/accepted that assumption, how would things change?
• What additional information do we need to resolve this question?
• If we believed these things, what would they imply for us going forward?
• What are the consequences of doing things that way?
• What are some alternatives we have not yet explored?
• Let us consider each option and see where it takes us.
• Are there any undesirable consequences that we can and should foresee?
• How credible is that claim?
• Why do we think we can trust what this person claims?
• How strong are those arguments?
• Do we have our facts right?
• How confident can we be in our conclusion, given what we now know?
• What were the specific findings/results of the investigation?
• Please tell us how you conducted that analysis.
• How did you come to that interpretation?
• Please take us through your reasoning one more time.
• Why do you think that (was the right answer/was the solution)?
• How would you explain why this decision was made?
• Our position on this issue is still too vague; can we be more precise?
• How good was our methodology, and how well did we follow it?
• Is there a way we can reconcile these two apparently conflicting conclusions?
• How good is our evidence?
• OK, before we commit, what are we missing?
• I am finding some of our definitions a little confusing; can we revisit what we mean by certain things before making any final decisions?

The Delphi Research Method

The panel of experts we keep referring to included forty-six men and women from throughout the United States and Canada. They represented many different scholarly disciplines in the humanities, sciences, social sciences, and education. They participated in a research project that lasted two years and was conducted on behalf of the American Philosophical Association. Their work was published under the title Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. The executive summary is available from www.insightassessment.com.

You might be wondering how such a large group of people could collaborate on this project over that long a period and at those distances and still come to a consensus. Good question. Remember, we are talking about the days before e-mail.

The expert consensus definitions of the six core skills, with the sub-skills listed for each:

Interpretation: “To comprehend and express the meaning or significance of a wide variety of experiences, situations, data, events, judgments, conventions, beliefs, rules, procedures, or criteria.” Sub-skills: categorize, decode significance, clarify meaning.

Analysis: “To identify the intended and actual inferential relationships among statements, questions, concepts, descriptions, or other forms of representation intended to express belief, judgment, experiences, reasons, information, or opinions.” Sub-skills: examine ideas, identify arguments, identify reasons and claims.

Inference: “To identify and secure elements needed to draw reasonable conclusions; to form conjectures and hypotheses; to consider relevant information and to reason to the consequences flowing from data, statements, principles, evidence, judgments, beliefs, opinions, concepts, descriptions, questions, or other forms of representation.” Sub-skills: query evidence, conjecture alternatives, draw logically valid or justified conclusions.

Evaluation: “To assess the credibility of statements or other representations that are accounts or descriptions of a person’s perception, experience, situation, judgment, belief, or opinion; and to assess the logical strength of the actual or intended inferential relationships among statements, descriptions, questions, or other forms of representation.” Sub-skills: assess credibility of claims, assess quality of arguments that were made using inductive or deductive reasoning.

Explanation: “To state and to justify that reasoning in terms of the evidential, conceptual, methodological, criteriological, and contextual considerations upon which one’s results were based; and to present one’s reasoning in the form of cogent arguments.” Sub-skills: state results, justify procedures, present arguments.

Self-regulation: “Self-consciously to monitor one’s cognitive activities, the elements used in those activities, and the results educed, particularly by applying skills in analysis, and evaluation to one’s own inferential judgments with a view toward questioning, confirming, validating, or correcting either one’s reasoning or one’s results.” Sub-skills: self-monitor, self-correct.

Not only did the group have to rely on snail mail during their two-year collaboration; they used a method of interaction, known as the Delphi Method, which was developed precisely to enable experts to think effectively about something over large spans of distance and time. In the Delphi Method a central investigator organizes the group and feeds them an initial question. [In this case it had to do with how college level critical thinking should be defined so that people teaching at that level would know which skills and dispositions to cultivate in their students.] The central investigator receives all responses, summarizes them, and transmits them back to all the panelists for reactions, replies, and additional questions.

Wait a minute! These are all well-known experts, so what do you do if people disagree? And what about the possible influence of a big-name person? Good points. First, the central investigator takes precautions to remove names so that the panelists are not told who said what. They know who is on the panel, of course. But that is as far as it goes. After that, each expert’s argument must stand on its own merits. Second, an expert is only as good as the arguments she or he gives. So, the central investigator summarizes the arguments and lets the panelists decide if they accept them or not. When consensus appears to be at hand, the central investigator proposes this and asks if people agree. If not, then points of disagreement among the experts are registered. We want to share with you one important example of each of these. First, we will describe the expert consensus view of the dispositions which are vital to strong critical thinking. Then we will note a point of separation among the experts.

The Disposition Toward Critical Thinking

What kind of a person would be apt to use their critical thinking skills? The experts poetically describe such a person as having “a critical spirit.” Having a critical spirit does not mean that the person is always negative and hypercritical of everyone and everything.

The experts use the metaphorical phrase critical spirit in a positive sense. By it they mean “a probing inquisitiveness, a keenness of mind, a zealous dedication to reason, and a hunger or eagerness for reliable information.”

Almost sounds like Supreme Court Justice Sandra Day O’Connor or Sherlock Holmes. The kind of person being described here is the kind that always wants to ask “Why?” or “How?” or “What happens if?” The one key difference, however, is that in fiction Sherlock always solves the mystery, while in the real world there is no guarantee. Critical thinking is about how you approach problems, questions, issues. It is the best way we know of to get to the truth. But! There still are no guarantees — no answers in the back of the book of real life. Does this characterization, that strong critical thinkers possess a “critical spirit, a probing inquisitiveness, a keenness of mind…” fit with your examples of people you would call strong critical thinkers?

But you might say, I know people who have skills but do not use them. We cannot call someone a strong critical thinker just because she or he has these cognitive skills, however important they might be, because what if they just do not bother to apply them?

One response is to say that it is hard to imagine an accomplished dancer who never dances. After working to develop those skills it seems such a shame to let them grow weak with lack of practice. But dancers get tired. And they surrender to the stiffness of age or the fear of injury. In the case of critical thinking skills, we might argue that not using them once you have them is hard to imagine. It’s hard to imagine a person deciding not to think.

Considered as a form of thoughtful judgment or reflective decision-making, in a very real sense critical thinking is pervasive. There is hardly a time or a place where it would not seem to be of potential value. As long as people have purposes in mind and wish to judge how to accomplish them, as long as people wonder what is true and what is not, what to believe and what to reject, strong critical thinking is going to be necessary.

And yet weird things happen, so it is probably true that some people might let their thinking skills grow dull. It is easier to imagine times when people are just too tired, too lax, or too frightened. But imagine it you can, Young Skywalker, so there must be more to critical thinking than just the list of cognitive skills. Human beings are more than thinking machines. And this brings us back to those all-important attitudes which the experts called “dispositions.”


The experts were persuaded that critical thinking is a pervasive and purposeful human phenomenon. The ideal critical thinker can be characterized not merely by her or his cognitive skills but also by how she or he approaches life and living in general. This is a bold claim. Critical thinking goes way beyond the classroom. In fact, many of the experts fear that some of the things people experience in school are harmful to the development and cultivation of strong critical thinking. Critical thinking came before schooling was ever invented; it lies at the very roots of civilization. It is a cornerstone in the journey humankind is taking from beastly savagery to global sensitivity. Consider what life would be like without the things on this list and we think you will understand.

The approaches to life and living which characterize critical thinking include:

* inquisitiveness regarding a wide range of issues,

* concern to become and remain well-informed,

* alertness to opportunities to use critical thinking,

* trust in the processes of reasoned inquiry,

* self-confidence in one’s own abilities to reason,

* open-mindedness regarding divergent world views,

* flexibility in considering alternatives and opinions,

* understanding of the opinions of other people,

* fair-mindedness in appraising reasoning,

* honesty in facing one’s own biases, prejudices, stereotypes, or egocentric tendencies,

* prudence in suspending, making, or altering judgments,

* willingness to reconsider and revise views where honest reflection suggests that change is warranted.

What would someone be like who lacked those dispositions?

It might be someone who does not care about much of anything, is not interested in the facts, prefers not to think, mistrusts reasoning as a way of finding things out or solving problems, holds his or her own reasoning abilities in low esteem, is close-minded, inflexible, insensitive, cannot understand what others think, is unfair when it comes to judging the quality of arguments, denies his or her own biases, jumps to conclusions or delays too long in making judgments, and never is willing to reconsider an opinion. Not someone prudent people would want to ask to manage their investments!

The experts went beyond approaches to life and living in general to emphasize that strong critical thinkers can also be described in terms of how they approach specific issues, questions, or problems. The experts said you would find these sorts of characteristics:

* clarity in stating the question or concern,

* orderliness in working with complexity,

* diligence in seeking relevant information,

* reasonableness in selecting and applying criteria,

* care in focusing attention on the concern at hand,

* persistence though difficulties are encountered,

* precision to the degree permitted by the subject and the circumstances.

So, how would a weak critical thinker approach specific problems or issues? Obviously, by being muddle-headed about what he or she is doing, disorganized and overly simplistic, spotty about getting the facts, apt to apply unreasonable criteria, easily distracted, ready to give up at the least hint of difficulty, intent on a solution that is more detailed than is possible, or satisfied with an overly generalized and uselessly vague response. Remind you of anyone you know?

Someone positively disposed toward using critical thinking would probably agree with statements like these:

“I hate talk shows where people shout their opinions but never give any reasons at all.”

“Figuring out what people really mean by what they say is important to me.”

“I always do better in jobs where I’m expected to think things out for myself.”

“I hold off making decisions until I have thought through my options.”

“Rather than relying on someone else’s notes, I prefer to read the material myself.”

“I try to see the merit in another’s opinion, even if I reject it later.”

“Even if a problem is tougher than I expected, I will keep working on it.”

“Making intelligent decisions is more important than winning arguments.”

ways to assess critical thinking skills

A person disposed to be averse or hostile toward using critical thinking would probably disagree with the statements above but be likely to agree with these:

“I prefer jobs where the supervisor says exactly what to do and exactly how to do it.”

“No matter how complex the problem, you can bet there will be a simple solution.”

“I don’t waste time looking things up.”

“I hate when teachers discuss problems instead of just giving the answers.”

“If my belief is truly sincere, evidence to the contrary is irrelevant.”

“Selling an idea is like selling cars: you say whatever works.”

We used the expression “strong critical thinker” to contrast with the expression “weak critical thinker.” But you will find people who drop the adjective “strong” (or “good”) and just say that someone is a “critical thinker” or not. It is like saying that a soccer (European “football”) player is a “defender” or “not a defender”, instead of saying the player’s skills at playing defense are strong or weak. People use the word “defender” in place of the phrase “is good at playing defense.” Similarly, people use “critical thinker” in place of “is a strong critical thinker” or “has strong critical thinking skills.” This is not only a helpful conversational shortcut, it suggests that to many people “critical thinker” has a laudatory sense. The word can be used to praise someone at the same time that it identifies the person, as in “Look at that play. That’s what I call a defender!”

“If we were compelled to make a choice between these personal attributes and knowledge about the principles of logical reasoning together with some degree of technical skill in manipulating special logical processes, we should decide for the former.”

John Dewey, How We Think, 1910. Republished as How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process. D. C. Heath Publishing. Lexington, MA. 1933.

We said the experts did not come to full agreement on something. That thing has to do with the concept of a “good critical thinker.” This time the emphasis is on the word “good” because of the crucial ambiguity it contains. A person can be good at critical thinking, meaning that the person can have the appropriate dispositions and be adept at the cognitive processes, while still not being a good (in the moral sense) critical thinker. For example, a person can be adept at developing arguments and then, unethically, use this skill to mislead and exploit a gullible person, perpetrate a fraud, or deliberately confuse, confound, and frustrate a project.

The experts were faced with an interesting problem. Some, a minority, would prefer to think that critical thinking, by its very nature, is inconsistent with the kinds of unethical and deliberately counterproductive examples given. They find it hard to imagine a person who was good at critical thinking not also being good in the broader personal and social sense. In other words, if a person were “really” a “strong critical thinker” in the procedural sense and if the person had all the appropriate dispositions, then the person simply would not do those kinds of exploitive and aggravating things.


The large majority, however, hold the opposite judgment.

The majority are firm in the view that strong critical thinking has nothing to do with any given set of political or religious tenets, ethical values, cultural mores, orthodoxies, or ideologies of any kind. Rather, the commitment one makes as a strong critical thinker is to always seek the truth with objectivity, integrity, and fair-mindedness. Most experts maintain that critical thinking, conceived of as we have described it above, is, regrettably, consistent with abusing one’s knowledge, skills, or power. There have been people with superior thinking skills and strong habits of mind who, unfortunately, have used their talents for ruthless, horrific, and immoral purposes. Would that it were not so! Would that experience, knowledge, mental horsepower, and ethical virtues were all the same. But from the time of Socrates, if not thousands of years before that, humans have known that many of us have one or more of these without having the full set.

Any tool, any approach to situations, can go either way, ethically speaking, depending on the character, integrity, and principles of the persons who possess them. So, in the final analysis most experts maintained that we cannot say a person is not thinking critically simply because we disapprove ethically of what the person is doing. The majority concluded that, “what ‘critical thinking’ means, why it is of value, and the ethics of its use are best regarded as three distinct concerns.”

Perhaps this realization forms part of the basis for why people these days are demanding a broader range of learning outcomes from our schools and colleges. “Knowledge and skills,” the staples of the educational philosophy of the mid-twentieth century, are not sufficient. We must look to a broader set of outcomes including habits of mind and dispositions, such as civic engagement, concern for the common good, and social responsibility.

“Thinking” in Popular Culture

We have said so many good things about critical thinking that you might have the impression that “critical thinking” and “good thinking” mean the same thing. But that is not what the experts said. They see critical thinking as making up part of what we mean by good thinking, but not as being the only kind of good thinking. For example, they would have included creative thinking as part of good thinking.

Creative or innovative thinking is the kind of thinking that leads to new insights, novel approaches, fresh perspectives, whole new ways of understanding and conceiving of things. The products of creative thought include some obvious things like music, poetry, dance, dramatic literature, inventions, and technical innovations. But there are some not so obvious examples as well, such as ways of putting a question that expand the horizons of viable solutions, or ways of conceiving of relationships which challenge presuppositions and lead one to see the world in imaginative and different ways.

The experts working on the concept of critical thinking wisely left open the entire question of what the other forms good thinking might take. Creative thinking is only one example. There is a kind of purposive, kinetic thinking that instantly coordinates movement and intention as, for example, when an athlete dribbles a soccer ball down the field during a match. There is a kind of meditative thinking which may lead to a sense of inner peace or to profound insights about human existence. In contrast, there is a kind of hyper-alert, instinctive thinking needed by soldiers in battle. In the context of popular culture, one finds people proposing all kinds of thinking or this kind of intelligence or that kind of intelligence. Sometimes it is hard to sort out science from pseudo-science – the kernel of enduring truth from the latest cocktail party banter.

“Thinking” in Cognitive Science

Theories emerging from more scientific studies of human thinking and decision-making in recent years propose that thinking is more integrated and less dualistic than the notions in popular culture suggest. We should be cautious about proposals suggesting oversimplified ways of understanding how humans think. We should avoid harsh, rigid dichotomies such as “reason vs. emotion,” “intuitive vs. linear,” “creativity vs. criticality,” “right brained vs. left brained,” “as on Mars vs. as on Venus.”

There is often a kernel of wisdom in popular beliefs, and perhaps that gem this time is the realization that sometimes we decide things very quickly almost as spontaneous, intuitive, reactions to the situation at hand. Many accidents on the freeways of this nation are avoided precisely because drivers can see and react to dangerous situations so quickly. Many good decisions which feel intuitive are really the fruit of expertise. Decisions good drivers make in those moments of crisis, just like the decisions which practiced athletes make in the flow of a game or the decisions that a gifted teacher makes as she or he interacts with students, are borne of expertise, training, and practice.


Recent integrative models of human decision-making propose that the thinking processes of our species are not best described as a conflictive duality, as in “intuitive vs. reflective,” but rather as an integrative functioning of two mutually supportive systems, “intuitive and reflective.” These two systems of thinking are present in all of us and can act in parallel to process cognitively the matters over which we are deciding.

One system is more intuitive, reactive, quick and holistic. So as not to confuse things with the notions of thinking in popular culture, cognitive scientists often name this system “System 1.” The other (yes, you can guess its name) is more deliberative, reflective, computational and rule-governed. You are right, it is called “System 2.”

In System 1 thinking, one relies heavily on several heuristics (cognitive maneuvers), key situational characteristics, readily associated ideas, and vivid memories to arrive quickly and confidently at a judgment. System 1 thinking is particularly helpful in familiar situations when time is short and immediate action is required.

While System 1 is functioning, another powerful system is also at work, that is, unless we shut it down by abusing alcohol or drugs, or with fear or indifference. Called “System 2,” this is our more reflective thinking system. It is useful for making judgments when you find yourself in unfamiliar situations and have more time to figure things out. It allows us to process abstract concepts, to deliberate, to plan, to consider options carefully, to review and revise our work in the light of relevant guidelines or standards or rules of procedure. While System 2 decisions are also influenced by the correct or incorrect application of heuristic maneuvers, this is the system which relies on well-articulated reasons and more fully developed evidence. It is reasoning based on what we have learned through careful analysis, evaluation, explanation, and self-correction. This is the system which values intellectual honesty, analytically anticipating what happens next, maturity of judgment, fair-mindedness, elimination of biases, and truth-seeking. This is the system which we rely on to carefully think through complex, novel, high-stakes, and highly integrative problems. [3]

Educators urge us to improve our critical thinking skills and to reinforce our disposition to use those skills because that is perhaps the best way to develop and refine our System 2 reasoning.


Cognitive heuristics are thinking maneuvers which, at times, appear to be almost hardwired into our species. They influence both systems of thinking, the intuitive thinking of System 1 and the reflective reasoning of System 2. Five heuristics which often seem to operate more frequently in our System 1 reasoning are known as availability, affect, association, simulation, and similarity.

Availability, the coming to mind of a story or vivid memory of something that happened to you or to someone close to you, inclines a person to make inaccurate estimates of the likelihood of that thing’s happening again. People tell stories of things that happened to themselves or their friends all the time as a way of explaining their own decisions. The stories may not be scientifically representative, the events may be mistaken, misunderstood, or misinterpreted. But all that aside, the power of the story is to guide, often in a good way, the decision toward one choice rather than another.

The Affect heuristic operates when you have an immediate positive or a negative reaction to some idea, proposal, person, object, whatever. Sometimes called a “gut reaction,” this affective response sets up an initial orientation in us, positive or negative, toward the object. It takes a lot of System 2 reasoning to overcome a powerful affective response to an idea, but it can be done. And at times it should be, because there is no guarantee that your gut reaction is always right.

The Association heuristic is operating when one word or idea reminds us of something else. For example, some people associate the word “cancer” with “death.” Some associate “sunshine” with “happiness.” These kinds of associational reasoning responses can be helpful at times, as for example if associating cancer with death leads you not to smoke and to go in for regular checkups. At other times the same association may influence a person to make an unwise decision, as for example if associating “cancer” with “death” were to lead you to be so fearful and pessimistic that you do not seek diagnosis and treatment of a worrisome cancer symptom until it was really too late to do anything.

The Simulation heuristic works when you are imagining how various scenarios will unfold. People often imagine how a conversation will go, or how they will be treated by someone else when they meet the person, or what their friends or boss or lover will say and do when they must address some difficult issue. These simulations, like movies in our heads, help us prepare and do a better job when the difficult moment arrives. But they can also lead us to have mistaken expectations. People may not respond as we imagined, things may go much differently. Our preparations may fail us because the ease of our simulation misled us into thinking that things would have to go as we had imagined them. And they did not.

The Similarity heuristic operates when we notice some way in which we are like someone else and infer that what happened to that person is therefore more likely to happen to us. The similarity heuristic functions much like an analogical argument or metaphorical model. The similarity we focus on might be fundamental and relevant, which would make the inference more warranted. For example, the boss fired your coworker for missing sales targets, and you draw the reasonable conclusion that if you miss your sales target, you will be fired too. Or the similarity that comes to mind might be superficial or not connected with the outcome, which would make the inference unwarranted. For example, you see a TV commercial showing trim-figured young people enjoying fattening fast foods and infer that because you are young too you can indulge your cravings for fast foods without gaining a lot of excess unsightly poundage.

Heuristics and biases that often appear to be more closely associated with System 2 thinking include satisficing, risk/loss aversion, anchoring with adjustment, and the illusion of control.

CRITICAL THINKING SKILLS MAP ON TO LEADERSHIP DECISION MAKING

Successful professionals with leadership responsibilities, like those in business or the military, apply all their critical thinking skills to solve problems and to make sound decisions. At the risk of oversimplifying all the ways that our critical thinking intersects with problem solving and leadership decision making, here are some of the more obvious connecting points:

  • Analyze the strategic environment, identify its elements and their relationships
  • Interpret events and other elements in the strategic environment for signs of risk, opportunity, weakness, advantage
  • Infer, given what is known with precision and accuracy within the strategic environment, the logical and most predictable consequences of various courses of action
  • Infer, given the range of uncertainty and risk in the strategic environment, the full range of the possible and probable consequences of each possible course of action
  • Evaluate anticipated results for positive and negative impacts
  • Evaluate risks, opportunities, options, consequences
  • Explain the rationale (evidence, methodology, criteria, theoretical assumptions, and context) for deciding on the integrated strategic objectives and for the planning and action parameters that compose the strategy
  • Double Check Everything: At every step review one’s own thinking and make necessary corrections.

© 2013 Measured Reasons LLC, Hermosa Beach, CA. From Jan 2013 briefing “Critical and Creative Thinking” for Joint Special Operations Forces Senior Enlisted Academy, MacDill AFB.

Satisficing occurs as we consider our alternatives. When we come to one which is good enough to fulfill our objectives, we often regard ourselves as having completed our deliberations. We satisficed. And why not? The choice is, after all, good enough. It may not be perfect, it may not be optimal, it may not even be the best among the options available. But it is good enough. Time to decide and move forward.

The running mate of satisficing is temporizing. Temporizing is deciding that the option which we have come to is “good enough for now.” We often move through life satisficing and temporizing. At times we look back on our situations and wonder why it is that we have settled for far less than we might have. If we had only studied harder, worked out a little more, taken better care of ourselves and our relationships, perhaps we would not be living as we are now. But, at the time each of the decisions along the way was “good enough for the time being.”

We are by nature a species that is averse to risk and loss. We often make decisions based on what we are afraid of losing rather than on what we might gain. This works out to be a rather serviceable approach in many circumstances. People do not want to lose control, they do not want to lose their freedom, they do not want to lose their lives, their families, their jobs, their possessions. High stakes gambling is best left to those who can afford to lose the money. Las Vegas did not build all those multi-million-dollar casino hotels because vacationers are winning all the time! And so, in real life, we take precautions. We avoid unnecessary risks. The odds may not be stacked against us, but the consequences of losing at times are so great that we would prefer to forego the possibilities of gain in order not to lose what we have. And yet, on occasion this can be a most unfortunate decision too. History has shown time and time again that businesses which avoid risks often are unable to compete successfully with those willing to move more boldly into new markets or into new product lines.

Any heuristic is only a maneuver, perhaps a shortcut or impulse to think or act in one way rather than another, but certainly not a failsafe rule. It may work out well much of the time to rely on the heuristic, but it will not work out for the best all the time.

For example, people with something to lose tend toward conservative choices politically as well as economically. Nothing wrong with that necessarily. Just an observation about the influence of the Loss Aversion heuristic on actual decision making. We are more apt to endure the status quo, even as it slowly deteriorates, than we are to call for “radical” change. Regrettably, however, when the call for change comes, it often requires a far greater upheaval to make the necessary transformations, or, on occasion, the situation has deteriorated beyond the point of no return. In those situations, we find ourselves wondering why we waited so long before doing something.

The heuristic known as Anchoring with Adjustment is operative when we find ourselves making evaluative judgments. The natural thing for us to do is to locate or anchor our evaluation at some point along whatever scale we are using. For example, a professor says that the student’s paper is a C+. Then, as other information comes our way, we may adjust that judgment. The professor, for example, may decide that the paper is as good as some others that were given a B-, and so adjust the grade upward. The interesting thing about this heuristic is that we do not normally start over with a fresh evaluation. We have dropped anchor, and we may drag it upward or downward a bit, but we do not pull it off the bottom of the sea to relocate our evaluation. First impressions, as the saying goes, cannot be undone. The good thing about this heuristic is that it permits us to move on. We have done the evaluation; there are other papers to grade, other projects to do, other things in life that need attention. We could not endure long if we had to constantly reevaluate everything anew. The unfortunate thing about this heuristic is that we sometimes drop anchor in the wrong place; we have a tough time giving people a second chance at making a good first impression.

The heuristic known as Illusion of Control is evident in many situations. Many of us overestimate our abilities to control what will happen. We make plans for how we are going to do this or that, say this or that, manipulate the situation this way or that way, share or not share this information or that possibility, all the time thinking that somehow our petty plans will enable us to control what happens. We function as if others are dancing on the ends of the strings that we are pulling, when the influences our words or actions have on future events may be quite negligible. At times we do have some measure of control. For example, we may exercise, not smoke, and watch our diet to be more fit and healthy. We are careful not to drink if we are planning to drive so that we reduce the risks of being involved in a traffic accident. But at times we simply are mistaken about our ability to exercise full control over a situation. Sadly, we might become ill even if we do work hard to take care of ourselves. Or we may be involved in an accident even if we are sober. Our business may fail even if we work hard to make it a success. We may not do as well on an exam as we might hope even if we study hard.

Related to the Illusion of Control heuristic is the tendency to misconstrue our personal influence or responsibility for past events. This is called Hindsight Bias. We may overestimate the influence our actions have had on events when things go right, or we may underestimate our responsibility or culpability when things go wrong. We have all heard people bragging about how they did this and how they did that and, as a result, such and such wonderful things happened. We made these great plans and look at how well our business did financially. Which may be true when the economy is strong but not when the economy is failing. It is not clear how much of that success came from the planning and how much came from the general business environment. Or, we have all been in the room when it was time to own up for something that went wrong and thought to ourselves, hey, I may have had some part in this, but it was not entirely my fault. “It was not my fault the children were late for school! Hey, I was dressed and ready to go at the regular time.” As if seeing that the family was running late, I had no responsibility to take some initiative and help.

“Insanity is doing the same thing over and over again while expecting a different outcome.”

Attributed to Albert Einstein

Research on our shared heuristic patterns of decision-making does not aim to evaluate these patterns as necessarily good or bad patterns of thinking. I fear that my wording of them may not have been as entirely neutral and descriptive as perhaps it should have been. In truth, reliance on heuristics can be an efficient way of deciding things, given how complicated our lives are. We cannot devote maximal cognitive resources to every single decision we make.

Those of us who study these heuristic thinking phenomena are simply trying to document how we humans do think. There are many useful purposes for doing this. For example, if we find that people repeatedly make a given kind of mistake when thinking about a commonly experienced problem, then we might find ways to intervene and to help ourselves avoid repeating that error.

This research on the actual patterns of thinking used by individuals and by groups might prove particularly valuable to those who seek interventions which could improve how we make our own health care decisions, how we make business decisions, how we lead teams of people to work more effectively in collaborative settings, and the like.

Popular culture offers one other myth about decision-making which is worth questioning. And that is the belief that when we make reflective decisions, we carefully weigh each of our options, giving due consideration to all of them in turn, before deciding which we will adopt. Although perhaps it should be, research on human decision-making shows that this simply is not what happens. [4] To explain how people decide on an option with such conviction that they stick to their decision over time, and with such confidence that they act on that decision, researchers have put forth the concept of building a Dominance Structure.

In a nutshell this theory suggests that when we settle on a particular option which is good enough, we tend to elevate its merits and diminish its flaws relative to the other options. We raise it up in our minds until it becomes for us the dominant option. In this way, as our decision takes shape, we gain confidence in our choice and we feel justified in dismissing the other options, even though the objective distance between any of them and our dominant option may not be very great at all. But we become invested in our dominant option to the extent that we can put the other possibilities aside and act based on our choice. In fact, it comes to dominate the other options in our minds so much that we can sustain our decision to act over time, rather than going back to re-evaluate or reconsider constantly. Understanding the natural phenomenon of dominance structuring can help us appreciate why it can be so difficult for us to get others to change their minds, or why it seems that our reasons for our decisions are so much better than any of the objections which others might make to our decisions. This is not to say that we are right or wrong. Rather, this is only to observe that human beings are capable of unconsciously building up defenses around their choices which can result in the warranted or unwarranted confidence to act based on those choices.

Realizing the power of dominance structuring, one can only be more committed to the importance of education and critical thinking. We should do all that we can to inform ourselves fully and to reflect carefully on our choices before we make them, because we are, after all, human and we are as likely as the next person to believe that we are right and they are wrong once the dominance structure begins to be erected. Breaking through that to fix bad decisions, which is possible, can be much harder than getting things right in the first place.

There are more heuristics than only those mentioned above. There is more to learn about dominance structuring as it occurs in groups as well as in individuals, and how to mitigate the problems which may arise by prematurely settling on a “good enough” option, or about how to craft educational programs or interventions which help people be more effective in their System 1 and System 2 thinking. There is much to learn about human thinking and how to optimize it in individuals of different ages; how to optimize the thinking of groups of peers and groups where organizational hierarchies influence interpersonal dynamics. And, happily, there is a lot we know today about human thinking and decision-making that we did not know a few years ago.

Why critical thinking?

Let us start with you first. Why would it be of value to you to have the cognitive skills of interpretation, analysis, evaluation, inference, explanation, and self-regulation?

Apart from, or maybe in light of, what we said at the beginning of this essay about the utility of positive critical thinking and about the problems that failures of critical thinking contribute to, why would it be of value to you to learn to approach life and to approach specific concerns with the critical thinking dispositions listed above? Would you have greater success in your work? Would you get better grades?

The answer to the grades question, scientifically speaking, is very possibly Yes! A study of over 1100 college students shows that scores on a college level critical thinking skills test significantly correlated with college GPA. [5] It has also been shown that critical thinking skills can be learned, which suggests that as one learns them one’s GPA might well improve. In further support of this hypothesis is the significant correlation between critical thinking and reading comprehension. Improvements in one are paralleled by improvements in the other. Now if you can read better and think better, you probably will do better in your classes, learn more, and get higher grades. It is, to say the least, very plausible.
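
As a rough illustration of what “significantly correlated” means here, the short sketch below computes a Pearson correlation coefficient. The paired numbers are invented for illustration only; the cited study used its own instrument and a sample of more than 1,100 students.

```python
from statistics import correlation  # available in Python 3.10+

# Hypothetical (invented) paired observations: critical thinking test scores and GPAs.
ct_scores = [16, 18, 19, 21, 23, 24, 26, 28, 30]
gpas = [2.3, 2.6, 2.5, 2.9, 3.0, 3.2, 3.1, 3.4, 3.7]

r = correlation(ct_scores, gpas)  # Pearson correlation coefficient
print(f"Pearson r between critical thinking scores and GPA: {r:.2f}")
```

A positive r close to 1.0 would mean that students who score higher on the thinking skills test also tend, on average, to earn higher grades; it would not, by itself, show that one causes the other.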

Learning, Critical Thinking, and Our Nation’s Future

“The future now belongs to societies that organize themselves for learning… nations that want high incomes and full employment must develop policies that emphasize the acquisition of knowledge and skills by everyone, not just a select few.”

Ray Marshall & Marc Tucker, Thinking For A Living: Education And The Wealth of Nations , Basic Books. New York. 1992.

But what a limited benefit — better grades. Who really cares in the long run? Two years after college, five years out, what does GPA really mean? These days a college level technical and professional program has a half-life of about four years, which means that the technical content is expanding so fast and changing so much that in about four years after graduation your professional training will be in serious need of renewal. So, if the only thing a college is good for is to get the entry level training and the credential needed for a particular job, then college would be of time-limited value.
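
To see what a four-year half-life implies, the small sketch below treats the claim as simple exponential decay. The four-year figure comes from the text; reading “half-life” as exponential decay is an assumption made only for the sake of the arithmetic.

```python
HALF_LIFE_YEARS = 4.0  # the "about four years" figure from the text

def fraction_still_current(years_since_graduation: float) -> float:
    """Fraction of one's technical training still up to date, on a half-life reading."""
    return 0.5 ** (years_since_graduation / HALF_LIFE_YEARS)

for years in (0, 4, 8, 12):
    print(f"{years:>2} years out: about {fraction_still_current(years):.0%} still current")
# Roughly: 100% at graduation, 50% after four years, 25% after eight, 13% after twelve.
```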


The APA Delphi Report, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, 1990. ERIC Doc. No. ED 315 423.

Is that the whole story? A job is a good thing, but is that what a college education is all about? Just getting started in a job? Maybe some cannot see its further value, but many do. A main purpose, if not the main purpose, of the collegiate experience, at either the two-year or the four-year level, is to achieve what people have called a “liberal education.” Not liberal in the sense of a smattering of this and that for no particular purpose except to fulfill the unit requirement. But liberal in the sense of “liberating.” And who is being liberated? You! Liberated from a kind of slavery. But from whom?

From professors. From dependence on professors so that they no longer stand as infallible authorities delivering opinions beyond our capacity to challenge, question, and dissent. In fact, this is exactly what the professors want. They want their students to excel on their own, to go beyond what is currently known, to make their own contributions to knowledge and to society. [Being a professor is a curious job: the more effective you are as a teacher, the less your students require your aid in learning.]

Liberal education is about learning to learn, which means learning to think for yourself on your own and in collaboration with others.

Liberal education leads us away from naive acceptance of authority, above self-defeating relativism, and beyond ambiguous contextualism. It culminates in principled reflective judgment. Learning critical thinking, cultivating the critical spirit, is not just a means to this end, it is part of the goal itself. People who are weak critical thinkers, who lack the dispositions and skills described, cannot be said to be liberally educated, regardless of the academic degrees they may hold.

Yes, there is much more to a liberal education than critical thinking. There is an understanding of the methods, principles, theories, and ways of achieving knowledge which are proper to the different intellectual realms. There is an encounter with the cultural, artistic, and spiritual dimensions of life. There is the evolution of one’s decision making to the level of principled integrity and concern for the common good and social justice. There is the realization of the ways all our lives are shaped by global as well as local political, social, psychological, economic, environmental, and physical forces. There is the growth that comes from the interaction with cultures, languages, ethnic groups, religions, nationalities, and social classes other than one’s own. There is the refinement of one’s humane sensibilities through reflection on the recurring questions of human existence, meaning, love, life, and death. There is the sensitivity, appreciation, and critical appraisal of all that is good and all that is bad in the human condition. As the mind awakens and matures, and the proper nurturing and educational nourishment are provided, these other central parts of a liberal education develop as well. Critical thinking plays an essential role in achieving these purposes.

Anything else? What about going beyond the individual to the community?

The experts say critical thinking is fundamental to, if not essential for, “a rational and democratic society.” What might the experts mean by this?

Well, how wise would democracy be if people abandoned critical thinking? Imagine an electorate that did not care for the facts. An electorate that did not wish to consider the pros and cons of the issues. Or, worse, had neither the education nor the brain power to do so. Imagine your life and the lives of your friends and family placed in the hands of juries and judges who let their political allegiance, biases and stereotypes govern their decisions, who do not attend to the evidence, who are not interested in reasoned inquiry, who do not know how to draw an inference or evaluate one. Without critical thinking, people could easily be exploited not only politically but economically.

The impact of abandoning critical thinking would not be confined to the micro-economics of the household checking account. Suppose the people involved in international commerce were lacking in critical thinking skills. They would be unable to analyze and interpret the market trends, evaluate the implications of interest fluctuations, or explain the potential impact of those factors which influence large scale production and distribution of goods and materials. Suppose these people were unable to draw the proper inferences from the economic facts, or unable to evaluate the claims made by the unscrupulous and misinformed. In such a situation, serious economic mistakes would be made. Whole sectors of the economy would become unpredictable, and large-scale economic disaster would become extremely likely. So, given a society that does not value and cultivate critical thinking, we might reasonably expect that in time the judicial system and the economic system would collapse. And, in such a society, one that does not liberate its citizens by teaching them to think critically for themselves, it would be madness to advocate democratic forms of government.


Is it any wonder that business and civic leaders are maybe even more interested in critical thinking than educators? Critical thinking employed by an informed citizenry is a necessary condition for the success of democratic institutions and for competitive free-market economic enterprise. These values are so important that it is in the national interest that we should try to educate all citizens so that they can learn to think critically. Not just for their personal good, but for the good of the rest of us too.


Look at what has happened around the world in places devastated by economic embargoes, one-sided warfare, or the HIV/AIDS epidemic. Or, consider the problem of global climate change, and how important it is for all of us to cooperate with efforts to curtail our use of fossil fuels to reduce emissions of harmful greenhouse gases.

Consider the “cultural revolutions” undertaken by totalitarian rulers. Notice how in virtually every case absolutist and dictatorial despots seek ever more severe limitations on free expression. They label “liberal” intellectuals “dangers to society” and expel “radical” professors from teaching posts because they might “corrupt the youth.” Some use the power of their governmental or religious authority to crush not only their opposition but the moderates as well — all in the name of maintaining the purity of their movement. They intimidate journalists and those media outlets which dare to comment “negatively” on their political and cultural goals or their heavy-handed methods.

The historical evidence is there for us to see what happens when schools are closed or converted from places of education to places for indoctrination. We know what happens when children are no longer being taught truth-seeking, the skills of good reasoning, or the lessons of human history and basic science: Cultures disintegrate; communities collapse; the machinery of civilization fails; massive numbers of people die; and sooner or later social and political chaos ensues.

Or imagine a media, religious, or political hegemony which cultivated, instead of critical thinking, all the opposite dispositions. Or consider if that hegemony reinforced uncritical, impulsive decision making and the “ready-shoot-aim” approach to executive action. Imagine governmental structures, administrators, and community leaders who, instead of encouraging critical thinking, were content to make knowingly irrational, illogical, prejudicial, unreflective, short-sighted, and unreasonable decisions.

How long might it take for the people in this society which does not value critical thinking to be at serious risk of foolishly harming themselves and each other?

The news too often reports about hate groups, wanton shootings, terrorists, and violently extreme political, ideological, or religious zealots. Education which includes a good measure of critical thinking skills and dispositions, like truth-seeking and open-mindedness, is a problem for terrorists and extremists of every stripe because terrorists and extremists want to control what people think. They are ideologists of the worst kind. Their methods include indoctrination, intimidation, and the strictest authoritarian orthodoxy. In the “black-and-white” world of “us vs. them” a good education would mean that the people might begin to think for themselves. And that is something these extremists do not want.

History shows that assaults on learning, whether by book burning, exile of intellectuals, or regulations aimed at suppressing research and frustrating the fair-minded, evidence-based, and unfettered pursuit of knowledge, can happen wherever and whenever people are not vigilant defenders of open, objective, and independent inquiry.

Does this mean that society should place an extremely high value on critical thinking?

Absolutely!

Does this mean society has the right to force someone to learn to think critically?

Maybe. But, really, should we have to?

I D E A S

A 5-Step Critical Thinking General Problem Solving Process

I = IDENTIFY the Problem and Set Priorities (Step 1)
D = DETERMINE Relevant Information and Deepen Understanding (Step 2)
E = ENUMERATE Options and Anticipate Consequences (Step 3)
A = ASSESS the Situation and Make a Preliminary Decision (Step 4)
S = SCRUTINIZE the Process and Self-Correct as Needed (Step 5)
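
One way to picture the five steps is as a loop in which Step 5 can send you back to revise earlier work rather than ending the process. The sketch below is only an illustration: the toy problem and the step functions are invented placeholders, and only the five step labels come from the list above.

```python
def identify(problem: str) -> str:
    """Step 1: IDENTIFY the problem and set priorities."""
    return f"Priority question: {problem}"

def determine(priority: str) -> list:
    """Step 2: DETERMINE relevant information and deepen understanding."""
    return [f"fact bearing on '{priority}'"]        # stand-in for actual inquiry

def enumerate_options(facts: list) -> list:
    """Step 3: ENUMERATE options and anticipate consequences."""
    return ["option A", "option B"]                 # stand-in for brainstorming

def assess(options: list) -> str:
    """Step 4: ASSESS the situation and make a preliminary decision."""
    return options[0]                               # stand-in for weighing criteria

def scrutinize(decision: str) -> bool:
    """Step 5: SCRUTINIZE the process; return True if the decision survives review."""
    return True                                     # stand-in for self-monitoring

def ideas_cycle(problem: str, max_passes: int = 3) -> str:
    """Run the five IDEAS steps, looping back to self-correct if scrutiny finds a flaw."""
    decision = ""
    for _ in range(max_passes):
        priority = identify(problem)
        facts = determine(priority)
        options = enumerate_options(facts)
        decision = assess(options)
        if scrutinize(decision):
            break                                   # the preliminary decision stands
    return decision

print(ideas_cycle("Should our team adopt a four-day work week?"))
```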

EXPERT CONSENSUS STATEMENT REGARDING CRITICAL THINKING AND THE IDEAL CRITICAL THINKER

NEW! For Students and Professionals

Online mini-courses strengthen critical thinking in as little as 1 hour.

  • Correctly analyze problems and develop effective problem-solving strategies
  • Interpret information correctly to make informed decisions
  • Draw sound and reasonable inferences
  • Evaluate arguments and credibility of sources for robust decision-making
  • Persuasively explain the reasoning for your point of view
  • Enhance your analytical and interpretive communication skills
  • Make more thoughtful decisions in your personal and professional life
  • Protect yourself from being misled by rhetorical deceptions and fallacies
  • Strengthen your ability to reason well with quantitative information
  • Learn how to evaluate comparative, ideological, and scientific reasoning
  • Discover the benefits and the risks of both reactive and reflective thinking

www.insightbasecamp.com

Critical Thinking Skill Builders • Mindset Boosters • Deep Dives into How Humans Reason • Self Quizzes

With Insight Basecamp’s expert-led online courses, students and professionals can strengthen their critical thinking skills and mindset. Choose from a wide range of courses to help you come to well-reasoned judgments, analyze problems correctly, draw sound inferences, and make thoughtful decisions.

Our lifelong learning opportunities are designed for students and professionals from various industries, providing relevant and applicable knowledge. Whether you are advancing your career or building your learning and decision-making skills, our courses are designed to meet your needs.

Tactics for Training, Triggering, and Teaching Critical Thinking


Images in this white paper are copyrighted and are drawn from keynote presentations and professional development workshops.

Contact the author at Measured Reasons LLC for more information.


“Critical Thinking for Life: Valuing, Measuring, and Training Critical Thinking in All its Forms,” describes the work of Drs. Peter A. and Noreen C. Facione. The essay can be found in the Spring 2013 issue of Inquiry (Vol. XXVIII, No. 1).

They and their co-investigators have been engaged in research and teaching about reasoning, decision-making, and effective individual and group thinking processes since 1967. Over the years they developed instruments to measure the core skills and habits of mind of effective thinking; these instruments are now in use in many different languages throughout the world. Since 1992 they have presented hundreds of workshops about effective teaching for thinking and about leadership, decision-making, leadership development, planning and budgeting, and learning outcomes assessment at national and international professional association meetings, business organizations, military bases, healthcare agencies, and colleges and universities throughout the nation.

READINGS and REFERENCES

American Philosophical Association, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. “The Delphi Report,” Committee on Pre-College Philosophy. (ERIC Doc. No. ED 315 423). 1990

Brookfield, Stephen D.: Developing Critical Thinkers: Challenging Adults to Explore Alternative Ways of Thinking and Acting. Jossey-Bass Publishers. San Francisco, CA. 1987.

Browne, M. Neil, and Keeley, Stuart M.: Asking the Right Questions . Prentice-Hall Publishers. Englewood Cliffs, NJ. 2003.

Costa, Arthur L., & Lowery, Lawrence F.: Techniques for Teaching Thinking. Critical Thinking Press and Software. Pacific Grove, CA. 1989.

Facione, Noreen C., and Facione, Peter A.: Critical Thinking and Clinical Judgment in the Health Sciences – An International Teaching Anthology. The California Academic Press. Millbrae, CA. 2008.

Facione, Noreen C. and Facione, Peter A.: Critical Thinking Assessment and Nursing Education Programs: An Aggregate Data Analysis . The California Academic Press. Millbrae, CA 1997.

Facione, Noreen. C., and Facione, Peter A., Analyzing Explanations for Seemingly Irrational Choices, International Journal of Applied Philosophy , Vol. 15 No. 2 (2001) 267-86.

Facione, Peter A., and Facione, Noreen C.: Thinking and Reasoning in Human Decision Making. The California Academic Press. Millbrae, CA. 2007.

Facione, Peter A., and Gittens, C. A.: Think Critically. Pearson Education. Englewood Cliffs, NJ. 2016.

Facione, P. A., and Facione, N. C.: Talking Critical Thinking. Change: The Magazine of Higher Learning, March-April 2007.

Facione, P.A., Facione N. C., and Giancarlo, C: The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skills, Journal of Informal Logic, Vol. 20 No. 1 (2000) 61-84.

Gilovich, Thomas; Griffin, Dale; and Kahneman, Daniel: Heuristics and Biases: The Psychology of Intuitive Judgment . Cambridge University Press. 2002.

Goldstein, William, and Hogarth, Robin M. (Eds.): Research on Judgment and Decision Making . Cambridge University Press. 1997.

Esterle, John, and Clurman, Dan: Conversations with Critical Thinkers . The Whitman Institute. San Francisco, CA. 1993.

Janis, I.L. and Mann, L: Decision-Making . The Free Press, New York. 1977.

Kahneman, Daniel; Slovic, Paul; and Tversky, Amos: Judgment Under Uncertainty: Heuristics and Biases . Cambridge University Press. 1982.

Kahneman, Daniel; Knetsch, J. L.; and Thaler, R. H.: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives. 1991; 5: 193-206.

King, Patricia M., & Kitchener, Karen Strohm: Developing Reflective Judgment. Jossey-Bass Publishers. San Francisco, CA. 1994.

Kurfiss, Joanne G., Critical Thinking: Theory, Research, Practice and Possibilities, ASHE-ERIC Higher Education Report # 2, Washington DC, 1988.

Marshall, Ray, and Tucker, Marc, Thinking for a Living: Education and the Wealth of Nations , Basic Books. New York, NY. 1992.

Resnick, L. W., Education and Learning to Think, National Academy Press, 1987.

Rubenfeld, M. Gaie, & Scheffer, Barbara K., Critical Thinking in Nursing: An Interactive Approach . J. B. Lippincott Company. Philadelphia PA, 1995.

Siegel, Harvey: Educating Reason: Rationality, Critical Thinking and Education. Routledge Publishing. New York. 1989.

Sternberg, Robert J.: Critical Thinking: Its Nature, Measurement, and Improvement. National Institute of Education, Washington DC, 1986.

Toulmin, Stephen: The Uses of Argument . Cambridge University Press, 1969.

Wade, Carole, and Tavris, Carol: Critical & Creative Thinking: The Case of Love and War . Harper Collins College Publisher. New York. NY 1993.

GOVERNMENT REPORTS

U.S. Department of Education, Office of Educational Research and Improvement, National Center for Educational Statistics (NCES) Documents National Assessment of College Student Learning: Getting Started, A Summary of Beginning Activities. NCES 93-116.

National Assessment of College Student Learning: Identification of the Skills to Be Taught, Learned, and Assessed, A Report on the Proceedings of the Second Design Workshop, November 1992. NCES 94-286.

National Assessment of College Student Learning: Identifying College Graduates’ Essential Skills in Writing, Speech and Listening, and Critical Thinking. NCES 95-001.

  • The findings of expert consensus cited or reported in this essay are published in Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Peter A. Facione, principal investigator, The California Academic Press, Millbrae, CA, 1990. (ERIC ED 315 423). In 1993/94 the Center for the Study of Higher Education at The Pennsylvania State University studied 200 policymakers, employers, and faculty members from two-year and four-year colleges to determine what this group took to be the core critical thinking skills and habits of mind. The Pennsylvania State University study, under the direction of Dr. Elizabeth Jones, was funded by the US Department of Education Office of Educational Research and Improvement. The Penn State study findings, published in 1994, confirmed the expert consensus described in this paper. ↑
  • The California Critical Thinking Skills Test, the Test of Everyday Reasoning, the Health Science Reasoning Test, the Military and Defense Reasoning Profile, The Business Critical Thinking Skills Test, the Educate Insight Series for K-12, and the INSIGHT Series for employers and business, health, legal, first responder, educator, science and engineering, and defense professionals and executives, along with other testing instruments authored by Dr. Facione and his research team for people in K-12, college, and graduate / professional work, target the core critical thinking skills identified here. These instruments are published in English and several authorized translations exclusively by Insight Assessment. ↑
  • Chapters 10 and 11 of Think Critically, Pearson Education, locate critical thinking within this integrative model of thinking. The cognitive heuristics, which will be described next, and the human capacity to build sustained confidence in our decisions (right or wrong), known as “dominance structuring,” are presented there too. There are lots of useful exercises and examples in that book. You may also wish to consult the references listed at the end of this essay. The material presented in this section is derived from these books and related publications by many of these same authors and others working to scientifically explain how humans make decisions. ↑
  • Henry Montgomery, “From cognition to action: The search for dominance in decision making.” In Process and Structure in Human Decision-Making , Montgomery H, Svenson O (Eds). John Wiley & Sons: Chichester, UK, 1989. For a more accessible description along with reflective exercises on how to avoid becoming “locked in” to a poor decision prematurely, see chapter 11 of Think Critically . ↑
  • Findings regarding the effectiveness of critical thinking instruction, and correlations with GPA and reading ability, are reported in “Technical Report #1, Experimental Validation and Content Validity” (ERIC ED 327 549), “Technical Report #2, Factors Predictive of CT Skills” (ERIC ED 327 550), and “Gender, Ethnicity, Major, CT Self-Esteem, and the California Critical Thinking Skills Test” (ERIC ED 326 584). These findings remain consistent in research using the tools in the California Critical Thinking Skills Test family of instruments published by Insight Assessment. ↑


How to embed critical thinking from course design to assessment

Critical thinking is an essential, human skill. This practical advice aims to help university educators nurture and enhance students’ ability to analyse and evaluate information at all stages of teaching


M. C. Zhang


Created in partnership with Macau University of Science and Technology


Critical thinking transcends academic boundaries. In an era of global challenges and information overload, the ability to sift through data, question assumptions, generate innovative solutions and navigate the complexities of the 21st century with agility and discernment is paramount. 

Educators must embrace and model critical thinking, crafting learning experiences that challenge students to question, analyse and reflect. Students, for their part, should embrace the challenge of critical engagement, recognising its value not just for academic success but as a lifelong skill. The wider community – including employers, policymakers and families – must also recognise and support the cultivation of critical thinking as essential for the well-being of society.


Today’s students are preparing for the workforce as well as for their roles as citizens in a complex, interconnected world. Critical thinking serves as the cornerstone of academic enquiry towards building a more just, creative and sustainable world.

Cultivating a culture of critical thinking

Creating a culture that values and nurtures critical thinking requires diligence, creativity and a willingness to adapt. This culture extends beyond the classroom, permeating every aspect of the educational experience, from the design of curricula to the ways in which institutions engage with their communities. It is about creating spaces where questions are encouraged, where failure is seen as a stepping stone to understanding, and where diversity of thought is celebrated.

Each of the strategies below can be developed further, with attention to implementation, common challenges, case studies and assessment.

Encourage inquiry-based learning

Inquiry-based learning (IBL) is an active, student-centred educational approach where students develop questions, investigate deeply and construct their own understanding. As an assistant professor at Macau University of Science and Technology (MUST), I use these methods and tools to ensure a hands-on, minds-on approach to learning that is crucial for developing critical thinking and problem-solving skills.

  • Use a theoretical framework: delve into the pedagogical theories that underpin IBL, such as the work of John Dewey, and the key studies built on them.
  • Implement IBL across disciplines: for example, in a literature class, students might be asked to develop their own interpretations of a text, then research literary criticism to compare and refine their perspectives. In a science setting, IBL can be applied through project-based lab work where students formulate hypotheses about ecological phenomena and design experiments to test their theories.
  • Tools and resources: to successfully implement IBL, educators might use digital tools such as Socrative to collect and analyse student questions in real time. Platforms like Padlet provide collaborative spaces for students to brainstorm and share ideas.

Use problem-based learning

Problem-based learning (PBL) is also a student-centred approach that puts students in groups to solve an open-ended problem, and in turn motivates their learning, according to Cornell University’s Center for Teaching Innovation. 

  • Design PBL scenarios: to boost students’ investment in their learning, educators can choose a problem that aligns with students’ interests and career aspirations.
  • Assess PBL outcomes: discuss methods for assessing the outcomes of PBL, including rubrics and self-assessment tools. Setting incremental deadlines and checkpoints helps keep students focused and on track, and helps the teacher manage the scope of the project.

Incorporate case studies

Working on case studies in my management courses at MUST enables students to apply theoretical knowledge in their analysis of real-world business scenarios.

  • Source and create case studies: for instance, dissecting a company that is facing a market downturn provokes critical questions such as how to pivot strategies and manage resources effectively, compelling students to think like decision-makers. Consider using contemporary cases from business journals to ensure students deal with up-to-date content.
  • Explore discussion techniques: the think-pair-share technique prompts students to first consider the case individually, then discuss their thoughts in pairs, and finally share their insights with the class. This ensures that everyone is actively engaged and benefits from collective intelligence.
  • Use cross-disciplinary applications: highlight how case studies can be used across different subjects.

Promote reflective practice

Gibbs’ reflective cycle and Schön’s reflection-in-action are just two models that teachers can use to encourage their students’ reflective practice. Others include digital journals and blogs where students document and share their reflective processes. These platforms serve as a dynamic archive of their growth and an avenue to develop a professional online presence. When giving feedback on reflections, specific and actionable comments can guide improvement. For example, rather than simply praising a good entry, I would suggest ways to connect theoretical concepts with personal experiences more deeply.

Develop assessment methods that encourage critical thinking

Assessment forms such as peer assessments, portfolios and open-book exams have a role in supporting critical thinking. Portfolios allow students to continuously collect and curate their work, which prompts ongoing self-reflection and improvement in understanding management concepts.

Be sure to set clear criteria for evaluation when using peer assessment to avoid subjective judgements and encourage constructive feedback. Discuss the importance of feedback loops in the assessment process to enhance critical thinking.
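One way to make “clear criteria” concrete is to publish the rubric and show students exactly how their peers’ ratings will be combined. The sketch below, in Python, illustrates one simple weighted-average approach; the criterion names and weights are hypothetical placeholders, not a prescribed rubric.

```python
# Minimal sketch of rubric-based peer assessment.
# The criteria and weights are hypothetical placeholders; use whatever
# rubric you actually share with students.

RUBRIC = {            # criterion -> weight (weights sum to 1.0)
    "analysis": 0.4,
    "evidence": 0.3,
    "communication": 0.3,
}

def weighted_score(ratings: dict[str, float]) -> float:
    """Combine one peer's 1-5 ratings into a single weighted score."""
    return sum(RUBRIC[criterion] * rating for criterion, rating in ratings.items())

def aggregate(peer_ratings: list[dict[str, float]]) -> float:
    """Average the weighted scores given by several peers."""
    scores = [weighted_score(r) for r in peer_ratings]
    return sum(scores) / len(scores)

# Example: two peers rate the same piece of work against the shared rubric.
peers = [
    {"analysis": 4, "evidence": 3, "communication": 5},
    {"analysis": 5, "evidence": 4, "communication": 4},
]
print(round(aggregate(peers), 2))  # 4.2
```

Making the weighting explicit is itself a prompt for critical thinking: students can be asked whether the weights reflect what the assignment actually values.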

Be thoughtful about how you integrate technology

  • Review specific educational technologies designed to enhance critical thinking. Select applications that stimulate analysis and strategic decision-making. For instance, using interactive simulations such as The Business Strategy Game enables students to apply management theories in a risk-free environment.
  • Discuss the role of digital literacy in critical thinking and consider how to foster it among students. Teach students how to discern credible sources online, which is a vital skill for any manager in the age of information overload.
  • Balance technology and tradition. Case studies in management, for example, can be enriched with data analytics tools for more profound insights, yet the mistake many make is allowing these tools to overshadow the fundamental skills of critical thinking and qualitative analysis. A small sketch of this balance follows this list.
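As a rough illustration of that balance, the snippet below (Python, with entirely fictional figures rather than data from any real case) computes a couple of headline metrics that can open, but should not close, the qualitative discussion.

```python
# Hypothetical case figures, not data from any real company.
# year -> (revenue, operating_cost), in millions
case_financials = {
    2022: (120.0, 90.0),
    2023: (138.0, 112.0),
}

# Quick quantitative framing: operating margin per year...
for year, (revenue, cost) in case_financials.items():
    margin = (revenue - cost) / revenue
    print(f"{year}: operating margin {margin:.1%}")   # 25.0%, then 18.8%

# ...and revenue growth between the two years.
rev_2022 = case_financials[2022][0]
rev_2023 = case_financials[2023][0]
growth = (rev_2023 - rev_2022) / rev_2022
print(f"Revenue growth 2022 to 2023: {growth:.1%}")   # 15.0%

# The critical-thinking work starts here: why did margin fall while revenue
# grew, and what is missing from these numbers?
```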

Model critical thinking

  • Educators can think about how they can improve their own critical thinking skills, whether through peer observation, reading the latest research or as part of professional development. Teachers sharing their thought processes with students can also be seen as a teaching tool.
  • Highlight the ethical dimension of critical thinking in discussion, analysis and decision-making.
  • Ask questions that stimulate students’ deeper thinking and exploration. Ensure that the classroom is a safe space where students feel comfortable asking challenging questions and can follow their curiosity and wonder, which underpins the questioning mindset. 

In sum, critical thinking touches on the essence of what it means to be an informed, engaged and responsible member of society. By committing to the strategies outlined and embracing the collective responsibility for nurturing critical thinkers, educational institutions can play a pivotal role in helping students face challenges with resilience, creativity and commitment to ethical principles. 

M. C. Zhang is an assistant professor at the School of Liberal Arts at Macau University of Science and Technology.




Division of Student Life

No such thing as a silly question: answers to questions you might be afraid to ask.

Whether you just arrived at Iowa or have found a familiar rhythm on campus, you probably have questions. And like many students, you might be too shy to ask or you might not know where to go for an answer.

We’re here to help.

Here is a list of questions — and answers — on topics you might want or need to know.

1. What should I do if I feel anxious or if I’m struggling mentally? How do I find help?

First, know you are not alone, and there’s nothing embarrassing or shameful about reaching out for help. We know it can be intimidating, but finding the right resources is the first step to feeling better.

The UI has various mental health services that you can access no matter what you are going through, including anxiety, depression, substance use, eating disorders, trauma, grief, identity development, and relationship concerns.

If formal counseling isn’t for you, several student support groups offer space where you and others with shared experiences can talk. You can drop in to any of these groups at any time.

The UI also offers several free workshops that focus on managing stress and anxiety. Check out the workshop options at Student Wellness and University Counseling Services, which cover topics such as mindfulness, sleep, motivations and procrastination, anxiety, and distress coping skills. These workshops help you build effective skills so you can better manage stress and anxiety.

If you think counseling services would be helpful:

  • Contact University Counseling Service at 319-335-7294 or email [email protected] . UCS has locations on the west side of campus (3223 Westlawn) and the east side (Suite 1950 in University Capitol Centre). Individual and group therapy are offered.  
  • If you are unsure what services may be best for you, UCS staff can guide you in the right direction through an initial consultation . Please know you will be asked to fill out paperwork if you visit UCS for the first time or if it has been more than three months since you were last seen. More information on what paperwork may be needed will be provided during appointment scheduling.  
  • In addition, you can receive 24-hour support through the UI Support and Crisis Line by calling or texting 844-461-5420 or chatting on this page online . You can use the line anonymously if you wish to do so.  
  • You can also schedule a same-day, one-time appointment with a counselor if you would like to talk about an immediate issue or develop a plan to work on your well-being without ongoing therapy.

2. What do I do if I feel sick?

We understand that you don’t want to miss anything or fall behind in classes, but we recommend that you do not try to go to class if you are sick. Contact your professor to let them know you are sick, see if there is any makeup work you might need to do, and ask a classmate to provide notes for you.

If you think you may have an illness more severe than a common cold or you just want peace of mind, a visit to Student Health could help get you back on the mend sooner. Student Health is located at 4189 Westlawn and is open from 8 a.m. to 5 p.m. Monday through Thursday and 9:30 a.m. to 5 p.m. Friday. You can call 319-335-8394 to make an appointment or schedule one online.

There is no cost to visit Student Health; a student health fee is included in the fees you pay each semester. You might be charged for other things related to your visit, such as lab work, medications, or medical supplies. Those charges will be submitted to your insurance, and, if you are a first-time patient or change health insurance, you can fill out this form so Student Health has that information. Charges not covered by insurance can be paid with cash or with your U-Bill. 

If you are unsure if a visit to Student Health is best, you can contact the Student Health Nurseline at 319-335-9704. The Nurseline can help you decide if you need to make an appointment, how to take self-care measures, answer medication questions, and more.

3. I started Iowa with one major, but I’m having doubts if this is the right one for me. What should I do if I’m considering switching majors or colleges?

Don’t worry! Many students switch their majors. The idea of what you thought you wanted to do might look much different now that you have started college, or you may not love your area of study as much as you thought you would.

First, your academic advisor is a great resource. Set up a meeting with them to talk about what you’re not loving about current classes in your major, what classes you do enjoy, and your interests. Your advisor can also help you figure out the length of time it would take to complete your degree if you decide to switch.

If you’re a first-year student, it’s likely your advisor is in the Academic Advising Center . But if your advisor is located within a college and you are thinking about a switch in majors that would also require a switch in colleges, your current advisor is still the best person to lead you in the right direction. You could also contact the Academic Advising Center to speak with an advisor about exploring other majors.

If you want to start thinking about a new area of study, looking at the general catalog can give you more information. You can also access sample plans on MyUI that will outline what an eight-semester plan for a new major may look like.

The Pomerantz Career Center also has resources for exploring majors and career options, including career assessments. Iowa has more than 200 majors to choose from, so be assured you will find something that both excites you and helps you reach your career goals.

4. What should I do if I’m feeling overwhelmed with my courses or I’m failing a class?

First, don’t panic. Many students feel overwhelmed with their class load from time to time.

Speaking to your professor or teaching assistant is the first step. Your instructors will be able to give you a good picture of where you stand in a class and what you can do to get your desired grade. Professors and TAs hold office hours, and having one-on-one conversations with them can help you make a study plan or get a better grasp on the course material.

Your academic advisor is also a good resource, especially if you would like to change your schedule. They can go over the pros and cons of dropping a class.

If you are considering dropping a class, here is what that process will look like:

  • Keep in mind that you need 12 credits a semester to keep your full-time student status. Dropping below 12 credits could affect financial aid and scholarships. If you are concerned that dropping a class would affect your financial aid, contact the Office of Student Financial Aid.
  • You can drop a class on MyUI before the sixth day of the fall or spring semester, but it’s a good idea to speak with your academic advisor first.
  • After the drop deadline has passed for a semester, you can still request to drop a course, but you will need your academic advisor’s approval.

If you don’t want to drop a class but your grade is slipping, take advantage of tutoring resources . You can find academic help for specific courses, helpful tips in videos and worksheets, a private tutor or workshop, or a free supplemental instruction session.

5. Campus seems so big and I’m afraid of getting lost. How do I find my way around?

Navigating campus can be overwhelming when you first arrive and everything is new. There are plenty of campus maps to choose from, and it’s a good idea to walk to any buildings you’re unfamiliar with to find where your class will be held.

The UI campus is very walkable and bikeable, and those are main modes of transportation you will see students using. Students can also use Cambus for free around campus; here is a map of where bus routes will take you.

The main routes are the red and blue routes, which travel the entire campus. A helpful way to remember the direction red and blue routes go is “Blue to Burge, Red to Rienow.” The red route goes in a clockwise direction, and the blue route goes counterclockwise. Cambus also operates an Interdorm route, which goes to the residence halls and the Pentacrest.

The Transit app will show you real-time bus arrivals, departures, locations, and the closest bus stops. By subscribing to alerts on the app, you will be notified of service changes or severe weather impacts.

While we understand why you might like your vehicle on campus, you don’t need to bring one to get around and we encourage you to use other transportation. If you do bring a car, you will have to pay for a permit. More details on how to do so are here . 

6. I used to play sports in high school, but that’s changed since I started college. How can I stay active?

Without sports and high school gym classes, it can be an adjustment to incorporate staying active into your college routine. Luckily, Iowa has many opportunities for you to get exercise (not to mention you’ll get your steps in walking around campus to your classes).

  • Campus Recreation and Wellness Center: This is perhaps the most well-known recreation facility, located on the east side of campus. Not only does it include all the gym equipment and weights you might want, but it also has an indoor climbing wall, swimming pools, a jogging track, and basketball and volleyball courts. It also has the Wellness Suite, where staff provide fitness assessments, nutrition counseling, and more.
  • Field House: Located on the west side of campus, this space houses basketball, volleyball, and badminton courts; a cycling studio; an indoor track; and a weightlifting room. It also has a swimming pool.
  • Fitness East: Fitness East is in Halsey Hall, and it can be accessed through the walkway between Halsey Hall and the IMU Parking Ramp. While this space is smaller than other facilities, it has all the gym equipment you need for your workout.
  • Hawkeye Tennis and Recreation Complex: Located on Prairie Meadow Drive on the far west edge of campus, this space has indoor and outdoor tennis courts, pickleball courts, cardio equipment, and weights.

All enrolled UI students can access any recreational services facility, but you must present your student ID to get in. The cost to use the facilities is included in your student fees.

If you need something more structured, Iowa has many intramural sports teams you can join if you miss doing activities with a team or competing.

7. I’m away from all or most of the friends I grew up with. How can I make new friends and find a new community at Iowa?

Making new friends is hard, no matter what age you are. If you’re a recent high school graduate, you may have grown up with the same people and friends for most of your life. While trying to make new friends can be intimidating, the new people you meet in college can be some of the most meaningful relationships of your life. Just remember: You are not the only one trying to make new friends.

If you’re living in the residence halls, start by introducing yourself to people on your floor. You can also leave the door of your room open as a sign that you welcome visitors.

Attending campus events that pique your interest can help you connect with other like-minded people. In addition, joining a student organization — even if you stick with it for only a semester — can help you meet new people. If you don’t know where to start with finding the right student org for you, schedule a meeting with a Leadership and Engagement advisor to talk about your interests and get connected.

You can also meet new people by getting  a job or volunteering on campus.  

Again, know that many other people are also looking for new friendships. Asking someone to grab a cup of coffee after class or to meet you for a weekend lunch session will likely make their day as much as it will yours.

8. Being away from home for the first time is harder than I thought it would be. What can make this easier?

No matter how far you may have traveled to become an Iowa student, it’s normal to feel bouts of homesickness, especially if it’s your first semester on campus. Here are some tips:

While it may seem counterintuitive, try to limit your trips back home because they could prolong your feelings of homesickness. Staying on campus for longer stretches of time can help it become more familiar to you and will help Iowa feel more like a new home.

Iowa also has so many ways to get involved. Be it a club, intramural sports, or a job, getting involved on campus can make you feel like you belong here (and you’ll make new friends).

Having new, yet familiar experiences can also help you feel more at ease. For example, if you enjoyed spending Sunday mornings at your hometown’s local coffee shop, find a new place to get your caffeine fix. If you liked spending your mornings at the gym, head to one of our great recreational facilities.

Time is the best way to work through this new transition, and know that you can talk to anyone on campus about how you’re feeling. Also, remember all your loved ones are just a call or text away.

9. This is my first time having to budget and be responsible for my own finances. What are some money tips relevant for me?

Budgeting can be hard, even for people who have been doing it for years. This may be the first time you’ve really had to think about all your monthly expenses. Making a plan to manage your money will be less stressful in the long run because it will help you create some savings and will make unexpected expenses less scary — while also building good habits for the future.

Many tools are available to help you create a budget, from just writing down a plan in a notebook to using an app. No matter what method you use, all budgets are made roughly the same way.

First, figure out what time span you want to create a budget for. Weekly or monthly budgets are common, but you could also create one for an entire semester. Next, determine your income for that time frame. Then, add up your fixed expenses (U-Bill, car payments, cellphone, etc.) and variable expenses (groceries, gas, entertainment, etc.). Once you subtract expenses from your income, you can determine if there are any areas where you’d like to save or how much money you have left over to save.
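If it helps to see the arithmetic spelled out, here is a minimal sketch in Python of those same steps, using made-up monthly figures rather than anyone’s real budget.

```python
# Made-up monthly figures for illustration only; substitute your own.
income = 1200.00

fixed_expenses = {"U-Bill": 450.00, "cellphone": 40.00}
variable_expenses = {"groceries": 220.00, "gas": 60.00, "entertainment": 80.00}

total_expenses = sum(fixed_expenses.values()) + sum(variable_expenses.values())
left_over = income - total_expenses

print(f"Total expenses: ${total_expenses:.2f}")  # $850.00
print(f"Left to save:   ${left_over:.2f}")       # $350.00
```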

If you’re not sure what your expenses are for a certain time span, make a note of the money you spend during that time frame and see if your habits are on par with your goals.

Also, make sure you’re being responsible with any credit cards you may have. Even though you don’t have to worry about charges put on the card immediately, you don’t want any surprises when the bill arrives. Building credit is a good practice to start, but making note of charges to credit cards is equally important.

Lots of financial wellness resources can be found here. If you’re struggling with budgets or have other financial questions, meet with a financial aid advisor by scheduling an appointment on MyUI or by emailing [email protected]

10. How do I balance academics, social life, and my other commitments?

Once you step onto campus, it’s probably tempting to jump headfirst into everything that piques your interest, but piling too much on your plate can lead you to feel stressed out or overwhelmed. Academics, jobs, student organizations, having fun with friends — it’s important to have all these things in your life, and finding the right balance for yourself is key.

Establishing a routine (that still leaves time for fun and spontaneous ice cream runs!) is a great way to feel balanced. After you get used to your class and homework schedule, figure out what you want to prioritize and determine if you’re using your time effectively. If you feel you’re lacking in one area, make it a bigger priority the next week.

If you’re struggling academically, you can connect with Academic Support and Retention for more resources to help you succeed. Also, Student Care and Assistance can help provide a personalized assessment of how you spend your time and ways you can adjust your schedule to match your priorities.



5 Ways to Improve Critical Thinking Skills

BJ Foster

Several months ago, my family and I stayed in a yurt while on vacation. If you don’t know what that is, imagine a cross between a large hotel room and a tent. In other words, it was glorified camping. It was fun, or at least interesting. The yurt was on property owned by a college professor, whom we had the pleasure of talking with each day. When we asked her about her students, she said that over the last couple of decades, students have lost critical thinking skills.

But critical thinking skills are essential for kids to thrive and to make the world a better place. We have to instill them in our kids. Something a friend of mine uses for this is the Go Bible . It’s easy to read with lots of applications and poses questions to kids about everyday scenarios. Great exercises and tools like that can help kids formulate their thoughts and make better decisions. Here are 5 more ways to improve critical thinking skills.

1. Encourage curiosity.

Encourage your kids to explore and learn new things . It will teach them to have an open mind and gather facts before arriving at a conclusion. According to research by Harvard Business Review, curiosity “encourages [people] to put themselves in one another’s shoes and take an interest in one another’s ideas rather than focus only on their own perspective.”

2. Carve out time for free play.

In his book The Anxious Generation, Jonathan Haidt argues that giving kids more free play helps them learn to resolve problems and think creatively, and can even reduce bad behavior. Free play has disappeared over the last few decades, and our kids’ ability to self-direct and solve relational problems has taken a hit. So, take them to a park, back away, and let them play.

3. Ask open-ended questions.

Asking your kids open-ended questions builds their problem-solving skills and stretches their vocabulary as they put their thoughts into words. Questions like these challenge kids and give them an open road to produce their own original thoughts. If you have trouble thinking of questions, find tools that can give you ideas, like the Go Bible that my friend uses.

4. Play strategy games.

My dad taught me to play chess when I was six. It trained me to think a couple of moves ahead and to consider the consequences of my actions. Playing strategy games with your kids is a great way to sharpen their ability to problem-solve, evaluate strengths and weaknesses, and weigh cause and effect.

5. Let them solve their own problems.

It’s hard to watch our kids experience pain. If you are like me, it gives you a terrible bout of stress. That’s probably why I swoop in and solve the problem for them—it gives me relief. But that robs them of an opportunity to grow stronger and learn critical thinking skills. I love the scene in Finding Nemo where the dad turtle lets his son struggle to figure out how to get back to him after getting momentarily separated. I need to be more like that. When your kids run into a problem, let them figure it out, unless it’s an emergency.

Sound off: What are some other things we can do to improve our kids’ critical thinking skills?

Huddle Up Question

Huddle up with your kids and ask, “What’s one thing you’d like to learn more about?”


Bring Your Class to the Stephen A. Schwarzman Building

Our staff welcomes upper high school, undergraduate, and graduate students, and other groups to explore the varied collections and resources at our landmark 42nd Street building. We work with instructors to design class visits that use the Library's remarkable collections to foster creative inquiry, cultivate critical thinking, and develop information literacy skills, both textual and visual. 

Our Teaching Philosophy

We believe that learners of all levels of experience benefit from discovering, accessing, and connecting with The New York Public Library’s research collections.

To create positive and meaningful learning experiences, we:

  • Create thoughtful,  objective-driven learning experiences with the Library’s collections  
  • Design class visits that provide students with  ample time and tools to analyze resources   
  • Approach collaboration with external educators as partnerships  in which all parties contribute knowledge and preparatory work toward a positive student experience  
  • Transform students into regular library users

Planning Your Class Visit

Our teams work closely with educators to design and facilitate thoughtful class visits using the Library’s collections.

1. Submit the Instruction Request Form

Instructors should fill out and submit the instruction request form. Within 7 to 10 days, a librarian or curator will reach out to set up a planning meeting.

Submit the form at least six weeks before your preferred visit. We encourage you to plan your visit before the start of the semester.

2. Meet with Your NYPL Liaison

Meet with your NYPL liaison in person or virtually. At the meeting, you will:

  • Review your learning objectives
  • Discuss materials
  • Establish the date and time of the visit
  • Lay out a preliminary agenda for the visit
  • Determine roles and responsibilities

3. Select Materials

We may ask you to research and select class visit materials in advance. Your NYPL liaison will help you identify pertinent collections and, if needed, ask that you come into the Library to locate and select documents.

Items should be finalized no less than two weeks before the class visit.

4. Attend Your Class Visit

Please arrive promptly. Your NYPL liaison will meet you and your students at a predesignated location in the building.

5. Share Your Feedback

We would love to hear about your experience, and work with you again! Shortly after your visit, your NYPL liaison will ask you to share brief feedback on your class visit.

Request a Class Visit

If you are an instructor interested in planning a class visit, start by filling out this form at least six weeks before your desired visit. A librarian, archivist, or curator will follow up with you shortly.

Discover Our Research Divisions

The New York Public Library's flagship location, the Stephen A. Schwarzman Building, is one of the Library's premier research centers, renowned for its extraordinary historical collections and its commitment to providing free and equal access to its resources. Find out more about each of our divisions. 

Visits for K–12th Grade Classes

If you are an instructor for a K-12th grade class and would like to bring your students to the Library for an exhibition tour, please contact The New York Public Library’s Center for Educators & Schools. 

Building Tours

If you are interested in a tour of the Stephen A. Schwarzman building, please contact Visitor Services for more information.


EPRA International Journal of Multidisciplinary Research (IJMR)

Vol. 10, Issue 8 (August 2024)

Visual and Auditory Digital Instructions: A Supplementary Material in Facilitating Students' Problem-Solving Skills

Francis Canomon Mabunga