
Measuring critical thinking and problem-solving skills


Introduction

For most of Winter 2024–2025, while the snow was on the ground in New England, I read background articles for this post. In February 2025 I started a new job as a Portfolio Manager, so I put this away and concentrated on my new position. Now I am turning my attention back to this topic, and to this series on important questions in K-12 educational assessment.

For me, researching the measurement of critical thinking and problem solving has been overwhelming. I feel like there is an abundance of detailed information on this topic but a lack of actionable ideas. Even after all this time I do not feel like I can provide an in-depth treatment of how educators and assessment providers should measure critical thinking and problem solving. So, rather than doing a bad job of summarizing this topic, I thought I would share what I learned. As I walk you through my journey, I believe we will learn something about measurement, critical thinking, and problem solving.

Working definitions

When I began this investigation, I thought that most people used the terms critical thinking and problem solving interchangeably. For many researchers, there appears to be significant overlap between the two terms. However, an article by Alice Barana, Marina Marchisio, and Fabio Roman (2023) provided some needed clarity. For their research into how generative artificial intelligence (genAI) might be used to foster problem solving and critical thinking skills in mathematics, they adopted working definitions for each term. Then, they identified the relationship between the two concepts. The following three quotes from their article summarize their ideas.  

A comprehensive definition of problem solving involves the ability to understand the environment, identify complex problems, review related information to develop, evaluate strategies, and implement solutions to build the desired outcome. (Fissore et al., 2021, as stated in Barana, Marchisio, and Roman, 2023)

Critical thinking can be defined as the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. (Ennis, 2015, as stated in Barana, Marchisio, and Roman, 2023)

During the problem solving process, critical thinking has several roles: it permits to correctly look for data, it helps the choice of good strategies, and it allows for arguing about findings. (Barana, Marchisio, and Roman, 2023).

Barana et al. see critical thinking as essential for problem solving, aiding individuals in finding data, applying effective strategies, and presenting their findings compellingly. Although I found these definitions in an article not focused on K-12 educational assessment, their ideas distinguish the two concepts clearly.

There are assessments of critical thinking skills

During my investigation I found general assessments of critical thinking skills. Heather Butler (2024) conducted a review of the following eight assessments of critical thinking skills.

  • California Critical Thinking Dispositions Inventory
  • California Critical Thinking Skills Test
  • California Measure of Mental Motivation
  • Cornell Critical Thinking Test
  • Ennis-Weir Critical Thinking Essay Test
  • Halpern Critical Thinking Assessment
  • Test of Everyday Reasoning
  • Watson-Glaser Critical Thinking Appraisal II

Only three of the eight assessments are designed to be administered to K-12 students. If you are an educator interested in using one of these assessments, I encourage you to read Butler’s article.

Problem solving and critical thinking in the content areas

After reading about these general assessments of critical thinking skills, I turned my attention to the assessment of problem solving and critical thinking in the content areas. What I found first was a general disagreement about whether these skills can be assessed outside of a content area at all. Some researchers say yes, you can evaluate these important skills independently of content; others say no, you should assess them within a content-area context. So, I went looking for assessments of problem solving and critical thinking in the content areas.

My exploration did not lead me toward a strong opinion about the importance of a content-area context in the assessment of problem solving and critical thinking. However, I found a couple of interesting investigations of critical thinking in science that I share in the following sections.

The effects of STEM education on critical thinking

I found an article that used one of the critical thinking assessments, the California Critical Thinking Disposition Inventory (CCTDI), to evaluate changes in students’ critical thinking following a STEM intervention. Yasemin Hacioglu and Filiz Gulhan (2021) worked with a group of 30 seventh-grade students (ages 12 and 13) in Istanbul province, Turkey. They administered the CCTDI before and after students were asked to redesign an estate of multiple buildings on a central road to be safer and more energy efficient. To accomplish this goal, students needed to address three areas.

  1. The main house on the estate had been robbed so students were required to increase the guards’ visibility across the estate from the guard tower.
  2. The estate was built on a slope, which meant that drivers had limited visibility at the entrance. Students were required to reconfigure or relocate the entrance.
  3. The main house needed to be more energy efficient, so students were required to increase insulation and use renewable sources of energy.

The researchers found “a significant difference between the seventh grade students’ CCTDI Pre-test and Post-test scores in favour of post-test (t(30)=-2,571; p<.05).” These results were consistent with similar studies completed by other researchers. Hacioglu and Gulhan found that two sub‑scales increased the most: “truth-seeking and open-mindedness.” Other sub‑scales did not increase significantly: “analyticity, systematicity, self-confidence, and inquisitiveness.” They concluded that the activity, and STEM education more broadly, “made contributions to the development of the seventh grade students’ critical thinking dispositions.”
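
As an aside, here is a minimal sketch of how a paired-samples t-test like the one reported above can be run. This is my own illustration, not the researchers’ analysis, and the scores below are invented.

```python
# A paired-samples (dependent) t-test on hypothetical pre/post CCTDI totals.
# The scores are invented for illustration; the actual study had n = 30.
from scipy import stats

pre_scores = [212, 205, 198, 220, 201, 215]
post_scores = [221, 214, 205, 226, 212, 224]

# With pre-test scores listed first, a negative t statistic indicates
# higher post-test scores, matching the direction of the reported result.
t_stat, p_value = stats.ttest_rel(pre_scores, post_scores)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```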

Reading science news critically

Marianne Bissonnette, Pierre Chastenay, and Chantal Francoeur of the Université du Québec à Montréal, Canada, explored the use of critical thinking when reading about science in the news (2021). They worked with a small sample of six students (ages 16 and 17) who were strategically chosen from a larger group of 57 students because of their differing responses to surveys investigating their interest in school science, their self-concept, and the importance they placed on science and technology.

The researchers presented students with two science news stories. The first article was “pseudoscience-based and promoted a fearful and negative opinion towards wave-emitting technologies like radio and cell phones.” The second article “was science-based and presented a nuanced, fact-checked opinion about the different types of EM [electromagnetic] waves and their risks.”

During their interactions with each student, the researchers explored:

  • How well each student comprehended the main idea of the articles
  • How well each student understood the data presented in the articles
  • Whether each student could identify the science-based article and the pseudoscience article
  • The extent to which each student comprehended the content of the articles
  • The extent to which each student was open to the arguments made in each article

From these explorations, the researchers identified important critical thinking skills in the evaluation of news articles: text comprehension, numeracy, evaluation of evidence, and the ability to discern the quality of an argument. This information should help educators design learning experiences where students grapple with evaluating science news articles—and perhaps articles in other areas.

Problem solving and critical thinking in mathematics

The last part of my investigation explored how problem solving and critical thinking were treated in mathematics. To do that, I reviewed information from the Common Core State Standards and the Smarter Balanced assessments.

The Common Core State Standards, in the section called the Standards for Mathematical Practice, make general statements about what students should know and be able to do in the areas of problem solving and critical thinking (https://www.thecorestandards.org/Math/Practice/). The Smarter Balanced assessments are aligned to these mathematical practices. The consortium has developed test content that aligns explicitly to assessment targets related to problem solving (https://contentexplorer.smarterbalanced.org/target/m-g6-8-c2-ta). The following is the problem-solving claim for Grades 6-8.

Claim 2: Problem solving: Students can solve a range of complex well-posed problems in pure and applied mathematics, making productive use of knowledge and problem-solving strategies.

– Target A: Apply mathematics to solve well-posed problems in pure mathematics and arising in everyday life, society, and the workplace.

– Target B: Select and use appropriate tools strategically.

– Target C: Interpret results in the context of a situation.

– Target D: Identify important quantities in a practical situation and map their relationships.

I found one phrase in the assessment claim and targets interesting: well-posed problems. When I explored the claims and targets further, the achievement level descriptors differentiated solving problems that were familiar and unfamiliar to students. Generally, students capable of solving unfamiliar problems are at a higher level of achievement than students capable of solving only familiar problems. So, problems must be well-posed, which I am interpreting as having a solution, but they can be unfamiliar to students.

What did I learn?

The most important thing I learned during this investigation is that there are multiple ways to unpack the concepts of critical thinking and problem solving for the purposes of building an assessment. If you look at problem solving as a skill not highly connected with a content area, then truth-seeking, open-mindedness, self-confidence, and inquisitiveness become important assessment targets. If you look at problem solving in connection with a content area, then assessment targets like problem type and familiarity become important.

There is more than one way to deconstruct and assess problem solving and critical thinking. So, it becomes important to develop a sound working definition of these terms and to define assessment targets aligned to that definition. From there, test items can be written to evaluate the claims and targets, and information can be gathered about student performance. This is, in short, how evidence-centered design is used to build an assessment. For more information on evidence-centered design, please see this report written by Robert Mislevy, Russell Almond, and Janice Lukas: https://files.eric.ed.gov/fulltext/ED483399.pdf.
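
To make that claims-to-items process concrete, here is a minimal sketch, entirely my own and not drawn from Mislevy, Almond, and Lukas, of how claims, targets, and items might be represented so that every item traces back to the target it provides evidence for. All names and identifiers below are hypothetical.

```python
# An illustrative data model for evidence-centered design alignment:
# each item declares which assessment target it provides evidence for.
from dataclasses import dataclass, field

@dataclass
class Target:
    code: str         # e.g., "2A"
    description: str  # the evidence this target calls for

@dataclass
class Claim:
    number: int
    statement: str
    targets: list = field(default_factory=list)

@dataclass
class Item:
    item_id: str      # hypothetical identifier
    target_code: str  # alignment back to a target
    familiar: bool    # familiar vs. unfamiliar problem context

problem_solving = Claim(
    number=2,
    statement="Students can solve a range of complex well-posed problems.",
    targets=[
        Target("2A", "Apply mathematics to well-posed problems."),
        Target("2B", "Select and use appropriate tools strategically."),
    ],
)

# Per the achievement level descriptors discussed above, an unfamiliar
# problem aligned to the same target sits at a higher achievement level.
items = [
    Item("M-1001", target_code="2A", familiar=True),
    Item("M-1002", target_code="2A", familiar=False),
]

# A simple alignment check: every item must point at a defined target.
valid_codes = {t.code for t in problem_solving.targets}
assert all(item.target_code in valid_codes for item in items)
```

A structure like this makes the alignment auditable: a claim without items, or an item without a target, shows up immediately.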

References

Barana, A., Marchisio, M., & Roman, F. (2023). Fostering problem solving and critical thinking in mathematics through generative artificial intelligence. International Association for the Development of the Information Society. https://eric.ed.gov/?id=ED636445

Bissonnette, M., Chastenay, P., & Francoeur, C. (2021). Exploring adolescents’ critical thinking aptitudes when reading about science in the news. Journal of Media Literacy Education, 13(1), 1–13. https://eric.ed.gov/?id=EJ1301306

Butler, H. A. (2024). Predicting everyday critical thinking: A review of critical thinking assessments. Journal of Intelligence, 12(2), Article 2. https://doi.org/10.3390/jintelligence12020016

Hacioglu, Y., & Gulhan, F. (2021). The effects of STEM education on the students’ critical thinking skills and STEM perceptions. Journal of Education in Science, Environment and Health, 7(2), 139–155. https://eric.ed.gov/?id=EJ1308420

Mislevy, R. J., Almond, R. G., & Lukas, J. F. (2004). A brief introduction to evidence-centered design. National Center for Research on Evaluation, Standards, and Student Testing. https://files.eric.ed.gov/fulltext/ED483399.pdf
