The Open Minds Foundation

Applying Algebraic Strategies to Make Gains in How You Think

Using mathematical inversion to improve critical thinking.

Posted August 31, 2023 | Reviewed by Lybi Ma

  • Purposely playing devil's advocate can improve your critical thinking.
  • Adopting the rule of five approach to lateral reading provides good critical thinking practice.
  • We need to overcome our own psychological predispositions to become good critical thinkers.

When it comes to critical thinking, we are at the mercy of our brain’s own psychological predispositions, which make us naturally poor at thinking impartially. Confirmation bias encourages us to favour familiar patterns and discount new information; truth bias makes us assume that what we are told is true; and the illusory truth effect means repeated exposure makes a claim feel true, even when we know it is false.

What’s more, the more often we encounter a piece of information, the more we believe it, and even when we practise scepticism, our brains tend to overcompensate in our fact-checking, making it hard to identify what is and isn’t real. Add to these predispositions the influence of the opinions and values we already hold, and we become a melting pot of ideas that works against impartiality. We’re also susceptible to outside influence, and naturally seek out individuals or groups with ideas that complement our own: the beginnings of groupthink.

When it seems that even our brains are against us, our potential saviour comes in the form of critical thinking. By focussing on our ability to think impartially, rationally, and analytically, and by slowing down our thought processing to become more effective, critical thinking directly improves our mental capability, particularly where external influence is concerned. It is, and should be, the first line of defence in identifying fake news and propaganda, helping prevent the development of extremist views, and ultimately fostering an improved understanding of the information we consume. It also helps reduce our susceptibility to coercion and manipulation.

A novel approach to critical thinking is to emulate how mathematicians use inversion to solve equations. For a mathematician, it is second nature to rearrange an equation so that all the known quantities sit on one side and the unknown on the other, making it easier to solve. It is what lets us solve for “x”, and it is a method of showing your working that encourages slower, more deliberate thinking.
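As a toy illustration of that inversion step (a hypothetical example, not one from the article), gathering every known quantity on one side isolates the unknown:

```python
# Hypothetical example: solve 3x + 5 = 20 by "inverting" each operation,
# moving everything known to one side of the equation.
x = (20 - 5) / 3  # subtract 5 from both sides, then divide both sides by 3
print(x)  # 5.0
```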

While that makes sense for solving for x, the question is how it can apply to much more complex subjects: subjects that are anything but linear, that are infinitely complex, and that can have any number of attributes and conflicting facts. Well, it’s not so much the pure mathematical approach that we need to apply as the principle behind it.

Inversion thinking can be applied to any amount of information, and its purpose is to encourage us to deliberately approach information in a contrary way. By imagining an alternative scenario, playing devil’s advocate, seeking out conflicting or alternative sources and viewpoints, and actively challenging our own, we become far better able to separate fact from fiction and to recognise our own bias in how we process and rationalise information. The goal of this form of critical thinking is not necessarily to reach an answer, but to be sure that we have considered something from multiple angles and challenged our own perceptions of what is fact and what is fiction. The result should be better-quality decision-making and a more informed understanding, but these are by-products rather than the primary goal of practising critical thinking.

We encourage the rule of five as an approach to lateral reading, the perfect instigator of the inversion approach. Lateral reading encourages breadth rather than depth, promoting that you consult multiple resources and filter common facts from opinions across them. In turn, the rule of five helps you practise lateral reading by actively tasking you with finding:

  • Two sources of information that you are comfortable or familiar with, preferably across two different mediums, for example, article and video
  • Two sources of information that you are definitely not comfortable or familiar with, preferably ones that directly contradict or challenge your own viewpoints
  • A source of information that has a very specific, niche, or strong opinion on the matter

The purpose of seeking out multiple sources of information is that it allows you to identify commonalities (points that persist across all formats), challenge your own biases by directly confronting them, and identify flaws in your own arguments or reasoning. It encourages you to actively practise critical thinking, in turn making you better at it, and it makes you less susceptible to exploitation, manipulation, and misinformation, all of which are rife in a society where we consume so much content.



The Open Minds Foundation is dedicated to undermining the effects of coercive control, through critical thinking education and training.


Khan Academy Blog

Unlocking the Power of Math Learning: Strategies and Tools for Success

posted on September 20, 2023


Mathematics, the foundation of all sciences and technology, plays a fundamental role in our everyday lives. Yet many students find the subject challenging, causing them to shy away from it altogether. This reluctance is often due to a lack of confidence, a misunderstanding of unclear concepts, moving ahead to more advanced skills before they are ready, and ineffective learning methods. However, with the right approach, math learning can be both rewarding and empowering. This post will explore different approaches to learning math, strategies for success, and cutting-edge tools to help you achieve your goals.

Math Learning

Math learning can take many forms, including traditional classroom instruction, online courses, and self-directed learning. A multifaceted approach to math learning can improve understanding, engage students, and promote subject mastery. A 2014 study by the National Council of Teachers of Mathematics found that the use of multiple representations, such as visual aids, graphs, and real-world examples, supports the development of mathematical connections, reasoning, and problem-solving skills.

Moreover, the importance of math learning goes beyond solving equations and formulas. Advanced math skills are essential for success in many fields, including science, engineering, finance, health care, and technology. In fact, a report by Burning Glass Technologies found that 71% of high-salary, entry-level positions require advanced math skills.

Benefits of Math Learning

In today’s 21st-century world, having a broad knowledge base and strong reading and math skills is essential. Mathematical literacy plays a crucial role in this success. It empowers individuals to comprehend the world around them and make well-informed decisions based on data-driven understanding. More than just earning good grades in math, mathematical literacy is a vital life skill that can open doors to economic opportunities, improve financial management, and foster critical thinking. We’re not the only ones who say so:

  • Math learning enhances problem-solving skills, critical thinking, and logical reasoning abilities. (Source: National Council of Teachers of Mathematics)
  • It improves analytical skills that can be applied in various real-life situations, such as budgeting or analyzing data. (Source: Southern New Hampshire University)
  • Math learning promotes creativity and innovation by fostering a deep understanding of patterns and relationships. (Source: Purdue University)
  • It provides a strong foundation for careers in fields such as engineering, finance, computer science, and more. These careers generally correlate to high wages. (Source: U.S. Bureau of Labor Statistics)
  • Math skills are transferable and can be applied across different academic disciplines. (Source: Sydney School of Education and Social Work)

How to Know What Math You Need to Learn

Often students will find gaps in their math knowledge; this can occur at any age or skill level. As math learning is generally iterative, a solid foundation and understanding of the math skills that preceded current learning are key to success. The solution to these gaps is called mastery learning, the philosophy that underpins Khan Academy’s approach to education .

Mastery learning is an educational philosophy that emphasizes the importance of a student fully understanding a concept before moving on to the next one. Rather than rushing students through a curriculum, mastery learning asks educators to ensure that learners have “mastered” a topic or skill, showing a high level of proficiency and understanding, before progressing. This approach is rooted in the belief that all students can learn given the appropriate learning conditions and enough time, making it a markedly student-centered method. It promotes thoroughness over speed and encourages individualized learning paths, thus catering to the unique learning needs of each student.

Students will encounter mastery learning passively as they go through Khan Academy coursework, as our platform identifies gaps and systematically adjusts to support student learning outcomes. More details can be found in our Educators Hub.


How to Learn Math

Learning at School

One of the most common methods of math instruction is classroom learning. In-class instruction provides students with real-time feedback, practical application, and a peer-learning environment. Teachers can personalize instruction by assessing students’ strengths and weaknesses, providing remediation when necessary, and offering advanced instruction to students who need it.

Learning at Home

Supplemental learning at home can complement traditional classroom instruction. For example, using online resources that provide additional practice opportunities, interactive games, and demonstrations, can help students consolidate learning outside of class. E-learning has become increasingly popular, with a wealth of online resources available to learners of all ages. The benefits of online learning include flexibility, customization, and the ability to work at one’s own pace. One excellent online learning platform is Khan Academy, which offers free video tutorials, interactive practice exercises, and a wealth of resources across a range of mathematical topics.

Moreover, parents can encourage and monitor progress, answer questions, and demonstrate practical applications of math in everyday life. For example, when at the grocery store, parents can ask their children to help calculate the price per ounce of two items to discover which one is the better deal. Cooking and baking with your children also provides a lot of opportunities to use math skills, like dividing a recipe in half or doubling the ingredients. 
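The unit-price comparison described above can be sketched in a couple of lines (prices and sizes here are invented for illustration):

```python
# Hypothetical grocery-store example: divide price by weight to get a
# unit price, then pick the smaller one to find the better deal.
def price_per_ounce(price_dollars, ounces):
    return price_dollars / ounces

item_a = price_per_ounce(3.20, 16)  # $3.20 for 16 oz -> $0.20 per ounce
item_b = price_per_ounce(2.70, 12)  # $2.70 for 12 oz -> $0.225 per ounce
better = "item A" if item_a < item_b else "item B"
print(better)  # item A
```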

Learning Math with the Help of Artificial Intelligence (AI) 

AI-powered tools are changing the way students learn math. Personalized feedback and adaptive practice help target individual needs. Virtual tutors offer real-time help with math concepts while AI algorithms identify areas for improvement. Custom math problems provide tailored practice, and natural language processing allows for instant question-and-answer sessions. 

Using Khan Academy’s AI Tutor, Khanmigo

Transform your child’s grasp of mathematics with Khanmigo , the 24/7 AI-powered tutor that specializes in tailored, one-on-one math instruction. Available at any time, Khanmigo provides personalized support that goes beyond mere answers to nurture genuine mathematical understanding and critical thinking. Khanmigo can track progress, identify strengths and weaknesses, and offer real-time feedback to help students stay on the right track. Within a secure and ethical AI framework, your child can tackle everything from basic arithmetic to complex calculus, all while you maintain oversight using robust parental controls.


You can learn anything.

Math learning is essential for success in the modern world, and with the right approach, it can also be enjoyable and rewarding. Learning math requires curiosity, diligence, and the ability to connect abstract concepts with real-world applications. Strategies for effective math learning include a multifaceted approach, including classroom instruction, online courses, homework, tutoring, and personalized AI support. 

So, don’t let math anxiety hold you back; take advantage of available resources and technology to enhance your knowledge base and enjoy the benefits of math learning.

National Council of Teachers of Mathematics, “Principles to Actions: Ensuring Mathematical Success for All”, April 2014

Project Lead The Way Research Report, “The Power of Transportable Skills: Assessing the Demand and Value of the Skills of the Future”, 2020

Page, M., “Why Develop Quantitative and Qualitative Data Analysis Skills?”, 2016

Mann, E. L., “Creativity: The Essence of Mathematics”, Journal for the Education of the Gifted, Vol. 30, No. 2, 2006, pp. 236–260, http://www.prufrock.com

Nakakoji, Y., and Wilson, R., “Interdisciplinary Learning in Mathematics and Science: Transfer of Learning for 21st Century Problem Solving at University”, J Intell. 2020 Sep 1;8(3):32. doi: 10.3390/jintelligence8030032. PMID: 32882908; PMCID: PMC7555771.


Teaching with a Mountain View


How To Encourage Critical Thinking in Math

By Mary Montero

Critical thinking in math helps students learn to analyze and evaluate math concepts, identify patterns and relationships, and explore different strategies.

Critical thinking is more than just a buzzword… It’s an essential skill that helps students develop problem-solving abilities and make logical connections between different concepts. By encouraging critical thinking in math, students learn to approach problems more thoughtfully: they learn to analyze and evaluate math concepts, identify patterns and relationships, and explore different strategies for finding a solution. Critical thinking also involves a great deal of persistence. Those are critical life skills!

When you think about it, students are typically asked to solve math problems and find the answer. Showing their work is frequently stressed too, which is important, but it isn’t the end. Students also need to be able to look at math in different ways in order to truly grasp math concepts. Mathematics requires logical reasoning, problem-solving, and abstract thinking.


What Does Critical Thinking in Math Look Like?

When I think about critical thinking in math, I focus on:

  • Solving problems through logical thinking. Students learn how to break down complex problems, analyze the different parts, and understand how they fit together logically.
  • Identifying patterns and making connections. Students learn how to identify patterns across different math concepts, make connections between seemingly unrelated topics, and develop a more in-depth understanding of how math works.
  • Evaluating and comparing solutions. Students learn to evaluate which solution is best for a given problem and identify any flaws in their reasoning, or others’ reasoning, when looking at different solutions.

Mathematician Posters

These FREE Marvelous Mathematician posters have been a staple in my classroom for the last 8+ years! I first started using a version from MissMathDork and adapted them for my classroom over the years. 


I print, laminate, and add magnetic stickers on the back. At the beginning of the year, I only put one or two up at a time depending on our area of focus. Now, they are all hanging on my board, and I’ll pull out different ones depending on our area of focus. They are so empowering to my mathematicians and help them stay on track!

A Marvelous Mathematician:

  • knows that quicker doesn’t mean better
  • looks for patterns
  • knows mistakes happen and keeps going
  • makes sense of the most important details
  • embraces challenges and works through frustrations
  • uses proper math vocabulary to explain their thinking
  • shows their work and models their thinking
  • discusses solutions and evaluates reasonableness
  • gives context by labeling answers
  • applies mathematical knowledge to similar situations
  • checks for errors (computational and conceptual)

Critical Thinking Math Activities

Here are a few of my favorite critical thinking activities. 

Square Of Numbers

I love to incorporate challenge problems (use Nrich and Openmiddle to get started) because they teach my students so much more than how to solve a math problem. They learn important lessons in teamwork, persistence, resiliency, and growth mindset. We talk about strategies for tackling difficult problems and the importance of not giving up when things get hard.

This square of numbers challenge was a hit!

ALL kids need to feel and learn to embrace challenge. Oftentimes, kids I see have rarely faced an academic challenge. Things have just come easy to them, so when it doesn’t, they can lack strategies that will help them. In fact, they will often give up before they even get started.

I tell them it’s my job to make sure I’m helping them stretch and grow their brain by giving them challenges. They don’t love it at first, but they eventually do! 

This domino challenge was another one from Nrich . I’m always on the hunt for problems like this!!  How would you guide students toward an answer??


Fifteen Cards

This is a well-loved math puzzle with my students, and it’s amazing for encouraging students to consider all options when solving a math problem.


We have number cards 1-15 (one of each number) and only seven are laid out. With the given clues, students need to figure out which seven cards should be put out and in what order. My students love these, and after they’ve done a few, they enjoy creating their own, too! Use products, differences, and quotients to increase the challenge.

This is also adapted from Nrich, which is an AMAZING resource for math enrichment!

This is one of my favorite fraction lessons that I’ve done for years! Huge shout out to Meg from The Teacher Studio for this one. I give each child a slip of paper with this figure and they have to silently write their answer and justification. Then I tally up the answers and have students take a side and DEBATE with their reasoning! It’s an AMAZING conversation, and I highly recommend trying it with your students. 

Sometimes we leave it hanging overnight and work on visual models to make some proofs. 


Logic Puzzles

Logic puzzles are always a hit too! You can enrich and extend your math lessons with these ‘Math Mystery’ logic puzzles that are the perfect challenge for 4th, 5th, and 6th grades. The puzzles are skills-based, so they integrate well with almost ANY math lesson. You can use them to supplement instruction or challenge your fast-finishers and gifted students… all while encouraging critical thinking about important math skills!


Three levels are included, so they’re perfect to use for differentiation.

  • Introductory logic puzzles are great for beginners (4th grade and up!)
  • Advanced logic puzzles are great for students needing an extra challenge
  • Extra Advanced logic puzzles are perfect for expert solvers… we dare you to figure these puzzles out! 

Do you have a group of students who are ready for more of a fraction challenge? My well-loved fraction puzzlers are absolutely perfect for fraction enrichment. They’ll motivate your students to excel at even the most challenging tasks! 


Math Projects

Math projects are another way to differentiate while building critical thinking skills. Math projects hold so much learning power with their real-world connections, differentiation options, collaborative learning opportunities, and numerous avenues for cross-curricular learning too.

If you’re new to math projects, I shared my best tips and tricks for using math projects in this blog post . They’re perfect for cumulative review, seasonal practice, centers, early finisher work, and more.


I use both concept-based math projects to focus on specific standards and seasonal math projects that integrate several skills.


Error Analysis

Finally, error analysis is always a challenging way to encourage critical thinking. When we use error analysis, we encourage students to analyze their own mistakes to prevent making the same mistakes in the future.

For my gifted students, I use error analysis tasks as an assessment when they have shown mastery of a unit during other tasks. For students in the regular classroom needing enrichment, I usually have them complete the tasks in a center or with a partner.

For students needing extra support, we complete error analysis in small groups.  We go step-by-step through the concept and they are always able to eventually identify what the error is. It is so empowering to students when they finally figure out the error AND it helps prevent them from making the same error in the future!

My FREE addition error analysis is a good place to start, no matter the grade level. I show them the process of walking through the problem and how best to complete an error analysis task.

When you’re ready for more, this bundle of error analysis tasks contains more than 240 tasks to engage and enrich your students in critical thinking practice.


If you want to dig even deeper, visit this conceptual vs computational error analysis post to learn more about using error analysis in the classroom. 


Related Critical Thinking Posts

  • How to Increase Critical Thinking and Creativity in Your “Spare” Time
  • More Tips to Increase Critical Thinking

Critical thinking is essential for students to develop a deeper understanding of math concepts, problem-solving skills, and a stronger ability to reason logically. When you learn how to encourage critical thinking in math, you’re setting your students up for success not only in more advanced math subjects they’ll encounter, but also in life. 

How do you integrate critical thinking in your classroom? Come share your ideas with us in our FREE Inspired In Upper Elementary Facebook group .


Mary Montero

I’m so glad you are here. I’m a current gifted and talented teacher in a small town in Colorado, and I’ve been in education since 2009. My passion (other than my family and cookies) is for making teachers’ lives easier and classrooms more engaging.



©2023 Teaching With a Mountain View . All Rights Reserved | Designed by Ashley Hughes



PLOS ONE

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Clio Cresswell

1 School of Mathematics and Statistics, The University of Sydney, Sydney, Australia

Craig P. Speelman

2 School of Arts and Humanities, Edith Cowan University, Joondalup, Australia

Associated Data

All relevant data are within the paper and its Supporting Information files.

Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically, or improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational and logical reasoning tasks aggregated from various psychological studies. These tasks, which included the Cognitive Reflection Test and the Wason Selection Task, are of particular interest as they have typically and reliably eluded participants in all studies, and results have been uncorrelated with general intelligence, education levels and other demographic information. The results in this study revealed that, in general, the greater the mathematics training of the participant, the more tasks were completed correctly, and that performance on some tasks was also associated with performance on others not traditionally associated. A ceiling effect also emerged. The work is deconstructed from the viewpoint of adding to the platform from which to approach the greater, and more scientifically elusive, question: are any skills associated with mathematics training innate, or do they arise from skills transfer?

Introduction

Mathematics is often promoted as endowing those who study it with a number of broad thinking skills, such as an ability to think logically, analytically, critically and abstractly, and a capacity to weigh evidence with impartiality. This view of mathematics as providing transferable skills can be found across educational institutions, governments and corporations worldwide, and it is material to the place of mathematics in curricula.

Consider the UK government’s commissioned inquiry into mathematics education “Making Mathematics Count” ascertaining the justification that “mathematical training disciplines the mind, develops logical and critical reasoning, and develops analytical and problem-solving skills to a high degree” [ 1 p11]. The Australian Mathematical Sciences Institute very broadly states in its policy document “Vision for a Maths Nation” that “Not only is mathematics the enabling discipline, it has a vital productive role planning and protecting our well-being” (emphasis in original) [ 2 ]. In Canada, British Columbia’s New 2016 curriculum K-9 expressly mentions as part of its “Goals and Rationale”: “The Mathematics program of study is designed to develop deep mathematical understanding and fluency, logical reasoning, analytical thought, and creative thinking.” [ 3 ]. Universities, too, often make such specific claims with respect to their teaching programs. “Mathematics and statistics will help you to think logically and clearly, and apply a range of problem-solving strategies” is claimed by The School of Mathematical Sciences at Monash University, Australia [ 4 ]. The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes as part of particular course objectives and outcomes skills that include “enhance your problem-solving skills” as part of studies in first year [ 5 ], “develop logical thinking” as part of studies in second year, which was a statement drafted by the lead author in fact [ 6 ], and “be fluent in analysing and constructing logical arguments” as part of studies in third year [ 7 ]. The University of Cambridge’s Faculty of Mathematics, UK, provides a dedicated document “Transferable Skills in the Mathematical Tripos” as part of its undergraduate mathematics course information, which again lists “analytic ability; creativity; initiative; logical and methodical reasoning; persistence” [ 8 ].

In contrast, psychological research, which has been empirically investigating the concept of transferability of skills since the early 1900s, points quite oppositely to reasoning skills as being highly domain specific [ 9 ]. Therefore, support for claims that studying mathematics engenders more than specific mathematics knowledge is highly pertinent. And yet it is largely absent. The 2014 Centre for Curriculum Redesign (CCR) four part paper “Mathematics for the 21st Century: What Should Students Learn?” concludes in its fourth paper titled “Does mathematics education enhance higher-order thinking skills?” with a call to action “… there is not sufficient evidence to conclude that mathematics enhances higher order cognitive functions. The CCR calls for a much stronger cognitive psychology and neuroscience research base to be developed on the effects of studying mathematics” [ 10 ].

Inglis and Simpson [ 11 ], bringing up this very issue, examined the ability of first-year undergraduate students from a high-ranking UK university mathematics department, on the “Four Cards Problem” thinking task, also known as the Wason Selection Task. It is stated as follows.

Each of the following cards has a letter on one side and a number on the other.

[Image: four cards, showing D, K, 3 and 7]

Here is a rule: “if a card has a D on one side, then it has a 3 on the other”. Your task is to select all those cards, but only those cards, which you would have to turn over in order to find out whether the rule is true or false. Which cards would you select?

This task involves understanding conditional inference, namely understanding the rule “If P then Q” and, with this, deducing the answer as “P and not Q”, or “D and 7”. Such logical deduction indeed presents as a good candidate to test for a potential ability of the mathematically trained. This task has also been substantially investigated in the domain of the psychology of reasoning [ 12 p8], revealing across a wide range of publications that only around 10% of the general population reach the correct result. The predominant mistake is to pick “D and 3”; in the original study by Wason [ 13 ] it is suggested that this was picked by 65% of people. This poor success rate, along with a standard mistake, has fuelled interest in the task as well as attempts to understand why it occurs. A prevailing theory is the so-named matching bias effect: the effect of disproportionately concentrating on items specifically mentioned in the situation, as opposed to reasoning according to logical rules.
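The selection logic can be sketched programmatically (an illustrative sketch, not code from any of the cited studies): a card must be turned over exactly when its visible face could reveal a counterexample to the rule “if a card has a D on one side, then it has a 3 on the other”.

```python
def must_flip(visible):
    """Return True if the card showing `visible` could falsify the rule."""
    if visible.isalpha():
        # A letter card can only falsify the rule if it shows D
        # (the hidden number might not be 3).
        return visible == "D"
    # A number card can only falsify the rule if it is NOT 3
    # (the hidden letter might be D).
    return visible != "3"

cards = ["D", "K", "3", "7"]
print([c for c in cards if must_flip(c)])  # ['D', '7']
```

Flipping K or 3 can never falsify the rule, which is why the common “D and 3” answer reflects matching bias rather than conditional reasoning.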

Inglis and Simpson’s results isolated mathematically trained individuals with respect to this task. The participants were under time constraint, and 13% of the first-year undergraduate mathematics students sampled reached the correct response, compared to 4% of the non-mathematics (arts) students included as a comparison group. Of note also, 24% of the mathematics students, as opposed to 45% of the non-mathematics students, chose the standard mistake. The study indeed revealed that mathematically trained individuals were significantly less affected by the matching bias effect on this problem than individuals without mathematics training. However, the achievement of the mathematically trained group was still far from masterful, and their greater tendency towards non-standard mistakes, compared with non-mathematically trained people, is suggestive. Mathematical training appears to engender a different thinking style, but it remains unclear what the difference is.

Inglis, Simpson and colleagues followed up these results with a number of studies concentrated on conditional inference in general [ 14 , 15 ]. A justification for this single investigatory pathway is that if transfer of knowledge is present, something subtle to test for in the first place, a key consideration should be the generalisation of learning rather than the application of skills learned in one context to another (where experimenter bias in the choice of contexts is more likely to be an issue). For this they typically used sixteen “if P then Q” comprehension tasks, where their samples across a number of studies have included 16-year-old pre-university mathematics students (from England and Cyprus), mathematics honours students in their first year of undergraduate university study, third-year university mathematics students, and associated control groups. The studies have encompassed controls for general intelligence and thinking disposition prior to training, as well as follow-ups of up to two years to address the issue of causation. The conclusive thinking pattern that has emerged is a tendency of the mathematical groups towards a greater likelihood of rejecting the invalid denial-of-the-antecedent and affirmation-of-the-consequent inferences. At the same time, however, and as validated by a second separate study, the English mathematics group actually became less likely to endorse the valid modus tollens inference. So again, mathematical training appears to engender a different thinking style, but there are subtleties and it remains unclear what the exact difference is.

This project was designed to broaden the search on the notion that mathematics training leads to increased reasoning skills. We focused on a range of reasoning problems considered in psychological research to be particularly insightful into decision making, critical thinking and logical deduction, distinguished by the fact that the general population typically struggles to answer them correctly. An Australian sample adds diversity to the current enquiries, which have been European focussed. Furthermore, in an effort to identify the impact of mathematics training through a possible gradation effect, individuals at different levels of mathematical training were tested for performance.

Well-studied thinking tasks from a variety of psychological studies were chosen. Their descriptions, associated success rates and other pertinent details follow. All were chosen because the correct answer is typically missed in favour of a standard mistake.

The three-item Cognitive Reflection Test (CRT) was used as introduced by Frederick [ 16 ]. This test was devised in line with the theory that there are two general types of cognitive activity: one that operates quickly and without reflection, and another that requires not only conscious thought and effort, but also an ability to reflect on one’s own cognition, including a step of suppressing the first type in order to reach the correct answer. The three items in the test each invite an incorrect “gut” response, and further cognitive skill is deemed required to reach the correct answer (although see [ 17 ] for evidence that correct responses can result from “intuition”, which could be related to intelligence [ 18 ]).

Lily pads

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Widgets

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Bat and ball

A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?

The solutions are: 47 days for the Lily Pads problem, 5 minutes for the Widgets problem and 5 cents for the Bat and Ball problem. The intuitive, but wrong, answers are 24 days, 100 minutes and 10 cents, respectively. These wrong answers are attributed to participants becoming so focused on the numbers that they ignore the exponential growth pattern in the Lily Pads problem, merely complete a pattern in numbers in the Widgets problem, and neglect the relationship “more than” in the Bat and Ball problem [ 19 ]. The original study by Frederick [ 16 ] provides a composite measure of performance on these three items, with only 17% of those studied (n = 3428) reaching the perfect score. The CRT has since been studied extensively [ 19 – 21 ]. Research using the CRT tends not to report performance on the individual items of the test, but rather a composite measure of performance. Attridge and Inglis [ 22 ] used the CRT as a test of the thinking disposition of mathematics students, as one way to disentangle the issue of filtering according to prior thinking styles from the transfer of knowledge in successful problem solving. They repeat-tested 16-year-old pre-university mathematics students and English literature students without mathematics subjects at a one-year interval and found no difference between the groups.
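The three solutions can be verified with the direct computations that the intuitive responses skip. The following Python sketch is illustrative only and makes each step explicit:

```python
# A quick check of the three CRT answers by direct computation,
# rather than the intuitive pattern-matching that produces 24/100/10.

# Lily Pads: the patch doubles daily, so the day before it covers the
# whole lake (day 48) it covered exactly half.
lily_days = 48 - 1                      # -> 47

# Widgets: 5 machines x 5 minutes = 25 machine-minutes for 5 widgets,
# i.e. 5 machine-minutes per widget. 100 widgets need 500
# machine-minutes, which 100 machines supply in 5 minutes.
widget_minutes = (100 * 5) / 100        # -> 5.0

# Bat and Ball: bat + ball = 1.10 and bat - ball = 1.00,
# so 2 * ball = 0.10 and the ball costs 5 cents.
ball_cost = (1.10 - 1.00) / 2           # -> 0.05 (to float rounding)

print(lily_days, widget_minutes, round(ball_cost, 2))
```

Each correct answer requires suppressing the salient-number response and working through one extra relationship, which is precisely what the test is designed to measure.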

Three problems were included that test the ability to reason about probability. All three were originally discussed by Kahneman and Tversky [ 23 ], with the typically poor performance on them explained by participants relying not on probability knowledge, but on a short-cut method of thinking known as the representativeness heuristic. In the late 1980s, Richard Nisbett and colleagues showed that graduate-level training in statistics, while not revealing any improvement in logical reasoning, did correlate with higher-quality statistical answers [ 24 ]. Their studies led in particular to the conclusion that comprehension of what is known as the law of large numbers did improve with training. The first of our next three problems targeted this law directly.

A certain town is served by two hospitals. In the larger hospital, about 45 babies are born each day, and in the smaller hospital, about 15 babies are born each day. As you know, about 50 percent of all babies are boys. However, the exact percentage varies from day to day. Sometimes it may be higher than 50 percent, sometimes lower. For a period of one year, each hospital recorded the number of days on which more than 60 percent of the babies born were boys. Which hospital do you think recorded more such days? (Circle one letter.)

  • (a) the larger hospital
  • (b) the smaller hospital
  • (c) about the same (that is, within 5 percent of each other)

Kahneman and Tversky [ 23 ] reported that, of 50 participants, 12 chose (a), 10 chose (b), and 28 chose (c). The correct answer is (b), for the reason that small samples are more likely to exhibit extreme events than large samples from the same population. The larger the sample, the more likely it will exhibit characteristics of the parent population, such as the proportion of boys to girls. However, people tend to discount or be unaware of this feature of sampling statistics, which Kahneman and Tversky refer to as the law of large numbers. Instead, according to Kahneman and Tversky, people tend to adhere to a fallacious law of small numbers, where even small samples are expected to exhibit properties of the parent population, as illustrated by the proportion of participants choosing the answer (c) in their 1972 study. Such thinking reflects use of the representativeness heuristic, whereby someone will judge the likelihood of an uncertain event based on how similar it is to characteristics of the parent population of events.
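The smaller hospital's advantage can be computed exactly under the problem's 50 percent base rate. The Python sketch below is illustrative only and assumes independent births; it sums the binomial tail for each hospital:

```python
# Exact probability that more than 60% of a day's births are boys,
# assuming each birth is independently a boy with probability 0.5.
from math import comb

def p_extreme(n):
    """P(strictly more than 60% of n births are boys)."""
    k_min = (3 * n) // 5 + 1   # smallest count strictly above 60% of n
    return sum(comb(n, k) for k in range(k_min, n + 1)) / 2 ** n

print(p_extreme(15))  # smaller hospital: ~0.151
print(p_extreme(45))  # larger hospital: markedly smaller
```

The smaller sample deviates from the 50 percent base rate far more often, which is exactly the law-of-large-numbers point the problem tests.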

Birth order

All families of six children in a city were surveyed. In 72 families the exact order of births of boys and girls was GBGBBG.

  • (a) What is your estimate of the number of families surveyed in which the exact order of births was BGBBBB?
  • (b) In the same survey set, which, if any, of the following two sequences would be more likely: BBBGGG or GBBGBG?

All of the events listed in the problem have an equal probability, so the correct answer to (a) is 72, and to (b) is “neither is more likely”. Kahneman and Tversky [ 23 ] reported that 75 of 92 participants judged the sequence in (a) as less likely than the given sequence. A similar number (unspecified by Kahneman and Tversky, but the statistical effect was reported to be of the same order as in (a)) reported that GBBGBG was the more likely sequence. Again, Kahneman and Tversky suggested that these results reflected use of the representativeness heuristic. In the context of this problem, the heuristic takes the following form: some birth orders appear less patterned than others, and because lack of pattern is associated with the randomness of birth order, the less patterned orders are judged more likely.
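The equiprobability claim can be demonstrated by enumeration. This Python sketch, illustrative only, lists all 2^6 birth orders and shows that each specific order, patterned or not, has the same probability of 1/64:

```python
from itertools import product

# All possible birth orders for six children, each equally likely
# (assuming boy/girl equally likely and births independent).
sequences = ["".join(s) for s in product("BG", repeat=6)]
p = 1 / len(sequences)                  # 1/64 for any exact order

for order in ("GBGBBG", "BGBBBB", "BBBGGG", "GBBGBG"):
    assert order in sequences
    print(order, p)                     # the same probability each time
```

Since every exact order occurs with probability 1/64, the expected count for BGBBBB in the survey equals the 72 observed for GBGBBG.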

Coin tosses

In a sequence of coin tosses (the coin is fair) which of the following outcomes would be most likely (circle one letter):

  • (a) H T H T H T H T
  • (b) H H H H T T T T
  • (c) T T H H T T H H
  • (d) H T T H T H H T
  • (e) all of the above are equally likely

The correct answer in this problem is (e). Kahneman and Tversky [ 23 ] reported that participants tend to choose less patterned looking sequences (e.g., H T T H T H H T) as more likely than more systematic looking sequences (e.g., H T H T H T H T). This reasoning again reflects the representativeness heuristic.

Three further questions from the literature were included to test problem solving skill.

Two drivers

Two drivers set out on a 100-mile race that is marked off into two 50-mile sections. Driver A travels at exactly 50 miles per hour during the entire race. Driver B travels at exactly 45 mph during the first half of the race (up to the 50-mile marker) and travels at exactly 55 mph during the last half of the race (up to the finish line). Which of the two drivers would win the race? (Circle one letter.)

  • (a) Driver A would win the race
  • (b) Driver B would win the race
  • (c) the two drivers would arrive at the same time (within a few seconds of one another)

This problem was developed by Pelham and Neter [ 25 ]. The correct answer is (a), which can be determined by calculating the driving time for each driver, using time = distance/velocity. Pelham and Neter argue, however, that (c) is intuitively appealing, on the basis that both drivers appear to have the same overall average speed. Pelham and Neter reported that 67% of their sample gave this incorrect response to the problem, and a further 13% selected (b).
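The calculation is short, and a sketch of it (illustrative Python) also shows why intuition fails: over equal distances, the average speed is the harmonic, not arithmetic, mean of the leg speeds:

```python
# time = distance / speed, applied to each 50-mile leg.
time_a = 100 / 50                  # Driver A: 2.0 hours
time_b = 50 / 45 + 50 / 55         # Driver B: ~2.02 hours

# Driver B's true average speed is the harmonic mean of 45 and 55,
# which is 49.5 mph -- below Driver A's constant 50 mph.
avg_b = 100 / time_b

print(time_a, time_b, avg_b)
```

Because Driver B spends more time on the slow leg than on the fast one, the slow leg weighs more heavily in the average, and Driver A wins.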

Petrol station

Imagine that you are driving along the road and you notice that your car is running low on petrol. You see two petrol stations next to each other, both advertising their petrol prices. Station A’s price is 65c/litre; Station B’s price is 60c/litre. Station A’s sign also announces: “5c/litre discount for cash!” Station B’s sign announces “5c/litre surcharge for credit cards.” All other factors being equal (for example, cleanliness of the stations, number of cars waiting at each etc), to which station would you choose to go, and why?

This problem was adapted from one described by Galotti [ 26 ], and is inspired by research reported by Thaler [ 27 ]. According to Thaler’s research, most people prefer Station A, even though both stations are offering the same deal: 60c/litre for cash, and 65c/litre for credit. Tversky and Kahneman [ 28 ] explain this preference by invoking the concept of framing effects. In the context of this problem, such an effect would involve viewing the outcomes as changes from some initial point. The initial point frames the problem, and provides a context for viewing the outcome. Thus, depending on the starting point, outcomes in this problem can be viewed as either a gain (in Station A, you gain a discount if you use cash) or a loss (in Station B, you are charged more (a loss) for using credit). Given that people are apparently more concerned about a loss than a gain [ 29 ], the loss associated with Station B makes it the less attractive option, and hence the preference for Station A. The correct answer, though, is that the stations are offering the same deal and so no station should be preferred.

And finally, a question described by Stanovich [ 30 , 31 ] as testing our predisposition for cognitive operations that require the least computational effort.

Jack looking at Anne

Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person? (Circle one letter.)

  • (a) Yes
  • (b) No
  • (c) Cannot be determined

Stanovich reported that over 80% of people choose the “lazy” answer (c). The correct answer is (a).
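The problem yields to a two-case analysis over Anne's unknown status, which the following Python sketch (illustrative only) makes explicit:

```python
# Case analysis over Anne's unknown marital status: in either case,
# some married person is looking at some unmarried person.

looking = [("Jack", "Anne"), ("Anne", "George")]
known = {"Jack": True, "George": False}   # married? per the problem

def married_looks_at_unmarried(anne_married):
    status = dict(known, Anne=anne_married)
    return any(status[a] and not status[b] for a, b in looking)

for anne in (True, False):
    print(anne, married_looks_at_unmarried(anne))
# True in both cases: if Anne is married she looks at unmarried
# George; if she is unmarried, married Jack looks at her. So the
# answer is (a) without ever determining Anne's status.
```

The "lazy" answer (c) reflects stopping at the missing fact rather than checking that both possible cases lead to the same conclusion.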

The above questions survey, in a clear problem-solving setting, an ability to engage advanced cognitive processing in order to critically evaluate and possibly override initial gut reasoning; an ability to reason about probability within the framework of the law of large numbers and the relationship between randomness and patterning; an ability to isolate salient features of a problem; and, with the last question in particular, an ability to map logical relations. It might be hypothesised that, in line with the knowledge base provided and the claims of associated broad and enhanced problem-solving abilities in general, participants with greater degrees of mathematical training would outperform others on these questions. This hypothesis was investigated in this study. In addition, given that no previous study on this issue has examined the variety of problems used here, we also undertook an exploratory analysis to investigate whether there exist any associations between the problems in terms of their likelihood of correct solution. Similarities between problems might indicate which problem-solving domains could be susceptible to the effects of mathematics training.

A questionnaire was constructed containing the problems described in the previous sections plus the Four Cards Problem as tested by Inglis and Simpson [ 11 ] for comparison. The order of the problems was as follows: 1) Lily Pads; 2) Hospitals; 3) Widgets; 4) Four Cards; 5) Bat and Ball; 6) Birth Order; 7) Petrol Station; 8) Coin Tosses; 9) Two Drivers; 10) Jack looking at Anne. It was administered to five groups, distinguished by mathematics training level, drawn from a high-ranking Australian university, where the teaching year is separated into two teaching semesters and where being a successful university applicant requires having been highly ranked against peers in terms of intellectual achievement:

  • Introductory—First year, second semester, university students with weak high school mathematical results, only enrolled in the current unit as a compulsory component for their chosen degree, a unit not enabling any future mathematical pathway, a typical student may be enrolled in a Biology or Geography major;
  • Standard—First year, second semester, university students with fair to good high school mathematical results, enrolled in the current mathematics unit as a compulsory component for their chosen degree with the possibility of including some further mathematical units in their degree pathway, a typical student may be enrolled in an IT or Computer Science major;
  • Advanced1—First year, second semester, university mathematics students with very strong interest as well as background in mathematics, all higher year mathematical units are included as possible future pathway, a typical student may be enrolled in a Mathematics or Physics major;
  • Advanced2—Second year, second semester, university mathematics students with strong interest as well as background in mathematics, typically a direct follow on from the previously mentioned Advanced1 cohort;
  • Academic—Research academics in the mathematical sciences.

Participants

123 first-year university students volunteered during “help on demand” tutorial times containing up to 30 students. These are course-allocated times that are supervised yet self-directed by students; this minimised disruption and discouraged coercion. 44 second-year university students completed the questionnaire during a weekly one-hour time slot dedicated to putting the latest mathematical concepts into practice with the lecturer (where, by contrast to what occurs in tutorial times, the lecturer does most of the work and all enrolled students are invited). All these university students completed the questionnaire in normal classroom conditions; they were not placed under strict examination conditions. The lead author walked around to prevent discussion and coercion, and there was minimal disruption. 30 research academics responded to local advertising and answered the questionnaire in their workplace while supervised.

The questionnaires were voluntary, anonymous and confidential. Participants were free to withdraw from the study at any time and without any penalty. No participant took this option however. The questionnaires gathered demographic information which included age, level of education attained and current qualification pursued, name of last qualification and years since obtaining it, and an option to note current speciality for research academics. Each problem task was placed on a separate page. Participants were not placed under time constraint, but while supervised, were asked to write their start and finish times on the front page of the survey to note approximate completion times. Speed of completion was not incentivised. Participants were not allowed to use calculators. A final “Comments Page” gave the option for feedback including specifically if the participants had previously seen any of the questions. Questionnaires were administered in person and supervised to avoid collusion or consulting of external sources.

The responses were coded four ways: A) correct; B) standard error (the errors discussed above in The Study); C) other error; D) left blank.

The ethical aspects of the study were approved by the Human Research Ethics Committee of the University of Sydney, protocol number [2016/647].

The first analysis examined the total number of correct responses provided by the participants as a function of group. Scores ranged from 1 to 11 out of a total possible of 11 (Problem 6 had 2 parts) ( Fig 1 ). An ANOVA of this data indicated a significant effect of group (F(4, 192) = 20.426, p < .001, partial η 2 = .299). Pairwise comparisons using Tukey’s HSD test indicated that the Introductory group performed significantly worse than the Advanced1, Advanced2 and Academic groups. There were no significant differences between the Advanced1, Advanced2 and Academic groups.

[Fig 1. Image file: pone.0236153.g001.jpg]

Error bars are one standard error of the mean.

Overall solution time, while recorded manually and approximately, was positively correlated with group, such that the more training someone had received, the longer were these solution times (r(180) = 0.247, p = .001). However, as can be seen in Fig 2 , this relationship is not strong.

[Fig 2. Image file: pone.0236153.g002.jpg]

A series of chi-squared analyses, and their Bayesian equivalents, were performed on each problem, to determine whether the distribution of response types differed as a function of group. To minimise the number of cells in which expected values in some of these analyses were less than 5, the Standard Error, Other Error and Blank response categories were collapsed into one category (Incorrect Response). For three of the questions, the expected values of some cells did fall below 5, and this was due to most people getting the problem wrong (Four Cards), or most people correctly responding to the problem (Bat and Ball, Coin Tosses). In these cases, the pattern of results was so clear that a statistical analysis was barely required. Significant chi-squared results were examined further with pairwise posthoc comparisons (see Table 1 ).

Superscripts label the groups (e.g., Introductory = a). Within the table, these letters refer to which other group a particular group was significantly different to according to a series of pairwise post hoc chi squared analyses (Bonferroni corrected α = .005) (e.g., ‘d’ in the Introductory column indicates the Introductory and the Advanced2 (d) group were significantly different for a particular problem).

The four cards problem

The three groups with the least amount of training in mathematics were far less likely than the other groups to give the correct solution (χ 2 (4) = 31.06, p < .001; BF 10 = 45,045) ( Table 1 ). People in the two most advanced groups (Advanced2 and Academic) were more likely to solve the card problem correctly, although it was still less than half of the people in these groups who did so. Further, these people were less likely to give the standard incorrect solution, so that most who were incorrect suggested some more cognitively elaborate answer, such as turning over all cards. The proportion of people in the Advanced2 and Academic groups (39 and 37%) who solved the problem correctly far exceeded the typical proportion observed with this problem (10%). Of note, also, is the relatively high proportion of those in the higher training groups who, when they made an error, did not make the standard error, a similar result to the one reported by Inglis and Simpson [ 11 ].

The cognitive reflection test

In the Lily Pads problem, although most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, it was also the case that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ 2 (4) = 27.28, p < .001; BF 10 = 15,554), with the standard incorrect answer being the next most prevalent response for the two lower ability mathematics groups ( Table 1 ).

Performance on the Widgets problem was similar to performance on the Lily Pads problem in that most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, but that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ 2 (4) = 23.76, p< .001; BF 10 = 516) ( Table 1 ). As with the Lily Pads and Widget problems, people in the Standard, Advanced1, Advanced2 and Academic groups were highly likely to solve the Bat and Ball problem (χ 2 (4) = 35.37, p < .001; BF 10 = 208,667). Errors were more likely from the least mathematically trained people (Introductory, Standard) than the other groups ( Table 1 ).

To compare performance on the CRT with previously published results, performance on the three problems (Lily Pads, Widgets, Bat and Ball) were combined. The number of people in each condition that solved 0, 1, 2, or 3 problems correctly is presented in Table 2 . The Introductory group were evenly distributed amongst the four categories, with 26% solving all three problems correctly. Around 70% of the rest of the groups solved all 3 problems correctly, which is vastly superior to the 17% reported by Frederick [ 16 ].

Responses to the Hospitals problem were almost universally split between correct and standard errors in the Standard, Advanced1, Advanced2 and Academic groups. Although this pattern of responses was also evident in the Introductory group, this group also exhibited more non-standard errors and non-responses than the other groups. However, the differences between the groups were not significant (χ 2 (4) = 4.93, p = .295; BF 10 = .068) ( Table 1 ). Nonetheless, the performance of all groups exceeds the 20% correct response rate reported by Kahneman and Tversky [ 23 ].

The two versions of the Birth Order problem showed similar results, with correct responses being more likely in the groups with more training (i.e., Advanced1, Advanced2 and Academic), and responses being shared amongst the various categories in the Introductory and Standard groups (version (a): χ 2 (4) = 24.54, p < .001; BF 10 = 1,303; version (b): χ 2 (4) = 25.77, p < .001; BF 10 = 2,970) ( Table 1 ). Nonetheless, performance on both versions of the problem in this study was significantly better than the 82% error rate reported by Kahneman and Tversky [ 23 ].

The Coin Tosses problem was performed well by all groups, with very few people in any condition committing errors. There were no obvious differences between the groups (χ 2 (4) = 3.70, p = .448; BF 10 = .160) ( Table 1 ). Kahneman and Tversky [ 23 ] reported that people tend to make errors on this type of problem by choosing less patterned looking sequences, but they did not report relative proportions of people making errors versus giving correct responses. Clearly the sample in this study did not perform like those in Kahneman and Tversky’s study.

Responses on the Two Drivers problem were clearly distinguished by a high chance of error in the Introductory and Standard groups (over 80%), and a fairly good chance of being correct in the Advanced1, Advanced2 and Academic groups (χ 2 (4) = 46.16, p < .001; BF 10 = 1.32 x 10 8 ) ( Table 1 ). Academics were the standout performers on this problem, although over a quarter of this group produced an incorrect response. Thus, the first two groups performed similarly to the participants in the Pelham and Neter [ 25 ] study, 80% of whom gave an incorrect response.

Responses on the Petrol Station problem were marked by good performance by the Academic group (73% providing a correct response), with just over half of each of the other groups correctly solving the problem. This difference was not significant (χ 2 (4) = 4.68, p = .322; BF 10 = .059) ( Table 1 ). Errors were fairly evenly balanced between standard and other, except for the Academic group, who were more likely to provide a creative answer if they made an error. Thaler [ 27 ] reported that most people get this problem wrong. In this study, however, on average, most people got this problem correct, although this average was boosted by the Academic group.

Responses on the Jack looking at Anne problem generally were standard errors, except for the Advanced2 and Academic groups, which were evenly split between standard errors and correct responses (χ 2 (4) = 18.03, p = .001; BF 10 = 46) ( Table 1 ). Thus, apart from these two groups, the error rate in this study was similar to that reported by Stanovich [ 30 ], where 80% of participants were incorrect.

A series of logistic regression analyses were performed in order to examine whether the likelihood of solving a particular problem correctly could be predicted on the basis of whether other problems were solved correctly. Each analysis involved selecting performance (correct or error) on one problem as the outcome variable, and performance on the other problems as predictor variables. Training (amount of training) was also included as a predictor variable in each analysis. A further logistic regression was performed with training as the outcome variable, and performance on all of the problems as predictor variables. The results of these analyses are summarised in Table 3 . There were three multi-variable relationships observed in these analyses, which can be interpreted as the likelihood of solving one problem in a set being associated with the likelihood of solving the others in that set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. Training also featured in each of these sets, moderating the relationships as per the results presented above for each problem.

P = Problem (1 = Four Cards; 2 = Lily Pads; 3 = Widgets; 4 = Bat & Ball; 5 = Hospitals; 6a = Birth Order (a); 6b = Birth Order (b); 7 = Coin Tosses; 8 = Two Drivers; 9 = Petrol Station; 10 = Jack looking at Anne).

training = Amount of training condition.

p = significance level of logistic regression model.

% = percentage of cases correctly classified by the logistic regression model.

✓ = significant predictor, α < .05.

* = logistic regression for the training outcome variable is multinomial, whereas all other logistic regressions are binomial.

The final “Comments Page” revealed that the participants overwhelmingly enjoyed the questions. Any analysis of previous exposure to the tasks proved impossible, as there was little to no alignment in participants’ degree of recall, if any, or even in perceptions of what exposure entailed. For example, some participants confused being exposed to the particular tasks with being habitually exposed to puzzles, or even mathematics problems, more broadly.

In general, the amount of mathematics training a group had received predicted their performance on the overall set of problems. The greater the training, the more problems were answered correctly, and the slower the recorded response times. There was no obvious difference between the Advanced1, Advanced2 and Academic groups on either of these measures; however, there were clear differences between these groups and the Introductory and Standard groups, with the former exhibiting clearly superior accuracy. While time records were only approximate, taken so as to avoid adding time pressure as a variable, the fact that the Advanced1, Advanced2 and Academic groups recorded more time in their consideration of the problems may suggest that a “pause and consider” approach to such problems is characteristic of the advanced groups. This is in line with an eye-movement tracking study of mathematically trained students attempting the Four Cards Problem, in which participants who had not chosen the standard error spent longer considering the card linked to the matching bias effect [ 14 ]. It is important to note, however, that longer response times may reflect cognitive processes other than deliberation [ 32 ].

Performance on some problems was associated with performance on other problems. That is, if someone correctly answered a problem in one of these sets, they were also highly likely to correctly answer the other problems in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. This differs from how these problems have typically been clustered a priori in the research literature: (I) Lily Pads, Widgets and Bat and Ball (CRT); (II) Hospitals and Two Drivers (explained below); (III) Hospitals, Birth Order and Coin Tosses (representativeness heuristic); (IV) Birth Order and Coin Tosses (probability theory). Consideration of these problem groupings follows.

Correctly answering all three problems in (I) entailed not being distracted by particular pieces of information in the problems, so as to stay focused on uncovering the real underlying relationships. The Lily Pads and Widgets problems can mislead if attention is over-focused on the numbers, and conversely, the Petrol Station problem can mislead if there is too much focus on the idea of a discount. While the Lily Pads and Widgets problems are traditionally paired with the Bat and Ball problem in the CRT, it may be that performance on the Bat and Ball problem did not appear as part of this set due to an added level of difficulty. With the problems in (I), avoiding being distracted by certain parts of the questions at the expense of others leads almost directly to the correct answer. With the Bat and Ball problem, however, further steps of mathematical reasoning are still required: finding two numbers that sum to one given value while differing by another.

With the problems in (II), it is of interest that the Two Drivers problem was created specifically to be paired with the Hospitals problem to test for motivation in problem solving [ 23 ]. Within this framework, more transparent versions of these problems were successfully devised to manipulate difficulty. The Two Drivers problem was amended to have Driver B travelling at exactly 5 mph during the first half of the race and at exactly 95 mph during the last half. The Hospitals problem was amended so that the smaller hospital would have “only 2” babies born each day, and so that for a period of one year the hospitals recorded the number of days on which all of the babies born were boys. Could the association in (II) point to how participants overcome initial fictitious mathematical rules? Perhaps they reframe the question in simpler terms to see the pattern. The Four Cards Problem also elicited a high number of incorrect answers where, with greater mathematical training, the standard incorrect solution was avoided in favour of more cognitively elaborate ones. Indeed, a gradation effect appeared across the groups where the standard error of the “D and 3” cards becomes “D only” ( Table 4 ). Adrian Simpson and Derrick Watson found a comparable result across their two groups [14 p61]. This could again indicate that, having avoided an initial fictitious rule of simply concentrating on items found directly in the question, participants then seek to reframe the question to unearth the logical rule to be deduced. An added level of difficulty with this question may be why participants become trapped in a false answer. The eye-movement tracking study mentioned above supports this theory.
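The amended Two Drivers figures make the underlying arithmetic easy to check. As a sketch (the excerpt does not restate the full race setup; the standard version has the two halves covering equal distances), the average speed over two equal-distance legs is the harmonic mean of the leg speeds, not the arithmetic mean of 50 mph:

```python
# Average speed over two legs of equal distance d is total distance 2d
# divided by total time d/v1 + d/v2, i.e. the harmonic mean of v1 and v2.
def average_speed(v1, v2):
    return 2 * v1 * v2 / (v1 + v2)

print(average_speed(5, 95))  # 9.5 mph, far below the intuitive 50 mph
```

Driver B's long slow leg dominates the total time, which is exactly what the transparent wording is designed to expose.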

The problems in (III) fit naturally together as part of basic probability theory, a topic participants would have assimilated, or not, as part of various education curricula. While the equal likelihood of all possible outcomes with respect to a coin toss may be culturally assimilated, the same may not be as straightforward for birth gender outcomes, where such assumptions could be swayed by biological hypotheses or folk wisdom [ 33 ]. The gradation of the results in terms of mathematical training does not support this possibility.
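The shared principle behind the problems in (III) can be stated concretely: under an equal-likelihood model, every specific ordered sequence of outcomes has the same probability, however "patterned" it looks. A small illustration, assuming a fair 50/50 model for both coin tosses and births:

```python
# Any specific ordered sequence of n independent 50/50 outcomes has
# probability (1/2)^n - a "patterned" sequence is no less likely than
# a "random-looking" one.
def sequence_probability(seq, p=0.5):
    return p ** len(seq)

print(sequence_probability("BBBGGG"))  # 0.015625
print(sequence_probability("GBBGBG"))  # 0.015625 (identical)
```

The representativeness heuristic leads solvers to rate the second sequence as more likely because it "looks random", even though both probabilities are identical.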

The effect of training on performance accuracy was more obvious in some problems than in others, and to some extent this was related to the type of problem. For instance, most of the problems in which performance was related to training (Four Cards, CRT [Lily Pads, Widgets, Bat and Ball], Two Drivers, Jack looking at Anne) could be classed as relying on logical and/or critical thinking. The one exception was the Birth Order problem, which is probability related.

In contrast, two of the three problems in which training did not appear to have much impact on performance (Hospitals and Coin Tosses) require domain-specific knowledge. The Hospitals problem requires a degree of knowledge about sampling statistics, a topic of quite distinct flavour that not all mathematically trained individuals gain familiarity with. On the other hand, that all groups performed well on the Coin Tosses problem is in line with a level of familiarity with basic probability, originally presented at high school. While the treatment of patterning as negatively correlated with randomness is similar to that in the Birth Order question, in the Birth Order question this aspect is arguably more concealed. These results and problem grouping (III) could point to an area for improvement in teaching: the small gap in knowledge required to go from answering the Coin Tosses problem correctly to doing the same with the Birth Order problem could easily be addressed. A more formal introduction to sampling statistics in mathematical training could bridge this gap, and could further extend towards improvement on the Hospitals problem.
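The sampling-statistics point behind the Hospitals problem can be illustrated directly. Under a binomial model with boys and girls equally likely (the daily birth counts of 15 and 45 are the figures commonly used in this problem and are an assumption here, since the excerpt does not restate them), the smaller hospital records noticeably more days on which over 60% of births are boys:

```python
from math import comb

def prob_more_than_60pct_boys(n, p=0.5):
    """P(number of boys > 0.6 * n) for n independent births."""
    k_min = (6 * n) // 10 + 1  # smallest count strictly above 60% of n
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

small = prob_more_than_60pct_boys(15)  # ~0.15
large = prob_more_than_60pct_boys(45)  # ~0.07
print(small > large)  # smaller samples fluctuate more around 50%
```

This is the gap the discussion identifies: the Coin Tosses insight (equal likelihood) is not enough here; one also needs the idea that proportions in small samples vary more than in large ones.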

The other problem where performance was unrelated to training, the Petrol Station problem, cannot be characterised similarly. It is more of a logical/critical thinking problem, and there remains some suggestion that training may have affected performance, as the Academic group appeared to perform better than the rest of the sample. An alternative interpretation is therefore that this problem should not be isolated, but grouped with the other problems on which performance was affected by training.

Although several aspects of the data suggest mathematics training improves the chances that someone will solve problems of the sort examined here, differences in the performance of participants in the Advanced1, Advanced2 and Academic groups were not obvious. This is despite the fact that large differences exist in the amount of training in these three groups: the first two groups were undergraduate students, while the Academic group all had PhDs and many were experienced academic staff. One interpretation of this result is that current mathematics training can only take someone so far in terms of improving their abilities with these problems. There is a point of demarcation in mathematical knowledge to consider between the Advanced1, Advanced2 and Academic groups on one hand and the Introductory and Standard groups on the other. In Australia, students are able to drop mathematical study at ages 15–16 years, or to choose between a number of increasingly involved levels of mathematics. For the university in this study, students are filtered upon entry into mathematics courses according to their current knowledge status. All our groups involved students who had opted for post-compulsory mathematics at high school, and since our testing occurred in second semester, some of the mathematical knowledge shortfalls present on arrival had been bridged in first semester: students must pass a first semester course to be allowed entry into the second semester course. A breakdown of the mathematics background of each group is as follows:

  • The Introductory group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Introduction to Differentiation, Applications of the Derivative, Antiderivatives, Areas and the Definite Integral), Financial Mathematics, Statistical Analysis. The Introductory group then explored concepts in mathematical modelling with emphasis on the importance of calculus in their first semester of mathematical studies.
  • The Standard group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Rates of Change, Integration including the method of substitution, trigonometric identities and inverse trigonometric functions, Areas and Volumes of solids of revolution, some differential equations), Combinatorics, Proof (with particular focus on Proof by Mathematical Induction), Vectors (with application to projectile motion), Statistical Analysis. In first semester their mathematical studies then covered a number of topics the Advanced1 group studied prior to gaining entrance at university; further details on this are given below.
  • The Advanced1 group’s mathematics high school syllabus studied prior to first semester course entry covered: the same course content the Standard group covered at high school plus extra topics on Proof (develop rigorous mathematical arguments and proofs, specifically in the context of number and algebra and further develop Proof by Mathematical Induction), Vectors (3 dimensional vectors, vector equations of lines), Complex Numbers, Calculus (Further Integration techniques with partial fractions and integration by parts), Mechanics (Application of Calculus to Mechanics with simple harmonic motion, modelling motion without and with resistance, projectiles and resisted motion). The Standard group cover these topics in their first semester university studies in mathematics with the exclusion of further concepts of Proof or Mechanics. In first semester the Advanced1 group have built on their knowledge with an emphasis on both theoretical and foundational aspects, as well as developing the skill of applying mathematical theory to solve practical problems. Theoretical topics include a host of theorems relevant to the study of Calculus.

In summary, at the point of our study, the Advanced1 group had more knowledge of, and practice with, rigorous mathematical arguments and proofs in the context of number and algebra, and more in-depth experience with Proofs by Induction, but the bulk of their extra knowledge rests with a much deeper knowledge of Calculus. They have had longer experience with a variety of integration techniques, and have worked with a variety of applications of calculus to solve practical problems, including a large section on mechanics at high school. In first semester at university there has been a greater focus on theoretical topics, including a host of theorems and associated proofs relevant to the topics studied. As compared to the Introductory and Standard groups, the Advanced1 group have only widened the mathematics knowledge gap since their choice of post-compulsory mathematics at high school. The Advanced2 group come directly from an Advanced1 cohort, and the Academic group would have reached the Advanced1 group's proficiency as part of their employment. So, do specific reasoning skills result from this level of abstract training? Our findings suggest this should certainly be an area of investigation, and it links interestingly with other research work. In studying one of the thinking tasks in particular (the Four Cards Problem) and its context of conditional inference more specifically, Inglis and Simpson [ 15 ] found a clear difference between undergraduates in mathematics and undergraduates in other university disciplines, yet also showed a lack of development over first-year university studies on conditional inference measures. A follow-up study by Attridge and Inglis [ 22 ] then zeroed in on post-compulsory high school mathematical training and found that students with such training did develop their conditional reasoning to a greater extent than their control group over the course of a year, despite having received no explicit tuition in conditional logic.
The development, though demonstrated not to be the result of a domain-general change in cognitive capacity or thinking disposition, and most likely associated with the domain-specific study of mathematics, revealed a complex pattern of endorsing more of some inferences and fewer of others. The present study focused on a much broader problem set associated with logical and critical thinking, and it too suggests a more complex picture of how mathematics training may contribute to problem-solving styles. A more intricate account of the impact of mathematical training on problem-solving techniques appears to be required.

There is also a final interpretation to consider: that people in the Advanced1, Advanced2 and Academic groups did not gain anything from their mathematics training in terms of their ability to solve these problems. Given studies showing no correlation between many of these problems and what is currently measured as intelligence [ 30 ], they might instead have possessed a particular intelligence or thinking disposition to start with, which enabled them not only to solve these problems, but also to survive the challenges of their mathematics training.

That the CRT has traditionally been used as a measure of baseline thinking disposition, and that performance on it has been found to be immutable across the groups tested, is of particular interest, since our results show a possible training effect on these questions. CRT performance is tied to a willingness to engage in effortful thinking, which presents as an ability suitable for training. It is beyond the scope of this study, but a thorough review of CRT testing could suggest a broader appreciation of, and a better framework for understanding, thinking disposition, ability and potential ability.

Mathematical training appears associated with certain thinking skills, but there are clearly subtleties that need to be disentangled. The thinking tasks here add to the foundational results, the aim being a firmer platform on which to eventually base more targeted and illustrative inquiry. If thinking skills can be fostered, could first-year university mathematics teaching be improved so that all students in that cohort reach the Advanced1 group's level of reasoning? Do university mathematics courses become purely about domain-specific knowledge from this point on? Intensive training has been shown to impact the brain and cognition across a number of domains, from music [ 34 ], to video gaming [ 35 ], to Braille reading [ 36 ]. The hypothesis that mathematics, with its highly specific practice, fits within this list remains legitimate, but simply uncharted. At our current level of understanding it is worth appreciating the careful wording of the NYU Courant Institute on ‘Why Study Math?’, where there is no assumption of causation: “Mathematicians need to have good reasoning ability in order to identify, analyze, and apply basic logical principles to technical problems.” [ 37 ].

Limitations

One possible limitation of the current study is that the problems may have been too easy for the more advanced people, and so we observed a ceiling effect (i.e., some people obtained 100% correct on all problems). This was most obvious in the Advanced1, Advanced2 and Academic groups. It is possible that participants in these groups had developed logical and critical thinking skills throughout their mathematical training that were sufficient to cope with most of the problems used in this study, which would support the contention that training in mathematics leads to the development of logical and critical thinking skills useful in a range of domains. Another interpretation is that participants in these groups already possessed the necessary thinking skills for solving the problems in this study, which is why they were able to cope with the material in the advanced units they were enrolled in, or complete a PhD in mathematics and hold down an academic position in a mathematics department. This would then suggest that training in mathematics had no effect on abstract thinking skills—people in this study possessed them to varying extents prior to their studies. This issue might be settled in a future study that used a greater number of problems of varying difficulties to maximise the chances of finding a difference between the three groups with the most training. Alternatively, a longitudinal study that followed people through their mathematics training could determine whether their logical and critical thinking abilities changed throughout their course.

A further limitation of the study may be that several of the reasoning biases examined were measured by only one problem each (i.e., Four Cards Problem, Two Drivers, Petrol Station, Jack looking at Anne). A more reliable measure of these biases could be achieved by including more problems that tap into each of them. This would, however, increase the time required of participants during data collection, and in the context of this study would likely mean a different mode of testing would be required.

Broad, sweeping intuitive claims about the transferable skills endowed by a study of mathematics require evidence. Our study uniquely covers a wide range of participants, from those with limited mathematics training through to research academics in the mathematical sciences. It furthermore considered performance on 11 well-studied thinking tasks that typically elude participants in psychological studies, and on which results have been uncorrelated with general intelligence, education levels and other demographic information [ 15 , 16 , 30 ]. We identified different performance on these tasks across groups with different levels of mathematical training, including on the CRT, which has developed into a method of measuring baseline thinking disposition. We identified different distributions of error types among the mathematically trained. We furthermore identified a performance threshold in first year university for those with high-level mathematics training. This study thus provides insight into possible changes and adjustments to mathematics courses, so that they fulfil their advertised goal of delivering improved rational and logical reasoning to a greater number of students.

It is central to any education program to have a clear grasp of the nature of what it delivers and how, but arguably especially so for the core discipline that is mathematics. In 2014 the Office of the Chief Scientist of Australia released the report “Australia’s STEM workforce: a survey of employers”, in which the transferable skills attributed to mathematics were among those employers deemed most valuable [ 38 ]. A better understanding of what mathematics delivers in this space is an opportunity to truly capitalise on this historical, culture-crossing subject.

Supporting information

Acknowledgments.

The authors would like to thank Jacqui Ramagge for her proofreading and input, as well as her support towards data collection.

Funding Statement

The authors received no specific funding for this work.

Data Availability

  • PLoS One. 2020; 15(7): e0236153.

Decision Letter 0

17 Mar 2020

PONE-D-20-01159

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Dear Professor Speelman,

Thank you for submitting your manuscript to PLOS ONE. I have sent it to two expert reviewers and have received their comments back. As you can see at the bottom of this email, both reviewers are positive about your manuscript but raise some issues that you would need to address before the manuscript can be considered for publication. Notably, reviewer #1 points out that the manuscript should include a discussion on the reasons why individuals with math training may have improved reasoning skills (e.g., logical intuitions versus deliberate thinking). The reviewer also rightly mentions that your sample sizes are limited, notably for the most advanced groups. This should be discussed and acknowledged. Reviewer #2 has a number of conceptual and methodological points that you will also have to address. The reviewer provides very thorough comments and I will not reiterate the points here. However, note that both reviewers suggest that you need to improve the figures and I agree with them.   

We would appreciate receiving your revised manuscript by May 01 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Jérôme Prado

Academic Editor

Journal Requirements:

When submitting your revision, we need you to address these additional requirements:

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.plosone.org/attachments/PLOSOne_formatting_sample_main_body.pdf and http://www.plosone.org/attachments/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please include additional information regarding the survey or questionnaire used in the study and ensure that you have provided sufficient details that others could replicate the analyses. For instance, if you developed a questionnaire as part of this study and it is not under a copyright more restrictive than CC-BY, please include a copy, in both the original language and English, as Supporting Information. Please also let us know if it would be possible to provide the anonymized data points necessary to replicate the statistical analyses, for instance, as shown in fig 1 and 2. If so, please deposit those to a suitable data repository or include them in the Supporting Information files.

3. Thank you for stating the following financial disclosure:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

  • Please provide an amended Funding Statement that declares *all* the funding or sources of support received during this specific study (whether external or internal to your organization) as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now .  
  • Please state what role the funders took in the study.  If any authors received a salary from any of your funders, please state which authors and which funder. If the funders had no role, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer #1: I think this is a very good and interesting manuscript trying to answer an important research question. I propose some changes that I believe should be applied before publication.

1. Each reasoning bias is measured with only one problem. In reasoning research, it is rather common to measure each type of reasoning problem with a series of structurally equivalent reasoning problems, so the results will be independent of context effects and will be generalizable to that type of problem. Here, the authors only measured each reasoning bias with one single problem and this might be problematic (see, for example: Fiedler & Hertel, 1994). I think this can be addressed by simply discussing it in the limitation section.

2. This is rather a minor issue, but the discussion of the CRT problems is not up to date (page 7). Most recent experiments on dual process theory suggest that people who are able to correctly solve these reasoning problems (including the CRT) do so intuitively, and not because they engaged in careful deliberation (Bago & De Neys, 2019). Intelligence makes people have better intuitive responses (Thompson, Pennycook, Trippas & Evans, 2018). Similarly, this problem persists in the discussion about reaction times (page 25). Longer reaction times do not necessarily mean that people engaged in deliberation (see: Evans, Kyle, Dillon & Rand, 2015). Response time might be driven by decision conflict or response rationalization. These issues could be clarified with some changes in the wording or some footnotes on pages 7 and 25. Furthermore, it would be interesting to have a discussion on how mathematical education helps people overcome their biases. Is it because it creates better intuition, or helps people engage in deliberation? An interesting question this manuscript does not discuss. It’s on the authors whether or not they discuss this latter point now, but the changes on pages 7 and 25 should be made.

3. A more serious problem is the rather small sample size (especially in the more advanced groups). This small sample size makes the appearance of both false negatives and false positives more likely. Perhaps the authors could compute Bayes Factors for the chi-square or logistic regression tests, so we can actually see how strong the evidence is for or against the null. This is especially important as the authors ran a great number of exploratory analyses (Table 3), and some of those results might need to be interpreted with great caution (depending on the Bayes Factor).

The graphs are not looking good, they should comply with APA formatting. At the very least, the axis titles should be meaningful and measure units should be written there.

The presentation order of the problems is quite unusual; why isn’t it random? Why did the authors decide on this order?

Reviewer #2: The study reported in this paper compared five groups of participants with varying levels of mathematical expertise on a set of reasoning tasks. The study is interesting and informative. It extends the current literature on this topic (which is reviewed very nicely in the introduction). However, there are some issues with the current analysis and interpretation that should be resolved prior to publication. I have therefore recommended major revisions. My comments are organised in the order in which they came up in the paper and they explain my responses to the questions above.

1. Line 114 – “general population” a bit misleading – they were also students but from other disciplines.

2. Line 124 onwards reads:

“The ultimate question to consider here is: are any skills associated with mathematics training innate or do they arise from skills transfer? Though to investigate how mathematical training affects reasoning skills, randomised sampling and randomised intervention to reveal causal relationships are clearly not viable. With so many possible confounding variables and logistical issues, it is even questionable what conclusions such studies might provide. Furthermore, a firm baseline from which to propose more substantive investigations is still missing.”

I find this paragraph slightly problematic because the current study doesn’t inform us on this ultimate question, so it makes the outline of the current study in the following paragraph feel unsatisfactory. I think the current study is important but prefacing it with this paragraph underplays that importance. And I think a randomised controlled study, although not viable, would give the answers we need because the random allocation to groups would allow us to rule out any confounding variables. Finally, the last sentence in this paragraph is unclear to me.

3. In the descriptions of the five participants groups the authors refer to the group’s level of interest in mathematics, but this seems like an overgeneralisation to me. Surely the introductory group could contain a biology student who also happens to be good at mathematics and very much enjoy it? I would be more comfortable with the descriptions if the parts about interest level were removed.

4. How many of the 123 first year students were in each of the three first year groups?

5. Line 313 – the standard group is referred to as “university mathematics students”, but they are not taking mathematics degrees.

6. Line 331 - what is a practice class?

7. Were the data collection settings quiet? From the description it sounds like groups of participants were completing the study at the same time in the same room, but the authors should make this explicit for the sake of the method being reproducible. E.g. how many students were in the room at the time?

8. Line 355-356 – the authors should not use the term “marginally worse” because this is statistically inappropriate – in a frequentist approach results are either significant or non-significant.

9. Line 340 – “approximate completion times were noted.”

This doesn’t sound rigorous enough to justify analysing them. Their analysis is interesting, but the authors should remind readers clearly whenever the response times are analysed or discussed that their recording was only manual and approximate.

10. I suggest replacing Figure 1 with a bar chart showing standard error of the mean on the error bars. A table with mean score out of 11 and the standard deviation for each group may also be useful. Figure 2 should be a scatterplot rather than a box and whisker plot.

11. Was the 0-11 total correct score approximately normally distributed across the full sample?

12. Chi square analysis requires at least 5 cases in each cell, was this met? It seems not since Table 1 shows lots of cells in the “no response” row having 0% of cases.

13. The chi-square analyses should be followed up with post hoc tests to see exactly where the differences between groups are. The descriptions as they stand aren’t that informative (as readers can just look at Table 1) without being backed up by post hoc tests.

14. For each chi square analysis in the text, I would find it easier to read if the test statistics came at the top of the paragraph, before the description.

15. Line 381-383 – “Of note, also, is the relatively low proportion of those in the higher training groups who, when they made an error, did not make the standard error, a similar result to the one reported by Inglis and Simpson [11]."

I think this is supposed to say that a low proportion did make the standard error or that a high proportion did not make the standard error.

16. Line 403 - p values this small should be reported as p < .001 rather than p = .000 since they aren’t actually 0.

17. Line 476 – “…if a particular outcome variable was predicted significantly by a particular predictor variable, the converse relationship was also observed”

Isn’t that necessarily the case with regression analyses, like with correlations?

18. I don’t think the logistic regression analyses add much to the paper and at the moment they come across as potential p-hacking since they don’t clearly relate to the research question. To me they make the paper feel less focused. Having said that, there is some interesting discussion of them in the Discussion section. I’d recommend adding some justification to the introduction for why it is interesting to look at the relationships among tasks (without pretending to have made any specific hypotheses about the relationships, of course).

19. Line 509 would be clearer if it read “between these groups and the introductory and standard groups”

20. Lines 597 – 620 - This is an interesting discussion, especially the suggestion that advanced calculus may be responsible for the development. No development in reasoning skills from the beginning of a mathematics degree onwards was also found by Inglis and Simpson (2009), who suggested that the initial difference between mathematics and non-mathematics undergraduates could have been due to pre-university study of mathematics. Attridge & Inglis (2013) found evidence that this was the case (they found no difference between mathematics and non-mathematics students at age 16 but a significant difference at the end of the academic year, where the mathematics students had improved and the non-mathematics students had not).

Could the authors add some discussion of whether something similar may have been the case with their Australian sample? E.g. do students in Australia choose whether, or to what extent, to study mathematics towards the end of high school? If not, the description of the groups suggests that there were at least differences in high school mathematics attainment between groups 1-3, even if they studied the same mathematics curriculum. Do the authors think that this difference in attainment could have led to the differences between groups in the current study?

21. Line 617 – “Intensive training has been shown to impact the brain and cognition across a number of domains from music, to video gaming, to Braille reading [31].”

Reference 31 appears to only relate to music. Please add references for video gaming and Braille reading.

22. I recommend editing the figures from SPSS’s default style or re-making them in Excel or DataGraph to look more attractive.

23. I cannot find the associated datafile anywhere in the submission. Apologies if this is my mistake.

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

20 Apr 2020

All responses are detailed against the specific reviewers' comments in the Response to Reviewers document

Submitted filename: Response to Reviewers.docx

Decision Letter 1

11 Jun 2020

PONE-D-20-01159R1

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors.

Dear Dr. Speelman,

Thank you for submitting your revised manuscript to PLOS ONE. I have sent it to reviewer #2 and have now received the reviewer's comments. As you can see, the reviewer thinks that the manuscript is improved but has some outstanding issues that you would need to address in another round of revision. I notably agree with the reviewer that you should provide the raw data, allowing readers to replicate your analyses. Therefore, I invite you to submit a revised version of your manuscript.

Please submit your revised manuscript by Jul 26 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Reviewer #2: The manuscript has improved but there are still a few issues that should be resolved prior to publication.

1. On lines 96, 97, 100 and 102, the references to “general population” should be changed to reflect the fact that these participants were non-mathematics (arts) students.

2. Line 306 – change “mathematics students” to “university students”.

3. The method section doesn’t specify the gender split and mean age of the sample.

4. Table 3 – the p values listed as .000 should be changed to <.001.
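As an aside, the reporting convention the reviewer asks for can be captured in a one-line rule; a minimal Python sketch (the helper name is illustrative, not part of the manuscript):

```python
def format_p(p: float) -> str:
    """Report p values APA-style: '<.001' instead of '.000', no leading zero."""
    if p < 0.001:
        return "<.001"
    # Round to three decimals and drop the leading zero.
    return f"{p:.3f}".lstrip("0")

print(format_p(0.0004))  # <.001
print(format_p(0.0321))  # .032
```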

5. Table 3 - I suggest repeating the list of problem numbers and names in the legend. It may make for a long legend but would make it much easier for the reader to interpret the table.

6. I am not sure what the new post hoc tests are comparing. What I expected was to see group 1 compared to groups 2, 3, 4 and 5, and so on. This would tell us which groups are statistically different from each other. At the moment we only know from the overall chi square tests whether there are any differences among the groups or not, we don’t know specifically which groups are statistically different from each other and which ones are not. We only have the authors’ interpretations based on the observed counts.

7. Line 584 - change “performance was correlated with training” to “performance was related to training” to avoid any confusion since a correlation analysis was not performed.

8. Data file – I had expected the data file to give the raw data rather than summary data, i.e. with each participant in a separate row, and a column indicating their group membership, a column giving their age, a column for sex etc (including all the demographics mentioned in the method), and a column for each reasoning question. This would allow other researchers to replicate the regression analyses and look at other relationships within the dataset. Without being able to replicate all analyses in the paper, the data file does not meet the minimal data set definition for publication in PLOS journals: https://journals.plos.org/plosone/s/data-availability .
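The raw-data layout the reviewer describes – one participant per row, one column per variable – can be illustrated with Python's standard csv module. The column names and values below are hypothetical placeholders, not the study's actual variables:

```python
import csv
import io

# One row per participant; columns for group membership, demographics,
# and each reasoning question (illustrative names only).
fieldnames = ["participant", "group", "age", "sex", "q1", "q2", "q3"]
rows = [
    {"participant": 1, "group": 1, "age": 19, "sex": "F", "q1": 1, "q2": 0, "q3": 1},
    {"participant": 2, "group": 4, "age": 45, "sex": "M", "q1": 1, "q2": 1, "q3": 1},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=fieldnames)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

A file in this shape lets other researchers rerun the regression analyses and explore further relationships, which is what the PLOS minimal data set definition requires.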

Author response to Decision Letter 1

16 Jun 2020

Please see "Response to Reviewers" document

Decision Letter 2

PONE-D-20-01159R2

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org .

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible – no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org .

Additional Editor Comments (optional):

Acceptance letter

Dear Dr. Speelman:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org .

If we can help with anything else, please email us at plosone@plos.org .

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Dr. Jérôme Prado

Wonder Math

How to Improve Problem-Solving Skills: Mathematics and Critical Thinking

In today’s rapidly changing world, problem-solving has become a quintessential skill. When we discuss the topic, it’s natural to ask, “What is problem-solving?” and “How can we enhance this skill, particularly in children?” The discipline of mathematics offers a rich platform to explore these questions. Through math, not only do we delve into numbers and equations, but we also explore how to improve problem-solving skills and how to develop critical thinking skills in math. Let’s embark on this enlightening journey together.

What is Problem-Solving?

At its core, problem-solving involves identifying a challenge and finding a solution. But it’s not always as straightforward as it sounds. So, what is problem-solving? True problem-solving requires a combination of creative thinking and logical reasoning. Mathematics, in many ways, embodies this blend. When a student approaches a math problem, they must discern the issue at hand, consider various methods to tackle it, and then systematically execute their chosen strategy.

But what is problem-solving in a broader context? It’s a life skill. Whether we’re deciding the best route to a destination, determining how to save for a big purchase, or even figuring out how to fix a broken appliance, we’re using problem-solving.

How to Develop Critical Thinking Skills in Math

Critical thinking goes hand in hand with problem-solving. But exactly how to develop critical thinking skills in math might not be immediately obvious. Here are a few strategies:

  • Contextual Learning: Teaching math within a story or real-life scenario makes it relevant. When students see math as a tool to navigate the world around them, they naturally begin to think critically about solutions.
  • Open-ended Questions: Instead of merely seeking the “right” answer, encourage students to explain their thought processes. This nudges them to think deeply about their approach.
  • Group Discussions: Collaborative learning can foster different perspectives, prompting students to consider multiple ways to solve a problem.
  • Challenging Problems: Occasionally introducing problems that are a bit beyond a student’s current skill level can stimulate critical thinking. They will have to stretch their understanding and think outside the box.

What are the Six Basic Steps of the Problem-Solving Process?

Understanding how to improve problem-solving skills often comes down to familiarizing oneself with the systematic approach to challenges. So, what are the six basic steps of the problem-solving process?

  • Identification: Recognize and define the problem.
  • Analysis: Understand the problem’s intricacies and nuances.
  • Generation of Alternatives: Think of different ways to approach the challenge.
  • Decision Making: Choose the most suitable method to address the problem.
  • Implementation: Put the chosen solution into action.
  • Evaluation: Reflect on the solution’s effectiveness and learn from the outcome.
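The six steps above can be traced in a small worked example. The following Python sketch applies them to a toy savings problem (all names and numbers are illustrative, not a prescribed curriculum exercise):

```python
# 1. Identification: which savings plan reaches a target of $120 fastest?
target = 120
plans = {"weekly_5": 5.0, "weekly_8": 8.0, "monthly_30": 30 / 4}  # dollars per week

# 2. Analysis: weeks needed = target / amount saved per week.
def weeks_needed(per_week: float, target: float) -> float:
    return target / per_week

# 3. Generation of alternatives: evaluate every plan.
alternatives = {name: weeks_needed(rate, target) for name, rate in plans.items()}

# 4. Decision making: choose the plan with the fewest weeks.
best = min(alternatives, key=alternatives.get)

# 5. Implementation: commit to the chosen plan.
chosen_weeks = alternatives[best]

# 6. Evaluation: confirm the outcome actually meets the goal.
assert chosen_weeks == min(alternatives.values())
print(best, chosen_weeks)  # weekly_8 15.0
```

The point is not the code itself but the discipline: each step is explicit, so a learner can see exactly where a stumble occurred and refine that step alone.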

By embedding these steps into mathematical education, we provide students with a structured framework. When they wonder how to improve problem-solving skills or how to develop critical thinking skills in math, they can return to this process, refining their approach with each new challenge.

Making Math Fun and Relevant

At Wonder Math, we believe that the key to developing robust problem-solving skills lies in making math enjoyable and pertinent. When students see math not just as numbers on a page but as a captivating story or a real-world problem to be solved, their engagement skyrockets. And with heightened engagement comes enhanced understanding.

As educators and parents, it’s crucial to continuously ask ourselves: how can we demonstrate to our children what problem-solving is? How can we best teach them how to develop critical thinking skills in math? And how can we instill in them an understanding of the six basic steps of the problem-solving process?

The answer, we believe, lies in active learning, contextual teaching, and a genuine passion for the beauty of mathematics.

The Underlying Beauty of Mathematics

Often, people perceive mathematics as a rigid discipline confined to numbers and formulas. However, this is a limited view. Math, in essence, is a language that describes patterns, relationships, and structures. It’s a medium through which we can communicate complex ideas, describe our universe, and solve intricate problems. Understanding this deeper beauty of math can further emphasize how to develop critical thinking skills in math.

Why Mathematics is the Ideal Playground for Problem-Solving

Math provides endless opportunities for problem-solving. From basic arithmetic puzzles to advanced calculus challenges, every math problem offers a chance to hone our problem-solving skills. But why is mathematics so effective in this regard?

  • Structured Challenges: Mathematics presents problems in a structured manner, allowing learners to systematically break them down. This format mimics real-world scenarios where understanding the structure of a challenge can be half the battle.
  • Multiple Approaches: Most math problems can be approached in various ways. This teaches learners flexibility in thinking and the ability to view a single issue from multiple angles.
  • Immediate Feedback: Unlike many real-world problems where solutions might take time to show results, in math, students often get immediate feedback. They can quickly gauge if their approach works or if they need to rethink their strategy.

Enhancing the Learning Environment

To genuinely harness the power of mathematics in developing problem-solving skills, the learning environment plays a crucial role. A student who is afraid of making mistakes will hesitate to try out different approaches, stunting their critical thinking growth.

However, in a nurturing, supportive environment where mistakes are seen as learning opportunities, students thrive. They become more willing to take risks, try unconventional solutions, and learn from missteps. This mindset, where failure is not feared but embraced as a part of the learning journey, is pivotal for developing robust problem-solving skills.

Incorporating Technology

In our digital age, technology offers innovative ways to explore math. Interactive apps and online platforms can provide dynamic problem-solving scenarios, making the process even more engaging. These tools can simulate real-world challenges, allowing students to apply their math skills in diverse contexts, further answering the question of how to improve problem-solving skills.

More than Numbers 

In summary, mathematics is more than just numbers and formulas—it’s a world filled with challenges, patterns, and beauty. By understanding its depth and leveraging its structured nature, we can provide learners with the perfect platform to develop critical thinking and problem-solving skills. The key lies in blending traditional techniques with modern tools, creating a holistic learning environment that fosters growth, curiosity, and a lifelong love for learning.

Join us on this transformative journey at Wonder Math. Let’s make math an adventure, teaching our children not just numbers and equations, but also how to improve problem-solving skills and navigate the world with confidence. Enroll your child today and witness the magic of mathematics unfold before your eyes!

FAQ: Mathematics and Critical Thinking

1. What is problem-solving in the context of mathematics?

Problem-solving in mathematics refers to the process of identifying a mathematical challenge and systematically working through methods and strategies to find a solution.

2. Why is math considered a good avenue for developing problem-solving skills?

Mathematics provides structured challenges and allows for multiple approaches to find solutions. This promotes flexibility in thinking and encourages learners to view problems from various angles.

3. How does contextual learning enhance problem-solving abilities?

Teaching math within a story or real-life scenario makes it more relevant for the learner. This helps them see math as a tool to navigate real-world challenges, thereby promoting critical thinking.

4. What are the six basic steps of the problem-solving process in math?

The six steps are: Identification, Analysis, Generation of Alternatives, Decision Making, Implementation, and Evaluation.

5. How can parents support their children in developing mathematical problem-solving skills?

Parents can provide real-life contexts for math problems, encourage open discussions about different methods, and ensure a supportive environment where mistakes are seen as learning opportunities.

6. Are there any tools or apps that can help in enhancing problem-solving skills in math?

Yes, there are various interactive apps and online platforms designed specifically for math learning. These tools provide dynamic problem-solving scenarios and simulate real-world challenges, making the learning process engaging.

7. How does group discussion foster critical thinking in math?

Group discussions allow students to hear different perspectives and approaches to a problem. This can challenge their own understanding and push them to think about alternative methods.

8. Is it necessary to always follow the six steps of the problem-solving process sequentially?

While the six steps provide a structured approach, real-life problem-solving can sometimes be more fluid. It’s beneficial to know the steps, but adaptability and responsiveness to the situation are also crucial.

9. How does Wonder Math incorporate active learning in teaching mathematics?

Wonder Math integrates mathematics within engaging stories and real-world scenarios, making it fun and relevant. This active learning approach ensures that students are not just passive recipients but active participants in the learning process.

10. What if my child finds a math problem too challenging and becomes demotivated?

It’s essential to create a supportive environment where challenges are seen as growth opportunities. Remind them that every problem is a chance to learn, and it’s okay to seek help or approach it differently.


5 Ways to Stop Thinking for Your Students

Too often math students lean on teachers to think for them, but there are some simple ways to guide them to think for themselves.

Who is doing the thinking in your classroom? If you asked me that question a few years ago, I would have replied, “My kids are doing the thinking, of course!” But I was wrong. As I reflect back on my teaching style before I read Building Thinking Classrooms by Peter Liljedahl (an era in my career I like to call “pre-thinking classroom”), I now see that I was encouraging my students to mimic rather than think.

My lessons followed a formula that I knew from my own school experience as a student and what I had learned in college as a pre-service teacher. It looked like this: Students faced me stationed at the board; I demonstrated a few problems while students copied what I wrote in their notes. I would throw out a few questions to the class to assess understanding. If a few kids answered correctly, I felt confident that the lesson had gone well. Some educators might call this “I do, we do, you do.”

What’s wrong with this formula? When it was time for them to work independently, which usually meant a homework assignment because I used most of class time for direct instruction, the students would come back to class and say, “The homework was so hard. I don’t get it. Can you go over questions 1–20?” Exhausted and frustrated, I would wonder, “But I taught it—why didn’t they get it?”

Now in the “peri-thinking classroom” era of my career, my students are often working at the whiteboards in random groups as outlined in Liljedahl’s book. The pendulum has shifted from the teacher doing the thinking to the students doing the thinking. Do they still say, “I don’t get it!”? Yes, of course! But I use the following strategies to put the thinking back onto them.

5 Ways to Get Your Students to Think

1. Answer questions with a refocus on the students’ point of view. Liljedahl found in his research that students ask three types of questions: “(1) proximity questions—asked when the teacher is close; (2) stop thinking questions—most often of the form ‘is this right’ or ‘will this be on the test’; and (3) keep thinking questions—questions that students ask so they can get back to work.” He suggests that teachers acknowledge “proximity” and “stop thinking questions” but not answer them.

Try these responses to questions that students ask to keep working:

  • “What have you done so far?” 
  • “Where did you get that number?” 
  • “What information is given in the problem?” 
  • “Does that number seem reasonable in this situation?”  

2. Don’t carry a pencil or marker. This is a hard rule to follow; however, if you hold the writing utensil, you’ll be tempted to write for them. Use verbal nudges and hints, but avoid writing out an explanation. If you need to refer to a visual, find a group that has worked out the problem, and point out their steps. Hearing and viewing other students’ work is more powerful.

3. “We” instead of “I.” When I assign a handful of problems for groups to work on at the whiteboards, they are tempted to divvy up the task. “You do #30, and I’ll do #31.” This becomes an issue when they get stuck. I inevitably hear, “Can you help me with #30? I forgot how to start.”

I now require questions to use “we” instead of “I.” This works wonders. As soon as they start to ask a question with “I,” they pause and ask their group mates. Then they can legitimately say, “We tried #30, and we are stumped.” But, in reality, once they loop in their group mates, the struggling student becomes unstuck, and everyone in the group has to engage with the problem.

4. Stall your answer. If I hear a basic computation question such as, “What is 3 divided by 5?” I act like I am busy helping another student: “Hold on, I need to help Marisela. I’ll be right back.” By the time I return to them, they are way past their question. They will ask a classmate, work it out, or look it up. If the teacher is not available to think for them, they learn to find alternative resources.

5. Set boundaries. As mentioned before, students ask “proximity” questions because I am close to them. I might reply with “Are you asking me a thinking question? I’m glad to give you a hint or nudge, but I cannot take away your opportunity to think.” This type of response acknowledges that you are there to help them but not to do their thinking for them.

When you set boundaries of what questions will be answered, the students begin to more carefully craft their questions. At this point of the year, I am starting to hear questions such as, “We have tried solving this system by substitution, but we are getting an unreasonable solution. Can you look at our steps?” Yes!

Shifting the focus to students doing the thinking not only enhances their learning but can also have the effect of less frustration and fatigue for the teacher. As the class becomes student-centered, the teacher role shifts to guide or facilitator and away from “sage on the stage.”

As another added benefit, when you serve as guide or facilitator, the students are getting differentiated instruction and assessment. Maybe only a few students need assistance with adding fractions, while a few students need assistance on an entirely different concept. At first, you might feel like your head is spinning trying to address so many different requests; however, as you carefully sift through the types of questions you hear, you will soon be comfortable only answering the “keep thinking” questions.

Engaging Maths

Dr Catherine Attard

Promoting Creative and Critical Thinking in Mathematics and Numeracy

  • by cattard2017
  • Posted on June 25, 2017

What is critical and creative thinking, and why is it so important in mathematics and numeracy education?

Numeracy is often defined as the ability to apply mathematics in the context of day to day life. However, the term ‘critical numeracy’ implies much more. One of the most basic reasons for learning mathematics is to be able to apply mathematical skills and knowledge to solve both simple and complex problems, and, more than just allowing us to navigate our lives through a mathematical lens, being numerate allows us to make our world a better place.

The mathematics curriculum in Australia provides teachers with the perfect opportunity to teach mathematics through critical and creative thinking. In fact, it’s mandated. Consider the core processes of the curriculum. The Australian Curriculum (ACARA, 2017) requires teachers to address four proficiencies: Problem Solving, Reasoning, Fluency, and Understanding. Problem solving and reasoning require critical and creative thinking. This requirement is emphasised more heavily in New South Wales through the graphical representation of the mathematics syllabus content, which strategically places Working Mathematically (the proficiencies in NSW) and problem solving at its core. Alongside the mathematics curriculum, we also have the General Capabilities, one of which is Critical and Creative Thinking – there’s no excuse!

Critical and creative thinking need to be embedded in every mathematics lesson. Why? When we embed critical and creative thinking, we transform learning from disjointed memorisation of facts to sense-making mathematics. Learning becomes more meaningful and purposeful for students.

How and when do we embed critical and creative thinking?

There are many tools and many methods of promoting thinking. Using a range of problem-solving activities is a good place to start, but you might also want to use some shorter activities and some extended activities. Open-ended tasks are easy to implement, allow all learners the opportunity to achieve success, and allow for critical thinking and creativity. Tools such as Bloom’s Taxonomy and Thinkers Keys are also very worthwhile. For good mathematical problems, go to the nrich website. For more extended mathematical investigations and a wonderful array of rich tasks, my favourite resource is Maths300 (this is subscription based, but well worth the money). All of the above activities can be used in class and/or for homework, as lesson starters or within the body of a lesson.

Will critical and creative thinking take time away from teaching basic concepts?

No. We need to teach mathematics in a way that has meaning and relevance, rather than through isolated topics – teaching through problem-solving rather than for problem-solving. A classroom that promotes critical and creative thinking provides opportunities for:

  • higher-level thinking within authentic and meaningful contexts;
  • complex problem solving;
  • open-ended responses; and
  • substantive dialogue and interaction.

Who should be engaging in critical and creative thinking?

Is it just for students? No! There are lots of reasons that teachers should be engaged with critical and creative thinking. First, it’s important that we model this type of thinking for our students. Often students see mathematics as black or white, right or wrong. They need to learn to question, to be critical, and to be creative. They need to feel they have permission to engage in exploration and investigation. They need to move from consumers to producers of mathematics.

Secondly, teachers need to think critically and creatively about their practice as teachers of mathematics. We need to be reflective practitioners who constantly evaluate our work, questioning curriculum and practice, including assessment, student grouping, the use of technology, and our beliefs of how children best learn mathematics.

Critical and creative thinking is something we cannot ignore if we want our students to be prepared for a workforce and world that is constantly changing. Not only does it equip them for the future, it promotes higher levels of student engagement, and makes mathematics more relevant and meaningful.

How will you and your students engage in critical and creative thinking?



Open Access

Peer-reviewed

Research Article

Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors

Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Writing – original draft, Writing – review & editing

Affiliation School of Mathematics and Statistics, The University of Sydney, Sydney, Australia

Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Writing – original draft, Writing – review & editing

* E-mail: [email protected]

Affiliation School of Arts and Humanities, Edith Cowan University, Joondalup, Australia

  • Clio Cresswell, 
  • Craig P. Speelman

  • Published: July 29, 2020
  • https://doi.org/10.1371/journal.pone.0236153

Mathematics is often promoted as endowing those who study it with transferable skills such as an ability to think logically and critically or to have improved investigative skills, resourcefulness and creativity in problem solving. However, there is scant evidence to back up such claims. This project tested participants with increasing levels of mathematics training on 11 well-studied rational and logical reasoning tasks aggregated from various psychological studies. These tasks, which included the Cognitive Reflection Test and the Wason Selection Task, are of particular interest as they have typically and reliably eluded participants in all studies, and results have been uncorrelated with general intelligence, education levels and other demographic information. The results in this study revealed that, in general, the greater the mathematics training of the participant, the more tasks were completed correctly, and that performance on some tasks was also associated with performance on others not traditionally associated. A ceiling effect also emerged. The work is deconstructed from the viewpoint of adding to the platform from which to approach the greater, and more scientifically elusive, question: are any skills associated with mathematics training innate, or do they arise from skills transfer?

Citation: Cresswell C, Speelman CP (2020) Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors. PLoS ONE 15(7): e0236153. https://doi.org/10.1371/journal.pone.0236153

Editor: Jérôme Prado, French National Center for Scientific Research (CNRS) & University of Lyon, FRANCE

Received: January 13, 2020; Accepted: June 30, 2020; Published: July 29, 2020

Copyright: © 2020 Cresswell, Speelman. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the paper and its Supporting Information files.

Funding: The authors received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Mathematics is often promoted as endowing those who study it with a number of broad thinking skills, such as an ability to think logically, analytically, critically and abstractly, and a capacity to weigh evidence with impartiality. This view of mathematics as providing transferable skills can be found across educational institutions, governments and corporations worldwide, and it is material to the place of mathematics in curricula.

Consider the UK government’s commissioned inquiry into mathematics education, “Making Mathematics Count”, which asserts that “mathematical training disciplines the mind, develops logical and critical reasoning, and develops analytical and problem-solving skills to a high degree” [ 1 p11]. The Australian Mathematical Sciences Institute very broadly states in its policy document “Vision for a Maths Nation” that “Not only is mathematics the enabling discipline, it has a vital productive role planning and protecting our well-being” (emphasis in original) [ 2 ]. In Canada, British Columbia’s new 2016 K-9 curriculum expressly mentions as part of its “Goals and Rationale”: “The Mathematics program of study is designed to develop deep mathematical understanding and fluency, logical reasoning, analytical thought, and creative thinking.” [ 3 ]. Universities, too, often make such specific claims with respect to their teaching programs. “Mathematics and statistics will help you to think logically and clearly, and apply a range of problem-solving strategies” is claimed by The School of Mathematical Sciences at Monash University, Australia [ 4 ]. The School of Mathematics and Statistics at The University of Sydney, Australia, directly attributes as part of particular course objectives and outcomes skills that include “enhance your problem-solving skills” as part of studies in first year [ 5 ], “develop logical thinking” as part of studies in second year (a statement in fact drafted by the lead author) [ 6 ], and “be fluent in analysing and constructing logical arguments” as part of studies in third year [ 7 ]. The University of Cambridge’s Faculty of Mathematics, UK, provides a dedicated document “Transferable Skills in the Mathematical Tripos” as part of its undergraduate mathematics course information, which again lists “analytic ability; creativity; initiative; logical and methodical reasoning; persistence” [ 8 ].

In contrast, psychological research, which has been empirically investigating the concept of transferability of skills since the early 1900s, points quite oppositely to reasoning skills as being highly domain specific [ 9 ]. Therefore, support for claims that studying mathematics engenders more than specific mathematics knowledge is highly pertinent. And yet it is largely absent. The 2014 Centre for Curriculum Redesign (CCR) four-part paper “Mathematics for the 21st Century: What Should Students Learn?” concludes in its fourth paper, titled “Does mathematics education enhance higher-order thinking skills?”, with a call to action: “… there is not sufficient evidence to conclude that mathematics enhances higher order cognitive functions. The CCR calls for a much stronger cognitive psychology and neuroscience research base to be developed on the effects of studying mathematics” [ 10 ].

Inglis and Simpson [ 11 ], bringing up this very issue, examined the performance of first-year undergraduate students from a high-ranking UK university mathematics department on the “Four Cards Problem” thinking task, also known as the Wason Selection Task. It is stated as follows.

Each of the following cards has a letter on one side and a number on the other.

[Image: the four cards presented in the task; in the standard version of the task the visible faces are D, K, 3 and 7.]

Here is a rule: “if a card has a D on one side, then it has a 3 on the other”. Your task is to select all those cards, but only those cards, which you would have to turn over in order to find out whether the rule is true or false. Which cards would you select?

This task involves understanding conditional inference, namely understanding the rule “If P then Q” and with this, deducing the answer as “P and not Q” or “D and 7”. Such logical deduction indeed presents as a good candidate to test for a potential ability of the mathematically trained. This task has also been substantially investigated in the domain of the psychology of reasoning [ 12 p8], revealing across a wide range of publications that only around 10% of the general population reach the correct result. The predominant mistake is to pick “D and 3”; in the original study by Wason [ 13 ] it is suggested that this was picked by 65% of people. This poor success rate, along with a standard mistake, has fuelled interest in the task as well as attempts to understand why it occurs. A prevailing theory is the so-named matching bias effect: the effect of disproportionately concentrating on items specifically mentioned in the situation, as opposed to reasoning according to logical rules.
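The deduction “P and not Q” can be made mechanical: a card needs turning over exactly when its hidden side could falsify the rule. The following minimal sketch assumes the conventional card faces D, K, 3 and 7:

```python
# A card must be turned over exactly when its hidden side could falsify the
# rule "if a card has a D on one side, then it has a 3 on the other".
# The faces D, K, 3, 7 are those of the conventional version of the task.

def must_turn(visible_face):
    """True if this card's hidden side could falsify the rule."""
    if visible_face == "D":  # hidden side might not be 3
        return True
    if visible_face.isdigit() and visible_face != "3":  # hidden side might be D
        return True
    return False  # K (rule says nothing) and 3 (rule is one-directional) are safe

cards = ["D", "K", "3", "7"]
print([card for card in cards if must_turn(card)])  # -> ['D', '7']
```

Note that the card showing 3 is safe: the rule does not claim that a 3 must have a D behind it, which is precisely the asymmetry that the matching bias obscures.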

Inglis and Simpson’s results isolated mathematically trained individuals with respect to this task. The participants were under time constraint and 13% of the first-year undergraduate mathematics students sampled reached the correct response, compared to 4% of the non-mathematics (arts) students who were included. Of note also was the 24% of mathematics students, as opposed to 45% of the non-mathematics students, who chose the standard mistake. The study indeed unveiled that mathematically trained individuals were significantly less affected by the matching bias effect with this problem than the individuals without mathematics training. However, the achievement of the mathematically trained group was still far from masterful, and the preponderance of non-standard mistakes compared with non-mathematically trained people is suggestive. Mathematical training appears to engender a different thinking style, but it remains unclear what the difference is.

Inglis, Simpson and colleagues proceeded to follow up their results with a number of studies concentrated on conditional inference in general [ 14 , 15 ]. A justification for this single investigatory pathway is that if transfer of knowledge is present, something subtle to test for in the first place, a key consideration should be the generalisation of learning rather than the application of skills learned in one context to another (where experimenter bias in the choice of contexts is more likely to be an issue). For this they typically used sixteen “if P then Q” comprehension tasks, where their samples across a number of studies have included 16-year-old pre-university mathematics students (from England and Cyprus), mathematics honours students in their first year of undergraduate university study, third year university mathematics students, and associated control groups. The studies have encompassed controls for general intelligence and thinking disposition prior to training, as well as follow-ups of up to two years to address the issue of causation. The conclusive thinking pattern that has emerged is a tendency of the mathematical groups towards a greater likelihood of rejecting the invalid denial of the antecedent and affirmation of the consequent inferences. But with this, and this was validated by a second separate study, the English mathematics group actually became less likely to endorse the valid modus tollens inference. So again, mathematical training appears to engender a different thinking style, but there are subtleties and it remains unclear what the exact difference is.

This project was designed to broaden the search on the notion that mathematics training leads to increased reasoning skills. We focused on a range of reasoning problems considered in psychological research to be particularly insightful into decision making, critical thinking and logical deduction, distinguished by the fact that the general population generally struggles to answer them correctly. An Australian sample adds diversity to the current enquiries, which have been European focussed. Furthermore, in an effort to identify the impact of mathematics training through a possible gradation effect, individuals with different levels of mathematics training were tested for performance.

Well-studied thinking tasks from a variety of psychological studies were chosen. Their descriptions, associated success rates and other pertinent details follow. They were all chosen because the correct answer typically eludes participants in favour of a standard mistake.

The three-item Cognitive Reflection Test (CRT) was used as introduced by Frederick [ 16 ]. This test was devised in line with the theory that there are two general types of cognitive activity: one that operates quickly and without reflection, and another that requires not only conscious thought and effort but also an ability to reflect on one’s own cognition, including a step of suppressing the first type in order to reach the correct answer. The three items in the test invite an incorrect “gut” response, and further cognitive skill is deemed required to reach the correct answer (although see [ 17 ] for evidence that correct responses can result from “intuition”, which could be related to intelligence [ 18 ]).

Lily pads

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Widgets

If it takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?

Bat and ball

A bat and a ball cost $1.10 in total. The bat costs a dollar more than the ball. How much does the ball cost?

The solutions are: 47 days for the Lily Pads problem, 5 minutes for the Widgets problem and 5 cents for the Bat and Ball problem. The considered intuitive, but wrong, answers are 24 days, 100 minutes and 10 cents, respectively. These wrong answers are attributed to participants becoming overly focused on the numbers, so as to ignore the exponential growth pattern in the Lily Pads problem, merely complete a pattern in numbers in the Widgets problem, and neglect the relationship “more than” in the Bat and Ball problem [ 19 ]. The original study by Frederick [ 16 ] provides a composite measure of the performance on these three items, with only 17% of those studied (n = 3428) reaching the perfect score. The CRT has since been studied extensively [ 19 – 21 ]. Research using the CRT tends not to report performance on the individual items of the test, but rather a composite measure of performance. Attridge and Inglis [ 22 ] used the CRT as a test for thinking disposition of mathematics students as one way to attempt to disentangle the issue of filtering according to prior thinking styles rather than transference of knowledge in successful problem solving. They repeat-tested 16-year-old pre-university mathematics students and English literature students without mathematics subjects at a one-year interval and found no difference between groups.
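The three solutions can be verified with elementary arithmetic; the short sketch below mirrors the reasoning just described.

```python
# Elementary checks of the three CRT solutions.

# Lily Pads: the patch doubles daily and fills the lake on day 48, so it
# covered half the lake exactly one doubling earlier.
lily_pads_days = 48 - 1

# Widgets: 5 machines make 5 widgets in 5 minutes, i.e. one machine makes
# one widget in 5 minutes, so 100 machines make 100 widgets in 5 minutes.
widgets_minutes = 5

# Bat and Ball: ball + (ball + 1.00) = 1.10, hence ball = 0.05 dollars.
ball_cost = (1.10 - 1.00) / 2

print(lily_pads_days, widgets_minutes, round(ball_cost, 2))  # -> 47 5 0.05
```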

Three problems were included that test the ability to reason about probability. All three problems were originally discussed by Kahneman and Tversky [ 23 ], with the typically poor performance on these problems explained by participants relying not on probability knowledge, but a short-cut method of thinking known as the representativeness heuristic. In the late 1980s, Richard Nisbett and colleagues showed that graduate-level training in statistics, while not revealing any improvement in logical reasoning, did correlate with higher-quality statistical answers [ 24 ]. Their studies led in particular to the conclusion that comprehension of what is known as the law of large numbers did show improvement with training. The first of our next three problems targeted this law directly.

Hospitals

A certain town is served by two hospitals. In the larger hospital about 45 babies are born each day, and in the smaller hospital about 15 babies are born each day. As you know, about 50 percent of all babies are boys. However, the exact percentage varies from day to day. For a period of one year, each hospital recorded the days on which more than 60 percent of the babies born were boys. Which hospital do you think recorded more such days?

  • (a). the larger hospital
  • (b). the smaller hospital
  • (c). about the same (that is, within 5 percent of each other)

Kahneman and Tversky [ 23 ] reported that, of 50 participants, 12 chose (a), 10 chose (b), and 28 chose (c). The correct answer is (b), for the reason that small samples are more likely to exhibit extreme events than large samples from the same population. The larger the sample, the more likely it will exhibit characteristics of the parent population, such as the proportion of boys to girls. However, people tend to discount or be unaware of this feature of sampling statistics, which Kahneman and Tversky refer to as the law of large numbers. Instead, according to Kahneman and Tversky, people tend to adhere to a fallacious law of small numbers, where even small samples are expected to exhibit properties of the parent population, as illustrated by the proportion of participants choosing the answer (c) in their 1972 study. Such thinking reflects use of the representativeness heuristic, whereby someone will judge the likelihood of an uncertain event based on how similar it is to characteristics of the parent population of events.
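The sampling effect underlying the correct answer is easy to demonstrate by simulation. The sketch below uses the figures from Kahneman and Tversky’s original formulation (an assumption here, as the full problem text is not reproduced above): about 45 births per day at the larger hospital, about 15 at the smaller, and a day counted as “extreme” when more than 60 percent of the babies born are boys.

```python
# Smaller samples fluctuate more: the smaller hospital records more days
# on which the proportion of boys exceeds 60%. Figures (45 and 15 births
# per day, 60% threshold) follow Kahneman and Tversky's original problem.
import random

random.seed(0)

def extreme_days_per_year(births_per_day, days=365, trials=50):
    """Average number of days per year on which > 60% of births are boys."""
    total = 0
    for _ in range(trials):
        for _ in range(days):
            boys = sum(random.random() < 0.5 for _ in range(births_per_day))
            if boys / births_per_day > 0.6:
                total += 1
    return total / trials

large = extreme_days_per_year(45)
small = extreme_days_per_year(15)
print(small > large)  # -> True: the smaller hospital records more extreme days
```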

Birth order

All families of six children in a city were surveyed. In 72 families the exact order of births of boys (B) and girls (G) was GBGBBG.

  • (a). What is your estimate of the number of families surveyed in which the exact order of births was BGBBBB?
  • (b). In the same survey set, which, if any, of the following two sequences would be more likely: BBBGGG or GBBGBG?

All of the events listed in the problem have an equal probability, so the correct answer to (a) is 72, and to (b) is “neither is more likely”. Kahneman and Tversky [ 23 ] reported that 75 of 92 participants judged the sequence in (a) as less likely than the given sequence. A similar number (unspecified by Kahneman and Tversky, but the statistical effect was reported to be of the same order as in (a)) reported that GBBGBG was the more likely sequence. Again, Kahneman and Tversky suggested that these results reflected use of the representativeness heuristic. In the context of this problem, the heuristic would have taken the following form: some birth orders appear less patterned than others, and less patterned is to be associated with the randomness of birth order, making them more likely.

Coin tosses

Consider the following sequences of eight tosses of a fair coin. Which, if any, of these ordered sequences of heads (H) and tails (T) is more likely to occur than the others?

  • (a). H T H T H T H T
  • (b). H H H H T T T T
  • (c). T T H H T T H H
  • (d). H T T H T H H T
  • (e). all of the above are equally likely

The correct answer in this problem is (e). Kahneman and Tversky [ 23 ] reported that participants tend to choose less patterned looking sequences (e.g., H T T H T H H T) as more likely than more systematic looking sequences (e.g., H T H T H T H T). This reasoning again reflects the representativeness heuristic.
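The equal likelihood of the four sequences follows from independence: any specific sequence of n fair tosses has probability (1/2)^n, however “patterned” it looks. The same reasoning gives each six-birth sequence in the Birth Order problem probability (1/2)^6.

```python
# Every exact sequence of independent fair coin tosses has probability
# (1/2)^n for n tosses, regardless of how "patterned" it appears.
from fractions import Fraction

def sequence_probability(sequence):
    """Probability of one exact sequence of fair coin tosses."""
    return Fraction(1, 2) ** len(sequence)

sequences = ["HTHTHTHT", "HHHHTTTT", "TTHHTTHH", "HTTHTHHT"]
probabilities = {s: sequence_probability(s) for s in sequences}
print(probabilities)  # every sequence has probability 1/256
```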

Three further questions from the literature were included to test problem solving skill.

Two drivers

  • (a). Driver A would win the race
  • (b). Driver B would win the race
  • (c). the two drivers would arrive at the same time (within a few seconds of one another)

This problem was developed by Pelham and Neter [ 25 ]. The correct answer is (a), which can be determined by calculating driving times for each driver, using time = distance/velocity. Pelham and Neter argue, however, that (c) is intuitively appealing, on the basis that both drivers appear to have the same overall average speed. Pelham and Neter reported that 67% of their sample gave this incorrect response to the problem, and a further 13% selected (b).
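The questionnaire’s actual distances and speeds are not reproduced above, so the numbers in the sketch below are purely illustrative. They show the mechanism: travel time depends on the harmonic, not the arithmetic, mean of the speeds, so a driver who covers half the distance at a slower speed and half at a faster one is beaten by a driver holding the single “average” speed.

```python
# Illustrative figures only (the questionnaire's values are not given here):
# Driver A holds 50 km/h over 100 km; Driver B drives half the distance at
# 40 km/h and half at 60 km/h. Both have an arithmetic-mean speed of 50,
# yet their travel times differ because time = distance / velocity.

def travel_time(legs):
    """Total travel time for a list of (distance_km, speed_kmh) legs."""
    return sum(distance / speed for distance, speed in legs)

driver_a = travel_time([(100, 50)])           # constant 50 km/h over 100 km
driver_b = travel_time([(50, 40), (50, 60)])  # half at 40 km/h, half at 60 km/h

print(driver_a < driver_b)  # -> True: Driver A arrives first
```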

Petrol station

Imagine that you are driving along the road and you notice that your car is running low on petrol. You see two petrol stations next to each other, both advertising their petrol prices. Station A’s price is 65c/litre; Station B’s price is 60c/litre. Station A’s sign also announces: “5c/litre discount for cash!” Station B’s sign announces “5c/litre surcharge for credit cards.” All other factors being equal (for example, cleanliness of the stations, number of cars waiting at each etc), to which station would you choose to go, and why?

This problem was adapted from one described by Galotti [ 26 ], and is inspired by research reported by Thaler [ 27 ]. According to Thaler’s research, most people prefer Station A, even though both stations are offering the same deal: 60c/litre for cash, and 65c/litre for credit. Tversky and Kahneman [ 28 ] explain this preference by invoking the concept of framing effects. In the context of this problem, such an effect would involve viewing the outcomes as changes from some initial point. The initial point frames the problem, and provides a context for viewing the outcome. Thus, depending on the starting point, outcomes in this problem can be viewed as either a gain (in Station A, you gain a discount if you use cash) or a loss (in Station B, you are charged more (a loss) for using credit). Given that people are apparently more concerned about a loss than a gain [ 29 ], the loss associated with Station B makes it the less attractive option, and hence the preference for Station A. The correct answer, though, is that the stations are offering the same deal and so no station should be preferred.
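The equivalence of the two deals is a one-line calculation; the sketch below simply makes the framing explicit.

```python
# Both stations offer identical net prices once the discount or surcharge
# is applied; only the framing (a gain at A, a loss at B) differs.
station_a = {"cash": 65 - 5, "credit": 65}  # 65c/litre with a 5c cash discount
station_b = {"cash": 60, "credit": 60 + 5}  # 60c/litre with a 5c card surcharge

print(station_a == station_b)  # -> True: the two deals are the same
```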

And finally, a question described by Stanovich [ 30 , 31 ] as testing our predisposition for cognitive operations that require the least computational effort.

Jack looking at Anne

Jack is looking at Anne, but Anne is looking at George. Jack is married, but George is not. Is a married person looking at an unmarried person?

  • (a). Yes
  • (b). No
  • (c). Cannot be determined

Stanovich reported that over 80% of people choose the “lazy” answer (c). The correct answer is (a).
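The case analysis behind the correct answer can be made explicit. Assuming the standard form of the problem (Jack, who is married, looks at Anne; Anne looks at George, who is unmarried; Anne’s marital status is unstated), both possibilities for Anne yield the same conclusion:

```python
# Whatever Anne's (unknown) marital status, some married person is looking
# at some unmarried person: if Anne is married she looks at George
# (unmarried); if she is unmarried, Jack (married) looks at her.
looks_at = [("Jack", "Anne"), ("Anne", "George")]
known = {"Jack": True, "George": False}  # True = married

outcomes = set()
for anne_married in (True, False):
    status = {**known, "Anne": anne_married}
    outcomes.add(any(status[a] and not status[b] for a, b in looks_at))

print(outcomes)  # -> {True}: the answer is "yes" in every case
```

The “lazy” answer (c) results from stopping at Anne’s unknown status instead of carrying out this fully disjunctive reasoning.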

The above questions survey, in a clear problem-solving setting, an ability to engage advanced cognitive processing in order to critically evaluate and possibly override initial gut reasoning, an ability to reason about probability within the framework of the law of large numbers and the relationship between randomness and patterning, an ability to isolate salient features of a problem and, with the last question in particular, an ability to map logical relations. It might be hypothesised that, in line with the knowledge base provided and the claims of associated broad and enhanced problem-solving abilities, participants with greater degrees of mathematical training would outperform others on these questions. This hypothesis was investigated in this study. In addition, given that no previous study on this issue has examined the variety of problems used in this study, we also undertook an exploratory analysis to investigate whether there exist any associations between the problems in terms of their likelihood of correct solution. Similarities between problems might indicate which problem solving domains could be susceptible to the effects of mathematics training.

Five groups with increasing levels of mathematics training were sampled:

  • Introductory—First year, second semester, university students with weak high school mathematical results, only enrolled in the current unit as a compulsory component for their chosen degree, a unit not enabling any future mathematical pathway; a typical student may be enrolled in a Biology or Geography major;
  • Standard—First year, second semester, university students with fair to good high school mathematical results, enrolled in the current mathematics unit as a compulsory component for their chosen degree with the possibility of including some further mathematical units in their degree pathway, a typical student may be enrolled in an IT or Computer Science major;
  • Advanced1—First year, second semester, university mathematics students with very strong interest as well as background in mathematics, all higher year mathematical units are included as possible future pathway, a typical student may be enrolled in a Mathematics or Physics major;
  • Advanced2—Second year, second semester, university mathematics students with strong interest as well as background in mathematics, typically a direct follow on from the previously mentioned Advanced1 cohort;
  • Academic—Research academics in the mathematical sciences.

Participants

123 first year university students volunteered during “help on demand” tutorial times containing up to 30 students. These are course allocated times that are supervised yet self-directed by students. This minimised disruption and discouraged coercion. 44 second year university students completed the questionnaire during a weekly one-hour time slot dedicated to putting the latest mathematical concepts to practice with the lecturer (where, by contrast to what occurs in tutorial times, the lecturer does most of the work and all students enrolled are invited). All these university students completed the questionnaire in normal classroom conditions; they were not placed under strict examination conditions. The lead author walked around to prevent discussion and coercion and there was minimal disruption. 30 research academics responded to local advertising and answered the questionnaire in their workplace while supervised.

The questionnaires were voluntary, anonymous and confidential. Participants were free to withdraw from the study at any time and without any penalty. No participant took this option however. The questionnaires gathered demographic information which included age, level of education attained and current qualification pursued, name of last qualification and years since obtaining it, and an option to note current speciality for research academics. Each problem task was placed on a separate page. Participants were not placed under time constraint, but while supervised, were asked to write their start and finish times on the front page of the survey to note approximate completion times. Speed of completion was not incentivised. Participants were not allowed to use calculators. A final “Comments Page” gave the option for feedback including specifically if the participants had previously seen any of the questions. Questionnaires were administered in person and supervised to avoid collusion or consulting of external sources.

The responses were coded four ways: A) correct; B) standard error (the errors discussed above in The Study); C) other error; D) left blank.

The ethical aspects of the study were approved by the Human Research Ethics Committee of the University of Sydney, protocol number [2016/647].

The first analysis examined the total number of correct responses provided by the participants as a function of group. Scores ranged from 1 to 11 out of a total possible of 11 (Problem 6 had 2 parts) ( Fig 1 ). An ANOVA of this data indicated a significant effect of group (F(4, 192) = 20.426, p < .001, partial η² = .299). Pairwise comparisons using Tukey’s HSD test indicated that the Introductory group performed significantly worse than the Advanced1, Advanced2 and Academic groups. There were no significant differences between the Advanced1, Advanced2 and Academic groups.


Error bars are one standard error of the mean.

https://doi.org/10.1371/journal.pone.0236153.g001

Overall solution time, while recorded manually and approximately, was positively correlated with group, such that the more training someone had received, the longer were these solution times (r(180) = 0.247, p = .001). However, as can be seen in Fig 2 , this relationship is not strong.


https://doi.org/10.1371/journal.pone.0236153.g002

A series of chi-squared analyses, and their Bayesian equivalents, were performed on each problem, to determine whether the distribution of response types differed as a function of group. To minimise the number of cells in which expected values in some of these analyses were less than 5, the Standard Error, Other Error and Blank response categories were collapsed into one category (Incorrect Response). For three of the questions, the expected values of some cells did fall below 5, and this was due to most people getting the problem wrong (Four Cards) or most people correctly responding to the problem (Bat and Ball, Coin Tosses). In these cases, the pattern of results was so clear that a statistical analysis was barely required. Significant chi-squared results were examined further with pairwise posthoc comparisons (see Table 1 ).
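The collapsing-and-testing approach can be illustrated with a sketch. The counts below are hypothetical, for illustration only; they are not the study’s data.

```python
# A chi-squared statistic of independence on a group-by-outcome contingency
# table, after collapsing all error categories into a single "incorrect"
# column. All counts are hypothetical.

# rows: five training groups; columns: [correct, incorrect]
table = [
    [10, 50],  # Introductory
    [20, 43],  # Standard
    [25, 35],  # Advanced1
    [17, 27],  # Advanced2
    [11, 19],  # Academic
]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand_total = sum(row_totals)

# chi2 = sum over cells of (observed - expected)^2 / expected,
# where expected = row_total * column_total / grand_total
chi2 = sum(
    (table[i][j] - row_totals[i] * col_totals[j] / grand_total) ** 2
    / (row_totals[i] * col_totals[j] / grand_total)
    for i in range(len(table))
    for j in range(len(table[0]))
)
dof = (len(table) - 1) * (len(table[0]) - 1)
print(round(chi2, 2), "on", dof, "degrees of freedom")
```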


https://doi.org/10.1371/journal.pone.0236153.t001

The four cards problem

The three groups with the least amount of training in mathematics were far less likely than the other groups to give the correct solution (χ²(4) = 31.06, p < .001; BF₁₀ = 45,045) ( Table 1 ). People in the two most advanced groups (Advanced2 and Academic) were more likely to solve the card problem correctly, although it was still less than half of the people in these groups who did so. Further, these people were less likely to give the standard incorrect solution, so that most who were incorrect suggested some more cognitively elaborate answer, such as turning over all cards. The proportion of people in the Advanced2 and Academic groups (39% and 37%) who solved the problem correctly far exceeded the typical proportion observed with this problem (10%). Of note, also, is the relatively high proportion of those in the higher training groups who, when they made an error, did not make the standard error, a similar result to the one reported by Inglis and Simpson [ 11 ].

The cognitive reflection test

In the Lily Pads problem, although most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, it was also the case that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ²(4) = 27.28, p < .001; BF₁₀ = 15,554), with the standard incorrect answer being the next most prevalent response for the two lower ability mathematics groups ( Table 1 ).

Performance on the Widgets problem was similar to performance on the Lily Pads problem in that most people in the Standard, Advanced1, Advanced2 and Academic groups were likely to select the correct solution, but that the less training someone had received in mathematics, the more likely they were to select an incorrect solution (χ²(4) = 23.76, p < .001; BF₁₀ = 516) ( Table 1 ). As with the Lily Pads and Widgets problems, people in the Standard, Advanced1, Advanced2 and Academic groups were highly likely to solve the Bat and Ball problem (χ²(4) = 35.37, p < .001; BF₁₀ = 208,667). Errors were more likely from the least mathematically trained people (Introductory, Standard) than the other groups ( Table 1 ).

To compare performance on the CRT with previously published results, performance on the three problems (Lily Pads, Widgets, Bat and Ball) was combined. The number of people in each condition that solved 0, 1, 2, or 3 problems correctly is presented in Table 2 . The Introductory group were evenly distributed amongst the four categories, with 26% solving all three problems correctly. Around 70% of the rest of the groups solved all 3 problems correctly, which is vastly superior to the 17% reported by Frederick [ 16 ].


https://doi.org/10.1371/journal.pone.0236153.t002

Responses to the Hospitals problem were almost universally split between correct responses and standard errors in the Standard, Advanced1, Advanced2 and Academic groups. Although this pattern of responses was also evident in the Introductory group, this group also exhibited more non-standard errors and non-responses than the other groups. However, the differences between the groups were not significant (χ²(4) = 4.93, p = .295; BF₁₀ = .068) ( Table 1 ). Nonetheless, the performance of all groups exceeds the 20% correct response rate reported by Kahneman and Tversky [ 23 ].

The two versions of the Birth Order problem showed similar results, with correct responses being more likely in the groups with more training (i.e., Advanced1, Advanced2 and Academic), and responses being shared amongst the various categories in the Introductory and Standard groups (version (a): χ²(4) = 24.54, p < .001, BF₁₀ = 1,303; version (b): χ²(4) = 25.77, p < .001, BF₁₀ = 2,970) ( Table 1 ). Nonetheless, performance on both versions of the problem in this study was significantly better than the 82% error rate reported by Kahneman and Tversky [ 23 ].

The Coin Tosses problem was performed well by all groups, with very few people in any condition committing errors. There were no obvious differences between the groups (χ²(4) = 3.70, p = .448; BF₁₀ = .160) ( Table 1 ). Kahneman and Tversky [ 23 ] reported that people tend to make errors on this type of problem by choosing less patterned looking sequences, but they did not report relative proportions of people making errors versus giving correct responses. Clearly the sample in this study did not perform like those in Kahneman and Tversky’s study.

Responses on the Two Drivers problem were clearly distinguished by a high chance of error in the Introductory and Standard groups (over 80%), and a fairly good chance of being correct in the Advanced1, Advanced2 and Academic groups (χ²(4) = 46.16, p < .001; BF₁₀ = 1.32 × 10⁸) ( Table 1 ). Academics were the standout performers on this problem, although over a quarter of this group produced an incorrect response. Thus, the first two groups performed similarly to the participants in the Pelham and Neter [ 25 ] study, 80% of whom gave an incorrect response.

Responses on the Petrol Station problem were marked by good performance by the Academic group (73% providing a correct response), and just over half of each of the other groups correctly solving the problem. This difference was not significant (χ²(4) = 4.68, p = .322; BF₁₀ = .059) ( Table 1 ). Errors were fairly evenly balanced between standard and other, except for the Academic group, who were more likely to provide a creative answer if they made an error. Thaler [ 27 ] reported that most people get this problem wrong. In this study, however, on average, most people got this problem correct, although this average was boosted by the Academic group.

Responses on the Jack looking at Anne problem generally were standard errors, except for the Advanced2 and Academic groups, which were evenly split between standard errors and correct responses (χ²(4) = 18.03, p = .001; BF₁₀ = 46) ( Table 1 ). Thus, apart from these two groups, the error rate in this study was similar to that reported by Stanovich [ 30 ], where 80% of participants were incorrect.

A series of logistic regression analyses were performed in order to examine whether the likelihood of solving a particular problem correctly could be predicted on the basis of whether other problems were solved correctly. Each analysis involved selecting performance (correct or error) on one problem as the outcome variable, and performance on the other problems as predictor variables. Training (amount of training) was also included as a predictor variable in each analysis. A further logistic regression was performed with training as the outcome variable, and performance on all of the problems as predictor variables. The results of these analyses are summarised in Table 3 . There were three multi-variable relationships observed in these analyses, which can be interpreted as the likelihood of solving one problem in each group being associated with solving the others in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. Training also featured in each of these sets, moderating the relationships as per the results presented above for each problem.


https://doi.org/10.1371/journal.pone.0236153.t003

Responses on the final “Comments Page” showed that participants overwhelmingly enjoyed the questions. Any analysis of previous exposure to the tasks proved impossible, as there was little to no alignment in participants’ degree of recall, if any, or even in their perceptions of what exposure entailed. For example, some participants confused being exposed to the particular tasks with being habitually exposed to puzzles, or even mathematics problems, more broadly.

In general, the amount of mathematics training a group had received predicted their performance on the overall set of problems: the greater the training, the more problems were answered correctly, and the slower the recorded response times. There was no obvious difference between the Advanced1, Advanced2 and Academic groups on either of these measures; however, there were clear differences between these three groups and the Introductory and Standard groups, with the former exhibiting clearly superior accuracy. While times were recorded only approximately, so as to avoid adding time pressure as a variable, the fact that the Advanced1, Advanced2 and Academic groups spent more time considering the problems may suggest that a “pause and consider” approach to such problems is characteristic of the advanced groups. This is in line with an eye-movement tracking study of mathematically trained students attempting the Four Cards Problem, in which participants who had not chosen the standard error had spent longer considering the card linked to the matching bias effect [14]. It is important to note, however, that longer response times may reflect cognitive processes other than deliberation [32].

Performance on some problems was associated with performance on other problems: if someone correctly answered a problem in one of these sets, they were also highly likely to correctly answer the other problems in the set. These sets were: (1) Lily Pads, Widgets and Petrol Station; (2) Hospitals, Four Cards and Two Drivers; (3) Birth Order and Coin Tosses. This differs from how these problems have typically been clustered a priori in the research literature: (I) Lily Pads, Widgets and Bat and Ball (CRT); (II) Hospitals and Two Drivers (explained below); (III) Hospitals, Birth Order and Coin Tosses (representativeness heuristic); (IV) Birth Order and Coin Tosses (probability theory). Consideration of these problem groupings follows.

Correctly answering all three problems in (I) entailed not being distracted by particular pieces of information in the problems, so as to stay focused on uncovering the real underlying relationships. The Lily Pads and Widgets problems can mislead if attention is over-focused on the numbers; conversely, the Petrol Station problem can mislead if there is too much focus on the idea of a discount. While the Lily Pads and Widgets problems are traditionally paired with the Bat and Ball problem in the CRT, it may be that performance on the Bat and Ball problem did not appear as part of this set due to an added level of difficulty. With the problems in (I), avoiding being distracted by certain parts of the questions at the expense of others leads almost directly to the correct answer. With the Bat and Ball problem, however, a further step of mathematical reasoning is still required: finding two numbers that sum to one given value while differing by another.
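That extra algebraic step can be made explicit using the problem's well-known figures (bat and ball cost $1.10 together; the bat costs $1.00 more than the ball):

```python
# Bat and Ball: bat + ball = 1.10, and the bat costs 1.00 more than the ball.
# Substituting bat = ball + 1.00 gives: ball + (ball + 1.00) = 1.10.
total = 1.10
difference = 1.00

ball = (total - difference) / 2  # roughly 0.05, not the intuitive 0.10
bat = ball + difference

print(round(ball, 2), round(bat, 2))  # 0.05 1.05
```

The intuitive answer of 0.10 satisfies the sum constraint only if the difference constraint is ignored, which is exactly the substitution step the passage above describes.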

With the problems in (II), it is of interest that the Two Drivers problem was created specifically to be paired with the Hospitals problem to test for motivation in problem solving [23]. Within this framework, more transparent versions of these problems were successfully devised to manipulate difficulty. The Two Drivers problem was amended so that Driver B travelled at exactly 5 mph during the first half of the race and at exactly 95 mph during the last half. The Hospitals problem was amended so that the smaller hospital had “only 2” babies born each day, and so that, over a period of one year, the hospitals recorded the number of days on which all of the babies born were boys. Could the association in (II) point to how participants overcome initially tempting but fictitious mathematical rules, perhaps by reframing the question in simpler terms to see the underlying pattern? The Four Cards Problem also elicited a high number of incorrect answers, but with greater mathematical training the standard incorrect solution was increasingly avoided in favour of more cognitively elaborate ones. Indeed, a gradation effect appeared across the groups, where the standard error of the “D and 3” cards became “D only” (Table 4). Adrian Simpson and Derrick Watson found a comparable result across their two groups [14 p61]. This could again suggest that, having avoided the initial fictitious rule of simply concentrating on items found directly in the question, participants then seek to reframe the question to unearth the logical rule to be deduced. An added level of difficulty with this question may be why participants become trapped in a false answer. The eye-movement tracking study mentioned above supports this interpretation.


https://doi.org/10.1371/journal.pone.0236153.t004

The problems in (III) fit naturally together as part of basic probability theory, a topic participants would have assimilated, or not, as part of various education curricula. While the equal likelihood of all possible outcomes of a coin toss may be culturally assimilated, the same may not be as straightforward for birth gender outcomes, where such assumptions could be swayed by biological hypotheses or folk wisdom [33]. The gradation of the results in terms of mathematical training does not support this possibility.
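The shared principle can be checked directly: under the idealised 50/50 assumption, any specific sequence of six independent outcomes (coin tosses, or birth genders in the Birth Order problem) has the same probability. An illustrative check:

```python
from itertools import product

# All 2^6 equally likely six-outcome sequences (e.g., births coded B/G).
sequences = list(product("BG", repeat=6))
p_each = 0.5 ** 6

# A "patterned" order like BBBGGG is exactly as likely as a "random-looking" one.
assert p_each == 1 / len(sequences)
print(len(sequences), p_each)  # 64 0.015625
```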

The effect of training on performance accuracy was more obvious on some problems than others, and to some extent this was related to the type of problem. For instance, most of the problems on which performance was related to training (Four Cards, the CRT problems [Lily Pads, Widgets, Bat and Ball], Two Drivers, Jack looking at Anne) could be classed as relying on logical and/or critical thinking. The one exception was the Birth Order problem, which is probability related.

In contrast, two of the three problems on which training did not appear to have much impact (Hospitals and Coin Tosses) require domain-specific knowledge. The Hospitals problem requires a degree of knowledge about sampling statistics, a topic of quite distinct flavour with which not all mathematically trained individuals gain familiarity. On the other hand, all groups performing well on the Coin Tosses problem is in line with basic probability being introduced at high school. While the questioning of patterning as negatively correlated with randomness is similar to that appearing in the Birth Order problem, in the Birth Order problem this aspect is arguably more concealed. These results, and problem grouping (III), could point to an area for improvement in teaching: the small gap in knowledge required to go from answering the Coin Tosses problem correctly to achieving similarly on the Birth Order problem could be easily addressed. A more formal introduction to sampling statistics in mathematical training could potentially bridge this gap, and could further be extended towards improvement on the Hospitals problem.
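The sampling-statistics idea behind the Hospitals problem (smaller samples fluctuate more around 50%) can be illustrated with a short simulation; the daily birth counts of 15 and 45 follow the problem's classic formulation, and the rest is an illustrative sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
days = 10_000

# Simulated daily boy counts, with each birth a boy with probability 0.5.
small = rng.binomial(15, 0.5, days)  # smaller hospital: 15 births per day
large = rng.binomial(45, 0.5, days)  # larger hospital: 45 births per day

# Count days on which more than 60% of the births were boys.
small_extreme = int(np.sum(small / 15 > 0.6))
large_extreme = int(np.sum(large / 45 > 0.6))

# The smaller hospital records far more such days: small samples vary more.
print(small_extreme, large_extreme)
```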

The other problem where performance was unrelated to training, the Petrol Station problem, cannot be characterised similarly. It is more of a logical/critical thinking problem, and there remains some suggestion that training may have affected performance, since the Academic group seemed to perform better than the rest of the sample. An alternative interpretation is therefore that this problem should not be isolated, but grouped with the other problems where performance is affected by training.

  • The Introductory group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Introduction to Differentiation, Applications of the Derivative, Antiderivatives, Areas and the Definite Integral), Financial Mathematics, Statistical Analysis. The Introductory group then explored concepts in mathematical modelling with emphasis on the importance of calculus in their first semester of mathematical studies.
  • The Standard group’s mathematics high school syllabus studied prior to first semester course entry covered: Functions, Trigonometric Functions, Calculus (Rates of Change, Integration including the method of substitution, trigonometric identities and inverse trigonometric functions, Areas and Volumes of solids of revolution, some differential equations), Combinatorics, Proof (with particular focus on Proof by Mathematical Induction), Vectors (with application to projectile motion), Statistical Analysis. In first semester their mathematical studies then covered a number of topics the Advanced1 group studied prior to gaining entrance at university; further details on this are given below.
  • The Advanced1 group’s mathematics high school syllabus studied prior to first semester course entry covered: the same course content the Standard group covered at high school plus extra topics on Proof (develop rigorous mathematical arguments and proofs, specifically in the context of number and algebra and further develop Proof by Mathematical Induction), Vectors (3 dimensional vectors, vector equations of lines), Complex Numbers, Calculus (Further Integration techniques with partial fractions and integration by parts), Mechanics (Application of Calculus to Mechanics with simple harmonic motion, modelling motion without and with resistance, projectiles and resisted motion). The Standard group cover these topics in their first semester university studies in mathematics with the exclusion of further concepts of Proof or Mechanics. In first semester the Advanced1 group have built on their knowledge with an emphasis on both theoretical and foundational aspects, as well as developing the skill of applying mathematical theory to solve practical problems. Theoretical topics include a host of theorems relevant to the study of Calculus.

In summary, at the point of our study, the Advanced1 group had more knowledge of, and practice with, rigorous mathematical arguments and proofs in the context of number and algebra, and more in-depth experience with Proof by Induction, but the bulk of their extra knowledge was a much deeper knowledge of Calculus. They had longer experience with a variety of integration techniques, and had worked with a variety of applications of calculus to practical problems, including a large section on mechanics at high school. In first semester at university there was a greater focus on theoretical topics, including a host of theorems and associated proofs relevant to the topics studied. Compared to the Introductory and Standard groups, the Advanced1 group have only widened the mathematics knowledge gap since their choice of post-compulsory mathematics at high school. The Advanced2 group come directly from an Advanced1 cohort, and the Academic group would have reached the Advanced1 group’s proficiency as part of their employment. So, do specific reasoning skills result from this level of abstract study? Our findings suggest this should certainly be an area of investigation, and it links interestingly with other research. In studying one of the thinking tasks in particular (the Four Cards Problem), and its context of conditional inference more specifically, Inglis and Simpson [15] found a clear difference between undergraduates in mathematics and undergraduates in other university disciplines, yet also showed a lack of development over first-year university studies on conditional inference measures. A follow-up study by Attridge and Inglis [22] then zeroed in on post-compulsory high school mathematical training, and found that students with such training developed their conditional reasoning to a greater extent than a control group over the course of a year, despite having received no explicit tuition in conditional logic.
That development, though demonstrated not to result from a domain-general change in cognitive capacity or thinking disposition, and most likely associated with the domain-specific study of mathematics, revealed a complex pattern of endorsing more of some inferences and fewer of others. The present study focused on a much broader problem set associated with logical and critical thinking, and it too suggests a more complex picture of how mathematics training may contribute to problem-solving styles. A more intricate account of the impact of mathematical training on problem-solving techniques appears to be required.

There is also a final interpretation to consider: that people in the Advanced1, Advanced2 and Academic groups did not gain anything from their mathematics training in terms of their ability to solve these problems. Instead, given that studies have found no correlation between many of these problems and what is currently measured as intelligence [30], they might simply be people of a particular intelligence or thinking disposition to start with, who have been able to use that intelligence not only to solve these problems, but also to survive the challenges of their mathematics training.

That the CRT has traditionally been used as a measure of baseline thinking disposition, and that performance on it has been found to be immutable across the groups tested, is of particular interest, since our results show a possible training effect on these questions. The CRT is tied to a willingness to engage in effortful thinking, which presents as an ability suitable for training. It is beyond the scope of this study, but a thorough review of CRT testing might suggest a broader appreciation of, and a better framework for understanding, thinking disposition, ability and potential ability.

Mathematical training appears associated with certain thinking skills, but there are clearly subtleties that need to be teased apart. The thinking tasks here add to foundational results, with the aim of a firmer platform on which to eventually base more targeted and illustrative inquiry. If thinking skills can be fostered, could first-year university mathematics teaching be improved so that all students in those cohorts reach the Advanced1 group’s level of reasoning? Do university mathematics courses become purely about domain-specific knowledge from this point on? Intensive training has been shown to impact the brain and cognition across a number of domains, from music [34], to video gaming [35], to Braille reading [36]. The hypothesis that mathematics, with its highly specific practice, fits within this list remains legitimate, but simply uncharted. With our current level of understanding, it is worth appreciating the careful wording of the NYU Courant Institute on ‘Why Study Math?’, where there is no assumption of causation: “Mathematicians need to have good reasoning ability in order to identify, analyze, and apply basic logical principles to technical problems.” [37]

Limitations

One possible limitation of the current study is that the problems may have been too easy for the more advanced participants, producing a ceiling effect (i.e., some participants obtained 100% correct on all problems). This was most obvious in the Advanced1, Advanced2 and Academic groups. It is possible that participants in these groups had developed logical and critical thinking skills throughout their mathematical training sufficient to cope with most of the problems used in this study, which would support the contention that training in mathematics leads to the development of logical and critical thinking skills useful in a range of domains. Another interpretation is that participants in these groups already possessed the necessary thinking skills for solving the problems in this study, which is why they were able to cope with the material in the advanced units they were enrolled in, or to complete a PhD in mathematics and hold down an academic position in a mathematics department. This would suggest that training in mathematics had no effect on abstract thinking skills: people in this study possessed them to varying extents prior to their studies. This issue might be settled by a future study that used a greater number of problems of varying difficulty, to maximise the chances of finding differences between the three groups with the most training. Alternatively, a longitudinal study that followed people through their mathematics training could determine whether their logical and critical thinking abilities changed throughout their course.

A further limitation of the study may be that several of the reasoning biases examined in this study were measured by only one problem each (i.e., Four Cards Problem, Two Drivers, Petrol Station, Jack looking at Anne). A more reliable measure of these biases could be achieved by including more problems that tap into these biases. This would, however, increase the time required of participants during data collection, and in the context of this study, would mean a different mode of testing would likely be required.

Broad, sweeping, intuitive claims about the transferable skills endowed by a study of mathematics require evidence. Our study uniquely covers a wide range of participants, from those with limited mathematics training through to research academics in the mathematical sciences. It furthermore considered performance on 11 well-studied thinking tasks that typically elude participants in psychological studies, and on which performance has been found to be uncorrelated with general intelligence, education levels and other demographic information [15, 16, 30]. We identified different performances on these tasks across groups defined by level of mathematical training, including on the CRT, which has developed into a method of measuring baseline thinking disposition. We identified different distributions of error types for the mathematically trained. We furthermore identified a performance threshold in first-year university for those with high-level mathematics training. This study thus provides insight into possible changes and adjustments to mathematics courses that would help them fulfil their advertised goal of improved rational and logical reasoning for a greater number of students.

It is central to any education program to have a clear grasp of the nature of what it delivers and how, but arguably especially so for the core discipline that is mathematics. In 2014 the Office of the Chief Scientist of Australia released a report, “Australia’s STEM workforce: a survey of employers”, in which the transferable skills attributed to mathematics were also among those employers deemed most valuable [38]. A better understanding of what mathematics delivers in this space is an opportunity to truly capitalise on this historical, culture-crossing subject.

Supporting information

https://doi.org/10.1371/journal.pone.0236153.s001

Acknowledgments

The authors would like to thank Jacqui Ramagge for her proof reading and input, as well as support towards data collection.

  • 1. Smith A. Making mathematics count: The report of Professor Adrian Smith’s inquiry into post-14 mathematics education. 2004. London: The Stationery Office.
  • 2. AMSI, Vision for a Maths Nation. 2015. http://amsi.org.au/publications/a-vision-for-a-maths-nation/
  • 3. British Columbia [Internet]. Mathematics; Goals and Rationale. 2016 [cited 2019 Dec 5]. https://curriculum.gov.bc.ca/curriculum/mathematics/core/goals-and-rationale
  • 4. Monash University [Internet]. Mathematical Sciences. 2019 [cited 2019 Jul 30]. https://www.monash.edu/science/schools/mathematical-sciences/current .
  • 5. The University of Sydney [Internet]. MATH1014. 2017 [cited 2019 Dec 5]. http://www.maths.usyd.edu.au/u/UG/TU/YR1ADMIN/r/MATH1014.pdf .
  • 6. The University of Sydney [Internet]. MATH2965. 2016 [cited 2016 Dec 12]. http://www.maths.usyd.edu.au/u/UG/IM/MATH2965/
  • 7. The University of Sydney [Internet]. MATH3066. 2017 [cited 2017 Dec 8]. http://www.maths.usyd.edu.au/u/UG/SM/MATH3066/r/2017info3066.pdf .
  • 8. Cambridge University [Internet]. Mathematical Tripos. 2019 [cited 2019 Jul 30]. https://www.maths.cam.ac.uk/undergrad/course/transferable_skills .
  • 9. Speelman CP, Kirsner K. Beyond the learning curve: The construction of mind. Oxford: Oxford University Press; 2005.
  • 10. Fadel C. Mathematics for the 21st Century: What Should Students Learn? Boston, Massachusetts: Center for Curriculum Redesign; 2014.
  • 11. Inglis M, Simpson A. Heuristic biases in mathematical reasoning. In: Chick HL, Vincent JL, editors. Proceedings of the 29th Conference of the International Group for the Psychology of Mathematics Education. Melbourne: PME; 2005. p. 177–84.
  • 12. Manktelow KI. Reasoning and Thinking. UK: Psychology Press; 1999.
  • 14. Inglis M, Attridge N. Does mathematical study develop logical thinking? Testing the theory of formal discipline. London: World Scientific Publishing Europe Ltd; 2016.
  • 24. Nisbett RE. Can reasoning be taught? In: Callan E, Grotzer T, Kagan J, Nisbett RE, Perkins DN, Shulman LS, editors. Education and a Civic Society: Teaching Evidence-based Decision Making. Cambridge, MA: American Academy of Arts & Sciences; 2009.
  • 26. Galotti KM. Cognitive psychology in and out of the laboratory. Belmont, CA: Brooks/Cole; 1994.
  • 37. NYU [Internet]. Why Study Math? 2019 [cited 2019 Jul 30]. https://math.nyu.edu/dynamic/undergrad/overview/why-study-math/
  • 38. Office of The Chief Scientist. Australia’s STEM workforce: a survey of employers. Barton ACT: Deloitte Access Economics; 2014.


5. Teaching Mathematical Reasoning: Critical Math Thinking Through Problem-Solving and Modeling

  • Mathematical problem-solving : This approach makes students think conceptually about problems before applying tools they’ve learned.
  • Mathematical modeling : Modeling projects give students experience in weighing several factors against one another and using mathematical knowledge to make decisions.

What is mathematical reasoning? The short answer is that it is reasoning with math, and in a sense, it’s the skill that underlies all other math skills.

I. Mathematical Problem-Solving

An emphasis on open-ended mathematical problem-solving can help develop mathematical reasoning skills and address a problem teachers have long been concerned about: too much “rote” learning in math. 

Too often students spend time in math class memorizing procedures and applying them mindlessly to problems. This leads to errors when students are confronted with unfamiliar problems. It also contributes to a widespread misperception of math as boring and lacking relevance to everyday life. 

On the other hand, attempting to remedy this problem by giving students open-ended problems has its own drawbacks. Without the conceptual and methodological tools to solve these problems students become frustrated and disengaged. It can end up being an inefficient way to spend class time.  

Although learning fundamental math skills like algorithms for adding, subtracting, multiplying, and dividing is absolutely critical for students in the early grades, the deeper mathematical problem-solving skills are the ones we really want students to graduate with. How can we ensure they do?

The deeper mathematical problem-solving skills are the ones we really want students to graduate with.


Evidence suggests that skills in mathematical problem-solving lead to more general improvements in outcomes related to math. They help students acquire a deeper understanding of mathematical reasoning and concepts. 

For instance, the commutative property, which most students learn applies to addition and multiplication problems (changing the order of the operands doesn’t affect the result), also applies to other logical and practical situations. Familiarity with some of these situations fosters deeper conceptual understanding, and deeper conceptual understanding leads to better critical thinking.
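A quick check of the property, together with a counterexample showing it does not hold for every operation:

```python
# Commutativity: swapping the operands leaves the result unchanged.
a, b = 7, 12

assert a + b == b + a  # addition commutes
assert a * b == b * a  # multiplication commutes

# Counterexample: subtraction is not commutative.
print(a - b, b - a)  # -5 5
```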

And learning these skills helps students improve outcomes related to critical thinking more generally. For example, students who become skilled in mathematical problem-solving tend to also:

  • Create beneficial habits of mind: persistence, thoroughness, creativity in solution-finding, and improved self-monitoring.
  • Break down hard problems into easier parts, or reframe problems so that they can think about them more clearly.
  • Apply problem-solving tactics to situations well beyond math: visualizing a situation to understand it more clearly, creating a simplified version of the problem to address its essence, branching through possibilities, and creating “what if” cases to test key assumptions.
  • Elevate the value of discussion and argumentation over simple appeals to authority.

Small-group mathematical problem solving targets skills that traditional mathematics instruction doesn’t. Instead of just finding a match between an algorithm and a question, students must: adapt or create an algorithm; evaluate and debate the merits of different solution paths; and verify their solution through additional evidence.

Small-group mathematical problem solving targets skills that traditional mathematics instruction doesn’t.

This process continues until the class has thoroughly explored the problem space, revealing multiple solution paths and exploring variations on the problem or contrasting problem-types.

Of course, the usefulness of a question like this depends on what students already know. If students don’t already know that chickens have two legs and pigs have four, they’re just going to be confused by the problem (and the explanation of the solution). It also requires some other basic skills—for instance, that if one chicken has two legs, four chickens would have eight.

As a way of evaluating student growth, teachers could also include some of these open-ended problems in homework assignments or as extra credit assignments.

Lesson Plan Outline

An example that might be appropriate for fifth grade is something like the following: A farmer has some pigs and some chickens. He finds that together they have 70 heads and 200 legs. How many pigs and how many chickens does he have? Divide the class into student groups of three to four. Have students spend a few minutes reading over the problem individually. Then let student groups discuss possible solution paths. The teacher walks around the classroom, monitoring the groups. Then the teacher leads a whole-class discussion about the problem.
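One solution path for the farmer problem above, written out as a check on whatever strategies student groups propose (the “assume all chickens, then swap in pigs” strategy is just one of several valid approaches):

```python
# 70 heads and 200 legs; chickens have 2 legs, pigs have 4.
heads, legs = 70, 200

# If all 70 animals were chickens there would be 140 legs.
# Each pig swapped in for a chicken adds 2 extra legs.
pigs = (legs - 2 * heads) // 2  # (200 - 140) / 2 = 30
chickens = heads - pigs         # 40

assert 2 * chickens + 4 * pigs == legs  # verify against the leg count
print(pigs, chickens)  # 30 40
```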

  • So how did you go about thinking about the problem?
  • Show us how you got your answer and why you think it’s right. This might mean that a student goes up to the board to illustrate something if a verbal explanation is inadequate.
  • And what was the answer you got?
  • Does anyone else have a different way of thinking about the problem? If there are other ways of solving the problem that students didn’t come up with, teachers can introduce these other ways themselves.

Developing Math Problem-Solving Skills

Teachers should keep in mind the following as they bring mathematical problem-solving activities into their classrooms:

  • Problem selection . Teachers have to select grade-appropriate problems. A question like “John is taller than Mary. Mary is taller than Peter. Who is the shortest of the three children?” may be considered an exercise to older students — that is, a question where the solution steps are already known — but a genuine problem to younger students. It’s also helpful when problems can be extended in various ways. Adding variation and complexity to a problem lets students explore a class of related problems in greater depth.
  • Managing student expectations . Introducing open-ended math problems to students who haven’t experienced them before can also be confusing for the students. Students who are used to applying algorithms to problems can be confused about what teachers expect them to do with open-ended problems, because no algorithm is available.
  • Asking why . Asking students to explain the rationale behind their answer is critical to improving their thinking. Teachers need to make clear that these rationales or justifications are even more important than the answer itself. These justifications give us confidence that an answer is right. That is, if the student can’t justify her answer, it almost doesn’t matter if it’s correct, because there’s no way of verifying it.


II. Mathematical Modeling

Another approach is mathematical modeling. Usually used for students in middle or high school, mathematical modeling brings math tools to bear on real-world problems, keeping students engaged and helping them to develop deeper mathematical reasoning and critical thinking skills.

Math modeling is an extremely common practice in the professional world. Investors model returns and the effects of various events on the market; business owners model revenue and expenses, buying behavior, and more; ecologists model population growth, rainfall, water levels, and soil composition, among many other things. 

But, despite these many applications and the contributions it can make to general mathematical reasoning and critical thinking skills, mathematical modeling is rarely a main component of the math curriculum. Although textbook examples occasionally refer to real-world phenomena, the modeling process is not commonly practiced in the classroom.

Modeling involves engaging students in a big, messy real-world problem. The goals are for students to:

  • refine their understanding of the situation by asking questions and making assumptions,
  • leverage mathematical tools to solve the problem,
  • make their own decisions about how to go about solving the problem,
  • explain whether and how their methods and solutions make sense,
  • and test or revise their solutions if necessary.

Mathematical modeling typically takes place over the course of several class sessions and involves working collaboratively with other students in small groups.

Modeling is not just about getting to a “right” answer — it’s about considering factors beyond mathematics as well.

Modeling also offers the opportunity to integrate other material across the curriculum and to “think mathematically” in several different contexts. Modeling is not just about getting to a “right” answer — it’s about considering factors beyond mathematics as well. For example, students deal with questions like:

  • What is a “fair” split? 
  • What level of risk should someone tolerate?
  • What tradeoffs should a society make?

In other words, students come to see mathematics as the socially indispensable tool that it is, rather than as an abstract (and sometimes frustrating) school subject.

Mathematical Modeling and Critical Thinking

Research suggests that the ability to solve abstractly framed academic math problems is not necessarily related to mathematical reasoning more broadly: that is, the ability to use math well in everyday life or to integrate mathematical thinking into one’s decision-making. Students may be able to follow procedures when given certain cues, but unable to reason about underlying concepts. 

It’s also very common to hear complaints from students about math: that they aren’t “math people,” that math is irrelevant, or that math is simply boring.

Mathematical modeling is one approach to resolving both these problems. It asks students to move between the concreteness of real — or at least relatively realistic — situations and the abstraction of mathematical models. Well-chosen problems can engage student interest. And the practice emphasizes revision, step-by-step improvement, and tradeoffs over single solution paths and single right-or-wrong answers.

Mathematical modeling often begins with a general question, one that may initially seem only loosely related to mathematics:

  • how to design an efficient elevator system, given certain constraints;
  • what the best gas station is to visit in our local area;
  • how to distinguish between two kinds of flies, given some data about their physical attributes.
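
The last question above hints at how a situation gets “mathematized.” A toy sketch of one possible first model, with invented wing-length measurements (an assumption for illustration, not data from the text): classify a new fly by whichever species’ mean wing length is closer.

```python
# Toy nearest-mean classifier for the fly-identification question.
# The wing-length samples below are invented for illustration.
species_a = [4.6, 4.8, 5.0, 4.7]   # wing lengths in mm
species_b = [6.1, 5.9, 6.3, 6.0]

def classify(wing_length):
    """Label a fly "A" or "B" by the closer species mean."""
    mean_a = sum(species_a) / len(species_a)   # 4.775 mm
    mean_b = sum(species_b) / len(species_b)   # 6.075 mm
    return "A" if abs(wing_length - mean_a) <= abs(wing_length - mean_b) else "B"
```

Refining even this simple model raises the real modeling questions: which attributes matter, and what to do with flies near the midpoint.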

Then, over the course of the modeling process, students develop more specific questions or cases, adding constraints or assumptions to simplify the problem. Along the way, students identify the important variables — what’s changing, and what’s not changing? Which variables are playing the biggest role in the desired outcomes?

Students with little experience in modeling can leap too quickly into looking for a generalized solution, before they have developed a feel for the problem. They may also need assistance in developing those specific cases. During this part of the process, it can be easiest to use well-defined values for some variables. These values may then become variables later on.

After students explore some simplified cases, they then work on extensions of these cases to reach ever more general solutions.

Throughout the modeling process, the teacher may need to point out missing assumptions or constraints, or offer other ways of reframing the problem. For any given modeling problem, some solutions are usually more obvious than others, which leads to common stages students may reach as they solve the problem. But a key part of this activity is letting students be creative — students will often come up with unusual or especially innovative solutions.

A sample problem, from the Guidelines for Assessment and Instruction in Mathematical Modeling Education, is below:

[Image: the sample modeling problem, which asks which gas station is worth driving to for cheaper gas]

This problem involves variables that aren’t necessarily immediately apparent to students: for instance, the size of the gas tank and how much gas is purchased per trip. As students work through this specific case, they can consider other hypothetical scenarios to generalize their solution: if the station is 10 miles away, how cheap would its gas have to be to make the trip worth it? What about the time spent in the car? Is there a value to put on that?
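
A minimal sketch of the kind of computation students might build for this break-even question. All numbers (tank size, mileage, prices) are invented for illustration, not taken from the sample problem itself.

```python
# Break-even analysis for the gas-station question: how cheap must the
# farther station be before the drive pays off? All inputs are invented.

def break_even_price(local_price, distance_miles, mpg, gallons_bought):
    """Highest price per gallon at a station `distance_miles` away that
    still beats filling up locally, once round-trip gas is counted."""
    extra_gas = 2 * distance_miles / mpg   # gallons burned driving there and back
    # Filling locally costs local_price * gallons_bought.
    # Filling far away costs p * (gallons_bought + extra_gas), since the
    # trip gas must be replaced too. Set the costs equal and solve for p:
    return local_price * gallons_bought / (gallons_bought + extra_gas)

# e.g. a 12-gallon fill-up, 25 mpg, local gas at $4.00, station 10 miles away:
# the far station must charge at most $3.75/gal to be worth the drive.
```

Extending the model to put a dollar value on driving time is one natural next step students might take.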

Many modeling problems can be arbitrarily extended in various directions. Instead of just considering the best gas station to go to for a single car, for instance, students can explore the behavior of a fleet of trucks on set routes or seasonal changes to gas prices.

It’s also possible to include shorter modeling activities, where students work together in pairs or small groups to extend a problem or interpret the meaning of a solution.

These kinds of modeling activities are not reserved solely for older students. One modeling problem for elementary school students might be: what should go in a lunchbox? Students can talk about what kinds of things are important to them for lunch, “mathematize” the problem by counting student preferences or coming up with an equation (e.g., lunch = sandwich + vegetable + dessert + drink), and even explore geometrically how to fit such items into a lunchbox of a certain size.
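
The preference-counting version of the lunchbox problem can be sketched in a few lines; the categories and votes below are invented for illustration.

```python
from collections import Counter

# "Mathematize" the lunchbox problem: tally classmates' preferences and
# pick the most popular item in each category. Votes are invented.
votes = {
    "sandwich": ["ham", "cheese", "ham", "jam", "ham"],
    "vegetable": ["carrot", "carrot", "cucumber"],
    "drink": ["juice", "milk", "juice"],
}

lunch = {category: Counter(choices).most_common(1)[0][0]
         for category, choices in votes.items()}
# lunch -> {'sandwich': 'ham', 'vegetable': 'carrot', 'drink': 'juice'}
```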

Teaching Mathematical Modeling: Further Key Factors

Mathematical modeling activities can be challenging for both teachers and students. 

Often, mathematical modeling activities stretch over several class periods. Fitting modeling activities in, especially if standardized tests are focused on mathematical content, can be challenging. One approach is to design modeling activities that support the overall content goals.

The teacher’s role during mathematical modeling is more like a facilitator than a lecturer. Mathematical modeling activities are considerably more open-ended than typical math activities, and require active organization, monitoring, and regrouping by the teacher. Deciding when to let students persevere on a problem for a bit longer and when to stop the class to provide additional guidance is a key skill that only comes with practice.

The teacher’s role during math modeling is more like a facilitator than a lecturer.

Students — especially students who have traditionally been successful in previous math classes — may also experience frustration when encountering modeling activities for the first time. Traditional math problems involve applying the right procedure to a well-defined problem. But expertise at this kind of mathematical reasoning differs markedly from tackling yet-to-be-defined problems with many possible solutions, each of which has tradeoffs and assumptions. Students might feel unprepared or even that they’re being treated unfairly.

Students also have to have some knowledge about the situation to reason mathematically about it. If the question is about elevators, for example, they need to know that elevators in tall buildings might go to different sets of floors; that elevators have a maximum capacity; that elevators occasionally break and need to be repaired. 

Finally, the mathematical question needs to be tailored to students’ experience and interests. Asking a group of students who don’t drive how to purchase gas efficiently won’t garner much interest. Teachers should use their familiarity with their students to find and design compelling modeling projects. This is a chance for both students and teachers to be creative.

Sources and Resources

O’Connell, S. (2000). Introduction to Problem Solving: Strategies for the Elementary Classroom. Heinemann. A handbook for teachers with tips on how to implement small-group problem solving.

Youcubed.org, managed by Jo Boaler. A community with many resources for small-group problem solving instruction.

Yackel, E., Cobb, P., & Wood, T. (1991). Small group interactions as a source of learning opportunities in second-grade mathematics. Journal for Research in Mathematics Education, 390–408. Education research that illustrates how small-group problem solving leads to different kinds of learning opportunities than traditional instruction.

Guidelines for Assessment and Instruction in Mathematical Modeling Education, 2nd ed. (2019). Consortium for Mathematics and its Applications & Society for Industrial and Applied Mathematics. An extensive guide for teaching mathematical modeling at all grade levels.

Hernández, M. L., Levy, R., Felton-Koestler, M. D., & Zbiek, R. M. (March/April 2017). Mathematical modeling in the high school curriculum. The Variable, 2(2). A discussion of the advantages of mathematical modeling at the high school level.

Using a metacognitive learning approach to enhance students’ critical thinking skills through mathematics education

  • Original Paper
  • Open access
  • Published: 17 March 2022
  • Volume 2, article number 31 (2022)

  • Syaiful (ORCID: orcid.org/0000-0002-0139-115X),
  • Nizlel Huda,
  • Amirul Mukminin &

Abstract

This study aims to describe how the metacognitive learning approach (MLA) helped a university mathematics lecturer enhance students' Mathematical Critical Thinking Skills (MCTS) through mathematics learning. It is an experimental study using a pretest–posttest control group design. The subjects were mathematics education students at a university, and the instrument used was the MCTS test. The data were analyzed using ANOVA at the 0.01 significance level. The results showed that the MCTS of students who learned with the MLA was better than that of students who studied conventionally, and that learning with the MLA was equally effective in increasing the MCTS of students in the low, medium, and high-level subgroups.

Introduction

Thinking is one of the defining characteristics of humans (Homo sapiens). Humans begin thinking as soon as they can perceive, and they continue to think until the end of their lives. The superiority of humans over other creatures is indicated by the power of the mind, consistently expressed in action after a process of reflection (Logan and Tandoc 2018; Sherwood et al. 2008). The human ability to adapt through the power of the mind has spawned technological and socio-cultural forms of life (Boyd et al. 2011; Gacel-Ávila 2005; Rustaman 1990).

The process of thinking includes three parts: problem solving, logical reasoning, and decision-making (Galotti and Mark 1994). This indicates that thinking activities require an understanding of the problems associated with the material being contemplated, the ability to reason, intellectual ability, imagination, and a flexibility of mind that stretches into the results of thought (Cresswell and Speelman 2020; Saiz and Rivas 2011).

There is a relationship between the thinking process and mathematics (Ahdhianto et al. 2020). Someone good at mathematics will be good at thinking, and someone trained in learning mathematics will become a good thinker (Cresswell and Speelman 2020; Li and Schoenfeld 2019; Schoenfeld 2018). Regarding how mathematical ideas or concepts arise, Ruseffendi (1991) stated that mathematics arises from thought, related to ideas, processes, and reasoning (Jonsson et al. 2014). Meanwhile, viewed from the mathematical activities students carry out when learning mathematics, Runisah et al. (2017) argued that these activities can enhance responsibility and freedom in thinking: mathematics is an arena in which young students can solve a problem and gain confidence that a solution is correct not because the teacher says so, but because of their own clear logic of reasoning (Kwang 2000). There is, therefore, a firm interconnection between math skills and a person's thinking skills.

In mathematics, one of the thinking skills that belongs to the high-level thinking skills is MCTS (Ahdhianto et al. 2020; Behar-Horenstein and Niu 2011; Ennis 1985; Kayaalp et al. 2020; Spector and Ma 2019). Critical thinking is a logical and reflective thought process for deciding what to believe or do. It considers every possible option to help make a logical decision, and it is widely recognized as a necessary condition for responsible membership of society (Uddin et al. 2020; Wijaya et al. 2020). Four reasons are put forward for the need to develop MCTS habitually: (1) the demands of the times, which require citizens to seek, choose, and use information for social and civic life; (2) every citizen constantly faces many problems and choices that demand critical and creative thinking; (3) the ability to look at things in different ways to solve problems; and (4) critical thinking is an aspect of solving problems creatively, so that learners can compete fairly and cooperate with other nations (Boldureanu et al. 2018; Dewey 1916; Rapanta et al. 2020).

MCTS can be developed by studying mathematics in schools or colleges, which emphasizes the system, structures, concepts, and principles of mathematics and the strong interconnections between its elements (Bransford et al. 1999; Geary 1995; Marzano 1988; Oates 2011). By nature, mathematics is a structured and systematic science, a human activity through a process that is active, dynamic, and generative, and a science that develops an attitude of critical, objective, and open thinking; mastering it is therefore very important for learners facing such rapid changes in science and technology (Hiebert 2013; Lave 1988; Stiff and Harvey 1988; Suter 2011).

Currently, most students assume that mathematics is a field of study that is difficult and disliked (Begle 1969, 1979; Mahmud 2017; Meyers 1986; Ruseffendi 2006). Only a few can explore and understand mathematics as a science that can train MCTS (Munawarah et al. 2020).

Ironically, while students' MCTS is vital to have and to develop, it turns out that their MCTS is still lacking (Ahdhianto et al. 2020; Feriyanto and Putri 2020; Syaiful et al. 2020). This is in line with the results of a preliminary study performed over a few semesters on undergraduate students of mathematics education with very diverse educational backgrounds (Syaiful 2013); these students come from Senior High School, Vocational High School, and Islamic Senior High School. In the preliminary study, students were given tests of critical thinking based on the following indicators (Shovkova 2019; Umam et al. 2020): (1) make generalizations and consider the results of the generalizations (G), (2) identify the relevance (IR), (3) formulate the problem into a mathematical model (MM), (4) make deductions using the principle (D), (5) provide examples of inference (C), and (6) reconstruct arguments (RA). The results of these tests, for students with both science and non-science backgrounds, were unsatisfactory: average scores were less than 50% of the maximum score for the two groups (Bhattacharyya and Pradhan 2015; Ruseffendi 1988; Syaiful 2013).

A more in-depth review of the preliminary study illustrates that most students still did not have metacognitive awareness; dysfunctional metacognitive beliefs in the context of learning were present (Lenzo et al. 2020). This caused difficulty in understanding mathematical concepts as well as procedures. Another indication is that students also tended to be afraid to give ideas and comments, and lacked confidence in mathematical communication (Moodley et al. 2015; Murphy et al. 2016; Salguero et al. 2020; Seow and Gillan 2020; Syaiful 2019).

A learning strategy and approach are vital to developing students' thinking skills, so mathematics learning that actively involves students in the learning process is necessary. Thus, an alternative form of learning is required, designed to reflect active student involvement and to instill metacognitive awareness (Abramovich et al. 2019; Kallio et al. 2020; Laurens et al. 2017; Li and Schoenfeld 2019).

Besides, learning mathematics using the metacognitive approach is a form of constructivist learning (Erdoğan and Şengül 2017; Tachie 2019; Verschaffel et al. 2019). It views the learning process as beginning with cognitive conflict that students resolve themselves through self-regulation; ultimately, students build their knowledge through experience resulting from interactions with their environment (Monteiro et al. 2020).

From another view, metacognition includes developing a systematic method for solving problems, visualizing, and evaluating the productivity of the thinking process (Chew et al. 2019; García-García and Dolores-Flores 2021). Another supporting statement comes from the teaching and learning process (MKPBM) team (2001), which views metacognition as the ability to observe oneself so that one's actions can be controlled optimally (Corno 1986; Susantini et al. 2021; Winne 1996).

The author considers that the MLA has many advantages when used as an alternative way of learning mathematics to develop students' MCTS. With the MLA, a lecturer develops students' metacognitive awareness (Al-Gaseem et al. 2020; Akben 2020), continuously training them to design the best strategies for selecting, remembering, re-recognizing, and organizing the information they face, and for solving problems (Mathabathe 2019).

By developing metacognitive awareness, students are expected to consistently monitor, control, and evaluate what they have done (Faivre et al. 2020; Miegel et al. 2020; Thorslund et al. 2020; Yusnaeni et al. 2020). Students often ask themselves questions such as: "What will be done? What is known? What will be sought? Which is the best strategy to solve the problem? Which operation should come first? Are the steps that have been taken correct? If not, where is the error, and how can it be corrected?" (Nakagawa and Saijo 2020). Critical questions like these, which develop metacognitive awareness, will in turn develop students' MCTS in learning mathematics (Saritepeci 2020; Wilson and Conyers 2016).

The background above encouraged the author to research alternative mathematics learning with an MLA to improve the MCTS of undergraduate mathematics education students. Starting from these thoughts, the problems of this study were formulated as follows:

Is the Mathematical Critical Thinking Skill (MCTS) of mathematics education students at a university's Faculty of Teaching who receive mathematics learning using an MLA better than that of students who receive conventional learning?

Is there a difference in the increase in MCTS among the low, middle, and high subgroups of students who received mathematics learning using the MLA?

Literature review

Metacognition

Metacognition refers to thought at a higher level (i.e., mental action or method of acquisition) via thinking, experience, senses, awareness, and understanding, targeting cognitions at the object level. Metacognition has gained much scholarly interest, partly because it is essential for successful learning in both schoolchildren and adults. Nakagawa and Saijo (2020) argued that the most frequent distinction is between metacognitive knowledge and metacognitive skills. The former refers to a person's declarative knowledge about the interactions between the characteristics of a person, task, and strategy, whereas the latter refers to a person's procedural knowledge for controlling problem-solving and learning activities. Procedural metacognition, the latter kind, involves metacognitive monitoring (i.e., subjective measures of current cognitive activity) and metacognitive control (i.e., regulation of current cognitive activity).

Monitoring (Nakagawa and Saijo 2020) involves questions such as "How much effort is being made to learn this material?", "Have I understood this material well enough to recall the specifics later on?", and "How sure am I that this is the correct answer?" Control involves behavior such as collecting examination material during study, the differential allocation of study time to learning material, and the withdrawal of responses or the termination of a memory search (Nakagawa and Saijo 2020).

Metacognition is a highly disciplined form of thought that plays a vital role in learning and teaching practices in educational organizations. Flavell was one of the first theorists to put forward a definition of metacognition; he categorized it into metacognitive knowledge, metacognitive experience, and metacognitive monitoring and control (Al-Gaseem et al. 2020). In addition, metacognition involves different activities: (1) understanding one's own mental states, (2) understanding the mental states of others, and (3) using these representations to deal with psychological distress and to resolve conflicts with others (Maillard et al. 2020). Metacognition thus refers to people's awareness and control of their cognitive processes, emotions, and motivations (Lysaker et al. 2020).

Five pedagogical strands in mathematics contribute to students' success: comprehending, computing, applying, reasoning, and engaging. The first pedagogical action is linked to comprehension: students must be able to comprehend mathematical concepts, procedures, and relationships. The second assists students in performing routine computational operations. The third uses mathematical tools and procedures to resolve issues by applying conceptual and procedural techniques. The fourth relates to students' mathematical reasoning, which enables them to explain and justify their actions and to provide justifications for mathematical answers, proofs, and applications. The fifth is about including students in making sense of mathematics by involving them in textbook problems and extending problems to further fields of application. These psychological, empirical, and pedagogical foundations of learning methods influenced our study's methodological approach and output during the analysis and interpretation processes. These theoretical considerations may not be adequate; they were not explicitly stated throughout the study procedure, but they served as implicit components at each stage of the investigation (Khanal et al. 2021).

Metacognitive learning approach (MLA)

The metacognitive learning model is one learning model that teaches students to think creatively to solve a problem (Hargrove 2013; Hargrove and Nietfeld 2015; Listiani et al. 2014). In this model, students can plan, organize, and evaluate the activities they do; the learning is student-centered (Listiani et al. 2014). The MLA is thus believed to make learning more meaningful, students' understanding more profound, and its application more comprehensive. Each learning model has a syntax. The syntax of the MLA is as follows (Listiani et al. 2014):

  • Opening: students explore prior knowledge related to the material to be discussed.
  • Development of cognitive abilities: students solve cognitive-type problems.
  • Development of metacognitive abilities: students are first given a metacognitive type of math problem, then proceed through three phases. (a) Planning: the teacher guides students in planning the completion procedure, the cognitive strategies to use, and the relevant prior knowledge for solving the given problem. (b) Monitoring: the teacher guides students in monitoring completion procedures, relevant prior knowledge, and the cognitive strategies used. (c) Reflection: the teacher guides students to understand the concepts they have used in solving metacognitive-type math problems, comparing the results students have obtained with the given statements, so that there is control of and reflection on the previous cognitive activities.
  • Closing: students are guided in drawing conclusions from the lesson.

In this way students are thinking, and the lecturer invites them to learn how to solve a problem, from planning and implementing through to reflecting on the activities carried out. With this metacognitive knowledge and these skills, students are aware of their strengths and limitations in learning. If students realize they are wrong, they immediately acknowledge it and look for ways to correct it (Hargrove 2013; Listiani et al. 2014; Syaiful 2011; Tohir 2019). An example of the application of the MLA is listed in Table 1.

Method

This study was conducted at a faculty of teaching at a university in Sumatra, Indonesia, over one semester. 83 mathematics education students were the subjects of the study: 45 students in the experimental group (EG) and 38 students in the control group (CG). These two groups were selected randomly from the three available classes. Before the intervention, each class was given a pretest consisting of 12 items of the MCTS test; the pretest and posttest were the same test. The pretest results were tabulated to determine the students' initial ability. The pretest scores were ordered from highest to lowest: the top 27% of scorers were grouped as students with high initial ability, the bottom 27% as students with low initial ability, and the remaining 46% as students with medium initial ability. The students were thus classified into high, medium, and low levels. Finally, after the intervention, the posttest was given to measure how the MLA enhanced the students' MCTS.
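
The 27%-46%-27% grouping described above can be sketched as follows. This is a sketch only: how ties at the cut points are resolved is a simplifying assumption, since the text does not specify it.

```python
def split_by_initial_ability(scores):
    """Split pretest scores into (high, medium, low) groups using the
    27%-46%-27% rule. Ties at the cut points are broken arbitrarily
    by sort order (a simplifying assumption)."""
    ranked = sorted(scores, reverse=True)
    k = round(0.27 * len(ranked))          # size of the top and bottom groups
    high = ranked[:k]
    medium = ranked[k:len(ranked) - k]
    low = ranked[len(ranked) - k:]
    return high, medium, low
```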

The items were developed by referring to six indicators: generalize and consider the results of generalization (G), identify the relevance (IR), formulate the problem into a mathematical model (MM), make deductions using the principle (D), provide examples of inference (C), and reconstruct arguments (RA). The MCTS test contains 12 items, numbered 1, 2, 3a, 3b, 3c, 3d, 4a, 4b, 4c, 4d, 5, and 6. The test used essay-form questions, so that the thinking process, accuracy, and systematic preparation could be seen in the steps of solving the questions; in addition, errors and difficulties experienced by students could be identified and studied so that improvements could be implemented. The MCTS test was a development of questions designed to measure critical thinking skills. Mayadiana designed the test under the syllabus and lecture program unit (SAP) of the undergraduate Mathematics Education program at a university (Mayadiana 2005), and the authors then adapted it to the probability material. Questions 1 and 5 measure the ability to generalize and consider generalization results, questions 2 and 6 measure the ability to reconstruct arguments, and questions 3a and 4a measure the ability to identify relevance. Questions 3b and 4b measure the ability to formulate problems into mathematical models, questions 3c and 4c the ability to make deductions using principles, and questions 3d and 4d the ability to give examples of inference (please find the MCTS test items in Appendix 1).

In order to meet the criteria for a good test instrument, the test was first examined for validity, reliability, and level of difficulty before being used in the research (please find the attached tables in Appendixes 2 and 3). A trial was carried out on mathematics education students who had already studied the probability material. Before the trial, the test was validated by a subject matter expert (SME), a senior statistics lecturer in a mathematics education program. After the required data were collected, validity and reliability analyses of the test items were conducted. The computed r values of the items were: 1 = 0.555, 2 = 0.599, 3a = 0.623, 3b = 0.697, 3c = 0.651, 3d = 0.514, 4a = 0.706, 4b = 0.533, 4c = 0.608, 4d = 0.452, 5 = 0.560, and 6 = 0.649, while the critical (table) r value was 0.403. Since every computed r exceeded the table r, the items were valid and could be used to measure the students' MCTS. For reliability, Cronbach's alpha was 0.7674, meaning the items were reliable and could be used to measure the students' MCTS.
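
The reliability figure quoted above is a Cronbach's alpha. A minimal sketch of the computation, on invented toy scores (the study's raw data would be needed to reproduce its 0.7674):

```python
from statistics import pvariance

# Cronbach's alpha compares the sum of per-item score variances with the
# variance of students' total scores. Any scores fed in below would be
# toy data, not the study's.

def cronbach_alpha(item_columns):
    """item_columns: one list of scores per test item, all in the same
    student order. Returns the alpha reliability coefficient."""
    k = len(item_columns)
    totals = [sum(per_student) for per_student in zip(*item_columns)]
    item_variance_sum = sum(pvariance(col) for col in item_columns)
    return k / (k - 1) * (1 - item_variance_sum / pvariance(totals))
```

Two perfectly correlated items give an alpha of 1.0, which is a quick sanity check on the formula.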

Conventional Learning

The lecturer was the center of this learning because it was teacher-centered. The lecturer first explained the material, demonstrated, and gave some examples, while students listened carefully and wrote down the things they considered important. If students felt something was unclear, they asked questions; the lecturer usually answered, or occasionally threw the question to other students before answering it. After the material had been delivered, students were given practice questions to work on individually. These questions came from the teaching materials in commonly used math lecture notes (diktat) or from other sources. Throughout this activity, the lecturer went around the class to assist students who had difficulties, after which some students were asked to work the practice questions on the blackboard. Student activity in this class tended to be passive compared to the class where mathematics learning used the metacognitive approach, mainly because students did not carry out discussion activities. At most, students would discuss with a friend in the next seat while working on practice questions.

Metacognitive Learning

The undergraduate mathematics education students were the center of this learning, since student-centered learning is its defining characteristic. At the beginning of the lesson, the lecturer conveyed the learning objectives for the meeting and motivated students by providing metaphors. The lecturer allowed students to ask questions about the previous material and provoked them to ask further questions. Thus, from the start of learning, there is a process of developing metacognitive awareness and of developing students' ability to ask critical questions. After the teaching materials in the Student Worksheets (SW) were distributed, the lecturer guided students to study the materials and answer metacognitive questions that encouraged them to translate concepts into their own words. Students then wrote their ideas on the available teaching materials and answered all the questions in their own language, in order. At the end of the material, if there were still things students did not understand, they wrote questions on the teaching materials, discussed them with fellow students or with the lecturer, and recorded the results of the discussion in the space provided. During the small-group discussion, the lecturer went around the class and gave individual feedback; metacognitive feedback leads students to focus on the mistakes they made and provides instructions for correcting them on their own. Often this small discussion activity continued into a class discussion under the guidance of the lecturer, with one or several students presenting the results of their group discussions while other groups responded or debated. After that, students themselves concluded a recapitulation of what was done in class, while the lecturer summarized the essence of the students' conclusions through metacognitive questions.

Mathematical critical thinking skills (MCTS)

The pretest results for students' MCTS in the EG and CG were not significantly different. From a maximum score of 100, the EG obtained a mean of 24.36, while the CG, with 38 students, had a mean score of 21.66. These pretest scores, less than 30% of the maximum, show that, in general, the MCTS of both the EG and the CG was very low, as shown in Table 2.

The pretest data on students' MCTS in Table 2 show that the mean prior-knowledge scores of the EG and CG were not much different. On a scale of 0–100, the EG, with 45 students, had a mean score of 24.35 and a standard deviation of 9.439; the CG, with 38 students, had a mean score of 21.66 and a standard deviation of 8.609.

Based on the results of the research data analysis, the prior knowledge of the EG and the CG was equivalent. The two groups were then given different treatments: the lecturer taught the EG mathematics using the MLA, while the CG received mathematics instruction using a conventional approach.

As in the pretest, the aspects or indicators of critical thinking measured by the posttest were: generalizing and considering the results of generalization (G), identifying relevance (IR), formulating a problem into a mathematical model (MM), making deductions using a principle (D), providing examples of inference (C), and reconstructing arguments (RA), as shown in Table 3.

In terms of posttest scores, students' MCTS were as follows. The lowest score in the EG was 57, while the lowest in the CG was 34; the highest scores of the EG and CG were 92 and 85, respectively. In terms of means, out of a maximum of 100, the EG obtained a mean score of 70.02 and the CG a mean score of 51.68. The difference between the two means is around 26% of the EG's mean score.

According to the benchmark categories usually used on the campus where the research was conducted, 22 students (49%) in the EG had MCTS in the sufficient category, while 23 students (51%) were in the good or very good categories. Meanwhile, the CG still had 15 students (39%) in the bad category and 20 students (53%) in the sufficient category; only three students (8%) were in the good or very good categories. In general, the EG can be classified as having good MCTS (mean between 70 and 85%), while the CG can be classified as having sufficient MCTS (mean between 50 and 70%).
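The benchmark categories referenced here can be sketched as a simple classifier. The exact cut-offs below (bad below 50, sufficient 50–70, good 70–85, very good 85 and above) are an assumption inferred from the ranges mentioned in the text, and the function name is illustrative only:

```python
def mcts_category(score: float) -> str:
    """Classify an MCTS score (0-100) using the assumed campus benchmarks."""
    if score < 50:
        return "bad"
    elif score < 70:
        return "sufficient"
    elif score < 85:
        return "good"
    else:
        return "very good"

# Group mean scores reported in the text
eg_label = mcts_category(70.02)  # experimental group -> "good"
cg_label = mcts_category(51.68)  # control group -> "sufficient"
```

Under these assumed boundaries the classifier reproduces the categories the text assigns to the two group means.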

The posttest and mean-difference test results show that students who learned mathematics using the MLA demonstrated a much greater increase in MCTS than students who received conventional instruction. This is plausible because, in learning mathematics with the MLA, the paradigm of learning centered on the teacher (lecturer) has shifted toward learning that emphasizes students' own activity in constructing and reconstructing knowledge. In line with Radjibu et al. (2020), a more conducive learning atmosphere can be created by changing the outlook on the class from a collection of individuals toward a learning community, with the teacher becoming a motivator, a facilitator, and a manager of learning.

Based on the normalized-gain calculation, the EG as a whole showed an increase in MCTS of 60.37%, while the CG reached only 38.33%. This means the increase in MCTS experienced by the EG was greater than that of the CG.
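The normalized gain used here is the standard Hake formula: the actual gain divided by the maximum possible gain. A minimal sketch, using the group means reported in the text (the function name is my own):

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Normalized gain: actual gain relative to the maximum possible gain."""
    return (post - pre) / (max_score - pre)

# Whole-group mean scores reported in the text
eg_gain = normalized_gain(24.35, 70.02)  # experimental group -> ~0.6037 (60.37%)
cg_gain = normalized_gain(21.66, 51.68)  # control group      -> ~0.383  (38.33% as reported)
```

Plugging in the reported pretest and posttest means reproduces the 60.37% and 38.33% figures above.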

Before discussing in more depth the increase in MCTS for each subgroup of the EG, some observations are needed regarding conventional mathematics learning in the CG. The posttest results showed a significant increase in MCTS: the CG's highest score reached 85, the average grade was quite good, and the overall improvement was classified as medium. This indicates that when conventional mathematics learning is carried out properly, it still gives positive results in improving students' MCTS.

In more detail, the average increase in MCTS for each experimental subgroup was 65.41% for the high subgroup, 59.82% for the middle subgroup, and 56.05% for the low subgroup. In other words, for the EG, both overall and within each subgroup, the increase in MCTS is classified in the medium category.

One-way ANOVA testing at the 0.01 significance level showed no differences in the increase in MCTS among the low, medium, and high subgroups of students who learned mathematics using the MLA. In other words, the MLA had a similar effect on the increase in MCTS of each subgroup, so every subgroup obtained comparable results.

The ability to generalize and consider the results of generalization

The ability to generalize and consider the results of generalization was measured by MCTS test items 1 and 5, with a maximum combined score of 25. The EG's prior knowledge on this critical thinking indicator was relatively low, only 9.84 (39.36%), whereas its final ability was classified as high, at 20.55 (82.20%).

These figures also show an increase in the EG's MCTS in generalizing and considering the results of generalization of 70.65% relative to the initial ability (calculated with the normalized-gain formula). Thus, the increase in this ability in the EG is classified as high.
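The same normalized-gain formula applies at the aspect level, with the subscale maximum (25 for the two generalization items) in place of 100. A quick check against the figures above:

```python
# Aspect-level normalized gain for generalization (items 1 and 5, max score 25),
# using the EG means reported in the text
pre, post, max_score = 9.84, 20.55, 25.0
gain = (post - pre) / (max_score - pre)  # -> ~0.7065, i.e. about 70.65%
```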

Based on Table 4, the values are JK t (total) = 2.258; JK a (between groups) = 0.127; JK i (within groups) = 2.131; RJK a (mean square between) = 0.06351; and RJK i (mean square within) = 0.05073, yielding F count = 1.252. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.
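The F statistic follows directly from the reported sums of squares: each mean square is a sum of squares divided by its degrees of freedom, and F is their ratio. A sketch using the Table 4 values (variable names mirror the text's notation):

```python
# One-way ANOVA from the sums of squares reported for Table 4
jk_a, jk_i = 0.127, 2.131   # between-groups and within-groups sums of squares
df_a, df_i = 2, 42          # 3 subgroups - 1; 45 students - 3 subgroups
rjk_a = jk_a / df_a         # mean square between -> ~0.0635
rjk_i = jk_i / df_i         # mean square within  -> ~0.0507
f_count = rjk_a / rjk_i     # -> ~1.25, well below F table = 5.15 at the 1% level
```

The same arithmetic reproduces the F values reported for Tables 5 through 9 from their respective sums of squares.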

One-way ANOVA testing at the 0.01 significance level showed that F count = 1.252 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to generalize and consider the results of generalization among the low, medium, and high subgroups of the EG. In other words, the MLA had a similar effect on this ability in each subgroup, so every subgroup obtained comparable results.

The ability to identify relevance

The MCTS in identifying relevance was measured by MCTS test items 3a and 4a. With a maximum combined score of 10, the EG's initial ability on this critical thinking indicator was very low, only 2.85 (28.50%), while its final ability was classified as high, at 8.29 (82.90%).

These figures show an increase in the EG's MCTS in identifying relevance of 76.08% relative to the initial ability (calculated with the normalized-gain formula). Thus, the increase in the ability to identify relevance in the EG is classified as high.

Table 5 displays the values JK t = 2.105; JK a = 0.219; JK i = 1.886; RJK a = 0.11; and RJK i = 0.04491, yielding F count = 2.438. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.

One-way ANOVA testing at the 0.01 significance level showed that F count = 2.438 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to identify relevance among the low, medium, and high subgroups of the EG. In other words, the MLA had a similar effect on the ability to identify relevance in each subgroup, so every subgroup obtained comparable results.

The ability to formulate a problem into a mathematical model

The MCTS in formulating a problem into a mathematical model was measured with MCTS test items 3b and 4b. With a maximum combined score of 10, the EG's prior knowledge on this critical thinking indicator was only 1.66 (16.60%), classified as very low, while its final ability was 7.06 (70.60%), classified as high.

Based on this description, the increase in the EG's MCTS in formulating a problem into a mathematical model is 64.75% relative to the initial ability (calculated with the normalized-gain formula). Thus, the increase in this ability in the EG is categorized as medium.

Table 6 shows the values JK t = 2.541; JK a = 0.02486; JK i = 2.516; RJK a = 0.01243; and RJK i = 0.05991, yielding F count = 0.207. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.

One-way ANOVA testing at the 0.01 significance level showed that F count = 0.207 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to formulate a problem into a mathematical model among the low, medium, and high subgroups of the EG. In other words, the MLA was equally effective in improving this ability in every subgroup, so every subgroup obtained comparable results.

The ability to make deductions using a principle

Students' MCTS in making deductions using a principle was measured by MCTS items 3c and 4c. With a maximum combined score of 10, the EG's prior knowledge on this critical thinking indicator was very low, only 0.77 (7.70%), while its final ability was 5.73 (57.30%), classified as sufficient.

These figures show an increase in the EG's MCTS in making deductions using a principle of 53.74% relative to the initial ability (calculated with the normalized-gain formula). Thus, it can be concluded that this increase in the EG falls into the medium criteria.

Table 7 displays the values JK t = 5.022; JK a = 0.09336; JK i = 4.929; RJK a = 0.04668; and RJK i = 0.117, yielding F count = 0.398. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.

One-way ANOVA testing at the 0.01 significance level showed that F count = 0.398 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to make deductions using a principle among the low, medium, and high subgroups of the EG. In other words, the MLA had a similar effect on this aspect of MCTS in each subgroup, so every subgroup obtained comparable results.

The ability to provide examples of inference

Students' MCTS in giving examples of inference was measured with MCTS test items 3d and 4d. With a maximum combined score of 10, the EG's prior knowledge on this critical thinking indicator was only 0.98 (9.80%), while its final ability reached 4.94 (49.40%).

Based on these figures, the increase in the EG's MCTS in giving examples of inference is 43.90% relative to the initial ability (calculated with the normalized-gain formula). This means the increase in this ability in the EG was relatively medium.

Table 8 shows the values JK t = 9.255; JK a = 0.174; JK i = 9.081; RJK a = 0.087; and RJK i = 0.216, yielding F count = 0.403. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.

One-way ANOVA testing at the 0.01 significance level showed that F count = 0.403 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to give examples of inference among the low, medium, and high subgroups of the EG. In other words, the MLA had a similar effect on this ability in each subgroup, so every subgroup obtained comparable results.

The ability to reconstruct arguments

Students' MCTS in reconstructing arguments was measured with MCTS test items 2 and 6. With a maximum combined score of 35, the EG's prior knowledge on this critical thinking indicator was only 8.25 (23.57%), while its final ability was 23.45 (67.00%).

From these figures, the increase in the EG's MCTS in reconstructing arguments is 56.98% relative to the initial ability (calculated with the normalized-gain formula). This means the increase in the ability to reconstruct arguments in the EG is classified as medium.

Table 9 displays the values JK t = 1.723; JK a = 0.08718; JK i = 1.636; RJK a = 0.04359; and RJK i = 0.03895, yielding F count = 1.119. At the 0.01 significance level with degrees of freedom 2 and 42, the value F table = 5.15 is obtained.

One-way ANOVA testing at the 0.01 significance level showed that F count = 1.119 was smaller than F table = 5.15. This means there were no differences in the increase in the ability to reconstruct arguments among the low, medium, and high subgroups of the EG. In other words, the MLA had a similar effect on the ability to reconstruct arguments in each subgroup, so every subgroup obtained comparable results.

Furthermore, the overall pretest and posttest score acquisition percentages for MCTS, covering the aspects of generalizing and considering the results of generalization (G), identifying relevance (IR), formulating a problem into a mathematical model (MM), making deductions using a principle (D), giving examples of inference (C), and reconstructing arguments (RA), are presented in Fig. 1, while Fig. 2 presents the percentage of gain acquisition in MCTS across all six aspects.

Figure 1: Percentage of score acquisition for each mathematical critical thinking skill aspect

Figure 2: Percentage of gain acquisition for each critical thinking aspect of mathematics

Based on Fig. 1, of the six aspects of MCTS measured in the study, the highest increases in students' MCTS were in generalizing and considering the results of generalization and in identifying relevance, while the lowest increase was in providing examples of inference. This was likely because, based on the trial results, the questions measuring generalizing and considering the results of generalization and identifying relevance were easier than the others, while the questions measuring the provision of examples of inference were the most difficult.

As displayed in Fig. 2, of the six aspects of MCTS measured, the highest gain was in identifying relevance, while the lowest was in providing examples of inference. For the same reason as before, the questions measuring identification of relevance were easier than the others, while the questions measuring the provision of examples of inference were the most difficult.

The MCTS of students who followed mathematics teaching using the MLA was much better than the MCTS of students who learned conventionally (Craig et al. 2020; Teng 2020). Students who learned using the MLA achieved good MCTS, while the critical thinking of students who learned conventionally was classified as medium. Nevertheless, the students who learned conventionally still showed a significant improvement in MCTS, with the highest score in the very good category and a fairly good average grade. This indicates that even when mathematics is taught conventionally, it still gives positive results in improving students' ability if it is carried out properly.

In learning mathematics with the MLA, the teacher-centered learning paradigm has shifted toward learning that emphasizes student activity in constructing and reconstructing their own knowledge (Hidayat et al. 2018; Sumarmo 2002; Tachie 2019). It also makes students more active (Ruseffendi 1991; Schermerhorn Jr and Bachrach 2020; van Rhijn et al. 2016; Wulf and Lewthwaite 2016). Besides, the MLA trains metacognition and provides opportunities for students to learn on their own, with each student filling out the given SW according to the instructions (Atmatzidou et al. 2018; Erdoğan and Şengül 2017; Ross 1995; Xu and Ko 2019). In addition, a more conducive learning atmosphere can be created by changing the view of the class from a collection of individuals toward a learning community, with the lecturer shifting from trainer to motivator, facilitator, and learning manager (Hidayat et al. 2018; Sumarmo 2002; Tachie 2019).

In addition, learning mathematics with the MLA significantly improved the MCTS of each subgroup of students compared with conventional learning (Amin 2020; Erdogan 2019; Shanta 2020; Suryadi 2005). The MLA involved students' metacognition (Amin et al. 2020; Mohseni et al. 2020). Metacognition is an essential component in learning a science like mathematics (Kristensen et al. 2020). It made students regulate themselves and use their knowledge to solve problems (Teng 2020). It also challenged the students to think critically (Yildirim and Ersözlü 2013). With good metacognition, according to Akturk and Sahin, Bonner, and Van Zile-Tamsen (as cited in Craig et al. 2020, p. 156), students are expected to have awareness of their learning and comprehension, to understand their own and others' thoughts, and to monitor and increase the efficiency of their cognitive processes.

Moreover, with metacognition, students can assess the needs of a project and pick out an appropriate method for completing it, monitor their progress toward a goal and regulate their use of methods, reflect on their decision-making process, and infer the mental states of others (Craig et al. 2020). In short, with metacognition, students achieve their academic targets (Craig et al. 2020; Mathabathe 2019).

Based on the statistical calculation results, it was concluded that there was no difference in the increase in MCTS among the low-, medium-, and high-level subgroups of students who learned mathematics using the MLA. In other words, the MLA has the same effect in increasing the MCTS of any subgroup of students (Craig et al. 2020; Teng 2020).

Specifically, the increase in generalizing and considering the results of generalization falls into the high-level category, as does the increase in students' MCTS in identifying relevance. Meanwhile, the increases in MCTS in formulating a problem into a mathematical model, making deductions with a principle, providing examples of inference, and reconstructing arguments belong to the medium-level category. These results were influenced by the metacognition the students had (Lenzo et al. 2020; Salguero et al. 2020; Seow and Gillan 2020).

In general, learning mathematics using the MLA makes students more active while the learning activities take place; students get more opportunities to explore the material together with the lecturer and their peers through discussion activities. Factors that significantly supported the implementation of learning mathematics using the MLA were: (1) cooperation with, and assistance from, lecturers who acted as observers and discussion partners in overcoming each obstacle faced in the learning process (Backer et al. 2020); and (2) students' active engagement and participation in learning (Azizi and Herman 2020; Hartini et al. 2020; Suryani et al. 2020; Umam et al. 2020; Uddin et al. 2020). In short, learning with the MLA combines teaching materials that train metacognition, lecturer intervention, and class interaction (English 2016; Su et al. 2016; Suryadi 2005; Verschaffel et al. 2020; Wang et al. 2021).

In addition, some obstacles were encountered in learning mathematics using the MLA: (1) the time available for developing the learning was relatively limited; (2) it was difficult to write exercise questions for the student worksheets that could improve students' MCTS; and (3) it was difficult to form discussion groups with varying levels of mathematical ability among the members so that productive group discussion would occur in each group (Munawarah et al. 2020).

Conclusion and recommendation for future research

Learning with the MLA can be used as an approach to teaching mathematics that enhances students' MCTS, especially in the aspects of making generalizations and considering the results of generalization, identifying relevance, formulating a problem into a mathematical model, making deductions using a principle, providing examples of inference, and reconstructing arguments. Learning mathematics with the MLA emphasizes student activity in the learning process to optimize student involvement, and it turned out to be quite effective in creating a good learning atmosphere, although it requires skill from the lecturer or teacher both in mathematical content and in learning methodology. For this reason, lecturers and teachers are expected to keep trying to improve their teaching ability and mathematical knowledge through various sources, for example research results and journals. Further research will observe and report on students' classroom activity when the MLA is applied.

Data availability

The study does not provide its data publicly because they contain information that could compromise the privacy of the research participants. When necessary, please contact the corresponding author.

Code availability

Not applicable.

Abramovich S, Grinshpan AZ, Milligan DL (2019) Teaching mathematics through concept motivation and action learning. Educ Res Int. https://doi.org/10.1155/2019/3745406


Ahdhianto E, Haryanto M, Nurfauzi Y (2020) Improving fifth-grade students’ mathematical problem-solving and critical thinking skills using problem-based learning. Univ J Educ Res 8(5):2012–2021. https://doi.org/10.13189/ujer.2020.080539

Akben N (2020) Effects of the problem-posing approach on students’ problem-solving skills and metacognitive awareness in science education. Res Sci Educ 50(3):1143–1165. https://doi.org/10.1007/s11165-018-9726-7

Al-Gaseem M, Bakkar B, Al-Zoubi S (2020) Metacognitive thinking skills among talented science education students. J Educ Gifted Young Sci 8(2):897–904. https://doi.org/10.17478/JEGYS.707205

Amin AM (2020) The correlation between metacognitive skills and critical thinking skills at the implementation of four different learning strategies in animal physiology lectures. Eur J Educ Res 9(1):143–163. https://doi.org/10.12973/eu-jer.9.1.143

Amin AM, Corebima AD, Zubaidah S, Mahanal S (2020) The correlation between metacognitive skills and critical thinking skills at the implementation of four different learning strategies in animal physiology lectures. Eur J Educ Res 9(1):143–163. https://doi.org/10.12973/eu-jer.9.1.143

Atmatzidou S, Demetriadis S, Nika P (2018) How does the degree of guidance support students’ metacognitive and problem-solving skills in educational robotics? J Sci Educ Technol 27(1):70–85

Azizi H, Herman T (2020) Critical thinking and communication skills of 10th grade students in trigonometry. J Phys 1469(1):10. https://doi.org/10.1088/1742-6596/1469/1/012161

Backer LD, Keer HV, Valcke M (2020) Variations in socially shared metacognitive regulation and their relation with university students’ performance. Metacogn Learn 15(2):233–259. https://doi.org/10.1007/s11409-020-09229-5

Begle EG (1969) The role of research in the improvement of mathematics education. Educ Stud Math 2:232–244

Begle EG (1979) Critical variables in mathematics education: findings from a survey of the empirical literature

Behar-Horenstein LS, Niu L (2011) Teaching critical thinking skills in higher education: a review of the literature. J Coll Teach Learn (TLC) 8(2)

Bhattacharyya P, Pradhan RK (2015) Perceived paternal parenting style and proactive coping strategies of Indian adolescents. Int J Psychol Stud 7(2):180

Boldureanu G, Alina M, Bercu A, Boldureanu D, Bedrule-grigorut MV (2018) Entrepreneurship education through successful entrepreneurial models in higher education institutions. MDPI Sustainability 1–33

Boyd R, Richerson PJ, Henrich J (2011) The cultural niche: why social learning is essential for human adaptation. Proc Natl Acad Sci 108(Supplement 2):10918–10925

Bransford J, Bransford JD, Brown AL, Cocking RR (1999) How people learn: brain, mind, experience, and school. National Academies Press


Chew MSF, Shahrill M, Li H-C (2019) The integration of a problem-solving framework for Brunei high school mathematics curriculum in increasing student’s affective competency. J Math Educ 10(2):215–228

Corno L (1986) The metacognitive control components of self-regulated learning. Contemp Educ Psychol 11(4):333–346

Craig K, Hale D, Grainger C, Stewart ME (2020) Evaluating metacognitive self-reports: systematic reviews of the value of self-report in metacognitive research. Metacogn Learn 15(2):155–213. https://doi.org/10.1007/s11409-020-09222-y

Cresswell C, Speelman CP (2020) Does mathematics training lead to better logical thinking and reasoning? A cross-sectional assessment from students to professors. PLoS ONE 15:1–21. https://doi.org/10.1371/journal.pone.0236153

Dewey J (1916) Democracy and education by John Dewey. Project Gutenberg

English LD (2016) STEM education K-12: perspectives on integration. Int J STEM Educ 3(1):1–8

Ennis RH (1985) A logical basis for measuring critical thinking skills. Educ Leadersh 43(2):44–48

Erdogan F (2019) Effect of cooperative learning supported by reflective thinking activities on students’ critical thinking skills. Eurasian J Educ Res 19(80):89–112

Erdoğan F, Şengül S (2017) The effect of cooperative learning method enhanced with metacognitive strategies on students' metacognitive skills in maths course. Educ Sci 42(192)

Faivre N, Vuillaume L, Bernasconi F, Salomon R, Blanke O, Cleeremans A (2020) Sensorimotor conflicts alter metacognitive and action monitoring. Cortex 124:224–234. https://doi.org/10.1016/j.cortex.2019.12.001

Feriyanto F, Putri ROE (2020) Developing mathematics module based on literacy and higher order thinking skills (HOTS) questions to train critical thinking ability of high school students in Mojokerto. J Phys. https://doi.org/10.1088/1742-6596/1594/1/012014

Gacel-Ávila J (2005) The internationalization of higher education: a paradigm for global citizenry. J Stud Int Educ 9(2):121–136

Galotti KM, Mark MC (1994) How do high school students structure an important life decision? A short-term longitudinal study of the college decision-making process. Res High Educ 35(5):589–607

García-García J, Dolores-Flores C (2021) Exploring pre-university students’ mathematical connections when solving Calculus application problems. Int J Math Educ Sci Technol 52(6):912–936

Geary DC (1995) Reflections of evolution and culture in children’s cognition: implications for mathematical development and instruction. Am Psychol 50(1):24

Hargrove RA (2013) Assessing the long-term impact of a metacognitive approach to creative skill development. Int J Technol Des Educ 23(3):489–517

Hargrove RA, Nietfeld JL (2015) The impact of metacognitive instruction on creative problem solving. J Exp Educ 83(3):291–318

Hartini S, Mariani I, Misbah, Sulaeman NF (2020) Developing of students worksheets through STEM approach to train critical thinking skills. J Phys 1567(4):10. https://doi.org/10.1088/1742-6596/1567/4/042029

Herman T, Dahlan JA (2017) The enhancement of students’ critical thinking skills in mathematics through the 5E learning cycle with metacognitive technique. 57(ICMSEd 2016) 10:101–106. https://doi.org/10.2991/icmsed-16.2017.23

Hidayat R, Zulnaidi H, Zamri SNAS (2018) Roles of metacognition and achievement goals in mathematical modeling competency: a structural equation modeling analysis. PLoS ONE 13(11):1–25. https://doi.org/10.1371/journal.pone.0206211

Hiebert J (2013) Conceptual and procedural knowledge: the case of mathematics. Routledge

Jonsson B, Norqvist M, Liljekvist Y, Lithner J (2014) Learning mathematics through algorithmic and creative reasoning. J Math Behav 36:20–32. https://doi.org/10.1016/j.jmathb.2014.08.003

Kallio H, Kalio M, Virta K, Iiskala T, Hotulainen R (2020) Teachers’ support for learners’ metacognitive awareness. Scand J Educ Res. https://doi.org/10.1080/00313831.2020.1755358

Kayaalp F, Meral E, Simsek U, Sahin IF (2020) A search for a method to improve critical thinking skills in social studies teaching: writing-to-learn. Rev Int Geogr Educ 10(3):400–430. https://doi.org/10.33403/rigeo.719222

Khanal B, Panthi RK, Kshetree MP, Acharya BR, Belbase S (2021) Mathematics learning strategies of high school students in Nepal. SN Social Sci 1(7):1–28. https://doi.org/10.1007/s43545-021-00165-y

Kristensen S, Sandberg K, Bibby BM (2020) Regression methods for metacognitive sensitivity. J Math Psychol. https://doi.org/10.1016/j.jmp.2019.102297

Kwang TS (2000) The effect of metacognitive training on the mathematical word problem solving of Singapore 11–12-year-olds in a computer environment. PhD Thesis, School of Education (June 2002), 46–55

Laurens T, Batlolona FA, Batlolona JR, Leasa M (2017) How does realistic mathematics education (RME) improve students’ mathematics cognitive achievement? EURASIA J Math Sci Technol Educ 14(2):569–578

Lave J (1988) Cognition in practice: mind, mathematics and culture in everyday life. Cambridge University Press


Lenzo V, Sardella A, Martino G, Quattropani MC (2020) A systematic review of metacognitive beliefs in chronic medical conditions. Front Psychol 10. https://doi.org/10.3389/fpsyg.2019.02875

Li Y, Schoenfeld AH (2019) Problematizing teaching and learning mathematics as “given” in STEM education. Int J STEM Educ. https://doi.org/10.1186/s40594-019-0197-9

Listiani NW, Wiarta IW, Darsana IW (2014) Penerapan Model Pembelajaran Metakognitif Siswa Kelas V SD Gugus 8 Blahbatuh [Application of the metacognitive learning model for fifth-grade elementary school students in Cluster 8 Blahbatuh]. Mimbar PGSD Universitas Pendidikan Ganesha 2(1):1–10

Logan RK, Tandoc M (2018) Thinking in patterns and the pattern of human thought as contrasted with AI data processing. Information 9(4):83

Lysaker PH, Gagen E, Klion R, Zalzala A, Vohs J, Faith LA, Leonhardt B, Hamm J, Hasson-Ohayon I (2020) Metacognitive reflection and insight therapy: a recovery-oriented treatment approach for psychosis. Psychol Res Behav Manag 13:331–341. https://doi.org/10.2147/PRBM.S198628

Mahmud R (2017) The development of social learning model based on metacognitive strategies to foster mathematics self-efficacy of senior high school students 9 Makassar, Indonesia. Eurasia J Math Sci Technol Educ 13(8):4873–4883. https://doi.org/10.12973/eurasia.2017.00970a

Maillard P, Dimaggio G, Berthoud L, de Roten Y, Despland J-N, Kramer U (2020) Metacognitive improvement and symptom change in a 3-month treatment for borderline personality disorder. Psychol Psychother Theory Res Pract 93(2):309–325. https://doi.org/10.1111/papt.12219

Marzano RJ (1988) Dimensions of thinking: a framework for curriculum and instruction. ERIC

Mathabathe K (2019) Factors underlying metacognitive judgements in foundation chemistry. EURASIA J Math Sci Technol Educ. https://doi.org/10.29333/ejmste/105868

Mayadiana D (2005) Pembelajaran dengan Pendekatan Diskursif untuk Mengembangkan Kemampuan Berpikir Kritis Mahasiswa Calon Guru SD [Learning with a discursive approach to develop the critical thinking skills of prospective elementary school teachers]. UPI Bandung: unpublished

Meyers C (1986) Teaching students to think critically. A guide for faculty in all disciplines. Jossey-Bass Higher Education Series. ERIC

Miegel F, Demiralay C, Moritz S, Wirtz J, Hottenrott B, Jelinek L (2020) Metacognitive training for obsessive-compulsive disorder: a study protocol for a randomized controlled trial. BMC Psychiatry. https://doi.org/10.1186/s12888-020-02648-3

Mohseni F, Seifoori Z, Ahangari S (2020) The impact of metacognitive strategy training and critical thinking awareness-raising on reading comprehension. Cogent Educ. https://doi.org/10.1080/2331186X.2020.1720946

Monteiro S, Sherbino J, Sibbald M, Norman G (2020) Critical thinking, biases and dual processing: the enduring myth of generalisable skills. Med Educ 54(1):66–73. https://doi.org/10.1111/medu.13872

Moodley T, Adendorff SA, Pather S (2015) At-risk student teachers’ attitudes and aspirations as learners and teachers of mathematics. S Afr J Childh Educ 5(3):1–10

Munawarah M, Haji AG, Maulana I (2020) Developing Problem-Based worksheet to improve students’ critical thinking skills and learning outcomes in the concept of chemical bonding. J Phys. https://doi.org/10.1088/1742-6596/1460/1/012099

Murphy J, Chang J-M, Suaray K (2016) Student performance and attitudes in a collaborative and flipped linear algebra course. Int J Math Educ Sci Technol 47(5):653–673

Nakagawa Y, Saijo T (2020) Future design as a metacognitive intervention for presentism. Sustainability (Switzerland). https://doi.org/10.3390/su12187552

Oates T (2011) Could do better: using international comparisons to refine the National Curriculum in England. Curric J 22(2):121–150

Radjibu PIVD, Kuswanto H, Sugiharto (2020) Analysis of critical thinking skills and scientific communication of students for SHM concepts assisted by Ispring quiz maker test instrument. J Phys. https://doi.org/10.1088/1742-6596/1440/1/012054

Rapanta C, Botturi L, Goodyear P, Guàrdia L, Koole M (2020) Online university teaching during and after the Covid-19 crisis: refocusing teacher presence and learning activity. Postdigital Sci Educ 2(3):923–945

Ross JA (1995) Students explaining solutions in student-directed groups: cooperative learning and reform in mathematics education. Sch Sci Math 95(8):411–416

Ruseffendi ET (1988) Pengajaran matematika modern dan masa kini untuk guru dan calon guru. Bandung: Tarsito.

Ruseffendi ET (1991) Penilaian Pendidikan dan Hasil Belajar Siswa Khususnya dalam Pengajaran Matematika untuk Guru dan Calon Guru. In Bandung: Diktat

Ruseffendi ET (2006) Pengantar kepada membantu guru mengembangkan kompetensinya dalam pengajaran matematika untuk meningkatkan CBSA. Bandung: Tarsito

Rustaman NY (1990) Pendidikan dan penelitian sains dalam mengembangkan keterampilan berpikir tingkat tinggi untuk pembangunan karakter. Seminar Nasional VIII Pendidikan Biologi 15:16–34

Saiz C, Rivas SF (2011) Evaluation of the ARDESOS program: an initiative to improve critical thinking skills. J Scholarsh Teach Learn 11(2):34–51

Salguero JM, Garcia-Sancho E, Ramos-Cejudo J, Kannis-Dymand L (2020) Individual differences in anger and displaced aggression: the role of metacognitive beliefs and anger rumination. Aggress Behav 46(2):162–169. https://doi.org/10.1002/ab.21878

Saritepeci M (2020) Predictors of cyberloafing among high school students: unauthorized access to school network, metacognitive awareness and smartphone addiction. Educ Inf Technol 25(3):2201–2219. https://doi.org/10.1007/s10639-019-10042-0

Schermerhorn JR Jr, Bachrach DG (2020) Exploring management. Wiley

Schoenfeld AH (2018) Learning to think mathematically: Problem solving, metacognition, and sense-making in mathematics. Learn Think Math 2017:1–10

Seow TXF, Gillan CM (2020) Transdiagnostic phenotyping reveals a host of metacognitive deficits implicated in compulsivity. Sci Rep. https://doi.org/10.1038/s41598-020-59646-4

Shanta S (2020) T/E design based learning: assessing student critical thinking and problem solving abilities. Int J Technol Des Educ. https://doi.org/10.1007/s10798-020-09608-8

Sherwood CC, Subiaul F, Zawidzki TW (2008) A natural history of the human mind: tracing evolutionary changes in brain and cognition. J Anat 212(4):426–454. https://doi.org/10.1111/j.1469-7580.2008.00868.x

Shovkova O (2019) Better learning through metacognitive monitoring: developing students’ critical thinking. Sci Notes Ostroh Acad Natl Univ 1(9):57–65. https://doi.org/10.25264/2415-7384-2019-9-57-65

Spector JM, Ma S (2019) Inquiry and critical thinking skills for the next generation: from artificial intelligence back to human intelligence. Smart Learn Environ 6(1):10. https://doi.org/10.1186/s40561-019-0088-z

Stiff LV, Harvey WB (1988) On the education of black children in mathematics. J Black Stud 19(2):190–203

Su HFH, Ricci FA, Mnatsakanian M (2016) Mathematical teaching strategies: pathways to critical thinking and metacognition. Int J Res Educ Sci 2(1):190–200

Sumarmo U (2002) Alternatif pembelajaran matematika dalam menerapkan kurikulum berbasis kompetensi. Makalah Disajikan Pada Seminar Nasional FPMIPA UPI: Tidak Diterbitkan

Suryadi D (2005) Penggunaan pendekatan pembelajaran tidak langsung serta pendekatan gabungan langsung dan tidak langsung dalam rangka meningkatkan kemampuan berpikir matematik tingkat tinggi siswa SLTP. Universitas Pendidikan Indonesia

Suryani I, Senam, Wilujeng I (2020) Analysis of Junior High School student’s critical thinking skills integrated with the local potential of eremerasa nature tourism. J Phys 1440(1):10. https://doi.org/10.1088/1742-6596/1440/1/012096

Susantini E, Puspitawati RP, Raharjo, Suaidah HL (2021) E-book of metacognitive learning strategies: design and implementation to activate student’s self-regulation. Res Pract Technol Enhanc Learn 16(1):10. https://doi.org/10.1186/s41039-021-00161-z

Suter WN (2011) Introduction to educational research: a critical thinking approach. SAGE Publications, Thousand Oaks

Syaiful (2019) Communication skills and mathematical problem solving ability among junior high schools students through problem-based learning. Int J Sci Technol Res 8(11):1048–1060

Syaiful S (2011) Metakognisi Siswa Dalam Pembelajaran Matematika Realistik Di Sekolah Menengah Pertama. Edumatica 1(2):1–13

Syaiful S (2013) The teaching model to enhance mathematical problem solving ability in junior high school teacher. Int J Educ Res 1(9):69–78

Syaiful, Kamid, Muslim, Huda N, Mukminin A, Habibi A (2020) Emotional quotient and creative thinking skills in Mathematics. Univ J Educ Res 8(2):499–507. https://doi.org/10.13189/ujer.2020.080221

Tachie SA (2019) Meta-cognitive skills and strategies application: how this helps learners in mathematics problem-solving. Eur J Math Sci Technol Educ. https://doi.org/10.29333/ejmste/105364

Teng F (2020) The benefits of metacognitive reading strategy awareness instruction for young learners of English as a second language. Literacy 54(1):29–39. https://doi.org/10.1111/lit.12181

Thorslund J, McEvoy PM, Anderson RA (2020) Group metacognitive therapy for adolescents with anxiety and depressive disorders: a pilot study. J Clin Psychol 76(4):625–645. https://doi.org/10.1002/jclp.22914

Tohir M (2019) Keterampilan Berpikir Kreatif Siswa dalam Menyelesaikan Soal Olimpiade Matematika Berdasarkan Level Metakognisi. Alifmatika 1(1):1–14

Uddin MR, Shimizu K, Widiyatmoko A (2020) Assessing secondary level students’ critical thinking skills: Inspiring environmental education for achieving sustainable development goals. J Phys. https://doi.org/10.1088/1742-6596/1567/2/022043

Umam A, Suparmi A, Sukarmin S (2020) Analysis of critical thinking skill profile on the concept of simple harmonic motion using two tier instrument test. J Phys. https://doi.org/10.1088/1742-6596/1567/3/032085

van Rhijn T, Lero DS, Burke T (2016) Why go back to school? Investigating the motivations of student parents to pursue post-secondary education. New Horizons Adult Educ Human Resource Dev 28(2):14–26

Verschaffel L, Depaepe F, Mevarech Z (2019) Learning mathematics in metacognitively oriented ICT-based learning environments: a systematic review of the literature. Educ Res Int. https://doi.org/10.1155/2019/3402035

Verschaffel L, Schukajlow S, Star J, Van Dooren W (2020) Word problems in mathematics education: a survey. ZDM Math Educ 52(1):1–16

Wang M, Binning KR, Del Toro J, Qin X, Zepeda CD (2021) Skill, thrill, and will: the role of metacognition, interest, and self-control in predicting student engagement in mathematics learning over time. Child Dev 92:1369

Wijaya AMY, Hobri Prastiti TD, Dafik, Suratno (2020) The analysis of learning materials implementation using inquiry based learning method to enhance student’s critical thinking skills in solving two dimensional problem. J Phys 1465(1):10. https://doi.org/10.1088/1742-6596/1465/1/012065

Wilson D, Conyers M (2016) Teaching students to drive their brains: metacognitive strategies, activities, and lesson ideas. In: Ascd

Winne PH (1996) A metacognitive view of individual differences in self-regulated learning. Learn Individ Differ 8(4):327–353

Wulf G, Lewthwaite R (2016) Optimizing performance through intrinsic motivation and attention for learning: the OPTIMAL theory of motor learning. Psychon Bull Rev 23(5):1382–1414

Xu H, Ko PY (2019) Enhancing teachers’ knowledge of how to promote self-regulated learning in primary school students: a case study in Hong Kong. Teach Teach Educ 80:106–114

Yildirim S, Ersözlü ZN (2013) The relationship between students’ metacognitive awareness and their solutions to similar types of mathematical problems. Eur J Math Sci Technol Educ 9(4):411–415. https://doi.org/10.12973/eurasia.2013.946a

Yusnaeni Y, Corebima AD, Susilo H, Zubaidah S (2020) The contribution of metacognitive skills and creative thinking skills in 21st century learning. Univ J Educ Res 8(4):31–36. https://doi.org/10.13189/ujer.2020.081805

Download references

Acknowledgements

The authors thank the university students for their time and cooperation in this study.

No funding was received.

Author information

Authors and Affiliations

Postgraduate Program, Mathematics Education Study Program, Universitas Jambi, Kampus UNJA Pasar, Jl. Raden Mattaher No. 16, Jambi, 36133, Indonesia

Syaiful, Nizlel Huda, Amirul Mukminin & Kamid

Contributions

This study is useful both theoretically and practically. Theoretically, it can serve as a reference for mathematics lecturers in future research on enhancing students' MCTS through the MLA, and it can broaden lecturers' knowledge, particularly of making and evaluating generalizations, identifying relevance, formulating problems as mathematical models, making deductions from principles, giving examples of inference, and reconstructing arguments. Practically, it provides empirical evidence of how the MLA enhances students' MCTS, and it can inform decision-making by principals, instructors, and especially mathematics lecturers on matters related to teaching mathematics at universities. All authors participated in the design, data analysis, and interpretation; SS was a major contributor in writing the manuscript; NH, AM, and KK revised the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Syaiful.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file1 (DOCX 21 kb)

Supplementary file2 (XLSX 21 kb)

Supplementary file3 (DOCX 16 kb)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Syaiful, Huda, N., Mukminin, A. et al. Using a metacognitive learning approach to enhance students' critical thinking skills through mathematics education. SN Soc Sci 2, 31 (2022). https://doi.org/10.1007/s43545-022-00325-8

Received: 25 March 2021

Accepted: 03 February 2022

Published: 17 March 2022

DOI: https://doi.org/10.1007/s43545-022-00325-8


Keywords

  • Learning approach
  • Metacognitive
  • Critical thinking skill
  • Mathematics
  • University students


Android-based math learning to improve critical thinking

A Widiyatmoko 1 , S Utaminingsih 2 and Santoso 2

Published under licence by IOP Publishing Ltd. Journal of Physics: Conference Series, Volume 1823. Second UPY International Conference on Applied Science and Education (2nd UPINCASE) 2020, 3-4 November 2020, Yogyakarta, Indonesia.

Citation: A Widiyatmoko et al 2021 J. Phys.: Conf. Ser. 1823 012091. DOI: 10.1088/1742-6596/1823/1/012091


Author affiliations

1 Student at Post-Graduate of Primary Teacher Education Universitas Muria Kudus, Indonesia

2 Lecturer at Post-Graduate of Primary Teacher Education Universitas Muria Kudus, Indonesia


This research aims to describe and test the effectiveness of Android-based integer learning through Smart Apps Creator (SAC) in improving the critical thinking skills of elementary school students. The research method is research and development. The development procedure is adapted from Borg & Gall and uses ten stages: research and information collecting, planning, developing a preliminary form of the product, preliminary field testing, main product revision, main field testing, operational product revision, operational field testing, final product revision, and dissemination and implementation. Product validation was performed by material, media, and practitioner validators. Research data were obtained from questionnaires, tests, observations, and documentation at five elementary schools in Kudus. The data were analysed with a t-test to determine effectiveness, comparing classes taught with the Android-based SAC media against comparison classes taught without it. The t-test between the control and experiment classes yielded t = 4.457 with significance 0.000 < α = 0.05, and the learning outcomes of the experiment group were higher than those of the control group. Therefore, Android-based integer learning with Smart Apps Creator is effective for improving the critical thinking skills of elementary school students.
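The effectiveness claim above rests on an independent-samples t-test between the experiment and control classes. As a minimal pure-Python sketch of that computation, the following pools the two sample variances and forms the t statistic; the `pooled_t` helper and the score lists are invented for illustration and are not the study's data.

```python
import math
from statistics import mean, variance

def pooled_t(sample_a, sample_b):
    """Student's independent-samples t statistic with pooled variance.

    t = (mean_a - mean_b) / sqrt(sp2 * (1/n_a + 1/n_b)), where sp2 is the
    pooled sample variance. A positive t means sample_a scored higher.
    """
    n_a, n_b = len(sample_a), len(sample_b)
    sp2 = ((n_a - 1) * variance(sample_a)
           + (n_b - 1) * variance(sample_b)) / (n_a + n_b - 2)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(sp2 * (1 / n_a + 1 / n_b))

# Hypothetical post-test critical thinking scores (NOT the study's data).
experiment = [78, 85, 90, 72, 88, 95, 81, 84]
control = [70, 75, 68, 74, 80, 65, 72, 77]

t = pooled_t(experiment, control)
df = len(experiment) + len(control) - 2  # degrees of freedom
print(f"t = {t:.3f}, df = {df}")
```

In practice one would use `scipy.stats.ttest_ind`, which also returns the p-value that the study compares against α = 0.05.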


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence . Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.



COMMENTS

  1. Applying Algebraic Strategies to Make Gains in How You Think
  2. Mathematics Improves Your Critical Thinking and Problem-Solving
  3. Unlocking the Power of Math Learning: Strategies and Tools for Success
  4. How To Encourage Critical Thinking in Math
  5. Does mathematics training lead to better logical thinking and reasoning
  6. How to Improve Problem-Solving Skills: Mathematics and Critical Thinking
  7. Mathematical Teaching Strategies: Pathways to Critical Thinking and Metacognition
  8. Improving Students' Math Literacy in Middle and High School
  9. Promoting Independent Critical Thinking in Math
  10. Promoting Creative and Critical thinking in Mathematics and Numeracy
  11. Enhancing Math Thinking Skills: Transforming Traditional Activities
  12. Promoting critical thinking through mathematics and science
  13. Critical Thinking in Mathematics Education
  14. Mathematical Literacy and Critical Thinking
  15. Does mathematics training lead to better logical thinking and reasoning?
  16. The Increase of Critical Thinking Skills through Mathematical Investigation
  17. Learners' Critical Thinking About Learning Mathematics
  18. Teaching Mathematical Reasoning
  19. Introduction to Mathematical Thinking
  20. Students' Critical Thinking Skills in Solving Mathematical Problems
  21. Using a metacognitive learning approach to enhance students' critical thinking
  22. Improving mathematical critical thinking skills through problem-based learning
  23. Android-based math learning to improve critical thinking