The Research Doesn’t Add Up
Item 2 of 6
Summary in Brief:
In our view, the authors of the draft framework make misleading claims that are not supported by research. Major shifts in mathematics education are being proposed based on problematic claims about what the research shows.
Disclaimer for The Research Doesn’t Add Up:
All statements written below reflect our analysis from a close reading of the California Math Framework (CMF) second field review (sfr) and associated cited research.
Introduction:
In our view, a serious issue with the California Math Framework (CMF) second field review (sfr) is that it makes claims about significant student benefit using research that does not support those claims, either in magnitude or sometimes even in direction.
The CMF sfr proposes to achieve more equitable outcomes through improved math performance. In furtherance of these goals, however, it cites and relies on research whose claims of significant student benefit are in question. Building the CMF sfr's proposals on poorly evidenced strategies will likely preserve or even widen existing inequities and achievement gaps, as such strategies have in the past, rather than achieve equitable math results.
We find four major concerns with the CMF sfr’s research:
The research citations suggest a larger student benefit than is supported by the research.
Research is cited in a way inconsistent with the actual findings of the research.
The cited research is often not peer reviewed, is rarely cited by other work, uses non-randomized assignment, and relies on non-standard measures.
Citations suggest support for proposals that the research was not specifically designed to evaluate, and at times the proposals are even contradicted by the research.
Below are examples giving evidence for these concerns.
Relying on claims unsupported by the research being referenced undermines the credibility of the CMF sfr and its proposals.
Example 1: SFUSD Math Program
The SFUSD math program follows the same model the CMF sfr suggests: it detracks math classes (K through 10th grade), and it decelerates, by delaying Algebra 1 to grade 9 for all students and by discouraging calculus completion in high school through the obstacles placed in that path. SFUSD also employs the group-work-based teaching approach called ‘Complex Instruction’, which the CMF sfr likewise suggests.
Claims about math gains made by SFUSD’s math program, the widely acknowledged model for the CMF sfr, appeared in the CMF first field review (ffr) (Ch. 1 Line 471-476), but those claims have disappeared from the CMF’s second field review (sfr), which is telling in and of itself. If the SFUSD math program had legitimate math achievement gains, presumably those claims would be discussed and included in the CMF sfr. But they are not.
How has SFUSD’s math program, the role model for the CMF sfr, performed?
Math gain claims have been made about SFUSD’s math program, but have been hotly disputed. Specific claims, made by SFUSD, and by some research authors, include:
a decline in the Algebra 1 repeat rate from 40% to 8%, and
an increase in student enrollment in advanced math classes beyond Algebra 2
(Press release: “Historic shifts in Math show promise: Students significantly more likely to pass Algebra the first time”)
(See the Hechinger Report on SFUSD math: Boaler, Schoenfeld, Asturias, Callahan, and Foster 2018)
Both of these claims have been disputed, and, we think, refuted, most notably by the Families for SF ‘Inequity in Numbers’ report (https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers), and also by Ze’ev Wurman’s article, ‘Evaluation of the Forced Uniform Math Assignment since 2014–15 Implementation’ (https://www.independent.org/publications/article.asp?id=13698).
The Inequity in Numbers (IIN) report showed that a SFUSD placement policy change, rather than any improvement in math achievement, drove the decline in Algebra 1 repeat rates: SFUSD stopped requiring students to pass a state proficiency test to advance from Algebra 1 to Geometry.
The IIN report also revealed how SFUSD counted a compressed Alg. 2/Precalculus course as ‘advanced,’ when the University of California (UC) declined to consider the compressed course as ‘advanced’, as the compressed course did not include enough Precalculus content. In fact, according to the IIN report, enrollment in ‘advanced’ classes beyond Algebra 2 went down, not up.
The IIN report:
“Once we exclude the enrollment data for the compression course, the enrollment number for advanced math shows a net decrease from 2017-2018 (the final cohort prior to the implementation of the new math course sequence).”
(https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers)
(Note - by ‘implementation of the new math course sequence’, the IIN report means the implementation of the SFUSD (detracking) math program)
Further, according to Families for San Francisco, SFUSD has not been fully forthcoming with public records requests, which queried how SFUSD actually calculated these claims of math gains (https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers).
Ze’ev Wurman’s findings highlight that rather than SFUSD’s math program achieving math achievement gains, SFUSD’s state assessment test results showed a 5% drop for 11th grade students testing at the proficient or above level, from 2016-17 (the final 11th grade cohort prior to SFUSD math program implementation) to 2018-19 (the 2nd 11th grade cohort after SFUSD math program implementation). (https://www.independent.org/publications/article.asp?id=13698)
Expanding on Ze’ev Wurman’s analysis, we reviewed state assessment (CAASPP) math scores for SFUSD grade 11 students (https://caaspp-elpac.cde.ca.gov/caaspp/) from 2016-17 to 2018-19 to assess the impact of the SFUSD math program, which modified the math course sequence for grade 8 and above starting in the 2014-15 school year. Above grade 8, the CAASPP is administered in high school only in grade 11.
This window runs from the last cohort to reach grade 11 before the program’s implementation (2016-17) through the cohorts affected by it; the first affected cohort reached grade 11 in 2017-18. Results from 2015-16 are included for reference. SFUSD grade 11 math achievement is also broken out below by racial demographic, ELL status, and economic disadvantage status. Source: https://caaspp-elpac.cde.ca.gov/caaspp/
As Ze’ev Wurman points out in (https://www.independent.org/publications/article.asp?id=13698), 11th grade state assessment test results (CAASPP results - https://caaspp-elpac.cde.ca.gov/caaspp/) do not support SFUSD’s math achievement gain claims.
From 2016-17 to 2018-19, California’s 11th graders showed a slight upward trajectory in math achievement, while SFUSD’s 11th graders showed a 5% decline:
Table 1
CAASPP results, grade 11 Math: % proficient or above
(Source: https://caaspp-elpac.cde.ca.gov/caaspp/)

School year | All CA students | All SFUSD students
2015-16 | 37% | 53%
2016-17 (last cohort prior to SFUSD math program implementation) | 37.56% | 53.82%
2017-18 (first cohort after SFUSD math program implementation) | 38.65% | 49.88%
2018-19 (second cohort after SFUSD math program implementation) | 39.73% | 48.61%
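The percentage-point changes behind the Table 1 comparison can be checked with a short script (a minimal sketch; the figures are the CAASPP values quoted above):

```python
# Grade 11 CAASPP: % proficient or above (values from Table 1)
ca = {"2016-17": 37.56, "2017-18": 38.65, "2018-19": 39.73}
sfusd = {"2016-17": 53.82, "2017-18": 49.88, "2018-19": 48.61}

# Percentage-point change from the last pre-implementation cohort (2016-17)
# to the second post-implementation cohort (2018-19)
ca_change = round(ca["2018-19"] - ca["2016-17"], 2)
sfusd_change = round(sfusd["2018-19"] - sfusd["2016-17"], 2)

print(ca_change)     # 2.17 (California's slight upward trajectory)
print(sfusd_change)  # -5.21 (SFUSD's roughly 5-point decline)
```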
In addition, after SFUSD’s math course sequence change, grade 11 math achievement levels for SFUSD’s initially high-achieving students declined: Asian and White student math achievement declined 2% and 7% respectively from 2016-17 to 2018-19. (See Table 2)

Table 2
CAASPP results, SFUSD grade 11 Math: % proficient or above, by student group

School year | All SFUSD | Asian | White | Latino | Black | ELL | Econ. disadvantaged
2015-16 | 53% | 75% | 63% | 16% | 11% | 24% | 46%
2016-17 (last cohort prior to SFUSD math program implementation) | 53.82% | 74.98% | 61.79% | 18.26% | 9.78% | 28.93% | 50.00%
2017-18 (first cohort after SFUSD math program implementation) | 49.88% | 71.77% | 63.22% | 14.51% | 12.09% | 23.59% | 43.97%
2018-19 (second cohort after SFUSD math program implementation, prior to pandemic disruption of CAASPP testing) | 48.61% | 72.79% | 54.13% | 15.09% | 12.11% | 17.00% | 41.66%
Source: https://caaspp-elpac.cde.ca.gov/caaspp/
Lower achieving SFUSD math students, including Latinos, ELL students, and economically disadvantaged students, also showed declines in math achievement from 2016-17 to 2018-19, of 3%, 12%, and 8% respectively (See Table 2).
Only Black SFUSD students showed a math achievement gain, of 2.5%, from 2016-17 to 2018-19. (See Table 2).
According to the CAASPP state assessment test results, overall student math achievement performance declined after SFUSD implemented its math course sequence change program.
What about inequity, was that reduced by SFUSD’s math program?
Inequity in math achievement between racial demographic and socio-economic groups, as measured by the difference in the % of students scoring proficient or above, narrowed in a few cases but widened in others. Where gaps narrowed, it was largely because initially higher-achieving groups declined (such as the 7% drop in White student math achievement from 2016-17 to 2018-19), not because lower-achieving groups improved. Only one group, Black students, increased its % proficient or above from 2016-17 to 2018-19, by 2.5%.
SFUSD Asian student math achievement declined 2% from 2016-17 to 2018-19, as did that of White, Latino, ELL, and economically disadvantaged students, by 7%, 3%, 12%, and 8% respectively. (Source: https://caaspp-elpac.cde.ca.gov/caaspp)
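The per-group changes cited in this section can be recomputed from the Table 2 figures (a minimal sketch; the percentages are the CAASPP values quoted above, which the text rounds to whole points):

```python
# SFUSD grade 11, % proficient or above (values from Table 2)
before = {"Asian": 74.98, "White": 61.79, "Latino": 18.26,
          "Black": 9.78, "ELL": 28.93, "Econ. disadvantaged": 50.00}  # 2016-17
after = {"Asian": 72.79, "White": 54.13, "Latino": 15.09,
         "Black": 12.11, "ELL": 17.00, "Econ. disadvantaged": 41.66}  # 2018-19

# Percentage-point change per group, 2016-17 to 2018-19
changes = {group: round(after[group] - before[group], 2) for group in before}
print(changes)  # only the Black student group shows a gain; every other group declines
```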
And, according to Families for San Francisco (FFSF), new inequities have been introduced by the implementation of SFUSD’s math program due to costly workarounds required for students who wish to accelerate (math) within SFUSD and complete calculus in high school. (https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers)
In fact, a website has been created specifically to help SFUSD students and parents navigate the workaround issue, and to assist students who wish to complete calculus in high school within SFUSD: https://www.mathpathsf.com/. According to the CMF first field review, calculus completion in high school is an unstated requirement for many colleges:
“Considering that many competitive colleges and universities (those that accept less than 25 percent of applicants) hold calculus as an unstated requirement,...”
(Ch. 1 CMF ffr Line 116-117)
To sum up, SFUSD’s math program, the role model for the CMF sfr, did not provide student benefit overall, according to state assessment tests, in either math achievement or in reducing inequity in math achievement. Student math achievement at SFUSD declined 5% overall after implementation of the SFUSD math program, from 2016-17 to 2018-19 (https://caaspp-elpac.cde.ca.gov/caaspp/). According to FFSF, new inequities were introduced after implementation of the SFUSD math program, while according to state assessment test results, gaps between proficiency levels remained. (https://www.familiesforsanfrancisco.com/updates/inequity-in-numbers) (https://caaspp-elpac.cde.ca.gov/caaspp/)
Claims of math gains from SFUSD’s math program have been discredited, which likely explains why there is little to no mention of SFUSD’s math program in the CMF sfr.
Still, these discredited SFUSD claims live on in research cited in the CMF sfr (Ch. 2 Line 813, 883; Ch. 8 Line 762; Ch. 12 Line 265) to support its guidance: a paper by LaMar, Leshin, and Boaler (2020), which describes a school district called ‘Gateside Union District.’
According to “The derailing impact of content standards...” (LaMar, Leshin, Boaler 2020), ‘Gateside’, “...the district that is the focus of this paper..,” began work in 2014 to “...address three established sources of mathematics disengagement and inequality – tracking, procedural teaching, and fixed messaging” and to “take steps to counter them.” (pg. 2)
“Groups of parents organized meetings and Facebook groups to oppose the district changes, but the district stood firm and within a year they saw substantial benefits. Under their previously tracked system, 40% of students failed algebra, and inequities were very evident - after one year of the new system the algebra failure rate dropped to 8 percent.” (pg. 3)
“One of the arguments of those opposed to equitable initiatives is that equity reduces the chances of high achievers to excel but there was no evidence of this compromise and the proportion of students taking advanced classes at Gateside district increased by one third.” (pg. 3)
“These changes have brought about impressive outcomes, with algebra failure rates declining dramatically, and large increases in students taking advanced mathematics courses.” (pg. 10)
There is little question that ‘Gateside Union District’ is SFUSD.
Here is a table of the very similar math gain claims made for ‘Gateside’ and made for SFUSD:
Claim 1: Algebra 1 failure/repeat rate
‘Gateside’ (“The derailing impact...2020”, pg. 3): Algebra 1 failure rate went from 40% to 8% as a result of the program put into place in 2014.
SFUSD: Algebra 1 repeat rate went from 40% to 8% as a result of the program put into place in 2014. (http://www.sfusdmath.org/uploads/2/4/0/9/24098802/historic_shifts_in_math_show_promise.pdf)
(Note: SFUSD’s math dept. used ‘failure rate’ and ‘repeat rate’ interchangeably in regard to Algebra 1 advancement, as previously students were required to pass the state proficiency test to advance from Algebra 1 to Geometry.)

Claim 2: Enrollment in advanced classes
‘Gateside’ (“The derailing impact...2020”, pg. 3): Students enrolled in ‘advanced’ classes went up by 1/3.
SFUSD: claims that “456 more students, or 10.4% more students are taking courses beyond Algebra 2 in 2018-2019 than were in 2017-2018.” (2018-11 CSBA Math Placement presentation: San Francisco Detracking: Early Indicators...; https://drive.google.com/file/d/1svs346aKIrw0vSZqV6xPhEvmrjaPT2md/view)
Table 3
And here is a comparison of the very similar math program elements and district attributes for ‘Gateside’ and for SFUSD:

Year program started
‘Gateside’ (pg. 2, “3. The setting”): began work in 2014 to counter issues and take steps.
SFUSD: program adopted Feb. 2014; began in the 2014-15 school year. (https://www.sfusdmath.org/uploads/2/4/0/9/24098802/ccss-math-faq.pdf)

Detracking
‘Gateside’ (pg. 3): eliminated tracking before 11th grade; all students take the same mathematics course until the end of 10th grade.
SFUSD: same. (https://www.sfusdmath.org/uploads/2/4/0/9/24098802/ccss-math-faq.pdf)

Algebra 1 delayed to 9th grade for all
‘Gateside’ (pg. 3): moved Algebra to 9th grade for all students.
SFUSD: same. (https://www.sfusdmath.org/uploads/2/4/0/9/24098802/ccss-math-faq.pdf)

Compressed Alg 2/Precalculus class in 11th grade
‘Gateside’ (pg. 3): created pathways to calculus for all students, including a compressed Alg 2/Precalculus course in grade 11.
SFUSD: same. (https://www.sfusdmath.org/uploads/2/4/0/9/24098802/ccss-math-faq.pdf)

Teacher professional development
‘Gateside’ (pg. 4): extensive professional development in Complex Instruction (CI) accompanied the changes.
SFUSD: same. (https://www.sfusdmath.org/complex-instruction.html#:~:text=The%20complex%20instruction%20model%20aims,%2C%20unpublished%20document%2C%202009).)

District type
‘Gateside’ (pg. 4): large, urban district in California.
SFUSD: same. (https://www.sfusd.edu/)

Racial demographics, student body
‘Gateside’ (pg. 4): racially diverse student body; 27% Latino, 35% Asian, 15% White, less than 10% African American, Filipino, Pacific Islander and American Indian (year of data collection not specified).
SFUSD: 27% Latino, 35% Asian, 15% White, less than 10% all others (2017-18 school year, non-charter enrollment, multi-year summary by ethnicity). (https://dq.cde.ca.gov/dataquest/dqcensus/EnrEthYears.aspx?cds=3868478&agglevel=district&year=2019-20&ro=y&ro=y)

Student demographics
‘Gateside’ (pg. 4): 55% socioeconomically disadvantaged, 29% language learners, 11% special needs (year of data collection not specified).
SFUSD: 53% free and reduced-price meals (2016-17), 28% language learners (2017-18), 12% special education (2018-19). (https://www.ed-data.org/district/San-Francisco/San-Francisco-Unified)
Table 4
It’s quite clear to us that ‘Gateside’ and SFUSD are one and the same, and that the same discredited SFUSD math gain claims are being repeated in the “The derailing impact of content standards...”, LaMar, Leshin, Boaler 2020 article. It’s unfortunate that the CMF sfr chose to cite a research paper containing these discredited claims.
In summary, the role model for the CMF sfr, SFUSD’s math program, did not show evidence of student benefit, but instead showed evidence of a decline in student math achievement after implementation, according to state assessment test results, for initially high achieving students and for lower achieving students. Just one student group showed a math achievement gain. (https://caaspp-elpac.cde.ca.gov/caaspp/)
Research that includes discredited math gain claims should not be referenced in the CMF sfr; doing so calls into question the standard of research that the CMF sfr cites, accepts, and includes.
Changes recommended for CA public school education should not be based on discredited claims and research.
Example 2: The ‘Railside Study’
A prime example of CMF sfr cited research that has come under serious scrutiny is what is known as ‘The Railside study.’ (https://www.tcrecord.org/Content.asp?ContentId=14590). The ‘Railside Study’ is cited at least 7 times in the CMF sfr under the citation ‘Boaler and Staples, 2008.’
The full citation for the ‘Railside Study’, according to the CMF sfr’s Appendix B is:
Boaler, Jo, and Megan Staples. 2008. “Creating Mathematical Futures through an
Equitable Teaching Approach: The Case of Railside School.” Teachers’ College Record. 110(3): 608-645.
(Ch. 1 Lines 137-143, Ch. 2 Lines 63-67, Ch. 9 Lines 160-165, Ch. 9 Lines 181-184, Ch. 9 Lines 287-282, Ch. 9 Lines 331-335, Ch. 12 Line 259)
The ‘Railside Study’ article can be found here: https://www.tcrecord.org/Content.asp?ContentId=14590
Though the CMF sfr cites the ‘Railside Study’ at least 7 times, ‘Railside’ has been heavily critiqued in “A Close Examination of Jo Boaler’s Railside Report,” which raises serious questions about the study’s methodology, the differing prior achievement levels of the students studied, the grade level and non-standard nature of the tests used, the study’s evidentiary standards, its math achievement claims, and consequently its validity. The critique can be found here: "A Close Examination...".
For example, when the authors of “A Close Examination…” examined the 4 non-standard tests used to assess student math achievement in the ‘Railside Study’ (tests which had been posted on one author’s website), they determined that the test questions were 3 grade levels below the grade for which they were administered. They also found that the math questions in the 4 tests had “serious mathematical difficulties: mathematically incorrect problems, a multiple choice problem with all answer choices incorrect,” among other issues (pg. 2, "A Close Examination...").
Though the claims of the ‘Railside Study’ are heavily disputed, it is the most frequently cited research article in the entire CMF sfr.
Heavily citing and relying on such seriously questioned research calls into question the proposals underpinned by such research, in a math curriculum framework designed to influence possibly as many as 6 million California school children (https://www.cde.ca.gov/nr/ne/yr21/yr21rel32.asp).
Citing and relying on the ‘Railside Study’ as research support calls into question the research standard the CMF sfr is willing to accept, include, and cite.
Changes recommended for CA public school education should not be based on discredited claims and research.
Example 3: Youcubed Summer Camp
Another example of a CMF sfr math gain claim that deserves scrutiny is the claim of a 2.8 school years math gain from a 4 week youcubed summer math camp. The CMF sfr claim:
“After only 18 lessons the students improved their achievement by the
equivalent of 2.8 years of school. Students related their increased achievement to the
classroom environment that encouraged discussion, convincing, and skepticism
(Youcubed, 2017), as illustrated by this interview with two students, TJ and José:...” (Ch. 4 Line 906-909)
Checking the only supporting citation for this 2.8-years math gain claim, (Youcubed, 2017), led to a Youcubed video that provides no evidence for the claim. Finding a research paper that purports to support the claim required multiple searches down a number of rabbit holes; that path, and the paper eventually found, are described below.
The full citation for the supporting cite for the 2.8 math years gain claim, (Youcubed, 2017) (Ch. 4 Line 906-909), per Appendix B for Ch. 4 is:
Youcubed. 2017. “Solving the Math Problem – Subtitled.” Retrieved from
https://vimeo.com/245472639.
Appendix B (Line 717-718)
This vimeo link leads to a 3-minute-26-second youcubed video titled “Solving the Math Problem,” which doesn’t provide any evidence for the 2.8-school-years math gain claim, though at 3:04 it makes a similar claim for the Youcubed summer math camp:
“The students attended 18 math lessons at Youcubed and improved their performance on standardized tests by an average of 50% – the equivalent of 2.8 years of school.”
The video is mostly a series of interviews with middle school students who (ostensibly) participated in the youcubed summer math camp.
Right above the (Youcubed, 2017) reference in Appendix B for Ch. 4 (Line 717-718), however, is another Youcubed reference:
Youcubed. n.d. “Youcubed Summer Camps.” https://www.youcubed.org/evidence/our-teaching-approach/
Appendix B (Line 715-716)
As the 2.8-years claim was made about a Youcubed summer math camp, the researcher followed this link to a web page describing Youcubed’s summer math camps. That page has a button labeled ‘Read the Journal Article,’ which leads to the following article:
Boaler, Jo, Jack A. Dieckmann, Tanya LaMar, Miriam Leshin, Megan Selbach-Allen,
and Graciela Pérez-Núñez. 2021. “The Transformative Impact of a Mathematical
Mindset Experience Taught at Scale.” Frontiers in Education: December 10, 2021.
The article can be found here: https://www.frontiersin.org/articles/10.3389/feduc.2021.784393/full
The “The Transformative Impact…” article is cited in the CMF sfr (Ch. 4 Line 882). On page 3, it makes the same claim of a 2.8-school-years math achievement gain from a ‘youcubed’ summer math camp, listing (Boaler, 2019b) as the supporting reference. In the references at the end of the article, the full reference for (Boaler, 2019b) is:
Boaler, J. (2019b). “Prove it To Me!” Mathematics Teaching in the Middle School 24(7): 422–428. doi:10.5951/mathteacmiddscho.24.7.0422
Also, at the bottom of the ‘Youcubed’ summer math camp webpage (https://www.youcubed.org/evidence/our-teaching-approach/), the researcher found a link labeled:
‘A paper published by NCTM discussing evidence of the effectiveness of the youcubed summer camp’
which link leads to:
Boaler, Jo. 2019. “Prove it To Me!” Mathematics Teaching in the Middle School 24(7): 422–428.
The “Prove it To Me!” paper can be found here: https://www.youcubed.org/wp-content/uploads/2019/05/prove-it-to-me-JB.pdf
This, then, seems to be the research paper behind the 2.8-years math gain claim from a 4-week Youcubed summer math camp. The same claim made in the CMF sfr Ch. 4 (Line 906-909) appears at the bottom of page 424 of “Prove it To Me!”:
“For the next eighteen lessons, we taught students using tasks….At the end of the eighteen lessons, the improvement of the students on standardized test scores was equivalent to 2.8 years of school…” (page 424 “Prove it To Me!”)
On page 425, “Prove it To Me!” states:
“You can watch a short film showing the approach at www.youcubed.org.”
We assume this is the same “Solving the Math Problem” vimeo hosted video that the CMF sfr cited in support of the 2.8 school years CMF sfr math gain claim (Ch. 4 Line 906-909).
The youcubed webpage, though, does point to the “Prove it To Me!” paper as the source of support for youcubed summer math camps, as it says this on its summer camp webpage:
‘A paper published by NCTM discussing evidence of the effectiveness of the youcubed summer camp’
(https://www.youcubed.org/evidence/our-teaching-approach/)
“Prove it To Me!” does describe a 4 week youcubed summer math camp that took place, but doesn’t seem to specify exactly when it took place. In our view, this fuzziness about details in “Prove it To Me!” extends to other details in the paper as well:
There is no detail about the ‘standardized test’ assessment that was administered.
No comparison or control group is mentioned, let alone a control group that completed a traditional 4 week summer math camp as a comparison.
There is little detail about the participants, specifically their prior math achievement level.
There is no way to independently verify the paper’s claims of results.
Camp participants were not chosen randomly, raising issues of selection bias.
Though “Prove it To Me!” describes an extraordinary 2.8-school-years math achievement gain (0.91 standard deviations (SD)) from a 4-week youcubed summer math camp, it has only 2 citations according to Google Scholar, and does not appear to have been peer reviewed.
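For scale, the two effect-size figures reported for the camp can be related by simple arithmetic (a sketch using only the numbers stated above; the per-school-year conversion factor is implied, not given in the paper):

```python
# Reported gain from the 4-week camp: 2.8 school years, also given as 0.91 SD
years_gain = 2.8
sd_gain = 0.91

# Implied conversion: standard deviations of growth per school year
sd_per_year = round(sd_gain / years_gain, 3)
print(sd_per_year)  # 0.325 SD per school year, implied by the paper's two figures
```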
While “Prove it To Me!” has just the two citations, it is, however, now cited 4 times in the CMF sfr: Ch. 2 Line 45 and 574; Ch. 4 Line 384; and Ch. 13 Line 451. Perhaps this will boost its citation count in Google Scholar.
And while the CMF sfr cites the Youcubed video as the source of support for the 2.8 school year math gain (0.91 SD) claim from a 4 week youcubed summer math camp (Ch. 4 Line 906-909), the video provides no evidence for the math gain claim.
“Prove it To Me!” does not provide evidence for the 2.8 school year math gain claim either, in our opinion.
Changes to CA’s public school math education should not be based upon claims without evidence.
‘Circular self-citing’ and ‘recycling’ of papers in the CMF sfr
Two interesting patterns emerge when analyzing significant research that underpins the CMF sfr:
A pattern of ‘circular self-citing’ - where research paper authors cite their own work as evidence for their other work.
A pattern of ‘paper recycling’ - where the same research papers are being cited over and over again as evidence.
‘Circular Self-Citing’?:
When examining significant research underpinning the CMF sfr’s proposals, it’s evident that some CMF sfr research authors engage in ‘circular self-citing’ – citing their own past work as evidence for their new work – which may boost a paper’s citation count.
Checking a research paper’s citation references list sheds light on this.
For “Prove it To Me!,” the paper examined just above and cited in the CMF sfr, 5 of the only 8 entries in its reference list are the author’s own work.
For the ‘Railside Study’, cited in the CMF sfr, 8 of the citations in the reference list are of one of the authors’ own work.
For the “The Transformative Impact…” article, cited in the CMF sfr (Ch. 4 Line 881-883), 7 of the citations in its references list are of one of the authors’ own work.
For the “The derailing impact of content standards…” article, cited in the CMF sfr, (Ch. 2 Line 813 and 883), 17 of the citations in its reference list are of one of its own authors’ work.
For the “Raising Expectations and Achievement…2021” paper, cited in the CMF sfr (Ch. 9 Line 183, 304, 311), 6 of the citation references in its reference list are to articles by one of its own authors.
For the “Changing Students Minds…” article, cited in the CMF sfr (Ch. 1 Line 413-416) and (Ch. 9 Line 76-78), 3 of the citation references in its reference list are to articles by one of its own authors.
Given the ‘circular self-citing’ practice of CMF sfr cited research authors, we wonder about the depth of support for the claims made in these articles and the evidence behind the claims.
‘Paper Recycling’:
In analyzing significant research cited by the CMF sfr, it becomes clear that there is a concentration of the same research papers by the same author(s) being cited in the CMF sfr a number of times.
For example, one research author’s work is cited 82 times in the CMF sfr, across 23 papers and 3 books, according to this author’s analysis of the citations. Of those 23 papers, 6 are analyzed here in this talking point, The Research Doesn’t Add Up.
As a gauge, the next most frequently cited CMF sfr research author has 13 citations, about one sixth as many as the most frequently cited research author in the CMF.
This ‘paper recycling’ isn’t immediately apparent when reading the CMF sfr, perhaps due to the inconsistency in the use of abbreviated citation names, and perhaps due to some omissions in the CMF sfr citations list appendix. For example:
The same abbreviated citation language refers to different works in different chapters:
Example 1 (Boaler 2019b)
Ch. 1 Line 243 cites (Boaler, 2019 a,b); according to Appendix B for Ch. 1, (Boaler, 2019b) refers to: Boaler, Jo. 2019b. Limitless Mind. Learn, Lead and Live without Barriers. New York: Harper Collins.
Ch. 2 Line 574 cites (Boaler, 2019b); according to Appendix B for Ch. 2, (Boaler, 2019b) refers to: Boaler, J. (2019)b. Prove It to Me!. Mathematics Teaching in the Middle School, 24(7), 422-428.
Example 2: (Boaler, 2019)
Ch. 4 Line 384 cites (Boaler, 2019); according to Appendix B for Ch. 4 (Boaler, 2019) refers to: Boaler, J. (2019). Prove It to Me!. Mathematics Teaching in the Middle School, 24(7), 422-428
Ch. 9 Line 64 cites (Boaler, 2019); according to Appendix B for Ch. 9 (Boaler, 2019) refers to: Boaler, Jo. 2019. Limitless Mind. Learn, Lead and Live without Barriers. New York: Harper Collins.
The same work has different associated abbreviated citation language:
Example 1:
(see also Anderson, Boaler, and Dieckmann, 2018) Ch 10 Line 433; according to Appendix B for Ch. 10, (see also Anderson, Boaler, and Dieckmann, 2018) refers to: “Achieving elusive teacher change through challenging myths about learning: A blended approach.” RK Anderson, J Boaler, JA Dieckmann - Education Sciences, 2018 - mdpi.com
(See Anderson et al, 2019) Ch. 10 Line 786; according to Appendix B for Ch. 10, (See Anderson et al, 2019) refers to: Achieving elusive teacher change through challenging myths about learning: A blended approach. RK Anderson, J Boaler, JA Dieckmann - Education Sciences, 2018 - mdpi.com. Note: there is no (Anderson et al, 2019) paper, only the 2018 Anderson paper.
(Anderson et al, 2018) Ch. 9 Line 304; according to Appendix B for Ch. 9, (Anderson et al, 2018) refers to nothing, as the full citation is missing.
Example 2:
Ch. 1 Line 243 cites (Boaler, 2019 a,b); according to Appendix B for Ch. 1, (Boaler, 2019b) refers to: Boaler, Jo. 2019b. Limitless Mind. Learn, Lead and Live without Barriers. New York: Harper Collins.
Ch. 13 Line 451 cites (Boaler, 2019 a,b); according to Appendix B for Ch. 13, (Boaler, 2019a) refers to: Boaler, Jo. 2019a. Limitless Mind. Learn, Lead and Live without Barriers. New York: Harper Collins.
Appendix B, the CMF sfr’s citation reference list, is missing some full citations:
Ch. 9 Line 304 (Anderson et al, 2018); Appendix B for Ch. 9 has no entry for a paper authored by Anderson.
Ch. 9 Line 299 (Burris, Heubert, and Levin, 2006); Appendix B for Ch. 9 has no entry for a paper by (Burris, Heubert, and Levin, 2006).
These are but a few examples. Inconsistency in the use of the abbreviated citation language and some omissions likely obscure the fact that some research papers are being cited over and over again.
Given the ‘recycling’ of the same research papers in the CMF sfr, one wonders about the breadth and depth of the research support for its claims and proposals, and about the evidence behind them.
Changes to CA’s public school math education should not be based upon claims that lack research evidence.
Example 4: 2019 Youcubed summer camp
Another math gain claim in the CMF sfr that deserves scrutiny is a claimed 0.52 standard deviation (SD) math achievement gain from a 2019 4-week Youcubed summer math camp employing ‘mindset’ techniques and ‘open, rich math tasks.’ The claim:
“In the Youcubed summer camps for middle-school students (Youcubed, n.d.), which
significantly increase achievement in a short period of time (Boaler et al., 2021),
students are taught that reasoning is a crucially important part of mathematics.”
(Ch. 4 Line 881-883)
The full citation, according to Appendix B for Ch. 4, for (Youcubed, n.d.) is:
Youcubed Summer Camps, https://www.youcubed.org/evidence/our-teaching-approach/
However, this citation simply leads to a Youcubed webpage which only describes Youcubed’s Summer Math Camps, not to any evidence supporting the math gain claim.
The second citation used to support the above claim is (Boaler et al., 2021). According to Appendix B for Ch. 4, the full citation for (Boaler et al., 2021) is:
Boaler, Jo, Jack A. Dieckmann, Tanya LaMar, Miriam Leshin, Megan Selbach-Allen,
and Graciela Pérez-Núñez. 2021. The Transformative Impact of a Mathematical
Mindset Experience Taught at Scale. Frontiers in Education: December 10, 2021.
Appendix B sfr (Line 636-638)
The “The Transformative Impact…” article can be found here: https://www.frontiersin.org/articles/10.3389/feduc.2021.784393/full
The “The Transformative Impact…” article does discuss claimed math gains from a 2019 4-week Youcubed summer math camp, conducted by and in 10 different school districts. It claims that student participants made a math gain of 0.52 SD from the Youcubed summer math camp curriculum, which was based on ‘big ideas’, used ‘mindset growth’ techniques, was characterized by ‘open math tasks’, and was coupled with Youcubed camp teacher training.
However, the ‘Student Achievement’ section (p. 5 of “The Transformative Impact…”) reveals that the claimed student math achievement gains were based on administering and scoring pre- and post-camp assessments that used identical math questions (4 specific MARS math tasks). Students were thus assessed on material they had already seen and encountered, according to “The Transformative Impact…” (p. 5). Further, in our view, MARS math tasks are unsuitable for use as an assessment tool: they are non-standard (not standardized) and are not psychometrically sound, meaning the test is not robust or independently verified and has not demonstrated consistency or predictive capability (see details below on MARS math tasks). In our opinion, these methodological flaws cast doubt on any claim of math achievement gain described in “The Transformative Impact…” article.
Additional concerns about “The Transformative Impact…” article’s claims, include but are not limited to:
No objective criteria were provided for how the student participants were selected, introducing possible selection bias.
No valid comparison existed between the math camp student group and the comparison group.
The student group comparison in the article was between a group that took a 4-week Youcubed summer math camp and a group that took no summer math camp at all, rather than between two groups that both took 4-week summer math camps, one using ‘mindset’ techniques and the camp’s other attributes and one using more traditional summer math camp techniques. In our view, the study’s comparison was not a valid one.
Comparing a group that took a 4-week Youcubed math camp with one that took no summer math camp provides no information on whether the special characteristics (‘mindset growth’ techniques, etc.) of the Youcubed camp provided student benefit beyond that of simply attending a 4-week summer math camp.
Beyond the unsuitability of the MARS math tasks as an assessment tool, the scoring of the MARS math tasks was done by an organization whose principal officer has co-authored papers with an author of “The Transformative Impact…” article.
Use of MARS Math Tasks as an assessment tool:
Typically, claims about results in math learning rely upon exams and assessments made up of multiple items, all of which have been tested for validity and reliability. This does not appear to be the case for the MARS tasks used as the assessment in “The Transformative Impact…” or in the other articles we analyze in The Research Doesn’t Add Up. We can find no information indicating that MARS math tasks have met these standards for assessments.
In fact, what we did find is that summative MARS math tasks, according to the website of its creator, The Mathematics Assessment Project (MAP), are described as ‘prototypes’ and as being in ‘draft form.’
Mathematics Assessment Project
BALANCED ASSESSMENT
Prototype Summative Assessment Tests
The Summative Assessment Tasks may be distributed, unmodified, under the Creative Commons Attribution, Non-commercial, No Derivatives License 3.0. All other rights reserved. Please send any enquiries about commercial use or derived works to map.info@mathshell.org. Note: please bear in mind that these prototype materials need some further trialing before inclusion in a high-stakes test. https://www.map.mathshell.org/tests.php
The purpose of these (MARS math tasks tests) is to provide examples of the type of tests students should be able to tackle, if the aspirations of the Common Core State Standards are to be realized. The methodology of the tests and their balance of different task types is discussed in more detail here. Note: please bear in mind that these materials are still in draft and unpolished form. https://www.map.mathshell.org/tests.php
To our knowledge, most education studies that “meet standards” (see US Dept of Education, What Works Clearinghouse, https://ies.ed.gov/ncee/wwc) use robust, independent measures. Examples include measures with strong evidence of consistency (reliability) that have been studied in thousands, if not millions, of students each year, including students of different socioeconomic backgrounds. These measures are carefully and consistently administered and scored, and the entire outcome measurement process is independent of the original study. These measures are also studied as to whether they predict later student success. None of this applies to MARS math tasks, in our view.
We could not find a publisher for MARS math tasks tests other than these websites (https://www.map.mathshell.org/index.php, http://web.archive.org/web/20030303050154/http://www.nottingham.ac.uk/education/MARS/eval/own.htm) where there were several references to MARS tests being “drafts” and the stimulus tests and answers being publicly available (no test security). We could not find measures of reliability or predictive validity (whether they predict later academic success) for MARS tasks as an outcome tool.
Furthermore, the MARS measure consists of several math problems that, as we understand it, teachers hand-score, leading to potential inconsistency and subjectivity.
For all of these reasons, in our view, MARS math tasks are unsuitable as an outcome assessment measure, and their use casts grave doubt on the “The Transformative Impact…” article’s claims of student benefit from the Youcubed summer math camp.
Given the “The Transformative Impact…” article’s methodological flaws, and given that the study assessed students on the exact same, known material in pre- and post-camp assessments while using a non-standard assessment, we believe this research article cannot support claims of student benefit.
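The concern about reusing identical questions can be illustrated in the abstract with a toy simulation (entirely our construction, not the study’s data; the +3-point practice effect and all other numbers are arbitrary assumptions): in a pre/post design that reuses the same items, a practice effect alone produces a positive apparent “gain” even with zero real learning.

```python
import random

# Toy simulation: no real learning occurs between pre and post,
# only a practice effect from re-seeing identical questions.
# All parameters are assumptions for illustration.
random.seed(0)
true_ability = [random.gauss(50, 10) for _ in range(500)]

# Pre-camp score: ability plus measurement noise.
pre = [a + random.gauss(0, 5) for a in true_ability]

# Post-camp score on the SAME items: unchanged ability, plus an
# assumed +3-point bump from having already seen the questions.
post = [a + 3 + random.gauss(0, 5) for a in true_ability]

mean_gain = sum(post) / len(post) - sum(pre) / len(pre)
print(round(mean_gain, 1))  # close to 3: the apparent "gain" is entirely the retest effect
```

A design that avoids this confound would use a different (but equated) test form at post, or a control group taking the same pre/post pair.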
Changes to CA public education which could impact as many as 6 million school children should not be based upon claims without evidence.
Example 5: 5th grade Central Valley
The CMF sfr makes another math gain claim, stating that teacher ‘mathematical mindset’ and ‘multidimensional mathematics’ professional development (PD), and its implementation in the classroom, translated to student math gains of 5.2 months (0.085 to 0.1 SD) over the course of the 5th grade year. The CMF sfr claim:
“Another study describes a county-wide approach in which fifth grade teachers across several districts in California’s Central Valley were taught to teach multidimensional
mathematics. Within one year the students significantly increased their mathematics achievement on CAASPP tests—particularly girls, language learners and economically disadvantaged students (Anderson et al., 2018).” (Ch. 9 Line 300-304)
“By the end of the school year the students of the teachers in the network achieved at significantly higher levels on the mathematics portion of the CAASPP. The focus on mindset particularly raised the achievement of girls, language learners, and economically disadvantaged students (see Anderson et al., 2019).” (Ch. 10 Line 783-786)
“The blended approach and the details of teacher and student change is explained fully in Anderson et al., 2019.” (Ch. 10 Line 805-806)
The full citation for (Anderson et al., 2019), according to Appendix B for Ch. 10 (Appendix B for Ch. 9 is missing a full citation for (Anderson et al., 2018)) is:
Anderson, R. K., Boaler, J., & Dieckmann, J. A. (2018). Achieving elusive teacher change through challenging myths about learning: A blended approach. Education Sciences, 8(3), 98.
The article can be found here: https://www.mdpi.com/2227-7102/8/3/98
“Achieving elusive teacher change…” claims a 5.2 month student math achievement gain (0.085 SD to 0.1 SD) on the Smarter Balanced Assessment Consortium (SBAC) test over the course of 5th grade due to the 'Mathematics Mindset' professional development (PD) received by their teachers, and by the teachers’ implementation of ‘mindset’ techniques and ‘multidimensional mathematics’ in their 5th grade classrooms.
There are, however, in our view, multiple concerns about the study’s methodology:
Lack of School District and School identification, preventing independent verification of SBAC results.
Teachers were not chosen randomly, but were nominated by their districts, introducing selection bias.
Students were not chosen randomly, introducing selection bias.
The study assumes that the teachers, both treatment and control group, have the same level of teacher effectiveness, which cannot be assumed.
There is no data on the level of each teacher's effectiveness, prior to treatment.
The study compares a group of 5th grade teachers from 8 districts who received an average of 35 hours of 'Mathematics mindset' and ‘multidimensional mathematics’ professional development (PD) over one year, participated in math networking meetings over 3 years, and received on site math coaching from County Office of Education staff, versus a control group of 5th grade teachers who received none of those things.
This is an invalid comparison in our view.
We have no information on what impact a comparable amount of traditional math professional development (PD), coupled with on-site coaching and math network meetings for the same duration, might have had on the math achievement of the students those teachers taught, compared with the ‘Mathematical Mindset’ and ‘multidimensional mathematics’ PD studied.
The study provides no information on any of the students' (those participating in the study or the control group) prior math achievement levels; equivalency in math achievement level cannot be assumed.
The study uses SBAC math achievement results from the end of 5th grade to make its claimed comparisons among students' math achievement levels. We have no SBAC math achievement results for the students prior to 5th grade.
In our view, even setting aside the many concerns about the study’s methodology, and even if the study’s results were valid and accurate (which we doubt), a result of 0.1 SD is not meaningful. According to SimplyPsychology:
“This means that if the difference between two groups’ means is less than 0.2 standard deviations, the difference is negligible, even if it is statistically significant.”
https://www.simplypsychology.org/effect-size.html
“Achieving Elusive Teacher Change…” claims a 0.085 to 0.1 SD difference between the treatment and control groups, which, even if one ignores the study’s many methodological concerns listed above, does not support a claim of meaningful student benefit.
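The effect-size convention quoted above (Cohen’s d) can be made concrete with a short sketch. The group statistics below are hypothetical, chosen only so that the effect size lands at 0.1 SD, the upper end of the gain reported in the study; they are not the study’s data.

```python
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Effect size: difference in means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def interpret(d):
    """Conventional labels (Cohen, 1988): <0.2 negligible, <0.5 small, <0.8 medium, else large."""
    d = abs(d)
    if d < 0.2:
        return "negligible"
    if d < 0.5:
        return "small"
    if d < 0.8:
        return "medium"
    return "large"

# Hypothetical treatment/control statistics, chosen so d = 0.1.
d = cohens_d(mean1=50.8, mean2=50.0, sd1=8.0, sd2=8.0, n1=100, n2=100)
print(round(d, 2), interpret(d))  # prints "0.1 negligible"
```

By this conventional yardstick, the study’s reported 0.085 to 0.1 SD difference falls below the 0.2 threshold for even a “small” effect.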
Example 6: “Raising Expectations…”
The CMF sfr makes another math gain claim, claiming that in two studies within one paper:
Detracking math classes coupled with teacher ‘mathematical mindset’ professional development (PD) led to math gains “Raising Expectations…” (Study 1)
Detracking math classes led to math gains “Raising Expectations…” (Study 2)
Both of the above studies are described within the paper, “Raising Expectations and Achievement…Boaler and Foster, 2021.”
The CMF sfr’s claims:
“Some studies have also shown that high-achieving students are advantaged when they are given opportunities to extend work and discuss mathematical connections in non-tracked groups (Boaler and Foster, 2021; Boaler and Staples, 2008; Sahlberg 2021).”
(Ch. 9 Line 181-184)
“Boaler and Foster (2021) describe the change in achievement that resulted when teachers in eight districts in Northern California were given professional development that helped them de-track middle school classes and teach broader and deeper mathematics. Student achievement in these districts was compared with that in districts who continued to teach students in tracked groups with a more narrow mathematics focus. In the non-tracked districts, 15 percent more of the students achieved proficiency in the CAASPP assessments and 20 percent more students in the more conceptual MARS assessments (Boaler and Foster, 2021).” (Study 1) Ch. 9 (Line 304-312)
“Raising Expectations…2021” (Study 2) claims that math class detracking and the use of a ‘conceptual curriculum’ led to math gains of 2.03 middle school years of growth (0.68 SD) after one year. The CMF sfr’s claim:
“In a second study, comparisons were made between students working in tracked
groups and the same districts one year after significant de-tracking with the use of a
more conceptual curriculum. After a large number of districts detracked mathematics in middle school, student achievement increased significantly across the achievement
range, as shown in figure 9.1.” (Study 2) Ch. 9 (Line 312-316)
“These distributions show that student achievement increased across the range when
students were taught a more conceptual curriculum in de-tracked groups, producing
significantly more high achieving students. The score gain of 5.61 on the assessments
(0.68 standard deviations), is equivalent to 2.03 years of middle school growth.”
(Study 2) Ch. 9 (Line 322-325)
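For context, the internal arithmetic of the quoted Study 2 figures can be checked with a short sketch. The three inputs come from the quote above; the derived scale SD and per-year growth rate are our back-calculations, not quantities stated in the paper.

```python
# Unpacking the arithmetic behind the quoted claim that a 5.61-point
# gain equals 0.68 SD and 2.03 years of middle school growth.
score_gain = 5.61     # points gained, as quoted
gain_in_sd = 0.68     # the same gain in standard deviations, as quoted
years_growth = 2.03   # claimed equivalent middle school years of growth

implied_sd = score_gain / gain_in_sd               # points per 1 SD on the assessment
implied_annual_growth = gain_in_sd / years_growth  # SD of growth per typical school year

print(round(implied_sd, 2))             # 8.25 points per SD
print(round(implied_annual_growth, 2))  # 0.33 SD per year
```

On these figures, one year of typical middle school growth corresponds to roughly 0.33 SD, which is the implicit yardstick behind the “2.03 years” claim.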
The full citation for (Boaler and Foster, 2021), which contains Study 1 and Study 2, according to Appendix B for Ch. 9 is:
Boaler, Jo, and David Foster. 2021. “Raising Expectations and Achievement: The
Impact of Two Wide Scale De-Tracking Mathematics Reforms.” Youcubed and Silicon
Valley Mathematics Initiative. https://www.youcubed.org/wp-
content/uploads/2017/09/Raising-Expectations-2021.pdf.
The paper can be found here: https://www.youcubed.org/wp-
content/uploads/2017/09/Raising-Expectations-2021.pdf
Results of a Google Scholar search (https://scholar.google.com/) for ‘Raising Expectations…’ by Boaler, J. and Foster, D. show the paper is unpublished: a 2014 version has just 3 citations, and the 2021 version returns no results at all. There is no indication in Google Scholar that either the 2021 or the 2014 version of ‘Raising Expectations…’ was peer reviewed.
‘Raising Expectations…’ Study 1, which discusses a detracking and teacher professional development intervention, does not identify any of the 8 districts it studies, nor any of the 25 comparison districts. Nor does it provide any district-level results for either the California Standards Test (CST) or the MARS math tasks, the outcome metrics mentioned in the paper; instead it lumps all the data together. As such, in our view, it provides no specific, independently verifiable data on the results of its intervention. As discussed above, in our view, MARS math tasks are an unsuitable assessment tool, as they are not psychometrically sound. Study 1 also provides sparse detail on its study design, and little detail on the criteria used to recruit the districts eventually chosen to be studied. From our perspective, concerns about study methodology include:
Neither the treatment districts nor comparison districts were named, thus the results are independently unverifiable.
All of the districts’ data is lumped together, hiding district level details, preventing independent verification.
The districts were not randomly chosen.
Teachers were not randomly selected; neither were the students.
An invalid comparison was used: between teachers who received ‘mathematical mindset’ professional development and teachers who received no math professional development (PD) of any type or amount.
No information was provided on the student participants’ prior math levels.
The study assumes that all teachers are equally effective, which cannot be assumed.
The use of the MARS math tasks as an evaluation tool is problematic and unsuitable, as explained above; evaluations predicated upon MARS math tasks lack credibility.
The study does not provide detailed student math achievement results.
Regarding ‘Raising Expectations..’, Study 2, concerns about its study methodology include:
There is little detail on the study design.
It does not provide independently verifiable results.
It does not provide detailed student math achievement results.
In our view, neither Study 1 nor Study 2 provides credible evidence to support the CMF sfr’s claims of math gains from detracking or, in the case of Study 1, from the additional teacher professional development.
Changes in CA math education that could impact as many as 6 million school children should not be based on claims without credible evidence.
Example 7: Massive Open Online Course (MOOC)
The CMF sfr makes another math gain claim that suggests that ‘mathematical mindset’ student training via an online class and teacher professional development (PD) led to a math achievement gain. The CMF sfr’s claim:
“Many studies show that these perceived differences in ability can be changed by
interventions (Kwon et al., 2021; Boaler et al., 2018; Frontiers et al., 2007; Moses and
Cobb, 2002).” Ch. 9 (Line 76-78)
“When messages such as these were shown in a free online class offered through a randomized controlled trial, students significantly increased their mathematics engagement in class and improved later achievement (Boaler, Dieckmann, Pérez-Núñez, Sun, and Williams, 2018).” (Ch. 1 Line 413-416)
The full citations for ‘Boaler et al., 2018’ and ‘Boaler, Dieckmann, Pérez-Núñez, Sun, and Williams, 2018’ are the same full citation, according to Appendix B for Ch. 9 and for Ch. 1, which is:
Boaler, Jo, Jack A. Dieckmann, Graciela Pérez-Núñez, Kathy Liu Sun, and Cathy Williams. "Changing students' minds and achievement in mathematics: The impact of a free online student course." In Frontiers in Education, p. 26. Frontiers, 2018.
The article can be found here: https://www.frontiersin.org/articles/10.3389/feduc.2018.00026/full
“Changing students' minds….” claims a 0.33 SD math gain from a 90-minute ‘mathematical mindset’ online class (six 15-minute sessions, though students who completed just 4 of the 6 sessions were accepted into the study). The online class modules were mainly videos, according to “Changing students’ minds…”; they were administered in 4 school districts and coupled with 1 day of teacher training so teachers could learn to implement the lessons from the online class.
In our view, the study has numerous methodological concerns. Most importantly, according to “Changing students’ minds..”, the teachers in the study, who taught both treatment and control group students, knew which group was which. In our view, this introduces bias and compromises both the study and the intervention. Additionally, according to “Changing students’ minds..”, neither the teachers nor the students were chosen randomly; middle school teachers were recruited, and their students were included in the study.
Study Methodological Concerns:
Teachers knew which students were in the treatment group, and which students were in the control group.
Participating districts and schools were not identified, preventing independent verification of SBAC assessment results by district and school.
Districts were not randomly chosen, introducing selection bias.
Teachers were not randomly chosen; neither were students, introducing selection bias.
No prior student math achievement levels were provided for either the treatment or the control group; equivalency in math achievement between the two groups cannot be assumed.
It cannot be assumed, as this study does, that different classrooms of students taught by the same teacher have equivalent math achievement levels.
There was student attrition in both the treatment and control groups, raising questions about the composition of the remaining students in each group, neither of which was randomly chosen to start with.
In our view, given the numerous methodological concerns described above, the CMF sfr’s claim of student benefit from an online ‘mathematics mindset’ course coupled with teacher professional development is unsupported.
Example 8: (Burris, Heubert, and Levin, 2006)
The following is an example of the CMF sfr citing research in a way inconsistent with the research’s actual findings.
The CMF sfr claims that a focus on ‘big ideas’ and ‘multidimensional mathematics’, with students from different math levels working together, leads to math achievement gain. The claim:
“There have been numerous research studies showing the effectiveness of approaches that focus on big ideas, and multidimensional mathematics, with students from different achievement levels working together. In a de-tracking initiative, a suburban New York school district stopped teaching “regular” or “advanced” classes in middle school, and instead provided all students with content previously labeled as “advanced.” Researchers followed students in six cohorts over six years. In the first three years the 12 cohorts worked in tracks, for the next three years the cohorts worked in heterogeneous classes using the “advanced” curriculum, which consisted of sixth, seventh, and eighth grade coursework taken in grades six and seven, followed by the first course in an integrated mathematics sequence incorporating algebra concepts (entitled Sequential Mathematics I) in eighth grade. The researchers found that the students who learned in the heterogeneous classes took more advanced math, enjoyed math more, and passed the state Regents test in New York a year earlier than students in traditional tracks. Further, researchers showed that the advantages occurred across the achievement spectrum for low and high achieving students (Burris, Heubert, and Levin, 2006).” (Ch. 9 Line 285-299)
A full citation for (Burris, Heubert, and Levin, 2006) is missing in the CMF sfr’s Appendix B for Ch. 9. But the full citation, we believe, is:
Burris, C. C., Heubert, J. P., & Levin, H. M. (2006). Accelerating mathematics achievement using heterogeneous grouping. American Educational Research Journal, 43(1), 137-154.
The article can be found and downloaded here: https://www.semanticscholar.org/paper/Accelerating-Mathematics-Achievement-Using-Grouping-Burris-Heubert/89bab06af3343fb9fdf24e74fab1cacda8b1b4ce
Though the CMF sfr cites (Burris, Heubert, and Levin, 2006) to support this claim, there is no mention of big ideas or multidimensional mathematics anywhere in the Burris et al. 2006 article. The CMF sfr nonetheless cites it for the following claim:
“There have been numerous research studies showing the effectiveness of approaches that focus on big ideas, and multidimensional mathematics, with students from different achievement levels working together.” (Ch. 9 Line 285-287)
In our view, no multidimensional mathematics approach is discussed in (Burris, Heubert, and Levin, 2006); just one mathematical approach was discussed and implemented: enrolling all students in a rigorous, algebra-based course in 8th grade, called Sequential Mathematics I. This enrollment was accompanied by additional support in the form of separate math workshops available to all students.
A word search for ‘big ideas’ in the text of (Burris, Heubert, and Levin, 2006) returns no results.
A word search for ‘multidimensional’ in the text of (Burris, Heubert, and Levin, 2006) returns no results.
Rather, the purpose of the study, according to (Burris, Heubert, and Levin, 2006), was:
“First, would more students take and pass such courses at the level of trigonometry and beyond if they took an accelerated algebra course in the eighth grade?
Second, would the performance of initial higher achievers decrease if all students were heterogeneously grouped and accelerated in mathematics?
We sought answers to these important questions.”
(Burris, Heubert, and Levin, 2006, pg. 109)
Further, to prepare students for taking Sequential Mathematics I in 8th grade, according to (Burris, Heubert, and Levin, 2006, pg. 110) this approach was taken:
“Thus, here "accelerated mathematics" refers to a program of mathematics study that (a) teaches the usual sixth-, seventh-, and eighth-grade curricula in 2 years rather than 3 and (b) teaches the usual ninth-grade curriculum, an algebra-based course labeled Sequential Mathematics I, in the eighth grade.” (pg. 110 Burris, et al 2006)
Thus, in our view, (Burris, Heubert, and Levin, 2006) does not support the CMF sfr’s claim that the use of big ideas and multidimensional mathematics, with students of all levels working together, leads to mathematics gains. The students studied in (Burris, Heubert, and Levin, 2006) were enrolled in an accelerated algebra course in 8th grade with a defined, rigorous curriculum.
Though (Burris, Heubert, and Levin, 2006) was cited in the previous field review (ffr) as though it supported the CMF’s proposals, it does not. Its intervention of math acceleration, coupled with math workshop support and heterogeneous classrooms, is nearly the opposite of what the CMF ffr and sfr propose. The intervention studied was math acceleration starting in grade 8, not the math delay and deceleration that the CMF ffr and sfr propose, which would delay Algebra 1 to grade 9 for all students while discouraging calculus completion in high school.
Summary:
Example 1: SFUSD Math Program
Math gain claims about SFUSD’s math program have been discredited. By including research articles that reference these discredited SFUSD claims, the CMF sfr casts doubt on its own proposals supported by such research, and calls into question the standard of research it has chosen to accept, include, and cite.
Example 2: The ‘Railside Study’
The ‘Railside study’, frequently cited in the CMF sfr to buttress claims related to detracking and student ‘mindset’ about mathematics, has been the subject of a careful critique questioning numerous aspects of its methodology and calling its validity into question. Thus, in our view, it cannot unequivocally support claims of student benefit. Its inclusion in the CMF sfr, to support CMF sfr proposals, casts doubt on those proposals and on the standard of research the CMF sfr has chosen to accept, include, and cite.
Example 3: Youcubed Summer Camp
In our view, the CMF sfr’s claim of a 2.8 school years math gain from a 4-week Youcubed summer math camp is unsupported by the reference cited in the CMF sfr (a 3-minute Youcubed video). A researcher had to hunt to find the research paper that purportedly supported this claim, the “Prove It to Me!” paper, which, in our opinion, does not support the 2.8 school years math gain claim either. By including unsupported claims of student benefit, and citing a research paper of poor caliber like “Prove It to Me!” and a 3-minute video of student participant interviews, the CMF sfr undermines its own claims and proposals, and casts doubt on the standard of research it has chosen to accept, include, and cite to underpin its proposals.
Example 4: 2019 Youcubed Summer Camp
In our view, the underlying support for the CMF sfr’s claim of a 0.52 standard deviation (SD) math achievement gain from a 2019 4-week Youcubed summer math camp does not withstand scrutiny. The supporting article, “The Transformative Impact…”, has numerous methodological flaws, including administering identical pre- and post-camp assessments containing material students had already seen, while using a non-standard assessment measure. We do not believe it can support a math achievement gain claim.
Example 5: 5th grade Central Valley
In our view, the research article underlying the CMF sfr’s claim of a 5.2 month (0.085 to 0.1 SD) math gain over the course of the 5th grade year, attributed to ‘mathematics mindset’ teacher professional development, has too many methodological flaws to support a claim of student benefit. In addition, according to SimplyPsychology, a difference between two groups of 0.1 SD is not meaningful and cannot support a claim of student benefit.
Example 6: “Raising Expectations..”
Study 1: in our view, the underlying research (an unpublished, non-peer-reviewed paper, “Raising Expectations…2021”, with methodological flaws) does not support the CMF sfr’s claim that 15 percent more students achieved proficiency on the CAASPP assessments and 20 percent more on the more conceptual MARS assessments, nor any claim of student benefit.
Study 2: In our view, the underlying support (the same unpublished, non-peer-reviewed paper, "Raising Expectations…2021," with methodological flaws) for the CMF sfr's claim of 2.03 middle school years of growth (0.68 SD) after one year does not support a claim of student benefit.
The CMF sfr should not be proposing big shifts in California public math education based on unpublished papers that provide scant information on study design and rely on flawed methodologies.
Example 7: Massive Open Online Course (MOOC)
In our view, the underlying article for the CMF sfr's claim of a 0.33 SD math gain from an online 'mathematics mindset' course and teacher professional development, "Changing students' minds and achievement in mathematics…2018," has too many methodological flaws to support a claim of student benefit, including non-random selection of teachers and students, and the participating teachers knowing which students were in the treatment group and which were in the control group.
Example 8: (Burris, Heubert, and Levin 2006)
The CMF sfr makes a claim about an approach using 'big ideas' and 'multidimensional mathematics' that is unsupported by the actual findings and methodology of the cited research (Burris, Heubert, and Levin 2006). Burris et al. 2006 never discusses 'big ideas' or 'multidimensional mathematics'; what it does study is an accelerated intervention enrolling all 8th grade students in a rigorous, Algebra-based course, the very approach the CMF sfr recommends against. Burris et al. 2006 does not support the CMF sfr's recommendations, or its claim about 'big ideas' and 'multidimensional mathematics.'
Conclusion:
In our view, to justify such a big shift in guidance for California's mathematical education, the research underlying the CMF sfr's proposals should be robust and rock solid rather than questionable, misrepresented, or lacking in evidence. In short: The Research Does Not Add Up.
Citations
Burris, Carol Corbett, Jay P. Heubert, and Henry M. Levin. "Accelerating Mathematics Achievement Using Heterogeneous Grouping." American Educational Research Journal, Vol. 43, No. 1 (Spring 2006), pp. 105–136.
https://www.semanticscholar.org/paper/Accelerating-Mathematics-Achievement-Using-Grouping-Burris-Heubert/89bab06af3343fb9fdf24e74fab1cacda8b1b4ce
(Opinion) "How One City Got Math Right: Pathways that work." The Hechinger Report, October 9, 2018.
https://hechingerreport.org/opinion-how-one-city-got-math-right
http://www.sfusdmath.org/uploads/2/4/0/9/24098802/sfusd_middle_school_sli-math-2016.pdf
