PTA - Friend or Foe?

For years, the Parent Teacher Association (PTA) has positioned itself as the ultimate coordinating body between parents and teachers, ostensibly offering parents numerous ways to engage in their child’s education.  Most parents don’t think twice about joining the PTA – after all, it’s all about the kids.  Or is it?

The Oklahoma Chamber of Commerce Makes "Outrageously Wrong" Claims About COMMON CORE

Recently, the Oklahoma State Chamber of Commerce produced a brochure called, “Preparing Students for Success”

Please note the blue square at the bottom of page 2 labeled “Case Study.”  This is a case study on Kentucky.  Here, the Chamber of Commerce makes the exciting CLAIM that Kentucky’s educational scores have risen dramatically since the Common Core State Standards were fully implemented there.

After just one year, the percentage of Kentucky students graduating college and career ready increased from 34 percent to 47 percent, and for the first time, the state scored above the national average on national tests, including the ACT.

Having at least a working knowledge of statistics, I KNOW there is no way to extrapolate (let alone establish) any correlation between something as diffuse as standards (or curricula) and achievement on national tests in a single year.  There are simply too many moving parts – too many outside factors involved – to attribute any increase (or decrease, for that matter) in test scores to a change in standards made across an entire K-12 system in one year.  Still, without a degree in statistics (and knowing what happened to the OU statisticians who critiqued Oklahoma’s A-F system), I felt unable to address this claim outright.

After hearing this brochure was being handed out to legislators at the Capitol, however, I decided to do some research on the Chamber of Commerce claims, so I called Richard Innes, Education Analyst at the Bluegrass Institute in Kentucky.  Richard is also a member of Truth in American Education – an organization for which ROPE is a contributor.

Richard wrote me the following email to share widely.  He is NOT happy about the claims made by the Oklahoma Chamber of Commerce (and, coincidentally, Stand For Children).

Mr. Innes writes: 


For starters, the chamber’s assertion about testing results — that Kentucky scored above the national average on the ACT — is absolutely false! In fact, this claim is outrageously wrong.

The 2013 ACT results for all states are readily available here: http://act.org/newsroom/data/2013/states.html

Kentucky’s ACT Composite Score in 2013 was 19.6. The national average was 20.9.

You have access to the NAEP maps to discuss the claim that Kentucky outscored the nation relative to that assessment, and I have more on NAEP below, as well.

Next, the chamber’s pamphlet also contains significant errors about Kentucky’s College and/or Career Ready (CCR) statistics. The CCR rate didn’t increase from 34% to 47% in one year.

Here are the actual Kentucky CCR rates by year: 

Kentucky’s College and/or Career Ready Rates, by Year

Data Source: Excel spreadsheet on CCR on the Kentucky School Report Cards website (http://applications.education.ky.gov/src/DataSets.aspx). Go to that address and click “CCR” under the “Delivery Targets” section. Then open the Excel spreadsheet and scroll all the way down to the state data at the bottom. The second-to-last line shows the actual statewide CCR “scores,” which are really percentages.

The one-year increase from 2012 to 2013 was only 6.9 points!

I am still researching the real implications of the numbers in the table above. At this point I suspect that the growth in 2012 and 2013 is notably due to changes in the state accountability system that started in the 2011-12 school year, including CCR as an accountability element for the first time and creating a number of new ways students could be counted in the success column.

Here are the different ways a student can be listed as college and/or career ready:

The College Ready indicator includes graduates who met the Kentucky Council on Postsecondary Education (CPE) system-wide benchmarks for Reading (20), English (18), and Mathematics (19) on any administration of the ACT. Students also are counted college ready if they passed a college placement test (COMPASS or the Kentucky Online Testing (KYOTE) placement system).

The Career Ready indicator includes graduates who met benchmarks for Career Ready Academic (ASVAB or ACT WorkKeys) and Career Ready Technical (Passed the KY Occupational Skills Standards Assessment [KOSSA] or received an Industry-Recognized Career Certificate). Students must pass an element in both areas, academic and technical, to be counted as career ready.

Source: Kentucky School Report Card, State, “Accountability – College and Career Readiness (CCR),” PDF version, page 36.
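The qualification rules described above can be sketched in a few lines of code. This is only a rough illustration of the either/or logic – the function names and data shapes are my own, not any official Kentucky system – but it makes plain how a graduate can count as CCR without ever hitting the ACT benchmarks:

```python
# Sketch of Kentucky's CCR qualification logic as described in the
# report card text. Benchmark cut scores (20/18/19) come from that
# text; everything else here is illustrative, not an official API.

CPE_ACT_BENCHMARKS = {"reading": 20, "english": 18, "math": 19}

def is_college_ready(act_scores=None, passed_placement_test=False):
    """College ready: meet ALL CPE benchmarks on any ACT
    administration, OR pass a placement test (COMPASS or KYOTE)."""
    if passed_placement_test:
        return True
    if act_scores is None:
        return False
    return all(act_scores.get(subject, 0) >= cut
               for subject, cut in CPE_ACT_BENCHMARKS.items())

def is_career_ready(passed_academic, passed_technical):
    """Career ready: pass BOTH an academic element (ASVAB or ACT
    WorkKeys) AND a technical element (KOSSA or an industry-recognized
    career certificate)."""
    return passed_academic and passed_technical

def is_ccr(college_ready, career_ready):
    # A graduate counts as CCR by meeting either route.
    return college_ready or career_ready

# A student below every ACT benchmark can still land in the success
# column through the career-ready route:
print(is_ccr(
    is_college_ready(act_scores={"reading": 18, "english": 17, "math": 16}),
    is_career_ready(passed_academic=True, passed_technical=True),
))  # True
```

The `or` in the last function is the crux of Mr. Innes’s point: each new qualifying route added for accountability purposes gives schools another way to raise the CCR number without any change in ACT performance.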
The key here is that Kentucky schools are probably still adjusting to the fact that they can qualify students as either college or career ready in more than one way. Some – perhaps a great deal – of the growth in the CCR numbers might be due to more students having the opportunity to take the alternate readiness elements, which were not formerly used for accountability and may not have been widely available to students before accountability began. At this point, I cannot say for sure.

What I do know is that the growth in CCR between 2010 and 2013 shown in the table above is not reflected in any notable increase in performance on the ACT college entrance test in Kentucky.

Throughout the period from 2010 to 2013, all Kentucky public school students have taken the ACT. Unfortunately, the public school benchmark results have not been reported consistently over this four-year period because Kentucky switched from using ACT’s own benchmark scores to different benchmark scores set by the Kentucky Council on Postsecondary Education (CPE).

However, here are the percentages of all Kentucky students, public and private combined, who achieved the ACT benchmark scores by year. Because private school enrollment in Kentucky is fairly low, the numbers largely reflect the trend for the public schools alone.

ACT Performance for Kentucky by Year, All Students Public & Private, Percent of Students Reaching ACT Benchmarks

As you can see, there have been no remarkable changes in Kentucky over this four-year period. Reading performance actually dropped in 2013 and science jumped up a bit, but that may be because ACT reset the benchmark thresholds for those subjects in 2013. If the reading benchmark had not changed, I calculate the percentage reaching the old benchmark in 2013 would be 44 – unchanged from the 2012 level. In science, the 2013 percentage would be 23 percent, little different from previous years.

However, the main point is that there has been no notable jump in college readiness in Kentucky based on the ACT.

So, there is evidence that the CCR improvement is mostly due to other factors that can lead to a student meeting one of the various CCR criteria.
Another point: I led you to the Bluegrass Institute’s blog (www.bipps.org/blog). We have a large number of posts there about Kentucky’s true educational performance, including discussions of why you must look at scores disaggregated by race to begin to properly rank Kentucky against other states. That is true whether we use the National Assessment of Educational Progress (NAEP) or the ACT (the latter only possible for states that, like Kentucky, test essentially 100% of their students with this college entrance test).

Here is one example of what happens when we disaggregate NAEP performance for eighth grade math:

Whites in all the states in green got math scores statistically significantly higher than Kentucky’s whites did in 2013. The seven states in tan tied us and only one state, West Virginia, did statistically significantly worse. That’s all — just one state. Is this a target for Oklahoma?

There are more maps like this for Grade 8 reading and for 4th grade math and reading in the Bluegrass Policy Blog.

Let me discuss another example. This one is for Southern states that test 100 percent of their students on the ACT.

(If you cannot enlarge this with your e-mail reader, the table is also available online.)

Notice that Louisiana scored 0.1 point lower than Kentucky when you look at overall scores. However, when you break the data out by race, Louisiana outscored Kentucky for every single racial group and arguably has a somewhat stronger education system. Yet this truth is buried in the overall average scores.

So, what happened to that overall score?

The answer is something that even has a name: Simpson’s Paradox. It is a well-known hazard in working with averaged data when the underlying student demographics vary dramatically.

Because of the achievement gaps for all the minority groups, a state like Kentucky, with a very large white population, gets an unfair advantage in any simplistic comparison of overall average scores alone. This is very important to understand when others point to things like Kentucky’s supposedly high ranking in the latest Quality Counts report from Education Week. Quality Counts is largely a popularity contest for current education fads, but it also includes some NAEP scores. However, with one small exception, Quality Counts looks only at overall average scores. It thus trips all over Simpson’s Paradox.
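Simpson’s Paradox is easy to demonstrate with a few lines of arithmetic. The numbers below are invented purely for illustration – they are NOT the actual Kentucky or Louisiana figures – but they show how a state can trail in every subgroup yet still lead on the overall average, simply because its demographic mix is weighted toward the higher-scoring group:

```python
# Hypothetical illustration of Simpson's Paradox (invented numbers,
# not real state data).

def overall_average(groups):
    """Weighted average across demographic groups.
    groups: list of (share_of_students, mean_score) tuples."""
    return sum(share * score for share, score in groups)

# "State A": lower scores in BOTH groups, but a 90% majority of the
# higher-scoring group.
state_a = [(0.90, 20.0), (0.10, 16.0)]

# "State B": higher scores in BOTH groups, but a 60/40 demographic mix.
state_b = [(0.60, 20.5), (0.40, 16.5)]

print(round(overall_average(state_a), 2))  # 19.6
print(round(overall_average(state_b), 2))  # 18.9
```

State B outperforms State A in every single subgroup (20.5 vs. 20.0 and 16.5 vs. 16.0), yet its overall average is 0.7 points lower. This is exactly why a ranking built on overall averages alone can invert the real comparison between two school systems with different demographics.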

In closing, the jury is still very much out on Kentucky’s real education performance since CCSS came along, but there already is a disturbing amount of evidence that much of the hoopla about Kentucky’s supposed improvement is not on target.

For sure, almost every claim about Kentucky in the Chamber’s pamphlet is absolutely incorrect. I would love to know who fed them this nonsense.