A media friend was looking into a recent Vanderbilt study on six unidentified school districts across the state. The researchers found that “more students were chronically absent this fall than in previous years, and absenteeism increased the most among English Learners, students of color, and students who are economically disadvantaged.” I told her I was not concerned about the latest findings.
When pressed on the issue, I pointed out that the sample size was concerning. Only six districts were represented; my guess is the researchers looked at the larger urban areas. I find the research misleading. There are 147 districts in the state, each with its own distinct issues. If we rush to apply a one-size-fits-all solution to any issue, we will be making a mistake.
Because the research did not identify the six individual districts, the results were problematic to me, as well as to other stakeholders and policymakers. The study has some interesting findings, and it would be useful to those six unidentified districts. However, I am not certain the findings carry over to other districts.
It is worth remembering that a global pandemic was ongoing, a pandemic that nobody saw coming and that we had never dealt with before. Education was largely online, which may explain why low-income students, English Learners, and students of color lacked internet access or had difficulty accessing online school.
In addition, some students initially had trouble getting devices. The research period ended after the first semester and did not allow for adjustments that may have occurred since. I am concerned about the study's replicability, and therefore its reliability. Even though it was a fascinating read with interesting thoughts, I am not sure I can go much beyond that at this time.
Professor Andrea Jones-Rooy brilliantly stated in Quartz: “Data doesn’t say anything. Humans say things. We’ve conflated data with truth. And this has dangerous implications for our ability to understand, explain, and improve the things we care about.” Data is important, but it can be manipulated, and it is shaped by the way questions are asked. Its value is often suspect, and conclusions can be predetermined.
Gavin Freeguard, on a similar theme, pointed out that, “badly presented data will be confusing to people inside government too. If we as the general public are having difficulty understanding messages across (and within) datasets, it suggests those inside government are having similar struggles. Indeed, making sense of the data is made nearly impossible by the sheer number of sources in use.”
In recent years, we have seen an overreliance on research from sole sources. In our state’s public education, much of the education research is done at Vanderbilt or is funded by the Gates Foundation through various education non-profits. That is not meant to be disparaging, but rather an observation.
However, the question we must ask ourselves is, “What is the possibility of the research being replicated or reproduced by other research institutions or organizations? Will the findings be the same?” We must recognize a susceptibility to biases in research. Bias can occur in the planning, data collection, analysis, and publication phases of research. Many biases operate subconsciously, go undetected, and are ultimately not correctable. We must be aware of inconsistencies between actual results and preconceived expected outcomes.
This brings us back full circle. Was Vanderbilt’s research accurate and unbiased? Perhaps. Then again, the data may have been distorted by how the questions were asked or by some unforeseen bias. Who was the audience? Who paid for the research? What was its objective? We all want to improve public education, but a little context on the findings would have been helpful, especially identifying the districts used in the study and who paid for the research.
Too often, the media are off and running with stories about research without critical details. My friend happened to drill down and ask the tough questions. For too long we have been shallow when it comes to facts and excellent when it comes to entertainment. Don Henley’s song “Long Way Home” contains a great lesson: “There are three sides to every story: yours and mine and the cold, hard truth.” So it is with some research and news. It is up to all of us to determine what is true and what is not. Data doesn’t talk; people do!
Executive Director of Professional Educators of Tennessee
* * *
Data doesn't talk but people do, and boy, did you talk. You guessed, you railed, threw in some innuendo and misdirection, possibly a hyperbole or two... and for what?
I didn't see any meat in your diatribe, but I sensed you didn't like what was in Vanderbilt's research. If the data is as bad as you are preaching to us, give us concrete evidence showing the data is bad. Not song lyrics or quotes from academics or pie-in-the-sky dreaming... they mean nothing and don't prove your point. They only show your point of view.