DreamBox, Datamining & DeVos?
President-Elect Trump’s choice for US Secretary of Education, Betsy DeVos, is pro-choice. So parents would like to know:
Would Ms. DeVos support ensuring parents the CHOICE of whether to have their children datamined?
Apparently, Ms. DeVos supports allowing big technology companies to datamine children (50,000 data points per hour) without parents knowing about it or ever seeing how the data are shared, used, and profiled.
DeVos is Chair of The Philanthropy Round Table, a group that focuses on venture-capital philanthropy (and making “serious money”), promotes computer-based datamining, and, as an added bonus, is also excited about Common Core. You can read excerpts from their Blended Learning Guidebook here:
The Common Core—and basing annual tests on the objectives spelled out within it—is an exciting idea. But annual tests are just too slow, infrequent, impersonal, and inexact to be adequate in the digital era. Even individual quizzes or papers—the way many teachers assess what each of their students know and don’t—are needlessly slow and laborious and often fail to produce useable results.
…This is where new-style digital learning may be a game-changer. If computerized curricula that include constant student testing become widespread in classrooms, with daily reports showing how every student in a class is doing on various fronts, then accountability becomes much easier to enforce. Teachers, principals, and parents will know right away if students are learning and understanding. Good blended-learning software puts all final results in the context of where the student started out, so separating good instruction from bad instruction isn’t just a crude matter of who aces the end-of-year test.
Smart games collect mind-boggling volumes of data; DreamBox, the popular math software program for elementary-school students, records 50,000 data points per student per hour. –Jessie Woolley-Wilson (CEO, DreamBox Learning), in discussion with author.
…Within a regular classroom, differentiating like this is extremely hard, even for the best of teachers. Adaptive software can do that sort of differentiating automatically. Much as Netflix or Amazon or Pandora are able to learn from each user’s actions to predict what that person will next need or desire, so adaptive educational software can pick up how a given student learns, and what he or she is missing.
A Lack of Research: “Honestly, it’s so early on, no one knows what works and doesn’t work,” says Diane Tavenner, leader of Summit Public Schools. “Indeed,” notes Scott Benson, who directs blended-learning grants at the Bill & Melinda Gates Foundation, “part of me is really nervous—that the dialogue and enthusiasm is outpacing the results.” –Philanthropy Round Table [emphasis added]
There are many concerning statements in this Philanthropy Round Table Blended Learning Guide to collecting more and more data, all on the premise of adaptive, constant-testing, personalized learning. Where do we start?
- “Good blended-learning software.” An oxymoron? Multiple studies continue to show that online learning has “dismal academic outcomes” compared to traditional, teacher-taught, brick-and-mortar schools. Blended learning fared even worse. If we cannot see the choices that ANY adaptive software program is making for the child, then by definition the blended-learning software program is not good.
- Algorithms lack transparency: people are unable to see the hidden code (read more about blackbox algorithms here) that the software program uses, and unable to see the decisions made about them based on the data collected and inferred. These personalized programs, and the decisions they make, are often wrong (garbage in, garbage out), yet they are making predictions about children. The data collected can also be re-purposed or shared with other entities, marketers, and agencies via data brokers.
- Take Netflix, for example. Let’s say I am on their website and I pick out a movie about the Civil War (for a class project). Now Netflix will tag me as a Civil War enthusiast and send me suggestions for similar titles, when I really do not like war movies at all.
- Or let’s say, students in class are searching for what type of clothes are worn in Switzerland. They type in “women’s tops in Switzerland”. You can imagine what images might pop up and what preferences might be flagged for these students. (Yes, this example happened to elementary students.)
- Social media text mining: algorithms are used all the time, without our knowledge, to infer personality, emotion, race and gender, behavior, and preferences. For example, this algorithm “attempts to learn about the author of the text through subtle variations in the writing styles that occur between gender, age and social groups. Such information has a variety of applications including advertising and law enforcement.”
- One example of adaptive education software getting it wrong involves an elementary student. The student was a slow reader: he could read the words, but it took him longer than the screen allowed before it scrolled to the next page, and he was not given time to answer the question on each page. The student continually ended up with scores that said he could not read. Thankfully, the classroom teacher knew the student could read above grade level from a textbook, and continued to give him appropriately leveled books to challenge him. However, he was stuck at a very low level (and flagged as a poor reader) whenever he used the online curriculum. His data footprint still has him incorrectly labeled.
- A different student, using a different software program, faced multiple-choice answers. This was a gifted student, often thinking outside the box, often asking questions that were miles ahead of the class. The student would agonize over which answer was supposed to be correct, and would often get the question wrong. When the teacher later gave the same assignment on paper and allowed the student to ask questions afterwards, the teacher discovered that not only did the student understand and get all the questions correct on her own, she also had convincing arguments that two of the questions on the online assignment were flawed.
- Constant student testing. This is Competency Based Education (CBE), mentioned in ESSA and the White House Testing Plan, and promoted by globalists like Tom Vander Ark. CBE relies on hidden, embedded, constant online data collection: millions of data points per day of unseen data making decisions about children, and no laws to say how these data can be used, re-purposed, shared, or even sold.
- DreamBox collects 50,000 data points PER HOUR, a constant stream of real-time data. Who sees that data? Are there any laws governing how that data is combined, shared, profiled, or sold? (FERPA doesn’t cover this data.) DreamBox partners with Clever. Reed Hastings, the CEO of Netflix, is DreamBox’s Co-founder and Chairman.
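To put the 50,000-data-points-per-hour figure in perspective, here is a back-of-the-envelope sketch. The per-hour rate comes from the quote above; the class size, daily usage, and school-year length are illustrative assumptions, not figures from DreamBox.

```python
# Scale of DreamBox-style data collection, using the 50,000 data points
# per student per hour figure quoted above. Class size, hours of daily
# use, and school days per year are assumptions for illustration only.

POINTS_PER_STUDENT_PER_HOUR = 50_000

# Roughly 14 data points recorded every second a student is on the software.
points_per_second = POINTS_PER_STUDENT_PER_HOUR / 3600
print(f"per student, per second: {points_per_second:.1f}")  # ~13.9

class_size = 25      # assumed class size
hours_per_day = 1    # assumed: one hour of software use per school day
school_days = 180    # assumed: typical US school year

per_class_per_year = (POINTS_PER_STUDENT_PER_HOUR
                      * class_size * hours_per_day * school_days)
print(f"per class, per year: {per_class_per_year:,}")  # 225,000,000
```

Even under these modest assumptions, one classroom generates hundreds of millions of data points a year, and parents see none of them.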
Some say “technology is the world we live in”. We would counter that sentiment with this: Technology does not have to collect and sell data. Adults use technology and social media at their own risk. However, schools should not require children to use screens, or online programs that collect and share their data. Children deserve protection and privacy. Data footprints are making decisions for and about children, without their knowledge. This needs to stop.
Stealth testing and personalized, adaptive computer programs used in schools collect massive amounts of data. These adaptive programs use algorithms, and both the data and the algorithms can be wrong. There is no way to see the algorithms used in “personalized learning,” and no way to check online data for accuracy or to see how it is used and shared. There has been considerable research on algorithmic bias and inaccuracy (see here, here, here, here, here, here, and here), and as the Philanthropy Round Table admits,
“Right now, educational software is often inferior to the best teachers. But teachers aren’t improving at nearly the rate that software is. …But never mind the glitches”
Will Trump and DeVos end, or promote, datamining and Common Core?
Ms. DeVos claims to be anti-Common Core. We have asked DeVos to prove it. Trump promised to end Common Core and to shrink the role of the federal government in education; we ask President Trump to keep these promises. TRUMP HAS ALSO PROMISED TO END STUDENT DATAMINING (here), and we very much expect him to keep this promise.
Again, parents ask President-Elect Trump and Ms. DeVos: Will you end Common Core and the datamining that is tied to it?
Will you allow parents to CHOOSE whether their children are Common Core datamined and plugged into computers that collect hidden data? Or will you only promote cyber charters and virtual blended-learning schools with Common Core cemented in, thanks to ESSA and Competency Based Education (hidden, constant online data collection)?
In the end, will there be no choice? Will this be an extension of the DeVos blended-learning Skunk Works plan?