Everything you do online is measured; this data (or metadata) can be tracked, sold, and analyzed (exploited?) by computer programs called algorithms.  For instance, Google now offers a program that can tell if you are depressed just by what you have searched for.  LED lights may soon surveil you as you do your shopping. Facial recognition and voice programs can help employers predict and decide whether to hire applicants. Keystroke data (which keys you type, and how fast) can be used to detect emotions.  Even how quickly you drag your mouse across the computer screen can reveal your emotional state.

How does mouse-tracking, emotion-detecting technology work?

These authors say mouse-tracking technology “offers game-changing options for businesses, insider-threat detection and national security … Opportunities for this technology extend across many industries, with the potential for far-reaching impacts. … Neuro-ID’s technology is used to capture a person’s response to various types of questions. Those experiencing frustration and associated indecision when answering a particular question can be immediately identified.”

“Modern computing devices allow users to enter information using keyboards, mice, or touch screens. Mobile devices have additional sensors like gyrometers and accelerometers that orient the screen. All of these input devices collect data at millisecond precision. Recent research has focused on a person’s emotional state and how those emotions affect their fine motor movements. Your emotions, like happiness, sadness, anger or frustration, cause immediate, uncontrolled changes in how your hand moves a computer mouse, navigates a touch pad screen or holds a smartphone.” –LSE Business Review
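To make the quote above concrete: this kind of research starts from millisecond-timestamped (time, x, y) cursor samples and reduces them to fine-motor statistics such as movement speed, speed variability, and hesitation pauses. The sketch below is a minimal illustration of that feature-extraction step; the function and feature names are our own, not Neuro-ID's or any vendor's actual code.

```python
import math

def mouse_features(samples):
    """Compute simple fine-motor features from (t_ms, x, y) mouse samples.

    Illustrative only: real emotion-detection research derives many more
    statistics, but all of them come from logs like these, captured at
    millisecond precision by ordinary input devices.
    """
    speeds = []
    pauses = 0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = (t1 - t0) / 1000.0               # seconds between samples
        dist = math.hypot(x1 - x0, y1 - y0)   # pixels moved
        if dt > 0:
            speeds.append(dist / dt)          # pixels per second
        if dist == 0 and dt >= 0.2:           # >= 200 ms with no movement
            pauses += 1                       # count it as a hesitation
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return {"mean_speed": mean, "speed_std": var ** 0.5, "pauses": pauses}

# Example: a cursor that hesitates for 300 ms mid-path.
path = [(0, 0, 0), (16, 10, 0), (32, 25, 5), (332, 25, 5), (348, 40, 10)]
print(mouse_features(path))
```

The point is how little is needed: a few dozen lines of ordinary code, fed by events every browser and app already receives, produce the raw material these emotion-inference claims are built on.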

How does edtech data tracking and analysis impact schools and students?

CASEL (the Collaborative for Academic, Social, and Emotional Learning), the same special-interest non-profit that is promoting national standards to rate student emotions, recently held a Social-Emotional Assessment Design Challenge.  NWEA, a global non-profit assessment company popular in many schools for its online interim MAP assessments, took first place with its SEL entry. (You can see the other SEL edtech runners-up here.)

According to the NWEA press release on winning the SEL competition with its keystroke-based detector:

“NWEA Introduces MAP Growth Capability that Detects and Addresses Lack of Student Engagement in Assessment [and can] Identify Students Who are Rapid Guessing During Assessment [and] Provides Teachers with Real-Time Alerts and Guidance”

“This research is informing a wide range of NWEA offerings. NWEA researchers Jim Soland and Nate Jensen recently won the Social-Emotional Assessment Design Challenge for assessments that measure social-emotional learning (SEL). Their study demonstrated how students’ rapid guessing on an academic assessment directly correlates to the SEL constructs of self-management and self-regulation. Other research focuses on what rapid-guessing behavior informs us about racial and gender achievement gaps.” 
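At its core, the "rapid guessing" idea described above is a response-time threshold: answers that arrive much faster than students typically spend on an item are flagged as disengaged guesses. The sketch below assumes a simple fraction-of-average-time heuristic; the function name and the 10% threshold are illustrative assumptions, not NWEA's actual method.

```python
def flag_rapid_guesses(response_times, item_avg_times, threshold_frac=0.10):
    """Flag responses answered faster than a fraction of the item's
    typical response time -- a common rapid-guessing heuristic in the
    assessment literature. Threshold choice here is illustrative.
    """
    flags = []
    for rt, avg in zip(response_times, item_avg_times):
        flags.append(rt < threshold_frac * avg)  # True = flagged as a rapid guess
    return flags

# A 2-second answer on an item students typically spend 30 seconds on
# gets flagged; a 25-second answer does not.
times = [2.0, 25.0, 1.0]
avgs = [30.0, 30.0, 40.0]
print(flag_rapid_guesses(times, avgs))  # [True, False, True]
```

Notice what such a heuristic cannot see: a fast answer from a student who simply recognized the correct choice looks identical, in the data, to a disengaged guess.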

As this Truth in American Education post points out, “It’s unclear if this [NWEA] program could differentiate between the student who’s experiencing inner turmoil over a particular question, and one with the sniffles who just spends a couple of seconds blowing his nose.”  To add to that question: What if your child doesn’t have great fine-motor control with a mouse? What if the student is working on a computer with a slow internet connection or a dirty trackball, so the mouse won’t move correctly? What if the student answered quickly not because it was a guess, but because they KNEW and recognized the answer? Or, finally, what if the whole idea of determining a child’s emotions from how they use a computer mouse is just bunk?  We would also like to hear how parents feel about an algorithm that interprets their child’s online behavior by race and gender.  How much are we willing to rely on machines to predict and profile our children? With the recent push for increased student data mining, student workforce data badges, and profiling of children based on their social-emotional learning (SEL) skills, will parents or schools even be informed that children’s emotional states and keystrokes are being measured and tracked?

Silicon Valley in the Classroom

Amazon, Facebook, Google, Microsoft, and many other edtech companies are in your child’s classroom. Edtech companies like DreamBox, Khan Academy, and Knewton use adaptive, or “personalized,” online programs that collect large amounts of data on each child.  Knewton claims 5 to 10 million data points per child, per day.  DreamBox claims 50,000 data points per hour on each student.  Khan Academy also uses adaptive learning; it attracted scrutiny for sharing student data with third parties. Khan Academy has reportedly changed its privacy policy but continues to use adaptive algorithms, and has since partnered with the College Board to offer students “free” SAT test prep.  The truth is, these edtech products are not free; we are paying for them with children’s data.  Edtech relies on your student’s data, and tech companies make money from data collection.  These Silicon Valley giants have a vision of how to revolutionize education, and the edtech revolution promises a $76B global market for education and talent [student data] technology.

Who is responsible?

When edtech companies market their products to schools, shouldn’t there be some sort of independent research showing the edtech product is effective?  The problem, as the article Inside Google’s Academic Influence Campaign so beautifully documents, is that research is often bought and paid for.  Are there independent studies showing that these personalized, adaptive, SEL-gathering algorithms that analyze and profile children’s data are accurate and not biased?

  • When schools use computer based curriculum or assessments, do the schools know exactly what student data is collected, how it is shared or used to profile or predict a child?  
  • If your school, your district, or your legislator signs off on any of these edtech programs for your child, without your informed parental consent on how that data is being used, then they have an ethical responsibility to know exactly what data is collected and who it is shared with.
  • Can parents see the data collected on their own children? ASK. Ask to see the data points collected and who they are shared with, and ask to see data-sharing contracts. Need help? Use this Parent Coalition for Student Privacy Toolkit or this EFF Student Privacy Tips for Parents.

There is a lot of finger-pointing: schools can’t possibly know what Google or other edtech companies do with student data; legislators, special-interest non-profits, and think tanks promise your child’s data is “safe” because of FERPA, which is riddled with loopholes.  Ultimately, parents and students are held hostage.

  • It’s getting to the point where either you surrender your child’s data to the edtech companies, or you don’t attend a school that uses online curriculum.

As big-data, billion-dollar education rebundler Tom Vander Ark puts it, “gone are the days of data poverty” … “better, faster, cheaper data is available from other sources.” Those other sources are adaptive online curricula and assessments that deliver real-time data via “personalized learning” or Competency Based Education.  But these online programs are not only measuring what knowledge your child has learned and whether an answer is right or wrong; they are also analyzing mouse clicks and personality.


Do parents send their children to school to be surveilled and analyzed? Do schools and edtech companies have a legal and ethical right to psychologically profile children, or is Silicon Valley money influencing America’s schools?

Do parents have a say in education matters, or because they don’t have millions to spend on lobbying, are they largely ignored?

Read about Google’s Silicon Tower of Lobbying and Influence

As this New York Times article implies, it seems tech giants like Google can take the data because they have the money to buy policy:

Google’s willingness to spread cash around the think tanks and advocacy groups focused on internet and telecommunications policy has effectively muted, if not silenced, criticism of the company over the past several years, said Marc Rotenberg, the president of the Electronic Privacy Information Center. His group, which does not accept any corporate funding, has played a leading role in calling out Google and other tech companies for alleged privacy violations. But Mr. Rotenberg said it has become increasingly difficult to find partners in that effort as more groups have accepted Google funding.

What’s lacking: Transparency and Consent

Wouldn’t it be nice if parents could send their children to school to learn, where assessments didn’t collect and share predictive data, measure mouse clicks, covertly analyze keystrokes, and rank children based on their emotional data?  Are we putting too much blind faith in big data and the tech industry promoting it?  Is student privacy being sold to the highest bidder?  Should we be concerned about data privacy? This privacy expert says when it comes to internet privacy, be very afraid. Even when edtech companies claim they aren’t tracking students (and teachers), they sometimes are.  Who is regulating edtech’s personalized, adaptive data collection?  NO ONE.


Cheri Kiesecker