Parents' Privacy Concerns Kill 'Personalized Learning' Initiative
theodp writes: "You may recall that inBloom is a data initiative that sought to personalize learning. GeekWire's Tricia Duryee now reports that inBloom, which was backed by $100 million from The Bill and Melinda Gates Foundation and others, is closing up shop after parents worried that its database technology was violating their children's privacy. According to NY Times coverage (reg.), the inBloom database tracked 400 different data fields about students — including family relationships ('foster parent' or 'father's significant other') and reasons for enrollment changes ('withdrawn due to illness' or 'leaving school as a victim of a serious violent incident') — that parents objected to, prompting some schools to recoil from the venture. In a statement, inBloom CEO Iwan Streichenberger said that personalized learning was still an emerging concept, and complained that the venture had been 'the subject of mischaracterizations and a lightning rod for misdirected criticism.' He added, 'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse, even though inBloom has world-class security and privacy protections that have raised the bar for school districts and the industry as a whole.' [Although it was still apparently vulnerable to Heartbleed.] Gates still has a couple of irons left in the data-driven personalized learning fire via his ties to Code.org, which seeks 7 years of participating K-12 students' data, and Khan Academy, which recently attracted scrutiny over its data-privacy policies."
Good to hear there are reasonable parents left... (Score:5, Interesting)
I already feared that every parent of today is on the "total surveillance" trip, teaching their children to kneel before their corporate overlords from their infancy.
But then again, maybe those parents were only concerned about the collecting of data associated with themselves, not their children...
The Snowden effect (Score:3)
Somebody has already cooked up a term for that: https://en.wikipedia.org/wiki/... [wikipedia.org]
You're no longer paranoid. Being concerned about your privacy became just a wee bit more fashionable. Why surrender more data to Big Data that will only end up in the data dungeons of the three letter agencies?
For what benefit to the child? (Score:4, Insightful)
FTA:-
Wow. Sounds like a gross invasion of privacy. If I were the student, I wouldn't want my teacher to know that I was a "victim of a serious violent incident". Not to mention once this kind of data gets into a database, it's pretty dang hard to get it permanently scrubbed. So, what do the students get out of giving away their personal details?
Do teachers really need all this information to teach effectively? Do teachers even have the time to prowl through these thick databases to "tailor" their teaching methods? And what's wrong with teachers getting this information they need the old-fashioned way - by winning the trust and confidence of the student/parent and being told directly? And is the student's teacher the only one privy to this information?
Even more fundamentally, is it fair to pigeonhole the students, each of whom is a unique individual with their own feelings, drives, desires and motivations, into anonymous datasets and discrete categories so that they can be dealt with by the numbers?
This initiative seems to have been very badly thought out. Humans are not machines.
Re: (Score:3)
Well, knowing that amount of information about the children means knowing plenty about the parents as well.
The organization's response does appear to be tone-deaf. I wouldn't care if they had perfect security. I care about what they're going to do with the information.
Re:Good to hear there are reasonable parents left. (Score:5, Insightful)
The organization's response does appear to be tone-deaf. I wouldn't care if they had perfect security. I care about what they're going to do with the information.
Exactly... And being US based, you can't trust what they say anyway, because they can be legally ordered to lie to you.
It really doesn't matter what they say... At the end of the day, the US doesn't have a legal framework to support safe use of private data for good, without the risk that it may end up at the NSA (or big insurance companies).
Closing this was the only way; given the current political landscape in the US, big data is never safe.
Re: (Score:2)
And beyond that, it doesn't matter what they say or how sincere they are today. Tomorrow they may unilaterally change the agreement without notice. Why the courts don't shred any contract claiming that right, I don't know.
Re: (Score:2)
They retained the right to sell information to third parties. So that data on your child that you couldn't opt out of giving inBloom could go to some marketing agency so they could sell something better to your child.
Re: (Score:3)
I'm sure the data will be of interest to any individual or organization targeting vulnerable children, and their fearful parents.
Some possibilities off the top of my head:-
Quacks selling miracle cures for sick children.
Organisations selling therapy/schemes/camps/training for out of control children.
Quasi-religious entities recruiting impressionable members.
Criminal organizations seeking malleable stooges.
Adults seeking children with less adult supervision for more nefarious activities.
In contrast, marketing
Re: (Score:2)
Given that the uploaded data would have included IEP information (including medical diagnoses), disciplinary information, and even teen pregnancy information, all those would have been possible.
Of course, InBloom has been shut down but some of the data had been uploaded. What happened to that data? Who has it now and will it be deleted or used for "other purposes"?
Re: (Score:3, Interesting)
I would also add that it is actually dirt simple for companies to assure "security" of this kind of personal data: all they have to do is not collect it in the first place.
Re: (Score:2)
So they got caught with their pants down, okay. Not the first group this happened to.
It would be better to hear their logic for collecting this data to begin with. If they wanted personalized learning, I'm pretty sure a student ID unique to each student would make more sense than gathering data on parents, their partners, reasons they missed class, etc...?
If they really and truly only wanted to help personalize learning why not trim off the data people took issue with? They obviously wanted that data more than
Re: (Score:2)
It would be better to hear their logic for collecting this data to begin with. If they wanted personalized learning, I'm pretty sure a student ID unique to each student would make more sense than gathering data on parents, their partners, reasons they missed class, etc...?
Yeah, cause you can tell so much about a person by an arbitrarily assigned ID. The ID tells you all you need to know about what kinds of learning materials might work best for someone, or what wouldn't be appropriate. Yeah, you know from the ID that a child is in a single parent home so you might want to tailor the material towards examples that he will be familiar with (because you also know that the student is a boy from his student ID.)
And when the next arbitrarily assigned ID shows up on the system, y
Re: (Score:2)
You are telling me it's impossible to gauge someone's knowledge or tailor learning to something like testing and progression, and you have to know who a kid's parent is sleeping with? Seriously, hold that thought a minute.
Hahaha, haha, hahahaha, OMFG! Hahahaha.
Okay, sorry. Have a nice day sir.
Re: (Score:2)
Well, I think the problem with a lot of people not being concerned about privacy is that we've all already had our data stolen. Most people didn't even know it was a "thing" until it was too late. Kind of like going to church or exercising. As an adult you think back "I wish I had gone to church or exercised instead of doing all that coke and killing that hooker... hey... I could make my kid do it the right way though!" and voilà...
Re: (Score:3)
We were fighting it like crazy and it was our kids' data we were concerned about. One of the big problems was that it wasn't opt-in. It wasn't even opt-out. It was "the government has decreed that parents aren't allowed to opt out." So you couldn't make an informed decision about InBloom. Your child's data was going there whether you liked it or not. Add in the fact that InBloom stated that they would release the data to "third parties" and you can see why parents like my wife and I were fighting it a
Re: (Score:2)
Re: (Score:2, Troll)
Spoken like a clueless teen, right down to the diction. In your case, you are probably right.
The rest of the world recalls what works and what doesn't. The newer ways, tailored to fit the convenience of teachers, board, the few liberal parents and now a data collection scam dressed as a Gates Charity, do not work, did not work and will not work. Bringing the focus back to the student, adopting the ways of schools from the 1930s to the 1950s and only updating newer facts for texts is going to be THE SOLUTIO
Re: (Score:2)
Sit down son, I'll dismiss thoughts from fools based on my experience and observation. This includes dipshits like you spouting lefty regurgitation without a thought as to what you are saying. I notice you were too squeamish to do this with your account. Why should I take you seriously at all? Are you afraid you might get modded down for your silly tantrum? You may even feel silly to find that I am not right wing. I just live in a REAL world where children are undereducated by spoiled fuckheads more concerne
If Gates Wants Our Kids? (Score:2)
He can grow 'em in tanks, for his personal slave army.
Good news (Score:4)
Every so often, a little glimmer of good news comes my way. This would be one of them!
Re: (Score:2)
And how many of those people are OK with their kid's school automatically signing them up for a Google or Microsoft account for their school domain and then forcing all the work to be done in the online docs/whatever tools?
I don't know, but I would absolutely, and vocally object. And refuse to allow my child to comply.
So? Fix it. (Score:5, Insightful)
'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse,'
OK, so quit whining and fix it. Go talk to Bill and Melinda and ask them to fund some lobbying to get privacy laws with sharp teeth put in place. Simple laws that say something like, "Any company that says they won't abuse your data gets shut down and all their assets seized if they sell, transfer, share with a partner, or in any other way distribute your data, or if they sell the use of your data as a service, or use your data for any purpose or in any way other than what is explicitly stated on the front page of their web site, above the fold, in bold 14 point type."
All we want is to be able to trust you. Since it would be silly to trust an American company that didn't have its financial ass on the line, what we need is for your financial future to be directly coupled to you doing what you claim you were going to do anyway. Put your money where your mouth is; if you're not trying to pull something, it won't cost you a thing.
Re: (Score:2)
"Any company says they won't abuse your data gets shut down and all their assets siezed if they sell, transfer, share with a parter, or in any other way distribute your data, or if they sell the use of your data as a service, or use your data for any purpose or in any way other than what is explicitly stated on the front page of their web site, above the fold, in bold 14 point type."
The ultimate poison pill for any startup company. This would effectively prohibit any future funding or merger. "Gee, guys, you have a great idea and we'd love to buy you out to bring your idea to a larger audience, but our lawyers won't let us assume the liability of dealing with your data."
Re: (Score:1)
"our lawyers won't let us assume the liability"
Mission Accomplished.
Not just startups (Score:2)
The ultimate poison pill for any startup company. This would effectively prohibit any future funding or merger. "Gee, guys, you have a great idea and we'd love to buy you out to bring your idea to a larger audience, but our lawyers won't let us assume the liability of dealing with your data."
You're overlooking the fact that if there was such a law, it would apply to everyone, not just startups. Want to deal with existing established companies? Same problem. So now you have the interesting choice of either accepting the risk, or leaving the market entirely.
And there will be some companies who are willing to accept the risk, provided the rewards are commensurate.
Re: (Score:2)
And there will be some companies who are willing to accept the risk, provided the rewards are commensurate.
The final result of this will be both less competition in the market and higher prices. A win-win for the consumer.
I actually wasn't overlooking the application to existing companies, I was just making the point stronger by showing how it would stifle innovation and creativity.
Re: (Score:2)
They could always purge the data. If the new buyer has any desire to use the data in a way that wasn't part of the deal when the user provided it, it's the only proper thing to do anyway.
Of course, those investors have the money burning a hole in their pocket. If they don't invest it, it will inflate itself away. Everything they might want to invest in is operating under that same law, so they might as well choose the same way they are now.
Re: (Score:2)
They could always purge the data.
And the DATA, along with its USE to provide personalized learning, IS THE VALUE OF THE COMPANY. If you have to delete the data to sell the company, or merge it with another technology firm to enhance the products, then the company loses a lot of its value and this provision becomes, just like I said, a poison pill.
And before you rant on about use of this data, I'm saying it is going to be used for EXACTLY THE REASON IT WAS COLLECTED.
If the new buyer has any desire to use the data in a way that wasn't part of the deal when the user provided it,
If the new buyer has ANY desire to use the data for the same purpose it
Re: (Score:2)
And before you rant on about use of this data, I'm saying it is going to be used for EXACTLY THE REASON IT WAS COLLECTED.
And they are willing to back that with a contract where changes in terms are explicitly forbidden, right? Because all I see are vague statements that aren't even promises.
I really don't care if the company has any resale value or not. I'm more concerned that they not collect kids' data under color of government (since school is compulsory) and then change management and sell it to the highest bidder. If they can't make a go of it under that constraint, we're better off without them.
If they would like to not
Re: (Score:2)
It's only a poison pill for companies whose business model is to cyberstalk people. Everyone else can simply not collect and record personally identifiable data. And the stalkers should be poisoned and hopefully killed i
Re: (Score:2)
It's only a poison pill for companies whose business model is to cyberstalk people.
You are so wrong that it's remarkable. It's a poison pill for any company that needs customer data to operate. The ultimate example is this one, where data is needed so the education can be personalized and similarities in student backgrounds can be leveraged into better education for all of them. This company wasn't cyberstalking anyone.
Everyone else can simply not collect and record personally identifiable data.
So you have the same idea that spetry did, that a student can log in with his student ID and magically the system will know what learning material to provide to it. I say
Re: (Score:3)
Simple laws that say something like, "Any company that says they won't abuse your data gets shut down and all their assets seized
What does it matter?
The company could put that in the EULA...
But what would it change? Even if the company is truly nice and truly wants to honor its agreement, it can be forced to disclose data to the NSA and not talk about it.
Even if there were a law, there would be a secret law circumventing it. In the current political landscape this isn't far fetched.
In fact it's naive to think things like this don't take place.
Re: (Score:3)
'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse,'
Actually, the biggest problem with InBloom is that collecting all this data didn't have any benefit that teachers or parents could recognize. If they like data so much, why didn't they get data to show that students actually benefit from big data before they rolled it out? Here's the best comment at the NYT:
Re: (Score:2)
Its easy to keep both (Score:4, Interesting)
Don't give them your data... have them give you the engine.
Then you feed the data into it locally, and it generates a customized learning profile which is anonymized.
Then you anonymously download profile XJ2221LP4-123 whatever and then you get the best of both worlds.
Why are people so stupid... it's so fucking easy.
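To make that concrete, here's a rough Python sketch of the flow the parent describes (purely illustrative - the field names, the scoring, and derive_profile_code are all made up, not anything inBloom actually shipped). The point is that the raw record never leaves the local machine; only a short opaque profile code does:

import hashlib
import json

def derive_profile_code(student_record: dict) -> str:
    # Run the vendor-supplied "engine" locally and reduce the result to a
    # coarse learning profile; only a code derived from that profile ever
    # leaves the box.
    profile = {
        "reading_level": min(student_record.get("reading_score", 0) // 10, 9),
        "math_level": min(student_record.get("math_score", 0) // 10, 9),
        "pace": "self-paced" if student_record.get("absences", 0) > 10 else "standard",
    }
    digest = hashlib.sha256(json.dumps(profile, sort_keys=True).encode()).hexdigest()
    return digest[:12]  # stands in for the post's "XJ2221LP4-123"-style code

record = {"reading_score": 72, "math_score": 55, "absences": 3}
print("Request lesson bundle for profile", derive_profile_code(record))

The vendor then serves the lesson bundle keyed by that code and never sees names, family details, or enrollment history.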
Re: (Score:2)
In this situation it's not that easy. I have the feeling they're trying to aggregate data across a lot of different school systems to understand what is happening and create papers and reports people can use. Decentralized engines won't help achieve that. Even with anonymous IDs, you still need to track students, and really each student in a family (to judge family trends), and that ID has so much of their other information attached that it could be traced back to the individual.
Re: (Score:2)
My idea has them getting NO data at all. You don't give them your data. Rather, THEY give you the engine. Your data stays local on your machine the whole time.
And from that you figure out your education profile and get the kind of education you need.
All they get are download statistics. I suppose they could compare IP addresses to figure out which student downloaded which selection of courses but that won't give lots of personal information. You can infer personal information but it will be fuzzy informatio
Re: (Score:2)
Not to mention it totally screws up the marketing plans. How are they going to help marketers target individual children when they have anonymized data?
Re: (Score:2)
Explain terms of usage. If I made an error, I'd like to be corrected in a way that I won't make it again.
Irony.. (Score:3)
Tangentially related: the other day, my neighbour called up her niece, concerned - a Facebook update informed her that both she and her mother had gone to a hospital, and had been there for a few hours. The niece's opening response: "who told you?". She was convinced someone blabbed, when all along it was 'use geolocation services' or some such on their phones. They simply had no idea what information they were freely handing out. Have to wonder if some kids had tried to sneak into a bar before, only for their phones to rat them out.
Excuse me while I guesstimate the hypocrisy inherent in them refusing something that actually might be of (good) use.
Bad logic (Score:3)
...how many of these 'concerned parents' are spewing that same data daily over facebook, without a care?
She was convinced someone blabbed, when all along it was 'use geolocation services' or some such on their phones. They simply had no idea what information they were freely handing out.
You contradict yourself. First you claim that the parents spew data "without a care". But in your example, the niece clearly did care about the loss of data, she was simply technologically inept at securing her phone.
And, even that is understandable. Frankly speaking, can you honestly claim that you know and approve of every bit of data that leaves your phone? That you are fully conversant and familiar with the multitude of information that is being broadcasted from your phone, right this minute, by the OS
Re: (Score:2)
No, I don't contradict, because they NORMALLY give up their information fr
Re: (Score:2)
Fair enough, but even you must recognize that your standards are rather extreme and far from the norm. And I suspect unacceptable to the majority of people. It's the same as preaching total abstinence from sex as the cure for AIDS - it definitely works
This could have been good... (Score:2)
But the simple fact is that between US corporations and the US government, privacy abuses have been so bad (although admittedly still better than some other countries) that there is no chance people would willingly opt into any such system. Even if the current incarnation is honest, there is 0% chance that it will stay that way, for one reason or another.
Everyone older than a teenager should remember the whole Google 'do no evil' thing, and many of us honestly hoped that they would stay that way. Unfortunate
Re: (Score:2)
privacy abuses have been so bad (although admittedly still better than some other countries)
Out of curiosity: which countries do you think of? :)
Even the Stasi, the East German secret police during the Cold War, didn't conduct surveillance at the scale of the US government.
Considering credit card penetration in the US, etc., I would suspect you have better privacy in China. Though, your right to disagree might be slightly reduced
Re: (Score:2)
I replied, but something happened to my connectivity just as I was about to hit submit.
The only reason places like East Germany didn't is because they couldn't. They didn't have this level of technology back then. Not to mention, you didn't have an entire population of people stupid enough to vomit every intimate detail of their private lives onto the internet.
Now? Oh, they'd have a total field day. Given the way Russia has been going lately, I wouldn't be surprised if they started, assuming they haven
Schools are operated by cowards (Score:5, Insightful)
First of all, the summary is misleading. It wasn't parents that "shut this down" (that would simply happen by parents not utilizing the service in the first place). It was the governments that own and operate the schools. They passed laws that will not allow the schools to share the data in the first place. Big difference. Especially since there was no breach. Nothing "bad" happened to warrant this ruling.
Whether this has always been the case, or is simply more apparent in this day and age, I'm not sure. But at this point in time, public schools are operated by cowards. I'm talking about the school boards and superintendents who operate the school districts at the highest levels (where these kinds of decisions are made). I'm talking about everything from their policies regarding "threats" (like how you hear in the news about 10 year olds being suspended from school [cnn.com] because they made their fingers into the shape of a gun and made a sound), to locking down schools with video cameras at the entrances so parents have to show their ID and be buzzed in just to have lunch with their child. An event happens at one school in the entire nation, and suddenly that is somehow a realistic threat to every other school in the nation too. It's because those operating the schools at the highest levels are cowards. They say they have "zero tolerance" for many things now (like the whole "gun" threat nonsense), which really means "We absolve ourselves from having to think or make decisions in any way, so that we, the school board, have zero liability at all in the event, no matter how remote, that something bad happens at our schools." Cowards.
Now this whole inBloom thing, whether a good idea or not, is dead because of those cowards. Parents no longer have this option, in the 21st century, to simply consolidate their children's educational data to a single 3rd party service. Why? Because school officials, in their fear and ignorance, assume that somehow it's all going to be breached - and here's the key part - and that they will be responsible and bear some degree of liability.
Re: (Score:3)
The problem is who owns the data. The deep secret about your 'permanent record' that principals talk about when you're in school is that once you graduate, they sit on paper in a disused basement until destroyed by floods, fire, or rats. Perhaps these days they sit on tapes that become unreadable even sooner.
The good news is that they don't get sold to credit agencies, insurance companies or other lowlifes. Even if they wanted to sell it, they can't. It's just too hard to retrieve.
The harm of collecting it
Re: (Score:2)
Let's see, FERPA [wikipedia.org]
It only applies to organizations receiving DOE funds. Oh, and it seems it got loosened up a bit in 2012.
So yes, it's a pinkie swear.
Re: (Score:1)
"... simply consolidate their children's educational data to a single 3rd party service." - There's not a single good reason to do that, other than to fulfil the fantasies of the founders of those "3rd party services".
If you want "personalized education", pay teachers for spending time on your children.
If you want colorful "management reports" on your childrens education project status, automatically derived from some formalized database entries, then of course, such a "consolidating 3rd party service"
Re: (Score:2)
One of the big problems with InBloom was that there was no "option" of using it. The children's data would be uploaded whether the parents wanted it to be or not. For example, my wife and I were opposed to InBloom and didn't want our sons' information uploaded to their cloud servers. We couldn't opt-out, though. Like it or not, our sons' data would have been uploaded to InBloom's system and there would have been nothing we could have done to stop it. (Beyond complaining loudly to our politicians - whic
Re: (Score:1)
Because school officials, in their fear and ignorance, assume that somehow it's all going to be breached - and here's the key part - and that they will be responsible and bear some degree of liability.
Maybe those school officials, familiar with history and similar systems, have a bit more education on the subject than yourself...
What a shock (Score:2)
The government (and their private sector lobbyists) has made it quite clear that they don't give a shit about anyone's rights or privacy. Parents have a right to be concerned. These days there is a 'permanent record', and with ever growing numbers of data points being added, the probability of having your career torpedoed for out-of-context events that happened decades ago is growing radically.
Good to see it go. (Score:2)
We don't need more tracking for a government to abuse.
Even if their security were perfect... (Score:2)
Which, of course, it is not - that still leaves the entirely reasonable objection to their having the data at all. Why should they be trusted with it?
Can't Even Test Kids (Score:2)
Personalzed? (Score:1)