Education Privacy The Almighty Buck

Parents' Privacy Concerns Kill 'Personalized Learning' Initiative

Posted by Soulskill
from the we-care-too-much-about-our-kids-to-care-about-our-kids dept.
theodp writes: "You may recall that inBloom is a data initiative that sought to personalize learning. GeekWire's Tricia Duryee now reports that inBloom, which was backed by $100 million from The Bill and Melinda Gates Foundation and others, is closing up shop after parents worried that its database technology was violating their children's privacy. According to NY Times coverage (reg.), the inBloom database tracked 400 different data fields about students — including family relationships ('foster parent' or 'father's significant other') and reasons for enrollment changes ('withdrawn due to illness' or 'leaving school as a victim of a serious violent incident') — that parents objected to, prompting some schools to recoil from the venture. In a statement, inBloom CEO Iwan Streichenberger said that personalized learning was still an emerging concept, and complained that the venture had been 'the subject of mischaracterizations and a lightning rod for misdirected criticism.' He added, 'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse, even though inBloom has world-class security and privacy protections that have raised the bar for school districts and the industry as a whole.' [Although it was still apparently vulnerable to Heartbleed.] Gates still has a couple of irons left in the data-driven personalized learning fire via his ties to Code.org, which seeks 7 years of participating K-12 students' data, and Khan Academy, which recently attracted scrutiny over its data-privacy policies."
  • by ffkom (3519199) on Tuesday April 22, 2014 @06:12PM (#46818763)
    ... who refuse to feed the data krakens.

    I already feared that every parent of today is on the "total surveillance" trip, teaching their children to kneel before their corporate overlords from their infancy.

    But then again, maybe those parents were only concerned about the collecting of data associated with themselves, not their children...

    • Somebody has already cooked up a term for that: https://en.wikipedia.org/wiki/... [wikipedia.org]

      You're no longer paranoid. Being concerned about your privacy became just a wee bit more fashionable. Why surrender more data to Big Data that will only end up in the data dungeons of the three letter agencies?

      • by Camael (1048726) on Tuesday April 22, 2014 @11:20PM (#46820167)

        FTA:-

        It reports that the inBloom database could track 400 different data fields about students, including details such as family relationships, reasons for enrollment changes (such as sicknesses, or being a victim of a serious violent incident)

        Wow. Sounds like a gross invasion of privacy. If I were the student, I wouldn't want my teacher to know that I was a "victim of a serious violent incident". Not to mention once this kind of data gets into a database, it's pretty dang hard to get it permanently scrubbed. So, what do the students get out of giving away their personal details?

        Over the last year, the incredibly talented team at inBloom has developed and introduced a technical solution that addresses the complex challenges that teachers, educators and parents face when trying to best utilize the student data available to them. That solution can provide a high impact and cost-effective service to every school district across the country, enabling teachers to more easily tailor education to students’ individual learning needs.

        Do teachers really need all this information to teach effectively? Do teachers even have the time to prowl through these thick databases to "tailor" their teaching methods? And what's wrong with teachers getting the information they need the old-fashioned way - by winning the trust and confidence of the student/parent and being told directly? And is the student's teacher the only one privy to this information?

        Even more fundamentally, is it fair to pigeonhole the students, each of whom is a unique individual with their own feelings, drives, desires and motivations, into anonymous datasets and discrete categories so that they can be dealt with by the numbers?

        This initiative seems to have been very badly thought out. Humans are not machines.

    • Well, knowing that amount of information about the children extends well to the parents.

      The organization's response does appear to be tone-deaf. I wouldn't care if they had perfect security. I care about what they're going to do with the information.

      • by jopsen (885607) <jopsen@gmail.com> on Tuesday April 22, 2014 @07:12PM (#46819057) Homepage

        The organization's response does appear to be tone-deaf. I wouldn't care if they had perfect security. I care about what they're going to do with the information.

        Exactly... And since they're US-based, you can't trust what they say anyway, because they can be legally ordered to lie to you.

        It really doesn't matter what they say... At the end of the day, the US doesn't have a legal framework to support safe use of private data for good, without the risk that it may end up at the NSA (or big insurance companies).

        Closing this was the only way; given the current political landscape in the US, big data is never safe.

        • by sjames (1099)

          And beyond that, it doesn't matter what they say or how sincere they are today. Tomorrow they may unilaterally change the agreement without notice. Why the courts don't shred any contract claiming that right, I don't know.

      • They retained the right to sell information to third parties. So that data on your child that you couldn't opt out of giving inBloom could go to some marketing agency so they could sell something better to your child.

        • by Camael (1048726)

          I'm sure the data will be of interest to any individual or organization targeting vulnerable children, and their fearful parents.

          Some possibilities off the top of my head:-

          Quacks selling miracle cures for sick children.
          Organisations selling therapy/schemes/camps/training for out of control children.
          Quasi-religious entities recruiting impressionable members.
          Criminal organizations seeking malleable stooges.
          Adults seeking children with less adult supervision for more nefarious activities.

          In contrast, marketing

          • Given that the uploaded data would have included IEP information (including medical diagnoses), disciplinary information, and even teen pregnancy information, all those would have been possible.

            Of course, InBloom has been shut down but some of the data had been uploaded. What happened to that data? Who has it now and will it be deleted or used for "other purposes"?

    • Re: (Score:3, Interesting)

      I agree that it is good to hear.

      I would also add that it is actually dirt simple for companies to assure "security" of this kind of personal data: all they have to do is not collect it in the first place.
      • by s.petry (762400)

        So they got caught with their pants down, okay. Not the first group this happened to.

        It would be better to hear their logic for collecting this data to begin with. If they wanted personalized learning, I'm pretty sure a student ID unique to each student would make more sense than gathering data on parents, their partners, reasons they missed class, etc...?

        If they really and truly only wanted to help personalize learning why not trim off the data people took issue with? They obviously wanted that data more than

        • by Obfuscant (592200)

          It would be better to hear their logic for collecting this data to begin with. If they wanted personalized learning, I'm pretty sure a student ID unique to each student would make more sense than gathering data on parents, their partners, reasons they missed class, etc...?

          Yeah, 'cause you can tell so much about a person by an arbitrarily assigned ID. The ID tells you all you need to know about what kinds of learning materials might work best for someone, or what wouldn't be appropriate. Yeah, you know from the ID that a child is in a single-parent home so you might want to tailor the material towards examples that he will be familiar with (because you also know that the student is a boy from his student ID.)

          And when the next arbitrarily assigned ID shows up on the system, y

          • by s.petry (762400)

            You are telling me it's impossible to gauge someone's knowledge or tailor learning to something like testing and progression, and you have to know who a kid's parent is sleeping with? Seriously, hold that thought a minute.

            Hahaha, haha, hahahaha, OMFG! Hahahaha.

            Okay, sorry. Have a nice day sir.

    • Well, I think the problem with a lot of people not being concerned about privacy is that we've all already had our data stolen. Most people didn't even know it was a "thing" until it was too late. Kind of like going to church or exercising. As an adult you think back "I wish I had gone to church or exercised instead of doing all that coke and killing that hooker... hey... I could make my kid do it the right way though!" and voilà...

    • We were fighting it like crazy and it was our kids' data we were concerned about. One of the big problems was that it wasn't opt-in. It wasn't even opt-out. It was "the government has decreed that parents aren't allowed to opt out." So you couldn't make an informed decision about InBloom. Your child's data was going there whether you liked it or not. Add in the fact that InBloom stated that they would release the data to "third parties" and you can see why parents like my wife and I were fighting it a

    • These parents are idiots. I found their kids on Facebook, as well as the candy van stalkers that have been trying to meet up with them.
  • He can grow 'em in tanks, for his personal slave army.

  • by JohnFen (1641097) on Tuesday April 22, 2014 @06:23PM (#46818821)

    Every so often, a little glimmer of good news comes my way. This would be one of them!

  • So? Fix it. (Score:5, Insightful)

    by Bob9113 (14996) on Tuesday April 22, 2014 @06:43PM (#46818925) Homepage

    'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse,'

    OK, so quit whining and fix it. Go talk to Bill and Melinda and ask them to fund some lobbying to get privacy laws with sharp teeth put in place. Simple laws that say something like, "Any company that says they won't abuse your data gets shut down and all their assets seized if they sell, transfer, share with a partner, or in any other way distribute your data, or if they sell the use of your data as a service, or use your data for any purpose or in any way other than what is explicitly stated on the front page of their web site, above the fold, in bold 14 point type."

    All we want is to be able to trust you. Since it would be silly to trust an American company that didn't have its financial ass on the line, what we need is for your financial future to be directly coupled to you doing what you claim you were going to do anyway. Put your money where your mouth is; if you're not trying to pull something, it won't cost you a thing.

    • by Obfuscant (592200)

      "Any company says they won't abuse your data gets shut down and all their assets siezed if they sell, transfer, share with a parter, or in any other way distribute your data, or if they sell the use of your data as a service, or use your data for any purpose or in any way other than what is explicitly stated on the front page of their web site, above the fold, in bold 14 point type."

      The ultimate poison pill for any startup company. This would effectively prohibit any future funding or merger. "Gee, guys, you have a great idea and we'd love to buy you out to bring your idea to a larger audience, but our lawyers won't let us assume the liability of dealing with your data."

      • by Anonymous Coward

        "our lawyers won't let us assume the liability"

        Mission Accomplished.

      • The ultimate poison pill for any startup company. This would effectively prohibit any future funding or merger. "Gee, guys, you have a great idea and we'd love to buy you out to bring your idea to a larger audience, but our lawyers won't let us assume the liability of dealing with your data."

        You're overlooking the fact that if there were such a law, it would apply to everyone, not just startups. Want to deal with existing established companies? Same problem. So now you have the interesting choice of either accepting the risk, or leaving the market entirely.

        And there will be some companies who are willing to accept the risk, provided the rewards are commensurate.

        • by Obfuscant (592200)

          And there will be some companies who are willing to accept the risk, provided the rewards are commensurate.

          The final result of this will be both less competition in the market and higher prices. A win-win for the consumer.

          I actually wasn't overlooking the application to existing companies, I was just making the point stronger by showing how it would stifle innovation and creativity.

      • by sjames (1099)

        They could always purge the data. If the new buyer has any desire to use the data in a way that wasn't part of the deal when the user provided it, it's the only proper thing to do anyway.

        Of course, those investors have the money burning a hole in their pocket. If they don't invest it, it will inflate itself away. Everything they might want to invest in is operating under that same law, so they might as well choose the same way they are now.

        • by Obfuscant (592200)

          They could always purge the data.

          And the DATA, along with the USE to provide personalized learning, IS THE VALUE OF THE COMPANY. If you have to delete the data to sell the company, or merge it with another technology firm to enhance the products, then the company loses a lot of its value and this provision becomes, just like I said, a poison pill.

          And before you rant on about use of this data, I'm saying it is going to be used for EXACTLY THE REASON IT WAS COLLECTED.

          If the new buyer has any desire to use the data in a way that wasn't part of the deal when the user provided it,

          If the new buyer has ANY desire to use the data for the same purpose it

          • by sjames (1099)

            And before you rant on about use of this data, I'm saying it is going to be used for EXACTLY THE REASON IT WAS COLLECTED.

            And they are willing to back that with a contract where changes in terms are explicitly forbidden, right? Because all I see are vague statements that aren't even promises.

            I really don't care if the company has any resale value or not. I'm more concerned that they not collect kids' data under color of government (since school is compulsory) and then change management and sell it to the highest bidder. If they can't make a go of it under that constraint, we're better off without them.

            If they would like to not

      • by ultranova (717540)

        The ultimate poison pill for any startup company. This would effectively prohibit any future funding or merger. "Gee, guys, you have a great idea and we'd love to buy you out to bring your idea to a larger audience, but our lawyers won't let us assume the liability of dealing with your data."

        It's only a poison pill for companies whose business model is to cyberstalk people. Everyone else can simply not collect and record personally identifiable data. And the stalkers should be poisoned and hopefully killed i

        • by Obfuscant (592200)

          It's only a poison pill for companies whose business model is to cyberstalk people.

          You are so wrong that it's remarkable. It's a poison pill for any company that needs customer data to operate. The ultimate example is this one, where data is needed so the education can be personalized and similarities in student backgrounds can be leveraged into better education for all of them. This company wasn't cyberstalking anyone.

          Everyone else can simply not collect and record personally identifiable data.

          So you have the same idea that s.petry did, that a student can log in with his student ID and magically the system will know what learning material to provide. I say

    • by jopsen (885607)

      Simple laws that say something like, "Any company that says they won't abuse your data gets shut down and all their assets seized

      What does it matter?
      The company could put that in the EULA...

      But what would it change? Even if the company is truly nice, and truly wants to honor its agreement, it can be forced to disclose data to the NSA and not talk about it.
      Even if there were a law, there would be a secret law circumventing it. In the current political landscape this isn't far-fetched.
      In fact it's naive to think things like this don't take place.

    • by nbauman (624611)

      'It is a shame that the progress of this important innovation has been stalled because of generalized public concerns about data misuse,'

      Actually, the biggest problem with InBloom is that collecting all this data didn't have any benefit that teachers or parents could recognize. If they like data so much, why didn't they get data to show that students actually benefit from big data before they rolled it out? Here's the best comment at the NYT:

      Kate Delaware

      I'm sure there are other examples of how inBloom intended their service to be used, and maybe some of them were kind of cool, but the one pictured in the article is absurd. As a teacher, I d

    • by guises (2423402)
      I think the mistake here is in allowing companies to set their own privacy policies.
  • by Karmashock (2415832) on Tuesday April 22, 2014 @06:48PM (#46818941)

    Don't give them your data... have them give you the engine.

    Then you feed the data into it locally, and it generates a customized learning profile which is anonymized.

    Then you anonymously download profile XJ2221LP4-123 or whatever, and you get the best of both worlds.

    Why are people so stupid... it's so fucking easy.
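    The local-engine scheme described above - the vendor ships the profiling engine, the student's records never leave the machine, and only an opaque profile code goes upstream - might be sketched roughly like this. This is a minimal illustration, not anything inBloom actually shipped; all names here (LearningRecord, build_profile, profile_code) are invented for the example:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class LearningRecord:
    """A single locally-stored record; never transmitted anywhere."""
    subject: str
    score: float  # 0.0-1.0 mastery estimate

def build_profile(records):
    """Run locally: reduce detailed records to a coarse level (1-3) per subject."""
    return {r.subject: min(3, int(r.score * 3) + 1) for r in records}

def profile_code(profile):
    """Derive an opaque download code from the coarse profile alone.
    The server only ever sees this code, which tells it which pre-built
    lesson bundle to serve - not the underlying records."""
    canonical = ",".join(f"{k}:{v}" for k, v in sorted(profile.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

records = [LearningRecord("math", 0.82), LearningRecord("reading", 0.41)]
profile = build_profile(records)  # coarse per-subject levels, e.g. math=3
code = profile_code(profile)      # the only thing sent upstream
```

    Even a coarse code like this can leak something when combined with IP addresses or download patterns, which is essentially the trade-off the replies in this thread argue about.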

    • by SumDog (466607)

      In this situation it's not that easy. I have the feeling they're trying to aggregate data across a lot of different school systems to understand what is happening and create papers and reports people can use. Decentralized engines won't help achieve that. Even with anonymous IDs, you still need to track students, and really each student in a family (to judge family trends), and that ID is tied to so much of their other information that it could be traced back to the individual.

      • My idea has them getting NO data at all. You don't give them your data. Rather, THEY give you the engine. Your data stays local on your machine the whole time.

        And from that you figure out your education profile and get the kind of education you need.

        All they get are download statistics. I suppose they could compare IP addresses to figure out which student downloaded which selection of courses but that won't give lots of personal information. You can infer personal information but it will be fussy informatio

      • by sjames (1099)

        Not to mention it totally screws up the marketing plans. How are they going to help marketers target individual children when they have anonymized data?

  • by aevan (903814) on Tuesday April 22, 2014 @06:48PM (#46818945)
    ...how many of these 'concerned parents' are spewing that same data daily over facebook, without a care?

    Tangentially related: the other day, my neighbour called up her niece, concerned - a facebook update informed her that both she and her mother had gone to a hospital, and had been there for a few hours. The niece's opening response: "who told you?". She was convinced someone blabbed, when all along it was 'use geolocation services' or some such on their phones. They simply had no idea what information they were freely handing out. Have to wonder if some kids had tried to sneak into a bar before, only for their phones to rat them out.

    Excuse me while I guesstimate the hypocrisy inherent in them refusing something that might actually be of (good) use.
    • ...how many of these 'concerned parents' are spewing that same data daily over facebook, without a care?
      She was convinced someone blabbed, when all along was 'use geolocation services' or some such on their phones. They simply had no idea what information they were freely handing out.

      You contradict yourself. First you claim that the parents spew data "without a care". But in your example, the niece clearly did care about the loss of data, she was simply technologically inept at securing her phone.

      And, even that is understandable. Frankly speaking, can you honestly claim that you know and approve of every bit of data that leaves your phone? That you are fully conversant and familiar with the multitude of information that is being broadcast from your phone, right this minute, by the OS

      • by aevan (903814)
        Yes, I can, because I don't own a smart phone, and expressly for those reasons. I also don't post pics of relatives or give out information about their behaviours online, with OR without their permission. No tweeting, no facebooking, no blogging of habits. There is no hypocrisy here. [Not exactly material, but I've also jailbroken and secured phones for friends: I am conversant with the tech, merely have no use for it personally].

        No, I don't contradict, because they NORMALLY give up their information fr
        • by Camael (1048726)

          Yes, I can, because I don't own a smart phone, and expressly for those reasons. I also don't post pics of relatives or give out information of their behaviours online, with OR without their permission. No twitting, no facebooking, no blogging of habits.

          Fair enough, but even you must recognize that your standards are rather extreme and far from the norm. And I suspect unacceptable to the majority of people. It's the same as preaching total abstinence from sex as the cure for AIDS - it definitely works

  • But the simple fact is that between US corporations and the US government, privacy abuses have been so bad (although admittedly still better than in some other countries) that there is no chance people would willingly opt into any such system. Even if the current incarnation is honest, there is 0% chance that it will stay that way, for one reason or another.

    Everyone older than a teenager should remember the whole Google 'do no evil' thing, and many of us honestly hoped that they would stay that way. Unfortunate

    • by jopsen (885607)

      privacy abuses have been so bad (although admittedly still better than some other countries)

      Out of curiosity: which countries do you think of? :)

      Even the Stasi, the East German secret police during the Cold War, didn't conduct surveillance on the scale of the US government.


      Considering credit card penetration in the US, etc., I would suspect you have better privacy in China. Though your right to disagree might be slightly reduced :)

      • I replied, but something happened to my connectivity just as I was about to hit submit.

        The only reason places like East Germany didn't is that they couldn't. They didn't have this level of technology back then. Not to mention, you didn't have an entire population of people stupid enough to vomit every intimate detail of their private lives onto the internet.

        Now? Oh, they'd have a total field day. Given the way Russia has been going lately, I wouldn't be surprised if they started, assuming they haven

  • by Dan East (318230) on Tuesday April 22, 2014 @06:53PM (#46818971) Homepage Journal

    First of all, the summary is misleading. It wasn't parents that "shut this down" (that would simply happen by parents not utilizing the service in the first place). It was the governments that own and operate the schools. They passed laws that will not allow the schools to share the data in the first place. Big difference. Especially since there was no breach. Nothing "bad" happened to warrant this ruling.

    Whether this has always been the case, or is simply more apparent in this day and age, I'm not sure. But at this point in time, public schools are operated by cowards. I'm talking about the school boards and superintendents who operate the school districts at the highest levels (where these kinds of decisions are made). I'm talking about everything from their policies regarding "threats" (like how you hear in the news about 10 year olds being suspended from school [cnn.com] because they made their fingers into the shape of a gun and made a sound), to locking down schools with video cameras at the entrances so parents have to show their ID and be buzzed in just to have lunch with their child. An event happens at one school in the entire nation, and suddenly that is somehow a realistic threat to every other school in the nation too. It's because those operating the schools at the highest levels are cowards. They say they have "zero tolerance" for many things now (like the whole "gun" threat nonsense), which really means "We absolve ourselves from having to think or make decisions in any way, so that we, the school board, have zero liability at all in the event, no matter how remote, that something bad happens at our schools." Cowards.

    Now this whole inBloom thing, whether a good idea or not, is dead because of those cowards. Parents no longer have this option, in the 21st century, to simply consolidate their children's educational data to a single 3rd party service. Why? Because school officials, in their fear and ignorance, assume that somehow it's all going to be breached - and here's the key part - and that they will be responsible and bear some degree of liability.

    • by ffkom (3519199)

      "... simply consolidate their children's educational data to a single 3rd party service." - There's not a single good reason to do that, other than to fulfil the fantasies of the founders of those "3rd party services".

      If you want "personalized education", pay teachers for spending time on your children.

      If you want colorful "management reports" on your children's education project status, automatically derived from some formalized database entries, then of course, such a "consolidating 3rd party service"

    • One of the big problems with InBloom was that there was no "option" of using it. The children's data would be uploaded whether the parents wanted it to be or not. For example, my wife and I were opposed to InBloom and didn't want our sons' information uploaded to their cloud servers. We couldn't opt-out, though. Like it or not, our sons' data would have been uploaded to InBloom's system and there would have been nothing we could have done to stop it. (Beyond complaining loudly to our politicians - whic

    • by phorm (591458)

      Because school officials, in their fear and ignorance, assume that somehow it's all going to be breached - and here's the key part - and that they will be responsible and bear some degree of liability.

      Maybe those school officials, familiar with history and similar systems, have a bit more education on the subject than yourself...

  • The government (and their private sector lobbyists) has made it quite clear that they don't give a shit about anyone's rights or privacy. Parents have a right to be concerned. These days there is a 'permanent record', and with ever growing numbers of data points being added, the probability of having your career torpedoed for out-of-context events that happened decades ago is growing radically.

  • We don't need more tracking for a goverment to abuse.

  • Which, of course, it is not. That still leaves the entirely reasonable objection that they have the data at all. Why should they be trusted with it?

  • In order to ascertain whether a learning program works well, the first item needed is solid testing, so that you know where a child is in his learning path. Sadly, efforts to do real testing get sabotaged by the powers that be. For example, we have the FCAT testing, which is sort of an anti-learning device. The reason it is negative is that schools know what will be on the tests and when the tests will take place. The S.A.T. tests have suffered a similar fate. These days it is normal to study for
  • If they "personalize" learning the same way Facebook "personalizes" ads, then I don't blame them for not wanting it.
