The Idris book is one of my favourite programming books. It describes a simple, iterative technique for exploring a problem (“type-driven development”) and then shows you how to apply it in a number of different settings, while teaching you a set of complex language concepts almost in passing. I’d recommend that anyone who enjoyed previous *-driven-development approaches, and who is interested in functional programming and the Haskell family of languages, take a look at it.
I got kind of bored at one point, so maybe they cover this further down in this overly long article, but I couldn’t really figure out what point they were making. Is Facebook reducing access for third parties while I still get access to all of my own data? The quotes they give from Facebook sound like they are restricting what app writers can do, and that doesn’t seem bad to me as long as I can still download all of my data. But then they bring up this “what is my data” question. They say:
But I don’t know… I don’t really have a choice in the end, but someone exporting pictures of me to a new service without my consent is not really something I want either. Maybe there needs to be some consent model: “user X would like to export this picture of you, is that OK?” Similarly, there should probably be a consent model of “user Y has uploaded a picture of you to FB, is that OK?”
But in the end, FB doesn’t say they are doing any of this, so I don’t know what the EFF is trying to do other than drum up hostility towards something that doesn’t exist yet.
read: warn and inform people of risks
Inform them of what risks? Facebook hasn’t actually done anything; this whole post is fantasy.
(author of the original piece here)
If you want some specifics about what is happening right now, it’s the trend to lock down APIs, which Facebook has done and will, I’m sure, continue to do.
If you look at the links in the piece, you’ll see Zuckerberg saying that an early idealism for “data portability” led them to make mistakes, and they’re correcting those mistakes now. Mike Masnick has a somewhat pithier summary of these quotes here.
I’d also point you to the paragraph discussing the current ACLU case attempting to establish that scraping without authorisation is not a criminal act, per se, and the counter-arguments being made that without locking down data in this way, companies like Facebook cannot adequately protect your data.
I actually wanted to include exactly this suggestion as a hypothetical, but realised that the piece was already too long, so I took it out and replaced it with the “we should all have a longer conversation about this” ending, because there are so many cases like this to consider.
I understand if you think this is smoke without fire at this point, but given our previous experiences in this space, we worry about this a lot at EFF, and we do fear this kind of lock-down as one of the unintended consequences of the current debate. Even if the article was imperfect, I’m happy that you’re thinking about these questions!
You can also read a bit more on this topic in this EFF article, written earlier in the Facebook/Cambridge Analytica story, by Cory Doctorow.
Thanks for your response.
I think many of the points you bring up are important to discuss and consider; I mostly just really dislike how you do it. You’re basically just saying “OMG, FB could do this thing and that would be bad, everyone freak out”, and I don’t think that facilitates an informed discussion. I would much rather have read an article that took what’s happened and discussed options without trying to imply things. For example:
The way that reads to me, you’ve set it up such that Facebook does not agree with that definition of data portability. I can’t tell whether they do or don’t, but you’re certainly biasing me against Facebook.
Is it? I can’t read the minds of execs at Facebook, and I know you can’t either. Instead, I wish you had said something like “Facebook changed things like X, which means you cannot do Y; if Y is important to you, then you, dear reader, should help do something about bringing Y back”. Trying to make a narrative with good guys and bad guys is just off-putting to me; people are more complicated than that.
The risk is the introduction of bad regulation of people’s data: to protect users (even from themselves and from their friends), the US government might “force” Facebook to “keep their data safe” on its “secure” (US-located) servers, minimizing the amount of data that users will be able to get back or access.
Also, the author notes that Facebook’s API changes only limit observable data leaks; Facebook could keep selling data behind the scenes.
As far as I understood, the issue is exactly that they didn’t do anything to prevent the manipulation of a few million people through the use of precise profiles built from their data.
Or maybe Zuckerberg at Congress is just a brilliant belated April Fools’ prank! :-D