Thursday 5 April 2018

Your Facebook data has probably already been scraped, Mark Zuckerberg says

Facebook revealed on Thursday that as many as 87 million users could have had their Facebook data improperly shared with the British political-data firm Cambridge Analytica.

But even if you're sure you're not one of those people because you never shared your data with a sketchy quiz app, there's probably someone out there who has scraped data from your Facebook profile, CEO Mark Zuckerberg said on Wednesday during a call with reporters.
Basically, the search function of Facebook's apps was so powerful and widely used that if you were a Facebook user and it was turned on — which it was by default — then you should assume someone out there has access to your information, Zuckerberg said.
"In terms of the scale, I think the thing people should assume, given this is a feature that's been available for a while and a lot of people use it in the right way, but we've also seen some scraping," Zuckerberg said on a conference call with reporters on Wednesday. "I would assume if you had that setting turned on that someone at some point has accessed your public information in this way."
Here's how Mike Schroepfer, Facebook's chief technology officer, explained it in a blog post on Wednesday:
"Until today, people could enter another person's phone number or email address into Facebook search to help find them. This has been especially useful for finding your friends in languages which take more effort to type out a full name, or where many people have the same name. In Bangladesh, for example, this feature makes up 7% of all searches.
"However, malicious actors have also abused these features to scrape public profile information by submitting phone numbers or email addresses they already have through search and account recovery. Given the scale and sophistication of the activity we've seen, we believe most people on Facebook could have had their public profile scraped in this way. So we have now disabled this feature. We're also making changes to account recovery to reduce the risk of scraping as well."
Unfortunately, there's no way to protect yourself from this kind of data scraping on Facebook besides locking down your profile or deleting it. Facebook describes the data taken by scrapers as "public."
It's one of several revealing details to come out of the hourlong conference call with reporters — others include that nobody has lost their job over the Cambridge Analytica scandal and that Zuckerberg considers himself a "power user of the internet."

Mark Zuckerberg: Hey everyone. Thanks for joining today. Before we get started today, I just want to take a moment to talk about what happened at YouTube yesterday.
Silicon Valley is a tight-knit community, and we all have a lot of friends over there at Google and YouTube.
We're thinking of everyone there and everyone who was affected by the shooting.
Now I know we face a lot of important questions. So I just want to take a few minutes to talk about that upfront, and then we'll take about 45 minutes of your questions.
Two of the most basic questions that I think people are asking about Facebook are: first, can we get our systems under control and can we keep people safe, and second, can we make sure that our systems aren't used to undermine democracy?
And I'll talk about both of those for a moment and the actions that we're taking to make sure the answers are yes. But I want to back up for a moment first.
We're an idealistic and optimistic company. For the first decade, we really focused on all the good that connecting people brings. And as we rolled Facebook out across the world, people everywhere got a powerful new tool for staying connected, for sharing their opinions, for building businesses. Families have been reconnected, people have gotten married because of these tools. Social movements and marches have been organized, including just in the last couple of weeks. And tens of millions of small businesses now have better tools to grow that previously only big companies would have had access to.
But it's clear now that we didn't do enough. We didn't focus enough on preventing abuse and thinking through how people could use these tools to do harm as well. That goes for fake news, foreign interference in elections, hate speech, in addition to developers and data privacy. We didn't take a broad enough view of what our responsibility is, and that was a huge mistake. It was my mistake.
So now we have to go through every part of our relationship with people and make sure that we're taking a broad enough view of our responsibility. It's not enough to just connect people, we have to make sure that those connections are positive and that they're bringing people closer together. It's not enough to just give people a voice, we have to make sure that people are not using that voice to hurt people or spread disinformation. And it's not enough to give people tools to sign into apps, we have to ensure that all of those developers protect people's information too. It's not enough to have rules requiring they protect information, it's not enough to believe them when they tell us they're protecting information — we actually have to ensure that everyone in our ecosystem protects people's information.
So across every part of our relationship with people, we're broadening our view of our responsibility, from just giving people tools to recognizing that it's on us to make sure those tools are used well.
Now let me get into more specifics for a moment.
With respect to getting our systems under control, a couple of weeks ago I announced that we were going to do a full investigation of every app that had a large amount of people's data before we locked down the platform, and that we'd make further changes to restrict the data access that developers could get.
Ime Archibong and Mike Schroepfer followed up with a number of changes we're making, including requiring apps you haven't used in a while to get your authorization again before querying for more of your data. And today we're following up further and restricting more APIs like Groups and Events. The basic idea here is that you should be able to sign into apps and share your public information easily, but anything that might also share other people's information — like other posts in groups you're in or other people going to events that you're going to — those should be more restricted. I'm going to be happy to take questions about everything we're doing there in a minute.
I also want to take a moment to talk about elections specifically.
Yesterday we took a big action by taking down Russian IRA pages targeting their home country.
Since we became aware of this activity after the 2016 US elections, we've been working to root out the IRA and protect the integrity of elections around the world. And since then there have been a number of important elections that we've focused on. A few months after the 2016 elections there was the French presidential election, and leading up to that we deployed some new AI tools that took down more than 30,000 fake accounts. After that there was the German election, where we developed a new playbook for working with the local election commission to share information on the threats we were each seeing. And in the US Senate Alabama special election last year, we successfully deployed some new AI tools that removed Macedonian trolls who were trying to spread misinformation during the election.
So all in, we now have about 15,000 people working on security and content review, and we'll have more than 20,000 by the end of this year.
This is going to be a big year of elections ahead, with the US midterms and presidential elections in India, Brazil, Mexico, Pakistan, Hungary and others — so this is going to be a major focus for us.
But while we've been doing this, we've also been tracing back and identifying this network of fake accounts the IRA has been using so we can work to remove them from Facebook entirely. This was the first action we've taken against the IRA in Russia itself, and it included identifying and taking down Russian news organizations that we determined were controlled and operated by the IRA. So we have more work to do here, and we're going to continue working very hard to defend against them.
All right. So that's my update for now. We expect to make more changes over the coming months, and we'll keep you updated, and now let's take some questions.

David McCabe, Axios: Given that Colin [Stretch] testified just last year, and more has come out since then, and given that the numbers around the time of the IRA operation changed so drastically, why should lawmakers — why should users and Congress trust that you are giving them a full and accurate picture now?
Zuckerberg: Of the IRA, I think there is going to be more content that we are going to find over time. As long as there are people employed in Russia who have the job of trying to find ways to exploit these systems, this is going to be a never-ending battle. You never fully solve security; it's an arms race. In retrospect, we were behind, and we didn't invest enough in it up front. We had thousands of people working on security, but nowhere near the 20,000 that we're going to have by the end of this year. So I am confident we are making progress against these adversaries. But they were very sophisticated, and it would be a mistake to assume that you can ever fully solve a problem like this, or think that they are going to give up and stop doing what they are doing.

Rory Cellan-Jones, BBC: You, back in November 2016 when you could say this crisis began, dismissed as crazy the idea that fake news could influence the election, and more recently here in the UK you've turned down an invitation to speak to our Parliamentarians in the House of Commons, just as we learn tonight that 1 million UK users were affected by the Cambridge Analytica data leak. Are you taking this seriously enough, and can you convince British users that you care enough about the situation?

Zuckerberg: Yes. So we announced today that I'm going to be testifying in front of Congress. I imagine that is going to cover a lot of ground. I am going to be sending one of our top folks. I believe it's going to be [Schroepfer], the CTO, or Chris Cox, our chief product officer. These are the top folks who I run the company with — to answer additional questions from countries and other places.
Oh sorry, I should also probably address — you asked about my comments after the 2016 election. I've said this already, but I think at this point that I clearly made a mistake by dismissing as "crazy" the idea that fake news could have an impact. People will analyze the actual impact of this for a long time to come, but what is clear at this point is that it was too flippant. I should have never referred to it as crazy. This is clearly a problem that requires careful work, and since then we've done a lot to fight the spread of disinformation on Facebook, from working with fact checkers to making it so that we're trying to promote and work with broadly trusted news sources. But this is an important area of work for us.
Ian Sherr, CNET: So you just announced 87 million people affected by the Cambridge Analytica stuff today. How long did you know this number was affected? Because the 50 million number was out there for quite a while. I know you guys weren't specifically saying that, but it feels like the data keeps changing on us. And we're not getting a full forthright view of what's going on here.
