How you might vote based on what you like

This table by Angie Waller shows how Facebook thinks you’ll vote based on what you like. It’s a straightforward view that’s fun to look at. In particular, I like the excluded audiences for certain topics, marked with an x.

I often see ads that are completely unrelated to my interests, and a small part of me feels like I’m winning in some way, even though I’m almost definitely losing.


More friendships between rich and poor might mean less poverty

Research recently published in Nature by Chetty, Jackson, Kuchler, and colleagues suggests that economic connectedness, or friendships between rich and poor, could improve economic mobility. The researchers used Facebook connection data from 70.3 million users, along with demographic and income data. NYT’s The Upshot explains the relationships with a collection of maps and charts.

You can find an anonymized, aggregated version of the data through the Social Capital Atlas. Also, I am very much into this socially-focused use of social media data.
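
For intuition, here’s a rough sketch of the kind of metric involved, not the paper’s exact definition: the average share of higher-income friends among lower-income people. The income buckets, edge list, and function below are made up for illustration.

```python
from collections import defaultdict

# Hypothetical toy data: an income bucket per person and an undirected
# friendship edge list. Not the actual Facebook data.
income = {"a": "low", "b": "low", "c": "high", "d": "high", "e": "low"}
friendships = [("a", "c"), ("a", "b"), ("b", "d"), ("e", "c"), ("e", "d")]

friends = defaultdict(set)
for u, v in friendships:
    friends[u].add(v)
    friends[v].add(u)

def economic_connectedness(income, friends):
    """Average share of high-income friends among low-income people.

    A simplified stand-in for the paper's measure, for intuition only.
    """
    shares = []
    for person, bucket in income.items():
        if bucket != "low" or not friends[person]:
            continue
        high = sum(1 for f in friends[person] if income[f] == "high")
        shares.append(high / len(friends[person]))
    return sum(shares) / len(shares) if shares else 0.0

print(economic_connectedness(income, friends))  # ~0.67 for this toy data
```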


Facebook doesn’t seem to fully know how its data is used internally

Lorenzo Franceschi-Bicchierai, reporting for Motherboard on a leaked Facebook document:

“We do not have an adequate level of control and explainability over how our systems use data, and thus we can’t confidently make controlled policy changes or external commitments such as ‘we will not use X data for Y purpose.’ And yet, this is exactly what regulators expect us to do, increasing our risk of mistakes and misrepresentation,” the document read. (Motherboard retyped the document from scratch to protect a source.)

In other words, according to the document, even Facebook’s own engineers admit that they are struggling to make sense of, and keep track of, where user data goes once it’s inside Facebook’s systems. Inside Facebook, this problem of tracing where data flows is known as “data lineage.”

Hm.


Analysis of Facebook groups before January 6

The Washington Post and ProPublica analyzed Facebook group posts that disputed election results:

To determine the extent of posts attacking Biden’s victory, The Post and ProPublica obtained a unique dataset of 100,000 groups and their posts, along with metadata and images, compiled by CounterAction, a firm that studies online disinformation. The Post and ProPublica used machine learning to narrow that list to 27,000 public groups that showed clear markers of focusing on U.S. politics. Out of the more than 18 million posts in those groups between Election Day and Jan. 6, the analysis searched for words and phrases to identify attacks on the election’s integrity.

The more than 650,000 posts attacking the election — and the 10,000-a-day average — is almost certainly an undercount. The ProPublica-Washington Post analysis examined posts in only a portion of all public groups, and did not include comments, posts in private groups or posts on individuals’ profiles. Only Facebook has access to all the data to calculate the true total — and it hasn’t done so publicly.

Read more about the methodology behind the analysis.
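
As a rough sketch of the final filtering step described above, matching posts against words and phrases tied to attacks on the election’s integrity, here’s what that might look like. The phrase list and post format are made up for illustration; this isn’t the Post and ProPublica’s actual code.

```python
import re

# Hypothetical phrases flagging attacks on the election's integrity; the real
# analysis used a vetted list plus machine learning. This is only a sketch.
PHRASES = ["stop the steal", "rigged election", "voter fraud"]
pattern = re.compile("|".join(re.escape(p) for p in PHRASES), re.IGNORECASE)

def flag_posts(posts):
    """Return posts whose text matches any flagged phrase.

    `posts` is assumed to be an iterable of dicts with a "text" field.
    """
    return [post for post in posts if pattern.search(post.get("text", ""))]

# Toy usage with made-up posts.
sample = [
    {"id": 1, "text": "Stop the Steal rally tomorrow"},
    {"id": 2, "text": "Here is a recipe for banana bread"},
]
print([p["id"] for p in flag_posts(sample)])  # [1]
```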


How Facebook disappeared from the internet

Cloudflare describes how things looked from their point of view on the day that Facebook, along with its other properties, went down. On the Border Gateway Protocol (BGP), which networks use to exchange routing information:

A BGP UPDATE message informs a router of any changes you’ve made to a prefix advertisement or entirely withdraws the prefix. We can clearly see this in the number of updates we received from Facebook when checking our time-series BGP database. Normally this chart is fairly quiet: Facebook doesn’t make a lot of changes to its network minute to minute.

But at around 15:40 UTC we saw a peak of routing changes from Facebook. That’s when the trouble began.
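
For intuition, the pattern Cloudflare describes, a normally quiet stream of BGP UPDATE messages that suddenly spikes, is straightforward to flag from a time series of per-minute counts. Here’s a hedged sketch, not Cloudflare’s pipeline; the counts and threshold are made up.

```python
from statistics import median

def find_spikes(per_minute_updates, factor=10):
    """Return indices of minutes where BGP UPDATE counts jump far above the norm.

    `per_minute_updates` is a list of per-minute counts of UPDATE messages
    received from one origin. A simple median-based check, not Cloudflare's
    actual detection logic.
    """
    baseline = median(per_minute_updates) or 1
    return [i for i, c in enumerate(per_minute_updates) if c > factor * baseline]

# Toy series: quiet minutes, then a burst of prefix withdrawals and re-advertisements.
counts = [2, 1, 3, 2, 2, 1, 2, 3, 2, 150, 140, 4]
print(find_spikes(counts))  # [9, 10]
```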


Facebook feed comparison between groups

As part of their Citizen Browser project to inspect Facebook, The Markup shows a side-by-side comparison between Facebook feeds for different groups, based on the feeds of 1,000 paid participants.

There are pretty big differences for news sources and group suggestions, but the differences for news stories don’t seem as big as you might think, with a median gap of 3 percentage points between groups. The distribution shows a wider spread, though.
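
To make the percentage-point comparison concrete, here’s a rough sketch of the kind of calculation involved, not The Markup’s code: for each news source, take the share of each panel’s participants who were served it, then look at the distribution of the gaps. The numbers below are invented.

```python
from statistics import median

# Hypothetical percentages of each panel's participants who were served a source.
# These numbers are made up; see The Markup's write-up for the real data.
panel_a = {"Source 1": 24.0, "Source 2": 11.0, "Source 3": 6.0, "Source 4": 2.0}
panel_b = {"Source 1": 18.0, "Source 2": 8.0, "Source 3": 9.0, "Source 4": 1.0}

# Percentage-point gap per source, then a summary of the distribution.
gaps = {s: abs(panel_a[s] - panel_b[s]) for s in panel_a}
print(gaps)                    # {'Source 1': 6.0, 'Source 2': 3.0, ...}
print(median(gaps.values()))   # 3.0 percentage points for this toy data
```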


Trump’s criminal justice ad spending on Facebook

The Marshall Project contrasted ad spending on Facebook by Trump’s campaign against Joe Biden’s:

Our analysis found that of the $82 million Trump’s reelection campaign has spent on Facebook ads this year, $6.6 million paid for ads about crime and policing—a top focus of his Facebook campaign. Almost all of it came since George Floyd was killed by police in Minneapolis in May. More than one-third of those ad buys were aimed at key battleground states and many sought to persuade specific undecided voters, and married women in particular. The Biden campaign? It didn’t spend a cent on criminal justice ads on Facebook until late August, choosing instead to focus on the COVID-19 pandemic and economic recovery. Yet Biden had, during the Democratic primaries, articulated a more progressive criminal justice platform than any of his party’s recent nominees.


Calling on all academic institutions and organizations to stop using Facebook for any official activities #FacebookAcademicBoycott


This is long, long overdue. I have been wanting to write this post for a while now and just have not gotten to it.

It has become abundantly clear that Facebook has serious issues in regard to privacy (e.g., see this), fake news, support for extremism, and lack of any moral compass. It is for these and other reasons that many individuals have quit using Facebook. In recent months, one particular, incredibly bad aspect of Facebook that has come more into the light is its unwillingness to limit things like hate speech, fake and dangerous conspiracy theories, and the encouragement of violence on its platform. The egregious behavior of Facebook has led a growing number of institutions to at least temporarily limit their connections to and financial support of Facebook (e.g., through ads). For example, see Patagonia joins growing list of companies boycotting Facebook ads, which discusses how many companies are stopping advertising on Facebook due to “Facebook’s long history of allowing racist, violent and verifiably false content to run rampant on its platform.”

Given its direct and indirect support for hate speech and hate groups, and given that many people justifiably refuse to make use of Facebook pages and groups, it is clear that any organization that carries out functions through Facebook is both supporting Facebook’s policies and excluding people who justifiably do not want to use Facebook. Therefore, I believe it is necessary for all academic institutions and organizations to cease and desist using Facebook for any official functions. Thus any university or academic entity (e.g., journals, societies, etc.) that uses Facebook in any way for official communications and activities should stop doing so and find alternative ways to engage their communities and communicate.

I have therefore deleted the Facebook Groups I maintained for my lab, my blog, microBEnet, and the UC Davis Microbiome Special Research Program. And I implore all academic institutions, organizations, and individuals to stop using Facebook for any official activities.

(As an aside, this is not a call to stop using Facebook for personal activities, though it seems like that would be the right thing to do too. I am planning to reduce or eliminate my personal use of Facebook as well.)

Readability of privacy policies for big tech companies

For The New York Times, Kevin Litman-Navarro plotted the length and readability of privacy policies for large companies:

To see exactly how inscrutable they have become, I analyzed the length and readability of privacy policies from nearly 150 popular websites and apps. Facebook’s privacy policy, for example, takes around 18 minutes to read in its entirety – slightly above average for the policies I tested.

The comparison covers many sites, with a focus on Facebook here, but the main takeaway, I think, is that almost all privacy policies are complex, because they’re not written for the users.
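
As a rough guess at how a time-to-read estimate like the one quoted could be produced (my assumption, not the article’s exact method), you can divide word count by an average reading speed:

```python
import math
import re

def estimated_reading_minutes(text, words_per_minute=250):
    """Estimate minutes to read `text` at an assumed average reading speed.

    250 words per minute is a common rule-of-thumb figure, not necessarily
    the value used in the NYT analysis.
    """
    word_count = len(re.findall(r"\S+", text))
    return math.ceil(word_count / words_per_minute)

# Toy usage: a real policy would be scraped or pasted into `policy_text`.
policy_text = "We collect information about you. " * 900  # ~4,500 words
print(estimated_reading_minutes(policy_text))  # 18 minutes at 250 wpm
```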
