"Dear friends, it's not us, it's you. The problem is you. You have stopped sharing all the intimate details of your life and we can no longer monetize you as we wish. This is why we have decided to change our relationship. We want to convince you to share more, in order to show you more advertising. »
This is obviously not an exact quote from Mark Zuckerberg, but a rough translation of what his latest announcement really means. In a statement published on 12 January, the company's founder announced further changes to the platform's news feed, which will now prioritise posts from friends and family over content published by news organisations and companies.
News and other similar content will appear less often in news feeds unless it is shared by users and widely commented on. The company also said it would change its "ranking" (the system that classifies the information appearing in the feed) to prioritise "quality information", but it did not specify what that means. Advertising will not be affected by these changes - you will still be bombarded with it, whether you like it or not, and whether it is relevant to you or not.
Zuckerberg said the changes were designed to improve the platform. "We want to bring people together - whether it's family and friends, or gathering them around important moments in the world - we want to help ensure that time spent on Facebook is well spent," he said in a post on the Facebook site.
But let's be clear, this latest algorithm change to encourage more personal interactions has nothing to do with you - it has to do with Facebook's revenue.
Facebook has been worried for months about "context collapse". Put simply, users have become much more reluctant to publish personal details online, and as news feeds fill up with content produced by media outlets and other influencers, audiences have become passive consumers rather than active sharers.
A study conducted in the United States shows that adults spend about 50 minutes a day on the social networking giant - although this figure is declining. Facebook's users are getting older, too - and have been for some time - while younger users are turning to competing networks like Snapchat to share their lives.
Monetize the audience
Facebook's business model is based on selling our data to advertisers - data that amounts to a highly sophisticated representation of our digital identities and emotions. Increasingly, however, Facebook users are posting links to third-party websites (news, entertainment) on their walls, and sharing less about their personal lives.
The company does, however, strive to encourage the sharing of personal information. The "On This Day" feature, for example, was created to prompt users to reshare moments from their private lives. Clumsily so, because "that day" may have been the day you lost a loved one or were fired.
Hence the reminders of birthdays and other events, and the little prompts encouraging us to share our lives on the network, that have blossomed on our Facebook walls. By accessing the content on our phones, Facebook has also tried to convince us to post more: photos taken with your phone are automatically surfaced as suggested posts, for example. Similarly, Facebook Live has been heavily promoted, again to encourage the sharing of personal information.
On the media side, it's panic. Many worry that their traffic will plummet in the coming weeks, when Facebook turns off the tap and starts removing their content from news feeds.
Capturing attention
There are, admittedly, a few specialised news sites that use a subscription system (a paywall) and have therefore remained fairly independent of social media's influence. Their content can be shared on social networks and some articles are freely available, but they are less dependent than others on the advertising revenue generated by traffic from social media.
Unfortunately, most other news media have invested heavily in their presence on Facebook - including in the people and technology needed to support that social strategy. Some editors have chased audiences via social media, reassured that the millions of hits obtained there would eventually be transformed into a viable business model. They were not.
On the contrary, this system has strengthened Facebook's position as a gatekeeper of information, while significantly improving the technology giant's bottom line. Facebook, together with Google, enjoys a quasi-monopolistic position in the digital sphere; the two companies pocketed about 84% of total online advertising spending in 2017.
Facebook acts with impunity, jealously guarding the secrets of its ranking algorithm. The platform has become the largest information-sharing site in the world, and controls, at the individual level, what two billion people see daily in their news feeds.
Democratic deficit
These changes raise a major democratic concern: Facebook has the power to hide information it doesn't like. There is no indication that it does so, but we should all be alarmed that such a powerful company has achieved this level of control over information.
Facebook's latest turnarounds effectively mean that if publishers want their content to be seen by the public, they will have to pay Facebook through advertising or negotiate new deals that will further undermine their editorial independence and reinforce Facebook's dominance in the market.
This change does nothing to curb the spread of so-called "fake news" and may even aggravate the problem. Although there is no magic bullet against fake news, Facebook's efforts to fight it have so far been inconclusive. And there is a real risk that the latest changes to the ranking will make the problem worse.
Indeed, highly viral content - the kind that generates a lot of debate and comment, precisely the kind Facebook wants to encourage - can be completely inaccurate and unverified.
This is dangerous territory for the media: the "filter bubble" phenomenon, in which users see, indefinitely, only content that matches their existing preferences, will be amplified.
Worse still, the controversies and important debates - the very flavour and appeal of media diversity - may well disappear for good in the new world that social media is creating.
Tom Felle, Senior Lecturer in News and Digital Journalism, City, University of London
The original text of this article was published on The Conversation, editorial partner of UP'.