Posts Tagged ‘Facebook’

No place to hide for the social media publishers

March 27, 2017

It is time that Facebook and Google and WhatsApp and Snapchat and Twitter accepted that they are just publishers and cannot hide behind the label of being “tech companies”. They cannot function as a hiding place for publications by criminals and terrorists and make ad revenue from such publications and then claim they are merely couriers like a postal service. They cannot censor some content and then claim they are not responsible for the rest.

It is time to treat them as the publishers they are.

Facebook and Twitter and Google (YouTube) and WhatsApp and LinkedIn cannot abdicate their responsibility as publishers because they choose not to exercise the quality control they could. They cannot remove (censor) some material and then claim they are not responsible for the rest.

Facebook and Twitter are “publishers”, not merely “couriers”

Social media like to claim that they merely provide a “platform” or are just “communication enablers” or only provide “communication media” and therefore that they are not responsible – and should not be held responsible – for the content they disseminate.

But they protest too much.

It is quite wrong to compare Facebook or Twitter or LinkedIn to a telecommunications enterprise or a postal service or a courier service or an e-mail service provider. In all of these a specific, identifiable “sender” directs a communique to a specific, identified “receiver”. The carrying of the communique to the specific receiver is the service provided by the communications enterprise and is not in any sense “publishing”. The service provided by the social media is more than just the provision of a soap box in Hyde Park (a platform) or the provision of a board or a wall in a town square onto which a newspaper could be affixed. Any website can be a platform for comments, but the website owner must take ultimate responsibility for the content published on the website.

Their advertising revenues depend upon the dissemination being as wide and as “indiscriminate” as possible. They are not so different to a radio or a TV broadcast where the broadcaster tries to reach as large an audience as possible. The broadcaster is clearly responsible and accountable for the content of the broadcast. A free newspaper being distributed at all Metro stations but where revenues are dependent upon advertising also has a responsible publisher. Any advertising revenue accrues to the publisher.

The clincher for me is that the placement of advertisements based on circulation is decisive proof of the existence of a publisher. Not all published material contains advertising. Not all advertising is proof of the existence of a publisher. A billboard or sandwich-board owner, for example, is not a publisher. But the mere existence of advertising based on circulation numbers or “reach” or any similar parameter is conclusive proof – I think – of the existence of a publisher. And it is the person or organisation responsible for the circulation who takes the advertising revenues and in consequence must be the responsible and accountable publisher.

Freedom of speech does not really enter the argument. The publisher may choose to publish whatever he pleases. He may refrain from “censoring” his users if he so wishes. Or he may – at some cost – ensure that the content he publishes meets criteria that he sets himself. But he remains responsible and accountable for what he publishes. Facebook and Twitter cannot abdicate their responsibility because they choose not to exercise the quality control they could.


 

Facebook does stupid again as it bans Swedish Cancer Society’s educational images

October 20, 2016

UPDATE: Facebook has apologised for removing a video on breast cancer awareness posted by a Swedish group, saying it was incorrectly taken out.


Facebook likes to call itself a technology company rather than what it is – a publisher. It exercises editorial authority and both removes material it does not like and it promotes material that it does. As a publisher they don’t – by any stretch of the imagination – do a very good job. After the fiasco of the banning of the iconic Vietnam “napalm girl” image, they have now proceeded to further demonstrate their stupidity by banning educational images about breast cancer from the Swedish Cancer Society.


Swedish Cancer Society: Learn to know your breasts

PhysOrg:

Facebook has removed a video on breast cancer awareness posted in Sweden because it deemed the images “offensive”, the Swedish Cancer Society said Thursday. The video, displaying animated figures of women with circle-shaped breasts, aimed to explain to women how to check for suspicious lumps.

Sweden’s Cancerfonden said it has tried to contact Facebook without any response and has decided to appeal the decision to remove the video. Facebook was not immediately available for comment. “We find it incomprehensible and strange how one can perceive medical information as offensive,” Cancerfonden communications director Lena Biornstad told AFP. “This is information that saves lives, which is important for us,” she said. “This prevents us from doing so.”

Facebook faced outrage in September for repeatedly deleting a historic Vietnam War photo included in a post by Norway’s Prime Minister Erna Solberg. It said the iconic photo of a naked Vietnamese girl fleeing a napalm bombing violated its rules but later backtracked on the decision.


 

Facebook forced to back down over Vietnam photo

September 9, 2016

My previous post was about the inane censorship applied by Facebook to Nick Ut’s iconic photograph of a naked girl fleeing after a napalm strike.

It has taken almost a day for Facebook to see some sense – though it has only come after a massive wave of negative publicity to get them to do so. But their pronouncements suggest they still don’t understand that they are, in fact, a publisher whenever they censor or even prioritise certain content over others. They are a publisher first, a purveyor of advertisements second and only a technology company as a distant third. Merely repeating their mantra of being a technology company does not change reality.

My previous post fed onto my Facebook page about 16 hours ago. However it does not seem that Facebook tampered with that feed in any way.


BBC: 

Facebook says it will allow an iconic photograph of a girl fleeing a Napalm attack taken during the Vietnam war in 1972 to be used on its platform. It had previously removed the image, posted by a Norwegian writer, on the grounds that it contained nudity.

The move sparked a debate about Facebook’s role as an editor. The editor of Norway’s largest newspaper had written an open letter to Facebook’s chief Mark Zuckerberg calling the move “an abuse of power”. The tech giant said it had “listened to the community” following a considerable amount of criticism about its decision to block the photo. …..


 

Facebook editors display their ignorance and “promote stupidity”

September 9, 2016

Probably the Facebook editors involved are just ignorant. Blaming the algorithm for their own shortcomings is rather pathetic.

This story in the Guardian about Facebook censoring this iconic Vietnam photograph:


photo Nick Ut /AP

The Guardian:

Norway’s largest newspaper has published a front-page open letter to Facebook CEO Mark Zuckerberg, lambasting the company’s decision to censor a historic photograph of the Vietnam war and calling on Zuckerberg to recognize and live up to his role as “the world’s most powerful editor”.

Espen Egil Hansen, the editor-in-chief and CEO of Aftenposten, accused Zuckerberg of thoughtlessly “abusing your power” over the social media site that has become a lynchpin of the distribution of news and information around the world, writing, “I am upset, disappointed – well, in fact even afraid – of what you are about to do to a mainstay of our democratic society.”

…… The controversy stems from Facebook’s decision to delete a post by Norwegian writer Tom Egeland that featured The Terror of War, a Pulitzer prize-winning photograph by Nick Ut that showed children – including the naked 9-year-old Kim Phúc – running away from a napalm attack during the Vietnam war. Egeland’s post discussed “seven photographs that changed the history of warfare” – a group to which the “napalm girl” image certainly belongs.

Egeland was subsequently suspended from Facebook. When Aftenposten reported on the suspension – using the same photograph in its article, which was then shared on the publication’s Facebook page – the newspaper received a message from Facebook asking it to “either remove or pixelize” the photograph. ……. 

Before Aftenposten could respond, Hansen writes, Facebook deleted the article and image from the newspaper’s Facebook page.

In his open letter, Hansen points out that Facebook’s decision to delete the photograph reveals a troubling inability to “distinguish between child pornography and famous war photographs”, as well as an unwillingness to “allow[ing] space for good judgement”.

“Even though I am editor-in-chief of Norway’s largest newspaper, I have to realize that you are restricting my room for exercising my editorial responsibility,” he wrote. “I think you are abusing your power, and I find it hard to believe that you have thought it through thoroughly.”

Hansen goes on to argue that rather than fulfill its mission statement to “make the world more open and connected”, such editorial decisions “will simply promote stupidity and fail to bring human beings closer to each other”.

Facebook is a publisher whether it wants to admit it or not. Just the act of censorship makes it a publisher. The world may well be dumbing down since the time of hunter-gatherers. And Facebook probably contributes to accelerating the glorification of stupidity.



 

I’m rooting for Adblocker against Facebook

August 17, 2016

Ultimately it is the consumer who pays for ads. I strongly dislike being forced to “consume” unsolicited advertisements. I resent TV channels and their commercial breaks and the inane, predatory advertising I am compelled to watch. Though I note that these days I only partly watch TV programs – up to the first commercial break – after which I surf away. Sometimes – but not always – I return to complete watching some program. I have no alternative to suggest, but the business models based on advertising are fundamentally flawed. They all rely on “forcing” a large number of uninterested viewers or readers to “consume” ads they don’t want to be exposed to by dangling “free content” as the bait.

It is a myth to think that a person forced to “consume” unsolicited ads is not also paying a price. My contention is that the “free” content is never actually free. It is paid for by the “psychological stress and suffering” the ad causes to the non-consumer. Effectively I pay for the “free” content on a site by having to suffer the slings and arrows of their rubbishy ads for things and services I will never buy. I pay in time and stress. I use “Adblocker”. Some sites get upset and don’t wish to grant me access. That’s OK. It’s a mutually acceptable parting of ways. There are a very few sites whose content is so good that I am willing to turn off my adblocker to put up with their intrusions into my personal space and consciousness. The really good sites are those where I am willing to pay a subscription – and there are only a very few of those.

So, in the battle between Facebook and Adblocker, I am firmly in the Adblocker camp. And I am perfectly aware that Adblocker’s business model, which is to “blackmail” advertisers into paying to be whitelisted, is morally equivalent and just as low as the advertisers bombarding non-consumers with unsolicited advertising.

The MIT Technology Review writes:

Facebook Can’t Win Against Ad Blockers, and Here’s the Proof

Facebook can’t win the war it started on ad blockers last week.

So say Princeton assistant professor Arvind Narayanan and undergraduate Grant Storey, who have created an experimental ad “highlighter” for the Chrome browser to prove it. When you have Facebook Ad Highlighter installed, ads in the News Feed are grayed out and written over with the words “THIS IS AN AD.”

Facebook announced that it was taking measures to prevent ad blockers from working on Tuesday last week. On Thursday the largest ad blocker out there, Adblock Plus, informed users of a simple tweak to their settings that would defeat Facebook’s blocker blockade.

Princeton researchers say Facebook can’t prevent their experimental add-on for the Chrome browser from graying out ads in your News Feed.

We’re still waiting for Facebook to fire back, as the executive leading its ad technology has promised it will. But Narayanan argues in a blog post introducing his ad highlighter that Facebook simply can’t win.

The ad blockers in use today work by looking at the HTML code that tells your Web browser how to render a page and where to get the images and other files embedded into it. Facebook’s initial move against ad blockers removed clues in its HTML that gave away which parts of a page were ad content.

The Princeton duo’s ad highlighter works differently. It looks at the parts of the Web page that are visible to humans. Facebook Ad Highlighter simply looks for and blocks any posts with a giveaway “Sponsored” tag. It appears to be quite effective. Facebook must clearly label ads to stay within Federal Trade Commission rules on transparency and its own commitments to its users.
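The asymmetry between the two approaches can be shown with a toy sketch (this is not the Princeton code; the class names and post fields below are invented for illustration). A markup-based blocker matches HTML class names, which the site can rename or randomise at will; a perceptual tool inspects only the text a human actually sees, which the transparency rules oblige to carry the word “Sponsored”.

```python
# Toy contrast between markup-based blocking and perceptual detection.
# Assumption: each "post" carries the (possibly obfuscated) HTML class
# names the server emits, plus the text visible on screen.

posts = [
    {"classes": ["_x9q", "_k2b"], "visible_text": "Sponsored · Acme Shoes, 50% off!"},
    {"classes": ["_x9q", "_m1c"], "visible_text": "Holiday photos from Anna"},
]

AD_CLASSES = frozenset({"ad", "sponsored-unit"})  # a blocker's filter list

def block_by_markup(post):
    """Classic ad blocker: match known class names in the HTML.
    Defeated as soon as the site renames or randomises its classes."""
    return any(c in AD_CLASSES for c in post["classes"])

def highlight_by_visible_text(post):
    """Perceptual approach: look only at what the user sees.
    Works as long as the ad must visibly say 'Sponsored'."""
    return "Sponsored" in post["visible_text"]

for p in posts:
    print(block_by_markup(p), highlight_by_visible_text(p))
```

Here the obfuscated classes defeat the markup matcher on both posts, but the visible label still gives the ad away – which is the escape-proof corner Narayanan argues Facebook is in.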

Narayanan concludes in his post that Facebook’s anti-ad-blocking campaign is doomed, at least if it continues in the current vein of acting as if the social network can somehow neutralize ad blockers completely.

Narayanan’s blogpost is here:

Can Facebook really make ads unblockable?

Facebook announced two days ago that it would make its ads indistinguishable from regular posts, and hence impossible to block. But within hours, the developers of Adblock Plus released an update which enabled the tool to continue blocking Facebook ads. The ball is now back in Facebook’s court. So far, all it’s done is issue a rather petulant statement. The burning question is this: can Facebook really make ads indistinguishable from content? Who ultimately has the upper hand in the ad blocking wars?

There are two reasons — one technical, one legal — why we don’t think Facebook will succeed in making its ads unblockable, if a user really wants to block them.

The technical reason is that the web is an open platform. When you visit facebook.com, Facebook’s server sends your browser the page content along with instructions on how to render them on the screen, but it is entirely up to your browser to follow those instructions. The browser ultimately acts on behalf of the user, and gives you — through extensions — an extraordinary degree of control over its behavior, and in particular, over what gets displayed on the screen. This is what enables the ecosystem of ad-blocking and tracker-blocking extensions to exist, along with extensions for customizing web pages in various other interesting ways.

I wish there were a business model which would save me from these pernicious ads.


 

How come Facebook’s tracking never gets my preferences correctly?

June 1, 2016

I don’t much care that Facebook is tracking me – and now “on and off Facebook through cookies”. But their analysis of whatever tracking they do is suspect. At least in my case, the tracking analysis does not seem to be very effective (or even intelligent).


Facebook’s choice of “top stories” – which seems to be their enforced default condition – never matches what I would consider top stories on my news feed. I keep switching back to “most recent” and what I get is something close to – but not exactly – the most recent posts (or comments). Some posts are suppressed and some are elevated. In this age where they are supposedly tracking my every move, why can they not manage something as simple as just following a time stamp? It is pretty clear that their over-complicated, over-sophisticated algorithms cannot leave well alone. Why must they always try to “add value” (and fail) by revising time?
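A strict “most recent” feed should need nothing more than a sort on each post’s timestamp. A minimal sketch of what I mean – the post ids and Unix timestamps below are invented purely for illustration:

```python
# A strict "most recent" feed: newest first, nothing suppressed or elevated.
# The posts and timestamps are made up for illustration.

posts = [
    {"id": "a", "ts": 1464700000},
    {"id": "b", "ts": 1464790000},
    {"id": "c", "ts": 1464750000},
]

def most_recent(feed):
    # Sort on the timestamp alone; no algorithmic "value adding".
    return sorted(feed, key=lambda p: p["ts"], reverse=True)

print([p["id"] for p in most_recent(posts)])  # ['b', 'c', 'a']
```

Anything beyond this – dropping some posts, promoting others – is an editorial choice, not a technical necessity.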

For the last 5 days Facebook has been showing this irritating message

To help personalize content, tailor and measure ads, and provide a safer experience, we use cookies. By clicking or navigating the site, you agree to allow our collection of information on and off Facebook through cookies. Learn more, including about available controls: Cookies Policy.

As the WSJ points out, Facebook is trying to show an increased “value” to its advertisers (presumably to fool them into paying higher rates). Personally I think the advertisers would be throwing their money away. The pages that Facebook suggests for me are very, very rarely of any relevance – or even of interest – for me. I cannot remember ever having clicked on an advertisement on Facebook. I don’t suppose I am in the main target group for Facebook advertisers, but surely the much-touted sophistication of their algorithms can do better. I am not especially impressed by the quality of the selections made for me.

I find Google ads are much more closely aligned to my interests. In any search for news stories, I always ignore the first few paid-for references. They are invariably low quality stories. But I have been known to click – not very often but a few times – on their ads. Ads on WordPress sites are generally very relevant to the main story (interspersed with regular ads for porn sites but these are easy to ignore).

I suspect that Facebook are claiming far more for their algorithms and their capability of selection of target audiences than they can actually achieve. (That they do suppress news they don’t like is now pretty well proven).

WSJ:

Facebook has set out to power all advertising across the Internet.

To that end, the social network and online advertising company said Thursday it will now help marketers show ads to all users who visit websites and applications in its Audience Network ad network. Previously Facebook only showed ads to members of its social network when they visited those third-party properties.

The change is a subtle one, but it could mean Facebook will soon help to sell and place a much larger portion of the video and display ads that appear across the Internet. The change will also intensify competition with Alphabet Inc. subsidiary Google, which dominates the global digital-advertising market, and a wide range of other online ad specialists.

“Publishers and app developers have some users who aren’t Facebook users. We think we can do a better job powering those ads,” said Andrew Bosworth, vice president of Facebook’s ads and business platform.

But my advice to Facebook advertisers would be to double check any claims Facebook makes about how well they are able to select their target audiences. From the little I have seen, they are not particularly good.

All I really want is that my news feed follow the fundamental time-stamp and that “most recent” gives me the most recent posts – without suppression of some and elevation of others. Google seems to know my mind better than Facebook does.


 

Facebook is just another disinformation source

May 13, 2016

That Facebook is biased and reflects the views of its owners/managers is neither a surprise nor anything wrong. What I find reprehensible is the lie promoted by Facebook that it is objective and unbiased. After the Gizmodo story this week, Facebook denied that it was spinning the news. But the latest revelations show that the allegations were fundamentally true. The simple truth is that Facebook promotes certain news stories and suppresses others. They don’t manufacture news. But what they do is to spread a skewed version of what is news. And that is disinformation. Again, nothing wrong with that. It is what every newspaper or TV channel does. But the prejudices and biases of, say, the Washington Post are not hidden under a false cloak of objectivity.


The shattering of the cloak of objectivity around Facebook and its subjective choice of news stories to promote or to suppress can no longer be ignored by Zuckerberg and he has initiated an “investigation”. A biased platform with a hidden, skewed agenda is fundamentally incompatible with selling advertising where the advertisers need to know, objectively, how well their messages are targeted.

BBC:

Facebook chief executive Mark Zuckerberg has said the company is investigating claims it censored news reports with conservative viewpoints. It follows a week of allegations in the media and discussion in the US Senate.

The tech news website Gizmodo had said Facebook staff suppressed articles on conservative topics from the site’s “trending” news section and “injected” others, even if they were not trending. ….

….. Mr Zuckerberg said he was inviting leading conservatives to meet him to discuss their views.

…… Gizmodo’s original report alleged that staff tampered with trending topic stories and were told to include stories published by the BBC, CNN and other mainstream news organisations ahead of smaller news sites.

It said the trending topics section was run like a newsroom, with curators able to “blacklist” or “inject” topics.

The report was followed by a release of documents to The Guardian, which appeared to show editorial decision-making by Facebook staff, alongside the company’s algorithm, to determine what is trending.

The Guardian:

Leaked documents show how Facebook, now the biggest news distributor on the planet, relies on old-fashioned news values on top of its algorithms to determine what the hottest stories will be for the 1 billion people who visit the social network every day.

The documents, given to the Guardian, come amid growing concerns over how Facebook decides what is news for its users. This week the company was accused of an editorial bias against conservative news organizations, prompting calls for a congressional inquiry from the US Senate commerce committee chair, John Thune. ….

….. But the documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its “trending module” headlines – the list of news topics that shows up on the side of the browser window on Facebook’s desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.

The guidelines show human intervention – and therefore editorial decisions – at almost every stage of Facebook’s trending news operation, a team that at one time was as few as 12 people …… 

Facebook is in the business of skewed information dissemination and that skewing is effectively disinformation. Every entity involved in providing information must, by its selection of information to be distributed, also be involved in disinformation. But the additional problem for Facebook is that this disinformation and skewing of stories is not in the interests of the advertisers. Facebook is not just misleading its users, it is misleading its advertisers.


 

Who’s surprised? Facebook’s algorithms are dishonest and self-serving

May 9, 2016

So who’s surprised that Facebook’s “trending” algorithms are far from objective? In fact they are blatantly dishonest and exhibit the biases of its owners and managers. They also suppress any unfavourable statistics about Facebook itself. They suppress stories on political viewpoints that Zuckerberg does not share and inject stories about political viewpoints that he supports.

Self-serving and dishonest, without a doubt. But no different to any lobby group or news outlet which has a particular point of view. The problem is that Facebook claims that its trending module is objective when it clearly is not. And that is fraud.

Gizmodo:

Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential “trending” news section, according to a former journalist who worked on the project. This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.

Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all. The former curators, all of whom worked as contractors, also said they were directed not to include news about Facebook itself in the trending module.

In other words, Facebook’s news section operates like a traditional newsroom, reflecting the biases of its workers and the institutional imperatives of the corporation. Imposing human editorial values onto the lists of topics an algorithm spits out is by no means a bad thing—but it is in stark contrast to the company’s claims that the trending module simply lists “topics that have recently become popular on Facebook.”


 

Shy people get depressed (courtesy Facebook)

September 9, 2014

Two articles today about research on Facebook usage.

Shy People Use Facebook More [Research]

Shy and introverted people spend more time on Facebook but disclose little information to friends and acquaintances, said Pavica Sheldon, assistant professor at The University of Alabama in Huntsville’s communications arts department.

Facebook addicts at a risk of developing depression

Facebook users spending a lot of time on the social networking site, might be feeling down, lonely and even depressed, claims a new report.

A recent study has revealed a link between Facebook use and the dampened mood of active users who feel they have “wasted time on doing” what they call “meaningless activity.”

Which in turn suggests that shy people use Facebook longer, are more likely to be addicted and therefore more likely to be depressed.

But I would have thought that shy people are more likely to be lonely and more likely to be depressed – anyway.

 

Why so much fuss that Facebook “manipulated” emotions?

July 8, 2014

There has been a lot of fuss lately about an internal Facebook study which managed to be published in a scientific journal as I noted in passing about 3 weeks ago.

Emotional contagion by Facebook could be a new disease. A case of the medium creating the new disease! Heightened emotions can apparently be transmitted by Facebook. The researchers find that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness”. And emotional contagion is what turns a crowd into a mob. And as this work from MIT shows, “Good people can do bad things. Belonging to a group makes people more likely to harm others outside the group.”

The research consisted of manipulating Facebook feeds and seeing what happened. The paper, the journal, Facebook and Cornell University have been heavily criticised for their “lack of ethics” and many are back-tracking in CYA exercises. Retraction Watch writes:

The Proceedings of the National Academy of Sciences (PNAS) is subjecting a much-criticized study involving Facebook that it published just two weeks ago to an Expression of Concern. …. Critics — and there were many online — said the study violated ethical norms because it did not alert participants that they were taking part.

…… Here’s the Expression of Concern, signed by editor-in-chief Inder Verma:

……. When the authors prepared their paper for publication in PNAS, they stated that: “Because this experiment was conducted by Facebook, Inc. for internal purposes, the Cornell University IRB [Institutional Review Board] determined that the project did not fall under Cornell’s Human Research Protection Program.” This statement has since been confirmed by Cornell University. ……

But I find all the fuss a little hypocritical. Manipulation of the behaviour of others is the norm and the bed-rock for all human social intercourse.

Politicians manipulate – or try to – their voters. Demagogues manipulate individuals to create a mob. Artists and authors try to arouse emotions. Scientists try to influence their grant panels. We manipulate our friends and our family members. A leader manipulates his followers. Followers try to influence their leaders. All human cooperation is built on manipulation of behaviour. We try to manipulate our enemies. When we call it “manipulation” we disapprove but when we call it “motivation” it is to be admired. Obama tries to motivate Netanyahu but Bibi usually manages to manipulate Barack. Manipulation of behaviour by persuasion is fine but manipulation by coercion is frowned upon. Any advertisement – by definition – plays with the emotions of its target audience and tries to manipulate their behaviour.

So what is wrong then when a Facebook or a Google or a Twitter – whose business model depends on placing advertisements accurately and effectively – tries to employ “emotional contagion” to maximise their revenues? I closed my Facebook and Twitter accounts some time ago partly because I did not like their intrusive nature. But that was because I felt that my personal space was being encroached on – and beyond the level I felt comfortable with. But I certainly did not feel they were doing anything unethical. In this case I find the criticism confused and a little inane. Was it unethical for Facebook to have conducted an “internal” study? I don’t think so. Was it unethical for PNAS to have published the paper? Not really.

If it is unethical for internet sites or social media to target advertisements then it is unethical for any advertisement to be targeted towards anyone.

The onus I think lies with the individual.

 

 

