Date: 2016-11-25 9:00:42

Harsh truths about fake news for Facebook, Google and Twitter





To judge by the headlines being passed around on social media during her campaign, Hillary Clinton was often in serious trouble.



“Experts” believed the Democratic presidential candidate had suffered brain damage. Or maybe she was trying to hide alcohol and drug addiction. She was also facing imminent indictment, now that the Federal Bureau of Investigation had finally found evidence of criminality in her use of a private email server — though the New York Police Department, after uncovering shocking evidence of her links to money laundering and sex crimes involving children, might pounce first.



In the heat of the most bitter US presidential election in memory, the internet was having a field day. A barrage of fake news — much of it aimed at undermining Mrs Clinton or boosting her opponent — was just part of the diet. It included conspiracy theories, misdirection, prejudice, harassment and hate speech, specially created to circulate on the digital networks that are now central to mass communication and media consumption.



According to critics, the digital platforms contributed to a virulent tribalism, as long-simmering partisan divisions boiled over. And they served to further undermine faith in established media outlets, as many members of a polarised electorate found ready support for the prejudices and unfounded suspicions they already harboured.



The aftermath has brought criticism of some of the biggest internet companies — in particular Facebook and Twitter — and an admission that change is needed. “We have a problem in the tech industry we have to deal with,” says John Borthwick, a New York tech investor. “These platforms are central to our democracy. Something has started to go wildly wrong.”



Finger pointing



The backlash in the wake of the election of Donald Trump, the Republican candidate, has centred on fake news: false reports that are dressed up to look like genuine news articles, sometimes from news sources invented solely for the purpose. Passed around widely on Facebook, retweeted on Twitter or promoted by Google’s search algorithms, some of these stories managed to infiltrate the popular political discourse.



They included the “report” from the Denver Guardian, a non-existent publication, that an FBI agent suspected of leaking emails from Mrs Clinton’s private server had been found dead in a murder or suicide. With the velocity that characterises the flow of news on Facebook, that report was shared up to 100 times a minute on the network.



Not all the misinformation favoured the Republican candidate, though most of it did. Of the 20 fakes that generated the most audience engagement on Facebook in the final three months of the campaign, all but three were either pro-Trump or anti-Clinton, according to an analysis by news website BuzzFeed. The made-up stories also touched a nerve: Facebook’s users engaged more deeply with the fakes than the top 20 stories produced by a selection of established media companies.



The viral success of fake news on Facebook and the role the sharing of these articles might have had in tilting the election for Mr Trump have caused considerable angst inside the company.



“There is anxiety around the election and there are questions around the role that Facebook and other organisations may have played,” says one person with knowledge of the internal discussions at the social networking site.



President Barack Obama said last week that, when it is no longer possible to tell “what’s true and what’s not, and particularly in an age of social media . . . then we have problems”.



In Silicon Valley, where a dominant liberal culture is in shock over the election result, there has been finger pointing over the part the world’s most powerful tech companies may have played.



“People need to step up and call out the platforms for being effectively propaganda machines for either side,” says Justin Kan, a successful entrepreneur and now an investor at Y Combinator, which funds internet start-ups. “The leadership in Silicon Valley needs to call out Facebook to do the right thing.”



Mark Zuckerberg, Facebook’s chief executive, has rejected much of the criticism, while conceding that there is “more to be done” to stop the spread of fake news. In the days after the election, he claimed it was a “pretty crazy idea” that false stories had somehow had a bearing on the result. But growing pressure led him at the end of last week to outline a number of measures the site was taking to try to tackle the problem.



The steps taken by the big internet companies in the wake of the election highlight the pressure they have been under to act. Google and Facebook moved last week to prevent their advertising from appearing on sites that carry fake news, a belated attempt to make lying less profitable. Twitter suspended a batch of alt-right accounts — linked to US rightwing extremist groups — for hate speech.



Evidence that has surfaced since the election shows that the digital platforms will have to do more to weed out misinformation and harassment from their systems, taking in not just fake news but a wider range of abusive behaviour.



Bots — automated systems designed to imitate people — did much to circulate fake news headlines on Twitter, according to Philip Howard, a professor at Oxford university’s internet institute. About a fifth of all tweets about the election were generated by accounts that produce high volumes of posts, he says, a clear indication that they were bots rather than real users.

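The volume heuristic Prof Howard describes — accounts that post at rates implausible for a human are likely bots — can be sketched as follows. The threshold, field names and sample accounts are illustrative assumptions, not his actual methodology.

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    tweets_per_day: float  # average posting rate over the sample window

# Assumed cutoff for illustration only; real studies tune this empirically.
BOT_THRESHOLD = 50  # tweets per day

def likely_bot(account: Account) -> bool:
    """Flag an account as a probable bot purely by posting volume."""
    return account.tweets_per_day >= BOT_THRESHOLD

accounts = [
    Account("news_junkie", 12),      # heavy but plausibly human
    Account("amplifier_9000", 480),  # one post every three minutes, nonstop
]
flagged = [a.name for a in accounts if likely_bot(a)]
print(flagged)  # ['amplifier_9000']
```

A single-feature classifier like this is crude, which is partly the point: even a trivial rate filter catches the high-volume amplification accounts the study attributes a fifth of election tweets to.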


‘Digital denial’



Google’s algorithms have also been shown to be vulnerable. For instance, it is a week since a false report circulated that George Soros — often a target of rightwing attacks — had died.



But a search for “George Soros” on November 20 still returned this headline at the top of the “In the News” section of Google’s first page of results: “Breaking Intel: George Soros DEAD”. The story was from a site called the Event Chronicle.



Facebook’s Mr Zuckerberg says only 1 per cent of news circulating on the site was fake. But with the company now used as a news source by nearly half of Americans, that represents a vast amount of traffic. Also, Mr Howard says the location of bot groups operating on Facebook suggests much of the fake information was targeted at Facebook users in swing states like Ohio and Florida, potentially increasing its impact.



The failure to block the tide of misinformation has reawakened complaints from the traditional media world that the digital companies have deliberately turned a blind eye, much as they were accused of doing in the case of copyrighted content.



Google and Facebook have resisted describing themselves as media companies or publishers that are responsible for the content they distribute.



“These companies are in digital denial,” says Robert Thomson, chief executive of News Corp. “Of course they are publishers and being a publisher comes with the responsibility to protect and project the provenance of news. The great papers have grappled with that sacred burden over decades and centuries, and you can’t absolve yourself from that burden or the costs of compliance by saying, ‘We are a technology company’.”



Along with other recent failures, such as errors in measurement that led Facebook to inflate the number of times its video ads were seen, the furore over fake news has intensified calls for internet companies to think of themselves as media concerns.



“The measurement, fake news and extremist content issues highlight that new media or social media companies are not technology companies; they’re media companies,” says Sir Martin Sorrell, chief executive of WPP, the world’s largest advertising group. “They are responsible for the content in their digital pipes.”



Yet the business imperatives the internet platforms follow may have given them too little incentive to exercise this type of responsibility. Weeding out false information “hasn’t been a priority”, says Mr Borthwick. “Content, more often than not, has been a means to an end — and the end is more sharing, more connectivity.”



One former Facebook staffer also says the way the company is run may have exacerbated the distribution of fake news. Its engineers focus on improving engagement — clicks, likes, comments, shares — as the primary measures of success of any new feature. New projects are typically released after six-month “sprints”, during which pressure to increase those metrics is intense.



“Engagement is a dangerous drug,” says a former Facebook manager. “Nobody is incentivised to think critically about unintended, often long-term consequences.”



That may also contribute to a “filter bubble” problem that leaves users inside an echo chamber of similar views.



Worse, the pursuit of engagement for its own sake may exacerbate the problem and add to the flow of angry, hateful and inaccurate information. “There’s a lot of evidence that what people share is not necessarily what they researched but what gives them an emotional response,” says Mr Kan.



The posts that bring the strongest reactions, adds Mr Borthwick, are “catnip to the news feed”. So the Facebook engineers have an incentive to feature these items most prominently, aiding the spread of information that deepens political divisions.
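The incentive the critics describe can be made concrete with a toy ranking function: posts ordered purely by a weighted sum of reactions, with no term for accuracy. The weights and field names are assumptions for illustration, not Facebook's actual news feed formula.

```python
def engagement_score(post: dict) -> float:
    """Naive engagement metric: weighted reactions, nothing about accuracy."""
    return (post["clicks"]
            + 2 * post["likes"]
            + 3 * post["comments"]
            + 4 * post["shares"])

# Two hypothetical posts: a sober article that is mostly read quietly,
# and an inaccurate outrage piece that provokes strong reactions.
posts = [
    {"id": "sober_analysis", "clicks": 900, "likes": 50, "comments": 10, "shares": 5},
    {"id": "outrage_fake",   "clicks": 400, "likes": 300, "comments": 200, "shares": 150},
]
ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])  # ['outrage_fake', 'sober_analysis']
```

Because comments and shares — the reactions emotive content provokes most — carry the heaviest weights, the outrage piece wins the ranking despite drawing fewer readers, which is exactly the feedback loop the paragraph above warns about.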



Room for improvement



It is unclear how far the internet companies will move to address these issues. Early attention has turned to the algorithms used to weed out fake news, an area in which many experts believe there is scope for improvement.



Mr Zuckerberg has made no mention of another issue raised by critics: whether the sites should hire human editors. Employing people to carry out detailed filtering of content is impractical given the scale of the networks, say critics such as Mr Borthwick.



But he, and others, argue that internet companies should still hire “public editors” who can help to establish guidelines and shape their thinking in product design and other issues that will affect the way their services are used.



Such calls may continue to fall on deaf ears. “The culture they’ve built and the people they’ve hired” mean that internet companies like Facebook simply do not recognise any need for editorial sensibility, says Ben Edelman, an assistant professor at Harvard Business School.



The cultural chasm runs even deeper. On Twitter, a commitment to free speech has long contributed to the site’s hesitation to stamp out harassment, a flaw it belatedly tried to fix last week with new controls to combat bullying, racism and misogyny.



Mr Zuckerberg, casting Facebook more as a communication platform than a media site, takes a similar stance. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible,” he wrote last week.



But old authorities become muted in a world where users’ voices are pre-eminent. In an interview last week with The New Yorker, Mr Obama complained that, on a Facebook page, an explanation of global warming by a Nobel Prize winner looks no different than one by a paid climate change denier.



He added: “The capacity to disseminate misinformation, wild conspiracy theories, to paint the opposition in wildly negative light without any rebuttal — that has accelerated in ways that much more sharply polarise the electorate and make it very difficult to have a common conversation.”



In the wake of a bitterly divisive US election, Facebook users are retreating deeper into their “filter bubbles”. The bitterness of the loss, says Mr Howard, means that many on the losing side have been systematically “unfriending people who voted for the other candidate”.



The result is likely to be even deeper tribal divisions. That can only add to an environment in which many people are all too ready to believe the most biased or inaccurate information about the opposing camp — and to shout it out to anyone who will listen.