
ISBN: 978-9934-564-18-5

DIGITAL HYDRA:
SECURITY IMPLICATIONS OF
FALSE INFORMATION ONLINE

Project director: Giorgio Bertolin

Researchers:
Nitin Agarwal, Professor of Information Science, University of Arkansas at Little Rock
Kiran Kumar Bandeli, Doctoral Candidate, Information Science, University of Arkansas at Little Rock
Giorgio Bertolin, Social Scientist, NATO Strategic Communications Centre of Excellence
Nora Biteniece, Software Engineer, NATO Strategic Communications Centre of Excellence
Katerina Sedova, Project Assistant, NATO Strategic Communications Centre of Excellence

Text Editor: Anna Reynolds

The NATO StratCom Centre of Excellence, based in Latvia, is a multinational, cross-sector organization which provides comprehensive analyses, advice and practical support to the alliance and allied nations. This report is a product of the NATO Strategic Communications Centre of Excellence (NATO StratCom COE). It is produced for NATO, NATO member countries, NATO partners, related private and public institutions and related individuals. It does not represent the opinions or policies of NATO.

© All rights reserved by the NATO StratCom COE. Reports may not be copied, reproduced,
distributed or publicly displayed without reference to the NATO StratCom COE.

The views expressed here are solely those of the authors in their private capacity and do not
in any way represent the views of NATO.

Riga, November 2017

NATO Strategic Communications Centre of Excellence


Riga, Kalnciema iela 11b, Latvia LV1048
www.stratcomcoe.org
Ph.: +371 67335463
info@stratcomcoe.org
TABLE OF CONTENTS

Introduction................................................................................................................. 5

Platforms: Geography, Culture, Language...................................................................... 11


Introduction..................................................................................................... 12
Geography....................................................................................................... 12
Facebook......................................................................................................... 13
Twitter............................................................................................................. 15
Other platforms............................................................................................... 17
The Arab world and the MENA region.............................................................. 19
Social media on RuNet.................................................................................... 23
Concluding remarks....................................................................................... 29

Blogs, Fake News, and Information Activities .............................................................31


Introduction..................................................................................................... 32
Methodology.................................................................................................... 32
Typical characteristics of disinformation riddled blogs................................... 35
Tracking the origins of misleading blog content............................................. 38
Mixed-media vs. Cross-media approaches........................................................ 40
Tracking how an antagonistic narrative travels................................................ 42
Conclusions.................................................................................................... 45

Third-party Services and Fake News Media Sites ...........................................................47


Background..................................................................................................... 48
News sources and third-party services............................................................. 53
Conclusions and implications.......................................................................... 59

Conclusions and Recommendations........................................................................... 61

Glossary.................................................................................................................... 66

Endnotes................................................................................................................... 73

INTRODUCTION

The study investigates misinformation and disinformation on social media in the context of the rise of ‘fake news’ and the birth of the ‘post-truth’ era. Are these concerns substantiated by facts? What are the consequences of these phenomena for the information environment? Most importantly, do these phenomena pose a threat to our societal security? This study will provide actionable knowledge by answering these questions.

This introduction is an attempt to position the emergence of ‘fake news’ in a wider societal context. Particular emphasis is placed on the cognitive biases that enable information manipulation. In turn, this will lead to a discussion about the tactics employed by adversarial actors to carry out information activities.

DEFINITIONS

A glossary with definitions is provided in the appendix (page 66). However, some terms deserve to be defined from the very beginning. ‘Disinformation’ and ‘misinformation’ are not officially defined in the NATO terminology. For these key terms we adopt the following definitions:1

Disinformation: The dissemination of false information with the deliberate intent to deceive or mislead.

Misinformation: The dissemination of false information, either knowingly or unknowingly.

Throughout this study, we will focus on the malicious use of information, when information is used to mislead and deceive. In practice, misinformation is often understood as only the unintended dissemination of false information. In order to avoid confusion, this is how the term is used throughout this study. Disinformation is comprised of two elements: the falsehood of the information, and the clear intention to mislead.2 The term is modelled after ‘dezinformatsiya’, a Russian term first coined by the KGB3 to refer to the use of false or otherwise misleading information that is purposely provided to selected audiences to influence their behaviour.4 This was part of the broader set of tactics called ‘active measures’, which frequently involved ‘attempts to deceive the target (foreign governmental and non-governmental elites or mass audiences), and to distort the target’s perception of reality’.5
However, contemporary attempts at disinformation are not just the revival of an old Soviet strategy. The means offered by contemporary communication practices magnify the effects of disinformation. Old-style disinformation devoted considerable care to the crafting of false stories, while today the focus is on quantity rather than quality. Contemporary disinformation is like a Lernaean Hydra: one story may be discredited, but many more will appear.

CONTEMPORARY CHALLENGES: MISINFORMATION, DISINFORMATION, AND INFORMATION ACTIVITIES

False information on social media6 has gained enormous popularity over the last year, but it has rarely been framed in terms of information activities by the mainstream media. This issue affects audiences from all facets of the political spectrum.7 The expression most widely used to refer to the phenomenon in this context is ‘fake news’. The expression quickly gained popularity, so much so that even those who were accused of spreading fake news in the first place started using the term to describe the accusations themselves.8 Media outlets have been accused of spreading disinformation from their inception. It is therefore unsurprising how many observers have resisted contemporary concerns about the emergence of fake news.9 However, the threat today is qualitatively different.

This has led to social media platforms, as well as private companies and, occasionally, governments, taking action. Google is particularly involved in this effort. It does so through direct initiatives (e.g. Google NewsLab and the introduction of a fact-checking snippet)10 and by funding fact-checking networks (e.g. the Poynter International Fact-Checking Network and the First Draft News network).11 However, some studies point out how debunking and fact-checking may be ineffective and sometimes even counterproductive.12 These initiatives are quick and easy, but they do not get at the root causes of the issue.

It is for this reason that social media companies are exploring other ways to counter misinformation, disinformation, and other information activities. For example, Facebook recognized that ‘social media platforms can serve as a new tool of collection and dissemination for [information activities]’. In fact, ‘[t]hrough the adept use of social media, information operators may attempt to distort public discourse, recruit supporters and financiers, or affect political or military outcomes’.13 Because of this, Facebook states that countering information activities is a priority for the platform. It claims to be doing that by ‘collaborating with others to find industry solutions [...], disrupting economic incentives [..., and] building new products to curb the spread of false news’.14
THE SOCIAL CONTEXT

In NATO doctrine, the information environment is composed of three domains: the physical, the virtual, and the cognitive/psychological. Our perception of the world is constructed in these domains; as noted by R. Waltzmann, ‘the Internet and social media provide new ways of constructing realities for actors, audiences, and media’.15

The social context that underlies the rise of false and misleading information on social media is labelled by some to be a ‘post-truth’ environment. As outlined by The Economist, ‘[t]here is a strong case that, in America and elsewhere, there is a shift towards a politics in which feelings trump facts more freely and with less resistance than used to be the case’.16 Italian semiotician Umberto Eco famously claimed that, while social media can support the democratization of authoritarian regimes, they also give voice to ‘legions of imbeciles’.17 The spread of disinformation over social media would not be possible without a suitable habitat. Eco’s statement is a provocation that highlights how information on social media is left without qualified gatekeepers, people who can take responsibility for what is published.

The threat posed by misinformation and disinformation may affect social stability. In its 2016 report on global risks, the World Economic Forum described the phenomenon of the ‘(dis)empowered citizen’: ‘individuals feel empowered by changes in technology that make it easier for them to gather information, communicate and organize’, while at the same time feeling increasingly ‘excluded from meaningful participation in traditional decision-making processes’.18 Disinformation aims at destabilizing society by exploiting these emerging dynamics, many of which take place on social media.

According to a recent study published by Al Jazeera, ‘The explosion of social media can be both a blessing and a curse for journalists. It has made anyone […] a potential witness or source; it has allowed people to tell stories from places where journalists are not present, or where they cannot easily go. Yet it comes with its own problems, problems that can be boiled down to a single question: How can you trust what you see online?’19 Western audiences rely heavily on social media to get their news. This is evidenced by a recent survey highlighting how the majority of American adults use social media as their primary source of information.20 While the democratization of the informational space brings indisputable positive effects, it also leaves society more vulnerable to manipulation. Tailored social media content generates ‘information bubbles21 where voters see only stories and opinions suiting their preferences and biases—ripe conditions for […] disinformation campaigns’.22 Manipulation is carried out by influencing the way information is processed by the human brain. In the absence of qualified gatekeepers, these techniques can be exploited to their full extent. The dissemination of false information is inherently linked to wider dynamics, such as ‘increasing mistrust of people with respect to institutions, to the increasing level of functional illiteracy23 […], as well as the combined effect of confirmation bias’.24

Confirmation bias, i.e. the tendency to interpret new information as evidence that confirms one’s existing beliefs, is the underlying mechanism that allows misinformation and disinformation to flourish. Social psychology indicates a number of other cognitive biases that adversarial actors can capitalize on. Among other things, information is perceived to be valid when:

• The subject is repeatedly exposed to a specific statement25
• The information has been encountered previously by the subject26

Moreover, a number of factors make a subject less likely to analyze a piece of information carefully before making a decision regarding its validity. Among them:

• The subject’s perceived familiarity with the subject at hand27
• The level of interest in the topic: the less a subject is interested in the topic, the less likely he/she is to accurately analyze information28

These cognitive biases inform the tactics that enable acts of disinformation in cyberspace.

TACTICS OF DISINFORMATION ON SOCIAL MEDIA

Broadly speaking, tactics used to spread disinformation on social media share the same desired outcome, i.e. manipulating public discourse. Facebook lists three major tactics employed by malicious actors to conduct operations on their platform:

1. Targeted data collection
2. Content creation (false or real)
3. False amplification (coordinated activity by inauthentic accounts)29

This study focuses on the first and second points.

For journalists, the distinction between true and false statements is important. However, stories are often valued more for their psychological impact than for their intrinsic value. When stories are designed to be part of a broader effort, the primary objective can be influencing selected audiences, which is achieved via the following activities:30

• Increasing the target’s suggestibility
• Gaining control over the target’s environment
• Creating doubt
• Creating a sense of powerlessness
• Creating strong emotional responses in the target
• Heavy intimidation
Tactics | Platforms | Desired outcome
Broad data collection | Blogs, Friendship-based networks | Retrieving public information in order to conduct audience analysis and deliver targeted content
Targeted data collection | Friendship-based networks | Retrieving non-public information on selected individuals in order to expose it31
False content creation and spreading | Friendship-based and Follower-based networks | Inject selected narratives in public discourse, confuse, (possibly) reflexive control32
Emotional content33 creation and spreading | Friendship-based and Follower-based networks | Inject selected narratives in public discourse,34 confuse, (possibly) reflexive control
Saturating the information environment, informational fog | Mainly Follower-based networks | Silence targeted discussions, confuse, divert attention35
False amplification | Mainly Follower-based networks | Increase reach and perceived credibility of selected content36
Impersonation (people) | Mainly Friendship-based networks | Psychological manipulation of selected targets into performing actions or divulging confidential information37
Impersonation (organizations) | Mainly Follower-based networks | Inject selected narratives in public discourse, confuse, (possibly) reflexive control

Manipulation on social media can be channelled into any of these activities. It is important to highlight that these activities are not compartmentalized; on the contrary, several activities can be pursued at the same time and in synergy. This is done through social media-specific tactics, summarized in the matrix above.

The outlined tactics can be countered by NATO and its Allied countries by adapting established procedures to the evolution of disinformation. The manipulation of public discourse through social media stands out among the challenges emerging in the information environment. Countering disinformation on social media is a subset of the general counter-propaganda effort. This highlights how we should look primarily at the social implications rather than the technical details. A considerable number of counter-strategies are currently focused on the latter, sometimes neglecting social dynamics. For example, targeted counter-efforts38 don’t take into consideration the fact that, when compared to the appeal of emotional content, logical argumentation has little power when it comes to countering the spread of disinformation online.39 Analogously, automatic fact-checking applications40 invariably stumble on the same obstacle, i.e. the fact that the people who will download and install these applications are generally not those who are most vulnerable to propaganda in the first place.

STUDY OUTLINE
This study provides a look into what
can be done to counter the problem
of disinformation on social media by
analysing more closely the various
facets that compose it. The study is
organized as follows. Chapter 1 frames
the issue of false information on social
media in the context of the existing
military doctrine on disinformation.
Chapter 2 outlines a conceptual map
describing how disinformation differs
across various social media platforms.
The following chapters take a look
at what may be the Achilles’ heel of
any strategy involving the use of so-
called fake news, i.e. the link between
social media and external content.
Chapter 3 looks at blogs specifically,
and how they are used in concert with
social media to spread misinformation
and disinformation online. Chapter 4
explores the third parties tracking user
behaviour on internet outlets associated
with the spread of false and misleading
information. The conclusion brings
together the findings of the study,
highlighting recommendations and
delineating possible counter-strategies.
The study is complemented by a glossary
that incorporates both NATO-approved
definitions and, for those terms not
currently present in NATO doctrine,
definitions developed by subject-matter
experts and other sources.
1
PLATFORMS:
GEOGRAPHY,
CULTURE, LANGUAGE
Giorgio Bertolin, Katerina Sedova

Different platforms dominate different cultural-geographical areas. Different networks lend themselves to different exploitation tactics. Social media companies are aware of the impact that disinformation planted online has on public discourse, and have come up with some countermeasures. However, the extent to which these countermeasures are effective remains to be seen. Russian-speaking internet users prefer Russian-made social media platforms. These platforms are qualitatively different from their Western counterparts, and can be used more effectively in disinformation campaigns by malicious actors.
INTRODUCTION

In this section we will describe the specificities of misinformation and disinformation across different social media platforms. We will provide an overview of the challenges encountered by major social media platforms, and of what the platforms themselves are currently doing to curb the spread of mis- and disinformation.

A recent study found that ‘[t]he rapid growth of social networks caused them to become ideal platforms for spreading disinformation campaigns (…) [t]o spread fake news, it is necessary to promote it to social media users’.41 This chapter will adopt a ‘microscopic’ perspective, looking at the characteristics of current major social networks.

We must remember that the social media landscape has a transient nature. The platforms analysed in this chapter are those that are most relevant today, but they will not necessarily retain their positions in the future. One just needs to consider the fall of social media giants such as MySpace42 to be reminded that the popularity of social media platforms is not set in stone.

GEOGRAPHY

A mere quarter century since the World Wide Web entered the public domain, 3.77 billion people—more than half of the world’s population—are online. As of 2017, 2.8 billion people are using social media, and the pace of growth continues to accelerate.43 The following maps show the regional nature of the world’s leading platforms.44 They depict, respectively, the first and second most popular platforms in the countries surveyed.

[Figure: World Map of Social Networks, January 2017: the most popular platform in each country surveyed (Facebook, QZone, Twitter, VKontakte, LinkedIn, Odnoklassniki, Instagram)]

[Figure: World Map of Social Networks, ranked 2nd, January 2017: the second most popular platform in each country surveyed (Facebook, Reddit, Twitter, VKontakte, LinkedIn, Odnoklassniki, Instagram)]

Facebook is dominant in the Western world, South America, the Middle East and North Africa (MENA) region, and all English-speaking countries. Reddit is particularly popular in English-speaking countries such as Canada, Australia, and New Zealand. And while Twitter has considerable traction in the West, its role is less important in other world regions.

FACEBOOK

Facebook is currently regarded as the leading friendship-based network, at least in the Western world. In this capacity, this platform is the most valuable target for disinformation campaigns. Recent concerns about the weaponization of false information were focused on Facebook. Many observers pointed to Facebook’s role in exacerbating the negative aspects of the kinds of social dynamics that facilitate the spread of mis-/disinformation. In particular, the algorithms behind the selection of stories on Facebook’s homepage were accused of worsening the echo chamber phenomenon, where the Facebook user is offered information of a nature similar to what he himself produces.45 To the use of its platform to spread mis- and disinformation, Facebook responded by developing reporting and flagging procedures.46 These efforts have received mixed appraisals. Most of the measures simply don’t work, mainly because they do not take into account cognitive biases such as the ‘continued influence effect’.47 One critic pointed out that exposing false or misleading information in stories and/or accounts is useful only as a whitewashing maneuver: ‘It’s ultimately a kind of PR move. It’s cheap to do. It’s easy. It doesn’t actually require them to do anything.’48 Yet, it is important to note that ‘disinformation campaigns happen largely outside of Facebook’s control’49—what happens on Facebook is a symptom of broader dynamics in society at large. While the company’s efforts might curb the spread of disinformation, it cannot fight it directly. Facebook cannot be tasked with countering disinformation: this task does not fall within the responsibility or the competence of a social media company. As Bounegru writes, ‘Facebook’s architecture poses challenges to the study of circulation of content due to the nature of its access and permissions system’.50 Therefore researchers can monitor only the tip of the iceberg, i.e. public pages and user groups. Users who subscribe to groups and public pages receive updates whenever new posts are published. In this context two scenarios are possible:

1. Users subscribe to pages spreading mis-/disinformation because the content resonates with them.
2. Users subscribe to pages that share ‘neutral’ content that resonates with them. These pages can potentially spread disinformation at a later stage.

However, the biggest impact is that of stories that quickly gain popularity and are casually encountered by users, either because their Facebook contacts are sharing these stories or because they are being promoted by Facebook’s algorithms, or a combination of both. These stories that ‘go viral’, either genuine or misleading, can have a considerable effect in promoting selected agendas, especially when the content is tailored to specific target audiences. This practice is blossoming, especially in the commercial sector, although some companies claim to have applied the same methodology to fit the specific needs of political campaigning.51 Regardless of the veracity of this claim—which is difficult to assess, since said companies refuse to share their methodologies and measures of effect—the possibility of applying commercial principles to political purposes must be considered. Adversarial actors can take advantage of this while hiding behind business-driven and relatively anonymous entities.

As with all friends-based networks, users perceive Facebook as a familiar environment. This presents a clear risk:

if a Facebook user’s contact shares a story, the user will likely assume that the contact is vouching for that piece of information, which may not be entirely true. Moreover, some users share stories without double-checking their veracity, particularly if they are not tech- or fake news-savvy.

Grassroots attempts to counter misinformation and disinformation on Facebook have focused on debunking false stories. Websites like Snopes feature efforts to investigate the veracity of posts shared on social media, primarily Facebook. However, research shows that these efforts not only do not reach their stated goals, but might actually make the situation worse. First, very few of the users exposed to unsubstantiated claims ‘actively interact with the corrections’.52 Second, these users seem to be more active within their own echo chamber after they have come in contact with a correction, suggesting a hardening of their initial beliefs.53

TWITTER

Twitter has become particularly important for political manipulation, since it is the platform of choice for many traditional media outlets, politicians, and other opinion leaders. A considerable number of quantitative studies on social media use Twitter as their testing ground since, compared to other platforms, Twitter presents more publicly available data.

As early as 2010, Chamberlain observed how ‘the proliferation of disinformation capabilities represented by Twitter will almost guarantee that users of social networks will be exposed to disinformation’. Users are at risk of being ‘manipulated by any organisation that cares to develop an information operations capability’.54 The same author attempted an explanation for why Twitter is a favourable environment for disinformation:

Twitter messages can seem credible without containing any references to support their claims. The short length of tweets encourages short declarative statements absent of supporting arguments and thus users do not become suspicious of unreferenced assertions. The fact that in some instances Twitter has been the primary source of news about a currently unfolding event also gives it some inherent credibility.55

On Twitter, manipulation of the information environment has occurred through promoting a certain idea via the trending topics feature, and through suppressing certain ideas or discussions through hijacking56 or clotting hashtags57. Both forms of manipulation oftentimes make use of networks of fake Twitter accounts. These fake accounts are often automated or partly automated—robotic activity is what plagues Twitter most. While ‘bot accounts’ are present on other platforms, it is here that this potentially malicious technique demonstrates its reach.58 Up to 15% of all Twitter users are in fact automated scripts that mimic human behaviour with growing sophistication.59 This number can grow considerably in certain specific contexts. As recently highlighted by our Centre of Excellence, nearly 70% of all Russian-language Twitter accounts posting about NATO in Poland and the Baltic states are in fact automated scripts.60 These networks of automated accounts (or ‘social bot networks’) can considerably boost the reach of disinformation.

Any efforts by Twitter to curb the diffusion of false and misleading stories are likely to be channelled towards reporting abusive content and ‘fake news’, as Facebook already does.61 However, no such measures are in place as of this writing. The criticisms directed toward Facebook’s strategy can be applied to Twitter as well. Moreover, the main problem affecting this platform is the proliferation of automated activity.
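Estimates like the 15% and 70% figures above come from automated classification of account behaviour. The snippet below is only a minimal, illustrative sketch of the kind of heuristic scoring a researcher might start from; the specific features and thresholds (posting rate, account age, profile completeness, follow ratio) are assumptions chosen for illustration, not the methodology used by Twitter or by the Centre of Excellence.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class AccountSnapshot:
    """Minimal public profile fields, as they might be pulled from an API."""
    created_at: datetime
    statuses_count: int        # total tweets posted
    followers_count: int
    friends_count: int         # accounts this account follows
    default_profile_image: bool


def bot_likelihood(acct: AccountSnapshot, now: Optional[datetime] = None) -> float:
    """Return a crude 0..1 score; higher means more bot-like.

    Purely illustrative thresholds: inhuman posting volume, default avatars,
    lopsided follow ratios, and hyperactive new accounts are weak signals
    that real bot-detection research combines with many others.
    """
    now = now or datetime.now(timezone.utc)
    age_days = max((now - acct.created_at).days, 1)
    tweets_per_day = acct.statuses_count / age_days

    score = 0.0
    if tweets_per_day > 50:                                  # sustained, very high posting rate
        score += 0.4
    if acct.default_profile_image:                           # no effort spent on the profile
        score += 0.2
    if acct.friends_count > 10 * max(acct.followers_count, 1):
        score += 0.2                                         # follows far more than it is followed
    if age_days < 30 and acct.statuses_count > 1000:
        score += 0.2                                         # brand-new but hyperactive
    return min(score, 1.0)


# Example: a two-week-old account that has already posted 4,000 times.
suspect = AccountSnapshot(
    created_at=datetime(2017, 6, 1, tzinfo=timezone.utc),
    statuses_count=4000, followers_count=12, friends_count=900,
    default_profile_image=True,
)
print(bot_likelihood(suspect, now=datetime(2017, 6, 15, tzinfo=timezone.utc)))
```

In practice, published studies combine dozens of such features with supervised classifiers; the point here is only to make the notion of ‘automated scripts that mimic human behaviour’ concrete.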

Any measure aimed at curbing it is likely to have a sanitizing effect on the overall Twitter environment.

Just like Facebook, Twitter is a battleground for companies selling various degrees of targeted messaging fuelled by audience analysis. However, Twitter offers less personal information than Facebook. The main reason is that profile descriptions on Twitter are not as codified as they are on Facebook. A special category of metadata is dedicated only to geo-localization and external URL links. Therefore, collecting information on these profiles is a process that revolves around various types of secondary analysis,62 meaning that the final result is less refined than what can be obtained from Facebook.

OTHER PLATFORMS

Beyond Facebook and Twitter, the reach of any other platform in the Western information environment is limited. However, the impact of the major platforms is not necessarily directly proportional to their reach. The world of social media is in constant flux, and it is therefore necessary to monitor developments in order to stay current with emerging platforms and cross-platform trends. Moreover, existing platforms are not separate worlds. They exist in interconnection with each other—a story that originates on YouTube can be shared on Facebook, then picked up by a newspaper website; the article can then be shared on Twitter and might generate a thread on Reddit, and so on. In this context, YouTube occupies a distinctive space.

YouTube is a fundamental element in the virtual space where social media exists—‘Web 2.0’. A considerable number of stories shared on social media originate here, and misleading stories are not an exception. Conspiracy theories have been thriving on YouTube since the early days of the platform.63 This is because YouTube was and is regarded as a medium that allows the distribution of ‘alternative’ information, while at the same time being a mainstream information source for millions of
users. Misleading stories follow in the wake of conspiracy theories. Politically slanted channels that distribute false content prosper on YouTube and are shared on social media, where they reach and attract larger audiences. It must be noted that, as is the case for the creation of content on other mediums, most creators are motivated by financial gain. In order to fight the spread of misleading content YouTube fosters media literacy campaigns.65 YouTube has also modified its terms of use by implementing a new review process for its Partner Program. Since April 2017, YouTube channels cannot generate revenue until their videos reach 10,000 views.66 This higher threshold is supposed to give YouTube ‘enough information on the validity of a channel’.67 However, this cannot do much to deter state actors, or proxies of state actors who get their funding from other sources or who are not motivated by financial gain.

While Instagram seems to be relatively immune to the worst aspects of disinformation-spreading on social media, it is by no means a safe space. Among the main issues affecting this platform are impersonation and spamming.68 Instagram’s guidelines prohibit ‘artificially collecting likes, followers, or shares, posting repetitive comments or content, or repeatedly contacting people for commercial purposes without their consent’.69 The fact that the guidelines are tailored for spam marketing suggests that manipulation of the information environment for political purposes is not yet an issue for Instagram.

[Figure: Digital in the Middle East, January 2017: key statistical indicators for the region’s internet, mobile, and social media users. Data and design from Digital in 2017: Global Overview (We Are Social, 2017)]

Recently, self-styled ‘alternative’ platforms have burgeoned, aiming to circumvent the policies of mainstream platforms, in particular the perceived censorship carried out by Facebook and Twitter towards inflammatory and controversial speech. These micro-platforms70
exacerbate the echo chamber phenomenon. Their reach is limited, but not so limited as to be negligible: as mentioned above, stories easily jump from one platform to another. In a disinformation campaign, targeting groups that are particularly receptive71 on an alternative platform can serve as an intermediate stage through which selected narratives can be passed on to mainstream networks. Micro-platforms can act as accumulators, where hostile narratives are free to flourish because of the absence of any significant obstacle.72

THE ARAB WORLD AND THE MENA REGION

Arabic is the fourth most common language online;73 the MENA region produces a wide range of Arabic-language material, but, as previously noted, this material is not shared on native platforms. Rather, Arabic-language users are particularly active on Facebook, Twitter, and Instagram.

Although revolutions do not take place on social media, social media played a prominent role in the Arab Spring74 in 2010. Tunisian protesters famously used Twitter to make their voices heard in the Arab world and beyond.75 As of January 2017, there are 93 million active social media users in the Middle East alone (see the figure on page 18).76

Facebook is the most popular social network in the Arab world. 87% of social media users have a Facebook account. Of these, 89% access the platform on a daily basis.77 As evidenced in a recent report published by Al-Jazeera, Facebook is ‘the first platform to consider for newsgathering, but also for storytelling and audience engagement’.78

[Figure: Screenshot of Da Begad’s homepage, 14 June 2017]

Content creation and content diffusion are tactics that bring the best results when they are used in concert. Analogous to what happens in other parts of the world, in the MENA region there is a ‘range of services available to anyone looking to distribute fake news and launch public opinion manipulation campaigns’.79 Companies like ‘Dr Followers’ and ‘CoolSouk’ offer a wide range of services aiming at boosting the popularity of social media posts.80 These services target popular platforms including Twitter, Facebook, Instagram, and YouTube. The activity of these companies can considerably increase the visibility of counterfactual stories, which, as elsewhere, can be spread for political purposes.

Fact-checking efforts are present in the MENA region as well. The Egypt-based website Da Begad (see the screenshot of Da Begad’s homepage from 14 June 2017 on page 20) debunks false stories found on social media.81 Analogous to similar initiatives in other parts of the world, Da Begad is maintained by a team of volunteers that claim to be independent from any political affiliation; this team relies on crowdsourcing for reporting.82

Da Begad’s graphic concept is very basic. First, the disputed story is introduced in a section called ‘The Post’ (since all of these stories are found on social media). Then a brief analysis of the contested claims is given in a section called ‘The Facts’ together with the necessary references.

There are features to suggest that the spread of false and misleading information over social media is aided by automated activity. During the 2017 diplomatic crisis involving Qatar, researchers collected evidence pointing to automated activity in support of information attacks against Qatar: the Twitter hashtags that advocated cutting off relations with Qatar ‘originated in Kuwait and spread fast, suggesting heavy bot usage’, while the response hashtag, in defence of Qatar, ‘increased gradually, without the kind of significant peak its
rival hashtag experienced. Anti-Qatar hashtags seem to be more organized and suggest advance preparation’.83

Non-state actors are particularly apt at combining audience characterization with aggressive information activities. Groups like Daesh are ‘particularly successful in targeting tech savvy, impatient and respect-seeking millennials (…). They know how and what they think and feel, how they want to be perceived and how they wish to receive information’.84 This allows the terrorist organization to spread emotional content and biased stories, mostly focused on praising the Caliphate utopia.85 These activities are carried out prominently on social media, both in the MENA region and among Arabic speakers in the West. It is for this reason that some Arab States decided to strengthen the government’s ability to monitor and curb the use of social media by violent groups. Bahrain, Egypt, Lebanon, and Kuwait have enforced legislation to address this issue.86 In some cases, this entails directly targeting the most vulnerable demographic group, i.e. young men. The Kuwaiti government collaborates actively with the Kuwait University’s Media Department in a research project aimed at detecting early signs of youth radicalization on social media.87 These efforts are complemented by those of supra-national entities. The Global Coalition’s Information Cell developed an audience characterization system that
exploits interactive videogames to detect potential supporters of Daesh.88

The issue is not limited to Arab countries alone. As part of its fight against pro-Palestinian violent political extremism, Israel is working towards compelling social media companies (namely Google and Facebook) to curb support for groups and pages that spread extremist messages. Israel’s new counterterrorism law ‘established a new criminal offense for demonstrating solidarity with a terrorist organization or with an act of terrorism, and incitement to terrorism, including via the internet and social media’.89 Since social networks, through extremist propaganda, are widely regarded as catalysers for radicalization, governments throughout the world are pressuring social media companies to take action, but whether this will result in concrete actions is questionable.90

Throughout the MENA region, social media are still perceived as an alternative source of communication, relatively independent from the constraints imposed on traditional media by state authorities (on page 22, a cartoon published by Al-Jazeera’s website light-heartedly illustrates this point: the TV screen is captioned with the label ‘Authority’s Media’).

However, governments in the region are quickly weaponizing new media. For example, Iran is believed to be creating bogus online personas to carry out targeted attacks.91 As evidenced above, the online environment in the MENA region suffers from the same issues encountered in the West, not least because the most popular social media platforms are the same. The following sections will analyse a region where the social networks that rank highest in popularity are relatively unknown to the rest of the world.

SOCIAL MEDIA ON RUNET

RuNet (or the ‘Russian Internet’) continues to be dominated by home-grown social networks, as populations in Russia and many nations of the former Soviet Union are mainly active on VK, Odnoklassniki, and MoiMir. To understand RuNet’s social media space, one must understand the domestic origin of Russian disinformation, its weaponization for geopolitical goals, and the consequences for Russian-owned social networks.

VK.com (VKontakte.com)92

As of January 2017, VK reports 90 million monthly active users, with almost 70% of them living in Russia.93 Founded in 2006, VKontakte was intended to connect university students. The network continues to attract a younger audience in comparison to other social networks, with its largest user demographic group under the age of 35.94

VK is known for its uncluttered user interface, focus on communities, and entertainment—the source of its high audience engagement. While a typical friends-based network, several features differentiate it from Facebook. The rich built-in image modification features allow VK users to easily overlay images with text for meme creation. Music and
video sharing—sometimes in violation of existing regulations—are central to VK’s success and continued active audience engagement.

The detailed information codified in its profile questionnaire enables a powerful search function, which makes it easy to find specific people, and locale- and interest-based groups. A phenomenon specific to VK, one that does not exist in most Western social media, is relying on local groups to spread information. Towns and other geographical areas have local group pages with thousands of participants, who share pictures, videos, and eyewitness accounts.95 Such fine-grained social connectivity presents fertile ground for disinformation.96

VK’s weak privacy and security settings make its users vulnerable to disinformation. Its API and user data protection allow micro-targeting. In 2017, VK added several features that enhance its advertising platform and make it more vulnerable to misinformation and disinformation. One of these features allows advertisers to display shortened web addresses to streamline the appearance of their ads by obfuscating the destination address. This hinders a user’s ability to identify the source of the posting and to critically evaluate the link before clicking it, thus increasing users’ vulnerability by encouraging them to unknowingly navigate to sites that may be malicious. Lax security measures further enable registration of mass accounts, making VK the cheapest platform for the creation of bots, which are used to amplify disinformation.97

OK.ru (Odnoklassniki.ru)98

Odnoklassniki (OK) is the second most popular social media network in Russia and the nations of the former Soviet Union, with 40 million registered users in Russia and 65 million overall.99 OK.ru users are more likely to be women and over 30 than users of any other Russian social network.100 OK has the typical features of other friends-based networks—personal profiles, chats, discussion boards, and the functions that make status updates, sharing pictures, and searching for friends possible.

OK’s focus is on engagement through entertainment. Personal feeds are flanked by social games, ads, and banners showing the most popular songs and videos trending on OK.101 To compete with VK, whose foray into multimedia content contributed to its early lead in popularity, Odnoklassniki launched its own video, TV, and cinema service. Unlike VK, the service allows users to watch TV shows online from STS and TNT—two popular Russian entertainment TV broadcasters. This is a key method driving continued user engagement within the network.102 The mastery of state-controlled Russian TV in blending the Kremlin’s narratives with entertainment is well documented.103

Image consumption and manipulation are central to the user experience. Numerous options facilitate custom framing ‘postcards’ and rating pictures. Before users log in, the default feed displaying trending content showcases
the essence of OK—moralistic memes, jokes, cute pictures, and the ubiquitous cat videos, which are occasionally interlaced with propagandistic imagery of the Russian leader.104 This feature presents a vulnerability that can be exploited: if the feed can be manipulated to push content, public opinion may be influenced through the juxtaposition of propagandistic imagery with emotional content, just as purchasing habits may be influenced through advertisement.

[Figure: Types of media trending on OK.ru]

Odnoklassniki supports an extensive set of search parameters, which enhance discoverability. The network is home to many large user-generated communities, counting millions of followers from the RuNet area.

Moi Mir (my.mail.ru)

The third most popular Russian social network is Moi Mir,105 a property of the Mail.Ru web service, akin to the Google+ and Gmail ecosystem. As of 2016, Moi Mir claimed 25 million users.106 Moi Mir users set up their accounts through a mail.ru e-mail address, which serves as a single sign-on into the mail.ru universe and Moi Mir. The lack of an SMS verification requirement to complete registration on Moi Mir exposes the site to anyone who wants to easily create non-genuine accounts.107 The key to Moi Mir’s popularity is the rich social gaming and video-sharing experience it provides. The platform was recently
augmented with digital TV content from STS and TNT. This move is consistent with the goals of Mail.Ru Group, which owns both Odnoklassniki and Moi Mir.108 Once the integration has been put in place, Moi Mir will be vulnerable to the tactic of blending entertainment with disinformation, masterfully honed by Russian State TV.109

The search functionality of Moi Mir has many features in common with VK and Odnoklassniki. However, unlike the other two platforms, many of Moi Mir’s search parameters are based on physical appearance, or the user’s ‘chronotype’, signifying waking and sleeping patterns. Like most dating sites (and VK), Moi Mir offers a ‘last active/last visited’ indicator to reveal the levels of engagement.110 While this facilitates making connections, such information exposes personal details that a malicious actor might use. The ‘dating-like’ atmosphere can create an illusion of intimacy which may make users vulnerable to a specific type of hybrid troll, those featuring attractive individuals.111

The RuNet evolution

Nationwide, Russian Internet penetration grew from 5% in 2002 to 73% in 2017.112 Why are Western social media platforms lagging so far behind in Russian-speaking communities? In a comparative study of Facebook and VK, researchers explained this phenomenon through the concepts of platform and culture. The users interviewed expressed their appreciation for the user-friendly minimalistic interface unencumbered by advertisements, access to engaging content through audio and video sharing, trustworthiness, and fun. Although VK exists in 70 languages, it dominates among Russian speakers, who can connect with each other in Russian, on the basis of entertainment, cultural humour, and pride in being on the ‘made in Russia’ platform.113

Since 2011, Facebook has slowly gained share in certain demographic segments. First, Facebook users in Russia are on average older, more educated, and earn a higher income than VK users.114 Second, Facebook’s foothold in Russia

is increasing as it is used for business communication and maintaining professional contacts outside of the RuNet ecosystem.115 However, the most significant factors contributing to the increase of genuine Facebook users in the RuNet ecosystem are the domestic political climate in Russia and the geopolitical adventurism of the Kremlin, as dissidents and the politically engaged are fleeing increased government control.

RuNet, and VK in particular, was once a pivotal medium for political engagement and a dynamic platform for political discourse, organizing protests, and other activism. The ‘colonization of RuNet’ by the state political technologists since the 2000s occurred gradually, in several stages.116 As traditional media became the target of government control,117 serious political reporting moved to social media that were free from interference, particularly blogs,118 where the culture of skepticism required elaborate proof of one’s assertions.119 Coupled with the boom in the technology sector encouraged by the Kremlin,120 RuNet was allowed to flourish, empowering the development of local social networks. VK, Odnoklassniki, and other popular online media were founded by the mid-2000s. President Putin’s early disinterest in RuNet ensured its freedom through 2010,121 when the first large-scale disinformation campaign in Russia was launched in support of Medvedev’s re-election campaign. It deployed pro-government botmasters and trolls, recruited from pro-Kremlin youth groups such as ‘Nashi’, to comment on opposition blogs and to repost pro-government messages.122 The trolls responded to the highly skeptical blogging culture by faking detailed, believable proofs in support of their false narratives.123

These early attempts at disinformation, augmented by a rapidly maturing spam industry and search optimization, emphasized tactics that went beyond reposting and retweeting to manipulate popular rankings with engaging, viral content.124 In their disinformation efforts, the leaders of ‘Nashi’ became

fixated on creating professionally produced, visually engaging content that could go viral and beat the trending topic algorithms.125 In 2011, the political environment changed; in reaction to Putin’s announcement that he intended to run for president, mass street protests occurred, and continued for several years, but by that time the infrastructure for a disinformation machine was in place.

RuNet entered a pivotal stage in 2011, one of increasing government censorship fueled by the Kremlin’s push to silence domestic opposition. During this stage, the local bloggers bore the brunt of the state’s displeasure as it expanded the legal definitions of ‘extremist’ content. New censorship empowered arbitrary banning of local and foreign websites without explanation, and required bloggers with audiences of over 3,000 readers to register with the government as mass-media outlets. As Odnoklassniki and Moi Mir were already properties of Mail.ru and owned by Alisher Usmanov, a close Kremlin ally, compliance with government pressure likely occurred quietly. The storm over Pavel Durov’s VK illustrates the Kremlin’s interest in social networks. In a move reportedly orchestrated by Putin’s close ally Igor Sechin, the founder and president of VKontakte sold his shares and the platform was taken over by allies of the Kremlin.126

The case of Ukraine

Since the Euromaidan protests of 2014, Ukraine has been on the front line of multiple disinformation campaigns.127 The most successful planted stories used emotionally compelling images and video content of unrelated events and geographic locations as ‘evidence’ of Ukrainian misdeeds.128

In 2014, a quarter of the Ukrainian public had an account on VK, with a large percentage using social media as their main news source.129 The Ukrainian government prohibited these networks,130 declaring them tools of warfare and banning access to them in Ukraine.131 Weeks after the announcement of the ban, and before the block was fully implemented, 2.2 million
Ukrainians moved from VK and OK to Facebook.132 In Russia, VK’s change of ownership and the Kremlin’s control over social media through its close allies sent the educated, politically engaged, and surveillance-weary social media users to Facebook, Twitter, and encrypted messaging platforms such as Telegram.133 The Kremlin continues to threaten shutdowns134 and bans on VPNs (Virtual Private Networks).135 As the Russian state exerts control over social media companies and coopts them as tools of statecraft, these trends in increased censorship are likely to accelerate.

CONCLUDING REMARKS

All social media platforms share the same vulnerability: their users trust the online environment. They are surrounded by ‘friends’, and they feel as though they have control over the information they are given access to. However, the threshold for critical evaluation of the information received is considerably lower than for traditional media.136 This means that adversarial actors encounter fewer obstacles in the execution of disinformation campaigns than was the case in the past. Many of the tactics that can be applied to disinformation campaigns are based on standard business models. If social media are used in hybrid warfare, escalation from guerrilla marketing to guerrilla warfare becomes a tangible possibility.

A large part of the efforts to counter misinformation and disinformation online consists of debunking initiatives supported by the platforms themselves and by private organizations. Research shows that these approaches are, at best, well intentioned but ineffective. This is a dead end also for countering structured disinformation campaigns. It is, therefore, vital that countermeasures be grounded in radically different procedures.

Facebook is currently the undisputed leading social media platform in the Western world, Latin America, the Middle East and North Africa, India, as well as a number of other regions. Facebook was recently accused of unintentionally facilitating the spread of disinformation. The company has shown interest in tackling the issue, but it remains to be seen how fruitful these efforts will be. The most popular followers-based network, Twitter, is just as likely to be a vehicle for disinformation. On Twitter, the use of automated bot accounts demonstrates the full potential of the medium for spreading false information. Some of the structural characteristics of Twitter—concise messages and metadata tags—can be exploited by malicious actors to magnify the reach of selected narratives.

The world of social media is constantly changing and evolving. New platforms emerge, and the old ones keep re-inventing themselves, adding new interface features. Younger audiences are active on platforms that are virtually unknown to their parents. Moreover, different social networks are popular in different regions of the world. Chinese and Russian audiences are active on ‘home-grown’ social media, over which their respective governments have a considerable degree of control. In the Middle East and North Africa, the most popular social networks are those used in the West. While the political dynamics are profoundly different, we see evidence of the same vulnerabilities.

Compared to Facebook, RuNet’s social media networks have fewer security and privacy protections, and they offer more robust capabilities for discovering people and groups. The features on RuNet drive engagement through entertainment with rich visual content, and video and audio sharing. The integration of TV programming enables passive consumption, bringing platforms closer to the well-honed disinformation techniques of Russia’s state-controlled media. Their advertising frameworks, which shape consumer choices, can also be used to influence public opinion with nuanced audience targeting. When weaponized, these legitimate platform features are powerful vehicles for disinformation.
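One practical countermeasure against at least one of these features, the obfuscated shortened advertising links described in the VK section above, is simply to resolve a link to its final destination before trusting or sharing it. The sketch below is illustrative only: it assumes Python with the widely used requests library, and the shortened URL shown is a made-up placeholder, not a real VK ad link.

```python
import requests


def resolve_final_url(short_url: str, timeout: float = 5.0) -> str:
    """Follow HTTP redirects and return the final landing URL.

    A HEAD request is usually enough to expose where a shortened or
    obfuscated link actually leads, without downloading the page body.
    """
    resp = requests.head(short_url, allow_redirects=True, timeout=timeout)
    return resp.url


if __name__ == "__main__":
    # Hypothetical shortened link, used only for illustration.
    print(resolve_final_url("https://vk.cc/example"))
```

Researchers can apply the same resolution step in bulk before classifying the domains that a suspected campaign links to.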

Yet in today’s Russia, the ownership of social media platforms and the ability of their policy teams to withstand pressure from the state are the decisive factors determining the platforms’ vulnerability to exploitation. With Russian social media consolidated in the hands of the Kremlin’s allies, the lines between the state and social media technology companies have become blurred. The ownership of the networks and their susceptibility to state pressure, enhanced by the ‘engagement through entertainment’ platform model, leave RuNet social media networks—and their users—uniquely vulnerable to mis-/disinformation.
2
BLOGS, FAKE NEWS,
AND INFORMATION
ACTIVITIES
Nitin Agarwal, Kiran Kumar Bandeli

Blogs provide fertile ground for framing narratives. This chapter demonstrates that aside from the blog post itself, reader comments can make the narrative more persuasive. However, the absence of a social network structure for blogs inhibits the dissemination of these narratives. Social media platforms such as Twitter and Facebook are used as vehicles to disseminate the content using cross-media and mixed-media tactics. The link between blogs and social media platforms is vital for understanding contemporary disinformation campaigns.
INTRODUCTION

Blogs have ushered in an era of citizen journalism that has irreversibly changed the way we consume information, partly supplanting traditional journalism. Blogs have endowed citizens with the power and freedom to express their opinions or frame narratives for a greater audience; readers' comments on blogs afford greater inclusiveness and dialog. Blogs cater to the needs of the public to receive information in manageable chunks, tailored to their individual preferences. They can provide intimate details and live accounts featuring compelling, on-the-ground-style coverage of an event. Together, these two capabilities—news chunking and first-person reporting—can be used to orchestrate highly biased, partial, and distorted information, i.e. an information campaign.

Blogs alone are not effective in conducting information campaigns. Blogs provide fertile ground for framing narratives, but the absence of a social network structure inhibits dissemination. Various social media platforms, such as Twitter, Facebook, and VK, are then used as vehicles to disseminate the content. Nine out of ten bloggers have Facebook accounts, and 78% of bloggers use Twitter to promote their content; this percentage is even higher, almost 90%, for professional and full-time bloggers.137 In addition to bloggers promoting their content, studies have widely reported the exploitation of computer programs,138 also known as social bots, to massively amplify content dissemination via Twitter. The ability to embed YouTube videos, SoundCloud files, and Internet-based memes in blogs has led to unprecedented convenience in framing narratives, disseminating them widely, and driving online traffic to generate a rich conversation around a chosen topic. In addition to content promotion, prolific media integration helps boost search rankings artificially—a technique known as link farming, which is a well-known strategy for search engine optimization. Gaming search engines by using prolific linking to blogs has become part of modern information activity. By further examining the information flows within the media networks, we attempt to understand the sources of mis-/disinformation and their reach; if we can detect how far and how quickly mis-/disinformation can travel, we can also understand the extent to which information is being manipulated. This chapter presents an in-depth examination of the social media networks using a social-network-analysis-based methodology to identify the prominent information brokers and leading coordinators of disinformation campaigns. A methodological section describes how the data is fetched from different sources and the approach we propose for studying information flows. The analysis and findings below provide a deep dive into the research questions we set out to answer in this study.

METHODOLOGY

For the purposes of our analysis, we examined several blogs and identified common attributes among them, such as title, date and time of posting, author/blogger, blog post content, comments, and permalinks.
THE DATA COLLECTION PROCESS FOR BLOGS

We collected and indexed all blog content from four different blog datasets into our Blogtrackers database. The database can be accessed at http://blogtrackers.host.ualr.edu/. The dataset consists of 372 blog sites, 7,576 bloggers, and 196,940 blog posts riddled with false and misleading information. To crawl these blogs from different sources, we set up a crawler for each blog to extract all the required attributes. There are three main steps in crawling data from a blog: (1) exploring the blog site, (2) crawling the blog site, and (3) cleaning and storing the data in a database for analysis and retrieval. The Figure above represents the data crawling process for the blogs.139

For this study, data was collected from four diverse sources. The descriptions associated with the attributes in these four types of datasets are as follows:

● Fake news dataset from kaggle.com. This dataset has 244 blogs, 2,236 bloggers, 12,999 posts, and 20 attributes. Some of the key attributes in this dataset are: domain name, site_url, author, post title, text, published date, language, comments, replies_count, shares, and likes. The dataset is available at https://www.kaggle.com/mrisdal/fake-news.

● Dr. Melissa Zimdars' compiled list of fake news blogs. Dr. Melissa Zimdars, a professor from Merrimack College (http://bit.ly/2wTMlUb), compiled blogs featuring fake news. These blog sites are available at http://bit.ly/2ezvFbV. This dataset has 37 blogs, 971 bloggers, 96,056 posts, and 79 attributes. The key attributes are: blog name, blogger, blog post title, blog post, posting date, location, and language.

● Blogs containing disinformation regarding the Baltic States. This dataset has 21 blogs, 728 bloggers, 16,667 posts, and 79 attributes. The key attributes are: blog name, blogger, blog post title, blog post, posting date, location, and language.

● Blogs containing disinformation regarding NATO exercises/activities. This dataset contains blogs collected by the Blogtrackers tool that posted mis-/disinformation during various exercises conducted by NATO, such as Trident Juncture 2015, Brilliant Jump 2016, and Anakonda 2016. This dataset has 70 blogs, 3,641 bloggers, 71,218 posts, and 79 attributes. The key attributes are: blog name, blogger, blog post title, blog post, posting date, location, and language.

The characteristics of these four datasets are presented in the Table below. Next, we present the research methodology used to analyse these blogs in order to examine the spread of disinformation.

In this study, we plan to answer the following research questions:

● What are the typical characteristics of mis-/disinformation-riddled blogs?

● Can we track the origins of the content, such as memes, images, etc., appearing in these blogs?

● What strategies are common in disseminating the content (e.g. mixed-media and cross-media)? And can we identify the other media sites that are predominantly used to disseminate the original blog posts?

● How do antagonistic narratives travel?

CHARACTERISTICS OF THE FOUR BLOG DATASETS

Dataset                                                               Blogs   Bloggers   Posts     Attributes
Fake news from Kaggle.com                                             244     2,236      12,999    20
Prof. Melissa Zimdars' compiled fake news blogs                       37      971        96,056    79
Blogs containing disinformation regarding the Baltic States           21      728        16,667    79
Blogs containing disinformation regarding NATO exercises/activities   70      3,641      71,218    79
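To make the three-step process concrete, the following minimal Python sketch walks the same explore, crawl, and clean/store stages for a single blog. It is a hypothetical illustration, not the Blogtrackers crawler itself: the URL, CSS selectors, and database schema are placeholders that would differ for every blog platform.

```python
# Minimal sketch of a single-blog crawler: explore -> crawl -> clean/store.
# Hypothetical example; the URL, selectors, and schema are placeholders,
# not the actual Blogtrackers implementation.
import sqlite3
import requests
from bs4 import BeautifulSoup

def explore(index_url):
    """Step 1: explore the blog site and collect post permalinks."""
    soup = BeautifulSoup(requests.get(index_url, timeout=30).text, "html.parser")
    return [a["href"] for a in soup.select("article h2 a[href]")]

def crawl(permalink):
    """Step 2: crawl one post and extract the common attributes."""
    soup = BeautifulSoup(requests.get(permalink, timeout=30).text, "html.parser")
    def text(selector):
        node = soup.select_one(selector)
        return node.get_text(" ", strip=True) if node else ""
    return {
        "permalink": permalink,
        "title": text("h1"),
        "author": text(".author"),
        "posted": (soup.select_one("time") or {}).get("datetime", ""),
        "content": text(".post-content"),
        "comments": len(soup.select(".comment")),
    }

def store(posts, db_path="blogtrackers_sketch.db"):
    """Step 3: clean and store the records for later analysis and retrieval."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS posts
                   (permalink TEXT PRIMARY KEY, title TEXT, author TEXT,
                    posted TEXT, content TEXT, comments INTEGER)""")
    con.executemany(
        "INSERT OR REPLACE INTO posts VALUES "
        "(:permalink, :title, :author, :posted, :content, :comments)", posts)
    con.commit()
    con.close()

if __name__ == "__main__":
    links = explore("http://example-blog.test/")   # placeholder URL
    store([crawl(link) for link in links[:20]])    # crawl a small sample
```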
TYPICAL CHARACTERISTICS OF DISINFORMATION-RIDDLED BLOGS

What are the typical characteristics of mis-/disinformation-riddled blogs? Based on our observations and the work of other experts, we provide a set of heuristics to identify blogs that are potentially riddled with mis-/disinformation.140 These heuristics are:

1. Pay attention to the 'contact us' section of the page to validate and verify site authors. The contact information sections of these blogs do not provide real contact information for the author. For instance, one such real-looking contact URL is http://abcnews.com.co/.

2. Do not read just the headline; instead, skim the body content to familiarize yourself with the details of the story. For example, the headline 'Obama Signs Executive Order Declaring Investigation into Election Results; Revote Planned for Dec. 19th – ABC News' is a false story with a catchy headline, but reading through the content will enable the reader to better evaluate the story.

3. Pay close attention to the URLs, sources, images, and editorial standards of the writing. For instance, the URL bloomberg.ma is used to imitate the well-known site bloomberg.com.

4. Always cross-check the story with fact-checking websites, such as snopes.com, factcheck.org, mediabiasfactcheck.com, or politifact.com, to assess its credibility. For example, a blog post titled 'The Amish In America Commit Their Vote to Donald Trump; Mathematically Guaranteeing Him a Presidential Victory – ABC News' is a fake story, as reported by the well-known fact-checking website snopes.com.

5. Search for the post in well-known search engines, such as Google, Bing, or Yahoo, to see if the same post or content is repeated on other sites using mixed/cross-media approaches to disseminate the narrative. For instance, the blog post 'Obama Signs Executive Order Declaring Investigation into Election Results; Revote Planned for Dec. 19th – ABC News' has been shared on many websites, indicating the use of a mixed-media approach.

6. Check if the article has been previously published and is being reused to affect perceptions of an event. For example, a blog post titled 'Muslims BUSTED: They Stole Millions in Govt Benefits' published in 2016 contained an image that was reused from 2013.

7. Check if the post is disturbing or controversial. Fake stories usually appear under sensational headlines. For instance, the blog post titled 'EU NATO Commit Adultery, Prince Charles Saudi Trade & More' presents disturbing information. Disinformation narratives are often embedded in such stories.

8. Check if the post has any 'likes', 'replies', or 'comments'. These indicate how interested readers are in a given story and whether they agree or disagree; the sentiment of the comments can be used to infer this. For example, a blog post titled 'NASA Confirms—Super Human Abilities Gained' has a lot of comments, many of which debunk the story.

To evaluate the efficacy of these eight criteria, we conducted a survey. We randomly selected 96 blogs featuring mis-/disinformation and asked survey participants to rate (low, medium, high) how effective each of the eight criteria was in assessing whether the blog site contained misleading or false information.

EFFECTIVENESS OF EACH OF THE 8 CRITERIA IN IDENTIFYING BLOGS CONTAINING MISINFORMATION OR DISINFORMATION

* The criteria are sorted in decreasing order of effectiveness. The smaller the gray bar, the more effective the criterion. Numbers on the colored bars indicate the number of blog sites identified as containing misleading or false information with a confidence of High, Medium, or Low.
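A chart of this kind can be produced with a short matplotlib script. The sketch below is purely illustrative; the criterion labels are abbreviated and the counts are placeholders rather than the survey's actual ratings.

```python
# Illustrative sketch of a stacked horizontal bar comparing the eight criteria.
# The counts below are placeholders, NOT the survey results reported above.
import matplotlib.pyplot as plt

criteria = ["Mixed/cross-media use", "Fact-checking sites", "Sensational content",
            "URL imitation", "Reused content", "Body vs. headline",
            "Likes/replies/comments", "Contact information"]
high   = [60, 55, 40, 35, 30, 28, 25, 20]   # placeholder counts of 'high' ratings
medium = [25, 25, 30, 30, 30, 30, 30, 30]   # placeholder counts of 'medium' ratings
low    = [11, 16, 26, 31, 36, 38, 41, 46]   # placeholder counts of 'low' ratings (rows sum to 96)

fig, ax = plt.subplots(figsize=(8, 4))
left = [0] * len(criteria)
for counts, label, colour in [(high, "High", "#2b7bba"),
                              (medium, "Medium", "#f0a202"),
                              (low, "Low", "#bbbbbb")]:
    ax.barh(criteria, counts, left=left, label=label, color=colour)
    left = [l + c for l, c in zip(left, counts)]
ax.set_xlabel("Ratings for the 96 evaluated blogs")
ax.legend()
plt.tight_layout()
plt.show()
```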
After collecting the survey data, we constructed a stacked bar for each of the criteria, where the X-axis represents values (0%–100%) indicating participant confidence in the 96 blogs rated low, medium, or high, and the Y-axis denotes the eight criteria. Looking at the Figure on page 36, it is clear that the best criterion is the blog site's use of mixed/cross-media strategies in disseminating the content. This can be used as the strongest single feature for assessing the mis-/disinformation contained in any blog post. The next best feature is cross-checking with fact-checking websites.

Next, we present some empirical observations vis-à-vis the mis-/disinformation heuristics on the fake news dataset collected from kaggle.com. Most of the posts had very few comments or none, which might imply that the stories were mainly disseminated but not discussed much on these sites. We also found that during the US elections many posts were primarily intended to reach large numbers of readers, draw their attention, and direct them to non-factual stories with the intention of influencing them. For example, 96% (12,468 of 12,999) of the posts had zero 'likes' and 94% (12,304 of 12,999) of the posts had zero 'replies'. These posts were primarily intended to be disseminated to reach more people and mislead. We also observed that the majority of the stories originated from a set of domains that are regularly reported as containing false information by snopes.com.

We further examined the website structure disseminating these false stories. We found, in many cases, that the 'contact us' page does not provide any real contact information, or redirects readers to another website, usually a social media site, e.g. Facebook or Twitter.
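Part of this check can be automated. The sketch below is a hypothetical helper, built on requests and BeautifulSoup, that fetches a site's 'contact' link and flags cases where it resolves to a different domain, as abcnews.com.co does.

```python
# Sketch: flag a 'contact us' link that resolves to a different domain.
# Hypothetical helper, not the evaluation code used in the study.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def base_domain(url):
    host = urlparse(url).netloc.lower().split(":")[0]
    return host[4:] if host.startswith("www.") else host

def contact_leaves_domain(site_url):
    """Return (flag, final_url); flag is True when the contact page ends up off-domain."""
    soup = BeautifulSoup(requests.get(site_url, timeout=30).text, "html.parser")
    link = soup.find("a", string=lambda s: s and "contact" in s.lower())
    if link is None or not link.get("href"):
        return True, None                    # no usable contact link is itself a warning sign
    contact_url = urljoin(site_url, link["href"])
    final_url = requests.get(contact_url, timeout=30, allow_redirects=True).url
    return base_domain(final_url) != base_domain(site_url), final_url

# e.g. contact_leaves_domain("http://abcnews.com.co/") would be expected to flag
# the off-domain redirect described in the text below.
```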

The example below illustrates how a site redirects to another website when readers look for contact information about the author. The example provided here refers to the site named ABC NEWS with the URL http://abcnews.com.co/. The contact information link is present at the bottom of the page for this site. If the reader clicks on 'contact us', he is redirected to another site named CNN with the URL http://cnn.com.de/contact/. The http://cnn.com.de/ website closely mimics the CNN News website (http://www.cnn.com/), even using the CNN logo, website structure, etc. However, cnn.com.de is riddled with false stories and conspiracy theories. When posted on Facebook, an article from cnn.com.de would bear the CNN logo and appear as if the article were actually published by the genuine CNN.com. This deception tactic is highly effective in disseminating disinformation originating on blogs via other social media channels.

ILLUSTRATION: THE 'CONTACT US' PAGE REDIRECTS TO ANOTHER WEBSITE

TRACKING THE ORIGINS OF MISLEADING BLOG CONTENT

Can the origins of misleading content, such as memes, images, etc., which appear on these blogs, be tracked? We began our analysis with a 'reverse image search' (i.e. searching for the URL of a given image on Google Images to identify other sources that have used the image) and found that the images were not unique to each article and were not relevant to the context in which they were used. The same image was reused with different narratives, as shown below.

REVERSE IMAGE SEARCH SHOWS THE USE OF ONE IMAGE WITH DIFFERENT NARRATIVES
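The study relied on Google reverse image search; a complementary way to detect the same kind of reuse locally is perceptual hashing. The sketch below assumes the Pillow and imagehash packages and flags pairs of images whose hashes are nearly identical, which typically indicates the same picture recycled under different narratives.

```python
# Sketch: detect re-use of the same image across blog posts with perceptual hashing.
# Assumes Pillow and the 'imagehash' package; this is an alternative to the manual
# Google reverse image search used in the study, not a reimplementation of it.
from io import BytesIO
from itertools import combinations
import requests
from PIL import Image
import imagehash

def phash_of(url):
    img = Image.open(BytesIO(requests.get(url, timeout=30).content))
    return imagehash.phash(img)

def find_reused_images(image_urls, max_distance=5):
    """Return pairs of URLs whose perceptual hashes differ by at most max_distance bits."""
    hashes = {url: phash_of(url) for url in image_urls}
    return [(a, b) for a, b in combinations(hashes, 2)
            if hashes[a] - hashes[b] <= max_distance]

# Usage: pass the image URLs scraped from several posts; the pairs returned are
# likely the same photograph reframed with different narratives.
```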
Images lend credibility to a narrative and are more effective than text alone for fabricating perceptions. The use of images and videos in framing narratives is effective because multiple modalities are exploited to influence thinking.141 We also observed a pattern in which a post shared on Twitter was actually linked to a blog post using hashtags and links. This pattern is common across various social media channels, i.e. the origin of the content is generated on a blog and is later disseminated through social media channels. The Figures below depict this pattern: initially the content is generated in blog posts, where hashtags and links serve as the vehicles connecting to other social media channels, in this case to Twitter.

BLOG POST USES HASHTAGS AND LINKS TO REFER TO TWITTER

TWEET USES HASHTAGS/LINKS

MIXED-MEDIA VS. CROSS-MEDIA APPROACHES

A mixed-media information dissemination campaign uses multiple social media channels to perpetuate a narrative. More precisely, the information campaign can be observed on multiple social media sites through the use of text, images, and audio and video content. Although the content may not be strictly identical on the various social media channels where it appears, it clearly pertains to a single information campaign.

A cross-media information dissemination campaign is characterized by a central channel around which the campaign is built. More precisely, the information is hosted on a website (e.g. text on a blog or video on a YouTube channel) and is widely distributed through other social media channels that provide established social network structures, such as Twitter, Facebook, etc.

MIXED-MEDIA STRATEGY FOR DISSEMINATING MISINFORMATION OR DISINFORMATION ON DIFFERENT WEBSITES
(The image shows the mixed-media dissemination campaign for 'Towards a Renewed Imperialist Intervention in Libya? Anti-NATO Forces Retake Areas in Southern Libya', appearing on globalresearch.ca, workers.org, a web forum, Facebook, and Twitter.)
First we investigated the use of the mixed-media approach in disseminating stories. In this study, we encountered cases where an article was shared on different sites, as shown on page 40. For instance, a story titled 'Towards a Renewed Imperialist Intervention in Libya? Anti-NATO Forces Retake Areas in Southern Libya' was disseminated on multiple sites, i.e. facebook.com, oroom.org, twitter.com, globalresearch.ca, hotnews.ro, and workers.org.142

Next, we examined the cross-media information dissemination approach. This tactic was observed to good effect in our dataset. There were many sites that shared links to specific social media channels such as Twitter, Facebook, and Reddit. For instance, a blog site named 'globalresearch.ca' had a post entitled 'US Will Provide Weapons For NATO Commandos to Attack Ukrainian Separatists' with the link http://bit.ly/2ewVTg7. This post was shared on Twitter (http://bit.ly/2xEQxnU), Pinterest (http://bit.ly/2x02sQ0), and Facebook (http://bit.ly/2wrIhZD), as depicted below. This clearly indicates a cross-media pattern.

CROSS-MEDIA INFORMATION DISSEMINATION STRATEGY FOR DISSEMINATING MISINFORMATION OR DISINFORMATION ON SOCIAL MEDIA
(Screenshots of the story 'US Will Provide Weapons for NATO Commandos to Attack Ukrainian Separatists', hosted on globalresearch.ca and Infowars, as shared on Facebook, Pinterest, and Twitter.)
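Patterns of this kind can also be surfaced programmatically by extracting the outbound links and hashtags from each post and tallying the platforms they point to. The snippet below is a simplified, hypothetical helper; the platform list is illustrative.

```python
# Sketch: extract outbound links/hashtags from a blog post and tally the social
# media platforms they point to, to surface mixed-/cross-media patterns.
# Hypothetical helper; the platform list is illustrative.
import re
from collections import Counter
from urllib.parse import urlparse
import requests
from bs4 import BeautifulSoup

PLATFORMS = {"twitter.com", "facebook.com", "youtube.com", "pinterest.com",
             "reddit.com", "vk.com"}

def outbound_platforms(post_url):
    soup = BeautifulSoup(requests.get(post_url, timeout=30).text, "html.parser")
    domains = Counter()
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc.lower()
        host = host[4:] if host.startswith("www.") else host
        if host in PLATFORMS:
            domains[host] += 1
    hashtags = set(re.findall(r"#\w+", soup.get_text(" ")))
    return domains, hashtags

# Links radiating from one central blog post out to several platforms are
# consistent with the cross-media pattern (one hosting channel, broad
# redistribution); the same narrative reposted natively, with varying content,
# across many sites is closer to the mixed-media pattern.
```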
TRACKING HOW AN ANTAGONISTIC NARRATIVE TRAVELS

To analyze how a narrative travels, we examined the 'likes' and 'comments' features available on blogs. A higher number of retweets, shares, and comments at blog level shows that posts have been circulated widely, demonstrating that media integration strategies do help in disseminating the narratives. Readers can like the content and comment on the post. Note that the 'like' feature on the blogs embeds various social plugins from Twitter, Facebook, Reddit, etc. These social plugins allow readers to like the page simultaneously on the different social media platforms, thereby disseminating the content on a variety of platforms at once. For instance, a blog site, 21stcenturywire.com, published a blog post on September 18, 2016 entitled 'Syria: No "Dusty Boy" Outrage for 7 yr old Haider, Sniped by NATO Terrorists in Idlib Village of Foua'. This blog post received 65 comments in which the audience presented their views. Moreover, the article was shared on other social media channels such as Twitter, where it got 19 retweets, 5 likes, and 2 replies. The same post on Facebook got 6 reactions, 3 comments, and 2 shares. Also, many groups posted this article to disseminate it to an intended audience. The same blog, i.e. 21stcenturywire.com, published another blog post on September 27, 2016 entitled 'EU NATO Commit Adultery, Prince Charles Saudi Trade & More' that again presented factually incorrect information. As we did with the previous example, we tracked how this post was disseminated through different social media channels. This blog post, however, received no comments. The article was shared on Twitter, but it got only 1 retweet, 1 like, and no replies. The same post was also shared on Facebook, where it received 27 reactions, 1 comment, and 11 shares. But all the shares came from the same group, 21stcenturywire.com; no other Facebook group posted this article. Since few individuals or groups showed interest in spreading this information, it is clear that this article did not get much traction on blogs or on other social media platforms.

Next, we analyzed the effect the network of blogs has on content dissemination. Unlike social media platforms, blogs do not have a social network structure, i.e. there is no follower–followee relation among blogs. However, it is still possible to observe the information flow network in blogs based on who links to whom. More specifically, we examined the hyperlinks in the blogs to extract the blog network. We used this approach to extract the network of the blogs containing disinformation regarding the Baltic States. We used specific software to visualize the network,143 as depicted on page 43.
NETWORK* OF BLOGS AND SHARED HYPERLINKS

* The network contains 21 blogs (red nodes) and 2321 hyperlinks (blue nodes). The size of a node is proportional to the number of shared hyperlinks (i.e. out-degree centrality). Edge thickness is proportional to the number of times a blog shared a hyperlink.

The network contains 21 blogs (red nodes) and 2321 hyperlinks (blue nodes). Further analysis of the blog network helps in identifying 5 blogs out of 21 that were the most resourceful (having the most hyperlinks), as well as the most exclusive in resources (i.e. they shared hyperlinks that no other blogs shared). 10 out of 2321 hyperlinks were the most shared and most exclusively shared, i.e. these hyperlinks were shared by only a few blogs. Most of these top ten shared hyperlinks have a domain suffix from the Baltic nations, i.e. '.ee' for Estonia, '.lv' for Latvia, and '.lt' for Lithuania.

The exclusivity of resource sharing by a few blogs hints at information campaign coordination. To dig deeper, we constructed a blog network based on the commonly shared hyperlinks. The blog network thus identified is depicted on the next page. The network is fully connected, i.e. every blog connects to every other blog. This suggests that every blog in this set shared the same hyperlinks, confirming our conjecture that there is intensive campaign coordination among these blogs. Further investigation is required to know whether these blogs belong to, or are controlled by, the same individual or group.144
A NETWORK* OF BLOGS BASED ON COMMONLY SHARED HYPERLINKS

* The network is fully connected, i.e. a clique, where every blog is connected with every other blog. This depicts a massively coordinated information campaign.144
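Both networks described here can be reproduced with standard network-analysis tooling. The sketch below assumes the networkx library and a placeholder mapping of blogs to the hyperlinks they shared; it ranks blogs by out-degree (the 'most resourceful' blogs) and projects the bipartite graph onto a blog-to-blog network of commonly shared hyperlinks, testing whether that projection is a clique as in the figure.

```python
# Sketch: build the blog/hyperlink network and its blog-blog projection.
# Assumes networkx; 'blog_links' would come from the hyperlinks extracted
# from each blog's posts (placeholder data shown here).
import networkx as nx
from networkx.algorithms import bipartite

blog_links = {                       # placeholder: blog -> hyperlinks it shared
    "blogA": {"http://site1.ee/x", "http://site2.lv/y"},
    "blogB": {"http://site1.ee/x", "http://site3.lt/z"},
    "blogC": {"http://site1.ee/x", "http://site2.lv/y", "http://site3.lt/z"},
}

g = nx.DiGraph()
for blog, links in blog_links.items():
    for url in links:
        g.add_edge(blog, url)

# Most 'resourceful' blogs: highest out-degree (number of shared hyperlinks).
resourceful = sorted(blog_links, key=lambda b: g.out_degree(b), reverse=True)

# Blog-blog network based on commonly shared hyperlinks; a fully connected
# projection (a clique) is consistent with coordinated sharing.
projection = bipartite.weighted_projected_graph(nx.Graph(g), list(blog_links))
n = len(blog_links)
is_clique = projection.number_of_edges() == n * (n - 1) // 2
print(resourceful, is_clique)
```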

Next, we analyzed the role of blogs in providing a persuasive dimension to the narrative. We examined how 'exemplified accounts'145 in the user comments on a story may influence audience perceptions.146 We provide an example where commentary lends a persuasive dimension to the blog post.

On page 45 it is possible to see how exemplified accounts in users' comments on a post may influence audience perceptions.147 After reading through the comments, we can observe that some of the commenters' accounts help in developing a persuasive discourse. We observe that user comments actually augment the narrative presented in the blog post. We can see many users commenting on the post to further strengthen the narrative. At the same time, we can see patterns such as linking this content to other websites or pages (such as Facebook fan pages) and sharing it to other channels (50 shares) to further raise discussions.

AN EXAMPLE ILLUSTRATING HOW EXEMPLIFIED ACCOUNTS IN COMMENTS MAY SHAPE THE DISCOURSE TO FORM A PERSUASIVE DIMENSION

CONCLUSIONS

Blogs are becoming virtual town halls that are shaping public perceptions and narratives of regional events. Narratives are first framed on the blogs, then they are disseminated through other social media channels. The key findings include the identification of massively coordinated information campaigns among blogs by applying social network analysis concepts, and the demonstration that commentary on blogs lends a persuasive dimension to the discourse.

In our research, we highlighted the role that blogs can have in weaponizing narratives and conducting disinformation campaigns, suggesting that action be taken towards developing countermeasures. The major contributions of this chapter include: assessment of guidelines for detecting blogs containing misinformation or disinformation; tracking the origins of the content on blogs, such as memes, images, and videos; evaluating mixed-media and cross-media narrative dissemination strategies; tracking how the narratives originating in blogs travel in the social media ecosystem; and analyzing campaign coordination from blog networks. We studied four different blog datasets consisting of 372 blog sites, 7,576 bloggers, and 196,940 blog posts riddled with misleading or false information. Social network analysis of the blog network revealed the most resourceful blogs and the blogs that were most exclusive in sharing resources. Furthermore, a massive misinformation coordination campaign was discovered.

Acknowledgements
This research is funded in part by
the U.S. National Science Foundation
(IIS-1636933, ACI-1429160, and IIS-
1110868), U.S. Office of Naval Research
(N00014-10-1-0091, N00014-14-1-0489,
N00014-15-P-1187, N00014-16-1-2016,
N00014-16-1-2412, N00014-17-1-2605,
N00014-17-1-2675), U.S. Air Force
Research Lab, U.S. Army Research
Office (W911NF-16-1-0189), U.S.
Defense Advanced Research Projects
Agency (W31P4Q-17-C-0059) and the
Jerry L. Maulden/Entergy Fund at the
University of Arkansas at Little Rock.
Any opinions, findings, conclusions,
or recommendations expressed in this
material are those of the authors and do
not necessarily reflect the views of the
funding organizations. The researchers
gratefully acknowledge the support.

3
THIRD-PARTY SERVICES AND FAKE NEWS MEDIA SITES

Nora Biteniece

This chapter discusses how user data collected by third-party services coupled with online advertising technologies can be exploited for targeted information activities. The chapter also presents findings from our study of online news sources mentioned in a discussion regarding the NATO presence in the Baltic States and Poland on Twitter. The discussion is referred to as the Enhanced Forward Presence (eFP) discussion, using NATO's name for its defence and deterrence posture in Eastern Europe.
We identified 933 unique news sources linked from Tweets mentioning eFP; 43% of these sources can be classed as fake news media sites. We also observed the systematic use of particular third-party services across the fake news sites in our data set. Two of these services raised concerns, as they loaded a variety of opaque third-party services, set unreasonable cookie expiration dates, and had been associated with malicious behaviour in the past, such as creating spam links. We also identified three social media third-party services, present on several fake news media sites, that load additional advertising and site analytics services, potentially allowing these parties to tie visitors to specific online personas.

User behaviour online, such as visiting websites, reading articles, watching videos, searching for keywords, and 'sharing' and 'liking' content on social media, can reveal a lot about the user. This insight has been effectively used by online advertisers to target specific consumer groups with relevant advertisements. The ability to target specific groups is the main goal of online advertising systems, and ad-providers are willing to pay for these services. Hence, the importance of online advertising services continues to grow, as do their revenues. In 2013, for instance, companies paid $42.8 billion to US online advertising services.148 As online advertising has grown, there has been a corresponding rise in exploitation of the advertising ecosystem by cybercriminals seeking to locate victims. According to the online security firm Symantec, more than half of website publishers have suffered a malware attack through advertisements.149 This is just one of the ways in which online advertising technologies are exploited. In September 2017 an article was published on the Facebook Newsroom website reporting on geographically targeted advertisements purchased by inauthentic accounts and pages that originated in Russia.150 These 'ads and accounts appeared to focus on amplifying divisive social and political messages', including LGBT matters, race issues, immigration, and gun rights.151 This suggests that user data coupled with online advertising technologies can be used for targeted information activities. To gain insight into how online advertising enables actors to target individuals or groups, it is necessary to understand two processes—how user data is collected, and how online advertisements are delivered using these data. We will begin by describing the mechanisms used to collect user data online, followed by a brief overview of behavioural advertising technologies and the vulnerabilities of the entire ecosystem. The second section will present findings from our own study of online news sources mentioned in the eFP discussion on Twitter. In conclusion we will discuss the implications of our findings.

BACKGROUND

Behavioural Tracking and Profiling

The purpose of collecting online behavioural data is to track users over time and build profiles containing information about their characteristics (such as gender, age, and ethnicity), interests, and shopping activities.152 This is known as behavioural tracking and profiling, and it has been effectively used in online advertising. Companies use behavioural data to display advertisements that closely reflect users' interests.153 User behavioural tracking and profiling occur across three of the most popular Internet services, i.e. websites, location-based services, and social media sites.154 Each service has different tracking mechanisms. For example, social media platforms are designed to track the content accessed by users, what they 'like' and 'share', and what they engage with. This is achieved by requiring all users to create a personal profile and providing platform features such as creating a post, sharing an article, liking content, etc.155

Web tracking,156 however, is mainly performed by monitoring IP addresses, and by using cookies, Javascripts,157 and supercookies.158

Cookies are small text files that web servers can set and read from a user's browser. When a user navigates to a particular website for the first time, the website may call a script to set a cookie, containing a unique ID, on the user's machine. The browser will attach the cookie to all subsequent communication between the client and the web server until the cookie expires, is reset by the server, or is deleted by the user. The most basic function of a cookie is to identify a device, and by extension the unique visitors to a website. Cookies help websites to provide services such as visitor counters for website owners, customized web pages, and anti-fraud provisions. Note that cookies are sent only to the websites that set them or to servers in the same domain. However, a website might host content, e.g. images, links, or IFrames,159 stored on servers in other domains.160 Cookies that are set during the retrieval of this content are third-party cookies, whereas first-party cookies are set by the website that the user is actually accessing. To illustrate this, let us say that an internet user navigates to a website that also loads advertisements from a third-party server. Because the third-party server has established a connection with the user's computer, it is also able to set a cookie containing an ID unique to that user's machine.
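A minimal sketch of this mechanism, assuming the Flask framework, is shown below. On the first request the server mints a unique ID and stores it in a cookie; on every later request the browser returns the cookie, which is what lets a first- or third-party server recognize the same machine. This is an illustration of the concept, not the code of any particular tracker.

```python
# Minimal illustration of cookie-based visitor identification (assumes Flask).
# A third-party tracker works the same way, except that this endpoint would be
# loaded by many different websites (e.g. via an embedded script or IFrame).
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)
visits = {}                                  # cookie ID -> pages seen (in-memory sketch)

@app.route("/pixel")
def pixel():
    uid = request.cookies.get("uid")
    if uid is None:
        uid = uuid.uuid4().hex               # first visit: mint a unique ID
    visits.setdefault(uid, []).append(request.args.get("page", ""))
    resp = make_response("", 204)
    # Long-lived cookie; on every later request the browser attaches it,
    # so the server can link the visits into one behavioural profile.
    resp.set_cookie("uid", uid, max_age=60 * 60 * 24 * 365)
    return resp

# Running this app and embedding <img src="https://tracker.example/pixel?page=...">
# on several sites would let the operator of tracker.example tie those visits together.
```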
Companies use cookie technology to track user activities online. The collected data are stored with records of all the websites the user has visited in the previous minutes, months, and even years. This information can be augmented with contextual data provided by websites (the content of the website) and/or by data from large data brokers.161 Often, companies deepen their connection with users by planting cookies on several sites to gather additional information regarding online behaviour. The more data they collect from different websites about a particular user, the better the inferences they can draw. Let us say that a user visits a cooking website; the company can read the content of the website and infer that the user is interested in cooking. It can cross-reference this information with any other website visited by that user detected through its cookie network, for instance a visit to an e-commerce website for gluten-free products. Knowing this, the company can target that user for gluten-free product advertisements. Adjusting advertisements for each user based on previous online activity is known as behavioural targeting, and is enabled by current online advertising technologies.

Behavioural Targeting

Online advertising systems are typically composed of three main entities: the ad-provider, the ad-publisher, and the ad-network.162 The ad-provider is the entity wishing to advertise its product or service; the ad-publisher is the website that hosts/displays advertisements; and ad-networks are companies that aggregate available ad space across a large collection of publishers, code their inventory, and sell it on to ad-providers. In the process of coding the available ad spaces, ad-networks segment their audience based on the online behavioural and contextual data they have collected, and any inferences that can be drawn, thus allowing ad-providers to carry out both contextual advertising and behavioural targeting for various audience segments.163

To illustrate how an online advertising system might work, let us say that a user visits a website/ad-publisher that uses ad networks; the ad-publisher instructs the user's browser to contact the network. The ad-network, in turn, retrieves whatever user-cookie-identifiers it can. Using those identifiers, the ad-network can access its own database to see what other information about the user's history it has in order to identify that user's interests and demographic information. The ad-network can then decide which advertisements to display for that particular user.164 Although the ad-network decides which advertisements should be displayed, it often does not deliver the actual advertisements. Instead, the ad-network instructs the user's browser to contact the actual ad-provider's server (see Figure above).
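The selection step in that exchange can be sketched in a few lines: given the cookie identifier supplied by the browser, the ad-network looks up whatever profile it has accumulated and returns the address of an ad-provider creative matching an inferred interest. The profiles, segments, and URLs below are made-up placeholders, not real data or a real ad-serving system.

```python
# Schematic sketch of an ad-network's selection step (illustrative only).
# Profiles and the ad inventory are placeholders, not real data.

profiles = {                # cookie ID -> interests inferred from tracked browsing
    "a1b2c3": {"cooking", "gluten-free"},
    "d4e5f6": {"sports"},
}
inventory = {               # interest segment -> ad-provider creative URL
    "gluten-free": "https://ads.provider-a.example/creative/glutenfree.js",
    "sports":      "https://ads.provider-b.example/creative/sports.js",
    "default":     "https://ads.provider-c.example/creative/generic.js",
}

def choose_ad(cookie_id):
    """Pick the ad-provider URL the user's browser will be told to fetch."""
    interests = profiles.get(cookie_id, set())
    for segment, url in inventory.items():
        if segment in interests:
            return url              # behavioural targeting: match a known interest
    return inventory["default"]     # fall back to a contextual/generic creative

print(choose_ad("a1b2c3"))          # -> the gluten-free creative
```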
Note that ad-networks have a built-in opportunity to plant cookies every time they deliver an ad; thus, their cookie network is as large as the pool of sites for which they service ads. This is because the host website's server must contact the ad-network every time it needs an ad. In the next section, we will describe the vulnerabilities of the online advertising ecosystem and the problems with behavioural targeting.

Vulnerabilities

The online advertising ecosystem assumes that each entity (ad-provider, publisher, ad-network), when given a connection to a user's machine, will not compromise that machine and, when gathering data, will gather only what it needs, store it safely, and use it to enhance the user's web experience. In reality, the ecosystem is very complex and each layer is vulnerable to malicious exploitation.

First, a publisher itself may be a phishing site—a website that looks similar to genuine companies or financial services, but is set up to mislead users into entering important details such as their usernames and passwords. This may be done for the purpose of stealing user data and/or committing fraud. Second, the advertisements delivered through ad networks are not under the control of the publisher; this means that it is not the users who decide which entity is allowed to connect with their machine and which is not. Third, online advertisements can deliver files and entire programs to a user even if the advertisement itself appears to be just an image. This means that ad-providers are able to transmit advertisements with embedded executable scripts—a key vulnerability:165 such scripts would be able to download malware onto the user's computer without any clicks or other actions being taken by the user. Ad networks usually perform some kind of quality control on the advertisements they service; however, the actual file at a given URL can be changed after the initial quality control check has taken place.166 In addition, an advertisement passes through several networks before it actually reaches the user's browser. Each time it passes through another network, there is an opportunity for the introduction of malware.

Moreover, behavioural targeting enabled by user data collection and the ad delivery systems can be used to take advantage of vulnerable users. For example, information about a user's health, financial condition, or age can be inferred from online tracking and used to target that person for payday loans, sub-prime mortgages, or bogus health cures.167 Users' behavioural profiles can be used to offer certain customers products at a higher cost or deny them access to goods altogether ('online redlining').168 In the absence of clear privacy laws and security standards, these behavioural profiles leave users vulnerable to identity theft and information activities. Our study did not focus on detecting malicious ads or identifying phishing sites. Instead we looked at the risks of online advertising technology being used for targeted information activities. The data collection that makes online advertising possible allows advertisers and other entities to target and possibly influence specific user or audience segments.
A striking example is the case of Facebook carrying ads from linked and inauthentic accounts and pages originating in Russia and essentially targeting socially divisive messages at the US public.169

We set out to understand whether fake news media sites mentioned in the eFP discussion on Twitter use the described technology to collect data that would enable them to target and possibly influence individual users and user groups. Although we do not know if the collected data is used for targeted information activities, we demonstrate that the information available could be used for that purpose. It is not within the scope of this chapter, or of our study, to detect targeted information activity or attribute such activity to anyone. The next section will present findings from our study.

NEWS SOURCES AND THIRD-PARTY SERVICES

We examined 933 online news sites mentioned in the eFP discussion on Twitter. We found 588 unique third parties that receive data about visitors to these sites. 43.1% of the websites in our dataset were classified as fake news media sites, and 71 of the identified third parties were found mostly on these sites.170 The observed third-party services included ad networks, web analytics, and social media services. When examining the use of third-party services by legitimate and fake media sites, we observed:

● Both classes of news media sites share popular third-party services such as Google Analytics, Double Click, Google Adsense, and Facebook Connect;

● The legitimate news media, however, seems to use a greater variety and number of third-party services (See Figure on page 53).
The Y axis in Figure on page 53 shows the total number of times a third-party service appeared on any website in our dataset; the X axis shows the calculated Fakeness Index171 for each third-party service. We then developed a network graph (see Figure below) to examine which services with a fakeness index of 0.9 or above were shared across fake news media sites. The individual nodes are websites classed as fake; the edges between them are third-party services. If there is an edge connecting two nodes, a particular third-party service is present on both of these websites.
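The two measures can be illustrated with a short Python sketch using the networkx library. The site and service names below are invented placeholders, and the index formula (the share of a service’s host sites that are classed as fake) is our reading of the description above, not the study’s own code.

```python
# Hedged sketch: a per-service fakeness index and the resulting site graph.
# Input data is invented; the index formula is an assumption on our part.
import itertools
import networkx as nx

# website -> (classified_as_fake, third-party services observed on it)
SITES = {
    "site-a.example": (True,  {"svc_ads", "svc_x"}),
    "site-b.example": (True,  {"svc_ads", "svc_y"}),
    "site-c.example": (True,  {"svc_x"}),
    "site-d.example": (False, {"svc_ads", "svc_analytics"}),
}

def fakeness_index(service: str) -> float:
    """Share of the sites hosting `service` that are classed as fake."""
    hosts = [is_fake for is_fake, services in SITES.values() if service in services]
    return sum(hosts) / len(hosts) if hosts else 0.0

# Services found (almost) exclusively on fake sites
suspect = {s for _, services in SITES.values() for s in services
           if fakeness_index(s) >= 0.9}

# Nodes are fake sites; an edge means the two sites share a suspect service
graph = nx.Graph()
for service in suspect:
    fake_hosts = [site for site, (is_fake, services) in SITES.items()
                  if is_fake and service in services]
    for a, b in itertools.combinations(fake_hosts, 2):
        graph.add_edge(a, b, service=service)

print(sorted(suspect))                    # e.g. ['svc_x', 'svc_y']
print(list(graph.edges(data="service")))  # shared-service edges between fake sites
```

Clusters in such a graph then correspond to groups of fake sites built around the same third-party service.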

In Figure below you can see two distinct clusters of sites; the larger one consists of sites that all use the Google Adsense Asynchronous service, the smaller one consists of sites that use the Artificial Computation Intelligence service. The four much smaller clusters are sites that use MarketGid, i.ua, Whos.amung.us, and Clicky services respectively. We discuss some of these services in more detail in the next section.

When examining the patterns in the data collected by both fake and legitimate news media outlets, we observed three distinct categories of data being collected:
● Anonymous (analytics, user agent details, cookie data, date/time, ad views, etc.)

● Pseudonymous (device ID, search history, IP address, and other location-based data)

● Person-identifiable information or PII (address, email address, name, login, phone number)

We found that there is no significant difference in the pattern of data collected by services present on mostly legitimate and mostly fake news media sites. However, a number of services across the fake and legitimate news media sites collect IP addresses (Pseudonymous), addresses, names, and email addresses. Collecting this information allows third parties to later target specific users via means that go beyond online advertising, such as e-mails and IP addresses.172 Thus, both legitimate and fake news media sites in our dataset collect data that enable them to target specific users and/or audience segments.

Investigating the services shared across mostly fake news media sites further, we found that the services were legitimate: Google Adsense Asynchronous is a well-known ad network, while Clicky, i.ua, and whos.amung.us are legitimate web analytics services. The criteria we used included transparency of ownership, privacy policy, cookie expiration date, Web Of Trust (WOT)173 rating, and the additional services required to load a website. To test the latter, we bought a domain name and hosting, embedded service scripts, accessed the page, and examined the network traffic. Two more suspicious third-party services are discussed below.
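One lightweight way to reproduce this kind of inspection, not necessarily the tooling used in the study, is to record the page load as an HTTP Archive (HAR) file from the browser’s developer tools and then list the third-party hosts that were contacted. In the Python sketch below the file name and first-party domain are placeholders.

```python
# Sketch: list third-party hosts contacted while loading a page, read from a
# HAR capture exported via browser developer tools. Names are placeholders.
import json
from urllib.parse import urlparse

FIRST_PARTY = "our-test-site.example"

with open("capture.har", encoding="utf-8") as f:
    har = json.load(f)

third_party_hosts = set()
for entry in har["log"]["entries"]:
    host = urlparse(entry["request"]["url"]).hostname or ""
    if host and not host.endswith(FIRST_PARTY):
        third_party_hosts.add(host)

print(f"{len(third_party_hosts)} third-party hosts contacted:")
for host in sorted(third_party_hosts):
    print(" ", host)
```

Comparing such lists over time also makes it easy to spot when an embedded script starts pulling in additional, previously unseen services.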
Artificial Computation Intelligence Service

As already mentioned, we found that the third-party service Artificial Computation Intelligence (called through loading a JavaScript file—acint.js) was used across several fake news media sites. When investigating the company behind the service, we found that it claims to be a web analytics service for the largest RuNet websites, and is supposedly collecting user IP addresses, operating system information, browser details, and the number of visits. However, according to WOT it produces spam and malware links to .ru domains, and when placing acint.js on our website, we observed that it loaded 13 other third-party services to our site.174 In addition, it attempted to load a resource from stat.sputnik.ru (see the figure panels below).

Another third-party service that Artificial Computation Intelligence loads comes from the sape.ru domain. Sape.ru itself is a web analytics and backlink service;175 however, in the past it has produced unwanted links and injected the acint.js script176 on websites where its service was installed. This suggests two things: first, the Artificial Computation Intelligence and Sape services are related. Second, Artificial Computation Intelligence uses sape.ru as a proxy to enroll websites into their ad framework without web developer/admin consent.

[Figure panels: third-party resources loaded by our test site, observed IN MAY and IN AUGUST]

MarketGid

MarketGid was also used across several fake news media sites in the form of i.js, a Javascript file. Investigating MarketGid further, we found that it is an ad network; however, according to WOT it produces spam and malware links. When placing MarketGid on our website, we observed that it loaded two other cookies,177 from a targeted advertising company and a news agency respectively. All three service cookies are set to expire in 2038, which is, according to EU privacy policy, an unreasonable cookie expiration date. This also means they could collect data about the user for that entire period of time unless the individual cookies are deleted.
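Cookie lifetimes of this kind can be checked directly from HTTP responses. The minimal Python sketch below uses the requests library; the URL is a placeholder, and it only sees cookies set in the immediate response, so cookies planted later by embedded scripts would still require a full browser capture.

```python
# Sketch: inspect the expiry dates of cookies set by a page.
# The URL is a placeholder for a page carrying the widget under test.
from datetime import datetime, timezone
import requests

resp = requests.get("https://example.com/page-with-widget", timeout=15)

for cookie in resp.cookies:
    if cookie.expires:  # Unix timestamp for persistent cookies
        expiry = datetime.fromtimestamp(cookie.expires, tz=timezone.utc)
        print(f"{cookie.domain} {cookie.name}: expires {expiry:%Y-%m-%d}")
    else:
        print(f"{cookie.domain} {cookie.name}: session cookie")
```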
Social Media Third-Party Services

As mentioned before, we found several social media third-party services in our dataset. These services in most cases facilitated interactions from external sources to a social media platform, e.g. liking, sharing, commenting, etc., and were used by both legitimate and fake news media sites. By enabling a website visitor to like or interact with the content on social media, websites and services are able to tie the visitor to a specific online persona. This online persona will have a lot more information associated with it than just browser details, IP addresses, or referral data.

Most of the social media third-party services identified in our data set were provided by the social media companies themselves; however, several integrated widgets178 from different platforms supported interactions across a variety of social media services. Companies that provide such widgets free of charge most likely monetize their use by collecting user data. AddThis, for example, has profiles for 1.9 billion people. This suggests that the core business for AddThis does not lie in providing free social media widgets, but rather in selling user profile data to third parties.
To examine whether the third-party services identified in our data set expose any social-media-related user data, we bought a domain name and hosting, and embedded the third-party services used on fake news media sites only. The user data would have to be in the HTTP header when interacting with the third-party service for website owners like us to access it. See table below for our observations.179

Widget | Supported Functionality | Comments
Facebook Connect | Authorization through Facebook | Retrieves a browser cookie with users’ Facebook IDs
Facebook Social Graph | Querying Facebook’s Social Graph179 |
Facebook Social Plugins | Like/Share/Comment on Facebook | Loads Facebook Connect and Facebook Impressions
LinkedIn Widgets | Post on LinkedIn |
Lockerz Share | Share content across social media platforms of choice |
Pinterest | |
Pluso | Share content across social media platforms of choice | Loads several advertising services (adapt.tv, advertising.com, DoubleClick, Eyeota, FACETz, rutarget, Vi)
Reddit | Post on Reddit |
Share42 | Share across social media platform of choice |
Tumblr Buttons | Post on Tumblr | Loads Cedexis radar, Google Analytics, and ScoreCard Research Beacon
Twitter Button | Tweet on Twitter | Loads Twitter Syndication
UpToLike | Like on social media platform of choice | Loads Yandex.Metrics and Mail.ru Group
VKontakte Widgets | Like/Share/Comment on VK | When logged in on VK, it also loads Mail.ru group and Top Mail
In short, when calling a social media service from a website, no user data is passed or exposed during this communication. Instead, the social media service handles the subsequent steps of this request, i.e. retrieving the user ID from cookies present in the browser or redirecting to a pop-up window with a login screen. Although some of the information is visible from the developer tools in Google Chrome (Facebook user ID in a cookie), it is not accessible to the website owner. As explained in the previous sections, cookies can be read by the domain from which they originate. Thus, for a website to read the Facebook cookie containing a user ID, it would have to be from the same domain as Facebook.

In addition, we observed that four of the identified third-party services loaded other third-party services (advertising and website analytics services). This again increases the number of third parties that record user browsing habits, allowing them to cross-tabulate this information with their other records and infer more about the user. However, as already mentioned, allowing users to share content from a website on their social media profiles also enables website owners to ‘backtrack’ the users who shared their content, as well as the platform they shared it on. For example, by navigating to facebook.com/search.php, pasting a link, and then clicking on the option ‘Posts by everyone’, a list of Facebook users who have shared that particular link will be displayed. There are even services that aggregate this information across the different social media platforms.180 One can also backtrack people who liked or commented on a social media page through those social media APIs that allow page owners to query the list of users who liked or commented on their page.

In 2009 Krishnamurthy and Wills identified several ways in which social media sites leak person-identifiable information to third-party services.181 They observed that information that could lead to a user profile (user name, user ID, or email address) was leaked through the ‘referrer’ and ‘request’ URL fields in the HTTP header when accessing external content from various social media sites (MySpace, Facebook, Twitter, LiveJournal, LinkedIn, Hi5, Imeem, Orkut, and Xanga). Using Krishnamurthy and Wills’ methodology, we also examined whether navigating to an article linked from Facebook, VK, Twitter, LinkedIn, Tumblr, and Reddit leaks any social media user information. We observed no user information in the requested URL for any of the social media sites we looked at. The referrer URL for Facebook and Tumblr was the respective social network domain (facebook.com and tumblr.com); for Twitter it is a shortened URL to the article; for Reddit it is the full URL to the article; for LinkedIn there is no such field;182 and for VK it is a URL that links to the reader’s profile (or the login page if they do not have a profile). Since 2009 there has been a tremendous improvement in user data privacy when interacting with external content on social media sites. However, in some cases (Facebook, Tumblr, VK, and Reddit) it is still possible for websites to track which social networks a particular visitor uses.
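On the receiving side, such a check is trivial for a website owner to implement. The Python sketch below shows one illustrative way a server could map the Referer header of an incoming request to the social network a visitor came from; the domain list is only a small example set chosen by us.

```python
# Illustrative sketch: infer the social network a visitor came from by
# inspecting the Referer header. The domain list is a small example set.
from typing import Optional
from urllib.parse import urlparse

SOCIAL_DOMAINS = {
    "facebook.com": "Facebook",
    "t.co": "Twitter",
    "vk.com": "VK",
    "tumblr.com": "Tumblr",
    "reddit.com": "Reddit",
}

def platform_from_referrer(referrer: Optional[str]) -> Optional[str]:
    """Return the platform implied by a Referer header, if any."""
    if not referrer:
        return None
    host = (urlparse(referrer).hostname or "").lower()
    for domain, platform in SOCIAL_DOMAINS.items():
        if host == domain or host.endswith("." + domain):
            return platform
    return None

print(platform_from_referrer("https://www.facebook.com/"))  # Facebook
print(platform_from_referrer("https://t.co/AbCdEf"))        # Twitter
print(platform_from_referrer("https://news.example.com/"))  # None
```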
CONCLUSIONS AND IMPLICATIONS

On social media, with everything packaged as URLs linking to external sites, new and unpoliced parts of the internet are visited. Consequently, the way people get their news has also changed. A recent study has shown that 62% of US citizens get their news through social media sites.183 This, however, has lowered the barrier of access to non-traditional, possibly untrustworthy, news media. We also saw this in our study on the eFP discussion on Twitter, where 43.1% of the linked news sites were fake news media sites. When examining the third-party services on news sites in our dataset, we observed that both legitimate and fake news media sites use social media services to provide additional functionality such as ‘liking’ or ‘sharing’ on a platform. This has several implications:

● The external sources can track which social media platforms their visitors use through the referer field in the HTTP header.

● The external sources can backtrack their own content shared on social media platforms, together with information about any user who shares it. This allows third parties, or anyone who utilizes this data from third parties, to target specific individuals on social media sites.

● Companies that provide social media widgets free of charge most likely monetize their use by collecting user data and selling it to third parties.

In addition, several of the identified social media third-party services loaded other web analytics or advertising services. This raises some concerns, since it allows additional third parties to collect information on visitors solely on the grounds that they shared an article on their Facebook profile, for example.

When examining other third-party services present mostly on fake news media sites, we observed the systematic use of Artificial Computation Intelligence and MarketGid. These services load content from several other opaque third-party services, enabling them to place cookies on users’ machines and obtain data such as IP addresses, user agents, and the sites they visited. As explained in the first part of this chapter, these data and any information inferred from them can be employed to target user groups based on interests, demographics, and/or geographical location. Moreover, because of the widespread cross-interaction between websites and social media sites, third parties present on these fake news media sites (or their web admins for that matter) can tie a visitor to a specific online persona, and thus target them individually and with a lot more insight. In addition, in the past both of these services have been associated with malicious behaviour such as creating spam links and injecting Javascripts. This suggests that Artificial Computation Intelligence and MarketGid act as proxies to spread spam and malware and to plant cookies from other third parties, enabling these parties to collect user data without consent.
CONCLUSIONS AND RECOMMENDATIONS
This publication highlights how false information online brings about a number of security implications. We likened false information to the Lernaean Hydra, the mythical creature that could generate two new heads for each head it lost to the axe. According to the myth, Heracles slayed it by thinking outside the box, burning the stumps of the severed heads, and smashing the only true mortal head the monster had with a rock. Analogously, anyone who is battling disinformation online must think beyond simply debunking single stories.

Social media platforms are popular because they cater to the basic human need for building and maintaining social interactions. It is for this reason that new media are extremely valuable for Strategic Communications, and can be dangerous vehicles for disinformation. Today’s disinformation shows continuity with the past at the strategic level, and discontinuity at the tactical level. The contemporary media landscape is characterized by informality and reciprocity. As the relevance of the traditional gatekeepers of information, print media for example, is fading, the online environment is becoming less regulated than its offline counterpart.

Contemporary disinformation is more quantitative than qualitative. The majority of false stories shared on social media are rudimentary, and in some cases so improbable that authorities are reluctant to even address them. Yet, these stories can have strategic-level effects on public discourse.

Social media platforms offer unprecedented levels of sophistication to malicious actors who aim at influencing a political conversation through the use of false or misleading information. Social media users often trust the online information environment more than traditional media. This is due to the structure of the platforms: information comes from friends, acquaintances, and sources that resonate with the user’s beliefs and values. Given these circumstances, information is rarely evaluated critically. The cognitive biases we all fall into from time to time are what enable malicious actors to manipulate online audiences, but technological innovations make it easier for them to exploit these mechanisms.

There is wide scope for capitalizing on the social media environment to fight disinformation. Social media can generate informational bubbles, but can also pierce them.

Chapter 2 highlights how different social media providers cater to different world regions. The Russian-language internet is, in many respects, a galaxy of its own. Russian-made social media platforms are qualitatively different from their Western counterparts, and can be used more effectively in disinformation campaigns. Western analysts should familiarize themselves with these platforms. This will enable them to understand the narratives that are being pushed through these channels and, potentially, interact with them. The platforms that are popular in Arabic-speaking regions are mostly those that are common in the West, but the stories being shared reflect the different social and political issues affecting the region.

The discussion of blogs in Chapter 3 proves that social media is a channel for the dissemination of narratives, rather than the place where they originate. False stories often originate in blogs and are shared on social media only at a later stage. Disinformation campaigns coordinate the activity of several channels in order to reach the largest audience possible. Blogs are among the most important environments where narratives are crafted and propagated. Aside from the blog post itself, the comments below the post reinforce the persuasiveness of the narrative.

Buying and selling user profile data has become big business. The discussion of user data collection in Chapter 4 demonstrates that this new reality brings about significant security implications. External actors can monitor the content users post to social media platforms, together with information about the users who share it, paving the way for tailored messaging—specific groups, even specific individuals, can be targeted on social media with political content designed specifically for them. Several firms are engaged in the analysis of social media audiences. These services are used by for-profit companies, political adversaries, and, potentially, malicious actors aiming at influencing selected audiences.

RECOMMENDATIONS

The chapters highlighted a number of common themes that cut across the topics. These common themes are: data awareness, channel identification, dialogue with the social media industry, and regulation. The recommendations below address these themes.

Data awareness

As the means to collect user data grow in sophistication, users are more and more vulnerable to this kind of activity. Users should be aware of these risks. This is particularly true for those social media users whose work is of a delicate nature, i.e. military/security personnel and civil servants.

Moreover, we must keep in mind that algorithms can discover attributes not explicitly expressed by the user.184 Despite our efforts, malicious actors can still find ways to use the data we leave behind to target us with tailored messaging that is more likely to influence our behaviour. Understanding this is an important part of data awareness.

The general public needs to be educated on how their online behaviour is being tracked and how this information can be used. There have been a number of efforts in this direction, mostly by citizen-journalists and browser-extension developers.
Channel identification

As the Lernaean Hydra had a single mortal head, so contemporary disinformation campaigns waged on multiple channels have a single ‘backbone’. Detecting this backbone helps us understand the context in which a specific group of false stories has originated, and is, therefore, a fundamental step towards assessing whether a specific case should be considered misinformation or disinformation. Western analysts must leave their comfort zones and explore channels they are unfamiliar with. This means those platforms that are distant from their socio-cultural context, be it because of geography or language (as is the case with Russian- or Arabic-language social media) or because they cater to different demographics (as is the case with emerging platforms targeting younger audiences).

False information does not exist in a vacuum; it needs a context and a medium. Different audiences have different interests and are active in different virtual spaces. Malicious actors know this, and adapt their messaging campaigns to the audiences they want to target.

Dialogue with the industry

The use of false information for malicious purposes can be likened to traffic violations. While responsibility for misbehaviour rests solely on the drivers, highway authorities can help the police in making roads safer. The same is true for social media: those who are most knowledgeable about the platforms’ vulnerabilities are the social media companies themselves. For this reason governments (and, in particular, the security sector) should dialogue with social media companies.

Social media companies need this dialogue as much as governments do. They have been facing considerable criticism over the use of their products in spreading misinformation, and have responded with in-house solutions, as outlined in Chapter 2. However, in order for countermeasures to be effective, relevant authorities should be involved, so actions can be based on the exchange of relevant information. Some steps in this direction are already coming from the industry, as demonstrated by Facebook’s self-accusation regarding Russian interference in the 2016 US elections.185 It is in the companies’ self-interest to collaborate with authorities on these matters, as users are likely to respond positively to actions aimed at sanitizing the social media environment whilst protecting their privacy.

Browser providers should assume a more active role in educating their users about behavioural tracking online. Information about what kinds of user data are being collected and by whom should be a standard part of browser functionality. Apple Inc., for example, has restricted several tracking mechanisms in their newest Safari browser.186 However, this does not solve the problem of users being unaware that their data is being collected and what it will likely be used for.
Social media companies should be encouraged to tighten their data sharing policies. Targeting people on social media is so easy and effective because social media companies have gathered a considerable amount of insight on social media users, their interests, and their attitudes. They provide the mechanisms for targeting. After significant ad sales to a network of inauthentic accounts and pages that disseminated socially divisive messages, Facebook has made their ad review process more rigorous. However, much more can be done to curb access to technologies that enable third parties to tap into the information Facebook has on users.

Regulation

Regulation is intended to prevent the suppression of uncomfortable voices by authoritarian regimes. It is in the users’ interest that the virtual spaces where they voice their opinions are kept safe so that they can be truly free. This entails deterring abusive behaviour online, protecting users’ privacy, and limiting the intentional spread of false information. Individually, false or misleading stories are easy to falsify, and even easier to create. More work and creative solutions are needed in order to tackle the root causes that make it so cheap to spread misinformation and disinformation.

An area that deserves particular attention is the protection of personal data. Some companies are already self-regulating to support user privacy. However, self-regulation can achieve only limited results; systemic regulation must come from the institutions. In May 2018, the EU will enforce a new regulation regarding user data protection,187 which aims to give ownership of personal data back to the users through several key requirements. The companies collecting data on EU citizens, regardless of where the company is registered or where it stores its data, will have to abide by the new regulations. Every user will have the right to be forgotten or for their data to be moved to another data controller. It should be clear to the user who is collecting their data and for what reason, as well as how to opt out of the data collection process.

These new regulations will be a significant improvement in the protection of user data and user privacy. However, enforcing the regulations must be combined with efforts to educate the general public on user data collection and their rights to own their own data. Moreover, because the entire online tracking process is opaque, the new regulations will still only affect the companies that interact directly with the user. The largest data brokers still collect user data in the background and, in most cases, without the knowledge of the user.
GLOSSARY

The entries presented here are intended to help the reader understand the key terms
that are discussed throughout the research product. This unofficial terminology,
updated as of 1st November 2017, is aimed at serving further research. The list is
inclusive, i.e. it includes terms that are not used in the study, but are central to the
discussion. Moreover, some of the entries were not intended to be descriptions in
the original context: when this is the case, the “comments” section points it out.
While the list is inclusive, only one definition is given for each term, in order to
keep this glossary simple and easy to use.

Audience
Definition: An individual or group that witnesses an event or information conveyed through social, audiovisual, or printed media.
Source: AJP 3.10 Allied Joint Doctrine for Information Operations

Blog
Definition: Websites where information is posted on a regular basis. Content varies widely, from personal diary-type minutiae to sustained discussion of politics, hobbies or other interests. Some blogs are a “grab bag” of topics, while others focus on a particular subject.
Source: PAO Handbook 2014

Blog client
Definition: Software to manage (post, edit) blogs from the operating system with no need to launch a web browser. A typical blog client has an editor, a spell-checker and a few more options that simplify content creation and editing.
Source: PAO Handbook 2014

Blogger
Definition: Person who runs a blog. Also blogger.com, a popular free website for blog hosting.
Source: PAO Handbook 2014

Counter-propaganda
Definition: A multidiscipline effort led and coordinated by the Info Ops function to analyse an adversary’s information activities, its source content, intended audience, media selection, and effectiveness.
Source: MC 402/2 NATO Military Policy on Psychological Operations

Disinformation
Definition: Dissemination of false information with the deliberate intent to deceive or mislead.
Source: Oxford Dictionary of Media and Communication

Echo Chamber
Definition: An ideological environment in which ideas and opinions are amplified and reinforced by their repetition, creating a mainstreaming effect of like-mindedness.
Source: Oxford Dictionary of Media and Communication
Comment: Akin to the concept of filter bubble (see definition).

Fake News
Definition: News articles that are intentionally and verifiably false, and could mislead readers.
Source: H. Allcott & M. Gentzkow (2017) “Social Media and Fake News in the 2016 Election”, Journal of Economic Perspectives 31 (2)
Comment: The source does not aim at giving a definition. This is a working definition in the context of a journal article.

False Amplifiers
Definition: Coordinated activity by inauthentic accounts with the intent of manipulating political discussion (e.g., by discouraging specific parties from participating in discussion, or amplifying sensationalistic voices over others).
Source: J. Weedon, W. Nuland, A. Stamos (2017) “Information Operations and Facebook”, Facebook
Comment: The source does not aim at giving a definition. This is a working definition in the context of a report.

Filter Bubble
Definition: A phenomenon whereby the ideological perspectives of internet users are reinforced as a result of the selective algorithmic tailoring of search engine results to individual users (as reflected in recorded data such as search history, click data, and location).
Source: Oxford Dictionary of Social Media
Comment: Akin to the concept of echo chamber (see definition).

Homophily
Definition: A widespread tendency of human beings to be drawn to others with whom they see themselves as having much in common. This is reflected in the folk wisdom that ‘birds of a feather flock together’ or ‘like attracts like’ (in contrast to heterophily). We seek out that which supports our social identity in terms of major social characteristics, such as age, sex, socioeconomic status, and ethnicity. This even applies to parasocial relations with characters represented in texts (in any medium).
Source: Oxford Dictionary of Media and Communication

Influence
Definition: The capacity to have an effect on the character, development, or behaviour of someone or something, or the effect itself.
Source: Oxford Online Dictionary

Information
Definition: Unprocessed data of every description which may be used in the production of intelligence.
Source: AAP-06 NATO Glossary of Terms and Definitions

Information Activities
Definition: Actions designed to affect information and/or information systems. They can be performed by any actor and include protection measures.
Source: MC 0422/5 NATO Military Policy on Information Operations

Information Effects
Definition: A desired condition created in the information environment as a result of information activities. Information effects should be measurable to enable analysis, planning, execution and assessment of related activities and the effects themselves.
Source: MC 0422/5 NATO Military Policy on Information Operations

Information Environment
Definition: The information itself, the individuals, organizations and systems that receive, process and convey the information, and the cognitive processes that people employ, including the virtual and physical space in which this occurs.
Source: MC 0422/5 NATO Military Policy on Information Operations

Information Objective
Definition: A desired condition to be created in the information environment. It should be measurable to enable analysis, planning, execution/management and assessment/evaluation of related actions and effects.
Source: MilStratCom Practitioners Handbook 2016-08-22

Information Operations
Definition: A staff function to analyse, plan, assess, and integrate information activities to create desired effects on the will, understanding, and capability of adversaries, potential adversaries, and NAC-approved audiences, in support of Alliance mission objectives.
Source: AJP-3.10 Allied Joint Doctrine for Information Operations

Information Systems
Definition: Information systems are socio-technical systems for the collection, processing and dissemination of information. They comprise personnel, technical components, organisational structures and processes that create, collect, perceive, analyse, assess, structure, manipulate, store, retrieve, display, share, transmit and disseminate information.
Source: AJP 3.10 Allied Joint Doctrine for Information Operations

Information Warfare
Definition: Warfare that integrates electronic warfare, cyberwarfare, and psychological operations (PSYOPS) into a single fighting organisation.
Source: D. Stupples (2015) “The next war will be an information war, and we’re not ready for it”, The Conversation
Comment: The source does not aim at giving a definition. This is a working definition in the context of a magazine article.

Media Operations
Definition: All activities pertaining to managing the interaction with the news media; can refer to the function responsible for such activities, such as the ‘media operations section’. For use in this handbook, the term media operations is synonymous with media relations.
Source: PAO Handbook 2014

Misinformation
Definition: The dissemination of false information, either knowing it to be false (see disinformation), or unknowingly.
Source: Oxford Dictionary of Media and Communication

Propaganda
Definition: Information, especially of a biased or misleading nature, used to promote a political cause or point of view.
Source: AJP-3.10.1 Allied Joint Doctrine for Psychological Operations
Comment: In common speech, the term refers exclusively to false information.

Psychological Effect (in PSYOPS)
Definition: A statement of a measurable response that reflects the desired attitude or behaviour change of a selected target audience as a result of psychological operations.
Source: AJP-3.10.1 Allied Joint Doctrine for Psychological Operations

Receptivity (in PSYOPS)
Definition: The vulnerability of a target audience to particular psychological operations media.
Source: AAP-06 NATO Glossary of Terms and Definitions

RuNet
Definition: Russian-speaking Internet.
Source: http://dic.academic.ru/

Social Media
Definition: Web-based technologies used for social interaction and to transform and broadcast media monologues into interactive, social dialogues.
Source: NATO ACO Directive on Social Media, 16 September 2014

Spamming
Definition: Sending unsolicited and unwanted e-mails in bulk for advertising purposes. The proliferation of such material, which now accounts for some 85% of all e-mails sent, has become a serious nuisance to business users.
Source: Oxford Dictionary of Business and Management

Susceptibility
Definition: The anticipated acceptance or rejection of a target audience to a particular psychological operations approach.
Source: AAP-06 NATO Glossary of Terms and Definitions

Target Audience Analysis (TAA)
Definition: Examining selected groups of people across a host of psycho-social research parameters, to determine how best to change those groups’ behaviour.
Source: From S. Tatham, Target Audience Analysis, The Three Swords Magazine 28 (2015)
Comment: Additional information can be found on the NATO StratCom COE’s Target Audience Analysis course.

Troll
Definition: Somebody who disrupts an online or social media community by posting abusive or irrelevant material, normally while hiding their identity behind one or more user-names.
Source: Oxford Dictionary of Journalism

Trolling
Definition: Posting of incendiary comments with the intent of provoking others into conflict.
Source: M. Brandel (2007) “Blog trolls and cyberstalkers: How to beat them”, Computerworld
Comment: The source does not aim at giving a definition. This is a working definition in the context of a magazine article.

Web 2.0
Definition: The web seen as a platform for participation in which the consumer is also a producer. This was enabled by multiple software applications that supported user-generated content.
Source: Oxford Dictionary of Media and Communication
Comment: The term ‘Web 2.0’ was coined in 2003 by Tim O’Reilly and Dale Dougherty of O’Reilly Media as a marketing response to the ‘dot-com’ crash of 2000–02. It is intended to be seen in contrast to a selective framing of ‘Web 1.0’, which characterized the web of the 1990s.
ENDNOTES
1. Both entries are taken from the Around Fake News’, Huffington Post
Oxford Dictionary of Media and UK, 17 March 2017 http://www.
Communication. huffingtonpost.co.uk/neil-durkin/fake-
2. B. Nimmo, Identifying Disinformation: news_b_15387590.html; see also C.
an ABC, Institute for European Studies Archetti, ‘The Future of Social Media:
(2016). Strategic Communication, Politics &
3. Perhaps ironically, the term itself Context’, unpublished seminar paper
is deceptive, as it was made to be (Trends in Social Media and their
vaguely French-sounding, and was Further Development seminar, Riga, 20
even given a false French etymology March 2017)
in the Soviet Encyclopedia. 10. Google NewsLab https://newslab.
4. R. Godson, Written Testimony to withgoogle.com; ‘Fact Check now
the Senate Select Committee on available in Google Search and News
Intelligence, Open Hearing, March around the world’, Google Blog https://
30, 2017: Disinformation: A Primer blog.google/products/search/fact-
in Russian Active Measures and check-now-available-google-search-
Influence Campaigns (2017), p. 1. and-news-around-world
5. Ibid., p. 11. 11. https://www.poynter.org/tag/
6. We define social media as “websites international-fact-checking-network;
and applications that enable users https://firstdraftnews.com
to create and share content or to 12. Zollo F, Bessi A, Del Vicario M, Scala
participate in social networking” A, Caldarelli G, et al. (2017) Debunking
(Oxford Online Dictionary). in a world of tribes. PLOS ONE 12(7);
7. K. Starbird, interviewed by L. Garcia Peter, C., & Koch, T. (2016) When
Navarro, ‘How Misinformation Debunking Scientific Myths Fails
Spreads On The Internet’, NPR (and When It Does Not) The Backfire
(2017), available at http://www. Effect in the Context of Journalistic
npr.org/2017/04/09/523170115/ Coverage and Immediate Judgments
how-misinformation-spreads-on-the- as Prevention Strategy. Science
internet-and-how-to-stop-it, accessed Communication, 38(1); Nyhan, B., &
on 04/07/2017. Reifler, J. (2010) When corrections
8. See for example A. Jamieson, ‘You are fail: The persistence of political
fake news: Trump attacks CNN and misperceptions. Political Behavior,
BuzzFeed at press conference’, The 32(2).
Guardian, 11 January 2017 https:// 13. J. Weedon, W. Nuland, A. Stamos,
www.theguardian.com/us-news/2017/ Information Operations and Facebook,
jan/11/trump-attacks-cnn-buzzfeed- Facebook (2017), p. 4 https://
at-press-conference fbnewsroomus.files.wordpress. 73
9. Neil Durkin, ‘Don’t Believe The Hype com/2017/04/facebook-and-
information-operations-v1.pdf understand information correctly.
14. Ibid., p. 5 24. Zollo F, Bessi A, Del Vicario M, Scala
15. R. Waltzman, The Weaponization of A, Caldarelli G, et al. (2017) Debunking
Information: The Need for Cognitive in a world of tribes. PLOS ONE 12(7),
Security, p. 2. Emphasis added p. 9.
16. ‘Yes, I’d lie to you’, The Economist, 25. H. M. Claypool et al. ‘The effects of
10 September 2016 http:// personal relevance and repetition
www.economist.com/news/ on persuasive processing’, Social
briefing/21706498-dishonesty- Cognition 22/3 (2004), pp. 310-
politics-nothing-new-manner-which- 335; see also J. W. Alba and H.
some-politicians-now-lie-and Marmorstein ‘The effects of frequency
17. ‘Umberto Eco e i social: Danno diritto knowledge on consumer decision
di parola a legioni di imbecilli’, La making’, Journal of Consumer
Repubblica, 11 June 2015 http:// Research 14/1 (1987), pp. 14-25
video.repubblica.it/tecno-e-scienze/ 26. L. A. Henkel and M. E. Mattson
umberto-eco-e-i-social--danno- ‘Reading is believing: The truth effect
diritto-di-parola-a-legioni-di- and source credibility’ Consciousness
imbecilli/203952/203032 and cognition 20/4 (2011), pp. 1705-
18. ‘The Global Risks Report 2016’, World 1721
Economic Forum (2016), p.40. 27. T. Garcia-Marques and D. M. Mackie
19. R. Younes and E. Mackintosh, ‘The feeling of familiarity as a
‘Trusting What You See Online - It’s regulator of persuasive processing’
Not Just About the Tools’, in Finding Social Cognition 19/1 (2001), pp. 9-34
the Truth amongst the Fakes, Al 28. H. M. Claypool et al. ‘The effects of
Jazeera Media Institute (2017), p. 44 personal relevance and repetition on
20. J. Gottfried and E. Shearer, ‘News Use persuasive processing’, pp. 310-335
Across Social Media Platforms 2016’, 29. J. Weedon, W. Nuland, A. Stamos,
Pew Research Center http://www. Information Operations and Facebook,
journalism.org/2016/05/26/news- p. 6
use-across-social-media-platforms- 30. Bullet points taken from C. Hadnagy,
2016/#fn-55250-1 Social Engineering, Wiley (2011), pp.
21. In this context, the term is 233-234
synonymous with the more common 31. This in turn is done to defame,
“filter bubble”. ridicule, and threaten the targets. See
22. C. Watts, Statement Prepared for the S. Svetoka, ‘Social Media as a Tool of
U.S. Senate Select Committee on Hybrid Warfare’, NATO StratCom COE
Intelligence hearing: Disinformation: (2016), p. 20
A Primer In Russian Active Measures 32. Reflexive control is defined as “a
And Influence Campaigns (US means of conveying to a partner
Senate, 30 March 2017), available or an opponent specially prepared
at https://www.intelligence.senate. information to incline him to
gov/sites/default/files/documents/ voluntarily make the predetermined
74 os-cwatts-033017.pdf, accessed on decision desired by the initiator of the
28/06/2017. P. 7. action”. T. Thomas, ‘Russia’s Reflexive
23. Functional illiteracy is the inability to Control Theory and the Military’,
Journal of Slavic Military Studies 17/2 Propagandists Abuse the Internet and
(2004), p. 237 Manipulate the Public (TrendMicro,
33. Emotional content is not necessarily 2017), available at: https://www.
false, although it can be used in trendmicro.com/vinfo/us/security/
association with false content. news/cybercrime-and-digital-threats/
34. These narratives are likely to feature fake-news-cyber-propaganda-the-
predominantly negative sentiment: abuse-of-social-media, accessed 14
“[p]ropaganda seeks out and exploits June 2017, p. 6.
the most powerful emotions (...), it 42. F. Gillette, The Rise and Inglorious
is primarily in the negative emotions Fall of Myspace (Bloomberg, 2011),
that propaganda activities reside. In available at: https://www.bloomberg.
psychological terms, we understand com/news/articles/2011-06-22/the-
what we hate better than what we rise-and-inglorious-fall-of-myspace,
like”. N. O’Shaughnessy, ‘Putin, Xi, and accessed 13 June 17.
Hitler – Propaganda and the paternity 43. S. Kemp, ‘Digital in 2017: Global
of pseudo democracy’, Defence Overview’, available at: https://
Strategic Communications 2 (2017), wearesocial.com/special-reports/
p. 123 digital-in-2017-global-overview,
35. S. Svetoka, ‘Social Media as a Tool of accessed 12 July 2017
Hybrid Warfare’, p. 20 44. R. Hutt, The World’s Most Popular
36. This is done through the use of Social Networks Mapped, (World
automated and semi-automated Economic Forum, 2017), available
accounts. at: https://www.weforum.org/
37. S. Svetoka, ‘Social Media as a Tool of agenda/2017/03/most-popular-social-
Hybrid Warfare’, p. 20 networks-mapped/, accessed on 08
38. Such as Google’s targeted counter- June 2017.
radicalization initiatives. See B. 45. The debate was popular on
Quinn, ‘Google to point extremist mainstream media. See, for example,
searches towards anti-radicalisation K. Hosanagar, Blame the Echo
websites’, The Guardian, https://www. Chamber on Facebook. But Blame
theguardian.com/uk-news/2016/ Yourself, Too (2016), available at:
feb/02/google-pilot-extremist- https://www.wired.com/2016/11/
anti-radicalisation-information. facebook-echo-chamber/, accessed
Other initiatives, like Quilliam’s on 25 July 2017.
#NotAnotherBrother campaign, are 46. A. Mosseri, News Feed FYI:
different, because they make use of Addressing Hoaxes and Fake News
emotional content. See https://www. (Facebook, 2017), available at: https://
youtube.com/watch?v=IjIQ0ctzyZE newsroom.fb.com/news/2016/12/
39. B. Heap, ‘Strategic Communications: news-feed-fyi-addressing-hoaxes-
Insights from the Commercial Sector’, and-fake-news/, accessed on 13 June
NATO StratCom COE (2017), p. 17. 2017.
40. See, for example, the charity Full Fact: 47. ‘Continued influence’ refers to
https://fullfact.org/automated the widespread tendency among 75
41. L. Gu, V. Kropotov, and F. Yarochkin, the general populace to believe in
The Fake News Machine: How misinformation after corrections of
false statements have been issued: in at: https://securityintelligence.com/
other words, misinformation is proved information-security-in-the-age-of-
to be resistant to correction. See S. disinformation/, accessed on 30 June
Lewandowsky et al., Misinformation 2017.
and Its Correction Continued Influence 58. Regarding automated activity, the
and Successful Debiasing (2012), StratCom COE is in the process of
Psychological Science in the Public launching a regular product focused
Interest 13 (3). See also F. Zollo, A. on robotic trolling.
Bessi, M. Del Vicario, A. Scala, G. 59. O. Varol, E. Ferrara, C.A. Davis, F.
Caldarelli et al. (2017) Debunking in a Menczer, & A. Flammini, (2017).
world of tribes. PLOS ONE 12(7), p. 9. Online human-bot interactions:
48. Melissa Zimdars, assistant professor Detection, estimation, and
of communication and media at characterization. arXiv preprint
Merrimack College, in S. Levin, arXiv:1703.03107.
Facebook Promised To Tackle Fake 60. R. Fredheim (2017) Robotrolling 1.
News But The Evidence Shows It’s Not Available at http://stratcomcoe.org/
Working (The Guardian, 16 May 2017), robotrolling-20171, accessed on
available online at: https://www. 03/10/2017.
theguardian.com/technology/2017/ 61. E. Dwoskin, ‘Twitter is looking
may/16/facebook-fake-news-tools- for ways to let users flag fake
not-working, accessed on 16 June news,offensive content’ (The
2017. Washington Post, 29 June
49. Ibid. 2017), available at: https://www.
50. L. Bounegru et al., A Field Guide to washingtonpost.com/news/the-
Fake News (Public Data Lab, 2017), switch/wp/2017/06/29/twitter-is-
p. 16. looking-for-ways-to-let-users-flag-fake-
51. This topic became particularly popular news/?utm_term=.79db7791edca,
over the last year, when a number of accessed on 30 June 2017.
companies claimed to have applied 62. Such as network analysis, temporal
target audience analysis (variously analysis, sentiment analysis, etc.
paraphrased) to steer the results of 63. See for example Jennifer Keelan et al.
major political events worldwide. ‘YouTube as a source of information
This study does not name these on immunization: a content analysis’
commercial entities. Jama 298.21 (2007): 2482–2484.
52. F. Zollo et al. (2017) Debunking in a Despite the fact that YouTube had
world of tribes. PLOS ONE 12(7), p. 8. been launched merely two years
53. Ibid. before, ‘anti-vaxxers’ had already
54. P. Chamberlain, ‘Twitter as a Vector discovered its potential for sharing
for Disinformation, Journal of their content in a more permissive
Information Warfare 9/1 (2010), 6. environment than that of traditional
55. Ibid, 4. media. Nowadays, virtually all
56. See glossary. conspiracy theories are represented
76 57. G. Moraetes, ‘Information Security in the YouTube galaxy, but it is beyond
in the Age of Disinformation’, IBM the scope of this study to list them.
Security Intelligence (2017), available 64. Just in the US, 10% of adults
reported in 2016 that they got their Media Institute (2017), p. 138.
news from YouTube: J. Gottfried 74. See ‘Social media and its influence
and E. Shearer, ‘News Use Across on the Arab Spring’ (Al Jazeera
Social Media Platforms 2016’, America, 2015), available at: http://
Pew Research Center http://www. america.aljazeera.com/watch/shows/
journalism.org/2016/05/26/news- live-news/2015/12/social-media-
use-across-social-media-platforms- and-its-influence-on-the-arab-spring-
2016/#fn-55250-1 movement.html, accessed on 15 June
65. ‘YouTube to offer fake news 2017.
workshops to teenagers’, BBC 75. M. Esseghaier, ‘Tweeting Out a
Newsbeat (21 April 2017), available Tyrant: Social Media and the Tunisian
at: http://www.bbc.co.uk/newsbeat/ Revolution’, Wi Journal of Mobile
article/39653429/youtube-to-offer- Media 11 (1), available at: http://
fake-news-workshops-to-teenagers, wi.mobilities.ca/tweeting-out-a-
accessed on 27 June 2017. tyrant-social-media-and-the-tunisian-
66. Introducing Expanded YouTube revolution/, accessed on 15 June
Partner Program Safeguards to 2017.
Protect Creators, YouTube Creator 76. Data from: Digital in 2017: Global
Blog (06 April 2017), available at: Overview (We Are Social, 2017),
https://youtube-creators.googleblog. available at: https://wearesocial.com/
com/2017/04/introducing-expanded- special-reports/digital-in-2017-global-
youtube-partner.html, accessed on 28 overview, accessed on 14 June 2017.
April 2017. 77. A. Elsheikh, ‘Finding Your Story’, p.
67. Ibid. 139.
68. Instagram Help Centre, available 78. Ibid., emphasis added.
at: https://help.instagram. 79. L. Gu et al, The Fake News Machine,
com/370054663112398, accessed on p. 9.
28 June 2017. 80. Ibid., p. 34.
69. Instagram Community Guidelines, 81. The name can be translated as ‘Is
available at: https://help.instagram. this serious?’/ ‘Is this real?’. Website
com/477434105621119, accessed on available at: https://dabegad.com,
28 June 2017. accessed on 24 August 2017.
70. One of the most successful of said 82. Da Begad’s Website, ‘Who
platforms, Gab, counts little more than are we?’ [in Arabic]: https://
180,000 users in mid-2017: https:// dabegad.com/%D8%B9%D9%86-
gab.ai/a/posts/8106308, accessed on %D8%AF%D9%87-
30 June 2017. %D8%A8%D8%AC%D8%AF, accessed
71. See glossary. on 14 June 2017.
72. It must, however, be noted that 83. H. A. Unver, ‘Can Fake News Lead To
obstacles such as debunking of War? What The Gulf Crisis Tells Us’,
false stories et similia might not be War on the Rocks, available at: https://
obstacle at all, as previously noted. warontherocks.com/2017/06/can-
73. A. Elsheikh, ‘Finding Your Story: Which fake-news-lead-to-war-what-the-gulf- 77
Platform and Where?’, in Finding the crisis-tells-us/, accessed on 16 June
Truth amongst the Fakes, Al Jazeera 2017.
84. B. Heap, ‘Strategic Communications: Overview’ Report, We Are Social,
Insights from the Commercial Sector’, January 2017. Accessed 9 August
NATO StratCom COE (2017), p. 17. 2017: https://wearesocial.com/
85. Ibid. special-reports/digital-in-2017-global-
86. US Department of State, Country overview
Reports on Terrorism 2016 94. Fanteev, Frank, ‘300+ Million Users:
(Country Reports: Middle East and Understanding Russia’s VK Social
North Africa) (2017), available at: Network,’ Digital Marketing Magazine,
https://www.state.gov/j/ct/rls/ 2015. Accessed 9 August 2017: http://
crt/2016/272232.htm, accessed on 21 digitalmarketingmagazine.co.uk/
July 2017. social-media-marketing/300-million-
87. Ibid. users-understanding-russia-s-vk-
88. As explained by a representative of social-network/2564; Pavelek, Ondrej,
the Global Coalition during the Foreign ‘Vkontakte Demographics’, Havas
Terrorist Fighters Working Group Worldwide, 2013, Accessed: https://
Meeting (15 March 2017). www.slideshare.net/andrewik1/v-
89. US Department of State, Country kontakte-demographics
Reports on Terrorism 2016 95. Note: Aric Toler of Bellingcat
(Country Reports: Middle East and points to granular search, detailed
North Africa) (2017), available at: information about military service
https://www.state.gov/j/ct/rls/ and communities for military units
crt/2016/272232.htm, accessed on 21 as critical in Bellingcat’s open source
July 2017, emphasis added. research. Toler, Aric, ‘The Open Source
90. See ‘Fighting the Cyber-Jihadists’, The Guidebook to RuNet’. Accessed
Economist (10 June 2017). 9 August 2017: https://medium.
91. T. Fox-Brewster, ‘With Fake News And com/1st-draft/how-to-get-started-
Femmes Fatales, Iran’s Spies Learn investigating-the-russian-language-
To Love Facebook’, Forbes (2017), internet-3a934b9d55e2
available at: https://www.forbes.com/ 96. Note: Local and military service
sites/thomasbrewster/2017/07/27/ groups figured prominently in the
iran-hackers-oilrig-use-fake- spread of disinformation about MH17
personas-on-facebook-linkedin-for- downing and, ironically, in the digital
cyberespionage/#56002c2e49af, forensic analysis tracking down its
accessed on 28 July 2017. perpetrators. Bellingcat, ‘MH17: The
92. Note: VK was formerly Vkontakte, Open Source Investigation Three
ВКонтакте, ‘In Contact’ in Russian; Years Later’. Accessed 9 August
both names are still encountered. 2017: https://www.bellingcat.com/
93. Note: The platform is also popular in wp-content/uploads/2017/07/mh17-
Belarus, Kazakhstan, and Ukraine, with 3rd-anniversary-report.pdf; Bellingcat
11.9 million users in the last-named, Investigation Team, ‘Pre-MH17
although this number is expected to Photograph of Buk 332 Discovered’,
diminish following recent action by June 5, 2017. Accessed 9 August
78 the government of Ukraine to ban 2017: https://www.bellingcat.com/
the Russian-owned social networks. news/uk-and-europe/2017/06/05/
Kemp, Simon, ‘Digital in 2017: Global pre-mh17-photograph-buk-332-
discovered/ Pomerantsev__The_Menace_of_
97. Zhdanova, Mariia & Orlova, Dariya, Unreality.pdf
‘Computational Propaganda in 104. Note: Translated from Russian.
Ukraine: Caught between external Clockwise from top left: “Little bear
threats and internal challenges,’ kisses mom”, “Grandmothers at
in Samuel Woolley and Philip N the door when I get home”, “It is
Howard, eds, Working Paper 2017.9, necessary to remember the past, but
Oxford, UK: Project on Computational not to live it. The past is past”, “It is
Propaganda. Accessed 9 August possible to question Putin’s merits
2017: http://comprop.oii.ox.ac. towards the Fatherland. But the return
uk/2017/06/19/computational- of the Crimea is priceless”
propaganda-in-ukraine-caught- 105. Moi Mir (Мой Мир) or ‘My World’
between-external-threats-and-internal- in Russian.
challenges/ 106. Note: There are also around 25
98. Odnoklassniki (Одноклассники) or million Russian-speaking Facebook
‘Classmates’ in Russian. users.
99. Ghedin, Guido, ‘Odnoklassniki: 107. Toler, Aric, ‘What you Need to
Users, Features and the Power of Know About Russian Social Networks
Communities’, Digital in the Round, 13 to Conduct Open-Source Research,’
December 2013. Accessed 9 August GlobalVoices, 21 October 2015.
2017: http://www.digitalintheround. Accessed 9 August 2017: https://
com/odnoklassniki-users-features- globalvoices.org/2015/10/21/what-
communities/ you-need-to-know-about-russian-
100. Russian Search Tips, ‘Top social social-networks-to-conduct-open-
networks in Russia: latest numbers source-research/
and trends’, 20 January 2015. 108. Sivertseva, ‘Odnoklassniki and
Accessed 9 August 2017: http://www. MoiMir’.
russiansearchtips.com/2015/01/ 109. Pomerantsev & Weiss, ‘The
top-social-networks-russia-latest- Menace of Unreality’.
numbers-trends/ 110. Toler, ‘What you Need to Know
101. Ghedin, ‘Odnoklassniki’. About Russian Social Networks’,
102. Sivertseva, Ekaterina, 111. Fedor, Julia., & Fredheim, Rolf.
‘Odnoklassniki and MoiMir Bring TV (2017). ‘We need more clips about
Shows to Russian Internet Users’, Putin, and lots of them:’ Russia’s
Digital in the Round, 22 May 2014. state-commissioned online visual
Accessed 9 August 2017: http://www. culture. Nationalities Papers 45(2),
digitalintheround.com/odnoklassniki- 161–181
moimir-tv-russia/ 112. Figure 8. 2017 Digital Yearbook
103. Pomerantsev, P and Weiss, M, by We Are Social Singapore,
‘The Menace of Unreality: How the available at: https://www.slideshare.
Kremlin Weaponizes Information, net/wearesocialsg/2017-digital-
Culture, and Money’, Institute of yearbook?ref=http://www.
Modern Russia. Accessed: https:// digitalstrategyconsulting.com/ 79
imrussia.org/media/pdf/Research/ intelligence/russia-digital-marketing/
Michael_Weiss_and_Peter_ 113. Baran, Katsiaryna, & Stock,
Wolfgang, ‘Facebook has Been Working Paper 2017.9, Oxford, UK:
Smacked Down. The Russian Project on Computational Propaganda
Special way of SNSs: Vkontakte as Acessed 9 August 2017: http://
a Case Study’ in ECSM 2015 - The comprop.oii.ox.ac.uk/wp-content/
Proceedings of the 2nd European uploads/sites/89/2017/06/Comprop-
Conference on Social Media. Russia.pdf
Accessed 9 August 2017: http:// 119. Ibid.
www.isi.hhu.de/fileadmin/redaktion/ 120. Note: It was in the interest of
Fakultaeten/Philosophische_ the Kremlin and encouraged by the
Fakultaet/Sprache_und_Information/ Kremlin that the viable competitive
Informationswissenschaft/Dateien/ technology sector should take
Wolfgang_G._Stock/Baran_2015_ advantage of the engineering talent
ECSM_2015_Proceedings-276.pdf pool, as is evidenced by Putin’s
114. Note: Facebook users in Russia early promise, which surprised the
tend to be university graduates and technology executives: ‘Whenever
young professionals fall within the we’ll have to choose between
age group 24–40. Sikorska, Olena, excessive regulation and protection of
‘VKontakte vs. Facebook: How online freedom, we’ll definitely opt for
Russians consume social networks? freedom’; this is how as the late Anton
(Infographic)’, Digital EastFactor, 12 Nossik, CEO of the prominent online
May 2014. Accessed 9 August 2017. news outlets, recalled the meeting. In
http://www.digitaleastfactor.com/ Nossik. Anton, ‘I helped build Russia’s
vkontakte-vs-facebook-russians- Internet. Now Putin wants to destroy
consume-social-networks-infographic/ it’, The New Republic, 15 May 2014.
115. Anna Lubov, ‘Top social networks Accessed 9 August 2017: http://www.
in Russia: latest trends, winter newrepublic.com/article/117771/
2015–2016’, Russian Search Tips. putinsinternet-crackdown-russias-first-
Accessed 9 August 2017: http://www. blogger-react
russiansearchtips.com/2016/03/ 121. Etling, B., Alexanyan, K., Kelly,
top-social-networks-in-russia-latest- J., Faris, R., Palfrey, J. G., & Gasser,
trends-in-winter-2015-2016/ U. (2010). ‘Public discourse in the
116. Fedor & Fredheim, ‘We need more Russian blogosphere: Mapping
clips about Putin’, 161–181. RuNet politics and mobilization’,
117. Lipman, M. ‘Media manipulation Berkman Center, Research Publication
and political control in Russia.’ No. 2010–11. Accessed 9 August
Chatham House, 2009. Accessed 2017: http://papers.ssrn.com/
9 August 2017: https://www. abstract=1698344
chathamhouse.org/sites/ 122. Note: In search of his own power
files/chathamhouse/public/ base leading up to the presidential re-
Research/Russia%20and%20 election campaign, Dmitry Medvedev,
Eurasia/300109lipman.pdf in contrast to Putin, turned to RuNet
118. Sanovich, Sergey. ‘Computational and to the educated middle-class
80 Propaganda in Russia: The Origins professionals who consumed and
of Digital Misinformation’ in Samuel engaged in it. Shortly after assuming
Woolley and Philip N Howard, Eds his post as President, he established
123. Ibid.
124. Ibid.
125. Fedor & Fredheim, 'We need more clips about Putin', 161–181.
126. Toor, A. (2014) 'How Putin's cronies seized control of Russia's Facebook': http://www.the-village.ru/village/business/story/150063-kak-otbirali-vkontakte, accessed on 14/09/2017.
127. The full spectrum of disinformation in Ukraine is documented by numerous StratCom Centre of Excellence studies. See: http://www.stratcomcoe.org/analysis-russias-information-campaign-against-ukraine-1; http://www.stratcomcoe.org/framing-ukraine-russia-conflict-online-and-social-media
128. The following are examples of fake news unmasked by the Stop Fake organisation. 'Video with Russian "GRAD" volleys aimed at South Ossetia was presented as events in Sloviansk': http://www.stopfake.org/en/video-with-russian-grad-volleys-aimed-at-south-ossetia-was-presented-as-events-in-sloviansk/; 'Photo from China Dated 1989 Presented as the Actual Events in Donbass': http://www.stopfake.org/en/photo-from-china-dated-1989-presented-as-the-actual-events-in-donbass/
129. Note: The current population of Ukraine is 45 million. 63% of the adult population are active internet users, and 21% of these use social media as the main source of news. The most popular social networks among Ukrainians were VK (11.9 million users), Facebook (over 8 million), Odnoklassniki (5.7 million) and Twitter (2.5 million). Detector Media (2017). Як російська пропаганда впливає на суспільну думку в Україні (дослідження) (How Russian propaganda influences public opinion in Ukraine [research findings]). Retrieved from: http://osvita.mediasapiens.ua/mediaprosvita/research/yak_rosiyska_propaganda_vplivae_na_suspilnu_dumku_v_ukraini_doslidzhennya/
130. Note: Moi Mir is the property of the government-approved Mail.Ru conglomerate.
131. Указ Президента України №133/2017: Про рішення Ради національної безпеки і оборони України від 28 квітня 2017 року 'Про застосування персональних спеціальних економічних та інших обмежувальних заходів (санкцій)' (Decree of the President of Ukraine № 133/2017 'On the decision of the Council of National Security and Defense of Ukraine dated April 28, 2017 "On the Application of Personal Special Economic and Other Restrictive Measures [Sanctions]"'). http://www.president.gov.ua/documents/1332017-21850
132. 'In Ukraine Facebook Grows at the Expense of Russian Competitors', Gemius Global, 14 July 2017, https://www.gemius.com/all-reader-news/in-ukraine-facebook-grows-at-the-expense-of-russian-competitors.html; Oleg Dytrenko, 'Facebook обійшов ВКонтакте вже в перший тиждень після введення санкцій проти російських соцмереж' ('Facebook has surpassed VK in the first week after the introduction of sanctions against Russian social networks'), Watcher, http://watcher.com.ua/2017/06/07/facebook-obiyshov-vkontakte-vzhe-v-pershyy-tyzhden-pislya-vvedennya-sanktsiy-proty-rosiyskyh-sotsmerezh/
133. Note: Telegram is an encrypted messaging platform, launched in 2013 by Pavel Durov. It has been under pressure to cooperate with the Russian government or face shutdown.
134. https://www.cnet.com/news/telegram-registers-in-russia-wont-share-user-data/
135. Note: VPNs (Virtual Private Networks) enable users to bypass blocks and navigate to censored sites. http://www.reuters.com/article/us-russia-internet-idUSKBN1AF0QI
136. This is exemplified by the case of a Norwegian Facebook group opposed to immigration that mistook a picture of empty bus seats for burqa-clad women. The picture was commented on, liked, and shared by a considerable number of angered users before the mistake was detected: https://www.thelocal.no/20170731/norwegian-anti-immigrant-facebook-groups-confuses-empty-bus-seats-with-terrorists, accessed on 02 August 2017.
137. Technorati, "State of the Blogosphere 2011," 2011, http://technorati.com/state-of-the-blogosphere-2011/.
138. Nitin Agarwal et al., "Examining The Use Of Botnets And Their Evolution In Propaganda Dissemination," Defence Strategic Communications 2 (2017): 87–112.
139. Muhammad Nihal Hussain, Saaduddin Ghouri Mohammad, and Nitin Agarwal, "Blog Data Analytics Using Blogtrackers" (International Conference on Social Computing, Behavioral-Cultural Modeling & Prediction and Behavior Representation in Modeling and Simulation, July 2017).
140. Tim O'Reilly, "How I Detect Fake News," December 2016, http://www.kdnuggets.com/2016/12/oreilly-detect-fake-news.html; Melissa Zimdars, "False, Misleading, Clickbait-y, and Satirical 'News' Sources," Google Docs, 2016, https://docs.google.com/document/d/10eA5-mCZLSS4MQY5QGb5ewC3VAL6pLkT53V_81ZyitM/preview?usp=embed_facebook; Krishna Bharat, "How to Detect Fake News in Real-Time," NewCo Shift, April 27, 2017, https://shift.newco.co/how-to-detect-fake-news-in-real-time-9fdae0197bfd.
141. Scot Macdonald, Propaganda and Information Warfare in the Twenty-First Century: Altered Images and Deception Operations (Routledge, 2006).
142. This story was reported as a conspiracy theory by bsdetector.tech.
143. We used ORA-Lite (Organization Risk Analyzer), available at http://www.casos.cs.cmu.edu/projects/ora/software.php
144. The names of the blog sites have been smudged to keep the identity of the bloggers anonymous.
145. This concept is widely studied in communication literature under the heading Exemplification Theory.
146. Patric R. Spence et al., "That Is So Gross and I Have to Post About It: Exemplification Effects and User Comments on a News Story," Southern Communication Journal 82, no. 1 (2017): 27–37.
147. Zillmann, D. (2002). Exemplification theory of media influence. In J. Bryant & D. Zillmann (Eds.), Media effects: Advances in theory and research (2nd ed., pp. 19–41). Mahwah, NJ: Lawrence Erlbaum Associates.
148. https://www.iab.com/2013-internet-ad-revenues-soar-to-42-8-billion-hitting-landmark-high-surpassing-broadcast-television-for-the-first-timemarks-a-17-rise-over-record-setting-revenues-in-2012/
149. http://www.symantec.com/content/en/us/enterprise/other_resources/b-istr_main_report_v18_2012_21291018.en-us.pdf
150. https://newsroom.fb.com/news/2017/09/information-operations-update/
151. Ibid.
152. http://www.springer.com/cda/content/document/cda_ […] Ibid.
153. Ibid.
154. Ibid.
155. http://www.springer.com/cda/content/document/cda_downloaddocument/9789400729025-c1.pdf?SGWID=0-0-45-1302338-p174266596
156. Tracking users across different visits and/or across different sites.
157. Snippet of JavaScript code or executable file. In the context of websites, used to implement behaviour, change page content, etc.
158. https://www.nccgroup.trust/globalassets/our-research/us/whitepapers/isec_cleaning_up_after_cookies.pdf
159. An HTML document embedded inside another HTML document on a website. IFrames are often used to insert content from another source, such as an advertisement.
160. http://www.springer.com/cda/content/document/cda_downloaddocument/9789400729025-c1.pdf?SGWID=0-0-45-1302338-p174266596
161. A company that collects personal information about consumers from public and non-public sources and sells that information to other organizations. Data brokers create profiles on users for marketing purposes and sell them to businesses who want to target their advertisements.
162. http://www.springer.com/cda/content/document/cda_downloaddocument/9789400729025-c1.pdf?SGWID=0-0-45-1302338-p174266596
163. Advertising on a website that is targeted to be relevant to the page's content.
164. https://otalliance.org/system/files/files/resource/documents/report_-_online_advertising_hidden_hazards_to_consumer_security_date_privacy_may_15_20141.pdf
165. https://otalliance.org/system/files/files/resource/documents/report_-_online_advertising_hidden_hazards_to_consumer_security_date_privacy_may_15_20141.pdf
166. Ibid.
167. https://www.eff.org/files/onlineprivacylegprimersept09.pdf
168. Ibid.
169. https://newsroom.fb.com/news/2017/09/information-operations-update/
170. Sites where the Fakeness Index was 0.9 or higher.
171. Fakeness Index – a value between 0 and 1 expressing the ratio of a cookie's occurrences on fake news media sites against its occurrences on legitimate news media sites. The higher the index, the "faker" the cookie. This index was developed to narrow down the cookies of interest. (A minimal sketch of how such an index might be computed follows these notes.)
172. See Turlas’ watering hole cam-
paign that delivered fingerprinting cial-media-platforms-2016/
scripts based on the IP address range 184. This example, about Jewish
requests were coming from: https:// users, is particularly significant: S.
www.welivesecurity.com/2017/06/06/ A. O’Brien and D. O’Sullivan, How
turlas-watering-hole-campaign-updat- Facebook knows you’re Jewish, CNN
ed-firefox-extension-abusing-insta- (2017), available at http://money.cnn.
gram/ com/2017/09/21/technology/busi-
173. WOT is a website reputation and ness/facebook-rosh-hashanah-ad-tar-
review service that helps people make geting/index.html, accessed on
informed decisions about whether to 22/09/2017.
trust a website or not (see https:// 185. C. Leonnig, T. Hamburger and R.
www.mywot.com/). Helderman (2017) ‘Russian firm tied to
174. Admerge, x01.aidata.io, ads. pro-Kremlin propaganda advertised on
betweendigital.com, digitaladsystems. Facebook during election’, available
com, dmg.digitaltarget.ru, doubleclick. at https://www.washingtonpost.com/
net, mail.ru, marketgid.com, otm-r. politics/facebook-says-it-sold-polit-
com, sync.dmp.otm-r.com, rambler. ical-ads-to-russian-company-during-
ru, republer.com, rtb.com.ru, rutarget. 2016-election/2017/09/06/32f01fd2-
ru, ssp-rtb.sape.ru, targeterra.info, 931e-11e7-89fa-bb822a46da5b_story.
targetix.net, upravel.com html?utm_term=.dd26689bcd74,
175. Service for creating backlinks accessed on 25/09/2017.
to sites to increase web traffic; the 186. A. Hern (2017), ‘Apple block-
company is owned by Butko ing ads that follow users around
176. https://stackoverflow.com/ques- web is ‘sabotage’, says industry’,
tions/23411188/hidden-malicious- available at https://www.theguard-
script-insertinga-code-into-html-web- ian.com/technology/2017/
page-how-to-remove-clean sep/18/apple-stopping-ads-fol-
177. tovarro.com and lentainform.com low-you-around-internet-sabotage-ad-
178. An interface component that vertising-industry-ios-11-and-ma-
enables a user to perform a function cos-high-sierra-safari-internet,
or access a service. accessed on 21/09/2017.
179. Facebook data structure. Ex- 187. See the dedicated website: http://
plained here: http://www.businessin- www.eugdpr.org/.
sider.com/explainer-what-exactly-is-
the-social-graph-2012-3
180. https://muckrack.com/
whoshared/
181. http://conferences.sigcomm.org/
sigcomm/2009/workshops/wosn/
papers/p7.pdf
182. Some web servers have security
software installed which strips the
84 referrer from all requests.
183. http://www.journalism.
org/2016/05/26/news-use-across-so-
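Note 171 describes the Fakeness Index only as a ratio between 0 and 1; the report does not spell out the exact formula. The minimal Python sketch below shows one plausible way such an index could be computed, assuming the index is the share of a cookie's observed occurrences that come from fake-news sites rather than from legitimate news sites; the function name, cookie domains, and counts are hypothetical.

    from collections import Counter

    def fakeness_index(cookie: str, fake_counts: Counter, legit_counts: Counter) -> float:
        """Illustrative only: share of this cookie's sightings that occurred on
        fake-news media sites (assumed formula, not taken from the report)."""
        fake = fake_counts[cookie]      # occurrences on fake news media sites
        legit = legit_counts[cookie]    # occurrences on legitimate news media sites
        total = fake + legit
        return fake / total if total else 0.0

    # Hypothetical observation counts per third-party cookie domain
    fake_counts = Counter({"tracker.example": 9, "ads.example": 3})
    legit_counts = Counter({"tracker.example": 1, "ads.example": 7})

    for domain in sorted(set(fake_counts) | set(legit_counts)):
        print(domain, round(fakeness_index(domain, fake_counts, legit_counts), 2))
    # tracker.example scores 0.9 (seen mostly on fake-news sites), ads.example 0.3

Under this assumed formula, a cookie seen nine times on fake-news sites and once on legitimate news sites scores 9/(9+1) = 0.9, which is the cut-off used in note 170 to flag cookies of interest.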
