mirror of https://github.com/MISP/misp-galaxy
Merge pull request #463 from VVX7/master
new: [galaxy] AMITT (Adversarial Misinformation and Influence Tactics…
commit 945ed77b9d
@@ -0,0 +1,868 @@
{
|
||||||
|
"authors": [
|
||||||
|
"misinfosecproject"
|
||||||
|
],
|
||||||
|
"category": "misinformation-pattern",
|
||||||
|
"description": "AM!TT Technique",
|
||||||
|
"name": "Misinformation Pattern",
|
||||||
|
"source": "https://github.com/misinfosecproject/amitt_framework",
|
||||||
|
"type": "amitt-misinformation-pattern",
|
||||||
|
"uuid": "b3f65346-49e4-48c3-88f8-354902a5fe47",
|
||||||
|
"values": [
|
||||||
|
{
|
||||||
|
"description": "Nimmo's \"4Ds of propaganda\": dismiss, distort, distract, dismay (MisinfosecWG added divide in 2019). Misinformation promotes an agenda by advancing narratives supportive of that agenda. This is most effective when the advanced narrative pre-dates the revelation of the specific misinformation content. But this is often not possible.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0001",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:strategic-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0001.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "16556f68-fe4f-43c8-a8a4-6fc205d80251",
|
||||||
|
"value": "5Ds (dismiss, distort, distract, dismay, divide)"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Organize citizens around pro-state messaging. Paid or volunteer groups coordinated to push state propaganda (examples include 2016 Diba Facebook Expedition, coordinated to overcome China’s Great Firewall to flood the Facebook pages of Taiwanese politicians and news agencies with a pro-PRC message).",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0002",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:strategic-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0002.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "35f79572-d306-4df1-92e7-84e4d2242baf",
|
||||||
|
"value": "Facilitate State Propaganda"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consitent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplifiction practices. Examples include midwesterners are generous, Russia is under attack from outside.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0003",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:strategic-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0003.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "05f58511-8d22-45d5-b889-47a07b9be00d",
|
||||||
|
"value": "Leverage Existing Narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Advance competing narratives connected to same issue ie: on one hand deny incident while at same time expresses dismiss. MH17 (example) \"Russian Foreign Ministry again claimed that “absolutely groundless accusations are put forward against the Russian side, which are aimed at discrediting Russia in the eyes of the international community\" (deny); \"The Dutch MH17 investigation is biased, anti-Russian and factually inaccurate\" (dismiss). \n\nSuppressing or discouraging narratives already spreading requires an alternative. The most simple set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on.\n\nThese competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the \"firehose of misinformation\" approach.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0004",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:strategic-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0004.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "8960c6c3-ab73-41b3-b661-901f4e4ed5e6",
|
||||||
|
"value": "Competing Narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Recon/research to identify \"the source of power that provides moral or physical strength, freedom of action, or will to act.\" Thus, the center of gravity is usually seen as the \"source of strength\". Includes demographic and network analysis of communities",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0005",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:objective-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0005.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "a6de0798-4de8-4aa8-90c4-fd6d88f850f3",
|
||||||
|
"value": "Center of Gravity Analysis"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a \"whole of society\" perpective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perpectve the tactics around master narratives center more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion consitutes an ongoing tactical struggle carried out at a whole-of-society level. \n\nBy way of example, major powers are promoting master narratives such as:\n* \"Huawei is detetmined to build trustworthy networks\"\n* \"Russia is the victim of bullying by NATO powers\"\n* \"USA is guided by its founding principles of liberty and egalitarianism\"\n\nTactically, their promotion covers a broad spectrum of activities both on- and offline.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0006",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:objective-planning"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0006.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "73c4fe48-8d25-47ce-8295-33db463b0e85",
|
||||||
|
"value": "Create Master Narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create key social engineering assets needed to amplify content, manipulate algorithms, fool public and/or specific incident/campaign targets. \n\nComputational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are. \n\nExamples: Ukraine elections (2019) circumvent Facebook’s new safeguards by paying Ukrainian citizens to give a Russian agent access to their personal pages. EU Elections (2019) Avaaz reported more than 500 suspicious pages and groups to Facebook related to the three-month investigation of Facebook disinformation networks in Europe. Mueller report (2016) The IRA was able to reach up to 126 million Americans on Facebook via a mixture of fraudulent accounts, groups, and advertisements, the report says. Twitter accounts it created were portrayed as real American voices by major news outlets. It was even able to hold real-life rallies, mobilizing hundreds of people at a time in major cities like Philadelphia and Miami. ",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0007",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-people"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0007.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "14394d02-9f8f-4999-8e3d-c51b6f25076b",
|
||||||
|
"value": "Create fake Social Media Profiles / Pages / Groups"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site-design. But many can be quickly exposed with reference to their owenership, reporting history and adverstising details. A prominent case from the 2016 era was the _Denver Guardian_, which purported to be a local newspaper in Colorado and specialized in negative stories about Hillary Clinton.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0008",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-people"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0008.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "dd3f7b62-a99c-40d6-baeb-cd36601cc524",
|
||||||
|
"value": "Create fake or imposter news sites"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself. For example, in the Jade Helm conspiracy theory promoted by SVR in 2015, a pair of experts--one of them naming himself a “Military Intelligence Analyst / Russian Regional CME” and the other a “Geopolitical Strategist, Journalist & Author”--pushed the story heavily on LinkedIn.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0009",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-people"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0009.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "0253d5f6-cc08-4f46-b00a-628926020d2c",
|
||||||
|
"value": "Create fake experts"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also know as \"useful idiots\" or \"unwitting agents\".",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0010",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0010.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "784cfb1f-c6f5-44a3-8b60-272c64aac4ea",
|
||||||
|
"value": "Cultivate useful idiots"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Hack or take over legimate accounts to distribute misinformation or damaging content. Examples include Syrian Electronic Army (2013) series of false tweets from a hijacked Associated Press Twitter account claiming that President Barack Obama had been injured in a series of explosions near the White House. The false report caused a temporary plunge of 143 points on the Dow Jones Industrial Average.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0011",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0011.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "79e9410b-c325-44fd-9b1b-8c9c53c8ecdd",
|
||||||
|
"value": "Hijack legitimate account"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use anonymous social media profiles. Examples include page or group administrators, masked \"whois\" website directory data, no bylines connected to news article, no masthead connect to news websites. \n\nExample is 2016 @TEN_GOP profile where the actual Tennessee Republican Party tried unsuccessfully for months to get Twitter to shut it down, and 2019 Endless Mayfly is an Iran-aligned network of inauthentic personas and social media accounts that spreads falsehoods and amplifies narratives critical of Saudi Arabia, the United States, and Israel.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0012",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0012.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "40c0ba05-ecb4-42c1-af78-4c7cf586f547",
|
||||||
|
"value": "Use concealment"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0013",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0013.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "81d35c37-da96-423b-9ec1-e2831a6f413d",
|
||||||
|
"value": "Create fake websites"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Generate revenue through online funding campaigns. e.g. Gather data, advance credible persona via Gofundme; Patreon; or via fake website connecting via PayPal or Stripe. (Example 2016) #VaccinateUS Gofundme campaigns to pay for Targetted facebook ads (Larry Cook, targetting Washington State mothers, $1,776 to boost posts over 9 months).",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0014",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0014.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "06ff0cd0-08a4-486b-ab81-57c50bc2253e",
|
||||||
|
"value": "Create funding campaigns"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Many incident-based campaigns will create a hashtag to promote their fabricated event (e.g. #ColumbianChemicals to promote a fake story about a chemical spill in Louisiana). \n\nCreating a hashtag for an incident can have two important effects:\n1. Create a perception of reality around an event. Certainly only \"real\" events would be discussed in a hashtag. After all, the event has a name!\n2. Publicize the story more widely through trending lists and search behavior \n\nAsset needed to direct/control/manage \"conversation\" connected to launching new incident/campaign with new hashtag for applicable social media sites ie: Twitter, LinkedIn)",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0015",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-networks"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0015.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "80c68f29-1c22-4277-93c0-e19f97bd56ee",
|
||||||
|
"value": "Create hashtag"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create attention grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. (example 2016) “Pope Francis shocks world, endorses Donald Trump for president.” (example 2016) \"FBI director received millions from Clinton Foundation, his brother’s law firm does Clinton’s taxes”. This is a key asset",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0016",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:microtargeting"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0016.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "7193e229-e122-4f50-818b-e2b047b18a9a",
|
||||||
|
"value": "Clickbait"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Drive traffic/engagement to funding campaign sites; helps provide measurable metrics to assess conversion rates",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0017",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:microtargeting"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0017.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "3a540119-0ede-4ac5-968c-de11ac477cb3",
|
||||||
|
"value": "Promote online funding"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create or fund advertisements targeted at specific populations",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0018",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:microtargeting"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0018.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "97ce4b61-b888-4a76-98f6-a32dc1df1a1a",
|
||||||
|
"value": "Paid targeted ads"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Flood social channels; drive traffic/engagement to all assets; create aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. \"Nothing is true, but everything is possible.\" Akin to astroturfing campaign.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0019",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0019.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "7bdc0b07-63db-406b-8602-1b8a1faa387f",
|
||||||
|
"value": "Generate information pollution"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Iteratively test incident performance (messages, content etc), e.g. A/B test headline/content enagagement metrics; website and/or funding campaign conversion rates",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0020",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0020.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "5bd83398-8273-49b8-8bc2-9435bda603ed",
|
||||||
|
"value": "Trial content"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denotes the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0021",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0021.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "fa6e62ca-16c3-4fdd-93ff-b1e1da4cfad8",
|
||||||
|
"value": "Memes"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "\"Conspiracy narratives appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the \"\"firehose of falsehoods\"\" model. \n\nExample: QAnon: conspiracy theory is an explanation of an event or situation that invokes a conspiracy by sinister and powerful actors, often political in motivation, when other explanations are more probable \"",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0022",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0022.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "5a832f09-0b39-4734-b7a1-9a4592bdb57e",
|
||||||
|
"value": "Conspiracy narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper content",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0023",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0023.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "01c4d71e-47ef-4cad-abda-ad1abd42cae7",
|
||||||
|
"value": "Distort facts"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create fake videos and/or images by manipulating existing content or generating new content (e.g. deepfakes). Examples include Pelosi video (making her appear drunk) and photoshoped shark on flooded streets of Houston TX.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0024",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0024.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "79a57ba1-9d29-4cd6-8669-ce9728bc33d7",
|
||||||
|
"value": "Create fake videos and images"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Obtain documents (eg by theft or leak), then alter and release, possibly among factual documents/sources. \n\nExample (2019) DFRLab report \"Secondary Infektion” highlights incident with key asset being a forged “letter” created by the operation to provide ammunition for far-right forces in Europe ahead of the election.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0025",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0025.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "01f8720b-d254-4744-a4eb-a28efc8c3528",
|
||||||
|
"value": "Leak altered documents"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create fake academic research. Example: fake social science research is often aimed at hot-button social issues such as gender, race and sexuality. Fake science research can target Climate Science debate or pseudoscience like anti-vaxx",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0026",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0026.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "032ea639-87e3-413b-925d-e556b472216b",
|
||||||
|
"value": "Create fake research"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Adapting existing narratives to current operational goals is the tactical sweet-spot for an effective misinformation campaign. Leveraging existing narratives is not only more effective, it requires substantially less resourcing, as the promotion of new master narratives operates on a much larger scale, both time and scope. Fluid, dynamic & often interchangable key master narratives can be (\"The morally corrupt West\") adapted to divisive (LGBT proganda) or to distort (individuals working as CIA operatives). For Western audiences, different but equally powerful framings are available, such as \"USA has a fraught history in race relations, espically in crimincal justice areas.\"",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0027",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0027.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "c30bfa00-2da6-4443-aa05-5342ad9ea2cc",
|
||||||
|
"value": "Adapt existing narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "\"Misinformation promotes an agenda by advancing narratives supportive of that agenda. This is most effective when the advanced narrative pre-dates the revelation of the specific misinformation content. But this is often not possible. \n\nSuppressing or discouraging narratives already spreading requires an alternative. The most simple set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. \n\nThese competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the \"\"firehose of misinformation\"\" approach.\"",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0028",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:develop-content"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0028.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "c84a5389-92a0-41f1-bed1-b85a4720ffa5",
|
||||||
|
"value": "Create competing narratives"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create fake online polls, or manipulate existing online polls. Examples: flooding FCC with comments; creating fake engagement metrics of Twitter/Facebook polls to manipulate perception of given issue. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0029",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0029.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "d7175e98-579d-4675-aff1-3fc24a18e003",
|
||||||
|
"value": "Manipulate online polls"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create other assets/dossier/cover/fake relationships and/or connections or documents, sites, bylines, attributions, to establish/augment/inflate crediblity/believability",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0030",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0030.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "88fad613-42bb-46b0-8ef7-dafde53d2b72",
|
||||||
|
"value": "Backstop personas"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use YouTube as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0031",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0031.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "18a024a0-b0c8-4091-bd22-9d167c0ada16",
|
||||||
|
"value": "YouTube"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use Reddit as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0032",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0032.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "0cf0ecdb-fc07-41b0-9fa1-8c7eb40a8116",
|
||||||
|
"value": "Reddit"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use Instagram as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0033",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0033.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "3ad77fc0-970b-4a6a-bfd9-db122e375812",
|
||||||
|
"value": "Instagram"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use LinkedIn as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0034",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0034.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "9a440d3e-eba9-4d8f-ba93-d691a9121a68",
|
||||||
|
"value": "LinkedIn"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use Pinterest as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0035",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0035.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "ba998ea4-b39d-4d66-b3ba-d90e2e0abc8c",
|
||||||
|
"value": "Pinterest"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use WhatsApp as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0036",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0036.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "231e17e7-3268-4316-ae25-ba4e978a043a",
|
||||||
|
"value": "WhatsApp"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use Facebook as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0037",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0037.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "70086088-dfd6-4fd7-9f28-bf61c7f77dbb",
|
||||||
|
"value": "Facebook"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use Twitter as a narrative dissemination channel",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0038",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:channel-selection"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0038.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "c2463ebc-2156-4597-b8e8-cad15954cab4",
|
||||||
|
"value": "Twitter"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0039",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0039.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "f1145ebe-da32-471b-9ce5-4ba5c1393bb3",
|
||||||
|
"value": "Bait legitimate influencers"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0040",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0040.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "6134c516-1521-40ee-9cdd-48d5f034289a",
|
||||||
|
"value": "Demand unsurmountable proof"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0041",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0041.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "90e5c8f1-55b4-48f3-99df-07a1b15621b7",
|
||||||
|
"value": "Deny involvement"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0042",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0042.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "c4820314-22b3-4143-b197-0ef49faa6132",
|
||||||
|
"value": "Kernel of Truth"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0043",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0043.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "f89d4b1d-34a3-41fc-9fcb-5c17faf4d928",
|
||||||
|
"value": "Use SMS/ WhatsApp/ Chat apps"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0044",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0044.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "04946fbc-9bfc-4078-8dec-d3554233494b",
|
||||||
|
"value": "Seed distortions"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use the fake experts that were set up in T0009. Pseudo-experts are disposable assets that often appear once and then disappear. Give \"credility\" to misinformation. Take advantage of credential bias",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0045",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0045.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "6284e088-837a-4dbe-8f81-249559069625",
|
||||||
|
"value": "Use fake experts"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Manipulate content engagement metrics (ie: Reddit & Twitter) to influence/impact news search results (e.g. Google), also elevates RT & Sputnik headline into Google news alert emails. aka \"Black-hat SEO\" ",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0046",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:pump-priming"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0046.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "1a51094b-5965-4ddb-9833-11e14ac1fd98",
|
||||||
|
"value": "Search Engine Optimization"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use political influence or the power of state to stop critical social media comments. Government requested/driven content take downs (see Google Transperancy reports. (Example 20190 Singapore Protection from Online Falsehoods and Manipulation Bill would make it illegal to spread \"false statements of fact\" in Singapore, where that information is \"prejudicial\" to Singapore's security or \"public tranquility.\" Or India/New Delhi has cut off services to Facebook and Twitter in Kashmir 28 times in the past five years, and in 2016, access was blocked for five months -- on the grounds that these platforms were being used for anti-social and \"anti-national\" purposes.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0047",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0047.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "6e13aaa2-8452-4f4f-b5ca-56291dcbb351",
|
||||||
|
"value": "Muzzle social media as a political force"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Intimidate, coerce, threaten critics/dissidents/journalists via trolling, doxing. Phillipines (example) Maria Ressa and Rappler journalists targeted Duterte regime, lawsuits, trollings, banned from the presidential palace where press briefings take place. 2017 Bot attack on five ProPublica Journalists.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0048",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0048.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "cf50c811-8d01-4c0b-bb0c-c7d84ac620b4",
|
||||||
|
"value": "Cow online opinion leaders"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Flooding and/or mobbing social media channels feeds and/or hashtag with excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools to acheive this effect. \n\nExample (2018): bots flood social media promoting messages which support Saudi Arabia with intent to cast doubt on allegations that the kingdom was involved in Khashoggi’s death.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0049",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0049.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "01b27791-6daf-4819-a218-256377282135",
|
||||||
|
"value": "Flooding"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Deploy state-coordinated social media commenters and astroturfers. Both internal/domestic and external social media influence operations, popularized by China (50cent Army manage message inside the \"Great Firewall\") but also technique used by Chinese English-language social media influence operations are seeded by state-run media, which overwhelmingly present a positive, benign, and cooperative image of China. ",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0050",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0050.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "b1744176-7e69-4d2a-bd26-3994dd1ade79",
|
||||||
|
"value": "Cheerleading domestic social media ops"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use government-paid social media commenters, astroturfers, chat bots (programmed to reply to specific key words/hashtags) influence online conversations, product reviews, web-site comment forums. (2017 example) the FCC was inundated with nearly 22 million public comments on net neutrality (many from fake accounts)",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0051",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0051.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "a9d7894e-abc8-407f-8f90-62d3b2cff277",
|
||||||
|
"value": "Fabricate social media comment"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Create content/news/opinion web-sites to cross-post stories. Tertiary sites circulate and amplify narratives. Often these sites have no masthead, bylines or attribution. \n\nExamples of tertiary sites inculde Russia Insider, The Duran, geopolitica.ru, Mint Press News, Oriental Review, globalresearch.ca. \n\nExample (2019, Domestic news): Snopes reveals Star News Digital Media, Inc. may look like a media company that produces local news, but operates via undisclosed connections to political activism. \n\nExample (2018) FireEye reports on Iranian campaign that created between April 2018 and March 2019 sites used to spread inauthentic content from websites such as Liberty Front Press (LFP), US Journal, and Real Progressive Front during the US mid-terms.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0052",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0052.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "bb0c643e-c83b-474e-9eb6-21ba51d20efe",
|
||||||
|
"value": "Tertiary sites amplify news"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use trolls to amplify narratives and/or manipulate narratives. Fake profiles/sockpuppets operating to support individuals/narratives from the entire political spectrum (left/right binary). Operating with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than create new/original content. Trolls operate where ever there's a socially divisive issue (issues that can/are be politicized) e.g. BlackLivesMatter or MeToo",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0053",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0053.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "9feff36b-887c-4cb8-9224-a0694b003d57",
|
||||||
|
"value": "Twitter trolls amplify and manipulate"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give appearance it's more \"popular\" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are an inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.(example 2019) #TrudeauMustGo ",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0054",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0054.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "10f072e1-02cd-4b6e-8a4e-c1c35cf9e166",
|
||||||
|
"value": "Twitter bots amplify"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Use the dedicated hashtag for the incident (e.g. #PhosphorusDisaster)",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0055",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0055.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "0f490149-34b2-4316-b19b-7b43423522b3",
|
||||||
|
"value": "Use hashtag"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Output information pollution (e.g. articles on an unreported false story/event) through channels controlled by or related to the incident creator. Examples include RT/Sputnik or antivax websites seeding stories.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0056",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:exposure"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0056.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "4a3a83d1-fb95-47ac-91fe-cd2682eb4637",
|
||||||
|
"value": "Dedicated channels disseminate information pollution"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives. Example: Facebook groups/pages coordinate/more divisive/polarizing groups and actvities into the public space. (Example) Mueller's report, highlights, the IRA organized political rallies in the U.S. using social media starting in 2015 and continued to coordinate rallies after the 2016 election",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0057",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:go-physical"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0057.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "37a150a4-abb9-475d-820b-132336b25491",
|
||||||
|
"value": "Organise remote rallies and events"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed.",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0058",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:persistence"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0058.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "c7366126-f01d-435d-91d5-e77d26082c1a",
|
||||||
|
"value": "Legacy web content"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0059",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:persistence"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0059.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "12a75c2e-495d-43da-bf13-d89f448cefc0",
|
||||||
|
"value": "Play the long game"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0060",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:persistence"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0060.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "cface37a-cbb9-4554-96f0-d3088f7131ed",
|
||||||
|
"value": "Continue to amplify"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"description": "Sell hats, t-shirts, flags and other branded content that's designed to be seen in the real world",
|
||||||
|
"meta": {
|
||||||
|
"external_id": "T0061",
|
||||||
|
"kill_chain": [
|
||||||
|
"misinfosec:misinformation-tactics:go-physical"
|
||||||
|
],
|
||||||
|
"refs": [
|
||||||
|
"https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/T0061.md"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
"uuid": "3b312e50-6420-48b7-9a94-c4d84f29ad1c",
|
||||||
|
"value": "Sell merchandising"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"version": 3
|
||||||
|
}
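
For reference, a minimal sketch (not part of this pull request) of how a consumer could index the cluster above by AMITT technique id and read the phase out of a kill_chain tag. The file path is an assumption: it matches the output path used in the generator script's main() further down.

# Illustrative sketch only: index the generated cluster by AMITT technique id.
import json

def index_cluster(path='clusters/misinfosec-amitt-misinformation-technique.json'):
    with open(path) as f:
        cluster = json.load(f)
    # Map AMITT ids (e.g. "T0001") to their cluster entries.
    by_id = {v['meta']['external_id']: v for v in cluster['values']}
    # Each kill_chain tag is "namespace:kill-chain-name:phase-slug".
    phase = by_id['T0001']['meta']['kill_chain'][0].split(':')[2]
    print(by_id['T0001']['value'], '->', phase)  # 5Ds (...) -> strategic-planning
    return by_id

if __name__ == '__main__':
    index_cluster()
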
@@ -0,0 +1,25 @@
{
  "description": "AM!TT Tactic",
  "icon": "map",
  "kill_chain_order": {
    "misinformation-tactics": [
      "Strategic Planning",
      "Objective Planning",
      "Develop People",
      "Develop Networks",
      "Microtargeting",
      "Develop Content",
      "Channel Selection",
      "Pump Priming",
      "Exposure",
      "Go Physical",
      "Persistence",
      "Measure Effectiveness"
    ]
  },
  "name": "Misinformation Pattern",
  "namespace": "misinfosec",
  "type": "amitt-misinformation-pattern",
  "uuid": "4d381145-9a5e-4778-918c-fbf23d78544e",
  "version": 3
}
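
The phase names in kill_chain_order above are the human-readable forms of the slugs used in each cluster entry's kill_chain tag. A small sketch of that mapping; the slug rule (lowercase, spaces to hyphens) mirrors the one in the generator script below, and the phase list is taken verbatim from the galaxy file.

# Sketch: relate the galaxy's kill_chain_order phase names to the
# "misinfosec:misinformation-tactics:<phase-slug>" tags used in the cluster.
PHASES = [
    "Strategic Planning", "Objective Planning", "Develop People", "Develop Networks",
    "Microtargeting", "Develop Content", "Channel Selection", "Pump Priming",
    "Exposure", "Go Physical", "Persistence", "Measure Effectiveness",
]

def phase_tag(phase: str) -> str:
    # Same slug rule as the generator: spaces to hyphens, lowercased.
    return 'misinfosec:misinformation-tactics:' + phase.replace(' ', '-').lower()

for p in PHASES:
    print(phase_tag(p))
# e.g. "Strategic Planning" -> misinfosec:misinformation-tactics:strategic-planning
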
@@ -0,0 +1,171 @@
import pandas as pd
import os  # unused; kept from the upstream script
import json
import uuid
import xlrd  # Excel engine used by pandas to read the metadata workbook


class Amitt:
    """
    Create MISP galaxy and cluster JSON files.

    This script relies on the AMITT metadata xlsx available here:
    https://github.com/misinfosecproject/amitt_framework/blob/master/generating_code/amitt_metadata_v3.xlsx

    This script has been adapted from:
    https://github.com/misinfosecproject/amitt_framework/blob/master/generating_code/amitt.py
    """

    def __init__(self, infile='amitt_metadata_v3.xlsx'):
        # Read every sheet of the metadata workbook into its own DataFrame.
        metadata = {}
        xlsx = pd.ExcelFile(infile)
        for sheetname in xlsx.sheet_names:
            metadata[sheetname] = xlsx.parse(sheetname)

        # Create individual tables and dictionaries
        self.phases = metadata['phases']
        self.techniques = metadata['techniques']
        self.tasks = metadata['tasks']
        self.incidents = metadata['incidents']

        # Group technique ids by the tactic (phase) they belong to.
        tactechs = (self.techniques.groupby('tactic')['id']
                    .apply(list)
                    .reset_index()
                    .rename({'id': 'techniques'}, axis=1))
        self.tactics = (metadata['tactics']
                        .merge(tactechs, left_on='id', right_on='tactic', how='left')
                        .fillna('')
                        .drop('tactic', axis=1))

        # Map tactic id -> tactic name; used to build kill_chain entries.
        self.tacdict = self.make_object_dict(self.tactics)

    def make_object_dict(self, df):
        return pd.Series(df.name.values, index=df.id).to_dict()

    def make_amitt_galaxy(self):
        # Galaxy definition: one kill chain listing the AMITT tactic stages in order.
        galaxy = {}
        galaxy['name'] = 'Misinformation Pattern'
        galaxy['type'] = 'amitt-misinformation-pattern'
        galaxy['description'] = 'AM!TT Tactic'
        galaxy['uuid'] = str(uuid.uuid4())
        galaxy['version'] = 3
        galaxy['icon'] = 'map'
        galaxy['namespace'] = 'misinfosec'

        galaxy['kill_chain_order'] = {
            'misinformation-tactics': []
        }

        for k, v in self.tacdict.items():
            galaxy['kill_chain_order']['misinformation-tactics'].append(v)

        return galaxy

    def write_amitt_file(self, fname, file_data):
        with open(fname, 'w') as f:
            json.dump(file_data, f, indent=2, sort_keys=True, ensure_ascii=False)
            f.write('\n')

    def make_amitt_cluster(self):
        # Cluster definition: one value entry per AMITT technique.
        cluster = {}
        cluster['authors'] = ['misinfosecproject']
        cluster['category'] = 'misinformation-pattern'
        cluster['description'] = 'AM!TT Technique'
        cluster['name'] = 'Misinformation Pattern'
        cluster['source'] = 'https://github.com/misinfosecproject/amitt_framework'
        cluster['type'] = 'amitt-misinformation-pattern'
        cluster['uuid'] = str(uuid.uuid4())
        cluster['values'] = []
        cluster['version'] = 3

        techniques = self.techniques.values.tolist()

        for technique in techniques:
            t = {}

            # NaN != NaN, so these comparisons blank out empty spreadsheet cells.
            if technique[1] != technique[1]:
                technique[1] = ''
            if technique[2] != technique[2]:
                technique[2] = ''
            if technique[3] != technique[3]:
                technique[3] = ''

            # Skip rows with no name, tactic or description.
            if technique[1] == technique[2] == technique[3] == '':
                continue

            t['uuid'] = str(uuid.uuid4())
            t['value'] = technique[1]
            t['description'] = technique[3]
            t['meta'] = {
                'external_id': technique[0],
                # kill_chain tag format: namespace:kill-chain-name:phase-slug
                'kill_chain': [
                    'misinfosec:misinformation-tactics:' + self.tacdict[technique[2]].replace(' ', '-').lower()
                ],
                'refs': [
                    'https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/'
                    + technique[0] + '.md'
                ]
            }

            cluster['values'].append(t)

        return cluster

    def make_amitt_task_cluster(self):
        # Task cluster: currently mirrors the technique cluster and is not written out by main().
        cluster = {}
        cluster['authors'] = ['misinfosecproject']
        cluster['category'] = 'misinformation-pattern'
        cluster['description'] = 'AM!TT Task'
        cluster['name'] = 'Misinformation Task'
        cluster['source'] = 'https://github.com/misinfosecproject/amitt_framework'
        cluster['type'] = 'amitt-misinformation-pattern'
        cluster['uuid'] = str(uuid.uuid4())
        cluster['values'] = []
        cluster['version'] = '3'

        techniques = self.techniques.values.tolist()

        for technique in techniques:
            t = {}

            # NaN != NaN, so these comparisons blank out empty spreadsheet cells.
            if technique[1] != technique[1]:
                technique[1] = ''
            if technique[2] != technique[2]:
                technique[2] = ''
            if technique[3] != technique[3]:
                technique[3] = ''

            if technique[1] == technique[2] == technique[3] == '':
                continue

            t['uuid'] = str(uuid.uuid4())
            t['value'] = technique[1]
            t['description'] = technique[3]
            t['meta'] = {
                'external_id': technique[0],
                'kill_chain': [
                    'misinfosec:misinformation-tactics:' + self.tacdict[technique[2]].replace(' ', '-').lower()
                ],
                'refs': [
                    'https://github.com/misinfosecproject/amitt_framework/blob/master/techniques/'
                    + technique[0] + '.md'
                ]
            }

            cluster['values'].append(t)

        return cluster


def main():
    amitt = Amitt()

    galaxy = amitt.make_amitt_galaxy()
    amitt.write_amitt_file('../galaxies/misinfosec-amitt-misinformation-pattern.json', galaxy)

    cluster = amitt.make_amitt_cluster()
    amitt.write_amitt_file('../clusters/misinfosec-amitt-misinformation-technique.json', cluster)


if __name__ == '__main__':
    main()
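
A possible post-generation sanity check (not part of this pull request): after running the script from a directory containing amitt_metadata_v3.xlsx, the two output files could be cross-checked so that every technique's kill_chain phase slug corresponds to a phase declared in the galaxy's kill_chain_order. The paths below are assumptions that mirror those in main() above.

# Sketch of a consistency check between the generated galaxy and cluster files.
import json

def check_kill_chain(galaxy_path='../galaxies/misinfosec-amitt-misinformation-pattern.json',
                     cluster_path='../clusters/misinfosec-amitt-misinformation-technique.json'):
    with open(galaxy_path) as f:
        galaxy = json.load(f)
    with open(cluster_path) as f:
        cluster = json.load(f)

    # Phases declared by the galaxy, converted to the slug form used in kill_chain tags.
    declared = {p.replace(' ', '-').lower()
                for p in galaxy['kill_chain_order']['misinformation-tactics']}

    # The phase slug is the last component of each technique's kill_chain tag.
    for value in cluster['values']:
        for tag in value['meta']['kill_chain']:
            slug = tag.split(':')[-1]
            if slug not in declared:
                print('Unknown phase', slug, 'in', value['meta']['external_id'])

if __name__ == '__main__':
    check_kill_chain()
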