mirror of https://github.com/MISP/misp-modules
Merge branch 'master' of github.com:MISP/misp-modules into documentation
commit c91795dbcc
@@ -6,11 +6,11 @@ services:
cache: pip

python:
- "3.4"
- "3.5"
- "3.5-dev"
- "3.6"
- "3.6-dev"
- "3.7-dev"

install:
- pip install -U nose codecov pytest
README.md
@@ -23,21 +23,33 @@ For more information: [Extending MISP with Python modules](https://www.circl.lu/

* [countrycode](misp_modules/modules/expansion/countrycode.py) - a hover module to tell you what country a URL belongs to.
* [CrowdStrike Falcon](misp_modules/modules/expansion/crowdstrike_falcon.py) - an expansion module to expand using the CrowdStrike Falcon Intel Indicator API.
* [CVE](misp_modules/modules/expansion/cve.py) - a hover module to give more information about a vulnerability (CVE).
* [DBL Spamhaus](misp_modules/modules/expansion/dbl_spamhaus.py) - a hover module to check Spamhaus DBL for a domain name.
* [DNS](misp_modules/modules/expansion/dns.py) - a simple module to resolve MISP attributes like hostname and domain to expand IP address attributes.
* [DomainTools](misp_modules/modules/expansion/domaintools.py) - a hover and expansion module to get information from [DomainTools](http://www.domaintools.com/) whois.
* [EUPI](misp_modules/modules/expansion/eupi.py) - a hover and expansion module to get information about a URL from the [Phishing Initiative project](https://phishing-initiative.eu/?lang=en).
* [Farsight DNSDB Passive DNS](misp_modules/modules/expansion/farsight_passivedns.py) - a hover and expansion module to expand hostname and IP addresses with passive DNS information.
* [GeoIP](misp_modules/modules/expansion/geoip_country.py) - a hover and expansion module to get GeoIP information from GeoLite/MaxMind.
* [hashdd](misp_modules/modules/expansion/hashdd.py) - a hover module to check file hashes against [hashdd.com](http://www.hashdd.com), including the NSRL dataset.
* [IPASN](misp_modules/modules/expansion/ipasn.py) - a hover and expansion module to get the BGP ASN of an IP address.
* [iprep](misp_modules/modules/expansion/iprep.py) - an expansion module to get IP reputation from packetmail.net.
* [onyphe](misp_modules/modules/expansion/onyphe.py) - a module to process queries on Onyphe.
* [onyphe_full](misp_modules/modules/expansion/onyphe_full.py) - a module to process full queries on Onyphe.
* [OTX](misp_modules/modules/expansion/otx.py) - an expansion module for [OTX](https://otx.alienvault.com/).
* [passivetotal](misp_modules/modules/expansion/passivetotal.py) - a [PassiveTotal](https://www.passivetotal.org/) module that queries a number of different PassiveTotal datasets.
* [rbl](misp_modules/modules/expansion/rbl.py) - a module to get RBL (Real-time Blackhole List) values for an attribute.
* [reversedns](misp_modules/modules/expansion/reversedns.py) - a simple reverse DNS expansion service to resolve reverse DNS from MISP attributes.
* [securitytrails](misp_modules/modules/expansion/securitytrails.py) - an expansion module for [SecurityTrails](https://securitytrails.com/).
* [shodan](misp_modules/modules/expansion/shodan.py) - a minimal [Shodan](https://www.shodan.io/) expansion module.
* [Sigma queries](misp_modules/modules/expansion/sigma_queries.py) - an experimental expansion module that converts a Sigma rule into all available SIEM signatures.
* [Sigma syntax validator](misp_modules/modules/expansion/sigma_syntax_validator.py) - a Sigma syntax validator.
* [sourcecache](misp_modules/modules/expansion/sourcecache.py) - a module to cache a specific link from a MISP instance.
* [STIX2 pattern syntax validator](misp_modules/modules/expansion/stix2_pattern_syntax_validator.py) - a module to check STIX2 pattern syntax.
* [ThreatCrowd](misp_modules/modules/expansion/threatcrowd.py) - an expansion module for [ThreatCrowd](https://www.threatcrowd.org/).
* [threatminer](misp_modules/modules/expansion/threatminer.py) - an expansion module to expand from [ThreatMiner](https://www.threatminer.org/).
* [virustotal](misp_modules/modules/expansion/virustotal.py) - an expansion module to pull known resolutions and malware samples related to an IP or domain from VirusTotal (this module requires a VirusTotal private API key).
* [VMray](misp_modules/modules/expansion/vmray_submit.py) - a module to submit a sample to VMray.
* [VulnDB](misp_modules/modules/expansion/vulndb.py) - a module to query [VulnDB](https://www.riskbasedsecurity.com/).
* [whois](misp_modules/modules/expansion) - a module to query a local instance of [uwhois](https://github.com/rafiot/uwhoisd).
* [wikidata](misp_modules/modules/expansion/wiki.py) - a [wikidata](https://www.wikidata.org) expansion module.
* [xforce](misp_modules/modules/expansion/xforceexchange.py) - an IBM X-Force Exchange expansion module.
* [YARA syntax validator](misp_modules/modules/expansion/yara_syntax_validator.py) - a YARA syntax validator.
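
All expansion modules above implement the same tiny interface: a `handler` function that receives the attribute as a JSON payload, plus `introspection` and `version` helpers, exactly as the module sources further down this page show. A minimal sketch of a new expansion module, assuming the standard misp-modules conventions (the `domain` input type is just an example):

~~~python
import json

misperrors = {'error': 'Error'}
mispattributes = {'input': ['domain'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'you',
              'description': 'Minimal example expansion module.',
              'module-type': ['expansion', 'hover']}
moduleconfig = []


def handler(q=False):
    # q is a JSON string carrying the attribute(s) to expand
    if q is False:
        return False
    request = json.loads(q)
    if not request.get('domain'):
        misperrors['error'] = 'Domain attribute is missing'
        return misperrors
    value = 'expanded: {}'.format(request['domain'])
    return {'results': [{'types': mispattributes['output'], 'values': value}]}


def introspection():
    # tells MISP which attribute types the module consumes and produces
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
~~~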
@ -45,7 +57,7 @@ For more information: [Extending MISP with Python modules](https://www.circl.lu/
|
|||
### Export modules
|
||||
|
||||
* [CEF](misp_modules/modules/export_mod/cef_export.py) module to export Common Event Format (CEF).
|
||||
* [GoAML export](misp_modules/modules/export_mod/goamlexport.py) module to export in GoAML format.
|
||||
* [GoAML export](misp_modules/modules/export_mod/goamlexport.py) module to export in [GoAML format](http://goaml.unodc.org/goaml/en/index.html).
|
||||
* [Lite Export](misp_modules/modules/export_mod/liteexport.py) module to export a lite event.
|
||||
* [Simple PDF export](misp_modules/modules/export_mod/pdfexport.py) module to export in PDF (required: asciidoctor-pdf).
|
||||
* [ThreatConnect](misp_modules/modules/export_mod/threat_connect_export.py) module to export in ThreatConnect CSV format.
|
||||
|
@ -56,9 +68,9 @@ For more information: [Extending MISP with Python modules](https://www.circl.lu/
|
|||
* [CSV import](misp_modules/modules/import_mod/csvimport.py) Customizable CSV import module.
|
||||
* [Cuckoo JSON](misp_modules/modules/import_mod/cuckooimport.py) Cuckoo JSON import.
|
||||
* [Email Import](misp_modules/modules/import_mod/email_import.py) Email import module for MISP to import basic metadata.
|
||||
* [GoAML import](misp_modules/modules/import_mod/) Module to import [GoAML](http://goaml.unodc.org/goaml/en/index.html) XML format.
|
||||
* [OCR](misp_modules/modules/import_mod/ocr.py) Optical Character Recognition (OCR) module for MISP to import attributes from images, scan or faxes.
|
||||
* [OpenIOC](misp_modules/modules/import_mod/openiocimport.py) OpenIOC import based on PyMISP library.
|
||||
* [stiximport](misp_modules/modules/import_mod/stiximport.py) - An import module to process STIX xml/json.
|
||||
* [ThreatAnalyzer](misp_modules/modules/import_mod/threatanalyzer_import.py) - An import module to process ThreatAnalyzer archive.zip/analysis.json sandbox exports.
|
||||
* [VMRay](misp_modules/modules/import_mod/vmray_import.py) - An import module to process VMRay export.
|
||||
|
||||
|
@ -372,7 +384,11 @@ Recommended Plugin.Import_ocr_enabled true Enable or disable the ocr
|
|||
In this same menu set any other plugin settings that are required for testing.
|
||||
|
||||
## Install misp-module on an offline instance.
|
||||
<<<<<<< HEAD
|
||||
First, you need to grab all necessery packages for example like this :
|
||||
=======
|
||||
First, you need to grab all necessary packages for example like this :
|
||||
>>>>>>> 79633242c842a6ca7c90a4c4e6bc002e05403aef
|
||||
|
||||
Use pip wheel to create an archive
|
||||
~~~
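# A sketch of the archive step; the requirements file name and the target
# directory are assumptions, adjust them to your checkout.
pip3 wheel -r REQUIREMENTS --no-cache-dir -w wheels/
tar -czf wheels.tar.gz wheels/
~~~

On the offline instance, the wheels can then be installed without network access, for example with `pip3 install --no-index --find-links=wheels/ <package>`.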
@@ -1,5 +1,3 @@
stix
cybox
tornado
dnspython
requests

@@ -12,13 +10,17 @@ pyeupi
ipasn-redis
asnhistory
git+https://github.com/Rafiot/uwhoisd.git@testing#egg=uwhois&subdirectory=client
git+https://github.com/MISP/MISP-STIX-Converter.git#egg=misp_stix_converter
git+https://github.com/MISP/PyMISP.git#egg=pymisp
git+https://github.com/sebdraven/pyonyphe#egg=pyonyphe
git+https://github.com/sebdraven/pydnstrails#egg=pydnstrails
pillow
pytesseract
wand
SPARQLWrapper
domaintools_api
pygeoip
bs4
oauth2
yara
sigmatools
stix2-patterns
@@ -193,7 +193,7 @@ class QueryModule(tornado.web.RequestHandler):
             if dict_payload.get('timeout'):
                 timeout = datetime.timedelta(seconds=int(dict_payload.get('timeout')))
             else:
-                timeout = datetime.timedelta(seconds=30)
+                timeout = datetime.timedelta(seconds=300)
             response = yield tornado.gen.with_timeout(timeout, self.run_request(jsonpayload))
             self.write(response)
         except tornado.gen.TimeoutError:
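
This raises the fallback timeout from 30 to 300 seconds; callers can still set their own limit per request through the `timeout` field of the JSON payload. A hedged sketch of such a query against a locally running misp-modules service (127.0.0.1:6666 is the usual default bind, and the `dns` module is assumed to be installed):

~~~python
import json
import requests

payload = {
    'module': 'dns',             # any installed expansion module
    'hostname': 'www.circl.lu',  # attribute to expand
    'timeout': 60,               # per-request override, in seconds
}
response = requests.post('http://127.0.0.1:6666/query', data=json.dumps(payload))
print(response.json())
~~~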
@@ -1,6 +1,3 @@
 from . import _vmray

-__all__ = ['vmray_submit', 'asn_history', 'circl_passivedns', 'circl_passivessl',
-           'countrycode', 'cve', 'dns', 'domaintools', 'eupi', 'farsight_passivedns', 'ipasn', 'passivetotal', 'sourcecache',
-           'virustotal', 'whois', 'shodan', 'reversedns', 'geoip_country', 'wiki', 'iprep', 'threatminer', 'otx',
-           'threatcrowd', 'vulndb', 'crowdstrike_falcon', 'yara_syntax_validator']
+__all__ = ['vmray_submit', 'asn_history', 'circl_passivedns', 'circl_passivessl', 'countrycode', 'cve', 'dns', 'domaintools', 'eupi', 'farsight_passivedns', 'ipasn', 'passivetotal', 'sourcecache', 'virustotal', 'whois', 'shodan', 'reversedns', 'geoip_country', 'wiki', 'iprep', 'threatminer', 'otx', 'threatcrowd', 'vulndb', 'crowdstrike_falcon', 'yara_syntax_validator', 'hashdd', 'onyphe', 'onyphe_full', 'rbl', 'xforceexchange', 'sigma_syntax_validator', 'stix2_pattern_syntax_validator', 'sigma_queries', 'dbl_spamhaus']
@@ -20,10 +20,12 @@ common_tlds = {"com":"Commercial (Worldwide)",
                "gov":"Government (USA)"
                }

-codes = requests.get("http://www.geognos.com/api/en/countries/info/all.json").json()
+codes = False

 def handler(q=False):
+    global codes
+    if not codes:
+        codes = requests.get("http://www.geognos.com/api/en/countries/info/all.json").json()
     if q is False:
         return False
     request = json.loads(q)
@@ -0,0 +1,60 @@
import json
import sys

try:
    import dns.resolver
    resolver = dns.resolver.Resolver()
    resolver.timeout = 0.2
    resolver.lifetime = 0.2
except ImportError:
    print("dnspython3 is missing, use 'pip install dnspython3' to install it.")
    sys.exit(0)

misperrors = {'error': 'Error'}
mispattributes = {'input': ['domain', 'domain|ip', 'hostname', 'hostname|port'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer',
              'description': 'Checks Spamhaus DBL for a domain name.',
              'module-type': ['expansion', 'hover']}
moduleconfig = []

dbl = 'dbl.spamhaus.org'
dbl_mapping = {'127.0.1.2': 'spam domain',
               '127.0.1.4': 'phish domain',
               '127.0.1.5': 'malware domain',
               '127.0.1.6': 'botnet C&C domain',
               '127.0.1.102': 'abused legit spam',
               '127.0.1.103': 'abused spammed redirector domain',
               '127.0.1.104': 'abused legit phish',
               '127.0.1.105': 'abused legit malware',
               '127.0.1.106': 'abused legit botnet C&C',
               '127.0.1.255': 'IP queries prohibited!'}


def fetch_requested_value(request):
    for attribute_type in mispattributes['input']:
        if request.get(attribute_type):
            return request[attribute_type].split('|')[0]
    return None


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    requested_value = fetch_requested_value(request)
    if requested_value is None:
        misperrors['error'] = "Unsupported attributes type"
        return misperrors
    query = "{}.{}".format(requested_value, dbl)
    try:
        query_result = resolver.query(query, 'A')[0]
        result = "{} - {}".format(requested_value, dbl_mapping[str(query_result)])
    except Exception as e:
        result = str(e)
    return {'results': [{'types': mispattributes.get('output'), 'values': result}]}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
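
Outside MISP, the module can be exercised by feeding `handler` the same JSON payload the server would send; a minimal sketch (the module path and test domain are illustrative):

~~~python
import json

from misp_modules.modules.expansion.dbl_spamhaus import handler

# hypothetical payload mirroring what MISP sends to the module
print(handler(json.dumps({'domain': 'example.com'})))
~~~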
@@ -0,0 +1,41 @@
import json
import requests

misperrors = {'error': 'Error'}
mispattributes = {'input': ['md5'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Alexandre Dulaunoy', 'description': 'An expansion module to check hashes against hashdd.com including the NSRL dataset.', 'module-type': ['hover']}
moduleconfig = []
hashddapi_url = 'https://api.hashdd.com/'


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    if not request.get('md5'):
        misperrors['error'] = 'MD5 hash value is missing'
        return misperrors
    v = request.get('md5').upper()
    r = requests.post(hashddapi_url, data={'hash': v})
    if r.status_code == 200:
        state = json.loads(r.text)
        if state:
            if state.get(v):
                summary = state[v]['known_level']
            else:
                summary = 'Unknown hash'
        else:
            misperrors['error'] = 'Empty response from hashdd'
            return misperrors
    else:
        misperrors['error'] = '{} API not accessible'.format(hashddapi_url)
        return misperrors

    return {'results': [{'types': mispattributes['output'], 'values': summary}]}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
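
The module is a thin wrapper around a single POST request; the equivalent call, handy for checking API reachability, looks like this (the value below is just the MD5 of the empty string):

~~~python
import requests

r = requests.post('https://api.hashdd.com/',
                  data={'hash': 'D41D8CD98F00B204E9800998ECF8427E'})
print(r.status_code, r.text[:200])
~~~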
@@ -0,0 +1,108 @@
# -*- coding: utf-8 -*-

import json
try:
    from onyphe import Onyphe
except ImportError:
    print("pyonyphe module not installed.")

misperrors = {'error': 'Error'}

mispattributes = {'input': ['ip-src', 'ip-dst', 'hostname', 'domain'],
                  'output': ['hostname', 'domain', 'ip-src', 'ip-dst', 'url']}
# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '1', 'author': 'Sebastien Larinier @sebdraven',
              'description': 'Query on Onyphe',
              'module-type': ['expansion', 'hover']}

# config fields that your code expects from the site admin
moduleconfig = ['apikey']


def handler(q=False):
    if q:
        request = json.loads(q)

        if not request.get('config') or not request['config'].get('apikey'):
            misperrors['error'] = 'Onyphe authentication is missing'
            return misperrors

        api = Onyphe(request['config'].get('apikey'))

        if not api:
            misperrors['error'] = 'Onyphe Error instance api'
            return misperrors

        if request.get('ip-src'):
            ip = request['ip-src']
        elif request.get('ip-dst'):
            ip = request['ip-dst']
        else:
            misperrors['error'] = "Unsupported attributes type"
            return misperrors

        return handle_expansion(api, ip, misperrors)
    else:
        return False


def handle_expansion(api, ip, misperrors):
    result = api.ip(ip)

    if result['status'] == 'nok':
        misperrors['error'] = result['message']
        return misperrors

    result_filtered = {"results": []}
    urls_pasties = []
    asn_list = []
    os_list = []
    domains_resolver = []
    domains_forward = []

    for r in result['results']:
        if r['@category'] == 'pastries':
            if r['@type'] == 'pastebin':
                urls_pasties.append('https://pastebin.com/raw/%s' % r['key'])
        elif r['@category'] == 'synscan':
            asn_list.append(r['asn'])
            os_target = r['os']
            if os_target != 'Unknown':
                os_list.append(r['os'])
        elif r['@category'] == 'resolver' and r['@type'] == 'reverse':
            domains_resolver.append(r['reverse'])
        elif r['@category'] == 'resolver' and r['@type'] == 'forward':
            domains_forward.append(r['forward'])

    result_filtered['results'].append({'types': ['url'], 'values': urls_pasties,
                                       'categories': ['External analysis']})

    result_filtered['results'].append({'types': ['AS'], 'values': list(set(asn_list)),
                                       'categories': ['Network activity']})

    result_filtered['results'].append({'types': ['target-machine'],
                                       'values': list(set(os_list)),
                                       'categories': ['Targeting data']})

    result_filtered['results'].append({'types': ['domain'],
                                       'values': list(set(domains_resolver)),
                                       'categories': ['Network activity'],
                                       'comment': 'resolver to %s' % ip})

    result_filtered['results'].append({'types': ['domain'],
                                       'values': list(set(domains_forward)),
                                       'categories': ['Network activity'],
                                       'comment': 'forward to %s' % ip})
    return result_filtered


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
@@ -0,0 +1,377 @@
# -*- coding: utf-8 -*-

import json
try:
    from onyphe import Onyphe
except ImportError:
    print("pyonyphe module not installed.")

misperrors = {'error': 'Error'}

mispattributes = {'input': ['ip-src', 'ip-dst', 'hostname', 'domain'],
                  'output': ['hostname', 'domain', 'ip-src', 'ip-dst', 'url']}

# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '1', 'author': 'Sebastien Larinier @sebdraven',
              'description': 'Query on Onyphe',
              'module-type': ['expansion', 'hover']}

# config fields that your code expects from the site admin
moduleconfig = ['apikey']


def handler(q=False):
    if q:
        request = json.loads(q)

        if not request.get('config') or not request['config'].get('apikey'):
            misperrors['error'] = 'Onyphe authentication is missing'
            return misperrors

        api = Onyphe(request['config'].get('apikey'))

        if not api:
            misperrors['error'] = 'Onyphe Error instance api'
            return misperrors

        if request.get('ip-src'):
            return handle_ip(api, request['ip-src'], misperrors)
        elif request.get('ip-dst'):
            return handle_ip(api, request['ip-dst'], misperrors)
        elif request.get('domain'):
            return handle_domain(api, request['domain'], misperrors)
        elif request.get('hostname'):
            return handle_domain(api, request['hostname'], misperrors)
        else:
            misperrors['error'] = "Unsupported attributes type"
            return misperrors
    else:
        return False


def handle_domain(api, domain, misperrors):
    result_filtered = {"results": []}

    r, status_ok = expand_pastries(api, misperrors, domain=domain)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error pastries result'
        return misperrors

    r, status_ok = expand_datascan(api, misperrors, domain=domain)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error datascan result'
        return misperrors

    r, status_ok = expand_threatlist(api, misperrors, domain=domain)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error threat list'
        return misperrors

    return result_filtered


def handle_ip(api, ip, misperrors):
    result_filtered = {"results": []}

    r, status_ok = expand_syscan(api, ip, misperrors)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error syscan result'
        return misperrors

    r, status_ok = expand_pastries(api, misperrors, ip=ip)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error pastries result'
        return misperrors

    r, status_ok = expand_datascan(api, misperrors, ip=ip)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error datascan result'
        return misperrors

    r, status_ok = expand_forward(api, ip, misperrors)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error forward result'
        return misperrors

    r, status_ok = expand_reverse(api, ip, misperrors)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error reverse result'
        return misperrors

    r, status_ok = expand_threatlist(api, misperrors, ip=ip)
    if status_ok:
        result_filtered['results'].extend(r)
    else:
        misperrors['error'] = 'Error threat list'
        return misperrors

    return result_filtered


def expand_syscan(api, ip, misperror):
    status_ok = False
    r = []
    asn_list = []
    os_list = []
    geoloc = []
    orgs = []
    results = api.synscan(ip)

    if results['status'] == 'ok':
        status_ok = True
        for elem in results['results']:
            asn_list.append(elem['asn'])
            os_target = elem['os']
            geoloc.append(elem['location'])
            orgs.append(elem['organization'])
            if os_target != 'Unknown' and os_target != 'Undefined':
                os_list.append(elem['os'])

        r.append({'types': ['target-machine'],
                  'values': list(set(os_list)),
                  'categories': ['Targeting data'],
                  'comment': 'OS found on %s with synscan of Onyphe' % ip})

        r.append({'types': ['target-location'],
                  'values': list(set(geoloc)),
                  'categories': ['Targeting data'],
                  'comment': 'Geolocation of %s found with synscan of Onyphe' % ip})

        r.append({'types': ['target-org'],
                  'values': list(set(orgs)),
                  'categories': ['Targeting data'],
                  'comment': 'Organisations of %s found with synscan of Onyphe' % ip})

        r.append({'types': ['AS'],
                  'values': list(set(asn_list)),
                  'categories': ['Network activity'],
                  'comment': 'AS number of %s found with synscan of Onyphe' % ip})

    return r, status_ok


def expand_datascan(api, misperror, **kwargs):
    status_ok = False
    r = []
    asn_list = []
    geoloc = []
    orgs = []
    ports = []

    if 'ip' in kwargs:
        query = kwargs.get('ip')
        results = api.datascan(query)
    else:
        query = kwargs.get('domain')
        results = api.search_datascan('domain:%s' % query)

    if results['status'] == 'ok':
        status_ok = True
        for elem in results['results']:
            asn_list.append(elem['asn'])
            geoloc.append(elem['location'])
            orgs.append(elem['organization'])
            ports.append(elem['port'])

        r.append({'types': ['port'],
                  'values': list(set(ports)),
                  'categories': ['Other'],
                  'comment': 'Ports of %s found with datascan of Onyphe' % query})

        r.append({'types': ['target-location'],
                  'values': list(set(geoloc)),
                  'categories': ['Targeting data'],
                  'comment': 'Geolocation of %s found with datascan of Onyphe' % query})

        r.append({'types': ['target-org'],
                  'values': list(set(orgs)),
                  'categories': ['Targeting data'],
                  'comment': 'Organisations of %s found with datascan of Onyphe' % query})

        r.append({'types': ['AS'],
                  'values': list(set(asn_list)),
                  'categories': ['Network activity'],
                  'comment': 'AS number of %s found with datascan of Onyphe' % query})
    return r, status_ok


def expand_reverse(api, ip, misperror):
    status_ok = False
    r = []
    results = api.reverse(ip)

    domains_reverse = []
    domains = []
    if results['status'] == 'ok':
        status_ok = True

        for elem in results['results']:
            domains_reverse.append(elem['reverse'])
            domains.append(elem['domain'])

        r.append({'types': ['domain'],
                  'values': list(set(domains)),
                  'categories': ['Network activity'],
                  'comment': 'Domains of %s from reverse service of Onyphe' % ip})

        r.append({'types': ['domain'],
                  'values': list(set(domains_reverse)),
                  'categories': ['Network activity'],
                  'comment': 'Reverse domains of %s from reverse service of Onyphe' % ip})
    return r, status_ok


def expand_forward(api, ip, misperror):
    status_ok = False
    r = []
    results = api.forward(ip)

    domains_forward = []
    domains = []
    if results['status'] == 'ok':
        status_ok = True

        for elem in results['results']:
            domains_forward.append(elem['forward'])
            domains.append(elem['domain'])

        r.append({'types': ['domain'],
                  'values': list(set(domains)),
                  'categories': ['Network activity'],
                  'comment': 'Domains of %s from forward service of Onyphe' % ip})

        r.append({'types': ['domain'],
                  'values': list(set(domains_forward)),
                  'categories': ['Network activity'],
                  'comment': 'Forward domains of %s from forward service of Onyphe' % ip})
    return r, status_ok


def expand_pastries(api, misperror, **kwargs):
    status_ok = False
    r = []
    urls_pasties = []
    domains = []
    ips = []

    if 'ip' in kwargs:
        query = kwargs.get('ip')
        result = api.pastries(query)
    if 'domain' in kwargs:
        query = kwargs.get('domain')
        result = api.search_pastries('domain:%s' % query)

    if result['status'] == 'ok':
        status_ok = True
        for item in result['results']:
            if item['@category'] == 'pastries':
                if item['@type'] == 'pastebin':
                    urls_pasties.append('https://pastebin.com/raw/%s' % item['key'])

                if 'domain' in item:
                    domains.extend(item['domain'])
                if 'ip' in item:
                    ips.extend(item['ip'])
                if 'hostname' in item:
                    domains.extend(item['hostname'])

        r.append({'types': ['url'],
                  'values': urls_pasties,
                  'categories': ['External analysis'],
                  'comment': 'URLs of pasties where %s was found' % query})
        r.append({'types': ['domain'], 'values': list(set(domains)),
                  'categories': ['Network activity'],
                  'comment': 'Domains found in pasties of Onyphe'})
        r.append({'types': ['ip-dst'], 'values': list(set(ips)),
                  'categories': ['Network activity'],
                  'comment': 'IPs found in pasties of Onyphe'})

    return r, status_ok


def expand_threatlist(api, misperror, **kwargs):
    status_ok = False
    r = []

    if 'ip' in kwargs:
        query = kwargs.get('ip')
        results = api.threatlist(query)
    else:
        query = kwargs.get('domain')
        results = api.search_threatlist('domain:%s' % query)

    if results['status'] == 'ok':
        status_ok = True
        threat_list = ['seen %s on %s' % (item['seen_date'], item['threatlist'])
                       for item in results['results']]

        r.append({'types': ['comment'],
                  'categories': ['Other'],
                  'values': threat_list,
                  'comment': '%s is present in threatlist' % query})

    return r, status_ok


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
@@ -32,16 +32,15 @@ def valid_ip(ip):
 def findAll(data, keys):
     a = []
     if isinstance(data, dict):
-        for key in data.keys():
+        for key, value in data.items():
             if key == keys:
-                a.append(data[key])
+                a.append(value)
             else:
-                if isinstance(data[key], (dict, list)):
-                    a += findAll(data[key], keys)
+                if isinstance(value, (dict, list)):
+                    a.extend(findAll(value, keys))
     if isinstance(data, list):
         for i in data:
-            a += findAll(i, keys)
-
+            a.extend(findAll(i, keys))
     return a

 def valid_email(email):
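
The rewritten `findAll` recursively walks nested dicts and lists and collects every value stored under the requested key; a small illustration of the behavior:

~~~python
doc = {'a': 1, 'nested': [{'a': 2}, {'b': {'a': 3}}]}

# every value stored under key 'a', at any depth
assert findAll(doc, 'a') == [1, 2, 3]
~~~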
@@ -82,10 +81,10 @@ def handler(q=False):
     return r


-def getHash(hash, key):
+def getHash(_hash, key):

     ret = []
-    req = json.loads(requests.get("https://otx.alienvault.com/otxapi/indicator/file/analysis/" + hash).text)
+    req = json.loads(requests.get("https://otx.alienvault.com/otxapi/indicator/file/analysis/" + _hash).text)

     for ip in findAll(req, "dst"):
         if not isBlacklisted(ip) and valid_ip(ip):

@@ -102,8 +101,8 @@ def getIP(ip, key):
     ret = []
     req = json.loads( requests.get("https://otx.alienvault.com/otxapi/indicator/ip/malware/" + ip + "?limit=1000").text )

-    for hash in findAll(req, "hash"):
-        ret.append({"types": ["sha256"], "values": [hash]})
+    for _hash in findAll(req, "hash"):
+        ret.append({"types": ["sha256"], "values": [_hash]})

     req = json.loads( requests.get("https://otx.alienvault.com/otxapi/indicator/ip/passive_dns/" + ip).text )

@@ -122,21 +121,21 @@ def getDomain(domain, key):

     req = json.loads( requests.get("https://otx.alienvault.com/otxapi/indicator/domain/malware/" + domain + "?limit=1000").text )

-    for hash in findAll(req, "hash"):
-        ret.append({"types": ["sha256"], "values": [hash]})
+    for _hash in findAll(req, "hash"):
+        ret.append({"types": ["sha256"], "values": [_hash]})

     req = json.loads(requests.get("https://otx.alienvault.com/otxapi/indicator/domain/whois/" + domain).text)

-    for domain in findAll(req, "domain"):
-        ret.append({"types": ["hostname"], "values": [domain]})
+    for _domain in findAll(req, "domain"):
+        ret.append({"types": ["hostname"], "values": [_domain]})

     for email in findAll(req, "value"):
         if valid_email(email):
-            ret.append({"types": ["email"], "values": [domain]})
+            ret.append({"types": ["email"], "values": [email]})

-    for domain in findAll(req, "hostname"):
-        if "." in domain and not isBlacklisted(domain):
-            ret.append({"types": ["hostname"], "values": [domain]})
+    for _domain in findAll(req, "hostname"):
+        if "." in _domain and not isBlacklisted(_domain):
+            ret.append({"types": ["hostname"], "values": [_domain]})

     req = json.loads(requests.get("https://otx.alienvault.com/otxapi/indicator/hostname/passive_dns/" + domain).text)
     for ip in findAll(req, "address"):
@@ -6,7 +6,7 @@ try:
     resolver = dns.resolver.Resolver()
     resolver.timeout = 0.2
     resolver.lifetime = 0.2
-except:
+except ImportError:
     print("dnspython3 is missing, use 'pip install dnspython3' to install it.")
     sys.exit(0)

@@ -96,13 +96,12 @@ def handler(q=False):
             txt = resolver.query(query, 'TXT')
             listed.append(query)
             info.append(str(txt[0]))
-        except:
+        except Exception:
             continue
     result = {}
     for l, i in zip(listed, info):
         result[l] = i
-    r = {'results': [{'types': mispattributes.get('output'), 'values': json.dumps(result)}]}
-    return r
+    return {'results': [{'types': mispattributes.get('output'), 'values': json.dumps(result)}]}

 def introspection():
     return mispattributes
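
Since `listed` and `info` stay index-aligned, the remaining loop that builds `result` could be collapsed as well; a sketch of the equivalent one-liner:

~~~python
# pairs each listed RBL query with its TXT record answer
result = dict(zip(listed, info))
~~~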
@@ -0,0 +1,566 @@
import json
import logging
import sys
import time

from dnstrails import APIError
from dnstrails import DnsTrails

log = logging.getLogger('dnstrails')
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter(
    '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)

misperrors = {'error': 'Error'}
mispattributes = {
    'input': ['hostname', 'domain', 'ip-src', 'ip-dst'],
    'output': ['hostname', 'domain', 'ip-src', 'ip-dst', 'dns-soa-email',
               'whois-registrant-email', 'whois-registrant-phone',
               'whois-registrant-name',
               'whois-registrar', 'whois-creation-date', 'domain']
}

moduleinfo = {'version': '1', 'author': 'Sebastien Larinier @sebdraven',
              'description': 'Query on securitytrails.com',
              'module-type': ['expansion', 'hover']}

# config fields that your code expects from the site admin
moduleconfig = ['apikey']


def handler(q=False):
    if q:
        request = json.loads(q)

        if not request.get('config') or not request['config'].get('apikey'):
            misperrors['error'] = 'SecurityTrails authentication is missing'
            return misperrors

        api = DnsTrails(request['config'].get('apikey'))

        if not api:
            misperrors['error'] = 'SecurityTrails Error instance api'
            return misperrors

        if request.get('ip-src'):
            return handle_ip(api, request['ip-src'], misperrors)
        elif request.get('ip-dst'):
            return handle_ip(api, request['ip-dst'], misperrors)
        elif request.get('domain'):
            return handle_domain(api, request['domain'], misperrors)
        elif request.get('hostname'):
            return handle_domain(api, request['hostname'], misperrors)
        else:
            misperrors['error'] = "Unsupported attributes types"
            return misperrors
    else:
        return False


def handle_domain(api, domain, misperrors):
    result_filtered = {"results": []}

    r, status_ok = expand_domain_info(api, misperrors, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error DNS result'
        return misperrors

    time.sleep(1)
    r, status_ok = expand_subdomains(api, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error subdomains result'
        return misperrors

    time.sleep(1)
    r, status_ok = expand_whois(api, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error whois result'
        return misperrors

    time.sleep(1)
    r, status_ok = expand_history_ipv4_ipv6(api, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error history ipv4'
        return misperrors

    time.sleep(1)

    r, status_ok = expand_history_dns(api, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error in expand History DNS'
        return misperrors

    r, status_ok = expand_history_whois(api, domain)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error in expand History Whois'
        return misperrors

    return result_filtered


def handle_ip(api, ip, misperrors):
    result_filtered = {"results": []}

    r, status_ok = expand_searching_domain(api, ip)
    if status_ok:
        if r:
            result_filtered['results'].extend(r)
    else:
        misperrors['error'] += ' Error in expand searching domain'
        return misperrors

    return result_filtered


def expand_domain_info(api, misperror, domain):
    r = []
    status_ok = False
    ns_servers = []
    list_ipv4 = []
    list_ipv6 = []
    servers_mx = []
    soa_hostnames = []

    results = api.domain(domain)

    if results:
        status_ok = True
        if 'current_dns' in results:
            if 'values' in results['current_dns']['ns']:
                ns_servers = [ns_entry['nameserver'] for ns_entry in
                              results['current_dns']['ns']['values']
                              if 'nameserver' in ns_entry]
            if 'values' in results['current_dns']['a']:
                list_ipv4 = [a_entry['ip'] for a_entry in
                             results['current_dns']['a']['values'] if
                             'ip' in a_entry]
            if 'values' in results['current_dns']['aaaa']:
                list_ipv6 = [ipv6_entry['ipv6'] for ipv6_entry in
                             results['current_dns']['aaaa']['values'] if
                             'ipv6' in ipv6_entry]
            if 'values' in results['current_dns']['mx']:
                servers_mx = [mx_entry['hostname'] for mx_entry in
                              results['current_dns']['mx']['values'] if
                              'hostname' in mx_entry]
            if 'values' in results['current_dns']['soa']:
                soa_hostnames = [soa_entry['email'] for soa_entry in
                                 results['current_dns']['soa']['values'] if
                                 'email' in soa_entry]

        if ns_servers:
            r.append({'types': ['domain'],
                      'values': ns_servers,
                      'categories': ['Network activity'],
                      'comment': 'List of name servers of %s first seen %s' %
                                 (domain, results['current_dns']['ns']['first_seen'])})

        if list_ipv4:
            r.append({'types': ['domain|ip'],
                      'values': ['%s|%s' % (domain, ipv4) for ipv4 in list_ipv4],
                      'categories': ['Network activity'],
                      'comment': 'List ipv4 of %s first seen %s' %
                                 (domain, results['current_dns']['a']['first_seen'])})

        if list_ipv6:
            r.append({'types': ['domain|ip'],
                      'values': ['%s|%s' % (domain, ipv6) for ipv6 in list_ipv6],
                      'categories': ['Network activity'],
                      'comment': 'List ipv6 of %s first seen %s' %
                                 (domain, results['current_dns']['aaaa']['first_seen'])})

        if servers_mx:
            r.append({'types': ['domain'],
                      'values': servers_mx,
                      'categories': ['Network activity'],
                      'comment': 'List mx of %s first seen %s' %
                                 (domain, results['current_dns']['mx']['first_seen'])})

        if soa_hostnames:
            r.append({'types': ['domain'],
                      'values': soa_hostnames,
                      'categories': ['Network activity'],
                      'comment': 'List soa of %s first seen %s' %
                                 (domain, results['current_dns']['soa']['first_seen'])})

    return r, status_ok


def expand_subdomains(api, domain):
    r = []
    status_ok = False

    try:
        results = api.subdomains(domain)

        if results:
            status_ok = True
            if 'subdomains' in results:
                r.append({
                    'types': ['domain'],
                    'values': ['%s.%s' % (sub, domain)
                               for sub in results['subdomains']],
                    'categories': ['Network activity'],
                    'comment': 'subdomains of %s' % domain
                })
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    return r, status_ok


def expand_whois(api, domain):
    r = []
    status_ok = False

    try:
        results = api.whois(domain)

        if results:
            status_ok = True
            item_registrant = __select_registrant_item(results)
            if item_registrant:
                if 'email' in item_registrant[0]:
                    r.append({
                        'types': ['whois-registrant-email'],
                        'values': [item_registrant[0]['email']],
                        'categories': ['Attribution'],
                        'comment': 'Whois information of %s by securitytrails' % domain
                    })

                if 'telephone' in item_registrant[0]:
                    r.append({
                        'types': ['whois-registrant-phone'],
                        'values': [item_registrant[0]['telephone']],
                        'categories': ['Attribution'],
                        'comment': 'Whois information of %s by securitytrails' % domain
                    })

                if 'name' in item_registrant[0]:
                    r.append({
                        'types': ['whois-registrant-name'],
                        'values': [item_registrant[0]['name']],
                        'categories': ['Attribution'],
                        'comment': 'Whois information of %s by securitytrails' % domain
                    })

                if 'registrarName' in item_registrant[0]:
                    r.append({
                        'types': ['whois-registrar'],
                        'values': [item_registrant[0]['registrarName']],
                        'categories': ['Attribution'],
                        'comment': 'Whois information of %s by securitytrails' % domain
                    })

                if 'createdDate' in item_registrant[0]:
                    r.append({
                        'types': ['whois-creation-date'],
                        'values': [item_registrant[0]['createdDate']],
                        'categories': ['Attribution'],
                        'comment': 'Whois information of %s by securitytrails' % domain
                    })
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    return r, status_ok


def expand_history_ipv4_ipv6(api, domain):
    r = []
    status_ok = False

    try:
        results = api.history_dns_ipv4(domain)

        if results:
            status_ok = True
            r.extend(__history_ip(results, domain))

        time.sleep(1)
        results = api.history_dns_aaaa(domain)

        if results:
            status_ok = True
            r.extend(__history_ip(results, domain, type_ip='ipv6'))
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    return r, status_ok


def expand_history_dns(api, domain):
    r = []
    status_ok = False

    try:
        results = api.history_dns_ns(domain)
        if results:
            r.extend(__history_dns(results, domain, 'nameserver', 'ns'))

        time.sleep(1)

        results = api.history_dns_soa(domain)
        if results:
            r.extend(__history_dns(results, domain, 'email', 'soa'))

        time.sleep(1)

        results = api.history_dns_mx(domain)
        if results:
            r.extend(__history_dns(results, domain, 'host', 'mx'))
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    status_ok = True

    return r, status_ok


def expand_history_whois(api, domain):
    r = []
    status_ok = False
    try:
        results = api.history_whois(domain)

        if results:
            if 'items' in results['result']:
                for item in results['result']['items']:
                    item_registrant = __select_registrant_item(item)

                    r.append({
                        'types': ['domain'],
                        'values': item['nameServers'],
                        'categories': ['Network activity'],
                        'comment': 'Whois history Name Servers of %s '
                                   'Status: %s' % (domain, ' '.join(item['status']))
                    })
                    if item_registrant:
                        if 'email' in item_registrant[0]:
                            r.append({
                                'types': ['whois-registrant-email'],
                                'values': [item_registrant[0]['email']],
                                'categories': ['Attribution'],
                                'comment': 'Whois history registrant email of %s '
                                           'Status: %s' % (domain, ' '.join(item['status']))
                            })

                        if 'telephone' in item_registrant[0]:
                            r.append({
                                'types': ['whois-registrant-phone'],
                                'values': [item_registrant[0]['telephone']],
                                'categories': ['Attribution'],
                                'comment': 'Whois history registrant phone of %s '
                                           'Status: %s' % (domain, ' '.join(item['status']))
                            })
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    status_ok = True

    return r, status_ok


def __history_ip(results, domain, type_ip='ip'):
    r = []
    if 'records' in results:
        for record in results['records']:
            if 'values' in record:
                for item in record['values']:
                    r.append({
                        'types': ['domain|ip'],
                        'values': ['%s|%s' % (domain, item[type_ip])],
                        'categories': ['Network activity'],
                        'comment': 'History IP on securitytrails %s '
                                   'last seen: %s first seen: %s' %
                                   (domain, record['last_seen'], record['first_seen'])
                    })

    return r


def __history_dns(results, domain, type_serv, service):
    r = []

    if 'records' in results:
        for record in results['records']:
            if 'values' in record:
                values = record['values']
                if type(values) is list:
                    for item in record['values']:
                        r.append({
                            'types': ['domain|ip'],
                            'values': [item[type_serv]],
                            'categories': ['Network activity'],
                            'comment': 'history %s of %s last seen: %s first seen: %s' %
                                       (service, domain, record['last_seen'], record['first_seen'])
                        })
                else:
                    r.append({
                        'types': ['domain|ip'],
                        'values': [values[type_serv]],
                        'categories': ['Network activity'],
                        'comment': 'history %s of %s last seen: %s first seen: %s' %
                                   (service, domain, record['last_seen'], record['first_seen'])
                    })
    return r


def expand_searching_domain(api, ip):
    r = []
    status_ok = False

    try:
        results = api.searching_domains(ipv4=ip)

        if results:
            if 'records' in results:
                res = [(rec['host_provider'], rec['hostname'], rec['whois'])
                       for rec in results['records']]

                for host_provider, hostname, whois in res:
                    comment = 'domain for %s by %s' % (ip, host_provider[0])
                    if whois['registrar']:
                        comment = comment + ' registrar %s' % whois['registrar']

                    r.append({
                        'types': ['domain'],
                        'categories': ['Network activity'],
                        'values': [hostname],
                        'comment': comment
                    })
        status_ok = True
    except APIError as e:
        misperrors['error'] = e.value
        return [], False

    return r, status_ok


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo


def __select_registrant_item(entry):
    res = None
    if 'contacts' in entry:
        res = list(filter(lambda x: x['type'] == 'registrant',
                          entry['contacts']))

    if 'contact' in entry:
        res = list(filter(lambda x: x['type'] == 'registrant',
                          entry['contact']))

    return res
@@ -0,0 +1,50 @@
import io
import json
import sys

try:
    from sigma.parser import SigmaCollectionParser
    from sigma.config import SigmaConfiguration
    from sigma.backends import getBackend, BackendOptions
except ImportError:
    print("sigma or yaml is missing, use 'pip3 install sigmatools' to install it.")

misperrors = {'error': 'Error'}
mispattributes = {'input': ['sigma'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer', 'module-type': ['expansion', 'hover'],
              'description': 'An expansion hover module to display the result of sigma queries.'}
moduleconfig = []
sigma_targets = ('es-dsl', 'es-qs', 'graylog', 'kibana', 'xpack-watcher', 'logpoint', 'splunk', 'grep', 'wdatp', 'splunkxml', 'arcsight', 'qualys')


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    if not request.get('sigma'):
        misperrors['error'] = 'Sigma rule missing'
        return misperrors
    config = SigmaConfiguration()
    backend_options = BackendOptions(None)
    f = io.TextIOWrapper(io.BytesIO(request.get('sigma').encode()), encoding='utf-8')
    parser = SigmaCollectionParser(f, config, None)
    targets = []
    old_stdout = sys.stdout
    result = io.StringIO()
    sys.stdout = result
    for t in sigma_targets:
        backend = getBackend(t)(config, backend_options, None)
        try:
            parser.generate(backend)
            backend.finalize()
            print("#NEXT")
            targets.append(t)
        except Exception:
            continue
    sys.stdout = old_stdout
    results = result.getvalue()[:-5].split('#NEXT')
    d_result = {t: r.strip() for t, r in zip(targets, results)}
    return {'results': [{'types': mispattributes['output'], 'values': d_result}]}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
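
The handler captures whatever the Sigma backends print by swapping `sys.stdout` manually; the standard library offers a tidier way to express the same capture, sketched here:

~~~python
import contextlib
import io

buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    print("#NEXT")  # anything printed inside the block lands in the buffer
captured = buffer.getvalue()
~~~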
@@ -0,0 +1,35 @@
import json
try:
    import yaml
    from sigma.parser import SigmaParser
    from sigma.config import SigmaConfiguration
except ImportError:
    print("sigma or yaml is missing, use 'pip3 install sigmatools' to install it.")

misperrors = {'error': 'Error'}
mispattributes = {'input': ['sigma'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer', 'module-type': ['expansion', 'hover'],
              'description': 'An expansion hover module to perform a syntax check on sigma rules.'}
moduleconfig = []


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    if not request.get('sigma'):
        misperrors['error'] = 'Sigma rule missing'
        return misperrors
    config = SigmaConfiguration()
    try:
        parser = SigmaParser(yaml.safe_load(request.get('sigma')), config)
        result = "Syntax valid: {}".format(parser.values)
    except Exception as e:
        result = "Syntax error: {}".format(str(e))
    return {'results': [{'types': mispattributes['output'], 'values': result}]}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
@@ -0,0 +1,42 @@
import json
try:
    from stix2patterns.validator import run_validator
except ImportError:
    print("stix2 patterns python library is missing, use 'pip3 install stix2-patterns' to install it.")

misperrors = {'error': 'Error'}
mispattributes = {'input': ['stix2-pattern'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer', 'module-type': ['expansion', 'hover'],
              'description': 'An expansion hover module to perform a syntax check on stix2 patterns.'}
moduleconfig = []


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    if not request.get('stix2-pattern'):
        misperrors['error'] = 'STIX2 pattern missing'
        return misperrors
    pattern = request.get('stix2-pattern')
    syntax_errors = []
    for p in pattern[2:-2].split(' AND '):
        syntax_validator = run_validator("[{}]".format(p))
        if syntax_validator:
            for error in syntax_validator:
                syntax_errors.append(error)
    if syntax_errors:
        s = 's' if len(syntax_errors) > 1 else ''
        s_errors = ""
        for error in syntax_errors:
            s_errors += "{}\n".format(error[6:])
        result = "Syntax error{}: \n{}".format(s, s_errors[:-1])
    else:
        result = "Syntax valid"
    return {'results': [{'types': mispattributes['output'], 'values': result}]}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo
|
@ -0,0 +1,265 @@
import json
import requests
import logging
import sys
import time

log = logging.getLogger('urlscan')
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)

moduleinfo = {
    'version': '0.1',
    'author': 'Dave Johnson',
    'description': 'Module to query urlscan.io',
    'module-type': ['expansion']
}

moduleconfig = ['apikey']
misperrors = {'error': 'Error'}
mispattributes = {
    'input': ['hostname', 'domain', 'url', 'hash'],
    'output': ['hostname', 'domain', 'ip-src', 'ip-dst', 'url', 'text', 'link', 'hash']
}


def handler(q=False):
    if q is False:
        return False
    request = json.loads(q)
    # guard against a missing config block as well as a missing key
    if not request.get('config') or request['config'].get('apikey') is None:
        misperrors['error'] = 'urlscan apikey is missing'
        return misperrors
    client = urlscanAPI(request['config']['apikey'])

    r = {'results': []}

    if 'ip-src' in request:
        r['results'] += lookup_indicator(client, request['ip-src'])
    if 'ip-dst' in request:
        r['results'] += lookup_indicator(client, request['ip-dst'])
    if 'domain' in request:
        r['results'] += lookup_indicator(client, request['domain'])
    if 'hostname' in request:
        r['results'] += lookup_indicator(client, request['hostname'])
    if 'url' in request:
        r['results'] += lookup_indicator(client, request['url'])
    if 'hash' in request:
        r['results'] += lookup_indicator(client, request['hash'])

    # Return any errors generated from the lookup to the UI and remove duplicates
    uniq = []
    log.debug(r['results'])
    for item in r['results']:
        log.debug(item)
        if 'error' in item:
            misperrors['error'] = item['error']
            return misperrors
        if item not in uniq:
            uniq.append(item)
    r['results'] = uniq
    return r


def lookup_indicator(client, query):
    result = client.search_url(query)
    log.debug('RESULTS: ' + json.dumps(result))
    r = []
    misp_comment = "{}: Enriched via the urlscan module".format(query)

    # Determine if the page is reachable
    for request in result['data']['requests']:
        if request['response'].get('failed'):
            if request['response']['failed']['errorText']:
                log.debug('The page could not load')
                r.append(
                    {'error': 'Domain could not be resolved: {}'.format(request['response']['failed']['errorText'])})

    if result.get('page'):
        if result['page'].get('domain'):
            misp_val = result['page']['domain']
            r.append({'types': 'domain',
                      'categories': ['Network activity'],
                      'values': misp_val,
                      'comment': misp_comment})

        if result['page'].get('ip'):
            misp_val = result['page']['ip']
            r.append({'types': 'ip-dst',
                      'categories': ['Network activity'],
                      'values': misp_val,
                      'comment': misp_comment})

        if result['page'].get('country'):
            misp_val = 'country: ' + result['page']['country']
            if result['page'].get('city'):
                misp_val += ', city: ' + result['page']['city']
            r.append({'types': 'text',
                      'categories': ['External analysis'],
                      'values': misp_val,
                      'comment': misp_comment})

        if result['page'].get('asn'):
            misp_val = result['page']['asn']
            r.append({'types': 'AS', 'categories': ['External analysis'], 'values': misp_val, 'comment': misp_comment})

        if result['page'].get('asnname'):
            misp_val = result['page']['asnname']
            r.append({'types': 'text',
                      'categories': ['External analysis'],
                      'values': misp_val,
                      'comment': misp_comment})

    if result.get('stats'):
        if result['stats'].get('malicious'):
            log.debug('There is something in results > stats > malicious')
            threat_list = set()

            if 'matches' in result['meta']['processors']['gsb']['data']:
                for item in result['meta']['processors']['gsb']['data']['matches']:
                    if item['threatType']:
                        threat_list.add(item['threatType'])

            threat_list = ', '.join(threat_list)
            log.debug('threat_list values are: \'' + threat_list + '\'')

            if threat_list:
                misp_val = '{} threat(s) detected'.format(threat_list)
                r.append({'types': 'text',
                          'categories': ['External analysis'],
                          'values': misp_val,
                          'comment': misp_comment})

    if result.get('lists'):
        if result['lists'].get('urls'):
            for url in result['lists']['urls']:
                url = url.lower()
                misp_val = None
                if 'office' in url:
                    misp_val = "Possible Office-themed phishing"
                elif 'o365' in url or '0365' in url:
                    misp_val = "Possible O365-themed phishing"
                elif 'microsoft' in url:
                    misp_val = "Possible Microsoft-themed phishing"
                elif 'paypal' in url:
                    misp_val = "Possible PayPal-themed phishing"
                elif 'onedrive' in url:
                    misp_val = "Possible OneDrive-themed phishing"
                elif 'docusign' in url:
                    misp_val = "Possible DocuSign-themed phishing"
                # only add an attribute when one of the themes above matched
                if misp_val:
                    r.append({'types': 'text',
                              'categories': ['External analysis'],
                              'values': misp_val,
                              'comment': misp_comment})

    if result.get('task'):
        if result['task'].get('reportURL'):
            misp_val = result['task']['reportURL']
            r.append({'types': 'link',
                      'categories': ['External analysis'],
                      'values': misp_val,
                      'comment': misp_comment})

        if result['task'].get('screenshotURL'):
            image_url = result['task']['screenshotURL']
            r.append({'types': 'link',
                      'categories': ['External analysis'],
                      'values': image_url,
                      'comment': misp_comment})
            ### TO DO ###
            ### Add ability to add an in-line screenshot of the target website into an attribute
            # screenshot = requests.get(image_url).content
            # r.append({'types': ['attachment'],
            #           'categories': ['External analysis'],
            #           'values': image_url,
            #           'image': str(base64.b64encode(screenshot), 'utf-8'),
            #           'comment': 'Screenshot of website'})

    return r


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo


class urlscanAPI():
    def __init__(self, apikey=None, uuid=None):
        self.key = apikey
        self.uuid = uuid

    def request(self, query):
        log.debug('From request function with the parameter: ' + query)
        payload = {'url': query}
        headers = {'API-Key': self.key,
                   'Content-Type': "application/json",
                   'Cache-Control': "no-cache"}

        # Troubleshooting problems with initial search request
        log.debug('PAYLOAD: ' + json.dumps(payload))
        log.debug('HEADERS: ' + json.dumps(headers))

        search_url_string = "https://urlscan.io/api/v1/scan/"
        response = requests.request("POST",
                                    search_url_string,
                                    data=json.dumps(payload),
                                    headers=headers)

        # HTTP 400 - Bad Request
        if response.status_code == 400:
            raise Exception('HTTP Error 400 - Bad Request')

        # HTTP 404 - Not found
        if response.status_code == 404:
            raise Exception('HTTP Error 404 - These are not the droids you\'re looking for')

        # Any other status code
        if response.status_code != 200:
            raise Exception('HTTP Error ' + str(response.status_code))

        if response.text:
            response = json.loads(response.content.decode("utf-8"))
            time.sleep(3)
            self.uuid = response['uuid']

            # Strings to check for errors on the results page
            # Null response string for any unavailable resources
            null_response_string = '"status": 404'
            # Redirect string accounting for 301/302/303/307/308 status codes
            redirect_string = '"status": 30'
            # Normal response string with 200 status code
            normal_response_string = '"status": 200'

            results_url_string = "https://urlscan.io/api/v1/result/" + self.uuid
            log.debug('Results URL: ' + results_url_string)

            # Need to wait for results to process and check if they are valid
            tries = 10
            while tries >= 0:
                results = requests.request("GET", results_url_string)
                log.debug('Made a GET request')
                results = results.content.decode("utf-8")
                # checking if there is a 404 status code and no available resources
                if null_response_string in results and \
                        redirect_string not in results and \
                        normal_response_string not in results:
                    log.debug('Results not processed. Please check again later.')
                    time.sleep(3)
                    tries -= 1
                else:
                    return json.loads(results)

            raise Exception('Results contained a 404 status error and could not be processed.')

    def search_url(self, query):
        log.debug('From search_url with parameter: ' + query)
        return self.request(query)

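For a quick end-to-end check of a module like the one above, the running misp-modules service can be queried directly. A minimal sketch, assuming the default service port 6666 (the same convention as the curl test further down) and a placeholder API key:

import json
import requests

# Minimal sketch: post a query to a locally running misp-modules service.
# "YOUR_KEY" is a placeholder for a real urlscan.io API key.
query = {
    "module": "urlscan",
    "url": "https://example.com",
    "config": {"apikey": "YOUR_KEY"}
}
response = requests.post("http://127.0.0.1:6666/query", data=json.dumps(query))
print(response.json())
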
@ -2,201 +2,165 @@ import json
import requests
from requests import HTTPError
import base64
from collections import defaultdict

misperrors = {'error': 'Error'}
mispattributes = {'input': ['hostname', 'domain', "ip-src", "ip-dst", "md5", "sha1", "sha256", "sha512"],
                  'output': ['domain', "ip-src", "ip-dst", "text", "md5", "sha1", "sha256", "sha512", "ssdeep",
                             "authentihash", "filename"]
                  }
                             "authentihash", "filename"]}

# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '2', 'author': 'Hannah Ward',
moduleinfo = {'version': '3', 'author': 'Hannah Ward',
              'description': 'Get information from virustotal',
              'module-type': ['expansion']}

# config fields that your code expects from the site admin
moduleconfig = ["apikey", "event_limit"]
limit = 5  # Default
comment = '%s: Enriched via VT'
comment = '{}: Enriched via VirusTotal'
hash_types = ["md5", "sha1", "sha256", "sha512"]

class VirusTotalRequest(object):
    def __init__(self, config):
        self.apikey = config['apikey']
        self.limit = int(config.get('event_limit', 5))
        self.base_url = "https://www.virustotal.com/vtapi/v2/{}/report"
        self.results = defaultdict(set)
        self.to_return = []
        self.input_types_mapping = {'ip-src': self.get_ip, 'ip-dst': self.get_ip,
                                    'domain': self.get_domain, 'hostname': self.get_domain,
                                    'md5': self.get_hash, 'sha1': self.get_hash,
                                    'sha256': self.get_hash, 'sha512': self.get_hash}
        self.output_types_mapping = {'submission_names': 'filename', 'ssdeep': 'ssdeep',
                                     'authentihash': 'authentihash', 'ITW_urls': 'url'}

    def parse_request(self, q):
        req_values = set()
        for attribute_type, attribute_value in q.items():
            req_values.add(attribute_value)
            try:
                error = self.input_types_mapping[attribute_type](attribute_value)
            except KeyError:
                continue
            if error is not None:
                return error
        for key, values in self.results.items():
            values = values.difference(req_values)
            if values:
                if isinstance(key, tuple):
                    types, comment = key
                    self.to_return.append({'types': list(types), 'values': list(values), 'comment': comment})
                else:
                    self.to_return.append({'types': key, 'values': list(values)})
        return self.to_return

    def get_domain(self, domain, do_not_recurse=False):
        req = requests.get(self.base_url.format('domain'), params={'domain': domain, 'apikey': self.apikey})
        try:
            req.raise_for_status()
            req = req.json()
        except HTTPError as e:
            return str(e)
        if req["response_code"] == 0:
            # Nothing found
            return []
        if "resolutions" in req:
            for res in req["resolutions"][:self.limit]:
                ip_address = res["ip_address"]
                self.results[(("ip-dst", "ip-src"), comment.format(domain))].add(ip_address)
                # Pivot from here to find all domain info
                if not do_not_recurse:
                    error = self.get_ip(ip_address, True)
                    if error is not None:
                        return error
        self.get_more_info(req)

    def get_hash(self, _hash):
        req = requests.get(self.base_url.format('file'), params={'resource': _hash, 'apikey': self.apikey, 'allinfo': 1})
        try:
            req.raise_for_status()
            req = req.json()
        except HTTPError as e:
            return str(e)
        if req["response_code"] == 0:
            # Nothing found
            return []
        self.get_more_info(req)

    def get_ip(self, ip, do_not_recurse=False):
        req = requests.get(self.base_url.format('ip-address'), params={'ip': ip, 'apikey': self.apikey})
        try:
            req.raise_for_status()
            req = req.json()
        except HTTPError as e:
            return str(e)
        if req["response_code"] == 0:
            # Nothing found
            return []
        if "resolutions" in req:
            for res in req["resolutions"][:self.limit]:
                hostname = res["hostname"]
                self.results[(("domain",), comment.format(ip))].add(hostname)
                # Pivot from here to find all domain info
                if not do_not_recurse:
                    error = self.get_domain(hostname, True)
                    if error is not None:
                        return error
        self.get_more_info(req)

    def find_all(self, data):
        hashes = []
        if isinstance(data, dict):
            for key, value in data.items():
                if key in hash_types:
                    self.results[key].add(value)
                    hashes.append(value)
                else:
                    if isinstance(value, (dict, list)):
                        hashes.extend(self.find_all(value))
        elif isinstance(data, list):
            for d in data:
                hashes.extend(self.find_all(d))
        return hashes

    def get_more_info(self, req):
        # Get all hashes first
        hashes = self.find_all(req)
        for h in hashes[:self.limit]:
            # Search VT for some juicy info
            try:
                data = requests.get(self.base_url.format('file'), params={'resource': h, 'apikey': self.apikey, 'allinfo': 1}).json()
            except Exception:
                continue
            # Go through each key and check if it exists
            for VT_type, MISP_type in self.output_types_mapping.items():
                if VT_type in data:
                    self.results[((MISP_type,), comment.format(h))].add(data[VT_type])
            # Get the malware sample
            sample = requests.get(self.base_url[:-6].format('file/download'), params={'hash': h, 'apikey': self.apikey})
            malsample = sample.content
            # It is possible for VT to not give us any submission names
            if "submission_names" in data:
                self.to_return.append({"types": ["malware-sample"], "categories": ["Payload delivery"],
                                       "values": data["submission_names"], "data": str(base64.b64encode(malsample), 'utf-8')})

def handler(q=False):
    global limit
    if q is False:
        return False

    q = json.loads(q)

    key = q["config"]["apikey"]
    limit = int(q["config"].get("event_limit", 5))

    r = {"results": []}

    if "ip-src" in q:
        r["results"] += getIP(q["ip-src"], key)
    if "ip-dst" in q:
        r["results"] += getIP(q["ip-dst"], key)
    if "domain" in q:
        r["results"] += getDomain(q["domain"], key)
    if 'hostname' in q:
        r["results"] += getDomain(q['hostname'], key)
    if 'md5' in q:
        r["results"] += getHash(q['md5'], key)
    if 'sha1' in q:
        r["results"] += getHash(q['sha1'], key)
    if 'sha256' in q:
        r["results"] += getHash(q['sha256'], key)
    if 'sha512' in q:
        r["results"] += getHash(q['sha512'], key)

    uniq = []
    for res in r["results"]:
        if res not in uniq:
            uniq.append(res)
    r["results"] = uniq
    return r


def getHash(hash, key, do_not_recurse=False):
    req = requests.get("https://www.virustotal.com/vtapi/v2/file/report",
                       params={"allinfo": 1, "apikey": key, 'resource': hash})
    try:
        req.raise_for_status()
        req = req.json()
    except HTTPError as e:
        misperrors['error'] = str(e)
    if not q.get('config') or not q['config'].get('apikey'):
        misperrors['error'] = "A VirusTotal api key is required for this module."
        return misperrors

    if req["response_code"] == 0:
        # Nothing found
        return []

    return getMoreInfo(req, key)


def getIP(ip, key, do_not_recurse=False):
    global limit
    toReturn = []
    req = requests.get("https://www.virustotal.com/vtapi/v2/ip-address/report",
                       params={"ip": ip, "apikey": key})
    try:
        req.raise_for_status()
        req = req.json()
    except HTTPError as e:
        misperrors['error'] = str(e)
    del q['module']
    query = VirusTotalRequest(q.pop('config'))
    r = query.parse_request(q)
    if isinstance(r, str):
        misperrors['error'] = r
        return misperrors

    if req["response_code"] == 0:
        # Nothing found
        return []

    if "resolutions" in req:
        for res in req["resolutions"][:limit]:
            toReturn.append({"types": ["domain"], "values": [res["hostname"]], "comment": comment % ip})
            # Pivot from here to find all domain info
            if not do_not_recurse:
                toReturn += getDomain(res["hostname"], key, True)

    toReturn += getMoreInfo(req, key)
    return toReturn


def getDomain(domain, key, do_not_recurse=False):
    global limit
    toReturn = []
    req = requests.get("https://www.virustotal.com/vtapi/v2/domain/report",
                       params={"domain": domain, "apikey": key})
    try:
        req.raise_for_status()
        req = req.json()
    except HTTPError as e:
        misperrors['error'] = str(e)
        return misperrors

    if req["response_code"] == 0:
        # Nothing found
        return []

    if "resolutions" in req:
        for res in req["resolutions"][:limit]:
            toReturn.append({"types": ["ip-dst", "ip-src"], "values": [res["ip_address"]], "comment": comment % domain})
            # Pivot from here to find all info on IPs
            if not do_not_recurse:
                toReturn += getIP(res["ip_address"], key, True)
    if "subdomains" in req:
        for subd in req["subdomains"]:
            toReturn.append({"types": ["domain"], "values": [subd], "comment": comment % domain})
    toReturn += getMoreInfo(req, key)
    return toReturn


def findAll(data, keys):
    a = []
    if isinstance(data, dict):
        for key in data.keys():
            if key in keys:
                a.append(data[key])
            else:
                if isinstance(data[key], (dict, list)):
                    a += findAll(data[key], keys)
    if isinstance(data, list):
        for i in data:
            a += findAll(i, keys)

    return a


def getMoreInfo(req, key):
    global limit
    r = []
    # Get all hashes first
    hashes = findAll(req, ["md5", "sha1", "sha256", "sha512"])
    r.append({"types": ["freetext"], "values": hashes})
    for hsh in hashes[:limit]:
        # Search VT for some juicy info
        try:
            data = requests.get("http://www.virustotal.com/vtapi/v2/file/report",
                                params={"allinfo": 1, "apikey": key, "resource": hsh}
                                ).json()
        except Exception:
            continue

        # Go through each key and check if it exists
        if "submission_names" in data:
            r.append({'types': ["filename"], "values": data["submission_names"], "comment": comment % hsh})

        if "ssdeep" in data:
            r.append({'types': ["ssdeep"], "values": [data["ssdeep"]], "comment": comment % hsh})

        if "authentihash" in data:
            r.append({"types": ["authentihash"], "values": [data["authentihash"]], "comment": comment % hsh})

        if "ITW_urls" in data:
            r.append({"types": ["url"], "values": data["ITW_urls"], "comment": comment % hsh})

        # Get the malware sample
        sample = requests.get("https://www.virustotal.com/vtapi/v2/file/download",
                              params={"hash": hsh, "apikey": key})

        malsample = sample.content

        # It is possible for VT to not give us any submission names
        if "submission_names" in data:
            r.append({"types": ["malware-sample"],
                      "categories": ["Payload delivery"],
                      "values": data["submission_names"],
                      "data": str(base64.b64encode(malsample), 'utf-8')
                      }
                     )

    return r

    return {'results': r}


def introspection():
    return mispattributes


def version():
    moduleinfo['config'] = moduleconfig
    return moduleinfo

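The class-based rewrite above accumulates values in a defaultdict(set) keyed by (types, comment) tuples, so duplicate values collapse for free before the results are flattened into MISP attribute dicts. A standalone sketch of that accumulation pattern:

from collections import defaultdict

# Values are gathered per (types, comment) key in a set, so duplicates
# collapse automatically, then everything is flattened into result dicts.
results = defaultdict(set)
key = (("ip-dst", "ip-src"), "example.com: Enriched via VirusTotal")
results[key].add("198.51.100.1")
results[key].add("198.51.100.1")  # duplicate, silently absorbed by the set

to_return = []
for (types, comment), values in results.items():
    to_return.append({'types': list(types), 'values': list(values), 'comment': comment})
print(to_return)  # a single entry with a single value
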
@ -2,7 +2,7 @@ import json
import requests
try:
    import yara
except:
except (OSError, ImportError):
    print("yara is missing, use 'pip3 install yara' to install it.")

misperrors = {'error': 'Error'}

@ -1 +1 @@
__all__ = ['testexport','cef_export','liteexport','goamlexport','threat_connect_export','pdfexport','threatStream_misp_export']
__all__ = ['cef_export','liteexport','goamlexport','threat_connect_export','pdfexport','threatStream_misp_export']

@ -1,4 +1,5 @@
import json, datetime, base64
import json
import base64
from pymisp import MISPEvent
from collections import defaultdict, Counter


@ -67,7 +68,7 @@ class GoAmlGeneration(object):
        try:
            report_code.append(obj.get_attributes_by_relation('report-code')[0].value.split(' ')[0])
            currency_code.append(obj.get_attributes_by_relation('currency-code')[0].value)
        except:
        except IndexError:
            print('report_code or currency_code error')
        self.uuids, self.report_codes, self.currency_codes = uuids, report_code, currency_code


@ -87,9 +88,12 @@ class GoAmlGeneration(object):
        person_to_parse = [person_uuid for person_uuid in self.uuids.get('person') if person_uuid not in self.parsed_uuids.get('person')]
        if len(person_to_parse) == 1:
            self.itterate('person', 'reporting_person', person_to_parse[0], 'header')
        location_to_parse = [location_uuid for location_uuid in self.uuids.get('geolocation') if location_uuid not in self.parsed_uuids.get('geolocation')]
        if len(location_to_parse) == 1:
            self.itterate('geolocation', 'location', location_to_parse[0], 'header')
        try:
            location_to_parse = [location_uuid for location_uuid in self.uuids.get('geolocation') if location_uuid not in self.parsed_uuids.get('geolocation')]
            if len(location_to_parse) == 1:
                self.itterate('geolocation', 'location', location_to_parse[0], 'header')
        except TypeError:
            pass
        self.xml['data'] += "</report>"

    def itterate(self, object_type, aml_type, uuid, xml_part):

@ -1,4 +1,3 @@
from . import _vmray

__all__ = ['vmray_import', 'testimport', 'ocr', 'stiximport', 'cuckooimport', 'goamlimport',
           'email_import', 'mispjson', 'openiocimport', 'threatanalyzer_import', 'csvimport']
__all__ = ['vmray_import', 'ocr', 'cuckooimport', 'goamlimport', 'email_import', 'mispjson', 'openiocimport', 'threatanalyzer_import', 'csvimport']

@ -3,45 +3,58 @@ import json, os, base64
import pymisp

misperrors = {'error': 'Error'}
mispattributes = {'inputSource': ['file'], 'output': ['MISP attributes']}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer',
              'description': 'Import Attributes from a csv file.',
              'module-type': ['import']}
moduleconfig = ['header']
moduleconfig = []
inputSource = ['file']
userConfig = {'header': {
                  'type': 'String',
                  'message': 'Define the header of the csv file, with types (included in MISP attribute types or attribute fields) separated by commas.\nFor fields that do not match these types, please use space or simply nothing between commas.\nFor instance: ip-src,domain, ,timestamp'},
              'has_header': {
                  'type': 'Boolean',
                  'message': 'Tick this box ONLY if there is a header line, NOT COMMENTED, in the file (which will be skipped atm).'
              }}

duplicatedFields = {'mispType': {'mispComment': 'comment'},
                    'attrField': {'attrComment': 'comment'}}
attributesFields = ['type', 'value', 'category', 'to_ids', 'comment', 'distribution']
delimiters = [',', ';', '|', '/', '\t', ' ']

class CsvParser():
    def __init__(self, header):
    def __init__(self, header, has_header):
        self.header = header
        self.fields_number = len(header)
        self.has_header = has_header
        self.attributes = []

    def parse_data(self, data):
        return_data = []
        for line in data:
            l = line.split('#')[0].strip() if '#' in line else line.strip()
            if l:
                return_data.append(l)
        self.data = return_data
        # find which delimiter is used
        self.delimiter, self.length = self.findDelimiter()

    def findDelimiter(self):
        n = len(self.header)
        if n > 1:
            tmpData = []
            for da in self.data:
                tmp = []
                for d in (';', '|', '/', ',', '\t', ' ',):
                    if da.count(d) == (n-1):
                        tmp.append(d)
                if len(tmp) == 1 and tmp == tmpData:
                    return tmpData[0], n
        if self.fields_number == 1:
            for line in data:
                l = line.split('#')[0].strip()
                if l:
                    return_data.append(l)
            self.delimiter = None
        else:
            return None, 1
            self.delimiter_count = dict([(d, 0) for d in delimiters])
            for line in data:
                l = line.split('#')[0].strip()
                if l:
                    self.parse_delimiter(l)
                    return_data.append(l)
            # find which delimiter is used
            self.delimiter = self.find_delimiter()
        self.data = return_data[1:] if self.has_header else return_data

    def parse_delimiter(self, line):
        for d in delimiters:
            if line.count(d) >= (self.fields_number - 1):
                self.delimiter_count[d] += 1

    def find_delimiter(self):
        _, delimiter = max((n, v) for v, n in self.delimiter_count.items())
        return delimiter

    def buildAttributes(self):
        # if there is only 1 field of data


@ -59,7 +72,7 @@ class CsvParser():
            datamisp = []
            datasplit = data.split(self.delimiter)
            # in case there is an empty line or an error
            if len(datasplit) != self.length:
            if len(datasplit) != self.fields_number:
                continue
            # pop from the line data that matches with a misp type, using the list of indexes
            for l in list2pop:


@ -93,9 +106,12 @@ class CsvParser():
            elif h in duplicatedFields['attrField']:
                # fields that should be considered as attribute fields
                head.append(duplicatedFields['attrField'].get(h))
            # otherwise, it is an attribute field
            else:
            # or, it could be an attribute field
            elif h in attributesFields:
                head.append(h)
            # otherwise, it is not defined
            else:
                head.append('')
        # return list of indexes of the misp types, list of the misp types, remaining fields that will be attribute fields
        return list2pop, misp, list(reversed(head))


@ -111,9 +127,11 @@ def handler(q=False):
    if not request.get('config') or not request['config'].get('header'):
        misperrors['error'] = "Configuration error"
        return misperrors
    config = request['config'].get('header').split(',')
    config = [c.strip() for c in config]
    csv_parser = CsvParser(config)
    header = request['config'].get('header').split(',')
    header = [c.strip() for c in header]
    has_header = request['config'].get('has_header')
    has_header = True if has_header == '1' else False
    csv_parser = CsvParser(header, has_header)
    csv_parser.parse_data(data.split('\n'))
    # build the attributes
    csv_parser.buildAttributes()


@ -121,7 +139,18 @@ def handler(q=False):
    return r

def introspection():
    return mispattributes
    modulesetup = {}
    try:
        userConfig
        modulesetup['userConfig'] = userConfig
    except NameError:
        pass
    try:
        inputSource
        modulesetup['inputSource'] = inputSource
    except NameError:
        pass
    return modulesetup

def version():
    moduleinfo['config'] = moduleconfig

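The new delimiter detection above is a simple vote: for every line, each candidate delimiter that occurs at least fields_number - 1 times earns a point, and the highest-scoring candidate wins. A self-contained sketch of the idea, assuming a three-column header:

# Standalone sketch of CsvParser's delimiter vote (three columns assumed).
delimiters = [',', ';', '|', '/', '\t', ' ']
fields_number = 3
lines = ['8.8.8.8,example.com,2018-01-01', '1.2.3.4,test.org,2018-01-02']

delimiter_count = {d: 0 for d in delimiters}
for line in lines:
    for d in delimiters:
        if line.count(d) >= (fields_number - 1):
            delimiter_count[d] += 1

# The candidate counted on the most lines is taken as the delimiter.
delimiter = max(delimiter_count, key=delimiter_count.get)
print(delimiter)  # ','
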
@ -1,16 +1,24 @@
import sys
import json
import base64

from PIL import Image

from pytesseract import image_to_string
from io import BytesIO

import logging

log = logging.getLogger('ocr')
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)

misperrors = {'error': 'Error'}
userConfig = {};
userConfig = {};

inputSource = ['file']

moduleinfo = {'version': '0.1', 'author': 'Alexandre Dulaunoy',
moduleinfo = {'version': '0.2', 'author': 'Alexandre Dulaunoy',
              'description': 'Optical Character Recognition (OCR) module for MISP',
              'module-type': ['import']}


@ -18,14 +26,61 @@ moduleconfig = []


def handler(q=False):
    # try to import modules and return errors if a module is not found
    try:
        from PIL import Image
    except ImportError:
        misperrors['error'] = "Please pip(3) install pillow"
        return misperrors

    try:
        # Official ImageMagick module
        from wand.image import Image as WImage
    except ImportError:
        misperrors['error'] = "Please pip(3) install wand"
        return misperrors

    try:
        from pytesseract import image_to_string
    except ImportError:
        misperrors['error'] = "Please pip(3) install pytesseract"
        return misperrors

    if q is False:
        return False
    r = {'results': []}
    request = json.loads(q)
    image = base64.b64decode(request["data"])
    document = base64.b64decode(request["data"])
    document = WImage(blob=document)
    if document.format == 'PDF':
        with document as pdf:
            # Get number of pages
            pages = len(pdf.sequence)
            log.debug("PDF with {} page(s) detected".format(pages))
            # Create new image object where the height will be the number of pages. With huge PDFs this will overflow, break, consume silly memory etc…
            img = WImage(width=pdf.width, height=pdf.height * pages)
            # Cycle through pages and stitch it together to one big file
            for p in range(pages):
                log.debug("Stitching page {}".format(p + 1))
                image = img.composite(pdf.sequence[p], top=pdf.height * p, left=0)
            # Create a png blob
            image = img.make_blob('png')
            log.debug("Final image size is {}x{}".format(pdf.width, pdf.height * (p + 1)))
    else:
        image = document

    image_file = BytesIO(image)
    image_file.seek(0)
    ocrized = image_to_string(Image.open(image_file))

    try:
        im = Image.open(image_file)
    except IOError:
        misperrors['error'] = "Corrupt or not an image file."
        return misperrors

    ocrized = image_to_string(im)

    freetext = {}
    freetext['values'] = ocrized
    freetext['types'] = ['freetext']

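The reworked handler above accepts both plain images and PDFs, stitching PDF pages into one tall image with wand before running pytesseract. Building a test payload for it is straightforward; a minimal sketch, with the file name as a placeholder:

import base64
import json

# Sketch: build the JSON payload the OCR module's handler() expects.
# "sample.png" is a placeholder; any image (or PDF) should work.
with open("sample.png", "rb") as f:
    payload = json.dumps({"data": base64.b64encode(f.read()).decode()})
# Dispatched to handler(q=payload), this returns a 'freetext' attribute
# holding the recognized text.
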
@ -15,7 +15,7 @@ misperrors = {'error': 'Error'}
userConfig = {}
inputSource = ['file']

moduleinfo = {'version': '0.7', 'author': 'Christophe Vandeplas',
moduleinfo = {'version': '0.9', 'author': 'Christophe Vandeplas',
              'description': 'Import for ThreatAnalyzer archive.zip/analysis.json files',
              'module-type': ['import']}


@ -45,16 +45,21 @@ def handler(q=False):
            if re.match(r"Analysis/proc_\d+/modified_files/mapping\.log", zip_file_name):
                with zf.open(zip_file_name, mode='r', pwd=None) as fp:
                    file_data = fp.read()
                    for line in file_data.decode().split('\n'):
                        if line:
                    for line in file_data.decode("utf-8", 'ignore').split('\n'):
                        if not line:
                            continue
                        if line.count('|') == 3:
                            l_fname, l_size, l_md5, l_created = line.split('|')
                            l_fname = cleanup_filepath(l_fname)
                            if l_fname:
                                if l_size == 0:
                                    pass  # FIXME create an attribute for the filename/path
                                else:
                                    # file is a non empty sample, upload the sample later
                                    modified_files_mapping[l_md5] = l_fname
                        if line.count('|') == 4:
                            l_fname, l_size, l_md5, l_sha256, l_created = line.split('|')
                            l_fname = cleanup_filepath(l_fname)
                            if l_fname:
                                if l_size == 0:
                                    results.append({'values': l_fname, 'type': 'filename', 'to_ids': True,
                                                    'categories': ['Artifacts dropped', 'Payload delivery'], 'comment': ''})
                                else:
                                    # file is a non empty sample, upload the sample later
                                    modified_files_mapping[l_md5] = l_fname

        # now really process the data
        for zip_file_name in zf.namelist():  # Get all files in the zip file


@ -84,7 +89,7 @@ def handler(q=False):
                results.append({
                    'values': sample_filename,
                    'data': base64.b64encode(file_data).decode(),
                    'type': 'malware-sample', 'categories': ['Artifacts dropped', 'Payload delivery'], 'to_ids': True, 'comment': ''})
                    'type': 'malware-sample', 'categories': ['Payload delivery', 'Artifacts dropped'], 'to_ids': True, 'comment': ''})
            except Exception as e:
                # no 'sample' in archive, might be an url analysis, just ignore
                pass


@ -109,7 +114,15 @@ def process_analysis_json(analysis_json):
    for process in analysis_json['analysis']['processes']['process']:
        # print_json(process)
        if 'connection_section' in process and 'connection' in process['connection_section']:
            # compensate for absurd behavior of the data format: if one entry = immediately the dict, if multiple entries = list containing dicts
            # this will always create a list, even with only one item
            if isinstance(process['connection_section']['connection'], dict):
                process['connection_section']['connection'] = [process['connection_section']['connection']]
            # iterate over each entry
            for connection_section_connection in process['connection_section']['connection']:
                if 'name_to_ip' in connection_section_connection:  # TA 6.1 data format
                    connection_section_connection['@remote_ip'] = connection_section_connection['name_to_ip']['@result_addresses']
                    connection_section_connection['@remote_hostname'] = connection_section_connection['name_to_ip']['@request_name']

                connection_section_connection['@remote_ip'] = cleanup_ip(connection_section_connection['@remote_ip'])
                connection_section_connection['@remote_hostname'] = cleanup_hostname(connection_section_connection['@remote_hostname'])


@ -120,7 +133,7 @@ def process_analysis_json(analysis_json):
                    #     connection_section_connection['@remote_hostname'],
                    #     connection_section_connection['@remote_ip'])
                    # )
                    yield({'values': val, 'type': 'domain|ip', 'categories': 'Network activity', 'to_ids': True, 'comment': ''})
                    yield({'values': val, 'type': 'domain|ip', 'categories': ['Network activity'], 'to_ids': True, 'comment': ''})
                elif connection_section_connection['@remote_ip']:
                    # print("connection_section_connection ip-dst: {} IDS:yes".format(
                    #     connection_section_connection['@remote_ip'])


@ -134,18 +147,18 @@ def process_analysis_json(analysis_json):
                if 'http_command' in connection_section_connection:
                    for http_command in connection_section_connection['http_command']:
                        # print('connection_section_connection HTTP COMMAND: {}\t{}'.format(
                        #     http_command['@method'],  # comment
                        #     http_command['@url'])  # url
                        #     connection_section_connection['http_command']['@method'],  # comment
                        #     connection_section_connection['http_command']['@url'])  # url
                        # )
                        val = cleanup_url(http_command['@url'])
                        if val:
                            yield({'values': val, 'type': 'url', 'categories': 'Network activity', 'to_ids': True, 'comment': http_command['@method']})
                            yield({'values': val, 'type': 'url', 'categories': ['Network activity'], 'to_ids': True, 'comment': http_command['@method']})

                if 'http_header' in connection_section_connection:
                    for http_header in connection_section_connection['http_header']:
                        if 'User-Agent:' in http_header['@header']:
                            val = http_header['@header'][len('User-Agent: '):]
                            yield({'values': val, 'type': 'user-agent', 'categories': 'Network activity', 'to_ids': False, 'comment': ''})
                            yield({'values': val, 'type': 'user-agent', 'categories': ['Network activity'], 'to_ids': False, 'comment': ''})
                        elif 'Host:' in http_header['@header']:
                            val = http_header['@header'][len('Host: '):]
                            if ':' in val:


@ -158,7 +171,7 @@ def process_analysis_json(analysis_json):
                            if val_hostname and val_port:
                                val_combined = '{}|{}'.format(val_hostname, val_port)
                                # print({'values': val_combined, 'type': 'hostname|port', 'to_ids': True, 'comment': ''})
                                yield({'values': val_combined, 'type': 'hostname|port', 'to_ids': True, 'comment': ''})
                                yield({'values': val_combined, 'type': 'hostname|port', 'categories': ['Network activity'], 'to_ids': True, 'comment': ''})
                            elif val_ip and val_port:
                                val_combined = '{}|{}'.format(val_ip, val_port)
                                # print({'values': val_combined, 'type': 'ip-dst|port', 'to_ids': True, 'comment': ''})


@ -203,7 +216,7 @@ def process_analysis_json(analysis_json):
                    #     networkoperation_section_dns_request_by_name['@request_name'],
                    #     networkoperation_section_dns_request_by_name['@result_addresses'])
                    # )
                    yield({'values': val, 'type': 'domain|ip', 'categories': 'Network activity', 'to_ids': True, 'comment': ''})
                    yield({'values': val, 'type': 'domain|ip', 'categories': ['Network activity'], 'to_ids': True, 'comment': ''})
                elif networkoperation_section_dns_request_by_name['@request_name']:
                    # print("networkoperation_section_dns_request_by_name hostname: {} IDS:yes".format(
                    #     networkoperation_section_dns_request_by_name['@request_name'])


@ -227,14 +240,14 @@ def process_analysis_json(analysis_json):
                    #     networkpacket_section_connect_to_computer['@remote_port'])
                    # )
                    val_combined = "{}|{}".format(networkpacket_section_connect_to_computer['@remote_hostname'], networkpacket_section_connect_to_computer['@remote_ip'])
                    yield({'values': val_combined, 'type': 'hostname|ip', 'to_ids': True, 'comment': ''})
                    yield({'values': val_combined, 'type': 'domain|ip', 'to_ids': True, 'comment': ''})
                elif networkpacket_section_connect_to_computer['@remote_hostname']:
                    # print("networkpacket_section_connect_to_computer hostname: {} IDS:yes COMMENT:port {}".format(
                    #     networkpacket_section_connect_to_computer['@remote_hostname'],
                    #     networkpacket_section_connect_to_computer['@remote_port'])
                    # )
                    val_combined = "{}|{}".format(networkpacket_section_connect_to_computer['@remote_hostname'], networkpacket_section_connect_to_computer['@remote_port'])
                    yield({'values': val_combined, 'type': 'hostname|port', 'to_ids': True, 'comment': ''})
                    yield({'values': val_combined, 'type': 'hostname|port', 'categories': ['Network activity'], 'to_ids': True, 'comment': ''})
                elif networkpacket_section_connect_to_computer['@remote_ip']:
                    # print("networkpacket_section_connect_to_computer ip-dst: {} IDS:yes COMMENT:port {}".format(
                    #     networkpacket_section_connect_to_computer['@remote_ip'],


@ -442,9 +455,9 @@ def cleanup_filepath(item):
        '\\AppData\\Roaming\\Adobe\\Acrobat\\9.0\\UserCache.bin',

        '\\AppData\\Roaming\\Macromedia\\Flash Player\\macromedia.com\\support\\flashplayer\\sys\\settings.sol',
        '\\AppData\\Roaming\Adobe\\Flash Player\\NativeCache\\',
        '\\AppData\\Roaming\\Adobe\\Flash Player\\NativeCache\\',
        'C:\\Windows\\AppCompat\\Programs\\',
        'C:\~'  # caused by temp file created by MS Office when opening malicious doc/xls/...
        'C:\\~'  # caused by temp file created by MS Office when opening malicious doc/xls/...
    }
    if list_in_string(noise_substrings, item):
        return None

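One detail worth highlighting in the hunks above: ThreatAnalyzer's JSON serializes a single child element as a dict but several children as a list of dicts, so the code wraps a lone dict into a single-element list before iterating. That normalization is generic for XML-derived JSON; a minimal sketch:

# Generic normalization for XML-derived JSON: one child arrives as a dict,
# several arrive as a list. Wrapping a lone dict guarantees a list either way.
def as_list(node):
    return [node] if isinstance(node, dict) else node

for connection in as_list({'@remote_ip': '198.51.100.7'}):
    print(connection['@remote_ip'])
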
setup.py
@ -33,8 +33,6 @@ setup(
        'pyeupi',
        'ipasn-redis',
        'asnhistory',
        'stix',
        'cybox',
        'pillow',
        'pytesseract',
        'shodan',

@ -0,0 +1 @@
{"module": "hashdd", "md5": "838DE99E82C5B9753BAC96D82C1A8DCB"}

@ -0,0 +1 @@
curl -s http://127.0.0.1:6666/query -H "Content-Type: application/json" --data @bodyhashdd.json -X POST

tests/stix.xml
@ -1,331 +0,0 @@
|
|||
<stix:STIX_Package
|
||||
xmlns:AddressObj="http://cybox.mitre.org/objects#AddressObject-2"
|
||||
xmlns:campaign="http://stix.mitre.org/Campaign-1"
|
||||
xmlns:ciqIdentity="http://stix.mitre.org/extensions/Identity#CIQIdentity3.0-1"
|
||||
xmlns:cybox="http://cybox.mitre.org/cybox-2"
|
||||
xmlns:cyboxCommon="http://cybox.mitre.org/common-2"
|
||||
xmlns:cyboxVocabs="http://cybox.mitre.org/default_vocabularies-2"
|
||||
xmlns:et="http://stix.mitre.org/ExploitTarget-1"
|
||||
xmlns:bae="http://bae.com"
|
||||
xmlns:indicator="http://stix.mitre.org/Indicator-2"
|
||||
xmlns:stix="http://stix.mitre.org/stix-1"
|
||||
xmlns:stixCommon="http://stix.mitre.org/common-1"
|
||||
xmlns:stixVocabs="http://stix.mitre.org/default_vocabularies-1"
|
||||
xmlns:ta="http://stix.mitre.org/ThreatActor-1"
|
||||
xmlns:ttp="http://stix.mitre.org/TTP-1"
|
||||
xmlns:xal="urn:oasis:names:tc:ciq:xal:3"
|
||||
xmlns:xnl="urn:oasis:names:tc:ciq:xnl:3"
|
||||
xmlns:xpil="urn:oasis:names:tc:ciq:xpil:3"
|
||||
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" id="bae:Package-f0b927dd-9356-49ba-8f56-6d9f3d42fe25" version="1.1.1">
|
||||
<stix:Observables cybox_major_version="2" cybox_minor_version="1" cybox_update_version="0">
|
||||
<cybox:Observable id="bae:Observable-3a106400-cc90-4b5f-ad3d-6837dded226d">
|
||||
<cybox:Title>CNC Server 1</cybox:Title>
|
||||
<cybox:Object id="bae:Address-6e6990d8-20b6-4482-97a2-55732c993982">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType" category="ipv4-addr">
|
||||
<AddressObj:Address_Value>82.146.166.56</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable id="bae:Observable-9b6bd172-96e6-451f-93c6-3632fe05daf7">
|
||||
<cybox:Title>CNC Server 2</cybox:Title>
|
||||
<cybox:Object id="bae:Address-00dc7361-ca55-48ef-8529-72a3ac9c58d7">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType" category="ipv4-addr">
|
||||
<AddressObj:Address_Value>209.239.79.47</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable id="bae:Observable-7605b9f2-af49-429d-95ff-7029681dc20b">
|
||||
<cybox:Title>CNC Server 3</cybox:Title>
|
||||
<cybox:Object id="bae:Address-bbbbdcad-e8f5-449d-99d3-264de0ed512e">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType" category="ipv4-addr">
|
||||
<AddressObj:Address_Value>41.213.121.180</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable id="bae:Observable-f7c32ba1-91bb-4bb1-aee6-80afae13bbad">
|
||||
<cybox:Title>Watering Hole Wordpress</cybox:Title>
|
||||
<cybox:Object id="bae:Address-c0e85056-de10-44f7-a600-73144514e7a1">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType">
|
||||
<AddressObj:Address_Value>eu-society.com</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable id="bae:Observable-d4e819f1-0af9-49d4-826b-02ada3386cf5">
|
||||
<cybox:Title>Watering Hole Wordpress</cybox:Title>
|
||||
<cybox:Object id="bae:Address-ae972731-3b33-4fb1-bab6-053b0056deb5">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType">
|
||||
<AddressObj:Address_Value>aromatravel.org</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable id="bae:Observable-5a951ea5-5beb-40b8-8043-97f05e020993">
|
||||
<cybox:Title>Watering Hole Wordpress</cybox:Title>
|
||||
<cybox:Object id="bae:Address-f0d9ef90-af2d-462d-aa69-bfd55da9e65c">
|
||||
<cybox:Properties xsi:type="AddressObj:AddressObjectType">
|
||||
<AddressObj:Address_Value>bss.servebbs.com</AddressObj:Address_Value>
|
||||
</cybox:Properties>
|
||||
</cybox:Object>
|
||||
</cybox:Observable>
|
||||
</stix:Observables>
|
||||
<stix:Indicators>
|
||||
<stix:Indicator id="bae:indicator-1c3fc03e-86dc-4440-9e95-8b9964f03b92" timestamp="2016-07-04T15:17:14.479542+00:00" xsi:type='indicator:IndicatorType'>
|
||||
<indicator:Title>Watering Hole Detected</indicator:Title>
|
||||
<indicator:Type xsi:type="stixVocabs:IndicatorTypeVocab-1.1">URL Watchlist</indicator:Type>
|
||||
<indicator:Observable id="bae:Observable-4e173154-e739-4917-86e8-8b30e4cf39e5">
|
||||
<cybox:Observable_Composition operator="OR">
|
||||
<cybox:Observable idref="bae:Observable-f7c32ba1-91bb-4bb1-aee6-80afae13bbad">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-d4e819f1-0af9-49d4-826b-02ada3386cf5">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-5a951ea5-5beb-40b8-8043-97f05e020993">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
</cybox:Observable_Composition>
|
||||
</indicator:Observable>
|
||||
</stix:Indicator>
|
||||
<stix:Indicator id="bae:indicator-e5ff413f-074e-41fe-ab20-6d0e0bf10f9c" timestamp="2016-07-04T15:17:14.480747+00:00" xsi:type='indicator:IndicatorType'>
|
||||
<indicator:Title>CnC Beaconing Detected</indicator:Title>
|
||||
<indicator:Type xsi:type="stixVocabs:IndicatorTypeVocab-1.1">C2</indicator:Type>
|
||||
<indicator:Observable id="bae:Observable-f4695e3e-5a27-4889-ba48-456b14146911">
|
||||
<cybox:Observable_Composition operator="OR">
|
||||
<cybox:Observable idref="bae:Observable-3a106400-cc90-4b5f-ad3d-6837dded226d">
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-9b6bd172-96e6-451f-93c6-3632fe05daf7">
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-7605b9f2-af49-429d-95ff-7029681dc20b">
|
||||
</cybox:Observable>
|
||||
</cybox:Observable_Composition>
|
||||
</indicator:Observable>
|
||||
</stix:Indicator>
|
||||
</stix:Indicators>
|
||||
<stix:TTPs>
|
||||
<stix:TTP id="bae:ttp-801d59e9-28c2-4016-976e-011dcab22cfd" timestamp="2016-07-04T15:17:14.479244+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Malware CnC Channels</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.479332+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Resources>
|
||||
<ttp:Infrastructure>
|
||||
<ttp:Type xsi:type="stixVocabs:AttackerInfrastructureTypeVocab-1.0">Hosting</ttp:Type>
|
||||
<ttp:Observable_Characterization cybox_major_version="2" cybox_minor_version="1" cybox_update_version="0">
|
||||
<cybox:Observable idref="bae:Observable-3a106400-cc90-4b5f-ad3d-6837dded226d">
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-9b6bd172-96e6-451f-93c6-3632fe05daf7">
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-7605b9f2-af49-429d-95ff-7029681dc20b">
|
||||
</cybox:Observable>
|
||||
</ttp:Observable_Characterization>
|
||||
</ttp:Infrastructure>
|
||||
</ttp:Resources>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-62c59e28-9480-4080-ad20-6b4092cb118d" timestamp="2016-07-04T15:17:14.476428+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Fingerprinting and whitelisting during watering-hole operations</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477552+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Theft - Credential Theft</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Resources>
|
||||
<ttp:Infrastructure>
|
||||
<ttp:Type xsi:type="stixVocabs:AttackerInfrastructureTypeVocab-1.0">Domain Registration</ttp:Type>
|
||||
<ttp:Observable_Characterization cybox_major_version="2" cybox_minor_version="1" cybox_update_version="0">
|
||||
<cybox:Observable idref="bae:Observable-f7c32ba1-91bb-4bb1-aee6-80afae13bbad">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-d4e819f1-0af9-49d4-826b-02ada3386cf5">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
<cybox:Observable idref="bae:Observable-5a951ea5-5beb-40b8-8043-97f05e020993">
|
||||
<cybox:Title>C2 List</cybox:Title>
|
||||
</cybox:Observable>
|
||||
</ttp:Observable_Characterization>
|
||||
</ttp:Infrastructure>
|
||||
</ttp:Resources>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-af1016d6-a744-4ed7-ac91-00fe2272185a" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-7ffb9e3d-688a-4ec4-8919-f8ccb91d4b59" timestamp="2016-07-04T15:17:14.476589+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Spear-phishing in tandem with 0-day exploits</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477592+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Unauthorized Access</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-445b4827-3cca-42bd-8421-f2e947133c16" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-f13297d6-0963-4a74-89dc-12745db39845" timestamp="2016-07-04T15:17:14.476707+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Infiltration of organisations via third party supplier/partner</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477629+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Unauthorized Access</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-79a0e041-9d5f-49bb-ada4-8322622b162d" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-78aef1ee-8c7d-4a6e-87cc-a4471b29ac45" timestamp="2016-07-04T15:17:14.476822+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Custom recon tool to compromise and identify credentials of the network</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477665+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Theft - Credential Theft</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-af1016d6-a744-4ed7-ac91-00fe2272185a" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-3a5f9a19-fa56-4b7d-ad7f-df64b7efdf98" timestamp="2016-07-04T15:17:14.476932+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Multiple means of C2 communications given the diversity of the attacker toolset</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477708+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-d6dc32b9-2538-4951-8733-3cb9ef1daae2" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-e08d9d94-0294-4fa7-9e1d-8f1e087e3dc0" timestamp="2016-07-04T15:17:14.477046+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>rootkit communicates during the same time as network activity, encoded with an XOR key</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477744+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-d6dc32b9-2538-4951-8733-3cb9ef1daae2" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-898d58cd-369c-4b76-892b-2f5ced35cf0e" timestamp="2016-07-04T15:17:14.477159+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Kernel-centric rootkit waits for network trigger before launching</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477779+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-d6dc32b9-2538-4951-8733-3cb9ef1daae2" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-fb1ac371-8e28-48d9-8905-03f54bd546ee" timestamp="2016-07-04T15:17:14.477269+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Kernel centric exfiltration over TCP/UDP/DNS/ICMP/HTTP</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477814+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Theft</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-786ca8f9-2d9a-4213-b38e-399af4a2e5d6" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-27872de0-c3ce-425d-9348-8f1439645398" timestamp="2016-07-04T15:17:14.477382+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Exfiltration over HTTP/HTTPS</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477849+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Theft</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
<ttp:Kill_Chain_Phases>
|
||||
<stixCommon:Kill_Chain_Phase phase_id="stix:TTP-786ca8f9-2d9a-4213-b38e-399af4a2e5d6" kill_chain_id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff"/>
|
||||
</ttp:Kill_Chain_Phases>
|
||||
</stix:TTP>
|
||||
<stix:TTP id="bae:ttp-c3a4fec7-00b7-4efb-adf9-a38feee005fe" timestamp="2016-07-04T15:17:14.477499+00:00" xsi:type='ttp:TTPType'>
|
||||
<ttp:Title>Use of previously undocumented functions in their Kernel centric attacks</ttp:Title>
|
||||
<ttp:Intended_Effect timestamp="2016-07-04T15:17:14.477888+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage</stixCommon:Value>
|
||||
</ttp:Intended_Effect>
|
||||
</stix:TTP>
|
||||
<stix:Kill_Chains>
|
||||
<stixCommon:Kill_Chain id="stix:TTP-af3e707f-2fb9-49e5-8c37-14026ca0a5ff" definer="LMCO" name="LM Cyber Kill Chain">
|
||||
<stixCommon:Kill_Chain_Phase ordinality="1" name="Reconnaissance" phase_id="stix:TTP-af1016d6-a744-4ed7-ac91-00fe2272185a"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="2" name="Weaponization" phase_id="stix:TTP-445b4827-3cca-42bd-8421-f2e947133c16"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="3" name="Delivery" phase_id="stix:TTP-79a0e041-9d5f-49bb-ada4-8322622b162d"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="4" name="Exploitation" phase_id="stix:TTP-f706e4e7-53d8-44ef-967f-81535c9db7d0"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="5" name="Installation" phase_id="stix:TTP-e1e4e3f7-be3b-4b39-b80a-a593cfd99a4f"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="6" name="Command and Control" phase_id="stix:TTP-d6dc32b9-2538-4951-8733-3cb9ef1daae2"/>
|
||||
<stixCommon:Kill_Chain_Phase ordinality="7" name="Actions on Objectives" phase_id="stix:TTP-786ca8f9-2d9a-4213-b38e-399af4a2e5d6"/>
|
||||
</stixCommon:Kill_Chain>
|
||||
</stix:Kill_Chains>
|
||||
</stix:TTPs>
|
||||
<stix:Exploit_Targets>
|
||||
<stixCommon:Exploit_Target id="bae:et-ca916690-1221-46b3-8e69-dabd46018892" timestamp="2016-07-04T15:17:14.480615+00:00" xsi:type='et:ExploitTargetType'>
|
||||
<et:Title>Privilage Escalation Vulnerability</et:Title>
|
||||
<et:Vulnerability>
|
||||
<et:CVE_ID>CVE-2013-5065</et:CVE_ID>
|
||||
</et:Vulnerability>
|
||||
</stixCommon:Exploit_Target>
|
||||
</stix:Exploit_Targets>
|
||||
<stix:Campaigns>
|
||||
<stix:Campaign id="bae:campaign-2b81268f-c0f0-4bd4-aa63-a880dc05ed82" timestamp="2016-07-04T15:17:14.478248+00:00" xsi:type='campaign:CampaignType'>
|
||||
<campaign:Title>The Epic Turla Campaign</campaign:Title>
|
||||
<campaign:Description>The Epic Turla Campaign</campaign:Description>
|
||||
<campaign:Intended_Effect timestamp="2016-07-04T15:17:14.478460+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage - Political</stixCommon:Value>
|
||||
</campaign:Intended_Effect>
|
||||
<campaign:Attribution>
|
||||
<campaign:Attributed_Threat_Actor>
|
||||
<stixCommon:Threat_Actor idref="bae:threatactor-857fd80b-1b4e-4add-bfa1-d1e90911421b" xsi:type='ta:ThreatActorType'>
|
||||
</stixCommon:Threat_Actor>
|
||||
</campaign:Attributed_Threat_Actor>
|
||||
</campaign:Attribution>
|
||||
</stix:Campaign>
|
||||
<stix:Campaign id="bae:campaign-53a0ca81-b065-431a-b043-0457b0bcfffa" timestamp="2016-07-04T15:17:14.478394+00:00" xsi:type='campaign:CampaignType'>
|
||||
<campaign:Title>SNAKE Campaign</campaign:Title>
|
||||
<campaign:Description>The SNAKE Campaign</campaign:Description>
|
||||
<campaign:Intended_Effect timestamp="2016-07-04T15:17:14.478500+00:00">
|
||||
<stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage - Political</stixCommon:Value>
|
||||
</campaign:Intended_Effect>
|
||||
<campaign:Attribution>
|
||||
<campaign:Attributed_Threat_Actor>
|
||||
<stixCommon:Threat_Actor idref="bae:threatactor-857fd80b-1b4e-4add-bfa1-d1e90911421b" xsi:type='ta:ThreatActorType'>
|
||||
</stixCommon:Threat_Actor>
|
||||
</campaign:Attributed_Threat_Actor>
|
||||
</campaign:Attribution>
|
||||
</stix:Campaign>
|
||||
</stix:Campaigns>
|
||||
  <stix:Threat_Actors>
    <stix:Threat_Actor id="bae:threatactor-857fd80b-1b4e-4add-bfa1-d1e90911421b" timestamp="2016-07-04T15:17:14.454074+00:00" xsi:type='ta:ThreatActorType'>
      <ta:Title>SNAKE</ta:Title>
      <ta:Description>
        The group behind the SNAKE campaign are a top-tier nation-state threat. Their capabilities extend from subtle watering-hole attacks to sophisticated server rootkits – virtually undetectable by conventional security products.
        This threat actor group has been operating continuously for over a decade, infiltrating governments and strategic private sector networks in that time. The most notorious of their early campaigns led to a breach of classified US military systems, an extensive clean-up called ‘Operation Buckshot Yankee’, and the creation of the US Cyber Command.
        Whilst the sophisticated rootkit is used for persistent access to networks, the group also leverage more straightforward capabilities for gaining an initial toehold on targets. These include the use of watering-hole attacks and basic remote access tools.
      </ta:Description>
      <ta:Short_Description>
        The group behind the SNAKE campaign are a top-tier nation-state threat. Their capabilities extend from subtle watering-hole attacks to sophisticated server rootkits – virtually undetectable by conventional security products.
      </ta:Short_Description>
      <ta:Identity xsi:type="ciqIdentity:CIQIdentity3.0InstanceType">
        <ExtSch:Specification xmlns:ExtSch="http://stix.mitre.org/extensions/Identity#CIQIdentity3.0-1">
          <xpil:PartyName xmlns:xpil="urn:oasis:names:tc:ciq:xpil:3">
            <xnl:OrganisationName xmlns:xnl="urn:oasis:names:tc:ciq:xnl:3" xnl:Type="OfficialName">
              <xnl:NameElement>SNAKE</xnl:NameElement>
            </xnl:OrganisationName>
            <xnl:OrganisationName xmlns:xnl="urn:oasis:names:tc:ciq:xnl:3" xnl:Type="OfficialName">
              <xnl:NameElement>Turla</xnl:NameElement>
            </xnl:OrganisationName>
            <xnl:OrganisationName xmlns:xnl="urn:oasis:names:tc:ciq:xnl:3" xnl:Type="UnofficialName">
              <xnl:NameElement>WRAITH</xnl:NameElement>
            </xnl:OrganisationName>
          </xpil:PartyName>
          <xpil:Addresses xmlns:xpil="urn:oasis:names:tc:ciq:xpil:3">
            <xpil:Address>
              <xal:Country xmlns:xal="urn:oasis:names:tc:ciq:xal:3">
                <xal:NameElement>Russia</xal:NameElement>
              </xal:Country>
              <xal:AdministrativeArea xmlns:xal="urn:oasis:names:tc:ciq:xal:3">
                <xal:NameElement>Moscow</xal:NameElement>
              </xal:AdministrativeArea>
            </xpil:Address>
          </xpil:Addresses>
          <xpil:ElectronicAddressIdentifiers xmlns:xpil="urn:oasis:names:tc:ciq:xpil:3">
            <xpil:ElectronicAddressIdentifier>snake@gmail.com</xpil:ElectronicAddressIdentifier>
            <xpil:ElectronicAddressIdentifier>twitter.com/snake</xpil:ElectronicAddressIdentifier>
          </xpil:ElectronicAddressIdentifiers>
          <xpil:Languages xmlns:xpil="urn:oasis:names:tc:ciq:xpil:3">
            <xpil:Language>Russian</xpil:Language>
          </xpil:Languages>
        </ExtSch:Specification>
      </ta:Identity>
      <ta:Motivation timestamp="2016-07-04T15:17:14.475968+00:00">
        <stixCommon:Value xsi:type="stixVocabs:MotivationVocab-1.1">Political</stixCommon:Value>
      </ta:Motivation>
      <ta:Sophistication timestamp="2016-07-04T15:17:14.454350+00:00">
        <stixCommon:Value xsi:type="stixVocabs:ThreatActorSophisticationVocab-1.0">Expert</stixCommon:Value>
      </ta:Sophistication>
      <ta:Intended_Effect timestamp="2016-07-04T15:17:14.476056+00:00">
        <stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Advantage - Political</stixCommon:Value>
      </ta:Intended_Effect>
      <ta:Intended_Effect timestamp="2016-07-04T15:17:14.476095+00:00">
        <stixCommon:Value xsi:type="stixVocabs:IntendedEffectVocab-1.0">Theft - Intellectual Property</stixCommon:Value>
      </ta:Intended_Effect>
    </stix:Threat_Actor>
  </stix:Threat_Actors>
</stix:STIX_Package>
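
That closes the fixture. For a quick look at what an importer has to dig out of a package like this, here is a minimal sketch using only Python's standard `xml.etree.ElementTree`; the namespace URIs are the standard STIX 1.x ones, and the path assumes the fixture lives at `tests/stix.xml` as in the test below:

```python
import xml.etree.ElementTree as ET

# Standard STIX 1.x namespace URIs.
NS = {
    "et": "http://stix.mitre.org/ExploitTarget-1",
    "campaign": "http://stix.mitre.org/Campaign-1",
    "ta": "http://stix.mitre.org/ThreatActor-1",
}

root = ET.parse("tests/stix.xml").getroot()

# CVE identifiers from the Exploit_Targets section.
for cve in root.iterfind(".//et:Vulnerability/et:CVE_ID", NS):
    print("CVE:", cve.text)

# Campaign and threat-actor titles.
for title in root.iterfind(".//campaign:Title", NS):
    print("Campaign:", title.text)
for title in root.iterfind(".//ta:Title", NS):
    print("Threat actor:", title.text)
```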
@ -57,21 +57,6 @@ class TestModules(unittest.TestCase):
assert("mrxcls.sys" in values)
|
||||
assert("mdmcpq3.PNF" in values)
|
||||
|
||||
def test_stix(self):
|
||||
with open("tests/stix.xml", "rb") as f:
|
||||
content = base64.b64encode(f.read())
|
||||
data = json.dumps({"module": "stiximport",
|
||||
"data": content.decode('utf-8'),
|
||||
})
|
||||
response = requests.post(self.url + "query", data=data).json()
|
||||
|
||||
print("STIX :: {}".format(response))
|
||||
values = [x["values"][0] for x in response["results"]]
|
||||
|
||||
assert("209.239.79.47" in values)
|
||||
assert("41.213.121.180" in values)
|
||||
assert("eu-society.com" in values)
|
||||
|
||||
def test_email_headers(self):
|
||||
query = {"module": "email_import"}
|
||||
query["config"] = {"unzip_attachments": None,