Merge branch 'main' of github.com:MISP/misp-modules into chrisr3d_patch

composite_attributes_proposal
chrisr3d 2020-11-03 19:28:44 +01:00
commit 2ada938a08
150 changed files with 8486 additions and 1534 deletions


@ -9,14 +9,32 @@ python:
- "3.6"
- "3.6-dev"
- "3.7-dev"
- "3.8-dev"
before_install:
- docker build -t misp-modules --build-arg BUILD_DATE=$(date -u +"%Y-%m-%d") docker/
install:
- sudo apt-get install libzbar0 libzbar-dev libpoppler-cpp-dev
- sudo apt-get install libzbar0 libzbar-dev libpoppler-cpp-dev tesseract-ocr libfuzzy-dev libcaca-dev liblua5.3-dev
- pip install pipenv
- pipenv install --dev
# install gtcaca
- git clone git://github.com/stricaud/gtcaca.git
- mkdir -p gtcaca/build
- pushd gtcaca/build
- cmake .. && make
- sudo make install
- popd
# install pyfaup
- git clone https://github.com/stricaud/faup.git
- pushd faup/build
- cmake .. && make
- sudo make install
- popd
- sudo ldconfig
- pushd faup/src/lib/bindings/python
- pip install .
- popd
script:
- pipenv run coverage run -m --parallel-mode --source=misp_modules misp_modules.__init__ -l 127.0.0.1 &
@ -31,7 +49,7 @@ script:
- sleep 5
- pipenv run nosetests --with-coverage --cover-package=misp_modules
- kill -s KILL $pid
- pipenv run flake8 --ignore=E501,W503 misp_modules
- pipenv run flake8 --ignore=E501,W503,E226 misp_modules
after_success:
- pipenv run coverage combine .coverage*

Pipfile (10 lines changed)

@ -11,14 +11,14 @@ flake8 = "*"
[packages]
dnspython = "*"
requests = "*"
requests = {extras = ["security"],version = "*"}
urlarchiver = "*"
passivetotal = "*"
pypdns = "*"
pypssl = "*"
pyeupi = "*"
uwhois = {editable = true,git = "https://github.com/Rafiot/uwhoisd.git",ref = "testing",subdirectory = "client"}
pymisp = {editable = true,git = "https://github.com/MISP/PyMISP.git"}
pymisp = {editable = true,extras = ["fileobjects,openioc,pdfexport"],git = "https://github.com/MISP/PyMISP.git"}
pyonyphe = {editable = true,git = "https://github.com/sebdraven/pyonyphe"}
pydnstrails = {editable = true,git = "https://github.com/sebdraven/pydnstrails"}
pytesseract = "*"
@ -56,6 +56,12 @@ lxml = "*"
xlrd = "*"
idna-ssl = {markers = "python_version < '3.7'"}
jbxapi = "*"
geoip2 = "*"
apiosintDS = "*"
assemblyline_client = "*"
vt-graph-api = "*"
trustar = "*"
markdownify = "==0.5.3"
[requires]
python_version = "3"

Pipfile.lock (generated): 1423 lines changed; file diff suppressed because it is too large.

@ -17,54 +17,72 @@ For more information: [Extending MISP with Python modules](https://www.misp-proj
### Expansion modules
* [apiosintDS](misp_modules/modules/expansion/apiosintds.py) - a hover and expansion module to query the OSINT.digitalside.it API.
* [API Void](misp_modules/modules/expansion/apivoid.py) - an expansion and hover module to query API Void with a domain attribute.
* [AssemblyLine submit](misp_modules/modules/expansion/assemblyline_submit.py) - an expansion module to submit samples and urls to AssemblyLine.
* [AssemblyLine query](misp_modules/modules/expansion/assemblyline_query.py) - an expansion module to query AssemblyLine and parse the full submission report.
* [Backscatter.io](misp_modules/modules/expansion/backscatter_io.py) - a hover and expansion module to expand an IP address with mass-scanning observations.
* [BGP Ranking](misp_modules/modules/expansion/bgpranking.py) - a hover and expansion module to expand an AS number with the ASN description, its history, and position in BGP Ranking.
* [BGP Ranking](misp_modules/modules/expansion/bgpranking.py) - a hover and expansion module to expand an AS number with the ASN description and its ranking and position in BGP Ranking.
* [RansomcoinDB check](misp_modules/modules/expansion/ransomcoindb.py) - An expansion hover module to query the [ransomcoinDB](https://ransomcoindb.concinnity-risks.com): it contains mapping between BTC addresses and malware hashes. Enrich MISP by querying for BTC -> hash or hash -> BTC addresses.
* [BTC scam check](misp_modules/modules/expansion/btc_scam_check.py) - An expansion hover module to instantly check if a BTC address has been abused.
* [BTC transactions](misp_modules/modules/expansion/btc_steroids.py) - An expansion hover module to get a blockchain balance and the transactions from a BTC address in MISP.
* [Censys-enrich](misp_modules/modules/expansion/censys_enrich.py) - An expansion and module to retrieve information from censys.io about a particular IP or certificate.
* [CIRCL Passive DNS](misp_modules/modules/expansion/circl_passivedns.py) - a hover and expansion module to expand hostname and IP addresses with passive DNS information.
* [CIRCL Passive SSL](misp_modules/modules/expansion/circl_passivessl.py) - a hover and expansion module to expand IP addresses with the X.509 certificate seen.
* [CIRCL Passive SSL](misp_modules/modules/expansion/circl_passivessl.py) - a hover and expansion module to expand IP addresses with the X.509 certificate(s) seen.
* [countrycode](misp_modules/modules/expansion/countrycode.py) - a hover module to tell you what country a URL belongs to.
* [CrowdStrike Falcon](misp_modules/modules/expansion/crowdstrike_falcon.py) - an expansion module to expand using CrowdStrike Falcon Intel Indicator API.
* [CPE](misp_modules/modules/expansion/cpe.py) - An expansion module to query the CVE Search API with a cpe code, to get its related vulnerabilities.
* [CVE](misp_modules/modules/expansion/cve.py) - a hover module to give more information about a vulnerability (CVE).
* [CVE advanced](misp_modules/modules/expansion/cve_advanced.py) - An expansion module to query the CIRCL CVE search API for more information about a vulnerability (CVE).
* [Cuckoo submit](misp_modules/modules/expansion/cuckoo_submit.py) - A hover module to submit malware sample, url, attachment, domain to Cuckoo Sandbox.
* [Cytomic Orion](misp_modules/modules/expansion/cytomic_orion.py) - An expansion module to enrich attributes in MISP and share indicators of compromise with Cytomic Orion.
* [DBL Spamhaus](misp_modules/modules/expansion/dbl_spamhaus.py) - a hover module to check Spamhaus DBL for a domain name.
* [DNS](misp_modules/modules/expansion/dns.py) - a simple module to resolve MISP attributes like hostname and domain to expand IP address attributes.
* [docx-enrich](misp_modules/modules/expansion/docx-enrich.py) - an enrichment module to get text out of Word document into MISP (using free-text parser).
* [docx-enrich](misp_modules/modules/expansion/docx_enrich.py) - an enrichment module to get text out of Word document into MISP (using free-text parser).
* [DomainTools](misp_modules/modules/expansion/domaintools.py) - a hover and expansion module to get information from [DomainTools](http://www.domaintools.com/) whois.
* [EQL](misp_modules/modules/expansion/eql.py) - an expansion module to generate event query language (EQL) from an attribute. [Event Query Language](https://eql.readthedocs.io/en/latest/)
* [EUPI](misp_modules/modules/expansion/eupi.py) - a hover and expansion module to get information about an URL from the [Phishing Initiative project](https://phishing-initiative.eu/?lang=en).
* [Farsight DNSDB Passive DNS](misp_modules/modules/expansion/farsight_passivedns.py) - a hover and expansion module to expand hostname and IP addresses with passive DNS information.
* [GeoIP](misp_modules/modules/expansion/geoip_country.py) - a hover and expansion module to get GeoIP information from geolite/maxmind.
* [GeoIP_City](misp_modules/modules/expansion/geoip_city.py) - a hover and expansion module to get GeoIP City information from geolite/maxmind.
* [GeoIP_ASN](misp_modules/modules/expansion/geoip_asn.py) - a hover and expansion module to get GeoIP ASN information from geolite/maxmind.
* [Greynoise](misp_modules/modules/expansion/greynoise.py) - a hover module to get information from greynoise.
* [hashdd](misp_modules/modules/expansion/hashdd.py) - a hover module to check file hashes against [hashdd.com](http://www.hashdd.com) including the NSRL dataset.
* [hibp](misp_modules/modules/expansion/hibp.py) - a hover module to lookup against Have I Been Pwned?
* [html_to_markdown](misp_modules/modules/expansion/html_to_markdown.py) - Simple HTML to markdown converter
* [intel471](misp_modules/modules/expansion/intel471.py) - an expansion module to get info from [Intel471](https://intel471.com).
* [IPASN](misp_modules/modules/expansion/ipasn.py) - a hover and expansion to get the BGP ASN of an IP address.
* [iprep](misp_modules/modules/expansion/iprep.py) - an expansion module to get IP reputation from packetmail.net.
* [Joe Sandbox submit](misp_modules/modules/expansion/joesandbox_submit.py) - Submit files and URLs to Joe Sandbox.
* [Joe Sandbox query](misp_modules/modules/expansion/joesandbox_query.py) - Query Joe Sandbox with the link of an analysis and get the parsed data.
* [Lastline submit](misp_modules/modules/expansion/lastline_submit.py) - Submit files and URLs to Lastline.
* [Lastline query](misp_modules/modules/expansion/lastline_query.py) - Query Lastline with the link to an analysis and parse the report.
* [macaddress.io](misp_modules/modules/expansion/macaddress_io.py) - a hover module to retrieve vendor details and other information regarding a given MAC address or an OUI from [MAC address Vendor Lookup](https://macaddress.io). See [integration tutorial here](https://macaddress.io/integrations/MISP-module).
* [macvendors](misp_modules/modules/expansion/macvendors.py) - a hover module to retrieve mac vendor information.
* [ocr-enrich](misp_modules/modules/expansion/ocr-enrich.py) - an enrichment module to get OCRized data from images into MISP.
* [ods-enrich](misp_modules/modules/expansion/ods-enrich.py) - an enrichment module to get text out of OpenOffice spreadsheet document into MISP (using free-text parser).
* [odt-enrich](misp_modules/modules/expansion/odt-enrich.py) - an enrichment module to get text out of OpenOffice document into MISP (using free-text parser).
* [MALWAREbazaar](misp_modules/modules/expansion/malwarebazaar.py) - an expansion module to query MALWAREbazaar with some payload.
* [ocr-enrich](misp_modules/modules/expansion/ocr_enrich.py) - an enrichment module to get OCRized data from images into MISP.
* [ods-enrich](misp_modules/modules/expansion/ods_enrich.py) - an enrichment module to get text out of OpenOffice spreadsheet document into MISP (using free-text parser).
* [odt-enrich](misp_modules/modules/expansion/odt_enrich.py) - an enrichment module to get text out of OpenOffice document into MISP (using free-text parser).
* [onyphe](misp_modules/modules/expansion/onyphe.py) - a module to process queries on Onyphe.
* [onyphe_full](misp_modules/modules/expansion/onyphe_full.py) - a module to process full queries on Onyphe.
* [OTX](misp_modules/modules/expansion/otx.py) - an expansion module for [OTX](https://otx.alienvault.com/).
* [passivetotal](misp_modules/modules/expansion/passivetotal.py) - a [passivetotal](https://www.passivetotal.org/) module that queries a number of different PassiveTotal datasets.
* [pdf-enrich](misp_modules/modules/expansion/pdf-enrich.py) - an enrichment module to extract text from PDF into MISP (using free-text parser).
* [pptx-enrich](misp_modules/modules/expansion/pptx-enrich.py) - an enrichment module to get text out of PowerPoint document into MISP (using free-text parser).
* [pdf-enrich](misp_modules/modules/expansion/pdf_enrich.py) - an enrichment module to extract text from PDF into MISP (using free-text parser).
* [pptx-enrich](misp_modules/modules/expansion/pptx_enrich.py) - an enrichment module to get text out of PowerPoint document into MISP (using free-text parser).
* [qrcode](misp_modules/modules/expansion/qrcode.py) - a module to decode QR codes, barcodes and similar codes from an image and enrich the event with the decoded values.
* [rbl](misp_modules/modules/expansion/rbl.py) - a module to get RBL (Real-time Blackhole List) values from an attribute.
* [recordedfuture](misp_modules/modules/expansion/recordedfuture.py) - a hover and expansion module for enriching MISP attributes with threat intelligence from Recorded Future.
* [reversedns](misp_modules/modules/expansion/reversedns.py) - Simple Reverse DNS expansion service to resolve reverse DNS from MISP attributes.
* [securitytrails](misp_modules/modules/expansion/securitytrails.py) - an expansion module for [securitytrails](https://securitytrails.com/).
* [shodan](misp_modules/modules/expansion/shodan.py) - a minimal [shodan](https://www.shodan.io/) expansion module.
* [Sigma queries](misp_modules/modules/expansion/sigma_queries.py) - Experimental expansion module querying a sigma rule to convert it into all the available SIEM signatures.
* [Sigma syntax validator](misp_modules/modules/expansion/sigma_syntax_validator.py) - Sigma syntax validator.
* [SophosLabs Intelix](misp_modules/modules/expansion/sophoslabs_intelix.py) - SophosLabs Intelix is an API for Threat Intelligence and Analysis (free tier available). [SophosLabs](https://aws.amazon.com/marketplace/pp/B07SLZPMCS)
* [sourcecache](misp_modules/modules/expansion/sourcecache.py) - a module to cache a specific link from a MISP instance.
* [STIX2 pattern syntax validator](misp_modules/modules/expansion/stix2_pattern_syntax_validator.py) - a module to check a STIX2 pattern syntax.
* [ThreatCrowd](misp_modules/modules/expansion/threatcrowd.py) - an expansion module for [ThreatCrowd](https://www.threatcrowd.org/).
* [threatminer](misp_modules/modules/expansion/threatminer.py) - an expansion module to expand from [ThreatMiner](https://www.threatminer.org/).
* [TruSTAR Enrich](misp_modules/modules/expansion/trustar_enrich.py) - an expansion module to enrich MISP data with [TruSTAR](https://www.trustar.co/).
* [urlhaus](misp_modules/modules/expansion/urlhaus.py) - Query urlhaus to get additional data about a domain, hash, hostname, ip or url.
* [urlscan](misp_modules/modules/expansion/urlscan.py) - an expansion module to query [urlscan.io](https://urlscan.io).
* [virustotal](misp_modules/modules/expansion/virustotal.py) - an expansion module to query the [VirusTotal](https://www.virustotal.com/gui/home) API with a high request rate limit required. (More details about the API: [here](https://developers.virustotal.com/reference))
@ -75,40 +93,44 @@ For more information: [Extending MISP with Python modules](https://www.misp-proj
* [whois](misp_modules/modules/expansion/whois.py) - a module to query a local instance of [uwhois](https://github.com/rafiot/uwhoisd).
* [wikidata](misp_modules/modules/expansion/wiki.py) - a [wikidata](https://www.wikidata.org) expansion module.
* [xforce](misp_modules/modules/expansion/xforceexchange.py) - an IBM X-Force Exchange expansion module.
* [xlsx-enrich](misp_modules/modules/expansion/xlsx-enrich.py) - an enrichment module to get text out of an Excel document into MISP (using free-text parser).
* [xlsx-enrich](misp_modules/modules/expansion/xlsx_enrich.py) - an enrichment module to get text out of an Excel document into MISP (using free-text parser).
* [YARA query](misp_modules/modules/expansion/yara_query.py) - a module to create YARA rules from single hash attributes.
* [YARA syntax validator](misp_modules/modules/expansion/yara_syntax_validator.py) - YARA syntax validator.
### Export modules
* [CEF](misp_modules/modules/export_mod/cef_export.py) module to export Common Event Format (CEF).
* [Cisco FireSight Manager ACL rule](misp_modules/modules/export_mod/cisco_firesight_manager_ACL_rule_export.py) module to export as rule for the Cisco FireSight manager ACL.
* [GoAML export](misp_modules/modules/export_mod/goamlexport.py) module to export in [GoAML format](http://goaml.unodc.org/goaml/en/index.html).
* [Lite Export](misp_modules/modules/export_mod/liteexport.py) module to export a lite event.
* [PDF export](misp_modules/modules/export_mod/pdfexport.py) module to export an event in PDF.
* [Nexthink query format](misp_modules/modules/export_mod/nexthinkexport.py) module to export in Nexthink query format.
* [osquery](misp_modules/modules/export_mod/osqueryexport.py) module to export in [osquery](https://osquery.io/) query format.
* [ThreatConnect](misp_modules/modules/export_mod/threat_connect_export.py) module to export in ThreatConnect CSV format.
* [ThreatStream](misp_modules/modules/export_mod/threatStream_misp_export.py) module to export in ThreatStream format.
* [CEF](misp_modules/modules/export_mod/cef_export.py) - module to export Common Event Format (CEF).
* [Cisco FireSight Manager ACL rule](misp_modules/modules/export_mod/cisco_firesight_manager_ACL_rule_export.py) - module to export as rule for the Cisco FireSight manager ACL.
* [GoAML export](misp_modules/modules/export_mod/goamlexport.py) - module to export in [GoAML format](http://goaml.unodc.org/goaml/en/index.html).
* [Lite Export](misp_modules/modules/export_mod/liteexport.py) - module to export a lite event.
* [PDF export](misp_modules/modules/export_mod/pdfexport.py) - module to export an event in PDF.
* [Mass EQL Export](misp_modules/modules/export_mod/mass_eql_export.py) - module to export applicable attributes from an event to a mass EQL query.
* [Nexthink query format](misp_modules/modules/export_mod/nexthinkexport.py) - module to export in Nexthink query format.
* [osquery](misp_modules/modules/export_mod/osqueryexport.py) - module to export in [osquery](https://osquery.io/) query format.
* [ThreatConnect](misp_modules/modules/export_mod/threat_connect_export.py) - module to export in ThreatConnect CSV format.
* [ThreatStream](misp_modules/modules/export_mod/threatStream_misp_export.py) - module to export in ThreatStream format.
* [VirusTotal Graph](misp_modules/modules/export_mod/vt_graph.py) - Module to create a VirusTotal graph out of an event.
### Import modules
* [CSV import](misp_modules/modules/import_mod/csvimport.py) Customizable CSV import module.
* [Cuckoo JSON](misp_modules/modules/import_mod/cuckooimport.py) Cuckoo JSON import.
* [Email Import](misp_modules/modules/import_mod/email_import.py) Email import module for MISP to import basic metadata.
* [GoAML import](misp_modules/modules/import_mod/goamlimport.py) Module to import [GoAML](http://goaml.unodc.org/goaml/en/index.html) XML format.
* [Joe Sandbox import](misp_modules/modules/import_mod/joe_import.py) Parse data from a Joe Sandbox json report.
* [OCR](misp_modules/modules/import_mod/ocr.py) Optical Character Recognition (OCR) module for MISP to import attributes from images, scan or faxes.
* [OpenIOC](misp_modules/modules/import_mod/openiocimport.py) OpenIOC import based on PyMISP library.
* [CSV import](misp_modules/modules/import_mod/csvimport.py) - Customizable CSV import module.
* [Cuckoo JSON](misp_modules/modules/import_mod/cuckooimport.py) - Cuckoo JSON import.
* [Email Import](misp_modules/modules/import_mod/email_import.py) - Email import module for MISP to import basic metadata.
* [GoAML import](misp_modules/modules/import_mod/goamlimport.py) - Module to import [GoAML](http://goaml.unodc.org/goaml/en/index.html) XML format.
* [Joe Sandbox import](misp_modules/modules/import_mod/joe_import.py) - Parse data from a Joe Sandbox json report.
* [Lastline import](misp_modules/modules/import_mod/lastline_import.py) - Module to import Lastline analysis reports.
* [OCR](misp_modules/modules/import_mod/ocr.py) - Optical Character Recognition (OCR) module for MISP to import attributes from images, scan or faxes.
* [OpenIOC](misp_modules/modules/import_mod/openiocimport.py) - OpenIOC import based on PyMISP library.
* [ThreatAnalyzer](misp_modules/modules/import_mod/threatanalyzer_import.py) - An import module to process ThreatAnalyzer archive.zip/analysis.json sandbox exports.
* [VMRay](misp_modules/modules/import_mod/vmray_import.py) - An import module to process VMRay export.
## How to install and start MISP modules in a Python virtualenv? (recommended)
~~~~bash
sudo apt-get install python3-dev python3-pip libpq5 libjpeg-dev tesseract-ocr libpoppler-cpp-dev imagemagick virtualenv libopencv-dev zbar-tools libzbar0 libzbar-dev libfuzzy-dev -y
sudo apt-get install python3-dev python3-pip libpq5 libjpeg-dev tesseract-ocr libpoppler-cpp-dev imagemagick virtualenv libopencv-dev zbar-tools libzbar0 libzbar-dev libfuzzy-dev build-essential -y
sudo -u www-data virtualenv -p python3 /var/www/MISP/venv
cd /usr/local/src/
chown -R www-data .
sudo git clone https://github.com/MISP/misp-modules.git
cd misp-modules
sudo -u www-data /var/www/MISP/venv/bin/pip install -I -r REQUIREMENTS


@ -1,82 +1,117 @@
-i https://pypi.org/simple
-e .
-e git+https://github.com/D4-project/BGP-Ranking.git/@429cea9c0787876820984a2df4e982449a84c10e#egg=pybgpranking&subdirectory=client
-e git+https://github.com/D4-project/IPASN-History.git/@47cd0f2658ab172fce42126ff3a1dbcddfb0b5fb#egg=pyipasnhistory&subdirectory=client
-e git+https://github.com/D4-project/BGP-Ranking.git/@fd9c0e03af9b61d4bf0b67ac73c7208a55178a54#egg=pybgpranking&subdirectory=client
-e git+https://github.com/D4-project/IPASN-History.git/@fc5e48608afc113e101ca6421bf693b7b9753f9e#egg=pyipasnhistory&subdirectory=client
-e git+https://github.com/MISP/PyIntel471.git@0df8d51f1c1425de66714b3a5a45edb69b8cc2fc#egg=pyintel471
-e git+https://github.com/MISP/PyMISP.git@3ad351380055f0a655ed529b9c79b242a9227b84#egg=pymisp
-e git+https://github.com/Rafiot/uwhoisd.git@411572840eba4c72dc321c549b36a54ed5cea9de#egg=uwhois&subdirectory=client
-e git+https://github.com/MISP/PyMISP.git@bacd4c78cd83d3bf45dcf55cd9ad3514747ac985#egg=pymisp[fileobjects,openioc,pdfexport]
-e git+https://github.com/Rafiot/uwhoisd.git@783bba09b5a6964f25566089826a1be4b13f2a22#egg=uwhois&subdirectory=client
-e git+https://github.com/cartertemm/ODTReader.git/@49d6938693f6faa3ff09998f86dba551ae3a996b#egg=odtreader
-e git+https://github.com/sebdraven/pydnstrails@48c1f740025c51289f43a24863d1845ff12fd21a#egg=pydnstrails
-e git+https://github.com/sebdraven/pyonyphe@cbb0168d5cb28a9f71f7ab3773164a7039ccdb12#egg=pyonyphe
aiohttp==3.4.4
antlr4-python3-runtime==4.7.2 ; python_version >= '3'
async-timeout==3.0.1
attrs==19.1.0
-e git+https://github.com/sebdraven/pyonyphe@1ce15581beebb13e841193a08a2eb6f967855fcb#egg=pyonyphe
aiohttp==3.6.2; python_full_version >= '3.5.3'
antlr4-python3-runtime==4.8; python_version >= '3'
apiosintds==1.8.3
argparse==1.4.0
assemblyline-client==4.0.1
async-timeout==3.0.1; python_full_version >= '3.5.3'
attrs==20.2.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
backscatter==0.2.4
beautifulsoup4==4.7.1
beautifulsoup4==4.9.3
blockchain==1.4.4
certifi==2019.3.9
certifi==2020.6.20
cffi==1.14.3
chardet==3.0.4
click-plugins==1.1.1
click==7.0
colorama==0.4.1
dnspython==1.16.0
domaintools-api==0.3.3
enum-compat==0.0.2
click==7.1.2; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
colorama==0.4.3; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
configparser==5.0.1; python_version >= '3.6'
cryptography==3.1.1
clamd==1.0.2
decorator==4.4.2
deprecated==1.2.10; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
dnspython==2.0.0
domaintools-api==0.5.2
enum-compat==0.0.3
ez-setup==0.9
ezodf==0.3.2
future==0.17.1
httplib2==0.12.3
idna-ssl==1.1.0 ; python_version < '3.7'
idna==2.8
future==0.18.2; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'
futures==3.1.1
geoip2==4.1.0
httplib2==0.18.1
idna-ssl==1.1.0; python_version < '3.7'
idna==2.10; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
isodate==0.6.0
jbxapi==3.1.3
jsonschema==3.0.1
lxml==4.3.3
jbxapi==3.11.0
json-log-formatter==0.3.0
jsonschema==3.2.0
lief==0.10.1
lxml==4.5.2
maclookup==1.0.3
multidict==4.5.2
markdownify==0.5.3
maxminddb==2.0.2; python_version >= '3.6'
multidict==4.7.6; python_version >= '3.5'
np==1.0.2
numpy==1.16.3
numpy==1.19.2; python_version >= '3.6'
oauth2==1.9.0.post1
opencv-python==4.1.0.25
pandas-ods-reader==0.0.6
pandas==0.24.2
passivetotal==1.0.30
pdftotext==2.1.1
pillow==6.0.0
psutil==5.6.2
pyeupi==1.0
opencv-python==4.4.0.44
pandas-ods-reader==0.0.7
pandas==1.1.3
passivetotal==1.0.31
pdftotext==2.1.5
pillow==7.2.0
progressbar2==3.53.1
psutil==5.7.2; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'
pycparser==2.20; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
pycryptodome==3.9.8; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'
pycryptodomex==3.9.8; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'
pydeep==0.4
pyeupi==1.1
pygeoip==0.3.2
pyparsing==2.4.0
pypdns==1.4.1
pyopenssl==19.1.0
pyparsing==2.4.7; python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'
pypdns==1.5.1
pypssl==2.1
pyrsistent==0.15.2
pytesseract==0.2.6
python-dateutil==2.8.0
pyrsistent==0.17.3; python_version >= '3.5'
pytesseract==0.3.6
python-baseconv==1.2.2
python-dateutil==2.8.1; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
python-docx==0.8.10
python-engineio==3.13.2
python-magic==0.4.18
python-pptx==0.6.18
pytz==2019.1
pyyaml==5.1
python-socketio[client]==4.6.0
python-utils==2.4.0
pytz==2019.3
pyyaml==5.3.1
pyzbar==0.1.8
rdflib==4.2.2
redis==3.2.1
reportlab==3.5.21
requests-cache==0.5.0
requests==2.22.0
shodan==1.13.0
sigmatools==0.10
six==1.12.0
soupsieve==1.9.1
sparqlwrapper==1.8.4
stix2-patterns==1.1.0
tabulate==0.8.3
tornado==6.0.2
url-normalize==1.4.1
pyzipper==0.3.3; python_version >= '3.5'
rdflib==5.0.0
redis==3.5.3; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'
reportlab==3.5.53
requests-cache==0.5.2
requests[security]==2.24.0
shodan==1.23.1
sigmatools==0.18.1
six==1.15.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
socketio-client==0.5.7.4
soupsieve==2.0.1; python_version >= '3.0'
sparqlwrapper==1.8.5
stix2-patterns==1.3.1
tabulate==0.8.7
tornado==6.0.4; python_version >= '3.5'
trustar==0.3.33
tzlocal==2.1
unicodecsv==0.14.1
url-normalize==1.4.2; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5'
urlarchiver==0.2
urllib3==1.25.3
vulners==1.5.0
wand==0.5.3
urllib3==1.25.10; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'
validators==0.14.0
vt-graph-api==1.0.1
vulners==1.5.8
wand==0.6.3
websocket-client==0.57.0
wrapt==1.12.1
xlrd==1.2.0
xlsxwriter==1.1.8
xlsxwriter==1.3.6
yara-python==3.8.1
yarl==1.3.0
yarl==1.6.0; python_version >= '3.5'


@ -0,0 +1,8 @@
{
"description": "On demand query API for OSINT.digitalside.it project.",
"requirements": ["The apiosintDS python library to query the OSINT.digitalside.it API."],
"input": "A domain, ip, url or hash attribute.",
"output": "Hashes and urls resulting from the query to OSINT.digitalside.it",
"references": ["https://osint.digitalside.it/#About"],
"features": "The module simply queries the API of OSINT.digitalside.it with a domain, ip, url or hash attribute.\n\nThe result of the query is then parsed to extract additional hashes or urls. A module parameters also allows to parse the hashes related to the urls.\n\nFurthermore, it is possible to cache the urls and hashes collected over the last 7 days by OSINT.digitalside.it"
}


@ -0,0 +1,9 @@
{
"description": "Module to query APIVoid with some domain attributes.",
"logo": "logos/apivoid.png",
"requirements": ["A valid APIVoid API key with enough credits to proceed 2 queries"],
"input": "A domain attribute.",
"output": "DNS records and SSL certificates related to the domain.",
"features": "This module takes a domain name and queries API Void to get the related DNS records and the SSL certificates. It returns then those pieces of data as MISP objects that can be added to the event.\n\nTo make it work, a valid API key and enough credits to proceed 2 queries (0.06 + 0.07 credits) are required.",
"references": ["https://www.apivoid.com/"]
}


@ -0,0 +1,9 @@
{
"description": "A module tu query the AssemblyLine API with a submission ID to get the submission report and parse it.",
"logo": "logos/assemblyline.png",
"requirements": ["assemblyline_client: Python library to query the AssemblyLine rest API."],
"input": "Link of an AssemblyLine submission report.",
"output": "MISP attributes & objects parsed from the AssemblyLine submission.",
"references": ["https://www.cyber.cg.ca/en/assemblyline"],
"features": "The module requires the address of the AssemblyLine server you want to query as well as your credentials used for this instance. Credentials include the used-ID and an API key or the password associated to the user-ID.\n\nThe submission ID extracted from the submission link is then used to query AssemblyLine and get the full submission report. This report is parsed to extract file objects and the associated IPs, domains or URLs the files are connecting to.\n\nSome more data may be parsed in the future."
}


@ -0,0 +1,9 @@
{
"description": "A module to submit samples and URLs to AssemblyLine for advanced analysis, and return the link of the submission.",
"logo": "logos/assemblyline.png",
"requirements": ["assemblyline_client: Python library to query the AssemblyLine rest API."],
"input": "Sample, or url to submit to AssemblyLine.",
"output": "Link of the report generated in AssemblyLine.",
"references": ["https://www.cyber.gc.ca/en/assemblyline"],
"features": "The module requires the address of the AssemblyLine server you want to query as well as your credentials used for this instance. Credentials include the user-ID and an API key or the password associated to the user-ID.\n\nIf the sample or url is correctly submitted, you get then the link of the submission."
}
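
As an illustration of the submission workflow described above, here is a minimal sketch using the assemblyline_client library; the server address, credentials and the get_client()/submit() calls shown are assumptions about that client's interface, not details from this diff.

~~~~python
from assemblyline_client import get_client

# hypothetical AssemblyLine server and credentials
al_client = get_client("https://assemblyline.example.com:443",
                       apikey=("my-user", "my-api-key"))
# submit a local sample; the returned submission (field names assumed) carries
# the submission ID the report link is built from
submission = al_client.submit(path="/tmp/sample.bin")
print(submission)
~~~~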


@ -1,8 +1,8 @@
{
"description": "Query BGP Ranking (https://bgpranking-ng.circl.lu/).",
"requirements": ["pybgpranking python library"],
"features": "The module takes an AS number attribute as input and displays its description and history, and position in BGP Ranking.\n\n",
"features": "The module takes an AS number attribute as input and displays its description as well as its ranking position in BGP Ranking for a given day.\n\n",
"references": ["https://github.com/D4-project/BGP-Ranking/"],
"input": "Autonomous system number.",
"output": "Text containing a description of the ASN, its history, and the position in BGP Ranking."
"output": "An asn object with its related bgp-ranking object."
}
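
For context, a rough sketch of the kind of lookup behind this module, using the pybgpranking client listed in the REQUIREMENTS; the query() call and the response layout are assumptions.

~~~~python
from datetime import date, timedelta
from pybgpranking import BGPRanking

bgpranking = BGPRanking()
# query yesterday's ranking for a given ASN (response layout assumed)
result = bgpranking.query(174, date=(date.today() - timedelta(days=1)).isoformat())
print(result.get('response'))  # ASN description plus rank/position for that day
~~~~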


@ -0,0 +1,8 @@
{
"description": "An expansion module to enrich attributes in MISP by quering the censys.io API",
"requirements": ["API credentials to censys.io"],
"input": "IP, domain or certificate fingerprint (md5, sha1 or sha256)",
"output": "MISP objects retrieved from censys, including open ports, ASN, Location of the IP, x509 details",
"references": ["https://www.censys.io"],
"features": "This module takes an IP, hostname or a certificate fingerprint and attempts to enrich it by querying the Censys API."
}


@ -3,7 +3,7 @@
"logo": "logos/passivedns.png",
"requirements": ["pypdns: Passive DNS python library", "A CIRCL passive DNS account with username & password"],
"input": "Hostname, domain, or ip-address attribute.",
"ouput": "Text describing passive DNS information related to the input attribute.",
"features": "This module takes a hostname, domain or ip-address (ip-src or ip-dst) attribute as input, and queries the CIRCL Passive DNS REST API to get and display information about this input.\n\nTo make it work a username and a password are thus required to authenticate to the CIRCL Passive DNS API.",
"ouput": "Passive DNS objects related to the input attribute.",
"features": "This module takes a hostname, domain or ip-address (ip-src or ip-dst) attribute as input, and queries the CIRCL Passive DNS REST API to get the asssociated passive dns entries and return them as MISP objects.\n\nTo make it work a username and a password are thus required to authenticate to the CIRCL Passive DNS API.",
"references": ["https://www.circl.lu/services/passive-dns/", "https://datatracker.ietf.org/doc/draft-dulaunoy-dnsop-passive-dns-cof/"]
}
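
A minimal pypdns sketch of the query this module performs, assuming valid CIRCL Passive DNS credentials and the COF-style record fields the service commonly returns.

~~~~python
from pypdns import PyPDNS

pdns = PyPDNS(basic_auth=('username', 'password'))  # hypothetical credentials
for record in pdns.query('circl.lu'):
    # typical COF fields: rrtype, rrname, rdata, time_first, time_last
    print(record['rrtype'], record['rrname'], record['rdata'])
~~~~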


@ -2,8 +2,8 @@
"description": "Modules to access CIRCL Passive SSL.",
"logo": "logos/passivessl.png",
"requirements": ["pypssl: Passive SSL python library", "A CIRCL passive SSL account with username & password"],
"input": "Ip-address attribute.",
"output": "Text describing passive SSL information related to the input attribute.",
"features": "This module takes an ip-address (ip-src or ip-dst) attribute as input, and queries the CIRCL Passive SSL REST API to get and display information about this input.\n\nTo make it work a username and a password are thus required to authenticate to the CIRCL Passive SSL API.",
"input": "IP address attribute.",
"output": "x509 certificate objects seen by the IP address(es).",
"features": "This module takes an ip-address (ip-src or ip-dst) attribute as input, and queries the CIRCL Passive SSL REST API to gather the related certificates and return the corresponding MISP objects.\n\nTo make it work a username and a password are required to authenticate to the CIRCL Passive SSL API.",
"references": ["https://www.circl.lu/services/passive-ssl/"]
}
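
A similar minimal sketch with pypssl, assuming valid CIRCL Passive SSL credentials; the exact response layout is an assumption.

~~~~python
from pypssl import PyPSSL

pssl = PyPSSL(basic_auth=('username', 'password'))  # hypothetical credentials
result = pssl.query('8.8.8.8')
# the service returns the certificates seen for the queried IP (layout assumed)
print(result.get('8.8.8.8', {}).get('certificates'))
~~~~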


@ -0,0 +1,8 @@
{
"description": "An expansion module to query the CIRCL CVE search API for more information about a vulnerability (CVE).",
"logo": "logos/cve.png",
"input": "Vulnerability attribute.",
"output": "Additional information about the vulnerability, such as its cvss score, some references, or the related weaknesses and attack patterns.",
"references": ["https://cve.circl.lu", "https://cve/mitre.org/"],
"features": "The module takes a vulnerability attribute as input and queries the CIRCL CVE search API to gather additional information.\n\nThe result of the query is then parsed to return additional information about the vulnerability, like its cvss score or some references, as well as the potential related weaknesses and attack patterns.\n\nThe vulnerability additional data is returned in a vulnerability MISP object, and the related additional information are put into weakness and attack-pattern MISP objects."
}
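
For reference, the underlying CIRCL CVE search call can be reproduced with a plain HTTP request; the fields read below ('cvss', 'summary') are examples of what the API exposes.

~~~~python
import requests

cve = requests.get('https://cve.circl.lu/api/cve/CVE-2017-0144').json()
print(cve.get('cvss'), cve.get('summary'))
~~~~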


@ -0,0 +1,9 @@
{
"description": "An expansion module to enrich attributes in MISP by quering the Cytomic Orion API",
"logo": "logos/cytomic_orion.png",
"requirements": ["Access (license) to Cytomic Orion"],
"input": "MD5, hash of the sample / malware to search for.",
"output": "MISP objects with sightings of the hash in Cytomic Orion. Includes files and machines.",
"references": ["https://www.vanimpe.eu/2020/03/10/integrating-misp-and-cytomic-orion/", "https://www.cytomicmodel.com/solutions/"],
"features": "This module takes an MD5 hash and searches for occurrences of this hash in the Cytomic Orion database. Returns observed files and machines."
}

doc/expansion/eql.json (new file, 9 lines)

@ -0,0 +1,9 @@
{
"description": "EQL query generation for a MISP attribute.",
"logo": "logos/eql.png",
"requirements": [],
"input": "A filename or ip attribute.",
"output": "Attribute containing EQL for a network or file attribute.",
"references": ["https://eql.readthedocs.io/en/latest/"],
"features": "This module adds a new attribute to a MISP event containing an EQL query for a network or file attribute."
}


@ -0,0 +1,9 @@
{
"descrption": "An expansion module to query a local copy of Maxmind's Geolite database with an IP address, in order to get information about its related AS number.",
"logo": "logos/maxmind.png",
"requirements": ["A local copy of Maxmind's Geolite database"],
"input": "An IP address MISP attribute.",
"output": "Text containing information about the AS number of the IP address.",
"references": ["https://www.maxmind.com/en/home"],
"features": "The module takes an IP address attribute as input and queries a local copy of the Maxmind's Geolite database to get information about the related AS number."
}


@ -0,0 +1,9 @@
{
"description": "An expansion module to query a local copy of Maxmind's Geolite database with an IP address, in order to get information about the city where it is located.",
"logo": "logos/maxmind.png",
"requirements": ["A local copy of Maxmind's Geolite database"],
"input": "An IP address MISP attribute.",
"output": "Text containing information about the city where the IP address is located.",
"references": ["https://www.maxmind.com/en/home"],
"features": "The module takes an IP address attribute as input and queries a local copy of the Maxmind's Geolite database to get information about the city where this IP address is located."
}
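
A minimal sketch of the local lookup this module relies on, using the geoip2 library added to the Pipfile; the database path is a placeholder.

~~~~python
import geoip2.database

# hypothetical path to a downloaded GeoLite2 City database
with geoip2.database.Reader('/var/lib/GeoIP/GeoLite2-City.mmdb') as reader:
    response = reader.city('8.8.8.8')
    print(response.country.iso_code, response.city.name)
~~~~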


@ -0,0 +1,9 @@
{
"descrption": "A hover module to get information about an url using a Google search.",
"logo": "logos/google.png",
"requirements": ["The python Google Search API library"],
"input": "An url attribute.",
"output": "Text containing the result of a Google search on the input url.",
"references": ["https://github.com/abenassi/Google-Search-API"],
"features": "The module takes an url as input to query the Google search API. The result of the query is then return as raw text."
}


@ -1,9 +1,9 @@
{
"description": "Module to access GreyNoise.io API",
"logo": "logos/greynoise.png",
"requirements": [],
"requirements": ["A Greynoise API key."],
"input": "An IP address.",
"output": "Additional information about the IP fetched from Greynoise API.",
"references": ["https://greynoise.io/", "https://github.com/GreyNoise-Intelligence/api.greynoise.io"],
"features": "The module takes an IP address as input and queries Greynoise for some additional information about it. The result is returned as text."
"features": "The module takes an IP address as input and queries Greynoise for some additional information about it: basically it checks whether a given IP address is “Internet background noise”, or has been observed scanning or attacking devices across the Internet. The result is returned as text."
}
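
As a rough illustration, a GreyNoise lookup can look like the following; the v3 community endpoint and the 'key' header shown are assumptions and may differ from what the module actually calls.

~~~~python
import requests

headers = {'key': 'YOUR_GREYNOISE_API_KEY', 'Accept': 'application/json'}
response = requests.get('https://api.greynoise.io/v3/community/8.8.8.8', headers=headers)
print(response.json())  # noise/riot classification for the IP
~~~~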


@ -0,0 +1,7 @@
{
"description": "Expansion module to fetch the html content from an url and convert it into markdown.",
"input": "URL attribute.",
"output": "Markdown content converted from the HTML fetched from the url.",
"requirements": ["The markdownify python library"],
"features": "The module take an URL as input and the HTML content is fetched from it. This content is then converted into markdown that is returned as text."
}
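
A minimal sketch of the fetch-and-convert flow described above, using requests and the markdownify library pinned in the Pipfile.

~~~~python
import requests
from markdownify import markdownify

html = requests.get('https://www.misp-project.org/').text
print(markdownify(html))  # HTML converted to markdown text
~~~~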


@ -0,0 +1,9 @@
{
"descrption": "An expansion module to query Intel471 in order to get additional information about a domain, ip address, email address, url or hash.",
"logo": "logos/intel471.png",
"requirements": ["The intel471 python library"],
"input": "A MISP attribute whose type is included in the following list:\n- hostname\n- domain\n- url\n- ip-src\n- ip-dst\n- email-src\n- email-dst\n- target-email\n- whois-registrant-email\n- whois-registrant-name\n- md5\n- sha1\n- sha256",
"output": "Freetext",
"references": ["https://public.intel471.com/"],
"features": "The module uses the Intel471 python library to query the Intel471 API with the value of the input attribute. The result of the query is then returned as freetext so the Freetext import parses it."
}


@ -2,7 +2,7 @@
"description": "Module to query an IP ASN history service (https://github.com/D4-project/IPASN-History).",
"requirements": ["pyipasnhistory: Python library to access IPASN-history instance"],
"input": "An IP address MISP attribute.",
"output": "Text describing additional information about the input after a query on the IPASN-history database.",
"output": "Asn object(s) objects related to the IP address used as input.",
"references": ["https://github.com/D4-project/IPASN-History"],
"features": "This module takes an IP address attribute as input and queries the CIRCL IPASN service to get additional information about the input."
"features": "This module takes an IP address attribute as input and queries the CIRCL IPASN service. The result of the query is the latest asn related to the IP address, that is returned as a MISP object."
}
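
An illustrative pyipasnhistory sketch of the query behind this module; the query() call and the response layout shown are assumptions.

~~~~python
from pyipasnhistory import IPASNHistory

ipasn = IPASNHistory()
response = ipasn.query('8.8.8.8')
# expected shape (assumed): {'<timestamp>': {'asn': '15169', 'prefix': '8.8.8.0/24'}}
print(response.get('response'))
~~~~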


@ -3,7 +3,7 @@
"logo": "logos/joesandbox.png",
"requirements": ["jbxapi: Joe Sandbox API python3 library"],
"input": "Sample, url (or domain) to submit to Joe Sandbox for an advanced analysis.",
"output": "Link of the data in input submitted to Joe Sandbox.",
"output": "Link of the report generated in Joe Sandbox.",
"references": ["https://www.joesecurity.org", "https://www.joesandbox.com/"],
"features": "The module requires a Joe Sandbox API key to submit files or URL, and returns the link of the submitted analysis.\n\nIt is then possible, when the analysis is completed, to query the Joe Sandbox API to get the data related to the analysis, using the [joesandbox_query module](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/joesandbox_query.py) directly on this submission link."
}
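
For context, a rough jbxapi sketch of a submission; the Cloud API URL, the accept_tac flag and the returned fields are assumptions.

~~~~python
import jbxapi

joe = jbxapi.JoeSandbox(apiurl='https://jbxcloud.joesecurity.org/api',
                        apikey='YOUR_API_KEY', accept_tac=True)
submission = joe.submit_url('https://example.com')
print(submission)  # contains the identifiers the analysis link is built from
~~~~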


@ -0,0 +1,9 @@
{
"description": "Query Lastline with an analysis link and parse the report into MISP attributes and objects.\nThe analysis link can also be retrieved from the output of the [lastline_submit](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/lastline_submit.py) expansion module.",
"logo": "logos/lastline.png",
"requirements": [],
"input": "Link to a Lastline analysis.",
"output": "MISP attributes and objects parsed from the analysis report.",
"references": ["https://www.lastline.com"],
"features": "The module requires a Lastline Portal `username` and `password`.\nThe module uses the new format and it is able to return MISP attributes and objects.\nThe module returns the same results as the [lastline_import](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/import_mod/lastline_import.py) import module."
}


@ -0,0 +1,9 @@
{
"description": "Module to submit a file or URL to Lastline.",
"logo": "logos/lastline.png",
"requirements": [],
"input": "File or URL to submit to Lastline.",
"output": "Link to the report generated by Lastline.",
"references": ["https://www.lastline.com"],
"features": "The module requires a Lastline Analysis `api_token` and `key`.\nWhen the analysis is completed, it is possible to import the generated report by feeding the analysis link to the [lastline_query](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/lastline_query.py) module."
}


@ -0,0 +1,8 @@
{
"description": "Query the MALWAREbazaar API to get additional information about the input hash attribute.",
"requirements": [],
"input": "A hash attribute (md5, sha1 or sha256).",
"output": "File object(s) related to the input attribute found on MALWAREbazaar databases.",
"references": ["https://bazaar.abuse.ch/"],
"features": "The module takes a hash attribute as input and queries MALWAREbazaar's API to fetch additional data about it. The result, if the payload is known on the databases, is at least one file object describing the file the input hash is related to.\n\nThe module is using the new format of modules able to return object since the result is one or multiple MISP object(s)."
}
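
A short sketch of the MALWAREbazaar lookup described above, using the public mb-api.abuse.ch endpoint; replace the placeholder with a real hash value.

~~~~python
import requests

data = {'query': 'get_info', 'hash': '<md5, sha1 or sha256>'}
response = requests.post('https://mb-api.abuse.ch/api/v1/', data=data).json()
print(response.get('query_status'))
print(response.get('data'))  # file metadata when the payload is known
~~~~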


@ -0,0 +1,8 @@
{
"descrption": "Module to access the ransomcoinDB with a hash or btc address attribute and get the associated btc address of hashes.",
"requirements": ["A ransomcoinDB API key."],
"input": "A hash (md5, sha1 or sha256) or btc attribute.",
"output": "Hashes associated to a btc address or btc addresses associated to a hash.",
"references": ["https://ransomcoindb.concinnity-risks.com"],
"features": "The module takes either a hash attribute or a btc attribute as input to query the ransomcoinDB API for some additional data.\n\nIf the input is a btc address, we will get the associated hashes returned in a file MISP object. If we query ransomcoinDB with a hash, the response contains the associated btc addresses returned as single MISP btc attributes."
}


@ -0,0 +1,9 @@
{
"description": "Module to enrich attributes with threat intelligence from Recorded Future.",
"logo": "logos/recordedfuture.png",
"requirements": ["A Recorded Future API token."],
"input": "A MISP attribute of one of the following types: ip, ip-src, ip-dst, domain, hostname, md5, sha1, sha256, uri, url, vulnerability, weakness.",
"output": "A MISP object containing a copy of the enriched attribute with added tags from Recorded Future and a list of new attributes related to the enriched attribute.",
"references": ["https://www.recordedfuture.com/"],
"features": "Enrich an attribute to add a custom enrichment object to the event. The object contains a copy of the enriched attribute with added tags presenting risk score and triggered risk rules from Recorded Future. Malware and Threat Actors related to the enriched indicator in Recorded Future is matched against MISP's galaxy clusters and applied as galaxy tags. The custom enrichment object also includes a list of related indicators from Recorded Future (IP's, domains, hashes, URL's and vulnerabilities) added as additional attributes."
}


@ -0,0 +1,9 @@
{
"description": "An expansion module to query the Sophoslabs intelix API to get additional information about an ip address, url, domain or sha256 attribute.",
"logo": "logos/sophoslabs_intelix.svg",
"requirements": ["A client_id and client_secret pair to authenticate to the SophosLabs Intelix API"],
"input": "An ip address, url, domain or sha256 attribute.",
"output": "SophosLabs Intelix report and lookup objects",
"references": ["https://aws.amazon.com/marketplace/pp/B07SLZPMCS"],
"features": "The module takes an ip address, url, domain or sha256 attribute and queries the SophosLabs Intelix API with the attribute value. The result of this query is a SophosLabs Intelix hash report, or an ip or url lookup, that is then parsed and returned in a MISP object."
}


@ -0,0 +1,8 @@
{
"description": "Module to get enrich indicators with TruSTAR.",
"logo": "logos/trustar.png",
"input": "Any of the following MISP attributes:\n- btc\n- domain\n- email-src\n- filename\n- hostname\n- ip-src\n- ip-dst\n- md5\n- sha1\n- sha256\n- url",
"output": "MISP attributes enriched with indicator summary data from the TruSTAR API. Data includes a severity level score and additional source and scoring info.",
"references": ["https://docs.trustar.co/api/v13/indicators/get_indicator_summaries.html"],
"features": "This module enriches MISP attributes with scoring and metadata from TruSTAR.\n\nThe TruSTAR indicator summary is appended to the attributes along with links to any associated reports."
}


@ -0,0 +1,9 @@
{
"description": "Mass EQL query export for a MISP event.",
"logo": "logos/eql.png",
"requirements": [],
"features": "This module produces EQL queries for all relevant attributes in a MISP event.",
"references": ["https://eql.readthedocs.io/en/latest/"],
"input": "MISP Event attributes",
"output": "Text file containing one or more EQL queries"
}


@ -0,0 +1,9 @@
{
"description": "This module is used to create a VirusTotal Graph from a MISP event.",
"logo": "logos/virustotal.png",
"requirements": ["vt_graph_api, the python library to query the VirusTotal graph API"],
"features": "The module takes the MISP event as input and queries the VirusTotal Graph API to create a new graph out of the event.\n\nOnce the graph is ready, we get the url of it, which is returned so we can view it on VirusTotal.",
"references": ["https://www.virustotal.com/gui/graph-overview"],
"input": "A MISP event.",
"output": "Link of the VirusTotal Graph created for the event."
}
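
A very rough sketch with the vt-graph-api library listed in the Pipfile; the VTGraph constructor arguments and the add_node / save_graph / get_ui_link calls are assumptions about that library's interface.

~~~~python
from vt_graph_api import VTGraph

graph = VTGraph('YOUR_VT_API_KEY', name='MISP event graph', private=False)
graph.add_node('misp-project.org', 'domain')  # node type names assumed
graph.save_graph()
print(graph.get_ui_link())  # link to view the graph on VirusTotal
~~~~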


@ -5,7 +5,7 @@ import json
module_types = ['expansion', 'export_mod', 'import_mod']
titles = ['Expansion Modules', 'Export Modules', 'Import Modules']
markdown = ["# MISP modules documentation\n"]
githublink = 'https://github.com/MISP/misp-modules/tree/master/misp_modules/modules'
githublink = 'https://github.com/MISP/misp-modules/tree/main/misp_modules/modules'
def generate_doc(root_path):


@ -1,7 +1,7 @@
{
"description": "Module to import MISP attributes from a csv file.",
"requirements": ["PyMISP"],
"features": "In order to parse data from a csv file, a header is required to let the module know which column is matching with known attribute fields / MISP types.\nThis header is part of the configuration of the module and should be filled out in MISP plugin settings, each field separated by COMMAS. Fields that do not match with any type known in MISP can be ignored in import, using a space or simply nothing between two separators (example: 'ip-src, , comment, ').\nThere is also one type that is confused and can be either a MISP attribute type or an attribute field: 'comment'. In this case, using 'attrComment' specifies that the attribute field 'comment' should be considered, otherwise it will be considered as the MISP attribute type.\n\nFor each MISP attribute type, an attribute is created.\nAttribute fields that are imported are the following: value, type, category, to-ids, distribution, comment, tag.",
"features": "In order to parse data from a csv file, a header is required to let the module know which column is matching with known attribute fields / MISP types.\n\nThis header either comes from the csv file itself or is part of the configuration of the module and should be filled out in MISP plugin settings, each field separated by COMMAS. Fields that do not match with any type known in MISP or are not MISP attribute fields should be ignored in import, using a space or simply nothing between two separators (example: 'ip-src, , comment, ').\n\nIf the csv file already contains a header that does not start by a '#', you should tick the checkbox 'has_header' to avoid importing it and have potential issues. You can also redefine the header even if it is already contained in the file, by following the rules for headers explained earlier. One reason why you would redefine a header is for instance when you want to skip some fields, or some fields are not valid types.",
"references": ["https://tools.ietf.org/html/rfc4180", "https://tools.ietf.org/html/rfc7111"],
"input": "CSV format file.",
"output": "MISP Event attributes"


@ -0,0 +1,9 @@
{
"description": "Module to import and parse reports from Lastline analysis links.",
"logo": "logos/lastline.png",
"requirements": [],
"input": "Link to a Lastline analysis.",
"output": "MISP attributes and objects parsed from the analysis report.",
"references": ["https://www.lastline.com"],
"features": "The module requires a Lastline Portal `username` and `password`.\nThe module uses the new format and it is able to return MISP attributes and objects.\nThe module returns the same results as the [lastline_query](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/lastline_query.py) expansion module."
}

BIN doc/logos/apivoid.png (new file, 6.8 KiB)
BIN doc/logos/assemblyline.png (new file, 171 KiB)
BIN doc/logos/cytomic_orion.png (new file, 898 B)
BIN doc/logos/eql.png (new file, 61 KiB)
BIN doc/logos/google.png (new file, 16 KiB)
BIN doc/logos/intel471.png (new file, 6.6 KiB)
BIN doc/logos/lastline.png (new file, 7.0 KiB)
BIN doc/logos/trustar.png (new file, 37 KiB)

@ -35,6 +35,7 @@ For more information: [Extending MISP with Python modules](https://www.circl.lu/
* [docx-enrich](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/docx-enrich.py) - an enrichment module to get text out of Word document into MISP (using free-text parser).
* [DomainTools](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/domaintools.py) - a hover and expansion module to get information from [DomainTools](http://www.domaintools.com/) whois.
* [EUPI](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/eupi.py) - a hover and expansion module to get information about an URL from the [Phishing Initiative project](https://phishing-initiative.eu/?lang=en).
* [EQL](misp_modules/modules/expansion/eql.py) - an expansion module to generate event query language (EQL) from an attribute. [Event Query Language](https://eql.readthedocs.io/en/latest/)
* [Farsight DNSDB Passive DNS](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/farsight_passivedns.py) - a hover and expansion module to expand hostname and IP addresses with passive DNS information.
* [GeoIP](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/geoip_country.py) - a hover and expansion module to get GeoIP information from geolite/maxmind.
* [Greynoise](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/expansion/greynoise.py) - a hover module to get information from greynoise.
@ -87,6 +88,7 @@ For more information: [Extending MISP with Python modules](https://www.circl.lu/
* [Cisco FireSight Manager ACL rule](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/cisco_firesight_manager_ACL_rule_export.py) module to export as rule for the Cisco FireSight manager ACL.
* [GoAML export](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/goamlexport.py) module to export in [GoAML format](http://goaml.unodc.org/goaml/en/index.html).
* [Lite Export](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/liteexport.py) module to export a lite event.
* [Mass EQL Export](misp_modules/modules/export_mod/mass_eql_export.py) module to export applicable attributes from an event to a mass EQL query.
* [PDF export](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/pdfexport.py) module to export an event in PDF.
* [Nexthink query format](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/nexthinkexport.py) module to export in Nexthink query format.
* [osquery](https://github.com/MISP/misp-modules/tree/master/misp_modules/modules/export_mod/osqueryexport.py) module to export in [osquery](https://osquery.io/) query format.


@ -21,8 +21,28 @@ $SUDO_WWW virtualenv -p python3 /var/www/MISP/venv
# END with virtualenv
cd /usr/local/src/
sudo git clone https://github.com/MISP/misp-modules.git
cd misp-modules
# Ideally you add your user to the staff group and make /usr/local/src group writeable, below follows an example with user misp
sudo adduser misp staff
sudo chmod 2775 /usr/local/src
sudo chown root:staff /usr/local/src
git clone https://github.com/MISP/misp-modules.git
git clone git://github.com/stricaud/faup.git faup
git clone git://github.com/stricaud/gtcaca.git gtcaca
# Install gtcaca/faup
cd gtcaca
mkdir -p build
cd build
cmake .. && make
sudo make install
cd ../../faup
mkdir -p build
cd build
cmake .. && make
sudo make install
sudo ldconfig
cd ../../misp-modules
# BEGIN with virtualenv:
$SUDO_WWW /var/www/MISP/venv/bin/pip install -I -r REQUIREMENTS
@ -168,4 +188,4 @@ tar xvf misp-module-bundeled.tar.bz2 -C misp-modules-bundle
cd misp-modules-bundle
ls -1|while read line; do sudo pip3 install --force-reinstall --ignore-installed --upgrade --no-index --no-deps ${line};done
~~~
Next you can follow standard install procedure.


@ -1 +1,3 @@
__all__ = ['joe_parser']
from .vt_graph_parser import * # noqa
__all__ = ['joe_parser', 'lastline_api']


@ -51,12 +51,15 @@ signerinfo_object_mapping = {'sigissuer': ('text', 'issuer'),
class JoeParser():
def __init__(self):
def __init__(self, config):
self.misp_event = MISPEvent()
self.references = defaultdict(list)
self.attributes = defaultdict(lambda: defaultdict(set))
self.process_references = {}
self.import_pe = config["import_pe"]
self.create_mitre_attack = config["mitre_attack"]
def parse_data(self, data):
self.data = data
if self.analysis_type() == "file":
@ -72,7 +75,9 @@ class JoeParser():
if self.attributes:
self.handle_attributes()
self.parse_mitre_attack()
if self.create_mitre_attack:
self.parse_mitre_attack()
def build_references(self):
for misp_object in self.misp_event.objects:
@ -97,12 +102,12 @@ class JoeParser():
file_object = MISPObject('file')
for key, mapping in dropped_file_mapping.items():
attribute_type, object_relation = mapping
file_object.add_attribute(object_relation, **{'type': attribute_type, 'value': droppedfile[key]})
file_object.add_attribute(object_relation, **{'type': attribute_type, 'value': droppedfile[key], 'to_ids': False})
if droppedfile['@malicious'] == 'true':
file_object.add_attribute('state', **{'type': 'text', 'value': 'Malicious'})
file_object.add_attribute('state', **{'type': 'text', 'value': 'Malicious', 'to_ids': False})
for h in droppedfile['value']:
hash_type = dropped_hash_mapping[h['@algo']]
file_object.add_attribute(hash_type, **{'type': hash_type, 'value': h['$']})
file_object.add_attribute(hash_type, **{'type': hash_type, 'value': h['$'], 'to_ids': False})
self.misp_event.add_object(**file_object)
self.references[self.process_references[(int(droppedfile['@targetid']), droppedfile['@process'])]].append({
'referenced_uuid': file_object.uuid,
@ -132,9 +137,12 @@ class JoeParser():
for object_relation, attribute in attributes.items():
network_connection_object.add_attribute(object_relation, **attribute)
network_connection_object.add_attribute('first-packet-seen',
**{'type': 'datetime', 'value': min(tuple(min(timestamp) for timestamp in data.values()))})
**{'type': 'datetime',
'value': min(tuple(min(timestamp) for timestamp in data.values())),
'to_ids': False})
for protocol in data.keys():
network_connection_object.add_attribute('layer{}-protocol'.format(protocols[protocol]), **{'type': 'text', 'value': protocol})
network_connection_object.add_attribute('layer{}-protocol'.format(protocols[protocol]),
**{'type': 'text', 'value': protocol, 'to_ids': False})
self.misp_event.add_object(**network_connection_object)
self.references[self.analysisinfo_uuid].append(dict(referenced_uuid=network_connection_object.uuid,
relationship_type='initiates'))
@ -143,8 +151,8 @@ class JoeParser():
network_connection_object = MISPObject('network-connection')
for object_relation, attribute in attributes.items():
network_connection_object.add_attribute(object_relation, **attribute)
network_connection_object.add_attribute('first-packet-seen', **{'type': 'datetime', 'value': min(timestamps)})
network_connection_object.add_attribute('layer{}-protocol'.format(protocols[protocol]), **{'type': 'text', 'value': protocol})
network_connection_object.add_attribute('first-packet-seen', **{'type': 'datetime', 'value': min(timestamps), 'to_ids': False})
network_connection_object.add_attribute('layer{}-protocol'.format(protocols[protocol]), **{'type': 'text', 'value': protocol, 'to_ids': False})
self.misp_event.add_object(**network_connection_object)
self.references[self.analysisinfo_uuid].append(dict(referenced_uuid=network_connection_object.uuid,
relationship_type='initiates'))
@ -154,7 +162,8 @@ class JoeParser():
if screenshotdata:
screenshotdata = screenshotdata['interesting']['$']
attribute = {'type': 'attachment', 'value': 'screenshot.jpg',
'data': screenshotdata, 'disable_correlation': True}
'data': screenshotdata, 'disable_correlation': True,
'to_ids': False}
self.misp_event.add_attribute(**attribute)
def parse_system_behavior(self):
@ -166,9 +175,9 @@ class JoeParser():
general = process['general']
process_object = MISPObject('process')
for feature, relation in process_object_fields.items():
process_object.add_attribute(relation, **{'type': 'text', 'value': general[feature]})
process_object.add_attribute(relation, **{'type': 'text', 'value': general[feature], 'to_ids': False})
start_time = datetime.strptime('{} {}'.format(general['date'], general['time']), '%d/%m/%Y %H:%M:%S')
process_object.add_attribute('start-time', **{'type': 'datetime', 'value': start_time})
process_object.add_attribute('start-time', **{'type': 'datetime', 'value': start_time, 'to_ids': False})
self.misp_event.add_object(**process_object)
for field, to_call in process_activities.items():
if process.get(field):
@ -203,7 +212,7 @@ class JoeParser():
url_object = MISPObject("url")
self.analysisinfo_uuid = url_object.uuid
url_object.add_attribute("url", generalinfo["target"]["url"])
url_object.add_attribute("url", generalinfo["target"]["url"], to_ids=False)
self.misp_event.add_object(**url_object)
def parse_fileinfo(self):
@ -213,10 +222,10 @@ class JoeParser():
self.analysisinfo_uuid = file_object.uuid
for field in file_object_fields:
file_object.add_attribute(field, **{'type': field, 'value': fileinfo[field]})
file_object.add_attribute(field, **{'type': field, 'value': fileinfo[field], 'to_ids': False})
for field, mapping in file_object_mapping.items():
attribute_type, object_relation = mapping
file_object.add_attribute(object_relation, **{'type': attribute_type, 'value': fileinfo[field]})
file_object.add_attribute(object_relation, **{'type': attribute_type, 'value': fileinfo[field], 'to_ids': False})
arch = self.data['generalinfo']['arch']
if arch in arch_type_mapping:
to_call = arch_type_mapping[arch]
@ -234,9 +243,9 @@ class JoeParser():
attribute_type = 'text'
for comment, permissions in permission_lists.items():
permission_object = MISPObject('android-permission')
permission_object.add_attribute('comment', **dict(type=attribute_type, value=comment))
permission_object.add_attribute('comment', **dict(type=attribute_type, value=comment, to_ids=False))
for permission in permissions:
permission_object.add_attribute('permission', **dict(type=attribute_type, value=permission))
permission_object.add_attribute('permission', **dict(type=attribute_type, value=permission, to_ids=False))
self.misp_event.add_object(**permission_object)
self.references[file_object.uuid].append(dict(referenced_uuid=permission_object.uuid,
relationship_type='grants'))
@ -255,24 +264,24 @@ class JoeParser():
if elf.get('type'):
# Haven't seen anything but EXEC yet in the files I tested
attribute_value = "EXECUTABLE" if elf['type'] == "EXEC (Executable file)" else elf['type']
elf_object.add_attribute('type', **dict(type=attribute_type, value=attribute_value))
elf_object.add_attribute('type', **dict(type=attribute_type, value=attribute_value, to_ids=False))
for feature, relation in elf_object_mapping.items():
if elf.get(feature):
elf_object.add_attribute(relation, **dict(type=attribute_type, value=elf[feature]))
elf_object.add_attribute(relation, **dict(type=attribute_type, value=elf[feature], to_ids=False))
sections_number = len(fileinfo['sections']['section'])
elf_object.add_attribute('number-sections', **{'type': 'counter', 'value': sections_number})
elf_object.add_attribute('number-sections', **{'type': 'counter', 'value': sections_number, 'to_ids': False})
self.misp_event.add_object(**elf_object)
for section in fileinfo['sections']['section']:
section_object = MISPObject('elf-section')
for feature in ('name', 'type'):
if section.get(feature):
section_object.add_attribute(feature, **dict(type=attribute_type, value=section[feature]))
section_object.add_attribute(feature, **dict(type=attribute_type, value=section[feature], to_ids=False))
if section.get('size'):
section_object.add_attribute(size, **dict(type=size, value=int(section['size'], 16)))
section_object.add_attribute(size, **dict(type=size, value=int(section['size'], 16), to_ids=False))
for flag in section['flagsdesc']:
try:
attribute_value = elf_section_flags_mapping[flag]
section_object.add_attribute('flag', **dict(type=attribute_type, value=attribute_value))
section_object.add_attribute('flag', **dict(type=attribute_type, value=attribute_value, to_ids=False))
except KeyError:
print(f'Unknown elf section flag: {flag}')
continue
@ -281,6 +290,8 @@ class JoeParser():
relationship_type=relationship))
def parse_pe(self, fileinfo, file_object):
if not self.import_pe:
return
try:
peinfo = fileinfo['pe']
except KeyError:
@ -292,8 +303,8 @@ class JoeParser():
self.misp_event.add_object(**file_object)
for field, mapping in pe_object_fields.items():
attribute_type, object_relation = mapping
pe_object.add_attribute(object_relation, **{'type': attribute_type, 'value': peinfo[field]})
pe_object.add_attribute('compilation-timestamp', **{'type': 'datetime', 'value': int(peinfo['timestamp'].split()[0], 16)})
pe_object.add_attribute(object_relation, **{'type': attribute_type, 'value': peinfo[field], 'to_ids': False})
pe_object.add_attribute('compilation-timestamp', **{'type': 'datetime', 'value': int(peinfo['timestamp'].split()[0], 16), 'to_ids': False})
program_name = fileinfo['filename']
if peinfo['versions']:
for feature in peinfo['versions']['version']:
@ -301,18 +312,18 @@ class JoeParser():
if name == 'InternalName':
program_name = feature['value']
if name in pe_object_mapping:
pe_object.add_attribute(pe_object_mapping[name], **{'type': 'text', 'value': feature['value']})
pe_object.add_attribute(pe_object_mapping[name], **{'type': 'text', 'value': feature['value'], 'to_ids': False})
sections_number = len(peinfo['sections']['section'])
pe_object.add_attribute('number-sections', **{'type': 'counter', 'value': sections_number})
pe_object.add_attribute('number-sections', **{'type': 'counter', 'value': sections_number, 'to_ids': False})
signatureinfo = peinfo['signature']
if signatureinfo['signed']:
signerinfo_object = MISPObject('authenticode-signerinfo')
pe_object.add_reference(signerinfo_object.uuid, 'signed-by')
self.misp_event.add_object(**pe_object)
signerinfo_object.add_attribute('program-name', **{'type': 'text', 'value': program_name})
signerinfo_object.add_attribute('program-name', **{'type': 'text', 'value': program_name, 'to_ids': False})
for feature, mapping in signerinfo_object_mapping.items():
attribute_type, object_relation = mapping
signerinfo_object.add_attribute(object_relation, **{'type': attribute_type, 'value': signatureinfo[feature]})
signerinfo_object.add_attribute(object_relation, **{'type': attribute_type, 'value': signatureinfo[feature], 'to_ids': False})
self.misp_event.add_object(**signerinfo_object)
else:
self.misp_event.add_object(**pe_object)
@ -327,7 +338,7 @@ class JoeParser():
for feature, mapping in pe_section_object_mapping.items():
if section.get(feature):
attribute_type, object_relation = mapping
section_object.add_attribute(object_relation, **{'type': attribute_type, 'value': section[feature]})
section_object.add_attribute(object_relation, **{'type': attribute_type, 'value': section[feature], 'to_ids': False})
return section_object
def parse_network_interactions(self):
@ -339,13 +350,13 @@ class JoeParser():
for key, mapping in domain_object_mapping.items():
attribute_type, object_relation = mapping
domain_object.add_attribute(object_relation,
**{'type': attribute_type, 'value': domain[key]})
**{'type': attribute_type, 'value': domain[key], 'to_ids': False})
self.misp_event.add_object(**domain_object)
reference = dict(referenced_uuid=domain_object.uuid, relationship_type='contacts')
self.add_process_reference(domain['@targetid'], domain['@currentpath'], reference)
else:
attribute = MISPAttribute()
attribute.from_dict(**{'type': 'domain', 'value': domain['@name']})
attribute.from_dict(**{'type': 'domain', 'value': domain['@name'], 'to_ids': False})
self.misp_event.add_attribute(**attribute)
reference = dict(referenced_uuid=attribute.uuid, relationship_type='contacts')
self.add_process_reference(domain['@targetid'], domain['@currentpath'], reference)
@ -353,7 +364,7 @@ class JoeParser():
if ipinfo:
for ip in ipinfo['ip']:
attribute = MISPAttribute()
attribute.from_dict(**{'type': 'ip-dst', 'value': ip['@ip']})
attribute.from_dict(**{'type': 'ip-dst', 'value': ip['@ip'], 'to_ids': False})
self.misp_event.add_attribute(**attribute)
reference = dict(referenced_uuid=attribute.uuid, relationship_type='contacts')
self.add_process_reference(ip['@targetid'], ip['@currentpath'], reference)
@ -363,7 +374,7 @@ class JoeParser():
target_id = int(url['@targetid'])
current_path = url['@currentpath']
attribute = MISPAttribute()
attribute_dict = {'type': 'url', 'value': url['@name']}
attribute_dict = {'type': 'url', 'value': url['@name'], 'to_ids': False}
if target_id != -1 and current_path != 'unknown':
self.references[self.process_references[(target_id, current_path)]].append({
'referenced_uuid': attribute.uuid,
@ -384,8 +395,8 @@ class JoeParser():
registry_key = MISPObject('registry-key')
for field, mapping in regkey_object_mapping.items():
attribute_type, object_relation = mapping
registry_key.add_attribute(object_relation, **{'type': attribute_type, 'value': call[field]})
registry_key.add_attribute('data-type', **{'type': 'text', 'value': 'REG_{}'.format(call['type'].upper())})
registry_key.add_attribute(object_relation, **{'type': attribute_type, 'value': call[field], 'to_ids': False})
registry_key.add_attribute('data-type', **{'type': 'text', 'value': 'REG_{}'.format(call['type'].upper()), 'to_ids': False})
self.misp_event.add_object(**registry_key)
self.references[process_uuid].append(dict(referenced_uuid=registry_key.uuid,
relationship_type=relationship))
@ -398,7 +409,7 @@ class JoeParser():
def create_attribute(self, attribute_type, attribute_value):
attribute = MISPAttribute()
attribute.from_dict(**{'type': attribute_type, 'value': attribute_value})
attribute.from_dict(**{'type': attribute_type, 'value': attribute_value, 'to_ids': False})
self.misp_event.add_attribute(**attribute)
return attribute.uuid
@ -419,5 +430,5 @@ class JoeParser():
attributes = {}
for field, value in zip(network_behavior_fields, connection):
attribute_type, object_relation = network_connection_object_mapping[field]
attributes[object_relation] = {'type': attribute_type, 'value': value}
attributes[object_relation] = {'type': attribute_type, 'value': value, 'to_ids': False}
return attributes
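# Illustrative only: with the constructor change above, callers are expected to
# pass a config dict carrying the two new flags. A minimal hedged sketch,
# assuming the module is importable as `joe_parser` and `joe_report` already
# holds the "analysis" section of the Joe Sandbox JSON report:
#
#     from joe_parser import JoeParser
#
#     parser = JoeParser({"import_pe": False, "mitre_attack": True})
#     parser.parse_data(joe_report)
#     misp_event = parser.misp_event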

View File

@ -0,0 +1,841 @@
"""
Lastline Community API Client and Utilities.
:Copyright:
Copyright 2019 Lastline, Inc. All Rights Reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Copyright (c) 2010-2012 by Internet Systems Consortium, Inc. ("ISC")
Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.
THE SOFTWARE IS PROVIDED "AS IS" AND ISC DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL ISC BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
"""
import abc
import logging
import io
import ipaddress
import pymisp
import re
import requests
from urllib import parse
DEFAULT_LL_PORTAL_API_URL = "https://user.lastline.com/papi"
DEFAULT_LL_ANALYSIS_API_URL = "https://analysis.lastline.com"
LL_HOSTED_DOMAINS = frozenset([
"user.lastline.com",
"user.emea.lastline.com",
])
def purge_none(d):
"""Purge None entries from a dictionary."""
return {k: v for k, v in d.items() if v is not None}
def get_task_link(uuid, analysis_url=None, portal_url=None):
"""
Get the task link given the task uuid and at least one API url.
:param str uuid: the task uuid
:param str|None analysis_url: the URL to the analysis API endpoint
:param str|None portal_url: the URL to the portal API endpoint
:rtype: str
:return: the task link
:raises ValueError: if not enough parameters have been provided
"""
if not analysis_url and not portal_url:
raise ValueError("Neither analysis URL or portal URL have been specified")
if analysis_url:
portal_url = "{}/papi".format(analysis_url.replace("analysis.", "user."))
portal_url_path = "../portal#/analyst/task/{}/overview".format(uuid)
return parse.urljoin(portal_url, portal_url_path)
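# For instance (a sketch; the uuid is the illustrative one used in the
# docstrings below, not a real task):
#
#     get_task_link("86097fb8e4cd00100464cb001b97ecbe",
#                   analysis_url="https://analysis.lastline.com")
#     # -> https://user.lastline.com/portal#/analyst/task/86097fb8e4cd00100464cb001b97ecbe/overview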
def get_portal_url_from_task_link(task_link):
"""
Return the portal API url related to the provided task link.
:param str task_link: a link
:rtype: str
:return: the portal API url
"""
parsed_uri = parse.urlparse(task_link)
return "{uri.scheme}://{uri.netloc}/papi".format(uri=parsed_uri)
def get_uuid_from_task_link(task_link):
"""
Return the uuid from a task link.
:param str task_link: a link
:rtype: str
:return: the uuid
:raises ValueError: if the link does not contain a valid task uuid
"""
try:
return re.findall("[a-fA-F0-9]{32}", task_link)[0]
except IndexError:
raise ValueError("Link does not contain a valid task uuid")
def is_task_hosted(task_link):
"""
Return whether the portal link is pointing to a hosted submission.
:param str task_link: a link
:rtype: boolean
:return: whether the link points to a hosted analysis
"""
for domain in LL_HOSTED_DOMAINS:
if domain in task_link:
return True
return False
class InvalidArgument(Exception):
"""Error raised invalid."""
class CommunicationError(Exception):
"""Exception raised in case of timeouts or other network problem."""
class Error(Exception):
"""Generic server error."""
class ApiError(Error):
"""Server error with a message and an error code."""
def __init__(self, error_msg, error_code=None):
super(ApiError, self).__init__(error_msg, error_code)
self.error_msg = error_msg
self.error_code = error_code
def __str__(self):
if self.error_code is None:
error_code = ""
else:
error_code = " ({})".format(self.error_code)
return "{}{}".format(self.error_msg, error_code)
class LastlineAbstractClient(abc.ABC):
""""A very basic HTTP client providing basic functionality."""
__metaclass__ = abc.ABCMeta
SUB_APIS = ('analysis', 'authentication', 'knowledgebase', 'login')
FORMATS = ["json", "xml"]
@classmethod
def sanitize_login_params(cls, api_key, api_token, username, password):
"""
Return a dictionary with either API or USER credentials.
:param str|None api_key: the API key
:param str|None api_token: the API token
:param str|None username: the username
:param str|None password: the password
:rtype: dict[str, str]
:return: the dictionary
:raises InvalidArgument: if too many values are invalid
"""
if api_key and api_token:
return {
"key": api_key,
"api_token": api_token,
}
elif username and password:
return {
"username": username,
"password": password,
}
else:
raise InvalidArgument("Arguments provided do not contain valid data")
@classmethod
def get_login_params_from_dict(cls, d):
"""
Get the module configuration from a dictionary.
:param dict[str, str] d: the dictionary
:rtype: dict[str, str]
:return: the parsed configuration
"""
api_key = d.get("key")
api_token = d.get("api_token")
username = d.get("username")
password = d.get("password")
return cls.sanitize_login_params(api_key, api_token, username, password)
@classmethod
def get_login_params_from_conf(cls, conf, section_name):
"""
Get the module configuration from a ConfigParser object.
:param ConfigParser conf: the conf object
:param str section_name: the section name
:rtype: dict[str, str]
:return: the parsed configuration
"""
api_key = conf.get(section_name, "key", fallback=None)
api_token = conf.get(section_name, "api_token", fallback=None)
username = conf.get(section_name, "username", fallback=None)
password = conf.get(section_name, "password", fallback=None)
return cls.sanitize_login_params(api_key, api_token, username, password)
@classmethod
def load_from_conf(cls, conf, section_name):
"""
Load client from a ConfigParser object.
:param ConfigParser conf: the conf object
:param str section_name: the section name
:rtype: T <- LastlineAbstractClient
:return: the loaded client
"""
url = conf.get(section_name, "url")
return cls(url, cls.get_login_params_from_conf(conf, section_name))
def __init__(self, api_url, login_params, timeout=60, verify_ssl=True):
"""
Instantiate a Lastline mini client.
:param str api_url: the URL of the API
:param dict[str, str] login_params: the login parameters
:param int timeout: the timeout
:param boolean verify_ssl: whether to verify the SSL certificate
"""
self._url = api_url
self._login_params = login_params
self._timeout = timeout
self._verify_ssl = verify_ssl
self._session = None
self._logger = logging.getLogger(__name__)
@abc.abstractmethod
def _login(self):
"""Login using account-based or key-based methods."""
def _is_logged_in(self):
"""Return whether we have an active session."""
return self._session is not None
@staticmethod
def _parse_response(response):
"""
Parse the response.
:param requests.Response response: the response
:rtype: tuple(str|None, Error|ApiError)
:return: a tuple with mutually exclusive fields (either the response or the error)
"""
try:
ret = response.json()
if "success" not in ret:
return None, Error("no success field in response")
if not ret["success"]:
error_msg = ret.get("error", "")
error_code = ret.get("error_code", None)
return None, ApiError(error_msg, error_code)
if "data" not in ret:
return None, Error("no data field in response")
return ret["data"], None
except ValueError as e:
return None, Error("Response not json {}".format(e))
def _handle_response(self, response, raw=False):
"""
Check a response for issues and parse the return.
:param requests.Response response: the response
:param boolean raw: whether the raw body should be returned
:rtype: str
:return: if raw, return the response content; if not raw, the data field
:raises: CommunicationError, ApiError, Error
"""
# Check for HTTP errors, and re-raise in case
try:
response.raise_for_status()
except requests.RequestException as e:
_, err = self._parse_response(response)
if isinstance(err, ApiError):
err_msg = "{}: {}".format(e, err.error_msg)
else:
err_msg = "{}".format(e)
raise CommunicationError(err_msg)
# Otherwise return the data (either parsed or not) but reraise if we have an API error
if raw:
return response.content
data, err = self._parse_response(response)
if err:
raise err
return data
def _build_url(self, sub_api, parts, requested_format="json"):
if sub_api not in self.SUB_APIS:
raise InvalidArgument(sub_api)
if requested_format not in self.FORMATS:
raise InvalidArgument(requested_format)
num_parts = 2 + len(parts)
pattern = "/".join(["%s"] * num_parts) + ".%s"
params = [self._url, sub_api] + parts + [requested_format]
return pattern % tuple(params)
def post(self, module, function, params=None, data=None, files=None, fmt="json"):
if isinstance(function, list):
functions = function
else:
functions = [function] if function else []
url = self._build_url(module, functions, requested_format=fmt)
return self.do_request(
url=url,
method="POST",
params=params,
data=data,
files=files,
fmt=fmt,
)
def get(self, module, function, params=None, fmt="json"):
if isinstance(function, list):
functions = function
else:
functions = [function] if function else []
url = self._build_url(module, functions, requested_format=fmt)
return self.do_request(
url=url,
method="GET",
params=params,
fmt=fmt,
)
def do_request(
self,
method,
url,
params=None,
data=None,
files=None,
fmt="json",
raw=False,
raw_response=False,
headers=None,
stream_response=False
):
if raw_response:
raw = True
if fmt:
fmt = fmt.lower().strip()
if fmt not in self.FORMATS:
raise InvalidArgument("Only json, xml, html and pdf supported")
elif not raw:
raise InvalidArgument("Unformatted response requires raw=True")
if fmt != "json" and not raw:
raise InvalidArgument("Non-json format requires raw=True")
if method not in ["POST", "GET"]:
raise InvalidArgument("Only POST and GET supported")
if not self._is_logged_in():
self._login()
try:
try:
response = self._session.request(
method=method,
url=url,
data=data,
params=params,
files=files,
verify=self._verify_ssl,
timeout=self._timeout,
stream=stream_response,
headers=headers,
)
except requests.RequestException as e:
raise CommunicationError(e)
if raw_response:
return response
return self._handle_response(response, raw)
except Error as e:
raise e
except CommunicationError as e:
raise e
class AnalysisClient(LastlineAbstractClient):
def _login(self):
"""
Creates auth session for malscape-service.
Credentials are 'key' and 'api_token'.
"""
if self._session is None:
self._session = requests.session()
url = self._build_url("authentication", ["login"])
self.do_request("POST", url, params=purge_none(self._login_params))
def get_progress(self, uuid):
"""
Get the completion progress of a given task.
:param str uuid: the unique identifier of the submitted task
:rtype: dict[str, int]
:return: a dictionary like the following:
{
"completed": 1,
"progress": 100
}
"""
url = self._build_url('analysis', ['get_progress'])
params = {'uuid': uuid}
return self.do_request("POST", url, params=params)
def get_result(self, uuid):
"""
Get report results for a given task.
:param str uuid: the unique identifier of the submitted task
:rtype: dict[str, any]
:return: a dictionary like the following:
{
"completed": 1,
"progress": 100
}
"""
# better: use 'get_results()' but that would break
# backwards-compatibility
url = self._build_url('analysis', ['get'])
params = {'uuid': uuid}
return self.do_request("GET", url, params=params)
def submit_file(
self,
file_data,
file_name=None,
password=None,
analysis_env=None,
allow_network_traffic=True,
analysis_timeout=None,
bypass_cache=False,
):
"""
Upload a file to be analyzed.
:param bytes file_data: the data as a byte sequence
:param str|None file_name: if set, represents the name of the file to submit
:param str|None password: if set, use it to extract the sample
:param str|None analysis_env: if set, e.g. windowsxp
:param boolean allow_network_traffic: if set to False, deny network connections
:param int|None analysis_timeout: if set, limit the duration of the analysis
:param boolean bypass_cache: whether to re-process a file (requires special permissions)
:rtype: dict[str, any]
:return: a dictionary in the following form if the analysis is already available:
{
"submission": "2019-11-17 09:33:23",
"child_tasks": [...],
"reports": [...],
"submission_timestamp": "2019-11-18 16:11:04",
"task_uuid": "86097fb8e4cd00100464cb001b97ecbe",
"score": 0,
"analysis_subject": {
"url": "https://www.google.com"
},
"last_submission_timestamp": "2019-11-18 16:11:04"
}
OR the following if the analysis is still pending:
{
"submission_timestamp": "2019-11-18 13:59:25",
"task_uuid": "f3c0ae115d51001017ff8da768fa6049",
}
"""
file_stream = io.BytesIO(file_data)
api_url = self._build_url("analysis", ["submit", "file"])
params = purge_none({
"bypass_cache": bypass_cache and 1 or None,
"analysis_timeout": analysis_timeout,
"analysis_env": analysis_env,
"allow_network_traffic": allow_network_traffic and 1 or None,
"filename": file_name,
"password": password,
"full_report_score": -1,
})
files = purge_none({
# If an explicit filename was provided, we can pass it down to
# python-requests to use it in the multipart/form-data. This avoids
# having python-requests trying to guess the filename based on stream
# attributes.
#
# The problem with this is that, if the filename is not ASCII, then
# this triggers a bug in flask/werkzeug which means the file is
# thrown away. Thus, we just force an ASCII name
"file": ('dummy-ascii-name-for-file-param', file_stream),
})
return self.do_request("POST", api_url, params=params, files=files)
def submit_url(
self,
url,
referer=None,
user_agent=None,
bypass_cache=False,
):
"""
Upload a URL to be analyzed.
:param str url: the url to analyze
:param str|None referer: the referer
:param str|None user_agent: the user agent
:param boolean bypass_cache: bypass_cache
:rtype: dict[str, any]
:return: a dictionary like the following if the analysis is already available:
{
"submission": "2019-11-17 09:33:23",
"child_tasks": [...],
"reports": [...],
"submission_timestamp": "2019-11-18 16:11:04",
"task_uuid": "86097fb8e4cd00100464cb001b97ecbe",
"score": 0,
"analysis_subject": {
"url": "https://www.google.com"
},
"last_submission_timestamp": "2019-11-18 16:11:04"
}
OR the following if the analysis is still pending:
{
"submission_timestamp": "2019-11-18 13:59:25",
"task_uuid": "f3c0ae115d51001017ff8da768fa6049",
}
"""
api_url = self._build_url("analysis", ["submit", "url"])
params = purge_none({
"url": url,
"referer": referer,
"bypass_cache": bypass_cache and 1 or None,
"user_agent": user_agent or None,
})
return self.do_request("POST", api_url, params=params)
class PortalClient(LastlineAbstractClient):
def _login(self):
"""
Login using account-based or key-based methods.
Credentials are 'username' and 'password'
"""
if self._session is None:
self._session = requests.session()
self.post("login", function=None, data=self._login_params)
def get_progress(self, uuid, analysis_instance=None):
"""
Get the completion progress of a given task.
:param str uuid: the unique identifier of the submitted task
:param str analysis_instance: if set, defines the analysis instance to query
:rtype: dict[str, int]
:return: a dictionary like the following:
{
"completed": 1,
"progress": 100
}
"""
params = purge_none({"uuid": uuid, "analysis_instance": analysis_instance})
return self.get("analysis", "get_progress", params=params)
def get_result(self, uuid, analysis_instance=None):
"""
Get report results for a given task.
:param str uuid: the unique identifier of the submitted task
:param str analysis_instance: if set, defines the analysis instance to query
:rtype: dict[str, any]
:return: a dictionary like the following:
{
"completed": 1,
"progress": 100
}
"""
params = purge_none(
{
"uuid": uuid,
"analysis_instance": analysis_instance,
"report_format": "json",
}
)
return self.get("analysis", "get_result", params=params)
def submit_url(
self,
url,
referer=None,
user_agent=None,
bypass_cache=False,
):
"""
Upload a URL to be analyzed.
:param str url: the url to analyze
:param str|None referer: the referer
:param str|None user_agent: the user agent
:param boolean bypass_cache: bypass_cache
:rtype: dict[str, any]
:return: a dictionary like the following if the analysis is already available:
{
"submission": "2019-11-17 09:33:23",
"child_tasks": [...],
"reports": [...],
"submission_timestamp": "2019-11-18 16:11:04",
"task_uuid": "86097fb8e4cd00100464cb001b97ecbe",
"score": 0,
"analysis_subject": {
"url": "https://www.google.com"
},
"last_submission_timestamp": "2019-11-18 16:11:04"
}
OR the following if the analysis is still pending:
{
"submission_timestamp": "2019-11-18 13:59:25",
"task_uuid": "f3c0ae115d51001017ff8da768fa6049",
}
"""
params = purge_none(
{
"url": url,
"bypass_cache": bypass_cache,
"referer": referer,
"user_agent": user_agent
}
)
return self.post("analysis", "submit_url", params=params)
def submit_file(
self,
file_data,
file_name=None,
password=None,
analysis_env=None,
allow_network_traffic=True,
analysis_timeout=None,
bypass_cache=False,
):
"""
Upload a file to be analyzed.
:param bytes file_data: the data as a byte sequence
:param str|None file_name: if set, represents the name of the file to submit
:param str|None password: if set, use it to extract the sample
:param str|None analysis_env: if set, e.g. windowsxp
:param boolean allow_network_traffic: if set to False, deny network connections
:param int|None analysis_timeout: if set, limit the duration of the analysis
:param boolean bypass_cache: whether to re-process a file (requires special permissions)
:rtype: dict[str, any]
:return: a dictionary in the following form if the analysis is already available:
{
"submission": "2019-11-17 09:33:23",
"child_tasks": [...],
"reports": [...],
"submission_timestamp": "2019-11-18 16:11:04",
"task_uuid": "86097fb8e4cd00100464cb001b97ecbe",
"score": 0,
"analysis_subject": {
"url": "https://www.google.com"
},
"last_submission_timestamp": "2019-11-18 16:11:04"
}
OR the following if the analysis is still pending:
{
"submission_timestamp": "2019-11-18 13:59:25",
"task_uuid": "f3c0ae115d51001017ff8da768fa6049",
}
"""
params = purge_none(
{
"filename": file_name,
"password": password,
"analysis_env": analysis_env,
"allow_network_traffic": allow_network_traffic,
"analysis_timeout": analysis_timeout,
"bypass_cache": bypass_cache,
}
)
files = {"file": (file_name, file_data, "application/octet-stream")}
return self.post("analysis", "submit_file", params=params, files=files)
class LastlineResultBaseParser(object):
"""
This is a parser to extract *basic* information from a Lastline result dictionary.
Note: this is a *version 0*: the information we extract is merely related to the behaviors and
the HTTP connections. Further iterations will include host activity such as files, mutexes,
registry keys, strings, etc.
"""
def __init__(self):
"""Constructor."""
self.misp_event = None
@staticmethod
def _get_mitre_techniques(result):
return [
"misp-galaxy:mitre-attack-pattern=\"{} - {}\"".format(w[0], w[1])
for w in sorted(set([
(y["id"], y["name"])
for x in result.get("malicious_activity", [])
for y in result.get("activity_to_mitre_techniques", {}).get(x, [])
]))
]
def parse(self, analysis_link, result):
"""
Parse the analysis result into a MISP event.
:param str analysis_link: the analysis link
:param dict[str, any] result: the JSON returned by the analysis client.
:rtype: MISPEvent
:return: some results that can be consumed by MISP.
"""
self.misp_event = pymisp.MISPEvent()
# Add analysis subject info
if "url" in result["analysis_subject"]:
o = pymisp.MISPObject("url")
o.add_attribute("url", result["analysis_subject"]["url"])
else:
o = pymisp.MISPObject("file")
o.add_attribute("md5", type="md5", value=result["analysis_subject"]["md5"])
o.add_attribute("sha1", type="sha1", value=result["analysis_subject"]["sha1"])
o.add_attribute("sha256", type="sha256", value=result["analysis_subject"]["sha256"])
o.add_attribute(
"mimetype",
type="mime-type",
value=result["analysis_subject"]["mime_type"]
)
self.misp_event.add_object(o)
# Add HTTP requests from url analyses
network_dict = result.get("report", {}).get("analysis", {}).get("network", {})
for request in network_dict.get("requests", []):
parsed_uri = parse.urlparse(request["url"])
o = pymisp.MISPObject(name='http-request')
o.add_attribute('host', parsed_uri.netloc)
o.add_attribute('method', "GET")
o.add_attribute('uri', request["url"])
o.add_attribute("ip", request["ip"])
self.misp_event.add_object(o)
# Add network behaviors from files
for subject in result.get("report", {}).get("analysis_subjects", []):
# Add DNS requests
for dns_query in subject.get("dns_queries", []):
hostname = dns_query.get("hostname")
# Skip if it is an IP address
try:
if hostname == "wpad":
continue
_ = ipaddress.ip_address(hostname)
continue
except ValueError:
pass
o = pymisp.MISPObject(name='dns-record')
o.add_attribute('queried-domain', hostname)
self.misp_event.add_object(o)
# Add HTTP conversations (as network connection and as http request)
for http_conversation in subject.get("http_conversations", []):
o = pymisp.MISPObject(name="network-connection")
o.add_attribute("ip-src", http_conversation["src_ip"])
o.add_attribute("ip-dst", http_conversation["dst_ip"])
o.add_attribute("src-port", http_conversation["src_port"])
o.add_attribute("dst-port", http_conversation["dst_port"])
o.add_attribute("hostname-dst", http_conversation["dst_host"])
o.add_attribute("layer3-protocol", "IP")
o.add_attribute("layer4-protocol", "TCP")
o.add_attribute("layer7-protocol", "HTTP")
self.misp_event.add_object(o)
method, path, http_version = http_conversation["url"].split(" ")
if http_conversation["dst_port"] == 80:
uri = "http://{}{}".format(http_conversation["dst_host"], path)
else:
uri = "http://{}:{}{}".format(
http_conversation["dst_host"],
http_conversation["dst_port"],
path
)
o = pymisp.MISPObject(name='http-request')
o.add_attribute('host', http_conversation["dst_host"])
o.add_attribute('method', method)
o.add_attribute('uri', uri)
o.add_attribute('ip', http_conversation["dst_ip"])
self.misp_event.add_object(o)
# Add sandbox info like score and sandbox type
o = pymisp.MISPObject(name="sandbox-report")
sandbox_type = "saas" if is_task_hosted(analysis_link) else "on-premise"
o.add_attribute("score", result["score"])
o.add_attribute("sandbox-type", sandbox_type)
o.add_attribute("{}-sandbox".format(sandbox_type), "lastline")
o.add_attribute("permalink", analysis_link)
self.misp_event.add_object(o)
# Add behaviors
o = pymisp.MISPObject(name="sb-signature")
o.add_attribute("software", "Lastline")
for activity in result.get("malicious_activity", []):
a = pymisp.MISPAttribute()
a.from_dict(type="text", value=activity)
o.add_attribute("signature", **a)
self.misp_event.add_object(o)
# Add mitre techniques
for technique in self._get_mitre_techniques(result):
self.misp_event.add_tag(technique)
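# A hedged end-to-end sketch of how the pieces above fit together, once the
# analysis has completed (the key and token values are placeholders, not real
# credentials):
#
#     client = AnalysisClient(
#         DEFAULT_LL_ANALYSIS_API_URL,
#         {"key": "LASTLINE_API_KEY", "api_token": "LASTLINE_API_TOKEN"},
#     )
#     task = client.submit_url("https://www.google.com")
#     result = client.get_result(task["task_uuid"])
#     link = get_task_link(task["task_uuid"],
#                          analysis_url=DEFAULT_LL_ANALYSIS_API_URL)
#     parser = LastlineResultBaseParser()
#     parser.parse(link, result)
#     misp_event = parser.misp_event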

View File

@ -0,0 +1,8 @@
"""vt_graph_parser.
This module provides methods to import graphs from MISP.
"""
from .helpers import * # noqa
from .importers import * # noqa

View File

@ -0,0 +1,20 @@
"""vt_graph_parser.errors.
This module provides custom errors for data importers.
"""
class GraphImportError(Exception):
pass
class InvalidFileFormatError(Exception):
pass
class MispEventNotFoundError(Exception):
pass
class ServerError(Exception):
pass

View File

@ -0,0 +1,7 @@
"""vt_graph_parser.helpers.
This module provides functions and attributes to help MISP importers.
"""
__all__ = ["parsers", "rules", "wrappers"]

View File

@ -0,0 +1,88 @@
"""vt_graph_parser.helpers.parsers.
This module provides parsers for MISP inputs.
"""
from vt_graph_parser.helpers.wrappers import MispAttribute
MISP_INPUT_ATTR = [
"hostname",
"domain",
"ip-src",
"ip-dst",
"md5",
"sha1",
"sha256",
"url",
"filename|md5",
"filename",
"target-user",
"target-email"
]
VIRUSTOTAL_GRAPH_LINK_PREFIX = "https://www.virustotal.com/graph/"
def _parse_data(attributes, objects):
"""Parse MISP event attributes and objects data.
Args:
attributes ([dict]): list with the MISP event attributes data.
objects ([dict]): list with the MISP event objects data.
Returns:
([MispAttribute], str): MISP attributes and VTGraph link if exists.
Link defaults to "".
"""
attributes_data = []
vt_graph_link = ""
# Get simple MISP event attributes.
attributes_data += (
[attr for attr in attributes
if attr.get("type") in MISP_INPUT_ATTR])
# Get attributes from MISP objects too.
if objects:
for object_ in objects:
object_attrs = object_.get("Attribute", [])
attributes_data += (
[attr for attr in object_attrs
if attr.get("type") in MISP_INPUT_ATTR])
# Check if there is any VirusTotal Graph computed in MISP event.
vt_graph_links = (
attr for attr in attributes if attr.get("type") == "link"
and attr.get("value", "").startswith(VIRUSTOTAL_GRAPH_LINK_PREFIX))
# MISP could have more than one VirusTotal Graph, so we will take
# the last one.
current_id = 0 # MISP attribute id is the number of the attribute.
vt_graph_link = ""
for link in vt_graph_links:
if int(link.get("id")) > current_id:
current_id = int(link.get("id"))
vt_graph_link = link.get("value")
attributes = [
MispAttribute(data["type"], data["category"], data["value"])
for data in attributes_data]
return (attributes,
vt_graph_link.replace(VIRUSTOTAL_GRAPH_LINK_PREFIX, ""))
def parse_pymisp_response(payload):
"""Get event attributes and VirusTotal Graph id from pymisp response.
Args:
payload (dict): dictionary which contains pymisp response.
Returns:
([MispAttribute], str): MISP attributes and VTGraph link if exists.
Link defaults to "".
"""
event_attrs = payload.get("Attribute", [])
objects = payload.get("Object")
return _parse_data(event_attrs, objects)
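# A small illustration (hypothetical payload; a real one comes straight from
# the MISP API response for an event):
#
#     payload = {
#         "Attribute": [
#             {"id": "1", "type": "domain", "category": "Network activity",
#              "value": "example.com"},
#         ],
#         "Object": [],
#     }
#     attributes, graph_id = parse_pymisp_response(payload)
#     # attributes -> [MispAttribute("domain", "Network activity", "example.com")]
#     # graph_id   -> "" (no VirusTotal Graph link in the event)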

View File

@ -0,0 +1,304 @@
"""vt_graph_parser.helpers.rules.
This module provides rules that help MISP importers connect MISP attributes
using VirusTotal relationships. Check all the available relationships here:
- File: https://developers.virustotal.com/v3/reference/#files-relationships
- URL: https://developers.virustotal.com/v3/reference/#urls-relationships
- Domain: https://developers.virustotal.com/v3/reference/#domains-relationships
- IP: https://developers.virustotal.com/v3/reference/#ip-relationships
"""
import abc
class MispEventRule(object):
"""Rules for MISP event nodes connection object wrapper."""
def __init__(self, last_rule=None, node=None):
"""Create a MispEventRule instance.
MispEventRule is a collection of rules that can infer the relationships
between nodes from MISP events.
Args:
last_rule (MispEventRule): previous rule.
node (Node): current node.
"""
self.last_rule = last_rule
self.node = node
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def get_last_different_rule(self):
"""Search the last rule whose event was different from actual.
Returns:
MispEventRule: the last different rule.
"""
if not isinstance(self, self.last_rule.__class__):
return self.last_rule
else:
return self.last_rule.get_last_different_rule()
def resolve_relation(self, graph, node, misp_category):
"""Try to infer a relationship between two nodes.
This method is based on a non-deterministic finite automaton; for this
reason the next rule depends only on the current rule and the input node.
For example, if the current rule is a MispEventDomainRule and the given node
is an ip_address node, the connection type between them will be
`resolutions` and this rule will transition to a MispEventIPRule.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
misp_category: (str): MISP category of the given node.
Returns:
MispEventRule: the transited rule.
"""
if node.node_type in self.relation_event:
return self.relation_event[node.node_type](graph, node, misp_category)
else:
return self.manual_link(graph, node)
def manual_link(self, graph, node):
"""Creates a manual link between self.node and the given node.
We accept MISP types that VirusTotal does not know how to link, so we create
a end to end relationship instead of create an unknown relationship node.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
Returns:
MispEventRule: the transited rule.
"""
graph.add_link(self.node.node_id, node.node_id, "manual")
return self
@abc.abstractmethod
def __file_transition(self, graph, node, misp_category):
"""Make a new transition due to file attribute event.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
misp_category: (str): MISP category of the given node.
Returns:
MispEventRule: the transited rule.
"""
pass
@abc.abstractmethod
def __ip_transition(self, graph, node, misp_category):
"""Make a new transition due to ip attribute event.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
misp_category: (str): MISP category of the given node.
Returns:
MispEventRule: the transited rule.
"""
pass
@abc.abstractmethod
def __url_transition(self, graph, node, misp_category):
"""Make a new transition due to url attribute event.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
misp_category: (str): MISP category of the given node.
Returns:
MispEventRule: the transited rule.
"""
pass
@abc.abstractmethod
def __domain_transition(self, graph, node, misp_category):
"""Make a new transition due to domain attribute event.
Args:
graph (VTGraph): graph to be computed.
node (Node): the node to be linked.
misp_category: (str): MISP category of the given node.
Returns:
MispEventRule: the transited rule.
"""
pass
class MispEventURLRule(MispEventRule):
"""Rule for URL event."""
def __init__(self, last_rule=None, node=None):
super(MispEventURLRule, self).__init__(last_rule, node)
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def __file_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "downloaded_files")
return MispEventFileRule(self, node)
def __ip_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "contacted_ips")
return MispEventIPRule(self, node)
def __url_transition(self, graph, node, misp_category):
suitable_rule = self.get_last_different_rule()
if not isinstance(suitable_rule, MispEventInitialRule):
return suitable_rule.resolve_relation(graph, node, misp_category)
else:
return MispEventURLRule(self, node)
def __domain_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "contacted_domains")
return MispEventDomainRule(self, node)
class MispEventIPRule(MispEventRule):
"""Rule for IP event."""
def __init__(self, last_rule=None, node=None):
super(MispEventIPRule, self).__init__(last_rule, node)
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def __file_transition(self, graph, node, misp_category):
connection_type = "communicating_files"
if misp_category == "Artifacts dropped":
connection_type = "downloaded_files"
graph.add_link(self.node.node_id, node.node_id, connection_type)
return MispEventFileRule(self, node)
def __ip_transition(self, graph, node, misp_category):
suitable_rule = self.get_last_different_rule()
if not isinstance(suitable_rule, MispEventInitialRule):
return suitable_rule.resolve_relation(graph, node, misp_category)
else:
return MispEventIPRule(self, node)
def __url_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "urls")
return MispEventURLRule(self, node)
def __domain_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "resolutions")
return MispEventDomainRule(self, node)
class MispEventDomainRule(MispEventRule):
"""Rule for domain event."""
def __init__(self, last_rule=None, node=None):
super(MispEventDomainRule, self).__init__(last_rule, node)
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def __file_transition(self, graph, node, misp_category):
connection_type = "communicating_files"
if misp_category == "Artifacts dropped":
connection_type = "downloaded_files"
graph.add_link(self.node.node_id, node.node_id, connection_type)
return MispEventFileRule(self, node)
def __ip_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "resolutions")
return MispEventIPRule(self, node)
def __url_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "urls")
return MispEventURLRule(self, node)
def __domain_transition(self, graph, node, misp_category):
suitable_rule = self.get_last_different_rule()
if not isinstance(suitable_rule, MispEventInitialRule):
return suitable_rule.resolve_relation(graph, node, misp_category)
else:
graph.add_link(self.node.node_id, node.node_id, "siblings")
return MispEventDomainRule(self, node)
class MispEventFileRule(MispEventRule):
"""Rule for File event."""
def __init__(self, last_rule=None, node=None):
super(MispEventFileRule, self).__init__(last_rule, node)
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def __file_transition(self, graph, node, misp_category):
suitable_rule = self.get_last_different_rule()
if not isinstance(suitable_rule, MispEventInitialRule):
return suitable_rule.resolve_relation(graph, node, misp_category)
else:
return MispEventFileRule(self, node)
def __ip_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "contacted_ips")
return MispEventIPRule(self, node)
def __url_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "contacted_urls")
return MispEventURLRule(self, node)
def __domain_transition(self, graph, node, misp_category):
graph.add_link(self.node.node_id, node.node_id, "contacted_domains")
return MispEventDomainRule(self, node)
class MispEventInitialRule(MispEventRule):
"""Initial rule."""
def __init__(self, last_rule=None, node=None):
super(MispEventInitialRule, self).__init__(last_rule, node)
self.relation_event = {
"ip_address": self.__ip_transition,
"url": self.__url_transition,
"domain": self.__domain_transition,
"file": self.__file_transition
}
def __file_transition(self, graph, node, misp_category):
return MispEventFileRule(self, node)
def __ip_transition(self, graph, node, misp_category):
return MispEventIPRule(self, node)
def __url_transition(self, graph, node, misp_category):
return MispEventURLRule(self, node)
def __domain_transition(self, graph, node, misp_category):
return MispEventDomainRule(self, node)
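# A minimal sketch of the rule automaton in action, assuming a
# vt_graph_api.VTGraph with valid credentials (the API key below is a
# placeholder) and two nodes added beforehand:
#
#     import vt_graph_api
#
#     graph = vt_graph_api.VTGraph(api_key="VT_API_KEY", name="rule demo")
#     domain_node = graph.add_node("example.com", "domain", False)
#     ip_node = graph.add_node("198.51.100.1", "ip_address", False)
#     rule = MispEventInitialRule()
#     rule = rule.resolve_relation(graph, domain_node, "Network activity")
#     # -> MispEventDomainRule, no link created yet
#     rule = rule.resolve_relation(graph, ip_node, "Network activity")
#     # -> MispEventIPRule, with a "resolutions" link between the two nodes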

View File

@ -0,0 +1,58 @@
"""vt_graph_parser.helpers.wrappers.
This module provides a Python object wrapper for MISP objects.
"""
class MispAttribute(object):
"""Python object wrapper for MISP attribute.
Attributes:
type (str): VirusTotal node type.
category (str): MISP attribute category.
value (str): node id.
label (str): node name.
misp_type (str): MISP node type.
"""
MISP_TYPES_REFERENCE = {
"hostname": "domain",
"domain": "domain",
"ip-src": "ip_address",
"ip-dst": "ip_address",
"url": "url",
"filename|X": "file",
"filename": "file",
"md5": "file",
"sha1": "file",
"sha256": "file",
"target-user": "victim",
"target-email": "email"
}
def __init__(self, misp_type, category, value, label=""):
"""Constructor for a MispAttribute.
Args:
misp_type (str): MISP type attribute.
category (str): MISP category attribute.
value (str): attribute value.
label (str): attribute label.
"""
if misp_type.startswith("filename|"):
label, value = value.split("|")
misp_type = "filename|X"
if misp_type == "filename":
label = value
self.type = self.MISP_TYPES_REFERENCE.get(misp_type)
self.category = category
self.value = value
self.label = label
self.misp_type = misp_type
def __eq__(self, other):
return (isinstance(other, self.__class__) and self.value == other.value and self.type == other.type)
def __repr__(self):
return 'MispAttribute("{type}", "{category}", "{value}")'.format(type=self.type, category=self.category, value=self.value)
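# For example (illustrative values), a composite attribute is split so that
# the label keeps the filename and the value keeps the hash:
#
#     attr = MispAttribute("filename|md5", "Payload delivery",
#                          "dropper.exe|44d88612fea8a8f36de82e1278abb02f")
#     attr.type       # "file"
#     attr.label      # "dropper.exe"
#     attr.value      # "44d88612fea8a8f36de82e1278abb02f"
#     attr.misp_type  # "filename|X"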

View File

@ -0,0 +1,7 @@
"""vt_graph_parser.importers.
This module provides methods to import graphs from MISP.
"""
__all__ = ["base", "pymisp_response"]

View File

@ -0,0 +1,98 @@
"""vt_graph_parser.importers.base.
This module provides a common method to import a graph from MISP attributes.
"""
import vt_graph_api
from vt_graph_parser.helpers.rules import MispEventInitialRule
def import_misp_graph(
misp_attributes, graph_id, vt_api_key, fetch_information, name,
private, fetch_vt_enterprise, user_editors, user_viewers, group_editors,
group_viewers, use_vt_to_connect_the_graph, max_api_quotas,
max_search_depth):
"""Import VirusTotal Graph from MISP.
Args:
misp_attributes ([MispAttribute]): list with the MISP attributes which
will be added to the returned graph.
graph_id (str): if supplied, the graph will be loaded instead of computed again.
vt_api_key (str): VT API Key.
fetch_information (bool): whether the script will fetch
information for added nodes in VT. Defaults to True.
name (str): graph title. Defaults to "".
private (bool): True for private graphs. You need to have
Private Graph premium features enabled in your subscription. Defaults
to False.
fetch_vt_enterprise (bool, optional): if True, the graph will search any
available information using VirusTotal Intelligence for the node if there
is no normal information for it. Defaults to False.
user_editors ([str]): usernames that can edit the graph.
Defaults to None.
user_viewers ([str]): usernames that can view the graph.
Defaults to None.
group_editors ([str]): groups that can edit the graph.
Defaults to None.
group_viewers ([str]): groups that can view the graph.
Defaults to None.
use_vt_to_connect_the_graph (bool): if True, graph nodes will
be linked using VirusTotal API. Otherwise, the links will be generated
using production rules based on MISP attributes order. Defaults to
False.
max_api_quotas (int): maximum number of api quotas that could
be consumed to resolve graph using VirusTotal API. Defaults to 20000.
max_search_depth (int, optional): max search depth to explore
relationship between nodes when use_vt_to_connect_the_graph is True.
Defaults to 3.
If use_vt_to_connect_the_graph is True, it will take some time to compute
the graph.
Returns:
vt_graph_api.graph.VTGraph: the imported graph.
"""
rule = MispEventInitialRule()
# Check if the event has been already computed in VirusTotal Graph. Otherwise
# a new graph will be created.
if not graph_id:
graph = vt_graph_api.VTGraph(
api_key=vt_api_key, name=name, private=private,
user_editors=user_editors, user_viewers=user_viewers,
group_editors=group_editors, group_viewers=group_viewers)
else:
graph = vt_graph_api.VTGraph.load_graph(graph_id, vt_api_key)
attributes_to_add = [attr for attr in misp_attributes
if not graph.has_node(attr.value)]
total_expandable_attrs = max(sum(
1 for attr in attributes_to_add
if attr.type in vt_graph_api.Node.SUPPORTED_NODE_TYPES),
1)
max_quotas_per_search = max(
int(max_api_quotas / total_expandable_attrs), 1)
previous_node_id = ""
for attr in attributes_to_add:
# Add the current attr as node to the graph.
added_node = graph.add_node(
attr.value, attr.type, fetch_information, fetch_vt_enterprise,
attr.label)
# If use_vt_to_connect_the_graph is True, the nodes will be connected using
# the VT API.
if use_vt_to_connect_the_graph:
if (attr.type not in vt_graph_api.Node.SUPPORTED_NODE_TYPES and previous_node_id):
graph.add_link(previous_node_id, attr.value, "manual")
else:
graph.connect_with_graph(
attr.value, max_quotas_per_search, max_search_depth,
fetch_info_collected_nodes=fetch_information)
else:
rule = rule.resolve_relation(graph, added_node, attr.category)
return graph
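# A hedged usage sketch (all values are placeholders; the function takes its
# parameters positionally, in the order documented above):
#
#     from vt_graph_parser.helpers.wrappers import MispAttribute
#
#     attrs = [MispAttribute("domain", "Network activity", "example.com")]
#     graph = import_misp_graph(
#         attrs, "", "VT_API_KEY", False, "Graph created from MISP event",
#         False, False, None, None, None, None, False, 1000, 3)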

View File

@ -0,0 +1,73 @@
"""vt_graph_parser.importers.pymisp_response.
This module provides a graph importer method for MISP events, using the
response payload given by the MISP API directly.
"""
from vt_graph_parser.helpers.parsers import parse_pymisp_response
from vt_graph_parser.importers.base import import_misp_graph
def from_pymisp_response(
payload, vt_api_key, fetch_information=True,
private=False, fetch_vt_enterprise=False, user_editors=None,
user_viewers=None, group_editors=None, group_viewers=None,
use_vt_to_connect_the_graph=False, max_api_quotas=1000,
max_search_depth=3, expand_node_one_level=False):
"""Import VirusTotal Graph from MISP JSON file.
Args:
payload (dict): dictionary which contains the request payload.
vt_api_key (str): VT API Key.
fetch_information (bool, optional): whether the script will fetch
information for added nodes in VT. Defaults to True.
name (str, optional): graph title. Defaults to "".
private (bool, optional): True for private graphs. You need to have
Private Graph premium features enabled in your subscription. Defaults
to False.
fetch_vt_enterprise (bool, optional): if True, the graph will search any
available information using VirusTotal Intelligence for the node if there
is no normal information for it. Defaults to False.
user_editors ([str], optional): usernames that can edit the graph.
Defaults to None.
user_viewers ([str], optional): usernames that can view the graph.
Defaults to None.
group_editors ([str], optional): groups that can edit the graph.
Defaults to None.
group_viewers ([str], optional): groups that can view the graph.
Defaults to None.
use_vt_to_connect_the_graph (bool, optional): if True, graph nodes will
be linked using VirusTotal API. Otherwise, the links will be generated
using production rules based on MISP attributes order. Defaults to
False.
max_api_quotas (int, optional): maximum number of api quotas that could
be consumed to resolve graph using VirusTotal API. Defaults to 20000.
max_search_depth (int, optional): max search depth to explore
relationship between nodes when use_vt_to_connect_the_graph is True.
Defaults to 3.
expand_node_one_level (bool, optional): expand the entire graph one level.
Defaults to False.
If use_vt_to_connect_the_graph is True, it will take some time to compute
the graph.
Raises:
LoaderError: if JSON file is invalid.
Returns:
[vt_graph_api.graph.VTGraph]: the imported graphs.
"""
graphs = []
for event_payload in payload['data']:
misp_attrs, graph_id = parse_pymisp_response(event_payload)
name = "Graph created from MISP event"
graph = import_misp_graph(
misp_attrs, graph_id, vt_api_key, fetch_information, name,
private, fetch_vt_enterprise, user_editors, user_viewers, group_editors,
group_viewers, use_vt_to_connect_the_graph, max_api_quotas,
max_search_depth)
if expand_node_one_level:
graph.expand_n_level(1)
graphs.append(graph)
return graphs
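# A hedged sketch of wiring this to a MISP API response (the VT API key is a
# placeholder and `event_json` stands for the dictionary returned by the MISP
# REST API for a single event):
#
#     payload = {"data": [event_json["Event"]]}
#     graphs = from_pymisp_response(payload, "VT_API_KEY",
#                                   fetch_information=False)
#     graph = graphs[0]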

View File

@ -1,17 +1,31 @@
from . import _vmray # noqa
import os
import sys
sys.path.append('{}/lib'.format('/'.join((os.path.realpath(__file__)).split('/')[:-3])))
__all__ = ['cuckoo_submit', 'vmray_submit', 'bgpranking', 'circl_passivedns', 'circl_passivessl',
'countrycode', 'cve', 'cve_advanced', 'dns', 'btc_steroids', 'domaintools', 'eupi',
'farsight_passivedns', 'ipasn', 'passivetotal', 'sourcecache', 'virustotal',
'whois', 'shodan', 'reversedns', 'geoip_country', 'wiki', 'iprep',
'countrycode', 'cve', 'cve_advanced', 'cpe', 'dns', 'btc_steroids', 'domaintools', 'eupi',
'eql', 'farsight_passivedns', 'ipasn', 'passivetotal', 'sourcecache', 'virustotal',
'whois', 'shodan', 'reversedns', 'geoip_asn', 'geoip_city', 'geoip_country', 'wiki', 'iprep',
'threatminer', 'otx', 'threatcrowd', 'vulndb', 'crowdstrike_falcon',
'yara_syntax_validator', 'hashdd', 'onyphe', 'onyphe_full', 'rbl',
'xforceexchange', 'sigma_syntax_validator', 'stix2_pattern_syntax_validator',
'sigma_queries', 'dbl_spamhaus', 'vulners', 'yara_query', 'macaddress_io',
'intel471', 'backscatter_io', 'btc_scam_check', 'hibp', 'greynoise', 'macvendors',
'qrcode', 'ocr-enrich', 'pdf-enrich', 'docx-enrich', 'xlsx-enrich', 'pptx-enrich',
'ods-enrich', 'odt-enrich', 'joesandbox_submit', 'joesandbox_query', 'urlhaus',
'virustotal_public']
'qrcode', 'ocr_enrich', 'pdf_enrich', 'docx_enrich', 'xlsx_enrich', 'pptx_enrich',
'ods_enrich', 'odt_enrich', 'joesandbox_submit', 'joesandbox_query', 'urlhaus',
'virustotal_public', 'apiosintds', 'urlscan', 'securitytrails', 'apivoid',
'assemblyline_submit', 'assemblyline_query', 'ransomcoindb', 'malwarebazaar',
'lastline_query', 'lastline_submit', 'sophoslabs_intelix', 'cytomic_orion', 'censys_enrich',
'trustar_enrich', 'recordedfuture', 'html_to_markdown']
minimum_required_fields = ('type', 'uuid', 'value')
checking_error = 'containing at least a "type" field and a "value" field'
standard_error_message = 'This module requires an "attribute" field as input'
def check_input_attribute(attribute, requirements=minimum_required_fields):
return all(feature in attribute for feature in requirements)
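# A minimal sketch of how the expansion modules below typically use these
# helpers; the handler and request names are hypothetical.
def _example_handler(request):
    if not request.get('attribute') or not check_input_attribute(request['attribute']):
        return {'error': f'{standard_error_message}, {checking_error}.'}
    return {'results': {}}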

View File

@ -0,0 +1,96 @@
#!/usr/bin/env python
import requests
import logging
import os
# import pprint
copyright = """
Copyright 2019 (C) by Aaron Kaplan <aaron@lo-res.org>, all rights reserved.
This file is part of the ransomcoindb project and is licensed under the AGPL 3.0 license
"""
__version__ = 0.1
baseurl = "https://ransomcoindb.concinnity-risks.com/api/v1/"
user_agent = "ransomcoindb client via python-requests/%s" % requests.__version__
urls = {'BTC': {'btc': baseurl + 'bin2btc/',
'md5': baseurl + 'bin2btc/md5/',
'sha1': baseurl + 'bin2btc/sha1/',
'sha256': baseurl + 'bin2btc/sha256/',
},
'XMR': {'xmr': baseurl + 'bin2crypto/XMR/',
'md5': baseurl + 'bin2crypto/XMR/md5/',
'sha1': baseurl + 'bin2crypto/XMR/sha1/',
'sha256': baseurl + 'bin2crypto/XMR/sha256/',
}
}
def get_data_by(coin: str, key: str, value: str, api_key: str):
"""
Abstract function to fetch data from the bin2btc/{key} endpoint.
This function must be made concrete by generating a relevant function.
See below for examples.
"""
# pprint.pprint("api-key: %s" % api_key)
headers = {'x-api-key': api_key, 'content-type': 'application/json'}
headers.update({'User-Agent': user_agent})
# check first if valid:
valid_coins = ['BTC', 'XMR']
valid_keys = ['btc', 'md5', 'sha1', 'sha256']
if coin not in valid_coins or key not in valid_keys:
logging.error("get_data_by_X(): not a valid key parameter. Must be a valid coin (i.e. from %r) and one of: %r" % (valid_coins, valid_keys))
return None
try:
url = urls[coin.upper()][key]
logging.debug("url = %s" % url)
if not url:
logging.error("Could not find a valid coin/key combination. Must be a valid coin (i.e. from %r) and one of: %r" % (valid_coins, valid_keys))
return None
r = requests.get(url + "%s" % (value), headers=headers)
    except Exception as ex:
        logging.error("could not fetch from the service. Error: %s" % str(ex))
        return None
    if r.status_code != 200:
        logging.error("could not fetch from the service. Status code: %s" % r.status_code)
        return None
    return r.json()
def get_bin2btc_by_btc(btc_addr: str, api_key: str):
""" Function to fetch the data from the bin2btc/{btc} endpoint """
return get_data_by('BTC', 'btc', btc_addr, api_key)
def get_bin2btc_by_md5(md5: str, api_key: str):
""" Function to fetch the data from the bin2btc/{md5} endpoint """
return get_data_by('BTC', 'md5', md5, api_key)
def get_bin2btc_by_sha1(sha1: str, api_key: str):
""" Function to fetch the data from the bin2btc/{sha1} endpoint """
return get_data_by('BTC', 'sha1', sha1, api_key)
def get_bin2btc_by_sha256(sha256: str, api_key: str):
""" Function to fetch the data from the bin2btc/{sha256} endpoint """
return get_data_by('BTC', 'sha256', sha256, api_key)
if __name__ == "__main__":
""" Just for testing on the cmd line. """
to_btc = "1KnuC7FdhGuHpvFNxtBpz299Q5QteUdNCq"
api_key = os.getenv('api_key')
r = get_bin2btc_by_btc(to_btc, api_key)
print(r)
r = get_bin2btc_by_md5("abc", api_key)
print(r)
r = get_data_by('XMR', 'md5', "452878CD7", api_key)
print(r)

View File

@ -0,0 +1,147 @@
import json
import logging
import sys
import os
from apiosintDS import apiosintDS
log = logging.getLogger('apiosintDS')
log.setLevel(logging.DEBUG)
apiodbg = logging.StreamHandler(sys.stdout)
apiodbg.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
apiodbg.setFormatter(formatter)
log.addHandler(apiodbg)
misperrors = {'error': 'Error'}
mispattributes = {'input': ["domain", "domain|ip", "hostname", "ip-dst", "ip-src", "ip-dst|port", "ip-src|port", "url",
"md5", "sha1", "sha256", "filename|md5", "filename|sha1", "filename|sha256"],
'output': ["domain", "ip-dst", "url", "comment", "md5", "sha1", "sha256"]
}
moduleinfo = {'version': '0.1', 'author': 'Davide Baglieri aka davidonzo',
'description': 'On demand query API for OSINT.digitalside.it project.',
'module-type': ['expansion', 'hover']}
moduleconfig = ['import_related_hashes', 'cache', 'cache_directory']
def handler(q=False):
if q is False:
return False
request = json.loads(q)
tosubmit = []
if request.get('domain'):
tosubmit.append(request['domain'])
elif request.get('domain|ip'):
tosubmit.append(request['domain|ip'].split('|')[0])
tosubmit.append(request['domain|ip'].split('|')[1])
elif request.get('hostname'):
tosubmit.append(request['hostname'])
elif request.get('ip-dst'):
tosubmit.append(request['ip-dst'])
elif request.get('ip-src'):
tosubmit.append(request['ip-src'])
elif request.get('ip-dst|port'):
tosubmit.append(request['ip-dst|port'].split('|')[0])
elif request.get('ip-src|port'):
tosubmit.append(request['ip-src|port'].split('|')[0])
elif request.get('url'):
tosubmit.append(request['url'])
elif request.get('md5'):
tosubmit.append(request['md5'])
elif request.get('sha1'):
tosubmit.append(request['sha1'])
elif request.get('sha256'):
tosubmit.append(request['sha256'])
elif request.get('filename|md5'):
tosubmit.append(request['filename|md5'].split('|')[1])
elif request.get('filename|sha1'):
tosubmit.append(request['filename|sha1'].split('|')[1])
elif request.get('filename|sha256'):
tosubmit.append(request['filename|sha256'].split('|')[1])
else:
return False
submitcache = False
submitcache_directory = False
import_related_hashes = False
r = {"results": []}
if request.get('config'):
if request['config'].get('cache') and request['config']['cache'].lower() == "yes":
submitcache = True
if request['config'].get('import_related_hashes') and request['config']['import_related_hashes'].lower() == "yes":
import_related_hashes = True
if submitcache:
cache_directory = request['config'].get('cache_directory')
if cache_directory and len(cache_directory) > 0:
if os.access(cache_directory, os.W_OK):
submitcache_directory = cache_directory
else:
ErrorMSG = "Cache directory is not writable. Please fix it before."
log.debug(str(ErrorMSG))
misperrors['error'] = ErrorMSG
return misperrors
else:
ErrorMSG = "Value for Plugin.Enrichment_apiosintds_cache_directory is empty but cache option is enabled as recommended. Please set a writable cache directory in plugin settings."
log.debug(str(ErrorMSG))
misperrors['error'] = ErrorMSG
return misperrors
else:
log.debug("Cache option is set to " + str(submitcache) + ". You are not using the internal cache system and this is NOT recommended!")
log.debug("Please, consider to turn on the cache setting it to 'Yes' and specifing a writable directory for the cache directory option.")
try:
response = apiosintDS.request(entities=tosubmit, cache=submitcache, cachedirectory=submitcache_directory, verbose=True)
r["results"] += reversed(apiosintParser(response, import_related_hashes))
except Exception as e:
log.debug(str(e))
misperrors['error'] = str(e)
return r
def apiosintParser(response, import_related_hashes):
ret = []
if isinstance(response, dict):
for key in response:
for item in response[key]["items"]:
if item["response"]:
comment = item["item"] + " IS listed by OSINT.digitalside.it. Date list: " + response[key]["list"]["date"]
if key == "url":
if "hashes" in item.keys():
if "sha256" in item["hashes"].keys():
ret.append({"types": ["sha256"], "values": [item["hashes"]["sha256"]]})
if "sha1" in item["hashes"].keys():
ret.append({"types": ["sha1"], "values": [item["hashes"]["sha1"]]})
if "md5" in item["hashes"].keys():
ret.append({"types": ["md5"], "values": [item["hashes"]["md5"]]})
if len(item["related_urls"]) > 0:
for urls in item["related_urls"]:
if isinstance(urls, dict):
itemToInclude = urls["url"]
if import_related_hashes:
if "hashes" in urls.keys():
if "sha256" in urls["hashes"].keys():
ret.append({"types": ["sha256"], "values": [urls["hashes"]["sha256"]], "comment": "Related to: " + itemToInclude})
if "sha1" in urls["hashes"].keys():
ret.append({"types": ["sha1"], "values": [urls["hashes"]["sha1"]], "comment": "Related to: " + itemToInclude})
if "md5" in urls["hashes"].keys():
ret.append({"types": ["md5"], "values": [urls["hashes"]["md5"]], "comment": "Related to: " + itemToInclude})
ret.append({"types": ["url"], "values": [itemToInclude], "comment": "Related to: " + item["item"]})
else:
ret.append({"types": ["url"], "values": [urls], "comment": "Related URL to: " + item["item"]})
else:
comment = item["item"] + " IS NOT listed by OSINT.digitalside.it. Date list: " + response[key]["list"]["date"]
ret.append({"types": ["text"], "values": [comment]})
return ret
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
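# A minimal sketch of the JSON query this handler expects; the attribute value
# and settings are illustrative, and running it performs a live query against
# OSINT.digitalside.it.
if __name__ == '__main__':
    example_query = json.dumps({
        'domain': 'example.com',
        'config': {'cache': 'Yes', 'cache_directory': '/tmp', 'import_related_hashes': 'Yes'}
    })
    print(handler(q=example_query))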

View File

@ -0,0 +1,95 @@
import json
import requests
from . import check_input_attribute, standard_error_message
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['domain', 'hostname'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer',
'description': 'On demand query API for APIVoid.',
'module-type': ['expansion', 'hover']}
moduleconfig = ['apikey']
class APIVoidParser():
def __init__(self, attribute):
self.misp_event = MISPEvent()
self.attribute = MISPAttribute()
self.attribute.from_dict(**attribute)
self.misp_event.add_attribute(**self.attribute)
self.url = 'https://endpoint.apivoid.com/{}/v1/pay-as-you-go/?key={}&'
def get_results(self):
if hasattr(self, 'result'):
return self.result
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def parse_domain(self, apikey):
feature = 'dnslookup'
if requests.get(f'{self.url.format(feature, apikey)}stats').json()['credits_remained'] < 0.13:
            self.result = {'error': 'You do not have enough APIVoid credits to process your request.'}
return
mapping = {'A': 'resolution-of', 'MX': 'mail-server-of', 'NS': 'server-name-of'}
dnslookup = requests.get(f'{self.url.format(feature, apikey)}action=dns-any&host={self.attribute.value}').json()
for item in dnslookup['data']['records']['items']:
record_type = item['type']
try:
relationship = mapping[record_type]
except KeyError:
continue
self._handle_dns_record(item, record_type, relationship)
ssl = requests.get(f'{self.url.format("sslinfo", apikey)}host={self.attribute.value}').json()
self._parse_ssl_certificate(ssl['data']['certificate'])
def _handle_dns_record(self, item, record_type, relationship):
dns_record = MISPObject('dns-record')
dns_record.add_attribute('queried-domain', type='domain', value=item['host'])
attribute_type, feature = ('ip-dst', 'ip') if record_type == 'A' else ('domain', 'target')
dns_record.add_attribute(f'{record_type.lower()}-record', type=attribute_type, value=item[feature])
dns_record.add_reference(self.attribute.uuid, relationship)
self.misp_event.add_object(**dns_record)
def _parse_ssl_certificate(self, certificate):
x509 = MISPObject('x509')
fingerprint = 'x509-fingerprint-sha1'
x509.add_attribute(fingerprint, type=fingerprint, value=certificate['fingerprint'])
x509_mapping = {'subject': {'name': ('text', 'subject')},
'issuer': {'common_name': ('text', 'issuer')},
'signature': {'serial': ('text', 'serial-number')},
'validity': {'valid_from': ('datetime', 'validity-not-before'),
'valid_to': ('datetime', 'validity-not-after')}}
certificate = certificate['details']
for feature, subfeatures in x509_mapping.items():
for subfeature, mapping in subfeatures.items():
attribute_type, relation = mapping
x509.add_attribute(relation, type=attribute_type, value=certificate[feature][subfeature])
x509.add_reference(self.attribute.uuid, 'seen-by')
self.misp_event.add_object(**x509)
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('config', {}).get('apikey'):
return {'error': 'An API key for APIVoid is required.'}
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if attribute['type'] not in mispattributes['input']:
return {'error': 'Unsupported attribute type.'}
apikey = request['config']['apikey']
apivoid_parser = APIVoidParser(attribute)
apivoid_parser.parse_domain(apikey)
return apivoid_parser.get_results()
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
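# A minimal sketch of a query in the misp_standard format this module expects;
# the attribute values and the API key are placeholders.
if __name__ == '__main__':
    example_query = json.dumps({
        'attribute': {
            'type': 'domain',
            'value': 'example.com',
            'uuid': '00000000-0000-0000-0000-000000000000'
        },
        'config': {'apikey': 'YOUR_APIVOID_KEY'}
    })
    print(handler(q=example_query))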

View File

@ -0,0 +1,169 @@
# -*- coding: utf-8 -*-
import json
from . import check_input_attribute, standard_error_message
from assemblyline_client import Client, ClientError
from collections import defaultdict
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['link'], 'format': 'misp_standard'}
moduleinfo = {'version': '1', 'author': 'Christian Studer',
'description': 'Query AssemblyLine with a report URL to get the parsed data.',
'module-type': ['expansion']}
moduleconfig = ["apiurl", "user_id", "apikey", "password"]
class AssemblyLineParser():
def __init__(self):
self.misp_event = MISPEvent()
self.results = {}
self.attribute = {'to_ids': True}
self._results_mapping = {'NET_DOMAIN_NAME': 'domain', 'NET_FULL_URI': 'url',
'NET_IP': 'ip-dst'}
self._file_mapping = {'entropy': {'type': 'float', 'object_relation': 'entropy'},
'md5': {'type': 'md5', 'object_relation': 'md5'},
'mime': {'type': 'mime-type', 'object_relation': 'mimetype'},
'sha1': {'type': 'sha1', 'object_relation': 'sha1'},
'sha256': {'type': 'sha256', 'object_relation': 'sha256'},
'size': {'type': 'size-in-bytes', 'object_relation': 'size-in-bytes'},
'ssdeep': {'type': 'ssdeep', 'object_relation': 'ssdeep'}}
def get_submission(self, attribute, client):
sid = attribute['value'].split('=')[-1]
try:
if not client.submission.is_completed(sid):
self.results['error'] = 'Submission not completed, please try again later.'
return
except Exception as e:
self.results['error'] = f'Something went wrong while trying to check if the submission in AssemblyLine is completed: {e.__str__()}'
return
try:
submission = client.submission.full(sid)
except Exception as e:
self.results['error'] = f"Something went wrong while getting the submission from AssemblyLine: {e.__str__()}"
return
self._parse_report(submission)
def finalize_results(self):
if 'error' in self.results:
return self.results
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object', 'Tag') if (key in event and event[key])}
return {'results': results}
def _create_attribute(self, result, attribute_type):
attribute = MISPAttribute()
attribute.from_dict(type=attribute_type, value=result['value'], **self.attribute)
if result['classification'] != 'UNCLASSIFIED':
attribute.add_tag(result['classification'].lower())
self.misp_event.add_attribute(**attribute)
return {'referenced_uuid': attribute.uuid, 'relationship_type': '-'.join(result['context'].lower().split(' '))}
def _create_file_object(self, file_info):
file_object = MISPObject('file')
filename_attribute = {'type': 'filename'}
filename_attribute.update(self.attribute)
if file_info['classification'] != "UNCLASSIFIED":
tag = {'Tag': [{'name': file_info['classification'].lower()}]}
filename_attribute.update(tag)
for feature, attribute in self._file_mapping.items():
attribute.update(tag)
file_object.add_attribute(value=file_info[feature], **attribute)
return filename_attribute, file_object
for feature, attribute in self._file_mapping.items():
file_object.add_attribute(value=file_info[feature], **attribute)
return filename_attribute, file_object
@staticmethod
def _get_results(submission_results):
results = defaultdict(list)
for k, values in submission_results.items():
h = k.split('.')[0]
for t in values['result']['tags']:
if t['context'] is not None:
results[h].append(t)
return results
def _get_scores(self, file_tree):
scores = {}
for h, f in file_tree.items():
score = f['score']
if score > 0:
scores[h] = {'name': f['name'], 'score': score}
if f['children']:
scores.update(self._get_scores(f['children']))
return scores
def _parse_report(self, submission):
if submission['classification'] != 'UNCLASSIFIED':
self.misp_event.add_tag(submission['classification'].lower())
filtered_results = self._get_results(submission['results'])
scores = self._get_scores(submission['file_tree'])
for h, results in filtered_results.items():
if h in scores:
attribute, file_object = self._create_file_object(submission['file_infos'][h])
for filename in scores[h]['name']:
file_object.add_attribute('filename', value=filename, **attribute)
for reference in self._parse_results(results):
file_object.add_reference(**reference)
self.misp_event.add_object(**file_object)
def _parse_results(self, results):
references = []
for result in results:
try:
attribute_type = self._results_mapping[result['type']]
except KeyError:
continue
references.append(self._create_attribute(result, attribute_type))
return references
def parse_config(apiurl, user_id, config):
error = {"error": "Please provide your AssemblyLine API key or Password."}
if config.get('apikey'):
try:
return Client(apiurl, apikey=(user_id, config['apikey']))
except ClientError as e:
error['error'] = f'Error while initiating a connection with AssemblyLine: {e.__str__()}'
if config.get('password'):
try:
return Client(apiurl, auth=(user_id, config['password']))
except ClientError as e:
error['error'] = f'Error while initiating a connection with AssemblyLine: {e.__str__()}'
return error
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
if request['attribute']['type'] not in mispattributes['input']:
return {'error': 'Unsupported attribute type.'}
if not request.get('config'):
return {"error": "Missing configuration."}
if not request['config'].get('apiurl'):
return {"error": "No AssemblyLine server address provided."}
apiurl = request['config']['apiurl']
if not request['config'].get('user_id'):
return {"error": "Please provide your AssemblyLine User ID."}
user_id = request['config']['user_id']
client = parse_config(apiurl, user_id, request['config'])
if isinstance(client, dict):
return client
assemblyline_parser = AssemblyLineParser()
assemblyline_parser.get_submission(request['attribute'], client)
return assemblyline_parser.finalize_results()
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
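# A minimal sketch of how the submission id is extracted from the report link
# this module takes as input; the URL below is a placeholder.
if __name__ == '__main__':
    example_link = 'https://assemblyline.example/submission_detail.html?sid=1a2b3c4d5e'
    print(example_link.split('=')[-1])  # -> 1a2b3c4d5e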

View File

@ -0,0 +1,89 @@
# -*- coding: utf-8 -*-
import json
from assemblyline_client import Client, ClientError
from urllib.parse import urljoin
moduleinfo = {"version": 1, "author": "Christian Studer", "module-type": ["expansion"],
"description": "Submit files or URLs to AssemblyLine"}
moduleconfig = ["apiurl", "user_id", "apikey", "password"]
mispattributes = {"input": ["attachment", "malware-sample", "url"],
"output": ["link"]}
def parse_config(apiurl, user_id, config):
error = {"error": "Please provide your AssemblyLine API key or Password."}
if config.get('apikey'):
try:
return Client(apiurl, apikey=(user_id, config['apikey']))
except ClientError as e:
error['error'] = f'Error while initiating a connection with AssemblyLine: {e.__str__()}'
if config.get('password'):
try:
return Client(apiurl, auth=(user_id, config['password']))
except ClientError as e:
error['error'] = f'Error while initiating a connection with AssemblyLine: {e.__str__()}'
return error
def submit_content(client, filename, data):
try:
return client.submit(fname=filename, contents=data.encode())
except Exception as e:
return {'error': f'Error while submitting content to AssemblyLine: {e.__str__()}'}
def submit_request(client, request):
if 'attachment' in request:
return submit_content(client, request['attachment'], request['data'])
if 'malware-sample' in request:
return submit_content(client, request['malware-sample'].split('|')[0], request['data'])
for feature in ('url', 'domain'):
if feature in request:
return submit_url(client, request[feature])
return {"error": "No valid attribute type for this module has been provided."}
def submit_url(client, url):
try:
return client.submit(url=url)
except Exception as e:
return {'error': f'Error while submitting url to AssemblyLine: {e.__str__()}'}
def handler(q=False):
if q is False:
return q
request = json.loads(q)
if not request.get('config'):
return {"error": "Missing configuration."}
if not request['config'].get('apiurl'):
return {"error": "No AssemblyLine server address provided."}
apiurl = request['config']['apiurl']
if not request['config'].get('user_id'):
return {"error": "Please provide your AssemblyLine User ID."}
user_id = request['config']['user_id']
client = parse_config(apiurl, user_id, request['config'])
if isinstance(client, dict):
return client
submission = submit_request(client, request)
if 'error' in submission:
return submission
sid = submission['submission']['sid']
return {
"results": [{
"types": "link",
"categories": "External analysis",
"values": urljoin(apiurl, f'submission_detail.html?sid={sid}')
}]
}
def introspection():
return mispattributes
def version():
moduleinfo["config"] = moduleconfig
return moduleinfo
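# A minimal sketch of the link value returned once a submission succeeds; the
# server address and submission id are placeholders.
if __name__ == '__main__':
    example_sid = '1a2b3c4d5e'
    print(urljoin('https://assemblyline.example', f'submission_detail.html?sid={example_sid}'))
    # -> https://assemblyline.example/submission_detail.html?sid=1a2b3c4d5e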

View File

@ -1,13 +1,15 @@
# -*- coding: utf-8 -*-
import json
from datetime import date, timedelta
from . import check_input_attribute, standard_error_message
from datetime import date, datetime, timedelta
from pybgpranking import BGPRanking
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['AS'], 'output': ['freetext']}
mispattributes = {'input': ['AS'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.1', 'author': 'Raphaël Vinot',
'description': 'Query an ASN Description history service (https://github.com/CIRCL/ASN-Description-History.git)',
'description': 'Query BGP Ranking to get the ranking of an Autonomous System number.',
'module-type': ['expansion', 'hover']}
@ -15,19 +17,65 @@ def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('AS'):
toquery = request['AS']
else:
misperrors['error'] = "Unsupported attributes type"
return misperrors
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
toquery = request['attribute']
if toquery['type'] not in mispattributes['input']:
return {'error': 'Unsupported attribute type.'}
bgpranking = BGPRanking()
values = bgpranking.query(toquery, date=(date.today() - timedelta(1)).isoformat())
value_toquery = int(toquery['value'][2:]) if toquery['value'].startswith('AS') else int(toquery['value'])
values = bgpranking.query(value_toquery, date=(date.today() - timedelta(1)).isoformat())
if not values:
misperrors['error'] = 'Unable to find the ASN in BGP Ranking'
if not values['response'] or not values['response']['asn_description']:
misperrors['error'] = 'There is no result about this ASN in BGP Ranking'
return misperrors
return {'results': [{'types': mispattributes['output'], 'values': values}]}
event = MISPEvent()
attribute = MISPAttribute()
attribute.from_dict(**toquery)
event.add_attribute(**attribute)
asn_object = MISPObject('asn')
asn_object.add_attribute(**{
'type': 'AS',
'object_relation': 'asn',
'value': values['meta']['asn']
})
description, country = values['response']['asn_description'].split(', ')
for relation, value in zip(('description', 'country'), (description, country)):
asn_object.add_attribute(**{
'type': 'text',
'object_relation': relation,
'value': value
})
mapping = {
'address_family': {'type': 'text', 'object_relation': 'address-family'},
'date': {'type': 'datetime', 'object_relation': 'date'},
'position': {'type': 'float', 'object_relation': 'position'},
'rank': {'type': 'float', 'object_relation': 'ranking'}
}
bgp_object = MISPObject('bgp-ranking')
for feature in ('rank', 'position'):
bgp_attribute = {'value': values['response']['ranking'][feature]}
bgp_attribute.update(mapping[feature])
bgp_object.add_attribute(**bgp_attribute)
date_attribute = {'value': datetime.strptime(values['meta']['date'], '%Y-%m-%d')}
date_attribute.update(mapping['date'])
bgp_object.add_attribute(**date_attribute)
address_attribute = {'value': values['meta']['address_family']}
address_attribute.update(mapping['address_family'])
bgp_object.add_attribute(**address_attribute)
asn_object.add_reference(attribute.uuid, 'describes')
asn_object.add_reference(bgp_object.uuid, 'ranked-with')
event.add_object(asn_object)
event.add_object(bgp_object)
event = json.loads(event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def introspection():

View File

@ -0,0 +1,256 @@
# encoding: utf-8
import json
import base64
import codecs
from dateutil.parser import isoparse
from . import check_input_attribute, standard_error_message
from pymisp import MISPAttribute, MISPEvent, MISPObject
try:
import censys.base
import censys.ipv4
import censys.websites
import censys.certificates
except ImportError:
print("Censys module not installed. Try 'pip install censys'")
misperrors = {'error': 'Error'}
moduleconfig = ['api_id', 'api_secret']
mispattributes = {'input': ['ip-src', 'ip-dst', 'domain', 'hostname', 'hostname|port', 'domain|ip', 'ip-dst|port', 'ip-src|port',
'x509-fingerprint-md5', 'x509-fingerprint-sha1', 'x509-fingerprint-sha256'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.1', 'author': 'Loïc Fortemps',
'description': 'Censys.io expansion module', 'module-type': ['expansion', 'hover']}
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('config'):
if (request['config'].get('api_id') is None) or (request['config'].get('api_secret') is None):
misperrors['error'] = "Censys API credentials are missing"
return misperrors
else:
misperrors['error'] = "Please provide config options"
return misperrors
api_id = request['config']['api_id']
api_secret = request['config']['api_secret']
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if not any(input_type == attribute['type'] for input_type in mispattributes['input']):
return {'error': 'Unsupported attribute type.'}
attribute = MISPAttribute()
attribute.from_dict(**request['attribute'])
    # Lists to accommodate multi-type attributes
conn = list()
types = list()
values = list()
results = list()
if "|" in attribute.type:
t_1, t_2 = attribute.type.split('|')
v_1, v_2 = attribute.value.split('|')
# We cannot use the port information
if t_2 == "port":
types.append(t_1)
values.append(v_1)
else:
types = [t_1, t_2]
values = [v_1, v_2]
else:
types.append(attribute.type)
values.append(attribute.value)
for t in types:
# ip, ip-src or ip-dst
if t[:2] == "ip":
conn.append(censys.ipv4.CensysIPv4(api_id=api_id, api_secret=api_secret))
elif t == 'domain' or t == "hostname":
conn.append(censys.websites.CensysWebsites(api_id=api_id, api_secret=api_secret))
elif 'x509-fingerprint' in t:
conn.append(censys.certificates.CensysCertificates(api_id=api_id, api_secret=api_secret))
found = True
for c in conn:
val = values.pop(0)
try:
r = c.view(val)
results.append(parse_response(r, attribute))
found = True
except censys.base.CensysNotFoundException:
found = False
except Exception:
misperrors['error'] = "Connection issue"
return misperrors
if not found:
misperrors['error'] = "Nothing could be found on Censys"
return misperrors
return {'results': remove_duplicates(results)}
def parse_response(censys_output, attribute):
misp_event = MISPEvent()
misp_event.add_attribute(**attribute)
# Generic fields (for IP/Websites)
if "autonomous_system" in censys_output:
cen_as = censys_output['autonomous_system']
asn_object = MISPObject('asn')
asn_object.add_attribute('asn', value=cen_as["asn"])
asn_object.add_attribute('description', value=cen_as['name'])
asn_object.add_attribute('subnet-announced', value=cen_as['routed_prefix'])
asn_object.add_attribute('country', value=cen_as['country_code'])
asn_object.add_reference(attribute.uuid, 'associated-to')
misp_event.add_object(**asn_object)
if "ip" in censys_output and "ports" in censys_output:
ip_object = MISPObject('ip-port')
ip_object.add_attribute('ip', value=censys_output['ip'])
for p in censys_output['ports']:
ip_object.add_attribute('dst-port', value=p)
ip_object.add_reference(attribute.uuid, 'associated-to')
misp_event.add_object(**ip_object)
# We explore all ports to find https or ssh services
for k in censys_output.keys():
if not isinstance(censys_output[k], dict):
continue
if 'https' in censys_output[k]:
try:
cert = censys_output[k]['https']['tls']['certificate']
cert_obj = get_certificate_object(cert, attribute)
misp_event.add_object(**cert_obj)
except KeyError:
print("Error !")
if 'ssh' in censys_output[k]:
try:
cert = censys_output[k]['ssh']['v2']['server_host_key']
# TODO enable once the type is merged
# misp_event.add_attribute(type='hasshserver-sha256', value=cert['fingerprint_sha256'])
except KeyError:
pass
# Info from certificate query
if "parsed" in censys_output:
cert_obj = get_certificate_object(censys_output, attribute)
misp_event.add_object(**cert_obj)
# Location can be present for IP/Websites results
if "location" in censys_output:
loc_obj = MISPObject('geolocation')
loc = censys_output['location']
loc_obj.add_attribute('latitude', value=loc['latitude'])
loc_obj.add_attribute('longitude', value=loc['longitude'])
if 'city' in loc:
loc_obj.add_attribute('city', value=loc['city'])
loc_obj.add_attribute('country', value=loc['country'])
if 'postal_code' in loc:
loc_obj.add_attribute('zipcode', value=loc['postal_code'])
if 'province' in loc:
loc_obj.add_attribute('region', value=loc['province'])
loc_obj.add_reference(attribute.uuid, 'associated-to')
misp_event.add_object(**loc_obj)
event = json.loads(misp_event.to_json())
return {'Object': event['Object'], 'Attribute': event['Attribute']}
# In case of multiple enrichment (ip and domain), we need to filter out similar objects
# TODO: make it more granular
def remove_duplicates(results):
# Only one enrichment was performed so no duplicate
if len(results) == 1:
return results[0]
elif len(results) == 2:
final_result = results[0]
obj_l2 = results[1]['Object']
for o2 in obj_l2:
if o2['name'] == "asn":
key = "asn"
elif o2['name'] == "ip-port":
key = "ip"
elif o2['name'] == "x509":
key = "x509-fingerprint-sha256"
elif o2['name'] == "geolocation":
key = "latitude"
if not check_if_present(o2, key, final_result['Object']):
final_result['Object'].append(o2)
return final_result
else:
return []
def check_if_present(object, attribute_name, list_objects):
"""
Assert if a given object is present in the list.
    This function checks whether the given object (in JSON format) is present
    in list_objects, using attribute_name for the matching.
"""
for o in list_objects:
# We first look for a match on the name
if o['name'] == object['name']:
for attr in object['Attribute']:
# Within the attributes, we look for the one to compare
if attr['type'] == attribute_name:
# Then we check the attributes of the other object and look for a match
for attr2 in o['Attribute']:
if attr2['type'] == attribute_name and attr2['value'] == attr['value']:
return True
return False
def get_certificate_object(cert, attribute):
parsed = cert['parsed']
cert_object = MISPObject('x509')
cert_object.add_attribute('x509-fingerprint-sha256', value=parsed['fingerprint_sha256'])
cert_object.add_attribute('x509-fingerprint-sha1', value=parsed['fingerprint_sha1'])
cert_object.add_attribute('x509-fingerprint-md5', value=parsed['fingerprint_md5'])
cert_object.add_attribute('serial-number', value=parsed['serial_number'])
cert_object.add_attribute('version', value=parsed['version'])
cert_object.add_attribute('subject', value=parsed['subject_dn'])
cert_object.add_attribute('issuer', value=parsed['issuer_dn'])
cert_object.add_attribute('validity-not-before', value=isoparse(parsed['validity']['start']))
cert_object.add_attribute('validity-not-after', value=isoparse(parsed['validity']['end']))
cert_object.add_attribute('self_signed', value=parsed['signature']['self_signed'])
cert_object.add_attribute('signature_algorithm', value=parsed['signature']['signature_algorithm']['name'])
cert_object.add_attribute('pubkey-info-algorithm', value=parsed['subject_key_info']['key_algorithm']['name'])
if 'rsa_public_key' in parsed['subject_key_info']:
pub_key = parsed['subject_key_info']['rsa_public_key']
cert_object.add_attribute('pubkey-info-size', value=pub_key['length'])
cert_object.add_attribute('pubkey-info-exponent', value=pub_key['exponent'])
hex_mod = codecs.encode(base64.b64decode(pub_key['modulus']), 'hex').decode()
cert_object.add_attribute('pubkey-info-modulus', value=hex_mod)
if "extensions" in parsed and "subject_alt_name" in parsed["extensions"]:
san = parsed["extensions"]["subject_alt_name"]
if "dns_names" in san:
for dns in san['dns_names']:
cert_object.add_attribute('dns_names', value=dns)
if "ip_addresses" in san:
for ip in san['ip_addresses']:
cert_object.add_attribute('ip', value=ip)
if "raw" in cert:
cert_object.add_attribute('raw-base64', value=cert['raw'])
cert_object.add_reference(attribute.uuid, 'associated-to')
return cert_object
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
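# A minimal sketch of how a composite attribute is split into the two lookups
# performed above; the values are illustrative.
if __name__ == '__main__':
    example_type, example_value = 'domain|ip', 'example.com|198.51.100.7'
    print(list(zip(example_type.split('|'), example_value.split('|'))))
    # -> [('domain', 'example.com'), ('ip', '198.51.100.7')]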

View File

@ -1,41 +1,72 @@
import json
import pypdns
from . import check_input_attribute, standard_error_message
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['hostname', 'domain', 'ip-src', 'ip-dst'], 'output': ['freetext']}
moduleinfo = {'version': '0.1', 'author': 'Alexandre Dulaunoy', 'description': 'Module to access CIRCL Passive DNS', 'module-type': ['expansion', 'hover']}
mispattributes = {'input': ['hostname', 'domain', 'ip-src', 'ip-dst', 'ip-src|port', 'ip-dst|port'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.2', 'author': 'Alexandre Dulaunoy',
'description': 'Module to access CIRCL Passive DNS',
'module-type': ['expansion', 'hover']}
moduleconfig = ['username', 'password']
class PassiveDNSParser():
def __init__(self, attribute, authentication):
self.misp_event = MISPEvent()
self.attribute = MISPAttribute()
self.attribute.from_dict(**attribute)
self.misp_event.add_attribute(**self.attribute)
self.pdns = pypdns.PyPDNS(basic_auth=authentication)
def get_results(self):
if hasattr(self, 'result'):
return self.result
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def parse(self):
value = self.attribute.value.split('|')[0] if '|' in self.attribute.type else self.attribute.value
try:
results = self.pdns.query(value)
except Exception:
self.result = {'error': 'There is an authentication error, please make sure you supply correct credentials.'}
return
if not results:
self.result = {'error': 'Not found'}
return
mapping = {'count': 'counter', 'origin': 'text',
'time_first': 'datetime', 'rrtype': 'text',
'rrname': 'text', 'rdata': 'text',
'time_last': 'datetime'}
for result in results:
pdns_object = MISPObject('passive-dns')
for relation, attribute_type in mapping.items():
pdns_object.add_attribute(relation, type=attribute_type, value=result[relation])
pdns_object.add_reference(self.attribute.uuid, 'associated-to')
self.misp_event.add_object(**pdns_object)
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('hostname'):
toquery = request['hostname']
elif request.get('domain'):
toquery = request['domain']
elif request.get('ip-src'):
toquery = request['ip-src']
elif request.get('ip-dst'):
toquery = request['ip-dst']
else:
misperrors['error'] = "Unsupported attributes type"
return misperrors
if (request.get('config')):
if (request['config'].get('username') is None) or (request['config'].get('password') is None):
misperrors['error'] = 'CIRCL Passive DNS authentication is missing'
return misperrors
x = pypdns.PyPDNS(basic_auth=(request['config']['username'], request['config']['password']))
res = x.query(toquery)
out = ''
for v in res:
out = out + "{} ".format(v['rdata'])
r = {'results': [{'types': mispattributes['output'], 'values': out}]}
return r
if not request.get('config'):
return {'error': 'CIRCL Passive DNS authentication is missing.'}
if not request['config'].get('username') or not request['config'].get('password'):
return {'error': 'CIRCL Passive DNS authentication is incomplete, please provide your username and password.'}
authentication = (request['config']['username'], request['config']['password'])
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if not any(input_type == attribute['type'] for input_type in mispattributes['input']):
return {'error': 'Unsupported attribute type.'}
pdns_parser = PassiveDNSParser(attribute, authentication)
pdns_parser.parse()
return pdns_parser.get_results()
def introspection():

View File

@ -1,35 +1,97 @@
import json
import pypssl
from . import check_input_attribute, standard_error_message
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-src', 'ip-dst'], 'output': ['freetext']}
moduleinfo = {'version': '0.1', 'author': 'Raphaël Vinot', 'description': 'Module to access CIRCL Passive SSL', 'module-type': ['expansion', 'hover']}
mispattributes = {'input': ['ip-src', 'ip-dst', 'ip-src|port', 'ip-dst|port'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.2', 'author': 'Raphaël Vinot',
'description': 'Module to access CIRCL Passive SSL',
'module-type': ['expansion', 'hover']}
moduleconfig = ['username', 'password']
class PassiveSSLParser():
def __init__(self, attribute, authentication):
self.misp_event = MISPEvent()
self.attribute = MISPAttribute()
self.attribute.from_dict(**attribute)
self.misp_event.add_attribute(**self.attribute)
self.pssl = pypssl.PyPSSL(basic_auth=authentication)
self.cert_hash = 'x509-fingerprint-sha1'
self.cert_type = 'pem'
self.mapping = {'issuer': ('text', 'issuer'),
'keylength': ('text', 'pubkey-info-size'),
'not_after': ('datetime', 'validity-not-after'),
'not_before': ('datetime', 'validity-not-before'),
'subject': ('text', 'subject')}
def get_results(self):
if hasattr(self, 'result'):
return self.result
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def parse(self):
value = self.attribute.value.split('|')[0] if '|' in self.attribute.type else self.attribute.value
try:
results = self.pssl.query(value)
except Exception:
self.result = {'error': 'There is an authentication error, please make sure you supply correct credentials.'}
return
if not results:
self.result = {'error': 'Not found'}
return
if 'error' in results:
self.result = {'error': results['error']}
return
for ip_address, certificates in results.items():
ip_uuid = self._handle_ip_attribute(ip_address)
for certificate in certificates['certificates']:
self._handle_certificate(certificate, ip_uuid)
def _handle_certificate(self, certificate, ip_uuid):
x509 = MISPObject('x509')
x509.add_attribute(self.cert_hash, type=self.cert_hash, value=certificate)
cert_details = self.pssl.fetch_cert(certificate)
info = cert_details['info']
for feature, mapping in self.mapping.items():
attribute_type, object_relation = mapping
x509.add_attribute(object_relation, type=attribute_type, value=info[feature])
x509.add_attribute(self.cert_type, type='text', value=self.cert_type)
x509.add_reference(ip_uuid, 'seen-by')
self.misp_event.add_object(**x509)
def _handle_ip_attribute(self, ip_address):
if ip_address == self.attribute.value:
return self.attribute.uuid
ip_attribute = MISPAttribute()
ip_attribute.from_dict(**{'type': self.attribute.type, 'value': ip_address})
self.misp_event.add_attribute(**ip_attribute)
return ip_attribute.uuid
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('ip-src'):
toquery = request['ip-src']
elif request.get('ip-dst'):
toquery = request['ip-dst']
else:
misperrors['error'] = "Unsupported attributes type"
return misperrors
if request.get('config'):
if (request['config'].get('username') is None) or (request['config'].get('password') is None):
misperrors['error'] = 'CIRCL Passive SSL authentication is missing'
return misperrors
x = pypssl.PyPSSL(basic_auth=(request['config']['username'], request['config']['password']))
res = x.query(toquery)
out = res.get(toquery)
r = {'results': [{'types': mispattributes['output'], 'values': out}]}
return r
if not request.get('config'):
return {'error': 'CIRCL Passive SSL authentication is missing.'}
if not request['config'].get('username') or not request['config'].get('password'):
return {'error': 'CIRCL Passive SSL authentication is incomplete, please provide your username and password.'}
authentication = (request['config']['username'], request['config']['password'])
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if not any(input_type == attribute['type'] for input_type in mispattributes['input']):
return {'error': 'Unsupported attribute type.'}
pssl_parser = PassiveSSLParser(attribute, authentication)
pssl_parser.parse()
return pssl_parser.get_results()
def introspection():

View File

@ -0,0 +1,128 @@
import base64
import io
import json
import logging
import sys
import zipfile
import clamd
from . import check_input_attribute, standard_error_message
from typing import Optional
from pymisp import MISPEvent, MISPObject
log = logging.getLogger("clamav")
log.setLevel(logging.DEBUG)
sh = logging.StreamHandler(sys.stdout)
sh.setLevel(logging.DEBUG)
fmt = logging.Formatter(
"%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
sh.setFormatter(fmt)
log.addHandler(sh)
moduleinfo = {
"version": "0.1",
"author": "Jakub Onderka",
"description": "Submit file to ClamAV",
"module-type": ["expansion"]
}
moduleconfig = ["connection"]
mispattributes = {
"input": ["attachment", "malware-sample"],
"format": "misp_standard"
}
def create_response(original_attribute: dict, software: str, signature: Optional[str] = None) -> dict:
misp_event = MISPEvent()
if signature:
misp_event.add_attribute(**original_attribute)
av_signature_object = MISPObject("av-signature")
av_signature_object.add_attribute("signature", signature)
av_signature_object.add_attribute("software", software)
av_signature_object.add_reference(original_attribute["uuid"], "belongs-to")
misp_event.add_object(av_signature_object)
event = json.loads(misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object') if (key in event and event[key])}
return {"results": results}
def connect_to_clamav(connection_string: str) -> clamd.ClamdNetworkSocket:
if connection_string.startswith("unix://"):
return clamd.ClamdUnixSocket(connection_string.replace("unix://", ""))
elif ":" in connection_string:
host, port = connection_string.split(":")
return clamd.ClamdNetworkSocket(host, int(port))
else:
raise Exception("ClamAV connection string is invalid. It must be unix socket path with 'unix://' prefix or IP:PORT.")
def handler(q=False):
if q is False:
return False
request = json.loads(q)
connection_string: str = request["config"].get("connection")
if not connection_string:
return {"error": "No ClamAV connection string provided"}
attribute = request.get("attribute")
if not attribute:
return {"error": "No attribute provided"}
if not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
if attribute["type"] not in mispattributes["input"]:
return {"error": "Invalid attribute type provided, expected 'malware-sample' or 'attachment'"}
attribute_data = attribute.get("data")
if not attribute_data:
return {"error": "No attribute data provided"}
try:
clamav = connect_to_clamav(connection_string)
software_version = clamav.version()
except Exception:
logging.exception("Could not connect to ClamAV")
return {"error": "Could not connect to ClamAV"}
try:
data = base64.b64decode(attribute_data, validate=True)
except Exception:
logging.exception("Provided data is not valid base64 encoded string")
return {"error": "Provided data is not valid base64 encoded string"}
if attribute["type"] == "malware-sample":
try:
with zipfile.ZipFile(io.BytesIO(data)) as zipf:
data = zipf.read(zipf.namelist()[0], pwd=b"infected")
except Exception:
logging.exception("Could not extract malware sample from ZIP file")
return {"error": "Could not extract malware sample from ZIP file"}
try:
status, reason = clamav.instream(io.BytesIO(data))["stream"]
except Exception:
logging.exception("Could not send attribute data to ClamAV. Maybe file is too big?")
return {"error": "Could not send attribute data to ClamAV. Maybe file is too big?"}
if status == "ERROR":
return {"error": "ClamAV returned error message: {}".format(reason)}
elif status == "OK":
return {"results": {}}
elif status == "FOUND":
return create_response(attribute, software_version, reason)
else:
return {"error": "ClamAV returned invalid status {}: {}".format(status, reason)}
def introspection():
return mispattributes
def version():
moduleinfo["config"] = moduleconfig
return moduleinfo
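# A minimal sketch of the two "connection" setting formats connect_to_clamav()
# accepts; the socket path and host below are placeholders, and the clients are
# only configured here, not yet connected.
if __name__ == "__main__":
    unix_client = connect_to_clamav("unix:///var/run/clamav/clamd.ctl")
    tcp_client = connect_to_clamav("127.0.0.1:3310")
    print(type(unix_client).__name__, type(tcp_client).__name__)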

View File

@ -20,13 +20,22 @@ common_tlds = {"com": "Commercial (Worldwide)",
"gov": "Government (USA)"
}
codes = False
def parse_country_code(extension):
# Retrieve a json full of country info
try:
codes = requests.get("http://www.geognos.com/api/en/countries/info/all.json").json()
except Exception:
return "http://www.geognos.com/api/en/countries/info/all.json not reachable"
if not codes.get('StatusMsg') or not codes["StatusMsg"] == "OK":
return 'Not able to get the countrycode references from http://www.geognos.com/api/en/countries/info/all.json'
for country in codes['Results'].values():
if country['CountryCodes']['tld'] == extension:
return country['Name']
return "Unknown"
def handler(q=False):
global codes
if not codes:
codes = requests.get("http://www.geognos.com/api/en/countries/info/all.json").json()
if q is False:
return False
request = json.loads(q)
@ -36,18 +45,7 @@ def handler(q=False):
ext = domain.split(".")[-1]
# Check if it's a common, non country one
if ext in common_tlds.keys():
val = common_tlds[ext]
else:
# Retrieve a json full of country info
if not codes["StatusMsg"] == "OK":
val = "Unknown"
else:
# Find our code based on TLD
codes = codes["Results"]
for code in codes.keys():
if codes[code]["CountryCodes"]["tld"] == ext:
val = codes[code]["Name"]
val = common_tlds[ext] if ext in common_tlds.keys() else parse_country_code(ext)
r = {'results': [{'types': ['text'], 'values':[val]}]}
return r
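# A minimal sketch of how a TLD is resolved: common TLDs are mapped locally,
# anything else goes through parse_country_code(), which needs the geognos
# service to be reachable; the sample outputs are indicative only.
if __name__ == '__main__':
    print(common_tlds['com'])        # Commercial (Worldwide)
    print(parse_country_code('lu'))  # e.g. 'Luxembourg' when the lookup succeeds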

View File

@ -0,0 +1,125 @@
import json
import requests
from . import check_input_attribute, standard_error_message
from pymisp import MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['cpe'], 'format': 'misp_standard'}
moduleinfo = {
'version': '1',
'author': 'Christian Studer',
'description': 'An expansion module to enrich a CPE attribute with its related vulnerabilities.',
'module-type': ['expansion', 'hover']
}
moduleconfig = ["custom_API_URL", "limit"]
cveapi_url = 'https://cvepremium.circl.lu/api/cvefor/'
class VulnerabilitiesParser():
def __init__(self, attribute, api_url):
self.attribute = attribute
self.api_url = api_url
self.misp_event = MISPEvent()
self.misp_event.add_attribute(**attribute)
self.vulnerability_mapping = {
'id': {
'type': 'vulnerability',
'object_relation': 'id'
},
'summary': {
'type': 'text',
'object_relation': 'summary'
},
'vulnerable_configuration': {
'type': 'cpe',
'object_relation': 'vulnerable_configuration'
},
'vulnerable_configuration_cpe_2_2': {
'type': 'cpe',
'object_relation': 'vulnerable_configuration'
},
'Modified': {
'type': 'datetime',
'object_relation': 'modified'
},
'Published': {
'type': 'datetime',
'object_relation': 'published'
},
'references': {
'type': 'link',
'object_relation': 'references'
},
'cvss': {
'type': 'float',
'object_relation': 'cvss-score'
}
}
def parse_vulnerabilities(self, vulnerabilities):
for vulnerability in vulnerabilities:
vulnerability_object = MISPObject('vulnerability')
for feature in ('id', 'summary', 'Modified', 'Published', 'cvss'):
if vulnerability.get(feature):
attribute = {'value': vulnerability[feature]}
attribute.update(self.vulnerability_mapping[feature])
vulnerability_object.add_attribute(**attribute)
if vulnerability.get('Published'):
vulnerability_object.add_attribute(**{
'type': 'text',
'object_relation': 'state',
'value': 'Published'
})
for feature in ('references', 'vulnerable_configuration', 'vulnerable_configuration_cpe_2_2'):
if vulnerability.get(feature):
for value in vulnerability[feature]:
if isinstance(value, dict):
value = value['title']
attribute = {'value': value}
attribute.update(self.vulnerability_mapping[feature])
vulnerability_object.add_attribute(**attribute)
vulnerability_object.add_reference(self.attribute['uuid'], 'related-to')
self.misp_event.add_object(vulnerability_object)
def get_result(self):
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def check_url(url):
return url if url.endswith('/') else f"{url}/"
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if attribute.get('type') != 'cpe':
return {'error': 'Wrong input attribute type.'}
    api_url = check_url(request['config']['custom_API_URL']) if request.get('config') and request['config'].get('custom_API_URL') else cveapi_url
    url = f"{api_url}{attribute['value']}"
    if request.get('config') and request['config'].get('limit'):
        url = f"{url}/{request['config']['limit']}"
response = requests.get(url)
if response.status_code == 200:
vulnerabilities = response.json()
if not vulnerabilities:
return {'error': 'No related vulnerability for this CPE.'}
else:
return {'error': 'API not accessible.'}
parser = VulnerabilitiesParser(attribute, api_url)
parser.parse_vulnerabilities(vulnerabilities)
return parser.get_result()
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
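# A minimal sketch of the URL the handler builds before querying the CVE Search
# API; the CPE value and the limit below are illustrative.
if __name__ == '__main__':
    example_cpe = 'cpe:2.3:a:microsoft:word:2002:sp3:*:*:*:*:*:*'
    print(f"{cveapi_url}{example_cpe}/10")
    # -> https://cvepremium.circl.lu/api/cvefor/cpe:2.3:a:microsoft:word:2002:sp3:*:*:*:*:*:*/10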

View File

@ -20,7 +20,7 @@ def handler(q=False):
misperrors['error'] = 'Vulnerability id missing'
return misperrors
api_url = check_url(request['config']['custom_API']) if request['config'].get('custom_API') else cveapi_url
api_url = check_url(request['config']['custom_API']) if request.get('config') and request['config'].get('custom_API') else cveapi_url
r = requests.get("{}{}".format(api_url, request.get('vulnerability')))
if r.status_code == 200:
vulnerability = json.loads(r.text)

View File

@ -1,7 +1,8 @@
from collections import defaultdict
from pymisp import MISPEvent, MISPObject
import json
import requests
from . import check_input_attribute, standard_error_message
from collections import defaultdict
from pymisp import MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['vulnerability'], 'format': 'misp_standard'}
@ -22,8 +23,9 @@ class VulnerabilityParser():
self.references = defaultdict(list)
self.capec_features = ('id', 'name', 'summary', 'prerequisites', 'solutions')
self.vulnerability_mapping = {
'id': ('text', 'id'), 'summary': ('text', 'summary'),
'vulnerable_configuration_cpe_2_2': ('text', 'vulnerable_configuration'),
'id': ('vulnerability', 'id'), 'summary': ('text', 'summary'),
'vulnerable_configuration': ('cpe', 'vulnerable_configuration'),
'vulnerable_configuration_cpe_2_2': ('cpe', 'vulnerable_configuration'),
'Modified': ('datetime', 'modified'), 'Published': ('datetime', 'published'),
'references': ('link', 'references'), 'cvss': ('float', 'cvss-score')}
self.weakness_mapping = {'name': 'name', 'description_summary': 'description',
@ -46,14 +48,16 @@ class VulnerabilityParser():
if 'Published' in self.vulnerability:
vulnerability_object.add_attribute('published', **{'type': 'datetime', 'value': self.vulnerability['Published']})
vulnerability_object.add_attribute('state', **{'type': 'text', 'value': 'Published'})
for feature in ('references', 'vulnerable_configuration_cpe_2_2'):
for feature in ('references', 'vulnerable_configuration', 'vulnerable_configuration_cpe_2_2'):
if feature in self.vulnerability:
attribute_type, relation = self.vulnerability_mapping[feature]
for value in self.vulnerability[feature]:
if isinstance(value, dict):
value = value['title']
vulnerability_object.add_attribute(relation, **{'type': attribute_type, 'value': value})
vulnerability_object.add_reference(self.attribute['uuid'], 'related-to')
self.misp_event.add_object(**vulnerability_object)
if 'cwe' in self.vulnerability and self.vulnerability['cwe'] != 'Unknown':
self.misp_event.add_object(vulnerability_object)
if 'cwe' in self.vulnerability and self.vulnerability['cwe'] not in ('Unknown', 'NVD-CWE-noinfo'):
self.__parse_weakness(vulnerability_object.uuid)
if 'capec' in self.vulnerability:
self.__parse_capec(vulnerability_object.uuid)
@ -67,33 +71,39 @@ class VulnerabilityParser():
break
def __parse_capec(self, vulnerability_uuid):
attribute_type = 'text'
for capec in self.vulnerability['capec']:
capec_object = MISPObject('attack-pattern')
for feature in self.capec_features:
capec_object.add_attribute(feature, **dict(type=attribute_type, value=capec[feature]))
capec_object.add_attribute(feature, **{'type': 'text', 'value': capec[feature]})
for related_weakness in capec['related_weakness']:
attribute = dict(type='weakness', value="CWE-{}".format(related_weakness))
attribute = {'type': 'weakness', 'value': f"CWE-{related_weakness}"}
capec_object.add_attribute('related-weakness', **attribute)
self.misp_event.add_object(**capec_object)
self.references[vulnerability_uuid].append(dict(referenced_uuid=capec_object.uuid,
relationship_type='targeted-by'))
self.misp_event.add_object(capec_object)
self.references[vulnerability_uuid].append(
{
'referenced_uuid': capec_object.uuid,
'relationship_type': 'targeted-by'
}
)
def __parse_weakness(self, vulnerability_uuid):
attribute_type = 'text'
cwe_string, cwe_id = self.vulnerability['cwe'].split('-')
cwes = requests.get(self.api_url.replace('/cve/', '/cwe'))
if cwes.status_code == 200:
for cwe in cwes.json():
if cwe['id'] == cwe_id:
weakness_object = MISPObject('weakness')
weakness_object.add_attribute('id', **dict(type=attribute_type, value='-'.join([cwe_string, cwe_id])))
weakness_object.add_attribute('id', {'type': 'weakness', 'value': f'{cwe_string}-{cwe_id}'})
for feature, relation in self.weakness_mapping.items():
if cwe.get(feature):
weakness_object.add_attribute(relation, **dict(type=attribute_type, value=cwe[feature]))
self.misp_event.add_object(**weakness_object)
self.references[vulnerability_uuid].append(dict(referenced_uuid=weakness_object.uuid,
relationship_type='weakened-by'))
weakness_object.add_attribute(relation, **{'type': 'text', 'value': cwe[feature]})
self.misp_event.add_object(weakness_object)
self.references[vulnerability_uuid].append(
{
'referenced_uuid': weakness_object.uuid,
'relationship_type': 'weakened-by'
}
)
break
@ -105,7 +115,9 @@ def handler(q=False):
if q is False:
return False
request = json.loads(q)
attribute = request.get('attribute')
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if attribute.get('type') != 'vulnerability':
misperrors['error'] = 'Vulnerability id missing.'
return misperrors

View File

@ -0,0 +1,186 @@
#!/usr/bin/env python3
'''
Cytomic Orion MISP Module
An expansion module to enrich attributes in MISP and share indicators of compromise with Cytomic Orion
'''
from . import check_input_attribute, standard_error_message
from pymisp import MISPAttribute, MISPEvent, MISPObject
import json
import requests
import sys
misperrors = {'error': 'Error'}
mispattributes = {'input': ['md5'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.3', 'author': 'Koen Van Impe',
'description': 'an expansion module to enrich attributes in MISP and share indicators of compromise with Cytomic Orion',
'module-type': ['expansion']}
moduleconfig = ['api_url', 'token_url', 'clientid', 'clientsecret', 'username', 'password', 'upload_timeframe', 'upload_tag', 'delete_tag', 'upload_ttlDays', 'upload_threat_level_id', 'limit_upload_events', 'limit_upload_attributes']
# This module defines more config settings than the enrichment itself uses
# A companion PyMISP module reuses the same module config and needs the extra settings, for example to push indicators to the API
class CytomicParser():
def __init__(self, attribute, config_object):
self.misp_event = MISPEvent()
self.attribute = MISPAttribute()
self.attribute.from_dict(**attribute)
self.misp_event.add_attribute(**self.attribute)
self.config_object = config_object
if self.config_object:
self.token = self.get_token()
else:
sys.exit('Missing configuration')
def get_token(self):
try:
scope = self.config_object['scope']
grant_type = self.config_object['grant_type']
username = self.config_object['username']
password = self.config_object['password']
token_url = self.config_object['token_url']
clientid = self.config_object['clientid']
clientsecret = self.config_object['clientsecret']
if scope and grant_type and username and password:
data = {'scope': scope, 'grant_type': grant_type, 'username': username, 'password': password}
if token_url and clientid and clientsecret:
access_token_response = requests.post(token_url, data=data, verify=False, allow_redirects=False, auth=(clientid, clientsecret))
tokens = json.loads(access_token_response.text)
if 'access_token' in tokens:
return tokens['access_token']
else:
self.result = {'error': 'No token received.'}
return
else:
self.result = {'error': 'No token_url, clientid or clientsecret supplied.'}
return
else:
self.result = {'error': 'No scope, grant_type, username or password supplied.'}
return
except Exception:
self.result = {'error': 'Unable to connect to token_url.'}
return
def get_results(self):
if hasattr(self, 'result'):
return self.result
event = json.loads(self.misp_event.to_json())
results = {key: event[key] for key in ('Attribute', 'Object')}
return {'results': results}
def parse(self, searchkey):
if self.token:
endpoint_fileinformation = self.config_object['endpoint_fileinformation']
endpoint_machines = self.config_object['endpoint_machines']
endpoint_machines_client = self.config_object['endpoint_machines_client']
query_machines = self.config_object['query_machines']
query_machine_info = self.config_object['query_machine_info']
# Update endpoint URLs
query_endpoint_fileinformation = endpoint_fileinformation.format(md5=searchkey)
query_endpoint_machines = endpoint_machines.format(md5=searchkey)
# API calls
api_call_headers = {'Authorization': 'Bearer ' + self.token}
result_query_endpoint_fileinformation = requests.get(query_endpoint_fileinformation, headers=api_call_headers, verify=False)
json_result_query_endpoint_fileinformation = json.loads(result_query_endpoint_fileinformation.text)
if json_result_query_endpoint_fileinformation:
cytomic_object = MISPObject('cytomic-orion-file')
cytomic_object.add_attribute('fileName', type='text', value=json_result_query_endpoint_fileinformation['fileName'])
cytomic_object.add_attribute('fileSize', type='text', value=json_result_query_endpoint_fileinformation['fileSize'])
cytomic_object.add_attribute('last-seen', type='datetime', value=json_result_query_endpoint_fileinformation['lastSeen'])
cytomic_object.add_attribute('first-seen', type='datetime', value=json_result_query_endpoint_fileinformation['firstSeen'])
cytomic_object.add_attribute('classification', type='text', value=json_result_query_endpoint_fileinformation['classification'])
cytomic_object.add_attribute('classificationName', type='text', value=json_result_query_endpoint_fileinformation['classificationName'])
self.misp_event.add_object(**cytomic_object)
result_query_endpoint_machines = requests.get(query_endpoint_machines, headers=api_call_headers, verify=False)
json_result_query_endpoint_machines = json.loads(result_query_endpoint_machines.text)
if query_machines and json_result_query_endpoint_machines and len(json_result_query_endpoint_machines) > 0:
for machine in json_result_query_endpoint_machines:
if query_machine_info and machine['muid']:
query_endpoint_machines_client = endpoint_machines_client.format(muid=machine['muid'])
result_endpoint_machines_client = requests.get(query_endpoint_machines_client, headers=api_call_headers, verify=False)
json_result_endpoint_machines_client = json.loads(result_endpoint_machines_client.text)
if json_result_endpoint_machines_client:
cytomic_machine_object = MISPObject('cytomic-orion-machine')
clienttag = [{'name': json_result_endpoint_machines_client['clientName']}]
cytomic_machine_object.add_attribute('machineName', type='target-machine', value=json_result_endpoint_machines_client['machineName'], Tag=clienttag)
cytomic_machine_object.add_attribute('machineMuid', type='text', value=machine['muid'])
cytomic_machine_object.add_attribute('clientName', type='target-org', value=json_result_endpoint_machines_client['clientName'], Tag=clienttag)
cytomic_machine_object.add_attribute('clientId', type='text', value=machine['clientId'])
cytomic_machine_object.add_attribute('machinePath', type='text', value=machine['lastPath'])
cytomic_machine_object.add_attribute('first-seen', type='datetime', value=machine['firstSeen'])
cytomic_machine_object.add_attribute('last-seen', type='datetime', value=machine['lastSeen'])
cytomic_machine_object.add_attribute('creationDate', type='datetime', value=json_result_endpoint_machines_client['creationDate'])
cytomic_machine_object.add_attribute('clientCreationDateUTC', type='datetime', value=json_result_endpoint_machines_client['clientCreationDateUTC'])
cytomic_machine_object.add_attribute('lastSeenUtc', type='datetime', value=json_result_endpoint_machines_client['lastSeenUtc'])
self.misp_event.add_object(**cytomic_machine_object)
else:
self.result = {'error': 'No (valid) token.'}
return
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
attribute = request['attribute']
if not any(input_type == attribute['type'] for input_type in mispattributes['input']):
return {'error': 'Unsupported attribute type.'}
if not request.get('config'):
return {'error': 'Missing configuration'}
config_object = {
'clientid': request["config"].get("clientid"),
'clientsecret': request["config"].get("clientsecret"),
'scope': 'orion.api',
'password': request["config"].get("password"),
'username': request["config"].get("username"),
'grant_type': 'password',
'token_url': request["config"].get("token_url"),
'endpoint_fileinformation': '{api_url}{endpoint}'.format(api_url=request["config"].get("api_url"), endpoint='/forensics/md5/{md5}/info'),
'endpoint_machines': '{api_url}{endpoint}'.format(api_url=request["config"].get("api_url"), endpoint='/forensics/md5/{md5}/muids'),
'endpoint_machines_client': '{api_url}{endpoint}'.format(api_url=request["config"].get("api_url"), endpoint='/forensics/muid/{muid}/info'),
'query_machines': True,
'query_machine_info': True
}
cytomic_parser = CytomicParser(attribute, config_object)
cytomic_parser.parse(attribute['value'])
return cytomic_parser.get_results()
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
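A minimal sketch of the enrichment request this handler expects; every value below is a placeholder, not a real endpoint or credential.
import json
example_request = json.dumps({
    'attribute': {'type': 'md5', 'value': '9e107d9d372bb6826bd81d3542a419d6', 'uuid': '00000000-0000-0000-0000-000000000000'},
    'config': {
        'api_url': 'https://orion.example/api/v1',        # placeholder
        'token_url': 'https://auth.example/oauth/token',   # placeholder
        'clientid': 'myclient', 'clientsecret': 'mysecret',
        'username': 'analyst', 'password': 'secret'
    }
})
# handler(example_request) returns {'results': {'Attribute': [...], 'Object': [...]}} on success,
# or {'error': ...} if the token request or the Orion API calls fail.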

View File

@ -49,8 +49,10 @@ def handler(q=False):
try:
query_result = resolver.query(query, 'A')[0]
result = "{} - {}".format(requested_value, dbl_mapping[str(query_result)])
except Exception as e:
result = str(e)
except dns.resolver.NXDOMAIN as e:
result = e.msg
except Exception:
return {'error': 'Not able to reach dbl.spamhaus.org or something went wrong'}
return {'results': [{'types': mispattributes.get('output'), 'values': result}]}
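As an illustration of the behaviour this hunk changes (return-code meanings taken from Spamhaus documentation, so treat them as assumptions):
# a listed domain resolves under dbl.spamhaus.org to an address in 127.0.1.0/24,
# which dbl_mapping translates into a human-readable listing reason;
# an unlisted domain raises dns.resolver.NXDOMAIN, now reported back as e.msg
# instead of failing with a generic exception message.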

View File

@ -0,0 +1,84 @@
"""
Export module for converting MISP events into Endgame EQL queries
"""
import json
import logging
misperrors = {"error": "Error"}
moduleinfo = {
"version": "0.1",
"author": "92 COS DOM",
"description": "Generates EQL queries from events",
"module-type": ["expansion"]
}
# Map of MISP fields => Endgame fields
fieldmap = {
"ip-src": "source_address",
"ip-dst": "destination_address",
"filename": "file_name"
}
# Describe what events have what fields
event_types = {
"source_address": "network",
"destination_address": "network",
"file_name": "file"
}
# combine all the MISP fields from fieldmap into one big list
mispattributes = {
"input": list(fieldmap.keys())
}
def handler(q=False):
"""
Convert a MISP attribute into an Endgame EQL query.
Input
q: Query dictionary
"""
if q is False or not q:
return False
# Check if we were given a configuration
request = json.loads(q)
config = request.get("config", {"Default_Source": ""})
logging.info("Setting config to: %s", config)
attrType = None
for supportedType in fieldmap.keys():
if request.get(supportedType):
attrType = supportedType
if attrType:
eqlType = fieldmap[attrType]
event_type = event_types[eqlType]
fullEql = "{} where {} == \"{}\"".format(event_type, eqlType, request[attrType])
else:
misperrors['error'] = "Unsupported attributes type"
return misperrors
response = []
response.append({'types': ['comment'], 'categories': ['External analysis'], 'values': fullEql, 'comment': "Event EQL queries"})
return {'results': response}
def introspection():
"""
Relay the supported attributes to MISP.
No Input
Output
Dictionary of supported MISP attributes
"""
return mispattributes
def version():
"""
Relay module version and associated metadata to MISP.
No Input
Output
moduleinfo: metadata output containing all potential configuration values
"""
return moduleinfo
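A small, hypothetical request to illustrate the query this module builds (the address is a documentation placeholder):
import json
example_request = json.dumps({'ip-src': '198.51.100.7'})
# handler(example_request) yields a comment attribute whose value is the EQL query:
#   network where source_address == "198.51.100.7"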

View File

@ -16,10 +16,9 @@ def handler(q=False):
if q is False:
return False
request = json.loads(q)
if (request.get('config')):
if (request['config'].get('apikey') is None):
misperrors['error'] = 'Farsight DNSDB apikey is missing'
return misperrors
if not request.get('config') or not request['config'].get('apikey'):
misperrors['error'] = 'Farsight DNSDB apikey is missing'
return misperrors
client = DnsdbClient(server, request['config']['apikey'])
if request.get('hostname'):
res = lookup_name(client, request['hostname'])

View File

@ -0,0 +1,64 @@
import json
import geoip2.database
import sys
import logging
log = logging.getLogger('geoip_asn')
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-src', 'ip-dst', 'domain|ip'], 'output': ['freetext']}
moduleconfig = ['local_geolite_db']
# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '0.1', 'author': 'GlennHD',
'description': 'Query a local copy of the Maxmind Geolite ASN database (MMDB format)',
'module-type': ['expansion', 'hover']}
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('config') or not request['config'].get('local_geolite_db'):
return {'error': 'Please specify the path of your local copy of the Maxmind Geolite ASN database'}
path_to_geolite = request['config']['local_geolite_db']
if request.get('ip-dst'):
toquery = request['ip-dst']
elif request.get('ip-src'):
toquery = request['ip-src']
elif request.get('domain|ip'):
toquery = request['domain|ip'].split('|')[1]
else:
return False
try:
reader = geoip2.database.Reader(path_to_geolite)
except FileNotFoundError:
return {'error': f'Unable to locate the GeoLite database you specified ({path_to_geolite}).'}
log.debug(toquery)
try:
answer = reader.asn(toquery)
stringmap = 'ASN=' + str(answer.autonomous_system_number) + ', AS Org=' + str(answer.autonomous_system_organization)
except Exception as e:
misperrors['error'] = f"GeoIP resolving error: {e}"
return misperrors
r = {'results': [{'types': mispattributes['output'], 'values': stringmap}]}
return r
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo
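A hypothetical request for this module; the database path and the returned ASN details are illustrative only.
import json
example_request = json.dumps({
    'ip-src': '198.51.100.7',
    'config': {'local_geolite_db': '/var/lib/GeoIP/GeoLite2-ASN.mmdb'}
})
# handler(example_request) returns something like:
# {'results': [{'types': ['freetext'], 'values': 'ASN=64511, AS Org=EXAMPLE-NET'}]}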

View File

@ -0,0 +1,65 @@
import json
import geoip2.database
import sys
import logging
log = logging.getLogger('geoip_city')
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler(sys.stdout)
ch.setLevel(logging.DEBUG)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
ch.setFormatter(formatter)
log.addHandler(ch)
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-src', 'ip-dst', 'domain|ip'], 'output': ['freetext']}
moduleconfig = ['local_geolite_db']
# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '0.1', 'author': 'GlennHD',
'description': 'Query a local copy of the Maxmind Geolite City database (MMDB format)',
'module-type': ['expansion', 'hover']}
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('config') or not request['config'].get('local_geolite_db'):
return {'error': 'Please specify the path of your local copy of Maxminds Geolite database'}
path_to_geolite = request['config']['local_geolite_db']
if request.get('ip-dst'):
toquery = request['ip-dst']
elif request.get('ip-src'):
toquery = request['ip-src']
elif request.get('domain|ip'):
toquery = request['domain|ip'].split('|')[1]
else:
return False
try:
reader = geoip2.database.Reader(path_to_geolite)
except FileNotFoundError:
return {'error': f'Unable to locate the GeoLite database you specified ({path_to_geolite}).'}
log.debug(toquery)
try:
answer = reader.city(toquery)
stringmap = 'Continent=' + str(answer.continent.name) + ', Country=' + str(answer.country.name) + ', Subdivision=' + str(answer.subdivisions.most_specific.name) + ', City=' + str(answer.city.name)
except Exception as e:
misperrors['error'] = f"GeoIP resolving error: {e}"
return misperrors
r = {'results': [{'types': mispattributes['output'], 'values': stringmap}]}
return r
def introspection():
return mispattributes
def version():
moduleinfo['config'] = moduleconfig
return moduleinfo

View File

@ -1,3 +0,0 @@
[GEOIP]
database = /opt/misp-modules/var/GeoIP.dat

View File

@ -1,9 +1,7 @@
import json
import pygeoip
import geoip2.database
import sys
import os
import logging
import configparser
log = logging.getLogger('geoip_country')
log.setLevel(logging.DEBUG)
@ -15,27 +13,22 @@ log.addHandler(ch)
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-src', 'ip-dst', 'domain|ip'], 'output': ['freetext']}
moduleconfig = ['local_geolite_db']
# possible module-types: 'expansion', 'hover' or both
moduleinfo = {'version': '0.1', 'author': 'Andreas Muehlemann',
'description': 'Query a local copy of Maxminds Geolite database',
moduleinfo = {'version': '0.2', 'author': 'Andreas Muehlemann',
'description': 'Query a local copy of Maxminds Geolite database, updated for MMDB format',
'module-type': ['expansion', 'hover']}
try:
# get current db from http://geolite.maxmind.com/download/geoip/database/GeoLiteCountry/GeoIP.dat.gz
config = configparser.ConfigParser()
config.read(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'geoip_country.cfg'))
gi = pygeoip.GeoIP(config.get('GEOIP', 'database'))
enabled = True
except Exception:
enabled = False
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('config') or not request['config'].get('local_geolite_db'):
return {'error': 'Please specify the path of your local copy of Maxminds Geolite database'}
path_to_geolite = request['config']['local_geolite_db']
if request.get('ip-dst'):
toquery = request['ip-dst']
elif request.get('ip-src'):
@ -45,15 +38,18 @@ def handler(q=False):
else:
return False
log.debug(toquery)
try:
answer = gi.country_code_by_addr(toquery)
except Exception:
misperrors['error'] = "GeoIP resolving error"
reader = geoip2.database.Reader(path_to_geolite)
except FileNotFoundError:
return {'error': f'Unable to locate the GeoLite database you specified ({path_to_geolite}).'}
log.debug(toquery)
try:
answer = reader.country(toquery)
except Exception as e:
misperrors['error'] = f"GeoIP resolving error: {e}"
return misperrors
r = {'results': [{'types': mispattributes['output'], 'values': [str(answer)]}]}
r = {'results': [{'types': mispattributes['output'], 'values': [answer.country.iso_code]}]}
return r
@ -63,5 +59,5 @@ def introspection():
def version():
# moduleinfo['config'] = moduleconfig
moduleinfo['config'] = moduleconfig
return moduleinfo

View File

@ -0,0 +1,34 @@
import json
try:
from google import google
except ImportError:
print("GoogleAPI not installed. Command : pip install git+https://github.com/abenassi/Google-Search-API")
misperrors = {'error': 'Error'}
mispattributes = {'input': ['url'], 'output': ['text']}
moduleinfo = {'author': 'Oun & Gindt', 'module-type': ['hover'],
'description': 'A hover module to return Google search information about a URL'}
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('url'):
return {'error': "Unsupported attributes type"}
num_page = 1
res = ""
search_results = google.search(request['url'], num_page)
for i in range(min(3, len(search_results))):
res += "("+str(i+1)+")" + '\t'
res += json.dumps(search_results[i].description, ensure_ascii=False)
res += '\n\n'
return {'results': [{'types': mispattributes['output'], 'values':res}]}
def introspection():
return mispattributes
def version():
return moduleinfo
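A quick usage sketch, assuming the GoogleAPI package is installed (the URL is a placeholder):
import json
example_request = json.dumps({'url': 'https://example.com'})
# handler(example_request) returns the descriptions of up to the first three Google results as a single text value.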

View File

@ -3,35 +3,59 @@ import json
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-dst', 'ip-src'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Aurélien Schwab <aurelien.schwab+dev@gmail.com>', 'description': 'Module to access GreyNoise.io API.', 'module-type': ['hover']}
moduleconfig = ['user-agent'] # TODO take this into account in the code
moduleinfo = {
'version': '0.2',
'author': 'Aurélien Schwab <aurelien.schwab+dev@gmail.com>',
'description': 'Module to access GreyNoise.io API.',
'module-type': ['hover']
}
moduleconfig = ['api_key']
greynoise_api_url = 'http://api.greynoise.io:8888/v1/query/ip'
default_user_agent = 'MISP-Module'
greynoise_api_url = 'https://api.greynoise.io/v2/noise/quick/'
codes_mapping = {
'0x00': 'The IP has never been observed scanning the Internet',
'0x01': 'The IP has been observed by the GreyNoise sensor network',
'0x02': 'The IP has been observed scanning the GreyNoise sensor network, but has not completed a full connection, meaning this can be spoofed',
'0x03': 'The IP is adjacent to another host that has been directly observed by the GreyNoise sensor network',
'0x04': 'Reserved',
'0x05': 'This IP is commonly spoofed in Internet-scan activity',
'0x06': 'This IP has been observed as noise, but this host belongs to a cloud provider where IPs can be cycled frequently',
'0x07': 'This IP is invalid',
'0x08': 'This IP was classified as noise, but has not been observed engaging in Internet-wide scans or attacks in over 60 days'
}
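To make the mapping concrete, a hypothetical quick-lookup exchange (response shape assumed; only the code field is read by the handler):
# GET https://api.greynoise.io/v2/noise/quick/198.51.100.7 with header 'key: <api_key>'
# might return {"ip": "198.51.100.7", "noise": false, "code": "0x00"},
# which the handler maps through codes_mapping to:
# 'The IP has never been observed scanning the Internet'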
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if not request.get('config') or not request['config'].get('api_key'):
return {'error': 'Missing Greynoise API key.'}
headers = {
'Accept': 'application/json',
'key': request['config']['api_key']
}
for input_type in mispattributes['input']:
if input_type in request:
ip = request[input_type]
break
else:
misperrors['error'] = "Unsupported attributes type"
misperrors['error'] = "Unsupported attributes type."
return misperrors
data = {'ip': ip}
r = requests.post(greynoise_api_url, data=data, headers={'user-agent': default_user_agent}) # Real request
if r.status_code == 200: # OK (record found)
response = json.loads(r.text)
if response:
return {'results': [{'types': mispattributes['output'], 'values': response}]}
elif r.status_code == 404: # Not found (not an error)
return {'results': [{'types': mispattributes['output'], 'values': 'No data'}]}
else: # Real error
misperrors['error'] = 'GreyNoise API not accessible (HTTP ' + str(r.status_code) + ')'
return misperrors['error']
response = requests.get(f'{greynoise_api_url}{ip}', headers=headers) # Real request
if response.status_code == 200: # OK (record found)
return {'results': [{'types': mispattributes['output'], 'values': codes_mapping[response.json()['code']]}]}
# There is an error
errors = {
400: "Bad request.",
401: "Unauthorized. Please check your API key.",
429: "Too many requests. You've hit the rate-limit."
}
try:
misperrors['error'] = errors[response.status_code]
except KeyError:
misperrors['error'] = f'GreyNoise API not accessible (HTTP {response.status_code})'
return misperrors['error']
def introspection():

View File

@ -23,11 +23,7 @@ def handler(q=False):
r = requests.post(hashddapi_url, data={'hash': v})
if r.status_code == 200:
state = json.loads(r.text)
if state:
if state.get(v):
summary = state[v]['known_level']
else:
summary = 'Unknown hash'
summary = state[v]['known_level'] if state and state.get(v) else 'Unknown hash'
else:
misperrors['error'] = '{} API not accessible'.format(hashddapi_url)
return misperrors['error']

View File

@ -0,0 +1,53 @@
import json
import requests
from markdownify import markdownify
from bs4 import BeautifulSoup
misperrors = {'error': 'Error'}
mispattributes = {'input': ['url'], 'output': ['text']}
moduleinfo = {'version': '0.1', 'author': 'Sami Mokaddem',
'description': 'Simple HTML fetcher',
'module-type': ['expansion']}
def fetchHTML(url):
r = requests.get(url)
return r.text
def stripUselessTags(html):
soup = BeautifulSoup(html, 'html.parser')
toRemove = ['script', 'head', 'header', 'footer', 'meta', 'link']
for tag in soup.find_all(toRemove):
tag.decompose()
return str(soup)
def convertHTML(html):
toStrip = ['a', 'img']
return markdownify(html, heading_style='ATX', strip=toStrip)
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('url'):
url = request['url']
else:
return False
html = fetchHTML(url)
html = stripUselessTags(html)
markdown = convertHTML(html)
r = {'results': [{'types': mispattributes['output'],
'values':[str(markdown)]}]}
return r
def introspection():
return mispattributes
def version():
return moduleinfo
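A brief usage sketch for this fetcher (the URL is a placeholder):
import json
example_request = json.dumps({'url': 'https://example.com/'})
# handler(example_request) fetches the page, strips script/head/header/footer/meta/link tags,
# and returns the remaining HTML converted to Markdown as a single text value.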

View File

@ -1,26 +1,45 @@
# -*- coding: utf-8 -*-
import json
from . import check_input_attribute, standard_error_message
from pyipasnhistory import IPASNHistory
from pymisp import MISPAttribute, MISPEvent, MISPObject
misperrors = {'error': 'Error'}
mispattributes = {'input': ['ip-src', 'ip-dst'], 'output': ['freetext']}
moduleinfo = {'version': '0.1', 'author': 'Raphaël Vinot',
mispattributes = {'input': ['ip-src', 'ip-dst'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.2', 'author': 'Raphaël Vinot',
'description': 'Query an IP ASN history service (https://github.com/CIRCL/IP-ASN-history.git)',
'module-type': ['expansion', 'hover']}
def parse_result(attribute, values):
event = MISPEvent()
initial_attribute = MISPAttribute()
initial_attribute.from_dict(**attribute)
event.add_attribute(**initial_attribute)
mapping = {'asn': ('AS', 'asn'), 'prefix': ('ip-src', 'subnet-announced')}
for last_seen, response in values['response'].items():
asn = MISPObject('asn')
asn.add_attribute('last-seen', **{'type': 'datetime', 'value': last_seen})
for feature, attribute_fields in mapping.items():
attribute_type, object_relation = attribute_fields
asn.add_attribute(object_relation, **{'type': attribute_type, 'value': response[feature]})
asn.add_reference(initial_attribute.uuid, 'related-to')
event.add_object(**asn)
event = json.loads(event.to_json())
return {key: event[key] for key in ('Attribute', 'Object')}
def handler(q=False):
if q is False:
return False
request = json.loads(q)
if request.get('ip-src'):
toquery = request['ip-src']
elif request.get('ip-dst'):
toquery = request['ip-dst']
else:
misperrors['error'] = "Unsupported attributes type"
return misperrors
if not request.get('attribute') or not check_input_attribute(request['attribute']):
return {'error': f'{standard_error_message}, which should contain at least a type, a value and an uuid.'}
if request['attribute']['type'] not in mispattributes['input']:
return {'error': 'Unsupported attribute type.'}
toquery = request['attribute']['value']
ipasn = IPASNHistory()
values = ipasn.query(toquery)
@ -28,7 +47,7 @@ def handler(q=False):
if not values:
misperrors['error'] = 'Unable to find the history of this IP'
return misperrors
return {'results': [{'types': mispattributes['output'], 'values': values}]}
return {'results': parse_result(request['attribute'], values)}
def introspection():

View File

@ -1,15 +1,17 @@
# -*- coding: utf-8 -*-
import jbxapi
import json
from . import check_input_attribute, checking_error, standard_error_message
from joe_parser import JoeParser
misperrors = {'error': 'Error'}
mispattributes = {'input': ['link'], 'format': 'misp_standard'}
moduleinfo = {'version': '0.1', 'author': 'Christian Studer',
inputSource = ['link']
moduleinfo = {'version': '0.2', 'author': 'Christian Studer',
'description': 'Query Joe Sandbox API with a report URL to get the parsed data.',
'module-type': ['expansion']}
moduleconfig = ['apiurl', 'apikey']
moduleconfig = ['apiurl', 'apikey', 'import_pe', 'import_mitre_attack']
def handler(q=False):
@ -18,9 +20,18 @@ def handler(q=False):
request = json.loads(q)
apiurl = request['config'].get('apiurl') or 'https://jbxcloud.joesecurity.org/api'
apikey = request['config'].get('apikey')
parser_config = {
"import_pe": request["config"].get('import_pe', "false") == "true",
"mitre_attack": request["config"].get('import_mitre_attack', "false") == "true",
}
if not apikey:
return {'error': 'No API key provided'}
if not request.get('attribute') or not check_input_attribute(request['attribute'], requirements=('type', 'value')):
return {'error': f'{standard_error_message}, {checking_error} that is the link to the Joe Sandbox report.'}
if request['attribute']['type'] != 'link':
return {'error': 'Unsupported attribute type.'}
url = request['attribute']['value']
if "/submissions/" not in url:
return {'error': "The URL does not point to a Joe Sandbox analysis."}
@ -41,7 +52,7 @@ def handler(q=False):
analysis_webid = joe_info['most_relevant_analysis']['webid']
joe_parser = JoeParser()
joe_parser = JoeParser(parser_config)
joe_data = json.loads(joe.analysis_download(analysis_webid, 'jsonfixed')[1])
joe_parser.parse_data(joe_data['analysis'])
joe_parser.finalize_results()
@ -50,7 +61,19 @@ def handler(q=False):
def introspection():
return mispattributes
modulesetup = {}
try:
userConfig
modulesetup['userConfig'] = userConfig
except NameError:
pass
try:
inputSource
modulesetup['input'] = inputSource
except NameError:
pass
modulesetup['format'] = 'misp_standard'
return modulesetup
def version():

View File

@ -0,0 +1,139 @@
#!/usr/bin/env python3
"""
Module (type "expansion") to query a Lastline report from an analysis link.
"""
import json
import lastline_api
from . import check_input_attribute, checking_error, standard_error_message
misperrors = {
"error": "Error",
}
mispattributes = {
"input": [
"link",
],
"output": ["text"],
"format": "misp_standard",
}
moduleinfo = {
"version": "0.1",
"author": "Stefano Ortolani",
"description": "Get a Lastline report from an analysis link.",
"module-type": ["expansion"],
}
moduleconfig = [
"username",
"password",
"verify_ssl",
]
def introspection():
return mispattributes
def version():
moduleinfo["config"] = moduleconfig
return moduleinfo
def handler(q=False):
if q is False:
return False
request = json.loads(q)
# Parse the init parameters
try:
config = request["config"]
auth_data = lastline_api.LastlineAbstractClient.get_login_params_from_dict(config)
if not request.get('attribute') or not check_input_attribute(request['attribute'], requirements=('type', 'value')):
return {'error': f'{standard_error_message}, {checking_error} that is the link to a Lastline analysis.'}
analysis_link = request['attribute']['value']
# The API url changes based on the analysis link host name
api_url = lastline_api.get_portal_url_from_task_link(analysis_link)
except Exception as e:
misperrors["error"] = "Error parsing configuration: {}".format(e)
return misperrors
# Parse the call parameters
try:
task_uuid = lastline_api.get_uuid_from_task_link(analysis_link)
except (KeyError, ValueError) as e:
misperrors["error"] = "Error processing input parameters: {}".format(e)
return misperrors
# Make the API calls
try:
api_client = lastline_api.PortalClient(api_url, auth_data, verify_ssl=str(config.get('verify_ssl', 'true')).lower() == 'true')
response = api_client.get_progress(task_uuid)
if response.get("completed") != 1:
raise ValueError("Analysis is not finished yet.")
response = api_client.get_result(task_uuid)
if not response:
raise ValueError("Analysis report is empty.")
except Exception as e:
misperrors["error"] = "Error issuing the API call: {}".format(e)
return misperrors
# Parse and return
result_parser = lastline_api.LastlineResultBaseParser()
result_parser.parse(analysis_link, response)
event = result_parser.misp_event
event_dictionary = json.loads(event.to_json())
return {
"results": {
key: event_dictionary[key]
for key in ('Attribute', 'Object', 'Tag')
if (key in event_dictionary and event_dictionary[key])
}
}
if __name__ == "__main__":
"""Test querying information from a Lastline analysis link."""
import argparse
import configparser
parser = argparse.ArgumentParser()
parser.add_argument("-c", "--config-file", dest="config_file")
parser.add_argument("-s", "--section-name", dest="section_name")
args = parser.parse_args()
c = configparser.ConfigParser()
c.read(args.config_file)
a = lastline_api.LastlineAbstractClient.get_login_params_from_conf(c, args.section_name)
j = json.dumps(
{
"config": a,
"attribute": {
"value": (
"https://user.lastline.com/portal#/analyst/task/"
"1fcbcb8f7fb400100772d6a7b62f501b/overview"
)
}
}
)
print(json.dumps(handler(j), indent=4, sort_keys=True))
j = json.dumps(
{
"config": a,
"attribute": {
"value": (
"https://user.lastline.com/portal#/analyst/task/"
"f3c0ae115d51001017ff8da768fa6049/overview"
)
}
}
)
print(json.dumps(handler(j), indent=4, sort_keys=True))

Some files were not shown because too many files have changed in this diff Show More