mirror of https://github.com/MISP/PyMISP
Merge branch 'master' of https://github.com/MISP/PyMISP
commit 2963303404

@@ -140,7 +140,7 @@ subject_process = (strip |
##
## Tags that will be used for the changelog must match this regexp.
##
-tag_filter_regexp = r'^v[0-9]+\.[0-9]+\.[0-9]+$'
+tag_filter_regexp = r'^v[0-9]+\.[0-9]+\.[0-9]+(\.[0-9])*$'


## ``unreleased_version_label`` is a string
@@ -0,0 +1,3 @@
[submodule "pymisp/data/misp-objects"]
    path = pymisp/data/misp-objects
    url = https://github.com/MISP/misp-objects
24  .travis.yml

@@ -2,25 +2,31 @@ language: python
cache: pip

addons:
  apt:
    sources: [ 'ubuntu-toolchain-r-test' ]
    packages:
      - libstdc++6
      - libfuzzy-dev

python:
  - "2.7"
  - "3.4"
  - "3.5"
  - "3.5-dev"
  - "3.6"
  - "3.6-dev"
  - "3.7-dev"
  - "nightly"

install:
-  - pip install -U nose
-  - pip install coveralls
-  - pip install codecov
-  - pip install requests-mock
-  - pip install .
+  - pip install -U nose pip setuptools
+  - pip install coveralls codecov requests-mock
+  - pip install git+https://github.com/kbandla/pydeep.git
+  - pip install .[fileobjects,neo,openioc,virustotal]
+  - pushd tests
+  - git clone https://github.com/viper-framework/viper-test-files.git
+  - popd

script:
-  - nosetests --with-coverage --cover-package=pymisp tests/test_offline.py
+  - nosetests --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py

after_success:
  - codecov
1570  CHANGELOG.txt  (diff suppressed because it is too large)

11  MANIFEST.in
@@ -1 +1,10 @@
-include pymisp/data/*
+graft docs
+graft examples
+graft tests
+include CHANGELOG.txt
+include LICENSE
+include pymisp/data/*.json
+include pymisp/data/misp-objects/*.json
+include pymisp/data/misp-objects/objects/*/definition.json
+include pymisp/data/misp-objects/relationships/definition.json
+include README.md
63  README.md

@@ -1,7 +1,7 @@
README
======

-[![Documentation Status](https://readthedocs.org/projects/pymisp/badge/?version=master)](http://pymisp.readthedocs.io/en/master/?badge=master)
+[![Documentation Status](https://readthedocs.org/projects/pymisp/badge/?version=latest)](http://pymisp.readthedocs.io/?badge=latest)
[![Build Status](https://travis-ci.org/MISP/PyMISP.svg?branch=master)](https://travis-ci.org/MISP/PyMISP)
[![Coverage Status](https://coveralls.io/repos/github/MISP/PyMISP/badge.svg?branch=master)](https://coveralls.io/github/MISP/PyMISP?branch=master)

@@ -21,11 +21,12 @@ PyMISP allows you to fetch events, add or update events/attributes, add or updat
pip3 install pymisp
```

-## Install the lastest version from repo
+## Install the latest version from repo

```
-git clone https://github.com/CIRCL/PyMISP.git && cd PyMISP
-python3 setup.py install
+git clone https://github.com/MISP/PyMISP.git && cd PyMISP
+git submodule update --init
+pip3 install -I .
```

## Samples and how to use PyMISP

@@ -50,12 +51,62 @@ cd examples
python3 last.py -l 10
```

## Debugging

You have two options here:

1. Pass `debug=True` to `PyMISP` and it will enable `logging.DEBUG` output to stderr for the whole module.

2. Use the Python logging module directly:

```python
import logging
logger = logging.getLogger('pymisp')

# Configure it as you wish, for example, enable DEBUG mode:
logger.setLevel(logging.DEBUG)
```

Or if you want to write the debug output to a file instead of stderr:

```python
import pymisp
import logging

logger = logging.getLogger('pymisp')
logging.basicConfig(level=logging.DEBUG, filename="debug.log", filemode='w', format=pymisp.FORMAT)
```

## Documentation

-[PyMISP API documentation is available](https://media.readthedocs.org/pdf/pymisp/master/pymisp.pdf).
+[PyMISP API documentation is available](https://media.readthedocs.org/pdf/pymisp/latest/pymisp.pdf).

Documentation can be generated with epydoc:

```
-epydoc --url https://github.com/CIRCL/PyMISP --graph all --name PyMISP --pdf pymisp -o doc
+epydoc --url https://github.com/MISP/PyMISP --graph all --name PyMISP --pdf pymisp -o doc
```

## Everything is a Mutable Mapping

... or at least everything that can be imported/exported from/to a JSON blob.

`AbstractMISP` is the master class; it inherits `collections.MutableMapping`, which means
the class can be represented as a Python dictionary.

The abstraction assumes every property that should not be seen in the dictionary is prefixed with a `_`,
or its name is added to the private list `__not_jsonable` (accessible through `update_not_jsonable` and `set_not_jsonable`).

This master class has helpers that make it easy to load from, and export to, a JSON string.

`MISPEvent`, `MISPAttribute`, `MISPObjectReference`, `MISPObjectAttribute`, and `MISPObject`
are subclasses of `AbstractMISP`, which means they can be handled as Python dictionaries.

## MISP Objects

Creating a new MISP object generator should be done using a pre-defined template and inheriting `AbstractMISPObjectGenerator`.

Your new MISPObject generator needs to generate attributes and add them as class properties using `add_attribute`.

When the object is sent to MISP, all the class properties will be exported in the JSON export.
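The Mutable Mapping convention described above can be sketched with a small stdlib-only toy class. This is only an illustration of the convention, not PyMISP's actual implementation; the class and attribute names below are made up:

```python
from collections.abc import MutableMapping
import json


class AbstractLike(MutableMapping):
    """Toy model of the AbstractMISP idea: every public attribute becomes
    a dictionary key; names starting with '_' or listed in _not_jsonable
    are hidden from the mapping/JSON view."""

    _not_jsonable = ['secret']

    def _keys(self):
        return [k for k in vars(self)
                if not k.startswith('_') and k not in self._not_jsonable]

    def __getitem__(self, key):
        if key in self._keys():
            return getattr(self, key)
        raise KeyError(key)

    def __setitem__(self, key, value):
        setattr(self, key, value)

    def __delitem__(self, key):
        delattr(self, key)

    def __iter__(self):
        return iter(self._keys())

    def __len__(self):
        return len(self._keys())

    def to_json(self):
        return json.dumps(dict(self))


e = AbstractLike()
e.info = 'demo event'
e._internal = 'hidden'   # underscore prefix: not exported
e.secret = 'hidden too'  # listed in _not_jsonable: not exported
print(dict(e))           # {'info': 'demo event'}
```

Because the class behaves like a dictionary, anything that consumes mappings (JSON serialization, `dict(obj)`, `**obj` unpacking) works on it directly.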
@@ -0,0 +1 @@
../../README.md
@@ -17,10 +17,6 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
import sys
sys.path.insert(0, os.path.abspath('.'))

from recommonmark.parser import CommonMarkParser

# -- General configuration ------------------------------------------------

@@ -70,7 +66,7 @@ master_doc = 'index'

# General information about the project.
project = 'PyMISP'
-copyright = '2016, Raphaël Vinot'
+copyright = '2017, Raphaël Vinot'
author = 'Raphaël Vinot'

# The version info for the project you're documenting, acts as replacement for

@@ -78,9 +74,9 @@ author = 'Raphaël Vinot'
# built documents.
#
# The short X.Y version.
-version = '2.4.50'
+version = 'master'
# The full version, including alpha/beta/rc tags.
-release = '2.4.50'
+release = 'master'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -9,12 +9,11 @@ Welcome to PyMISP's documentation!
Contents:

.. toctree::
-   :maxdepth: 2
+   :maxdepth: 4

-   readme
+   README
   modules
   tools

Indices and tables
==================

@@ -22,4 +21,3 @@ Indices and tables
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
@@ -4,4 +4,82 @@ pymisp
.. toctree::
   :maxdepth: 4

   pymisp

.. automodule:: pymisp
   :members:


PyMISP
------

.. autoclass:: PyMISP
   :members:

MISPAbstract
------------

.. autoclass:: AbstractMISP
   :members:

MISPEncode
----------

.. autoclass:: MISPEncode
   :members:

MISPEvent
---------

.. autoclass:: MISPEvent
   :members:
   :inherited-members:

MISPAttribute
-------------

.. autoclass:: MISPAttribute
   :members:
   :inherited-members:

MISPObject
----------

.. autoclass:: MISPObject
   :members:
   :inherited-members:

MISPObjectAttribute
-------------------

.. autoclass:: MISPObjectAttribute
   :members:
   :inherited-members:

MISPObjectReference
-------------------

.. autoclass:: MISPObjectReference
   :members:
   :inherited-members:

MISPTag
-------

.. autoclass:: MISPTag
   :members:
   :inherited-members:

MISPUser
--------

.. autoclass:: MISPUser
   :members:
   :inherited-members:

MISPOrganisation
----------------

.. autoclass:: MISPOrganisation
   :members:
   :inherited-members:
@@ -1,22 +0,0 @@
-pymisp package
-==============
-
-Submodules
-----------
-
-pymisp.api module
------------------
-
-.. automodule:: pymisp.api
-   :members:
-   :undoc-members:
-   :show-inheritance:
-
-
-Module contents
----------------
-
-.. automodule:: pymisp
-   :members:
-   :undoc-members:
-   :show-inheritance:
@@ -1 +0,0 @@
-.. include:: ../../README.md
@@ -0,0 +1,69 @@
pymisp - Tools
==============

.. toctree::
   :maxdepth: 4

.. automodule:: pymisp.tools
   :members:

File Object
-----------

.. autoclass:: FileObject
   :members:
   :inherited-members:

ELF Object
----------

.. autoclass:: ELFObject
   :members:
   :inherited-members:

.. autoclass:: ELFSectionObject
   :members:
   :inherited-members:

PE Object
---------

.. autoclass:: PEObject
   :members:
   :inherited-members:

.. autoclass:: PESectionObject
   :members:
   :inherited-members:

Mach-O Object
-------------

.. autoclass:: MachOObject
   :members:
   :inherited-members:

.. autoclass:: MachOSectionObject
   :members:
   :inherited-members:

VT Report Object
----------------

.. autoclass:: VTReportObject
   :members:
   :inherited-members:

STIX
----

.. automodule:: pymisp.tools.stix
   :members:

OpenIOC
-------

.. automethod:: pymisp.tools.load_openioc

.. automethod:: pymisp.tools.load_openioc_file
@@ -0,0 +1,31 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from pymisp import PyMISP
from pymisp.tools import EMailObject
import traceback
from keys import misp_url, misp_key, misp_verifycert
import glob
import argparse


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Extract indicators out of e-mail messages and add MISP objects to a MISP instance.')
    parser.add_argument("-e", "--event", required=True, help="Event ID to update.")
    parser.add_argument("-p", "--path", required=True, help="Path to process (expanded using glob).")
    args = parser.parse_args()

    pymisp = PyMISP(misp_url, misp_key, misp_verifycert, debug=True)

    for f in glob.glob(args.path):
        try:
            eo = EMailObject(f)
        except Exception:
            traceback.print_exc()
            continue

        if eo:
            template_id = pymisp.get_object_template_id(eo.template_uuid)
            response = pymisp.add_object(args.event, template_id, eo)
            for ref in eo.ObjectReference:
                r = pymisp.add_object_reference(ref)
@@ -0,0 +1,80 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from pymisp import PyMISP, MISPEvent
from pymisp.tools import Fail2BanObject
import argparse
from base64 import b64decode
from datetime import date, datetime
from dateutil.parser import parse


try:
    from keys import misp_url, misp_key, misp_verifycert
except Exception:
    misp_url = 'URL'
    misp_key = 'AUTH_KEY'
    misp_verifycert = True


def create_new_event():
    me = MISPEvent()
    me.info = "Fail2Ban blocking"
    me.add_tag(args.tag)
    start = datetime.now()
    me.add_attribute('datetime', start.isoformat(), comment='Start Time')
    return me


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Add Fail2ban object.')
    parser.add_argument("-b", "--banned_ip", required=True, help="Banned IP address.")
    parser.add_argument("-a", "--attack_type", required=True, help="Type of attack.")
    parser.add_argument("-t", "--tag", required=True, help="Tag to search on MISP.")
    parser.add_argument("-p", "--processing_timestamp", help="Processing timestamp.")
    parser.add_argument("-f", "--failures", help="Amount of failures that lead to the ban.")
    parser.add_argument("-s", "--sensor", help="Sensor identifier.")
    parser.add_argument("-v", "--victim", help="Victim identifier.")
    parser.add_argument("-l", "--logline", help="Logline (base64 encoded).")
    parser.add_argument("-n", "--force_new", action='store_true', default=False, help="Force new MISP event.")
    parser.add_argument("-d", "--disable_new", action='store_true', default=False, help="Do not create a new Event.")
    args = parser.parse_args()

    pymisp = PyMISP(misp_url, misp_key, misp_verifycert, debug=True)
    event_id = -1
    me = None
    if args.force_new:
        me = create_new_event()
    else:
        response = pymisp.search_index(tag=args.tag, timestamp='1h')
        if response['response']:
            if args.disable_new:
                event_id = response['response'][0]['id']
            else:
                last_event_date = parse(response['response'][0]['date']).date()
                nb_attr = response['response'][0]['attribute_count']
                if last_event_date < date.today() or int(nb_attr) > 1000:
                    me = create_new_event()
                else:
                    event_id = response['response'][0]['id']
        else:
            me = create_new_event()

    parameters = {'banned-ip': args.banned_ip, 'attack-type': args.attack_type}
    if args.processing_timestamp:
        parameters['processing-timestamp'] = args.processing_timestamp
    if args.failures:
        parameters['failures'] = args.failures
    if args.sensor:
        parameters['sensor'] = args.sensor
    if args.victim:
        parameters['victim'] = args.victim
    if args.logline:
        parameters['logline'] = b64decode(args.logline).decode()
    f2b = Fail2BanObject(parameters=parameters, standalone=False)
    if me:
        me.add_object(f2b)
        pymisp.add_event(me)
    elif event_id:
        template_id = pymisp.get_object_template_id(f2b.template_uuid)
        a = pymisp.add_object(event_id, template_id, f2b)
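The event-reuse decision in the script above (start a fresh event when the last matching one is from a previous day, or already holds more than 1000 attributes) can be isolated as a small helper. This is a sketch with a made-up function name, not part of the script:

```python
from datetime import date


def should_create_new_event(last_event_date, nb_attr, today=None):
    """Mirror the reuse rule: new event if the last one is stale or too big."""
    today = today or date.today()
    return last_event_date < today or int(nb_attr) > 1000


# Yesterday's event is stale, so a new one is needed:
print(should_create_new_event(date(2017, 1, 1), 10, today=date(2017, 1, 2)))   # True
# Today's small event gets reused:
print(should_create_new_event(date(2017, 1, 2), 10, today=date(2017, 1, 2)))   # False
```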
@@ -0,0 +1,41 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from pymisp import PyMISP
from pymisp.tools import make_binary_objects
import traceback
from keys import misp_url, misp_key, misp_verifycert
import glob
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Extract indicators out of binaries and add MISP objects to a MISP instance.')
    parser.add_argument("-e", "--event", required=True, help="Event ID to update.")
    parser.add_argument("-p", "--path", required=True, help="Path to process (expanded using glob).")
    args = parser.parse_args()

    pymisp = PyMISP(misp_url, misp_key, misp_verifycert)

    for f in glob.glob(args.path):
        try:
            fo, peo, seos = make_binary_objects(f)
        except Exception:
            traceback.print_exc()
            continue

        if seos:
            for s in seos:
                template_id = pymisp.get_object_template_id(s.template_uuid)
                r = pymisp.add_object(args.event, template_id, s)

        if peo:
            template_id = pymisp.get_object_template_id(peo.template_uuid)
            r = pymisp.add_object(args.event, template_id, peo)
            for ref in peo.ObjectReference:
                r = pymisp.add_object_reference(ref)

        if fo:
            template_id = pymisp.get_object_template_id(fo.template_uuid)
            response = pymisp.add_object(args.event, template_id, fo)
            for ref in fo.ObjectReference:
                r = pymisp.add_object_reference(ref)
@@ -0,0 +1,32 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import json
from pymisp import PyMISP
from pymisp.tools import GenericObjectGenerator
from keys import misp_url, misp_key, misp_verifycert
import argparse

"""
Sample usage:
./add_generic_object.py -e 5065 -t email -l '[{"to": "undisclosed@ppp.com"}, {"to": "second.to@mail.com"}]'
"""

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Create a MISP object selectable by type, starting from a dictionary')
    parser.add_argument("-e", "--event", required=True, help="Event ID to update")
    parser.add_argument("-t", "--type", required=True, help="Type of the generic object")
    parser.add_argument("-l", "--attr_list", required=True, help="List of attributes")
    args = parser.parse_args()

    pymisp = PyMISP(misp_url, misp_key, misp_verifycert)
    try:
        template_id = [x['ObjectTemplate']['id'] for x in pymisp.get_object_templates_list() if x['ObjectTemplate']['name'] == args.type][0]
    except IndexError:
        valid_types = ", ".join([x['ObjectTemplate']['name'] for x in pymisp.get_object_templates_list()])
        print("Template for type %s not found! Valid types are: %s" % (args.type, valid_types))
        exit()

    misp_object = GenericObjectGenerator(args.type.replace("|", "-"))
    misp_object.generate_attributes(json.loads(args.attr_list))
    r = pymisp.add_object(args.event, template_id, misp_object)
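The template lookup above is a plain list comprehension over the structures returned by `get_object_templates_list()`, with `IndexError` signalling "no such template". The same pattern run against a hand-made sample list (the IDs and names below are illustrative, not real template data):

```python
# Sample of the structure returned by get_object_templates_list()
templates = [{'ObjectTemplate': {'id': '12', 'name': 'email'}},
             {'ObjectTemplate': {'id': '30', 'name': 'file'}}]

wanted = 'file'
try:
    # Take the id of the first template whose name matches
    template_id = [x['ObjectTemplate']['id'] for x in templates
                   if x['ObjectTemplate']['name'] == wanted][0]
except IndexError:
    template_id = None  # the script prints the valid names and exits here

print(template_id)  # 30
```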
@@ -0,0 +1,16 @@
import json
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
from pymisp.tools import SBSignatureObject

pymisp = PyMISP(misp_url, misp_key, misp_verifycert)
a = json.loads('{"signatures":[{"new_data":[],"confidence":100,"families":[],"severity":1,"weight":0,"description":"Attempts to connect to a dead IP:Port (2 unique times)","alert":false,"references":[],"data":[{"IP":"95.101.39.58:80 (Europe)"},{"IP":"192.35.177.64:80 (United States)"}],"name":"dead_connect"},{"new_data":[],"confidence":30,"families":[],"severity":2,"weight":1,"description":"Performs some HTTP requests","alert":false,"references":[],"data":[{"url":"http://cert.int-x3.letsencrypt.org/"},{"url":"http://apps.identrust.com/roots/dstrootcax3.p7c"}],"name":"network_http"},{"new_data":[],"confidence":100,"families":[],"severity":2,"weight":1,"description":"The office file has a unconventional code page: ANSI Cyrillic; Cyrillic (Windows)","alert":false,"references":[],"data":[],"name":"office_code_page"}]}')
a = [(x['name'], x['description']) for x in a["signatures"]]


b = SBSignatureObject(a)


template_id = [x['ObjectTemplate']['id'] for x in pymisp.get_object_templates_list() if x['ObjectTemplate']['name'] == 'sb-signature'][0]

pymisp.add_object(234111, template_id, b)
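The list comprehension above projects each sandbox signature down to a `(name, description)` tuple before handing the list to `SBSignatureObject`. The same transformation on a shortened, hand-written sample report:

```python
import json

# A trimmed stand-in for a sandbox report (illustrative data)
report = json.loads('{"signatures": ['
                    '{"name": "dead_connect", "description": "Attempts to connect to a dead IP:Port"},'
                    '{"name": "network_http", "description": "Performs some HTTP requests"}]}')

# Keep only the fields SBSignatureObject consumes
pairs = [(x['name'], x['description']) for x in report['signatures']]
print(pairs[0][0])  # dead_connect
```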
@@ -0,0 +1,114 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import argparse
from datetime import date
import importlib

from pymisp import MISPEvent
from defang import defang
from pytaxonomies import Taxonomies


class ReportGenerator:
    def __init__(self, profile="daily_report"):
        self.taxonomies = Taxonomies()
        self.report = ''
        profile_name = "profiles.{}".format(profile)
        self.template = importlib.import_module(name=profile_name)

    def from_remote(self, event_id):
        from pymisp import PyMISP
        from keys import misp_url, misp_key, misp_verifycert
        misp = PyMISP(misp_url, misp_key, misp_verifycert)
        result = misp.get(event_id)
        self.misp_event = MISPEvent()
        self.misp_event.load(result)

    def from_file(self, path):
        self.misp_event = MISPEvent()
        self.misp_event.load_file(path)

    def attributes(self):
        if not self.misp_event.attributes:
            return ''
        list_attributes = []
        for attribute in self.misp_event.attributes:
            if attribute.type in self.template.types_to_attach:
                list_attributes.append("* {}".format(defang(attribute.value)))
        for obj in self.misp_event.Object:
            if obj.name in self.template.objects_to_attach:
                for attribute in obj.Attribute:
                    if attribute.type in self.template.types_to_attach:
                        list_attributes.append("* {}".format(defang(attribute.value)))
        return self.template.attributes.format(list_attributes="\n".join(list_attributes))

    def _get_tag_info(self, machinetag):
        return self.taxonomies.revert_machinetag(machinetag)

    def report_headers(self):
        content = {'org_name': 'name',
                   'date': date.today().isoformat()}
        self.report += self.template.headers.format(**content)

    def event_level_tags(self):
        if not self.misp_event.Tag:
            return ''
        for tag in self.misp_event.Tag:
            # Only look for TLP for now
            if tag['name'].startswith('tlp'):
                tax, predicate = self._get_tag_info(tag['name'])
                return self.template.event_level_tags.format(value=predicate.predicate.upper(), expanded=predicate.expanded)
        # No TLP tag found: contribute nothing to the report
        return ''

    def title(self):
        internal_id = ''
        summary = ''
        # Get internal refs for report
        for obj in self.misp_event.Object:
            if obj.name != 'report':
                continue
            for a in obj.Attribute:
                if a.object_relation == 'case-number':
                    internal_id = a.value
                if a.object_relation == 'summary':
                    summary = a.value

        return self.template.title.format(internal_id=internal_id, title=self.misp_event.info,
                                          summary=summary)

    def asciidoc(self, lang='en'):
        self.report += self.title()
        self.report += self.event_level_tags()
        self.report += self.attributes()


if __name__ == '__main__':
    try:
        parser = argparse.ArgumentParser(description='Create a human-readable report out of a MISP event')
        parser.add_argument("--profile", default="daily_report", help="Profile template to use")
        parser.add_argument("-o", "--output", help="Output file to write to (generally ends in .adoc)")
        group = parser.add_mutually_exclusive_group(required=True)
        group.add_argument("-e", "--event", default=[], nargs='+', help="Event ID to get.")
        group.add_argument("-p", "--path", default=[], nargs='+', help="Path to the JSON dump.")

        args = parser.parse_args()

        report = ReportGenerator(args.profile)
        report.report_headers()

        if args.event:
            for eid in args.event:
                report.from_remote(eid)
                report.asciidoc()
        else:
            for f in args.path:
                report.from_file(f)
                report.asciidoc()

        if args.output:
            with open(args.output, "w") as ofile:
                ofile.write(report.report)
        else:
            print(report.report)
    except ModuleNotFoundError as err:
        print(err)
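The `attributes()` method above filters indicators by type and defangs each value before listing it. A runnable sketch of that loop, with a naive stand-in for `defang.defang` (the real library handles many more schemes and edge cases):

```python
def defang(value):
    """Naive stand-in for defang.defang: neutralize URLs and dots."""
    return value.replace('http', 'hxxp').replace('.', '[.]')


# What a profile's types_to_attach might look like (illustrative)
types_to_attach = ['ip-src', 'url']
attributes = [('ip-src', '8.8.8.8'),
              ('md5', 'ignored, not in types_to_attach'),
              ('url', 'http://example.com/')]

# Same shape as the report's bullet list of indicators
lines = ['* {}'.format(defang(v)) for t, v in attributes if t in types_to_attach]
print('\n'.join(lines))
```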
@@ -0,0 +1,14 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from keys import misp_url, misp_key, misp_verifycert
from pymisp import PyMISP


def init(url, key):
    return PyMISP(url, key, misp_verifycert, 'json')


if __name__ == '__main__':
    misp = init(misp_url, misp_key)
    misp.cache_all_feeds()
@@ -19,8 +19,8 @@ if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Create an event on MISP.')
    parser.add_argument("-d", "--distrib", type=int, help="The distribution setting used for the attributes and for the newly created event, if relevant. [0-3].")
    parser.add_argument("-i", "--info", help="Used to populate the event info field if no event ID supplied.")
-    parser.add_argument("-a", "--analysis", type=int, help="The analysis level of the newly created event, if applicatble. [0-2]")
-    parser.add_argument("-t", "--threat", type=int, help="The threat level ID of the newly created event, if applicatble. [1-4]")
+    parser.add_argument("-a", "--analysis", type=int, help="The analysis level of the newly created event, if applicable. [0-2]")
+    parser.add_argument("-t", "--threat", type=int, help="The threat level ID of the newly created event, if applicable. [1-4]")
    args = parser.parse_args()

    misp = init(misp_url, misp_key)
@@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-

from pymisp import PyMISP
-from keys import misp_url, misp_key
+from keys import misp_url, misp_key, misp_verifycert
import argparse

@@ -10,7 +10,7 @@ import argparse

def init(url, key):
-    return PyMISP(url, key, True, 'json', debug=True)
+    return PyMISP(url, key, misp_verifycert, 'json', debug=True)


def del_event(m, eventid):
@@ -0,0 +1,89 @@
import redis
import json


class MISPItemToRedis:
    """This class provides a simple normalization to add MISP items to
    redis, so that they can easily be processed and added to MISP later on."""
    SUFFIX_SIGH = '_sighting'
    SUFFIX_ATTR = '_attribute'
    SUFFIX_OBJ = '_object'
    SUFFIX_LIST = [SUFFIX_SIGH, SUFFIX_ATTR, SUFFIX_OBJ]

    def __init__(self, keyname, host='localhost', port=6379, db=0):
        self.host = host
        self.port = port
        self.db = db
        self.keyname = keyname
        self.serv = redis.StrictRedis(self.host, self.port, self.db)

    def push_json(self, jdata, keyname, action):
        all_action = [s.lstrip('_') for s in self.SUFFIX_LIST]
        if action not in all_action:
            raise ValueError('Invalid action. (Allowed: {})'.format(all_action))
        key = keyname + '_' + action
        self.serv.lpush(key, jdata)

    def push_attribute(self, type_value, value, category=None, to_ids=False,
                       comment=None, distribution=None, proposal=False, **kwargs):
        to_push = {}
        to_push['type'] = type_value
        to_push['value'] = value
        if category is not None:
            to_push['category'] = category
        if to_ids is not None:
            to_push['to_ids'] = to_ids
        if comment is not None:
            to_push['comment'] = comment
        if distribution is not None:
            to_push['distribution'] = distribution
        if proposal is not None:
            to_push['proposal'] = proposal
        for k, v in kwargs.items():
            to_push[k] = v
        key = self.keyname + self.SUFFIX_ATTR
        self.serv.lpush(key, json.dumps(to_push))

    def push_attribute_obj(self, MISP_Attribute, keyname):
        key = keyname + self.SUFFIX_ATTR
        jdata = MISP_Attribute.to_json()
        self.serv.lpush(key, jdata)

    def push_object(self, dict_values):
        # check that the 'name' field is present
        if 'name' not in dict_values:
            print("Error: JSON must contain the field 'name'")
        key = self.keyname + self.SUFFIX_OBJ
        self.serv.lpush(key, json.dumps(dict_values))

    def push_object_obj(self, MISP_Object, keyname):
        key = keyname + self.SUFFIX_OBJ
        jdata = MISP_Object.to_json()
        self.serv.lpush(key, jdata)

    def push_sighting(self, value=None, uuid=None, id=None, source=None,
                      type=0, timestamp=None, **kargs):
        to_push = {}
        if value is not None:
            to_push['value'] = value
        if uuid is not None:
            to_push['uuid'] = uuid
        if id is not None:
            to_push['id'] = id
        if source is not None:
            to_push['source'] = source
        if type is not None:
            to_push['type'] = type
        if timestamp is not None:
            to_push['timestamp'] = timestamp

        for k, v in kargs.items():
            if v is not None:
                to_push[k] = v
        key = self.keyname + self.SUFFIX_SIGH
        self.serv.lpush(key, json.dumps(to_push))

    def push_sighting_obj(self, MISP_Sighting, keyname):
        key = keyname + self.SUFFIX_SIGH
        jdata = MISP_Sighting.to_json()
        self.serv.lpush(key, jdata)
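The normalization above boils down to "serialize a dict to JSON and `lpush` it under `keyname + suffix`". A minimal runnable sketch of `push_attribute`, using an in-memory stand-in so no redis server is needed (`FakeRedis` is hypothetical, mimicking just the `lpush` call of `redis.StrictRedis`):

```python
import json


class FakeRedis:
    """In-memory stand-in for redis.StrictRedis, just enough for lpush."""
    def __init__(self):
        self.lists = {}

    def lpush(self, key, value):
        # lpush prepends, like the real command
        self.lists.setdefault(key, []).insert(0, value)


# Mimic MISPItemToRedis.push_attribute against the stub:
serv = FakeRedis()
keyname = 'misp_feed'
to_push = {'type': 'ip-src', 'value': '8.8.8.8',
           'category': 'Network activity', 'to_ids': False,
           'proposal': False}
serv.lpush(keyname + '_attribute', json.dumps(to_push))

stored = json.loads(serv.lists['misp_feed_attribute'][0])
print(stored['value'])  # 8.8.8.8
```

The suffix (`_attribute`, `_object`, `_sighting`) is what lets the consumer know how to re-inject each item into MISP.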
@@ -0,0 +1,32 @@
#!/usr/bin/env python3

import time

from pymisp.tools.abstractgenerator import AbstractMISPObjectGenerator


class CowrieMISPObject(AbstractMISPObjectGenerator):
    def __init__(self, dico_val, **kargs):
        self._dico_val = dico_val
        self.name = "cowrie"

        # Enforce attribute date with timestamp
        super(CowrieMISPObject, self).__init__(
            'cowrie',
            default_attributes_parameters={'timestamp': int(time.time())},
            **kargs)
        self.generate_attributes()

    def generate_attributes(self):
        skip_list = ['time', 'duration', 'isError', 'ttylog']
        for object_relation, value in self._dico_val.items():
            if object_relation in skip_list or 'log_' in object_relation:
                continue

            if object_relation == 'timestamp':
                # Date already in ISO format, remove the trailing Z
                value = value.rstrip('Z')

            if isinstance(value, dict):
                self.add_attribute(object_relation, **value)
            else:
                self.add_attribute(object_relation, value=value)
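The interesting part of the generator above is the filtering in `generate_attributes`: drop noisy fields, normalize the timestamp, keep the rest. That logic can be isolated as a plain function and run without pymisp (`filter_cowrie` is a made-up name for illustration):

```python
def filter_cowrie(dico_val):
    """Replicate the skip/normalize rules of CowrieMISPObject.generate_attributes."""
    skip_list = ['time', 'duration', 'isError', 'ttylog']
    out = {}
    for object_relation, value in dico_val.items():
        if object_relation in skip_list or 'log_' in object_relation:
            continue
        if object_relation == 'timestamp':
            # Date already in ISO format, remove the trailing Z
            value = value.rstrip('Z')
        out[object_relation] = value
    return out


print(filter_cowrie({'timestamp': '2017-06-02T13:37:00Z',
                     'session': 'abc123',
                     'ttylog': 'skipped',
                     'log_line': 'skipped too'}))
```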
@ -0,0 +1,84 @@
|
|||
# Generic MISP feed generator
|
||||
## Description
|
||||
|
||||
- ``generator.py`` exposes a class allowing to generate a MISP feed in real time, where each items can be added on daily generated events.
|
||||
- ``fromredis.py`` uses ``generator.py`` to generate a MISP feed based on data stored in redis.
|
||||
- ``server.py`` is a simple script using *Flask_autoindex* to serve data to MISP.
|
||||
- ``MISPItemToRedis.py`` permits to push (in redis) items to be added in MISP by the ``fromredis.py`` script.
|
||||
|
||||
|
||||
## Installation
|
||||
|
||||
````
|
||||
# Feed generator
|
||||
git clone https://github.com/CIRCL/PyMISP
|
||||
cd examples/feed-generator-from-redis
|
||||
cp settings.default.py settings.py
|
||||
vi settings.py # adjust your settings
|
||||
|
||||
python3 fromredis.py
|
||||
|
||||
# Serving file to MISP
|
||||
bash install.sh
|
||||
. ./serv-env/bin/activate
|
||||
python3 server.py
|
||||
````
|
||||
|
||||
|
||||
## Usage
|
||||
|
||||
```
|
||||
# Activate virtualenv
|
||||
. ./serv-env/bin/activate
|
||||
```
|
||||
|
||||
### Adding items to MISP
|
||||
|
||||
```
|
||||
# create helper object
|
||||
>>> helper = MISPItemToRedis("redis_list_keyname")
|
||||
|
||||
# push an attribute to redis
|
||||
>>> helper.push_attribute("ip-src", "8.8.8.8", category="Network activity")
|
||||
|
||||
# push an object to redis
|
||||
>>> helper.push_object({ "name": "cowrie", "session": "session_id", "username": "admin", "password": "admin", "protocol": "telnet" })
|
||||
|
||||
# push a sighting to redis
|
||||
>>> helper.push_sighting(uuid="5a9e9e26-fe40-4726-8563-5585950d210f")
|
||||
```
|
||||
|
||||
### Generate the feed
|
||||
|
||||
```
|
||||
# Create the FeedGenerator object using the configuration provided in the file settings.py
|
||||
# It will create a daily event to which attributes and objects will be added
|
||||
>>> generator = FeedGenerator()
|
||||
|
||||
# Add an attribute to the daily event
|
||||
>>> attr_type = "ip-src"
|
||||
>>> attr_value = "8.8.8.8"
|
||||
>>> additional_data = {}
|
||||
>>> generator.add_attribute_to_event(attr_type, attr_value, **additional_data)
|
||||
|
||||
# Add a cowrie object to the daily event
|
||||
>>> obj_name = "cowrie"
|
||||
>>> obj_data = { "session": "session_id", "username": "admin", "password": "admin", "protocol": "telnet" }
|
||||
>>> generator.add_object_to_event(obj_name, **obj_data)
|
||||
|
||||
# Immediately write the event to disk (bypassing the default flushing behavior)
|
||||
>>> generator.flush_event()
|
||||
```
|
||||
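Internally the generator buffers additions and only rewrites the event file once ``flushing_interval`` seconds have elapsed. A minimal sketch of that pattern (class and names are illustrative; the real ``flush_event`` writes the event JSON to disk):

```python
import time

# Minimal sketch of the interval-based flushing used by FeedGenerator:
# each addition is cheap, and the event file is only rewritten once the
# flushing interval has elapsed (flush() is stubbed out here).
class IntervalFlusher:
    def __init__(self, interval):
        self.interval = interval
        self.next_flush = time.time() + interval
        self.flush_count = 0

    def after_addition(self):
        # Called after every attribute/object addition, mirroring
        # FeedGenerator._after_addition().
        now = time.time()
        if self.next_flush <= now:
            self.flush()
            self.next_flush = now + self.interval

    def flush(self):
        self.flush_count += 1

flusher = IntervalFlusher(interval=0.0)  # interval 0 -> flush on every addition
flusher.after_addition()
print(flusher.flush_count)  # 1
```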
|
||||
### Consume stored data in redis
|
||||
|
||||
```
|
||||
# Configuration provided in the file settings.py
|
||||
python3 fromredis.py
|
||||
```
|
||||
|
||||
### Serve data to MISP
|
||||
|
||||
```
|
||||
python3 server.py
|
||||
```
|
|
@ -0,0 +1,131 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import sys
|
||||
import json
|
||||
import argparse
|
||||
import datetime
|
||||
import time
|
||||
import redis
|
||||
|
||||
import settings
|
||||
|
||||
from generator import FeedGenerator
|
||||
|
||||
|
||||
def beautyful_sleep(sleep, additional):
|
||||
length = 20
|
||||
sleeptime = float(sleep) / float(length)
|
||||
for i in range(length):
|
||||
temp_string = '|'*i + ' '*(length-i-1)
|
||||
print('sleeping [{}]\t{}'.format(temp_string, additional), end='\r', sep='')
|
||||
sys.stdout.flush()
|
||||
time.sleep(sleeptime)
|
||||
|
||||
|
||||
class RedisToMISPFeed:
|
||||
SUFFIX_SIGH = '_sighting'
|
||||
SUFFIX_ATTR = '_attribute'
|
||||
SUFFIX_OBJ = '_object'
|
||||
SUFFIX_LIST = [SUFFIX_SIGH, SUFFIX_ATTR, SUFFIX_OBJ]
|
||||
|
||||
def __init__(self):
|
||||
self.host = settings.host
|
||||
self.port = settings.port
|
||||
self.db = settings.db
|
||||
self.serv = redis.StrictRedis(self.host, self.port, self.db, decode_responses=True)
|
||||
|
||||
self.generator = FeedGenerator()
|
||||
|
||||
self.keynames = []
|
||||
for k in settings.keyname_pop:
|
||||
for s in self.SUFFIX_LIST:
|
||||
self.keynames.append(k+s)
|
||||
|
||||
self.keynameError = settings.keyname_error
|
||||
|
||||
self.update_last_action("Init system")
|
||||
|
||||
def consume(self):
|
||||
self.update_last_action("Started consuming redis")
|
||||
while True:
|
||||
for key in self.keynames:
|
||||
while True:
|
||||
data = self.pop(key)
|
||||
if data is None:
|
||||
break
|
||||
try:
|
||||
self.perform_action(key, data)
|
||||
except Exception as error:
|
||||
self.save_error_to_redis(error, data)
|
||||
|
||||
beautyful_sleep(5, self.format_last_action())
|
||||
|
||||
def pop(self, key):
|
||||
popped = self.serv.rpop(key)
|
||||
if popped is None:
|
||||
return None
|
||||
try:
|
||||
popped = json.loads(popped)
|
||||
except ValueError as error:
|
||||
self.save_error_to_redis(error, popped)
|
||||
return popped
|
||||
|
||||
def perform_action(self, key, data):
|
||||
# sighting
|
||||
if key.endswith(self.SUFFIX_SIGH):
|
||||
if self.generator.add_sighting_on_attribute(None, data.pop('uuid', None), **data):
|
||||
self.update_last_action("Added sighting")
|
||||
else:
|
||||
self.update_last_action("Error while adding sighting")
|
||||
|
||||
# attribute
|
||||
elif key.endswith(self.SUFFIX_ATTR):
|
||||
attr_type = data.pop('type')
|
||||
attr_value = data.pop('value')
|
||||
if self.generator.add_attribute_to_event(attr_type, attr_value, **data):
|
||||
self.update_last_action("Added attribute")
|
||||
else:
|
||||
self.update_last_action("Error while adding attribute")
|
||||
|
||||
# object
|
||||
elif key.endswith(self.SUFFIX_OBJ):
|
||||
# create the MISP object
|
||||
obj_name = data.pop('name')
|
||||
if self.generator.add_object_to_event(obj_name, **data):
|
||||
self.update_last_action("Added object")
|
||||
else:
|
||||
self.update_last_action("Error while adding object")
|
||||
|
||||
else:
|
||||
# Suffix not valid
|
||||
self.update_last_action("Redis key suffix not supported")
|
||||
|
||||
# OTHERS
|
||||
def update_last_action(self, action):
|
||||
self.last_action = action
|
||||
self.last_action_time = datetime.datetime.now()
|
||||
|
||||
def format_last_action(self):
|
||||
return "Last action: [{}] @ {}".format(
|
||||
self.last_action,
|
||||
self.last_action_time.isoformat().replace('T', ' '),
|
||||
)
|
||||
|
||||
|
||||
def save_error_to_redis(self, error, item):
|
||||
to_push = {'error': str(error), 'item': str(item)}
|
||||
print('Error:', str(error), '\nOn adding:', item)
|
||||
self.serv.lpush(self.keynameError, to_push)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description="Pop items from redis and add "
|
||||
+ "them to the MISP feed. By default, each action is pushed into a "
|
||||
+ "daily named event. Configuration taken from the file settings.py.")
|
||||
args = parser.parse_args()
|
||||
|
||||
redisToMISP = RedisToMISPFeed()
|
||||
redisToMISP.consume()
|
|
@ -0,0 +1,277 @@
|
|||
#!/usr/bin/env python
|
||||
|
||||
import sys
|
||||
import json
|
||||
import os
|
||||
import hashlib
|
||||
import datetime
|
||||
import time
|
||||
import uuid
|
||||
|
||||
from pymisp import MISPEvent
|
||||
|
||||
import settings
|
||||
|
||||
|
||||
def get_system_templates():
|
||||
"""Fetch all MISP-Object templates present on the local system.
|
||||
|
||||
Returns:
|
||||
dict: A dictionary listing all MISP-Object templates
|
||||
|
||||
"""
|
||||
misp_objects_path = os.path.join(
|
||||
os.path.abspath(os.path.dirname(sys.modules['pymisp'].__file__)),
|
||||
'data', 'misp-objects', 'objects')
|
||||
|
||||
templates = {}
|
||||
for root, dirs, files in os.walk(misp_objects_path, topdown=False):
|
||||
for def_file in files:
|
||||
obj_name = root.split('/')[-1]
|
||||
template_path = os.path.join(root, def_file)
|
||||
with open(template_path, 'r') as f:
|
||||
definition = json.load(f)
|
||||
templates[obj_name] = definition
|
||||
return templates
|
||||
|
||||
|
||||
def gen_uuid():
|
||||
"""Generate a random UUID and return its string representation"""
|
||||
return str(uuid.uuid4())
|
||||
|
||||
|
||||
class FeedGenerator:
|
||||
"""Helper object to create MISP feed.
|
||||
|
||||
Configuration taken from the file settings.py"""
|
||||
|
||||
def __init__(self):
|
||||
"""This object can be used to easily create a daily MISP feed.
|
||||
|
||||
It handles the event creation, manifest file and cache file
|
||||
(hashes.csv).
|
||||
|
||||
"""
|
||||
self.sys_templates = get_system_templates()
|
||||
self.constructor_dict = settings.constructor_dict
|
||||
|
||||
self.flushing_interval = settings.flushing_interval
|
||||
self.flushing_next = time.time() + self.flushing_interval
|
||||
|
||||
self.manifest = {}
|
||||
self.attributeHashes = []
|
||||
|
||||
self.daily_event_name = settings.daily_event_name + ' {}'
|
||||
event_date_str, self.current_event_uuid, self.event_name = self.get_last_event_from_manifest()
|
||||
temp = [int(x) for x in event_date_str.split('-')]
|
||||
self.current_event_date = datetime.date(temp[0], temp[1], temp[2])
|
||||
self.current_event = self._get_event_from_id(self.current_event_uuid)
|
||||
|
||||
def add_sighting_on_attribute(self, sight_type, attr_uuid, **data):
|
||||
"""Add a sighting on an attribute.
|
||||
|
||||
Not supported for the moment."""
|
||||
self.update_daily_event_id()
|
||||
self._after_addition()
|
||||
return False
|
||||
|
||||
def add_attribute_to_event(self, attr_type, attr_value, **attr_data):
|
||||
"""Add an attribute to the daily event"""
|
||||
self.update_daily_event_id()
|
||||
self.current_event.add_attribute(attr_type, attr_value, **attr_data)
|
||||
self._add_hash(attr_type, attr_value)
|
||||
self._after_addition()
|
||||
return True
|
||||
|
||||
def add_object_to_event(self, obj_name, **data):
|
||||
"""Add an object to the daily event"""
|
||||
self.update_daily_event_id()
|
||||
if obj_name not in self.sys_templates:
|
||||
print('Unknown object template')
|
||||
return False
|
||||
|
||||
# Get MISP object constructor
|
||||
obj_constr = self.constructor_dict.get(obj_name, None)
|
||||
# Constructor not known, using the generic one
|
||||
if obj_constr is None:
|
||||
obj_constr = self.constructor_dict.get('generic')
|
||||
misp_object = obj_constr(obj_name)
|
||||
# Fill generic object
|
||||
for k, v in data.items():
|
||||
# attribute is not in the object template definition
|
||||
if k not in self.sys_templates[obj_name]['attributes']:
|
||||
# add it with type text
|
||||
misp_object.add_attribute(k, **{'value': v, 'type': 'text'})
|
||||
else:
|
||||
misp_object.add_attribute(k, **{'value': v})
|
||||
|
||||
else:
|
||||
misp_object = obj_constr(data)
|
||||
|
||||
self.current_event.add_object(misp_object)
|
||||
for attr_type, attr_value in data.items():
|
||||
self._add_hash(attr_type, attr_value)
|
||||
|
||||
self._after_addition()
|
||||
return True
|
||||
|
||||
def _after_addition(self):
|
||||
"""Write the event to disk once the flushing interval has elapsed"""
|
||||
now = time.time()
|
||||
if self.flushing_next <= now:
|
||||
self.flush_event()
|
||||
self.flushing_next = now + self.flushing_interval
|
||||
|
||||
# Cache
|
||||
def _add_hash(self, attr_type, attr_value):
|
||||
if ('|' in attr_type or attr_type == 'malware-sample'):
|
||||
split = attr_value.split('|')
|
||||
self.attributeHashes.append([
|
||||
hashlib.md5(str(split[0]).encode("utf-8")).hexdigest(),
|
||||
self.current_event_uuid
|
||||
])
|
||||
self.attributeHashes.append([
|
||||
hashlib.md5(str(split[1]).encode("utf-8")).hexdigest(),
|
||||
self.current_event_uuid
|
||||
])
|
||||
else:
|
||||
self.attributeHashes.append([
|
||||
hashlib.md5(str(attr_value).encode("utf-8")).hexdigest(),
|
||||
self.current_event_uuid
|
||||
])
|
||||
|
||||
# Manifest
|
||||
def _init_manifest(self):
|
||||
# create an empty manifest
|
||||
with open(os.path.join(settings.outputdir, 'manifest.json'), 'w'):
|
||||
pass
|
||||
# create new event and save manifest
|
||||
self.create_daily_event()
|
||||
|
||||
def flush_event(self, new_event=None):
|
||||
print('Writing event on disk' + ' ' * 50)
|
||||
if new_event is not None:
|
||||
event_uuid = new_event['uuid']
|
||||
event = new_event
|
||||
else:
|
||||
event_uuid = self.current_event_uuid
|
||||
event = self.current_event
|
||||
|
||||
eventFile = open(os.path.join(settings.outputdir, event_uuid+'.json'), 'w')
|
||||
eventFile.write(event.to_json())
|
||||
eventFile.close()
|
||||
|
||||
self.save_hashes()
|
||||
|
||||
def save_manifest(self):
|
||||
try:
|
||||
manifestFile = open(os.path.join(settings.outputdir, 'manifest.json'), 'w')
|
||||
manifestFile.write(json.dumps(self.manifest))
|
||||
manifestFile.close()
|
||||
print('Manifest saved')
|
||||
except Exception as e:
|
||||
print(e)
|
||||
sys.exit('Could not create the manifest file.')
|
||||
|
||||
def save_hashes(self):
|
||||
if len(self.attributeHashes) == 0:
|
||||
return False
|
||||
try:
|
||||
hashFile = open(os.path.join(settings.outputdir, 'hashes.csv'), 'a')
|
||||
for element in self.attributeHashes:
|
||||
hashFile.write('{},{}\n'.format(element[0], element[1]))
|
||||
hashFile.close()
|
||||
self.attributeHashes = []
|
||||
print('Hash saved' + ' '*30)
|
||||
except Exception as e:
|
||||
print(e)
|
||||
sys.exit('Could not create the quick hash lookup file.')
|
||||
|
||||
def _addEventToManifest(self, event):
|
||||
event_dict = event.to_dict()['Event']
|
||||
tags = []
|
||||
for eventTag in event_dict.get('EventTag', []):
|
||||
tags.append({'name': eventTag['Tag']['name'],
|
||||
'colour': eventTag['Tag']['colour']})
|
||||
return {
|
||||
'Orgc': event_dict.get('Orgc', []),
|
||||
'Tag': tags,
|
||||
'info': event_dict['info'],
|
||||
'date': event_dict['date'],
|
||||
'analysis': event_dict['analysis'],
|
||||
'threat_level_id': event_dict['threat_level_id'],
|
||||
'timestamp': event_dict.get('timestamp', int(time.time()))
|
||||
}
|
||||
|
||||
def get_last_event_from_manifest(self):
|
||||
"""Retrieve last event from the manifest.
|
||||
|
||||
If the manifest doesn't exist or is empty, initialize it.
|
||||
|
||||
"""
|
||||
try:
|
||||
manifest_path = os.path.join(settings.outputdir, 'manifest.json')
|
||||
with open(manifest_path, 'r') as f:
|
||||
man = json.load(f)
|
||||
dated_events = []
|
||||
for event_uuid, event_json in man.items():
|
||||
# add events to manifest
|
||||
self.manifest[event_uuid] = event_json
|
||||
dated_events.append([
|
||||
event_json['date'],
|
||||
event_uuid,
|
||||
event_json['info']
|
||||
])
|
||||
# Sort by date then by event name
|
||||
dated_events.sort(key=lambda k: (k[0], k[2]), reverse=True)
|
||||
return dated_events[0]
|
||||
except FileNotFoundError as e:
|
||||
print('Manifest not found, generating a fresh one')
|
||||
self._init_manifest()
|
||||
return self.get_last_event_from_manifest()
|
||||
|
||||
# DAILY
|
||||
def update_daily_event_id(self):
|
||||
if self.current_event_date != datetime.date.today(): # create new event
|
||||
# save current event on disk
|
||||
self.flush_event()
|
||||
self.current_event = self.create_daily_event()
|
||||
self.current_event_date = datetime.date.today()
|
||||
self.current_event_uuid = self.current_event.get('uuid')
|
||||
self.event_name = self.current_event.info
|
||||
|
||||
def _get_event_from_id(self, event_uuid):
|
||||
with open(os.path.join(settings.outputdir, '%s.json' % event_uuid), 'r') as f:
|
||||
event_dict = json.load(f)['Event']
|
||||
event = MISPEvent()
|
||||
event.from_dict(**event_dict)
|
||||
return event
|
||||
|
||||
def create_daily_event(self):
|
||||
new_uuid = gen_uuid()
|
||||
today = str(datetime.date.today())
|
||||
event_dict = {
|
||||
'uuid': new_uuid,
|
||||
'id': len(self.manifest)+1,
|
||||
'Tag': settings.Tag,
|
||||
'info': self.daily_event_name.format(today),
|
||||
'analysis': settings.analysis, # [0-2]
|
||||
'threat_level_id': settings.threat_level_id, # [1-4]
|
||||
'published': settings.published,
|
||||
'date': today
|
||||
}
|
||||
event = MISPEvent()
|
||||
event.from_dict(**event_dict)
|
||||
|
||||
# reference org
|
||||
org_dict = {}
|
||||
org_dict['name'] = settings.org_name
|
||||
org_dict['uuid'] = settings.org_uuid
|
||||
event['Orgc'] = org_dict
|
||||
|
||||
# save event on disk
|
||||
self.flush_event(new_event=event)
|
||||
# add event to manifest
|
||||
self.manifest[event['uuid']] = self._addEventToManifest(event)
|
||||
self.save_manifest()
|
||||
return event
|
|
@ -0,0 +1,4 @@
|
|||
#!/bin/bash
|
||||
virtualenv -p python3 serv-env
|
||||
. ./serv-env/bin/activate
|
||||
pip3 install -U flask Flask-AutoIndex redis
|
|
@ -0,0 +1,12 @@
|
|||
#!/usr/bin/env python3
|
||||
|
||||
import os.path
|
||||
from flask import Flask
|
||||
from flask_autoindex import AutoIndex
|
||||
from settings import outputdir
|
||||
|
||||
app = Flask(__name__)
|
||||
AutoIndex(app, browse_root=os.path.join(os.path.curdir, outputdir))
|
||||
|
||||
if __name__ == '__main__':
|
||||
app.run(host='0.0.0.0')
|
|
@ -0,0 +1,58 @@
|
|||
""" REDIS RELATED """
|
||||
# Your redis server
|
||||
host='127.0.0.1'
|
||||
port=6379
|
||||
db=0
|
||||
## The keynames to pop elements from
|
||||
#keyname_pop='misp_feed_generator_key'
|
||||
keyname_pop=['cowrie']
|
||||
|
||||
# OTHERS
|
||||
## How frequently the event should be written to disk
|
||||
flushing_interval=5*60
|
||||
## The redis list keyname in which to put items that generated an error
|
||||
keyname_error='feed-generation-error'
|
||||
|
||||
""" FEED GENERATOR CONFIGURATION """
|
||||
|
||||
# The output dir for the feed. This will drop a lot of files, so make
|
||||
# sure that you use a directory dedicated to the feed
|
||||
outputdir = 'output'
|
||||
|
||||
# Event meta data
|
||||
## Required
|
||||
### The organisation id that generated this feed
|
||||
org_name='myOrg'
|
||||
### Your organisation UUID
|
||||
org_uuid=''
|
||||
### The daily event name to be used in MISP.
|
||||
### (e.g. honeypot_1 will produce each day an event of the form honeypot_1 dd-mm-yyyy)
|
||||
daily_event_name='PyMISP default event name'
|
||||
|
||||
## Optional
|
||||
analysis=0
|
||||
threat_level_id=3
|
||||
published=False
|
||||
Tag=[
|
||||
{
|
||||
"colour": "#ffffff",
|
||||
"name": "tlp:white"
|
||||
},
|
||||
{
|
||||
"colour": "#ff00ff",
|
||||
"name": "my:custom:feed"
|
||||
}
|
||||
]
|
||||
|
||||
# MISP Object constructor
|
||||
from ObjectConstructor.CowrieMISPObject import CowrieMISPObject
|
||||
from pymisp.tools import GenericObjectGenerator
|
||||
|
||||
constructor_dict = {
|
||||
'cowrie': CowrieMISPObject,
|
||||
'generic': GenericObjectGenerator
|
||||
}
|
||||
|
||||
# Others
|
||||
## Redis polling time
|
||||
sleep=60
|
|
@ -0,0 +1,13 @@
|
|||
# What
|
||||
|
||||
This Python script can be used to generate a MISP feed based on an existing MISP instance.
|
||||
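The generated feed is a set of plain files in the configured output directory: a ``manifest.json`` mapping event UUIDs to their metadata, one ``<uuid>.json`` file per event, and a ``hashes.csv`` for quick value lookups. A minimal sketch of how a consumer could walk that layout (the demo manifest data below is made up):

```python
import json
import os
import tempfile

# Sketch of the feed layout the generator produces in outputdir:
# manifest.json maps event UUIDs to metadata, and each event lives
# in its own <uuid>.json file next to it (plus hashes.csv).
outputdir = tempfile.mkdtemp()
manifest = {'5a9e9e26-fe40-4726-8563-5585950d210f': {'info': 'demo event', 'date': '2018-01-01'}}
with open(os.path.join(outputdir, 'manifest.json'), 'w') as f:
    json.dump(manifest, f)

# A feed consumer (e.g. MISP itself) starts from the manifest and
# fetches each event file by its UUID.
with open(os.path.join(outputdir, 'manifest.json')) as f:
    for event_uuid, meta in json.load(f).items():
        event_file = os.path.join(outputdir, event_uuid + '.json')
        print(event_uuid, meta['info'], os.path.basename(event_file))
```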
|
||||
# Installation
|
||||
|
||||
````
|
||||
git clone https://github.com/CIRCL/PyMISP
|
||||
cd examples/feed-generator
|
||||
cp settings-default.py settings.py
|
||||
vi settings.py #adjust your settings
|
||||
python3 generate.py
|
||||
````
|
|
@ -4,28 +4,79 @@
|
|||
import sys
|
||||
import json
|
||||
import os
|
||||
import hashlib
|
||||
from pymisp import PyMISP
|
||||
from settings import url, key, ssl, outputdir, filters, valid_attribute_distribution_levels
|
||||
|
||||
objectsFields = {
|
||||
'Attribute': {
|
||||
'uuid',
|
||||
'value',
|
||||
'category',
|
||||
'type',
|
||||
'comment',
|
||||
'data',
|
||||
'timestamp',
|
||||
'to_ids'
|
||||
},
|
||||
'Event': {
|
||||
'uuid',
|
||||
'info',
|
||||
'threat_level_id',
|
||||
'analysis',
|
||||
'timestamp',
|
||||
'publish_timestamp',
|
||||
'published',
|
||||
'date'
|
||||
},
|
||||
'Object': {
|
||||
'name',
|
||||
'meta-category',
|
||||
'description',
|
||||
'template_uuid',
|
||||
'template_version',
|
||||
'uuid',
|
||||
'timestamp',
|
||||
'distribution',
|
||||
'sharing_group_id',
|
||||
'comment'
|
||||
},
|
||||
'ObjectReference': {
|
||||
'uuid',
|
||||
'timestamp',
|
||||
'relationship_type',
|
||||
'comment',
|
||||
'object_uuid',
|
||||
'referenced_uuid'
|
||||
},
|
||||
'Orgc': {
|
||||
'name',
|
||||
'uuid'
|
||||
},
|
||||
'Tag': {
|
||||
'name',
|
||||
'colour',
|
||||
'exportable'
|
||||
}
|
||||
}
|
||||
|
||||
objectsToSave = {'Orgc': {'fields': ['name', 'uuid'],
|
||||
'multiple': False,
|
||||
},
|
||||
'Tag': {'fields': ['name', 'colour', 'exportable'],
|
||||
'multiple': True,
|
||||
},
|
||||
'Attribute': {'fields': ['uuid', 'value', 'category', 'type',
|
||||
'comment', 'data', 'timestamp', 'to_ids'],
|
||||
'multiple': True,
|
||||
},
|
||||
}
|
||||
|
||||
fieldsToSave = ['uuid', 'info', 'threat_level_id', 'analysis',
|
||||
'timestamp', 'publish_timestamp', 'published',
|
||||
'date']
|
||||
objectsToSave = {
|
||||
'Orgc': {},
|
||||
'Tag': {},
|
||||
'Attribute': {
|
||||
'Tag': {}
|
||||
},
|
||||
'Object': {
|
||||
'Attribute': {
|
||||
'Tag': {}
|
||||
},
|
||||
'ObjectReference': {}
|
||||
}
|
||||
}
|
||||
|
||||
valid_attribute_distributions = []
|
||||
|
||||
attributeHashes = []
|
||||
|
||||
def init():
|
||||
# If we have an old settings.py file then this variable won't exist
|
||||
|
@ -36,61 +87,65 @@ def init():
|
|||
valid_attribute_distributions = ['0', '1', '2', '3', '4', '5']
|
||||
return PyMISP(url, key, ssl)
|
||||
|
||||
def recursiveExtract(container, containerType, leaf, eventUuid):
|
||||
temp = {}
|
||||
if containerType in ['Attribute', 'Object']:
|
||||
if (__blockByDistribution(container)):
|
||||
return False
|
||||
for field in objectsFields[containerType]:
|
||||
if field in container:
|
||||
temp[field] = container[field]
|
||||
if (containerType == 'Attribute'):
|
||||
global attributeHashes
|
||||
if ('|' in container['type'] or container['type'] == 'malware-sample'):
|
||||
split = container['value'].split('|')
|
||||
attributeHashes.append([hashlib.md5(split[0].encode("utf-8")).hexdigest(), eventUuid])
|
||||
attributeHashes.append([hashlib.md5(split[1].encode("utf-8")).hexdigest(), eventUuid])
|
||||
else:
|
||||
attributeHashes.append([hashlib.md5(container['value'].encode("utf-8")).hexdigest(), eventUuid])
|
||||
children = leaf.keys()
|
||||
for childType in children:
|
||||
childContainer = container.get(childType)
|
||||
if (childContainer):
|
||||
if (type(childContainer) is dict):
|
||||
temp[childType] = recursiveExtract(childContainer, childType, leaf[childType], eventUuid)
|
||||
else:
|
||||
temp[childType] = []
|
||||
for element in childContainer:
|
||||
processed = recursiveExtract(element, childType, leaf[childType], eventUuid)
|
||||
if (processed):
|
||||
temp[childType].append(processed)
|
||||
return temp
|
||||
|
||||
def saveEvent(misp, uuid):
|
||||
result = {}
|
||||
event = misp.get_event(uuid)
|
||||
if not event.get('Event'):
|
||||
print('Error while fetching event: {}'.format(event['message']))
|
||||
sys.exit('Could not create file for event ' + uuid + '.')
|
||||
event = __cleanUpEvent(event)
|
||||
event['Event'] = recursiveExtract(event['Event'], 'Event', objectsToSave, event['Event']['uuid'])
|
||||
event = json.dumps(event)
|
||||
eventFile = open(os.path.join(outputdir, uuid + '.json'), 'w')
|
||||
eventFile.write(event)
|
||||
eventFile.close()
|
||||
|
||||
|
||||
def __cleanUpEvent(event):
|
||||
temp = event
|
||||
event = {'Event': {}}
|
||||
__cleanupEventFields(event, temp)
|
||||
__cleanupEventObjects(event, temp)
|
||||
return event
|
||||
|
||||
|
||||
def __cleanupEventFields(event, temp):
|
||||
for field in fieldsToSave:
|
||||
if field in temp['Event'].keys():
|
||||
event['Event'][field] = temp['Event'][field]
|
||||
return event
|
||||
|
||||
|
||||
def __blockAttributeByDistribution(attribute):
|
||||
if attribute['distribution'] not in valid_attribute_distributions:
|
||||
def __blockByDistribution(element):
|
||||
if element['distribution'] not in valid_attribute_distributions:
|
||||
return True
|
||||
return False
|
||||
|
||||
def saveHashes():
|
||||
if not attributeHashes:
|
||||
return False
|
||||
try:
|
||||
hashFile = open(os.path.join(outputdir, 'hashes.csv'), 'w')
|
||||
for element in attributeHashes:
|
||||
hashFile.write('{},{}\n'.format(element[0], element[1]))
|
||||
hashFile.close()
|
||||
except Exception as e:
|
||||
print(e)
|
||||
sys.exit('Could not create the quick hash lookup file.')
|
||||
|
||||
def __cleanupEventObjects(event, temp):
|
||||
for objectType in objectsToSave.keys():
|
||||
if objectsToSave[objectType]['multiple'] is True:
|
||||
if objectType in temp['Event']:
|
||||
for objectInstance in temp['Event'][objectType]:
|
||||
if objectType == 'Attribute':
|
||||
if __blockAttributeByDistribution(objectInstance):
|
||||
continue
|
||||
tempObject = {}
|
||||
for field in objectsToSave[objectType]['fields']:
|
||||
if field in objectInstance.keys():
|
||||
tempObject[field] = objectInstance[field]
|
||||
if objectType not in event['Event']:
|
||||
event['Event'][objectType] = []
|
||||
event['Event'][objectType].append(tempObject)
|
||||
else:
|
||||
tempObject = {}
|
||||
for field in objectsToSave[objectType]['fields']:
|
||||
tempObject[field] = temp['Event'][objectType][field]
|
||||
event['Event'][objectType] = tempObject
|
||||
return event
|
||||
|
||||
|
||||
def saveManifest(manifest):
|
||||
|
@ -138,4 +193,6 @@ if __name__ == '__main__':
|
|||
print("Event " + str(counter) + "/" + str(total) + " exported.")
|
||||
counter += 1
|
||||
saveManifest(manifest)
|
||||
print('Manifest saved. Feed creation completed.')
|
||||
print('Manifest saved.')
|
||||
saveHashes()
|
||||
print('Hashes saved. Feed creation completed.')
|
||||
|
|
|
@ -16,10 +16,10 @@ outputdir = 'output'
|
|||
# you can use on the event index, such as organisation, tags, etc.
|
||||
# It uses the same joining and condition rules as the API parameters
|
||||
# For example:
|
||||
# filters = {'tag':'tlp:white|feed-export|!privint','org':'CIRCL'}
|
||||
# the above would generate a feed for all events created by CIRCL, tagged
|
||||
# tlp:white and/or feed-export but exclude anything tagged privint
|
||||
filters = {}
|
||||
# filters = {'tag':'tlp:white|feed-export|!privint','org':'CIRCL', 'published':1}
|
||||
# the above would generate a feed for all published events created by CIRCL,
|
||||
# tagged tlp:white and/or feed-export but exclude anything tagged privint
|
||||
filters = {'published':'true'}
|
||||
|
||||
|
||||
# By default all attributes will be included in the feed generation
|
||||
|
|
|
@ -0,0 +1,22 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
from pymisp import PyMISP
|
||||
from keys import misp_url, misp_key
|
||||
import argparse
|
||||
|
||||
from io import open
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description="Update a MISP event.")
|
||||
parser.add_argument("-e", "--event", required=True, help="Event ID to update.")
|
||||
parser.add_argument("-i", "--input", required=True, help="Input file")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
pymisp = PyMISP(misp_url, misp_key)
|
||||
|
||||
with open(args.input, 'r') as f:
|
||||
result = pymisp.freetext(args.event, f.read())
|
||||
print(result)
|
|
@ -0,0 +1,5 @@
|
|||
8.8.8.8
|
||||
|
||||
google.fr
|
||||
|
||||
https://gmail.com
|
|
@ -0,0 +1,67 @@
|
|||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import argparse
|
||||
import json
|
||||
|
||||
try:
|
||||
from pymisp import MISPEncode
|
||||
from pymisp.tools import make_binary_objects
|
||||
except ImportError:
|
||||
pass
|
||||
|
||||
|
||||
def check():
|
||||
missing_dependencies = {'pydeep': False, 'lief': False, 'magic': False, 'pymisp': False}
|
||||
try:
|
||||
import pymisp # noqa
|
||||
except ImportError:
|
||||
missing_dependencies['pymisp'] = 'Please install pymisp: pip install pymisp'
|
||||
try:
|
||||
import pydeep # noqa
|
||||
except ImportError:
|
||||
missing_dependencies['pydeep'] = 'Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git'
|
||||
try:
|
||||
import lief # noqa
|
||||
except ImportError:
|
||||
missing_dependencies['lief'] = 'Please install lief, documentation here: https://github.com/lief-project/LIEF'
|
||||
try:
|
||||
import magic # noqa
|
||||
except ImportError:
|
||||
missing_dependencies['magic'] = 'Please install python-magic: pip install python-magic.'
|
||||
return json.dumps(missing_dependencies)
|
||||
|
||||
|
||||
def make_objects(path):
|
||||
to_return = {'objects': [], 'references': []}
|
||||
fo, peo, seos = make_binary_objects(path)
|
||||
|
||||
if seos:
|
||||
for s in seos:
|
||||
to_return['objects'].append(s)
|
||||
if s.ObjectReference:
|
||||
to_return['references'] += s.ObjectReference
|
||||
|
||||
if peo:
|
||||
to_return['objects'].append(peo)
|
||||
if peo.ObjectReference:
|
||||
to_return['references'] += peo.ObjectReference
|
||||
|
||||
if fo:
|
||||
to_return['objects'].append(fo)
|
||||
if fo.ObjectReference:
|
||||
to_return['references'] += fo.ObjectReference
|
||||
return json.dumps(to_return, cls=MISPEncode)
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description='Extract indicators out of binaries and returns MISP objects.')
|
||||
group = parser.add_mutually_exclusive_group()
|
||||
group.add_argument("-p", "--path", help="Path to process.")
|
||||
group.add_argument("-c", "--check", action='store_true', help="Check the dependencies.")
|
||||
args = parser.parse_args()
|
||||
|
||||
if args.check:
|
||||
print(check())
|
||||
if args.path:
|
||||
obj = make_objects(args.path)
|
||||
print(obj)
|
|
@ -39,7 +39,7 @@ if __name__ == '__main__':
|
|||
args = parser.parse_args()
|
||||
|
||||
if args.output is not None and os.path.exists(args.output):
|
||||
print('Output file already exists, abord.')
|
||||
print('Output file already exists, abort.')
|
||||
exit(0)
|
||||
|
||||
misp = init(misp_url, misp_key)
|
||||
|
|
|
@ -0,0 +1,26 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
from pymisp import PyMISP
|
||||
from keys import misp_url, misp_key, misp_verifycert
|
||||
import argparse
|
||||
|
||||
|
||||
def init(url, key):
|
||||
return PyMISP(url, key, misp_verifycert, 'json')
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description='Get an attachment.')
|
||||
parser.add_argument("-a", "--attribute", type=int, help="Attribute ID to download.")
|
||||
args = parser.parse_args()
|
||||
|
||||
misp = init(misp_url, misp_key)
|
||||
|
||||
with open('foo', 'wb') as f:
|
||||
out = misp.get_attachment(args.attribute)
|
||||
if isinstance(out, dict):
|
||||
# Fails
|
||||
print(out)
|
||||
else:
|
||||
f.write(out)
|
|
@ -0,0 +1,28 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import argparse
|
||||
|
||||
from pymisp import PyMISP
|
||||
from keys import misp_url, misp_key, misp_verifycert
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description='Get MISP stuff as CSV.')
|
||||
parser.add_argument("-e", "--event_id", help="Event ID to fetch. Without it, it will fetch the whole database.")
|
||||
parser.add_argument("-a", "--attribute", nargs='+', help="Attribute column names")
|
||||
parser.add_argument("-o", "--object_attribute", nargs='+', help="Object attribute column names")
|
||||
parser.add_argument("-t", "--misp_types", nargs='+', help="MISP types to fetch (ip-src, hostname, ...)")
|
||||
parser.add_argument("-c", "--context", action='store_true', help="Add event level context (tags...)")
|
||||
parser.add_argument("-i", "--ignore", action='store_true', help="Returns the attributes even if the event isn't published, or the attribute doesn't have the to_ids flag")
|
||||
parser.add_argument("-f", "--outfile", help="Output file to write the CSV.")
|
||||
|
||||
args = parser.parse_args()
|
||||
pymisp = PyMISP(misp_url, misp_key, misp_verifycert, debug=True)
|
||||
response = pymisp.get_csv(args.event_id, args.attribute, args.object_attribute, args.misp_types, args.context, args.ignore)
|
||||
|
||||
if args.outfile:
|
||||
with open(args.outfile, 'w') as f:
|
||||
f.write(response)
|
||||
else:
|
||||
print(response)
|
|
@ -0,0 +1,28 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
from pymisp.tools import ext_lookups
|
||||
import argparse
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
|
||||
parser = argparse.ArgumentParser(description='Search in galaxies or taxonomies.')
|
||||
parser.add_argument("-q", "--query", help="Query.")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
tag_gal = ext_lookups.revert_tag_from_galaxies(args.query)
|
||||
tag_tax = ext_lookups.revert_tag_from_taxonomies(args.query)
|
||||
|
||||
found_tax = ext_lookups.search_taxonomies(args.query)
|
||||
found_gal = ext_lookups.search_galaxies(args.query)
|
||||
|
||||
if tag_gal:
|
||||
print(tag_gal)
|
||||
if tag_tax:
|
||||
print(tag_tax)
|
||||
if found_tax:
|
||||
print(found_tax)
|
||||
if found_gal:
|
||||
print(found_gal)
|
|
@ -0,0 +1,71 @@
|
|||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
#
|
||||
# Export IOC's from MISP in CEF format
|
||||
# Based on cef_export.py MISP module by Hannah Ward
|
||||
|
||||
import sys
|
||||
import datetime
|
||||
from pymisp import PyMISP, MISPAttribute
|
||||
from keys import misp_url, misp_key
|
||||
|
||||
cefconfig = {"Default_Severity":1, "Device_Vendor":"MISP", "Device_Product":"MISP", "Device_Version":1}
|
||||
|
||||
cefmapping = {"ip-src":"src", "ip-dst":"dst", "hostname":"dhost", "domain":"destinationDnsDomain",
|
||||
"md5":"fileHash", "sha1":"fileHash", "sha256":"fileHash",
|
||||
"filename|md5":"fileHash", "filename|sha1":"fileHash", "filename|sha256":"fileHash",
|
||||
"url":"request"}
|
||||
|
||||
mispattributes = {'input':list(cefmapping.keys())}
|
||||
|
||||
|
||||
def make_cef(event):
|
||||
for attr in event["Attribute"]:
|
||||
if attr["to_ids"] and attr["type"] in cefmapping:
|
||||
if '|' in attr["type"] and '|' in attr["value"]:
|
||||
value = attr["value"].split('|')[1]
|
||||
else:
|
||||
value = attr["value"]
|
||||
response = "{} host CEF:0|{}|{}|{}|{}|{}|{}|msg={} customerURI={} externalId={} {}={}".format(
|
||||
datetime.datetime.fromtimestamp(int(attr["timestamp"])).strftime("%b %d %H:%M:%S"),
|
||||
cefconfig["Device_Vendor"],
|
||||
cefconfig["Device_Product"],
|
||||
cefconfig["Device_Version"],
|
||||
attr["category"],
|
||||
attr["category"],
|
||||
cefconfig["Default_Severity"],
|
||||
event["info"].replace("\\","\\\\").replace("=","\\=").replace('\n','\\n') + "(MISP Event #" + event["id"] + ")",
|
||||
misp_url + 'events/view/' + event["id"],
|
||||
attr["uuid"],
|
||||
cefmapping[attr["type"]],
|
||||
value,
|
||||
)
|
||||
print(str(bytes(response, 'utf-8'), 'utf-8'))
|
||||
|
||||
|
||||
def init_misp():
|
||||
global mymisp
|
||||
mymisp = PyMISP(misp_url, misp_key)
|
||||
|
||||
|
||||
def echeck(r):
|
||||
if r.get('errors'):
|
||||
if r.get('message') == 'No matches.':
|
||||
return
|
||||
else:
|
||||
print(r['errors'])
|
||||
sys.exit(1)
|
||||
|
||||
|
||||
def find_events():
|
||||
r = mymisp.search(controller='events', published=True, to_ids=True)
|
||||
echeck(r)
|
||||
if not r.get('response'):
|
||||
return
|
||||
for ev in r['response']:
|
||||
make_cef(ev['Event'])
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
init_misp()
|
||||
find_events()
|
|
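`make_cef` above renders each attribute as one CEF line (`CEF:0|Vendor|Product|Version|SignatureID|Name|Severity|Extension`, with the event category doing double duty as signature id and name). A minimal sketch of that formatting step, using a hypothetical attribute dict (`attr` and its values are made up for illustration):

```python
import datetime

# Config and attribute mirroring the script above; attr is hypothetical.
cefconfig = {"Default_Severity": 1, "Device_Vendor": "MISP",
             "Device_Product": "MISP", "Device_Version": 1}
attr = {"type": "ip-dst", "value": "198.51.100.7", "category": "Network activity",
        "timestamp": "1500000000", "uuid": "0a1b2c3d", "to_ids": True}

# CEF:0|Vendor|Product|Version|SignatureID|Name|Severity|Extension
line = "{} host CEF:0|{}|{}|{}|{}|{}|{}|dst={}".format(
    datetime.datetime.fromtimestamp(int(attr["timestamp"])).strftime("%b %d %H:%M:%S"),
    cefconfig["Device_Vendor"], cefconfig["Device_Product"], cefconfig["Device_Version"],
    attr["category"], attr["category"], cefconfig["Default_Severity"], attr["value"])
print(line)
```

The real script also escapes `\`, `=` and newlines in the event info before placing it in the `msg=` extension field.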
@ -30,7 +30,7 @@ def find_hashes(htype):
|
|||
return
|
||||
for a in r['response']['Attribute']:
|
||||
attribute = MISPAttribute(mymisp.describe_types)
|
||||
attribute.set_all_values(**a)
|
||||
attribute.from_dict(**a)
|
||||
if '|' in attribute.type and '|' in attribute.value:
|
||||
c, value = attribute.value.split('|')
|
||||
comment = '{} - {}'.format(attribute.comment, c)
|
||||
|
|
|
@ -0,0 +1,27 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import argparse
|
||||
|
||||
from pymisp import PyMISP
|
||||
from keys import misp_url, misp_key, misp_verifycert
|
||||
from pymisp.tools import load_openioc_file
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
parser = argparse.ArgumentParser(description='Convert an OpenIOC file to a MISPEvent. Optionally send it to MISP.')
|
||||
parser.add_argument("-i", "--input", required=True, help="Input file")
|
||||
group = parser.add_mutually_exclusive_group(required=True)
|
||||
group.add_argument("-o", "--output", help="Output file")
|
||||
group.add_argument("-m", "--misp", action='store_true', help="Create new event on MISP")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
misp_event = load_openioc_file(args.input)
|
||||
|
||||
if args.misp:
|
||||
pymisp = PyMISP(misp_url, misp_key, misp_verifycert, debug=True)
|
||||
pymisp.add_event(misp_event)
|
||||
else:
|
||||
with open(args.output, 'w') as f:
|
||||
f.write(misp_event.to_json())
|
|
@ -0,0 +1,37 @@
|
|||
types_to_attach = ['ip-dst', 'url', 'domain']
|
||||
objects_to_attach = ['domain-ip']
|
||||
|
||||
headers = """
|
||||
:toc: right
|
||||
:toclevels: 1
|
||||
:toc-title: Daily Report
|
||||
:icons: font
|
||||
:sectanchors:
|
||||
:sectlinks:
|
||||
= Daily report by {org_name}
|
||||
{date}
|
||||
|
||||
:icons: font
|
||||
|
||||
"""
|
||||
|
||||
event_level_tags = """
|
||||
IMPORTANT: This event is classified TLP:{value}.
|
||||
|
||||
{expanded}
|
||||
|
||||
"""
|
||||
|
||||
attributes = """
|
||||
=== Indicator(s) of compromise
|
||||
|
||||
{list_attributes}
|
||||
|
||||
"""
|
||||
|
||||
title = """
|
||||
== ({internal_id}) {title}
|
||||
|
||||
{summary}
|
||||
|
||||
"""
|
|
@ -0,0 +1,33 @@
|
|||
types_to_attach = ['ip-dst', 'url', 'domain', 'md5']
|
||||
objects_to_attach = ['domain-ip', 'file']
|
||||
|
||||
headers = """
|
||||
:toc: right
|
||||
:toclevels: 1
|
||||
:toc-title: Weekly Report
|
||||
:icons: font
|
||||
:sectanchors:
|
||||
:sectlinks:
|
||||
= Weekly report by {org_name}
|
||||
{date}
|
||||
|
||||
:icons: font
|
||||
|
||||
"""
|
||||
|
||||
event_level_tags = """
|
||||
"""
|
||||
|
||||
attributes = """
|
||||
=== Indicator(s) of compromise
|
||||
|
||||
{list_attributes}
|
||||
|
||||
"""
|
||||
|
||||
title = """
|
||||
== ({internal_id}) {title}
|
||||
|
||||
{summary}
|
||||
|
||||
"""
|
|
@ -0,0 +1,24 @@
|
|||
# Description
|
||||
Get all attributes from a MISP (https://github.com/MISP) instance that can be converted into Suricata rules, given a *parameter* and a *term* to search for
|
||||
|
||||
**requires**
|
||||
* PyMISP (https://github.com/CIRCL/PyMISP/)
|
||||
* python 2.7 or python3 (suggested)
|
||||
|
||||
|
||||
# Usage
|
||||
* **suricata_search.py -p tags -s 'APT' -o misp_ids.rules -t 5**
|
||||
- search for 'APT' tag
|
||||
- use 5 threads while generating IDS rules
|
||||
- dump results to misp_ids.rules
|
||||
|
||||
* **suricata_search.py -p tags -s 'APT' -o misp_ids.rules -ne 411 357 343**
|
||||
- same as above, but skip events ID 411,357 and 343
|
||||
|
||||
* **suricata_search.py -p tags -s 'circl:incident-classification="malware", tlp:green' -o misp_ids.rules**
|
||||
- search for multiple tags 'circl:incident-classification="malware", tlp:green'
|
||||
|
||||
* **suricata_search.py -p categories -s 'Artifacts dropped' -t 20 -o artifacts_dropped.rules**
|
||||
- search for category 'Artifacts dropped'
|
||||
- use 20 threads while generating IDS rules
|
||||
- dump results to artifacts_dropped.rules
|
|
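Under the hood, the `-p`/`-s` pair becomes a keyword argument to `misp.search()` (e.g. `search(tags='APT')`), and comma-separated terms trigger one search per term. A small sketch of that expansion with no MISP connection (`build_search_kwargs` is a hypothetical helper, not part of the script):

```python
def build_search_kwargs(param, search):
    """Split comma-separated terms and pair each one with the search
    parameter, as suricata_search.py does before misp.search(**kwargs)."""
    terms = [t.strip() for t in search.split(",")] if "," in search else [search]
    return [{param: term} for term in terms]

print(build_search_kwargs("tags", "tlp:green, OSINT"))
print(build_search_kwargs("categories", "Artifacts dropped"))
```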
@ -0,0 +1,216 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
"""
|
||||
https://github.com/raw-data/pymisp-suricata_search
|
||||
|
||||
2017.06.28 start
|
||||
2017.07.03 fixed args.quiet and status msgs
|
||||
|
||||
"""
|
||||
|
||||
import argparse
|
||||
import os
|
||||
import queue
|
||||
import sys
|
||||
from threading import Thread, enumerate
|
||||
from keys import misp_url, misp_key, misp_verifycert
|
||||
|
||||
try:
|
||||
from pymisp import PyMISP
|
||||
except ImportError as err:
|
||||
sys.stderr.write("ERROR: {}\n".format(err))
|
||||
sys.stderr.write("\t[try] with pip install pymisp\n")
|
||||
sys.stderr.write("\t[try] with pip3 install pymisp\n")
|
||||
sys.exit(1)
|
||||
|
||||
HEADER = """
|
||||
# This part might still contain bugs, use at your own risk and report any issues.
|
||||
#
|
||||
# MISP export of IDS rules - optimized for suricata
|
||||
#
|
||||
# These NIDS rules contain some variables that need to exist in your configuration.
|
||||
# Make sure you have set:
|
||||
#
|
||||
# $HOME_NET - Your internal network range
|
||||
# $EXTERNAL_NET - The network considered as outside
|
||||
# $SMTP_SERVERS - All your internal SMTP servers
|
||||
# $HTTP_PORTS - The ports used to contain HTTP traffic (not required with suricata export)
|
||||
#
|
||||
"""
|
||||
|
||||
# queue for events matching searched term/s
|
||||
IDS_EVENTS = queue.Queue()
|
||||
|
||||
# queue for downloaded Suricata rules
|
||||
DOWNLOADED_RULES = queue.Queue()
|
||||
|
||||
# Default number of threads to use
|
||||
THREAD = 4
|
||||
|
||||
try:
|
||||
input = raw_input
|
||||
except NameError:
|
||||
pass
|
||||
|
||||
|
||||
def init():
|
||||
""" init connection to MISP """
|
||||
return PyMISP(misp_url, misp_key, misp_verifycert, 'json')
|
||||
|
||||
|
||||
def search(misp, quiet, noevent, **kwargs):
|
||||
""" Start search in MISP """
|
||||
|
||||
result = misp.search(**kwargs)
|
||||
|
||||
# fetch all events matching **kwargs
|
||||
track_events = 0
|
||||
skip_events = list()
|
||||
for event in result['response']:
|
||||
event_id = event["Event"].get("id")
|
||||
track_events += 1
|
||||
|
||||
to_ids = False
|
||||
for attribute in event["Event"]["Attribute"]:
|
||||
to_ids_event = attribute["to_ids"]
|
||||
if to_ids_event:
|
||||
to_ids = True
|
||||
break
|
||||
|
||||
# if there is at least one eligible event to_ids, add event_id
|
||||
if to_ids:
|
||||
# check if the event_id is not blacklisted by the user
|
||||
if isinstance(noevent, list):
|
||||
if event_id not in noevent[0]:
|
||||
to_ids_event = (event_id, misp)
|
||||
IDS_EVENTS.put(to_ids_event)
|
||||
else:
|
||||
skip_events.append(event_id)
|
||||
else:
|
||||
to_ids_event = (event_id, misp)
|
||||
IDS_EVENTS.put(to_ids_event)
|
||||
|
||||
if not quiet:
|
||||
print ("\t[i] matching events: {}".format(track_events))
|
||||
if len(skip_events) > 0:
|
||||
print ("\t[i] skipped {0} events -> {1}".format(len(skip_events),skip_events))
|
||||
print ("\t[i] events selected for IDS export: {}".format(IDS_EVENTS.qsize()))
|
||||
|
||||
|
||||
def collect_rules(thread):
|
||||
""" Dispatch tasks to Suricata_processor worker """
|
||||
|
||||
for x in range(int(thread)):
|
||||
th = Thread(target=suricata_processor, args=(IDS_EVENTS, ))
|
||||
th.start()
|
||||
|
||||
for x in enumerate():
|
||||
if x.name == "MainThread":
|
||||
continue
|
||||
x.join()
|
||||
|
||||
|
||||
def suricata_processor(ids_events):
|
||||
""" Trigger misp.download_suricata_rule_event """
|
||||
|
||||
while not ids_events.empty():
|
||||
event_id, misp = ids_events.get()
|
||||
ids_rules = misp.download_suricata_rule_event(event_id).text
|
||||
|
||||
for r in ids_rules.split("\n"):
|
||||
# skip header
|
||||
if not r.startswith("#"):
|
||||
if len(r) > 0: DOWNLOADED_RULES.put(r)
|
||||
|
||||
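Taken together, `collect_rules` and `suricata_processor` implement a standard fan-out: fill a `queue.Queue`, start N worker threads that drain it, and join them all. A self-contained sketch of the pattern (using `get_nowait()` rather than the script's `empty()`-then-`get()` pair, which is exposed to a check-then-act race between threads):

```python
import queue
from threading import Thread

tasks = queue.Queue()
results = queue.Queue()
for i in range(10):
    tasks.put(i)

def worker(q):
    # Drain the shared queue; get_nowait() makes the "is there work left?"
    # check and the take a single atomic step.
    while True:
        try:
            item = q.get_nowait()
        except queue.Empty:
            return
        results.put(item * 2)  # stand-in for downloading one event's rules

threads = [Thread(target=worker, args=(tasks,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results.queue))
```

The script instead joins by filtering `threading.enumerate()` for everything but the main thread; keeping an explicit list of the threads you started is the more conventional form.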
|
||||
def return_rules(output, quiet):
|
||||
""" Return downloaded rules to user """
|
||||
|
||||
rules = set()
|
||||
while not DOWNLOADED_RULES.empty():
|
||||
rules.add(DOWNLOADED_RULES.get())
|
||||
|
||||
if output is None:
|
||||
|
||||
if not quiet:
|
||||
print ("[+] Displaying rules")
|
||||
|
||||
print (HEADER)
|
||||
for r in rules: print (r)
|
||||
print ("#")
|
||||
|
||||
else:
|
||||
|
||||
if not quiet:
|
||||
print ("[+] Writing rules to {}".format(output))
|
||||
print ("[+] Generated {} rules".format(len(rules)))
|
||||
|
||||
with open(output, 'w') as f:
|
||||
f.write(HEADER)
|
||||
f.write("\n".join(r for r in rules))
|
||||
f.write("\n"+"#")
|
||||
|
||||
|
||||
def format_request(param, term, misp, quiet, output, thread, noevent):
|
||||
""" Format request and start search """
|
||||
|
||||
kwargs = {param: term}
|
||||
|
||||
if not quiet:
|
||||
print ("[+] Searching for: {}".format(kwargs))
|
||||
|
||||
search(misp, quiet, noevent, **kwargs)
|
||||
|
||||
# collect Suricata rules
|
||||
collect_rules(thread)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
||||
parser = argparse.ArgumentParser(
|
||||
formatter_class=argparse.RawTextHelpFormatter,
|
||||
description='Get all attributes that can be converted into Suricata rules, given a parameter and a term to '
|
||||
'search.',
|
||||
epilog='''
|
||||
EXAMPLES:
|
||||
suricata_search.py -p tags -s 'APT' -o misp_ids.rules -t 5
|
||||
suricata_search.py -p tags -s 'APT' -o misp_ids.rules -ne 411 357 343
|
||||
suricata_search.py -p tags -s 'tlp:green, OSINT' -o misp_ids.rules
|
||||
suricata_search.py -p tags -s 'circl:incident-classification="malware", tlp:green' -o misp_ids.rules
|
||||
suricata_search.py -p categories -s 'Artifacts dropped' -t 20 -o artifacts_dropped.rules
|
||||
''')
|
||||
parser.add_argument("-p", "--param", required=True, help="Parameter to search (e.g. categories, tags, org, etc.).")
|
||||
parser.add_argument("-s", "--search", required=True, help="Term/s to search.")
|
||||
parser.add_argument("-q", "--quiet", action='store_true', help="No status messages")
|
||||
parser.add_argument("-t", "--thread", required=False, help="Number of threads to use", default=THREAD)
|
||||
parser.add_argument("-ne", "--noevent", nargs='*', required=False, dest='noevent', action='append',
|
||||
help="Event/s ID to exclude during the search")
|
||||
parser.add_argument("-o", "--output", help="Output file",required=False)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
if args.output is not None and os.path.exists(args.output) and not args.quiet:
|
||||
try:
|
||||
check = input("[!] Output file {} exists, do you want to continue [Y/n]? ".format(args.output))
|
||||
if check not in ["Y","y"]:
|
||||
exit(0)
|
||||
except KeyboardInterrupt:
|
||||
sys.exit(0)
|
||||
|
||||
if not args.quiet:
|
||||
print ("[i] Connecting to MISP instance: {}".format(misp_url))
|
||||
print ("[i] Note: duplicated IDS rules will be removed")
|
||||
|
||||
# Based on # of terms, format request
|
||||
if "," in args.search:
|
||||
for term in args.search.split(","):
|
||||
term = term.strip()
|
||||
misp = init()
|
||||
format_request(args.param, term, misp, args.quiet, args.output, args.thread, args.noevent)
|
||||
else:
|
||||
misp = init()
|
||||
format_request(args.param, args.search, misp, args.quiet, args.output, args.thread, args.noevent)
|
||||
|
||||
# return collected rules
|
||||
return_rules(args.output, args.quiet)
|
|
@ -0,0 +1,182 @@
|
|||
''' Convert a VirusTotal report into MISP objects '''
|
||||
import argparse
|
||||
import json
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from urllib.parse import urlsplit
|
||||
|
||||
import pymisp
|
||||
from pymisp.tools import VTReportObject
|
||||
|
||||
logging.basicConfig(level=logging.INFO, format="%(asctime)s | %(levelname)s | %(module)s.%(funcName)s.%(lineno)d | %(message)s")
|
||||
|
||||
|
||||
def build_cli():
|
||||
'''
|
||||
Build the command-line arguments
|
||||
'''
|
||||
desc = "Take an indicator or list of indicators to search VT for and import the results into MISP"
|
||||
post_desc = """
|
||||
config.json: Should be a JSON file containing MISP and VirusTotal credentials with the following format:
|
||||
{"misp": {"url": "<url_to_misp>", "key": "<misp_api_key>"}, "virustotal": {"key": "<vt_api_key>"}}
|
||||
Please note: Only public API features work in the VTReportObject for now. I don't have a quarter million to spare ;)
|
||||
|
||||
Example:
|
||||
python vt_to_misp.py -i 719c97a8cd8db282586c1416894dcaf8 -c ./config.json
|
||||
"""
|
||||
parser = argparse.ArgumentParser(description=desc, epilog=post_desc, formatter_class=argparse.RawTextHelpFormatter)
|
||||
parser.add_argument("-e", "--event", help="MISP event id to add to")
|
||||
parser.add_argument("-c", "--config", default="config.json", help="Path to JSON configuration file to read")
|
||||
indicators = parser.add_mutually_exclusive_group(required=True)
|
||||
indicators.add_argument("-i", "--indicator", help="Single indicator to look up")
|
||||
indicators.add_argument("-f", "--file", help="File of indicators to look up - one on each line")
|
||||
indicators.add_argument("-l", "--link", help="Link to a VirusTotal report")
|
||||
return parser.parse_args()
|
||||
|
||||
|
||||
def build_config(path=None):
|
||||
'''
|
||||
Read a configuration file path. File is expected to be a JSON document.
|
||||
|
||||
:path: Path to a configuration file
|
||||
'''
|
||||
try:
|
||||
with open(path, "r") as ifile:
|
||||
return json.load(ifile)
|
||||
except OSError:
|
||||
raise OSError("Couldn't find path to configuration file: {}".format(path))
|
||||
except json.JSONDecodeError:
|
||||
raise IOError("Couldn't parse configuration file. Please make sure it is a proper JSON document")
|
||||
|
||||
|
||||
def generate_report(indicator, apikey):
|
||||
'''
|
||||
Build our VirusTotal report object, File object, and AV signature objects
|
||||
and link them appropriately
|
||||
|
||||
:indicator: Indicator hash to search in VT for
|
||||
'''
|
||||
report_objects = []
|
||||
vt_report = VTReportObject(apikey, indicator)
|
||||
report_objects.append(vt_report)
|
||||
raw_report = vt_report._report
|
||||
if vt_report._resource_type == "file":
|
||||
file_object = pymisp.MISPObject(name="file")
|
||||
file_object.add_attribute("md5", value=raw_report["md5"])
|
||||
file_object.add_attribute("sha1", value=raw_report["sha1"])
|
||||
file_object.add_attribute("sha256", value=raw_report["sha256"])
|
||||
vt_report.add_reference(referenced_uuid=file_object.uuid, relationship_type="report of")
|
||||
report_objects.append(file_object)
|
||||
elif vt_report._resource_type == "url":
|
||||
parsed = urlsplit(indicator)
|
||||
url_object = pymisp.MISPObject(name="url")
|
||||
url_object.add_attribute("url", value=parsed.geturl())
|
||||
url_object.add_attribute("host", value=parsed.hostname)
|
||||
url_object.add_attribute("scheme", value=parsed.scheme)
|
||||
url_object.add_attribute("port", value=parsed.port)
|
||||
vt_report.add_reference(referenced_uuid=url_object.uuid, relationship_type="report of")
|
||||
report_objects.append(url_object)
|
||||
for antivirus in raw_report["scans"]:
|
||||
if raw_report["scans"][antivirus]["detected"]:
|
||||
av_object = pymisp.MISPObject(name="av-signature")
|
||||
av_object.add_attribute("software", value=antivirus)
|
||||
signature_name = raw_report["scans"][antivirus]["result"]
|
||||
av_object.add_attribute("signature", value=signature_name, disable_correlation=True)
|
||||
vt_report.add_reference(referenced_uuid=av_object.uuid, relationship_type="included-in")
|
||||
report_objects.append(av_object)
|
||||
return report_objects
|
||||
|
||||
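For the `url` branch above, `urlsplit` supplies the values for the `url`, `host`, `scheme` and `port` attributes. For example:

```python
from urllib.parse import urlsplit

# Split a URL indicator the same way generate_report() does.
parsed = urlsplit("https://example.com:8443/path?q=1")
print(parsed.geturl(), parsed.hostname, parsed.scheme, parsed.port)
```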
|
||||
def get_misp_event(event_id=None, info=None):
|
||||
'''
|
||||
Smaller helper function for generating a new MISP event or using a preexisting one
|
||||
|
||||
:event_id: The event id of the MISP event to upload objects to
|
||||
|
||||
:info: The event's title/info
|
||||
'''
|
||||
if event_id:
|
||||
event = misp.get_event(event_id)
|
||||
elif info:
|
||||
event = misp.new_event(info=info)
|
||||
else:
|
||||
event = misp.new_event(info="VirusTotal Report")
|
||||
misp_event = pymisp.MISPEvent()
|
||||
misp_event.load(event)
|
||||
return misp_event
|
||||
|
||||
|
||||
def main(misp, config, args):
|
||||
'''
|
||||
Main program logic
|
||||
|
||||
:misp: PyMISP API object for interfacing with MISP
|
||||
|
||||
:config: Configuration dictionary
|
||||
|
||||
:args: Argparse CLI object
|
||||
'''
|
||||
if args.indicator:
|
||||
misp_objects = generate_report(args.indicator, config["virustotal"]["key"])
|
||||
if misp_objects:
|
||||
misp_event = get_misp_event(args.event, "VirusTotal Report for {}".format(args.indicator))
|
||||
submit_to_misp(misp, misp_event, misp_objects)
|
||||
elif args.file:
|
||||
try:
|
||||
reports = []
|
||||
with open(args.file, "r") as ifile:
|
||||
for indicator in ifile:
|
||||
try:
|
||||
misp_objects = generate_report(indicator, config["virustotal"]["key"])
|
||||
if misp_objects:
|
||||
reports.append(misp_objects)
|
||||
except pymisp.exceptions.InvalidMISPObject as err:
|
||||
logging.error(err)
|
||||
if reports:
|
||||
current_time = datetime.now().strftime("%x %X")
|
||||
misp_event = get_misp_event(args.event, "VirusTotal Reports: {}".format(current_time))
|
||||
for report in reports:
|
||||
submit_to_misp(misp, misp_event, report)
|
||||
except OSError:
|
||||
logging.error("Couldn't open indicators file at '%s'. Check path", args.file)
|
||||
elif args.link:
|
||||
# https://www.virustotal.com/#/file/<ioc>/detection
|
||||
indicator = args.link.split("/")[5]
|
||||
misp_objects = generate_report(indicator, config["virustotal"]["key"])
|
||||
if misp_objects:
|
||||
misp_event = get_misp_event(args.event, "VirusTotal Report for {}".format(indicator))
|
||||
submit_to_misp(misp, misp_event, misp_objects)
|
||||
|
||||
|
||||
def submit_to_misp(misp, misp_event, misp_objects):
|
||||
'''
|
||||
Submit a list of MISP objects to a MISP event
|
||||
|
||||
:misp: PyMISP API object for interfacing with MISP
|
||||
|
||||
:misp_event: MISPEvent object
|
||||
|
||||
:misp_objects: List of MISPObject objects. Must be a list
|
||||
'''
|
||||
# go through round one and only add MISP objects
|
||||
for misp_object in misp_objects:
|
||||
template_id = misp.get_object_template_id(misp_object.template_uuid)
|
||||
misp.add_object(misp_event.id, template_id, misp_object)
|
||||
# go through round two and add all the object references for each object
|
||||
for misp_object in misp_objects:
|
||||
for reference in misp_object.ObjectReference:
|
||||
misp.add_object_reference(reference)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
try:
|
||||
args = build_cli()
|
||||
config = build_config(args.config)
|
||||
# change the 'ssl' value if you want to verify your MISP's SSL instance
|
||||
misp = pymisp.PyMISP(url=config["misp"]["url"], key=config["misp"]["key"], ssl=False)
|
||||
# finally, let's start checking VT and converting the reports
|
||||
main(misp, config, args)
|
||||
except KeyboardInterrupt:
|
||||
print("Bye Felicia")
|
||||
except pymisp.exceptions.InvalidMISPObject as err:
|
||||
logging.error(err)
|
|
@ -0,0 +1,22 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
from pymisp import PyMISP
|
||||
from pymisp.tools import load_warninglists
|
||||
import argparse
|
||||
from keys import misp_url, misp_key
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
|
||||
parser = argparse.ArgumentParser(description='Load the warninglists.')
|
||||
parser.add_argument("-p", "--package", action='store_true', help="from the PyMISPWarninglists package.")
|
||||
parser.add_argument("-r", "--remote", action='store_true', help="from the MISP instance.")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
if args.package:
|
||||
print(load_warninglists.from_package())
|
||||
elif args.remote:
|
||||
pm = PyMISP(misp_url, misp_key)
|
||||
print(load_warninglists.from_instance(pm))
|
|
@ -1,7 +1,46 @@
|
|||
__version__ = '2.4.71'
|
||||
__version__ = '2.4.89'
|
||||
import logging
|
||||
import functools
|
||||
import warnings
|
||||
|
||||
from .exceptions import PyMISPError, NewEventError, NewAttributeError, MissingDependency, NoURL, NoKey
|
||||
from .api import PyMISP
|
||||
from .mispevent import MISPEvent, MISPAttribute, EncodeUpdate, EncodeFull
|
||||
from .tools.neo4j import Neo4j
|
||||
from .tools import stix
|
||||
FORMAT = "%(levelname)s [%(filename)s:%(lineno)s - %(funcName)s() ] %(message)s"
|
||||
formatter = logging.Formatter(FORMAT)
|
||||
default_handler = logging.StreamHandler()
|
||||
default_handler.setFormatter(formatter)
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
logger.addHandler(default_handler)
|
||||
logger.setLevel(logging.WARNING)
|
||||
|
||||
|
||||
def deprecated(func):
|
||||
'''This is a decorator which can be used to mark functions
|
||||
as deprecated. It will result in a warning being emitted
|
||||
when the function is used.'''
|
||||
|
||||
@functools.wraps(func)
|
||||
def new_func(*args, **kwargs):
|
||||
warnings.showwarning(
|
||||
"Call to deprecated function {}.".format(func.__name__),
|
||||
category=DeprecationWarning,
|
||||
filename=func.__code__.co_filename,
|
||||
lineno=func.__code__.co_firstlineno + 1
|
||||
)
|
||||
return func(*args, **kwargs)
|
||||
return new_func
|
||||
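Calling a function wrapped with the decorator above emits a `DeprecationWarning` and still returns the function's result. A simplified, self-contained version of the same idea (using `warnings.warn()` with `stacklevel` instead of `showwarning()` with an explicit filename and line number):

```python
import functools
import warnings

def deprecated(func):
    """Mark a function as deprecated: warn on every call, then delegate."""
    @functools.wraps(func)
    def new_func(*args, **kwargs):
        warnings.warn("Call to deprecated function {}.".format(func.__name__),
                      category=DeprecationWarning, stacklevel=2)
        return func(*args, **kwargs)
    return new_func

@deprecated
def old_api(x):
    return x + 1

# Capture the warning to show it fires; the wrapped call still works.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_api(1)
print(result, [str(w.message) for w in caught])
```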
|
||||
|
||||
try:
|
||||
from .exceptions import PyMISPError, NewEventError, NewAttributeError, MissingDependency, NoURL, NoKey, InvalidMISPObject, UnknownMISPObjectTemplate, PyMISPInvalidFormat # noqa
|
||||
from .api import PyMISP # noqa
|
||||
from .abstract import AbstractMISP, MISPEncode, MISPTag # noqa
|
||||
from .mispevent import MISPEvent, MISPAttribute, MISPObjectReference, MISPObjectAttribute, MISPObject, MISPUser, MISPOrganisation, MISPSighting # noqa
|
||||
from .tools import AbstractMISPObjectGenerator # noqa
|
||||
from .tools import Neo4j # noqa
|
||||
from .tools import stix # noqa
|
||||
from .tools import openioc # noqa
|
||||
from .tools import load_warninglists # noqa
|
||||
from .tools import ext_lookups # noqa
|
||||
logger.debug('pymisp loaded properly')
|
||||
except ImportError as e:
|
||||
logger.warning('Unable to load pymisp properly: {}'.format(e))
|
||||
|
|
|
@ -0,0 +1,231 @@
|
|||
#!/usr/bin/env python
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
import abc
|
||||
import sys
|
||||
import datetime
|
||||
import json
|
||||
from json import JSONEncoder
|
||||
import collections
|
||||
import six # Remove that import when discarding python2 support.
|
||||
import logging
|
||||
|
||||
from .exceptions import PyMISPInvalidFormat
|
||||
|
||||
|
||||
logger = logging.getLogger('pymisp')
|
||||
|
||||
if six.PY2:
|
||||
logger.warning("You're using python 2, it is strongly recommended to use python >=3.5")
|
||||
|
||||
# This is required because Python 2 is a pain.
|
||||
from datetime import tzinfo, timedelta
|
||||
|
||||
class UTC(tzinfo):
|
||||
"""UTC"""
|
||||
|
||||
def utcoffset(self, dt):
|
||||
return timedelta(0)
|
||||
|
||||
def tzname(self, dt):
|
||||
return "UTC"
|
||||
|
||||
def dst(self, dt):
|
||||
return timedelta(0)
|
||||
|
||||
|
||||
class MISPEncode(JSONEncoder):
|
||||
|
||||
def default(self, obj):
|
||||
if isinstance(obj, AbstractMISP):
|
||||
return obj.jsonable()
|
||||
return JSONEncoder.default(self, obj)
|
||||
|
||||
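`MISPEncode` is the standard `JSONEncoder` extension pattern: override `default()` for your own types and fall through to the base class (which raises `TypeError`) for anything else. A self-contained sketch of the same pattern with a stand-in class (`Jsonable` is hypothetical, playing the role of `AbstractMISP`):

```python
import json
from json import JSONEncoder

class Jsonable:
    """Stand-in for AbstractMISP: anything exposing jsonable()."""
    def __init__(self, **fields):
        self.fields = fields

    def jsonable(self):
        return self.fields

class Encode(JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Jsonable):
            return obj.jsonable()
        # Fall through: the base class raises TypeError for unknown types.
        return JSONEncoder.default(self, obj)

print(json.dumps({"tag": Jsonable(name="tlp:green")}, cls=Encode))
```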
|
||||
@six.add_metaclass(abc.ABCMeta) # Remove that line when discarding python2 support.
|
||||
class AbstractMISP(collections.MutableMapping):
|
||||
|
||||
__not_jsonable = []
|
||||
|
||||
def __init__(self, **kwargs):
|
||||
"""Abstract class for all the MISP objects"""
|
||||
super(AbstractMISP, self).__init__()
|
||||
self.__edited = True # As we create a new object, we assume it is edited
|
||||
|
||||
# List of classes having tags
|
||||
from .mispevent import MISPAttribute, MISPEvent
|
||||
self.__has_tags = (MISPAttribute, MISPEvent)
|
||||
if isinstance(self, self.__has_tags):
|
||||
self.Tag = []
|
||||
setattr(AbstractMISP, 'add_tag', AbstractMISP.__add_tag)
|
||||
setattr(AbstractMISP, 'tags', property(AbstractMISP.__get_tags, AbstractMISP.__set_tags))
|
||||
|
||||
@property
|
||||
def properties(self):
|
||||
"""All the class public properties that will be dumped in the dictionary, and the JSON export.
|
||||
Note: all the properties starting with a `_` (private), or listed in __not_jsonable will be skipped.
|
||||
"""
|
||||
to_return = []
|
||||
for prop, value in vars(self).items():
|
||||
if prop.startswith('_') or prop in self.__not_jsonable:
|
||||
continue
|
||||
to_return.append(prop)
|
||||
return to_return
|
||||
|
||||
def from_dict(self, **kwargs):
|
||||
"""Loading all the parameters as class properties, if they aren't `None`.
|
||||
This method aims to be called when all the properties requiring a special
|
||||
treatment are processed.
|
||||
Note: This method is used when you initialize an object with existing data so by default,
|
||||
the class is flagged as not edited."""
|
||||
for prop, value in kwargs.items():
|
||||
if value is None:
|
||||
continue
|
||||
setattr(self, prop, value)
|
||||
# We load an existing dictionary, marking it as not-edited
|
||||
self.__edited = False
|
||||
|
||||
def update_not_jsonable(self, *args):
|
||||
"""Add entries to the __not_jsonable list"""
|
||||
self.__not_jsonable += args
|
||||
|
||||
def set_not_jsonable(self, *args):
|
||||
"""Set __not_jsonable to a new list"""
|
||||
self.__not_jsonable = args
|
||||
|
||||
def from_json(self, json_string):
|
||||
"""Load a JSON string"""
|
||||
self.from_dict(**json.loads(json_string))
|
||||
|
||||
def to_dict(self):
|
||||
"""Dump the lass to a dictionary.
|
||||
This method automatically removes the timestamp recursively in every object
|
||||
that has been edited in order to let MISP update the event accordingly."""
|
||||
to_return = {}
|
||||
for attribute in self.properties:
|
||||
val = getattr(self, attribute, None)
|
||||
if val is None:
|
||||
continue
|
||||
elif isinstance(val, list) and len(val) == 0:
|
||||
continue
|
||||
if attribute == 'timestamp':
|
||||
if self.edited:
|
||||
# In order to be accepted by MISP, the timestamp of an object
|
||||
# needs to be either newer, or None.
|
||||
# If the current object is marked as edited, the easiest is to
|
||||
# skip the timestamp and let MISP deal with it
|
||||
continue
|
||||
else:
|
||||
val = self._datetime_to_timestamp(val)
|
||||
to_return[attribute] = val
|
||||
return to_return
|
||||
|
||||
def jsonable(self):
|
||||
"""This method is used by the JSON encoder"""
|
||||
return self.to_dict()
|
||||
|
||||
def to_json(self):
|
||||
"""Dump recursively any class of type MISPAbstract to a json string"""
|
||||
return json.dumps(self, cls=MISPEncode, sort_keys=True, indent=2)
|
||||
|
||||
def __getitem__(self, key):
|
||||
try:
|
||||
return getattr(self, key)
|
||||
except AttributeError:
|
||||
# Expected by pop and other dict-related methods
|
||||
raise KeyError
|
||||
|
||||
def __setitem__(self, key, value):
|
||||
setattr(self, key, value)
|
||||
|
||||
def __delitem__(self, key):
|
||||
delattr(self, key)
|
||||
|
||||
def __iter__(self):
|
||||
return iter(self.to_dict())
|
||||
|
||||
def __len__(self):
|
||||
return len(self.to_dict())
|
||||
|
||||
@property
|
||||
def edited(self):
|
||||
"""Recursively check if an object has been edited and update the flag accordingly
|
||||
to the parent objects"""
|
||||
if self.__edited:
|
||||
return self.__edited
|
||||
for p in self.properties:
|
||||
if self.__edited:
|
||||
break
|
||||
val = getattr(self, p)
|
||||
if isinstance(val, AbstractMISP) and val.edited:
|
||||
self.__edited = True
|
||||
elif isinstance(val, list) and all(isinstance(a, AbstractMISP) for a in val):
|
||||
if any(a.edited for a in val):
|
||||
self.__edited = True
|
||||
return self.__edited
|
||||
|
||||
@edited.setter
|
||||
def edited(self, val):
|
||||
"""Set the edit flag"""
|
||||
if isinstance(val, bool):
|
||||
self.__edited = val
|
||||
else:
|
||||
raise Exception('edited can only be True or False')
|
||||
|
||||
def __setattr__(self, name, value):
|
||||
if name in self.properties:
|
||||
self.__edited = True
|
||||
super(AbstractMISP, self).__setattr__(name, value)
|
||||
|
||||
def _datetime_to_timestamp(self, d):
|
||||
"""Convert a datetime.datetime object to a timestamp (int)"""
|
||||
if isinstance(d, (int, str)) or (sys.version_info < (3, 0) and isinstance(d, unicode)):
|
||||
# Assume we already have a timestamp
|
||||
return d
|
||||
if sys.version_info >= (3, 3):
|
||||
return int(d.timestamp())
|
||||
else:
|
||||
return int((d - datetime.datetime.fromtimestamp(0, UTC())).total_seconds())
|
||||
|
||||
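On Python >= 3.3 the conversion above is just `datetime.timestamp()`; the epoch-subtraction branch exists only for older interpreters. Both expressions agree for a timezone-aware datetime, e.g.:

```python
import datetime

d = datetime.datetime(2018, 1, 1, tzinfo=datetime.timezone.utc)
# Modern path: datetime.timestamp() on an aware datetime.
via_method = int(d.timestamp())
# Legacy path: subtract the epoch and take total seconds.
via_epoch = int((d - datetime.datetime.fromtimestamp(0, datetime.timezone.utc)).total_seconds())
print(via_method, via_epoch)
```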
def __add_tag(self, tag=None, **kwargs):
|
||||
"""Add a tag to the attribute (by name or a MISPTag object)"""
|
||||
if isinstance(tag, str):
|
||||
misp_tag = MISPTag()
|
||||
misp_tag.from_dict(name=tag)
|
||||
elif isinstance(tag, MISPTag):
|
||||
misp_tag = tag
|
||||
elif isinstance(tag, dict):
|
||||
misp_tag = MISPTag()
|
||||
misp_tag.from_dict(**tag)
|
||||
elif kwargs:
|
||||
misp_tag = MISPTag()
|
||||
misp_tag.from_dict(**kwargs)
|
||||
else:
|
||||
raise PyMISPInvalidFormat("The tag is in an invalid format (can be either string, MISPTag, or an expanded dict): {}".format(tag))
|
||||
self.Tag.append(misp_tag)
|
||||
self.edited = True
|
||||
|
||||
def __get_tags(self):
|
||||
"""Returns a lost of tags associated to this Attribute"""
|
||||
return self.Tag
|
||||
|
||||
def __set_tags(self, tags):
|
||||
"""Set a list of prepared MISPTag."""
|
||||
if all(isinstance(x, MISPTag) for x in tags):
|
||||
self.Tag = tags
|
||||
else:
|
||||
raise PyMISPInvalidFormat('All the attributes have to be of type MISPTag.')
|
||||
|
||||
|
||||
class MISPTag(AbstractMISP):
|
||||
def __init__(self):
|
||||
super(MISPTag, self).__init__()
|
||||
|
||||
def from_dict(self, name, **kwargs):
|
||||
self.name = name
|
||||
super(MISPTag, self).from_dict(**kwargs)
|
||||
|
||||
def __repr__(self):
|
||||
if hasattr(self, 'name'):
|
||||
return '<{self.__class__.__name__}(name={self.name})'.format(self=self)
|
||||
return '<{self.__class__.__name__}(NotInitialized)'.format(self=self)
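The `_datetime_to_timestamp` logic above can be exercised on its own; a minimal Python 3 sketch (the standalone function name is illustrative, not part of PyMISP):

```python
import datetime

def datetime_to_timestamp(d):
    """Pass ints/strings through untouched; convert datetime objects
    to an integer POSIX timestamp (same behaviour as the method above)."""
    if isinstance(d, (int, str)):
        return d
    return int(d.timestamp())

dt = datetime.datetime(2018, 1, 1, tzinfo=datetime.timezone.utc)
print(datetime_to_timestamp(dt))          # 1514764800
print(datetime_to_timestamp(1514764800))  # 1514764800 (already a timestamp)
```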
|
1257
pymisp/api.py
1257
pymisp/api.py
File diff suppressed because it is too large
Load Diff
|
@@ -69,6 +69,10 @@
    "default_category": "Payload delivery",
    "to_ids": 1
  },
  "email-body": {
    "default_category": "Payload delivery",
    "to_ids": 0
  },
  "float": {
    "default_category": "Other",
    "to_ids": 0
@@ -117,10 +121,30 @@
    "default_category": "Payload installation",
    "to_ids": 1
  },
  "stix2-pattern": {
    "default_category": "Payload installation",
    "to_ids": 1
  },
  "sigma": {
    "default_category": "Payload installation",
    "to_ids": 1
  },
  "gene": {
    "default_category": "Artifacts dropped",
    "to_ids": 0
  },
  "mime-type": {
    "default_category": "Artifacts dropped",
    "to_ids": 0
  },
  "identity-card-number": {
    "default_category": "Person",
    "to_ids": 0
  },
  "cookie": {
    "default_category": "Network activity",
    "to_ids": 0
  },
  "vulnerability": {
    "default_category": "External analysis",
    "to_ids": 0
@@ -217,6 +241,10 @@
    "default_category": "Financial fraud",
    "to_ids": 1
  },
  "phone-number": {
    "default_category": "Person",
    "to_ids": 0
  },
  "threat-actor": {
    "default_category": "Attribution",
    "to_ids": 0
@@ -349,6 +377,10 @@
    "default_category": "Attribution",
    "to_ids": 0
  },
  "whois-registrant-org": {
    "default_category": "Attribution",
    "to_ids": 0
  },
  "whois-registrar": {
    "default_category": "Attribution",
    "to_ids": 0
@@ -361,6 +393,14 @@
    "default_category": "Network activity",
    "to_ids": 1
  },
  "x509-fingerprint-md5": {
    "default_category": "Network activity",
    "to_ids": 1
  },
  "x509-fingerprint-sha256": {
    "default_category": "Network activity",
    "to_ids": 1
  },
  "dns-soa-email": {
    "default_category": "Attribution",
    "to_ids": 0
@@ -397,6 +437,14 @@
    "default_category": "Network activity",
    "to_ids": 1
  },
  "mac-address": {
    "default_category": "Network activity",
    "to_ids": 0
  },
  "mac-eui-64": {
    "default_category": "Network activity",
    "to_ids": 0
  },
  "email-dst-display-name": {
    "default_category": "Payload delivery",
    "to_ids": 0
@@ -426,7 +474,7 @@
    "to_ids": 0
  },
  "email-message-id": {
    "default_category": "",
    "default_category": "Payload delivery",
    "to_ids": 0
  },
  "github-username": {
@@ -470,7 +518,7 @@
    "to_ids": 0
  },
  "gender": {
    "default_category": "",
    "default_category": "Person",
    "to_ids": 0
  },
  "passport-number": {
@@ -544,6 +592,14 @@
  "mobile-application-id": {
    "default_category": "Payload delivery",
    "to_ids": 1
  },
  "cortex": {
    "default_category": "External analysis",
    "to_ids": 0
  },
  "boolean": {
    "default_category": "Other",
    "to_ids": 0
  }
},
"types": [
@@ -564,6 +620,7 @@
  "email-dst",
  "email-subject",
  "email-attachment",
  "email-body",
  "float",
  "url",
  "http-method",
@@ -576,7 +633,12 @@
  "pattern-in-traffic",
  "pattern-in-memory",
  "yara",
  "stix2-pattern",
  "sigma",
  "gene",
  "mime-type",
  "identity-card-number",
  "cookie",
  "vulnerability",
  "attachment",
  "malware-sample",
@@ -601,6 +663,7 @@
  "bin",
  "cc-number",
  "prtn",
  "phone-number",
  "threat-actor",
  "campaign-name",
  "campaign-id",
@@ -634,9 +697,12 @@
  "whois-registrant-email",
  "whois-registrant-phone",
  "whois-registrant-name",
  "whois-registrant-org",
  "whois-registrar",
  "whois-creation-date",
  "x509-fingerprint-sha1",
  "x509-fingerprint-md5",
  "x509-fingerprint-sha256",
  "dns-soa-email",
  "size-in-bytes",
  "counter",
@@ -646,6 +712,8 @@
  "ip-dst|port",
  "ip-src|port",
  "hostname|port",
  "mac-address",
  "mac-eui-64",
  "email-dst-display-name",
  "email-src-display-name",
  "email-header",
@@ -682,7 +750,9 @@
  "place-port-of-clearance",
  "place-port-of-onward-foreign-destination",
  "passenger-name-record-locator-number",
  "mobile-application-id"
  "mobile-application-id",
  "cortex",
  "boolean"
],
"categories": [
  "Internal reference",
@@ -757,6 +827,8 @@
    "filename|imphash",
    "filename|impfuzzy",
    "filename|pehash",
    "mac-address",
    "mac-eui-64",
    "ip-src",
    "ip-dst",
    "ip-dst|port",
@@ -767,13 +839,16 @@
    "email-dst",
    "email-subject",
    "email-attachment",
    "email-body",
    "url",
    "user-agent",
    "AS",
    "pattern-in-file",
    "pattern-in-traffic",
    "stix2-pattern",
    "yara",
    "sigma",
    "mime-type",
    "attachment",
    "malware-sample",
    "link",
@@ -783,9 +858,9 @@
    "hex",
    "vulnerability",
    "x509-fingerprint-sha1",
    "x509-fingerprint-md5",
    "x509-fingerprint-sha256",
    "other",
    "ip-dst|port",
    "ip-src|port",
    "hostname|port",
    "email-dst-display-name",
    "email-src-display-name",
@@ -795,7 +870,8 @@
    "email-mime-boundary",
    "email-thread-index",
    "email-message-id",
    "mobile-application-id"
    "mobile-application-id",
    "whois-registrant-email"
  ],
  "Artifacts dropped": [
    "md5",
@@ -830,6 +906,7 @@
    "pattern-in-file",
    "pattern-in-memory",
    "pdb",
    "stix2-pattern",
    "yara",
    "sigma",
    "attachment",
@@ -843,7 +920,12 @@
    "text",
    "hex",
    "x509-fingerprint-sha1",
    "other"
    "x509-fingerprint-md5",
    "x509-fingerprint-sha256",
    "other",
    "cookie",
    "gene",
    "mime-type"
  ],
  "Payload installation": [
    "md5",
@@ -878,6 +960,7 @@
    "pattern-in-file",
    "pattern-in-traffic",
    "pattern-in-memory",
    "stix2-pattern",
    "yara",
    "sigma",
    "vulnerability",
@@ -888,8 +971,11 @@
    "text",
    "hex",
    "x509-fingerprint-sha1",
    "x509-fingerprint-md5",
    "x509-fingerprint-sha256",
    "mobile-application-id",
    "other"
    "other",
    "mime-type"
  ],
  "Persistence mechanism": [
    "filename",
@@ -905,9 +991,12 @@
    "ip-dst",
    "ip-dst|port",
    "ip-src|port",
    "port",
    "hostname",
    "domain",
    "domain|ip",
    "mac-address",
    "mac-eui-64",
    "email-dst",
    "url",
    "uri",
@@ -916,13 +1005,15 @@
    "AS",
    "snort",
    "pattern-in-file",
    "stix2-pattern",
    "pattern-in-traffic",
    "attachment",
    "comment",
    "text",
    "x509-fingerprint-sha1",
    "other",
    "hex"
    "hex",
    "cookie"
  ],
  "Payload type": [
    "comment",
@@ -936,12 +1027,16 @@
    "whois-registrant-phone",
    "whois-registrant-email",
    "whois-registrant-name",
    "whois-registrant-org",
    "whois-registrar",
    "whois-creation-date",
    "comment",
    "text",
    "x509-fingerprint-sha1",
    "other"
    "x509-fingerprint-md5",
    "x509-fingerprint-sha256",
    "other",
    "dns-soa-email"
  ],
  "External analysis": [
    "md5",
@@ -955,6 +1050,8 @@
    "ip-dst",
    "ip-dst|port",
    "ip-src|port",
    "mac-address",
    "mac-eui-64",
    "hostname",
    "domain",
    "domain|ip",
@@ -974,8 +1071,11 @@
    "comment",
    "text",
    "x509-fingerprint-sha1",
    "x509-fingerprint-md5",
    "x509-fingerprint-sha256",
    "github-repository",
    "other"
    "other",
    "cortex"
  ],
  "Financial fraud": [
    "btc",
@@ -986,6 +1086,7 @@
    "bin",
    "cc-number",
    "prtn",
    "phone-number",
    "comment",
    "text",
    "other",
@@ -996,7 +1097,6 @@
    "text",
    "attachment",
    "comment",
    "text",
    "other",
    "hex"
  ],
@@ -1010,7 +1110,8 @@
    "email-dst",
    "comment",
    "text",
    "other"
    "other",
    "whois-registrant-email"
  ],
  "Person": [
    "first-name",
@@ -1038,7 +1139,9 @@
    "passenger-name-record-locator-number",
    "comment",
    "text",
    "other"
    "other",
    "phone-number",
    "identity-card-number"
  ],
  "Other": [
    "comment",
@@ -1050,7 +1153,9 @@
    "cpe",
    "port",
    "float",
    "hex"
    "hex",
    "phone-number",
    "boolean"
  ]
}
}
@@ -0,0 +1 @@
Subproject commit 7c2e07a50b944d265f92cfba712d872091c1c199
@@ -1,6 +1,6 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-


class PyMISPError(Exception):
    def __init__(self, message):
        super(PyMISPError, self).__init__(message)
@@ -29,3 +29,21 @@ class NoURL(PyMISPError):

class NoKey(PyMISPError):
    pass


class MISPObjectException(PyMISPError):
    pass


class InvalidMISPObject(MISPObjectException):
    """Exception raised when an object doesn't respect the constraints in the definition"""
    pass


class UnknownMISPObjectTemplate(MISPObjectException):
    """Exception raised when the template is unknown"""
    pass


class PyMISPInvalidFormat(PyMISPError):
    pass
1332
pymisp/mispevent.py
File diff suppressed because it is too large
@@ -0,0 +1,17 @@
import sys

from .vtreportobject import VTReportObject  # noqa
from .neo4j import Neo4j  # noqa
from .fileobject import FileObject  # noqa
from .peobject import PEObject, PESectionObject  # noqa
from .elfobject import ELFObject, ELFSectionObject  # noqa
from .machoobject import MachOObject, MachOSectionObject  # noqa
from .create_misp_object import make_binary_objects  # noqa
from .abstractgenerator import AbstractMISPObjectGenerator  # noqa
from .genericgenerator import GenericObjectGenerator  # noqa
from .openioc import load_openioc, load_openioc_file  # noqa
from .sbsignatureobject import SBSignatureObject  # noqa
from .fail2banobject import Fail2BanObject  # noqa

if sys.version_info >= (3, 6):
    from .emailobject import EMailObject  # noqa
@@ -0,0 +1,16 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import abc
import six
from .. import MISPObject


@six.add_metaclass(abc.ABCMeta)  # Remove that line when discarding python2 support.
# Python3 way: class MISPObjectGenerator(metaclass=abc.ABCMeta):
class AbstractMISPObjectGenerator(MISPObject):

    @abc.abstractmethod
    def generate_attributes(self):
        """Contains the logic where all the values of the object are gathered"""
        pass
@@ -0,0 +1,93 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import six

from . import FileObject, PEObject, ELFObject, MachOObject
from ..exceptions import MISPObjectException
import logging

logger = logging.getLogger('pymisp')

try:
    import lief
    from lief import Logger
    Logger.disable()
    HAS_LIEF = True
except ImportError:
    HAS_LIEF = False


class FileTypeNotImplemented(MISPObjectException):
    pass


def make_pe_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
    pe_object = PEObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
    misp_file.add_reference(pe_object.uuid, 'included-in', 'PE indicators')
    pe_sections = []
    for s in pe_object.sections:
        pe_sections.append(s)
    return misp_file, pe_object, pe_sections


def make_elf_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
    elf_object = ELFObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
    misp_file.add_reference(elf_object.uuid, 'included-in', 'ELF indicators')
    elf_sections = []
    for s in elf_object.sections:
        elf_sections.append(s)
    return misp_file, elf_object, elf_sections


def make_macho_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
    macho_object = MachOObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
    misp_file.add_reference(macho_object.uuid, 'included-in', 'MachO indicators')
    macho_sections = []
    for s in macho_object.sections:
        macho_sections.append(s)
    return misp_file, macho_object, macho_sections


def make_binary_objects(filepath=None, pseudofile=None, filename=None, standalone=True, default_attributes_parameters={}):
    misp_file = FileObject(filepath=filepath, pseudofile=pseudofile, filename=filename,
                           standalone=standalone, default_attributes_parameters=default_attributes_parameters)
    if HAS_LIEF and (filepath or (pseudofile and filename)):
        try:
            if filepath:
                lief_parsed = lief.parse(filepath=filepath)
            else:
                if six.PY2:
                    logger.critical('Pseudofile is not supported in python2. Just update.')
                    lief_parsed = None
                else:
                    lief_parsed = lief.parse(raw=pseudofile.getvalue(), name=filename)
            if isinstance(lief_parsed, lief.PE.Binary):
                return make_pe_objects(lief_parsed, misp_file, standalone, default_attributes_parameters)
            elif isinstance(lief_parsed, lief.ELF.Binary):
                return make_elf_objects(lief_parsed, misp_file, standalone, default_attributes_parameters)
            elif isinstance(lief_parsed, lief.MachO.Binary):
                return make_macho_objects(lief_parsed, misp_file, standalone, default_attributes_parameters)
        except lief.bad_format as e:
            logger.warning('Bad format: {}'.format(e))
        except lief.bad_file as e:
            logger.warning('Bad file: {}'.format(e))
        except lief.conversion_error as e:
            logger.warning('Conversion error: {}'.format(e))
        except lief.builder_error as e:
            logger.warning('Builder error: {}'.format(e))
        except lief.parser_error as e:
            logger.warning('Parser error: {}'.format(e))
        except lief.integrity_error as e:
            logger.warning('Integrity error: {}'.format(e))
        except lief.pe_error as e:
            logger.warning('PE error: {}'.format(e))
        except lief.type_error as e:
            logger.warning('Type error: {}'.format(e))
        except lief.exception as e:
            logger.warning('Lief exception: {}'.format(e))
        except FileTypeNotImplemented as e:
            logger.warning(e)
    if not HAS_LIEF:
        logger.warning('Please install lief, documentation here: https://github.com/lief-project/LIEF')
    return misp_file, None, None
@@ -0,0 +1,91 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from .abstractgenerator import AbstractMISPObjectGenerator
from ..exceptions import InvalidMISPObject
from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
import logging

logger = logging.getLogger('pymisp')

try:
    import lief
    HAS_LIEF = True
except ImportError:
    HAS_LIEF = False

try:
    import pydeep
    HAS_PYDEEP = True
except ImportError:
    HAS_PYDEEP = False


class ELFObject(AbstractMISPObjectGenerator):

    def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
        if not HAS_PYDEEP:
            logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
        if not HAS_LIEF:
            raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
        if pseudofile:
            if isinstance(pseudofile, BytesIO):
                self.__elf = lief.ELF.parse(raw=pseudofile.getvalue())
            elif isinstance(pseudofile, bytes):
                self.__elf = lief.ELF.parse(raw=pseudofile)
            else:
                raise InvalidMISPObject('Pseudo file can be BytesIO or bytes got {}'.format(type(pseudofile)))
        elif filepath:
            self.__elf = lief.ELF.parse(filepath)
        elif parsed:
            # Got an already parsed blob
            if isinstance(parsed, lief.ELF.Binary):
                self.__elf = parsed
            else:
                raise InvalidMISPObject('Not a lief.ELF.Binary: {}'.format(type(parsed)))
        super(ELFObject, self).__init__('elf', standalone=standalone, **kwargs)
        self.generate_attributes()

    def generate_attributes(self):
        # General information
        self.add_attribute('type', value=str(self.__elf.header.file_type).split('.')[1])
        self.add_attribute('entrypoint-address', value=self.__elf.entrypoint)
        self.add_attribute('arch', value=str(self.__elf.header.machine_type).split('.')[1])
        self.add_attribute('os_abi', value=str(self.__elf.header.identity_os_abi).split('.')[1])
        # Sections
        self.sections = []
        if self.__elf.sections:
            pos = 0
            for section in self.__elf.sections:
                s = ELFSectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
                self.add_reference(s.uuid, 'included-in', 'Section {} of ELF'.format(pos))
                pos += 1
                self.sections.append(s)
        self.add_attribute('number-sections', value=len(self.sections))


class ELFSectionObject(AbstractMISPObjectGenerator):

    def __init__(self, section, standalone=True, **kwargs):
        # Python3 way
        # super().__init__('elf-section')
        super(ELFSectionObject, self).__init__('elf-section', standalone=standalone, **kwargs)
        self.__section = section
        self.__data = bytes(self.__section.content)
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('name', value=self.__section.name)
        self.add_attribute('type', value=str(self.__section.type).split('.')[1])
        for flag in self.__section.flags_list:
            self.add_attribute('flag', value=str(flag).split('.')[1])
        size = self.add_attribute('size-in-bytes', value=self.__section.size)
        if int(size.value) > 0:
            self.add_attribute('entropy', value=self.__section.entropy)
            self.add_attribute('md5', value=md5(self.__data).hexdigest())
            self.add_attribute('sha1', value=sha1(self.__data).hexdigest())
            self.add_attribute('sha256', value=sha256(self.__data).hexdigest())
            self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
            if HAS_PYDEEP:
                self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
@@ -0,0 +1,47 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from ..exceptions import InvalidMISPObject
from .abstractgenerator import AbstractMISPObjectGenerator
from io import BytesIO
import logging
from email import message_from_bytes

logger = logging.getLogger('pymisp')


class EMailObject(AbstractMISPObjectGenerator):

    def __init__(self, filepath=None, pseudofile=None, standalone=True, **kwargs):
        if filepath:
            with open(filepath, 'rb') as f:
                pseudofile = BytesIO(f.read())
        elif pseudofile and isinstance(pseudofile, BytesIO):
            pseudofile = pseudofile
        else:
            raise InvalidMISPObject('File buffer (BytesIO) or a path is required.')
        # PY3 way:
        # super().__init__('email')
        super(EMailObject, self).__init__('email', standalone=standalone, **kwargs)
        self.__email = message_from_bytes(pseudofile.getvalue())
        self.generate_attributes()

    def generate_attributes(self):
        if 'Reply-To' in self.__email:
            self.add_attribute('reply-to', value=self.__email['Reply-To'])
        if 'Message-ID' in self.__email:
            self.add_attribute('message-id', value=self.__email['Message-ID'])
        if 'To' in self.__email:
            for to in self.__email['To'].split(','):
                self.add_attribute('to', value=to.strip())
        if 'Cc' in self.__email:
            for cc in self.__email['Cc'].split(','):
                self.add_attribute('cc', value=cc.strip())
        if 'Subject' in self.__email:
            self.add_attribute('subject', value=self.__email['Subject'])
        if 'From' in self.__email:
            for e_from in self.__email['From'].split(','):
                self.add_attribute('from', value=e_from.strip())
        if 'Return-Path' in self.__email:
            self.add_attribute('return-path', value=self.__email['Return-Path'])
        # TODO: self.add_attribute('attachment', value=)
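The header-walking in `generate_attributes` above relies only on the standard library's `email` parser; a minimal standalone sketch of the same splitting logic (the raw message below is made up for illustration):

```python
from email import message_from_bytes

raw = (b"From: alice@example.com\r\n"
       b"To: bob@example.com, carol@example.com\r\n"
       b"Subject: hello\r\n"
       b"\r\n"
       b"body\r\n")
msg = message_from_bytes(raw)

# Multi-recipient headers are split on commas and stripped,
# as the object generator does for To/Cc/From.
recipients = [to.strip() for to in msg['To'].split(',')]
print(msg['Subject'])  # hello
print(recipients)      # ['bob@example.com', 'carol@example.com']
```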
@@ -0,0 +1,43 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

try:
    from pymispgalaxies import Clusters
    has_pymispgalaxies = True
except ImportError:
    has_pymispgalaxies = False

try:
    from pytaxonomies import Taxonomies
    has_pytaxonomies = True
except ImportError:
    has_pytaxonomies = False


def revert_tag_from_galaxies(tag):
    clusters = Clusters()
    try:
        return clusters.revert_machinetag(tag)
    except Exception:
        return []


def revert_tag_from_taxonomies(tag):
    taxonomies = Taxonomies()
    try:
        return taxonomies.revert_machinetag(tag)
    except Exception:
        return []


def search_taxonomies(query):
    taxonomies = Taxonomies()
    found = taxonomies.search(query)
    if not found:
        found = taxonomies.search(query, expanded=True)
    return found


def search_galaxies(query):
    clusters = Clusters()
    return clusters.search(query)
@@ -0,0 +1,34 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from datetime import datetime
from .abstractgenerator import AbstractMISPObjectGenerator
import logging
from dateutil.parser import parse

logger = logging.getLogger('pymisp')


class Fail2BanObject(AbstractMISPObjectGenerator):

    def __init__(self, parameters, standalone=True, **kwargs):
        super(Fail2BanObject, self).__init__('fail2ban', standalone=standalone, **kwargs)
        self.__parameters = parameters
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('banned-ip', value=self.__parameters['banned-ip'])
        self.add_attribute('attack-type', value=self.__parameters['attack-type'])
        try:
            timestamp = parse(self.__parameters['processing-timestamp'])
        except Exception:
            timestamp = datetime.now()

        self.add_attribute('processing-timestamp', value=timestamp.isoformat())

        if 'failures' in self.__parameters:
            self.add_attribute('failures', value=self.__parameters['failures'])
        if 'sensor' in self.__parameters:
            self.add_attribute('sensor', value=self.__parameters['sensor'])
        if 'victim' in self.__parameters:
            self.add_attribute('victim', value=self.__parameters['victim'])
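The timestamp handling above parses with `dateutil` and falls back to `datetime.now()` on failure; a stdlib-only sketch of the same fallback pattern (the function name is hypothetical, and `fromisoformat` needs Python 3.7+ and only accepts ISO-formatted strings, unlike dateutil's flexible parser):

```python
from datetime import datetime

def parse_processing_timestamp(value):
    """Parse an ISO timestamp, falling back to 'now' on bad input,
    mirroring the try/except around dateutil's parse() above."""
    try:
        return datetime.fromisoformat(value)
    except (TypeError, ValueError):
        return datetime.now()

print(parse_processing_timestamp('2018-04-12T09:30:00').isoformat())  # 2018-04-12T09:30:00
```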
@@ -0,0 +1,87 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from ..exceptions import InvalidMISPObject
from .abstractgenerator import AbstractMISPObjectGenerator
import os
from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
import math
from collections import Counter
import logging

logger = logging.getLogger('pymisp')


try:
    import pydeep
    HAS_PYDEEP = True
except ImportError:
    HAS_PYDEEP = False

try:
    import magic
    HAS_MAGIC = True
except ImportError:
    HAS_MAGIC = False


class FileObject(AbstractMISPObjectGenerator):

    def __init__(self, filepath=None, pseudofile=None, filename=None, standalone=True, **kwargs):
        if not HAS_PYDEEP:
            logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
        if not HAS_MAGIC:
            logger.warning("Please install python-magic: pip install python-magic.")
        if filename:
            # Useful in case the file is copied with a pre-defined name by a script but we want to keep the original name
            self.__filename = filename
        elif filepath:
            self.__filename = os.path.basename(filepath)
        else:
            raise InvalidMISPObject('A file name is required (either in the path, or as a parameter).')

        if filepath:
            with open(filepath, 'rb') as f:
                self.__pseudofile = BytesIO(f.read())
        elif pseudofile and isinstance(pseudofile, BytesIO):
            # WARNING: lief.parse requires a path
            self.__pseudofile = pseudofile
        else:
            raise InvalidMISPObject('File buffer (BytesIO) or a path is required.')
        # PY3 way:
        # super().__init__('file')
        super(FileObject, self).__init__('file', standalone=standalone, **kwargs)
        self.__data = self.__pseudofile.getvalue()
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('filename', value=self.__filename)
        size = self.add_attribute('size-in-bytes', value=len(self.__data))
        if int(size.value) > 0:
            self.add_attribute('entropy', value=self.__entropy_H(self.__data))
            self.add_attribute('md5', value=md5(self.__data).hexdigest())
            self.add_attribute('sha1', value=sha1(self.__data).hexdigest())
            self.add_attribute('sha256', value=sha256(self.__data).hexdigest())
            self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
            self.add_attribute('malware-sample', value=self.__filename, data=self.__pseudofile)
            if HAS_MAGIC:
                self.add_attribute('mimetype', value=magic.from_buffer(self.__data))
            if HAS_PYDEEP:
                self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())

    def __entropy_H(self, data):
        """Calculate the entropy of a chunk of data."""
        # NOTE: copy of the entropy function from pefile

        if len(data) == 0:
            return 0.0

        occurences = Counter(bytearray(data))

        entropy = 0
        for x in occurences.values():
            p_x = float(x) / len(data)
            entropy -= p_x * math.log(p_x, 2)

        return entropy
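The `__entropy_H` helper above is self-contained (a copy of pefile's entropy routine) and can be checked in isolation; a sketch with two boundary cases:

```python
import math
from collections import Counter

def entropy_H(data):
    """Shannon entropy in bits per byte, same computation as the method above."""
    if len(data) == 0:
        return 0.0
    occurrences = Counter(bytearray(data))
    entropy = 0.0
    for x in occurrences.values():
        p_x = float(x) / len(data)
        entropy -= p_x * math.log(p_x, 2)
    return entropy

print(entropy_H(b'\x00' * 16))       # 0.0 (constant data carries no information)
print(entropy_H(bytes(range(256))))  # ~8.0 (uniformly distributed bytes)
```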
@@ -0,0 +1,16 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from .abstractgenerator import AbstractMISPObjectGenerator


class GenericObjectGenerator(AbstractMISPObjectGenerator):

    def generate_attributes(self, attributes):
        for attribute in attributes:
            for object_relation, value in attribute.items():
                if isinstance(value, dict):
                    self.add_attribute(object_relation, **value)
                else:
                    # In this case, we need a valid template, as all the other parameters will be pre-set.
                    self.add_attribute(object_relation, value=value)
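The generator above accepts a list of one-key dicts, where each value is either a bare value or a dict of attribute parameters. A dependency-free sketch of that flattening convention (the helper name is illustrative, not part of PyMISP):

```python
def flatten_attributes(attributes):
    """Normalise [{relation: value-or-params}, ...] into (relation, params)
    pairs, mirroring how GenericObjectGenerator iterates its input."""
    flat = []
    for attribute in attributes:
        for object_relation, value in attribute.items():
            if isinstance(value, dict):
                flat.append((object_relation, value))
            else:
                flat.append((object_relation, {'value': value}))
    return flat

pairs = flatten_attributes([{'ip': '8.8.8.8'},
                            {'ip': {'value': '1.1.1.1', 'to_ids': False}}])
print(pairs)
```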
@@ -0,0 +1,26 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

try:
    from pymispwarninglists import WarningLists
    has_pymispwarninglists = True
except ImportError:
    has_pymispwarninglists = False


def from_instance(pymisp_instance, slow_search=False):
    """Load the warninglists from an existing MISP instance
    :pymisp_instance: Already instantiated PyMISP instance."""

    warninglists_index = pymisp_instance.get_warninglists()['Warninglists']
    all_warningslists = []
    for warninglist in warninglists_index:
        wl = pymisp_instance.get_warninglist(warninglist['Warninglist']['id'])['Warninglist']
        wl['list'] = wl.pop('WarninglistEntry')
        all_warningslists.append(wl)

    return WarningLists(slow_search, all_warningslists)


def from_package(slow_search=False):
    return WarningLists(slow_search)
@ -0,0 +1,91 @@
|
|||
#!/usr/bin/env python3
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
from ..exceptions import InvalidMISPObject
|
||||
from .abstractgenerator import AbstractMISPObjectGenerator
|
||||
from io import BytesIO
|
||||
from hashlib import md5, sha1, sha256, sha512
|
||||
import logging
|
||||
|
||||
logger = logging.getLogger('pymisp')
|
||||
|
||||
|
||||
try:
|
||||
import lief
|
||||
HAS_LIEF = True
|
||||
except ImportError:
|
||||
HAS_LIEF = False
|
||||
|
||||
try:
|
||||
import pydeep
|
||||
HAS_PYDEEP = True
|
||||
except ImportError:
|
||||
HAS_PYDEEP = False
|
||||
|
||||
|
||||
class MachOObject(AbstractMISPObjectGenerator):

    def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
        if not HAS_PYDEEP:
            logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
        if not HAS_LIEF:
            raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
        if pseudofile:
            if isinstance(pseudofile, BytesIO):
                self.__macho = lief.MachO.parse(raw=pseudofile.getvalue())
            elif isinstance(pseudofile, bytes):
                self.__macho = lief.MachO.parse(raw=pseudofile)
            else:
                raise InvalidMISPObject('Pseudo file can be BytesIO or bytes got {}'.format(type(pseudofile)))
        elif filepath:
            self.__macho = lief.MachO.parse(filepath)
        elif parsed:
            # Got an already parsed blob
            if isinstance(parsed, lief.MachO.Binary):
                self.__macho = parsed
            else:
                raise InvalidMISPObject('Not a lief.MachO.Binary: {}'.format(type(parsed)))
        # Python3 way
        # super().__init__('macho')
        super(MachOObject, self).__init__('macho', standalone=standalone, **kwargs)
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('type', value=str(self.__macho.header.file_type).split('.')[1])
        self.add_attribute('name', value=self.__macho.name)
        # General information
        if self.__macho.has_entrypoint:
            self.add_attribute('entrypoint-address', value=self.__macho.entrypoint)
        # Sections
        self.sections = []
        if self.__macho.sections:
            pos = 0
            for section in self.__macho.sections:
                s = MachOSectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
                self.add_reference(s.uuid, 'included-in', 'Section {} of MachO'.format(pos))
                pos += 1
                self.sections.append(s)
        self.add_attribute('number-sections', value=len(self.sections))


class MachOSectionObject(AbstractMISPObjectGenerator):

    def __init__(self, section, standalone=True, **kwargs):
        # Python3 way
        # super().__init__('macho-section')
        super(MachOSectionObject, self).__init__('macho-section', standalone=standalone, **kwargs)
        self.__section = section
        self.__data = bytes(self.__section.content)
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('name', value=self.__section.name)
        size = self.add_attribute('size-in-bytes', value=self.__section.size)
        if int(size.value) > 0:
            self.add_attribute('entropy', value=self.__section.entropy)
            self.add_attribute('md5', value=md5(self.__data).hexdigest())
            self.add_attribute('sha1', value=sha1(self.__data).hexdigest())
            self.add_attribute('sha256', value=sha256(self.__data).hexdigest())
            self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
            if HAS_PYDEEP:
                self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
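The digest set attached by the section generators above depends only on the standard library; a minimal sketch of the same hashing pattern (`section_hashes` is a hypothetical helper, not part of PyMISP):

```python
from hashlib import md5, sha1, sha256, sha512

def section_hashes(data):
    """Compute the digest set the *SectionObject classes attach as attributes."""
    return {name: algo(data).hexdigest()
            for name, algo in (('md5', md5), ('sha1', sha1),
                               ('sha256', sha256), ('sha512', sha512))}

# Any bytes buffer works; a real caller would pass bytes(section.content)
hashes = section_hashes(b'\x00' * 16)
```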
@@ -1,9 +1,8 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

import glob
import os
from pymisp import MISPEvent
from .. import MISPEvent

try:
    from py2neo import authenticate, Graph, Node, Relationship

@@ -54,5 +53,5 @@ class Neo4j():
        av = Relationship(attr_node, "is", val)
        s = val | ev | av
        tx.merge(s)
        #tx.graph.push(s)
        # tx.graph.push(s)
        tx.commit()
@@ -1,9 +1,9 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import os

from pymisp import MISPEvent
from .. import MISPEvent
try:
    from bs4 import BeautifulSoup
    has_bs4 = True

@@ -11,7 +11,6 @@ except ImportError:
    has_bs4 = False

iocMispMapping = {
    # ~ @Link https://wiki.ops.fr/doku.php/manuels:misp:event-guidelines
    'CookieHistoryItem/HostName': {'type': 'hostname', 'comment': 'CookieHistory.'},

    'DriverItem/DriverName': {'category': 'Artifacts dropped', 'type': 'other', 'comment': 'DriverName.'},

@@ -151,7 +150,7 @@ def extract_field(report, field_name):
    data = report.find(field_name.lower())
    if data and hasattr(data, 'text'):
        return data.text
    return None
    return ''


def load_openioc_file(openioc_path):

@@ -279,17 +278,3 @@ def set_all_attributes(openioc, misp_event):
        misp_event.add_attribute(**attribute_values)

    return misp_event


if __name__ == '__main__':
    import requests
    # test file for composite
    url = 'https://raw.githubusercontent.com/fireeye/iocs/master/BlogPosts/9cee306d-5441-4cd3-932d-f3119752634c.ioc'
    # ~ url = 'https://raw.githubusercontent.com/MISP/misp-modules/master/tests/openioc.xml'
    x = requests.get(url)
    mispEvent = load_openioc(x.text)
    print(mispEvent)
    # ~ from pymisp import PyMISP
    # ~ misp = PyMISP('http://misp.local', 'xxxxx')
    # ~ r = misp.add_event(mispEvent)
    # ~ print(r)
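The hunk above changes `extract_field` to return `''` instead of `None` when a field is absent. A dict-based stand-in (not the real BeautifulSoup-backed version; `Node` is a hypothetical report node) shows why callers can then concatenate or strip without `None` checks:

```python
class Node:
    """Hypothetical stand-in for a parsed report node with a .text payload."""
    def __init__(self, text):
        self.text = text

def extract_field(report, field_name):
    data = report.get(field_name.lower())
    if data and hasattr(data, 'text'):
        return data.text
    return ''  # empty string instead of None, as in the new version

report = {'context': Node('IOC context')}
found = extract_field(report, 'Context')
missing = extract_field(report, 'Nope')
combined = (found + missing).strip()  # safe: no TypeError for absent fields
```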
@@ -0,0 +1,138 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from ..exceptions import InvalidMISPObject
from .abstractgenerator import AbstractMISPObjectGenerator
from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
from datetime import datetime
import logging

logger = logging.getLogger('pymisp')

try:
    import lief
    HAS_LIEF = True
except ImportError:
    HAS_LIEF = False

try:
    import pydeep
    HAS_PYDEEP = True
except ImportError:
    HAS_PYDEEP = False


class PEObject(AbstractMISPObjectGenerator):

    def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
        if not HAS_PYDEEP:
            logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
        if not HAS_LIEF:
            raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
        if pseudofile:
            if isinstance(pseudofile, BytesIO):
                self.__pe = lief.PE.parse(raw=pseudofile.getvalue())
            elif isinstance(pseudofile, bytes):
                self.__pe = lief.PE.parse(raw=pseudofile)
            else:
                raise InvalidMISPObject('Pseudo file can be BytesIO or bytes got {}'.format(type(pseudofile)))
        elif filepath:
            self.__pe = lief.PE.parse(filepath)
        elif parsed:
            # Got an already parsed blob
            if isinstance(parsed, lief.PE.Binary):
                self.__pe = parsed
            else:
                raise InvalidMISPObject('Not a lief.PE.Binary: {}'.format(type(parsed)))
        # Python3 way
        # super().__init__('pe')
        super(PEObject, self).__init__('pe', standalone=standalone, **kwargs)
        self.generate_attributes()

    def _is_exe(self):
        if not self._is_dll() and not self._is_driver():
            return self.__pe.header.has_characteristic(lief.PE.HEADER_CHARACTERISTICS.EXECUTABLE_IMAGE)
        return False

    def _is_dll(self):
        return self.__pe.header.has_characteristic(lief.PE.HEADER_CHARACTERISTICS.DLL)

    def _is_driver(self):
        # List from pefile
        system_DLLs = set(('ntoskrnl.exe', 'hal.dll', 'ndis.sys', 'bootvid.dll', 'kdcom.dll'))
        if system_DLLs.intersection([imp.lower() for imp in self.__pe.libraries]):
            return True
        return False

    def _get_pe_type(self):
        if self._is_dll():
            return 'dll'
        elif self._is_driver():
            return 'driver'
        elif self._is_exe():
            return 'exe'
        else:
            return 'unknown'

    def generate_attributes(self):
        self.add_attribute('type', value=self._get_pe_type())
        # General information
        self.add_attribute('entrypoint-address', value=self.__pe.entrypoint)
        self.add_attribute('compilation-timestamp', value=datetime.utcfromtimestamp(self.__pe.header.time_date_stamps).isoformat())
        # self.imphash = self.__pe.get_imphash()
        try:
            if (self.__pe.has_resources and
                    self.__pe.resources_manager.has_version and
                    self.__pe.resources_manager.version.has_string_file_info and
                    self.__pe.resources_manager.version.string_file_info.langcode_items):
                fileinfo = dict(self.__pe.resources_manager.version.string_file_info.langcode_items[0].items.items())
                self.add_attribute('original-filename', value=fileinfo.get('OriginalFilename'))
                self.add_attribute('internal-filename', value=fileinfo.get('InternalName'))
                self.add_attribute('file-description', value=fileinfo.get('FileDescription'))
                self.add_attribute('file-version', value=fileinfo.get('FileVersion'))
                self.add_attribute('lang-id', value=self.__pe.resources_manager.version.string_file_info.langcode_items[0].key)
                self.add_attribute('product-name', value=fileinfo.get('ProductName'))
                self.add_attribute('product-version', value=fileinfo.get('ProductVersion'))
                self.add_attribute('company-name', value=fileinfo.get('CompanyName'))
                self.add_attribute('legal-copyright', value=fileinfo.get('LegalCopyright'))
        except lief.read_out_of_bound:
            # The file is corrupted
            pass
        # Sections
        self.sections = []
        if self.__pe.sections:
            pos = 0
            for section in self.__pe.sections:
                s = PESectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
                self.add_reference(s.uuid, 'included-in', 'Section {} of PE'.format(pos))
                if ((self.__pe.entrypoint >= section.virtual_address) and
                        (self.__pe.entrypoint < (section.virtual_address + section.virtual_size))):
                    self.add_attribute('entrypoint-section-at-position', value='{}|{}'.format(section.name, pos))
                pos += 1
                self.sections.append(s)
        self.add_attribute('number-sections', value=len(self.sections))
        # TODO: TLSSection / DIRECTORY_ENTRY_TLS


class PESectionObject(AbstractMISPObjectGenerator):

    def __init__(self, section, standalone=True, **kwargs):
        # Python3 way
        # super().__init__('pe-section')
        super(PESectionObject, self).__init__('pe-section', standalone=standalone, **kwargs)
        self.__section = section
        self.__data = bytes(self.__section.content)
        self.generate_attributes()

    def generate_attributes(self):
        self.add_attribute('name', value=self.__section.name)
        size = self.add_attribute('size-in-bytes', value=self.__section.size)
        if int(size.value) > 0:
            self.add_attribute('entropy', value=self.__section.entropy)
            self.add_attribute('md5', value=md5(self.__data).hexdigest())
            self.add_attribute('sha1', value=sha1(self.__data).hexdigest())
            self.add_attribute('sha256', value=sha256(self.__data).hexdigest())
            self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
            if HAS_PYDEEP:
                self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
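The `entrypoint-section-at-position` logic in `PEObject.generate_attributes` is a half-open range test over each section's virtual address space. A simplified sketch, using hypothetical `(name, virtual_address, virtual_size)` tuples in place of lief's section objects:

```python
def entrypoint_section(entrypoint, sections):
    """Return 'name|position' of the section containing the entrypoint.

    `sections` is a list of (name, virtual_address, virtual_size) tuples,
    a simplified stand-in for lief's section objects.
    """
    for pos, (name, va, vsize) in enumerate(sections):
        # half-open interval: va <= entrypoint < va + vsize
        if va <= entrypoint < va + vsize:
            return '{}|{}'.format(name, pos)
    return None

# A PE with a classic layout: entrypoint inside .text
sections = [('.text', 0x1000, 0x2000), ('.data', 0x3000, 0x1000)]
result = entrypoint_section(0x1500, sections)
```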
@@ -0,0 +1,21 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

from .abstractgenerator import AbstractMISPObjectGenerator


class SBSignatureObject(AbstractMISPObjectGenerator):
    '''
    Sandbox Analyzer
    '''
    def __init__(self, software, report, standalone=True, **kwargs):
        super(SBSignatureObject, self).__init__("sb-signature", standalone=standalone, **kwargs)
        self._software = software
        self._report = report
        self.generate_attributes()

    def generate_attributes(self):
        ''' Parse the report for relevant attributes '''
        self.add_attribute("software", value=self._software)
        for (signature_name, description) in self._report:
            self.add_attribute("signature", value=signature_name, comment=description)
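`SBSignatureObject` above expects `report` as an iterable of `(signature_name, description)` pairs. A hypothetical flattening of the same loop into plain tuples (`signature_attributes` is illustrative only, not part of PyMISP):

```python
def signature_attributes(software, report):
    """Mimic SBSignatureObject.generate_attributes: one 'software' entry
    plus one 'signature' entry per (name, description) pair."""
    attrs = [('software', software, None)]
    for name, description in report:
        attrs.append(('signature', name, description))
    return attrs

attrs = signature_attributes('cuckoo', [('persistence', 'Installs itself for autorun')])
```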
@@ -1,4 +1,3 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-

try:
@@ -0,0 +1,86 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import re

import requests
try:
    import validators
    has_validators = True
except ImportError:
    has_validators = False


from .abstractgenerator import AbstractMISPObjectGenerator
from .. import InvalidMISPObject


class VTReportObject(AbstractMISPObjectGenerator):
    '''
    VirusTotal Report

    :apikey: VirusTotal API key (private works, but only public features are supported right now)

    :indicator: IOC to search VirusTotal for
    '''
    def __init__(self, apikey, indicator, vt_proxies=None, standalone=True, **kwargs):
        # PY3 way:
        # super().__init__("virustotal-report")
        super(VTReportObject, self).__init__("virustotal-report", standalone=standalone, **kwargs)
        indicator = indicator.strip()
        self._resource_type = self.__validate_resource(indicator)
        if self._resource_type:
            self._proxies = vt_proxies
            self._report = self.__query_virustotal(apikey, indicator)
            self.generate_attributes()
        else:
            error_msg = "A valid indicator is required. (One of type url, md5, sha1, sha256). Received '{}' instead".format(indicator)
            raise InvalidMISPObject(error_msg)

    def get_report(self):
        return self._report

    def generate_attributes(self):
        ''' Parse the VirusTotal report for relevant attributes '''
        self.add_attribute("last-submission", value=self._report["scan_date"])
        self.add_attribute("permalink", value=self._report["permalink"])
        ratio = "{}/{}".format(self._report["positives"], self._report["total"])
        self.add_attribute("detection-ratio", value=ratio)

    def __validate_resource(self, ioc):
        '''
        Validate the data type of an indicator.
        Domains and IP addresses aren't supported because
        they don't return the same type of data as the URLs/files do

        :ioc: Indicator to search VirusTotal for
        '''
        if not has_validators:
            raise Exception('You need to install validators: pip install validators')
        if validators.url(ioc):
            return "url"
        elif re.match(r"\b([a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{64})\b", ioc):
            return "file"
        return False

    def __query_virustotal(self, apikey, resource):
        '''
        Query VirusTotal for information about an indicator

        :apikey: VirusTotal API key

        :resource: Indicator to search in VirusTotal
        '''
        url = "https://www.virustotal.com/vtapi/v2/{}/report".format(self._resource_type)
        params = {"apikey": apikey, "resource": resource}
        # for now assume we're using a public API key - we'll figure out private keys later
        if self._proxies:
            report = requests.get(url, params=params, proxies=self._proxies)
        else:
            report = requests.get(url, params=params)
        report = report.json()
        if report["response_code"] == 1:
            return report
        else:
            error_msg = "{}: {}".format(resource, report["verbose_msg"])
            raise InvalidMISPObject(error_msg)
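`__validate_resource` above classifies any md5/sha1/sha256-looking hex digest (32, 40, or 64 characters) as a "file" resource; the same regex can be exercised in isolation (`resource_type` is an illustrative helper, not the PyMISP method):

```python
import re

# md5 = 32, sha1 = 40, sha256 = 64 hex characters, word-bounded
HASH_RE = r"\b([a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{64})\b"

def resource_type(ioc):
    # hex digest of md5/sha1/sha256 length -> 'file'; anything else -> None
    return 'file' if re.match(HASH_RE, ioc) else None

t_md5 = resource_type('7637beddacbeac59d44469b2b120b9e6')  # 32 hex chars
t_other = resource_type('not-a-hash')
```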
setup.py

@@ -1,4 +1,4 @@
#!/usr/bin/python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from setuptools import setup
import pymisp

@@ -26,8 +26,24 @@ setup(
        'Topic :: Security',
        'Topic :: Internet',
    ],
    test_suite="tests",
    install_requires=['requests', 'python-dateutil', 'jsonschema'],
    test_suite="tests.test_offline",
    install_requires=['six', 'requests', 'python-dateutil', 'jsonschema', 'setuptools>=36.4'],
    extras_require={'fileobjects': ['lief>=0.8', 'python-magic'],
                    'neo': ['py2neo'],
                    'openioc': ['beautifulsoup4'],
                    'virustotal': ['validators'],
                    'warninglists': ['pymispwarninglists']},
    tests_require=[
        'jsonschema',
        'python-dateutil',
        'python-magic',
        'requests-mock',
        'six'
    ],
    include_package_data=True,
    package_data={'data': ['schema.json', 'schema-lax.json', 'describeTypes.json']},
    package_data={'pymisp': ['data/*.json',
                             'data/misp-objects/schema_objects.json',
                             'data/misp-objects/schema_relationships.json',
                             'data/misp-objects/objects/*/definition.json',
                             'data/misp-objects/relationships/definition.json']},
)
@@ -0,0 +1,23 @@
{
  "Event": {
    "Attribute": [
      {
        "Tag": [
          {
            "name": "osint"
          }
        ],
        "category": "Payload delivery",
        "disable_correlation": false,
        "to_ids": true,
        "type": "filename",
        "value": "bar.exe"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,25 @@
{
  "Event": {
    "Attribute": [
      {
        "Tag": [
          {
            "name": "osint"
          }
        ],
        "category": "Payload delivery",
        "deleted": true,
        "disable_correlation": false,
        "id": "42",
        "to_ids": true,
        "type": "filename",
        "value": "bar.exe"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,55 @@
{
  "Event": {
    "Object": [
      {
        "Attribute": [
          {
            "category": "Attribution",
            "disable_correlation": false,
            "object_relation": "registrar",
            "to_ids": false,
            "type": "whois-registrar",
            "value": "registar.example.com"
          },
          {
            "category": "Network activity",
            "disable_correlation": false,
            "object_relation": "domain",
            "to_ids": true,
            "type": "domain",
            "value": "domain.example.com"
          },
          {
            "category": "Network activity",
            "disable_correlation": true,
            "object_relation": "nameserver",
            "to_ids": false,
            "type": "hostname",
            "value": "ns1.example.com"
          },
          {
            "category": "External analysis",
            "disable_correlation": false,
            "object_relation": "nameserver",
            "to_ids": true,
            "type": "hostname",
            "value": "ns2.example.com"
          }
        ],
        "description": "Whois records information for a domain name or an IP address.",
        "distribution": 5,
        "meta-category": "network",
        "name": "whois",
        "sharing_group_id": 0,
        "template_uuid": "429faea1-34ff-47af-8a00-7c62d3be5a6a",
        "template_version": 9,
        "uuid": "a"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,10 @@
{
  "Event": {
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "published": true,
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,59 @@
{
  "Event": {
    "Object": [
      {
        "Attribute": [
          {
            "Tag": [
              {
                "name": "blah"
              }
            ],
            "category": "Payload delivery",
            "disable_correlation": true,
            "object_relation": "filename",
            "to_ids": true,
            "type": "filename",
            "value": "bar"
          }
        ],
        "ObjectReference": [
          {
            "comment": "foo",
            "object_uuid": "a",
            "referenced_uuid": "b",
            "relationship_type": "baz"
          }
        ],
        "description": "File object describing a file with meta-information",
        "distribution": 5,
        "meta-category": "file",
        "name": "file",
        "sharing_group_id": 0,
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": 10,
        "uuid": "a"
      },
      {
        "Attribute": [
          {
            "category": "External analysis",
            "disable_correlation": false,
            "object_relation": "url",
            "to_ids": true,
            "type": "url",
            "value": "https://www.circl.lu"
          }
        ],
        "description": "url object describes an url along with its normalized field (like extracted using faup parsing library) and its metadata.",
        "distribution": 5,
        "meta-category": "network",
        "name": "url",
        "sharing_group_id": 0,
        "template_uuid": "60efb77b-40b5-4c46-871b-ed1ed999fce5",
        "template_version": 6,
        "uuid": "b"
      }
    ]
  }
}
@@ -0,0 +1,56 @@
{
  "Event": {
    "Object": [
      {
        "Attribute": [
          {
            "Tag": [
              {
                "name": "blah"
              }
            ],
            "category": "Payload delivery",
            "disable_correlation": true,
            "object_relation": "filename",
            "to_ids": true,
            "type": "filename",
            "value": "bar"
          }
        ],
        "description": "File object describing a file with meta-information",
        "distribution": 5,
        "meta-category": "file",
        "name": "file",
        "sharing_group_id": 0,
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": 10,
        "uuid": "a"
      },
      {
        "Attribute": [
          {
            "Tag": [
              {
                "name": "blah"
              }
            ],
            "category": "Payload delivery",
            "disable_correlation": true,
            "object_relation": "filename",
            "to_ids": true,
            "type": "filename",
            "value": "baz"
          }
        ],
        "description": "File object describing a file with meta-information",
        "distribution": 5,
        "meta-category": "file",
        "name": "file",
        "sharing_group_id": 0,
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": 10,
        "uuid": "b"
      }
    ]
  }
}
@@ -0,0 +1,31 @@
{
  "Event": {
    "Object": [
      {
        "Attribute": [
          {
            "category": "Payload delivery",
            "disable_correlation": false,
            "object_relation": "filename",
            "to_ids": true,
            "type": "filename",
            "value": "bar"
          }
        ],
        "Tag": [
          {
            "name": "osint"
          }
        ],
        "description": "File object describing a file with meta-information",
        "distribution": 5,
        "meta-category": "file",
        "name": "file",
        "sharing_group_id": 0,
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": 9,
        "uuid": "a"
      }
    ]
  }
}
@@ -0,0 +1,20 @@
{
  "Event": {
    "Tag": [
      {
        "name": "bar"
      },
      {
        "name": "baz"
      },
      {
        "name": "foo"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
File diff suppressed because it is too large
File diff suppressed because it is too large
@@ -0,0 +1,21 @@
{
  "Event": {
    "Attribute": [
      {
        "category": "Payload delivery",
        "data": "ewogICJFdmVudCI6IHsKICB9Cn0K",
        "disable_correlation": false,
        "encrypt": true,
        "malware_filename": "bar.exe",
        "to_ids": true,
        "type": "malware-sample",
        "value": "bar.exe|7637beddacbeac59d44469b2b120b9e6"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
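The `data` field of the malware-sample attribute above is base64-encoded; in this particular fixture it happens to hold a tiny JSON document, which can be checked with the standard library alone:

```python
import base64
import json

data = "ewogICJFdmVudCI6IHsKICB9Cn0K"  # "data" field from the fixture above
payload = json.loads(base64.b64decode(data))
```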
@@ -0,0 +1,165 @@
{"response":[{
  "Event": {
    "id": "6719",
    "orgc_id": "1",
    "org_id": "1",
    "date": "2018-01-04",
    "threat_level_id": "1",
    "info": "Test existing malware PyMISP",
    "published": false,
    "uuid": "5a4e4fdd-1eb4-4ff3-9e87-43fa950d210f",
    "attribute_count": "6",
    "analysis": "0",
    "timestamp": "1515081727",
    "distribution": "0",
    "proposal_email_lock": false,
    "locked": false,
    "publish_timestamp": "0",
    "sharing_group_id": "0",
    "disable_correlation": false,
    "event_creator_email": "raphael.vinot@circl.lu",
    "Org": {
      "id": "1",
      "name": "CIRCL",
      "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
    },
    "Orgc": {
      "id": "1",
      "name": "CIRCL",
      "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
    },
    "Attribute": [],
    "ShadowAttribute": [],
    "RelatedEvent": [],
    "Galaxy": [],
    "Object": [
      {
        "id": "2279",
        "name": "file",
        "meta-category": "file",
        "description": "File object describing a file with meta-information",
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": "7",
        "event_id": "6719",
        "uuid": "5a4e4ffe-4cb8-48b1-bd5c-48fb950d210f",
        "timestamp": "1515081726",
        "distribution": "5",
        "sharing_group_id": "0",
        "comment": "",
        "deleted": false,
        "ObjectReference": [],
        "Attribute": [
          {
            "id": "814967",
            "type": "malware-sample",
            "category": "Payload delivery",
            "to_ids": true,
            "uuid": "5a4e4fff-407c-40ff-9de5-43dc950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": false,
            "object_id": "2279",
            "object_relation": "malware-sample",
            "value": "simple.json|7637beddacbeac59d44469b2b120b9e6",
            "data": "UEsDBAoACQAAAEOAJEyjHboUIQAAABUAAAAgABwANzYzN2JlZGRhY2JlYWM1OWQ0NDQ2OWIyYjEyMGI5ZTZVVAkAA\/5PTlr+T05adXgLAAEEIQAAAAQhAAAATvzonhGOj12MyB1QeGLJ5iZhOjD+zymV4FU2+kjD4oTYUEsHCKMduhQhAAAAFQAAAFBLAwQKAAkAAABDgCRMg45UABcAAAALAAAALQAcADc2MzdiZWRkYWNiZWFjNTlkNDQ0NjliMmIxMjBiOWU2LmZpbGVuYW1lLnR4dFVUCQAD\/k9OWv5PTlp1eAsAAQQhAAAABCEAAADDgZOh6307Bduy829xtRjpivO\/xFI3KVBLBwiDjlQAFwAAAAsAAABQSwECHgMKAAkAAABDgCRMox26FCEAAAAVAAAAIAAYAAAAAAABAAAApIEAAAAANzYzN2JlZGRhY2JlYWM1OWQ0NDQ2OWIyYjEyMGI5ZTZVVAUAA\/5PTlp1eAsAAQQhAAAABCEAAABQSwECHgMKAAkAAABDgCRMg45UABcAAAALAAAALQAYAAAAAAABAAAApIGLAAAANzYzN2JlZGRhY2JlYWM1OWQ0NDQ2OWIyYjEyMGI5ZTYuZmlsZW5hbWUudHh0VVQFAAP+T05adXgLAAEEIQAAAAQhAAAAUEsFBgAAAAACAAIA2QAAABkBAAAAAA==",
            "ShadowAttribute": []
          },
          {
            "id": "814968",
            "type": "filename",
            "category": "Payload delivery",
            "to_ids": false,
            "uuid": "5a4e4fff-9ec0-4822-a405-4e29950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": false,
            "object_id": "2279",
            "object_relation": "filename",
            "value": "simple.json",
            "ShadowAttribute": []
          },
          {
            "id": "814969",
            "type": "md5",
            "category": "Payload delivery",
            "to_ids": true,
            "uuid": "5a4e4fff-8000-49f9-8c3e-4598950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": false,
            "object_id": "2279",
            "object_relation": "md5",
            "value": "7637beddacbeac59d44469b2b120b9e6",
            "ShadowAttribute": []
          },
          {
            "id": "814970",
            "type": "sha1",
            "category": "Payload delivery",
            "to_ids": true,
            "uuid": "5a4e4fff-dae0-4aa4-81ea-4899950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": false,
            "object_id": "2279",
            "object_relation": "sha1",
            "value": "023853a4331db8d67e44553004cf338ec1b7440e",
            "ShadowAttribute": []
          },
          {
            "id": "814971",
            "type": "sha256",
            "category": "Payload delivery",
            "to_ids": true,
            "uuid": "5a4e4fff-03ec-4e88-b5f4-472b950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": false,
            "object_id": "2279",
            "object_relation": "sha256",
            "value": "6ae8b0f1c7d6f3238d1fc14038018c3b4704c8cc23dac1c2bfd2c81b5a278eef",
            "ShadowAttribute": []
          },
          {
            "id": "814972",
            "type": "size-in-bytes",
            "category": "Other",
            "to_ids": false,
            "uuid": "5a4e4fff-b6f4-41ba-a6eb-446c950d210f",
            "event_id": "6719",
            "distribution": "5",
            "timestamp": "1515081727",
            "comment": "",
            "sharing_group_id": "0",
            "deleted": false,
            "disable_correlation": true,
            "object_id": "2279",
            "object_relation": "size-in-bytes",
            "value": "21",
            "ShadowAttribute": []
          }
        ]
      }
    ]
  }
}]}
@@ -0,0 +1,40 @@
{
  "Event": {
    "Object": [
      {
        "Attribute": [
          {
            "category": "Other",
            "disable_correlation": false,
            "object_relation": "member3",
            "to_ids": false,
            "type": "text",
            "value": "foo"
          },
          {
            "category": "Other",
            "disable_correlation": false,
            "object_relation": "member1",
            "to_ids": false,
            "type": "text",
            "value": "bar"
          }
        ],
        "description": "TestTemplate.",
        "distribution": 5,
        "meta-category": "file",
        "misp_objects_path_custom": "tests/mispevent_testfiles",
        "name": "test_object_template",
        "sharing_group_id": 0,
        "template_uuid": "4ec55cc6-9e49-4c64-b794-03c25c1a6589",
        "template_version": 1,
        "uuid": "a"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,36 @@
{
  "Event": {
    "Attribute": [
      {
        "ShadowAttribute": [
          {
            "category": "Payload delivery",
            "disable_correlation": false,
            "to_ids": true,
            "type": "filename",
            "value": "bar.pdf"
          }
        ],
        "category": "Payload delivery",
        "disable_correlation": false,
        "to_ids": true,
        "type": "filename",
        "value": "bar.exe"
      }
    ],
    "ShadowAttribute": [
      {
        "category": "Payload delivery",
        "disable_correlation": false,
        "to_ids": true,
        "type": "filename",
        "value": "baz.jpg"
      }
    ],
    "analysis": "1",
    "date": "2017-12-31",
    "distribution": "1",
    "info": "This is a test",
    "threat_level_id": "1"
  }
}
@@ -0,0 +1,149 @@
{
  "Event": {
    "Attribute": [
      {
        "ShadowAttribute": [
          {
            "Org": {
              "id": "1",
              "name": "CIRCL",
              "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
            },
            "category": "Artifacts dropped",
            "comment": "",
            "disable_correlation": false,
            "event_id": "6676",
            "event_uuid": "5a4cb19a-f550-437f-bd29-48ed950d210f",
            "id": "3770",
            "old_id": "811578",
            "org_id": "1",
            "proposal_to_delete": false,
            "timestamp": "1514975846",
            "to_ids": true,
            "type": "filename",
            "uuid": "5a4cb1c7-fa84-45fa-8d27-4822950d210f",
            "value": "blah.exe.jpg"
          }
        ],
        "category": "Artifacts dropped",
        "comment": "",
        "deleted": false,
        "disable_correlation": false,
        "distribution": "5",
        "event_id": "6676",
        "id": "811578",
        "object_id": "0",
        "sharing_group_id": "0",
        "timestamp": "1514975687",
        "to_ids": false,
        "type": "filename",
        "uuid": "5a4cb1c7-fa84-45fa-8d27-4822950d210f",
        "value": "blah.exe"
      }
    ],
    "Object": [
      {
        "Attribute": [
          {
            "ShadowAttribute": [
              {
                "Org": {
                  "id": "1",
                  "name": "CIRCL",
                  "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
                },
                "category": "Payload delivery",
                "comment": "",
                "disable_correlation": false,
                "event_id": "6676",
                "event_uuid": "5a4cb19a-f550-437f-bd29-48ed950d210f",
                "id": "3771",
                "old_id": "811579",
                "org_id": "1",
                "proposal_to_delete": false,
                "timestamp": "1514976196",
                "to_ids": true,
                "type": "filename",
                "uuid": "5a4cb2b8-4748-4c72-96e6-4588950d210f",
                "value": "baz.png.exe"
              }
            ],
            "category": "Payload delivery",
            "comment": "",
            "deleted": false,
            "disable_correlation": false,
            "distribution": "5",
            "event_id": "6676",
            "id": "811579",
            "object_id": "2278",
            "object_relation": "filename",
            "sharing_group_id": "0",
            "timestamp": "1514975928",
            "to_ids": true,
            "type": "filename",
            "uuid": "5a4cb2b8-4748-4c72-96e6-4588950d210f",
            "value": "baz.png"
          },
          {
            "category": "Other",
            "comment": "",
            "deleted": false,
            "disable_correlation": true,
            "distribution": "5",
            "event_id": "6676",
            "id": "811580",
            "object_id": "2278",
            "object_relation": "state",
            "sharing_group_id": "0",
            "timestamp": "1514975928",
            "to_ids": false,
            "type": "text",
            "uuid": "5a4cb2b9-92b4-4d3a-82df-4e86950d210f",
            "value": "Malicious"
          }
        ],
        "comment": "",
        "deleted": false,
        "description": "File object describing a file with meta-information",
        "distribution": "5",
        "event_id": "6676",
        "id": "2278",
        "meta-category": "file",
        "name": "file",
        "sharing_group_id": "0",
        "template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
        "template_version": "7",
        "timestamp": "1514975928",
        "uuid": "5a4cb2b8-7958-4323-852c-4d2a950d210f"
      }
    ],
    "Org": {
      "id": "1",
      "name": "CIRCL",
      "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
    },
    "Orgc": {
      "id": "1",
      "name": "CIRCL",
      "uuid": "55f6ea5e-2c60-40e5-964f-47a8950d210f"
    },
    "analysis": "2",
    "attribute_count": "3",
    "date": "2018-01-03",
    "disable_correlation": false,
    "distribution": "0",
    "event_creator_email": "raphael.vinot@circl.lu",
    "id": "6676",
    "info": "Test proposals / ShadowAttributes",
    "locked": false,
    "org_id": "1",
    "orgc_id": "1",
    "proposal_email_lock": true,
    "publish_timestamp": "0",
    "published": false,
    "sharing_group_id": "0",
    "threat_level_id": "1",
    "timestamp": "1514975929",
    "uuid": "5a4cb19a-f550-437f-bd29-48ed950d210f"
  }
}
@@ -0,0 +1,5 @@
{
  "timestamp": 11111111,
  "type": "bar",
  "value": "1"
}
Some files were not shown because too many files have changed in this diff