Merge remote-tracking branch 'upstream/main' into feature/tagdelete_searchsg

pull/637/head
Tom King 2020-10-14 17:14:52 +01:00
commit e5d413ca4f
74 changed files with 8412 additions and 2556 deletions

.gitignore (vendored, 1 line changed)

@@ -1,6 +1,7 @@
*.swp
*.pem
*.pyc
docs/build/
examples/keys.py
examples/cudeso.py
examples/feed-generator/output/*\.json

.travis.yml

@@ -1,12 +1,12 @@
dist: bionic
language: python
cache: pip
addons:
apt:
sources: [ 'ubuntu-toolchain-r-test' ]
packages:
- libstdc++6
- libfuzzy-dev
python:
@@ -24,5 +24,5 @@ script:
- bash travis/test_travis.sh
after_success:
- pipenv run codecov
- pipenv run coveralls
- poetry run codecov
- poetry run coveralls

CHANGELOG

@@ -7,7 +7,382 @@ Changelog
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
v2.4.131 (2020-09-08)
---------------------
New
~~~
- [test] Validate tag removal. [Raphaël Vinot]
- [describeTypes] sha3 added. [Alexandre Dulaunoy]
Changes
~~~~~~~
- Bump version. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- [describeTypes] updated. [Alexandre Dulaunoy]
- [describeTypes] updated. [Alexandre Dulaunoy]
- Bump objects. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Bump file template version. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- Rename blacklist -> blocklist. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
v2.4.130 (2020-08-20)
---------------------
New
~~~
- Blacklist methods. [Raphaël Vinot]
- Add list of missing calls. [Raphaël Vinot]
- Add test_obj_references_export. [louis]
- Add MISPObject.standalone property. [louis]
Setting MISPObject.standalone updates MISPObject._standalone and
add/removes "ObjectReference" from AbstractMISP.__not_jsonable using
update_not_jsonable/_remove_from_not_jsonable.
- Add AbstractMISP._remove_from_not_jsonable. [louis]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- Bump types. [Raphaël Vinot]
- [testlive_comprehensive] Updated generic tagging method to match
changes in MISP. [mokaddem]
- Cleanup blocklist methods. [Raphaël Vinot]
- Remove outdated example. [Raphaël Vinot]
Fix #611
- New test_get_non_exists_event. [Jakub Onderka]
- Bump dependencies. [Raphaël Vinot]
- Enable more tests. [Raphaël Vinot]
- Make get_object return a not standalone object. [louis]
- Remove standalone default value from MISPObject children c'tor.
[louis]
MISPObject.__init__ sets standalone=True by default, so there is no
need to do it in its child classes.
- Make MISPObject standalone by default. [louis]
standalone defaults to True in MISPObject.__init__, and is set to False
when the object is added to an event.
- Add MISPObject._standalone type. [louis]
Fix
~~~
- Bump file template version. [Raphaël Vinot]
- Test_get_non_exists_event. [Jakub Onderka]
- IP removed from the public DNS list. [Raphaël Vinot]
- Example using deprecated calls. [Raphaël Vinot]
fix #602
- Add STIX XML output for the search. [Raphaël Vinot]
Use stix-xml as return_format.
Fix #600 https://github.com/MISP/MISP/issues/5618
- Dummy event example. [Raphaël Vinot]
Fix #598
Other
~~~~~
- Exclude section correlation .rsrc and zero-filled. [deku]
- Linting/Add missing whitespace. [Paal Braathen]
- Remove explicit loglevel checking. [Paal Braathen]
- Remove explicit traceback printing. [Paal Braathen]
- Master branch has been renamed to main. [Arcuri Davide]
- Update README.md. [Raphaël Vinot]
fix: #599
v2.4.128 (2020-06-22)
---------------------
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Add a few test cases. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
v2.4.127.1 (2020-06-19)
-----------------------
New
~~~
- Optionally include deleted attributes/objects in feed. [Raphaël Vinot]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Add test case for get event deleted. [Raphaël Vinot]
- Add test case for search deleted. [Raphaël Vinot]
- Update comments for search. [Raphaël Vinot]
Fix
~~~
- Keep deleted key in MISPObject and MISPObjectAttribute. [Raphaël
Vinot]
v2.4.127 (2020-06-16)
---------------------
New
~~~
- Add helper and test case for GitVulnFinderObject. [Raphaël Vinot]
- Add git-commit-id type. [Raphaël Vinot]
- Add deleted in field export. [Raphaël Vinot]
Fix #586
- Timeout for connection/request, fixes #584. [Christophe Vandeplas]
Changes
~~~~~~~
- Bump Changelog. [Raphaël Vinot]
- Rename master -> main. [Raphaël Vinot]
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Rename branches master -> main. [Raphaël Vinot]
- Remove extra parameter in change_user_password. [Raphaël Vinot]
Fix
~~~
- Do not fail if the attribute value is not a string. [Raphaël Vinot]
- Properly strip value in MISPObject.add_attribute, take 2. [Raphaël
Vinot]
Fix #546
- Properly strip value in MISPObject.add_attribute. [Raphaël Vinot]
Fix #546
- Deleted is not always required in the feed export. [Raphaël Vinot]
- Make mypy happy. [Raphaël Vinot]
- Fixes bug in timeout change. [Christophe Vandeplas]
- Fixes bug in timeout change. [Christophe Vandeplas]
- Fixes bug in timeout change. [Christophe Vandeplas]
- Fixes bug in timeout change. [Christophe Vandeplas]
- Fixes bug in timeout change. [Christophe Vandeplas]
hail to Rafiot
- Fixes bug in timeout change. [Christophe Vandeplas]
- Fixes bug in timeout change. [Christophe Vandeplas]
Other
~~~~~
- Previously file object was reporting the libmagic description of a
file instead of the mimetype. According to [MISP
DataModels](https://www.misp-project.org/datamodels/#types) ``` mime-
type: A media type (also MIME type and content type) is a two-part
identifier for file formats and format contents transmitted on the
Internet ``` more precisely defined in
[RFC2045](https://tools.ietf.org/html/rfc2045) and others. [Troy Ross]
The description returned by libmagic is more useful than the generic mime-type,
but I did not find a place to put the description in the current data model.
- Fix end of line encoding of examples/cytomic_orion.py. [Sebastian
Wagner]
v2.4.126 (2020-05-18)
---------------------
New
~~~
- Test search with timestamp. [Raphaël Vinot]
- Add testcase for updating partial event. [Raphaël Vinot]
- Add pyfaup as optional dependency. [Raphaël Vinot]
- [dev] add microblog object tool. [VVX7]
- Very simple test case for rest search on objects. [Raphaël Vinot]
- Self registration, object level search (initial) [Raphaël Vinot]
- [dev] add flag to get extended misp event. [VVX7]
- [dev] add flag to get extended misp event. [VVX7]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump misp-object. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Add test for feed partial update. [Raphaël Vinot]
- Strip empty parameters in build_complex_query. [Raphaël Vinot]
Fix #577
- Simplify delete_attribute. [Raphaël Vinot]
- Bump travis install. [Raphaël Vinot]
- Add comment in microblog object. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- [dev] clean up how keys are accessed in self._parameters. [VVX7]
- [dev] use isinstance() type check. [VVX7]
- [dev] fix abstract generator import. add logger. [VVX7]
- [dev] change type() == list. [VVX7]
- Bump misp-objects. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- [dev] remove duplicate line. [VVX7]
- [dev] add extend_event() test. chg typo in get_event() [VVX7]
- Re-Bump CHANGELOG. [Raphaël Vinot]
Fix
~~~
- Settings is not required in MISPFeed. [Raphaël Vinot]
- Properly skip timestamp in __iter__ when needed. [Raphaël Vinot]
- Catch exception when liblua-5.3 is not present. [Raphaël Vinot]
- Make flake8 happy. [Raphaël Vinot]
- Properly load feeds, fix undefined variable. [Raphaël Vinot]
- Make flake8 happy. [Raphaël Vinot]
- Remove extra print. [Raphaël Vinot]
- Typo, add test for extended event. [Raphaël Vinot]
Other
~~~~~
- Update docstring in api.py. [Bernhard E. Reiter]
* remove typo in ssl parameter docstring.
* Add hint that other certs (which are not in the default CAs, but also are not self signed in a strict sense) can also use the CA_BUNDLE function of the ssl parameter.
v2.4.125 (2020-04-30)
---------------------
New
~~~
- Extended option on get event. [Raphaël Vinot]
Related to #567
Changes
~~~~~~~
- Bump version in pyproject. [Raphaël Vinot]
- Bump CHANGELOG. [Raphaël Vinot]
- Bump objects, deps. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Remove old suricata script, keep reference to old code. [Raphaël
Vinot]
Fix
~~~
- Enable autoalert on admin user. [Raphaël Vinot]
- [abstract] Forces file to be read with utf8 encoding. [mokaddem]
- Properly handle timezone in tests. [Raphaël Vinot]
Other
~~~~~
- Update up.py. [Raphaël Vinot]
Fix #563
- Fixed __query_virustotal return type. [DocArmoryTech]
__query_virustotal returned a Response object and not the json expected; modified so that report_json is returned instead of report.
v2.4.124 (2020-03-30)
---------------------
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Add option to aggregare by country. [Raphaël Vinot]
- [CSSE COVID] Publish the event immediately. [Raphaël Vinot]
- Add changelog and readme in the package. [Raphaël Vinot]
- Bump version in pyproject. [Raphaël Vinot]
Fix
~~~
- Strip every string in AbstractMISP. [Raphaël Vinot]
fix #546
- Incorrect expectation of attribute value to be a str - take 2.
[Raphaël Vinot]
Related #553
- Incorrect expectation of attribute value to be a str. [Raphaël Vinot]
Fix #553
Other
~~~~~
- Dos2unix examples/stats_report.py. [Sebastian Wagner]
- Cytomic Orion API access. [Koen Van Impe]
- Add organisations from CSV. [Koen Van Impe]
- Minor updates to vmray_automation for travis. [Koen Van Impe]
- VMRay Automation with ExpandedPyMISP. [Koen Van Impe]
v2.4.123 (2020-03-10)
---------------------
New
~~~
- Add import script for dxy data. [Raphaël Vinot]
- Csse covid19 daily report importer. [Raphaël Vinot]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump changelog. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- JSON files are UTF8. [Raphaël Vinot]
Bump dev deps, update comment
- Add tag, set distribution, add file and source (CSSE importer)
[Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
v2.4.122 (2020-02-26)
---------------------
New
~~~
- Add uuid by default in MISPEvent, add F/L seen in feed output.
[Raphaël Vinot]
- Admin script to setup a sync server. [Raphaël Vinot]
- Add feed generation example in notebook. [Raphaël Vinot]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Comments were still referencing pipenv. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Bump dep. [Raphaël Vinot]
- Fix typo in readme. [Raphaël Vinot]
- Use bionic on travis. [Raphaël Vinot]
- Add poetry support. [Raphaël Vinot]
Fix
~~~
- Test cases & template version. [Raphaël Vinot]
- Mypy, more typing. [Raphaël Vinot]
- Do not skip data in add_attribute methods. [Raphaël Vinot]
- Remove references to the old API. [Raphaël Vinot]
Other
~~~~~
- Use poetry everywhere, fix readme. [Raphaël Vinot]
v2.4.121.1 (2020-02-07)
@@ -16,6 +391,8 @@ v2.4.121.1 (2020-02-07)
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
Fix
@@ -918,7 +1295,7 @@ Other
values, sanitization) [Falconieri]
- Add: exportpdf tool working. [Falconieri]
- General improvement : deisgn, exhaustiviness of mispEvent values
displayed, good pratice concerning paragraphe/table made. [Falconieri]
displayed, good practice concerning paragraphe/table made. [Falconieri]
- Update with table basics. [Falconieri]
- Structure of the improvements OK : test file, test folder, report
generator. [Falconieri]
@@ -1842,7 +2219,7 @@ Changes
- Bump CHANGELOG. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Update readme for new logging system. [Raphaël Vinot]
- Small improvments in the logging system. [Raphaël Vinot]
- Small improvements in the logging system. [Raphaël Vinot]
- Properly use python logging module. [Raphaël Vinot]
- Update asciidoctor generator. [Raphaël Vinot]
- Remove warning if PyMISP is too new. [Raphaël Vinot]
@@ -2170,7 +2547,7 @@ Other
- Cleanup warning function. [Raphaël Vinot]
- Fix typos. [Raphaël Vinot]
- Remove unused variable. [Tristan METAYER]
- Remove category It will be automaticly detected
- Remove category It will be automatically detected
https://github.com/MISP/PyMISP/blob/master/pymisp/tools/openioc.py.
[Tristan METAYER]
- Revert tab to escape. [Tristan METAYER]
@@ -2379,7 +2756,7 @@ Other
- Bump version. [Raphaël Vinot]
- Add orgs managment. [Raphaël Vinot]
- Run on more python versions. [Raphaël Vinot]
- Exemple addtag (dirty) [Déborah Servili]
- Example addtag (dirty) [Déborah Servili]
- Fix last commit. [Raphaël Vinot]
- Wrong use of API for dateuntil. [Koen Van Impe]

Pipfile (deleted, 25 lines)

@@ -1,25 +0,0 @@
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
nose = "*"
coveralls = "*"
codecov = "*"
requests-mock = "*"
pymisp = {editable = true,extras = ["fileobjects", "neo", "openioc", "virustotal", "pdfexport", "docs"],path = "."}
docutils = "==0.15"
memory-profiler = "*"
mypy = "*"
flake8 = "*"
[packages]
pymisp = {editable = true,extras = ["fileobjects", "openioc", "virustotal", "pdfexport"],path = "."}
pymispwarninglists = {editable = true,git = "https://github.com/MISP/PyMISPWarningLists.git"}
[requires]
python_version = "3"
[pipenv]
allow_prereleases = true

Pipfile.lock (generated, deleted, 872 lines)

@@ -1,872 +0,0 @@
{
"_meta": {
"hash": {
"sha256": "980c848909285e25224dc957df15e733666b06107dfbd97e6edfcd51c8da9206"
},
"pipfile-spec": 6,
"requires": {
"python_version": "3"
},
"sources": [
{
"name": "pypi",
"url": "https://pypi.org/simple",
"verify_ssl": true
}
]
},
"default": {
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"beautifulsoup4": {
"hashes": [
"sha256:05fd825eb01c290877657a56df4c6e4c311b3965bda790c613a3d6fb01a5462a",
"sha256:9fbb4d6e48ecd30bcacc5b63b94088192dcda178513b2ae3c394229f8911b887",
"sha256:e1505eeed31b0f4ce2dbb3bc8eb256c04cc2b3b72af7d551a4ab6efd5cbe5dae"
],
"version": "==4.8.2"
},
"certifi": {
"hashes": [
"sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
],
"version": "==2019.11.28"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"decorator": {
"hashes": [
"sha256:54c38050039232e1db4ad7375cfce6748d7b41c29e95a081c8a6d2c30364a2ce",
"sha256:5d19b92a3c8f7f101c8dd86afd86b0f061a8ce4540ab8cd401fa2542756bce6d"
],
"version": "==4.4.1"
},
"deprecated": {
"hashes": [
"sha256:408038ab5fdeca67554e8f6742d1521cd3cd0ee0ff9d47f29318a4f4da31c308",
"sha256:8b6a5aa50e482d8244a62e5582b96c372e87e3a28e8b49c316e46b95c76a611d"
],
"version": "==1.2.7"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"version": "==2.8"
},
"jsonschema": {
"hashes": [
"sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163",
"sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"
],
"version": "==3.2.0"
},
"lief": {
"hashes": [
"sha256:276cc63ec12a21bdf01b8d30962692c17499788234f0765247ca7a35872097ec",
"sha256:3e6baaeb52bdc339b5f19688b58fd8d5778b92e50221f920cedfa2bec1f4d5c2",
"sha256:45e5c592b57168c447698381d927eb2386ffdd52afe0c48245f848d4cc7ee05a",
"sha256:6547752b5db105cd41c9fa65d0d7452a4d7541b77ffee716b46246c6d81e172f",
"sha256:83b51e01627b5982662f9550ac1230758aa56945ed86829e4291932d98417da3",
"sha256:895599194ea7495bf304e39317b04df20cccf799fc2751867cc1aa4997cfcdae",
"sha256:8a91cee2568306fe1d2bf84341b459c85368317d01d7105fa49e4f4ede837076",
"sha256:913b36a67707dc2afa72f117bab9856ea3f434f332b04a002a0f9723c8779320",
"sha256:9f604a361a3b1b3ed5fdafed0321c5956cb3b265b5efe2250d1bf8911a80c65b",
"sha256:a487fe7234c04bccd58223dbb79214421176e2629814c7a4a887764cceb5be7c",
"sha256:bc8488fb0661cb436fe4bb4fe947d0f9aa020e9acaed233ccf01ab04d888c68a",
"sha256:bddbf333af62310a10cb738a1df1dc2b140dd9c663b55ba3500c10c249d416d2",
"sha256:cce48d7c97cef85e01e6cfeff55f2068956b5c0257eb9c2d2c6d15e33dd1e4fc",
"sha256:f8b3f66956c56b582b3adc573bf2a938c25fb21c8894b373a113e24c494fc982"
],
"version": "==0.10.1"
},
"pillow": {
"hashes": [
"sha256:0a628977ac2e01ca96aaae247ec2bd38e729631ddf2221b4b715446fd45505be",
"sha256:4d9ed9a64095e031435af120d3c910148067087541131e82b3e8db302f4c8946",
"sha256:54ebae163e8412aff0b9df1e88adab65788f5f5b58e625dc5c7f51eaf14a6837",
"sha256:5bfef0b1cdde9f33881c913af14e43db69815c7e8df429ceda4c70a5e529210f",
"sha256:5f3546ceb08089cedb9e8ff7e3f6a7042bb5b37c2a95d392fb027c3e53a2da00",
"sha256:5f7ae9126d16194f114435ebb79cc536b5682002a4fa57fa7bb2cbcde65f2f4d",
"sha256:62a889aeb0a79e50ecf5af272e9e3c164148f4bd9636cc6bcfa182a52c8b0533",
"sha256:7406f5a9b2fd966e79e6abdaf700585a4522e98d6559ce37fc52e5c955fade0a",
"sha256:8453f914f4e5a3d828281a6628cf517832abfa13ff50679a4848926dac7c0358",
"sha256:87269cc6ce1e3dee11f23fa515e4249ae678dbbe2704598a51cee76c52e19cda",
"sha256:875358310ed7abd5320f21dd97351d62de4929b0426cdb1eaa904b64ac36b435",
"sha256:8ac6ce7ff3892e5deaab7abaec763538ffd011f74dc1801d93d3c5fc541feee2",
"sha256:91b710e3353aea6fc758cdb7136d9bbdcb26b53cefe43e2cba953ac3ee1d3313",
"sha256:9d2ba4ed13af381233e2d810ff3bab84ef9f18430a9b336ab69eaf3cd24299ff",
"sha256:a62ec5e13e227399be73303ff301f2865bf68657d15ea50b038d25fc41097317",
"sha256:ab76e5580b0ed647a8d8d2d2daee170e8e9f8aad225ede314f684e297e3643c2",
"sha256:bf4003aa538af3f4205c5fac56eacaa67a6dd81e454ffd9e9f055fff9f1bc614",
"sha256:bf598d2e37cf8edb1a2f26ed3fb255191f5232badea4003c16301cb94ac5bdd0",
"sha256:c18f70dc27cc5d236f10e7834236aff60aadc71346a5bc1f4f83a4b3abee6386",
"sha256:c5ed816632204a2fc9486d784d8e0d0ae754347aba99c811458d69fcdfd2a2f9",
"sha256:dc058b7833184970d1248135b8b0ab702e6daa833be14035179f2acb78ff5636",
"sha256:ff3797f2f16bf9d17d53257612da84dd0758db33935777149b3334c01ff68865"
],
"version": "==7.0.0"
},
"pydeep": {
"hashes": [
"sha256:22866eb422d1d5907f8076ee792da65caecb172425d27576274e2a8eacf6afc1"
],
"version": "==0.4"
},
"pymisp": {
"editable": true,
"extras": [
"fileobjects",
"openioc",
"virustotal",
"pdfexport"
],
"path": "."
},
"pymispwarninglists": {
"editable": true,
"git": "https://github.com/MISP/PyMISPWarningLists.git",
"ref": "1257a2e378ffb9f3dfcc4a0e83bde4ae1b040c83"
},
"pyrsistent": {
"hashes": [
"sha256:cdc7b5e3ed77bed61270a47d35434a30617b9becdf2478af76ad2c6ade307280"
],
"version": "==0.15.7"
},
"python-dateutil": {
"hashes": [
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.1"
},
"python-magic": {
"hashes": [
"sha256:f2674dcfad52ae6c49d4803fa027809540b130db1dec928cfbb9240316831375",
"sha256:f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5"
],
"version": "==0.4.15"
},
"reportlab": {
"hashes": [
"sha256:2a1c4ea2155fd5b6e3f89e36b8aa21b5a14c9bbaf9b44de2787641668bc95edc",
"sha256:2b7469a98df1315d4f52319c4438eaee3fdd17330830edadae775e9312402638",
"sha256:3b556160aac294fa661545245e4bc273328f9226e5110139647f4d4bc0cfc453",
"sha256:3eb25d2c2bde078815d8f7ea400abbcae16a0c498a4b27ead3c4a620b1f1f980",
"sha256:3f229c0b2ca27eb5b08777981d3bd0d34e59bfa306627b88d80c3734cd3e26d5",
"sha256:4695755cc70b7a9308508aa41eafc3f335348be0eadd86e8f92cb87815d6177b",
"sha256:4f97b4474e419ae5c441ecdf0db8eceb5f5af0461bdf73e3e5ec05353844045c",
"sha256:550d2d8516e468192e12be8aeaf80f3bd805dc46dd0a5a4ddf2a3e1cd8149a16",
"sha256:59aa9c4ca80d397f6cabec092b5a6e2304fb1b7ca53e5b650872aae13ebfeb68",
"sha256:6e4479b75778b9c1e4640dc90efb72cb990471d56089947d6be4ccd9e7a56a3c",
"sha256:6e9434bd0afa6d6fcf9abbc565750cc456b6e60dc49abd7cd2bc7cf414ee079b",
"sha256:73e4e30b72da1f9f8caba775ad9cc027957c2340c38ba2d6622a9f2351b12c3a",
"sha256:7c05c2ba8ab32f02b23a56a75a4d136c2bfb7221a04a8306835a938fa6711644",
"sha256:849e4cabce1ed1183e83dc89570810b3bf9bf9cf0d0a605bde854a0baf212124",
"sha256:863c6fcf5fc0c8184b6315885429f5468373a3def2eb0c0073d09b79b2161113",
"sha256:8e688df260682038ecd32f106d796024fbcf70e7bf54340b14f991bd5465f97a",
"sha256:9675a26d01ec141cb717091bb139b6227bfb3794f521943101da50327bff4825",
"sha256:969b0d9663c0c641347d2408d41e6723e84d9f7863babc94438c91295c74f36d",
"sha256:978560732758bf5fca4ec1ed124afe2702d08824f6b0364cca31519bd5e7dadd",
"sha256:99ea85b47248c6cdbece147bdbd67aed16209bdd95770aa1f151ec3bb8794496",
"sha256:9cdc318c37fa959909db5beb05ca0b684d3e2cba8f40af1ce6f332c3f69bd2b8",
"sha256:b55c26510ff7f135af8eae1216372028cde7dab22003d918649fce219020eb58",
"sha256:cb301340b4fc1f2b7b25ea4584c5cbde139ced2d4ff01ad5e8fcf7d7822982b0",
"sha256:e7578a573454a5490553fb091374996d32269dff44021a401763080bda1357cf",
"sha256:e84387d35a666aafafda332afca8a75fb04f097cc0a2dc2d04e8c90a83cf7c1b",
"sha256:eb66eff64ea75f028af3ac63a7a2bf1e8733297141a85cbdffd5deaef404fa52",
"sha256:f5e3afd2cc35a73f34c3084c69fe4653591611da5189e50b58db550bb46e340a",
"sha256:f6c10628386bfe0c1f6640c28fb262d0960bb26c249cefabb755fb273323220d"
],
"version": "==3.5.34"
},
"requests": {
"hashes": [
"sha256:11e007a8a2aa0323f5a921e9e6a2d7e4e67d9877e85773fba9ba6419025cbeb4",
"sha256:9cf5292fcd0f598c671cfc1e0d7d1a7f13bb8085e9a590f48c010551dc6c4b31"
],
"version": "==2.22.0"
},
"six": {
"hashes": [
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.14.0"
},
"soupsieve": {
"hashes": [
"sha256:bdb0d917b03a1369ce964056fc195cfdff8819c40de04695a80bc813c3cfa1f5",
"sha256:e2c1c5dee4a1c36bcb790e0fabd5492d874b8ebd4617622c4f6a731701060dda"
],
"version": "==1.9.5"
},
"urllib3": {
"hashes": [
"sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc",
"sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc"
],
"version": "==1.25.8"
},
"validators": {
"hashes": [
"sha256:b192e6bde7d617811d59f50584ed240b580375648cd032d106edeb3164099508"
],
"version": "==0.14.2"
},
"wrapt": {
"hashes": [
"sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1"
],
"version": "==1.11.2"
}
},
"develop": {
"alabaster": {
"hashes": [
"sha256:446438bdcca0e05bd45ea2de1668c1d9b032e1a9154c2c259092d77031ddd359",
"sha256:a661d72d58e6ea8a57f7a86e37d86716863ee5e92788398526d58b26a4e4dc02"
],
"version": "==0.7.12"
},
"attrs": {
"hashes": [
"sha256:08a96c641c3a74e44eb59afb61a24f2cb9f4d7188748e76ba4bb5edfa3cb7d1c",
"sha256:f7b7ce16570fe9965acd6d30101a28f62fb4a7f9e926b3bbc9b61f8b04247e72"
],
"version": "==19.3.0"
},
"babel": {
"hashes": [
"sha256:1aac2ae2d0d8ea368fa90906567f5c08463d98ade155c0c4bfedd6a0f7160e38",
"sha256:d670ea0b10f8b723672d3a6abeb87b565b244da220d76b4dba1b66269ec152d4"
],
"version": "==2.8.0"
},
"beautifulsoup4": {
"hashes": [
"sha256:05fd825eb01c290877657a56df4c6e4c311b3965bda790c613a3d6fb01a5462a",
"sha256:9fbb4d6e48ecd30bcacc5b63b94088192dcda178513b2ae3c394229f8911b887",
"sha256:e1505eeed31b0f4ce2dbb3bc8eb256c04cc2b3b72af7d551a4ab6efd5cbe5dae"
],
"version": "==4.8.2"
},
"certifi": {
"hashes": [
"sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
],
"version": "==2019.11.28"
},
"chardet": {
"hashes": [
"sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae",
"sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691"
],
"version": "==3.0.4"
},
"click": {
"hashes": [
"sha256:2335065e6395b9e67ca716de5f7526736bfa6ceead690adf616d925bdc622b13",
"sha256:5b94b49521f6456670fdb30cd82a4eca9412788a93fa6dd6df72c94d5a8ff2d7"
],
"version": "==7.0"
},
"codecov": {
"hashes": [
"sha256:8ed8b7c6791010d359baed66f84f061bba5bd41174bf324c31311e8737602788",
"sha256:ae00d68e18d8a20e9c3288ba3875ae03db3a8e892115bf9b83ef20507732bed4"
],
"index": "pypi",
"version": "==2.0.15"
},
"colorama": {
"hashes": [
"sha256:7d73d2a99753107a36ac6b455ee49046802e59d9d076ef8e47b61499fa29afff",
"sha256:e96da0d330793e2cb9485e9ddfd918d456036c7149416295932478192f4436a1"
],
"version": "==0.4.3"
},
"commonmark": {
"hashes": [
"sha256:452f9dc859be7f06631ddcb328b6919c67984aca654e5fefb3914d54691aed60",
"sha256:da2f38c92590f83de410ba1a3cbceafbc74fee9def35f9251ba9a971d6d66fd9"
],
"version": "==0.9.1"
},
"coverage": {
"hashes": [
"sha256:15cf13a6896048d6d947bf7d222f36e4809ab926894beb748fc9caa14605d9c3",
"sha256:1daa3eceed220f9fdb80d5ff950dd95112cd27f70d004c7918ca6dfc6c47054c",
"sha256:1e44a022500d944d42f94df76727ba3fc0a5c0b672c358b61067abb88caee7a0",
"sha256:25dbf1110d70bab68a74b4b9d74f30e99b177cde3388e07cc7272f2168bd1477",
"sha256:3230d1003eec018ad4a472d254991e34241e0bbd513e97a29727c7c2f637bd2a",
"sha256:3dbb72eaeea5763676a1a1efd9b427a048c97c39ed92e13336e726117d0b72bf",
"sha256:5012d3b8d5a500834783689a5d2292fe06ec75dc86ee1ccdad04b6f5bf231691",
"sha256:51bc7710b13a2ae0c726f69756cf7ffd4362f4ac36546e243136187cfcc8aa73",
"sha256:527b4f316e6bf7755082a783726da20671a0cc388b786a64417780b90565b987",
"sha256:722e4557c8039aad9592c6a4213db75da08c2cd9945320220634f637251c3894",
"sha256:76e2057e8ffba5472fd28a3a010431fd9e928885ff480cb278877c6e9943cc2e",
"sha256:77afca04240c40450c331fa796b3eab6f1e15c5ecf8bf2b8bee9706cd5452fef",
"sha256:7afad9835e7a651d3551eab18cbc0fdb888f0a6136169fbef0662d9cdc9987cf",
"sha256:9bea19ac2f08672636350f203db89382121c9c2ade85d945953ef3c8cf9d2a68",
"sha256:a8b8ac7876bc3598e43e2603f772d2353d9931709345ad6c1149009fd1bc81b8",
"sha256:b0840b45187699affd4c6588286d429cd79a99d509fe3de0f209594669bb0954",
"sha256:b26aaf69713e5674efbde4d728fb7124e429c9466aeaf5f4a7e9e699b12c9fe2",
"sha256:b63dd43f455ba878e5e9f80ba4f748c0a2156dde6e0e6e690310e24d6e8caf40",
"sha256:be18f4ae5a9e46edae3f329de2191747966a34a3d93046dbdf897319923923bc",
"sha256:c312e57847db2526bc92b9bfa78266bfbaabac3fdcd751df4d062cd4c23e46dc",
"sha256:c60097190fe9dc2b329a0eb03393e2e0829156a589bd732e70794c0dd804258e",
"sha256:c62a2143e1313944bf4a5ab34fd3b4be15367a02e9478b0ce800cb510e3bbb9d",
"sha256:cc1109f54a14d940b8512ee9f1c3975c181bbb200306c6d8b87d93376538782f",
"sha256:cd60f507c125ac0ad83f05803063bed27e50fa903b9c2cfee3f8a6867ca600fc",
"sha256:d513cc3db248e566e07a0da99c230aca3556d9b09ed02f420664e2da97eac301",
"sha256:d649dc0bcace6fcdb446ae02b98798a856593b19b637c1b9af8edadf2b150bea",
"sha256:d7008a6796095a79544f4da1ee49418901961c97ca9e9d44904205ff7d6aa8cb",
"sha256:da93027835164b8223e8e5af2cf902a4c80ed93cb0909417234f4a9df3bcd9af",
"sha256:e69215621707119c6baf99bda014a45b999d37602cb7043d943c76a59b05bf52",
"sha256:ea9525e0fef2de9208250d6c5aeeee0138921057cd67fcef90fbed49c4d62d37",
"sha256:fca1669d464f0c9831fd10be2eef6b86f5ebd76c724d1e0706ebdff86bb4adf0"
],
"version": "==5.0.3"
},
"coveralls": {
"hashes": [
"sha256:2da39aeaef986757653f0a442ba2bef22a8ec602c8bacbc69d39f468dfae12ec",
"sha256:906e07a12b2ac04b8ad782d06173975fe5ff815fe9df3bfedd2c099bc5791aec"
],
"index": "pypi",
"version": "==1.10.0"
},
"decorator": {
"hashes": [
"sha256:54c38050039232e1db4ad7375cfce6748d7b41c29e95a081c8a6d2c30364a2ce",
"sha256:5d19b92a3c8f7f101c8dd86afd86b0f061a8ce4540ab8cd401fa2542756bce6d"
],
"version": "==4.4.1"
},
"deprecated": {
"hashes": [
"sha256:408038ab5fdeca67554e8f6742d1521cd3cd0ee0ff9d47f29318a4f4da31c308",
"sha256:8b6a5aa50e482d8244a62e5582b96c372e87e3a28e8b49c316e46b95c76a611d"
],
"version": "==1.2.7"
},
"docopt": {
"hashes": [
"sha256:49b3a825280bd66b3aa83585ef59c4a8c82f2c8a522dbe754a8bc8d08c85c491"
],
"version": "==0.6.2"
},
"docutils": {
"hashes": [
"sha256:54a349c622ff31c91cbec43b0b512f113b5b24daf00e2ea530bb1bd9aac14849",
"sha256:d2ddba74835cb090a1b627d3de4e7835c628d07ee461f7b4480f51af2fe4d448"
],
"index": "pypi",
"version": "==0.15"
},
"entrypoints": {
"hashes": [
"sha256:589f874b313739ad35be6e0cd7efde2a4e9b6fea91edcc34e58ecbb8dbe56d19",
"sha256:c70dd71abe5a8c85e55e12c19bd91ccfeec11a6e99044204511f9ed547d48451"
],
"version": "==0.3"
},
"flake8": {
"hashes": [
"sha256:45681a117ecc81e870cbf1262835ae4af5e7a8b08e40b944a8a6e6b895914cfb",
"sha256:49356e766643ad15072a789a20915d3c91dc89fd313ccd71802303fd67e4deca"
],
"index": "pypi",
"version": "==3.7.9"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
"sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c"
],
"version": "==2.8"
},
"imagesize": {
"hashes": [
"sha256:6965f19a6a2039c7d48bca7dba2473069ff854c36ae6f19d2cde309d998228a1",
"sha256:b1f6b5a4eab1f73479a50fb79fcf729514a900c341d8503d62a62dbc4127a2b1"
],
"version": "==1.2.0"
},
"jinja2": {
"hashes": [
"sha256:6e7a3c2934694d59ad334c93dd1b6c96699cf24c53fdb8ec848ac6b23e685734",
"sha256:d6609ae5ec3d56212ca7d802eda654eaf2310000816ce815361041465b108be4"
],
"version": "==2.11.0"
},
"jsonschema": {
"hashes": [
"sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163",
"sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"
],
"version": "==3.2.0"
},
"lief": {
"hashes": [
"sha256:276cc63ec12a21bdf01b8d30962692c17499788234f0765247ca7a35872097ec",
"sha256:3e6baaeb52bdc339b5f19688b58fd8d5778b92e50221f920cedfa2bec1f4d5c2",
"sha256:45e5c592b57168c447698381d927eb2386ffdd52afe0c48245f848d4cc7ee05a",
"sha256:6547752b5db105cd41c9fa65d0d7452a4d7541b77ffee716b46246c6d81e172f",
"sha256:83b51e01627b5982662f9550ac1230758aa56945ed86829e4291932d98417da3",
"sha256:895599194ea7495bf304e39317b04df20cccf799fc2751867cc1aa4997cfcdae",
"sha256:8a91cee2568306fe1d2bf84341b459c85368317d01d7105fa49e4f4ede837076",
"sha256:913b36a67707dc2afa72f117bab9856ea3f434f332b04a002a0f9723c8779320",
"sha256:9f604a361a3b1b3ed5fdafed0321c5956cb3b265b5efe2250d1bf8911a80c65b",
"sha256:a487fe7234c04bccd58223dbb79214421176e2629814c7a4a887764cceb5be7c",
"sha256:bc8488fb0661cb436fe4bb4fe947d0f9aa020e9acaed233ccf01ab04d888c68a",
"sha256:bddbf333af62310a10cb738a1df1dc2b140dd9c663b55ba3500c10c249d416d2",
"sha256:cce48d7c97cef85e01e6cfeff55f2068956b5c0257eb9c2d2c6d15e33dd1e4fc",
"sha256:f8b3f66956c56b582b3adc573bf2a938c25fb21c8894b373a113e24c494fc982"
],
"version": "==0.10.1"
},
"markupsafe": {
"hashes": [
"sha256:00bc623926325b26bb9605ae9eae8a215691f33cae5df11ca5424f06f2d1f473",
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
"sha256:7c1699dfe0cf8ff607dbdcc1e9b9af1755371f92a68f706051cc8c37d447c905",
"sha256:88e5fcfb52ee7b911e8bb6d6aa2fd21fbecc674eadd44118a9cc3863f938e735",
"sha256:8defac2f2ccd6805ebf65f5eeb132adcf2ab57aa11fdf4c0dd5169a004710e7d",
"sha256:98c7086708b163d425c67c7a91bad6e466bb99d797aa64f965e9d25c12111a5e",
"sha256:9add70b36c5666a2ed02b43b335fe19002ee5235efd4b8a89bfcf9005bebac0d",
"sha256:9bf40443012702a1d2070043cb6291650a0841ece432556f784f004937f0f32c",
"sha256:ade5e387d2ad0d7ebf59146cc00c8044acbd863725f887353a10df825fc8ae21",
"sha256:b00c1de48212e4cc9603895652c5c410df699856a2853135b3967591e4beebc2",
"sha256:b1282f8c00509d99fef04d8ba936b156d419be841854fe901d8ae224c59f0be5",
"sha256:b2051432115498d3562c084a49bba65d97cf251f5a331c64a12ee7e04dacc51b",
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
"sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"
],
"version": "==1.1.1"
},
"mccabe": {
"hashes": [
"sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
"sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
],
"version": "==0.6.1"
},
"memory-profiler": {
"hashes": [
"sha256:23b196f91ea9ac9996e30bfab1e82fecc30a4a1d24870e81d1e81625f786a2c3"
],
"index": "pypi",
"version": "==0.57.0"
},
"mypy": {
"hashes": [
"sha256:0a9a45157e532da06fe56adcfef8a74629566b607fa2c1ac0122d1ff995c748a",
"sha256:2c35cae79ceb20d47facfad51f952df16c2ae9f45db6cb38405a3da1cf8fc0a7",
"sha256:4b9365ade157794cef9685791032521233729cb00ce76b0ddc78749abea463d2",
"sha256:53ea810ae3f83f9c9b452582261ea859828a9ed666f2e1ca840300b69322c474",
"sha256:634aef60b4ff0f650d3e59d4374626ca6153fcaff96ec075b215b568e6ee3cb0",
"sha256:7e396ce53cacd5596ff6d191b47ab0ea18f8e0ec04e15d69728d530e86d4c217",
"sha256:7eadc91af8270455e0d73565b8964da1642fe226665dd5c9560067cd64d56749",
"sha256:7f672d02fffcbace4db2b05369142e0506cdcde20cea0e07c7c2171c4fd11dd6",
"sha256:85baab8d74ec601e86134afe2bcccd87820f79d2f8d5798c889507d1088287bf",
"sha256:87c556fb85d709dacd4b4cb6167eecc5bbb4f0a9864b69136a0d4640fdc76a36",
"sha256:a6bd44efee4dc8c3324c13785a9dc3519b3ee3a92cada42d2b57762b7053b49b",
"sha256:c6d27bd20c3ba60d5b02f20bd28e20091d6286a699174dfad515636cb09b5a72",
"sha256:e2bb577d10d09a2d8822a042a23b8d62bc3b269667c9eb8e60a6edfa000211b1",
"sha256:f97a605d7c8bc2c6d1172c2f0d5a65b24142e11a58de689046e62c2d632ca8c1"
],
"index": "pypi",
"version": "==0.761"
},
"mypy-extensions": {
"hashes": [
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
],
"version": "==0.4.3"
},
"neobolt": {
"hashes": [
"sha256:ca4e87679fe3ed39aec23638658e02dbdc6bbc3289a04e826f332e05ab32275d"
],
"version": "==1.7.16"
},
"neotime": {
"hashes": [
"sha256:4e0477ba0f24e004de2fa79a3236de2bd941f20de0b5db8d976c52a86d7363eb"
],
"version": "==1.7.4"
},
"nose": {
"hashes": [
"sha256:9ff7c6cc443f8c51994b34a667bbcf45afd6d945be7477b52e97516fd17c53ac",
"sha256:dadcddc0aefbf99eea214e0f1232b94f2fa9bd98fa8353711dacb112bfcbbb2a",
"sha256:f1bffef9cbc82628f6e7d7b40d7e255aefaa1adb6a1b1d26c69a8b79e6208a98"
],
"index": "pypi",
"version": "==1.3.7"
},
"packaging": {
"hashes": [
"sha256:170748228214b70b672c581a3dd610ee51f733018650740e98c7df862a583f73",
"sha256:e665345f9eef0c621aa0bf2f8d78cf6d21904eef16a93f020240b704a57f1334"
],
"version": "==20.1"
},
"pillow": {
"hashes": [
"sha256:0a628977ac2e01ca96aaae247ec2bd38e729631ddf2221b4b715446fd45505be",
"sha256:4d9ed9a64095e031435af120d3c910148067087541131e82b3e8db302f4c8946",
"sha256:54ebae163e8412aff0b9df1e88adab65788f5f5b58e625dc5c7f51eaf14a6837",
"sha256:5bfef0b1cdde9f33881c913af14e43db69815c7e8df429ceda4c70a5e529210f",
"sha256:5f3546ceb08089cedb9e8ff7e3f6a7042bb5b37c2a95d392fb027c3e53a2da00",
"sha256:5f7ae9126d16194f114435ebb79cc536b5682002a4fa57fa7bb2cbcde65f2f4d",
"sha256:62a889aeb0a79e50ecf5af272e9e3c164148f4bd9636cc6bcfa182a52c8b0533",
"sha256:7406f5a9b2fd966e79e6abdaf700585a4522e98d6559ce37fc52e5c955fade0a",
"sha256:8453f914f4e5a3d828281a6628cf517832abfa13ff50679a4848926dac7c0358",
"sha256:87269cc6ce1e3dee11f23fa515e4249ae678dbbe2704598a51cee76c52e19cda",
"sha256:875358310ed7abd5320f21dd97351d62de4929b0426cdb1eaa904b64ac36b435",
"sha256:8ac6ce7ff3892e5deaab7abaec763538ffd011f74dc1801d93d3c5fc541feee2",
"sha256:91b710e3353aea6fc758cdb7136d9bbdcb26b53cefe43e2cba953ac3ee1d3313",
"sha256:9d2ba4ed13af381233e2d810ff3bab84ef9f18430a9b336ab69eaf3cd24299ff",
"sha256:a62ec5e13e227399be73303ff301f2865bf68657d15ea50b038d25fc41097317",
"sha256:ab76e5580b0ed647a8d8d2d2daee170e8e9f8aad225ede314f684e297e3643c2",
"sha256:bf4003aa538af3f4205c5fac56eacaa67a6dd81e454ffd9e9f055fff9f1bc614",
"sha256:bf598d2e37cf8edb1a2f26ed3fb255191f5232badea4003c16301cb94ac5bdd0",
"sha256:c18f70dc27cc5d236f10e7834236aff60aadc71346a5bc1f4f83a4b3abee6386",
"sha256:c5ed816632204a2fc9486d784d8e0d0ae754347aba99c811458d69fcdfd2a2f9",
"sha256:dc058b7833184970d1248135b8b0ab702e6daa833be14035179f2acb78ff5636",
"sha256:ff3797f2f16bf9d17d53257612da84dd0758db33935777149b3334c01ff68865"
],
"version": "==7.0.0"
},
"prompt-toolkit": {
"hashes": [
"sha256:46642344ce457641f28fc9d1c9ca939b63dadf8df128b86f1b9860e59c73a5e4",
"sha256:e7f8af9e3d70f514373bf41aa51bc33af12a6db3f71461ea47fea985defb2c31",
"sha256:f15af68f66e664eaa559d4ac8a928111eebd5feda0c11738b5998045224829db"
],
"version": "==2.0.10"
},
"psutil": {
"hashes": [
"sha256:094f899ac3ef72422b7e00411b4ed174e3c5a2e04c267db6643937ddba67a05b",
"sha256:10b7f75cc8bd676cfc6fa40cd7d5c25b3f45a0e06d43becd7c2d2871cbb5e806",
"sha256:1b1575240ca9a90b437e5a40db662acd87bbf181f6aa02f0204978737b913c6b",
"sha256:21231ef1c1a89728e29b98a885b8e0a8e00d09018f6da5cdc1f43f988471a995",
"sha256:28f771129bfee9fc6b63d83a15d857663bbdcae3828e1cb926e91320a9b5b5cd",
"sha256:70387772f84fa5c3bb6a106915a2445e20ac8f9821c5914d7cbde148f4d7ff73",
"sha256:b560f5cd86cf8df7bcd258a851ca1ad98f0d5b8b98748e877a0aec4e9032b465",
"sha256:b74b43fecce384a57094a83d2778cdfc2e2d9a6afaadd1ebecb2e75e0d34e10d",
"sha256:e85f727ffb21539849e6012f47b12f6dd4c44965e56591d8dec6e8bc9ab96f4a",
"sha256:fd2e09bb593ad9bdd7429e779699d2d47c1268cbde4dda95fcd1bd17544a0217",
"sha256:ffad8eb2ac614518bbe3c0b8eb9dffdb3a8d2e3a7d5da51c5b974fb723a5c5aa"
],
"version": "==5.6.7"
},
"py2neo": {
"hashes": [
"sha256:a218ccb4b636e3850faa6b74ebad80f00600217172a57f745cf223d38a219222"
],
"version": "==4.3.0"
},
"pycodestyle": {
"hashes": [
"sha256:95a2219d12372f05704562a14ec30bc76b05a5b297b21a5dfe3f6fac3491ae56",
"sha256:e40a936c9a450ad81df37f549d676d127b1b66000a6c500caa2b085bc0ca976c"
],
"version": "==2.5.0"
},
"pydeep": {
"hashes": [
"sha256:22866eb422d1d5907f8076ee792da65caecb172425d27576274e2a8eacf6afc1"
],
"version": "==0.4"
},
"pyflakes": {
"hashes": [
"sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0",
"sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2"
],
"version": "==2.1.1"
},
"pygments": {
"hashes": [
"sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a",
"sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d"
],
"version": "==2.3.1"
},
"pymisp": {
"editable": true,
"extras": [
"fileobjects",
"openioc",
"virustotal",
"pdfexport"
],
"path": "."
},
"pyparsing": {
"hashes": [
"sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f",
"sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec"
],
"version": "==2.4.6"
},
"pyrsistent": {
"hashes": [
"sha256:cdc7b5e3ed77bed61270a47d35434a30617b9becdf2478af76ad2c6ade307280"
],
"version": "==0.15.7"
},
"python-dateutil": {
"hashes": [
"sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c",
"sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a"
],
"version": "==2.8.1"
},
"python-magic": {
"hashes": [
"sha256:f2674dcfad52ae6c49d4803fa027809540b130db1dec928cfbb9240316831375",
"sha256:f3765c0f582d2dfc72c15f3b5a82aecfae9498bd29ca840d72f37d7bd38bfcd5"
],
"version": "==0.4.15"
},
"pytz": {
"hashes": [
"sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d",
"sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be"
],
"version": "==2019.3"
},
"recommonmark": {
"hashes": [
"sha256:29cd4faeb6c5268c633634f2d69aef9431e0f4d347f90659fd0aab20e541efeb",
"sha256:2ec4207a574289355d5b6ae4ae4abb29043346ca12cdd5f07d374dc5987d2852"
],
"version": "==0.6.0"
},
"reportlab": {
"hashes": [
"sha256:2a1c4ea2155fd5b6e3f89e36b8aa21b5a14c9bbaf9b44de2787641668bc95edc",
"sha256:2b7469a98df1315d4f52319c4438eaee3fdd17330830edadae775e9312402638",
"sha256:3b556160aac294fa661545245e4bc273328f9226e5110139647f4d4bc0cfc453",
"sha256:3eb25d2c2bde078815d8f7ea400abbcae16a0c498a4b27ead3c4a620b1f1f980",
"sha256:3f229c0b2ca27eb5b08777981d3bd0d34e59bfa306627b88d80c3734cd3e26d5",
"sha256:4695755cc70b7a9308508aa41eafc3f335348be0eadd86e8f92cb87815d6177b",
"sha256:4f97b4474e419ae5c441ecdf0db8eceb5f5af0461bdf73e3e5ec05353844045c",
"sha256:550d2d8516e468192e12be8aeaf80f3bd805dc46dd0a5a4ddf2a3e1cd8149a16",
"sha256:59aa9c4ca80d397f6cabec092b5a6e2304fb1b7ca53e5b650872aae13ebfeb68",
"sha256:6e4479b75778b9c1e4640dc90efb72cb990471d56089947d6be4ccd9e7a56a3c",
"sha256:6e9434bd0afa6d6fcf9abbc565750cc456b6e60dc49abd7cd2bc7cf414ee079b",
"sha256:73e4e30b72da1f9f8caba775ad9cc027957c2340c38ba2d6622a9f2351b12c3a",
"sha256:7c05c2ba8ab32f02b23a56a75a4d136c2bfb7221a04a8306835a938fa6711644",
"sha256:849e4cabce1ed1183e83dc89570810b3bf9bf9cf0d0a605bde854a0baf212124",
"sha256:863c6fcf5fc0c8184b6315885429f5468373a3def2eb0c0073d09b79b2161113",
"sha256:8e688df260682038ecd32f106d796024fbcf70e7bf54340b14f991bd5465f97a",
"sha256:9675a26d01ec141cb717091bb139b6227bfb3794f521943101da50327bff4825",
"sha256:969b0d9663c0c641347d2408d41e6723e84d9f7863babc94438c91295c74f36d",
"sha256:978560732758bf5fca4ec1ed124afe2702d08824f6b0364cca31519bd5e7dadd",
"sha256:99ea85b47248c6cdbece147bdbd67aed16209bdd95770aa1f151ec3bb8794496",
"sha256:9cdc318c37fa959909db5beb05ca0b684d3e2cba8f40af1ce6f332c3f69bd2b8",
"sha256:b55c26510ff7f135af8eae1216372028cde7dab22003d918649fce219020eb58",
"sha256:cb301340b4fc1f2b7b25ea4584c5cbde139ced2d4ff01ad5e8fcf7d7822982b0",
"sha256:e7578a573454a5490553fb091374996d32269dff44021a401763080bda1357cf",
"sha256:e84387d35a666aafafda332afca8a75fb04f097cc0a2dc2d04e8c90a83cf7c1b",
"sha256:eb66eff64ea75f028af3ac63a7a2bf1e8733297141a85cbdffd5deaef404fa52",
"sha256:f5e3afd2cc35a73f34c3084c69fe4653591611da5189e50b58db550bb46e340a",
"sha256:f6c10628386bfe0c1f6640c28fb262d0960bb26c249cefabb755fb273323220d"
],
"version": "==3.5.34"
},
"requests": {
"hashes": [
"sha256:11e007a8a2aa0323f5a921e9e6a2d7e4e67d9877e85773fba9ba6419025cbeb4",
"sha256:9cf5292fcd0f598c671cfc1e0d7d1a7f13bb8085e9a590f48c010551dc6c4b31"
],
"version": "==2.22.0"
},
"requests-mock": {
"hashes": [
"sha256:510df890afe08d36eca5bb16b4aa6308a6f85e3159ad3013bac8b9de7bd5a010",
"sha256:88d3402dd8b3c69a9e4f9d3a73ad11b15920c6efd36bc27bf1f701cf4a8e4646"
],
"index": "pypi",
"version": "==1.7.0"
},
"six": {
"hashes": [
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.14.0"
},
"snowballstemmer": {
"hashes": [
"sha256:209f257d7533fdb3cb73bdbd24f436239ca3b2fa67d56f6ff88e86be08cc5ef0",
"sha256:df3bac3df4c2c01363f3dd2cfa78cce2840a79b9f1c2d2de9ce8d31683992f52"
],
"version": "==2.0.0"
},
"soupsieve": {
"hashes": [
"sha256:bdb0d917b03a1369ce964056fc195cfdff8819c40de04695a80bc813c3cfa1f5",
"sha256:e2c1c5dee4a1c36bcb790e0fabd5492d874b8ebd4617622c4f6a731701060dda"
],
"version": "==1.9.5"
},
"sphinx": {
"hashes": [
"sha256:298537cb3234578b2d954ff18c5608468229e116a9757af3b831c2b2b4819159",
"sha256:e6e766b74f85f37a5f3e0773a1e1be8db3fcb799deb58ca6d18b70b0b44542a5"
],
"version": "==2.3.1"
},
"sphinx-autodoc-typehints": {
"hashes": [
"sha256:27c9e6ef4f4451766ab8d08b2d8520933b97beb21c913f3df9ab2e59b56e6c6c",
"sha256:a6b3180167479aca2c4d1ed3b5cb044a70a76cccd6b38662d39288ebd9f0dff0"
],
"version": "==1.10.3"
},
"sphinxcontrib-applehelp": {
"hashes": [
"sha256:edaa0ab2b2bc74403149cb0209d6775c96de797dfd5b5e2a71981309efab3897",
"sha256:fb8dee85af95e5c30c91f10e7eb3c8967308518e0f7488a2828ef7bc191d0d5d"
],
"version": "==1.0.1"
},
"sphinxcontrib-devhelp": {
"hashes": [
"sha256:6c64b077937330a9128a4da74586e8c2130262f014689b4b89e2d08ee7294a34",
"sha256:9512ecb00a2b0821a146736b39f7aeb90759834b07e81e8cc23a9c70bacb9981"
],
"version": "==1.0.1"
},
"sphinxcontrib-htmlhelp": {
"hashes": [
"sha256:4670f99f8951bd78cd4ad2ab962f798f5618b17675c35c5ac3b2132a14ea8422",
"sha256:d4fd39a65a625c9df86d7fa8a2d9f3cd8299a3a4b15db63b50aac9e161d8eff7"
],
"version": "==1.0.2"
},
"sphinxcontrib-jsmath": {
"hashes": [
"sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178",
"sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"
],
"version": "==1.0.1"
},
"sphinxcontrib-qthelp": {
"hashes": [
"sha256:513049b93031beb1f57d4daea74068a4feb77aa5630f856fcff2e50de14e9a20",
"sha256:79465ce11ae5694ff165becda529a600c754f4bc459778778c7017374d4d406f"
],
"version": "==1.0.2"
},
"sphinxcontrib-serializinghtml": {
"hashes": [
"sha256:c0efb33f8052c04fd7a26c0a07f1678e8512e0faec19f4aa8f2473a8b81d5227",
"sha256:db6615af393650bf1151a6cd39120c29abaf93cc60db8c48eb2dddbfdc3a9768"
],
"version": "==1.1.3"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"version": "==1.4.1"
},
"typing-extensions": {
"hashes": [
"sha256:091ecc894d5e908ac75209f10d5b4f118fbdb2eb1ede6a63544054bb1edb41f2",
"sha256:910f4656f54de5993ad9304959ce9bb903f90aadc7c67a0bef07e678014e892d",
"sha256:cf8b63fedea4d89bab840ecbb93e75578af28f76f66c35889bd7065f5af88575"
],
"version": "==3.7.4.1"
},
"urllib3": {
"hashes": [
"sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc",
"sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc"
],
"version": "==1.25.8"
},
"validators": {
"hashes": [
"sha256:b192e6bde7d617811d59f50584ed240b580375648cd032d106edeb3164099508"
],
"version": "==0.14.2"
},
"wcwidth": {
"hashes": [
"sha256:8fd29383f539be45b20bd4df0dc29c20ba48654a41e661925e612311e9f3c603",
"sha256:f28b3e8a6483e5d49e7f8949ac1a78314e740333ae305b4ba5defd3e74fb37a8"
],
"version": "==0.1.8"
},
"wrapt": {
"hashes": [
"sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1"
],
"version": "==1.11.2"
}
}
}

README.md

@@ -1,59 +1,48 @@
**IMPORTANT NOTE**: This library will require **at least** python 3.6 starting the 1st of January 2020. If you have to legacy versions of python, please use PyMISP v2.4.119.1, and consider updating your system(s). Anything released within the last 2 years will do, starting with Ubuntu 18.04.
**IMPORTANT NOTE**: This library will require **at least** python 3.6 starting the 1st of January 2020. If you have legacy versions of python, please use PyMISP v2.4.119.1, and consider updating your system(s). Anything released within the last 2 years will do, starting with Ubuntu 18.04.
README
======
# PyMISP - Python Library to access MISP
[![Documentation Status](https://readthedocs.org/projects/pymisp/badge/?version=latest)](http://pymisp.readthedocs.io/?badge=latest)
[![Build Status](https://travis-ci.org/MISP/PyMISP.svg?branch=master)](https://travis-ci.org/MISP/PyMISP)
[![Coverage Status](https://coveralls.io/repos/github/MISP/PyMISP/badge.svg?branch=master)](https://coveralls.io/github/MISP/PyMISP?branch=master)
[![Build Status](https://travis-ci.org/MISP/PyMISP.svg?branch=main)](https://travis-ci.org/MISP/PyMISP)
[![Coverage Status](https://coveralls.io/repos/github/MISP/PyMISP/badge.svg?branch=main)](https://coveralls.io/github/MISP/PyMISP?branch=main)
[![Python 3.6](https://img.shields.io/badge/python-3.6+-blue.svg)](https://www.python.org/downloads/release/python-360/)
[![PyPi version](https://img.shields.io/pypi/v/pymisp.svg)](https://pypi.python.org/pypi/pymisp/)
[![Number of PyPI downloads](https://img.shields.io/pypi/dm/pymisp.svg)](https://pypi.python.org/pypi/pymisp/)
# PyMISP - Python Library to access MISP
PyMISP is a Python library to access [MISP](https://github.com/MISP/MISP) platforms via their REST API.
PyMISP allows you to fetch events, add or update events/attributes, add or update samples or search for attributes.
## Requirements
* [requests](http://docs.python-requests.org)
## Install from pip
**It is strongly recommended to use a virtual environment**
If you want to know more about virtual environments, [python has you covered](https://docs.python.org/3/tutorial/venv.html)
Only basic dependencies:
```
pip3 install pymisp
```
## Install the latest version from repo
With optional dependencies:
```
pip3 install pymisp[fileobjects,openioc,virustotal]
```
## Install the latest version from repo for development purposes
**Note**: poetry is required; e.g., "pip3 install poetry"
```
git clone https://github.com/MISP/PyMISP.git && cd PyMISP
git submodule update --init
pip3 install -I .[fileobjects,openioc,virustotal]
poetry install -E fileobjects -E openioc -E virustotal -E docs -E pdfexport
```
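A quick way to check that the development install resolved correctly (a minimal sketch; it only assumes the command above finished without errors):
```python
# Minimal smoke test inside the poetry environment:
# import the freshly installed package and print its version.
import pymisp

print(pymisp.__version__)
```
The same check also works as a one-liner: `poetry run python -c "import pymisp; print(pymisp.__version__)"`.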
## Installing it with virtualenv
It is recommended to use virtualenv to not polute your OS python envirenment.
```
pip3 install virtualenv
git clone https://github.com/MISP/PyMISP.git && cd PyMISP
python3 -m venv ./venv
source venv/bin/activate
git submodule update --init
pip3 install -I .[fileobjects,openioc,virustotal]
```
## Running the tests
### Running the tests
```bash
pip3 install -U nose pip setuptools coveralls codecov requests-mock
pip3 install git+https://github.com/kbandla/pydeep.git
git clone https://github.com/viper-framework/viper-test-files.git tests/viper-test-files
nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
poetry run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
```
If you have a MISP instance to test against, you can also run the live ones:
@@ -61,7 +50,7 @@ If you have a MISP instance to test against, you can also run the live ones:
**Note**: You need to update the key in `tests/testlive_comprehensive.py` to the automation key of your admin account.
```bash
nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/testlive_comprehensive.py
poetry run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/testlive_comprehensive.py
```
## Samples and how to use PyMISP
@@ -91,7 +80,7 @@ python3 last.py -l 45m # 45 minutes
## Debugging
You have two options there:
You have two options here:
1. Pass `debug=True` to `PyMISP` and it will enable logging.DEBUG to stderr on the whole module
@@ -102,7 +91,7 @@ You have two options there:
import logging
logger = logging.getLogger('pymisp')
# Configure it as you whish, for example, enable DEBUG mode:
# Configure it as you wish, for example, enable DEBUG mode:
logger.setLevel(logging.DEBUG)
```
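For the first option, a minimal sketch of what that looks like, assuming a reachable MISP instance and a valid automation key (`misp_url` and `misp_key` below are placeholders):
```python
from pymisp import PyMISP

misp_url = 'https://misp.example.com'  # placeholder, use your own instance
misp_key = 'YOUR AUTOMATION KEY'       # placeholder, found in your MISP profile

# debug=True switches the whole pymisp module to logging.DEBUG, printed to stderr
misp = PyMISP(misp_url, misp_key, ssl=True, debug=True)
```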
@@ -119,11 +108,11 @@ logging.basicConfig(level=logging.DEBUG, filename="debug.log", filemode='w', for
## Test cases
1. The content of `mispevent.py` is tested on every commit
2. The tests cases that require a running MISP instance can be run the following way:
2. The test cases that require a running MISP instance can be run the following way:
```bash
# From a pipenv
# From poetry
nosetests-3.4 -s --with-coverage --cover-package=pymisp,tests --cover-tests tests/testlive_comprehensive.py:TestComprehensive.[test_name]
@@ -131,29 +120,23 @@ nosetests-3.4 -s --with-coverage --cover-package=pymisp,tests --cover-tests test
## Documentation
[PyMISP API documentation is available](https://media.readthedocs.org/pdf/pymisp/latest/pymisp.pdf).
Documentation can be generated with epydoc:
```
epydoc --url https://github.com/MISP/PyMISP --graph all --name PyMISP --pdf pymisp -o doc
```
The documentation is available [here](https://pymisp.readthedocs.io/en/latest/).
### Jupyter notebook
A series of [Jupyter notebooks for PyMISP tutorial](https://github.com/MISP/PyMISP/tree/master/docs/tutorial) are available in the repository.
A series of [Jupyter notebooks for PyMISP tutorial](https://github.com/MISP/PyMISP/tree/main/docs/tutorial) are available in the repository.
## Everything is a Mutable Mapping
... or at least everything that can be imported/exported from/to a json blob
`AbstractMISP` is the master class, and inherit `collections.MutableMapping` which means
`AbstractMISP` is the master class, and inherits from `collections.MutableMapping` which means
the class can be represented as a python dictionary.
The abstraction assumes every property that should not be seen in the dictionary is prepended with a `_`,
or its name is added to the private list `__not_jsonable` (accessible through `update_not_jsonable` and `set_not_jsonable`).
This master class has helpers that will make it easy to load, and export, to, and from, a json string.
This master class has helpers that make it easy to load, and export to, and from, a json string.
`MISPEvent`, `MISPAttribute`, `MISPObjectReference`, `MISPObjectAttribute`, and `MISPObject`
are subclasses of AbstractMISP, which mean that they can be handled as python dictionaries.
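A short sketch of what that dictionary behaviour looks like in practice (illustrative only; nothing beyond `MISPEvent` is assumed):
```python
from pymisp import MISPEvent

event = MISPEvent()
event.info = 'Event built as an object...'
event.add_attribute('ip-dst', '198.51.100.23')

# ...but readable like a dictionary, thanks to the MutableMapping interface
print(event['info'])
print(list(event.keys()))

# The helpers mentioned above handle the JSON round trip
json_blob = event.to_json()
event_copy = MISPEvent()
event_copy.from_json(json_blob)
```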
@@ -162,6 +145,11 @@ are subclasses of AbstractMISP, which mean that they can be handled as python di
Creating a new MISP object generator should be done using a pre-defined template and inherit `AbstractMISPObjectGenerator`.
Your new MISPObject generator need to generate attributes, and add them as class properties using `add_attribute`.
Your new MISPObject generator must generate attributes and add them as class properties using `add_attribute`.
When the object is sent to MISP, all the class properties will be exported to the JSON export.
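A minimal sketch of such a generator, reusing the existing `domain-ip` template so no new template definition is needed (class and parameter names are illustrative):
```python
from pymisp.tools import AbstractMISPObjectGenerator


class ExampleDomainIPObject(AbstractMISPObjectGenerator):
    def __init__(self, parameters: dict, **kwargs):
        # Reuse the pre-defined 'domain-ip' object template bundled with PyMISP
        super().__init__('domain-ip', **kwargs)
        self._parameters = parameters
        self.generate_attributes()

    def generate_attributes(self):
        # Each add_attribute call becomes a class property and ends up in the JSON export
        self.add_attribute('domain', value=self._parameters['domain'])
        self.add_attribute('ip', value=self._parameters['ip'])


obj = ExampleDomainIPObject({'domain': 'circl.lu', 'ip': '149.13.33.14'})
print(obj.to_json())
```
Such an object can then be attached to an event with `event.add_object(obj)`, just like the objects created inline in the tutorial notebook further down.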
# License
PyMISP is distributed under an [open source license](./LICENSE), a simplified 2-BSD license.


@@ -40,6 +40,7 @@ extensions = [
'sphinx.ext.viewcode',
'sphinx.ext.napoleon',
'sphinx.ext.imgconverter',
'recommonmark',
]
napoleon_google_docstring = False
@@ -76,9 +77,9 @@ author = 'Raphaël Vinot'
# built documents.
#
# The short X.Y version.
version = 'master'
version = 'main'
# The full version, including alpha/beta/rc tags.
release = 'master'
release = 'main'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -132,6 +133,9 @@ pygments_style = 'sphinx'
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
# lief is a bit difficult to install
autodoc_mock_imports = ["lief"]
# -- Options for HTML output ----------------------------------------------


@@ -9,7 +9,7 @@ Welcome to PyMISP's documentation!
Contents:
.. toctree::
:maxdepth: 4
:maxdepth: 2
README
modules


@@ -1,5 +1,5 @@
pymisp
======
pymisp - Modules
================
.. toctree::
:maxdepth: 4
@@ -14,12 +14,6 @@ PyMISP
.. autoclass:: PyMISP
:members:
PyMISPExpanded (Python 3.6+ only)
---------------------------------
.. autoclass:: ExpandedPyMISP
:members:
MISPAbstract
------------


@@ -419,6 +419,40 @@
"print(event.to_json())\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## New first/last seen"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pymisp import MISPObject\n",
"\n",
"misp_object = event.add_object(name='domain-ip', comment='My Fancy new object, in one line')\n",
"\n",
"obj_attr = misp_object.add_attribute('domain', value='circl.lu')\n",
"obj_attr.add_tag('tlp:green')\n",
"misp_object.add_attribute('ip', value='149.13.33.14')\n",
"\n",
"misp_object.first_seen = '2018-04-11'\n",
"misp_object.last_seen = '2018-06-11T23:27:40.23356+07:00'\n",
"\n",
"print(misp_object.last_seen)\n",
"\n",
"misp_object.add_attributes('ip', {'value': '10.8.8.8', 'to_ids': False}, '10.9.8.8')\n",
"\n",
"\n",
"misp_object.add_reference(obj_attr.uuid, 'related-to', 'Expanded with passive DNS entry')\n",
"\n",
"print(event.to_json(indent=2))"
]
},
{
"cell_type": "markdown",
"metadata": {},
@ -714,6 +748,78 @@
"print(event.to_json())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Generate a feed"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pymisp import MISPEvent, MISPOrganisation\n",
"from pymisp.tools import feed_meta_generator\n",
"from pathlib import Path\n",
"import json\n",
"\n",
"out_dir = Path('feed_test')\n",
"out_dir.mkdir(exist_ok=True)\n",
"\n",
"org = MISPOrganisation()\n",
"org.name = \"Test Org\"\n",
"org.uuid = \"972360d2-2c96-4004-937c-ba010d03f925\"\n",
"\n",
"event = MISPEvent()\n",
"\n",
"event.info = 'This is my new MISP event for a feed'\n",
"event.distribution = 1\n",
"event.Orgc = org\n",
"event.add_attribute('ip-dst', \"8.8.8.8\")\n",
"\n",
"feed_event = event.to_feed()\n",
"\n",
"with (out_dir / f'{event.uuid}.json').open('w') as f:\n",
" json.dump(feed_event, f)\n",
"\n",
"\n",
"feed_meta_generator(out_dir)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!ls feed_test"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!cat feed_test/manifest.json\n",
"\n",
"!echo ''\n",
"\n",
"!cat feed_test/hashes.csv"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!rm feed_test/*"
]
},
{
"cell_type": "markdown",
"metadata": {},
@ -853,10 +959,9 @@
"metadata": {},
"outputs": [],
"source": [
"from pymisp import ExpandedPyMISP, PyMISP\n",
"from pymisp import PyMISP\n",
"\n",
"misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)\n",
"misp_old = PyMISP(misp_url, misp_key, misp_verifycert)"
"misp = PyMISP(misp_url, misp_key, misp_verifycert)"
]
},
{
@ -907,56 +1012,7 @@
"existing_event.add_object(mispObject)\n",
"print(existing_event.to_json())\n",
"\n",
"res = misp.update(existing_event)\n",
"existing_event = MISPEvent()\n",
"existing_event.load(res)\n",
"print(existing_event.to_json())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Old API\n",
"\n",
"Returns plain JSON"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from pymisp import MISPEvent, MISPObject\n",
"\n",
"event = MISPEvent()\n",
"event.info = 'This is my new MISP event' # Required\n",
"event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config\n",
"event.threat_level_id = 2 # Optional, defaults to MISP.default_event_threat_level in MISP config\n",
"event.analysis = 1 # Optional, defaults to 0 (initial analysis)\n",
"\n",
"mispObject = MISPObject('file')\n",
"mispObject.add_attribute('filename', type='filename',\n",
" value='filename.exe',\n",
" Tag=[{'name': 'tlp:amber'}])\n",
"\n",
"event.add_object(mispObject)\n",
"\n",
"print(misp)\n",
"res = misp.add_event(event)\n",
"print(res)\n",
"existing_event = MISPEvent()\n",
"existing_event.load(res)\n",
"mispObject = MISPObject('file')\n",
"mispObject.add_attribute('filename', type='filename',\n",
" value='filename2.exe',\n",
" Tag=[{'name': 'tlp:white'}])\n",
"\n",
"existing_event.add_object(mispObject)\n",
"print(existing_event.to_json())\n",
"\n",
"res = misp.update(existing_event)\n",
"res = misp.update_event(existing_event)\n",
"existing_event = MISPEvent()\n",
"existing_event.load(res)\n",
"print(existing_event.to_json())"
@ -995,19 +1051,6 @@
"print(\"Event id: %s\" % event.id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"event = misp_old.new_event(distribution=1,\n",
" threat_level_id=1,\n",
" analysis=1,\n",
" info=\"Event from notebook\")\n",
"print(\"Event id: %s\" % event['Event']['id'])"
]
},
{
"cell_type": "markdown",
"metadata": {},
@ -1028,7 +1071,7 @@
"event_obj.threat_level_id = 1\n",
"event_obj.analysis = 1\n",
"event_obj.info = \"Event from notebook 2\"\n",
"event = misp.add_event(event_obj)\n",
"event = misp.add_event(event_obj, pythonify=True)\n",
"event_id = event.id\n",
"print(\"Event id: %s\" % event_id)"
]
@ -1097,31 +1140,6 @@
"to_ids = False"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"##### Oldish API"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"proposal = False\n",
"updated_event = misp.add_named_attribute(event_id,\n",
" attr_type,\n",
" value,\n",
" category=category,\n",
" to_ids=to_ids,\n",
" proposal=proposal)\n",
"print(updated_event)"
]
},
{
"cell_type": "markdown",
"metadata": {},
@ -1153,7 +1171,7 @@
"attribute.category = category\n",
"attribute.to_ids = to_ids\n",
"\n",
"attribute_to_change = misp.add_attribute(event_id, attribute)\n",
"attribute_to_change = misp.add_attribute(event_id, attribute, pythonify=True)\n",
"print(attribute_to_change.id, attribute_to_change)"
]
},
@ -1368,7 +1386,7 @@
"metadata": {},
"outputs": [],
"source": [
"misp.get_sharing_groups()"
"misp.sharing_groups()"
]
},
{
@ -1384,7 +1402,7 @@
"metadata": {},
"outputs": [],
"source": [
"misp.get_users_list()"
"misp.users()"
]
},
{
@ -1409,7 +1427,7 @@
"metadata": {},
"outputs": [],
"source": [
"misp.get_organisations_list()"
"misp.organisations()"
]
},
{
@ -1425,7 +1443,7 @@
"metadata": {},
"outputs": [],
"source": [
"misp.get_roles_list()"
"misp.roles()"
]
},
{
@ -1441,7 +1459,7 @@
"metadata": {},
"outputs": [],
"source": [
"misp.get_feeds_list()"
"misp.feeds(pythonify=True)"
]
},
{
@ -1477,7 +1495,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
"version": "3.8.2"
}
},
"nbformat": 4,

View File

@ -52,9 +52,9 @@
"metadata": {},
"outputs": [],
"source": [
"from pymisp import ExpandedPyMISP\n",
"from pymisp import PyMISP\n",
"\n",
"misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert, debug=False)"
"misp = PyMISP(misp_url, misp_key, misp_verifycert, debug=False)"
]
},
{
@ -364,7 +364,16 @@
"metadata": {},
"outputs": [],
"source": [
"print(r)"
"r = misp.search(tags=['%tlp:amber%'], pythonify=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(r[0].tags)"
]
},
{
@ -595,7 +604,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
"version": "3.7.5"
}
},
"nbformat": 4,

View File

@ -70,7 +70,7 @@
"source": [
"## Search unpublished events\n",
"\n",
"**WARNING**: By default, the search query will only return all the events listed on teh index page"
"**WARNING**: By default, the search query will only return all the events listed on the index page"
]
},
{

View File

@ -49,7 +49,7 @@ if __name__ == '__main__':
if args.force_new:
me = create_new_event()
else:
response = pymisp.search_index(tag=args.tag, timestamp='1h', pythonify=True)
response = pymisp.search_index(tags=args.tag, timestamp='1h', pythonify=True)
if response:
if args.disable_new:
event_id = response[0].id

65
examples/add_github_user.py Executable file
View File

@ -0,0 +1,65 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from pymisp import MISPObject
from pymisp.tools import update_objects
from keys import misp_url, misp_key, misp_verifycert
import argparse
import requests
import sys
"""
usage: add_github_user.py [-h] -e EVENT [-f] -u USERNAME
Fetch GitHub user details and add them as an object in MISP
optional arguments:
-h, --help show this help message and exit
-e EVENT, --event EVENT
Event ID to update
-f, --force-template-update
-u USERNAME, --username USERNAME
GitHub username to add
"""
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Fetch GitHub user details and add them as an object in MISP')
parser.add_argument("-e", "--event", required=True, help="Event ID to update")
parser.add_argument("-f", "--force-template-update", required=False, action="store_true")
parser.add_argument("-u", "--username", required=True, help="GitHub username to add")
args = parser.parse_args()
r = requests.get("https://api.github.com/users/{}".format(args.username))
if r.status_code != 200:
sys.exit("HTTP return is {} and not 200 as expected".format(r.status_code))
if args.force_template_update:
print("Updating MISP Object templates...")
update_objects()
pymisp = PyMISP(misp_url, misp_key, misp_verifycert)
misp_object = MISPObject(name="github-user")
github_user = r.json()
rfollowers = requests.get(github_user['followers_url'])
followers = rfollowers.json()
rfollowing = requests.get("https://api.github.com/users/{}/following".format(args.username))
followings = rfollowing.json()
rkeys = requests.get("https://api.github.com/users/{}/keys".format(args.username))
keys = rkeys.json()
misp_object.add_attributes("follower", *[follower['login'] for follower in followers])
misp_object.add_attributes("following", *[following['login'] for following in followings])
misp_object.add_attributes("ssh-public-key", *[sshkey['key'] for sshkey in keys])
misp_object.add_attribute('bio', github_user['bio'])
misp_object.add_attribute('link', github_user['html_url'])
misp_object.add_attribute('user-fullname', github_user['name'])
misp_object.add_attribute('username', github_user['login'])
misp_object.add_attribute('twitter_username', github_user['twitter_username'])
misp_object.add_attribute('location', github_user['location'])
misp_object.add_attribute('company', github_user['company'])
misp_object.add_attribute('public_gists', github_user['public_gists'])
misp_object.add_attribute('public_repos', github_user['public_repos'])
misp_object.add_attribute('blog', github_user['blog'])
misp_object.add_attribute('node_id', github_user['node_id'])
retcode = pymisp.add_object(args.event, misp_object)

56
examples/add_gitlab_user.py Executable file
View File

@ -0,0 +1,56 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from pymisp import MISPObject
from pymisp.tools import update_objects
from keys import misp_url, misp_key, misp_verifycert
import argparse
import requests
import sys
"""
usage: add_gitlab_user.py [-h] -e EVENT [-f] -u USERNAME [-l LINK]
Fetch GitLab user details and add them as an object in MISP
optional arguments:
-h, --help show this help message and exit
-e EVENT, --event EVENT
Event ID to update
-f, --force-template-update
-u USERNAME, --username USERNAME
GitLab username to add
-l LINK, --link LINK Url to access the GitLab instance, Default is
www.gitlab.com.
"""
default_url = "http://www.gitlab.com/"
parser = argparse.ArgumentParser(description='Fetch GitLab user details and add them as an object in MISP')
parser.add_argument("-e", "--event", required=True, help="Event ID to update")
parser.add_argument("-f", "--force-template-update", required=False, action="store_true")
parser.add_argument("-u", "--username", required=True, help="GitLab username to add")
parser.add_argument("-l", "--link", required=False, help="Url to access the GitLab instance, Default is www.gitlab.com.", default=default_url)
args = parser.parse_args()
r = requests.get("{}api/v4/users?username={}".format(args.link, args.username))
if r.status_code != 200:
sys.exit("HTTP return is {} and not 200 as expected".format(r.status_code))
if args.force_template_update:
print("Updating MISP Object templates...")
update_objects()
gitlab_user = r.json()[0]
pymisp = PyMISP(misp_url, misp_key, misp_verifycert)
print(gitlab_user)
misp_object = MISPObject(name="gitlab-user")
misp_object.add_attribute('username', gitlab_user['username'])
misp_object.add_attribute('id', gitlab_user['id'])
misp_object.add_attribute('name', gitlab_user['name'])
misp_object.add_attribute('state', gitlab_user['state'])
misp_object.add_attribute('avatar_url', gitlab_user['avatar_url'])
misp_object.add_attribute('web_url', gitlab_user['web_url'])
retcode = pymisp.add_object(args.event, misp_object)

View File

@ -0,0 +1,57 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import ExpandedPyMISP, MISPOrganisation, MISPSharingGroup
from keys import misp_url, misp_key, misp_verifycert
import argparse
import csv
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Add organizations from a CSV file')
parser.add_argument("-c", "--csv-import", required=True, help="The CSV file containing the organizations. Format 'orgname,nationality,sector,type,contacts,uuid,local,sharingroup_uuid'")
args = parser.parse_args()
misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)
# CSV format
# orgname,nationality,sector,type,contacts,uuid,local,sharingroup
with open(args.csv_import) as csv_file:
count_orgs = 0
csv_reader = csv.reader(csv_file, delimiter=',')
for row in csv_reader:
org = MISPOrganisation()
org.name = row[0]
print("Process {}".format(org.name))
org.nationality = row[1]
org.sector = row[2]
org.type = row[3]
org.contacts = row[4]
org.uuid = row[5]
org.local = row[6]
add_org = misp.add_organisation(org, pythonify=True)
if 'errors' in add_org:
print(add_org['errors'])
else:
count_orgs = count_orgs + 1
org_uuid = add_org.uuid
if org_uuid:
sharinggroup = MISPSharingGroup()
sharinggroup_uuid = row[7]
if sharinggroup_uuid:
sharinggroup.uuid = sharinggroup_uuid
add_sharing = misp.add_org_to_sharing_group(sharinggroup, org)
else:
print("Organisation {} not added to sharing group, missing sharing group uuid".format(org.name))
print("Import finished, {} organisations added".format(count_orgs))

View File

@ -1,37 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
import json
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
result = m.get_event(event)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Get an event from a MISP instance.')
parser.add_argument("-e", "--event", required=True, help="Event ID to get.")
parser.add_argument("-a", "--attribute", help="Attribute ID to modify. A little dirty for now, argument need to be included in event")
parser.add_argument("-t", "--tag", required=True, type=int, help="Tag ID.")
parser.add_argument("-m", "--modify_attribute", action='store_true', help="If set, the tag will be add to the attribute, otherwise to the event.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
event = misp.get_event(args.event)
if args.modify_attribute:
for temp in event['Event']['Attribute']:
if temp['id'] == args.attribute:
attribute = temp
break
misp.add_tag(attribute, args.tag, attribute=True)
else:
misp.add_tag(event['Event'], args.tag)

View File

@ -3,15 +3,11 @@
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
import json
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
result = m.get_event(event)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Tag something.')
@ -29,8 +25,7 @@ if __name__ == '__main__':
if args.event and not args.attribute:
result = misp.search(eventid=args.event)
data = result['response']
for event in data:
for event in result:
uuid = event['Event']['uuid']
if args.attribute:
@ -38,8 +33,7 @@ if __name__ == '__main__':
print("Please provide event ID also")
exit()
result = misp.search(eventid=args.event)
data = result['response']
for event in data:
for event in result:
for attribute in event['Event']['Attribute']:
if attribute["id"] == args.attribute:
uuid = attribute["uuid"]

View File

@ -0,0 +1,40 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from pymisp import PyMISP
import sys
import json
# NOTE: the user of the API key *needs to be a sync user*
remote_url = 'https://misp.remote'
remote_api_key = 'REMOTE KEY FOR SYNC USER'
remote_verify = True
# NOTE: the user of the API key *needs to be an admin*
own_url = 'https://misp.own'
own_api_key = 'OWN KEY FOR ADMIN USER'
own_verify = True
remote_misp = PyMISP(url=remote_url, key=remote_api_key, ssl=remote_verify)
sync_config = remote_misp.get_sync_config()
if 'errors' in sync_config:
print('Something went wrong:')
print(json.dumps(sync_config, indent=2))
sys.exit(1)
else:
print('Successfully got a sync config:')
print(json.dumps(sync_config, indent=2))
own_misp = PyMISP(url=own_url, key=own_api_key, ssl=own_verify)
response = own_misp.import_server(sync_config)
if 'errors' in response:
print('Something went wrong:')
print(json.dumps(response, indent=2))
sys.exit(1)
else:
print('Successfully added the sync config:')
print(json.dumps(response, indent=2))

View File

@ -0,0 +1,152 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from pathlib import Path
from csv import DictReader
from pymisp import MISPEvent, MISPOrganisation, PyMISP, MISPObject
from datetime import datetime
from dateutil.parser import parse
import json
from pymisp.tools import feed_meta_generator
from io import BytesIO
from collections import defaultdict
make_feed = False
aggregate_by_country = True
path = Path('/home/raphael/gits/COVID-19/csse_covid_19_data/csse_covid_19_daily_reports/')
def get_country_region(row):
if 'Country/Region' in row:
return row['Country/Region']
elif 'Country_Region' in row:
return row['Country_Region']
else:
print(p, row.keys())
raise Exception()
def get_last_update(row):
if 'Last_Update' in row:
return parse(row['Last_Update'])
elif 'Last Update' in row:
return parse(row['Last Update'])
else:
print(p, row.keys())
raise Exception()
def add_detailed_object(obj, row):
if 'Province/State' in row:
if row['Province/State']:
obj.add_attribute('province-state', row['Province/State'])
elif '\ufeffProvince/State' in row:
if row['\ufeffProvince/State']:
obj.add_attribute('province-state', row['\ufeffProvince/State'])
elif 'Province_State' in row:
if row['Province_State']:
obj.add_attribute('province-state', row['Province_State'])
else:
print(p, row.keys())
raise Exception()
obj.add_attribute('country-region', get_country_region(row))
obj.add_attribute('update', get_last_update(row))
if 'Lat' in row:
obj.add_attribute('latitude', row['Lat'])
if 'Long_' in row:
obj.add_attribute('longitude', row['Long_'])
elif 'Long' in row:
obj.add_attribute('longitude', row['Long'])
if row['Confirmed']:
obj.add_attribute('confirmed', int(row['Confirmed']))
if row['Deaths']:
obj.add_attribute('death', int(row['Deaths']))
if row['Recovered']:
obj.add_attribute('recovered', int(row['Recovered']))
if 'Active' in row and row['Active']:
obj.add_attribute('active', int(row['Active']))
def country_aggregate(aggregate, row):
c = get_country_region(row)
if c not in aggregate:
aggregate[c] = defaultdict(active=0, death=0, recovered=0, confirmed=0, update=datetime.fromtimestamp(0))
if row['Confirmed']:
aggregate[c]['confirmed'] += int(row['Confirmed'])
if row['Deaths']:
aggregate[c]['death'] += int(row['Deaths'])
if row['Recovered']:
aggregate[c]['recovered'] += int(row['Recovered'])
if 'Active' in row and row['Active']:
aggregate[c]['active'] += int(row['Active'])
update = get_last_update(row)
if update > aggregate[c]['update']:
aggregate[c]['update'] = update
if make_feed:
org = MISPOrganisation()
org.name = 'CIRCL'
org.uuid = "55f6ea5e-2c60-40e5-964f-47a8950d210f"
else:
from covid_key import url, key
misp = PyMISP(url, key)
for p in path.glob('**/*.csv'):
d = datetime.strptime(p.name[:-4], '%m-%d-%Y').date()
event = MISPEvent()
if aggregate_by_country:
event.info = f"[{d.isoformat()}] CSSE COVID-19 daily report"
else:
event.info = f"[{d.isoformat()}] CSSE COVID-19 detailed daily report"
event.date = d
event.distribution = 3
event.add_tag('tlp:white')
if make_feed:
event.orgc = org
else:
e = misp.search(eventinfo=event.info, metadata=True, pythonify=True)
if e:
# Already added.
continue
event.add_attribute('attachment', p.name, data=BytesIO(p.open('rb').read()))
event.add_attribute('link', f'https://github.com/CSSEGISandData/COVID-19/tree/master/csse_covid_19_data/csse_covid_19_daily_reports/{p.name}', comment='Source')
if aggregate_by_country:
aggregate = defaultdict()
with p.open() as f:
reader = DictReader(f)
for row in reader:
if aggregate_by_country:
country_aggregate(aggregate, row)
else:
obj = MISPObject(name='covid19-csse-daily-report')
add_detailed_object(obj, row)
event.add_object(obj)
if aggregate_by_country:
for country, values in aggregate.items():
obj = event.add_object(name='covid19-csse-daily-report', standalone=False)
obj.add_attribute('country-region', country)
obj.add_attribute('update', values['update'])
obj.add_attribute('confirmed', values['confirmed'])
obj.add_attribute('death', values['death'])
obj.add_attribute('recovered', values['recovered'])
obj.add_attribute('active', values['active'])
if make_feed:
with (Path('output') / f'{event.uuid}.json').open('w') as _w:
json.dump(event.to_feed(), _w)
else:
event = misp.add_event(event)
misp.publish(event)
if make_feed:
feed_meta_generator(Path('output'))

View File

@ -0,0 +1,77 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from pathlib import Path
from pymisp import MISPEvent, MISPOrganisation, PyMISP
from dateutil.parser import parse
import json
from pymisp.tools import feed_meta_generator
from io import BytesIO
make_feed = False
path = Path('/home/raphael/gits/covid-19-china/data')
if make_feed:
org = MISPOrganisation()
org.name = 'CIRCL'
org.uuid = "55f6ea5e-2c60-40e5-964f-47a8950d210f"
else:
from covid_key import url, key
misp = PyMISP(url, key)
for p in path.glob('*_json/current_china.json'):
d = parse(p.parent.name[:-5])
event = MISPEvent()
event.info = f"[{d.isoformat()}] DXY COVID-19 live report"
event.date = d
event.distribution = 3
event.add_tag('tlp:white')
if make_feed:
event.orgc = org
else:
e = misp.search(eventinfo=event.info, metadata=True, pythonify=True)
if e:
# Already added.
continue
event.add_attribute('attachment', p.name, data=BytesIO(p.open('rb').read()))
with p.open() as f:
data = json.load(f)
for province in data:
obj_province = event.add_object(name='covid19-dxy-live-province', standalone=False)
obj_province.add_attribute('province', province['provinceName'])
obj_province.add_attribute('update', d)
if province['currentConfirmedCount']:
obj_province.add_attribute('current-confirmed', province['currentConfirmedCount'])
if province['confirmedCount']:
obj_province.add_attribute('total-confirmed', province['confirmedCount'])
if province['curedCount']:
obj_province.add_attribute('total-cured', province['curedCount'])
if province['deadCount']:
obj_province.add_attribute('total-death', province['deadCount'])
if province['comment']:
obj_province.add_attribute('comment', province['comment'])
for city in province['cities']:
obj_city = event.add_object(name='covid19-dxy-live-city', standalone=False)
obj_city.add_attribute('city', city['cityName'])
obj_city.add_attribute('update', d)
if city['currentConfirmedCount']:
obj_city.add_attribute('current-confirmed', city['currentConfirmedCount'])
if city['confirmedCount']:
obj_city.add_attribute('total-confirmed', city['confirmedCount'])
if city['curedCount']:
obj_city.add_attribute('total-cured', city['curedCount'])
if city['deadCount']:
obj_city.add_attribute('total-death', city['deadCount'])
obj_city.add_reference(obj_province, 'part-of')
if make_feed:
with (Path('output') / f'{event.uuid}.json').open('w') as _w:
json.dump(event.to_feed(), _w)
else:
misp.add_event(event)
if make_feed:
feed_meta_generator(Path('output'))

549
examples/cytomic_orion.py Executable file
View File

@ -0,0 +1,549 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
Cytomic Automation
Put this script in crontab to run every /15 or /60
*/15 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/cytomic_orion.py
Fetches the configuration set in the Cytomic Orion enrichment module
- events : upload events tagged with the 'upload' tag, all the attributes supported by Cytomic Orion
- upload : upload attributes flagged with the 'upload' tag (only attributes supported by Cytomic Orion)
- delete : delete attributes flagged with the 'upload' tag (only attributes supported by Cytomic Orion)
'''
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
import re
import sys
import requests
import json
import urllib3
def get_token(token_url, clientid, clientsecret, scope, grant_type, username, password):
'''
Get oAuth2 token
Configuration settings are fetched first from the MISP module config
'''
try:
if scope and grant_type and username and password:
data = {'scope': scope, 'grant_type': grant_type, 'username': username, 'password': password}
if token_url and clientid and clientsecret:
access_token_response = requests.post(token_url, data=data, verify=False, allow_redirects=False, auth=(clientid, clientsecret))
tokens = json.loads(access_token_response.text)
if 'access_token' in tokens:
access_token = tokens['access_token']
return access_token
else:
sys.exit('No token received')
else:
sys.exit('No token_url, clientid or clientsecret supplied')
else:
sys.exit('No scope, grant_type, username or password supplied')
except Exception:
sys.exit('Unable to connect to token_url')
def get_config(url, key, misp_verifycert):
'''
Get the module config and the settings needed to access the API
Also contains the settings to do the query
'''
try:
misp_headers = {'Content-Type': 'application/json', 'Accept': 'application/json', 'Authorization': key}
req = requests.get(url + 'servers/serverSettings.json', verify=misp_verifycert, headers=misp_headers)
if req.status_code == 200:
req_json = req.json()
if 'finalSettings' in req_json:
finalSettings = req_json['finalSettings']
clientid = clientsecret = scope = username = password = grant_type = api_url = token_url = ''
module_enabled = False
scope = 'orion.api'
grant_type = 'password'
limit_upload_events = 50
limit_upload_attributes = 50
ttlDays = "1"
last_attributes = '5d'
post_threat_level_id = 2
for el in finalSettings:
# Is the module enabled?
if el['setting'] == 'Plugin.Enrichment_cytomic_orion_enabled':
module_enabled = el['value']
if module_enabled is False:
break
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_clientid':
clientid = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_clientsecret':
clientsecret = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_username':
username = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_password':
password = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_api_url':
api_url = el['value'].replace('\\/', '/')
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_token_url':
token_url = el['value'].replace('\\/', '/')
elif el['setting'] == 'MISP.baseurl':
misp_baseurl = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_upload_threat_level_id':
if el['value']:
try:
post_threat_level_id = int(el['value'])
except:
continue
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_upload_ttlDays':
if el['value']:
try:
ttlDays = "{last_days}".format(last_days=int(el['value']))
except:
continue
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_upload_timeframe':
if el['value']:
try:
last_attributes = "{last_days}d".format(last_days=int(el['value']))
except:
continue
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_upload_tag':
upload_tag = el['value']
elif el['setting'] == 'Plugin.Enrichment_cytomic_orion_delete_tag':
delete_tag = el['value']
elif el['setting'] == 'Plugin.Enrichment_limit_upload_events':
if el['value']:
try:
limit_upload_events = "{limit_upload_events}".format(limit_upload_events=int(el['value']))
except:
continue
elif el['setting'] == 'Plugin.Enrichment_limit_upload_attributes':
if el['value']:
try:
limit_upload_attributes = "{limit_upload_attributes}".format(limit_upload_attributes=int(el['value']))
except:
continue
else:
sys.exit('Did not receive a 200 code from MISP')
if module_enabled and api_url and token_url and clientid and clientsecret and username and password and grant_type:
return {'cytomic_policy': 'Detect',
'upload_timeframe': last_attributes,
'upload_tag': upload_tag,
'delete_tag': delete_tag,
'upload_ttlDays': ttlDays,
'post_threat_level_id': post_threat_level_id,
'clientid': clientid,
'clientsecret': clientsecret,
'scope': scope,
'username': username,
'password': password,
'grant_type': grant_type,
'api_url': api_url,
'token_url': token_url,
'misp_baseurl': misp_baseurl,
'limit_upload_events': limit_upload_events,
'limit_upload_attributes': limit_upload_attributes}
else:
sys.exit('Did not receive all the necessary configuration information from MISP')
except Exception as e:
sys.exit('Unable to get module config from MISP')
class cytomicobject:
misp = None
lst_evtid = None
lst_attuuid = None
lst_attuuid_error = None
endpoint_ioc = None
api_call_headers = None
post_data = None
args = None
tag = None
limit_events = None
limit_attributes = None
atttype_misp = None
atttype_cytomic = None
attlabel_cytomic = None
att_types = {
"ip-dst": {"ip": "ipioc"},
"ip-src": {"ip": "ipioc"},
"url": {"url": "urlioc"},
"md5": {"hash": "filehashioc"},
"domain": {"domain": "domainioc"},
"hostname": {"domain": "domainioc"},
"domain|ip": {"domain": "domainioc"},
"hostname|port": {"domain": "domainioc"}
}
debug = True
error = False
res = False
res_msg = None
def collect_events_ids(cytomicobj, moduleconfig):
# Get events that contain Cytomic tag.
try:
evt_result = cytomicobj.misp.search(controller='events', limit=cytomicobj.limit_events, tags=cytomicobj.tag, last=moduleconfig['upload_timeframe'], published=True, deleted=False, pythonify=True)
cytomicobj.lst_evtid = ['x', 'y']
for evt in evt_result:
evt = cytomicobj.misp.get_event(event=evt['id'], pythonify=True)
if len(evt.tags) > 0:
for tg in evt.tags:
if tg.name == cytomicobj.tag:
if not cytomicobj.lst_evtid:
cytomicobj.lst_evtid = str(evt['id'])
else:
if not evt['id'] in cytomicobj.lst_evtid:
cytomicobj.lst_evtid.append(str(evt['id']))
break
cytomicobj.lst_evtid.remove('x')
cytomicobj.lst_evtid.remove('y')
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to collect events ids')
def find_eventid(cytomicobj, evtid):
# Get events that contain Cytomic tag.
try:
cytomicobj.res = False
for id in cytomicobj.lst_evtid:
if id == evtid:
cytomicobj.res = True
break
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to collect events ids')
def print_result_events(cytomicobj):
try:
if cytomicobj.res_msg is not None:
for key, msg in cytomicobj.res_msg.items():
if msg is not None:
print(key, msg)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to print result')
def set_postdata(cytomicobj, moduleconfig, attribute):
# Set JSON to send to the API.
try:
if cytomicobj.args.upload or cytomicobj.args.events:
event = attribute['Event']
event_title = event['info']
event_id = event['id']
threat_level_id = int(event['threat_level_id'])
if moduleconfig['post_threat_level_id'] <= threat_level_id:
if cytomicobj.atttype_misp == 'domain|ip' or cytomicobj.atttype_misp == 'hostname|port':
post_value = attribute['value'].split('|')[0]
else:
post_value = attribute['value']
if cytomicobj.atttype_misp == 'url' and 'http' not in post_value:
pass
else:
if cytomicobj.post_data is None:
cytomicobj.post_data = [{cytomicobj.attlabel_cytomic: post_value, 'AdditionalData': '{} {}'.format(cytomicobj.atttype_misp, attribute['comment']).strip(), 'Source': 'Uploaded from MISP', 'Policy': moduleconfig['cytomic_policy'], 'Description': '{} - {}'.format(event_id, event_title).strip()}]
else:
if post_value not in str(cytomicobj.post_data):
cytomicobj.post_data.append({cytomicobj.attlabel_cytomic: post_value, 'AdditionalData': '{} {}'.format(cytomicobj.atttype_misp, attribute['comment']).strip(), 'Source': 'Uploaded from MISP', 'Policy': moduleconfig['cytomic_policy'], 'Description': '{} - {}'.format(event_id, event_title).strip()})
else:
if cytomicobject.debug:
print('Event %s skipped because of lower threat level' % event_id)
else:
event = attribute['Event']
threat_level_id = int(event['threat_level_id'])
if moduleconfig['post_threat_level_id'] <= threat_level_id:
if cytomicobj.atttype_misp == 'domain|ip' or cytomicobj.atttype_misp == 'hostname|port':
post_value = attribute['value'].split('|')[0]
else:
post_value = attribute['value']
if cytomicobj.atttype_misp == 'url' and 'http' not in post_value:
pass
else:
if cytomicobj.post_data is None:
cytomicobj.post_data = [{cytomicobj.attlabel_cytomic: post_value}]
else:
cytomicobj.post_data.append({cytomicobj.attlabel_cytomic: post_value})
else:
if cytomicobject.debug:
print('Event %s skipped because of lower threat level' % event_id)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to process post-data')
def send_postdata(cytomicobj, evtid=None):
# Batch post to upload event attributes.
try:
if cytomicobj.post_data is not None:
if cytomicobj.debug:
print('POST: {} {}'.format(cytomicobj.endpoint_ioc, cytomicobj.post_data))
result_post_endpoint_ioc = requests.post(cytomicobj.endpoint_ioc, headers=cytomicobj.api_call_headers, json=cytomicobj.post_data, verify=False)
json_result_post_endpoint_ioc = json.loads(result_post_endpoint_ioc.text)
print(result_post_endpoint_ioc)
if 'true' not in (result_post_endpoint_ioc.text):
cytomicobj.error = True
if evtid is not None:
if cytomicobj.res_msg['Event: ' + str(evtid)] is None:
cytomicobj.res_msg['Event: ' + str(evtid)] = '(Send POST data: errors uploading attributes, event NOT untagged). If the problem persists, please check that the format of the attribute values is correct.'
else:
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.res_msg['Event: ' + str(evtid)] + ' (Send POST data -else: errors uploading attributes, event NOT untagged). If the problem persists, please check that the format of the attribute values is correct.'
if cytomicobj.debug:
print('RESULT: {}'.format(json_result_post_endpoint_ioc))
else:
if evtid is None:
cytomicobj.error = True
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to post attributes')
def process_attributes(cytomicobj, moduleconfig, evtid=None):
# Get attributes to process.
try:
for misptype, cytomictypes in cytomicobject.att_types.items():
cytomicobj.atttype_misp = misptype
for cytomiclabel, cytomictype in cytomictypes.items():
cytomicobj.attlabel_cytomic = cytomiclabel
cytomicobj.atttype_cytomic = cytomictype
cytomicobj.post_data = None
icont = 0
if cytomicobj.args.upload or cytomicobj.args.events:
cytomicobj.endpoint_ioc = moduleconfig['api_url'] + '/iocs/' + cytomicobj.atttype_cytomic + '?ttlDays=' + str(moduleconfig['upload_ttlDays'])
else:
cytomicobj.endpoint_ioc = moduleconfig['api_url'] + '/iocs/eraser/' + cytomicobj.atttype_cytomic
# Get attributes to upload/delete and prepare JSON
# If evtid is set; we're called from --events
if cytomicobject.debug:
print("\nSearching for attributes of type %s" % cytomicobj.atttype_misp)
if evtid is None:
cytomicobj.error = False
attr_result = cytomicobj.misp.search(controller='attributes', last=moduleconfig['upload_timeframe'], limit=cytomicobj.limit_attributes, type_attribute=cytomicobj.atttype_misp, tag=cytomicobj.tag, published=True, deleted=False, includeProposals=False, include_context=True, to_ids=True)
else:
if cytomicobj.error:
break
# We don't search with tags; we have an event for which we want to upload all events
attr_result = cytomicobj.misp.search(controller='attributes', eventid=evtid, last=moduleconfig['upload_timeframe'], limit=cytomicobj.limit_attributes, type_attribute=cytomicobj.atttype_misp, published=True, deleted=False, includeProposals=False, include_context=True, to_ids=True)
cytomicobj.lst_attuuid = ['x', 'y']
if len(attr_result['Attribute']) > 0:
for attribute in attr_result['Attribute']:
if evtid is not None:
if cytomicobj.error:
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.res_msg['Event: ' + str(evtid)] + ' (errors uploading attributes, event NOT untagged). If the problem persists, please check that the format of the attribute values is correct.'
break
if icont >= cytomicobj.limit_attributes:
if not cytomicobj.error and cytomicobj.post_data is not None:
# Send data to Cytomic
send_postdata(cytomicobj, evtid)
if not cytomicobj.error:
if 'Event: ' + str(evtid) in cytomicobj.res_msg:
if cytomicobj.res_msg['Event: ' + str(evtid)] is None:
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.attlabel_cytomic + 's: ' + str(icont)
else:
cytomicobj.res_msg['Event: ' + str(evtid)] += ' | ' + cytomicobj.attlabel_cytomic + 's: ' + str(icont)
else:
if cytomicobject.debug:
print('Data sent (' + cytomicobj.attlabel_cytomic + '): ' + str(icont))
cytomicobj.post_data = None
if cytomicobj.error:
if evtid is not None:
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.res_msg['Event: ' + str(evtid)] + ' (errors uploading attributes, event NOT untagged). If the problem persists, please check that the format of the attribute values is correct.'
break
icont = 0
if evtid is None:
event = attribute['Event']
event_id = event['id']
find_eventid(cytomicobj, str(event_id))
if not cytomicobj.res:
if not cytomicobj.lst_attuuid:
cytomicobj.lst_attuuid = attribute['uuid']
else:
if not attribute['uuid'] in cytomicobj.lst_attuuid:
cytomicobj.lst_attuuid.append(attribute['uuid'])
icont += 1
# Prepare data to send
set_postdata(cytomicobj, moduleconfig, attribute)
else:
icont += 1
# Prepare data to send
set_postdata(cytomicobj, moduleconfig, attribute)
if not cytomicobj.error:
# Send data to Cytomic
send_postdata(cytomicobj, evtid)
if not cytomicobj.error and cytomicobj.post_data is not None and icont > 0:
# Data sent; process response
if cytomicobj.res_msg is not None and 'Event: ' + str(evtid) in cytomicobj.res_msg:
if cytomicobj.res_msg['Event: ' + str(evtid)] is None:
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.attlabel_cytomic + 's: ' + str(icont)
else:
cytomicobj.res_msg['Event: ' + str(evtid)] += ' | ' + cytomicobj.attlabel_cytomic + 's: ' + str(icont)
else:
if cytomicobject.debug:
print('Data sent (' + cytomicobj.attlabel_cytomic + '): ' + str(icont))
if not cytomicobj.error:
cytomicobj.lst_attuuid.remove('x')
cytomicobj.lst_attuuid.remove('y')
# Untag attributes
untag_attributes(cytomicobj)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to get attributes')
def untag_event(evtid):
# Remove tag of the event being processed.
try:
cytomicobj.records = 0
evt = cytomicobj.misp.get_event(event=evtid, pythonify=True)
if len(evt.tags) > 0:
for tg in evt.tags:
if tg.name == cytomicobj.tag:
cytomicobj.misp.untag(evt['uuid'], cytomicobj.tag)
cytomicobj.records += 1
cytomicobj.res_msg['Event: ' + str(evtid)] = cytomicobj.res_msg['Event: ' + str(evtid)] + ' (event untagged)'
break
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to untag events')
def process_events(cytomicobj, moduleconfig):
# Get events that contain Cytomic tag.
try:
collect_events_ids(cytomicobj, moduleconfig)
total_attributes_sent = 0
for evtid in cytomicobj.lst_evtid:
cytomicobj.error = False
if cytomicobj.res_msg is None:
cytomicobj.res_msg = {'Event: ' + str(evtid): None}
else:
cytomicobj.res_msg['Event: ' + str(evtid)] = None
if cytomicobject.debug:
print('Event id: ' + str(evtid))
# get attributes of each known type of the event / prepare data to send / send data to Cytomic
process_attributes(cytomicobj, moduleconfig, evtid)
if not cytomicobj.error:
untag_event(evtid)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to process events ids')
def untag_attributes(cytomicobj):
# Remove tag of attributes sent.
try:
icont = 0
if len(cytomicobj.lst_attuuid) > 0:
for uuid in cytomicobj.lst_attuuid:
attr = cytomicobj.misp.get_attribute(attribute=uuid, pythonify=True)
if len(attr.tags) > 0:
for tg in attr.tags:
if tg.name == cytomicobj.tag:
cytomicobj.misp.untag(uuid, cytomicobj.tag)
icont += 1
break
print('Attributes untagged (' + str(icont) + ')')
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to untag attributes')
def process_attributes_upload(cytomicobj, moduleconfig):
# get attributes of each known type / prepare data to send / send data to Cytomic
try:
collect_events_ids(cytomicobj, moduleconfig)
process_attributes(cytomicobj, moduleconfig)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to upload attributes to Cytomic')
def process_attributes_delete(cytomicobj, moduleconfig):
# get attributes of each known type / prepare data to send / send data to Cytomic
try:
collect_events_ids(cytomicobj, moduleconfig)
process_attributes(cytomicobj, moduleconfig)
except Exception:
cytomicobj.error = True
if cytomicobj.debug:
sys.exit('Unable to delete attributes in Cytomic')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Upload or delete indicators to Cytomic API')
group = parser.add_mutually_exclusive_group()
group.add_argument('--events', action='store_true', help='Upload events indicators')
group.add_argument('--upload', action='store_true', help='Upload indicators')
group.add_argument('--delete', action='store_true', help='Delete indicators')
args = parser.parse_args()
if not args.upload and not args.delete and not args.events:
sys.exit("No valid action for the API")
if misp_verifycert is False:
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
module_config = get_config(misp_url, misp_key, misp_verifycert)
cytomicobj = cytomicobject
misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert, debug=cytomicobject.debug)
cytomicobj.misp = misp
cytomicobj.args = args
access_token = get_token(module_config['token_url'], module_config['clientid'], module_config['clientsecret'], module_config['scope'], module_config['grant_type'], module_config['username'], module_config['password'])
cytomicobj.api_call_headers = {'Authorization': 'Bearer ' + access_token}
if cytomicobj.debug:
print('Received access token')
if cytomicobj.args.events:
cytomicobj.tag = module_config['upload_tag']
cytomicobj.limit_events = module_config['limit_upload_events']
cytomicobj.limit_attributes = module_config['limit_upload_attributes']
process_events(cytomicobj, module_config)
print_result_events(cytomicobj)
elif cytomicobj.args.upload:
cytomicobj.tag = module_config['upload_tag']
cytomicobj.limit_events = 0
cytomicobj.limit_attributes = module_config['limit_upload_attributes']
process_attributes_upload(cytomicobj, module_config)
else:
cytomicobj.tag = module_config['delete_tag']
cytomicobj.limit_events = 0
cytomicobj.limit_attributes = module_config['limit_upload_attributes']
process_attributes_delete(cytomicobj, module_config)

View File

@ -7,7 +7,7 @@ import argparse
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Delete the user with the given id. Keep in mind that disabling users (by setting the disabled flag via an edit) is always prefered to keep user associations to events intact.')
parser = argparse.ArgumentParser(description='Delete the user with the given id. Keep in mind that disabling users (by setting the disabled flag via an edit) is always preferred to keep user associations to events intact.')
parser.add_argument("-i", "--user_id", help="The id of the user you want to delete.")
args = parser.parse_args()

View File

@ -55,7 +55,10 @@ def floodemail(misp, event, maxlength=25):
def create_dummy_event(misp):
return misp.new_event(0, 4, 0, 'dummy event')
event = MISPEvent()
event.info = 'Dummy event'
event = misp.add_event(event, pythonify=True)
return event
def create_massive_dummy_events(misp, nbattribute):

View File

@ -11,7 +11,7 @@
````
# Feed generator
git clone https://github.com/CIRCL/PyMISP
git clone https://github.com/MISP/PyMISP
cd examples/feed-generator-from-redis
cp settings.default.py settings.py
vi settings.py # adjust your settings
@ -66,7 +66,7 @@ python3 server.py
>>> obj_data = { "session": "session_id", "username": "admin", "password": "admin", "protocol": "telnet" }
>>> generator.add_object_to_event(obj_name, **obj_data)
# Immediatly write the event to the disk (Bypassing the default flushing behavior)
# Immediately write the event to the disk (Bypassing the default flushing behavior)
>>> generator.flush_event()
```

View File

@ -107,7 +107,7 @@ class RedisToMISPFeed:
# Suffix not provided, try to add anyway
if settings.fallback_MISP_type == 'attribute':
new_key = key + self.SUFFIX_ATTR
# Add atribute type from the config
# Add attribute type from the config
if 'type' not in data and settings.fallback_attribute_type:
data['type'] = settings.fallback_attribute_type
else:

View File

@ -7,6 +7,11 @@ import os
from pymisp import ExpandedPyMISP
from settings import entries, url, key, ssl, outputdir, filters, valid_attribute_distribution_levels
try:
from settings import include_deleted
except ImportError:
include_deleted = False
valid_attribute_distributions = []
@ -64,7 +69,7 @@ if __name__ == '__main__':
total = len(events)
for event in events:
try:
e = misp.get_event(event.uuid, pythonify=True)
e = misp.get_event(event.uuid, deleted=include_deleted, pythonify=True)
e_feed = e.to_feed(valid_distributions=valid_attribute_distributions, with_meta=True)
except Exception as e:
print(e, event.uuid)

View File

@ -24,6 +24,8 @@ entries = 200
# tagged tlp:white and/or feed-export but exclude anything tagged privint
filters = {'published':'true'}
# Include deleted attributes and objects in the events
include_deleted = False
# By default all attributes will be included in the feed generation
# Remove the levels that you do not wish to include in the feed

View File

@ -4,8 +4,8 @@
* It will also generate a html document with a table (attribute\_table.html) containing count for each type of attribute.
* test\_attribute\_treemap.html is a quick page made to visualize both treemap and table at the same time.
* tags\_count.py is a script that count the number of occurences of every tags in a fetched sample of Events in a given period of time.
* tag\_search.py is a script that count the number of occurences of a given tag in a fetched sample of Events in a given period of time.
* tags\_count.py is a script that counts the number of occurrences of every tag in a fetched sample of Events in a given period of time.
* tag\_search.py is a script that counts the number of occurrences of a given tag in a fetched sample of Events in a given period of time.
* Events will be fetched from _days_ days ago to today.
* _begindate_ is the beginning of the studied period. If it is later than today, an error will be raised.
* _enddate_ is the end of the studied period. If it is earlier than _begindate_, an error will be raised.

View File

@ -1,405 +1,405 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
Maxime Thiebaut
Generate a report of your MISP statistics
Put this script in crontab to run every /15 or /60
*/5 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/stats_report.py -t 30d -m -v
Do inline config in "main"
'''
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
from datetime import datetime
from datetime import date
import time
import sys
import smtplib
import mimetypes
from email.mime.multipart import MIMEMultipart
from email import encoders
from email.mime.base import MIMEBase
from email.mime.text import MIMEText
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def init(url, key, verifycert):
'''
Template to get MISP module started
'''
return ExpandedPyMISP(url, key, verifycert, 'json')
def get_data(misp, timeframe, date_from=None, date_to=None):
'''
Get the event data to build our report
'''
number_of_misp_events = 0
number_of_attributes = 0
number_of_attributes_to_ids = 0
attr_type = {}
attr_category = {}
tags_type = {}
tags_tlp = {'tlp:white': 0, 'tlp:green': 0, 'tlp:amber': 0, 'tlp:red': 0}
tags_misp_galaxy_mitre = {}
tags_misp_galaxy = {}
tags_misp_galaxy_threat_actor = {}
galaxies = {}
galaxies_cluster = {}
threat_levels_counts = [0, 0, 0, 0]
analysis_completion_counts = [0, 0, 0]
report = {}
try:
if date_from and date_to:
stats_event_response = misp.search(date_from=date_from, date_to=date_to)
else:
stats_event_response = misp.search(last=timeframe)
# Number of new or updated events since timestamp
report['number_of_misp_events'] = len(stats_event_response)
report['misp_events'] = []
for event in stats_event_response:
event_data = event['Event']
timestamp = datetime.utcfromtimestamp(int(event_data['timestamp'])).strftime(ts_format)
publish_timestamp = datetime.utcfromtimestamp(int(event_data['publish_timestamp'])).strftime(ts_format)
threat_level_id = int(event_data['threat_level_id']) - 1
threat_levels_counts[threat_level_id] = threat_levels_counts[threat_level_id] + 1
threat_level_id = threat_levels[threat_level_id]
analysis_id = int(event_data['analysis'])
analysis_completion_counts[analysis_id] = analysis_completion_counts[analysis_id] + 1
analysis = analysis_completion[analysis_id]
report['misp_events'].append({'id': event_data['id'], 'title': event_data['info'].replace('\n', '').encode('utf-8'), 'date': event_data['date'], 'timestamp': timestamp, 'publish_timestamp': publish_timestamp, 'threat_level': threat_level_id, 'analysis_completion': analysis})
# Walk through the attributes
if 'Attribute' in event_data:
event_attr = event_data['Attribute']
for attr in event_attr:
number_of_attributes = number_of_attributes + 1
type = attr['type']
category = attr['category']
to_ids = attr['to_ids']
if to_ids:
number_of_attributes_to_ids = number_of_attributes_to_ids + 1
if type in attr_type:
attr_type[type] = attr_type[type] + 1
else:
attr_type[type] = 1
if category in attr_category:
attr_category[category] = attr_category[category] + 1
else:
attr_category[category] = 1
# Process tags
if 'Tag' in event_data:
tags_attr = event_data['Tag']
for tag in tags_attr:
tag_title = tag['name']
if tag_title.lower().replace(' ', '') in tags_tlp:
tags_tlp[tag_title.lower().replace(' ', '')] = tags_tlp[tag_title.lower().replace(' ', '')] + 1
if 'misp-galaxy:mitre-' in tag_title:
if tag_title in tags_misp_galaxy_mitre:
tags_misp_galaxy_mitre[tag_title] = tags_misp_galaxy_mitre[tag_title] + 1
else:
tags_misp_galaxy_mitre[tag_title] = 1
if 'misp-galaxy:threat-actor=' in tag_title:
if tag_title in tags_misp_galaxy_threat_actor:
tags_misp_galaxy_threat_actor[tag_title] = tags_misp_galaxy_threat_actor[tag_title] + 1
else:
tags_misp_galaxy_threat_actor[tag_title] = 1
elif 'misp-galaxy:' in tag_title:
if tag_title in tags_misp_galaxy:
tags_misp_galaxy[tag_title] = tags_misp_galaxy[tag_title] + 1
else:
tags_misp_galaxy[tag_title] = 1
if tag_title in tags_type:
tags_type[tag_title] = tags_type[tag_title] + 1
else:
tags_type[tag_title] = 1
# Process the galaxies
if 'Galaxy' in event_data:
galaxy_attr = event_data['Galaxy']
for galaxy in galaxy_attr:
galaxy_title = galaxy['type']
if galaxy_title in galaxies:
galaxies[galaxy_title] = galaxies[galaxy_title] + 1
else:
galaxies[galaxy_title] = 1
for cluster in galaxy['GalaxyCluster']:
cluster_value = cluster['type']
if cluster_value in galaxies_cluster:
galaxies_cluster[cluster_value] = galaxies_cluster[cluster_value] + 1
else:
galaxies_cluster[cluster_value] = 1
report['number_of_attributes'] = number_of_attributes
report['number_of_attributes_to_ids'] = number_of_attributes_to_ids
report['attr_type'] = attr_type
report['attr_category'] = attr_category
report['tags_type'] = tags_type
report['tags_tlp'] = tags_tlp
report['tags_misp_galaxy_mitre'] = tags_misp_galaxy_mitre
report['tags_misp_galaxy'] = tags_misp_galaxy
report['tags_misp_galaxy_threat_actor'] = tags_misp_galaxy_threat_actor
report['galaxies'] = galaxies
report['galaxies_cluster'] = galaxies_cluster
# General MISP statistics
user_statistics = misp.users_statistics()
if user_statistics and 'errors' not in user_statistics:
report['user_statistics'] = user_statistics
# Return the report data
return report
except Exception as e:
sys.exit('Unable to get statistics from MISP')
def build_report(report, timeframe, misp_url, sanitize_report=True):
'''
Build the body of the report and optional attachments
'''
attachments = {}
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
report_body = "MISP Report %s for last %s on %s\n-------------------------------------------------------------------------------" % (current_date, timeframe, misp_url)
else:
report_body = "MISP Report %s from %s to %s on %s\n-------------------------------------------------------------------------------" % (current_date, date_from, date_to, misp_url)
report_body = report_body + '\nNew or updated events: %s' % report['number_of_misp_events']
report_body = report_body + '\nNew or updated attributes: %s' % report['number_of_attributes']
report_body = report_body + '\nNew or updated attributes with IDS flag: %s' % report['number_of_attributes_to_ids']
report_body = report_body + '\n'
if 'user_statistics' in report:
report_body = report_body + '\nTotal events: %s' % report['user_statistics']['stats']['event_count']
report_body = report_body + '\nTotal attributes: %s' % report['user_statistics']['stats']['attribute_count']
report_body = report_body + '\nTotal users: %s' % report['user_statistics']['stats']['user_count']
report_body = report_body + '\nTotal orgs: %s' % report['user_statistics']['stats']['org_count']
report_body = report_body + '\nTotal correlation: %s' % report['user_statistics']['stats']['correlation_count']
report_body = report_body + '\nTotal proposals: %s' % report['user_statistics']['stats']['proposal_count']
report_body = report_body + '\n\n'
if args.mispevent:
report_body = report_body + '\nNew or updated events\n-------------------------------------------------------------------------------'
attachments['misp_events'] = 'ID;Title;Date;Updated;Published;ThreatLevel;AnalysisStatus'
for el in report['misp_events']:
report_body = report_body + '\n #%s %s (%s) \t%s \n\t\t\t\t(Date: %s, Updated: %s, Published: %s)' % (el['id'], el['threat_level'], el['analysis_completion'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'])
attachments['misp_events'] = attachments['misp_events'] + '\n%s;%s;%s;%s;%s;%s;%s' % (el['id'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'], el['threat_level'], el['analysis_completion'])
report_body, attachments['attr_category'] = add_report_body(report_body, 'New or updated attributes - Category', report['attr_category'], 'AttributeCategory;Qt')
report_body, attachments['attr_type'] = add_report_body(report_body, 'New or updated attributes - Type', report['attr_type'], 'AttributeType;Qt')
report_body, attachments['tags_tlp'] = add_report_body(report_body, 'TLP Codes', report['tags_tlp'], 'TLP;Qt')
report_body, attachments['tags_misp_galaxy'] = add_report_body(report_body, 'Tag MISP Galaxy', report['tags_misp_galaxy'], 'MISPGalaxy;Qt')
report_body, attachments['tags_misp_galaxy_mitre'] = add_report_body(report_body, 'Tag MISP Galaxy Mitre', report['tags_misp_galaxy_mitre'], 'MISPGalaxyMitre;Qt')
report_body, attachments['tags_misp_galaxy_threat_actor'] = add_report_body(report_body, 'Tag MISP Galaxy Threat Actor', report['tags_misp_galaxy_threat_actor'], 'MISPGalaxyThreatActor;Qt')
report_body, attachments['tags_type'] = add_report_body(report_body, 'Tags', report['tags_type'], 'Tag;Qt')
report_body, attachments['galaxies'] = add_report_body(report_body, 'Galaxies', report['galaxies'], 'Galaxies;Qt')
report_body, attachments['galaxies_cluster'] = add_report_body(report_body, 'Galaxies Cluster', report['galaxies_cluster'], 'Galaxies;Qt')
if sanitize_report:
mitre_tactic = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Tactic')
mitre_group = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Group')
mitre_software = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Software')
threat_actor = get_sanitized_report(report['tags_misp_galaxy_threat_actor'], 'MISP Threat Actor')
misp_tag = get_sanitized_report(report['tags_type'], 'MISP Tags', False, True)
report_body, attachments['mitre_tactics'] = add_report_body(report_body, 'MITRE ATT&CK Tactics (sanitized)', mitre_tactic, 'MITRETactics;Qt')
report_body, attachments['mitre_group'] = add_report_body(report_body, 'MITRE ATT&CK Group (sanitized)', mitre_group, 'MITREGroup;Qt')
report_body, attachments['mitre_software'] = add_report_body(report_body, 'MITRE ATT&CK Software (sanitized)', mitre_software, 'MITRESoftware;Qt')
report_body, attachments['threat_actor'] = add_report_body(report_body, 'MISP Threat Actor (sanitized)', threat_actor, 'MISPThreatActor;Qt')
report_body, attachments['misp_tag'] = add_report_body(report_body, 'Tags (sanitized)', misp_tag, 'MISPTags;Qt')
report_body = report_body + "\n\nMISP Reporter Finished\n"
return report_body, attachments
def add_report_body(report_body, subtitle, data_object, csv_title):
'''
Add a section to the report body text
'''
if report_body:
report_body = report_body + '\n\n'
report_body = report_body + '\n%s\n-------------------------------------------------------------------------------' % subtitle
data_object_s = sorted(data_object.items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
csv_attachment = csv_title
for el in data_object_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
csv_attachment = csv_attachment + '\n%s;%s' % (el[0], el[1])
return report_body, csv_attachment
def msg_attach(content, filename):
'''
Return a message attachment object
'''
part = MIMEBase('application', "octet-stream")
part.set_payload(content)
part.add_header('Content-Disposition', 'attachment; filename="%s"' % filename)
return part
def print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url):
'''
Print (or send) the report
'''
if args.mail:
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
subject = "MISP Report %s for last %s on %s" % (current_date, timeframe, misp_url)
else:
subject = "MISP Report %s from %s to %s on %s" % (current_date, date_from, date_to, misp_url)
msg = MIMEMultipart()
msg['From'] = smtp_from
msg['To'] = smtp_to
msg['Subject'] = subject
msg.attach(MIMEText(report_body, 'plain'))
if args.mispevent:
part = MIMEBase('application', "octet-stream")
part.set_payload(attachments['misp_events'])
part.add_header('Content-Disposition', 'attachment; filename="misp_events.csv"')
msg.attach(part)
msg.attach(msg_attach(attachments['attr_type'], 'attr_type.csv'))
msg.attach(msg_attach(attachments['attr_category'], 'attr_category.csv'))
msg.attach(msg_attach(attachments['tags_tlp'], 'tags_tlp.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_mitre'], 'tags_misp_galaxy_mitre.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy'], 'tags_misp_galaxy.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_threat_actor'], 'tags_misp_galaxy_threat_actor.csv'))
msg.attach(msg_attach(attachments['tags_type'], 'tags_type.csv'))
msg.attach(msg_attach(attachments['galaxies'], 'galaxies.csv'))
msg.attach(msg_attach(attachments['galaxies_cluster'], 'galaxies_cluster.csv'))
msg.attach(msg_attach(attachments['misp_tag'], 'misp_tag.csv'))
msg.attach(msg_attach(attachments['threat_actor'], 'threat_actor.csv'))
msg.attach(msg_attach(attachments['mitre_software'], 'mitre_software.csv'))
msg.attach(msg_attach(attachments['mitre_group'], 'mitre_group.csv'))
msg.attach(msg_attach(attachments['mitre_tactics'], 'mitre_tactics.csv'))
server = smtplib.SMTP(smtp_server)
server.sendmail(smtp_from, smtp_to, msg.as_string())
else:
print(report_body)
def get_sanitized_report(dataset, sanitize_selector='ATT&CK Tactic', lower=False, add_not_sanitized=False):
'''
Remove or bundle some of the tags
Quick 'n' dirty; this could also be done by using the galaxy/tag definitions
'''
# If an element is listed in full, it is replaced by an empty string; this allows filtering out non-relevant items
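# Illustration (hypothetical tag and count), with sanitize_selector='ATT&CK Tactic':
#   {'misp-galaxy:mitre-attack-pattern="Spearphishing Link - T1192"': 3}
# becomes
#   {'Spearphishing Link - T1192': 3}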
sanitize_set = {
'ATT&CK Tactic': ['misp-galaxy:mitre-enterprise-attack-pattern="', 'misp-galaxy:mitre-pre-attack-pattern="', 'misp-galaxy:mitre-mobile-attack-pattern="', 'misp-galaxy:mitre-attack-pattern="', 'misp-galaxy:mitre-enterprise-attack-attack-pattern="', 'misp-galaxy:mitre-pre-attack-attack-pattern="', 'misp-galaxy:mitre-enterprise-attack-attack-pattern="', 'misp-galaxy:mitre-mobile-attack-attack-pattern="'],
'ATT&CK Group': ['misp-galaxy:mitre-enterprise-intrusion-set="', 'misp-galaxy:mitre-pre-intrusion-set="', 'misp-galaxy:mitre-mobile-intrusion-set="', 'misp-galaxy:mitre-intrusion-set="', 'misp-galaxy:mitre-enterprise-attack-intrusion-set="', 'misp-galaxy:mitre-pre-attack-intrusion-set="', 'misp-galaxy:mitre-mobile-attack-intrusion-set="'],
'ATT&CK Software': ['misp-galaxy:mitre-enterprise-malware="', 'misp-galaxy:mitre-pre-malware="', 'misp-galaxy:mitre-mobile-malware="', 'misp-galaxy:mitre-malware="', 'misp-galaxy:mitre-enterprise-attack-tool="', 'misp-galaxy:mitre-enterprise-tool="', 'misp-galaxy:mitre-pre-tool="', 'misp-galaxy:mitre-mobile-tool="', 'misp-galaxy:mitre-tool="', 'misp-galaxy:mitre-enterprise-attack-malware="'],
'MISP Threat Actor': ['misp-galaxy:threat-actor="'],
'MISP Tags': ['circl:incident-classification="', 'osint:source-type="blog-post"', 'misp-galaxy:tool="', 'CERT-XLM:malicious-code="', 'circl:topic="', 'ddos:type="', 'ecsirt:fraud="', 'dnc:malware-type="', 'enisa:nefarious-activity-abuse="', 'europol-incident:information-gathering="', 'misp-galaxy:ransomware="', 'misp-galaxy:rat="', 'misp-galaxy:social-dark-patterns="', 'misp-galaxy:tool="', 'misp:threat-level="', 'ms-caro-malware:malware-platform=', 'ms-caro-malware:malware-type=', 'veris:security_incident="', 'veris:attribute:integrity:variety="', 'veris:actor:motive="', 'misp-galaxy:banker="', 'misp-galaxy:malpedia="', 'misp-galaxy:botnet="', 'malware_classification:malware-category="', 'TLP: white', 'TLP: Green',
'inthreat:event-src="feed-osint"', 'tlp:white', 'tlp:amber', 'tlp:green', 'tlp:red', 'osint:source-type="blog-post"', 'Partner Feed', 'IBM XForce', 'type:OSINT', 'malware:', 'osint:lifetime="perpetual"', 'Actor:', 'osint:certainty="50"', 'Banker:', 'Group:', 'Threat:',
'ncsc-nl-ndn:feed="selected"', 'misp-galaxy:microsoft-activity-group="', 'admiralty-scale:source-reliability="b"', 'admiralty-scale:source-reliability="a"', 'admiralty-scale:information-credibility="2"', 'admiralty-scale:information-credibility="3"',
'feed:source="CESICAT"', 'osint:source-type="automatic-analysis"', 'workflow:state="complete"', 'osint:source-type="technical-report"',
'csirt_case_classification:incident-category="', 'dnc:driveby-type="', 'veris:action:social:variety="', 'osint:source-type="',
'osint:source-type="microblog-post"', 'ecsirt:malicious-code="', 'misp-galaxy:sector="', 'veris:action:variety=', 'label=', 'csirt_case_classification:incident-category="', 'admiralty-scale:source-reliability="c"', 'workflow:todo="review"', 'LDO-CERT:detection="toSIEM"', 'Threat tlp:White', 'Threat Type:', 'adversary:infrastructure-state="active"', 'cirl:incident-classification:', 'misp-galaxy:android="', 'dnc:infrastructure-type="', 'ecsirt:information-gathering="', 'ecsirt:intrusions="', 'dhs-ciip-sectors:DHS-critical-sectors="', 'malware_classification:obfuscation-technique="no-obfuscation"',
'riskiq:threat-type="', 'veris:action:hacking:variety="', 'veris:action:social:target="', 'workflow:state="incomplete"', 'workflow:todo="add-tagging"', 'workflow:todo="add-context"', 'europol-incident:availability="', 'label=', 'misp-galaxy:stealer="', 'misp-galaxy:exploit-kit="', 'rsit:availability="', 'rsit:fraud="', 'ransomware:type="', 'veris:action:variety=', 'malware:',
'ecsirt:abusive-content="']}
if sanitize_selector == 'MISP Tags':
sanitize_set['MISP Tags'] = sanitize_set['MISP Tags'] + sanitize_set['ATT&CK Tactic'] + sanitize_set['ATT&CK Group'] + sanitize_set['ATT&CK Software'] + sanitize_set['MISP Threat Actor']
result_sanitize_set = {}
if dataset:
for element in dataset:
sanited = False
for sanitize_el in sanitize_set[sanitize_selector]:
if sanitize_el in element:
sanited = True
new_el = element.replace(sanitize_el, '').replace('"', '').strip()
if lower:
new_el = new_el.lower()
result_sanitize_set[new_el] = dataset[element]
if add_not_sanitized and not sanited:
new_el = element.strip()
if lower:
new_el = new_el.lower()
result_sanitize_set[new_el] = dataset[element]
return result_sanitize_set
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Generate a report of your MISP statistics.')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument('-t', '--timeframe', action='store', help='Timeframe to include in the report')
group.add_argument('-f', '--date_from', action='store', help='Start date of query (YYYY-MM-DD)')
parser.add_argument('-u', '--date-to', action='store', help='End date of query (YYYY-MM-DD)')
parser.add_argument('-e', '--mispevent', action='store_true', help='Include MISP event titles')
parser.add_argument('-m', '--mail', action='store_true', help='Mail the report')
parser.add_argument('-o', '--mailoptions', action='store', help='mailoptions: \'smtp_from=INSERT_FROM;smtp_to=INSERT_TO;smtp_server=localhost\'')
args = parser.parse_args()
misp = init(misp_url, misp_key, misp_verifycert)
timeframe = args.timeframe
if not timeframe:
date_from = args.date_from
if not args.date_to:
today = date.today()
date_to = today.strftime("%Y-%m-%d")
else:
date_to = args.date_to
else:
date_from = None
date_to = None
ts_format = '%Y-%m-%d %H:%M:%S'
threat_levels = ['High', 'Medium', 'Low', 'Undef']
analysis_completion = ['Initial', 'Ongoing', 'Complete']
smtp_from = 'INSERT_FROM'
smtp_to = 'INSERT_TO'
smtp_server = 'localhost'
if args.mailoptions:
mailoptions = args.mailoptions.split(';')
for s in mailoptions:
if s.split('=')[0] == 'smtp_from':
smtp_from = s.split('=')[1]
if s.split('=')[0] == 'smtp_to':
smtp_to = s.split('=')[1]
if s.split('=')[0] == 'smtp_server':
smtp_server = s.split('=')[1]
report = get_data(misp, timeframe, date_from, date_to)
if report:
report_body, attachments = build_report(report, timeframe, misp_url)
print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url)

View File

@ -1,24 +1,3 @@
# Description
Get all attributes from a MISP (https://github.com/MISP) instance that can be converted into Suricata rules, given a *parameter* and a *term* to search
This script was outdated and didn't work on the current version of PyMISP.
**requires**
* PyMISP (https://github.com/CIRCL/PyMISP/)
* python 2.7 or python3 (suggested)
# Usage
* **suricata_search.py -p tags -s 'APT' -o misp_ids.rules -t 5**
- search for 'APT' tag
- use 5 threads while generating IDS rules
- dump results to misp_ids.rules
* **suricata_search.py -p tags -s 'APT' -o misp_ids.rules -ne 411 357 343**
- same as above, but skip events with IDs 411, 357 and 343
* **suricata_search.py -p tags -s 'circl:incident-classification="malware", tlp:green' -o misp_ids.rules**
- search for multiple tags 'circl:incident-classification="malware", tlp:green'
* **suricata_search.py -p categories -s 'Artifacts dropped' -t 20 -o artifacts_dropped.rules**
- search for category 'Artifacts dropped'
- use 20 threads while generating IDS rules
- dump results to artifacts_dropped.rules
For reference, you can look at this repository: https://github.com/raw-data/pymisp-suricata_search
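With the current PyMISP API, a roughly equivalent export can be done through `search()` with `return_format='suricata'`. A minimal, untested sketch (assuming a `keys.py` providing `misp_url`, `misp_key` and `misp_verifycert`, as the other examples do):

```python
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert

misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)
# Ask MISP to render the attributes matching the tag directly as Suricata rules.
rules = misp.search(controller='attributes', tags='APT', return_format='suricata')
with open('misp_ids.rules', 'w') as f:
    f.write(rules)
```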

View File

@ -1,216 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
https://github.com/raw-data/pymisp-suricata_search
2017.06.28 start
2017.07.03 fixed args.quiet and status msgs
"""
import argparse
import os
import queue
import sys
from threading import Thread, enumerate
from keys import misp_url, misp_key, misp_verifycert
try:
from pymisp import PyMISP
except ImportError as err:
sys.stderr.write("ERROR: {}\n".format(err))
sys.stderr.write("\t[try] with pip install pymisp\n")
sys.stderr.write("\t[try] with pip3 install pymisp\n")
sys.exit(1)
HEADER = """
# This part might still contain bugs, use at your own risk and report any issues.
#
# MISP export of IDS rules - optimized for suricata
#
# These NIDS rules contain some variables that need to exist in your configuration.
# Make sure you have set:
#
# $HOME_NET - Your internal network range
# $EXTERNAL_NET - The network considered as outside
# $SMTP_SERVERS - All your internal SMTP servers
# $HTTP_PORTS - The ports used to contain HTTP traffic (not required with suricata export)
#
"""
# queue for events matching searched term/s
IDS_EVENTS = queue.Queue()
# queue for downloaded Suricata rules
DOWNLOADED_RULES = queue.Queue()
# Default number of threads to use
THREAD = 4
try:
input = raw_input
except NameError:
pass
def init():
""" init connection to MISP """
return PyMISP(misp_url, misp_key, misp_verifycert, 'json')
def search(misp, quiet, noevent, **kwargs):
""" Start search in MISP """
result = misp.search(**kwargs)
# fetch all events matching **kwargs
track_events = 0
skip_events = list()
for event in result['response']:
event_id = event["Event"].get("id")
track_events += 1
to_ids = False
for attribute in event["Event"]["Attribute"]:
to_ids_event = attribute["to_ids"]
if to_ids_event:
to_ids = True
break
# if there is at least one eligible event to_ids, add event_id
if to_ids:
# check if the event_id is not blacklisted by the user
if isinstance(noevent, list):
if event_id not in noevent[0]:
to_ids_event = (event_id, misp)
IDS_EVENTS.put(to_ids_event)
else:
skip_events.append(event_id)
else:
to_ids_event = (event_id, misp)
IDS_EVENTS.put(to_ids_event)
if not quiet:
print ("\t[i] matching events: {}".format(track_events))
if len(skip_events) > 0:
print ("\t[i] skipped {0} events -> {1}".format(len(skip_events),skip_events))
print ("\t[i] events selected for IDS export: {}".format(IDS_EVENTS.qsize()))
def collect_rules(thread):
""" Dispatch tasks to Suricata_processor worker """
for x in range(int(thread)):
th = Thread(target=suricata_processor, args=(IDS_EVENTS, ))
th.start()
for x in enumerate():
if x.name == "MainThread":
continue
x.join()
def suricata_processor(ids_events):
""" Trigger misp.download_suricata_rule_event """
while not ids_events.empty():
event_id, misp = ids_events.get()
ids_rules = misp.download_suricata_rule_event(event_id).text
for r in ids_rules.split("\n"):
# skip header
if not r.startswith("#"):
if len(r) > 0: DOWNLOADED_RULES.put(r)
def return_rules(output, quiet):
""" Return downloaded rules to user """
rules = set()
while not DOWNLOADED_RULES.empty():
rules.add(DOWNLOADED_RULES.get())
if output is None:
if not quiet:
print ("[+] Displaying rules")
print (HEADER)
for r in rules: print (r)
print ("#")
else:
if not quiet:
print ("[+] Writing rules to {}".format(output))
print ("[+] Generated {} rules".format(len(rules)))
with open(output, 'w') as f:
f.write(HEADER)
f.write("\n".join(r for r in rules))
f.write("\n"+"#")
def format_request(param, term, misp, quiet, output, thread, noevent):
""" Format request and start search """
kwargs = {param: term}
if not quiet:
print ("[+] Searching for: {}".format(kwargs))
search(misp, quiet, noevent, **kwargs)
# collect Suricata rules
collect_rules(thread)
if __name__ == "__main__":
parser = argparse.ArgumentParser(
formatter_class=argparse.RawTextHelpFormatter,
description='Get all attributes that can be converted into Suricata rules, given a parameter and a term to '
'search.',
epilog='''
EXAMPLES:
suricata_search.py -p tags -s 'APT' -o misp_ids.rules -t 5
suricata_search.py -p tags -s 'APT' -o misp_ids.rules -ne 411 357 343
suricata_search.py -p tags -s 'tlp:green, OSINT' -o misp_ids.rules
suricata_search.py -p tags -s 'circl:incident-classification="malware", tlp:green' -o misp_ids.rules
suricata_search.py -p categories -s 'Artifacts dropped' -t 20 -o artifacts_dropped.rules
''')
parser.add_argument("-p", "--param", required=True, help="Parameter to search (e.g. categories, tags, org, etc.).")
parser.add_argument("-s", "--search", required=True, help="Term/s to search.")
parser.add_argument("-q", "--quiet", action='store_true', help="No status messages")
parser.add_argument("-t", "--thread", required=False, help="Number of threads to use", default=THREAD)
parser.add_argument("-ne", "--noevent", nargs='*', required=False, dest='noevent', action='append',
help="Event/s ID to exclude during the search")
parser.add_argument("-o", "--output", help="Output file",required=False)
args = parser.parse_args()
if args.output is not None and os.path.exists(args.output) and not args.quiet:
try:
check = input("[!] Output file {} exists, do you want to continue [Y/n]? ".format(args.output))
if check not in ["Y","y"]:
exit(0)
except KeyboardInterrupt:
sys.exit(0)
if not args.quiet:
print ("[i] Connecting to MISP instance: {}".format(misp_url))
print ("[i] Note: duplicated IDS rules will be removed")
# Based on # of terms, format request
if "," in args.search:
for term in args.search.split(","):
term = term.strip()
misp = init()
format_request(args.param, term, misp, args.quiet, args.output, args.thread, args.noevent)
else:
misp = init()
format_request(args.param, args.search, misp, args.quiet, args.output, args.thread, args.noevent)
# return collected rules
return_rules(args.output, args.quiet)

View File

@ -18,4 +18,4 @@ if __name__ == '__main__':
me = MISPEvent()
me.load_file(args.input)
result = misp.update_event(args.event, me)
result = misp.update_event(me, args.event)

View File

@ -0,0 +1,202 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
VMRay automatic import
Put this script in crontab to run every /15 or /60
*/5 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/vmray_automation.py
Calls "vmray_import" for all events that have an 'incomplete' VMray analysis
Do inline config in "main"
'''
from pymisp import ExpandedPyMISP, MISPAttribute
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
import json
import datetime
import time
import requests
import sys
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def get_vmray_config(url, key, misp_verifycert, default_wait_period):
try:
misp_headers = {'Content-Type': 'application/json', 'Accept': 'application/json', 'Authorization': key}
req = requests.get(url + 'servers/serverSettings.json', verify=misp_verifycert, headers=misp_headers)
if req.status_code == 200:
req_json = req.json()
if 'finalSettings' in req_json:
finalSettings = req_json['finalSettings']
vmray_api = ''
vmray_url = ''
vmray_wait_period = 0
for el in finalSettings:
# Is the vmray import module enabled?
if el['setting'] == 'Plugin.Import_vmray_import_enabled':
vmray_import_enabled = el['value']
if vmray_import_enabled is False:
break
# Get the VMRay API key from the MISP settings
elif el['setting'] == 'Plugin.Import_vmray_import_apikey':
vmray_api = el['value']
# The VMRay URL to query
elif el['setting'] == 'Plugin.Import_vmray_import_url':
vmray_url = el['value'].replace('/', '\\/')
# MISP modules - Port?
elif el['setting'] == 'Plugin.Import_services_port':
module_import_port = el['value']
if module_import_port:
module_import_port = str(module_import_port)
else:
module_import_port = "6666"
# MISP modules - URL
elif el['setting'] == 'Plugin.Import_services_url':
module_import_url = el['value'].replace('\/\/', '//')
# Wait period
elif el['setting'] == 'Plugin.Import_vmray_import_wait_period':
vmray_wait_period = abs(int(el['value']))
if vmray_wait_period < 1:
vmray_wait_period = default_wait_period
else:
sys.exit('Did not receive a 200 code from MISP')
if vmray_import_enabled and vmray_api and vmray_url and module_import_port and module_import_url:
return {'vmray_wait_period': vmray_wait_period, 'vmray_api': vmray_api, 'vmray_url': vmray_url, 'module_import_port': module_import_port, 'module_import_url': module_import_url}
sys.exit('Did not receive all the necessary configuration information from MISP')
except Exception as e:
sys.exit('Unable to get VMRay config from MISP')
def search_vmray_incomplete(m, url, wait_period, module_import_url, module_import_port, vmray_url, vmray_api, vmray_attribute_category, vmray_include_analysisid, vmray_include_imphash_ssdeep, vmray_include_extracted_files, vmray_include_analysisdetails, vmray_include_vtidetails, custom_tags_incomplete, custom_tags_complete):
controller = 'attributes'
vmray_value = 'VMRay Sample ID:' # How sample IDs are stored in MISP
req = None
# Search for the events
try:
result = m.search(controller, tags=custom_tags_incomplete)
attribute = result['Attribute']
if len(attribute) == 0:
sys.exit("No VMRay attributes found that match %s" % custom_tags_incomplete)
timestamp = int(attribute[0]["timestamp"])
# Not enough time has gone by to look up the analysis jobs
if int((time.time() - timestamp) / 60) < int(wait_period):
if module_DEBUG:
r_timestamp = datetime.datetime.fromtimestamp(timestamp).strftime('%Y-%m-%d %H:%M:%S')
print("Attribute to recent for wait_period (%s minutes) - timestamp attribute: %s (%s minutes old)" % (wait_period, r_timestamp, round((int(time.time() - timestamp) / 60), 2)))
return False
if module_DEBUG:
print("All attributes older than %s" % int(wait_period))
for att in attribute:
value = att['value']
if vmray_value in value: # We found a sample ID
att_id = att['id']
att_uuid = att['uuid']
# VMRay Sample IDs are stored as VMRay Sample ID: 2796577
vmray_sample_id = value.split(vmray_value)[1].strip()
if vmray_sample_id.isdigit():
event_id = att['event_id']
if module_DEBUG:
print("Found event %s with matching tags %s for sample id %s " % (event_id, custom_tags_incomplete, vmray_sample_id))
# Prepare request to send to vmray_import via misp modules
misp_modules_url = module_import_url + ':' + module_import_port + '/query'
misp_modules_headers = {'Content-Type': 'application/json'}
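# For clarity, the JSON body built below ends up with this shape (values are placeholders):
# {"sample_id": "<sample id>", "module": "vmray_import", "event_id": "<event id>",
#  "config": {"apikey": "...", "url": "...", "include_analysisid": "0", ...}, "data": ""}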
misp_modules_body = '{ "sample_id":"' + vmray_sample_id + '","module":"vmray_import","event_id":"' + event_id + '","config":{"apikey":"' + vmray_api + '","url":"' + vmray_url + '","include_analysisid":"' + vmray_include_analysisid + '","include_analysisdetails":"' + vmray_include_analysisdetails + '","include_extracted_files":"' + vmray_include_extracted_files + '","include_imphash_ssdeep":"' + vmray_include_imphash_ssdeep + '","include_vtidetails":"' + vmray_include_vtidetails + '","sample_id":"' + vmray_sample_id + '"},"data":""}'
req = requests.post(misp_modules_url, data=misp_modules_body, headers=misp_modules_headers)
if module_DEBUG and req is not None:
print("Response code from submitting to MISP modules %s" % (req.status_code))
# Successful response from the misp modules?
if req.status_code == 200:
req_json = req.json()
if "error" in req_json:
print("Error code in reply %s " % req_json["error"])
continue
else:
results = req_json["results"]
# Walk through all results in the misp-module reply
for el in results:
to_ids = True
values = el['values']
types = el['types']
if "to_ids" in el:
to_ids = el['to_ids']
if "text" in types:
to_ids = False
comment = el['comment']
if len(comment) < 1:
comment = "Enriched via the vmray_import module"
# The attribute value can belong to several types
for attr_type in types:
try:
new_attribute = MISPAttribute()
new_attribute.type = attr_type
new_attribute.category = vmray_attribute_category
new_attribute.value = values
new_attribute.to_ids = to_ids
new_attribute.comment = comment
r = m.add_attribute(event_id, new_attribute)
if module_DEBUG:
print("Add event %s: %s as %s (%s) (toids: %s)" % (event_id, values, attr_type, comment, to_ids))
except Exception as e:
if module_DEBUG:
print("Unable to add attribute %s as type %s for event %s" % (values, attr_type, event_id))
continue
# Remove 'incomplete' state tags
m.untag(att_uuid, custom_tags_incomplete)
# Update tags to 'complete' state
m.tag(att_uuid, custom_tags_complete)
if module_DEBUG:
print("Updated event %s" % event_id)
else:
sys.exit('MISP modules did not return HTTP 200 code (event %s ; sampleid %s)' % (event_id, vmray_sample_id))
except Exception as e:
sys.exit("Invalid response received from MISP : %s", e)
if __name__ == '__main__':
module_DEBUG = True
# Set some defaults to be used in this module
vmray_attribute_category = 'External analysis'
vmray_include_analysisid = '0'
vmray_include_imphash_ssdeep = '0'
vmray_include_extracted_files = '0'
vmray_include_analysisdetails = '0'
vmray_include_vtidetails = '0'
custom_tags_incomplete = 'workflow:state="incomplete"'
custom_tags_complete = 'workflow:state="complete"'
default_wait_period = 30
misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert, debug=module_DEBUG)
vmray_config = get_vmray_config(misp_url, misp_key, misp_verifycert, default_wait_period)
search_vmray_incomplete(misp, misp_url, vmray_config['vmray_wait_period'], vmray_config['module_import_url'], vmray_config['module_import_port'], vmray_config['vmray_url'], vmray_config['vmray_api'], vmray_attribute_category, vmray_include_analysisid, vmray_include_imphash_ssdeep, vmray_include_extracted_files, vmray_include_analysisdetails, vmray_include_vtidetails, custom_tags_incomplete, custom_tags_complete)

1885
poetry.lock generated Normal file

File diff suppressed because it is too large Load Diff

View File

@ -1,4 +1,4 @@
__version__ = '2.4.121.1'
__version__ = '2.4.131'
import logging
FORMAT = "%(levelname)s [%(filename)s:%(lineno)s - %(funcName)s() ] %(message)s"
@ -24,14 +24,15 @@ Response (if any):
try:
from .exceptions import PyMISPError, NewEventError, NewAttributeError, MissingDependency, NoURL, NoKey, InvalidMISPObject, UnknownMISPObjectTemplate, PyMISPInvalidFormat, MISPServerError, PyMISPNotImplementedYet, PyMISPUnexpectedResponse, PyMISPEmptyResponse # noqa
from .abstract import AbstractMISP, MISPEncode, pymisp_json_default, MISPTag, Distribution, ThreatLevel, Analysis # noqa
from .mispevent import MISPEvent, MISPAttribute, MISPObjectReference, MISPObjectAttribute, MISPObject, MISPUser, MISPOrganisation, MISPSighting, MISPLog, MISPShadowAttribute, MISPWarninglist, MISPTaxonomy, MISPNoticelist, MISPObjectTemplate, MISPSharingGroup, MISPRole, MISPServer, MISPFeed, MISPEventDelegation, MISPUserSetting # noqa
from .mispevent import MISPEvent, MISPAttribute, MISPObjectReference, MISPObjectAttribute, MISPObject, MISPUser, MISPOrganisation, MISPSighting, MISPLog, MISPShadowAttribute, MISPWarninglist, MISPTaxonomy, MISPNoticelist, MISPObjectTemplate, MISPSharingGroup, MISPRole, MISPServer, MISPFeed, MISPEventDelegation, MISPUserSetting, MISPInbox, MISPEventBlocklist, MISPOrganisationBlocklist # noqa
from .tools import AbstractMISPObjectGenerator # noqa
from .tools import Neo4j # noqa
from .tools import stix # noqa
from .tools import openioc # noqa
from .tools import ext_lookups # noqa
from .tools import update_objects # noqa
from .api import PyMISP # noqa
from .api import PyMISP, register_user # noqa
from .api import PyMISP as ExpandedPyMISP # noqa
from .tools import load_warninglists # noqa
# Let's not bother with old python

View File

@ -21,7 +21,7 @@ except ImportError:
import logging
from enum import Enum
from typing import Union, Optional
from typing import Union, Optional, Any, Dict, List, Set, Mapping
from .exceptions import PyMISPInvalidFormat, PyMISPError
@ -46,8 +46,11 @@ class MISPFileCache(object):
def _load_json(path: Path) -> Union[dict, None]:
if not path.exists():
return None
with path.open('r') as f:
data = load(f)
with path.open('r', encoding='utf-8') as f:
if HAS_RAPIDJSON:
data = load(f)
else:
data = load(f, encoding='utf-8')
return data
@ -73,7 +76,7 @@ class Analysis(Enum):
completed = 2
def _int_to_str(d: dict) -> dict:
def _int_to_str(d: Dict[str, Any]) -> Dict[str, Any]:
# transform all integer back to string
for k, v in d.items():
if isinstance(v, dict):
@ -111,9 +114,9 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
"""
super().__init__()
self.__edited: bool = True # As we create a new object, we assume it is edited
self.__not_jsonable: list = []
self._fields_for_feed: set
self.__self_defined_describe_types: Union[dict, None] = None
self.__not_jsonable: List[str] = []
self._fields_for_feed: Set
self.__self_defined_describe_types: Optional[Dict] = None
self.uuid: str
if kwargs.get('force_timestamps') is not None:
@ -123,13 +126,13 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
self.__force_timestamps: bool = False
@property
def describe_types(self) -> dict:
def describe_types(self) -> Dict:
if self.__self_defined_describe_types:
return self.__self_defined_describe_types
return self.__describe_types
@describe_types.setter
def describe_types(self, describe_types: dict):
def describe_types(self, describe_types: Dict):
self.__self_defined_describe_types = describe_types
@property
@ -163,15 +166,23 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
"""Add entries to the __not_jsonable list"""
self.__not_jsonable += args
def set_not_jsonable(self, args: list) -> None:
def set_not_jsonable(self, args: List[str]) -> None:
"""Set __not_jsonable to a new list"""
self.__not_jsonable = args
def _remove_from_not_jsonable(self, *args) -> None:
"""Remove the entries that are in the __not_jsonable list"""
for entry in args:
try:
self.__not_jsonable.remove(entry)
except ValueError:
pass
def from_json(self, json_string: str) -> None:
"""Load a JSON string"""
self.from_dict(**loads(json_string))
def to_dict(self) -> dict:
def to_dict(self) -> Dict:
"""Dump the class to a dictionary.
This method automatically removes the timestamp recursively in every object
that has been edited in order to let MISP update the event accordingly."""
@ -182,6 +193,8 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
continue
elif isinstance(val, list) and len(val) == 0:
continue
elif isinstance(val, str):
val = val.strip()
if attribute == 'timestamp':
if not self.__force_timestamps and is_edited:
# In order to be accepted by MISP, the timestamp of an object
@ -201,11 +214,11 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
to_return = _int_to_str(to_return)
return to_return
def jsonable(self) -> dict:
def jsonable(self) -> Dict:
"""This method is used by the JSON encoder"""
return self.to_dict()
def _to_feed(self) -> dict:
def _to_feed(self) -> Dict:
if not hasattr(self, '_fields_for_feed') or not self._fields_for_feed:
raise PyMISPError('Unable to export in the feed format, _fields_for_feed is missing.')
if hasattr(self, '_set_default') and callable(self._set_default): # type: ignore
@ -220,14 +233,14 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
else:
to_return[field] = getattr(self, field)
else:
if field == 'data':
# data in attribute is special
if field in ['data', 'first_seen', 'last_seen', 'deleted']:
# special fields
continue
raise PyMISPError('The field {} is required in {} when generating a feed.'.format(field, self.__class__.__name__))
to_return = _int_to_str(to_return)
return to_return
def to_json(self, sort_keys: bool=False, indent: Optional[int]=None):
def to_json(self, sort_keys: bool = False, indent: Optional[int] = None):
"""Dump recursively any class of type MISPAbstract to a json string"""
return dumps(self, default=pymisp_json_default, sort_keys=sort_keys, indent=indent)
@ -247,9 +260,17 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
delattr(self, key)
def __iter__(self):
return iter({k: v for k, v in self.__dict__.items() if not (k[0] == '_' or k in self.__not_jsonable)})
'''When we call **self, skip keys:
* starting with _
* in __not_jsonable
* timestamp if the object is edited *unless* it is forced
'''
return iter({k: v for k, v in self.__dict__.items()
if not (k[0] == '_'
or k in self.__not_jsonable
or (not self.__force_timestamps and (k == 'timestamp' and self.__edited)))})
def __len__(self):
def __len__(self) -> int:
return len([k for k in self.__dict__.keys() if not (k[0] == '_' or k in self.__not_jsonable)])
@property
@ -269,15 +290,15 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
return self.__edited
@edited.setter
def edited(self, val):
def edited(self, val: bool):
"""Set the edit flag"""
if isinstance(val, bool):
self.__edited = val
else:
raise PyMISPError('edited can only be True or False')
def __setattr__(self, name, value):
if name[0] != '_' and not self.__edited and name in self.keys():
def __setattr__(self, name: str, value: Any):
if name[0] != '_' and not self.__edited and name in self:
# The private members don't matter
# If we already have a key with that name, we're modifying it.
self.__edited = True
@ -290,7 +311,7 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
return int(d)
return int(d.timestamp())
def _add_tag(self, tag=None, **kwargs):
def _add_tag(self, tag: Optional[Union[str, 'MISPTag', Mapping]] = None, **kwargs):
"""Add a tag to the attribute (by name or a MISPTag object)"""
if isinstance(tag, str):
misp_tag = MISPTag()
@ -305,19 +326,19 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
misp_tag.from_dict(**kwargs)
else:
raise PyMISPInvalidFormat(f"The tag is in an invalid format (can be either string, MISPTag, or an expanded dict): {tag}")
if misp_tag not in self.tags:
if misp_tag not in self.tags: # type: ignore
self.Tag.append(misp_tag)
self.edited = True
return misp_tag
def _set_tags(self, tags):
def _set_tags(self, tags: List['MISPTag']):
"""Set a list of prepared MISPTag."""
if all(isinstance(x, MISPTag) for x in tags):
self.Tag = tags
else:
raise PyMISPInvalidFormat('All the attributes have to be of type MISPTag.')
def __eq__(self, other):
def __eq__(self, other) -> bool:
if isinstance(other, AbstractMISP):
return self.to_dict() == other.to_dict()
elif isinstance(other, dict):
@ -325,10 +346,8 @@ class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
else:
return False
def __repr__(self):
if hasattr(self, 'name'):
return '<{self.__class__.__name__}(name={self.name})'.format(self=self)
return '<{self.__class__.__name__}(NotInitialized)'.format(self=self)
def __repr__(self) -> str:
return '<{self.__class__.__name__} - please define me>'.format(self=self)
class MISPTag(AbstractMISP):
@ -338,6 +357,7 @@ class MISPTag(AbstractMISP):
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.name: str
self.exportable: bool
def from_dict(self, **kwargs):
if kwargs.get('Tag'):
@ -348,18 +368,23 @@ class MISPTag(AbstractMISP):
if not hasattr(self, 'colour'):
self.colour = '#ffffff'
def _to_feed(self):
def _to_feed(self) -> Dict:
if hasattr(self, 'exportable') and not self.exportable:
return False
return {}
return super()._to_feed()
def delete(self):
self.deleted = True
self.edited = True
def __repr__(self) -> str:
if hasattr(self, 'name'):
return '<{self.__class__.__name__}(name={self.name})>'.format(self=self)
return '<{self.__class__.__name__}(NotInitialized)>'.format(self=self)
if HAS_RAPIDJSON:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[dict, str]:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[Dict, str]:
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime, date)):
@ -369,7 +394,7 @@ if HAS_RAPIDJSON:
elif isinstance(obj, UUID):
return str(obj)
else:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[dict, str]:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[Dict, str]:
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime, date)):

File diff suppressed because it is too large Load Diff

View File

@ -36,6 +36,7 @@
"comment",
"cookie",
"filename",
"filename-pattern",
"filename|authentihash",
"filename|impfuzzy",
"filename|imphash",
@ -44,12 +45,17 @@
"filename|sha1",
"filename|sha224",
"filename|sha256",
"filename|sha3-224",
"filename|sha3-256",
"filename|sha3-384",
"filename|sha3-512",
"filename|sha384",
"filename|sha512",
"filename|sha512/224",
"filename|sha512/256",
"filename|ssdeep",
"filename|tlsh",
"filename|vhash",
"gene",
"hex",
"impfuzzy",
@ -64,11 +70,17 @@
"pattern-in-file",
"pattern-in-memory",
"pdb",
"pgp-private-key",
"pgp-public-key",
"regkey",
"regkey|value",
"sha1",
"sha224",
"sha256",
"sha3-224",
"sha3-256",
"sha3-384",
"sha3-512",
"sha384",
"sha512",
"sha512/224",
@ -76,7 +88,9 @@
"sigma",
"ssdeep",
"stix2-pattern",
"telfhash",
"text",
"vhash",
"windows-scheduled-task",
"windows-service-displayname",
"windows-service-name",
@ -91,6 +105,7 @@
"campaign-name",
"comment",
"dns-soa-email",
"email",
"other",
"text",
"threat-actor",
@ -115,9 +130,14 @@
"domain",
"domain|ip",
"filename",
"filename-pattern",
"filename|md5",
"filename|sha1",
"filename|sha256",
"filename|sha3-224",
"filename|sha3-256",
"filename|sha3-384",
"filename|sha3-512",
"github-repository",
"hassh-md5",
"hasshserver-md5",
@ -140,6 +160,10 @@
"regkey|value",
"sha1",
"sha256",
"sha3-224",
"sha3-256",
"sha3-384",
"sha3-512",
"snort",
"text",
"url",
@ -172,6 +196,7 @@
"Internal reference": [
"anonymised",
"comment",
"git-commit-id",
"hex",
"link",
"other",
@ -187,10 +212,12 @@
"cookie",
"domain",
"domain|ip",
"email",
"email-dst",
"email-src",
"email-subject",
"eppn",
"filename-pattern",
"hassh-md5",
"hasshserver-md5",
"hex",
@ -229,6 +256,8 @@
"float",
"hex",
"other",
"pgp-private-key",
"pgp-public-key",
"phone-number",
"port",
"size-in-bytes",
@ -243,6 +272,7 @@
"chrome-extension-id",
"comment",
"domain",
"email",
"email-attachment",
"email-body",
"email-dst",
@ -257,6 +287,7 @@
"email-thread-index",
"email-x-mailer",
"filename",
"filename-pattern",
"filename|authentihash",
"filename|impfuzzy",
"filename|imphash",
@ -265,12 +296,17 @@
"filename|sha1",
"filename|sha224",
"filename|sha256",
"filename|sha3-224",
"filename|sha3-256",
"filename|sha3-384",
"filename|sha3-512",
"filename|sha384",
"filename|sha512",
"filename|sha512/224",
"filename|sha512/256",
"filename|ssdeep",
"filename|tlsh",
"filename|vhash",
"hassh-md5",
"hasshserver-md5",
"hex",
@ -298,6 +334,10 @@
"sha1",
"sha224",
"sha256",
"sha3-224",
"sha3-256",
"sha3-384",
"sha3-512",
"sha384",
"sha512",
"sha512/224",
@ -305,10 +345,12 @@
"sigma",
"ssdeep",
"stix2-pattern",
"telfhash",
"text",
"tlsh",
"url",
"user-agent",
"vhash",
"vulnerability",
"weakness",
"whois-registrant-email",
@ -325,6 +367,7 @@
"chrome-extension-id",
"comment",
"filename",
"filename-pattern",
"filename|authentihash",
"filename|impfuzzy",
"filename|imphash",
@ -333,12 +376,17 @@
"filename|sha1",
"filename|sha224",
"filename|sha256",
"filename|sha3-224",
"filename|sha3-256",
"filename|sha3-384",
"filename|sha3-512",
"filename|sha384",
"filename|sha512",
"filename|sha512/224",
"filename|sha512/256",
"filename|ssdeep",
"filename|tlsh",
"filename|vhash",
"hex",
"impfuzzy",
"imphash",
@ -355,6 +403,10 @@
"sha1",
"sha224",
"sha256",
"sha3-224",
"sha3-256",
"sha3-384",
"sha3-512",
"sha384",
"sha512",
"sha512/224",
@ -362,8 +414,10 @@
"sigma",
"ssdeep",
"stix2-pattern",
"telfhash",
"text",
"tlsh",
"vhash",
"vulnerability",
"weakness",
"x509-fingerprint-md5",
@ -392,6 +446,7 @@
"comment",
"country-of-residence",
"date-of-birth",
"email",
"first-name",
"frequent-flyer-number",
"gender",
@ -406,6 +461,8 @@
"passport-expiration",
"passport-number",
"payment-details",
"pgp-private-key",
"pgp-public-key",
"phone-number",
"place-of-birth",
"place-port-of-clearance",
@ -421,6 +478,7 @@
"Social network": [
"anonymised",
"comment",
"email",
"email-dst",
"email-src",
"eppn",
@ -429,6 +487,8 @@
"github-username",
"jabber-id",
"other",
"pgp-private-key",
"pgp-public-key",
"text",
"twitter-id",
"whois-registrant-email"
@ -570,6 +630,10 @@
"default_category": "Network activity",
"to_ids": 1
},
"email": {
"default_category": "Social network",
"to_ids": 1
},
"email-attachment": {
"default_category": "Payload delivery",
"to_ids": 1
@ -662,6 +726,22 @@
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|sha3-224": {
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|sha3-256": {
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|sha3-384": {
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|sha3-512": {
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|sha384": {
"default_category": "Payload delivery",
"to_ids": 1
@ -686,6 +766,10 @@
"default_category": "Payload delivery",
"to_ids": 1
},
"filename|vhash": {
"default_category": "Payload delivery",
"to_ids": 1
},
"first-name": {
"default_category": "Person",
"to_ids": 0
@ -706,6 +790,10 @@
"default_category": "Artifacts dropped",
"to_ids": 0
},
"git-commit-id": {
"default_category": "Internal reference",
"to_ids": 0
},
"github-organisation": {
"default_category": "Social network",
"to_ids": 0
@ -862,6 +950,10 @@
"default_category": "Person",
"to_ids": 0
},
"pattern-filename": {
"default_category": "Payload installation",
"to_ids": 1
},
"pattern-in-file": {
"default_category": "Payload installation",
"to_ids": 1
@ -886,6 +978,14 @@
"default_category": "Payload delivery",
"to_ids": 1
},
"pgp-private-key": {
"default_category": "Person",
"to_ids": 0
},
"pgp-public-key": {
"default_category": "Person",
"to_ids": 0
},
"phone-number": {
"default_category": "Person",
"to_ids": 0
@ -942,6 +1042,22 @@
"default_category": "Payload delivery",
"to_ids": 1
},
"sha3-224": {
"default_category": "Payload delivery",
"to_ids": 1
},
"sha3-256": {
"default_category": "Payload delivery",
"to_ids": 1
},
"sha3-384": {
"default_category": "Payload delivery",
"to_ids": 1
},
"sha3-512": {
"default_category": "Payload delivery",
"to_ids": 1
},
"sha384": {
"default_category": "Payload delivery",
"to_ids": 1
@ -1006,6 +1122,10 @@
"default_category": "Targeting data",
"to_ids": 0
},
"telfhash": {
"default_category": "Payload delivery",
"to_ids": 1
},
"text": {
"default_category": "Other",
"to_ids": 0
@ -1038,6 +1158,10 @@
"default_category": "Network activity",
"to_ids": 0
},
"vhash": {
"default_category": "Payload delivery",
"to_ids": 1
},
"visa-number": {
"default_category": "Person",
"to_ids": 0
@ -1141,6 +1265,7 @@
"dns-soa-email",
"domain",
"domain|ip",
"email",
"email-attachment",
"email-body",
"email-dst",
@ -1164,17 +1289,23 @@
"filename|sha1",
"filename|sha224",
"filename|sha256",
"filename|sha3-224",
"filename|sha3-256",
"filename|sha3-384",
"filename|sha3-512",
"filename|sha384",
"filename|sha512",
"filename|sha512/224",
"filename|sha512/256",
"filename|ssdeep",
"filename|tlsh",
"filename|vhash",
"first-name",
"float",
"frequent-flyer-number",
"gender",
"gene",
"git-commit-id",
"github-organisation",
"github-repository",
"github-username",
@ -1214,12 +1345,15 @@
"passport-country",
"passport-expiration",
"passport-number",
"pattern-filename",
"pattern-in-file",
"pattern-in-memory",
"pattern-in-traffic",
"payment-details",
"pdb",
"pehash",
"pgp-private-key",
"pgp-public-key",
"phone-number",
"place-of-birth",
"place-port-of-clearance",
@ -1234,6 +1368,10 @@
"sha1",
"sha224",
"sha256",
"sha3-224",
"sha3-256",
"sha3-384",
"sha3-512",
"sha384",
"sha512",
"sha512/224",
@ -1250,6 +1388,7 @@
"target-machine",
"target-org",
"target-user",
"telfhash",
"text",
"threat-actor",
"tlsh",
@ -1258,6 +1397,7 @@
"uri",
"url",
"user-agent",
"vhash",
"visa-number",
"vulnerability",
"weakness",

@ -1 +1 @@
Subproject commit 3ba77c9d2cfea5c27bc8935812d83be54c4f0fd4
Subproject commit 5c935172ea9d1eeaeb7a42ad291eb10f57bc268f

View File

@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from datetime import timezone, datetime, date
import copy
import json
import os
import base64
@ -12,7 +13,7 @@ from collections import defaultdict
import logging
import hashlib
from pathlib import Path
from typing import List, Optional, Union, IO
from typing import List, Optional, Union, IO, Dict, Any
from .abstract import AbstractMISP, MISPTag
from .exceptions import UnknownMISPObjectTemplate, InvalidMISPObject, PyMISPError, NewEventError, NewAttributeError
@ -82,7 +83,7 @@ def _make_datetime(value) -> datetime:
return value
def make_bool(value: Union[bool, int, str, dict, list, None]) -> bool:
def make_bool(value: Optional[Union[bool, int, str, dict, list]]) -> bool:
if isinstance(value, bool):
return value
if isinstance(value, int):
@ -102,6 +103,10 @@ class MISPOrganisation(AbstractMISP):
_fields_for_feed: set = {'name', 'uuid'}
def __init__(self):
super().__init__()
self.id: int
def from_dict(self, **kwargs):
if 'Organisation' in kwargs:
kwargs = kwargs['Organisation']
@ -167,21 +172,22 @@ class MISPSighting(AbstractMISP):
class MISPAttribute(AbstractMISP):
_fields_for_feed: set = {'uuid', 'value', 'category', 'type', 'comment', 'data',
'timestamp', 'to_ids', 'disable_correlation'}
'deleted', 'timestamp', 'to_ids', 'disable_correlation',
'first_seen', 'last_seen'}
def __init__(self, describe_types: Optional[dict]=None, strict: bool=False):
def __init__(self, describe_types: Optional[Dict] = None, strict: bool = False):
"""Represents an Attribute
:describe_types: Use it if you want to overwrite the default describeTypes.json file (you don't)
:strict: If false, fall back to sane defaults for the attribute type if the ones passed by the user are incorrect
"""
super().__init__()
if describe_types:
self.describe_types: dict = describe_types
self.describe_types: Dict[str, Any] = describe_types
self.__categories: List[str] = self.describe_types['categories']
self.__category_type_mapping: dict = self.describe_types['category_type_mappings']
self.__sane_default: dict = self.describe_types['sane_defaults']
self.__category_type_mapping: Dict[str, List[str]] = self.describe_types['category_type_mappings']
self.__sane_default: Dict[str, Dict[str, Union[str, int]]] = self.describe_types['sane_defaults']
self.__strict: bool = strict
self._data: Optional[BytesIO] = None
self.data: Optional[BytesIO] = None
self.first_seen: datetime
self.last_seen: datetime
self.uuid: str = str(uuid.uuid4())
@ -194,7 +200,7 @@ class MISPAttribute(AbstractMISP):
self.Event: MISPEvent
self.RelatedAttribute: List[MISPAttribute]
def add_tag(self, tag: Optional[Union[str, MISPTag, dict]]=None, **kwargs) -> MISPTag:
def add_tag(self, tag: Optional[Union[str, MISPTag, Dict]] = None, **kwargs) -> MISPTag:
return super()._add_tag(tag, **kwargs)
@property
@ -207,17 +213,53 @@ class MISPAttribute(AbstractMISP):
"""Set a list of prepared MISPTag."""
super()._set_tags(tags)
def __setattr__(self, name, value):
def _prepare_data(self, data: Optional[Union[Path, str, bytes, BytesIO]]):
if not data:
super().__setattr__('data', None)
return
if isinstance(data, BytesIO):
super().__setattr__('data', data)
elif isinstance(data, Path):
with data.open('rb') as f_temp:
super().__setattr__('data', BytesIO(f_temp.read()))
elif isinstance(data, (str, bytes)):
super().__setattr__('data', BytesIO(base64.b64decode(data)))
else:
raise PyMISPError(f'Invalid type ({type(data)}) for the data key: {data}')
if self.type == 'malware-sample':
try:
# Ignore type, if data is None -> exception
with ZipFile(self.data) as f: # type: ignore
if not self.__is_misp_encrypted_file(f):
raise PyMISPError('Not an existing malware sample')
for name in f.namelist():
if name.endswith('.filename.txt'):
with f.open(name, pwd=b'infected') as unpacked:
self.malware_filename = unpacked.read().decode().strip()
else:
with f.open(name, pwd=b'infected') as unpacked:
self._malware_binary = BytesIO(unpacked.read())
except Exception:
# not an encrypted zip file, assuming it is a new malware sample
self._prepare_new_malware_sample()
def __setattr__(self, name: str, value: Any):
if name in ['first_seen', 'last_seen']:
value = _make_datetime(value)
_datetime = _make_datetime(value)
if name == 'last_seen' and hasattr(self, 'first_seen') and self.first_seen > value:
if name == 'last_seen' and hasattr(self, 'first_seen') and self.first_seen > _datetime:
raise PyMISPError(f'last_seen ({value}) has to be after first_seen ({self.first_seen})')
if name == 'first_seen' and hasattr(self, 'last_seen') and self.last_seen < value:
if name == 'first_seen' and hasattr(self, 'last_seen') and self.last_seen < _datetime:
raise PyMISPError(f'first_seen ({value}) has to be before last_seen ({self.last_seen})')
super().__setattr__(name, value)
super().__setattr__(name, _datetime)
elif name == 'data':
self._prepare_data(value)
else:
super().__setattr__(name, value)
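To illustrate the reworked data handling (not part of this commit's diff), a minimal sketch assuming a local PyMISP install; 'sample.bin' is an illustrative filename.

from pathlib import Path
from pymisp import MISPAttribute

attr = MISPAttribute()
attr.type = 'attachment'
attr.value = 'sample.bin'        # illustrative
attr.data = Path('sample.bin')   # a Path, base64 str/bytes or BytesIO is accepted
# __setattr__ routes the assignment through _prepare_data(), which wraps the
# content in a BytesIO (and unpacks it when the type is 'malware-sample').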
def hash_values(self, algorithm: str='sha512') -> List[str]:
def hash_values(self, algorithm: str = 'sha512') -> List[str]:
"""Compute the hash of every values for fast lookups"""
if algorithm not in hashlib.algorithms_available:
raise PyMISPError('The algorithm {} is not available for hashing.'.format(algorithm))
@ -242,7 +284,7 @@ class MISPAttribute(AbstractMISP):
if not hasattr(self, 'timestamp'):
self.timestamp = datetime.timestamp(datetime.now())
def _to_feed(self) -> dict:
def _to_feed(self) -> Dict:
to_return = super()._to_feed()
if self.data:
to_return['data'] = base64.b64encode(self.data.getvalue()).decode()
@ -256,7 +298,7 @@ class MISPAttribute(AbstractMISP):
return self.describe_types['types']
@property
def malware_binary(self) -> Union[BytesIO, None]:
def malware_binary(self) -> Optional[BytesIO]:
"""Returns a BytesIO of the malware (if the attribute has one, obvs)."""
if hasattr(self, '_malware_binary'):
return self._malware_binary
@ -294,7 +336,7 @@ class MISPAttribute(AbstractMISP):
"""Alias for add_shadow_attribute"""
return self.add_shadow_attribute(shadow_attribute, **kwargs)
def add_shadow_attribute(self, shadow_attribute: Union[MISPShadowAttribute, dict, None]=None, **kwargs) -> MISPShadowAttribute:
def add_shadow_attribute(self, shadow_attribute: Optional[Union[MISPShadowAttribute, Dict]] = None, **kwargs) -> MISPShadowAttribute:
"""Add a shadow attribute to the attribute (by name or a MISPShadowAttribute object)"""
if isinstance(shadow_attribute, MISPShadowAttribute):
misp_shadow_attribute = shadow_attribute
@ -310,7 +352,7 @@ class MISPAttribute(AbstractMISP):
self.edited = True
return misp_shadow_attribute
def add_sighting(self, sighting: Union[MISPSighting, dict, None]=None, **kwargs) -> MISPSighting:
def add_sighting(self, sighting: Optional[Union[MISPSighting, dict]] = None, **kwargs) -> MISPSighting:
"""Add a sighting to the attribute (by name or a MISPSighting object)"""
if isinstance(sighting, MISPSighting):
misp_sighting = sighting
@ -452,7 +494,7 @@ class MISPAttribute(AbstractMISP):
super().from_dict(**kwargs)
def to_dict(self) -> dict:
def to_dict(self) -> Dict:
to_return = super().to_dict()
if self.data:
to_return['data'] = base64.b64encode(self.data.getvalue()).decode()
@ -488,35 +530,6 @@ class MISPAttribute(AbstractMISP):
return False
return True
@property
def data(self):
return self._data if self._data else None
@data.setter
def data(self, data: Union[Path, str, bytes, BytesIO]):
if isinstance(data, Path):
with data.open('rb') as f_temp:
self._data = BytesIO(f_temp.read())
if isinstance(data, (str, bytes)):
self._data = BytesIO(base64.b64decode(data))
elif isinstance(data, BytesIO):
self._data = data
if self.type == 'malware-sample':
try:
with ZipFile(self.data) as f:
if not self.__is_misp_encrypted_file(f):
raise PyMISPError('Not an existing malware sample')
for name in f.namelist():
if name.endswith('.filename.txt'):
with f.open(name, pwd=b'infected') as unpacked:
self.malware_filename = unpacked.read().decode().strip()
else:
with f.open(name, pwd=b'infected') as unpacked:
self._malware_binary = BytesIO(unpacked.read())
except Exception:
# not a encrypted zip file, assuming it is a new malware sample
self._prepare_new_malware_sample()
def __repr__(self):
if hasattr(self, 'value'):
return '<{self.__class__.__name__}(type={self.type}, value={self.value})'.format(self=self)
@ -588,9 +601,10 @@ class MISPObject(AbstractMISP):
_fields_for_feed: set = {'name', 'meta-category', 'description', 'template_uuid',
'template_version', 'uuid', 'timestamp', 'distribution',
'sharing_group_id', 'comment'}
'sharing_group_id', 'comment', 'first_seen', 'last_seen',
'deleted'}
def __init__(self, name: str, strict: bool=False, standalone: bool=False, default_attributes_parameters: dict={}, **kwargs):
def __init__(self, name: str, strict: bool = False, standalone: bool = True, default_attributes_parameters: Dict = {}, **kwargs):
''' Master class representing a generic MISP object
:name: Name of the object
@ -616,14 +630,15 @@ class MISPObject(AbstractMISP):
self.last_seen: datetime
self.__fast_attribute_access: dict = defaultdict(list) # Hashtable object_relation: [attributes]
self.ObjectReference: List[MISPObjectReference] = []
self.Attribute: List[MISPAttribute] = []
self._standalone: bool = False
self.Attribute: List[MISPObjectAttribute] = []
self.SharingGroup: MISPSharingGroup
self._default_attributes_parameters: dict
if isinstance(default_attributes_parameters, MISPAttribute):
# Just make sure we're not modifying an existing MISPAttribute
self._default_attributes_parameters = default_attributes_parameters.to_dict()
else:
self._default_attributes_parameters = default_attributes_parameters
self._default_attributes_parameters = copy.copy(default_attributes_parameters)
if self._default_attributes_parameters:
# Let's clean that up
self._default_attributes_parameters.pop('value', None) # duh
@ -638,18 +653,15 @@ class MISPObject(AbstractMISP):
self._default_attributes_parameters.pop('data', None) # in case the original is a sample or an attachment
# Those values are set for the current object, if they exist, but not pop'd because they are still useful for the attributes
self.distribution = self._default_attributes_parameters.get('distribution', 5)
self.sharing_group_id = self._default_attributes_parameters.get('sharing_group_id', 0)
self.distribution: int = self._default_attributes_parameters.get('distribution', 5)
self.sharing_group_id: int = self._default_attributes_parameters.get('sharing_group_id', 0)
else:
self.distribution = 5 # Default to inherit
self.sharing_group_id = 0
self._standalone = standalone
if self._standalone:
# Mark as non_jsonable because we need to add the references manually after the object(s) have been created
self.update_not_jsonable('ObjectReference')
self.standalone = standalone
def _load_template_path(self, template_path: Union[Path, str]) -> bool:
self._definition: Union[dict, None] = self._load_json(template_path)
self._definition: Optional[Dict] = self._load_json(template_path)
if not self._definition:
return False
setattr(self, 'meta-category', self._definition['meta-category'])
@ -664,7 +676,7 @@ class MISPObject(AbstractMISP):
if not hasattr(self, 'timestamp'):
self.timestamp = datetime.timestamp(datetime.now())
def _to_feed(self) -> dict:
def _to_feed(self) -> Dict:
to_return = super(MISPObject, self)._to_feed()
if self.references:
to_return['ObjectReference'] = [reference._to_feed() for reference in self.references]
@ -680,12 +692,12 @@ class MISPObject(AbstractMISP):
raise PyMISPError(f'first_seen ({value}) has to be before last_seen ({self.last_seen})')
super().__setattr__(name, value)
def force_misp_objects_path_custom(self, misp_objects_path_custom: Union[Path, str], object_name: Optional[str]=None):
def force_misp_objects_path_custom(self, misp_objects_path_custom: Union[Path, str], object_name: Optional[str] = None):
if object_name:
self.name = object_name
self._set_template(misp_objects_path_custom)
def _set_template(self, misp_objects_path_custom: Optional[Union[Path, str]]=None):
def _set_template(self, misp_objects_path_custom: Optional[Union[Path, str]] = None):
if misp_objects_path_custom:
# If misp_objects_path_custom is given, and an object with the given name exists, use that.
if isinstance(misp_objects_path_custom, str):
@ -707,11 +719,11 @@ class MISPObject(AbstractMISP):
self._strict = False
@property
def attributes(self) -> List[MISPAttribute]:
def attributes(self) -> List['MISPObjectAttribute']:
return self.Attribute
@attributes.setter
def attributes(self, attributes: List[MISPAttribute]):
def attributes(self, attributes: List['MISPObjectAttribute']):
if all(isinstance(x, MISPObjectAttribute) for x in attributes):
self.Attribute = attributes
self.__fast_attribute_access = defaultdict(list)
@ -729,6 +741,21 @@ class MISPObject(AbstractMISP):
else:
raise PyMISPError('All the attributes have to be of type MISPObjectReference.')
@property
def standalone(self):
return self._standalone
@standalone.setter
def standalone(self, new_standalone: bool):
if self._standalone != new_standalone:
if new_standalone:
self.update_not_jsonable("ObjectReference")
else:
self._remove_from_not_jsonable("ObjectReference")
self._standalone = new_standalone
else:
pass
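A minimal sketch of the new standalone property (not part of this commit's diff), assuming a local PyMISP install; the template and value are illustrative.

from pymisp import MISPObject

obj = MISPObject('domain-ip')                     # standalone now defaults to True
obj.add_attribute('domain', value='example.com')  # illustrative value
obj.standalone = False   # re-includes ObjectReference when the object is serialised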
def from_dict(self, **kwargs):
if 'Object' in kwargs:
kwargs = kwargs['Object']
@ -796,7 +823,7 @@ class MISPObject(AbstractMISP):
super().from_dict(**kwargs)
def add_reference(self, referenced_uuid: Union[AbstractMISP, str], relationship_type: str, comment: Optional[str]=None, **kwargs) -> MISPObjectReference:
def add_reference(self, referenced_uuid: Union[AbstractMISP, str], relationship_type: str, comment: Optional[str] = None, **kwargs) -> MISPObjectReference:
"""Add a link (uuid) to an other object"""
if isinstance(referenced_uuid, AbstractMISP):
# Allow to pass an object or an attribute instead of its UUID
@ -819,24 +846,31 @@ class MISPObject(AbstractMISP):
return self._fast_attribute_access.get(object_relation, [])
@property
def _fast_attribute_access(self):
def _fast_attribute_access(self) -> Dict:
if not self.__fast_attribute_access:
for a in self.attributes:
self.__fast_attribute_access[a.object_relation].append(a)
return self.__fast_attribute_access
def has_attributes_by_relation(self, list_of_relations: List[str]):
def has_attributes_by_relation(self, list_of_relations: List[str]) -> bool:
'''True if all the relations in the list are defined in the object'''
return all(relation in self._fast_attribute_access for relation in list_of_relations)
def add_attribute(self, object_relation: str, simple_value: Union[str, int, float]=None, **value) -> Union[MISPAttribute, None]:
def add_attribute(self, object_relation: str, simple_value: Optional[Union[str, int, float]] = None, **value) -> Optional[MISPAttribute]:
"""Add an attribute. object_relation is required and the value key is a
dictionary with all the keys supported by MISPAttribute"""
if simple_value is not None: # /!\ The value *can* be 0
value = {'value': simple_value}
if value.get('value') in [None, '']:
logger.warning("The value of the attribute you're trying to add is None or empty string, skipping it. Object relation: {}".format(object_relation))
if value.get('value') is None:
logger.warning("The value of the attribute you're trying to add is None, skipping it. Object relation: {}".format(object_relation))
return None
else:
# Make sure we're not adding an empty value.
if isinstance(value['value'], str):
value['value'] = value['value'].strip()
if value['value'] == '':
logger.warning("The value of the attribute you're trying to add is an empty string, skipping it. Object relation: {}".format(object_relation))
return None
if self._known_template and self._definition:
if object_relation in self._definition['attributes']:
attribute = MISPObjectAttribute(self._definition['attributes'][object_relation])
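A minimal sketch of the stricter value check in add_attribute above (not part of this commit's diff), assuming a local PyMISP install; values are illustrative.

from pymisp import MISPObject

obj = MISPObject('domain-ip')
skipped = obj.add_attribute('domain', value='   ')        # stripped to '', warned about and skipped
kept = obj.add_attribute('domain', value='example.com')   # illustrative value
print(skipped, kept.value)                                # None example.com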
@ -869,20 +903,22 @@ class MISPObject(AbstractMISP):
to_return.append(a)
return to_return
def to_dict(self, strict: bool=False) -> dict:
def to_dict(self, strict: bool = False) -> Dict:
if strict or self._strict and self._known_template:
self._validate()
return super(MISPObject, self).to_dict()
def to_json(self, sort_keys: bool=False, indent: Optional[int]=None, strict: bool=False):
def to_json(self, sort_keys: bool = False, indent: Optional[int] = None, strict: bool = False):
if strict or self._strict and self._known_template:
self._validate()
return super(MISPObject, self).to_json(sort_keys=sort_keys, indent=indent)
def _validate(self):
def _validate(self) -> bool:
if not self._definition:
raise PyMISPError('No object definition available, unable to validate.')
"""Make sure the object we're creating has the required fields"""
if self._definition.get('required'):
required_missing = set(self._definition.get('required')) - set(self._fast_attribute_access.keys())
required_missing = set(self._definition['required']) - set(self._fast_attribute_access.keys())
if required_missing:
raise InvalidMISPObject('{} are required.'.format(required_missing))
if self._definition.get('requiredOneOf'):
@ -909,7 +945,7 @@ class MISPEvent(AbstractMISP):
_fields_for_feed: set = {'uuid', 'info', 'threat_level_id', 'analysis', 'timestamp',
'publish_timestamp', 'published', 'date', 'extends_uuid'}
def __init__(self, describe_types: dict=None, strict_validation: bool=False, **kwargs):
def __init__(self, describe_types: Optional[Dict] = None, strict_validation: bool = False, **kwargs):
super().__init__(**kwargs)
if strict_validation:
schema_file = 'schema.json'
@ -920,6 +956,7 @@ class MISPEvent(AbstractMISP):
# This variable is used in add_attribute in order to avoid duplicating the structure
self.describe_types = describe_types
self.uuid: str = str(uuid.uuid4())
self.date: date
self.Attribute: List[MISPAttribute] = []
self.Object: List[MISPObject] = []
@ -928,7 +965,7 @@ class MISPEvent(AbstractMISP):
self.SharingGroup: MISPSharingGroup
self.Tag: List[MISPTag] = []
def add_tag(self, tag: Optional[Union[str, MISPTag, dict]]=None, **kwargs) -> MISPTag:
def add_tag(self, tag: Optional[Union[str, MISPTag, dict]] = None, **kwargs) -> MISPTag:
return super()._add_tag(tag, **kwargs)
@property
@ -963,7 +1000,7 @@ class MISPEvent(AbstractMISP):
self.threat_level_id = 4
@property
def manifest(self) -> dict:
def manifest(self) -> Dict:
required = ['info', 'Orgc']
for r in required:
if not hasattr(self, r):
@ -983,7 +1020,7 @@ class MISPEvent(AbstractMISP):
}
}
def attributes_hashes(self, algorithm: str='sha512') -> List[str]:
def attributes_hashes(self, algorithm: str = 'sha512') -> List[str]:
to_return: List[str] = []
for attribute in self.attributes:
to_return += attribute.hash_values(algorithm)
@ -992,10 +1029,9 @@ class MISPEvent(AbstractMISP):
to_return += attribute.hash_values(algorithm)
return to_return
def to_feed(self, valid_distributions: List[int]=[0, 1, 2, 3, 4, 5], with_meta: bool=False) -> dict:
def to_feed(self, valid_distributions: List[int] = [0, 1, 2, 3, 4, 5], with_meta: bool = False) -> Dict:
""" Generate a json output for MISP Feed.
Notes:
* valid_distributions only makes sense if the distribution key is set (i.e. the event is exported from a MISP instance)
Note: valid_distributions only makes sense if the distribution key is set; i.e., the event is exported from a MISP instance.
"""
required = ['info', 'Orgc']
for r in required:
@ -1082,7 +1118,7 @@ class MISPEvent(AbstractMISP):
raise PyMISPError('All the attributes have to be of type MISPShadowAttribute.')
@property
def related_events(self): # -> List[MISPEvent]:
def related_events(self) -> List['MISPEvent']:
return self.RelatedEvent
@property
@ -1096,14 +1132,14 @@ class MISPEvent(AbstractMISP):
else:
raise PyMISPError('All the attributes have to be of type MISPObject.')
def load_file(self, event_path: Union[Path, str], validate: bool=False, metadata_only: bool=False):
def load_file(self, event_path: Union[Path, str], validate: bool = False, metadata_only: bool = False):
"""Load a JSON dump from a file on the disk"""
if not os.path.exists(event_path):
raise PyMISPError('Invalid path, unable to load the event.')
with open(event_path, 'rb') as f:
self.load(f, validate, metadata_only)
def load(self, json_event: Union[IO, str, bytes, dict], validate: bool=False, metadata_only: bool=False):
def load(self, json_event: Union[IO, str, bytes, dict], validate: bool = False, metadata_only: bool = False):
"""Load a JSON dump from a pseudo file or a JSON string"""
if isinstance(json_event, IOBase):
# Python 2 and Python 3 compatible way to check whether we have a file-like object
@ -1145,7 +1181,7 @@ class MISPEvent(AbstractMISP):
raise NewEventError(f'Invalid format for the date: {type(value)} - {value}')
super().__setattr__(name, value)
def set_date(self, d: Optional[Union[str, int, float, datetime, date]]=None, ignore_invalid: bool=False):
def set_date(self, d: Optional[Union[str, int, float, datetime, date]] = None, ignore_invalid: bool = False):
"""Set a date for the event (string, datetime, or date object)"""
if isinstance(d, (str, int, float, datetime, date)):
self.date = d # type: ignore
@ -1225,7 +1261,7 @@ class MISPEvent(AbstractMISP):
self.SharingGroup.from_dict(**kwargs.pop('SharingGroup'))
super(MISPEvent, self).from_dict(**kwargs)
def to_dict(self) -> dict:
def to_dict(self) -> Dict:
to_return = super().to_dict()
if to_return.get('date'):
@ -1268,7 +1304,7 @@ class MISPEvent(AbstractMISP):
if ((hasattr(a, 'id') and a.id == attribute_identifier)
or (hasattr(a, 'uuid') and a.uuid == attribute_identifier)
or (hasattr(a, 'value') and attribute_identifier == a.value
or attribute_identifier in a.value.split('|'))):
or (isinstance(a.value, str) and attribute_identifier in a.value.split('|')))):
tags += a.tags
return tags
@ -1282,7 +1318,7 @@ class MISPEvent(AbstractMISP):
if ((hasattr(a, 'id') and a.id == attribute_identifier)
or (hasattr(a, 'uuid') and a.uuid == attribute_identifier)
or (hasattr(a, 'value') and attribute_identifier == a.value
or attribute_identifier in a.value.split('|'))):
or (isinstance(a.value, str) and attribute_identifier in a.value.split('|')))):
a.add_tag(tag)
attributes.append(a)
@ -1301,14 +1337,12 @@ class MISPEvent(AbstractMISP):
def delete_attribute(self, attribute_id: str):
"""Delete an attribute, you can search by ID or UUID"""
found = False
for a in self.attributes:
if ((hasattr(a, 'id') and a.id == attribute_id)
or (hasattr(a, 'uuid') and a.uuid == attribute_id)):
a.delete()
found = True
break
if not found:
else:
raise PyMISPError('No attribute with UUID/ID {} found.'.format(attribute_id))
def add_attribute(self, type: str, value: Union[str, int, float], **kwargs) -> Union[MISPAttribute, List[MISPAttribute]]:
@ -1348,7 +1382,7 @@ class MISPEvent(AbstractMISP):
objects.append(obj)
return objects
def add_object(self, obj: Union[MISPObject, dict, None]=None, **kwargs) -> MISPObject:
def add_object(self, obj: Union[MISPObject, dict, None] = None, **kwargs) -> MISPObject:
"""Add an object to the Event, either by passing a MISPObject, or a dictionary"""
if isinstance(obj, MISPObject):
misp_obj = obj
@ -1364,6 +1398,7 @@ class MISPEvent(AbstractMISP):
misp_obj.from_dict(**kwargs)
else:
raise InvalidMISPObject("An object to add to an existing Event needs to be either a MISPObject, or a plain python dictionary")
misp_obj.standalone = False
self.Object.append(misp_obj)
self.edited = True
return misp_obj
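A minimal sketch showing that objects attached to an event are no longer standalone (not part of this commit's diff), assuming a local PyMISP install; the event info and values are illustrative.

from pymisp import MISPEvent, MISPObject

event = MISPEvent()
event.info = 'demo event'                         # illustrative
obj = MISPObject('domain-ip')
obj.add_attribute('domain', value='example.com')  # illustrative
event.add_object(obj)
print(obj.standalone)   # False: the object's references are exported with the event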
@ -1493,6 +1528,12 @@ class MISPFeed(AbstractMISP):
if 'Feed' in kwargs:
kwargs = kwargs['Feed']
super().from_dict(**kwargs)
if hasattr(self, 'settings'):
try:
self.settings = json.loads(self.settings)
except json.decoder.JSONDecodeError as e:
logger.error("Failed to parse feed settings: {}".format(self.settings))
raise e
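A minimal sketch of the new settings parsing (not part of this commit's diff), assuming a local PyMISP install; the feed name and settings string are illustrative.

from pymisp import MISPFeed

feed = MISPFeed()
feed.from_dict(name='demo feed', settings='{"csv": {"value": "1,2"}}')  # illustrative
print(feed.settings['csv'])   # the JSON string is parsed into a dict on load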
class MISPWarninglist(AbstractMISP):
@ -1584,8 +1625,9 @@ class MISPEventDelegation(AbstractMISP):
class MISPObjectAttribute(MISPAttribute):
_fields_for_feed: set = {'uuid', 'object_relation', 'value', 'category', 'type',
'comment', 'data', 'timestamp', 'to_ids', 'disable_correlation'}
_fields_for_feed: set = {'uuid', 'value', 'category', 'type', 'comment', 'data',
'deleted', 'timestamp', 'to_ids', 'disable_correlation',
'first_seen', 'last_seen', 'object_relation'}
def __init__(self, definition):
super().__init__()
@ -1632,7 +1674,7 @@ class MISPCommunity(AbstractMISP):
super().from_dict(**kwargs)
def __repr__(self):
return '<{self.__class__.__name__}(name={self.name}, uuid={self.uuid})'.format(self=self)
return f'<{self.__class__.__name__}(name={self.name}, uuid={self.uuid})'
class MISPUserSetting(AbstractMISP):
@ -1643,4 +1685,49 @@ class MISPUserSetting(AbstractMISP):
super().from_dict(**kwargs)
def __repr__(self):
return '<{self.__class__.__name__}(name={self.setting}'.format(self=self)
return f'<{self.__class__.__name__}(name={self.setting}'
class MISPInbox(AbstractMISP):
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.data: Dict
def from_dict(self, **kwargs):
if 'Inbox' in kwargs:
kwargs = kwargs['Inbox']
super().from_dict(**kwargs)
def __repr__(self):
return f'<{self.__class__.__name__}(name={self.type})>'
class MISPEventBlocklist(AbstractMISP):
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.event_uuid: str
def from_dict(self, **kwargs):
if 'EventBlocklist' in kwargs:
kwargs = kwargs['EventBlocklist']
super().from_dict(**kwargs)
def __repr__(self):
return f'<{self.__class__.__name__}(event_uuid={self.event_uuid}'
class MISPOrganisationBlocklist(AbstractMISP):
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.org_uuid: str
def from_dict(self, **kwargs):
if 'OrgBlocklist' in kwargs:
kwargs = kwargs['OrgBlocklist']
super().from_dict(**kwargs)
def __repr__(self):
return f'<{self.__class__.__name__}(org_uuid={self.org_uuid}'
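A minimal sketch of the new blocklist models (not part of this commit's diff), assuming they are exported at package level as in later releases; the UUID and comment are illustrative.

from pymisp import MISPEventBlocklist

bl = MISPEventBlocklist()
bl.from_dict(event_uuid='5ea3a255-1f68-4b64-8b47-0c51cbc595ab',  # illustrative UUID
             comment='known false positive')
print(bl.event_uuid)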

View File

@ -10,17 +10,22 @@ from .fail2banobject import Fail2BanObject # noqa
from .domainipobject import DomainIPObject # noqa
from .asnobject import ASNObject # noqa
from .geolocationobject import GeolocationObject # noqa
from .git_vuln_finder_object import GitVulnFinderObject # noqa
from .emailobject import EMailObject # noqa
from .vehicleobject import VehicleObject # noqa
from .csvloader import CSVLoader # noqa
from .sshauthkeyobject import SSHAuthorizedKeysObject # noqa
from .feed import feed_meta_generator # noqa
from .update_objects import update_objects # noqa
try:
from .urlobject import URLObject # noqa
except ImportError:
# Requires faup, which is a bit difficult to install
pass
except OSError:
# faup requires liblua-5.3
pass
try:
from .peobject import PEObject, PESectionObject # noqa

View File

@ -21,7 +21,7 @@ class AbstractMISPObjectGenerator(MISPObject):
except ValueError:
return False
def _sanitize_timestamp(self, timestamp: Optional[Union[datetime, date, dict, str, int, float]]=None) -> datetime:
def _sanitize_timestamp(self, timestamp: Optional[Union[datetime, date, dict, str, int, float]] = None) -> datetime:
if not timestamp:
return datetime.now()
@ -35,6 +35,7 @@ class AbstractMISPObjectGenerator(MISPObject):
return timestamp['value']
else: # Supported: float/int/string
if isinstance(timestamp, (str, int, float)) and self._detect_epoch(timestamp):
# It converts to the *local* datetime, which is consistent with the rest of the code.
return datetime.fromtimestamp(float(timestamp))
elif isinstance(timestamp, str):
return parse(timestamp)

View File

@ -9,8 +9,8 @@ logger = logging.getLogger('pymisp')
class ASNObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(ASNObject, self).__init__('asn', strict=strict, standalone=standalone, **kwargs)
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(ASNObject, self).__init__('asn', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -29,7 +29,7 @@ class FileTypeNotImplemented(MISPObjectException):
pass
def make_binary_objects(filepath: Optional[str]=None, pseudofile: Optional[BytesIO]=None, filename: Optional[str]=None, standalone: bool=True, default_attributes_parameters: dict={}):
def make_binary_objects(filepath: Optional[str] = None, pseudofile: Optional[BytesIO] = None, filename: Optional[str] = None, standalone: bool = True, default_attributes_parameters: dict = {}):
misp_file = FileObject(filepath=filepath, pseudofile=pseudofile, filename=filename,
standalone=standalone, default_attributes_parameters=default_attributes_parameters)
if HAS_LIEF and (filepath or (pseudofile and filename)):

View File

@ -8,8 +8,8 @@ from pymisp import MISPObject
class CSVLoader():
def __init__(self, template_name: str, csv_path: Path, fieldnames: list=[], has_fieldnames=False,
delimiter: str=',', quotechar: str='"'):
def __init__(self, template_name: str, csv_path: Path, fieldnames: list = [], has_fieldnames=False,
delimiter: str = ',', quotechar: str = '"'):
self.template_name = template_name
self.delimiter = delimiter
self.quotechar = quotechar
@ -34,7 +34,7 @@ class CSVLoader():
self.fieldnames = fieldnames
if not self.fieldnames:
raise Exception(f'No fieldnames, impossible to create objects.')
raise Exception('No fieldnames, impossible to create objects.')
else:
# Check if the CSV file has a header, and if it matches with the object template
tmp_object = MISPObject(self.template_name)

View File

@ -9,8 +9,8 @@ logger = logging.getLogger('pymisp')
class DomainIPObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(DomainIPObject, self).__init__('domain-ip', strict=strict, standalone=standalone, **kwargs)
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(DomainIPObject, self).__init__('domain-ip', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -21,7 +21,7 @@ except ImportError:
logger = logging.getLogger('pymisp')
def make_elf_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
def make_elf_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool = True, default_attributes_parameters: dict = {}):
elf_object = ELFObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(elf_object.uuid, 'includes', 'ELF indicators')
elf_sections = []
@ -32,8 +32,9 @@ def make_elf_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone
class ELFObject(AbstractMISPObjectGenerator):
def __init__(self, parsed: lief.ELF.Binary=None, filepath: Union[Path, str]=None, pseudofile: Union[BytesIO, bytes]=None, standalone: bool=True, **kwargs):
super(ELFObject, self).__init__('elf', standalone=standalone, **kwargs)
def __init__(self, parsed: lief.ELF.Binary = None, filepath: Union[Path, str] = None, pseudofile: Union[BytesIO, bytes] = None, **kwargs):
"""Creates an ELF object, with lief"""
super(ELFObject, self).__init__('elf', **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if pseudofile:
@ -64,7 +65,7 @@ class ELFObject(AbstractMISPObjectGenerator):
if self.__elf.sections:
pos = 0
for section in self.__elf.sections:
s = ELFSectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
s = ELFSectionObject(section, standalone=self._standalone, default_attributes_parameters=self._default_attributes_parameters)
self.add_reference(s.uuid, 'includes', 'Section {} of ELF'.format(pos))
pos += 1
self.sections.append(s)
@ -73,10 +74,11 @@ class ELFObject(AbstractMISPObjectGenerator):
class ELFSectionObject(AbstractMISPObjectGenerator):
def __init__(self, section: lief.ELF.Section, standalone: bool=True, **kwargs):
def __init__(self, section: lief.ELF.Section, **kwargs):
"""Creates an ELF Section object. Object generated by ELFObject."""
# Python3 way
# super().__init__('pe-section')
super(ELFSectionObject, self).__init__('elf-section', standalone=standalone, **kwargs)
super(ELFSectionObject, self).__init__('elf-section', **kwargs)
self.__section = section
self.__data = bytes(self.__section.content)
self.generate_attributes()

View File

@ -14,10 +14,10 @@ logger = logging.getLogger('pymisp')
class EMailObject(AbstractMISPObjectGenerator):
def __init__(self, filepath: Union[Path, str]=None, pseudofile: BytesIO=None, attach_original_email: bool=True, standalone: bool=True, **kwargs):
def __init__(self, filepath: Union[Path, str] = None, pseudofile: BytesIO = None, attach_original_email: bool = True, **kwargs):
# PY3 way:
# super().__init__('file')
super(EMailObject, self).__init__('email', standalone=standalone, **kwargs)
super(EMailObject, self).__init__('email', **kwargs)
if filepath:
with open(filepath, 'rb') as f:
self.__pseudofile = BytesIO(f.read())
@ -26,6 +26,9 @@ class EMailObject(AbstractMISPObjectGenerator):
else:
raise InvalidMISPObject('File buffer (BytesIO) or a path is required.')
self.__email = message_from_bytes(self.__pseudofile.getvalue(), policy=policy.default)
# Improperly encoded emails (utf-8-sig) fail silently. An empty email indicates this might be the case.
if len(self.__email) == 0:
self.attempt_decoding()
if attach_original_email:
self.add_attribute('eml', value='Full email.eml', data=self.__pseudofile)
self.generate_attributes()
@ -44,6 +47,24 @@ class EMailObject(AbstractMISPObjectGenerator):
to_return.append((attachment.get_filename(), BytesIO(content)))
return to_return
def attempt_decoding(self):
"""Attempt to decode non-ascii encoded emails.
"""
_msg_bytes = self.__pseudofile.getvalue()
try:
_msg_bytes.decode("ASCII")
logger.info("EmailObject failed to decode ASCII encoded email.")
return
except UnicodeDecodeError:
logger.debug("EmailObject was passed a non-ASCII encoded binary blob.")
try:
if _msg_bytes[:3] == b'\xef\xbb\xbf': # utf-8-sig byte-order mark (BOM)
# Set Pseudofile to correctly encoded email in case it is used at some later point.
self.__pseudofile = BytesIO(_msg_bytes.decode('utf_8_sig').encode("ASCII"))
self.__email = message_from_bytes(self.__pseudofile.getvalue(), policy=policy.default)
except UnicodeDecodeError:
logger.debug("EmailObject does not know how to decode binary blob passed to it. Object may not be an email. If this is an email please submit it as an issue to PyMISP so we can add support.")
def generate_attributes(self):
if self.__email.get_body(preferencelist=('html', 'plain')):
self.add_attribute('email-body', value=self.__email.get_body(preferencelist=('html', 'plain')).get_payload(decode=True).decode('utf8', 'surrogateescape'))

View File

@ -9,8 +9,8 @@ logger = logging.getLogger('pymisp')
class Fail2BanObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(Fail2BanObject, self).__init__('fail2ban', strict=strict, standalone=standalone, **kwargs)
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(Fail2BanObject, self).__init__('fail2ban', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -30,10 +30,10 @@ except ImportError:
class FileObject(AbstractMISPObjectGenerator):
def __init__(self, filepath: Union[Path, str]=None, pseudofile: BytesIO=None, filename: str=None, standalone: bool=True, **kwargs):
def __init__(self, filepath: Union[Path, str] = None, pseudofile: BytesIO = None, filename: str = None, **kwargs):
# PY3 way:
# super().__init__('file')
super(FileObject, self).__init__('file', standalone=standalone, **kwargs)
super(FileObject, self).__init__('file', **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if not HAS_MAGIC:
@ -68,7 +68,7 @@ class FileObject(AbstractMISPObjectGenerator):
self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
self.add_attribute('malware-sample', value=self.__filename, data=self.__pseudofile)
if HAS_MAGIC:
self.add_attribute('mimetype', value=magic.from_buffer(self.__data))
self.add_attribute('mimetype', value=magic.from_buffer(self.__data, mime=True))
if HAS_PYDEEP:
self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
@ -79,10 +79,10 @@ class FileObject(AbstractMISPObjectGenerator):
if len(data) == 0:
return 0.0
occurences = Counter(bytearray(data))
occurrences = Counter(bytearray(data))
entropy = 0.0
for x in occurences.values():
for x in occurrences.values():
p_x = float(x) / len(data)
entropy -= p_x * math.log(p_x, 2)
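A minimal sketch of the mimetype change above (not part of this commit's diff), assuming python-magic is installed; 'report.pdf' is an illustrative filename.

from pymisp.tools import FileObject

fo = FileObject(filepath='report.pdf')             # illustrative file
print(fo.get_attributes_by_relation('mimetype'))   # now a MIME type such as 'application/pdf'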

View File

@ -9,8 +9,8 @@ logger = logging.getLogger('pymisp')
class GeolocationObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(GeolocationObject, self).__init__('asn', strict=strict, standalone=standalone, **kwargs)
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(GeolocationObject, self).__init__('asn', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from .abstractgenerator import AbstractMISPObjectGenerator
import logging
logger = logging.getLogger('pymisp')
class GitVulnFinderObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(GitVulnFinderObject, self).__init__('git-vuln-finder', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()
def generate_attributes(self):
authored_date = self._sanitize_timestamp(self._parameters.pop('authored_date', None))
self._parameters['authored_date'] = authored_date
committed_date = self._sanitize_timestamp(self._parameters.pop('committed_date', None))
self._parameters['committed_date'] = committed_date
if 'stats' in self._parameters:
stats = self._parameters.pop('stats')
self._parameters['stats.insertions'] = stats.pop('insertions')
self._parameters['stats.deletions'] = stats.pop('deletions')
self._parameters['stats.lines'] = stats.pop('lines')
self._parameters['stats.files'] = stats.pop('files')
return super(GitVulnFinderObject, self).generate_attributes()
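A minimal sketch of the new git-vuln-finder generator (not part of this commit's diff), assuming a local PyMISP install; the finding dict is illustrative.

from pymisp.tools import GitVulnFinderObject

finding = {                                  # illustrative git-vuln-finder record
    'authored_date': 1600000000,
    'committed_date': 1600000100,
    'stats': {'insertions': 2, 'deletions': 1, 'lines': 3, 'files': 1},
}
obj = GitVulnFinderObject(finding)           # timestamps are sanitised, stats flattened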

View File

@ -21,7 +21,7 @@ except ImportError:
logger = logging.getLogger('pymisp')
def make_macho_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
def make_macho_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool = True, default_attributes_parameters: dict = {}):
macho_object = MachOObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(macho_object.uuid, 'includes', 'MachO indicators')
macho_sections = []
@ -32,10 +32,11 @@ def make_macho_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalo
class MachOObject(AbstractMISPObjectGenerator):
def __init__(self, parsed: Optional[lief.MachO.Binary]=None, filepath: Optional[Union[Path, str]]=None, pseudofile: Optional[BytesIO]=None, standalone: bool=True, **kwargs):
def __init__(self, parsed: Optional[lief.MachO.Binary] = None, filepath: Optional[Union[Path, str]] = None, pseudofile: Optional[BytesIO] = None, **kwargs):
"""Creates an MachO object, with lief"""
# Python3 way
# super().__init__('elf')
super(MachOObject, self).__init__('macho', standalone=standalone, **kwargs)
super(MachOObject, self).__init__('macho', **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if pseudofile:
@ -66,7 +67,7 @@ class MachOObject(AbstractMISPObjectGenerator):
if self.__macho.sections:
pos = 0
for section in self.__macho.sections:
s = MachOSectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
s = MachOSectionObject(section, standalone=self._standalone, default_attributes_parameters=self._default_attributes_parameters)
self.add_reference(s.uuid, 'includes', 'Section {} of MachO'.format(pos))
pos += 1
self.sections.append(s)
@ -75,10 +76,11 @@ class MachOObject(AbstractMISPObjectGenerator):
class MachOSectionObject(AbstractMISPObjectGenerator):
def __init__(self, section: lief.MachO.Section, standalone: bool=True, **kwargs):
def __init__(self, section: lief.MachO.Section, **kwargs):
"""Creates an MachO Section object. Object generated by MachOObject."""
# Python3 way
# super().__init__('pe-section')
super(MachOSectionObject, self).__init__('macho-section', standalone=standalone, **kwargs)
super(MachOSectionObject, self).__init__('macho-section', **kwargs)
self.__section = section
self.__data = bytes(self.__section.content)
self.generate_attributes()

View File

@ -0,0 +1,144 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# NOTE: Reference on how this module is used: https://vvx7.io/posts/2020/05/misp-slack-bot/
from .abstractgenerator import AbstractMISPObjectGenerator
import logging
logger = logging.getLogger('pymisp')
class MicroblogObject(AbstractMISPObjectGenerator):
def __init__(self, parameters: dict, strict: bool = True, **kwargs):
super(MicroblogObject, self).__init__('microblog', strict=strict, **kwargs)
self._parameters = parameters
self.generate_attributes()
def generate_attributes(self):
# Raw post.
if 'post' in self._parameters:
self.add_attribute('post', value=self._parameters['post'])
# Title of the post.
if 'title' in self._parameters:
self.add_attribute('title', value=self._parameters['title'])
# Original link to the microblog post (supposedly harmless).
if 'link' in self._parameters:
self.add_attribute('link', value=self._parameters['link'])
# Original URL location of the microblog post (potentially malicious).
if 'url' in self._parameters:
if isinstance(self._parameters.get('url'), list):
for i in self._parameters.get('url'):
self.add_attribute('url', value=i)
else:
self.add_attribute('url', value=self._parameters['url'])
# Archive of the original document (Internet Archive, Archive.is, etc).
if 'archive' in self._parameters:
if isinstance(self._parameters.get('archive'), list):
for i in self._parameters.get('archive'):
self.add_attribute('archive', value=i)
else:
self.add_attribute('archive', value=self._parameters['archive'])
# Display name of the account who posted the microblog.
if 'display-name' in self._parameters:
self.add_attribute('display-name', value=self._parameters['display-name'])
# The user ID of the microblog this post replies to.
if 'in-reply-to-user-id' in self._parameters:
self.add_attribute('in-reply-to-user-id', value=self._parameters['in-reply-to-user-id'])
# The microblog ID of the microblog this post replies to.
if 'in-reply-to-status-id' in self._parameters:
self.add_attribute('in-reply-to-status-id', value=self._parameters['in-reply-to-status-id'])
# The user display name of the microblog this post replies to.
if 'in-reply-to-display-name' in self._parameters:
self.add_attribute('in-reply-to-display-name', value=self._parameters['in-reply-to-display-name'])
# The language of the post.
if 'language' in self._parameters:
self.add_attribute('language', value=self._parameters['language'], disable_correlation=True)
# The microblog post file or screen capture.
# if 'attachment' in self._parameters:
# self.add_attribute('attachment', value=self._parameters['attachment'])
# Type of the microblog post.
type_allowed_values = ["Twitter", "Facebook", "LinkedIn", "Reddit", "Google+",
"Instagram", "Forum", "Other"]
if 'type' in self._parameters:
if isinstance(self._parameters.get('type'), list):
for i in self._parameters.get('type'):
if i in type_allowed_values:
self.add_attribute('type', value=i)
else:
if self._parameters['type'] in type_allowed_values:
self.add_attribute('type', value=self._parameters['type'])
# State of the microblog post.
type_allowed_values = ["Informative", "Malicious", "Misinformation", "Disinformation", "Unknown"]
if 'state' in self._parameters:
if isinstance(self._parameters.get('state'), list):
for i in self._parameters.get('state'):
if i in type_allowed_values:
self.add_attribute('state', value=i)
else:
if self._parameters['state'] in type_allowed_values:
self.add_attribute('state', value=self._parameters['state'])
# Username who posted the microblog post (without the @ prefix).
if 'username' in self._parameters:
self.add_attribute('username', value=self._parameters['username'])
# Whether the username is verified by the operator of the microblog platform.
type_allowed_values = ["Verified", "Unverified", "Unknown"]
if 'verified-username' in self._parameters:
if isinstance(self._parameters.get('verified-username'), list):
for i in self._parameters.get('verified-username'):
if i in type_allowed_values:
self.add_attribute('verified-username', value=i)
else:
if self._parameters['verified-username'] in type_allowed_values:
self.add_attribute('verified-username', value=self._parameters['verified-username'])
# embedded-link.
if 'embedded-link' in self._parameters:
if isinstance(self._parameters.get('embedded-link'), list):
for i in self._parameters.get('embedded-link'):
self.add_attribute('embedded-link', value=i)
else:
self.add_attribute('embedded-link', value=self._parameters['embedded-link'])
# embedded-safe-link
if 'embedded-safe-link' in self._parameters:
if isinstance(self._parameters.get('embedded-safe-link'), list):
for i in self._parameters.get('embedded-safe-link'):
self.add_attribute('embedded-safe-link', value=i)
else:
self.add_attribute('embedded-safe-link', value=self._parameters['embedded-safe-link'])
# Hashtag into the microblog post.
if 'hashtag' in self._parameters:
if isinstance(self._parameters.get('hashtag'), list):
for i in self._parameters.get('hashtag'):
self.add_attribute('hashtag', value=i)
else:
self.add_attribute('hashtag', value=self._parameters['hashtag'])
# username quoted
if 'username-quoted' in self._parameters:
if isinstance(self._parameters.get('username-quoted'), list):
for i in self._parameters.get('username-quoted'):
self.add_attribute('username-quoted', value=i)
else:
self.add_attribute('username-quoted', value=self._parameters['username-quoted'])
# twitter post id
if 'twitter-id' in self._parameters:
self.add_attribute('twitter-id', value=self._parameters['twitter-id'])
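A minimal sketch of the new microblog generator (not part of this commit's diff), assuming the module lives at pymisp.tools.microblogobject; the post dict is illustrative.

from pymisp.tools.microblogobject import MicroblogObject

post = {                             # illustrative scraped post
    'post': 'Thread about a phishing campaign',
    'username': 'alice',
    'type': 'Twitter',
    'state': 'Informative',
    'url': ['https://example.com/alice/status/1'],
}
obj = MicroblogObject(post)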

View File

@ -23,7 +23,7 @@ except ImportError:
logger = logging.getLogger('pymisp')
def make_pe_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
def make_pe_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool = True, default_attributes_parameters: dict = {}):
pe_object = PEObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(pe_object.uuid, 'includes', 'PE indicators')
pe_sections = []
@ -34,10 +34,11 @@ def make_pe_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone:
class PEObject(AbstractMISPObjectGenerator):
def __init__(self, parsed: Optional[lief.PE.Binary]=None, filepath: Optional[Union[Path, str]]=None, pseudofile: Optional[BytesIO]=None, standalone: bool=True, **kwargs):
def __init__(self, parsed: Optional[lief.PE.Binary] = None, filepath: Optional[Union[Path, str]] = None, pseudofile: Optional[BytesIO] = None, **kwargs):
"""Creates an PE object, with lief"""
# Python3 way
# super().__init__('pe')
super(PEObject, self).__init__('pe', standalone=standalone, **kwargs)
super(PEObject, self).__init__('pe', **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if pseudofile:
@ -111,7 +112,7 @@ class PEObject(AbstractMISPObjectGenerator):
if self.__pe.sections:
pos = 0
for section in self.__pe.sections:
s = PESectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
s = PESectionObject(section, standalone=self._standalone, default_attributes_parameters=self._default_attributes_parameters)
self.add_reference(s.uuid, 'includes', 'Section {} of PE'.format(pos))
if ((self.__pe.entrypoint >= section.virtual_address)
and (self.__pe.entrypoint < (section.virtual_address + section.virtual_size))):
@ -124,10 +125,11 @@ class PEObject(AbstractMISPObjectGenerator):
class PESectionObject(AbstractMISPObjectGenerator):
def __init__(self, section: lief.PE.Section, standalone: bool=True, **kwargs):
def __init__(self, section: lief.PE.Section, **kwargs):
"""Creates an PE Section object. Object generated by PEObject."""
# Python3 way
# super().__init__('pe-section')
super(PESectionObject, self).__init__('pe-section', standalone=standalone, **kwargs)
super(PESectionObject, self).__init__('pe-section', **kwargs)
self.__section = section
self.__data = bytes(self.__section.content)
self.generate_attributes()
@ -136,10 +138,17 @@ class PESectionObject(AbstractMISPObjectGenerator):
self.add_attribute('name', value=self.__section.name)
size = self.add_attribute('size-in-bytes', value=self.__section.size)
if int(size.value) > 0:
# zero-filled sections can create too many correlations
to_ids = float(self.__section.entropy) > 0
disable_correlation = not to_ids
self.add_attribute('entropy', value=self.__section.entropy)
self.add_attribute('md5', value=md5(self.__data).hexdigest())
self.add_attribute('sha1', value=sha1(self.__data).hexdigest())
self.add_attribute('sha256', value=sha256(self.__data).hexdigest())
self.add_attribute('sha512', value=sha512(self.__data).hexdigest())
if HAS_PYDEEP:
self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
self.add_attribute('md5', value=md5(self.__data).hexdigest(), disable_correlation=disable_correlation, to_ids=to_ids)
self.add_attribute('sha1', value=sha1(self.__data).hexdigest(), disable_correlation=disable_correlation, to_ids=to_ids)
self.add_attribute('sha256', value=sha256(self.__data).hexdigest(), disable_correlation=disable_correlation, to_ids=to_ids)
self.add_attribute('sha512', value=sha512(self.__data).hexdigest(), disable_correlation=disable_correlation, to_ids=to_ids)
if HAS_PYDEEP and float(self.__section.entropy) > 0:
if self.__section.name == '.rsrc':
# ssdeep of .rsrc creates too many correlations
disable_correlation = True
to_ids = False
self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode(), disable_correlation=disable_correlation, to_ids=to_ids)

View File

@ -8,7 +8,7 @@ class SBSignatureObject(AbstractMISPObjectGenerator):
'''
Sandbox Analyzer
'''
def __init__(self, software: str, report: list, standalone: bool=True, **kwargs):
def __init__(self, software: str, report: list, **kwargs):
super(SBSignatureObject, self).__init__("sb-signature", **kwargs)
self._software = software
self._report = report

View File

@ -13,10 +13,10 @@ logger = logging.getLogger('pymisp')
class SSHAuthorizedKeysObject(AbstractMISPObjectGenerator):
def __init__(self, authorized_keys_path: Optional[Union[Path, str]]=None, authorized_keys_pseudofile: Optional[StringIO]=None, standalone: bool=True, **kwargs):
def __init__(self, authorized_keys_path: Optional[Union[Path, str]] = None, authorized_keys_pseudofile: Optional[StringIO] = None, **kwargs):
# PY3 way:
# super().__init__('file')
super(SSHAuthorizedKeysObject, self).__init__('ssh-authorized-keys', standalone=standalone, **kwargs)
super(SSHAuthorizedKeysObject, self).__init__('ssh-authorized-keys', **kwargs)
if authorized_keys_path:
with open(authorized_keys_path, 'r') as f:
self.__pseudofile = StringIO(f.read())
@ -28,7 +28,7 @@ class SSHAuthorizedKeysObject(AbstractMISPObjectGenerator):
self.generate_attributes()
def generate_attributes(self):
for l in self.__pseudofile:
if l.startswith('ssh') or l.startswith('ecdsa'):
key = l.split(' ')[1]
for line in self.__pseudofile:
if line.startswith('ssh') or line.startswith('ecdsa'):
key = line.split(' ')[1]
self.add_attribute('key', key)
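A hedged usage sketch of SSHAuthorizedKeysObject after the signature change above; the import path is assumed, and the key material is a placeholder:

from io import StringIO
from pymisp.tools import SSHAuthorizedKeysObject  # assumed import location

# Only lines starting with 'ssh' or 'ecdsa' are parsed; the second field is the key blob.
pseudofile = StringIO('ssh-ed25519 AAAAC3NzaC1lZDI1placeholder user@host\n')
obj = SSHAuthorizedKeysObject(authorized_keys_pseudofile=pseudofile)
print(obj.get_attributes_by_relation('key')[0].value)  # 'AAAAC3NzaC1lZDI1placeholder'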


@ -9,7 +9,7 @@ except ImportError:
has_misp_stix_converter = False
def load_stix(stix, distribution: int=3, threat_level_id: int=2, analysis: int=0):
def load_stix(stix, distribution: int = 3, threat_level_id: int = 2, analysis: int = 0):
'''Returns a MISPEvent object from a STIX package'''
if not has_misp_stix_converter:
raise Exception('You need to install misp_stix_converter: pip install git+https://github.com/MISP/MISP-STIX-Converter.git')
@ -18,7 +18,7 @@ def load_stix(stix, distribution: int=3, threat_level_id: int=2, analysis: int=0
threat_level_id=threat_level_id, analysis=analysis)
def make_stix_package(misp_event, to_json: bool=False, to_xml: bool=False):
def make_stix_package(misp_event, to_json: bool = False, to_xml: bool = False):
'''Returns a STIXPackage from a MISPEvent.
Optionally can return the package in json or xml.


@ -0,0 +1,29 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import zipfile
from io import BytesIO
from pathlib import Path
import requests
from ..abstract import resources_path
static_repo = "https://github.com/MISP/misp-objects/archive/main.zip"
def update_objects():
r = requests.get(static_repo)
zipped_repo = BytesIO(r.content)
with zipfile.ZipFile(zipped_repo, 'r') as myzip:
for name in myzip.namelist():
if not name.endswith('.json'):
continue
name_on_disk = name.replace('misp-objects-main', 'misp-objects')
path = resources_path / Path(name_on_disk)
if not path.parent.exists():
path.parent.mkdir(parents=True)
with path.open('wb') as f:
f.write(myzip.read(name))
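The new helper above mirrors the misp-objects main branch into the JSON templates bundled with the package. A hedged usage sketch, assuming the function ends up importable from pymisp.tools:

from pymisp.tools import update_objects  # assumed import path for the helper above

# Downloads the misp-objects 'main' zip from GitHub and rewrites the JSON files
# under the package's resources_path; needs network access.
update_objects()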


@ -13,10 +13,10 @@ faup = Faup()
class URLObject(AbstractMISPObjectGenerator):
def __init__(self, url: str, standalone: bool=True, **kwargs):
def __init__(self, url: str, **kwargs):
# PY3 way:
# super().__init__('url')
super(URLObject, self).__init__('url', standalone=standalone, **kwargs)
super(URLObject, self).__init__('url', **kwargs)
faup.decode(unquote_plus(url))
self.generate_attributes()
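A hedged usage sketch of URLObject after the change above; it needs the pyfaup-backed 'url' extra, and the 'domain' object relation queried below is an assumption based on the url object template:

from pymisp.tools import URLObject  # requires pyfaup ('url' extra)

url_obj = URLObject('https://www.example.com/index.php?id=1')
# faup decodes the URL into url/domain/host/... attributes.
print(url_obj.get_attributes_by_relation('domain')[0].value)  # expected: 'example.com'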


@ -17,8 +17,8 @@ class VehicleObject(AbstractMISPObjectGenerator):
'uk': "http://www.regcheck.org.uk/api/reg.asmx/Check"
}
def __init__(self, country: str, registration: str, username: str, standalone=True, **kwargs):
super(VehicleObject, self).__init__("vehicle", standalone=standalone, **kwargs)
def __init__(self, country: str, registration: str, username: str, **kwargs):
super(VehicleObject, self).__init__("vehicle", **kwargs)
self._country = country
self._registration = registration
self._username = username


@ -24,10 +24,10 @@ class VTReportObject(AbstractMISPObjectGenerator):
:indicator: IOC to search VirusTotal for
'''
def __init__(self, apikey: str, indicator: str, vt_proxies: Optional[dict]=None, standalone: bool=True, **kwargs):
def __init__(self, apikey: str, indicator: str, vt_proxies: Optional[dict] = None, **kwargs):
# PY3 way:
# super().__init__("virustotal-report")
super(VTReportObject, self).__init__("virustotal-report", standalone=standalone, **kwargs)
super(VTReportObject, self).__init__("virustotal-report", **kwargs)
indicator = indicator.strip()
self._resource_type = self.__validate_resource(indicator)
if self._resource_type:
@ -81,7 +81,7 @@ class VTReportObject(AbstractMISPObjectGenerator):
report = requests.get(url, params=params)
report_json = report.json()
if report_json["response_code"] == 1:
return report
return report_json
else:
error_msg = "{}: {}".format(resource, report_json["verbose_msg"])
raise InvalidMISPObject(error_msg)

pyproject.toml (new file)

@ -0,0 +1,79 @@
[tool.poetry]
name = "pymisp"
version = "2.4.131"
description = "Python API for MISP."
authors = ["Raphaël Vinot <raphael.vinot@circl.lu>"]
license = "BSD-2-Clause"
repository = "https://github.com/MISP/PyMISP"
documentation = "http://pymisp.readthedocs.io"
readme = "README.md"
classifiers=[
'License :: OSI Approved :: BSD License',
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Operating System :: POSIX :: Linux',
'Intended Audience :: Science/Research',
'Intended Audience :: Telecommunications Industry',
'Intended Audience :: Information Technology',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Topic :: Security',
'Topic :: Internet'
]
include = [
"CHANGELOG.txt",
"README.md",
"pymisp/data/*.json",
"pymisp/data/misp-objects/schema_objects.json",
"pymisp/data/misp-objects/schema_relationships.json",
"pymisp/data/misp-objects/objects/*/definition.json",
"pymisp/data/misp-objects/relationships/definition.json",
"pymisp/tools/pdf_fonts/Noto_TTF/*"
]
[tool.poetry.urls]
"Bug Tracker" = "https://github.com/MISP/PyMISP/issues"
"Source" = "https://github.com/MISP/PyMISP"
[tool.poetry.dependencies]
python = "^3.6"
requests = "^2.22.0"
python-dateutil = "^2.8.1"
jsonschema = "^3.2.0"
deprecated = "^1.2.7"
python-magic = {version = "^0.4.15", optional = true}
pydeep = {version = "^0.4", optional = true}
lief = {version = "^0.10.1", optional = true}
beautifulsoup4 = {version = "^4.8.2", optional = true}
validators = {version = "^0.14.2", optional = true}
sphinx-autodoc-typehints = {version = "^1.10.3", optional = true}
recommonmark = {version = "^0.6.0", optional = true}
reportlab = {version = "^3.5.34", optional = true}
pyfaup = {version = "^1.2", optional = true}
[tool.poetry.extras]
fileobjects = ['python-magic', 'pydeep', 'lief']
openioc = ['beautifulsoup4']
virustotal = ['validators']
docs = ['sphinx-autodoc-typehints', 'recommonmark']
pdfexport = ['reportlab']
url = ['pyfaup']
[tool.poetry.dev-dependencies]
nose = "^1.3.7"
coveralls = "^1.11.1"
codecov = "^2.0.15"
requests-mock = "^1.7.0"
mypy = "^0.761"
flake8 = "^3.7.9"
ipython = "^7.12.0"
jupyterlab = "^1.2.6"
[build-system]
requires = ["poetry_core>=1.0", "setuptools"]
build-backend = "poetry.core.masonry.api"

File diff suppressed because it is too large.


@ -30,7 +30,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "19",
"template_version": "23",
"uuid": "a"
},
{


@ -30,7 +30,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "19",
"template_version": "23",
"uuid": "a"
},
{
@ -55,7 +55,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "19",
"template_version": "23",
"uuid": "b"
}
]


@ -2631,7 +2631,7 @@
"uuid": "5a3d0143-c300-4118-8afe-4a2d950d210f"
}
],
"comment": "Win32/Sednit.AX\t",
"comment": "Win32/Sednit.AX",
"deleted": false,
"description": "File object describing a file with meta-information",
"distribution": "5",


@ -2634,7 +2634,7 @@
"uuid": "5a3d0143-c300-4118-8afe-4a2d950d210f"
}
],
"comment": "Win32/Sednit.AX\t",
"comment": "Win32/Sednit.AX",
"deleted": false,
"description": "File object describing a file with meta-information",
"distribution": "5",

tests/test_fileobject.py (new file)

@ -0,0 +1,19 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import unittest
import json
from pymisp.tools import FileObject
import pathlib
class TestFileObject(unittest.TestCase):
def test_mimeType(self):
file_object = FileObject(filepath=pathlib.Path(__file__))
attributes = json.loads(file_object.to_json())['Attribute']
mime = next(attr for attr in attributes if attr['object_relation'] == 'mimetype')
# was "Python script, ASCII text executable"
# libmagic on linux: 'text/x-python'
# libmagic on os x: 'text/x-script.python'
self.assertEqual(mime['value'][:7], 'text/x-')
self.assertEqual(mime['value'][-6:], 'python')


@ -9,8 +9,10 @@ import glob
import hashlib
from datetime import date, datetime
from pymisp import MISPEvent, MISPSighting, MISPTag, MISPOrganisation
from pymisp import (MISPEvent, MISPSighting, MISPTag, MISPOrganisation,
MISPObject)
from pymisp.exceptions import InvalidMISPObject
from pymisp.tools import GitVulnFinderObject
class TestMISPEvent(unittest.TestCase):
@ -29,6 +31,7 @@ class TestMISPEvent(unittest.TestCase):
def test_simple(self):
with open('tests/mispevent_testfiles/simple.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_event(self):
@ -36,12 +39,14 @@ class TestMISPEvent(unittest.TestCase):
self.mispevent.publish()
with open('tests/mispevent_testfiles/event.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_loadfile(self):
self.mispevent.load_file('tests/mispevent_testfiles/event.json')
with open('tests/mispevent_testfiles/event.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_event_tag(self):
@ -53,6 +58,7 @@ class TestMISPEvent(unittest.TestCase):
self.mispevent.add_tag(new_tag)
with open('tests/mispevent_testfiles/event_tags.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_attribute(self):
@ -65,6 +71,7 @@ class TestMISPEvent(unittest.TestCase):
self.assertEqual(attr_tags[0].name, 'osint')
with open('tests/mispevent_testfiles/attribute.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
# Fake setting an attribute ID for testing
self.mispevent.attributes[0].id = 42
@ -94,6 +101,7 @@ class TestMISPEvent(unittest.TestCase):
self.assertEqual(self.mispevent.objects[0].references[0].relationship_type, 'baz')
with open('tests/mispevent_testfiles/event_obj_attr_tag.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
@unittest.skip("Not supported on MISP: https://github.com/MISP/MISP/issues/2638 - https://github.com/MISP/PyMISP/issues/168")
@ -116,6 +124,7 @@ class TestMISPEvent(unittest.TestCase):
self.assertEqual(attribute.malware_binary, pseudofile)
with open('tests/mispevent_testfiles/malware.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_existing_malware(self):
@ -173,6 +182,7 @@ class TestMISPEvent(unittest.TestCase):
self.mispevent.objects[1].uuid = 'b'
with open('tests/mispevent_testfiles/event_obj_def_param.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_obj_default_values(self):
@ -189,8 +199,22 @@ class TestMISPEvent(unittest.TestCase):
self.mispevent.objects[0].uuid = 'a'
with open('tests/mispevent_testfiles/def_param.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_obj_references_export(self):
self.init_event()
obj1 = MISPObject(name="file")
obj2 = MISPObject(name="url", standalone=False)
obj1.add_reference(obj2, "downloads")
obj2.add_reference(obj1, "downloaded-by")
self.assertFalse("ObjectReference" in obj1.jsonable())
self.assertTrue("ObjectReference" in obj2.jsonable())
self.mispevent.add_object(obj1)
obj2.standalone = True
self.assertTrue("ObjectReference" in obj1.jsonable())
self.assertFalse("ObjectReference" in obj2.jsonable())
def test_event_not_edited(self):
self.mispevent.load_file('tests/mispevent_testfiles/existing_event.json')
self.assertFalse(self.mispevent.edited)
@ -293,6 +317,7 @@ class TestMISPEvent(unittest.TestCase):
self.mispevent.objects[0].uuid = 'a'
with open('tests/mispevent_testfiles/misp_custom_obj.json', 'r') as f:
ref_json = json.load(f)
del self.mispevent.uuid
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
def test_first_last_seen(self):
@ -347,6 +372,15 @@ class TestMISPEvent(unittest.TestCase):
subset = set(entry['categories']).issubset(me.describe_types['categories'])
self.assertTrue(subset, f'{t_json["name"]} - {obj_relation}')
def test_git_vuln_finder(self):
with open('tests/git-vuln-finder-quagga.json') as f:
dump = json.load(f)
for vuln in dump.values():
author = vuln['author']
vuln_finder = GitVulnFinderObject(vuln)
self.assertEqual(vuln_finder.get_attributes_by_relation('author')[0].value, author)
if __name__ == '__main__':
unittest.main()


@ -26,7 +26,7 @@ logger = logging.getLogger('pymisp')
try:
from pymisp import ExpandedPyMISP, MISPEvent, MISPOrganisation, MISPUser, Distribution, ThreatLevel, Analysis, MISPObject, MISPAttribute, MISPSighting, MISPShadowAttribute, MISPTag, MISPSharingGroup, MISPFeed, MISPServer, MISPUserSetting
from pymisp import register_user, PyMISP, MISPEvent, MISPOrganisation, MISPUser, Distribution, ThreatLevel, Analysis, MISPObject, MISPAttribute, MISPSighting, MISPShadowAttribute, MISPTag, MISPSharingGroup, MISPFeed, MISPServer, MISPUserSetting, MISPEventBlocklist
from pymisp.tools import CSVLoader, DomainIPObject, ASNObject, GenericObjectGenerator
from pymisp.exceptions import MISPServerError
except ImportError:
@ -57,7 +57,8 @@ class TestComprehensive(unittest.TestCase):
def setUpClass(cls):
cls.maxDiff = None
# Connect as admin
cls.admin_misp_connector = ExpandedPyMISP(url, key, verifycert, debug=False)
cls.admin_misp_connector = PyMISP(url, key, verifycert, debug=False)
cls.admin_misp_connector.set_server_setting('Security.allow_self_registration', True, force=True)
if not fast_mode:
r = cls.admin_misp_connector.update_misp()
print(r)
@ -76,7 +77,7 @@ class TestComprehensive(unittest.TestCase):
user.email = 'testusr@user.local'
user.org_id = cls.test_org.id
cls.test_usr = cls.admin_misp_connector.add_user(user, pythonify=True)
cls.user_misp_connector = ExpandedPyMISP(url, cls.test_usr.authkey, verifycert, debug=True)
cls.user_misp_connector = PyMISP(url, cls.test_usr.authkey, verifycert, debug=True)
cls.user_misp_connector.toggle_global_pythonify()
# Creates a publisher
user = MISPUser()
@ -84,14 +85,14 @@ class TestComprehensive(unittest.TestCase):
user.org_id = cls.test_org.id
user.role_id = 4
cls.test_pub = cls.admin_misp_connector.add_user(user, pythonify=True)
cls.pub_misp_connector = ExpandedPyMISP(url, cls.test_pub.authkey, verifycert)
cls.pub_misp_connector = PyMISP(url, cls.test_pub.authkey, verifycert)
# Creates a user that can accept a delegation request
user = MISPUser()
user.email = 'testusr@delegate.recipient.local'
user.org_id = cls.test_org_delegate.id
user.role_id = 2
cls.test_usr_delegate = cls.admin_misp_connector.add_user(user, pythonify=True)
cls.delegate_user_misp_connector = ExpandedPyMISP(url, cls.test_usr_delegate.authkey, verifycert, debug=False)
cls.delegate_user_misp_connector = PyMISP(url, cls.test_usr_delegate.authkey, verifycert, debug=False)
cls.delegate_user_misp_connector.toggle_global_pythonify()
if not fast_mode:
# Update all json stuff
@ -100,6 +101,7 @@ class TestComprehensive(unittest.TestCase):
cls.admin_misp_connector.update_noticelists()
cls.admin_misp_connector.update_warninglists()
cls.admin_misp_connector.update_taxonomies()
cls.admin_misp_connector.load_default_feeds()
@classmethod
def tearDownClass(cls):
@ -292,6 +294,24 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_event(second)
self.admin_misp_connector.delete_event(third)
def test_search_objects(self):
'''Search for objects'''
try:
first = self.create_simple_event()
obj = MISPObject('file')
obj.add_attribute('filename', 'foo')
first.add_object(obj)
first = self.user_misp_connector.add_event(first)
logger = logging.getLogger('pymisp')
logger.setLevel(logging.DEBUG)
objects = self.user_misp_connector.search(controller='objects',
object_name='file', pythonify=True)
self.assertEqual(len(objects), 1)
self.assertEqual(objects[0].attributes[0].value, 'foo')
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
def test_search_type_attribute(self):
'''Search multiple attributes, search attributes with specific types'''
try:
@ -516,17 +536,69 @@ class TestComprehensive(unittest.TestCase):
# Delete event
self.admin_misp_connector.delete_event(first)
def test_delete_by_uuid(self):
def test_delete_with_update(self):
try:
first = self.create_simple_event()
obj = MISPObject('file')
obj.add_attribute('filename', 'foo')
first.add_object(obj)
first = self.user_misp_connector.add_event(first)
first.attributes[0].deleted = True
deleted_attribute = self.user_misp_connector.update_attribute(first.attributes[0], pythonify=True)
self.assertTrue(deleted_attribute.deleted)
first.objects[0].deleted = True
deleted_object = self.user_misp_connector.update_object(first.objects[0], pythonify=True)
self.assertTrue(deleted_object.deleted)
# Get event with deleted entries
first = self.user_misp_connector.get_event(first, deleted=True, pythonify=True)
self.assertTrue(first.attributes[0].deleted)
self.assertTrue(first.objects[0].deleted)
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
def test_get_non_exists_event(self):
event = self.user_misp_connector.get_event(0)  # non-existent id
self.assertEqual(event['errors'][0], 404)
event = self.user_misp_connector.get_event("ab2b6e28-fda5-4282-bf60-22b81de77851") # non exists uuid
self.assertEqual(event['errors'][0], 404)
def test_delete_by_uuid(self):
try:
first = self.create_simple_event()
obj = MISPObject('file')
obj.add_attribute('filename', 'foo')
first.add_object(obj)
obj = MISPObject('file')
obj.add_attribute('filename', 'bar')
first.add_object(obj)
first = self.user_misp_connector.add_event(first)
r = self.user_misp_connector.delete_attribute(first.attributes[0].uuid)
self.assertEqual(r['message'], 'Attribute deleted.')
r = self.user_misp_connector.delete_object(first.objects[0].uuid)
self.assertEqual(r['message'], 'Object deleted')
# Test deleted search
r = self.user_misp_connector.search(event_id=first.id, deleted=[0, 1], pythonify=True)
self.assertTrue(isinstance(r[0], MISPEvent))
self.assertEqual(len(r[0].objects), 2)
self.assertTrue(r[0].objects[0].deleted)
self.assertFalse(r[0].objects[1].deleted)
self.assertEqual(len(r[0].attributes), 1)
self.assertTrue(r[0].attributes[0].deleted)
# Test deleted get
r = self.user_misp_connector.get_event(first, deleted=True, pythonify=True)
self.assertTrue(isinstance(r, MISPEvent))
self.assertEqual(len(r.objects), 2)
self.assertTrue(r.objects[0].deleted)
self.assertFalse(r.objects[1].deleted)
self.assertEqual(len(r.attributes), 1)
self.assertTrue(r.attributes[0].deleted)
r = self.user_misp_connector.delete_event(first.uuid)
self.assertEqual(r['message'], 'Event deleted.')
finally:
@ -770,7 +842,7 @@ class TestComprehensive(unittest.TestCase):
events = self.user_misp_connector.search(eventid=second.id, enforce_warninglist=True)
self.assertEqual(len(events), 1)
self.assertEqual(events[0].id, second.id)
self.assertEqual(len(events[0].attributes), 3)
self.assertEqual(len(events[0].attributes), 4)
response = self.admin_misp_connector.toggle_warninglist(warninglist_name='%dns resolv%') # disable ipv4 DNS.
self.assertDictEqual(response, {'saved': True, 'success': '3 warninglist(s) toggled'})
@ -812,6 +884,27 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_event(second)
def test_extend_event(self):
first = self.create_simple_event()
first.info = 'parent event'
first.add_tag('tlp:amber___test')
first.set_date('2018-09-01')
second = self.create_simple_event()
second.info = 'event extension'
second.add_tag('tlp:amber___test')
second.set_date('2018-09-01')
second.add_attribute('ip-src', '9.9.9.9')
try:
first = self.user_misp_connector.add_event(first)
second = self.user_misp_connector.add_event(second)
first_extended = self.user_misp_connector.update_event({'extends_uuid': second.uuid}, event_id=first, pythonify=True)
self.assertTrue(isinstance(first_extended, MISPEvent), first_extended)
self.assertEqual(first_extended.extends_uuid, second.uuid)
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_event(second)
def test_edit_attribute(self):
first = self.create_simple_event()
try:
@ -994,8 +1087,9 @@ class TestComprehensive(unittest.TestCase):
stix = self.user_misp_connector.search(return_format='stix', eventid=first.id)
self.assertTrue(stix['related_packages'][0]['package']['incidents'][0]['related_indicators']['indicators'][0]['indicator']['observable']['object']['properties']['address_value']['value'], '8.8.8.8')
stix2 = self.user_misp_connector.search(return_format='stix2', eventid=first.id)
print(json.dumps(stix2, indent=2))
self.assertEqual(stix2['objects'][-1]['pattern'], "[network-traffic:src_ref.type = 'ipv4-addr' AND network-traffic:src_ref.value = '8.8.8.8']")
stix_xml = self.user_misp_connector.search(return_format='stix-xml', eventid=first.id)
self.assertTrue('<AddressObj:Address_Value condition="Equals">8.8.8.8</AddressObj:Address_Value>' in stix_xml)
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
@ -1034,18 +1128,30 @@ class TestComprehensive(unittest.TestCase):
# Test generic Tag methods
r = self.admin_misp_connector.tag(second, 'generic_tag_test')
self.assertTrue(r['message'].endswith(f'successfully attached to Event({second.id}).'), r['message'])
self.assertTrue('successfully' in r['message'].lower() and f'Event ({second.id})' in r['message'], r['message'])
second = self.user_misp_connector.get_event(second.id, pythonify=True)
self.assertTrue('generic_tag_test' == second.tags[0].name)
r = self.admin_misp_connector.untag(second, 'generic_tag_test')
self.assertTrue(r['message'].endswith(f'successfully removed from Event({second.id}).'), r['message'])
second = self.user_misp_connector.get_event(second.id, pythonify=True)
self.assertFalse(second.tags)
# NOTE: object tagging not supported yet
# r = self.admin_misp_connector.tag(second.objects[0].uuid, 'generic_tag_test')
# self.assertTrue(r['message'].endswith(f'successfully attached to Object({second.objects[0].id}).'), r['message'])
# r = self.admin_misp_connector.untag(second.objects[0].uuid, 'generic_tag_test')
# self.assertTrue(r['message'].endswith(f'successfully removed from Object({second.objects[0].id}).'), r['message'])
r = self.admin_misp_connector.tag(second.objects[0].attributes[0].uuid, 'generic_tag_test')
self.assertTrue(r['message'].endswith(f'successfully attached to Attribute({second.objects[0].attributes[0].id}).'), r['message'])
self.assertTrue('successfully' in r['message'].lower() and f'Attribute ({second.objects[0].attributes[0].id})' in r['message'], r['message'])
attr = self.user_misp_connector.get_attribute(second.objects[0].attributes[0].uuid, pythonify=True)
self.assertTrue('generic_tag_test' == attr.tags[0].name)
r = self.admin_misp_connector.untag(second.objects[0].attributes[0].uuid, 'generic_tag_test')
self.assertTrue(r['message'].endswith(f'successfully removed from Attribute({second.objects[0].attributes[0].id}).'), r['message'])
second = self.user_misp_connector.get_event(second.id, pythonify=True)
for tag in second.objects[0].attributes[0].tags:
self.assertFalse('generic_tag_test' == tag.name)
attr = self.user_misp_connector.get_attribute(second.objects[0].attributes[0].uuid, pythonify=True)
self.assertFalse(attr.tags)
# Delete tag to avoid polluting the db
tags = self.admin_misp_connector.tags(pythonify=True)
@ -1206,11 +1312,11 @@ class TestComprehensive(unittest.TestCase):
# self.assertEqual(r['errors'][1]['message'], 'Invalid Tag. This tag can only be set by a fixed organisation.')
self.assertEqual(r['errors'][1]['message'], 'Invalid Target.')
r = self.user_misp_connector.tag(first, tag_org_restricted)
self.assertEqual(r['name'], f'Global tag {tag_org_restricted.name}({tag_org_restricted.id}) successfully attached to Event({first.id}).')
self.assertTrue('successfully' in r['message'].lower() and f'Event ({first.id})' in r['message'], r['message'])
r = self.pub_misp_connector.tag(first.attributes[0], tag_user_restricted)
self.assertEqual(r['errors'][1]['message'], 'Invalid Tag. This tag can only be set by a fixed user.')
self.assertIn('Invalid Tag. This tag can only be set by a fixed user.', r['errors'][1]['errors'])
r = self.user_misp_connector.tag(first.attributes[0], tag_user_restricted)
self.assertEqual(r['name'], f'Global tag {tag_user_restricted.name}({tag_user_restricted.id}) successfully attached to Attribute({first.attributes[0].id}).')
self.assertTrue('successfully' in r['message'].lower() and f'Attribute ({first.attributes[0].id})' in r['message'], r['message'])
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
@ -1761,6 +1867,10 @@ class TestComprehensive(unittest.TestCase):
first = self.create_simple_event()
o = first.add_object(name='file')
o.add_attribute('filename', value='foo2.exe')
second_object = MISPObject('file')
second_object.add_attribute("tlsh", value='92a4b4a3d342a21fe1147474c19c9ab6a01717713a0248a2bb15affce77c1c14a79b93',
category="Payload delivery", to_ids=True, distribution=4, sharing_group_id=sharing_group.id)
try:
first = self.user_misp_connector.add_event(first)
first = self.admin_misp_connector.change_sharing_group_on_entity(first, sharing_group.id, pythonify=True)
@ -1771,6 +1881,14 @@ class TestComprehensive(unittest.TestCase):
first_attribute = self.admin_misp_connector.change_sharing_group_on_entity(first.attributes[0], sharing_group.id, pythonify=True)
self.assertEqual(first_attribute.distribution, 4)
self.assertEqual(first_attribute.sharing_group_id, int(sharing_group.id))
# manual create
second_object = self.admin_misp_connector.add_object(first.id, second_object, pythonify=True)
self.assertEqual(second_object.attributes[0].sharing_group_id, int(sharing_group.id))
# manual update
first_object.add_attribute("tlsh", value='92a4b4a3d342a21fe1147474c19c9ab6a01717713a0248a2bb15affce77c1c14a79b93',
category="Payload delivery", to_ids=True, distribution=4, sharing_group_id=sharing_group.id)
first_object = self.admin_misp_connector.update_object(first_object, pythonify=True)
self.assertEqual(first_object.attributes[-1].sharing_group_id, int(sharing_group.id))
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
@ -1840,6 +1958,20 @@ class TestComprehensive(unittest.TestCase):
self.assertFalse(feed.enabled)
feed = self.admin_misp_connector.disable_feed_cache(botvrij.id, pythonify=True)
self.assertFalse(feed.enabled)
# Test enable csv feed - https://github.com/MISP/PyMISP/issues/574
feeds = self.admin_misp_connector.feeds(pythonify=True)
for feed in feeds:
if feed.name == 'blockrules of rules.emergingthreats.net':
e_thread_csv_feed = feed
break
updated_feed = self.admin_misp_connector.enable_feed(e_thread_csv_feed, pythonify=True)
self.assertEqual(updated_feed.settings, e_thread_csv_feed.settings)
updated_feed = self.admin_misp_connector.disable_feed(e_thread_csv_feed, pythonify=True)
self.assertEqual(updated_feed.settings, e_thread_csv_feed.settings)
# Test partial update
updated_feed = self.admin_misp_connector.enable_feed(e_thread_csv_feed.id, pythonify=True)
self.assertEqual(updated_feed.settings, e_thread_csv_feed.settings)
def test_servers(self):
# add
@ -1886,7 +2018,7 @@ class TestComprehensive(unittest.TestCase):
try:
test_roles_user = self.admin_misp_connector.add_user(user, pythonify=True)
test_tag = self.admin_misp_connector.add_tag(tag, pythonify=True)
test_roles_user_connector = ExpandedPyMISP(url, test_roles_user.authkey, verifycert, debug=False)
test_roles_user_connector = PyMISP(url, test_roles_user.authkey, verifycert, debug=False)
test_roles_user_connector.toggle_global_pythonify()
# ===== Read Only
self.admin_misp_connector.update_user({'role_id': 6}, test_roles_user)
@ -1953,6 +2085,8 @@ class TestComprehensive(unittest.TestCase):
self.assertEqual(e.org.name, 'Test Org - delegate')
r = self.delegate_user_misp_connector.delete_event(e)
self.assertEqual(r['message'], 'Event deleted.', r)
# Change base_event UUID so we can add it
base_event.uuid = str(uuid4())
e = test_roles_user_connector.add_event(base_event)
delegation = test_roles_user_connector.delegate_event(e, self.test_org_delegate)
r = test_roles_user_connector.discard_event_delegation(delegation.id)
@ -2014,6 +2148,11 @@ class TestComprehensive(unittest.TestCase):
self.assertTrue(isinstance(user_settings, list))
# Test if publish_alert_filter works
# # Enable autoalert on admin
self.admin_misp_connector._current_user.autoalert = True
self.admin_misp_connector._current_user.termsaccepted = True
self.user_misp_connector.update_user(self.admin_misp_connector._current_user)
first = self.admin_misp_connector.add_event(first, pythonify=True)
second = self.admin_misp_connector.add_event(second, pythonify=True)
r = self.user_misp_connector.change_user_password('Password1234')
@ -2079,7 +2218,6 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_event(second)
def test_first_last_seen(self):
local_tz = datetime.now(timezone.utc).astimezone().tzinfo
event = MISPEvent()
event.info = 'Test First Last seen'
event.add_attribute('ip-dst', '8.8.8.8', first_seen='2020-01-04', last_seen='2020-01-04T12:30:34.323242+0800')
@ -2090,7 +2228,7 @@ class TestComprehensive(unittest.TestCase):
try:
first = self.admin_misp_connector.add_event(event, pythonify=True)
# Simple attribute
self.assertEqual(first.attributes[0].first_seen, datetime(2020, 1, 4, 0, 0, tzinfo=local_tz))
self.assertEqual(first.attributes[0].first_seen, datetime(2020, 1, 4, 0, 0).astimezone())
self.assertEqual(first.attributes[0].last_seen, datetime(2020, 1, 4, 4, 30, 34, 323242, tzinfo=timezone.utc))
# Object
@ -2098,8 +2236,8 @@ class TestComprehensive(unittest.TestCase):
self.assertEqual(first.objects[0].last_seen, datetime(2020, 1, 27, 17, 48, 20, tzinfo=timezone.utc))
# Object attribute
self.assertEqual(first.objects[0].attributes[0].first_seen, datetime(2022, 1, 30, 0, 0, tzinfo=local_tz))
self.assertEqual(first.objects[0].attributes[0].last_seen, datetime(2022, 2, 23, 0, 0, tzinfo=local_tz))
self.assertEqual(first.objects[0].attributes[0].first_seen, datetime(2022, 1, 30, 0, 0).astimezone())
self.assertEqual(first.objects[0].attributes[0].last_seen, datetime(2022, 2, 23, 0, 0).astimezone())
# Update values
# Attribute in full event
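For context on the assertion changes just above: local_tz is captured from datetime.now(), so it carries today's UTC offset, which can differ from the offset in force on the asserted date when DST is involved; .astimezone() on a naive datetime (Python 3.6+) applies the offset valid for that date. A small illustration, independent of PyMISP:

from datetime import datetime, timezone

local_tz = datetime.now(timezone.utc).astimezone().tzinfo  # offset valid today
naive = datetime(2020, 1, 4, 0, 0)
print(naive.replace(tzinfo=local_tz))  # may attach the wrong (current) offset
print(naive.astimezone())              # attaches the offset in force on 2020-01-04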
@ -2123,6 +2261,523 @@ class TestComprehensive(unittest.TestCase):
finally:
self.admin_misp_connector.delete_event(first)
def test_registrations(self):
r = register_user(url, 'self_register@user.local', organisation=self.test_org,
org_name=self.test_org.name, verify=verifycert)
self.assertTrue(r['saved'])
r = register_user(url, 'discard@tesst.de', verify=verifycert)
self.assertTrue(r['saved'])
registrations = self.admin_misp_connector.user_registrations(pythonify=True)
self.assertTrue(len(registrations), 2)
self.assertEqual(registrations[0].data['email'], 'self_register@user.local')
self.assertEqual(registrations[0].data['org_name'], 'Test Org')
self.assertEqual(registrations[1].data['email'], 'discard@tesst.de')
m = self.admin_misp_connector.accept_user_registration(registrations[0], unsafe_fallback=True)
self.assertTrue(m['saved'])
# delete new user
for user in self.admin_misp_connector.users(pythonify=True):
if user.email == registrations[0].data['email']:
self.admin_misp_connector.delete_user(user)
break
# Expected: accept registration fails because the orgname is missing
m = self.admin_misp_connector.accept_user_registration(registrations[1], unsafe_fallback=True)
self.assertEqual(m['errors'][1]['message'], 'No organisation selected. Supply an Organisation ID')
m = self.admin_misp_connector.discard_user_registration(registrations[1].id)
self.assertEqual(m['name'], '1 registration(s) discarded.')
def test_search_workflow(self):
first = self.create_simple_event()
first.add_attribute('domain', 'google.com')
tag = MISPTag()
tag.name = 'my_tag'
try:
# Note: attribute 0 doesn't matter
# Attribute 1 = google.com, no tag
# Init tag and event
tag = self.admin_misp_connector.add_tag(tag, pythonify=True)
self.assertEqual(tag.name, 'my_tag')
first = self.user_misp_connector.add_event(first, pythonify=True)
time.sleep(10)
# Add tag to attribute 1, add attribute 2, update
first.attributes[1].add_tag(tag)
first.add_attribute('domain', 'google.fr')
# Attribute 1 = google.com, tag
# Attribute 2 = google.fr, no tag
first = self.user_misp_connector.update_event(first, pythonify=True)
self.assertEqual(first.attributes[1].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[2].tags, [])
updated_attrs = self.user_misp_connector.search(controller='attributes', eventid=first.id, timestamp='5s', pythonify=True)
# Get two attributes, 0 (google.com) has a tag, 1 (google.fr) doesn't
self.assertEqual(len(updated_attrs), 2)
self.assertEqual(updated_attrs[0].tags[0].name, 'my_tag')
self.assertEqual(updated_attrs[1].value, 'google.fr')
self.assertEqual(updated_attrs[1].tags, [])
# Get the metadata only of the event
first_meta_only = self.user_misp_connector.search(eventid=first.id, metadata=True, pythonify=True)
# Add tag to attribute 1 (google.fr)
attr_to_update = updated_attrs[1]
attr_to_update.add_tag(tag)
# attr_to_update.pop('timestamp')
# Add new attribute to event with metadata only
first_meta_only[0].add_attribute('domain', 'google.lu')
# Add tag to new attribute
first_meta_only[0].attributes[0].add_tag('my_tag')
# Re-add attribute 1 (google.fr), newly tagged
first_meta_only[0].add_attribute(**attr_to_update)
# When we push, all the attributes should be tagged
first = self.user_misp_connector.update_event(first_meta_only[0], pythonify=True)
self.assertEqual(first.attributes[1].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[2].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[3].tags[0].name, 'my_tag')
finally:
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_tag(tag)
def test_search_workflow_ts(self):
first = self.create_simple_event()
first.add_attribute('domain', 'google.com')
tag = MISPTag()
tag.name = 'my_tag'
try:
# Note: attribute 0 doesn't matter
# Attribute 1 = google.com, no tag
# Init tag and event
tag = self.admin_misp_connector.add_tag(tag, pythonify=True)
self.assertEqual(tag.name, 'my_tag')
first = self.user_misp_connector.add_event(first, pythonify=True)
time.sleep(10)
# Add tag to attribute 1, add attribute 2, update
first.attributes[1].add_tag(tag)
first.add_attribute('domain', 'google.fr')
# Attribute 1 = google.com, tag
# Attribute 2 = google.fr, no tag
first = self.user_misp_connector.update_event(first, pythonify=True)
self.assertEqual(first.attributes[1].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[2].tags, [])
updated_attrs = self.user_misp_connector.search(controller='attributes', eventid=first.id, timestamp=first.timestamp.timestamp(), pythonify=True)
# Get two attributes, 0 (google.com) has a tag, 1 (google.fr) doesn't
self.assertEqual(len(updated_attrs), 2)
self.assertEqual(updated_attrs[0].tags[0].name, 'my_tag')
self.assertEqual(updated_attrs[1].value, 'google.fr')
self.assertEqual(updated_attrs[1].tags, [])
# Get the metadata only of the event
first_meta_only = self.user_misp_connector.search(eventid=first.id, metadata=True, pythonify=True)
# Add tag to attribute 1 (google.fr)
attr_to_update = updated_attrs[1]
attr_to_update.add_tag(tag)
# attr_to_update.pop('timestamp')
# Add new attribute to event with metadata only
first_meta_only[0].add_attribute('domain', 'google.lu')
# Add tag to new attribute
first_meta_only[0].attributes[0].add_tag('my_tag')
# Re-add attribute 1 (google.fr), newly tagged
first_meta_only[0].add_attribute(**attr_to_update)
# When we push, all the attributes should be tagged
first = self.user_misp_connector.update_event(first_meta_only[0], pythonify=True)
self.assertEqual(first.attributes[1].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[2].tags[0].name, 'my_tag')
self.assertEqual(first.attributes[3].tags[0].name, 'my_tag')
finally:
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_tag(tag)
def test_blocklists(self):
first = self.create_simple_event()
second = self.create_simple_event()
second.Orgc = self.test_org
to_delete = {'bl_events': [], 'bl_organisations': []}
try:
# test events BL
ebl = self.admin_misp_connector.add_event_blocklist(uuids=[first.uuid])
self.assertEqual(ebl['result']['successes'][0], first.uuid, ebl)
bl_events = self.admin_misp_connector.event_blocklists(pythonify=True)
for ble in bl_events:
if ble.event_uuid == first.uuid:
to_delete['bl_events'].append(ble)
break
else:
raise Exception('Unable to find UUID in Events blocklist')
first = self.user_misp_connector.add_event(first, pythonify=True)
self.assertEqual(first['errors'][1]['message'], 'Could not add Event', first)
ble.comment = 'This is a test'
ble.event_info = 'foo'
ble.event_orgc = 'bar'
ble = self.admin_misp_connector.update_event_blocklist(ble, pythonify=True)
self.assertEqual(ble.comment, 'This is a test')
r = self.admin_misp_connector.delete_event_blocklist(ble)
self.assertTrue(r['success'])
# test Org BL
obl = self.admin_misp_connector.add_organisation_blocklist(uuids=self.test_org.uuid)
self.assertEqual(obl['result']['successes'][0], self.test_org.uuid, obl)
bl_orgs = self.admin_misp_connector.organisation_blocklists(pythonify=True)
for blo in bl_orgs:
if blo.org_uuid == self.test_org.uuid:
to_delete['bl_organisations'].append(blo)
break
else:
raise Exception('Unable to find UUID in Orgs blocklist')
first = self.user_misp_connector.add_event(first, pythonify=True)
self.assertEqual(first['errors'][1]['message'], 'Could not add Event', first)
blo.comment = 'This is a test'
blo.org_name = 'bar'
blo = self.admin_misp_connector.update_organisation_blocklist(blo, pythonify=True)
self.assertEqual(blo.org_name, 'bar')
r = self.admin_misp_connector.delete_organisation_blocklist(blo)
self.assertTrue(r['success'])
finally:
for ble in to_delete['bl_events']:
self.admin_misp_connector.delete_event_blocklist(ble)
for blo in to_delete['bl_organisations']:
self.admin_misp_connector.delete_organisation_blocklist(blo)
@unittest.skip("Internal use only")
def missing_methods(self):
skip = [
"attributes/download",
"attributes/add_attachment",
"attributes/add_threatconnect",
"attributes/editField",
"attributes/viewPicture",
"attributes/restore",
"attributes/deleteSelected",
"attributes/editSelected",
"attributes/search",
"attributes/searchAlternate",
"attributes/checkComposites",
"attributes/downloadAttachment",
"attributes/returnAttributes",
"attributes/text",
"attributes/rpz",
"attributes/bro",
"attributes/reportValidationIssuesAttributes",
"attributes/generateCorrelation",
"attributes/getMassEditForm",
"attributes/fetchViewValue",
"attributes/fetchEditForm",
"attributes/attributeReplace",
"attributes/downloadSample",
"attributes/pruneOrphanedAttributes",
"attributes/checkOrphanedAttributes",
"attributes/updateAttributeValues",
"attributes/hoverEnrichment",
"attributes/addTag",
"attributes/removeTag",
"attributes/toggleCorrelation", # Use update attribute
"attributes/toggleToIDS", # Use update attribute
"attributes/checkAttachments",
"attributes/exportSearch",
'dashboards',
'decayingModel',
"eventBlocklists/massDelete",
"eventDelegations/view",
"eventDelegations/index",
"eventGraph/view",
"eventGraph/add",
"eventGraph/delete",
"events/filterEventIndex",
"events/viewEventAttributes",
"events/removePivot",
"events/addIOC",
"events/add_misp_export",
"events/merge",
"events/unpublish",
"events/publishSightings",
"events/automation",
"events/export",
"events/downloadExport",
"events/xml",
"events/nids",
"events/hids",
"events/csv",
"events/downloadOpenIOCEvent",
"events/proposalEventIndex",
"events/reportValidationIssuesEvents",
"events/addTag",
"events/removeTag",
"events/saveFreeText",
"events/stix2",
"events/stix",
"events/filterEventIdsForPush",
"events/checkuuid",
"events/pushProposals",
"events/exportChoice",
"events/importChoice",
"events/upload_sample",
"events/viewGraph",
"events/viewEventGraph",
"events/updateGraph",
"events/genDistributionGraph",
"events/getEventTimeline",
"events/getDistributionGraph",
"events/getEventGraphReferences",
"events/getEventGraphTags",
"events/getEventGraphGeneric",
"events/getReferenceData",
"events/getObjectTemplate",
"events/viewGalaxyMatrix",
"events/delegation_index",
"events/queryEnrichment",
"events/handleModuleResults",
"events/importModule",
"events/exportModule",
"events/toggleCorrelation", # TODO
"events/checkPublishedStatus",
"events/pushEventToKafka",
"events/getEventInfoById",
"events/enrichEvent", # TODO
"events/checkLocks",
"events/getEditStrategy",
"events/upload_analysis_file",
"events/cullEmptyEvents",
"favouriteTags/toggle", # TODO
"favouriteTags/getToggleField", # TODO
"feeds/feedCoverage",
"feeds/importFeeds",
"feeds/fetchFromAllFeeds",
"feeds/getEvent",
"feeds/previewIndex", # TODO
"feeds/previewEvent", # TODO
"feeds/enable",
"feeds/disable",
"feeds/fetchSelectedFromFreetextIndex",
"feeds/toggleSelected", # TODO
"galaxies/delete",
"galaxies/selectGalaxy",
"galaxies/selectGalaxyNamespace",
"galaxies/selectCluster",
"galaxies/attachCluster",
"galaxies/attachMultipleClusters",
"galaxies/viewGraph",
"galaxies/showGalaxies",
"galaxyClusters/index",
"galaxyClusters/view",
"galaxyClusters/attachToEvent",
"galaxyClusters/detach",
"galaxyClusters/delete",
"galaxyClusters/viewGalaxyMatrix",
"galaxyElements/index",
"jobs/index",
"jobs/getError",
"jobs/getGenerateCorrelationProgress",
"jobs/getProgress",
"jobs/cache",
"jobs/clearJobs",
"logs/event_index",
"admin/logs/search",
"logs/returnDates",
"logs/pruneUpdateLogs",
"logs/testForStolenAttributes",
"modules/queryEnrichment",
"modules/index",
"news/index",
"news/add",
"news/edit",
"news/delete",
"noticelists/toggleEnable",
"noticelists/getToggleField",
"noticelists/delete",
"objectReferences/view",
"objectTemplateElements/viewElements",
"objectTemplates/objectMetaChoice",
"objectTemplates/objectChoice",
"objectTemplates/delete",
"objectTemplates/viewElements",
"objectTemplates/activate",
"objectTemplates/getToggleField",
"objects/revise_object",
"objects/get_row",
"objects/editField",
"objects/fetchViewValue",
"objects/fetchEditForm",
"objects/quickFetchTemplateWithValidObjectAttributes",
"objects/quickAddAttributeForm",
"objects/orphanedObjectDiagnostics",
"objects/proposeObjectsFromAttributes",
"objects/groupAttributesIntoObject",
"admin/organisations/generateuuid",
"organisations/landingpage",
"organisations/fetchOrgsForSG",
"organisations/fetchSGOrgRow",
"organisations/getUUIDs",
"admin/organisations/merge",
"pages/display",
"posts/pushMessageToZMQ",
"posts/add",
"posts/edit",
"posts/delete",
"admin/regexp/add",
"admin/regexp/index",
"admin/regexp/edit",
"admin/regexp/delete",
"regexp/index",
"admin/regexp/clean",
"regexp/cleanRegexModifiers",
"restClientHistory/index",
"restClientHistory/delete",
"roles/view",
"admin/roles/add", # TODO
"admin/roles/edit", # TODO
"admin/roles/index", # TODO
"admin/roles/delete", # TODO
"servers/previewIndex",
"servers/previewEvent",
"servers/filterEventIndex",
"servers/eventBlockRule",
"servers/serverSettingsReloadSetting",
"servers/startWorker", # TODO
"servers/stopWorker", # TODO
"servers/getWorkers", # TODO
"servers/getSubmodulesStatus", # TODO,
"servers/restartDeadWorkers", # TODO
"servers/deleteFile",
"servers/uploadFile",
"servers/fetchServersForSG",
"servers/postTest",
"servers/getRemoteUser",
"servers/startZeroMQServer",
"servers/stopZeroMQServer",
"servers/statusZeroMQServer",
"servers/purgeSessions",
"servers/clearWorkerQueue", # TODO
"servers/getGit",
"servers/checkout",
"servers/ondemandAction",
"servers/updateProgress",
"servers/getSubmoduleQuickUpdateForm",
"servers/updateSubmodule",
"servers/getInstanceUUID",
"servers/getApiInfo",
"servers/cache",
"servers/updateJSON",
"servers/resetRemoteAuthKey",
"servers/changePriority",
"servers/releaseUpdateLock",
"servers/viewDeprecatedFunctionUse",
"shadowAttributes/download",
"shadowAttributes/add_attachment",
"shadowAttributes/discardSelected",
"shadowAttributes/acceptSelected",
"shadowAttributes/generateCorrelation",
"sharingGroups/edit",
"sharingGroups/view",
"sightingdb/add",
"sightingdb/edit",
"sightingdb/delete",
"sightingdb/index",
"sightingdb/requestStatus",
"sightingdb/search",
"sightings/advanced",
"sightings/quickAdd",
"sightings/quickDelete",
"sightings/viewSightings",
"sightings/bulkSaveSightings",
"tagCollections/add",
"tagCollections/import",
"tagCollections/view",
"tagCollections/edit",
"tagCollections/delete",
"tagCollections/addTag",
"tagCollections/removeTag",
"tagCollections/index",
"tagCollections/getRow",
"tags/quickAdd",
"tags/showEventTag",
"tags/showAttributeTag",
"tags/showTagControllerTag",
"tags/viewTag",
"tags/selectTaxonomy",
"tags/selectTag",
"tags/viewGraph",
"tags/search",
"tasks/index",
"tasks/setTask",
"taxonomies/hideTag",
"taxonomies/unhideTag",
"taxonomies/taxonomyMassConfirmation",
"taxonomies/taxonomyMassHide",
"taxonomies/taxonomyMassUnhide",
"taxonomies/delete",
"taxonomies/toggleRequired",
"templateElements/index",
"templateElements/templateElementAddChoices",
"templateElements/add",
"templateElements/edit",
"templateElements/delete",
"templates/index",
"templates/edit",
"templates/view",
"templates/add",
"templates/saveElementSorting",
"templates/delete",
"templates/templateChoices",
"templates/populateEventFromTemplate",
"templates/submitEventPopulation",
"templates/uploadFile",
"templates/deleteTemporaryFile",
"threads/viewEvent",
"threads/view",
"threads/index",
"userSettings/view",
"userSettings/setHomePage",
"users/request_API",
"admin/users/filterUserIndex",
"admin/users/view",
"admin/users/edit",
"users/updateLoginTime",
"users/login",
"users/routeafterlogin",
"users/logout",
"users/resetauthkey",
"users/resetAllSyncAuthKeys",
"users/histogram",
"users/terms",
"users/downloadTerms",
"users/checkAndCorrectPgps",
"admin/users/quickEmail",
"admin/users/email",
"users/initiatePasswordReset",
"users/email_otp",
"users/tagStatisticsGraph",
"users/verifyGPG",
"users/verifyCertificate",
"users/searchGpgKey",
"users/fetchGpgKey",
"users/checkIfLoggedIn",
"admin/users/monitor",
"warninglists/enableWarninglist",
"warninglists/getToggleField",
"warninglists/delete",
"admin/allowedlists/add",
"admin/allowedlists/index",
"admin/allowedlists/edit",
"admin/allowedlists/delete",
"allowedlists/index"
]
missing = self.admin_misp_connector.get_all_functions(True)
with open('all_missing.json', 'w') as f:
json.dump(missing, f, indent=2)
final_missing = []
for m in missing:
if any(m.startswith(s) for s in skip):
continue
final_missing.append(m)
with open('plop', 'w') as f:
json.dump(final_missing, f, indent=2)
print(final_missing)
print(len(final_missing))
raise Exception()
if __name__ == '__main__':
unittest.main()


@ -3,6 +3,6 @@
set -e
set -x
# We're in python3, installing with pipenv.
pip3 install pipenv
pipenv update --dev
# We're in python3, installing with poetry.
pip3 install poetry
poetry install -E fileobjects -E openioc -E virustotal -E docs -E pdfexport -E url


@ -3,6 +3,6 @@
set -e
set -x
pipenv run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
pipenv run mypy tests/testlive_comprehensive.py tests/test_mispevent.py tests/testlive_sync.py pymisp
pipenv run flake8 --ignore=E501,W503,E226,E252 pymisp
poetry run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
poetry run mypy tests/testlive_comprehensive.py tests/test_mispevent.py tests/testlive_sync.py pymisp
poetry run flake8 --ignore=E501,W503,E226,E252 pymisp