Merge remote-tracking branch 'upstream/master' into feature/tagdelete_searchsg

pull/500/head
Tom King 2020-02-13 16:27:24 +00:00
commit 394b7a545e
77 changed files with 4903 additions and 7402 deletions

View File

@ -9,27 +9,13 @@ addons:
- libstdc++6
- libfuzzy-dev
matrix:
include:
- name: "Python 2.7 - legacy"
python: 2.7
env: LEGACY=true
- name: "Python 3.5"
python: 3.5
dist: xenial
- name: "Python 3.6"
python: 3.6
dist: xenial
- name: "Python 3.6 - Dev"
python: 3.6-dev
dist: xenial
- name: "Python 3.7"
python: 3.7
dist: xenial
- name: "Python 3.7 - Dev"
python: 3.7-dev
dist: xenial
python:
- "3.6"
- "3.6-dev"
- "3.7"
- "3.7-dev"
- "3.8"
- "3.8-dev"
install:
- bash travis/install_travis.sh

View File

@ -2,9 +2,264 @@ Changelog
=========
%%version%% (unreleased)
------------------------
Changes
~~~~~~~
- Bump objects. [Raphaël Vinot]
v2.4.121.1 (2020-02-07)
-----------------------
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
Fix
~~~
- Make lief optional again. [Raphaël Vinot]
fix #538
v2.4.121 (2020-02-06)
---------------------
New
~~~
- Add includeDecayScore to rest search. [VVX7]
- Support for first_seen/last_seen. [Raphaël Vinot]
Cleaner import of datetime
- [attributes] chrome-extension-id added. [Alexandre Dulaunoy]
Changes
~~~~~~~
- Bump version. [Raphaël Vinot]
- Do not install neo by default. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- More flexible when an event is in a weird state. [Raphaël Vinot]
- Str to int, properly load SharingGroup. [Raphaël Vinot]
Fix #535
- Bump deps, add pep8 test. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- Support dict in tag/untag. [Raphaël Vinot]
- Test update last seen. [Raphaël Vinot]
- Add test cases in feed. [Raphaël Vinot]
- Add test cases. [Raphaël Vinot]
- Normalize to_datetime conversion. [Raphaël Vinot]
- Trustar example uses objects. [Raphaël Vinot]
- Add lief in the generic requirements. [Raphaël Vinot]
- Refactorize typing, validate. [Raphaël Vinot]
Fix
~~~
- Bump objects. [Raphaël Vinot]
- Issue with readme. [Raphaël Vinot]
- Remove debugging. [Raphaël Vinot]
- [*-seen] Consider that `-` can also be in the date component while
parsing. [mokaddem]
- First seen was after last seen, triggering the exception. [Raphaël
Vinot]
- Tests failing if local tz was not CET. [Raphaël Vinot]
- Syntax and typos. [Raphaël Vinot]
- Bugs introduced by last commit. [Raphaël Vinot]
Other
~~~~~
- Doc: fix Search-FullOverview.ipynb code example. [Bernhard E. Reiter]
- Chore: delete old examples. [Manabu Niseki]
Delete examples which use deprecated/deleted methods
- Scrape trustar intel platform reports and create misp events.
[th3jiv3r]
- Configuration for trustar integration. [th3jiv3r]
- Fixed trailing lines. [turtlefac3]
- Fixed trailing lines. [turtlefac3]
- Custom integration written in python to scrape Proofpoint VAP API for
metrics of top Very Attacked Persons and create MISP events.
[turtlefac3]
- Fix typos on FullOverview.ipynb. [Bernhard E. Reiter]
v2.4.120 (2020-01-17)
---------------------
New
~~~
- [attribute type] kusto-query attribute type. [Alexandre Dulaunoy]
Kusto query is the query language for the Kusto services in Azure used
to search large datasets. It's used in Windows Defender ATP Hunting-Queries
and also Azure Sentinel (Cloud-native SIEM).
- Remove python < 3.6 support. [Raphaël Vinot]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump Changelog. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump dependencies, add debug. [Raphaël Vinot]
- Update dummy events creator. [Raphaël Vinot]
- Add tests on more versions of Python. [Raphaël Vinot]
- Search with the STIX output returns a json STIX. [Raphaël Vinot]
Was XML before.
- Bump dependencies. [Raphaël Vinot]
- Add more typing information. [Raphaël Vinot]
- Add typing markup. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Bump Dependencies. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
Fix
~~~
- Bump template_version in test cases. [Raphaël Vinot]
- Add missing variable in dummy creator. [Raphaël Vinot]
- Et2misp was python2 only. [Raphaël Vinot]
- Feed generator was broken. [Raphaël Vinot]
Fix #506
- Event without hashable attribute. [Raphaël Vinot]
Related #506
Other
~~~~~
- Update api.py. [AaronK]
minor typo, can't help noticing those. Sorry.
- Fixed TODO, added quarantineFolder/quarantineRule from
messagesBlocked, added some error handling to prevent empty attributes
from trying to be added. [th3jiv3r]
- Scrape proofpoint tap api for messages blocked/delivered & clicks
blocked/permitted and create misp events. [th3jiv3r]
- Add variable for proofpoint tap api auth. [th3jiv3r]
- Update README.md. [AaronK]
minor typo
- Define the number of entries to output. [AndreC10002]
Allow defining the number of entries to output in the settings.py file
- Update generate.py. [AndreC10002]
- Cleanup of code and 'quick-n-dirty' sanitizing of tags. [Koen Van
Impe]
- Sync. [Koen Van Impe]
- Update README.md. [Raphaël Vinot]
v2.4.119.1 (2019-12-17)
-----------------------
New
~~~
- URLObject (requires pyfaup) [Raphaël Vinot]
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Version bump. [Raphaël Vinot]
- Bump test files. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Debug travis error message. [Raphaël Vinot]
- [types] eppn type added. [Alexandre Dulaunoy]
- Fix typo. [Raphaël Vinot]
- Move scrippsco2 feed generator to a sub directory. [Raphaël Vinot]
- Update documentation. [Raphaël Vinot]
Fix #396
- Bump objects. [Raphaël Vinot]
Fix
~~~
- Properly test custom objects. [Raphaël Vinot]
- Adding a sighting takes a little bit of time. [Raphaël Vinot]
- Test case on reference. [Raphaël Vinot]
- Add missing fields to event & attribute for the feed output. [Raphaël
Vinot]
- Make sure the publish timestamp is bumped on update. [Raphaël Vinot]
v2.4.119 (2019-12-02)
---------------------
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
Fix
~~~
- Bump lief to 0.10.1. [Raphaël Vinot]
- Update tests. [Raphaël Vinot]
- Raise PyMISPError instead of Exception. [Raphaël Vinot]
- Rename feed_meta_generator so it clearly fails with python<3.6.
[Raphaël Vinot]
v2.4.117.3 (2019-11-25)
-----------------------
New
~~~
- Script to generate the metadata of a feed out of a directory. [Raphaël
Vinot]
- Add to_feed export to MISPEvent. [Raphaël Vinot]
- Validate object templates. [Raphaël Vinot]
fix https://github.com/MISP/misp-objects/issues/199
- Test cases for restricted tags. [Raphaël Vinot]
Fix #483
- Get Database Schema Diagnostic. [Raphaël Vinot]
Fix #492
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
- Bump version. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- Require stable version of lief again. [Raphaël Vinot]
- Few more improvements on the feed export. [Raphaël Vinot]
- Bump misp-objects. [Raphaël Vinot]
- Make the feed generator more generic. [Raphaël Vinot]
- Use New version of PyMISP in the feed generator. [Raphaël Vinot]
- Bump misp-object. [Raphaël Vinot]
- Allow sorting and indenting the json output for objects. [Raphaël Vinot]
- Bump objects. [Raphaël Vinot]
- Bump dependencies. [Raphaël Vinot]
- [test] feed test updated as botvrij is now TLS by default. [Alexandre
Dulaunoy]
Fix
~~~
- Improve stability of feed output. [Raphaël Vinot]
- Do not unitialize the uuid in MISPEvent. [Raphaël Vinot]
- Bump url template version in test cases. [Raphaël Vinot]
- Python 2.7 tests. [Raphaël Vinot]
- Print the full json blob in debug mode. [Raphaël Vinot]
Related https://github.com/MISP/PyMISP/issues/462
Other
~~~~~
- Cch: Bump misp-objects. [Raphaël Vinot]
v2.4.117.2 (2019-10-30)
-----------------------
Changes
~~~~~~~
- Bump changelog. [Raphaël Vinot]
Fix
~~~
- Avoid exception on legacy MISP. [Raphaël Vinot]

View File

@ -11,6 +11,8 @@ requests-mock = "*"
pymisp = {editable = true,extras = ["fileobjects", "neo", "openioc", "virustotal", "pdfexport", "docs"],path = "."}
docutils = "==0.15"
memory-profiler = "*"
mypy = "*"
flake8 = "*"
[packages]
pymisp = {editable = true,extras = ["fileobjects", "openioc", "virustotal", "pdfexport"],path = "."}

Pipfile.lock generated (653 changed lines)
View File

@ -1,7 +1,7 @@
{
"_meta": {
"hash": {
"sha256": "4be7259a433785d74e1879a4a555bb669d50c5f409d0a094652c1abc9b1227c5"
"sha256": "980c848909285e25224dc957df15e733666b06107dfbd97e6edfcd51c8da9206"
},
"pipfile-spec": 6,
"requires": {
@ -25,18 +25,18 @@
},
"beautifulsoup4": {
"hashes": [
"sha256:5279c36b4b2ec2cb4298d723791467e3000e5384a43ea0cdf5d45207c7e97169",
"sha256:6135db2ba678168c07950f9a16c4031822c6f4aec75a65e0a97bc5ca09789931",
"sha256:dcdef580e18a76d54002088602eba453eec38ebbcafafeaabd8cab12b6155d57"
"sha256:05fd825eb01c290877657a56df4c6e4c311b3965bda790c613a3d6fb01a5462a",
"sha256:9fbb4d6e48ecd30bcacc5b63b94088192dcda178513b2ae3c394229f8911b887",
"sha256:e1505eeed31b0f4ce2dbb3bc8eb256c04cc2b3b72af7d551a4ab6efd5cbe5dae"
],
"version": "==4.8.1"
"version": "==4.8.2"
},
"certifi": {
"hashes": [
"sha256:e4f3620cfea4f83eedc95b24abd9cd56f3c4b146dd0177e83a21b4eb49e21e50",
"sha256:fd7c7c74727ddcf00e9acd26bba8da604ffec95bf1c2144e67aff7a8b50e6cef"
"sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
],
"version": "==2019.9.11"
"version": "==2019.11.28"
},
"chardet": {
"hashes": [
@ -66,78 +66,58 @@
],
"version": "==2.8"
},
"importlib-metadata": {
"hashes": [
"sha256:aa18d7378b00b40847790e7c27e11673d7fed219354109d0e7b9e5b25dc3ad26",
"sha256:d5f18a79777f3aa179c145737780282e27b508fc8fd688cb17c7a813e8bd39af"
],
"version": "==0.23"
},
"jsonschema": {
"hashes": [
"sha256:2fa0684276b6333ff3c0b1b27081f4b2305f0a36cf702a23db50edb141893c3f",
"sha256:94c0a13b4a0616458b42529091624e66700a17f847453e52279e35509a5b7631"
"sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163",
"sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"
],
"version": "==3.1.1"
"version": "==3.2.0"
},
"lief": {
"hashes": [
"sha256:0efba18d7b9776529ea5c18c771b35871896a8ceb95a19351e50d4813a11c632",
"sha256:3d9c7bb1e353e875f295a72a58d3a37ae1ba3e1ff1beb57b8a65f1a726064093",
"sha256:3db5939e7d95f776f9866586128c2a5be614eaec43ab985ac27ff2c531f8ac5f",
"sha256:4c61598818b0091d80839875aa107cfd10ae1017a3e9c9de4bc002622b8e3179",
"sha256:4f26d07bdada8ca5ef3dc5fa2f71f20f7e8ab4f78f7c5e00134477f51feb6a80",
"sha256:55fe3c8a0990dce16ab5bf88df707f1eacac4eb34561667ac478497e0e0807c7",
"sha256:68bcf18e40c9412d2d08d6311e04eb6c19e20ec174764706da2d602c45aa4fd5",
"sha256:7ff910d99361022451e9c25e34cb844768e2fa347cfb0f4ad70f531810d776d4",
"sha256:ac571152d0b864e8d376bc733c5728a224316be1cdefc290174f1bf8ab10ec70",
"sha256:dd17a7cdcd29a2efca3d4cb4fb078a06daf1cafec8912560965a8d8dbf346739",
"sha256:efa5f3523c01f7f0f5f2c14e5ac808e2447d1435c6a2872e5ab1a97ef1b0db9b",
"sha256:f1aadb344b5e14b308167bd2c9f31f1915e3c4e3f9a9ca92ff7b7bfbede5034c"
"sha256:276cc63ec12a21bdf01b8d30962692c17499788234f0765247ca7a35872097ec",
"sha256:3e6baaeb52bdc339b5f19688b58fd8d5778b92e50221f920cedfa2bec1f4d5c2",
"sha256:45e5c592b57168c447698381d927eb2386ffdd52afe0c48245f848d4cc7ee05a",
"sha256:6547752b5db105cd41c9fa65d0d7452a4d7541b77ffee716b46246c6d81e172f",
"sha256:83b51e01627b5982662f9550ac1230758aa56945ed86829e4291932d98417da3",
"sha256:895599194ea7495bf304e39317b04df20cccf799fc2751867cc1aa4997cfcdae",
"sha256:8a91cee2568306fe1d2bf84341b459c85368317d01d7105fa49e4f4ede837076",
"sha256:913b36a67707dc2afa72f117bab9856ea3f434f332b04a002a0f9723c8779320",
"sha256:9f604a361a3b1b3ed5fdafed0321c5956cb3b265b5efe2250d1bf8911a80c65b",
"sha256:a487fe7234c04bccd58223dbb79214421176e2629814c7a4a887764cceb5be7c",
"sha256:bc8488fb0661cb436fe4bb4fe947d0f9aa020e9acaed233ccf01ab04d888c68a",
"sha256:bddbf333af62310a10cb738a1df1dc2b140dd9c663b55ba3500c10c249d416d2",
"sha256:cce48d7c97cef85e01e6cfeff55f2068956b5c0257eb9c2d2c6d15e33dd1e4fc",
"sha256:f8b3f66956c56b582b3adc573bf2a938c25fb21c8894b373a113e24c494fc982"
],
"version": "==0.10.0.dev0"
},
"more-itertools": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
],
"version": "==7.2.0"
"version": "==0.10.1"
},
"pillow": {
"hashes": [
"sha256:047d9473cf68af50ac85f8ee5d5f21a60f849bc17d348da7fc85711287a75031",
"sha256:0f66dc6c8a3cc319561a633b6aa82c44107f12594643efa37210d8c924fc1c71",
"sha256:12c9169c4e8fe0a7329e8658c7e488001f6b4c8e88740e76292c2b857af2e94c",
"sha256:248cffc168896982f125f5c13e9317c059f74fffdb4152893339f3be62a01340",
"sha256:27faf0552bf8c260a5cee21a76e031acaea68babb64daf7e8f2e2540745082aa",
"sha256:285edafad9bc60d96978ed24d77cdc0b91dace88e5da8c548ba5937c425bca8b",
"sha256:384b12c9aa8ef95558abdcb50aada56d74bc7cc131dd62d28c2d0e4d3aadd573",
"sha256:38950b3a707f6cef09cd3cbb142474357ad1a985ceb44d921bdf7b4647b3e13e",
"sha256:4aad1b88933fd6dc2846552b89ad0c74ddbba2f0884e2c162aa368374bf5abab",
"sha256:4ac6148008c169603070c092e81f88738f1a0c511e07bd2bb0f9ef542d375da9",
"sha256:4deb1d2a45861ae6f0b12ea0a786a03d19d29edcc7e05775b85ec2877cb54c5e",
"sha256:59aa2c124df72cc75ed72c8d6005c442d4685691a30c55321e00ed915ad1a291",
"sha256:5a47d2123a9ec86660fe0e8d0ebf0aa6bc6a17edc63f338b73ea20ba11713f12",
"sha256:5cc901c2ab9409b4b7ac7b5bcc3e86ac14548627062463da0af3b6b7c555a871",
"sha256:6c1db03e8dff7b9f955a0fb9907eb9ca5da75b5ce056c0c93d33100a35050281",
"sha256:7ce80c0a65a6ea90ef9c1f63c8593fcd2929448613fc8da0adf3e6bfad669d08",
"sha256:809c19241c14433c5d6135e1b6c72da4e3b56d5c865ad5736ab99af8896b8f41",
"sha256:83792cb4e0b5af480588601467c0764242b9a483caea71ef12d22a0d0d6bdce2",
"sha256:846fa202bd7ee0f6215c897a1d33238ef071b50766339186687bd9b7a6d26ac5",
"sha256:9f5529fc02009f96ba95bea48870173426879dc19eec49ca8e08cd63ecd82ddb",
"sha256:a423c2ea001c6265ed28700df056f75e26215fd28c001e93ef4380b0f05f9547",
"sha256:ac4428094b42907aba5879c7c000d01c8278d451a3b7cccd2103e21f6397ea75",
"sha256:b1ae48d87f10d1384e5beecd169c77502fcc04a2c00a4c02b85f0a94b419e5f9",
"sha256:bf4e972a88f8841d8fdc6db1a75e0f8d763e66e3754b03006cbc3854d89f1cb1",
"sha256:c6414f6aad598364aaf81068cabb077894eb88fed99c6a65e6e8217bab62ae7a",
"sha256:c710fcb7ee32f67baf25aa9ffede4795fd5d93b163ce95fdc724383e38c9df96",
"sha256:c7be4b8a09852291c3c48d3c25d1b876d2494a0a674980089ac9d5e0d78bd132",
"sha256:c9e5ffb910b14f090ac9c38599063e354887a5f6d7e6d26795e916b4514f2c1a",
"sha256:e0697b826da6c2472bb6488db4c0a7fa8af0d52fa08833ceb3681358914b14e5",
"sha256:e9a3edd5f714229d41057d56ac0f39ad9bdba6767e8c888c951869f0bdd129b0"
"sha256:0a628977ac2e01ca96aaae247ec2bd38e729631ddf2221b4b715446fd45505be",
"sha256:4d9ed9a64095e031435af120d3c910148067087541131e82b3e8db302f4c8946",
"sha256:54ebae163e8412aff0b9df1e88adab65788f5f5b58e625dc5c7f51eaf14a6837",
"sha256:5bfef0b1cdde9f33881c913af14e43db69815c7e8df429ceda4c70a5e529210f",
"sha256:5f3546ceb08089cedb9e8ff7e3f6a7042bb5b37c2a95d392fb027c3e53a2da00",
"sha256:5f7ae9126d16194f114435ebb79cc536b5682002a4fa57fa7bb2cbcde65f2f4d",
"sha256:62a889aeb0a79e50ecf5af272e9e3c164148f4bd9636cc6bcfa182a52c8b0533",
"sha256:7406f5a9b2fd966e79e6abdaf700585a4522e98d6559ce37fc52e5c955fade0a",
"sha256:8453f914f4e5a3d828281a6628cf517832abfa13ff50679a4848926dac7c0358",
"sha256:87269cc6ce1e3dee11f23fa515e4249ae678dbbe2704598a51cee76c52e19cda",
"sha256:875358310ed7abd5320f21dd97351d62de4929b0426cdb1eaa904b64ac36b435",
"sha256:8ac6ce7ff3892e5deaab7abaec763538ffd011f74dc1801d93d3c5fc541feee2",
"sha256:91b710e3353aea6fc758cdb7136d9bbdcb26b53cefe43e2cba953ac3ee1d3313",
"sha256:9d2ba4ed13af381233e2d810ff3bab84ef9f18430a9b336ab69eaf3cd24299ff",
"sha256:a62ec5e13e227399be73303ff301f2865bf68657d15ea50b038d25fc41097317",
"sha256:ab76e5580b0ed647a8d8d2d2daee170e8e9f8aad225ede314f684e297e3643c2",
"sha256:bf4003aa538af3f4205c5fac56eacaa67a6dd81e454ffd9e9f055fff9f1bc614",
"sha256:bf598d2e37cf8edb1a2f26ed3fb255191f5232badea4003c16301cb94ac5bdd0",
"sha256:c18f70dc27cc5d236f10e7834236aff60aadc71346a5bc1f4f83a4b3abee6386",
"sha256:c5ed816632204a2fc9486d784d8e0d0ae754347aba99c811458d69fcdfd2a2f9",
"sha256:dc058b7833184970d1248135b8b0ab702e6daa833be14035179f2acb78ff5636",
"sha256:ff3797f2f16bf9d17d53257612da84dd0758db33935777149b3334c01ff68865"
],
"version": "==6.2.1"
"version": "==7.0.0"
},
"pydeep": {
"hashes": [
@ -162,9 +142,9 @@
},
"pyrsistent": {
"hashes": [
"sha256:eb6545dbeb1aa69ab1fb4809bfbf5a8705e44d92ef8fc7c2361682a47c46c778"
"sha256:cdc7b5e3ed77bed61270a47d35434a30617b9becdf2478af76ad2c6ade307280"
],
"version": "==0.15.5"
"version": "==0.15.7"
},
"python-dateutil": {
"hashes": [
@ -182,36 +162,36 @@
},
"reportlab": {
"hashes": [
"sha256:149f0eeb4ea716441638b05fd6d3667d32f1463f3eac50b63e100a73a5533cdd",
"sha256:1aa9a2e1a87749db265b592ad25e498b39f70fce9f53a012cdf69f74259b6e43",
"sha256:1f5ce489adb2db2862249492e6367539cfa65b781cb06dcf13363dc52219be7e",
"sha256:23b28ba1784a6c52a926c075abd9f396d03670e71934b24db5ff684f8b870e0f",
"sha256:3d3de0f4facdd7e3c56ecbc55733a958b86c35a8e7ba6066c7b1ba383e282f58",
"sha256:484d346b8f463ba2ddaf6d365c6ac5971cd062528b6d5ba68cac02b9435366c5",
"sha256:4da2467def21f2e20720b21f6c18e7f7866720a955c716b990e94e3979fe913f",
"sha256:5ebdf22daee7d8e630134d94f477fe6abd65a65449d4eec682a7b458b5249604",
"sha256:655a1b68be18a73fec5233fb5d81f726b4db32269e487aecf5b6853cca926d86",
"sha256:6c535a304888dafe50c2c24d4924aeefc11e0542488ee6965f6133d415e86bbc",
"sha256:7560ef655ac6448bb257fd34bfdfb8d546f9c7c0900ed8963fb8509f75e8ca80",
"sha256:7a1c2fa3e6310dbe47efee2020dc0f25be7a75ff09a8fedc4a87d4397f3810c1",
"sha256:817c344b9aa53b5bfc2f58ff82111a1e85ca4c8b68d1add088b547360a6ebcfa",
"sha256:81d950e398d6758aeaeeb267aa1a62940735414c980f77dd0a270cef1782a43d",
"sha256:83ef44936ef4e9c432d62bc2b72ec8d772b87af319d123e827a72e9b6884c851",
"sha256:9f975adc2c7a236403f0bc91d7a3916e644e47b1f1e3990325f15e73b83581ec",
"sha256:a5ca59e2b7e70a856de6db9dadd3e11a1b3b471c999585284d5c1d479c01cf5d",
"sha256:ad2cf5a673c05fae9e91e987994b95205c13c5fa55d7393cf8b06f9de6f92990",
"sha256:b8c3d76276372f87b7c8ff22065dbc072cca5ffb06ba0267edc298df7acf942d",
"sha256:b93f7f908e916d9413dd8c04da1ccb3977e446803f59078424decdc0de449133",
"sha256:c0ecd0af92c759edec0d24ba92f4a18c28d4a19229ae7c8249f94e82f3d76288",
"sha256:c9e38eefc90a02c072a87a627ff66b2d67c23f6f82274d2aa7fb28e644e8f409",
"sha256:ca2a1592d2e181a04372d0276ee847308ea206dfe7c86fe94769e7ac126e6e85",
"sha256:ce1dfc9beec83e66250ca3afaf5ddf6b9a3ce70a30a9526dec7c6bec3266baf1",
"sha256:d3550c90751132b26b72a78954905974f33b1237335fbe0d8be957f9636c376a",
"sha256:e35a574f4e5ec0fdd5dc354e74ec143d853abd7f76db435ffe2a57d0161a22eb",
"sha256:ee5cafca6ef1a38fef8cbf3140dd2198ad1ee82331530b546039216ef94f93cb",
"sha256:fa1c969176cb3594a785c6818bcb943ebd49453791f702380b13a35fa23b385a"
"sha256:2a1c4ea2155fd5b6e3f89e36b8aa21b5a14c9bbaf9b44de2787641668bc95edc",
"sha256:2b7469a98df1315d4f52319c4438eaee3fdd17330830edadae775e9312402638",
"sha256:3b556160aac294fa661545245e4bc273328f9226e5110139647f4d4bc0cfc453",
"sha256:3eb25d2c2bde078815d8f7ea400abbcae16a0c498a4b27ead3c4a620b1f1f980",
"sha256:3f229c0b2ca27eb5b08777981d3bd0d34e59bfa306627b88d80c3734cd3e26d5",
"sha256:4695755cc70b7a9308508aa41eafc3f335348be0eadd86e8f92cb87815d6177b",
"sha256:4f97b4474e419ae5c441ecdf0db8eceb5f5af0461bdf73e3e5ec05353844045c",
"sha256:550d2d8516e468192e12be8aeaf80f3bd805dc46dd0a5a4ddf2a3e1cd8149a16",
"sha256:59aa9c4ca80d397f6cabec092b5a6e2304fb1b7ca53e5b650872aae13ebfeb68",
"sha256:6e4479b75778b9c1e4640dc90efb72cb990471d56089947d6be4ccd9e7a56a3c",
"sha256:6e9434bd0afa6d6fcf9abbc565750cc456b6e60dc49abd7cd2bc7cf414ee079b",
"sha256:73e4e30b72da1f9f8caba775ad9cc027957c2340c38ba2d6622a9f2351b12c3a",
"sha256:7c05c2ba8ab32f02b23a56a75a4d136c2bfb7221a04a8306835a938fa6711644",
"sha256:849e4cabce1ed1183e83dc89570810b3bf9bf9cf0d0a605bde854a0baf212124",
"sha256:863c6fcf5fc0c8184b6315885429f5468373a3def2eb0c0073d09b79b2161113",
"sha256:8e688df260682038ecd32f106d796024fbcf70e7bf54340b14f991bd5465f97a",
"sha256:9675a26d01ec141cb717091bb139b6227bfb3794f521943101da50327bff4825",
"sha256:969b0d9663c0c641347d2408d41e6723e84d9f7863babc94438c91295c74f36d",
"sha256:978560732758bf5fca4ec1ed124afe2702d08824f6b0364cca31519bd5e7dadd",
"sha256:99ea85b47248c6cdbece147bdbd67aed16209bdd95770aa1f151ec3bb8794496",
"sha256:9cdc318c37fa959909db5beb05ca0b684d3e2cba8f40af1ce6f332c3f69bd2b8",
"sha256:b55c26510ff7f135af8eae1216372028cde7dab22003d918649fce219020eb58",
"sha256:cb301340b4fc1f2b7b25ea4584c5cbde139ced2d4ff01ad5e8fcf7d7822982b0",
"sha256:e7578a573454a5490553fb091374996d32269dff44021a401763080bda1357cf",
"sha256:e84387d35a666aafafda332afca8a75fb04f097cc0a2dc2d04e8c90a83cf7c1b",
"sha256:eb66eff64ea75f028af3ac63a7a2bf1e8733297141a85cbdffd5deaef404fa52",
"sha256:f5e3afd2cc35a73f34c3084c69fe4653591611da5189e50b58db550bb46e340a",
"sha256:f6c10628386bfe0c1f6640c28fb262d0960bb26c249cefabb755fb273323220d"
],
"version": "==3.5.32"
"version": "==3.5.34"
},
"requests": {
"hashes": [
@ -222,10 +202,10 @@
},
"six": {
"hashes": [
"sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd",
"sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66"
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.13.0"
"version": "==1.14.0"
},
"soupsieve": {
"hashes": [
@ -236,29 +216,22 @@
},
"urllib3": {
"hashes": [
"sha256:a8a318824cc77d1fd4b2bec2ded92646630d7fe8619497b142c84a9e6f5a7293",
"sha256:f3c5fd51747d450d4dcf6f923c81f78f811aab8205fda64b0aba34a4e48b0745"
"sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc",
"sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc"
],
"version": "==1.25.7"
"version": "==1.25.8"
},
"validators": {
"hashes": [
"sha256:f0ac832212e3ee2e9b10e156f19b106888cf1429c291fbc5297aae87685014ae"
"sha256:b192e6bde7d617811d59f50584ed240b580375648cd032d106edeb3164099508"
],
"version": "==0.14.0"
"version": "==0.14.2"
},
"wrapt": {
"hashes": [
"sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1"
],
"version": "==1.11.2"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
],
"version": "==0.6.0"
}
},
"develop": {
@ -278,25 +251,25 @@
},
"babel": {
"hashes": [
"sha256:af92e6106cb7c55286b25b38ad7695f8b4efb36a90ba483d7f7a6628c46158ab",
"sha256:e86135ae101e31e2c8ec20a4e0c5220f4eed12487d5cf3f78be7e98d3a57fc28"
"sha256:1aac2ae2d0d8ea368fa90906567f5c08463d98ade155c0c4bfedd6a0f7160e38",
"sha256:d670ea0b10f8b723672d3a6abeb87b565b244da220d76b4dba1b66269ec152d4"
],
"version": "==2.7.0"
"version": "==2.8.0"
},
"beautifulsoup4": {
"hashes": [
"sha256:5279c36b4b2ec2cb4298d723791467e3000e5384a43ea0cdf5d45207c7e97169",
"sha256:6135db2ba678168c07950f9a16c4031822c6f4aec75a65e0a97bc5ca09789931",
"sha256:dcdef580e18a76d54002088602eba453eec38ebbcafafeaabd8cab12b6155d57"
"sha256:05fd825eb01c290877657a56df4c6e4c311b3965bda790c613a3d6fb01a5462a",
"sha256:9fbb4d6e48ecd30bcacc5b63b94088192dcda178513b2ae3c394229f8911b887",
"sha256:e1505eeed31b0f4ce2dbb3bc8eb256c04cc2b3b72af7d551a4ab6efd5cbe5dae"
],
"version": "==4.8.1"
"version": "==4.8.2"
},
"certifi": {
"hashes": [
"sha256:e4f3620cfea4f83eedc95b24abd9cd56f3c4b146dd0177e83a21b4eb49e21e50",
"sha256:fd7c7c74727ddcf00e9acd26bba8da604ffec95bf1c2144e67aff7a8b50e6cef"
"sha256:017c25db2a153ce562900032d5bc68e9f191e44e9a0f762f373977de9df1fbb3",
"sha256:25b64c7da4cd7479594d035c08c2d809eb4aab3a26e5a990ea98cc450c320f1f"
],
"version": "==2019.9.11"
"version": "==2019.11.28"
},
"chardet": {
"hashes": [
@ -322,10 +295,10 @@
},
"colorama": {
"hashes": [
"sha256:05eed71e2e327246ad6b38c540c4a3117230b19679b875190486ddd2d721422d",
"sha256:f8ac84de7840f5b9c4e3347b3c1eaa50f7e49c2b07596221daec5edaabbd7c48"
"sha256:7d73d2a99753107a36ac6b455ee49046802e59d9d076ef8e47b61499fa29afff",
"sha256:e96da0d330793e2cb9485e9ddfd918d456036c7149416295932478192f4436a1"
],
"version": "==0.4.1"
"version": "==0.4.3"
},
"commonmark": {
"hashes": [
@ -336,48 +309,47 @@
},
"coverage": {
"hashes": [
"sha256:08907593569fe59baca0bf152c43f3863201efb6113ecb38ce7e97ce339805a6",
"sha256:0be0f1ed45fc0c185cfd4ecc19a1d6532d72f86a2bac9de7e24541febad72650",
"sha256:141f08ed3c4b1847015e2cd62ec06d35e67a3ac185c26f7635f4406b90afa9c5",
"sha256:19e4df788a0581238e9390c85a7a09af39c7b539b29f25c89209e6c3e371270d",
"sha256:23cc09ed395b03424d1ae30dcc292615c1372bfba7141eb85e11e50efaa6b351",
"sha256:245388cda02af78276b479f299bbf3783ef0a6a6273037d7c60dc73b8d8d7755",
"sha256:331cb5115673a20fb131dadd22f5bcaf7677ef758741312bee4937d71a14b2ef",
"sha256:386e2e4090f0bc5df274e720105c342263423e77ee8826002dcffe0c9533dbca",
"sha256:3a794ce50daee01c74a494919d5ebdc23d58873747fa0e288318728533a3e1ca",
"sha256:60851187677b24c6085248f0a0b9b98d49cba7ecc7ec60ba6b9d2e5574ac1ee9",
"sha256:63a9a5fc43b58735f65ed63d2cf43508f462dc49857da70b8980ad78d41d52fc",
"sha256:6b62544bb68106e3f00b21c8930e83e584fdca005d4fffd29bb39fb3ffa03cb5",
"sha256:6ba744056423ef8d450cf627289166da65903885272055fb4b5e113137cfa14f",
"sha256:7494b0b0274c5072bddbfd5b4a6c6f18fbbe1ab1d22a41e99cd2d00c8f96ecfe",
"sha256:826f32b9547c8091679ff292a82aca9c7b9650f9fda3e2ca6bf2ac905b7ce888",
"sha256:93715dffbcd0678057f947f496484e906bf9509f5c1c38fc9ba3922893cda5f5",
"sha256:9a334d6c83dfeadae576b4d633a71620d40d1c379129d587faa42ee3e2a85cce",
"sha256:af7ed8a8aa6957aac47b4268631fa1df984643f07ef00acd374e456364b373f5",
"sha256:bf0a7aed7f5521c7ca67febd57db473af4762b9622254291fbcbb8cd0ba5e33e",
"sha256:bf1ef9eb901113a9805287e090452c05547578eaab1b62e4ad456fcc049a9b7e",
"sha256:c0afd27bc0e307a1ffc04ca5ec010a290e49e3afbe841c5cafc5c5a80ecd81c9",
"sha256:dd579709a87092c6dbee09d1b7cfa81831040705ffa12a1b248935274aee0437",
"sha256:df6712284b2e44a065097846488f66840445eb987eb81b3cc6e4149e7b6982e1",
"sha256:e07d9f1a23e9e93ab5c62902833bf3e4b1f65502927379148b6622686223125c",
"sha256:e2ede7c1d45e65e209d6093b762e98e8318ddeff95317d07a27a2140b80cfd24",
"sha256:e4ef9c164eb55123c62411f5936b5c2e521b12356037b6e1c2617cef45523d47",
"sha256:eca2b7343524e7ba246cab8ff00cab47a2d6d54ada3b02772e908a45675722e2",
"sha256:eee64c616adeff7db37cc37da4180a3a5b6177f5c46b187894e633f088fb5b28",
"sha256:ef824cad1f980d27f26166f86856efe11eff9912c4fed97d3804820d43fa550c",
"sha256:efc89291bd5a08855829a3c522df16d856455297cf35ae827a37edac45f466a7",
"sha256:fa964bae817babece5aa2e8c1af841bebb6d0b9add8e637548809d040443fee0",
"sha256:ff37757e068ae606659c28c3bd0d923f9d29a85de79bf25b2b34b148473b5025"
"sha256:15cf13a6896048d6d947bf7d222f36e4809ab926894beb748fc9caa14605d9c3",
"sha256:1daa3eceed220f9fdb80d5ff950dd95112cd27f70d004c7918ca6dfc6c47054c",
"sha256:1e44a022500d944d42f94df76727ba3fc0a5c0b672c358b61067abb88caee7a0",
"sha256:25dbf1110d70bab68a74b4b9d74f30e99b177cde3388e07cc7272f2168bd1477",
"sha256:3230d1003eec018ad4a472d254991e34241e0bbd513e97a29727c7c2f637bd2a",
"sha256:3dbb72eaeea5763676a1a1efd9b427a048c97c39ed92e13336e726117d0b72bf",
"sha256:5012d3b8d5a500834783689a5d2292fe06ec75dc86ee1ccdad04b6f5bf231691",
"sha256:51bc7710b13a2ae0c726f69756cf7ffd4362f4ac36546e243136187cfcc8aa73",
"sha256:527b4f316e6bf7755082a783726da20671a0cc388b786a64417780b90565b987",
"sha256:722e4557c8039aad9592c6a4213db75da08c2cd9945320220634f637251c3894",
"sha256:76e2057e8ffba5472fd28a3a010431fd9e928885ff480cb278877c6e9943cc2e",
"sha256:77afca04240c40450c331fa796b3eab6f1e15c5ecf8bf2b8bee9706cd5452fef",
"sha256:7afad9835e7a651d3551eab18cbc0fdb888f0a6136169fbef0662d9cdc9987cf",
"sha256:9bea19ac2f08672636350f203db89382121c9c2ade85d945953ef3c8cf9d2a68",
"sha256:a8b8ac7876bc3598e43e2603f772d2353d9931709345ad6c1149009fd1bc81b8",
"sha256:b0840b45187699affd4c6588286d429cd79a99d509fe3de0f209594669bb0954",
"sha256:b26aaf69713e5674efbde4d728fb7124e429c9466aeaf5f4a7e9e699b12c9fe2",
"sha256:b63dd43f455ba878e5e9f80ba4f748c0a2156dde6e0e6e690310e24d6e8caf40",
"sha256:be18f4ae5a9e46edae3f329de2191747966a34a3d93046dbdf897319923923bc",
"sha256:c312e57847db2526bc92b9bfa78266bfbaabac3fdcd751df4d062cd4c23e46dc",
"sha256:c60097190fe9dc2b329a0eb03393e2e0829156a589bd732e70794c0dd804258e",
"sha256:c62a2143e1313944bf4a5ab34fd3b4be15367a02e9478b0ce800cb510e3bbb9d",
"sha256:cc1109f54a14d940b8512ee9f1c3975c181bbb200306c6d8b87d93376538782f",
"sha256:cd60f507c125ac0ad83f05803063bed27e50fa903b9c2cfee3f8a6867ca600fc",
"sha256:d513cc3db248e566e07a0da99c230aca3556d9b09ed02f420664e2da97eac301",
"sha256:d649dc0bcace6fcdb446ae02b98798a856593b19b637c1b9af8edadf2b150bea",
"sha256:d7008a6796095a79544f4da1ee49418901961c97ca9e9d44904205ff7d6aa8cb",
"sha256:da93027835164b8223e8e5af2cf902a4c80ed93cb0909417234f4a9df3bcd9af",
"sha256:e69215621707119c6baf99bda014a45b999d37602cb7043d943c76a59b05bf52",
"sha256:ea9525e0fef2de9208250d6c5aeeee0138921057cd67fcef90fbed49c4d62d37",
"sha256:fca1669d464f0c9831fd10be2eef6b86f5ebd76c724d1e0706ebdff86bb4adf0"
],
"version": "==4.5.4"
"version": "==5.0.3"
},
"coveralls": {
"hashes": [
"sha256:9bc5a1f92682eef59f688a8f280207190d9a6afb84cef8f567fa47631a784060",
"sha256:fb51cddef4bc458de347274116df15d641a735d3f0a580a9472174e2e62f408c"
"sha256:2da39aeaef986757653f0a442ba2bef22a8ec602c8bacbc69d39f468dfae12ec",
"sha256:906e07a12b2ac04b8ad782d06173975fe5ff815fe9df3bfedd2c099bc5791aec"
],
"index": "pypi",
"version": "==1.8.2"
"version": "==1.10.0"
},
"decorator": {
"hashes": [
@ -407,6 +379,21 @@
"index": "pypi",
"version": "==0.15"
},
"entrypoints": {
"hashes": [
"sha256:589f874b313739ad35be6e0cd7efde2a4e9b6fea91edcc34e58ecbb8dbe56d19",
"sha256:c70dd71abe5a8c85e55e12c19bd91ccfeec11a6e99044204511f9ed547d48451"
],
"version": "==0.3"
},
"flake8": {
"hashes": [
"sha256:45681a117ecc81e870cbf1262835ae4af5e7a8b08e40b944a8a6e6b895914cfb",
"sha256:49356e766643ad15072a789a20915d3c91dc89fd313ccd71802303fd67e4deca"
],
"index": "pypi",
"version": "==3.7.9"
},
"idna": {
"hashes": [
"sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407",
@ -416,48 +403,43 @@
},
"imagesize": {
"hashes": [
"sha256:3f349de3eb99145973fefb7dbe38554414e5c30abd0c8e4b970a7c9d09f3a1d8",
"sha256:f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5"
"sha256:6965f19a6a2039c7d48bca7dba2473069ff854c36ae6f19d2cde309d998228a1",
"sha256:b1f6b5a4eab1f73479a50fb79fcf729514a900c341d8503d62a62dbc4127a2b1"
],
"version": "==1.1.0"
},
"importlib-metadata": {
"hashes": [
"sha256:aa18d7378b00b40847790e7c27e11673d7fed219354109d0e7b9e5b25dc3ad26",
"sha256:d5f18a79777f3aa179c145737780282e27b508fc8fd688cb17c7a813e8bd39af"
],
"version": "==0.23"
"version": "==1.2.0"
},
"jinja2": {
"hashes": [
"sha256:74320bb91f31270f9551d46522e33af46a80c3d619f4a4bf42b3164d30b5911f",
"sha256:9fe95f19286cfefaa917656583d020be14e7859c6b0252588391e47db34527de"
"sha256:6e7a3c2934694d59ad334c93dd1b6c96699cf24c53fdb8ec848ac6b23e685734",
"sha256:d6609ae5ec3d56212ca7d802eda654eaf2310000816ce815361041465b108be4"
],
"version": "==2.10.3"
"version": "==2.11.0"
},
"jsonschema": {
"hashes": [
"sha256:2fa0684276b6333ff3c0b1b27081f4b2305f0a36cf702a23db50edb141893c3f",
"sha256:94c0a13b4a0616458b42529091624e66700a17f847453e52279e35509a5b7631"
"sha256:4e5b3cf8216f577bee9ce139cbe72eca3ea4f292ec60928ff24758ce626cd163",
"sha256:c8a85b28d377cc7737e46e2d9f2b4f44ee3c0e1deac6bf46ddefc7187d30797a"
],
"version": "==3.1.1"
"version": "==3.2.0"
},
"lief": {
"hashes": [
"sha256:0efba18d7b9776529ea5c18c771b35871896a8ceb95a19351e50d4813a11c632",
"sha256:3d9c7bb1e353e875f295a72a58d3a37ae1ba3e1ff1beb57b8a65f1a726064093",
"sha256:3db5939e7d95f776f9866586128c2a5be614eaec43ab985ac27ff2c531f8ac5f",
"sha256:4c61598818b0091d80839875aa107cfd10ae1017a3e9c9de4bc002622b8e3179",
"sha256:4f26d07bdada8ca5ef3dc5fa2f71f20f7e8ab4f78f7c5e00134477f51feb6a80",
"sha256:55fe3c8a0990dce16ab5bf88df707f1eacac4eb34561667ac478497e0e0807c7",
"sha256:68bcf18e40c9412d2d08d6311e04eb6c19e20ec174764706da2d602c45aa4fd5",
"sha256:7ff910d99361022451e9c25e34cb844768e2fa347cfb0f4ad70f531810d776d4",
"sha256:ac571152d0b864e8d376bc733c5728a224316be1cdefc290174f1bf8ab10ec70",
"sha256:dd17a7cdcd29a2efca3d4cb4fb078a06daf1cafec8912560965a8d8dbf346739",
"sha256:efa5f3523c01f7f0f5f2c14e5ac808e2447d1435c6a2872e5ab1a97ef1b0db9b",
"sha256:f1aadb344b5e14b308167bd2c9f31f1915e3c4e3f9a9ca92ff7b7bfbede5034c"
"sha256:276cc63ec12a21bdf01b8d30962692c17499788234f0765247ca7a35872097ec",
"sha256:3e6baaeb52bdc339b5f19688b58fd8d5778b92e50221f920cedfa2bec1f4d5c2",
"sha256:45e5c592b57168c447698381d927eb2386ffdd52afe0c48245f848d4cc7ee05a",
"sha256:6547752b5db105cd41c9fa65d0d7452a4d7541b77ffee716b46246c6d81e172f",
"sha256:83b51e01627b5982662f9550ac1230758aa56945ed86829e4291932d98417da3",
"sha256:895599194ea7495bf304e39317b04df20cccf799fc2751867cc1aa4997cfcdae",
"sha256:8a91cee2568306fe1d2bf84341b459c85368317d01d7105fa49e4f4ede837076",
"sha256:913b36a67707dc2afa72f117bab9856ea3f434f332b04a002a0f9723c8779320",
"sha256:9f604a361a3b1b3ed5fdafed0321c5956cb3b265b5efe2250d1bf8911a80c65b",
"sha256:a487fe7234c04bccd58223dbb79214421176e2629814c7a4a887764cceb5be7c",
"sha256:bc8488fb0661cb436fe4bb4fe947d0f9aa020e9acaed233ccf01ab04d888c68a",
"sha256:bddbf333af62310a10cb738a1df1dc2b140dd9c663b55ba3500c10c249d416d2",
"sha256:cce48d7c97cef85e01e6cfeff55f2068956b5c0257eb9c2d2c6d15e33dd1e4fc",
"sha256:f8b3f66956c56b582b3adc573bf2a938c25fb21c8894b373a113e24c494fc982"
],
"version": "==0.10.0.dev0"
"version": "==0.10.1"
},
"markupsafe": {
"hashes": [
@ -465,13 +447,16 @@
"sha256:09027a7803a62ca78792ad89403b1b7a73a01c8cb65909cd876f7fcebd79b161",
"sha256:09c4b7f37d6c648cb13f9230d847adf22f8171b1ccc4d5682398e77f40309235",
"sha256:1027c282dad077d0bae18be6794e6b6b8c91d58ed8a8d89a89d59693b9131db5",
"sha256:13d3144e1e340870b25e7b10b98d779608c02016d5184cfb9927a9f10c689f42",
"sha256:24982cc2533820871eba85ba648cd53d8623687ff11cbb805be4ff7b4c971aff",
"sha256:29872e92839765e546828bb7754a68c418d927cd064fd4708fab9fe9c8bb116b",
"sha256:43a55c2930bbc139570ac2452adf3d70cdbb3cfe5912c71cdce1c2c6bbd9c5d1",
"sha256:46c99d2de99945ec5cb54f23c8cd5689f6d7177305ebff350a58ce5f8de1669e",
"sha256:500d4957e52ddc3351cabf489e79c91c17f6e0899158447047588650b5e69183",
"sha256:535f6fc4d397c1563d08b88e485c3496cf5784e927af890fb3c3aac7f933ec66",
"sha256:596510de112c685489095da617b5bcbbac7dd6384aeebeda4df6025d0256a81b",
"sha256:62fe6c95e3ec8a7fad637b7f3d372c15ec1caa01ab47926cfdf7a75b40e0eac1",
"sha256:6788b695d50a51edb699cb55e35487e430fa21f1ed838122d722e0ff0ac5ba15",
"sha256:6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1",
"sha256:717ba8fe3ae9cc0006d7c451f0bb265ee07739daf76355d06366154ee68d221e",
"sha256:79855e1c5b8da654cf486b830bd42c06e8780cea587384cf6545b7d9ac013a0b",
@ -488,29 +473,58 @@
"sha256:ba59edeaa2fc6114428f1637ffff42da1e311e29382d81b339c1817d37ec93c6",
"sha256:c8716a48d94b06bb3b2524c2b77e055fb313aeb4ea620c8dd03a105574ba704f",
"sha256:cd5df75523866410809ca100dc9681e301e3c27567cf498077e8551b6d20e42f",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7"
"sha256:cdb132fc825c38e1aeec2c8aa9338310d29d337bebbd7baa06889d09a60a1fa2",
"sha256:e249096428b3ae81b08327a63a485ad0878de3fb939049038579ac0ef61e17e7",
"sha256:e8313f01ba26fbbe36c7be1966a7b7424942f670f38e666995b88d012765b9be"
],
"version": "==1.1.1"
},
"mccabe": {
"hashes": [
"sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42",
"sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f"
],
"version": "==0.6.1"
},
"memory-profiler": {
"hashes": [
"sha256:5fa47b274c929dd2cbcd9190afb62fec110701251d2ac2d301caaf545c81afc1"
"sha256:23b196f91ea9ac9996e30bfab1e82fecc30a4a1d24870e81d1e81625f786a2c3"
],
"index": "pypi",
"version": "==0.55.0"
"version": "==0.57.0"
},
"more-itertools": {
"mypy": {
"hashes": [
"sha256:409cd48d4db7052af495b09dec721011634af3753ae1ef92d2b32f73a745f832",
"sha256:92b8c4b06dac4f0611c0729b2f2ede52b2e1bac1ab48f089c7ddc12e26bb60c4"
"sha256:0a9a45157e532da06fe56adcfef8a74629566b607fa2c1ac0122d1ff995c748a",
"sha256:2c35cae79ceb20d47facfad51f952df16c2ae9f45db6cb38405a3da1cf8fc0a7",
"sha256:4b9365ade157794cef9685791032521233729cb00ce76b0ddc78749abea463d2",
"sha256:53ea810ae3f83f9c9b452582261ea859828a9ed666f2e1ca840300b69322c474",
"sha256:634aef60b4ff0f650d3e59d4374626ca6153fcaff96ec075b215b568e6ee3cb0",
"sha256:7e396ce53cacd5596ff6d191b47ab0ea18f8e0ec04e15d69728d530e86d4c217",
"sha256:7eadc91af8270455e0d73565b8964da1642fe226665dd5c9560067cd64d56749",
"sha256:7f672d02fffcbace4db2b05369142e0506cdcde20cea0e07c7c2171c4fd11dd6",
"sha256:85baab8d74ec601e86134afe2bcccd87820f79d2f8d5798c889507d1088287bf",
"sha256:87c556fb85d709dacd4b4cb6167eecc5bbb4f0a9864b69136a0d4640fdc76a36",
"sha256:a6bd44efee4dc8c3324c13785a9dc3519b3ee3a92cada42d2b57762b7053b49b",
"sha256:c6d27bd20c3ba60d5b02f20bd28e20091d6286a699174dfad515636cb09b5a72",
"sha256:e2bb577d10d09a2d8822a042a23b8d62bc3b269667c9eb8e60a6edfa000211b1",
"sha256:f97a605d7c8bc2c6d1172c2f0d5a65b24142e11a58de689046e62c2d632ca8c1"
],
"version": "==7.2.0"
"index": "pypi",
"version": "==0.761"
},
"mypy-extensions": {
"hashes": [
"sha256:090fedd75945a69ae91ce1303b5824f428daf5a028d2f6ab8a299250a846f15d",
"sha256:2d82818f5bb3e369420cb3c4060a7970edba416647068eb4c5343488a6c604a8"
],
"version": "==0.4.3"
},
"neobolt": {
"hashes": [
"sha256:56b86b8b2c3facdd54589e60ecd22e0234d6f40645ab2e2cf87ef0cd79df20af"
"sha256:ca4e87679fe3ed39aec23638658e02dbdc6bbc3289a04e826f332e05ab32275d"
],
"version": "==1.7.15"
"version": "==1.7.16"
},
"neotime": {
"hashes": [
@ -529,45 +543,37 @@
},
"packaging": {
"hashes": [
"sha256:28b924174df7a2fa32c1953825ff29c61e2f5e082343165438812f00d3a7fc47",
"sha256:d9551545c6d761f3def1677baf08ab2a3ca17c56879e70fecba2fc4dde4ed108"
"sha256:170748228214b70b672c581a3dd610ee51f733018650740e98c7df862a583f73",
"sha256:e665345f9eef0c621aa0bf2f8d78cf6d21904eef16a93f020240b704a57f1334"
],
"version": "==19.2"
"version": "==20.1"
},
"pillow": {
"hashes": [
"sha256:047d9473cf68af50ac85f8ee5d5f21a60f849bc17d348da7fc85711287a75031",
"sha256:0f66dc6c8a3cc319561a633b6aa82c44107f12594643efa37210d8c924fc1c71",
"sha256:12c9169c4e8fe0a7329e8658c7e488001f6b4c8e88740e76292c2b857af2e94c",
"sha256:248cffc168896982f125f5c13e9317c059f74fffdb4152893339f3be62a01340",
"sha256:27faf0552bf8c260a5cee21a76e031acaea68babb64daf7e8f2e2540745082aa",
"sha256:285edafad9bc60d96978ed24d77cdc0b91dace88e5da8c548ba5937c425bca8b",
"sha256:384b12c9aa8ef95558abdcb50aada56d74bc7cc131dd62d28c2d0e4d3aadd573",
"sha256:38950b3a707f6cef09cd3cbb142474357ad1a985ceb44d921bdf7b4647b3e13e",
"sha256:4aad1b88933fd6dc2846552b89ad0c74ddbba2f0884e2c162aa368374bf5abab",
"sha256:4ac6148008c169603070c092e81f88738f1a0c511e07bd2bb0f9ef542d375da9",
"sha256:4deb1d2a45861ae6f0b12ea0a786a03d19d29edcc7e05775b85ec2877cb54c5e",
"sha256:59aa2c124df72cc75ed72c8d6005c442d4685691a30c55321e00ed915ad1a291",
"sha256:5a47d2123a9ec86660fe0e8d0ebf0aa6bc6a17edc63f338b73ea20ba11713f12",
"sha256:5cc901c2ab9409b4b7ac7b5bcc3e86ac14548627062463da0af3b6b7c555a871",
"sha256:6c1db03e8dff7b9f955a0fb9907eb9ca5da75b5ce056c0c93d33100a35050281",
"sha256:7ce80c0a65a6ea90ef9c1f63c8593fcd2929448613fc8da0adf3e6bfad669d08",
"sha256:809c19241c14433c5d6135e1b6c72da4e3b56d5c865ad5736ab99af8896b8f41",
"sha256:83792cb4e0b5af480588601467c0764242b9a483caea71ef12d22a0d0d6bdce2",
"sha256:846fa202bd7ee0f6215c897a1d33238ef071b50766339186687bd9b7a6d26ac5",
"sha256:9f5529fc02009f96ba95bea48870173426879dc19eec49ca8e08cd63ecd82ddb",
"sha256:a423c2ea001c6265ed28700df056f75e26215fd28c001e93ef4380b0f05f9547",
"sha256:ac4428094b42907aba5879c7c000d01c8278d451a3b7cccd2103e21f6397ea75",
"sha256:b1ae48d87f10d1384e5beecd169c77502fcc04a2c00a4c02b85f0a94b419e5f9",
"sha256:bf4e972a88f8841d8fdc6db1a75e0f8d763e66e3754b03006cbc3854d89f1cb1",
"sha256:c6414f6aad598364aaf81068cabb077894eb88fed99c6a65e6e8217bab62ae7a",
"sha256:c710fcb7ee32f67baf25aa9ffede4795fd5d93b163ce95fdc724383e38c9df96",
"sha256:c7be4b8a09852291c3c48d3c25d1b876d2494a0a674980089ac9d5e0d78bd132",
"sha256:c9e5ffb910b14f090ac9c38599063e354887a5f6d7e6d26795e916b4514f2c1a",
"sha256:e0697b826da6c2472bb6488db4c0a7fa8af0d52fa08833ceb3681358914b14e5",
"sha256:e9a3edd5f714229d41057d56ac0f39ad9bdba6767e8c888c951869f0bdd129b0"
"sha256:0a628977ac2e01ca96aaae247ec2bd38e729631ddf2221b4b715446fd45505be",
"sha256:4d9ed9a64095e031435af120d3c910148067087541131e82b3e8db302f4c8946",
"sha256:54ebae163e8412aff0b9df1e88adab65788f5f5b58e625dc5c7f51eaf14a6837",
"sha256:5bfef0b1cdde9f33881c913af14e43db69815c7e8df429ceda4c70a5e529210f",
"sha256:5f3546ceb08089cedb9e8ff7e3f6a7042bb5b37c2a95d392fb027c3e53a2da00",
"sha256:5f7ae9126d16194f114435ebb79cc536b5682002a4fa57fa7bb2cbcde65f2f4d",
"sha256:62a889aeb0a79e50ecf5af272e9e3c164148f4bd9636cc6bcfa182a52c8b0533",
"sha256:7406f5a9b2fd966e79e6abdaf700585a4522e98d6559ce37fc52e5c955fade0a",
"sha256:8453f914f4e5a3d828281a6628cf517832abfa13ff50679a4848926dac7c0358",
"sha256:87269cc6ce1e3dee11f23fa515e4249ae678dbbe2704598a51cee76c52e19cda",
"sha256:875358310ed7abd5320f21dd97351d62de4929b0426cdb1eaa904b64ac36b435",
"sha256:8ac6ce7ff3892e5deaab7abaec763538ffd011f74dc1801d93d3c5fc541feee2",
"sha256:91b710e3353aea6fc758cdb7136d9bbdcb26b53cefe43e2cba953ac3ee1d3313",
"sha256:9d2ba4ed13af381233e2d810ff3bab84ef9f18430a9b336ab69eaf3cd24299ff",
"sha256:a62ec5e13e227399be73303ff301f2865bf68657d15ea50b038d25fc41097317",
"sha256:ab76e5580b0ed647a8d8d2d2daee170e8e9f8aad225ede314f684e297e3643c2",
"sha256:bf4003aa538af3f4205c5fac56eacaa67a6dd81e454ffd9e9f055fff9f1bc614",
"sha256:bf598d2e37cf8edb1a2f26ed3fb255191f5232badea4003c16301cb94ac5bdd0",
"sha256:c18f70dc27cc5d236f10e7834236aff60aadc71346a5bc1f4f83a4b3abee6386",
"sha256:c5ed816632204a2fc9486d784d8e0d0ae754347aba99c811458d69fcdfd2a2f9",
"sha256:dc058b7833184970d1248135b8b0ab702e6daa833be14035179f2acb78ff5636",
"sha256:ff3797f2f16bf9d17d53257612da84dd0758db33935777149b3334c01ff68865"
],
"version": "==6.2.1"
"version": "==7.0.0"
},
"prompt-toolkit": {
"hashes": [
@ -579,19 +585,19 @@
},
"psutil": {
"hashes": [
"sha256:021d361439586a0fd8e64f8392eb7da27135db980f249329f1a347b9de99c695",
"sha256:145e0f3ab9138165f9e156c307100905fd5d9b7227504b8a9d3417351052dc3d",
"sha256:348ad4179938c965a27d29cbda4a81a1b2c778ecd330a221aadc7bd33681afbd",
"sha256:3feea46fbd634a93437b718518d15b5dd49599dfb59a30c739e201cc79bb759d",
"sha256:474e10a92eeb4100c276d4cc67687adeb9d280bbca01031a3e41fb35dfc1d131",
"sha256:47aeb4280e80f27878caae4b572b29f0ec7967554b701ba33cd3720b17ba1b07",
"sha256:73a7e002781bc42fd014dfebb3fc0e45f8d92a4fb9da18baea6fb279fbc1d966",
"sha256:d051532ac944f1be0179e0506f6889833cf96e466262523e57a871de65a15147",
"sha256:dfb8c5c78579c226841908b539c2374da54da648ee5a837a731aa6a105a54c00",
"sha256:e3f5f9278867e95970854e92d0f5fe53af742a7fc4f2eba986943345bcaed05d",
"sha256:e9649bb8fc5cea1f7723af53e4212056a6f984ee31784c10632607f472dec5ee"
"sha256:094f899ac3ef72422b7e00411b4ed174e3c5a2e04c267db6643937ddba67a05b",
"sha256:10b7f75cc8bd676cfc6fa40cd7d5c25b3f45a0e06d43becd7c2d2871cbb5e806",
"sha256:1b1575240ca9a90b437e5a40db662acd87bbf181f6aa02f0204978737b913c6b",
"sha256:21231ef1c1a89728e29b98a885b8e0a8e00d09018f6da5cdc1f43f988471a995",
"sha256:28f771129bfee9fc6b63d83a15d857663bbdcae3828e1cb926e91320a9b5b5cd",
"sha256:70387772f84fa5c3bb6a106915a2445e20ac8f9821c5914d7cbde148f4d7ff73",
"sha256:b560f5cd86cf8df7bcd258a851ca1ad98f0d5b8b98748e877a0aec4e9032b465",
"sha256:b74b43fecce384a57094a83d2778cdfc2e2d9a6afaadd1ebecb2e75e0d34e10d",
"sha256:e85f727ffb21539849e6012f47b12f6dd4c44965e56591d8dec6e8bc9ab96f4a",
"sha256:fd2e09bb593ad9bdd7429e779699d2d47c1268cbde4dda95fcd1bd17544a0217",
"sha256:ffad8eb2ac614518bbe3c0b8eb9dffdb3a8d2e3a7d5da51c5b974fb723a5c5aa"
],
"version": "==5.6.5"
"version": "==5.6.7"
},
"py2neo": {
"hashes": [
@ -599,12 +605,26 @@
],
"version": "==4.3.0"
},
"pycodestyle": {
"hashes": [
"sha256:95a2219d12372f05704562a14ec30bc76b05a5b297b21a5dfe3f6fac3491ae56",
"sha256:e40a936c9a450ad81df37f549d676d127b1b66000a6c500caa2b085bc0ca976c"
],
"version": "==2.5.0"
},
"pydeep": {
"hashes": [
"sha256:22866eb422d1d5907f8076ee792da65caecb172425d27576274e2a8eacf6afc1"
],
"version": "==0.4"
},
"pyflakes": {
"hashes": [
"sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0",
"sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2"
],
"version": "==2.1.1"
},
"pygments": {
"hashes": [
"sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a",
@ -624,16 +644,16 @@
},
"pyparsing": {
"hashes": [
"sha256:20f995ecd72f2a1f4bf6b072b63b22e2eb457836601e76d6e5dfcd75436acc1f",
"sha256:4ca62001be367f01bd3e92ecbb79070272a9d4964dce6a48a82ff0b8bc7e683a"
"sha256:4c830582a84fb022400b85429791bc551f1f4871c33f23e44f353119e92f969f",
"sha256:c342dccb5250c08d45fd6f8b4a559613ca603b57498511740e65cd11a2e7dcec"
],
"version": "==2.4.5"
"version": "==2.4.6"
},
"pyrsistent": {
"hashes": [
"sha256:eb6545dbeb1aa69ab1fb4809bfbf5a8705e44d92ef8fc7c2361682a47c46c778"
"sha256:cdc7b5e3ed77bed61270a47d35434a30617b9becdf2478af76ad2c6ade307280"
],
"version": "==0.15.5"
"version": "==0.15.7"
},
"python-dateutil": {
"hashes": [
@ -665,36 +685,36 @@
},
"reportlab": {
"hashes": [
"sha256:149f0eeb4ea716441638b05fd6d3667d32f1463f3eac50b63e100a73a5533cdd",
"sha256:1aa9a2e1a87749db265b592ad25e498b39f70fce9f53a012cdf69f74259b6e43",
"sha256:1f5ce489adb2db2862249492e6367539cfa65b781cb06dcf13363dc52219be7e",
"sha256:23b28ba1784a6c52a926c075abd9f396d03670e71934b24db5ff684f8b870e0f",
"sha256:3d3de0f4facdd7e3c56ecbc55733a958b86c35a8e7ba6066c7b1ba383e282f58",
"sha256:484d346b8f463ba2ddaf6d365c6ac5971cd062528b6d5ba68cac02b9435366c5",
"sha256:4da2467def21f2e20720b21f6c18e7f7866720a955c716b990e94e3979fe913f",
"sha256:5ebdf22daee7d8e630134d94f477fe6abd65a65449d4eec682a7b458b5249604",
"sha256:655a1b68be18a73fec5233fb5d81f726b4db32269e487aecf5b6853cca926d86",
"sha256:6c535a304888dafe50c2c24d4924aeefc11e0542488ee6965f6133d415e86bbc",
"sha256:7560ef655ac6448bb257fd34bfdfb8d546f9c7c0900ed8963fb8509f75e8ca80",
"sha256:7a1c2fa3e6310dbe47efee2020dc0f25be7a75ff09a8fedc4a87d4397f3810c1",
"sha256:817c344b9aa53b5bfc2f58ff82111a1e85ca4c8b68d1add088b547360a6ebcfa",
"sha256:81d950e398d6758aeaeeb267aa1a62940735414c980f77dd0a270cef1782a43d",
"sha256:83ef44936ef4e9c432d62bc2b72ec8d772b87af319d123e827a72e9b6884c851",
"sha256:9f975adc2c7a236403f0bc91d7a3916e644e47b1f1e3990325f15e73b83581ec",
"sha256:a5ca59e2b7e70a856de6db9dadd3e11a1b3b471c999585284d5c1d479c01cf5d",
"sha256:ad2cf5a673c05fae9e91e987994b95205c13c5fa55d7393cf8b06f9de6f92990",
"sha256:b8c3d76276372f87b7c8ff22065dbc072cca5ffb06ba0267edc298df7acf942d",
"sha256:b93f7f908e916d9413dd8c04da1ccb3977e446803f59078424decdc0de449133",
"sha256:c0ecd0af92c759edec0d24ba92f4a18c28d4a19229ae7c8249f94e82f3d76288",
"sha256:c9e38eefc90a02c072a87a627ff66b2d67c23f6f82274d2aa7fb28e644e8f409",
"sha256:ca2a1592d2e181a04372d0276ee847308ea206dfe7c86fe94769e7ac126e6e85",
"sha256:ce1dfc9beec83e66250ca3afaf5ddf6b9a3ce70a30a9526dec7c6bec3266baf1",
"sha256:d3550c90751132b26b72a78954905974f33b1237335fbe0d8be957f9636c376a",
"sha256:e35a574f4e5ec0fdd5dc354e74ec143d853abd7f76db435ffe2a57d0161a22eb",
"sha256:ee5cafca6ef1a38fef8cbf3140dd2198ad1ee82331530b546039216ef94f93cb",
"sha256:fa1c969176cb3594a785c6818bcb943ebd49453791f702380b13a35fa23b385a"
"sha256:2a1c4ea2155fd5b6e3f89e36b8aa21b5a14c9bbaf9b44de2787641668bc95edc",
"sha256:2b7469a98df1315d4f52319c4438eaee3fdd17330830edadae775e9312402638",
"sha256:3b556160aac294fa661545245e4bc273328f9226e5110139647f4d4bc0cfc453",
"sha256:3eb25d2c2bde078815d8f7ea400abbcae16a0c498a4b27ead3c4a620b1f1f980",
"sha256:3f229c0b2ca27eb5b08777981d3bd0d34e59bfa306627b88d80c3734cd3e26d5",
"sha256:4695755cc70b7a9308508aa41eafc3f335348be0eadd86e8f92cb87815d6177b",
"sha256:4f97b4474e419ae5c441ecdf0db8eceb5f5af0461bdf73e3e5ec05353844045c",
"sha256:550d2d8516e468192e12be8aeaf80f3bd805dc46dd0a5a4ddf2a3e1cd8149a16",
"sha256:59aa9c4ca80d397f6cabec092b5a6e2304fb1b7ca53e5b650872aae13ebfeb68",
"sha256:6e4479b75778b9c1e4640dc90efb72cb990471d56089947d6be4ccd9e7a56a3c",
"sha256:6e9434bd0afa6d6fcf9abbc565750cc456b6e60dc49abd7cd2bc7cf414ee079b",
"sha256:73e4e30b72da1f9f8caba775ad9cc027957c2340c38ba2d6622a9f2351b12c3a",
"sha256:7c05c2ba8ab32f02b23a56a75a4d136c2bfb7221a04a8306835a938fa6711644",
"sha256:849e4cabce1ed1183e83dc89570810b3bf9bf9cf0d0a605bde854a0baf212124",
"sha256:863c6fcf5fc0c8184b6315885429f5468373a3def2eb0c0073d09b79b2161113",
"sha256:8e688df260682038ecd32f106d796024fbcf70e7bf54340b14f991bd5465f97a",
"sha256:9675a26d01ec141cb717091bb139b6227bfb3794f521943101da50327bff4825",
"sha256:969b0d9663c0c641347d2408d41e6723e84d9f7863babc94438c91295c74f36d",
"sha256:978560732758bf5fca4ec1ed124afe2702d08824f6b0364cca31519bd5e7dadd",
"sha256:99ea85b47248c6cdbece147bdbd67aed16209bdd95770aa1f151ec3bb8794496",
"sha256:9cdc318c37fa959909db5beb05ca0b684d3e2cba8f40af1ce6f332c3f69bd2b8",
"sha256:b55c26510ff7f135af8eae1216372028cde7dab22003d918649fce219020eb58",
"sha256:cb301340b4fc1f2b7b25ea4584c5cbde139ced2d4ff01ad5e8fcf7d7822982b0",
"sha256:e7578a573454a5490553fb091374996d32269dff44021a401763080bda1357cf",
"sha256:e84387d35a666aafafda332afca8a75fb04f097cc0a2dc2d04e8c90a83cf7c1b",
"sha256:eb66eff64ea75f028af3ac63a7a2bf1e8733297141a85cbdffd5deaef404fa52",
"sha256:f5e3afd2cc35a73f34c3084c69fe4653591611da5189e50b58db550bb46e340a",
"sha256:f6c10628386bfe0c1f6640c28fb262d0960bb26c249cefabb755fb273323220d"
],
"version": "==3.5.32"
"version": "==3.5.34"
},
"requests": {
"hashes": [
@ -713,10 +733,10 @@
},
"six": {
"hashes": [
"sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd",
"sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66"
"sha256:236bdbdce46e6e6a3d61a337c0f8b763ca1e8717c03b369e87a7ec7ce1319c0a",
"sha256:8f3cd2e254d8f793e7f3d6d9df77b92252b52637291d0f0da013c76ea2724b6c"
],
"version": "==1.13.0"
"version": "==1.14.0"
},
"snowballstemmer": {
"hashes": [
@ -734,10 +754,10 @@
},
"sphinx": {
"hashes": [
"sha256:31088dfb95359384b1005619827eaee3056243798c62724fd3fa4b84ee4d71bd",
"sha256:52286a0b9d7caa31efee301ec4300dbdab23c3b05da1c9024b4e84896fb73d79"
"sha256:298537cb3234578b2d954ff18c5608468229e116a9757af3b831c2b2b4819159",
"sha256:e6e766b74f85f37a5f3e0773a1e1be8db3fcb799deb58ca6d18b70b0b44542a5"
],
"version": "==2.2.1"
"version": "==2.3.1"
},
"sphinx-autodoc-typehints": {
"hashes": [
@ -788,38 +808,65 @@
],
"version": "==1.1.3"
},
"typed-ast": {
"hashes": [
"sha256:0666aa36131496aed8f7be0410ff974562ab7eeac11ef351def9ea6fa28f6355",
"sha256:0c2c07682d61a629b68433afb159376e24e5b2fd4641d35424e462169c0a7919",
"sha256:249862707802d40f7f29f6e1aad8d84b5aa9e44552d2cc17384b209f091276aa",
"sha256:24995c843eb0ad11a4527b026b4dde3da70e1f2d8806c99b7b4a7cf491612652",
"sha256:269151951236b0f9a6f04015a9004084a5ab0d5f19b57de779f908621e7d8b75",
"sha256:4083861b0aa07990b619bd7ddc365eb7fa4b817e99cf5f8d9cf21a42780f6e01",
"sha256:498b0f36cc7054c1fead3d7fc59d2150f4d5c6c56ba7fb150c013fbc683a8d2d",
"sha256:4e3e5da80ccbebfff202a67bf900d081906c358ccc3d5e3c8aea42fdfdfd51c1",
"sha256:6daac9731f172c2a22ade6ed0c00197ee7cc1221aa84cfdf9c31defeb059a907",
"sha256:715ff2f2df46121071622063fc7543d9b1fd19ebfc4f5c8895af64a77a8c852c",
"sha256:73d785a950fc82dd2a25897d525d003f6378d1cb23ab305578394694202a58c3",
"sha256:8c8aaad94455178e3187ab22c8b01a3837f8ee50e09cf31f1ba129eb293ec30b",
"sha256:8ce678dbaf790dbdb3eba24056d5364fb45944f33553dd5869b7580cdbb83614",
"sha256:aaee9905aee35ba5905cfb3c62f3e83b3bec7b39413f0a7f19be4e547ea01ebb",
"sha256:bcd3b13b56ea479b3650b82cabd6b5343a625b0ced5429e4ccad28a8973f301b",
"sha256:c9e348e02e4d2b4a8b2eedb48210430658df6951fa484e59de33ff773fbd4b41",
"sha256:d205b1b46085271b4e15f670058ce182bd1199e56b317bf2ec004b6a44f911f6",
"sha256:d43943ef777f9a1c42bf4e552ba23ac77a6351de620aa9acf64ad54933ad4d34",
"sha256:d5d33e9e7af3b34a40dc05f498939f0ebf187f07c385fd58d591c533ad8562fe",
"sha256:fc0fea399acb12edbf8a628ba8d2312f583bdbdb3335635db062fa98cf71fca4",
"sha256:fe460b922ec15dd205595c9b5b99e2f056fd98ae8f9f56b888e7a17dc2b757e7"
],
"version": "==1.4.1"
},
"typing-extensions": {
"hashes": [
"sha256:091ecc894d5e908ac75209f10d5b4f118fbdb2eb1ede6a63544054bb1edb41f2",
"sha256:910f4656f54de5993ad9304959ce9bb903f90aadc7c67a0bef07e678014e892d",
"sha256:cf8b63fedea4d89bab840ecbb93e75578af28f76f66c35889bd7065f5af88575"
],
"version": "==3.7.4.1"
},
"urllib3": {
"hashes": [
"sha256:a8a318824cc77d1fd4b2bec2ded92646630d7fe8619497b142c84a9e6f5a7293",
"sha256:f3c5fd51747d450d4dcf6f923c81f78f811aab8205fda64b0aba34a4e48b0745"
"sha256:2f3db8b19923a873b3e5256dc9c2dedfa883e33d87c690d9c7913e1f40673cdc",
"sha256:87716c2d2a7121198ebcb7ce7cccf6ce5e9ba539041cfbaeecfb641dc0bf6acc"
],
"version": "==1.25.7"
"version": "==1.25.8"
},
"validators": {
"hashes": [
"sha256:f0ac832212e3ee2e9b10e156f19b106888cf1429c291fbc5297aae87685014ae"
"sha256:b192e6bde7d617811d59f50584ed240b580375648cd032d106edeb3164099508"
],
"version": "==0.14.0"
"version": "==0.14.2"
},
"wcwidth": {
"hashes": [
"sha256:3df37372226d6e63e1b1e1eda15c594bca98a22d33a23832a90998faa96bc65e",
"sha256:f4ebe71925af7b40a864553f761ed559b43544f8f71746c2d756c7fe788ade7c"
"sha256:8fd29383f539be45b20bd4df0dc29c20ba48654a41e661925e612311e9f3c603",
"sha256:f28b3e8a6483e5d49e7f8949ac1a78314e740333ae305b4ba5defd3e74fb37a8"
],
"version": "==0.1.7"
"version": "==0.1.8"
},
"wrapt": {
"hashes": [
"sha256:565a021fd19419476b9362b05eeaa094178de64f8361e44468f9e9d7843901e1"
],
"version": "==1.11.2"
},
"zipp": {
"hashes": [
"sha256:3718b1cbcd963c7d4c5511a8240812904164b7f381b647143a89d3b98f9bcd8e",
"sha256:f06903e9f1f43b12d371004b4ac7b06ab39a44adc747266928ae6debfa7b3335"
],
"version": "==0.6.0"
}
}
}

View File

@ -1,3 +1,5 @@
**IMPORTANT NOTE**: This library will require **at least** Python 3.6 starting on the 1st of January 2020. If you have to use legacy versions of Python, please use PyMISP v2.4.119.1, and consider updating your system(s). Anything released within the last 2 years will do, starting with Ubuntu 18.04.
README
======
@ -29,7 +31,7 @@ pip3 install pymisp
```
git clone https://github.com/MISP/PyMISP.git && cd PyMISP
git submodule update --init
pip3 install -I .[fileobjects,neo,openioc,virustotal]
pip3 install -I .[fileobjects,openioc,virustotal]
```
## Installing it with virtualenv
@ -38,10 +40,10 @@ It is recommended to use virtualenv so you do not pollute your OS Python environment.
```
pip3 install virtualenv
git clone https://github.com/MISP/PyMISP.git && cd PyMISP
python3 -m venv ./
python3 -m venv ./venv
source venv/bin/activate
git submodule update --init
pip3 install -I .[fileobjects,neo,openioc,virustotal]
pip3 install -I .[fileobjects,openioc,virustotal]
```
## Running the tests

View File

@ -192,7 +192,7 @@
"source": [
"## Set parameters (inline)\n",
"\n",
"This is the was to pass other parameters"
"This is the way to pass other parameters"
]
},
{
@ -603,7 +603,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Use locally defined objet templates\n",
"## Use locally defined object templates\n",
"\n",
"**Important**: The path you pass as parameter for `misp_objects_path_custom` needs to contain a directory equals to the value of the parameter `name` (same structure as the content of the `misp-object` repository)\n"
]
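The cell above only names the `misp_objects_path_custom` parameter. A minimal sketch of how it is typically passed, assuming a hypothetical local template at `./custom_objects/my-custom-object/definition.json` that mirrors the layout of the misp-objects repository:

```python
from pymisp import MISPEvent, MISPObject

# Hypothetical local template directory: ./custom_objects/my-custom-object/definition.json
custom_templates = './custom_objects'

event = MISPEvent()
event.info = 'Event using a locally defined object template'

# The object name must match the directory name inside custom_templates
misp_object = MISPObject(name='my-custom-object', misp_objects_path_custom=custom_templates)
misp_object.add_attribute('some-relation', value='some value')  # relation defined in definition.json
event.add_object(misp_object)
```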
@ -654,7 +654,7 @@
"source": [
"## Use lief to extract indicators out of binaries\n",
"\n",
"An other cool helper: one liner to whom you can pass the path to a binary, if it is supported by `lief` (PE/ELF/Mach-o), you get the the file object, a PE, ELF, or Mach-o object, and the relevant sections.\n",
"An other cool helper: one liner to whom you can pass the path to a binary, if it is supported by `lief` (PE/ELF/Mach-o), you get the file object, a PE, ELF, or Mach-o object, and the relevant sections.\n",
"\n",
"If it is anything else, it will just generate the the file object.\n"
]
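The one-liner referred to above is not named in this cell; in PyMISP it is presumably `pymisp.tools.make_binary_objects`. A rough sketch, with `samples/putty.exe` as a placeholder path:

```python
from pymisp import MISPEvent
from pymisp.tools import make_binary_objects

event = MISPEvent()
event.info = 'Indicators extracted from a binary with lief'

# 'samples/putty.exe' is a placeholder; any file works, but lief parsing
# only succeeds for PE/ELF/Mach-O binaries.
file_obj, bin_obj, sections = make_binary_objects('samples/putty.exe')

event.add_object(file_obj)
if bin_obj:  # only set when lief could parse the binary
    event.add_object(bin_obj)
    for section in sections or []:
        event.add_object(section)
```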

View File

@ -413,7 +413,7 @@
"\n",
"event_ids = set()\n",
"for attr in attributes:\n",
" event_ids.add(event_id)\n",
" event_ids.add(attr.event_id)\n",
"\n",
"# Fetch all related events\n",
"for event_id in event_ids:\n",

View File

@ -4,7 +4,7 @@
from pymisp import ExpandedPyMISP
from pymisp.tools import EMailObject
import traceback
from keys import misp_url, misp_key, misp_verifycert
from keys import misp_url, misp_key, misp_verifycert # type: ignore
import glob
import argparse

View File

@ -1,16 +0,0 @@
import json
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
from pymisp.tools import SBSignatureObject
pymisp = PyMISP(misp_url, misp_key, misp_verifycert)
a = json.loads('{"signatures":[{"new_data":[],"confidence":100,"families":[],"severity":1,"weight":0,"description":"AttemptstoconnecttoadeadIP:Port(2uniquetimes)","alert":false,"references":[],"data":[{"IP":"95.101.39.58:80(Europe)"},{"IP":"192.35.177.64:80(UnitedStates)"}],"name":"dead_connect"},{"new_data":[],"confidence":30,"families":[],"severity":2,"weight":1,"description":"PerformssomeHTTPrequests","alert":false,"references":[],"data":[{"url":"http://cert.int-x3.letsencrypt.org/"},{"url":"http://apps.identrust.com/roots/dstrootcax3.p7c"}],"name":"network_http"},{"new_data":[],"confidence":100,"families":[],"severity":2,"weight":1,"description":"Theofficefilehasaunconventionalcodepage:ANSICyrillic;Cyrillic(Windows)","alert":false,"references":[],"data":[],"name":"office_code_page"}]}')
a = [(x['name'], x['description']) for x in a["signatures"]]
b = SBSignatureObject(a)
template_id = [x['ObjectTemplate']['id'] for x in pymisp.get_object_templates_list() if x['ObjectTemplate']['name'] == 'sb-signature'][0]
pymisp.add_object(234111, template_id, b)

View File

@ -1,28 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
# For python2 & 3 compat, a bit dirty, but it seems to be the least bad one
try:
input = raw_input
except NameError:
pass
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Add the user described in the given json. If no file is provided, returns a json listing all the fields used to describe a user.')
parser.add_argument("-f", "--json_file", help="The name of the json file describing the user you want to create.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
if args.json_file is None:
print (misp.get_add_user_fields_list())
else:
print(misp.add_user_json(args.json_file))

View File

@ -1,21 +1,42 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import datetime
from dateutil.parser import parse
import csv
from pathlib import Path
import json
from uuid import uuid4
import requests
from pymisp import MISPEvent, MISPObject, MISPTag
from keys import misp_url, misp_key, misp_verifycert
from pymisp import ExpandedPyMISP
from pymisp import MISPEvent, MISPObject, MISPTag, MISPOrganisation
from pymisp.tools import feed_meta_generator
class Scrippts:
def __init__(self):
self.misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)
def __init__(self, output_dir: str= 'output', org_name: str='CIRCL',
org_uuid: str='55f6ea5e-2c60-40e5-964f-47a8950d210f'):
self.misp_org = MISPOrganisation()
self.misp_org.name = org_name
self.misp_org.uuid = org_uuid
self.output_dir = Path(output_dir)
self.output_dir.mkdir(exist_ok=True)
self.data_dir = self.output_dir / 'data'
self.data_dir.mkdir(exist_ok=True)
self.scrippts_meta_file = self.output_dir / '.meta_scrippts'
self.scrippts_meta = {}
if self.scrippts_meta_file.exists():
# Format: <infofield>,<uuid>.json
with self.scrippts_meta_file.open() as f:
reader = csv.reader(f)
for row in reader:
self.scrippts_meta[row[0]] = row[1]
else:
self.scrippts_meta_file.touch()
def geolocation_alt(self) -> MISPObject:
# Alert, NWT, Canada
@ -200,9 +221,7 @@ class Scrippts:
return tag
def fetch(self, url):
filepath = Path('scrippts') / Path(url).name
if filepath.exists():
return filepath
filepath = self.data_dir / Path(url).name
r = requests.get(url)
if r.status_code != 200 or r.text[0] != '"':
print(url)
@ -211,42 +230,45 @@ class Scrippts:
f.write(r.text)
return filepath
def get_existing_event_to_update(self, infofield):
found = self.misp.search(eventinfo=infofield, pythonify=True)
if found:
event = found[0]
return event
return False
def import_all(self, stations_short_names, interval, data_type):
object_creator = getattr(self, f'{interval}_flask_{data_type}')
if data_type == 'co2':
base_url = 'http://scrippsco2.ucsd.edu/assets/data/atmospheric/stations/flask_co2/'
base_url = 'https://scrippsco2.ucsd.edu/assets/data/atmospheric/stations/flask_co2/'
elif data_type in ['c13', 'o18']:
base_url = 'http://scrippsco2.ucsd.edu/assets/data/atmospheric/stations/flask_isotopic/'
base_url = 'https://scrippsco2.ucsd.edu/assets/data/atmospheric/stations/flask_isotopic/'
for station in stations_short_names:
url = f'{base_url}/{interval}/{interval}_flask_{data_type}_{station}.csv'
infofield = f'[{station.upper()}] {interval} average atmospheric {data_type} concentrations'
filepath = self.fetch(url)
if not filepath:
continue
update = True
event = self.get_existing_event_to_update(infofield)
if event:
location = event.get_objects_by_name('geolocation')[0]
if not event:
if infofield in self.scrippts_meta:
event = MISPEvent()
event.load_file(str(self.output_dir / self.scrippts_meta[infofield]))
location = event.get_objects_by_name('geolocation')[0]
update = True
else:
event = MISPEvent()
event.uuid = str(uuid4())
event.info = infofield
event.Orgc = self.misp_org
event.add_tag(getattr(self, f'tag_{station}')())
location = getattr(self, f'geolocation_{station}')()
event.add_object(location)
event.add_attribute('link', f'http://scrippsco2.ucsd.edu/data/atmospheric_co2/{station}')
event.add_attribute('link', f'https://scrippsco2.ucsd.edu/data/atmospheric_co2/{station}')
update = False
with self.scrippts_meta_file.open('a') as f:
writer = csv.writer(f)
writer.writerow([infofield, f'{event.uuid}.json'])
object_creator(event, location, filepath, update)
if update:
self.misp.update_event(event)
else:
self.misp.add_event(event)
# Bump the publish timestamp
event.publish_timestamp = datetime.datetime.timestamp(datetime.datetime.now())
feed_output = event.to_feed(with_meta=False)
with (self.output_dir / f'{event.uuid}.json').open('w') as f:
# json.dump(feed_output, f, indent=2, sort_keys=True) # For testing
json.dump(feed_output, f)
def import_monthly_co2_all(self):
to_import = ['alt', 'ptb', 'stp', 'ljo', 'bcs', 'mlo', 'kum', 'chr', 'sam', 'ker', 'nzd']
@ -458,10 +480,14 @@ class Scrippts:
if __name__ == '__main__':
i = Scrippts()
output_dir = 'scrippsco2_feed'
i = Scrippts(output_dir=output_dir)
i.import_daily_co2_all()
i.import_daily_c13_all()
i.import_daily_o18_all()
i.import_monthly_co2_all()
i.import_monthly_c13_all()
i.import_monthly_o18_all()
feed_meta_generator(Path(output_dir))

View File

@ -1,29 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
# For python2 & 3 compat, a bit dirty, but it seems to be the least bad one
try:
input = raw_input
except NameError:
pass
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Edit the organisation designed by the organisation_id. If no file is provided, returns a json listing all the fields used to describe an organisation.')
parser.add_argument("-i", "--organisation_id", required=True, help="The name of the json file describing the organisation you want to modify.")
parser.add_argument("-f", "--json_file", help="The name of the json file describing your modifications.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
if args.json_file is None:
print (misp.get_edit_organisation_fields_list(args.organisation_id))
else:
print(misp.edit_organisation_json(args.json_file, args.organisation_id))

View File

@ -1,29 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
# For python2 & 3 compat, a bit dirty, but it seems to be the least bad one
try:
input = raw_input
except NameError:
pass
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Edit the user designed by the user_id. If no file is provided, returns a json listing all the fields used to describe a user.')
parser.add_argument("-i", "--user_id", required=True, help="The name of the json file describing the user you want to modify.")
parser.add_argument("-f", "--json_file", help="The name of the json file describing your modifications.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
if args.json_file is None:
print (misp.get_edit_user_fields_list(args.user_id))
else:
print(misp.edit_user_json(args.json_file, args.user_id))

View File

@ -1,126 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Copy Emerging Threats Block IPs list to several MISP events
# Because of the large size of the list the first run will take a minute
# Running it again will update the MISP events if changes are detected
#
# This script requires PyMISP 2.4.50 or later
import sys, json, time, requests
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
et_url = 'https://rules.emergingthreats.net/fwrules/emerging-Block-IPs.txt'
et_str = 'Emerging Threats '
def init_misp():
global mymisp
mymisp = PyMISP(misp_url, misp_key, misp_verifycert)
def load_misp_event(eid):
global et_attr
global et_drev
global et_event
et_attr = {}
et_drev = {}
et_event = mymisp.get(eid)
echeck(et_event)
for a in et_event['Event']['Attribute']:
if a['category'] == 'Network activity':
et_attr[a['value']] = a['id']
continue
if a['category'] == 'Internal reference':
et_drev = a;
def init_et():
global et_data
global et_rev
requests.packages.urllib3.disable_warnings()
s = requests.Session()
r = s.get(et_url)
if r.status_code != 200:
raise Exception('Error getting ET data: {}'.format(r.text))
name = ''
et_data = {}
et_rev = 0
for line in r.text.splitlines():
if line.startswith('# Rev '):
et_rev = int(line[6:])
continue
if line.startswith('#'):
name = line[1:].strip()
if et_rev and not et_data.get(name):
et_data[name] = {}
continue
l = line.rstrip()
if l:
et_data[name][l] = name
def update_et_event(name):
if et_drev and et_rev and int(et_drev['value']) < et_rev:
# Copy MISP attributes to new dict
et_ips = dict.fromkeys(et_attr.keys())
# Weed out attributes still in ET data
for k,v in et_data[name].items():
et_attr.pop(k, None)
# Delete the leftover attributes from MISP
for k,v in et_attr.items():
r = mymisp.delete_attribute(v)
if r.get('errors'):
print "Error deleting attribute {} ({}): {}\n".format(v,k,r['errors'])
# Weed out ips already in the MISP event
for k,v in et_ips.items():
et_data[name].pop(k, None)
# Add new attributes to MISP event
ipdst = []
for i,k in enumerate(et_data[name].items(), 1-len(et_data[name])):
ipdst.append(k[0])
if i % 100 == 0:
r = mymisp.add_ipdst(et_event, ipdst)
echeck(r, et_event['Event']['id'])
ipdst = []
# Update revision number
et_drev['value'] = et_rev
et_drev.pop('timestamp', None)
attr = []
attr.append(et_drev)
# Publish updated MISP event
et_event['Event']['Attribute'] = attr
et_event['Event']['published'] = False
et_event['Event']['date'] = time.strftime('%Y-%m-%d')
r = mymisp.publish(et_event)
echeck(r, et_event['Event']['id'])
def echeck(r, eid=None):
if r.get('errors'):
if eid:
print "Processing event {} failed: {}".format(eid, r['errors'])
else:
print r['errors']
sys.exit(1)
if __name__ == '__main__':
init_misp()
init_et()
for et_type in set(et_data.keys()):
info = et_str + et_type
r = mymisp.search_index(eventinfo=info)
if r['response']:
eid=r['response'][0]['id']
else: # event not found, create it
new_event = mymisp.new_event(info=info, distribution=3, threat_level_id=4, analysis=1)
echeck(new_event)
eid=new_event['Event']['id']
r = mymisp.add_internal_text(new_event, 1, comment='Emerging Threats revision number')
echeck(r, eid)
load_misp_event(eid)
update_et_event(et_type)

View File

@ -4,9 +4,11 @@
from pymisp import ExpandedPyMISP
try:
from keys import url, key
verifycert = False
except ImportError:
url = 'http://localhost:8080'
key = '8h0gHbhS0fv6JUOlTED0AznLXFbf83TYtQrCycqb'
url = 'https://localhost:8443'
key = 'd6OmdDFvU3Seau3UjwvHS1y3tFQbaRNhJhDX0tjh'
verifycert = False
import argparse
import tools
@ -17,7 +19,7 @@ if __name__ == '__main__':
parser.add_argument("-a", "--attribute", type=int, help="Number of attributes per event (default 3000)")
args = parser.parse_args()
misp = ExpandedPyMISP(url, key, True)
misp = ExpandedPyMISP(url, key, verifycert)
misp.toggle_global_pythonify()
if args.limit is None:

View File

@ -4,7 +4,7 @@
import random
from random import randint
import string
from pymisp import MISPEvent
from pymisp import MISPEvent, MISPAttribute
def randomStringGenerator(size, chars=string.ascii_lowercase + string.digits):
@ -15,32 +15,34 @@ def randomIpGenerator():
return str(randint(0, 255)) + '.' + str(randint(0, 255)) + '.' + str(randint(0, 255)) + '.' + str(randint(0, 255))
def _attribute(category, type, value):
attribute = MISPAttribute()
attribute.category = category
attribute.type = type
attribute.value = value
return attribute
def floodtxt(misp, event, maxlength=255):
text = randomStringGenerator(randint(1, maxlength))
textfunctions = [misp.add_internal_comment, misp.add_internal_text, misp.add_internal_other, misp.add_email_subject, misp.add_mutex, misp.add_filename]
textfunctions[randint(0, 5)](event, text)
choose_from = [('Internal reference', 'comment', text), ('Internal reference', 'text', text),
('Internal reference', 'other', text), ('Network activity', 'email-subject', text),
('Artifacts dropped', 'mutex', text), ('Artifacts dropped', 'filename', text)]
misp.add_attribute(event, _attribute(*random.choice(choose_from)))
def floodip(misp, event):
ip = randomIpGenerator()
ipfunctions = [misp.add_ipsrc, misp.add_ipdst]
ipfunctions[randint(0, 1)](event, ip)
choose_from = [('Network activity', 'ip-src', ip), ('Network activity', 'ip-dst', ip)]
misp.add_attribute(event, _attribute(*random.choice(choose_from)))
def flooddomain(misp, event, maxlength=25):
a = randomStringGenerator(randint(1, maxlength))
b = randomStringGenerator(randint(2, 3), chars=string.ascii_lowercase)
domain = a + '.' + b
domainfunctions = [misp.add_hostname, misp.add_domain]
domainfunctions[randint(0, 1)](event, domain)
def flooddomainip(misp, event, maxlength=25):
a = randomStringGenerator(randint(1, maxlength))
b = randomStringGenerator(randint(2, 3), chars=string.ascii_lowercase)
domain = a + '.' + b
ip = randomIpGenerator()
misp.add_domain_ip(event, domain, ip)
choose_from = [('Network activity', 'domain', domain), ('Network activity', 'hostname', domain)]
misp.add_attribute(event, _attribute(*random.choice(choose_from)))
def floodemail(misp, event, maxlength=25):
@ -48,19 +50,12 @@ def floodemail(misp, event, maxlength=25):
b = randomStringGenerator(randint(1, maxlength))
c = randomStringGenerator(randint(2, 3), chars=string.ascii_lowercase)
email = a + '@' + b + '.' + c
emailfunctions = [misp.add_email_src, misp.add_email_dst]
emailfunctions[randint(0, 1)](event, email)
def floodattachment(misp, eventid, distribution, to_ids, category, comment, info, analysis, threat_level_id):
filename = randomStringGenerator(randint(1, 128))
misp.upload_sample(filename, 'dummy', eventid, distribution, to_ids, category, comment, info, analysis, threat_level_id)
choose_from = [('Network activity', 'email-dst', email), ('Network activity', 'email-src', email)]
misp.add_attribute(event, _attribute(*random.choice(choose_from)))
def create_dummy_event(misp):
event = misp.new_event(0, 4, 0, 'dummy event')
flooddomainip(misp, event)
floodattachment(misp, event['Event']['id'], event['Event']['distribution'], False, 'Payload delivery', '', event['Event']['info'], event['Event']['analysis'], event['Event']['threat_level_id'])
return misp.new_event(0, 4, 0, 'dummy event')
def create_massive_dummy_events(misp, nbattribute):
@ -68,12 +63,6 @@ def create_massive_dummy_events(misp, nbattribute):
event.info = 'massive dummy event'
event = misp.add_event(event)
print(event)
eventid = event.id
distribution = '0'
functions = [floodtxt, floodip, flooddomain, flooddomainip, floodemail, floodattachment]
functions = [floodtxt, floodip, flooddomain, floodemail]
for i in range(nbattribute):
choice = randint(0, 5)
if choice == 5:
floodattachment(misp, eventid, distribution, False, 'Payload delivery', '', event.info, event.analysis, event.threat_level_id)
else:
functions[choice](misp, event)
functions[random.randint(0, len(functions) - 1)](misp, event)

View File

@ -7,7 +7,7 @@ This python script can be used to generate a MISP feed based on an existing MISP
````
git clone https://github.com/MISP/PyMISP.git
cd examples/feed-generator
cp settings-default.py settings.py
cp settings.default.py settings.py
vi settings.py #adjust your settings
python3 generate.py
````

View File

@ -5,7 +5,7 @@ import sys
import json
import os
from pymisp import ExpandedPyMISP
from settings import url, key, ssl, outputdir, filters, valid_attribute_distribution_levels
from settings import entries, url, key, ssl, outputdir, filters, valid_attribute_distribution_levels
valid_attribute_distributions = []
@ -14,15 +14,15 @@ def init():
# If we have an old settings.py file then this variable won't exist
global valid_attribute_distributions
try:
valid_attribute_distributions = valid_attribute_distribution_levels
valid_attribute_distributions = [int(v) for v in valid_attribute_distribution_levels]
except Exception:
valid_attribute_distributions = ['0', '1', '2', '3', '4', '5']
valid_attribute_distributions = [0, 1, 2, 3, 4, 5]
return ExpandedPyMISP(url, key, ssl)
def saveEvent(event):
try:
with open(os.path.join(outputdir, f'{event["uuid"]}.json'), 'w') as f:
with open(os.path.join(outputdir, f'{event["Event"]["uuid"]}.json'), 'w') as f:
json.dump(event, f, indent=2)
except Exception as e:
print(e)
@ -52,7 +52,7 @@ def saveManifest(manifest):
if __name__ == '__main__':
misp = init()
try:
events = misp.search(metadata=True, limit=200, **filters, pythonify=True)
events = misp.search(metadata=True, limit=entries, **filters, pythonify=True)
except Exception as e:
print(e)
sys.exit("Invalid response received from MISP.")
@ -63,10 +63,17 @@ if __name__ == '__main__':
counter = 1
total = len(events)
for event in events:
e = misp.get_event(event.uuid, pythonify=True)
e_feed = e.to_feed()
hashes += [[h, e.uuid] for h in e_feed.pop('_hashes')]
manifest.update(e_feed.pop('_manifest'))
try:
e = misp.get_event(event.uuid, pythonify=True)
e_feed = e.to_feed(valid_distributions=valid_attribute_distributions, with_meta=True)
except Exception as e:
print(e, event.uuid)
continue
if not e_feed:
print(f'Invalid distribution {e.distribution}, skipping')
continue
hashes += [[h, e.uuid] for h in e_feed['Event'].pop('_hashes')]
manifest.update(e_feed['Event'].pop('_manifest'))
saveEvent(e_feed)
print("Event " + str(counter) + "/" + str(total) + " exported.")
counter += 1
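
For context, a minimal standalone sketch of the `to_feed()` call the loop above relies on (the event file name is hypothetical); with `with_meta=True` the returned dictionary carries the `_hashes` and `_manifest` entries that are popped above:

```
from pymisp import MISPEvent

event = MISPEvent()
event.load_file('00000000-0000-0000-0000-000000000000.json')  # hypothetical exported event
feed = event.to_feed(with_meta=True)
print('_hashes' in feed['Event'], '_manifest' in feed['Event'])  # True True
```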

View File

@ -12,6 +12,9 @@ ssl = False
# sure that you use a directory dedicated to the feed
outputdir = 'output'
# Determine the number of entries to output
entries = 200
# The filters to be used by the feed. You can use any filter that
# you can use on the event index, such as organisation, tags, etc.
# It uses the same joining and condition rules as the API parameters
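
For illustration only (the tag and organisation values below are made up), a filter block restricting the feed to published events could look like:

```
# Hypothetical example: only published events carrying a given tag, from one org
filters = {'published': True, 'tags': ['tlp:white'], 'org': 'CIRCL'}
```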

View File

@ -0,0 +1,15 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from pymisp.tools import feed_meta_generator
import argparse
from pathlib import Path
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Build meta files for feed')
parser.add_argument("--feed", required=True, help="Path to directory containing the feed.")
args = parser.parse_args()
feed = Path(args.feed)
feed_meta_generator(feed)

View File

@ -1,26 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Get an attachment.')
parser.add_argument("-a", "--attribute", type=int, help="Attribute ID to download.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
with open('foo', 'wb') as f:
out = misp.get_attachment(args.attribute)
if isinstance(out, dict):
# Fails
print(out)
else:
f.write(out)

View File

@ -5,3 +5,4 @@ misp_url = 'https://<your MISP URL>/'
misp_key = 'Your MISP auth key' # The MISP auth key can be found on the MISP web interface under the automation section
misp_verifycert = True
misp_client_cert = ''
proofpoint_key = 'Your Proofpoint TAP auth key'

View File

@ -2,7 +2,11 @@
# -*- coding: utf-8 -*-
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert, misp_client_cert
from keys import misp_url, misp_key, misp_verifycert
try:
from keys import misp_client_cert
except ImportError:
misp_client_cert = ''
import argparse
import os

201
examples/proofpoint_tap.py Normal file
View File

@ -0,0 +1,201 @@
import requests
import json
from pymisp import ExpandedPyMISP, MISPEvent, MISPOrganisation
from keys import misp_url, misp_key, misp_verifycert, proofpoint_key
# initialize PyMISP with the MISP instance URL and auth key
misp = ExpandedPyMISP(url=misp_url, key=misp_key, ssl=misp_verifycert)
urlSiem = "https://tap-api-v2.proofpoint.com/v2/siem/all"
alertType = ("messagesDelivered", "messagesBlocked", "clicksPermitted", "clicksBlocked")
# max query is 1h, and we want Proofpoint TAP api to return json
queryString = {
"sinceSeconds": "3600",
"format": "json"
}
# auth to api needs to be set as a header, not as part of the query string
headers = {
'Authorization': "Basic " + proofpoint_key
}
responseSiem = requests.request("GET", urlSiem, headers=headers, params=queryString)
jsonDataSiem = json.loads(responseSiem.text)
for alert in alertType:
for messages in jsonDataSiem[alert]:
orgc = MISPOrganisation()
orgc.name = 'Proofpoint'
orgc.id = '#{ORGC.ID}' # organisation id
orgc.uuid = '#{ORGC.UUID}' # organisation uuid
# initialize and set MISPEvent()
event = MISPEvent()
event.Orgc = orgc
if alert == "messagesDelivered" or alert == "messagesBlocked":
if alert == "messagesDelivered":
event.info = alert
event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config
event.threat_level_id = 2 # setting this to 0 breaks the integration
event.analysis = 0 # Optional, defaults to 0 (initial analysis)
else:
event.info = alert
event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config
event.threat_level_id = 2 # BLOCKED = LOW
event.analysis = 0 # Optional, defaults to 0 (initial analysis)
recipient = event.add_attribute('email-dst', messages["recipient"][0])
recipient.comment = 'recipient address'
sender = event.add_attribute('email-src', messages["sender"])
sender.comment = 'sender address'
if messages["fromAddress"] is not None and messages["fromAddress"] != "" :
fromAddress = event.add_attribute('email-src-display-name', messages["fromAddress"])
headerFrom = event.add_attribute('email-header', messages["headerFrom"])
headerFrom.comment = 'email header from'
senderIP = event.add_attribute('ip-src', messages["senderIP"])
senderIP.comment = 'sender IP'
subject = event.add_attribute('email-subject', messages["subject"])
subject.comment = 'email subject'
if messages["quarantineFolder"] is not None and messages["quarantineFolder"] != "":
quarantineFolder = event.add_attribute('comment', messages["quarantineFolder"])
quarantineFolder.comment = 'quarantine folder'
if messages["quarantineRule"] is not None and messages["quarantineRule"] != "":
quarantineRule = event.add_attribute('comment', messages["quarantineRule"])
quarantineRule.comment = 'quarantine rule'
messageSize = event.add_attribute('size-in-bytes', messages["messageSize"])
messageSize.comment = 'size of email in bytes'
malwareScore = event.add_attribute('comment', messages["malwareScore"])
malwareScore.comment = 'malware score'
phishScore = event.add_attribute('comment', messages["phishScore"])
phishScore.comment = 'phish score'
spamScore = event.add_attribute('comment', messages["spamScore"])
spamScore.comment = 'spam score'
imposterScore = event.add_attribute('comment', messages["impostorScore"])
imposterScore.comment = 'impostor score'
completelyRewritten = event.add_attribute('comment', messages["completelyRewritten"])
completelyRewritten.comment = 'proofpoint url defense'
# grab the threat info for each message in TAP
for threatInfo in messages["threatsInfoMap"]:
threat_type = {
"url": "url",
"attachment": "email-attachment",
"message": "email-body"
}
threat = event.add_attribute(threat_type.get(threatInfo["threatType"]), threatInfo["threat"])
threat.comment = 'threat'
threatUrl = event.add_attribute('link', threatInfo["threatUrl"])
threatUrl.comment = 'link to threat in TAP'
threatStatus = event.add_attribute('comment', threatInfo["threatStatus"])
threatStatus.comment = "proofpoint's threat status"
event.add_tag(threatInfo["classification"])
# get campaignID from each TAP alert and query campaign API
if threatInfo["campaignID"] is not None and threatInfo["campaignID"] != "":
urlCampaign = "https://tap-api-v2.proofpoint.com/v2/campaign/" + threatInfo["campaignID"]
responseCampaign = requests.request("GET", urlCampaign, headers=headers)
jsonDataCampaign = json.loads(responseCampaign.text)
campaignType = ("actors", "families", "malware", "techniques")
# loop through campaignType and grab tags to add to MISP event
for tagType in campaignType:
for tag in jsonDataCampaign[tagType]:
event.add_tag(tag['name'])
# grab which policy route the message took
for policy in messages["policyRoutes"]:
policyRoute = event.add_attribute('comment', policy)
policyRoute.comment = 'email policy route'
# was the threat in the body of the email or is it an attachment?
for parts in messages["messageParts"]:
disposition = event.add_attribute('comment', parts["disposition"])
disposition.comment = 'email body or attachment'
# sha256 hash of threat
if parts["sha256"] is not None and parts["sha256"] != "":
sha256 = event.add_attribute('sha256', parts["sha256"])
sha256.comment = 'sha256 hash'
# md5 hash of threat
if parts["md5"] is not None and parts["md5"] != "":
md5 = event.add_attribute('md5', parts["md5"])
md5.comment = 'md5 hash'
# filename of threat
if parts["filename"] is not None and parts["filename"] != "":
filename = event.add_attribute('filename', parts["filename"])
filename.comment = 'filename'
misp.add_event(event.to_json())
if alert == "clicksPermitted" or alert == "clicksBlocked":
if alert == "clicksPermitted":
print(alert + " is a permitted click")
event.info = alert
event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config
event.threat_level_id = 2 # setting this to 0 breaks the integration
event.analysis = 0 # Optional, defaults to 0 (initial analysis)
else:
print(alert + " is a blocked click")
event.info = alert
event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config
event.threat_level_id = 2 # BLOCKED = LOW
event.analysis = 0 # Optional, defaults to 0 (initial analysis)
event.add_tag(messages["classification"])
campaignId = event.add_attribute('campaign-id', messages["campaignId"][0])
campaignId.comment = 'campaignId'
clickIP = event.add_attribute('ip-src', messages["clickIP"])
clickIP.comment = 'clickIP'
clickTime = event.add_attribute('datetime', messages["clickTime"])
clickTime.comment = 'clicked threat'
threatTime = event.add_attribute('datetime', messages["threatTime"])
threatTime.comment = 'identified threat'
GUID = event.add_attribute('comment', messages["GUID"])
GUID.comment = 'PPS message ID'
recipient = event.add_attribute('email-dst', messages["recipient"][0])
recipient.comment = 'recipient address'
sender = event.add_attribute('email-src', messages["sender"])
sender.comment = 'sender address'
senderIP = event.add_attribute('ip-src', messages["senderIP"])
senderIP.comment = 'sender IP'
threatURL = event.add_attribute('link', messages["threatURL"])
threatURL.comment = 'link to threat in TAP'
url = event.add_attribute('link', messages["url"])
url.comment = 'malicious url clicked'
userAgent = event.add_attribute('user-agent', messages["userAgent"])
misp.add_event(event.to_json())

View File

@ -0,0 +1,65 @@
import requests
import json
from pymisp import ExpandedPyMISP, MISPEvent, MISPOrganisation
from keys import misp_url, misp_key, misp_verifycert, proofpoint_key
# initialize PyMISP with the MISP instance URL and auth key
misp = ExpandedPyMISP(url=misp_url, key=misp_key, ssl=misp_verifycert)
urlVap = "https://tap-api-v2.proofpoint.com/v2/people/vap?window=30" # Window can be 14, 30, and 90 Days
headers = {
'Authorization': "Basic " + proofpoint_key
}
responseVap = requests.request("GET", urlVap, headers=headers)
jsonDataVap = json.loads(responseVap.text)
for alert in jsonDataVap["users"]:
orgc = MISPOrganisation()
orgc.name = 'Proofpoint'
orgc.id = '#{ORGC.ID}' # organisation id
orgc.uuid = '#{ORGC.UUID}' # organisation uuid
# initialize and set MISPEvent()
event = MISPEvent()
event.Orgc = orgc
event.info = 'Very Attacked Person ' + jsonDataVap["interval"]
event.distribution = 0 # Optional, defaults to MISP.default_event_distribution in MISP config
event.threat_level_id = 2 # setting this to 0 breaks the integration
event.analysis = 0 # Optional, defaults to 0 (initial analysis)
totalVapUsers = event.add_attribute('counter', jsonDataVap["totalVapUsers"], comment="Total VAP Users")
averageAttackIndex = event.add_attribute('counter', jsonDataVap["averageAttackIndex"], comment="Average Attack Count")
vapAttackIndexThreshold = event.add_attribute('counter', jsonDataVap["vapAttackIndexThreshold"], comment="Attack Threshold")
emails = event.add_attribute('email-dst', alert["identity"]["emails"], comment="Email Destination")
attack = event.add_attribute('counter', alert["threatStatistics"]["attackIndex"], comment="Attack Count")
vip = event.add_attribute('other', str(alert["identity"]["vip"]), comment="VIP")
guid = event.add_attribute('other', alert["identity"]["guid"], comment="GUID")
if alert["identity"]["customerUserId"] is not None:
customerUserId = event.add_attribute('other', alert["identity"]["customerUserId"], comment="Customer User Id")
if alert["identity"]["department"] is not None:
department = event.add_attribute('other', alert["identity"]["department"], comment="Department")
if alert["identity"]["location"] is not None:
location = event.add_attribute('other', alert["identity"]["location"], comment="Location")
if alert["identity"]["name"] is not None:
name = event.add_attribute('target-user', alert["identity"]["name"], comment="Name")
if alert["identity"]["title"] is not None:
title = event.add_attribute('other', alert["identity"]["title"], comment="Title")
event.add_tag("VAP")
misp.add_event(event.to_json())

View File

@ -1,25 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
# For python2 & 3 compat, a bit dirty, but it seems to be the least bad one
try:
input = raw_input
except NameError:
pass
def init(url, key):
return PyMISP(url, key, misp_verifycert, 'json')
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Add sighting.')
parser.add_argument("-f", "--json_file", required=True, help="The name of the json file describing the attribute you want to add sighting to.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
misp.sighting_per_json(args.json_file)

View File

@ -1,16 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Output attributes statistics from a MISP instance.')
args = parser.parse_args()
misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)
print(misp.get_attributes_statistics(misp, percentage=True))
print(misp.get_attributes_statistics(context='category', percentage=True))

810
examples/stats_report.py Normal file → Executable file
View File

@ -1,405 +1,405 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
Maxime Thiebaut
Generate a report of your MISP statistics
Put this script in crontab to run every /15 or /60
*/5 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/stats_report.py -t 30d -m -v
Do inline config in "main"
'''
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
from datetime import datetime
from datetime import date
import time
import sys
import smtplib
import mimetypes
from email.mime.multipart import MIMEMultipart
from email import encoders
from email.mime.base import MIMEBase
from email.mime.text import MIMEText
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def init(url, key, verifycert):
'''
Template to get MISP module started
'''
return ExpandedPyMISP(url, key, verifycert, 'json')
def get_data(misp, timeframe, date_from = None, date_to = None):
'''
Get the event date to build our report
'''
number_of_misp_events = 0
number_of_attributes = 0
number_of_attributes_to_ids = 0
attr_type = {}
attr_category = {}
tags_type = {}
tags_tlp = {'tlp:white': 0, 'tlp:green': 0, 'tlp:amber': 0, 'tlp:red': 0}
tags_misp_galaxy_mitre = {}
tags_misp_galaxy = {}
tags_misp_galaxy_threat_actor = {}
galaxies = {}
galaxies_cluster = {}
threat_levels_counts = [0, 0, 0, 0]
analysis_completion_counts = [0, 0, 0]
report = {}
try:
if date_from and date_to:
stats_event_response = misp.search(date_from=date_from, date_to=date_to)
else:
stats_event_response = misp.search(last=timeframe)
# Number of new or updated events since timestamp
report['number_of_misp_events'] = len(stats_event_response)
report['misp_events'] = []
for event in stats_event_response:
event_data = event['Event']
timestamp = datetime.utcfromtimestamp(int(event_data['timestamp'])).strftime(ts_format)
publish_timestamp = datetime.utcfromtimestamp(int(event_data['publish_timestamp'])).strftime(ts_format)
threat_level_id = int(event_data['threat_level_id']) - 1
threat_levels_counts[threat_level_id] = threat_levels_counts[threat_level_id] + 1
threat_level_id = threat_levels[threat_level_id]
analysis_id = int(event_data['analysis'])
analysis_completion_counts[analysis_id] = analysis_completion_counts[analysis_id] + 1
analysis = analysis_completion[analysis_id]
report['misp_events'].append({'id': event_data['id'], 'title': event_data['info'].replace('\n', '').encode('utf-8'), 'date': event_data['date'], 'timestamp': timestamp, 'publish_timestamp': publish_timestamp, 'threat_level': threat_level_id, 'analysis_completion': analysis})
# Walk through the attributes
if 'Attribute' in event_data:
event_attr = event_data['Attribute']
for attr in event_attr:
number_of_attributes = number_of_attributes + 1
type = attr['type']
category = attr['category']
to_ids = attr['to_ids']
if to_ids:
number_of_attributes_to_ids = number_of_attributes_to_ids + 1
if type in attr_type:
attr_type[type] = attr_type[type] + 1
else:
attr_type[type] = 1
if category in attr_category:
attr_category[category] = attr_category[category] + 1
else:
attr_category[category] = 1
# Process tags
if 'Tag' in event_data:
tags_attr = event_data['Tag']
for tag in tags_attr:
tag_title = tag['name']
if tag_title.lower().replace(' ', '') in tags_tlp:
tags_tlp[tag_title.lower().replace(' ', '')] = tags_tlp[tag_title.lower().replace(' ', '')] + 1
if 'misp-galaxy:mitre-' in tag_title:
if tag_title in tags_misp_galaxy_mitre:
tags_misp_galaxy_mitre[tag_title] = tags_misp_galaxy_mitre[tag_title] + 1
else:
tags_misp_galaxy_mitre[tag_title] = 1
if 'misp-galaxy:threat-actor=' in tag_title:
if tag_title in tags_misp_galaxy_threat_actor:
tags_misp_galaxy_threat_actor[tag_title] = tags_misp_galaxy_threat_actor[tag_title] + 1
else:
tags_misp_galaxy_threat_actor[tag_title] = 1
elif 'misp-galaxy:' in tag_title:
if tag_title in tags_misp_galaxy:
tags_misp_galaxy[tag_title] = tags_misp_galaxy[tag_title] + 1
else:
tags_misp_galaxy[tag_title] = 1
if tag_title in tags_type:
tags_type[tag_title] = tags_type[tag_title] + 1
else:
tags_type[tag_title] = 1
# Process the galaxies
if 'Galaxy' in event_data:
galaxy_attr = event_data['Galaxy']
for galaxy in galaxy_attr:
galaxy_title = galaxy['type']
if galaxy_title in galaxies:
galaxies[galaxy_title] = galaxies[galaxy_title] + 1
else:
galaxies[galaxy_title] = 1
for cluster in galaxy['GalaxyCluster']:
cluster_value = cluster['type']
if cluster_value in galaxies_cluster:
galaxies_cluster[cluster_value] = galaxies_cluster[cluster_value] + 1
else:
galaxies_cluster[cluster_value] = 1
report['number_of_attributes'] = number_of_attributes
report['number_of_attributes_to_ids'] = number_of_attributes_to_ids
report['attr_type'] = attr_type
report['attr_category'] = attr_category
report['tags_type'] = tags_type
report['tags_tlp'] = tags_tlp
report['tags_misp_galaxy_mitre'] = tags_misp_galaxy_mitre
report['tags_misp_galaxy'] = tags_misp_galaxy
report['tags_misp_galaxy_threat_actor'] = tags_misp_galaxy_threat_actor
report['galaxies'] = galaxies
report['galaxies_cluster'] = galaxies_cluster
# General MISP statistics
user_statistics = misp.users_statistics()
if user_statistics and 'errors' not in user_statistics:
report['user_statistics'] = user_statistics
# Return the report data
return report
except Exception as e:
sys.exit('Unable to get statistics from MISP')
def build_report(report, timeframe, misp_url):
'''
Build the body of the report and optional attachments
'''
attachments = {}
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
report_body = "MISP Report %s for last %s on %s\n-------------------------------------------------------------------------------" % (current_date, timeframe, misp_url)
else:
report_body = "MISP Report %s from %s to %s on %s\n-------------------------------------------------------------------------------" % (current_date, date_from, date_to, misp_url)
report_body = report_body + '\nNew or updated events: %s' % report['number_of_misp_events']
report_body = report_body + '\nNew or updated attributes: %s' % report['number_of_attributes']
report_body = report_body + '\nNew or updated attributes with IDS flag: %s' % report['number_of_attributes_to_ids']
report_body = report_body + '\n'
if 'user_statistics' in report:
report_body = report_body + '\nTotal events: %s' % report['user_statistics']['stats']['event_count']
report_body = report_body + '\nTotal attributes: %s' % report['user_statistics']['stats']['attribute_count']
report_body = report_body + '\nTotal users: %s' % report['user_statistics']['stats']['user_count']
report_body = report_body + '\nTotal orgs: %s' % report['user_statistics']['stats']['org_count']
report_body = report_body + '\nTotal correlation: %s' % report['user_statistics']['stats']['correlation_count']
report_body = report_body + '\nTotal proposals: %s' % report['user_statistics']['stats']['proposal_count']
report_body = report_body + '\n\n'
if args.mispevent:
report_body = report_body + '\nNew or updated events\n-------------------------------------------------------------------------------'
attachments['misp_events'] = 'ID;Title;Date;Updated;Published;ThreatLevel;AnalysisStatus'
for el in report['misp_events']:
report_body = report_body + '\n #%s %s (%s) \t%s \n\t\t\t\t(Date: %s, Updated: %s, Published: %s)' % (el['id'], el['threat_level'], el['analysis_completion'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'])
attachments['misp_events'] = attachments['misp_events'] + '\n%s;%s;%s;%s;%s;%s;%s' % (el['id'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'], el['threat_level'], el['analysis_completion'])
report_body = report_body + '\n\n'
report_body = report_body + '\nNew or updated attributes - Category \n-------------------------------------------------------------------------------'
attr_category_s = sorted(report['attr_category'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['attr_category'] = 'AttributeCategory;Qt'
for el in attr_category_s:
report_body = report_body + '\n%s \t %s' % (el[0], el[1])
attachments['attr_category'] = attachments['attr_category'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nNew or updated attributes - Type \n-------------------------------------------------------------------------------'
attr_type_s = sorted(report['attr_type'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['attr_type'] = 'AttributeType;Qt'
for el in attr_type_s:
report_body = report_body + '\n%s \t %s' % (el[0], el[1])
attachments['attr_type'] = attachments['attr_type'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nTLP Codes \n-------------------------------------------------------------------------------'
attachments['tags_tlp'] = 'TLP;Qt'
for el in report['tags_tlp']:
report_body = report_body + "\n%s \t %s" % (el, report['tags_tlp'][el])
attachments['tags_tlp'] = attachments['tags_tlp'] + '\n%s;%s' % (el, report['tags_tlp'][el])
report_body = report_body + '\n\n'
report_body = report_body + '\nTag MISP Galaxy\n-------------------------------------------------------------------------------'
tags_misp_galaxy_s = sorted(report['tags_misp_galaxy'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['tags_misp_galaxy'] = 'MISPGalaxy;Qt'
for el in tags_misp_galaxy_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['tags_misp_galaxy'] = attachments['tags_misp_galaxy'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nTag MISP Galaxy Mitre \n-------------------------------------------------------------------------------'
tags_misp_galaxy_mitre_s = sorted(report['tags_misp_galaxy_mitre'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['tags_misp_galaxy_mitre'] = 'MISPGalaxyMitre;Qt'
for el in tags_misp_galaxy_mitre_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['tags_misp_galaxy_mitre'] = attachments['tags_misp_galaxy_mitre'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nTag MISP Galaxy Threat Actor \n-------------------------------------------------------------------------------'
tags_misp_galaxy_threat_actor_s = sorted(report['tags_misp_galaxy_threat_actor'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['tags_misp_galaxy_threat_actor'] = 'MISPGalaxyThreatActor;Qt'
for el in tags_misp_galaxy_threat_actor_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['tags_misp_galaxy_threat_actor'] = attachments['tags_misp_galaxy_threat_actor'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nTags \n-------------------------------------------------------------------------------'
tags_type_s = sorted(report['tags_type'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['tags_type'] = 'Tag;Qt'
for el in tags_type_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['tags_type'] = attachments['tags_type'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nGalaxies \n-------------------------------------------------------------------------------'
galaxies_s = sorted(report['galaxies'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['galaxies'] = 'Galaxies;Qt'
for el in galaxies_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['galaxies'] = attachments['galaxies'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + '\n\n'
report_body = report_body + '\nGalaxies Cluster \n-------------------------------------------------------------------------------'
galaxies_cluster_s = sorted(report['galaxies_cluster'].items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
attachments['galaxies_cluster'] = 'Galaxies;Qt'
for el in galaxies_cluster_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
attachments['galaxies_cluster'] = attachments['galaxies_cluster'] + '\n%s;%s' % (el[0], el[1])
report_body = report_body + "\n\nMISP Reporter Finished\n"
return report_body, attachments
def msg_attach(content, filename):
'''
Return an message attachment object
'''
part = MIMEBase('application', "octet-stream")
part.set_payload(content)
part.add_header('Content-Disposition', 'attachment; filename="%s"' % filename)
return part
def print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url):
'''
Print (or send) the report
'''
if args.mail:
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
subject = "MISP Report %s for last %s on %s" % (current_date, timeframe, misp_url)
else:
subject = "MISP Report %s from %s to %s on %s" % (current_date, date_from, date_to, misp_url)
msg = MIMEMultipart()
msg['From'] = smtp_from
msg['To'] = smtp_to
msg['Subject'] = subject
msg.attach(MIMEText(report_body, 'text'))
if args.mispevent:
part = MIMEBase('application', "octet-stream")
part.set_payload(attachments['misp_events'])
part.add_header('Content-Disposition', 'attachment; filename="misp_events.csv"')
msg.attach(part)
msg.attach(msg_attach(attachments['attr_type'], 'attr_type.csv'))
msg.attach(msg_attach(attachments['attr_category'], 'attr_category.csv'))
msg.attach(msg_attach(attachments['tags_tlp'], 'tags_tlp.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_mitre'], 'tags_misp_galaxy_mitre.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy'], 'tags_misp_galaxy.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_threat_actor'], 'tags_misp_galaxy_threat_actor.csv'))
msg.attach(msg_attach(attachments['tags_type'], 'tags_type.csv'))
msg.attach(msg_attach(attachments['galaxies'], 'galaxies.csv'))
msg.attach(msg_attach(attachments['galaxies_cluster'], 'galaxies_cluster.csv'))
server = smtplib.SMTP(smtp_server)
server.sendmail(smtp_from, smtp_to, msg.as_string())
else:
print(report_body)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Generate a report of your MISP statistics.')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument('-t', '--timeframe',action='store', help='Timeframe to include in the report')
group.add_argument('-f', '--date_from',action='store', help='Start date of query (YYYY-MM-DD)')
parser.add_argument('-u', '---date-to', action='store', help='End date of query (YYYY-MM-DD)')
parser.add_argument('-e', '--mispevent', action='store_true', help='Include MISP event titles')
parser.add_argument('-m', '--mail', action='store_true', help='Mail the report')
parser.add_argument('-o', '--mailoptions', action='store', help='mailoptions: \'smtp_from=INSERT_FROM;smtp_to=INSERT_TO;smtp_server=localhost\'')
args = parser.parse_args()
misp = init(misp_url, misp_key, misp_verifycert)
timeframe = args.timeframe
if not timeframe:
date_from = args.date_from
if not args.date_to:
today = date.today()
date_to = today.strftime("%Y-%m-%d")
else:
date_to = args.date_to
else:
date_from = None
date_to = None
ts_format = '%Y-%m-%d %H:%M:%S'
threat_levels = ['High', 'Medium', 'Low', 'Undef']
analysis_completion = ['Initial', 'Ongoing', 'Complete']
smtp_from = 'INSERT_FROM'
smtp_to = 'INSERT_TO'
smtp_server = 'localhost'
if args.mailoptions:
mailoptions = args.mailoptions.split(';')
for s in mailoptions:
if s.split('=')[0] == 'smtp_from':
smtp_from = s.split('=')[1]
if s.split('=')[0] == 'smtp_to':
smtp_to = s.split('=')[1]
if s.split('=')[0] == 'smtp_server':
smtp_server = s.split('=')[1]
report = get_data(misp, timeframe, date_from, date_to)
if(report):
report_body, attachments = build_report(report, timeframe, misp_url)
print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
Maxime Thiebaut
Generate a report of your MISP statistics
Put this script in crontab to run every /15 or /60
*/5 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/stats_report.py -t 30d -m -v
Do inline config in "main"
'''
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
from datetime import datetime
from datetime import date
import time
import sys
import smtplib
import mimetypes
from email.mime.multipart import MIMEMultipart
from email import encoders
from email.mime.base import MIMEBase
from email.mime.text import MIMEText
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def init(url, key, verifycert):
'''
Template to get MISP module started
'''
return ExpandedPyMISP(url, key, verifycert, 'json')
def get_data(misp, timeframe, date_from=None, date_to=None):
'''
Get the event data to build our report
'''
number_of_misp_events = 0
number_of_attributes = 0
number_of_attributes_to_ids = 0
attr_type = {}
attr_category = {}
tags_type = {}
tags_tlp = {'tlp:white': 0, 'tlp:green': 0, 'tlp:amber': 0, 'tlp:red': 0}
tags_misp_galaxy_mitre = {}
tags_misp_galaxy = {}
tags_misp_galaxy_threat_actor = {}
galaxies = {}
galaxies_cluster = {}
threat_levels_counts = [0, 0, 0, 0]
analysis_completion_counts = [0, 0, 0]
report = {}
try:
if date_from and date_to:
stats_event_response = misp.search(date_from=date_from, date_to=date_to)
else:
stats_event_response = misp.search(last=timeframe)
# Number of new or updated events since timestamp
report['number_of_misp_events'] = len(stats_event_response)
report['misp_events'] = []
for event in stats_event_response:
event_data = event['Event']
timestamp = datetime.utcfromtimestamp(int(event_data['timestamp'])).strftime(ts_format)
publish_timestamp = datetime.utcfromtimestamp(int(event_data['publish_timestamp'])).strftime(ts_format)
threat_level_id = int(event_data['threat_level_id']) - 1
threat_levels_counts[threat_level_id] = threat_levels_counts[threat_level_id] + 1
threat_level_id = threat_levels[threat_level_id]
analysis_id = int(event_data['analysis'])
analysis_completion_counts[analysis_id] = analysis_completion_counts[analysis_id] + 1
analysis = analysis_completion[analysis_id]
report['misp_events'].append({'id': event_data['id'], 'title': event_data['info'].replace('\n', '').encode('utf-8'), 'date': event_data['date'], 'timestamp': timestamp, 'publish_timestamp': publish_timestamp, 'threat_level': threat_level_id, 'analysis_completion': analysis})
# Walk through the attributes
if 'Attribute' in event_data:
event_attr = event_data['Attribute']
for attr in event_attr:
number_of_attributes = number_of_attributes + 1
type = attr['type']
category = attr['category']
to_ids = attr['to_ids']
if to_ids:
number_of_attributes_to_ids = number_of_attributes_to_ids + 1
if type in attr_type:
attr_type[type] = attr_type[type] + 1
else:
attr_type[type] = 1
if category in attr_category:
attr_category[category] = attr_category[category] + 1
else:
attr_category[category] = 1
# Process tags
if 'Tag' in event_data:
tags_attr = event_data['Tag']
for tag in tags_attr:
tag_title = tag['name']
if tag_title.lower().replace(' ', '') in tags_tlp:
tags_tlp[tag_title.lower().replace(' ', '')] = tags_tlp[tag_title.lower().replace(' ', '')] + 1
if 'misp-galaxy:mitre-' in tag_title:
if tag_title in tags_misp_galaxy_mitre:
tags_misp_galaxy_mitre[tag_title] = tags_misp_galaxy_mitre[tag_title] + 1
else:
tags_misp_galaxy_mitre[tag_title] = 1
if 'misp-galaxy:threat-actor=' in tag_title:
if tag_title in tags_misp_galaxy_threat_actor:
tags_misp_galaxy_threat_actor[tag_title] = tags_misp_galaxy_threat_actor[tag_title] + 1
else:
tags_misp_galaxy_threat_actor[tag_title] = 1
elif 'misp-galaxy:' in tag_title:
if tag_title in tags_misp_galaxy:
tags_misp_galaxy[tag_title] = tags_misp_galaxy[tag_title] + 1
else:
tags_misp_galaxy[tag_title] = 1
if tag_title in tags_type:
tags_type[tag_title] = tags_type[tag_title] + 1
else:
tags_type[tag_title] = 1
# Process the galaxies
if 'Galaxy' in event_data:
galaxy_attr = event_data['Galaxy']
for galaxy in galaxy_attr:
galaxy_title = galaxy['type']
if galaxy_title in galaxies:
galaxies[galaxy_title] = galaxies[galaxy_title] + 1
else:
galaxies[galaxy_title] = 1
for cluster in galaxy['GalaxyCluster']:
cluster_value = cluster['type']
if cluster_value in galaxies_cluster:
galaxies_cluster[cluster_value] = galaxies_cluster[cluster_value] + 1
else:
galaxies_cluster[cluster_value] = 1
report['number_of_attributes'] = number_of_attributes
report['number_of_attributes_to_ids'] = number_of_attributes_to_ids
report['attr_type'] = attr_type
report['attr_category'] = attr_category
report['tags_type'] = tags_type
report['tags_tlp'] = tags_tlp
report['tags_misp_galaxy_mitre'] = tags_misp_galaxy_mitre
report['tags_misp_galaxy'] = tags_misp_galaxy
report['tags_misp_galaxy_threat_actor'] = tags_misp_galaxy_threat_actor
report['galaxies'] = galaxies
report['galaxies_cluster'] = galaxies_cluster
# General MISP statistics
user_statistics = misp.users_statistics()
if user_statistics and 'errors' not in user_statistics:
report['user_statistics'] = user_statistics
# Return the report data
return report
except Exception as e:
sys.exit('Unable to get statistics from MISP')
def build_report(report, timeframe, misp_url, sanitize_report=True):
'''
Build the body of the report and optional attachments
'''
attachments = {}
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
report_body = "MISP Report %s for last %s on %s\n-------------------------------------------------------------------------------" % (current_date, timeframe, misp_url)
else:
report_body = "MISP Report %s from %s to %s on %s\n-------------------------------------------------------------------------------" % (current_date, date_from, date_to, misp_url)
report_body = report_body + '\nNew or updated events: %s' % report['number_of_misp_events']
report_body = report_body + '\nNew or updated attributes: %s' % report['number_of_attributes']
report_body = report_body + '\nNew or updated attributes with IDS flag: %s' % report['number_of_attributes_to_ids']
report_body = report_body + '\n'
if 'user_statistics' in report:
report_body = report_body + '\nTotal events: %s' % report['user_statistics']['stats']['event_count']
report_body = report_body + '\nTotal attributes: %s' % report['user_statistics']['stats']['attribute_count']
report_body = report_body + '\nTotal users: %s' % report['user_statistics']['stats']['user_count']
report_body = report_body + '\nTotal orgs: %s' % report['user_statistics']['stats']['org_count']
report_body = report_body + '\nTotal correlation: %s' % report['user_statistics']['stats']['correlation_count']
report_body = report_body + '\nTotal proposals: %s' % report['user_statistics']['stats']['proposal_count']
report_body = report_body + '\n\n'
if args.mispevent:
report_body = report_body + '\nNew or updated events\n-------------------------------------------------------------------------------'
attachments['misp_events'] = 'ID;Title;Date;Updated;Published;ThreatLevel;AnalysisStatus'
for el in report['misp_events']:
report_body = report_body + '\n #%s %s (%s) \t%s \n\t\t\t\t(Date: %s, Updated: %s, Published: %s)' % (el['id'], el['threat_level'], el['analysis_completion'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'])
attachments['misp_events'] = attachments['misp_events'] + '\n%s;%s;%s;%s;%s;%s;%s' % (el['id'], el['title'].decode('utf-8'), el['date'], el['timestamp'], el['publish_timestamp'], el['threat_level'], el['analysis_completion'])
report_body, attachments['attr_category'] = add_report_body(report_body, 'New or updated attributes - Category', report['attr_category'], 'AttributeCategory;Qt')
report_body, attachments['attr_type'] = add_report_body(report_body, 'New or updated attributes - Type', report['attr_type'], 'AttributeType;Qt')
report_body, attachments['tags_tlp'] = add_report_body(report_body, 'TLP Codes', report['tags_tlp'], 'TLP;Qt')
report_body, attachments['tags_misp_galaxy'] = add_report_body(report_body, 'Tag MISP Galaxy', report['tags_misp_galaxy'], 'MISPGalaxy;Qt')
report_body, attachments['tags_misp_galaxy_mitre'] = add_report_body(report_body, 'Tag MISP Galaxy Mitre', report['tags_misp_galaxy_mitre'], 'MISPGalaxyMitre;Qt')
report_body, attachments['tags_misp_galaxy_threat_actor'] = add_report_body(report_body, 'Tag MISP Galaxy Threat Actor', report['tags_misp_galaxy_threat_actor'], 'MISPGalaxyThreatActor;Qt')
report_body, attachments['tags_type'] = add_report_body(report_body, 'Tags', report['tags_type'], 'Tag;Qt')
report_body, attachments['galaxies'] = add_report_body(report_body, 'Galaxies', report['galaxies'], 'Galaxies;Qt')
report_body, attachments['galaxies_cluster'] = add_report_body(report_body, 'Galaxies Cluster', report['galaxies_cluster'], 'Galaxies;Qt')
if sanitize_report:
mitre_tactic = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Tactic')
mitre_group = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Group')
mitre_software = get_sanitized_report(report['tags_misp_galaxy_mitre'], 'ATT&CK Software')
threat_actor = get_sanitized_report(report['tags_misp_galaxy_threat_actor'], 'MISP Threat Actor')
misp_tag = get_sanitized_report(report['tags_type'], 'MISP Tags', False, True)
report_body, attachments['mitre_tactics'] = add_report_body(report_body, 'MITRE ATT&CK Tactics (sanitized)', mitre_tactic, 'MITRETactics;Qt')
report_body, attachments['mitre_group'] = add_report_body(report_body, 'MITRE ATT&CK Group (sanitized)', mitre_group, 'MITREGroup;Qt')
report_body, attachments['mitre_software'] = add_report_body(report_body, 'MITRE ATT&CK Software (sanitized)', mitre_software, 'MITRESoftware;Qt')
report_body, attachments['threat_actor'] = add_report_body(report_body, 'MISP Threat Actor (sanitized)', threat_actor, 'MISPThreatActor;Qt')
report_body, attachments['misp_tag'] = add_report_body(report_body, 'Tags (sanitized)', misp_tag, 'MISPTags;Qt')
report_body = report_body + "\n\nMISP Reporter Finished\n"
return report_body, attachments
def add_report_body(report_body, subtitle, data_object, csv_title):
'''
Add a section to the report body text
'''
if report_body:
report_body = report_body + '\n\n'
report_body = report_body + '\n%s\n-------------------------------------------------------------------------------' % subtitle
data_object_s = sorted(data_object.items(), key=lambda kv: (kv[1], kv[0]), reverse=True)
csv_attachment = csv_title
for el in data_object_s:
report_body = report_body + "\n%s \t %s" % (el[0], el[1])
csv_attachment = csv_attachment + '\n%s;%s' % (el[0], el[1])
return report_body, csv_attachment
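For reference, a minimal sketch of how add_report_body is used; the counts dictionary below is hypothetical and only illustrates the expected {label: count} shape:
>>> body, csv_part = add_report_body('', 'TLP Codes', {'tlp:white': 12, 'tlp:green': 3}, 'TLP;Qt')
>>> print(csv_part)
TLP;Qt
tlp:white;12
tlp:green;3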
def msg_attach(content, filename):
'''
Return a message attachment object
'''
part = MIMEBase('application', "octet-stream")
part.set_payload(content)
part.add_header('Content-Disposition', 'attachment; filename="%s"' % filename)
return part
def print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url):
'''
Print (or send) the report
'''
if args.mail:
now = datetime.now()
current_date = now.strftime(ts_format)
if timeframe:
subject = "MISP Report %s for last %s on %s" % (current_date, timeframe, misp_url)
else:
subject = "MISP Report %s from %s to %s on %s" % (current_date, date_from, date_to, misp_url)
msg = MIMEMultipart()
msg['From'] = smtp_from
msg['To'] = smtp_to
msg['Subject'] = subject
msg.attach(MIMEText(report_body, 'text'))
if args.mispevent:
part = MIMEBase('application', "octet-stream")
part.set_payload(attachments['misp_events'])
part.add_header('Content-Disposition', 'attachment; filename="misp_events.csv"')
msg.attach(part)
msg.attach(msg_attach(attachments['attr_type'], 'attr_type.csv'))
msg.attach(msg_attach(attachments['attr_category'], 'attr_category.csv'))
msg.attach(msg_attach(attachments['tags_tlp'], 'tags_tlp.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_mitre'], 'tags_misp_galaxy_mitre.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy'], 'tags_misp_galaxy.csv'))
msg.attach(msg_attach(attachments['tags_misp_galaxy_threat_actor'], 'tags_misp_galaxy_threat_actor.csv'))
msg.attach(msg_attach(attachments['tags_type'], 'tags_type.csv'))
msg.attach(msg_attach(attachments['galaxies'], 'galaxies.csv'))
msg.attach(msg_attach(attachments['galaxies_cluster'], 'galaxies_cluster.csv'))
msg.attach(msg_attach(attachments['misp_tag'], 'misp_tag.csv'))
msg.attach(msg_attach(attachments['threat_actor'], 'threat_actor.csv'))
msg.attach(msg_attach(attachments['mitre_software'], 'mitre_software.csv'))
msg.attach(msg_attach(attachments['mitre_group'], 'mitre_group.csv'))
msg.attach(msg_attach(attachments['mitre_tactics'], 'mitre_tactics.csv'))
server = smtplib.SMTP(smtp_server)
server.sendmail(smtp_from, smtp_to, msg.as_string())
else:
print(report_body)
def get_sanitized_report(dataset, sanitize_selector='ATT&CK Tactic', lower=False, add_not_sanitized=False):
'''
Remove or bundle some of the tags
Quick'n'dirty; this could also be done by using the galaxy/tags definitions
'''
# If you add the full element it gets replaced by an empty string; this allows filtering out non-relevant items
sanitize_set = {
'ATT&CK Tactic': ['misp-galaxy:mitre-enterprise-attack-pattern="', 'misp-galaxy:mitre-pre-attack-pattern="', 'misp-galaxy:mitre-mobile-attack-pattern="', 'misp-galaxy:mitre-attack-pattern="', 'misp-galaxy:mitre-enterprise-attack-attack-pattern="', 'misp-galaxy:mitre-pre-attack-attack-pattern="', 'misp-galaxy:mitre-enterprise-attack-attack-pattern="', 'misp-galaxy:mitre-mobile-attack-attack-pattern="'],
'ATT&CK Group': ['misp-galaxy:mitre-enterprise-intrusion-set="', 'misp-galaxy:mitre-pre-intrusion-set="', 'misp-galaxy:mitre-mobile-intrusion-set="', 'misp-galaxy:mitre-intrusion-set="', 'misp-galaxy:mitre-enterprise-attack-intrusion-set="', 'misp-galaxy:mitre-pre-attack-intrusion-set="', 'misp-galaxy:mitre-mobile-attack-intrusion-set="'],
'ATT&CK Software': ['misp-galaxy:mitre-enterprise-malware="', 'misp-galaxy:mitre-pre-malware="', 'misp-galaxy:mitre-mobile-malware="', 'misp-galaxy:mitre-malware="', 'misp-galaxy:mitre-enterprise-attack-tool="', 'misp-galaxy:mitre-enterprise-tool="', 'misp-galaxy:mitre-pre-tool="', 'misp-galaxy:mitre-mobile-tool="', 'misp-galaxy:mitre-tool="', 'misp-galaxy:mitre-enterprise-attack-malware="'],
'MISP Threat Actor': ['misp-galaxy:threat-actor="'],
'MISP Tags': ['circl:incident-classification="', 'osint:source-type="blog-post"', 'misp-galaxy:tool="', 'CERT-XLM:malicious-code="', 'circl:topic="', 'ddos:type="', 'ecsirt:fraud="', 'dnc:malware-type="', 'enisa:nefarious-activity-abuse="', 'europol-incident:information-gathering="', 'misp-galaxy:ransomware="', 'misp-galaxy:rat="', 'misp-galaxy:social-dark-patterns="', 'misp-galaxy:tool="', 'misp:threat-level="', 'ms-caro-malware:malware-platform=', 'ms-caro-malware:malware-type=', 'veris:security_incident="', 'veris:attribute:integrity:variety="', 'veris:actor:motive="', 'misp-galaxy:banker="', 'misp-galaxy:malpedia="', 'misp-galaxy:botnet="', 'malware_classification:malware-category="', 'TLP: white', 'TLP: Green',
'inthreat:event-src="feed-osint"', 'tlp:white', 'tlp:amber', 'tlp:green', 'tlp:red', 'osint:source-type="blog-post"', 'Partner Feed', 'IBM XForce', 'type:OSINT', 'malware:', 'osint:lifetime="perpetual"', 'Actor:', 'osint:certainty="50"', 'Banker:', 'Group:', 'Threat:',
'ncsc-nl-ndn:feed="selected"', 'misp-galaxy:microsoft-activity-group="', 'admiralty-scale:source-reliability="b"', 'admiralty-scale:source-reliability="a"', 'admiralty-scale:information-credibility="2"', 'admiralty-scale:information-credibility="3"',
'feed:source="CESICAT"', 'osint:source-type="automatic-analysis"', 'workflow:state="complete"', 'osint:source-type="technical-report"',
'csirt_case_classification:incident-category="', 'dnc:driveby-type="', 'veris:action:social:variety="', 'osint:source-type="',
'osint:source-type="microblog-post"', 'ecsirt:malicious-code="', 'misp-galaxy:sector="', 'veris:action:variety=', 'label=', 'csirt_case_classification:incident-category="', 'admiralty-scale:source-reliability="c"', 'workflow:todo="review"', 'LDO-CERT:detection="toSIEM"', 'Threat tlp:White', 'Threat Type:', 'adversary:infrastructure-state="active"', 'cirl:incident-classification:', 'misp-galaxy:android="', 'dnc:infrastructure-type="', 'ecsirt:information-gathering="', 'ecsirt:intrusions="', 'dhs-ciip-sectors:DHS-critical-sectors="', 'malware_classification:obfuscation-technique="no-obfuscation"',
'riskiq:threat-type="', 'veris:action:hacking:variety="', 'veris:action:social:target="', 'workflow:state="incomplete"', 'workflow:todo="add-tagging"', 'workflow:todo="add-context"', 'europol-incident:availability="', 'label=', 'misp-galaxy:stealer="', 'misp-galaxy:exploit-kit="', 'rsit:availability="', 'rsit:fraud="', 'ransomware:type="', 'veris:action:variety=', 'malware:',
'ecsirt:abusive-content="']}
if sanitize_selector == 'MISP Tags':
sanitize_set['MISP Tags'] = sanitize_set['MISP Tags'] + sanitize_set['ATT&CK Tactic'] + sanitize_set['ATT&CK Group'] + sanitize_set['ATT&CK Software'] + sanitize_set['MISP Threat Actor']
result_sanitize_set = {}
if dataset:
for element in dataset:
sanited = False
for sanitize_el in sanitize_set[sanitize_selector]:
if sanitize_el in element:
sanited = True
new_el = element.replace(sanitize_el, '').replace('"', '').strip()
if lower:
new_el = new_el.lower()
result_sanitize_set[new_el] = dataset[element]
if add_not_sanitized and not sanited:
new_el = element.strip()
if lower:
new_el = new_el.lower()
result_sanitize_set[new_el] = dataset[element]
return result_sanitize_set
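A small sketch of what get_sanitized_report does to a tag/count dictionary; the counts are made up for illustration. Tags matching the selector get their galaxy prefix stripped, non-matching tags are dropped unless add_not_sanitized is set:
>>> tags = {'misp-galaxy:mitre-attack-pattern="Spearphishing Link - T1192"': 7,
...         'misp-galaxy:mitre-intrusion-set="APT29 - G0016"': 2}
>>> get_sanitized_report(tags, 'ATT&CK Tactic')
{'Spearphishing Link - T1192': 7}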
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Generate a report of your MISP statistics.')
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument('-t', '--timeframe', action='store', help='Timeframe to include in the report')
group.add_argument('-f', '--date_from', action='store', help='Start date of query (YYYY-MM-DD)')
parser.add_argument('-u', '--date-to', action='store', help='End date of query (YYYY-MM-DD)')
parser.add_argument('-e', '--mispevent', action='store_true', help='Include MISP event titles')
parser.add_argument('-m', '--mail', action='store_true', help='Mail the report')
parser.add_argument('-o', '--mailoptions', action='store', help='mailoptions: \'smtp_from=INSERT_FROM;smtp_to=INSERT_TO;smtp_server=localhost\'')
args = parser.parse_args()
misp = init(misp_url, misp_key, misp_verifycert)
timeframe = args.timeframe
if not timeframe:
date_from = args.date_from
if not args.date_to:
today = date.today()
date_to = today.strftime("%Y-%m-%d")
else:
date_to = args.date_to
else:
date_from = None
date_to = None
ts_format = '%Y-%m-%d %H:%M:%S'
threat_levels = ['High', 'Medium', 'Low', 'Undef']
analysis_completion = ['Initial', 'Ongoing', 'Complete']
smtp_from = 'INSERT_FROM'
smtp_to = 'INSERT_TO'
smtp_server = 'localhost'
if args.mailoptions:
mailoptions = args.mailoptions.split(';')
for s in mailoptions:
if s.split('=')[0] == 'smtp_from':
smtp_from = s.split('=')[1]
if s.split('=')[0] == 'smtp_to':
smtp_to = s.split('=')[1]
if s.split('=')[0] == 'smtp_server':
smtp_server = s.split('=')[1]
report = get_data(misp, timeframe, date_from, date_to)
if report:
report_body, attachments = build_report(report, timeframe, misp_url)
print_report(report_body, attachments, smtp_from, smtp_to, smtp_server, misp_url)
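Example invocations of this reporter, assuming it is saved as misp_reporter.py (the script name and timeframe value are illustrative; the mailoptions string follows the format documented in the -o help text):
# Statistics for the last 7 days, printed to stdout
python3 misp_reporter.py -t 7d -e

# Statistics for a fixed date range, mailed with the CSV attachments
python3 misp_reporter.py -f 2020-01-01 -u 2020-01-31 -e -m \
    -o 'smtp_from=reports@example.com;smtp_to=soc@example.com;smtp_server=localhost'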

View File

@ -1,28 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
def init(url, key):
return PyMISP(url, key, misp_verifycert)
def fetch(m, all_events, event):
if all_events:
print(m.download_all_suricata().text)
else:
print(m.download_suricata_rule_event(event).text)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Download Suricata events.')
parser.add_argument("-a", "--all", action='store_true', help="Download all suricata rules available.")
parser.add_argument("-e", "--event", help="Download suricata rules from one event.")
args = parser.parse_args()
misp = init(misp_url, misp_key)
fetch(misp, args.all, args.event)

View File

@ -1,18 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import ExpandedPyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import json
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Get statistics from tags.')
parser.add_argument("-p", "--percentage", action='store_true', default=None, help="An optional field, if set, it will return the results in percentages, otherwise it returns exact count.")
parser.add_argument("-n", "--namesort", action='store_true', default=None, help="An optional field, if set, values are sort by the namespace, otherwise the sorting will happen on the value.")
args = parser.parse_args()
misp = ExpandedPyMISP(misp_url, misp_key, misp_verifycert)
stats = misp.get_tags_statistics(args.percentage, args.namesort)
print(json.dumps(stats))

14
examples/trustar.conf Normal file
View File

@ -0,0 +1,14 @@
[trustar]
# endpoint that provides oauth token
auth_endpoint = https://api.trustar.co/oauth/token
# base API URL access endpoint
api_endpoint = https://api.trustar.co/api/1.3
# Generate and copy your API key and secret from the user API settings page on Station: https://station.trustar.co/settings/api
user_api_key = '#{API_KEY}'
user_api_secret = '#{API_SECRET}'
# OPTIONAL: enter one or more comma-separated enclave IDs to submit to - get these from the API settings page on Station
# enclave_ids = abcdef,1234f
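The TruStar() client used in the accompanying example is expected to pick this file up itself; the snippet below is only a sanity check of the format using the standard library, assuming the file sits in the working directory:
import configparser

config = configparser.ConfigParser()
config.read('trustar.conf')
print(config['trustar']['api_endpoint'])  # https://api.trustar.co/api/1.3
# enclave_ids is optional and may be commented out
raw = config.get('trustar', 'enclave_ids', fallback='')
enclave_ids = [e.strip() for e in raw.split(',') if e.strip()]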

59
examples/trustar_misp.py Normal file
View File

@ -0,0 +1,59 @@
from trustar import TruStar, datetime_to_millis
from datetime import datetime, timedelta
from keys import misp_url, misp_key, misp_verifycert
from pymisp import PyMISP, MISPEvent, MISPOrganisation, MISPObject
# enclave_ids = '7a33144f-aef3-442b-87d4-dbf70d8afdb0' # RHISAC
enclave_ids = None
time_interval = {'days': 30, 'hours': 0}
distribution = None # Optional, defaults to MISP.default_event_distribution in MISP config
threat_level_id = None # Optional, defaults to MISP.default_event_threat_level in MISP config
analysis = None # Optional, defaults to 0 (initial analysis)
tru = TruStar()
misp = PyMISP(misp_url, misp_key, misp_verifycert)
# date range for pulling reports, counted back from when the script is run (see time_interval above)
to_time = datetime.now()
from_time = to_time - timedelta(**time_interval)
# convert to millis since epoch
to_time = datetime_to_millis(to_time)
from_time = datetime_to_millis(from_time)
if not enclave_ids:
reports = tru.get_reports(from_time=from_time,
to_time=to_time)
else:
reports = tru.get_reports(from_time=from_time,
to_time=to_time,
is_enclave=True,
enclave_ids=enclave_ids)
# loop through each trustar report and create MISP events for each
for report in reports:
# initialize and set MISPEvent()
event = MISPEvent()
event.info = report.title
event.distribution = distribution
event.threat_level_id = threat_level_id
event.analysis = analysis
# get tags for report
for tag in tru.get_enclave_tags(report.id):
event.add_tag(tag.name)
obj = MISPObject('trustar_report', standalone=False, strict=True)
# get indicators for report
for indicator in tru.get_indicators_for_report(report.id):
obj.add_attribute(indicator.type, indicator.value)
event.add_object(obj)
# post each event to MISP via API
misp.add_event(event)

View File

@ -1,213 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Koen Van Impe
VMRay automatic import
Put this script in crontab to run every 15 or 60 minutes
*/15 * * * * mispuser /usr/bin/python3 /home/mispuser/PyMISP/examples/vmray_automation.py
Calls "vmray_import" for all events that have an 'incomplete' VMRay analysis
Do inline config in "main"
'''
from pymisp import PyMISP
from keys import misp_url, misp_key, misp_verifycert
import argparse
import os
import json
import datetime
import time
import requests
import sys
# Suppress those "Unverified HTTPS request is being made"
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def init(url, key):
'''
Template to get MISP module started
'''
return PyMISP(url, key, misp_verifycert, 'json')
def get_vmray_config(url, key, misp_verifycert, default_wait_period):
'''
Fetch configuration settings from MISP
Includes VMRay API and modules URL
'''
try:
misp_headers = {'Content-Type': 'application/json', 'Accept': 'application/json', 'Authorization': key}
req = requests.get(url + 'servers/serverSettings.json', verify=misp_verifycert, headers=misp_headers)
if req.status_code == 200:
req_json = req.json()
if 'finalSettings' in req_json:
finalSettings = req_json['finalSettings']
vmray_api = ''
vmray_url = ''
vmray_wait_period = 0
for el in finalSettings:
# Is the vmray import module enabled?
if el['setting'] == 'Plugin.Import_vmray_import_enabled':
vmray_import_enabled = el['value']
if vmray_import_enabled is False:
break
# Get the VMRay API key from the MISP settings
elif el['setting'] == 'Plugin.Import_vmray_import_apikey':
vmray_api = el['value']
# The VMRay URL to query
elif el['setting'] == 'Plugin.Import_vmray_import_url':
vmray_url = el['value'].replace('/', '\\/')
# MISP modules - Port?
elif el['setting'] == 'Plugin.Import_services_port':
module_import_port = el['value']
# MISP modules - URL
elif el['setting'] == 'Plugin.Import_services_url':
module_import_url = el['value'].replace('\/\/', '//')
# Wait period
elif el['setting'] == 'Plugin.Import_vmray_import_wait_period':
vmray_wait_period = abs(int(el['value']))
if vmray_wait_period < 1:
vmray_wait_period = default_wait_period
else:
sys.exit('Did not receive a 200 code from MISP')
if vmray_import_enabled and vmray_api and vmray_url and module_import_port and module_import_url:
return {'vmray_wait_period': vmray_wait_period, 'vmray_api': vmray_api, 'vmray_url': vmray_url, 'module_import_port': module_import_port, 'module_import_url': module_import_url}
else:
sys.exit('Did not receive all the necessary configuration information from MISP')
except Exception as e:
sys.exit('Unable to get VMRay config from MISP')
def search_vmray_incomplete(m, url, wait_period, module_import_url, module_import_port, vmray_url, vmray_api, vmray_attribute_category, vmray_include_analysisid, vmray_include_imphash_ssdeep, vmray_include_extracted_files, vmray_include_analysisdetails, vmray_include_vtidetails, custom_tags_incomplete, custom_tags_complete):
'''
Search for the events with VMRay samples that are marked incomplete
and then update these events
'''
controller = 'attributes'
vmray_value = 'VMRay Sample ID:' # How sample IDs are stored in MISP
req = None
# Search for the events
try:
result = m.search(controller, tags=custom_tags_incomplete)
response = result['response']
if len(response) == 0:
sys.exit("No VMRay attributes found that match %s" % custom_tags_incomplete)
attribute = response['Attribute']
if len(attribute) == 0:
sys.exit("No VMRay attributes found that match %s" % custom_tags_incomplete)
timestamp = int(attribute[0]["timestamp"])
# Not enough time has gone by to look up the analysis jobs
if int((time.time() - timestamp) / 60) < int(wait_period):
if module_DEBUG:
r_timestamp = datetime.datetime.fromtimestamp(timestamp).strftime('%Y-%m-%d %H:%M:%S')
print("Attribute to recent for wait_period (%s minutes) - timestamp attribute: %s (%s minutes old)" % (wait_period, r_timestamp, round((int(time.time() - timestamp) / 60), 2)))
return False
if module_DEBUG:
print("All attributes older than %s" % int(wait_period))
for att in attribute:
value = att['value']
if vmray_value in value: # We found a sample ID
att_id = att['id']
att_uuid = att['uuid']
# VMRay Sample IDs are stored as VMRay Sample ID: 2796577
vmray_sample_id = value.split(vmray_value)[1].strip()
if vmray_sample_id.isdigit():
event_id = att['event_id']
if module_DEBUG:
print("Found event %s with matching tags %s for sample id %s " % (event_id, custom_tags_incomplete, vmray_sample_id))
# Prepare request to send to vmray_import via misp modules
misp_modules_url = module_import_url + ':' + module_import_port + '/query'
misp_modules_headers = {'Content-Type': 'application/json'}
misp_modules_body = '{ "sample_id":"' + vmray_sample_id + '","module":"vmray_import","event_id":"' + event_id + '","config":{"apikey":"' + vmray_api + '","url":"' + vmray_url + '","include_analysisid":"' + vmray_include_analysisid + '","include_analysisdetails":"' + vmray_include_analysisdetails + '","include_extracted_files":"' + vmray_include_extracted_files + '","include_imphash_ssdeep":"' + vmray_include_imphash_ssdeep + '","include_vtidetails":"' + vmray_include_vtidetails + '","sample_id":"' + vmray_sample_id + '"},"data":""}'
req = requests.post(misp_modules_url, data=misp_modules_body, headers=misp_modules_headers)
if module_DEBUG and req is not None:
print("Response code from submitting to MISP modules %s" % (req.status_code))
# Successful response from the misp modules?
if req.status_code == 200:
req_json = req.json()
if "error" in req_json:
print("Error code in reply %s " % req_json["error"])
continue
else:
results = req_json["results"]
# Walk through all results in the misp-module reply
for el in results:
to_ids = True
values = el['values']
types = el['types']
if "to_ids" in el:
to_ids = el['to_ids']
if "text" in types:
to_ids = False
comment = el['comment']
if len(comment) < 1:
comment = "Enriched via the vmray_import module"
# Attribute can belong in different types
for type in types:
try:
r = m.add_named_attribute(event_id, type, values, vmray_attribute_category, to_ids, comment)
if module_DEBUG:
print("Add event %s: %s as %s (%s) (toids: %s)" % (event_id, values, type, comment, to_ids))
except Exception as e:
if module_DEBUG:
print("Unable to add attribute %s as type %s for event %s" % (values, type, event_id))
continue
# Remove 'incomplete' state tags
m.untag(att_uuid, custom_tags_incomplete)
# Update tags to 'complete' state
m.tag(att_uuid, custom_tags_complete)
if module_DEBUG:
print("Updated event %s" % event_id)
else:
sys.exit('MISP modules did not return HTTP 200 code (event %s ; sampleid %s)' % (event_id, vmray_sample_id))
except Exception as e:
sys.exit("Invalid response received from MISP : %s", e)
if __name__ == '__main__':
module_DEBUG = True
# Set some defaults to be used in this module
vmray_attribute_category = 'External analysis'
vmray_include_analysisid = '0'
vmray_include_imphash_ssdeep = '0'
vmray_include_extracted_files = '0'
vmray_include_analysisdetails = '0'
vmray_include_vtidetails = '0'
custom_tags_incomplete = 'workflow:state="incomplete"'
custom_tags_complete = 'workflow:state="complete"'
default_wait_period = 30
misp = init(misp_url, misp_key)
vmray_config = get_vmray_config(misp_url, misp_key, misp_verifycert, default_wait_period)
search_vmray_incomplete(misp, misp_url, vmray_config['vmray_wait_period'], vmray_config['module_import_url'], vmray_config['module_import_port'], vmray_config['vmray_url'], vmray_config['vmray_api'], vmray_attribute_category, vmray_include_analysisid, vmray_include_imphash_ssdeep, vmray_include_extracted_files, vmray_include_analysisdetails, vmray_include_vtidetails, custom_tags_incomplete, custom_tags_complete)

View File

@ -1,7 +1,5 @@
__version__ = '2.4.117.1'
__version__ = '2.4.121.1'
import logging
import warnings
import sys
FORMAT = "%(levelname)s [%(filename)s:%(lineno)s - %(funcName)s() ] %(message)s"
formatter = logging.Formatter(FORMAT)
@ -13,24 +11,18 @@ logger.addHandler(default_handler)
logger.setLevel(logging.WARNING)
def warning_2020():
if sys.version_info < (3, 6):
warnings.warn("""
Python 2.7 officially reaches end of life on 2020-01-01. For this occasion,
we decided to review which versions of Python we support, and our conclusion
is to only support Python 3.6+ starting 2020-01-01.
Every version of PyMISP released after 2020-01-01 will fail if the
Python interpreter is older than Python 3.6.
**Please update your codebase.**""", DeprecationWarning, stacklevel=3)
everything_broken = '''Unknown error: the response is not in JSON.
Something is broken server-side, please send us everything that follows (careful with the auth key):
Request headers:
{}
Request body:
{}
Response (if any):
{}'''
try:
warning_2020()
from .exceptions import PyMISPError, NewEventError, NewAttributeError, MissingDependency, NoURL, NoKey, InvalidMISPObject, UnknownMISPObjectTemplate, PyMISPInvalidFormat, MISPServerError, PyMISPNotImplementedYet, PyMISPUnexpectedResponse, PyMISPEmptyResponse # noqa
from .api import PyMISP # noqa
from .abstract import AbstractMISP, MISPEncode, pymisp_json_default, MISPTag, Distribution, ThreatLevel, Analysis # noqa
from .mispevent import MISPEvent, MISPAttribute, MISPObjectReference, MISPObjectAttribute, MISPObject, MISPUser, MISPOrganisation, MISPSighting, MISPLog, MISPShadowAttribute, MISPWarninglist, MISPTaxonomy, MISPNoticelist, MISPObjectTemplate, MISPSharingGroup, MISPRole, MISPServer, MISPFeed, MISPEventDelegation, MISPUserSetting # noqa
from .tools import AbstractMISPObjectGenerator # noqa
@ -39,18 +31,18 @@ try:
from .tools import openioc # noqa
from .tools import ext_lookups # noqa
if sys.version_info >= (3, 6):
from .aping import ExpandedPyMISP # noqa
from .tools import load_warninglists # noqa
# Let's not bother with old python
try:
from .tools import reportlab_generator # noqa
except ImportError:
# FIXME: The import should not raise an exception if reportlab isn't installed
pass
except NameError:
# FIXME: The import should not raise an exception if reportlab isn't installed
pass
from .api import PyMISP # noqa
from .api import PyMISP as ExpandedPyMISP # noqa
from .tools import load_warninglists # noqa
# Let's not bother with old python
try:
from .tools import reportlab_generator # noqa
except ImportError:
# FIXME: The import should not raise an exception if reportlab isn't installed
pass
except NameError:
# FIXME: The import should not raise an exception if reportlab isn't installed
pass
logger.debug('pymisp loaded properly')
except ImportError as e:
logger.warning('Unable to load pymisp properly: {}'.format(e))

View File

@ -1,114 +1,54 @@
#!/usr/bin/env python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import sys
import datetime
from datetime import date, datetime
from deprecated import deprecated
from deprecated import deprecated # type: ignore
from json import JSONEncoder
from uuid import UUID
from abc import ABCMeta
try:
from rapidjson import load
from rapidjson import loads
from rapidjson import dumps
import rapidjson
from rapidjson import load # type: ignore
from rapidjson import loads # type: ignore
from rapidjson import dumps # type: ignore
HAS_RAPIDJSON = True
except ImportError:
from json import load
from json import loads
from json import dumps
import json
HAS_RAPIDJSON = False
import logging
from enum import Enum
from typing import Union, Optional
from .exceptions import PyMISPInvalidFormat
from .exceptions import PyMISPInvalidFormat, PyMISPError
from collections.abc import MutableMapping
from functools import lru_cache
from pathlib import Path
logger = logging.getLogger('pymisp')
if sys.version_info < (3, 0):
from collections import MutableMapping
import os
from cachetools import cached, LRUCache
resources_path = Path(__file__).parent / 'data'
misp_objects_path = resources_path / 'misp-objects' / 'objects'
with (resources_path / 'describeTypes.json').open('r') as f:
describe_types = load(f)['result']
resources_path = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'data')
misp_objects_path = os.path.join(resources_path, 'misp-objects', 'objects')
with open(os.path.join(resources_path, 'describeTypes.json'), 'r') as f:
describe_types = load(f)['result']
# This is required because Python 2 is a pain.
from datetime import tzinfo, timedelta
class MISPFileCache(object):
# cache up to 150 JSON structures in class attribute
class UTC(tzinfo):
"""UTC"""
def utcoffset(self, dt):
return timedelta(0)
def tzname(self, dt):
return "UTC"
def dst(self, dt):
return timedelta(0)
class MISPFileCache(object):
# cache up to 150 JSON structures in class attribute
@staticmethod
@cached(cache=LRUCache(maxsize=150))
def _load_json(path):
if not os.path.exists(path):
return None
with open(path, 'r') as f:
data = load(f)
return data
elif sys.version_info < (3, 4):
from collections.abc import MutableMapping
from functools import lru_cache
import os
resources_path = os.path.join(os.path.abspath(os.path.dirname(__file__)), 'data')
misp_objects_path = os.path.join(resources_path, 'misp-objects', 'objects')
with open(os.path.join(resources_path, 'describeTypes.json'), 'r') as f:
describe_types = load(f)['result']
class MISPFileCache(object):
# cache up to 150 JSON structures in class attribute
@staticmethod
@lru_cache(maxsize=150)
def _load_json(path):
if not os.path.exists(path):
return None
with open(path, 'r') as f:
data = load(f)
return data
else:
from collections.abc import MutableMapping
from functools import lru_cache
from pathlib import Path
resources_path = Path(__file__).parent / 'data'
misp_objects_path = resources_path / 'misp-objects' / 'objects'
with (resources_path / 'describeTypes.json').open('r') as f:
describe_types = load(f)['result']
class MISPFileCache(object):
# cache up to 150 JSON structures in class attribute
@staticmethod
@lru_cache(maxsize=150)
def _load_json(path):
if not path.exists():
return None
with path.open('r') as f:
data = load(f)
return data
@staticmethod
@lru_cache(maxsize=150)
def _load_json(path: Path) -> Union[dict, None]:
if not path.exists():
return None
with path.open('r') as f:
data = load(f)
return data
class Distribution(Enum):
@ -133,10 +73,12 @@ class Analysis(Enum):
completed = 2
def _int_to_str(d):
def _int_to_str(d: dict) -> dict:
# transform all integer back to string
for k, v in d.items():
if isinstance(v, (int, float)) and not isinstance(v, bool):
if isinstance(v, dict):
d[k] = _int_to_str(v)
elif isinstance(v, int) and not isinstance(v, bool):
d[k] = str(v)
return d
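A tiny illustration of what _int_to_str does; the dictionary is hypothetical:
>>> _int_to_str({'id': 42, 'deleted': False, 'Tag': {'id': 7, 'name': 'tlp:white'}})
{'id': '42', 'deleted': False, 'Tag': {'id': '7', 'name': 'tlp:white'}}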
@ -146,7 +88,7 @@ class MISPEncode(JSONEncoder):
def default(self, obj):
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime.datetime, datetime.date)):
elif isinstance(obj, (datetime, date)):
return obj.isoformat()
elif isinstance(obj, Enum):
return obj.value
@ -155,81 +97,56 @@ class MISPEncode(JSONEncoder):
return JSONEncoder.default(self, obj)
if HAS_RAPIDJSON:
def pymisp_json_default(obj):
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime.datetime, datetime.date)):
return obj.isoformat()
elif isinstance(obj, Enum):
return obj.value
elif isinstance(obj, UUID):
return str(obj)
return rapidjson.default(obj)
else:
def pymisp_json_default(obj):
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime.datetime, datetime.date)):
return obj.isoformat()
elif isinstance(obj, Enum):
return obj.value
elif isinstance(obj, UUID):
return str(obj)
return json.default(obj)
class AbstractMISP(MutableMapping, MISPFileCache):
class AbstractMISP(MutableMapping, MISPFileCache, metaclass=ABCMeta):
__resources_path = resources_path
__misp_objects_path = misp_objects_path
__describe_types = describe_types
def __init__(self, **kwargs):
"""Abstract class for all the MISP objects"""
super(AbstractMISP, self).__init__()
self.__edited = True # As we create a new object, we assume it is edited
self.__not_jsonable = []
self.__self_defined_describe_types = None
"""Abstract class for all the MISP objects.
NOTE: Every method in every class inheriting this one only makes
changes in memory and does not modify data on a remote MISP instance.
To do so, you need to call the respective add_* or update_*
methods in ExpandedPyMISP/PyMISP.
"""
super().__init__()
self.__edited: bool = True # As we create a new object, we assume it is edited
self.__not_jsonable: list = []
self._fields_for_feed: set
self.__self_defined_describe_types: Union[dict, None] = None
self.uuid: str
if kwargs.get('force_timestamps') is not None:
# Ignore the edited objects and keep the timestamps.
self.__force_timestamps = True
self.__force_timestamps: bool = True
else:
self.__force_timestamps = False
# List of classes having tags
from .mispevent import MISPAttribute, MISPEvent
self.__has_tags = (MISPAttribute, MISPEvent)
if isinstance(self, self.__has_tags):
self.Tag = []
setattr(AbstractMISP, 'add_tag', AbstractMISP.__add_tag)
setattr(AbstractMISP, 'tags', property(AbstractMISP.__get_tags, AbstractMISP.__set_tags))
self.__force_timestamps: bool = False
@property
def describe_types(self):
def describe_types(self) -> dict:
if self.__self_defined_describe_types:
return self.__self_defined_describe_types
return self.__describe_types
@describe_types.setter
def describe_types(self, describe_types):
def describe_types(self, describe_types: dict):
self.__self_defined_describe_types = describe_types
@property
def resources_path(self):
def resources_path(self) -> Path:
return self.__resources_path
@property
def misp_objects_path(self):
def misp_objects_path(self) -> Path:
return self.__misp_objects_path
@misp_objects_path.setter
def misp_objects_path(self, misp_objects_path):
if sys.version_info >= (3, 0) and isinstance(misp_objects_path, str):
def misp_objects_path(self, misp_objects_path: Union[str, Path]):
if isinstance(misp_objects_path, str):
misp_objects_path = Path(misp_objects_path)
self.__misp_objects_path = misp_objects_path
def from_dict(self, **kwargs):
def from_dict(self, **kwargs) -> None:
"""Loading all the parameters as class properties, if they aren't `None`.
This method aims to be called when all the properties requiring a special
treatment are processed.
@ -242,19 +159,19 @@ class AbstractMISP(MutableMapping, MISPFileCache):
# We load an existing dictionary, marking it as not-edited
self.__edited = False
def update_not_jsonable(self, *args):
def update_not_jsonable(self, *args) -> None:
"""Add entries to the __not_jsonable list"""
self.__not_jsonable += args
def set_not_jsonable(self, *args):
def set_not_jsonable(self, args: list) -> None:
"""Set __not_jsonable to a new list"""
self.__not_jsonable = args
def from_json(self, json_string):
def from_json(self, json_string: str) -> None:
"""Load a JSON string"""
self.from_dict(**loads(json_string))
def to_dict(self):
def to_dict(self) -> dict:
"""Dump the class to a dictionary.
This method automatically removes the timestamp recursively in every object
that has been edited in order to let MISP update the event accordingly."""
@ -274,29 +191,43 @@ class AbstractMISP(MutableMapping, MISPFileCache):
continue
else:
val = self._datetime_to_timestamp(val)
if (attribute in ['first_seen', 'last_seen', 'datetime']
and isinstance(val, datetime)
and not val.tzinfo):
# Need to make sure the timezone is set. Otherwise, it will be processed as UTC on the server
val = val.astimezone()
to_return[attribute] = val
to_return = _int_to_str(to_return)
return to_return
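The timezone handling above matters for first_seen/last_seen/datetime fields; a minimal sketch of the behaviour, assuming a naive datetime coming from user code on a CET host:
from datetime import datetime

naive = datetime(2020, 2, 1, 10, 30)  # no tzinfo
aware = naive.astimezone()            # attaches the local timezone (Python 3.6+)
print(aware.isoformat())              # e.g. 2020-02-01T10:30:00+01:00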
def jsonable(self):
def jsonable(self) -> dict:
"""This method is used by the JSON encoder"""
return self.to_dict()
def _to_feed(self):
if not hasattr(self, '_fields_for_feed'):
raise Exception('Unable to export in the feed format, _fields_for_feed is missing.')
def _to_feed(self) -> dict:
if not hasattr(self, '_fields_for_feed') or not self._fields_for_feed:
raise PyMISPError('Unable to export in the feed format, _fields_for_feed is missing.')
if hasattr(self, '_set_default') and callable(self._set_default): # type: ignore
self._set_default() # type: ignore
to_return = {}
for field in self._fields_for_feed:
if getattr(self, field, None):
if getattr(self, field, None) is not None:
if field in ['timestamp', 'publish_timestamp']:
to_return[field] = self._datetime_to_timestamp(getattr(self, field))
elif field == 'date':
elif isinstance(getattr(self, field), (datetime, date)):
to_return[field] = getattr(self, field).isoformat()
else:
to_return[field] = getattr(self, field)
else:
if field == 'data':
# data in attribute is special
continue
raise PyMISPError('The field {} is required in {} when generating a feed.'.format(field, self.__class__.__name__))
to_return = _int_to_str(to_return)
return to_return
def to_json(self, sort_keys=False, indent=None):
def to_json(self, sort_keys: bool=False, indent: Optional[int]=None):
"""Dump recursively any class of type MISPAbstract to a json string"""
return dumps(self, default=pymisp_json_default, sort_keys=sort_keys, indent=indent)
@ -322,7 +253,7 @@ class AbstractMISP(MutableMapping, MISPFileCache):
return len([k for k in self.__dict__.keys() if not (k[0] == '_' or k in self.__not_jsonable)])
@property
def edited(self):
def edited(self) -> bool:
"""Recursively check if an object has been edited and update the flag accordingly
to the parent objects"""
if self.__edited:
@ -343,26 +274,23 @@ class AbstractMISP(MutableMapping, MISPFileCache):
if isinstance(val, bool):
self.__edited = val
else:
raise Exception('edited can only be True or False')
raise PyMISPError('edited can only be True or False')
def __setattr__(self, name, value):
if name[0] != '_' and not self.__edited and name in self.keys():
# The private members don't matter
# If we already have a key with that name, we're modifying it.
self.__edited = True
super(AbstractMISP, self).__setattr__(name, value)
super().__setattr__(name, value)
def _datetime_to_timestamp(self, d):
"""Convert a datetime.datetime object to a timestamp (int)"""
if isinstance(d, (int, str)) or (sys.version_info < (3, 0) and isinstance(d, unicode)):
def _datetime_to_timestamp(self, d: Union[int, float, str, datetime]) -> int:
"""Convert a datetime object to a timestamp (int)"""
if isinstance(d, (int, float, str)):
# Assume we already have a timestamp
return int(d)
if sys.version_info >= (3, 3):
return int(d.timestamp())
else:
return int((d - datetime.datetime.fromtimestamp(0, UTC())).total_seconds())
return int(d.timestamp())
def __add_tag(self, tag=None, **kwargs):
def _add_tag(self, tag=None, **kwargs):
"""Add a tag to the attribute (by name or a MISPTag object)"""
if isinstance(tag, str):
misp_tag = MISPTag()
@ -376,16 +304,13 @@ class AbstractMISP(MutableMapping, MISPFileCache):
misp_tag = MISPTag()
misp_tag.from_dict(**kwargs)
else:
raise PyMISPInvalidFormat("The tag is in an invalid format (can be either string, MISPTag, or an expanded dict): {}".format(tag))
raise PyMISPInvalidFormat(f"The tag is in an invalid format (can be either string, MISPTag, or an expanded dict): {tag}")
if misp_tag not in self.tags:
self.Tag.append(misp_tag)
self.edited = True
return misp_tag
def __get_tags(self):
"""Returns a lost of tags associated to this Attribute"""
return self.Tag
def __set_tags(self, tags):
def _set_tags(self, tags):
"""Set a list of prepared MISPTag."""
if all(isinstance(x, MISPTag) for x in tags):
self.Tag = tags
@ -408,21 +333,48 @@ class AbstractMISP(MutableMapping, MISPFileCache):
class MISPTag(AbstractMISP):
_fields_for_feed = {'name', 'colour'}
_fields_for_feed: set = {'name', 'colour'}
def __init__(self):
super(MISPTag, self).__init__()
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.name: str
def from_dict(self, **kwargs):
if kwargs.get('Tag'):
kwargs = kwargs.get('Tag')
super(MISPTag, self).from_dict(**kwargs)
super().from_dict(**kwargs)
def _set_default(self):
if not hasattr(self, 'colour'):
self.colour = '#ffffff'
def _to_feed(self):
if hasattr(self, 'exportable') and not self.exportable:
return False
return super(MISPTag, self)._to_feed()
return super()._to_feed()
def delete(self):
self.deleted = True
self.edited = True
if HAS_RAPIDJSON:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[dict, str]:
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime, date)):
return obj.isoformat()
elif isinstance(obj, Enum):
return obj.value
elif isinstance(obj, UUID):
return str(obj)
else:
def pymisp_json_default(obj: Union[AbstractMISP, datetime, date, Enum, UUID]) -> Union[dict, str]:
if isinstance(obj, AbstractMISP):
return obj.jsonable()
elif isinstance(obj, (datetime, date)):
return obj.isoformat()
elif isinstance(obj, Enum):
return obj.value
elif isinstance(obj, UUID):
return str(obj)
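A short sketch of how this default hook behaves when serialising mixed content; the dictionary below is made up:
import json
from datetime import date
from uuid import uuid4
from pymisp import pymisp_json_default

doc = {'date': date(2020, 2, 7), 'uuid': uuid4(), 'count': 3}
print(json.dumps(doc, default=pymisp_json_default))
# -> {"date": "2020-02-07", "uuid": "<random uuid>", "count": 3}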

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

View File

@ -54,6 +54,7 @@
"hex",
"impfuzzy",
"imphash",
"kusto-query",
"malware-sample",
"md5",
"mime-type",
@ -187,7 +188,9 @@
"domain",
"domain|ip",
"email-dst",
"email-src",
"email-subject",
"eppn",
"hassh-md5",
"hasshserver-md5",
"hex",
@ -237,6 +240,7 @@
"attachment",
"authentihash",
"cdhash",
"chrome-extension-id",
"comment",
"domain",
"email-attachment",
@ -318,6 +322,7 @@
"attachment",
"authentihash",
"cdhash",
"chrome-extension-id",
"comment",
"filename",
"filename|authentihash",
@ -418,6 +423,7 @@
"comment",
"email-dst",
"email-src",
"eppn",
"github-organisation",
"github-repository",
"github-username",
@ -508,6 +514,10 @@
"default_category": "Payload delivery",
"to_ids": 1
},
"chrome-extension-id": {
"default_category": "Payload delivery",
"to_ids": 1
},
"comment": {
"default_category": "Other",
"to_ids": 0
@ -612,6 +622,10 @@
"default_category": "Payload delivery",
"to_ids": 0
},
"eppn": {
"default_category": "Network activity",
"to_ids": 1
},
"filename": {
"default_category": "Payload delivery",
"to_ids": 1
@ -772,6 +786,10 @@
"default_category": "Social network",
"to_ids": 0
},
"kusto-query": {
"default_category": "Artifacts dropped",
"to_ids": 0
},
"last-name": {
"default_category": "Person",
"to_ids": 0
@ -1109,6 +1127,7 @@
"campaign-name",
"cc-number",
"cdhash",
"chrome-extension-id",
"comment",
"community-id",
"cookie",
@ -1135,6 +1154,7 @@
"email-subject",
"email-thread-index",
"email-x-mailer",
"eppn",
"filename",
"filename|authentihash",
"filename|impfuzzy",
@ -1175,6 +1195,7 @@
"issue-date-of-the-visa",
"ja3-fingerprint-md5",
"jabber-id",
"kusto-query",
"last-name",
"link",
"mac-address",

@ -1 +1 @@
Subproject commit 3d7b09e9c47929a437c10b77b8525ed5fc16def6
Subproject commit 3ba77c9d2cfea5c27bc8935812d83be54c4f0fd4

View File

@ -66,7 +66,7 @@ class PyMISPNotImplementedYet(PyMISPError):
class PyMISPUnexpectedResponse(PyMISPError):
pass
pass
class PyMISPEmptyResponse(PyMISPError):

File diff suppressed because it is too large Load Diff

0
pymisp/py.typed Normal file
View File

View File

@ -1,11 +1,6 @@
import sys
from .vtreportobject import VTReportObject # noqa
from .neo4j import Neo4j # noqa
from .fileobject import FileObject # noqa
from .peobject import PEObject, PESectionObject # noqa
from .elfobject import ELFObject, ELFSectionObject # noqa
from .machoobject import MachOObject, MachOSectionObject # noqa
from .create_misp_object import make_binary_objects # noqa
from .abstractgenerator import AbstractMISPObjectGenerator # noqa
from .genericgenerator import GenericObjectGenerator # noqa
@ -16,8 +11,21 @@ from .domainipobject import DomainIPObject # noqa
from .asnobject import ASNObject # noqa
from .geolocationobject import GeolocationObject # noqa
if sys.version_info >= (3, 6):
from .emailobject import EMailObject # noqa
from .vehicleobject import VehicleObject # noqa
from .csvloader import CSVLoader # noqa
from .sshauthkeyobject import SSHAuthorizedKeysObject # noqa
from .emailobject import EMailObject # noqa
from .vehicleobject import VehicleObject # noqa
from .csvloader import CSVLoader # noqa
from .sshauthkeyobject import SSHAuthorizedKeysObject # noqa
from .feed import feed_meta_generator # noqa
try:
from .urlobject import URLObject # noqa
except ImportError:
# Requires faup, which is a bit difficult to install
pass
try:
from .peobject import PEObject, PESectionObject # noqa
from .elfobject import ELFObject, ELFSectionObject # noqa
from .machoobject import MachOObject, MachOSectionObject # noqa
except ImportError:
# Requires lief, which is a bit difficult to install
pass

View File

@ -5,11 +5,12 @@ from .. import MISPObject
from ..exceptions import InvalidMISPObject
from datetime import datetime, date
from dateutil.parser import parse
from typing import Union, Optional
class AbstractMISPObjectGenerator(MISPObject):
def _detect_epoch(self, timestamp):
def _detect_epoch(self, timestamp: Union[str, int, float]) -> bool:
try:
tmp = float(timestamp)
if tmp < 30000000:
@ -20,7 +21,7 @@ class AbstractMISPObjectGenerator(MISPObject):
except ValueError:
return False
def _sanitize_timestamp(self, timestamp):
def _sanitize_timestamp(self, timestamp: Optional[Union[datetime, date, dict, str, int, float]]=None) -> datetime:
if not timestamp:
return datetime.now()
@ -31,12 +32,14 @@ class AbstractMISPObjectGenerator(MISPObject):
elif isinstance(timestamp, dict):
if not isinstance(timestamp['value'], datetime):
timestamp['value'] = parse(timestamp['value'])
return timestamp
elif not isinstance(timestamp, datetime): # Supported: float/int/string
if self._detect_epoch(timestamp):
return timestamp['value']
else: # Supported: float/int/string
if isinstance(timestamp, (str, int, float)) and self._detect_epoch(timestamp):
return datetime.fromtimestamp(float(timestamp))
return parse(timestamp)
return timestamp
elif isinstance(timestamp, str):
return parse(timestamp)
else:
raise Exception(f'Unable to convert {timestamp} to a datetime.')
def generate_attributes(self):
"""Contains the logic where all the values of the object are gathered"""

View File

@ -9,7 +9,7 @@ logger = logging.getLogger('pymisp')
class ASNObject(AbstractMISPObjectGenerator):
def __init__(self, parameters, strict=True, standalone=True, **kwargs):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(ASNObject, self).__init__('asn', strict=strict, standalone=standalone, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -2,18 +2,25 @@
# -*- coding: utf-8 -*-
import sys
from io import BytesIO
from . import FileObject, PEObject, ELFObject, MachOObject
from . import FileObject
from ..exceptions import MISPObjectException
import logging
from typing import Optional
logger = logging.getLogger('pymisp')
try:
import lief
from lief import Logger
import lief # type: ignore
from lief import Logger # type: ignore
Logger.disable()
HAS_LIEF = True
from .peobject import make_pe_objects
from .elfobject import make_elf_objects
from .machoobject import make_macho_objects
except ImportError:
HAS_LIEF = False
@ -22,46 +29,22 @@ class FileTypeNotImplemented(MISPObjectException):
pass
def make_pe_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
pe_object = PEObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(pe_object.uuid, 'includes', 'PE indicators')
pe_sections = []
for s in pe_object.sections:
pe_sections.append(s)
return misp_file, pe_object, pe_sections
def make_elf_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
elf_object = ELFObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(elf_object.uuid, 'includes', 'ELF indicators')
elf_sections = []
for s in elf_object.sections:
elf_sections.append(s)
return misp_file, elf_object, elf_sections
def make_macho_objects(lief_parsed, misp_file, standalone=True, default_attributes_parameters={}):
macho_object = MachOObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(macho_object.uuid, 'includes', 'MachO indicators')
macho_sections = []
for s in macho_object.sections:
macho_sections.append(s)
return misp_file, macho_object, macho_sections
def make_binary_objects(filepath=None, pseudofile=None, filename=None, standalone=True, default_attributes_parameters={}):
def make_binary_objects(filepath: Optional[str]=None, pseudofile: Optional[BytesIO]=None, filename: Optional[str]=None, standalone: bool=True, default_attributes_parameters: dict={}):
misp_file = FileObject(filepath=filepath, pseudofile=pseudofile, filename=filename,
standalone=standalone, default_attributes_parameters=default_attributes_parameters)
if HAS_LIEF and (filepath or (pseudofile and filename)):
try:
if filepath:
lief_parsed = lief.parse(filepath=filepath)
else:
elif pseudofile and filename:
if sys.version_info < (3, 0):
logger.critical('Pseudofile is not supported in python2. Just update.')
lief_parsed = None
else:
lief_parsed = lief.parse(raw=pseudofile.getvalue(), name=filename)
else:
logger.critical('You need either a filepath, or a pseudofile and a filename.')
lief_parsed = None
if isinstance(lief_parsed, lief.PE.Binary):
return make_pe_objects(lief_parsed, misp_file, standalone, default_attributes_parameters)
elif isinstance(lief_parsed, lief.ELF.Binary):

View File

@ -9,7 +9,7 @@ logger = logging.getLogger('pymisp')
class DomainIPObject(AbstractMISPObjectGenerator):
def __init__(self, parameters, strict=True, standalone=True, **kwargs):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(DomainIPObject, self).__init__('domain-ip', strict=strict, standalone=standalone, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -6,30 +6,36 @@ from ..exceptions import InvalidMISPObject
from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
import logging
from typing import Union
from pathlib import Path
from . import FileObject
logger = logging.getLogger('pymisp')
import lief # type: ignore
try:
import lief
HAS_LIEF = True
except ImportError:
HAS_LIEF = False
try:
import pydeep
import pydeep # type: ignore
HAS_PYDEEP = True
except ImportError:
HAS_PYDEEP = False
logger = logging.getLogger('pymisp')
def make_elf_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
elf_object = ELFObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(elf_object.uuid, 'includes', 'ELF indicators')
elf_sections = []
for s in elf_object.sections:
elf_sections.append(s)
return misp_file, elf_object, elf_sections
class ELFObject(AbstractMISPObjectGenerator):
def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
def __init__(self, parsed: lief.ELF.Binary=None, filepath: Union[Path, str]=None, pseudofile: Union[BytesIO, bytes]=None, standalone: bool=True, **kwargs):
super(ELFObject, self).__init__('elf', standalone=standalone, **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if not HAS_LIEF:
raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
if pseudofile:
if isinstance(pseudofile, BytesIO):
self.__elf = lief.ELF.parse(raw=pseudofile.getvalue())
@ -67,7 +73,7 @@ class ELFObject(AbstractMISPObjectGenerator):
class ELFSectionObject(AbstractMISPObjectGenerator):
def __init__(self, section, standalone=True, **kwargs):
def __init__(self, section: lief.ELF.Section, standalone: bool=True, **kwargs):
# Python3 way
# super().__init__('elf-section')
super(ELFSectionObject, self).__init__('elf-section', standalone=standalone, **kwargs)

View File

@ -6,13 +6,15 @@ from .abstractgenerator import AbstractMISPObjectGenerator
from io import BytesIO
import logging
from email import message_from_bytes, policy
from pathlib import Path
from typing import Union
logger = logging.getLogger('pymisp')
class EMailObject(AbstractMISPObjectGenerator):
def __init__(self, filepath=None, pseudofile=None, attach_original_email=True, standalone=True, **kwargs):
def __init__(self, filepath: Union[Path, str]=None, pseudofile: BytesIO=None, attach_original_email: bool=True, standalone: bool=True, **kwargs):
# PY3 way:
# super().__init__('email')
super(EMailObject, self).__init__('email', standalone=standalone, **kwargs)
@ -50,17 +52,30 @@ class EMailObject(AbstractMISPObjectGenerator):
if 'Message-ID' in self.__email:
self.add_attribute('message-id', value=self.__email['Message-ID'])
if 'To' in self.__email:
# TODO: split name and email address
to_add = [to.strip() for to in self.__email['To'].split(',')]
self.add_attributes('to', *to_add)
if 'Cc' in self.__email:
# TODO: split name and email address
to_add = [to.strip() for to in self.__email['Cc'].split(',')]
self.add_attributes('cc', *to_add)
if 'Subject' in self.__email:
self.add_attribute('subject', value=self.__email['Subject'])
if 'From' in self.__email:
# TODO: split name and email address
to_add = [to.strip() for to in self.__email['From'].split(',')]
self.add_attributes('from', *to_add)
if 'Return-Path' in self.__email:
# TODO: split name and email address
self.add_attribute('return-path', value=self.__email['Return-Path'])
if 'User-Agent' in self.__email:
self.add_attribute('user-agent', value=self.__email['User-Agent'])
if self.__email.get_boundary():
self.add_attribute('mime-boundary', value=self.__email.get_boundary())
if 'X-Mailer' in self.__email:
self.add_attribute('x-mailer', value=self.__email['X-Mailer'])
if 'Thread-Index' in self.__email:
self.add_attribute('thread-index', value=self.__email['Thread-Index'])
# TODO: email-header: all headers in one bloc
# TODO: BCC?
# TODO: received headers sometimes have TO email addresses

View File

@ -2,13 +2,13 @@
# -*- coding: utf-8 -*-
try:
from pymispgalaxies import Clusters
from pymispgalaxies import Clusters # type: ignore
has_pymispgalaxies = True
except ImportError:
has_pymispgalaxies = False
try:
from pytaxonomies import Taxonomies
from pytaxonomies import Taxonomies # type: ignore
has_pymispgalaxies = True
except ImportError:
has_pymispgalaxies = False

View File

@ -9,7 +9,7 @@ logger = logging.getLogger('pymisp')
class Fail2BanObject(AbstractMISPObjectGenerator):
def __init__(self, parameters, strict=True, standalone=True, **kwargs):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(Fail2BanObject, self).__init__('fail2ban', strict=strict, standalone=standalone, **kwargs)
self._parameters = parameters
self.generate_attributes()

27
pymisp/tools/feed.py Normal file
View File

@ -0,0 +1,27 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
from pathlib import Path
from pymisp import MISPEvent
import json
from typing import List
def feed_meta_generator(path: Path):
manifests = {}
hashes: List[str] = []
for f_name in path.glob('*.json'):
if str(f_name.name) == 'manifest.json':
continue
event = MISPEvent()
event.load_file(str(f_name))
manifests.update(event.manifest)
hashes += [f'{h},{event.uuid}' for h in event.attributes_hashes('md5')]
with (path / 'manifest.json').open('w') as f:
json.dump(manifests, f)
with (path / 'hashes.csv').open('w') as f:
for h in hashes:
f.write(f'{h}\n')
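A short sketch of how this helper is meant to be used on a directory of exported event JSON files; the path is hypothetical:
from pathlib import Path
from pymisp.tools import feed_meta_generator

# Directory containing one <uuid>.json file per event, as produced by a feed export
feed_meta_generator(Path('/var/www/misp-feed'))
# writes manifest.json and hashes.csv next to the event files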

View File

@ -9,18 +9,20 @@ from hashlib import md5, sha1, sha256, sha512
import math
from collections import Counter
import logging
from pathlib import Path
from typing import Union
logger = logging.getLogger('pymisp')
try:
import pydeep
import pydeep # type: ignore
HAS_PYDEEP = True
except ImportError:
HAS_PYDEEP = False
try:
import magic
import magic # type: ignore
HAS_MAGIC = True
except ImportError:
HAS_MAGIC = False
@ -28,7 +30,7 @@ except ImportError:
class FileObject(AbstractMISPObjectGenerator):
def __init__(self, filepath=None, pseudofile=None, filename=None, standalone=True, **kwargs):
def __init__(self, filepath: Union[Path, str]=None, pseudofile: BytesIO=None, filename: str=None, standalone: bool=True, **kwargs):
# PY3 way:
# super().__init__('file')
super(FileObject, self).__init__('file', standalone=standalone, **kwargs)
@ -70,7 +72,7 @@ class FileObject(AbstractMISPObjectGenerator):
if HAS_PYDEEP:
self.add_attribute('ssdeep', value=pydeep.hash_buf(self.__data).decode())
def __entropy_H(self, data):
def __entropy_H(self, data: bytes) -> float:
"""Calculate the entropy of a chunk of data."""
# NOTE: copy of the entropy function from pefile
@ -79,7 +81,7 @@ class FileObject(AbstractMISPObjectGenerator):
occurences = Counter(bytearray(data))
entropy = 0
entropy = 0.0
for x in occurences.values():
p_x = float(x) / len(data)
entropy -= p_x * math.log(p_x, 2)
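For intuition, the value computed here is the Shannon entropy in bits per byte: 0.0 for a constant buffer, up to 8.0 for uniformly distributed bytes. A tiny standalone check of the same formula:
import math
from collections import Counter

def entropy(data: bytes) -> float:
    occurences = Counter(bytearray(data))
    ent = 0.0
    for x in occurences.values():
        p_x = x / len(data)
        ent -= p_x * math.log(p_x, 2)
    return ent

print(entropy(b'aaaa'))              # 0.0
print(entropy(b'aabb'))              # 1.0
print(entropy(bytes(range(256))))    # 8.0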

View File

@ -2,11 +2,13 @@
# -*- coding: utf-8 -*-
from .abstractgenerator import AbstractMISPObjectGenerator
from typing import List
class GenericObjectGenerator(AbstractMISPObjectGenerator):
def generate_attributes(self, attributes):
# FIXME: this method is different from the master one, and that's probably not a good idea.
def generate_attributes(self, attributes: List[dict]): # type: ignore
"""Generates MISPObjectAttributes from a list of dictionaries.
Each entry of the list must be in one of the two following formats:
* {<object_relation>: <value>}

View File

@ -9,7 +9,7 @@ logger = logging.getLogger('pymisp')
class GeolocationObject(AbstractMISPObjectGenerator):
def __init__(self, parameters, strict=True, standalone=True, **kwargs):
def __init__(self, parameters: dict, strict: bool=True, standalone: bool=True, **kwargs):
super(GeolocationObject, self).__init__('asn', strict=strict, standalone=standalone, **kwargs)
self._parameters = parameters
self.generate_attributes()

View File

@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-
try:
from pymispwarninglists import WarningLists
from pymispwarninglists import WarningLists # type: ignore
has_pymispwarninglists = True
except ImportError:
has_pymispwarninglists = False

View File

@ -6,33 +6,38 @@ from .abstractgenerator import AbstractMISPObjectGenerator
from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
import logging
from typing import Optional, Union
from pathlib import Path
from . import FileObject
logger = logging.getLogger('pymisp')
import lief # type: ignore
try:
import lief
HAS_LIEF = True
except ImportError:
HAS_LIEF = False
try:
import pydeep
import pydeep # type: ignore
HAS_PYDEEP = True
except ImportError:
HAS_PYDEEP = False
logger = logging.getLogger('pymisp')
def make_macho_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
macho_object = MachOObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(macho_object.uuid, 'includes', 'MachO indicators')
macho_sections = []
for s in macho_object.sections:
macho_sections.append(s)
return misp_file, macho_object, macho_sections
class MachOObject(AbstractMISPObjectGenerator):
def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
def __init__(self, parsed: Optional[lief.MachO.Binary]=None, filepath: Optional[Union[Path, str]]=None, pseudofile: Optional[BytesIO]=None, standalone: bool=True, **kwargs):
# Python3 way
# super().__init__('macho')
super(MachOObject, self).__init__('macho', standalone=standalone, **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if not HAS_LIEF:
raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
if pseudofile:
if isinstance(pseudofile, BytesIO):
self.__macho = lief.MachO.parse(raw=pseudofile.getvalue())
@ -70,7 +75,7 @@ class MachOObject(AbstractMISPObjectGenerator):
class MachOSectionObject(AbstractMISPObjectGenerator):
def __init__(self, section, standalone=True, **kwargs):
def __init__(self, section: lief.MachO.Section, standalone: bool=True, **kwargs):
# Python3 way
# super().__init__('pe-section')
super(MachOSectionObject, self).__init__('macho-section', standalone=standalone, **kwargs)

View File

@ -5,7 +5,7 @@ import os
from .. import MISPEvent
try:
from py2neo import authenticate, Graph, Node, Relationship
from py2neo import authenticate, Graph, Node, Relationship # type: ignore
has_py2neo = True
except ImportError:
has_py2neo = False

View File

@ -5,7 +5,7 @@ import os
from .. import MISPEvent
try:
from bs4 import BeautifulSoup
from bs4 import BeautifulSoup # type: ignore
has_bs4 = True
except ImportError:
has_bs4 = False

View File

@ -7,32 +7,39 @@ from io import BytesIO
from hashlib import md5, sha1, sha256, sha512
from datetime import datetime
import logging
from typing import Optional, Union
from pathlib import Path
logger = logging.getLogger('pymisp')
from . import FileObject
import lief # type: ignore
try:
import lief
HAS_LIEF = True
except ImportError:
HAS_LIEF = False
try:
import pydeep
import pydeep # type: ignore
HAS_PYDEEP = True
except ImportError:
HAS_PYDEEP = False
logger = logging.getLogger('pymisp')
def make_pe_objects(lief_parsed: lief.Binary, misp_file: FileObject, standalone: bool=True, default_attributes_parameters: dict={}):
pe_object = PEObject(parsed=lief_parsed, standalone=standalone, default_attributes_parameters=default_attributes_parameters)
misp_file.add_reference(pe_object.uuid, 'includes', 'PE indicators')
pe_sections = []
for s in pe_object.sections:
pe_sections.append(s)
return misp_file, pe_object, pe_sections
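A hedged sketch of how this helper is normally reached: the make_binary_objects entry point (imported from pymisp.tools in the tests further below) parses the binary with lief and dispatches to make_pe_objects/make_macho_objects, so callers usually do something like:

from pymisp import MISPEvent
from pymisp.tools import make_binary_objects

event = MISPEvent()
event.info = 'Binary triage example'
# Path is a placeholder; returns the FileObject, the format-specific object (or None) and its sections.
file_obj, bin_obj, sections = make_binary_objects(filepath='/tmp/sample.exe', standalone=False)
event.add_object(file_obj)
if bin_obj:
    event.add_object(bin_obj)
    for section in sections:
        event.add_object(section)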
class PEObject(AbstractMISPObjectGenerator):
def __init__(self, parsed=None, filepath=None, pseudofile=None, standalone=True, **kwargs):
def __init__(self, parsed: Optional[lief.PE.Binary]=None, filepath: Optional[Union[Path, str]]=None, pseudofile: Optional[BytesIO]=None, standalone: bool=True, **kwargs):
# Python3 way
# super().__init__('pe')
super(PEObject, self).__init__('pe', standalone=standalone, **kwargs)
if not HAS_PYDEEP:
logger.warning("Please install pydeep: pip install git+https://github.com/kbandla/pydeep.git")
if not HAS_LIEF:
raise ImportError('Please install lief, documentation here: https://github.com/lief-project/LIEF')
if pseudofile:
if isinstance(pseudofile, BytesIO):
self.__pe = lief.PE.parse(raw=pseudofile.getvalue())
@ -82,10 +89,10 @@ class PEObject(AbstractMISPObjectGenerator):
self.add_attribute('compilation-timestamp', value=datetime.utcfromtimestamp(self.__pe.header.time_date_stamps).isoformat())
# self.imphash = self.__pe.get_imphash()
try:
if (self.__pe.has_resources and
self.__pe.resources_manager.has_version and
self.__pe.resources_manager.version.has_string_file_info and
self.__pe.resources_manager.version.string_file_info.langcode_items):
if (self.__pe.has_resources
and self.__pe.resources_manager.has_version
and self.__pe.resources_manager.version.has_string_file_info
and self.__pe.resources_manager.version.string_file_info.langcode_items):
fileinfo = dict(self.__pe.resources_manager.version.string_file_info.langcode_items[0].items.items())
self.add_attribute('original-filename', value=fileinfo.get('OriginalFilename'))
self.add_attribute('internal-filename', value=fileinfo.get('InternalName'))
@ -106,8 +113,8 @@ class PEObject(AbstractMISPObjectGenerator):
for section in self.__pe.sections:
s = PESectionObject(section, self._standalone, default_attributes_parameters=self._default_attributes_parameters)
self.add_reference(s.uuid, 'includes', 'Section {} of PE'.format(pos))
if ((self.__pe.entrypoint >= section.virtual_address) and
(self.__pe.entrypoint < (section.virtual_address + section.virtual_size))):
if ((self.__pe.entrypoint >= section.virtual_address)
and (self.__pe.entrypoint < (section.virtual_address + section.virtual_size))):
self.add_attribute('entrypoint-section-at-position', value='{}|{}'.format(section.name, pos))
pos += 1
self.sections.append(s)
@ -117,7 +124,7 @@ class PEObject(AbstractMISPObjectGenerator):
class PESectionObject(AbstractMISPObjectGenerator):
def __init__(self, section, standalone=True, **kwargs):
def __init__(self, section: lief.PE.Section, standalone: bool=True, **kwargs):
# Python3 way
# super().__init__('pe-section')
super(PESectionObject, self).__init__('pe-section', standalone=standalone, **kwargs)

View File

@ -20,17 +20,17 @@ logger = logging.getLogger('pymisp')
# Potentially not installed imports
try:
from reportlab.pdfgen import canvas
from reportlab.pdfbase.pdfmetrics import stringWidth, registerFont
from reportlab.pdfbase.ttfonts import TTFont
from reportlab.lib import colors
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas # type: ignore
from reportlab.pdfbase.pdfmetrics import stringWidth, registerFont # type: ignore
from reportlab.pdfbase.ttfonts import TTFont # type: ignore
from reportlab.lib import colors # type: ignore
from reportlab.lib.pagesizes import A4 # type: ignore
from reportlab.platypus import SimpleDocTemplate, Paragraph, PageBreak, Table, TableStyle, Flowable, Image, Indenter
from reportlab.platypus import SimpleDocTemplate, Paragraph, PageBreak, Table, TableStyle, Flowable, Image, Indenter # type: ignore
from reportlab.lib.styles import getSampleStyleSheet, ParagraphStyle
from reportlab.lib.units import mm
from reportlab.lib.enums import TA_CENTER, TA_JUSTIFY, TA_LEFT
from reportlab.lib.styles import getSampleStyleSheet, ParagraphStyle # type: ignore
from reportlab.lib.units import mm # type: ignore
from reportlab.lib.enums import TA_CENTER, TA_JUSTIFY, TA_LEFT # type: ignore
HAS_REPORTLAB = True
except ImportError:
@ -481,14 +481,17 @@ def get_clusters_table_styles():
def safe_string(bad_str):
return escape(str(bad_str))
def is_safe_value(value):
return (value is not None
and value != "")
def is_safe_table(value):
return (value is not None
and value != [])
def is_safe_attribute(curr_object, attribute_name):
return (hasattr(curr_object, attribute_name)
and getattr(curr_object, attribute_name) is not None
@ -660,7 +663,7 @@ class Value_Formatter():
return self.get_unoverflowable_paragraph(answer)
def get_threat_value(self, threat_level = None):
def get_threat_value(self, threat_level=None):
'''
Returns a flowable paragraph to add to the pdf given the misp_event threat
:param threat_level: MISP_EVENT threat level (int) to be formatted
@ -671,9 +674,9 @@ class Value_Formatter():
if is_safe_value(threat_level) and str(threat_level) in threat_map:
answer = threat_map[safe_string(threat_level)]
return self.get_unoverflowable_paragraph(answer,do_escape_string=False)
return self.get_unoverflowable_paragraph(answer, do_escape_string=False)
def get_analysis_value(self, analysis_level = None):
def get_analysis_value(self, analysis_level=None):
'''
Returns a flowable paragraph to add to the pdf given the misp_event analysis
:param analysis_level: MISP_EVENT analysis level (int) to be formatted
@ -684,7 +687,7 @@ class Value_Formatter():
if is_safe_value(analysis_level) and str(analysis_level) in analysis_map:
answer = analysis_map[safe_string(analysis_level)]
return self.get_unoverflowable_paragraph(answer,do_escape_string=False)
return self.get_unoverflowable_paragraph(answer, do_escape_string=False)
def get_timestamp_value(self, timestamp=None):
'''
@ -764,7 +767,7 @@ class Value_Formatter():
try:
# Get the image
buf = image_buffer # TODO : Do verification on the buffer ?
buf = image_buffer # TODO : Do verification on the buffer ?
# Create image within a bounded box (to allow pdf creation)
img = Image(buf, width=FRAME_PICTURE_MAX_WIDTH, height=FRAME_PICTURE_MAX_HEIGHT, kind='bound')
@ -821,8 +824,8 @@ class Value_Formatter():
if is_safe_dict_attribute(misp_galaxy, 'name'):
answer = '{} <i>from</i> {}:{}'.format(safe_string(misp_galaxy['name']),
safe_string(misp_galaxy["namespace"]),
safe_string(misp_galaxy["type"]))
safe_string(misp_galaxy["namespace"]),
safe_string(misp_galaxy["type"]))
return self.get_unoverflowable_paragraph(answer, do_small=True)
@ -866,7 +869,7 @@ class Event_Metadata():
########################################################################
# General Event's Attributes formatter
def create_flowable_table_from_event(self, misp_event ):
def create_flowable_table_from_event(self, misp_event):
'''
Returns Table presenting a MISP event
:param misp_event: A misp event (complete or not)
@ -879,8 +882,8 @@ class Event_Metadata():
# Manual addition
# UUID
data.append([self.value_formatter.get_col1_paragraph("UUID"),
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid',None),
text=misp_event.get('uuid',None))])
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid', None),
text=misp_event.get('uuid', None))])
# Date
data.append({self.value_formatter.get_col1_paragraph("Date"),
@ -888,48 +891,48 @@ class Event_Metadata():
# Owner
data.append([self.value_formatter.get_col1_paragraph("Owner org"),
self.value_formatter.get_owner_value(owner=misp_event.get('owner',None))])
self.value_formatter.get_owner_value(owner=misp_event.get('owner', None))])
# Threat
data.append([self.value_formatter.get_col1_paragraph("Threat level"),
self.value_formatter.get_threat_value(threat_level=misp_event.get('threat_level_id',None))])
self.value_formatter.get_threat_value(threat_level=misp_event.get('threat_level_id', None))])
# Analysis
data.append([self.value_formatter.get_col1_paragraph("Analysis"),
self.value_formatter.get_analysis_value(analysis_level=misp_event.get('analysis',None))])
self.value_formatter.get_analysis_value(analysis_level=misp_event.get('analysis', None))])
# Info
data.append([self.value_formatter.get_col1_paragraph("Info"),
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid',None),
text=misp_event.get('info',None))])
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid', None),
text=misp_event.get('info', None))])
# Timestamp
data.append([self.value_formatter.get_col1_paragraph("Event date"),
self.value_formatter.get_timestamp_value(timestamp=misp_event.get('timestamp',None))])
self.value_formatter.get_timestamp_value(timestamp=misp_event.get('timestamp', None))])
# Published
data.append([self.value_formatter.get_col1_paragraph("Published"),
self.value_formatter.get_published_value(published_bool=misp_event.get('published',None),
published_timestamp=misp_event.get('publish_timestamp',None))])
self.value_formatter.get_published_value(published_bool=misp_event.get('published', None),
published_timestamp=misp_event.get('publish_timestamp', None))])
# Creator organisation
data.append([self.value_formatter.get_col1_paragraph("Creator Org"),
self.value_formatter.get_creator_organisation_value(creator=misp_event.get('Orgc',None))])
self.value_formatter.get_creator_organisation_value(creator=misp_event.get('Orgc', None))])
# Number of Attributes
data.append([self.value_formatter.get_col1_paragraph("# Attributes"),
self.value_formatter.get_attributes_number_value(attributes=misp_event.get('Attribute',None))])
self.value_formatter.get_attributes_number_value(attributes=misp_event.get('Attribute', None))])
# Tags
curr_Tags = Tags(self.config, self.value_formatter)
data.append([self.value_formatter.get_col1_paragraph("Tags"),
curr_Tags.get_tag_value(tags=misp_event.get('Tag',None))])
curr_Tags.get_tag_value(tags=misp_event.get('Tag', None))])
flowable_table.append(create_flowable_table_from_data(data))
# Correlation
if is_safe_table(misp_event.get('RelatedEvent',None)) and is_in_config(self.config, 4):
flowable_table += self.get_correlation_values(related_events=misp_event.get('RelatedEvent',None))
if is_safe_table(misp_event.get('RelatedEvent', None)) and is_in_config(self.config, 4):
flowable_table += self.get_correlation_values(related_events=misp_event.get('RelatedEvent', None))
# Galaxies
if is_safe_attribute_table(misp_event, "Related Galaxies") and is_in_config(self.config, 3):
@ -952,17 +955,17 @@ class Event_Metadata():
# Manual addition
# UUID
data.append([self.value_formatter.get_col1_paragraph("UUID"),
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid',None),
text=misp_event.get('uuid',None))])
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid', None),
text=misp_event.get('uuid', None))])
# Info
data.append([self.value_formatter.get_col1_paragraph("Info"),
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid',None),
text=misp_event.get('info',None))])
self.value_formatter.get_value_link_to_event(uuid=misp_event.get('uuid', None),
text=misp_event.get('info', None))])
# Timestamp
data.append([self.value_formatter.get_col1_paragraph("Event date"),
self.value_formatter.get_timestamp_value(timestamp=misp_event.get('timestamp',None))])
self.value_formatter.get_timestamp_value(timestamp=misp_event.get('timestamp', None))])
flowable_table.append(create_flowable_table_from_data(data))
@ -1167,10 +1170,10 @@ class Attributes():
# data.append([Paragraph(item[0], col1_style), Paragraph(item[2], col2_style)])
# Handle Special case for links (Value) - There were not written in the previous loop
if not STANDARD_TYPE and is_safe_value(misp_attribute.get('value',None)):
if not STANDARD_TYPE and is_safe_value(misp_attribute.get('value', None)):
data.append([self.value_formatter.get_col1_paragraph("Value"),
self.value_formatter.get_good_or_bad_link(value=misp_attribute.get('value',None),
type=misp_attribute.get('type',None))])
self.value_formatter.get_good_or_bad_link(value=misp_attribute.get('value', None),
type=misp_attribute.get('type', None))])
# Handle pictures
if is_safe_value(misp_attribute.get('data', None)) and misp_attribute.type == IMAGE_TYPE:
@ -1190,7 +1193,7 @@ class Attributes():
if is_safe_table(misp_attribute.get('Sighting', None)):
data.append([self.value_formatter.get_col1_paragraph("Sighting"),
curr_Sighting.create_flowable_paragraph_from_sightings(sightings=misp_attribute.get('Sighting',None))])
curr_Sighting.create_flowable_paragraph_from_sightings(sightings=misp_attribute.get('Sighting', None))])
flowable_table.append(create_flowable_table_from_data(data))
@ -1399,7 +1402,7 @@ class Object():
data = [create_flowable_table_from_data(data)]
# Handle all the attributes
if is_safe_value(misp_object.get("Attribute",None)):
if is_safe_value(misp_object.get("Attribute", None)):
curr_attributes = Attributes(self.config, self.value_formatter)
data.append(Indenter(left=INDENT_SIZE))
data += curr_attributes.create_flowable_table_from_attributes(misp_object)
@ -1674,8 +1677,8 @@ def collect_parts(misp_event, config=None):
# Create stuff
title_style = ParagraphStyle(name='Column_1', parent=sample_style_sheet['Heading1'],
fontName=FIRST_COL_FONT, alignment=TA_CENTER)
title = curr_val_f.get_value_link_to_event(uuid=misp_event.get('uuid',None),
text=misp_event.get('info',None),
title = curr_val_f.get_value_link_to_event(uuid=misp_event.get('uuid', None),
text=misp_event.get('info', None),
curr_style=title_style, color=False)
# Add all parts to final PDF
flowables.append(title)
@ -1708,7 +1711,7 @@ def collect_parts(misp_event, config=None):
flowables.append(PageBreak())
event_objects_title = Paragraph("Objects", sample_style_sheet['Heading2'])
table_objects = curr_object.create_flowable_table_from_objects(objects=misp_event.get("Object",None))
table_objects = curr_object.create_flowable_table_from_objects(objects=misp_event.get("Object", None))
flowables.append(event_objects_title)
flowables += table_objects

View File

@ -8,7 +8,7 @@ class SBSignatureObject(AbstractMISPObjectGenerator):
'''
Sandbox Analyzer
'''
def __init__(self, software, report, standalone=True, **kwargs):
def __init__(self, software: str, report: list, standalone: bool=True, **kwargs):
super(SBSignatureObject, self).__init__("sb-signature", **kwargs)
self._software = software
self._report = report
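A minimal, hedged usage sketch; the shape of report (a list of (signature name, description) pairs) is an assumption, since generate_attributes() is outside this hunk:

sig_obj = SBSignatureObject(software='Cuckoo Sandbox',
                            report=[('antidebug_windows', 'Checks for the presence of a debugger')])  # assumed pairs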

View File

@ -5,13 +5,15 @@ from ..exceptions import InvalidMISPObject
from .abstractgenerator import AbstractMISPObjectGenerator
from io import StringIO
import logging
from typing import Optional, Union
from pathlib import Path
logger = logging.getLogger('pymisp')
class SSHAuthorizedKeysObject(AbstractMISPObjectGenerator):
def __init__(self, authorized_keys_path=None, authorized_keys_pseudofile=None, standalone=True, **kwargs):
def __init__(self, authorized_keys_path: Optional[Union[Path, str]]=None, authorized_keys_pseudofile: Optional[StringIO]=None, standalone: bool=True, **kwargs):
# PY3 way:
# super().__init__('file')
super(SSHAuthorizedKeysObject, self).__init__('ssh-authorized-keys', standalone=standalone, **kwargs)
@ -19,7 +21,7 @@ class SSHAuthorizedKeysObject(AbstractMISPObjectGenerator):
with open(authorized_keys_path, 'r') as f:
self.__pseudofile = StringIO(f.read())
elif authorized_keys_pseudofile and isinstance(authorized_keys_pseudofile, StringIO):
self.__pseudofile = authorized_keys_path
self.__pseudofile = authorized_keys_pseudofile
else:
raise InvalidMISPObject('File buffer (StringIO) or a path is required.')
self.__data = self.__pseudofile.getvalue()
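A minimal usage sketch for the constructor above, feeding an in-memory buffer (the key material is a placeholder):

from io import StringIO
from pymisp.tools import SSHAuthorizedKeysObject  # import path assumed to match the other generators

ssh_obj = SSHAuthorizedKeysObject(
    authorized_keys_pseudofile=StringIO('ssh-ed25519 AAAAC3Nza... user@host\n'),
    standalone=True)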

View File

@ -1,15 +1,15 @@
# -*- coding: utf-8 -*-
try:
from misp_stix_converter.converters.buildMISPAttribute import buildEvent
from misp_stix_converter.converters import convert
from misp_stix_converter.converters.convert import MISPtoSTIX
from misp_stix_converter.converters.buildMISPAttribute import buildEvent # type: ignore
from misp_stix_converter.converters import convert # type: ignore
from misp_stix_converter.converters.convert import MISPtoSTIX # type: ignore
has_misp_stix_converter = True
except ImportError:
has_misp_stix_converter = False
def load_stix(stix, distribution=3, threat_level_id=2, analysis=0):
def load_stix(stix, distribution: int=3, threat_level_id: int=2, analysis: int=0):
'''Returns a MISPEvent object from a STIX package'''
if not has_misp_stix_converter:
raise Exception('You need to install misp_stix_converter: pip install git+https://github.com/MISP/MISP-STIX-Converter.git')
@ -18,7 +18,7 @@ def load_stix(stix, distribution=3, threat_level_id=2, analysis=0):
threat_level_id=threat_level_id, analysis=analysis)
def make_stix_package(misp_event, to_json=False, to_xml=False):
def make_stix_package(misp_event, to_json: bool=False, to_xml: bool=False):
'''Returns a STIXPackage from a MISPEvent.
Optionally can return the package in json or xml.

28
pymisp/tools/urlobject.py Normal file
View File

@ -0,0 +1,28 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from .abstractgenerator import AbstractMISPObjectGenerator
import logging
from pyfaup.faup import Faup # type: ignore
from urllib.parse import unquote_plus
logger = logging.getLogger('pymisp')
faup = Faup()
class URLObject(AbstractMISPObjectGenerator):
def __init__(self, url: str, standalone: bool=True, **kwargs):
# PY3 way:
# super().__init__('file')
super(URLObject, self).__init__('url', standalone=standalone, **kwargs)
faup.decode(unquote_plus(url))
self.generate_attributes()
def generate_attributes(self):
self.add_attribute('url', value=faup.url.decode())
if faup.get_host():
self.add_attribute('host', value=faup.get_host())
if faup.get_domain():
self.add_attribute('domain', value=faup.get_domain())
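A minimal usage sketch of the new generator; generate_attributes() already runs from __init__, so instantiating it is enough:

url_obj = URLObject('https://www.example.com/index.php?foo=bar')
# Expect a 'url' attribute, plus 'host' and 'domain' when faup can extract them:
print([(a.object_relation, a.value) for a in url_obj.attributes])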

View File

@ -1,10 +1,7 @@
#!/usr/bin/python3
import sys
import getopt
import requests
import json
from pymisp import MISPObject
from .abstractgenerator import AbstractMISPObjectGenerator

View File

@ -2,10 +2,11 @@
# -*- coding: utf-8 -*-
import re
from typing import Optional
import requests
try:
import validators
import validators # type: ignore
has_validators = True
except ImportError:
has_validators = False
@ -23,7 +24,7 @@ class VTReportObject(AbstractMISPObjectGenerator):
:indicator: IOC to search VirusTotal for
'''
def __init__(self, apikey, indicator, vt_proxies=None, standalone=True, **kwargs):
def __init__(self, apikey: str, indicator: str, vt_proxies: Optional[dict]=None, standalone: bool=True, **kwargs):
# PY3 way:
# super().__init__("virustotal-report")
super(VTReportObject, self).__init__("virustotal-report", standalone=standalone, **kwargs)
@ -47,7 +48,7 @@ class VTReportObject(AbstractMISPObjectGenerator):
ratio = "{}/{}".format(self._report["positives"], self._report["total"])
self.add_attribute("detection-ratio", value=ratio)
def __validate_resource(self, ioc):
def __validate_resource(self, ioc: str):
'''
Validate the data type of an indicator.
Domains and IP addresses aren't supported because
@ -63,7 +64,7 @@ class VTReportObject(AbstractMISPObjectGenerator):
return "file"
return False
def __query_virustotal(self, apikey, resource):
def __query_virustotal(self, apikey: str, resource: str):
'''
Query VirusTotal for information about an indicator
@ -78,9 +79,9 @@ class VTReportObject(AbstractMISPObjectGenerator):
report = requests.get(url, params=params, proxies=self._proxies)
else:
report = requests.get(url, params=params)
report = report.json()
if report["response_code"] == 1:
report_json = report.json()
if report_json["response_code"] == 1:
return report
else:
error_msg = "{}: {}".format(resource, report["verbose_msg"])
error_msg = "{}: {}".format(resource, report_json["verbose_msg"])
raise InvalidMISPObject(error_msg)
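A hedged usage sketch; the API key is a placeholder and the hash reuses the MD5 that already appears in the tests below:

vt_obj = VTReportObject(apikey='<virustotal-api-key>',
                        indicator='0a209ac0de4ac033f31d6ba9191a8f7a',
                        vt_proxies=None,  # or a requests-style proxies dict
                        standalone=True)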

View File

@ -34,15 +34,12 @@ setup(
'Intended Audience :: Science/Research',
'Intended Audience :: Telecommunications Industry',
'Intended Audience :: Information Technology',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.6',
'Topic :: Security',
'Topic :: Internet',
],
install_requires=['six', 'requests', 'python-dateutil', 'jsonschema',
'python-dateutil', 'enum34;python_version<"3.4"',
'functools32;python_version<"3.0"', 'deprecated', 'cachetools;python_version<"3.0"'],
extras_require={'fileobjects': ['lief>=0.8,<0.10;python_version<"3.5"', 'lief>=0.10.0.dev0;python_version>"3.5"', 'python-magic', 'pydeep'],
install_requires=['six', 'requests', 'python-dateutil', 'jsonschema', 'deprecated'],
extras_require={'fileobjects': ['python-magic', 'pydeep', 'lief>=0.10.1'],
'neo': ['py2neo'],
'openioc': ['beautifulsoup4'],
'virustotal': ['validators'],

View File

@ -30,7 +30,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "17",
"template_version": "19",
"uuid": "a"
},
{

View File

@ -30,7 +30,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "17",
"template_version": "19",
"uuid": "a"
},
{
@ -55,7 +55,7 @@
"name": "file",
"sharing_group_id": "0",
"template_uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"template_version": "17",
"template_version": "19",
"uuid": "b"
}
]

View File

@ -449,9 +449,9 @@
]
}
},
"version": 17,
"version": 1,
"description": "File object describing a file with meta-information",
"meta-category": "file",
"uuid": "688c46fb-5edb-40a3-8273-1af7923e2215",
"name": "file"
"uuid": "688c46fb-5edb-40a3-8273-1af7923e0000",
"name": "overwrite_file"
}

View File

@ -1,314 +0,0 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pymisp import PyMISP, __version__
try:
from keys import url, key
except ImportError as e:
print(e)
url = 'https://localhost:8443'
key = 'd6OmdDFvU3Seau3UjwvHS1y3tFQbaRNhJhDX0tjh'
import time
import unittest
class TestBasic(unittest.TestCase):
def setUp(self):
self.maxDiff = None
self.misp = PyMISP(url, key, False, 'json')
self.live_describe_types = self.misp.get_live_describe_types()
def _clean_event(self, event):
event['Event'].pop('orgc_id', None)
event['Event'].pop('uuid', None)
event['Event'].pop('sharing_group_id', None)
event['Event'].pop('timestamp', None)
event['Event'].pop('org_id', None)
event['Event'].pop('date', None)
event['Event'].pop('RelatedEvent', None)
event['Event'].pop('publish_timestamp', None)
if event['Event'].get('Attribute'):
for a in event['Event'].get('Attribute'):
a.pop('uuid', None)
a.pop('event_id', None)
a.pop('id', None)
a.pop('timestamp', None)
if event['Event'].get('Orgc'):
event['Event']['Orgc'].pop('uuid', None)
event['Event']['Orgc'].pop('id', None)
if event['Event'].get('Org'):
event['Event']['Org'].pop('uuid', None)
event['Event']['Org'].pop('id', None)
return event['Event'].pop('id', None)
def new_event(self):
event = self.misp.new_event(0, 1, 0, "This is a test")
event_id = self._clean_event(event)
to_check = {u'Event': {u'info': u'This is a test', u'locked': False,
u'attribute_count': u'0', 'disable_correlation': False, u'analysis': u'0',
u'ShadowAttribute': [], u'published': False,
u'distribution': u'0', u'event_creator_email': u'admin@admin.test', u'Attribute': [], u'proposal_email_lock': False,
u'extends_uuid': '',
u'Object': [], u'Org': {'local': True, u'name': u'ORGNAME'},
u'Orgc': {'local': True, u'name': u'ORGNAME'},
u'Galaxy': [],
u'threat_level_id': u'1'}}
self.assertEqual(event, to_check, 'Failed at creating a new Event')
return int(event_id)
def add_hashes(self, eventid):
r = self.misp.get_event(eventid)
event = r.json()
event = self.misp.add_hashes(event,
category='Payload installation',
filename='dll_installer.dll',
md5='0a209ac0de4ac033f31d6ba9191a8f7a',
sha1='1f0ae54ac3f10d533013f74f48849de4e65817a7',
sha256='003315b0aea2fcb9f77d29223dd8947d0e6792b3a0227e054be8eb2a11f443d9',
ssdeep=None,
comment='Fanny modules',
to_ids=False,
distribution=2,
proposal=False)
self._clean_event(event)
to_check = {u'Event': {u'info': u'This is a test', u'locked': False,
u'attribute_count': u'3', u'analysis': u'0',
u'ShadowAttribute': [], u'published': False, u'distribution': u'0', u'event_creator_email': u'admin@admin.test',
u'Org': {'local': True, u'name': u'ORGNAME'},
u'Orgc': {'local': True, u'name': u'ORGNAME'},
u'Galaxy': [],
u'Attribute': [
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|0a209ac0de4ac033f31d6ba9191a8f7a',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|md5'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|1f0ae54ac3f10d533013f74f48849de4e65817a7',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha1'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|003315b0aea2fcb9f77d29223dd8947d0e6792b3a0227e054be8eb2a11f443d9',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha256'}],
u'proposal_email_lock': False, u'threat_level_id': u'1'}}
self.assertEqual(event, to_check, 'Failed at adding hashes')
def publish(self, eventid):
r = self.misp.get_event(eventid)
event = r.json()
event = self.misp.publish(event)
self._clean_event(event)
to_check = {u'Event': {u'info': u'This is a test', u'locked': False,
u'attribute_count': u'3', u'analysis': u'0',
u'ShadowAttribute': [], u'published': True, u'distribution': u'0', u'event_creator_email': u'admin@admin.test',
u'Org': {'local': True, u'name': u'ORGNAME'},
u'Orgc': {'local': True, u'name': u'ORGNAME'},
u'Galaxy': [],
u'Attribute': [
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|0a209ac0de4ac033f31d6ba9191a8f7a',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|md5'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|1f0ae54ac3f10d533013f74f48849de4e65817a7',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha1'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|003315b0aea2fcb9f77d29223dd8947d0e6792b3a0227e054be8eb2a11f443d9',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha256'}],
u'proposal_email_lock': False, u'threat_level_id': u'1'}}
self.assertEqual(event, to_check, 'Failed at publishing event')
def delete(self, eventid):
event = self.misp.delete_event(eventid)
print(event)
def delete_attr(self, attrid):
event = self.misp.delete_attribute(attrid)
print(event)
def get(self, eventid):
event = self.misp.get_event(eventid)
print(event)
def get_stix(self, **kwargs):
event = self.misp.get_stix(kwargs)
print(event)
def add(self):
event = {u'Event': {u'info': u'This is a test', u'locked': False,
u'attribute_count': u'3', u'analysis': u'0',
u'ShadowAttribute': [], u'published': False, u'distribution': u'0', u'event_creator_email': u'admin@admin.test',
u'Attribute': [
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|0a209ac0de4ac033f31d6ba9191a8f7a',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|md5'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|1f0ae54ac3f10d533013f74f48849de4e65817a7',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha1'},
{u'category': u'Payload installation', u'comment': u'Fanny modules',
u'to_ids': False, u'value': u'dll_installer.dll|003315b0aea2fcb9f77d29223dd8947d0e6792b3a0227e054be8eb2a11f443d9',
u'ShadowAttribute': [], u'distribution': u'2', u'type': u'filename|sha256'}],
u'proposal_email_lock': False, u'threat_level_id': u'1'}}
event = self.misp.add_event(event)
print(event)
def add_user(self):
email = 'test@misp.local'
role_id = '5'
org_id = '1'
password = 'Password1234!'
external_auth_required = False
external_auth_key = ''
enable_password = False
nids_sid = '1238717'
server_id = '1'
gpgkey = ''
certif_public = ''
autoalert = False
contactalert = False
disabled = False
change_pw = '0'
termsaccepted = False
newsread = '0'
authkey = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
to_check = {'User': {'email': email, 'org_id': org_id, 'role_id': role_id,
'password': password, 'external_auth_required': external_auth_required,
'external_auth_key': external_auth_key, 'enable_password': enable_password,
'nids_sid': nids_sid, 'server_id': server_id, 'gpgkey': gpgkey,
'certif_public': certif_public, 'autoalert': autoalert,
'contactalert': contactalert, 'disabled': disabled,
'change_pw': change_pw, 'termsaccepted': termsaccepted,
'newsread': newsread, 'authkey': authkey}}
user = self.misp.add_user(email=email,
role_id=role_id,
org_id=org_id,
password=password,
external_auth_required=external_auth_required,
external_auth_key=external_auth_key,
enable_password=enable_password,
nids_sid=nids_sid,
server_id=server_id,
gpgkey=gpgkey,
certif_public=certif_public,
autoalert=autoalert,
contactalert=contactalert,
disabled=disabled,
change_pw=change_pw,
termsaccepted=termsaccepted,
newsread=newsread,
authkey=authkey)
# delete user to allow reuse of test
uid = user.get('User').get('id')
self.misp.delete_user(uid)
# ----------------------------------
# test interesting keys only (some keys are modified (password) and some keys are added (lastlogin))
tested_keys = ['email', 'org_id', 'role_id', 'server_id', 'autoalert',
'authkey', 'gpgkey', 'certif_public', 'nids_sid', 'termsaccepted',
'newsread', 'contactalert', 'disabled']
for k in tested_keys:
self.assertEqual(user.get('User').get(k), to_check.get('User').get(k), "Failed to match input with output on key: {}".format(k))
def add_organisation(self):
name = 'Organisation tests'
description = 'This is a test organisation'
orgtype = 'Type is a string'
nationality = 'French'
sector = 'Bank sector'
uuid = '16fd2706-8baf-433b-82eb-8c7fada847da'
contacts = 'Text field with no limitations'
local = False
to_check = {'Organisation': {'name': name, 'description': description,
'type': orgtype, 'nationality': nationality,
'sector': sector, 'uuid': uuid, 'contacts': contacts,
'local': local}}
org = self.misp.add_organisation(name=name,
description=description,
type=orgtype,
nationality=nationality,
sector=sector,
uuid=uuid,
contacts=contacts,
local=local,
)
# delete organisation to allow reuse of test
oid = org.get('Organisation').get('id')
self.misp.delete_organisation(oid)
# ----------------------------------
tested_keys = ['anonymise', 'contacts', 'description', 'local', 'name',
'nationality', 'sector', 'type', 'uuid']
for k in tested_keys:
self.assertEqual(org.get('Organisation').get(k), to_check.get('Organisation').get(k), "Failed to match input with output on key: {}".format(k))
def test_create_event(self):
eventid = self.new_event()
time.sleep(1)
self.delete(eventid)
def test_get_event(self):
eventid = self.new_event()
time.sleep(1)
self.get(eventid)
time.sleep(1)
self.delete(eventid)
def test_add_event(self):
self.add()
time.sleep(1)
self.delete(1)
def test_del_attr(self):
eventid = self.new_event()
time.sleep(1)
self.delete_attr(1)
time.sleep(1)
self.delete(eventid)
def test_one_or_more(self):
self.assertEqual(self.misp._one_or_more(1), (1,))
self.assertEqual(self.misp._one_or_more([1]), [1])
def test_create_user(self):
self.add_user()
def test_create_organisation(self):
self.add_organisation()
def test_describeTypes_sane_default(self):
sane_default = self.live_describe_types['sane_defaults']
self.assertEqual(sorted(sane_default.keys()), sorted(self.live_describe_types['types']))
def test_describeTypes_categories(self):
category_type_mappings = self.live_describe_types['category_type_mappings']
self.assertEqual(sorted(category_type_mappings.keys()), sorted(self.live_describe_types['categories']))
def test_describeTypes_types_in_categories(self):
category_type_mappings = self.live_describe_types['category_type_mappings']
for category, types in category_type_mappings.items():
existing_types = [t for t in types if t in self.live_describe_types['types']]
self.assertEqual(sorted(existing_types), sorted(types))
def test_describeTypes_types_have_category(self):
category_type_mappings = self.live_describe_types['category_type_mappings']
all_types = set()
for category, types in category_type_mappings.items():
all_types.update(types)
self.assertEqual(sorted(list(all_types)), sorted(self.live_describe_types['types']))
def test_describeTypes_sane_default_valid_category(self):
sane_default = self.live_describe_types['sane_defaults']
categories = self.live_describe_types['categories']
for t, sd in sane_default.items():
self.assertTrue(sd['to_ids'] in [0, 1])
self.assertTrue(sd['default_category'] in categories)
def test_live_acl(self):
query_acl = self.misp.get_live_query_acl()
self.assertEqual(query_acl['response'], [])
def test_recommended_pymisp_version(self):
response = self.misp.get_recommended_api_version()
recommended_version_tup = tuple(int(x) for x in response['version'].split('.'))
pymisp_version_tup = tuple(int(x) for x in __version__.split('.'))[:3]
self.assertEqual(recommended_version_tup, pymisp_version_tup)
if __name__ == '__main__':
unittest.main()

View File

@ -6,8 +6,10 @@ import json
import sys
from io import BytesIO
import glob
import hashlib
from datetime import date, datetime
from pymisp import MISPEvent, MISPSighting, MISPTag
from pymisp import MISPEvent, MISPSighting, MISPTag, MISPOrganisation
from pymisp.exceptions import InvalidMISPObject
@ -87,7 +89,8 @@ class TestMISPEvent(unittest.TestCase):
del a.uuid
self.mispevent.objects[0].uuid = 'a'
self.mispevent.objects[1].uuid = 'b'
self.mispevent.objects[0].add_reference(self.mispevent.objects[1], 'baz', comment='foo')
reference = self.mispevent.objects[0].add_reference(self.mispevent.objects[1], 'baz', comment='foo')
del reference.uuid
self.assertEqual(self.mispevent.objects[0].references[0].relationship_type, 'baz')
with open('tests/mispevent_testfiles/event_obj_attr_tag.json', 'r') as f:
ref_json = json.load(f)
@ -292,9 +295,39 @@ class TestMISPEvent(unittest.TestCase):
ref_json = json.load(f)
self.assertEqual(self.mispevent.to_json(sort_keys=True, indent=2), json.dumps(ref_json, sort_keys=True, indent=2))
'''
# Re-enable this on the 1st of Jan 2020.
@unittest.skipIf(sys.version_info < (3, 6), 'Not supported on python < 3.6')
def test_first_last_seen(self):
me = MISPEvent()
me.info = 'Test First and Last Seen'
me.date = '2020.01.12'
self.assertEqual(me.date.day, 12)
me.add_attribute('ip-dst', '8.8.8.8', first_seen='06-21-1998', last_seen=1580213607.469571)
self.assertEqual(me.attributes[0].first_seen.year, 1998)
self.assertEqual(me.attributes[0].last_seen.year, 2020)
now = datetime.now().astimezone()
me.attributes[0].last_seen = now
today = date.today()
me.attributes[0].first_seen = today
self.assertEqual(me.attributes[0].first_seen.year, today.year)
self.assertEqual(me.attributes[0].last_seen, now)
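The assertions above rely on pymisp normalising rather different inputs ('06-21-1998', an epoch float, date and datetime objects) into timezone-aware datetimes; a hedged sketch of that kind of conversion, using the python-dateutil dependency already listed in setup.py:

from datetime import date, datetime, timezone
from dateutil.parser import parse  # python-dateutil

def to_datetime(value):
    # Illustrative only; the real conversion lives in pymisp's attribute setters.
    if isinstance(value, datetime):
        return value
    if isinstance(value, date):
        return datetime(value.year, value.month, value.day).astimezone()
    if isinstance(value, (int, float)):
        return datetime.fromtimestamp(value, tz=timezone.utc)
    return parse(value)

assert to_datetime('06-21-1998').year == 1998
assert to_datetime(1580213607.469571).year == 2020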
def test_feed(self):
me = MISPEvent()
me.info = 'Test feed'
org = MISPOrganisation()
org.name = 'TestOrg'
org.uuid = '123478'
me.Orgc = org
me.add_attribute('ip-dst', '8.8.8.8')
obj = me.add_object(name='file')
obj.add_attributes('filename', *['foo.exe', 'bar.exe'])
h = hashlib.new('md5')
h.update(b'8.8.8.8')
hash_attr_val = h.hexdigest()
feed = me.to_feed(with_meta=True)
self.assertEqual(feed['Event']['_hashes'][0], hash_attr_val)
self.assertEqual(feed['Event']['_manifest'][me.uuid]['info'], 'Test feed')
self.assertEqual(len(feed['Event']['Object'][0]['Attribute']), 2)
def test_object_templates(self):
me = MISPEvent()
for template in glob.glob(str(me.misp_objects_path / '*' / 'definition.json')):
@ -313,7 +346,7 @@ class TestMISPEvent(unittest.TestCase):
if 'categories' in entry:
subset = set(entry['categories']).issubset(me.describe_types['categories'])
self.assertTrue(subset, f'{t_json["name"]} - {obj_relation}')
'''
if __name__ == '__main__':
unittest.main()

View File

@ -7,13 +7,12 @@ import sys
import unittest
from pymisp.tools import make_binary_objects
from datetime import datetime, timedelta, date
from datetime import datetime, timedelta, date, timezone
from io import BytesIO
import re
import json
from pathlib import Path
import urllib3
import urllib3 # type: ignore
import time
from uuid import uuid4
@ -23,6 +22,8 @@ from collections import defaultdict
import logging
logging.disable(logging.CRITICAL)
logger = logging.getLogger('pymisp')
try:
from pymisp import ExpandedPyMISP, MISPEvent, MISPOrganisation, MISPUser, Distribution, ThreatLevel, Analysis, MISPObject, MISPAttribute, MISPSighting, MISPShadowAttribute, MISPTag, MISPSharingGroup, MISPFeed, MISPServer, MISPUserSetting
@ -36,7 +37,7 @@ except ImportError:
raise
try:
from keys import url, key
from keys import url, key # type: ignore
verifycert = False
except ImportError as e:
print(e)
@ -75,7 +76,7 @@ class TestComprehensive(unittest.TestCase):
user.email = 'testusr@user.local'
user.org_id = cls.test_org.id
cls.test_usr = cls.admin_misp_connector.add_user(user, pythonify=True)
cls.user_misp_connector = ExpandedPyMISP(url, cls.test_usr.authkey, verifycert, debug=False)
cls.user_misp_connector = ExpandedPyMISP(url, cls.test_usr.authkey, verifycert, debug=True)
cls.user_misp_connector.toggle_global_pythonify()
# Creates a publisher
user = MISPUser()
@ -839,6 +840,7 @@ class TestComprehensive(unittest.TestCase):
second = self.user_misp_connector.add_event(second)
current_ts = int(time.time())
time.sleep(5)
r = self.user_misp_connector.add_sighting({'value': first.attributes[0].value})
self.assertEqual(int(r.attribute_id), first.attributes[0].id)
@ -990,10 +992,9 @@ class TestComprehensive(unittest.TestCase):
try:
first = self.user_misp_connector.add_event(first)
stix = self.user_misp_connector.search(return_format='stix', eventid=first.id)
found = re.findall('8.8.8.8', stix)
self.assertTrue(found)
self.assertTrue(stix['related_packages'][0]['package']['incidents'][0]['related_indicators']['indicators'][0]['indicator']['observable']['object']['properties']['address_value']['value'], '8.8.8.8')
stix2 = self.user_misp_connector.search(return_format='stix2', eventid=first.id)
json.dumps(stix2, indent=2)
print(json.dumps(stix2, indent=2))
self.assertEqual(stix2['objects'][-1]['pattern'], "[network-traffic:src_ref.type = 'ipv4-addr' AND network-traffic:src_ref.value = '8.8.8.8']")
finally:
# Delete event
@ -1072,8 +1073,24 @@ class TestComprehensive(unittest.TestCase):
file_object = first.get_objects_by_name('file')[0]
file_object.force_misp_objects_path_custom('tests/mispevent_testfiles', 'overwrite_file')
file_object.add_attribute('test_overwrite', 'blah')
obj = self.admin_misp_connector.update_object(file_object, pythonify=True)
obj_json = self.admin_misp_connector.update_object(file_object)
self.assertTrue('Object' in obj_json, obj_json)
self.assertTrue('name' in obj_json['Object'], obj_json)
obj = MISPObject(obj_json['Object']['name'])
obj.from_dict(**obj_json)
self.assertEqual(obj.get_attributes_by_relation('test_overwrite')[0].value, 'blah')
# FULL object add & update with custom template
new_object = MISPObject('overwrite_file', misp_objects_path_custom='tests/mispevent_testfiles')
new_object.add_attribute('test_overwrite', 'barbaz')
new_object.add_attribute('filename', 'barbaz.exe')
new_object = self.admin_misp_connector.add_object(first, new_object, pythonify=True)
self.assertEqual(new_object.get_attributes_by_relation('test_overwrite')[0].value, 'barbaz', new_object)
new_object.force_misp_objects_path_custom('tests/mispevent_testfiles', 'overwrite_file')
new_object.add_attribute('filename', 'foobar.exe')
new_object = self.admin_misp_connector.update_object(new_object, pythonify=True)
self.assertEqual(new_object.get_attributes_by_relation('filename')[1].value, 'foobar.exe', new_object)
finally:
# Delete event
self.admin_misp_connector.delete_event(first)
@ -1185,7 +1202,9 @@ class TestComprehensive(unittest.TestCase):
self.assertFalse(first.attributes[0].tags)
# Reference: https://github.com/MISP/PyMISP/issues/483
r = self.delegate_user_misp_connector.tag(first, tag_org_restricted)
self.assertEqual(r['errors'][1]['message'], 'Invalid Tag. This tag can only be set by a fixed organisation.')
# FIXME: The error message changed and is unhelpful.
# self.assertEqual(r['errors'][1]['message'], 'Invalid Tag. This tag can only be set by a fixed organisation.')
self.assertEqual(r['errors'][1]['message'], 'Invalid Target.')
r = self.user_misp_connector.tag(first, tag_org_restricted)
self.assertEqual(r['name'], f'Global tag {tag_org_restricted.name}({tag_org_restricted.id}) successfully attached to Event({first.id}).')
r = self.pub_misp_connector.tag(first.attributes[0], tag_user_restricted)
@ -1958,7 +1977,6 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_user(test_roles_user)
self.admin_misp_connector.delete_tag(test_tag)
@unittest.skipIf(sys.version_info < (3, 6), 'Not supported on python < 3.6')
def test_expansion(self):
first = self.create_simple_event()
try:
@ -2027,7 +2045,6 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_event(second)
@unittest.skipIf(sys.version_info < (3, 6), 'Not supported on python < 3.6')
def test_communities(self):
communities = self.admin_misp_connector.communities(pythonify=True)
self.assertEqual(communities[0].name, 'CIRCL Private Sector Information Sharing Community - aka MISPPRIV')
@ -2061,6 +2078,51 @@ class TestComprehensive(unittest.TestCase):
self.admin_misp_connector.delete_event(first)
self.admin_misp_connector.delete_event(second)
def test_first_last_seen(self):
local_tz = datetime.now(timezone.utc).astimezone().tzinfo
event = MISPEvent()
event.info = 'Test First Last seen'
event.add_attribute('ip-dst', '8.8.8.8', first_seen='2020-01-04', last_seen='2020-01-04T12:30:34.323242+0800')
obj = event.add_object(name='file', first_seen=1580147259.268763, last_seen=1580147300)
attr = obj.add_attribute('filename', 'blah.exe')
attr.first_seen = '2022-01-30'
attr.last_seen = '2022-02-23'
try:
first = self.admin_misp_connector.add_event(event, pythonify=True)
# Simple attribute
self.assertEqual(first.attributes[0].first_seen, datetime(2020, 1, 4, 0, 0, tzinfo=local_tz))
self.assertEqual(first.attributes[0].last_seen, datetime(2020, 1, 4, 4, 30, 34, 323242, tzinfo=timezone.utc))
# Object
self.assertEqual(first.objects[0].first_seen, datetime(2020, 1, 27, 17, 47, 39, 268763, tzinfo=timezone.utc))
self.assertEqual(first.objects[0].last_seen, datetime(2020, 1, 27, 17, 48, 20, tzinfo=timezone.utc))
# Object attribute
self.assertEqual(first.objects[0].attributes[0].first_seen, datetime(2022, 1, 30, 0, 0, tzinfo=local_tz))
self.assertEqual(first.objects[0].attributes[0].last_seen, datetime(2022, 2, 23, 0, 0, tzinfo=local_tz))
# Update values
# Attribute in full event
now = datetime.now().astimezone()
first.attributes[0].last_seen = now
first = self.admin_misp_connector.update_event(first, pythonify=True)
self.assertEqual(first.attributes[0].last_seen, now)
# Object only
now = datetime.now().astimezone()
obj = first.objects[0]
obj.last_seen = now
obj = self.admin_misp_connector.update_object(obj, pythonify=True)
self.assertEqual(obj.last_seen, now)
# Attribute in object only
now = datetime.now().astimezone()
attr = obj.attributes[0]
attr.last_seen = now
attr = self.admin_misp_connector.update_attribute(attr, pythonify=True)
self.assertEqual(attr.last_seen, now)
finally:
self.admin_misp_connector.delete_event(first)
if __name__ == '__main__':
unittest.main()

View File

@ -6,7 +6,7 @@ import sys
import unittest
import subprocess
import urllib3
import urllib3 # type: ignore
import logging
logging.disable(logging.CRITICAL)

View File

@ -3,11 +3,6 @@
set -e
set -x
if [ ${LEGACY} == true ]; then
pip install nose coveralls codecov requests-mock pydeep
pip install .[fileobjects]
else
# We're in python3, installing with pipenv.
pip install pipenv
pipenv update --dev
fi
# We're in python3, installing with pipenv.
pip3 install pipenv
pipenv update --dev

View File

@ -3,9 +3,6 @@
set -e
set -x
if [ -z ${LEGACY} ]; then
# We're in python3, test all and use pipenv.
pipenv run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
else
nosetests --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_mispevent.py
fi
pipenv run nosetests-3.4 --with-coverage --cover-package=pymisp,tests --cover-tests tests/test_*.py
pipenv run mypy tests/testlive_comprehensive.py tests/test_mispevent.py tests/testlive_sync.py pymisp
pipenv run flake8 --ignore=E501,W503,E226,E252 pymisp