Compare commits

...

644 Commits

Author SHA1 Message Date
chrisr3d ca61b06aa2
Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-05-22 22:15:40 +02:00
Chris Lenk 7e418252d5
Merge pull request #397 from chisholm/drop_dateutil
Drop python-dateutil and switch to built-in datetime module
2020-05-21 10:23:40 -04:00
Michael Chisholm 7955a41997 Drop python-dateutil as a dependency and switch to the builtin
datetime module for parsing timestamps.  Dateutil proved too
slow.
2020-05-20 15:06:53 -04:00
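A minimal sketch of parsing STIX-style timestamps with only the built-in datetime module, as the commit above describes (illustrative, not the library's exact code; the format strings are assumptions):

```python
from datetime import datetime, timezone

def parse_stix_timestamp(ts):
    # STIX timestamps look like '2020-05-20T15:06:53.000Z'; try with and
    # without fractional seconds, treating the trailing 'Z' as UTC.
    for fmt in ("%Y-%m-%dT%H:%M:%S.%fZ", "%Y-%m-%dT%H:%M:%SZ"):
        try:
            return datetime.strptime(ts, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            continue
    raise ValueError("invalid STIX timestamp: %r" % ts)

print(parse_stix_timestamp("2020-05-20T15:06:53.123Z"))
```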
Chris Lenk 33e07edf3b
Merge pull request #393 from emmanvg/391-ssdeep-hash-case
resolve problem with SSDEEP use in hashing-algorithm-ov
2020-05-15 09:36:26 -04:00
Emmanuelle Vargas-Gonzalez b4dbc419f6 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into 391-ssdeep-hash-case 2020-05-14 12:51:02 -04:00
Chris Lenk 658e70bf04
Merge pull request #392 from khdesai/fix_issue_389
Fix issue 389 - Add property names to canonicalization for deterministic id gen
2020-05-14 09:39:26 -04:00
Emmanuelle Vargas-Gonzalez 68f7ca6377 resolve problem with SSDEEP vocab use for 2.1, closes #391 2020-05-13 18:17:17 -04:00
Desai, Kartikey H 998b4c0725 Change streamlined_obj_vals list to streamlined_object dict 2020-05-13 12:45:16 -05:00
Desai, Kartikey H 9ce299b660 Fixes #389 2020-05-13 11:40:37 -05:00
Desai, Kartikey H 65d4060e6a Fixes #389 2020-05-13 11:23:26 -05:00
Desai, Kartikey H 0b1297b14a Fixes #389 2020-05-13 11:22:51 -05:00
Desai, Kartikey H de3fa99a12 Add property names to canonicalization for deterministic id gen 2020-05-13 11:20:16 -05:00
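The #389 commits above concern deterministic SCO ID generation. A rough sketch of the general technique (UUIDv5 over canonical JSON of the ID-contributing properties, with property names included); the serialization details here are an assumption, not the library's exact implementation:

```python
import json
import uuid

# Namespace the STIX 2.1 spec designates for deterministic SCO IDs.
SCO_DET_ID_NAMESPACE = uuid.UUID("00abedb4-aa42-466c-9c01-fed23315a9b7")

def deterministic_id(obj_type, contributing_props):
    # Serialize property *names and values* (sorted keys, compact separators)
    # so identical inputs always canonicalize to the same string.
    canonical = json.dumps(contributing_props, sort_keys=True, separators=(",", ":"))
    return "{}--{}".format(obj_type, uuid.uuid5(SCO_DET_ID_NAMESPACE, canonical))

print(deterministic_id("file", {"name": "example.exe"}))
```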
chrisr3d e4f08557ec
fix: Diffusing interoperability parameter to all included objects & references 2020-04-15 18:00:21 +02:00
chrisr3d 8e95dbfce2
Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-04-15 16:26:20 +02:00
Chris Lenk 31cb2f85be Bump version: 1.3.1 → 1.4.0 2020-04-03 17:44:52 -04:00
Chris Lenk c68dd055c1 Update CHANGELOG for v1.4.0 2020-04-03 17:44:20 -04:00
Chris Lenk df92770d25
Merge pull request #384 from oasis-open/365-versioned-classes
Validate custom type/property name formats
2020-04-03 17:30:24 -04:00
Chris Lenk 8c4204de74
Merge pull request #385 from emmanvg/taxii-datastore-updates
Test Datastore TAXII Updates
2020-04-03 17:30:04 -04:00
Emmanuelle Vargas-Gonzalez 2b0d63c4b1 update test_datastore_taxii.py conftest.py for latest changes in medallion. add extra data used by filter 2020-04-03 17:19:36 -04:00
Chris Lenk c7fb79d195 Fix some TAXII DataStore tests 2020-04-03 15:58:56 -04:00
Chris Lenk 9145bdf5e8
Merge pull request #374 from chisholm/version_precision
Support STIX 2.1 version precision
2020-04-03 15:52:42 -04:00
Chris Lenk 0d770972cf
Merge pull request #382 from oasis-open/more-pattern-tests
More pattern tests
2020-04-03 11:24:43 -04:00
Chris Lenk e730d45d44 Use DEFAULT_VERSION in create_pattern_object() 2020-04-03 10:45:36 -04:00
Chris Lenk 14540c0ea1 Clean up _register_* functions
Made them consistent with _register_observable_extension, by:
- moving validation logic there from _custom_*_builder functions
- using a new function for ensuring properties are dict-like
- using the library default spec version instead of None

Fix #371, fix #372, fix #373.
2020-04-02 14:15:45 -04:00
Chris Lenk bbf0f81d5f
Merge pull request #376 from khdesai/fix_issue_363
Fix existing tests and add new tests. Fixes #363
2020-04-02 13:42:02 -04:00
chrisr3d 65a943d892 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-04-02 17:55:24 +02:00
Chris Lenk d33adbcc71 Rename test files to align with module renaming 2020-04-02 08:22:49 -04:00
Chris Lenk 13cddf9d6d Move TypeProperty format checks to __init__
TypeProperty uses a fixed value, so check() was never called. This way
also runs the check at object registration time because the wrapper
creates an instance of TypeProperty and doesn't have to wait for the
object to be instantiated so clean() can be called.
Also fix some tests.
2020-04-02 08:17:34 -04:00
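A toy illustration of the idea in the commit above: validating a fixed type value in the property's __init__ so a bad name fails at registration time rather than at first instantiation (class name and regex are assumptions):

```python
import re

TYPE_FORMAT_RE = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")

class FixedTypeProperty:
    """Illustrative stand-in for a fixed-value type property."""

    def __init__(self, fixed_type):
        # Because the value is fixed, check()/clean() would never run on it;
        # validating here surfaces a bad type name as soon as the wrapper
        # creates the property, i.e. at object registration time.
        if not TYPE_FORMAT_RE.match(fixed_type):
            raise ValueError("invalid type name format: %r" % fixed_type)
        self.fixed_type = fixed_type

FixedTypeProperty("x-my-type")      # OK
# FixedTypeProperty("Bad_Type")     # would raise ValueError immediately
```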
Chris Lenk 03cb225932 Merge branch 'master' into 365-versioned-classes 2020-04-02 06:02:20 -04:00
Chris Lenk 897e884217 Fix some testing 2020-04-02 04:46:11 -04:00
Chris Lenk c494a2e477 Use TypeProperty.clean() to verify type format 2020-04-01 21:52:04 -04:00
Desai, Kartikey H c911cff97f Add duplicate checking to markings and observable extensions, and fix some tests and add some tests. Fixes #363 2020-03-27 14:58:18 -04:00
Rich Piazza 1a2b1367cf flaky 2 2020-03-27 14:06:24 -04:00
Rich Piazza 9933f88975 few more pattern op tests 2020-03-27 13:59:03 -04:00
Rich Piazza e3ebb6393d flaky 2020-03-27 12:33:24 -04:00
Rich Piazza 202111acdf more pattern tests 2020-03-27 11:22:00 -04:00
Chris Lenk 38817a603f
Merge pull request #379 from oasis-open/new-master
add 2.1 links
2020-03-27 10:10:44 -04:00
Rich Piazza 46219bf072 add 2.1 links 2020-03-27 09:36:10 -04:00
Chris Lenk b4700e6d00 Fix import errors
And pin medallion version for testing.
2020-03-27 06:33:29 -04:00
Chris Lenk 50df6f1474 Rename core.py -> parsing.py 2020-03-27 05:53:39 -04:00
Chris Lenk 01ba190525 Reorganize bases, use isinstance to check version
Renamed STIXDomainObject -> _DomainObject.
Renamed STIXRelationshipObject -> _RelationshipObject.
2020-03-27 02:40:42 -04:00
Desai, Kartikey H a7e9a7dde5 Merge branch 'master' of https://github.com/oasis-open/cti-python-stix2 into fix_issue_363 2020-03-26 23:27:51 -04:00
Chris Lenk e8035863b8
Make swid an id-contributing property 2020-03-21 23:56:09 -04:00
Chris Lenk e31634c32b Rework spec version detection for _STIXBase objs 2020-03-21 22:22:36 -04:00
Desai, Kartikey H 1a1ad90388 Fixes #363 2020-03-20 17:37:15 -04:00
Desai, Kartikey H b06bc1afc1 Fix import issues 2020-03-20 17:32:18 -04:00
Desai, Kartikey H f37b84a564 Pull in updates from master 2020-03-20 16:52:21 -04:00
Desai, Kartikey H 1260c7b45e Fix existing tests and add new tests. Fixes #363 2020-03-20 16:49:20 -04:00
Rich Piazza 2dea4caf00 fix re so they begin with ^ 2020-03-20 14:24:16 -04:00
Rich Piazza d8a9fc2306 flaky 2020-03-20 13:15:42 -04:00
Rich Piazza 9e5e998c3d don't allow leading '_' on custom properties, whenever allow_custom is true 2020-03-20 12:49:20 -04:00
Rich Piazza 2c4e47de56 remove leading - from type name re 2020-03-20 11:56:09 -04:00
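The three commits above adjust the name-format regular expressions. An illustrative sketch of the constraints they describe (anchored patterns, no leading '-' on type names, no leading '_' on custom properties); these are not the library's exact expressions:

```python
import re

# Type names: lowercase alphanumerics separated by single hyphens, no leading '-'.
TYPE_RE = re.compile(r"^[a-z0-9]+(-[a-z0-9]+)*$")
# Custom property names: lowercase alphanumerics/underscores, no leading '_'.
PROPERTY_RE = re.compile(r"^[a-z0-9][a-z0-9_]*$")

assert TYPE_RE.match("x-my-type")
assert not TYPE_RE.match("-bad-type")     # leading '-' rejected
assert not PROPERTY_RE.match("_private")  # leading '_' rejected
# Starting each pattern with '^' ensures the check runs from the beginning of
# the string even if re.search() is used instead of re.match().
```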
Rich Piazza 6e4151aeeb flaky 2020-03-19 16:49:46 -04:00
Rich Piazza fe919049b8 fix marking test 2020-03-19 16:43:37 -04:00
Rich Piazza f60e4170fd finish 365 2020-03-19 16:11:52 -04:00
Rich Piazza 844ec2c3bf more on issue 365 2020-03-19 14:16:48 -04:00
Rich Piazza 9699c78ad8 issue-365 2020-03-19 10:40:35 -04:00
Michael Chisholm 1741cc9f6b Fix import sort order for the import sorter precommit hook 2020-03-17 20:26:21 -04:00
Michael Chisholm 6f43814918 Add xfail mark to a unit test which trips a Python 3.6 bug.
https://bugs.python.org/issue32404
2020-03-17 20:21:09 -04:00
Michael Chisholm f99665f2ba One more comma, because python 3.8's add-trailing-comma
pre-commit hook doesn't add all the commas Travis's hook
script expects...
2020-03-17 19:45:39 -04:00
Michael Chisholm cf9aef59c2 More flake8 style fixes 2020-03-17 18:28:38 -04:00
Michael Chisholm a9ac7ce838 pre-commit hook changes, e.g. trailing commas, import sorting,
flake8 style.
2020-03-17 18:26:57 -04:00
Michael Chisholm 4aa69fa7c9 Add support for enforcing STIX 2.1 minimum precision requirement
on versioning timestamps.
2020-03-16 20:25:38 -04:00
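A small sketch of enforcing the millisecond precision requirement on versioning timestamps mentioned above (assumed approach: truncate and serialize exactly three fractional digits); illustrative only, not the library's implementation:

```python
from datetime import datetime, timezone

def format_versioning_timestamp(dt):
    # STIX 2.1 requires versioning timestamps (created/modified) to carry at
    # least millisecond precision; truncate to milliseconds and serialize
    # exactly three fractional digits with a 'Z' suffix.
    dt = dt.replace(microsecond=(dt.microsecond // 1000) * 1000)
    return dt.astimezone(timezone.utc).isoformat(timespec="milliseconds").replace("+00:00", "Z")

dt = datetime(2020, 3, 16, 20, 25, 38, 123456, tzinfo=timezone.utc)
print(format_versioning_timestamp(dt))  # 2020-03-16T20:25:38.123Z
```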
Chris Lenk 6842abb371
Merge pull request #370 from chisholm/observable_extension_names_ext
New STIX 2.1 SCO extension name requirement: must end with "-ext"
2020-03-12 17:26:26 -04:00
Michael Chisholm 15316e7933 Added "x-" to SCO extension names in unit tests, to illustrate
best practice and follow a spec "should" rule.
2020-03-12 16:20:32 -04:00
Chris Lenk 3dda25e976
Merge pull request #362 from chisholm/file_id_contrib_props
Add parent_directory_ref as an ID contributing property for file SCOs
2020-03-12 09:36:16 -04:00
Chris Lenk 5abc139e79 Merge branch 'khdesai-fix_issue_338'; Close #347 2020-03-12 09:30:52 -04:00
Chris Lenk 3dd9351d38 Bring back lang, confidence for Course of Action 2020-03-12 09:24:43 -04:00
Desai, Kartikey H 82517ae284 Fixes #338 2020-03-12 09:24:43 -04:00
Desai, Kartikey H 8885a757cb Fix properties spec version back to 2.1, and re-adjust tests. Fixes #338 2020-03-12 09:24:43 -04:00
Desai, Kartikey H 36f7035785 Fixes #338 2020-03-12 09:24:43 -04:00
Chris Lenk e782d095ea
Merge pull request #369 from chisholm/malware_os_refs
Change software SCO: os_execution_envs -> operating_system_refs
2020-03-11 23:47:14 -04:00
Chris Lenk 94e3cd7ca6
Merge pull request #360 from chisholm/enforce_hash_keys
Enforce hash keys on 2.1 external-references
2020-03-11 23:13:55 -04:00
Chris Lenk 87c5ef30ad
Merge pull request #358 from chisholm/software_cpe_swid
Add swid property to the software SCO
2020-03-11 23:03:18 -04:00
Michael Chisholm 2472af387b Change a SWID tagId in a unit test from a UUID to something
more plausible.
2020-03-11 15:21:34 -04:00
Chris Lenk 33fb31421b
Merge pull request #357 from chisholm/malware_analysis_result
Update malware-analysis SDO's av_result property
2020-03-11 09:16:34 -04:00
Chris Lenk bdf7cab8fe
Merge pull request #356 from chisholm/malware_analysis_sample_ref
Add the "sample_ref" property to malware-analysis SDOs
2020-03-11 09:12:02 -04:00
Chris Lenk 2429533e4f
Merge pull request #355 from chisholm/optional_type_properties
Changed several *_types properties to be optional due to STIX spec change
2020-03-11 09:08:42 -04:00
Michael Chisholm 371bf0b9a4 Add trailing commas for git commit hook... 2020-03-10 21:21:53 -04:00
Michael Chisholm d708537b85 Add enforcement of a new STIX 2.1 SCO extension name requirement:
that it must end with "-ext".
2020-03-10 20:24:53 -04:00
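The requirement described above is simple to state in code; a hedged sketch (the function name is made up):

```python
def check_sco_extension_name(name):
    # STIX 2.1 requires SCO extension names to end with "-ext"; a leading
    # "x-" for custom extensions is a spec "should", per the tests above.
    if not name.endswith("-ext"):
        raise ValueError("SCO extension name %r must end with '-ext'" % name)

check_sco_extension_name("x-acme-widget-ext")   # OK
# check_sco_extension_name("x-acme-widget")     # would raise ValueError
```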
chrisr3d 77ca5ae2f9 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-03-09 16:20:57 +01:00
Michael Chisholm 792cc570d7 Change the os_execution_envs property of software SCOs to
operating_system_refs, and add a test for it.
2020-03-06 19:43:47 -05:00
Chris Lenk 380926cff5 Bump version: 1.3.0 → 1.3.1 2020-03-06 09:50:09 -05:00
Chris Lenk 013e8ece4c Update CHANGELOG for v1.3.1 2020-03-06 09:48:03 -05:00
Michael Chisholm e32b074bc9 Fix stylistic issues for pre-commit hooks. 2020-03-05 17:39:35 -05:00
Michael Chisholm 22f2b241a7 Add a missing required property to fix up an external-reference
test.
2020-03-05 17:38:03 -05:00
Michael Chisholm a862b930be Add parent_directory_ref as an ID contributing property for the
file SCO.
2020-03-05 17:18:32 -05:00
Chris Lenk 3803e4bdd7
Merge pull request #343 from chisholm/sco_tlo_filesystemstore
Fix the filesystem store to support the new top-level 2.1 SCOs.
2020-03-05 17:08:20 -05:00
Chris Lenk 67548a8e5e Merge branch 'khdesai-fix_issue_323', Close #345 2020-03-05 00:14:53 -05:00
Chris Lenk 413aab62dd
Merge pull request #353 from chisholm/fix_indicator_test
Update exception message in an indicator unit test
2020-03-05 10:51:27 -05:00
Chris Lenk cdde664434
Merge branch 'master' into fix_indicator_test 2020-03-05 10:51:03 -05:00
Chris Lenk d260aa8ebd Tweak custom SCO ID-generating properties docs 2020-03-05 00:11:24 -05:00
Desai, Kartikey H 5df319bcbb Add documentation about id-contributing properties for 2.1 custom SCOs 2020-03-05 00:11:24 -05:00
Desai, Kartikey H 02171996fa Add documentation about _id_contributing_properties for custom SCOs. Will add in code example once PR 354 is merged in 2020-03-05 00:11:24 -05:00
Desai, Kartikey H 47b28d1194 Fixes #323 2020-03-05 00:11:24 -05:00
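The documentation commits above cover ID-contributing properties for custom SCOs. A hedged usage sketch (the observable type is hypothetical, and the id_contrib_props keyword name is assumed from the documentation these commits reference):

```python
import stix2
from stix2.properties import StringProperty

@stix2.v21.CustomObservable('x-example-observable', [
    ('name', StringProperty(required=True)),
    ('color', StringProperty()),
], id_contrib_props=['name'])
class ExampleObservable:
    pass

a = ExampleObservable(name="foo", color="red")
b = ExampleObservable(name="foo", color="blue")
assert a.id == b.id   # only 'name' feeds the deterministic ID
```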
Michael Chisholm a5dc514403 Fix external-references to force hash keys to come from
hash-algorithm-ov.
2020-03-04 20:55:52 -05:00
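A minimal sketch of the kind of check the commit above describes, using a partial, illustrative copy of the hash-algorithm-ov vocabulary:

```python
# Partial hash-algorithm-ov vocabulary (illustrative subset).
HASH_ALGORITHM_OV = {"MD5", "SHA-1", "SHA-256", "SHA-512", "SHA3-256", "SSDEEP", "TLSH"}

def check_hash_keys(hashes):
    # External references (and other hash-bearing fields) must use keys
    # drawn from hash-algorithm-ov in STIX 2.1.
    bad = sorted(k for k in hashes if k not in HASH_ALGORITHM_OV)
    if bad:
        raise ValueError("hash keys not in hash-algorithm-ov: %s" % ", ".join(bad))

check_hash_keys({"SHA-256": "35a01331e9ad96f751278b891b6ea09699806faedfa237d40513d92ad1b7100f"})
```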
Chris Lenk 57d24adea9
Merge pull request #354 from khdesai/fix_issue_351
Add _id_contributing_properties functionality to custom SCOs. Tests c…
2020-03-04 15:01:08 -05:00
Desai, Kartikey H a5cd0fdc50 Change location of None-check for id_contrib_props. Fixes #351 2020-03-04 14:46:55 -05:00
Desai, Kartikey H fc95b400ff Change default parameters from empty lists to None. Fixes #351 2020-03-04 14:29:35 -05:00
Desai, Kartikey H 8810983ca0 Merge branch 'master' of https://github.com/oasis-open/cti-python-stix2 into fix_issue_351 2020-03-04 14:16:54 -05:00
Chris Lenk 30a59ad776
Merge pull request #344 from chisholm/fix_ast_builder
Fix the pattern AST creation function
2020-03-04 13:49:16 -05:00
Chris Lenk 4f00c7ca4f Fix patterning test 2020-03-04 13:33:54 -05:00
Michael Chisholm 4e2b018272 Add a property to the software SCO, due to STIX spec change. 2020-03-02 16:57:18 -05:00
chrisr3d 0f0bc42681
Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-03-02 15:31:39 +01:00
Michael Chisholm d2bff4d411 Update malware-analysis SDO's av_result property: replace it with
result and result_name properties.  Per:
https://github.com/oasis-tcs/cti-stix2/issues/213
2020-02-27 17:26:04 -05:00
Michael Chisholm 50eb188190 Add the "sample_ref" property to malware-analysis SDOs, per:
https://github.com/oasis-tcs/cti-stix2/issues/210
2020-02-27 16:40:56 -05:00
Desai, Kartikey H 055ad97a7a Add tests for _id_contributing_properties for custom observables 2020-02-27 15:15:37 -05:00
Michael Chisholm 93a8caa09d Remove unused import 2020-02-25 20:19:30 -05:00
Michael Chisholm 31c37a9b12 Changed several *_types properties which were formerly required,
to be optional, due to a STIX spec change.  Updated unit tests
accordingly.
2020-02-25 20:07:47 -05:00
Desai, Kartikey H 41e541959d Add _id_contributing_properties functionality to custom SCOs. Tests coming soon. Fixes #351 2020-02-24 21:11:42 -05:00
Michael Chisholm 274abc52e9 An exception message changed as a result of a pattern-validator
update.  This broke a unit test which was testing the message.
I updated the test.
2020-02-24 20:02:26 -05:00
Chris Lenk c2b71672f5
Merge pull request #348 from khdesai/fix_issue_334
Fixes #334
2020-02-21 16:35:18 -05:00
Desai, Kartikey H a0a8b7d0e1 Fixes #334 2020-02-21 15:40:38 -05:00
Desai, Kartikey H 796a4e20fa Correct bug in recursive dict loop. Fixes #334 2020-02-21 15:26:19 -05:00
Chris Lenk 148d672b24
Merge pull request #346 from khdesai/fix_issue_336
Fixes #336 (some SCO _refs properties no longer deprecated)
2020-02-21 11:51:29 -05:00
Michael Chisholm 1959cc6097 Removed a bunch of no-longer-used imports from pattern_visitor.py 2020-02-19 16:45:15 -05:00
Michael Chisholm 76a6eb5873 Greatly simplify the create_pattern_object() function to take
advantage of the pattern validator library's Pattern.visit()
method.
2020-02-19 16:39:15 -05:00
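A hedged usage sketch of the function these commits simplify; the exact AST classes returned vary across versions of the library:

```python
from stix2.pattern_visitor import create_pattern_object

# Parse a STIX pattern string into the library's pattern AST objects.
ast = create_pattern_object("[ipv4-addr:value = '198.51.100.1']")

# Printing the AST re-serializes it back to a pattern string.
print(ast)
print(type(ast).__name__)
```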
Desai, Kartikey H 1084c75d33 Fixes #334 2020-02-19 16:29:13 -05:00
Michael Chisholm 14daa1edae Add a test case to test parse exceptions from
create_pattern_object().
2020-02-19 15:39:23 -05:00
Desai, Kartikey H 8219b34ea4 Fix formatting issues. Fixes #336 2020-02-19 09:24:27 -05:00
Desai, Kartikey H 86f9e51a42 Fixes #336 2020-02-19 09:11:30 -05:00
Michael Chisholm cfb7c4c73b Fix stix2.pattern_visitor.create_pattern_object() so its
documentation at least isn't wrong, and it behaves better.
2020-02-17 19:26:21 -05:00
Michael Chisholm 4c67142b92 Fix the filesystem store to support the new top-level 2.1 SCOs. 2020-02-15 19:02:53 -05:00
Chris Lenk 8aca39a0b0
Merge pull request #342 from chisholm/sco_tlo_memorystore
Fix the memory store to support the new top-level 2.1 SCOs.
2020-02-14 10:16:06 -05:00
Michael Chisholm be5274878d Add trailing commas for pre-commit hook... 2020-02-13 17:37:59 -05:00
Michael Chisholm 98a654884d Fix the memory store to support the new top-level 2.1 SCOs. 2020-02-13 17:11:58 -05:00
Chris Lenk fdb00c4a8c
Merge pull request #337 from chisholm/improve_stix_version_detection
Fix STIX version detection from dicts
2020-02-13 14:53:03 -05:00
Michael Chisholm f86b6e8a66 More add-trailing-comma junk, which is not done by the python 3.8
pre-commit add-trailing-comma library.  Hopefully this satisfies
the travis tests for other versions of python!
2020-02-07 19:15:59 -05:00
Michael Chisholm bf83ca62b3 Add trailing commas for the pre-commit hook...... 2020-02-07 18:58:45 -05:00
Michael Chisholm 19707677c9 Fix STIX version detection from dicts. In particular, 2.1 SCOs
without the spec_version property ought to be correctly detected
as 2.1 now.
2020-02-07 18:17:12 -05:00
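An illustrative heuristic matching the commit message above (not the library's exact logic): prefer spec_version when present, treat known SCO types without it as 2.1, and fall back to 2.0:

```python
# Illustrative subset of STIX 2.1 SCO types.
SCO_21_TYPES = {"artifact", "domain-name", "file", "ipv4-addr", "software", "url"}

def detect_stix_version(obj):
    if "spec_version" in obj:
        return obj["spec_version"]
    # 2.1 SCOs may legitimately omit spec_version now that they are
    # top-level objects, so recognize them by type.
    if obj.get("type") in SCO_21_TYPES:
        return "2.1"
    # STIX 2.0 objects never carry spec_version at the object level.
    return "2.0"

print(detect_stix_version({"type": "file", "name": "example.exe"}))  # 2.1
```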
chrisr3d 5aaf07702d Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-02-05 21:18:14 +01:00
Chris Lenk c96b74294a
Merge pull request #331 from chisholm/remove_values_workaround
Remove workaround for "values" being both a Mapping method name and STIX property name
2020-02-04 10:39:42 -05:00
Michael Chisholm c6b2b6fbfa Change the warning in the jupyter notebook for creating content,
regarding name collisions between method and property names, to
not pick on the Mapping methods specifically.  The problem is
more general than that: stix objects have more methods than those.
Instead of listing them all out, a more general statement is
made, that accessing those attributes will result in a bound
method, not a STIX property value.
2020-02-03 16:51:12 -05:00
chrisr3d dece917a68 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-01-31 11:38:26 +01:00
Michael Chisholm 1cadeaa1c6 Added a warning to creating.ipynb about Mapping attributes: they
can't be used to access STIX object properties.
2020-01-29 16:01:15 -05:00
Michael Chisholm 176cb980a2 Remove workaround for "values" being both a Mapping method name
and sometimes a STIX property name.  It didn't work (caused
crashes under some circumstances).  Now, attributes whose names
conflict with Mapping methods will have the Mapping
interpretation.  Same-named STIX object properties will not be
accessible as attributes.
2020-01-28 18:13:36 -05:00
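The collision described above is easiest to see with windows-registry-key, whose 'values' property shadows a Mapping method. A hedged example against the v21 API as it stood around these releases:

```python
import stix2

key = stix2.v21.WindowsRegistryKey(key="HKEY_LOCAL_MACHINE\\System\\Foo\\Bar")

# Attribute access resolves to the Mapping method, not the STIX property...
print(key.values)   # <bound method ...>
# ...so colliding properties have to be read with item access instead.
print(key["key"])   # HKEY_LOCAL_MACHINE\System\Foo\Bar
```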
Chris Lenk 7c186b0a06 Merge branch 'khdesai-fix_issue_303' 2020-01-28 16:10:54 -05:00
Desai, Kartikey H 5b07887edc Fixes #303 2020-01-28 15:41:38 -05:00
Chris Lenk 4b8fda0c2f
Merge pull request #328 from emmanvg/327-marking-definition-changes
327 marking definition changes
2020-01-28 14:05:20 -05:00
Emmanuelle Vargas-Gonzalez 88426de424 update test suite to include new property present in TLP Markings 2020-01-28 13:20:58 -05:00
Emmanuelle Vargas-Gonzalez 6f4e819c73 update check_tlp_marking() to contain new representation for TLP markings 2020-01-28 13:20:20 -05:00
Emmanuelle Vargas-Gonzalez 9463884170 add optional "name" StringProperty to MarkingDefinition
update TLP_* v21 constants according to spec
2020-01-28 13:19:23 -05:00
chrisr3d 96946d956d
fix: Avoid errors with custom object ids in the list of object refs in the Report object 2020-01-17 17:13:43 +01:00
chrisr3d ec1fbb58f6 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-01-17 09:20:06 +01:00
Chris Lenk 0af1f442c0
Merge pull request #322 from emmanvg/321-issue
add encoding option to areas where open() is used
2020-01-16 10:17:28 -05:00
Emmanuelle Vargas-Gonzalez c467f198c8 add encoding to MemorySource load_from_file() 2020-01-15 14:15:08 -05:00
Emmanuelle Vargas-Gonzalez 25cfb25ef3 add encoding and propagate accordingly for calls 2020-01-15 14:12:58 -05:00
chrisr3d 0cc3a4585e Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-01-13 23:46:52 +01:00
Chris Lenk b17ac3ba30
Merge pull request #319 from oasis-open/fix-travis-pyyaml
Redo pyyaml workaround for travis
2020-01-08 13:37:03 -05:00
Chris Lenk 76439d1ce9 Redo pyyaml workaround for travis 2020-01-08 13:22:08 -05:00
chrisr3d 0a188f9440 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-01-08 17:01:31 +01:00
chrisr3d cbe109d815
fix: Avoiding issues with some extensions that are not dict but defaultdict 2020-01-08 17:00:18 +01:00
chrisr3d 36e4b41b9c
fix: Avoiding inconsistency in the id prefixes causing uuid check issues 2020-01-08 16:58:34 +01:00
Chris Lenk 8140258d81 Set pyyaml version for Python3.4 2020-01-08 09:44:49 -05:00
chrisr3d c8cd84925b
Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2020-01-08 14:53:36 +01:00
Chris Lenk 6cf47443b1
Update .travis.yml for 3.4 workaround
We'll drop 3.4 support in our next major release but until then this should keep travis builds working.
2020-01-07 16:18:14 -05:00
Chris Lenk a6fefff33d Fix semantic equivalence documentation 2020-01-07 15:49:06 -05:00
Chris Lenk 92f413a2e0 Bump version: 1.2.1 → 1.3.0 2020-01-04 19:40:01 -05:00
Chris Lenk f195cb299e Update CHANGELOG for v1.3.0 2020-01-04 19:35:01 -05:00
Chris Lenk 3092d88154 Fix trailing comma 2020-01-04 18:02:01 -05:00
Chris Lenk e3c2a3a57b Fix error: dict keys changing during iteration 2020-01-04 14:48:49 -05:00
Chris Lenk 4aaf1a0be7 Fix typo 2020-01-04 13:55:04 -05:00
Chris Lenk 6c0fba0e67 Add Python3.8 support, fix import
Uses try/catch to still support 2.7 too
2020-01-04 13:50:06 -05:00
Chris Lenk 2d3afb2a27 Merge branch 'khdesai-fix_issue_307'
Closes #317.
2020-01-04 10:24:51 -05:00
Desai, Kartikey H d50792b4d2 Fix tests. Fixes #307 2020-01-04 10:24:17 -05:00
Desai, Kartikey H 7a47f348a0 Introduce and relocate version-based pattern checking. Fixes #307 2020-01-04 10:24:17 -05:00
Desai, Kartikey H 4350680e79 Introduce and relocate version-based pattern checking. Fixes #307 2020-01-04 10:24:17 -05:00
Desai, Kartikey H a18612bdfb Fixes #307 2020-01-04 10:24:17 -05:00
Chris Lenk 25eb3bdb0c Merge branch 'khdesai-fix_issue_309'
Close #316.
2019-12-23 17:32:26 -05:00
Desai, Kartikey H e260dbb716 Fixes #309 2019-12-23 17:30:34 -05:00
Desai, Kartikey H 32d2a0a4fd Fixes #309 2019-12-23 17:30:34 -05:00
Chris Lenk 74eeabab77 Merge branch 'khdesai-change_logging'
Close #304.
2019-12-23 17:20:32 -05:00
Chris Lenk a9932c09c8 Update Semantic Equivalence documentation 2019-12-23 17:02:21 -05:00
Chris Lenk 62cd4fd33c Change string semantic comparison algorithm
Use `fuzzywuzzy`'s Token Sort Ratio instead of Jaro-Winkler.
2019-12-23 17:00:52 -05:00
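For reference, the comparison the commit above switches to, fuzzywuzzy's Token Sort Ratio, tokenizes and sorts both strings before scoring, so word order does not penalize the match:

```python
from fuzzywuzzy import fuzz

print(fuzz.token_sort_ratio("APT 29", "29 APT"))                # 100: same tokens, different order
print(fuzz.token_sort_ratio("Malicious site", "Malware site"))  # partial similarity score
```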
Chris Lenk 457564f2f9 Update SemEq test, use dict for property weights 2019-12-20 17:01:21 -05:00
Chris Lenk cde57ce8f7
Merge pull request #315 from khdesai/fix_issue_308
Fix issue 308
2019-12-17 12:13:25 -05:00
Desai, Kartikey H 6df7da65b8 Fixes #308 2019-12-17 11:57:55 -05:00
Desai, Kartikey H 8719a7206f Fixes #308 2019-12-16 16:32:55 -05:00
chrisr3d 31d944b159 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-12-12 22:59:39 +01:00
Chris Lenk 77eda29471 Add default weight_dict to documentation
for semantic equivalence
2019-12-11 13:13:36 -05:00
Desai, Kartikey H f6e75cd8f8 Add debug logging messages and add documentation to equivalence.ipynb 2019-12-06 10:46:27 -05:00
Desai, Kartikey H c09bd071d0 Make requested changes, except documentation, which is coming soon 2019-12-06 10:46:27 -05:00
Desai, Kartikey H 2b180c40b5 Remove unnecessary functions 2019-12-06 10:46:27 -05:00
Desai, Kartikey H f5d199bedf Generalize checking functionality within environment.py and add prop_scores dict so all scoring info is one python object 2019-12-06 10:46:27 -05:00
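A hedged usage sketch of the semantic-equivalence API discussed in these commits (method and argument names as they existed around this release; later versions renamed parts of this API):

```python
import stix2

env = stix2.Environment()

ind1 = stix2.v21.Indicator(
    indicator_types=["malicious-activity"],
    pattern_type="stix",
    pattern="[ipv4-addr:value = '198.51.100.1']",
)
ind2 = stix2.v21.Indicator(
    indicator_types=["malicious-activity"],
    pattern_type="stix",
    pattern="[ipv4-addr:value = '198.51.100.1']",
)

prop_scores = {}  # populated by the call with the scoring breakdown
score = env.semantically_equivalent(ind1, ind2, prop_scores)
print(score)        # overall 0-100 similarity
print(prop_scores)  # per-property scoring details
```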
Chris Lenk 387810c4a3 Merge branch 'khdesai-fix_issue_310' 2019-12-06 09:54:43 -05:00
Desai, Kartikey H a350ad01ac Fixes #310 2019-12-06 09:54:00 -05:00
Chris Lenk 1d9ff5d5ae Merge branch 'khdesai-spec_fixes' 2019-12-06 09:48:48 -05:00
Chris Lenk e9795a945b Fix long line 2019-12-06 09:40:27 -05:00
Desai, Kartikey H 54ecba736d Add docstring for enumerate_types() 2019-12-06 09:35:36 -05:00
Desai, Kartikey H f09cf4867d Remove unnecessary comments 2019-12-06 09:35:36 -05:00
Desai, Kartikey H 3a46d42aaa parse() handles observables in 2.1. Change mechanism for (in)valid_types in ReferenceProperty. Fix _custom_observable_builder to include ReferenceProperty instead of ObjectReferenceProperty, and added ID property to custom observables 2019-12-06 09:35:36 -05:00
Desai, Kartikey H aee296ea46 Fixes #296 2019-12-06 09:35:36 -05:00
Chris Lenk 055bd7ad04
Merge pull request #306 from zrush-mitre/various_fixes
Various fixes
2019-11-27 09:47:08 -05:00
Zach Rush 9a56344d92 Forgot to add TLSH to a different regex 2019-11-25 16:14:23 -05:00
Zach Rush 806c6c52d9 Added tests for other changes, and moved attribute defaults to an init function 2019-11-25 15:52:50 -05:00
Chris Lenk c0580a4d86
Fix parsing example in Readme 2019-11-25 13:05:42 -05:00
Zach Rush 46f1778d04 Fixed all issues brought up in issue #305 by Chris Lenk 2019-11-22 13:24:09 -05:00
zrush-mitre 388d95d456
Merge pull request #1 from oasis-open/master
update
2019-11-21 09:47:06 -05:00
chrisr3d e2a4129ad3 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-10-22 01:30:57 +02:00
Chris Lenk d4c0115735 Bump version: 1.2.0 → 1.2.1 2019-10-16 17:24:16 -04:00
Chris Lenk 4d2925c406 Update CHANGELOG for v1.2.1 2019-10-16 17:23:43 -04:00
Chris Lenk 1d671bd144 Merge branch 'master' into stix2.1 2019-10-16 17:02:21 -04:00
Chris Lenk b5612c9dc2 Update semantic equivalence docs
- Comparing object type not in config dictionary now gives a warning and
result of 0 instead of an error.
- Adds an example of the new detailed debug output.
2019-10-16 09:08:03 -04:00
Chris Lenk ec115b3586
Merge pull request #301 from emmanvg/294-semantic-methods
Update semantic equivalence approach
2019-10-16 09:07:34 -04:00
Emmanuelle Vargas-Gonzalez 13fda69079 add test for object not present in configuration 2019-10-15 13:25:11 -04:00
Emmanuelle Vargas-Gonzalez 024e023967 update semantic equivalence approach to:
- add more detailed output via the logging module
- don't fail hard if an unsupported object is sent to the semantically_equivalent() method
- remove specific exception related to Semantic Equivalence and tests
2019-10-15 12:54:41 -04:00
Chris Lenk 39e1ddbbf6 Update semantic equivalence docs 2019-10-14 14:31:44 -04:00
Chris Lenk 08e8b88410
Merge pull request #300 from chisholm/deterministic_id_unicode_fix
Fix deterministic ID handling with unicode
2019-10-14 11:03:53 -04:00
chrisr3d 4f1d68065a
fix: Making python imports happy in travis 2019-10-14 14:41:17 +02:00
chrisr3d bdba2c0a63
fix: Removed comment 2019-10-14 14:27:44 +02:00
chrisr3d bf45f26bfe
fix: Making pep8 happy 2019-10-14 14:15:26 +02:00
chrisr3d 8809418dab
fix: Updated interoperability tests with required arguments 2019-10-14 14:14:39 +02:00
chrisr3d f89940ec0e
fix: Added missing param to the id validation function 2019-10-14 12:34:31 +02:00
chrisr3d adbaec1942
Merge branch 'master' of github.com:oasis-open/cti-python-stix2 + fix interoperability param support 2019-10-14 12:30:15 +02:00
Michael Chisholm edf465bd80 Add a unit test for deterministic ID, with unicode 2019-10-11 18:15:47 -04:00
Michael Chisholm 216b43d49e Fix deterministic UUID handling when there are high-codepoint
unicode characters.  Make compatible with both python 2 and 3.
2019-10-11 17:12:44 -04:00
Emmanuelle Vargas-Gonzalez c42f42e983
Update README.rst
add documentation badge
2019-09-30 13:55:07 -04:00
Emmanuelle Vargas-Gonzalez b9927fd4a5 update .ipynb files with correct references. update package requirements 2019-09-30 13:16:06 -04:00
Chris Lenk 3bc59d6898
Update requirements.txt 2019-09-26 15:53:34 -04:00
Chris Lenk 9e04481acb
Update requirements.txt
Fix failing ReadTheDocs builds.
Related: https://github.com/readthedocs/readthedocs.org/issues/5332
2019-09-26 15:52:44 -04:00
Chris Lenk c6936ae7a2 Bump version: 1.1.3 → 1.2.0 2019-09-25 16:04:07 -04:00
Chris Lenk 0124dbc0dc Update CHANGELOG for v1.2.0 2019-09-25 16:02:26 -04:00
Chris Lenk 5fedd89606 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-09-25 15:21:11 -04:00
Chris Lenk a55666f1a5
Merge pull request #289 from emmanvg/semantic-equivalence
Semantic Equivalence
2019-09-25 15:19:40 -04:00
Chris Lenk 47551b8cc1 Add documentation for semantic equivalence 2019-09-25 14:28:24 -04:00
Emmanuelle Vargas-Gonzalez 75b87f50dd
Update .isort.cfg 2019-09-23 23:33:04 -04:00
Emmanuelle Vargas-Gonzalez de478df687 update test after merge, formatting 2019-09-23 23:27:43 -04:00
Emmanuelle Vargas-Gonzalez 88b883c91d
Merge branch 'master' into semantic-equivalence 2019-09-23 23:20:42 -04:00
Emmanuelle Vargas-Gonzalez dc79a1f869 add docstrings for new public methods. add test with disabled spec_version check.
fix calculation for distance, using incorrect algorithm. update package settings, tox settings
2019-09-23 23:13:50 -04:00
Chris Lenk 9c4f044cc1
Merge pull request #291 from oasis-open/stix2.1
Update Support for STIX 2.1
2019-09-23 12:55:16 -04:00
Chris Lenk 401c9ad950
Merge branch 'master' into stix2.1 2019-09-23 12:26:27 -04:00
Emmanuelle Vargas-Gonzalez 4eaaee89dc make changes according to feedback. allow for custom objects to be supplied to method 2019-09-23 09:44:09 -04:00
Chris Lenk c3b2121f41
Merge pull request #290 from khdesai/wd05SCO
Remove at_least_one=False from Artifact SCO
2019-09-19 10:43:36 -04:00
Desai, Kartikey H 113d481e84 Make SCO deterministic ID namespace a global var for better software hygiene 2019-09-19 10:31:14 -04:00
Desai, Kartikey H f241ed5c6c Remove at_least_one=False from Artifact SCO 2019-09-18 10:56:42 -04:00
Chris Lenk e1a88a4840
Merge pull request #285 from khdesai/wd05SCO
WD 05 SCO
2019-09-18 10:54:13 -04:00
Desai, Kartikey H 3b1c922ba6 Fix observed data property check for at least one property existing 2019-09-18 10:29:07 -04:00
Emmanuelle Vargas-Gonzalez e138753576 add another test 2019-09-17 16:10:54 -04:00
Emmanuelle Vargas-Gonzalez 351362ae33 more tests for coverage 2019-09-17 15:55:12 -04:00
Emmanuelle Vargas-Gonzalez 09858ba263 create more tests to improve coverage 2019-09-17 15:28:37 -04:00
Emmanuelle Vargas-Gonzalez 98ecdf53e3 update timestamp comparison method 2019-09-17 11:08:01 -04:00
Chris Lenk 91f2e50321
Merge pull request #287 from zrush-mitre/master
Fixing precision not holding on a marking-definition during parsing and serialization
2019-09-17 09:10:51 -04:00
Emmanuelle Vargas-Gonzalez ea0df70806 update test environment requirements 2019-09-16 14:45:01 -04:00
Emmanuelle Vargas-Gonzalez e8eb7bcca2 fix logging messages, typos and add tests for the semantic equivalence method 2019-09-16 14:35:14 -04:00
Zach Rush 855bc96863 Avoid throwing exceptions when unneeded to avoid problems 2019-09-13 14:54:52 -04:00
Zach Rush 4c6519cf43 Changed 'six.text_type' to 'six.string_types', since the former didn't seem to work in python2.7 2019-09-13 12:09:02 -04:00
Zach Rush 4753519349 Marking-definitions are now checked for their attribute before being tested and tests were modified to expect the correct value 2019-09-13 10:52:50 -04:00
Zach Rush 5f3e41a9ab Marking-definitions are now checked for their attribute before being tested and tests were modified to expect the correct value 2019-09-13 10:51:28 -04:00
Desai, Kartikey H 8447c9fcd9 Add few tests to improve some code coverage 2019-09-11 14:21:41 -04:00
Zach Rush afa4af65c6 Fixing pre-commit things 2019-09-11 12:22:55 -04:00
Zach Rush e7a6554395 Fixing pre-commit issues 2019-09-11 12:12:26 -04:00
Zach Rush 7c96d991e6 Added a function to ensure precision consistency 2019-09-11 10:55:09 -04:00
Desai, Kartikey H 9c7128d074 Fix indentation issue 2019-09-11 10:49:11 -04:00
Desai, Kartikey H d828e41c78 End of changes 2019-09-11 10:44:14 -04:00
Emmanuelle Vargas-Gonzalez 6fa77adfe3 wrote all default weights, actually computing the equivalence score
logging for unsupported objects, finished implementing some methods. Patterning still needs to be implemented.
2019-09-10 15:04:07 -04:00
Zach Rush 53db47b447 Statement-type definitions will now match the timestamp precision given to them 2019-09-09 21:38:58 -04:00
Desai, Kartikey H 5b6592e2dc Some changes. More fixes coming soon, hopefully 2019-09-06 18:08:27 -04:00
Desai, Kartikey H 8f773fd556 Temp backup of some code changes. More coming soon 2019-09-06 00:25:42 -04:00
Desai, Kartikey H abf2980336 Fix tests and ReferenceProperty 2019-09-04 19:08:34 -04:00
chrisr3d a739c1154e Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-09-03 13:47:43 +02:00
Desai, Kartikey H 44ebd64a16 Some test fixes. More coming soon 2019-08-30 03:47:47 -04:00
Desai, Kartikey H f69b13a006 Some more updates, primarily to ReferenceProperty (and related code) 2019-08-29 17:15:51 -04:00
Chris Lenk 886ef8464d Add contributing note about isort 2019-08-29 16:14:40 -04:00
Desai, Kartikey H 5825118ad4 Merge branch 'stix2.1' of https://github.com/oasis-open/cti-python-stix2 into wd05SCO 2019-08-27 17:37:14 -04:00
Desai, Kartikey H 49077352d7 Updates and corrections for SCO WD 05 updates. Temp backup; testing and more fixes coming soon 2019-08-27 17:36:45 -04:00
Chris Lenk d2a649b4bc
Merge pull request #286 from chisholm/custom_content_error_fix
Fix handling of custom extensions: uncleaned property values
2019-08-27 13:02:11 -04:00
Michael Chisholm 94bb76f669 Fix docstring on the unit tests I added. I'd said "partially
cleaned" property, but actually, the cleaning algorithm works on
a dict copy, so aborting cleaning partway through doesn't
actually affect the object in that way.  It would actually cause
the extensions property to be completely uncleaned, rather than
partially cleaned.
2019-08-26 17:49:55 -04:00
Michael Chisholm c212c7c678 Fix handling of custom extensions: make sure when
allow_custom=True that you never get a half-cleaned property
value.
2019-08-26 17:10:54 -04:00
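A hedged example of the allow_custom path the fix above hardens: parsing content that carries a custom property (the x_acme_* property and identity values are made up):

```python
import stix2

blob = """{
    "type": "identity",
    "spec_version": "2.1",
    "id": "identity--311b2d2d-f010-4473-83ec-1edf84858f4c",
    "created": "2020-01-01T00:00:00.000Z",
    "modified": "2020-01-01T00:00:00.000Z",
    "name": "ACME Corp",
    "identity_class": "organization",
    "x_acme_internal_rating": 7
}"""

# With allow_custom=True the unknown x_acme_* property is accepted; the fix
# above is about never leaving *other* properties half-cleaned in that path.
identity = stix2.parse(blob, allow_custom=True)
print(identity.x_acme_internal_rating)
```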
Desai, Kartikey H 7c9fc3fd08 Fix deterministic ID tests 2019-08-21 09:33:42 -04:00
Desai, Kartikey H 364daec40a Add deterministic ID tests 2019-08-21 09:21:51 -04:00
Desai, Kartikey H 5e9d6a6a14 Fix small indentation error 2019-08-21 08:49:33 -04:00
Desai, Kartikey H bf1b8b567d Updates to allow existing tests to pass 2019-08-21 02:00:41 -04:00
Desai, Kartikey H 46359ead69 Modify a few things 2019-08-19 13:35:17 -04:00
Desai, Kartikey H ec55463398 Update SCO stuff to WD 05 2019-08-19 09:39:13 -04:00
Chris Lenk d4bbc80777 Fix package distribution
(don't include tests)
2019-08-12 15:57:25 -04:00
Chris Lenk b0a1bbbc84 Bump version: 1.1.2 → 1.1.3 2019-08-12 13:32:47 -04:00
Chris Lenk 2b06c8d29b Update CHANGELOG for v1.1.3 2019-08-12 12:52:23 -04:00
Chris Lenk 4fafdfecff Update readme check 2019-08-12 12:28:27 -04:00
Desai, Kartikey H dee2f1f60c Merge branch 'stix2.1' of https://github.com/oasis-open/cti-python-stix2 into wd05SCO 2019-08-12 08:16:00 -04:00
Chris Lenk b981cdf4fc Fix tests
by removing certain human message assertions from test suites.
(These changes are cherry-picked from commit by @khdesai on a different
branch)
2019-08-09 16:04:29 -04:00
Chris Lenk fb834d83f6
Merge pull request #283 from chisholm/stix2.1_wd05
Stix2.1 wd05
2019-08-09 11:27:26 -04:00
Michael Chisholm bb4a6b1bcb Change how tox invokes pytest, to work around what appears to be a tox
bug related to how it copes with code which
emits warnings.
2019-08-08 15:22:21 -04:00
Michael Chisholm 5e5a03c001 Changed emitted deprecation warnings to a custom DeprecationWarning
subclass.  Changed the unit test to test for that specific
warning category, instead of any DeprecationWarning.
2019-08-07 10:16:18 -04:00
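A small sketch of the pattern described above: a custom DeprecationWarning subclass that tests can assert on specifically (class and function names here are illustrative):

```python
import warnings

class STIXDeprecationWarning(DeprecationWarning):
    """Custom category so callers and tests can filter on it specifically."""

def access_deprecated_objects_property():
    warnings.warn(
        "the 'objects' property of observed-data is deprecated in STIX 2.1",
        STIXDeprecationWarning,
        stacklevel=2,
    )

# In a unit test the specific category can then be asserted, e.g. with pytest:
#   with pytest.warns(STIXDeprecationWarning):
#       access_deprecated_objects_property()
```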
Michael Chisholm 27beec4060 Add a deprecation warning for the "objects" property of
observed-data.  Add a unit test to ensure we get the warning.
2019-07-29 16:35:38 -04:00
Emmanuelle Vargas-Gonzalez 93aa709b68 write down some of the semantic-equivalence work. WIP 2019-07-26 16:01:45 -04:00
Michael Chisholm 9404cf4560 Fix flake8 style error. 2019-07-25 16:58:48 -04:00
Michael Chisholm 423487d65a Add a unit test for the first/last_seen value co-constraint
on ThreatActor.
2019-07-25 16:57:15 -04:00
Michael Chisholm 8362d80206 Change "object_modified" property of LanguageContent to be
optional.  Add a corresponding unit test.
2019-07-25 16:56:34 -04:00
Chris Lenk 7978a3e142
Update README.rst
Trim trailing whitespace
2019-07-25 14:33:02 -04:00
Chris Lenk 1a719db40a Add examples to docs for removing properties 2019-07-25 14:22:29 -04:00
Chris Lenk ca818974ac
Update Maintainers List
Added Jason Keirstead
2019-07-25 08:43:36 -04:00
Michael Chisholm 5649559c6d Removed some more hard-codings of v20 in the workbench test
suite.
2019-07-24 17:39:00 -04:00
Michael Chisholm b0eb518997 Added adaptability to the workbench module, regarding the
autogenerated docstrings: v20/v21 is automatically referenced as
appropriate, based on stix2.DEFAULT_VERSION.  To avoid
duplication, I also moved _STIX_VID from test_workbench.py to
workbench.py; the former now imports it from the latter.
2019-07-24 17:20:52 -04:00
Michael Chisholm 22face6c1a Add trailing commas to satisfy pre-commit hooks... 2019-07-24 16:30:18 -04:00
Michael Chisholm 9d08cadcfd Turn off the workbench test suite's side effects after each test
that turns them on.  These have the potential to affect subsequent
tests.  The side effects include automatically setting
property values, and automatically appending additional values
to list-valued properties.
2019-07-24 16:23:19 -04:00
Michael Chisholm 38103ac6c5 Moved test/v20/test_workbench.py up one directory level since
it doesn't make sense to have a test per STIX version.  The
workbench only uses the latest supported STIX version.  In
order to make this work, the test suite was modified to
dynamically compute some settings like where to get demo data,
based on the value of stix2.DEFAULT_VERSION.

Switched stix2.DEFAULT_VERSION back to "2.0", since I figure it
should be sync'd up with the 'from .vxx import *' import
statement from the top level package.
2019-07-24 15:35:59 -04:00
Michael Chisholm d69449706f Revert the docstrings generated for the workbench dynamically
created subclasses, to mention v20 instead of v21.
2019-07-22 17:01:52 -04:00
Michael Chisholm 165d87e103 Revert the import in the top-level stix2 package, to v20. This
additionally required:

- Removing the v21 workbench test suite and reinstating the v20
  test suite
- Fixing up a few v20 unit tests to work with the workbench
  monkeypatching.
- I didn't revert the analogous changes I'd previously made to
  the v21 unit tests, because I think they make sense even when
  the workbench monkeypatching isn't happening.
2019-07-22 16:55:22 -04:00
Michael Chisholm 227383cdcb Removed _observed_data_init() from workbench.py, part of the old
monkeypatching algorithm.  It's no longer needed and I forgot to
delete it.
2019-07-19 15:58:15 -04:00
Michael Chisholm 823b67a4fc Add a few more tests to exercise more complex property presence
constraint checking.
2019-07-19 15:40:03 -04:00
Michael Chisholm 5589480980 Improved the exception class hierarchy:
- Removed all plain python base classes (e.g. ValueError, TypeError)
- Renamed InvalidPropertyConfigurationError -> PropertyPresenceError,
  since incorrect values could be considered a property config error, and
  I really just wanted this class to apply to presence (co-)constraint
  violations.
- Added ObjectConfigurationError as a superclass of InvalidValueError,
  PropertyPresenceError, and any other exception that could be raised
  during _STIXBase object init, which is when the spec compliance
  checks happen.  This class is intended to represent general spec
  violations.
- Did some class reordering in exceptions.py, so all the
  ObjectConfigurationError subclasses were together.

Changed how property "cleaning" errors were handled:
- Previous docs said they should all be ValueErrors, but that would require
  extra exception check-and-replace complexity in the property
  implementations, so that requirement is removed.  Doc is changed to just
  say that cleaning problems should cause exceptions to be raised.
  _STIXBase._check_property() now handles most exception types, not just
  ValueError.
- Decided to try chaining the original clean error to the InvalidValueError,
  in case the extra diagnostics would be helpful in the future.  This is
  done via 'six' adapter function and only works on python3.
- A small amount of testing was removed, since it was looking at custom
  exception properties which became unavailable once the exception was
  replaced with InvalidValueError.

Did another pass through unit tests to fix breakage caused by the changed
exception class hierarchy.

Removed unnecessary observable extension handling code from
parse_observable(), since it was all duplicated in ExtensionsProperty.
The redundant code in parse_observable() had different exception behavior
than ExtensionsProperty, which makes the API inconsistent and unit tests
more complicated.  (Problems in ExtensionsProperty get replaced with
InvalidValueError, but extensions problems handled directly in
parse_observable() don't get the same replacement, and so the exception
type is different.)

Redid the workbench monkeypatching.  The old way was impossible to make
work, and had caused ugly ripple effect hackage in other parts of the
codebase.  Now, it replaces the global object maps with factory functions
which behave the same way when called, as real classes.  Had to fix up a
few unit tests to get them all passing with this monkeypatching in place.
Also remove all the xfail markings in the workbench test suite, since all
tests now pass.

Since workbench monkeypatching isn't currently affecting any unit tests,
tox.ini was simplified to remove the special-casing for running the
workbench tests.

Removed the v20 workbench test suite, since the workbench currently only
works with the latest stix object version.
2019-07-19 14:50:11 -04:00
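The hierarchy described in the long commit message above, sketched as plain classes (names taken from the message; the exact bases in the released library may differ in detail):

```python
class STIXError(Exception):
    """Library base exception."""

class ObjectConfigurationError(STIXError):
    """The object violates the spec; raised during _STIXBase initialization."""

class InvalidValueError(ObjectConfigurationError):
    """A property value failed cleaning/validation."""

class PropertyPresenceError(ObjectConfigurationError):
    """A property presence (co-)constraint was violated."""
```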
Desai, Kartikey H 4660d5ea28 Update SCO specs per WD 05 specs 2019-07-17 15:48:09 -04:00
Michael Chisholm cd0c4984fa Fix most unit tests to pass again. Awaiting feedback regarding
possible library bugs, before I fix the remaining unit tests.
2019-07-16 16:10:25 -04:00
Michael Chisholm 1b7abaf228 WIP: updating objects to be compliant with stix2.1 WD05. This
includes SDO/SRO class updates, but no unit test updates.  The
class updates broke unit tests, so that still needs to be
addressed.
2019-07-14 15:34:31 -04:00
Emmanuelle Vargas-Gonzalez b1fa177f07
Merge pull request #275 from khdesai/stix21master
Stix21master, add infrastructure, grouping to the working branch
2019-07-10 10:35:57 -04:00
Desai, Kartikey H b464a9cc0a Remove certain human message assertions from test suites 2019-07-09 13:34:19 -04:00
Desai, Kartikey H ae35d2ab01 Add and update tests to conform code to WD04 SDO specs 2019-07-02 13:17:43 -04:00
Desai, Kartikey H ffbf5fa34c Fix JSON encoding issue within tests 2019-07-01 15:41:44 -04:00
Desai, Kartikey H c98fcafb1a Update tests to address conformance to WD04 specs 2019-07-01 15:26:30 -04:00
Desai, Kartikey H ef408e1971 preliminary changes to make stix2 code conform to WD 04 specs 2019-07-01 11:52:55 -04:00
chrisr3d b204b9fdda Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-07-01 08:51:26 +02:00
Chris Lenk 244d8c969a
Merge pull request #274 from chrisr3d/master
Avoid issues using parse on bundles
2019-06-28 12:49:17 -04:00
Chris Lenk 953a91ba8e
Merge pull request #273 from chisholm/update_course_of_action
Update course of action for stix2.1 (again)
2019-06-28 12:32:14 -04:00
Chris Lenk 266516ebbc
Merge pull request #272 from chisholm/malware_analysis
Add stix2.1 malware-analysis SDO
2019-06-28 09:57:47 -04:00
chrisr3d 6aff018695
fix: Avoid issues with custom objects
- Custom objects are plain dicts, so the call fails
  when the 'id' attribute is accessed
2019-06-27 17:19:05 +02:00
Michael Chisholm e779cacf3e Update course of action tests, to include tests with the
action_reference property.  Also, stylistic changes to hopefully
let it do more testing with less code.
2019-06-26 21:01:41 -04:00
Michael Chisholm de93a2ee32 Fix stix2.1 course-of-action SDO class properties action_reference
and action_bin to have the correct types.
2019-06-26 19:54:28 -04:00
Michael Chisholm c6132537b8 Changes from add-trailing-comma hook 2019-06-26 17:17:16 -04:00
Michael Chisholm 68f93f4110 Oops, forgot to add the malware-analysis test suite... 2019-06-26 17:10:04 -04:00
Michael Chisholm 5c92db9861 Add stix2.1 malware-analysis SDO 2019-06-26 17:06:26 -04:00
Chris Lenk b8c5bec101 Merge branch 'master' into stix2.1 2019-06-26 12:22:40 -04:00
Chris Lenk 7e989dd13d Merge branch 'chisholm-update_observed_data' into stix2.1 2019-06-26 11:22:12 -04:00
Chris Lenk 28ac284b84 Remove unnecessary ObservedData constraint
first_observed and last_observed are both required, so this co-constraint was removed from WD04.
2019-06-26 11:18:47 -04:00
chrisr3d 565acc7618 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-06-25 09:06:50 +02:00
Michael Chisholm 58ff89f112 Update observed-data SDO class, adding the new stix2.1 property
"object_refs".  Added a couple tests for it.
2019-06-21 15:44:04 -04:00
Chris Lenk 49501029dd
Merge pull request #269 from chisholm/update_id_properties
Update id properties
2019-06-21 15:40:35 -04:00
Michael Chisholm 23d5bef2ec Change all uses of multi-STIX-version properties (i.e. those
with a spec_version constructor argument) in STIX-version-specific
contexts, to explicitly specify the STIX version.
2019-06-21 14:29:08 -04:00
Michael Chisholm 9cc1e6e8c1 Change location of DEFAULT_VERSION definition, to be before
the imports.  This ensures the attribute will be defined even if
there are import loops.
2019-06-21 14:26:48 -04:00
Michael Chisholm 8bb6c79f1d Change import order to satisfy style checkers... 2019-06-21 14:25:36 -04:00
Michael Chisholm f9578313a0 Change stix2.DEFAULT_VERSION to "2.1" on the stix2.1 branch. 2019-06-21 13:20:37 -04:00
Michael Chisholm ea98a53fae Change all hard-coded spec_version defaults in property classes
to stix2.DEFAULT_VERSION.
2019-06-21 13:18:51 -04:00
Michael Chisholm d61b543266 Style changes to satisfy the 'style' tox check 2019-06-14 18:10:38 -04:00
Michael Chisholm a150b0f4aa Change all uses of IDProperty and ReferenceProperty to specify
a particular spec_version.
2019-06-14 17:58:51 -04:00
Michael Chisholm da5978d317 Factored out more of the STIX identifier validity checking,
partly inspired by PR #263.  This resulted in some error message
format changes (an improvement, I think), which caused some
unit test breakage.  Removed those asserts from the unit tests,
since tests shouldn't be testing human-targeted error messages.
2019-06-13 18:37:21 -04:00
Michael Chisholm ed106f23ff Update IDProperty and ReferenceProperty to support both stix 2.0
and 2.1 rules regarding identifiers.  Change relevant property
tests to specify which spec version to use, and modify tests
according to the specs.
2019-06-12 20:19:47 -04:00
Chris Lenk 5b6a0dc087
Merge pull request #268 from chisholm/update_course_of_action
Update stix2.1 course-of-action support to the latest spec.
2019-06-12 17:09:31 -04:00
Michael Chisholm 4f593e6d16 Changes from the add-trailing-comma pre-commit hook 2019-06-12 14:49:34 -04:00
Michael Chisholm caa1d45ae2 Update stix2.1 course-of-action support to the latest spec. 2019-06-11 18:10:02 -04:00
Chris Lenk a6fa3ff1d7 Slightly change bundle error message 2019-05-22 11:05:01 -04:00
Chris Lenk 51b2db4fba
Merge pull request #264 from khdesai/stix2_issue_257
Stix2 issue 257: make accessing bundles easier
2019-05-22 09:59:42 -04:00
Desai, Kartikey H ce86db2a12 Fixes #257 2019-05-20 15:36:35 -05:00
Desai, Kartikey H 86790a736f Fixes #257 2019-05-20 15:29:01 -05:00
Desai, Kartikey H 45d3020518 Fixes #257 2019-05-17 14:21:35 -05:00
Desai, Kartikey H a61344a8aa Add get_obj function to bundle.py to make accessing bundles easier 2019-05-14 13:48:54 -04:00
Chris Lenk 67f6663e5e
Merge pull request #262 from khdesai/master
Fixes #175 - Add documentation for _valid_refs
2019-05-13 10:36:30 -04:00
Kartikey Desai 1bf12221a0 Update _valid_refs doc and add test to v20 test suite 2019-05-13 09:18:50 -04:00
Kartikey Desai 8946308527 Updated documentation for _valid_refs 2019-05-10 12:30:52 -04:00
Desai, Kartikey H 3a33dbe2fa Merge branch 'stix2_issue_175' of https://github.com/khdesai/cti-python-stix2 2019-05-10 11:56:43 -04:00
Kartikey Desai 989f17e673 Add documentation for _valid_refs 2019-05-10 11:54:59 -04:00
Desai, Kartikey H f79b3c9876 Add functionality to _valid_refs to accept actual cyber observable objects instead of just strings with their types 2019-05-10 10:22:45 -04:00
Chris Lenk ddd4fa3e95
Merge pull request #261 from emmanvg/252-TLPMarking-constraints
TLP marking constraints
2019-05-08 10:50:35 -04:00
Emmanuelle Vargas-Gonzalez 087ac35f38 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into 252-TLPMarking-constraints 2019-05-08 10:43:13 -04:00
Emmanuelle Vargas-Gonzalez 00d99e3815 remove unused imports 2019-05-08 10:38:23 -04:00
Emmanuelle Vargas-Gonzalez 9c34e2f8ca update tests to make sure we are testing the serialized instance correctly 2019-05-08 10:36:31 -04:00
Emmanuelle Vargas-Gonzalez d5f0c46dd5 re-organize imports in v20, v21 2019-05-08 10:35:53 -04:00
Emmanuelle Vargas-Gonzalez 47f8ed9282 move check_tlp_marking to markings\utils.py 2019-05-08 10:34:56 -04:00
Chris Lenk 582ba2be2c
Merge pull request #259 from emmanvg/251-lang-markings-support
Language markings support
2019-05-08 09:48:21 -04:00
Emmanuelle Vargas-Gonzalez 851ed3e85a marking-definition 2019-05-03 15:41:58 -04:00
Emmanuelle Vargas-Gonzalez 8d842aeb94 update user guide on marking extraction via API 2019-05-03 14:48:16 -04:00
Emmanuelle Vargas-Gonzalez 4b21708e03 modify test to cover exception message 2019-05-03 11:05:32 -04:00
Emmanuelle Vargas-Gonzalez b3a601e4c8 add new files for marking-definition tests 2019-05-03 10:25:11 -04:00
Emmanuelle Vargas-Gonzalez d6497f66fe create a new exception for TLP validation and util method 2019-05-03 10:03:15 -04:00
Emmanuelle Vargas-Gonzalez 46c47a0d08 new approach towards validation of tlp instances 2019-05-03 09:59:07 -04:00
Emmanuelle Vargas-Gonzalez fff0e9e731 update test_datastore_filesystem.py to create proper tlp markings 2019-05-03 09:58:45 -04:00
chrisr3d 92c0582d8f Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-04-29 16:35:46 +02:00
Chris Lenk 7ee7fb8336
Merge pull request #258 from khdesai/stix2_issue_248
Fixes #248
2019-04-24 13:42:14 -04:00
Emmanuelle Vargas-Gonzalez c3aecd76ba update unnecessary property clean-up and add tests 2019-04-23 09:27:21 -04:00
Emmanuelle Vargas-Gonzalez f8857569d5 Add header to test file 2019-04-23 07:48:51 -04:00
Emmanuelle Vargas-Gonzalez dbc63b7b9f pre-commit changes 2019-04-23 07:43:56 -04:00
Emmanuelle Vargas-Gonzalez 0c78acafd0 add tests to cover the language aspect of the markings 2019-04-22 15:26:21 -04:00
Emmanuelle Vargas-Gonzalez 4bbabaecb2 update marking API methods to allow/use the 'lang' property
including utility methods that expand collapse markings
2019-04-22 15:25:46 -04:00
Desai, Kartikey H 84fc71add4 Add test to ensure fix. Fixes #248 2019-04-19 12:17:42 -04:00
Desai, Kartikey H e748923f19 Fixes #248 2019-04-17 10:08:34 -04:00
chrisr3d 61e9fc0748 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-03-18 09:13:31 +01:00
Chris Lenk f8d4669f80 Bump version: 1.1.1 → 1.1.2 2019-02-13 10:37:38 -05:00
Chris Lenk bddfb06b2a Update CHANGELOG for v1.1.2 2019-02-13 10:37:11 -05:00
chrisr3d 28db2df045 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-02-12 11:45:14 +01:00
Chris Lenk afe57f642d Add docstring for to_maps_url() 2019-02-08 14:41:54 -05:00
Chris Lenk 5b2d28ac0b Merge branch 'khdesai-location_issue_86' 2019-02-08 14:26:43 -05:00
Chris Lenk e976f0a926 Trim location tests
We can rely on defaults for some properties we aren't testing.
2019-02-08 14:17:19 -05:00
Desai, Kartikey H edfe0ba51a Add support for Bing Maps and corresponding tests. Fixes #86 2019-02-08 09:37:27 -05:00
chrisr3d f049c98d43 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-02-08 15:24:22 +01:00
Desai, Kartikey H 516789400b Merge branch 'master' of https://github.com/khdesai/cti-python-stix2 into location_issue_86 2019-02-07 10:37:37 -05:00
Desai, Kartikey H 8be704a5b9 Update to_map_url and add tests. Fixes #86 2019-02-07 10:31:51 -05:00
Desai, Kartikey H dc91c9cbf4 Initial fix for issue 86. Fixes #86 2019-02-06 16:16:50 -05:00
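A hedged usage sketch of the Location helper these commits touch; the accepted map-engine strings are assumptions based on the commit messages:

```python
import stix2

loc = stix2.v21.Location(latitude=48.8566, longitude=2.3522)

print(loc.to_maps_url())             # default engine (Google Maps)
print(loc.to_maps_url("Bing Maps"))  # Bing Maps support added by these commits
```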
Chris Lenk 69d2529f0e Fix style issues 2019-02-06 15:23:00 -05:00
Chris Lenk 0ca2bc087a Fix SDO link in docs 2019-02-06 14:27:33 -05:00
chrisr3d 469d17bcee Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-02-06 08:55:41 +01:00
Chris Lenk 370a7c6f06
Merge pull request #250 from jmgnc/jmgnc-patch-1
minor grammar fix
2019-02-05 09:26:17 -05:00
John-Mark Gurney 1c03b4a1f0
minor grammar fix 2019-02-04 13:58:33 -08:00
Chris Lenk 3912df2a36
Merge pull request #246 from khdesai/fix_issue_245
Start updating test suites to fix issue 245
2019-01-29 11:54:53 -05:00
Desai, Kartikey H a788dbb64c Replace most SDO/SRO values in tests with IDs from constants.py 2019-01-29 10:52:59 -05:00
Desai, Kartikey H 10bfde0e86 Merge branch 'master' of https://github.com/khdesai/cti-python-stix2 into fix_issue_245 2019-01-29 08:31:47 -05:00
Chris Lenk 29a4e2f9a8
Merge pull request #244 from khdesai/fix_issue_230
Fix issue 230
2019-01-23 14:29:41 -05:00
Desai, Kartikey H b4d4a582ce Update timestamps in v20 testsuite JSON files 2019-01-23 13:42:25 -05:00
Desai, Kartikey H cdac66c04d Update v21 test suite. Fixes #245 2019-01-23 10:56:20 -05:00
Desai, Kartikey H 9941014f3a Update v20 test suite to fix issue 245 2019-01-22 23:07:20 -05:00
Desai, Kartikey H 5fb69e1d44 Start updating test suites to fix issue 245 2019-01-22 21:25:09 -05:00
Desai, Kartikey H 59ec498fa0 Fix test cases in v20 2019-01-22 12:55:19 -05:00
Desai, Kartikey H f59db77352 Update v21 tests and add them to v20 test suite 2019-01-22 12:42:47 -05:00
Desai, Kartikey H dda8a7f724 Add two tests to ensure millisecond precision is used in timestamps irrespective of user-provided precision 2019-01-22 10:05:22 -05:00
Desai, Kartikey H 5658cebf57 Update JSON files so timestamps are only precise to the millisecond (3 decimal points), per the specs 2019-01-18 13:28:37 -05:00
chrisr3d 407f346eb8 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-01-14 12:16:12 +01:00
Emmanuelle Vargas-Gonzalez 7e64c70d8b Bump version: 1.1.0 → 1.1.1 2019-01-11 14:27:35 -05:00
Emmanuelle Vargas-Gonzalez ce1bb3bfa7 Update CHANGELOG for v1.1.1 2019-01-11 14:21:05 -05:00
Chris Lenk 4aeff2d977
Merge pull request #242 from emmanvg/234-update-documentation
Update documentation
2019-01-11 14:07:19 -05:00
Emmanuelle Vargas-Gonzalez f80728d3f5 New files and directories for RST files. closes #234 2019-01-11 13:59:08 -05:00
Emmanuelle Vargas-Gonzalez db5f8f2ebf Update docstrings to relocate links\documentation 2019-01-11 13:55:05 -05:00
Emmanuelle Vargas-Gonzalez e1356437fc
Merge pull request #240 from khdesai/fix_issue_232
Fix issue 232, raise DataSourceError when FileSystemStore attempts to overwrite an existing file
2019-01-11 11:10:42 -05:00
Desai, Kartikey H 72d7757c7b Change test to use store instead of source & sink 2019-01-11 10:46:16 -05:00
Desai, Kartikey H 5dea09547e Fix test for fix to issue 232 2019-01-11 09:40:57 -05:00
Desai, Kartikey H 6e28cc8fe6 Add test to fix for issue 232 2019-01-11 09:26:55 -05:00
Desai, Kartikey H 710afe68b2 Merge branch 'master' of https://github.com/khdesai/cti-python-stix2 into fix_issue_232 2019-01-11 08:04:54 -05:00
chrisr3d 83709358c3 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2019-01-10 09:51:41 +01:00
Desai, Kartikey H 767a758b28 Fix styling issue around imports for issue 232 2019-01-09 11:32:51 -05:00
Chris Lenk 8c3ecd1c48
Merge pull request #237 from oasis-open/236-WindowsRegistryKey
Fix error when printing WindowsRegistryKey
2019-01-09 11:12:46 -05:00
Kartikey Desai 5d08edb8b0
Merge pull request #1 from khdesai/fix_issue_232
Fix issue 232
2019-01-09 10:55:28 -05:00
Chris Lenk 1ad64dfc0c Move CallableValues to prevent duplicate code 2019-01-09 10:46:48 -05:00
Desai, Kartikey H 77b2e0e3e3 Remove a few comments and Fixes #232 2019-01-09 10:22:33 -05:00
Chris Lenk 101eb762d1
Merge pull request #239 from oasis-open/238-empty-dictionaries
Remove dictionary/extension property non-empty req
2019-01-09 09:22:59 -05:00
Desai, Kartikey H 7883614d2f Fixes #232 2019-01-09 08:36:10 -05:00
Emmanuelle Vargas-Gonzalez 26a658b789 Update test to v21 2019-01-08 09:41:53 -05:00
Emmanuelle Vargas-Gonzalez 67d3970a50
Update test_observed_data.py
Change to correct version
2019-01-08 09:35:01 -05:00
Chris Lenk ab687d8d0e Test empty extension property serialization 2019-01-07 15:22:08 -05:00
Chris Lenk 2966efa4f0 Remove dictionary/extension property non-empty req
Only bundle.objects and observed-data.objects have a requirement to
include at least one item.
2019-01-07 11:15:47 -05:00
chrisr3d f527e279b3
fix: Using raw string in the regular expression compilation 2018-12-26 12:57:46 +01:00
Chris Lenk 34002c4f7c Fix error when printing WindowsRegistryKey
Caused by WindowsRegistryKey having a 'values' property. Fixes #236.
2018-12-21 14:33:59 -05:00
chrisr3d 383bd56f0e
fix: Added the interoperability parameter to new_inner objects _STIXBase __deepcopy__ function 2018-12-21 10:59:34 +01:00
chrisr3d f560049f96
Split interoperability tests for both versions 2018-12-18 10:48:18 +01:00
chrisr3d 939a2d5428
add: Applying interoperability parameter to v2.1 objects 2018-12-14 10:12:30 +01:00
chrisr3d a68a43a732 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2018-12-14 10:09:58 +01:00
Emmanuelle Vargas-Gonzalez 06e23b08b8 Bump version: 1.0.4 → 1.1.0 2018-12-11 14:14:49 -05:00
Emmanuelle Vargas-Gonzalez c67d180e99 update CHANGELOG for v1.1.0 2018-12-11 14:12:49 -05:00
Emmanuelle Vargas-Gonzalez ba2f63f745
Merge pull request #220 from emmanvg/1.1.0-release
Stage changes for v1.1.0 release. Introduces initial support for 2.1 objects
2018-12-11 14:08:11 -05:00
Emmanuelle Vargas-Gonzalez 6e50bf5123 Formatting problems... 2018-12-11 13:48:56 -05:00
Emmanuelle Vargas-Gonzalez c8c48a305a Add future import to resolve compatibility problems 2018-12-11 13:41:19 -05:00
Emmanuelle Vargas-Gonzalez 7d84c63e8e pre-commit formatting changes 2018-12-11 13:23:43 -05:00
Emmanuelle Vargas-Gonzalez f12cc82d8a incorporate feedback
update documentation for core.py and automatic copyright year for docs
2018-12-11 13:22:04 -05:00
Emmanuelle Vargas-Gonzalez 3f02925fc9 add new pattern_expressions tests to proper locations 2018-12-11 13:07:53 -05:00
Emmanuelle Vargas-Gonzalez ff098a19b1 update method _timestamp2filename() since it introduces timing precision problems 2018-12-11 13:06:51 -05:00
Emmanuelle Vargas-Gonzalez c75a0857ec Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into 1.1.0-release 2018-12-11 13:03:42 -05:00
Emmanuelle Vargas-Gonzalez 605842001f
Merge pull request #229 from oasis-open/add_visitor2
Add an AST for STIX pattern navigation. From add_visitor2
2018-12-11 09:00:34 -05:00
chrisr3d f0ac7aeb3c add: Added tests for the new parameter 2018-12-11 10:16:17 +01:00
chrisr3d 3850a046ff fix: Applying the interoperability parameter to UUIDs referenced in various SDOs & SROs 2018-12-11 10:16:17 +01:00
chrisr3d 3ae38fe687 fix: Applying the same parameter as for IDs in created_by_ref UUIDs 2018-12-11 10:16:17 +01:00
chrisr3d 86536b43b1 fix: Avoiding our additional parameter to be in the parsed STIX objects 2018-12-11 10:16:17 +01:00
chrisr3d 067d76bb90 chg: Added parameter to accept UUIDs not v4
- This parameter is not used at the creation of
  STIX objects, only while converting json format
  into STIX objects.
2018-12-11 10:16:17 +01:00
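
A hypothetical usage sketch of the parameter described above; the interoperability keyword comes from these commits and is not part of the upstream stix2 API, so treat the call signature as an assumption:

    import stix2

    identity_json = """{
        "type": "identity",
        "spec_version": "2.1",
        "id": "identity--6ba7b810-9dad-11d1-80b4-00c04fd430c8",
        "created": "2018-12-11T09:00:00.000Z",
        "modified": "2018-12-11T09:00:00.000Z",
        "name": "ACME",
        "identity_class": "organization"
    }"""

    # The id above uses a version-1 UUID; with interoperability=True the parser
    # is expected to accept it instead of insisting on UUIDv4 (assumed keyword).
    identity = stix2.parse(identity_json, interoperability=True)
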
Emmanuelle Vargas-Gonzalez f20ee91544 rename 'STIXPatternVisitor' to 'pattern_visitor' 2018-12-10 15:23:26 -05:00
Emmanuelle Vargas-Gonzalez 9a69823d08 Revert unnecessary changes 2018-12-10 15:08:43 -05:00
Emmanuelle Vargas-Gonzalez 7702d435ba update method to use docstrings 2018-12-10 15:07:38 -05:00
Emmanuelle Vargas-Gonzalez 50a2191805 Favor star import, disable messages for undefined methods F405 2018-12-10 14:44:44 -05:00
Emmanuelle Vargas-Gonzalez fc0069ed60 re-order imports, add entry to isort file 2018-12-10 14:29:31 -05:00
Richard Piazza b3f69bf942 imports again 2018-12-10 13:42:05 -05:00
Richard Piazza fcea810ea1 added test for ListConstant 2018-12-10 12:54:58 -05:00
Richard Piazza 7bd330dfae import experiment 3 2018-12-10 12:25:59 -05:00
Richard Piazza 1bb3aa12f0 import experiment 2 2018-12-10 12:16:39 -05:00
Richard Piazza 05964ee0c7 import experiment 2018-12-10 12:14:31 -05:00
Richard Piazza a5eca9916c last flake-y 2018-12-09 21:48:24 -05:00
Richard Piazza 99276c92fc flake-y again 2018-12-09 21:42:24 -05:00
Richard Piazza f3526bbfa6 flakey5 2018-12-07 14:06:36 -05:00
Richard Piazza 55ac3564bd flakey4 2018-12-07 14:03:46 -05:00
Richard Piazza 52c7d2a722 flakey3 2018-12-07 13:59:57 -05:00
Richard Piazza 3ea8035fcb flakey2 2018-12-07 13:47:44 -05:00
Richard Piazza da4a91a3ca flakey 2018-12-07 13:30:05 -05:00
Richard Piazza 03cceb827d add_visitor - take 2 2018-12-07 12:43:23 -05:00
Emmanuelle Vargas-Gonzalez 96b81fc489 pre-commit formatting changes 2018-12-06 15:19:50 -05:00
Emmanuelle Vargas-Gonzalez 01df0ccc57 Add new files for the test/v2X/stix2_data/ directory 2018-12-06 15:18:25 -05:00
Emmanuelle Vargas-Gonzalez bfa49ce37a Removed everything in test/v21/stix2_data/
Bring back optional version parameter to datastores. Update
documentation. Update v21 test suite
2018-12-06 15:11:30 -05:00
Emmanuelle Vargas-Gonzalez 3d099bec91 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into 1.1.0-release 2018-12-06 15:08:36 -05:00
Chris Lenk 522e9cedd0
Merge pull request #228 from chisholm/multi_version_filesystem_store
Multi version filesystem store, take 2
2018-12-03 07:29:07 -05:00
Emmanuelle Vargas-Gonzalez e1f7cc4028 change "Notes" for "Note" to keep visual effect in documentation 2018-11-30 09:39:05 -05:00
Emmanuelle Vargas-Gonzalez 7ee7352574
Update README.rst 2018-11-29 19:05:49 -05:00
Emmanuelle Vargas-Gonzalez f76de87f59
Update test_datastore_taxii.py
return the right bundle...
2018-11-29 18:45:34 -05:00
Emmanuelle Vargas-Gonzalez c62b9e92e7 revamp code in MockTAXIICollectionEndpoint, add more tests 2018-11-29 18:36:37 -05:00
Emmanuelle Vargas-Gonzalez 06716e3cfd remove redundant/unreachable code in core, add tests 2018-11-29 14:41:57 -05:00
Emmanuelle Vargas-Gonzalez aa649d4727 more pre-commit changes 2018-11-29 13:50:05 -05:00
Emmanuelle Vargas-Gonzalez f1490a98c8 remove full path from `constants` and fix directory resolution 2018-11-29 13:49:06 -05:00
Emmanuelle Vargas-Gonzalez 63c22aba99 fix path issues related to memory datastore 2018-11-29 12:17:26 -05:00
Emmanuelle Vargas-Gonzalez 6e9312efb7 fix test memory datastore teardown 2018-11-29 11:48:14 -05:00
Emmanuelle Vargas-Gonzalez 1b0fa0129f pre-commit changes 2018-11-29 11:06:27 -05:00
Emmanuelle Vargas-Gonzalez e365de3693 update setup.py information 2018-11-29 10:53:54 -05:00
Emmanuelle Vargas-Gonzalez 7f3a8b6c80 more tests to improve coverage 2018-11-29 10:27:13 -05:00
Emmanuelle Vargas-Gonzalez 6f897bc91d small enhancements, fix property for TLPMarking 2018-11-29 10:26:20 -05:00
Emmanuelle Vargas-Gonzalez 79c9d85072 make Memory datastore return path where data was saved to 2018-11-29 10:25:15 -05:00
Emmanuelle Vargas-Gonzalez 682e90ccaa expose the confidence methods via `stix2.scales.<method>` 2018-11-28 17:17:05 -05:00
Emmanuelle Vargas-Gonzalez ee14a116bd add new .rst documentation files 2018-11-28 17:03:02 -05:00
Emmanuelle Vargas-Gonzalez e896812754 minor code changes 2018-11-28 16:51:35 -05:00
Emmanuelle Vargas-Gonzalez 71a2aa2611 update project documentation. 2018-11-28 16:51:00 -05:00
Emmanuelle Vargas-Gonzalez 97a21c3064 my precious tables gone :( 2018-11-28 15:34:48 -05:00
Emmanuelle Vargas-Gonzalez c3031a0282 fix typo on DNI scale 2018-11-28 11:28:26 -05:00
Emmanuelle Vargas-Gonzalez aaddeb8b97 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into 1.1.0-release 2018-11-28 11:21:27 -05:00
Emmanuelle Vargas-Gonzalez e8b5ecc0de download badge is back! 2018-11-28 09:31:17 -05:00
Emmanuelle Vargas-Gonzalez 7aad97307e use stix2 version to update documentation 2018-11-28 09:20:26 -05:00
Michael Chisholm 17970a3faa Fixed a couple filter tests.
- a length check should come before the access, so you can verify
whether the access will succeed.

- Also removed some tests which can't work, due to the filter
changes.  In fact, a lot of these tests should probably be
removed or changed if we want to disallow running
apply_common_filters() on plain dicts.  They will often
coincidentally still succeed though, so I left them in.
2018-11-27 18:42:51 -05:00
Michael Chisholm 3a2f247f68 Fixed my own brainfart with converting string filter values to
datetimes: I'd converted the object property instead of the
filter value! :-P

Also, I fixed filter validation: it was checking for exact types
of the filter values and disallowing subtypes.  This library
includes a datetime subtype named STIXdatetime, and this type
should be usable as a filter value too.  So we need to allow
subtypes.
2018-11-27 18:38:55 -05:00
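
The type-checking point above is easy to illustrate in isolation; STIXdatetime is stood in for by a trivial subclass here, so this is a sketch rather than the library's actual validation code:

    from datetime import datetime

    class STIXdatetime(datetime):
        """Stand-in for the library's datetime subclass."""

    value = STIXdatetime(2018, 11, 27, 18, 38, 55)

    print(type(value) is datetime)      # False: an exact-type check rejects the subtype
    print(isinstance(value, datetime))  # True: an isinstance() check accepts it
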
Michael Chisholm f57b9c34ef Add a newline to the end of a file 2018-11-27 17:58:01 -05:00
Michael Chisholm 18ff6f6094 Import cleanup to satisfy tox checks 2018-11-27 17:52:38 -05:00
Michael Chisholm 3adf7800a8 Changed how filters work, with respect to datetime objects.
Timestamp properties can now be checked against filter values
which are either strings or datetime objects, using datetime
semantics (previously, it reduced to a string compare).
If a stix object property is datetime-valued and the filter
value is a string, the string is parsed to a datetime object,
rather than the other way around.

Filtering in the filesystem store now parses JSON dicts to
_STIXBase objects before applying the filters.

Due to the parsing change, bad JSON content can produce a
different kind of error, so I had to change one of the tests.
2018-11-27 17:36:17 -05:00
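
A minimal sketch of the behavior described above, assuming a v20 Indicator (whose remaining required timestamps are filled in by library defaults) and an in-memory store; with this change the filter value may be a string or a datetime and is compared with datetime semantics:

    from datetime import datetime, timezone

    from stix2 import Filter, MemoryStore
    from stix2.v20 import Indicator

    store = MemoryStore()
    store.add(Indicator(
        labels=["malicious-activity"],
        pattern="[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']",
    ))

    # Timestamp filters now accept datetime objects as well as strings.
    results = store.query([Filter("created", ">", datetime(2018, 1, 1, tzinfo=timezone.utc))])
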
Michael Chisholm d8a775c60d Fix some more improper exception re-raises in the filesystem
datastore test suite.  Add a new test corpus file, located so
as to test the backward compatibility functionality of
FileSystemSource.  Add a test to the suite which ensures that
this new file is found.
2018-11-27 15:24:09 -05:00
Michael Chisholm 63166ab256 Add some backward-compatibility to filesystem store: versioned
objects are searched for as ID-named json files in the type
directories, in addition to timestamp-named files in ID
directories.

Made a bugfix: fixed improper exception re-raises

Made an efficiency improvement: don't stat() files in
_get_matching_dir_entries() if no st_mode_test callable is given.
2018-11-27 15:24:09 -05:00
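
For orientation, the two on-disk layouts the source now checks look roughly like this (the paths and the timestamp-style filename are illustrative, not the library's exact naming):

    import os

    stix_dir = "stix2_data"
    obj_type = "indicator"
    obj_id = "indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae"

    # Old layout: one ID-named JSON file directly in the type directory.
    legacy_path = os.path.join(stix_dir, obj_type, obj_id + ".json")

    # New layout: a directory per ID, holding one timestamp-named file per version.
    versioned_path = os.path.join(stix_dir, obj_type, obj_id, "20181127152409000000.json")
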
Michael Chisholm f615161110 Added some tests for adding markings to sinks and stores. 2018-11-27 15:24:08 -05:00
Michael Chisholm da13882eec Fix FileSystemSource.get() to not look for the latest version of
an object when markings are queried, since markings are not
versioned.
2018-11-27 15:24:08 -05:00
Michael Chisholm 0cecbeb9d8 Ran trailing-whitespace pre-commit hook. It changed a bunch of
files, in ways we don't completely understand...
2018-11-27 15:24:08 -05:00
Michael Chisholm 0a8ff2ab2e Add some newer versions of a couple of object IDs in the stix2
test data corpus.  Updated filesystem store tests accordingly:
- Remove comments from all_versions tests stating that multiple
  versions are not supported.  Improve the tests to ensure that
  all versions are in fact retrieved.
- Update the get() test to assure that it gets only the latest
  version, when there is more than one version.
- Update some count checks, since there are more objects now
- Fix some typos
2018-11-27 15:24:07 -05:00
Michael Chisholm 2b983368e5 Fix an indexing error which caused FileSystemSource.get() to return
the oldest object instead of the newest.
2018-11-27 15:24:07 -05:00
Michael Chisholm 9693c16cd1 Adjust import order to satisfy tox import check 2018-11-27 15:24:07 -05:00
Michael Chisholm 428a18ded2 Implemented clenk's suggested changes in multi-version filesystem
store:
- Use utils.get_type_from_id() instead of my own (I didn't know it
  was already there)
- Use dict-style instead of attribute-style access to get stix
  object properties
- Convert timezone-aware timestamps to UTC in _timestamp2filename()
  to ensure that different times always result in different
  filenames.

Also added a couple new tests for _timestamp2filename(), which
exercises the timezone conversion code.
2018-11-27 15:24:07 -05:00
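
An illustrative stand-in (not the library's own _timestamp2filename()) for the timezone normalization described above:

    from datetime import datetime, timedelta, timezone

    def timestamp_to_filename(ts):
        # Normalize an aware timestamp to UTC before deriving the filename, so the
        # same instant expressed in different zones maps to the same name and
        # distinct instants never collide.
        return ts.astimezone(timezone.utc).strftime("%Y%m%d%H%M%S%f")

    est = timezone(timedelta(hours=-5))
    a = datetime(2018, 11, 27, 15, 24, 7, tzinfo=est)           # 15:24:07 -05:00
    b = datetime(2018, 11, 27, 20, 24, 7, tzinfo=timezone.utc)  # the same instant in UTC
    assert timestamp_to_filename(a) == timestamp_to_filename(b)
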
Michael Chisholm 461e8bd5cb Removed the old FileSystemSource.query method. I'd renamed it
"query2" and forgot about it and left it there...
2018-11-27 15:24:07 -05:00
Michael Chisholm 0096835cfc Add multi-version support to the filesystem datastore.
Factored out the _is_marking() function from the memory datastore
module to utils so it can be reused, and changed both filesystem
and memory datastore modules to import and use it.
2018-11-27 15:24:07 -05:00
Emmanuelle Vargas-Gonzalez 7cc7431cb7 update maintainer information 2018-11-16 15:18:55 -05:00
Chris Lenk 84348b526d
Merge pull request #226 from emmanvg/1.0.4-release
Final update for v1.0.4 release
2018-11-15 13:46:34 -05:00
Emmanuelle Vargas-Gonzalez d01e6b47af Bump version: 1.0.3 → 1.0.4 2018-11-15 11:10:50 -05:00
Emmanuelle Vargas-Gonzalez 17fa71d201 Update CHANGELOG for v1.0.4 2018-11-15 11:10:26 -05:00
Chris Lenk 28d069a7f5
Merge pull request #225 from emmanvg/memory-datastore-fix
Memory datastore fix
2018-11-14 15:53:03 -05:00
Emmanuelle Vargas-Gonzalez 51df054f33
Update memory.py 2018-11-14 15:16:49 -05:00
Emmanuelle Vargas-Gonzalez d6435a18fa Missing changes to key/value approach 2018-11-14 15:03:57 -05:00
Emmanuelle Vargas-Gonzalez c80f39ceed Change approach to allow for custom objects 2018-11-14 14:35:22 -05:00
Chris Lenk de73935d2a
Merge pull request #224 from oasis-open/revert-222-multi_version_filesystem_store
Revert "Multi version filesystem store"
2018-11-07 10:26:42 -05:00
Chris Lenk c4668f5dc1
Revert "Multi version filesystem store" 2018-11-07 10:10:06 -05:00
Chris Lenk 150457c1bb
Merge pull request #222 from chisholm/multi_version_filesystem_store
Multi version filesystem store
2018-11-06 16:32:07 -05:00
Michael Chisholm b235e5773c Added some tests for adding markings to sinks and stores. 2018-11-06 16:15:33 -05:00
Michael Chisholm 7bb3d1f6a6 Fix FileSystemSource.get() to not look for the latest version of
an object when markings are queried, since markings are not
versioned.
2018-11-06 16:06:26 -05:00
Michael Chisholm 9f83f2140b Ran trailing-whitespace pre-commit hook. It changed a bunch of
files, in ways we don't completely understand...
2018-11-06 15:10:40 -05:00
Chris Lenk 693879eeb1
Merge pull request #221 from oasis-open/212-extensions-none-defined
Use consistent errors for observable extensions
2018-11-02 17:30:12 -04:00
Michael Chisholm a8d9aef673 Add some newer versions of a couple of object IDs in the stix2
test data corpus.  Updated filesystem store tests accordingly:
- Remove comments from all_versions tests stating that multiple
  versions are not supported.  Improve the tests to ensure that
  all versions are in fact retrieved.
- Update the get() test to assure that it gets only the latest
  version, when there is more than one version.
- Update some count checks, since there are more objects now
- Fix some typos
2018-11-01 20:25:00 -04:00
Michael Chisholm e2f5d60b51 Fix an indexing error which caused FileSystemSource.get() to return
the oldest object instead of the newest.
2018-11-01 20:25:00 -04:00
Michael Chisholm e2d9325356 Adjust import order to satisfy tox import check 2018-11-01 20:25:00 -04:00
Michael Chisholm ee57596d6a Implemented clenk's suggested changes in multi-version filesystem
store:
- Use utils.get_type_from_id() instead of my own (I didn't know it
  was already there)
- Use dict-style instead of attribute-style access to get stix
  object properties
- Convert timezone-aware timestamps to UTC in _timestamp2filename()
  to ensure that different times always result in different
  filenames.

Also added a couple new tests for _timestamp2filename(), which
exercises the timezone conversion code.
2018-11-01 20:25:00 -04:00
Michael Chisholm 51668a9a04 Removed the old FileSystemSource.query method. I'd renamed it
"query2" and forgot about it and left it there...
2018-11-01 20:25:00 -04:00
Michael Chisholm 9486b46f77 Add multi-version support to the filesystem datastore.
Factored out the _is_marking() function from the memory datastore
module to utils so it can be reused, and changed both filesystem
and memory datastore modules to import and use it.
2018-11-01 20:25:00 -04:00
Chris Lenk 3b297c17b5 Use consistent errors for observable extensions
Whether or not the Observable type is in the EXT_MAP already, using a
custom extension without also using allow_custom=True should result in
the same behavior/error message.
2018-11-01 17:23:55 -04:00
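
A minimal sketch of the rule above; the extension name is made up, and parse_observable() is used with the library defaults of this era:

    import stix2

    domain = {
        "type": "domain-name",
        "value": "example.com",
        "extensions": {"x-acme-ext": {"rank": 1}},  # hypothetical, unregistered extension
    }

    try:
        stix2.parse_observable(domain)              # rejected without allow_custom
    except Exception as err:
        print(type(err).__name__, err)

    parsed = stix2.parse_observable(domain, allow_custom=True)  # accepted
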
Emmanuelle Vargas-Gonzalez 8d24015186 Update Memory datastore to allow for mapping objects 2018-11-01 10:54:58 -04:00
Emmanuelle Vargas-Gonzalez 5abe518b8a Bump version: 1.0.3 → 1.1.0 2018-11-01 09:55:37 -04:00
Emmanuelle Vargas-Gonzalez 700988c65f Update CHANGELOG for v1.1.0 2018-11-01 09:48:59 -04:00
Emmanuelle Vargas-Gonzalez 493bd65ead Update README and refactor code to make 2.0 default. Update some tests 2018-11-01 09:21:02 -04:00
Emmanuelle Vargas-Gonzalez 5e5d10e7aa Finish alignment of 2.1 components 2018-11-01 08:17:34 -04:00
Emmanuelle Vargas-Gonzalez eff5369670 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2018-11-01 07:57:09 -04:00
Chris Lenk 3084c9f51f
Merge pull request #219 from oasis-open/stix2.0
Final updates to STIX 2.0 release
2018-10-31 14:12:56 -04:00
Emmanuelle Vargas-Gonzalez d614343910 Rename tests with duplicate name. 2018-10-17 07:56:10 -04:00
Emmanuelle Vargas-Gonzalez 352749edb0 Add constraints to ObservedData and Sighting, tests updated. 2018-10-17 07:47:25 -04:00
Emmanuelle Vargas-Gonzalez f8a72b0937 Custom builder code updated for 3.7 support.
Updated properties to support more constraints.
Make all regexes literal strings.
Update tests to align to new constraints.
Workbench problem. _check_object_constraints() uses instance class to
perform proper class resolution calls.
2018-10-17 07:34:15 -04:00
Emmanuelle Vargas-Gonzalez b2ef77b322 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2018-10-17 07:30:23 -04:00
Emmanuelle Vargas-Gonzalez dec75082df Add new constraint parameters to IntegerProperty and FloatProperty
New constraints on timestamps, integer and floats for many objects
2018-10-15 15:02:59 -04:00
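
A sketch of the bounds checking these commits introduce, assuming the keyword names are min and max:

    from stix2.properties import IntegerProperty

    prop = IntegerProperty(min=0, max=100)

    prop.clean(42)       # within bounds: accepted
    try:
        prop.clean(-5)   # below the minimum: rejected
    except ValueError as err:
        print(err)
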
Emmanuelle Vargas-Gonzalez acd86c80dd Update tests to new object constraints 2018-10-15 14:48:52 -04:00
Emmanuelle Vargas-Gonzalez 120e897e9b Update Indicator example to 2.1 representation. 2018-07-26 09:00:20 -04:00
Emmanuelle Vargas-Gonzalez 211b8d8cee Add core tests and minor change to parse_observable() 2018-07-25 20:53:53 -04:00
Emmanuelle Vargas-Gonzalez 5e71f9225b Add regex approach to load mappings 2018-07-25 14:06:18 -04:00
Emmanuelle Vargas-Gonzalez ad76e7155c MALWARE RESTORE POINT - Reverted changes to Malware based on STIX 2.1 CSD01
Use this commit to restore Malware changes.
2018-07-25 13:34:56 -04:00
Emmanuelle Vargas-Gonzalez 303159a818 pre-commit hooks changes 2018-07-25 13:32:22 -04:00
Emmanuelle Vargas-Gonzalez 21c84acc8f Add missing properties to Relationship object and update tests 2018-07-25 12:44:46 -04:00
Emmanuelle Vargas-Gonzalez af2a5605ce Add constraints to Location object 2018-07-25 12:43:57 -04:00
Emmanuelle Vargas-Gonzalez 40d656c94c Minor changes to tests. 2018-07-25 12:43:08 -04:00
Emmanuelle Vargas-Gonzalez ea15b6f795 Empty commit, but signals the following work was completed
This closes #202, closes #197, closes #196, closes #193, closes #80
2018-07-13 11:14:27 -04:00
Emmanuelle Vargas-Gonzalez 51a499cb33 Formatting changes made by the new pre-commit hook 'add trailing commas'
closes #189
2018-07-13 11:10:05 -04:00
Emmanuelle Vargas-Gonzalez e0aa8abd0c Update README.rst
Just formatting changes to enhance readability
2018-07-13 11:02:29 -04:00
Emmanuelle Vargas-Gonzalez 7476456e46 Update isort.cfg and .pre-commit-config.yaml
Adds 'add trailing commas' hook and changes isort to not re-write each other
2018-07-13 10:37:53 -04:00
Emmanuelle Vargas-Gonzalez 965d7fa788 Update v20 and v21 tests
In v20, only minor stuff that was addressing wrong spec. In v21, align tests with new/changed properties in the specs
2018-07-12 14:33:00 -04:00
Emmanuelle Vargas-Gonzalez bdfc221cb0 Update v21 properties to latest spec changes 2018-07-12 14:31:14 -04:00
Emmanuelle Vargas-Gonzalez 281dbfb0f4 Align tests with new additions from 'master' branch. 2018-07-11 09:43:37 -04:00
Emmanuelle Vargas-Gonzalez a042970a1f Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2018-07-11 09:41:42 -04:00
Emmanuelle Vargas-Gonzalez ee260b7574 Sort import check 2018-07-11 08:38:06 -04:00
Emmanuelle Vargas-Gonzalez e513c8d638 Hide builder methods in 'custom.py' and update imports accordingly 2018-07-11 08:11:47 -04:00
Emmanuelle Vargas-Gonzalez 6bd797e258 Fix 'test_memory_store_object_with_custom_property_in_bundle'
Since a Bundle's objects are now added instead of the Bundle itself, it now works as intended.
2018-07-10 16:19:40 -04:00
Emmanuelle Vargas-Gonzalez 834ef2c847 Fix check to collections.Mapping 2018-07-10 16:13:29 -04:00
Emmanuelle Vargas-Gonzalez c91bcd43f6 Fix location for test_object_property 2018-07-10 16:11:07 -04:00
Emmanuelle Vargas-Gonzalez 48e0442439 Fix tests in 'test_properties.py' 2018-07-10 16:10:01 -04:00
Emmanuelle Vargas-Gonzalez d24cddb547 Temporarily skip failing tests in workbench for v20, but the approach needs to be fixed 2018-07-10 16:08:36 -04:00
Emmanuelle Vargas-Gonzalez 4583da3ef2 Sort imports 2018-07-10 16:05:30 -04:00
Emmanuelle Vargas-Gonzalez ce42c02cee Fix tests that use property objects to call in the right path 2018-07-10 15:56:22 -04:00
Emmanuelle Vargas-Gonzalez 8cf68054d4 Remove str() as a way to serialize objects. Add support for encodings and Bundle versions. 2018-07-10 15:51:20 -04:00
Emmanuelle Vargas-Gonzalez 012eba4e9b Add new Bundle support for add_objects() request. Add encoding support.
Removed the version from all methods since it is no longer necessary.
2018-07-10 15:49:36 -04:00
Emmanuelle Vargas-Gonzalez 9b8cb09b1a Remove 'version' from calls to parse since it is no longer necessary
Also, fixed adding STIX2 Bundles to MemorySource. Enhancements to 'save_to_file'. Fix docstrings and encoding support when writing to file. closes #202
2018-07-10 15:43:58 -04:00
Emmanuelle Vargas-Gonzalez b6fefc52d9 Fix call to collect STIX2 mappings, make parse_observable available 2018-07-10 15:31:22 -04:00
Emmanuelle Vargas-Gonzalez 8d378fcf81 Remove STIXRelationshipObject from 'sro.py' 2018-07-10 15:27:05 -04:00
Emmanuelle Vargas-Gonzalez 023603d86f Remove duplicate code from 'sdo.py', removed STIXDomainObject
Apply proper 'spec_version' constraints to v21 objects
2018-07-10 15:22:21 -04:00
Emmanuelle Vargas-Gonzalez 54268ae7dd Remove observables and extension mappings, custom code and apply property constraints in v21 2018-07-10 15:20:16 -04:00
Emmanuelle Vargas-Gonzalez 1177694739 Adding docstrings to 2.1 objects 2018-07-10 15:15:33 -04:00
Emmanuelle Vargas-Gonzalez b722fdc0ed Remove duplicate register methods and CustomMarking code 2018-07-10 15:07:08 -04:00
Emmanuelle Vargas-Gonzalez 5332d54383 Refactor Bundle
Removed redundant STIXObjectProperty, for 2.1 use validation specific to that version
2018-07-10 15:02:55 -04:00
Emmanuelle Vargas-Gonzalez 78d77254ae Add object mappings in the top of each version package 2018-07-10 14:59:43 -04:00
Emmanuelle Vargas-Gonzalez 03e19f197c Moved STIXDomainObject and STIXRelationshipObject here, the observable
parsing code, minor changes to the mappings collections and all _register methods
2018-07-10 14:56:31 -04:00
Emmanuelle Vargas-Gonzalez b76888c682 Add a new 'custom.py' module to store and consolidate all custom code 2018-07-10 14:54:17 -04:00
Emmanuelle Vargas-Gonzalez f669656a4d Removed 'ExtensionsProperty' from workbench. 2018-07-10 14:52:10 -04:00
Emmanuelle Vargas-Gonzalez 7da6f1ed88 Add a package-wide 'properties.py'
Consolidated STIXObjectProperty, ExtensionsProperty, ObservableProperty code. Also added a 'spec_version' to allow validation changes per spec.
2018-07-10 14:50:03 -04:00
Emmanuelle Vargas-Gonzalez fe64fb044f Removed per version 'properties.py' 2018-07-10 14:47:30 -04:00
Emmanuelle Vargas-Gonzalez 645a258c62 Fix file indentation 2018-07-10 14:46:46 -04:00
Emmanuelle Vargas-Gonzalez 99e76c26ae Update Tox test configuration 2018-07-10 14:39:20 -04:00
Emmanuelle Vargas-Gonzalez edd7148e3c It appears we did not support the case when the Bundle contains UTF-8 content 2018-07-09 15:26:57 -04:00
Emmanuelle Vargas-Gonzalez 70a1e9522b Add 'stix2_data' test directory for v20 tests 2018-07-09 15:22:08 -04:00
Emmanuelle Vargas-Gonzalez 0197f9fd17 Minor fixes to tests. Some datastore calls had strange parameters.
Fix error values for CustomMarking and fix incorrect test data
2018-07-09 15:20:04 -04:00
Emmanuelle Vargas-Gonzalez 646d941032 Removed 'test_memory' file and moved all tests into 'test_datastore_memory'
The tests under 'test_memory' were moved into a new file called 'test_datastore_composite' to make clear what we are testing.
2018-07-09 15:15:29 -04:00
Emmanuelle Vargas-Gonzalez 546216c396 Remove unnecessary 'True' in Memory datastore tests. 2018-07-09 15:07:05 -04:00
Emmanuelle Vargas-Gonzalez 8aeac369f4 Returning double quotes here... went overboard. 2018-07-09 14:59:19 -04:00
Emmanuelle Vargas-Gonzalez 21d5451d1c Small changes to tests 2018-07-06 14:11:59 -04:00
Emmanuelle Vargas-Gonzalez 52c1850655 Small addition to patterns.py 2018-07-06 14:08:49 -04:00
Emmanuelle Vargas-Gonzalez 5be1636b10 Update v20 tests to ensure right methods and classes are used 2018-07-05 15:23:25 -04:00
Emmanuelle Vargas-Gonzalez 2c5ddc14af Update v21 tests for some missing methods, ensure we are calling and
using the right classes.
2018-07-05 15:21:09 -04:00
Emmanuelle Vargas-Gonzalez bfa86bf87e Format objects in observed_data 2018-07-03 10:32:04 -04:00
Emmanuelle Vargas-Gonzalez 04680d8a3d First pass at making sure everything uses v21 classes and representations 2018-07-03 09:40:51 -04:00
Emmanuelle Vargas-Gonzalez 3100fa1fb8 Move v20 tests to their own package 2018-07-03 07:02:57 -04:00
Emmanuelle Vargas-Gonzalez da5b16dc2f Create v21 test package with new spec changes 2018-07-03 07:00:18 -04:00
Emmanuelle Vargas-Gonzalez c2f5a40986 Create new test subpackages 2018-06-30 19:36:54 -04:00
Emmanuelle Vargas-Gonzalez 2e6bb74be8 Add spec version 2.1 to object missing the property. 2018-06-29 18:48:41 -04:00
Emmanuelle Vargas-Gonzalez 7fd379d0b5 Minor style changes.
Removed OrderedDict and update()... Also a lot of single quoting except for errors
2018-06-29 18:38:04 -04:00
Emmanuelle Vargas-Gonzalez 9cc74e88b6
Merge pull request #198 from treyka/stix2.1
update dict key limits per 2.1 spec
2018-06-26 12:57:44 -04:00
Emmanuelle Vargas-Gonzalez 59fdd3082e Update tests. 2018-06-26 12:29:20 -04:00
Emmanuelle Vargas-Gonzalez 9baaad6e08 Sort imports. 2018-06-26 12:23:53 -04:00
Trey Darley 5cbe886cdb split properties out by spec version 2018-06-26 09:32:24 +00:00
Trey Darley d44c2abd0f 2.1 spec (somewhat inexplicably) limits dict keys to 250 chars 2018-06-26 09:23:52 +00:00
Trey Darley cc58a3a4f4 2.1 removes 3 char limit on dict keys 2018-06-26 09:22:57 +00:00
Trey Darley 6b1da856dd split properties out by spec version 2018-06-26 09:22:04 +00:00
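
A hypothetical helper capturing the per-version key rules described above (2.0 requires 3-256 ASCII characters, 2.1 drops the minimum and caps keys at 250); this is not the library's own DictionaryProperty logic:

    def check_dict_key(key, spec_version="2.1"):
        if spec_version == "2.0":
            if not 3 <= len(key) <= 256:
                raise ValueError("2.0 dictionary keys must be 3-256 characters long")
        elif len(key) > 250:
            raise ValueError("2.1 dictionary keys must be at most 250 characters long")

    check_dict_key("md5", spec_version="2.0")  # ok
    check_dict_key("a", spec_version="2.1")    # ok: 2.1 has no 3-character minimum
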
Emmanuelle Vargas-Gonzalez b852b91652 Reformat methods documentation 2018-06-25 10:06:07 -04:00
Emmanuelle Vargas-Gonzalez 0ddb7b3807 Update observables.RasterImageExt 'image_weight' property to 'image_width' 2018-06-25 08:55:12 -04:00
Emmanuelle Vargas-Gonzalez abd172eb3f Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2018-06-25 08:25:57 -04:00
Greg Back 53a1a0329a Merge branch 'master' into stix2.1 2018-06-15 10:04:00 -05:00
Greg Back 78c4d48bd9
Merge pull request #191 from chisholm/malware2.1
Malware2.1
2018-06-15 08:22:35 -05:00
Michael Chisholm 240a75861e Updated stix2.parse()-related docstrings. Its description of how
the "version" parameter was used, was out of date.
2018-06-14 15:56:02 -04:00
Michael Chisholm 486c588306 Fix silly isort check errors.. 2018-06-14 15:56:02 -04:00
Michael Chisholm 3101584b3d Fix test_bundle to compare against stix2.1 relationships. The
fixture those particular tests use creates 2.1 relationships.
2018-06-14 15:56:02 -04:00
Michael Chisholm 0c3f826c24 First cut at splitting the Bundle implementation into v20 and
v21 variants.  Also fixed up unit tests and got them passing
again.
2018-06-14 15:56:02 -04:00
Michael Chisholm ef8d45723f Update many unit tests to work with the malware2.1 API changes
I made.  The bundle tests and Bundle itself have not been fixed
yet in this commit.
2018-06-14 15:56:01 -04:00
Michael Chisholm f211649529 Made some minimal changes to support the STIX 2.1 Malware SDO,
and the maec2stix tool.
2018-06-14 15:56:01 -04:00
Greg Back 2e0dfc6592 Merge remote-tracking branch 'origin/master' into stix2.1 2018-06-14 12:42:06 -05:00
Greg Back 75e478312a Additional changes to match updates in v20 code. 2018-06-11 13:37:45 -05:00
Greg Back 3e159abd4d WIP: Merge branch 'master' into stix2.1 2018-05-23 10:43:52 -05:00
Greg Back a780bcfe86
Merge pull request #130 from emmanvg/stix2.1
STIX 2.1 Branch Update
2018-02-26 10:40:50 -06:00
Emmanuelle Vargas-Gonzalez b6c22010bb Fix imports for test_location.py 2018-02-23 08:52:34 -05:00
Emmanuelle Vargas-Gonzalez d51f1014c7 Fix IntrusionSet 'last_seen' in 2.1 2018-02-23 08:24:26 -05:00
Emmanuelle Vargas-Gonzalez 06974c72f5 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2018-02-23 08:21:16 -05:00
Emmanuelle Vargas-Gonzalez 722d46c6c5 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2017-12-08 09:36:59 -05:00
Emmanuelle Vargas-Gonzalez b83c5ac7ef Update README 2017-11-02 09:59:42 -04:00
Emmanuelle Vargas-Gonzalez f6f7d0aed8 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 2017-11-02 07:48:37 -04:00
Emmanuelle Vargas-Gonzalez bdb91c6ac4 Update STIX 2.1 structure 2017-11-02 07:21:24 -04:00
Emmanuelle Vargas-Gonzalez d4db4f0ab8 Define source code encoding 2017-10-24 12:53:53 -04:00
Emmanuelle Vargas-Gonzalez 7b6236674c Add new tests. 2017-10-23 08:06:29 -04:00
Emmanuelle Vargas-Gonzalez ef98c38937 Minor changes 2017-10-23 08:04:18 -04:00
Emmanuelle Vargas-Gonzalez be3e841ecb New object test files 2017-10-18 13:58:24 -04:00
Greg Back a79f10eab8 Merge pull request #78 from emmanvg/stix2.1
STIX 2.1 Work in Progress
2017-10-11 18:57:56 +00:00
Emmanuelle Vargas-Gonzalez f6e21d2199 Merge branch 'master' of github.com:oasis-open/cti-python-stix2 into stix2.1 2017-10-11 13:31:46 -04:00
Emmanuelle Vargas-Gonzalez c4c2fb950e Implement LanguageContent object. Update GranularMarking and other missing properties 2017-10-11 13:30:26 -04:00
Emmanuelle Vargas-Gonzalez b99d9e4132 Update confidence docstrings 2017-10-11 13:16:59 -04:00
Emmanuelle Vargas-Gonzalez 2ec8205f1e Implement confidence conversion scales 2017-10-09 07:22:19 -04:00
Emmanuelle Vargas-Gonzalez 50f3d60259 Update branch 2017-10-06 20:34:08 -04:00
Emmanuelle Vargas-Gonzalez 5577686ee8 Add new STIX2.1 SDOs and additional properties 2017-10-06 15:09:14 -04:00
256 changed files with 34416 additions and 6010 deletions

.gitignore

@ -55,6 +55,7 @@ coverage.xml
# Sphinx documentation
docs/_build/
.ipynb_checkpoints
default_sem_eq_weights.rst
# PyBuilder
target/
@ -68,3 +69,31 @@ cache.sqlite
# PyCharm
.idea/
### macOS template
# General
.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk

.isort.cfg

@ -2,8 +2,11 @@
skip = workbench.py
not_skip = __init__.py
known_third_party =
antlr4,
dateutil,
haversine,
medallion,
pyjarowinkler,
pytest,
pytz,
requests,
@ -14,3 +17,5 @@ known_third_party =
taxii2client,
known_first_party = stix2
force_sort_within_sections = 1
multi_line_output = 5
include_trailing_comma = True

.pre-commit-config.yaml

@ -1,11 +1,16 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
sha: v0.9.4
rev: v1.3.0
hooks:
- id: trailing-whitespace
- id: flake8
args:
- --max-line-length=160
- id: check-merge-conflict
- repo: https://github.com/asottile/add-trailing-comma
rev: v0.6.4
hooks:
- id: add-trailing-comma
- repo: https://github.com/FalconSocial/pre-commit-python-sorter
sha: b57843b0b874df1d16eb0bef00b868792cb245c2
hooks:

.travis.yml

@ -1,22 +1,21 @@
sudo: false
os: linux
language: python
cache: pip
dist: xenial
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
matrix:
include:
- python: 3.7 # https://github.com/travis-ci/travis-ci/issues/9069#issuecomment-425720905
dist: xenial
sudo: true
- "3.7"
- "3.8"
install:
- pip install -U pip setuptools
- pip install tox-travis pre-commit
- pip install tox-travis
- pip install codecov
- if [[ $TRAVIS_PYTHON_VERSION != 3.4 ]]; then pip install pre-commit; fi
script:
- tox
- pre-commit run --all-files
- if [[ $TRAVIS_PYTHON_VERSION != 3.4 ]]; then pre-commit run --all-files; fi
after_success:
- codecov

CHANGELOG

@ -1,6 +1,86 @@
CHANGELOG
=========
1.4.0 - 2020-04-03
* #347, #355, #356, #357, #358, #360, #362, #369, #370, #379, #374, #384 Updates STIX 2.1 support to CS01
* #376 Fixes bug where registering object of same name would overwrite it; will
now raise an error
1.3.1 - 2020-03-06
* #322 Adds encoding option FileSystemSource and MemorySource
* #354 Adds ability to specify id-contributing properties on custom SCOs
* #346 Certain SCO properties are no longer deprecated
* #327 Fixes missing 'name' property on Marking Definitions
* #303 Fixes bug with escaping quotes in patterns
* #331 Fixes crashing bug of property names that conflict with Mapping methods
* #337 Fixes bug with detecting STIX version of content when parsing
* #342, #343 Fixes bug when adding SCOs to Memory or FileSystem Stores
* #348 Fixes bug with generating deterministic IDs for SCOs
* #344 Fixes bug with propagating errors from the pattern validator
1.3.0 - 2020-01-04
* #305 Updates support of STIX 2.1 to WD06
* #304 Updates semantic equivalence to latest draft, and allows programmatic
detailed logging
* Adds Python 3.8 support
* #297 Fixes bug with File.contains_refs
* #311 Fixes several DeprecationWarnings
* #315 Fixes parsing embedded external references with custom properties
* #316 Fix socket extension key checking
* #317 Fixes checking of Indicator's pattern property based on pattern_version
1.2.1 - 2019-10-16
* #301 Adds more detailed debugging semantic equivalence output
* #301 Updates semantic equivalence errors
* #300 Fixes bug with deterministic IDs for SCOs containing unicode
1.2.0 - 2019-09-25
* #268, #271, #273, #275, #283, #285, #290 Changes support of STIX 2.1 to WD05 (CSD02), for all object types
* #269 Updates id properties to take a spec_version parameter
* #283 Changes the exception class hierarchy
* #289 Adds functions for calculating semantic equivalence of two objects
* #286 Fixes handling of custom observable extensions
* #287 Fixes bug with timestamp precision preservation in MarkingDefinition objects
1.1.3 - 2019-08-12
* #258 Ignores empty values for optional fields
* #259 Adds support for lang granular markings
* #261 Prevents instantiation or serialization of TLP marking-definitions that don't follow the spec
* #262 Supports actual objects in _valid_refs instead of just strings
* #264 Supports accessing objects in bundles via STIX Object IDs
* #274 Fixes bug parsing bundle containing custom objects
1.1.2 - 2019-02-13
* #86 Adds helper function to Location objects to generate a URL to the location in an online map engine.
1.1.1 - 2019-01-11
* #234 Update documentation structure to better navigate between v20/v21 objects
* #232 FileSystemStore now raises an exception if you attempt to overwrite an existing file
* #236 Fix a serialization problem with the WindowsRegistryKey observable object
* #238 Fix a problem with the LanguageContent object not allowing its creation with an empty dictionary
1.1.0 - 2018-12-11
- Most (if not all) STIX 2.1 SDOs/SROs and core objects have been implemented according to the latest CSD/WD document
- There is an implementation for the conversion scales
- #196, #193 Removing duplicate code for: properties, registering objects, parsing objects, custom objects
- #80, #197 Most (if not all) tests created for v20 are also implemented for v21
- #189 Added extra checks for the pre-commit tool
- #202 It is now possible to pass a Bundle into add() method in Memory datastores
1.0.4 - 2018-11-15
* #225 MemorySource fix to support custom objects
* #212 More consistency in Observable extensions behavior/error messages
1.0.3 - 2018-10-31
* #187 Pickle proof objects

README.rst

@ -1,42 +1,34 @@
|Build_Status| |Coverage| |Version|
|Build_Status| |Coverage| |Version| |Downloads_Badge| |Documentation_Status|
cti-python-stix2
================
This is an `OASIS TC Open
Repository <https://www.oasis-open.org/resources/open-
repositories/>`__.
This is an `OASIS TC Open Repository <https://www.oasis-open.org/resources/open-repositories/>`__.
See the `Governance <#governance>`__ section for more information.
This repository provides Python APIs for serializing and de-
serializing
STIX 2 JSON content, along with higher-level APIs for common tasks,
including data markings, versioning, and for resolving STIX IDs across
multiple data sources.
This repository provides Python APIs for serializing and de-serializing STIX2
JSON content, along with higher-level APIs for common tasks, including data
markings, versioning, and for resolving STIX IDs across multiple data sources.
For more information, see `the
documentation <https://stix2.readthedocs.io/>`__ on
ReadTheDocs.
For more information, see `the documentation <https://stix2.readthedocs.io/>`__ on ReadTheDocs.
Installation
------------
Install with `pip <https://pip.pypa.io/en/stable/>`__:
::
.. code-block:: bash
pip install stix2
$ pip install stix2
Usage
-----
To create a STIX object, provide keyword arguments to the type's
constructor. Certain required attributes of all objects, such as
``type`` or
``id``, will be set automatically if not provided as keyword
arguments.
To create a STIX object, provide keyword arguments to the type's constructor.
Certain required attributes of all objects, such as ``type`` or ``id``, will
be set automatically if not provided as keyword arguments.
.. code:: python
.. code-block:: python
from stix2 import Indicator
@ -44,172 +36,141 @@ arguments.
labels=["malicious-activity"],
pattern="[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']")
To parse a STIX JSON string into a Python STIX object, use
``parse()``:
To parse a STIX JSON string into a Python STIX object, use ``parse()``:
.. code:: python
.. code-block:: python
from stix2 import parse
indicator = parse("""{
"type": "indicator",
"spec_version": "2.1",
"id": "indicator--dbcbd659-c927-4f9a-994f-0a2632274394",
"created": "2017-09-26T23:33:39.829Z",
"modified": "2017-09-26T23:33:39.829Z",
"labels": [
"name": "File hash for malware variant",
"indicator_types": [
"malicious-activity"
],
"name": "File hash for malware variant",
"pattern_type": "stix",
"pattern": "[file:hashes.md5 ='d41d8cd98f00b204e9800998ecf8427e']",
"valid_from": "2017-09-26T23:33:39.829952Z"
}""")
print(indicator)
For more in-depth documentation, please see
`https://stix2.readthedocs.io/ <https://stix2.readthedocs.io/>`__.
For more in-depth documentation, please see `https://stix2.readthedocs.io/ <https://stix2.readthedocs.io/>`__.
STIX 2.X Technical Specification Support
----------------------------------------
This version of python-stix2 supports STIX 2.0 by default. Although,
the
`stix2` Python library is built to support multiple versions of the
STIX
Technical Specification. With every major release of stix2 the
``import stix2``
statement will automatically load the SDO/SROs equivalent to the most
recent
supported 2.X Technical Specification. Please see the library
documentation
for more details.
This version of python-stix2 brings initial support to STIX 2.1 currently at the
CSD level. The intention is to help debug components of the library and also
check for problems that should be fixed in the specification.
The `stix2` Python library is built to support multiple versions of the STIX
Technical Specification. With every major release of stix2 the ``import stix2``
statement will automatically load the SDO/SROs equivalent to the most recent
supported 2.X Committee Specification. Please see the library documentation for
more details.
Governance
----------
This GitHub public repository (
**https://github.com/oasis-open/cti-python-stix2** ) was
`proposed <https://lists.oasis-
open.org/archives/cti/201702/msg00008.html>`__
and
`approved <https://www.oasis-
open.org/committees/download.php/60009/>`__
This GitHub public repository (**https://github.com/oasis-open/cti-python-stix2**) was
`proposed <https://lists.oasis-open.org/archives/cti/201702/msg00008.html>`__ and
`approved <https://www.oasis-open.org/committees/download.php/60009/>`__
[`bis <https://issues.oasis-open.org/browse/TCADMIN-2549>`__] by the
`OASIS Cyber Threat Intelligence (CTI)
TC <https://www.oasis-open.org/committees/cti/>`__ as an `OASIS TC
Open
Repository <https://www.oasis-open.org/resources/open-
repositories/>`__
to support development of open source resources related to Technical
Committee work.
`OASIS Cyber Threat Intelligence (CTI) TC <https://www.oasis-open.org/committees/cti/>`__
as an `OASIS TC Open Repository <https://www.oasis-open.org/resources/open-repositories/>`__
to support development of open source resources related to Technical Committee work.
While this TC Open Repository remains associated with the sponsor TC,
its
development priorities, leadership, intellectual property terms,
participation rules, and other matters of governance are `separate and
distinct <https://github.com/oasis-open/cti-python-
stix2/blob/master/CONTRIBUTING.md#governance-distinct-from-oasis-tc-
process>`__
While this TC Open Repository remains associated with the sponsor TC, its
development priorities, leadership, intellectual property terms, participation
rules, and other matters of governance are `separate and distinct
<https://github.com/oasis-open/cti-python-stix2/blob/master/CONTRIBUTING.md#governance-distinct-from-oasis-tc-process>`__
from the OASIS TC Process and related policies.
All contributions made to this TC Open Repository are subject to open
source license terms expressed in the `BSD-3-Clause
License <https://www.oasis-open.org/sites/www.oasis-
open.org/files/BSD-3-Clause.txt>`__.
That license was selected as the declared `"Applicable
License" <https://www.oasis-open.org/resources/open-
repositories/licenses>`__
source license terms expressed in the `BSD-3-Clause License <https://www.oasis-open.org/sites/www.oasis-open.org/files/BSD-3-Clause.txt>`__.
That license was selected as the declared `"Applicable License" <https://www.oasis-open.org/resources/open-repositories/licenses>`__
when the TC Open Repository was created.
As documented in `"Public Participation
Invited <https://github.com/oasis-open/cti-python-
stix2/blob/master/CONTRIBUTING.md#public-participation-invited>`__",
contributions to this OASIS TC Open Repository are invited from all
parties, whether affiliated with OASIS or not. Participants must have
a
GitHub account, but no fees or OASIS membership obligations are
required. Participation is expected to be consistent with the `OASIS
TC Open Repository Guidelines and
Procedures <https://www.oasis-open.org/policies-guidelines/open-
repositories>`__,
the open source
`LICENSE <https://github.com/oasis-open/cti-python-
stix2/blob/master/LICENSE>`__
As documented in `"Public Participation Invited
<https://github.com/oasis-open/cti-python-stix2/blob/master/CONTRIBUTING.md#public-participation-invited>`__",
contributions to this OASIS TC Open Repository are invited from all parties,
whether affiliated with OASIS or not. Participants must have a GitHub account,
but no fees or OASIS membership obligations are required. Participation is
expected to be consistent with the `OASIS TC Open Repository Guidelines and Procedures
<https://www.oasis-open.org/policies-guidelines/open-repositories>`__,
the open source `LICENSE <https://github.com/oasis-open/cti-python-stix2/blob/master/LICENSE>`__
designated for this particular repository, and the requirement for an
`Individual Contributor License
Agreement <https://www.oasis-open.org/resources/open-
repositories/cla/individual-cla>`__
`Individual Contributor License Agreement <https://www.oasis-open.org/resources/open-repositories/cla/individual-cla>`__
that governs intellectual property.
Maintainers
~~~~~~~~~~~
TC Open Repository
`Maintainers <https://www.oasis-open.org/resources/open-
repositories/maintainers-guide>`__
TC Open Repository `Maintainers <https://www.oasis-open.org/resources/open-repositories/maintainers-guide>`__
are responsible for oversight of this project's community development
activities, including evaluation of GitHub `pull
requests <https://github.com/oasis-open/cti-python-
stix2/blob/master/CONTRIBUTING.md#fork-and-pull-collaboration-
model>`__
and
`preserving <https://www.oasis-open.org/policies-guidelines/open-
repositories#repositoryManagement>`__
open source principles of openness and fairness. Maintainers are
recognized and trusted experts who serve to implement community goals
and consensus design preferences.
activities, including evaluation of GitHub
`pull requests <https://github.com/oasis-open/cti-python-stix2/blob/master/CONTRIBUTING.md#fork-and-pull-collaboration-model>`__
and `preserving <https://www.oasis-open.org/policies-guidelines/open-repositories#repositoryManagement>`__
open source principles of openness and fairness. Maintainers are recognized
and trusted experts who serve to implement community goals and consensus design
preferences.
Initially, the associated TC members have designated one or more
persons
to serve as Maintainer(s); subsequently, participating community
members
may select additional or substitute Maintainers, per `consensus
agreements <https://www.oasis-open.org/resources/open-
repositories/maintainers-guide#additionalMaintainers>`__.
Initially, the associated TC members have designated one or more persons to
serve as Maintainer(s); subsequently, participating community members may
select additional or substitute Maintainers, per `consensus agreements
<https://www.oasis-open.org/resources/open-repositories/maintainers-guide#additionalMaintainers>`__.
.. _currentMaintainers:
.. _currentmaintainers:
**Current Maintainers of this TC Open Repository**
- `Chris Lenk <mailto:clenk@mitre.org>`__; GitHub ID:
https://github.com/clenk/; WWW: `MITRE
Corporation <http://www.mitre.org/>`__
https://github.com/clenk/; WWW: `MITRE Corporation <http://www.mitre.org/>`__
- `Emmanuelle Vargas-Gonzalez <mailto:emmanuelle@mitre.org>`__; GitHub ID:
https://github.com/emmanvg/; WWW: `MITRE
Corporation <https://www.mitre.org/>`__
- `Jason Keirstead <mailto:Jason.Keirstead@ca.ibm.com>`__; GitHub ID:
https://github.com/JasonKeirstead; WWW: `IBM <http://www.ibm.com/>`__
About OASIS TC Open Repositories
--------------------------------
- `TC Open Repositories: Overview and
Resources <https://www.oasis-open.org/resources/open-
repositories/>`__
- `Frequently Asked
Questions <https://www.oasis-open.org/resources/open-
repositories/faq>`__
- `Open Source
Licenses <https://www.oasis-open.org/resources/open-
repositories/licenses>`__
- `Contributor License Agreements
(CLAs) <https://www.oasis-open.org/resources/open-
repositories/cla>`__
- `Maintainers' Guidelines and
Agreement <https://www.oasis-open.org/resources/open-
repositories/maintainers-guide>`__
- `TC Open Repositories: Overview and Resources <https://www.oasis-open.org/resources/open-repositories/>`__
- `Frequently Asked Questions <https://www.oasis-open.org/resources/open-repositories/faq>`__
- `Open Source Licenses <https://www.oasis-open.org/resources/open-repositories/licenses>`__
- `Contributor License Agreements (CLAs) <https://www.oasis-open.org/resources/open-repositories/cla>`__
- `Maintainers' Guidelines and Agreement <https://www.oasis-open.org/resources/open-repositories/maintainers-guide>`__
Feedback
--------
Questions or comments about this TC Open Repository's activities
should be
composed as GitHub issues or comments. If use of an issue/comment is
not
Questions or comments about this TC Open Repository's activities should be
composed as GitHub issues or comments. If use of an issue/comment is not
possible or appropriate, questions may be directed by email to the
Maintainer(s) `listed above <#currentmaintainers>`__. Please send
general questions about TC Open Repository participation to OASIS
Staff at
Maintainer(s) `listed above <#currentmaintainers>`__. Please send general
questions about TC Open Repository participation to OASIS Staff at
repository-admin@oasis-open.org and any specific CLA-related questions
to repository-cla@oasis-open.org.
.. |Build_Status| image:: https://travis-ci.org/oasis-open/cti-python-stix2.svg?branch=master
:target: https://travis-ci.org/oasis-open/cti-python-stix2
:alt: Build Status
.. |Coverage| image:: https://codecov.io/gh/oasis-open/cti-python-stix2/branch/master/graph/badge.svg
:target: https://codecov.io/gh/oasis-open/cti-python-stix2
:alt: Coverage
.. |Version| image:: https://img.shields.io/pypi/v/stix2.svg?maxAge=3600
:target: https://pypi.python.org/pypi/stix2/
:alt: Version
.. |Downloads_Badge| image:: https://img.shields.io/pypi/dm/stix2.svg?maxAge=3600
:target: https://pypi.python.org/pypi/stix2/
:alt: Downloads
.. |Documentation_Status| image:: https://readthedocs.org/projects/stix2/badge/?version=latest
:target: https://stix2.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status


@ -0,0 +1,5 @@
scales
=======================
.. automodule:: stix2.confidence.scales
:members:


@ -0,0 +1,5 @@
confidence
================
.. automodule:: stix2.confidence
:members:

docs/api/stix2.v20.rst

@ -0,0 +1,5 @@
v20
=========
.. automodule:: stix2.v20
:members:

docs/api/stix2.v21.rst

@ -0,0 +1,5 @@
v21
=========
.. automodule:: stix2.v21
:members:


@ -0,0 +1,5 @@
bundle
================
.. automodule:: stix2.v20.bundle
:members:


@ -0,0 +1,5 @@
bundle
================
.. automodule:: stix2.v21.bundle
:members:


@ -0,0 +1,5 @@
common
================
.. automodule:: stix2.v21.common
:members:


@ -0,0 +1,5 @@
observables
=====================
.. automodule:: stix2.v21.observables
:members:


@ -0,0 +1,5 @@
sdo
=============
.. automodule:: stix2.v21.sdo
:members:


@ -0,0 +1,5 @@
sro
=============
.. automodule:: stix2.v21.sro
:members:

docs/conf.py

@ -1,3 +1,5 @@
import datetime
import json
import os
import re
import sys
@ -6,6 +8,8 @@ from six import class_types
from sphinx.ext.autodoc import ClassDocumenter
from stix2.base import _STIXBase
from stix2.environment import WEIGHTS
from stix2.version import __version__
sys.path.insert(0, os.path.abspath('..'))
@ -31,11 +35,11 @@ source_suffix = '.rst'
master_doc = 'index'
project = 'stix2'
copyright = '2017, OASIS Open'
copyright = '{}, OASIS Open'.format(datetime.date.today().year)
author = 'OASIS Open'
version = '1.0.3'
release = '1.0.3'
version = __version__
release = __version__
language = None
exclude_patterns = ['_build', '_templates', 'Thumbs.db', '.DS_Store', 'guide/.ipynb_checkpoints']
@ -49,7 +53,7 @@ html_sidebars = {
'navigation.html',
'relations.html',
'searchbox.html',
]
],
}
latex_elements = {}
@ -57,6 +61,14 @@ latex_documents = [
(master_doc, 'stix2.tex', 'stix2 Documentation', 'OASIS', 'manual'),
]
# Add a formatted version of environment.WEIGHTS
default_sem_eq_weights = json.dumps(WEIGHTS, indent=4, default=lambda o: o.__name__)
default_sem_eq_weights = default_sem_eq_weights.replace('\n', '\n ')
default_sem_eq_weights = default_sem_eq_weights.replace(' "', ' ')
default_sem_eq_weights = default_sem_eq_weights.replace('"\n', '\n')
with open('default_sem_eq_weights.rst', 'w') as f:
f.write(".. code-block:: py\n\n {}\n\n".format(default_sem_eq_weights))
def get_property_type(prop):
"""Convert property classname into pretty string name of property.


@ -109,3 +109,11 @@ then look at the resulting report in ``htmlcov/index.html``.
All commits pushed to the ``master`` branch or submitted as a pull request are
tested with `Travis-CI <https://travis-ci.org/oasis-open/cti-python-stix2>`_
automatically.
Adding a dependency
-------------------
One of the pre-commit hooks we use in our development environment enforces a
consistent ordering of imports. If you need to add a new library as a dependency,
please add it to the `known_third_party` section of `.isort.cfg` to make sure
the import is sorted correctly.


@ -144,12 +144,12 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--548af3be-39d7-4a3e-93c2-1a63cccf8951&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for malware variant&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;pattern&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;[file:hashes.md5 = &#39;d41d8cd98f00b204e9800998ecf8427e&#39;]&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193659Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509629Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;malicious-activity&quot;</span>\n",
" <span class=\"p\">]</span>\n",
@ -330,6 +330,19 @@
"indicator.name"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<div class=\"alert alert-warning\">\n",
"\n",
"**Warning**\n",
"\n",
"Note that there are several attributes on these objects used for method names. Accessing those will return a bound method, not the attribute value.\n",
"\n",
"</div>\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
@ -465,9 +478,9 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--3d7f0c1c-616a-4868-aa7b-150821d2a429&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:46.584Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:46.584Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--1f2aba70-f0ae-49cd-9267-6fcb1e43be67&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:04.698Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:04.698Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;Poison Ivy&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;remote-access-trojan&quot;</span>\n",
@ -498,7 +511,7 @@
"source": [
"As with indicators, the ``type``, ``id``, ``created``, and ``modified`` properties will be set automatically if not provided. For Malware objects, the ``labels`` and ``name`` properties must be provided.\n",
"\n",
"You can see the full list of SDO classes [here](../api/stix2.v20.sdo.rst)."
"You can see the full list of SDO classes [here](../api/v20/stix2.v20.sdo.rst)."
]
},
{
@ -588,12 +601,12 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--34ddc7b4-4965-4615-b286-1c8bbaa1e7db&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:49.474Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:49.474Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--80c174fa-36d1-47c2-9a9d-ce0c636bedcc&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:13.152Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:13.152Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;relationship_type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicates&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--548af3be-39d7-4a3e-93c2-1a63cccf8951&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--3d7f0c1c-616a-4868-aa7b-150821d2a429&quot;</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--1f2aba70-f0ae-49cd-9267-6fcb1e43be67&quot;</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
@@ -700,12 +713,12 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--0a646403-f7e7-4cfd-b945-cab5cde05857&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:51.417Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:51.417Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--47395d23-dedd-45d4-8db1-c9ffaf44493d&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:16.566Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:16.566Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;relationship_type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicates&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--548af3be-39d7-4a3e-93c2-1a63cccf8951&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--3d7f0c1c-616a-4868-aa7b-150821d2a429&quot;</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--1f2aba70-f0ae-49cd-9267-6fcb1e43be67&quot;</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
@@ -810,26 +823,26 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;bundle&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;bundle--f83477e5-f853-47e1-a267-43f3aa1bd5b0&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;bundle--388c9b2c-936c-420a-baa5-04f48d682a01&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;spec_version&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2.0&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;objects&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--548af3be-39d7-4a3e-93c2-1a63cccf8951&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for malware variant&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;pattern&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;[file:hashes.md5 = &#39;d41d8cd98f00b204e9800998ecf8427e&#39;]&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:24.193659Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:14:48.509629Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;malicious-activity&quot;</span>\n",
" <span class=\"p\">]</span>\n",
" <span class=\"p\">},</span>\n",
" <span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--3d7f0c1c-616a-4868-aa7b-150821d2a429&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:46.584Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:46.584Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--1f2aba70-f0ae-49cd-9267-6fcb1e43be67&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:04.698Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:04.698Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;Poison Ivy&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;remote-access-trojan&quot;</span>\n",
@@ -837,12 +850,12 @@
" <span class=\"p\">},</span>\n",
" <span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--34ddc7b4-4965-4615-b286-1c8bbaa1e7db&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:49.474Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:32:49.474Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;relationship--80c174fa-36d1-47c2-9a9d-ce0c636bedcc&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:13.152Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-05-13T13:15:13.152Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;relationship_type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicates&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--548af3be-39d7-4a3e-93c2-1a63cccf8951&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--3d7f0c1c-616a-4868-aa7b-150821d2a429&quot;</span>\n",
" <span class=\"nt\">&quot;source_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--2f3d4926-163d-4aef-bcd2-19dea96916ae&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;target_ref&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;malware--1f2aba70-f0ae-49cd-9267-6fcb1e43be67&quot;</span>\n",
" <span class=\"p\">}</span>\n",
" <span class=\"p\">]</span>\n",
"<span class=\"p\">}</span>\n",
@@ -863,25 +876,268 @@
"bundle = Bundle(indicator, malware, relationship)\n",
"print(bundle)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Creating Cyber Observable References\n",
"Cyber Observable Objects have properties that can reference other Cyber Observable Objects. In order to create those references, use the ``_valid_refs`` property as shown in the following examples. It should be noted that ``_valid_refs`` is necessary when creating references to Cyber Observable Objects since some embedded references can only point to certain types, and ``_valid_refs`` helps ensure consistency. \n",
"\n",
"There are two cases."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Case 1: Specifying the type of the Cyber Observable Objects being referenced\n",
"In the following example, the IPv4Address object has its ``resolves_to_refs`` property specified. As per the spec, this property's value must be a list of reference(s) to MACAddress objects. In this case, those references are strings that state the type of the Cyber Observable Object being referenced, and are provided in ``_valid_refs``."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;ipv4-addr&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;value&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;177.60.40.7&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;resolves_to_refs&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;1&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"s2\">&quot;2&quot;</span>\n",
" <span class=\"p\">]</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from stix2 import IPv4Address\n",
"\n",
"ip4 = IPv4Address(\n",
" _valid_refs={\"1\": \"mac-addr\", \"2\": \"mac-addr\"},\n",
" value=\"177.60.40.7\",\n",
" resolves_to_refs=[\"1\", \"2\"]\n",
")\n",
"\n",
"print(ip4)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Case 2: Specifying the name of the Cyber Observable Objects being referenced\n",
"The following example is just like the one provided in Case 1 above, with one key difference: instead of using strings to specify the type of the Cyber Observable Objects being referenced in ``_valid_refs``, the referenced Cyber Observable Objects are created beforehand and then their names are provided in ``_valid_refs``."
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;ipv4-addr&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;value&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;177.60.40.7&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;resolves_to_refs&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;1&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"s2\">&quot;2&quot;</span>\n",
" <span class=\"p\">]</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 17,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from stix2 import MACAddress\n",
"\n",
"mac_addr_a = MACAddress(value=\"a1:b2:c3:d4:e5:f6\")\n",
"mac_addr_b = MACAddress(value=\"a7:b8:c9:d0:e1:f2\")\n",
"\n",
"ip4_valid_refs = IPv4Address(\n",
" _valid_refs={\"1\": mac_addr_a, \"2\": mac_addr_b},\n",
" value=\"177.60.40.7\",\n",
" resolves_to_refs=[\"1\", \"2\"]\n",
")\n",
"\n",
"print(ip4_valid_refs)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 2",
"language": "python",
"name": "python3"
"name": "python2"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
"version": 2
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.5"
"pygments_lexer": "ipython2",
"version": "2.7.15"
}
},
"nbformat": 4,


@@ -175,9 +175,9 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity--87aac643-341b-413a-b702-ea5820416155&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:10.269Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:10.269Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity--d6996982-5fb7-4364-b716-b618516989b6&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:27.349Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:27.349Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;John Smith&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;identity_class&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;individual&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;x_foo&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;bar&quot;</span>\n",
@@ -194,8 +194,6 @@
}
],
"source": [
"from stix2 import Identity\n",
"\n",
"identity = Identity(name=\"John Smith\",\n",
" identity_class=\"individual\",\n",
" custom_properties={\n",
@@ -289,9 +287,9 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity--a1ad0a6f-39ab-4642-9a72-aaa198b1eee2&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:12.270Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:12.270Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity--a167d2de-9fc4-4734-a1ae-57a548aad22a&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:29.180Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:29.180Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;John Smith&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;identity_class&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;individual&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;x_foo&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;bar&quot;</span>\n",
@@ -426,20 +424,127 @@
"print(identity3.x_foo)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To remove a custom properties, use `new_version()` and set it to `None`."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;identity--311b2d2d-f010-4473-83ec-1edf84858f4c&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2015-12-21T19:59:11.000Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:32.934Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;John Smith&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;identity_class&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;individual&quot;</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"identity4 = identity3.new_version(x_foo=None)\n",
"print(identity4)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Custom STIX Object Types\n",
"\n",
"To create a custom STIX object type, define a class with the @[CustomObject](../api/stix2.v20.sdo.rst#stix2.v20.sdo.CustomObject) decorator. It takes the type name and a list of property tuples, each tuple consisting of the property name and a property instance. Any special validation of the properties can be added by supplying an ``__init__`` function.\n",
"To create a custom STIX object type, define a class with the @[CustomObject](../api/v20/stix2.v20.sdo.rst#stix2.v20.sdo.CustomObject) decorator. It takes the type name and a list of property tuples, each tuple consisting of the property name and a property instance. Any special validation of the properties can be added by supplying an ``__init__`` function.\n",
"\n",
"Let's say zoo animals have become a serious cyber threat and we want to model them in STIX using a custom object type. Let's use a ``species`` property to store the kind of animal, and make that property required. We also want a property to store the class of animal, such as \"mammal\" or \"bird\" but only want to allow specific values in it. We can add some logic to validate this property in ``__init__``."
]
},
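Since the diff elides the body of the next code cell, here is a sketch of what such a definition can look like. It assumes `CustomObject` and `properties` are imported from `stix2`; the `x-animal`, `species`, and `animal_class` names match the output shown further below, but the exact validation logic is illustrative:

    from stix2 import CustomObject, properties

    @CustomObject('x-animal', [
        ('species', properties.StringProperty(required=True)),
        ('animal_class', properties.StringProperty()),
    ])
    class Animal(object):
        def __init__(self, animal_class=None, **kwargs):
            # Illustrative check: only accept a few classes of animal
            if animal_class and animal_class not in ['mammal', 'bird', 'fish', 'reptile']:
                raise ValueError("'%s' is not a recognized class of animal." % animal_class)

    animal = Animal(species="lion", animal_class="mammal")
    print(animal)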
{
"cell_type": "code",
"execution_count": 7,
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
@@ -464,7 +569,7 @@
},
{
"cell_type": "code",
"execution_count": 8,
"execution_count": 9,
"metadata": {},
"outputs": [
{
@@ -540,9 +645,9 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-animal&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-animal--b1e4fe7f-7985-451d-855c-6ba5c265b22a&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:19.790Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T18:38:19.790Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-animal--1f7ce0ad-fd3a-4cf0-9cd7-13f7bef9ecd4&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:38.010Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2020-03-05T05:06:38.010Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;species&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;lion&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;animal_class&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;mammal&quot;</span>\n",
"<span class=\"p\">}</span>\n",
@@ -552,7 +657,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 8,
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
@@ -572,7 +677,7 @@
},
{
"cell_type": "code",
"execution_count": 9,
"execution_count": 10,
"metadata": {},
"outputs": [
{
@@ -598,7 +703,7 @@
},
{
"cell_type": "code",
"execution_count": 10,
"execution_count": 11,
"metadata": {},
"outputs": [
{
@@ -679,7 +784,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 10,
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
@@ -706,7 +811,7 @@
},
{
"cell_type": "code",
"execution_count": 11,
"execution_count": 12,
"metadata": {},
"outputs": [
{
@@ -736,12 +841,12 @@
"source": [
"### Custom Cyber Observable Types\n",
"\n",
"Similar to custom STIX object types, use a decorator to create [custom Cyber Observable](../api/stix2.v20.observables.rst#stix2.v20.observables.CustomObservable) types. Just as before, ``__init__()`` can hold additional validation, but it is not necessary."
"Similar to custom STIX object types, use a decorator to create [custom Cyber Observable](../api/v20/stix2.v20.observables.rst#stix2.v20.observables.CustomObservable) types. Just as before, ``__init__()`` can hold additional validation, but it is not necessary."
]
},
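The following code cell is likewise elided by the diff. A minimal sketch of such a custom observable definition, again assuming `CustomObservable` and `properties` come from `stix2` and reusing the `a_property`/`property_2` names that appear in the later examples:

    from stix2 import CustomObservable, properties

    @CustomObservable('x-new-observable', [
        ('a_property', properties.StringProperty(required=True)),
        ('property_2', properties.IntegerProperty()),
    ])
    class NewObservable(object):
        pass

    new_observable = NewObservable(a_property="something", property_2=10)
    print(new_observable)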
{
"cell_type": "code",
"execution_count": 12,
"execution_count": 13,
"metadata": {},
"outputs": [
{
@@ -826,7 +931,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 12,
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
@@ -857,7 +962,7 @@
},
{
"cell_type": "code",
"execution_count": 13,
"execution_count": 14,
"metadata": {},
"outputs": [
{
@@ -938,7 +1043,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 13,
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
},
@@ -1020,7 +1125,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 13,
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
@@ -1050,6 +1155,316 @@
"print(obs_data.objects[\"0\"].property_2)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### ID-Contributing Properties for Custom Cyber Observables\n",
"STIX 2.1 Cyber Observables (SCOs) have deterministic IDs, meaning that the ID of a SCO is based on the values of some of its properties. Thus, if multiple cyber observables of the same type have the same values for their ID-contributing properties, then these SCOs will have the same ID. UUIDv5 is used for the deterministic IDs, using the namespace `\"00abedb4-aa42-466c-9c01-fed23315a9b7\"`. A SCO's ID-contributing properties may consist of a combination of required properties and optional properties.\n",
"\n",
"If a SCO type does not have any ID contributing properties defined, or all of the ID-contributing properties are not present on the object, then the SCO uses a randomly-generated UUIDv4. Thus, you can optionally define which of your custom SCO's properties should be ID-contributing properties. Similar to standard SCOs, your custom SCO's ID-contributing properties can be any combination of the SCO's required and optional properties.\n",
"\n",
"You define the ID-contributing properties when defining your custom SCO with the `CustomObservable` decorator. After the list of properties, you can optionally define the list of id-contributing properties. If you do not want to specify any id-contributing properties for your custom SCO, then you do not need to do anything additional.\n",
"\n",
"See the example below:"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2--6bc655d6-dcb8-52a3-a862-46848c17e599&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;a_property&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;A property&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;property_2&quot;</span><span class=\"p\">:</span> <span class=\"mi\">2000</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2--6bc655d6-dcb8-52a3-a862-46848c17e599&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;a_property&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;A property&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;property_2&quot;</span><span class=\"p\">:</span> <span class=\"mi\">3000</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
},
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;x-new-observable-2--1e56f9c3-a73b-5fbd-b348-83c76523c4df&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;a_property&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;A different property&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;property_2&quot;</span><span class=\"p\">:</span> <span class=\"mi\">3000</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"from stix2.v21 import CustomObservable # IDs and Deterministic IDs are NOT part of STIX 2.0 Custom Observables\n",
"\n",
"@CustomObservable('x-new-observable-2', [\n",
" ('a_property', properties.StringProperty(required=True)),\n",
" ('property_2', properties.IntegerProperty()),\n",
"], [\n",
" 'a_property'\n",
"])\n",
"class NewObservable2():\n",
" pass\n",
"\n",
"new_observable_a = NewObservable2(a_property=\"A property\", property_2=2000)\n",
"print(new_observable_a)\n",
"\n",
"new_observable_b = NewObservable2(a_property=\"A property\", property_2=3000)\n",
"print(new_observable_b)\n",
"\n",
"new_observable_c = NewObservable2(a_property=\"A different property\", property_2=3000)\n",
"print(new_observable_c)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this example, `a_property` is the only id-contributing property. Notice that the ID for `new_observable_a` and `new_observable_b` is the same since they have the same value for the id-contributing `a_property` property."
]
},
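A quick sketch of the claim above, reusing the NewObservable2 class registered in the previous cell (variable names are illustrative):

    obs_a = NewObservable2(a_property="A property", property_2=2000)
    obs_b = NewObservable2(a_property="A property", property_2=3000)
    obs_c = NewObservable2(a_property="A different property")

    assert obs_a.id == obs_b.id  # same id-contributing value -> same deterministic id
    assert obs_a.id != obs_c.id  # different id-contributing value -> different id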
{
"cell_type": "markdown",
"metadata": {
@ -1058,7 +1473,7 @@
"source": [
"### Custom Cyber Observable Extensions\n",
"\n",
"Finally, custom extensions to existing Cyber Observable types can also be created. Just use the @[CustomExtension](../api/stix2.v20.observables.rst#stix2.v20.observables.CustomExtension) decorator. Note that you must provide the Cyber Observable class to which the extension applies. Again, any extra validation of the properties can be implemented by providing an ``__init__()`` but it is not required. Let's say we want to make an extension to the ``File`` Cyber Observable Object:"
"Finally, custom extensions to existing Cyber Observable types can also be created. Just use the @[CustomExtension](../api/v20/stix2.v20.observables.rst#stix2.v20.observables.CustomExtension) decorator. Note that you must provide the Cyber Observable class to which the extension applies. Again, any extra validation of the properties can be implemented by providing an ``__init__()`` but it is not required. Let's say we want to make an extension to the ``File`` Cyber Observable Object:"
]
},
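The example cell that follows is unchanged in this diff, so it is not shown here. A minimal sketch of such an extension, assuming stix2.v20 and illustrative names (x-new-ext, property1):

    from stix2 import properties
    from stix2.v20 import CustomExtension, File

    @CustomExtension(File, 'x-new-ext', [
        ('property1', properties.StringProperty(required=True)),
    ])
    class NewExtension():
        pass

    f = File(name='example.exe',
             extensions={'x-new-ext': {'property1': 'something'}})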
{
@ -1378,21 +1793,21 @@
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"display_name": "Python 2",
"language": "python",
"name": "python3"
"name": "python2"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
"version": 2
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.3"
"pygments_lexer": "ipython2",
"version": "2.7.15+"
}
},
"nbformat": 4,

View File

@ -450,6 +450,14 @@
"mem.source.filters.add([f1,f2])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note: The `defanged` property is now always included (implicitly) for STIX 2.1 Cyber Observable Objects (SCOs)**\n\n",
"This is important to remember if you are writing a filter that involves checking the `objects` property of a STIX 2.1 `ObservedData` object. If any of the objects associated with the `objects` property are STIX 2.1 SCOs, then your filter must include the `defanged` property. For an example, refer to `filters[14]` & `filters[15]` in stix2/test/v21/test_datastore_filters.py "
]
},
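A hedged sketch of what such a filter can look like (the SCO values are purely illustrative; the test file referenced above has the authoritative examples):

    from stix2 import Filter

    # When matching the whole `objects` dict of a STIX 2.1 ObservedData, the
    # implicitly added defanged property must be part of the value compared against.
    flt = Filter('objects', '=', {
        '0': {'type': 'domain-name', 'value': 'example.com', 'defanged': False},
    })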
{
"cell_type": "markdown",
"metadata": {},
@ -484,7 +492,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"If a STIX object has a `created_by_ref` property, you can use the [creator_of()](../api/stix2.datastore.rst#stix2.datastore.DataSource.creator_of) method to retrieve the [Identity](../api/stix2.v20.sdo.rst#stix2.v20.sdo.Identity) object that created it."
"If a STIX object has a `created_by_ref` property, you can use the [creator_of()](../api/stix2.datastore.rst#stix2.datastore.DataSource.creator_of) method to retrieve the [Identity](../api/v20/stix2.v20.sdo.rst#stix2.v20.sdo.Identity) object that created it."
]
},
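A minimal sketch, assuming `fs` is any data source or store (for example a FileSystemSource) and `indicator` is an object it returned that carries `created_by_ref`:

    identity = fs.creator_of(indicator)  # returns the Identity referenced by created_by_ref
    print(identity.name)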
{
@ -726,21 +734,21 @@
],
"metadata": {
"kernelspec": {
"display_name": "cti-python-stix2",
"display_name": "Python 3",
"language": "python",
"name": "cti-python-stix2"
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 2
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
"version": "2.7.12"
"pygments_lexer": "ipython3",
"version": "3.6.7"
}
},
"nbformat": 4,

View File

@ -67,7 +67,7 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [

docs/guide/equivalence.ipynb (new file, 3058 lines)

File diff suppressed because it is too large.

View File

@ -1310,6 +1310,212 @@
"source": [
"malware.is_marked(TLP_WHITE.id, 'description')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Extracting Lang Data Markings or marking-definition Data Markings\n",
"\n",
"If you need a specific kind of marking, you can also filter them using the API. By default the library will get both types of markings by default. You can choose between `lang=True/False` or `marking_ref=True/False` depending on your use-case."
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{\n",
" \"type\": \"indicator\",\n",
" \"spec_version\": \"2.1\",\n",
" \"id\": \"indicator--634ef462-d6b5-48bc-9d9f-b46a6919227c\",\n",
" \"created\": \"2019-05-03T18:36:44.354Z\",\n",
" \"modified\": \"2019-05-03T18:36:44.354Z\",\n",
" \"description\": \"Una descripcion sobre este indicador\",\n",
" \"indicator_types\": [\n",
" \"malware\"\n",
" ],\n",
" \"pattern\": \"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\",\n",
" \"valid_from\": \"2019-05-03T18:36:44.354443Z\",\n",
" \"object_marking_refs\": [\n",
" \"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82\"\n",
" ],\n",
" \"granular_markings\": [\n",
" {\n",
" \"lang\": \"es\",\n",
" \"selectors\": [\n",
" \"description\"\n",
" ]\n",
" },\n",
" {\n",
" \"marking_ref\": \"marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da\",\n",
" \"selectors\": [\n",
" \"description\"\n",
" ]\n",
" }\n",
" ]\n",
"}\n",
"['es', 'marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da']\n",
"['marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da']\n",
"['es']\n"
]
}
],
"source": [
"from stix2 import v21\n",
"\n",
"v21_indicator = v21.Indicator(\n",
" description=\"Una descripcion sobre este indicador\",\n",
" pattern=\"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\",\n",
" object_marking_refs=['marking-definition--f88d31f6-486f-44da-b317-01333bde0b82'],\n",
" indicator_types=['malware'],\n",
" granular_markings=[\n",
" {\n",
" 'selectors': ['description'],\n",
" 'lang': 'es'\n",
" },\n",
" {\n",
" 'selectors': ['description'],\n",
" 'marking_ref': 'marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da'\n",
" }\n",
" ]\n",
")\n",
"print(v21_indicator)\n",
"\n",
"# Gets both lang and marking_ref markings for 'description'\n",
"print(v21_indicator.get_markings('description'))\n",
"\n",
"# Exclude lang markings from results\n",
"print(v21_indicator.get_markings('description', lang=False))\n",
"\n",
"# Exclude marking-definition markings from results\n",
"print(v21_indicator.get_markings('description', marking_ref=False))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this same manner, calls to `clear_markings` and `set_markings` also have the ability to operate in for one or both types of markings."
]
},
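For instance, a hedged sketch of a `set_markings` call that replaces only the marking-definition granular markings on `description` while leaving the lang marking untouched (assuming `set_markings` accepts the same `lang`/`marking_ref` keywords shown above):

    v21_indicator = v21_indicator.set_markings(
        'marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da',
        selectors=['description'],
        lang=False,  # leave lang markings as they are
    )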
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{\n",
" \"type\": \"indicator\",\n",
" \"spec_version\": \"2.1\",\n",
" \"id\": \"indicator--a612665a-2df4-4fd2-851c-7fbb8c92339a\",\n",
" \"created\": \"2019-05-03T19:13:59.010Z\",\n",
" \"modified\": \"2019-05-03T19:15:41.173Z\",\n",
" \"description\": \"Una descripcion sobre este indicador\",\n",
" \"indicator_types\": [\n",
" \"malware\"\n",
" ],\n",
" \"pattern\": \"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\",\n",
" \"valid_from\": \"2019-05-03T19:13:59.010624Z\",\n",
" \"object_marking_refs\": [\n",
" \"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82\"\n",
" ]\n",
"}\n"
]
}
],
"source": [
"print(v21_indicator.clear_markings(\"description\")) # By default, both types of markings will be removed"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{\n",
" \"type\": \"indicator\",\n",
" \"spec_version\": \"2.1\",\n",
" \"id\": \"indicator--982aeb4d-4dd3-4b04-aa50-a1d00c31986c\",\n",
" \"created\": \"2019-05-03T19:19:26.542Z\",\n",
" \"modified\": \"2019-05-03T19:20:51.818Z\",\n",
" \"description\": \"Una descripcion sobre este indicador\",\n",
" \"indicator_types\": [\n",
" \"malware\"\n",
" ],\n",
" \"pattern\": \"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\",\n",
" \"valid_from\": \"2019-05-03T19:19:26.542267Z\",\n",
" \"object_marking_refs\": [\n",
" \"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82\"\n",
" ],\n",
" \"granular_markings\": [\n",
" {\n",
" \"lang\": \"es\",\n",
" \"selectors\": [\n",
" \"description\"\n",
" ]\n",
" }\n",
" ]\n",
"}\n"
]
}
],
"source": [
"# If lang is False, no lang markings will be removed\n",
"print(v21_indicator.clear_markings(\"description\", lang=False))"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{\n",
" \"type\": \"indicator\",\n",
" \"spec_version\": \"2.1\",\n",
" \"id\": \"indicator--de0316d6-38e1-43c2-af4f-649305251864\",\n",
" \"created\": \"2019-05-03T19:40:21.459Z\",\n",
" \"modified\": \"2019-05-03T19:40:26.431Z\",\n",
" \"description\": \"Una descripcion sobre este indicador\",\n",
" \"indicator_types\": [\n",
" \"malware\"\n",
" ],\n",
" \"pattern\": \"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\",\n",
" \"valid_from\": \"2019-05-03T19:40:21.459582Z\",\n",
" \"object_marking_refs\": [\n",
" \"marking-definition--f88d31f6-486f-44da-b317-01333bde0b82\"\n",
" ],\n",
" \"granular_markings\": [\n",
" {\n",
" \"marking_ref\": \"marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da\",\n",
" \"selectors\": [\n",
" \"description\"\n",
" ]\n",
" }\n",
" ]\n",
"}\n"
]
}
],
"source": [
"# If marking_ref is False, no marking-definition markings will be removed\n",
"print(v21_indicator.clear_markings(\"description\", marking_ref=False))"
]
}
],
"metadata": {

View File

@ -365,7 +365,7 @@
"source": [
"### How custom content works\n",
"\n",
"[CustomObject](../api/stix2.v20.sdo.rst#stix2.v20.sdo.CustomObject), [CustomObservable](../api/stix2.v20.observables.rst#stix2.v20.observables.CustomObservable), [CustomMarking](../api/stix2.v20.common.rst#stix2.v20.common.CustomMarking) and [CustomExtension](../api/stix2.v20.observables.rst#stix2.v20.observables.CustomExtension) must be registered explicitly by STIX version. This is a design decision since properties or requirements may change as the STIX Technical Specification advances.\n",
"[CustomObject](../api/v20/stix2.v20.sdo.rst#stix2.v20.sdo.CustomObject), [CustomObservable](../api/v20/stix2.v20.observables.rst#stix2.v20.observables.CustomObservable), [CustomMarking](../api/v20/stix2.v20.common.rst#stix2.v20.common.CustomMarking) and [CustomExtension](../api/v20/stix2.v20.observables.rst#stix2.v20.observables.CustomExtension) must be registered explicitly by STIX version. This is a design decision since properties or requirements may change as the STIX Technical Specification advances.\n",
"\n",
"You can perform this by:"
]

View File

@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "code",
"execution_count": 2,
"execution_count": 1,
"metadata": {
"nbsphinx": "hidden"
},
@ -22,7 +22,7 @@
},
{
"cell_type": "code",
"execution_count": 3,
"execution_count": 2,
"metadata": {
"nbsphinx": "hidden"
},
@ -63,12 +63,12 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"To create a new version of an existing object, specify the property(ies) you want to change and their new values:"
"To create a new version of an existing object, specify the property(ies) you want to change and their new values. For example, here we change the label from \"anomalous-activity\" to \"malicious-activity\":"
]
},
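The code cell that performs this change is only partially visible below; as a sketch, with property values taken from the rendered output that follows, the call looks like:

    indicator2 = indicator.new_version(name="File hash for Foobar malware",
                                       labels=["malicious-activity"])
    print(indicator2)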
{
"cell_type": "code",
"execution_count": 4,
"execution_count": 3,
"metadata": {},
"outputs": [
{
@ -144,12 +144,13 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--dd052ff6-e404-444b-beb9-eae96d1e79ea&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--8ad18fc7-457c-475d-b292-1ec44febe0fd&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2016-01-01T08:00:00.000Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T20:02:51.161Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:34.815Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for Foobar malware&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;description&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;A file indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;pattern&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;[file:hashes.md5 = &#39;d41d8cd98f00b204e9800998ecf8427e&#39;]&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T20:02:51.138312Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:34.779826Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;malicious-activity&quot;</span>\n",
" <span class=\"p\">]</span>\n",
@ -160,7 +161,7 @@
"<IPython.core.display.HTML object>"
]
},
"execution_count": 4,
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
@ -170,6 +171,7 @@
"\n",
"indicator = Indicator(created=\"2016-01-01T08:00:00.000Z\",\n",
" name=\"File hash for suspicious file\",\n",
" description=\"A file indicator\",\n",
" labels=[\"anomalous-activity\"],\n",
" pattern=\"[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']\")\n",
"\n",
@ -187,7 +189,7 @@
},
{
"cell_type": "code",
"execution_count": 5,
"execution_count": 4,
"metadata": {
"scrolled": true
},
@ -205,6 +207,117 @@
"indicator.new_version(id=\"indicator--cc42e358-8b9b-493c-9646-6ecd73b41c21\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can remove optional or custom properties by setting them to `None` when you call `new_version()`."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<style type=\"text/css\">.highlight .hll { background-color: #ffffcc }\n",
".highlight { background: #f8f8f8; }\n",
".highlight .c { color: #408080; font-style: italic } /* Comment */\n",
".highlight .err { border: 1px solid #FF0000 } /* Error */\n",
".highlight .k { color: #008000; font-weight: bold } /* Keyword */\n",
".highlight .o { color: #666666 } /* Operator */\n",
".highlight .ch { color: #408080; font-style: italic } /* Comment.Hashbang */\n",
".highlight .cm { color: #408080; font-style: italic } /* Comment.Multiline */\n",
".highlight .cp { color: #BC7A00 } /* Comment.Preproc */\n",
".highlight .cpf { color: #408080; font-style: italic } /* Comment.PreprocFile */\n",
".highlight .c1 { color: #408080; font-style: italic } /* Comment.Single */\n",
".highlight .cs { color: #408080; font-style: italic } /* Comment.Special */\n",
".highlight .gd { color: #A00000 } /* Generic.Deleted */\n",
".highlight .ge { font-style: italic } /* Generic.Emph */\n",
".highlight .gr { color: #FF0000 } /* Generic.Error */\n",
".highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */\n",
".highlight .gi { color: #00A000 } /* Generic.Inserted */\n",
".highlight .go { color: #888888 } /* Generic.Output */\n",
".highlight .gp { color: #000080; font-weight: bold } /* Generic.Prompt */\n",
".highlight .gs { font-weight: bold } /* Generic.Strong */\n",
".highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */\n",
".highlight .gt { color: #0044DD } /* Generic.Traceback */\n",
".highlight .kc { color: #008000; font-weight: bold } /* Keyword.Constant */\n",
".highlight .kd { color: #008000; font-weight: bold } /* Keyword.Declaration */\n",
".highlight .kn { color: #008000; font-weight: bold } /* Keyword.Namespace */\n",
".highlight .kp { color: #008000 } /* Keyword.Pseudo */\n",
".highlight .kr { color: #008000; font-weight: bold } /* Keyword.Reserved */\n",
".highlight .kt { color: #B00040 } /* Keyword.Type */\n",
".highlight .m { color: #666666 } /* Literal.Number */\n",
".highlight .s { color: #BA2121 } /* Literal.String */\n",
".highlight .na { color: #7D9029 } /* Name.Attribute */\n",
".highlight .nb { color: #008000 } /* Name.Builtin */\n",
".highlight .nc { color: #0000FF; font-weight: bold } /* Name.Class */\n",
".highlight .no { color: #880000 } /* Name.Constant */\n",
".highlight .nd { color: #AA22FF } /* Name.Decorator */\n",
".highlight .ni { color: #999999; font-weight: bold } /* Name.Entity */\n",
".highlight .ne { color: #D2413A; font-weight: bold } /* Name.Exception */\n",
".highlight .nf { color: #0000FF } /* Name.Function */\n",
".highlight .nl { color: #A0A000 } /* Name.Label */\n",
".highlight .nn { color: #0000FF; font-weight: bold } /* Name.Namespace */\n",
".highlight .nt { color: #008000; font-weight: bold } /* Name.Tag */\n",
".highlight .nv { color: #19177C } /* Name.Variable */\n",
".highlight .ow { color: #AA22FF; font-weight: bold } /* Operator.Word */\n",
".highlight .w { color: #bbbbbb } /* Text.Whitespace */\n",
".highlight .mb { color: #666666 } /* Literal.Number.Bin */\n",
".highlight .mf { color: #666666 } /* Literal.Number.Float */\n",
".highlight .mh { color: #666666 } /* Literal.Number.Hex */\n",
".highlight .mi { color: #666666 } /* Literal.Number.Integer */\n",
".highlight .mo { color: #666666 } /* Literal.Number.Oct */\n",
".highlight .sa { color: #BA2121 } /* Literal.String.Affix */\n",
".highlight .sb { color: #BA2121 } /* Literal.String.Backtick */\n",
".highlight .sc { color: #BA2121 } /* Literal.String.Char */\n",
".highlight .dl { color: #BA2121 } /* Literal.String.Delimiter */\n",
".highlight .sd { color: #BA2121; font-style: italic } /* Literal.String.Doc */\n",
".highlight .s2 { color: #BA2121 } /* Literal.String.Double */\n",
".highlight .se { color: #BB6622; font-weight: bold } /* Literal.String.Escape */\n",
".highlight .sh { color: #BA2121 } /* Literal.String.Heredoc */\n",
".highlight .si { color: #BB6688; font-weight: bold } /* Literal.String.Interpol */\n",
".highlight .sx { color: #008000 } /* Literal.String.Other */\n",
".highlight .sr { color: #BB6688 } /* Literal.String.Regex */\n",
".highlight .s1 { color: #BA2121 } /* Literal.String.Single */\n",
".highlight .ss { color: #19177C } /* Literal.String.Symbol */\n",
".highlight .bp { color: #008000 } /* Name.Builtin.Pseudo */\n",
".highlight .fm { color: #0000FF } /* Name.Function.Magic */\n",
".highlight .vc { color: #19177C } /* Name.Variable.Class */\n",
".highlight .vg { color: #19177C } /* Name.Variable.Global */\n",
".highlight .vi { color: #19177C } /* Name.Variable.Instance */\n",
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--8ad18fc7-457c-475d-b292-1ec44febe0fd&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2016-01-01T08:00:00.000Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:42.648Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for suspicious file&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;pattern&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;[file:hashes.md5 = &#39;d41d8cd98f00b204e9800998ecf8427e&#39;]&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:34.779826Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;anomalous-activity&quot;</span>\n",
" <span class=\"p\">]</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
],
"text/plain": [
"<IPython.core.display.HTML object>"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"indicator3 = indicator.new_version(description=None)\n",
"print(indicator3)"
]
},
{
"cell_type": "markdown",
"metadata": {
@ -292,15 +405,15 @@
".highlight .vm { color: #19177C } /* Name.Variable.Magic */\n",
".highlight .il { color: #666666 } /* Literal.Number.Integer.Long */</style><div class=\"highlight\"><pre><span></span><span class=\"p\">{</span>\n",
" <span class=\"nt\">&quot;type&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--dd052ff6-e404-444b-beb9-eae96d1e79ea&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;id&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;indicator--8ad18fc7-457c-475d-b292-1ec44febe0fd&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;created&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2016-01-01T08:00:00.000Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T20:02:54.704Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for Foobar malware&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;modified&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:52.198Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;name&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;File hash for suspicious file&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;pattern&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;[file:hashes.md5 = &#39;d41d8cd98f00b204e9800998ecf8427e&#39;]&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2018-04-05T20:02:51.138312Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;valid_from&quot;</span><span class=\"p\">:</span> <span class=\"s2\">&quot;2019-07-25T17:59:34.779826Z&quot;</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;revoked&quot;</span><span class=\"p\">:</span> <span class=\"kc\">true</span><span class=\"p\">,</span>\n",
" <span class=\"nt\">&quot;labels&quot;</span><span class=\"p\">:</span> <span class=\"p\">[</span>\n",
" <span class=\"s2\">&quot;malicious-activity&quot;</span>\n",
" <span class=\"s2\">&quot;anomalous-activity&quot;</span>\n",
" <span class=\"p\">]</span>\n",
"<span class=\"p\">}</span>\n",
"</pre></div>\n"
@ -315,8 +428,8 @@
}
],
"source": [
"indicator2 = indicator2.revoke()\n",
"print(indicator2)"
"indicator4 = indicator3.revoke()\n",
"print(indicator4)"
]
}
],

View File

@ -624,7 +624,7 @@
"source": [
"### Creating STIX Data\n",
"\n",
"To create a STIX object, just use that object's class constructor. Once it's created, add it to the workbench with [save()](../api/datastore/stix2.workbench.rst#stix2.workbench.save)."
"To create a STIX object, just use that object's class constructor. Once it's created, add it to the workbench with [save()](../api/stix2.workbench.rst#stix2.workbench.save)."
]
},
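A minimal sketch, assuming the workbench API linked above (the indicator values are illustrative):

    from stix2.workbench import Indicator, save

    ind = Indicator(name='File hash for Foobar malware',
                    labels=['malicious-activity'],
                    pattern="[file:hashes.md5 = 'd41d8cd98f00b204e9800998ecf8427e']")
    save(ind)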
{
@ -760,7 +760,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Defaults can also be set for the [created timestamp](../api/datastore/stix2.workbench.rst#stix2.workbench.set_default_created), [external references](../api/datastore/stix2.workbench.rst#stix2.workbench.set_default_external_refs) and [object marking references](../api/datastore/stix2.workbench.rst#stix2.workbench.set_default_object_marking_refs)."
"Defaults can also be set for the [created timestamp](../api/stix2.workbench.rst#stix2.workbench.set_default_created), [external references](../api/stix2.workbench.rst#stix2.workbench.set_default_external_refs) and [object marking references](../api/stix2.workbench.rst#stix2.workbench.set_default_object_marking_refs)."
]
},
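A hedged sketch of setting these defaults (the function names come from the links above; the values are illustrative):

    from stix2.workbench import (set_default_created,
                                 set_default_external_refs,
                                 set_default_object_marking_refs)

    set_default_created('2020-01-01T00:00:00.000Z')
    set_default_external_refs([{'source_name': 'ACME Threat Intel',
                                'description': 'Threat report'}])
    set_default_object_marking_refs(
        ['marking-definition--f88d31f6-486f-44da-b317-01333bde0b82'])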
{

View File

@ -7,8 +7,10 @@ import stix2
def main():
collection = Collection("http://127.0.0.1:5000/trustgroup1/collections/52892447-4d7e-4f70-b94d-d7f22742ff63/",
user="admin", password="Password0")
collection = Collection(
"http://127.0.0.1:5000/trustgroup1/collections/52892447-4d7e-4f70-b94d-d7f22742ff63/",
user="admin", password="Password0",
)
# instantiate TAXII data source
taxii = stix2.TAXIICollectionSource(collection)

View File

@ -1,10 +1,10 @@
bumpversion
ipython
nbsphinx==0.3.2
nbsphinx==0.4.3
pre-commit
pytest
pytest-cov
sphinx<1.6
sphinx<2
sphinx-prompt
tox

View File

@ -1,14 +1,13 @@
[bumpversion]
current_version = 1.0.3
current_version = 1.4.0
commit = True
tag = True
[bumpversion:file:stix2/version.py]
[bumpversion:file:docs/conf.py]
[metadata]
license_file = LICENSE
[bdist_wheel]
universal = 1

View File

@ -11,26 +11,28 @@ VERSION_FILE = os.path.join(BASE_DIR, 'stix2', 'version.py')
def get_version():
with open(VERSION_FILE) as f:
for line in f.readlines():
if line.startswith("__version__"):
if line.startswith('__version__'):
version = line.split()[-1].strip('"')
return version
raise AttributeError("Package does not have a __version__")
def get_long_description():
with open('README.rst') as f:
long_description = f.read()
return f.read()
setup(
name='stix2',
version=get_version(),
description='Produce and consume STIX 2 JSON content',
long_description=long_description,
url='https://github.com/oasis-open/cti-python-stix2',
long_description=get_long_description(),
long_description_content_type='text/x-rst',
url='https://oasis-open.github.io/cti-documentation/',
author='OASIS Cyber Threat Intelligence Technical Committee',
author_email='cti-users@lists.oasis-open.org',
maintainer='Greg Back',
maintainer_email='gback@mitre.org',
maintainer='Chris Lenk, Emmanuelle Vargas-Gonzalez',
maintainer_email='clenk@mitre.org, emmanuelle@mitre.org',
license='BSD',
classifiers=[
'Development Status :: 4 - Beta',
@ -44,18 +46,25 @@ setup(
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
],
keywords="stix stix2 json cti cyber threat intelligence",
packages=find_packages(exclude=['*.test']),
keywords='stix stix2 json cti cyber threat intelligence',
packages=find_packages(exclude=['*.test', '*.test.*']),
install_requires=[
'python-dateutil',
'enum34 ; python_version<"3.4"',
'pytz',
'requests',
'simplejson',
'six',
'stix2-patterns',
],
project_urls={
'Documentation': 'https://stix2.readthedocs.io/',
'Source Code': 'https://github.com/oasis-open/cti-python-stix2/',
'Bug Tracker': 'https://github.com/oasis-open/cti-python-stix2/issues/',
},
extras_require={
'taxii': ['taxii2-client']
}
'taxii': ['taxii2-client'],
'semantic': ['haversine', 'fuzzywuzzy'],
},
)

View File

@ -3,6 +3,7 @@
.. autosummary::
:toctree: api
confidence
core
datastore
environment
@ -11,49 +12,49 @@
patterns
properties
utils
v20
v21
workbench
v20.common
v20.observables
v20.sdo
v20.sro
"""
# flake8: noqa
from .core import Bundle, _collect_stix2_obj_maps, _register_type, parse
DEFAULT_VERSION = '2.0' # Default version will always be the latest STIX 2.X version
from .confidence import scales
from .datastore import CompositeDataSource
from .datastore.filesystem import (FileSystemSink, FileSystemSource,
FileSystemStore)
from .datastore.filesystem import (
FileSystemSink, FileSystemSource, FileSystemStore,
)
from .datastore.filters import Filter
from .datastore.memory import MemorySink, MemorySource, MemoryStore
from .datastore.taxii import (TAXIICollectionSink, TAXIICollectionSource,
TAXIICollectionStore)
from .datastore.taxii import (
TAXIICollectionSink, TAXIICollectionSource, TAXIICollectionStore,
)
from .environment import Environment, ObjectFactory
from .markings import (add_markings, clear_markings, get_markings, is_marked,
remove_markings, set_markings)
from .patterns import (AndBooleanExpression, AndObservationExpression,
BasicObjectPathComponent, BinaryConstant,
BooleanConstant, EqualityComparisonExpression,
from .markings import (
add_markings, clear_markings, get_markings, is_marked, remove_markings,
set_markings,
)
from .parsing import _collect_stix2_mappings, parse, parse_observable
from .patterns import (
AndBooleanExpression, AndObservationExpression, BasicObjectPathComponent,
BinaryConstant, BooleanConstant, EqualityComparisonExpression,
FloatConstant, FollowedByObservationExpression,
GreaterThanComparisonExpression,
GreaterThanEqualComparisonExpression, HashConstant,
HexConstant, InComparisonExpression, IntegerConstant,
IsSubsetComparisonExpression,
IsSupersetComparisonExpression,
LessThanComparisonExpression,
LessThanEqualComparisonExpression,
LikeComparisonExpression, ListConstant,
ListObjectPathComponent, MatchesComparisonExpression,
ObjectPath, ObservationExpression, OrBooleanExpression,
OrObservationExpression, ParentheticalExpression,
QualifiedObservationExpression,
ReferenceObjectPathComponent, RepeatQualifier,
StartStopQualifier, StringConstant, TimestampConstant,
WithinQualifier)
GreaterThanComparisonExpression, GreaterThanEqualComparisonExpression,
HashConstant, HexConstant, InComparisonExpression, IntegerConstant,
IsSubsetComparisonExpression, IsSupersetComparisonExpression,
LessThanComparisonExpression, LessThanEqualComparisonExpression,
LikeComparisonExpression, ListConstant, ListObjectPathComponent,
MatchesComparisonExpression, ObjectPath, ObservationExpression,
OrBooleanExpression, OrObservationExpression, ParentheticalExpression,
QualifiedObservationExpression, ReferenceObjectPathComponent,
RepeatQualifier, StartStopQualifier, StringConstant, TimestampConstant,
WithinQualifier,
)
from .utils import new_version, revoke
from .v20 import * # This import will always be the latest STIX 2.X version
from .version import __version__
_collect_stix2_obj_maps()
DEFAULT_VERSION = "2.0" # Default version will always be the latest STIX 2.X version
_collect_stix2_mappings()

View File

@ -1,24 +1,39 @@
"""Base classes for type definitions in the stix2 library."""
"""Base classes for type definitions in the STIX2 library."""
import collections
import copy
import datetime as dt
import re
import uuid
import simplejson as json
import six
from .exceptions import (AtLeastOnePropertyError, CustomContentError,
DependentPropertiesError, ExtraPropertiesError,
import stix2
from stix2.canonicalization.Canonicalize import canonicalize
from .exceptions import (
AtLeastOnePropertyError, DependentPropertiesError, ExtraPropertiesError,
ImmutableError, InvalidObjRefError, InvalidValueError,
MissingPropertiesError,
MutuallyExclusivePropertiesError)
MissingPropertiesError, MutuallyExclusivePropertiesError,
)
from .markings import _MarkingsMixin
from .markings.utils import validate
from .utils import NOW, find_property_index, format_datetime, get_timestamp
from .utils import (
NOW, PREFIX_21_REGEX, find_property_index, format_datetime, get_timestamp,
)
from .utils import new_version as _new_version
from .utils import revoke as _revoke
try:
from collections.abc import Mapping
except ImportError:
from collections import Mapping
__all__ = ['STIXJSONEncoder', '_STIXBase']
DEFAULT_ERROR = "{type} must have {property}='{expected}'."
SCO_DET_ID_NAMESPACE = uuid.UUID("00abedb4-aa42-466c-9c01-fed23315a9b7")
class STIXJSONEncoder(json.JSONEncoder):
@ -63,7 +78,7 @@ def get_required_properties(properties):
return (k for k, v in properties.items() if v.required)
class _STIXBase(collections.Mapping):
class _STIXBase(Mapping):
"""Base class for STIX object types"""
def object_properties(self):
@ -87,10 +102,17 @@ class _STIXBase(collections.Mapping):
if prop_name in kwargs:
try:
kwargs[prop_name] = prop.clean(kwargs[prop_name])
except ValueError as exc:
if self.__allow_custom and isinstance(exc, CustomContentError):
return
raise InvalidValueError(self.__class__, prop_name, reason=str(exc))
except InvalidValueError:
# No point in wrapping InvalidValueError in another
# InvalidValueError... so let those propagate.
raise
except Exception as exc:
six.raise_from(
InvalidValueError(
self.__class__, prop_name, reason=str(exc),
),
exc,
)
# interproperty constraint methods
@ -104,11 +126,16 @@ class _STIXBase(collections.Mapping):
def _check_at_least_one_property(self, list_of_properties=None):
if not list_of_properties:
list_of_properties = sorted(list(self.__class__._properties.keys()))
if "type" in list_of_properties:
list_of_properties.remove("type")
if isinstance(self, _Observable):
props_to_remove = ["type", "id", "defanged", "spec_version"]
else:
props_to_remove = ["type"]
list_of_properties = [prop for prop in list_of_properties if prop not in props_to_remove]
current_properties = self.properties_populated()
list_of_properties_populated = set(list_of_properties).intersection(current_properties)
if list_of_properties and (not list_of_properties_populated or list_of_properties_populated == set(["extensions"])):
if list_of_properties and (not list_of_properties_populated or list_of_properties_populated == set(['extensions'])):
raise AtLeastOnePropertyError(self.__class__, list_of_properties)
def _check_properties_dependency(self, list_of_properties, list_of_dependent_properties):
@ -121,12 +148,13 @@ class _STIXBase(collections.Mapping):
raise DependentPropertiesError(self.__class__, failed_dependency_pairs)
def _check_object_constraints(self):
for m in self.get("granular_markings", []):
validate(self, m.get("selectors"))
for m in self.get('granular_markings', []):
validate(self, m.get('selectors'))
def __init__(self, allow_custom=False, **kwargs):
def __init__(self, allow_custom=False, interoperability=False, **kwargs):
cls = self.__class__
self.__allow_custom = allow_custom
self._allow_custom = allow_custom
self.__interoperability = interoperability
# Use the same timestamp for any auto-generated datetimes
self.__now = get_timestamp()
@ -135,19 +163,30 @@ class _STIXBase(collections.Mapping):
custom_props = kwargs.pop('custom_properties', {})
if custom_props and not isinstance(custom_props, dict):
raise ValueError("'custom_properties' must be a dictionary")
if not self.__allow_custom:
extra_kwargs = list(set(kwargs) - set(self._properties))
if extra_kwargs:
raise ExtraPropertiesError(cls, extra_kwargs)
if custom_props:
self.__allow_custom = True
# Remove any keyword arguments whose value is None
extra_kwargs = list(set(kwargs) - set(self._properties))
if extra_kwargs and not self._allow_custom:
raise ExtraPropertiesError(cls, extra_kwargs)
# because allow_custom is true, any extra kwargs are custom
if custom_props or extra_kwargs:
self._allow_custom = True
if isinstance(self, stix2.v21._STIXBase21):
all_custom_prop_names = extra_kwargs
all_custom_prop_names.extend(list(custom_props.keys()))
for prop_name in all_custom_prop_names:
if not re.match(PREFIX_21_REGEX, prop_name):
raise InvalidValueError(
self.__class__, prop_name,
reason="Property name '%s' must begin with an alpha character." % prop_name,
)
# Remove any keyword arguments whose value is None or [] (i.e. empty list)
setting_kwargs = {}
props = kwargs.copy()
props.update(custom_props)
for prop_name, prop_value in props.items():
if prop_value is not None:
if prop_value is not None and prop_value != []:
setting_kwargs[prop_name] = prop_value
# Detect any missing required properties
@ -190,7 +229,7 @@ class _STIXBase(collections.Mapping):
# usual behavior of this method reads an __init__-assigned attribute,
# which would cause infinite recursion. So this check disables all
# attribute reads until the instance has been properly initialized.
unpickling = "_inner" not in self.__dict__
unpickling = '_inner' not in self.__dict__
if not unpickling and name in self:
return self.__getitem__(name)
raise AttributeError("'%s' object has no attribute '%s'" %
@ -206,8 +245,10 @@ class _STIXBase(collections.Mapping):
def __repr__(self):
props = [(k, self[k]) for k in self.object_properties() if self.get(k)]
return "{0}({1})".format(self.__class__.__name__,
", ".join(["{0!s}={1!r}".format(k, v) for k, v in props]))
return '{0}({1})'.format(
self.__class__.__name__,
', '.join(['{0!s}={1!r}'.format(k, v) for k, v in props]),
)
def __deepcopy__(self, memo):
# Assume: we can ignore the memo argument, because no object will ever contain the same sub-object multiple times.
@ -216,7 +257,8 @@ class _STIXBase(collections.Mapping):
if isinstance(self, _Observable):
# Assume: valid references in the original object are still valid in the new version
new_inner['_valid_refs'] = {'*': '*'}
new_inner['allow_custom'] = self.__allow_custom
new_inner['allow_custom'] = self._allow_custom
new_inner['interoperability'] = self.__interoperability
return cls(**new_inner)
def properties_populated(self):
@ -273,7 +315,7 @@ class _STIXBase(collections.Mapping):
def sort_by(element):
return find_property_index(self, *element)
kwargs.update({'indent': 4, 'separators': (",", ": "), 'item_sort_key': sort_by})
kwargs.update({'indent': 4, 'separators': (',', ': '), 'item_sort_key': sort_by})
if include_optional_defaults:
return json.dumps(self, cls=STIXJSONIncludeOptionalDefaultsEncoder, **kwargs)
@ -281,18 +323,58 @@ class _STIXBase(collections.Mapping):
return json.dumps(self, cls=STIXJSONEncoder, **kwargs)
class _DomainObject(_STIXBase, _MarkingsMixin):
def __init__(self, *args, **kwargs):
interoperability = kwargs.get('interoperability', False)
self.__interoperability = interoperability
self._properties['id'].interoperability = interoperability
self._properties['created_by_ref'].interoperability = interoperability
if kwargs.get('object_marking_refs'):
self._properties['object_marking_refs'].contained.interoperability = interoperability
super(_DomainObject, self).__init__(*args, **kwargs)
class _RelationshipObject(_STIXBase, _MarkingsMixin):
def __init__(self, *args, **kwargs):
interoperability = kwargs.get('interoperability', False)
self.__interoperability = interoperability
self._properties['id'].interoperability = interoperability
if kwargs.get('created_by_ref'):
self._properties['created_by_ref'].interoperability = interoperability
if kwargs.get('object_marking_refs'):
self._properties['object_marking_refs'].contained.interoperability = interoperability
super(_RelationshipObject, self).__init__(*args, **kwargs)
class _Observable(_STIXBase):
def __init__(self, **kwargs):
# the constructor might be called independently of an observed data object
self._STIXBase__valid_refs = kwargs.pop('_valid_refs', [])
self.__allow_custom = kwargs.get('allow_custom', False)
self._allow_custom = kwargs.get('allow_custom', False)
self._properties['extensions'].allow_custom = kwargs.get('allow_custom', False)
try:
# Since `spec_version` is optional, this is how we check for a 2.1 SCO
self._id_contributing_properties
if 'id' not in kwargs:
possible_id = self._generate_id(kwargs)
if possible_id is not None:
kwargs['id'] = possible_id
except AttributeError:
# End up here if handling a 2.0 SCO, and don't need to do anything further
pass
super(_Observable, self).__init__(**kwargs)
def _check_ref(self, ref, prop, prop_name):
"""
Only for checking `*_ref` or `*_refs` properties in spec_version 2.0
STIX Cyber Observables (SCOs)
"""
if '*' in self._STIXBase__valid_refs:
return # don't check if refs are valid
@ -305,6 +387,9 @@ class _Observable(_STIXBase):
allowed_types = prop.valid_types
try:
try:
ref_type = self._STIXBase__valid_refs[ref].type
except AttributeError:
ref_type = self._STIXBase__valid_refs[ref]
except TypeError:
raise ValueError("'%s' must be created with _valid_refs as a dict, not a list." % self.__class__.__name__)
@ -318,13 +403,53 @@ class _Observable(_STIXBase):
if prop_name not in kwargs:
return
from .properties import ObjectReferenceProperty
if prop_name.endswith('_ref'):
if isinstance(prop, ObjectReferenceProperty):
ref = kwargs[prop_name]
self._check_ref(ref, prop, prop_name)
elif prop_name.endswith('_refs'):
if isinstance(prop.contained, ObjectReferenceProperty):
for ref in kwargs[prop_name]:
self._check_ref(ref, prop, prop_name)
def _generate_id(self, kwargs):
required_prefix = self._type + "--"
properties_to_use = self._id_contributing_properties
if properties_to_use:
streamlined_object = {}
if "hashes" in kwargs and "hashes" in properties_to_use:
possible_hash = _choose_one_hash(kwargs["hashes"])
if possible_hash:
streamlined_object["hashes"] = possible_hash
for key in properties_to_use:
if key != "hashes" and key in kwargs:
if isinstance(kwargs[key], dict) or isinstance(kwargs[key], _STIXBase):
temp_deep_copy = copy.deepcopy(dict(kwargs[key]))
_recursive_stix_to_dict(temp_deep_copy)
streamlined_object[key] = temp_deep_copy
elif isinstance(kwargs[key], list):
temp_deep_copy = copy.deepcopy(kwargs[key])
_recursive_stix_list_to_dict(temp_deep_copy)
streamlined_object[key] = temp_deep_copy
else:
streamlined_object[key] = kwargs[key]
if streamlined_object:
data = canonicalize(streamlined_object, utf8=False)
# The situation is complicated w.r.t. python 2/3 behavior, so
# I'd rather not rely on particular exceptions being raised to
# determine what to do. Better to just check the python version
# directly.
if six.PY3:
return required_prefix + six.text_type(uuid.uuid5(SCO_DET_ID_NAMESPACE, data))
else:
return required_prefix + six.text_type(uuid.uuid5(SCO_DET_ID_NAMESPACE, data.encode("utf-8")))
# We return None if there are no values specified for any of the id-contributing-properties
return None
class _Extension(_STIXBase):
@ -333,6 +458,49 @@ class _Extension(_STIXBase):
self._check_at_least_one_property()
def _choose_one_hash(hash_dict):
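# Prefer hashes in a fixed order (MD5, then SHA-1, SHA-256, SHA-512) so that
# deterministic ID generation stays stable; otherwise fall back to whichever
# hash entry the dict yields first.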
if "MD5" in hash_dict:
return {"MD5": hash_dict["MD5"]}
elif "SHA-1" in hash_dict:
return {"SHA-1": hash_dict["SHA-1"]}
elif "SHA-256" in hash_dict:
return {"SHA-256": hash_dict["SHA-256"]}
elif "SHA-512" in hash_dict:
return {"SHA-512": hash_dict["SHA-512"]}
else:
k = next(iter(hash_dict), None)
if k is not None:
return {k: hash_dict[k]}
def _cls_init(cls, obj, kwargs):
if getattr(cls, '__init__', object.__init__) is not object.__init__:
cls.__init__(obj, **kwargs)
def _recursive_stix_to_dict(input_dict):
for key in input_dict:
if isinstance(input_dict[key], dict):
_recursive_stix_to_dict(input_dict[key])
elif isinstance(input_dict[key], _STIXBase):
input_dict[key] = dict(input_dict[key])
# There may still be nested _STIXBase objects
_recursive_stix_to_dict(input_dict[key])
elif isinstance(input_dict[key], list):
_recursive_stix_list_to_dict(input_dict[key])
else:
pass
def _recursive_stix_list_to_dict(input_list):
for i in range(len(input_list)):
if isinstance(input_list[i], _STIXBase):
input_list[i] = dict(input_list[i])
elif isinstance(input_list[i], dict):
pass
elif isinstance(input_list[i], list):
_recursive_stix_list_to_dict(input_list[i])
else:
continue
_recursive_stix_to_dict(input_list[i])

View File

@ -0,0 +1,512 @@
##############################################################################
# #
# Copyright 2006-2019 WebPKI.org (http://webpki.org). #
# #
# Licensed under the Apache License, Version 2.0 (the "License"); #
# you may not use this file except in compliance with the License. #
# You may obtain a copy of the License at #
# #
# https://www.apache.org/licenses/LICENSE-2.0 #
# #
# Unless required by applicable law or agreed to in writing, software #
# distributed under the License is distributed on an "AS IS" BASIS, #
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. #
# See the License for the specific language governing permissions and #
# limitations under the License. #
# #
##############################################################################
#################################################
# JCS compatible JSON serializer for Python 3.x #
#################################################
# This file has been modified to be compatible with Python 2.x as well
import re
import six
from stix2.canonicalization.NumberToJson import convert2Es6Format
try:
from _json import encode_basestring_ascii as c_encode_basestring_ascii
except ImportError:
c_encode_basestring_ascii = None
try:
from _json import encode_basestring as c_encode_basestring
except ImportError:
c_encode_basestring = None
try:
from _json import make_encoder as c_make_encoder
except ImportError:
c_make_encoder = None
ESCAPE = re.compile(r'[\x00-\x1f\\"\b\f\n\r\t]')
ESCAPE_ASCII = re.compile(r'([\\"]|[^\ -~])')
HAS_UTF8 = re.compile(b'[\x80-\xff]')
ESCAPE_DCT = {
'\\': '\\\\',
'"': '\\"',
'\b': '\\b',
'\f': '\\f',
'\n': '\\n',
'\r': '\\r',
'\t': '\\t',
}
for i in range(0x20):
ESCAPE_DCT.setdefault(chr(i), '\\u{0:04x}'.format(i))
INFINITY = float('inf')
def py_encode_basestring(s):
"""Return a JSON representation of a Python string
"""
def replace(match):
return ESCAPE_DCT[match.group(0)]
return '"' + ESCAPE.sub(replace, s) + '"'
encode_basestring = (c_encode_basestring or py_encode_basestring)
def py_encode_basestring_ascii(s):
"""Return an ASCII-only JSON representation of a Python string
"""
def replace(match):
s = match.group(0)
try:
return ESCAPE_DCT[s]
except KeyError:
n = ord(s)
if n < 0x10000:
return '\\u{0:04x}'.format(n)
else:
# surrogate pair
n -= 0x10000
s1 = 0xd800 | ((n >> 10) & 0x3ff)
s2 = 0xdc00 | (n & 0x3ff)
return '\\u{0:04x}\\u{1:04x}'.format(s1, s2)
return '"' + ESCAPE_ASCII.sub(replace, s) + '"'
encode_basestring_ascii = (
c_encode_basestring_ascii or py_encode_basestring_ascii
)
class JSONEncoder(object):
"""Extensible JSON <http://json.org> encoder for Python data structures.
Supports the following objects and types by default:
+-------------------+---------------+
| Python | JSON |
+===================+===============+
| dict | object |
+-------------------+---------------+
| list, tuple | array |
+-------------------+---------------+
| str | string |
+-------------------+---------------+
| int, float | number |
+-------------------+---------------+
| True | true |
+-------------------+---------------+
| False | false |
+-------------------+---------------+
| None | null |
+-------------------+---------------+
To extend this to recognize other objects, subclass and implement a
``.default()`` method with another method that returns a serializable
object for ``o`` if possible, otherwise it should call the superclass
implementation (to raise ``TypeError``).
"""
item_separator = ', '
key_separator = ': '
def __init__(
self, skipkeys=False, ensure_ascii=False,
check_circular=True, allow_nan=True, sort_keys=True,
indent=None, separators=(',', ':'), default=None,
):
"""Constructor for JSONEncoder, with sensible defaults.
If skipkeys is false, then it is a TypeError to attempt
encoding of keys that are not str, int, float or None. If
skipkeys is True, such items are simply skipped.
If ensure_ascii is true, the output is guaranteed to be str
objects with all incoming non-ASCII characters escaped. If
ensure_ascii is false, the output can contain non-ASCII characters.
If check_circular is true, then lists, dicts, and custom encoded
objects will be checked for circular references during encoding to
prevent an infinite recursion (which would cause an OverflowError).
Otherwise, no such check takes place.
If allow_nan is true, then NaN, Infinity, and -Infinity will be
encoded as such. This behavior is not JSON specification compliant,
but is consistent with most JavaScript based encoders and decoders.
Otherwise, it will be a ValueError to encode such floats.
If sort_keys is true, then the output of dictionaries will be
sorted by key; this is useful for regression tests to ensure
that JSON serializations can be compared on a day-to-day basis.
If indent is a non-negative integer, then JSON array
elements and object members will be pretty-printed with that
indent level. An indent level of 0 will only insert newlines.
None is the most compact representation.
If specified, separators should be an (item_separator, key_separator)
tuple. The default is (', ', ': ') if *indent* is ``None`` and
(',', ': ') otherwise. To get the most compact JSON representation,
you should specify (',', ':') to eliminate whitespace.
If specified, default is a function that gets called for objects
that can't otherwise be serialized. It should return a JSON encodable
version of the object or raise a ``TypeError``.
"""
self.skipkeys = skipkeys
self.ensure_ascii = ensure_ascii
self.check_circular = check_circular
self.allow_nan = allow_nan
self.sort_keys = sort_keys
self.indent = indent
if separators is not None:
self.item_separator, self.key_separator = separators
elif indent is not None:
self.item_separator = ','
if default is not None:
self.default = default
def default(self, o):
"""Implement this method in a subclass such that it returns
a serializable object for ``o``, or calls the base implementation
(to raise a ``TypeError``).
For example, to support arbitrary iterators, you could
implement default like this::
def default(self, o):
try:
iterable = iter(o)
except TypeError:
pass
else:
return list(iterable)
# Let the base class default method raise the TypeError
return JSONEncoder.default(self, o)
"""
raise TypeError(
"Object of type '%s' is not JSON serializable" %
o.__class__.__name__,
)
def encode(self, o):
"""Return a JSON string representation of a Python data structure.
>>> from json.encoder import JSONEncoder
>>> JSONEncoder().encode({"foo": ["bar", "baz"]})
'{"foo": ["bar", "baz"]}'
"""
# This is for extremely simple cases and benchmarks.
if isinstance(o, str):
if self.ensure_ascii:
return encode_basestring_ascii(o)
else:
return encode_basestring(o)
# This doesn't pass the iterator directly to ''.join() because the
# exceptions aren't as detailed. The list call should be roughly
# equivalent to the PySequence_Fast that ''.join() would do.
chunks = self.iterencode(o, _one_shot=False)
if not isinstance(chunks, (list, tuple)):
chunks = list(chunks)
return ''.join(chunks)
def iterencode(self, o, _one_shot=False):
"""Encode the given object and yield each string
representation as available.
For example::
for chunk in JSONEncoder().iterencode(bigobject):
mysocket.write(chunk)
"""
if self.check_circular:
markers = {}
else:
markers = None
if self.ensure_ascii:
_encoder = encode_basestring_ascii
else:
_encoder = encode_basestring
def floatstr(
o, allow_nan=self.allow_nan,
_repr=float.__repr__, _inf=INFINITY, _neginf=-INFINITY,
):
# Check for specials. Note that this type of test is processor
# and/or platform-specific, so do tests which don't depend on the
# internals.
if o != o:
text = 'NaN'
elif o == _inf:
text = 'Infinity'
elif o == _neginf:
text = '-Infinity'
else:
return _repr(o)
if not allow_nan:
raise ValueError(
"Out of range float values are not JSON compliant: " +
repr(o),
)
return text
if (
_one_shot and c_make_encoder is not None
and self.indent is None
):
_iterencode = c_make_encoder(
markers, self.default, _encoder, self.indent,
self.key_separator, self.item_separator, self.sort_keys,
self.skipkeys, self.allow_nan,
)
else:
_iterencode = _make_iterencode(
markers, self.default, _encoder, self.indent, floatstr,
self.key_separator, self.item_separator, self.sort_keys,
self.skipkeys, _one_shot,
)
return _iterencode(o, 0)
def _make_iterencode(
markers, _default, _encoder, _indent, _floatstr,
_key_separator, _item_separator, _sort_keys, _skipkeys, _one_shot,
# HACK: hand-optimized bytecode; turn globals into locals
ValueError=ValueError,
dict=dict,
float=float,
id=id,
int=int,
isinstance=isinstance,
list=list,
str=str,
tuple=tuple,
_intstr=int.__str__,
):
if _indent is not None and not isinstance(_indent, str):
_indent = ' ' * _indent
def _iterencode_list(lst, _current_indent_level):
if not lst:
yield '[]'
return
if markers is not None:
markerid = id(lst)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = lst
buf = '['
if _indent is not None:
_current_indent_level += 1
newline_indent = '\n' + _indent * _current_indent_level
separator = _item_separator + newline_indent
buf += newline_indent
else:
newline_indent = None
separator = _item_separator
first = True
for value in lst:
if first:
first = False
else:
buf = separator
if isinstance(value, str):
yield buf + _encoder(value)
elif value is None:
yield buf + 'null'
elif value is True:
yield buf + 'true'
elif value is False:
yield buf + 'false'
elif isinstance(value, int):
# Subclasses of int/float may override __str__, but we still
# want to encode them as integers/floats in JSON. One example
# within the standard library is IntEnum.
yield buf + convert2Es6Format(value)
elif isinstance(value, float):
# see comment above for int
yield buf + convert2Es6Format(value)
else:
yield buf
if isinstance(value, (list, tuple)):
chunks = _iterencode_list(value, _current_indent_level)
elif isinstance(value, dict):
chunks = _iterencode_dict(value, _current_indent_level)
else:
chunks = _iterencode(value, _current_indent_level)
# Below line commented-out for python2 compatibility
# yield from chunks
for chunk in chunks:
yield chunk
if newline_indent is not None:
_current_indent_level -= 1
yield '\n' + _indent * _current_indent_level
yield ']'
if markers is not None:
del markers[markerid]
def _iterencode_dict(dct, _current_indent_level):
if not dct:
yield '{}'
return
if markers is not None:
markerid = id(dct)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = dct
yield '{'
if _indent is not None:
_current_indent_level += 1
newline_indent = '\n' + _indent * _current_indent_level
item_separator = _item_separator + newline_indent
yield newline_indent
else:
newline_indent = None
item_separator = _item_separator
first = True
if _sort_keys:
items = sorted(dct.items(), key=lambda kv: kv[0].encode('utf-16_be'))
else:
items = dct.items()
for key, value in items:
# Replaced isinstance(key, str) with below to enable simultaneous python 2 & 3 compatibility
if isinstance(key, six.string_types) or isinstance(key, six.binary_type):
pass
# JavaScript is weakly typed for these, so it makes sense to
# also allow them. Many encoders seem to do something like this.
elif isinstance(key, float):
# see comment for int/float in _make_iterencode
key = convert2Es6Format(key)
elif key is True:
key = 'true'
elif key is False:
key = 'false'
elif key is None:
key = 'null'
elif isinstance(key, int):
# see comment for int/float in _make_iterencode
key = convert2Es6Format(key)
elif _skipkeys:
continue
else:
raise TypeError("key " + repr(key) + " is not a string")
if first:
first = False
else:
yield item_separator
yield _encoder(key)
yield _key_separator
if isinstance(value, str):
yield _encoder(value)
elif value is None:
yield 'null'
elif value is True:
yield 'true'
elif value is False:
yield 'false'
elif isinstance(value, int):
# see comment for int/float in _make_iterencode
yield convert2Es6Format(value)
elif isinstance(value, float):
# see comment for int/float in _make_iterencode
yield convert2Es6Format(value)
else:
if isinstance(value, (list, tuple)):
chunks = _iterencode_list(value, _current_indent_level)
elif isinstance(value, dict):
chunks = _iterencode_dict(value, _current_indent_level)
else:
chunks = _iterencode(value, _current_indent_level)
# Below line commented-out for python2 compatibility
# yield from chunks
for chunk in chunks:
yield chunk
if newline_indent is not None:
_current_indent_level -= 1
yield '\n' + _indent * _current_indent_level
yield '}'
if markers is not None:
del markers[markerid]
def _iterencode(o, _current_indent_level):
# Replaced isinstance(o, str) with below to enable simultaneous python 2 & 3 compatibility
if isinstance(o, six.string_types) or isinstance(o, six.binary_type):
yield _encoder(o)
elif o is None:
yield 'null'
elif o is True:
yield 'true'
elif o is False:
yield 'false'
elif isinstance(o, int):
# see comment for int/float in _make_iterencode
yield convert2Es6Format(o)
elif isinstance(o, float):
# see comment for int/float in _make_iterencode
yield convert2Es6Format(o)
elif isinstance(o, (list, tuple)):
# Below line commented-out for python2 compatibility
# yield from _iterencode_list(o, _current_indent_level)
for thing in _iterencode_list(o, _current_indent_level):
yield thing
elif isinstance(o, dict):
# Below line commented-out for python2 compatibility
# yield from _iterencode_dict(o, _current_indent_level)
for thing in _iterencode_dict(o, _current_indent_level):
yield thing
else:
if markers is not None:
markerid = id(o)
if markerid in markers:
raise ValueError("Circular reference detected")
markers[markerid] = o
o = _default(o)
# Below line commented-out for python2 compatibility
# yield from _iterencode(o, _current_indent_level)
for thing in _iterencode(o, _current_indent_level):
yield thing
if markers is not None:
del markers[markerid]
return _iterencode
def canonicalize(obj, utf8=True):
textVal = JSONEncoder(sort_keys=True).encode(obj)
if utf8:
return textVal.encode()
return textVal
def serialize(obj, utf8=True):
textVal = JSONEncoder(sort_keys=False).encode(obj)
if utf8:
return textVal.encode()
return textVal
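For orientation, a minimal usage sketch of the two helpers above. The import path (stix2.canonicalization.Canonicalize) is an assumption about where this vendored JCS code is placed; the exact output bytes depend on the encoder settings defined earlier in this module.
import json  # only used here to show the round trip is still plain JSON

from stix2.canonicalization.Canonicalize import canonicalize, serialize

data = {"b": 2, "a": 1, "ratio": 0.5}

# canonicalize() sorts keys (JCS / RFC 8785 style) and returns UTF-8 bytes by
# default, which is the form hashed for deterministic ID generation.
canonical_bytes = canonicalize(data)

# serialize() keeps the caller's key order and is otherwise identical.
text = serialize(data, utf8=False)

assert json.loads(canonical_bytes.decode()) == data
print(canonical_bytes, text)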


@ -0,0 +1,95 @@
##############################################################################
# #
# Copyright 2006-2019 WebPKI.org (http://webpki.org). #
# #
# Licensed under the Apache License, Version 2.0 (the "License"); #
# you may not use this file except in compliance with the License. #
# You may obtain a copy of the License at #
# #
# https://www.apache.org/licenses/LICENSE-2.0 #
# #
# Unless required by applicable law or agreed to in writing, software #
# distributed under the License is distributed on an "AS IS" BASIS, #
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. #
# See the License for the specific language governing permissions and #
# limitations under the License. #
# #
##############################################################################
##################################################################
# Convert a Python double/float into an ES6/V8 compatible string #
##################################################################
def convert2Es6Format(value):
# Convert double/float to str using the native Python formatter
fvalue = float(value)
# Zero is a special case. The following test also catches the "-0" case
if fvalue == 0:
return '0'
# The rest of the algorithm works on the textual representation only
pyDouble = str(fvalue)
# The following line catches the "inf" and "nan" values returned by str(fvalue)
if pyDouble.find('n') >= 0:
raise ValueError("Invalid JSON number: " + pyDouble)
# Save sign separately, it doesn't have any role in the algorithm
pySign = ''
if pyDouble.find('-') == 0:
pySign = '-'
pyDouble = pyDouble[1:]
# Now we should only have valid non-zero values
pyExpStr = ''
pyExpVal = 0
q = pyDouble.find('e')
if q > 0:
# Grab the exponent and remove it from the number
pyExpStr = pyDouble[q:]
if pyExpStr[2:3] == '0':
# Suppress leading zero on exponents
pyExpStr = pyExpStr[:2] + pyExpStr[3:]
pyDouble = pyDouble[0:q]
pyExpVal = int(pyExpStr[1:])
# Split number in pyFirst + pyDot + pyLast
pyFirst = pyDouble
pyDot = ''
pyLast = ''
q = pyDouble.find('.')
if q > 0:
pyDot = '.'
pyFirst = pyDouble[:q]
pyLast = pyDouble[q + 1:]
# Now the string is split into: pySign + pyFirst + pyDot + pyLast + pyExpStr
if pyLast == '0':
# Always remove trailing .0
pyDot = ''
pyLast = ''
if pyExpVal > 0 and pyExpVal < 21:
# Integers are shown as is with up to 21 digits
pyFirst += pyLast
pyLast = ''
pyDot = ''
pyExpStr = ''
q = pyExpVal - len(pyFirst)
while q >= 0:
q -= 1
pyFirst += '0'
elif pyExpVal < 0 and pyExpVal > -7:
# Small numbers are shown as 0.etc with e-6 as lower limit
pyLast = pyFirst + pyLast
pyFirst = '0'
pyDot = '.'
pyExpStr = ''
q = pyExpVal
while q < -1:
q += 1
pyLast = '0' + pyLast
# The resulting sub-strings are concatenated
return pySign + pyFirst + pyDot + pyLast + pyExpStr
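A few spot checks of the formatter above, traced by hand against the algorithm; these are illustrative assertions, not an exhaustive test suite.
# Trailing ".0" is dropped, leading zeros in exponents are suppressed, and the
# ES6 21-digit cutover between plain and exponent notation is respected.
assert convert2Es6Format(100.0) == '100'
assert convert2Es6Format(0.5) == '0.5'
assert convert2Es6Format(-0.0) == '0'
assert convert2Es6Format(1e-7) == '1e-7'
assert convert2Es6Format(1e20) == '100000000000000000000'
assert convert2Es6Format(1e21) == '1e+21'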


@ -0,0 +1,10 @@
"""
Functions to operate with STIX2 Confidence scales.
.. autosummary::
:toctree: confidence
scales
|
"""

stix2/confidence/scales.py Normal file

@ -0,0 +1,571 @@
# -*- coding: utf-8 -*-
"""Functions to perform conversions between the different Confidence scales.
As specified in STIX Version 2.1. Part 1: STIX Core Concepts - Appendix B"""
def none_low_med_high_to_value(scale_value):
"""
This method will transform a string value from the None / Low / Med /
High scale to its confidence integer representation.
The scale for this confidence representation is the following:
.. list-table:: None, Low, Med, High to STIX Confidence
:header-rows: 1
* - None/ Low/ Med/ High
- STIX Confidence Value
* - Not Specified
- Not Specified
* - None
- 0
* - Low
- 15
* - Med
- 50
* - High
- 85
Args:
scale_value (str): A string value from the scale. Accepted strings are
"None", "Low", "Med" and "High". Argument is case sensitive.
Returns:
int: The numerical representation corresponding to values in the
None / Low / Med / High scale.
Raises:
ValueError: If `scale_value` is not within the accepted strings.
"""
if scale_value == 'None':
return 0
elif scale_value == 'Low':
return 15
elif scale_value == 'Med':
return 50
elif scale_value == 'High':
return 85
else:
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
def value_to_none_low_medium_high(confidence_value):
"""
This method will transform an integer value into the None / Low / Med /
High scale string representation.
The scale for this confidence representation is the following:
.. list-table:: STIX Confidence to None, Low, Med, High
:header-rows: 1
* - Range of Values
- None/ Low/ Med/ High
* - 0
- None
* - 1-29
- Low
* - 30-69
- Med
* - 70-100
- High
Args:
confidence_value (int): An integer value between 0 and 100.
Returns:
str: A string corresponding to the None / Low / Med / High scale.
Raises:
ValueError: If `confidence_value` is out of bounds.
"""
if confidence_value == 0:
return 'None'
elif 29 >= confidence_value >= 1:
return 'Low'
elif 69 >= confidence_value >= 30:
return 'Med'
elif 100 >= confidence_value >= 70:
return 'High'
else:
raise ValueError("Range of values out of bounds: %s" % confidence_value)
def zero_ten_to_value(scale_value):
"""
This method will transform a string value from the 0-10 scale to its
confidence integer representation.
The scale for this confidence representation is the following:
.. list-table:: 0-10 to STIX Confidence
:header-rows: 1
* - 0-10 Scale
- STIX Confidence Value
* - 0
- 0
* - 1
- 10
* - 2
- 20
* - 3
- 30
* - 4
- 40
* - 5
- 50
* - 6
- 60
* - 7
- 70
* - 8
- 80
* - 9
- 90
* - 10
- 100
Args:
scale_value (str): A string value from the scale. Accepted strings are "0"
through "10" inclusive.
Returns:
int: The numerical representation corresponding to values in the 0-10
scale.
Raises:
ValueError: If `scale_value` is not within the accepted strings.
"""
if scale_value == '0':
return 0
elif scale_value == '1':
return 10
elif scale_value == '2':
return 20
elif scale_value == '3':
return 30
elif scale_value == '4':
return 40
elif scale_value == '5':
return 50
elif scale_value == '6':
return 60
elif scale_value == '7':
return 70
elif scale_value == '8':
return 80
elif scale_value == '9':
return 90
elif scale_value == '10':
return 100
else:
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
def value_to_zero_ten(confidence_value):
"""
This method will transform an integer value into the 0-10 scale string
representation.
The scale for this confidence representation is the following:
.. list-table:: STIX Confidence to 0-10
:header-rows: 1
* - Range of Values
- 0-10 Scale
* - 0-4
- 0
* - 5-14
- 1
* - 15-24
- 2
* - 25-34
- 3
* - 35-44
- 4
* - 45-54
- 5
* - 55-64
- 6
* - 65-74
- 7
* - 75-84
- 8
* - 85-94
- 9
* - 95-100
- 10
Args:
confidence_value (int): An integer value between 0 and 100.
Returns:
str: A string corresponding to the 0-10 scale.
Raises:
ValueError: If `confidence_value` is out of bounds.
"""
if 4 >= confidence_value >= 0:
return '0'
elif 14 >= confidence_value >= 5:
return '1'
elif 24 >= confidence_value >= 15:
return '2'
elif 34 >= confidence_value >= 25:
return '3'
elif 44 >= confidence_value >= 35:
return '4'
elif 54 >= confidence_value >= 45:
return '5'
elif 64 >= confidence_value >= 55:
return '6'
elif 74 >= confidence_value >= 65:
return '7'
elif 84 >= confidence_value >= 75:
return '8'
elif 94 >= confidence_value >= 85:
return '9'
elif 100 >= confidence_value >= 95:
return '10'
else:
raise ValueError("Range of values out of bounds: %s" % confidence_value)
def admiralty_credibility_to_value(scale_value):
"""
This method will transform a string value from the Admiralty Credibility
scale to its confidence integer representation.
The scale for this confidence representation is the following:
.. list-table:: Admiralty Credibility Scale to STIX Confidence
:header-rows: 1
* - Admiralty Credibility
- STIX Confidence Value
* - 6 - Truth cannot be judged
- (Not present)
* - 5 - Improbable
- 10
* - 4 - Doubtful
- 30
* - 3 - Possibly True
- 50
* - 2 - Probably True
- 70
* - 1 - Confirmed by other sources
- 90
Args:
scale_value (str): A string value from the scale. Accepted strings are
"6 - Truth cannot be judged", "5 - Improbable", "4 - Doubtful",
"3 - Possibly True", "2 - Probably True" and
"1 - Confirmed by other sources". Argument is case sensitive.
Returns:
int: The numerical representation corresponding to values in the
Admiralty Credibility scale.
Raises:
ValueError: If `scale_value` is not within the accepted strings.
"""
if scale_value == '6 - Truth cannot be judged':
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
elif scale_value == '5 - Improbable':
return 10
elif scale_value == '4 - Doubtful':
return 30
elif scale_value == '3 - Possibly True':
return 50
elif scale_value == '2 - Probably True':
return 70
elif scale_value == '1 - Confirmed by other sources':
return 90
else:
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
def value_to_admiralty_credibility(confidence_value):
"""
This method will transform an integer value into the Admiralty Credibility
scale string representation.
The scale for this confidence representation is the following:
.. list-table:: STIX Confidence to Admiralty Credibility Scale
:header-rows: 1
* - Range of Values
- Admiralty Credibility
* - N/A
- 6 - Truth cannot be judged
* - 0-19
- 5 - Improbable
* - 20-39
- 4 - Doubtful
* - 40-59
- 3 - Possibly True
* - 60-79
- 2 - Probably True
* - 80-100
- 1 - Confirmed by other sources
Args:
confidence_value (int): An integer value between 0 and 100.
Returns:
str: A string corresponding to the Admiralty Credibility scale.
Raises:
ValueError: If `confidence_value` is out of bounds.
"""
if 19 >= confidence_value >= 0:
return '5 - Improbable'
elif 39 >= confidence_value >= 20:
return '4 - Doubtful'
elif 59 >= confidence_value >= 40:
return '3 - Possibly True'
elif 79 >= confidence_value >= 60:
return '2 - Probably True'
elif 100 >= confidence_value >= 80:
return '1 - Confirmed by other sources'
else:
raise ValueError("Range of values out of bounds: %s" % confidence_value)
def wep_to_value(scale_value):
"""
This method will transform a string value from the WEP scale to its
confidence integer representation.
The scale for this confidence representation is the following:
.. list-table:: WEP to STIX Confidence
:header-rows: 1
* - WEP
- STIX Confidence Value
* - Impossible
- 0
* - Highly Unlikely/Almost Certainly Not
- 10
* - Unlikely/Probably Not
- 30
* - Even Chance
- 50
* - Likely/Probable
- 70
* - Highly likely/Almost Certain
- 90
* - Certain
- 100
Args:
scale_value (str): A string value from the scale. Accepted strings are
"Impossible", "Highly Unlikely/Almost Certainly Not",
"Unlikely/Probably Not", "Even Chance", "Likely/Probable",
"Highly likely/Almost Certain" and "Certain". Argument is case
sensitive.
Returns:
int: The numerical representation corresponding to values in the WEP
scale.
Raises:
ValueError: If `scale_value` is not within the accepted strings.
"""
if scale_value == 'Impossible':
return 0
elif scale_value == 'Highly Unlikely/Almost Certainly Not':
return 10
elif scale_value == 'Unlikely/Probably Not':
return 30
elif scale_value == 'Even Chance':
return 50
elif scale_value == 'Likely/Probable':
return 70
elif scale_value == 'Highly likely/Almost Certain':
return 90
elif scale_value == 'Certain':
return 100
else:
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
def value_to_wep(confidence_value):
"""
This method will transform an integer value into the WEP scale string
representation.
The scale for this confidence representation is the following:
.. list-table:: STIX Confidence to WEP
:header-rows: 1
* - Range of Values
- WEP
* - 0
- Impossible
* - 1-19
- Highly Unlikely/Almost Certainly Not
* - 20-39
- Unlikely/Probably Not
* - 40-59
- Even Chance
* - 60-79
- Likely/Probable
* - 80-99
- Highly likely/Almost Certain
* - 100
- Certain
Args:
confidence_value (int): An integer value between 0 and 100.
Returns:
str: A string corresponding to the WEP scale.
Raises:
ValueError: If `confidence_value` is out of bounds.
"""
if confidence_value == 0:
return 'Impossible'
elif 19 >= confidence_value >= 1:
return 'Highly Unlikely/Almost Certainly Not'
elif 39 >= confidence_value >= 20:
return 'Unlikely/Probably Not'
elif 59 >= confidence_value >= 40:
return 'Even Chance'
elif 79 >= confidence_value >= 60:
return 'Likely/Probable'
elif 99 >= confidence_value >= 80:
return 'Highly likely/Almost Certain'
elif confidence_value == 100:
return 'Certain'
else:
raise ValueError("Range of values out of bounds: %s" % confidence_value)
def dni_to_value(scale_value):
"""
This method will transform a string value from the DNI scale to its
confidence integer representation.
The scale for this confidence representation is the following:
.. list-table:: DNI Scale to STIX Confidence
:header-rows: 1
* - DNI Scale
- STIX Confidence Value
* - Almost No Chance / Remote
- 5
* - Very Unlikely / Highly Improbable
- 15
* - Unlikely / Improbable
- 30
* - Roughly Even Chance / Roughly Even Odds
- 50
* - Likely / Probable
- 70
* - Very Likely / Highly Probable
- 85
* - Almost Certain / Nearly Certain
- 95
Args:
scale_value (str): A string value from the scale. Accepted strings are
"Almost No Chance / Remote", "Very Unlikely / Highly Improbable",
"Unlikely / Improbable", "Roughly Even Chance / Roughly Even Odds",
"Likely / Probable", "Very Likely / Highly Probable" and
"Almost Certain / Nearly Certain". Argument is case sensitive.
Returns:
int: The numerical representation corresponding to values in the DNI
scale.
Raises:
ValueError: If `scale_value` is not within the accepted strings.
"""
if scale_value == 'Almost No Chance / Remote':
return 5
elif scale_value == 'Very Unlikely / Highly Improbable':
return 15
elif scale_value == 'Unlikely / Improbable':
return 30
elif scale_value == 'Roughly Even Chance / Roughly Even Odds':
return 50
elif scale_value == 'Likely / Probable':
return 70
elif scale_value == 'Very Likely / Highly Probable':
return 85
elif scale_value == 'Almost Certain / Nearly Certain':
return 95
else:
raise ValueError("STIX Confidence value cannot be determined for %s" % scale_value)
def value_to_dni(confidence_value):
"""
This method will transform an integer value into the DNI scale string
representation.
The scale for this confidence representation is the following:
.. list-table:: STIX Confidence to DNI Scale
:header-rows: 1
* - Range of Values
- DNI Scale
* - 0-9
- Almost No Chance / Remote
* - 10-19
- Very Unlikely / Highly Improbable
* - 20-39
- Unlikely / Improbable
* - 40-59
- Roughly Even Chance / Roughly Even Odds
* - 60-79
- Likely / Probable
* - 80-89
- Very Likely / Highly Probable
* - 90-100
- Almost Certain / Nearly Certain
Args:
confidence_value (int): An integer value between 0 and 100.
Returns:
str: A string corresponding to the DNI scale.
Raises:
ValueError: If `confidence_value` is out of bounds.
"""
if 9 >= confidence_value >= 0:
return 'Almost No Chance / Remote'
elif 19 >= confidence_value >= 10:
return 'Very Unlikely / Highly Improbable'
elif 39 >= confidence_value >= 20:
return 'Unlikely / Improbable'
elif 59 >= confidence_value >= 40:
return 'Roughly Even Chance / Roughly Even Odds'
elif 79 >= confidence_value >= 60:
return 'Likely / Probable'
elif 89 >= confidence_value >= 80:
return 'Very Likely / Highly Probable'
elif 100 >= confidence_value >= 90:
return 'Almost Certain / Nearly Certain'
else:
raise ValueError("Range of values out of bounds: %s" % confidence_value)


@ -1,180 +0,0 @@
"""STIX 2.0 Objects that are neither SDOs nor SROs."""
from collections import OrderedDict
import importlib
import pkgutil
import stix2
from . import exceptions
from .base import _STIXBase
from .properties import IDProperty, ListProperty, Property, TypeProperty
from .utils import _get_dict, get_class_hierarchy_names
class STIXObjectProperty(Property):
def __init__(self, allow_custom=False, *args, **kwargs):
self.allow_custom = allow_custom
super(STIXObjectProperty, self).__init__(*args, **kwargs)
def clean(self, value):
# Any STIX Object (SDO, SRO, or Marking Definition) can be added to
# a bundle with no further checks.
if any(x in ('STIXDomainObject', 'STIXRelationshipObject', 'MarkingDefinition')
for x in get_class_hierarchy_names(value)):
return value
try:
dictified = _get_dict(value)
except ValueError:
raise ValueError("This property may only contain a dictionary or object")
if dictified == {}:
raise ValueError("This property may only contain a non-empty dictionary or object")
if 'type' in dictified and dictified['type'] == 'bundle':
raise ValueError('This property may not contain a Bundle object')
if self.allow_custom:
parsed_obj = parse(dictified, allow_custom=True)
else:
parsed_obj = parse(dictified)
return parsed_obj
class Bundle(_STIXBase):
"""For more detailed information on this object's properties, see
`the STIX 2.0 specification <http://docs.oasis-open.org/cti/stix/v2.0/cs01/part1-stix-core/stix-v2.0-cs01-part1-stix-core.html#_Toc496709293>`__.
"""
_type = 'bundle'
_properties = OrderedDict()
_properties.update([
('type', TypeProperty(_type)),
('id', IDProperty(_type)),
('spec_version', Property(fixed="2.0")),
('objects', ListProperty(STIXObjectProperty)),
])
def __init__(self, *args, **kwargs):
# Add any positional arguments to the 'objects' kwarg.
if args:
if isinstance(args[0], list):
kwargs['objects'] = args[0] + list(args[1:]) + kwargs.get('objects', [])
else:
kwargs['objects'] = list(args) + kwargs.get('objects', [])
self.__allow_custom = kwargs.get('allow_custom', False)
self._properties['objects'].contained.allow_custom = kwargs.get('allow_custom', False)
super(Bundle, self).__init__(**kwargs)
STIX2_OBJ_MAPS = {}
def parse(data, allow_custom=False, version=None):
"""Convert a string, dict or file-like object into a STIX object.
Args:
data (str, dict, file-like object): The STIX 2 content to be parsed.
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects. Note that unknown custom objects cannot be parsed
into STIX objects, and will be returned as-is. Default: False.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
Returns:
An instantiated Python STIX object.
WARNING: 'allow_custom=True' will allow the return of any supplied STIX
dict(s) that cannot be mapped to any known STIX object type (either STIX2
domain objects or defined custom STIX2 objects); NO validation is done. This is
done to allow the processing of possibly unknown custom STIX objects (example
scenario: I need to query a third-party TAXII endpoint that could provide custom
STIX objects that I don't know about ahead of time)
"""
# convert STIX object to dict, if not already
obj = _get_dict(data)
# convert dict to full python-stix2 obj
obj = dict_to_stix2(obj, allow_custom, version)
return obj
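A minimal sketch of the parse() entry point shown above (it is also re-exported as stix2.parse; the Identity values and ID below are illustrative):
import stix2

identity = stix2.parse(
    {
        "type": "identity",
        "id": "identity--311b2d2d-f010-4473-83ec-1edf84858f4c",
        "name": "ACME Corp",
        "identity_class": "organization",
    },
    version="2.0",
)
print(identity.name)  # ACME Corp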
def dict_to_stix2(stix_dict, allow_custom=False, version=None):
"""convert dictionary to full python-stix2 object
Args:
stix_dict (dict): a python dictionary of a STIX object
that (presumably) is semantically correct to be parsed
into a full python-stix2 obj
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects. Note that unknown custom objects cannot be parsed
into STIX objects, and will be returned as-is. Default: False.
Returns:
An instantiated Python STIX object
WARNING: 'allow_custom=True' will allow the return of any supplied STIX
dict(s) that cannot be mapped to any known STIX object type (either STIX2
domain objects or defined custom STIX2 objects); NO validation is done. This is
done to allow the processing of possibly unknown custom STIX objects (example
scenario: I need to query a third-party TAXII endpoint that could provide custom
STIX objects that I don't know about ahead of time)
"""
if not version:
# Use latest version
v = 'v' + stix2.DEFAULT_VERSION.replace('.', '')
else:
v = 'v' + version.replace('.', '')
OBJ_MAP = STIX2_OBJ_MAPS[v]
if 'type' not in stix_dict:
raise exceptions.ParseError("Can't parse object with no 'type' property: %s" % str(stix_dict))
try:
obj_class = OBJ_MAP[stix_dict['type']]
except KeyError:
if allow_custom:
# flag allows for unknown custom objects too, but will not
# be parsed into STIX object, returned as is
return stix_dict
raise exceptions.ParseError("Can't parse unknown object type '%s'! For custom types, use the CustomObject decorator." % stix_dict['type'])
return obj_class(allow_custom=allow_custom, **stix_dict)
def _register_type(new_type, version=None):
"""Register a custom STIX Object type.
Args:
new_type (class): A class to register in the Object map.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
"""
if not version:
# Use latest version
v = 'v' + stix2.DEFAULT_VERSION.replace('.', '')
else:
v = 'v' + version.replace('.', '')
OBJ_MAP = STIX2_OBJ_MAPS[v]
OBJ_MAP[new_type._type] = new_type
def _collect_stix2_obj_maps():
"""Navigate the package once and retrieve all OBJ_MAP dicts for each v2X
package."""
if not STIX2_OBJ_MAPS:
top_level_module = importlib.import_module('stix2')
path = top_level_module.__path__
prefix = str(top_level_module.__name__) + '.'
for module_loader, name, is_pkg in pkgutil.walk_packages(path=path,
prefix=prefix):
if name.startswith('stix2.v2') and is_pkg:
mod = importlib.import_module(name, str(top_level_module.__name__))
STIX2_OBJ_MAPS[name.split('.')[-1]] = mod.OBJ_MAP

stix2/custom.py Normal file

@ -0,0 +1,92 @@
from collections import OrderedDict
import six
from .base import _cls_init
from .parsing import (
_register_marking, _register_object, _register_observable,
_register_observable_extension,
)
def _get_properties_dict(properties):
try:
return OrderedDict(properties)
except TypeError as e:
six.raise_from(
ValueError(
"properties must be dict-like, e.g. a list "
"containing tuples. For example, "
"[('property1', IntegerProperty())]",
),
e,
)
def _custom_object_builder(cls, type, properties, version, base_class):
prop_dict = _get_properties_dict(properties)
class _CustomObject(cls, base_class):
_type = type
_properties = prop_dict
def __init__(self, **kwargs):
base_class.__init__(self, **kwargs)
_cls_init(cls, self, kwargs)
_register_object(_CustomObject, version=version)
return _CustomObject
def _custom_marking_builder(cls, type, properties, version, base_class):
prop_dict = _get_properties_dict(properties)
class _CustomMarking(cls, base_class):
_type = type
_properties = prop_dict
def __init__(self, **kwargs):
base_class.__init__(self, **kwargs)
_cls_init(cls, self, kwargs)
_register_marking(_CustomMarking, version=version)
return _CustomMarking
def _custom_observable_builder(cls, type, properties, version, base_class, id_contrib_props=None):
if id_contrib_props is None:
id_contrib_props = []
prop_dict = _get_properties_dict(properties)
class _CustomObservable(cls, base_class):
_type = type
_properties = prop_dict
if version != '2.0':
_id_contributing_properties = id_contrib_props
def __init__(self, **kwargs):
base_class.__init__(self, **kwargs)
_cls_init(cls, self, kwargs)
_register_observable(_CustomObservable, version=version)
return _CustomObservable
def _custom_extension_builder(cls, observable, type, properties, version, base_class):
prop_dict = _get_properties_dict(properties)
class _CustomExtension(cls, base_class):
_type = type
_properties = prop_dict
def __init__(self, **kwargs):
base_class.__init__(self, **kwargs)
_cls_init(cls, self, kwargs)
_register_observable_extension(observable, _CustomExtension, version=version)
return _CustomExtension


@ -1,4 +1,5 @@
"""Python STIX 2.0 DataStore API.
"""
Python STIX2 DataStore API.
.. autosummary::
:toctree: datastore
@ -83,7 +84,8 @@ class DataStoreMixin(object):
try:
return self.source.get(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def all_versions(self, *args, **kwargs):
"""Retrieve all versions of a single STIX object by ID.
@ -100,7 +102,8 @@ class DataStoreMixin(object):
try:
return self.source.all_versions(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def query(self, *args, **kwargs):
"""Retrieve STIX objects matching a set of filters.
@ -118,7 +121,8 @@ class DataStoreMixin(object):
try:
return self.source.query(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def creator_of(self, *args, **kwargs):
"""Retrieve the Identity refered to by the object's `created_by_ref`.
@ -137,7 +141,8 @@ class DataStoreMixin(object):
try:
return self.source.creator_of(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def relationships(self, *args, **kwargs):
"""Retrieve Relationships involving the given STIX object.
@ -163,7 +168,8 @@ class DataStoreMixin(object):
try:
return self.source.relationships(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def related_to(self, *args, **kwargs):
"""Retrieve STIX Objects that have a Relationship involving the given
@ -193,7 +199,8 @@ class DataStoreMixin(object):
try:
return self.source.related_to(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data source to query' % self.__class__.__name__)
msg = "%s has no data source to query"
raise AttributeError(msg % self.__class__.__name__)
def add(self, *args, **kwargs):
"""Method for storing STIX objects.
@ -208,7 +215,8 @@ class DataStoreMixin(object):
try:
return self.sink.add(*args, **kwargs)
except AttributeError:
raise AttributeError('%s has no data sink to put objects in' % self.__class__.__name__)
msg = "%s has no data sink to put objects in"
raise AttributeError(msg % self.__class__.__name__)
class DataSink(with_metaclass(ABCMeta)):
@ -301,7 +309,7 @@ class DataSource(with_metaclass(ABCMeta)):
"""
def creator_of(self, obj):
"""Retrieve the Identity refered to by the object's `created_by_ref`.
"""Retrieve the Identity referred to by the object's `created_by_ref`.
Args:
obj: The STIX object whose `created_by_ref` property will be looked
@ -412,7 +420,7 @@ class CompositeDataSource(DataSource):
"""Controller for all the attached DataSources.
A user can have a single CompositeDataSource as an interface
the a set of DataSources. When an API call is made to the
to a set of DataSources. When an API call is made to the
CompositeDataSource, it is delegated to each of the (real)
DataSources that are attached to it.
@ -457,7 +465,7 @@ class CompositeDataSource(DataSource):
"""
if not self.has_data_sources():
raise AttributeError('CompositeDataSource has no data sources')
raise AttributeError("CompositeDataSource has no data sources")
all_data = []
all_filters = FilterSet()
@ -504,7 +512,7 @@ class CompositeDataSource(DataSource):
"""
if not self.has_data_sources():
raise AttributeError('CompositeDataSource has no data sources')
raise AttributeError("CompositeDataSource has no data sources")
all_data = []
all_filters = FilterSet()
@ -543,7 +551,7 @@ class CompositeDataSource(DataSource):
"""
if not self.has_data_sources():
raise AttributeError('CompositeDataSource has no data sources')
raise AttributeError("CompositeDataSource has no data sources")
if not query:
# don't mess with the query (i.e. deduplicate, as that's done
@ -594,7 +602,7 @@ class CompositeDataSource(DataSource):
"""
if not self.has_data_sources():
raise AttributeError('CompositeDataSource has no data sources')
raise AttributeError("CompositeDataSource has no data sources")
results = []
for ds in self.data_sources:
@ -634,7 +642,7 @@ class CompositeDataSource(DataSource):
"""
if not self.has_data_sources():
raise AttributeError('CompositeDataSource has no data sources')
raise AttributeError("CompositeDataSource has no data sources")
results = []
for ds in self.data_sources:


@ -1,15 +1,492 @@
"""
Python STIX 2.0 FileSystem Source/Sink
"""
"""Python STIX2 FileSystem Source/Sink"""
import errno
import io
import json
import os
import re
import stat
from stix2.core import Bundle, parse
from stix2.datastore import DataSink, DataSource, DataStoreMixin
import six
from stix2 import v20, v21
from stix2.base import _STIXBase
from stix2.datastore import (
DataSink, DataSource, DataSourceError, DataStoreMixin,
)
from stix2.datastore.filters import Filter, FilterSet, apply_common_filters
from stix2.utils import deduplicate, get_class_hierarchy_names
from stix2.parsing import parse
from stix2.utils import format_datetime, get_type_from_id
def _timestamp2filename(timestamp):
"""
Encapsulates a way to create unique filenames based on an object's
"modified" property value. This should not include an extension.
Args:
timestamp: A timestamp, as a datetime.datetime object.
"""
# The format_datetime will determine the correct level of precision.
ts = format_datetime(timestamp)
ts = re.sub(r"[-T:\.Z ]", "", ts)
return ts
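For example (the timestamp is an arbitrary illustration; naive datetimes are treated as UTC by format_datetime):
import datetime

ts = datetime.datetime(2020, 4, 3, 17, 44, 52)
print(_timestamp2filename(ts))  # "20200403174452" (precision handling is delegated to format_datetime)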
class AuthSet(object):
"""
Represents either a whitelist or blacklist of values, where/what we
must/must not search to find objects which match a query. (Maybe "AuthSet"
isn't the right name, but determining authorization is a typical context in
which black/white lists are used.)
The set may be empty. For a whitelist, this means you mustn't search
anywhere, which means the query was impossible to match, so you can skip
searching altogether. For a blacklist, this means nothing is excluded
and you must search everywhere.
"""
BLACK = 0
WHITE = 1
def __init__(self, allowed, prohibited):
"""
Initialize this AuthSet from the given sets of allowed and/or
prohibited values. The type of set (black or white) is determined
from the allowed and/or prohibited values given.
Args:
allowed: A set of allowed values (or None if no allow filters
were found in the query)
prohibited: A set of prohibited values (not None)
"""
if allowed is None:
self.__values = prohibited
self.__type = AuthSet.BLACK
else:
# There was at least one allow filter, so create a whitelist. But
# any matching prohibited values create a combination of conditions
# which can never match. So exclude those.
self.__values = allowed - prohibited
self.__type = AuthSet.WHITE
@property
def values(self):
"""
Get the values in this white/blacklist, as a set.
"""
return self.__values
@property
def auth_type(self):
"""
Get the type of set: AuthSet.WHITE or AuthSet.BLACK.
"""
return self.__type
def __repr__(self):
return "{}list: {}".format(
"white" if self.auth_type == AuthSet.WHITE else "black",
self.values,
)
# A fixed, reusable AuthSet which accepts anything. It came in handy.
_AUTHSET_ANY = AuthSet(None, set())
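To make the semantics concrete (the type names below are arbitrary examples):
allow = AuthSet({"indicator", "malware"}, {"malware"})
assert allow.auth_type == AuthSet.WHITE and allow.values == {"indicator"}

deny = AuthSet(None, {"campaign"})
assert deny.auth_type == AuthSet.BLACK  # search everything except "campaign"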
def _update_allow(allow_set, value):
"""
Updates the given set of "allow" values. The first time an update to the
set occurs, the value(s) are added. Thereafter, since all filters are
implicitly AND'd, the given values are intersected with the existing allow
set, which may remove values. At the end, it may even wind up empty.
Args:
allow_set: The allow set, or None
value: The value(s) to add (single value, or iterable of values)
Returns:
The updated allow set (not None)
"""
adding_seq = hasattr(value, "__iter__") and \
not isinstance(value, six.string_types)
if allow_set is None:
allow_set = set()
if adding_seq:
allow_set.update(value)
else:
allow_set.add(value)
else:
# strangely, the "&=" operator requires a set on the RHS
# whereas the method allows any iterable.
if adding_seq:
allow_set.intersection_update(value)
else:
allow_set.intersection_update({value})
return allow_set
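A quick trace of how successive filters narrow the allow set (illustrative values):
allow = _update_allow(None, "indicator")                # {"indicator"}
allow = _update_allow(allow, ["indicator", "malware"])  # intersect -> {"indicator"}
allow = _update_allow(allow, "malware")                 # intersect -> set(): query can never match
assert allow == set()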
def _find_search_optimizations(filters):
"""
Searches through all the filters, and creates white/blacklists of types and
IDs, which can be used to optimize the filesystem search.
Args:
filters: An iterable of filter objects representing a query
Returns:
A 2-tuple of AuthSet objects: the first is for object types, and
the second is for object IDs.
"""
# The basic approach to this is to determine what is allowed and
# prohibited, independently, and then combine them to create the final
# white/blacklists.
allowed_types = allowed_ids = None
prohibited_types = set()
prohibited_ids = set()
for filter_ in filters:
if filter_.property == "type":
if filter_.op in ("=", "in"):
allowed_types = _update_allow(allowed_types, filter_.value)
elif filter_.op == "!=":
prohibited_types.add(filter_.value)
elif filter_.property == "id":
if filter_.op == "=":
# An "allow" ID filter implies a type filter too, since IDs
# contain types within them.
allowed_ids = _update_allow(allowed_ids, filter_.value)
allowed_types = _update_allow(
allowed_types,
get_type_from_id(filter_.value),
)
elif filter_.op == "!=":
prohibited_ids.add(filter_.value)
elif filter_.op == "in":
allowed_ids = _update_allow(allowed_ids, filter_.value)
allowed_types = _update_allow(
allowed_types, (
get_type_from_id(id_) for id_ in filter_.value
),
)
opt_types = AuthSet(allowed_types, prohibited_types)
opt_ids = AuthSet(allowed_ids, prohibited_ids)
# If we have both type and ID whitelists, perform a type-based intersection
# on them, to further optimize. (Some of the cross-property constraints
# occur above; this is essentially a second pass which operates on the
# final whitelists, which among other things, incorporates any of the
# prohibitions found above.)
if opt_types.auth_type == AuthSet.WHITE and \
opt_ids.auth_type == AuthSet.WHITE:
opt_types.values.intersection_update(
get_type_from_id(id_) for id_ in opt_ids.values
)
opt_ids.values.intersection_update(
id_ for id_ in opt_ids.values
if get_type_from_id(id_) in opt_types.values
)
return opt_types, opt_ids
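For instance, a query filtering on type and excluding one ID yields a type whitelist and an ID blacklist (Filter comes from stix2.datastore.filters, imported above; the ID is a made-up example):
query = [
    Filter("type", "=", "indicator"),
    Filter("id", "!=", "indicator--11111111-1111-4111-8111-111111111111"),
]
auth_types, auth_ids = _find_search_optimizations(query)
assert auth_types.auth_type == AuthSet.WHITE and auth_types.values == {"indicator"}
assert auth_ids.auth_type == AuthSet.BLACK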
def _get_matching_dir_entries(parent_dir, auth_set, st_mode_test=None, ext=""):
"""
Search a directory (non-recursively), and find entries which match the
given criteria.
Args:
parent_dir: The directory to search
auth_set: an AuthSet instance, which represents a black/whitelist
filter on filenames
st_mode_test: A callable allowing filtering based on the type of
directory entry. E.g. just get directories, or just get files. It
will be passed the st_mode field of a stat() structure and should
return True to include the file, or False to exclude it. Easy thing to
do is pass one of the stat module functions, e.g. stat.S_ISREG. If
None, don't filter based on entry type.
ext: Determines how names from auth_set match up to directory
entries, and allows filtering by extension. The extension is added
to auth_set values to obtain directory entries; it is removed from
directory entries to obtain auth_set values. In this way, auth_set
may be treated as having only "basenames" of the entries. Only entries
having the given extension will be included in the results. If not
empty, the extension MUST include a leading ".". The default is the
empty string, which will result in direct comparisons, and no
extension-based filtering.
Returns:
(list): A list of directory entries matching the criteria. These will not
have any path info included; they will just be bare names.
Raises:
OSError: If there are errors accessing directory contents or stat()'ing
files
"""
results = []
if auth_set.auth_type == AuthSet.WHITE:
for value in auth_set.values:
filename = value + ext
try:
if st_mode_test:
s = os.stat(os.path.join(parent_dir, filename))
type_pass = st_mode_test(s.st_mode)
else:
type_pass = True
if type_pass:
results.append(filename)
except OSError as e:
if e.errno != errno.ENOENT:
raise
# else, file-not-found is ok, just skip
else: # auth_set is a blacklist
for entry in os.listdir(parent_dir):
if ext:
auth_name, this_ext = os.path.splitext(entry)
if this_ext != ext:
continue
else:
auth_name = entry
if auth_name in auth_set.values:
continue
try:
if st_mode_test:
s = os.stat(os.path.join(parent_dir, entry))
type_pass = st_mode_test(s.st_mode)
else:
type_pass = True
if type_pass:
results.append(entry)
except OSError as e:
if e.errno != errno.ENOENT:
raise
# else, file-not-found is ok, just skip
return results
def _check_object_from_file(query, filepath, allow_custom, version, encoding):
"""
Read a STIX object from the given file, and check it against the given
filters.
Args:
query: Iterable of filters
filepath (str): Path to file to read
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
encoding (str): The encoding to use when reading a file from the
filesystem.
Returns:
The (parsed) STIX object, if the object passes the filters. If
not, None is returned.
Raises:
TypeError: If the file had invalid JSON
IOError: If there are problems opening/reading the file
stix2.exceptions.STIXError: If there were problems creating a STIX
object from the JSON
"""
try:
with io.open(filepath, "r", encoding=encoding) as f:
stix_json = json.load(f)
except ValueError: # not a JSON file
raise TypeError(
"STIX JSON object at '{0}' could either not be parsed "
"to JSON or was not valid STIX JSON".format(filepath),
)
stix_obj = parse(stix_json, allow_custom, version)
if stix_obj["type"] == "bundle":
stix_obj = stix_obj["objects"][0]
# check against other filters, add if match
result = next(apply_common_filters([stix_obj], query), None)
return result
def _is_versioned_type_dir(type_path, type_name):
"""
Try to detect whether the given directory is for a versioned type of STIX
object. This is done by looking for a directory whose name is a STIX ID
of the appropriate type. If found, treat this type as versioned. This
doesn't work when a versioned type directory is empty (it will be
mis-classified as unversioned), but this detection is only necessary when
reading/querying data. If a directory is empty, you'll get no results
either way.
Args:
type_path: A path to a directory containing one type of STIX object.
type_name: The STIX type name.
Returns:
True if the directory looks like it contains versioned objects; False
if not.
Raises:
OSError: If there are errors accessing directory contents or stat()'ing
files
"""
id_regex = re.compile(
r"^" + re.escape(type_name) +
r"--[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}"
r"-[0-9a-f]{12}$",
re.I,
)
for entry in os.listdir(type_path):
s = os.stat(os.path.join(type_path, entry))
if stat.S_ISDIR(s.st_mode) and id_regex.match(entry):
is_versioned = True
break
else:
is_versioned = False
return is_versioned
def _search_versioned(query, type_path, auth_ids, allow_custom, version, encoding):
"""
Searches the given directory, which contains data for STIX objects of a
particular versioned type, and return any which match the query.
Args:
query: The query to match against
type_path: The directory with type-specific STIX object files
auth_ids: Search optimization based on object ID
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
encoding (str): The encoding to use when reading a file from the
filesystem.
Returns:
A list of all matching objects
Raises:
stix2.exceptions.STIXError: If any objects had invalid content
TypeError: If any objects had invalid content
IOError: If there were any problems opening/reading files
OSError: If there were any problems opening/reading files
"""
results = []
id_dirs = _get_matching_dir_entries(
type_path, auth_ids,
stat.S_ISDIR,
)
for id_dir in id_dirs:
id_path = os.path.join(type_path, id_dir)
# This leverages a more sophisticated function to do a simple thing:
# get all the JSON files from a directory. I guess it does give us
# file type checking, ensuring we only get regular files.
version_files = _get_matching_dir_entries(
id_path, _AUTHSET_ANY,
stat.S_ISREG, ".json",
)
for version_file in version_files:
version_path = os.path.join(id_path, version_file)
try:
stix_obj = _check_object_from_file(
query, version_path,
allow_custom, version,
encoding,
)
if stix_obj:
results.append(stix_obj)
except IOError as e:
if e.errno != errno.ENOENT:
raise
# else, file-not-found is ok, just skip
# For backward-compatibility, also search for plain files named after
# object IDs, in the type directory.
backcompat_results = _search_unversioned(
query, type_path, auth_ids, allow_custom, version, encoding,
)
results.extend(backcompat_results)
return results
def _search_unversioned(
query, type_path, auth_ids, allow_custom, version, encoding,
):
"""
Searches the given directory, which contains unversioned data, and return
any objects which match the query.
Args:
query: The query to match against
type_path: The directory with STIX files of unversioned type
auth_ids: Search optimization based on object ID
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
encoding (str): The encoding to use when reading a file from the
filesystem.
Returns:
A list of all matching objects
Raises:
stix2.exceptions.STIXError: If any objects had invalid content
TypeError: If any objects had invalid content
IOError: If there were any problems opening/reading files
OSError: If there were any problems opening/reading files
"""
results = []
id_files = _get_matching_dir_entries(
type_path, auth_ids, stat.S_ISREG,
".json",
)
for id_file in id_files:
id_path = os.path.join(type_path, id_file)
try:
stix_obj = _check_object_from_file(
query, id_path, allow_custom,
version, encoding,
)
if stix_obj:
results.append(stix_obj)
except IOError as e:
if e.errno != errno.ENOENT:
raise
# else, file-not-found is ok, just skip
return results
class FileSystemStore(DataStoreMixin):
@ -21,19 +498,21 @@ class FileSystemStore(DataStoreMixin):
Args:
stix_dir (str): path to directory of STIX objects
allow_custom (bool): whether to allow custom STIX content to be
pushed/retrieved. Defaults to True for FileSystemSource side(retrieving data)
and False for FileSystemSink side(pushing data). However, when
parameter is supplied, it will be applied to both FileSystemSource
and FileSystemSink.
bundlify (bool): whether to wrap objects in bundles when saving them.
Default: False.
pushed/retrieved. Defaults to True for the FileSystemSource side
(retrieving data) and False for the FileSystemSink side
(pushing data). However, when the parameter is supplied, it
will be applied to both FileSystemSource and FileSystemSink.
bundlify (bool): whether to wrap objects in bundles when saving
them. Default: False.
encoding (str): The encoding to use when reading a file from the
filesystem.
Attributes:
source (FileSystemSource): FileSystemSource
sink (FileSystemSink): FileSystemSink
"""
def __init__(self, stix_dir, allow_custom=None, bundlify=False):
def __init__(self, stix_dir, allow_custom=None, bundlify=False, encoding='utf-8'):
if allow_custom is None:
allow_custom_source = True
allow_custom_sink = False
@ -41,8 +520,8 @@ class FileSystemStore(DataStoreMixin):
allow_custom_sink = allow_custom_source = allow_custom
super(FileSystemStore, self).__init__(
source=FileSystemSource(stix_dir=stix_dir, allow_custom=allow_custom_source),
sink=FileSystemSink(stix_dir=stix_dir, allow_custom=allow_custom_sink, bundlify=bundlify)
source=FileSystemSource(stix_dir=stix_dir, allow_custom=allow_custom_source, encoding=encoding),
sink=FileSystemSink(stix_dir=stix_dir, allow_custom=allow_custom_sink, bundlify=bundlify),
)
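Taken together, a FileSystemStore can be used roughly like this (the directory path and Indicator values are illustrative; the target directory must already exist):
import os

from stix2 import FileSystemStore
from stix2.v21 import Indicator

os.makedirs("/tmp/stix-data", exist_ok=True)
store = FileSystemStore("/tmp/stix-data")

indicator = Indicator(
    name="Known bad IP",
    pattern="[ipv4-addr:value = '198.51.100.1']",
    pattern_type="stix",
    valid_from="2020-01-01T00:00:00Z",
)
store.add(indicator)
print(store.get(indicator.id).name)  # Known bad IP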
@ -74,19 +553,39 @@ class FileSystemSink(DataSink):
def stix_dir(self):
return self._stix_dir
def _check_path_and_write(self, stix_obj):
def _check_path_and_write(self, stix_obj, encoding='utf-8'):
"""Write the given STIX object to a file in the STIX file directory.
"""
path = os.path.join(self._stix_dir, stix_obj["type"], stix_obj["id"] + ".json")
type_dir = os.path.join(self._stix_dir, stix_obj["type"])
if not os.path.exists(os.path.dirname(path)):
os.makedirs(os.path.dirname(path))
# All versioned objects should have a "modified" property.
if "modified" in stix_obj:
filename = _timestamp2filename(stix_obj["modified"])
obj_dir = os.path.join(type_dir, stix_obj["id"])
else:
filename = stix_obj["id"]
obj_dir = type_dir
file_path = os.path.join(obj_dir, filename + ".json")
if not os.path.exists(obj_dir):
os.makedirs(obj_dir)
if self.bundlify:
stix_obj = Bundle(stix_obj, allow_custom=self.allow_custom)
if 'spec_version' in stix_obj:
# Assuming future specs will allow multiple SDO/SROs
# versions in a single bundle we won't need to check this
# and just use the latest supported Bundle version.
stix_obj = v21.Bundle(stix_obj, allow_custom=self.allow_custom)
else:
stix_obj = v20.Bundle(stix_obj, allow_custom=self.allow_custom)
with open(path, "w") as f:
f.write(str(stix_obj))
if os.path.isfile(file_path):
raise DataSourceError("Attempted to overwrite file (!) at: {}".format(file_path))
else:
with io.open(file_path, 'w', encoding=encoding) as f:
stix_obj = stix_obj.serialize(pretty=True, encoding=encoding, ensure_ascii=False)
f.write(stix_obj)
def add(self, stix_data=None, version=None):
"""Add STIX objects to file directory.
@ -95,8 +594,9 @@ class FileSystemSink(DataSink):
stix_data (STIX object OR dict OR str OR list): valid STIX 2.0 content
in a STIX object (or list of), dict (or list of), or a STIX 2.0
json encoded string.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Note:
``stix_data`` can be a Bundle object, but each object in it will be
@ -104,35 +604,30 @@ class FileSystemSink(DataSink):
the Bundle contained, but not the Bundle itself.
"""
if any(x in ('STIXDomainObject', 'STIXRelationshipObject', 'MarkingDefinition')
for x in get_class_hierarchy_names(stix_data)):
if isinstance(stix_data, (v20.Bundle, v21.Bundle)):
# recursively add individual STIX objects
for stix_obj in stix_data.get("objects", []):
self.add(stix_obj, version=version)
elif isinstance(stix_data, _STIXBase):
# adding python STIX object
self._check_path_and_write(stix_data)
elif isinstance(stix_data, (str, dict)):
stix_data = parse(stix_data, allow_custom=self.allow_custom, version=version)
if stix_data["type"] == "bundle":
# extract STIX objects
for stix_obj in stix_data.get("objects", []):
self.add(stix_obj, version=version)
else:
# adding json-formatted STIX
self._check_path_and_write(stix_data,)
elif isinstance(stix_data, Bundle):
# recursively add individual STIX objects
for stix_obj in stix_data.get("objects", []):
self.add(stix_obj, version=version)
self.add(stix_data, version=version)
elif isinstance(stix_data, list):
# recursively add individual STIX objects
for stix_obj in stix_data:
self.add(stix_obj, version=version)
self.add(stix_obj)
else:
raise TypeError("stix_data must be a STIX object (or list of), "
raise TypeError(
"stix_data must be a STIX object (or list of), "
"JSON formatted STIX (or list of), "
"or a JSON formatted STIX bundle")
"or a JSON formatted STIX bundle",
)
class FileSystemSource(DataSource):
@ -146,12 +641,15 @@ class FileSystemSource(DataSource):
stix_dir (str): path to directory of STIX objects
allow_custom (bool): Whether to allow custom STIX content to be
retrieved from the FileSystemSource. Default: True
encoding (str): The encoding to use when reading a file from the
filesystem.
"""
def __init__(self, stix_dir, allow_custom=True):
def __init__(self, stix_dir, allow_custom=True, encoding='utf-8'):
super(FileSystemSource, self).__init__()
self._stix_dir = os.path.abspath(stix_dir)
self.allow_custom = allow_custom
self.encoding = encoding
if not os.path.exists(self._stix_dir):
raise ValueError("directory path for STIX data does not exist: %s" % self._stix_dir)
@ -167,8 +665,9 @@ class FileSystemSource(DataSource):
stix_id (str): The STIX ID of the STIX object to be retrieved.
_composite_filters (FilterSet): collection of filters passed from the parent
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Returns:
(STIX object): STIX object that has the supplied STIX ID.
@ -176,12 +675,17 @@ class FileSystemSource(DataSource):
a python STIX object and then returned
"""
query = [Filter("id", "=", stix_id)]
all_data = self.query(query=query, version=version, _composite_filters=_composite_filters)
all_data = self.all_versions(stix_id, version=version, _composite_filters=_composite_filters)
if all_data:
stix_obj = sorted(all_data, key=lambda k: k['modified'])[0]
# Simple check for a versioned STIX type: see if the objects have a
# "modified" property. (Need only check one, since they are all of
# the same type.)
is_versioned = "modified" in all_data[0]
if is_versioned:
stix_obj = sorted(all_data, key=lambda k: k['modified'])[-1]
else:
stix_obj = all_data[0]
else:
stix_obj = None
@ -195,10 +699,11 @@ class FileSystemSource(DataSource):
Args:
stix_id (str): The STIX ID of the STIX objects to be retrieved.
_composite_filters (FilterSet): collection of filters passed from the parent
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
_composite_filters (FilterSet): collection of filters passed from
the parent CompositeDataSource, not user supplied
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Returns:
(list): of STIX objects that has the supplied STIX ID.
@ -206,7 +711,8 @@ class FileSystemSource(DataSource):
a python STIX objects and then returned
"""
return [self.get(stix_id=stix_id, version=version, _composite_filters=_composite_filters)]
query = [Filter("id", "=", stix_id)]
return self.query(query, version=version, _composite_filters=_composite_filters)
def query(self, query=None, version=None, _composite_filters=None):
"""Search and retrieve STIX objects based on the complete query.
@ -217,10 +723,11 @@ class FileSystemSource(DataSource):
Args:
query (list): list of filters to search on
_composite_filters (FilterSet): collection of filters passed from the
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
_composite_filters (FilterSet): collection of filters passed from
the CompositeDataSource, not user supplied
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Returns:
(list): list of STIX objects that matches the supplied
@ -228,9 +735,7 @@ class FileSystemSource(DataSource):
parsed into a python STIX objects and then returned.
"""
all_data = []
query = FilterSet(query)
# combine all query filters
@ -239,105 +744,26 @@ class FileSystemSource(DataSource):
if _composite_filters:
query.add(_composite_filters)
# extract any filters that are for "type" or "id" , as we can then do
# filtering before reading in the STIX objects. A STIX 'type' filter
# can reduce the query to a single sub-directory. A STIX 'id' filter
# allows for the fast checking of the file names versus loading it.
file_filters = self._parse_file_filters(query)
# establish which subdirectories can be avoided in query
# by decluding as many as possible. A filter with "type" as the property
# means that certain STIX object types can be ruled out, and thus
# the corresponding subdirectories as well
include_paths = []
declude_paths = []
if "type" in [filter.property for filter in file_filters]:
for filter in file_filters:
if filter.property == "type":
if filter.op == "=":
include_paths.append(os.path.join(self._stix_dir, filter.value))
elif filter.op == "!=":
declude_paths.append(os.path.join(self._stix_dir, filter.value))
auth_types, auth_ids = _find_search_optimizations(query)
type_dirs = _get_matching_dir_entries(
self._stix_dir, auth_types,
stat.S_ISDIR,
)
for type_dir in type_dirs:
type_path = os.path.join(self._stix_dir, type_dir)
type_is_versioned = _is_versioned_type_dir(type_path, type_dir)
if type_is_versioned:
type_results = _search_versioned(
query, type_path, auth_ids,
self.allow_custom, version,
self.encoding,
)
else:
# have to walk entire STIX directory
include_paths.append(self._stix_dir)
type_results = _search_unversioned(
query, type_path, auth_ids,
self.allow_custom, version,
self.encoding,
)
all_data.extend(type_results)
# if a user specifies a "type" filter like "type = <stix-object_type>",
# the filter is reducing the search space to single stix object types
# (and thus single directories). This makes such a filter more powerful
# than "type != <stix-object_type>" bc the latter is substracting
# only one type of stix object type (and thus only one directory),
# As such the former type of filters are given preference over the latter;
# i.e. if both exist in a query, that latter type will be ignored
if not include_paths:
# user has specified types that are not wanted (i.e. "!=")
# so query will look in all STIX directories that are not
# the specified type. Compile correct dir paths
for dir in os.listdir(self._stix_dir):
if os.path.abspath(os.path.join(self._stix_dir, dir)) not in declude_paths:
include_paths.append(os.path.abspath(os.path.join(self._stix_dir, dir)))
# grab stix object ID as well - if present in filters, as
# may forgo the loading of STIX content into memory
if "id" in [filter.property for filter in file_filters]:
for filter in file_filters:
if filter.property == "id" and filter.op == "=":
id_ = filter.value
break
else:
id_ = None
else:
id_ = None
# now iterate through all STIX objs
for path in include_paths:
for root, dirs, files in os.walk(path):
for file_ in files:
if not file_.endswith(".json"):
# skip non '.json' files as more likely to be random non-STIX files
continue
if not id_ or id_ == file_.split(".")[0]:
# have to load into memory regardless to evaluate other filters
try:
stix_obj = json.load(open(os.path.join(root, file_)))
if stix_obj["type"] == "bundle":
stix_obj = stix_obj["objects"][0]
# naive STIX type checking
stix_obj["type"]
stix_obj["id"]
except (ValueError, KeyError): # likely not a JSON file
raise TypeError("STIX JSON object at '{0}' could either not be parsed to "
"JSON or was not valid STIX JSON".format(os.path.join(root, file_)))
# check against other filters, add if match
all_data.extend(apply_common_filters([stix_obj], query))
all_data = deduplicate(all_data)
# parse python STIX objects from the STIX object dicts
stix_objs = [parse(stix_obj_dict, allow_custom=self.allow_custom, version=version) for stix_obj_dict in all_data]
return stix_objs
def _parse_file_filters(self, query):
"""Extract STIX common filters.
Possibly speeds up querying STIX objects from the file system.
Extracts filters that are for the "id" and "type" property of
a STIX object. As the file directory is organized by STIX
object type with filenames that are equivalent to the STIX
object ID, these filters can be used first to reduce the
search space of a FileSystemStore (or FileSystemSink).
"""
file_filters = []
for filter_ in query:
if filter_.property == "id" or filter_.property == "type":
file_filters.append(filter_)
return file_filters
return all_data
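A hedged usage sketch of the directory pruning above (the path is hypothetical; Filter and FileSystemSource are the classes shown in this diff):

from stix2 import Filter, FileSystemSource

fs_source = FileSystemSource("/tmp/stix_data")  # hypothetical STIX directory

# An equality filter on "type" lets query() restrict the search to the
# "malware/" subdirectory instead of walking every type directory.
malware_objs = fs_source.query([Filter("type", "=", "malware")])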

View File

@ -1,23 +1,20 @@
"""
Filters for Python STIX 2.0 DataSources, DataSinks, DataStores
"""
"""Filters for Python STIX2 DataSources, DataSinks, DataStores"""
import collections
from datetime import datetime
from stix2.utils import format_datetime
import six
import stix2.utils
"""Supported filter operations"""
FILTER_OPS = ['=', '!=', 'in', '>', '<', '>=', '<=', 'contains']
"""Supported filter value types"""
FILTER_VALUE_TYPES = [bool, dict, float, int, list, str, tuple]
try:
FILTER_VALUE_TYPES.append(unicode)
except NameError:
# Python 3 doesn't need to worry about unicode
pass
FILTER_VALUE_TYPES = (
bool, dict, float, int, list, tuple, six.string_types,
datetime,
)
def _check_filter_components(prop, op, value):
@ -36,18 +33,18 @@ def _check_filter_components(prop, op, value):
# check filter operator is supported
raise ValueError("Filter operator '%s' not supported for specified property: '%s'" % (op, prop))
if type(value) not in FILTER_VALUE_TYPES:
if not isinstance(value, FILTER_VALUE_TYPES):
# check filter value type is supported
raise TypeError("Filter value of '%s' is not supported. The type must be a Python immutable type or dictionary" % type(value))
if prop == "type" and "_" in value:
if prop == 'type' and '_' in value:
# check filter where the property is type, value (type name) cannot have underscores
raise ValueError("Filter for property 'type' cannot have its value '%s' include underscores" % value)
return True
class Filter(collections.namedtuple("Filter", ['property', 'op', 'value'])):
class Filter(collections.namedtuple('Filter', ['property', 'op', 'value'])):
"""STIX 2 filters that support the querying functionality of STIX 2
DataStores and DataSources.
@ -69,10 +66,6 @@ class Filter(collections.namedtuple("Filter", ['property', 'op', 'value'])):
if isinstance(value, list):
value = tuple(value)
if isinstance(value, datetime):
# if value is a datetime obj, convert to str
value = format_datetime(value)
_check_filter_components(prop, op, value)
self = super(Filter, cls).__new__(cls, prop, op, value)
@ -88,31 +81,33 @@ class Filter(collections.namedtuple("Filter", ['property', 'op', 'value'])):
True if property matches the filter,
False otherwise.
"""
if isinstance(stix_obj_property, datetime):
# if a datetime obj, convert to str format before comparison
# NOTE: this check seems like it should be done upstream
# but will put here for now
stix_obj_property = format_datetime(stix_obj_property)
# If filtering on a timestamp property and the filter value is a string,
# try to convert the filter value to a datetime instance.
if isinstance(stix_obj_property, datetime) and \
isinstance(self.value, six.string_types):
filter_value = stix2.utils.parse_into_datetime(self.value)
else:
filter_value = self.value
if self.op == "=":
return stix_obj_property == self.value
return stix_obj_property == filter_value
elif self.op == "!=":
return stix_obj_property != self.value
return stix_obj_property != filter_value
elif self.op == "in":
return stix_obj_property in self.value
return stix_obj_property in filter_value
elif self.op == "contains":
if isinstance(self.value, dict):
return self.value in stix_obj_property.values()
if isinstance(filter_value, dict):
return filter_value in stix_obj_property.values()
else:
return self.value in stix_obj_property
return filter_value in stix_obj_property
elif self.op == ">":
return stix_obj_property > self.value
return stix_obj_property > filter_value
elif self.op == "<":
return stix_obj_property < self.value
return stix_obj_property < filter_value
elif self.op == ">=":
return stix_obj_property >= self.value
return stix_obj_property >= filter_value
elif self.op == "<=":
return stix_obj_property <= self.value
return stix_obj_property <= filter_value
else:
raise ValueError("Filter operator: {0} not supported for specified property: {1}".format(self.op, self.property))
@ -161,7 +156,7 @@ def _check_filter(filter_, stix_obj):
"""
# For properties like granular_markings and external_references
# need to extract the first property from the string.
prop = filter_.property.split(".")[0]
prop = filter_.property.split('.')[0]
if prop not in stix_obj.keys():
# check filter "property" is in STIX object - if cant be
@ -169,9 +164,9 @@ def _check_filter(filter_, stix_obj):
# (i.e. did not make it through the filter)
return False
if "." in filter_.property:
if '.' in filter_.property:
# Check embedded properties, from e.g. granular_markings or external_references
sub_property = filter_.property.split(".", 1)[1]
sub_property = filter_.property.split('.', 1)[1]
sub_filter = filter_._replace(property=sub_property)
if isinstance(stix_obj[prop], list):
@ -226,7 +221,8 @@ class FilterSet(object):
Operates like set, only adding unique stix2.Filters to the FilterSet
NOTE: method designed to be very accomodating (i.e. even accepting filters=None)
Note:
method designed to be very accommodating (i.e. even accepting filters=None)
as it allows for blind calls (very useful in DataStore)
Args:
@ -248,11 +244,13 @@ class FilterSet(object):
def remove(self, filters=None):
"""Remove a Filter, list of Filters, or FilterSet from the FilterSet.
NOTE: method designed to be very accomodating (i.e. even accepting filters=None)
Note:
method designed to be very accommodating (i.e. even accepting filters=None)
as it allows for blind calls (very useful in DataStore)
Args:
filters: stix2.Filter OR list of stix2.Filter or stix2.FilterSet
"""
if not filters:
# so remove() can be called blindly, useful for

View File

@ -1,27 +1,32 @@
"""
Python STIX 2.0 Memory Source/Sink
"""
"""Python STIX2 Memory Source/Sink"""
import io
import itertools
import json
import os
from stix2 import v20, v21
from stix2.base import _STIXBase
from stix2.core import Bundle, parse
from stix2.datastore import DataSink, DataSource, DataStoreMixin
from stix2.datastore.filters import FilterSet, apply_common_filters
from stix2.parsing import parse
def _add(store, stix_data=None, allow_custom=True, version=None):
def _add(store, stix_data, allow_custom=True, version=None):
"""Add STIX objects to MemoryStore/Sink.
Adds STIX objects to an in-memory dictionary for fast lookup.
Recursive function, breaks down STIX Bundles and lists.
Args:
store: A MemoryStore, MemorySink or MemorySource object.
stix_data (list OR dict OR STIX object): STIX objects to be added
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects. Note that unknown custom objects cannot be parsed
into STIX objects, and will be returned as-is. Default: True.
version (str): Which STIX2 version to lock the parser to. (e.g. "2.0",
"2.1"). If None, the library makes the best effort to figure
out the spec representation of the object.
"""
if isinstance(stix_data, list):
@ -41,35 +46,20 @@ def _add(store, stix_data=None, allow_custom=True, version=None):
else:
stix_obj = parse(stix_data, allow_custom, version)
# Map ID directly to the object, if it is a marking. Otherwise,
# map to a family, so we can track multiple versions.
if _is_marking(stix_obj):
store._data[stix_obj.id] = stix_obj
else:
if stix_obj.id in store._data:
obj_family = store._data[stix_obj.id]
# Map ID to a _ObjectFamily if the object is versioned, so we can track
# multiple versions. Otherwise, map directly to the object. All
# versioned objects should have a "modified" property.
if "modified" in stix_obj:
if stix_obj["id"] in store._data:
obj_family = store._data[stix_obj["id"]]
else:
obj_family = _ObjectFamily()
store._data[stix_obj.id] = obj_family
store._data[stix_obj["id"]] = obj_family
obj_family.add(stix_obj)
def _is_marking(obj_or_id):
"""Determines whether the given object or object ID is/is for a marking
definition.
:param obj_or_id: A STIX object or object ID as a string.
:return: True if a marking definition, False otherwise.
"""
if isinstance(obj_or_id, _STIXBase):
id_ = obj_or_id.id
else:
id_ = obj_or_id
return id_.startswith("marking-definition--")
store._data[stix_obj["id"]] = stix_obj
class _ObjectFamily(object):
@ -84,14 +74,16 @@ class _ObjectFamily(object):
self.latest_version = None
def add(self, obj):
self.all_versions[obj.modified] = obj
if self.latest_version is None or \
obj.modified > self.latest_version.modified:
self.all_versions[obj["modified"]] = obj
if (self.latest_version is None or
obj["modified"] > self.latest_version["modified"]):
self.latest_version = obj
def __str__(self):
return "<<{}; latest={}>>".format(self.all_versions,
self.latest_version.modified)
return "<<{}; latest={}>>".format(
self.all_versions,
self.latest_version["modified"],
)
def __repr__(self):
return str(self)
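A minimal sketch of the versioning behaviour introduced here, using a MemoryStore and two versions of one STIX 2.1 Identity (names and timestamps are illustrative):

import stix2.v21 as v21
from stix2 import MemoryStore

store = MemoryStore()
ident_v1 = v21.Identity(
    name="ACME", identity_class="organization",
    created="2020-01-01T00:00:00.000Z", modified="2020-01-01T00:00:00.000Z",
)
ident_v2 = ident_v1.new_version(name="ACME Corp.", modified="2020-02-01T00:00:00.000Z")
store.add([ident_v1, ident_v2])

# Both versions share one ID, so _add() maps the ID to an _ObjectFamily:
# get() returns the family's latest_version, all_versions() returns both.
assert store.get(ident_v1.id)["name"] == "ACME Corp."
assert len(store.all_versions(ident_v1.id)) == 2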
@ -111,8 +103,6 @@ class MemoryStore(DataStoreMixin):
allow_custom (bool): whether to allow custom STIX content.
Only applied when export/input functions called, i.e.
load_from_file() and save_to_file(). Defaults to True.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
Attributes:
_data (dict): the in-memory dict that holds STIX objects
@ -124,19 +114,21 @@ class MemoryStore(DataStoreMixin):
self._data = {}
if stix_data:
_add(self, stix_data, allow_custom, version=version)
_add(self, stix_data, allow_custom, version)
super(MemoryStore, self).__init__(
source=MemorySource(stix_data=self._data, allow_custom=allow_custom, version=version, _store=True),
sink=MemorySink(stix_data=self._data, allow_custom=allow_custom, version=version, _store=True)
sink=MemorySink(stix_data=self._data, allow_custom=allow_custom, version=version, _store=True),
)
def save_to_file(self, *args, **kwargs):
"""Write SITX objects from in-memory dictionary to JSON file, as a STIX
Bundle.
Bundle. If a directory is given, the Bundle 'id' will be used as
the filename. Otherwise, the provided value will be used.
Args:
file_path (str): file path to write STIX data to
path (str): file path to write STIX data to.
encoding (str): The file encoding. Default utf-8.
"""
return self.sink.save_to_file(*args, **kwargs)
@ -144,13 +136,11 @@ class MemoryStore(DataStoreMixin):
def load_from_file(self, *args, **kwargs):
"""Load STIX data from JSON file.
File format is expected to be a single JSON
STIX object or JSON STIX bundle.
File format is expected to be a single JSON STIX object or JSON STIX
bundle.
Args:
file_path (str): file path to load STIX data from
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
path (str): file path to load STIX data from
"""
return self.source.load_from_file(*args, **kwargs)
@ -171,6 +161,9 @@ class MemorySink(DataSink):
allow_custom (bool): whether to allow custom objects/properties
when exporting STIX content to file.
Default: True.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Attributes:
_data (dict): the in-memory dict that holds STIX objects.
@ -186,25 +179,41 @@ class MemorySink(DataSink):
else:
self._data = {}
if stix_data:
_add(self, stix_data, allow_custom, version=version)
_add(self, stix_data, allow_custom, version)
def add(self, stix_data, version=None):
_add(self, stix_data, self.allow_custom, version)
add.__doc__ = _add.__doc__
def save_to_file(self, file_path):
file_path = os.path.abspath(file_path)
def save_to_file(self, path, encoding="utf-8"):
path = os.path.abspath(path)
all_objs = itertools.chain.from_iterable(
all_objs = list(itertools.chain.from_iterable(
value.all_versions.values() if isinstance(value, _ObjectFamily)
else [value]
for value in self._data.values()
)
))
if not os.path.exists(os.path.dirname(file_path)):
os.makedirs(os.path.dirname(file_path))
with open(file_path, "w") as f:
f.write(str(Bundle(list(all_objs), allow_custom=self.allow_custom)))
if any("spec_version" in x for x in all_objs):
bundle = v21.Bundle(all_objs, allow_custom=self.allow_custom)
else:
bundle = v20.Bundle(all_objs, allow_custom=self.allow_custom)
if path.endswith(".json"):
if not os.path.exists(os.path.dirname(path)):
os.makedirs(os.path.dirname(path))
else:
if not os.path.exists(path):
os.makedirs(path)
# if the user only provided a directory, use the bundle id for filename
path = os.path.join(path, bundle["id"] + ".json")
with io.open(path, "w", encoding=encoding) as f:
bundle = bundle.serialize(pretty=True, encoding=encoding, ensure_ascii=False)
f.write(bundle)
return path
save_to_file.__doc__ = MemoryStore.save_to_file.__doc__
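A hedged sketch of the new path handling in save_to_file() (paths and the object are illustrative):

import stix2.v21 as v21
from stix2 import MemoryStore

store = MemoryStore()
store.add(v21.Tool(name="example-scanner"))  # illustrative object

# A path ending in ".json" is used verbatim as the output file.
store.save_to_file("/tmp/stix_out/my_bundle.json")

# A directory path: the Bundle id becomes the filename
# (e.g. /tmp/stix_out/bundle--<uuid>.json) and the final path is returned.
saved_path = store.save_to_file("/tmp/stix_out")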
@ -224,6 +233,9 @@ class MemorySource(DataSource):
allow_custom (bool): whether to allow custom objects/properties
when importing STIX content from file.
Default: True.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
Attributes:
_data (dict): the in-memory dict that holds STIX objects.
@ -239,7 +251,7 @@ class MemorySource(DataSource):
else:
self._data = {}
if stix_data:
_add(self, stix_data, allow_custom, version=version)
_add(self, stix_data, allow_custom, version)
def get(self, stix_id, _composite_filters=None):
"""Retrieve STIX object from in-memory dict via STIX ID.
@ -255,19 +267,19 @@ class MemorySource(DataSource):
"""
stix_obj = None
if _is_marking(stix_id):
stix_obj = self._data.get(stix_id)
mapped_value = self._data.get(stix_id)
if mapped_value:
if isinstance(mapped_value, _ObjectFamily):
stix_obj = mapped_value.latest_version
else:
object_family = self._data.get(stix_id)
if object_family:
stix_obj = object_family.latest_version
stix_obj = mapped_value
if stix_obj:
all_filters = list(
itertools.chain(
_composite_filters or [],
self.filters
)
self.filters,
),
)
stix_obj = next(apply_common_filters([stix_obj], all_filters), None)
@ -275,41 +287,35 @@ class MemorySource(DataSource):
return stix_obj
def all_versions(self, stix_id, _composite_filters=None):
"""Retrieve STIX objects from in-memory dict via STIX ID, all versions of it
Note: Since Memory sources/sinks don't handle multiple versions of a
STIX object, this operation is unnecessary. Translate call to get().
"""Retrieve STIX objects from in-memory dict via STIX ID, all versions
of it.
Args:
stix_id (str): The STIX ID of the STIX 2 object to retrieve.
_composite_filters (FilterSet): collection of filters passed from the parent
CompositeDataSource, not user supplied
_composite_filters (FilterSet): collection of filters passed from
the parent CompositeDataSource, not user supplied
Returns:
(list): list of STIX objects that have the supplied ID.
"""
results = []
stix_objs_to_filter = None
if _is_marking(stix_id):
stix_obj = self._data.get(stix_id)
if stix_obj:
stix_objs_to_filter = [stix_obj]
mapped_value = self._data.get(stix_id)
if mapped_value:
if isinstance(mapped_value, _ObjectFamily):
stix_objs_to_filter = mapped_value.all_versions.values()
else:
object_family = self._data.get(stix_id)
if object_family:
stix_objs_to_filter = object_family.all_versions.values()
stix_objs_to_filter = [mapped_value]
if stix_objs_to_filter:
all_filters = list(
itertools.chain(
_composite_filters or [],
self.filters
)
self.filters,
),
)
results.extend(
apply_common_filters(stix_objs_to_filter, all_filters)
apply_common_filters(stix_objs_to_filter, all_filters),
)
return results
@ -323,8 +329,8 @@ class MemorySource(DataSource):
Args:
query (list): list of filters to search on
_composite_filters (FilterSet): collection of filters passed from the
CompositeDataSource, not user supplied
_composite_filters (FilterSet): collection of filters passed from
the CompositeDataSource, not user supplied
Returns:
(list): list of STIX objects that match the supplied query.
@ -349,13 +355,9 @@ class MemorySource(DataSource):
return all_data
def load_from_file(self, file_path, version=None):
with open(os.path.abspath(file_path), "r") as f:
def load_from_file(self, file_path, version=None, encoding='utf-8'):
with io.open(os.path.abspath(file_path), "r", encoding=encoding) as f:
stix_data = json.load(f)
# Override user version selection if loading a bundle
if stix_data["type"] == "bundle":
version = stix_data["spec_version"]
_add(self, stix_data, self.allow_custom, version)
load_from_file.__doc__ = MemoryStore.load_from_file.__doc__

View File

@ -1,17 +1,18 @@
"""
Python STIX 2.x TAXIICollectionStore
"""
"""Python STIX2 TAXIICollection Source/Sink"""
from requests.exceptions import HTTPError
from stix2 import v20, v21
from stix2.base import _STIXBase
from stix2.core import Bundle, parse
from stix2.datastore import (DataSink, DataSource, DataSourceError,
DataStoreMixin)
from stix2.datastore import (
DataSink, DataSource, DataSourceError, DataStoreMixin,
)
from stix2.datastore.filters import Filter, FilterSet, apply_common_filters
from stix2.parsing import parse
from stix2.utils import deduplicate
try:
from taxii2client import ValidationError
from taxii2client.exceptions import ValidationError
_taxii2_client = True
except ImportError:
_taxii2_client = False
@ -43,7 +44,7 @@ class TAXIICollectionStore(DataStoreMixin):
super(TAXIICollectionStore, self).__init__(
source=TAXIICollectionSource(collection, allow_custom=allow_custom_source),
sink=TAXIICollectionSink(collection, allow_custom=allow_custom_sink)
sink=TAXIICollectionSink(collection, allow_custom=allow_custom_sink),
)
@ -66,12 +67,16 @@ class TAXIICollectionSink(DataSink):
if collection.can_write:
self.collection = collection
else:
raise DataSourceError("The TAXII Collection object provided does not have write access"
" to the underlying linked Collection resource")
raise DataSourceError(
"The TAXII Collection object provided does not have write access"
" to the underlying linked Collection resource",
)
except (HTTPError, ValidationError) as e:
raise DataSourceError("The underlying TAXII Collection resource defined in the supplied TAXII"
" Collection object provided could not be reached. Receved error:", e)
raise DataSourceError(
"The underlying TAXII Collection resource defined in the supplied TAXII"
" Collection object provided could not be reached. Receved error:", e,
)
self.allow_custom = allow_custom
@ -79,26 +84,34 @@ class TAXIICollectionSink(DataSink):
"""Add/push STIX content to TAXII Collection endpoint
Args:
stix_data (STIX object OR dict OR str OR list): valid STIX 2.0 content
in a STIX object (or Bundle), STIX onject dict (or Bundle dict), or a STIX 2.0
json encoded string, or list of any of the following
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
stix_data (STIX object OR dict OR str OR list): valid STIX2
content in a STIX object (or Bundle), STIX object dict (or
Bundle dict), or a STIX2 json encoded string, or list of
any of the following.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
"""
if isinstance(stix_data, _STIXBase):
# adding python STIX object
if stix_data["type"] == "bundle":
bundle = stix_data.serialize(encoding="utf-8")
if stix_data['type'] == 'bundle':
bundle = stix_data.serialize(encoding='utf-8', ensure_ascii=False)
elif 'spec_version' in stix_data:
# If the spec_version is present, use new Bundle object...
bundle = v21.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
else:
bundle = Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding="utf-8")
bundle = v20.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
elif isinstance(stix_data, dict):
# adding python dict (of either Bundle or STIX obj)
if stix_data["type"] == "bundle":
bundle = parse(stix_data, allow_custom=self.allow_custom, version=version).serialize(encoding="utf-8")
if stix_data['type'] == 'bundle':
bundle = parse(stix_data, allow_custom=self.allow_custom, version=version).serialize(encoding='utf-8', ensure_ascii=False)
elif 'spec_version' in stix_data:
# If the spec_version is present, use new Bundle object...
bundle = v21.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
else:
bundle = Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding="utf-8")
bundle = v20.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
elif isinstance(stix_data, list):
# adding list of something - recurse on each
@ -109,10 +122,13 @@ class TAXIICollectionSink(DataSink):
elif isinstance(stix_data, str):
# adding json encoded string of STIX content
stix_data = parse(stix_data, allow_custom=self.allow_custom, version=version)
if stix_data["type"] == "bundle":
bundle = stix_data.serialize(encoding="utf-8")
if stix_data['type'] == 'bundle':
bundle = stix_data.serialize(encoding='utf-8', ensure_ascii=False)
elif 'spec_version' in stix_data:
# If the spec_version is present, use new Bundle object...
bundle = v21.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
else:
bundle = Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding="utf-8")
bundle = v20.Bundle(stix_data, allow_custom=self.allow_custom).serialize(encoding='utf-8', ensure_ascii=False)
else:
raise TypeError("stix_data must be a STIX object (or list of), JSON-formatted STIX (or list of), or a JSON-formatted STIX bundle")
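A sketch of pushing 2.1 content through the sink; the endpoint, credentials, and indicator values are placeholders, and the call assumes a reachable, writable TAXII collection.

import stix2.v21 as v21
from stix2 import TAXIICollectionSink
from taxii2client import Collection

collection = Collection(
    "https://taxii.example.com/api1/collections/<collection-id>/",  # placeholder
    user="user", password="pass",
)
sink = TAXIICollectionSink(collection)

# A 2.1 object carries "spec_version", so add() wraps it in a v21.Bundle
# before pushing; 2.0 content would be wrapped in a v20.Bundle instead.
sink.add(v21.Indicator(
    indicator_types=["malicious-activity"],
    pattern="[ipv4-addr:value = '198.51.100.1']",
    pattern_type="stix",
    valid_from="2020-01-01T00:00:00Z",
))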
@ -139,12 +155,16 @@ class TAXIICollectionSource(DataSource):
if collection.can_read:
self.collection = collection
else:
raise DataSourceError("The TAXII Collection object provided does not have read access"
" to the underlying linked Collection resource")
raise DataSourceError(
"The TAXII Collection object provided does not have read access"
" to the underlying linked Collection resource",
)
except (HTTPError, ValidationError) as e:
raise DataSourceError("The underlying TAXII Collection resource defined in the supplied TAXII"
" Collection object provided could not be reached. Recieved error:", e)
raise DataSourceError(
"The underlying TAXII Collection resource defined in the supplied TAXII"
" Collection object provided could not be reached. Recieved error:", e,
)
self.allow_custom = allow_custom
@ -154,10 +174,11 @@ class TAXIICollectionSource(DataSource):
Args:
stix_id (str): The STIX ID of the STIX object to be retrieved.
_composite_filters (FilterSet): collection of filters passed from the parent
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
_composite_filters (FilterSet): collection of filters passed from
the parent CompositeDataSource, not user supplied
Returns:
(STIX object): STIX object that has the supplied STIX ID.
@ -173,15 +194,16 @@ class TAXIICollectionSource(DataSource):
if _composite_filters:
query.add(_composite_filters)
# dont extract TAXII filters from query (to send to TAXII endpoint)
# as directly retrieveing a STIX object by ID
# don't extract TAXII filters from query (to send to TAXII endpoint)
# as directly retrieving a STIX object by ID
try:
stix_objs = self.collection.get_object(stix_id)["objects"]
stix_objs = self.collection.get_object(stix_id)['objects']
stix_obj = list(apply_common_filters(stix_objs, query))
except HTTPError as e:
if e.response.status_code == 404:
# if resource not found or access is denied from TAXII server, return None
# if resource not found or access is denied from TAXII server,
# return None
stix_obj = []
else:
raise DataSourceError("TAXII Collection resource returned error", e)
@ -202,10 +224,11 @@ class TAXIICollectionSource(DataSource):
Args:
stix_id (str): The STIX ID of the STIX objects to be retrieved.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
_composite_filters (FilterSet): collection of filters passed from the parent
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
Returns:
(see query() as all_versions() is just a wrapper)
@ -213,8 +236,8 @@ class TAXIICollectionSource(DataSource):
"""
# make query in TAXII query format since 'id' is TAXII field
query = [
Filter("id", "=", stix_id),
Filter("version", "=", "all")
Filter('id', '=', stix_id),
Filter('version', '=', 'all'),
]
all_data = self.query(query=query, _composite_filters=_composite_filters)
@ -236,10 +259,11 @@ class TAXIICollectionSource(DataSource):
Args:
query (list): list of filters to search on
_composite_filters (FilterSet): collection of filters passed from the
CompositeDataSource, not user supplied
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property.
_composite_filters (FilterSet): collection of filters passed from
the CompositeDataSource, not user supplied
Returns:
(list): list of STIX objects that matches the supplied
@ -263,7 +287,7 @@ class TAXIICollectionSource(DataSource):
# query TAXII collection
try:
all_data = self.collection.get_objects(**taxii_filters_dict)["objects"]
all_data = self.collection.get_objects(**taxii_filters_dict)['objects']
# deduplicate data (before filtering as reduces wasted filtering)
all_data = deduplicate(all_data)
@ -275,9 +299,11 @@ class TAXIICollectionSource(DataSource):
except HTTPError as e:
# if resources not found or access is denied from TAXII server, return empty list
if e.response.status_code == 404:
raise DataSourceError("The requested STIX objects for the TAXII Collection resource defined in"
raise DataSourceError(
"The requested STIX objects for the TAXII Collection resource defined in"
" the supplied TAXII Collection object are either not found or access is"
" denied. Received error: ", e)
" denied. Received error: ", e,
)
# parse python STIX objects from the STIX object dicts
stix_objs = [parse(stix_obj_dict, allow_custom=self.allow_custom, version=version) for stix_obj_dict in all_data]
@ -290,18 +316,17 @@ class TAXIICollectionSource(DataSource):
Does not put in TAXII spec format as the TAXII2Client (that we use)
does this for us.
Notes:
Note:
Currently, the TAXII2Client can handle TAXII filters where the
filter value is list, as both a comma-seperated string or python list
filter value is a list, as either a comma-separated string or a Python
list.
For instance - "?match[type]=indicator,sighting" can be in a
filter in any of these formats:
Filter("type", "<any op>", "indicator,sighting")
Filter("type", "<any op>", ["indicator", "sighting"])
Args:
query (list): list of filters to extract which ones are TAXII
specific.

View File

@ -1,10 +1,14 @@
"""Python STIX 2.0 Environment API.
"""
"""Python STIX2 Environment API."""
import copy
import logging
import time
from .core import parse as _parse
from .datastore import CompositeDataSource, DataStoreMixin
from .parsing import parse as _parse
from .utils import STIXdatetime, parse_into_datetime
logger = logging.getLogger(__name__)
class ObjectFactory(object):
@ -27,9 +31,11 @@ class ObjectFactory(object):
default. Defaults to True.
"""
def __init__(self, created_by_ref=None, created=None,
def __init__(
self, created_by_ref=None, created=None,
external_references=None, object_marking_refs=None,
list_append=True):
list_append=True,
):
self._defaults = {}
if created_by_ref:
@ -166,3 +172,350 @@ class Environment(DataStoreMixin):
def parse(self, *args, **kwargs):
return _parse(*args, **kwargs)
parse.__doc__ = _parse.__doc__
def creator_of(self, obj):
"""Retrieve the Identity refered to by the object's `created_by_ref`.
Args:
obj: The STIX object whose `created_by_ref` property will be looked
up.
Returns:
Identity: The STIX object's creator, or None, if the object contains no
`created_by_ref` property or the object's creator cannot be
found.
"""
creator_id = obj.get('created_by_ref', '')
if creator_id:
return self.get(creator_id)
else:
return None
@staticmethod
def semantically_equivalent(obj1, obj2, prop_scores={}, **weight_dict):
"""This method is meant to verify if two objects of the same type are
semantically equivalent.
Args:
obj1: A stix2 object instance
obj2: A stix2 object instance
prop_scores: A dictionary that, if provided, is populated with the
individual property scores computed during the comparison
weight_dict: A dictionary that can be used to override settings
in the semantic equivalence process
Returns:
float: A number between 0.0 and 100.0 as a measurement of equivalence.
Warning:
Course of Action, Intrusion-Set, Observed-Data, Report are not supported
by this implementation. Indicator pattern check is also limited.
Note:
Default weights_dict:
.. include:: ../default_sem_eq_weights.rst
Note:
This implementation follows the Committee Note on semantic equivalence.
see `the Committee Note <link here>`__.
"""
weights = WEIGHTS.copy()
if weight_dict:
weights.update(weight_dict)
type1, type2 = obj1["type"], obj2["type"]
ignore_spec_version = weights["_internal"]["ignore_spec_version"]
if type1 != type2:
raise ValueError('The objects to compare must be of the same type!')
if ignore_spec_version is False and obj1.get("spec_version", "2.0") != obj2.get("spec_version", "2.0"):
raise ValueError('The objects to compare must be of the same spec version!')
try:
weights[type1]
except KeyError:
logger.warning("'%s' type has no 'weights' dict specified & thus no semantic equivalence method to call!", type1)
sum_weights = matching_score = 0
else:
try:
method = weights[type1]["method"]
except KeyError:
logger.debug("Starting semantic equivalence process between: '%s' and '%s'", obj1["id"], obj2["id"])
matching_score = 0.0
sum_weights = 0.0
for prop in weights[type1]:
if check_property_present(prop, obj1, obj2) or prop == "longitude_latitude":
w = weights[type1][prop][0]
comp_funct = weights[type1][prop][1]
if comp_funct == partial_timestamp_based:
contributing_score = w * comp_funct(obj1[prop], obj2[prop], weights[type1]["tdelta"])
elif comp_funct == partial_location_distance:
threshold = weights[type1]["threshold"]
contributing_score = w * comp_funct(obj1["latitude"], obj1["longitude"], obj2["latitude"], obj2["longitude"], threshold)
else:
contributing_score = w * comp_funct(obj1[prop], obj2[prop])
sum_weights += w
matching_score += contributing_score
prop_scores[prop] = {
"weight": w,
"contributing_score": contributing_score,
}
logger.debug("'%s' check -- weight: %s, contributing score: %s", prop, w, contributing_score)
prop_scores["matching_score"] = matching_score
prop_scores["sum_weights"] = sum_weights
logger.debug("Matching Score: %s, Sum of Weights: %s", matching_score, sum_weights)
else:
logger.debug("Starting semantic equivalence process between: '%s' and '%s'", obj1["id"], obj2["id"])
try:
matching_score, sum_weights = method(obj1, obj2, prop_scores, **weights[type1])
except TypeError:
# method doesn't support detailed output with prop_scores
matching_score, sum_weights = method(obj1, obj2, **weights[type1])
logger.debug("Matching Score: %s, Sum of Weights: %s", matching_score, sum_weights)
if sum_weights <= 0:
return 0
equivalence_score = (matching_score / sum_weights) * 100.0
return equivalence_score
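A usage sketch of the scoring loop above; the Campaign values are illustrative, and partial_string_based assumes the optional fuzzywuzzy dependency is installed.

import stix2.v21 as v21
from stix2 import Environment

camp1 = v21.Campaign(name="Shadow Network", aliases=["shadownet"])
camp2 = v21.Campaign(name="Shadow Network Group", aliases=["shadownet"])

prop_scores = {}
score = Environment.semantically_equivalent(camp1, camp2, prop_scores)

# With the default WEIGHTS, "name" contributes 60 * partial_string_based(...)
# and "aliases" contributes 40 * partial_list_based(...); the final score is
# (matching_score / sum_weights) * 100, and prop_scores holds the breakdown.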
def check_property_present(prop, obj1, obj2):
"""Helper method checks if a property is present on both objects."""
if prop in obj1 and prop in obj2:
return True
return False
def partial_timestamp_based(t1, t2, tdelta):
"""Performs a timestamp-based matching via checking how close one timestamp is to another.
Args:
t1: A datetime string or STIXdatetime object.
t2: A datetime string or STIXdatetime object.
tdelta (float): A given time delta. This number is multiplied by 86400 (1 day) to
extend or shrink your time change tolerance.
Returns:
float: Number between 0.0 and 1.0 depending on match criteria.
"""
if not isinstance(t1, STIXdatetime):
t1 = parse_into_datetime(t1)
if not isinstance(t2, STIXdatetime):
t2 = parse_into_datetime(t2)
t1, t2 = time.mktime(t1.timetuple()), time.mktime(t2.timetuple())
result = 1 - min(abs(t1 - t2) / (86400 * tdelta), 1)
logger.debug("--\t\tpartial_timestamp_based '%s' '%s' tdelta: '%s'\tresult: '%s'", t1, t2, tdelta, result)
return result
def partial_list_based(l1, l2):
"""Performs a partial list matching via finding the intersection between common values.
Args:
l1: A list of values.
l2: A list of values.
Returns:
float: Number between 0.0 and 1.0: the ratio of shared values to the length of the longer list.
"""
l1_set, l2_set = set(l1), set(l2)
result = len(l1_set.intersection(l2_set)) / max(len(l1), len(l2))
logger.debug("--\t\tpartial_list_based '%s' '%s'\tresult: '%s'", l1, l2, result)
return result
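For example (Python 3; two of the three distinct values are shared and the longer list has three entries):

from stix2.environment import partial_list_based

assert partial_list_based(["apt-1", "apt-2"], ["apt-2", "apt-1", "apt-x"]) == 2 / 3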
def exact_match(val1, val2):
"""Performs an exact value match based on two values
Args:
val1: A value suitable for an equality test.
val2: A value suitable for an equality test.
Returns:
float: 1.0 if the value matches exactly, 0.0 otherwise.
"""
result = 0.0
if val1 == val2:
result = 1.0
logger.debug("--\t\texact_match '%s' '%s'\tresult: '%s'", val1, val2, result)
return result
def partial_string_based(str1, str2):
"""Performs a partial string match using the Jaro-Winkler distance algorithm.
Args:
str1: A string value to check.
str2: A string value to check.
Returns:
float: Number between 0.0 and 1.0 depending on match criteria.
"""
from fuzzywuzzy import fuzz
result = fuzz.token_sort_ratio(str1, str2, force_ascii=False)
logger.debug("--\t\tpartial_string_based '%s' '%s'\tresult: '%s'", str1, str2, result)
return result / 100.0
def custom_pattern_based(pattern1, pattern2):
"""Performs a matching on Indicator Patterns.
Args:
pattern1: An Indicator pattern
pattern2: An Indicator pattern
Returns:
float: Number between 0.0 and 1.0 depending on match criteria.
"""
logger.warning("Indicator pattern equivalence is not fully defined; will default to zero if not completely identical")
return exact_match(pattern1, pattern2) # TODO: Implement pattern based equivalence
def partial_external_reference_based(refs1, refs2):
"""Performs a matching on External References.
Args:
refs1: A list of external references.
refs2: A list of external references.
Returns:
float: Number between 0.0 and 1.0 depending on matches.
"""
allowed = set(("veris", "cve", "capec", "mitre-attack"))
matches = 0
if len(refs1) >= len(refs2):
l1 = refs1
l2 = refs2
else:
l1 = refs2
l2 = refs1
for ext_ref1 in l1:
for ext_ref2 in l2:
sn_match = False
ei_match = False
url_match = False
source_name = None
if check_property_present("source_name", ext_ref1, ext_ref2):
if ext_ref1["source_name"] == ext_ref2["source_name"]:
source_name = ext_ref1["source_name"]
sn_match = True
if check_property_present("external_id", ext_ref1, ext_ref2):
if ext_ref1["external_id"] == ext_ref2["external_id"]:
ei_match = True
if check_property_present("url", ext_ref1, ext_ref2):
if ext_ref1["url"] == ext_ref2["url"]:
url_match = True
# Special case: if source_name is a STIX defined name and either
# external_id or url match then it's a perfect match and other entries
# can be ignored.
if sn_match and (ei_match or url_match) and source_name in allowed:
result = 1.0
logger.debug(
"--\t\tpartial_external_reference_based '%s' '%s'\tresult: '%s'",
refs1, refs2, result,
)
return result
# Regular check. If the source_name (not STIX-defined) or external_id or
# url matches then we consider the entry a match.
if (sn_match or ei_match or url_match) and source_name not in allowed:
matches += 1
result = matches / max(len(refs1), len(refs2))
logger.debug(
"--\t\tpartial_external_reference_based '%s' '%s'\tresult: '%s'",
refs1, refs2, result,
)
return result
def partial_location_distance(lat1, long1, lat2, long2, threshold):
"""Given two coordinates perform a matching based on its distance using the Haversine Formula.
Args:
lat1: Latitude value for first coordinate point.
lat2: Latitude value for second coordinate point.
long1: Longitude value for first coordinate point.
long2: Longitude value for second coordinate point.
threshold (float): A kilometer measurement for the threshold distance between these two points.
Returns:
float: Number between 0.0 and 1.0 depending on match.
"""
from haversine import haversine, Unit
distance = haversine((lat1, long1), (lat2, long2), unit=Unit.KILOMETERS)
result = 1 - (distance / threshold)
logger.debug(
"--\t\tpartial_location_distance '%s' '%s' threshold: '%s'\tresult: '%s'",
(lat1, long1), (lat2, long2), threshold, result,
)
return result
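A quick numeric sketch, assuming the optional haversine package used above is installed (coordinates are illustrative):

from stix2.environment import partial_location_distance

# Washington, DC vs. New York City: roughly 330 km apart, so with a
# 1000 km threshold the score is about 1 - 330/1000, i.e. roughly 0.67.
score = partial_location_distance(38.9072, -77.0369, 40.7128, -74.0060, 1000.0)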
# default weights used for the semantic equivalence process
WEIGHTS = {
"attack-pattern": {
"name": (30, partial_string_based),
"external_references": (70, partial_external_reference_based),
},
"campaign": {
"name": (60, partial_string_based),
"aliases": (40, partial_list_based),
},
"identity": {
"name": (60, partial_string_based),
"identity_class": (20, exact_match),
"sectors": (20, partial_list_based),
},
"indicator": {
"indicator_types": (15, partial_list_based),
"pattern": (80, custom_pattern_based),
"valid_from": (5, partial_timestamp_based),
"tdelta": 1, # One day interval
},
"location": {
"longitude_latitude": (34, partial_location_distance),
"region": (33, exact_match),
"country": (33, exact_match),
"threshold": 1000.0,
},
"malware": {
"malware_types": (20, partial_list_based),
"name": (80, partial_string_based),
},
"threat-actor": {
"name": (60, partial_string_based),
"threat_actor_types": (20, partial_list_based),
"aliases": (20, partial_list_based),
},
"tool": {
"tool_types": (20, partial_list_based),
"name": (80, partial_string_based),
},
"vulnerability": {
"name": (30, partial_string_based),
"external_references": (70, partial_external_reference_based),
},
"_internal": {
"ignore_spec_version": False,
},
} #: :autodoc-skip:

View File

@ -1,12 +1,19 @@
"""STIX 2 error classes.
"""
"""STIX2 Error Classes."""
class STIXError(Exception):
"""Base class for errors generated in the stix2 library."""
class InvalidValueError(STIXError, ValueError):
class ObjectConfigurationError(STIXError):
"""
Represents specification violations regarding the composition of STIX
objects.
"""
pass
class InvalidValueError(ObjectConfigurationError):
"""An invalid value was provided to a STIX object's ``__init__``."""
def __init__(self, cls, prop_name, reason):
@ -20,48 +27,89 @@ class InvalidValueError(STIXError, ValueError):
return msg.format(self)
class MissingPropertiesError(STIXError, ValueError):
class PropertyPresenceError(ObjectConfigurationError):
"""
Represents an invalid combination of properties on a STIX object. This
class can be used directly when the object requirements are more
complicated and none of the more specific exception subclasses apply.
"""
def __init__(self, message, cls):
super(PropertyPresenceError, self).__init__(message)
self.cls = cls
class MissingPropertiesError(PropertyPresenceError):
"""Missing one or more required properties when constructing STIX object."""
def __init__(self, cls, properties):
super(MissingPropertiesError, self).__init__()
self.cls = cls
self.properties = sorted(list(properties))
self.properties = sorted(properties)
def __str__(self):
msg = "No values for required properties for {0}: ({1})."
return msg.format(self.cls.__name__,
", ".join(x for x in self.properties))
msg = "No values for required properties for {0}: ({1}).".format(
cls.__name__,
", ".join(x for x in self.properties),
)
super(MissingPropertiesError, self).__init__(msg, cls)
class ExtraPropertiesError(STIXError, TypeError):
class ExtraPropertiesError(PropertyPresenceError):
"""One or more extra properties were provided when constructing STIX object."""
def __init__(self, cls, properties):
super(ExtraPropertiesError, self).__init__()
self.cls = cls
self.properties = sorted(list(properties))
self.properties = sorted(properties)
def __str__(self):
msg = "Unexpected properties for {0}: ({1})."
return msg.format(self.cls.__name__,
", ".join(x for x in self.properties))
msg = "Unexpected properties for {0}: ({1}).".format(
cls.__name__,
", ".join(x for x in self.properties),
)
super(ExtraPropertiesError, self).__init__(msg, cls)
class ImmutableError(STIXError, ValueError):
"""Attempted to modify an object after creation."""
class MutuallyExclusivePropertiesError(PropertyPresenceError):
"""Violating interproperty mutually exclusive constraint of a STIX object type."""
def __init__(self, cls, key):
super(ImmutableError, self).__init__()
self.cls = cls
self.key = key
def __init__(self, cls, properties):
self.properties = sorted(properties)
def __str__(self):
msg = "Cannot modify '{0.key}' property in '{0.cls.__name__}' after creation."
return msg.format(self)
msg = "The ({1}) properties for {0} are mutually exclusive.".format(
cls.__name__,
", ".join(x for x in self.properties),
)
super(MutuallyExclusivePropertiesError, self).__init__(msg, cls)
class DictionaryKeyError(STIXError, ValueError):
class DependentPropertiesError(PropertyPresenceError):
"""Violating interproperty dependency constraint of a STIX object type."""
def __init__(self, cls, dependencies):
self.dependencies = dependencies
msg = "The property dependencies for {0}: ({1}) are not met.".format(
cls.__name__,
", ".join(name for x in self.dependencies for name in x),
)
super(DependentPropertiesError, self).__init__(msg, cls)
class AtLeastOnePropertyError(PropertyPresenceError):
"""Violating a constraint of a STIX object type that at least one of the given properties must be populated."""
def __init__(self, cls, properties):
self.properties = sorted(properties)
msg = "At least one of the ({1}) properties for {0} must be " \
"populated.".format(
cls.__name__,
", ".join(x for x in self.properties),
)
super(AtLeastOnePropertyError, self).__init__(msg, cls)
class DictionaryKeyError(ObjectConfigurationError):
"""Dictionary key does not conform to the correct format."""
def __init__(self, key, reason):
@ -74,7 +122,7 @@ class DictionaryKeyError(STIXError, ValueError):
return msg.format(self)
class InvalidObjRefError(STIXError, ValueError):
class InvalidObjRefError(ObjectConfigurationError):
"""A STIX Cyber Observable Object contains an invalid object reference."""
def __init__(self, cls, prop_name, reason):
@ -88,89 +136,7 @@ class InvalidObjRefError(STIXError, ValueError):
return msg.format(self)
class UnmodifiablePropertyError(STIXError, ValueError):
"""Attempted to modify an unmodifiable property of object when creating a new version."""
def __init__(self, unchangable_properties):
super(UnmodifiablePropertyError, self).__init__()
self.unchangable_properties = unchangable_properties
def __str__(self):
msg = "These properties cannot be changed when making a new version: {0}."
return msg.format(", ".join(self.unchangable_properties))
class MutuallyExclusivePropertiesError(STIXError, TypeError):
"""Violating interproperty mutually exclusive constraint of a STIX object type."""
def __init__(self, cls, properties):
super(MutuallyExclusivePropertiesError, self).__init__()
self.cls = cls
self.properties = sorted(list(properties))
def __str__(self):
msg = "The ({1}) properties for {0} are mutually exclusive."
return msg.format(self.cls.__name__,
", ".join(x for x in self.properties))
class DependentPropertiesError(STIXError, TypeError):
"""Violating interproperty dependency constraint of a STIX object type."""
def __init__(self, cls, dependencies):
super(DependentPropertiesError, self).__init__()
self.cls = cls
self.dependencies = dependencies
def __str__(self):
msg = "The property dependencies for {0}: ({1}) are not met."
return msg.format(self.cls.__name__,
", ".join(name for x in self.dependencies for name in x))
class AtLeastOnePropertyError(STIXError, TypeError):
"""Violating a constraint of a STIX object type that at least one of the given properties must be populated."""
def __init__(self, cls, properties):
super(AtLeastOnePropertyError, self).__init__()
self.cls = cls
self.properties = sorted(list(properties))
def __str__(self):
msg = "At least one of the ({1}) properties for {0} must be populated."
return msg.format(self.cls.__name__,
", ".join(x for x in self.properties))
class RevokeError(STIXError, ValueError):
"""Attempted to an operation on a revoked object."""
def __init__(self, called_by):
super(RevokeError, self).__init__()
self.called_by = called_by
def __str__(self):
if self.called_by == "revoke":
return "Cannot revoke an already revoked object."
else:
return "Cannot create a new version of a revoked object."
class ParseError(STIXError, ValueError):
"""Could not parse object."""
def __init__(self, msg):
super(ParseError, self).__init__(msg)
class CustomContentError(STIXError, ValueError):
"""Custom STIX Content (SDO, Observable, Extension, etc.) detected."""
def __init__(self, msg):
super(CustomContentError, self).__init__(msg)
class InvalidSelectorError(STIXError, AssertionError):
class InvalidSelectorError(ObjectConfigurationError):
"""Granular Marking selector violation. The selector must resolve into an existing STIX object property."""
def __init__(self, cls, key):
@ -183,7 +149,73 @@ class InvalidSelectorError(STIXError, AssertionError):
return msg.format(self.key, self.cls.__class__.__name__)
class MarkingNotFoundError(STIXError, AssertionError):
class TLPMarkingDefinitionError(ObjectConfigurationError):
"""Marking violation. The marking-definition for TLP MUST follow the mandated instances from the spec."""
def __init__(self, user_obj, spec_obj):
super(TLPMarkingDefinitionError, self).__init__()
self.user_obj = user_obj
self.spec_obj = spec_obj
def __str__(self):
msg = "Marking {0} does not match spec marking {1}!"
return msg.format(self.user_obj, self.spec_obj)
class ImmutableError(STIXError):
"""Attempted to modify an object after creation."""
def __init__(self, cls, key):
super(ImmutableError, self).__init__()
self.cls = cls
self.key = key
def __str__(self):
msg = "Cannot modify '{0.key}' property in '{0.cls.__name__}' after creation."
return msg.format(self)
class UnmodifiablePropertyError(STIXError):
"""Attempted to modify an unmodifiable property of object when creating a new version."""
def __init__(self, unchangable_properties):
super(UnmodifiablePropertyError, self).__init__()
self.unchangable_properties = unchangable_properties
def __str__(self):
msg = "These properties cannot be changed when making a new version: {0}."
return msg.format(", ".join(self.unchangable_properties))
class RevokeError(STIXError):
"""Attempted an operation on a revoked object."""
def __init__(self, called_by):
super(RevokeError, self).__init__()
self.called_by = called_by
def __str__(self):
if self.called_by == "revoke":
return "Cannot revoke an already revoked object."
else:
return "Cannot create a new version of a revoked object."
class ParseError(STIXError):
"""Could not parse object."""
def __init__(self, msg):
super(ParseError, self).__init__(msg)
class CustomContentError(STIXError):
"""Custom STIX Content (SDO, Observable, Extension, etc.) detected."""
def __init__(self, msg):
super(CustomContentError, self).__init__(msg)
class MarkingNotFoundError(STIXError):
"""Marking violation. The marking reference must be present in SDO or SRO."""
def __init__(self, cls, key):
@ -194,3 +226,23 @@ class MarkingNotFoundError(STIXError, AssertionError):
def __str__(self):
msg = "Marking {0} was not found in {1}!"
return msg.format(self.key, self.cls.__class__.__name__)
class STIXDeprecationWarning(DeprecationWarning):
"""
Represents usage of a deprecated component of a STIX specification.
"""
pass
class DuplicateRegistrationError(STIXError):
"""A STIX object with the same type as an existing object is being registered"""
def __init__(self, obj_type, reg_obj_type):
super(DuplicateRegistrationError, self).__init__()
self.obj_type = obj_type
self.reg_obj_type = reg_obj_type
def __str__(self):
msg = "A(n) {0} with type '{1}' already exists and cannot be registered again"
return msg.format(self.obj_type, self.reg_obj_type)
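A small sketch of what the reorganized hierarchy means for callers (the object construction is illustrative):

import stix2.v21 as v21
from stix2.exceptions import ObjectConfigurationError, STIXError

try:
    v21.Indicator()  # missing required pattern, pattern_type and valid_from
except ObjectConfigurationError as e:
    # MissingPropertiesError now derives from PropertyPresenceError, which
    # derives from ObjectConfigurationError and ultimately from STIXError.
    assert isinstance(e, STIXError)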

View File

@ -9,7 +9,6 @@ Note:
Definitions. The corresponding methods on those classes are identical to
these functions except that the `obj` parameter is omitted.
.. autosummary::
:toctree: markings
@ -23,7 +22,7 @@ Note:
from stix2.markings import granular_markings, object_markings
def get_markings(obj, selectors=None, inherited=False, descendants=False):
def get_markings(obj, selectors=None, inherited=False, descendants=False, marking_ref=True, lang=True):
"""
Get all markings associated with the field(s) specified by selectors.
@ -31,10 +30,13 @@ def get_markings(obj, selectors=None, inherited=False, descendants=False):
obj: An SDO or SRO object.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the properties appear.
inherited: If True, include object level markings and granular markings
inherited relative to the properties.
descendants: If True, include granular markings applied to any children
relative to the properties.
inherited (bool): If True, include object level markings and granular
markings inherited relative to the properties.
descendants (bool): If True, include granular markings applied to any
children relative to the properties.
marking_ref (bool): If False, excludes markings that use the
``marking_ref`` property.
lang (bool): If False, excludes markings that use the ``lang`` property.
Returns:
list: Marking identifiers that matched the selectors expression.
@ -51,7 +53,9 @@ def get_markings(obj, selectors=None, inherited=False, descendants=False):
obj,
selectors,
inherited,
descendants
descendants,
marking_ref,
lang,
)
if inherited:
@ -60,7 +64,7 @@ def get_markings(obj, selectors=None, inherited=False, descendants=False):
return list(set(results))
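A hedged sketch of the new marking_ref/lang switches; the Indicator and its markings are illustrative.

import stix2.v21 as v21
from stix2.markings import get_markings

ind = v21.Indicator(
    pattern="[url:value = 'http://example.com/']",
    pattern_type="stix",
    valid_from="2020-01-01T00:00:00Z",
    description="Example description",
    granular_markings=[
        {"selectors": ["description"], "lang": "en"},
        {"selectors": ["description"], "marking_ref": v21.TLP_GREEN.id},
    ],
)

get_markings(ind, "description")                      # lang tag and marking ref
get_markings(ind, "description", lang=False)          # only the TLP_GREEN id
get_markings(ind, "description", marking_ref=False)   # only "en"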
def set_markings(obj, marking, selectors=None):
def set_markings(obj, marking, selectors=None, marking_ref=True, lang=True):
"""
Remove all markings associated with selectors and append a new granular
marking. Refer to `clear_markings` and `add_markings` for details.
@ -71,6 +75,10 @@ def set_markings(obj, marking, selectors=None):
properties selected by `selectors`.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the properties appear.
marking_ref (bool): If False, markings that use the ``marking_ref``
property will not be removed.
lang (bool): If False, markings that use the ``lang`` property
will not be removed.
Returns:
A new version of the given SDO or SRO with specified markings removed
@ -84,7 +92,7 @@ def set_markings(obj, marking, selectors=None):
if selectors is None:
return object_markings.set_markings(obj, marking)
else:
return granular_markings.set_markings(obj, marking, selectors)
return granular_markings.set_markings(obj, marking, selectors, marking_ref, lang)
def remove_markings(obj, marking, selectors=None):
@ -145,7 +153,7 @@ def add_markings(obj, marking, selectors=None):
return granular_markings.add_markings(obj, marking, selectors)
def clear_markings(obj, selectors=None):
def clear_markings(obj, selectors=None, marking_ref=True, lang=True):
"""
Remove all markings associated with the selectors.
@ -153,6 +161,10 @@ def clear_markings(obj, selectors=None):
obj: An SDO or SRO object.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the field(s) appear(s).
marking_ref (bool): If False, markings that use the ``marking_ref``
property will not be removed.
lang (bool): If False, markings that use the ``lang`` property
will not be removed.
Raises:
InvalidSelectorError: If `selectors` fail validation.
@ -170,7 +182,7 @@ def clear_markings(obj, selectors=None):
if selectors is None:
return object_markings.clear_markings(obj)
else:
return granular_markings.clear_markings(obj, selectors)
return granular_markings.clear_markings(obj, selectors, marking_ref, lang)
def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=False):
@ -183,10 +195,11 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
properties selected by `selectors`.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the field(s) appear(s).
inherited: If True, include object level markings and granular markings
inherited to determine if the properties is/are marked.
descendants: If True, include granular markings applied to any children
of the given selector to determine if the properties is/are marked.
inherited (bool): If True, include object level markings and granular
markings inherited to determine if the properties is/are marked.
descendants (bool): If True, include granular markings applied to any
children of the given selector to determine if the properties
is/are marked.
Returns:
bool: True if ``selectors`` is found on internal SDO or SRO collection.
@ -208,7 +221,7 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
marking,
selectors,
inherited,
descendants
descendants,
)
if inherited:
@ -221,7 +234,7 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
granular_marks,
selectors,
inherited,
descendants
descendants,
)
result = result or object_markings.is_marked(obj, object_marks)
@ -229,7 +242,7 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
return result
class _MarkingsMixin():
class _MarkingsMixin(object):
pass

View File

@ -1,12 +1,11 @@
"""Functions for working with STIX 2.0 granular markings.
"""
"""Functions for working with STIX2 granular markings."""
from stix2 import exceptions
from stix2.markings import utils
from stix2.utils import new_version
from stix2.utils import is_marking, new_version
def get_markings(obj, selectors, inherited=False, descendants=False):
def get_markings(obj, selectors, inherited=False, descendants=False, marking_ref=True, lang=True):
"""
Get all granular markings associated with the properties.
@ -14,10 +13,13 @@ def get_markings(obj, selectors, inherited=False, descendants=False):
obj: An SDO or SRO object.
selectors: string or list of selector strings relative to the SDO or
SRO in which the properties appear.
inherited: If True, include markings inherited relative to the
inherited (bool): If True, include markings inherited relative to the
properties.
descendants: If True, include granular markings applied to any children
relative to the properties.
descendants (bool): If True, include granular markings applied to any
children relative to the properties.
marking_ref (bool): If False, excludes markings that use
``marking_ref`` property.
lang (bool): If False, excludes markings that use ``lang`` property.
Raises:
InvalidSelectorError: If `selectors` fail validation.
@ -29,7 +31,7 @@ def get_markings(obj, selectors, inherited=False, descendants=False):
selectors = utils.convert_to_list(selectors)
utils.validate(obj, selectors)
granular_markings = obj.get("granular_markings", [])
granular_markings = obj.get('granular_markings', [])
if not granular_markings:
return []
@ -38,17 +40,24 @@ def get_markings(obj, selectors, inherited=False, descendants=False):
for marking in granular_markings:
for user_selector in selectors:
for marking_selector in marking.get("selectors", []):
if any([(user_selector == marking_selector), # Catch explicit selectors.
for marking_selector in marking.get('selectors', []):
if any([
(user_selector == marking_selector), # Catch explicit selectors.
(user_selector.startswith(marking_selector) and inherited), # Catch inherited selectors.
(marking_selector.startswith(user_selector) and descendants)]): # Catch descendants selectors
refs = marking.get("marking_ref", [])
results.update([refs])
(marking_selector.startswith(user_selector) and descendants),
]): # Catch descendants selectors
ref = marking.get('marking_ref')
lng = marking.get('lang')
if ref and marking_ref:
results.add(ref)
if lng and lang:
results.add(lng)
return list(results)
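For the marking_ref/lang split above, a small sketch (assuming the module is importable as stix2.markings.granular_markings; a plain dict stands in for the SDO since get_markings only reads from it):

from stix2.markings import granular_markings

sdo = {
    "type": "indicator",
    "name": "suspicious file",
    "granular_markings": [
        {"marking_ref": "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da",
         "selectors": ["name"]},
        {"lang": "en", "selectors": ["name"]},
    ],
}

print(granular_markings.get_markings(sdo, "name"))                     # both values
print(granular_markings.get_markings(sdo, "name", lang=False))         # marking refs only
print(granular_markings.get_markings(sdo, "name", marking_ref=False))  # language tags only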
def set_markings(obj, marking, selectors):
def set_markings(obj, marking, selectors, marking_ref=True, lang=True):
"""
Remove all granular markings associated with selectors and append a new
granular marking. Refer to `clear_markings` and `add_markings` for details.
@ -59,19 +68,25 @@ def set_markings(obj, marking, selectors):
SRO in which the properties appear.
marking: identifier or list of marking identifiers that apply to the
properties selected by `selectors`.
marking_ref (bool): If False, markings that use the ``marking_ref``
property will not be removed.
lang (bool): If False, markings that use the ``lang`` property
will not be removed.
Returns:
A new version of the given SDO or SRO with specified markings removed
and new ones added.
"""
obj = clear_markings(obj, selectors)
obj = clear_markings(obj, selectors, marking_ref, lang)
return add_markings(obj, marking, selectors)
def remove_markings(obj, marking, selectors):
"""
Remove a granular marking from the granular_markings collection.
Remove a granular marking from the granular_markings collection. The method
makes a best-effort attempt to distinguish between a marking-definition
and a language granular marking.
Args:
obj: An SDO or SRO object.
@ -93,7 +108,7 @@ def remove_markings(obj, marking, selectors):
marking = utils.convert_to_marking_list(marking)
utils.validate(obj, selectors)
granular_markings = obj.get("granular_markings")
granular_markings = obj.get('granular_markings')
if not granular_markings:
return obj
@ -102,9 +117,12 @@ def remove_markings(obj, marking, selectors):
to_remove = []
for m in marking:
to_remove.append({"marking_ref": m, "selectors": selectors})
if is_marking(m):
to_remove.append({'marking_ref': m, 'selectors': selectors})
else:
to_remove.append({'lang': m, 'selectors': selectors})
remove = utils.build_granular_marking(to_remove).get("granular_markings")
remove = utils.build_granular_marking(to_remove).get('granular_markings')
if not any(marking in granular_markings for marking in remove):
raise exceptions.MarkingNotFoundError(obj, remove)
@ -123,7 +141,9 @@ def remove_markings(obj, marking, selectors):
def add_markings(obj, marking, selectors):
"""
Append a granular marking to the granular_markings collection.
Append a granular marking to the granular_markings collection. The method
makes a best-effort attempt to distinguish between a marking-definition
and a language granular marking.
Args:
obj: An SDO or SRO object.
@ -145,17 +165,20 @@ def add_markings(obj, marking, selectors):
granular_marking = []
for m in marking:
granular_marking.append({"marking_ref": m, "selectors": sorted(selectors)})
if is_marking(m):
granular_marking.append({'marking_ref': m, 'selectors': sorted(selectors)})
else:
granular_marking.append({'lang': m, 'selectors': sorted(selectors)})
if obj.get("granular_markings"):
granular_marking.extend(obj.get("granular_markings"))
if obj.get('granular_markings'):
granular_marking.extend(obj.get('granular_markings'))
granular_marking = utils.expand_markings(granular_marking)
granular_marking = utils.compress_markings(granular_marking)
return new_version(obj, granular_markings=granular_marking, allow_custom=True)
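The best-effort distinction leans on is_marking() from stix2.utils (imported at the top of this file). A rough sketch of the behaviour this code relies on, assuming the helper simply tests for the marking-definition-- prefix:

from stix2.utils import is_marking

# add_markings()/remove_markings() branch on this to decide whether a value
# becomes a {"marking_ref": ...} or a {"lang": ...} granular marking.
print(is_marking("marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da"))  # True
print(is_marking("fr"))                                                         # False, treated as a language tag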
def clear_markings(obj, selectors):
def clear_markings(obj, selectors, marking_ref=True, lang=True):
"""
Remove all granular markings associated with the selectors.
@ -163,6 +186,10 @@ def clear_markings(obj, selectors):
obj: An SDO or SRO object.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the properties appear.
marking_ref (bool): If False, markings that use the ``marking_ref``
property will not be removed.
lang (bool): If False, markings that use the ``lang`` property
will not be removed.
Raises:
InvalidSelectorError: If `selectors` fail validation.
@ -176,33 +203,38 @@ def clear_markings(obj, selectors):
selectors = utils.convert_to_list(selectors)
utils.validate(obj, selectors)
granular_markings = obj.get("granular_markings")
granular_markings = obj.get('granular_markings')
if not granular_markings:
return obj
granular_markings = utils.expand_markings(granular_markings)
sdo = utils.build_granular_marking(
[{"selectors": selectors, "marking_ref": "N/A"}]
)
granular_dict = utils.build_granular_marking([
{'selectors': selectors, 'marking_ref': 'N/A'},
{'selectors': selectors, 'lang': 'N/A'},
])
clear = sdo.get("granular_markings", [])
clear = granular_dict.get('granular_markings', [])
if not any(clear_selector in sdo_selectors.get("selectors", [])
if not any(
clear_selector in sdo_selectors.get('selectors', [])
for sdo_selectors in granular_markings
for clear_marking in clear
for clear_selector in clear_marking.get("selectors", [])
for clear_selector in clear_marking.get('selectors', [])
):
raise exceptions.MarkingNotFoundError(obj, clear)
for granular_marking in granular_markings:
for s in selectors:
if s in granular_marking.get("selectors", []):
marking_refs = granular_marking.get("marking_ref")
if s in granular_marking.get('selectors', []):
ref = granular_marking.get('marking_ref')
lng = granular_marking.get('lang')
if marking_refs:
granular_marking["marking_ref"] = ""
if ref and marking_ref:
granular_marking['marking_ref'] = ''
if lng and lang:
granular_marking['lang'] = ''
granular_markings = utils.compress_markings(granular_markings)
@ -220,11 +252,12 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
obj: An SDO or SRO object.
marking: identifier or list of marking identifiers that apply to the
properties selected by `selectors`.
selectors: string or list of selectors strings relative to the SDO or
SRO in which the properties appear.
inherited: If True, return markings inherited from the given selector.
descendants: If True, return granular markings applied to any children
of the given selector.
selectors (bool): string or list of selectors strings relative to the
SDO or SRO in which the properties appear.
inherited (bool): If True, return markings inherited from the given
selector.
descendants (bool): If True, return granular markings applied to any
children of the given selector.
Raises:
InvalidSelectorError: If `selectors` fail validation.
@ -245,22 +278,27 @@ def is_marked(obj, marking=None, selectors=None, inherited=False, descendants=Fa
marking = utils.convert_to_marking_list(marking)
utils.validate(obj, selectors)
granular_markings = obj.get("granular_markings", [])
granular_markings = obj.get('granular_markings', [])
marked = False
markings = set()
for granular_marking in granular_markings:
for user_selector in selectors:
for marking_selector in granular_marking.get("selectors", []):
for marking_selector in granular_marking.get('selectors', []):
if any([(user_selector == marking_selector), # Catch explicit selectors.
if any([
(user_selector == marking_selector), # Catch explicit selectors.
(user_selector.startswith(marking_selector) and inherited), # Catch inherited selectors.
(marking_selector.startswith(user_selector) and descendants)]): # Catch descendants selectors
marking_ref = granular_marking.get("marking_ref", "")
(marking_selector.startswith(user_selector) and descendants),
]): # Catch descendants selectors
marking_ref = granular_marking.get('marking_ref', '')
lang = granular_marking.get('lang', '')
if marking and any(x == marking_ref for x in marking):
markings.update([marking_ref])
if marking and any(x == lang for x in marking):
markings.update([lang])
marked = True

View File

@ -1,5 +1,4 @@
"""Functions for working with STIX 2.0 object markings.
"""
"""Functions for working with STIX2 object markings."""
from stix2 import exceptions
from stix2.markings import utils
@ -18,7 +17,7 @@ def get_markings(obj):
markings are present in `object_marking_refs`.
"""
return obj.get("object_marking_refs", [])
return obj.get('object_marking_refs', [])
def add_markings(obj, marking):
@ -35,7 +34,7 @@ def add_markings(obj, marking):
"""
marking = utils.convert_to_marking_list(marking)
object_markings = set(obj.get("object_marking_refs", []) + marking)
object_markings = set(obj.get('object_marking_refs', []) + marking)
return new_version(obj, object_marking_refs=list(object_markings), allow_custom=True)
@ -59,12 +58,12 @@ def remove_markings(obj, marking):
"""
marking = utils.convert_to_marking_list(marking)
object_markings = obj.get("object_marking_refs", [])
object_markings = obj.get('object_marking_refs', [])
if not object_markings:
return obj
if any(x not in obj["object_marking_refs"] for x in marking):
if any(x not in obj['object_marking_refs'] for x in marking):
raise exceptions.MarkingNotFoundError(obj, marking)
new_markings = [x for x in object_markings if x not in marking]
@ -124,7 +123,7 @@ def is_marked(obj, marking=None):
"""
marking = utils.convert_to_marking_list(marking)
object_markings = obj.get("object_marking_refs", [])
object_markings = obj.get('object_marking_refs', [])
if marking:
return any(x in object_markings for x in marking)
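A quick sketch of the object-marking helpers above (a plain dict is enough for the read-only calls; the TLP:WHITE id matches the one used in check_tlp_marking further down):

from stix2.markings import object_markings

TLP_WHITE_ID = "marking-definition--613f2e26-407d-48c7-9eca-b8e91df99dc9"
sdo = {"type": "indicator", "object_marking_refs": [TLP_WHITE_ID]}

print(object_markings.get_markings(sdo))             # [TLP_WHITE_ID]
print(object_markings.is_marked(sdo, TLP_WHITE_ID))  # True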

View File

@ -1,11 +1,10 @@
"""Utility functions for STIX 2.0 data markings.
"""
"""Utility functions for STIX2 data markings."""
import collections
import six
from stix2 import exceptions
from stix2 import exceptions, utils
def _evaluate_expression(obj, selector):
@ -23,7 +22,7 @@ def _evaluate_expression(obj, selector):
"""
for items, value in iterpath(obj):
path = ".".join(items)
path = '.'.join(items)
if path == selector and value:
return [value]
@ -40,7 +39,7 @@ def _validate_selector(obj, selector):
def _get_marking_id(marking):
if type(marking).__name__ is 'MarkingDefinition': # avoid circular import
if type(marking).__name__ == 'MarkingDefinition': # avoid circular import
return marking.id
return marking
@ -119,13 +118,18 @@ def compress_markings(granular_markings):
map_ = collections.defaultdict(set)
for granular_marking in granular_markings:
if granular_marking.get("marking_ref"):
map_[granular_marking.get("marking_ref")].update(granular_marking.get("selectors"))
if granular_marking.get('marking_ref'):
map_[granular_marking.get('marking_ref')].update(granular_marking.get('selectors'))
if granular_marking.get('lang'):
map_[granular_marking.get('lang')].update(granular_marking.get('selectors'))
compressed = \
[
{"marking_ref": marking_ref, "selectors": sorted(selectors)}
for marking_ref, selectors in six.iteritems(map_)
{'marking_ref': item, 'selectors': sorted(selectors)}
if utils.is_marking(item) else
{'lang': item, 'selectors': sorted(selectors)}
for item, selectors in six.iteritems(map_)
]
return compressed
@ -173,14 +177,23 @@ def expand_markings(granular_markings):
expanded = []
for marking in granular_markings:
selectors = marking.get("selectors")
marking_ref = marking.get("marking_ref")
selectors = marking.get('selectors')
marking_ref = marking.get('marking_ref')
lang = marking.get('lang')
if marking_ref:
expanded.extend(
[
{"marking_ref": marking_ref, "selectors": [selector]}
{'marking_ref': marking_ref, 'selectors': [selector]}
for selector in selectors
]
],
)
if lang:
expanded.extend(
[
{'lang': lang, 'selectors': [selector]}
for selector in selectors
],
)
return expanded
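Roughly how the expand/compress pair behaves with the new lang entries (import path matches the "from stix2.markings import utils" used by the other modules; the values are illustrative):

from stix2.markings import utils

markings = [
    {"marking_ref": "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da",
     "selectors": ["description", "name"]},
    {"lang": "en", "selectors": ["name"]},
]

expanded = utils.expand_markings(markings)      # one selector per entry
compressed = utils.compress_markings(expanded)  # regrouped by marking_ref / lang
print(expanded)
print(compressed)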
@ -189,7 +202,7 @@ def expand_markings(granular_markings):
def build_granular_marking(granular_marking):
"""Return a dictionary with the required structure for a granular marking.
"""
return {"granular_markings": expand_markings(granular_marking)}
return {'granular_markings': expand_markings(granular_marking)}
def iterpath(obj, path=None):
@ -229,7 +242,7 @@ def iterpath(obj, path=None):
elif isinstance(varobj, list):
for item in varobj:
index = "[{0}]".format(varobj.index(item))
index = '[{0}]'.format(varobj.index(item))
path.append(index)
yield (path, item)
@ -241,3 +254,81 @@ def iterpath(obj, path=None):
path.pop()
path.pop()
def check_tlp_marking(marking_obj, spec_version):
# Specific TLP Marking validation case.
if marking_obj["definition_type"] == "tlp":
color = marking_obj["definition"]["tlp"]
if color == "white":
if spec_version == '2.0':
w = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "white"}, "definition_type": "tlp",'
' "id": "marking-definition--613f2e26-407d-48c7-9eca-b8e91df99dc9", "type": "marking-definition"}'
)
else:
w = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "white"}, "definition_type": "tlp",'
' "id": "marking-definition--613f2e26-407d-48c7-9eca-b8e91df99dc9", "name": "TLP:WHITE",'
' "type": "marking-definition", "spec_version": "2.1"}'
)
if marking_obj["id"] != "marking-definition--613f2e26-407d-48c7-9eca-b8e91df99dc9":
raise exceptions.TLPMarkingDefinitionError(marking_obj["id"], w)
elif utils.format_datetime(marking_obj["created"]) != "2017-01-20T00:00:00.000Z":
raise exceptions.TLPMarkingDefinitionError(utils.format_datetime(marking_obj["created"]), w)
elif color == "green":
if spec_version == '2.0':
g = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "green"}, "definition_type": "tlp",'
' "id": "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da", "type": "marking-definition"}'
)
else:
g = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "green"}, "definition_type": "tlp",'
' "id": "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da", "name": "TLP:GREEN",'
' "type": "marking-definition", "spec_version": "2.1"}'
)
if marking_obj["id"] != "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da":
raise exceptions.TLPMarkingDefinitionError(marking_obj["id"], g)
elif utils.format_datetime(marking_obj["created"]) != "2017-01-20T00:00:00.000Z":
raise exceptions.TLPMarkingDefinitionError(utils.format_datetime(marking_obj["created"]), g)
elif color == "amber":
if spec_version == '2.0':
a = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "amber"}, "definition_type": "tlp",'
' "id": "marking-definition--f88d31f6-486f-44da-b317-01333bde0b82", "type": "marking-definition"}'
)
else:
a = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "amber"}, "definition_type": "tlp",'
' "id": "marking-definition--f88d31f6-486f-44da-b317-01333bde0b82", "name": "TLP:AMBER",'
' "type": "marking-definition", "spec_version": "2.1"}'
)
if marking_obj["id"] != "marking-definition--f88d31f6-486f-44da-b317-01333bde0b82":
raise exceptions.TLPMarkingDefinitionError(marking_obj["id"], a)
elif utils.format_datetime(marking_obj["created"]) != "2017-01-20T00:00:00.000Z":
raise exceptions.TLPMarkingDefinitionError(utils.format_datetime(marking_obj["created"]), a)
elif color == "red":
if spec_version == '2.0':
r = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "red"}, "definition_type": "tlp",'
' "id": "marking-definition--5e57c739-391a-4eb3-b6be-7d15ca92d5ed", "type": "marking-definition"}'
)
else:
r = (
'{"created": "2017-01-20T00:00:00.000Z", "definition": {"tlp": "red"}, "definition_type": "tlp",'
' "id": "marking-definition--5e57c739-391a-4eb3-b6be-7d15ca92d5ed", "name": "TLP:RED",'
' "type": "marking-definition", "spec_version": "2.1"}'
)
if marking_obj["id"] != "marking-definition--5e57c739-391a-4eb3-b6be-7d15ca92d5ed":
raise exceptions.TLPMarkingDefinitionError(marking_obj["id"], r)
elif utils.format_datetime(marking_obj["created"]) != "2017-01-20T00:00:00.000Z":
raise exceptions.TLPMarkingDefinitionError(utils.format_datetime(marking_obj["created"]), r)
else:
raise exceptions.TLPMarkingDefinitionError(marking_obj["id"], "Does not match any TLP Marking definition")
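A minimal sketch of what this validation rejects (module path assumed from the hunk above; the bogus id is made up for the example):

from datetime import datetime, timezone

from stix2.exceptions import TLPMarkingDefinitionError
from stix2.markings.utils import check_tlp_marking

bogus_green = {
    "definition_type": "tlp",
    "definition": {"tlp": "green"},
    # Wrong id for TLP:GREEN (the canonical one ends in ...ebd3cc5e41da).
    "id": "marking-definition--11111111-1111-4111-8111-111111111111",
    "created": datetime(2017, 1, 20, tzinfo=timezone.utc),
}

try:
    check_tlp_marking(bogus_green, "2.1")
except TLPMarkingDefinitionError as exc:
    print("rejected:", exc)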

stix2/parsing.py Normal file
View File

@ -0,0 +1,407 @@
"""STIX2 Core Objects and Methods."""
import copy
import importlib
import pkgutil
import re
import stix2
from .base import _DomainObject, _Observable
from .exceptions import DuplicateRegistrationError, ParseError
from .utils import PREFIX_21_REGEX, _get_dict, get_class_hierarchy_names
STIX2_OBJ_MAPS = {}
def parse(data, allow_custom=False, interoperability=False, version=None):
"""Convert a string, dict or file-like object into a STIX object.
Args:
data (str, dict, file-like object): The STIX 2 content to be parsed.
allow_custom (bool): Whether to allow custom properties as well as unknown
custom objects. Note that unknown custom objects cannot be parsed
into STIX objects, and will be returned as is. Default: False.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property. If none of the above are
possible, it will use the default version specified by the library.
Returns:
An instantiated Python STIX object.
Warnings:
'allow_custom=True' will allow for the return of any supplied STIX
dict(s) that cannot be found to map to any known STIX object types
(both STIX2 domain objects or defined custom STIX2 objects); NO
validation is done. This is done to allow the processing of possibly
unknown custom STIX objects (example scenario: I need to query a
third-party TAXII endpoint that could provide custom STIX objects that
I don't know about ahead of time)
"""
# convert STIX object to dict, if not already
obj = _get_dict(data)
# convert dict to full python-stix2 obj
obj = dict_to_stix2(obj, allow_custom, interoperability, version)
return obj
def _detect_spec_version(stix_dict):
"""
Given a dict representing a STIX object, try to detect what spec version
it is likely to comply with.
:param stix_dict: A dict with some STIX content. Must at least have a
"type" property.
:return: A string in "vXX" format, where "XX" indicates the spec version,
e.g. "v20", "v21", etc.
"""
obj_type = stix_dict["type"]
if 'spec_version' in stix_dict:
# For STIX 2.0, applies to bundles only.
# For STIX 2.1+, applies to SCOs, SDOs, SROs, and markings only.
v = 'v' + stix_dict['spec_version'].replace('.', '')
elif "id" not in stix_dict:
# Only 2.0 SCOs don't have ID properties
v = "v20"
elif obj_type == 'bundle':
# Bundle without a spec_version property: must be 2.1. But to
# future-proof, use max version over all contained SCOs, with 2.1
# minimum.
v = max(
"v21",
max(
_detect_spec_version(obj) for obj in stix_dict["objects"]
),
)
elif obj_type in STIX2_OBJ_MAPS["v21"]["observables"]:
# Non-bundle object with an ID and without spec_version. Could be a
# 2.1 SCO or 2.0 SDO/SRO/marking. Check for 2.1 SCO...
v = "v21"
else:
# Not a 2.1 SCO; must be a 2.0 object.
v = "v20"
return v
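A small sketch of how the detection plays out through parse() (ids and timestamps are illustrative; the module names in the comments are what the v20/v21 packages are expected to report):

import stix2

identity_21 = stix2.parse({
    "type": "identity",
    "spec_version": "2.1",              # explicit, so parsed with the v21 classes
    "id": "identity--311b2d2d-f010-4473-83ec-1edf84858f4c",
    "created": "2020-01-01T00:00:00.000Z",
    "modified": "2020-01-01T00:00:00.000Z",
    "name": "ACME",
    "identity_class": "organization",
})

identity_20 = stix2.parse({
    "type": "identity",                 # no spec_version, has an id, not a 2.1 SCO, so v20
    "id": "identity--311b2d2d-f010-4473-83ec-1edf84858f4c",
    "created": "2020-01-01T00:00:00.000Z",
    "modified": "2020-01-01T00:00:00.000Z",
    "name": "ACME",
    "identity_class": "organization",
})

print(type(identity_21).__module__)  # expected to contain "v21"
print(type(identity_20).__module__)  # expected to contain "v20"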
def dict_to_stix2(stix_dict, allow_custom=False, interoperability=False, version=None):
"""convert dictionary to full python-stix2 object
Args:
stix_dict (dict): a python dictionary of a STIX object
that (presumably) is semantically correct to be parsed
into a full python-stix2 obj
allow_custom (bool): Whether to allow custom properties as well as
unknown custom objects. Note that unknown custom objects cannot
be parsed into STIX objects, and will be returned as is.
Default: False.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the library will make the best effort based
on checking the "spec_version" property. If none of the above are
possible, it will use the default version specified by the library.
Returns:
An instantiated Python STIX object
Warnings:
'allow_custom=True' will allow for the return of any supplied STIX
dict(s) that cannot be found to map to any known STIX object types
(both STIX2 domain objects or defined custom STIX2 objects); NO
validation is done. This is done to allow the processing of
possibly unknown custom STIX objects (example scenario: I need to
query a third-party TAXII endpoint that could provide custom STIX
objects that I don't know about ahead of time)
"""
if 'type' not in stix_dict:
raise ParseError("Can't parse object with no 'type' property: %s" % str(stix_dict))
if version:
# If the version argument was passed, override other approaches.
v = 'v' + version.replace('.', '')
else:
v = _detect_spec_version(stix_dict)
OBJ_MAP = dict(STIX2_OBJ_MAPS[v]['objects'], **STIX2_OBJ_MAPS[v]['observables'])
try:
obj_class = OBJ_MAP[stix_dict['type']]
except KeyError:
if allow_custom:
# flag allows for unknown custom objects too, but will not
# be parsed into STIX object, returned as is
return stix_dict
raise ParseError("Can't parse unknown object type '%s'! For custom types, use the CustomObject decorator." % stix_dict['type'])
return obj_class(allow_custom=allow_custom, interoperability=interoperability, **stix_dict)
def parse_observable(data, _valid_refs=None, allow_custom=False, version=None):
"""Deserialize a string or file-like object into a STIX Cyber Observable
object.
Args:
data (str, dict, file-like object): The STIX2 content to be parsed.
_valid_refs: A list of object references valid for the scope of the
object being parsed. Use empty list if no valid refs are present.
allow_custom (bool): Whether to allow custom properties or not.
Default: False.
version (str): If present, it forces the parser to use the version
provided. Otherwise, the default version specified by the library
will be used.
Returns:
An instantiated Python STIX Cyber Observable object.
"""
obj = _get_dict(data)
if 'type' not in obj:
raise ParseError("Can't parse observable with no 'type' property: %s" % str(obj))
# get deep copy since we are going to modify the dict and might
# modify the original dict as _get_dict() does not return new
# dict when passed a dict
obj = copy.deepcopy(obj)
obj['_valid_refs'] = _valid_refs or []
if version:
# If the version argument was passed, override other approaches.
v = 'v' + version.replace('.', '')
else:
v = _detect_spec_version(obj)
try:
OBJ_MAP_OBSERVABLE = STIX2_OBJ_MAPS[v]['observables']
obj_class = OBJ_MAP_OBSERVABLE[obj['type']]
except KeyError:
if allow_custom:
# flag allows for unknown custom objects too, but will not
# be parsed into STIX observable object, just returned as is
return obj
raise ParseError("Can't parse unknown observable type '%s'! For custom observables, "
"use the CustomObservable decorator." % obj['type'])
return obj_class(allow_custom=allow_custom, **obj)
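For parse_observable(), a short sketch forcing each spec version explicitly (the file content is illustrative):

import stix2

file_20 = stix2.parse_observable({"type": "file", "name": "evil.dll"}, version="2.0")
file_21 = stix2.parse_observable({"type": "file", "name": "evil.dll"}, version="2.1")
print(type(file_20).__module__, type(file_21).__module__)  # expected: one v20 class, one v21 class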
def _register_object(new_type, version=stix2.DEFAULT_VERSION):
"""Register a custom STIX Object type.
Args:
new_type (class): A class to register in the Object map.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
Raises:
ValueError: If the class being registered wasn't created with the
@CustomObject decorator.
DuplicateRegistrationError: If the class has already been registered.
"""
if not issubclass(new_type, _DomainObject):
raise ValueError(
"'%s' must be created with the @CustomObject decorator." %
new_type.__name__,
)
properties = new_type._properties
if version == "2.1":
for prop_name, prop in properties.items():
if not re.match(PREFIX_21_REGEX, prop_name):
raise ValueError("Property name '%s' must begin with an alpha character" % prop_name)
if version:
v = 'v' + version.replace('.', '')
else:
# Use default version (latest) if no version was provided.
v = 'v' + stix2.DEFAULT_VERSION.replace('.', '')
OBJ_MAP = STIX2_OBJ_MAPS[v]['objects']
if new_type._type in OBJ_MAP.keys():
raise DuplicateRegistrationError("STIX Object", new_type._type)
OBJ_MAP[new_type._type] = new_type
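_register_object() is normally reached through the @CustomObject decorator rather than called directly; a sketch under that assumption (the type name, properties, and id are illustrative):

import stix2
from stix2.properties import IntegerProperty, StringProperty
from stix2.v21 import CustomObject

@CustomObject('x-acme-widget', [
    ('name', StringProperty(required=True)),
    ('size', IntegerProperty()),
])
class Widget(object):
    pass

# Once registered, parse() can resolve the new type.
widget = stix2.parse({
    "type": "x-acme-widget",
    "spec_version": "2.1",
    "id": "x-acme-widget--5d2dc832-b137-4e8c-97b2-5b00c18be611",
    "created": "2020-01-01T00:00:00.000Z",
    "modified": "2020-01-01T00:00:00.000Z",
    "name": "thing",
    "size": 3,
})
print(isinstance(widget, Widget))  # True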
def _register_marking(new_marking, version=stix2.DEFAULT_VERSION):
"""Register a custom STIX Marking Definition type.
Args:
new_marking (class): A class to register in the Marking map.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
"""
mark_type = new_marking._type
properties = new_marking._properties
stix2.properties._validate_type(mark_type, version)
if version == "2.1":
for prop_name, prop_value in properties.items():
if not re.match(PREFIX_21_REGEX, prop_name):
raise ValueError("Property name '%s' must begin with an alpha character." % prop_name)
if version:
v = 'v' + version.replace('.', '')
else:
# Use default version (latest) if no version was provided.
v = 'v' + stix2.DEFAULT_VERSION.replace('.', '')
OBJ_MAP_MARKING = STIX2_OBJ_MAPS[v]['markings']
if mark_type in OBJ_MAP_MARKING.keys():
raise DuplicateRegistrationError("STIX Marking", mark_type)
OBJ_MAP_MARKING[mark_type] = new_marking
def _register_observable(new_observable, version=stix2.DEFAULT_VERSION):
"""Register a custom STIX Cyber Observable type.
Args:
new_observable (class): A class to register in the Observables map.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1"). If
None, use latest version.
"""
properties = new_observable._properties
if version == "2.0":
# If using STIX2.0, check properties ending in "_ref/s" are ObjectReferenceProperties
for prop_name, prop in properties.items():
if prop_name.endswith('_ref') and ('ObjectReferenceProperty' not in get_class_hierarchy_names(prop)):
raise ValueError(
"'%s' is named like an object reference property but "
"is not an ObjectReferenceProperty." % prop_name,
)
elif (prop_name.endswith('_refs') and ('ListProperty' not in get_class_hierarchy_names(prop) or
'ObjectReferenceProperty' not in get_class_hierarchy_names(prop.contained))):
raise ValueError(
"'%s' is named like an object reference list property but "
"is not a ListProperty containing ObjectReferenceProperty." % prop_name,
)
else:
# If using STIX2.1 (or newer...), check properties ending in "_ref/s" are ReferenceProperties
for prop_name, prop in properties.items():
if not re.match(PREFIX_21_REGEX, prop_name):
raise ValueError("Property name '%s' must begin with an alpha character." % prop_name)
elif prop_name.endswith('_ref') and ('ReferenceProperty' not in get_class_hierarchy_names(prop)):
raise ValueError(
"'%s' is named like a reference property but "
"is not a ReferenceProperty." % prop_name,
)
elif (prop_name.endswith('_refs') and ('ListProperty' not in get_class_hierarchy_names(prop) or
'ReferenceProperty' not in get_class_hierarchy_names(prop.contained))):
raise ValueError(
"'%s' is named like a reference list property but "
"is not a ListProperty containing ReferenceProperty." % prop_name,
)
if version:
v = 'v' + version.replace('.', '')
else:
# Use default version (latest) if no version was provided.
v = 'v' + stix2.DEFAULT_VERSION.replace('.', '')
OBJ_MAP_OBSERVABLE = STIX2_OBJ_MAPS[v]['observables']
if new_observable._type in OBJ_MAP_OBSERVABLE.keys():
raise DuplicateRegistrationError("Cyber Observable", new_observable._type)
OBJ_MAP_OBSERVABLE[new_observable._type] = new_observable
def _register_observable_extension(
observable, new_extension, version=stix2.DEFAULT_VERSION,
):
"""Register a custom extension to a STIX Cyber Observable type.
Args:
observable: An observable class or instance
new_extension (class): A class to register in the Observables
Extensions map.
version (str): Which STIX2 version to use. (e.g. "2.0", "2.1").
Defaults to the latest supported version.
"""
obs_class = observable if isinstance(observable, type) else \
type(observable)
ext_type = new_extension._type
properties = new_extension._properties
if not issubclass(obs_class, _Observable):
raise ValueError("'observable' must be a valid Observable class!")
stix2.properties._validate_type(ext_type, version)
if not new_extension._properties:
raise ValueError(
"Invalid extension: must define at least one property: " +
ext_type,
)
if version == "2.1":
if not ext_type.endswith('-ext'):
raise ValueError(
"Invalid extension type name '%s': must end with '-ext'." %
ext_type,
)
for prop_name, prop_value in properties.items():
if not re.match(PREFIX_21_REGEX, prop_name):
raise ValueError("Property name '%s' must begin with an alpha character." % prop_name)
v = 'v' + version.replace('.', '')
try:
observable_type = observable._type
except AttributeError:
raise ValueError(
"Unknown observable type. Custom observables must be "
"created with the @CustomObservable decorator.",
)
OBJ_MAP_OBSERVABLE = STIX2_OBJ_MAPS[v]['observables']
EXT_MAP = STIX2_OBJ_MAPS[v]['observable-extensions']
try:
if ext_type in EXT_MAP[observable_type].keys():
raise DuplicateRegistrationError("Observable Extension", ext_type)
EXT_MAP[observable_type][ext_type] = new_extension
except KeyError:
if observable_type not in OBJ_MAP_OBSERVABLE:
raise ValueError(
"Unknown observable type '%s'. Custom observables "
"must be created with the @CustomObservable decorator."
% observable_type,
)
else:
EXT_MAP[observable_type] = {ext_type: new_extension}
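Likewise, extension registration usually happens through the @CustomExtension decorator; a sketch assuming the v21 decorator signature (observable class, extension type, properties), with a name chosen to satisfy the '-ext' suffix rule enforced above:

from stix2.properties import StringProperty
from stix2.v21 import CustomExtension, File

@CustomExtension(File, 'x-acme-scan-ext', [
    ('verdict', StringProperty(required=True)),
])
class AcmeScanExt(object):
    pass

f = File(
    name="evil.dll",
    extensions={'x-acme-scan-ext': {'verdict': 'malicious'}},
)
print(f.extensions['x-acme-scan-ext'].verdict)  # malicious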
def _collect_stix2_mappings():
"""Navigate the package once and retrieve all object mapping dicts for each
v2X package. Includes OBJ_MAP, OBJ_MAP_OBSERVABLE, EXT_MAP."""
if not STIX2_OBJ_MAPS:
top_level_module = importlib.import_module('stix2')
path = top_level_module.__path__
prefix = str(top_level_module.__name__) + '.'
for module_loader, name, is_pkg in pkgutil.walk_packages(path=path, prefix=prefix):
ver = name.split('.')[1]
if re.match(r'^stix2\.v2[0-9]$', name) and is_pkg:
mod = importlib.import_module(name, str(top_level_module.__name__))
STIX2_OBJ_MAPS[ver] = {}
STIX2_OBJ_MAPS[ver]['objects'] = mod.OBJ_MAP
STIX2_OBJ_MAPS[ver]['observables'] = mod.OBJ_MAP_OBSERVABLE
STIX2_OBJ_MAPS[ver]['observable-extensions'] = mod.EXT_MAP
elif re.match(r'^stix2\.v2[0-9]\.common$', name) and is_pkg is False:
mod = importlib.import_module(name, str(top_level_module.__name__))
STIX2_OBJ_MAPS[ver]['markings'] = mod.OBJ_MAP_MARKING
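The resulting registry can be inspected directly; a sketch (stix2.parsing is the new module added in this file, and the maps are presumably already populated by the time the package has been imported):

from stix2.parsing import STIX2_OBJ_MAPS, _collect_stix2_mappings

_collect_stix2_mappings()  # no-op if the package import already filled the maps
print(sorted(STIX2_OBJ_MAPS))                           # e.g. ['v20', 'v21']
print('indicator' in STIX2_OBJ_MAPS['v21']['objects'])  # True
print('file' in STIX2_OBJ_MAPS['v21']['observables'])   # True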

stix2/pattern_visitor.py Normal file
View File

@ -0,0 +1,375 @@
import importlib
import inspect
from stix2patterns.exceptions import ParseException
from stix2patterns.grammars.STIXPatternParser import TerminalNode
from stix2patterns.v20.grammars.STIXPatternParser import \
STIXPatternParser as STIXPatternParser20
from stix2patterns.v20.grammars.STIXPatternVisitor import \
STIXPatternVisitor as STIXPatternVisitor20
from stix2patterns.v20.pattern import Pattern as Pattern20
from stix2patterns.v21.grammars.STIXPatternParser import \
STIXPatternParser as STIXPatternParser21
from stix2patterns.v21.grammars.STIXPatternVisitor import \
STIXPatternVisitor as STIXPatternVisitor21
from stix2patterns.v21.pattern import Pattern as Pattern21
import stix2
from .patterns import *
from .patterns import _BooleanExpression
# flake8: noqa F405
def collapse_lists(lists):
result = []
for c in lists:
if isinstance(c, list):
result.extend(c)
else:
result.append(c)
return result
def remove_terminal_nodes(parse_tree_nodes):
values = []
for x in parse_tree_nodes:
if not isinstance(x, TerminalNode):
values.append(x)
return values
class STIXPatternVisitorForSTIX2():
classes = {}
def get_class(self, class_name):
if class_name in STIXPatternVisitorForSTIX2.classes:
return STIXPatternVisitorForSTIX2.classes[class_name]
else:
return None
def instantiate(self, klass_name, *args):
klass_to_instantiate = None
if self.module_suffix:
klass_to_instantiate = self.get_class(klass_name + "For" + self.module_suffix)
if not klass_to_instantiate:
# use the classes in python_stix2
klass_to_instantiate = globals()[klass_name]
return klass_to_instantiate(*args)
# Visit a parse tree produced by STIXPatternParser#pattern.
def visitPattern(self, ctx):
children = self.visitChildren(ctx)
return children[0]
# Visit a parse tree produced by STIXPatternParser#observationExpressions.
def visitObservationExpressions(self, ctx):
children = self.visitChildren(ctx)
if len(children) == 1:
return children[0]
else:
return FollowedByObservationExpression([children[0], children[2]])
# Visit a parse tree produced by STIXPatternParser#observationExpressionOr.
def visitObservationExpressionOr(self, ctx):
children = self.visitChildren(ctx)
if len(children) == 1:
return children[0]
else:
return self.instantiate("OrObservationExpression", [children[0], children[2]])
# Visit a parse tree produced by STIXPatternParser#observationExpressionAnd.
def visitObservationExpressionAnd(self, ctx):
children = self.visitChildren(ctx)
if len(children) == 1:
return children[0]
else:
return self.instantiate("AndObservationExpression", [children[0], children[2]])
# Visit a parse tree produced by STIXPatternParser#observationExpressionRepeated.
def visitObservationExpressionRepeated(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("QualifiedObservationExpression", children[0], children[1])
# Visit a parse tree produced by STIXPatternParser#observationExpressionSimple.
def visitObservationExpressionSimple(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("ObservationExpression", children[1])
# Visit a parse tree produced by STIXPatternParser#observationExpressionCompound.
def visitObservationExpressionCompound(self, ctx):
children = self.visitChildren(ctx)
if isinstance(children[0], TerminalNode) and children[0].symbol.type == self.parser_class.LPAREN:
return self.instantiate("ParentheticalExpression", children[1])
else:
return self.instantiate("ObservationExpression", children[0])
# Visit a parse tree produced by STIXPatternParser#observationExpressionWithin.
def visitObservationExpressionWithin(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("QualifiedObservationExpression", children[0], children[1])
# Visit a parse tree produced by STIXPatternParser#observationExpressionStartStop.
def visitObservationExpressionStartStop(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("QualifiedObservationExpression", children[0], children[1])
# Visit a parse tree produced by STIXPatternParser#comparisonExpression.
def visitComparisonExpression(self, ctx):
children = self.visitChildren(ctx)
if len(children) == 1:
return children[0]
else:
if isinstance(children[0], _BooleanExpression):
children[0].operands.append(children[2])
return children[0]
else:
return self.instantiate("OrBooleanExpression", [children[0], children[2]])
# Visit a parse tree produced by STIXPatternParser#comparisonExpressionAnd.
def visitComparisonExpressionAnd(self, ctx):
# TODO: NOT
children = self.visitChildren(ctx)
if len(children) == 1:
return children[0]
else:
if isinstance(children[0], _BooleanExpression):
children[0].operands.append(children[2])
return children[0]
else:
return self.instantiate("AndBooleanExpression", [children[0], children[2]])
# Visit a parse tree produced by STIXPatternParser#propTestEqual.
def visitPropTestEqual(self, ctx):
children = self.visitChildren(ctx)
operator = children[1].symbol.type
negated = operator != self.parser_class.EQ
return self.instantiate(
"EqualityComparisonExpression", children[0], children[3 if len(children) > 3 else 2],
negated,
)
# Visit a parse tree produced by STIXPatternParser#propTestOrder.
def visitPropTestOrder(self, ctx):
children = self.visitChildren(ctx)
operator = children[1].symbol.type
if operator == self.parser_class.GT:
return self.instantiate(
"GreaterThanComparisonExpression", children[0],
children[3 if len(children) > 3 else 2], False,
)
elif operator == self.parser_class.LT:
return self.instantiate(
"LessThanComparisonExpression", children[0],
children[3 if len(children) > 3 else 2], False,
)
elif operator == self.parser_class.GE:
return self.instantiate(
"GreaterThanEqualComparisonExpression", children[0],
children[3 if len(children) > 3 else 2], False,
)
elif operator == self.parser_class.LE:
return self.instantiate(
"LessThanEqualComparisonExpression", children[0],
children[3 if len(children) > 3 else 2], False,
)
# Visit a parse tree produced by STIXPatternParser#propTestSet.
def visitPropTestSet(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("InComparisonExpression", children[0], children[3 if len(children) > 3 else 2], False)
# Visit a parse tree produced by STIXPatternParser#propTestLike.
def visitPropTestLike(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("LikeComparisonExpression", children[0], children[3 if len(children) > 3 else 2], False)
# Visit a parse tree produced by STIXPatternParser#propTestRegex.
def visitPropTestRegex(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate(
"MatchesComparisonExpression", children[0], children[3 if len(children) > 3 else 2],
False,
)
# Visit a parse tree produced by STIXPatternParser#propTestIsSubset.
def visitPropTestIsSubset(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("IsSubsetComparisonExpression", children[0], children[3 if len(children) > 3 else 2])
# Visit a parse tree produced by STIXPatternParser#propTestIsSuperset.
def visitPropTestIsSuperset(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("IsSupersetComparisonExpression", children[0], children[3 if len(children) > 3 else 2])
# Visit a parse tree produced by STIXPatternParser#propTestParen.
def visitPropTestParen(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("ParentheticalExpression", children[1])
# Visit a parse tree produced by STIXPatternParser#startStopQualifier.
def visitStartStopQualifier(self, ctx):
children = self.visitChildren(ctx)
return StartStopQualifier(children[1], children[3])
# Visit a parse tree produced by STIXPatternParser#withinQualifier.
def visitWithinQualifier(self, ctx):
children = self.visitChildren(ctx)
return WithinQualifier(children[1])
# Visit a parse tree produced by STIXPatternParser#repeatedQualifier.
def visitRepeatedQualifier(self, ctx):
children = self.visitChildren(ctx)
return RepeatQualifier(children[1])
# Visit a parse tree produced by STIXPatternParser#objectPath.
def visitObjectPath(self, ctx):
children = self.visitChildren(ctx)
flat_list = collapse_lists(children[2:])
property_path = []
i = 0
while i < len(flat_list):
current = flat_list[i]
if i == len(flat_list)-1:
property_path.append(current)
break
next = flat_list[i+1]
if isinstance(next, TerminalNode):
property_path.append(self.instantiate("ListObjectPathComponent", current.property_name, next.getText()))
i += 2
else:
property_path.append(current)
i += 1
return self.instantiate("ObjectPath", children[0].getText(), property_path)
# Visit a parse tree produced by STIXPatternParser#objectType.
def visitObjectType(self, ctx):
children = self.visitChildren(ctx)
return children[0]
# Visit a parse tree produced by STIXPatternParser#firstPathComponent.
def visitFirstPathComponent(self, ctx):
children = self.visitChildren(ctx)
step = children[0].getText()
# if step.endswith("_ref"):
# return stix2.ReferenceObjectPathComponent(step)
# else:
return self.instantiate("BasicObjectPathComponent", step, False)
# Visit a parse tree produced by STIXPatternParser#indexPathStep.
def visitIndexPathStep(self, ctx):
children = self.visitChildren(ctx)
return children[1]
# Visit a parse tree produced by STIXPatternParser#pathStep.
def visitPathStep(self, ctx):
return collapse_lists(self.visitChildren(ctx))
# Visit a parse tree produced by STIXPatternParser#keyPathStep.
def visitKeyPathStep(self, ctx):
children = self.visitChildren(ctx)
if isinstance(children[1], StringConstant):
# special case for hashes
return children[1].value
else:
return self.instantiate("BasicObjectPathComponent", children[1].getText(), True)
# Visit a parse tree produced by STIXPatternParser#setLiteral.
def visitSetLiteral(self, ctx):
children = self.visitChildren(ctx)
return self.instantiate("ListConstant", remove_terminal_nodes(children))
# Visit a parse tree produced by STIXPatternParser#primitiveLiteral.
def visitPrimitiveLiteral(self, ctx):
children = self.visitChildren(ctx)
return children[0]
# Visit a parse tree produced by STIXPatternParser#orderableLiteral.
def visitOrderableLiteral(self, ctx):
children = self.visitChildren(ctx)
return children[0]
def visitTerminal(self, node):
if node.symbol.type == self.parser_class.IntPosLiteral or node.symbol.type == self.parser_class.IntNegLiteral:
return IntegerConstant(node.getText())
elif node.symbol.type == self.parser_class.FloatPosLiteral or node.symbol.type == self.parser_class.FloatNegLiteral:
return FloatConstant(node.getText())
elif node.symbol.type == self.parser_class.HexLiteral:
return HexConstant(node.getText(), from_parse_tree=True)
elif node.symbol.type == self.parser_class.BinaryLiteral:
return BinaryConstant(node.getText(), from_parse_tree=True)
elif node.symbol.type == self.parser_class.StringLiteral:
if node.getText()[0] == "'" and node.getText()[-1] == "'":
return StringConstant(node.getText()[1:-1], from_parse_tree=True)
else:
raise ParseException("The pattern does not start and end with a single quote")
elif node.symbol.type == self.parser_class.BoolLiteral:
return BooleanConstant(node.getText())
elif node.symbol.type == self.parser_class.TimestampLiteral:
value = node.getText()
# STIX 2.1 uses a special timestamp literal syntax
if value.startswith("t"):
value = value[2:-1]
return TimestampConstant(value)
else:
return node
def aggregateResult(self, aggregate, nextResult):
if aggregate:
aggregate.append(nextResult)
elif nextResult:
aggregate = [nextResult]
return aggregate
# This class defines a complete generic visitor for a parse tree produced by STIXPatternParser.
class STIXPatternVisitorForSTIX21(STIXPatternVisitorForSTIX2, STIXPatternVisitor21):
classes = {}
def __init__(self, module_suffix, module_name):
if module_suffix and module_name:
self.module_suffix = module_suffix
if not STIXPatternVisitorForSTIX2.classes:
module = importlib.import_module(module_name)
for k, c in inspect.getmembers(module, inspect.isclass):
STIXPatternVisitorForSTIX2.classes[k] = c
else:
self.module_suffix = None
self.parser_class = STIXPatternParser21
super(STIXPatternVisitor21, self).__init__()
class STIXPatternVisitorForSTIX20(STIXPatternVisitorForSTIX2, STIXPatternVisitor20):
classes = {}
def __init__(self, module_suffix, module_name):
if module_suffix and module_name:
self.module_suffix = module_suffix
if not STIXPatternVisitorForSTIX2.classes:
module = importlib.import_module(module_name)
for k, c in inspect.getmembers(module, inspect.isclass):
STIXPatternVisitorForSTIX2.classes[k] = c
else:
self.module_suffix = None
self.parser_class = STIXPatternParser20
super(STIXPatternVisitor20, self).__init__()
def create_pattern_object(pattern, module_suffix="", module_name="", version=stix2.DEFAULT_VERSION):
"""
Create a STIX pattern AST from a pattern string.
"""
if version == "2.1":
pattern_class = Pattern21
visitor_class = STIXPatternVisitorForSTIX21
else:
pattern_class = Pattern20
visitor_class = STIXPatternVisitorForSTIX20
pattern_obj = pattern_class(pattern)
builder = visitor_class(module_suffix, module_name)
return pattern_obj.visit(builder)
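A short sketch of round-tripping a pattern string through the visitor (the pattern text is illustrative):

from stix2.pattern_visitor import create_pattern_object

ast = create_pattern_object(
    "[file:name = 'evil.dll' AND file:size > 1024]", version="2.1",
)
print(type(ast).__name__)  # an ObservationExpression-based AST node
print(ast)                 # should render an equivalent pattern string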

View File

@ -1,11 +1,12 @@
"""Classes to aid in working with the STIX 2 patterning language.
"""
"""Classes to aid in working with the STIX 2 patterning language."""
import base64
import binascii
import datetime
import re
import six
from .utils import parse_into_datetime
@ -13,6 +14,14 @@ def escape_quotes_and_backslashes(s):
return s.replace(u'\\', u'\\\\').replace(u"'", u"\\'")
def quote_if_needed(x):
if isinstance(x, six.string_types):
if x.find("-") != -1:
if not x.startswith("'"):
return "'" + x + "'"
return x
class _Constant(object):
pass
@ -23,11 +32,13 @@ class StringConstant(_Constant):
Args:
value (str): string value
"""
def __init__(self, value):
def __init__(self, value, from_parse_tree=False):
self.needs_to_be_quoted = not from_parse_tree
self.value = value
def __str__(self):
return "'%s'" % escape_quotes_and_backslashes(self.value)
return "'%s'" % (escape_quotes_and_backslashes(self.value) if self.needs_to_be_quoted else self.value)
class TimestampConstant(_Constant):
@ -86,8 +97,8 @@ class BooleanConstant(_Constant):
self.value = value
return
trues = ['true', 't']
falses = ['false', 'f']
trues = ['true', 't', '1']
falses = ['false', 'f', '0']
try:
if value.lower() in trues:
self.value = True
@ -110,20 +121,21 @@ class BooleanConstant(_Constant):
_HASH_REGEX = {
"MD5": ("^[a-fA-F0-9]{32}$", "MD5"),
"MD6": ("^[a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{56}|[a-fA-F0-9]{64}|[a-fA-F0-9]{96}|[a-fA-F0-9]{128}$", "MD6"),
"RIPEMD160": ("^[a-fA-F0-9]{40}$", "RIPEMD-160"),
"SHA1": ("^[a-fA-F0-9]{40}$", "SHA-1"),
"SHA224": ("^[a-fA-F0-9]{56}$", "SHA-224"),
"SHA256": ("^[a-fA-F0-9]{64}$", "SHA-256"),
"SHA384": ("^[a-fA-F0-9]{96}$", "SHA-384"),
"SHA512": ("^[a-fA-F0-9]{128}$", "SHA-512"),
"SHA3224": ("^[a-fA-F0-9]{56}$", "SHA3-224"),
"SHA3256": ("^[a-fA-F0-9]{64}$", "SHA3-256"),
"SHA3384": ("^[a-fA-F0-9]{96}$", "SHA3-384"),
"SHA3512": ("^[a-fA-F0-9]{128}$", "SHA3-512"),
"SSDEEP": ("^[a-zA-Z0-9/+:.]{1,128}$", "ssdeep"),
"WHIRLPOOL": ("^[a-fA-F0-9]{128}$", "WHIRLPOOL"),
"MD5": (r"^[a-fA-F0-9]{32}$", "MD5"),
"MD6": (r"^[a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{56}|[a-fA-F0-9]{64}|[a-fA-F0-9]{96}|[a-fA-F0-9]{128}$", "MD6"),
"RIPEMD160": (r"^[a-fA-F0-9]{40}$", "RIPEMD-160"),
"SHA1": (r"^[a-fA-F0-9]{40}$", "SHA-1"),
"SHA224": (r"^[a-fA-F0-9]{56}$", "SHA-224"),
"SHA256": (r"^[a-fA-F0-9]{64}$", "SHA-256"),
"SHA384": (r"^[a-fA-F0-9]{96}$", "SHA-384"),
"SHA512": (r"^[a-fA-F0-9]{128}$", "SHA-512"),
"SHA3224": (r"^[a-fA-F0-9]{56}$", "SHA3-224"),
"SHA3256": (r"^[a-fA-F0-9]{64}$", "SHA3-256"),
"SHA3384": (r"^[a-fA-F0-9]{96}$", "SHA3-384"),
"SHA3512": (r"^[a-fA-F0-9]{128}$", "SHA3-512"),
"SSDEEP": (r"^[a-zA-Z0-9/+:.]{1,128}$", "SSDEEP"),
"WHIRLPOOL": (r"^[a-fA-F0-9]{128}$", "WHIRLPOOL"),
"TLSH": (r"^[a-fA-F0-9]{70}$", "TLSH"),
}
@ -143,7 +155,7 @@ class HashConstant(StringConstant):
vocab_key = _HASH_REGEX[key][1]
if not re.match(_HASH_REGEX[key][0], value):
raise ValueError("'%s' is not a valid %s hash" % (value, vocab_key))
self.value = value
super(HashConstant, self).__init__(value)
class BinaryConstant(_Constant):
@ -152,7 +164,13 @@ class BinaryConstant(_Constant):
Args:
value (str): base64 encoded string value
"""
def __init__(self, value):
def __init__(self, value, from_parse_tree=False):
# support with or without a 'b'
if from_parse_tree:
m = re.match("^b'(.+)'$", value)
if m:
value = m.group(1)
try:
base64.b64decode(value)
self.value = value
@ -169,10 +187,16 @@ class HexConstant(_Constant):
Args:
value (str): hexadecimal value
"""
def __init__(self, value):
if not re.match('^([a-fA-F0-9]{2})+$', value):
raise ValueError("must contain an even number of hexadecimal characters")
def __init__(self, value, from_parse_tree=False):
# support with or without an 'h'
if not from_parse_tree and re.match('^([a-fA-F0-9]{2})+$', value):
self.value = value
else:
m = re.match("^h'(([a-fA-F0-9]{2})+)'$", value)
if m:
self.value = m.group(1)
else:
raise ValueError("must contain an even number of hexadecimal characters")
def __str__(self):
return "h'%s'" % self.value
@ -185,10 +209,11 @@ class ListConstant(_Constant):
value (list): list of values
"""
def __init__(self, values):
self.value = values
# handle _Constants or make a _Constant
self.value = [x if isinstance(x, _Constant) else make_constant(x) for x in values]
def __str__(self):
return "(" + ", ".join([("%s" % make_constant(x)) for x in self.value]) + ")"
return "(" + ", ".join(["%s" % x for x in self.value]) + ")"
def make_constant(value):
@ -203,7 +228,7 @@ def make_constant(value):
try:
return parse_into_datetime(value)
except ValueError:
except (ValueError, TypeError):
pass
if isinstance(value, str):
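The widened except clause above appears to be what lets non-string values fall through to the numeric constants; a small sketch (stix2.patterns assumed as the module path, matching the wildcard import in pattern_visitor.py):

from stix2.patterns import ListConstant, make_constant

print(make_constant(7))           # IntegerConstant -> 7
print(make_constant("evil.dll"))  # StringConstant  -> 'evil.dll'
print(ListConstant(["a", 3]))     # items coerced via make_constant -> ('a', 3)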
@ -229,7 +254,10 @@ class _ObjectPathComponent(object):
parse1 = component_name.split("[")
return ListObjectPathComponent(parse1[0], parse1[1][:-1])
else:
return BasicObjectPathComponent(component_name)
return BasicObjectPathComponent(component_name, False)
def __str__(self):
return quote_if_needed(self.property_name)
class BasicObjectPathComponent(_ObjectPathComponent):
@ -243,14 +271,11 @@ class BasicObjectPathComponent(_ObjectPathComponent):
property_name (str): object property name
is_key (bool): is dictionary key, default: False
"""
def __init__(self, property_name, is_key=False):
def __init__(self, property_name, is_key):
self.property_name = property_name
# TODO: set is_key to True if this component is a dictionary key
# self.is_key = is_key
def __str__(self):
return self.property_name
class ListObjectPathComponent(_ObjectPathComponent):
"""List object path component (for an observation or expression)
@ -264,7 +289,7 @@ class ListObjectPathComponent(_ObjectPathComponent):
self.index = index
def __str__(self):
return "%s[%s]" % (self.property_name, self.index)
return "%s[%s]" % (quote_if_needed(self.property_name), self.index)
class ReferenceObjectPathComponent(_ObjectPathComponent):
@ -276,9 +301,6 @@ class ReferenceObjectPathComponent(_ObjectPathComponent):
def __init__(self, reference_property_name):
self.property_name = reference_property_name
def __str__(self):
return self.property_name
class ObjectPath(object):
"""Pattern operand object (property) path
@ -289,12 +311,14 @@ class ObjectPath(object):
"""
def __init__(self, object_type_name, property_path):
self.object_type_name = object_type_name
self.property_path = [x if isinstance(x, _ObjectPathComponent) else
self.property_path = [
x if isinstance(x, _ObjectPathComponent) else
_ObjectPathComponent.create_ObjectPathComponent(x)
for x in property_path]
for x in property_path
]
def __str__(self):
return "%s:%s" % (self.object_type_name, ".".join(["%s" % x for x in self.property_path]))
return "%s:%s" % (self.object_type_name, ".".join(["%s" % quote_if_needed(x) for x in self.property_path]))
def merge(self, other):
"""Extend the object property with that of the supplied object property path"""
@ -527,7 +551,7 @@ class ObservationExpression(_PatternExpression):
self.operand = operand
def __str__(self):
return "[%s]" % self.operand
return "%s" % self.operand if isinstance(self.operand, (ObservationExpression, _CompoundObservationExpression)) else "[%s]" % self.operand
class _CompoundObservationExpression(_PatternExpression):

View File

@ -1,32 +1,125 @@
"""Classes for representing properties of STIX Objects and Cyber Observables.
"""
"""Classes for representing properties of STIX Objects and Cyber Observables."""
import base64
import binascii
import collections
import copy
import inspect
import re
import uuid
from six import string_types, text_type
from stix2patterns.validator import run_validator
import stix2
from .base import _STIXBase
from .exceptions import DictionaryKeyError
from .utils import _get_dict, parse_into_datetime
from .exceptions import (
CustomContentError, DictionaryKeyError, MissingPropertiesError,
MutuallyExclusivePropertiesError,
)
from .parsing import STIX2_OBJ_MAPS, parse, parse_observable
from .utils import (
TYPE_21_REGEX, TYPE_REGEX, _get_dict, get_class_hierarchy_names,
parse_into_datetime,
)
# This uses the regular expression for a RFC 4122, Version 4 UUID. In the
# 8-4-4-4-12 hexadecimal representation, the first hex digit of the third
# component must be a 4, and the first hex digit of the fourth component must be
# 8, 9, a, or b (10xx bit pattern).
ID_REGEX = re.compile("^[a-z0-9][a-z0-9-]+[a-z0-9]--" # object type
"[0-9a-fA-F]{8}-"
ID_REGEX_interoperability = re.compile(r"[0-9a-fA-F]{8}-"
"[0-9a-fA-F]{4}-"
"[0-9a-fA-F]{4}-"
"[0-9a-fA-F]{4}-"
"4[0-9a-fA-F]{3}-"
"[89abAB][0-9a-fA-F]{3}-"
"[0-9a-fA-F]{12}$")
from collections import defaultdict
try:
from collections.abc import Mapping
except ImportError:
from collections import Mapping
ERROR_INVALID_ID = (
"not a valid STIX identifier, must match <object-type>--<UUIDv4>"
"not a valid STIX identifier, must match <object-type>--<UUID>: {}"
)
def _check_uuid(uuid_str, spec_version, interoperability):
"""
Check whether the given UUID string is valid with respect to the given STIX
spec version. STIX 2.0 requires UUIDv4; 2.1 only requires the RFC 4122
variant.
:param uuid_str: A UUID as a string
:param spec_version: The STIX spec version
:return: True if the UUID is valid, False if not
:raises ValueError: If uuid_str is malformed
"""
if interoperability:
return ID_REGEX_interoperability.match(uuid_str)
uuid_obj = uuid.UUID(uuid_str)
ok = uuid_obj.variant == uuid.RFC_4122
if ok and spec_version == "2.0":
ok = uuid_obj.version == 4
return ok
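The stdlib already exposes everything this check needs; a quick illustration of the two rules:

import uuid

u4 = uuid.uuid4()
u5 = uuid.uuid5(uuid.NAMESPACE_URL, "example")

# 2.0 rule: RFC 4122 variant AND version 4; 2.1 rule: RFC 4122 variant only.
print(u4.variant == uuid.RFC_4122, u4.version)  # True 4, acceptable for 2.0 and 2.1
print(u5.variant == uuid.RFC_4122, u5.version)  # True 5, acceptable for 2.1 only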
def _validate_id(id_, spec_version, required_prefix, interoperability):
"""
Check the STIX identifier for correctness, raise an exception if there are
errors.
:param id_: The STIX identifier
:param spec_version: The STIX specification version to use
:param required_prefix: The required prefix on the identifier, if any.
This function doesn't add a "--" suffix to the prefix, so callers must
add it if it is important. Pass None to skip the prefix check.
:raises ValueError: If there are any errors with the identifier
"""
if required_prefix:
if not id_.startswith(required_prefix):
raise ValueError("must start with '{}'.".format(required_prefix))
try:
if required_prefix:
uuid_part = id_[len(required_prefix):]
else:
idx = id_.index("--")
uuid_part = id_[idx+2:]
result = _check_uuid(uuid_part, spec_version, interoperability)
except ValueError:
# replace their ValueError with ours
raise ValueError(ERROR_INVALID_ID.format(id_))
if not result:
raise ValueError(ERROR_INVALID_ID.format(id_))
def _validate_type(type_, spec_version):
"""
Check the STIX type name for correctness, raise an exception if there are
errors.
:param type_: The STIX type name
:param spec_version: The STIX specification version to use
:raises ValueError: If there are any errors with the identifier
"""
if spec_version == "2.0":
if not re.match(TYPE_REGEX, type_):
raise ValueError(
"Invalid type name '%s': must only contain the "
"characters a-z (lowercase ASCII), 0-9, and hyphen (-)." %
type_,
)
else: # 2.1+
if not re.match(TYPE_21_REGEX, type_):
raise ValueError(
"Invalid type name '%s': must only contain the "
"characters a-z (lowercase ASCII), 0-9, and hyphen (-) "
"and must begin with an a-z character" % type_,
)
if len(type_) < 3 or len(type_) > 250:
raise ValueError(
"Invalid type name '%s': must be between 3 and 250 characters." % type_,
)
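A few illustrative inputs run through the check above (the helper is private, so calling it directly is only for demonstration):

from stix2.properties import _validate_type

for name, ver in [("x-acme-widget", "2.1"), ("0day-report", "2.1"), ("tb", "2.0")]:
    try:
        _validate_type(name, ver)
        print(name, ver, "-> ok")
    except ValueError as exc:
        print(name, ver, "-> rejected:", exc)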
@ -37,14 +130,15 @@ class Property(object):
``__init__()``.
Args:
required (bool): If ``True``, the property must be provided when creating an
object with that property. No default value exists for these properties.
(Default: ``False``)
required (bool): If ``True``, the property must be provided when
creating an object with that property. No default value exists for
these properties. (Default: ``False``)
fixed: This provides a constant default value. Users are free to
provide this value explicitly when constructing an object (which allows
you to copy **all** values from an existing object to a new object), but
if the user provides a value other than the ``fixed`` value, it will raise
an error. This is semantically equivalent to defining both:
provide this value explicitly when constructing an object (which
allows you to copy **all** values from an existing object to a new
object), but if the user provides a value other than the ``fixed``
value, it will raise an error. This is semantically equivalent to
defining both:
- a ``clean()`` function that checks if the value matches the fixed
value, and
@@ -55,29 +149,31 @@
- ``def clean(self, value) -> any:``
- Return a value that is valid for this property. If ``value`` is not
valid for this property, this will attempt to transform it first. If
``value`` is not valid and no such transformation is possible, it should
raise a ValueError.
``value`` is not valid and no such transformation is possible, it
should raise an exception.
- ``def default(self):``
- provide a default value for this property.
- ``default()`` can return the special value ``NOW`` to use the current
time. This is useful when several timestamps in the same object need
to use the same default value, so calling now() for each property--
likely several microseconds apart-- does not work.
time. This is useful when several timestamps in the same object
need to use the same default value, so calling now() for each
property-- likely several microseconds apart-- does not work.
Subclasses can instead provide a lambda function for ``default`` as a keyword
argument. ``clean`` should not be provided as a lambda since lambdas cannot
raise their own exceptions.
Subclasses can instead provide a lambda function for ``default`` as a
keyword argument. ``clean`` should not be provided as a lambda since
lambdas cannot raise their own exceptions.
When instantiating Properties, ``required`` and ``default`` should not be
used together. ``default`` implies that the property is required in the
specification so this function will be used to supply a value if none is
provided. ``required`` means that the user must provide this; it is
required in the specification and we can't or don't want to create a
default value.
When instantiating Properties, ``required`` and ``default`` should not be used
together. ``default`` implies that the property is required in the specification
so this function will be used to supply a value if none is provided.
``required`` means that the user must provide this; it is required in the
specification and we can't or don't want to create a default value.
"""
def _default_clean(self, value):
if value != self._fixed_value:
raise ValueError("must equal '{0}'.".format(self._fixed_value))
raise ValueError("must equal '{}'.".format(self._fixed_value))
return value
def __init__(self, required=False, fixed=None, default=None):
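A minimal sketch of the fixed/default semantics documented above. The ``__init__`` body is elided by this hunk, so the sketch assumes it wires ``fixed`` to ``_default_clean`` and a supplied callable to ``default``, as upstream python-stix2 does:

# Sketch only: assumes the elided __init__ wiring matches upstream python-stix2.
from stix2.properties import Property

type_prop = Property(fixed="bundle")
print(type_prop.clean("bundle"))            # "bundle"
# type_prop.clean("other") raises ValueError("must equal 'bundle'.")
ts_prop = Property(default=lambda: "2017-01-01T12:34:56.000Z")
print(ts_prop.default())                    # the shared default value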
@@ -136,7 +232,7 @@ class ListProperty(Property):
if type(self.contained) is EmbeddedObjectProperty:
obj_type = self.contained.type
elif type(self.contained).__name__ is 'STIXObjectProperty':
elif type(self.contained).__name__ == "STIXObjectProperty":
# ^ this way of checking doesn't require a circular import
# valid is already an instance of a python-stix2 class; no need
# to turn it into a dictionary and then pass it to the class
@@ -148,8 +244,13 @@
else:
obj_type = self.contained
if isinstance(valid, collections.Mapping):
if isinstance(valid, Mapping):
try:
valid._allow_custom
except AttributeError:
result.append(obj_type(**valid))
else:
result.append(obj_type(allow_custom=True, **valid))
else:
result.append(obj_type(valid))
@@ -163,30 +264,32 @@
class StringProperty(Property):
def __init__(self, **kwargs):
self.string_type = text_type
super(StringProperty, self).__init__(**kwargs)
def clean(self, value):
return self.string_type(value)
if not isinstance(value, string_types):
return text_type(value)
return value
class TypeProperty(Property):
def __init__(self, type):
def __init__(self, type, spec_version=stix2.DEFAULT_VERSION):
_validate_type(type, spec_version)
self.spec_version = spec_version
super(TypeProperty, self).__init__(fixed=type)
class IDProperty(Property):
def __init__(self, type):
def __init__(self, type, spec_version=stix2.DEFAULT_VERSION):
self.required_prefix = type + "--"
self.spec_version = spec_version
super(IDProperty, self).__init__()
def clean(self, value):
if not value.startswith(self.required_prefix):
raise ValueError("must start with '{0}'.".format(self.required_prefix))
if not ID_REGEX.match(value):
raise ValueError(ERROR_INVALID_ID)
interoperability = self.interoperability if hasattr(self, 'interoperability') and self.interoperability else False
_validate_id(value, self.spec_version, self.required_prefix, interoperability)
return value
def default(self):
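IDProperty.clean() now delegates to _validate_id(), so both the required type prefix and the UUID portion are checked against the property's spec version. A minimal sketch:

# Sketch only: IDProperty as defined above.
from stix2.properties import IDProperty

prop = IDProperty("indicator", spec_version="2.1")
print(prop.clean("indicator--44af6c39-c09b-49c5-9de2-394224b04982"))
# prop.clean("campaign--44af6c39-c09b-49c5-9de2-394224b04982") raises
# ValueError("must start with 'indicator--'.")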
@@ -195,21 +298,51 @@ class IDProperty(Property):
class IntegerProperty(Property):
def __init__(self, min=None, max=None, **kwargs):
self.min = min
self.max = max
super(IntegerProperty, self).__init__(**kwargs)
def clean(self, value):
try:
return int(value)
value = int(value)
except Exception:
raise ValueError("must be an integer.")
if self.min is not None and value < self.min:
msg = "minimum value is {}. received {}".format(self.min, value)
raise ValueError(msg)
if self.max is not None and value > self.max:
msg = "maximum value is {}. received {}".format(self.max, value)
raise ValueError(msg)
return value
class FloatProperty(Property):
def __init__(self, min=None, max=None, **kwargs):
self.min = min
self.max = max
super(FloatProperty, self).__init__(**kwargs)
def clean(self, value):
try:
return float(value)
value = float(value)
except Exception:
raise ValueError("must be a float.")
if self.min is not None and value < self.min:
msg = "minimum value is {}. received {}".format(self.min, value)
raise ValueError(msg)
if self.max is not None and value > self.max:
msg = "maximum value is {}. received {}".format(self.max, value)
raise ValueError(msg)
return value
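Both numeric properties gain optional ``min``/``max`` bounds that are checked after coercion. A minimal sketch:

# Sketch only: the bounds checking added above.
from stix2.properties import IntegerProperty

confidence = IntegerProperty(min=0, max=100)
print(confidence.clean("42"))    # 42, coerced to int
# confidence.clean(101) raises ValueError("maximum value is 100. received 101")
# confidence.clean(-1) raises ValueError("minimum value is 0. received -1")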
class BooleanProperty(Property):
@@ -217,8 +350,8 @@ class BooleanProperty(Property):
if isinstance(value, bool):
return value
trues = ['true', 't']
falses = ['false', 'f']
trues = ['true', 't', '1']
falses = ['false', 'f', '0']
try:
if value.lower() in trues:
return True
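The accepted string spellings now include "1" and "0" alongside "true"/"t" and "false"/"f"; the matching ``falses`` branch is elided from this hunk. A minimal sketch:

# Sketch only: BooleanProperty with the widened trues/falses lists above.
from stix2.properties import BooleanProperty

prop = BooleanProperty()
print(prop.clean("T"))    # True, case-insensitive match against trues
print(prop.clean("1"))    # True, newly accepted spelling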
@@ -235,52 +368,68 @@ class BooleanProperty(Property):
class TimestampProperty(Property):
def __init__(self, precision=None, **kwargs):
def __init__(self, precision="any", precision_constraint="exact", **kwargs):
self.precision = precision
self.precision_constraint = precision_constraint
super(TimestampProperty, self).__init__(**kwargs)
def clean(self, value):
return parse_into_datetime(value, self.precision)
return parse_into_datetime(
value, self.precision, self.precision_constraint,
)
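The single ``precision`` argument becomes a ``precision``/``precision_constraint`` pair that is passed straight through to parse_into_datetime(). A minimal sketch, assuming the "millisecond" and "min" values handled by parse_into_datetime() in this changeset:

# Sketch only: the precision handling itself is delegated to parse_into_datetime().
from stix2.properties import TimestampProperty

ts_prop = TimestampProperty(precision="millisecond", precision_constraint="min")
print(ts_prop.clean("2016-04-06T20:03:00Z"))   # an aware datetime honouring the precision settings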
class DictionaryProperty(Property):
def __init__(self, spec_version=stix2.DEFAULT_VERSION, **kwargs):
self.spec_version = spec_version
super(DictionaryProperty, self).__init__(**kwargs)
def clean(self, value):
try:
dictified = _get_dict(value)
except ValueError:
raise ValueError("The dictionary property must contain a dictionary")
if dictified == {}:
raise ValueError("The dictionary property must contain a non-empty dictionary")
for k in dictified.keys():
if self.spec_version == '2.0':
if len(k) < 3:
raise DictionaryKeyError(k, "shorter than 3 characters")
elif len(k) > 256:
raise DictionaryKeyError(k, "longer than 256 characters")
if not re.match('^[a-zA-Z0-9_-]+$', k):
raise DictionaryKeyError(k, "contains characters other than"
"lowercase a-z, uppercase A-Z, "
"numerals 0-9, hyphen (-), or "
"underscore (_)")
elif self.spec_version == '2.1':
if len(k) > 250:
raise DictionaryKeyError(k, "longer than 250 characters")
if not re.match(r"^[a-zA-Z0-9_-]+$", k):
msg = (
"contains characters other than lowercase a-z, "
"uppercase A-Z, numerals 0-9, hyphen (-), or "
"underscore (_)"
)
raise DictionaryKeyError(k, msg)
if len(dictified) < 1:
raise ValueError("must not be empty.")
return dictified
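Key validation is spec-version aware: 2.0 keys must be 3-256 characters, 2.1 keys at most 250, and both are restricted to the character class shown above. A minimal sketch:

# Sketch only: DictionaryProperty as defined above.
from stix2.properties import DictionaryProperty

prop = DictionaryProperty(spec_version="2.0")
print(prop.clean({"key_name": "value"}))   # returned as-is
# prop.clean({"a": "x"}) raises DictionaryKeyError ('a' is shorter than 3 characters under 2.0)
# prop.clean({}) is rejected as an empty dictionary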
HASHES_REGEX = {
"MD5": ("^[a-fA-F0-9]{32}$", "MD5"),
"MD6": ("^[a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{56}|[a-fA-F0-9]{64}|[a-fA-F0-9]{96}|[a-fA-F0-9]{128}$", "MD6"),
"RIPEMD160": ("^[a-fA-F0-9]{40}$", "RIPEMD-160"),
"SHA1": ("^[a-fA-F0-9]{40}$", "SHA-1"),
"SHA224": ("^[a-fA-F0-9]{56}$", "SHA-224"),
"SHA256": ("^[a-fA-F0-9]{64}$", "SHA-256"),
"SHA384": ("^[a-fA-F0-9]{96}$", "SHA-384"),
"SHA512": ("^[a-fA-F0-9]{128}$", "SHA-512"),
"SHA3224": ("^[a-fA-F0-9]{56}$", "SHA3-224"),
"SHA3256": ("^[a-fA-F0-9]{64}$", "SHA3-256"),
"SHA3384": ("^[a-fA-F0-9]{96}$", "SHA3-384"),
"SHA3512": ("^[a-fA-F0-9]{128}$", "SHA3-512"),
"SSDEEP": ("^[a-zA-Z0-9/+:.]{1,128}$", "ssdeep"),
"WHIRLPOOL": ("^[a-fA-F0-9]{128}$", "WHIRLPOOL"),
"MD5": (r"^[a-fA-F0-9]{32}$", "MD5"),
"MD6": (r"^[a-fA-F0-9]{32}|[a-fA-F0-9]{40}|[a-fA-F0-9]{56}|[a-fA-F0-9]{64}|[a-fA-F0-9]{96}|[a-fA-F0-9]{128}$", "MD6"),
"RIPEMD160": (r"^[a-fA-F0-9]{40}$", "RIPEMD-160"),
"SHA1": (r"^[a-fA-F0-9]{40}$", "SHA-1"),
"SHA224": (r"^[a-fA-F0-9]{56}$", "SHA-224"),
"SHA256": (r"^[a-fA-F0-9]{64}$", "SHA-256"),
"SHA384": (r"^[a-fA-F0-9]{96}$", "SHA-384"),
"SHA512": (r"^[a-fA-F0-9]{128}$", "SHA-512"),
"SHA3224": (r"^[a-fA-F0-9]{56}$", "SHA3-224"),
"SHA3256": (r"^[a-fA-F0-9]{64}$", "SHA3-256"),
"SHA3384": (r"^[a-fA-F0-9]{96}$", "SHA3-384"),
"SHA3512": (r"^[a-fA-F0-9]{128}$", "SHA3-512"),
"SSDEEP": (r"^[a-zA-Z0-9/+:.]{1,128}$", "SSDEEP"),
"WHIRLPOOL": (r"^[a-fA-F0-9]{128}$", "WHIRLPOOL"),
"TLSH": (r"^[a-fA-F0-9]{70}$", "TLSH"),
}
@@ -288,12 +437,14 @@ class HashesProperty(DictionaryProperty):
def clean(self, value):
clean_dict = super(HashesProperty, self).clean(value)
for k, v in clean_dict.items():
for k, v in copy.deepcopy(clean_dict).items():
key = k.upper().replace('-', '')
if key in HASHES_REGEX:
vocab_key = HASHES_REGEX[key][1]
if vocab_key == "SSDEEP" and self.spec_version == "2.0":
vocab_key = vocab_key.lower()
if not re.match(HASHES_REGEX[key][0], v):
raise ValueError("'%s' is not a valid %s hash" % (v, vocab_key))
raise ValueError("'{0}' is not a valid {1} hash".format(v, vocab_key))
if k != vocab_key:
clean_dict[vocab_key] = clean_dict[k]
del clean_dict[k]
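HashesProperty.clean() folds keys onto the vocabulary spelling from HASHES_REGEX (with the 2.0-only lowercase "ssdeep" exception) and regex-checks each value. A minimal sketch:

# Sketch only: HashesProperty as defined above.
from stix2.properties import HashesProperty

prop = HashesProperty(spec_version="2.1")
print(prop.clean({"sha-256": "aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f"}))
# -> {'SHA-256': 'aec07...'}; a value like {"MD5": "zzz"} raises ValueError ("'zzz' is not a valid MD5 hash")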
@@ -313,33 +464,90 @@ class BinaryProperty(Property):
class HexProperty(Property):
def clean(self, value):
if not re.match('^([a-fA-F0-9]{2})+$', value):
if not re.match(r"^([a-fA-F0-9]{2})+$", value):
raise ValueError("must contain an even number of hexadecimal characters")
return value
class ReferenceProperty(Property):
def __init__(self, type=None, **kwargs):
def __init__(self, valid_types=None, invalid_types=None, spec_version=stix2.DEFAULT_VERSION, **kwargs):
"""
References sometimes must be to a specific object type.
"""
self.type = type
self.spec_version = spec_version
# These checks need to be done prior to the STIX object finishing construction
# and thus we can't use base.py's _check_mutually_exclusive_properties()
# in the typical location of _check_object_constraints() in sdo.py
if valid_types and invalid_types:
raise MutuallyExclusivePropertiesError(self.__class__, ['invalid_types', 'valid_types'])
elif valid_types is None and invalid_types is None:
raise MissingPropertiesError(self.__class__, ['invalid_types', 'valid_types'])
if valid_types and type(valid_types) is not list:
valid_types = [valid_types]
elif invalid_types and type(invalid_types) is not list:
invalid_types = [invalid_types]
self.valid_types = valid_types
self.invalid_types = invalid_types
super(ReferenceProperty, self).__init__(**kwargs)
def clean(self, value):
if isinstance(value, _STIXBase):
value = value.id
value = str(value)
if self.type:
if not value.startswith(self.type):
raise ValueError("must start with '{0}'.".format(self.type))
if not ID_REGEX.match(value):
raise ValueError(ERROR_INVALID_ID)
possible_prefix = value[:value.index('--')]
if self.valid_types:
ref_valid_types = enumerate_types(self.valid_types, 'v' + self.spec_version.replace(".", ""))
if possible_prefix in ref_valid_types or self.allow_custom:
required_prefix = possible_prefix + '--'
else:
raise ValueError("The type-specifying prefix '%s' for this property is not valid" % (possible_prefix))
elif self.invalid_types:
ref_invalid_types = enumerate_types(self.invalid_types, 'v' + self.spec_version.replace(".", ""))
if possible_prefix not in ref_invalid_types:
required_prefix = possible_prefix + '--'
else:
raise ValueError("An invalid type-specifying prefix '%s' was specified for this property" % (possible_prefix))
interoperability = self.interoperability if hasattr(self, 'interoperability') and self.interoperability else False
_validate_id(value, self.spec_version, required_prefix, interoperability)
return value
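The rewritten ReferenceProperty takes either a whitelist (``valid_types``) or a blacklist (``invalid_types``) of referenced types, expands category keywords via enumerate_types() below, and then runs the full _validate_id() check. A minimal sketch; ``allow_custom`` is normally set on the property by the enclosing object machinery, so it is set by hand here for illustration:

# Sketch only: ReferenceProperty as defined above; allow_custom set manually for illustration.
from stix2.properties import ReferenceProperty

created_by = ReferenceProperty(valid_types="identity", spec_version="2.1")
created_by.allow_custom = False
print(created_by.clean("identity--f431f809-377b-45e0-aa1c-6a4751cae5ff"))
# a "malware--..." id raises ValueError: the 'malware' prefix is not among valid_types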
SELECTOR_REGEX = re.compile("^[a-z0-9_-]{3,250}(\\.(\\[\\d+\\]|[a-z0-9_-]{1,250}))*$")
def enumerate_types(types, spec_version):
"""
`types` is meant to be a list; it may contain specific object types and/or
any of the words "SCO", "SDO", or "SRO".
Since "SCO", "SDO", and "SRO" are general categories that encompass various
specific object types, each of those words is removed from `return_types`
once it has been processed, so that objects cannot mistakenly be created
with the literal types "SCO", "SDO", or "SRO".
"""
return_types = []
return_types += types
if "SDO" in types:
return_types.remove("SDO")
return_types += STIX2_OBJ_MAPS[spec_version]['objects'].keys()
if "SCO" in types:
return_types.remove("SCO")
return_types += STIX2_OBJ_MAPS[spec_version]['observables'].keys()
if "SRO" in types:
return_types.remove("SRO")
return_types += ['relationship', 'sighting']
return return_types
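A minimal sketch of the expansion performed by enumerate_types(); the version argument is the 'vXX' form used as a key into STIX2_OBJ_MAPS:

# Sketch only: enumerate_types() as defined above.
from stix2.properties import enumerate_types

print(enumerate_types(["SRO", "identity"], "v21"))
# -> ['identity', 'relationship', 'sighting']; the "SRO" placeholder itself is removed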
SELECTOR_REGEX = re.compile(r"^[a-z0-9_-]{3,250}(\.(\[\d+\]|[a-z0-9_-]{1,250}))*$")
class SelectorProperty(Property):
@@ -369,7 +577,7 @@ class EmbeddedObjectProperty(Property):
if type(value) is dict:
value = self.type(**value)
elif not isinstance(value, self.type):
raise ValueError("must be of type %s." % self.type.__name__)
raise ValueError("must be of type {}.".format(self.type.__name__))
return value
@@ -382,18 +590,137 @@ class EnumProperty(StringProperty):
super(EnumProperty, self).__init__(**kwargs)
def clean(self, value):
value = super(EnumProperty, self).clean(value)
if value not in self.allowed:
raise ValueError("value '%s' is not valid for this enumeration." % value)
return self.string_type(value)
cleaned_value = super(EnumProperty, self).clean(value)
if cleaned_value not in self.allowed:
raise ValueError("value '{}' is not valid for this enumeration.".format(cleaned_value))
return cleaned_value
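A minimal sketch of the tightened EnumProperty.clean(), assuming the (elided) constructor accepts the allowed values as in upstream python-stix2:

# Sketch only: EnumProperty with an assumed ``allowed`` constructor argument.
from stix2.properties import EnumProperty

prop = EnumProperty(allowed=["white", "green", "amber", "red"])
print(prop.clean("amber"))   # "amber"
# prop.clean("purple") raises ValueError("value 'purple' is not valid for this enumeration.")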
class PatternProperty(StringProperty):
pass
class ObservableProperty(Property):
"""Property for holding Cyber Observable Objects.
"""
def __init__(self, spec_version=stix2.DEFAULT_VERSION, allow_custom=False, *args, **kwargs):
self.allow_custom = allow_custom
self.spec_version = spec_version
super(ObservableProperty, self).__init__(*args, **kwargs)
def clean(self, value):
str_value = super(PatternProperty, self).clean(value)
errors = run_validator(str_value)
if errors:
raise ValueError(str(errors[0]))
try:
dictified = _get_dict(value)
# get a deep copy, since we are going to modify the dict and might
# otherwise modify the original dict: _get_dict() does not return a
# new dict when it is passed a dict
dictified = copy.deepcopy(dictified)
except ValueError:
raise ValueError("The observable property must contain a dictionary")
if dictified == {}:
raise ValueError("The observable property must contain a non-empty dictionary")
return self.string_type(value)
valid_refs = dict((k, v['type']) for (k, v) in dictified.items())
for key, obj in dictified.items():
parsed_obj = parse_observable(
obj,
valid_refs,
allow_custom=self.allow_custom,
version=self.spec_version,
)
dictified[key] = parsed_obj
return dictified
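ObservableProperty.clean() deep-copies the incoming mapping and parses every entry with parse_observable(), so the caller gets typed observable objects back. A minimal sketch, assuming the class lives in stix2.properties as this hunk suggests:

# Sketch only: ObservableProperty as defined above.
from stix2.properties import ObservableProperty

prop = ObservableProperty(spec_version="2.0")
cleaned = prop.clean({"0": {"type": "ipv4-addr", "value": "198.51.100.3"}})
print(type(cleaned["0"]))   # a typed IPv4 address observable rather than a plain dict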
class ExtensionsProperty(DictionaryProperty):
"""Property for representing extensions on Observable objects.
"""
def __init__(self, spec_version=stix2.DEFAULT_VERSION, allow_custom=False, enclosing_type=None, required=False):
self.allow_custom = allow_custom
self.enclosing_type = enclosing_type
super(ExtensionsProperty, self).__init__(spec_version=spec_version, required=required)
def clean(self, value):
try:
dictified = _get_dict(value)
# get a deep copy, since we are going to modify the dict and might
# otherwise modify the original dict: _get_dict() does not return a
# new dict when it is passed a dict
dictified = copy.deepcopy(dictified)
except ValueError:
raise ValueError("The extensions property must contain a dictionary")
v = 'v' + self.spec_version.replace('.', '')
specific_type_map = STIX2_OBJ_MAPS[v]['observable-extensions'].get(self.enclosing_type, {})
for key, subvalue in dictified.items():
if key in specific_type_map:
cls = specific_type_map[key]
if isinstance(subvalue, (dict, defaultdict)):
if self.allow_custom:
subvalue['allow_custom'] = True
dictified[key] = cls(**subvalue)
else:
dictified[key] = cls(**subvalue)
elif type(subvalue) is cls:
# If already an instance of an _Extension class, assume it's valid
dictified[key] = subvalue
else:
raise ValueError("Cannot determine extension type.")
else:
if self.allow_custom:
dictified[key] = subvalue
else:
raise CustomContentError("Can't parse unknown extension type: {}".format(key))
return dictified
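ExtensionsProperty.clean() resolves each extension key against the registry for the enclosing observable type and instantiates the registered class; unknown keys are only kept when allow_custom is set. A minimal sketch, under the same import-path assumption:

# Sketch only: ExtensionsProperty as defined above.
from stix2.properties import ExtensionsProperty

prop = ExtensionsProperty(spec_version="2.1", enclosing_type="network-traffic")
cleaned = prop.clean({"tcp-ext": {"src_flags_hex": "00000002"}})
print(type(cleaned["tcp-ext"]))   # the class registered for 'tcp-ext' on network-traffic
# an unregistered key raises CustomContentError unless allow_custom=True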
class STIXObjectProperty(Property):
def __init__(self, spec_version=stix2.DEFAULT_VERSION, allow_custom=False, interoperability=False, *args, **kwargs):
self.allow_custom = allow_custom
self.spec_version = spec_version
self.interoperability = interoperability
super(STIXObjectProperty, self).__init__(*args, **kwargs)
def clean(self, value):
# Any STIX Object (SDO, SRO, or Marking Definition) can be added to
# a bundle with no further checks.
if any(x in ('_DomainObject', '_RelationshipObject', 'MarkingDefinition')
for x in get_class_hierarchy_names(value)):
# A simple "is this a spec version 2.1+ object" test. For now,
# limit 2.0 bundles to 2.0 objects. It's not possible yet to
# have validation co-constraints among properties, e.g. have
# validation here depend on the value of another property
# (spec_version). So this is a hack, and not technically spec-
# compliant.
if 'spec_version' in value and self.spec_version == '2.0':
raise ValueError(
"Spec version 2.0 bundles don't yet support "
"containing objects of a different spec "
"version.",
)
return value
try:
dictified = _get_dict(value)
except ValueError:
raise ValueError("This property may only contain a dictionary or object")
if dictified == {}:
raise ValueError("This property may only contain a non-empty dictionary or object")
if 'type' in dictified and dictified['type'] == 'bundle':
raise ValueError("This property may not contain a Bundle object")
if 'spec_version' in dictified and self.spec_version == '2.0':
# See above comment regarding spec_version.
raise ValueError(
"Spec version 2.0 bundles don't yet support "
"containing objects of a different spec version.",
)
parsed_obj = parse(dictified, allow_custom=self.allow_custom, interoperability=self.interoperability)
return parsed_obj
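STIXObjectProperty.clean() passes already-constructed SDOs/SROs/markings straight through, parses plain dictionaries with parse(), rejects nested bundles, and refuses 2.1-style content inside a 2.0 bundle. A minimal sketch, assuming the class is importable from stix2.properties in this changeset:

# Sketch only: STIXObjectProperty as defined above.
from stix2.properties import STIXObjectProperty

prop = STIXObjectProperty(spec_version="2.0")
obj = prop.clean({"type": "identity", "name": "ACME", "identity_class": "organization"})
print(obj.id)   # parse() produced a full Identity object
# a dict with "type": "bundle" is rejected, as is an object carrying spec_version in a 2.0 bundle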


@@ -1,82 +0,0 @@
import datetime as dt
import pytest
import pytz
import stix2
from .constants import ATTACK_PATTERN_ID
EXPECTED = """{
"type": "attack-pattern",
"id": "attack-pattern--0c7b5b88-8ff7-4a4d-aa9d-feb398cd0061",
"created": "2016-05-12T08:17:27.000Z",
"modified": "2016-05-12T08:17:27.000Z",
"name": "Spear Phishing",
"description": "...",
"external_references": [
{
"source_name": "capec",
"external_id": "CAPEC-163"
}
]
}"""
def test_attack_pattern_example():
ap = stix2.AttackPattern(
id="attack-pattern--0c7b5b88-8ff7-4a4d-aa9d-feb398cd0061",
created="2016-05-12T08:17:27.000Z",
modified="2016-05-12T08:17:27.000Z",
name="Spear Phishing",
external_references=[{
"source_name": "capec",
"external_id": "CAPEC-163"
}],
description="...",
)
assert str(ap) == EXPECTED
@pytest.mark.parametrize("data", [
EXPECTED,
{
"type": "attack-pattern",
"id": "attack-pattern--0c7b5b88-8ff7-4a4d-aa9d-feb398cd0061",
"created": "2016-05-12T08:17:27.000Z",
"modified": "2016-05-12T08:17:27.000Z",
"description": "...",
"external_references": [
{
"external_id": "CAPEC-163",
"source_name": "capec"
}
],
"name": "Spear Phishing",
},
])
def test_parse_attack_pattern(data):
ap = stix2.parse(data)
assert ap.type == 'attack-pattern'
assert ap.id == ATTACK_PATTERN_ID
assert ap.created == dt.datetime(2016, 5, 12, 8, 17, 27, tzinfo=pytz.utc)
assert ap.modified == dt.datetime(2016, 5, 12, 8, 17, 27, tzinfo=pytz.utc)
assert ap.description == "..."
assert ap.external_references[0].external_id == 'CAPEC-163'
assert ap.external_references[0].source_name == 'capec'
assert ap.name == "Spear Phishing"
def test_attack_pattern_invalid_labels():
with pytest.raises(stix2.exceptions.InvalidValueError):
stix2.AttackPattern(
id="attack-pattern--0c7b5b88-8ff7-4a4d-aa9d-feb398cd0061",
created="2016-05-12T08:17:27Z",
modified="2016-05-12T08:17:27Z",
name="Spear Phishing",
labels=1
)
# TODO: Add other examples


@@ -1,214 +0,0 @@
import json
import pytest
import stix2
EXPECTED_BUNDLE = """{
"type": "bundle",
"id": "bundle--00000000-0000-4000-8000-000000000007",
"spec_version": "2.0",
"objects": [
{
"type": "indicator",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"pattern": "[file:hashes.MD5 = 'd41d8cd98f00b204e9800998ecf8427e']",
"valid_from": "2017-01-01T12:34:56Z",
"labels": [
"malicious-activity"
]
},
{
"type": "malware",
"id": "malware--00000000-0000-4000-8000-000000000003",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"name": "Cryptolocker",
"labels": [
"ransomware"
]
},
{
"type": "relationship",
"id": "relationship--00000000-0000-4000-8000-000000000005",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"relationship_type": "indicates",
"source_ref": "indicator--a740531e-63ff-4e49-a9e1-a0a3eed0e3e7",
"target_ref": "malware--9c4638ec-f1de-4ddb-abf4-1b760417654e"
}
]
}"""
EXPECTED_BUNDLE_DICT = {
"type": "bundle",
"id": "bundle--00000000-0000-4000-8000-000000000007",
"spec_version": "2.0",
"objects": [
{
"type": "indicator",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"pattern": "[file:hashes.MD5 = 'd41d8cd98f00b204e9800998ecf8427e']",
"valid_from": "2017-01-01T12:34:56Z",
"labels": [
"malicious-activity"
]
},
{
"type": "malware",
"id": "malware--00000000-0000-4000-8000-000000000003",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"name": "Cryptolocker",
"labels": [
"ransomware"
]
},
{
"type": "relationship",
"id": "relationship--00000000-0000-4000-8000-000000000005",
"created": "2017-01-01T12:34:56.000Z",
"modified": "2017-01-01T12:34:56.000Z",
"relationship_type": "indicates",
"source_ref": "indicator--a740531e-63ff-4e49-a9e1-a0a3eed0e3e7",
"target_ref": "malware--9c4638ec-f1de-4ddb-abf4-1b760417654e"
}
]
}
def test_empty_bundle():
bundle = stix2.Bundle()
assert bundle.type == "bundle"
assert bundle.id.startswith("bundle--")
assert bundle.spec_version == "2.0"
with pytest.raises(AttributeError):
assert bundle.objects
def test_bundle_with_wrong_type():
with pytest.raises(stix2.exceptions.InvalidValueError) as excinfo:
stix2.Bundle(type="not-a-bundle")
assert excinfo.value.cls == stix2.Bundle
assert excinfo.value.prop_name == "type"
assert excinfo.value.reason == "must equal 'bundle'."
assert str(excinfo.value) == "Invalid value for Bundle 'type': must equal 'bundle'."
def test_bundle_id_must_start_with_bundle():
with pytest.raises(stix2.exceptions.InvalidValueError) as excinfo:
stix2.Bundle(id='my-prefix--')
assert excinfo.value.cls == stix2.Bundle
assert excinfo.value.prop_name == "id"
assert excinfo.value.reason == "must start with 'bundle--'."
assert str(excinfo.value) == "Invalid value for Bundle 'id': must start with 'bundle--'."
def test_bundle_with_wrong_spec_version():
with pytest.raises(stix2.exceptions.InvalidValueError) as excinfo:
stix2.Bundle(spec_version="1.2")
assert excinfo.value.cls == stix2.Bundle
assert excinfo.value.prop_name == "spec_version"
assert excinfo.value.reason == "must equal '2.0'."
assert str(excinfo.value) == "Invalid value for Bundle 'spec_version': must equal '2.0'."
def test_create_bundle1(indicator, malware, relationship):
bundle = stix2.Bundle(objects=[indicator, malware, relationship])
assert str(bundle) == EXPECTED_BUNDLE
assert bundle.serialize(pretty=True) == EXPECTED_BUNDLE
def test_create_bundle2(indicator, malware, relationship):
bundle = stix2.Bundle(objects=[indicator, malware, relationship])
assert json.loads(bundle.serialize()) == EXPECTED_BUNDLE_DICT
def test_create_bundle_with_positional_args(indicator, malware, relationship):
bundle = stix2.Bundle(indicator, malware, relationship)
assert str(bundle) == EXPECTED_BUNDLE
def test_create_bundle_with_positional_listarg(indicator, malware, relationship):
bundle = stix2.Bundle([indicator, malware, relationship])
assert str(bundle) == EXPECTED_BUNDLE
def test_create_bundle_with_listarg_and_positional_arg(indicator, malware, relationship):
bundle = stix2.Bundle([indicator, malware], relationship)
assert str(bundle) == EXPECTED_BUNDLE
def test_create_bundle_with_listarg_and_kwarg(indicator, malware, relationship):
bundle = stix2.Bundle([indicator, malware], objects=[relationship])
assert str(bundle) == EXPECTED_BUNDLE
def test_create_bundle_with_arg_listarg_and_kwarg(indicator, malware, relationship):
bundle = stix2.Bundle([indicator], malware, objects=[relationship])
assert str(bundle) == EXPECTED_BUNDLE
def test_create_bundle_invalid(indicator, malware, relationship):
with pytest.raises(ValueError) as excinfo:
stix2.Bundle(objects=[1])
assert excinfo.value.reason == "This property may only contain a dictionary or object"
with pytest.raises(ValueError) as excinfo:
stix2.Bundle(objects=[{}])
assert excinfo.value.reason == "This property may only contain a non-empty dictionary or object"
with pytest.raises(ValueError) as excinfo:
stix2.Bundle(objects=[{'type': 'bundle'}])
assert excinfo.value.reason == 'This property may not contain a Bundle object'
@pytest.mark.parametrize("version", ["2.0"])
def test_parse_bundle(version):
bundle = stix2.parse(EXPECTED_BUNDLE, version=version)
assert bundle.type == "bundle"
assert bundle.id.startswith("bundle--")
assert bundle.spec_version == "2.0"
assert type(bundle.objects[0]) is stix2.Indicator
assert bundle.objects[0].type == 'indicator'
assert bundle.objects[1].type == 'malware'
assert bundle.objects[2].type == 'relationship'
def test_parse_unknown_type():
unknown = {
"type": "other",
"id": "other--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
"created": "2016-04-06T20:03:00Z",
"modified": "2016-04-06T20:03:00Z",
"created_by_ref": "identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
"description": "Campaign by Green Group against a series of targets in the financial services sector.",
"name": "Green Group Attacks Against Finance",
}
with pytest.raises(stix2.exceptions.ParseError) as excinfo:
stix2.parse(unknown)
assert str(excinfo.value) == "Can't parse unknown object type 'other'! For custom types, use the CustomObject decorator."
def test_stix_object_property():
prop = stix2.core.STIXObjectProperty()
identity = stix2.Identity(name="test", identity_class="individual")
assert prop.clean(identity) is identity


@ -1,57 +0,0 @@
import datetime as dt
import pytest
import pytz
import stix2
from .constants import CAMPAIGN_ID
EXPECTED = """{
"type": "campaign",
"id": "campaign--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
"created_by_ref": "identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
"created": "2016-04-06T20:03:00.000Z",
"modified": "2016-04-06T20:03:00.000Z",
"name": "Green Group Attacks Against Finance",
"description": "Campaign by Green Group against a series of targets in the financial services sector."
}"""
def test_campaign_example():
campaign = stix2.Campaign(
id="campaign--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
created_by_ref="identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
created="2016-04-06T20:03:00Z",
modified="2016-04-06T20:03:00Z",
name="Green Group Attacks Against Finance",
description="Campaign by Green Group against a series of targets in the financial services sector."
)
assert str(campaign) == EXPECTED
@pytest.mark.parametrize("data", [
EXPECTED,
{
"type": "campaign",
"id": "campaign--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
"created": "2016-04-06T20:03:00Z",
"modified": "2016-04-06T20:03:00Z",
"created_by_ref": "identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
"description": "Campaign by Green Group against a series of targets in the financial services sector.",
"name": "Green Group Attacks Against Finance",
},
])
def test_parse_campaign(data):
cmpn = stix2.parse(data)
assert cmpn.type == 'campaign'
assert cmpn.id == CAMPAIGN_ID
assert cmpn.created == dt.datetime(2016, 4, 6, 20, 3, 0, tzinfo=pytz.utc)
assert cmpn.modified == dt.datetime(2016, 4, 6, 20, 3, 0, tzinfo=pytz.utc)
assert cmpn.created_by_ref == "identity--f431f809-377b-45e0-aa1c-6a4751cae5ff"
assert cmpn.description == "Campaign by Green Group against a series of targets in the financial services sector."
assert cmpn.name == "Green Group Attacks Against Finance"
# TODO: Add other examples


@@ -1,529 +0,0 @@
import json
import os
import shutil
import pytest
from stix2 import (Bundle, Campaign, CustomObject, FileSystemSink,
FileSystemSource, FileSystemStore, Filter, Identity,
Indicator, Malware, Relationship, properties)
from stix2.test.constants import (CAMPAIGN_ID, CAMPAIGN_KWARGS, IDENTITY_ID,
IDENTITY_KWARGS, INDICATOR_ID,
INDICATOR_KWARGS, MALWARE_ID, MALWARE_KWARGS,
RELATIONSHIP_IDS)
FS_PATH = os.path.join(os.path.dirname(os.path.realpath(__file__)), "stix2_data")
@pytest.fixture
def fs_store():
# create
yield FileSystemStore(FS_PATH)
# remove campaign dir
shutil.rmtree(os.path.join(FS_PATH, "campaign"), True)
@pytest.fixture
def fs_source():
# create
fs = FileSystemSource(FS_PATH)
assert fs.stix_dir == FS_PATH
yield fs
# remove campaign dir
shutil.rmtree(os.path.join(FS_PATH, "campaign"), True)
@pytest.fixture
def fs_sink():
# create
fs = FileSystemSink(FS_PATH)
assert fs.stix_dir == FS_PATH
yield fs
# remove campaign dir
shutil.rmtree(os.path.join(FS_PATH, "campaign"), True)
@pytest.fixture
def bad_json_files():
# create erroneous JSON files for tests to make sure handled gracefully
with open(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-non-json.txt"), "w+") as f:
f.write("Im not a JSON file")
with open(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-bad-json.json"), "w+") as f:
f.write("Im not a JSON formatted file")
yield True # dummy yield so can have teardown
os.remove(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-non-json.txt"))
os.remove(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-bad-json.json"))
@pytest.fixture
def bad_stix_files():
# create erroneous STIX JSON files for tests to make sure handled correctly
# bad STIX object
stix_obj = {
"id": "intrusion-set--test-bad-stix",
"spec_version": "2.0"
# no "type" field
}
with open(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-non-stix.json"), "w+") as f:
f.write(json.dumps(stix_obj))
yield True # dummy yield so can have teardown
os.remove(os.path.join(FS_PATH, "intrusion-set", "intrusion-set--test-non-stix.json"))
@pytest.fixture(scope='module')
def rel_fs_store():
cam = Campaign(id=CAMPAIGN_ID, **CAMPAIGN_KWARGS)
idy = Identity(id=IDENTITY_ID, **IDENTITY_KWARGS)
ind = Indicator(id=INDICATOR_ID, **INDICATOR_KWARGS)
mal = Malware(id=MALWARE_ID, **MALWARE_KWARGS)
rel1 = Relationship(ind, 'indicates', mal, id=RELATIONSHIP_IDS[0])
rel2 = Relationship(mal, 'targets', idy, id=RELATIONSHIP_IDS[1])
rel3 = Relationship(cam, 'uses', mal, id=RELATIONSHIP_IDS[2])
stix_objs = [cam, idy, ind, mal, rel1, rel2, rel3]
fs = FileSystemStore(FS_PATH)
for o in stix_objs:
fs.add(o)
yield fs
for o in stix_objs:
os.remove(os.path.join(FS_PATH, o.type, o.id + '.json'))
def test_filesystem_source_nonexistent_folder():
with pytest.raises(ValueError) as excinfo:
FileSystemSource('nonexistent-folder')
assert "for STIX data does not exist" in str(excinfo)
def test_filesystem_sink_nonexistent_folder():
with pytest.raises(ValueError) as excinfo:
FileSystemSink('nonexistent-folder')
assert "for STIX data does not exist" in str(excinfo)
def test_filesystem_source_bad_json_file(fs_source, bad_json_files):
# this tests the handling of two bad json files
# - one file should just be skipped (silently) since it has a ".txt" extension
# - one file should be parsed and raise an Exception because it is not valid JSON
try:
fs_source.get("intrusion-set--test-bad-json")
except TypeError as e:
assert "intrusion-set--test-bad-json" in str(e)
assert "could either not be parsed to JSON or was not valid STIX JSON" in str(e)
def test_filesystem_source_bad_stix_file(fs_source, bad_stix_files):
# this tests handling of bad STIX json object
try:
fs_source.get("intrusion-set--test-non-stix")
except TypeError as e:
assert "intrusion-set--test-non-stix" in str(e)
assert "could either not be parsed to JSON or was not valid STIX JSON" in str(e)
def test_filesytem_source_get_object(fs_source):
# get object
mal = fs_source.get("malware--6b616fc1-1505-48e3-8b2c-0d19337bff38")
assert mal.id == "malware--6b616fc1-1505-48e3-8b2c-0d19337bff38"
assert mal.name == "Rover"
def test_filesytem_source_get_nonexistent_object(fs_source):
ind = fs_source.get("indicator--6b616fc1-1505-48e3-8b2c-0d19337bff38")
assert ind is None
def test_filesytem_source_all_versions(fs_source):
# all versions - (currently not a true all-versions call, as FileSystem can't have multiple versions)
id_ = fs_source.get("identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5")
assert id_.id == "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5"
assert id_.name == "The MITRE Corporation"
assert id_.type == "identity"
def test_filesytem_source_query_single(fs_source):
# query2
is_2 = fs_source.query([Filter("external_references.external_id", '=', "T1027")])
assert len(is_2) == 1
is_2 = is_2[0]
assert is_2.id == "attack-pattern--b3d682b6-98f2-4fb0-aa3b-b4df007ca70a"
assert is_2.type == "attack-pattern"
def test_filesytem_source_query_multiple(fs_source):
# query
intrusion_sets = fs_source.query([Filter("type", '=', "intrusion-set")])
assert len(intrusion_sets) == 2
assert "intrusion-set--a653431d-6a5e-4600-8ad3-609b5af57064" in [is_.id for is_ in intrusion_sets]
assert "intrusion-set--f3bdec95-3d62-42d9-a840-29630f6cdc1a" in [is_.id for is_ in intrusion_sets]
is_1 = [is_ for is_ in intrusion_sets if is_.id == "intrusion-set--f3bdec95-3d62-42d9-a840-29630f6cdc1a"][0]
assert "DragonOK" in is_1.aliases
assert len(is_1.external_references) == 4
def test_filesystem_sink_add_python_stix_object(fs_sink, fs_source):
# add python stix object
camp1 = Campaign(name="Hannibal",
objective="Targeting Italian and Spanish Diplomat internet accounts",
aliases=["War Elephant"])
fs_sink.add(camp1)
assert os.path.exists(os.path.join(FS_PATH, "campaign", camp1.id + ".json"))
camp1_r = fs_source.get(camp1.id)
assert camp1_r.id == camp1.id
assert camp1_r.name == "Hannibal"
assert "War Elephant" in camp1_r.aliases
os.remove(os.path.join(FS_PATH, "campaign", camp1_r.id + ".json"))
def test_filesystem_sink_add_stix_object_dict(fs_sink, fs_source):
# add stix object dict
camp2 = {
"name": "Aurelius",
"type": "campaign",
"objective": "German and French Intelligence Services",
"aliases": ["Purple Robes"],
"id": "campaign--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f",
"created": "2017-05-31T21:31:53.197755Z"
}
fs_sink.add(camp2)
assert os.path.exists(os.path.join(FS_PATH, "campaign", camp2["id"] + ".json"))
camp2_r = fs_source.get(camp2["id"])
assert camp2_r.id == camp2["id"]
assert camp2_r.name == camp2["name"]
assert "Purple Robes" in camp2_r.aliases
os.remove(os.path.join(FS_PATH, "campaign", camp2_r.id + ".json"))
def test_filesystem_sink_add_stix_bundle_dict(fs_sink, fs_source):
# add stix bundle dict
bund = {
"type": "bundle",
"id": "bundle--040ae5ec-2e91-4e94-b075-bc8b368e8ca3",
"spec_version": "2.0",
"objects": [
{
"name": "Atilla",
"type": "campaign",
"objective": "Bulgarian, Albanian and Romanian Intelligence Services",
"aliases": ["Huns"],
"id": "campaign--b8f86161-ccae-49de-973a-4ca320c62478",
"created": "2017-05-31T21:31:53.197755Z"
}
]
}
fs_sink.add(bund)
assert os.path.exists(os.path.join(FS_PATH, "campaign", bund["objects"][0]["id"] + ".json"))
camp3_r = fs_source.get(bund["objects"][0]["id"])
assert camp3_r.id == bund["objects"][0]["id"]
assert camp3_r.name == bund["objects"][0]["name"]
assert "Huns" in camp3_r.aliases
os.remove(os.path.join(FS_PATH, "campaign", camp3_r.id + ".json"))
def test_filesystem_sink_add_json_stix_object(fs_sink, fs_source):
# add json-encoded stix obj
camp4 = '{"type": "campaign", "id":"campaign--6a6ca372-ba07-42cc-81ef-9840fc1f963d",'\
' "created":"2017-05-31T21:31:53.197755Z", "name": "Ghengis Khan", "objective": "China and Russian infrastructure"}'
fs_sink.add(camp4)
assert os.path.exists(os.path.join(FS_PATH, "campaign", "campaign--6a6ca372-ba07-42cc-81ef-9840fc1f963d" + ".json"))
camp4_r = fs_source.get("campaign--6a6ca372-ba07-42cc-81ef-9840fc1f963d")
assert camp4_r.id == "campaign--6a6ca372-ba07-42cc-81ef-9840fc1f963d"
assert camp4_r.name == "Ghengis Khan"
os.remove(os.path.join(FS_PATH, "campaign", camp4_r.id + ".json"))
def test_filesystem_sink_json_stix_bundle(fs_sink, fs_source):
# add json-encoded stix bundle
bund2 = '{"type": "bundle", "id": "bundle--3d267103-8475-4d8f-b321-35ec6eccfa37",' \
' "spec_version": "2.0", "objects": [{"type": "campaign", "id": "campaign--2c03b8bf-82ee-433e-9918-ca2cb6e9534b",' \
' "created":"2017-05-31T21:31:53.197755Z", "name": "Spartacus", "objective": "Oppressive regimes of Africa and Middle East"}]}'
fs_sink.add(bund2)
assert os.path.exists(os.path.join(FS_PATH, "campaign", "campaign--2c03b8bf-82ee-433e-9918-ca2cb6e9534b" + ".json"))
camp5_r = fs_source.get("campaign--2c03b8bf-82ee-433e-9918-ca2cb6e9534b")
assert camp5_r.id == "campaign--2c03b8bf-82ee-433e-9918-ca2cb6e9534b"
assert camp5_r.name == "Spartacus"
os.remove(os.path.join(FS_PATH, "campaign", camp5_r.id + ".json"))
def test_filesystem_sink_add_objects_list(fs_sink, fs_source):
# add list of objects
camp6 = Campaign(name="Comanche",
objective="US Midwest manufacturing firms, oil refineries, and businesses",
aliases=["Horse Warrior"])
camp7 = {
"name": "Napolean",
"type": "campaign",
"objective": "Central and Eastern Europe military commands and departments",
"aliases": ["The Frenchmen"],
"id": "campaign--122818b6-1112-4fb0-b11b-b111107ca70a",
"created": "2017-05-31T21:31:53.197755Z"
}
fs_sink.add([camp6, camp7])
assert os.path.exists(os.path.join(FS_PATH, "campaign", camp6.id + ".json"))
assert os.path.exists(os.path.join(FS_PATH, "campaign", "campaign--122818b6-1112-4fb0-b11b-b111107ca70a" + ".json"))
camp6_r = fs_source.get(camp6.id)
assert camp6_r.id == camp6.id
assert "Horse Warrior" in camp6_r.aliases
camp7_r = fs_source.get(camp7["id"])
assert camp7_r.id == camp7["id"]
assert "The Frenchmen" in camp7_r.aliases
# remove all added objects
os.remove(os.path.join(FS_PATH, "campaign", camp6_r.id + ".json"))
os.remove(os.path.join(FS_PATH, "campaign", camp7_r.id + ".json"))
def test_filesystem_store_get_stored_as_bundle(fs_store):
coa = fs_store.get("course-of-action--95ddb356-7ba0-4bd9-a889-247262b8946f")
assert coa.id == "course-of-action--95ddb356-7ba0-4bd9-a889-247262b8946f"
assert coa.type == "course-of-action"
def test_filesystem_store_get_stored_as_object(fs_store):
coa = fs_store.get("course-of-action--d9727aee-48b8-4fdb-89e2-4c49746ba4dd")
assert coa.id == "course-of-action--d9727aee-48b8-4fdb-89e2-4c49746ba4dd"
assert coa.type == "course-of-action"
def test_filesystem_store_all_versions(fs_store):
# all versions() - (note at this time, all_versions() is still not applicable to FileSystem, as only one version is ever stored)
rel = fs_store.all_versions("relationship--70dc6b5c-c524-429e-a6ab-0dd40f0482c1")[0]
assert rel.id == "relationship--70dc6b5c-c524-429e-a6ab-0dd40f0482c1"
assert rel.type == "relationship"
def test_filesystem_store_query(fs_store):
# query()
tools = fs_store.query([Filter("labels", "in", "tool")])
assert len(tools) == 2
assert "tool--242f3da3-4425-4d11-8f5c-b842886da966" in [tool.id for tool in tools]
assert "tool--03342581-f790-4f03-ba41-e82e67392e23" in [tool.id for tool in tools]
def test_filesystem_store_query_single_filter(fs_store):
query = Filter("labels", "in", "tool")
tools = fs_store.query(query)
assert len(tools) == 2
assert "tool--242f3da3-4425-4d11-8f5c-b842886da966" in [tool.id for tool in tools]
assert "tool--03342581-f790-4f03-ba41-e82e67392e23" in [tool.id for tool in tools]
def test_filesystem_store_empty_query(fs_store):
results = fs_store.query() # returns all
assert len(results) == 26
assert "tool--242f3da3-4425-4d11-8f5c-b842886da966" in [obj.id for obj in results]
assert "marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168" in [obj.id for obj in results]
def test_filesystem_store_query_multiple_filters(fs_store):
fs_store.source.filters.add(Filter("labels", "in", "tool"))
tools = fs_store.query(Filter("id", "=", "tool--242f3da3-4425-4d11-8f5c-b842886da966"))
assert len(tools) == 1
assert tools[0].id == "tool--242f3da3-4425-4d11-8f5c-b842886da966"
def test_filesystem_store_query_dont_include_type_folder(fs_store):
results = fs_store.query(Filter("type", "!=", "tool"))
assert len(results) == 24
def test_filesystem_store_add(fs_store):
# add()
camp1 = Campaign(name="Great Heathen Army",
objective="Targeting the government of United Kingdom and insitutions affiliated with the Church Of England",
aliases=["Ragnar"])
fs_store.add(camp1)
camp1_r = fs_store.get(camp1.id)
assert camp1_r.id == camp1.id
assert camp1_r.name == camp1.name
# remove
os.remove(os.path.join(FS_PATH, "campaign", camp1_r.id + ".json"))
def test_filesystem_store_add_as_bundle():
fs_store = FileSystemStore(FS_PATH, bundlify=True)
camp1 = Campaign(name="Great Heathen Army",
objective="Targeting the government of United Kingdom and insitutions affiliated with the Church Of England",
aliases=["Ragnar"])
fs_store.add(camp1)
with open(os.path.join(FS_PATH, "campaign", camp1.id + ".json")) as bundle_file:
assert '"type": "bundle"' in bundle_file.read()
camp1_r = fs_store.get(camp1.id)
assert camp1_r.id == camp1.id
assert camp1_r.name == camp1.name
shutil.rmtree(os.path.join(FS_PATH, "campaign"), True)
def test_filesystem_add_bundle_object(fs_store):
bundle = Bundle()
fs_store.add(bundle)
def test_filesystem_store_add_invalid_object(fs_store):
ind = ('campaign', 'campaign--8e2e2d2b-17d4-4cbf-938f-98ee46b3cd3f') # tuple isn't valid
with pytest.raises(TypeError) as excinfo:
fs_store.add(ind)
assert 'stix_data must be' in str(excinfo.value)
assert 'a STIX object' in str(excinfo.value)
assert 'JSON formatted STIX' in str(excinfo.value)
assert 'JSON formatted STIX bundle' in str(excinfo.value)
def test_filesystem_object_with_custom_property(fs_store):
camp = Campaign(name="Scipio Africanus",
objective="Defeat the Carthaginians",
x_empire="Roman",
allow_custom=True)
fs_store.add(camp, True)
camp_r = fs_store.get(camp.id)
assert camp_r.id == camp.id
assert camp_r.x_empire == camp.x_empire
def test_filesystem_object_with_custom_property_in_bundle(fs_store):
camp = Campaign(name="Scipio Africanus",
objective="Defeat the Carthaginians",
x_empire="Roman",
allow_custom=True)
bundle = Bundle(camp, allow_custom=True)
fs_store.add(bundle)
camp_r = fs_store.get(camp.id)
assert camp_r.id == camp.id
assert camp_r.x_empire == camp.x_empire
def test_filesystem_custom_object(fs_store):
@CustomObject('x-new-obj', [
('property1', properties.StringProperty(required=True)),
])
class NewObj():
pass
newobj = NewObj(property1='something')
fs_store.add(newobj)
newobj_r = fs_store.get(newobj.id)
assert newobj_r.id == newobj.id
assert newobj_r.property1 == 'something'
# remove dir
shutil.rmtree(os.path.join(FS_PATH, "x-new-obj"), True)
def test_relationships(rel_fs_store):
mal = rel_fs_store.get(MALWARE_ID)
resp = rel_fs_store.relationships(mal)
assert len(resp) == 3
assert any(x['id'] == RELATIONSHIP_IDS[0] for x in resp)
assert any(x['id'] == RELATIONSHIP_IDS[1] for x in resp)
assert any(x['id'] == RELATIONSHIP_IDS[2] for x in resp)
def test_relationships_by_type(rel_fs_store):
mal = rel_fs_store.get(MALWARE_ID)
resp = rel_fs_store.relationships(mal, relationship_type='indicates')
assert len(resp) == 1
assert resp[0]['id'] == RELATIONSHIP_IDS[0]
def test_relationships_by_source(rel_fs_store):
resp = rel_fs_store.relationships(MALWARE_ID, source_only=True)
assert len(resp) == 1
assert resp[0]['id'] == RELATIONSHIP_IDS[1]
def test_relationships_by_target(rel_fs_store):
resp = rel_fs_store.relationships(MALWARE_ID, target_only=True)
assert len(resp) == 2
assert any(x['id'] == RELATIONSHIP_IDS[0] for x in resp)
assert any(x['id'] == RELATIONSHIP_IDS[2] for x in resp)
def test_relationships_by_target_and_type(rel_fs_store):
resp = rel_fs_store.relationships(MALWARE_ID, relationship_type='uses', target_only=True)
assert len(resp) == 1
assert any(x['id'] == RELATIONSHIP_IDS[2] for x in resp)
def test_relationships_by_target_and_source(rel_fs_store):
with pytest.raises(ValueError) as excinfo:
rel_fs_store.relationships(MALWARE_ID, target_only=True, source_only=True)
assert 'not both' in str(excinfo.value)
def test_related_to(rel_fs_store):
mal = rel_fs_store.get(MALWARE_ID)
resp = rel_fs_store.related_to(mal)
assert len(resp) == 3
assert any(x['id'] == CAMPAIGN_ID for x in resp)
assert any(x['id'] == INDICATOR_ID for x in resp)
assert any(x['id'] == IDENTITY_ID for x in resp)
def test_related_to_by_source(rel_fs_store):
resp = rel_fs_store.related_to(MALWARE_ID, source_only=True)
assert len(resp) == 1
assert any(x['id'] == IDENTITY_ID for x in resp)
def test_related_to_by_target(rel_fs_store):
resp = rel_fs_store.related_to(MALWARE_ID, target_only=True)
assert len(resp) == 2
assert any(x['id'] == CAMPAIGN_ID for x in resp)
assert any(x['id'] == INDICATOR_ID for x in resp)


@@ -1,379 +0,0 @@
import datetime
import pytest
import stix2
def test_create_comparison_expression():
exp = stix2.EqualityComparisonExpression("file:hashes.'SHA-256'",
stix2.HashConstant("aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f", "SHA-256")) # noqa
assert str(exp) == "file:hashes.'SHA-256' = 'aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f'"
def test_boolean_expression():
exp1 = stix2.MatchesComparisonExpression("email-message:from_ref.value",
stix2.StringConstant(".+\\@example\\.com$"))
exp2 = stix2.MatchesComparisonExpression("email-message:body_multipart[*].body_raw_ref.name",
stix2.StringConstant("^Final Report.+\\.exe$"))
exp = stix2.AndBooleanExpression([exp1, exp2])
assert str(exp) == "email-message:from_ref.value MATCHES '.+\\\\@example\\\\.com$' AND email-message:body_multipart[*].body_raw_ref.name MATCHES '^Final Report.+\\\\.exe$'" # noqa
def test_boolean_expression_with_parentheses():
exp1 = stix2.MatchesComparisonExpression(stix2.ObjectPath("email-message",
[stix2.ReferenceObjectPathComponent("from_ref"),
stix2.BasicObjectPathComponent("value")]),
stix2.StringConstant(".+\\@example\\.com$"))
exp2 = stix2.MatchesComparisonExpression("email-message:body_multipart[*].body_raw_ref.name",
stix2.StringConstant("^Final Report.+\\.exe$"))
exp = stix2.ParentheticalExpression(stix2.AndBooleanExpression([exp1, exp2]))
assert str(exp) == "(email-message:from_ref.value MATCHES '.+\\\\@example\\\\.com$' AND email-message:body_multipart[*].body_raw_ref.name MATCHES '^Final Report.+\\\\.exe$')" # noqa
def test_hash_followed_by_registryKey_expression_python_constant():
hash_exp = stix2.EqualityComparisonExpression("file:hashes.MD5",
stix2.HashConstant("79054025255fb1a26e4bc422aef54eb4", "MD5"))
o_exp1 = stix2.ObservationExpression(hash_exp)
reg_exp = stix2.EqualityComparisonExpression(stix2.ObjectPath("windows-registry-key", ["key"]),
stix2.StringConstant("HKEY_LOCAL_MACHINE\\foo\\bar"))
o_exp2 = stix2.ObservationExpression(reg_exp)
fb_exp = stix2.FollowedByObservationExpression([o_exp1, o_exp2])
para_exp = stix2.ParentheticalExpression(fb_exp)
qual_exp = stix2.WithinQualifier(300)
exp = stix2.QualifiedObservationExpression(para_exp, qual_exp)
assert str(exp) == "([file:hashes.MD5 = '79054025255fb1a26e4bc422aef54eb4'] FOLLOWEDBY [windows-registry-key:key = 'HKEY_LOCAL_MACHINE\\\\foo\\\\bar']) WITHIN 300 SECONDS" # noqa
def test_hash_followed_by_registryKey_expression():
hash_exp = stix2.EqualityComparisonExpression("file:hashes.MD5",
stix2.HashConstant("79054025255fb1a26e4bc422aef54eb4", "MD5"))
o_exp1 = stix2.ObservationExpression(hash_exp)
reg_exp = stix2.EqualityComparisonExpression(stix2.ObjectPath("windows-registry-key", ["key"]),
stix2.StringConstant("HKEY_LOCAL_MACHINE\\foo\\bar"))
o_exp2 = stix2.ObservationExpression(reg_exp)
fb_exp = stix2.FollowedByObservationExpression([o_exp1, o_exp2])
para_exp = stix2.ParentheticalExpression(fb_exp)
qual_exp = stix2.WithinQualifier(stix2.IntegerConstant(300))
exp = stix2.QualifiedObservationExpression(para_exp, qual_exp)
assert str(exp) == "([file:hashes.MD5 = '79054025255fb1a26e4bc422aef54eb4'] FOLLOWEDBY [windows-registry-key:key = 'HKEY_LOCAL_MACHINE\\\\foo\\\\bar']) WITHIN 300 SECONDS" # noqa
def test_file_observable_expression():
exp1 = stix2.EqualityComparisonExpression("file:hashes.'SHA-256'",
stix2.HashConstant(
"aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f",
'SHA-256'))
exp2 = stix2.EqualityComparisonExpression("file:mime_type", stix2.StringConstant("application/x-pdf"))
bool_exp = stix2.ObservationExpression(stix2.AndBooleanExpression([exp1, exp2]))
assert str(bool_exp) == "[file:hashes.'SHA-256' = 'aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f' AND file:mime_type = 'application/x-pdf']" # noqa
@pytest.mark.parametrize("observation_class, op", [
(stix2.AndObservationExpression, 'AND'),
(stix2.OrObservationExpression, 'OR'),
])
def test_multiple_file_observable_expression(observation_class, op):
exp1 = stix2.EqualityComparisonExpression("file:hashes.'SHA-256'",
stix2.HashConstant(
"bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c",
'SHA-256'))
exp2 = stix2.EqualityComparisonExpression("file:hashes.MD5",
stix2.HashConstant("cead3f77f6cda6ec00f57d76c9a6879f", "MD5"))
bool1_exp = stix2.OrBooleanExpression([exp1, exp2])
exp3 = stix2.EqualityComparisonExpression("file:hashes.'SHA-256'",
stix2.HashConstant(
"aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f",
'SHA-256'))
op1_exp = stix2.ObservationExpression(bool1_exp)
op2_exp = stix2.ObservationExpression(exp3)
exp = observation_class([op1_exp, op2_exp])
assert str(exp) == "[file:hashes.'SHA-256' = 'bf07a7fbb825fc0aae7bf4a1177b2b31fcf8a3feeaf7092761e18c859ee52a9c' OR file:hashes.MD5 = 'cead3f77f6cda6ec00f57d76c9a6879f'] {} [file:hashes.'SHA-256' = 'aec070645fe53ee3b3763059376134f058cc337247c978add178b6ccdfb0019f']".format(op) # noqa
def test_root_types():
ast = stix2.ObservationExpression(
stix2.AndBooleanExpression(
[stix2.ParentheticalExpression(
stix2.OrBooleanExpression([
stix2.EqualityComparisonExpression("a:b", stix2.StringConstant("1")),
stix2.EqualityComparisonExpression("b:c", stix2.StringConstant("2"))])),
stix2.EqualityComparisonExpression(u"b:d", stix2.StringConstant("3"))]))
assert str(ast) == "[(a:b = '1' OR b:c = '2') AND b:d = '3']"
def test_artifact_payload():
exp1 = stix2.EqualityComparisonExpression("artifact:mime_type",
"application/vnd.tcpdump.pcap")
exp2 = stix2.MatchesComparisonExpression("artifact:payload_bin",
stix2.StringConstant("\\xd4\\xc3\\xb2\\xa1\\x02\\x00\\x04\\x00"))
and_exp = stix2.ObservationExpression(stix2.AndBooleanExpression([exp1, exp2]))
assert str(and_exp) == "[artifact:mime_type = 'application/vnd.tcpdump.pcap' AND artifact:payload_bin MATCHES '\\\\xd4\\\\xc3\\\\xb2\\\\xa1\\\\x02\\\\x00\\\\x04\\\\x00']" # noqa
def test_greater_than_python_constant():
exp1 = stix2.GreaterThanComparisonExpression("file:extensions.windows-pebinary-ext.sections[*].entropy", 7.0)
exp = stix2.ObservationExpression(exp1)
assert str(exp) == "[file:extensions.windows-pebinary-ext.sections[*].entropy > 7.0]"
def test_greater_than():
exp1 = stix2.GreaterThanComparisonExpression("file:extensions.windows-pebinary-ext.sections[*].entropy",
stix2.FloatConstant(7.0))
exp = stix2.ObservationExpression(exp1)
assert str(exp) == "[file:extensions.windows-pebinary-ext.sections[*].entropy > 7.0]"
def test_less_than():
exp = stix2.LessThanComparisonExpression("file:size", 1024)
assert str(exp) == "file:size < 1024"
def test_greater_than_or_equal():
exp = stix2.GreaterThanEqualComparisonExpression("file:size",
1024)
assert str(exp) == "file:size >= 1024"
def test_less_than_or_equal():
exp = stix2.LessThanEqualComparisonExpression("file:size",
1024)
assert str(exp) == "file:size <= 1024"
def test_not():
exp = stix2.LessThanComparisonExpression("file:size",
1024,
negated=True)
assert str(exp) == "file:size NOT < 1024"
def test_and_observable_expression():
exp1 = stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("user-account:account_type",
"unix"),
stix2.EqualityComparisonExpression("user-account:user_id",
stix2.StringConstant("1007")),
stix2.EqualityComparisonExpression("user-account:account_login",
"Peter")])
exp2 = stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("user-account:account_type",
"unix"),
stix2.EqualityComparisonExpression("user-account:user_id",
stix2.StringConstant("1008")),
stix2.EqualityComparisonExpression("user-account:account_login",
"Paul")])
exp3 = stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("user-account:account_type",
"unix"),
stix2.EqualityComparisonExpression("user-account:user_id",
stix2.StringConstant("1009")),
stix2.EqualityComparisonExpression("user-account:account_login",
"Mary")])
exp = stix2.AndObservationExpression([stix2.ObservationExpression(exp1),
stix2.ObservationExpression(exp2),
stix2.ObservationExpression(exp3)])
assert str(exp) == "[user-account:account_type = 'unix' AND user-account:user_id = '1007' AND user-account:account_login = 'Peter'] AND [user-account:account_type = 'unix' AND user-account:user_id = '1008' AND user-account:account_login = 'Paul'] AND [user-account:account_type = 'unix' AND user-account:user_id = '1009' AND user-account:account_login = 'Mary']" # noqa
def test_invalid_and_observable_expression():
with pytest.raises(ValueError) as excinfo:
stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("user-account:display_name",
"admin"),
stix2.EqualityComparisonExpression("email-addr:display_name",
stix2.StringConstant("admin"))])
assert "All operands to an 'AND' expression must have the same object type" in str(excinfo)
def test_hex():
exp_and = stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("file:mime_type",
"image/bmp"),
stix2.EqualityComparisonExpression("file:magic_number_hex",
stix2.HexConstant("ffd8"))])
exp = stix2.ObservationExpression(exp_and)
assert str(exp) == "[file:mime_type = 'image/bmp' AND file:magic_number_hex = h'ffd8']"
def test_multiple_qualifiers():
exp_and = stix2.AndBooleanExpression([stix2.EqualityComparisonExpression("network-traffic:dst_ref.type",
"domain-name"),
stix2.EqualityComparisonExpression("network-traffic:dst_ref.value",
"example.com")])
exp_ob = stix2.ObservationExpression(exp_and)
qual_rep = stix2.RepeatQualifier(5)
qual_within = stix2.WithinQualifier(stix2.IntegerConstant(1800))
exp = stix2.QualifiedObservationExpression(stix2.QualifiedObservationExpression(exp_ob, qual_rep), qual_within)
assert str(exp) == "[network-traffic:dst_ref.type = 'domain-name' AND network-traffic:dst_ref.value = 'example.com'] REPEATS 5 TIMES WITHIN 1800 SECONDS" # noqa
def test_set_op():
exp = stix2.ObservationExpression(stix2.IsSubsetComparisonExpression("network-traffic:dst_ref.value",
"2001:0db8:dead:beef:0000:0000:0000:0000/64"))
assert str(exp) == "[network-traffic:dst_ref.value ISSUBSET '2001:0db8:dead:beef:0000:0000:0000:0000/64']"
def test_timestamp():
ts = stix2.TimestampConstant('2014-01-13T07:03:17Z')
assert str(ts) == "t'2014-01-13T07:03:17Z'"
def test_boolean():
exp = stix2.EqualityComparisonExpression("email-message:is_multipart",
True)
assert str(exp) == "email-message:is_multipart = true"
def test_binary():
const = stix2.BinaryConstant("dGhpcyBpcyBhIHRlc3Q=")
exp = stix2.EqualityComparisonExpression("artifact:payload_bin",
const)
assert str(exp) == "artifact:payload_bin = b'dGhpcyBpcyBhIHRlc3Q='"
def test_list():
exp = stix2.InComparisonExpression("process:name",
['proccy', 'proximus', 'badproc'])
assert str(exp) == "process:name IN ('proccy', 'proximus', 'badproc')"
def test_list2():
# alternate way to construct an "IN" Comparison Expression
exp = stix2.EqualityComparisonExpression("process:name",
['proccy', 'proximus', 'badproc'])
assert str(exp) == "process:name IN ('proccy', 'proximus', 'badproc')"
def test_invalid_constant_type():
with pytest.raises(ValueError) as excinfo:
stix2.EqualityComparisonExpression("artifact:payload_bin",
{'foo': 'bar'})
assert 'Unable to create a constant' in str(excinfo)
def test_invalid_integer_constant():
with pytest.raises(ValueError) as excinfo:
stix2.IntegerConstant('foo')
assert 'must be an integer' in str(excinfo)
def test_invalid_timestamp_constant():
with pytest.raises(ValueError) as excinfo:
stix2.TimestampConstant('foo')
assert 'Must be a datetime object or timestamp string' in str(excinfo)
def test_invalid_float_constant():
with pytest.raises(ValueError) as excinfo:
stix2.FloatConstant('foo')
assert 'must be a float' in str(excinfo)
@pytest.mark.parametrize("data, result", [
(True, True),
(False, False),
('True', True),
('False', False),
('true', True),
('false', False),
('t', True),
('f', False),
('T', True),
('F', False),
(1, True),
(0, False),
])
def test_boolean_constant(data, result):
boolean = stix2.BooleanConstant(data)
assert boolean.value == result
def test_invalid_boolean_constant():
with pytest.raises(ValueError) as excinfo:
stix2.BooleanConstant('foo')
assert 'must be a boolean' in str(excinfo)
@pytest.mark.parametrize("hashtype, data", [
('MD5', 'zzz'),
('ssdeep', 'zzz=='),
])
def test_invalid_hash_constant(hashtype, data):
with pytest.raises(ValueError) as excinfo:
stix2.HashConstant(data, hashtype)
assert 'is not a valid {} hash'.format(hashtype) in str(excinfo)
def test_invalid_hex_constant():
with pytest.raises(ValueError) as excinfo:
stix2.HexConstant('mm')
assert "must contain an even number of hexadecimal characters" in str(excinfo)
def test_invalid_binary_constant():
with pytest.raises(ValueError) as excinfo:
stix2.BinaryConstant('foo')
assert 'must contain a base64' in str(excinfo)
def test_escape_quotes_and_backslashes():
exp = stix2.MatchesComparisonExpression("file:name",
"^Final Report.+\\.exe$")
assert str(exp) == "file:name MATCHES '^Final Report.+\\\\.exe$'"
def test_like():
exp = stix2.LikeComparisonExpression("directory:path",
"C:\\Windows\\%\\foo")
assert str(exp) == "directory:path LIKE 'C:\\\\Windows\\\\%\\\\foo'"
def test_issuperset():
exp = stix2.IsSupersetComparisonExpression("ipv4-addr:value",
"198.51.100.0/24")
assert str(exp) == "ipv4-addr:value ISSUPERSET '198.51.100.0/24'"
def test_repeat_qualifier():
qual = stix2.RepeatQualifier(stix2.IntegerConstant(5))
assert str(qual) == 'REPEATS 5 TIMES'
def test_invalid_repeat_qualifier():
with pytest.raises(ValueError) as excinfo:
stix2.RepeatQualifier('foo')
assert 'is not a valid argument for a Repeat Qualifier' in str(excinfo)
def test_invalid_within_qualifier():
with pytest.raises(ValueError) as excinfo:
stix2.WithinQualifier('foo')
assert 'is not a valid argument for a Within Qualifier' in str(excinfo)
def test_startstop_qualifier():
qual = stix2.StartStopQualifier(stix2.TimestampConstant('2016-06-01T00:00:00Z'),
datetime.datetime(2017, 3, 12, 8, 30, 0))
assert str(qual) == "START t'2016-06-01T00:00:00Z' STOP t'2017-03-12T08:30:00Z'"
qual2 = stix2.StartStopQualifier(datetime.date(2016, 6, 1),
stix2.TimestampConstant('2016-07-01T00:00:00Z'))
assert str(qual2) == "START t'2016-06-01T00:00:00Z' STOP t'2016-07-01T00:00:00Z'"
def test_invalid_startstop_qualifier():
with pytest.raises(ValueError) as excinfo:
stix2.StartStopQualifier('foo',
stix2.TimestampConstant('2016-06-01T00:00:00Z'))
assert 'is not a valid argument for a Start/Stop Qualifier' in str(excinfo)
with pytest.raises(ValueError) as excinfo:
stix2.StartStopQualifier(datetime.date(2016, 6, 1),
'foo')
assert 'is not a valid argument for a Start/Stop Qualifier' in str(excinfo)
def test_make_constant_already_a_constant():
str_const = stix2.StringConstant('Foo')
result = stix2.patterns.make_constant(str_const)
assert result is str_const
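# Putting the pieces above together: a short sketch (not part of the original
# tests) that chains comparison expressions, an observation expression, and
# nested qualifiers; the rendered string follows directly from test_hex,
# test_repeat_qualifier, and test_multiple_qualifiers above.
import stix2
lhs = stix2.EqualityComparisonExpression("file:mime_type", "image/bmp")
rhs = stix2.EqualityComparisonExpression("file:magic_number_hex", stix2.HexConstant("ffd8"))
obs = stix2.ObservationExpression(stix2.AndBooleanExpression([lhs, rhs]))
qualified = stix2.QualifiedObservationExpression(
    stix2.QualifiedObservationExpression(obs, stix2.RepeatQualifier(5)),
    stix2.WithinQualifier(stix2.IntegerConstant(300)),
)
print(qualified)
# [file:mime_type = 'image/bmp' AND file:magic_number_hex = h'ffd8'] REPEATS 5 TIMES WITHIN 300 SECONDS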


@ -1,420 +0,0 @@
import uuid
import pytest
from stix2 import CustomObject, EmailMIMEComponent, ExtensionsProperty, TCPExt
from stix2.exceptions import AtLeastOnePropertyError, DictionaryKeyError
from stix2.properties import (ERROR_INVALID_ID, BinaryProperty,
BooleanProperty, DictionaryProperty,
EmbeddedObjectProperty, EnumProperty,
FloatProperty, HashesProperty, HexProperty,
IDProperty, IntegerProperty, ListProperty,
Property, ReferenceProperty, StringProperty,
TimestampProperty, TypeProperty)
from . import constants
def test_property():
p = Property()
assert p.required is False
assert p.clean('foo') == 'foo'
assert p.clean(3) == 3
def test_basic_clean():
class Prop(Property):
def clean(self, value):
if value == 42:
return value
else:
raise ValueError("Must be 42")
p = Prop()
assert p.clean(42) == 42
with pytest.raises(ValueError):
p.clean(41)
def test_property_default():
class Prop(Property):
def default(self):
return 77
p = Prop()
assert p.default() == 77
def test_fixed_property():
p = Property(fixed="2.0")
assert p.clean("2.0")
with pytest.raises(ValueError):
assert p.clean("x") is False
with pytest.raises(ValueError):
assert p.clean(2.0) is False
assert p.default() == "2.0"
assert p.clean(p.default())
def test_list_property():
p = ListProperty(StringProperty)
assert p.clean(['abc', 'xyz'])
with pytest.raises(ValueError):
p.clean([])
def test_string_property():
prop = StringProperty()
assert prop.clean('foobar')
assert prop.clean(1)
assert prop.clean([1, 2, 3])
def test_type_property():
prop = TypeProperty('my-type')
assert prop.clean('my-type')
with pytest.raises(ValueError):
prop.clean('not-my-type')
assert prop.clean(prop.default())
ID_PROP = IDProperty('my-type')
MY_ID = 'my-type--232c9d3f-49fc-4440-bb01-607f638778e7'
@pytest.mark.parametrize("value", [
MY_ID,
'my-type--00000000-0000-4000-8000-000000000000',
])
def test_id_property_valid(value):
assert ID_PROP.clean(value) == value
CONSTANT_IDS = [
constants.ATTACK_PATTERN_ID,
constants.CAMPAIGN_ID,
constants.COURSE_OF_ACTION_ID,
constants.IDENTITY_ID,
constants.INDICATOR_ID,
constants.INTRUSION_SET_ID,
constants.MALWARE_ID,
constants.MARKING_DEFINITION_ID,
constants.OBSERVED_DATA_ID,
constants.RELATIONSHIP_ID,
constants.REPORT_ID,
constants.SIGHTING_ID,
constants.THREAT_ACTOR_ID,
constants.TOOL_ID,
constants.VULNERABILITY_ID,
]
CONSTANT_IDS.extend(constants.MARKING_IDS)
CONSTANT_IDS.extend(constants.RELATIONSHIP_IDS)
@pytest.mark.parametrize("value", CONSTANT_IDS)
def test_id_property_valid_for_type(value):
type = value.split('--', 1)[0]
assert IDProperty(type=type).clean(value) == value
def test_id_property_wrong_type():
with pytest.raises(ValueError) as excinfo:
ID_PROP.clean('not-my-type--232c9d3f-49fc-4440-bb01-607f638778e7')
assert str(excinfo.value) == "must start with 'my-type--'."
@pytest.mark.parametrize("value", [
'my-type--foo',
# Not a v4 UUID
'my-type--00000000-0000-0000-0000-000000000000',
'my-type--' + str(uuid.uuid1()),
'my-type--' + str(uuid.uuid3(uuid.NAMESPACE_DNS, "example.org")),
'my-type--' + str(uuid.uuid5(uuid.NAMESPACE_DNS, "example.org")),
])
def test_id_property_not_a_valid_hex_uuid(value):
with pytest.raises(ValueError) as excinfo:
ID_PROP.clean(value)
assert str(excinfo.value) == ERROR_INVALID_ID
def test_id_property_default():
default = ID_PROP.default()
assert ID_PROP.clean(default) == default
@pytest.mark.parametrize("value", [
2,
-1,
3.14,
False,
])
def test_integer_property_valid(value):
int_prop = IntegerProperty()
assert int_prop.clean(value) is not None
@pytest.mark.parametrize("value", [
"something",
StringProperty(),
])
def test_integer_property_invalid(value):
int_prop = IntegerProperty()
with pytest.raises(ValueError):
int_prop.clean(value)
@pytest.mark.parametrize("value", [
2,
-1,
3.14,
False,
])
def test_float_property_valid(value):
int_prop = FloatProperty()
assert int_prop.clean(value) is not None
@pytest.mark.parametrize("value", [
"something",
StringProperty(),
])
def test_float_property_invalid(value):
int_prop = FloatProperty()
with pytest.raises(ValueError):
int_prop.clean(value)
@pytest.mark.parametrize("value", [
True,
False,
'True',
'False',
'true',
'false',
'TRUE',
'FALSE',
'T',
'F',
't',
'f',
1,
0,
])
def test_boolean_property_valid(value):
bool_prop = BooleanProperty()
assert bool_prop.clean(value) is not None
@pytest.mark.parametrize("value", [
'abc',
['false'],
{'true': 'true'},
2,
-1,
])
def test_boolean_property_invalid(value):
bool_prop = BooleanProperty()
with pytest.raises(ValueError):
bool_prop.clean(value)
def test_reference_property():
ref_prop = ReferenceProperty()
assert ref_prop.clean("my-type--00000000-0000-4000-8000-000000000000")
with pytest.raises(ValueError):
ref_prop.clean("foo")
# This is not a valid V4 UUID
with pytest.raises(ValueError):
ref_prop.clean("my-type--00000000-0000-0000-0000-000000000000")
@pytest.mark.parametrize("value", [
'2017-01-01T12:34:56Z',
'2017-01-01 12:34:56',
'Jan 1 2017 12:34:56',
])
def test_timestamp_property_valid(value):
ts_prop = TimestampProperty()
assert ts_prop.clean(value) == constants.FAKE_TIME
def test_timestamp_property_invalid():
ts_prop = TimestampProperty()
with pytest.raises(ValueError):
ts_prop.clean(1)
with pytest.raises(ValueError):
ts_prop.clean("someday sometime")
def test_binary_property():
bin_prop = BinaryProperty()
assert bin_prop.clean("TG9yZW0gSXBzdW0=")
with pytest.raises(ValueError):
bin_prop.clean("foobar")
def test_hex_property():
hex_prop = HexProperty()
assert hex_prop.clean("4c6f72656d20497073756d")
with pytest.raises(ValueError):
hex_prop.clean("foobar")
@pytest.mark.parametrize("d", [
{'description': 'something'},
[('abc', 1), ('bcd', 2), ('cde', 3)],
])
def test_dictionary_property_valid(d):
dict_prop = DictionaryProperty()
assert dict_prop.clean(d)
@pytest.mark.parametrize("d", [
[{'a': 'something'}, "Invalid dictionary key a: (shorter than 3 characters)."],
[{'a'*300: 'something'}, "Invalid dictionary key aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
"aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa"
"aaaaaaaaaaaaaaaaaaaaaaa: (longer than 256 characters)."],
[{'Hey!': 'something'}, "Invalid dictionary key Hey!: (contains characters other thanlowercase a-z, "
"uppercase A-Z, numerals 0-9, hyphen (-), or underscore (_))."],
])
def test_dictionary_property_invalid_key(d):
dict_prop = DictionaryProperty()
with pytest.raises(DictionaryKeyError) as excinfo:
dict_prop.clean(d[0])
assert str(excinfo.value) == d[1]
@pytest.mark.parametrize("d", [
({}, "The dictionary property must contain a non-empty dictionary"),
# TODO: This error message could be made more helpful. The error is caused
# because `json.loads()` doesn't like the *single* quotes around the key
# name, even though they are valid in a Python dictionary. While technically
# accurate (a string is not a dictionary), if we want to be able to load
# string-encoded "dictionaries" that are, we need a better error message
# or an alternative to `json.loads()` ... and preferably *not* `eval()`. :-)
# Changing the following to `'{"description": "something"}'` does not cause
# any ValueError to be raised.
("{'description': 'something'}", "The dictionary property must contain a dictionary"),
])
def test_dictionary_property_invalid(d):
dict_prop = DictionaryProperty()
with pytest.raises(ValueError) as excinfo:
dict_prop.clean(d[0])
assert str(excinfo.value) == d[1]
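# The TODO above comes down to JSON's quoting rules; a standalone illustration
# (plain json, independent of stix2, not part of the original tests):
import json
json.loads('{"description": "something"}')      # valid JSON: double-quoted keys and values
try:
    json.loads("{'description': 'something'}")  # single quotes are not valid JSON
except ValueError as err:                        # json.JSONDecodeError subclasses ValueError
    print("rejected:", err)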
def test_property_list_of_dictionary():
@CustomObject('x-new-obj', [
('property1', ListProperty(DictionaryProperty(), required=True)),
])
class NewObj():
pass
test_obj = NewObj(property1=[{'foo': 'bar'}])
assert test_obj.property1[0]['foo'] == 'bar'
@pytest.mark.parametrize("value", [
{"sha256": "6db12788c37247f2316052e142f42f4b259d6561751e5f401a1ae2a6df9c674b"},
[('MD5', '2dfb1bcc980200c6706feee399d41b3f'), ('RIPEMD-160', 'b3a8cd8a27c90af79b3c81754f267780f443dfef')],
])
def test_hashes_property_valid(value):
hash_prop = HashesProperty()
assert hash_prop.clean(value)
@pytest.mark.parametrize("value", [
{"MD5": "a"},
{"SHA-256": "2dfb1bcc980200c6706feee399d41b3f"},
])
def test_hashes_property_invalid(value):
hash_prop = HashesProperty()
with pytest.raises(ValueError):
hash_prop.clean(value)
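# A sketch of the behavior the two hashes tests pin down (digest values reused
# from the cases above; not part of the original test file): clean() accepts a
# digest that matches the named algorithm's format and rejects a mismatch.
from stix2.properties import HashesProperty
hashes_prop = HashesProperty()
hashes_prop.clean({"sha256": "6db12788c37247f2316052e142f42f4b259d6561751e5f401a1ae2a6df9c674b"})  # accepted
hashes_prop.clean({"MD5": "2dfb1bcc980200c6706feee399d41b3f"})  # accepted
try:
    hashes_prop.clean({"SHA-256": "2dfb1bcc980200c6706feee399d41b3f"})  # MD5-length digest under a SHA-256 key
except ValueError:
    print("digest format is validated against the named algorithm")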
def test_embedded_property():
emb_prop = EmbeddedObjectProperty(type=EmailMIMEComponent)
mime = EmailMIMEComponent(
content_type="text/plain; charset=utf-8",
content_disposition="inline",
body="Cats are funny!"
)
assert emb_prop.clean(mime)
with pytest.raises(ValueError):
emb_prop.clean("string")
@pytest.mark.parametrize("value", [
['a', 'b', 'c'],
('a', 'b', 'c'),
'b',
])
def test_enum_property_valid(value):
enum_prop = EnumProperty(value)
assert enum_prop.clean('b')
def test_enum_property_invalid():
enum_prop = EnumProperty(['a', 'b', 'c'])
with pytest.raises(ValueError):
enum_prop.clean('z')
def test_extension_property_valid():
ext_prop = ExtensionsProperty(enclosing_type='file')
assert ext_prop.clean({
'windows-pebinary-ext': {
'pe_type': 'exe'
},
})
@pytest.mark.parametrize("data", [
1,
{'foobar-ext': {
'pe_type': 'exe'
}},
])
def test_extension_property_invalid(data):
ext_prop = ExtensionsProperty(enclosing_type='file')
with pytest.raises(ValueError):
ext_prop.clean(data)
def test_extension_property_invalid_type():
ext_prop = ExtensionsProperty(enclosing_type='indicator')
with pytest.raises(ValueError) as excinfo:
ext_prop.clean({
'windows-pebinary-ext': {
'pe_type': 'exe'
}}
)
assert 'no extensions defined' in str(excinfo.value)
def test_extension_at_least_one_property_constraint():
with pytest.raises(AtLeastOnePropertyError):
TCPExt()


@ -1,130 +0,0 @@
import datetime as dt
import pytest
import pytz
import stix2
from .constants import INDICATOR_KWARGS, REPORT_ID
EXPECTED = """{
"type": "report",
"id": "report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3",
"created_by_ref": "identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283",
"created": "2015-12-21T19:59:11.000Z",
"modified": "2015-12-21T19:59:11.000Z",
"name": "The Black Vine Cyberespionage Group",
"description": "A simple report with an indicator and campaign",
"published": "2016-01-20T17:00:00Z",
"object_refs": [
"indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2",
"campaign--83422c77-904c-4dc1-aff5-5c38f3a2c55c",
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"
],
"labels": [
"campaign"
]
}"""
def test_report_example():
report = stix2.Report(
id="report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3",
created_by_ref="identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283",
created="2015-12-21T19:59:11.000Z",
modified="2015-12-21T19:59:11.000Z",
name="The Black Vine Cyberespionage Group",
description="A simple report with an indicator and campaign",
published="2016-01-20T17:00:00Z",
labels=["campaign"],
object_refs=[
"indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2",
"campaign--83422c77-904c-4dc1-aff5-5c38f3a2c55c",
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"
],
)
assert str(report) == EXPECTED
def test_report_example_objects_in_object_refs():
report = stix2.Report(
id="report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3",
created_by_ref="identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283",
created="2015-12-21T19:59:11.000Z",
modified="2015-12-21T19:59:11.000Z",
name="The Black Vine Cyberespionage Group",
description="A simple report with an indicator and campaign",
published="2016-01-20T17:00:00Z",
labels=["campaign"],
object_refs=[
stix2.Indicator(id="indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2", **INDICATOR_KWARGS),
"campaign--83422c77-904c-4dc1-aff5-5c38f3a2c55c",
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"
],
)
assert str(report) == EXPECTED
def test_report_example_objects_in_object_refs_with_bad_id():
with pytest.raises(stix2.exceptions.InvalidValueError) as excinfo:
stix2.Report(
id="report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3",
created_by_ref="identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283",
created="2015-12-21T19:59:11.000Z",
modified="2015-12-21T19:59:11.000Z",
name="The Black Vine Cyberespionage Group",
description="A simple report with an indicator and campaign",
published="2016-01-20T17:00:00Z",
labels=["campaign"],
object_refs=[
stix2.Indicator(id="indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2", **INDICATOR_KWARGS),
"campaign-83422c77-904c-4dc1-aff5-5c38f3a2c55c", # the "bad" id, missing a "-"
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"
],
)
assert excinfo.value.cls == stix2.Report
assert excinfo.value.prop_name == "object_refs"
assert excinfo.value.reason == stix2.properties.ERROR_INVALID_ID
assert str(excinfo.value) == "Invalid value for Report 'object_refs': " + stix2.properties.ERROR_INVALID_ID
@pytest.mark.parametrize("data", [
EXPECTED,
{
"created": "2015-12-21T19:59:11.000Z",
"created_by_ref": "identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283",
"description": "A simple report with an indicator and campaign",
"id": "report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3",
"labels": [
"campaign"
],
"modified": "2015-12-21T19:59:11.000Z",
"name": "The Black Vine Cyberespionage Group",
"object_refs": [
"indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2",
"campaign--83422c77-904c-4dc1-aff5-5c38f3a2c55c",
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"
],
"published": "2016-01-20T17:00:00Z",
"type": "report"
},
])
def test_parse_report(data):
rept = stix2.parse(data)
assert rept.type == 'report'
assert rept.id == REPORT_ID
assert rept.created == dt.datetime(2015, 12, 21, 19, 59, 11, tzinfo=pytz.utc)
assert rept.modified == dt.datetime(2015, 12, 21, 19, 59, 11, tzinfo=pytz.utc)
assert rept.created_by_ref == "identity--a463ffb3-1bd9-4d94-b02d-74e4f1658283"
assert rept.object_refs == ["indicator--26ffb872-1dd9-446e-b6f5-d58527e5b5d2",
"campaign--83422c77-904c-4dc1-aff5-5c38f3a2c55c",
"relationship--f82356ae-fe6c-437c-9c24-6b64314ae68a"]
assert rept.description == "A simple report with an indicator and campaign"
assert rept.labels == ["campaign"]
assert rept.name == "The Black Vine Cyberespionage Group"
# TODO: Add other examples


@ -0,0 +1,212 @@
from __future__ import unicode_literals
import pytest
from stix2.parsing import _detect_spec_version
@pytest.mark.parametrize(
"obj_dict, expected_ver", [
# STIX 2.0 examples
(
{
"type": "identity",
"id": "identity--d7f72e8d-657a-43ec-9324-b3ec67a97486",
"created": "1972-05-21T05:33:09.000Z",
"modified": "1973-05-28T02:10:54.000Z",
"name": "alice",
"identity_class": "individual",
},
"v20",
),
(
{
"type": "relationship",
"id": "relationship--63b0f1b7-925e-4795-ac9b-61fb9f235f1a",
"created": "1981-08-11T13:48:19.000Z",
"modified": "2000-02-16T15:33:15.000Z",
"source_ref": "attack-pattern--9391504a-ef29-4a41-a257-5634d9edc391",
"target_ref": "identity--ba18dde2-56d3-4a34-aa0b-fc56f5be568f",
"relationship_type": "targets",
},
"v20",
),
(
{
"type": "file",
"name": "notes.txt",
},
"v20",
),
(
{
"type": "marking-definition",
"id": "marking-definition--2a13090f-a493-4b70-85fe-fa021d91dcd2",
"created": "1998-03-27T19:44:53.000Z",
"definition_type": "statement",
"definition": {
"statement": "Copyright (c) ACME Corp.",
},
},
"v20",
),
(
{
"type": "bundle",
"id": "bundle--8379cb02-8131-47c8-8a7c-9a1f0e0986b1",
"spec_version": "2.0",
"objects": [
{
"type": "identity",
"id": "identity--d7f72e8d-657a-43ec-9324-b3ec67a97486",
"created": "1972-05-21T05:33:09.000Z",
"modified": "1973-05-28T02:10:54.000Z",
"name": "alice",
"identity_class": "individual",
},
{
"type": "marking-definition",
"id": "marking-definition--2a13090f-a493-4b70-85fe-fa021d91dcd2",
"created": "1998-03-27T19:44:53.000Z",
"definition_type": "statement",
"definition": {
"statement": "Copyright (c) ACME Corp.",
},
},
],
},
"v20",
),
# STIX 2.1 examples
(
{
"type": "identity",
"id": "identity--22299b4c-bc38-4485-ad7d-8222f01c58c7",
"spec_version": "2.1",
"created": "1995-07-24T04:07:48.000Z",
"modified": "2001-07-01T09:33:17.000Z",
"name": "alice",
},
"v21",
),
(
{
"type": "relationship",
"id": "relationship--0eec232d-e1ea-4f85-8e78-0de6ae9d09f0",
"spec_version": "2.1",
"created": "1975-04-05T10:47:22.000Z",
"modified": "1983-04-25T20:56:00.000Z",
"source_ref": "attack-pattern--9391504a-ef29-4a41-a257-5634d9edc391",
"target_ref": "identity--ba18dde2-56d3-4a34-aa0b-fc56f5be568f",
"relationship_type": "targets",
},
"v21",
),
(
{
"type": "file",
"id": "file--5eef3404-6a94-4db3-9a1a-5684cbea0dfe",
"spec_version": "2.1",
"name": "notes.txt",
},
"v21",
),
(
{
"type": "file",
"id": "file--5eef3404-6a94-4db3-9a1a-5684cbea0dfe",
"name": "notes.txt",
},
"v21",
),
(
{
"type": "marking-definition",
"spec_version": "2.1",
"id": "marking-definition--34098fce-860f-48ae-8e50-ebd3cc5e41da",
"created": "2017-01-20T00:00:00.000Z",
"definition_type": "tlp",
"name": "TLP:GREEN",
"definition": {
"tlp": "green",
},
},
"v21",
),
(
{
"type": "bundle",
"id": "bundle--d5787acd-1ffd-4630-ada3-6857698f6287",
"objects": [
{
"type": "identity",
"id": "identity--22299b4c-bc38-4485-ad7d-8222f01c58c7",
"spec_version": "2.1",
"created": "1995-07-24T04:07:48.000Z",
"modified": "2001-07-01T09:33:17.000Z",
"name": "alice",
},
{
"type": "file",
"id": "file--5eef3404-6a94-4db3-9a1a-5684cbea0dfe",
"name": "notes.txt",
},
],
},
"v21",
),
# Mixed spec examples
(
{
"type": "bundle",
"id": "bundle--e1a01e29-3432-401a-ab9f-c1082b056605",
"objects": [
{
"type": "identity",
"id": "identity--d7f72e8d-657a-43ec-9324-b3ec67a97486",
"created": "1972-05-21T05:33:09.000Z",
"modified": "1973-05-28T02:10:54.000Z",
"name": "alice",
"identity_class": "individual",
},
{
"type": "relationship",
"id": "relationship--63b0f1b7-925e-4795-ac9b-61fb9f235f1a",
"created": "1981-08-11T13:48:19.000Z",
"modified": "2000-02-16T15:33:15.000Z",
"source_ref": "attack-pattern--9391504a-ef29-4a41-a257-5634d9edc391",
"target_ref": "identity--ba18dde2-56d3-4a34-aa0b-fc56f5be568f",
"relationship_type": "targets",
},
],
},
"v21",
),
(
{
"type": "bundle",
"id": "bundle--eecad3d9-bb9a-4263-93f6-1c0ccc984574",
"objects": [
{
"type": "identity",
"id": "identity--d7f72e8d-657a-43ec-9324-b3ec67a97486",
"created": "1972-05-21T05:33:09.000Z",
"modified": "1973-05-28T02:10:54.000Z",
"name": "alice",
"identity_class": "individual",
},
{
"type": "file",
"id": "file--5eef3404-6a94-4db3-9a1a-5684cbea0dfe",
"name": "notes.txt",
},
],
},
"v21",
),
],
)
def test_spec_version_detect(obj_dict, expected_ver):
detected_ver = _detect_spec_version(obj_dict)
assert detected_ver == expected_ver
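# A condensed reading of two of the cases above, as a sketch (ids reused from
# the parametrized data; the expected return values are the ones asserted by
# the test, not an addition to it):
from stix2.parsing import _detect_spec_version
assert _detect_spec_version({"type": "file", "name": "notes.txt"}) == "v20"  # no spec_version, no SCO id
assert _detect_spec_version({
    "type": "file",
    "id": "file--5eef3404-6a94-4db3-9a1a-5684cbea0dfe",
    "name": "notes.txt",
}) == "v21"  # an observable carrying its own id only appears in STIX 2.1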


@ -1,210 +0,0 @@
# -*- coding: utf-8 -*-
import datetime as dt
from io import StringIO
import pytest
import pytz
import stix2.utils
amsterdam = pytz.timezone('Europe/Amsterdam')
eastern = pytz.timezone('US/Eastern')
@pytest.mark.parametrize('dttm, timestamp', [
(dt.datetime(2017, 1, 1, tzinfo=pytz.utc), '2017-01-01T00:00:00Z'),
(amsterdam.localize(dt.datetime(2017, 1, 1)), '2016-12-31T23:00:00Z'),
(eastern.localize(dt.datetime(2017, 1, 1, 12, 34, 56)), '2017-01-01T17:34:56Z'),
(eastern.localize(dt.datetime(2017, 7, 1)), '2017-07-01T04:00:00Z'),
(dt.datetime(2017, 7, 1), '2017-07-01T00:00:00Z'),
(dt.datetime(2017, 7, 1, 0, 0, 0, 1), '2017-07-01T00:00:00.000001Z'),
(stix2.utils.STIXdatetime(2017, 7, 1, 0, 0, 0, 1, precision='millisecond'), '2017-07-01T00:00:00.000Z'),
(stix2.utils.STIXdatetime(2017, 7, 1, 0, 0, 0, 1, precision='second'), '2017-07-01T00:00:00Z'),
])
def test_timestamp_formatting(dttm, timestamp):
assert stix2.utils.format_datetime(dttm) == timestamp
@pytest.mark.parametrize('timestamp, dttm', [
(dt.datetime(2017, 1, 1, 0, tzinfo=pytz.utc), dt.datetime(2017, 1, 1, 0, 0, 0, tzinfo=pytz.utc)),
(dt.date(2017, 1, 1), dt.datetime(2017, 1, 1, 0, 0, 0, tzinfo=pytz.utc)),
('2017-01-01T00:00:00Z', dt.datetime(2017, 1, 1, 0, 0, 0, tzinfo=pytz.utc)),
('2017-01-01T02:00:00+2:00', dt.datetime(2017, 1, 1, 0, 0, 0, tzinfo=pytz.utc)),
('2017-01-01T00:00:00', dt.datetime(2017, 1, 1, 0, 0, 0, tzinfo=pytz.utc)),
])
def test_parse_datetime(timestamp, dttm):
assert stix2.utils.parse_into_datetime(timestamp) == dttm
@pytest.mark.parametrize('timestamp, dttm, precision', [
('2017-01-01T01:02:03.000001', dt.datetime(2017, 1, 1, 1, 2, 3, 0, tzinfo=pytz.utc), 'millisecond'),
('2017-01-01T01:02:03.001', dt.datetime(2017, 1, 1, 1, 2, 3, 1000, tzinfo=pytz.utc), 'millisecond'),
('2017-01-01T01:02:03.1', dt.datetime(2017, 1, 1, 1, 2, 3, 100000, tzinfo=pytz.utc), 'millisecond'),
('2017-01-01T01:02:03.45', dt.datetime(2017, 1, 1, 1, 2, 3, 450000, tzinfo=pytz.utc), 'millisecond'),
('2017-01-01T01:02:03.45', dt.datetime(2017, 1, 1, 1, 2, 3, tzinfo=pytz.utc), 'second'),
])
def test_parse_datetime_precision(timestamp, dttm, precision):
assert stix2.utils.parse_into_datetime(timestamp, precision) == dttm
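# A small sketch, not part of the original tests: the precision recorded by
# parse_into_datetime appears to carry through to format_datetime, as the
# STIXdatetime cases in test_timestamp_formatting above suggest.
import stix2.utils
ts = stix2.utils.parse_into_datetime('2017-01-01T01:02:03.45', 'millisecond')
print(stix2.utils.format_datetime(ts))  # 2017-01-01T01:02:03.450Z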
@pytest.mark.parametrize('ts', [
'foobar',
1,
])
def test_parse_datetime_invalid(ts):
with pytest.raises(ValueError):
stix2.utils.parse_into_datetime('foobar')
@pytest.mark.parametrize('data', [
{"a": 1},
'{"a": 1}',
StringIO(u'{"a": 1}'),
[("a", 1,)],
])
def test_get_dict(data):
assert stix2.utils._get_dict(data)
@pytest.mark.parametrize('data', [
1,
[1],
['a', 1],
"foobar",
])
def test_get_dict_invalid(data):
with pytest.raises(ValueError):
stix2.utils._get_dict(data)
@pytest.mark.parametrize('stix_id, type', [
('malware--d69c8146-ab35-4d50-8382-6fc80e641d43', 'malware'),
('intrusion-set--899ce53f-13a0-479b-a0e4-67d46e241542', 'intrusion-set')
])
def test_get_type_from_id(stix_id, type):
assert stix2.utils.get_type_from_id(stix_id) == type
def test_deduplicate(stix_objs1):
unique = stix2.utils.deduplicate(stix_objs1)
# Only 3 objects are unique
# 2 id's vary
# 2 modified times vary for a particular id
assert len(unique) == 3
ids = [obj['id'] for obj in unique]
mods = [obj['modified'] for obj in unique]
assert "indicator--00000000-0000-4000-8000-000000000001" in ids
assert "indicator--00000000-0000-4000-8000-000000000001" in ids
assert "2017-01-27T13:49:53.935Z" in mods
assert "2017-01-27T13:49:53.936Z" in mods
@pytest.mark.parametrize('object, tuple_to_find, expected_index', [
(stix2.ObservedData(
id="observed-data--b67d30ff-02ac-498a-92f9-32f845f448cf",
created_by_ref="identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
created="2016-04-06T19:58:16.000Z",
modified="2016-04-06T19:58:16.000Z",
first_observed="2015-12-21T19:00:00Z",
last_observed="2015-12-21T19:00:00Z",
number_observed=50,
objects={
"0": {
"name": "foo.exe",
"type": "file"
},
"1": {
"type": "ipv4-addr",
"value": "198.51.100.3"
},
"2": {
"type": "network-traffic",
"src_ref": "1",
"protocols": [
"tcp",
"http"
],
"extensions": {
"http-request-ext": {
"request_method": "get",
"request_value": "/download.html",
"request_version": "http/1.1",
"request_header": {
"Accept-Encoding": "gzip,deflate",
"User-Agent": "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113",
"Host": "www.example.com"
}
}
}
}
},
), ('1', {"type": "ipv4-addr", "value": "198.51.100.3"}), 1),
({
"type": "x-example",
"id": "x-example--d5413db2-c26c-42e0-b0e0-ec800a310bfb",
"created": "2018-06-11T01:25:22.063Z",
"modified": "2018-06-11T01:25:22.063Z",
"dictionary": {
"key": {
"key_one": "value",
"key_two": "value"
}
}
}, ('key', {'key_one': 'value', 'key_two': 'value'}), 0),
({
"type": "language-content",
"id": "language-content--b86bd89f-98bb-4fa9-8cb2-9ad421da981d",
"created": "2017-02-08T21:31:22.007Z",
"modified": "2017-02-08T21:31:22.007Z",
"object_ref": "campaign--12a111f0-b824-4baf-a224-83b80237a094",
"object_modified": "2017-02-08T21:31:22.007Z",
"contents": {
"de": {
"name": "Bank Angriff 1",
"description": "Weitere Informationen über Banküberfall"
},
"fr": {
"name": "Attaque Bank 1",
"description": "Plus d'informations sur la crise bancaire"
}
}
}, ('fr', {"name": "Attaque Bank 1", "description": "Plus d'informations sur la crise bancaire"}), 1)
])
def test_find_property_index(object, tuple_to_find, expected_index):
assert stix2.utils.find_property_index(
object,
*tuple_to_find
) == expected_index
@pytest.mark.parametrize('dict_value, tuple_to_find, expected_index', [
({
"contents": {
"de": {
"name": "Bank Angriff 1",
"description": "Weitere Informationen über Banküberfall"
},
"fr": {
"name": "Attaque Bank 1",
"description": "Plus d'informations sur la crise bancaire"
},
"es": {
"name": "Ataque al Banco",
"description": "Mas informacion sobre el ataque al banco"
}
}
}, ('es', {"name": "Ataque al Banco", "description": "Mas informacion sobre el ataque al banco"}), 1), # Sorted alphabetically
({
'my_list': [
{"key_one": 1},
{"key_two": 2}
]
}, ('key_one', 1), 0)
])
def test_iterate_over_values(dict_value, tuple_to_find, expected_index):
assert stix2.utils._find_property_in_seq(dict_value.values(), *tuple_to_find) == expected_index


@ -1,41 +1,43 @@
import importlib
import os
import stix2
from stix2.workbench import (AttackPattern, Bundle, Campaign, CourseOfAction,
ExternalReference, FileSystemSource, Filter,
Identity, Indicator, IntrusionSet, Malware,
MarkingDefinition, ObservedData, Relationship,
Report, StatementMarking, ThreatActor, Tool,
Vulnerability, add_data_source, all_versions,
attack_patterns, campaigns, courses_of_action,
create, get, identities, indicators,
intrusion_sets, malware, observed_data, query,
reports, save, set_default_created,
from stix2.workbench import (
_STIX_VID, AttackPattern, Bundle, Campaign, CourseOfAction,
ExternalReference, File, FileSystemSource, Filter, Identity, Indicator,
IntrusionSet, Malware, MarkingDefinition, NTFSExt, ObservedData,
Relationship, Report, StatementMarking, ThreatActor, Tool, Vulnerability,
add_data_source, all_versions, attack_patterns, campaigns,
courses_of_action, create, get, identities, indicators, intrusion_sets,
malware, observed_data, query, reports, save, set_default_created,
set_default_creator, set_default_external_refs,
set_default_object_marking_refs, threat_actors,
tools, vulnerabilities)
set_default_object_marking_refs, threat_actors, tools, vulnerabilities,
)
from .constants import (ATTACK_PATTERN_ID, ATTACK_PATTERN_KWARGS, CAMPAIGN_ID,
CAMPAIGN_KWARGS, COURSE_OF_ACTION_ID,
COURSE_OF_ACTION_KWARGS, IDENTITY_ID, IDENTITY_KWARGS,
INDICATOR_ID, INDICATOR_KWARGS, INTRUSION_SET_ID,
INTRUSION_SET_KWARGS, MALWARE_ID, MALWARE_KWARGS,
OBSERVED_DATA_ID, OBSERVED_DATA_KWARGS, REPORT_ID,
REPORT_KWARGS, THREAT_ACTOR_ID, THREAT_ACTOR_KWARGS,
TOOL_ID, TOOL_KWARGS, VULNERABILITY_ID,
VULNERABILITY_KWARGS)
# Auto-detect some settings based on the current default STIX version
_STIX_DATA_PATH = os.path.join(
os.path.dirname(os.path.realpath(__file__)),
_STIX_VID,
"stix2_data",
)
_STIX_CONSTANTS_MODULE = "stix2.test." + _STIX_VID + ".constants"
constants = importlib.import_module(_STIX_CONSTANTS_MODULE)
def test_workbench_environment():
# Create a STIX object
ind = create(Indicator, id=INDICATOR_ID, **INDICATOR_KWARGS)
ind = create(
Indicator, id=constants.INDICATOR_ID, **constants.INDICATOR_KWARGS
)
save(ind)
resp = get(INDICATOR_ID)
resp = get(constants.INDICATOR_ID)
assert resp['labels'][0] == 'malicious-activity'
resp = all_versions(INDICATOR_ID)
resp = all_versions(constants.INDICATOR_ID)
assert len(resp) == 1
# Search on something other than id
@ -45,176 +47,193 @@ def test_workbench_environment():
def test_workbench_get_all_attack_patterns():
mal = AttackPattern(id=ATTACK_PATTERN_ID, **ATTACK_PATTERN_KWARGS)
mal = AttackPattern(
id=constants.ATTACK_PATTERN_ID, **constants.ATTACK_PATTERN_KWARGS
)
save(mal)
resp = attack_patterns()
assert len(resp) == 1
assert resp[0].id == ATTACK_PATTERN_ID
assert resp[0].id == constants.ATTACK_PATTERN_ID
def test_workbench_get_all_campaigns():
cam = Campaign(id=CAMPAIGN_ID, **CAMPAIGN_KWARGS)
cam = Campaign(id=constants.CAMPAIGN_ID, **constants.CAMPAIGN_KWARGS)
save(cam)
resp = campaigns()
assert len(resp) == 1
assert resp[0].id == CAMPAIGN_ID
assert resp[0].id == constants.CAMPAIGN_ID
def test_workbench_get_all_courses_of_action():
coa = CourseOfAction(id=COURSE_OF_ACTION_ID, **COURSE_OF_ACTION_KWARGS)
coa = CourseOfAction(
id=constants.COURSE_OF_ACTION_ID, **constants.COURSE_OF_ACTION_KWARGS
)
save(coa)
resp = courses_of_action()
assert len(resp) == 1
assert resp[0].id == COURSE_OF_ACTION_ID
assert resp[0].id == constants.COURSE_OF_ACTION_ID
def test_workbench_get_all_identities():
idty = Identity(id=IDENTITY_ID, **IDENTITY_KWARGS)
idty = Identity(id=constants.IDENTITY_ID, **constants.IDENTITY_KWARGS)
save(idty)
resp = identities()
assert len(resp) == 1
assert resp[0].id == IDENTITY_ID
assert resp[0].id == constants.IDENTITY_ID
def test_workbench_get_all_indicators():
resp = indicators()
assert len(resp) == 1
assert resp[0].id == INDICATOR_ID
assert resp[0].id == constants.INDICATOR_ID
def test_workbench_get_all_intrusion_sets():
ins = IntrusionSet(id=INTRUSION_SET_ID, **INTRUSION_SET_KWARGS)
ins = IntrusionSet(
id=constants.INTRUSION_SET_ID, **constants.INTRUSION_SET_KWARGS
)
save(ins)
resp = intrusion_sets()
assert len(resp) == 1
assert resp[0].id == INTRUSION_SET_ID
assert resp[0].id == constants.INTRUSION_SET_ID
def test_workbench_get_all_malware():
mal = Malware(id=MALWARE_ID, **MALWARE_KWARGS)
mal = Malware(id=constants.MALWARE_ID, **constants.MALWARE_KWARGS)
save(mal)
resp = malware()
assert len(resp) == 1
assert resp[0].id == MALWARE_ID
assert resp[0].id == constants.MALWARE_ID
def test_workbench_get_all_observed_data():
od = ObservedData(id=OBSERVED_DATA_ID, **OBSERVED_DATA_KWARGS)
od = ObservedData(
id=constants.OBSERVED_DATA_ID, **constants.OBSERVED_DATA_KWARGS
)
save(od)
resp = observed_data()
assert len(resp) == 1
assert resp[0].id == OBSERVED_DATA_ID
assert resp[0].id == constants.OBSERVED_DATA_ID
def test_workbench_get_all_reports():
rep = Report(id=REPORT_ID, **REPORT_KWARGS)
rep = Report(id=constants.REPORT_ID, **constants.REPORT_KWARGS)
save(rep)
resp = reports()
assert len(resp) == 1
assert resp[0].id == REPORT_ID
assert resp[0].id == constants.REPORT_ID
def test_workbench_get_all_threat_actors():
thr = ThreatActor(id=THREAT_ACTOR_ID, **THREAT_ACTOR_KWARGS)
thr = ThreatActor(
id=constants.THREAT_ACTOR_ID, **constants.THREAT_ACTOR_KWARGS
)
save(thr)
resp = threat_actors()
assert len(resp) == 1
assert resp[0].id == THREAT_ACTOR_ID
assert resp[0].id == constants.THREAT_ACTOR_ID
def test_workbench_get_all_tools():
tool = Tool(id=TOOL_ID, **TOOL_KWARGS)
tool = Tool(id=constants.TOOL_ID, **constants.TOOL_KWARGS)
save(tool)
resp = tools()
assert len(resp) == 1
assert resp[0].id == TOOL_ID
assert resp[0].id == constants.TOOL_ID
def test_workbench_get_all_vulnerabilities():
vuln = Vulnerability(id=VULNERABILITY_ID, **VULNERABILITY_KWARGS)
vuln = Vulnerability(
id=constants.VULNERABILITY_ID, **constants.VULNERABILITY_KWARGS
)
save(vuln)
resp = vulnerabilities()
assert len(resp) == 1
assert resp[0].id == VULNERABILITY_ID
assert resp[0].id == constants.VULNERABILITY_ID
def test_workbench_add_to_bundle():
vuln = Vulnerability(**VULNERABILITY_KWARGS)
vuln = Vulnerability(**constants.VULNERABILITY_KWARGS)
bundle = Bundle(vuln)
assert bundle.objects[0].name == 'Heartbleed'
def test_workbench_relationships():
rel = Relationship(INDICATOR_ID, 'indicates', MALWARE_ID)
rel = Relationship(
constants.INDICATOR_ID, 'indicates', constants.MALWARE_ID,
)
save(rel)
ind = get(INDICATOR_ID)
ind = get(constants.INDICATOR_ID)
resp = ind.relationships()
assert len(resp) == 1
assert resp[0].relationship_type == 'indicates'
assert resp[0].source_ref == INDICATOR_ID
assert resp[0].target_ref == MALWARE_ID
assert resp[0].source_ref == constants.INDICATOR_ID
assert resp[0].target_ref == constants.MALWARE_ID
def test_workbench_created_by():
intset = IntrusionSet(name="Breach 123", created_by_ref=IDENTITY_ID)
intset = IntrusionSet(
name="Breach 123", created_by_ref=constants.IDENTITY_ID,
)
save(intset)
creator = intset.created_by()
assert creator.id == IDENTITY_ID
assert creator.id == constants.IDENTITY_ID
def test_workbench_related():
rel1 = Relationship(MALWARE_ID, 'targets', IDENTITY_ID)
rel2 = Relationship(CAMPAIGN_ID, 'uses', MALWARE_ID)
rel1 = Relationship(constants.MALWARE_ID, 'targets', constants.IDENTITY_ID)
rel2 = Relationship(constants.CAMPAIGN_ID, 'uses', constants.MALWARE_ID)
save([rel1, rel2])
resp = get(MALWARE_ID).related()
resp = get(constants.MALWARE_ID).related()
assert len(resp) == 3
assert any(x['id'] == CAMPAIGN_ID for x in resp)
assert any(x['id'] == INDICATOR_ID for x in resp)
assert any(x['id'] == IDENTITY_ID for x in resp)
assert any(x['id'] == constants.CAMPAIGN_ID for x in resp)
assert any(x['id'] == constants.INDICATOR_ID for x in resp)
assert any(x['id'] == constants.IDENTITY_ID for x in resp)
resp = get(MALWARE_ID).related(relationship_type='indicates')
resp = get(constants.MALWARE_ID).related(relationship_type='indicates')
assert len(resp) == 1
def test_workbench_related_with_filters():
malware = Malware(labels=["ransomware"], name="CryptorBit", created_by_ref=IDENTITY_ID)
rel = Relationship(malware.id, 'variant-of', MALWARE_ID)
malware = Malware(
labels=["ransomware"], name="CryptorBit", created_by_ref=constants.IDENTITY_ID,
)
rel = Relationship(malware.id, 'variant-of', constants.MALWARE_ID)
save([malware, rel])
filters = [Filter('created_by_ref', '=', IDENTITY_ID)]
resp = get(MALWARE_ID).related(filters=filters)
filters = [Filter('created_by_ref', '=', constants.IDENTITY_ID)]
resp = get(constants.MALWARE_ID).related(filters=filters)
assert len(resp) == 1
assert resp[0].name == malware.name
assert resp[0].created_by_ref == IDENTITY_ID
assert resp[0].created_by_ref == constants.IDENTITY_ID
# filters arg can also be single filter
resp = get(MALWARE_ID).related(filters=filters[0])
resp = get(constants.MALWARE_ID).related(filters=filters[0])
assert len(resp) == 1
def test_add_data_source():
fs_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "stix2_data")
fs = FileSystemSource(fs_path)
fs = FileSystemSource(_STIX_DATA_PATH)
add_data_source(fs)
resp = tools()
assert len(resp) == 3
resp_ids = [tool.id for tool in resp]
assert TOOL_ID in resp_ids
assert constants.TOOL_ID in resp_ids
assert 'tool--03342581-f790-4f03-ba41-e82e67392e23' in resp_ids
assert 'tool--242f3da3-4425-4d11-8f5c-b842886da966' in resp_ids
@ -225,56 +244,74 @@ def test_additional_filter():
def test_additional_filters_list():
resp = tools([Filter('created_by_ref', '=', 'identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5'),
Filter('name', '=', 'Windows Credential Editor')])
resp = tools([
Filter('created_by_ref', '=', 'identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5'),
Filter('name', '=', 'Windows Credential Editor'),
])
assert len(resp) == 1
def test_default_creator():
set_default_creator(IDENTITY_ID)
campaign = Campaign(**CAMPAIGN_KWARGS)
set_default_creator(constants.IDENTITY_ID)
campaign = Campaign(**constants.CAMPAIGN_KWARGS)
assert 'created_by_ref' not in CAMPAIGN_KWARGS
assert campaign.created_by_ref == IDENTITY_ID
assert 'created_by_ref' not in constants.CAMPAIGN_KWARGS
assert campaign.created_by_ref == constants.IDENTITY_ID
# turn off side-effects to avoid affecting future tests
set_default_creator(None)
def test_default_created_timestamp():
timestamp = "2018-03-19T01:02:03.000Z"
set_default_created(timestamp)
campaign = Campaign(**CAMPAIGN_KWARGS)
campaign = Campaign(**constants.CAMPAIGN_KWARGS)
assert 'created' not in CAMPAIGN_KWARGS
assert 'created' not in constants.CAMPAIGN_KWARGS
assert stix2.utils.format_datetime(campaign.created) == timestamp
assert stix2.utils.format_datetime(campaign.modified) == timestamp
# turn off side-effects to avoid affecting future tests
set_default_created(None)
def test_default_external_refs():
ext_ref = ExternalReference(source_name="ACME Threat Intel",
description="Threat report")
ext_ref = ExternalReference(
source_name="ACME Threat Intel",
description="Threat report",
)
set_default_external_refs(ext_ref)
campaign = Campaign(**CAMPAIGN_KWARGS)
campaign = Campaign(**constants.CAMPAIGN_KWARGS)
assert campaign.external_references[0].source_name == "ACME Threat Intel"
assert campaign.external_references[0].description == "Threat report"
# turn off side-effects to avoid affecting future tests
set_default_external_refs([])
def test_default_object_marking_refs():
stmt_marking = StatementMarking("Copyright 2016, Example Corp")
mark_def = MarkingDefinition(definition_type="statement",
definition=stmt_marking)
mark_def = MarkingDefinition(
definition_type="statement",
definition=stmt_marking,
)
set_default_object_marking_refs(mark_def)
campaign = Campaign(**CAMPAIGN_KWARGS)
campaign = Campaign(**constants.CAMPAIGN_KWARGS)
assert campaign.object_marking_refs[0] == mark_def.id
# turn off side-effects to avoid affecting future tests
set_default_object_marking_refs([])
def test_workbench_custom_property_object_in_observable_extension():
ntfs = stix2.NTFSExt(
ntfs = NTFSExt(
allow_custom=True,
sid=1,
x_foo='bar',
)
artifact = stix2.File(
artifact = File(
name='test',
extensions={'ntfs-ext': ntfs},
)
@ -282,7 +319,7 @@ def test_workbench_custom_property_object_in_observable_extension():
allow_custom=True,
first_observed="2015-12-21T19:00:00Z",
last_observed="2015-12-21T19:00:00Z",
number_observed=0,
number_observed=1,
objects={"0": artifact},
)
@ -291,7 +328,7 @@ def test_workbench_custom_property_object_in_observable_extension():
def test_workbench_custom_property_dict_in_observable_extension():
artifact = stix2.File(
artifact = File(
allow_custom=True,
name='test',
extensions={
@ -299,14 +336,14 @@ def test_workbench_custom_property_dict_in_observable_extension():
'allow_custom': True,
'sid': 1,
'x_foo': 'bar',
}
},
},
)
observed_data = ObservedData(
allow_custom=True,
first_observed="2015-12-21T19:00:00Z",
last_observed="2015-12-21T19:00:00Z",
number_observed=0,
number_observed=1,
objects={"0": artifact},
)

@ -4,8 +4,9 @@ import pytest
import stix2
from .constants import (FAKE_TIME, INDICATOR_KWARGS, MALWARE_KWARGS,
RELATIONSHIP_KWARGS)
from .constants import (
FAKE_TIME, INDICATOR_KWARGS, MALWARE_KWARGS, RELATIONSHIP_KWARGS,
)
# Inspired by: http://stackoverflow.com/a/24006251
@ -35,17 +36,17 @@ def uuid4(monkeypatch):
@pytest.fixture
def indicator(uuid4, clock):
return stix2.Indicator(**INDICATOR_KWARGS)
return stix2.v20.Indicator(**INDICATOR_KWARGS)
@pytest.fixture
def malware(uuid4, clock):
return stix2.Malware(**MALWARE_KWARGS)
return stix2.v20.Malware(**MALWARE_KWARGS)
@pytest.fixture
def relationship(uuid4, clock):
return stix2.Relationship(**RELATIONSHIP_KWARGS)
return stix2.v20.Relationship(**RELATIONSHIP_KWARGS)
@pytest.fixture
@ -54,61 +55,97 @@ def stix_objs1():
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind2 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind3 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.936Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind4 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind5 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
return [ind1, ind2, ind3, ind4, ind5]
@pytest.fixture
def stix_objs1_manifests():
# Tests against latest medallion (TAXII 2.1)
ind1 = {
"date_added": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"media_type": "application/stix+json;version=2.1",
"version": "2017-01-27T13:49:53.935Z",
}
ind2 = {
"date_added": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"media_type": "application/stix+json;version=2.1",
"version": "2017-01-27T13:49:53.935Z",
}
ind3 = {
"date_added": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"media_type": "application/stix+json;version=2.1",
"version": "2017-01-27T13:49:53.936Z",
}
ind4 = {
"date_added": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"media_type": "application/stix+json;version=2.1",
"version": "2017-01-27T13:49:53.935Z",
}
ind5 = {
"date_added": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"media_type": "application/stix+json;version=2.1",
"version": "2017-01-27T13:49:53.935Z",
}
return [ind1, ind2, ind3, ind4, ind5]
@ -119,41 +156,41 @@ def stix_objs2():
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000001",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-31T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind7 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
ind8 = {
"created": "2017-01-27T13:49:53.935Z",
"id": "indicator--00000000-0000-4000-8000-000000000002",
"labels": [
"url-watchlist"
"url-watchlist",
],
"modified": "2017-01-27T13:49:53.935Z",
"name": "Malicious site hosting downloader",
"pattern": "[url:value = 'http://x4z9arb.cn/4712']",
"type": "indicator",
"valid_from": "2017-01-27T13:49:53.935382Z"
"valid_from": "2017-01-27T13:49:53.935382Z",
}
return [ind6, ind7, ind8]
@pytest.fixture
def real_stix_objs2(stix_objs2):
return [stix2.parse(x) for x in stix_objs2]
return [stix2.parse(x, version="2.0") for x in stix_objs2]


@ -12,6 +12,7 @@ INDICATOR_ID = "indicator--a740531e-63ff-4e49-a9e1-a0a3eed0e3e7"
INTRUSION_SET_ID = "intrusion-set--4e78f46f-a023-4e5f-bc24-71b3ca22ec29"
MALWARE_ID = "malware--9c4638ec-f1de-4ddb-abf4-1b760417654e"
MARKING_DEFINITION_ID = "marking-definition--613f2e26-407d-48c7-9eca-b8e91df99dc9"
NOTE_ID = "note--0c7b5b88-8ff7-4a4d-aa9d-feb398cd0061"
OBSERVED_DATA_ID = "observed-data--b67d30ff-02ac-498a-92f9-32f845f448cf"
RELATIONSHIP_ID = "relationship--df7c87eb-75d2-4948-af81-9d49d246f301"
REPORT_ID = "report--84e4d88f-44ea-4bcd-bbf3-b2c1c320bcb3"
@ -31,7 +32,7 @@ MARKING_IDS = [
RELATIONSHIP_IDS = [
'relationship--06520621-5352-4e6a-b976-e8fa3d437ffd',
'relationship--181c9c09-43e6-45dd-9374-3bec192f05ef',
'relationship--a0cbb21c-8daf-4a7f-96aa-7155a4ef8f70'
'relationship--a0cbb21c-8daf-4a7f-96aa-7155a4ef8f70',
]
# *_KWARGS contains all required arguments to create an instance of that STIX object
@ -49,7 +50,7 @@ CAMPAIGN_KWARGS = dict(
CAMPAIGN_MORE_KWARGS = dict(
type='campaign',
id=CAMPAIGN_ID,
created_by_ref="identity--f431f809-377b-45e0-aa1c-6a4751cae5ff",
created_by_ref=IDENTITY_ID,
created="2016-04-06T20:03:00.000Z",
modified="2016-04-06T20:03:00.000Z",
name="Green Group Attacks Against Finance",
@ -86,7 +87,7 @@ MALWARE_MORE_KWARGS = dict(
modified="2016-04-06T20:03:00.000Z",
labels=['ransomware'],
name="Cryptolocker",
description="A ransomware related to ..."
description="A ransomware related to ...",
)
OBSERVED_DATA_KWARGS = dict(
@ -97,8 +98,8 @@ OBSERVED_DATA_KWARGS = dict(
"0": {
"type": "windows-registry-key",
"key": "HKEY_LOCAL_MACHINE\\System\\Foo\\Bar",
}
}
},
},
)
REPORT_KWARGS = dict(


@ -2,7 +2,7 @@
"id": "bundle--f68640b4-0cdc-42ae-b176-def1754a1ea0",
"objects": [
{
"created": "2017-05-31T21:30:19.73501Z",
"created": "2017-05-31T21:30:19.735Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Credential dumping is the process of obtaining account login and password information from the operating system and software. Credentials can be used to perform Windows Credential Editor, Mimikatz, and gsecdump. These tools are in use by both professional security testers and adversaries.\n\nPlaintext passwords can be obtained using tools such as Mimikatz to extract passwords stored by the Local Security Authority (LSA). If smart cards are used to authenticate to a domain using a personal identification number (PIN), then that PIN is also cached as a result and may be dumped.Mimikatz access the LSA Subsystem Service (LSASS) process by opening the process, locating the LSA secrets key, and decrypting the sections in memory where credential details are stored. Credential dumpers may also use methods for reflective DLL Injection to reduce potential indicators of malicious activity.\n\nNTLM hash dumpers open the Security Accounts Manager (SAM) on the local file system (%SystemRoot%/system32/config/SAM) or create a dump of the Registry SAM key to access stored account password hashes. Some hash dumpers will open the local file system as a device and parse to the SAM table to avoid file access defenses. Others will make an in-memory copy of the SAM table before reading hashes. Detection of compromised Legitimate Credentials in-use by adversaries may help as well. \n\nOn Windows 8.1 and Windows Server 2012 R2, monitor Windows Logs for LSASS.exe creation to verify that LSASS started as a protected process.\n\nMonitor processes and command-line arguments for program execution that may be indicative of credential dumping. Remote access tools may contain built-in features or incorporate existing tools like Mimikatz. PowerShell scripts also exist that contain credential dumping functionality, such as PowerSploit's Invoke-Mimikatz module,[[Citation: Powersploit]] which may require additional logging features to be configured in the operating system to collect necessary information for analysis.\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: API monitoring, Process command-line parameters, Process monitoring, PowerShell logs",
"external_references": [
@ -29,7 +29,7 @@
"phase_name": "credential-access"
}
],
"modified": "2017-05-31T21:30:19.73501Z",
"modified": "2017-05-31T21:30:19.735Z",
"name": "Credential Dumping",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@ -2,7 +2,7 @@
"id": "bundle--b07d6fd6-7cc5-492d-a1eb-9ba956b329d5",
"objects": [
{
"created": "2017-05-31T21:30:26.496201Z",
"created": "2017-05-31T21:30:26.496Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Rootkits are programs that hide the existence of malware by intercepting and modifying operating system API calls that supply system information. Rootkits or rootkit enabling functionality may reside at the user or kernel level in the operating system or lower, to include a Hypervisor, Master Boot Record, or the Basic Input/Output System.[[Citation: Wikipedia Rootkit]]\n\nAdversaries may use rootkits to hide the presence of programs, files, network connections, services, drivers, and other system components.\n\nDetection: Some rootkit protections may be built into anti-virus or operating system software. There are dedicated rootkit detection tools that look for specific types of rootkit behavior. Monitor for the existence of unrecognized DLLs, devices, services, and changes to the MBR.[[Citation: Wikipedia Rootkit]]\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: BIOS, MBR, System calls",
"external_references": [
@ -24,7 +24,7 @@
"phase_name": "defense-evasion"
}
],
"modified": "2017-05-31T21:30:26.496201Z",
"modified": "2017-05-31T21:30:26.496Z",
"name": "Rootkit",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@ -2,7 +2,7 @@
"id": "bundle--1a854c96-639e-4771-befb-e7b960a65974",
"objects": [
{
"created": "2017-05-31T21:30:29.45894Z",
"created": "2017-05-31T21:30:29.458Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Data, such as sensitive documents, may be exfiltrated through the use of automated processing or Scripting after being gathered during Exfiltration Over Command and Control Channel and Exfiltration Over Alternative Protocol.\n\nDetection: Monitor process file access patterns and network behavior. Unrecognized processes or scripts that appear to be traversing file systems and sending network traffic may be suspicious.\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: File monitoring, Process monitoring, Process use of network",
"external_references": [
@ -19,7 +19,7 @@
"phase_name": "exfiltration"
}
],
"modified": "2017-05-31T21:30:29.45894Z",
"modified": "2017-05-31T21:30:29.458Z",
"name": "Automated Exfiltration",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@ -2,7 +2,7 @@
"id": "bundle--33e3e33a-38b8-4a37-9455-5b8c82d3b10a",
"objects": [
{
"created": "2017-05-31T21:30:45.139269Z",
"created": "2017-05-31T21:30:45.139Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Adversaries may attempt to get a listing of network connections to or from the compromised system.\nUtilities and commands that acquire this information include netstat, \"net use,\" and \"net session\" with Net.\n\nDetection: System and network discovery techniques normally occur throughout an operation as an adversary learns the environment. Data and events should not be viewed in isolation, but as part of a chain of behavior that could lead to other activities, such as Windows Management Instrumentation and PowerShell.\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: Process command-line parameters, Process monitoring",
"external_references": [
@ -19,7 +19,7 @@
"phase_name": "discovery"
}
],
"modified": "2017-05-31T21:30:45.139269Z",
"modified": "2017-05-31T21:30:45.139Z",
"name": "Local Network Connections Discovery",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@ -2,7 +2,7 @@
"id": "bundle--a87938c5-cc1e-4e06-a8a3-b10243ae397d",
"objects": [
{
"created": "2017-05-31T21:30:41.022897Z",
"created": "2017-05-31T21:30:41.022Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Sensitive data can be collected from remote systems via shared network drives (host shared directory, network file server, etc.) that are accessible from the current system prior to cmd may be used to gather information.\n\nDetection: Monitor processes and command-line arguments for actions that could be taken to collect files from a network share. Remote access tools with built-in features may interact directly with the Windows API to gather data. Data may also be acquired through Windows system management tools such as Windows Management Instrumentation and PowerShell.\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: File monitoring, Process monitoring, Process command-line parameters",
"external_references": [
@ -19,7 +19,7 @@
"phase_name": "collection"
}
],
"modified": "2017-05-31T21:30:41.022897Z",
"modified": "2017-05-31T21:30:41.022Z",
"name": "Data from Network Shared Drive",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@ -2,7 +2,7 @@
"id": "bundle--5ddaeff9-eca7-4094-9e65-4f53da21a444",
"objects": [
{
"created": "2017-05-31T21:30:32.662702Z",
"created": "2017-05-31T21:30:32.662Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Adversaries may attempt to make an executable or file difficult to discover or analyze by encrypting, encoding, or otherwise obfuscating its contents on the system.\n\nDetection: Detection of file obfuscation is difficult unless artifacts are left behind by the obfuscation process that are uniquely detectable with a signature. If detection of the obfuscation itself is not possible, it may be possible to detect the malicious activity that caused the obfuscated file (for example, the method that was used to write, read, or modify the file on the file system).\n\nPlatforms: Windows Server 2003, Windows Server 2008, Windows Server 2012, Windows XP, Windows 7, Windows 8, Windows Server 2003 R2, Windows Server 2008 R2, Windows Server 2012 R2, Windows Vista, Windows 8.1\n\nData Sources: Network protocol analysis, Process use of network, Binary file metadata, File monitoring, Malware reverse engineering",
"external_references": [
@@ -19,7 +19,7 @@
"phase_name": "defense-evasion"
}
],
"modified": "2017-05-31T21:30:32.662702Z",
"modified": "2017-05-31T21:30:32.662Z",
"name": "Obfuscated Files or Information",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -2,11 +2,11 @@
"id": "bundle--a42d26fe-c938-4074-a1b3-50d852e6f0bd",
"objects": [
{
"created": "2017-05-31T21:30:26.495974Z",
"created": "2017-05-31T21:30:26.495Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Identify potentially malicious software that may contain rootkit functionality, and audit and/or block it by using whitelisting[[CiteRef::Beechey 2010]] tools, like AppLocker,[[CiteRef::Windows Commands JPCERT]][[CiteRef::NSA MS AppLocker]] or Software Restriction Policies[[CiteRef::Corio 2008]] where appropriate.[[CiteRef::TechNet Applocker vs SRP]]",
"id": "course-of-action--95ddb356-7ba0-4bd9-a889-247262b8946f",
"modified": "2017-05-31T21:30:26.495974Z",
"modified": "2017-05-31T21:30:26.495Z",
"name": "Rootkit Mitigation",
"type": "course-of-action"
}


@@ -1,9 +1,9 @@
{
"created": "2017-05-31T21:30:41.022744Z",
"created": "2017-05-31T21:30:41.022Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Identify unnecessary system utilities or potentially malicious software that may be used to collect data from a network share, and audit and/or block them by using whitelisting[[CiteRef::Beechey 2010]] tools, like AppLocker,[[CiteRef::Windows Commands JPCERT]][[CiteRef::NSA MS AppLocker]] or Software Restriction Policies[[CiteRef::Corio 2008]] where appropriate.[[CiteRef::TechNet Applocker vs SRP]]",
"id": "course-of-action--d9727aee-48b8-4fdb-89e2-4c49746ba4dd",
"modified": "2017-05-31T21:30:41.022744Z",
"modified": "2017-05-31T21:30:41.022Z",
"name": "Data from Network Shared Drive Mitigation",
"type": "course-of-action"
}


@@ -2,10 +2,10 @@
"id": "bundle--81884287-2548-47fc-a997-39489ddd5462",
"objects": [
{
"created": "2017-06-01T00:00:00Z",
"created": "2017-06-01T00:00:00.000Z",
"id": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"identity_class": "organization",
"modified": "2017-06-01T00:00:00Z",
"modified": "2017-06-01T00:00:00.000Z",
"name": "The MITRE Corporation",
"type": "identity"
}


@@ -0,0 +1,11 @@
{
"type": "identity",
"id": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"created": "2017-06-01T00:00:00.000Z",
"modified": "2018-11-01T23:24:48.446Z",
"name": "The MITRE Corporation",
"identity_class": "organization",
"labels": [
"version two"
]
}


@@ -10,7 +10,7 @@
"PinkPanther",
"Black Vine"
],
"created": "2017-05-31T21:31:49.412497Z",
"created": "2017-05-31T21:31:49.412Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Deep Panda is a suspected Chinese threat group known to target many industries, including government, defense, financial, and telecommunications.Deep Panda.Deep Panda also appears to be known as Black Vine based on the attribution of both group names to the Anthem intrusion.[[Citation: Symantec Black Vine]]",
"external_references": [
@@ -41,7 +41,7 @@
}
],
"id": "intrusion-set--a653431d-6a5e-4600-8ad3-609b5af57064",
"modified": "2017-05-31T21:31:49.412497Z",
"modified": "2017-05-31T21:31:49.412Z",
"name": "Deep Panda",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -5,7 +5,7 @@
"aliases": [
"DragonOK"
],
"created": "2017-05-31T21:31:53.197755Z",
"created": "2017-05-31T21:31:53.197Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "DragonOK is a threat group that has targeted Japanese organizations with phishing emails. Due to overlapping TTPs, including similar custom tools, DragonOK is thought to have a direct or indirect relationship with the threat group Moafee. [[Citation: Operation Quantum Entanglement]][[Citation: Symbiotic APT Groups]] It is known to use a variety of malware, including Sysget/HelloBridge, PlugX, PoisonIvy, FormerFirstRat, NFlog, and NewCT. [[Citation: New DragonOK]]",
"external_references": [
@@ -31,7 +31,7 @@
}
],
"id": "intrusion-set--f3bdec95-3d62-42d9-a840-29630f6cdc1a",
"modified": "2017-05-31T21:31:53.197755Z",
"modified": "2017-05-31T21:31:53.197Z",
"name": "DragonOK",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -0,0 +1,27 @@
{
"type": "malware",
"id": "malware--6b616fc1-1505-48e3-8b2c-0d19337bff38",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"created": "2017-05-31T21:32:58.226Z",
"modified": "2018-11-16T22:54:20.390Z",
"name": "Rover",
"description": "Rover is malware suspected of being used for espionage purposes. It was used in 2015 in a targeted email sent to an Indian Ambassador to Afghanistan.[[Citation: Palo Alto Rover]]",
"labels": [
"version four"
],
"external_references": [
{
"source_name": "mitre-attack",
"url": "https://attack.mitre.org/wiki/Software/S0090",
"external_id": "S0090"
},
{
"source_name": "Palo Alto Rover",
"description": "Ray, V., Hayashi, K. (2016, February 29). New Malware \u2018Rover\u2019 Targets Indian Ambassador to Afghanistan. Retrieved February 29, 2016.",
"url": "http://researchcenter.paloaltonetworks.com/2016/02/new-malware-rover-targets-indian-ambassador-to-afghanistan/"
}
],
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
]
}


@@ -2,7 +2,7 @@
"id": "bundle--f64de948-7067-4534-8018-85f03d470625",
"objects": [
{
"created": "2017-05-31T21:32:58.226477Z",
"created": "2017-05-31T21:32:58.226Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Rover is malware suspected of being used for espionage purposes. It was used in 2015 in a targeted email sent to an Indian Ambassador to Afghanistan.[[Citation: Palo Alto Rover]]",
"external_references": [
@@ -21,7 +21,7 @@
"labels": [
"malware"
],
"modified": "2017-05-31T21:32:58.226477Z",
"modified": "2017-05-31T21:32:58.226Z",
"name": "Rover",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -0,0 +1,27 @@
{
"type": "malware",
"id": "malware--6b616fc1-1505-48e3-8b2c-0d19337bff38",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"created": "2017-05-31T21:32:58.226Z",
"modified": "2018-11-01T23:24:48.456Z",
"name": "Rover",
"description": "Rover is malware suspected of being used for espionage purposes. It was used in 2015 in a targeted email sent to an Indian Ambassador to Afghanistan.[[Citation: Palo Alto Rover]]",
"labels": [
"version two"
],
"external_references": [
{
"source_name": "mitre-attack",
"url": "https://attack.mitre.org/wiki/Software/S0090",
"external_id": "S0090"
},
{
"source_name": "Palo Alto Rover",
"description": "Ray, V., Hayashi, K. (2016, February 29). New Malware \u2018Rover\u2019 Targets Indian Ambassador to Afghanistan. Retrieved February 29, 2016.",
"url": "http://researchcenter.paloaltonetworks.com/2016/02/new-malware-rover-targets-indian-ambassador-to-afghanistan/"
}
],
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
]
}


@@ -0,0 +1,27 @@
{
"type": "malware",
"id": "malware--6b616fc1-1505-48e3-8b2c-0d19337bff38",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"created": "2017-05-31T21:32:58.226Z",
"modified": "2018-11-01T23:24:48.457Z",
"name": "Rover",
"description": "Rover is malware suspected of being used for espionage purposes. It was used in 2015 in a targeted email sent to an Indian Ambassador to Afghanistan.[[Citation: Palo Alto Rover]]",
"labels": [
"version three"
],
"external_references": [
{
"source_name": "mitre-attack",
"url": "https://attack.mitre.org/wiki/Software/S0090",
"external_id": "S0090"
},
{
"source_name": "Palo Alto Rover",
"description": "Ray, V., Hayashi, K. (2016, February 29). New Malware \u2018Rover\u2019 Targets Indian Ambassador to Afghanistan. Retrieved February 29, 2016.",
"url": "http://researchcenter.paloaltonetworks.com/2016/02/new-malware-rover-targets-indian-ambassador-to-afghanistan/"
}
],
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
]
}
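
Note: the three new Rover fixtures above reuse the same id (malware--6b616fc1-1505-48e3-8b2c-0d19337bff38) and differ only in their labels and modified timestamps; the "version two" and "version three" entries are one millisecond apart (.456Z vs .457Z), so millisecond precision is exactly what keeps these versions distinct. As a rough, standard-library-only sketch (illustrative only, not the library's own versioning or datastore code), the newest version can be selected by parsing modified:

from datetime import datetime, timezone

# Values copied from the three new Rover fixtures above.
rover_versions = [
    {"labels": ["version two"], "modified": "2018-11-01T23:24:48.456Z"},
    {"labels": ["version three"], "modified": "2018-11-01T23:24:48.457Z"},
    {"labels": ["version four"], "modified": "2018-11-16T22:54:20.390Z"},
]

def parse_ts(ts):
    # These fixtures use UTC timestamps with fractional seconds.
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)

latest = max(rover_versions, key=lambda obj: parse_ts(obj["modified"]))
print(latest["labels"])  # ['version four']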


@@ -2,7 +2,7 @@
"id": "bundle--c633942b-545c-4c87-91b7-9fe5740365e0",
"objects": [
{
"created": "2017-05-31T21:33:26.565056Z",
"created": "2017-05-31T21:33:26.565Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "RTM is custom malware written in Delphi. It is used by the group of the same name (RTM).[[Citation: ESET RTM Feb 2017]]",
"external_references": [
@@ -21,7 +21,7 @@
"labels": [
"malware"
],
"modified": "2017-05-31T21:33:26.565056Z",
"modified": "2017-05-31T21:33:26.565Z",
"name": "RTM",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -2,7 +2,7 @@
"id": "bundle--09ce4338-8741-4fcf-9738-d216c8e40974",
"objects": [
{
"created": "2017-05-31T21:32:48.482655Z",
"created": "2017-05-31T21:32:48.482Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "Sakula is a remote access tool (RAT) that first surfaced in 2012 and was used in intrusions throughout 2015.[[Citation: Dell Sakula]]\n\nAliases: Sakula, Sakurel, VIPER",
"external_references": [
@@ -21,7 +21,7 @@
"labels": [
"malware"
],
"modified": "2017-05-31T21:32:48.482655Z",
"modified": "2017-05-31T21:32:48.482Z",
"name": "Sakula",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -2,7 +2,7 @@
"id": "bundle--611947ce-ae3b-4fdb-b297-aed8eab22e4f",
"objects": [
{
"created": "2017-05-31T21:32:15.263882Z",
"created": "2017-05-31T21:32:15.263Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"description": "PoisonIvy is a popular remote access tool (RAT) that has been used by many groups.[[Citation: FireEye Poison Ivy]]\n\nAliases: PoisonIvy, Poison Ivy",
"external_references": [
@@ -21,7 +21,7 @@
"labels": [
"malware"
],
"modified": "2017-05-31T21:32:15.263882Z",
"modified": "2017-05-31T21:32:15.263Z",
"name": "PoisonIvy",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"


@@ -2,10 +2,10 @@
"id": "bundle--7e715462-dd9d-40b9-968a-10ef0ecf126d",
"objects": [
{
"created": "2017-05-31T21:33:27.182784Z",
"created": "2017-05-31T21:33:27.182Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--0d4a7788-7f3b-4df8-a498-31a38003c883",
"modified": "2017-05-31T21:33:27.182784Z",
"modified": "2017-05-31T21:33:27.182Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--a53eef35-abfc-4bcd-b84e-a048f7b4a9bf",
"objects": [
{
"created": "2017-05-31T21:33:27.082801Z",
"created": "2017-05-31T21:33:27.082Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--0e55ee98-0c6d-43d4-b424-b18a0036b227",
"modified": "2017-05-31T21:33:27.082801Z",
"modified": "2017-05-31T21:33:27.082Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--0b9f6412-314f-44e3-8779-9738c9578ef5",
"objects": [
{
"created": "2017-05-31T21:33:27.018782Z",
"created": "2017-05-31T21:33:27.018Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--1e91cd45-a725-4965-abe3-700694374432",
"modified": "2017-05-31T21:33:27.018782Z",
"modified": "2017-05-31T21:33:27.018Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--6d5b04a8-efb2-4179-990e-74f1dcc76e0c",
"objects": [
{
"created": "2017-05-31T21:33:27.100701Z",
"created": "2017-05-31T21:33:27.100Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--3a3084f9-0302-4fd5-9b8a-e0db10f5345e",
"modified": "2017-05-31T21:33:27.100701Z",
"modified": "2017-05-31T21:33:27.100Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--a7efc025-040d-49c7-bf97-e5a1120ecacc",
"objects": [
{
"created": "2017-05-31T21:33:27.143973Z",
"created": "2017-05-31T21:33:27.143Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--3a3ed0b2-0c38-441f-ac40-53b873e545d1",
"modified": "2017-05-31T21:33:27.143973Z",
"modified": "2017-05-31T21:33:27.143Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--9f013d47-7704-41c2-9749-23d0d94af94d",
"objects": [
{
"created": "2017-05-31T21:33:27.021562Z",
"created": "2017-05-31T21:33:27.021Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--592d0c31-e61f-495e-a60e-70d7be59a719",
"modified": "2017-05-31T21:33:27.021562Z",
"modified": "2017-05-31T21:33:27.021Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--15167b24-4cee-4c96-a140-32a6c37df4b4",
"objects": [
{
"created": "2017-05-31T21:33:27.044387Z",
"created": "2017-05-31T21:33:27.044Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--70dc6b5c-c524-429e-a6ab-0dd40f0482c1",
"modified": "2017-05-31T21:33:27.044387Z",
"modified": "2017-05-31T21:33:27.044Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],


@@ -2,10 +2,10 @@
"id": "bundle--ff845dca-7036-416f-aae0-95030994c49f",
"objects": [
{
"created": "2017-05-31T21:33:27.051532Z",
"created": "2017-05-31T21:33:27.051Z",
"created_by_ref": "identity--c78cb6e5-0c4b-4611-8297-d1b8b55e40b5",
"id": "relationship--8797579b-e3be-4209-a71b-255a4d08243d",
"modified": "2017-05-31T21:33:27.051532Z",
"modified": "2017-05-31T21:33:27.051Z",
"object_marking_refs": [
"marking-definition--fa42a846-8d90-4e51-bc29-71d5b4802168"
],

Some files were not shown because too many files have changed in this diff.
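
Note: the timestamp edits in this diff all follow one pattern: created and modified values are truncated from microsecond to millisecond precision (for example, 21:30:41.022897Z becomes 21:30:41.022Z, and the fraction-less 00:00:00Z becomes 00:00:00.000Z). A minimal standard-library sketch of that normalization, assuming truncation rather than rounding as the fixture values above suggest, and not the library's actual serialization helper:

from datetime import datetime, timezone

def to_millisecond_precision(ts):
    # Accept STIX-style UTC timestamps with or without fractional seconds.
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ" if "." in ts else "%Y-%m-%dT%H:%M:%SZ"
    dt = datetime.strptime(ts, fmt).replace(tzinfo=timezone.utc)
    # Keep only the first three fractional digits (truncate, do not round).
    return dt.strftime("%Y-%m-%dT%H:%M:%S.") + "%03dZ" % (dt.microsecond // 1000)

print(to_millisecond_precision("2017-05-31T21:30:41.022897Z"))  # 2017-05-31T21:30:41.022Z
print(to_millisecond_precision("2017-06-01T00:00:00Z"))         # 2017-06-01T00:00:00.000Z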