Compare commits


193 commits (v0.2...master)

Author SHA1 Message Date
Jean-Louis Huynen cb3d618ee1
Merge pull request #50 from cudeso/master
Contributions to the documentation and small type for "registered"
2023-12-23 17:58:53 +01:00
Koen Van Impe 27aa5b1df9 Contributions to the documentation small type for "registered"
- Clarifications for basic install of the client
- Clarifications for basic install of the server
- Fix small types registered instead of registred
2023-12-22 18:31:40 +01:00
Alexandre Dulaunoy 2e8ddd490f
Merge pull request #49 from DocArmoryTech/patch-1
installation refinement
2023-11-28 22:20:44 +01:00
DocArmoryTech 090d0f66bb
Merge pull request #1 from DocArmoryTech/patch-2
Update install_server.sh
2023-11-28 20:36:24 +00:00
DocArmoryTech dfd53c126b
Update install_server.sh
- removed remnant reference to AIL_ENV from AIL install script
- removed non-existent flags from LAUNCH.sh
2023-11-28 20:30:33 +00:00
DocArmoryTech 6273a220b2
Update requirement.txt
set Flask version to 2.2.2
added Werkzeug version to match

Running Ubuntu 20.04 using Python 3.8.10
 - the latest version of flask causes the server to fail to start (cannot import name 'escape' from 'flask')
 - after fixing flask at version 2.2.2, the server again failed to start (ImportError: cannot import name 'url_quote' from 'werkzeug.urls')
2023-11-28 18:51:38 +00:00
Jean-Louis Huynen 81686aa022
fix: [web] fix #47 2023-03-02 15:41:45 +01:00
Gerard Wagener 399e659d8f fix: [d4-client] Removed hardcoded gcc command from Makefile 2021-10-08 11:32:38 +02:00
Terrtia b2f463e8f1
fix: [d4-server] check HMAC key 2021-04-20 16:42:22 +02:00
Terrtia adf0f6008b
fix: [d4-server] worker launcher: don't add invalid HMAC or empty data stream to workers queue 2021-04-20 15:43:03 +02:00
Terrtia 39d593364d
chg: [D4Server] add server port in config 2021-03-31 11:43:54 +02:00
Terrtia cbb90c057a
merge 2021-03-30 18:28:55 +02:00
Terrtia b7998d5601
chg: [D4server] add shared hmac key in config 2021-03-30 18:27:42 +02:00
Terrtia dc3cdcbc1c
chg: [D4server] add shared hmac key in config 2021-03-30 18:26:15 +02:00
Jean-Louis Huynen 6c3c9f9954
fix #41 2021-02-23 19:54:36 +01:00
Jean-Louis Huynen a6d5a3d22c
chg: [filerwatcher] fix double queue logging for compressed files 2021-02-19 09:45:47 +01:00
Jean-Louis Huynen 36a771ea2d
chg: [filerwatcher] define segragation from MH 2021-02-18 18:04:44 +01:00
Jean-Louis Huynen ef6e87f3c5
chg: [filerwatcher] compression, ext from MH + remove buffer limits 2021-02-18 17:00:55 +01:00
Jean-Louis Huynen 5a3e299332
add: [filerwatcher] enable by_uuid / date filing 2021-02-18 14:37:43 +01:00
Jean-Louis Huynen d74d2fb71a
add: [filerwatcher] +base64 worker 2021-02-17 16:46:33 +01:00
Jean-Louis Huynen cf64529929
add: [filerwatcher] initial json file worker 2021-02-16 16:13:53 +01:00
Terrtia 893631e003
fix: [client] no data: send empty D4 packet 2020-12-02 15:54:20 +01:00
Terrtia ac301b5360
fix: [Flask] fix flask host 2020-11-10 16:12:09 +01:00
Terrtia 04fab82f5e
fic: [Sensors monitoring] typo 2020-11-10 15:09:12 +01:00
Terrtia a297cef179
fic: [Sensors monitoring] fix reload list of sensors to monitor 2020-11-10 15:04:45 +01:00
Terrtia 3edf227cc1
fic: [Sensors monitoring] fix API 2020-11-10 14:41:44 +01:00
Terrtia 2d358918c9
fic: [Sensors monitoring] fct is_sensor_monitored 2020-11-10 14:21:00 +01:00
Terrtia 47f82c8879
fix: [Sensors API] check user token 2020-11-10 11:19:17 +01:00
Terrtia 6fee7df9fe
chg: [Sensors API + UI] add sensors monitoring 2020-11-10 11:11:23 +01:00
Terrtia 4b30072880
fix: [Flask server] typo 2020-09-04 09:31:40 +02:00
Terrtia 7ce265e477
fix: [Flask server] change default host 2020-09-04 09:22:14 +02:00
Terrtia 168c31a5bf
fix: [UI sensor_register role] avoid login + fix error template 2020-08-20 15:33:21 +02:00
Terrtia adda78faad
fix: [UI change password] check user role 2020-08-20 15:14:41 +02:00
Terrtia df482d6ee3
fix: [Analyzer queues] delete an analyzer queue 2020-05-26 09:48:21 +02:00
Terrtia 609402ebf2
fix: [edit user] edit user password 2020-05-26 09:38:24 +02:00
Thirion Aurélien 98e562bd47
Merge pull request #37 from D4-project/gallypette-patch-1
chg: [install] popper folder name changed.
2020-04-06 16:05:30 +02:00
Terrtia f17f80b21c
fix: [Flask cookie name] 2020-04-06 15:33:29 +02:00
Jean-Louis Huynen 00e3ce3437
chg: [install] popper folder name changed. 2020-04-06 15:13:43 +02:00
Terrtia 82b2944119
fix: [Analyzer - close socket] use shutdown fct 2020-03-17 17:58:13 +01:00
Terrtia 7078f341ae
fix: [README] 2020-03-12 11:15:35 +01:00
Terrtia cb1c8c4d65
chg: [README] update screenshot 2020-03-12 11:10:27 +01:00
Terrtia 091994a34d
chg: [server] add screenshots 2020-03-12 10:45:11 +01:00
Terrtia 209cd0500f
chg: [exporter TLS] add client cert 2020-03-10 15:04:29 +01:00
Terrtia 99656658f2
chg: [TLS Exporter] add new analyzer: tls export, fix: #35 2020-03-10 14:43:45 +01:00
Terrtia cdc72e7998
fix: [Analyzer queue 254] fix metatype: push to queue 2020-03-03 15:34:51 +01:00
Terrtia 8a792fe4ba
fix: [Analyzer queue 254] fix list by type 2020-03-03 14:33:54 +01:00
Terrtia ab261a6bd2
chg: [Analyzer Queue] add template: edit queue 2020-03-03 14:14:35 +01:00
Terrtia 14d3a650e5
chg: [Metatype default] send data to analyzer queues by default 2020-03-03 10:57:04 +01:00
Terrtia b48ad52845
chg: [Analyzer Queue] add update script + global queue_uuid by type/extended type 2020-03-03 10:50:08 +01:00
Terrtia 10430135d1
chg; [Analyzer Queue UI] add queue creator template + bug fix 2020-03-02 16:56:20 +01:00
Terrtia 4d55d601a1
chg: [Analyzer Queues] Add queue by group of sensors (TODO: add sensor uuid in the UI) 2020-02-28 16:52:48 +01:00
Terrtia aabf74f2f3
fix: [worker 2 8] fix config redis 2020-01-22 15:39:36 +01:00
Terrtia d3087662a7
fix: [worker 2 8] fix config import 2020-01-22 15:37:06 +01:00
Terrtia 8fa83dd248
fix: [worker 1] fix config import 2020-01-22 14:00:18 +01:00
Terrtia 56e7657253
chg: [worker] add worker 3 2020-01-22 13:52:53 +01:00
Terrtia bb3c1b2676
fix: typo 2020-01-21 11:52:44 +01:00
Terrtia 1c61e1d1fe
fix: [Flask server + cookie session] chg default cookie name (also use port number) + add Flask port number to config 2020-01-21 11:51:42 +01:00
Terrtia f5770b6e60
fix: [install script] cp default config 2019-12-03 10:29:15 +01:00
Terrtia e39ef2c551
fix: typo 2019-11-25 17:14:59 +01:00
Terrtia d01f686514
fix: typo 2019-11-25 17:11:12 +01:00
Terrtia a800e8c8f1
fix: [sys import] 2019-11-25 17:05:15 +01:00
Terrtia 4dc2d1abef
fix: [Redis conn] 2019-11-25 17:04:07 +01:00
Terrtia 5b0b5a6f68
fix: [Redis conn] 2019-11-25 17:01:43 +01:00
Terrtia 8f5a084d32
fix: [Redis conn] 2019-11-25 16:55:21 +01:00
Terrtia 8bf0fe4590
chg: [core] add redis server in config 2019-11-25 16:41:39 +01:00
Terrtia 6f58e862cc
chg: [core] add redis server in config 2019-11-25 16:28:20 +01:00
Terrtia 8db01c389b
chg: [API - sensor registration] add third_party field 2019-10-02 09:48:20 +02:00
Terrtia 9a71a7a892
fix: [conf] remove comment 2019-10-01 13:25:43 +02:00
Terrtia d870819080
chg: [registered sensor] remove registered sensors 2019-10-01 11:47:33 +02:00
Terrtia 0bd02f21d6
chg: [server + UI] change server_mode + display all registered sensors 2019-10-01 11:26:14 +02:00
Terrtia 3ce8557cff
chg: [server + UI + API] add server mode: registration + shared-secret + API-UI: approve/delete pending sensors 2019-09-30 17:07:25 +02:00
Terrtia 336fc7655a
fix: [worker 8] fix save_file 2019-09-30 11:56:19 +02:00
Terrtia b530c67825
chg: [export] add tcp export analyzer 2019-09-25 14:51:08 +02:00
Terrtia 16d9eb2561
fix: [disk stats] 2019-09-24 14:59:55 +02:00
Terrtia c54575ae77
fix: typo 2019-09-24 14:39:53 +02:00
Terrtia 5b320a9470
Merge branch 'master' of https://github.com/D4-project/d4-core 2019-09-24 14:36:14 +02:00
Terrtia 648e406c54
chg: [Sensor registration] handle empty key 2019-09-24 14:35:55 +02:00
Jean-Louis Huynen f5af770516
chg: [install] copy crt and key to Flash folder 2019-09-23 11:32:05 +02:00
Thirion Aurélien 61043d81aa
Merge pull request #29 from D4-project/apiV0
chg: [analyzer + flask] add maltrail worker + fix show_uuid: filter e…
2019-09-19 11:58:14 +02:00
Terrtia 2bc20333a9
chg: [analyzer + flask] add maltrail worker + fix show_uuid: filter empty field 2019-09-19 11:57:18 +02:00
Thirion Aurélien e9ef2d529f
Merge pull request #28 from D4-project/apiV0
chg: [analyzer] add export analyzer: syslog, unix, udp   fix:#27
2019-09-19 09:37:09 +02:00
Terrtia 4ce9888f5d
chg: [analyzer] add export analyzer: syslog, unix, udp fix:#27 2019-09-18 17:16:45 +02:00
Alexandre Dulaunoy ff256984a3
Merge pull request #24 from D4-project/apiV0
Api v0
2019-09-17 09:04:46 +02:00
Terrtia 96cfebd0ea
fix: [Flask auth] add brute force and side-channel protection 2019-09-03 10:43:52 +02:00
Thirion Aurélien f6b6137937
Update README.md 2019-09-02 16:13:40 +02:00
Terrtia eb6ff228e8
chg: [UI] add users management 2019-09-02 16:06:46 +02:00
Terrtia 450f5860e4
chg: [install] create default user + add mail to sensor registration 2019-08-20 16:05:27 +02:00
Terrtia 3630ec0460
fix: [api register_sensor] fix role + endpoint 2019-08-16 17:52:02 +02:00
Terrtia e5720087de
chg: [register sensor] uuid format 2019-08-16 09:52:58 +02:00
Terrtia 15bb67a086
chg: [api] add: register sensor endpoint 2019-08-14 16:42:15 +02:00
Terrtia d722390f89
chg: [Flask server] add restAPI blueprint 2019-08-14 13:58:58 +02:00
Terrtia c8d2b8cb95
chg: [UI] add user management 2019-08-14 12:53:51 +02:00
Thirion Aurélien 113159f820
chg: [README] add client logo 2019-07-11 14:40:40 +02:00
Terrtia 67bf0c3cf0
chg: [client] add client logo 2019-07-11 14:35:34 +02:00
Alexandre Dulaunoy 8bf6bdc1fe
chg: [template] Ko -> Kb 2019-07-03 10:17:14 +02:00
Terrtia 85f2964c6c
chg: [d4-analyzer-stdout] catch FileNotFoundError + only append one file + helper: fix required arguments 2019-06-17 14:44:04 +02:00
Terrtia c19e43c931
chg: [analyzer-stdout] add newline flag + file flag (read file to stdout) 2019-06-17 11:22:25 +02:00
Terrtia 489ce2c955
chg: [analyzer] add default stdout analyzer 2019-06-12 16:51:42 +02:00
Alexandre Dulaunoy 868777eba5
Merge pull request #23 from trolldbois/master
Use Environmental variables for redis & Docker container configuration files
2019-06-12 14:04:24 +02:00
ljaqueme acb20a769b use d4 github. Will works if pull request is accepted and Environmental variables allowed to control REDIS servers config 2019-06-11 16:41:31 -06:00
ljaqueme 91500ba460 that works 2019-06-11 16:37:11 -06:00
ljaqueme c6f21f0b5f that work 2019-06-11 13:49:56 -06:00
ljaqueme 6b5ec52e28 make it docker compliance 2019-06-11 12:27:39 -06:00
Terrtia 3650637ce8
chg: [sensor_status UI] add types popover
TODO: handle extented types description
2019-06-04 13:21:20 +02:00
Terrtia b6df534a72
chg: [sensor_status UI] show connected types by uuid 2019-06-04 11:22:46 +02:00
Terrtia fb15487773
fix: [server] connection lost: avoid none uuid 2019-06-04 09:13:00 +02:00
Terrtia bfc75e0db8
fix: [server] fix extended-types connection (allow concurrent 2/254) + fix extended types metadata + save connected types/extended in DB 2019-06-03 17:29:20 +02:00
Terrtia e6d98d2dbc
chg: [worker 2] add extended types metadata 2019-05-29 12:11:44 +02:00
Terrtia c26e95ce50
chg: [UI uuid_management + sensor_status] add sensor description 2019-05-28 16:17:58 +02:00
Terrtia bf2fce284f
fix: [UI sensor_status] avoid None descriptions 2019-05-28 13:41:19 +02:00
Terrtia 1dd57366c2
chg: [UI sensor_status] refractor: use datatable + display sensor types + description
TODO: edit desription
2019-05-28 13:34:46 +02:00
Terrtia 3a22c250ee
chg: [UI uuid_management] show stats: files save on disk by types
TODO: split 254 by extended type
2019-05-27 15:46:40 +02:00
Terrtia ae2adfe4d6
fix: [server] fix type 2, active_connection keys 2019-04-24 10:53:36 +02:00
Terrtia 40ff019e2f
fix: [server] duplicate connections 2019-04-24 10:43:47 +02:00
Terrtia 0816a93efe
chg: [worker2 ja3] add debug 2019-04-24 10:21:03 +02:00
Terrtia e4e4d8d57e
chg: [worker2 ja3] add debug 2019-04-24 09:59:17 +02:00
Terrtia 7d96e76690
fix: [worker 2] fix JSONDecodeError 2019-04-23 10:10:45 +02:00
Terrtia 87a68494c1
fix: [server] fix type 2/254 error handler and stats 2019-04-10 13:26:09 +02:00
Terrtia c0e441ee6b
fix: [UI v0.3] add sensor link, fix #19 2019-04-10 11:34:01 +02:00
Terrtia 4086b462b7
fix: #19, remove demo button 2019-04-08 17:09:42 +02:00
Gerard Wagener e0d101dc3a chg: [client] try to compile on openbsd 2019-04-05 15:59:02 +02:00
Gerard Wagener 60cfbcb250 Merge branch 'master' of github.com:D4-project/d4-core 2019-04-05 15:07:46 +02:00
Gerard Wagener 88e22b3e62 chg: [client] getentropy is not available on old Linux distributions 2019-04-05 15:02:55 +02:00
Terrtia f726237c65
chg: [worker8] compress files 2019-04-04 14:51:49 +02:00
Terrtia 45546fd4a2
chg: [UI v0.3 uuid_management] add type/uuid barchart 2019-04-04 13:55:56 +02:00
Terrtia 5d85903bf2
fix: [UI v0.3] fix css, sensor_status btn 2019-04-03 14:54:05 +02:00
Terrtia 580b2bd7d8
fix:[UI v0.3] fix index loading time 2019-04-03 14:36:50 +02:00
Terrtia 9144cd8af1
chg: [server] timeout unvalid connections 2019-04-03 14:05:16 +02:00
Terrtia 5d923a39e2
chg: [server] add tcpkeepalive 2019-04-03 11:55:15 +02:00
Terrtia 8da1bce74a
add: [server] stats: type by uuid, #2 #13 #14 2019-04-03 09:55:17 +02:00
Terrtia d07ba92aaf
chg: [server + UI v0.3] kick sensor by uuid + add temp ban + close connection when requested by server 2019-04-02 16:18:37 +02:00
Terrtia b4d0b83b88
chg: [Readme] add troubleshooting 2019-04-01 16:16:37 +02:00
Terrtia f371a2e9ac
chg: [worker 1] add debug 2019-04-01 15:59:10 +02:00
Terrtia a0857974ba
chg: [server_management UI v0.3] add analyzer description, fix #11, description can be updated by adding the same uuid 2019-04-01 11:27:47 +02:00
Terrtia 01c9d99273
chg: [server_management UI v0.3] add analyzer description, fix #11, description can be updated by adding the same uuid 2019-04-01 11:19:31 +02:00
Terrtia 8dbd2cd0b1
fix: [UI v0.3] convert epoch to date, fix #12 2019-04-01 10:34:24 +02:00
Terrtia ea22b5677d
fix: [UI v0.3] convert epoch to date, fix #12 2019-04-01 10:31:10 +02:00
Terrtia 10d29fff8c
fix: [worker 8] join 2019-04-01 09:33:48 +02:00
Terrtia 4caa46a003
fix: [worker 8] join 2019-04-01 09:29:47 +02:00
Terrtia a8014a2efc
fix: [worker 8] use join with bytes 2019-03-29 09:46:46 +01:00
Terrtia bb4998aa4b
chg: [UI v0.2 server_management] add button to generate uuid v4 2019-03-27 12:00:22 +01:00
Jean-Louis Huynen 8b74c98396
Worker 2 pushes to analyzer's redis 2019-03-27 11:09:45 +01:00
Jean-Louis Huynen bb09272e64
typo 2019-03-27 10:36:02 +01:00
Jean-Louis Huynen 17497413a3 how to update web assets 2019-03-27 10:15:57 +01:00
Terrtia 890d2b5e38
chg: [server + workers] add config file + add option to specify save directory 2019-03-26 15:21:36 +01:00
Terrtia f67224e2e9
chg: [server + workers] add config file + add option to specify save directory 2019-03-26 15:20:06 +01:00
Jean-Louis Huynen c7ea2d7863
non interacting 2019-03-22 09:50:02 +01:00
Terrtia f9c6ed3755
fix: [metatypes] add option to save json by uuid 2019-03-19 15:11:47 +01:00
Terrtia 877992ff81
fix: [metatypes] add option to save json by uuid 2019-03-19 14:57:07 +01:00
Terrtia 64f6664611
Merge branch 'master' of https://github.com/D4-project/d4-core 2019-03-19 14:44:58 +01:00
Terrtia 394ed4cc5e
fix: [metatypes] add option to save json by uuid 2019-03-19 14:44:29 +01:00
Jean-Louis Huynen d44b5cedb6 Merge branch 'visu-type' 2019-03-19 14:28:21 +01:00
Terrtia a39d52808c
fix: [worker 2] typo 2019-03-19 14:27:07 +01:00
Terrtia e220c61c73
fix: [worker 2] use file rotation mode 2019-03-19 14:23:55 +01:00
Terrtia 29c259a98e
fix: typo 2019-03-19 14:17:25 +01:00
Terrtia e70904ab20
add [worker 2] add new save mode: save all datas in same directory 2019-03-19 14:10:43 +01:00
Terrtia 3bc20df5da
chg: [UI v0.2] add jquery daterange picker 2019-03-19 09:30:25 +01:00
Jean-Louis Huynen b9d5d52116
Uncomment to enable hot code reloading and debugger 2019-03-18 14:37:42 +01:00
Jean-Louis Huynen 2836414484
compat with python <3.6 2019-03-18 13:00:39 +01:00
Jean-Louis Huynen e4bb9b21fa
Write certificate and json to disk 2019-03-18 11:34:44 +01:00
Jean-Louis Huynen d8a1bfd74f
wip - write raw certs to disk 2019-03-14 17:36:14 +01:00
Terrtia bbb87418b9
fix: [workers 2] buffer 2019-03-14 13:20:06 +01:00
Terrtia 798f9c63d5
chg: [workers] debug: add epoch output 2019-03-14 11:44:56 +01:00
Terrtia 1b3d73e287
Merge branch 'master' of https://github.com/D4-project/d4-core 2019-03-14 11:28:44 +01:00
Terrtia 365f542d1e
chg: [UI v0.2] manage extended accepted types + extended types analyzers 2019-03-14 11:28:33 +01:00
Jean-Louis Huynen 724c4b1a30 Merge branch 'master' of github.com:D4-project/d4-core 2019-03-14 09:44:22 +01:00
Jean-Louis Huynen 0e30b962c9 wip - testing out meta types buffer 2019-03-14 09:44:09 +01:00
Terrtia b6c18f606b
fix: [Flask] fix json type parsing 2019-03-13 11:31:49 +01:00
Terrtia 6a82c5c020
chg: [gitignore] update 2019-03-13 10:42:18 +01:00
Terrtia 5f1afd92d6
chg: [default module 254] cleaning 2019-03-13 10:40:59 +01:00
Terrtia d6d11fecc3
chg: add basic gitignore 2019-03-13 10:16:01 +01:00
Terrtia fa442a2f70
chg: [254 default module] add functions to reconstruct data and handle buffer 2019-03-13 10:04:20 +01:00
Jean-Louis Huynen 6c90b140f7
Fix indent and self 2019-03-12 15:19:18 +01:00
Thirion Aurélien 6bdd70f55c
Merge pull request #9 from D4-project/metatypes
chg: [254 default module] add compress file + send data to analyzers
2019-03-11 16:05:33 +01:00
Terrtia 5891ddef9b
chg: [254 default module] add compress file + send data to analyzers 2019-03-11 16:03:42 +01:00
Thirion Aurélien e24b9f576b
Merge pull request #8 from D4-project/metatypes
Use parent and Child class for 254 types. Use Module (Child Class) to change type handler behaviour
2019-03-11 11:57:58 +01:00
Terrtia 9b310f7498
chg: [worker 254] use module class 2019-03-11 11:54:20 +01:00
Terrtia 3ad297799e
use dynamic Parent/Child class 2019-03-08 17:11:03 +01:00
Jean-Louis Huynen 9c17d74d7d
typo 2019-03-01 15:39:24 +01:00
Terrtia 16f506111c
fix: [LAUNCH] typo 2019-03-01 10:30:04 +01:00
Terrtia 10a335cdf2
chg: [worker 2] save 254 with file rotation 2019-03-01 10:19:04 +01:00
Terrtia 7444bcdf7b
fix: [worker 8] fix buffer concatenation 2019-02-28 16:43:48 +01:00
Terrtia 25d11be213
chg: [worker 2] basic worker 2, save json on disk 2019-02-28 16:35:34 +01:00
Terrtia bd50fea4ef
fix: [doc] add readme link 2019-02-28 11:47:29 +01:00
Terrtia 87ce104ef5
chg: [doc] add basic server readme 2019-02-28 10:36:24 +01:00
Terrtia be3029721b
chg: [server worker2] add worker2 v0.0 + detect type change 2019-02-27 15:46:34 +01:00
Terrtia 711e44d24d
chg: [server UI] empty analyzer queues 2019-02-20 10:04:44 +01:00
Terrtia fca82b15b1
fix: [worker 8] use binary type 2019-02-20 09:08:28 +01:00
Thirion Aurélien 3162f44e36
chg: [update readme] 2019-02-14 16:38:08 +01:00
Terrtia 68cefff471
Merge branch 'master' of https://github.com/D4-project/d4-core 2019-02-14 16:35:02 +01:00
Terrtia 1dac8d336d
chg: [doc] update screenshots 2019-02-14 16:34:30 +01:00
Alexandre Dulaunoy e12c72026c
chg: [DOC] badges updated 2019-02-14 16:27:06 +01:00
Alexandre Dulaunoy 891f39ae40
add: [doc] gitchangelogrc added 2019-02-14 16:20:55 +01:00
94 changed files with 7584 additions and 589 deletions

.gitchangelog.rc (new file, 289 lines)

@@ -0,0 +1,289 @@
# -*- coding: utf-8; mode: python -*-
##
## Format
##
## ACTION: [AUDIENCE:] COMMIT_MSG [!TAG ...]
##
## Description
##
## ACTION is one of 'chg', 'fix', 'new'
##
## Is WHAT the change is about.
##
## 'chg' is for refactor, small improvement, cosmetic changes...
## 'fix' is for bug fixes
## 'new' is for new features, big improvement
##
## AUDIENCE is optional and one of 'dev', 'usr', 'pkg', 'test', 'doc'|'docs'
##
## Is WHO is concerned by the change.
##
## 'dev' is for developers (API changes, refactors...)
## 'usr' is for final users (UI changes)
## 'pkg' is for packagers (packaging changes)
## 'test' is for testers (test only related changes)
## 'doc' is for doc guys (doc only changes)
##
## COMMIT_MSG is ... well ... the commit message itself.
##
## TAGs are additional adjectives such as 'refactor', 'minor', 'cosmetic'
##
## They are preceded by a '!' or a '@' (prefer the former, as the
## latter is wrongly interpreted by GitHub). Commonly used tags are:
##
## 'refactor' is obviously for refactoring code only
## 'minor' is for a trivial change (a typo, adding a comment)
## 'cosmetic' is for cosmetic driven change (re-indentation, 80-col...)
## 'wip' is for partial functionality with complete sub-functionality.
##
## Example:
##
## new: usr: support of bazaar implemented
## chg: re-indentend some lines !cosmetic
## new: dev: updated code to be compatible with last version of killer lib.
## fix: pkg: updated year of licence coverage.
## new: test: added a bunch of test around user usability of feature X.
## fix: typo in spelling my name in comment. !minor
##
## Please note that multi-line commit messages are supported; only the
## first line will be considered the "summary" of the commit message, so
## tags and other rules apply only to the summary. The body of the commit
## message will be displayed in the changelog without reformatting.
##
## ``ignore_regexps`` is a list of regexps
##
## Any commit having its full commit message matching any regexp listed here
## will be ignored and won't be reported in the changelog.
##
ignore_regexps = [
r'@minor', r'!minor',
r'@cosmetic', r'!cosmetic',
r'@refactor', r'!refactor',
r'@wip', r'!wip',
r'^([cC]hg|[fF]ix|[nN]ew)\s*:\s*[p|P]kg:',
r'^([cC]hg|[fF]ix|[nN]ew)\s*:\s*[d|D]ev:',
r'^(.{3,3}\s*:)?\s*[fF]irst commit.?\s*$',
]
## ``section_regexps`` is a list of 2-tuples associating a string label and a
## list of regexps
##
## Commit messages will be classified in sections thanks to this. Section
## titles are the label, and a commit is classified under this section if any
## of the regexps associated is matching.
##
## Please note that ``section_regexps`` will only classify commits and won't
## make any changes to the contents. So you'll probably want to go check
## ``subject_process`` (or ``body_process``) to do some changes to the subject,
## whenever you are tweaking this variable.
##
section_regexps = [
('New', [
r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$',
]),
('Changes', [
r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$',
]),
('Fix', [
r'^[fF]ix\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$',
]),
('Other', None ## Match all lines
),
]
## ``body_process`` is a callable
##
## This callable will be given the original body and result will
## be used in the changelog.
##
## Available constructs are:
##
## - any python callable that takes one text argument and returns a text argument.
##
## - ReSub(pattern, replacement): will apply regexp substitution.
##
## - Indent(chars="  "): will indent the text with the given prefix.
##   Please remember that template engines also get to modify the text and
##   will usually indent the text themselves if needed.
##
## - Wrap(regexp=r"\n\n"): re-wrap text in separate paragraphs to fill 80 columns
##
## - noop: do nothing
##
## - ucfirst: ensure the first letter is uppercase.
## (usually used in the ``subject_process`` pipeline)
##
## - final_dot: ensure text finishes with a dot
## (usually used in the ``subject_process`` pipeline)
##
## - strip: remove any spaces before or after the content of the string
##
## - SetIfEmpty(msg="No commit message."): will set the text to
## whatever given ``msg`` if the current text is empty.
##
## Additionally, you can `pipe` the provided filters, for instance:
#body_process = Wrap(regexp=r'\n(?=\w+\s*:)') | Indent(chars=" ")
#body_process = Wrap(regexp=r'\n(?=\w+\s*:)')
#body_process = noop
body_process = ReSub(r'((^|\n)[A-Z]\w+(-\w+)*: .*(\n\s+.*)*)+$', r'') | strip
## ``subject_process`` is a callable
##
## This callable will be given the original subject and result will
## be used in the changelog.
##
## Available constructs are those listed in ``body_process`` doc.
subject_process = (strip |
ReSub(r'^([cC]hg|[fF]ix|[nN]ew)\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n@]*)(@[a-z]+\s+)*$', r'\4') |
SetIfEmpty("No commit message.") | ucfirst | final_dot)
## ``tag_filter_regexp`` is a regexp
##
## Tags that will be used for the changelog must match this regexp.
##
tag_filter_regexp = r'^v[0-9]+\.[0-9]+$'
## ``unreleased_version_label`` is a string or a callable that outputs a string
##
## This label will be used as the changelog Title of the last set of changes
## between last valid tag and HEAD if any.
unreleased_version_label = "%%version%% (unreleased)"
## ``output_engine`` is a callable
##
## This will change the output format of the generated changelog file
##
## Available choices are:
##
## - rest_py
##
## Legacy pure python engine, outputs ReSTructured text.
## This is the default.
##
## - mustache(<template_name>)
##
## Template name could be any of the available templates in
## ``templates/mustache/*.tpl``.
## Requires python package ``pystache``.
## Examples:
## - mustache("markdown")
## - mustache("restructuredtext")
##
## - makotemplate(<template_name>)
##
## Template name could be any of the available templates in
## ``templates/mako/*.tpl``.
## Requires python package ``mako``.
## Examples:
## - makotemplate("restructuredtext")
##
output_engine = rest_py
#output_engine = mustache("restructuredtext")
#output_engine = mustache("markdown")
#output_engine = makotemplate("restructuredtext")
## ``include_merge`` is a boolean
##
## This option tells git-log whether to include merge commits in the log.
## The default is to include them.
include_merge = True
## ``log_encoding`` is a string identifier
##
## This option tells gitchangelog what encoding is output by ``git log``.
## The default is to be clever about it: it checks ``git config`` for
## ``i18n.logOutputEncoding``, and if not found will default to git's own
## default: ``utf-8``.
#log_encoding = 'utf-8'
## ``publish`` is a callable
##
## Sets what ``gitchangelog`` should do with the output generated by
## the output engine. ``publish`` is a callable taking one argument
## that is an iterator over lines from the output engine.
##
## Some helper callables are provided:
##
## Available choices are:
##
## - stdout
##
## Outputs directly to standard output
## (This is the default)
##
## - FileInsertAtFirstRegexMatch(file, pattern, idx=lambda m: m.start())
##
## Creates a callable that will parse given file for the given
## regex pattern and will insert the output in the file.
## ``idx`` is a callable that receives the match object and
## must return an integer index at which to insert the output
## in the file. The default is to return the position of the
## start of the matched string.
##
## - FileRegexSubst(file, pattern, replace, flags)
##
## Apply a replace inplace in the given file. Your regex pattern must
## take care of everything and might be more complex. Check the README
## for a complete copy-pastable example.
##
# publish = FileInsertAtFirstRegexMatch(
# "CHANGELOG.rst",
# r'/(?P<rev>[0-9]+\.[0-9]+(\.[0-9]+)?)\s+\([0-9]+-[0-9]{2}-[0-9]{2}\)\n--+\n/',
# idx=lambda m: m.start(1)
# )
#publish = stdout
## ``revs`` is a list of callable or a list of string
##
## Callables will be called and resolved to strings, allowing dynamic
## computation of these values. The result will be used as revisions for
## gitchangelog (as if directly stated on the command line). This allows
## filtering exactly which commits will be read by gitchangelog.
##
## To get a full documentation on the format of these strings, please
## refer to the ``git rev-list`` arguments. There are many examples.
##
## Using callables is especially useful, for instance, if you
## are using gitchangelog to generate your changelog incrementally.
##
## Some helpers are provided, you can use them::
##
## - FileFirstRegexMatch(file, pattern): will return a callable that will
## return the first string match for the given pattern in the given file.
## If you use named sub-patterns in your regex pattern, it'll output only
## the string matching the regex pattern named "rev".
##
## - Caret(rev): will return the rev prefixed by a "^", which is a
## way to remove the given revision and all its ancestors.
##
## Please note that if you provide a rev-list on the command line, it'll
## replace this value (which will then be ignored).
##
## If empty, then ``gitchangelog`` will act as if it had to generate a full
## changelog.
##
## The default is to use all commits to make the changelog.
#revs = ["^1.0.3", ]
#revs = [
# Caret(
# FileFirstRegexMatch(
# "CHANGELOG.rst",
# r"(?P<rev>[0-9]+\.[0-9]+(\.[0-9]+)?)\s+\([0-9]+-[0-9]{2}-[0-9]{2}\)\n--+\n")),
# "HEAD"
#]
revs = []
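The classification rules in the file above can be exercised in plain Python. A minimal sketch (not gitchangelog's own implementation) applying the same ``section_regexps`` patterns to commit subjects like the ones in this repository's history:

```python
import re

# Patterns equivalent to ``section_regexps`` in the .gitchangelog.rc above.
SECTION_REGEXPS = [
    ("New", r'^[nN]ew\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$'),
    ("Changes", r'^[cC]hg\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$'),
    ("Fix", r'^[fF]ix\s*:\s*((dev|use?r|pkg|test|doc|docs)\s*:\s*)?([^\n]*)$'),
]

def classify(subject: str) -> str:
    """Return the changelog section label for a commit subject line."""
    for label, pattern in SECTION_REGEXPS:
        if re.match(pattern, subject):
            return label
    return "Other"  # fallback section: matches all remaining commits

print(classify("chg: [server] add tcpkeepalive"))  # → Changes
print(classify("fix: [worker 8] join"))            # → Fix
print(classify("Merge branch 'master'"))           # → Other
```

Note that commits prefixed `fic:` (as in several "Sensors monitoring" commits above) fall through to "Other", which is exactly why consistent `chg`/`fix`/`new` prefixes matter for the generated changelog.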

.gitignore (new vendored file, 8 lines)

@@ -0,0 +1,8 @@
# Temp files
*.swp
*.pyc
*.swo
*.o
# redis datas
server/dump6380.rdb


@@ -5,10 +5,17 @@
D4 core comprises the software components used in the D4 project. The software includes everything needed to create your own sensor network or connect
to an existing sensor network using simple clients.
![https://github.com/D4-project/d4-core/releases/latest](https://img.shields.io/github/release/D4-project/d4-core/all.svg)
![https://github.com/D4-project/d4-core/blob/master/LICENSE](https://img.shields.io/badge/License-AGPL-yellow.svg)
## D4 core client
[D4 core client](https://github.com/D4-project/d4-core/tree/master/client) is a simple and minimal implementation of the [D4 encapsulation protocol](https://github.com/D4-project/architecture/tree/master/format). There is also a [portable D4 client](https://github.com/D4-project/d4-goclient) in Go including the support for the SSL/TLS connectivity.
<p align="center">
<img alt="d4-cclient" src="https://raw.githubusercontent.com/D4-project/d4-core/master/client/media/d4c-client.png" height="140" />
</p>
### Requirements
- Unix-like operating system
@@ -57,10 +64,31 @@ git submodule init
git submodule update
~~~~
Build the d4 client. This will create the `d4` binary.
~~~~
make
~~~~
Then register the sensor with the server. Replace `API_TOKEN`, `VALID_UUID4` (create a random UUID via [UUIDgenerator](https://www.uuidgenerator.net/)) and `VALID_HMAC_KEY`.
~~~~
curl -k https://127.0.0.1:7000/api/v1/add/sensor/register --header "Authorization: API_TOKEN" -H "Content-Type: application/json" --data '{"uuid":"VALID_UUID4","hmac_key":"VALID_HMAC_KEY"}' -X POST
~~~~
If the registration succeeded, the UUID is returned. Do not forget to approve the registration in the D4 server web interface.
Update the configuration file
~~~~
cp -r conf.sample conf
echo VALID_UUID4 > conf/uuid
echo VALID_HMAC_KEY > conf/key
~~~~
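The registration call above can equally be scripted. A hypothetical Python sketch that builds the same request as the curl example (endpoint and JSON payload come from the README; the token, UUID, and HMAC key values are placeholders, not real credentials):

```python
import json

def build_registration_request(host: str, api_token: str,
                               uuid: str, hmac_key: str):
    """Assemble the sensor-registration request used by the D4 server API."""
    url = f"https://{host}/api/v1/add/sensor/register"
    headers = {
        "Authorization": api_token,          # same header as the curl example
        "Content-Type": "application/json",
    }
    body = json.dumps({"uuid": uuid, "hmac_key": hmac_key})
    return url, headers, body

url, headers, body = build_registration_request(
    "127.0.0.1:7000", "API_TOKEN",
    "9f1c6f62-8a32-4c7f-9a43-b3f7a1d4c001",  # hypothetical UUIDv4 placeholder
    "VALID_HMAC_KEY")
# To actually POST, use e.g. urllib.request with an unverified TLS
# context, mirroring curl's -k flag for the self-signed server cert.
```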
## D4 core server
D4 core server is a complete server to handle clients (sensors) including the decapsulation of the [D4 protocol](https://github.com/D4-project/architecture/tree/master/format), control of sensor registrations, management of decoding protocols and dispatching to adequate decoders/analysers.
### Requirements
@@ -69,16 +97,26 @@ sensor registrations, management of decoding protocols and dispatching to adequa
### Installation
~~~~
cd server
./install_server.sh
./LAUNCH.sh -l
~~~~
- [Install D4 Server](https://github.com/D4-project/d4-core/tree/master/server)
The web interface is accessible via `http://127.0.0.1:7000/`
### D4 core server Screenshots
#### Dashboard:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/main.png)
#### Connected Sensors:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor-mgmt.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-mgmt.png)
#### Sensors Status:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_status.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_types.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_files.png)
#### Server Management:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management-types.png)
#### Analyzer Queues:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-queues.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/create_analyzer_queue.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-mgmt.png)


@ -32,7 +32,7 @@ clean:
- rm -rf *.o hmac
d4: d4.o sha2.o hmac.o unpack.o unparse.o pack.o gen_uuid.o randutils.o parse.o
gcc -Wall -o d4 d4.o hmac.o sha2.o unpack.o pack.o unparse.o gen_uuid.o randutils.o parse.o
$(CC) -Wall -o d4 d4.o hmac.o sha2.o unpack.o pack.o unparse.o gen_uuid.o randutils.o parse.o
d4.o: d4.c
gcc -Wall -c d4.c
$(CC) -Wall -c d4.c


@ -210,7 +210,7 @@ void d4_transfert(d4_t* d4)
//In case of errors see block of 0 bytes
bzero(buf, d4->snaplen);
nread = read(d4->source.fd, buf, d4->snaplen);
if ( nread > 0 ) {
if ( nread >= 0 ) {
d4_update_header(d4, nread);
//Do HMAC on header and payload. HMAC field is 0 during computation
if (d4->ctx) {
@ -238,6 +238,11 @@ void d4_transfert(d4_t* d4)
fprintf(stderr,"Incomplete header written. Abort to let the consumer know that the packet is corrupted\n");
abort();
}
// no data - create empty D4 packet
if ( nread == 0 ) {
//FIXME no data available, sleep, abort, retry
break;
}
} else{
//FIXME no data available, sleep, abort, retry
break;

client/media/d4c-client.png (new binary file, 82 KiB)


@ -98,6 +98,19 @@ int random_get_fd(void)
return fd;
}
int my_getentropy(void *buf, size_t buflen)
{
#ifdef __GLIBC__
if (buflen > 256) {
errno = EIO;
return -1;
}
return syscall(SYS_getrandom, buf, buflen, 0);
#else
return getentropy(buf, buflen);
#endif
}
/*
* Generate a stream of random nbytes into buf.
* Use /dev/urandom if possible, and if not,
@ -117,7 +130,7 @@ void random_get_bytes(void *buf, size_t nbytes)
int x;
errno = 0;
x = getentropy(cp, n);
x = my_getentropy(cp, n);
if (x > 0) { /* success */
n -= x;
cp += x;
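The `my_getentropy` wrapper above exists because glibc's `getrandom`-based path rejects requests larger than 256 bytes, so callers must loop in bounded chunks. For illustration only, the same chunking pattern in Python, with `os.urandom` standing in for `getentropy()`:

```python
import os

MAX_CHUNK = 256  # getentropy() refuses requests larger than 256 bytes

def random_get_bytes(nbytes):
    """Fill a buffer in <=256-byte chunks, mirroring the C loop above."""
    out = bytearray()
    while len(out) < nbytes:
        n = min(MAX_CHUNK, nbytes - len(out))
        out += os.urandom(n)  # stand-in for getentropy()/getrandom()
    return bytes(out)

buf = random_get_bytes(1000)
```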

(several images under doc/images/ added or updated, including doc/images/server-mgmt2.png, 127 KiB)

server/.gitignore (vendored, 1 line changed)

@ -2,6 +2,7 @@
*.csr
*.pem
*.key
configs/server.conf
data/
logs/
redis/


@ -0,0 +1,15 @@
FROM python:3
WORKDIR /usr/src/
RUN git clone https://github.com/D4-project/analyzer-d4-passivedns.git
# RUN git clone https://github.com/trolldbois/analyzer-d4-passivedns.git
WORKDIR /usr/src/analyzer-d4-passivedns
# FIXME typo in requirements.txt filename
RUN pip install --no-cache-dir -r requirements
WORKDIR /usr/src/analyzer-d4-passivedns/bin
# should be a config
# RUN cat /usr/src/analyzer-d4-passivedns/etc/analyzer.conf.sample | sed "s/127.0.0.1/redis-metadata/g" > /usr/src/analyzer-d4-passivedns/etc/analyzer.conf
# ignore the config and use ENV variables.
RUN cp ../etc/analyzer.conf.sample ../etc/analyzer.conf


@ -0,0 +1,37 @@
FROM python:3
# that doesn't work on windows docker due to linefeeds
# WORKDIR /usr/src/d4-server
# COPY . .
## alternate solution
WORKDIR /usr/src/tmp
# RUN git clone https://github.com/trolldbois/d4-core.git
RUN git clone https://github.com/D4-project/d4-core.git
RUN mv d4-core/server/ /usr/src/d4-server
WORKDIR /usr/src/d4-server
ENV D4_HOME=/usr/src/d4-server
RUN pip install --no-cache-dir -r requirement.txt
# move to tls proxy ?
WORKDIR /usr/src/d4-server/gen_cert
RUN ./gen_root.sh
RUN ./gen_cert.sh
# setup a lot of files
WORKDIR /usr/src/d4-server/web
RUN ./update_web.sh
WORKDIR /usr/src/d4-server
# Should be using configs instead. but not supported until docker 17.06+
RUN cp configs/server.conf.sample configs/server.conf
# workers need tcpdump
RUN apt-get update && apt-get install -y tcpdump
ENTRYPOINT ["python", "server.py", "-v", "10"]
# CMD bash -l


@ -11,10 +11,10 @@ CYAN="\\033[1;36m"
. ./D4ENV/bin/activate
isredis=`screen -ls | egrep '[0-9]+.Redis_D4' | cut -d. -f1`
isd4server=`screen -ls | egrep '[0-9]+.Server_D4' | cut -d. -f1`
isworker=`screen -ls | egrep '[0-9]+.Workers_D4' | cut -d. -f1`
isflask=`screen -ls | egrep '[0-9]+.Flask_D4' | cut -d. -f1`
isredis=`screen -ls | egrep '[0-9]+.Redis_D4 ' | cut -d. -f1`
isd4server=`screen -ls | egrep '[0-9]+.Server_D4 ' | cut -d. -f1`
isworker=`screen -ls | egrep '[0-9]+.Workers_D4 ' | cut -d. -f1`
isflask=`screen -ls | egrep '[0-9]+.Flask_D4 ' | cut -d. -f1`
function helptext {
echo -e $YELLOW"
@ -36,7 +36,7 @@ function helptext {
- D4 Twisted server.
- All workers managers.
- All Redis in memory servers.
- Flak server.
- Flask server.
Usage: LAUNCH.sh
[-l | --launchAuto]
@ -45,6 +45,10 @@ function helptext {
"
}
CONFIG=$D4_HOME/configs/server.conf
redis_stream=`sed -nr '/\[Redis_STREAM\]/,/\[/{/port/p}' ${CONFIG} | awk -F= '/port/{print $2}' | sed 's/ //g'`
redis_metadata=`sed -nr '/\[Redis_METADATA\]/,/\[/{/port/p}' ${CONFIG} | awk -F= '/port/{print $2}' | sed 's/ //g'`
function launching_redis {
conf_dir="${D4_HOME}/configs/"
redis_dir="${D4_HOME}/redis/src/"
@ -65,6 +69,8 @@ function launching_d4_server {
screen -S "Server_D4" -X screen -t "Server_D4" bash -c "cd ${D4_HOME}; ./server.py -v 10; read x"
sleep 0.1
screen -S "Server_D4" -X screen -t "sensors_manager" bash -c "cd ${D4_HOME}; ./sensors_manager.py; read x"
sleep 0.1
}
function launching_workers {
@ -72,32 +78,36 @@ function launching_workers {
sleep 0.1
echo -e $GREEN"\t* Launching D4 Workers"$DEFAULT
screen -S "Workers_D4" -X screen -t "1_workers_manager" bash -c "cd ${D4_HOME}/workers/workers_1; ./workers_manager.py; read x"
screen -S "Workers_D4" -X screen -t "1_workers" bash -c "cd ${D4_HOME}/workers/workers_1; ./workers_manager.py; read x"
sleep 0.1
screen -S "Workers_D4" -X screen -t "4_workers_manager" bash -c "cd ${D4_HOME}/workers/workers_4; ./workers_manager.py; read x"
screen -S "Workers_D4" -X screen -t "2_workers" bash -c "cd ${D4_HOME}/workers/workers_2; ./workers_manager.py; read x"
sleep 0.1
screen -S "Workers_D4" -X screen -t "8_workers_manager" bash -c "cd ${D4_HOME}/workers/workers_8; ./workers_manager.py; read x"
screen -S "Workers_D4" -X screen -t "3_workers" bash -c "cd ${D4_HOME}/workers/workers_3; ./workers_manager.py; read x"
sleep 0.1
screen -S "Workers_D4" -X screen -t "4_workers" bash -c "cd ${D4_HOME}/workers/workers_4; ./workers_manager.py; read x"
sleep 0.1
screen -S "Workers_D4" -X screen -t "8_workers" bash -c "cd ${D4_HOME}/workers/workers_8; ./workers_manager.py; read x"
sleep 0.1
}
function shutting_down_redis {
redis_dir=${D4_HOME}/redis/src/
bash -c $redis_dir'redis-cli -p 6379 SHUTDOWN'
bash -c $redis_dir'redis-cli -p '$redis_stream' SHUTDOWN'
sleep 0.1
bash -c $redis_dir'redis-cli -p 6380 SHUTDOWN'
bash -c $redis_dir'redis-cli -p '$redis_metadata' SHUTDOWN'
sleep 0.1
}
function checking_redis {
flag_redis=0
redis_dir=${D4_HOME}/redis/src/
bash -c $redis_dir'redis-cli -p 6379 PING | grep "PONG" &> /dev/null'
bash -c $redis_dir'redis-cli -p '$redis_stream' PING | grep "PONG" &> /dev/null'
if [ ! $? == 0 ]; then
echo -e $RED"\t6379 not ready"$DEFAULT
flag_redis=1
fi
sleep 0.1
bash -c $redis_dir'redis-cli -p 6380 PING | grep "PONG" &> /dev/null'
bash -c $redis_dir'redis-cli -p '$redis_metadata' PING | grep "PONG" &> /dev/null'
if [ ! $? == 0 ]; then
echo -e $RED"\t6380 not ready"$DEFAULT
flag_redis=1
@ -107,6 +117,18 @@ function checking_redis {
return $flag_redis;
}
function wait_until_redis_is_ready {
redis_not_ready=true
while $redis_not_ready; do
if checking_redis; then
redis_not_ready=false;
else
sleep 1
fi
done
echo -e $YELLOW"\t* Redis Launched"$DEFAULT
}
function launch_redis {
if [[ ! $isredis ]]; then
launching_redis;
@ -161,6 +183,7 @@ function launch_flask {
screen -dmS "Flask_D4"
sleep 0.1
echo -e $GREEN"\t* Launching Flask server"$DEFAULT
# screen -S "Flask_D4" -X screen -t "Flask_server" bash -c "cd $flask_dir; export FLASK_DEBUG=1;export FLASK_APP=Flask_server.py; python -m flask run --port 7000; read x"
screen -S "Flask_D4" -X screen -t "Flask_server" bash -c "cd $flask_dir; ls; ./Flask_server.py; read x"
else
echo -e $RED"\t* A Flask_D4 screen is already launched"$DEFAULT
@ -202,9 +225,15 @@ function update_web {
fi
}
function update_config {
echo -e $GREEN"\t* Updating Config File"$DEFAULT
bash -c "(cd ${D4_HOME}/configs; ./update_conf.py -v 0)"
}
function launch_all {
helptext;
launch_redis;
update_config;
launch_d4_server;
launch_workers;
launch_flask;
@ -266,16 +295,19 @@ function launch_all {
while [ "$1" != "" ]; do
case $1 in
-l | --launchAuto ) launch_all;
;;
-k | --killAll ) helptext;
killall;
;;
-h | --help ) helptext;
exit
;;
* ) helptext
exit 1
-l | --launchAuto ) launch_all;
;;
-k | --killAll ) helptext;
killall;
;;
-lrv | --launchRedisVerify ) launch_redis;
wait_until_redis_is_ready;
;;
-h | --help ) helptext;
exit
;;
* ) helptext
exit 1
esac
shift
done

server/README.md (new file, 110 lines)

@ -0,0 +1,110 @@
# D4 core
![](https://www.d4-project.org/assets/images/logo.png)
## D4 core server
D4 core server is a complete server to handle clients (sensors) including the decapsulation of the [D4 protocol](https://github.com/D4-project/architecture/tree/master/format), control of
sensor registrations, management of decoding protocols and dispatching to adequate decoders/analysers.
### Requirements
- Python 3.6
- GNU/Linux distribution
### Installation
###### Install D4 server
Clone the repository and install necessary packages. Installation requires *sudo* permissions.
~~~~
git clone https://github.com/D4-project/d4-core.git
cd d4-core
cd server
./install_server.sh
~~~~
When the installation is finished, scroll back to where `+ ./create_default_user.py` is displayed. The next lines contain the default generated user and should resemble the snippet below. Take a temporary note of the password; you are required to **change the password** on first login.
~~~~
new user created: admin@admin.test
password: <redacted>
token: <redacted>
~~~~
Then create or add a PEM certificate in [d4-core/server](https://github.com/D4-project/d4-core/tree/master/server):
~~~~
cd gen_cert
./gen_root.sh
./gen_cert.sh
cd ..
~~~~
###### Launch D4 server
~~~~
./LAUNCH.sh -l
~~~~
The web interface is accessible via `http://127.0.0.1:7000/`.
If you cannot access the web interface on localhost (for example because the system is running on a remote host), stop the server, change the listening host IP, and restart the server. In the example below it is changed to `0.0.0.0` (all interfaces). Make sure the IP is not unintentionally exposed publicly.
~~~~
./LAUNCH.sh -k
sed -i '/\[Flask_Server\]/{:a;N;/host = 127\.0\.0\.1/!ba;s/host = 127\.0\.0\.1/host = 0.0.0.0/}' configs/server.conf
./LAUNCH.sh -l
~~~~
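The `sed` one-liner above works but is easy to get wrong; as an alternative sketch (helper name hypothetical), the same edit via Python's `configparser`, assuming the `[Flask_Server]` section layout shown in `configs/server.conf.sample`:

```python
import configparser
import os
import tempfile

def set_flask_host(conf_path, new_host="0.0.0.0"):
    """Rewrite the [Flask_Server] host entry in place (hypothetical helper).

    Only switch to 0.0.0.0 deliberately: it binds all interfaces.
    """
    config = configparser.ConfigParser()
    config.read(conf_path)
    config.set("Flask_Server", "host", new_host)
    with open(conf_path, "w") as f:
        config.write(f)

# demo against a throwaway copy of the relevant section
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("[Flask_Server]\nhost = 127.0.0.1\nport = 7000\n")
set_flask_host(path)
check = configparser.ConfigParser()
check.read(path)
new_host = check.get("Flask_Server", "host")
os.unlink(path)
```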
### Updating web assets
To update the JavaScript libraries, run:
~~~~
cd web
./update_web.sh
~~~~
### API
[API Documentation](https://github.com/D4-project/d4-core/tree/master/server/documentation/README.md)
### Notes
- All server logs are located in ``d4-core/server/logs/``
- Close D4 Server: ``./LAUNCH.sh -k``
### D4 core server
#### Dashboard:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/main.png)
#### Connected Sensors:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor-mgmt.png)
#### Sensors Status:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_status.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_types.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_files.png)
#### Server Management:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management-types.png)
#### Analyzer Queues:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-queues.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/create_analyzer_queue.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-mgmt.png)
### Troubleshooting
###### Worker 1, tcpdump: Permission denied
Could be related to AppArmor:
~~~~
sudo cat /var/log/syslog | grep denied
~~~~
Run the following command as root:
~~~~
aa-complain /usr/sbin/tcpdump
~~~~
###### WARNING - Not registered UUID=UUID4, connection closed
This happens after you have registered a new sensor but have not approved the registration. To approve the sensor, open **Server Management** in the web interface and click **Pending Sensors**.


@ -0,0 +1,75 @@
#!/usr/bin/env python3
import os
import sys
import time
import redis
import socket
import argparse
import logging
import logging.handlers
log_level = {'DEBUG': 10, 'INFO': 20, 'WARNING': 30, 'ERROR': 40, 'CRITICAL': 50}
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type or extended type' , type=str, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid' , type=str, dest='uuid', required=True)
parser.add_argument('-i', '--ip',help='server ip' , type=str, default='127.0.0.1', dest='target_ip')
parser.add_argument('-p', '--port',help='server port' ,type=int, default=514, dest='target_port')
parser.add_argument('-l', '--log_level', help='log level: DEBUG, INFO, WARNING, ERROR, CRITICAL', type=str, default='INFO', dest='req_level')
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-ri', '--redis_ip',help='redis host' , type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-rp', '--redis_port',help='redis port' , type=int, default=6380, dest='port_redis')
args = parser.parse_args()
if not args.uuid or not args.type or not args.target_port:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines = args.newline
req_level = args.req_level
if req_level not in log_level:
print('ERROR: incorrect log level')
sys.exit(0)
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
target_ip = args.target_ip
target_port = args.target_port
addr = (target_ip, target_port)
syslog_logger = logging.getLogger('D4-SYSLOGOUT')
syslog_logger.setLevel(logging.DEBUG)
client_socket = logging.handlers.SysLogHandler(address = addr)
syslog_logger.addHandler(client_socket)
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if newLines:
d4_data = d4_data + b'\n'
syslog_logger.log(log_level[req_level], d4_data.decode())
client_socket.close()
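This exporter and the ones that follow share one polling loop: `RPOP` from `analyzer:<type>:<uuid>`, sleep a second when the queue is empty, forward the record otherwise. The skeleton of that loop, with the Redis client replaced by any callable for illustration (it returns on empty instead of polling forever so it can be exercised offline):

```python
def drain_queue(pop, forward):
    """Poll pop() the way the exporters poll redis_d4.rpop(data_queue):
    forward every record until the queue reports empty. The real scripts
    sleep one second on None and keep polling forever instead of returning."""
    count = 0
    while True:
        item = pop()
        if item is None:
            return count
        forward(item + b"\n")  # the -n/--newline behaviour
        count += 1

queue = [b"rec1", b"rec2"]
sent = []
n = drain_queue(lambda: queue.pop(0) if queue else None, sent.append)
```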


@ -0,0 +1,86 @@
#!/usr/bin/env python3
import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
import socket
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type or extended type' , type=str, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid' , type=str, dest='uuid', required=True)
parser.add_argument('-i', '--ip',help='server ip' , type=str, default='127.0.0.1', dest='target_ip')
parser.add_argument('-p', '--port',help='server port' , type=int, dest='target_port', required=True)
parser.add_argument('-k', '--Keepalive', help='Keepalive in second', type=int, default='15', dest='ka_sec')
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-ri', '--redis_ip',help='redis ip' , type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-rp', '--redis_port',help='redis port' , type=int, default=6380, dest='port_redis')
args = parser.parse_args()
if not args.uuid or not args.type or not args.target_port:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines = args.newline
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
target_ip = args.target_ip
target_port = args.target_port
addr = (target_ip, target_port)
# default keep alive: 15
ka_sec = args.ka_sec
# Create a TCP socket
client_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# TCP Keepalive
client_socket.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
client_socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 1)
client_socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, ka_sec)
client_socket.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, ka_sec)
# TCP connect
client_socket.connect(addr)
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if newLines:
d4_data = d4_data + b'\n'
print(d4_data)
client_socket.sendall(d4_data)
client_socket.shutdown(socket.SHUT_RDWR)


@ -0,0 +1,101 @@
#!/usr/bin/env python3
import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
import socket
import ssl
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type or extended type', type=str, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid', type=str, dest='uuid', required=True)
parser.add_argument('-i', '--ip',help='server ip', type=str, default='127.0.0.1', dest='target_ip')
parser.add_argument('-p', '--port',help='server port', type=int, dest='target_port', required=True)
parser.add_argument('-k', '--Keepalive', help='Keepalive in second', type=int, default='15', dest='ka_sec')
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-ri', '--redis_ip', help='redis ip', type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-rp', '--redis_port', help='redis port', type=int, default=6380, dest='port_redis')
parser.add_argument('-v', '--verify_certificate', help='verify server certificate', type=str, default='True', dest='verify_certificate')
parser.add_argument('-c', '--ca_certs', help='cert filename' , type=str, default=None, dest='ca_certs')
args = parser.parse_args()
if not args.uuid or not args.type or not args.target_port:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines=args.newline
verify_certificate=args.verify_certificate
ca_certs=args.ca_certs
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
target_ip = args.target_ip
target_port = args.target_port
addr = (target_ip, target_port)
# default keep alive: 15
ka_sec = args.ka_sec
# Create a TCP socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# TCP Keepalive
s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 1)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, ka_sec)
s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, ka_sec)
# SSL
if verify_certificate in ['False', 'false', 'f']:
cert_reqs_option = ssl.CERT_NONE
else:
cert_reqs_option = ssl.CERT_REQUIRED
    if cert_reqs_option == ssl.CERT_NONE:
        ca_certs = None
client_socket = ssl.wrap_socket(s, cert_reqs=cert_reqs_option, ca_certs=ca_certs, ssl_version=ssl.PROTOCOL_TLS)
# TCP connect
client_socket.connect(addr)
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if newLines:
d4_data = d4_data + b'\n'
print(d4_data)
client_socket.send(d4_data)
client_socket.shutdown(socket.SHUT_RDWR)


@ -0,0 +1,73 @@
#!/usr/bin/env python3
import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
import socket
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type or extended type' , type=str, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid' , type=str, dest='uuid', required=True)
parser.add_argument('-i', '--ip',help='server ip' , type=str, default='127.0.0.1', dest='target_ip')
parser.add_argument('-p', '--port',help='server port' , type=int, dest='target_port', required=True)
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-ri', '--redis_ip',help='redis host' , type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-rp', '--redis_port',help='redis port' , type=int, default=6380, dest='port_redis')
args = parser.parse_args()
if not args.uuid or not args.type or not args.target_port:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines = args.newline
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
target_ip = args.target_ip
target_port = args.target_port
addr = (target_ip, target_port)
#Create a UDP socket
client_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if newLines:
d4_data = d4_data + b'\n'
print(d4_data)
client_socket.sendto(d4_data, addr)
client_socket.close()


@ -0,0 +1,80 @@
#!/usr/bin/env python3
import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
import socket
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type or extended type' , type=str, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid' , type=str, dest='uuid', required=True)
parser.add_argument('-s', '--socket',help='socket file' , type=str, dest='socket_file', required=True)
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-ri', '--redis_ip',help='redis host' , type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-rp', '--redis_port',help='redis port' , type=int, default=6380, dest='port_redis')
args = parser.parse_args()
if not args.uuid or not args.type or not args.socket_file:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines = args.newline
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
socket_file = args.socket_file
print("UNIX SOCKET: Connecting...")
if os.path.exists(socket_file):
client = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM)
client.connect(socket_file)
print("Connected")
    else:
        print("Couldn't Connect!")
        print("ERROR: socket file not found")
        sys.exit(1)
    print("Done")
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if newLines:
d4_data = d4_data + b'\n'
print(d4_data)
client.send(d4_data)
client.close()


@ -0,0 +1,81 @@
#!/usr/bin/env python3
import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='Export d4 data to stdout')
parser.add_argument('-t', '--type', help='d4 type' , type=int, dest='type', required=True)
parser.add_argument('-u', '--uuid', help='queue uuid' , type=str, dest='uuid', required=True)
parser.add_argument('-f', '--files', help='read data from files. Append file to stdin', action="store_true")
parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
parser.add_argument('-i', '--ip',help='redis host' , type=str, default='127.0.0.1', dest='host_redis')
parser.add_argument('-p', '--port',help='redis port' , type=int, default=6380, dest='port_redis')
args = parser.parse_args()
if not args.uuid or not args.type:
parser.print_help()
sys.exit(0)
host_redis=args.host_redis
port_redis=args.port_redis
newLines = args.newline
read_files = args.files
redis_d4= redis.StrictRedis(
host=host_redis,
port=port_redis,
db=2)
try:
redis_d4.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
sys.exit(1)
# logs_dir = 'logs'
# if not os.path.isdir(logs_dir):
# os.makedirs(logs_dir)
#
# log_filename = 'logs/d4-stdout.log'
# logger = logging.getLogger()
# formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
# handler_log = logging.handlers.TimedRotatingFileHandler(log_filename, when="midnight", interval=1)
# handler_log.suffix = '%Y-%m-%d.log'
# handler_log.setFormatter(formatter)
# logger.addHandler(handler_log)
# logger.setLevel(args.verbose)
#
# logger.info('Launching stdout Analyzer ...')
d4_uuid = args.uuid
d4_type = args.type
data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)
while True:
d4_data = redis_d4.rpop(data_queue)
if d4_data is None:
time.sleep(1)
continue
if read_files:
try:
with open(d4_data, 'rb') as f:
sys.stdout.buffer.write(f.read())
sys.exit(0)
except FileNotFoundError:
## TODO: write logs file
continue
else:
if newLines:
sys.stdout.buffer.write(d4_data + b'\n')
else:
sys.stdout.buffer.write(d4_data)


@ -155,7 +155,7 @@ supervised no
#
# Creating a pid file is best effort: if Redis is not able to create it
# nothing bad happens, the server will start and run normally.
pidfile /var/run/redis_6379.pid
pidfile /var/run/redis_6380.pid
# Specify the server verbosity level.
# This can be one of:
@ -843,7 +843,7 @@ lua-time-limit 5000
# Make sure that instances running in the same system do not have
# overlapping cluster configuration file names.
#
# cluster-config-file nodes-6379.conf
# cluster-config-file nodes-6380.conf
# Cluster node timeout is the amount of milliseconds a node must be unreachable
# for it to be considered in failure state.
@ -971,7 +971,7 @@ lua-time-limit 5000
# Example:
#
# cluster-announce-ip 10.1.1.5
# cluster-announce-port 6379
# cluster-announce-port 6380
# cluster-announce-bus-port 6380
################################## SLOW LOG ###################################


@ -0,0 +1,41 @@
[Save_Directories]
# By default all data is saved in $D4_HOME/data/
use_default_save_directory = yes
save_directory = None
[D4_Server]
server_port = 4443
# registration or shared-secret
server_mode = registration
default_hmac_key = private key to change
analyzer_queues_max_size = 100000000
[Flask_Server]
# UI port number
host = 127.0.0.1
port = 7000
[Redis_STREAM]
host = localhost
port = 6379
db = 0
[Redis_METADATA]
host = localhost
port = 6380
db = 0
[Redis_SERV]
host = localhost
port = 6380
db = 1
[Redis_ANALYZER]
host = localhost
port = 6380
db = 2
[Redis_CACHE]
host = localhost
port = 6380
db = 3
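For reference, the sample config wires five logical Redis sections onto two instances: `Redis_STREAM` on 6379 and everything else on 6380, separated by `db` numbers. A quick `configparser` check against an abridged excerpt of the sample above (the excerpt is not the full file):

```python
import configparser

# Abridged excerpt of configs/server.conf.sample shown above.
SAMPLE = """
[D4_Server]
server_port = 4443
server_mode = registration

[Redis_STREAM]
host = localhost
port = 6379
db = 0

[Redis_METADATA]
host = localhost
port = 6380
db = 0
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)
stream_port = config.getint("Redis_STREAM", "port")
metadata_port = config.getint("Redis_METADATA", "port")
server_port = config.getint("D4_Server", "server_port")
```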

server/configs/update_conf.py (new executable file, 77 lines)

@ -0,0 +1,77 @@
#!/usr/bin/env python3
import os
import argparse
import configparser
def print_message(message_to_print, verbose):
if verbose:
print(message_to_print)
if __name__ == "__main__":
# parse parameters
parser = argparse.ArgumentParser()
parser.add_argument('-v', '--verbose',help='Display Info Messages', type=int, default=1, choices=[0, 1])
parser.add_argument('-b', '--backup',help='Create Config Backup', type=int, default=1, choices=[0, 1])
args = parser.parse_args()
if args.verbose == 1:
verbose = True
else:
verbose = False
if args.backup == 1:
backup = True
else:
backup = False
config_file_server = os.path.join(os.environ['D4_HOME'], 'configs/server.conf')
config_file_sample = os.path.join(os.environ['D4_HOME'], 'configs/server.conf.sample')
config_file_backup = os.path.join(os.environ['D4_HOME'], 'configs/server.conf.backup')
# Check if confile file exist
if not os.path.isfile(config_file_server):
# create config file
with open(config_file_server, 'w') as configfile:
with open(config_file_sample, 'r') as config_file_sample:
configfile.write(config_file_sample.read())
print_message('Config File Created', verbose)
else:
config_server = configparser.ConfigParser()
config_server.read(config_file_server)
config_sections = config_server.sections()
config_sample = configparser.ConfigParser()
config_sample.read(config_file_sample)
sample_sections = config_sample.sections()
mew_content_added = False
for section in sample_sections:
new_key_added = False
if section not in config_sections:
# add new section
config_server.add_section(section)
mew_content_added = True
for key in config_sample[section]:
if key not in config_server[section]:
# add new section key
config_server.set(section, key, config_sample[section][key])
if not new_key_added:
print_message('[{}]'.format(section), verbose)
new_key_added = True
mew_content_added = True
print_message(' {} = {}'.format(key, config_sample[section][key]), verbose)
# new keys have been added to config file
if mew_content_added:
# backup config file
if backup:
with open(config_file_backup, 'w') as configfile:
with open(config_file_server, 'r') as configfile_origin:
configfile.write(configfile_origin.read())
print_message('New Backup Created', verbose)
# create new config file
with open(config_file_server, 'w') as configfile:
config_server.write(configfile)
print_message('Config file updated', verbose)
else:
print_message('Nothing to update', verbose)
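`update_conf.py` above copies over any section or key that exists in `server.conf.sample` but not in `server.conf`, backing up the old file first. The core merge, reduced to a self-contained sketch (function name hypothetical):

```python
import configparser

def merge_missing(server_text, sample_text):
    """Copy sections/keys present only in the sample into the server config,
    mirroring the merge loop in update_conf.py; returns (config, changed)."""
    server = configparser.ConfigParser()
    server.read_string(server_text)
    sample = configparser.ConfigParser()
    sample.read_string(sample_text)
    changed = False
    for section in sample.sections():
        if not server.has_section(section):
            server.add_section(section)
            changed = True
        for key in sample[section]:
            if key not in server[section]:
                server.set(section, key, sample[section][key])
                changed = True
    return server, changed

merged, changed = merge_missing(
    "[Flask_Server]\nhost = 127.0.0.1\n",
    "[Flask_Server]\nhost = 127.0.0.1\nport = 7000\n\n[New_Section]\nkey = v\n",
)
```

Existing values are never overwritten, which is why a manual `host` change survives an update.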

server/docker-compose.yml (new file, 156 lines)

@ -0,0 +1,156 @@
# Should be using configs but not supported until docker 17.06+
# https://www.d4-project.org/2019/05/28/passive-dns-tutorial.html
version: "3"
services:
redis-stream:
image: redis
command: redis-server --port 6379
redis-metadata:
image: redis
command: redis-server --port 6380
redis-analyzer:
image: redis
command: redis-server --port 6400
d4-server:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
ports:
- "4443:4443"
d4-worker_1:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
entrypoint: bash -c "cd workers/workers_1; ./workers_manager.py; read x"
volumes:
- d4-data:/usr/src/d4-server/data
d4-worker_2:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
entrypoint: bash -c "cd workers/workers_2; ./workers_manager.py; read x"
volumes:
- d4-data:/usr/src/d4-server/data
d4-worker_4:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
entrypoint: bash -c "cd workers/workers_4; ./workers_manager.py; read x"
volumes:
- d4-data:/usr/src/d4-server/data
d4-worker_8:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
entrypoint: bash -c "cd workers/workers_8; ./workers_manager.py; read x"
volumes:
- d4-data:/usr/src/d4-server/data
d4-web:
build:
context: .
dockerfile: Dockerfile.d4-server
image: d4-server:latest
depends_on:
- redis-stream
- redis-metadata
environment:
- D4_REDIS_STREAM_HOST=redis-stream
- D4_REDIS_STREAM_PORT=6379
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
entrypoint: bash -c "cd web; ./Flask_server.py; read x"
ports:
- "7000:7000"
volumes:
- d4-data:/usr/src/d4-server/data
d4-analyzer-passivedns-cof:
build:
context: .
dockerfile: Dockerfile.analyzer-d4-passivedns
image: analyzer-d4-passivedns:latest
depends_on:
- redis-metadata
- redis-analyzer
environment:
- D4_ANALYZER_REDIS_HOST=redis-analyzer
- D4_ANALYZER_REDIS_PORT=6400
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
- DEBUG=true
entrypoint: bash -c "python ./pdns-cof-server.py; read x"
ports:
- "8400:8400"
d4-analyzer-passivedns-ingestion:
build:
context: .
dockerfile: Dockerfile.analyzer-d4-passivedns
image: analyzer-d4-passivedns:latest
depends_on:
- redis-metadata
- redis-analyzer
environment:
- D4_ANALYZER_REDIS_HOST=redis-analyzer
- D4_ANALYZER_REDIS_PORT=6400
- D4_REDIS_METADATA_HOST=redis-metadata
- D4_REDIS_METADATA_PORT=6380
- DEBUG=true
entrypoint: bash -c "python ./pdns-ingestion.py; read x"
volumes:
d4-data:


@@ -0,0 +1,130 @@
# D4 core
![](https://www.d4-project.org/assets/images/logo.png)
## D4 core server
The D4 core server is a complete server that handles clients (sensors): it decapsulates the [D4 protocol](https://github.com/D4-project/architecture/tree/master/format), controls sensor registrations, manages decoding protocols and dispatches data to the adequate decoders/analysers.
## Database map - Metadata
```
DB 0 - Stats + sensor configs
DB 1 - Users
DB 2 - Analyzer queue
DB 3 - Flask Cache
```
### DB 1
##### User Management:
| Hset Key | Field | Value |
| ------ | ------ | ------ |
| user:all | **user id** | **password hash** |
| | | |
| user:tokens | **token** | **user id** |
| | | |
| user_metadata:**user id** | token | **token** |
| | change_passwd | **boolean** |
| | role | **role** |
| Set Key | Value |
| ------ | ------ |
| user_role:**role** | **user id** |
| Zrank Key | Field | Value |
| ------ | ------ | ------ |
| ail:all_role | **role** | **int, role priority (1=admin)** |
### Server
| Key | Value |
| --- | --- |
| server:hmac_default_key | **hmac_default_key** |
| Set Key | Value |
| --- | --- |
| server:accepted_type | **accepted type** |
| server:accepted_extended_type | **accepted extended type** |
###### Server Mode
| Set Key | Value |
| --- | --- |
| blacklist_ip | **blacklisted ip** |
| blacklist_ip_by_uuid | **uuidv4** |
| blacklist_uuid | **uuidv4** |
###### Connection Manager
| Set Key | Value |
| --- | --- |
| active_connection | **uuid** |
| | |
| active_connection:**type** | **uuid** |
| active_connection_extended_type:**uuid** | **extended type** |
| | |
| active_uuid_type2:**uuid** | **session uuid** |
| | |
| map:active_connection-uuid-session_uuid:**uuid** | **session uuid** |
| Set Key | Field | Value |
| --- | --- | --- |
| map:session-uuid_active_extended_type | **session_uuid** | **extended_type** |
### Stats
| Zset Key | Field | Value |
| --- | --- | --- |
| stat_uuid_ip:**date**:**uuid** | **IP** | **number D4 Packets** |
| | | |
| stat_uuid_type:**date**:**uuid** | **type** | **number D4 Packets** |
| | | |
| stat_type_uuid:**date**:**type** | **uuid** | **number D4 Packets** |
| | | |
| stat_ip_uuid:20190519:158.64.14.86 | **uuid** | **number D4 Packets** |
| | | |
| | | |
| daily_uuid:**date** | **uuid** | **number D4 Packets** |
| | | |
| daily_type:**date** | **type** | **number D4 Packets** |
| | | |
| daily_ip:**date** | **IP** | **number D4 Packets** |
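As a sketch of how these counters are keyed (the helper functions are illustrative, not part of the codebase; only the key formats come from the tables above):

```python
# Illustrative helpers mirroring the Stats key layout documented above.

def daily_uuid_key(date):
    # zset: field = sensor uuid, value = number of D4 packets
    return 'daily_uuid:{}'.format(date)

def stat_uuid_ip_key(date, sensor_uuid):
    # zset: field = IP, value = number of D4 packets
    return 'stat_uuid_ip:{}:{}'.format(date, sensor_uuid)

# With a real redis-py client, one received packet would be counted with:
#   r.zincrby(daily_uuid_key('20190519'), 1, sensor_uuid)
```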
### metadata sensors
| Hset Key | Field | Value |
| --- | --- | --- |
| metadata_uuid:**uuid** | first_seen | **epoch** |
| | last_seen | **epoch** |
| | description | **description** | (optionnal)
| | Error | **error message** | (optionnal)
| | hmac_key | **hmac_key** | (optionnal)
| | user_id | **user_id** | (optionnal)
###### Last IP
| List Key | Value |
| --- | --- |
| list_uuid_ip:**uuid** | **IP** |
### metadata types by sensors
| Hset Key | Field | Value |
| --- | --- | --- |
| metadata_uuid:**uuid** | first_seen | **epoch** |
| | last_seen | **epoch** |
| Set Key | Value |
| --- | --- |
| all_types_by_uuid:**uuid** | **type** |
| all_extended_types_by_uuid:**uuid** | **type** |
### analyzers
###### metadata
| Hset Key | Field | Value |
| --- | --- | --- |
| analyzer:**uuid** | last_updated | **epoch** |
| | description | **description** |
| | max_size | **queue max size** |
###### all analyzers by type
| Set Key | Value |
| --- | --- |
| analyzer:**type** | **uuid** |
| analyzer:254:**extended type** | **uuid** |
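A minimal sketch of the analyzer key scheme above (the helper functions are hypothetical; only the key formats come from the tables):

```python
# Hypothetical helpers reproducing the analyzer key scheme documented above.

def analyzer_metadata_key(queue_uuid):
    # hset holding last_updated / description / max_size for one analyzer
    return 'analyzer:{}'.format(queue_uuid)

def analyzers_by_type_key(format_type, extended_type=None):
    # set of analyzer uuids registered for a given type;
    # format type 254 keys on the extended type name instead
    if format_type == 254 and extended_type:
        return 'analyzer:254:{}'.format(extended_type)
    return 'analyzer:{}'.format(format_type)
```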


@@ -0,0 +1,94 @@
# API DOCUMENTATION
## General
### Automation key
Automation is authenticated via a secret key available in the D4 web interface. Keep that key secret: it gives access to the entire database! The API key is available in the ``Settings`` menu under ``My Profile``.
The authorization is performed by using the following header:
~~~~
Authorization: YOUR_API_KEY
~~~~
### Accept and Content-Type headers
When submitting data in a POST, PUT or DELETE operation, you need to specify the content type in which the payload is encoded. This is done by setting the following Content-Type header:
~~~~
Content-Type: application/json
~~~~
Example:
~~~~
curl --header "Authorization: YOUR_API_KEY" --header "Content-Type: application/json" https://D4_URL/
~~~~
## Sensor Registration
### Register a sensor: `api/v1/add/sensor/register`<a name="add_sensor_register"></a>
#### Description
Register a sensor.
**Method** : `POST`
#### Parameters
- `uuid`
- sensor uuid
- *uuid4*
- mandatory
- `hmac_key`
- sensor secret key
- *binary*
- mandatory
- `description`
- sensor description
- *str*
- `mail`
- user mail
- *str*
#### JSON response
- `uuid`
- sensor uuid
- *uuid4*
#### Example
```
curl https://127.0.0.1:7000/api/v1/add/sensor/register --header "Authorization: iHc1_ChZxj1aXmiFiF1mkxxQkzawwriEaZpPqyTQj" -H "Content-Type: application/json" --data @input.json -X POST
```
#### input.json Example
```json
{
"uuid": "ff7ba400-e76c-4053-982d-feec42bdef38",
"hmac_key": "...HMAC_KEY..."
}
```
#### Expected Success Response
**HTTP Status Code** : `200`
```json
{
"uuid": "ff7ba400-e76c-4053-982d-feec42bdef38",
}
```
#### Expected Fail Response
**HTTP Status Code** : `400`
```json
{"status": "error", "reason": "Mandatory parameter(s) not provided"}
{"status": "error", "reason": "Invalid uuid"}
```
**HTTP Status Code** : `409`
```json
{"status": "error", "reason": "Sensor already registred"}
```
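The same registration call can be sketched in Python with only the standard library (API key and HMAC key are placeholders, as in the examples above; the request is built but not sent here):

```python
import json
import urllib.request

payload = {
    "uuid": "ff7ba400-e76c-4053-982d-feec42bdef38",
    "hmac_key": "...HMAC_KEY...",  # placeholder
}

req = urllib.request.Request(
    "https://127.0.0.1:7000/api/v1/add/sensor/register",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "YOUR_API_KEY",  # placeholder
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # requires a running D4 server (self-signed cert)
```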


@@ -7,3 +7,6 @@ openssl req -sha256 -new -key server.key -out server.csr -config san.cnf
openssl x509 -req -in server.csr -CA rootCA.crt -CAkey rootCA.key -CAcreateserial -out server.crt -days 500 -sha256 -extfile ext3.cnf
# Concat in pem
cat server.crt server.key > ../server.pem
# Copy certs for Flask https
cp server.key ../web/server.key
cp server.crt ../web/server.crt


@@ -2,4 +2,4 @@
# Create Root key
openssl genrsa -out rootCA.key 4096
# Create and Sign the Root CA Certificate
openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 1024 -out rootCA.crt
openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 1024 -out rootCA.crt -config san.cnf


@ -12,6 +12,10 @@ if [ -z "$VIRTUAL_ENV" ]; then
fi
python3 -m pip install -r requirement.txt
pushd configs/
cp server.conf.sample server.conf
popd
pushd web/
./update_web.sh
popd
@@ -25,3 +29,17 @@ pushd redis/
git checkout 5.0
make
popd
# LAUNCH
bash LAUNCH.sh -l &
wait
echo ""
# create default users
pushd web/
./create_default_user.py
popd
bash LAUNCH.sh -k &
wait
echo ""

server/lib/Analyzer_Queue.py Executable file

@@ -0,0 +1,370 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import d4_type
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_analyzer = config_loader.get_redis_conn("Redis_ANALYZER")
LIST_DEFAULT_SIZE = config_loader.get_config_int('D4_Server', 'analyzer_queues_max_size')
config_loader = None
### ###
def is_valid_uuid_v4(uuid_v4):
if uuid_v4:
uuid_v4 = uuid_v4.replace('-', '')
else:
return False
try:
uuid_test = uuid.UUID(hex=uuid_v4, version=4)
return uuid_test.hex == uuid_v4
except:
return False
def sanitize_uuid(uuid_v4, not_exist=False):
if not is_valid_uuid_v4(uuid_v4):
uuid_v4 = str(uuid.uuid4())
if not_exist:
if exist_queue(uuid_v4):
uuid_v4 = str(uuid.uuid4())
return uuid_v4
def sanitize_queue_type(format_type):
try:
format_type = int(format_type)
except:
format_type = 1
if format_type == 2:
format_type = 254
return format_type
def exist_queue(queue_uuid):
return r_serv_metadata.exists('analyzer:{}'.format(queue_uuid))
def get_all_queues(r_list=None):
res = r_serv_metadata.smembers('all_analyzer_queues')
if r_list:
return list(res)
return res
def get_all_queues_format_type(r_list=None):
res = r_serv_metadata.smembers('all:analyzer:format_type')
if r_list:
return list(res)
return res
def get_all_queues_extended_type(r_list=None):
res = r_serv_metadata.smembers('all:analyzer:extended_type')
if r_list:
return list(res)
return res
# GLOBAL
def get_all_queues_uuid_by_type(format_type, r_list=None):
res = r_serv_metadata.smembers('all:analyzer:by:format_type:{}'.format(format_type))
if r_list:
return list(res)
return res
# GLOBAL
def get_all_queues_uuid_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('all:analyzer:by:extended_type:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_queues_list_by_type(queue_type):
if isinstance(queue_type, int):
return get_all_queues_by_type(queue_type)
else:
return get_all_queues_by_extended_type(queue_type)
# ONLY NON GROUP
def get_all_queues_by_type(format_type, r_list=None):
'''
Get all analyzer Queues by type
:param format_type: data type
:type format_type: int
:param r_list: return list
:type r_list: boolean
:return: list or set of queues (uuid)
:rtype: list or set
'''
# 'all_analyzer_queues_by_type'
res = r_serv_metadata.smembers('analyzer:{}'.format(format_type))
if r_list:
return list(res)
return res
# ONLY NON GROUP
def get_all_queues_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('analyzer:254:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_all_queues_group_by_type(format_type, r_list=None):
res = r_serv_metadata.smembers('analyzer_uuid_group:{}'.format(format_type))
if r_list:
return list(res)
return res
def get_all_queues_group_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('analyzer_uuid_group:254:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_all_queues_by_sensor_group(queue_type, sensor_uuid, r_list=None):
res = r_serv_metadata.smembers('sensor:queues:{}:{}'.format(queue_type, sensor_uuid))
if r_list:
return list(res)
return res
def get_queue_group_all_sensors(queue_uuid, r_list=None):
res = r_serv_metadata.smembers('analyzer_sensor_group:{}'.format(queue_uuid))
if r_list:
return list(res)
return res
def get_queue_last_seen(queue_uuid, f_date='str_time'):
res = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'last_updated')
if f_date == 'str_date':
if res is None:
res = 'Never'
else:
res = datetime.datetime.fromtimestamp(float(res)).strftime('%Y-%m-%d %H:%M:%S')
return res
def get_queue_max_size(queue_uuid):
max_size = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'max_size')
if max_size is None:
max_size = LIST_DEFAULT_SIZE
return max_size
def get_queue_size(queue_uuid, format_type, extended_type=None):
if format_type==254:
if not extended_type:
extended_type = get_queue_extended_type(queue_uuid)
length = r_serv_analyzer.llen('analyzer:{}:{}'.format(extended_type, queue_uuid))
else:
length = r_serv_analyzer.llen('analyzer:{}:{}'.format(format_type, queue_uuid))
if length is None:
length = 0
return length
def get_queue_format_type(queue_uuid):
return int(r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'type'))
def get_queue_extended_type(queue_uuid):
return r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'metatype')
def is_queue_group_of_sensors(queue_uuid):
return r_serv_metadata.exists('analyzer_sensor_group:{}'.format(queue_uuid))
def get_queue_metadata(queue_uuid, format_type=None, extended_type=None, f_date='str_date', is_group=None, force_is_group_queue=False):
dict_queue_meta = {}
dict_queue_meta['uuid'] = queue_uuid
dict_queue_meta['size_limit'] = get_queue_max_size(queue_uuid)
dict_queue_meta['last_updated'] = get_queue_last_seen(queue_uuid, f_date=f_date)
dict_queue_meta['description'] = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'description')
if dict_queue_meta['description'] is None:
dict_queue_meta['description'] = ''
if not format_type:
format_type = get_queue_format_type(queue_uuid)
dict_queue_meta['format_type'] = format_type
if format_type==254:
if not extended_type:
extended_type = get_queue_extended_type(queue_uuid)
dict_queue_meta['extended_type'] = extended_type
dict_queue_meta['length'] = get_queue_size(queue_uuid, format_type, extended_type=extended_type)
if is_group and not force_is_group_queue:
dict_queue_meta['is_group_queue'] = is_queue_group_of_sensors(queue_uuid)
else:
if force_is_group_queue:
dict_queue_meta['is_group_queue'] = True
else:
dict_queue_meta['is_group_queue'] = False
return dict_queue_meta
def edit_queue_description(queue_uuid, description):
if r_serv_metadata.exists('analyzer:{}'.format(queue_uuid)) and description:
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'description', description)
def edit_queue_max_size(queue_uuid, max_size):
try:
max_size = int(max_size)
except:
return 'analyzer max size, Invalid Integer'
if r_serv_metadata.exists('analyzer:{}'.format(queue_uuid)) and max_size > 0:
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'max_size', max_size)
def edit_queue_sensors_set(queue_uuid, l_sensors_uuid):
format_type = get_queue_format_type(queue_uuid)
set_current_sensors = get_queue_group_all_sensors(queue_uuid)
# normalize the submitted uuids before diffing against the current set
l_new_sensors_uuid = set()
for sensor_uuid in l_sensors_uuid:
l_new_sensors_uuid.add(sensor_uuid.replace('-', ''))
sensors_to_add = l_new_sensors_uuid.difference(set_current_sensors)
sensors_to_remove = set_current_sensors.difference(l_new_sensors_uuid)
for sensor_uuid in sensors_to_add:
r_serv_metadata.sadd('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.sadd('sensor:queues:{}:{}'.format(format_type, sensor_uuid), queue_uuid)
for sensor_uuid in sensors_to_remove:
r_serv_metadata.srem('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.srem('sensor:queues:{}:{}'.format(format_type, sensor_uuid), queue_uuid)
# create queue by type or by group of uuid
# # TODO: add size limit
def create_queues(format_type, queue_uuid=None, l_uuid=[], queue_type='list', metatype_name=None, description=None):
format_type = sanitize_queue_type(format_type)
if not d4_type.is_accepted_format_type(format_type):
return {'error': 'Invalid type'}
if format_type == 254 and not d4_type.is_accepted_extended_type(metatype_name):
return {'error': 'Invalid extended type'}
queue_uuid = sanitize_uuid(queue_uuid, not_exist=True)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', format_type)
edit_queue_description(queue_uuid, description)
# # TODO: check l_uuid is valid
if l_uuid:
analyzer_key_name = 'analyzer_uuid_group'
else:
analyzer_key_name = 'analyzer'
r_serv_metadata.sadd('all:analyzer:format_type', format_type)
r_serv_metadata.sadd('all:analyzer:by:format_type:{}'.format(format_type), queue_uuid)
if format_type == 254:
# TODO: check metatype_name
r_serv_metadata.sadd('{}:{}:{}'.format(analyzer_key_name, format_type, metatype_name), queue_uuid)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'metatype', metatype_name)
r_serv_metadata.sadd('all:analyzer:by:extended_type:{}'.format(metatype_name), queue_uuid)
r_serv_metadata.sadd('all:analyzer:extended_type', metatype_name)
else:
r_serv_metadata.sadd('{}:{}'.format(analyzer_key_name, format_type), queue_uuid)
# Group by UUID
if l_uuid:
# # TODO: check sensor_uuid is valid
if format_type == 254:
queue_type = metatype_name
for sensor_uuid in l_uuid:
sensor_uuid = sensor_uuid.replace('-', '')
r_serv_metadata.sadd('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.sadd('sensor:queues:{}:{}'.format(queue_type, sensor_uuid), queue_uuid)
# ALL
r_serv_metadata.sadd('all_analyzer_queues', queue_uuid)
return queue_uuid
# format_type int or str (extended type)
def add_data_to_queue(sensor_uuid, queue_type, data):
if data:
# by data type
for queue_uuid in get_queues_list_by_type(queue_type):
r_serv_analyzer.lpush('analyzer:{}:{}'.format(queue_type, queue_uuid), data)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'last_updated', time.time())
analyser_queue_max_size = get_queue_max_size(queue_uuid)
r_serv_analyzer.ltrim('analyzer:{}:{}'.format(queue_type, queue_uuid), 0, analyser_queue_max_size)
# by data type
for queue_uuid in get_all_queues_by_sensor_group(queue_type, sensor_uuid):
r_serv_analyzer.lpush('analyzer:{}:{}'.format(queue_type, queue_uuid), data)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'last_updated', time.time())
analyser_queue_max_size = get_queue_max_size(queue_uuid)
r_serv_analyzer.ltrim('analyzer:{}:{}'.format(queue_type, queue_uuid), 0, analyser_queue_max_size)
def flush_queue(queue_uuid, queue_type):
r_serv_analyzer.delete('analyzer:{}:{}'.format(queue_type, queue_uuid))
def remove_queues(queue_uuid, queue_type, metatype_name=None):
try:
queue_type = int(queue_type)
except:
print('error: Invalid format type')
return {'error': 'Invalid format type'}
if not is_valid_uuid_v4(queue_uuid):
print('error: Invalid uuid')
return {'error': 'Invalid uuid'}
if not exist_queue(queue_uuid):
print('error: unknown queue uuid')
return {'error': 'unknown queue uuid'}
if queue_type==254 and not metatype_name:
metatype_name = get_queue_extended_type(queue_uuid)
# delete metadata
r_serv_metadata.delete('analyzer:{}'.format(queue_uuid))
# delete queue group of sensors uuid
l_sensors_uuid = get_queue_group_all_sensors(queue_uuid)
if l_sensors_uuid:
r_serv_metadata.delete('analyzer_sensor_group:{}'.format(queue_uuid))
if queue_type == 254:
queue_type = metatype_name
for sensor_uuid in l_sensors_uuid:
r_serv_metadata.srem('sensor:queues:{}:{}'.format(queue_type, sensor_uuid), queue_uuid)
if l_sensors_uuid:
analyzer_key_name = 'analyzer_uuid_group'
else:
analyzer_key_name = 'analyzer'
r_serv_metadata.srem('all:analyzer:by:format_type:{}'.format(queue_type), queue_uuid)
if queue_type == 254:
r_serv_metadata.srem('{}:254:{}'.format(analyzer_key_name, metatype_name), queue_uuid)
r_serv_metadata.srem('all:analyzer:by:extended_type:{}'.format(metatype_name), queue_uuid)
else:
r_serv_metadata.srem('{}:{}'.format(analyzer_key_name, queue_type), queue_uuid)
r_serv_metadata.srem('all_analyzer_queues', queue_uuid)
## delete global queue ##
if not r_serv_metadata.exists('all:analyzer:by:format_type:{}'.format(queue_type)):
r_serv_metadata.srem('all:analyzer:format_type', queue_type)
if queue_type ==254:
if not r_serv_metadata.exists('all:analyzer:by:extended_type:{}'.format(metatype_name)):
r_serv_metadata.srem('all:analyzer:extended_type', metatype_name)
## --- ##
# delete queue
r_serv_analyzer.delete('analyzer:{}:{}'.format(queue_type, queue_uuid))
def get_sensor_queues(sensor_uuid):
pass
if __name__ == '__main__':
#create_queues(3, l_uuid=['03c00bcf-fe53-46a1-85bb-ee6084cb5bb2'])
remove_queues('a2e6f95c-1efe-4d2b-a0f5-d8e205d85670', 3)

server/lib/ConfigLoader.py Executable file

@@ -0,0 +1,54 @@
#!/usr/bin/python3
"""
The ``ConfigLoader``
===================
"""
import os
import sys
import time
import redis
import configparser
# Get Config file
config_dir = os.path.join(os.environ['D4_HOME'], 'configs')
config_file = os.path.join(config_dir, 'server.conf')
if not os.path.exists(config_file):
raise Exception('Unable to find the configuration file. \
Did you set the environment variables \
and activate the virtualenv?')
# # TODO: create sphinx doc
# # TODO: add config_field to reload
class ConfigLoader(object):
"""docstring for Config_Loader."""
def __init__(self):
self.cfg = configparser.ConfigParser()
self.cfg.read(config_file)
def get_redis_conn(self, redis_name, decode_responses=True): ## TODO: verify redis name
return redis.StrictRedis( host=self.cfg.get(redis_name, "host"),
port=self.cfg.getint(redis_name, "port"),
db=self.cfg.getint(redis_name, "db"),
decode_responses=decode_responses )
def get_config_str(self, section, key_name):
return self.cfg.get(section, key_name)
def get_config_int(self, section, key_name):
return self.cfg.getint(section, key_name)
def get_config_boolean(self, section, key_name):
return self.cfg.getboolean(section, key_name)
def has_option(self, section, key_name):
return self.cfg.has_option(section, key_name)
def has_section(self, section):
return self.cfg.has_section(section)

server/lib/Sensor.py Executable file

@@ -0,0 +1,275 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
from flask import escape
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import d4_server
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_db = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
def is_valid_uuid_v4(UUID):
UUID = UUID.replace('-', '')
try:
uuid_test = uuid.UUID(hex=UUID, version=4)
return uuid_test.hex == UUID
except:
return False
def get_time_sensor_last_seen(sensor_uuid):
res = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'last_seen')
if res:
return int(res)
else:
return 0
def _get_sensor_type(sensor_uuid, first_seen=True, last_seen=True, time_format='default'):
uuid_type = []
uuid_all_type = r_serv_db.smembers('all_types_by_uuid:{}'.format(sensor_uuid))
for type in uuid_all_type:
type_meta = {}
type_meta['type'] = type
if first_seen:
type_meta['first_seen'] = r_serv_db.hget('metadata_type_by_uuid:{}:{}'.format(sensor_uuid, type), 'first_seen')
if last_seen:
type_meta['last_seen'] = r_serv_db.hget('metadata_type_by_uuid:{}:{}'.format(sensor_uuid, type), 'last_seen')
# time format
if time_format=='gmt':
if type_meta['first_seen']:
type_meta['first_seen'] = datetime.datetime.fromtimestamp(float(type_meta['first_seen'])).strftime('%Y-%m-%d %H:%M:%S')
if type_meta['last_seen']:
type_meta['last_seen'] = datetime.datetime.fromtimestamp(float(type_meta['last_seen'])).strftime('%Y-%m-%d %H:%M:%S')
uuid_type.append(type_meta)
return uuid_type
def _get_sensor_metadata(sensor_uuid, first_seen=True, last_seen=True, time_format='default', sensor_types=False, mail=True, description=True):
meta_sensor = {}
meta_sensor['uuid'] = sensor_uuid
if first_seen:
meta_sensor['first_seen'] = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'first_seen')
if last_seen:
meta_sensor['last_seen'] = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'last_seen')
# time format
if time_format=='gmt':
if meta_sensor['first_seen']:
meta_sensor['first_seen'] = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(meta_sensor['first_seen'])))
if meta_sensor['last_seen']:
meta_sensor['last_seen'] = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(meta_sensor['last_seen'])))
if sensor_types:
meta_sensor['types'] = _get_sensor_type(sensor_uuid, first_seen=False, last_seen=False)
if description:
meta_sensor['description'] = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'description')
if mail:
meta_sensor['mail'] = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'user_mail')
return meta_sensor
### BEGIN - SENSOR REGISTRATION ###
## TODO: add description
def register_sensor(req_dict):
sensor_uuid = req_dict.get('uuid', None)
hmac_key = req_dict.get('hmac_key', None)
user_id = req_dict.get('mail', None)
third_party = req_dict.get('third_party', None)
# verify uuid
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# sensor already exist
if r_serv_db.exists('metadata_uuid:{}'.format(sensor_uuid)):
return ({"status": "error", "reason": "Sensor already registered"}, 409)
# hmac key
if not hmac_key:
return ({"status": "error", "reason": "Mandatory parameter(s) not provided"}, 400)
else:
hmac_key = escape(hmac_key)
if len(hmac_key)>100:
hmac_key=hmac_key[:100]
if third_party:
third_party = True
res = _register_sensor(sensor_uuid, hmac_key, user_id=user_id, third_party=third_party, description=None)
return res
def _register_sensor(sensor_uuid, secret_key, user_id=None, third_party=False, description=None):
r_serv_db.hset('metadata_uuid:{}'.format(sensor_uuid), 'hmac_key', secret_key)
if user_id:
r_serv_db.hset('metadata_uuid:{}'.format(sensor_uuid), 'user_mail', user_id)
if description:
r_serv_db.hset('metadata_uuid:{}'.format(sensor_uuid), 'description', description)
if third_party:
r_serv_db.hset('metadata_uuid:{}'.format(sensor_uuid), 'third_party', True)
r_serv_db.sadd('sensor_pending_registration', sensor_uuid)
return ({'uuid': sensor_uuid}, 200)
def get_pending_sensor():
return list(r_serv_db.smembers('sensor_pending_registration'))
def get_nb_pending_sensor():
return r_serv_db.scard('sensor_pending_registration')
def get_nb_registered_sensors():
return r_serv_db.scard('registered_uuid')
def get_registered_sensors():
return list(r_serv_db.smembers('registered_uuid'))
def approve_sensor(req_dict):
sensor_uuid = req_dict.get('uuid', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# sensor not registered
#if not r_serv_db.sismember('sensor_pending_registration', sensor_uuid):
# return ({"status": "error", "reason": "Sensor not registered"}, 404)
# sensor already approved
if r_serv_db.sismember('registered_uuid', sensor_uuid):
return ({"status": "error", "reason": "Sensor already approved"}, 409)
return _approve_sensor(sensor_uuid)
def _approve_sensor(sensor_uuid):
r_serv_db.sadd('registered_uuid', sensor_uuid)
r_serv_db.srem('sensor_pending_registration', sensor_uuid)
return ({'uuid': sensor_uuid}, 200)
def delete_pending_sensor(req_dict):
sensor_uuid = req_dict.get('uuid', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# sensor not registered
#if not r_serv_db.sismember('sensor_pending_registration', sensor_uuid):
# return ({"status": "error", "reason": "Sensor not registered"}, 404)
# sensor not pending
if not r_serv_db.sismember('sensor_pending_registration', sensor_uuid):
return ({"status": "error", "reason": "Not Pending Sensor"}, 409)
return _delete_pending_sensor(sensor_uuid)
def _delete_pending_sensor(sensor_uuid):
r_serv_db.srem('sensor_pending_registration', sensor_uuid)
return ({'uuid': sensor_uuid}, 200)
def delete_registered_sensor(req_dict):
sensor_uuid = req_dict.get('uuid', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# sensor not registered
if not r_serv_db.sismember('registered_uuid', sensor_uuid):
return ({"status": "error", "reason": "Sensor not registered"}, 404)
return _delete_registered_sensor(sensor_uuid)
def _delete_registered_sensor(sensor_uuid):
r_serv_db.srem('registered_uuid', sensor_uuid)
return ({'uuid': sensor_uuid}, 200)
### --- END - SENSOR REGISTRATION --- ###
### BEGIN - SENSOR MONITORING ###
def get_sensors_monitoring_last_updated():
res = r_serv_db.get('sensors_monitoring:last_updated')
if res:
return int(res)
else:
return 0
def get_all_sensors_to_monitor():
return r_serv_db.smembers('to_monitor:sensors')
def get_to_monitor_delta_time_by_uuid(sensor_uuid):
return int(r_serv_db.hget('to_monitor:sensor:{}'.format(sensor_uuid), 'delta_time'))
def get_all_sensors_to_monitor_dict():
dict_to_monitor = {}
for sensor_uuid in get_all_sensors_to_monitor():
dict_to_monitor[sensor_uuid] = get_to_monitor_delta_time_by_uuid(sensor_uuid)
return dict_to_monitor
def _check_sensor_delta(sensor_uuid, sensor_delta):
last_d4_packet = get_time_sensor_last_seen(sensor_uuid)
# check sensor delta time between two D4 packets + check sensor connection
if int(time.time()) - last_d4_packet > sensor_delta or not d4_server.is_sensor_connected(sensor_uuid):
r_serv_db.sadd('sensors_monitoring:sensors_error', sensor_uuid)
handle_sensor_monitoring_error(sensor_uuid)
else:
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def handle_sensor_monitoring_error(sensor_uuid):
print('sensor monitoring error: {}'.format(sensor_uuid))
## TODO: ##
# MAILS
# UI Notifications
# SNMP
# Syslog message
## ## ## ##
return None
def is_sensor_monitored(sensor_uuid):
return r_serv_db.sismember('to_monitor:sensors', sensor_uuid)
def get_all_sensors_connection_errors():
return r_serv_db.smembers('sensors_monitoring:sensors_error')
def api_get_all_sensors_connection_errors():
return list(get_all_sensors_connection_errors()), 200
def add_sensor_to_monitor(sensor_uuid, delta_time):
r_serv_db.sadd('to_monitor:sensors', sensor_uuid)
r_serv_db.hset('to_monitor:sensor:{}'.format(sensor_uuid), 'delta_time', delta_time)
r_serv_db.set('sensors_monitoring:last_updated', int(time.time()))
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def delete_sensor_to_monitor(sensor_uuid):
r_serv_db.srem('to_monitor:sensors', sensor_uuid)
r_serv_db.delete('to_monitor:sensor:{}'.format(sensor_uuid))
r_serv_db.set('sensors_monitoring:last_updated', int(time.time()))
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def api_add_sensor_to_monitor(data_dict):
sensor_uuid = data_dict.get('uuid', None)
delta_time = data_dict.get('delta_time', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# delta time
if not delta_time:
return ({"status": "error", "reason": "Mandatory parameter(s) not provided"}, 400)
else:
try:
delta_time = int(delta_time)
if delta_time < 1:
return ({"status": "error", "reason": "Invalid delta_time"}, 400)
except Exception:
return ({"status": "error", "reason": "Invalid delta_time"}, 400)
add_sensor_to_monitor(sensor_uuid, delta_time)
def api_delete_sensor_to_monitor(data_dict):
sensor_uuid = data_dict.get('uuid', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
if not is_sensor_monitored(sensor_uuid):
return ({"status": "error", "reason": "Sensor not monitored"}, 400)
delete_sensor_to_monitor(sensor_uuid)
### --- END - SENSOR REGISTRATION --- ###
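For reference, the UUID check performed by `api_add_sensor_to_monitor` can be sketched standalone. This is a minimal sketch: the helper below mirrors the behaviour of `is_valid_uuid_v4` (defined elsewhere in the library), round-tripping the hex string through `uuid.UUID`:

```python
import uuid

def is_valid_uuid_v4(header_uuid):
    # round-trips the hex through uuid.UUID; malformed or non-v4 input fails
    try:
        return uuid.UUID(hex=header_uuid, version=4).hex == header_uuid
    except (ValueError, AttributeError, TypeError):
        return False

# the API strips dashes before storing, so keys always use the 32-char hex form
sensor_uuid = str(uuid.uuid4()).replace('-', '')
assert is_valid_uuid_v4(sensor_uuid)
assert not is_valid_uuid_v4('not-a-uuid')
```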

server/lib/User.py Executable file

@@ -0,0 +1,74 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import time
import redis
import bcrypt
import random
from flask_login import UserMixin
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
config_loader = ConfigLoader.ConfigLoader()
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
class User(UserMixin):
def __init__(self, id):
self.r_serv_db = r_serv_db
if self.r_serv_db.hexists('user:all', id):
self.id = id
else:
self.id = "__anonymous__"
#def is_authenticated():
#    return True or False
#def is_anonymous():
#    return True or False
@classmethod
def get(self_class, id):
return self_class(id)
def user_is_anonymous(self):
if self.id == "__anonymous__":
return True
else:
return False
def check_password(self, password):
if self.user_is_anonymous():
return False
rand_sleep = random.randint(1,300)/1000
time.sleep(rand_sleep)
password = password.encode()
hashed_password = self.r_serv_db.hget('user:all', self.id).encode()
if bcrypt.checkpw(password, hashed_password):
return True
else:
return False
def request_password_change(self):
if self.r_serv_db.hget('user_metadata:{}'.format(self.id), 'change_passwd') == 'True':
return True
else:
return False
def is_in_role(self, role):
if self.r_serv_db.sismember('user_role:{}'.format(role), self.id):
return True
else:
return False

server/lib/d4_server.py Executable file

@@ -0,0 +1,44 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import time
import uuid
import redis
from flask import escape
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_stream = config_loader.get_redis_conn("Redis_STREAM")
config_loader = None
### ###
### BEGIN - SENSOR CONNECTION ###
def get_all_connected_sensors(r_list=False):
res = r_stream.smembers('active_connection')
if r_list:
if res:
return list(res)
else:
return []
else:
return res
def get_all_connected_sensors_by_type(d4_type, d4_extended_type=None):
# D4 extended type
if d4_type == 254 and d4_extended_type:
return r_stream.smembers('active_connection_extended_type:{}'.format(d4_extended_type))
# type 1-253
else:
return r_stream.smembers('active_connection:{}'.format(d4_type))
def is_sensor_connected(sensor_uuid):
return r_stream.sismember('active_connection', sensor_uuid)
### --- END - SENSOR CONNECTION --- ###

server/lib/d4_type.py Executable file

@@ -0,0 +1,42 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
def get_all_accepted_format_type(r_list=False):
res = r_serv_metadata.smembers('server:accepted_type')
if r_list:
if res:
return list(res)
else:
return []
return res
def get_all_accepted_extended_type(r_list=False):
res = r_serv_metadata.smembers('server:accepted_extended_type')
if r_list:
if res:
return list(res)
else:
return []
return res
def is_accepted_format_type(format_type):
return r_serv_metadata.sismember('server:accepted_type', format_type)
def is_accepted_extended_type(extended_type):
return r_serv_metadata.sismember('server:accepted_extended_type', extended_type)

requirement.txt

@@ -1,6 +1,9 @@
twisted[tls]
redis
flask
flask==2.2.2
flask-login
bcrypt
Werkzeug==2.2.2
#sudo python3 -m pip install --upgrade service_identity

server/sensors_manager.py Executable file

@@ -0,0 +1,54 @@
#!/usr/bin/env python3
import os
import sys
import time
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Sensor
### Config ###
config_loader = ConfigLoader.ConfigLoader()
#redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
try:
redis_server_metadata.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server: Redis_METADATA, ConnectionError')
sys.exit(1)
def reload_all_sensors_to_monitor_dict(dict_to_monitor, last_updated):
if not dict_to_monitor:
dict_to_monitor = Sensor.get_all_sensors_to_monitor_dict()
else:
monitoring_last_updated = Sensor.get_sensors_monitoring_last_updated()
if monitoring_last_updated > last_updated:
dict_to_monitor = Sensor.get_all_sensors_to_monitor_dict()
print('updated: List of sensors to monitor')
return dict_to_monitor
if __name__ == "__main__":
time_refresh = int(time.time())
last_updated = time_refresh
all_sensors_to_monitor = Sensor.get_all_sensors_to_monitor_dict()
while True:
for sensor_uuid in all_sensors_to_monitor:
Sensor._check_sensor_delta(sensor_uuid, all_sensors_to_monitor[sensor_uuid])
time.sleep(10)
## reload dict_to_monitor ##
curr_time = int(time.time())
if curr_time - time_refresh >= 60:
time_refresh = curr_time
all_sensors_to_monitor = reload_all_sensors_to_monitor_dict(all_sensors_to_monitor, last_updated)
last_updated = curr_time
##-- --##
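The per-sensor check done by `Sensor._check_sensor_delta` reduces to a small pure predicate; a sketch (the function name is ours) mirroring the error condition used by the monitoring loop above:

```python
import time

def sensor_in_error(last_seen, delta, connected, now=None):
    # error when the gap since the last D4 packet exceeds the configured
    # delta, or when the sensor has no active connection
    if now is None:
        now = int(time.time())
    return (now - last_seen) > delta or not connected

assert sensor_in_error(last_seen=100, delta=30, connected=True, now=200)
assert not sensor_in_error(last_seen=100, delta=30, connected=True, now=120)
assert sensor_in_error(last_seen=100, delta=30, connected=False, now=110)
```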


@@ -13,6 +13,8 @@ import argparse
import logging
import logging.handlers
import configparser
from twisted.internet import ssl, task, protocol, endpoints, defer
from twisted.python import log
from twisted.python.modules import getModule
@@ -20,10 +22,15 @@ from twisted.python.modules import getModule
from twisted.internet.protocol import Protocol
from twisted.protocols.policies import TimeoutMixin
hmac_reset = bytearray(32)
hmac_key = b'private key to change'
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
accepted_type = [1, 4, 8]
hmac_reset = bytearray(32)
accepted_type = [1, 2, 4, 8, 254]
accepted_extended_type = ['ja3-jl']
all_server_modes = ('registration', 'shared-secret')
timeout_time = 30
@@ -32,41 +39,144 @@ header_size = 62
data_default_size_limit = 1000000
default_max_entries_by_stream = 10000
host_redis_stream = "localhost"
port_redis_stream = 6379
### Config ###
config_loader = ConfigLoader.ConfigLoader()
host_redis_metadata = "localhost"
port_redis_metadata= 6380
# REDIS #
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
# get server_mode
try:
D4server_port = config_loader.get_config_int("D4_Server", "server_port")
except configparser.NoOptionError:
D4server_port = 4443
redis_server_metadata = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=0)
server_mode = config_loader.get_config_str("D4_Server", "server_mode")
try:
hmac_key = config_loader.get_config_str("D4_Server", "default_hmac_key")
except configparser.NoOptionError:
hmac_key = 'private key to change'
config_loader = None
### ###
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis_stream, port_redis_stream))
print('Error: Redis server Redis_STREAM, ConnectionError')
sys.exit(1)
try:
redis_server_metadata.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis_metadata, port_redis_metadata))
print('Error: Redis server Redis_METADATA, ConnectionError')
sys.exit(1)
### REDIS ###
# set hmac default key
redis_server_metadata.set('server:hmac_default_key', hmac_key)
# init redis_server_metadata
redis_server_metadata.delete('server:accepted_type')
for type in accepted_type:
redis_server_metadata.sadd('server:accepted_type', type)
for type in accepted_extended_type:
redis_server_metadata.sadd('server:accepted_extended_type', type)
dict_all_connection = {}
### FUNCTIONS ###
# kick sensors
def kick_sensors():
for client_uuid in redis_server_stream.smembers('server:sensor_to_kick'):
client_uuid = client_uuid.decode()
for session_uuid in redis_server_stream.smembers('map:active_connection-uuid-session_uuid:{}'.format(client_uuid)):
session_uuid = session_uuid.decode()
logger.warning('Sensor kicked uuid={}, session_uuid={}'.format(client_uuid, session_uuid))
redis_server_stream.set('temp_blacklist_uuid:{}'.format(client_uuid), 'some random string')
redis_server_stream.expire('temp_blacklist_uuid:{}'.format(client_uuid), 30)
dict_all_connection[session_uuid].transport.abortConnection()
redis_server_stream.srem('server:sensor_to_kick', client_uuid)
# Unpack D4 Header
#def unpack_header(data):
# data_header = {}
# if len(data) >= header_size:
# data_header['version'] = struct.unpack('B', data[0:1])[0]
# data_header['type'] = struct.unpack('B', data[1:2])[0]
# data_header['uuid_header'] = data[2:18].hex()
# data_header['timestamp'] = struct.unpack('Q', data[18:26])[0]
# data_header['hmac_header'] = data[26:58]
# data_header['size'] = struct.unpack('I', data[58:62])[0]
# return data_header
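The commented-out parser above documents the fixed 62-byte D4 header layout (1-byte version, 1-byte type, 16-byte uuid, 8-byte timestamp, 32-byte HMAC, 4-byte payload size). A minimal packing sketch for testing (the helper name is ours; byte order follows the native-order `struct` formats used by `unpack_header`):

```python
import struct
import time
import uuid

HEADER_SIZE = 62

def pack_d4_header(version, d4_type, sensor_uuid, timestamp, hmac_bytes, size):
    # field offsets match unpack_header: 0 version, 1 type, 2:18 uuid,
    # 18:26 timestamp, 26:58 hmac, 58:62 payload size
    return (struct.pack('B', version) + struct.pack('B', d4_type)
            + sensor_uuid.bytes + struct.pack('Q', timestamp)
            + hmac_bytes + struct.pack('I', size))

hdr = pack_d4_header(1, 1, uuid.uuid4(), int(time.time()), bytes(32), 1000)
assert len(hdr) == HEADER_SIZE
assert struct.unpack('I', hdr[58:62])[0] == 1000
```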
def is_valid_uuid_v4(header_uuid):
try:
uuid_test = uuid.UUID(hex=header_uuid, version=4)
return uuid_test.hex == header_uuid
except Exception:
logger.info('Not UUID v4: uuid={}'.format(header_uuid))
return False
# # TODO: check timestamp
def is_valid_header(uuid_to_check, type):
if is_valid_uuid_v4(uuid_to_check):
if redis_server_metadata.sismember('server:accepted_type', type):
return True
else:
logger.warning('Invalid type, the server does not accept this type: {}, uuid={}'.format(type, uuid_to_check))
return False
else:
logger.info('Invalid Header, uuid={}'.format(uuid_to_check))
return False
def extract_ip(ip_string):
#remove interface
ip_string = ip_string.split('%')[0]
# IPv4
#extract ipv4
if '.' in ip_string:
return ip_string.split(':')[-1]
# IPv6
else:
return ip_string
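Twisted reports IPv4 peers of a dual-stack (`interface='::'`) listener as IPv4-mapped IPv6 strings, which is what the IPv4 branch above unwraps; for example:

```python
def extract_ip(ip_string):
    # drop a scope/interface suffix such as 'fe80::1%eth0'
    ip_string = ip_string.split('%')[0]
    # IPv4, possibly IPv4-mapped IPv6 ('::ffff:192.0.2.1')
    if '.' in ip_string:
        return ip_string.split(':')[-1]
    # plain IPv6
    return ip_string

assert extract_ip('::ffff:192.0.2.1') == '192.0.2.1'
assert extract_ip('192.0.2.1') == '192.0.2.1'
assert extract_ip('fe80::1%eth0') == 'fe80::1'
```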
def server_mode_registration(header_uuid):
# only accept registered uuid
if server_mode == 'registration':
if not redis_server_metadata.sismember('registered_uuid', header_uuid):
error_msg = 'Not registered UUID={}, connection closed'.format(header_uuid)
print(error_msg)
logger.warning(error_msg)
#redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: This UUID is temporarily blacklisted')
return False
else:
return True
else:
return True
def is_client_ip_blacklisted():
pass
def is_uuid_blacklisted(uuid):
return redis_server_metadata.sismember('blacklist_uuid', uuid)
# return True if not blocked
# False if blacklisted
def check_blacklist():
pass
# Kill Connection + create log
#def manual_abort_connection(self, message, log_level='WARNING'):
# logger.log(message)
# self.transport.abortConnection()
# return 1
### ###
class D4_Server(Protocol, TimeoutMixin):
@@ -75,7 +185,9 @@ class D4_Server(Protocol, TimeoutMixin):
self.setTimeout(timeout_time)
self.session_uuid = str(uuid.uuid4())
self.data_saved = False
self.update_stream_type = True
self.first_connection = True
self.duplicate = False
self.ip = None
self.source_port = None
self.stream_max_size = None
@@ -84,32 +196,67 @@ class D4_Server(Protocol, TimeoutMixin):
self.type = None
self.uuid = None
logger.debug('New session: session_uuid={}'.format(self.session_uuid))
dict_all_connection[self.session_uuid] = self
def dataReceived(self, data):
# check and kick sensor by uuid
kick_sensors()
self.resetTimeout()
if self.first_connection or self.ip is None:
client_info = self.transport.client
self.ip = self.extract_ip(client_info[0])
self.ip = extract_ip(client_info[0])
self.source_port = client_info[1]
logger.debug('New connection, ip={}, port={} session_uuid={}'.format(self.ip, self.source_port, self.session_uuid))
# check blacklisted_ip
if redis_server_metadata.sismember('blacklist_ip', self.ip):
self.transport.abortConnection()
logger.warning('Blacklisted IP={}, connection closed'.format(self.ip))
self.process_header(data, self.ip, self.source_port)
else:
# process data
self.process_header(data, self.ip, self.source_port)
def timeoutConnection(self):
self.resetTimeout()
self.buffer = b''
logger.debug('buffer timeout, session_uuid={}'.format(self.session_uuid))
if self.uuid is None:
# # TODO: ban auto
logger.warning('Timeout, no D4 header sent, session_uuid={}, connection closed'.format(self.session_uuid))
self.transport.abortConnection()
else:
self.resetTimeout()
self.buffer = b''
logger.debug('buffer timeout, session_uuid={}'.format(self.session_uuid))
def connectionMade(self):
self.transport.setTcpKeepAlive(1)
def connectionLost(self, reason):
redis_server_stream.sadd('ended_session', self.session_uuid)
self.setTimeout(None)
redis_server_stream.srem('active_connection:{}'.format(self.type), '{}:{}'.format(self.ip, self.uuid))
redis_server_stream.srem('active_connection', '{}'.format(self.uuid))
if not self.duplicate:
if self.type == 254 or self.type == 2:
redis_server_stream.srem('active_uuid_type{}:{}'.format(self.type, self.uuid), self.session_uuid)
if not redis_server_stream.exists('active_uuid_type{}:{}'.format(self.type, self.uuid)):
redis_server_stream.srem('active_connection:{}'.format(self.type), self.uuid)
redis_server_stream.srem('active_connection_by_uuid:{}'.format(self.uuid), self.type)
# clean extended type
current_extended_type = redis_server_stream.hget('map:session-uuid_active_extended_type', self.session_uuid)
if current_extended_type:
redis_server_stream.hdel('map:session-uuid_active_extended_type', self.session_uuid)
redis_server_stream.srem('active_connection_extended_type:{}'.format(self.uuid), current_extended_type)
else:
if self.uuid:
redis_server_stream.srem('active_connection:{}'.format(self.type), self.uuid)
redis_server_stream.srem('active_connection_by_uuid:{}'.format(self.uuid), self.type)
if self.uuid:
redis_server_stream.srem('map:active_connection-uuid-session_uuid:{}'.format(self.uuid), self.session_uuid)
if not redis_server_stream.exists('active_connection_by_uuid:{}'.format(self.uuid)):
redis_server_stream.srem('active_connection', self.uuid)
logger.debug('Connection closed: session_uuid={}'.format(self.session_uuid))
dict_all_connection.pop(self.session_uuid)
def unpack_header(self, data):
data_header = {}
@@ -120,89 +267,172 @@ class D4_Server(Protocol, TimeoutMixin):
data_header['timestamp'] = struct.unpack('Q', data[18:26])[0]
data_header['hmac_header'] = data[26:58]
data_header['size'] = struct.unpack('I', data[58:62])[0]
# blacklist ip by uuid
if redis_server_metadata.sismember('blacklist_ip_by_uuid', data_header['uuid_header']):
redis_server_metadata.sadd('blacklist_ip', self.ip)
self.transport.abortConnection()
logger.warning('Blacklisted IP by UUID={}, connection closed'.format(data_header['uuid_header']))
# uuid blacklist
if redis_server_metadata.sismember('blacklist_uuid', data_header['uuid_header']):
self.transport.abortConnection()
logger.warning('Blacklisted UUID={}, connection closed'.format(data_header['uuid_header']))
# check default size limit
if data_header['size'] > data_default_size_limit:
self.transport.abortConnection()
logger.warning('Incorrect header data size: the server received more data than expected by default, expected={}, received={} , uuid={}, session_uuid={}'.format(data_default_size_limit, data_header['size'] ,data_header['uuid_header'], self.session_uuid))
# Worker: Incorrect type
if redis_server_stream.sismember('Error:IncorrectType:{}'.format(data_header['type']), self.session_uuid):
self.transport.abortConnection()
redis_server_stream.delete('stream:{}:{}'.format(data_header['type'], self.session_uuid))
redis_server_stream.srem('Error:IncorrectType:{}'.format(data_header['type']), self.session_uuid)
logger.warning('Incorrect type={} detected by worker, uuid={}, session_uuid={}'.format(data_header['type'] ,data_header['uuid_header'], self.session_uuid))
return data_header
def extract_ip(self, ip_string):
#remove interface
ip_string = ip_string.split('%')[0]
# IPv4
#extract ipv4
if '.' in ip_string:
return ip_string.split(':')[-1]
# IPv6
else:
return ip_string
def check_hmac_key(self, hmac_header, data):
if self.hmac_key is None:
self.hmac_key = redis_server_metadata.hget('metadata_uuid:{}'.format(self.uuid), 'hmac_key')
if self.hmac_key is None:
self.hmac_key = redis_server_metadata.get('server:hmac_default_key')
def is_valid_uuid_v4(self, header_uuid):
try:
uuid_test = uuid.UUID(hex=header_uuid, version=4)
return uuid_test.hex == header_uuid
except:
logger.info('Not UUID v4: uuid={}, session_uuid={}'.format(header_uuid, self.session_uuid))
# set hmac_header to 0
data = data.replace(hmac_header, hmac_reset, 1)
HMAC = hmac.new(self.hmac_key, msg=data, digestmod='sha256')
hmac_header = hmac_header.hex()
# hmac match
return hmac_header == HMAC.hexdigest()
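`check_hmac_key` recomputes the HMAC over the packet with its 32-byte HMAC field zeroed. The same check as a standalone sketch (names and the fixed-offset slicing are ours; the method above zeroes the field with `bytes.replace` instead, and we add a constant-time compare):

```python
import hmac

HMAC_OFFSET = 26
HMAC_SIZE = 32

def verify_d4_hmac(packet, key):
    # the sender computed HMAC-SHA256 over the packet with the hmac
    # field zeroed, so zero it again before recomputing
    received = packet[HMAC_OFFSET:HMAC_OFFSET + HMAC_SIZE]
    zeroed = (packet[:HMAC_OFFSET] + bytes(HMAC_SIZE)
              + packet[HMAC_OFFSET + HMAC_SIZE:])
    expected = hmac.new(key, msg=zeroed, digestmod='sha256').digest()
    return hmac.compare_digest(received, expected)

key = b'private key to change'
body = b'A' * 26 + bytes(32) + b'payload'
mac = hmac.new(key, msg=body, digestmod='sha256').digest()
packet = body[:26] + mac + body[58:]
assert verify_d4_hmac(packet, key)
assert not verify_d4_hmac(packet + b'tampered', key)
```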
def check_connection_validity(self, data_header):
# blacklist ip by uuid
if redis_server_metadata.sismember('blacklist_ip_by_uuid', data_header['uuid_header']):
redis_server_metadata.sadd('blacklist_ip', self.ip)
self.transport.abortConnection()
logger.warning('Blacklisted IP by UUID={}, connection closed'.format(data_header['uuid_header']))
return False
# # TODO: check timestamp
def is_valid_header(self, uuid_to_check, type):
if self.is_valid_uuid_v4(uuid_to_check):
if redis_server_metadata.sismember('server:accepted_type', type):
return True
else:
logger.warning('Invalid type, the server don\'t accept this type: {}, uuid={}, session_uuid={}'.format(type, uuid_to_check, self.session_uuid))
else:
logger.info('Invalid Header, uuid={}, session_uuid={}'.format(uuid_to_check, self.session_uuid))
# uuid blacklist
if redis_server_metadata.sismember('blacklist_uuid', data_header['uuid_header']):
logger.warning('Blacklisted UUID={}, connection closed'.format(data_header['uuid_header']))
self.transport.abortConnection()
return False
# Check server mode
if not server_mode_registration(data_header['uuid_header']):
self.transport.abortConnection()
return False
# check temp blacklist
if redis_server_stream.exists('temp_blacklist_uuid:{}'.format(data_header['uuid_header'])):
logger.warning('Temporarily Blacklisted UUID={}, connection closed'.format(data_header['uuid_header']))
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: This UUID is temporarily blacklisted')
self.transport.abortConnection()
return False
# check default size limit
if data_header['size'] > data_default_size_limit:
self.transport.abortConnection()
logger.warning('Incorrect header data size: the server received more data than expected by default, expected={}, received={} , uuid={}, session_uuid={}'.format(data_default_size_limit, data_header['size'] ,data_header['uuid_header'], self.session_uuid))
return False
# Worker: Incorrect type
if redis_server_stream.sismember('Error:IncorrectType', self.session_uuid):
self.transport.abortConnection()
redis_server_stream.delete('stream:{}:{}'.format(data_header['type'], self.session_uuid))
redis_server_stream.srem('Error:IncorrectType', self.session_uuid)
logger.warning('Incorrect type={} detected by worker, uuid={}, session_uuid={}'.format(data_header['type'] ,data_header['uuid_header'], self.session_uuid))
return False
return True
def process_header(self, data, ip, source_port):
if not self.buffer:
data_header = self.unpack_header(data)
if data_header:
if self.is_valid_header(data_header['uuid_header'], data_header['type']):
if not self.check_connection_validity(data_header):
return 1
if is_valid_header(data_header['uuid_header'], data_header['type']):
# auto kill connection # TODO: map type
if self.first_connection:
self.first_connection = False
if redis_server_stream.sismember('active_connection:{}'.format(data_header['type']), '{}:{}'.format(ip, data_header['uuid_header'])):
if data_header['type'] == 2:
redis_server_stream.sadd('active_uuid_type2:{}'.format(data_header['uuid_header']), self.session_uuid)
# type 254, check if previous type 2 saved
elif data_header['type'] == 254:
logger.warning('a type 2 packet must be sent, ip={} uuid={} type={} session_uuid={}'.format(ip, data_header['uuid_header'], data_header['type'], self.session_uuid))
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: a type 2 packet must be sent, type={}'.format(data_header['type']))
self.duplicate = True
self.transport.abortConnection()
return 1
# accept only one type/by uuid (except for type 2/254)
elif redis_server_stream.sismember('active_connection:{}'.format(data_header['type']), '{}'.format(data_header['uuid_header'])):
# same IP-type for an UUID
logger.warning('duplicate: this UUID is already active for this type, ip={} uuid={} type={} session_uuid={}'.format(ip, data_header['uuid_header'], data_header['type'], self.session_uuid))
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: This UUID is already used by an active connection for type={}'.format(data_header['type']))
self.duplicate = True
self.transport.abortConnection()
else:
#self.version = None
self.type = data_header['type']
self.uuid = data_header['uuid_header']
#active Connection
redis_server_stream.sadd('active_connection:{}'.format(self.type), '{}:{}'.format(ip, self.uuid))
redis_server_stream.sadd('active_connection', '{}'.format(self.uuid))
# Clean Error Message
redis_server_metadata.hdel('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error')
return 1
self.type = data_header['type']
self.uuid = data_header['uuid_header']
# # check HMAC /!\ incomplete data
# if not self.check_hmac_key(data_header['hmac_header'], data):
# print('hmac do not match')
# print(data)
# logger.debug("HMAC don't match, uuid={}, session_uuid={}".format(self.uuid, self.session_uuid))
# redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: HMAC don\'t match')
# self.transport.abortConnection()
# return 1
## save active connection ##
#active Connection
redis_server_stream.sadd('active_connection:{}'.format(self.type), self.uuid)
redis_server_stream.sadd('active_connection_by_uuid:{}'.format(self.uuid), self.type)
redis_server_stream.sadd('active_connection', self.uuid)
# map session_uuid/uuid
redis_server_stream.sadd('map:active_connection-uuid-session_uuid:{}'.format(self.uuid), self.session_uuid)
# map all type by uuid ## TODO: # FIXME: put me in workers ??????
redis_server_metadata.sadd('all_types_by_uuid:{}'.format(data_header['uuid_header']), data_header['type'])
## ##
# check if type change
if self.data_saved:
# type change detected
if self.type != data_header['type']:
# Meta types
if self.type == 2 and data_header['type'] == 254:
self.update_stream_type = True
self.type = data_header['type']
#redis_server_stream.hdel('map-type:session_uuid-uuid:2', self.session_uuid) # # TODO: to remove / refractor
redis_server_stream.srem('active_uuid_type2:{}'.format(self.uuid), self.session_uuid)
# remove type 2 connection
if not redis_server_stream.exists('active_uuid_type2:{}'.format(self.uuid)):
redis_server_stream.srem('active_connection:2', self.uuid)
redis_server_stream.srem('active_connection_by_uuid:{}'.format(self.uuid), 2)
## save active connection ##
#active Connection
redis_server_stream.sadd('active_connection:{}'.format(self.type), self.uuid)
redis_server_stream.sadd('active_connection_by_uuid:{}'.format(self.uuid), self.type)
redis_server_stream.sadd('active_connection', self.uuid)
redis_server_stream.sadd('active_uuid_type254:{}'.format(self.uuid), self.session_uuid)
# map all type by uuid ## TODO: # FIXME: put me in workers ??????
redis_server_metadata.sadd('all_types_by_uuid:{}'.format(data_header['uuid_header']), data_header['type'])
## ##
#redis_server_stream.hset('map-type:session_uuid-uuid:{}'.format(data_header['type']), self.session_uuid, data_header['uuid_header'])
# Type Error
else:
logger.warning('Unexpected type change, type={} new type={}, ip={} uuid={} session_uuid={}'.format(self.type, data_header['type'], ip, data_header['uuid_header'], self.session_uuid))
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: Unexpected type change type={}, new type={}'.format(self.type, data_header['type']))
self.transport.abortConnection()
return 1
# check if the uuid is the same
if self.uuid != data_header['uuid_header']:
logger.warning('The uuid changed during the connection, ip={} uuid={} type={} session_uuid={} new_uuid={}'.format(ip, self.uuid, data_header['type'], self.session_uuid, data_header['uuid_header']))
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: The uuid changed, new_uuid={}'.format(data_header['uuid_header']))
self.transport.abortConnection()
return 1
## TODO: ban ?
# check data size
if data_header['size'] == (len(data) - header_size):
self.process_d4_data(data, data_header, ip)
res = self.process_d4_data(data, data_header, ip)
# Error detected, kill connection
if res == 1:
return 1
# multiple d4 headers
elif data_header['size'] < (len(data) - header_size):
next_data = data[data_header['size'] + header_size:]
@@ -211,7 +441,10 @@ class D4_Server(Protocol, TimeoutMixin):
#print(data)
#print()
#print(next_data)
self.process_d4_data(data, data_header, ip)
res = self.process_d4_data(data, data_header, ip)
# Error detected, kill connection
if res == 1:
return 1
# process next d4 header
self.process_header(next_data, ip, source_port)
# data_header['size'] > (len(data) - header_size)
@@ -260,15 +493,6 @@ class D4_Server(Protocol, TimeoutMixin):
def process_d4_data(self, data, data_header, ip):
# empty buffer
self.buffer = b''
# set hmac_header to 0
data = data.replace(data_header['hmac_header'], hmac_reset, 1)
if self.hmac_key is None:
self.hmac_key = redis_server_metadata.hget('metadata_uuid:{}'.format(data_header['uuid_header']), 'hmac_key')
if self.hmac_key is None:
self.hmac_key = redis_server_metadata.get('server:hmac_default_key')
HMAC = hmac.new(self.hmac_key, msg=data, digestmod='sha256')
data_header['hmac_header'] = data_header['hmac_header'].hex()
### Debug ###
#print('hexdigest: {}'.format( HMAC.hexdigest() ))
@@ -281,7 +505,7 @@ class D4_Server(Protocol, TimeoutMixin):
### ###
# hmac match
if data_header['hmac_header'] == HMAC.hexdigest():
if self.check_hmac_key(data_header['hmac_header'], data):
if not self.stream_max_size:
temp = redis_server_metadata.hget('stream_max_size_by_uuid', data_header['uuid_header'])
if temp is not None:
@@ -291,6 +515,8 @@ class D4_Server(Protocol, TimeoutMixin):
date = datetime.datetime.now().strftime("%Y%m%d")
if redis_server_stream.xlen('stream:{}:{}'.format(data_header['type'], self.session_uuid)) < self.stream_max_size:
# Clean Error Message
redis_server_metadata.hdel('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error')
redis_server_stream.xadd('stream:{}:{}'.format(data_header['type'], self.session_uuid), {'message': data[header_size:], 'uuid': data_header['uuid_header'], 'timestamp': data_header['timestamp'], 'version': data_header['version']})
@@ -301,34 +527,45 @@ class D4_Server(Protocol, TimeoutMixin):
redis_server_metadata.zincrby('daily_ip:{}'.format(date), 1, ip)
redis_server_metadata.zincrby('daily_type:{}'.format(date), 1, data_header['type'])
redis_server_metadata.zincrby('stat_type_uuid:{}:{}'.format(date, data_header['type']), 1, data_header['uuid_header'])
redis_server_metadata.zincrby('stat_uuid_type:{}:{}'.format(date, data_header['uuid_header']), 1, data_header['type'])
#
d4_packet_rcv_time = int(time.time())
if not redis_server_metadata.hexists('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen'):
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen', data_header['timestamp'])
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'last_seen', data_header['timestamp'])
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen', d4_packet_rcv_time)
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'last_seen', d4_packet_rcv_time)
redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'last_seen', d4_packet_rcv_time)
if not self.data_saved:
# worker entry point: map type:session_uuid
redis_server_stream.sadd('session_uuid:{}'.format(data_header['type']), self.session_uuid.encode())
redis_server_stream.hset('map-type:session_uuid-uuid:{}'.format(data_header['type']), self.session_uuid, data_header['uuid_header'])
#UUID IP: ## TODO: use d4 timestamp ?
redis_server_metadata.lpush('list_uuid_ip:{}'.format(data_header['uuid_header']), '{}-{}'.format(ip, datetime.datetime.now().strftime("%Y%m%d%H%M%S")))
redis_server_metadata.ltrim('list_uuid_ip:{}'.format(data_header['uuid_header']), 0, 15)
self.data_saved = True
if self.update_stream_type:
if not redis_server_metadata.hexists('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'first_seen'):
redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'first_seen', d4_packet_rcv_time)
self.update_stream_type = False
return 0
else:
logger.warning("stream exceed max entries limit, uuid={}, session_uuid={}, type={}".format(data_header['uuid_header'], self.session_uuid, data_header['type']))
## TODO: FIXME
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: stream exceed max entries limit')
self.transport.abortConnection()
return 1
else:
print('hmac do not match')
print(data)
logger.debug("HMAC don't match, uuid={}, session_uuid={}".format(data_header['uuid_header'], self.session_uuid))
## TODO: FIXME
redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: HMAC don\'t match')
self.transport.abortConnection()
return 1
def main(reactor):
@@ -342,7 +579,7 @@ def main(reactor):
certificate = ssl.PrivateCertificate.loadPEM(certData)
factory = protocol.Factory.forProtocol(D4_Server)
# use interface to support both IPv4 and IPv6
reactor.listenSSL(4443, factory, certificate.options(), interface='::')
reactor.listenSSL(D4server_port, factory, certificate.options(), interface='::')
return defer.Deferred()
@@ -351,6 +588,9 @@ if __name__ == "__main__":
parser.add_argument('-v', '--verbose', help='logging verbosity (python logging numeric level)', type=int, default=30)
args = parser.parse_args()
if not redis_server_metadata.exists('first_date'):
redis_server_metadata.set('first_date', datetime.datetime.now().strftime("%Y%m%d"))
logs_dir = 'logs'
if not os.path.isdir(logs_dir):
os.makedirs(logs_dir)
@@ -365,5 +605,14 @@ if __name__ == "__main__":
logger.addHandler(handler_log)
logger.setLevel(args.verbose)
# get server_mode
if server_mode not in all_server_modes:
print('Error: incorrect server_mode')
logger.critical('Error: incorrect server_mode')
sys.exit(1)
logger.info('Server mode: {}'.format(server_mode))
logger.info('Launching Server ...')
task.react(main)

server/update/update_v0.5.py Executable file

@@ -0,0 +1,38 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
import d4_type
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
if __name__ == '__main__':
for format_type in d4_type.get_all_accepted_format_type():
format_type = int(format_type)
for queue_uuid in Analyzer_Queue.get_all_queues_by_type(format_type):
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', format_type)
r_serv_metadata.sadd('all:analyzer:format_type', format_type)
r_serv_metadata.sadd('all:analyzer:by:format_type:{}'.format(format_type), queue_uuid)
for extended_type in d4_type.get_all_accepted_extended_type():
for queue_uuid in Analyzer_Queue.get_all_queues_by_extended_type(extended_type):
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', 254)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'metatype', extended_type)
r_serv_metadata.sadd('all:analyzer:extended_type', extended_type)
r_serv_metadata.sadd('all:analyzer:format_type', 254)
r_serv_metadata.sadd('all:analyzer:by:extended_type:{}'.format(extended_type), queue_uuid)
r_serv_metadata.sadd('all:analyzer:by:format_type:254', queue_uuid)


@ -3,25 +3,45 @@
import os
import re
import ssl
import sys
import uuid
import time
import json
import redis
import time
import uuid
import flask
import redis
import random
import datetime
import ipaddress
import subprocess
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response, escape
from flask_login import LoginManager, current_user, login_user, logout_user, login_required
import bcrypt
# Import Role_Manager
from Role_Manager import create_user_db, check_password_strength, check_user_role_integrity
from Role_Manager import login_user_basic, login_admin
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
from User import User
import Sensor
import ConfigLoader
import Analyzer_Queue
# Import Blueprint
from blueprints.restApi import restApi
from blueprints.settings import settings
from blueprints.analyzer_queue import analyzer_queue
from blueprints.D4_sensors import D4_sensors
baseUrl = ''
if baseUrl != '':
baseUrl = '/'+baseUrl
host_redis_stream = "localhost"
port_redis_stream = 6379
all_server_modes = ('registration', 'shared-secret')
default_max_entries_by_stream = 10000
analyzer_list_max_default_size = 10000
@ -30,30 +50,82 @@ default_analyzer_max_line_len = 3000
json_type_description_path = os.path.join(os.environ['D4_HOME'], 'web/static/json/type.json')
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0,
decode_responses=True)
### Config ###
config_loader = ConfigLoader.ConfigLoader()
host_redis_metadata = "localhost"
port_redis_metadata= 6380
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
redis_server_metadata = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=0,
decode_responses=True)
server_mode = config_loader.get_config_str("D4_Server", "server_mode")
if server_mode not in all_server_modes:
print('Error: incorrect server_mode')
redis_server_analyzer = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=2,
decode_responses=True)
try:
FLASK_HOST = config_loader.get_config_str("Flask_Server", "host")
except Exception as e:
print(e)
FLASK_HOST = '127.0.0.1'
try:
FLASK_PORT = config_loader.get_config_int("Flask_Server", "port")
except Exception:
FLASK_PORT = 7000
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM")
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA")
redis_users = config_loader.get_redis_conn("Redis_SERV")
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER")
r_cache = config_loader.get_redis_conn("Redis_CACHE")
config_loader = None
### ###
with open(json_type_description_path, 'r') as f:
json_type = json.loads(f.read())
json_type_description = {}
for type_info in json_type:
json_type_description[type_info['type']] = type_info
Flask_dir = os.path.join(os.environ['D4_HOME'], 'web')
# ========= TLS =========#
ssl_context = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)
ssl_context.load_cert_chain(certfile=os.path.join(Flask_dir, 'server.crt'), keyfile=os.path.join(Flask_dir, 'server.key'))
#print(ssl_context.get_ciphers())
# ========= =========#
app = Flask(__name__, static_url_path=baseUrl+'/static/')
app.config['MAX_CONTENT_LENGTH'] = 900 * 1024 * 1024
# ========= Cookie name ========
app.config.update(SESSION_COOKIE_NAME='d4_project_server{}'.format(uuid.uuid4().int))
# ========= session ========
app.secret_key = str(random.getrandbits(256))
login_manager = LoginManager()
login_manager.login_view = 'login'
login_manager.init_app(app)
# ========= =========#
# ========= BLUEPRINT =========#
app.register_blueprint(restApi)
app.register_blueprint(settings)
app.register_blueprint(analyzer_queue)
app.register_blueprint(D4_sensors)
# ========= =========#
# ========= LOGIN MANAGER ========
@login_manager.user_loader
def load_user(user_id):
return User.get(user_id)
# ========= =========#
# ========== FUNCTIONS ============
def is_valid_uuid_v4(header_uuid):
try:
@ -90,8 +162,6 @@ def get_server_management_input_handler_value(value):
return value
def get_json_type_description():
with open(json_type_description_path, 'r') as f:
json_type_description = json.loads(f.read())
return json_type_description
def get_whois_ouput(ip):
@ -101,19 +171,214 @@ def get_whois_ouput(ip):
else:
return ''
def get_substract_date_range(num_day, date_from=None):
if date_from is None:
date_from = datetime.datetime.now()
else:
date_from = datetime.date(int(date_from[0:4]), int(date_from[4:6]), int(date_from[6:8]))
l_date = []
for i in range(num_day):
date = date_from - datetime.timedelta(days=i)
l_date.append( date.strftime('%Y%m%d') )
return list(reversed(l_date))
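As a sanity check, the helper above returns an ascending, inclusive list of the last `num_day` dates in `YYYYMMDD` form. A self-contained sketch of the same logic:

```python
import datetime

def substract_date_range(num_day, date_from=None):
    # same logic as get_substract_date_range() above: the last `num_day`
    # days ending at date_from (a YYYYMMDD string), in ascending order
    if date_from is None:
        date_from = datetime.datetime.now()
    else:
        date_from = datetime.date(int(date_from[0:4]), int(date_from[4:6]), int(date_from[6:8]))
    l_date = []
    for i in range(num_day):
        date = date_from - datetime.timedelta(days=i)
        l_date.append(date.strftime('%Y%m%d'))
    return list(reversed(l_date))

print(substract_date_range(3, '20210330'))  # ['20210328', '20210329', '20210330']
```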
def get_uuid_all_types_disk(uuid_name):
uuid_data_directory = os.path.join(data_directory, uuid_name)
all_types_on_disk = []
# Get all types save on disk
for file in os.listdir(uuid_data_directory):
uuid_type_path = os.path.join(uuid_data_directory, file)
if os.path.isdir(uuid_type_path):
all_types_on_disk.append(file)
return all_types_on_disk
def get_uuid_disk_statistics(uuid_name, date_day='', type='', all_types_on_disk=[], all_stats=True):
# # TODO: escape uuid_name
stat_disk_uuid = {}
uuid_data_directory = os.path.join(data_directory, uuid_name)
if date_day:
directory_date = os.path.join(date_day[0:4], date_day[4:6], date_day[6:8])
# keep the caller's list of types, then rebuild all_types_on_disk as a type -> path dict
l_types_on_disk = all_types_on_disk
all_types_on_disk = {}
if l_types_on_disk:
for type in l_types_on_disk:
if date_day:
uuid_type_path = os.path.join(uuid_data_directory, type, directory_date)
else:
uuid_type_path = os.path.join(uuid_data_directory, type)
all_types_on_disk[type] = uuid_type_path
else:
# Get all types save on disk
if os.path.isdir(uuid_data_directory):
for file in os.listdir(uuid_data_directory):
if date_day:
uuid_type_path = os.path.join(uuid_data_directory, file, directory_date)
else:
uuid_type_path = os.path.join(uuid_data_directory, file)
if os.path.isdir(uuid_type_path):
all_types_on_disk[file] = uuid_type_path
nb_file = 0
total_size = 0
for uuid_type in all_types_on_disk:
nb_file_type = 0
total_size_type = 0
for dirpath, dirnames, filenames in os.walk(all_types_on_disk[uuid_type]):
stat_disk_uuid[uuid_type] = {}
for f in filenames:
fp = os.path.join(dirpath, f)
file_size = os.path.getsize(fp)
total_size_type += file_size
total_size += file_size
nb_file_type += 1
nb_file += 1
stat_disk_uuid[uuid_type]['nb_files'] = nb_file_type
stat_disk_uuid[uuid_type]['total_size'] = total_size_type
if all_stats:
stat_all = {}
stat_all['nb_files'] = nb_file
stat_all['total_size'] = total_size
stat_disk_uuid['All'] = stat_all
return stat_disk_uuid
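The per-type accounting above boils down to an `os.walk` over each type directory. A minimal, self-contained sketch of that inner loop (the temporary directory layout and file name are illustrative, not the server's actual data):

```python
import os
import tempfile

def dir_stats(path):
    # count files and total bytes under `path`, like the per-type loop above
    nb_file = 0
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(path):
        for f in filenames:
            total_size += os.path.getsize(os.path.join(dirpath, f))
            nb_file += 1
    return {'nb_files': nb_file, 'total_size': total_size}

with tempfile.TemporaryDirectory() as data_dir:
    # mimic the on-disk date layout: <YYYY>/<MM>/<DD>/
    day_dir = os.path.join(data_dir, '2021', '03', '30')
    os.makedirs(day_dir)
    with open(os.path.join(day_dir, 'capture.d4'), 'wb') as f:
        f.write(b'\x00' * 128)
    print(dir_stats(data_dir))  # {'nb_files': 1, 'total_size': 128}
```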
# ========== ERRORS ============
@app.errorhandler(404)
def page_not_found(e):
return render_template('404.html'), 404
# API - JSON
if request.path.startswith('/api/'):
return Response(json.dumps({"status": "error", "reason": "404 Not Found"}, indent=2, sort_keys=True), mimetype='application/json'), 404
# UI - HTML Template
else:
return render_template('404.html'), 404
@app.errorhandler(405)
def _handle_client_error(e):
if request.path.startswith('/api/'):
res_dict = {"status": "error", "reason": "Method Not Allowed: The method is not allowed for the requested URL"}
anchor_id = request.path[8:]
anchor_id = anchor_id.replace('/', '_')
api_doc_url = 'https://d4-project.org#{}'.format(anchor_id)
res_dict['documentation'] = api_doc_url
return Response(json.dumps(res_dict, indent=2, sort_keys=True), mimetype='application/json'), 405
else:
return
# ========== ROUTES ============
@app.route('/login', methods=['POST', 'GET'])
def login():
current_ip = request.remote_addr
login_failed_ip = r_cache.get('failed_login_ip:{}'.format(current_ip))
# brute force by ip
if login_failed_ip:
login_failed_ip = int(login_failed_ip)
if login_failed_ip >= 5:
error = 'Max Connection Attempts reached, Please wait {}s'.format(r_cache.ttl('failed_login_ip:{}'.format(current_ip)))
return render_template("login.html", error=error)
if request.method == 'POST':
username = request.form.get('username')
password = request.form.get('password')
next_page = request.form.get('next_page')
if username is not None:
user = User.get(username)
login_failed_user_id = r_cache.get('failed_login_user_id:{}'.format(username))
# brute force by user_id
if login_failed_user_id:
login_failed_user_id = int(login_failed_user_id)
if login_failed_user_id >= 5:
error = 'Max Connection Attempts reached, Please wait {}s'.format(r_cache.ttl('failed_login_user_id:{}'.format(username)))
return render_template("login.html", error=error)
if user and user.check_password(password):
#if not check_user_role_integrity(user.get_id()):
# error = 'Incorrect User ACL, Please contact your administrator'
# return render_template("login.html", error=error)
if not user.is_in_role('user'):
return render_template("403.html"), 403
login_user(user) ## TODO: use remember me ?
if user.request_password_change():
return redirect(url_for('change_password'))
else:
if next_page and next_page!='None':
return redirect(next_page)
else:
return redirect(url_for('index'))
# login failed
else:
# set brute force protection
#logger.warning("Login failed, ip={}, username={}".format(current_ip, username))
r_cache.incr('failed_login_ip:{}'.format(current_ip))
r_cache.expire('failed_login_ip:{}'.format(current_ip), 300)
r_cache.incr('failed_login_user_id:{}'.format(username))
r_cache.expire('failed_login_user_id:{}'.format(username), 300)
error = 'Password Incorrect'
return render_template("login.html", error=error)
return 'please provide a valid username'
else:
next_page = request.args.get('next')
error = request.args.get('error')
return render_template("login.html" , error=error, next_page=next_page)
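The brute-force protection above relies on Redis `INCR` plus a 300-second `EXPIRE`, keyed by IP and by user id. The pattern can be sketched with an in-memory stand-in; note one approximation: the real code refreshes the TTL on every failed attempt, while this sketch anchors the window at the first failure.

```python
import time

class LoginThrottle:
    """In-memory stand-in for the Redis INCR/EXPIRE pattern used above."""

    def __init__(self, max_attempts=5, window=300):
        self.max_attempts = max_attempts
        self.window = window
        self._failed = {}  # key -> (count, window_start)

    def record_failure(self, key):
        now = time.monotonic()
        count, start = self._failed.get(key, (0, now))
        if now - start > self.window:  # previous window expired, restart
            count, start = 0, now
        self._failed[key] = (count + 1, start)

    def is_blocked(self, key):
        count, start = self._failed.get(key, (0, 0.0))
        return count >= self.max_attempts and time.monotonic() - start <= self.window

throttle = LoginThrottle()
for _ in range(5):
    throttle.record_failure('failed_login_ip:203.0.113.7')
print(throttle.is_blocked('failed_login_ip:203.0.113.7'))   # True
print(throttle.is_blocked('failed_login_ip:198.51.100.1'))  # False
```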
@app.route('/change_password', methods=['POST', 'GET'])
@login_required
@login_user_basic
def change_password():
password1 = request.form.get('password1')
password2 = request.form.get('password2')
error = request.args.get('error')
if error:
return render_template("change_password.html", error=error)
if current_user.is_authenticated and password1!=None:
if password1==password2:
if check_password_strength(password1):
user_id = current_user.get_id()
create_user_db(user_id , password1, update=True)
return redirect(url_for('index'))
else:
error = 'Incorrect password'
return render_template("change_password.html", error=error)
else:
error = "Passwords don't match"
return render_template("change_password.html", error=error)
else:
error = 'Please choose a new password'
return render_template("change_password.html", error=error)
@app.route('/logout')
@login_required
def logout():
logout_user()
return redirect(url_for('login'))
# role error template
@app.route('/role', methods=['POST', 'GET'])
@login_required
def role():
return render_template("403.html"), 403
@app.route('/')
@login_required
@login_user_basic
def index():
date = datetime.datetime.now().strftime("%Y/%m/%d")
return render_template("index.html", date=date)
@app.route('/_json_daily_uuid_stats')
@login_required
@login_user_basic
def _json_daily_uuid_stats():
date = datetime.datetime.now().strftime("%Y%m%d")
daily_uuid = redis_server_metadata.zrange('daily_uuid:{}'.format(date), 0, -1, withscores=True)
@ -125,6 +390,8 @@ def _json_daily_uuid_stats():
return jsonify(data_daily_uuid)
@app.route('/_json_daily_type_stats')
@login_required
@login_user_basic
def _json_daily_type_stats():
date = datetime.datetime.now().strftime("%Y%m%d")
daily_uuid = redis_server_metadata.zrange('daily_type:{}'.format(date), 0, -1, withscores=True)
@ -141,6 +408,8 @@ def _json_daily_type_stats():
return jsonify(data_daily_uuid)
@app.route('/sensors_status')
@login_required
@login_user_basic
def sensors_status():
active_connection_filter = request.args.get('active_connection_filter')
if active_connection_filter is None:
@ -158,12 +427,44 @@ def sensors_status():
else:
daily_uuid = redis_server_stream.smembers('active_connection')
type_description_json = get_json_type_description()
status_daily_uuid = []
types_description = {}
for result in daily_uuid:
first_seen = redis_server_metadata.hget('metadata_uuid:{}'.format(result), 'first_seen')
first_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(first_seen)))
last_seen = redis_server_metadata.hget('metadata_uuid:{}'.format(result), 'last_seen')
last_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(last_seen)))
description = redis_server_metadata.hget('metadata_uuid:{}'.format(result), 'description')
if not description:
description = ''
type_connection_status = {}
l_uuid_types = []
l_uuid_typ = redis_server_metadata.smembers('all_types_by_uuid:{}'.format(result))
for type in l_uuid_typ:
type = int(type)
if redis_server_stream.sismember('active_connection:{}'.format(type), result):
type_connection_status[type] = True
else:
type_connection_status[type] = False
l_uuid_types.append(type)
if type not in types_description:
types_description[type] = type_description_json[type]['description']
if not types_description[type]:
types_description[type] = 'please update your web server'
l_uuid_types.sort()
if 254 in l_uuid_types:
extended_type = list(redis_server_metadata.smembers('all_extended_types_by_uuid:{}'.format(result)))
extended_type.sort()
for extended in extended_type:
if redis_server_stream.sismember('active_connection_extended_type:{}'.format(result), extended):
type_connection_status[extended] = True
else:
type_connection_status[extended] = False
types_description[extended] = ''
l_uuid_types.extend(extended_type)
if redis_server_metadata.sismember('blacklist_ip_by_uuid', result):
Error = "All IP using this UUID are Blacklisted"
elif redis_server_metadata.sismember('blacklist_uuid', result):
@ -176,14 +477,20 @@ def sensors_status():
active_connection = False
if first_seen is not None and last_seen is not None:
status_daily_uuid.append({"uuid": result,"first_seen": first_seen, "last_seen": last_seen,
status_daily_uuid.append({"uuid": result,
"active_connection": active_connection,
"first_seen_gmt": first_seen_gmt, "last_seen_gmt": last_seen_gmt, "Error": Error})
"type_connection_status": type_connection_status,
"description": description,
"first_seen_gmt": first_seen_gmt, "last_seen_gmt": last_seen_gmt,
"l_uuid_types": l_uuid_types, "Error": Error})
return render_template("sensors_status.html", status_daily_uuid=status_daily_uuid,
types_description=types_description,
active_connection_filter=active_connection_filter)
@app.route('/show_active_uuid')
@login_required
@login_user_basic
def show_active_uuid():
#swap switch value
active_connection_filter = request.args.get('show_active_connection')
@ -198,7 +505,12 @@ def show_active_uuid():
return redirect(url_for('sensors_status', active_connection_filter=active_connection_filter))
@app.route('/server_management')
@login_required
@login_user_basic
def server_management():
nb_sensors_registered = Sensor.get_nb_registered_sensors()
nb_sensors_pending = Sensor.get_nb_pending_sensor()
blacklisted_ip = request.args.get('blacklisted_ip')
unblacklisted_ip = request.args.get('unblacklisted_ip')
blacklisted_uuid = request.args.get('blacklisted_uuid')
@ -220,35 +532,61 @@ def server_management():
description = 'Please update your web server'
list_analyzer_uuid = []
for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
size_limit = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
if size_limit is None:
size_limit = analyzer_list_max_default_size
last_updated = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'last_updated')
if last_updated is None:
last_updated = 'Never'
len_queue = redis_server_analyzer.llen('analyzer:{}:{}'.format(type, analyzer_uuid))
if len_queue is None:
len_queue = 0
list_analyzer_uuid.append({'uuid': analyzer_uuid, 'size_limit': size_limit,'last_updated': last_updated, 'length': len_queue})
for analyzer_uuid in Analyzer_Queue.get_all_queues_by_type(type):
list_analyzer_uuid.append(Analyzer_Queue.get_queue_metadata(analyzer_uuid, format_type=type))
for analyzer_uuid in Analyzer_Queue.get_all_queues_group_by_type(type):
list_analyzer_uuid.append(Analyzer_Queue.get_queue_metadata(analyzer_uuid, format_type=type, force_is_group_queue=True))
list_accepted_types.append({"id": int(type), "description": description, 'list_analyzer_uuid': list_analyzer_uuid})
return render_template("server_management.html", list_accepted_types=list_accepted_types,
list_accepted_extended_types = []
l_queue_extended_type = []
for extended_type in redis_server_metadata.smembers('server:accepted_extended_type'):
list_accepted_extended_types.append({"name": extended_type, 'list_analyzer_uuid': []})
for extended_queue_uuid in Analyzer_Queue.get_all_queues_by_extended_type(extended_type):
l_queue_extended_type.append(Analyzer_Queue.get_queue_metadata(extended_queue_uuid, format_type=254, extended_type=extended_type))
for extended_queue_uuid in Analyzer_Queue.get_all_queues_group_by_extended_type(extended_type):
l_queue_extended_type.append(Analyzer_Queue.get_queue_metadata(extended_queue_uuid, format_type=254, extended_type=extended_type, force_is_group_queue=True))
return render_template("server_management.html", list_accepted_types=list_accepted_types, list_accepted_extended_types=list_accepted_extended_types,
server_mode=server_mode,
l_queue_extended_type=l_queue_extended_type,
nb_sensors_registered=nb_sensors_registered, nb_sensors_pending=nb_sensors_pending,
default_analyzer_max_line_len=default_analyzer_max_line_len,
blacklisted_ip=blacklisted_ip, unblacklisted_ip=unblacklisted_ip,
blacklisted_uuid=blacklisted_uuid, unblacklisted_uuid=unblacklisted_uuid)
@app.route('/uuid_management')
@login_required
@login_user_basic
def uuid_management():
uuid_sensor = request.args.get('uuid')
if is_valid_uuid_v4(uuid_sensor):
uuid_sensor = uuid_sensor.replace('-', '')
disk_stats = get_uuid_disk_statistics(uuid_sensor)
first_seen = redis_server_metadata.hget('metadata_uuid:{}'.format(uuid_sensor), 'first_seen')
first_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(first_seen)))
if first_seen:
first_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(first_seen)))
else:
first_seen_gmt = '-'
last_seen = redis_server_metadata.hget('metadata_uuid:{}'.format(uuid_sensor), 'last_seen')
last_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(last_seen)))
if last_seen:
last_seen_gmt = time.strftime('%Y-%m-%d %H:%M:%S', time.localtime(int(last_seen)))
else:
last_seen_gmt = '-'
description = redis_server_metadata.hget('metadata_uuid:{}'.format(uuid_sensor), 'description')
if not description:
description = ''
Error = redis_server_metadata.hget('metadata_uuid:{}'.format(uuid_sensor), 'Error')
if redis_server_stream.exists('temp_blacklist_uuid:{}'.format(uuid_sensor)):
temp_blacklist_uuid = True
else:
temp_blacklist_uuid = False
if redis_server_metadata.sismember('blacklist_uuid', uuid_sensor):
blacklisted_uuid = True
Error = "Blacklisted UUID"
@ -259,10 +597,13 @@ def uuid_management():
Error = "All IP using this UUID are Blacklisted"
else:
blacklisted_ip_by_uuid = False
data_uuid= {"first_seen": first_seen, "last_seen": last_seen,
data_uuid= {"description": description,
"temp_blacklist_uuid": temp_blacklist_uuid,
"blacklisted_uuid": blacklisted_uuid, "blacklisted_ip_by_uuid": blacklisted_ip_by_uuid,
"first_seen_gmt": first_seen_gmt, "last_seen_gmt": last_seen_gmt, "Error": Error}
data_uuid['is_monitored'] = Sensor.is_sensor_monitored(uuid_sensor)
if redis_server_stream.sismember('active_connection', uuid_sensor):
active_connection = True
else:
@ -278,6 +619,17 @@ def uuid_management():
if uuid_key is None:
uuid_key = redis_server_metadata.get('server:hmac_default_key')
uuid_all_type_list = []
uuid_all_type = redis_server_metadata.smembers('all_types_by_uuid:{}'.format(uuid_sensor))
for type in uuid_all_type:
type_first_seen = redis_server_metadata.hget('metadata_type_by_uuid:{}:{}'.format(uuid_sensor, type), 'first_seen')
type_last_seen = redis_server_metadata.hget('metadata_type_by_uuid:{}:{}'.format(uuid_sensor, type), 'last_seen')
if type_first_seen:
type_first_seen = datetime.datetime.fromtimestamp(float(type_first_seen)).strftime('%Y-%m-%d %H:%M:%S')
if type_last_seen:
type_last_seen = datetime.datetime.fromtimestamp(float(type_last_seen)).strftime('%Y-%m-%d %H:%M:%S')
uuid_all_type_list.append({'type': type, 'first_seen':type_first_seen, 'last_seen': type_last_seen})
list_ip = redis_server_metadata.lrange('list_uuid_ip:{}'.format(uuid_sensor), 0, -1)
all_ip = []
for elem in list_ip:
@ -285,11 +637,15 @@ def uuid_management():
all_ip.append({'ip': ip,'datetime': '{}/{}/{} - {}:{}.{}'.format(d_time[0:4], d_time[4:6], d_time[6:8], d_time[8:10], d_time[10:12], d_time[12:14])})
return render_template("uuid_management.html", uuid_sensor=uuid_sensor, active_connection=active_connection,
uuid_key=uuid_key, data_uuid=data_uuid, max_uuid_stream=max_uuid_stream, all_ip=all_ip)
uuid_key=uuid_key, data_uuid=data_uuid, uuid_all_type=uuid_all_type_list,
disk_stats=disk_stats,
max_uuid_stream=max_uuid_stream, all_ip=all_ip)
else:
return 'Invalid uuid'
@app.route('/blacklisted_ip')
@login_required
@login_user_basic
def blacklisted_ip():
blacklisted_ip = request.args.get('blacklisted_ip')
unblacklisted_ip = request.args.get('unblacklisted_ip')
@ -315,6 +671,8 @@ def blacklisted_ip():
unblacklisted_ip=unblacklisted_ip, blacklisted_ip=blacklisted_ip)
@app.route('/blacklisted_uuid')
@login_required
@login_user_basic
def blacklisted_uuid():
blacklisted_uuid = request.args.get('blacklisted_uuid')
unblacklisted_uuid = request.args.get('unblacklisted_uuid')
@ -339,8 +697,62 @@ def blacklisted_uuid():
page=page, nb_page_max=nb_page_max,
unblacklisted_uuid=unblacklisted_uuid, blacklisted_uuid=blacklisted_uuid)
@app.route('/server/registered_sensor')
@login_required
@login_admin
def registered_sensor():
sensors = Sensor.get_registered_sensors()
all_sensors = []
for sensor_uuid in sensors:
all_sensors.append(Sensor._get_sensor_metadata(sensor_uuid, time_format='gmt', sensor_types=True))
return render_template("registered_sensors.html", all_sensors=all_sensors)
@app.route('/server/pending_sensor')
@login_required
@login_admin
def pending_sensors():
sensors = Sensor.get_pending_sensor()
all_pending = []
for sensor_uuid in sensors:
all_pending.append(Sensor._get_sensor_metadata(sensor_uuid, first_seen=False, last_seen=False))
return render_template("pending_sensor.html", all_pending=all_pending)
@app.route('/server/approve_sensor')
@login_required
@login_admin
def approve_sensor():
uuid_sensor = request.args.get('uuid')
res = Sensor.approve_sensor({'uuid': uuid_sensor})
if res[1] == 200:
return redirect(url_for('pending_sensors'))
else:
return jsonify(res[0])
@app.route('/server/delete_pending_sensor')
@login_required
@login_admin
def delete_pending_sensor():
uuid_sensor = request.args.get('uuid')
res = Sensor.delete_pending_sensor({'uuid': uuid_sensor})
if res[1] == 200:
return redirect(url_for('pending_sensors'))
else:
return jsonify(res[0])
@app.route('/server/delete_registered_sensor')
@login_required
@login_admin
def delete_registered_sensor():
uuid_sensor = request.args.get('uuid')
res = Sensor.delete_registered_sensor({'uuid': uuid_sensor})
if res[1] == 200:
return redirect(url_for('registered_sensor'))
else:
return jsonify(res[0])
@app.route('/uuid_change_stream_max_size')
@login_required
@login_user_basic
def uuid_change_stream_max_size():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
@ -358,63 +770,78 @@ def uuid_change_stream_max_size():
else:
return 'Invalid uuid'
@app.route('/add_new_analyzer')
def add_new_analyzer():
type = request.args.get('type')
user = request.args.get('redirect')
@app.route('/uuid_change_description')
@login_required
@login_user_basic
def uuid_change_description():
uuid_sensor = request.args.get('uuid')
description = request.args.get('description')
if is_valid_uuid_v4(uuid_sensor):
redis_server_metadata.hset('metadata_uuid:{}'.format(uuid_sensor), 'description', description)
return jsonify()
else:
return jsonify({'error':'invalid uuid'}), 400
@app.route('/empty_analyzer_queue')
@login_required
@login_user_basic
def empty_analyzer_queue():
analyzer_uuid = request.args.get('analyzer_uuid')
format_type = request.args.get('type')
metatype_name = request.args.get('metatype_name')
user = request.args.get('redirect')
if is_valid_uuid_v4(analyzer_uuid):
try:
type = int(type)
if type < 0:
return 'type, Invalid Integer'
except:
return 'type, Invalid Integer'
redis_server_metadata.sadd('analyzer:{}'.format(type), analyzer_uuid)
if format_type == 254:
format_type = metatype_name
Analyzer_Queue.flush_queue(analyzer_uuid, format_type)
if user:
return redirect(url_for('server_management'))
else:
return 'Invalid uuid'
@app.route('/remove_analyzer')
@login_required
@login_user_basic
def remove_analyzer():
analyzer_uuid = request.args.get('analyzer_uuid')
type = request.args.get('type')
format_type = request.args.get('type')
metatype_name = request.args.get('metatype_name')
user = request.args.get('redirect')
if is_valid_uuid_v4(analyzer_uuid):
try:
type = int(type)
if type < 0:
return 'type, Invalid Integer'
except:
return 'type, Invalid Integer'
redis_server_metadata.srem('analyzer:{}'.format(type), analyzer_uuid)
redis_server_analyzer.delete('analyzer:{}:{}'.format(type, analyzer_uuid))
redis_server_metadata.delete('analyzer:{}'.format(analyzer_uuid))
Analyzer_Queue.remove_queues(analyzer_uuid, format_type)
if user:
return redirect(url_for('server_management'))
else:
return 'Invalid uuid'
@app.route('/analyzer_change_max_size')
@login_required
@login_user_basic
def analyzer_change_max_size():
analyzer_uuid = request.args.get('analyzer_uuid')
user = request.args.get('redirect')
max_size_analyzer = request.args.get('max_size_analyzer')
if is_valid_uuid_v4(analyzer_uuid):
try:
max_size_analyzer = int(max_size_analyzer)
if max_size_analyzer < 0:
return 'analyzer max size, Invalid Integer'
except:
return 'analyzer max size, Invalid Integer'
redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'max_size', max_size_analyzer)
Analyzer_Queue.edit_queue_max_size(analyzer_uuid, max_size_analyzer)
if user:
return redirect(url_for('server_management'))
else:
return 'Invalid uuid'
@app.route('/kick_uuid')
@login_required
@login_user_basic
def kick_uuid():
uuid_sensor = request.args.get('uuid')
if is_valid_uuid_v4(uuid_sensor):
redis_server_stream.sadd('server:sensor_to_kick', uuid_sensor)
return redirect(url_for('uuid_management', uuid=uuid_sensor))
else:
return 'Invalid uuid'
@app.route('/blacklist_uuid')
@login_required
@login_user_basic
def blacklist_uuid():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
@ -435,6 +862,8 @@ def blacklist_uuid():
return 'Invalid uuid'
@app.route('/unblacklist_uuid')
@login_required
@login_user_basic
def unblacklist_uuid():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
@ -458,6 +887,8 @@ def unblacklist_uuid():
return 'Invalid uuid'
@app.route('/blacklist_ip')
@login_required
@login_user_basic
def blacklist_ip():
ip = request.args.get('ip')
user = request.args.get('redirect')
@ -483,6 +914,8 @@ def blacklist_ip():
return 'Invalid ip'
@app.route('/unblacklist_ip')
@login_required
@login_user_basic
def unblacklist_ip():
ip = request.args.get('ip')
user = request.args.get('redirect')
@ -510,6 +943,8 @@ def unblacklist_ip():
return 'Invalid ip'
@app.route('/blacklist_ip_by_uuid')
@login_required
@login_user_basic
def blacklist_ip_by_uuid():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
@ -521,6 +956,8 @@ def blacklist_ip_by_uuid():
return 'Invalid uuid'
@app.route('/unblacklist_ip_by_uuid')
@login_required
@login_user_basic
def unblacklist_ip_by_uuid():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
@ -532,18 +969,29 @@ def unblacklist_ip_by_uuid():
return 'Invalid uuid'
@app.route('/add_accepted_type')
@login_required
@login_user_basic
def add_accepted_type():
type = request.args.get('type')
extended_type_name = request.args.get('extended_type_name')
user = request.args.get('redirect')
json_type_description = get_json_type_description()
try:
type = int(type)
except:
return 'Invalid type'
if json_type_description[int(type)]:
redis_server_metadata.sadd('server:accepted_type', type)
if type == 254:
redis_server_metadata.sadd('server:accepted_extended_type', extended_type_name)
if user:
return redirect(url_for('server_management'))
else:
return 'Invalid type'
@app.route('/remove_accepted_type')
@login_required
@login_user_basic
def remove_accepted_type():
type = request.args.get('type')
user = request.args.get('redirect')
@ -555,8 +1003,18 @@ def remove_accepted_type():
else:
return 'Invalid type'
@app.route('/remove_accepted_extended_type')
@login_required
@login_user_basic
def remove_accepted_extended_type():
type_name = request.args.get('type_name')
redis_server_metadata.srem('server:accepted_extended_type', type_name)
return redirect(url_for('server_management'))
# demo function
@app.route('/delete_data')
@login_required
@login_user_basic
def delete_data():
date = datetime.datetime.now().strftime("%Y%m%d")
redis_server_metadata.delete('daily_type:{}'.format(date))
@ -565,17 +1023,24 @@ def delete_data():
# demo function
@app.route('/set_uuid_hmac_key')
@login_required
@login_user_basic
def set_uuid_hmac_key():
uuid_sensor = request.args.get('uuid')
user = request.args.get('redirect')
key = request.args.get('key')
redis_server_metadata.hset('metadata_uuid:{}'.format(uuid_sensor), 'hmac_key', key)
hmac_key = escape(key)
if len(hmac_key)>100:
hmac_key=hmac_key[:100]
redis_server_metadata.hset('metadata_uuid:{}'.format(uuid_sensor), 'hmac_key', hmac_key)
if user:
return redirect(url_for('uuid_management', uuid=uuid_sensor))
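The key stored above is the per-sensor shared secret used to authenticate D4 packets. Verification follows the standard HMAC-SHA256 pattern; the sketch below uses hypothetical key and packet bytes, not the server's actual packet parsing:

```python
import hashlib
import hmac

hmac_key = b'private key to change'       # hypothetical per-sensor key
packet_bytes = b'example d4 header+body'  # hypothetical packet content

# digest the sensor would send alongside the packet
digest = hmac.new(hmac_key, packet_bytes, hashlib.sha256).hexdigest()
# server-side recomputation over the received bytes
recomputed = hmac.new(hmac_key, packet_bytes, hashlib.sha256).hexdigest()
# always compare digests in constant time
print(hmac.compare_digest(digest, recomputed))  # True
```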
# demo function
@app.route('/whois_data')
@login_required
@login_user_basic
def whois_data():
ip = request.args.get('ip')
if is_valid_ip(ip):
@ -583,8 +1048,16 @@ def whois_data():
else:
return 'Invalid IP'
# demo function
@app.route('/generate_uuid')
@login_required
@login_user_basic
def generate_uuid():
new_uuid = uuid.uuid4()
return jsonify({'uuid': new_uuid})
@app.route('/get_analyser_sample')
@login_required
@login_user_basic
def get_analyser_sample():
type = request.args.get('type')
analyzer_uuid = request.args.get('analyzer_uuid')
@ -611,5 +1084,70 @@ def get_analyser_sample():
else:
return jsonify('Incorrect UUID')
@app.route('/get_uuid_type_history_json')
@login_required
@login_user_basic
def get_uuid_type_history_json():
uuid_sensor = request.args.get('uuid_sensor')
if is_valid_uuid_v4(uuid_sensor):
num_day_type = 7
date_range = get_substract_date_range(num_day_type)
type_history = []
range_decoder = []
all_type = set()
for date in date_range:
type_day = redis_server_metadata.zrange('stat_uuid_type:{}:{}'.format(date, uuid_sensor), 0, -1, withscores=True)
for type in type_day:
all_type.add(type[0])
range_decoder.append((date, type_day))
default_dict_type = {}
for type in all_type:
default_dict_type[type] = 0
for row in range_decoder:
day_type = default_dict_type.copy()
date = row[0]
day_type['date']= date[0:4] + '-' + date[4:6] + '-' + date[6:8]
for type in row[1]:
day_type[type[0]]= type[1]
type_history.append(day_type)
return jsonify(type_history)
else:
return jsonify('Incorrect UUID')
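The history routes above key daily statistics under `YYYYMMDD` suffixes and reformat that key into an ISO date inline; a minimal sketch of the conversion (the helper name is hypothetical):

```python
def format_redis_date(date):
    """Convert a 'YYYYMMDD' Redis key suffix (e.g. '20231223') to ISO 'YYYY-MM-DD'."""
    return date[0:4] + '-' + date[4:6] + '-' + date[6:8]

format_redis_date('20231223')  # '2023-12-23'
```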
@app.route('/get_uuid_stats_history_json')
@login_required
@login_user_basic
def get_uuid_stats_history_json():
uuid_sensor = request.args.get('uuid_sensor')
stats = request.args.get('stats')
if is_valid_uuid_v4(uuid_sensor):
if stats not in ['nb_files', 'total_size']:
stats = 'nb_files'
num_day_type = 7
date_range = get_substract_date_range(num_day_type)
stat_type_history = []
range_decoder = []
all_type = get_uuid_all_types_disk(uuid_sensor)
default_dict_type = {}
for type in all_type:
default_dict_type[type] = 0
for date in date_range:
day_type = default_dict_type.copy()
daily_stat = get_uuid_disk_statistics(uuid_sensor, date, all_types_on_disk=all_type, all_stats=False)
day_type['date']= date[0:4] + '-' + date[4:6] + '-' + date[6:8]
for type_key in daily_stat:
day_type[type_key] += daily_stat[type_key][stats]
stat_type_history.append(day_type)
return jsonify(stat_type_history)
else:
return jsonify('Incorrect UUID')
if __name__ == "__main__":
app.run(host='0.0.0.0', port=7000, threaded=True)
app.run(host=FLASK_HOST, port=FLASK_PORT, threaded=True, ssl_context=ssl_context)

server/web/Role_Manager.py

@@ -0,0 +1,184 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import re
import sys
import redis
import bcrypt
from functools import wraps
from flask_login import LoginManager, current_user, login_user, logout_user, login_required
from flask import request, current_app
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
login_manager = LoginManager()
login_manager.login_view = 'role'
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###
default_passwd_file = os.path.join(os.environ['D4_HOME'], 'DEFAULT_PASSWORD')
regex_password = r'^(?=(.*\d){2})(?=.*[a-z])(?=.*[A-Z]).{10,100}$'
regex_password = re.compile(regex_password)
###############################################################
############### CHECK ROLE ACCESS ##################
###############################################################
def login_admin(func):
@wraps(func)
def decorated_view(*args, **kwargs):
if not current_user.is_authenticated:
return login_manager.unauthorized()
elif (not current_user.is_in_role('admin')):
return login_manager.unauthorized()
return func(*args, **kwargs)
return decorated_view
def login_user_basic(func):
@wraps(func)
def decorated_view(*args, **kwargs):
if not current_user.is_authenticated:
return login_manager.unauthorized()
elif (not current_user.is_in_role('user')):
return login_manager.unauthorized()
return func(*args, **kwargs)
return decorated_view
###############################################################
###############################################################
###############################################################
def gen_password(length=30, charset="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_!@#$%^&*()"):
random_bytes = os.urandom(length)
len_charset = len(charset)
indices = [int(len_charset * (byte / 256.0)) for byte in random_bytes]
return "".join([charset[index] for index in indices])
def gen_token(length=41, charset="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_"):
random_bytes = os.urandom(length)
len_charset = len(charset)
indices = [int(len_charset * (byte / 256.0)) for byte in random_bytes]
return "".join([charset[index] for index in indices])
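`gen_password` and `gen_token` scale raw bytes onto the charset; because 256 is rarely an exact multiple of the charset length, the earliest characters are drawn marginally more often. A sketch of an unbiased variant using the standard `secrets` module (an assumption on my part, not the shipped implementation):

```python
import secrets

def gen_password_unbiased(length=30,
                          charset="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-_!@#$%^&*()"):
    # secrets.choice draws uniformly from the charset, avoiding the scaling bias
    return "".join(secrets.choice(charset) for _ in range(length))
```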
def generate_new_token(user_id):
# create user token
current_token = r_serv_db.hget('user_metadata:{}'.format(user_id), 'token')
if current_token:
r_serv_db.hdel('user:tokens', current_token)
token = gen_token(41)
r_serv_db.hset('user:tokens', token, user_id)
r_serv_db.hset('user_metadata:{}'.format(user_id), 'token', token)
def get_default_admin_token():
if r_serv_db.exists('user_metadata:admin@admin.test'):
return r_serv_db.hget('user_metadata:admin@admin.test', 'token')
else:
return ''
def create_user_db(username_id, password, default=False, role=None, update=False):
password = password.encode()
password_hash = hashing_password(password)
# create user token
generate_new_token(username_id)
if update:
r_serv_db.hdel('user_metadata:{}'.format(username_id), 'change_passwd')
# remove default user password file
if username_id=='admin@admin.test':
os.remove(default_passwd_file)
else:
if default:
r_serv_db.hset('user_metadata:{}'.format(username_id), 'change_passwd', 'True')
if role:
if role in get_all_role():
for role_to_add in get_all_user_role(role):
r_serv_db.sadd('user_role:{}'.format(role_to_add), username_id)
r_serv_db.hset('user_metadata:{}'.format(username_id), 'role', role)
r_serv_db.hset('user:all', username_id, password_hash)
def edit_user_db(user_id, password=None, role=None):
if password:
password_hash = hashing_password(password.encode())
r_serv_db.hset('user:all', user_id, password_hash)
current_role = r_serv_db.hget('user_metadata:{}'.format(user_id), 'role')
if role != current_role:
request_level = get_role_level(role)
current_role = get_role_level(current_role)
if current_role < request_level:
role_to_remove = get_user_role_by_range(current_role -1, request_level - 2)
for role_id in role_to_remove:
r_serv_db.srem('user_role:{}'.format(role_id), user_id)
r_serv_db.hset('user_metadata:{}'.format(user_id), 'role', role)
else:
role_to_add = get_user_role_by_range(request_level -1, current_role)
for role_id in role_to_add:
r_serv_db.sadd('user_role:{}'.format(role_id), user_id)
r_serv_db.hset('user_metadata:{}'.format(user_id), 'role', role)
def delete_user_db(user_id):
if r_serv_db.exists('user_metadata:{}'.format(user_id)):
role_to_remove = get_all_role()
for role_id in role_to_remove:
r_serv_db.srem('user_role:{}'.format(role_id), user_id)
user_token = r_serv_db.hget('user_metadata:{}'.format(user_id), 'token')
r_serv_db.hdel('user:tokens', user_token)
r_serv_db.delete('user_metadata:{}'.format(user_id))
r_serv_db.hdel('user:all', user_id)
def hashing_password(bytes_password):
hashed = bcrypt.hashpw(bytes_password, bcrypt.gensalt())
return hashed
def check_password_strength(password):
result = regex_password.match(password)
if result:
return True
else:
return False
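The password policy encoded in `regex_password` requires at least two digits, one lowercase letter, one uppercase letter, and 10 to 100 characters overall. For instance:

```python
import re

regex_password = re.compile(r'^(?=(.*\d){2})(?=.*[a-z])(?=.*[A-Z]).{10,100}$')

def check_password_strength(password):
    return bool(regex_password.match(password))

check_password_strength('Abcdefgh12')   # True: 10 chars, 2 digits, upper and lower case
check_password_strength('short1A')      # False: under 10 characters
check_password_strength('ABCDEFGH12')   # False: no lowercase letter
```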
def get_all_role():
return r_serv_db.zrange('d4:all_role', 0, -1)
def get_role_level(role):
return int(r_serv_db.zscore('d4:all_role', role))
def get_all_user_role(user_role):
current_role_val = get_role_level(user_role)
return r_serv_db.zrangebyscore('d4:all_role', current_role_val, 50)
def get_all_user_upper_role(user_role):
current_role_val = get_role_level(user_role)
# remove one rank
if current_role_val > 1:
return r_serv_db.zrange('d4:all_role', 0, current_role_val -2)
else:
return []
def get_user_role_by_range(inf, sup):
return r_serv_db.zrange('d4:all_role', inf, sup)
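Roles live in the `d4:all_role` sorted set, scored by privilege level (lower score = more privileged; the bootstrap script below seeds admin=1, user=2, sensor_register=20). `get_all_user_role` returns a role together with every less-privileged role beneath it. An in-memory sketch of that lookup (a stand-in for the Redis calls, not the shipped code):

```python
# Stand-in for the 'd4:all_role' sorted set: (member, score), lower score = more privileged
ALL_ROLE = [('admin', 1), ('user', 2), ('sensor_register', 20)]

def get_role_level(role):
    return dict(ALL_ROLE)[role]

def get_all_user_role(user_role):
    # the role itself plus every role at an equal or higher score (lower privilege),
    # mirroring ZRANGEBYSCORE d4:all_role <level> 50
    level = get_role_level(user_role)
    return [r for r, score in ALL_ROLE if score >= level]
```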
def get_user_role(user_id):
return r_serv_db.hget('user_metadata:{}'.format(user_id), 'role')
def check_user_role_integrity(user_id):
user_role = get_user_role(user_id)
all_user_role = get_all_user_role(user_role)
res = True
if user_role not in all_user_role:
return False
return res


@@ -0,0 +1,76 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for all D4 sensors
'''
import os
import re
import sys
import json
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
import Sensor
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required, current_user
from Role_Manager import login_admin, login_user_basic
# ============ BLUEPRINT ============
D4_sensors = Blueprint('D4_sensors', __name__, template_folder='templates')
# ============ VARIABLES ============
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###
# ============ FUNCTIONS ============
# ============= ROUTES ==============
@D4_sensors.route("/sensors/monitoring/add", methods=['GET'])
@login_required
@login_user_basic
def add_sensor_to_monitor():
sensor_uuid = request.args.get("uuid")
return render_template("sensors/add_sensor_to_monitor.html",
sensor_uuid=sensor_uuid)
@D4_sensors.route("/sensors/monitoring/add_post", methods=['POST'])
@login_required
@login_user_basic
def add_sensor_to_monitor_post():
sensor_uuid = request.form.get("uuid")
delta_time = request.form.get("delta_time")
res = Sensor.api_add_sensor_to_monitor({'uuid':sensor_uuid, 'delta_time': delta_time})
if res:
return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
return redirect(url_for('uuid_management', uuid=sensor_uuid))
@D4_sensors.route("/sensors/monitoring/delete", methods=['GET'])
@login_required
@login_user_basic
def delete_sensor_to_monitor():
sensor_uuid = request.args.get("uuid")
res = Sensor.api_delete_sensor_to_monitor({'uuid':sensor_uuid})
if res:
return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
return redirect(url_for('uuid_management', uuid=sensor_uuid))
@D4_sensors.route("/sensors/monitoring/errors", methods=['GET'])
@login_required
@login_user_basic
def get_all_sensors_connection_errors():
res = Sensor.api_get_all_sensors_connection_errors()
return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]


@@ -0,0 +1,122 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for the rest api
'''
import os
import re
import sys
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
import Analyzer_Queue
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required, current_user
from Role_Manager import login_admin, login_user_basic
# ============ BLUEPRINT ============
analyzer_queue = Blueprint('analyzer_queue', __name__, template_folder='templates')
# ============ VARIABLES ============
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###
# ============ FUNCTIONS ============
# ============= ROUTES ==============
@analyzer_queue.route("/analyzer_queue/create_queue", methods=['GET'])
@login_required
@login_user_basic
def create_analyzer_queue():
return render_template("analyzer_queue/queue_creator.html")
@analyzer_queue.route("/analyzer_queue/create_queue_post", methods=['POST'])
@login_required
@login_user_basic
def create_analyzer_queue_post():
l_queue_meta = ['analyzer_type', 'analyzer_metatype', 'description', 'analyzer_uuid']
queue_type = request.form.get("analyzer_type")
queue_metatype = request.form.get("analyzer_metatype")
queue_description = request.form.get("description")
queue_uuid = request.form.get("analyzer_uuid")
queue_type = Analyzer_Queue.sanitize_queue_type(queue_type)
# unpack uuid group
l_uuid = set()
l_invalid_uuid = set()
for obj_tuple in list(request.form):
if obj_tuple not in l_queue_meta:
sensor_uuid = request.form.get(obj_tuple)
if Analyzer_Queue.is_valid_uuid_v4(sensor_uuid):
l_uuid.add(sensor_uuid)
else:
if sensor_uuid:
l_invalid_uuid.add(sensor_uuid)
l_uuid = list(l_uuid)
l_invalid_uuid = list(l_invalid_uuid)
if l_invalid_uuid:
return render_template("analyzer_queue/queue_creator.html", queue_uuid=queue_uuid, queue_type=queue_type, metatype_name=queue_metatype,
description=queue_description, l_uuid=l_uuid, l_invalid_uuid=l_invalid_uuid)
res = Analyzer_Queue.create_queues(queue_type, queue_uuid=queue_uuid, l_uuid=l_uuid, metatype_name=queue_metatype, description=queue_description)
if isinstance(res,dict):
return jsonify(res)
if res:
return redirect(url_for('server_management', _anchor=res))
@analyzer_queue.route("/analyzer_queue/edit_queue", methods=['GET'])
@login_required
@login_user_basic
def edit_queue_analyzer_queue():
queue_uuid = request.args.get("queue_uuid")
queue_metadata = Analyzer_Queue.get_queue_metadata(queue_uuid, is_group=True)
if 'is_group_queue' in queue_metadata:
l_sensors_uuid = Analyzer_Queue.get_queue_group_all_sensors(queue_uuid)
else:
l_sensors_uuid = None
return render_template("analyzer_queue/queue_editor.html", queue_metadata=queue_metadata, l_sensors_uuid=l_sensors_uuid)
@analyzer_queue.route("/analyzer_queue/edit_queue_post", methods=['POST'])
@login_required
@login_user_basic
def edit_queue_analyzer_queue_post():
l_queue_meta = ['queue_uuid', 'description']
queue_uuid = request.form.get("queue_uuid")
queue_description = request.form.get("description")
l_uuid = set()
l_invalid_uuid = set()
for obj_tuple in list(request.form):
if obj_tuple not in l_queue_meta:
sensor_uuid = request.form.get(obj_tuple)
if Analyzer_Queue.is_valid_uuid_v4(sensor_uuid):
l_uuid.add(sensor_uuid)
else:
if sensor_uuid:
l_invalid_uuid.add(sensor_uuid)
if l_invalid_uuid:
queue_metadata = Analyzer_Queue.get_queue_metadata(queue_uuid, is_group=True)
if queue_description:
queue_metadata['description'] = queue_description
return render_template("analyzer_queue/queue_editor.html", queue_metadata=queue_metadata, l_sensors_uuid=l_uuid, l_invalid_uuid=l_invalid_uuid)
Analyzer_Queue.edit_queue_description(queue_uuid, queue_description)
Analyzer_Queue.edit_queue_sensors_set(queue_uuid, l_uuid)
return redirect(url_for('analyzer_queue.edit_queue_analyzer_queue', queue_uuid=queue_uuid))


@@ -0,0 +1,162 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for the rest api
'''
import os
import re
import sys
import time
import uuid
import json
import redis
import random
import datetime
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required
from functools import wraps
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import Sensor
import ConfigLoader
# ============ BLUEPRINT ============
restApi = Blueprint('restApi', __name__, template_folder='templates')
# ============ VARIABLES ============
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
r_cache = config_loader.get_redis_conn("Redis_CACHE")
config_loader = None
### ###
# ============ AUTH FUNCTIONS ============
def check_token_format(strg, search=re.compile(r'[^a-zA-Z0-9_-]').search):
return not bool(search(strg))
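`check_token_format` whitelists URL-safe characters before any Redis lookup, so malformed tokens are rejected without touching the database. For instance:

```python
import re

def check_token_format(strg, search=re.compile(r'[^a-zA-Z0-9_-]').search):
    # True only when every character is inside [a-zA-Z0-9_-]
    return not bool(search(strg))

check_token_format('aZ09-_valid_token')   # True
check_token_format('bad token!')          # False: space and '!' are outside the whitelist
```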
def verify_token(token):
if len(token) != 41:
return False
if not check_token_format(token):
return False
rand_sleep = random.randint(1,300)/1000
time.sleep(rand_sleep)
if r_serv_db.hexists('user:tokens', token):
return True
else:
return False
def get_user_from_token(token):
return r_serv_db.hget('user:tokens', token)
def verify_user_role(role, token):
user_id = get_user_from_token(token)
if user_id:
if is_in_role(user_id, role):
return True
else:
return False
else:
return False
def is_in_role(user_id, role):
if r_serv_db.sismember('user_role:{}'.format(role), user_id):
return True
else:
return False
# ============ DECORATOR ============
def token_required(user_role):
def actual_decorator(funct):
@wraps(funct)
def api_token(*args, **kwargs):
data = authErrors(user_role)
if data:
return Response(json.dumps(data[0], indent=2, sort_keys=True), mimetype='application/json'), data[1]
else:
return funct(*args, **kwargs)
return api_token
return actual_decorator
def get_auth_from_header():
token = request.headers.get('Authorization').replace(' ', '') # remove space
return token
def authErrors(user_role):
# Check auth
if not request.headers.get('Authorization'):
return ({'status': 'error', 'reason': 'Authentication needed'}, 401)
token = get_auth_from_header()
data = None
# verify token format
# brute force protection
current_ip = request.remote_addr
login_failed_ip = r_cache.get('failed_login_ip_api:{}'.format(current_ip))
# brute force by ip
if login_failed_ip:
login_failed_ip = int(login_failed_ip)
if login_failed_ip >= 5:
return ({'status': 'error', 'reason': 'Max Connection Attempts reached, Please wait {}s'.format(r_cache.ttl('failed_login_ip_api:{}'.format(current_ip)))}, 401)
try:
authenticated = False
if verify_token(token):
authenticated = True
# check user role
if not verify_user_role(user_role, token):
data = ({'status': 'error', 'reason': 'Access Forbidden'}, 403)
if not authenticated:
r_cache.incr('failed_login_ip_api:{}'.format(current_ip))
r_cache.expire('failed_login_ip_api:{}'.format(current_ip), 300)
data = ({'status': 'error', 'reason': 'Authentication failed'}, 401)
except Exception as e:
print(e)
data = ({'status': 'error', 'reason': 'Malformed Authentication String'}, 400)
if data:
return data
else:
return None
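`authErrors` throttles by client IP: every failed authentication `INCR`s a per-IP counter that expires after 300 seconds, and five or more failures lock the IP out until the key expires. An in-memory sketch of that window (a dict stand-in for the Redis `INCR`/`EXPIRE` calls; all names here are hypothetical):

```python
import time

FAILED = {}        # ip -> (failure count, window expiry), stand-in for the Redis keys
WINDOW = 300       # seconds, mirrors the EXPIRE above
MAX_ATTEMPTS = 5

def record_failure(ip, now=None):
    now = time.time() if now is None else now
    count, expiry = FAILED.get(ip, (0, now + WINDOW))
    if now >= expiry:                       # window elapsed: start a fresh one
        count, expiry = 0, now + WINDOW
    FAILED[ip] = (count + 1, expiry)

def is_blocked(ip, now=None):
    now = time.time() if now is None else now
    count, expiry = FAILED.get(ip, (0, 0.0))
    return now < expiry and count >= MAX_ATTEMPTS
```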
# ============ FUNCTIONS ============
def is_valid_uuid_v4(header_uuid):
try:
header_uuid=header_uuid.replace('-', '')
uuid_test = uuid.UUID(hex=header_uuid, version=4)
return uuid_test.hex == header_uuid
except:
return False
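`is_valid_uuid_v4` validates by round-tripping: re-parsing the hex through `uuid.UUID(version=4)` forces the version and variant bits, so equality with the input holds only when those bits were already set correctly. A sketch with a narrowed `except` clause (the bare `except:` above also swallows unrelated errors):

```python
import uuid

def is_valid_uuid_v4(header_uuid):
    try:
        header_uuid = header_uuid.replace('-', '')
        # UUID(version=4) overwrites the version/variant bits, so the re-hexed
        # value only equals the input when those bits were already correct
        return uuid.UUID(hex=header_uuid, version=4).hex == header_uuid
    except (AttributeError, TypeError, ValueError):
        return False
```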
def build_json_response(resp_data, resp_code):
return Response(json.dumps(resp_data, indent=2, sort_keys=True), mimetype='application/json'), resp_code
# ============= ROUTES ==============
@restApi.route("/api/v1/add/sensor/register", methods=['POST'])
@token_required('sensor_register')
def add_sensor_register():
data = request.get_json()
res = Sensor.register_sensor(data)
return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
@restApi.route("/api/v1/sensors/monitoring/errors", methods=['GET'])
@token_required('user')
def get_all_sensors_connection_errors():
res = Sensor.api_get_all_sensors_connection_errors()
return build_json_response(res[0], res[1])


@@ -0,0 +1,184 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for the rest api
'''
import os
import re
import sys
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required, current_user
from Role_Manager import login_admin, login_user_basic
from Role_Manager import create_user_db, edit_user_db, delete_user_db, check_password_strength, generate_new_token, gen_password, get_all_role
# ============ BLUEPRINT ============
settings = Blueprint('settings', __name__, template_folder='templates')
# ============ VARIABLES ============
email_regex = r'[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,6}'
email_regex = re.compile(email_regex)
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###
# ============ FUNCTIONS ============
def one():
return 1
def check_email(email):
return email_regex.match(email)
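`check_email` uses `re.match`, which anchors only at the start of the string, so an address followed by trailing junk still passes. A stricter variant anchored with `\Z` (an assumption on my part, not the shipped regex):

```python
import re

# \Z anchors the match at the end of the string as well
email_regex_strict = re.compile(r'[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,6}\Z')

def check_email_strict(email):
    return bool(email_regex_strict.match(email))

check_email_strict('admin@admin.test')            # True
check_email_strict('admin@admin.test<script>')    # False: trailing junk rejected
```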
def get_user_metadata(user_id):
user_metadata = {}
user_metadata['email'] = user_id
user_metadata['role'] = r_serv_db.hget('user_metadata:{}'.format(user_id), 'role')
user_metadata['api_key'] = r_serv_db.hget('user_metadata:{}'.format(user_id), 'token')
return user_metadata
def get_users_metadata(list_users):
users = []
for user in list_users:
users.append(get_user_metadata(user))
return users
def get_all_users():
return r_serv_db.hkeys('user:all')
# ============= ROUTES ==============
@settings.route("/settings/", methods=['GET'])
@login_required
@login_user_basic
def settings_page():
return redirect(url_for('settings.edit_profile'))
@settings.route("/settings/edit_profile", methods=['GET'])
@login_required
@login_user_basic
def edit_profile():
user_metadata = get_user_metadata(current_user.get_id())
admin_level = current_user.is_in_role('admin')
return render_template("edit_profile.html", user_metadata=user_metadata,
admin_level=admin_level)
@settings.route("/settings/new_token", methods=['GET'])
@login_required
@login_user_basic
def new_token():
generate_new_token(current_user.get_id())
return redirect(url_for('settings.edit_profile'))
@settings.route("/settings/new_token_user", methods=['GET'])
@login_required
@login_admin
def new_token_user():
user_id = request.args.get('user_id')
if r_serv_db.exists('user_metadata:{}'.format(user_id)):
generate_new_token(user_id)
return redirect(url_for('settings.users_list'))
@settings.route("/settings/create_user", methods=['GET'])
@login_required
@login_admin
def create_user():
user_id = request.args.get('user_id')
error = request.args.get('error')
error_mail = request.args.get('error_mail')
role = None
if r_serv_db.exists('user_metadata:{}'.format(user_id)):
role = r_serv_db.hget('user_metadata:{}'.format(user_id), 'role')
else:
user_id = None
all_roles = get_all_role()
return render_template("create_user.html", all_roles=all_roles, user_id=user_id, user_role=role,
error=error, error_mail=error_mail,
admin_level=True)
@settings.route("/settings/create_user_post", methods=['POST'])
@login_required
@login_admin
def create_user_post():
email = request.form.get('username')
role = request.form.get('user_role')
password1 = request.form.get('password1')
password2 = request.form.get('password2')
all_roles = get_all_role()
if email and len(email)< 300 and check_email(email) and role:
if role in all_roles:
# password set
if password1 and password2:
if password1==password2:
if check_password_strength(password1):
password = password1
else:
return render_template("create_user.html", all_roles=all_roles, error="Incorrect Password", admin_level=True)
else:
return render_template("create_user.html", all_roles=all_roles, error="Passwords don't match", admin_level=True)
# generate password
else:
password = gen_password()
if current_user.is_in_role('admin'):
# edit user
if r_serv_db.exists('user_metadata:{}'.format(email)):
if password1 and password2:
edit_user_db(email, password=password, role=role)
return redirect(url_for('settings.users_list', new_user=email, new_user_password=password, new_user_edited=True))
else:
edit_user_db(email, role=role)
return redirect(url_for('settings.users_list', new_user=email, new_user_password='Password not changed', new_user_edited=True))
# create user
else:
create_user_db(email, password, default=True, role=role)
return redirect(url_for('settings.users_list', new_user=email, new_user_password=password, new_user_edited=False))
else:
return render_template("create_user.html", all_roles=all_roles, admin_level=True)
else:
return render_template("create_user.html", all_roles=all_roles, error_mail=True, admin_level=True)
@settings.route("/settings/users_list", methods=['GET'])
@login_required
@login_admin
def users_list():
all_users = get_users_metadata(get_all_users())
new_user = request.args.get('new_user')
new_user_dict = {}
if new_user:
new_user_dict['email'] = new_user
new_user_dict['edited'] = request.args.get('new_user_edited')
new_user_dict['password'] = request.args.get('new_user_password')
return render_template("users_list.html", all_users=all_users, new_user=new_user_dict, admin_level=True)
@settings.route("/settings/edit_user", methods=['GET'])
@login_required
@login_admin
def edit_user():
user_id = request.args.get('user_id')
return redirect(url_for('settings.create_user', user_id=user_id))
@settings.route("/settings/delete_user", methods=['GET'])
@login_required
@login_admin
def delete_user():
user_id = request.args.get('user_id')
delete_user_db(user_id)
return redirect(url_for('settings.users_list'))


@@ -0,0 +1,52 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
from Role_Manager import create_user_db, edit_user_db, get_default_admin_token, gen_password
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###
if __name__ == "__main__":
# create role_list
if not r_serv.exists('d4:all_role'):
role_dict = {'admin': 1, 'user': 2, 'sensor_register': 20}
r_serv.zadd('d4:all_role', role_dict)
username = 'admin@admin.test'
password = gen_password()
if r_serv.exists('user_metadata:{}'.format(username)):
edit_user_db(username, password=password, role='admin')
else:
create_user_db(username, password, role='admin', default=True)
username2 = 'config_generator@register.test'
password2 = gen_password()
if r_serv.exists('user_metadata:config_generator@register.test'):
edit_user_db(username2, password=password2, role='sensor_register')
else:
create_user_db(username2, password2, role='sensor_register', default=True)
token = get_default_admin_token()
default_passwd_file = os.path.join(os.environ['D4_HOME'], 'DEFAULT_PASSWORD')
to_write_str = '# Password Generated by default\n# This file is deleted after the first login\n#\nemail=admin@admin.test\npassword='
to_write_str = to_write_str + password + '\nAPI_Key=' + token
with open(default_passwd_file, 'w') as f:
f.write(to_write_str)
print('new user created: {}'.format(username))
print('password: {}'.format(password))
print('token: {}'.format(token))


@@ -0,0 +1,58 @@
<!DOCTYPE html>
<html>
<head>
<title>403 - D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='image/d4-logo.png') }}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
</head>
<body>
{% include 'navbar.html' %}
<div>
<br>
<br>
<h1 class="text-center">403 Forbidden</h1>
</div>
<br>
<br>
<br>
<br>
<div class="d-flex justify-content-center">
<pre>
,d8 ,a8888a, ad888888b,
,d888 ,8P"' `"Y8, d8" "88
,d8" 88 ,8P Y8, a8P
,d8" 88 88 88 aad8"
,d8" 88 88 88 ""Y8,
8888888888888 `8b d8' "8b
88 `8ba, ,ad8' Y8, a88
88 "Y8888P" "Y888888P'
88888888888 88 88 88 88
88 88 "" 88 88
88 88 88 88
88aaaaa ,adPPYba, 8b,dPPYba, 88,dPPYba, 88 ,adPPYb,88 ,adPPYb,88 ,adPPYba, 8b,dPPYba,
88""""" a8" "8a 88P' "Y8 88P' "8a 88 a8" `Y88 a8" `Y88 a8P_____88 88P' `"8a
88 8b d8 88 88 d8 88 8b 88 8b 88 8PP""""""" 88 88
88 "8a, ,a8" 88 88b, ,a8" 88 "8a, ,d88 "8a, ,d88 "8b, ,aa 88 88
88 `"YbbdP"' 88 8Y"Ybbd8"' 88 `"8bbdP"Y8 `"8bbdP"Y8 `"Ybbd8"' 88 88
</pre>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-home").addClass("active");
} );
</script>
</html>


@@ -23,22 +23,7 @@
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item" mr-3>
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="d-flex justify-content-center">
<pre>
@@ -68,3 +53,11 @@
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-home").addClass("active");
} );
</script>
</html>


@@ -0,0 +1,7 @@
<div class="input-group mb-1">
<input type="text" class="form-control col-10 {%if error%}is-invalid{%else%}is-valid{%endif%}" name="{{sensor_uuid}}" value="{{sensor_uuid}}">
<span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>
<div class="invalid-feedback">
Please provide a valid UUID v4.
</div>
</div>


@@ -0,0 +1,157 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.popover{
max-width: 100%;
}
</style>
</head>
<body>
{% include 'navbar.html' %}
<div class="card mb-3 mt-1">
<div class="card-header text-white bg-dark">
<h5 class="card-title">Create Analyzer Queue</h5>
</div>
<div class="card-body">
<form action="{{ url_for('analyzer_queue.create_analyzer_queue_post') }}" method="post" enctype="multipart/form-data" onsubmit="submitPaste()">
<div class="form-group mb-4">
<label for="analyzer_type"><b>Analyzer Type</b></label>
<input class="form-control col-md-4" type="number" name="analyzer_type" id="analyzer_type" value="{%if queue_type%}{{queue_type}}{%else%}1{%endif%}" min="1" max="254" required>
<input class="form-control" type="text" name="analyzer_metatype" id="analyzer_metatype_name" placeholder="Meta Type Name" {%if metatype_name%}value="{{metatype_name}}"{%endif%}>
</div>
<div class="input-group my-2">
<div class="input-group-prepend">
<button class="btn btn-outline-secondary" type="button" onclick="generate_new_uuid();"><i class="fa fa-random"></i></button>
</div>
<input class="form-control col-md-4" type="text" name="analyzer_uuid" id="analyzer_uuid" {%if queue_uuid%}value="{{queue_uuid}}"{%endif%} placeholder="Analyzer uuid - (Optional)">
</div>
<div class="form-group my-2">
<input class="form-control" type="text" name="description" id="analyzer_description" {%if description%}value="{{description}}"{%endif%} placeholder="Description - (Optional)">
</div>
<div id="container-id-to-import">
<p>Create Queue by Group of UUID</p>
<div for="first_sensor_uuid"><b>Sensor UUID</b></div>
<div class="form-horizontal">
<div class="form-body">
<div class="form-group">
<div class="fields">
{% if l_uuid %}
{% for sensor_uuid in l_uuid %}
{% with sensor_uuid=sensor_uuid, error=False%}
{% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
{% endwith %}
{% endfor %}
<br>
{% endif %}
{% if l_invalid_uuid %}
{% for sensor_uuid in l_invalid_uuid %}
{% with sensor_uuid=sensor_uuid, error=True%}
{% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
{% endwith %}
{% endfor %}
<br>
{% endif %}
<div class="input-group mb-1">
<input type="text" class="form-control col-10" name="first_sensor_uuid" id="first_sensor_uuid">
<span class="btn btn-info input-group-addon add-field col-2"><i class="fa fa-plus"></i></span>
</div>
<span class="help-block" hidden>Export Objects</span>
</div>
</div>
</div>
</div>
</div>
<div class="form-group">
<button class="btn btn-info" type="submit">Create Queue</button>
</div>
</form>
</div>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
{%if queue_type!=2 and queue_type!=254%}
$('#analyzer_metatype_name').hide();
{%endif%}
});
var input_part_1 = '<div class="input-group mb-1"><input type="text" class="form-control col-10" name="'
var input_part_2 = '"></div>'
var minusButton = '<span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>'
$('.add-field').click(function() {
var new_uuid = uuidv4();
var template = input_part_1 + new_uuid + input_part_2;
var temp = $(template).insertBefore('.help-block');
temp.append(minusButton);
});
$('.fields').on('click', '.delete-field', function(){
console.log($(this).parent());
$(this).parent().remove();
//$.get( "#")
});
function uuidv4() {
return ([1e7]+-1e3+-4e3+-8e3+-1e11).replace(/[018]/g, c =>
(c ^ crypto.getRandomValues(new Uint8Array(1))[0] & 15 >> c / 4).toString(16)
);
}
$('#analyzer_type').on('input', function() {
if ($('#analyzer_type').val() == 2 || $('#analyzer_type').val() == 254){
$('#analyzer_metatype_name').show()
} else {
$('#analyzer_metatype_name').hide()
}
});
function generate_new_uuid(){
$.getJSON( "{{url_for('generate_uuid')}}", function( data ) {
console.log(data['uuid'])
$( "#analyzer_uuid" ).val(data['uuid']);
});
}
</script>


@@ -0,0 +1,168 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.popover{
max-width: 100%;
}
</style>
</head>
<body>
{% include 'navbar.html' %}
<div class="card mb-3 mt-1">
<div class="card-header text-white bg-dark">
<h5 class="card-title">Analyzer Queue: <b>{{queue_metadata['uuid']}}</b></h5>
</div>
<div class="card-body">
<table class="table table-striped table-bordered">
<thead class="thead-dark">
<tr>
<th>Type Name</th>
<th>Group</th>
<th style="max-width: 800px;">Name</th>
<th style="max-width: 800px;">Last updated</th>
<th style="max-width: 800px;">Change max size limit</th>
</tr>
</thead>
<tbody>
<tr>
<td>
{%if queue_metadata['format_type'] == 254%}
{{queue_metadata['extended_type']}}
{%else%}
{{queue_metadata['format_type']}}
{%endif%}
</td>
{%if queue_metadata['is_group_queue']%}
<td class="text-center"><i class="fa fa-group"></i></td>
{%else%}
<td></td>
{%endif%}
<td>
<div class="d-flex">
<b>{{queue_metadata['uuid']}}:{{queue_metadata['format_type']}}{%if queue_metadata['format_type'] == 254%}:{{queue_metadata['extended_type']}}{%endif%}</b>
</div>
</td>
<td>{{queue_metadata['last_updated']}}</td>
<td>
<div class="d-xl-flex justify-content-xl-center">
<input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{queue_metadata['uuid']}}" value="{{queue_metadata['size_limit']}}" min="0" required="">
<button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{queue_metadata['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{queue_metadata['uuid']}}').val();">Change Max Size</button>
</div>
</td>
</tr>
</tbody>
</table>
<form action="{{ url_for('analyzer_queue.edit_queue_analyzer_queue_post') }}" method="post" enctype=multipart/form-data>
<input class="form-control" type="text" name="queue_uuid" id="queue_uuid" value="{{queue_metadata['uuid']}}" hidden>
<div class="form-group my-2">
<input class="form-control" type="text" name="description" id="analyzer_description" {%if 'description' in queue_metadata%}value="{{queue_metadata['description']}}"{%endif%} placeholder="Description - (Optional)">
</div>
<div>
<br>
<div for="first_sensor_uuid"><b>Sensor UUID</b></div>
<div class="form-horizontal">
<div class="form-body">
<div class="form-group">
<div class="fields">
{% if l_sensors_uuid %}
{% for sensor_uuid in l_sensors_uuid %}
{% with sensor_uuid=sensor_uuid, error=False%}
{% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
{% endwith %}
{% endfor %}
<br>
{% endif %}
{% if l_invalid_uuid %}
{% for sensor_uuid in l_invalid_uuid %}
{% with sensor_uuid=sensor_uuid, error=True%}
{% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
{% endwith %}
{% endfor %}
<br>
{% endif %}
<div class="input-group mb-1">
<input type="text" class="form-control col-10" name="first_sensor_uuid" id="first_sensor_uuid">
<span class="btn btn-info input-group-addon add-field col-2"><i class="fa fa-plus"></i></span>
</div>
<span class="help-block" hidden>Sensor UUID</span>
</div>
</div>
</div>
</div>
</div>
<div class="form-group">
<button class="btn btn-info" type="submit">Edit Queue</button>
</div>
</form>
</div>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
});
var input_part_1 = '<div class="input-group mb-1"><input type="text" class="form-control col-10" name="'
var input_part_2 = '"></div>'
var minusButton = '<span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>'
$('.add-field').click(function() {
var new_uuid = uuidv4();
var template = input_part_1 + new_uuid + input_part_2;
var temp = $(template).insertBefore('.help-block');
temp.append(minusButton);
});
$('.fields').on('click', '.delete-field', function(){
console.log($(this).parent());
$(this).parent().remove();
//$.get( "#")
});
function uuidv4() {
return ([1e7]+-1e3+-4e3+-8e3+-1e11).replace(/[018]/g, c =>
(c ^ crypto.getRandomValues(new Uint8Array(1))[0] & 15 >> c / 4).toString(16)
);
}
</script>


@@ -23,22 +23,7 @@
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item" mr-3>
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="card-deck justify-content-center ml-0 mr-0">
<div class="card border-dark mt-3 ml-4 mr-4">


@@ -23,22 +23,7 @@
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item" mr-3>
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="card-deck justify-content-center ml-0 mr-0">
<div class="card border-dark mt-3 ml-4 mr-4">


@@ -0,0 +1,108 @@
<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<style>
html,
body {
height: 100%;
}
body {
display: -ms-flexbox;
display: flex;
-ms-flex-align: center;
align-items: center;
padding-top: 40px;
padding-bottom: 40px;
background-color: #f5f5f5;
}
.form-signin {
width: 100%;
max-width: 330px;
padding: 15px;
margin: auto;
}
.form-signin .checkbox {
font-weight: 400;
}
.form-signin .form-control {
position: relative;
box-sizing: border-box;
height: auto;
padding: 10px;
font-size: 16px;
}
.form-signin .form-control:focus {
z-index: 2;
}
.form-signin input[type="password"] {
margin-bottom: 10px;
border-top-left-radius: 0;
border-top-right-radius: 0;
}
</style>
</head>
<body class="text-center">
<form class="form-signin" action="{{ url_for('change_password')}}" autocomplete="off" method="post">
<img class="mb-4" src="{{ url_for('static', filename='img/d4-logo.png')}}" width="300">
<h1 class="h3 mb-3 text-secondary">Change Password</h1>
<label for="inputPassword1" class="sr-only">Password</label>
<input type="password" id="inputPassword1" name="password1" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Password" autocomplete="new-password" required autofocus>
<label for="inputPassword2" class="sr-only">Confirm Password</label>
<input type="password" id="inputPassword2" name="password2" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Confirm Password" value="" autocomplete="new-password" required>
{% if error %}
<div class="invalid-feedback">
{{error}}
</div>
{% endif %}
<button class="btn btn-lg btn-primary btn-block" type="submit">Submit</button>
<br>
<br>
<br>
<h5 class="h3 mb-3 text-secondary">Password Requirements</h5>
<ul class="list-group">
<li class="list-group-item d-flex justify-content-between align-items-center">
Minimum length
<span class="badge badge-primary badge-pill">10</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Uppercase characters: A-Z
<span class="badge badge-primary badge-pill">1</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Lowercase characters: a-z
<span class="badge badge-primary badge-pill">1</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Digits: 0-9
<span class="badge badge-primary badge-pill">2</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Maximum length
<span class="badge badge-primary badge-pill">100</span>
</li>
</ul>
</form>
</body>


@@ -0,0 +1,156 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png') }}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
</head>
<body>
{% include 'navbar.html' %}
<div class="container-fluid">
<div class="row">
{% include 'sidebar_settings.html' %}
<div class="col-12 col-lg-10" id="core_content">
<form class="form-signin" action="{{ url_for('settings.create_user_post')}}" autocomplete="off" method="post">
<h1 class="h3 mt-1 mb-3 text-center text-secondary">Create User</h1>
<label for="inputEmail" class="sr-only">Email address</label>
<input type="email" id="inputEmail" name="username" class="form-control {% if error_mail %}is-invalid{% endif %}" placeholder="Email address" autocomplete="off" required {% if user_id %}value="{{user_id}}"{% else %}{% endif %}>
{% if error_mail %}
<div class="invalid-feedback">
Please provide a valid email address
</div>
{% endif %}
<label class="mt-3" for="role_selector">User Role</label>
<select class="custom-select" id="role_selector" name="user_role">
{% for role in all_roles %}
{% if role == user_role %}
<option value="{{role}}" selected>{{role}}</option>
{% else %}
<option value="{{role}}">{{role}}</option>
{% endif %}
{% endfor %}
</select>
<div class="custom-control custom-switch mt-4 mb-3">
<input type="checkbox" class="custom-control-input" id="set_manual_password" value="" onclick="toggle_password_fields();">
<label class="custom-control-label" for="set_manual_password">Set Password</label>
</div>
<div id="password-section">
<h1 class="h3 mb-3 text-center text-secondary">Create Password</h1>
<label for="inputPassword1" class="sr-only">Password</label>
<input type="password" id="inputPassword1" name="password1" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Password" autocomplete="new-password">
<label for="inputPassword2" class="sr-only">Confirm Password</label>
<input type="password" id="inputPassword2" name="password2" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Confirm Password" value="" autocomplete="new-password">
{% if error %}
<div class="invalid-feedback">
{{error}}
</div>
{% endif %}
</div>
<button class="btn btn-lg btn-primary btn-block mt-3" type="submit">Submit</button>
<div id="password-section-info">
<br>
<br>
<br>
<h5 class="h3 mb-3 text-center text-secondary">Password Requirements</h5>
<ul class="list-group">
<li class="list-group-item d-flex justify-content-between align-items-center">
Minimum length
<span class="badge badge-primary badge-pill">10</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Uppercase characters: A-Z
<span class="badge badge-primary badge-pill">1</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Lowercase characters: a-z
<span class="badge badge-primary badge-pill">1</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Digits: 0-9
<span class="badge badge-primary badge-pill">2</span>
</li>
<li class="list-group-item d-flex justify-content-between align-items-center">
Maximum length
<span class="badge badge-primary badge-pill">100</span>
</li>
</ul>
</div>
</form>
</div>
</div>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#password-section").hide();
$("#password-section-info").hide();
$("#nav-settings").addClass("active");
$("#nav_create_user").addClass("active");
$("#nav_user_management").removeClass("text-muted");
{% if error %}
toggle_password_fields();
{% endif %}
} );
function toggle_sidebar(){
if($('#nav_menu').is(':visible')){
$('#nav_menu').hide();
$('#side_menu').removeClass('border-right')
$('#side_menu').removeClass('col-lg-2')
$('#core_content').removeClass('col-lg-10')
}else{
$('#nav_menu').show();
$('#side_menu').addClass('border-right')
$('#side_menu').addClass('col-lg-2')
$('#core_content').addClass('col-lg-10')
}
}
function toggle_password_fields() {
var password_div = $("#password-section");
if(password_div.is(":visible")){
$("#password-section").hide();
$("#password-section-info").hide();
$("#inputPassword1").prop('required',false);
$("#inputPassword2").prop('required',false);
} else {
$("#password-section").show();
$("#password-section-info").show();
$("#inputPassword1").prop('required',true);
$("#inputPassword2").prop('required',true);
}
}
</script>
</html>


@@ -0,0 +1,99 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png') }}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
</head>
<body>
{% include 'navbar.html' %}
<div class="container-fluid">
<div class="row">
{% include 'sidebar_settings.html' %}
<div class="col-12 col-lg-10" id="core_content">
<div class="card mb-3 mt-1">
<div class="card-header text-white bg-dark pb-1">
<h5 class="card-title">My Profile :</h5>
</div>
<div class="card-body">
<div class="row">
<div class="col-xl-6">
<div class="card text-center border-secondary">
<div class="card-body px-1 py-0">
<table class="table table-sm">
<tbody>
<tr>
<td>Email</td>
<td>{{user_metadata['email']}}</td>
</tr>
<tr>
<td>Role</td>
<td>{{user_metadata['role']}}</td>
</tr>
<tr>
<td>API Key</td>
<td>
{{user_metadata['api_key']}}
<a class="ml-3" href="{{url_for('settings.new_token')}}"><i class="fa fa-random"></i></a>
</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-settings").addClass("active");
$("#nav_edit_profile").addClass("active");
$("#nav_my_profile").removeClass("text-muted");
} );
function toggle_sidebar(){
if($('#nav_menu').is(':visible')){
$('#nav_menu').hide();
$('#side_menu').removeClass('border-right')
$('#side_menu').removeClass('col-lg-2')
$('#core_content').removeClass('col-lg-10')
}else{
$('#nav_menu').show();
$('#side_menu').addClass('border-right')
$('#side_menu').addClass('col-lg-2')
$('#core_content').addClass('col-lg-10')
}
}
</script>
</html>


@@ -9,6 +9,7 @@
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/d3.min.js')}}"></script>
@@ -29,8 +30,12 @@
width:600px;
height:500px;
}
.bar{
fill:#eaeaea;
}
.bars:hover{
fill: aqua;
cursor: pointer;
}
text.label{
fill: #777777;
@@ -38,9 +43,13 @@
font-size: 20px;
font-weight: bold;
}
text.category{
fill: #666666;
font-size: 18px;
}
text.categorys:hover{
fill: black;
cursor: pointer;
}
</style>
@@ -49,22 +58,7 @@
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item active">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item" mr-3>
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="row mr-0">
@@ -96,17 +90,14 @@
</div>
</div>
<div class="d-flex justify-content-center">
<a href="{{ url_for('delete_data') }}">
<button type="button" class="btn btn-primary mt-3 mb-2">Delete All Data (Demo)</button>
</a>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-home").addClass("active");
} );
////
//http://bl.ocks.org/charlesdguthrie/11356441, updated and modified
//updating BarChart
@@ -135,12 +126,12 @@ var setup = function(targetID){
var settings = {
margin:margin, width:width, height:height, categoryIndent:categoryIndent,
svg:svg, x:x, y:y
svg:svg, x:x, y:y, myid:targetID
}
return settings;
}
var redrawChart = function(targetID, newdata) {
var redrawChart = function(div_id, targetID, newdata) {
//Import settings
var margin=targetID.margin, width=targetID.width, height=targetID.height, categoryIndent=targetID.categoryIndent,
@@ -169,13 +160,24 @@ var redrawChart = function(targetID, newdata) {
.attr("class", "chartRow")
.attr("transform", "translate(0," + height + margin.top + margin.bottom + ")");
//bars
newRow.insert("rect")
.attr("class","bar")
.attr("x", 0)
.attr("opacity",0)
.attr("height", y.bandwidth())
.attr("width", function(d) { return x(d.value);})
if (div_id=='chart_uuid'){
//bars
newRow.insert("rect")
.on("click", function (d) { window.location.href = "{{ url_for('uuid_management') }}?uuid="+d.key })
.attr("class","bar bars")
.attr("x", 0)
.attr("opacity",0)
.attr("height", y.bandwidth())
.attr("width", function(d) { return x(d.value);})
} else {
//bars
newRow.insert("rect")
.attr("class","bar")
.attr("x", 0)
.attr("opacity",0)
.attr("height", y.bandwidth())
.attr("width", function(d) { return x(d.value);})
}
//labels
newRow.append("text")
@@ -187,17 +189,30 @@ var redrawChart = function(targetID, newdata) {
.attr("dx","0.5em")
.text(function(d){return d.value;});
//text
newRow.append("text")
.attr("class","category")
.attr("text-overflow","ellipsis")
.attr("y", y.bandwidth()/2)
.attr("x",categoryIndent)
.attr("opacity",0)
.attr("dy",".35em")
.attr("dx","5em")
.text(function(d){return d.key});
if (div_id=='chart_uuid'){
//text
newRow.append("text")
.on("click", function (d) { window.location.href = "{{ url_for('uuid_management') }}?uuid="+d.key })
.attr("class","category categorys")
.attr("text-overflow","ellipsis")
.attr("y", y.bandwidth()/2)
.attr("x",categoryIndent)
.attr("opacity",0)
.attr("dy",".35em")
.attr("dx","5em")
.text(function(d){return d.key});
} else {
//text
newRow.append("text")
.attr("class","category")
.attr("text-overflow","ellipsis")
.attr("y", y.bandwidth()/2)
.attr("x",categoryIndent)
.attr("opacity",0)
.attr("dy",".35em")
.attr("dx","5em")
.text(function(d){return d.key});
}
//////////
//UPDATE//
@@ -249,10 +264,10 @@ var redrawChart = function(targetID, newdata) {
.attr("transform", function(d){ return "translate(0," + y(d.key) + ")"; });
};
var pullData = function(json_url,settings,callback){
var pullData = function(div_id,json_url,settings,callback){
d3.json(json_url, function (err, data){
if (err) return console.warn(err);
callback(settings,data);
callback(div_id,settings,data);
})
}
@@ -264,8 +279,8 @@ var formatData = function(data){
.slice(0, 15); // limit to 15 items
}
var redraw = function(json_url,settings){
pullData(json_url,settings,redrawChart)
var redraw = function(div_id,json_url,settings){
pullData(div_id,json_url,settings,redrawChart)
}
json_url_uuid = "{{ url_for('_json_daily_uuid_stats') }}"
@@ -273,15 +288,17 @@ json_url_type = "{{ url_for('_json_daily_type_stats') }}"
//setup
var settings = setup('#chart_uuid');
redraw(json_url_uuid,settings)
redraw('chart_uuid',json_url_uuid,settings)
var settings_type = setup('#chart_type');
redraw(json_url_type,settings_type)
redraw('chart_type',json_url_type,settings_type)
//Interval
setInterval(function(){
redraw(json_url_uuid,settings)
redraw(json_url_type,settings_type)
redraw('chart_uuid',json_url_uuid,settings)
redraw('chart_type',json_url_type,settings_type)
}, 4000);
////


@@ -0,0 +1,86 @@
<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<style>
html,
body {
height: 100%;
}
body {
display: -ms-flexbox;
display: flex;
-ms-flex-align: center;
align-items: center;
padding-top: 40px;
padding-bottom: 40px;
background-color: #f5f5f5;
}
.form-signin {
width: 100%;
max-width: 330px;
padding: 15px;
margin: auto;
}
.form-signin .checkbox {
font-weight: 400;
}
.form-signin .form-control {
position: relative;
box-sizing: border-box;
height: auto;
padding: 10px;
font-size: 16px;
}
.form-signin .form-control:focus {
z-index: 2;
}
.form-signin input[type="email"] {
margin-bottom: -1px;
border-bottom-right-radius: 0;
border-bottom-left-radius: 0;
}
.form-signin input[type="password"] {
margin-bottom: 10px;
border-top-left-radius: 0;
border-top-right-radius: 0;
}
</style>
</head>
<body class="text-center">
<form class="form-signin" action="{{ url_for('login')}}" method="post">
<img class="mb-4" src="{{ url_for('static', filename='img/d4-logo.png')}}" width="300">
<h1 class="h3 mb-3 text-secondary">Please sign in</h1>
<input type="text" id="next_page" name="next_page" value="{{next_page}}" hidden>
<label for="inputEmail" class="sr-only">Email address</label>
<input type="email" id="inputEmail" name="username" class="form-control" placeholder="Email address" required autofocus>
<label for="inputPassword" class="sr-only">Password</label>
<input type="password" id="inputPassword" name="password" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Password" required>
{% if error %}
<div class="invalid-feedback">
{{error}}
</div>
{% endif %}
<button class="btn btn-lg btn-primary btn-block" type="submit">Sign in</button>
</form>
</body>


@@ -0,0 +1,22 @@
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" id="nav-home" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item mr-3">
<a class="nav-link mr-3" id="nav-sensor" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" id="nav-server" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" id="nav-settings" href="{{ url_for('settings.settings_page') }}" tabindex="-1" aria-disabled="true">Settings</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('logout') }}" tabindex="-1" aria-disabled="true"><i class="fa fa-sign-out"></i>Log Out</a>
</li>
</ul>
</nav>


@@ -0,0 +1,96 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.popover{
max-width: 100%;
}
</style>
</head>
<body>
{% include 'navbar.html' %}
<div class="d-flex justify-content-center">
<div class="card border-dark mt-3 text-center" style="max-width: 30rem;">
<div class="card-body text-dark">
<h5 class="card-title">Approve New Sensor UUID</h5>
<input class="form-control" type="text" id="uuid" value="" required>
<button type="button" class="btn btn-outline-secondary mt-1" onclick="window.location.href ='{{ url_for('approve_sensor') }}?uuid='+$('#uuid').val();">Approve UUID</button>
</div>
</div>
</div>
<div class="py-3 mx-2">
<table class="table table-striped table-bordered table-hover text-center" id="myTable_1">
<thead>
<tr>
<th class="bg-info text-white">UUID</th>
<th class="bg-info text-white">Description</th>
<th class="bg-info text-white">Email</th>
<th class="bg-info text-white"></th>
</tr>
</thead>
<tbody>
{% for row_uuid in all_pending %}
<tr data-trigger="hover" title="" data-content="test content" data-original-title="test title">
<td>
<a class="" href="{{ url_for('uuid_management') }}?uuid={{row_uuid['uuid']}}">
{{row_uuid['uuid']}}
</a>
</td>
<td>{{row_uuid['description']}}</td>
<td>{{row_uuid['mail']}}</td>
<td>
<a href="{{ url_for('approve_sensor') }}?uuid={{row_uuid['uuid']}}">
<button type="button" class="btn btn-outline-info"><i class="fa fa-plus"></i></button>
</a>
<a href="{{ url_for('delete_pending_sensor') }}?uuid={{row_uuid['uuid']}}">
<button type="button" class="btn btn-outline-danger"><i class="fa fa-trash"></i></button>
</a>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
table = $('#myTable_1').DataTable(
{
"aLengthMenu": [[5, 10, 15, 20, -1], [5, 10, 15, 20, "All"]],
"iDisplayLength": 10,
"order": [[ 0, "asc" ]]
}
);
$('[data-toggle="popover"]').popover({
placement: 'top',
container: 'body',
html : false,
})
});
</script>


@@ -0,0 +1,116 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.popover{
max-width: 100%;
}
</style>
</head>
<body>
{% include 'navbar.html' %}
<div class="mx-2 py-3">
<table class="table table-striped table-bordered table-hover text-center" id="myTable_1">
<thead>
<tr>
<th class="bg-info text-white">UUID</th>
<th class="bg-info text-white">first seen</th>
<th class="bg-info text-white">last seen</th>
<th class="bg-info text-white">types</th>
<th class="bg-secondary text-white">Status</th>
<th class="bg-secondary text-white"></th>
</tr>
</thead>
<tbody>
{% for row_uuid in all_sensors %}
<tr data-trigger="hover" title="" data-content="test content" data-original-title="test title">
<td>
<a class="" href="{{ url_for('uuid_management') }}?uuid={{row_uuid['uuid']}}">
{{row_uuid['uuid']}}
</a>
{% if row_uuid['description'] %}
<div class="text-info"><small>{{row_uuid['description']}}</small></div>
{% endif %}
</td>
<td>
{% if row_uuid['first_seen'] %}
{{row_uuid['first_seen']}}
{% else %}
{{'-'}}
{% endif %}
</td>
<td>
{% if row_uuid['last_seen'] %}
{{row_uuid['last_seen']}}
{% else %}
{{'-'}}
{% endif %}
</td>
<td>
{{type_description}}
{% for uuid_type in row_uuid['types'] %}
<span class="badge badge-dark">
{{uuid_type['type']}}
</span>
{% endfor %}
</td>
{% if not row_uuid['Error'] %}
<td class="text-success">
OK -
{% else %}
<td class="text-danger">
<i class="fa fa-times-circle"></i> {{row_uuid['Error']}}
{% endif %}
{% if row_uuid['active_connection'] %}
<i class="fa fa-check-circle"></i> Connected
{% endif %}
</td>
<td>
<a href="{{ url_for('delete_registered_sensor') }}?uuid={{row_uuid['uuid']}}">
<button type="button" class="btn btn-outline-danger"><i class="fa fa-trash"></i></button>
</a>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-sensor").addClass("active");
table = $('#myTable_1').DataTable(
{
"aLengthMenu": [[5, 10, 15, 20, -1], [5, 10, 15, 20, "All"]],
"iDisplayLength": 10,
"order": [[ 0, "asc" ]]
}
);
});
</script>

View File

@ -0,0 +1,57 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
</head>
<body>
{% include 'navbar.html' %}
<form action="{{ url_for('D4_sensors.add_sensor_to_monitor_post') }}" method="post" enctype=multipart/form-data>
<div class="d-flex justify-content-center">
<div class="col-sm-6">
<h4 class="my-3">Monitor a Sensor</h4>
<div class="form-group">
<input class="form-control text-center bg-dark text-white" type="text" value="{{sensor_uuid}}" disabled>
<input type="text" name="uuid" id="uuid" value="{{sensor_uuid}}" hidden>
</div>
<div class="input-group mt-2 mb-2">
<div class="input-group-prepend">
<span class="input-group-text bg-light"><i class="fa fa-clock-o"></i>&nbsp;</span>
</div>
<input class="form-control" type="number" id="delta_time" value="3600" min="30" name="delta_time" required>
<div class="input-group-append">
<span class="input-group-text">Maximum Time (seconds) between two D4 packets</span>
</div>
</div>
<div class="form-group">
<button class="btn btn-primary" type="submit">Monitor Sensor</button>
</div>
</div>
</div>
</form>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-sensor").addClass("active");
});
</script>

View File

@ -7,13 +7,19 @@
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.popover{
max-width: 100%;
}
</style>
@ -21,22 +27,7 @@
<body>
<nav class="navbar navbar-expand-lg navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item mr-3">
<a class="nav-link active mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="card mt-2 mb-2">
<div class="card-body bg-dark text-white">
@ -63,56 +54,79 @@
</div>
</div>
{% for row_uuid in status_daily_uuid %}
<div class="card text-center mt-3 ml-2 mr-2">
<a href="{{ url_for('uuid_management') }}?uuid={{row_uuid['uuid']}}">
<div class="card-header bg-dark text-white">
UUID: {{row_uuid['uuid']}}
</div>
</a>
<div class="card-body">
<div class="card-group">
<div class="card">
<div class="card-header bg-info text-white">
First Seen
</div>
<div class="card-body">
<p class="card-text">{{row_uuid['first_seen_gmt']}} - ({{row_uuid['first_seen']}})</p>
</div>
</div>
<div class="card">
<div class="card-header bg-info text-white">
Last Seen
</div>
<div class="card-body">
<p class="card-text">{{row_uuid['last_seen_gmt']}} - ({{row_uuid['last_seen']}})</p>
</div>
</div>
<div class="card">
{% if not row_uuid['Error'] %}
<div class="card-header bg-success text-white">
Status
</div>
<div class="card-body text-success">
<p class="card-text">OK</p>
{% else %}
<div class="card-header bg-danger text-white">
Status
</div>
<div class="card-body text-danger">
<p class="card-text">{{row_uuid['Error']}}</p>
{% endif %}
{% if row_uuid['active_connection'] %}
<div style="color:Green; display:inline-block">
<i class="fa fa-check-circle"></i> Connected
</div>
<div class="mx-2">
<table class="table table-striped table-bordered table-hover text-center" id="myTable_1">
<thead>
<tr>
<th class="bg-info text-white">UUID</th>
<th class="bg-info text-white">first seen</th>
<th class="bg-info text-white">last seen</th>
<th class="bg-info text-white">types</th>
<th class="bg-secondary text-white">Status</th>
</tr>
</thead>
<tbody>
{% for row_uuid in status_daily_uuid %}
<tr>
<td>
<a class="" href="{{ url_for('uuid_management') }}?uuid={{row_uuid['uuid']}}">
{{row_uuid['uuid']}}
</a>
<div class="text-info"><small>{{row_uuid['description']}}</small></div>
</td>
<td>{{row_uuid['first_seen_gmt']}}</td>
<td>{{row_uuid['last_seen_gmt']}}</td>
<td>
{% for uuid_type in row_uuid['l_uuid_types'] %}
{% if row_uuid['type_connection_status'][uuid_type] %}
<span class="badge badge-success" data-toggle="popover" data-trigger="hover" title="" data-content="{{types_description[uuid_type]}}" data-original-title="{{uuid_type}}">
{{uuid_type}}
</span>
{% else %}
<span class="badge badge-dark" data-toggle="popover" data-trigger="hover" title="" data-content="{{types_description[uuid_type]}}" data-original-title="{{uuid_type}}">
{{uuid_type}}
</span>
{% endif %}
{% endfor %}
</td>
<td
{% if not row_uuid['Error'] %}
class="text-success">
OK -
{% else %}
class="text-danger">
<i class="fa fa-times-circle"></i> {{row_uuid['Error']}}
{% endif %}
</div>
</div>
</div>
</div>
</div>
{% endfor %}
{% if row_uuid['active_connection'] %}
<i class="fa fa-check-circle"></i> Connected
{% endif %}
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-sensor").addClass("active");
table = $('#myTable_1').DataTable(
{
"aLengthMenu": [[5, 10, 15, 20, -1], [5, 10, 15, 20, "All"]],
"iDisplayLength": 10,
"order": [[ 0, "asc" ]]
}
);
$('[data-toggle="popover"]').popover({
placement: 'top',
container: 'body',
html : false,
})
});
</script>

View File

@ -19,22 +19,25 @@
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item mr-3">
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link active" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="d-flex justify-content-center">
<div class="card border-secondary mt-3 text-center" style="max-width: 30rem;">
<div class="card-body text-dark">
<h5 class="card-title">D4 Server mode:
<span class="badge badge-dark">
{{server_mode}}
</span>
</h5>
<a href="{{ url_for('registered_sensor') }}">
<button type="button" class="btn btn-info">Registered Sensors <span class="badge badge-light">{{nb_sensors_registered}}</span></button>
</a>
<a href="{{ url_for('pending_sensors') }}">
<button type="button" class="btn btn-outline-secondary">Pending Sensors <span class="badge badge-danger">{{nb_sensors_pending}}</span></button>
</a>
</div>
</div>
</div>
<div class="card-deck ml-0 mr-0">
<div class="card text-center mt-3 ml-xl-4">
@ -175,13 +178,40 @@
{% endfor %}
</tbody>
</table>
<div class="mt-3">
<table class="table table-striped table-bordered table-hover mt-3" id="table_accepted_extended_type">
<thead class="thead-dark">
<tr>
<th>Type Name</th>
<th>Description</th>
<th>Remove Type</th>
</tr>
</thead>
<tbody id="table_accepted_extended_type_tbody">
{% for type in list_accepted_extended_types %}
<tr>
<td>{{type['name']}}</td>
<td>{{type['description']}}</td>
<td>
<a href="{{ url_for('remove_accepted_extended_type') }}?type_name={{type['name']}}">
<button type="button" class="btn btn-outline-danger">Remove Extended Type</button>
</a>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div class="col-md-4">
<div class="card border-dark mt-3" style="max-width: 18rem;">
<div class="card-body text-dark">
<h5 class="card-title">Add New Types</h5>
<input class="form-control" type="number" id="accepted_type" value="1" min="1" max="8" required>
<button type="button" class="btn btn-outline-primary mt-1" onclick="window.location.href ='{{ url_for('add_accepted_type') }}?redirect=1&type='+$('#accepted_type').val();">Add New Type</button>
<input class="form-control" type="number" id="accepted_type" value="1" min="1" max="254" required>
<input class="form-control" type="text" id="extended_type_name" placeholder="Type Name">
<button type="button" class="btn btn-outline-primary mt-1" onclick="window.location.href ='{{ url_for('add_accepted_type') }}?redirect=1&type='+$('#accepted_type').val()+'&extended_type_name='+$('#extended_type_name').val();">Add New Type</button>
</div>
</div>
</div>
@ -201,11 +231,12 @@
<div class="card-body text-dark">
<div class="row">
<div class="col-xl-8">
<div class="col-xl-10">
<table class="table table-striped table-bordered table-hover" id="myTable_1">
<thead class="thead-dark">
<tr>
<th>Type</th>
<th>Group</th>
<th style="max-width: 800px;">uuid</th>
<th style="max-width: 800px;">last updated</th>
<th style="max-width: 800px;">Change max size limit</th>
@ -216,19 +247,36 @@
{% for type in list_accepted_types %}
{% if type['list_analyzer_uuid'] %}
{% for analyzer in type['list_analyzer_uuid'] %}
<tr>
<tr id="{{analyzer['uuid']}}">
<td>{{type['id']}}</td>
<td>{{analyzer['uuid']}}</td>
{%if analyzer['is_group_queue']%}
<td class="text-center"><i class="fa fa-group"></i></td>
{%else%}
<td></td>
{%endif%}
<td>
<div class="d-flex">
<a href="{{ url_for('analyzer_queue.edit_queue_analyzer_queue') }}?queue_uuid={{analyzer['uuid']}}">
{{analyzer['uuid']}}
</a>
<a href="{{ url_for('remove_analyzer') }}?redirect=1&type={{type['id']}}&analyzer_uuid={{analyzer['uuid']}}" class="ml-auto">
<button type="button" class="btn btn-outline-danger px-2 py-0"><i class="fa fa-trash"></i></button>
</a>
</div>
{%if analyzer['description']%}
<div class="text-info"><small>{{analyzer['description']}}</small></div>
{%endif%}
</td>
<td>{{analyzer['last_updated']}}</td>
<td>
<div class="d-lg-flex justify-content-lg-center">
<div class="d-xl-flex justify-content-xl-center">
<input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{analyzer['uuid']}}" value="{{analyzer['size_limit']}}" min="0" required="">
<button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{analyzer['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{analyzer['uuid']}}').val();">Change Max Size</button>
</div>
</td>
<td>
<a href="{{ url_for('remove_analyzer') }}?redirect=1&type={{type['id']}}&analyzer_uuid={{analyzer['uuid']}}">
<button type="button" class="btn btn-outline-danger"><i class="fa fa-trash"></i></button>
<a href="{{ url_for('empty_analyzer_queue') }}?redirect=1&type={{type['id']}}&analyzer_uuid={{analyzer['uuid']}}">
<button type="button" class="btn btn-outline-danger"><i class="fa fa-eraser"></i></button>
</a>
<button type="button" class="btn btn-outline-info ml-xl-3" onclick="get_analyser_sample('{{type['id']}}', '{{analyzer['uuid']}}');"><i class="fa fa-database"></i> {{analyzer['length']}}</button>
</td>
@ -238,16 +286,65 @@
{% endfor %}
</tbody>
</table>
</div>
<div class="col-xl-4">
<div class="card border-dark mt-3" style="max-width: 18rem;">
<div class="card-body text-dark">
<h5 class="card-title">Add New Analyzer Queue</h5>
<input class="form-control" type="number" id="analyzer_type" value="1" min="1" max="8" required>
<input class="form-control" type="text" id="analyzer_uuid" required>
<button type="button" class="btn btn-outline-primary mt-1" onclick="window.location.href ='{{ url_for('add_new_analyzer') }}?redirect=1&type='+$('#analyzer_type').val()+'&analyzer_uuid='+$('#analyzer_uuid').val();">Add New Analyzer</button>
</div>
<div class="mt-3">
<table class="table table-striped table-bordered table-hover" id="analyzer_accepted_extended_types">
<thead class="thead-dark">
<tr>
<th>Type Name</th>
<th>Group</th>
<th style="max-width: 800px;">uuid</th>
<th style="max-width: 800px;">last updated</th>
<th style="max-width: 800px;">Change max size limit</th>
<th style="max-width: 800px;">Analyzer Queue</th>
</tr>
</thead>
<tbody id="analyzer_accepted_extended_types_tbody">
{% for dict_queue in l_queue_extended_type %}
<tr>
<td>{{dict_queue['extended_type']}}</td>
{%if dict_queue['is_group_queue']%}
<td class="text-center"><i class="fa fa-group"></i></td>
{%else%}
<td></td>
{%endif%}
<td>
<div class="d-flex">
<a href="{{ url_for('analyzer_queue.edit_queue_analyzer_queue') }}?queue_uuid={{dict_queue['uuid']}}">
{{dict_queue['uuid']}}
</a>
<a href="{{ url_for('remove_analyzer') }}?redirect=1&type=254&metatype_name={{dict_queue['extended_type']}}&analyzer_uuid={{dict_queue['uuid']}}" class="ml-auto">
<button type="button" class="btn btn-outline-danger px-2 py-0"><i class="fa fa-trash"></i></button>
</a>
</div>
{%if dict_queue['description']%}
<div class="text-info"><small>{{dict_queue['description']}}</small></div>
{%endif%}
</td>
<td>{{dict_queue['last_updated']}}</td>
<td>
<div class="d-xl-flex justify-content-xl-center">
<input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{dict_queue['uuid']}}" value="{{dict_queue['size_limit']}}" min="0" required="">
<button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{dict_queue['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{dict_queue['uuid']}}').val();">Change Max Size</button>
</div>
</td>
<td>
<a href="{{ url_for('empty_analyzer_queue') }}?redirect=1&type=254&metatype_name={{dict_queue['extended_type']}}&analyzer_uuid={{dict_queue['uuid']}}">
<button type="button" class="btn btn-outline-danger"><i class="fa fa-eraser"></i></button>
</a>
<button type="button" class="btn btn-outline-info ml-xl-3" onclick="get_analyser_sample('{{dict_queue['extended_type']}}', '{{dict_queue['uuid']}}');"><i class="fa fa-database"></i> {{dict_queue['length']}}</button>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div class="col-xl-2">
<a href="{{ url_for('analyzer_queue.create_analyzer_queue') }}" class="ml-auto">
<button type="button" class="btn btn-primary"><i class="fa fa-plus"></i> Add New Analyzer Queue</button>
</a>
</div>
</div>
@ -288,6 +385,9 @@
<script>
var table
$(document).ready(function(){
$('#extended_type_name').hide()
$('#analyzer_metatype_name').hide()
$("#nav-server").addClass("active");
table = $('#myTable_').DataTable(
{
@ -307,6 +407,44 @@ $(document).ready(function(){
});
var tbody = $("#table_accepted_extended_type_tbody");
if (tbody.children().length == 0) {
$("#table_accepted_extended_type").hide();
} else {
table = $('#table_accepted_extended_type').DataTable(
{
"order": [[ 0, "asc" ]]
}
);
}
var tbody = $("#analyzer_accepted_extended_types_tbody");
if (tbody.children().length == 0) {
$("#analyzer_accepted_extended_types").hide();
} else {
table = $('#analyzer_accepted_extended_types').DataTable(
{
"order": [[ 0, "asc" ]]
}
);
}
$('#accepted_type').on('input', function() {
if ($('#accepted_type').val() == 2 || $('#accepted_type').val() == 254){
$('#extended_type_name').show()
} else {
$('#extended_type_name').hide()
}
});
$('#analyzer_type').on('input', function() {
if ($('#analyzer_type').val() == 2 || $('#analyzer_type').val() == 254){
$('#analyzer_metatype_name').show()
} else {
$('#analyzer_metatype_name').hide()
}
});
function get_analyser_sample(type, analyzer_uuid, max_line_len){
$.getJSON( "{{url_for('get_analyser_sample')}}?type="+type+"&analyzer_uuid="+analyzer_uuid+"&max_line_len="+max_line_len, function( data ) {
@ -320,4 +458,11 @@ function change_analyser_sample_max_len(){
var analyzer_data_info=$('#modal_analyser_sample_label').text().split(":");
get_analyser_sample(analyzer_data_info[1], analyzer_data_info[2], $('#max_line_len').val());
}
function generate_new_uuid(){
$.getJSON( "{{url_for('generate_uuid')}}", function( data ) {
console.log(data['uuid'])
$( "#analyzer_uuid" ).val(data['uuid']);
});
}
</script>

View File

@ -0,0 +1,47 @@
<div class="col-12 col-lg-2 p-0 bg-light border-right" id="side_menu">
<button type="button" class="btn btn-outline-secondary mt-1 ml-3" onclick="toggle_sidebar();">
<i class="fa fa-align-left"></i>
<span>Toggle Sidebar</span>
</button>
<nav class="navbar navbar-expand navbar-light bg-light flex-md-column flex-row align-items-start py-2" id="nav_menu">
<h5 class="d-flex text-muted w-100 py-2" id="nav_my_profile">
<span>My Profile</span>
</h5>
<ul class="nav flex-md-column flex-row navbar-nav justify-content-between w-100"> <!--nav-pills-->
<li class="nav-item">
<a class="nav-link" href="{{url_for('settings.edit_profile')}}" id="nav_edit_profile">
<i class="fa fa-user"></i>
<span>My Profile</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{url_for('change_password')}}" id="nav_dashboard">
<i class="fa fa-key"></i>
<span>Change Password</span>
</a>
</li>
</ul>
{% if admin_level %}
<h5 class="d-flex text-muted w-100 py-2" id="nav_user_management">
<span>User Management</span>
</h5>
<ul class="nav flex-md-column flex-row navbar-nav justify-content-between w-100"> <!--nav-pills-->
<li class="nav-item">
<a class="nav-link" href="{{url_for('settings.create_user')}}" id="nav_create_user">
<i class="fa fa-user-plus"></i>
<span>Create User</span>
</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{url_for('settings.users_list')}}" id="nav_users_list">
<i class="fa fa-users"></i>
<span>Users List</span>
</a>
</li>
</ul>
{% endif %}
</nav>
</div>

View File

@ -0,0 +1,125 @@
<!DOCTYPE html>
<html>
<head>
<title>D4-Project</title>
<link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<style>
.edit_icon:hover{
cursor: pointer;
color: #17a2b8;
}
.trash_icon:hover{
cursor: pointer;
color: #c82333;
}
</style>
</head>
<body>
{% include 'navbar.html' %}
<div class="container-fluid">
<div class="row">
{% include 'sidebar_settings.html' %}
<div class="col-12 col-lg-10" id="core_content">
{% if new_user %}
<div class="text-center my-3 ">
<div class="card">
<div class="card-header">
{% if new_user['edited']=='True' %}
<h5 class="card-title">User Edited</h5>
{% else %}
<h5 class="card-title">User Created</h5>
{% endif %}
</div>
<div class="card-body">
<p>User: {{new_user['email']}}</p>
<p>Password: {{new_user['password']}}</p>
<a href="{{url_for('settings.users_list')}}" class="btn btn-primary"><i class="fa fa-eye-slash"></i> Hide</a>
</div>
</div>
</div>
{% endif %}
<div class="table-responsive mt-1 table-hover table-borderless table-striped">
<table class="table">
<thead class="thead-dark">
<tr>
<th>Email</th>
<th>Role</th>
<th>API Key</th>
<th>Actions</th>
</tr>
</thead>
<tbody id="tbody_last_crawled">
{% for user in all_users %}
<tr>
<td>{{user['email']}}</td>
<td>{{user['role']}}</td>
<td>
{{user['api_key']}}
<a class="ml-3" href="{{url_for('settings.new_token_user')}}?user_id={{user['email']}}"><i class="fa fa-random"></i></a>
</td>
<td>
<a href="{{ url_for('settings.edit_user')}}?user_id={{user['email']}}">
<i class="fa fa-pencil edit_icon"></i>
</a>
<a href="{{ url_for('settings.delete_user')}}?user_id={{user['email']}}" class="ml-4">
<i class="fa fa-trash trash_icon"></i>
</a>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
</div>
</div>
{% include 'navfooter.html' %}
</body>
<script>
$(document).ready(function(){
$("#nav-settings").addClass("active");
$("#nav_users_list").addClass("active");
$("#nav_user_management").removeClass("text-muted");
} );
function toggle_sidebar(){
if($('#nav_menu').is(':visible')){
$('#nav_menu').hide();
$('#side_menu').removeClass('border-right')
$('#side_menu').removeClass('col-lg-2')
$('#core_content').removeClass('col-lg-10')
}else{
$('#nav_menu').show();
$('#side_menu').addClass('border-right')
$('#side_menu').addClass('col-lg-2')
$('#core_content').addClass('col-lg-10')
}
}
</script>
</html>

View File

@ -7,44 +7,59 @@
<!-- Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
<link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
<!-- JS -->
<script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
<script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
<script src="{{ url_for('static', filename='js/d3v5.min.js')}}"></script>
<style>
.edit_icon:hover{
cursor: pointer;
color: #17a2b8;
}
</style>
</head>
<body>
<nav class="navbar navbar-expand-sm navbar-dark bg-dark">
<a class="navbar-brand" href="{{ url_for('index') }}">
<img src="{{ url_for('static', filename='img/d4-logo.png')}}" alt="D4 Project" style="width:80px;">
</a>
<ul class="navbar-nav">
<li class="nav-item">
<a class="nav-link mr-3" href="{{ url_for('index') }}">Home <span class="sr-only">(current)</span></a>
</li>
<li class="nav-item mr-3">
<a class="nav-link mr-3" href="{{ url_for('sensors_status') }}">Sensors Status</a>
</li>
<li class="nav-item mr-3">
<a class="nav-link" href="{{ url_for('server_management') }}" tabindex="-1" aria-disabled="true">Server Management</a>
</li>
</ul>
</nav>
{% include 'navbar.html' %}
<div class="card text-center mt-3 ml-2 mr-2">
<div class="card-header bg-dark text-white">
UUID: {{uuid_sensor}}
</div>
<div class="card-body">
<div class="mb-2">
<span id="description-text-block">
<span id="description-text">{{data_uuid['description']}}</span>
<span onclick="show_edit_description();">
<i class="fa fa-pencil edit_icon"></i>
</span>
</span>
<span id="description-edit-block" hidden>
<div class="input-group">
<input class="form-control" type="text" id="input-description" value="{{data_uuid['description']}}">
<div class="input-group-append">
<button class="btn btn-info" onclick="edit_description();">
<i class="fa fa-pencil edit_icon"></i>
</button>
</div>
</div>
</span>
</div>
<div class="card-group">
<div class="card">
<div class="card-header bg-info text-white">
First Seen
</div>
<div class="card-body">
<p class="card-text">{{data_uuid['first_seen_gmt']}} - ({{data_uuid['first_seen']}})</p>
<p class="card-text">{{data_uuid['first_seen_gmt']}}</p>
</div>
</div>
<div class="card">
@ -52,7 +67,7 @@
Last Seen
</div>
<div class="card-body">
<p class="card-text">{{data_uuid['last_seen_gmt']}} - ({{data_uuid['last_seen']}})</p>
<p class="card-text">{{data_uuid['last_seen_gmt']}}</p>
</div>
</div>
<div class="card">
@ -73,6 +88,11 @@
<div style="color:Green; display:inline-block">
<i class="fa fa-check-circle"></i> Connected
</div>
<div>
<a href="{{ url_for('kick_uuid') }}?uuid={{uuid_sensor}}" {% if data_uuid['temp_blacklist_uuid'] %}style="pointer-events: none;"{% endif %}>
<button type="button" class="btn btn-outline-info" {% if data_uuid['temp_blacklist_uuid'] %}disabled{% endif %}>Kick UUID</button>
</a>
</div>
{% endif %}
</div>
</div>
@ -81,6 +101,18 @@
</div>
</div>
<div class="d-flex justify-content-center mt-2">
{% if not data_uuid.get('is_monitored', False) %}
<a href="{{ url_for('D4_sensors.add_sensor_to_monitor') }}?uuid={{uuid_sensor}}">
<button type="button" class="btn btn-primary">Monitor Sensor</button>
</a>
{% else %}
<a href="{{ url_for('D4_sensors.delete_sensor_to_monitor') }}?uuid={{uuid_sensor}}">
<button type="button" class="btn btn-danger">Remove Sensor from monitoring</button>
</a>
{% endif %}
</div>
<div class="card-deck justify-content-center ml-0 mr-0">
<div class="card border-dark mt-3" style="max-width: 18rem;">
<div class="card-body text-dark">
@ -131,8 +163,85 @@
</div>
</div>
<div>
<div class="card text-center mt-3 mx-3">
<div class="card-header bg-dark text-white">
Types Used:
</div>
<div class="row ml-0 mr-0">
<div class="col-xl-4">
<div class="mt-2">
<table class="table table-striped table-bordered table-hover" id="myTable_1">
<thead class="thead-dark">
<tr>
<th>Type</th>
<th style="max-width: 800px;">first seen</th>
<th style="max-width: 800px;">last seen</th>
</tr>
</thead>
<tbody>
{% for type in uuid_all_type %}
<tr>
<td>{{type['type']}}</td>
<td>{{type['first_seen']}}</td>
<td>{{type['last_seen']}}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div class="col-xl-8">
<div id="barchart_type">
</div>
</div>
</div>
</div>
</div>
<div>
<div class="card text-center mt-3 mx-3">
<div class="card-header bg-dark text-white">
Data Saved:
</div>
<div class="row ml-0 mr-0">
<div class="col-xl-4">
<div class="mt-2">
<table class="table table-striped table-bordered table-hover" id="myTable_2">
<thead class="thead-dark">
<tr>
<th>Type</th>
<th style="max-width: 800px;">Size (KB)</th>
<th style="max-width: 800px;">Nb Files</th>
</tr>
</thead>
<tbody>
{% for type_stats in disk_stats %}
<tr>
<td>{{type_stats}}</td>
<td>{{disk_stats[type_stats]['total_size']}}</td>
<td>{{disk_stats[type_stats]['nb_files']}}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
</div>
<div class="col-xl-8">
<input value="nb_files" id="type_stats_disk" hidden>
<h4 id="stats_disk_title">Number of files saved on disk:</h4>
<div id="barchart_type_disk">
</div>
<button type="button" id="stats_disk_btn" class="btn btn-outline-secondary mt-1" onclick="swap_stats_type();">
Show Size Chart
</button>
</div>
</div>
</div>
</div>
<div class="row ml-0 mr-0">
<div class="col-6">
<div class="col-lg-6">
<div class="card text-center mt-3">
<div class="card-header bg-dark text-white">
Last IP Used:
@ -146,7 +255,7 @@
</ul>
</div>
</div>
<div class="col-6">
<div class="col-lg-6">
<div class="d-none card mt-3 mb-3" id="whois_data">
<div class="card-header bg-dark text-center text-white">
Whois Info:
@ -161,13 +270,252 @@
</body>
<script>
function get_whois_data(ip){
var chart = {};
$(document).ready(function(){
$('#description-edit-block').hide();
$('#description-edit-block').removeAttr("hidden")
$.getJSON( "{{url_for('whois_data')}}?ip="+ip, function( data ) {
console.log(data)
$( "#whois_data" ).removeClass( "d-none" );
$( "#whois_output" ).text(data);
table = $('#myTable_1').DataTable(
{
"aLengthMenu": [[5, 10, 15, 20, -1], [5, 10, 15, 20, "All"]],
"iDisplayLength": 10,
"order": [[ 0, "asc" ]]
}
);
table = $('#myTable_2').DataTable(
{
"aLengthMenu": [[5, 10, 15, 20, -1], [5, 10, 15, 20, "All"]],
"iDisplayLength": 10,
"order": [[ 0, "asc" ]]
}
);
chart.stackBarChart1 =barchart_type_stack("{{ url_for('get_uuid_type_history_json') }}?uuid_sensor={{uuid_sensor}}", '#barchart_type');
chart.stackBarChart2 =barchart_type_stack("{{ url_for('get_uuid_stats_history_json') }}?uuid_sensor={{uuid_sensor}}", '#barchart_type_disk');
});
chart.onResize();
$(window).on("resize", function() {
chart.onResize();
});
$('[data-toggle="popover"]').popover({
placement: 'top',
container: 'body',
html : true,
});
});
function get_whois_data(ip){
$.getJSON( "{{url_for('whois_data')}}?ip="+ip, function( data ) {
$( "#whois_data" ).removeClass( "d-none" );
$( "#whois_output" ).text(data);
});
}
function swap_stats_type(){
var stats_value = $('#type_stats_disk').val();
if(stats_value==='nb_files'){
$('#type_stats_disk').val('total_size');
$('#stats_disk_title').text('Size of files saved on disk:');
$('#stats_disk_btn').text('Show # Files Chart');
stats_value = 'total_size';
} else {
$('#type_stats_disk').val('nb_files');
$('#stats_disk_title').text('Number of files saved on disk:');
$('#stats_disk_btn').text('Show Size Chart');
stats_value = 'nb_files';
}
$('#barchart_type_disk').children().remove();
url_json_stats = "{{ url_for('get_uuid_stats_history_json') }}?uuid_sensor={{uuid_sensor}}&stats=" + stats_value;
chart.stackBarChart2 =barchart_type_stack(url_json_stats, '#barchart_type_disk');
chart.onResize();
}
function show_edit_description(){
$('#description-text-block').hide();
$('#description-edit-block').show();
}
function edit_description(){
var new_description = $('#input-description').val()
var data_to_send = { uuid: "{{uuid_sensor}}", "description": new_description}
$.get("{{ url_for('uuid_change_description') }}", data_to_send, function(data, status){
if(status == "success") {
$('#description-text').text(new_description)
$('#description-edit-block').hide();
$('#description-text-block').show();
}
});
}
</script>
<script>
var margin = {top: 20, right: 90, bottom: 55, left: 0},
width = 1000 - margin.left - margin.right,
height = 500 - margin.top - margin.bottom;
function barchart_type_stack(url, id) {
var x = d3.scaleBand().rangeRound([0, width]).padding(0.1);
var y = d3.scaleLinear().rangeRound([height, 0]);
var xAxis = d3.axisBottom(x);
var yAxis = d3.axisLeft(y);
var color = d3.scaleOrdinal(d3.schemeSet3);
var svg = d3.select(id).append("svg")
.attr("id", "thesvg")
.attr("viewBox", "0 0 "+width+" 500")
.attr("width", width + margin.left + margin.right)
.attr("height", height + margin.top + margin.bottom)
.append("g")
.attr("transform", "translate(" + margin.left + "," + margin.top + ")");
d3.json(url)
.then(function(data){
var labelVar = 'date'; //A
var varNames = d3.keys(data[0])
.filter(function (key) { return key !== labelVar;}); //B
data.forEach(function (d) { //D
var y0 = 0;
d.mapping = varNames.map(function (name) {
return {
name: name,
label: d[labelVar],
y0: y0,
y1: y0 += +d[name]
};
});
d.total = d.mapping[d.mapping.length - 1].y1;
});
x.domain(data.map(function (d) { return (d.date); })); //E
y.domain([0, d3.max(data, function (d) { return d.total; })]);
svg.append("g")
.attr("class", "x axis")
.attr("transform", "translate(0," + height + ")")
.call(xAxis)
.selectAll("text")
.attr("class", "bar")
.on("click", function (d) { window.location.href = "#" })
.attr("transform", "rotate(-18)" )
//.attr("transform", "rotate(-40)" )
.style("text-anchor", "end");
svg.append("g")
.attr("class", "y axis")
.call(yAxis)
.append("text")
.attr("transform", "rotate(-90)")
.attr("y", 6)
.attr("dy", ".71em")
.style("text-anchor", "end");
var selection = svg.selectAll(".series")
.data(data)
.enter().append("g")
.attr("class", "series")
.attr("transform", function (d) { return "translate(" + x((d.date)) + ",0)"; });
selection.selectAll("rect")
.data(function (d) { return d.mapping; })
.enter().append("rect")
.attr("class", "bar_stack")
.attr("width", x.bandwidth())
.attr("y", function (d) { return y(d.y1); })
.attr("height", function (d) { return y(d.y0) - y(d.y1); })
.style("fill", function (d) { return color(d.name); })
.style("stroke", "grey")
.on("mouseover", function (d) { showPopover.call(this, d); })
.on("mouseout", function (d) { removePopovers(); })
.on("click", function(d){ window.location.href = "#" });
data.forEach(function(d) {
if(d.total != 0){
svg.append("text")
.attr("class", "bar")
.attr("dy", "-.35em")
.attr('x', x(d.date) + x.bandwidth()/2)
.attr('y', y(d.total))
.on("click", function () {window.location.href = "#" })
.style("text-anchor", "middle")
.text(d.total);
}
});
drawLegend(varNames);
});
function drawLegend (varNames) {
var legend = svg.selectAll(".legend")
.data(varNames.slice().reverse())
.enter().append("g")
.attr("class", "legend")
.attr("transform", function (d, i) { return "translate(0," + i * 20 + ")"; });
legend.append("rect")
.attr("x", 943)
.attr("width", 10)
.attr("height", 10)
.style("fill", color)
.style("stroke", "grey");
legend.append("text")
.attr("class", "svgText")
.attr("x", 941)
.attr("y", 6)
.attr("dy", ".35em")
.style("text-anchor", "end")
.text(function (d) { return d; });
}
function showPopover (d) {
$(this).popover({
title: d.name,
placement: 'top',
container: 'body',
trigger: 'manual',
html : true,
content: function() {
return d.label +
"<br/>num: " + d3.format(",")(d.value ? d.value: d.y1 - d.y0); }
});
$(this).popover('show')
}
}
function removePopovers () {
$('.popover').each(function() {
$(this).remove();
});
}
function resize_chart_by_id(id_chart) {
var aspect = width / height, chart = $(id_chart).children();
var targetWidth = chart.parent().width();
chart.attr("width", targetWidth);
chart.attr("height", targetWidth / 2);
}
chart.onResize = function () {
resize_chart_by_id("#barchart_type");
resize_chart_by_id("#barchart_type_disk");
}
window.chart = chart;
</script>
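The stacking pass in the chart code above walks each row once, accumulating a running offset so every series gets a `[y0, y1)` band and the row total is the last `y1`. A minimal sketch of the same computation (in Python, for illustration; `stack_rows` is a hypothetical name, not part of the repository):

```python
def stack_rows(rows, label_var="date"):
    # Mirrors the D3 mapping step: every non-label field becomes a stacked
    # segment with a start offset (y0) and end offset (y1).
    stacked = []
    for row in rows:
        names = [k for k in row if k != label_var]
        y0 = 0
        mapping = []
        for name in names:
            y1 = y0 + row[name]
            mapping.append({"name": name, "y0": y0, "y1": y1})
            y0 = y1
        stacked.append({
            "date": row[label_var],
            "mapping": mapping,
            "total": mapping[-1]["y1"] if mapping else 0,
        })
    return stacked
```

The row `total` is what the chart prints above each bar when it is non-zero.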

View File

@@ -5,6 +5,7 @@ set -e
BOOTSTRAP_VERSION='4.2.1'
FONT_AWESOME_VERSION='4.7.0'
D3_JS_VERSION='4.13.0'
D3_JS_VERSIONv5='5.9.2'
if [ ! -d static/css ]; then
mkdir static/css
@@ -19,38 +20,48 @@ fi
rm -rf temp
mkdir temp
mkdir temp/d3v5/
wget https://github.com/twbs/bootstrap/releases/download/v${BOOTSTRAP_VERSION}/bootstrap-${BOOTSTRAP_VERSION}-dist.zip -O temp/bootstrap${BOOTSTRAP_VERSION}.zip
wget https://github.com/FortAwesome/Font-Awesome/archive/v${FONT_AWESOME_VERSION}.zip -O temp/FONT_AWESOME_${FONT_AWESOME_VERSION}.zip
wget https://github.com/d3/d3/releases/download/v${D3_JS_VERSION}/d3.zip -O temp/d3_${D3_JS_VERSION}.zip
wget https://github.com/d3/d3/releases/download/v${D3_JS_VERSIONv5}/d3.zip -O temp/d3v5/d3_${D3_JS_VERSIONv5}.zip
wget https://github.com/FezVrasta/popper.js/archive/v1.14.3.zip -O temp/popper.zip
# dateRangePicker
#wget https://github.com/moment/moment/archive/2.22.2.zip -O temp/moment_2.22.2.zip
#wget https://github.com/longbill/jquery-date-range-picker/archive/v0.18.0.zip -O temp/daterangepicker_v0.18.0.zip
wget https://github.com/moment/moment/archive/2.22.2.zip -O temp/moment_2.22.2.zip
wget https://github.com/longbill/jquery-date-range-picker/archive/v0.18.0.zip -O temp/daterangepicker_v0.18.0.zip
unzip temp/bootstrap${BOOTSTRAP_VERSION}.zip -d temp/
unzip temp/FONT_AWESOME_${FONT_AWESOME_VERSION}.zip -d temp/
unzip temp/d3_${D3_JS_VERSION}.zip -d temp/
unzip temp/d3v5/d3_${D3_JS_VERSIONv5}.zip -d temp/d3v5/
unzip temp/popper.zip -d temp/
#unzip temp/moment_2.22.2.zip -d temp/
#unzip temp/daterangepicker_v0.18.0.zip -d temp/
unzip temp/moment_2.22.2.zip -d temp/
unzip temp/daterangepicker_v0.18.0.zip -d temp/
mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/js/bootstrap.min.js ./static/js/
mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/css/bootstrap.min.css ./static/css/
mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/css/bootstrap.min.css.map ./static/css/
mv temp/floating-ui-1.14.3/dist/umd/popper.min.js ./static/js/
mv temp/floating-ui-1.14.3/dist/umd/popper.min.js.map ./static/js/
mv temp/Font-Awesome-${FONT_AWESOME_VERSION} temp/font-awesome
rm -rf ./static/fonts/ ./static/font-awesome/
mv temp/font-awesome/ ./static/
#mv temp/jquery-date-range-picker-0.18.0/dist/daterangepicker.min.css ./static/css/
mv temp/jquery-date-range-picker-0.18.0/dist/daterangepicker.min.css ./static/css/
mv temp/d3.min.js ./static/js/
#mv temp/moment-2.22.2/min/moment.min.js ./static/js/
#mv temp/jquery-date-range-picker-0.18.0/dist/jquery.daterangepicker.min.js ./static/js/
cp temp/d3v5/d3.min.js ./static/js/d3v5.min.js
mv temp/moment-2.22.2/min/moment.min.js ./static/js/
mv temp/jquery-date-range-picker-0.18.0/dist/jquery.daterangepicker.min.js ./static/js/
rm -rf temp

View File

@@ -10,6 +10,9 @@ import datetime
import signal
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
class GracefulKiller:
kill_now = False
def __init__(self):
@@ -45,27 +48,12 @@ def compress_file(file_full_path, session_uuid,i=0):
analyser_queue_max_size = analyzer_list_max_default_size
redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
host_redis_stream = "localhost"
port_redis_stream = 6379
host_redis_metadata = "localhost"
port_redis_metadata = 6380
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
redis_server_metadata = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=0)
redis_server_analyzer = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=2)
### Config ###
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
config_loader = None
type = 1
sleep_time = 300
@@ -96,13 +84,13 @@ if __name__ == "__main__":
new_date = datetime.datetime.now().strftime("%Y%m%d")
# get all directory files
all_files = os.listdir(worker_data_directory)
not_compressed_file = []
# filter: get all not compressed files
for file in all_files:
if file.endswith('.cap'):
not_compressed_file.append(os.path.join(worker_data_directory, file))
if os.path.isdir(worker_data_directory):
all_files = os.listdir(worker_data_directory)
for file in all_files:
if file.endswith('.cap'):
not_compressed_file.append(os.path.join(worker_data_directory, file))
if not_compressed_file:
### check time-change (minus one hour) ###

View File

@@ -9,11 +9,15 @@ import shutil
import datetime
import subprocess
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
def data_incorrect_format(stream_name, session_uuid, uuid):
redis_server_stream.sadd('Error:IncorrectType:{}'.format(type), session_uuid)
redis_server_stream.sadd('Error:IncorrectType', session_uuid)
redis_server_metadata.hset('metadata_uuid:{}'.format(uuid), 'Error', 'Error: Type={}, Incorrect file format'.format(type))
clean_stream(stream_name, session_uuid)
print('Incorrect format')
print('Incorrect format, uuid={}'.format(uuid))
sys.exit(1)
def clean_stream(stream_name, session_uuid):
@@ -36,35 +40,21 @@ def compress_file(file_full_path, i=0):
shutil.copyfileobj(f_in, f_out)
os.remove(file_full_path)
# save full path in analyzer queue
for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
analyzer_uuid = analyzer_uuid.decode()
redis_server_analyzer.lpush('analyzer:{}:{}'.format(type, analyzer_uuid), compressed_filename)
redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
if analyser_queue_max_size is None:
analyser_queue_max_size = analyzer_list_max_default_size
redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
Analyzer_Queue.add_data_to_queue(uuid, type, compressed_filename)
host_redis_stream = "localhost"
port_redis_stream = 6379
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
host_redis_metadata = "localhost"
port_redis_metadata = 6380
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
redis_server_metadata = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=0)
redis_server_analyzer = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=2)
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
config_loader = None
type = 1
tcp_dump_cycle = '300'
@@ -88,15 +78,15 @@ if __name__ == "__main__":
if res:
uuid = res[0][1][0][1][b'uuid'].decode()
date = datetime.datetime.now().strftime("%Y%m%d")
tcpdump_path = os.path.join('../../data', uuid, str(type))
full_tcpdump_path = os.path.join(os.environ['D4_HOME'], 'data', uuid, str(type))
tcpdump_path = os.path.join(data_directory, uuid, str(type))
full_tcpdump_path = os.path.join(data_directory, uuid, str(type))
rel_path = os.path.join(tcpdump_path, date[0:4], date[4:6], date[6:8])
if not os.path.isdir(rel_path):
os.makedirs(rel_path)
print('---- worker launched, uuid={} session_uuid={}'.format(uuid, session_uuid))
print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
else:
print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
sys.exit(1)
print('Incorrect message')
redis_server_stream.sadd('working_session_uuid:{}'.format(type), session_uuid)
#LAUNCH a tcpdump
@@ -130,6 +120,8 @@ if __name__ == "__main__":
Error_message = process.stderr.read()
if Error_message == b'tcpdump: unknown file format\n':
data_incorrect_format(stream_name, session_uuid, uuid)
elif Error_message:
print(Error_message)
#print(process.stdout.read())
nb_save += 1
@@ -144,7 +136,8 @@ if __name__ == "__main__":
# success, all data are saved
if redis_server_stream.sismember('ended_session', session_uuid):
out, err = process.communicate(timeout= 0.5)
#print(out)
#if out:
# print(out)
if err == b'tcpdump: unknown file format\n':
data_incorrect_format(stream_name, session_uuid, uuid)
elif err:
@@ -156,16 +149,16 @@ if __name__ == "__main__":
except subprocess.TimeoutExpired:
process_compressor.kill()
### compress all files ###
date = datetime.datetime.now().strftime("%Y%m%d")
worker_data_directory = os.path.join(full_tcpdump_path, date[0:4], date[4:6], date[6:8])
all_files = os.listdir(worker_data_directory)
all_files.sort()
if all_files:
for file in all_files:
if file.endswith('.cap'):
full_path = os.path.join(worker_data_directory, file)
if redis_server_stream.get('data_in_process:{}'.format(session_uuid)) != full_path:
compress_file(full_path)
if os.path.isdir(worker_data_directory):
all_files = os.listdir(worker_data_directory)
all_files.sort()
if all_files:
for file in all_files:
if file.endswith('.cap'):
full_path = os.path.join(worker_data_directory, file)
if redis_server_stream.get('data_in_process:{}'.format(session_uuid)) != full_path:
compress_file(full_path)
### ###
#print(process.stderr.read())
@@ -177,7 +170,7 @@ if __name__ == "__main__":
redis_server_stream.delete('data_in_process:{}'.format(session_uuid))
# make sure that tcpdump can save all data
time.sleep(10)
print('---- tcpdump DONE, uuid={} session_uuid={}'.format(uuid, session_uuid))
print('---- tcpdump DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
sys.exit(0)
else:
time.sleep(10)

View File

@@ -6,19 +6,19 @@ import time
import redis
import subprocess
host_redis_stream = "localhost"
port_redis_stream = 6379
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
type = 1
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
print('Error: Redis server {}, ConnectionError'.format("Redis_STREAM"))
sys.exit(1)
if __name__ == "__main__":

View File

@@ -0,0 +1,149 @@
#!/usr/bin/env python3
import os
import sys
import time
import gzip
import redis
import shutil
import datetime
import signal
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
class GracefulKiller:
kill_now = False
def __init__(self):
signal.signal(signal.SIGINT, self.exit_gracefully)
signal.signal(signal.SIGTERM, self.exit_gracefully)
def exit_gracefully(self,signum, frame):
self.kill_now = True
def compress_file(file_full_path, session_uuid,i=0):
redis_server_stream.set('data_in_process:{}'.format(session_uuid), file_full_path)
if i==0:
compressed_filename = '{}.gz'.format(file_full_path)
else:
compressed_filename = '{}.{}.gz'.format(file_full_path, i)
if os.path.isfile(compressed_filename):
compress_file(file_full_path, session_uuid, i+1)
else:
with open(file_full_path, 'rb') as f_in:
with gzip.open(compressed_filename, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
try:
os.remove(file_full_path)
except FileNotFoundError:
pass
# save full path in analyzer queue
for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
analyzer_uuid = analyzer_uuid.decode()
redis_server_analyzer.lpush('analyzer:{}:{}'.format(type, analyzer_uuid), compressed_filename)
redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
if analyser_queue_max_size is None:
analyser_queue_max_size = analyzer_list_max_default_size
redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
### Config ###
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
config_loader = None
### ###
type = 1
sleep_time = 300
analyzer_list_max_default_size = 10000
if __name__ == "__main__":
killer = GracefulKiller()
if len(sys.argv) != 4:
print('usage:', 'Worker.py', 'session_uuid', 'tcpdump', 'date')
exit(1)
# TODO: sanitise input
session_uuid = sys.argv[1]
directory_data_uuid = sys.argv[2]
date = sys.argv[3]
worker_data_directory = os.path.join(directory_data_uuid, date[0:4], date[4:6], date[6:8])
full_datetime = datetime.datetime.now().strftime("%Y%m%d%H")
current_file = None
time_change = False
while True:
if killer.kill_now:
break
new_date = datetime.datetime.now().strftime("%Y%m%d")
# get all directory files
all_files = os.listdir(worker_data_directory)
not_compressed_file = []
# filter: get all not compressed files
for file in all_files:
if file.endswith('.cap'):
not_compressed_file.append(os.path.join(worker_data_directory, file))
if not_compressed_file:
### check time-change (minus one hour) ###
new_full_datetime = datetime.datetime.now().strftime("%Y%m%d%H")
if new_full_datetime < full_datetime:
# sort list, last modified
not_compressed_file.sort(key=os.path.getctime)
else:
# sort list
not_compressed_file.sort()
### ###
# new day
if date != new_date:
# compress all file
for file in not_compressed_file:
if killer.kill_now:
break
compress_file(file, session_uuid)
# reset file tracker
current_file = None
date = new_date
# update worker_data_directory
worker_data_directory = os.path.join(directory_data_uuid, date[0:4], date[4:6], date[6:8])
# restart
continue
# file used by tcpdump
max_file = not_compressed_file[-1]
full_datetime = new_full_datetime
# Init: set current_file
if not current_file:
current_file = max_file
#print('max_file set: {}'.format(current_file))
# new file created
if max_file != current_file:
# get all previous files
for file in not_compressed_file:
if file != max_file:
if killer.kill_now:
break
#print('new file: {}'.format(file))
compress_file(file, session_uuid)
# update current_file tracker
current_file = max_file
if killer.kill_now:
break
time.sleep(sleep_time)

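The recursive naming in `compress_file` above probes `<file>.gz`, then `<file>.1.gz`, `<file>.2.gz`, and so on until it finds a free name, so re-runs never overwrite an earlier archive. A minimal sketch of just that naming logic (a hypothetical helper, not in the repository; the `exists` parameter is injected so the logic can be exercised without touching disk):

```python
import os

def next_compressed_name(file_full_path, exists=os.path.isfile, i=0):
    # i == 0 gives "<file>.gz"; any later attempt gives "<file>.<i>.gz".
    if i == 0:
        name = '{}.gz'.format(file_full_path)
    else:
        name = '{}.{}.gz'.format(file_full_path, i)
    # If the candidate already exists, recurse with the next suffix.
    if exists(name):
        return next_compressed_name(file_full_path, exists, i + 1)
    return name
```

Injecting `exists` keeps the collision-handling testable while defaulting to the same `os.path.isfile` check the worker performs.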
View File

@@ -0,0 +1,283 @@
#!/usr/bin/env python3
import os
import sys
import time
import json
import gzip
import redis
import shutil
import datetime
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
DEFAULT_FILE_EXTENSION = 'txt'
DEFAULT_FILE_SEPARATOR = b'\n'
ROTATION_SAVE_CYCLE = 300 # seconds
MAX_BUFFER_LENGTH = 100000
TYPE = 254
# CONFIG #
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
analyzer_list_max_default_size = 10000
class MetaTypesDefault:
def __init__(self, uuid, json_file):
self.uuid = uuid
self.type_name = json_file['type']
self.save_path = None
self.buffer = b''
self.file_rotation_mode = True
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
self.data_directory = data_directory
self.parse_json(json_file)
def test(self):
print('class: MetaTypesDefault')
######## JSON PARSER ########
def parse_json(self, json_file):
self.file_rotation = False
self.file_separator = b'\n'
self.filename = b''.join([self.type_name.encode(), b'.txt'])
######## PROCESS FUNCTIONS ########
def process_data(self, data):
# save data on disk
self.save_rotate_file(data)
# do something with the data (send to analyzer queue by default)
self.reconstruct_data(data)
######## CORE FUNCTIONS ########
def check_json_file(self, json_file):
# the json object must contain a type field
if "type" in json_file:
return True
else:
return False
def save_json_file(self, json_file, save_by_uuid=True):
self.set_last_time_saved(time.time()) #time_file
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S")) #date_file
# update save path
self.set_save_path( os.path.join(self.get_save_dir(save_by_uuid=save_by_uuid), self.get_filename(file_extention='json', save_by_uuid=save_by_uuid)) )
# save json
with open(self.get_save_path(), 'w') as f:
f.write(json.dumps(json_file))
# update save path for 254 files type
if self.is_file_rotation_mode():
self.set_save_path( os.path.join(self.get_save_dir(), self.get_filename()) )
def save_rotate_file(self, data):
if not self.get_file_rotation():
new_date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
# check if a new file rotation is needed # # TODO: change ROTATION_SAVE_CYCLE
if ( new_date[0:8] != self.get_last_saved_date()[0:8] ) or ( int(time.time()) - self.get_last_time_saved() > ROTATION_SAVE_CYCLE ):
self.set_rotate_file(True)
# rotate file
if self.get_file_rotation():
# init save path
if self.get_save_path() is None:
self.set_last_time_saved(time.time())
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
# update save path
self.set_save_path( os.path.join(self.get_save_dir(), self.get_filename()) )
# rotate file
if self.get_file_separator() in data:
end_file, start_new_file = data.rsplit(self.get_file_separator(), maxsplit=1)
# save end of file
with open(self.get_save_path(), 'ab') as f:
f.write(end_file)
self.compress_file(self.get_save_path())
# set last saved date/time
self.set_last_time_saved(time.time())
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
# update save path
self.set_save_path( os.path.join(self.get_save_dir(), self.get_filename()) )
# save start of new file
if start_new_file != b'':
with open(self.get_save_path(), 'ab') as f:
f.write(start_new_file)
# end of rotation
self.set_rotate_file(False)
# wait file separator
else:
with open(self.get_save_path(), 'ab') as f:
f.write(data)
else:
# save file
with open(self.get_save_path(), 'ab') as f:
f.write(data)
def reconstruct_data(self, data):
# save data in buffer
self.add_to_buffer(data)
data = self.get_buffer()
# end of element found in data
if self.get_file_separator() in data:
# empty buffer
self.reset_buffer()
all_line = data.split(self.get_file_separator())
for reconstructed_data in all_line[:-1]:
if reconstructed_data != b'':
self.handle_reconstructed_data(reconstructed_data)
# save incomplete element in buffer
if all_line[-1] != b'':
self.add_to_buffer(all_line[-1])
# no elements
else:
# force file_separator when max buffer size is reached
if self.get_size_buffer() > MAX_BUFFER_LENGTH:
print('Error, infinite loop, max buffer length reached')
self.add_to_buffer(self.get_file_separator())
def handle_reconstructed_data(self, data):
# send data to analyzer
self.send_to_analyzers(data)
def compress_file(self, file_full_path, i=0):
if i==0:
compressed_filename = '{}.gz'.format(file_full_path)
else:
compressed_filename = '{}.{}.gz'.format(file_full_path, i)
if os.path.isfile(compressed_filename):
self.compress_file(file_full_path, i+1)
else:
with open(file_full_path, 'rb') as f_in:
with gzip.open(compressed_filename, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
os.remove(file_full_path)
def send_to_analyzers(self, data_to_send):
Analyzer_Queue.add_data_to_queue(self.uuid, self.get_type_name(), data_to_send)
######## GET FUNCTIONS ########
def get_type_name(self):
return self.type_name
def get_file_separator(self):
return self.file_separator
def get_uuid(self):
return self.uuid
def get_buffer(self):
return self.buffer
def get_size_buffer(self):
return len(self.buffer)
def get_filename(self, file_extention=None, save_by_uuid=False):
if file_extention is None:
file_extention = DEFAULT_FILE_EXTENSION
# File Rotation: data/<uuid>/254/<year>/<month>/<day>/
if self.is_file_rotation_mode() or save_by_uuid:
return '{}-{}-{}-{}-{}.{}'.format(self.uuid, self.get_last_saved_year(), self.get_last_saved_month(), self.get_last_saved_day(), self.get_last_saved_hour_minute(), file_extention)
def get_data_save_directory(self):
return self.data_directory
def get_save_dir(self, save_by_uuid=False):
# File Rotation, save data in directory: data/<uuid>/254/<year>/<month>/<day>/
if self.is_file_rotation_mode() or save_by_uuid:
data_directory_uuid_type = os.path.join(self.get_data_save_directory(), self.get_uuid(), str(TYPE))
return os.path.join(data_directory_uuid_type, self.get_last_saved_year(), self.get_last_saved_month(), self.get_last_saved_day() , self.type_name)
# data save in the same directory
else:
save_dir = os.path.join(self.get_data_save_directory(), 'datas', self.get_type_name())
if not os.path.isdir(save_dir):
os.makedirs(save_dir)
return save_dir
def get_save_path(self):
return self.save_path
def is_empty_buffer(self):
if self.buffer==b'':
return True
else:
return False
def is_file_rotation_mode(self):
if self.file_rotation_mode:
return True
else:
return False
def get_file_rotation(self):
return self.file_rotation
def get_last_time_saved(self):
return self.last_time_saved
def get_last_saved_date(self):
return self.last_saved_date
def get_last_saved_year(self):
return self.last_saved_date[0:4]
def get_last_saved_month(self):
return self.last_saved_date[4:6]
def get_last_saved_day(self):
return self.last_saved_date[6:8]
def get_last_saved_hour_minute(self):
return self.last_saved_date[8:14]
######## SET FUNCTIONS ########
def reset_buffer(self):
self.buffer = b''
def set_buffer(self, data):
self.buffer = data
def add_to_buffer(self, data):
self.buffer = b''.join([self.buffer, data])
def set_rotate_file(self, boolean_value):
self.file_rotation = boolean_value
def set_rotate_file_mode(self, boolean_value):
self.file_rotation_mode = boolean_value
def set_last_time_saved(self, value_time):
self.last_time_saved = int(value_time)
def set_last_saved_date(self, date):
self.last_saved_date = date
def set_save_path(self, save_path):
dir_path = os.path.dirname(save_path)
if not os.path.isdir(dir_path):
os.makedirs(dir_path)
self.save_path = save_path

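The `reconstruct_data`/buffer pair above implements a standard streaming split: bytes accumulate until a separator arrives, complete elements are emitted, and the trailing partial element is carried over to the next call. A minimal sketch of the same logic (hypothetical function name, for illustration):

```python
def split_complete_elements(buffer, data, separator=b'\n'):
    # Append the new chunk to the carry-over buffer.
    buffer += data
    # No separator yet: nothing complete, keep accumulating.
    if separator not in buffer:
        return [], buffer
    parts = buffer.split(separator)
    # Everything before the last separator is complete; drop empty elements,
    # as the worker does before handing data to handle_reconstructed_data().
    complete = [p for p in parts[:-1] if p != b'']
    # The final fragment (possibly empty) becomes the new buffer.
    return complete, parts[-1]
```

Unlike the class above, this sketch omits the `MAX_BUFFER_LENGTH` safety valve that force-injects a separator when the buffer grows without bound.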
View File

@@ -0,0 +1,80 @@
#!/usr/bin/env python3
from meta_types_modules.MetaTypesDefault import MetaTypesDefault
import hashlib
import time
import os
import datetime
import base64
import shutil
import gzip
class TypeHandler(MetaTypesDefault):
def __init__(self, uuid, json_file):
super().__init__(uuid, json_file)
self.compress = False
self.extension = ''
self.segregate = True
if "compress" in json_file:
self.compress = json_file['compress']
if "extension" in json_file:
self.extension = json_file['extension']
if "segregate" in json_file:
self.segregate = json_file['segregate']
self.set_rotate_file_mode(False)
self.saved_dir = ''
def process_data(self, data):
# Unpack the thing
self.reconstruct_data(data)
# pushing the filepath instead of the file content to the analyzer
def handle_reconstructed_data(self, data):
m = hashlib.sha256()
self.set_last_time_saved(time.time())
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
# Create folder
save_dir = os.path.join(self.get_save_dir(save_by_uuid=self.segregate), 'files')
if not os.path.isdir(save_dir):
os.makedirs(save_dir)
# write file to disk
decodeddata = base64.b64decode(data)
m.update(decodeddata)
path = os.path.join(save_dir, m.hexdigest())
path = '{}.{}'.format(path, self.extension)
with open(path, 'wb') as p:
p.write(decodeddata)
if self.compress:
compressed_filename = '{}.gz'.format(path)
with open(path, 'rb') as f_in:
with gzip.open(compressed_filename, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
os.remove(path)
self.send_to_analyzers(compressed_filename)
else:
self.send_to_analyzers(path)
def reconstruct_data(self, data):
# save data in buffer
self.add_to_buffer(data)
data = self.get_buffer()
# end of element found in data
if self.get_file_separator() in data:
# empty buffer
self.reset_buffer()
all_line = data.split(self.get_file_separator())
for reconstructed_data in all_line[:-1]:
if reconstructed_data != b'':
self.handle_reconstructed_data(reconstructed_data)
# save incomplete element in buffer
if all_line[-1] != b'':
self.add_to_buffer(all_line[-1])
def test(self):
print('Class: filewatcher')

View File

@@ -0,0 +1,38 @@
#!/usr/bin/env python3
from meta_types_modules.MetaTypesDefault import MetaTypesDefault
import hashlib
import time
import os
import datetime
class TypeHandler(MetaTypesDefault):
def __init__(self, uuid, json_file):
super().__init__(uuid, json_file)
self.set_rotate_file_mode(False)
self.saved_dir = ''
def process_data(self, data):
self.reconstruct_data(data)
# pushing the filepath instead of the file content to the analyzer
def handle_reconstructed_data(self, data):
m = hashlib.sha256()
self.set_last_time_saved(time.time())
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
# Create folder
jsons_save_dir = os.path.join(self.get_save_dir(save_by_uuid=True), 'files')
if not os.path.isdir(jsons_save_dir):
os.makedirs(jsons_save_dir)
# write json file to disk
m.update(data)
jsons_path = os.path.join(jsons_save_dir, m.hexdigest()+'.json')
with open(jsons_path, 'wb') as j:
j.write(data)
# Send data to Analyzer
self.send_to_analyzers(jsons_path)
def test(self):
print('Class: filewatcherjson')

View File

@@ -0,0 +1,67 @@
#!/usr/bin/env python3
import os
import sys
import time
import json
import redis
import datetime
import hashlib
import binascii
import redis
import pdb
from meta_types_modules.MetaTypesDefault import MetaTypesDefault
class TypeHandler(MetaTypesDefault):
def __init__(self, uuid, json_file):
super().__init__(uuid, json_file)
self.set_rotate_file_mode(False)
def process_data(self, data):
self.reconstruct_data(data)
def handle_reconstructed_data(self, data):
decoded_data = data.decode()
self.set_last_time_saved(time.time())
self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
# Create folders
cert_save_dir = os.path.join(self.get_save_dir(), 'certs')
jsons_save_dir = os.path.join(self.get_save_dir(), 'jsons')
if not os.path.isdir(cert_save_dir):
os.makedirs(cert_save_dir)
if not os.path.isdir(jsons_save_dir):
os.makedirs(jsons_save_dir)
# Extract certificates from json
try:
mtjson = json.loads(decoded_data)
res = True
except Exception as e:
print(decoded_data)
res = False
if res:
#mtjson = json.loads(decoded_data)
for certificate in mtjson["Certificates"] or []:
cert = binascii.a2b_base64(certificate["Raw"])
# one could also load this cert with
# xcert = x509.load_der_x509_certificate(cert, default_backend())
m = hashlib.sha1()
m.update(cert)
cert_path = os.path.join(cert_save_dir, m.hexdigest()+'.crt')
# write unique certificate der file to disk
with open(cert_path, 'w+b') as c:
c.write(cert)
# write json file to disk
jsons_path = os.path.join(jsons_save_dir, mtjson["Timestamp"]+'.json')
with open(jsons_path, 'w') as j:
j.write(decoded_data)
# Send data to Analyzer
self.send_to_analyzers(jsons_path)
def test(self):
print('Class: ja3-jl')

View File

@@ -0,0 +1,16 @@
#!/usr/bin/env python3
from meta_types_modules.MetaTypesDefault import MetaTypesDefault
class TypeHandler(MetaTypesDefault):
def __init__(self, uuid, json_file):
super().__init__(uuid, json_file)
self.set_rotate_file_mode(False)
self.saved_dir = ''
def process_data(self, data):
self.reconstruct_data(data)
def test(self):
print('Class: maltrail')

View File

@@ -0,0 +1,214 @@
#!/usr/bin/env python3
import os
import sys
import time
import json
import redis
import datetime
from meta_types_modules import MetaTypesDefault
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
config_loader = None
### ###
type_meta_header = 2
type_defined = 254
max_buffer_length = 100000
rotation_save_cycle = 10 #seconds
json_file_name = 'meta_json.json'
def get_class( package_class ):
parts = package_class.split('.')
module = ".".join(parts[:-1])
mod = __import__( module )
for comp in parts[1:]:
mod = getattr(mod, comp)
return mod
def check_default_json_file(json_file):
# the json object must contain a type field
if "type" in json_file:
return True
else:
return False
def on_error(session_uuid, type_error, message):
redis_server_stream.sadd('Error:IncorrectType', session_uuid)
redis_server_metadata.hset('metadata_uuid:{}'.format(uuid), 'Error', 'Error: Type={}, {}'.format(type_error, message))
clean_db(session_uuid)
print('Incorrect format')
sys.exit(1)
def clean_db(session_uuid):
clean_stream(stream_meta_json, type_meta_header, session_uuid)
clean_stream(stream_defined, type_defined, session_uuid)
redis_server_stream.srem('ended_session', session_uuid)
redis_server_stream.srem('working_session_uuid:{}'.format(type_meta_header), session_uuid)
# clean extended type (used)
redis_server_stream.hdel('map:session-uuid_active_extended_type', session_uuid)
try:
redis_server_stream.srem('active_connection_extended_type:{}'.format(uuid), extended_type)
except Exception as e:
print(e)
def clean_stream(stream_name, type, session_uuid):
redis_server_stream.srem('session_uuid:{}'.format(type), session_uuid)
#redis_server_stream.hdel('map-type:session_uuid-uuid:{}'.format(type), session_uuid)
redis_server_stream.delete(stream_name)
if __name__ == "__main__":
###################################################
if len(sys.argv) != 2:
print('usage:', 'worker.py', 'session_uuid')
exit(1)
session_uuid = sys.argv[1]
stream_meta_json = 'stream:{}:{}'.format(type_meta_header, session_uuid)
stream_defined = 'stream:{}:{}'.format(type_defined, session_uuid)
id = '0'
buffer = b''
stream_name = stream_meta_json
type = type_meta_header
# track launched worker
redis_server_stream.sadd('working_session_uuid:{}'.format(type_meta_header), session_uuid)
# get uuid
res = redis_server_stream.xread({stream_name: id}, count=1)
if res:
uuid = res[0][1][0][1][b'uuid'].decode()
print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
else:
clean_db(session_uuid)
print('Incorrect Stream, Closing worker: type={} session_uuid={} epoch={}'.format(type, session_uuid, time.time()))
sys.exit(1)
full_json = None
# active session
while full_json is None:
res = redis_server_stream.xread({stream_name: id}, count=1)
if res:
new_id = res[0][1][0][0].decode()
if id != new_id:
id = new_id
data = res[0][1][0][1]
if id and data:
# remove line from json
data[b'message'] = data[b'message'].replace(b'\n', b'')
# reconstruct data
if buffer != b'':
data[b'message'] = b''.join([buffer, data[b'message']])
buffer = b''
try:
full_json = json.loads(data[b'message'].decode())
except:
buffer += data[b'message']
# TODO: filter out overly large JSON
redis_server_stream.xdel(stream_name, id)
# complete json received
if full_json:
print(full_json)
if check_default_json_file(full_json):
# end type 2 processing
break
# incorrect JSON
else:
on_error(session_uuid, type, 'Incorrect JSON object')
else:
# end session, no json received
if redis_server_stream.sismember('ended_session', session_uuid):
clean_db(session_uuid)
print('---- Incomplete JSON object, DONE, uuid={} session_uuid={}'.format(uuid, session_uuid))
sys.exit(0)
else:
time.sleep(10)
# extract/parse JSON
extended_type = full_json['type']
if not redis_server_metadata.sismember('server:accepted_extended_type', extended_type):
error_mess = 'Unsupported extended_type: {}'.format(extended_type)
on_error(session_uuid, type, error_mess)
clean_db(session_uuid)
sys.exit(1)
# create active_connection for extended type
redis_server_stream.sadd('active_connection_extended_type:{}'.format(uuid), extended_type)
redis_server_stream.hset('map:session-uuid_active_extended_type', session_uuid, extended_type)
#### Handle Specific MetaTypes ####
# Use Specific Handler defined
if os.path.isdir(os.path.join('meta_types_modules', extended_type)):
class_type_handler = get_class('meta_types_modules.{}.{}.TypeHandler'.format(extended_type, extended_type))
type_handler = class_type_handler(uuid, full_json)
# Use Standard Handler
else:
type_handler = MetaTypesDefault.MetaTypesDefault(uuid, full_json)
#file_separator = type_handler.get_file_separator(self)
#extended_type_name = type_handler.get_file_name()
# save json on disk
type_handler.save_json_file(full_json)
# change stream_name/type
stream_name = stream_defined
type = type_defined
id = '0'
buffer = b''
type_handler.test()
# update uuid: extended type list
redis_server_metadata.sadd('all_extended_types_by_uuid:{}'.format(uuid), extended_type)
# update metadata extended type
time_val = int(time.time())
if not redis_server_metadata.hexists('metadata_extended_type_by_uuid:{}:{}'.format(uuid, extended_type), 'first_seen'):
redis_server_metadata.hset('metadata_extended_type_by_uuid:{}:{}'.format(uuid, extended_type), 'first_seen', time_val)
redis_server_metadata.hset('metadata_extended_type_by_uuid:{}:{}'.format(uuid, extended_type), 'last_seen', time_val)
# handle 254 type
while True:
res = redis_server_stream.xread({stream_name: id}, count=1)
if res:
new_id = res[0][1][0][0].decode()
if id != new_id:
id = new_id
data = res[0][1][0][1]
if id and data:
# update metadata extended type
redis_server_metadata.hset('metadata_extended_type_by_uuid:{}:{}'.format(uuid, extended_type), 'last_seen', int(time.time()) )
# process 254 data type
type_handler.process_data(data[b'message'])
# remove data from redis stream
redis_server_stream.xdel(stream_name, id)
else:
# end of session, no more data received
if redis_server_stream.sismember('ended_session', session_uuid):
clean_db(session_uuid)
print('---- JSON object, DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
sys.exit(0)
else:
time.sleep(10)
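The loop above reassembles one JSON meta-header from stream chunks: newlines are stripped, incomplete payloads stay buffered, and parsing is retried as more data arrives. A standalone sketch of that pattern (no Redis, illustrative chunk data):

```python
import json

def reassemble_json(chunks):
    """Accumulate byte chunks until the buffer parses as one JSON object.

    Mirrors the worker's buffering loop: newlines are removed from each
    chunk, a partial payload stays in the buffer, and parsing is retried
    per chunk. Returns the first complete object, else None.
    """
    buffer = b''
    for chunk in chunks:
        buffer += chunk.replace(b'\n', b'')
        try:
            return json.loads(buffer.decode())
        except (ValueError, UnicodeDecodeError):
            continue  # not yet a complete JSON document
    return None

# A meta-header JSON split across three stream messages (made-up data):
chunks = [b'{"type": "ja3', b'-jl", "encoding"', b': "utf-8"}']
meta = reassemble_json(chunks)
```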

View File

@ -0,0 +1,39 @@
#!/usr/bin/env python3
import os
import sys
import time
import redis
import subprocess
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
### ###
type = 2
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server: Redis_STREAM, ConnectionError')
sys.exit(1)
if __name__ == "__main__":
stream_name = 'stream:{}'.format(type)
redis_server_stream.delete('working_session_uuid:{}'.format(type))
while True:
for session_uuid in redis_server_stream.smembers('session_uuid:{}'.format(type)):
session_uuid = session_uuid.decode()
if not redis_server_stream.sismember('working_session_uuid:{}'.format(type), session_uuid):
process = subprocess.Popen(['./worker.py', session_uuid])
print('Launching new worker {} ... session_uuid={}'.format(type, session_uuid))
#print('.')
time.sleep(10)
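The launcher's core decision, start a worker only for sessions that have no worker yet, can be sketched with plain sets instead of the two Redis sets, using an injected `spawn` callback in place of `subprocess.Popen(['./worker.py', session_uuid])`:

```python
def launch_missing_workers(pending_sessions, working_sessions, spawn):
    """Start one worker per session_uuid that is not already being worked.

    pending_sessions stands in for the 'session_uuid:<type>' set and
    working_sessions for 'working_session_uuid:<type>'; spawn replaces
    subprocess.Popen. Returns the sessions launched this pass.
    """
    launched = []
    for session_uuid in sorted(pending_sessions):
        if session_uuid not in working_sessions:
            spawn(session_uuid)
            # in the real system the worker registers itself via SADD
            working_sessions.add(session_uuid)
            launched.append(session_uuid)
    return launched

started = launch_missing_workers({'s1', 's2'}, {'s2'}, spawn=print)
```

The real launcher repeats this every 10 seconds; the sketch covers a single pass.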

View File

@ -0,0 +1,180 @@
#!/usr/bin/env python3
import os
import sys
import time
import gzip
import redis
import shutil
import datetime
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
def data_incorrect_format(session_uuid):
print('Incorrect format')
sys.exit(1)
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
config_loader = None
type = 3
rotation_save_cycle = 300 #seconds
max_buffer_length = 10000
save_to_file = True
def compress_file(file_full_path, i=0):
if i==0:
compressed_filename = '{}.gz'.format(file_full_path)
else:
compressed_filename = '{}.{}.gz'.format(file_full_path, i)
if os.path.isfile(compressed_filename):
compress_file(file_full_path, i+1)
else:
with open(file_full_path, 'rb') as f_in:
with gzip.open(compressed_filename, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
os.remove(file_full_path)
def get_save_dir(dir_data_uuid, year, month, day):
dir_path = os.path.join(dir_data_uuid, year, month, day)
if not os.path.isdir(dir_path):
os.makedirs(dir_path)
return dir_path
if __name__ == "__main__":
if len(sys.argv) != 2:
print('usage:', 'worker.py', 'session_uuid')
exit(1)
session_uuid = sys.argv[1]
stream_name = 'stream:{}:{}'.format(type, session_uuid)
id = '0'
buffer = b''
# track launched worker
redis_server_stream.sadd('working_session_uuid:{}'.format(type), session_uuid)
# get uuid
res = redis_server_stream.xread({stream_name: id}, count=1)
if res:
uuid = res[0][1][0][1][b'uuid'].decode()
# init file rotation
if save_to_file:
rotate_file = False
time_file = time.time()
date_file = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
dir_data_uuid = os.path.join(data_directory, uuid, str(type))
dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
filename = '{}-{}-{}-{}-{}.syslog.txt'.format(uuid, date_file[0:4], date_file[4:6], date_file[6:8], date_file[8:14])
save_path = os.path.join(dir_full_path, filename)
print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
else:
# TODO: clean db on error
print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
sys.exit(1)
while True:
res = redis_server_stream.xread({stream_name: id}, count=1)
if res:
new_id = res[0][1][0][0].decode()
if id != new_id:
id = new_id
data = res[0][1][0][1]
if id and data:
# reconstruct data
if buffer != b'':
data[b'message'] = b''.join([buffer, data[b'message']])
buffer = b''
# send data to redis
# new line in received data
if b'\n' in data[b'message']:
all_line = data[b'message'].split(b'\n')
for line in all_line[:-1]:
Analyzer_Queue.add_data_to_queue(uuid, type, line)
# analyzer_uuid = analyzer_uuid.decode()
# keep incomplete line
if all_line[-1] != b'':
buffer += all_line[-1]
else:
if len(buffer) < max_buffer_length:
buffer += data[b'message']
else:
print('Error, infinite loop, max buffer length reached')
# force new line
buffer += b''.join([ data[b'message'], b'\n' ])
# save data on disk
if save_to_file and b'\n' in data[b'message']:
new_date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
# check if a new rotation is needed
if ( new_date[0:8] != date_file[0:8] ) or ( time.time() - time_file > rotation_save_cycle ):
date_file = new_date
rotate_file = True
# file rotation
if rotate_file:
end_file, start_new_file = data[b'message'].rsplit(b'\n', maxsplit=1)
# save end of file
with open(save_path, 'ab') as f:
f.write(end_file)
compress_file(save_path)
# get new save_path
dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
filename = '{}-{}-{}-{}-{}.syslog.txt'.format(uuid, date_file[0:4], date_file[4:6], date_file[6:8], date_file[8:14])
save_path = os.path.join(dir_full_path, filename)
# save start of new file
if start_new_file != b'':
with open(save_path, 'ab') as f:
f.write(start_new_file)
# end of rotation
rotate_file = False
time_file = time.time()
else:
with open(save_path, 'ab') as f:
f.write(data[b'message'])
redis_server_stream.xdel(stream_name, id)
else:
# success, all data is saved
if redis_server_stream.sismember('ended_session', session_uuid):
redis_server_stream.srem('ended_session', session_uuid)
redis_server_stream.srem('session_uuid:{}'.format(type), session_uuid)
redis_server_stream.srem('working_session_uuid:{}'.format(type), session_uuid)
redis_server_stream.hdel('map-type:session_uuid-uuid:{}'.format(type), session_uuid)
redis_server_stream.delete(stream_name)
try:
if os.path.isfile(save_path):
#print('save')
compress_file(save_path)
except NameError:
pass
print('---- syslog DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
sys.exit(0)
else:
time.sleep(10)
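`compress_file` above gzips a finished log and probes `<name>.gz`, `<name>.1.gz`, ... so earlier rotations are never overwritten, then deletes the original. A self-contained version of the same scheme (demo paths only):

```python
import gzip
import os
import shutil
import tempfile

def compress_rotated(file_full_path, i=0):
    """Gzip a file, appending a numeric suffix when the target already
    exists, then remove the original (as the worker's compress_file does)."""
    suffix = '.gz' if i == 0 else '.{}.gz'.format(i)
    compressed_filename = file_full_path + suffix
    if os.path.isfile(compressed_filename):
        return compress_rotated(file_full_path, i + 1)
    with open(file_full_path, 'rb') as f_in, gzip.open(compressed_filename, 'wb') as f_out:
        shutil.copyfileobj(f_in, f_out)
    os.remove(file_full_path)
    return compressed_filename

workdir = tempfile.mkdtemp()
log_path = os.path.join(workdir, 'demo.syslog.txt')
with open(log_path, 'wb') as f:
    f.write(b'one line of syslog\n')
archived = compress_rotated(log_path)
```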

View File

@ -0,0 +1,37 @@
#!/usr/bin/env python3
import os
import sys
import time
import redis
import subprocess
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
type = 3
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}, ConnectionError'.format("Redis_STREAM"))
sys.exit(1)
if __name__ == "__main__":
stream_name = 'stream:{}'.format(type)
redis_server_stream.delete('working_session_uuid:{}'.format(type))
while True:
for session_uuid in redis_server_stream.smembers('session_uuid:{}'.format(type)):
session_uuid = session_uuid.decode()
if not redis_server_stream.sismember('working_session_uuid:{}'.format(type), session_uuid):
process = subprocess.Popen(['./worker.py', session_uuid])
print('Launching new worker {} ... session_uuid={}'.format(type, session_uuid))
#print('.')
time.sleep(10)

View File

@ -7,17 +7,31 @@ import redis
import datetime
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
def data_incorrect_format(session_uuid):
print('Incorrect format')
sys.exit(1)
host_redis_stream = "localhost"
port_redis_stream = 6379
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
# get file config
config_file_server = os.path.join(os.environ['D4_HOME'], 'configs/server.conf')
config_server = configparser.ConfigParser()
config_server.read(config_file_server)
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
config_loader = None
type = 4
rotation_save_cycle = 300 #seconds
@ -38,16 +52,16 @@ if __name__ == "__main__":
if res:
date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
uuid = res[0][1][0][1][b'uuid'].decode()
data_rel_path = os.path.join('../../data', uuid, str(type))
data_rel_path = os.path.join(data_directory, uuid, str(type))
dir_path = os.path.join(data_rel_path, date[0:4], date[4:6], date[6:8])
if not os.path.isdir(dir_path):
os.makedirs(dir_path)
filename = '{}-{}-{}-{}-{}.dnscap.txt'.format(uuid, date[0:4], date[4:6], date[6:8], date[8:14])
rel_path = os.path.join(dir_path, filename)
print('---- worker launched, uuid={} session_uuid={}'.format(uuid, session_uuid))
print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
else:
print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
sys.exit(1)
print('Incorrect message')
time_file = time.time()
rotate_file = False
@ -98,7 +112,7 @@ if __name__ == "__main__":
redis_server_stream.srem('working_session_uuid:{}'.format(type), session_uuid)
redis_server_stream.hdel('map-type:session_uuid-uuid:{}'.format(type), session_uuid)
redis_server_stream.delete(stream_name)
print('---- dnscap DONE, uuid={} session_uuid={}'.format(uuid, session_uuid))
print('---- dnscap DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
sys.exit(0)
else:
time.sleep(10)

View File

@ -6,19 +6,19 @@ import time
import redis
import subprocess
host_redis_stream = "localhost"
port_redis_stream = 6379
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
type = 4
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
print('Error: Redis server {}, ConnectionError'.format("Redis_STREAM"))
sys.exit(1)
if __name__ == "__main__":

View File

@ -3,34 +3,33 @@
import os
import sys
import time
import gzip
import redis
import shutil
import datetime
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
def data_incorrect_format(session_uuid):
print('Incorrect format')
sys.exit(1)
host_redis_stream = "localhost"
port_redis_stream = 6379
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
host_redis_metadata = "localhost"
port_redis_metadata = 6380
redis_server_metadata = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=0)
redis_server_analyzer = redis.StrictRedis(
host=host_redis_metadata,
port=port_redis_metadata,
db=2)
# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
config_loader = None
type = 8
rotation_save_cycle = 300 #seconds
@ -41,6 +40,19 @@ max_buffer_length = 10000
save_to_file = True
def compress_file(file_full_path, i=0):
if i==0:
compressed_filename = '{}.gz'.format(file_full_path)
else:
compressed_filename = '{}.{}.gz'.format(file_full_path, i)
if os.path.isfile(compressed_filename):
compress_file(file_full_path, i+1)
else:
with open(file_full_path, 'rb') as f_in:
with gzip.open(compressed_filename, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
os.remove(file_full_path)
def get_save_dir(dir_data_uuid, year, month, day):
dir_path = os.path.join(dir_data_uuid, year, month, day)
if not os.path.isdir(dir_path):
@ -70,13 +82,14 @@ if __name__ == "__main__":
rotate_file = False
time_file = time.time()
date_file = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
dir_data_uuid = os.path.join('../../data', uuid, str(type))
dir_data_uuid = os.path.join(data_directory, uuid, str(type))
dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
filename = '{}-{}-{}-{}-{}.passivedns.txt'.format(uuid, date_file[0:4], date_file[4:6], date_file[6:8], date_file[8:14])
save_path = os.path.join(dir_full_path, filename)
print('---- worker launched, uuid={} session_uuid={}'.format(uuid, session_uuid))
print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
else:
########################### # TODO: clean db on error
print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
sys.exit(1)
@ -92,7 +105,7 @@ if __name__ == "__main__":
if id and data:
# reconstruct data
if buffer != b'':
data[b'message'] = '{}{}'.format(buffer, data[b'message'])
data[b'message'] = b''.join([buffer, data[b'message']])
buffer = b''
# send data to redis
@ -100,28 +113,21 @@ if __name__ == "__main__":
if b'\n' in data[b'message']:
all_line = data[b'message'].split(b'\n')
for line in all_line[:-1]:
for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
analyzer_uuid = analyzer_uuid.decode()
redis_server_analyzer.lpush('analyzer:{}:{}'.format(type, analyzer_uuid), line)
redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
if analyser_queue_max_size is None:
analyser_queue_max_size = analyzer_list_max_default_size
redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
Analyzer_Queue.add_data_to_queue(uuid, type, line)
# keep incomplete line
if all_line[-1] != b'':
buffer += data[b'message']
buffer += all_line[-1]
else:
if len(buffer) < max_buffer_length:
buffer += data[b'message']
else:
print('Error, infinite loop, max buffer length reached')
# force new line
buffer += b'{}\n'.format(data[b'message'])
buffer += b''.join([ data[b'message'], b'\n' ])
# save data on disk
if save_to_file:
if save_to_file and b'\n' in data[b'message']:
new_date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
# check if a new rotation is needed
if ( new_date[0:8] != date_file[0:8] ) or ( time.time() - time_file > rotation_save_cycle ):
@ -129,11 +135,12 @@ if __name__ == "__main__":
rotate_file = True
# file rotation
if rotate_file and b'\n' in data[b'message']:
if rotate_file:
end_file, start_new_file = data[b'message'].rsplit(b'\n', maxsplit=1)
# save end of file
with open(save_path, 'ab') as f:
f.write(end_file)
compress_file(save_path)
# get new save_path
dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
@ -162,7 +169,13 @@ if __name__ == "__main__":
redis_server_stream.srem('working_session_uuid:{}'.format(type), session_uuid)
redis_server_stream.hdel('map-type:session_uuid-uuid:{}'.format(type), session_uuid)
redis_server_stream.delete(stream_name)
print('---- passivedns DONE, uuid={} session_uuid={}'.format(uuid, session_uuid))
try:
if os.path.isfile(save_path):
print('save')
compress_file(save_path)
except NameError:
pass
print('---- passivedns DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
sys.exit(0)
else:
time.sleep(10)
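The newline handling shared by the syslog and passivedns workers, emit each complete line to the analyzer queue and carry the unfinished tail into the next chunk, forcing a newline if the buffer grows too large, can be sketched as a pure function with a hypothetical `emit` callback:

```python
def feed_chunk(buffer, chunk, emit, max_buffer_length=10000):
    """Emit complete lines from buffer+chunk; return the unfinished tail.

    Complete lines go to emit() (the workers push them via
    Analyzer_Queue.add_data_to_queue); when no newline arrives and the
    data exceeds max_buffer_length, a newline is forced, mirroring the
    workers' overflow branch.
    """
    data = buffer + chunk
    if b'\n' in data:
        *complete, tail = data.split(b'\n')
        for line in complete:
            emit(line)
        return tail
    if len(data) >= max_buffer_length:
        return data + b'\n'  # force a line break on overflow
    return data

lines = []
tail = feed_chunk(b'', b'first\nsec', lines.append)
tail = feed_chunk(tail, b'ond\nthird', lines.append)
```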

View File

@ -6,19 +6,19 @@ import time
import redis
import subprocess
host_redis_stream = "localhost"
port_redis_stream = 6379
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None
redis_server_stream = redis.StrictRedis(
host=host_redis_stream,
port=port_redis_stream,
db=0)
type = 8
try:
redis_server_stream.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
print('Error: Redis server {}, ConnectionError'.format("Redis_STREAM"))
sys.exit(1)
if __name__ == "__main__":