Compare commits

...

57 Commits
v0.4 ... master

Author SHA1 Message Date
Jean-Louis Huynen cb3d618ee1
Merge pull request #50 from cudeso/master
Contributions to the documentation and small typo for "registered"
2023-12-23 17:58:53 +01:00
Koen Van Impe 27aa5b1df9 Contributions to the documentation, small typo for "registered"
- Clarifications for basic install of the client
- Clarifications for basic install of the server
- Fix small typos: registered instead of registred
2023-12-22 18:31:40 +01:00
Alexandre Dulaunoy 2e8ddd490f
Merge pull request #49 from DocArmoryTech/patch-1
installation refinement
2023-11-28 22:20:44 +01:00
DocArmoryTech 090d0f66bb
Merge pull request #1 from DocArmoryTech/patch-2
Update install_server.sh
2023-11-28 20:36:24 +00:00
DocArmoryTech dfd53c126b
Update install_server.sh
- removed remnant reference to AIL_ENV from AIL install script
- removed non-existent flags from LAUNCH.sh
2023-11-28 20:30:33 +00:00
DocArmoryTech 6273a220b2
Update requirement.txt
set Flask version to 2.2.2
added Werkzeug version to match

Running Ubuntu 20.04 using Python 3.8.10
 - the latest version of flask causes the server to fail to start (cannot import name 'escape' from 'flask')
 - after fixing flask at version 2.2.2, the server again failed to start (ImportError: cannot import name 'url_quote' from 'werkzeug.urls')
2023-11-28 18:51:38 +00:00
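In requirement.txt terms, the pin described in this commit looks like the fragment below; Flask 2.2.2 is stated in the commit, while Werkzeug 2.2.2 is an assumed matching release (any 2.2.x predates the `url_quote` removal in Werkzeug 3.0):

```
Flask==2.2.2
Werkzeug==2.2.2
```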
Jean-Louis Huynen 81686aa022
fix: [web] fix #47 2023-03-02 15:41:45 +01:00
Gerard Wagener 399e659d8f fix: [d4-client] Removed hardcoded gcc command from Makefile 2021-10-08 11:32:38 +02:00
Terrtia b2f463e8f1
fix: [d4-server] check HMAC key 2021-04-20 16:42:22 +02:00
Terrtia adf0f6008b
fix: [d4-server] worker launcher: don't add invalid HMAC or empty data stream to workers queue 2021-04-20 15:43:03 +02:00
Terrtia 39d593364d
chg: [D4Server] add server port in config 2021-03-31 11:43:54 +02:00
Terrtia cbb90c057a
merge 2021-03-30 18:28:55 +02:00
Terrtia b7998d5601
chg: [D4server] add shared hmac key in config 2021-03-30 18:27:42 +02:00
Terrtia dc3cdcbc1c
chg: [D4server] add shared hmac key in config 2021-03-30 18:26:15 +02:00
Jean-Louis Huynen 6c3c9f9954
fix #41 2021-02-23 19:54:36 +01:00
Jean-Louis Huynen a6d5a3d22c
chg: [filerwatcher] fix double queue logging for compressed files 2021-02-19 09:45:47 +01:00
Jean-Louis Huynen 36a771ea2d
chg: [filerwatcher] define segregation from MH 2021-02-18 18:04:44 +01:00
Jean-Louis Huynen ef6e87f3c5
chg: [filerwatcher] compression, ext from MH + remove buffer limits 2021-02-18 17:00:55 +01:00
Jean-Louis Huynen 5a3e299332
add: [filerwatcher] enable by_uuid / date filing 2021-02-18 14:37:43 +01:00
Jean-Louis Huynen d74d2fb71a
add: [filerwatcher] +base64 worker 2021-02-17 16:46:33 +01:00
Jean-Louis Huynen cf64529929
add: [filerwatcher] initial json file worker 2021-02-16 16:13:53 +01:00
Terrtia 893631e003
fix: [client] no data: send empty D4 packet 2020-12-02 15:54:20 +01:00
Terrtia ac301b5360
fix: [Flask] fix flask host 2020-11-10 16:12:09 +01:00
Terrtia 04fab82f5e
fix: [Sensors monitoring] typo 2020-11-10 15:09:12 +01:00
Terrtia a297cef179
fix: [Sensors monitoring] fix reload list of sensors to monitor 2020-11-10 15:04:45 +01:00
Terrtia 3edf227cc1
fix: [Sensors monitoring] fix API 2020-11-10 14:41:44 +01:00
Terrtia 2d358918c9
fix: [Sensors monitoring] fct is_sensor_monitored 2020-11-10 14:21:00 +01:00
Terrtia 47f82c8879
fix: [Sensors API] check user token 2020-11-10 11:19:17 +01:00
Terrtia 6fee7df9fe
chg: [Sensors API + UI] add sensors monitoring 2020-11-10 11:11:23 +01:00
Terrtia 4b30072880
fix: [Flask server] typo 2020-09-04 09:31:40 +02:00
Terrtia 7ce265e477
fix: [Flask server] change default host 2020-09-04 09:22:14 +02:00
Terrtia 168c31a5bf
fix: [UI sensor_register role] avoid login + fix error template 2020-08-20 15:33:21 +02:00
Terrtia adda78faad
fix: [UI change password] check user role 2020-08-20 15:14:41 +02:00
Terrtia df482d6ee3
fix: [Analyzer queues] delete an analyzer queue 2020-05-26 09:48:21 +02:00
Terrtia 609402ebf2
fix: [edit user] edit user password 2020-05-26 09:38:24 +02:00
Thirion Aurélien 98e562bd47
Merge pull request #37 from D4-project/gallypette-patch-1
chg: [install] popper folder name changed.
2020-04-06 16:05:30 +02:00
Terrtia f17f80b21c
fix: [Flask cookie name] 2020-04-06 15:33:29 +02:00
Jean-Louis Huynen 00e3ce3437
chg: [install] popper folder name changed. 2020-04-06 15:13:43 +02:00
Terrtia 82b2944119
fix: [Analyzer - close socket] use shutdown fct 2020-03-17 17:58:13 +01:00
Terrtia 7078f341ae
fix: [README] 2020-03-12 11:15:35 +01:00
Terrtia cb1c8c4d65
chg: [README] update screenshot 2020-03-12 11:10:27 +01:00
Terrtia 091994a34d
chg: [server] add screenshots 2020-03-12 10:45:11 +01:00
Terrtia 209cd0500f
chg: [exporter TLS] add client cert 2020-03-10 15:04:29 +01:00
Terrtia 99656658f2
chg: [TLS Exporter] add new analyzer: tls export, fix: #35 2020-03-10 14:43:45 +01:00
Terrtia cdc72e7998
fix: [Analyzer queue 254] fix metatype: push to queue 2020-03-03 15:34:51 +01:00
Terrtia 8a792fe4ba
fix: [Analyzer queue 254] fix list by type 2020-03-03 14:33:54 +01:00
Terrtia ab261a6bd2
chg: [Analyzer Queue] add template: edit queue 2020-03-03 14:14:35 +01:00
Terrtia 14d3a650e5
chg: [Metatype default] send data to analyzer queues by default 2020-03-03 10:57:04 +01:00
Terrtia b48ad52845
chg: [Analyzer Queue] add update script + global queue_uuid by type/extended type 2020-03-03 10:50:08 +01:00
Terrtia 10430135d1
chg: [Analyzer Queue UI] add queue creator template + bug fix 2020-03-02 16:56:20 +01:00
Terrtia 4d55d601a1
chg: [Analyzer Queues] Add queue by group of sensors (TODO: add sensor uuid in the UI) 2020-02-28 16:52:48 +01:00
Terrtia aabf74f2f3
fix: [worker 2 8] fix config redis 2020-01-22 15:39:36 +01:00
Terrtia d3087662a7
fix: [worker 2 8] fix config import 2020-01-22 15:37:06 +01:00
Terrtia 8fa83dd248
fix: [worker 1] fix config import 2020-01-22 14:00:18 +01:00
Terrtia 56e7657253
chg: [worker] add worker 3 2020-01-22 13:52:53 +01:00
Terrtia bb3c1b2676
fix: typo 2020-01-21 11:52:44 +01:00
Terrtia 1c61e1d1fe
fix: [Flask server + cookie session] chg default cookie name (also use port number) + add Flask port number to config 2020-01-21 11:51:42 +01:00
48 changed files with 1994 additions and 247 deletions


@@ -64,10 +64,31 @@ git submodule init
git submodule update
~~~~
Build the d4 client. This will create the `d4` binary.
~~~~
make
~~~~
Then register the sensor with the server. Replace `API_TOKEN`, `VALID_UUID4` (generate a random UUID, e.g. via [UUIDgenerator](https://www.uuidgenerator.net/)), and `VALID_HMAC_KEY` with your own values.
~~~~
curl -k https://127.0.0.1:7000/api/v1/add/sensor/register --header "Authorization: API_TOKEN" -H "Content-Type: application/json" --data '{"uuid":"VALID_UUID4","hmac_key":"VALID_HMAC_KEY"}' -X POST
~~~~
If the registration succeeded, the UUID is returned. Do not forget to approve the registration in the D4 server web interface.
Update the configuration file:
~~~~
cp -r conf.sample conf
echo VALID_UUID4 > conf/uuid
echo VALID_HMAC_KEY > conf/key
~~~~
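For scripting, the same registration call can be built with Python's standard library; a minimal sketch mirroring the curl command above (the placeholders are the same ones you must replace):

```python
import json
import urllib.request

def build_register_request(api_token, sensor_uuid, hmac_key,
                           url='https://127.0.0.1:7000/api/v1/add/sensor/register'):
    # Mirrors the curl call above; token/uuid/key are placeholders to fill in.
    payload = json.dumps({'uuid': sensor_uuid, 'hmac_key': hmac_key}).encode()
    return urllib.request.Request(
        url, data=payload, method='POST',
        headers={'Authorization': api_token, 'Content-Type': 'application/json'})

req = build_register_request('API_TOKEN', 'VALID_UUID4', 'VALID_HMAC_KEY')
# To actually send it against the self-signed cert (the curl -k equivalent):
#   import ssl
#   ctx = ssl.create_default_context()
#   ctx.check_hostname = False
#   ctx.verify_mode = ssl.CERT_NONE
#   urllib.request.urlopen(req, context=ctx)
```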
## D4 core server
D4 core server is a complete server to handle clients (sensors), including decapsulation of the [D4 protocol](https://github.com/D4-project/architecture/tree/master/format), control of sensor registrations, management of decoding protocols, and dispatching to adequate decoders/analysers.
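The D4 framing decapsulated here uses a fixed 62-byte header (version, type, sensor UUID, timestamp, HMAC-SHA-256, payload size). A rough sketch of packing one in Python; the field order follows the linked format spec, but the byte order shown is an assumption for illustration only:

```python
import struct
import time
import uuid

def pack_d4_header(version, d4_type, sensor_uuid, hmac32, payload_size):
    # 1 + 1 + 16 + 8 + 32 + 4 = 62 bytes; byte order here is an assumption,
    # check the D4 format spec before relying on it.
    return struct.pack('<BB16sQ32sI',
                       version, d4_type, sensor_uuid.bytes,
                       int(time.time()), hmac32, payload_size)

hdr = pack_d4_header(1, 3, uuid.uuid4(), b'\x00' * 32, 1024)
print(len(hdr))  # → 62
```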
### Requirements
@@ -76,13 +97,26 @@ sensor registrations, management of decoding protocols and dispatching to adequa
### Installation
- [Install D4 Server](https://github.com/D4-project/d4-core/tree/master/server)
-### Screenshots of D4 core server management
+### D4 core server Screenshots
#### Dashboard:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/main.png)
#### Connected Sensors:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor-mgmt.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-mgmt.png)
#### Sensors Status:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_status.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_types.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_files.png)
#### Server Management:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management-types.png)
#### Analyzer Queues:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-queues.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/create_analyzer_queue.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-mgmt.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-mgmt2.png)


@@ -32,7 +32,7 @@ clean:
	- rm -rf *.o hmac
d4: d4.o sha2.o hmac.o unpack.o unparse.o pack.o gen_uuid.o randutils.o parse.o
-	gcc -Wall -o d4 d4.o hmac.o sha2.o unpack.o pack.o unparse.o gen_uuid.o randutils.o parse.o
+	$(CC) -Wall -o d4 d4.o hmac.o sha2.o unpack.o pack.o unparse.o gen_uuid.o randutils.o parse.o
d4.o: d4.c
-	gcc -Wall -c d4.c
+	$(CC) -Wall -c d4.c


@@ -210,7 +210,7 @@ void d4_transfert(d4_t* d4)
        //In case of errors see block of 0 bytes
        bzero(buf, d4->snaplen);
        nread = read(d4->source.fd, buf, d4->snaplen);
-        if ( nread > 0 ) {
+        if ( nread >= 0 ) {
            d4_update_header(d4, nread);
            //Do HMAC on header and payload. HMAC field is 0 during computation
            if (d4->ctx) {
@@ -238,6 +238,11 @@ void d4_transfert(d4_t* d4)
                fprintf(stderr,"Incomplete header written. abort to let consumer known that the packet is corrupted\n");
                abort();
            }
            // no data - create empty D4 packet
            if ( nread == 0 ) {
                //FIXME no data available, sleep, abort, retry
                break;
            }
        } else{
            //FIXME no data available, sleep, abort, retry
            break;

8 binary files not shown (screenshot images: 7 added, 1 changed).


@@ -69,6 +69,8 @@ function launching_d4_server {
    screen -S "Server_D4" -X screen -t "Server_D4" bash -c "cd ${D4_HOME}; ./server.py -v 10; read x"
    sleep 0.1
    screen -S "Server_D4" -X screen -t "sensors_manager" bash -c "cd ${D4_HOME}; ./sensors_manager.py; read x"
    sleep 0.1
}

function launching_workers {
@@ -80,6 +82,8 @@ function launching_workers {
    sleep 0.1
    screen -S "Workers_D4" -X screen -t "2_workers" bash -c "cd ${D4_HOME}/workers/workers_2; ./workers_manager.py; read x"
    sleep 0.1
    screen -S "Workers_D4" -X screen -t "3_workers" bash -c "cd ${D4_HOME}/workers/workers_3; ./workers_manager.py; read x"
    sleep 0.1
    screen -S "Workers_D4" -X screen -t "4_workers" bash -c "cd ${D4_HOME}/workers/workers_4; ./workers_manager.py; read x"
    sleep 0.1
    screen -S "Workers_D4" -X screen -t "8_workers" bash -c "cd ${D4_HOME}/workers/workers_8; ./workers_manager.py; read x"


@@ -15,11 +15,24 @@ sensor registrations, management of decoding protocols and dispatching to adequa
### Installation
###### Install D4 server
Clone the repository and install the necessary packages. Installation requires *sudo* permissions.
~~~~
git clone https://github.com/D4-project/d4-core.git
cd d4-core
cd server
./install_server.sh
~~~~
-Create or add a pem in [d4-core/server](https://github.com/D4-project/d4-core/tree/master/server) :
When the installation is finished, scroll back to where `+ ./create_default_user.py` is displayed. The next lines contain the generated default user and should resemble the snippet below. Take a temporary note of the password; you are required to **change the password** on first login.
~~~~
new user created: admin@admin.test
password: <redacted>
token: <redacted>
~~~~
Then create or add a PEM certificate in [d4-core/server](https://github.com/D4-project/d4-core/tree/master/server):
~~~~
cd gen_cert
./gen_root.sh
@@ -27,7 +40,6 @@ cd gen_cert
cd ..
~~~~
###### Launch D4 server
~~~~
./LAUNCH.sh -l
@@ -35,6 +47,14 @@ cd ..
The web interface is accessible via `http://127.0.0.1:7000/`
If you cannot access the web interface on localhost (for example because the system runs on a remote host), stop the server, change the listening host IP, and restart it. In the example below it is changed to `0.0.0.0` (all interfaces). Make sure the IP is not unintentionally exposed to the public internet.
~~~~
./LAUNCH.sh -k
sed -i '/\[Flask_Server\]/{:a;N;/host = 127\.0\.0\.1/!ba;s/host = 127\.0\.0\.1/host = 0.0.0.0/}' configs/server.conf
./LAUNCH.sh -l
~~~~
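After the `sed` edit, the `[Flask_Server]` section of `configs/server.conf` (shown in the sample config later in this changeset) should read:

```
[Flask_Server]
# UI port number
host = 0.0.0.0
port = 7000
```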
### Updating web assets
To update javascript libs run:
~~~~
@@ -46,19 +66,32 @@ cd web
[API Documentation](https://github.com/D4-project/d4-core/tree/master/server/documentation/README.md)
### Notes
- All server logs are located in ``d4-core/server/logs/``
- Close D4 Server: ``./LAUNCH.sh -k``
-### Screenshots of D4 core server management
+### D4 core server
#### Dashboard:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/main.png)
#### Connected Sensors:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor-mgmt.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-mgmt.png)
#### Sensors Status:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_status.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_types.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/sensor_stat_files.png)
#### Server Management:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-management-types.png)
#### Analyzer Queues:
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-queues.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/create_analyzer_queue.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/analyzer-mgmt.png)
![](https://raw.githubusercontent.com/D4-project/d4-core/master/doc/images/server-mgmt2.png)
### Troubleshooting
@@ -71,3 +104,7 @@ Run the following command as root:
~~~~
aa-complain /usr/sbin/tcpdump
~~~~
###### WARNING - Not registered UUID=UUID4, connection closed
This happens after you have registered a new sensor but have not yet approved the registration. To approve the sensor, go to **Server Management** in the web interface and click **Pending Sensors**.


@@ -24,7 +24,7 @@ if __name__ == "__main__":
    parser.add_argument('-p', '--port',help='server port' , type=int, dest='target_port', required=True)
    parser.add_argument('-k', '--Keepalive', help='Keepalive in second', type=int, default='15', dest='ka_sec')
    parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
-    parser.add_argument('-ri', '--redis_ip',help='redis host' , type=str, default='127.0.0.1', dest='host_redis')
+    parser.add_argument('-ri', '--redis_ip',help='redis ip' , type=str, default='127.0.0.1', dest='host_redis')
    parser.add_argument('-rp', '--redis_port',help='redis port' , type=int, default=6380, dest='port_redis')

    args = parser.parse_args()
@@ -83,4 +83,4 @@ if __name__ == "__main__":
            print(d4_data)
            client_socket.sendall(d4_data)

-    client_socket.close()
+    client_socket.shutdown(socket.SHUT_RDWR)


@@ -0,0 +1,101 @@
#!/usr/bin/env python3

import os
import sys
import redis
import time
import datetime
import argparse
import logging
import logging.handlers
import socket
import ssl

if __name__ == "__main__":

    parser = argparse.ArgumentParser(description='Export d4 data to stdout')
    parser.add_argument('-t', '--type', help='d4 type or extended type', type=str, dest='type', required=True)
    parser.add_argument('-u', '--uuid', help='queue uuid', type=str, dest='uuid', required=True)
    parser.add_argument('-i', '--ip',help='server ip', type=str, default='127.0.0.1', dest='target_ip')
    parser.add_argument('-p', '--port',help='server port', type=int, dest='target_port', required=True)
    parser.add_argument('-k', '--Keepalive', help='Keepalive in second', type=int, default='15', dest='ka_sec')
    parser.add_argument('-n', '--newline', help='add new lines', action="store_true")
    parser.add_argument('-ri', '--redis_ip', help='redis ip', type=str, default='127.0.0.1', dest='host_redis')
    parser.add_argument('-rp', '--redis_port', help='redis port', type=int, default=6380, dest='port_redis')
    parser.add_argument('-v', '--verify_certificate', help='verify server certificate', type=str, default='True', dest='verify_certificate')
    parser.add_argument('-c', '--ca_certs', help='cert filename' , type=str, default=None, dest='ca_certs')
    args = parser.parse_args()

    if not args.uuid or not args.type or not args.target_port:
        parser.print_help()
        sys.exit(0)

    host_redis = args.host_redis
    port_redis = args.port_redis
    newLines = args.newline
    verify_certificate = args.verify_certificate
    ca_certs = args.ca_certs

    redis_d4 = redis.StrictRedis(
        host=host_redis,
        port=port_redis,
        db=2)

    try:
        redis_d4.ping()
    except redis.exceptions.ConnectionError:
        print('Error: Redis server {}:{}, ConnectionError'.format(host_redis, port_redis))
        sys.exit(1)

    d4_uuid = args.uuid
    d4_type = args.type
    data_queue = 'analyzer:{}:{}'.format(d4_type, d4_uuid)

    target_ip = args.target_ip
    target_port = args.target_port
    addr = (target_ip, target_port)

    # default keep alive: 15
    ka_sec = args.ka_sec

    # Create a TCP socket
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # TCP Keepalive
    s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPCNT, 1)
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPIDLE, ka_sec)
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_KEEPINTVL, ka_sec)

    # SSL
    if verify_certificate in ['False', 'false', 'f']:
        cert_reqs_option = ssl.CERT_NONE
    else:
        cert_reqs_option = ssl.CERT_REQUIRED
    if ca_certs:
        ca_certs = None
    client_socket = ssl.wrap_socket(s, cert_reqs=cert_reqs_option, ca_certs=ca_certs, ssl_version=ssl.PROTOCOL_TLS)

    # TCP connect
    client_socket.connect(addr)

    newLines = True
    while True:
        d4_data = redis_d4.rpop(data_queue)
        if d4_data is None:
            time.sleep(1)
            continue
        if newLines:
            d4_data = d4_data + b'\n'
        print(d4_data)
        client_socket.send(d4_data)

    client_socket.shutdown(socket.SHUT_RDWR)
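Note that `ssl.wrap_socket`, used by this new exporter, was deprecated in Python 3.7 and removed in 3.12. A sketch of an equivalent setup with `ssl.SSLContext`, mapping the script's `verify_certificate`/`ca_certs` options (the hostname below is a placeholder, not part of the project):

```python
import socket
import ssl

def make_tls_socket(verify_certificate=True, ca_certs=None):
    # PROTOCOL_TLS_CLIENT defaults to CERT_REQUIRED with hostname checking
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    if not verify_certificate:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    elif ca_certs:
        ctx.load_verify_locations(cafile=ca_certs)
    raw = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    # handshake happens on connect(); 'd4-server.local' is a placeholder name
    return ctx.wrap_socket(raw, server_hostname='d4-server.local')
```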


@@ -4,8 +4,16 @@ use_default_save_directory = yes
save_directory = None

[D4_Server]
server_port=4443
# registration or shared-secret
server_mode = registration
default_hmac_key = private key to change
analyzer_queues_max_size = 100000000

[Flask_Server]
# UI port number
host = 127.0.0.1
port = 7000

[Redis_STREAM]
host = localhost
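The added `[Flask_Server]` keys can be read with Python's `configparser`; a minimal sketch (the section and option names come from the sample config above, the reading code itself is illustrative):

```python
from configparser import ConfigParser

# Inline copy of the new [Flask_Server] section from the sample config
sample = """
[Flask_Server]
# UI port number
host = 127.0.0.1
port = 7000
"""

cfg = ConfigParser()
cfg.read_string(sample)
flask_host = cfg.get('Flask_Server', 'host')
flask_port = cfg.getint('Flask_Server', 'port')
print(flask_host, flask_port)  # → 127.0.0.1 7000
```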


@@ -30,8 +30,8 @@ git checkout 5.0
make
popd

-# LAUNCH Redis
-bash ${AIL_BIN}LAUNCH.sh -lrv &
+# LAUNCH
+bash LAUNCH.sh -l &
wait
echo ""

server/lib/Analyzer_Queue.py (new executable file, 370 lines)

@ -0,0 +1,370 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import d4_type
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_analyzer = config_loader.get_redis_conn("Redis_ANALYZER")
LIST_DEFAULT_SIZE = config_loader.get_config_int('D4_Server', 'analyzer_queues_max_size')
config_loader = None
### ###
def is_valid_uuid_v4(uuid_v4):
if uuid_v4:
uuid_v4 = uuid_v4.replace('-', '')
else:
return False
try:
uuid_test = uuid.UUID(hex=uuid_v4, version=4)
return uuid_test.hex == uuid_v4
except:
return False
def sanitize_uuid(uuid_v4, not_exist=False):
if not is_valid_uuid_v4(uuid_v4):
uuid_v4 = str(uuid.uuid4())
if not_exist:
if exist_queue(uuid_v4):
uuid_v4 = str(uuid.uuid4())
return uuid_v4
def sanitize_queue_type(format_type):
try:
format_type = int(format_type)
except:
format_type = 1
if format_type == 2:
format_type = 254
return format_type
def exist_queue(queue_uuid):
return r_serv_metadata.exists('analyzer:{}'.format(queue_uuid))
def get_all_queues(r_list=None):
res = r_serv_metadata.smembers('all_analyzer_queues')
if r_list:
return list(res)
return res
def get_all_queues_format_type(r_list=None):
res = r_serv_metadata.smembers('all:analyzer:format_type')
if r_list:
return list(res)
return res
def get_all_queues_extended_type(r_list=None):
res = r_serv_metadata.smembers('all:analyzer:extended_type')
if r_list:
return list(res)
return res
# GLOBAL
def get_all_queues_uuid_by_type(format_type, r_list=None):
res = r_serv_metadata.smembers('all:analyzer:by:format_type:{}'.format(format_type))
if r_list:
return list(res)
return res
# GLOBAL
def get_all_queues_uuid_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('all:analyzer:by:extended_type:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_queues_list_by_type(queue_type):
if isinstance(queue_type ,int):
return get_all_queues_by_type(queue_type)
else:
return get_all_queues_by_extended_type(queue_type)
# ONLY NON GROUP
def get_all_queues_by_type(format_type, r_list=None):
'''
Get all analyzer Queues by type
:param format_type: data type
:type domain_type: int
:param r_list: return list
:type r_list: boolean
:return: list or set of queus (uuid)
:rtype: list or set
'''
# 'all_analyzer_queues_by_type'
res = r_serv_metadata.smembers('analyzer:{}'.format(format_type))
if r_list:
return list(res)
return res
# ONLY NON GROUP
def get_all_queues_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('analyzer:254:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_all_queues_group_by_type(format_type, r_list=None):
res = r_serv_metadata.smembers('analyzer_uuid_group:{}'.format(format_type))
if r_list:
return list(res)
return res
def get_all_queues_group_by_extended_type(extended_type, r_list=None):
res = r_serv_metadata.smembers('analyzer_uuid_group:254:{}'.format(extended_type))
if r_list:
return list(res)
return res
def get_all_queues_by_sensor_group(queue_type, sensor_uuid, r_list=None):
res = r_serv_metadata.smembers('sensor:queues:{}:{}'.format(queue_type, sensor_uuid))
if r_list:
return list(res)
return res
def get_queue_group_all_sensors(queue_uuid, r_list=None):
res = r_serv_metadata.smembers('analyzer_sensor_group:{}'.format(queue_uuid))
if r_list:
return list(res)
return res
def get_queue_last_seen(queue_uuid, f_date='str_time'):
res = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'last_updated')
if f_date == 'str_date':
if res is None:
res = 'Never'
else:
res = datetime.datetime.fromtimestamp(float(res)).strftime('%Y-%m-%d %H:%M:%S')
return res
def get_queue_max_size(queue_uuid):
max_size = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'max_size')
if max_size is None:
max_size = LIST_DEFAULT_SIZE
return max_size
def get_queue_size(queue_uuid, format_type, extended_type=None):
if format_type==254:
if not extended_type:
extended_type = get_queue_extended_type(queue_uuid)
length = r_serv_analyzer.llen('analyzer:{}:{}'.format(extended_type, queue_uuid))
else:
length = r_serv_analyzer.llen('analyzer:{}:{}'.format(format_type, queue_uuid))
if length is None:
length = 0
return length
def get_queue_format_type(queue_uuid):
return int(r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'type'))
def get_queue_extended_type(queue_uuid):
return r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'metatype')
def is_queue_group_of_sensors(queue_uuid):
return r_serv_metadata.exists('analyzer_sensor_group:{}'.format(queue_uuid))
def get_queue_metadata(queue_uuid, format_type=None, extended_type=None, f_date='str_date', is_group=None, force_is_group_queue=False):
dict_queue_meta = {}
dict_queue_meta['uuid'] = queue_uuid
dict_queue_meta['size_limit'] = get_queue_max_size(queue_uuid)
dict_queue_meta['last_updated'] = get_queue_last_seen(queue_uuid, f_date=f_date)
dict_queue_meta['description'] = r_serv_metadata.hget('analyzer:{}'.format(queue_uuid), 'description')
if dict_queue_meta['description'] is None:
dict_queue_meta['description'] = ''
if not format_type:
format_type = get_queue_format_type(queue_uuid)
dict_queue_meta['format_type'] = format_type
if format_type==254:
if not extended_type:
extended_type = get_queue_extended_type(queue_uuid)
dict_queue_meta['extended_type'] = extended_type
dict_queue_meta['length'] = get_queue_size(queue_uuid, format_type, extended_type=extended_type)
if is_group and not force_is_group_queue:
dict_queue_meta['is_group_queue'] = is_queue_group_of_sensors(queue_uuid)
else:
if force_is_group_queue:
dict_queue_meta['is_group_queue'] = True
else:
dict_queue_meta['is_group_queue'] = False
return dict_queue_meta
def edit_queue_description(queue_uuid, description):
if r_serv_metadata.exists('analyzer:{}'.format(queue_uuid)) and description:
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'description', description)
def edit_queue_max_size(queue_uuid, max_size):
try:
max_size = int(max_size)
except:
return 'analyzer max size, Invalid Integer'
if r_serv_metadata.exists('analyzer:{}'.format(queue_uuid)) and max_size > 0:
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'max_size', max_size)
def edit_queue_sensors_set(queue_uuid, l_sensors_uuid):
format_type = get_queue_format_type(queue_uuid)
set_current_sensors = get_queue_group_all_sensors(queue_uuid)
l_new_sensors_uuid = []
for sensor_uuid in l_sensors_uuid:
l_new_sensors_uuid.append(sensor_uuid.replace('-', ''))
sensors_to_add = l_sensors_uuid.difference(set_current_sensors)
sensors_to_remove = set_current_sensors.difference(l_sensors_uuid)
for sensor_uuid in sensors_to_add:
r_serv_metadata.sadd('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.sadd('sensor:queues:{}:{}'.format(format_type, sensor_uuid), queue_uuid)
for sensor_uuid in sensors_to_remove:
r_serv_metadata.srem('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.srem('sensor:queues:{}:{}'.format(format_type, sensor_uuid), queue_uuid)
# create queu by type or by group of uuid
# # TODO: add size limit
def create_queues(format_type, queue_uuid=None, l_uuid=[], queue_type='list', metatype_name=None, description=None):
format_type = sanitize_queue_type(format_type)
if not d4_type.is_accepted_format_type(format_type):
return {'error': 'Invalid type'}
if format_type == 254 and not d4_type.is_accepted_extended_type(metatype_name):
return {'error': 'Invalid extended type'}
queue_uuid = sanitize_uuid(queue_uuid, not_exist=True)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', format_type)
edit_queue_description(queue_uuid, description)
# # TODO: check l_uuid is valid
if l_uuid:
analyzer_key_name = 'analyzer_uuid_group'
else:
analyzer_key_name = 'analyzer'
r_serv_metadata.sadd('all:analyzer:format_type', format_type)
r_serv_metadata.sadd('all:analyzer:by:format_type:{}'.format(format_type), queue_uuid)
if format_type == 254:
# TODO: check metatype_name
r_serv_metadata.sadd('{}:{}:{}'.format(analyzer_key_name, format_type, metatype_name), queue_uuid)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'metatype', metatype_name)
r_serv_metadata.sadd('all:analyzer:by:extended_type:{}'.format(metatype_name), queue_uuid)
r_serv_metadata.sadd('all:analyzer:extended_type', metatype_name)
else:
r_serv_metadata.sadd('{}:{}'.format(analyzer_key_name, format_type), queue_uuid)
# Group by UUID
if l_uuid:
# # TODO: check sensor_uuid is valid
if format_type == 254:
queue_type = metatype_name
for sensor_uuid in l_uuid:
sensor_uuid = sensor_uuid.replace('-', '')
r_serv_metadata.sadd('analyzer_sensor_group:{}'.format(queue_uuid), sensor_uuid)
r_serv_metadata.sadd('sensor:queues:{}:{}'.format(queue_type, sensor_uuid), queue_uuid)
# ALL
r_serv_metadata.sadd('all_analyzer_queues', queue_uuid)
return queue_uuid
# format_type int or str (extended type)
def add_data_to_queue(sensor_uuid, queue_type, data):
if data:
# by data type
for queue_uuid in get_queues_list_by_type(queue_type):
r_serv_analyzer.lpush('analyzer:{}:{}'.format(queue_type, queue_uuid), data)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'last_updated', time.time())
analyser_queue_max_size = get_queue_max_size(queue_uuid)
r_serv_analyzer.ltrim('analyzer:{}:{}'.format(queue_type, queue_uuid), 0, analyser_queue_max_size)
# by data type
for queue_uuid in get_all_queues_by_sensor_group(queue_type, sensor_uuid):
r_serv_analyzer.lpush('analyzer:{}:{}'.format(queue_type, queue_uuid), data)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'last_updated', time.time())
analyser_queue_max_size = get_queue_max_size(queue_uuid)
r_serv_analyzer.ltrim('analyzer:{}:{}'.format(queue_type, queue_uuid), 0, analyser_queue_max_size)
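The `lpush` + `ltrim` pair in `add_data_to_queue` implements a capped queue: new entries are prepended and anything past the configured maximum index is discarded. A minimal sketch of the same pattern on a plain Python list (the names `lpush_ltrim` and `max_size` are illustrative, not part of the D4 code):

```python
def lpush_ltrim(queue, item, max_size):
    """Emulate Redis LPUSH followed by LTRIM 0..max_size.

    Note: LTRIM with an inclusive end index of max_size keeps
    max_size + 1 elements, mirroring add_data_to_queue above.
    """
    queue.insert(0, item)      # LPUSH: newest item goes to the front
    del queue[max_size + 1:]   # LTRIM: drop everything older
    return queue

q = []
for i in range(5):
    lpush_ltrim(q, i, 2)
print(q)  # -> [4, 3, 2]
```

Because the trim runs after every push, the queue length never exceeds `max_size + 1`, so a slow analyzer only ever sees the most recent entries.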
def flush_queue(queue_uuid, queue_type):
r_serv_analyzer.delete('analyzer:{}:{}'.format(queue_type, queue_uuid))
def remove_queues(queue_uuid, queue_type, metatype_name=None):
try:
queue_type = int(queue_type)
except:
print('error: Invalid format type')
return {'error': 'Invalid format type'}
if not is_valid_uuid_v4(queue_uuid):
print('error: Invalid uuid')
return {'error': 'Invalid uuid'}
if not exist_queue(queue_uuid):
print('error: unknown queue uuid')
return {'error': 'unknown queue uuid'}
if queue_type==254 and not metatype_name:
metatype_name = get_queue_extended_type(queue_uuid)
# delete metadata
r_serv_metadata.delete('analyzer:{}'.format(queue_uuid))
# delete queue group of sensors uuid
l_sensors_uuid = get_queue_group_all_sensors(queue_uuid)
if l_sensors_uuid:
r_serv_metadata.delete('analyzer_sensor_group:{}'.format(queue_uuid))
if queue_type == 254:
queue_type = metatype_name
for sensor_uuid in l_sensors_uuid:
r_serv_metadata.srem('sensor:queues:{}:{}'.format(queue_type, sensor_uuid), queue_uuid)
if l_sensors_uuid:
analyzer_key_name = 'analyzer_uuid_group'
else:
analyzer_key_name = 'analyzer'
r_serv_metadata.srem('all:analyzer:by:format_type:{}'.format(queue_type), queue_uuid)
if queue_type == 254:
r_serv_metadata.srem('{}:254:{}'.format(analyzer_key_name, metatype_name), queue_uuid)
r_serv_metadata.srem('all:analyzer:by:extended_type:{}'.format(metatype_name), queue_uuid)
else:
r_serv_metadata.srem('{}:{}'.format(analyzer_key_name, queue_type), queue_uuid)
r_serv_metadata.srem('all_analyzer_queues', queue_uuid)
## delete global queue ##
if not r_serv_metadata.exists('all:analyzer:by:format_type:{}'.format(queue_type)):
r_serv_metadata.srem('all:analyzer:format_type', queue_type)
if queue_type ==254:
if not r_serv_metadata.exists('all:analyzer:by:extended_type:{}'.format(metatype_name)):
r_serv_metadata.srem('all:analyzer:extended_type', metatype_name)
## --- ##
# delete queue
r_serv_analyzer.delete('analyzer:{}:{}'.format(queue_type, queue_uuid))
def get_sensor_queues(sensor_uuid):
pass
if __name__ == '__main__':
#create_queues(3, l_uuid=['03c00bcf-fe53-46a1-85bb-ee6084cb5bb2'])
remove_queues('a2e6f95c-1efe-4d2b-a0f5-d8e205d85670', 3)


@@ -11,6 +11,7 @@ from flask import escape
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
+import d4_server

### Config ###
config_loader = ConfigLoader.ConfigLoader()
@@ -26,6 +27,13 @@ def is_valid_uuid_v4(UUID):
    except:
        return False

+def get_time_sensor_last_seen(sensor_uuid):
+    res = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'last_seen')
+    if res:
+        return int(res)
+    else:
+        return 0
+
def _get_sensor_type(sensor_uuid, first_seen=True, last_seen=True, time_format='default'):
    uuid_type = []
    uuid_all_type = r_serv_db.smembers('all_types_by_uuid:{}'.format(sensor_uuid))
@@ -68,6 +76,8 @@ def _get_sensor_metadata(sensor_uuid, first_seen=True, last_seen=True, time_form
    meta_sensor['mail'] = r_serv_db.hget('metadata_uuid:{}'.format(sensor_uuid), 'user_mail')
    return meta_sensor

+### BEGIN - SENSOR REGISTRATION ###
+
## TODO: add description
def register_sensor(req_dict):
    sensor_uuid = req_dict.get('uuid', None)
@@ -80,7 +90,7 @@ def register_sensor(req_dict):
    sensor_uuid = sensor_uuid.replace('-', '')
    # sensor already exist
    if r_serv_db.exists('metadata_uuid:{}'.format(sensor_uuid)):
-        return ({"status": "error", "reason": "Sensor already registred"}, 409)
+        return ({"status": "error", "reason": "Sensor already registered"}, 409)

    # hmac key
    if not hmac_key:
@@ -167,3 +177,99 @@ def delete_registered_sensor(req_dict):
def _delete_registered_sensor(sensor_uuid):
    r_serv_db.srem('registered_uuid', sensor_uuid)
    return ({'uuid': sensor_uuid}, 200)
### --- END - SENSOR REGISTRATION --- ###
### BEGIN - SENSOR MONITORING ###
def get_sensors_monitoring_last_updated():
res = r_serv_db.get('sensors_monitoring:last_updated')
if res:
return int(res)
else:
return 0
def get_all_sensors_to_monitor():
return r_serv_db.smembers('to_monitor:sensors')
def get_to_monitor_delta_time_by_uuid(sensor_uuid):
return int(r_serv_db.hget('to_monitor:sensor:{}'.format(sensor_uuid), 'delta_time'))
def get_all_sensors_to_monitor_dict():
dict_to_monitor = {}
for sensor_uuid in get_all_sensors_to_monitor():
dict_to_monitor[sensor_uuid] = get_to_monitor_delta_time_by_uuid(sensor_uuid)
return dict_to_monitor
def _check_sensor_delta(sensor_uuid, sensor_delta):
last_d4_packet = get_time_sensor_last_seen(sensor_uuid)
# check sensor delta time between two D4 packets + check sensor connection
if int(time.time()) - last_d4_packet > sensor_delta or not d4_server.is_sensor_connected(sensor_uuid):
r_serv_db.sadd('sensors_monitoring:sensors_error', sensor_uuid)
handle_sensor_monitoring_error(sensor_uuid)
else:
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def handle_sensor_monitoring_error(sensor_uuid):
print('sensor monitoring error: {}'.format(sensor_uuid))
## TODO: ##
# MAILS
# UI Notifications
# SNMP
# Syslog message
## ## ## ##
return None
def is_sensor_monitored(sensor_uuid):
return r_serv_db.sismember('to_monitor:sensors', sensor_uuid)
def get_all_sensors_connection_errors():
return r_serv_db.smembers('sensors_monitoring:sensors_error')
def api_get_all_sensors_connection_errors():
return list(get_all_sensors_connection_errors()), 200
def add_sensor_to_monitor(sensor_uuid, delta_time):
r_serv_db.sadd('to_monitor:sensors', sensor_uuid)
r_serv_db.hset('to_monitor:sensor:{}'.format(sensor_uuid), 'delta_time', delta_time)
r_serv_db.set('sensors_monitoring:last_updated', int(time.time()))
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def delete_sensor_to_monitor(sensor_uuid):
r_serv_db.srem('to_monitor:sensors', sensor_uuid)
r_serv_db.delete('to_monitor:sensor:{}'.format(sensor_uuid))
r_serv_db.set('sensors_monitoring:last_updated', int(time.time()))
r_serv_db.srem('sensors_monitoring:sensors_error', sensor_uuid)
def api_add_sensor_to_monitor(data_dict):
sensor_uuid = data_dict.get('uuid', None)
delta_time = data_dict.get('delta_time', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
# delta_time
if not delta_time:
return ({"status": "error", "reason": "Mandatory parameter(s) not provided"}, 400)
else:
try:
delta_time = int(delta_time)
if delta_time < 1:
return ({"status": "error", "reason": "Invalid delta_time"}, 400)
except Exception:
return ({"status": "error", "reason": "Invalid delta_time"}, 400)
add_sensor_to_monitor(sensor_uuid, delta_time)
return ({'uuid': sensor_uuid}, 200)
def api_delete_sensor_to_monitor(data_dict):
sensor_uuid = data_dict.get('uuid', None)
if not is_valid_uuid_v4(sensor_uuid):
return ({"status": "error", "reason": "Invalid uuid"}, 400)
sensor_uuid = sensor_uuid.replace('-', '')
if not is_sensor_monitored(sensor_uuid):
return ({"status": "error", "reason": "Sensor not monitored"}, 400)
delete_sensor_to_monitor(sensor_uuid)
return ({'uuid': sensor_uuid}, 200)
### --- END - SENSOR MONITORING --- ###

server/lib/d4_server.py (new executable file)

@@ -0,0 +1,44 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import time
import uuid
import redis
from flask import escape
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_stream = config_loader.get_redis_conn("Redis_STREAM")
config_loader = None
### ###
### BEGIN - SENSOR CONNECTION ###
def get_all_connected_sensors(r_list=False):
res = r_stream.smembers('active_connection')
if r_list:
if res:
return list(res)
else:
return []
else:
return res
def get_all_connected_sensors_by_type(d4_type, d4_extended_type=None):
# D4 extended type
if d4_type == 254 and d4_extended_type:
return r_stream.smembers('active_connection_extended_type:{}'.format(d4_extended_type))
# type 1-253
else:
return r_stream.smembers('active_connection:{}'.format(d4_type))
def is_sensor_connected(sensor_uuid):
return r_stream.sismember('active_connection', sensor_uuid)
### --- END - SENSOR CONNECTION --- ###

server/lib/d4_type.py (new executable file)

@@ -0,0 +1,42 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
def get_all_accepted_format_type(r_list=False):
res = r_serv_metadata.smembers('server:accepted_type')
if r_list:
if res:
return list(res)
else:
return []
return res
def get_all_accepted_extended_type(r_list=False):
res = r_serv_metadata.smembers('server:accepted_extended_type')
if r_list:
if res:
return list(res)
else:
return []
return res
def is_accepted_format_type(format_type):
return r_serv_metadata.sismember('server:accepted_type', format_type)
def is_accepted_extended_type(extended_type):
return r_serv_metadata.sismember('server:accepted_extended_type', extended_type)


@@ -1,8 +1,9 @@
twisted[tls]
redis
-flask
+flask==2.2.2
flask-login
bcrypt
+Werkzeug==2.2.2

#sudo python3 -m pip install --upgrade service_identity

server/sensors_manager.py (new executable file)

@@ -0,0 +1,54 @@
#!/usr/bin/env python3
import os
import sys
import time
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Sensor
### Config ###
config_loader = ConfigLoader.ConfigLoader()
#redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
try:
redis_server_metadata.ping()
except redis.exceptions.ConnectionError:
print('Error: Redis server: Redis_METADATA, ConnectionError')
sys.exit(1)
def reload_all_sensors_to_monitor_dict(dict_to_monitor, last_updated):
    if not dict_to_monitor:
        dict_to_monitor = Sensor.get_all_sensors_to_monitor_dict()
    else:
        monitoring_last_updated = Sensor.get_sensors_monitoring_last_updated()
        if monitoring_last_updated > last_updated:
            dict_to_monitor = Sensor.get_all_sensors_to_monitor_dict()
            last_updated = int(time.time())
            print('updated: List of sensors to monitor')
    # return the refreshed timestamp too, so the caller can track it
    return dict_to_monitor, last_updated

if __name__ == "__main__":
    time_refresh = int(time.time())
    last_updated = time_refresh
    all_sensors_to_monitor = Sensor.get_all_sensors_to_monitor_dict()

    while True:
        for sensor_uuid in all_sensors_to_monitor:
            Sensor._check_sensor_delta(sensor_uuid, all_sensors_to_monitor[sensor_uuid])
        time.sleep(10)

        ## reload dict_to_monitor ##
        curr_time = int(time.time())
        if curr_time - time_refresh >= 60:
            time_refresh = curr_time
            all_sensors_to_monitor, last_updated = reload_all_sensors_to_monitor_dict(all_sensors_to_monitor, last_updated)
        ##-- --##
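The per-sensor check in `Sensor._check_sensor_delta` boils down to a single predicate: flag the sensor when no D4 packet has been seen within its `delta_time` window, or when it has no active connection. A standalone sketch of that predicate (the function name `sensor_in_error` is illustrative, not part of the D4 code):

```python
import time

def sensor_in_error(last_seen, delta_time, connected, now=None):
    # Mirrors _check_sensor_delta: too long since the last D4 packet,
    # or the sensor is not currently connected.
    if now is None:
        now = int(time.time())
    return (now - last_seen) > delta_time or not connected
```

The monitoring loop in sensors_manager.py evaluates this every 10 seconds for each monitored sensor and adds or removes the UUID from the `sensors_monitoring:sensors_error` set accordingly.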


@@ -13,6 +13,8 @@ import argparse
import logging
import logging.handlers

+import configparser
+
from twisted.internet import ssl, task, protocol, endpoints, defer
from twisted.python import log
from twisted.python.modules import getModule
@@ -24,7 +26,6 @@ sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader

hmac_reset = bytearray(32)
-hmac_key = os.getenv('D4_HMAC_KEY', b'private key to change')

accepted_type = [1, 2, 4, 8, 254]
accepted_extended_type = ['ja3-jl']
@@ -46,7 +47,16 @@ redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_respon
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)

# get server_mode
+try:
+    D4server_port = config_loader.get_config_int("D4_Server", "server_port")
+except configparser.NoOptionError:
+    D4server_port = 4443
+
server_mode = config_loader.get_config_str("D4_Server", "server_mode")
+
+try:
+    hmac_key = config_loader.get_config_str("D4_Server", "default_hmac_key")
+except configparser.NoOptionError:
+    hmac_key = 'private key to change'
+
config_loader = None
### ###
@@ -134,10 +144,10 @@ def extract_ip(ip_string):
        return ip_string

def server_mode_registration(header_uuid):
-    # only accept registred uuid
+    # only accept registered uuid
    if server_mode == 'registration':
        if not redis_server_metadata.sismember('registered_uuid', header_uuid):
-            error_msg = 'Not registred UUID={}, connection closed'.format(header_uuid)
+            error_msg = 'Not registered UUID={}, connection closed'.format(header_uuid)
            print(error_msg)
            logger.warning(error_msg)
            #redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: This UUID is temporarily blacklisted')
@@ -245,8 +255,8 @@ class D4_Server(Protocol, TimeoutMixin):
        if not redis_server_stream.exists('active_connection_by_uuid:{}'.format(self.uuid)):
            redis_server_stream.srem('active_connection', self.uuid)
        logger.debug('Connection closed: session_uuid={}'.format(self.session_uuid))
        dict_all_connection.pop(self.session_uuid)

    def unpack_header(self, data):
        data_header = {}
@@ -259,6 +269,20 @@ class D4_Server(Protocol, TimeoutMixin):
            data_header['size'] = struct.unpack('I', data[58:62])[0]
        return data_header

+    def check_hmac_key(self, hmac_header, data):
+        if self.hmac_key is None:
+            self.hmac_key = redis_server_metadata.hget('metadata_uuid:{}'.format(self.uuid), 'hmac_key')
+            if self.hmac_key is None:
+                self.hmac_key = redis_server_metadata.get('server:hmac_default_key')
+
+        # set hmac_header to 0
+        data = data.replace(hmac_header, hmac_reset, 1)
+        HMAC = hmac.new(self.hmac_key, msg=data, digestmod='sha256')
+        hmac_header = hmac_header.hex()
+        # hmac match
+        return hmac_header == HMAC.hexdigest()
+
    def check_connection_validity(self, data_header):
        # blacklist ip by uuid
        if redis_server_metadata.sismember('blacklist_ip_by_uuid', data_header['uuid_header']):
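The `check_hmac_key` method introduced in this commit zeroes the packet's 32-byte HMAC field, recomputes HMAC-SHA256 over the whole packet with the shared key, and compares hex digests. A self-contained sketch of the same round trip outside the Twisted protocol class (the packet layout and helper names here are simplified for illustration):

```python
import hmac

HMAC_RESET = bytes(32)  # the HMAC field is zeroed before (re)computation

def sign_packet(packet_with_zeroed_hmac, key):
    # Sender side: HMAC-SHA256 over the packet with a zeroed HMAC field.
    return hmac.new(key, msg=packet_with_zeroed_hmac, digestmod='sha256').digest()

def check_hmac(packet, hmac_header, key):
    # Receiver side, as in D4_Server.check_hmac_key: zero the field again,
    # recompute, and compare hex digests.
    zeroed = packet.replace(hmac_header, HMAC_RESET, 1)
    expected = hmac.new(key, msg=zeroed, digestmod='sha256')
    return hmac_header.hex() == expected.hexdigest()

key = b'private key to change'
header = b'\x01' + b'\xaa' * 16          # simplified stand-in for a D4 header
zeroed_packet = header + HMAC_RESET + b'payload'
mac = sign_packet(zeroed_packet, key)
packet = zeroed_packet.replace(HMAC_RESET, mac, 1)
print(check_hmac(packet, mac, key))       # -> True
print(check_hmac(packet, mac, b'wrong'))  # -> False
```

Both sides must agree on the key lookup order (per-sensor `hmac_key`, then `server:hmac_default_key`), otherwise every packet fails verification.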
@@ -335,8 +359,14 @@ class D4_Server(Protocol, TimeoutMixin):
                self.type = data_header['type']
                self.uuid = data_header['uuid_header']

-                # worker entry point: map type:session_uuid
-                redis_server_stream.sadd('session_uuid:{}'.format(data_header['type']), self.session_uuid.encode())
+                # # check HMAC /!\ incomplete data
+                # if not self.check_hmac_key(data_header['hmac_header'], data):
+                #     print('hmac do not match')
+                #     print(data)
+                #     logger.debug("HMAC don't match, uuid={}, session_uuid={}".format(self.uuid, self.session_uuid))
+                #     redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'Error', 'Error: HMAC don\'t match')
+                #     self.transport.abortConnection()
+                #     return 1

                ## save active connection ##
                #active Connection
@@ -463,15 +493,6 @@ class D4_Server(Protocol, TimeoutMixin):
    def process_d4_data(self, data, data_header, ip):
        # empty buffer
        self.buffer = b''
-        # set hmac_header to 0
-        data = data.replace(data_header['hmac_header'], hmac_reset, 1)
-        if self.hmac_key is None:
-            self.hmac_key = redis_server_metadata.hget('metadata_uuid:{}'.format(data_header['uuid_header']), 'hmac_key')
-            if self.hmac_key is None:
-                self.hmac_key = redis_server_metadata.get('server:hmac_default_key')
-        HMAC = hmac.new(self.hmac_key, msg=data, digestmod='sha256')
-        data_header['hmac_header'] = data_header['hmac_header'].hex()

        ### Debug ###
        #print('hexdigest: {}'.format( HMAC.hexdigest() ))
@@ -484,7 +505,7 @@ class D4_Server(Protocol, TimeoutMixin):
        ### ###

        # hmac match
-        if data_header['hmac_header'] == HMAC.hexdigest():
+        if self.check_hmac_key(data_header['hmac_header'], data):
            if not self.stream_max_size:
                temp = redis_server_metadata.hget('stream_max_size_by_uuid', data_header['uuid_header'])
                if temp is not None:
@@ -509,12 +530,16 @@ class D4_Server(Protocol, TimeoutMixin):
                    redis_server_metadata.zincrby('stat_uuid_type:{}:{}'.format(date, data_header['uuid_header']), 1, data_header['type'])
                    #
+                    d4_packet_rcv_time = int(time.time())
                    if not redis_server_metadata.hexists('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen'):
-                        redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen', data_header['timestamp'])
-                    redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'last_seen', data_header['timestamp'])
-                    redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'last_seen', data_header['timestamp'])
+                        redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'first_seen', d4_packet_rcv_time)
+                    redis_server_metadata.hset('metadata_uuid:{}'.format(data_header['uuid_header']), 'last_seen', d4_packet_rcv_time)
+                    redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'last_seen', d4_packet_rcv_time)

                    if not self.data_saved:
+                        # worker entry point: map type:session_uuid
+                        redis_server_stream.sadd('session_uuid:{}'.format(data_header['type']), self.session_uuid.encode())
                        #UUID IP: ## TODO: use d4 timestamp ?
                        redis_server_metadata.lpush('list_uuid_ip:{}'.format(data_header['uuid_header']), '{}-{}'.format(ip, datetime.datetime.now().strftime("%Y%m%d%H%M%S")))
                        redis_server_metadata.ltrim('list_uuid_ip:{}'.format(data_header['uuid_header']), 0, 15)
@@ -523,7 +548,7 @@ class D4_Server(Protocol, TimeoutMixin):
                        if self.update_stream_type:
                            if not redis_server_metadata.hexists('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'first_seen'):
-                                redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'first_seen', data_header['timestamp'])
+                                redis_server_metadata.hset('metadata_type_by_uuid:{}:{}'.format(data_header['uuid_header'], data_header['type']), 'first_seen', d4_packet_rcv_time)
                            self.update_stream_type = False
                    return 0
                else:
@@ -554,7 +579,7 @@ def main(reactor):
    certificate = ssl.PrivateCertificate.loadPEM(certData)
    factory = protocol.Factory.forProtocol(D4_Server)
    # use interface to support both IPv4 and IPv6
-    reactor.listenSSL(4443, factory, certificate.options(), interface='::')
+    reactor.listenSSL(D4server_port, factory, certificate.options(), interface='::')
    return defer.Deferred()

server/update/update_v0.5.py (new executable file)

@@ -0,0 +1,38 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
import os
import sys
import datetime
import time
import uuid
import redis
sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue
import d4_type
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
config_loader = None
### ###
if __name__ == '__main__':
for format_type in d4_type.get_all_accepted_format_type():
format_type = int(format_type)
for queue_uuid in Analyzer_Queue.get_all_queues_by_type(format_type):
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', format_type)
r_serv_metadata.sadd('all:analyzer:format_type', format_type)
r_serv_metadata.sadd('all:analyzer:by:format_type:{}'.format(format_type), queue_uuid)
for extended_type in d4_type.get_all_accepted_extended_type():
for queue_uuid in Analyzer_Queue.get_all_queues_by_extended_type(extended_type):
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'type', 254)
r_serv_metadata.hset('analyzer:{}'.format(queue_uuid), 'metatype', extended_type)
r_serv_metadata.sadd('all:analyzer:extended_type', extended_type)
r_serv_metadata.sadd('all:analyzer:format_type', 254)
r_serv_metadata.sadd('all:analyzer:by:extended_type:{}'.format(extended_type), queue_uuid)
r_serv_metadata.sadd('all:analyzer:by:format_type:254', queue_uuid)


@@ -29,10 +29,13 @@ sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
from User import User
import Sensor
import ConfigLoader
+import Analyzer_Queue

# Import Blueprint
from blueprints.restApi import restApi
from blueprints.settings import settings
+from blueprints.analyzer_queue import analyzer_queue
+from blueprints.D4_sensors import D4_sensors

baseUrl = ''
if baseUrl != '':
@@ -62,6 +65,17 @@ server_mode = config_loader.get_config_str("D4_Server", "server_mode")
if server_mode not in all_server_modes:
    print('Error: incorrect server_mode')

+try:
+    FLASK_HOST = config_loader.get_config_str("Flask_Server", "host")
+except Exception as e:
+    print(e)
+    FLASK_HOST = '127.0.0.1'
+try:
+    FLASK_PORT = config_loader.get_config_int("Flask_Server", "port")
+except Exception:
+    FLASK_PORT = 7000
+
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM")
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA")
redis_users = config_loader.get_redis_conn("Redis_SERV")
@@ -88,6 +102,9 @@ ssl_context.load_cert_chain(certfile=os.path.join(Flask_dir, 'server.crt'), keyf
app = Flask(__name__, static_url_path=baseUrl+'/static/')
app.config['MAX_CONTENT_LENGTH'] = 900 * 1024 * 1024

+# ========= Cookie name ========
+app.config.update(SESSION_COOKIE_NAME='d4_project_server{}'.format(uuid.uuid4().int))
+
# ========= session ========
app.secret_key = str(random.getrandbits(256))
login_manager = LoginManager()
@@ -98,6 +115,8 @@ login_manager.init_app(app)
# ========= BLUEPRINT =========#
app.register_blueprint(restApi)
app.register_blueprint(settings)
+app.register_blueprint(analyzer_queue)
+app.register_blueprint(D4_sensors)
# ========= =========#

# ========= LOGIN MANAGER ========
@@ -265,7 +284,7 @@ def login():
    if request.method == 'POST':
        username = request.form.get('username')
        password = request.form.get('password')
-        #next_page = request.form.get('next_page')
+        next_page = request.form.get('next_page')

        if username is not None:
            user = User.get(username)
@@ -282,11 +301,16 @@ def login():
                #if not check_user_role_integrity(user.get_id()):
                #    error = 'Incorrect User ACL, Please contact your administrator'
                #    return render_template("login.html", error=error)
+                if not user.is_in_role('user'):
+                    return render_template("403.html"), 403
                login_user(user) ## TODO: use remember me ?
                if user.request_password_change():
                    return redirect(url_for('change_password'))
                else:
-                    return redirect(url_for('index'))
+                    if next_page and next_page != 'None':
+                        return redirect(next_page)
+                    else:
+                        return redirect(url_for('index'))
            # login failed
            else:
                # set brute force protection
@@ -302,12 +326,13 @@ def login():
            return 'please provide a valid username'
    else:
-        #next_page = request.args.get('next')
+        next_page = request.args.get('next')
        error = request.args.get('error')
-        return render_template("login.html", error=error)
+        return render_template("login.html", error=error, next_page=next_page)

@app.route('/change_password', methods=['POST', 'GET'])
@login_required
+@login_user_basic
def change_password():
    password1 = request.form.get('password1')
    password2 = request.form.get('password2')
@@ -342,7 +367,7 @@ def logout():
@app.route('/role', methods=['POST', 'GET'])
@login_required
def role():
-    return render_template("error/403.html"), 403
+    return render_template("403.html"), 403

@app.route('/')
@login_required
@@ -507,50 +532,29 @@ def server_management():
            description = 'Please update your web server'

        list_analyzer_uuid = []
-        for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
-            size_limit = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
-            if size_limit is None:
-                size_limit = analyzer_list_max_default_size
-            last_updated = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'last_updated')
-            if last_updated is None:
-                last_updated = 'Never'
-            else:
-                last_updated = datetime.datetime.fromtimestamp(float(last_updated)).strftime('%Y-%m-%d %H:%M:%S')
-            description_analyzer = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'description')
-            if description_analyzer is None:
-                description_analyzer = ''
-            len_queue = redis_server_analyzer.llen('analyzer:{}:{}'.format(type, analyzer_uuid))
-            if len_queue is None:
-                len_queue = 0
-            list_analyzer_uuid.append({'uuid': analyzer_uuid, 'description': description_analyzer, 'size_limit': size_limit, 'last_updated': last_updated, 'length': len_queue})
+        for analyzer_uuid in Analyzer_Queue.get_all_queues_by_type(type):
+            list_analyzer_uuid.append(Analyzer_Queue.get_queue_metadata(analyzer_uuid, format_type=type))
+        for analyzer_uuid in Analyzer_Queue.get_all_queues_group_by_type(type):
+            list_analyzer_uuid.append(Analyzer_Queue.get_queue_metadata(analyzer_uuid, format_type=type, force_is_group_queue=True))
        list_accepted_types.append({"id": int(type), "description": description, 'list_analyzer_uuid': list_analyzer_uuid})

    list_accepted_extended_types = []
+    l_queue_extended_type = []
    for extended_type in redis_server_metadata.smembers('server:accepted_extended_type'):
-        list_analyzer_uuid = []
-        for analyzer_uuid in redis_server_metadata.smembers('analyzer:254:{}'.format(extended_type)):
-            size_limit = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
-            if size_limit is None:
-                size_limit = analyzer_list_max_default_size
-            last_updated = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'last_updated')
-            if last_updated is None:
-                last_updated = 'Never'
-            else:
-                last_updated = datetime.datetime.fromtimestamp(float(last_updated)).strftime('%Y-%m-%d %H:%M:%S')
-            description_analyzer = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'description')
-            if description_analyzer is None:
-                description_analyzer = ''
-            len_queue = redis_server_analyzer.llen('analyzer:{}:{}'.format(extended_type, analyzer_uuid))
-            if len_queue is None:
-                len_queue = 0
-            list_analyzer_uuid.append({'uuid': analyzer_uuid, 'description': description_analyzer, 'size_limit': size_limit, 'last_updated': last_updated, 'length': len_queue})
-        list_accepted_extended_types.append({"name": extended_type, 'list_analyzer_uuid': list_analyzer_uuid})
+        list_accepted_extended_types.append({"name": extended_type, 'list_analyzer_uuid': []})
+        for extended_queue_uuid in Analyzer_Queue.get_all_queues_by_extended_type(extended_type):
+            l_queue_extended_type.append(Analyzer_Queue.get_queue_metadata(extended_queue_uuid, format_type=254, extended_type=extended_type))
+        for extended_queue_uuid in Analyzer_Queue.get_all_queues_group_by_extended_type(extended_type):
+            l_queue_extended_type.append(Analyzer_Queue.get_queue_metadata(extended_queue_uuid, format_type=254, extended_type=extended_type, force_is_group_queue=True))

    return render_template("server_management.html", list_accepted_types=list_accepted_types, list_accepted_extended_types=list_accepted_extended_types,
                           server_mode=server_mode,
+                           l_queue_extended_type=l_queue_extended_type,
                           nb_sensors_registered=nb_sensors_registered, nb_sensors_pending=nb_sensors_pending,
                           default_analyzer_max_line_len=default_analyzer_max_line_len,
                           blacklisted_ip=blacklisted_ip, unblacklisted_ip=unblacklisted_ip,
@@ -598,6 +602,8 @@ def uuid_management():
                  "blacklisted_uuid": blacklisted_uuid, "blacklisted_ip_by_uuid": blacklisted_ip_by_uuid,
                  "first_seen_gmt": first_seen_gmt, "last_seen_gmt": last_seen_gmt, "Error": Error}
+    data_uuid['is_monitored'] = Sensor.is_sensor_monitored(uuid_sensor)
     if redis_server_stream.sismember('active_connection', uuid_sensor):
         active_connection = True
     else:
@@ -776,54 +782,18 @@ def uuid_change_description():
     else:
         return jsonify({'error':'invalid uuid'}), 400
-# # TODO: check analyser uuid dont exist
-@app.route('/add_new_analyzer')
-@login_required
-@login_user_basic
-def add_new_analyzer():
-    type = request.args.get('type')
-    user = request.args.get('redirect')
-    metatype_name = request.args.get('metatype_name')
-    analyzer_description = request.args.get('analyzer_description')
-    analyzer_uuid = request.args.get('analyzer_uuid')
-    if is_valid_uuid_v4(analyzer_uuid):
-        try:
-            type = int(type)
-            if type < 0:
-                return 'type, Invalid Integer'
-        except:
-            return 'type, Invalid Integer'
-        if type == 254:
-            # # TODO: check metatype_name
-            redis_server_metadata.sadd('analyzer:{}:{}'.format(type, metatype_name), analyzer_uuid)
-        else:
-            redis_server_metadata.sadd('analyzer:{}'.format(type), analyzer_uuid)
-        if redis_server_metadata.exists('analyzer:{}:{}'.format(type, metatype_name)) or redis_server_metadata.exists('analyzer:{}'.format(type)):
-            redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'description', analyzer_description)
-        if user:
-            return redirect(url_for('server_management'))
-    else:
-        return 'Invalid uuid'
 @app.route('/empty_analyzer_queue')
 @login_required
 @login_user_basic
 def empty_analyzer_queue():
     analyzer_uuid = request.args.get('analyzer_uuid')
-    type = request.args.get('type')
+    format_type = request.args.get('type')
     metatype_name = request.args.get('metatype_name')
     user = request.args.get('redirect')
     if is_valid_uuid_v4(analyzer_uuid):
-        try:
-            type = int(type)
-            if type < 0:
-                return 'type, Invalid Integer'
-        except:
-            return 'type, Invalid Integer'
-        if type == 254:
-            redis_server_analyzer.delete('analyzer:{}:{}'.format(metatype_name, analyzer_uuid))
-        else:
-            redis_server_analyzer.delete('analyzer:{}:{}'.format(type, analyzer_uuid))
+        if format_type == 254:
+            format_type = metatype_name
+        Analyzer_Queue.flush_queue(analyzer_uuid, format_type)
         if user:
             return redirect(url_for('server_management'))
     else:
@@ -834,23 +804,11 @@ def empty_analyzer_queue():
 @login_user_basic
 def remove_analyzer():
     analyzer_uuid = request.args.get('analyzer_uuid')
-    type = request.args.get('type')
+    format_type = request.args.get('type')
     metatype_name = request.args.get('metatype_name')
     user = request.args.get('redirect')
     if is_valid_uuid_v4(analyzer_uuid):
-        try:
-            type = int(type)
-            if type < 0:
-                return 'type, Invalid Integer'
-        except:
-            return 'type, Invalid Integer'
-        if type == 254:
-            redis_server_metadata.srem('analyzer:{}:{}'.format(type, metatype_name), analyzer_uuid)
-            redis_server_analyzer.delete('analyzer:{}:{}'.format(metatype_name, analyzer_uuid))
-        else:
-            redis_server_metadata.srem('analyzer:{}'.format(type), analyzer_uuid)
-            redis_server_analyzer.delete('analyzer:{}:{}'.format(type, analyzer_uuid))
-        redis_server_metadata.delete('analyzer:{}'.format(analyzer_uuid))
+        Analyzer_Queue.remove_queues(analyzer_uuid, format_type)
         if user:
             return redirect(url_for('server_management'))
     else:
@@ -864,13 +822,7 @@ def analyzer_change_max_size():
     user = request.args.get('redirect')
     max_size_analyzer = request.args.get('max_size_analyzer')
     if is_valid_uuid_v4(analyzer_uuid):
-        try:
-            max_size_analyzer = int(max_size_analyzer)
-            if max_size_analyzer < 0:
-                return 'analyzer max size, Invalid Integer'
-        except:
-            return 'analyzer max size, Invalid Integer'
-        redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'max_size', max_size_analyzer)
+        Analyzer_Queue.edit_queue_max_size(analyzer_uuid, max_size_analyzer)
         if user:
             return redirect(url_for('server_management'))
     else:
@@ -1198,4 +1150,4 @@ def get_uuid_stats_history_json():
 if __name__ == "__main__":
-    app.run(host='0.0.0.0', port=7000, threaded=True, ssl_context=ssl_context)
+    app.run(host=FLASK_HOST, port=FLASK_PORT, threaded=True, ssl_context=ssl_context)
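The hunk above stops hard-coding the bind address and port and reads `FLASK_HOST`/`FLASK_PORT` from the server configuration instead. A minimal sketch of that pattern with the stdlib `configparser` (the section and key names here are illustrative, not the actual d4-core config layout):

```python
import configparser

# Hypothetical config snippet; d4-core keeps similar settings in its server config file.
CONFIG_TEXT = """
[Flask_Server]
host = 127.0.0.1
port = 7000
"""

def load_flask_bind(config_text):
    # Parse the config and fall back to safe defaults when keys are absent.
    cfg = configparser.ConfigParser()
    cfg.read_string(config_text)
    host = cfg.get('Flask_Server', 'host', fallback='127.0.0.1')
    port = cfg.getint('Flask_Server', 'port', fallback=7000)
    return host, port

FLASK_HOST, FLASK_PORT = load_flask_bind(CONFIG_TEXT)
```

Keeping the bind address in config also lets deployments default to loopback instead of `0.0.0.0`, which is the safer out-of-the-box posture for an admin interface.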


@@ -109,7 +109,7 @@ def create_user_db(username_id, password, default=False, role=None, update=False):
     r_serv_db.hset('user:all', username_id, password_hash)

-def edit_user_db(user_id, role, password=None):
+def edit_user_db(user_id, password=None, role=None):
     if password:
         password_hash = hashing_password(password.encode())
         r_serv_db.hset('user:all', user_id, password_hash)


@@ -0,0 +1,76 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for all D4 sensors
'''
import os
import re
import sys
import json
import redis

sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
import Sensor

from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required, current_user
from Role_Manager import login_admin, login_user_basic

# ============ BLUEPRINT ============
D4_sensors = Blueprint('D4_sensors', __name__, template_folder='templates')

# ============ VARIABLES ============
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###

# ============ FUNCTIONS ============

# ============= ROUTES ==============
@D4_sensors.route("/sensors/monitoring/add", methods=['GET'])
@login_required
@login_user_basic
def add_sensor_to_monitor():
    sensor_uuid = request.args.get("uuid")
    return render_template("sensors/add_sensor_to_monitor.html",
                           sensor_uuid=sensor_uuid)

@D4_sensors.route("/sensors/monitoring/add_post", methods=['POST'])
@login_required
@login_user_basic
def add_sensor_to_monitor_post():
    sensor_uuid = request.form.get("uuid")
    delta_time = request.form.get("delta_time")
    res = Sensor.api_add_sensor_to_monitor({'uuid': sensor_uuid, 'delta_time': delta_time})
    if res:
        return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
    return redirect(url_for('uuid_management', uuid=sensor_uuid))

@D4_sensors.route("/sensors/monitoring/delete", methods=['GET'])
@login_required
@login_user_basic
def delete_sensor_to_monitor():
    sensor_uuid = request.args.get("uuid")
    res = Sensor.api_delete_sensor_to_monitor({'uuid': sensor_uuid})
    if res:
        return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
    return redirect(url_for('uuid_management', uuid=sensor_uuid))

@D4_sensors.route("/sensors/monitoring/errors", methods=['GET'])
@login_required
@login_user_basic
def get_all_sensors_connection_errors():
    res = Sensor.api_get_all_sensors_connection_errors()
    return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
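The monitoring handlers above follow one convention: a `Sensor.api_*` helper returns nothing on success and an `(error_dict, http_code)` tuple on failure, which the route either serializes as JSON or follows with a redirect. A stripped-down sketch of that contract, with hypothetical stand-ins for the real Redis-backed helpers and no Flask required:

```python
def api_add_sensor_to_monitor(data):
    # Return None on success, (error_dict, http_code) on failure.
    if not data.get('uuid'):
        return ({'error': 'invalid uuid'}, 400)
    try:
        delta = int(data.get('delta_time', 0))
    except (TypeError, ValueError):
        return ({'error': 'invalid delta_time'}, 400)
    if delta < 30:
        return ({'error': 'delta_time too small'}, 400)
    return None

def handle_request(form):
    res = api_add_sensor_to_monitor(form)
    if res:
        return res            # error payload + status, serialized by the route
    return ('redirect', 302)  # success: redirect back to uuid_management
```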


@@ -0,0 +1,122 @@
#!/usr/bin/env python3
# -*-coding:UTF-8 -*
'''
Flask functions and routes for the rest api
'''
import os
import re
import sys
import redis

sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib'))
import ConfigLoader
import Analyzer_Queue

from flask import Flask, render_template, jsonify, request, Blueprint, redirect, url_for, Response
from flask_login import login_required, current_user
from Role_Manager import login_admin, login_user_basic

# ============ BLUEPRINT ============
analyzer_queue = Blueprint('analyzer_queue', __name__, template_folder='templates')

# ============ VARIABLES ============
### Config ###
config_loader = ConfigLoader.ConfigLoader()
r_serv_metadata = config_loader.get_redis_conn("Redis_METADATA")
r_serv_db = config_loader.get_redis_conn("Redis_SERV")
config_loader = None
### ###

# ============ FUNCTIONS ============

# ============= ROUTES ==============
@analyzer_queue.route("/analyzer_queue/create_queue", methods=['GET'])
@login_required
@login_user_basic
def create_analyzer_queue():
    return render_template("analyzer_queue/queue_creator.html")

@analyzer_queue.route("/analyzer_queue/create_queue_post", methods=['POST'])
@login_required
@login_user_basic
def create_analyzer_queue_post():
    l_queue_meta = ['analyzer_type', 'analyzer_metatype', 'description', 'analyzer_uuid']
    queue_type = request.form.get("analyzer_type")
    queue_metatype = request.form.get("analyzer_metatype")
    queue_description = request.form.get("description")
    queue_uuid = request.form.get("analyzer_uuid")
    queue_type = Analyzer_Queue.sanitize_queue_type(queue_type)
    # unpack uuid group
    l_uuid = set()
    l_invalid_uuid = set()
    for obj_tuple in list(request.form):
        if obj_tuple not in l_queue_meta:
            sensor_uuid = request.form.get(obj_tuple)
            if Analyzer_Queue.is_valid_uuid_v4(sensor_uuid):
                l_uuid.add(sensor_uuid)
            else:
                if sensor_uuid:
                    l_invalid_uuid.add(sensor_uuid)
    l_uuid = list(l_uuid)
    l_invalid_uuid = list(l_invalid_uuid)
    if l_invalid_uuid:
        return render_template("analyzer_queue/queue_creator.html", queue_uuid=queue_uuid, queue_type=queue_type, metatype_name=queue_metatype,
                               description=queue_description, l_uuid=l_uuid, l_invalid_uuid=l_invalid_uuid)
    res = Analyzer_Queue.create_queues(queue_type, queue_uuid=queue_uuid, l_uuid=l_uuid, metatype_name=queue_metatype, description=queue_description)
    if isinstance(res, dict):
        return jsonify(res)
    if res:
        return redirect(url_for('server_management', _anchor=res))

@analyzer_queue.route("/analyzer_queue/edit_queue", methods=['GET'])
@login_required
@login_user_basic
def edit_queue_analyzer_queue():
    queue_uuid = request.args.get("queue_uuid")
    queue_metadata = Analyzer_Queue.get_queue_metadata(queue_uuid, is_group=True)
    if 'is_group_queue' in queue_metadata:
        l_sensors_uuid = Analyzer_Queue.get_queue_group_all_sensors(queue_uuid)
    else:
        l_sensors_uuid = None
    return render_template("analyzer_queue/queue_editor.html", queue_metadata=queue_metadata, l_sensors_uuid=l_sensors_uuid)

@analyzer_queue.route("/analyzer_queue/edit_queue_post", methods=['POST'])
@login_required
@login_user_basic
def edit_queue_analyzer_queue_post():
    l_queue_meta = ['queue_uuid', 'description']
    queue_uuid = request.form.get("queue_uuid")
    queue_description = request.form.get("description")
    l_uuid = set()
    l_invalid_uuid = set()
    for obj_tuple in list(request.form):
        if obj_tuple not in l_queue_meta:
            sensor_uuid = request.form.get(obj_tuple)
            if Analyzer_Queue.is_valid_uuid_v4(sensor_uuid):
                l_uuid.add(sensor_uuid)
            else:
                if sensor_uuid:
                    l_invalid_uuid.add(sensor_uuid)
    if l_invalid_uuid:
        queue_metadata = Analyzer_Queue.get_queue_metadata(queue_uuid, is_group=True)
        if queue_description:
            queue_metadata['description'] = queue_description
        return render_template("analyzer_queue/queue_editor.html", queue_metadata=queue_metadata, l_sensors_uuid=l_uuid, l_invalid_uuid=l_invalid_uuid)
    Analyzer_Queue.edit_queue_description(queue_uuid, queue_description)
    Analyzer_Queue.edit_queue_sensors_set(queue_uuid, l_uuid)
    return redirect(url_for('analyzer_queue.edit_queue_analyzer_queue', queue_uuid=queue_uuid))
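Both form handlers above lean on `Analyzer_Queue.is_valid_uuid_v4` to separate well-formed sensor UUIDs from garbage before touching Redis. A plausible stdlib-only version of such a check (the real helper in the d4-core library may differ in its exact strictness):

```python
import uuid

def is_valid_uuid_v4(candidate):
    # Accept only strings that parse as a UUID and are specifically version 4.
    if not isinstance(candidate, str):
        return False
    try:
        parsed = uuid.UUID(candidate, version=4)
    except ValueError:
        return False
    # uuid.UUID(..., version=4) forces the version bits, so round-trip the
    # canonical form to reject UUIDs of other versions.
    return str(parsed) == candidate.lower()
```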


@@ -142,8 +142,8 @@ def is_valid_uuid_v4(header_uuid):
     except:
         return False

-def one():
-    return 1
+def build_json_response(resp_data, resp_code):
+    return Response(json.dumps(resp_data, indent=2, sort_keys=True), mimetype='application/json'), resp_code

 # ============= ROUTES ==============

@@ -154,3 +154,9 @@ def add_sensor_register():
     data = request.get_json()
     res = Sensor.register_sensor(data)
     return Response(json.dumps(res[0], indent=2, sort_keys=True), mimetype='application/json'), res[1]
+
+@restApi.route("/api/v1/sensors/monitoring/errors", methods=['GET'])
+@token_required('user')
+def get_all_sensors_connection_errors():
+    res = Sensor.api_get_all_sensors_connection_errors()
+    return build_json_response(res[0], res[1])


@@ -25,7 +25,7 @@ if __name__ == "__main__":
     username = 'admin@admin.test'
     password = gen_password()
-    if r_serv.exists('user_metadata:admin@admin.test'):
+    if r_serv.exists('user_metadata:{}'.format(username)):
         edit_user_db(username, password=password, role='admin')
     else:
         create_user_db(username, password, role='admin', default=True)
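The companion change to `edit_user_db` (from `(user_id, role, password=None)` to `(user_id, password=None, role=None)`) is safe precisely because callers like the snippet above pass the optional fields by keyword: with keyword-style calls, optional parameters can be reordered without breaking call sites. A minimal illustration with a hypothetical stand-in for the real Redis-backed function:

```python
def edit_user_db(user_id, password=None, role=None):
    # Hypothetical stand-in: record which fields would be updated in Redis.
    updates = {}
    if password:
        updates['password'] = password
    if role:
        updates['role'] = role
    return user_id, updates

# Callers pass optional fields by keyword, so argument order cannot bite them.
user, updates = edit_user_db('admin@admin.test', password='s3cret', role='admin')
```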


@@ -0,0 +1,7 @@
<div class="input-group mb-1">
  <input type="text" class="form-control col-10 {%if error%}is-invalid{%else%}is-valid{%endif%}" name="{{sensor_uuid}}" value="{{sensor_uuid}}">
  <span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>
  <div class="invalid-feedback">
    Please provide a valid UUID v4.
  </div>
</div>


@@ -0,0 +1,157 @@
<!DOCTYPE html>
<html>
<head>
  <title>D4-Project</title>
  <link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
  <!-- Core CSS -->
  <link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
  <link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
  <link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
  <!-- JS -->
  <script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
  <script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
  <style>
    .popover {
      max-width: 100%;
    }
  </style>
</head>
<body>
  {% include 'navbar.html' %}
  <div class="card mb-3 mt-1">
    <div class="card-header text-white bg-dark">
      <h5 class="card-title">Create Analyzer Queue</h5>
    </div>
    <div class="card-body">
      <form action="{{ url_for('analyzer_queue.create_analyzer_queue_post') }}" method="post" enctype="multipart/form-data" onsubmit="submitPaste()">
        <div class="form-group mb-4">
          <label for="analyzer_type"><b>Analyzer Type</b></label>
          <input class="form-control col-md-4" type="number" name="analyzer_type" id="analyzer_type" value="{%if queue_type%}{{queue_type}}{%else%}1{%endif%}" min="1" max="254" required>
          <input class="form-control" type="text" name="analyzer_metatype" id="analyzer_metatype_name" placeholder="Meta Type Name" {%if metatype_name%}value="{{metatype_name}}"{%endif%}>
        </div>
        <div class="input-group my-2">
          <div class="input-group-prepend">
            <button class="btn btn-outline-secondary" type="button" onclick="generate_new_uuid();"><i class="fa fa-random"></i></button>
          </div>
          <input class="form-control col-md-4" type="text" name="analyzer_uuid" id="analyzer_uuid" {%if queue_uuid%}value="{{queue_uuid}}"{%endif%} placeholder="Analyzer uuid - (Optional)">
        </div>
        <div class="form-group my-2">
          <input class="form-control" type="text" name="description" id="analyzer_description" {%if description%}value="{{description}}"{%endif%} placeholder="Description - (Optional)">
        </div>
        <div id="container-id-to-import">
          <p>Create Queue by Group of UUID</p>
          <div for="first_sensor_uuid"><b>Sensor UUID</b></div>
          <div class="form-horizontal">
            <div class="form-body">
              <div class="form-group">
                <div class="fields">
                  {% if l_uuid %}
                    {% for sensor_uuid in l_uuid %}
                      {% with sensor_uuid=sensor_uuid, error=False %}
                        {% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
                      {% endwith %}
                    {% endfor %}
                    <br>
                  {% endif %}
                  {% if l_invalid_uuid %}
                    {% for sensor_uuid in l_invalid_uuid %}
                      {% with sensor_uuid=sensor_uuid, error=True %}
                        {% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
                      {% endwith %}
                    {% endfor %}
                    <br>
                  {% endif %}
                  <div class="input-group mb-1">
                    <input type="text" class="form-control col-10" name="first_sensor_uuid" id="first_sensor_uuid">
                    <span class="btn btn-info input-group-addon add-field col-2"><i class="fa fa-plus"></i></span>
                  </div>
                  <span class="help-block" hidden>Export Objects</span>
                </div>
              </div>
            </div>
          </div>
        </div>
        <div class="form-group">
          <button class="btn btn-info" type="submit">Create Queue</button>
        </div>
      </form>
    </div>
  </div>
  {% include 'navfooter.html' %}
</body>
<script>
  $(document).ready(function(){
    {%if queue_type!=2 and queue_type!=254%}
      $('#analyzer_metatype_name').hide();
    {%endif%}
  });

  var input_part_1 = '<div class="input-group mb-1"><input type="text" class="form-control col-10" name="'
  var input_part_2 = '"></div>'
  var minusButton = '<span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>'

  $('.add-field').click(function() {
    var new_uuid = uuidv4();
    var template = input_part_1 + new_uuid + input_part_2;
    var temp = $(template).insertBefore('.help-block');
    temp.append(minusButton);
  });

  $('.fields').on('click', '.delete-field', function(){
    $(this).parent().remove();
  });

  function uuidv4() {
    return ([1e7]+-1e3+-4e3+-8e3+-1e11).replace(/[018]/g, c =>
      (c ^ crypto.getRandomValues(new Uint8Array(1))[0] & 15 >> c / 4).toString(16)
    );
  }

  $('#analyzer_type').on('input', function() {
    if ($('#analyzer_type').val() == 2 || $('#analyzer_type').val() == 254){
      $('#analyzer_metatype_name').show()
    } else {
      $('#analyzer_metatype_name').hide()
    }
  });

  function generate_new_uuid(){
    $.getJSON( "{{url_for('generate_uuid')}}", function( data ) {
      $( "#analyzer_uuid" ).val(data['uuid']);
    });
  }
</script>
</html>


@@ -0,0 +1,168 @@
<!DOCTYPE html>
<html>
<head>
  <title>D4-Project</title>
  <link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
  <!-- Core CSS -->
  <link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
  <link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
  <link href="{{ url_for('static', filename='css/dataTables.bootstrap.min.css') }}" rel="stylesheet">
  <!-- JS -->
  <script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
  <script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/jquery.dataTables.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/dataTables.bootstrap.min.js')}}"></script>
  <style>
    .popover {
      max-width: 100%;
    }
  </style>
</head>
<body>
  {% include 'navbar.html' %}
  <div class="card mb-3 mt-1">
    <div class="card-header text-white bg-dark">
      <h5 class="card-title">Analyzer Queue: <b>{{queue_metadata['uuid']}}</b></h5>
    </div>
    <div class="card-body">
      <table class="table table-striped table-bordered">
        <thead class="thead-dark">
          <tr>
            <th>Type Name</th>
            <th>Group</th>
            <th style="max-width: 800px;">Name</th>
            <th style="max-width: 800px;">Last updated</th>
            <th style="max-width: 800px;">Change max size limit</th>
          </tr>
        </thead>
        <tbody>
          <tr>
            <td>
              {%if queue_metadata['format_type'] == 254%}
                {{queue_metadata['extended_type']}}
              {%else%}
                {{queue_metadata['format_type']}}
              {%endif%}
            </td>
            {%if queue_metadata['is_group_queue']%}
              <td class="text-center"><i class="fa fa-group"></i></td>
            {%else%}
              <td></td>
            {%endif%}
            <td>
              <div class="d-flex">
                <b>{{queue_metadata['uuid']}}:{{queue_metadata['format_type']}}{%if queue_metadata['format_type'] == 254%}:{{queue_metadata['extended_type']}}{%endif%}</b>
              </div>
            </td>
            <td>{{queue_metadata['last_updated']}}</td>
            <td>
              <div class="d-xl-flex justify-content-xl-center">
                <input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{queue_metadata['uuid']}}" value="{{queue_metadata['size_limit']}}" min="0" required="">
                <button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{queue_metadata['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{queue_metadata['uuid']}}').val();">Change Max Size</button>
              </div>
            </td>
          </tr>
        </tbody>
      </table>
      <form action="{{ url_for('analyzer_queue.edit_queue_analyzer_queue_post') }}" method="post" enctype="multipart/form-data">
        <input class="form-control" type="text" name="queue_uuid" id="queue_uuid" value="{{queue_metadata['uuid']}}" hidden>
        <div class="form-group my-2">
          <input class="form-control" type="text" name="description" id="analyzer_description" {%if 'description' in queue_metadata%}value="{{queue_metadata['description']}}"{%endif%} placeholder="Description - (Optional)">
        </div>
        <div>
          <br>
          <div for="first_sensor_uuid"><b>Sensor UUID</b></div>
          <div class="form-horizontal">
            <div class="form-body">
              <div class="form-group">
                <div class="fields">
                  {% if l_sensors_uuid %}
                    {% for sensor_uuid in l_sensors_uuid %}
                      {% with sensor_uuid=sensor_uuid, error=False %}
                        {% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
                      {% endwith %}
                    {% endfor %}
                    <br>
                  {% endif %}
                  {% if l_invalid_uuid %}
                    {% for sensor_uuid in l_invalid_uuid %}
                      {% with sensor_uuid=sensor_uuid, error=True %}
                        {% include 'analyzer_queue/block_add_sensor_to_group_block.html' %}
                      {% endwith %}
                    {% endfor %}
                    <br>
                  {% endif %}
                  <div class="input-group mb-1">
                    <input type="text" class="form-control col-10" name="first_sensor_uuid" id="first_sensor_uuid">
                    <span class="btn btn-info input-group-addon add-field col-2"><i class="fa fa-plus"></i></span>
                  </div>
                  <span class="help-block" hidden>Sensor UUID</span>
                </div>
              </div>
            </div>
          </div>
        </div>
        <div class="form-group">
          <button class="btn btn-info" type="submit">Edit Queue</button>
        </div>
      </form>
    </div>
  </div>
  {% include 'navfooter.html' %}
</body>
<script>
  var input_part_1 = '<div class="input-group mb-1"><input type="text" class="form-control col-10" name="'
  var input_part_2 = '"></div>'
  var minusButton = '<span class="btn btn-danger input-group-addon delete-field col-2"><i class="fa fa-trash"></i></span>'

  $('.add-field').click(function() {
    var new_uuid = uuidv4();
    var template = input_part_1 + new_uuid + input_part_2;
    var temp = $(template).insertBefore('.help-block');
    temp.append(minusButton);
  });

  $('.fields').on('click', '.delete-field', function(){
    $(this).parent().remove();
  });

  function uuidv4() {
    return ([1e7]+-1e3+-4e3+-8e3+-1e11).replace(/[018]/g, c =>
      (c ^ crypto.getRandomValues(new Uint8Array(1))[0] & 15 >> c / 4).toString(16)
    );
  }
</script>
</html>
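Both analyzer-queue templates embed the same compact JavaScript `uuidv4()` generator: it fills random bytes from `crypto.getRandomValues` and relies on the replacement pattern to land the RFC 4122 version (4) and variant bits in the right nibbles. The construction is easier to see spelled out in a Python sketch (illustrative only; the templates keep the JS one-liner):

```python
import secrets

def uuidv4():
    # 16 random bytes, then force the RFC 4122 version and variant bits.
    b = bytearray(secrets.token_bytes(16))
    b[6] = (b[6] & 0x0F) | 0x40  # version nibble -> 4
    b[8] = (b[8] & 0x3F) | 0x80  # variant bits -> 10xxxxxx
    h = b.hex()
    return '-'.join((h[:8], h[8:12], h[12:16], h[16:20], h[20:]))
```

In the canonical 8-4-4-4-12 form this always yields a '4' as the first hex digit of the third group and one of '8', '9', 'a', 'b' opening the fourth group.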


@@ -69,7 +69,8 @@
     <form class="form-signin" action="{{ url_for('login')}}" method="post">
       <img class="mb-4" src="{{ url_for('static', filename='img/d4-logo.png')}}" width="300">
       <h1 class="h3 mb-3 text-secondary">Please sign in</h1>
+      <input type="text" id="next_page" name="next_page" value="{{next_page}}" hidden>
       <label for="inputEmail" class="sr-only">Email address</label>
       <input type="email" id="inputEmail" name="username" class="form-control" placeholder="Email address" required autofocus>
       <label for="inputPassword" class="sr-only">Password</label>
       <input type="password" id="inputPassword" name="password" class="form-control {% if error %}is-invalid{% endif %}" placeholder="Password" required>


@@ -0,0 +1,57 @@
<!DOCTYPE html>
<html>
<head>
  <title>D4-Project</title>
  <link rel="icon" href="{{ url_for('static', filename='img/d4-logo.png')}}">
  <!-- Core CSS -->
  <link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
  <link href="{{ url_for('static', filename='font-awesome/css/font-awesome.css') }}" rel="stylesheet">
  <!-- JS -->
  <script src="{{ url_for('static', filename='js/jquery.js')}}"></script>
  <script src="{{ url_for('static', filename='js/popper.min.js')}}"></script>
  <script src="{{ url_for('static', filename='js/bootstrap.min.js')}}"></script>
</head>
<body>
  {% include 'navbar.html' %}
  <form action="{{ url_for('D4_sensors.add_sensor_to_monitor_post') }}" method="post" enctype="multipart/form-data">
    <div class="d-flex justify-content-center">
      <div class="col-sm-6">
        <h4 class="my-3">Monitor a Sensor</h4>
        <div class="form-group">
          <input class="form-control text-center bg-dark text-white" type="text" value="{{sensor_uuid}}" disabled>
          <input type="text" name="uuid" id="uuid" value="{{sensor_uuid}}" hidden>
        </div>
        <div class="input-group mt-2 mb-2">
          <div class="input-group-prepend">
            <span class="input-group-text bg-light"><i class="fa fa-clock-o"></i>&nbsp;</span>
          </div>
          <input class="form-control" type="number" id="delta_time" value="3600" min="30" name="delta_time" required>
          <div class="input-group-append">
            <span class="input-group-text">Maximum Time (seconds) between two D4 packets</span>
          </div>
        </div>
        <div class="form-group">
          <button class="btn btn-primary" type="submit">Monitor Sensor</button>
        </div>
      </div>
    </div>
  </form>
  {% include 'navfooter.html' %}
</body>
<script>
  $(document).ready(function(){
    $("#nav-sensor").addClass("active");
  });
</script>
</html>
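The `delta_time` field in the form above (default 3600 s, minimum 30 s) is the maximum allowed gap between two D4 packets before a monitored sensor should be flagged. The check the monitoring side needs is just a timestamp comparison; a sketch under that assumption (the helper name is hypothetical, not a d4-core API):

```python
import time

def is_sensor_stale(last_seen_epoch, delta_time, now=None):
    # A sensor is stale when more than delta_time seconds have passed
    # since its last packet was seen.
    if now is None:
        now = time.time()
    return (now - last_seen_epoch) > delta_time
```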


@@ -231,11 +231,12 @@
   <div class="card-body text-dark">
     <div class="row">
-      <div class="col-xl-8">
+      <div class="col-xl-10">
         <table class="table table-striped table-bordered table-hover" id="myTable_1">
           <thead class="thead-dark">
             <tr>
               <th>Type</th>
+              <th>Group</th>
               <th style="max-width: 800px;">uuid</th>
               <th style="max-width: 800px;">last updated</th>
               <th style="max-width: 800px;">Change max size limit</th>
@@ -246,11 +247,18 @@
           {% for type in list_accepted_types %}
             {% if type['list_analyzer_uuid'] %}
               {% for analyzer in type['list_analyzer_uuid'] %}
-                <tr>
+                <tr id="{{analyzer['uuid']}}">
                   <td>{{type['id']}}</td>
+                  {%if analyzer['is_group_queue']%}
+                    <td class="text-center"><i class="fa fa-group"></i></td>
+                  {%else%}
+                    <td></td>
+                  {%endif%}
                   <td>
                     <div class="d-flex">
-                      {{analyzer['uuid']}}
+                      <a href="{{ url_for('analyzer_queue.edit_queue_analyzer_queue') }}?queue_uuid={{analyzer['uuid']}}">
+                        {{analyzer['uuid']}}
+                      </a>
                       <a href="{{ url_for('remove_analyzer') }}?redirect=1&type={{type['id']}}&analyzer_uuid={{analyzer['uuid']}}" class="ml-auto">
                         <button type="button" class="btn btn-outline-danger px-2 py-0"><i class="fa fa-trash"></i></button>
                       </a>
@@ -284,6 +292,7 @@
           <thead class="thead-dark">
             <tr>
               <th>Type Name</th>
+              <th>Group</th>
               <th style="max-width: 800px;">uuid</th>
               <th style="max-width: 800px;">last updated</th>
               <th style="max-width: 800px;">Change max size limit</th>
@@ -291,60 +300,51 @@
             </tr>
           </thead>
           <tbody id="analyzer_accepted_extended_types_tbody">
-            {% for type in list_accepted_extended_types %}
-              {% if type['list_analyzer_uuid'] %}
-                {% for analyzer in type['list_analyzer_uuid'] %}
-                  <tr>
-                    <td>{{type['name']}}</td>
-                    <td>
-                      <div class="d-flex">
-                        {{analyzer['uuid']}}
-                        <a href="{{ url_for('remove_analyzer') }}?redirect=1&type=254&metatype_name={{type['name']}}&analyzer_uuid={{analyzer['uuid']}}" class="ml-auto">
-                          <button type="button" class="btn btn-outline-danger px-2 py-0"><i class="fa fa-trash"></i></button>
-                        </a>
-                      </div>
-                      {%if analyzer['description']%}
-                        <div class="text-info"><small>{{analyzer['description']}}</small></div>
-                      {%endif%}
-                    </td>
-                    <td>{{analyzer['last_updated']}}</td>
-                    <td>
-                      <div class="d-xl-flex justify-content-xl-center">
-                        <input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{analyzer['uuid']}}" value="{{analyzer['size_limit']}}" min="0" required="">
-                        <button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{analyzer['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{analyzer['uuid']}}').val();">Change Max Size</button>
-                      </div>
-                    </td>
-                    <td>
-                      <a href="{{ url_for('empty_analyzer_queue') }}?redirect=1&type=254&metatype_name={{type['name']}}&analyzer_uuid={{analyzer['uuid']}}">
-                        <button type="button" class="btn btn-outline-danger"><i class="fa fa-eraser"></i></button>
-                      </a>
-                      <button type="button" class="btn btn-outline-info ml-xl-3" onclick="get_analyser_sample('{{type['name']}}', '{{analyzer['uuid']}}');"><i class="fa fa-database"></i> {{analyzer['length']}}</button>
-                    </td>
-                  </tr>
-                {% endfor %}
-              {% endif %}
+            {% for dict_queue in l_queue_extended_type %}
+              <tr>
+                <td>{{dict_queue['extended_type']}}</td>
+                {%if dict_queue['is_group_queue']%}
+                  <td class="text-center"><i class="fa fa-group"></i></td>
+                {%else%}
+                  <td></td>
+                {%endif%}
+                <td>
+                  <div class="d-flex">
+                    <a href="{{ url_for('analyzer_queue.edit_queue_analyzer_queue') }}?queue_uuid={{dict_queue['uuid']}}">
+                      {{dict_queue['uuid']}}
+                    </a>
+                    <a href="{{ url_for('remove_analyzer') }}?redirect=1&type=254&metatype_name={{dict_queue['extended_type']}}&analyzer_uuid={{dict_queue['uuid']}}" class="ml-auto">
+                      <button type="button" class="btn btn-outline-danger px-2 py-0"><i class="fa fa-trash"></i></button>
+                    </a>
+                  </div>
+                  {%if dict_queue['description']%}
+                    <div class="text-info"><small>{{dict_queue['description']}}</small></div>
+                  {%endif%}
+                </td>
+                <td>{{dict_queue['last_updated']}}</td>
+                <td>
+                  <div class="d-xl-flex justify-content-xl-center">
+                    <input class="form-control mr-lg-1" style="max-width: 100px;" type="number" id="max_size_analyzer_{{dict_queue['uuid']}}" value="{{dict_queue['size_limit']}}" min="0" required="">
+                    <button type="button" class="btn btn-outline-secondary" onclick="window.location.href ='{{ url_for('analyzer_change_max_size') }}?analyzer_uuid={{dict_queue['uuid']}}&redirect=0&max_size_analyzer='+$('#max_size_analyzer_{{dict_queue['uuid']}}').val();">Change Max Size</button>
+                  </div>
+                </td>
+                <td>
+                  <a href="{{ url_for('empty_analyzer_queue') }}?redirect=1&type=254&metatype_name={{dict_queue['extended_type']}}&analyzer_uuid={{dict_queue['uuid']}}">
+                    <button type="button" class="btn btn-outline-danger"><i class="fa fa-eraser"></i></button>
+                  </a>
+                  <button type="button" class="btn btn-outline-info ml-xl-3" onclick="get_analyser_sample('{{dict_queue['extended_type']}}', '{{dict_queue['uuid']}}');"><i class="fa fa-database"></i> {{dict_queue['length']}}</button>
+                </td>
+              </tr>
             {% endfor %}
           </tbody>
         </table>
       </div>
     </div>
-    <div class="col-xl-4">
-      <div class="card border-dark mt-3" style="max-width: 18rem;">
-        <div class="card-body text-dark">
-          <h5 class="card-title">Add New Analyzer Queue</h5>
-          <input class="form-control" type="number" id="analyzer_type" value="1" min="1" max="254" required>
-          <input class="form-control" type="text" id="analyzer_metatype_name" placeholder="Meta Type Name">
-          <div class="input-group">
-            <div class="input-group-prepend">
-              <button class="btn btn-outline-secondary" type="button" onclick="generate_new_uuid();"><i class="fa fa-random"></i></button>
-            </div>
-            <input class="form-control" type="text" id="analyzer_uuid" required placeholder="Analyzer uuid">
-          </div>
-          <input class="form-control" type="text" id="analyzer_description" required placeholder="Optional Description">
-          <button type="button" class="btn btn-outline-primary mt-1" onclick="window.location.href ='{{ url_for('add_new_analyzer') }}?redirect=1&type='+$('#analyzer_type').val()+'&analyzer_uuid='+$('#analyzer_uuid').val()+'&metatype_name='+$('#analyzer_metatype_name').val()+'&analyzer_description='+$('#analyzer_description').val();">Add New Analyzer</button>
+    <div class="col-xl-2">
+      <a href="{{ url_for('analyzer_queue.create_analyzer_queue') }}" class="ml-auto">
+        <button type="button" class="btn btn-primary"><i class="fa fa-plus"></i> Add New Analyzer Queue</button>
+      </a>
</div>
</div>
</div> </div>
</div> </div>
@@ -430,7 +430,7 @@ if (tbody.children().length == 0) {
 }
 $('#accepted_type').on('input', function() {
-    if ($('#accepted_type').val() == 254){
+    if ($('#analyzer_type').val() == 2 || $('#accepted_type').val() == 254){
         $('#extended_type_name').show()
     } else {
         $('#extended_type_name').hide()
@@ -438,7 +438,7 @@ $('#accepted_type').on('input', function() {
 });
 $('#analyzer_type').on('input', function() {
-    if ($('#analyzer_type').val() == 254){
+    if ($('#analyzer_type').val() == 2 || $('#analyzer_type').val() == 254){
         $('#analyzer_metatype_name').show()
     } else {
         $('#analyzer_metatype_name').hide()


@@ -101,6 +101,18 @@
 </div>
 </div>
+<div class="d-flex justify-content-center mt-2">
+{% if not data_uuid.get('is_monitored', False) %}
+<a href="{{ url_for('D4_sensors.add_sensor_to_monitor') }}?uuid={{uuid_sensor}}">
+<button type="button" class="btn btn-primary">Monitor Sensor</button>
+</a>
+{% else %}
+<a href="{{ url_for('D4_sensors.delete_sensor_to_monitor') }}?uuid={{uuid_sensor}}">
+<button type="button" class="btn btn-danger">Remove Sensor from monitoring</button>
+</a>
+{% endif %}
+</div>
 <div class="card-deck justify-content-center ml-0 mr-0">
 <div class="card border-dark mt-3" style="max-width: 18rem;">
 <div class="card-body text-dark">


@@ -47,8 +47,8 @@ mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/js/bootstrap.min.js ./static/js/
 mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/css/bootstrap.min.css ./static/css/
 mv temp/bootstrap-${BOOTSTRAP_VERSION}-dist/css/bootstrap.min.css.map ./static/css/
-mv temp/popper.js-1.14.3/dist/umd/popper.min.js ./static/js/
-mv temp/popper.js-1.14.3/dist/umd/popper.min.js.map ./static/js/
+mv temp/floating-ui-1.14.3/dist/umd/popper.min.js ./static/js/
+mv temp/floating-ui-1.14.3/dist/umd/popper.min.js.map ./static/js/
 mv temp/Font-Awesome-${FONT_AWESOME_VERSION} temp/font-awesome


@@ -84,13 +84,13 @@ if __name__ == "__main__":
         new_date = datetime.datetime.now().strftime("%Y%m%d")
-        # get all directory files
-        all_files = os.listdir(worker_data_directory)
         not_compressed_file = []
         # filter: get all not compressed files
-        for file in all_files:
-            if file.endswith('.cap'):
-                not_compressed_file.append(os.path.join(worker_data_directory, file))
+        if os.path.isdir(worker_data_directory):
+            all_files = os.listdir(worker_data_directory)
+            for file in all_files:
+                if file.endswith('.cap'):
+                    not_compressed_file.append(os.path.join(worker_data_directory, file))
         if not_compressed_file:
             ### check time-change (minus one hour) ###


@@ -11,6 +11,7 @@ import subprocess
 sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
 import ConfigLoader
+import Analyzer_Queue

 def data_incorrect_format(stream_name, session_uuid, uuid):
     redis_server_stream.sadd('Error:IncorrectType', session_uuid)
@@ -39,15 +40,9 @@ def compress_file(file_full_path, i=0):
             shutil.copyfileobj(f_in, f_out)
     os.remove(file_full_path)
     # save full path in anylyzer queue
-    for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
-        analyzer_uuid = analyzer_uuid.decode()
-        redis_server_analyzer.lpush('analyzer:{}:{}'.format(type, analyzer_uuid), compressed_filename)
-        redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
-        analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
-        if analyser_queue_max_size is None:
-            analyser_queue_max_size = analyzer_list_max_default_size
-        redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
+    Analyzer_Queue.add_data_to_queue(uuid, type, compressed_filename)

 config_loader = ConfigLoader.ConfigLoader()
 redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
 redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)
 redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
@@ -59,7 +54,7 @@ if use_default_save_directory:
     data_directory = os.path.join(os.environ['D4_HOME'], 'data')
 else:
     data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
 config_loader = None

 type = 1
 tcp_dump_cycle = '300'
@@ -90,8 +85,8 @@ if __name__ == "__main__":
         os.makedirs(rel_path)
     print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
 else:
+    print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
     sys.exit(1)
-print('Incorrect message')

 redis_server_stream.sadd('working_session_uuid:{}'.format(type), session_uuid)

 #LAUNCH a tcpdump
@@ -154,16 +149,16 @@
         except subprocess.TimeoutExpired:
             process_compressor.kill()
         ### compress all files ###
+        date = datetime.datetime.now().strftime("%Y%m%d")
         worker_data_directory = os.path.join(full_tcpdump_path, date[0:4], date[4:6], date[6:8])
-        all_files = os.listdir(worker_data_directory)
-        all_files.sort()
-        if all_files:
-            for file in all_files:
-                if file.endswith('.cap'):
-                    full_path = os.path.join(worker_data_directory, file)
-                    if redis_server_stream.get('data_in_process:{}'.format(session_uuid)) != full_path:
-                        compress_file(full_path)
+        if os.path.isdir(worker_data_directory):
+            all_files = os.listdir(worker_data_directory)
+            all_files.sort()
+            if all_files:
+                for file in all_files:
+                    if file.endswith('.cap'):
+                        full_path = os.path.join(worker_data_directory, file)
+                        if redis_server_stream.get('data_in_process:{}'.format(session_uuid)) != full_path:
+                            compress_file(full_path)
         ### ###
         #print(process.stderr.read())


@@ -11,6 +11,7 @@ import datetime
 sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
 import ConfigLoader
+import Analyzer_Queue

 DEFAULT_FILE_EXTENSION = 'txt'
@@ -61,6 +62,8 @@ class MetaTypesDefault:
     def process_data(self, data):
         # save data on disk
         self.save_rotate_file(data)
+        # do something with the data (send to analyzer queue by default)
+        self.reconstruct_data(data)

     ######## CORE FUNCTIONS ########
@@ -172,15 +175,7 @@
         os.remove(file_full_path)

     def send_to_analyzers(self, data_to_send):
-        ## save full path in anylyzer queue
-        for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}:{}'.format(TYPE, self.get_type_name())):
-            analyzer_uuid = analyzer_uuid.decode()
-            redis_server_analyzer.lpush('analyzer:{}:{}'.format(self.get_type_name(), analyzer_uuid), data_to_send)
-            redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
-            analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
-            if analyser_queue_max_size is None:
-                analyser_queue_max_size = analyzer_list_max_default_size
-            redis_server_analyzer.ltrim('analyzer:{}:{}'.format(self.get_type_name(), analyzer_uuid), 0, analyser_queue_max_size)
+        Analyzer_Queue.add_data_to_queue(self.uuid, self.get_type_name(), data_to_send)

     ######## GET FUNCTIONS ########
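Several hunks in this comparison replace the same inline Redis push-and-trim block with a single call to the new `Analyzer_Queue.add_data_to_queue` helper. Based on the removed inline code, the helper's behaviour can be modelled in memory as below (the real implementation uses Redis `LPUSH`/`LTRIM`/`HSET`; the container names and default size here are illustrative stand-ins, not the actual D4 API):

```python
import time

# in-memory stand-ins for the Redis structures used by the removed inline code
queues = {}      # 'analyzer:<type>:<uuid>' -> list, newest element first
metadata = {}    # 'analyzer:<uuid>' -> {'max_size': ..., 'last_updated': ...}
DEFAULT_MAX_SIZE = 5  # stand-in for analyzer_list_max_default_size

def add_data_to_queue(sensor_uuid, type, data, analyzer_uuids=('analyzer-1',)):
    for analyzer_uuid in analyzer_uuids:
        key = 'analyzer:{}:{}'.format(type, analyzer_uuid)
        queue = queues.setdefault(key, [])
        queue.insert(0, data)  # LPUSH: newest element at the head
        meta = metadata.setdefault('analyzer:{}'.format(analyzer_uuid), {})
        meta['last_updated'] = time.time()
        max_size = meta.get('max_size', DEFAULT_MAX_SIZE)
        # LTRIM 0 max_size keeps indices 0..max_size, i.e. max_size+1 items
        del queue[max_size + 1:]
```

Centralizing this logic removes four near-identical copies of the push-and-trim loop from the workers.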


@@ -0,0 +1,80 @@
#!/usr/bin/env python3

from meta_types_modules.MetaTypesDefault import MetaTypesDefault
import hashlib
import time
import os
import datetime
import base64
import shutil
import gzip

class TypeHandler(MetaTypesDefault):

    def __init__(self, uuid, json_file):
        super().__init__(uuid, json_file)
        self.compress = False
        self.extension = ''
        self.segregate = True
        if "compress" in json_file:
            self.compress = json_file['compress']
        if "extension" in json_file:
            self.extension = json_file['extension']
        if "segregate" in json_file:
            self.segregate = json_file['segregate']
        self.set_rotate_file_mode(False)
        self.saved_dir = ''

    def process_data(self, data):
        # Unpack the thing
        self.reconstruct_data(data)

    # pushing the filepath instead of the file content to the analyzer
    def handle_reconstructed_data(self, data):
        m = hashlib.sha256()
        self.set_last_time_saved(time.time())
        self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
        # Create folder
        save_dir = os.path.join(self.get_save_dir(save_by_uuid=self.segregate), 'files')
        if not os.path.isdir(save_dir):
            os.makedirs(save_dir)
        # write file to disk
        decodeddata = base64.b64decode(data)
        m.update(decodeddata)
        path = os.path.join(save_dir, m.hexdigest())
        path = '{}.{}'.format(path, self.extension)
        with open(path, 'wb') as p:
            p.write(decodeddata)
        if self.compress:
            compressed_filename = '{}.gz'.format(path)
            with open(path, 'rb') as f_in:
                with gzip.open(compressed_filename, 'wb') as f_out:
                    shutil.copyfileobj(f_in, f_out)
            os.remove(path)
            self.send_to_analyzers(compressed_filename)
        else:
            self.send_to_analyzers(path)

    def reconstruct_data(self, data):
        # save data in buffer
        self.add_to_buffer(data)
        data = self.get_buffer()
        # end of element found in data
        if self.get_file_separator() in data:
            # empty buffer
            self.reset_buffer()
            all_line = data.split(self.get_file_separator())
            for reconstructed_data in all_line[:-1]:
                if reconstructed_data != b'':
                    self.handle_reconstructed_data(reconstructed_data)
            # save incomplete element in buffer
            if all_line[-1] != b'':
                self.add_to_buffer(all_line[-1])

    def test(self):
        print('Class: filewatcher')
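The `reconstruct_data()` method above implements a classic streaming reassembly pattern: chunks accumulate in a buffer until the file separator appears, then every complete element is emitted and the trailing partial element is kept for the next round. The pattern can be exercised in isolation with this stand-alone model (the `b'\n\n'` separator and class name are assumptions for illustration; the real separator comes from the meta-type JSON config):

```python
SEPARATOR = b'\n\n'  # illustrative; D4 reads the separator from the type config

class Reassembler:
    def __init__(self):
        self.buffer = b''
        self.completed = []

    def feed(self, chunk):
        # accumulate until a separator shows up, as reconstruct_data() does
        self.buffer += chunk
        if SEPARATOR in self.buffer:
            parts = self.buffer.split(SEPARATOR)
            self.buffer = b''
            for element in parts[:-1]:
                if element != b'':
                    self.completed.append(element)
            # keep the incomplete trailing element for the next chunk
            if parts[-1] != b'':
                self.buffer = parts[-1]
```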


@@ -0,0 +1,38 @@
#!/usr/bin/env python3

from meta_types_modules.MetaTypesDefault import MetaTypesDefault
import hashlib
import time
import os
import datetime

class TypeHandler(MetaTypesDefault):

    def __init__(self, uuid, json_file):
        super().__init__(uuid, json_file)
        self.set_rotate_file_mode(False)
        self.saved_dir = ''

    def process_data(self, data):
        self.reconstruct_data(data)

    # pushing the filepath instead of the file content to the analyzer
    def handle_reconstructed_data(self, data):
        m = hashlib.sha256()
        self.set_last_time_saved(time.time())
        self.set_last_saved_date(datetime.datetime.now().strftime("%Y%m%d%H%M%S"))
        # Create folder
        jsons_save_dir = os.path.join(self.get_save_dir(save_by_uuid=True), 'files')
        if not os.path.isdir(jsons_save_dir):
            os.makedirs(jsons_save_dir)
        # write json file to disk
        m.update(data)
        jsons_path = os.path.join(jsons_save_dir, m.hexdigest()+'.json')
        with open(jsons_path, 'wb') as j:
            j.write(data)
        # Send data to Analyzer
        self.send_to_analyzers(jsons_path)

    def test(self):
        print('Class: filewatcherjson')
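`handle_reconstructed_data()` above names each saved file after the SHA-256 of its content (content addressing), so a payload received twice overwrites the same file instead of producing a duplicate. A minimal sketch of that naming scheme (the helper name is illustrative, not part of the D4 codebase):

```python
import hashlib
import os

def save_content_addressed(data, save_dir, extension='json'):
    # name the file after the SHA-256 digest of its content
    digest = hashlib.sha256(data).hexdigest()
    path = os.path.join(save_dir, '{}.{}'.format(digest, extension))
    with open(path, 'wb') as f:
        f.write(data)
    return path
```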


@@ -0,0 +1,180 @@
#!/usr/bin/env python3

import os
import sys
import time
import gzip
import redis
import shutil
import datetime

sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader
import Analyzer_Queue

def data_incorrect_format(session_uuid):
    print('Incorrect format')
    sys.exit(1)

config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)

# get data directory
use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
# check if field is None
if use_default_save_directory:
    data_directory = os.path.join(os.environ['D4_HOME'], 'data')
else:
    data_directory = config_loader.get_config_str("Save_Directories", "save_directory")
config_loader = None

type = 3
rotation_save_cycle = 300 #seconds
max_buffer_length = 10000
save_to_file = True

def compress_file(file_full_path, i=0):
    if i==0:
        compressed_filename = '{}.gz'.format(file_full_path)
    else:
        compressed_filename = '{}.{}.gz'.format(file_full_path, i)
    if os.path.isfile(compressed_filename):
        compress_file(file_full_path, i+1)
    else:
        with open(file_full_path, 'rb') as f_in:
            with gzip.open(compressed_filename, 'wb') as f_out:
                shutil.copyfileobj(f_in, f_out)
        os.remove(file_full_path)

def get_save_dir(dir_data_uuid, year, month, day):
    dir_path = os.path.join(dir_data_uuid, year, month, day)
    if not os.path.isdir(dir_path):
        os.makedirs(dir_path)
    return dir_path

if __name__ == "__main__":

    if len(sys.argv) != 2:
        print('usage:', 'Worker.py', 'session_uuid')
        exit(1)

    session_uuid = sys.argv[1]
    stream_name = 'stream:{}:{}'.format(type, session_uuid)
    id = '0'
    buffer = b''

    # track launched worker
    redis_server_stream.sadd('working_session_uuid:{}'.format(type), session_uuid)

    # get uuid
    res = redis_server_stream.xread({stream_name: id}, count=1)
    if res:
        uuid = res[0][1][0][1][b'uuid'].decode()
        # init file rotation
        if save_to_file:
            rotate_file = False
            time_file = time.time()
            date_file = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
            dir_data_uuid = os.path.join(data_directory, uuid, str(type))
            dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
            filename = '{}-{}-{}-{}-{}.syslog.txt'.format(uuid, date_file[0:4], date_file[4:6], date_file[6:8], date_file[8:14])
            save_path = os.path.join(dir_full_path, filename)
        print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
    else:
        ########################### # TODO: clean db on error
        print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
        sys.exit(1)

    while True:
        res = redis_server_stream.xread({stream_name: id}, count=1)
        if res:
            new_id = res[0][1][0][0].decode()
            if id != new_id:
                id = new_id
                data = res[0][1][0][1]
                if id and data:
                    # reconstruct data
                    if buffer != b'':
                        data[b'message'] = b''.join([buffer, data[b'message']])
                        buffer = b''
                    # send data to redis
                    # new line in received data
                    if b'\n' in data[b'message']:
                        all_line = data[b'message'].split(b'\n')
                        for line in all_line[:-1]:
                            Analyzer_Queue.add_data_to_queue(uuid, type, line)
                            # analyzer_uuid = analyzer_uuid.decode()
                        # keep incomplete line
                        if all_line[-1] != b'':
                            buffer += all_line[-1]
                    else:
                        if len(buffer) < max_buffer_length:
                            buffer += data[b'message']
                        else:
                            print('Error, infinite loop, max buffer length reached')
                            # force new line
                            buffer += b''.join([ data[b'message'], b'\n' ])
                    # save data on disk
                    if save_to_file and b'\n' in data[b'message']:
                        new_date = datetime.datetime.now().strftime("%Y%m%d%H%M%S")
                        # check if a new rotation is needed
                        if ( new_date[0:8] != date_file[0:8] ) or ( time.time() - time_file > rotation_save_cycle ):
                            date_file = new_date
                            rotate_file = True
                        # file rotation
                        if rotate_file:
                            end_file, start_new_file = data[b'message'].rsplit(b'\n', maxsplit=1)
                            # save end of file
                            with open(save_path, 'ab') as f:
                                f.write(end_file)
                            compress_file(save_path)
                            # get new save_path
                            dir_full_path = get_save_dir(dir_data_uuid, date_file[0:4], date_file[4:6], date_file[6:8])
                            filename = '{}-{}-{}-{}-{}.syslog.txt'.format(uuid, date_file[0:4], date_file[4:6], date_file[6:8], date_file[8:14])
                            save_path = os.path.join(dir_full_path, filename)
                            # save start of new file
                            if start_new_file != b'':
                                with open(save_path, 'ab') as f:
                                    f.write(start_new_file)
                            # end of rotation
                            rotate_file = False
                            time_file = time.time()
                        else:
                            with open(save_path, 'ab') as f:
                                f.write(data[b'message'])
                redis_server_stream.xdel(stream_name, id)
        else:
            # success, all data are saved
            if redis_server_stream.sismember('ended_session', session_uuid):
                redis_server_stream.srem('ended_session', session_uuid)
                redis_server_stream.srem('session_uuid:{}'.format(type), session_uuid)
                redis_server_stream.srem('working_session_uuid:{}'.format(type), session_uuid)
                redis_server_stream.hdel('map-type:session_uuid-uuid:{}'.format(type), session_uuid)
                redis_server_stream.delete(stream_name)
                try:
                    if os.path.isfile(save_path):
                        #print('save')
                        compress_file(save_path)
                except NameError:
                    pass
                print('---- syslog DONE, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
                sys.exit(0)
            else:
                time.sleep(10)
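The `compress_file()` function in the new syslog worker avoids clobbering an existing archive by recursing with an incremented suffix (`file.gz`, `file.1.gz`, `file.2.gz`, ...). Its naming rule can be isolated and checked without touching the filesystem (this helper is an illustration, not part of the D4 codebase):

```python
def next_archive_name(file_full_path, existing, i=0):
    # mirror compress_file()'s suffix scheme: .gz, .1.gz, .2.gz, ...
    if i == 0:
        candidate = '{}.gz'.format(file_full_path)
    else:
        candidate = '{}.{}.gz'.format(file_full_path, i)
    if candidate in existing:
        return next_archive_name(file_full_path, existing, i + 1)
    return candidate
```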


@@ -0,0 +1,37 @@
#!/usr/bin/env python3

import os
import sys
import time
import redis
import subprocess

sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
import ConfigLoader

config_loader = ConfigLoader.ConfigLoader()
redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
config_loader = None

type = 3

try:
    redis_server_stream.ping()
except redis.exceptions.ConnectionError:
    print('Error: Redis server {}, ConnectionError'.format("Redis_STREAM"))
    sys.exit(1)

if __name__ == "__main__":
    stream_name = 'stream:{}'.format(type)
    redis_server_stream.delete('working_session_uuid:{}'.format(type))
    while True:
        for session_uuid in redis_server_stream.smembers('session_uuid:{}'.format(type)):
            session_uuid = session_uuid.decode()
            if not redis_server_stream.sismember('working_session_uuid:{}'.format(type), session_uuid):
                process = subprocess.Popen(['./worker.py', session_uuid])
                print('Launching new worker{} ... session_uuid={}'.format(type, session_uuid))
        #print('.')
        time.sleep(10)
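The launcher's polling loop above reduces to simple set arithmetic: every session that is registered (`session_uuid:<type>`) but not yet being handled (`working_session_uuid:<type>`) gets a new worker process. A sketch of that selection step (function name is illustrative; the real code queries Redis sets):

```python
def sessions_to_launch(registered, working):
    # a worker is spawned for each registered session with no worker yet;
    # sorted() only makes the result deterministic for inspection
    return sorted(registered - working)
```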


@@ -9,6 +9,7 @@ import datetime
 sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
 import ConfigLoader
+import Analyzer_Queue

 def data_incorrect_format(session_uuid):
     print('Incorrect format')
@@ -59,8 +60,8 @@ if __name__ == "__main__":
         rel_path = os.path.join(dir_path, filename)
     print('---- worker launched, uuid={} session_uuid={} epoch={}'.format(uuid, session_uuid, time.time()))
 else:
+    print('Incorrect Stream, Closing worker: type={} session_uuid={}'.format(type, session_uuid))
     sys.exit(1)
-print('Incorrect message')

 time_file = time.time()
 rotate_file = False


@@ -11,6 +11,7 @@ import datetime
 sys.path.append(os.path.join(os.environ['D4_HOME'], 'lib/'))
 import ConfigLoader
+import Analyzer_Queue

 def data_incorrect_format(session_uuid):
     print('Incorrect format')
@@ -19,7 +20,7 @@ def data_incorrect_format(session_uuid):
 config_loader = ConfigLoader.ConfigLoader()
 redis_server_stream = config_loader.get_redis_conn("Redis_STREAM", decode_responses=False)
 redis_server_analyzer = config_loader.get_redis_conn("Redis_ANALYZER", decode_responses=False)
-config_loader = None
+redis_server_metadata = config_loader.get_redis_conn("Redis_METADATA", decode_responses=False)

 # get data directory
 use_default_save_directory = config_loader.get_config_boolean("Save_Directories", "use_default_save_directory")
@@ -112,14 +113,7 @@ if __name__ == "__main__":
                     if b'\n' in data[b'message']:
                         all_line = data[b'message'].split(b'\n')
                         for line in all_line[:-1]:
-                            for analyzer_uuid in redis_server_metadata.smembers('analyzer:{}'.format(type)):
-                                analyzer_uuid = analyzer_uuid.decode()
-                                redis_server_analyzer.lpush('analyzer:{}:{}'.format(type, analyzer_uuid), line)
-                                redis_server_metadata.hset('analyzer:{}'.format(analyzer_uuid), 'last_updated', time.time())
-                                analyser_queue_max_size = redis_server_metadata.hget('analyzer:{}'.format(analyzer_uuid), 'max_size')
-                                if analyser_queue_max_size is None:
-                                    analyser_queue_max_size = analyzer_list_max_default_size
-                                redis_server_analyzer.ltrim('analyzer:{}:{}'.format(type, analyzer_uuid), 0, analyser_queue_max_size)
+                            Analyzer_Queue.add_data_to_queue(uuid, type, line)
                         # keep incomplete line
                         if all_line[-1] != b'':
                             buffer += all_line[-1]