Compare commits


105 Commits

Author SHA1 Message Date
Steve Clement a46cffb0a5
Merge pull request #155 from SteveClement/master 2021-03-30 13:19:35 +09:00
Steve Clement 461c8af2d7
chg: [installer] Added centos, added note on slow MaxMind keygen 2021-03-30 13:18:18 +09:00
Steve Clement 67d7d06af2
Merge pull request #147 from SteveClement/maxmind
fix: [maxmind] catch 401 not authorized
2020-04-27 18:50:57 +09:00
Steve Clement fdcc13d9d7
fix: [maxmind] catch 401 not authorized 2020-04-27 18:49:51 +09:00
Alexandre Dulaunoy c201d9e3b8
Merge pull request #145 from automationator/master
Fixes MaxMind SHA256 installer bug
2020-03-24 07:23:58 +01:00
Matthew Wilson 109ffaa13d
Fixes MaxMind SHA256 installer bug
The command used to edit the sha256 MaxMind file caused it to be replaced with a null file. This properly edits the file in place.
2020-03-23 12:36:07 -04:00
Steve Clement f6e62aaf49
Merge pull request #144 from SteveClement/maxmind-fix
fix: [GeoIP] MaxMind now requires a free account
2020-02-20 15:02:55 +08:00
Steve Clement 1b585d8ff2
chg: [GeoIP] Properly clean all files 2020-02-20 14:48:26 +08:00
Steve Clement 8d28c5128d
fix: [GeoIP] Small refactor 2020-02-20 14:45:03 +08:00
Steve Clement 4de261ee62
fix: [GeoIP] MaxMind now requires a free account 2020-02-20 14:39:57 +08:00
mokaddem 6510e6dca1 chg: [diagnostic] Added new catches to better handle errors 2020-02-10 07:44:20 +01:00
Alexandre Dulaunoy d0b9e05fe8
Merge pull request #140 from VVX7/master
chg: [auth] conditionally send debug token
2019-12-20 21:54:58 +01:00
VVX7 ed7e5d03bf chg: [auth] only send debug token when MISP is running in debug mode. 2019-12-20 15:16:17 -05:00
VVX7 ad1e776906 chg: [misp_login] remove debug field 2019-11-20 15:49:32 -05:00
Alexandre Dulaunoy 28200e4862
Merge pull request #139 from cudeso/master
SSL for Flask server
2019-11-13 20:46:34 +01:00
Koen Van Impe c5644f52f0 SSL for Flask server 2019-11-13 19:05:54 +01:00
Sami Mokaddem 8a49904235
chg: [README] Added Authentication section 2019-10-30 15:06:25 +01:00
Sami Mokaddem 5967a9d34a
Merge pull request #135 from mokaddem/improvements-login-diagnostic
Improvements on login and diagnostic
2019-10-29 16:40:05 +01:00
mokaddem f1c5c76ec5 chg: [login/diagnostic] Added catch if a secure cookie can't be set 2019-10-29 15:42:58 +01:00
mokaddem ad041c5f77 fix: [login/diagnostic] Return data with the expected format 2019-10-29 15:12:43 +01:00
mokaddem ce4aa82c0a chg: [login/diagnostic] Improved login errors feedback and adjusted
diagnostic
2019-10-29 14:52:10 +01:00
Sami Mokaddem d390a169b5
Merge pull request #130 from MISP/authImprovements
Few authentication improvements
2019-10-11 09:59:27 +02:00
Sami Mokaddem 47c4c2e529
Merge pull request #129 from VVX7/master
new: [authentication] Flask-login authentication via MISP.
2019-10-11 09:51:52 +02:00
mokaddem 1b4df61591 chg: [auth] Simplified condition 2019-10-11 09:45:54 +02:00
mokaddem 8da3d509cd chg: [diagnostic] Fixed to support auth 2019-10-11 09:35:03 +02:00
mokaddem eaf3ad30d1 chg: [auth] Check if can access the dashboard 2019-10-11 08:57:55 +02:00
mokaddem 2ecc4a8fad chg: [login] Fixed web dependencies and added auth error message 2019-10-11 08:38:33 +02:00
mokaddem 21dedd37ed chg: [auth] Takes into account MISP baseurl for redirections 2019-10-11 08:37:46 +02:00
VVX7 4d5ee49357 chg: [Authentication] User authentication can be disabled in config. If disabled, users are automatically logged in with a randomly generated account name and redirected to /index. 2019-10-03 17:26:58 -04:00
VVX7 b313b7cc74 chg: [authentication] add logout endpoint to dashboard dropdown 2019-10-02 20:15:34 -04:00
VVX7 3b0ebe8c72 chg: [authentication] session_cookie_sametime is str 2019-10-02 19:35:58 -04:00
VVX7 07f68cb33f chg: [authentication] configure misp-dashboard cookie policy 2019-10-02 19:32:39 -04:00
VVX7 bd5984faad chg: [authentication] set session protection to kill session when session identifier does not match 2019-10-02 19:10:35 -04:00
VVX7 9c028e697f chg: [authentication] require authorization on hidden endpoints. 2019-10-02 18:20:11 -04:00
VVX7 88cc920bd3 chg: [authentication] add unauthorized_handler to redirect unauthorized user to login page. 2019-10-02 18:12:52 -04:00
VVX7 708addaa34 chg: [authentication] add required login to dashboard views 2019-10-02 18:01:27 -04:00
VVX7 71780003d0 chg: [authentication] turn off password autocomplete 2019-10-02 17:25:00 -04:00
VVX7 83df3e4f74 chg: [authentication] increased password field length to 255. minor changes to login page. 2019-10-02 17:14:22 -04:00
VVX7 a0ccff71ef chg: [authentication] added flask session secret 2019-10-02 16:34:27 -04:00
VVX7 e18728e8b1 chg: [authentication] enforce session ssl 2019-10-02 13:35:12 -04:00
VVX7 b7c8f6b577 chg: [authentication] enforce session ssl 2019-10-02 12:46:37 -04:00
VVX7 e44f7e2c4b chg: [authentication] added misp logo 2019-10-02 12:35:21 -04:00
VVX7 2b99e13110 chg: [authentication] added login page 2019-10-02 12:34:08 -04:00
VVX7 1356e0003e chg: [authentication] removed auth required on endpoints 2019-10-02 11:17:08 -04:00
VVX7 2be101fdfc new: [authentication] Flask-login authentication via MISP instance. 2019-10-01 21:06:29 -04:00
Sami Mokaddem 60ce6ce5cd
Update README.md 2019-09-27 10:46:02 +02:00
mokaddem dd218f4cf4 fix: [security] prevent XSS injection in livelog table 2019-09-16 20:58:13 +02:00
Sami Mokaddem 0ac7e7cf84
Merge pull request #121 from mokaddem/fewFixes2
Various fixes and improvements
2019-08-30 13:16:47 +02:00
mokaddem 8fd474712b chg: [livelog] Scrolling Logs when fullscreen is on - Fix #118 2019-08-30 12:15:43 +02:00
mokaddem fba754b2e5 chg: [livelog] Fix z-index and fullscreen log panel z-index 2019-08-30 11:59:50 +02:00
mokaddem 3e218cd145 chg: [startup] Wait until redis is ready before starting the zmqs
scripts
2019-08-30 11:39:14 +02:00
mokaddem 19842f9445 fix: Catch if country does not have alpha_2 attribute - fix #119 2019-08-30 11:05:43 +02:00
mokaddem 2f3fd08404 chg: [start] Added restart capability 2019-08-30 10:50:41 +02:00
mokaddem 0dbaa034fb fix: [contrib] Hide broken organisation images - Fix #110 2019-08-29 10:25:57 +02:00
mokaddem fb1332be6a fix: [diagnostic] Corrected copy/paste typo
Just me being a monkey
2019-08-28 16:04:45 +02:00
mokaddem 26f9e31786 fix: [update] Changed string formating to `format` 2019-08-28 15:57:13 +02:00
mokaddem f2fb36601a fix: [helpers] Changed string formating to `format` and slight refact 2019-08-28 15:54:37 +02:00
mokaddem f009d81321 fix: [diagnostic] Changed string formating to `format` 2019-08-28 15:49:40 +02:00
mokaddem b2be833801 Merge branch 'master' of github.com:MISP/misp-dashboard 2019-07-22 13:20:27 +02:00
Sami Mokaddem 16144e1acc
Merge pull request #113 from Kortho/patch-2
added net-tools to debian-based install command
2019-07-02 16:46:59 +02:00
Sami Mokaddem 0ff42a0a29
Merge pull request #112 from Kortho/patch-1
removed hard-coded zmq startup
2019-07-02 16:46:31 +02:00
Kortho 9d1b488399
added user zmqs back 2019-07-02 11:57:46 +02:00
Kortho 71fc511c61
added net-tools to debian-based install command
needed to run the netstat command
2019-07-02 09:14:35 +02:00
Kortho 4715f0ec29
removed hard-coded zmq startup
It was hard coded to run as a specific user and a hard coded location of script
2019-07-02 08:43:18 +02:00
Steve Clement 8dae1b1524
Merge pull request #111 from SteveClement/CentOS_RHEL
fix: [installer] Make it work on RHEL/CentOS
2019-07-01 15:16:13 +09:00
Steve Clement 1ccf833428
fix: [installer] Make it work on RHEL/CentOS 2019-07-01 15:15:00 +09:00
Sami Mokaddem beb17f7b56
Merge pull request #109 from MISP/fixlogs
fix: [logs:helper] Helpers get their own log file
2019-06-27 11:08:08 +02:00
Sami Mokaddem ab886714d5
Merge pull request #108 from MISP/fixGeoReader
Fix geo reader
2019-06-27 11:07:59 +02:00
mokaddem b7d8259a73 Merge branch 'fixlogs' 2019-06-27 11:06:21 +02:00
mokaddem 7e44e00788 fix: [logs:helper] Helpers get their own log file 2019-06-27 10:47:32 +02:00
mokaddem 6b064732fd fix: try another mean to forward the country to the client 2019-06-27 10:39:03 +02:00
mokaddem a4bdf6e21e fix: [geohelper] Prevent crash if country not defined in the geo
response
2019-06-27 09:00:38 +02:00
Sami Mokaddem f75c107805
Clarified updated from pulling 2019-06-24 15:28:46 +02:00
Sami Mokaddem 97894d06f0
Merge pull request #106 from MISP/subzero
Pulling from several 0MQ feeds + screens + diagnostic tool
2019-06-21 16:27:36 +02:00
mokaddem ddf7cbcfff fix: [diagnostic] socket subscribing multiple time and improved status
message
2019-06-21 16:19:57 +02:00
mokaddem a9f9a67184 chg: [diagnostic] Added support of multiple subscribers - WiP 2019-06-21 15:55:03 +02:00
mokaddem cbbdf7cbfc fix: mergeconflict and log filename 2019-06-21 15:32:59 +02:00
mokaddem b06c95c907 Merge branch 'master' of github.com:MISP/misp-dashboard into subzero 2019-06-21 14:39:24 +02:00
Jean-Louis Huynen 1439804d46 chg: create zmqs user + sudoer right for www-data 2019-06-21 12:38:31 +02:00
Sami Mokaddem 46682e3195
Merge pull request #103 from MISP/diagnosticTool
Livelog Improvement, Diagnostic tool and Updater
2019-06-20 15:49:34 +02:00
mokaddem 4463361192 chg: [diagnostic] Improved config comparison 2019-06-20 15:40:14 +02:00
mokaddem a50bab4e77 chg: [diagnostic] Provide suggestion to fix py-redis version 2019-06-20 14:16:25 +02:00
mokaddem 194b28ea93 fix: Force closing the connection before trying to reconnect 2019-06-19 11:45:43 +02:00
mokaddem c1a2d17bbf chg: More sane response decoding, done by the ORM 2019-06-19 11:32:06 +02:00
mokaddem 4bfc7e2f48 chg: slightly improved pgrep parsing 2019-06-19 11:30:49 +02:00
mokaddem 3110c936bd chg: Removed debug message 2019-06-19 10:32:12 +02:00
mokaddem ea9bc654f8 chg: [diagnostic] Added packages check 2019-06-18 12:30:07 +02:00
mokaddem b14f291c94 chg: [updater] More intuitive db numbering 2019-06-18 11:46:48 +02:00
Sami Mokaddem 0cbbb59e67
Update README.md
Note about restarting the system after updating by pulling.
2019-06-18 11:40:05 +02:00
mokaddem 68f158ebed fix: [all] Fixed issue with py-redis>2.x and fix failed merge conflict 2019-06-18 11:25:17 +02:00
mokaddem 618da4eea5 fix: [contributors] Show the correct datetime 2019-06-18 11:20:32 +02:00
mokaddem 3ca3b37785 chg: [config] Moved dbVersion in the appropriate section 2019-06-18 09:57:28 +02:00
mokaddem 21af08a886 chg: [updates] Improved database updates 2019-06-18 09:39:22 +02:00
mokaddem 3a1fe9205d new: [updates] Update script - WiP 2019-06-18 09:25:46 +02:00
mokaddem 8869d6ba44 chg: increased dispatcher pooling rate and improved diagnostic's text feedback 2019-06-17 16:16:05 +02:00
mokaddem d442a40175 chg: [diagnostic] Added info about elapsed time 2019-06-17 15:53:03 +02:00
mokaddem f3f2c0d4bf fix: [diagnostic] Catch connectionError exception 2019-06-17 15:09:08 +02:00
mokaddem 3545eaa970 chg: [diagnostic] Moved timeoutException into util and bumped
requirements
2019-06-17 14:47:51 +02:00
mokaddem 1d7e6c185e chg: [diagnostic] Added tests for server 2019-06-17 14:42:41 +02:00
mokaddem 9d4c86010b chg: Added more processes and subscriber tests 2019-06-17 11:41:14 +02:00
mokaddem b4c2193b1a chg: Added more tests in diagnostic tool 2019-06-17 09:50:26 +02:00
mokaddem 51095d685c chg: Improved diagnostic tool 2019-06-14 16:59:00 +02:00
mokaddem a50c5c6fdb new: Started dev of diagnostic tool - WiP 2019-06-14 16:15:07 +02:00
Sami Mokaddem e2dea48294
Update zmq_subscribers.py
Added a test comment
2019-03-14 11:41:07 +01:00
Jean-Louis Huynen 787edbc301
put 0MQ subscribers into screens 2019-03-06 11:31:00 +01:00
28 changed files with 1427 additions and 108 deletions


@@ -57,6 +57,13 @@ __Includes__:
![Dashboard users](./screenshots/dashboard-trendings.png)
# Installation
Before installing, note that the only supported systems are open-source Unix-like operating systems such as Linux.
1. You will need to [create a free MaxMind account.](https://www.maxmind.com/en/geolite2/signup)
2. Set your password and [create a license key](https://www.maxmind.com/en/accounts/current/license-key)
2.1 Make a note of your License Key; it's needed during install.
- Launch ```./install_dependencies.sh``` from the MISP-Dashboard directory ([idempotent-ish](https://en.wikipedia.org/wiki/Idempotence))
- Update the configuration file ```config.cfg``` so that it matches your system
- Fields that you may change:
@@ -90,6 +97,7 @@ Traceback (most recent call last):
with open(dst, 'wb') as fdst:
OSError: [Errno 26] Text file busy: '/home/steve/code/misp-dashboard/DASHENV/bin/python3'
```
- Restart the System: `./start_all.sh` **OR** `./start_zmq.sh` and `./server.py &`
# Starting the System
:warning: You should not run it as root. Normal privileges are fine.
@@ -104,6 +112,10 @@ OSError: [Errno 26] Text file busy: '/home/steve/code/misp-dashboard/DASHENV/bin/python3'
__Alternatively__, you can run the ```start_all.sh``` script to run the commands described above.
# Authentication
Authentication can be enabled in ``config/config.cfg`` by setting ``auth_enabled = True``.
Users will be required to log in to MISP and will be allowed to proceed if the *User Setting* ``dashboard_access`` is set to 1 for their MISP user account.
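For example, a minimal ``[Auth]`` fragment (``misp_fqdn`` below is a placeholder; point it at your own MISP instance):

```
[Auth]
auth_enabled = True
misp_fqdn = https://misp.local
ssl_verify = True
session_secret = **Change_Me**
```

These keys mirror the defaults shipped in ``config/config.cfg.default``.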
# Debug
Debug is fun and gives you more details on what is going on when things fail.


@@ -2,6 +2,22 @@
host = localhost
port = 8001
debug = False
ssl = False
# If ssl is set to True and no cert/key is provided, an ad hoc (self-signed) certificate is created
# ssl_cert = cert.pem
# ssl_key = key.pem
[Auth]
auth_enabled = False
misp_fqdn = https://misp.local
ssl_verify = True
session_secret = **Change_Me**
# Only send cookies with requests over HTTPS if the cookie is marked secure.
session_cookie_secure = True
# Prevent sending cookies in all external requests including regular links.
session_cookie_samesite = Strict
# Expire session cookie after n days.
permanent_session_lifetime = 1
[Dashboard]
#hours
@@ -17,7 +33,7 @@ size_dashboard_left_width = 5
size_openStreet_pannel_perc = 55
size_world_pannel_perc = 35
item_to_plot = Attribute.category
fieldname_order=["Event.id", "Attribute.Tag", "Attribute.category", "Attribute.type", ["Attribute.value", "Attribute.comment"]]
fieldname_order=["Attribute.timestamp", "Event.id", "Attribute.Tag", "Attribute.category", "Attribute.type", ["Attribute.value", "Attribute.comment"]]
char_separator=||
[GEO]
@@ -37,14 +53,26 @@ directory=logs
dispatcher_filename=zmq_dispatcher.log
subscriber_filename=zmq_subscriber.log
helpers_filename=helpers.log
update_filename=updates.log
[RedisGlobal]
host=localhost
port=6250
#misp_web_url = http://192.168.56.50
misp_web_url = http://localhost
#zmq_url=tcp://192.168.56.50:50000
zmq_url=tcp://localhost:50000
misp_web_url = http://0.0.0.0
misp_instances = [{
"name": "misp1",
"url": "http://localhost",
"zmq": "tcp://localhost:50000"}]
#misp_instances = [{
# "name": "misp1",
# "url": "http://localhost",
# "zmq": "tcp://localhost:50000"},
# {
# "name": "misp2",
# "url": "http://10.0.2.4",
# "zmq": "tcp://10.0.2.4:50000"}
# ]
[RedisLIST]
db=3
@@ -67,3 +95,4 @@ path_countrycode_to_coord_JSON=./data/country_code_lat_long.json
[RedisDB]
db=2
dbVersion=db_version
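Note that ``misp_instances`` above is a JSON array embedded in an INI value; consumers read it with ``configparser`` and then ``json.loads`` (the diagnostic tool in this changeset does exactly this). A minimal, self-contained sketch with a sample config inlined:

```python
import configparser
import json

# Sample mirroring the [RedisGlobal] section above (values illustrative)
sample = """
[RedisGlobal]
host=localhost
port=6250
misp_instances = [{
    "name": "misp1",
    "url": "http://localhost",
    "zmq": "tcp://localhost:50000"}]
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)

# configparser joins the indented continuation lines; the result is valid JSON
instances = json.loads(cfg.get('RedisGlobal', 'misp_instances'))
```

The same pattern extends to the commented-out multi-instance example: each dict in the array describes one MISP instance and its ZMQ endpoint.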

diagnostic.py Executable file

@@ -0,0 +1,500 @@
#!/usr/bin/env python3
import os
import sys
import stat
import time
import signal
import functools
import configparser
from urllib.parse import urlparse, parse_qs
import subprocess
import diagnostic_util
try:
import redis
import zmq
import json
import flask
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
from halo import Halo
except ModuleNotFoundError as e:
print('Dependency not met. Either not in a virtualenv or dependency not installed.')
print('- Error: {}'.format(e))
sys.exit(1)
'''
Steps:
- check if dependencies exists
- check if virtualenv exists
- check if configuration is up to date
- check file permission
- check if redis is running and responding
- check if able to connect to zmq
- check zmq_dispatcher processing queue
- check queue status: filling up / draining
- check if subscriber responding
- check if dispatcher responding
- check if server listening
- check log static endpoint
- check log dynamic endpoint
'''
HOST = 'http://127.0.0.1'
PORT = 8001 # overridden by configuration file
configuration_file = {}
pgrep_subscriber_output = ''
pgrep_dispatcher_output = ''
signal.signal(signal.SIGALRM, diagnostic_util.timeout_handler)
def humanize(name, isResult=False):
words = name.split('_')
if isResult:
words = words[1:]
words[0] = words[0][0].upper() + words[0][1:]
else:
words[0] = words[0][0].upper() + words[0][1:] + 'ing'
return ' '.join(words)
def add_spinner(_func=None, name='dots'):
def decorator_add_spinner(func):
@functools.wraps(func)
def wrapper_add_spinner(*args, **kwargs):
human_func_name = humanize(func.__name__)
human_func_result = humanize(func.__name__, isResult=True)
flag_skip = False
with Halo(text=human_func_name, spinner=name) as spinner:
result = func(spinner, *args, **kwargs)
if isinstance(result, tuple):
status, output = result
elif isinstance(result, list):
status = result[0]
output = result[1]
elif isinstance(result, bool):
status = result
output = None
else:
status = False
flag_skip = True
spinner.fail('{} - Function returned an unexpected result: {}'.format(human_func_name, str(result)))
if not flag_skip:
text = human_func_result
if output is not None and len(output) > 0:
text += ': {}'.format(output)
if isinstance(status, bool) and status:
spinner.succeed(text)
elif isinstance(status, bool) and not status:
spinner.fail(text)
else:
if status == 'info':
spinner.info(text)
else:
spinner.warn(text)
return status
return wrapper_add_spinner
if _func is None:
return decorator_add_spinner
else:
return decorator_add_spinner(_func)
@add_spinner
def check_virtual_environment_and_packages(spinner):
result = os.environ.get('VIRTUAL_ENV')
if result is None:
return (False, 'This diagnostic tool should be started inside a virtual environment.')
else:
if redis.__version__.startswith('2'):
return (False, '''Redis python client has version {}. Version 3.x is required.
\t [inside virtualenv] pip3 install -U redis'''.format(redis.__version__))
else:
return (True, '')
@add_spinner
def check_configuration(spinner):
global configuration_file, PORT
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
configuration_file = cfg
cfg = {s: dict(cfg.items(s)) for s in cfg.sections()}
configfile_default = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg.default')
cfg_default = configparser.ConfigParser()
cfg_default.read(configfile_default)
cfg_default = {s: dict(cfg_default.items(s)) for s in cfg_default.sections()}
# Check if all fields from config.default exists in config
result, faulties = diagnostic_util.dict_compare(cfg_default, cfg)
if result:
PORT = configuration_file.get("Server", "port")
return (True, '')
else:
return_text = '''Configuration incomplete.
\tUpdate your configuration file `config.cfg`.\n\t Faulty fields:\n'''
for field_name in faulties:
return_text += '\t\t- {}\n'.format(field_name)
return (False, return_text)
@add_spinner(name='dot')
def check_file_permission(spinner):
max_mind_database_path = configuration_file.get('RedisMap', 'pathmaxminddb')
try:
st = os.stat(max_mind_database_path)
except FileNotFoundError:
return (False, 'Maxmind GeoDB - File not found')
all_read_perm = bool(st.st_mode & stat.S_IROTH) # FIXME: permission may be changed
if all_read_perm:
return (True, '')
else:
return (False, 'Maxmind GeoDB might have incorrect read file permission')
@add_spinner
def check_redis(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLog', 'db'))
if redis_server.ping():
return (True, '')
else:
return (False, '''Can\'t reach Redis server.
\t Make sure it is running and adapt your configuration accordingly''')
@add_spinner
def check_zmq(spinner):
timeout = 15
context = zmq.Context()
misp_instances = json.loads(configuration_file.get('RedisGlobal', 'misp_instances'))
instances_status = {}
for misp_instance in misp_instances:
socket = context.socket(zmq.SUB)
socket.connect(misp_instance.get('zmq'))
socket.setsockopt_string(zmq.SUBSCRIBE, '')
poller = zmq.Poller()
flag_skip = False
start_time = time.time()
poller.register(socket, zmq.POLLIN)
for t in range(1, timeout+1):
socks = dict(poller.poll(timeout=1*1000))
if len(socks) > 0:
if socket in socks and socks[socket] == zmq.POLLIN:
rcv_string = socket.recv()
if rcv_string.startswith(b'misp_json'):
instances_status[misp_instance.get('name')] = True
flag_skip = True
break
else:
spinner.text = 'checking zmq of {} - elapsed time: {}s'.format(misp_instance.get("name"), int(time.time() - start_time))
if not flag_skip:
instances_status[misp_instance.get('name')] = False
results = [s for n, s in instances_status.items()]
if all(results):
return (True, '')
elif any(results):
return_text = 'Some ZMQ stream(s) could not be reached:\n'
for name, status in instances_status.items():
return_text += '\t{}: {}\n'.format(name, "success" if status else "failed")
return (True, return_text)
else:
return (False, '''Can\'t connect to the ZMQ stream(s).
\t Make sure the MISP ZMQ is running: `/servers/serverSettings/diagnostics`
\t Make sure your network infrastructure allows you to connect to the ZMQ''')
@add_spinner
def check_processes_status(spinner):
global pgrep_subscriber_output, pgrep_dispatcher_output
try:
response = subprocess.check_output(
["pgrep", "-laf", "zmq_"],
universal_newlines=True
)
except subprocess.CalledProcessError as e:
return (False, 'Could not get processes status. Error returned:\n'+str(e))
for line in response.splitlines():
lines = line.split(' ', maxsplit=1)
pid, p_name = lines
if 'zmq_subscriber.py' in p_name:
pgrep_subscriber_output = line
elif 'zmq_dispatcher.py' in p_name:
pgrep_dispatcher_output = line
if len(pgrep_subscriber_output) == 0:
return (False, 'zmq_subscriber is not running')
elif len(pgrep_dispatcher_output) == 0:
return (False, 'zmq_dispatcher is not running')
else:
return (True, 'Both processes are running')
@add_spinner
def check_subscriber_status(spinner):
global pgrep_subscriber_output
pool = redis.ConnectionPool(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'),
decode_responses=True)
monitor = diagnostic_util.Monitor(pool)
commands = monitor.monitor()
start_time = time.time()
signal.alarm(15)
try:
for i, c in enumerate(commands):
if i == 0: # Skip 'OK'
continue
split = c.split()
try:
action = split[3]
target = split[4]
except IndexError:
continue
if action == '"LPUSH"' and target == '\"{}\"'.format(configuration_file.get("RedisLIST", "listName")):
signal.alarm(0)
break
else:
spinner.text = 'Checking subscriber status - elapsed time: {}s'.format(int(time.time() - start_time))
except diagnostic_util.TimeoutException:
return_text = '''zmq_subscriber seems not to be working.
\t Consider restarting it: {}'''.format(pgrep_subscriber_output)
return (False, return_text)
return (True, 'subscriber is running and populating the buffer')
@add_spinner
def check_buffer_queue(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
warning_threshold = 100
elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
return_status = 'warning' if elements_in_list > warning_threshold else ('info' if elements_in_list > 0 else True)
return_text = 'Currently {} items in the buffer'.format(elements_in_list)
return (return_status, return_text)
@add_spinner
def check_buffer_change_rate(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
time_slept = 0
sleep_duration = 0.001
sleep_max = 10.0
refresh_frequency = 1.0
next_refresh = 0
change_increase = 0
change_decrease = 0
elements_in_list_prev = 0
elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
elements_in_inlist_init = elements_in_list
consecutive_no_rate_change = 0
while True:
elements_in_list_prev = elements_in_list
elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
change_increase += elements_in_list - elements_in_list_prev if elements_in_list - elements_in_list_prev > 0 else 0
change_decrease += elements_in_list_prev - elements_in_list if elements_in_list_prev - elements_in_list > 0 else 0
if next_refresh < time_slept:
next_refresh = time_slept + refresh_frequency
change_rate_text = '{}/sec\t{}/sec'.format(change_increase, change_decrease)
spinner.text = 'Buffer: {}\t{}'.format(elements_in_list, change_rate_text)
if consecutive_no_rate_change == 3:
time_slept = sleep_max
if elements_in_list == 0:
consecutive_no_rate_change += 1
else:
consecutive_no_rate_change = 0
change_increase = 0
change_decrease = 0
if time_slept >= sleep_max:
return_flag = elements_in_list == 0 or (elements_in_list < elements_in_inlist_init or elements_in_list < 2)
return_text = 'Buffer is consumed {} than being populated'.format("faster" if return_flag else "slower")
break
time.sleep(sleep_duration)
time_slept += sleep_duration
elements_in_inlist_final = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
return (return_flag, return_text)
@add_spinner
def check_dispatcher_status(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
content = {'content': time.time()}
redis_server.rpush(configuration_file.get('RedisLIST', 'listName'),
json.dumps({'zmq_name': 'diagnostic_channel', 'content': 'diagnostic_channel ' + json.dumps(content)})
)
return_flag = False
return_text = ''
time_slept = 0
sleep_duration = 0.2
sleep_max = 10.0
redis_server.delete('diagnostic_tool_response')
while True:
reply = redis_server.get('diagnostic_tool_response')
elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
if reply is None:
if time_slept >= sleep_max:
return_flag = False
return_text = 'zmq_dispatcher did not respond in the given time ({}s)'.format(int(sleep_max))
if len(pgrep_dispatcher_output) > 0:
return_text += '\n\t➥ Consider restarting it: {}'.format(pgrep_dispatcher_output)
else:
return_text += '\n\t➥ Consider starting it'
break
time.sleep(sleep_duration)
spinner.text = 'Dispatcher status: No response yet'
time_slept += sleep_duration
else:
return_flag = True
return_text = 'Took {:.2f}s to complete'.format(float(reply))
break
return (return_flag, return_text)
@add_spinner
def check_server_listening(spinner):
url = '{}:{}/_get_log_head'.format(HOST, PORT)
spinner.text = 'Trying to connect to {}'.format(url)
try:
r = requests.get(url)
except requests.exceptions.ConnectionError:
return (False, 'Can\'t connect to {}'.format(url))
if '/error_page' in r.url:
o = urlparse(r.url)
query = parse_qs(o.query)
error_code = query.get('error_code', '')
if error_code[0] == '1':
return (False, 'Too many redirects. The server may not be properly configured\n\t➥ Try to correctly set up an HTTPS server or change the cookie policy in the configuration')
else:
error_message = query.get('error_message', '')[0]
return (False, 'Unknown error: {}\n{}'.format(error_code, error_message))
else:
return (
r.status_code == 200,
'{} {}reached. Status code [{}]'.format(url, "not " if r.status_code != 200 else "", r.status_code)
)
@add_spinner
def check_server_dynamic_enpoint(spinner):
payload = {
'username': 'admin@admin.test',
'password': 'Password1234',
'submit': 'Sign In'
}
sleep_max = 15
start_time = time.time()
# Check MISP connectivity
url_misp = configuration_file.get("Auth", "misp_fqdn")
try:
r = requests.get(url_misp, verify=configuration_file.getboolean("Auth", "ssl_verify"))
except requests.exceptions.SSLError as e:
if 'CERTIFICATE_VERIFY_FAILED' in str(e):
return (False, 'SSL connection error: certificate verify failed.\n\t➥ Review your configuration')
else:
return (False, 'SSL connection error `{}`.\n\t➥ Review your configuration'.format(e))
except requests.exceptions.ConnectionError:
return (False, 'MISP `{}` cannot be reached.\n\t➥ Review your configuration'.format(url_misp))
url_login = '{}:{}/login'.format(HOST, PORT)
url = '{}:{}/_logs'.format(HOST, PORT)
session = requests.Session()
session.verify = configuration_file.getboolean("Auth", "ssl_verify")
r_login = session.post(url_login, data=payload)
# Check if we ended up on the error page
if '/error_page' in r_login.url:
o = urlparse(r_login.url)
query = parse_qs(o.query)
error_code = query.get('error_code', '')
if error_code[0] == '2':
return (False, 'MISP cannot be reached for authentication\n\t➥ Review MISP fully qualified name and SSL settings')
else:
error_message = query.get('error_message', '')[0]
return (False, 'Unknown error: {}\n{}'.format(error_code, error_message))
# Recover error message from the url
if '/login' in r_login.url:
o = urlparse(r_login.url)
query = parse_qs(o.query)
error_message = query.get('auth_error_message', ['Redirected to `login` caused by an unknown error'])[0]
return_text = 'Redirected to `login` caused by: {}'.format(error_message)
return (False, return_text)
# Connection seems to be successful, checking if we receive data from event-stream
r = session.get(url, stream=True, timeout=sleep_max, headers={'Accept': 'text/event-stream'})
return_flag = False
return_text = 'Dynamic endpoint returned data but not in the correct format.'
try:
for line in r.iter_lines():
if line.startswith(b'data: '):
data = line[6:]
try:
json.loads(data)
return_flag = True
return_text = 'Dynamic endpoint returned data (took {:.2f}s)\n\t{}...'.format(time.time()-start_time, line[6:20])
break
except Exception:
return_flag = False
return_text = 'Something went wrong. Output {}'.format(line)
break
except diagnostic_util.TimeoutException:
return_text = 'Dynamic endpoint did not return data within the given time ({}s)'.format(int(time.time()-start_time))
return (return_flag, return_text)
def start_diagnostic():
if not (check_virtual_environment_and_packages() and check_configuration()):
return
check_file_permission()
check_redis()
check_zmq()
check_processes_status()
check_subscriber_status()
if check_buffer_queue() is not True:
check_buffer_change_rate()
dispatcher_running = check_dispatcher_status()
if check_server_listening() and dispatcher_running:
check_server_dynamic_enpoint()
def main():
start_diagnostic()
if __name__ == '__main__':
main()

diagnostic_util.py Normal file

@@ -0,0 +1,68 @@
import configparser
def dict_compare(dict1, dict2, itercount=0):
dict1_keys = set(dict1.keys())
dict2_keys = set(dict2.keys())
missing_keys = dict1_keys.difference(dict2_keys)
faulties = []
if itercount > 0 and len(missing_keys) > 0:
return (False, list(missing_keys))
flag_no_error = True
for k, v in dict1.items():
if isinstance(v, dict):
if k not in dict2:
faulties.append({k: dict1[k]})
flag_no_error = False
else:
status, faulty = dict_compare(v, dict2[k], itercount+1)
flag_no_error = flag_no_error and status
if len(faulty) > 0:
faulties.append({k: faulty})
else:
continue
if flag_no_error:
return (True, [])
else:
return (False, faulties)
class TimeoutException(Exception):
pass
def timeout_handler(signum, frame):
raise TimeoutException
# https://stackoverflow.com/a/10464730
class Monitor():
def __init__(self, connection_pool):
self.connection_pool = connection_pool
self.connection = None
def __del__(self):
try:
self.reset()
except Exception:
pass
def reset(self):
if self.connection:
self.connection_pool.release(self.connection)
self.connection = None
def monitor(self):
if self.connection is None:
self.connection = self.connection_pool.get_connection(
'monitor', None)
self.connection.send_command("monitor")
return self.listen()
def parse_response(self):
return self.connection.read_response()
def listen(self):
while True:
yield self.parse_response()

View File

@ -210,7 +210,7 @@ def main():
for award in awards_given:
# update awards given
serv_redis_db.zadd('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec }))
serv_redis_db.zadd('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
serv_redis_db.expire('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), ONE_DAY*7) # expire after 7 days
# publish
publish_log('GIVE_HONOR_ZMQ', 'CONTRIBUTION', {'org': org, 'award': award, 'epoch': nowSec }, CHANNEL_LASTAWARDS)
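This hunk, and the matching ones in the helper classes below, adapt the code to the redis-py 3.x API: `zadd` now takes a `{member: score}` mapping and `zincrby` takes the amount before the member. A sketch of the call-shape change (`keyname` and `serv_redis_db` stand in for the real objects; the redis calls are left commented so no server is needed):

```python
import json
import time

nowSec = int(time.time())
org, award = 'CIRCL', 'contributor'      # illustrative values only

# redis-py 2.x:  zadd(key, score, member)    /  zincrby(key, member, amount)
# redis-py 3.x:  zadd(key, {member: score})  /  zincrby(key, amount, member)
member = json.dumps({'org': org, 'award': award, 'epoch': nowSec})
mapping = {member: nowSec}
# serv_redis_db.zadd(keyname, mapping)       # 3.x form used throughout this diff
# serv_redis_db.zincrby(keyname, 1, org)     # amount now comes before the member
```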

View File

@ -39,12 +39,16 @@ class Contributor_helper:
if not os.path.exists(logDir):
os.makedirs(logDir)
try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
#honorBadge
self.honorBadgeNum = len(self.cfg_org_rank.options('HonorBadge'))
@ -96,7 +100,7 @@ class Contributor_helper:
self.DICO_PNTS_REWARD[categ] = self.default_pnts_per_contribution
self.rankMultiplier = self.cfg_org_rank.getfloat('monthlyRanking' ,'rankMultiplier')
self.levelMax = self.cfg_org_rank.getint('monthlyRanking' ,'levelMax')
self.levelMax = self.cfg_org_rank.getint('monthlyRanking', 'levelMax')
# REDIS KEYS
self.keyDay = KEYDAY
@ -107,7 +111,6 @@ class Contributor_helper:
self.keyTrophy = "CONTRIB_TROPHY"
self.keyLastAward = "CONTRIB_LAST_AWARDS"
''' HELPER '''
def getOrgLogoFromMISP(self, org):
return "{}/img/orgs/{}.png".format(self.misp_web_url, org)
@ -115,11 +118,11 @@ class Contributor_helper:
def addContributionToCateg(self, date, categ, org, count=1):
today_str = util.getDateStrFormat(date)
keyname = "{}:{}:{}".format(self.keyCateg, today_str, categ)
self.serv_redis_db.zincrby(keyname, org, count)
self.serv_redis_db.zincrby(keyname, count, org)
self.logger.debug('Added to redis: keyname={}, org={}, count={}'.format(keyname, org, count))
def publish_log(self, zmq_name, name, content, channel=""):
to_send = { 'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
to_send = {'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
self.serv_log.publish(channel, json.dumps(to_send))
self.logger.debug('Published: {}'.format(json.dumps(to_send)))
@ -155,7 +158,7 @@ class Contributor_helper:
self.serv_redis_db.sadd(self.keyAllOrg, org)
keyname = "{}:{}".format(self.keyLastContrib, util.getDateStrFormat(now))
self.serv_redis_db.zadd(keyname, nowSec, org)
self.serv_redis_db.zadd(keyname, {org: nowSec})
self.logger.debug('Added to redis: keyname={}, nowSec={}, org={}'.format(keyname, nowSec, org))
self.serv_redis_db.expire(keyname, util.ONE_DAY*7) # expire after 7 days
@ -164,7 +167,7 @@ class Contributor_helper:
for award in awards_given:
# update awards given
keyname = "{}:{}".format(self.keyLastAward, util.getDateStrFormat(now))
self.serv_redis_db.zadd(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec }))
self.serv_redis_db.zadd(keyname, {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
self.logger.debug('Added to redis: keyname={}, nowSec={}, content={}'.format(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec })))
self.serv_redis_db.expire(keyname, util.ONE_DAY*7) # expire after 7 days
# publish
@ -177,7 +180,7 @@ class Contributor_helper:
if pnts is None:
pnts = 0
else:
pnts = int(pnts.decode('utf8'))
pnts = int(pnts)
return pnts
# return: [final_rank, requirement_fulfilled, requirement_not_fulfilled]
@ -381,7 +384,7 @@ class Contributor_helper:
def getOrgsTrophyRanking(self, categ):
keyname = '{mainKey}:{orgCateg}'
res = self.serv_redis_db.zrange(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), 0, -1, withscores=True, desc=True)
res = [[org.decode('utf8'), score] for org, score in res]
res = [[org, score] for org, score in res]
return res
def getAllOrgsTrophyRanking(self, category=None):
@ -410,12 +413,12 @@ class Contributor_helper:
def giveTrophyPointsToOrg(self, org, categ, points):
keyname = '{mainKey}:{orgCateg}'
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, points)
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), points, org)
self.logger.debug('Giving {} trophy points to {} in {}'.format(points, org, categ))
def removeTrophyPointsFromOrg(self, org, categ, points):
keyname = '{mainKey}:{orgCateg}'
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, -points)
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), -points, org)
self.logger.debug('Removing {} trophy points from {} in {}'.format(points, org, categ))
''' AWARDS HELPER '''
@ -562,7 +565,7 @@ class Contributor_helper:
def getAllOrgFromRedis(self):
data = self.serv_redis_db.smembers(self.keyAllOrg)
data = [x.decode('utf8') for x in data]
data = [x for x in data]
return data
def getCurrentOrgRankFromRedis(self, org):

View File

@ -21,6 +21,7 @@ from phonenumbers import geocoder
class InvalidCoordinate(Exception):
pass
class Geo_helper:
def __init__(self, serv_redis_db, cfg):
self.serv_redis_db = serv_redis_db
@ -38,12 +39,16 @@ class Geo_helper:
if not os.path.exists(logDir):
os.makedirs(logDir)
try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
self.keyCategCoord = "GEO_COORD"
self.keyCategCountry = "GEO_COUNTRY"
@ -58,7 +63,12 @@ class Geo_helper:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
self.country_to_iso = { country.name: country.alpha_2 for country in pycountry.countries}
self.country_to_iso = {}
for country in pycountry.countries:
try:
self.country_to_iso[country.name] = country.alpha_2
except AttributeError:
pass
with open(self.PATH_TO_JSON) as f:
self.country_code_to_coord = json.load(f)
@ -120,7 +130,9 @@ class Geo_helper:
if not self.coordinate_list_valid(coord_list):
raise InvalidCoordinate("Coordinates do not match EPSG:900913 / EPSG:3785 / OSGEO:41001")
self.push_to_redis_zset(self.keyCategCoord, json.dumps(ordDic))
self.push_to_redis_zset(self.keyCategCountry, rep['full_rep'].country.iso_code)
iso_code = rep['full_rep'].country.iso_code if rep['full_rep'].country.iso_code is not None else rep['full_rep'].registered_country.iso_code
country_name = rep['full_rep'].country.name if rep['full_rep'].country.name is not None else rep['full_rep'].registered_country.name
self.push_to_redis_zset(self.keyCategCountry, iso_code)
ordDic = OrderedDict() #keep fields with the same layout in redis
ordDic['categ'] = categ
ordDic['value'] = supposed_ip
@ -129,10 +141,10 @@ class Geo_helper:
"coord": coord,
"categ": categ,
"value": supposed_ip,
"country": rep['full_rep'].country.name,
"country": country_name,
"specifName": rep['full_rep'].subdivisions.most_specific.name,
"cityName": rep['full_rep'].city.name,
"regionCode": rep['full_rep'].country.iso_code,
"regionCode": iso_code,
}
j_to_send = json.dumps(to_send)
self.serv_coord.publish(self.CHANNELDISP, j_to_send)
@ -202,11 +214,15 @@ class Geo_helper:
print("Please fix the above, and make sure you use a redis version that supports the GEOADD command.")
print("To test for support: echo \"help GEOADD\"| redis-cli")
self.logger.debug('Added to redis: keyname={}, lon={}, lat={}, content={}'.format(keyname, lon, lat, content))
def push_to_redis_zset(self, keyCateg, toAdd, endSubkey="", count=1):
if not isinstance(toAdd, str):
self.logger.warning('Can\'t add to redis, element is not of type String. {}'.format(type(toAdd)))
return
now = datetime.datetime.now()
today_str = util.getDateStrFormat(now)
keyname = "{}:{}{}".format(keyCateg, today_str, endSubkey)
self.serv_redis_db.zincrby(keyname, toAdd, count)
self.serv_redis_db.zincrby(keyname, count, toAdd)
self.logger.debug('Added to redis: keyname={}, toAdd={}, count={}'.format(keyname, toAdd, count))
def ip_to_coord(self, ip):

View File

@ -23,12 +23,16 @@ class Live_helper:
if not os.path.exists(logDir):
os.makedirs(logDir)
try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
def publish_log(self, zmq_name, name, content, channel=None):
channel = channel if channel is not None else self.CHANNEL
@ -48,7 +52,7 @@ class Live_helper:
entries = self.serv_live.lrange(rKey, 0, -1)
to_ret = []
for entry in entries:
jentry = json.loads(entry.decode('utf8'))
jentry = json.loads(entry)
to_ret.append(jentry)
return to_ret

View File

@ -32,12 +32,16 @@ class Trendings_helper:
if not os.path.exists(logDir):
os.makedirs(logDir)
try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
''' SETTER '''
@ -49,7 +53,7 @@ class Trendings_helper:
to_save = json.dumps(data)
else:
to_save = data
self.serv_redis_db.zincrby(keyname, to_save, 1)
self.serv_redis_db.zincrby(keyname, 1, to_save)
self.logger.debug('Added to redis: keyname={}, content={}'.format(keyname, to_save))
def addTrendingEvent(self, eventName, timestamp):
@ -91,7 +95,7 @@ class Trendings_helper:
for curDate in util.getXPrevDaysSpan(dateE, prev_days):
keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=True)
data = [ [record[0].decode('utf8'), record[1]] for record in data ]
data = [ [record[0], record[1]] for record in data ]
data = data if data is not None else []
to_ret.append([util.getTimestamp(curDate), data])
to_ret = util.sortByTrendingScore(to_ret, topNum=topNum)
@ -124,7 +128,7 @@ class Trendings_helper:
for curDate in util.getXPrevDaysSpan(dateE, prev_days):
keyname = "{}:{}".format(self.keyTag, util.getDateStrFormat(curDate))
data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
data = [ [record[0].decode('utf8'), record[1]] for record in data ]
data = [ [record[0], record[1]] for record in data ]
data = data if data is not None else []
temp = []
for jText, score in data:
@ -139,10 +143,10 @@ class Trendings_helper:
for curDate in util.getXPrevDaysSpan(dateE, prev_days):
keyname = "{}:{}".format(self.keySigh, util.getDateStrFormat(curDate))
sight = self.serv_redis_db.get(keyname)
sight = 0 if sight is None else int(sight.decode('utf8'))
sight = 0 if sight is None else int(sight)
keyname = "{}:{}".format(self.keyFalse, util.getDateStrFormat(curDate))
fp = self.serv_redis_db.get(keyname)
fp = 0 if fp is None else int(fp.decode('utf8'))
fp = 0 if fp is None else int(fp)
to_ret.append([util.getTimestamp(curDate), { 'sightings': sight, 'false_positive': fp}])
return to_ret
@ -158,7 +162,7 @@ class Trendings_helper:
keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
for elem in data:
allSet.add(elem.decode('utf8'))
allSet.add(elem)
to_ret[trendingType] = list(allSet)
tags = self.getTrendingTags(dateS, dateE)
tagSet = set()
@ -187,7 +191,7 @@ class Trendings_helper:
for curDate in util.getXPrevDaysSpan(dateE, prev_days):
keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
data = [ [record[0].decode('utf8'), record[1]] for record in data ]
data = [ [record[0], record[1]] for record in data ]
data = data if data is not None else []
to_format.append([util.getTimestamp(curDate), data])

View File

@ -29,23 +29,27 @@ class Users_helper:
if not os.path.exists(logDir):
os.makedirs(logDir)
try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
def add_user_login(self, timestamp, org, email=''):
timestampDate = datetime.datetime.fromtimestamp(float(timestamp))
timestampDate_str = util.getDateStrFormat(timestampDate)
keyname_timestamp = "{}:{}".format(self.keyTimestamp, org)
self.serv_redis_db.zadd(keyname_timestamp, timestamp, timestamp)
self.serv_redis_db.zadd(keyname_timestamp, {timestamp: timestamp})
self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_timestamp, timestamp))
keyname_org = "{}:{}".format(self.keyOrgLog, timestampDate_str)
self.serv_redis_db.zincrby(keyname_org, org, 1)
self.serv_redis_db.zincrby(keyname_org, 1, org)
self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_org, org))
self.serv_redis_db.sadd(self.keyAllOrgLog, org)
@ -53,7 +57,7 @@ class Users_helper:
def getAllOrg(self):
temp = self.serv_redis_db.smembers(self.keyAllOrgLog)
return [ org.decode('utf8') for org in temp ]
return [ org for org in temp ]
# return: All timestamps for one org for the spanned time or not
def getDates(self, org, date=None):
@ -90,7 +94,7 @@ class Users_helper:
keyname = "{}:{}".format(self.keyOrgLog, util.getDateStrFormat(curDate))
data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
for org in data:
orgs.add(org.decode('utf8'))
orgs.add(org)
return list(orgs)
# return: list composed of the number of [log, contrib] for one org for the time spanned
@ -134,7 +138,7 @@ class Users_helper:
def getLoginVSCOntribution(self, date):
keyname = "{}:{}".format(self.keyContribDay, util.getDateStrFormat(date))
orgs_contri = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=False)
orgs_contri = [ org.decode('utf8') for org in orgs_contri ]
orgs_contri = [ org for org in orgs_contri ]
orgs_login = [ org for org in self.getAllLoggedInOrgs(date, prev_days=0) ]
contributed_num = 0
non_contributed_num = 0

View File

@ -6,10 +6,43 @@
## Debug mode
#set -x
sudo apt-get install python3-virtualenv virtualenv screen redis-server unzip -y
# Functions
get_distribution() {
lsb_dist=""
# Every system that we officially support has /etc/os-release
if [ -r /etc/os-release ]; then
lsb_dist="$(. /etc/os-release && echo "$ID")"
fi
# Returning an empty string here should be alright since the
# case statements don't act unless you provide an actual value
echo "$lsb_dist" | tr '[:upper:]' '[:lower:]'
}
sudo chmod -R g+w .
if ! id zmqs >/dev/null 2>&1; then
if [ "$(get_distribution)" == "rhel" ] || [ "$(get_distribution)" == "centos" ]; then
# Create zmq user
sudo useradd -U -G apache -m -s /usr/bin/bash zmqs
# Allow apache to run ./start_zmq.sh as the zmqs user
echo "apache ALL=(zmqs) NOPASSWD:/bin/bash /var/www/misp-dashboard/start_zmq.sh" |sudo tee /etc/sudoers.d/apache
VENV_BIN="/usr/local/bin/virtualenv"
else
# Create zmq user
sudo useradd -U -G www-data -m -s /bin/bash zmqs
# Allow www-data to run ./start_zmq.sh as the zmqs user
echo "www-data ALL=(zmqs) NOPASSWD:/bin/bash /var/www/misp-dashboard/start_zmq.sh" |sudo tee /etc/sudoers.d/www-data
VENV_BIN="virtualenv"
fi
fi
VENV_BIN="${VENV_BIN:-virtualenv}"
sudo apt-get install python3-virtualenv virtualenv screen redis-server unzip net-tools -y
if [ -z "$VIRTUAL_ENV" ]; then
virtualenv -p python3 DASHENV ; DASH_VENV=$?
${VENV_BIN} -p python3 DASHENV ; DASH_VENV=$?
if [[ "$DASH_VENV" != "0" ]]; then
echo "Something went wrong with either the update or install of the virtualenv."
@ -114,10 +147,24 @@ wget http://jvectormap.com/js/jquery-jvectormap-world-mill.js -O ./static/js/jqu
rm -rf data/GeoLite2-City*
mkdir -p data
pushd data
wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz -O GeoLite2-City.tar.gz
# The following lines do not work any more, see: https://blog.maxmind.com/2019/12/18/significant-changes-to-accessing-and-using-geolite2-databases/
#wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz -O GeoLite2-City.tar.gz
read -p "Please paste your MaxMind license key: " MM_LIC
while [ "$(sha256sum -c GeoLite2-City.tar.gz.sha256 >/dev/null; echo $?)" != "0" ]; do
echo "Redownloading GeoLite assets. If this loops, CTRL-C and investigate"
wget "https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_LIC}&suffix=tar.gz" -O GeoLite2-City.tar.gz
wget "https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_LIC}&suffix=tar.gz.sha256" -O GeoLite2-City.tar.gz.sha256
if [[ $? == 6 ]]; then
echo "Something is wrong with your License Key, please try entering another one. (You DO NOT need a GeoIP Update key) "
echo "If you created the key JUST NOW, it will take a couple of minutes to become active."
read -p "Please paste your MaxMind license key: " MM_LIC
fi
sed -i 's/_.*/.tar.gz/' GeoLite2-City.tar.gz.sha256
sleep 3
done
tar xvfz GeoLite2-City.tar.gz
ln -s GeoLite2-City_* GeoLite2-City
rm -rf GeoLite2-City.tar.gz
rm -rf GeoLite2-City.tar.gz*
popd
# DataTable

View File

@ -1,9 +1,13 @@
argparse
flask
flask-login
wtforms
geoip2
# Redis needs to be 2.10.6 due to a change in redis 3.10 client, see here: https://github.com/MISP/misp-dashboard/issues/76#issuecomment-439389621
redis==2.10.6
redis
phonenumbers
pip
pycountry
zmq
requests
halo
pyopenssl

server.py
View File

@ -1,11 +1,14 @@
#!/usr/bin/env python3
import configparser
import datetime
import uuid
import errno
import json
import logging
import math
import os
import re
from datetime import timedelta
import random
from time import gmtime as now
from time import sleep, strftime
@ -13,11 +16,15 @@ from time import sleep, strftime
import redis
import util
from flask import (Flask, Response, jsonify, render_template, request,
send_from_directory, stream_with_context)
from flask import (Flask, Response, jsonify, render_template, request, make_response,
send_from_directory, stream_with_context, url_for, redirect)
from flask_login import (UserMixin, LoginManager, current_user, login_user, logout_user, login_required)
from helpers import (contributor_helper, geo_helper, live_helper,
trendings_helper, users_helper)
import requests
from wtforms import Form, SubmitField, StringField, PasswordField, validators
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
@ -28,21 +35,46 @@ logger.setLevel(logging.ERROR)
server_host = cfg.get("Server", "host")
server_port = cfg.getint("Server", "port")
server_debug = cfg.get("Server", "debug")
server_ssl = cfg.get("Server", "ssl")
try:
server_ssl_cert = cfg.get("Server", "ssl_cert")
server_ssl_key = cfg.get("Server", "ssl_key")
except:
server_ssl_cert = None
server_ssl_key = None
pass
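These [Server] options feed Flask's standard `ssl_context` parameter when the app is started (the actual `app.run()` call sits at the bottom of server.py, outside this hunk). A sketch of how such settings typically translate; `build_ssl_context` is a hypothetical helper name, not part of the shipped code:

```python
def build_ssl_context(server_ssl, server_ssl_cert, server_ssl_key):
    """Translate the [Server] ssl options into Flask's ssl_context argument.

    Hypothetical helper: server.py reads the same three options and passes
    the equivalent value to app.run().
    """
    if str(server_ssl).lower() not in ('true', '1', 'yes'):
        return None                                # plain HTTP
    if server_ssl_cert and server_ssl_key:
        return (server_ssl_cert, server_ssl_key)   # cert/key files on disk
    return 'adhoc'                                 # on-the-fly self-signed cert (needs pyopenssl)

# app.run(host=server_host, port=server_port,
#         ssl_context=build_ssl_context(server_ssl, server_ssl_cert, server_ssl_key))
```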
auth_host = cfg.get("Auth", "misp_fqdn")
auth_enabled = cfg.getboolean("Auth", "auth_enabled")
auth_ssl_verify = cfg.getboolean("Auth", "ssl_verify")
auth_session_secret = cfg.get("Auth", "session_secret")
auth_session_cookie_secure = cfg.getboolean("Auth", "session_cookie_secure")
auth_session_cookie_samesite = cfg.get("Auth", "session_cookie_samesite")
auth_permanent_session_lifetime = cfg.getint("Auth", "permanent_session_lifetime")
app = Flask(__name__)
#app.secret_key = auth_session_secret
app.config.update(
SECRET_KEY=auth_session_secret,
SESSION_COOKIE_SECURE=auth_session_cookie_secure,
SESSION_COOKIE_SAMESITE=auth_session_cookie_samesite,
PERMANENT_SESSION_LIFETIME=timedelta(days=auth_permanent_session_lifetime)
)
redis_server_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db'))
db=cfg.getint('RedisLog', 'db'),
decode_responses=True)
redis_server_map = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisMap', 'db'))
db=cfg.getint('RedisMap', 'db'),
decode_responses=True)
serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db'))
db=cfg.getint('RedisDB', 'db'),
decode_responses=True)
streamLogCacheKey = cfg.get('RedisLog', 'streamLogCacheKey')
streamMapCacheKey = cfg.get('RedisLog', 'streamMapCacheKey')
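Passing `decode_responses=True` makes redis-py return `str` instead of `bytes`, which is why the scattered `.decode('utf8')` calls are dropped throughout this diff. The difference, sketched without a running server:

```python
# Without decode_responses, redis-py hands back raw bytes,
# so every read site needed an explicit decode:
raw = b'CIRCL'
org_old_style = raw.decode('utf8')

# With decode_responses=True the client decodes for you: values arrive as str.
org_new_style = 'CIRCL'
```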
@ -53,6 +85,177 @@ contributor_helper = contributor_helper.Contributor_helper(serv_redis_db, cfg)
users_helper = users_helper.Users_helper(serv_redis_db, cfg)
trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)
login_manager = LoginManager(app)
login_manager.session_protection = "strong"
login_manager.init_app(app)
##########
## Auth ##
##########
class User(UserMixin):
def __init__(self, id, password):
self.id = id
self.password = password
def misp_login(self):
"""
Authenticate this user against MISP with the credentials from the login form.
Uses requests to log into the MISP web UI with self.id and self.password. On successful authentication MISP answers with a redirect to its '/users/routeafterlogin' endpoint, which is detected via the Location header of the (non-followed) redirect response.
:return: (True, '') on success, (None, error_message) otherwise.
"""
post_data = {
"_method": "POST",
"data[_Token][key]": "",
"data[_Token][fields]": "",
"data[_Token][unlocked]": "",
"data[User][email]": self.id,
"data[User][password]": self.password,
}
misp_login_page = auth_host + "/users/login"
misp_user_me_page = auth_host + "/users/view/me.json"
session = requests.Session()
session.verify = auth_ssl_verify
# The login page contains hidden form values required for authentication.
login_page = session.get(misp_login_page)
# This regex matches the "data[_Token][fields]" value needed to make a POST request on the MISP login page.
token_fields_exp = re.compile(r'name="data\[_Token]\[fields]" value="([^\s]+)"')
token_fields = token_fields_exp.search(login_page.text)
# This regex matches the "data[_Token][key]" value needed to make a POST request on the MISP login page.
token_key_exp = re.compile(r'name="data\[_Token]\[key]" value="([^\s]+)"')
token_key = token_key_exp.search(login_page.text)
# This regex matches the optional "data[_Token][debug]" value, present only when MISP runs in debug mode.
token_debug_exp = re.compile(r'name="data\[_Token]\[debug]" value="([^\s]+)"')
token_debug = token_debug_exp.search(login_page.text)
post_data["data[_Token][fields]"] = token_fields.group(1)
post_data["data[_Token][key]"] = token_key.group(1)
# debug_token should return None when MISP debug is off.
# Only send debug_token when MISP is running in debug mode.
if token_debug is not None:
post_data["data[_Token][debug]"] = token_debug.group(1)
# POST request with user credentials + hidden form values.
post_to_login_page = session.post(misp_login_page, data=post_data, allow_redirects=False)
# Consider setup with MISP baseurl set
redirect_location = post_to_login_page.headers.get('Location', '')
# Authentication is successful if MISP returns a redirect to '/users/routeafterlogin'.
if '/users/routeafterlogin' in redirect_location:
# Logged in, check if logged in user can access the dashboard
me_json = session.get(misp_user_me_page).json()
dashboard_access = me_json.get('UserSetting', {}).get('dashboard_access', False)
if dashboard_access is True or dashboard_access == 1:
return (True, '')
else:
return (None, 'User does not have dashboard access')
return (None, '')
@login_manager.user_loader
def load_user(user_id):
"""
Return a User object required by flask-login to keep state of a user session.
Typically load_user is used to perform a user lookup on a database; it should return a User object or None if the user is not found. Authentication is deferred to MISP via User.misp_login(), so this function always returns a User object.
:param user_id: A MISP username.
:return:
"""
return User(user_id, "")
@login_manager.unauthorized_handler
def unauthorized():
"""
Redirect unauthorized user to login page.
:return:
"""
redirectCount = int(request.cookies.get('redirectCount', '0'))
if redirectCount > 5:
response = make_response(redirect(url_for(
'error_page',
error_message='Too many redirects. This can happen when your browser does not accept cookies or the misp-dashboard website is misconfigured',
error_code='1'
)))
response.set_cookie('redirectCount', '0', secure=False, httponly=True)
else:
response = make_response(redirect(url_for('login', auth_error=True, auth_error_message='Unauthorized. Review your cookie settings')))
response.set_cookie('redirectCount', str(redirectCount+1), secure=False, httponly=True)
return response
@app.route('/error_page')
def error_page():
error_message = request.args.get('error_message', False)
return render_template('error_page.html', error_message=error_message)
@app.route('/logout')
@login_required
def logout():
"""
Logout the user and redirect to the login form.
:return:
"""
logout_user()
return redirect(url_for('login'))
@app.route('/login', methods=['GET', 'POST'])
def login():
"""
Login form route.
:return:
"""
if not auth_enabled:
# Generate a random user name and redirect the automatically authenticated user to index.
user = User(str(uuid.uuid4()).replace('-',''), '')
login_user(user)
return redirect(url_for('index'))
if current_user.is_authenticated:
return redirect(url_for('index'))
form = LoginForm(request.form)
if request.method == 'POST' and form.validate():
user = User(form.username.data, form.password.data)
error_message = 'Username and password do not match when connecting to MISP, or the MISP user lacks the required permission'
try:
is_logged_in, misp_error_message = user.misp_login()
if len(misp_error_message) > 0:
error_message = misp_error_message
if is_logged_in:
login_user(user)
return redirect(url_for('index'))
except requests.exceptions.SSLError:
return redirect(url_for('login', auth_error=True, auth_error_message='MISP cannot be reached for authentication'))
return redirect(url_for('login', auth_error=True, auth_error_message=error_message))
else:
auth_error = request.args.get('auth_error', False)
auth_error_message = request.args.get('auth_error_message', '')
return render_template('login.html', title='Login', form=form, authError=auth_error, authErrorMessage=auth_error_message)
class LoginForm(Form):
"""
WTForms form object defining the fields of the login form.
"""
username = StringField('Username', [validators.Length(max=255)])
password = PasswordField('Password', [validators.Length(max=255)])
submit = SubmitField('Sign In')
##########
## UTIL ##
@ -95,13 +298,6 @@ class LogItem():
else:
to_add = util.getFields(self.feed, field)
to_ret[i] = to_add if to_add is not None else ''
# Number to keep them sorted (jsonify sort keys)
for item in range(len(LogItem.FIELDNAME_ORDER)):
try:
to_ret[item] = self.fields[item]
except IndexError: # not enough field in rcv item
to_ret[item] = ''
return to_ret
@ -117,7 +313,6 @@ class EventMessage():
# Suppose the event message is a json with the format {name: 'feedName', log:'logData'}
def __init__(self, msg, filters):
if not isinstance(msg, dict):
msg = msg.decode('utf8')
try:
jsonMsg = json.loads(msg)
jsonMsg['log'] = json.loads(jsonMsg['log'])
@ -164,6 +359,7 @@ class EventMessage():
''' MAIN ROUTE '''
@app.route("/")
@login_required
def index():
ratioCorrection = 88
pannelSize = [
@ -185,11 +381,13 @@ def index():
)
@app.route('/favicon.ico')
@login_required
def favicon():
return send_from_directory(os.path.join(app.root_path, 'static'),
'favicon.ico', mimetype='image/vnd.microsoft.icon')
@app.route("/geo")
@login_required
def geo():
return render_template('geo.html',
zoomlevel=cfg.getint('GEO' ,'zoomlevel'),
@ -197,6 +395,7 @@ def geo():
)
@app.route("/contrib")
@login_required
def contrib():
categ_list = contributor_helper.categories_in_datatable
categ_list_str = [ s[0].upper() + s[1:].replace('_', ' ') for s in categ_list]
@ -248,12 +447,14 @@ def contrib():
)
@app.route("/users")
@login_required
def users():
return render_template('users.html',
)
@app.route("/trendings")
@login_required
def trendings():
maxNum = request.args.get('maxNum')
try:
@ -270,6 +471,7 @@ def trendings():
''' INDEX '''
@app.route("/_logs")
@login_required
def logs():
if request.accept_mimetypes.accept_json or request.method == 'POST':
key = 'Attribute'
@ -288,6 +490,7 @@ def logs():
return Response(stream_with_context(event_stream_log()), mimetype="text/event-stream")
@app.route("/_maps")
@login_required
def maps():
if request.accept_mimetypes.accept_json or request.method == 'POST':
key = 'Map'
@ -297,6 +500,7 @@ def maps():
return Response(event_stream_maps(), mimetype="text/event-stream")
@app.route("/_get_log_head")
@login_required
def getLogHead():
return json.dumps(LogItem('').get_head_row())
@ -321,7 +525,7 @@ def event_stream_maps():
subscriber_map.psubscribe(cfg.get('RedisMap', 'channelDisp'))
try:
for msg in subscriber_map.listen():
content = msg['data'].decode('utf8')
content = msg['data']
to_ret = 'data: {}\n\n'.format(content)
yield to_ret
except GeneratorExit:
@ -330,6 +534,7 @@ def event_stream_maps():
''' GEO '''
@app.route("/_getTopCoord")
@login_required
def getTopCoord():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -339,6 +544,7 @@ def getTopCoord():
return jsonify(data)
@app.route("/_getHitMap")
@login_required
def getHitMap():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -348,6 +554,7 @@ def getHitMap():
return jsonify(data)
@app.route("/_getCoordsByRadius")
@login_required
def getCoordsByRadius():
try:
dateStart = datetime.datetime.fromtimestamp(float(request.args.get('dateStart')))
@ -364,14 +571,17 @@ def getCoordsByRadius():
''' CONTRIB '''
@app.route("/_getLastContributors")
@login_required
def getLastContributors():
return jsonify(contributor_helper.getLastContributorsFromRedis())
@app.route("/_eventStreamLastContributor")
@login_required
def getLastContributor():
return Response(eventStreamLastContributor(), mimetype="text/event-stream")
@app.route("/_eventStreamAwards")
@login_required
def getLastStreamAwards():
return Response(eventStreamAwards(), mimetype="text/event-stream")
@ -380,7 +590,7 @@ def eventStreamLastContributor():
subscriber_lastContrib.psubscribe(cfg.get('RedisLog', 'channelLastContributor'))
try:
for msg in subscriber_lastContrib.listen():
content = msg['data'].decode('utf8')
content = msg['data']
contentJson = json.loads(content)
lastContribJson = json.loads(contentJson['log'])
org = lastContribJson['org']
@ -396,7 +606,7 @@ def eventStreamAwards():
subscriber_lastAwards.psubscribe(cfg.get('RedisLog', 'channelLastAwards'))
try:
for msg in subscriber_lastAwards.listen():
content = msg['data'].decode('utf8')
content = msg['data']
contentJson = json.loads(content)
lastAwardJson = json.loads(contentJson['log'])
org = lastAwardJson['org']
@ -409,6 +619,7 @@ def eventStreamAwards():
subscriber_lastAwards.unsubscribe()
@app.route("/_getTopContributor")
@login_required
def getTopContributor(suppliedDate=None, maxNum=100):
if suppliedDate is None:
try:
@ -422,6 +633,7 @@ def getTopContributor(suppliedDate=None, maxNum=100):
return jsonify(data)
@app.route("/_getFameContributor")
@login_required
def getFameContributor():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -432,6 +644,7 @@ def getFameContributor():
return getTopContributor(suppliedDate=date, maxNum=10)
@app.route("/_getFameQualContributor")
@login_required
def getFameQualContributor():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -442,10 +655,12 @@ def getFameQualContributor():
return getTopContributor(suppliedDate=date, maxNum=10)
@app.route("/_getTop5Overtime")
@login_required
def getTop5Overtime():
return jsonify(contributor_helper.getTop5OvertimeFromRedis())
@app.route("/_getOrgOvertime")
@login_required
def getOrgOvertime():
try:
org = request.args.get('org')
@ -454,6 +669,7 @@ def getOrgOvertime():
return jsonify(contributor_helper.getOrgOvertime(org))
@app.route("/_getCategPerContrib")
@login_required
def getCategPerContrib():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -463,6 +679,7 @@ def getCategPerContrib():
return jsonify(contributor_helper.getCategPerContribFromRedis(date))
@app.route("/_getLatestAwards")
@login_required
def getLatestAwards():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -472,10 +689,12 @@ def getLatestAwards():
return jsonify(contributor_helper.getLastAwardsFromRedis())
@app.route("/_getAllOrg")
@login_required
def getAllOrg():
return jsonify(contributor_helper.getAllOrgFromRedis())
@app.route("/_getOrgRank")
@login_required
def getOrgRank():
try:
org = request.args.get('org')
@ -484,6 +703,7 @@ def getOrgRank():
return jsonify(contributor_helper.getCurrentOrgRankFromRedis(org))
@app.route("/_getContributionOrgStatus")
@login_required
def getContributionOrgStatus():
try:
org = request.args.get('org')
@ -492,6 +712,7 @@ def getContributionOrgStatus():
return jsonify(contributor_helper.getCurrentContributionStatus(org))
@app.route("/_getHonorBadges")
@login_required
def getHonorBadges():
try:
org = request.args.get('org')
@ -500,6 +721,7 @@ def getHonorBadges():
return jsonify(contributor_helper.getOrgHonorBadges(org))
@app.route("/_getTrophies")
@login_required
def getTrophies():
try:
org = request.args.get('org')
@ -509,6 +731,7 @@ def getTrophies():
@app.route("/_getAllOrgsTrophyRanking")
@app.route("/_getAllOrgsTrophyRanking/<string:categ>")
@login_required
def getAllOrgsTrophyRanking(categ=None):
return jsonify(contributor_helper.getAllOrgsTrophyRanking(categ))
@ -516,6 +739,7 @@ def getAllOrgsTrophyRanking(categ=None):
''' USERS '''
@app.route("/_getUserLogins")
@login_required
def getUserLogins():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -527,10 +751,12 @@ def getUserLogins():
return jsonify(data)
@app.route("/_getAllLoggedOrg")
@login_required
def getAllLoggedOrg():
return jsonify(users_helper.getAllOrg())
@app.route("/_getTopOrglogin")
@login_required
def getTopOrglogin():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -541,6 +767,7 @@ def getTopOrglogin():
return jsonify(data)
@app.route("/_getLoginVSCOntribution")
@login_required
def getLoginVSCOntribution():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -551,6 +778,7 @@ def getLoginVSCOntribution():
return jsonify(data)
@app.route("/_getUserLoginsAndContribOvertime")
@login_required
def getUserLoginsAndContribOvertime():
try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -563,6 +791,7 @@ def getUserLoginsAndContribOvertime():
''' TRENDINGS '''
@app.route("/_getTrendingEvents")
@login_required
def getTrendingEvents():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -576,6 +805,7 @@ def getTrendingEvents():
return jsonify(data)
@app.route("/_getTrendingCategs")
@login_required
def getTrendingCategs():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -589,6 +819,7 @@ def getTrendingCategs():
return jsonify(data)
@app.route("/_getTrendingTags")
@login_required
def getTrendingTags():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -602,6 +833,7 @@ def getTrendingTags():
return jsonify(data)
@app.route("/_getTrendingSightings")
@login_required
def getTrendingSightings():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -614,6 +846,7 @@ def getTrendingSightings():
return jsonify(data)
@app.route("/_getTrendingDisc")
@login_required
def getTrendingDisc():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -627,6 +860,7 @@ def getTrendingDisc():
return jsonify(data)
@app.route("/_getTypeaheadData")
@login_required
def getTypeaheadData():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -639,6 +873,7 @@ def getTypeaheadData():
return jsonify(data)
@app.route("/_getGenericTrendingOvertime")
@login_required
def getGenericTrendingOvertime():
try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -653,8 +888,17 @@ def getGenericTrendingOvertime():
if __name__ == '__main__':
try:
if bool(server_ssl) is True:
if server_ssl_cert and server_ssl_key:
server_ssl_context = (server_ssl_cert, server_ssl_key)
else:
server_ssl_context = 'adhoc'
else:
server_ssl_context = None
app.run(host=server_host,
port=server_port,
ssl_context=server_ssl_context,
debug=server_debug,
threaded=True)
except OSError as error:

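The server.py hunk above picks the Flask `ssl_context` in three tiers: an explicit certificate/key pair, Werkzeug's throwaway `'adhoc'` self-signed certificate, or no TLS at all. A standalone sketch of that selection logic (function name and arguments are illustrative, not from the repo config):

```python
def pick_ssl_context(use_ssl, cert=None, key=None):
    """Mirror the dashboard's startup logic; the returned value is passed
    straight to app.run(ssl_context=...)."""
    if not use_ssl:
        return None            # plain HTTP
    if cert and key:
        return (cert, key)     # existing certificate/key pair on disk
    return 'adhoc'             # Werkzeug self-signed cert (needs `cryptography`)
```

With `'adhoc'`, Werkzeug generates a fresh self-signed certificate on every start, so browsers will warn; it is only meant for testing.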

@ -6,6 +6,29 @@ GREEN="\\033[1;32m"
DEFAULT="\\033[0;39m"
RED="\\033[1;31m"
function wait_until_redis_is_ready {
redis_not_ready=true
while $redis_not_ready; do
if checking_redis; then
redis_not_ready=false;
else
sleep 1
fi
done
echo -e $GREEN"* Redis 6250 is running"$DEFAULT
}
function checking_redis {
flag_redis=0
bash -c 'redis-cli -p 6250 PING | grep "PONG" &> /dev/null'
if [ ! $? == 0 ]; then
echo -e $RED"Redis 6250 not ready"$DEFAULT
flag_redis=1
fi
sleep 0.1
return $flag_redis;
}
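`wait_until_redis_is_ready` above polls `redis-cli` once a second until the instance on port 6250 answers PONG. The same loop, sketched with a raw RESP `PING` over a stdlib socket (host, timeout, and retry parameters are assumptions, not from the repo):

```python
import socket
import time

def wait_until_redis_is_ready(host='127.0.0.1', port=6250,
                              interval=1.0, max_tries=None):
    """Return True once Redis answers +PONG; False if max_tries runs out."""
    tries = 0
    while True:
        try:
            with socket.create_connection((host, port), timeout=1) as s:
                s.sendall(b'PING\r\n')            # inline RESP command
                if s.recv(64).startswith(b'+PONG'):
                    return True
        except OSError:
            pass                                   # not listening yet
        tries += 1
        if max_tries is not None and tries >= max_tries:
            return False
        time.sleep(interval)
```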
# Getting CWD where bash script resides
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
DASH_HOME="${DIR}"
@ -39,8 +62,6 @@ fi
netstat -an |grep LISTEN |grep 6250 |grep -v tcp6 ; check_redis_port=$?
netstat -an |grep LISTEN |grep 8001 |grep -v tcp6 ; check_dashboard_port=$?
ps auxw |grep zmq_subscriber.py |grep -v grep ; check_zmq_subscriber=$?
ps auxw |grep zmq_dispatcher.py |grep -v grep ; check_zmq_dispatcher=$?
# Configure accordingly, remember: 0.0.0.0 exposes to every active IP interface, play safe and bind it to something you trust and know
export FLASK_APP=server.py
@ -63,25 +84,14 @@ else
fi
sleep 0.1
if [ "${check_zmq_subscriber}" == "1" ]; then
echo -e $GREEN"\t* Launching zmq subscriber"$DEFAULT
${ENV_PY} ./zmq_subscriber.py &
else
echo -e $RED"\t* NOT starting zmq subscriber, made a rather unreliable ps -auxw | grep for zmq_subscriber.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi
wait_until_redis_is_ready;
sleep 0.1
if [ "${check_zmq_dispatcher}" == "1" ]; then
echo -e $GREEN"\t* Launching zmq dispatcher"$DEFAULT
${ENV_PY} ./zmq_dispatcher.py &
else
echo -e $RED"\t* NOT starting zmq dispatcher, made a rather unreliable ps -auxw | grep for zmq_dispatcher.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi
sleep 0.1
if [ "${check_dashboard_port}" == "1" ]; then
echo -e $GREEN"\t* Launching flask server"$DEFAULT
${ENV_PY} ./server.py &
else
echo -e $RED"\t* NOT starting flask server, made a very unreliable check on port 8001, and something seems to be there… please double check if this is good!"$DEFAULT
fi
sleep 0.1
sudo -u zmqs /bin/bash ${DIR}/start_zmq.sh &

start_zmq.sh Executable file

@ -0,0 +1,50 @@
#!/usr/bin/env bash
#set -x
GREEN="\\033[1;32m"
DEFAULT="\\033[0;39m"
RED="\\033[1;31m"
# Getting CWD where bash script resides
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
DASH_HOME="${DIR}"
SCREEN_NAME="Misp_Dashboard"
cd ${DASH_HOME}
if [ -e "${DIR}/DASHENV/bin/python" ]; then
echo "dashboard virtualenv seems to exist, good"
ENV_PY="${DIR}/DASHENV/bin/python"
else
echo "Please make sure you have a dashboard environment, au revoir"
exit 1
fi
PID_SCREEN=$(screen -ls | grep ${SCREEN_NAME} | cut -f2 | cut -d. -f1)
if [[ $PID_SCREEN ]]; then
echo -e $RED"* A screen '$SCREEN_NAME' is already launched"$DEFAULT
echo -e $GREEN"Killing $PID_SCREEN"$DEFAULT;
kill $PID_SCREEN
else
echo 'No screen detected'
fi
screen -dmS ${SCREEN_NAME}
ps auxw |grep zmq_subscriber.py |grep -v grep ; check_zmq_subscriber=$?
ps auxw |grep zmq_dispatcher.py |grep -v grep ; check_zmq_dispatcher=$?
sleep 0.1
if [ "${check_zmq_subscriber}" == "1" ]; then
echo -e $GREEN"\t* Launching zmq subscribers"$DEFAULT
screen -S "Misp_Dashboard" -X screen -t "zmq-subscribers" bash -c ${ENV_PY}' ./zmq_subscribers.py; read x'
else
echo -e $RED"\t* NOT starting zmq subscribers, made a rather unreliable ps -auxw | grep for zmq_subscriber.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi
sleep 0.1
if [ "${check_zmq_dispatcher}" == "1" ]; then
echo -e $GREEN"\t* Launching zmq dispatcher"$DEFAULT
screen -S "Misp_Dashboard" -X screen -t "zmq-dispatcher" bash -c ${ENV_PY}' ./zmq_dispatcher.py; read x'
else
echo -e $RED"\t* NOT starting zmq dispatcher, made a rather unreliable ps -auxw | grep for zmq_dispatcher.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi
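Both start scripts decide whether to launch a component with a `ps auxw | grep … | grep -v grep` probe, which the scripts themselves flag as unreliable. The same check, sketched in Python (helper name is illustrative, not from the repo):

```python
import subprocess

def process_running(needle):
    """True if any process command line contains `needle`.
    Same substring matching as the shell probe, with the same caveat:
    an unrelated process containing the string gives a false positive."""
    out = subprocess.run(['ps', 'auxww'], capture_output=True,
                         text=True).stdout
    return any(needle in line and 'grep' not in line
               for line in out.splitlines())
```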


@ -152,6 +152,7 @@ function getMonthlyRankIcon(rank, size, header) {
img.width = size;
}
}
img.setAttribute('onerror', "this.style.display='none'");
return img.outerHTML;
}
@ -167,7 +168,8 @@ function getOrgRankIcon(rank, size) {
obj.src = rankLogoPath;
obj.type = "image/svg"
obj.title = org_rank_obj[rank];
obj.classList.add('orgRankClass')
obj.classList.add('orgRankClass');
obj.setAttribute('onerror', "this.style.display='none'");
return obj.outerHTML;
}
@ -177,8 +179,9 @@ function createImg(source, size) {
obj.width = size;
obj.style.margin = 'auto';
obj.src = source;
obj.type = "image/png"
obj.alt = ""
obj.type = "image/png";
obj.alt = "";
obj.setAttribute('onerror', "this.style.display='none'");
return obj.outerHTML;
}
@ -187,10 +190,11 @@ function createTrophyImg(rank, size, categ) {
obj.height = size;
obj.width = size;
obj.style.margin = 'auto';
obj.src = url_baseTrophyLogo+rank+'.png';;
obj.src = url_baseTrophyLogo+rank+'.png';
obj.title = trophy_title[rank] + " in " + categ;
obj.type = "image/png"
obj.alt = ""
obj.type = "image/png";
obj.alt = "";
obj.setAttribute('onerror', "this.style.display='none'");
return obj.outerHTML;
}
@ -208,6 +212,7 @@ function createHonorImg(array, size) {
obj.style.margin = 'auto';
obj.title = org_honor_badge_title[badgeNum];
obj.src = url_baseHonorLogo+badgeNum+'.svg';
obj.setAttribute('onerror', "this.style.display='none'");
div.appendChild(obj);
}
div.style.width = 32*array.length+'px';
@ -347,7 +352,7 @@ function addLastContributor(datatable, data, update) {
last_added_contrib = org;
var date = new Date(data.epoch*1000);
//date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
date = date.getFullYear() + "-" + String(date.getMonth()).padStart(2, "0") + "-" + String(date.getDay()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
var to_add = [
date,
data.pnts,
@ -385,7 +390,7 @@ function addAwards(datatableAwards, json, playAnim) {
}
var date = new Date(json.epoch*1000);
//date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
date = date.getFullYear() + "-" + String(date.getMonth()).padStart(2, "0") + "-" + String(date.getDay()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
var to_add = [
date,
createImg(json.logo_path, 32),
@ -563,7 +568,7 @@ function generate_table_ranking_on_category(categ) {
var rank = arr[2];
var tr = $('<tr></tr>');
tr.append($('<td style="width: 100px;">'+i+'</td>'));
tr.append($('<td style="width: 100px;"><img src="'+url_baseTrophyLogo+rank+'.png" width="30" height="30"></td>'));
tr.append($('<td style="width: 100px;"><img src="'+url_baseTrophyLogo+rank+'.png" width="30" height="30" onerror="this.style.display=\'none\'"></td>'));
tr.append($('<td style="width: 200px;">'+points+'</td>'));
tr.append($('<td><a href="?org='+org+'">'+org+'</a></td>'));
if (currOrg == org) {


@ -184,7 +184,6 @@ $(document).ready(function () {
});
// LOG TABLE
function updateLogTable(name, log, zmqName, ignoreLed) {
if (log.length == 0)
@ -445,6 +444,13 @@ function createHead(callback) {
}
},
changeOptions: function(options) {
var that = this;
Object.keys(options).forEach(function (optionName) {
that._options[optionName] = options[optionName];
});
},
fetch_predata: function() {
var that = this;
if (this._options.preDataURL !== null) {
@ -583,6 +589,7 @@ function createHead(callback) {
},
add_entry: function(entry, isObjectAttribute) {
entry = this.sanitizeJson(entry);
var rowNode = this.dt.row.add(entry).draw().node();
if (this._options.animate) {
$( rowNode )
@ -590,7 +597,6 @@ function createHead(callback) {
.animate( { 'background-color': '' }, { duration: 1500 } );
}
if (isObjectAttribute === true) {
console.log(entry);
$( rowNode ).children().last()
.css('position', 'relative')
.append(
@ -608,6 +614,29 @@ function createHead(callback) {
//remove the rows and redraw the table
var rows = this.dt.rows(arraySlice).remove().draw();
}
},
sanitizeJson: function(dirty_json) {
var sanitized_json = {};
var that = this;
Object.keys(dirty_json).forEach(function(k) {
var val = dirty_json[k];
if (Array.isArray(val)) {
var clear_array = [];
sanitized_json[k] = val.map(function(item) {
return that.sanitize(item);
});
} else if(typeof val === 'object') {
sanitized_json[k] = that.sanitizeJson(val);
} else {
sanitized_json[k] = that.sanitize(val);
}
});
return sanitized_json;
},
sanitize: function(e) {
return $("<p>").text(e).html();
}
};
@ -809,10 +838,14 @@ $(document).ready(function() {
$panel.detach().prependTo('#page-wrapper')
$panel.addClass('liveLogFullScreen');
$this.data('isfullscreen', true);
$panel.find('#divLogTable').css({'overflow-y': 'auto'});
livelog.changeOptions({tableMaxEntries: 300});
} else {
$panel.detach().appendTo('#rightCol')
$panel.removeClass('liveLogFullScreen');
$this.data('isfullscreen', false);
$panel.find('#divLogTable').css({'overflow': 'hidden'});
livelog.changeOptions({tableMaxEntries: 50});
}
});
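The new `sanitizeJson`/`sanitize` pair walks an incoming log entry and HTML-escapes every leaf with jQuery's `$("<p>").text(...).html()` trick before it is rendered into the live-log DataTable. The same recursive escaping, sketched server-side in Python (hypothetical helper, not in the repo; note `html.escape` also escapes quotes, which the jQuery trick does not):

```python
import html

def sanitize_json(dirty):
    """Recursively HTML-escape every string leaf of a JSON-like value."""
    if isinstance(dirty, dict):
        return {k: sanitize_json(v) for k, v in dirty.items()}
    if isinstance(dirty, list):
        return [sanitize_json(v) for v in dirty]
    if isinstance(dirty, str):
        return html.escape(dirty)
    return dirty  # numbers, booleans, None pass through unchanged
```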


@ -23,7 +23,7 @@ class MapEvent {
this.specifName = json.specifName;
this.cityName = json.cityName;
this.text = this.categ + ": " + this.value;
let underText = "";
let underText = "";
if (this.specifName !== null && this.cityName !== null) {
underText = this.specifName+", "+this.cityName;
} else if (this.specifName !== null) {
@ -225,6 +225,7 @@ function connect_source_map() {
};
source_map.onerror = function(){
console.log('error: '+source_map.readyState);
source_map.close()
setTimeout(function() { connect_source_map(); }, 5000);
};
}

static/pics/misp-logo.png Normal file

Binary file not shown.


@ -120,7 +120,7 @@
<tbody id='bodyTablerankingModal'>
{% for item in org_rank_list %}
<tr data-rank={{ loop.index }}>
<td style='padding: 0px; text-align: right;'><img height='35px' width='70px' style="margin-right: 20px;" src="{{ url_for('static', filename='pics/rankingMISPOrg/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;"</img></td>
<td style='padding: 0px; text-align: right;'><img height='35px' width='70px' style="margin-right: 20px;" src="{{ url_for('static', filename='pics/rankingMISPOrg/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;" onerror="this.style.display='none'"</img></td>
<td>{{ item[1] }}</td>
<td>{{ item[2] }}</td>
<td>{{ item[3] }}</td>
@ -172,7 +172,7 @@
<tr>
<td>
<div id="divBadge_{{ loop.index }}" class="circleBadgeSmall circlBadgeNotAcquired">
<img height='48px' width='48' class="" style="margin-top: 3px;" src="{{ url_for('static', filename='pics/MISPHonorableIcons/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;"</img>
<img height='48px' width='48' class="" style="margin-top: 3px;" src="{{ url_for('static', filename='pics/MISPHonorableIcons/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;" onerror="this.style.display='none'"</img>
</div>
</td>
<td style="padding-left: 15px;">{{ item[1] }}</td>

templates/error_page.html Normal file

@ -0,0 +1,43 @@
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <meta name="viewport" content="width=device-width" />
<title>
Users - MISP
</title>
<!-- jQuery -->
<script src="{{ url_for('static', filename='js/jquery.min.js') }}"></script>
<!-- Bootstrap Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<!-- Custom CSS -->
<link href="{{ url_for('static', filename='css/sb-admin-2.css') }}" rel="stylesheet">
<!-- Bootstrap Core JavaScript -->
<script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
</head>
<body>
<div id="flashContainer" style="padding-top:50px; !important;">
<div id="main-view-container" class="container-fluid ">
</div>
</div>
<div style="width:100%;">
<table style="margin-left:auto;margin-right:auto;">
<tr>
<td style="text-align:right;width:250px;padding-right:50px"></td>
<td style="width:460px">
<div>
<img src="{{ url_for('static', filename='pics/misp-logo.png') }}" style="display:block; margin-left: auto; margin-right: auto;"/>
</div>
<div class="alert alert-danger" style="margin-top: 15px;">
{{ error_message }}
</div>
</td>
</tr>
</table>
</div>
</body>
</html>


@ -188,11 +188,15 @@ div.dataTables_scrollHead table.dataTable {
top: 66px !important;
left: 15px !important;
right: 10px !important;
z-index: 1001 !important;
z-index: 990 !important;
bottom: -7px !important;
height: unset !important;
}
div.leaflet-bottom {
z-index: 900;
}
.rowTableIsObject {
position: absolute;
right: 15px;
@ -217,6 +221,7 @@ div.dataTables_scrollHead table.dataTable {
<li><a href="{{ url_for('contrib') }}">MISP Contributors</a></li>
<li><a href="{{ url_for('users') }}">MISP Users</a></li>
<li><a href="{{ url_for('trendings') }}">MISP Trendings</a></li>
<li><a href="{{ url_for('logout') }}">Logout</a></li>
</ul>
<div id="ledsHolder" style="float: right; height: 50px;"></div>

templates/login.html Normal file

@ -0,0 +1,67 @@
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <meta name="viewport" content="width=device-width" />
<title>
Users - MISP
</title>
<!-- jQuery -->
<script src="{{ url_for('static', filename='js/jquery.min.js') }}"></script>
<!-- Bootstrap Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<!-- Custom CSS -->
<link href="{{ url_for('static', filename='css/sb-admin-2.css') }}" rel="stylesheet">
<!-- Bootstrap Core JavaScript -->
<script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
</head>
<body>
<div id="flashContainer" style="padding-top:50px; !important;">
<div id="main-view-container" class="container-fluid ">
</div>
</div>
<div>
<div style="width:100%;">
<table style="margin-left:auto;margin-right:auto;">
<tr>
<td style="text-align:right;width:250px;padding-right:50px"></td>
<td style="width:460px">
<div>
<img src="{{ url_for('static', filename='pics/misp-logo.png') }}" style="display:block; margin-left: auto; margin-right: auto;"/>
</div>
{% if authError %}
<div class="alert alert-danger">
{{ authErrorMessage }}
</div>
{% endif %}
<form action="" id="UserLoginForm" method="post" accept-charset="utf-8">
<br><legend>Welcome to MISP-Dashboard</legend><br>
<div class="input email required">
{{ form.username.label }}<br>
{{ form.username(size=32, maxlength=255, autocomplete="off", autofocus="autofocus") }}
</div>
<div class="input password required">
{{ form.password.label }}<br>
{{ form.password(size=32, maxlength=255, autocomplete="off") }}
</div>
<div class="clear"></div>
<p>{{ form.submit() }}</p>
</form>
</td>
<td style="width:250px;padding-left:50px"></td>
</tr>
</table>
</div>
</div>
</body>
</html>

updates.py Normal file

@ -0,0 +1,79 @@
import redis
import os
import configparser
import logging
DATABASE_VERSION = [
1
]
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
serv_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db'))
serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db'))
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
# logger
logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'update_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
handler = logging.FileHandler(logPath)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
update_logger = logging.getLogger(__name__)
update_logger.setLevel(logging.INFO)
update_logger.addHandler(handler)
def check_for_updates():
db_version = serv_redis_db.get(cfg.get('RedisDB', 'dbVersion'))
db_version = int(db_version) if db_version is not None else 0
updates_to_be_done = find_updates(db_version)
if len(updates_to_be_done) == 0:
update_logger.info('database up-to-date')
else:
for i in updates_to_be_done:
exec_updates(i)
def find_updates(db_version):
updates_to_be_done = []
for i in DATABASE_VERSION:
if db_version < i:
updates_to_be_done.append(i)
return updates_to_be_done
def exec_updates(db_version):
result = False
if db_version == 1:
result = apply_update_1()
if result:
serv_redis_db.set(cfg.get('RedisDB', 'dbVersion'), db_version)
update_logger.warning('dbVersion sets to {}'.format(db_version))
else:
update_logger.error('Something went wrong. {}'.format(result))
# Data format changed. Wipe the key.
def apply_update_1():
serv_redis_db.delete('TEMP_CACHE_LIVE:Attribute')
log_text = 'Executed update 1. Deleted Redis key `TEMP_CACHE_LIVE:Attribute`'
print(log_text)
update_logger.info(log_text)
return True
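The new updates.py gates each one-shot migration on a stored `dbVersion` key: `find_updates` selects every entry of `DATABASE_VERSION` newer than the stored version, and `exec_updates` records the new version only on success. The selection step can be exercised in isolation (`DATABASE_VERSION` is extended here purely for illustration; the repo currently ships only `[1]`):

```python
DATABASE_VERSION = [1, 2, 3]  # illustrative; the repo ships [1]

def find_updates(db_version):
    """Every schema update newer than the stored version, in order."""
    return [v for v in DATABASE_VERSION if db_version < v]
```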


@ -8,7 +8,7 @@ def getZrange(serv_redis_db, keyCateg, date, topNum, endSubkey=""):
date_str = getDateStrFormat(date)
keyname = "{}:{}{}".format(keyCateg, date_str, endSubkey)
data = serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
data = [ [record[0].decode('utf8'), record[1]] for record in data ]
data = [ [record[0], record[1]] for record in data ]
return data
def noSpaceLower(text):
@ -18,7 +18,7 @@ def push_to_redis_zset(serv_redis_db, mainKey, toAdd, endSubkey="", count=1):
now = datetime.datetime.now()
today_str = getDateStrFormat(now)
keyname = "{}:{}{}".format(mainKey, today_str, endSubkey)
serv_redis_db.zincrby(keyname, toAdd, count)
serv_redis_db.zincrby(keyname, count, toAdd)
def getMonthSpan(date):
ds = datetime.datetime(date.year, date.month, 1)

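The util.py change above is part of the redis-py 3.x migration: `zincrby`'s signature changed from `zincrby(name, value, amount)` to `zincrby(name, amount, value)`, so the old call order would increment the wrong thing after upgrading the client. A stub illustrating the corrected call order (the fake client below is for demonstration only, not from the repo):

```python
class FakeRedis:
    """Minimal stand-in implementing the redis-py 3.x zincrby signature."""
    def __init__(self):
        self.zsets = {}

    def zincrby(self, name, amount, value):
        z = self.zsets.setdefault(name, {})
        z[value] = z.get(value, 0) + amount
        return z[value]

def push_to_redis_zset(serv_redis_db, keyname, to_add, count=1):
    """Post-upgrade call order: the amount comes first, then the member."""
    serv_redis_db.zincrby(keyname, count, to_add)
```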

@ -15,6 +15,7 @@ import redis
import zmq
import util
import updates
from helpers import (contributor_helper, geo_helper, live_helper,
trendings_helper, users_helper)
@ -40,15 +41,18 @@ LISTNAME = cfg.get('RedisLIST', 'listName')
serv_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db'))
db=cfg.getint('RedisLog', 'db'),
decode_responses=True)
serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db'))
db=cfg.getint('RedisDB', 'db'),
decode_responses=True)
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
db=cfg.getint('RedisLIST', 'db'),
decode_responses=True)
live_helper = live_helper.Live_helper(serv_redis_db, cfg)
geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
@ -238,6 +242,12 @@ def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False, parent
# Push to log
live_helper.publish_log(zmq_name, attributeType, jsonobj)
def handler_diagnostic_tool(zmq_name, jsonobj):
try:
res = time.time() - float(jsonobj['content'])
except Exception as e:
logger.error(e)
serv_list.set('diagnostic_tool_response', str(res))
###############
## MAIN LOOP ##
@ -253,15 +263,18 @@ def process_log(zmq_name, event):
def main(sleeptime):
updates.check_for_updates()
numMsg = 0
while True:
content = serv_list.rpop(LISTNAME)
if content is None:
logger.debug('Processed {} message(s) since last sleep.'.format(numMsg))
log_text = 'Processed {} message(s) since last sleep.'.format(numMsg)
logger.info(log_text)
numMsg = 0
time.sleep(sleeptime)
continue
content = content.decode('utf8')
content = content
the_json = json.loads(content)
zmqName = the_json['zmq_name']
content = the_json['content']
@ -281,13 +294,14 @@ dico_action = {
"misp_json_conversation": handler_conversation,
"misp_json_object_reference": handler_skip,
"misp_json_audit": handler_audit,
"diagnostic_channel": handler_diagnostic_tool
}
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='The ZMQ dispatcher. It pops from the redis buffer then redispatch it to the correct handlers')
parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of second to wait before checking redis list size', default=5)
parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of seconds to wait before checking redis list size', default=1)
args = parser.parse_args()
try:

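The new `handler_diagnostic_tool` above measures round-trip latency: the diagnostic probe publishes its send time, and the dispatcher stores `time.time()` minus that value under a key the tool polls. The computation in isolation (a plain dict stands in for the Redis client here):

```python
import time

def handler_diagnostic_tool(jsonobj, store):
    """Latency = now minus the epoch embedded in the probe message;
    stored as a string, as serv_list.set() would store it."""
    res = time.time() - float(jsonobj['content'])
    store['diagnostic_tool_response'] = str(res)
    return res
```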

@ -28,14 +28,14 @@ except PermissionError as error:
sys.exit(126)
logger = logging.getLogger('zmq_subscriber')
ZMQ_URL = cfg.get('RedisGlobal', 'zmq_url')
CHANNEL = cfg.get('RedisLog', 'channel')
LISTNAME = cfg.get('RedisLIST', 'listName')
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
db=cfg.getint('RedisLIST', 'db'),
decode_responses=True)
###############
@ -48,28 +48,31 @@ def put_in_redis_list(zmq_name, content):
serv_list.lpush(LISTNAME, json.dumps(to_add))
logger.debug('Pushed: {}'.format(json.dumps(to_add)))
def main(zmqName):
def main(zmqName, zmqurl):
context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect(ZMQ_URL)
socket.connect(zmqurl)
socket.setsockopt_string(zmq.SUBSCRIBE, '')
while True:
try:
content = socket.recv()
put_in_redis_list(zmqName, content)
print(zmqName, content)
except KeyboardInterrupt:
return
except Exception as e:
logger.warning('Error:' + str(e))
if __name__ == "__main__":
parser = argparse.ArgumentParser(description='A zmq subscriber. It subscribes to a ZMQ then redispatch it to the misp-dashboard')
parser.add_argument('-n', '--name', required=False, dest='zmqname', help='The ZMQ feed name', default="MISP Standard ZMQ")
parser.add_argument('-u', '--url', required=False, dest='zmqurl', help='The URL to connect to', default=ZMQ_URL)
parser.add_argument('-u', '--url', required=False, dest='zmqurl', help='The URL to connect to', default="tcp://localhost:50000")
args = parser.parse_args()
try:
main(args.zmqname)
main(args.zmqname, args.zmqurl)
except redis.exceptions.ResponseError as error:
print(error)

zmq_subscribers.py Executable file

@ -0,0 +1,74 @@
#!/usr/bin/env python3
import time, datetime
import logging
import redis
import configparser
import argparse
import os
import subprocess
import sys
import json
import atexit
import signal
import shlex
import pty
import threading
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'subscriber_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
logger = logging.getLogger('zmq_subscriber')
CHANNEL = cfg.get('RedisLog', 'channel')
LISTNAME = cfg.get('RedisLIST', 'listName')
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
children = []
def signal_handler(signal, frame):
for child in children:
# We don't resume as we are already attached
cmd = "screen -p"+child+" -X {arg}"
argsc = shlex.split(cmd.format(arg = "kill"))
print("\n\033[1;31m [-] Terminating {child}\033[0;39m".format(child=child))
logger.info('Terminate: {child}'.format(child=child))
subprocess.call(argsc) # kill window
sys.exit(0)
###############
## MAIN LOOP ##
###############
def main():
print("\033[1;31m [+] I am the subscriber's master - kill me to kill'em'all \033[0;39m")
# screen needs a shell and I am no fan of shell=True
(master, slave) = pty.openpty()
try:
for item in json.loads(cfg.get('RedisGlobal', 'misp_instances')):
name = shlex.quote(item.get("name"))
zmq = shlex.quote(item.get("zmq"))
print("\033[1;32m [+] Subscribing to "+zmq+"\033[0;39m")
logger.info('Launching: {child}'.format(child=name))
children.append(name)
subprocess.Popen(["screen", "-r", "Misp_Dashboard", "-X", "screen", "-t", name ,sys.executable, "./zmq_subscriber.py", "-n", name, "-u", zmq], close_fds=True, shell=False, stdin=slave, stdout=slave, stderr=slave)
except ValueError as error:
print("\033[1;31m [!] Fatal exception: {error} \033[0;39m".format(error=error))
logger.error("JSON error: %s", error)
sys.exit(1)
signal.signal(signal.SIGINT, signal_handler)
forever = threading.Event()
forever.wait() # Wait for SIGINT
if __name__ == "__main__":
main()