Merge branch 'master' of github.com:MISP/misp-dashboard into subzero

subzero
mokaddem 2019-06-21 14:39:24 +02:00
commit b06c95c907
37 changed files with 23070 additions and 361 deletions

README.md

@@ -1,23 +1,23 @@
# misp-dashboard

A dashboard showing live data and statistics from the ZMQ feeds of one or more [MISP](https://www.misp-project.org/) instances.
The dashboard can be used as a real-time situational awareness tool to gather threat intelligence information.
The misp-dashboard includes a [gamification](https://en.wikipedia.org/wiki/Gamification#Criticism) tool to show the contributions of each organisation and how they are ranked over time.
The dashboard can be used by SOCs (Security Operation Centers) and security teams, or during cyber exercises, to keep track of what is being processed on your various MISP instances.
# Features

## Live Dashboard
- Possibility to subscribe to multiple ZMQ feeds from different MISP instances
- Shows immediate contributions made by organisations
- Displays live resolvable posted geo-locations

![Dashboard live](./screenshots/dashboard-live.png)
## Geolocalisation Dashboard
- Provides historical geolocalised information to support security teams, CSIRTs or SOCs in finding threats within their constituency
- Possibility to get geospatial information from specific regions

![Dashboard geo](./screenshots/dashboard-geo.png)
@@ -25,25 +25,25 @@
## Contributors Dashboard

__Shows__:
- The monthly rank of all organisations
- The last organisation that contributed (dynamic updates)
- The contribution level of all organisations
- Each category of contributions per organisation
- The current ranking of the selected organisation (dynamic updates)

__Includes__:
- [Gamification](https://en.wikipedia.org/wiki/Gamification#Criticism) of the platform:
  - Two different levels of ranking with unique icons
  - Exclusive obtainable badges for source code contributors and donators

![Dashboard contributors](./screenshots/dashboard-contributors2.png)
![Dashboard contributors2](./screenshots/dashboard-contributors3.png)

## Users Dashboard
- Shows when and how the platform is used:
  - Login punchcard and contributions over time
  - Contribution vs login

![Dashboard users](./screenshots/dashboard-users.png)
@@ -57,7 +57,7 @@

![Dashboard trendings](./screenshots/dashboard-trendings.png)
# Installation
- Launch ```./install_dependencies.sh``` from the MISP-Dashboard directory ([idempotent-ish](https://en.wikipedia.org/wiki/Idempotence))
- Update the configuration file ```config.cfg``` so that it matches your system
  - Fields that you may change:
    - RedisGlobal -> host
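For orientation, a minimal sketch of what the relevant `config.cfg` fields look like (values here are illustrative placeholders; the authoritative list of fields lives in `config.cfg.default`):

```
[Server]
host = localhost
port = 8001

[RedisGlobal]
host = localhost
port = 6250
```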
@@ -68,7 +68,7 @@

# Updating by pulling
- Re-launch ```./install_dependencies.sh``` to fetch newly required dependencies
- Re-update your configuration file ```config.cfg``` by comparing eventual changes in ```config.cfg.default```

:warning: Make sure no zmq python3 scripts are running. They block the update.
@@ -90,9 +90,10 @@ Traceback (most recent call last):
    with open(dst, 'wb') as fdst:
OSError: [Errno 26] Text file busy: '/home/steve/code/misp-dashboard/DASHENV/bin/python3'
```
- Restart the System: `./zmq_subscriber.py &`, `./zmq_dispatcher.py &` and `./server.py &`
# Starting the System

:warning: You should not run it as root. Normal privileges are fine.

- Be sure to have a running redis server
  - e.g. ```redis-server --port 6250```
@@ -102,7 +103,7 @@
- Start the Flask server ```./server.py &```
- Access the interface at ```http://localhost:8001/```

__Alternatively__, you can run the ```start_all.sh``` script to run the commands described above.
# Debug

@@ -117,7 +118,7 @@ export FLASK_APP=server.py
flask run --host=0.0.0.0 --port=8001 # <- Be careful here, this exposes it on ALL ip addresses. Ideally if run locally --host=127.0.0.1
```

OR, just toggle the debug flag in start_all.sh or config.cfg.

Happy hacking ;)
@@ -135,6 +136,29 @@ optional arguments:
  a soft method to delete only keys used by MISP-Dashboard.
```
## Notes about ZMQ

The misp-dashboard is stateless with regard to MISP: it can only process the data it receives. If your MISP instance does not publish all notifications to its ZMQ, the misp-dashboard will not have them.

The most relevant example is the user login punchcard: if your MISP does not have the option ``Plugin.ZeroMQ_audit_notifications_enable`` set to ``true``, the punchcard will be empty.
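Each message on the MISP ZMQ is a topic name (e.g. starting with `misp_json`) followed by a JSON payload. A quick way to see which topics your MISP actually publishes is to split the raw lines; a minimal sketch (the helper name and sample payload are ours, not part of misp-dashboard):

```python
import json

def parse_misp_zmq_line(line: str):
    """Split a raw MISP ZMQ message into (topic, JSON payload)."""
    topic, _, payload = line.partition(' ')
    return topic, json.loads(payload)

# Illustrative raw line as it would appear on the pub/sub channel;
# audit topics like this one are what feed the login punchcard.
raw = 'misp_json_audit {"AuditLog": {"action": "login"}}'
topic, payload = parse_misp_zmq_line(raw)
```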
## Dashboard not showing results - No module named zmq

If the misp-dashboard does not show results, first check whether the zmq module within MISP is properly installed.

In **Administration**, **Plugin Settings**, **ZeroMQ**, check that **Plugin.ZeroMQ_enable** is set to **True**.

Publish a test event from MISP to ZMQ via **Event Actions**, **Publish event to ZMQ**.

Verify the log files:
```
${PATH_TO_MISP}/app/tmp/log/mispzmq.error.log
${PATH_TO_MISP}/app/tmp/log/mispzmq.log
```
If there is an error **ModuleNotFoundError: No module named 'zmq'**, then install pyzmq:
```
$SUDO_WWW ${PATH_TO_MISP}/venv/bin/pip install pyzmq
```
# zmq_subscriber options

```usage: zmq_subscriber.py [-h] [-n ZMQNAME] [-u ZMQURL]
@@ -151,7 +175,7 @@ optional arguments:
# Deploy in production using mod_wsgi

Install Apache mod-wsgi for Python3:

```bash
sudo apt-get install libapache2-mod-wsgi-py3
@@ -166,7 +190,7 @@ The following NEW packages will be installed:
  libapache2-mod-wsgi-py3
```
Configuration file `/etc/apache2/sites-available/misp-dashboard.conf` assumes that `misp-dashboard` is cloned into `/var/www/misp-dashboard`. It runs as user `misp` in this example. Change the permissions to your custom folder and files accordingly.

```
<VirtualHost *:8001>
@@ -214,33 +238,35 @@
# License
~~~~
Copyright (C) 2017-2019 CIRCL - Computer Incident Response Center Luxembourg (c/o smile, security made in Lëtzebuerg, Groupement d'Intérêt Economique)
Copyright (c) 2017-2019 Sami Mokaddem
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
~~~~
Images and logos are handmade for:
- rankingMISPOrg/
- rankingMISPMonthly/
- MISPHonorableIcons/

Note that:
- Part of ```MISPHonorableIcons/1.svg``` comes from [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/2.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/3.svg``` comes from [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/4.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) & [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/5.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) & [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)

@@ -1,11 +1,11 @@
#!/usr/bin/env python3

import argparse
import configparser
import os
from pprint import pprint

import redis

RED="\033[91m"
GREEN="\033[92m"

@@ -62,6 +62,7 @@ tcp-backlog 511
#
# bind 192.168.1.100 10.0.0.1
# bind 127.0.0.1
bind 127.0.0.1 ::1
# Specify the path for the Unix socket that will be used to listen for
# incoming connections. There is no default, so Redis will not listen

@@ -1,6 +1,7 @@
[Server]
host = localhost
port = 8001
debug = False

[Dashboard]
#hours
@@ -16,7 +17,7 @@ size_dashboard_left_width = 5
size_openStreet_pannel_perc = 55
size_world_pannel_perc = 35
item_to_plot = Attribute.category
fieldname_order=["Attribute.timestamp", "Event.id", "Attribute.Tag", "Attribute.category", "Attribute.type", ["Attribute.value", "Attribute.comment"]]
char_separator=||
[GEO]
@@ -33,7 +34,10 @@ additional_help_text = ["Sightings multiplies earned points by 2", "Editing an a
[Log]
directory=logs
dispatcher_filename=zmq_dispatcher.log
subscriber_filename=zmq_subscriber.log
helpers_filename=helpers.log
update_filename=updates.log

[RedisGlobal]
host=localhost
@@ -75,3 +79,4 @@ path_countrycode_to_coord_JSON=./data/country_code_lat_long.json
[RedisDB]
db=2
dbVersion=db_version

diagnostic.py (new executable file)
@@ -0,0 +1,426 @@
#!/usr/bin/env python3

import os
import sys
import stat
import time
import signal
import functools
import configparser
from pprint import pprint
import subprocess

import diagnostic_util
try:
    import redis
    import zmq
    import json
    import flask
    import requests
    from halo import Halo
except ModuleNotFoundError as e:
    print('Dependency not met. Either not in a virtualenv or dependency not installed.')
    print(f'- Error: {e}')
    sys.exit(1)
'''
Steps:
- check if dependencies exist
- check if virtualenv exists
- check if configuration is up-to-date
- check file permission
- check if redis is running and responding
- check if able to connect to zmq
- check zmq_dispatcher processing queue
- check queue status: being filled up / being filled down
- check if subscriber responding
- check if dispatcher responding
- check if server listening
- check log static endpoint
- check log dynamic endpoint
'''

HOST = 'http://127.0.0.1'
PORT = 8001  # overridden by configuration file
configuration_file = {}
pgrep_subscriber_output = ''
pgrep_dispatcher_output = ''

signal.signal(signal.SIGALRM, diagnostic_util.timeout_handler)
def humanize(name, isResult=False):
    words = name.split('_')
    if isResult:
        words = words[1:]
        words[0] = words[0][0].upper() + words[0][1:]
    else:
        words[0] = words[0][0].upper() + words[0][1:] + 'ing'
    return ' '.join(words)
def add_spinner(_func=None, name='dots'):
    def decorator_add_spinner(func):
        @functools.wraps(func)
        def wrapper_add_spinner(*args, **kwargs):
            human_func_name = humanize(func.__name__)
            human_func_result = humanize(func.__name__, isResult=True)
            flag_skip = False
            with Halo(text=human_func_name, spinner=name) as spinner:
                result = func(spinner, *args, **kwargs)
                if isinstance(result, tuple):
                    status, output = result
                elif isinstance(result, list):
                    status = result[0]
                    output = result[1]
                elif isinstance(result, bool):
                    status = result
                    output = None
                else:
                    status = False
                    flag_skip = True
                    spinner.fail(f'{human_func_name} - Function return unexpected result: {str(result)}')
                if not flag_skip:
                    text = human_func_result
                    if output is not None and len(output) > 0:
                        text += f': {output}'
                    if isinstance(status, bool) and status:
                        spinner.succeed(text)
                    elif isinstance(status, bool) and not status:
                        spinner.fail(text)
                    else:
                        if status == 'info':
                            spinner.info(text)
                        else:
                            spinner.warn(text)
            return status
        return wrapper_add_spinner

    if _func is None:
        return decorator_add_spinner
    else:
        return decorator_add_spinner(_func)
@add_spinner
def check_virtual_environment_and_packages(spinner):
    result = os.environ.get('VIRTUAL_ENV')
    if result is None:
        return (False, 'This diagnostic tool should be started inside a virtual environment.')
    else:
        if redis.__version__.startswith('2'):
            return (False, f'''Redis python client have version {redis.__version__}. Version 3.x required.
\t[inside virtualenv] pip3 install -U redis''')
        else:
            return (True, '')
@add_spinner
def check_configuration(spinner):
    global configuration_file, PORT
    configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
    cfg = configparser.ConfigParser()
    cfg.read(configfile)
    configuration_file = cfg
    cfg = {s: dict(cfg.items(s)) for s in cfg.sections()}
    configfile_default = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg.default')
    cfg_default = configparser.ConfigParser()
    cfg_default.read(configfile_default)
    cfg_default = {s: dict(cfg_default.items(s)) for s in cfg_default.sections()}
    # Check if all fields from config.default exist in config
    result, faulties = diagnostic_util.dict_compare(cfg_default, cfg)
    if result:
        PORT = configuration_file.getint("Server", "port")
        return (True, '')
    else:
        return_text = '''Configuration incomplete.
\tUpdate your configuration file `config.cfg`.\n\tFaulty fields:\n'''
        for field_name in faulties:
            return_text += f'\t\t- {field_name}\n'
        return (False, return_text)
@add_spinner(name='dot')
def check_file_permission(spinner):
    max_mind_database_path = configuration_file.get('RedisMap', 'pathmaxminddb')
    st = os.stat(max_mind_database_path)
    all_read_perm = bool(st.st_mode & stat.S_IROTH)  # FIXME: permission may be changed
    if all_read_perm:
        return (True, '')
    else:
        return (False, 'MaxMind GeoDB might have incorrect read file permission')
@add_spinner
def check_redis(spinner):
    redis_server = redis.StrictRedis(
        host=configuration_file.get('RedisGlobal', 'host'),
        port=configuration_file.getint('RedisGlobal', 'port'),
        db=configuration_file.getint('RedisLog', 'db'))
    if redis_server.ping():
        return (True, '')
    else:
        return (False, '''Can\'t reach Redis server.
\tMake sure it is running and adapt your configuration accordingly''')
@add_spinner
def check_zmq(spinner):
    timeout = 15
    context = zmq.Context()
    socket = context.socket(zmq.SUB)
    socket.connect(configuration_file.get('RedisGlobal', 'zmq_url'))
    socket.setsockopt_string(zmq.SUBSCRIBE, '')
    poller = zmq.Poller()
    start_time = time.time()
    poller.register(socket, zmq.POLLIN)
    for t in range(1, timeout+1):
        socks = dict(poller.poll(timeout=1*1000))
        if len(socks) > 0:
            if socket in socks and socks[socket] == zmq.POLLIN:
                rcv_string = socket.recv()
                if rcv_string.startswith(b'misp_json'):
                    return (True, '')
        else:
            pass
        spinner.text = f'checking zmq - elapsed time: {int(time.time() - start_time)}s'
    else:
        return (False, '''Can\'t connect to the ZMQ stream.
\tMake sure the MISP ZMQ is running: `/servers/serverSettings/diagnostics`
\tMake sure your network infrastructure allows you to connect to the ZMQ''')
@add_spinner
def check_processes_status(spinner):
    global pgrep_subscriber_output, pgrep_dispatcher_output
    response = subprocess.check_output(
        ["pgrep", "-laf", "zmq_"],
        universal_newlines=True
    )
    for line in response.splitlines():
        lines = line.split(' ')
        if len(lines) == 2:
            pid, p_name = lines
        elif len(lines) == 3:
            pid, _, p_name = lines
        if 'zmq_subscriber.py' in p_name:
            pgrep_subscriber_output = line
        elif 'zmq_dispatcher.py' in p_name:
            pgrep_dispatcher_output = line

    if len(pgrep_subscriber_output) == 0:
        return (False, 'zmq_subscriber is not running')
    elif len(pgrep_dispatcher_output) == 0:
        return (False, 'zmq_dispatcher is not running')
    else:
        return (True, 'Both processes are running')
@add_spinner
def check_subscriber_status(spinner):
    global pgrep_subscriber_output
    pool = redis.ConnectionPool(
        host=configuration_file.get('RedisGlobal', 'host'),
        port=configuration_file.getint('RedisGlobal', 'port'),
        db=configuration_file.getint('RedisLIST', 'db'),
        decode_responses=True)
    monitor = diagnostic_util.Monitor(pool)
    commands = monitor.monitor()
    start_time = time.time()
    signal.alarm(15)
    try:
        for i, c in enumerate(commands):
            if i == 0:  # Skip 'OK'
                continue
            split = c.split()
            try:
                action = split[3]
                target = split[4]
            except IndexError:
                pass
            if action == '"LPUSH"' and target == f'\"{configuration_file.get("RedisLIST", "listName")}\"':
                signal.alarm(0)
                break
            else:
                spinner.text = f'Checking subscriber status - elapsed time: {int(time.time() - start_time)}s'
    except diagnostic_util.TimeoutException:
        return_text = f'''zmq_subscriber seems not to be working.
\tConsider restarting it: {pgrep_subscriber_output}'''
        return (False, return_text)
    return (True, 'subscriber is running and populating the buffer')
@add_spinner
def check_buffer_queue(spinner):
    redis_server = redis.StrictRedis(
        host=configuration_file.get('RedisGlobal', 'host'),
        port=configuration_file.getint('RedisGlobal', 'port'),
        db=configuration_file.getint('RedisLIST', 'db'))
    warning_threshold = 100
    elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
    return_status = 'warning' if elements_in_list > warning_threshold else ('info' if elements_in_list > 0 else True)
    return_text = f'Currently {elements_in_list} items in the buffer'
    return (return_status, return_text)
@add_spinner
def check_buffer_change_rate(spinner):
    redis_server = redis.StrictRedis(
        host=configuration_file.get('RedisGlobal', 'host'),
        port=configuration_file.getint('RedisGlobal', 'port'),
        db=configuration_file.getint('RedisLIST', 'db'))

    time_slept = 0
    sleep_duration = 0.001
    sleep_max = 10.0
    refresh_frequency = 1.0
    next_refresh = 0
    change_increase = 0
    change_decrease = 0
    elements_in_list_prev = 0
    elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
    elements_in_inlist_init = elements_in_list
    consecutive_no_rate_change = 0
    while True:
        elements_in_list_prev = elements_in_list
        elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
        change_increase += elements_in_list - elements_in_list_prev if elements_in_list - elements_in_list_prev > 0 else 0
        change_decrease += elements_in_list_prev - elements_in_list if elements_in_list_prev - elements_in_list > 0 else 0
        if next_refresh < time_slept:
            next_refresh = time_slept + refresh_frequency
            change_rate_text = f'{change_increase}/sec\t{change_decrease}/sec'
            spinner.text = f'Buffer: {elements_in_list}\t{change_rate_text}'
            if consecutive_no_rate_change == 3:
                time_slept = sleep_max
            if elements_in_list == 0:
                consecutive_no_rate_change += 1
            else:
                consecutive_no_rate_change = 0
            change_increase = 0
            change_decrease = 0
        if time_slept >= sleep_max:
            return_flag = elements_in_list == 0 or (elements_in_list < elements_in_inlist_init or elements_in_list < 2)
            return_text = f'Buffer is consumed {"faster" if return_flag else "slower"} than being populated'
            break
        time.sleep(sleep_duration)
        time_slept += sleep_duration
    elements_in_inlist_final = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
    return (return_flag, return_text)
@add_spinner
def check_dispatcher_status(spinner):
    redis_server = redis.StrictRedis(
        host=configuration_file.get('RedisGlobal', 'host'),
        port=configuration_file.getint('RedisGlobal', 'port'),
        db=configuration_file.getint('RedisLIST', 'db'))
    content = {'content': time.time()}
    redis_server.rpush(configuration_file.get('RedisLIST', 'listName'),
        json.dumps({'zmq_name': 'diagnostic_channel', 'content': 'diagnostic_channel ' + json.dumps(content)})
    )
    return_flag = False
    return_text = ''
    time_slept = 0
    sleep_duration = 0.2
    sleep_max = 10.0
    redis_server.delete('diagnostic_tool_response')
    while True:
        reply = redis_server.get('diagnostic_tool_response')
        elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
        if reply is None:
            if time_slept >= sleep_max:
                return_flag = False
                return_text = f'zmq_dispatcher did not respond in the given time ({int(sleep_max)}s)'
                if len(pgrep_dispatcher_output) > 0:
                    return_text += f'\n\t➥ Consider restarting it: {pgrep_dispatcher_output}'
                else:
                    return_text += '\n\t➥ Consider starting it'
                break
            time.sleep(sleep_duration)
            spinner.text = 'Dispatcher status: No response yet'
            time_slept += sleep_duration
        else:
            return_flag = True
            return_text = f'Took {float(reply):.2f}s to complete'
            break
    return (return_flag, return_text)
@add_spinner
def check_server_listening(spinner):
    url = f'{HOST}:{PORT}/_get_log_head'
    spinner.text = f'Trying to connect to {url}'
    try:
        r = requests.get(url)
    except requests.exceptions.ConnectionError:
        return (False, f'Can\'t connect to {url}')
    return (
        r.status_code == 200,
        f'{url} {"not " if r.status_code != 200 else ""}reached. Status code [{r.status_code}]'
    )
@add_spinner
def check_server_dynamic_enpoint(spinner):
    sleep_max = 15
    start_time = time.time()
    url = f'{HOST}:{PORT}/_logs'
    p = subprocess.Popen(
        ['curl', '-sfN', '--header', 'Accept: text/event-stream', url],
        stdout=subprocess.PIPE,
        bufsize=1)
    signal.alarm(sleep_max)
    return_flag = False
    return_text = 'Dynamic endpoint returned data but not in the correct format.'
    try:
        for line in iter(p.stdout.readline, b''):
            if line.startswith(b'data: '):
                data = line[6:]
                try:
                    j = json.loads(data)
                    return_flag = True
                    return_text = f'Dynamic endpoint returned data (took {time.time()-start_time:.2f}s)'
                    signal.alarm(0)
                    break
                except Exception as e:
                    return_flag = False
                    return_text = f'Something went wrong. Output {line}'
                    break
    except diagnostic_util.TimeoutException:
        return_text = f'Dynamic endpoint did not return data in the given time ({int(time.time()-start_time)}sec)'
    return (return_flag, return_text)
def start_diagnostic():
    if not (check_virtual_environment_and_packages() and check_configuration()):
        return
    check_file_permission()
    check_redis()
    check_zmq()
    check_processes_status()
    check_subscriber_status()
    if check_buffer_queue() is not True:
        check_buffer_change_rate()
    dispatcher_running = check_dispatcher_status()
    if check_server_listening() and dispatcher_running:
        check_server_dynamic_enpoint()


def main():
    start_diagnostic()


if __name__ == '__main__':
    main()
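The `add_spinner` decorator in diagnostic.py normalises heterogeneous check return values (bool, tuple, list) into one status/output pair before rendering. Stripped of the Halo dependency, that normalisation logic can be sketched and exercised on its own (the names below are ours, not part of the repository):

```python
import functools

def normalise_result(func):
    """Reduce a check's return value to a (status, output) pair,
    mirroring the branching inside diagnostic.py's add_spinner."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        if isinstance(result, tuple):
            return result
        if isinstance(result, list):
            return (result[0], result[1])
        if isinstance(result, bool):
            return (result, None)
        return (False, f'unexpected result: {result!r}')
    return wrapper

@normalise_result
def check_ok():
    # A check that only reports success
    return True

@normalise_result
def check_redis_down():
    # A check that reports failure plus an explanation
    return (False, "Can't reach Redis server.")
```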

diagnostic_util.py (new file)
@@ -0,0 +1,68 @@
import configparser


def dict_compare(dict1, dict2, itercount=0):
    dict1_keys = set(dict1.keys())
    dict2_keys = set(dict2.keys())
    intersection = dict1_keys.difference(dict2_keys)
    faulties = []
    if itercount > 0 and len(intersection) > 0:
        return (False, list(intersection))
    flag_no_error = True
    for k, v in dict1.items():
        if isinstance(v, dict):
            if k not in dict2:
                faulties.append({k: dict1[k]})
                flag_no_error = False
            else:
                status, faulty = dict_compare(v, dict2[k], itercount+1)
                flag_no_error = flag_no_error and status
                if len(faulty) > 0:
                    faulties.append({k: faulty})
        else:
            return (True, [])
    if flag_no_error:
        return (True, [])
    else:
        return (False, faulties)
class TimeoutException(Exception):
    pass


def timeout_handler(signum, frame):
    raise TimeoutException
# https://stackoverflow.com/a/10464730
class Monitor():
    def __init__(self, connection_pool):
        self.connection_pool = connection_pool
        self.connection = None

    def __del__(self):
        try:
            self.reset()
        except Exception:
            pass

    def reset(self):
        if self.connection:
            self.connection_pool.release(self.connection)
            self.connection = None

    def monitor(self):
        if self.connection is None:
            self.connection = self.connection_pool.get_connection('monitor', None)
        self.connection.send_command("monitor")
        return self.listen()

    def parse_response(self):
        return self.connection.read_response()

    def listen(self):
        while True:
            yield self.parse_response()
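In spirit, the configuration check that `dict_compare` supports boils down to comparing the key sets of the default and the live config. A self-contained sketch using only configparser (the sample fields are invented for illustration):

```python
import configparser

DEFAULT_CFG = """
[Server]
host = localhost
port = 8001
debug = False
"""

LIVE_CFG = """
[Server]
host = localhost
port = 8001
"""

def missing_fields(default_text, live_text):
    """Fields present in the default config but absent from the live one."""
    default, live = configparser.ConfigParser(), configparser.ConfigParser()
    default.read_string(default_text)
    live.read_string(live_text)
    missing = []
    for section in default.sections():
        for key in default[section]:
            if not live.has_option(section, key):
                missing.append(f'{section}.{key}')
    return missing
```

Here a live config that lacks the new `debug` field would be flagged, which is exactly the situation after pulling this commit without refreshing `config.cfg`.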

@@ -1,9 +1,13 @@
#!/usr/bin/env python3.5

import configparser
import datetime
import json
import os
import sys
import time

import redis

import util
from helpers import contributor_helper
@@ -206,7 +210,7 @@ def main():
    for award in awards_given:
        # update awards given
        serv_redis_db.zadd('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
        serv_redis_db.expire('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), ONE_DAY*7)  # expire after 7 days
        # publish
        publish_log('GIVE_HONOR_ZMQ', 'CONTRIBUTION', {'org': org, 'award': award, 'epoch': nowSec }, CHANNEL_LASTAWARDS)
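The `zadd` change above tracks redis-py 3.x, where the member/score pair is passed as a `{member: score}` mapping instead of the 2.x positional `score, member` arguments. Building that mapping can be shown without a live Redis server (function name and the sample org/award values are ours, for illustration only):

```python
import json

def award_zadd_args(date_str, org, award, now_sec):
    """Key and {member: score} mapping for redis-py 3.x zadd,
    mirroring the serv_redis_db.zadd(...) call above."""
    member = json.dumps({'org': org, 'award': award, 'epoch': now_sec})
    return 'CONTRIB_LAST_AWARDS:' + date_str, {member: now_sec}

# redis-py 2.x: zadd(key, score, member) -- redis-py 3.x: zadd(key, mapping)
key, mapping = award_zadd_args('20190621', 'CIRCL', 'badge_1', 1561120764)
```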

@@ -1,16 +1,20 @@
import configparser
import datetime
import json
import logging
import math
import os
import random
import sys
import time

import redis

import util
from util import getZrange

from . import users_helper

KEYDAY = "CONTRIB_DAY"  # To be used by other module
KEYALLORG = "CONTRIB_ALL_ORG"  # To be used by other module
@@ -30,11 +34,16 @@ class Contributor_helper:
        #logger
        logDir = cfg.get('Log', 'directory')
        logfilename = cfg.get('Log', 'helpers_filename')
        logPath = os.path.join(logDir, logfilename)
        if not os.path.exists(logDir):
            os.makedirs(logDir)
        try:
            logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
        except PermissionError as error:
            print(error)
            print("Please fix the above and try again.")
            sys.exit(126)
        self.logger = logging.getLogger(__name__)

        #honorBadge
@@ -106,7 +115,7 @@
    def addContributionToCateg(self, date, categ, org, count=1):
        today_str = util.getDateStrFormat(date)
        keyname = "{}:{}:{}".format(self.keyCateg, today_str, categ)
        self.serv_redis_db.zincrby(keyname, count, org)
        self.logger.debug('Added to redis: keyname={}, org={}, count={}'.format(keyname, org, count))

    def publish_log(self, zmq_name, name, content, channel=""):
@@ -120,14 +129,14 @@ class Contributor_helper:
         if action in ['edit', None]:
             pass
             #return #not a contribution?

         now = datetime.datetime.now()
         nowSec = int(time.time())
         pnts_to_add = self.default_pnts_per_contribution

         # Do not consider contribution as login anymore
         #self.users_helper.add_user_login(nowSec, org)

         # is a valid contribution
         if categ is not None:
             try:
@@ -135,27 +144,27 @@ class Contributor_helper:
             except KeyError:
                 pnts_to_add = self.default_pnts_per_contribution
             pnts_to_add *= pntMultiplier
             util.push_to_redis_zset(self.serv_redis_db, self.keyDay, org, count=pnts_to_add)
             #CONTRIB_CATEG retain the contribution per category, not the point earned in this categ
             util.push_to_redis_zset(self.serv_redis_db, self.keyCateg, org, count=1, endSubkey=':'+util.noSpaceLower(categ))
             self.publish_log(zmq_name, 'CONTRIBUTION', {'org': org, 'categ': categ, 'action': action, 'epoch': nowSec }, channel=self.CHANNEL_LASTCONTRIB)
         else:
             categ = ""

         self.serv_redis_db.sadd(self.keyAllOrg, org)

         keyname = "{}:{}".format(self.keyLastContrib, util.getDateStrFormat(now))
-        self.serv_redis_db.zadd(keyname, nowSec, org)
+        self.serv_redis_db.zadd(keyname, {org: nowSec})
         self.logger.debug('Added to redis: keyname={}, nowSec={}, org={}'.format(keyname, nowSec, org))
         self.serv_redis_db.expire(keyname, util.ONE_DAY*7) #expire after 7 day

         awards_given = self.updateOrgContributionRank(org, pnts_to_add, action, contribType, eventTime=datetime.datetime.now(), isLabeled=isLabeled, categ=util.noSpaceLower(categ))

         for award in awards_given:
             # update awards given
             keyname = "{}:{}".format(self.keyLastAward, util.getDateStrFormat(now))
-            self.serv_redis_db.zadd(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec }))
+            self.serv_redis_db.zadd(keyname, {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
             self.logger.debug('Added to redis: keyname={}, nowSec={}, content={}'.format(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec })))
             self.serv_redis_db.expire(keyname, util.ONE_DAY*7) #expire after 7 day

         # publish
@@ -168,7 +177,7 @@ class Contributor_helper:
         if pnts is None:
             pnts = 0
         else:
-            pnts = int(pnts.decode('utf8'))
+            pnts = int(pnts)
         return pnts

     # return: [final_rank, requirement_fulfilled, requirement_not_fulfilled]
@@ -372,7 +381,7 @@ class Contributor_helper:
     def getOrgsTrophyRanking(self, categ):
         keyname = '{mainKey}:{orgCateg}'
         res = self.serv_redis_db.zrange(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), 0, -1, withscores=True, desc=True)
-        res = [[org.decode('utf8'), score] for org, score in res]
+        res = [[org, score] for org, score in res]
         return res

     def getAllOrgsTrophyRanking(self, category=None):
@@ -401,12 +410,12 @@ class Contributor_helper:
     def giveTrophyPointsToOrg(self, org, categ, points):
         keyname = '{mainKey}:{orgCateg}'
-        self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, points)
+        self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), points, org)
         self.logger.debug('Giving {} trophy points to {} in {}'.format(points, org, categ))

     def removeTrophyPointsFromOrg(self, org, categ, points):
         keyname = '{mainKey}:{orgCateg}'
-        self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, -points)
+        self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), -points, org)
         self.logger.debug('Removing {} trophy points from {} in {}'.format(points, org, categ))

     ''' AWARDS HELPER '''
@@ -553,7 +562,7 @@ class Contributor_helper:
     def getAllOrgFromRedis(self):
         data = self.serv_redis_db.smembers(self.keyAllOrg)
-        data = [x.decode('utf8') for x in data]
+        data = [x for x in data]
         return data

     def getCurrentOrgRankFromRedis(self, org):
@@ -589,4 +598,3 @@ class Contributor_helper:
                 return { 'remainingPts': i-points, 'stepPts': prev }
             prev = i
         return { 'remainingPts': 0, 'stepPts': self.rankMultiplier**self.levelMax }
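The recurring `zincrby` and `zadd` rewrites in this file track the redis-py 3.x API break: `zincrby(name, amount, value)` now takes the increment before the member, and `zadd(name, mapping)` takes a `{member: score}` dict instead of positional score/member pairs. A dict-based sketch of the new call shapes, with no Redis server involved (`zincrby_3x` is a hypothetical in-memory stand-in, not dashboard code):

```python
import json
import time

# redis-py 3.x: zadd(key, {member: score}). In 2.x the same call was
# zadd(key, score, member) with positional arguments.
org = 'CIRCL'
now_sec = int(time.time())
member = json.dumps({'org': org, 'award': 'trophy', 'epoch': now_sec})
zadd_mapping = {member: now_sec}  # passed as: r.zadd(keyname, zadd_mapping)

# redis-py 3.x: zincrby(name, amount, value) -- the amount now comes
# BEFORE the member. This dict models the sorted-set scores.
def zincrby_3x(zset, amount, value):
    zset[value] = zset.get(value, 0) + amount
    return zset[value]

scores = {}
zincrby_3x(scores, 1, org)    # one contribution
zincrby_3x(scores, 2.5, org)  # weighted contribution, score is now 3.5
```

Left unswapped, a 2.x-style `zincrby(key, org, 1)` would send the organisation name where the increment belongs, and the server would reject it as not a valid float, which is why every call site in the helpers had to change.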

helpers/geo_helper.py

@@ -1,18 +1,22 @@
-import math, random
-import os
+import datetime
 import json
-import datetime, time
 import logging
-import json
-import redis
+import math
+import os
+import random
+import sys
+import time
 from collections import OrderedDict
-import geoip2.database
-import phonenumbers, pycountry
-from phonenumbers import geocoder
+
+import redis
+
+import geoip2.database
+import phonenumbers
+import pycountry

 import util
 from helpers import live_helper
+from phonenumbers import geocoder

 class InvalidCoordinate(Exception):
     pass
@@ -29,11 +33,16 @@ class Geo_helper:
         #logger
         logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
         logPath = os.path.join(logDir, logfilename)
         if not os.path.exists(logDir):
             os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
         self.logger = logging.getLogger(__name__)

         self.keyCategCoord = "GEO_COORD"
@@ -43,7 +52,12 @@ class Geo_helper:
         self.PATH_TO_JSON = cfg.get('RedisMap', 'path_countrycode_to_coord_JSON')
         self.CHANNELDISP = cfg.get('RedisMap', 'channelDisp')

-        self.reader = geoip2.database.Reader(self.PATH_TO_DB)
+        try:
+            self.reader = geoip2.database.Reader(self.PATH_TO_DB)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
         self.country_to_iso = { country.name: country.alpha_2 for country in pycountry.countries}
         with open(self.PATH_TO_JSON) as f:
             self.country_code_to_coord = json.load(f)
@@ -125,7 +139,7 @@ class Geo_helper:
                 self.live_helper.add_to_stream_log_cache('Map', j_to_send)
                 self.logger.info('Published: {}'.format(json.dumps(to_send)))
         except ValueError:
-            self.logger.warning("can't resolve ip")
+            self.logger.warning("Can't resolve IP: " + str(supposed_ip))
         except geoip2.errors.AddressNotFoundError:
             self.logger.warning("Address not in Database")
         except InvalidCoordinate:
@@ -181,13 +195,18 @@ class Geo_helper:
         now = datetime.datetime.now()
         today_str = util.getDateStrFormat(now)
         keyname = "{}:{}".format(keyCateg, today_str)
-        self.serv_redis_db.geoadd(keyname, lon, lat, content)
+        try:
+            self.serv_redis_db.geoadd(keyname, lon, lat, content)
+        except redis.exceptions.ResponseError as error:
+            print(error)
+            print("Please fix the above, and make sure you use a redis version that supports the GEOADD command.")
+            print("To test for support: echo \"help GEOADD\"| redis-cli")
         self.logger.debug('Added to redis: keyname={}, lon={}, lat={}, content={}'.format(keyname, lon, lat, content))

     def push_to_redis_zset(self, keyCateg, toAdd, endSubkey="", count=1):
         now = datetime.datetime.now()
         today_str = util.getDateStrFormat(now)
         keyname = "{}:{}{}".format(keyCateg, today_str, endSubkey)
-        self.serv_redis_db.zincrby(keyname, toAdd, count)
+        self.serv_redis_db.zincrby(keyname, count, toAdd)
         self.logger.debug('Added to redis: keyname={}, toAdd={}, count={}'.format(keyname, toAdd, count))

     def ip_to_coord(self, ip):
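The guarded `geoadd(keyname, lon, lat, content)` call above stores one member per resolved location, keyed by day, and Redis itself rejects coordinates outside its Web Mercator limits. A dict-based model of that call shape and its validation (hypothetical stand-in, no server needed):

```python
# Model of the GEOADD call shape used by Geo_helper: longitude first,
# then latitude, then the member. The latitude bounds mirror the limits
# Redis enforces for GEOADD (+/-85.05112878 degrees).
def geoadd_model(store, key, lon, lat, member):
    if not (-180 <= lon <= 180 and -85.05112878 <= lat <= 85.05112878):
        raise ValueError('invalid longitude,latitude pair')
    store.setdefault(key, {})[member] = (lon, lat)

db = {}
geoadd_model(db, 'GEO_COORD:2019-06-21', 6.13, 49.61, '8.8.8.8')
```

On a Redis build without GEOADD the real call fails server-side, which is exactly the `redis.exceptions.ResponseError` the new `try`/`except` reports.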

helpers/live_helper.py

@@ -1,8 +1,10 @@
-import os
+import datetime
 import json
-import random
-import datetime, time
 import logging
+import os
+import random
+import sys
+import time

 class Live_helper:
@@ -16,11 +18,16 @@ class Live_helper:
         # logger
         logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
         logPath = os.path.join(logDir, logfilename)
         if not os.path.exists(logDir):
             os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
         self.logger = logging.getLogger(__name__)

     def publish_log(self, zmq_name, name, content, channel=None):
@@ -32,6 +39,7 @@ class Live_helper:
             self.serv_live.publish(channel, j_to_send)
             self.logger.debug('Published: {}'.format(j_to_send))
             if name != 'Keepalive':
+                name = 'Attribute' if 'ObjectAttribute' else name
                 self.add_to_stream_log_cache(name, j_to_send_keep)
@@ -40,10 +48,10 @@ class Live_helper:
         entries = self.serv_live.lrange(rKey, 0, -1)
         to_ret = []
         for entry in entries:
-            jentry = json.loads(entry.decode('utf8'))
+            jentry = json.loads(entry)
             to_ret.append(jentry)
         return to_ret

     def add_to_stream_log_cache(self, cacheKey, item):
         rKey = self.prefix_redis_key+cacheKey
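Dropping `.decode('utf8')` throughout the helpers goes hand in hand with the `decode_responses=True` flag set on the `StrictRedis` clients in `server.py`: the client then returns `str` instead of `bytes`, and `str` has no `.decode()` method. `json.loads` accepts both types (bytes since Python 3.6), so the simplified call works either way:

```python
import json

raw_bytes = b'{"name": "Attribute", "zmqName": "misp1"}'
raw_str = raw_bytes.decode('utf8')  # what decode_responses=True hands back

# json.loads accepts str, and accepts bytes directly since Python 3.6.
assert json.loads(raw_str) == json.loads(raw_bytes)

# str has no .decode(), so the old entry.decode('utf8') would crash once
# decode_responses=True is enabled -- hence the removals in this commit.
assert not hasattr(raw_str, 'decode')
```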

helpers/trendings_helper.py

@@ -1,13 +1,17 @@
-import math, random
-import os
-import json
 import copy
-import datetime, time
+import datetime
+import json
 import logging
+import math
+import os
+import random
+import sys
+import time
 from collections import OrderedDict

 import util

 class Trendings_helper:
     def __init__(self, serv_redis_db, cfg):
         self.serv_redis_db = serv_redis_db
@@ -23,11 +27,16 @@ class Trendings_helper:
         #logger
         logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
        logPath = os.path.join(logDir, logfilename)
         if not os.path.exists(logDir):
             os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
         self.logger = logging.getLogger(__name__)

     ''' SETTER '''
@@ -40,7 +49,7 @@ class Trendings_helper:
             to_save = json.dumps(data)
         else:
             to_save = data
-        self.serv_redis_db.zincrby(keyname, to_save, 1)
+        self.serv_redis_db.zincrby(keyname, 1, to_save)
         self.logger.debug('Added to redis: keyname={}, content={}'.format(keyname, to_save))

     def addTrendingEvent(self, eventName, timestamp):
@@ -82,7 +91,7 @@ class Trendings_helper:
         for curDate in util.getXPrevDaysSpan(dateE, prev_days):
             keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
             data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = [ [record[0], record[1]] for record in data ]
             data = data if data is not None else []
             to_ret.append([util.getTimestamp(curDate), data])
         to_ret = util.sortByTrendingScore(to_ret, topNum=topNum)
@@ -115,7 +124,7 @@ class Trendings_helper:
         for curDate in util.getXPrevDaysSpan(dateE, prev_days):
             keyname = "{}:{}".format(self.keyTag, util.getDateStrFormat(curDate))
             data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = [ [record[0], record[1]] for record in data ]
             data = data if data is not None else []
             temp = []
             for jText, score in data:
@@ -130,10 +139,10 @@ class Trendings_helper:
         for curDate in util.getXPrevDaysSpan(dateE, prev_days):
             keyname = "{}:{}".format(self.keySigh, util.getDateStrFormat(curDate))
             sight = self.serv_redis_db.get(keyname)
-            sight = 0 if sight is None else int(sight.decode('utf8'))
+            sight = 0 if sight is None else int(sight)
             keyname = "{}:{}".format(self.keyFalse, util.getDateStrFormat(curDate))
             fp = self.serv_redis_db.get(keyname)
-            fp = 0 if fp is None else int(fp.decode('utf8'))
+            fp = 0 if fp is None else int(fp)
             to_ret.append([util.getTimestamp(curDate), { 'sightings': sight, 'false_positive': fp}])
         return to_ret
@@ -149,7 +158,7 @@ class Trendings_helper:
             keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
             data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
             for elem in data:
-                allSet.add(elem.decode('utf8'))
+                allSet.add(elem)
             to_ret[trendingType] = list(allSet)
         tags = self.getTrendingTags(dateS, dateE)
         tagSet = set()
@@ -178,7 +187,7 @@ class Trendings_helper:
         for curDate in util.getXPrevDaysSpan(dateE, prev_days):
             keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
             data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = [ [record[0], record[1]] for record in data ]
             data = data if data is not None else []
             to_format.append([util.getTimestamp(curDate), data])

helpers/users_helper.py

@@ -1,10 +1,14 @@
-import math, random
-import os
+import datetime
 import json
-import datetime, time
 import logging
+import math
+import os
+import random
+import sys
+import time

 import util

 from . import contributor_helper
@@ -20,11 +24,16 @@ class Users_helper:
         #logger
         logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
         logPath = os.path.join(logDir, logfilename)
         if not os.path.exists(logDir):
             os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
         self.logger = logging.getLogger(__name__)

     def add_user_login(self, timestamp, org, email=''):
@@ -32,11 +41,11 @@ class Users_helper:
         timestampDate_str = util.getDateStrFormat(timestampDate)

         keyname_timestamp = "{}:{}".format(self.keyTimestamp, org)
-        self.serv_redis_db.zadd(keyname_timestamp, timestamp, timestamp)
+        self.serv_redis_db.zadd(keyname_timestamp, {timestamp: timestamp})
         self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_timestamp, timestamp))

         keyname_org = "{}:{}".format(self.keyOrgLog, timestampDate_str)
-        self.serv_redis_db.zincrby(keyname_org, org, 1)
+        self.serv_redis_db.zincrby(keyname_org, 1, org)
         self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_org, org))

         self.serv_redis_db.sadd(self.keyAllOrgLog, org)
@@ -44,7 +53,7 @@ class Users_helper:
     def getAllOrg(self):
         temp = self.serv_redis_db.smembers(self.keyAllOrgLog)
-        return [ org.decode('utf8') for org in temp ]
+        return [ org for org in temp ]

     # return: All timestamps for one org for the spanned time or not
     def getDates(self, org, date=None):
@@ -63,11 +72,11 @@ class Users_helper:
             else:
                 break # timestamps should be sorted, no need to process anymore
         return to_return

     # return: All dates for all orgs, if date is not supplied, return for all dates
     def getUserLogins(self, date=None):
-        # get all orgs and retreive their timestamps
+        # get all orgs and retrieve their timestamps
         dates = []
         for org in self.getAllOrg():
             keyname = "{}:{}".format(self.keyOrgLog, org)
@@ -81,7 +90,7 @@ class Users_helper:
             keyname = "{}:{}".format(self.keyOrgLog, util.getDateStrFormat(curDate))
             data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
             for org in data:
-                orgs.add(org.decode('utf8'))
+                orgs.add(org)
         return list(orgs)

     # return: list composed of the number of [log, contrib] for one org for the time spanned
@@ -125,7 +134,7 @@ class Users_helper:
     def getLoginVSCOntribution(self, date):
         keyname = "{}:{}".format(self.keyContribDay, util.getDateStrFormat(date))
         orgs_contri = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=False)
-        orgs_contri = [ org.decode('utf8') for org in orgs_contri ]
+        orgs_contri = [ org for org in orgs_contri ]
         orgs_login = [ org for org in self.getAllLoggedInOrgs(date, prev_days=0) ]
         contributed_num = 0
         non_contributed_num = 0
@@ -169,7 +178,7 @@ class Users_helper:
         data = [data[6]]+data[:6]
         return data

     # return: a dico of the form {login: [[timestamp, count], ...], contrib: [[timestamp, 1/0], ...]}
     # either for all orgs or the supplied one
     def getUserLoginsAndContribOvertime(self, date, org=None, prev_days=6):
         dico_hours_contrib = {}
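The login tracker now calls `zadd(keyname_timestamp, {timestamp: timestamp})`, using the epoch both as member and as score. Because sorted-set members are unique, repeated logins in the same second collapse into one entry. A mapping models that behaviour (a sketch with hypothetical values, not dashboard code):

```python
logins = {}  # models the sorted set: member -> score

def record_login(zset, timestamp):
    # redis-py 3.x: zadd(key, {member: score}). Member and score are both
    # the epoch here, so a repeated timestamp updates instead of duplicating.
    zset.update({timestamp: timestamp})

for ts in (1561120764, 1561120764, 1561120800):
    record_login(logins, ts)

# three calls, but only two distinct members survive
```

That per-second deduplication is acceptable for the dashboard, which only plots login activity over time.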

install_dependencies.sh

@@ -1,6 +1,9 @@
 #!/bin/bash
-set -e
+## disable -e for production systems
+#set -e

+## Debug mode
 #set -x

 sudo chmod -R g+w .
@@ -15,7 +18,13 @@ fi
 sudo apt-get install python3-virtualenv virtualenv screen redis-server unzip -y

 if [ -z "$VIRTUAL_ENV" ]; then
-    virtualenv -p python3 DASHENV
+    virtualenv -p python3 DASHENV ; DASH_VENV=$?
+
+    if [[ "$DASH_VENV" != "0" ]]; then
+        echo "Something went wrong with either the update or install of the virtualenv."
+        echo "Please investigate manually."
+        exit $DASH_VENV
+    fi

     . ./DASHENV/bin/activate
 fi
@@ -44,7 +53,14 @@ mkdir -p css fonts js
 popd

 mkdir -p temp

-wget http://www.misp-project.org/assets/images/misp-small.png -O static/pics/MISP.png
+NET_WGET=$(wget --no-cache -q https://www.misp-project.org/assets/images/misp-small.png -O static/pics/MISP.png; echo $?)
+if [[ "$NET_WGET" != "0" ]]; then
+    echo "The first wget we tried failed, please investigate manually."
+    exit $NET_WGET
+fi
+
+wget https://www.misp-project.org/favicon.ico -O static/favicon.ico

 # jquery
 JQVERSION="3.2.1"
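The installer's new `wget` guard captures an exit status through command substitution: the download writes to a file (`-q -O`), so the substitution's output is only the `echo $?` that follows it, and the script can branch on the saved status. The same pattern, sketched with `false`/`true` standing in for the guarded `wget` call:

```shell
#!/bin/bash
set +e  # the pattern assumes errexit is off, as in the updated installer

# Capture an exit status inside $(...): run the command, then echo $?.
STATUS=$(false; echo $?)    # stand-in for the guarded wget call
if [[ "$STATUS" != "0" ]]; then
    echo "command failed with status $STATUS"
fi

OK_STATUS=$(true; echo $?)
echo "ok status: $OK_STATUS"
```

This is also why the commit comments out `set -e`: with errexit on, the failing command would abort the subshell before `echo $?` could report it.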

requirements.txt

@@ -1,9 +1,10 @@
 argparse
 flask
 geoip2
-# Redis needs to be 2.10.6 due to a change in redis 3.10 client, see here: https://github.com/MISP/misp-dashboard/issues/76#issuecomment-439389621
-redis==2.10.6
+redis
 phonenumbers
 pip
 pycountry
 zmq
+requests
+halo


@@ -1,13 +1,15 @@
 #!/usr/bin/env python3.5
-import redis
-import requests
-import shutil
 import json
 import math
-import sys, os
+import os
+import shlex
+import shutil
+import sys
 import time
 from subprocess import PIPE, Popen
-import shlex
+
+import redis
+import requests

 URL_OPEN_MAP = "http://tile.openstreetmap.org/{zoom}/{x}/{y}.png"
 MAP_DIR = "static/maps/"

server.py

@@ -1,21 +1,22 @@
 #!/usr/bin/env python3
-from flask import Flask, render_template, request, Response, jsonify, stream_with_context
-import json
-import redis
-import random, math
 import configparser
+import datetime
+import errno
+import json
+import logging
+import math
+import os
+import random
 from time import gmtime as now
 from time import sleep, strftime
-import datetime
-import os
-import logging

+import redis
+
 import util
-from helpers import geo_helper
-from helpers import contributor_helper
-from helpers import users_helper
-from helpers import trendings_helper
-from helpers import live_helper
+from flask import (Flask, Response, jsonify, render_template, request,
+                   send_from_directory, stream_with_context)
+from helpers import (contributor_helper, geo_helper, live_helper,
+                     trendings_helper, users_helper)

 configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
 cfg = configparser.ConfigParser()
@@ -26,21 +27,25 @@ logger.setLevel(logging.ERROR)

 server_host = cfg.get("Server", "host")
 server_port = cfg.getint("Server", "port")
+server_debug = cfg.get("Server", "debug")

 app = Flask(__name__)

 redis_server_log = redis.StrictRedis(
         host=cfg.get('RedisGlobal', 'host'),
         port=cfg.getint('RedisGlobal', 'port'),
-        db=cfg.getint('RedisLog', 'db'))
+        db=cfg.getint('RedisLog', 'db'),
+        decode_responses=True)
 redis_server_map = redis.StrictRedis(
         host=cfg.get('RedisGlobal', 'host'),
         port=cfg.getint('RedisGlobal', 'port'),
-        db=cfg.getint('RedisMap', 'db'))
+        db=cfg.getint('RedisMap', 'db'),
+        decode_responses=True)
 serv_redis_db = redis.StrictRedis(
         host=cfg.get('RedisGlobal', 'host'),
         port=cfg.getint('RedisGlobal', 'port'),
-        db=cfg.getint('RedisDB', 'db'))
+        db=cfg.getint('RedisDB', 'db'),
+        decode_responses=True)

 streamLogCacheKey = cfg.get('RedisLog', 'streamLogCacheKey')
 streamMapCacheKey = cfg.get('RedisLog', 'streamMapCacheKey')
@@ -68,10 +73,10 @@ class LogItem():
                 FIELDNAME_ORDER_HEADER.append(item)
             FIELDNAME_ORDER.append(item)

-    def __init__(self, feed):
+    def __init__(self, feed, filters={}):
+        self.filters = filters
+        self.feed = feed
         self.fields = []
-        for f in feed:
-            self.fields.append(f)

     def get_head_row(self):
         to_ret = []
@@ -80,38 +85,72 @@ class LogItem():
         return to_ret

     def get_row(self):
+        if not self.pass_filter():
+            return False
+
         to_ret = {}
-        #Number to keep them sorted (jsonify sort keys)
-        for item in range(len(LogItem.FIELDNAME_ORDER)):
-            try:
-                to_ret[item] = self.fields[item]
-            except IndexError: # not enough field in rcv item
-                to_ret[item] = ''
+        for i, field in enumerate(json.loads(cfg.get('Dashboard', 'fieldname_order'))):
+            if type(field) is list:
+                to_join = []
+                for subField in field:
+                    to_join.append(str(util.getFields(self.feed, subField)))
+                to_add = cfg.get('Dashboard', 'char_separator').join(to_join)
+            else:
+                to_add = util.getFields(self.feed, field)
+            to_ret[i] = to_add if to_add is not None else ''
         return to_ret

+    def pass_filter(self):
+        for filter, filterValue in self.filters.items():
+            jsonValue = util.getFields(self.feed, filter)
+            if jsonValue is None or jsonValue != filterValue:
+                return False
+        return True
+
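`pass_filter` drops a log row unless every cookie-supplied filter matches the corresponding field of the feed, where `util.getFields` resolves dotted paths into the nested MISP JSON. A self-contained approximation of that logic (`get_field` is a simplified stand-in for `util.getFields`, which also handles lists and joined fields):

```python
def get_field(feed, path):
    """Simplified stand-in for util.getFields: walk a dotted key path."""
    cur = feed
    for part in path.split('.'):
        if not isinstance(cur, dict):
            return None
        cur = cur.get(part)
    return cur

def pass_filter(feed, filters):
    # Mirrors LogItem.pass_filter: every filter must match exactly,
    # and a missing field (None) fails the row.
    for key, wanted in filters.items():
        if get_field(feed, key) != wanted:
            return False
    return True

feed = {'Attribute': {'category': 'Network activity', 'type': 'ip-src'}}
assert pass_filter(feed, {'Attribute.category': 'Network activity'})
assert not pass_filter(feed, {'Attribute.type': 'domain'})
```

An empty filter dict (the default when no `filters` cookie is set) lets every row through, so unfiltered dashboards behave as before.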
class EventMessage():
    # Suppose the event message is a json with the format {name: 'feedName', log: 'logData'}
    def __init__(self, msg, filters):
        if not isinstance(msg, dict):
            try:
                jsonMsg = json.loads(msg)
                jsonMsg['log'] = json.loads(jsonMsg['log'])
            except json.JSONDecodeError as e:
                logger.error(e)
                jsonMsg = { 'name': "undefined", 'log': json.loads(msg) }
        else:
            jsonMsg = msg

        self.name = jsonMsg['name']
        self.zmqName = jsonMsg['zmqName']

        if self.name == 'Attribute':
            self.feed = jsonMsg['log']
            self.feed = LogItem(self.feed, filters).get_row()
        elif self.name == 'ObjectAttribute':
            self.feed = jsonMsg['log']
            self.feed = LogItem(self.feed, filters).get_row()
        else:
            self.feed = jsonMsg['log']

    def to_json_ev(self):
        if self.feed is not False:
            to_ret = { 'log': self.feed, 'name': self.name, 'zmqName': self.zmqName }
            return 'data: {}\n\n'.format(json.dumps(to_ret))
        else:
            return ''

    def to_json(self):
        if self.feed is not False:
            to_ret = { 'log': self.feed, 'name': self.name, 'zmqName': self.zmqName }
            return json.dumps(to_ret)
        else:
            return ''

    def to_dict(self):
        return {'log': self.feed, 'name': self.name, 'zmqName': self.zmqName}
###########
## ROUTE ##
@@ -140,6 +179,10 @@ def index():
            zoomlevel=cfg.getint('Dashboard', 'zoomlevel')
            )

@app.route('/favicon.ico')
def favicon():
    return send_from_directory(os.path.join(app.root_path, 'static'),
                               'favicon.ico', mimetype='image/vnd.microsoft.icon')
@app.route("/geo")
def geo():
@@ -226,9 +269,18 @@ def logs():
    if request.accept_mimetypes.accept_json or request.method == 'POST':
        key = 'Attribute'
        j = live_helper.get_stream_log_cache(key)
        to_ret = []
        for item in j:
            filters = request.cookies.get('filters', '{}')
            filters = json.loads(filters)
            ev = EventMessage(item, filters)
            if ev is not None:
                dico = ev.to_dict()
                if dico['log'] != False:
                    to_ret.append(dico)
        return jsonify(to_ret)
    else:
        return Response(stream_with_context(event_stream_log()), mimetype="text/event-stream")
@app.route("/_maps")
def maps():
@@ -248,9 +300,14 @@ def event_stream_log():
    subscriber_log.subscribe(live_helper.CHANNEL)
    try:
        for msg in subscriber_log.listen():
            filters = request.cookies.get('filters', '{}')
            filters = json.loads(filters)
            content = msg['data']
            ev = EventMessage(content, filters)
            if ev is not None:
                yield ev.to_json_ev()
            else:
                pass
    except GeneratorExit:
        subscriber_log.unsubscribe()
@@ -259,7 +316,7 @@ def event_stream_maps():
    subscriber_map.psubscribe(cfg.get('RedisMap', 'channelDisp'))
    try:
        for msg in subscriber_map.listen():
            content = msg['data']
            to_ret = 'data: {}\n\n'.format(content)
            yield to_ret
    except GeneratorExit:
@@ -318,7 +375,7 @@ def eventStreamLastContributor():
    subscriber_lastContrib.psubscribe(cfg.get('RedisLog', 'channelLastContributor'))
    try:
        for msg in subscriber_lastContrib.listen():
            content = msg['data']
            contentJson = json.loads(content)
            lastContribJson = json.loads(contentJson['log'])
            org = lastContribJson['org']
@@ -334,7 +391,7 @@ def eventStreamAwards():
    subscriber_lastAwards.psubscribe(cfg.get('RedisLog', 'channelLastAwards'))
    try:
        for msg in subscriber_lastAwards.listen():
            content = msg['data']
            contentJson = json.loads(content)
            lastAwardJson = json.loads(contentJson['log'])
            org = lastAwardJson['org']
@@ -590,4 +647,13 @@ def getGenericTrendingOvertime():
    return jsonify(data)

if __name__ == '__main__':
    try:
        app.run(host=server_host,
                port=server_port,
                debug=server_debug,
                threaded=True)
    except OSError as error:
        if error.errno == 98:
            print("\n\n\nAddress already in use, the defined port is: " + str(server_port))
        else:
            print(str(error))
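The literal `errno == 98` in the startup handler is Linux's `EADDRINUSE`; `errno.EADDRINUSE` is the portable spelling. A small self-contained sketch reproducing and detecting the condition (the socket setup here is illustrative, not part of the dashboard):

```python
import errno
import socket

# occupy an ephemeral port, then try to bind it a second time
first = socket.socket()
first.bind(('127.0.0.1', 0))
first.listen(1)
port = first.getsockname()[1]

second = socket.socket()
in_use = False
try:
    second.bind(('127.0.0.1', port))
except OSError as error:
    # errno is 98 on Linux; errno.EADDRINUSE works everywhere
    in_use = (error.errno == errno.EADDRINUSE)
finally:
    second.close()
    first.close()
```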


@@ -20,7 +20,22 @@ else
    exit 1
fi

if [[ -f "/etc/redhat-release" ]]; then
    echo "You are running a RedHat flavour. Checking for scl..."
    if [[ -f "/usr/bin/scl" ]]; then
        echo "scl detected, checking for redis-server"
        SCL_REDIS=$(scl -l | grep rh-redis)
        if [[ ! -z $SCL_REDIS ]]; then
            echo "We detected: ${SCL_REDIS}, acting accordingly"
            REDIS_RUN="/usr/bin/scl enable ${SCL_REDIS}"
        fi
    else
        echo "redis-server does not seem to be installed via scl, perhaps system-wide; testing."
        [ ! -f "`which redis-server`" ] && echo "'redis-server' is not installed/not on PATH. Please fix and run again." && exit 1
    fi
else
    [ ! -f "`which redis-server`" ] && echo "'redis-server' is not installed/not on PATH. Please fix and run again." && exit 1
fi

netstat -an | grep LISTEN | grep 6250 | grep -v tcp6; check_redis_port=$?
netstat -an | grep LISTEN | grep 8001 | grep -v tcp6; check_dashboard_port=$?
@@ -35,8 +50,12 @@ conf_dir="config/"
sleep 0.1
if [ "${check_redis_port}" == "1" ]; then
    echo -e $GREEN"\t* Launching Redis servers"$DEFAULT
    if [[ ! -z $REDIS_RUN ]]; then
        $REDIS_RUN "redis-server ${conf_dir}6250.conf" &
    else
        redis-server ${conf_dir}6250.conf &
    fi
else
    echo -e $RED"\t* NOT starting Redis server; a very unreliable check on port 6250 suggests something is already listening there… please double check that this is intended!"$DEFAULT
fi

7
static/css/jquery-ui.min.css vendored Normal file

File diff suppressed because one or more lines are too long

1
static/css/jquery.dataTables.min.css vendored Normal file

File diff suppressed because one or more lines are too long


@@ -0,0 +1,314 @@
.selected-path-container {
padding-left: 10px;
border: 1px solid #DCC896;
background: rgb(250, 240, 210);
border-radius: 4px;
margin-bottom: 0px;
}
.group-conditions > button[data-not="group"].active {
color: #FFF;
background-color: #C9302C;
border-color: #AC2925;
}
.query-builder, .query-builder * {
margin: 0;
padding: 0;
box-sizing: border-box;
}
.query-builder {
font-family: sans-serif;
}
.query-builder .hide {
display: none;
}
.query-builder .pull-right {
float: right !important;
}
.query-builder .btn {
text-transform: none;
display: inline-block;
padding: 6px 12px;
margin-bottom: 0px;
font-size: 14px;
font-weight: 400;
line-height: 1.42857;
text-align: center;
white-space: nowrap;
vertical-align: middle;
touch-action: manipulation;
cursor: pointer;
user-select: none;
background-image: none;
border: 1px solid transparent;
border-radius: 4px;
}
.query-builder .btn.focus, .query-builder .btn:focus, .query-builder .btn:hover {
color: #333;
text-decoration: none;
}
.query-builder .btn.active, .query-builder .btn:active {
background-image: none;
outline: 0px none;
box-shadow: 0px 3px 5px rgba(0, 0, 0, 0.125) inset;
}
.query-builder .btn-success {
color: #FFF;
background-color: #5CB85C;
border-color: #4CAE4C;
}
.query-builder .btn-primary {
color: #FFF;
background-color: #337AB7;
border-color: #2E6DA4;
}
.query-builder .btn-danger {
color: #FFF;
background-color: #D9534F;
border-color: #D43F3A;
}
.query-builder .btn-success.active, .query-builder .btn-success.focus,
.query-builder .btn-success:active, .query-builder .btn-success:focus,
.query-builder .btn-success:hover {
color: #FFF;
background-color: #449D44;
border-color: #398439;
}
.query-builder .btn-primary.active, .query-builder .btn-primary.focus,
.query-builder .btn-primary:active, .query-builder .btn-primary:focus,
.query-builder .btn-primary:hover {
color: #FFF;
background-color: #286090;
border-color: #204D74;
}
.query-builder .btn-danger.active, .query-builder .btn-danger.focus,
.query-builder .btn-danger:active, .query-builder .btn-danger:focus,
.query-builder .btn-danger:hover {
color: #FFF;
background-color: #C9302C;
border-color: #AC2925;
}
.query-builder .btn-group {
position: relative;
display: inline-block;
vertical-align: middle;
}
.query-builder .btn-group > .btn {
position: relative;
float: left;
}
.query-builder .btn-group > .btn:first-child {
margin-left: 0px;
}
.query-builder .btn-group > .btn:first-child:not(:last-child) {
border-top-right-radius: 0px;
border-bottom-right-radius: 0px;
}
.query-builder .btn-group > .btn:last-child:not(:first-child) {
border-top-left-radius: 0px;
border-bottom-left-radius: 0px;
}
.query-builder .btn-group .btn + .btn, .query-builder .btn-group .btn + .btn-group,
.query-builder .btn-group .btn-group + .btn, .query-builder .btn-group .btn-group + .btn-group {
margin-left: -1px;
}
.query-builder .btn-xs, .query-builder .btn-group-xs > .btn {
padding: 1px 5px;
line-height: 1.5;
border-radius: 3px;
}
/*!
* jQuery QueryBuilder 2.5.2
* Copyright 2014-2018 Damien "Mistic" Sorel (http://www.strangeplanet.fr)
* Licensed under MIT (https://opensource.org/licenses/MIT)
*/
.query-builder .rules-group-container, .query-builder .rule-container, .query-builder .rule-placeholder {
position: relative;
margin: 4px 0;
border-radius: 5px;
padding: 5px;
border: 1px solid #EEE;
background: rgba(255, 255, 255, 0.9);
}
.query-builder .rule-container .rule-filter-container,
.query-builder .rule-container .rule-operator-container,
.query-builder .rule-container .rule-value-container, .query-builder .error-container, .query-builder .drag-handle {
display: inline-block;
margin: 0 5px 0 0;
vertical-align: middle;
}
.query-builder .rules-group-container {
padding: 10px;
padding-bottom: 6px;
border: 1px solid #DCC896;
background: rgba(250, 240, 210, 0.5);
}
.query-builder .rules-group-header {
margin-bottom: 10px;
}
.query-builder .rules-group-header .group-conditions .btn.readonly:not(.active),
.query-builder .rules-group-header .group-conditions input[name$='_cond'] {
border: 0;
clip: rect(0 0 0 0);
height: 1px;
margin: -1px;
overflow: hidden;
padding: 0;
position: absolute;
width: 1px;
white-space: nowrap;
}
.query-builder .rules-group-header .group-conditions .btn.readonly {
border-radius: 3px;
}
.query-builder .rules-list {
list-style: none;
padding: 0 0 0 15px;
margin: 0;
}
.query-builder .rule-value-container {
border-left: 1px solid #DDD;
padding-left: 5px;
}
.query-builder .rule-value-container label {
margin-bottom: 0;
font-weight: normal;
}
.query-builder .rule-value-container label.block {
display: block;
}
.query-builder .rule-value-container select,
.query-builder .rule-value-container input[type='text'],
.query-builder .rule-value-container input[type='number'] {
padding: 1px;
}
.query-builder .error-container {
display: none;
cursor: help;
color: #F00;
}
.query-builder .has-error {
background-color: #FDD;
border-color: #F99;
}
.query-builder .has-error .error-container {
display: inline-block !important;
}
.query-builder .rules-list > *::before, .query-builder .rules-list > *::after {
content: '';
position: absolute;
left: -10px;
width: 10px;
height: calc(50% + 4px);
border-color: #CCC;
border-style: solid;
}
.query-builder .rules-list > *::before {
top: -4px;
border-width: 0 0 2px 2px;
}
.query-builder .rules-list > *::after {
top: 50%;
border-width: 0 0 0 2px;
}
.query-builder .rules-list > *:first-child::before {
top: -12px;
height: calc(50% + 14px);
}
.query-builder .rules-list > *:last-child::before {
border-radius: 0 0 0 4px;
}
.query-builder .rules-list > *:last-child::after {
display: none;
}
.query-builder.bt-checkbox-glyphicons .checkbox input[type='checkbox']:checked + label::after {
font-family: 'Glyphicons Halflings';
content: '\e013';
}
.query-builder.bt-checkbox-glyphicons .checkbox label::after {
padding-left: 4px;
padding-top: 2px;
font-size: 9px;
}
.query-builder .error-container + .tooltip .tooltip-inner {
color: #F99 !important;
}
.query-builder p.filter-description {
margin: 5px 0 0 0;
background: #D9EDF7;
border: 1px solid #BCE8F1;
color: #31708F;
border-radius: 5px;
padding: 2.5px 5px;
font-size: .8em;
}
.query-builder .rules-group-header [data-invert] {
margin-left: 5px;
}
.query-builder .drag-handle {
cursor: move;
vertical-align: middle;
margin-left: 5px;
}
.query-builder .dragging {
position: fixed;
opacity: .5;
z-index: 100;
}
.query-builder .dragging::before, .query-builder .dragging::after {
display: none;
}
.query-builder .rule-placeholder {
border: 1px dashed #BBB;
opacity: .7;
}


@@ -346,7 +346,8 @@ function addLastContributor(datatable, data, update) {
    } else {
        last_added_contrib = org;
        var date = new Date(data.epoch*1000);
        //date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
        date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
        var to_add = [
            date,
            data.pnts,
@@ -383,7 +384,8 @@ function addAwards(datatableAwards, json, playAnim) {
        var award = createTrophyImg(json.award[1][1], 40, categ);
    }
    var date = new Date(json.epoch*1000);
    //date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
    date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
    var to_add = [
        date,
        createImg(json.logo_path, 32),

144
static/js/doT.js Normal file

@@ -0,0 +1,144 @@
// doT.js
// 2011-2014, Laura Doktorova, https://github.com/olado/doT
// Licensed under the MIT license.
(function () {
"use strict";
var doT = {
name: "doT",
version: "1.1.1",
templateSettings: {
evaluate: /\{\{([\s\S]+?(\}?)+)\}\}/g,
interpolate: /\{\{=([\s\S]+?)\}\}/g,
encode: /\{\{!([\s\S]+?)\}\}/g,
use: /\{\{#([\s\S]+?)\}\}/g,
useParams: /(^|[^\w$])def(?:\.|\[[\'\"])([\w$\.]+)(?:[\'\"]\])?\s*\:\s*([\w$\.]+|\"[^\"]+\"|\'[^\']+\'|\{[^\}]+\})/g,
define: /\{\{##\s*([\w\.$]+)\s*(\:|=)([\s\S]+?)#\}\}/g,
defineParams:/^\s*([\w$]+):([\s\S]+)/,
conditional: /\{\{\?(\?)?\s*([\s\S]*?)\s*\}\}/g,
iterate: /\{\{~\s*(?:\}\}|([\s\S]+?)\s*\:\s*([\w$]+)\s*(?:\:\s*([\w$]+))?\s*\}\})/g,
varname: "it",
strip: true,
append: true,
selfcontained: false,
doNotSkipEncoded: false
},
template: undefined, //fn, compile template
compile: undefined, //fn, for express
log: true
}, _globals;
doT.encodeHTMLSource = function(doNotSkipEncoded) {
var encodeHTMLRules = { "&": "&#38;", "<": "&#60;", ">": "&#62;", '"': "&#34;", "'": "&#39;", "/": "&#47;" },
matchHTML = doNotSkipEncoded ? /[&<>"'\/]/g : /&(?!#?\w+;)|<|>|"|'|\//g;
return function(code) {
return code ? code.toString().replace(matchHTML, function(m) {return encodeHTMLRules[m] || m;}) : "";
};
};
_globals = (function(){ return this || (0,eval)("this"); }());
/* istanbul ignore else */
if (typeof module !== "undefined" && module.exports) {
module.exports = doT;
} else if (typeof define === "function" && define.amd) {
define(function(){return doT;});
} else {
_globals.doT = doT;
}
var startend = {
append: { start: "'+(", end: ")+'", startencode: "'+encodeHTML(" },
split: { start: "';out+=(", end: ");out+='", startencode: "';out+=encodeHTML(" }
}, skip = /$^/;
function resolveDefs(c, block, def) {
return ((typeof block === "string") ? block : block.toString())
.replace(c.define || skip, function(m, code, assign, value) {
if (code.indexOf("def.") === 0) {
code = code.substring(4);
}
if (!(code in def)) {
if (assign === ":") {
if (c.defineParams) value.replace(c.defineParams, function(m, param, v) {
def[code] = {arg: param, text: v};
});
if (!(code in def)) def[code]= value;
} else {
new Function("def", "def['"+code+"']=" + value)(def);
}
}
return "";
})
.replace(c.use || skip, function(m, code) {
if (c.useParams) code = code.replace(c.useParams, function(m, s, d, param) {
if (def[d] && def[d].arg && param) {
var rw = (d+":"+param).replace(/'|\\/g, "_");
def.__exp = def.__exp || {};
def.__exp[rw] = def[d].text.replace(new RegExp("(^|[^\\w$])" + def[d].arg + "([^\\w$])", "g"), "$1" + param + "$2");
return s + "def.__exp['"+rw+"']";
}
});
var v = new Function("def", "return " + code)(def);
return v ? resolveDefs(c, v, def) : v;
});
}
function unescape(code) {
return code.replace(/\\('|\\)/g, "$1").replace(/[\r\t\n]/g, " ");
}
doT.template = function(tmpl, c, def) {
c = c || doT.templateSettings;
var cse = c.append ? startend.append : startend.split, needhtmlencode, sid = 0, indv,
str = (c.use || c.define) ? resolveDefs(c, tmpl, def || {}) : tmpl;
str = ("var out='" + (c.strip ? str.replace(/(^|\r|\n)\t* +| +\t*(\r|\n|$)/g," ")
.replace(/\r|\n|\t|\/\*[\s\S]*?\*\//g,""): str)
.replace(/'|\\/g, "\\$&")
.replace(c.interpolate || skip, function(m, code) {
return cse.start + unescape(code) + cse.end;
})
.replace(c.encode || skip, function(m, code) {
needhtmlencode = true;
return cse.startencode + unescape(code) + cse.end;
})
.replace(c.conditional || skip, function(m, elsecase, code) {
return elsecase ?
(code ? "';}else if(" + unescape(code) + "){out+='" : "';}else{out+='") :
(code ? "';if(" + unescape(code) + "){out+='" : "';}out+='");
})
.replace(c.iterate || skip, function(m, iterate, vname, iname) {
if (!iterate) return "';} } out+='";
sid+=1; indv=iname || "i"+sid; iterate=unescape(iterate);
return "';var arr"+sid+"="+iterate+";if(arr"+sid+"){var "+vname+","+indv+"=-1,l"+sid+"=arr"+sid+".length-1;while("+indv+"<l"+sid+"){"
+vname+"=arr"+sid+"["+indv+"+=1];out+='";
})
.replace(c.evaluate || skip, function(m, code) {
return "';" + unescape(code) + "out+='";
})
+ "';return out;")
.replace(/\n/g, "\\n").replace(/\t/g, '\\t').replace(/\r/g, "\\r")
.replace(/(\s|;|\}|^|\{)out\+='';/g, '$1').replace(/\+''/g, "");
//.replace(/(\s|;|\}|^|\{)out\+=''\+/g,'$1out+=');
if (needhtmlencode) {
if (!c.selfcontained && _globals && !_globals._encodeHTML) _globals._encodeHTML = doT.encodeHTMLSource(c.doNotSkipEncoded);
str = "var encodeHTML = typeof _encodeHTML !== 'undefined' ? _encodeHTML : ("
+ doT.encodeHTMLSource.toString() + "(" + (c.doNotSkipEncoded || '') + "));"
+ str;
}
try {
return new Function(c.varname, str);
} catch (e) {
/* istanbul ignore else */
if (typeof console !== "undefined") console.log("Could not create a template function: " + str);
throw e;
}
};
doT.compile = function(tmpl, def) {
return doT.template(tmpl, null, def);
};
}());

132
static/js/extendext.js Normal file

@@ -0,0 +1,132 @@
/*!
* jQuery.extendext 0.1.2
*
* Copyright 2014-2016 Damien "Mistic" Sorel (http://www.strangeplanet.fr)
* Licensed under MIT (http://opensource.org/licenses/MIT)
*
* Based on jQuery.extend by jQuery Foundation, Inc. and other contributors
*/
/*jshint -W083 */
(function (root, factory) {
if (typeof define === 'function' && define.amd) {
define(['jquery'], factory);
}
else if (typeof module === 'object' && module.exports) {
module.exports = factory(require('jquery'));
}
else {
factory(root.jQuery);
}
}(this, function ($) {
"use strict";
$.extendext = function () {
var options, name, src, copy, copyIsArray, clone,
target = arguments[0] || {},
i = 1,
length = arguments.length,
deep = false,
arrayMode = 'default';
// Handle a deep copy situation
if (typeof target === "boolean") {
deep = target;
// Skip the boolean and the target
target = arguments[i++] || {};
}
// Handle array mode parameter
if (typeof target === "string") {
arrayMode = target.toLowerCase();
if (arrayMode !== 'concat' && arrayMode !== 'replace' && arrayMode !== 'extend') {
arrayMode = 'default';
}
// Skip the string param
target = arguments[i++] || {};
}
// Handle case when target is a string or something (possible in deep copy)
if (typeof target !== "object" && !$.isFunction(target)) {
target = {};
}
// Extend jQuery itself if only one argument is passed
if (i === length) {
target = this;
i--;
}
for (; i < length; i++) {
// Only deal with non-null/undefined values
if ((options = arguments[i]) !== null) {
// Special operations for arrays
if ($.isArray(options) && arrayMode !== 'default') {
clone = target && $.isArray(target) ? target : [];
switch (arrayMode) {
case 'concat':
target = clone.concat($.extend(deep, [], options));
break;
case 'replace':
target = $.extend(deep, [], options);
break;
case 'extend':
options.forEach(function (e, i) {
if (typeof e === 'object') {
var type = $.isArray(e) ? [] : {};
clone[i] = $.extendext(deep, arrayMode, clone[i] || type, e);
} else if (clone.indexOf(e) === -1) {
clone.push(e);
}
});
target = clone;
break;
}
} else {
// Extend the base object
for (name in options) {
src = target[name];
copy = options[name];
// Prevent never-ending loop
if (target === copy) {
continue;
}
// Recurse if we're merging plain objects or arrays
if (deep && copy && ( $.isPlainObject(copy) ||
(copyIsArray = $.isArray(copy)) )) {
if (copyIsArray) {
copyIsArray = false;
clone = src && $.isArray(src) ? src : [];
} else {
clone = src && $.isPlainObject(src) ? src : {};
}
// Never move original objects, clone them
target[name] = $.extendext(deep, arrayMode, clone, copy);
// Don't bring in undefined values
} else if (copy !== undefined) {
target[name] = copy;
}
}
}
}
}
// Return the modified object
return target;
};
}));


@@ -166,56 +166,34 @@ var sources = new Sources();
sources.addSource('global');

var ledmanager = new LedManager();

var curMaxDataNumLog = 0;

var livelog;

$(document).ready(function () {
    $.getJSON(urlForHead, function(head) {
        livelog = new $.livelog($("#divLogTable"), {
            pollingFrequency: 5000,
            tableHeader: head,
            tableMaxEntries: 50,
            // animate: false,
            preDataURL: urlForLogs,
            endpoint: urlForLogs
        });
    });
});

// LOG TABLE
function updateLogTable(name, log, zmqName, ignoreLed) {
    if (log.length == 0)
        return;

    // update keepAlives
    if (ignoreLed !== true) {
        ledmanager.updateKeepAlive(zmqName);
    }

    // only add row for attribute
    if (name == "Attribute" ) {
@@ -224,13 +202,6 @@ function updateLogTable(name, log, zmqName) {
        sources.incCountOnSource(categName);
        sources.incCountOnSource('global');
        updateChartDirect();
    } else if (name == "Keepalive") {
        // do nothing
    } else {
@@ -264,23 +235,6 @@ function getTextColour(rgb) {
    }
}

function createRow(tableBody, log) {
    var tr = document.createElement('TR');
@@ -338,3 +292,553 @@ function createHead(callback) {
        callback();
    });
}
/* LIVE LOG */
(function(factory) {
"use strict";
if (typeof define === 'function' && define.amd) {
define(['jquery'], factory);
} else if (window.jQuery && !window.jQuery.fn.Livelog) {
factory(window.jQuery);
}
}
(function($) {
'use strict';
// Livelog object
var Livelog = function(container, options) {
this._default_options = {
pollingFrequency: 5000,
tableHeader: undefined,
tableMaxEntries: undefined,
animate: true
}
options.container = container;
this.validateOptions(options);
this._options = $.extend({}, this._default_options, options);
// create table and draw header
this.origTableOptions = {
dom: "<'row'<'col-sm-12'<'dt-toolbar-led'>>>"
+ "<'row'<'col-sm-12'tr>>",
searching: false,
paging: false,
"order": [[ 0, "desc" ]],
responsive: true,
columnDefs: [
{ targets: 0, orderable: false },
{ targets: '_all', searchable: false, orderable: false,
render: function ( data, type, row ) {
var $toRet;
if (typeof data === 'object') {
$toRet = $('<span></span>');
data.data.forEach(function(cur, i) {
switch (data.name) {
case 'Tag':
var $tag = $('<a></a>');
$tag.addClass('tagElem');
$tag.css({
backgroundColor: cur.colour,
color: getTextColour(cur.colour.substring(1,6))
});
$tag.text(cur.name)
$toRet.append($tag);
break;
case 'mispObject':
$toRet.append('MISP Object not supported yet')
break;
default:
break;
}
});
$toRet = $toRet[0].outerHTML;
} else if (data === undefined) {
$toRet = '';
} else {
var textToAddArray = data.split(char_separator);
$toRet = '';
textToAddArray.forEach(function(e, i) {
if (i > 0) {
$toRet += '<br>' + e;
} else {
$toRet += e;
}
});
}
return $toRet;
},
}
],
};
this.DOMTable = $('<table class="table table-striped table-bordered" style="width:100%"></table>');
this._options.container.append(this.DOMTable);
this.origTableOptions.columns = [];
var that = this;
this._options.tableHeader.forEach(function(field) {
var th = $('<th>'+field+'</th>');
that.origTableOptions.columns.push({ title: field });
});
this.dt = this.DOMTable.DataTable(this.origTableOptions);
this.fetch_predata();
// add status led
this._ev_timer = null;
this._ev_retry_frequency = this._options.pollingFrequency; // sec
this._cur_ev_retry_count = 0;
this._ev_retry_count_thres = 3;
var led_container = $('<div class="led-container" style="margin-left: 10px;"></div>');
var led = $('<div class="led-small led_red"></div>');
this.statusLed = led;
led_container.append(led);
var header = this._options.container.parent().parent().find('.panel-heading');
if (header.length > 0) { // add in panel header
header.append(led_container);
} else { // add over the map
led.css('display', 'inline-block');
led_container.append($('<span>Status</span>')).css('float', 'left');
$('.dt-toolbar-led').append(led_container)
}
this.data_source = undefined;
this.connect_to_data_source();
};
Livelog.prototype = {
constructor: Livelog,
validateOptions: function(options) {
var o = options;
if (o.endpoint === undefined || typeof o.endpoint != 'string') {
throw "Livelog must have a valid endpoint";
}
if (o.container === undefined) {
throw "Livelog must have a container";
} else {
o.container = o.container instanceof jQuery ? o.container : $('#'+o.container);
}
// pre-data is either the data to be shown or an URL from which the data should be taken from
if (Array.isArray(o.preData)){
o.preDataURL = null;
o.preData = o.preData;
} else if (o.preData !== undefined) { // should fetch
o.preDataURL = o.preData;
o.preData = [];
}
if (o.tableHeader === undefined || !Array.isArray(o.tableHeader)) {
throw "Livelog must have a valid header";
}
if (o.tableMaxEntries !== undefined) {
o.tableMaxEntries = parseInt(o.tableMaxEntries);
}
},
fetch_predata: function() {
var that = this;
if (this._options.preDataURL !== null) {
$.when(
$.ajax({
dataType: "json",
url: this._options.preDataURL,
data: this._options.additionalOptions,
success: function(data) {
that._options.preData = data;
},
error: function(jqXHR, textStatus, errorThrown) {
console.log(textStatus);
that._options.preData = [];
}
})
).then(
function() { // success
// add data to the widget
that._options.preData.forEach(function(j) {
var name = j.name,
zmqName = j.zmqName,
entry = j.log;
updateLogTable(name, entry, zmqName, true);
switch (name) {
case 'Attribute':
that.add_entry(entry);
break;
case 'ObjectAttribute':
that.add_entry(entry, true);
break;
default:
break;
}
});
}, function() { // fail
}
);
}
},
connect_to_data_source: function() {
var that = this;
if (!this.data_source) {
// var url_param = $.param( this.additionalOptions );
this.data_source = new EventSource(this._options.endpoint);
this.data_source.onmessage = function(event) {
var json = jQuery.parseJSON( event.data );
var name = json.name,
zmqName = json.zmqName,
entry = json.log;
updateLogTable(name, entry, zmqName);
switch (name) {
case 'Attribute':
that.add_entry(entry);
break;
case 'ObjectAttribute':
that.add_entry(entry, true);
break;
default:
break;
}
};
this.data_source.onopen = function(){
that._cur_ev_retry_count = 0;
that.update_connection_state('connected');
};
this.data_source.onerror = function(){
if (that.data_source.readyState == 0) { // reconnecting
that.update_connection_state('connecting');
} else if (that.data_source.readyState == 2) { // closed, reconnect with new object
that.reconnection_logique();
} else {
that.update_connection_state('not connected');
that.reconnection_logique();
}
};
}
},
reconnection_logique: function () {
var that = this;
if (that.data_source) {
that.data_source.close();
that.data_source = null;
}
if (that._ev_timer) {
clearTimeout(that._ev_timer);
}
if(that._cur_ev_retry_count >= that._ev_retry_count_thres) {
that.update_connection_state('not connected');
} else {
that._cur_ev_retry_count++;
that.update_connection_state('connecting');
}
that._ev_timer = setTimeout(function () { that.connect_to_data_source(); }, that._ev_retry_frequency*1000);
},
reconnect: function() {
if (this.data_source) {
this.data_source.close();
this.data_source = null;
this._cur_ev_retry_count = 0;
this.update_connection_state('reconnecting');
this.connect_to_data_source();
}
},
update_connection_state: function(connectionState) {
this.connectionState = connectionState;
this.updateDOMState(this.statusLed, connectionState);
},
updateDOMState: function(led, state) {
switch (state) {
case 'connected':
led.removeClass("led_red");
led.removeClass("led_orange");
led.addClass("led_green");
break;
case 'not connected':
led.removeClass("led_green");
led.removeClass("led_orange");
led.addClass("led_red");
break;
case 'connecting':
led.removeClass("led_green");
led.removeClass("led_red");
led.addClass("led_orange");
break;
default:
led.removeClass("led_green");
led.removeClass("led_orange");
led.addClass("led_red");
}
},
add_entry: function(entry, isObjectAttribute) {
var rowNode = this.dt.row.add(entry).draw().node();
if (this._options.animate) {
$( rowNode )
.css( 'background-color', '#5cb85c !important' )
.animate( { 'background-color': '' }, { duration: 1500 } );
}
if (isObjectAttribute === true) {
$( rowNode ).children().last()
.css('position', 'relative')
.append(
$('<i class="fa fa-th rowTableIsObject" title="This attribute belongs to an Object"></i>')
);
}
// remove entries
var numRows = this.dt.rows().count();
var rowsToRemove = numRows - this._options.tableMaxEntries;
if (rowsToRemove > 0 && this._options.tableMaxEntries != -1) {
//get row indexes as an array
var arraySlice = this.dt.rows().indexes().toArray();
//get row indexes to remove starting at row 0
arraySlice = arraySlice.slice(-rowsToRemove);
//remove the rows and redraw the table
this.dt.rows(arraySlice).remove().draw();
}
}
};
$.livelog = Livelog;
$.fn.livelog = function(option) {
var pickerArgs = arguments;
return this.each(function() {
var $this = $(this),
inst = $this.data('livelog'),
options = ((typeof option === 'object') ? option : {});
if ((!inst) && (typeof option !== 'string')) {
$this.data('livelog', new Livelog(this, options));
} else {
if (typeof option === 'string') {
inst[option].apply(inst, Array.prototype.slice.call(pickerArgs, 1));
}
}
});
};
$.fn.livelog.constructor = Livelog;
}));
/* Live log filter */
function recursiveInject(result, rules, isNot) {
if (rules.rules === undefined) { // add to result
var field = rules.field;
var value = rules.value;
var operator_notequal = rules.operator === 'not_equal';
var negate = isNot ^ operator_notequal;
value = negate ? '!' + value : value;
if (result.hasOwnProperty(field)) {
if (Array.isArray(result[field])) {
result[field].push(value);
} else {
result[field] = [result[field], value];
}
} else {
result[field] = value;
}
}
else if (Array.isArray(rules.rules)) {
rules.rules.forEach(function(subrules) {
recursiveInject(result, subrules, isNot ^ rules.not) ;
});
}
}
function cleanRules(rules) {
var res = {};
recursiveInject(res, rules);
// clean up invalid and unset
Object.keys(res).forEach(function(k) {
var v = res[k];
if (v === undefined || v === '') {
delete res[k];
}
});
return res;
}
$(document).ready(function() {
var qbOptions = {
plugins: {
'filter-description' : {
mode: 'inline'
},
'unique-filter': null,
'bt-tooltip-errors': null,
},
allow_empty: true,
filters: [],
rules: {
condition: 'AND',
not: false,
rules: [],
flags: {
no_add_group: true,
condition_readonly: true,
}
},
icons: {
add_group: 'fa fa-plus-square',
add_rule: 'fa fa-plus-circle',
remove_group: 'fa fa-minus-square',
remove_rule: 'fa fa-minus-circle',
error: 'fa fa-exclamation-triangle'
}
};
// add filters and rules
[
'Attribute.category',
'Attribute.comment',
'Attribute.deleted',
'Attribute.disable_correlation',
'Attribute.distribution',
'Attribute.event_id',
'Attribute.id',
'Attribute.object_id',
'Attribute.object_relation',
'Attribute.sharing_group_id',
'Attribute.Tag.name',
'Attribute.timestamp',
'Attribute.to_ids',
'Attribute.type',
'Attribute.uuid',
'Attribute.value',
'Event.Org',
'Event.Orgc',
'Event.analysis',
'Event.attribute_count',
'Event.date',
'Event.disable_correlation',
'Event.distribution',
'Event.event_creator_email',
'Event.extends_uuid',
'Event.id',
'Event.info',
'Event.locked',
'Event.org_id',
'Event.orgc_id',
'Event.proposal_email_lock',
'Event.publish_timestamp',
'Event.published',
'Event.sharing_group_id',
'Event.threat_level_id',
'Event.Tag.name',
'Event.timestamp',
'Event.uuid',
'Org.id',
'Org.name',
'Org.uuid',
'Orgc.id',
'Orgc.name',
'Orgc.uuid'
].forEach(function(field) {
var tempFilter = {
"input": "text",
"type": "string",
"operators": [
"equal",
"not_equal"
],
"unique": true,
"id": field,
"label": field,
"description": "Perfom strict equality on " + field,
"validation": {
"allow_empty_value": true
}
};
qbOptions.filters.push(tempFilter);
});
var filterCookie = getCookie('filters');
var filters = JSON.parse(filterCookie !== undefined && filterCookie !== '' ? filterCookie : "{}");
var activeFilters = Object.keys(filters);
var tempRule = [];
activeFilters.forEach(function(field) {
var v = filters[field];
var tmp = {
field: field,
id: field,
value: v
};
tempRule.push(tmp);
});
qbOptions.rules.rules = tempRule;
updateFilterButton(activeFilters);
var $ev = $('#filteringQB');
var querybuilderTool = $ev.queryBuilder(qbOptions);
querybuilderTool = querybuilderTool[0].queryBuilder;
$('#saveFilters').click(function() {
var rules = querybuilderTool.getRules({ skip_empty: true, allow_invalid: true });
var result = {};
recursiveInject(result, rules, false);
updateFilterButton(Object.keys(result));
var jres = JSON.stringify(result, null);
document.cookie = 'filters=' + jres;
$('#modalFilters').modal('hide');
livelog.dt
.clear()
.draw();
livelog.fetch_predata();
livelog.reconnect();
});
$('#log-fullscreen').click(function() {
var $this = $(this);
var $panel = $('#panelLogTable');
var isfullscreen = $this.data('isfullscreen');
if (isfullscreen === undefined || !isfullscreen) {
$panel.detach().prependTo('#page-wrapper')
$panel.addClass('liveLogFullScreen');
$this.data('isfullscreen', true);
} else {
$panel.detach().appendTo('#rightCol')
$panel.removeClass('liveLogFullScreen');
$this.data('isfullscreen', false);
}
});
});
function updateFilterButton(activeFilters) {
if (activeFilters.length > 0) {
$('#log-filter').removeClass('btn-default');
$('#log-filter').addClass('btn-success');
} else {
$('#log-filter').removeClass('btn-success');
$('#log-filter').addClass('btn-default');
}
}
function getCookie(cname) {
var name = cname + "=";
var decodedCookie = decodeURIComponent(document.cookie);
var ca = decodedCookie.split(';');
for (var i = 0; i < ca.length; i++) {
var c = ca[i];
while (c.charAt(0) == ' ') {
c = c.substring(1);
}
if (c.indexOf(name) == 0) {
return c.substring(name.length, c.length);
}
}
return "";
}
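For reference, the rule flattening performed by `recursiveInject`/`cleanRules` above can be exercised outside the browser. The sketch below is an illustrative Python port, not code from this repository; the sample `rules` object mimics jQuery QueryBuilder's export format, and the field names are taken from the filter list above.

```python
# Illustrative Python port of the JS recursiveInject/cleanRules above.
def recursive_inject(result, rules, is_not=False):
    if 'rules' not in rules:  # leaf: a single field/operator/value condition
        field, value = rules['field'], rules['value']
        # A value is negated when group-level NOT and not_equal disagree (XOR).
        negate = is_not ^ (rules.get('operator') == 'not_equal')
        value = '!' + value if negate else value
        if field in result:  # same field twice -> collect into a list
            existing = result[field]
            result[field] = existing + [value] if isinstance(existing, list) else [existing, value]
        else:
            result[field] = value
    else:  # group: recurse, folding the group's NOT flag into is_not
        for sub in rules['rules']:
            recursive_inject(result, sub, is_not ^ rules.get('not', False))

def clean_rules(rules):
    res = {}
    recursive_inject(res, rules)
    # drop unset/empty values, as the JS cleanup does
    return {k: v for k, v in res.items() if v not in (None, '')}

rules = {'condition': 'AND', 'not': False, 'rules': [
    {'field': 'Attribute.type', 'operator': 'equal', 'value': 'ip-src'},
    {'condition': 'AND', 'not': True, 'rules': [
        {'field': 'Event.Org', 'operator': 'equal', 'value': 'CIRCL'},
    ]},
]}
print(clean_rules(rules))  # {'Attribute.type': 'ip-src', 'Event.Org': '!CIRCL'}
```

The flattened map is what ends up in the `filters` cookie and is replayed into `qbOptions.rules.rules` on page load.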


@@ -23,7 +23,7 @@ class MapEvent {
         this.specifName = json.specifName;
         this.cityName = json.cityName;
         this.text = this.categ + ": " + this.value;
         let underText = "";
         if (this.specifName !== null && this.cityName !== null) {
             underText = this.specifName+", "+this.cityName;
         } else if (this.specifName !== null) {
@@ -225,6 +225,7 @@ function connect_source_map() {
     };
     source_map.onerror = function(){
         console.log('error: '+source_map.readyState);
+        source_map.close()
         setTimeout(function() { connect_source_map(); }, 5000);
     };
 }

File diff suppressed because it is too large.

static/js/query-builder.js: new file, 6202 lines (diff suppressed because it is too large)


@@ -145,7 +145,7 @@ function getTextColour(rgb) {
     }
 }
-// If json (from tag), only retreive the name> otherwise return the supplied arg.
+// If json (from tag), only retrieve the name> otherwise return the supplied arg.
 function getOnlyName(potentialJson) {
     try {
         jsonLabel = JSON.parse(potentialJson);


@@ -79,15 +79,21 @@ function updateDatePunch(ignore1, igonre2, org) { //date picker sets ( String da
             punchcardWidget.refresh();
             highlight_punchDay();
         } else {
-            punchcardWidget = $('#punchcard').punchcard({
-                data: data,
-                singular: 'login',
-                plural: 'logins',
-                timezones: ['local'],
-                timezoneIndex:0
-            });
-            punchcardWidget = punchcardWidget.data("plugin_" + "punchcard");
-            highlight_punchDay();
+            var data_max = Math.max.apply(Math, data.flat());
+            if (data_max === 0) { // no data, MISP's audit notification could be disabled
+                $('#punchcard').text('No login or MISP\'s audit notification is disabled.');
+            } else {
+                $('#punchcard').empty();
+                punchcardWidget = $('#punchcard').punchcard({
+                    data: data,
+                    singular: 'login',
+                    plural: 'logins',
+                    timezones: ['local'],
+                    timezoneIndex:0
+                });
+                punchcardWidget = punchcardWidget.data("plugin_" + "punchcard");
+                highlight_punchDay();
+            }
         }
     });
 }


@@ -24,9 +24,28 @@
     <script src="{{ url_for('static', filename='js/jquery.flot.js') }}"></script>
     <script src="{{ url_for('static', filename='js/jquery.flot.pie.min.js') }}"></script>
     <script src="{{ url_for('static', filename='js/jquery.flot.resize.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/jquery-ui.min.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/jquery-ui.min.css') }}" type="text/css" rel="stylesheet">
     <!-- Bootstrap Core JavaScript -->
     <script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery.dataTables.min.css') }}">
+    <script src="{{ url_for('static', filename='js/jquery.dataTables.min.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/font-awesome.min.css') }}" type="text/css" rel="stylesheet">
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery-jvectormap-2.0.3.css') }}" type="text/css" media="screen"/>
+    <script src="{{ url_for('static', filename='js/jquery-jvectormap-2.0.3.min.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/jquery-jvectormap-world-mill.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/doT.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/extendext.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/moment-with-locales.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/query-builder.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/query-builder.default.css') }}" type="text/css" rel="stylesheet">
 </head>
 <style>
@@ -42,8 +61,9 @@
     font-size: 12px;
     font-weight: bold;
     line-height: 14px;
-    border-bottom-left-radius: 3px;
+    border-radius: 3px;
     box-shadow: 3px 3px 3px #888888;
+    margin: 2px;
 }
 table {
@@ -123,6 +143,63 @@ small {
     font-weight: bold;
 }
+.led_green {
+    background-color: #ABFF00;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #89FF00 0 0px 6px;
+}
+.led_red {
+    background-color: #F82222;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #FF0303 0 0px 6px;
+}
+.led_orange {
+    background-color: #FFB400;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #FF9000 0 0px 6px;
+}
+.led-small {
+    margin: auto auto;
+    margin-top: 6px;
+    width: 12px;
+    height: 12px;
+    border-radius: 50%;
+}
+.led-container {
+    text-align: center;
+    display: inline-block;
+}
+.led-container > span {
+    margin: auto 5px;
+}
+div.dataTables_scrollHead table.dataTable {
+    margin-top: 0px !important;
+}
+.dataTables_scrollBody thead tr {
+    visibility: collapse !important;
+}
+.liveLogFullScreen {
+    position: absolute !important;
+    top: 66px !important;
+    left: 15px !important;
+    right: 10px !important;
+    z-index: 1001 !important;
+    bottom: -7px !important;
+    height: unset !important;
+}
+.rowTableIsObject {
+    position: absolute;
+    right: 15px;
+    top: 0px;
+    color: #3465a4;
+}
 </style>
 <body>
@@ -198,7 +275,7 @@ small {
     </div>
     <!-- /.col-lg-6 -->
     <!-- /.col-lg-6 -->
-    <div class="col-lg-{{ size_dashboard_width[1] }}">
+    <div id="rightCol" class="col-lg-{{ size_dashboard_width[1] }}">
         <div class="panel panel-default" style="margin-top: 15px; height: {{ pannelSize[2] }}vh;">
             <div id="panelbody" class="panel-body" style="height: 100%;">
@@ -212,23 +289,12 @@
         <div id="panelLogTable" class="panel panel-default" style="height: {{ pannelSize[3] }}vh;">
             <div class="panel-heading">
                 <i class="fa fa-tasks fa-fw"></i> Logs
-                <div class="pull-right">
-                    <input id="checkbox_log_info" type="checkbox" value="info"> INFO
-                    <input id="checkbox_log_warning" type="checkbox" value="warning" checked="true"> WARNING
-                    <input id="checkbox_log_critical" type="checkbox" value="critical" checked="true"> CRITICAL
+                <div style="display: inline-block; float: right;">
+                    <button id="log-filter" data-toggle="modal" data-target="#modalFilters" class="btn btn-xs btn-default"><i class="fa fa-filter"></i></button>
+                    <button id="log-fullscreen" class="btn btn-xs btn-default"><i class="fa fa-expand"></i></button>
                 </div>
             </div>
-            <div id="divLogTable" class="panel-body" style="height: 98%; padding: 0px;">
-                <div class="row" style="height: 100%;">
-                    <div class="col-lg-12" style="height: 100%;">
-                        <table class="table table-bordered table-hover table-striped" id="table_log">
-                            <thead id="table_log_head">
-                            </thead>
-                            <tbody id="table_log_body">
-                            </tbody>
-                        </table>
-                    </div>
-                </div>
+            <div id="divLogTable" class="panel-body" style="height: calc(100% - 46px); padding: 0px; overflow: hidden">
             </div>
         </div>
@@ -254,6 +320,25 @@
     </div>
     <!-- /#wrapper -->
+    <!-- Modal -->
+    <div class="modal fade" id="modalFilters" tabindex="-1" role="dialog" aria-labelledby="myModalLabel">
+        <div class="modal-dialog modal-lg" role="document">
+            <div class="modal-content">
+                <div class="modal-header">
+                    <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button>
+                    <h4 class="modal-title" id="myModalLabel">Log filtering rules</h4>
+                </div>
+                <div class="modal-body">
+                    <div id="filteringQB"></div>
+                </div>
+                <div class="modal-footer">
+                    <button type="button" class="btn btn-default" data-dismiss="modal">Close</button>
+                    <button id="saveFilters" type="button" class="btn btn-primary">Save filters</button>
+                </div>
+            </div>
+        </div>
+    </div>
     <!-- Index -->
     <script>
         /* URL */
@@ -299,11 +384,6 @@
 <script src="{{ url_for('static', filename='js/index/index_map.js') }}"></script>
 <script src="{{ url_for('static', filename='js/index/index_pie.js') }}"></script>
-<link href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
-<link rel="stylesheet" href="{{ url_for('static', filename='css/jquery-jvectormap-2.0.3.css') }}" type="text/css" media="screen"/>
-<script src="{{ url_for('static', filename='js/jquery-jvectormap-2.0.3.min.js') }}"></script>
-<script src="{{ url_for('static', filename='js/jquery-jvectormap-world-mill.js') }}"></script>
 <script type="text/javascript">
 </script>


@@ -1,8 +1,13 @@
 #!/usr/bin/env python3.5
 import configparser
-import redis
-import sys,os
 import datetime
+import os
+import sys
+
+import redis
+
+from helpers import geo_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +19,6 @@
     port=6260,
     db=1)
 
-from helpers import geo_helper
 geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
 
 categ = 'Network Activity'


@@ -1,8 +1,14 @@
 #!/usr/bin/env python3.5
 import configparser
+import datetime
+import os
+import sys
+import time
+
 import redis
-import sys,os
-import datetime, time
+
+from helpers import trendings_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +20,6 @@
     port=6260,
     db=1)
 
-from helpers import trendings_helper
 trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)


@@ -1,8 +1,14 @@
 #!/usr/bin/env python3.5
 import configparser
+import datetime
+import os
+import sys
+import time
+
 import redis
-import sys,os
-import datetime, time
+
+from helpers import users_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +20,6 @@
     port=6260,
     db=1)
 
-from helpers import users_helper
 users_helper = users_helper.Users_helper(serv_redis_db, cfg)

updates.py (new file, 79 lines)

@@ -0,0 +1,79 @@
import redis
import os
import configparser
import logging
DATABASE_VERSION = [
1
]
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
serv_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db'))
serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db'))
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
# logger
logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'update_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
handler = logging.FileHandler(logPath)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
update_logger = logging.getLogger(__name__)
update_logger.setLevel(logging.INFO)
update_logger.addHandler(handler)
def check_for_updates():
db_version = serv_redis_db.get(cfg.get('RedisDB', 'dbVersion'))
db_version = int(db_version) if db_version is not None else 0
updates_to_be_done = find_updates(db_version)
if len(updates_to_be_done) == 0:
update_logger.info('database up-to-date')
else:
for i in updates_to_be_done:
exec_updates(i)
def find_updates(db_version):
updates_to_be_done = []
for i in DATABASE_VERSION:
if db_version < i:
updates_to_be_done.append(i)
return updates_to_be_done
def exec_updates(db_version):
result = False
if db_version == 1:
result = apply_update_1()
if result:
serv_redis_db.set(cfg.get('RedisDB', 'dbVersion'), db_version)
update_logger.warning(f'dbVersion set to {db_version}')
else:
update_logger.error(f'Something went wrong. {result}')
# Data format changed. Wipe the key.
def apply_update_1():
serv_redis_db.delete('TEMP_CACHE_LIVE:Attribute')
log_text = 'Executed update 1. Deleted Redis key `TEMP_CACHE_LIVE:Attribute`'
print(log_text)
update_logger.info(log_text)
return True
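The migration mechanism above is intentionally linear: `DATABASE_VERSION` enumerates known schema versions, and every version strictly greater than the stored `dbVersion` is applied in ascending order. A minimal standalone sketch of that selection step, with no Redis involved; the extra version numbers are hypothetical, the repository currently lists only version 1:

```python
# Standalone sketch of the version selection used by updates.py.
DATABASE_VERSION = [1, 2, 3]  # hypothetical future versions appended to the list

def find_updates(db_version):
    # Every known version strictly newer than the stored one must be applied, in order.
    return [v for v in DATABASE_VERSION if db_version < v]

print(find_updates(0))  # fresh database (no dbVersion key): apply everything -> [1, 2, 3]
print(find_updates(3))  # up to date: nothing to do -> []
```

Because each `apply_update_N` persists its version number only on success, a failed migration is retried on the next start instead of being silently skipped.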

util.py (26 lines changed)

@@ -1,5 +1,6 @@
+import datetime
+import time
 from collections import defaultdict
-import datetime, time
 
 ONE_DAY = 60*60*24
@@ -7,7 +8,7 @@ def getZrange(serv_redis_db, keyCateg, date, topNum, endSubkey=""):
     date_str = getDateStrFormat(date)
     keyname = "{}:{}{}".format(keyCateg, date_str, endSubkey)
     data = serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-    data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+    data = [ [record[0], record[1]] for record in data ]
     return data
 
 def noSpaceLower(text):
@@ -17,7 +18,7 @@ def push_to_redis_zset(serv_redis_db, mainKey, toAdd, endSubkey="", count=1):
     now = datetime.datetime.now()
     today_str = getDateStrFormat(now)
     keyname = "{}:{}{}".format(mainKey, today_str, endSubkey)
-    serv_redis_db.zincrby(keyname, toAdd, count)
+    serv_redis_db.zincrby(keyname, count, toAdd)
 
 def getMonthSpan(date):
     ds = datetime.datetime(date.year, date.month, 1)
@@ -102,3 +103,22 @@ def sortByTrendingScore(toSort, topNum=5):
             topArray.append(dailyCombi)
     return topArray
+
+def getFields(obj, fields):
+    jsonWalker = fields.split('.')
+    itemToExplore = obj
+    lastName = ""
+    try:
+        for i in jsonWalker:
+            itemToExplore = itemToExplore[i]
+            lastName = i
+            if type(itemToExplore) is list:
+                return {'name': lastName, 'data': itemToExplore}
+            else:
+                if i == 'timestamp':
+                    itemToExplore = datetime.datetime.utcfromtimestamp(
+                        int(itemToExplore)).strftime('%Y-%m-%d %H:%M:%S')
+        return itemToExplore
+    except KeyError as e:
+        return None
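`getFields` walks a nested structure along a dotted path, stopping early with a `{'name', 'data'}` wrapper when it reaches a list and formatting `timestamp` leaves as UTC strings. A condensed, self-contained copy exercises the three behaviours; the sample event data here is invented for illustration:

```python
import datetime

# Condensed copy of util.getFields above, for a standalone demo.
def getFields(obj, fields):
    item = obj
    last = ""
    try:
        for key in fields.split('.'):
            item = item[key]
            last = key
            if isinstance(item, list):
                # lists are returned wrapped with the name of the last key walked
                return {'name': last, 'data': item}
            if key == 'timestamp':
                # timestamps are rendered as UTC date strings
                item = datetime.datetime.utcfromtimestamp(int(item)).strftime('%Y-%m-%d %H:%M:%S')
        return item
    except KeyError:
        return None  # missing path segment

event = {'Attribute': {'value': '8.8.8.8', 'timestamp': '0',
                       'Tag': [{'name': 'tlp:white'}]}}  # invented sample
print(getFields(event, 'Attribute.value'))      # 8.8.8.8
print(getFields(event, 'Attribute.timestamp'))  # 1970-01-01 00:00:00
print(getFields(event, 'Attribute.Tag'))        # {'name': 'Tag', 'data': [{'name': 'tlp:white'}]}
print(getFields(event, 'Attribute.missing'))    # None
```

Note that the moved copy returns `None` on a missing key, whereas the old copy removed from zmq_dispatcher.py returned an empty string.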


@ -1,34 +1,39 @@
#!/usr/bin/env python3 #!/usr/bin/env python3
import time, datetime
import copy
import logging
import zmq
import redis
import random
import configparser
import argparse import argparse
import os import configparser
import sys import copy
import datetime
import json import json
import logging
import os
import random
import sys
import time
import redis
import zmq
import util import util
from helpers import geo_helper import updates
from helpers import contributor_helper from helpers import (contributor_helper, geo_helper, live_helper,
from helpers import users_helper trendings_helper, users_helper)
from helpers import trendings_helper
from helpers import live_helper
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg') configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser() cfg = configparser.ConfigParser()
cfg.read(configfile) cfg.read(configfile)
logDir = cfg.get('Log', 'directory') logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'filename') logfilename = cfg.get('Log', 'dispatcher_filename')
logPath = os.path.join(logDir, logfilename) logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir): if not os.path.exists(logDir):
os.makedirs(logDir) os.makedirs(logDir)
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO) try:
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
logger = logging.getLogger('zmq_dispatcher') logger = logging.getLogger('zmq_dispatcher')
LISTNAME = cfg.get('RedisLIST', 'listName') LISTNAME = cfg.get('RedisLIST', 'listName')
@ -36,15 +41,18 @@ LISTNAME = cfg.get('RedisLIST', 'listName')
serv_log = redis.StrictRedis( serv_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'), host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'), port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db')) db=cfg.getint('RedisLog', 'db'),
decode_responses=True)
serv_redis_db = redis.StrictRedis( serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'), host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'), port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db')) db=cfg.getint('RedisDB', 'db'),
decode_responses=True)
serv_list = redis.StrictRedis( serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'), host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'), port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db')) db=cfg.getint('RedisLIST', 'db'),
decode_responses=True)
live_helper = live_helper.Live_helper(serv_redis_db, cfg) live_helper = live_helper.Live_helper(serv_redis_db, cfg)
geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg) geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
@ -53,23 +61,6 @@ users_helper = users_helper.Users_helper(serv_redis_db, cfg)
trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg) trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)
def getFields(obj, fields):
jsonWalker = fields.split('.')
itemToExplore = obj
lastName = ""
try:
for i in jsonWalker:
itemToExplore = itemToExplore[i]
lastName = i
if type(itemToExplore) is list:
return { 'name': lastName , 'data': itemToExplore }
else:
if i == 'timestamp':
itemToExplore = datetime.datetime.utcfromtimestamp(int(itemToExplore)).strftime('%Y-%m-%d %H:%M:%S')
return itemToExplore
except KeyError as e:
return ""
############## ##############
## HANDLERS ## ## HANDLERS ##
############## ##############
@ -139,7 +130,16 @@ def handler_conversation(zmq_name, jsonevent):
def handler_object(zmq_name, jsondata): def handler_object(zmq_name, jsondata):
logger.info('Handling object') logger.info('Handling object')
return # check if jsonattr is an mispObject object
if 'Object' in jsondata:
jsonobj = jsondata['Object']
soleObject = copy.deepcopy(jsonobj)
del soleObject['Attribute']
for jsonattr in jsonobj['Attribute']:
jsonattrcpy = copy.deepcopy(jsonobj)
jsonattrcpy['Event'] = jsondata['Event']
jsonattrcpy['Attribute'] = jsonattr
handler_attribute(zmq_name, jsonattrcpy, False, parentObject=soleObject)
def handler_sighting(zmq_name, jsondata): def handler_sighting(zmq_name, jsondata):
logger.info('Handling sighting') logger.info('Handling sighting')
@ -167,11 +167,8 @@ def handler_event(zmq_name, jsonobj):
timestamp = jsonevent['timestamp'] timestamp = jsonevent['timestamp']
trendings_helper.addTrendingEvent(eventName, timestamp) trendings_helper.addTrendingEvent(eventName, timestamp)
tags = [] tags = []
for tag in jsonobj.get('EventTag', []): for tag in jsonevent.get('Tag', []):
try: tags.append(tag)
tags.append(tag['Tag'])
except KeyError:
pass
trendings_helper.addTrendingTags(tags, timestamp) trendings_helper.addTrendingTags(tags, timestamp)
#redirect to handler_attribute #redirect to handler_attribute
@ -185,6 +182,16 @@ def handler_event(zmq_name, jsonobj):
else: else:
handler_attribute(zmq_name, attributes) handler_attribute(zmq_name, attributes)
if 'Object' in jsonevent:
objects = jsonevent['Object']
if type(objects) is list:
for obj in objects:
jsoncopy = copy.deepcopy(jsonobj)
jsoncopy['Object'] = obj
handler_object(zmq_name, jsoncopy)
else:
handler_object(zmq_name, objects)
action = jsonobj.get('action', None) action = jsonobj.get('action', None)
eventLabeled = len(jsonobj.get('EventTag', [])) > 0 eventLabeled = len(jsonobj.get('EventTag', [])) > 0
org = jsonobj.get('Orgc', {}).get('name', None) org = jsonobj.get('Orgc', {}).get('name', None)
@ -196,11 +203,15 @@ def handler_event(zmq_name, jsonobj):
action, action,
isLabeled=eventLabeled) isLabeled=eventLabeled)
def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False): def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False, parentObject=False):
logger.info('Handling attribute') logger.info('Handling attribute')
# check if jsonattr is an attribute object # check if jsonattr is an attribute object
if 'Attribute' in jsonobj: if 'Attribute' in jsonobj:
jsonattr = jsonobj['Attribute'] jsonattr = jsonobj['Attribute']
else:
jsonattr = jsonobj
attributeType = 'Attribute' if jsonattr['object_id'] == '0' else 'ObjectAttribute'
#Add trending #Add trending
categName = jsonattr['category'] categName = jsonattr['category']
@ -208,22 +219,9 @@ def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False):
trendings_helper.addTrendingCateg(categName, timestamp) trendings_helper.addTrendingCateg(categName, timestamp)
tags = [] tags = []
for tag in jsonattr.get('Tag', []): for tag in jsonattr.get('Tag', []):
try: tags.append(tag)
tags.append(tag)
except KeyError:
pass
trendings_helper.addTrendingTags(tags, timestamp) trendings_helper.addTrendingTags(tags, timestamp)
to_push = []
for field in json.loads(cfg.get('Dashboard', 'fieldname_order')):
if type(field) is list:
to_join = []
for subField in field:
to_join.append(str(getFields(jsonobj, subField)))
to_add = cfg.get('Dashboard', 'char_separator').join(to_join)
else:
to_add = getFields(jsonobj, field)
to_push.append(to_add)
#try to get coord from ip #try to get coord from ip
if jsonattr['category'] == "Network activity": if jsonattr['category'] == "Network activity":
@ -237,13 +235,19 @@ def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False):
eventLabeled = len(jsonobj.get('EventTag', [])) > 0 eventLabeled = len(jsonobj.get('EventTag', [])) > 0
action = jsonobj.get('action', None) action = jsonobj.get('action', None)
contributor_helper.handleContribution(zmq_name, jsonobj['Event']['Orgc']['name'], contributor_helper.handleContribution(zmq_name, jsonobj['Event']['Orgc']['name'],
'Attribute', attributeType,
jsonattr['category'], jsonattr['category'],
action, action,
isLabeled=eventLabeled) isLabeled=eventLabeled)
# Push to log # Push to log
live_helper.publish_log(zmq_name, 'Attribute', to_push) live_helper.publish_log(zmq_name, attributeType, jsonobj)
def handler_diagnostic_tool(zmq_name, jsonobj):
try:
res = time.time() - float(jsonobj['content'])
except Exception as e:
logger.error(e)
serv_list.set('diagnostic_tool_response', str(res))
############### ###############
## MAIN LOOP ## ## MAIN LOOP ##
@ -259,15 +263,18 @@ def process_log(zmq_name, event):
def main(sleeptime): def main(sleeptime):
updates.check_for_updates()
numMsg = 0 numMsg = 0
while True: while True:
content = serv_list.rpop(LISTNAME) content = serv_list.rpop(LISTNAME)
if content is None: if content is None:
logger.debug('Processed {} message(s) since last sleep.'.format(numMsg)) log_text = 'Processed {} message(s) since last sleep.'.format(numMsg)
logger.info(log_text)
numMsg = 0 numMsg = 0
time.sleep(sleeptime) time.sleep(sleeptime)
continue continue
content = content.decode('utf8') content = content
the_json = json.loads(content) the_json = json.loads(content)
zmqName = the_json['zmq_name'] zmqName = the_json['zmq_name']
content = the_json['content'] content = the_json['content']
@ -287,13 +294,17 @@ dico_action = {
"misp_json_conversation": handler_conversation, "misp_json_conversation": handler_conversation,
"misp_json_object_reference": handler_skip, "misp_json_object_reference": handler_skip,
"misp_json_audit": handler_audit, "misp_json_audit": handler_audit,
"diagnostic_channel": handler_diagnostic_tool
} }
 if __name__ == "__main__":
     parser = argparse.ArgumentParser(description='The ZMQ dispatcher. It pops messages from the redis buffer and redispatches them to the correct handlers.')
-    parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of seconds to wait before checking the redis list size', default=5)
+    parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of seconds to wait before checking the redis list size', default=1)
     args = parser.parse_args()
-    main(args.sleeptime)
+    try:
+        main(args.sleeptime)
+    except (redis.exceptions.ResponseError, KeyboardInterrupt) as error:
+        print(error)
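`main()` above is a polling consumer: `rpop` a message off a Redis list, dispatch it by its `zmq_name`, and sleep when the list runs dry. A stdlib-only sketch of that dispatch loop, with a `deque` standing in for the Redis list (`drain` and its parameters are illustrative names, not part of the codebase):

```python
import json
import time
from collections import deque

def drain(queue, handlers, sleeptime, max_idle=1):
    """Pop JSON messages and dispatch on 'zmq_name'; idle-sleep when empty."""
    processed = 0
    idle = 0
    while idle < max_idle:
        try:
            raw = queue.popleft()           # rpop() equivalent on the stand-in
        except IndexError:                  # queue empty, like rpop() -> None
            idle += 1
            time.sleep(sleeptime)
            continue
        msg = json.loads(raw)
        handler = handlers.get(msg['zmq_name'])
        if handler is not None:
            handler(msg['content'])
        processed += 1
    return processed

seen = []
q = deque([json.dumps({'zmq_name': 'misp_json_audit', 'content': 'x'})])
processed = drain(q, {'misp_json_audit': seen.append}, sleeptime=0)
assert seen == ['x'] and processed == 1
```

The real dispatcher loops forever instead of stopping after an idle poll; the `max_idle` cutoff here only makes the sketch terminate.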
@@ -1,24 +1,31 @@
 #!/usr/bin/env python3
-import time, datetime
-import zmq
-import logging
-import redis
-import configparser
 import argparse
+import configparser
+import datetime
+import json
+import logging
 import os
 import sys
-import json
+import time
+
+import redis
+import zmq
 configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
 cfg = configparser.ConfigParser()
 cfg.read(configfile)
 logDir = cfg.get('Log', 'directory')
-logfilename = cfg.get('Log', 'filename')
+logfilename = cfg.get('Log', 'subscriber_filename')
 logPath = os.path.join(logDir, logfilename)
 if not os.path.exists(logDir):
     os.makedirs(logDir)
-logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+try:
+    logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+except PermissionError as error:
+    print(error)
+    print("Please fix the above and try again.")
+    sys.exit(126)
 logger = logging.getLogger('zmq_subscriber')
 CHANNEL = cfg.get('RedisLog', 'channel')
@@ -27,7 +34,8 @@ LISTNAME = cfg.get('RedisLIST', 'listName')
 serv_list = redis.StrictRedis(
     host=cfg.get('RedisGlobal', 'host'),
     port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisLIST', 'db'))
+    db=cfg.getint('RedisLIST', 'db'),
+    decode_responses=True)
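With `decode_responses=True`, redis-py decodes every reply to `str` (UTF-8 by default), so consumers no longer juggle `bytes` or call `.decode('utf8')` themselves. The effect can be shown without a Redis server (the bytes literal is illustrative):

```python
import json

# redis-py returns bytes by default; decode_responses=True makes the client
# perform this decode itself, so consumers receive str everywhere.
raw_reply = b'{"zmq_name": "diagnostic_channel", "content": "1561112364.0"}'
decoded = raw_reply.decode('utf-8')   # what the client now does internally
msg = json.loads(decoded)
assert isinstance(decoded, str)
assert msg['zmq_name'] == 'diagnostic_channel'
```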
 ###############
@@ -53,6 +61,8 @@ def main(zmqName, zmqurl):
             print(zmqName, content)
     except KeyboardInterrupt:
         return
+    except Exception as e:
+        logger.warning('Error:' + str(e))

 if __name__ == "__main__":
@@ -62,4 +72,7 @@ if __name__ == "__main__":
     parser.add_argument('-u', '--url', required=False, dest='zmqurl', help='The URL to connect to', default="tcp://localhost:50000")
     args = parser.parse_args()
-    main(args.zmqname, args.zmqurl)
+    try:
+        main(args.zmqname, args.zmqurl)
+    except redis.exceptions.ResponseError as error:
+        print(error)