Merge branch 'upstream/main' into megaRefact

megaRefact
Steve Clement 2021-03-31 10:34:58 +09:00
commit 1e3aba57b6
No known key found for this signature in database
GPG Key ID: 69A20F509BE4AEE9
50 changed files with 24659 additions and 572 deletions

.gitignore (vendored, new file, 149 lines)

@@ -0,0 +1,149 @@
# MISP Dashboard ignores
# static files
static/
# virtualenv folder
DASHENV/
# misp-dashboard config file
config/config.cfg
# ignore GeoLite DB
data/GeoLite2-City*
# Created by https://www.gitignore.io/api/python
# Edit at https://www.gitignore.io/?templates=python
### Python ###
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
### Python Patch ###
.venv/
### Python.VirtualEnv Stack ###
# Virtualenv
# http://iamzed.com/2009/05/07/a-primer-on-virtualenv/
[Bb]in
[Ii]nclude
[Ll]ib
[Ll]ib64
[Ll]ocal
[Ss]cripts
pyvenv.cfg
pip-selfcheck.json
# End of https://www.gitignore.io/api/python

README.md

@@ -1,8 +1,70 @@
-# MISP-Dashboard
# misp-dashboard
-An experimental dashboard showing live data and statistics from the ZMQ of one or more MISP instances.
A dashboard showing live data and statistics from the ZMQ feeds of one or more [MISP](https://www.misp-project.org/) instances.
The dashboard can be used as a real-time situational awareness tool to gather threat intelligence information.
The misp-dashboard includes a [gamification](https://en.wikipedia.org/wiki/Gamification#Criticism) tool to show the contributions of each organisation and how they are ranked over time.
The dashboard can be used for SOCs (Security Operation Centers), security teams or during cyber exercises to keep track of what is being processed on your various MISP instances.
# Features
## Live Dashboard
- Possibility to subscribe to multiple ZMQ feeds from different MISP instances
- Shows immediate contributions made by organisations
- Displays live resolvable posted geo-locations
![Dashboard live](./screenshots/dashboard-live.png)
## Geolocalisation Dashboard
- Provides historical geolocalised information to support security teams, CSIRTs or SOCs in finding threats within their constituency
- Possibility to get geospatial information from specific regions
![Dashboard geo](./screenshots/dashboard-geo.png)
## Contributors Dashboard
__Shows__:
- The monthly rank of all organisations
- The last organisation that contributed (dynamic updates)
- The contribution level of all organisations
- Each category of contributions per organisation
- The current ranking of the selected organisation (dynamic updates)
__Includes__:
- [Gamification](https://en.wikipedia.org/wiki/Gamification#Criticism) of the platform:
- Two different levels of ranking with unique icons
- Exclusive obtainable badges for source code contributors and donators
![Dashboard contributors](./screenshots/dashboard-contributors2.png)
![Dashboard contributors2](./screenshots/dashboard-contributors3.png)
## Users Dashboard
- Shows when and how the platform is used:
- Login punchcard and contributions over time
- Contribution vs login
![Dashboard users](./screenshots/dashboard-users.png)
## Trendings Dashboard
- Provides real-time information to support security teams, CSIRTs or SOCs, showing current threats and activity
- Shows most active events, categories and tags
- Shows sightings and discussions over time
![Dashboard users](./screenshots/dashboard-trendings.png)
# Installation
-- Launch ```./install_dependencies.sh``` from the MISP-Dashboard directory
Before installing, note that the only supported systems are open source Unix-like operating systems such as Linux.
1. You will need to [create a free MaxMind account](https://www.maxmind.com/en/geolite2/signup).
2. Set your password and [create a license key](https://www.maxmind.com/en/accounts/current/license-key)
2.1 Make a note of your License Key; it is needed during install (see the example below).
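For reference, this is the kind of download the license key enables; a sketch only, assuming MaxMind's documented download permalink (file names and paths are illustrative):
```bash
# Fetch the GeoLite2 City database with your license key (illustrative paths).
LICENSE_KEY="YOUR_LICENSE_KEY"
curl -fsSL -o GeoLite2-City.tar.gz \
  "https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${LICENSE_KEY}&suffix=tar.gz"
# Unpack only the .mmdb into data/, where the dashboard expects it (see .gitignore above).
tar xzf GeoLite2-City.tar.gz --wildcards --strip-components=1 -C data/ '*.mmdb'
```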
- Launch ```./install_dependencies.sh``` from the MISP-Dashboard directory ([idempotent-ish](https://en.wikipedia.org/wiki/Idempotence))
- Update the configuration file ```config.cfg``` so that it matches your system
- Fields that you may change:
    - RedisGlobal -> host
@@ -13,7 +75,7 @@ An experimental dashboard showing live data and statistics from the ZMQ of one o
# Updating by pulling
- Re-launch ```./install_dependencies.sh``` to fetch new required dependencies
-- Re-update your configuration file ```config.cfg```
- Re-update your configuration file ```config.cfg``` by checking ```config.cfg.default``` for new or changed fields
:warning: Make sure no zmq python3 scripts are running. They block the update.
@@ -35,9 +97,10 @@ Traceback (most recent call last):
with open(dst, 'wb') as fdst:
OSError: [Errno 26] Text file busy: '/home/steve/code/misp-dashboard/DASHENV/bin/python3'
```
- Restart the System: `./start_all.sh` **OR** `./start_zmq.sh` and `./server.py &`
# Starting the System
-:warning: You do not need to run it as root. Normal privileges are fine.
:warning: You should not run it as root. Normal privileges are fine.
- Be sure to have a running redis server
- e.g. ```redis-server --port 6250```
@@ -47,6 +110,12 @@ OSError: [Errno 26] Text file busy: '/home/steve/code/misp-dashboard/DASHENV/bin
- Start the Flask server ```./server.py &```
- Access the interface at ```http://localhost:8001/```
__Alternatively__, you can run the ```start_all.sh``` script to run the commands described above.
# Authentication
Authentication can be enabled in ``config/config.cfg`` by setting ``auth_enabled = True``.
Users will be required to log in to MISP and will be allowed to proceed if the *User Setting* ``dashboard_access`` is set to 1 for their MISP user account.
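A minimal ``[Auth]`` sketch, mirroring the defaults added to ``config.cfg.default`` in this commit (``misp_fqdn`` is a placeholder for your own instance):
```
[Auth]
auth_enabled = True
misp_fqdn = https://misp.local
ssl_verify = True
session_secret = **Change_Me**
session_cookie_secure = True
session_cookie_samesite = Strict
permanent_session_lifetime = 1
```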
# Debug
Debug is fun and gives you more details on what is going on when things fail.
@@ -60,59 +129,47 @@ export FLASK_APP=server.py
flask run --host=0.0.0.0 --port=8001 # <- Be careful here, this exposes it on ALL ip addresses. Ideally if run locally --host=127.0.0.1
```
-OR, just toggle the debug flag in start_all.sh script.
OR, just toggle the debug flag in start_all.sh or config.cfg.
Happy hacking ;)
-# Features
-## Live Dashboard
-- Possibility to subscribe to multiple ZMQ feeds
-- Shows direct contribution made by organisations
-- Shows live resolvable posted locations
-![Dashboard live](./screenshots/dashboard-live.png)
-## Geolocalisation Dashboard
-- Provides historical geolocalised information to support security teams, CSIRTs or SOC finding threats in their constituency
-- Possibility to get geospatial information from specific regions
-![Dashbaord geo](./screenshots/dashboard-geo.png)
-## Contributors Dashboard
-__Shows__:
-- The monthly rank of all organisation
-- The last organisation that contributed (dynamic updates)
-- The contribution level of all organisation
-- Each category of contribution per organisation
-- The current ranking of the selected organisation (dynamic updates)
-__Includes__:
-- Gamification of the platform:
-- Two different levels of ranking with unique icons
-- Exclusive obtainable badges for source code contributors and donator
-![Dashboard contributor](./screenshots/dashboard-contributors2.png)
-![Dashboard contributor2](./screenshots/dashboard-contributors3.png)
-## Users Dashboard
-- Shows when and how the platform is used:
-- Login punchcard and overtime
-- Contribution vs login
-![Dashboard users](./screenshots/dashboard-users.png)
-## Trendings Dashboard
-- Provides real time information to support security teams, CSIRTs or SOC showing current threats and activity
-- Shows most active events, categories and tags
-- Shows sightings and discussion overtime
-![Dashboard users](./screenshots/dashboard-trendings.png)
## Restart from scratch
To restart from scratch and empty all data from your dashboard, use the dedicated cleaning script ``clean.py``:
```
usage: clean.py [-h] [-b]

Clean data stored in the redis server specified in the configuration file

optional arguments:
  -h, --help    show this help message and exit
  -b, --brutal  Perform a FLUSHALL on the redis database. If not set, will use
                a soft method to delete only keys used by MISP-Dashboard.
```
## Notes about ZMQ
Because the misp-dashboard is stateless with regard to MISP, it can only process data that it actually receives: if your MISP is not publishing all notifications to its ZMQ, the misp-dashboard will not have them.
The most relevant example is the user login punchcard. If your MISP doesn't have the option ``Plugin.ZeroMQ_audit_notifications_enable`` set to ``true``, the punchcard will be empty.
## Dashboard not showing results - No module named zmq
When the misp-dashboard does not show results, first check whether the zmq module within MISP is properly installed.
In **Administration**, **Plugin Settings**, **ZeroMQ**, check that **Plugin.ZeroMQ_enable** is set to **True**.
Publish a test event from MISP to ZMQ via **Event Actions**, **Publish event to ZMQ**.
Verify the logfiles:
```
${PATH_TO_MISP}/app/tmp/log/mispzmq.error.log
${PATH_TO_MISP}/app/tmp/log/mispzmq.log
```
If there's an error **ModuleNotFoundError: No module named 'zmq'**, install pyzmq:
```
$SUDO_WWW ${PATH_TO_MISP}/venv/bin/pip install pyzmq
```
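If MISP publishes but the dashboard still shows nothing, a minimal standalone subscriber can confirm that the feed is reachable; a sketch only, assuming the default endpoint from ``config.cfg.default``:
```python
#!/usr/bin/env python3
# Print the first message seen on a MISP ZMQ feed (sketch; endpoint is the default).
import zmq

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://localhost:50000")
socket.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to all topics

# Blocks until MISP publishes something, e.g. via "Publish event to ZMQ".
print(socket.recv_string())
```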
# zmq_subscriber options
```
usage: zmq_subscriber.py [-h] [-n ZMQNAME] [-u ZMQURL]
@@ -129,7 +186,7 @@ optional arguments:
# Deploy in production using mod_wsgi
-Install Apache's mod-wsgi for Python3
Install Apache mod-wsgi for Python3
```bash
sudo apt-get install libapache2-mod-wsgi-py3
@@ -144,10 +201,10 @@ The following NEW packages will be installed:
libapache2-mod-wsgi-py3
```
-Configuration file `/etc/apache2/sites-available/misp-dashboard.conf` assumes that `misp-dashboard` is cloned into `var/www/misp-dashboard`. It runs as user `misp` in this example. Change the permissions to folder and files accordingly.
Configuration file `/etc/apache2/sites-available/misp-dashboard.conf` assumes that `misp-dashboard` is cloned into `/var/www/misp-dashboard`. It runs as user `misp` in this example. Change the permissions to your custom folder and files accordingly.
```
-<VirtualHost *:8000>
<VirtualHost *:8001>
ServerAdmin admin@misp.local
ServerName misp.local
@@ -191,33 +248,36 @@ Configuration file `/etc/apache2/sites-available/misp-dashboard.conf` assumes th
```
# License
~~~~
Copyright (C) 2017-2019 CIRCL - Computer Incident Response Center Luxembourg (c/o smile, security made in Lëtzebuerg, Groupement d'Intérêt Economique)
Copyright (c) 2017-2019 Sami Mokaddem

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.

You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
~~~~
Images and logos are handmade for:
- rankingMISPOrg/
- rankingMISPMonthly/
- MISPHonorableIcons/
Note that:
- Part of ```MISPHonorableIcons/1.svg``` comes from [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/2.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/3.svg``` comes from [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/4.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) & [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
- Part of ```MISPHonorableIcons/5.svg``` comes from [Zeptozephyr](https://zeptozephyr.deviantart.com/art/Vectored-Portal-Icons-207347804) & [octicons.github.com](https://octicons.github.com/icon/git-pull-request/) (CC0 - No Rights Reserved)
-```
-Copyright (C) 2017 CIRCL - Computer Incident Response Center Luxembourg (c/o smile, security made in Lëtzebuerg, Groupement d'Intérêt Economique)
-Copyright (c) 2017 Sami Mokaddem
-
-This program is free software: you can redistribute it and/or modify
-it under the terms of the GNU Affero General Public License as published by
-the Free Software Foundation, either version 3 of the License, or
-(at your option) any later version.
-
-This program is distributed in the hope that it will be useful,
-but WITHOUT ANY WARRANTY; without even the implied warranty of
-MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-GNU Affero General Public License for more details.
-
-You should have received a copy of the GNU Affero General Public License
-along with this program. If not, see <http://www.gnu.org/licenses/>.
-```

clean.py (new executable file, 86 lines)

@@ -0,0 +1,86 @@
#!/usr/bin/env python3
import argparse
import configparser
import os
from pprint import pprint
import redis
RED="\033[91m"
GREEN="\033[92m"
DEFAULT="\033[0m"
def clean(brutal=False):
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
host = cfg.get("RedisGlobal", "host")
port = cfg.getint("RedisGlobal", "port")
servers = []
for db in range(0, 4):
servers.append(redis.Redis(host, port, db=db, decode_responses=True))
if brutal:
print(RED+'Brutal mode'+DEFAULT+' selected')
print('[%s:%s] Cleaning data...' % (host, port))
cleanBrutal(servers[0])
else:
print(GREEN+'Soft mode'+DEFAULT+' selected')
print('[%s:%s] Cleaning data...' % (host, port))
cleanSoft(servers)
# Perform a FLUSHALL
def cleanBrutal(server):
server.flushall()
# Delete all keys independently
def cleanSoft(servers):
prefix_keys_per_db = {
0: [], # publish only
1: [], # publish only (maps)
3: ['bufferList'], # queuing
2: [
'GEO_COORD:*',
'GEO_COUNTRY:*',
'GEO_RAD:*',
'CONTRIB_DAY:*',
'CONTRIB_CATEG:*',
'CONTRIB_LAST:*',
'CONTRIB_ALL_ORG',
'CONTRIB_ORG:*',
'CONTRIB_TROPHY:*',
'CONTRIB_LAST_AWARDS:*',
'CONTRIB_ALL_ORG',
'LOGIN_TIMESTAMP:*',
'LOGIN_ORG:*',
'LOGIN_ALL_ORG',
'TRENDINGS_EVENTS:*',
'TRENDINGS_CATEGS:*',
'TRENDINGS_TAGS:*',
'TRENDINGS_DISC:*',
'TRENDINGS_SIGHT_sightings:*',
'TRENDINGS_SIGHT_false_positive:*',
'TEMP_CACHE_LIVE:*',
]
}
for db, keys in prefix_keys_per_db.items():
serv = servers[db]
for k in keys:
# fetch all keys on the db
key_to_del = serv.keys(k)
# delete all existing keys
if len(key_to_del) > 0:
serv.delete(*tuple(key_to_del))
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Clean data stored in the redis server specified in the configuration file')
parser.add_argument("-b", "--brutal", default=False, action="store_true", help="Perfom a FLUSHALL on the redis database. If not set, will use a soft method to delete only keys used by MISP-Dashboard.")
args = parser.parse_args()
clean(args.brutal)
print(GREEN+'Done.'+DEFAULT)
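A typical invocation, assuming the Redis configured in ``config.cfg`` is running on localhost:6250 (transcript is illustrative):
```
$ ./clean.py            # soft mode: delete only the MISP-Dashboard keys listed above
Soft mode selected
[localhost:6250] Cleaning data...
Done.
$ ./clean.py --brutal   # FLUSHALL on the configured Redis server
```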


@@ -34,7 +34,7 @@
# By default Redis does not run as a daemon. Use 'yes' if you need it.
# Note that Redis will write a pid file in /var/run/redis.pid when daemonized.
-daemonize no
daemonize yes
# When running daemonized, Redis writes a pid file in /var/run/redis.pid by
# default. You can specify a custom pid file location here.
@@ -62,6 +62,7 @@ tcp-backlog 511
#
# bind 192.168.1.100 10.0.0.1
# bind 127.0.0.1
bind 127.0.0.1 ::1
# Specify the path for the Unix socket that will be used to listen for
# incoming connections. There is no default, so Redis will not listen
@@ -141,7 +142,7 @@ databases 16
#save 900 1
#save 300 10
-save 60 10000
save 9000 1
# By default Redis will stop accepting writes if RDB snapshots are enabled
# (at least one save point) and the latest background save failed.


@@ -1,6 +1,23 @@
[Server]
host = localhost
port = 8001
debug = False
ssl = False
# If you set SSL to True without a cert/key, an adhoc (self-signed) certificate is created
# ssl_cert = cert.pem
# ssl_key = key.pem
[Auth]
auth_enabled = False
misp_fqdn = https://misp.local
ssl_verify = True
session_secret = **Change_Me**
# Only send cookies with requests over HTTPS if the cookie is marked secure.
session_cookie_secure = True
# Prevent sending cookies in all external requests including regular links.
session_cookie_samesite = Strict
# Expire session cookie after n days.
permanent_session_lifetime = 1
[Dashboard]
#hours
@@ -10,12 +27,13 @@ rotation_wait_time = 30
max_img_rotation = 10
hours_spanned = 48
zoomlevel = 15
maxCacheHistory = 30
# [1->12]
size_dashboard_left_width = 5
size_openStreet_pannel_perc = 55
size_world_pannel_perc = 35
item_to_plot = Attribute.category
-fieldname_order=["Event.id", "Attribute.Tag", "Attribute.category", "Attribute.type", ["Attribute.value", "Attribute.comment"]]
fieldname_order=["Attribute.timestamp", "Event.id", "Attribute.Tag", "Attribute.category", "Attribute.type", ["Attribute.value", "Attribute.comment"]]
char_separator=||
[GEO]
@@ -32,15 +50,29 @@ additional_help_text = ["Sightings multiplies earned points by 2", "Editing an a
[Log]
directory=logs
-filename=logs.log
dispatcher_filename=zmq_dispatcher.log
subscriber_filename=zmq_subscriber.log
helpers_filename=helpers.log
update_filename=updates.log
[RedisGlobal]
host=localhost
port=6250
-#misp_web_url = http://192.168.56.50
-misp_web_url = http://localhost
-#zmq_url=tcp://192.168.56.50:50000
-zmq_url=tcp://localhost:50000
misp_web_url = http://0.0.0.0
misp_instances = [{
"name": "misp1",
"url": "http://localhost",
"zmq": "tcp://localhost:50000"}]
#misp_instances = [{
# "name": "misp1",
# "url": "http://localhost",
# "zmq": "tcp://localhost:50000"},
# {
# "name": "misp2",
# "url": "http://10.0.2.4",
# "zmq": "tcp://10.0.2.4:50000"}
# ]
[RedisLIST]
db=3
@@ -48,6 +80,8 @@ listName=bufferList
[RedisLog]
db=0
streamLogCacheKey = streamLogCache
streamMapCacheKey = streamMapsCache
channel=1
channelLastContributor = lastContributor
channelLastAwards = lastAwards
@@ -61,3 +95,4 @@ path_countrycode_to_coord_JSON=./data/country_code_lat_long.json
[RedisDB]
db=2
dbVersion=db_version


@@ -80,7 +80,7 @@ regularlyDays=7
[TrophyDifficulty]
#represent the % of org that can have this rank. Rank 1 is ignored as only 1 org can have it.
-trophyMapping=[2, 9, 9, 10, 10, 10, 10, 10, 10, 10, 10]
trophyMapping=[2, 9, 9, 10, 10, 16, 16, 10, 10, 4, 4]
[HonorTrophy]
0=No trophy

diagnostic.py (new executable file, 500 lines)

@@ -0,0 +1,500 @@
#!/usr/bin/env python3
import os
import sys
import stat
import time
import signal
import functools
import configparser
from urllib.parse import urlparse, parse_qs
import subprocess
import diagnostic_util
try:
import redis
import zmq
import json
import flask
import requests
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
from halo import Halo
except ModuleNotFoundError as e:
print('Dependency not met. Either not in a virtualenv or dependency not installed.')
print('- Error: {}'.format(e))
sys.exit(1)
'''
Steps:
- check if dependencies exist
- check if virtualenv exists
- check if configuration is up-to-date
- check file permission
- check if redis is running and responding
- check if able to connect to zmq
- check zmq_dispatcher processing queue
- check queue status: being filled up / being filled down
- check if subscriber responding
- check if dispatcher responding
- check if server listening
- check log static endpoint
- check log dynamic endpoint
'''
HOST = 'http://127.0.0.1'
PORT = 8001  # overridden by configuration file
configuration_file = {}
pgrep_subscriber_output = ''
pgrep_dispatcher_output = ''
signal.signal(signal.SIGALRM, diagnostic_util.timeout_handler)
def humanize(name, isResult=False):
words = name.split('_')
if isResult:
words = words[1:]
words[0] = words[0][0].upper() + words[0][1:]
else:
words[0] = words[0][0].upper() + words[0][1:] + 'ing'
return ' '.join(words)
def add_spinner(_func=None, name='dots'):
def decorator_add_spinner(func):
@functools.wraps(func)
def wrapper_add_spinner(*args, **kwargs):
human_func_name = humanize(func.__name__)
human_func_result = humanize(func.__name__, isResult=True)
flag_skip = False
with Halo(text=human_func_name, spinner=name) as spinner:
result = func(spinner, *args, **kwargs)
if isinstance(result, tuple):
status, output = result
elif isinstance(result, list):
status = result[0]
output = result[1]
elif isinstance(result, bool):
status = result
output = None
else:
status = False
flag_skip = True
spinner.fail('{} - Function returned an unexpected result: {}'.format(human_func_name, str(result)))
if not flag_skip:
text = human_func_result
if output is not None and len(output) > 0:
text += ': {}'.format(output)
if isinstance(status, bool) and status:
spinner.succeed(text)
elif isinstance(status, bool) and not status:
spinner.fail(text)
else:
if status == 'info':
spinner.info(text)
else:
spinner.warn(text)
return status
return wrapper_add_spinner
if _func is None:
return decorator_add_spinner
else:
return decorator_add_spinner(_func)
@add_spinner
def check_virtual_environment_and_packages(spinner):
result = os.environ.get('VIRTUAL_ENV')
if result is None:
return (False, 'This diagnostic tool should be started inside a virtual environment.')
else:
if redis.__version__.startswith('2'):
return (False, '''Redis python client has version {}. Version 3.x required.
\t [inside virtualenv] pip3 install -U redis'''.format(redis.__version__))
else:
return (True, '')
@add_spinner
def check_configuration(spinner):
global configuration_file, port
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
configuration_file = cfg
cfg = {s: dict(cfg.items(s)) for s in cfg.sections()}
configfile_default = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg.default')
cfg_default = configparser.ConfigParser()
cfg_default.read(configfile_default)
cfg_default = {s: dict(cfg_default.items(s)) for s in cfg_default.sections()}
# Check if all fields from config.default exists in config
result, faulties = diagnostic_util.dict_compare(cfg_default, cfg)
if result:
port = configuration_file.get("Server", "port")
return (True, '')
else:
return_text = '''Configuration incomplete.
\tUpdate your configuration file `config.cfg`.\n\t Faulty fields:\n'''
for field_name in faulties:
return_text += '\t\t- {}\n'.format(field_name)
return (False, return_text)
@add_spinner(name='dot')
def check_file_permission(spinner):
max_mind_database_path = configuration_file.get('RedisMap', 'pathmaxminddb')
try:
st = os.stat(max_mind_database_path)
except FileNotFoundError:
return (False, 'Maxmind GeoDB - File not found')
all_read_perm = bool(st.st_mode & stat.S_IROTH) # FIXME: permission may be changed
if all_read_perm:
return (True, '')
else:
return (False, 'Maxmind GeoDB might have incorrect read file permission')
@add_spinner
def check_redis(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLog', 'db'))
if redis_server.ping():
return (True, '')
else:
return (False, '''Can\'t reach Redis server.
\t Make sure it is running and adapt your configuration accordingly''')
@add_spinner
def check_zmq(spinner):
timeout = 15
context = zmq.Context()
misp_instances = json.loads(configuration_file.get('RedisGlobal', 'misp_instances'))
instances_status = {}
for misp_instance in misp_instances:
socket = context.socket(zmq.SUB)
socket.connect(misp_instance.get('zmq'))
socket.setsockopt_string(zmq.SUBSCRIBE, '')
poller = zmq.Poller()
flag_skip = False
start_time = time.time()
poller.register(socket, zmq.POLLIN)
for t in range(1, timeout+1):
socks = dict(poller.poll(timeout=1*1000))
if len(socks) > 0:
if socket in socks and socks[socket] == zmq.POLLIN:
rcv_string = socket.recv()
if rcv_string.startswith(b'misp_json'):
instances_status[misp_instance.get('name')] = True
flag_skip = True
break
else:
spinner.text = 'checking zmq of {} - elapsed time: {}s'.format(misp_instance.get("name"), int(time.time() - start_time))
if not flag_skip:
instances_status[misp_instance.get('name')] = False
results = [s for n, s in instances_status.items()]
if all(results):
return (True, '')
elif any(results):
return_text = 'Connection to ZMQ stream(s) failed.\n'
for name, status in instances_status.items():
return_text += '\t{}: {}\n'.format(name, "success" if status else "failed")
return (True, return_text)
else:
return (False, '''Can\'t connect to the ZMQ stream(s).
\t Make sure the MISP ZMQ is running: `/servers/serverSettings/diagnostics`
\t Make sure your network infrastructure allows you to connect to the ZMQ''')
@add_spinner
def check_processes_status(spinner):
global pgrep_subscriber_output, pgrep_dispatcher_output
try:
response = subprocess.check_output(
["pgrep", "-laf", "zmq_"],
universal_newlines=True
)
except subprocess.CalledProcessError as e:
return (False, 'Could not get processes status. Error returned:\n'+str(e))
for line in response.splitlines():
lines = line.split(' ', maxsplit=1)
pid, p_name = lines
if 'zmq_subscriber.py' in p_name:
pgrep_subscriber_output = line
elif 'zmq_dispatcher.py' in p_name:
pgrep_dispatcher_output = line
if len(pgrep_subscriber_output) == 0:
return (False, 'zmq_subscriber is not running')
elif len(pgrep_dispatcher_output) == 0:
return (False, 'zmq_dispatcher is not running')
else:
return (True, 'Both processes are running')
@add_spinner
def check_subscriber_status(spinner):
global pgrep_subscriber_output
pool = redis.ConnectionPool(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'),
decode_responses=True)
monitor = diagnostic_util.Monitor(pool)
commands = monitor.monitor()
start_time = time.time()
signal.alarm(15)
try:
for i, c in enumerate(commands):
if i == 0: # Skip 'OK'
continue
split = c.split()
try:
action = split[3]
target = split[4]
except IndexError:
continue  # malformed MONITOR line: skip it instead of reusing stale action/target values
if action == '"LPUSH"' and target == '\"{}\"'.format(configuration_file.get("RedisLIST", "listName")):
signal.alarm(0)
break
else:
spinner.text = 'Checking subscriber status - elapsed time: {}s'.format(int(time.time() - start_time))
except diagnostic_util.TimeoutException:
return_text = '''zmq_subscriber seems not to be working.
\t Consider restarting it: {}'''.format(pgrep_subscriber_output)
return (False, return_text)
return (True, 'subscriber is running and populating the buffer')
@add_spinner
def check_buffer_queue(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
warning_threshold = 100
elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
return_status = 'warning' if elements_in_list > warning_threshold else ('info' if elements_in_list > 0 else True)
return_text = 'Currently {} items in the buffer'.format(elements_in_list)
return (return_status, return_text)
@add_spinner
def check_buffer_change_rate(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
time_slept = 0
sleep_duration = 0.001
sleep_max = 10.0
refresh_frequency = 1.0
next_refresh = 0
change_increase = 0
change_decrease = 0
elements_in_list_prev = 0
elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
elements_in_inlist_init = elements_in_list
consecutive_no_rate_change = 0
while True:
elements_in_list_prev = elements_in_list
elements_in_list = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
change_increase += elements_in_list - elements_in_list_prev if elements_in_list - elements_in_list_prev > 0 else 0
change_decrease += elements_in_list_prev - elements_in_list if elements_in_list_prev - elements_in_list > 0 else 0
if next_refresh < time_slept:
next_refresh = time_slept + refresh_frequency
change_rate_text = '{}/sec\t{}/sec'.format(change_increase, change_decrease)
spinner.text = 'Buffer: {}\t{}'.format(elements_in_list, change_rate_text)
if consecutive_no_rate_change == 3:
time_slept = sleep_max
if elements_in_list == 0:
consecutive_no_rate_change += 1
else:
consecutive_no_rate_change = 0
change_increase = 0
change_decrease = 0
if time_slept >= sleep_max:
return_flag = elements_in_list == 0 or (elements_in_list < elements_in_inlist_init or elements_in_list < 2)
return_text = 'Buffer is consumed {} than being populated'.format("faster" if return_flag else "slower")
break
time.sleep(sleep_duration)
time_slept += sleep_duration
elements_in_inlist_final = int(redis_server.llen(configuration_file.get('RedisLIST', 'listName')))
return (return_flag, return_text)
@add_spinner
def check_dispatcher_status(spinner):
redis_server = redis.StrictRedis(
host=configuration_file.get('RedisGlobal', 'host'),
port=configuration_file.getint('RedisGlobal', 'port'),
db=configuration_file.getint('RedisLIST', 'db'))
content = {'content': time.time()}
redis_server.rpush(configuration_file.get('RedisLIST', 'listName'),
json.dumps({'zmq_name': 'diagnostic_channel', 'content': 'diagnostic_channel ' + json.dumps(content)})
)
return_flag = False
return_text = ''
time_slept = 0
sleep_duration = 0.2
sleep_max = 10.0
redis_server.delete('diagnostic_tool_response')
while True:
reply = redis_server.get('diagnostic_tool_response')
elements_in_list = redis_server.llen(configuration_file.get('RedisLIST', 'listName'))
if reply is None:
if time_slept >= sleep_max:
return_flag = False
return_text = 'zmq_dispatcher did not respond in the given time ({}s)'.format(int(sleep_max))
if len(pgrep_dispatcher_output) > 0:
return_text += '\n\t➥ Consider restarting it: {}'.format(pgrep_dispatcher_output)
else:
return_text += '\n\t➥ Consider starting it'
break
time.sleep(sleep_duration)
spinner.text = 'Dispatcher status: No response yet'
time_slept += sleep_duration
else:
return_flag = True
return_text = 'Took {:.2f}s to complete'.format(float(reply))
break
return (return_flag, return_text)
@add_spinner
def check_server_listening(spinner):
url = '{}:{}/_get_log_head'.format(HOST, PORT)
spinner.text = 'Trying to connect to {}'.format(url)
try:
r = requests.get(url)
except requests.exceptions.ConnectionError:
return (False, 'Can\'t connect to {}'.format(url))
if '/error_page' in r.url:
o = urlparse(r.url)
query = parse_qs(o.query)
error_code = query.get('error_code', '')
if error_code[0] == '1':
return (False, 'Too many redirects. Server may not be properly configured\n\t➥ Try to correctly set up an HTTPS server or change the cookie policy in the configuration')
else:
error_message = query.get('error_message', '')[0]
return (False, 'Unknown error: {}\n{}'.format(error_code, error_message))
else:
return (
r.status_code == 200,
'{} {}reached. Status code [{}]'.format(url, "not " if r.status_code != 200 else "", r.status_code)
)
@add_spinner
def check_server_dynamic_enpoint(spinner):
payload = {
'username': 'admin@admin.test',
'password': 'Password1234',
'submit': 'Sign In'
}
sleep_max = 15
start_time = time.time()
# Check MISP connectivity
url_misp = configuration_file.get("Auth", "misp_fqdn")
try:
r = requests.get(url_misp, verify=configuration_file.getboolean("Auth", "ssl_verify"))
except requests.exceptions.SSLError as e:
if 'CERTIFICATE_VERIFY_FAILED' in str(e):
return (False, 'SSL connection error: certificate verify failed ({}).\n\t➥ Review your configuration'.format(e))
else:
return (False, 'SSL connection error `{}`.\n\t➥ Review your configuration'.format(e))
except requests.exceptions.ConnectionError:
return (False, 'MISP `{}` cannot be reached.\n\t➥ Review your configuration'.format(url_misp))
url_login = '{}:{}/login'.format(HOST, PORT)
url = '{}:{}/_logs'.format(HOST, PORT)
session = requests.Session()
session.verify = configuration_file.getboolean("Auth", "ssl_verify")
r_login = session.post(url_login, data=payload)
# Check if we ended up on the error page
if '/error_page' in r_login.url:
o = urlparse(r_login.url)
query = parse_qs(o.query)
error_code = query.get('error_code', '')
if error_code[0] == '2':
return (False, 'MISP cannot be reached for authentication\n\t➥ Review MISP fully qualified name and SSL settings')
else:
error_message = query.get('error_message', '')[0]
return (False, 'Unknown error: {}\n{}'.format(error_code, error_message))
# Recover error message from the url
if '/login' in r_login.url:
o = urlparse(r_login.url)
query = parse_qs(o.query)
error_message = query.get('auth_error_message', ['Redirected to `login` caused by an unknown error'])[0]
return_text = 'Redirected to `login` caused by: {}'.format(error_message)
return (False, return_text)
# Connection seems to be successful, checking if we receive data from event-stream
r = session.get(url, stream=True, timeout=sleep_max, headers={'Accept': 'text/event-stream'})
return_flag = False
return_text = 'Dynamic endpoint returned data but not in the correct format.'
try:
for line in r.iter_lines():
if line.startswith(b'data: '):
data = line[6:]
try:
json.loads(data)
return_flag = True
return_text = 'Dynamic endpoint returned data (took {:.2f}s)\n\t{}...'.format(time.time()-start_time, line[6:20])
break
except Exception:
return_flag = False
return_text = 'Something went wrong. Output {}'.format(line)
break
except diagnostic_util.TimeoutException:
return_text = 'Dynamic endpoint did not return data in the given time ({}sec)'.format(int(time.time()-start_time))
return (return_flag, return_text)
def start_diagnostic():
if not (check_virtual_environment_and_packages() and check_configuration()):
return
check_file_permission()
check_redis()
check_zmq()
check_processes_status()
check_subscriber_status()
if check_buffer_queue() is not True:
check_buffer_change_rate()
dispatcher_running = check_dispatcher_status()
if check_server_listening() and dispatcher_running:
check_server_dynamic_enpoint()
def main():
start_diagnostic()
if __name__ == '__main__':
main()
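The script is meant to be run from inside the dashboard's virtualenv, since check_virtual_environment_and_packages() refuses to proceed otherwise; assuming the DASHENV virtualenv created by install_dependencies.sh:
```
$ . ./DASHENV/bin/activate
$ ./diagnostic.py
```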

diagnostic_util.py (new file, 68 lines)

@@ -0,0 +1,68 @@
import configparser
def dict_compare(dict1, dict2, itercount=0):
dict1_keys = set(dict1.keys())
dict2_keys = set(dict2.keys())
intersection = dict1_keys.difference(dict2_keys)
faulties = []
if itercount > 0 and len(intersection) > 0:
return (False, list(intersection))
flag_no_error = True
for k, v in dict1.items():
if isinstance(v, dict):
if k not in dict2:
faulties.append({k: dict1[k]})
flag_no_error = False
else:
status, faulty = dict_compare(v, dict2[k], itercount+1)
flag_no_error = flag_no_error and status
if len(faulty) > 0:
faulties.append({k: faulty})
else:
return (True, [])
if flag_no_error:
return (True, [])
else:
return (False, faulties)
class TimeoutException(Exception):
pass
def timeout_handler(signum, frame):
raise TimeoutException
# https://stackoverflow.com/a/10464730
class Monitor():
def __init__(self, connection_pool):
self.connection_pool = connection_pool
self.connection = None
def __del__(self):
try:
self.reset()
except Exception:
pass
def reset(self):
if self.connection:
self.connection_pool.release(self.connection)
self.connection = None
def monitor(self):
if self.connection is None:
self.connection = self.connection_pool.get_connection(
'monitor', None)
self.connection.send_command("monitor")
return self.listen()
def parse_response(self):
return self.connection.read_response()
def listen(self):
while True:
yield self.parse_response()
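For context, check_subscriber_status() in diagnostic.py drives this Monitor through a redis connection pool; a minimal standalone sketch (host and port are placeholders matching config.cfg.default):
```python
import redis
from diagnostic_util import Monitor

# Stream Redis MONITOR output; the first yielded item is 'OK',
# then one entry per command observed on the server.
pool = redis.ConnectionPool(host='localhost', port=6250, decode_responses=True)
for command in Monitor(pool).monitor():
    print(command)
```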


@@ -1,12 +1,16 @@
#!/usr/bin/env python3.5
-import os, sys, json
-import datetime, time
-import redis
import configparser
import datetime
import json
import os
import sys
import time
import redis
import util
-import contributor_helper
from helpers import contributor_helper
ONE_DAY = 60*60*24
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
@@ -206,7 +210,7 @@ def main():
for award in awards_given:
# update awards given
-serv_redis_db.zadd('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec }))
serv_redis_db.zadd('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
serv_redis_db.expire('CONTRIB_LAST_AWARDS:'+util.getDateStrFormat(now), ONE_DAY*7) #expire after 7 day
# publish
publish_log('GIVE_HONOR_ZMQ', 'CONTRIBUTION', {'org': org, 'award': award, 'epoch': nowSec }, CHANNEL_LASTAWARDS)
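The zadd and zincrby rewrites in this commit track the redis-py 3.x API change: zadd now takes a mapping of members to scores instead of positional score/member arguments, and zincrby takes the amount before the value. A short sketch of both signatures (key and member names are illustrative):
```python
import redis

r = redis.Redis(decode_responses=True)  # assumes a local Redis on the default port

# redis-py 2.x: r.zadd('key', score, member) -- raises an error in 3.x.
r.zadd('CONTRIB_LAST_AWARDS:20210331', {'{"org": "orgA"}': 1617181200})

# redis-py 2.x: r.zincrby('key', member, amount); 3.x puts the amount first.
r.zincrby('CONTRIB_DAY:20210331', 10, 'orgA')
```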


@@ -1,16 +1,20 @@
-import util
-from util import getZrange
-import math, random
-import time
-import os
import configparser
-import json
import datetime
import json
import logging
import math
import os
import random
import sys
import time
import redis
import util
from util import getZrange
from . import users_helper
KEYDAY = "CONTRIB_DAY" # To be used by other module
KEYALLORG = "CONTRIB_ALL_ORG" # To be used by other module
@@ -30,12 +34,21 @@ class Contributor_helper:
#logger
logDir = cfg.get('Log', 'directory')
-logfilename = cfg.get('Log', 'filename')
logfilename = cfg.get('Log', 'helpers_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
-logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
try:
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
#honorBadge
self.honorBadgeNum = len(self.cfg_org_rank.options('HonorBadge'))
@@ -48,6 +61,7 @@
self.org_honor_badge_title[badgeNum] = self.cfg_org_rank.get('HonorBadge', str(badgeNum))
self.trophyMapping = json.loads(self.cfg_org_rank.get('TrophyDifficulty', 'trophyMapping'))
self.trophyMappingIncremental = [sum(self.trophyMapping[:i]) for i in range(len(self.trophyMapping)+1)]
self.trophyNum = len(self.cfg_org_rank.options('HonorTrophy'))-1 #0 is not a trophy
self.categories_in_trophy = json.loads(self.cfg_org_rank.get('HonorTrophyCateg', 'categ'))
self.trophy_title = {}
@@ -86,7 +100,7 @@ class Contributor_helper:
self.DICO_PNTS_REWARD[categ] = self.default_pnts_per_contribution
self.rankMultiplier = self.cfg_org_rank.getfloat('monthlyRanking' ,'rankMultiplier')
-self.levelMax = self.cfg_org_rank.getint('monthlyRanking' ,'levelMax')
self.levelMax = self.cfg_org_rank.getint('monthlyRanking', 'levelMax')
# REDIS KEYS
self.keyDay = KEYDAY
@@ -97,7 +111,6 @@
self.keyTrophy = "CONTRIB_TROPHY"
self.keyLastAward = "CONTRIB_LAST_AWARDS"
''' HELPER '''
def getOrgLogoFromMISP(self, org):
return "{}/img/orgs/{}.png".format(self.misp_web_url, org)
@@ -105,11 +118,11 @@
def addContributionToCateg(self, date, categ, org, count=1):
today_str = util.getDateStrFormat(date)
keyname = "{}:{}:{}".format(self.keyCateg, today_str, categ)
-self.serv_redis_db.zincrby(keyname, org, count)
self.serv_redis_db.zincrby(keyname, count, org)
self.logger.debug('Added to redis: keyname={}, org={}, count={}'.format(keyname, org, count))
def publish_log(self, zmq_name, name, content, channel=""):
-to_send = { 'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
to_send = {'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
self.serv_log.publish(channel, json.dumps(to_send))
self.logger.debug('Published: {}'.format(json.dumps(to_send)))
@@ -119,14 +132,14 @@
if action in ['edit', None]:
pass
#return #not a contribution?
now = datetime.datetime.now()
nowSec = int(time.time())
pnts_to_add = self.default_pnts_per_contribution
# Do not consider contribution as login anymore
#self.users_helper.add_user_login(nowSec, org)
# is a valid contribution
if categ is not None:
try:
@@ -134,27 +147,27 @@
except KeyError:
pnts_to_add = self.default_pnts_per_contribution
pnts_to_add *= pntMultiplier
util.push_to_redis_zset(self.serv_redis_db, self.keyDay, org, count=pnts_to_add)
#CONTRIB_CATEG retain the contribution per category, not the point earned in this categ
util.push_to_redis_zset(self.serv_redis_db, self.keyCateg, org, count=1, endSubkey=':'+util.noSpaceLower(categ))
self.publish_log(zmq_name, 'CONTRIBUTION', {'org': org, 'categ': categ, 'action': action, 'epoch': nowSec }, channel=self.CHANNEL_LASTCONTRIB)
else:
categ = ""
self.serv_redis_db.sadd(self.keyAllOrg, org)
keyname = "{}:{}".format(self.keyLastContrib, util.getDateStrFormat(now))
-self.serv_redis_db.zadd(keyname, nowSec, org)
self.serv_redis_db.zadd(keyname, {org: nowSec})
self.logger.debug('Added to redis: keyname={}, nowSec={}, org={}'.format(keyname, nowSec, org))
self.serv_redis_db.expire(keyname, util.ONE_DAY*7) #expire after 7 day
awards_given = self.updateOrgContributionRank(org, pnts_to_add, action, contribType, eventTime=datetime.datetime.now(), isLabeled=isLabeled, categ=util.noSpaceLower(categ))
for award in awards_given:
# update awards given
keyname = "{}:{}".format(self.keyLastAward, util.getDateStrFormat(now))
-self.serv_redis_db.zadd(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec }))
self.serv_redis_db.zadd(keyname, {json.dumps({'org': org, 'award': award, 'epoch': nowSec }): nowSec})
self.logger.debug('Added to redis: keyname={}, nowSec={}, content={}'.format(keyname, nowSec, json.dumps({'org': org, 'award': award, 'epoch': nowSec })))
self.serv_redis_db.expire(keyname, util.ONE_DAY*7) #expire after 7 day
# publish
@@ -167,7 +180,7 @@ class Contributor_helper:
if pnts is None:
pnts = 0
else:
-pnts = int(pnts.decode('utf8'))
pnts = int(pnts)
return pnts
# return: [final_rank, requirement_fulfilled, requirement_not_fulfilled]
@@ -352,7 +365,6 @@ class Contributor_helper:
''' TROPHIES '''
def getOrgTrophies(self, org):
-self.getAllOrgsTrophyRanking()
keyname = '{mainKey}:{orgCateg}'
trophy = []
for categ in self.categories_in_trophy:
@@ -360,50 +372,53 @@
totNum = self.serv_redis_db.zcard(key)
if totNum == 0:
continue
-pos = self.serv_redis_db.zrank(key, org)
pos = self.serv_redis_db.zrevrank(key, org)
if pos is None:
continue
trophy_rank = self.posToRankMapping(pos, totNum)
trophy_Pnts = self.serv_redis_db.zscore(key, org)
-trophy.append({ 'categ': categ, 'trophy_points': trophy_Pnts, 'trophy_rank': trophy_rank, 'trophy_true_rank': trophy_rank, 'trophy_title': self.trophy_title[trophy_rank]})
trophy.append({ 'categ': categ, 'trophy_points': trophy_Pnts, 'trophy_rank': trophy_rank, 'trophy_true_rank': self.trophyNum-trophy_rank, 'trophy_title': self.trophy_title[trophy_rank]})
return trophy
def getOrgsTrophyRanking(self, categ):
keyname = '{mainKey}:{orgCateg}'
res = self.serv_redis_db.zrange(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), 0, -1, withscores=True, desc=True)
-res = [[org.decode('utf8'), score] for org, score in res]
res = [[org, score] for org, score in res]
return res
-def getAllOrgsTrophyRanking(self):
def getAllOrgsTrophyRanking(self, category=None):
concerned_categ = self.categories_in_trophy if category is None else category
dico_categ = {}
-for categ in self.categories_in_trophy:
for categ in [concerned_categ]:
res = self.getOrgsTrophyRanking(categ)
# add ranking info
tot = len(res)
for pos in range(tot):
res[pos].append(self.trophyNum-self.posToRankMapping(pos, tot))
dico_categ[categ] = res
toret = dico_categ if category is None else dico_categ.get(category, [])
return toret
def posToRankMapping(self, pos, totNum):
-mapping = self.trophyMapping
-mapping_num = [math.ceil(float(float(totNum*i)/float(100))) for i in mapping]
-if pos == 0: #first
-position = 1
-else:
-temp_pos = pos
-counter = 1
-for num in mapping_num:
-if temp_pos < num:
-position = counter
-else:
-temp_pos -= num
-counter += 1
-return self.trophyNum+1 - position
ratio = pos/totNum*100
rank = 0
if pos == totNum:
return 0
else:
for i in range(len(self.trophyMappingIncremental)):
if self.trophyMappingIncremental[i] < ratio <= self.trophyMappingIncremental[i+1]:
rank = i+1
return rank
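A small worked example of the new mapping, using the trophyMapping values from the config change above (standalone; the class wiring is elided):
```python
trophyMapping = [2, 9, 9, 10, 10, 16, 16, 10, 10, 4, 4]  # % of orgs per rank

# Cumulative boundaries, as built for trophyMappingIncremental in __init__:
incremental = [sum(trophyMapping[:i]) for i in range(len(trophyMapping) + 1)]
# -> [0, 2, 11, 20, 30, 40, 56, 72, 82, 92, 96, 100]

# An org at zrevrank position 5 out of 50 orgs sits in the top 10%:
pos, totNum = 5, 50
ratio = pos / totNum * 100  # 10.0
# 2 < 10.0 <= 11, so the loop assigns rank 2 (rank 0 is the very top of the board).
```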
def giveTrophyPointsToOrg(self, org, categ, points):
keyname = '{mainKey}:{orgCateg}'
-self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, points)
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), points, org)
self.logger.debug('Giving {} trophy points to {} in {}'.format(points, org, categ))
def removeTrophyPointsFromOrg(self, org, categ, points):
keyname = '{mainKey}:{orgCateg}'
-self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), org, -points)
self.serv_redis_db.zincrby(keyname.format(mainKey=self.keyTrophy, orgCateg=categ), -points, org)
self.logger.debug('Removing {} trophy points from {} in {}'.format(points, org, categ))
''' AWARDS HELPER '''
@@ -550,7 +565,7 @@
def getAllOrgFromRedis(self):
data = self.serv_redis_db.smembers(self.keyAllOrg)
-data = [x.decode('utf8') for x in data]
data = [x for x in data]
return data
def getCurrentOrgRankFromRedis(self, org):
@@ -586,4 +601,3 @@
return { 'remainingPts': i-points, 'stepPts': prev }
prev = i
return { 'remainingPts': 0, 'stepPts': self.rankMultiplier**self.levelMax }


@@ -1,21 +1,27 @@
-import math, random
-import os
-import datetime, time
-import json
-import redis
import datetime
import json
import logging
import math
import os
import random
import sys
import time
from collections import OrderedDict
import redis
import geoip2.database
-import phonenumbers, pycountry
import phonenumbers
import pycountry
import util
from helpers import live_helper
from phonenumbers import geocoder
-import util
class InvalidCoordinate(Exception):
pass
class Geo_helper:
def __init__(self, serv_redis_db, cfg):
self.serv_redis_db = serv_redis_db
@@ -24,15 +30,25 @@ class Geo_helper:
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisMap', 'db'))
self.live_helper = live_helper.Live_helper(serv_redis_db, cfg)
#logger
logDir = cfg.get('Log', 'directory')
-logfilename = cfg.get('Log', 'filename')
logfilename = cfg.get('Log', 'helpers_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
-logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
try:
handler = logging.FileHandler(logPath)
except PermissionError as error:
print(error)
print("Please fix the above and try again.")
sys.exit(126)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
self.logger = logging.getLogger(__name__)
self.logger.setLevel(logging.INFO)
self.logger.addHandler(handler)
self.keyCategCoord = "GEO_COORD"
self.keyCategCountry = "GEO_COUNTRY"
@@ -41,8 +57,18 @@ class Geo_helper:
        self.PATH_TO_JSON = cfg.get('RedisMap', 'path_countrycode_to_coord_JSON')
        self.CHANNELDISP = cfg.get('RedisMap', 'channelDisp')

-        self.reader = geoip2.database.Reader(self.PATH_TO_DB)
-        self.country_to_iso = { country.name: country.alpha_2 for country in pycountry.countries}
+        try:
+            self.reader = geoip2.database.Reader(self.PATH_TO_DB)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
+        self.country_to_iso = {}
+        for country in pycountry.countries:
+            try:
+                self.country_to_iso[country.name] = country.alpha_2
+            except AttributeError:
+                pass
        with open(self.PATH_TO_JSON) as f:
            self.country_code_to_coord = json.load(f)
@@ -104,7 +130,9 @@ class Geo_helper:
            if not self.coordinate_list_valid(coord_list):
                raise InvalidCoordinate("Coordinate do not match EPSG:900913 / EPSG:3785 / OSGEO:41001")
            self.push_to_redis_zset(self.keyCategCoord, json.dumps(ordDic))
-            self.push_to_redis_zset(self.keyCategCountry, rep['full_rep'].country.iso_code)
+            iso_code = rep['full_rep'].country.iso_code if rep['full_rep'].country.iso_code is not None else rep['full_rep'].registered_country.iso_code
+            country_name = rep['full_rep'].country.name if rep['full_rep'].country.name is not None else rep['full_rep'].registered_country.name
+            self.push_to_redis_zset(self.keyCategCountry, iso_code)
            ordDic = OrderedDict() #keep fields with the same layout in redis
            ordDic['categ'] = categ
            ordDic['value'] = supposed_ip
@@ -113,15 +141,17 @@ class Geo_helper:
                "coord": coord,
                "categ": categ,
                "value": supposed_ip,
-                "country": rep['full_rep'].country.name,
+                "country": country_name,
                "specifName": rep['full_rep'].subdivisions.most_specific.name,
                "cityName": rep['full_rep'].city.name,
-                "regionCode": rep['full_rep'].country.iso_code,
+                "regionCode": iso_code,
            }
-            self.serv_coord.publish(self.CHANNELDISP, json.dumps(to_send))
+            j_to_send = json.dumps(to_send)
+            self.serv_coord.publish(self.CHANNELDISP, j_to_send)
+            self.live_helper.add_to_stream_log_cache('Map', j_to_send)
            self.logger.info('Published: {}'.format(json.dumps(to_send)))
        except ValueError:
-            self.logger.warning("can't resolve ip")
+            self.logger.warning("Can't resolve IP: " + str(supposed_ip))
        except geoip2.errors.AddressNotFoundError:
            self.logger.warning("Address not in Database")
        except InvalidCoordinate:
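The iso_code/country_name fallback introduced above covers IPs that GeoLite can only tie to a registration country rather than a geolocated one. A short sketch of the same lookup outside the helper (the database path mirrors what the install script unpacks and is an assumption here):

    import geoip2.database

    reader = geoip2.database.Reader('data/GeoLite2-City/GeoLite2-City.mmdb')
    rep = reader.city('8.8.8.8')
    # country can be empty for some networks; registered_country then still carries a value
    iso_code = rep.country.iso_code if rep.country.iso_code is not None else rep.registered_country.iso_code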
@@ -163,7 +193,9 @@ class Geo_helper:
                "cityName": "",
                "regionCode": country_code,
            }
-            self.serv_coord.publish(self.CHANNELDISP, json.dumps(to_send))
+            j_to_send = json.dumps(to_send)
+            self.serv_coord.publish(self.CHANNELDISP, j_to_send)
+            self.live_helper.add_to_stream_log_cache('Map', j_to_send)
            self.logger.info('Published: {}'.format(json.dumps(to_send)))
        except phonenumbers.NumberParseException:
            self.logger.warning("Can't resolve phone number country")
@@ -175,13 +207,22 @@ class Geo_helper:
        now = datetime.datetime.now()
        today_str = util.getDateStrFormat(now)
        keyname = "{}:{}".format(keyCateg, today_str)
-        self.serv_redis_db.geoadd(keyname, lon, lat, content)
+        try:
+            self.serv_redis_db.geoadd(keyname, lon, lat, content)
+        except redis.exceptions.ResponseError as error:
+            print(error)
+            print("Please fix the above, and make sure you use a redis version that supports the GEOADD command.")
+            print("To test for support: echo \"help GEOADD\"| redis-cli")
        self.logger.debug('Added to redis: keyname={}, lon={}, lat={}, content={}'.format(keyname, lon, lat, content))

    def push_to_redis_zset(self, keyCateg, toAdd, endSubkey="", count=1):
+        if not isinstance(toAdd, str):
+            self.logger.warning('Can\'t add to redis, element is not of type String. {}'.format(type(toAdd)))
+            return
        now = datetime.datetime.now()
        today_str = util.getDateStrFormat(now)
        keyname = "{}:{}{}".format(keyCateg, today_str, endSubkey)
-        self.serv_redis_db.zincrby(keyname, toAdd, count)
+        self.serv_redis_db.zincrby(keyname, count, toAdd)
        self.logger.debug('Added to redis: keyname={}, toAdd={}, count={}'.format(keyname, toAdd, count))

    def ip_to_coord(self, ip):
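The new try/except exists because GEOADD only arrived in Redis 3.2, so older servers answer with a ResponseError. One way to probe support from Python, using the raw command form so it stays independent of the redis-py version (key and member are illustrative):

    import redis

    r = redis.StrictRedis(decode_responses=True)
    # GEOADD key longitude latitude member -- needs Redis >= 3.2
    r.execute_command('GEOADD', 'GEO_COORD:2021-03-31', 6.13, 49.61, 'luxembourg')
    print(r.execute_command('GEOPOS', 'GEO_COORD:2021-03-31', 'luxembourg'))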
helpers/live_helper.py Normal file
@@ -0,0 +1,67 @@
import datetime
import json
import logging
import os
import random
import sys
import time
class Live_helper:
    def __init__(self, serv_live, cfg):
        self.serv_live = serv_live
        self.cfg = cfg
        self.maxCacheHistory = cfg.get('Dashboard', 'maxCacheHistory')
        # REDIS keys
        self.CHANNEL = cfg.get('RedisLog', 'channel')
        self.prefix_redis_key = "TEMP_CACHE_LIVE:"

        # logger
        logDir = cfg.get('Log', 'directory')
        logfilename = cfg.get('Log', 'helpers_filename')
        logPath = os.path.join(logDir, logfilename)
        if not os.path.exists(logDir):
            os.makedirs(logDir)
        try:
            handler = logging.FileHandler(logPath)
        except PermissionError as error:
            print(error)
            print("Please fix the above and try again.")
            sys.exit(126)
        formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
        handler.setFormatter(formatter)
        self.logger = logging.getLogger(__name__)
        self.logger.setLevel(logging.INFO)
        self.logger.addHandler(handler)

    def publish_log(self, zmq_name, name, content, channel=None):
        channel = channel if channel is not None else self.CHANNEL
        to_send = { 'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
        to_send_keep = { 'name': name, 'log': content, 'zmqName': zmq_name }
        j_to_send = json.dumps(to_send)
        j_to_send_keep = json.dumps(to_send_keep)
        self.serv_live.publish(channel, j_to_send)
        self.logger.debug('Published: {}'.format(j_to_send))
        if name != 'Keepalive':
            # cache ObjectAttribute entries under the Attribute key
            name = 'Attribute' if name == 'ObjectAttribute' else name
            self.add_to_stream_log_cache(name, j_to_send_keep)

    def get_stream_log_cache(self, cacheKey):
        rKey = self.prefix_redis_key+cacheKey
        entries = self.serv_live.lrange(rKey, 0, -1)
        to_ret = []
        for entry in entries:
            jentry = json.loads(entry)
            to_ret.append(jentry)
        return to_ret

    def add_to_stream_log_cache(self, cacheKey, item):
        rKey = self.prefix_redis_key+cacheKey
        if type(item) != str:
            item = json.dumps(item)
        self.serv_live.lpush(rKey, item)
        r = random.randint(0, 8)
        if r == 0:
            self.serv_live.ltrim(rKey, 0, 100)
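Worth noting in add_to_stream_log_cache above: the LPUSH runs on every call, but the LTRIM only fires on roughly one call in nine, so each per-category cache hovers around (not exactly at) 100 entries while the trim cost stays amortised. The equivalent raw Redis round-trip, using the helper's own key prefix:

    import json
    import random

    import redis

    r = redis.StrictRedis(decode_responses=True)
    r.lpush('TEMP_CACHE_LIVE:Attribute', json.dumps({'value': '198.51.100.1'}))
    if random.randint(0, 8) == 0:          # ~1/9 chance, as in the helper
        r.ltrim('TEMP_CACHE_LIVE:Attribute', 0, 100)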
helpers/trendings_helper.py
@@ -1,13 +1,17 @@
-import math, random
-import os
-import json
import copy
-import datetime, time
+import datetime
+import json
import logging
+import math
+import os
+import random
+import sys
+import time
from collections import OrderedDict

import util
class Trendings_helper:
    def __init__(self, serv_redis_db, cfg):
        self.serv_redis_db = serv_redis_db
@@ -23,12 +27,21 @@ class Trendings_helper:
        #logger
        logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
        logPath = os.path.join(logDir, logfilename)
        if not os.path.exists(logDir):
            os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            handler = logging.FileHandler(logPath)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
+        formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
+        handler.setFormatter(formatter)
        self.logger = logging.getLogger(__name__)
+        self.logger.setLevel(logging.INFO)
+        self.logger.addHandler(handler)

    ''' SETTER '''
@@ -40,7 +53,7 @@ class Trendings_helper:
            to_save = json.dumps(data)
        else:
            to_save = data
-        self.serv_redis_db.zincrby(keyname, to_save, 1)
+        self.serv_redis_db.zincrby(keyname, 1, to_save)
        self.logger.debug('Added to redis: keyname={}, content={}'.format(keyname, to_save))

    def addTrendingEvent(self, eventName, timestamp):
@@ -76,15 +89,16 @@ class Trendings_helper:
    ''' GETTER '''

-    def getGenericTrending(self, trendingType, dateS, dateE, topNum=0):
+    def getGenericTrending(self, trendingType, dateS, dateE, topNum=10):
        to_ret = []
        prev_days = (dateE - dateS).days
        for curDate in util.getXPrevDaysSpan(dateE, prev_days):
            keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
-            data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=True)
+            data = [ [record[0], record[1]] for record in data ]
            data = data if data is not None else []
            to_ret.append([util.getTimestamp(curDate), data])
+        to_ret = util.sortByTrendingScore(to_ret, topNum=topNum)
        return to_ret
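The reworked getGenericTrending above now pulls each day's full sorted set and leaves the top-N cut to util.sortByTrendingScore across the whole date span, instead of truncating per day. With decode_responses enabled on the client, the per-day zrange hands back (member, score) tuples directly; a sketch with an illustrative key name:

    import redis

    r = redis.StrictRedis(decode_responses=True)
    data = r.zrange('TRENDINGS_EVENTS:2021-03-31', 0, -1, desc=True, withscores=True)
    # e.g. [('Phishing wave', 42.0), ('Botnet sinkhole', 17.0), ...]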
    def getSpecificTrending(self, trendingType, dateS, dateE, specificLabel=''):
@@ -97,15 +111,15 @@ class Trendings_helper:
            to_ret.append([util.getTimestamp(curDate), data])
        return to_ret

-    def getTrendingEvents(self, dateS, dateE, specificLabel=None):
+    def getTrendingEvents(self, dateS, dateE, specificLabel=None, topNum=None):
        if specificLabel is None:
-            return self.getGenericTrending(self.keyEvent, dateS, dateE)
+            return self.getGenericTrending(self.keyEvent, dateS, dateE, topNum=topNum)
        else:
            specificLabel = specificLabel.replace('\\n', '\n'); # reset correctly label with their \n (CR) instead of their char value
            return self.getSpecificTrending(self.keyEvent, dateS, dateE, specificLabel)

-    def getTrendingCategs(self, dateS, dateE):
-        return self.getGenericTrending(self.keyCateg, dateS, dateE)
+    def getTrendingCategs(self, dateS, dateE, topNum=None):
+        return self.getGenericTrending(self.keyCateg, dateS, dateE, topNum=topNum)

    # FIXME: Construct this when getting data
    def getTrendingTags(self, dateS, dateE, topNum=12):
@@ -114,7 +128,7 @@ class Trendings_helper:
        for curDate in util.getXPrevDaysSpan(dateE, prev_days):
            keyname = "{}:{}".format(self.keyTag, util.getDateStrFormat(curDate))
            data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = [ [record[0], record[1]] for record in data ]
            data = data if data is not None else []
            temp = []
            for jText, score in data:
@@ -129,15 +143,15 @@ class Trendings_helper:
        for curDate in util.getXPrevDaysSpan(dateE, prev_days):
            keyname = "{}:{}".format(self.keySigh, util.getDateStrFormat(curDate))
            sight = self.serv_redis_db.get(keyname)
-            sight = 0 if sight is None else int(sight.decode('utf8'))
+            sight = 0 if sight is None else int(sight)
            keyname = "{}:{}".format(self.keyFalse, util.getDateStrFormat(curDate))
            fp = self.serv_redis_db.get(keyname)
-            fp = 0 if fp is None else int(fp.decode('utf8'))
+            fp = 0 if fp is None else int(fp)
            to_ret.append([util.getTimestamp(curDate), { 'sightings': sight, 'false_positive': fp}])
        return to_ret

-    def getTrendingDisc(self, dateS, dateE):
-        return self.getGenericTrending(self.keyDisc, dateS, dateE)
+    def getTrendingDisc(self, dateS, dateE, topNum=None):
+        return self.getGenericTrending(self.keyDisc, dateS, dateE, topNum=topNum)
    def getTypeaheadData(self, dateS, dateE):
        to_ret = {}
@@ -148,7 +162,7 @@ class Trendings_helper:
                keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
                data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
                for elem in data:
-                    allSet.add(elem.decode('utf8'))
+                    allSet.add(elem)
            to_ret[trendingType] = list(allSet)
        tags = self.getTrendingTags(dateS, dateE)
        tagSet = set()
@@ -177,7 +191,7 @@ class Trendings_helper:
        for curDate in util.getXPrevDaysSpan(dateE, prev_days):
            keyname = "{}:{}".format(trendingType, util.getDateStrFormat(curDate))
            data = self.serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-            data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+            data = [ [record[0], record[1]] for record in data ]
            data = data if data is not None else []
            to_format.append([util.getTimestamp(curDate), data])
helpers/users_helper.py
@@ -1,10 +1,14 @@
-import math, random
-import os
-import json
-import datetime, time
+import datetime
+import json
import logging
+import math
+import os
+import random
+import sys
+import time

import util
from . import contributor_helper
@@ -20,23 +24,32 @@ class Users_helper:
        #logger
        logDir = cfg.get('Log', 'directory')
-        logfilename = cfg.get('Log', 'filename')
+        logfilename = cfg.get('Log', 'helpers_filename')
        logPath = os.path.join(logDir, logfilename)
        if not os.path.exists(logDir):
            os.makedirs(logDir)
-        logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+        try:
+            handler = logging.FileHandler(logPath)
+        except PermissionError as error:
+            print(error)
+            print("Please fix the above and try again.")
+            sys.exit(126)
+        formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
+        handler.setFormatter(formatter)
        self.logger = logging.getLogger(__name__)
+        self.logger.setLevel(logging.INFO)
+        self.logger.addHandler(handler)
-    def add_user_login(self, timestamp, org):
+    def add_user_login(self, timestamp, org, email=''):
        timestampDate = datetime.datetime.fromtimestamp(float(timestamp))
        timestampDate_str = util.getDateStrFormat(timestampDate)

        keyname_timestamp = "{}:{}".format(self.keyTimestamp, org)
-        self.serv_redis_db.zadd(keyname_timestamp, timestamp, timestamp)
+        self.serv_redis_db.zadd(keyname_timestamp, {timestamp: timestamp})
        self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_timestamp, timestamp))

        keyname_org = "{}:{}".format(self.keyOrgLog, timestampDate_str)
-        self.serv_redis_db.zincrby(keyname_org, org, 1)
+        self.serv_redis_db.zincrby(keyname_org, 1, org)
        self.logger.debug('Added to redis: keyname={}, org={}'.format(keyname_org, org))

        self.serv_redis_db.sadd(self.keyAllOrgLog, org)
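The zadd change above is the same redis-py 3.0 migration: zadd now takes a {member: score} mapping instead of positional score/member pairs. A sketch (the key is illustrative):

    import redis

    r = redis.StrictRedis(decode_responses=True)
    timestamp = 1617150000
    # redis-py >= 3.0 style; the old zadd(key, score, member) form raises an error
    r.zadd('LOGIN_TIMESTAMP:CIRCL', {timestamp: timestamp})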
@@ -44,7 +57,7 @@ class Users_helper:
    def getAllOrg(self):
        temp = self.serv_redis_db.smembers(self.keyAllOrgLog)
-        return [ org.decode('utf8') for org in temp ]
+        return [ org for org in temp ]

    # return: All timestamps for one org for the spanned time or not
    def getDates(self, org, date=None):
@@ -63,11 +76,11 @@ class Users_helper:
            else:
                break # timestamps should be sorted, no need to process anymore
        return to_return
    # return: All dates for all orgs, if date is not supplied, return for all dates
    def getUserLogins(self, date=None):
-        # get all orgs and retreive their timestamps
+        # get all orgs and retrieve their timestamps
        dates = []
        for org in self.getAllOrg():
            keyname = "{}:{}".format(self.keyOrgLog, org)
@@ -81,7 +94,7 @@ class Users_helper:
            keyname = "{}:{}".format(self.keyOrgLog, util.getDateStrFormat(curDate))
            data = self.serv_redis_db.zrange(keyname, 0, -1, desc=True)
            for org in data:
-                orgs.add(org.decode('utf8'))
+                orgs.add(org)
        return list(orgs)

    # return: list composed of the number of [log, contrib] for one org for the time spanned
@@ -125,7 +138,7 @@ class Users_helper:
    def getLoginVSCOntribution(self, date):
        keyname = "{}:{}".format(self.keyContribDay, util.getDateStrFormat(date))
        orgs_contri = self.serv_redis_db.zrange(keyname, 0, -1, desc=True, withscores=False)
-        orgs_contri = [ org.decode('utf8') for org in orgs_contri ]
+        orgs_contri = [ org for org in orgs_contri ]
        orgs_login = [ org for org in self.getAllLoggedInOrgs(date, prev_days=0) ]
        contributed_num = 0
        non_contributed_num = 0
@@ -169,7 +182,7 @@ class Users_helper:
            data = [data[6]]+data[:6]
        return data

    # return: a dico of the form {login: [[timestamp, count], ...], contrib: [[timestamp, 1/0], ...]}
    # either for all orgs or the supplied one
    def getUserLoginsAndContribOvertime(self, date, org=None, prev_days=6):
        dico_hours_contrib = {}
install_dependencies.sh
@@ -1,17 +1,59 @@
#!/bin/bash

-set -e
-set -x
-
-sudo apt-get install python3-virtualenv virtualenv screen redis-server unzip -y
+## disable -e for production systems
+#set -e
+
+## Debug mode
+#set -x
+
+# Functions
+
+get_distribution() {
+    lsb_dist=""
+    # Every system that we officially support has /etc/os-release
+    if [ -r /etc/os-release ]; then
+        lsb_dist="$(. /etc/os-release && echo "$ID")"
+    fi
+    # Returning an empty string here should be alright since the
+    # case statements don't act unless you provide an actual value
+    echo "$lsb_dist" | tr '[:upper:]' '[:lower:]'
+}
+
+sudo chmod -R g+w .
+
+if ! id zmqs >/dev/null 2>&1; then
+    if [ "$(get_distribution)" == "rhel" ] || [ "$(get_distribution)" == "centos" ]; then
+        # Create zmq user
+        sudo useradd -U -G apache -m -s /usr/bin/bash zmqs
+        # Adds right to apache to run ./start-zmq as zmqs
+        echo "apache ALL=(zmqs) NOPASSWD:/bin/bash /var/www/misp-dashboard/start_zmq.sh" |sudo tee /etc/sudoers.d/apache
+        VENV_BIN="/usr/local/bin/virtualenv"
+    else
+        # Create zmq user
+        sudo useradd -U -G www-data -m -s /bin/bash zmqs
+        # Adds right to www-data to run ./start-zmq as zmqs
+        echo "www-data ALL=(zmqs) NOPASSWD:/bin/bash /var/www/misp-dashboard/start_zmq.sh" |sudo tee /etc/sudoers.d/www-data
+        VENV_BIN="virtualenv"
+    fi
+fi
+
+VENV_BIN="${VENV_BIN:-virtualenv}"
+
+sudo apt-get install python3-virtualenv virtualenv screen redis-server unzip net-tools -y

if [ -z "$VIRTUAL_ENV" ]; then
-    virtualenv -p python3 DASHENV
+    ${VENV_BIN} -p python3 DASHENV ; DASH_VENV=$?
+
+    if [[ "$DASH_VENV" != "0" ]]; then
+        echo "Something went wrong with either the update or install of the virtualenv."
+        echo "Please investigate manually."
+        exit $DASH_VENV
+    fi

    . ./DASHENV/bin/activate
fi

-pip3 install -U pip argparse redis zmq geoip2 flask phonenumbers pycountry
+pip3 install -U -r requirements.txt
## config
if [ -e "config/config.cfg" ]; then
@@ -35,7 +77,14 @@ mkdir -p css fonts js
popd
mkdir -p temp

-wget http://www.misp-project.org/assets/images/misp-small.png -O static/pics/MISP.png
+NET_WGET=$(wget --no-cache -q https://www.misp-project.org/assets/images/misp-small.png -O static/pics/MISP.png; echo $?)
+
+if [[ "$NET_WGET" != "0" ]]; then
+    echo "The first wget we tried failed, please investigate manually."
+    exit $NET_WGET
+fi
+
+wget https://www.misp-project.org/favicon.ico -O static/favicon.ico

# jquery
JQVERSION="3.2.1"
@@ -98,10 +147,24 @@ wget http://jvectormap.com/js/jquery-jvectormap-world-mill.js -O ./static/js/jqu
rm -rf data/GeoLite2-City*
mkdir -p data
pushd data
-wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz -O GeoLite2-City.tar.gz
+# The following lines do not work any more, see: https://blog.maxmind.com/2019/12/18/significant-changes-to-accessing-and-using-geolite2-databases/
+#wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz -O GeoLite2-City.tar.gz
+
+read -p "Please paste your Max Mind License key: " MM_LIC
+
+while [ "$(sha256sum -c GeoLite2-City.tar.gz.sha256 >/dev/null; echo $?)" != "0" ]; do
+    echo "Redownloading GeoLite Assets, if this loops, CTRL-C and investigate"
+    wget "https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_LIC}&suffix=tar.gz" -O GeoLite2-City.tar.gz
+    wget "https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_LIC}&suffix=tar.gz.sha256" -O GeoLite2-City.tar.gz.sha256
+    if [[ $? == 6 ]]; then
+        echo "Something is wrong with your License Key, please try entering another one. (You DO NOT need a GeoIP Update key)"
+        echo "If you created the key JUST NOW, it will take a couple of minutes to become active."
+        read -p "Please paste your Max Mind License key: " MM_LIC
+    fi
+    sed -i 's/_.*/.tar.gz/' GeoLite2-City.tar.gz.sha256
+    sleep 3
+done
+
tar xvfz GeoLite2-City.tar.gz
ln -s GeoLite2-City_* GeoLite2-City
-rm -rf GeoLite2-City.tar.gz
+rm -rf GeoLite2-City.tar.gz*
popd

# DataTable
requirements.txt Normal file
@@ -0,0 +1,13 @@
@ -0,0 +1,13 @@
argparse
flask
flask-login
wtforms
geoip2
redis
phonenumbers
pip
pycountry
zmq
requests
halo
pyopenssl
@@ -1,13 +1,15 @@
#!/usr/bin/env python3.5

-import redis
-import requests
-import shutil
import json
import math
-import sys, os
+import os
+import shlex
+import shutil
+import sys
import time
from subprocess import PIPE, Popen
-import shlex
+
+import redis
+import requests

URL_OPEN_MAP = "http://tile.openstreetmap.org/{zoom}/{x}/{y}.png"
MAP_DIR = "static/maps/"
server.py
@@ -1,20 +1,29 @@
#!/usr/bin/env python3

-from flask import Flask, render_template, request, Response, jsonify
-import json
-import redis
-import random, math
import configparser
+import datetime
+import uuid
+import errno
+import json
+import logging
+import math
+import os
+import re
+from datetime import timedelta
+import random
from time import gmtime as now
from time import sleep, strftime
-import datetime
-import os
-import logging
+
+import redis
+
import util
-from helpers import geo_helper
-from helpers import contributor_helper
-from helpers import users_helper
-from helpers import trendings_helper
+from flask import (Flask, Response, jsonify, render_template, request, make_response,
+                   send_from_directory, stream_with_context, url_for, redirect)
+from flask_login import (UserMixin, LoginManager, current_user, login_user, logout_user, login_required)
+from helpers import (contributor_helper, geo_helper, live_helper,
+                     trendings_helper, users_helper)
+import requests
+from wtforms import Form, SubmitField, StringField, PasswordField, validators

configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
@@ -25,37 +34,228 @@ logger.setLevel(logging.ERROR)
server_host = cfg.get("Server", "host")
server_port = cfg.getint("Server", "port")
+server_debug = cfg.getboolean("Server", "debug")
+server_ssl = cfg.getboolean("Server", "ssl")
+try:
+    server_ssl_cert = cfg.get("Server", "ssl_cert")
+    server_ssl_key = cfg.get("Server", "ssl_key")
+except:
+    server_ssl_cert = None
+    server_ssl_key = None
+auth_host = cfg.get("Auth", "misp_fqdn")
+auth_enabled = cfg.getboolean("Auth", "auth_enabled")
+auth_ssl_verify = cfg.getboolean("Auth", "ssl_verify")
+auth_session_secret = cfg.get("Auth", "session_secret")
+auth_session_cookie_secure = cfg.getboolean("Auth", "session_cookie_secure")
+auth_session_cookie_samesite = cfg.get("Auth", "session_cookie_samesite")
+auth_permanent_session_lifetime = cfg.getint("Auth", "permanent_session_lifetime")

app = Flask(__name__)
+#app.secret_key = auth_session_secret
+app.config.update(
+    SECRET_KEY=auth_session_secret,
+    SESSION_COOKIE_SECURE=auth_session_cookie_secure,
+    SESSION_COOKIE_SAMESITE=auth_session_cookie_samesite,
+    PERMANENT_SESSION_LIFETIME=timedelta(days=auth_permanent_session_lifetime)
+)

redis_server_log = redis.StrictRedis(
    host=cfg.get('RedisGlobal', 'host'),
    port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisLog', 'db'))
+    db=cfg.getint('RedisLog', 'db'),
+    decode_responses=True)
redis_server_map = redis.StrictRedis(
    host=cfg.get('RedisGlobal', 'host'),
    port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisMap', 'db'))
+    db=cfg.getint('RedisMap', 'db'),
+    decode_responses=True)
serv_redis_db = redis.StrictRedis(
    host=cfg.get('RedisGlobal', 'host'),
    port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisDB', 'db'))
+    db=cfg.getint('RedisDB', 'db'),
+    decode_responses=True)

+streamLogCacheKey = cfg.get('RedisLog', 'streamLogCacheKey')
+streamMapCacheKey = cfg.get('RedisLog', 'streamMapCacheKey')
+
+live_helper = live_helper.Live_helper(serv_redis_db, cfg)
geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
contributor_helper = contributor_helper.Contributor_helper(serv_redis_db, cfg)
users_helper = users_helper.Users_helper(serv_redis_db, cfg)
trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)

-subscriber_log = redis_server_log.pubsub(ignore_subscribe_messages=True)
-subscriber_log.psubscribe(cfg.get('RedisLog', 'channel'))
-subscriber_map = redis_server_map.pubsub(ignore_subscribe_messages=True)
-subscriber_map.psubscribe(cfg.get('RedisMap', 'channelDisp'))
-subscriber_lastContrib = redis_server_log.pubsub(ignore_subscribe_messages=True)
-subscriber_lastContrib.psubscribe(cfg.get('RedisLog', 'channelLastContributor'))
-subscriber_lastAwards = redis_server_log.pubsub(ignore_subscribe_messages=True)
-subscriber_lastAwards.psubscribe(cfg.get('RedisLog', 'channelLastAwards'))
+login_manager = LoginManager(app)
+login_manager.session_protection = "strong"
+login_manager.init_app(app)

+##########
+## Auth ##
+##########
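decode_responses=True on the three clients above is what lets every .decode('utf8') call disappear from the helpers in this commit: redis-py decodes replies to str before handing them back. A minimal illustration:

    import redis

    r_raw = redis.StrictRedis()
    r_txt = redis.StrictRedis(decode_responses=True)
    r_txt.set('demo-key', 'value')
    assert r_raw.get('demo-key') == b'value'   # bytes without the flag
    assert r_txt.get('demo-key') == 'value'    # str with it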
+class User(UserMixin):
+    def __init__(self, id, password):
+        self.id = id
+        self.password = password
+
+    def misp_login(self):
+        """
+        Use login form data to authenticate a user to MISP.
+
+        This function uses requests to log a user into the MISP web UI. When authentication is successful, MISP redirects the client to the '/users/routeafterlogin' endpoint. The requests session history is parsed for a redirect to this endpoint.
+
+        The MISP FQDN comes from the [Auth] config section; the credentials are taken from self.id and self.password.
+        :return: (True, '') on success, (None, error_message) otherwise.
+        """
+        post_data = {
+            "_method": "POST",
+            "data[_Token][key]": "",
+            "data[_Token][fields]": "",
+            "data[_Token][unlocked]": "",
+            "data[User][email]": self.id,
+            "data[User][password]": self.password,
+        }
+
+        misp_login_page = auth_host + "/users/login"
+        misp_user_me_page = auth_host + "/users/view/me.json"
+        session = requests.Session()
+        session.verify = auth_ssl_verify
+
+        # The login page contains hidden form values required for authentication.
+        login_page = session.get(misp_login_page)
+
+        # This regex matches the "data[_Token][fields]" value needed to make a POST request on the MISP login page.
+        token_fields_exp = re.compile(r'name="data\[_Token]\[fields]" value="([^\s]+)"')
+        token_fields = token_fields_exp.search(login_page.text)
+
+        # This regex matches the "data[_Token][key]" value needed to make a POST request on the MISP login page.
+        token_key_exp = re.compile(r'name="data\[_Token]\[key]" value="([^\s]+)"')
+        token_key = token_key_exp.search(login_page.text)
+
+        # This regex matches the "data[_Token][debug]" value needed to make a POST request on the MISP login page.
+        token_key_exp = re.compile(r'name="data\[_Token]\[debug]" value="([^\s]+)"')
+        token_debug = token_key_exp.search(login_page.text)
+
+        post_data["data[_Token][fields]"] = token_fields.group(1)
+        post_data["data[_Token][key]"] = token_key.group(1)
+
+        # debug_token should return None when MISP debug is off.
+        # Only send debug_token when MISP is running in debug mode.
+        if token_debug is not None:
+            post_data["data[_Token][debug]"] = token_debug.group(1)
+
+        # POST request with user credentials + hidden form values.
+        post_to_login_page = session.post(misp_login_page, data=post_data, allow_redirects=False)
+        # Consider setup with MISP baseurl set
+        redirect_location = post_to_login_page.headers.get('Location', '')
+        # Authentication is successful if MISP returns a redirect to '/users/routeafterlogin'.
+        if '/users/routeafterlogin' in redirect_location:
+            # Logged in, check if the logged-in user can access the dashboard
+            me_json = session.get(misp_user_me_page).json()
+            dashboard_access = me_json.get('UserSetting', {}).get('dashboard_access', False)
+            if dashboard_access is True or dashboard_access == 1:
+                return (True, '')
+            else:
+                return (None, 'User does not have dashboard access')
+        return (None, '')
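A usage sketch for the method above (the credentials are placeholders; auth_host comes from the [Auth] section of config.cfg):

    user = User('analyst@example.com', 'correct horse battery staple')
    is_logged_in, error_message = user.misp_login()
    if is_logged_in:
        print('MISP accepted the credentials and granted dashboard access')
    else:
        print('refused:', error_message or 'bad credentials')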
+@login_manager.user_loader
+def load_user(user_id):
+    """
+    Return a User object required by flask-login to keep state of a user session.
+
+    Typically load_user is used to perform a user lookup on a db; it should return a User object or None if the user is not found. Authentication is deferred to MISP via User.misp_login() and so this function always returns a User object.
+
+    :param user_id: A MISP username.
+    :return:
+    """
+    return User(user_id, "")
+
+@login_manager.unauthorized_handler
+def unauthorized():
+    """
+    Redirect unauthorized user to login page.
+    :return:
+    """
+    redirectCount = int(request.cookies.get('redirectCount', '0'))
+    if redirectCount > 5:
+        response = make_response(redirect(url_for(
+            'error_page',
+            error_message='Too many redirects. This can be due to your browser not accepting cookies or the misp-dashboard website being badly configured',
+            error_code='1'
+        )))
+        response.set_cookie('redirectCount', '0', secure=False, httponly=True)
+    else:
+        response = make_response(redirect(url_for('login', auth_error=True, auth_error_message='Unauthorized. Review your cookie settings')))
+        response.set_cookie('redirectCount', str(redirectCount+1), secure=False, httponly=True)
+    return response
+@app.route('/error_page')
+def error_page():
+    error_message = request.args.get('error_message', False)
+    return render_template('error_page.html', error_message=error_message)
+
+@app.route('/logout')
+@login_required
+def logout():
+    """
+    Logout the user and redirect to the login form.
+    :return:
+    """
+    logout_user()
+    return redirect(url_for('login'))
+
+@app.route('/login', methods=['GET', 'POST'])
+def login():
+    """
+    Login form route.
+    :return:
+    """
+    if not auth_enabled:
+        # Generate a random user name and redirect the automatically authenticated user to index.
+        user = User(str(uuid.uuid4()).replace('-',''), '')
+        login_user(user)
+        return redirect(url_for('index'))
+
+    if current_user.is_authenticated:
+        return redirect(url_for('index'))
+
+    form = LoginForm(request.form)
+    if request.method == 'POST' and form.validate():
+        user = User(form.username.data, form.password.data)
+        error_message = 'Username and password do not match when connecting to MISP, or incorrect MISP permission'
+        try:
+            is_logged_in, misp_error_message = user.misp_login()
+            if len(misp_error_message) > 0:
+                error_message = misp_error_message
+            if is_logged_in:
+                login_user(user)
+                return redirect(url_for('index'))
+        except requests.exceptions.SSLError:
+            return redirect(url_for('login', auth_error=True, auth_error_message='MISP cannot be reached for authentication'))
+        return redirect(url_for('login', auth_error=True, auth_error_message=error_message))
+    else:
+        auth_error = request.args.get('auth_error', False)
+        auth_error_message = request.args.get('auth_error_message', '')
+        return render_template('login.html', title='Login', form=form, authError=auth_error, authErrorMessage=auth_error_message)
+
+class LoginForm(Form):
+    """
+    WTForm form object. This object defines form fields in the login endpoint.
+    """
+    username = StringField('Username', [validators.Length(max=255)])
+    password = PasswordField('Password', [validators.Length(max=255)])
+    submit = SubmitField('Sign In')
-eventNumber = 0
##########
## UTIL ##

@@ -66,8 +266,6 @@ class LogItem():
    FIELDNAME_ORDER = []
    FIELDNAME_ORDER_HEADER = []

-    FIELDNAME_ORDER.append("Time")
-    FIELDNAME_ORDER_HEADER.append("Time")
-
    for item in json.loads(cfg.get('Dashboard', 'fieldname_order')):
        if type(item) is list:
            FIELDNAME_ORDER_HEADER.append(" | ".join(item))
@@ -75,13 +273,10 @@ class LogItem():
            FIELDNAME_ORDER_HEADER.append(item)
            FIELDNAME_ORDER.append(item)

-    def __init__(self, feed):
-        self.time = strftime("%H:%M:%S", now())
-        #FIXME Parse feed message?
-        self.fields = []
-        self.fields.append(self.time)
-        for f in feed:
-            self.fields.append(f)
+    def __init__(self, feed, filters={}):
+        self.filters = filters
+        self.feed = feed
+        self.fields = []
    def get_head_row(self):
        to_ret = []
@@ -90,34 +285,72 @@ class LogItem():
        return to_ret

    def get_row(self):
+        if not self.pass_filter():
+            return False
+
        to_ret = {}
-        #Number to keep them sorted (jsonify sort keys)
-        for item in range(len(LogItem.FIELDNAME_ORDER)):
-            try:
-                to_ret[item] = self.fields[item]
-            except IndexError: # not enough field in rcv item
-                to_ret[item] = ''
+        for i, field in enumerate(json.loads(cfg.get('Dashboard', 'fieldname_order'))):
+            if type(field) is list:
+                to_join = []
+                for subField in field:
+                    to_join.append(str(util.getFields(self.feed, subField)))
+                to_add = cfg.get('Dashboard', 'char_separator').join(to_join)
+            else:
+                to_add = util.getFields(self.feed, field)
+            to_ret[i] = to_add if to_add is not None else ''
        return to_ret

+    def pass_filter(self):
+        for filter, filterValue in self.filters.items():
+            jsonValue = util.getFields(self.feed, filter)
+            if jsonValue is None or jsonValue != filterValue:
+                return False
+        return True
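pass_filter above drops a row unless every entry of the filters dict matches the corresponding feed field, where the keys use whatever path syntax util.getFields understands (the flat-key form below is an assumption). The dict itself arrives as a JSON 'filters' cookie, as the /_logs route further down shows. Illustration:

    feed = {'categ': 'Network activity', 'value': '198.51.100.1'}
    item = LogItem(feed, filters={'categ': 'Payload delivery'})
    item.get_row()   # -> False: 'Network activity' != 'Payload delivery'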
class EventMessage():
    # Suppose the event message is a json with the format {name: 'feedName', log:'logData'}
-    def __init__(self, msg):
-        msg = msg.decode('utf8')
-        try:
-            jsonMsg = json.loads(msg)
-        except json.JSONDecodeError as e:
-            logger.error(e)
-            jsonMsg = { 'name': "undefined" ,'log': json.loads(msg) }
+    def __init__(self, msg, filters):
+        if not isinstance(msg, dict):
+            try:
+                jsonMsg = json.loads(msg)
+                jsonMsg['log'] = json.loads(jsonMsg['log'])
+            except json.JSONDecodeError as e:
+                logger.error(e)
+                jsonMsg = { 'name': "undefined" ,'log': json.loads(msg) }
+        else:
+            jsonMsg = msg

-        self.feedName = jsonMsg['name']
+        self.name = jsonMsg['name']
        self.zmqName = jsonMsg['zmqName']
-        self.feed = json.loads(jsonMsg['log'])
-        self.feed = LogItem(self.feed).get_row()

+        if self.name == 'Attribute':
+            self.feed = jsonMsg['log']
+            self.feed = LogItem(self.feed, filters).get_row()
+        elif self.name == 'ObjectAttribute':
+            self.feed = jsonMsg['log']
+            self.feed = LogItem(self.feed, filters).get_row()
+        else:
+            self.feed = jsonMsg['log']
+
+    def to_json_ev(self):
+        if self.feed is not False:
+            to_ret = { 'log': self.feed, 'name': self.name, 'zmqName': self.zmqName }
+            return 'data: {}\n\n'.format(json.dumps(to_ret))
+        else:
+            return ''

    def to_json(self):
-        to_ret = { 'log': self.feed, 'feedName': self.feedName, 'zmqName': self.zmqName }
-        return 'data: {}\n\n'.format(json.dumps(to_ret))
+        if self.feed is not False:
+            to_ret = { 'log': self.feed, 'name': self.name, 'zmqName': self.zmqName }
+            return json.dumps(to_ret)
+        else:
+            return ''
+
+    def to_dict(self):
+        return {'log': self.feed, 'name': self.name, 'zmqName': self.zmqName}
###########
## ROUTE ##

@@ -126,6 +359,7 @@ class EventMessage():
''' MAIN ROUTE '''

@app.route("/")
+@login_required
def index():
    ratioCorrection = 88
    pannelSize = [
@@ -146,8 +380,14 @@ def index():
        zoomlevel=cfg.getint('Dashboard' ,'zoomlevel')
    )

+@app.route('/favicon.ico')
+@login_required
+def favicon():
+    return send_from_directory(os.path.join(app.root_path, 'static'),
+                               'favicon.ico', mimetype='image/vnd.microsoft.icon')
+
@app.route("/geo")
+@login_required
def geo():
    return render_template('geo.html',
        zoomlevel=cfg.getint('GEO' ,'zoomlevel'),
@@ -155,6 +395,7 @@ def geo():
    )

@app.route("/contrib")
+@login_required
def contrib():
    categ_list = contributor_helper.categories_in_datatable
    categ_list_str = [ s[0].upper() + s[1:].replace('_', ' ') for s in categ_list]
@@ -206,12 +447,14 @@ def contrib():
    )

@app.route("/users")
+@login_required
def users():
    return render_template('users.html',
    )

@app.route("/trendings")
+@login_required
def trendings():
    maxNum = request.args.get('maxNum')
    try:
@@ -228,30 +471,70 @@ def trendings():
''' INDEX '''

@app.route("/_logs")
+@login_required
def logs():
-    return Response(event_stream_log(), mimetype="text/event-stream")
+    if request.accept_mimetypes.accept_json or request.method == 'POST':
+        key = 'Attribute'
+        j = live_helper.get_stream_log_cache(key)
+        to_ret = []
+        for item in j:
+            filters = request.cookies.get('filters', '{}')
+            filters = json.loads(filters)
+            ev = EventMessage(item, filters)
+            if ev is not None:
+                dico = ev.to_dict()
+                if dico['log'] != False:
+                    to_ret.append(dico)
+        return jsonify(to_ret)
+    else:
+        return Response(stream_with_context(event_stream_log()), mimetype="text/event-stream")

@app.route("/_maps")
+@login_required
def maps():
-    return Response(event_stream_maps(), mimetype="text/event-stream")
+    if request.accept_mimetypes.accept_json or request.method == 'POST':
+        key = 'Map'
+        j = live_helper.get_stream_log_cache(key)
+        return jsonify(j)
+    else:
+        return Response(event_stream_maps(), mimetype="text/event-stream")

@app.route("/_get_log_head")
+@login_required
def getLogHead():
    return json.dumps(LogItem('').get_head_row())
def event_stream_log():
-    for msg in subscriber_log.listen():
-        content = msg['data']
-        yield EventMessage(content).to_json()
+    subscriber_log = redis_server_log.pubsub(ignore_subscribe_messages=True)
+    subscriber_log.subscribe(live_helper.CHANNEL)
+    try:
+        for msg in subscriber_log.listen():
+            filters = request.cookies.get('filters', '{}')
+            filters = json.loads(filters)
+            content = msg['data']
+            ev = EventMessage(content, filters)
+            if ev is not None:
+                yield ev.to_json_ev()
+            else:
+                pass
+    except GeneratorExit:
+        subscriber_log.unsubscribe()

def event_stream_maps():
-    for msg in subscriber_map.listen():
-        content = msg['data'].decode('utf8')
-        yield 'data: {}\n\n'.format(content)
+    subscriber_map = redis_server_map.pubsub(ignore_subscribe_messages=True)
+    subscriber_map.psubscribe(cfg.get('RedisMap', 'channelDisp'))
+    try:
+        for msg in subscriber_map.listen():
+            content = msg['data']
+            to_ret = 'data: {}\n\n'.format(content)
+            yield to_ret
+    except GeneratorExit:
+        subscriber_map.unsubscribe()
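Both generators above implement Server-Sent Events: each payload is framed as a 'data: ...' line followed by a blank line, and GeneratorExit (the client hanging up) triggers the unsubscribe. A rough client-side sketch (the port is the usual dashboard default but an assumption here, and with authentication enabled a session cookie would also be required):

    import requests

    with requests.get('http://127.0.0.1:8001/_logs', stream=True) as resp:
        for line in resp.iter_lines():
            if line.startswith(b'data: '):
                print(line[len(b'data: '):])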
''' GEO '''

@app.route("/_getTopCoord")
+@login_required
def getTopCoord():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -261,6 +544,7 @@ def getTopCoord():
    return jsonify(data)

@app.route("/_getHitMap")
+@login_required
def getHitMap():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -270,6 +554,7 @@ def getHitMap():
    return jsonify(data)

@app.route("/_getCoordsByRadius")
+@login_required
def getCoordsByRadius():
    try:
        dateStart = datetime.datetime.fromtimestamp(float(request.args.get('dateStart')))
@@ -286,41 +571,55 @@ def getCoordsByRadius():
''' CONTRIB '''

@app.route("/_getLastContributors")
+@login_required
def getLastContributors():
    return jsonify(contributor_helper.getLastContributorsFromRedis())

@app.route("/_eventStreamLastContributor")
+@login_required
def getLastContributor():
    return Response(eventStreamLastContributor(), mimetype="text/event-stream")

@app.route("/_eventStreamAwards")
+@login_required
def getLastStreamAwards():
    return Response(eventStreamAwards(), mimetype="text/event-stream")

def eventStreamLastContributor():
-    for msg in subscriber_lastContrib.listen():
-        content = msg['data'].decode('utf8')
-        contentJson = json.loads(content)
-        lastContribJson = json.loads(contentJson['log'])
-        org = lastContribJson['org']
-        to_return = contributor_helper.getContributorFromRedis(org)
-        epoch = lastContribJson['epoch']
-        to_return['epoch'] = epoch
-        yield 'data: {}\n\n'.format(json.dumps(to_return))
+    subscriber_lastContrib = redis_server_log.pubsub(ignore_subscribe_messages=True)
+    subscriber_lastContrib.psubscribe(cfg.get('RedisLog', 'channelLastContributor'))
+    try:
+        for msg in subscriber_lastContrib.listen():
+            content = msg['data']
+            contentJson = json.loads(content)
+            lastContribJson = json.loads(contentJson['log'])
+            org = lastContribJson['org']
+            to_return = contributor_helper.getContributorFromRedis(org)
+            epoch = lastContribJson['epoch']
+            to_return['epoch'] = epoch
+            yield 'data: {}\n\n'.format(json.dumps(to_return))
+    except GeneratorExit:
+        subscriber_lastContrib.unsubscribe()

def eventStreamAwards():
-    for msg in subscriber_lastAwards.listen():
-        content = msg['data'].decode('utf8')
-        contentJson = json.loads(content)
-        lastAwardJson = json.loads(contentJson['log'])
-        org = lastAwardJson['org']
-        to_return = contributor_helper.getContributorFromRedis(org)
-        epoch = lastAwardJson['epoch']
-        to_return['epoch'] = epoch
-        to_return['award'] = lastAwardJson['award']
-        yield 'data: {}\n\n'.format(json.dumps(to_return))
+    subscriber_lastAwards = redis_server_log.pubsub(ignore_subscribe_messages=True)
+    subscriber_lastAwards.psubscribe(cfg.get('RedisLog', 'channelLastAwards'))
+    try:
+        for msg in subscriber_lastAwards.listen():
+            content = msg['data']
+            contentJson = json.loads(content)
+            lastAwardJson = json.loads(contentJson['log'])
+            org = lastAwardJson['org']
+            to_return = contributor_helper.getContributorFromRedis(org)
+            epoch = lastAwardJson['epoch']
+            to_return['epoch'] = epoch
+            to_return['award'] = lastAwardJson['award']
+            yield 'data: {}\n\n'.format(json.dumps(to_return))
+    except GeneratorExit:
+        subscriber_lastAwards.unsubscribe()
@app.route("/_getTopContributor") @app.route("/_getTopContributor")
@login_required
def getTopContributor(suppliedDate=None, maxNum=100): def getTopContributor(suppliedDate=None, maxNum=100):
if suppliedDate is None: if suppliedDate is None:
try: try:
@ -334,6 +633,7 @@ def getTopContributor(suppliedDate=None, maxNum=100):
return jsonify(data) return jsonify(data)
@app.route("/_getFameContributor") @app.route("/_getFameContributor")
@login_required
def getFameContributor(): def getFameContributor():
try: try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date'))) date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -344,6 +644,7 @@ def getFameContributor():
return getTopContributor(suppliedDate=date, maxNum=10) return getTopContributor(suppliedDate=date, maxNum=10)
@app.route("/_getFameQualContributor") @app.route("/_getFameQualContributor")
@login_required
def getFameQualContributor(): def getFameQualContributor():
try: try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date'))) date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -354,10 +655,12 @@ def getFameQualContributor():
return getTopContributor(suppliedDate=date, maxNum=10) return getTopContributor(suppliedDate=date, maxNum=10)
@app.route("/_getTop5Overtime") @app.route("/_getTop5Overtime")
@login_required
def getTop5Overtime(): def getTop5Overtime():
return jsonify(contributor_helper.getTop5OvertimeFromRedis()) return jsonify(contributor_helper.getTop5OvertimeFromRedis())
@app.route("/_getOrgOvertime") @app.route("/_getOrgOvertime")
@login_required
def getOrgOvertime(): def getOrgOvertime():
try: try:
org = request.args.get('org') org = request.args.get('org')
@ -366,6 +669,7 @@ def getOrgOvertime():
return jsonify(contributor_helper.getOrgOvertime(org)) return jsonify(contributor_helper.getOrgOvertime(org))
@app.route("/_getCategPerContrib") @app.route("/_getCategPerContrib")
@login_required
def getCategPerContrib(): def getCategPerContrib():
try: try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date'))) date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -375,6 +679,7 @@ def getCategPerContrib():
return jsonify(contributor_helper.getCategPerContribFromRedis(date)) return jsonify(contributor_helper.getCategPerContribFromRedis(date))
@app.route("/_getLatestAwards") @app.route("/_getLatestAwards")
@login_required
def getLatestAwards(): def getLatestAwards():
try: try:
date = datetime.datetime.fromtimestamp(float(request.args.get('date'))) date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@ -384,10 +689,12 @@ def getLatestAwards():
    return jsonify(contributor_helper.getLastAwardsFromRedis())

@app.route("/_getAllOrg")
+@login_required
def getAllOrg():
    return jsonify(contributor_helper.getAllOrgFromRedis())

@app.route("/_getOrgRank")
+@login_required
def getOrgRank():
    try:
        org = request.args.get('org')
@@ -396,6 +703,7 @@ def getOrgRank():
    return jsonify(contributor_helper.getCurrentOrgRankFromRedis(org))

@app.route("/_getContributionOrgStatus")
+@login_required
def getContributionOrgStatus():
    try:
        org = request.args.get('org')
@@ -404,6 +712,7 @@ def getContributionOrgStatus():
    return jsonify(contributor_helper.getCurrentContributionStatus(org))

@app.route("/_getHonorBadges")
+@login_required
def getHonorBadges():
    try:
        org = request.args.get('org')
@@ -412,6 +721,7 @@ def getHonorBadges():
    return jsonify(contributor_helper.getOrgHonorBadges(org))

@app.route("/_getTrophies")
+@login_required
def getTrophies():
    try:
        org = request.args.get('org')
@@ -419,10 +729,17 @@ def getTrophies():
        org = ''
    return jsonify(contributor_helper.getOrgTrophies(org))

+@app.route("/_getAllOrgsTrophyRanking")
+@app.route("/_getAllOrgsTrophyRanking/<string:categ>")
+@login_required
+def getAllOrgsTrophyRanking(categ=None):
+    return jsonify(contributor_helper.getAllOrgsTrophyRanking(categ))
''' USERS '''

@app.route("/_getUserLogins")
+@login_required
def getUserLogins():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -433,7 +750,13 @@ def getUserLogins():
    data = users_helper.getUserLoginsForPunchCard(date, org)
    return jsonify(data)

+@app.route("/_getAllLoggedOrg")
+@login_required
+def getAllLoggedOrg():
+    return jsonify(users_helper.getAllOrg())
+
@app.route("/_getTopOrglogin")
+@login_required
def getTopOrglogin():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -444,6 +767,7 @@ def getTopOrglogin():
    return jsonify(data)

@app.route("/_getLoginVSCOntribution")
+@login_required
def getLoginVSCOntribution():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -454,6 +778,7 @@ def getLoginVSCOntribution():
    return jsonify(data)

@app.route("/_getUserLoginsAndContribOvertime")
+@login_required
def getUserLoginsAndContribOvertime():
    try:
        date = datetime.datetime.fromtimestamp(float(request.args.get('date')))
@@ -466,6 +791,7 @@ def getUserLoginsAndContribOvertime():
''' TRENDINGS '''

@app.route("/_getTrendingEvents")
+@login_required
def getTrendingEvents():
    try:
        dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@@ -475,10 +801,11 @@ def getTrendingEvents():
        dateE = datetime.datetime.now()

    specificLabel = request.args.get('specificLabel')
-    data = trendings_helper.getTrendingEvents(dateS, dateE, specificLabel)
+    data = trendings_helper.getTrendingEvents(dateS, dateE, specificLabel, topNum=int(request.args.get('topNum', 10)))
    return jsonify(data)

@app.route("/_getTrendingCategs")
+@login_required
def getTrendingCategs():
    try:
        dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@@ -488,10 +815,11 @@ def getTrendingCategs():
        dateE = datetime.datetime.now()

-    data = trendings_helper.getTrendingCategs(dateS, dateE)
+    data = trendings_helper.getTrendingCategs(dateS, dateE, topNum=int(request.args.get('topNum', 10)))
    return jsonify(data)

@app.route("/_getTrendingTags")
+@login_required
def getTrendingTags():
    try:
        dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@@ -501,10 +829,11 @@ def getTrendingTags():
        dateE = datetime.datetime.now()

-    data = trendings_helper.getTrendingTags(dateS, dateE)
+    data = trendings_helper.getTrendingTags(dateS, dateE, topNum=int(request.args.get('topNum', 10)))
    return jsonify(data)
@app.route("/_getTrendingSightings") @app.route("/_getTrendingSightings")
@login_required
def getTrendingSightings(): def getTrendingSightings():
try: try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS'))) dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -517,6 +846,7 @@ def getTrendingSightings():
return jsonify(data) return jsonify(data)
@app.route("/_getTrendingDisc") @app.route("/_getTrendingDisc")
@login_required
def getTrendingDisc(): def getTrendingDisc():
try: try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS'))) dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -530,6 +860,7 @@ def getTrendingDisc():
return jsonify(data) return jsonify(data)
@app.route("/_getTypeaheadData") @app.route("/_getTypeaheadData")
@login_required
def getTypeaheadData(): def getTypeaheadData():
try: try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS'))) dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -542,6 +873,7 @@ def getTypeaheadData():
return jsonify(data) return jsonify(data)
@app.route("/_getGenericTrendingOvertime") @app.route("/_getGenericTrendingOvertime")
@login_required
def getGenericTrendingOvertime(): def getGenericTrendingOvertime():
try: try:
dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS'))) dateS = datetime.datetime.fromtimestamp(float(request.args.get('dateS')))
@ -551,8 +883,26 @@ def getGenericTrendingOvertime():
dateE = datetime.datetime.now() dateE = datetime.datetime.now()
choice = request.args.get('choice', 'events') choice = request.args.get('choice', 'events')
data = trendings_helper.getGenericTrendingOvertime(dateS, dateE, choice) data = trendings_helper.getGenericTrendingOvertime(dateS, dateE, choice=choice)
return jsonify(data) return jsonify(data)
 if __name__ == '__main__':
-    app.run(host=server_host, port=server_port, threaded=True)
+    try:
+        if bool(server_ssl) is True:
+            if server_ssl_cert and server_ssl_key:
+                server_ssl_context = (server_ssl_cert, server_ssl_key)
+            else:
+                server_ssl_context = 'adhoc'
+        else:
+            server_ssl_context = None
+        app.run(host=server_host,
+                port=server_port,
+                ssl_context=server_ssl_context,
+                debug=server_debug,
+                threaded=True)
+    except OSError as error:
+        if error.errno == 98:
+            print("\n\n\nAddress already in use, the defined port is: " + str(server_port))
+        else:
+            print(str(error))
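The trending routes above share one calling convention: `dateS` and `dateE` are epoch seconds (they are fed to `datetime.fromtimestamp`), and this merge adds an optional `topNum` parameter defaulting to 10. A minimal client-side sketch; the concrete URL and values are illustrative, not part of the patch:

```javascript
// Query the last 7 days of trending events from the dashboard.
var dateE = Math.floor(Date.now() / 1000);   // epoch seconds, as fromtimestamp() expects
var dateS = dateE - 7 * 24 * 60 * 60;        // one week back
$.getJSON('/_getTrendingEvents', { dateS: dateS, dateE: dateE, topNum: 10 }, function(data) {
    console.log(data);                       // jsonify()'d list built by trendings_helper
});
```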

@@ -1,11 +1,34 @@
 #!/usr/bin/env bash
-set -x
+#set -x

 GREEN="\\033[1;32m"
 DEFAULT="\\033[0;39m"
 RED="\\033[1;31m"

+function wait_until_redis_is_ready {
+    redis_not_ready=true
+    while $redis_not_ready; do
+        if checking_redis; then
+            redis_not_ready=false;
+        else
+            sleep 1
+        fi
+    done
+    echo -e $GREEN"* Redis 6250 is running"$DEFAULT
+}
+
+function checking_redis {
+    flag_redis=0
+    bash -c 'redis-cli -p 6250 PING | grep "PONG" &> /dev/null'
+    if [ ! $? == 0 ]; then
+        echo -e $RED"Redis 6250 not ready"$DEFAULT
+        flag_redis=1
+    fi
+    sleep 0.1
+    return $flag_redis;
+}
+
 # Getting CWD where bash script resides
 DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
 DASH_HOME="${DIR}"

@@ -20,12 +43,25 @@ else
     exit 1
 fi

-[ ! -f "`which redis-server`" ] && echo "'redis-server' is not installed/not on PATH. Please fix and run again." && exit 1
+if [[ -f "/etc/redhat-release" ]]; then
+    echo "You are running a RedHat flavour. Detecting scl potential..."
+    if [[ -f "/usr/bin/scl" ]]; then
+        echo "scl detected, checking for redis-server"
+        SCL_REDIS=$(scl -l|grep rh-redis)
+        if [[ ! -z $SCL_REDIS ]]; then
+            echo "We detected: ${SCL_REDIS} acting accordingly"
+            REDIS_RUN="/usr/bin/scl enable ${SCL_REDIS}"
+        fi
+    else
+        echo "redis-server seems not to be installed in scl, perhaps system-wide, testing."
+        [ ! -f "`which redis-server`" ] && echo "'redis-server' is not installed/not on PATH. Please fix and run again." && exit 1
+    fi
+else
+    [ ! -f "`which redis-server`" ] && echo "'redis-server' is not installed/not on PATH. Please fix and run again." && exit 1
+fi

 netstat -an |grep LISTEN |grep 6250 |grep -v tcp6 ; check_redis_port=$?
 netstat -an |grep LISTEN |grep 8001 |grep -v tcp6 ; check_dashboard_port=$?
-ps auxw |grep zmq_subscriber.py |grep -v grep ; check_zmq_subscriber=$?
-ps auxw |grep zmq_dispatcher.py |grep -v grep ; check_zmq_dispatcher=$?

 # Configure accordingly, remember: 0.0.0.0 exposes to every active IP interface, play safe and bind it to something you trust and know
 export FLASK_APP=server.py

@@ -37,32 +73,25 @@ conf_dir="config/"
 sleep 0.1
 if [ "${check_redis_port}" == "1" ]; then
     echo -e $GREEN"\t* Launching Redis servers"$DEFAULT
-    redis-server ${conf_dir}6250.conf &
+    if [[ ! -z $REDIS_RUN ]]; then
+        $REDIS_RUN "redis-server ${conf_dir}6250.conf" &
+    else
+        redis-server ${conf_dir}6250.conf &
+    fi
 else
     echo -e $RED"\t* NOT starting Redis server, made a very unreliable check on port 6250, and something seems to be there… please double check if this is good!"$DEFAULT
 fi
 sleep 0.1

-if [ "${check_zmq_subscriber}" == "1" ]; then
-    echo -e $GREEN"\t* Launching zmq subscriber"$DEFAULT
-    ${ENV_PY} ./zmq_subscriber.py &
-else
-    echo -e $RED"\t* NOT starting zmq subscriber, made a rather unrealiable ps -auxw | grep for zmq_subscriber.py, and something seems to be there… please double check if this is good!"$DEFAULT
-fi
-sleep 0.1
-
-if [ "${check_zmq_dispatcher}" == "1" ]; then
-    echo -e $GREEN"\t* Launching zmq dispatcher"$DEFAULT
-    ${ENV_PY} ./zmq_dispatcher.py &
-else
-    echo -e $RED"\t* NOT starting zmq dispatcher, made a rather unrealiable ps -auxw | grep for zmq_dispatcher.py, and something seems to be there… please double check if this is good!"$DEFAULT
-fi
-sleep 0.1
+wait_until_redis_is_ready;

 if [ "${check_dashboard_port}" == "1" ]; then
     echo -e $GREEN"\t* Launching flask server"$DEFAULT
     ${ENV_PY} ./server.py &
 else
     echo -e $RED"\t* NOT starting flask server, made a very unreliable check on port 8001, and something seems to be there… please double check if this is good!"$DEFAULT
 fi
+sleep 0.1
+
+sudo -u zmqs /bin/bash ${DIR}/start_zmq.sh &

start_zmq.sh Executable file

@@ -0,0 +1,50 @@
#!/usr/bin/env bash
#set -x

GREEN="\\033[1;32m"
DEFAULT="\\033[0;39m"
RED="\\033[1;31m"

# Getting CWD where bash script resides
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
DASH_HOME="${DIR}"
SCREEN_NAME="Misp_Dashboard"

cd ${DASH_HOME}

if [ -e "${DIR}/DASHENV/bin/python" ]; then
    echo "dashboard virtualenv seems to exist, good"
    ENV_PY="${DIR}/DASHENV/bin/python"
else
    echo "Please make sure you have a dashboard environment, au revoir"
    exit 1
fi

PID_SCREEN=$(screen -ls | grep ${SCREEN_NAME} | cut -f2 | cut -d. -f1)
if [[ $PID_SCREEN ]]; then
    echo -e $RED"* A screen '$SCREEN_NAME' is already launched"$DEFAULT
    echo -e $GREEN"Killing $PID_SCREEN"$DEFAULT;
    kill $PID_SCREEN
else
    echo 'No screen detected'
fi

screen -dmS ${SCREEN_NAME}

ps auxw |grep zmq_subscriber.py |grep -v grep ; check_zmq_subscriber=$?
ps auxw |grep zmq_dispatcher.py |grep -v grep ; check_zmq_dispatcher=$?

sleep 0.1
if [ "${check_zmq_subscriber}" == "1" ]; then
    echo -e $GREEN"\t* Launching zmq subscribers"$DEFAULT
    screen -S "Misp_Dashboard" -X screen -t "zmq-subscribers" bash -c ${ENV_PY}' ./zmq_subscribers.py; read x'
else
    echo -e $RED"\t* NOT starting zmq subscribers, made a rather unreliable ps -auxw | grep for zmq_subscriber.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi
sleep 0.1
if [ "${check_zmq_dispatcher}" == "1" ]; then
    echo -e $GREEN"\t* Launching zmq dispatcher"$DEFAULT
    screen -S "Misp_Dashboard" -X screen -t "zmq-dispatcher" bash -c ${ENV_PY}' ./zmq_dispatcher.py; read x'
else
    echo -e $RED"\t* NOT starting zmq dispatcher, made a rather unreliable ps -auxw | grep for zmq_dispatcher.py, and something seems to be there… please double check if this is good!"$DEFAULT
fi

static/css/jquery-ui.min.css vendored Normal file

File diff suppressed because one or more lines are too long

static/css/jquery.dataTables.min.css vendored Normal file

File diff suppressed because one or more lines are too long

@@ -0,0 +1,314 @@
.selected-path-container {
padding-left: 10px;
border: 1px solid #DCC896;
background: rgb(250, 240, 210);
border-radius: 4px;
margin-bottom: 0px;
}
.group-conditions > button[data-not="group"].active {
color: #FFF;
background-color: #C9302C;
border-color: #AC2925;
}
.query-builder, .query-builder * {
margin: 0;
padding: 0;
box-sizing: border-box;
}
.query-builder {
font-family: sans-serif;
}
.query-builder .hide {
display: none;
}
.query-builder .pull-right {
float: right !important;
}
.query-builder .btn {
text-transform: none;
display: inline-block;
padding: 6px 12px;
margin-bottom: 0px;
font-size: 14px;
font-weight: 400;
line-height: 1.42857;
text-align: center;
white-space: nowrap;
vertical-align: middle;
touch-action: manipulation;
cursor: pointer;
user-select: none;
background-image: none;
border: 1px solid transparent;
border-radius: 4px;
}
.query-builder .btn.focus, .query-builder .btn:focus, .query-builder .btn:hover {
color: #333;
text-decoration: none;
}
.query-builder .btn.active, .query-builder .btn:active {
background-image: none;
outline: 0px none;
box-shadow: 0px 3px 5px rgba(0, 0, 0, 0.125) inset;
}
.query-builder .btn-success {
color: #FFF;
background-color: #5CB85C;
border-color: #4CAE4C;
}
.query-builder .btn-primary {
color: #FFF;
background-color: #337AB7;
border-color: #2E6DA4;
}
.query-builder .btn-danger {
color: #FFF;
background-color: #D9534F;
border-color: #D43F3A;
}
.query-builder .btn-success.active, .query-builder .btn-success.focus,
.query-builder .btn-success:active, .query-builder .btn-success:focus,
.query-builder .btn-success:hover {
color: #FFF;
background-color: #449D44;
border-color: #398439;
}
.query-builder .btn-primary.active, .query-builder .btn-primary.focus,
.query-builder .btn-primary:active, .query-builder .btn-primary:focus,
.query-builder .btn-primary:hover {
color: #FFF;
background-color: #286090;
border-color: #204D74;
}
.query-builder .btn-danger.active, .query-builder .btn-danger.focus,
.query-builder .btn-danger:active, .query-builder .btn-danger:focus,
.query-builder .btn-danger:hover {
color: #FFF;
background-color: #C9302C;
border-color: #AC2925;
}
.query-builder .btn-group {
position: relative;
display: inline-block;
vertical-align: middle;
}
.query-builder .btn-group > .btn {
position: relative;
float: left;
}
.query-builder .btn-group > .btn:first-child {
margin-left: 0px;
}
.query-builder .btn-group > .btn:first-child:not(:last-child) {
border-top-right-radius: 0px;
border-bottom-right-radius: 0px;
}
.query-builder .btn-group > .btn:last-child:not(:first-child) {
border-top-left-radius: 0px;
border-bottom-left-radius: 0px;
}
.query-builder .btn-group .btn + .btn, .query-builder .btn-group .btn + .btn-group,
.query-builder .btn-group .btn-group + .btn, .query-builder .btn-group .btn-group + .btn-group {
margin-left: -1px;
}
.query-builder .btn-xs, .query-builder .btn-group-xs > .btn {
padding: 1px 5px;
line-height: 1.5;
border-radius: 3px;
}
/*!
* jQuery QueryBuilder 2.5.2
* Copyright 2014-2018 Damien "Mistic" Sorel (http://www.strangeplanet.fr)
* Licensed under MIT (https://opensource.org/licenses/MIT)
*/
.query-builder .rules-group-container, .query-builder .rule-container, .query-builder .rule-placeholder {
position: relative;
margin: 4px 0;
border-radius: 5px;
padding: 5px;
border: 1px solid #EEE;
background: rgba(255, 255, 255, 0.9);
}
.query-builder .rule-container .rule-filter-container,
.query-builder .rule-container .rule-operator-container,
.query-builder .rule-container .rule-value-container, .query-builder .error-container, .query-builder .drag-handle {
display: inline-block;
margin: 0 5px 0 0;
vertical-align: middle;
}
.query-builder .rules-group-container {
padding: 10px;
padding-bottom: 6px;
border: 1px solid #DCC896;
background: rgba(250, 240, 210, 0.5);
}
.query-builder .rules-group-header {
margin-bottom: 10px;
}
.query-builder .rules-group-header .group-conditions .btn.readonly:not(.active),
.query-builder .rules-group-header .group-conditions input[name$='_cond'] {
border: 0;
clip: rect(0 0 0 0);
height: 1px;
margin: -1px;
overflow: hidden;
padding: 0;
position: absolute;
width: 1px;
white-space: nowrap;
}
.query-builder .rules-group-header .group-conditions .btn.readonly {
border-radius: 3px;
}
.query-builder .rules-list {
list-style: none;
padding: 0 0 0 15px;
margin: 0;
}
.query-builder .rule-value-container {
border-left: 1px solid #DDD;
padding-left: 5px;
}
.query-builder .rule-value-container label {
margin-bottom: 0;
font-weight: normal;
}
.query-builder .rule-value-container label.block {
display: block;
}
.query-builder .rule-value-container select,
.query-builder .rule-value-container input[type='text'],
.query-builder .rule-value-container input[type='number'] {
padding: 1px;
}
.query-builder .error-container {
display: none;
cursor: help;
color: #F00;
}
.query-builder .has-error {
background-color: #FDD;
border-color: #F99;
}
.query-builder .has-error .error-container {
display: inline-block !important;
}
.query-builder .rules-list > *::before, .query-builder .rules-list > *::after {
content: '';
position: absolute;
left: -10px;
width: 10px;
height: calc(50% + 4px);
border-color: #CCC;
border-style: solid;
}
.query-builder .rules-list > *::before {
top: -4px;
border-width: 0 0 2px 2px;
}
.query-builder .rules-list > *::after {
top: 50%;
border-width: 0 0 0 2px;
}
.query-builder .rules-list > *:first-child::before {
top: -12px;
height: calc(50% + 14px);
}
.query-builder .rules-list > *:last-child::before {
border-radius: 0 0 0 4px;
}
.query-builder .rules-list > *:last-child::after {
display: none;
}
.query-builder.bt-checkbox-glyphicons .checkbox input[type='checkbox']:checked + label::after {
font-family: 'Glyphicons Halflings';
content: '\e013';
}
.query-builder.bt-checkbox-glyphicons .checkbox label::after {
padding-left: 4px;
padding-top: 2px;
font-size: 9px;
}
.query-builder .error-container + .tooltip .tooltip-inner {
color: #F99 !important;
}
.query-builder p.filter-description {
margin: 5px 0 0 0;
background: #D9EDF7;
border: 1px solid #BCE8F1;
color: #31708F;
border-radius: 5px;
padding: 2.5px 5px;
font-size: .8em;
}
.query-builder .rules-group-header [data-invert] {
margin-left: 5px;
}
.query-builder .drag-handle {
cursor: move;
vertical-align: middle;
margin-left: 5px;
}
.query-builder .dragging {
position: fixed;
opacity: .5;
z-index: 100;
}
.query-builder .dragging::before, .query-builder .dragging::after {
display: none;
}
.query-builder .rule-placeholder {
border: 1px dashed #BBB;
opacity: .7;
}


@@ -3,6 +3,13 @@
     height: auto;
 }

+@media (min-width: 768px) {
+    .modal-xl {
+        width: 90%;
+        max-width:1500px;
+    }
+}
+
 .successCell {
     background-color: #dff0d8 !important
 }

@@ -66,6 +73,11 @@
     color: black;
 }

+.higlightRowInTable {
+    background-color: #dff0d8 !important;
+    color: black;
+}
+
 .centerInBtn {
     float: left;
     text-align: center;

@@ -186,6 +198,19 @@
     margin-right:auto;
 }

+.allOrgRankingDiv {
+    overflow: scroll;
+    max-height: 600px;
+    background-color: #fff;
+    -webkit-background-clip: padding-box;
+    background-clip: padding-box;
+    border: 1px solid #ccc;
+    border: 1px solid rgba(0,0,0,.2);
+    border-radius: 6px;
+    -webkit-box-shadow: 0 5px 10px rgba(0,0,0,.2);
+    box-shadow: 0 5px 10px rgba(0,0,0,.2);
+}
+
 small {
     font-size: 100%;
     font-weight: bold;


@@ -9,6 +9,8 @@ var sec_before_reload = refresh_speed;
 var plotLineChart;
 var source_awards;
 var source_lastContrib;
+var last_added_contrib;
+var timeout_last_added_contrib;

 /* CONFIG */
 var maxRank = 16;

@@ -150,6 +152,7 @@ function getMonthlyRankIcon(rank, size, header) {
             img.width = size;
         }
     }
+    img.setAttribute('onerror', "this.style.display='none'");
     return img.outerHTML;
 }

@@ -165,7 +168,8 @@ function getOrgRankIcon(rank, size) {
     obj.src = rankLogoPath;
     obj.type = "image/svg"
     obj.title = org_rank_obj[rank];
-    obj.classList.add('orgRankClass')
+    obj.classList.add('orgRankClass');
+    obj.setAttribute('onerror', "this.style.display='none'");
     return obj.outerHTML;
 }

@@ -175,8 +179,9 @@ function createImg(source, size) {
     obj.width = size;
     obj.style.margin = 'auto';
     obj.src = source;
-    obj.type = "image/png"
-    obj.alt = ""
+    obj.type = "image/png";
+    obj.alt = "";
+    obj.setAttribute('onerror', "this.style.display='none'");
     return obj.outerHTML;
 }

@@ -185,10 +190,11 @@ function createTrophyImg(rank, size, categ) {
     obj.height = size;
     obj.width = size;
     obj.style.margin = 'auto';
-    obj.src = url_baseTrophyLogo+rank+'.png';;
+    obj.src = url_baseTrophyLogo+rank+'.png';
     obj.title = trophy_title[rank] + " in " + categ;
-    obj.type = "image/png"
-    obj.alt = ""
+    obj.type = "image/png";
+    obj.alt = "";
+    obj.setAttribute('onerror', "this.style.display='none'");
     return obj.outerHTML;
 }

@@ -206,6 +212,7 @@ function createHonorImg(array, size) {
         obj.style.margin = 'auto';
         obj.title = org_honor_badge_title[badgeNum];
         obj.src = url_baseHonorLogo+badgeNum+'.svg';
+        obj.setAttribute('onerror', "this.style.display='none'");
         div.appendChild(obj);
     }
     div.style.width = 32*array.length+'px';

@@ -336,39 +343,40 @@ function addLastFromJson(datatable, url) {
 }

 function addLastContributor(datatable, data, update) {
-    var date = new Date(data.epoch*1000);
-    date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
-    var to_add = [
-        date,
-        data.pnts,
-        getMonthlyRankIcon(data.rank),
-        getOrgRankIcon(data.orgRank, 60),
-        createHonorImg(data.honorBadge, 20),
-        createImg(data.logo_path, 32),
-        createOrgLink(data.org),
-    ];
-    if (update == undefined || update == false) {
-        datatable.row.add(to_add);
-        datatable.draw();
-    } else if(update == true) {
-        var row_added = false;
-        datatable.rows().every( function() {
-            if($(this.data()[6])[0].text == data.org) {
-                var node = $(datatable.row( this ).node());
-                datatable.row( this ).data( to_add );
-                if(next_effect <= new Date()) {
-                    node.effect("slide", 500);
-                    next_effect.setSeconds((new Date()).getSeconds() + 5);
-                }
-                row_added = true;
-            }
-            datatable.draw();
-        });
-        if (!row_added) {
-            var node = $(datatable.row.add(to_add).draw().node());
-            node.effect("slide", 700);
-        }
-    }
-}
+    var org = data.org;
+    if (org == last_added_contrib) {
+        let node = $('#lastTable > tbody tr:eq(0)');
+        node.addClass('higlightRowInTable');
+        update_timeout_last_added_contrib();
+    } else {
+        last_added_contrib = org;
+        var date = new Date(data.epoch*1000);
+        //date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
+        date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
+        var to_add = [
+            date,
+            data.pnts,
+            getMonthlyRankIcon(data.rank),
+            getOrgRankIcon(data.orgRank, 60),
+            createHonorImg(data.honorBadge, 20),
+            createImg(data.logo_path, 32),
+            createOrgLink(data.org),
+        ];
+        if (update === undefined || update === false) {
+            datatable.row.add(to_add);
+            datatable.draw();
+        } else if(update == true) {
+            datatable.rows().every( function() {
+                if($(this.data()[6])[0].text == data.org) {
+                    var node = $(datatable.row( this ).node());
+                    datatable.row( this ).data( to_add );
+                    node.effect("slide", 500);
+                }
+                datatable.draw();
+            });
+        }
+    }
+}
 function addAwards(datatableAwards, json, playAnim) {

@@ -381,7 +389,8 @@ function addAwards(datatableAwards, json, playAnim) {
         var award = createTrophyImg(json.award[1][1], 40, categ);
     }
     var date = new Date(json.epoch*1000);
-    date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
+    //date.toString = function() {return this.toTimeString().slice(0,-15) +' '+ this.toLocaleDateString(); };
+    date = date.getFullYear() + "-" + String(date.getMonth()+1).padStart(2, "0") + "-" + String(date.getDate()).padStart(2, "0") + "@" + String(date.getHours()).padStart(2, "0") + ":" + String(date.getMinutes()).padStart(2, "0");
     var to_add = [
         date,
         createImg(json.logo_path, 32),

@@ -549,6 +558,36 @@ function updateProgressHeader(org) {
     updateOvertakePnts();
 }

+function generate_table_ranking_on_category(categ) {
+    $.getJSON( url_getAllOrgsTrophyRanking+'/'+categ, function( data ) {
+        var body = $('#bodyTableThropyAllOrgRankingModal');
+        body.empty();
+        data.forEach(function(arr, i) {
+            var org = arr[0];
+            var points = arr[1];
+            var rank = arr[2];
+            var tr = $('<tr></tr>');
+            tr.append($('<td style="width: 100px;">'+i+'</td>'));
+            tr.append($('<td style="width: 100px;"><img src="'+url_baseTrophyLogo+rank+'.png" width="30" height="30" onerror="this.style.display=\'none\'"></td>'));
+            tr.append($('<td style="width: 200px;">'+points+'</td>'));
+            tr.append($('<td><a href="?org='+org+'">'+org+'</a></td>'));
+            if (currOrg == org) {
+                tr.addClass('selectedOrgInTable');
+            }
+            body.append(tr);
+        });
+    })
+}
+
+function update_timeout_last_added_contrib() {
+    clearTimeout(timeout_last_added_contrib);
+    timeout_last_added_contrib = setTimeout(function() {
+        let node = $('#lastTable > tbody tr:eq(0)');
+        node.removeClass('higlightRowInTable');
+        last_added_contrib = null;
+    }, 5000);
+}
+
 function showOnlyOrg() {
     datatableCateg.search( $('#orgText').text() ).draw();
 }

@@ -650,4 +689,13 @@ $(document).ready(function() {
         addAwards(datatableAwards, json, true);
         updateProgressHeader(currOrg);
     };
+
+    $('#bodyTableTrophyModalOrg input').off('click').on('click', function(e) {
+        var categ = $(this).data('category');
+        var tds = $('#bodyTableTrophyModalOrg td');
+        tds.removeClass('success');
+        $(this).parent().addClass('success');
+        generate_table_ranking_on_category(categ);
+    });
 });

static/js/doT.js Normal file

@@ -0,0 +1,144 @@
// doT.js
// 2011-2014, Laura Doktorova, https://github.com/olado/doT
// Licensed under the MIT license.
(function () {
"use strict";
var doT = {
name: "doT",
version: "1.1.1",
templateSettings: {
evaluate: /\{\{([\s\S]+?(\}?)+)\}\}/g,
interpolate: /\{\{=([\s\S]+?)\}\}/g,
encode: /\{\{!([\s\S]+?)\}\}/g,
use: /\{\{#([\s\S]+?)\}\}/g,
useParams: /(^|[^\w$])def(?:\.|\[[\'\"])([\w$\.]+)(?:[\'\"]\])?\s*\:\s*([\w$\.]+|\"[^\"]+\"|\'[^\']+\'|\{[^\}]+\})/g,
define: /\{\{##\s*([\w\.$]+)\s*(\:|=)([\s\S]+?)#\}\}/g,
defineParams:/^\s*([\w$]+):([\s\S]+)/,
conditional: /\{\{\?(\?)?\s*([\s\S]*?)\s*\}\}/g,
iterate: /\{\{~\s*(?:\}\}|([\s\S]+?)\s*\:\s*([\w$]+)\s*(?:\:\s*([\w$]+))?\s*\}\})/g,
varname: "it",
strip: true,
append: true,
selfcontained: false,
doNotSkipEncoded: false
},
template: undefined, //fn, compile template
compile: undefined, //fn, for express
log: true
}, _globals;
doT.encodeHTMLSource = function(doNotSkipEncoded) {
var encodeHTMLRules = { "&": "&#38;", "<": "&#60;", ">": "&#62;", '"': "&#34;", "'": "&#39;", "/": "&#47;" },
matchHTML = doNotSkipEncoded ? /[&<>"'\/]/g : /&(?!#?\w+;)|<|>|"|'|\//g;
return function(code) {
return code ? code.toString().replace(matchHTML, function(m) {return encodeHTMLRules[m] || m;}) : "";
};
};
_globals = (function(){ return this || (0,eval)("this"); }());
/* istanbul ignore else */
if (typeof module !== "undefined" && module.exports) {
module.exports = doT;
} else if (typeof define === "function" && define.amd) {
define(function(){return doT;});
} else {
_globals.doT = doT;
}
var startend = {
append: { start: "'+(", end: ")+'", startencode: "'+encodeHTML(" },
split: { start: "';out+=(", end: ");out+='", startencode: "';out+=encodeHTML(" }
}, skip = /$^/;
function resolveDefs(c, block, def) {
return ((typeof block === "string") ? block : block.toString())
.replace(c.define || skip, function(m, code, assign, value) {
if (code.indexOf("def.") === 0) {
code = code.substring(4);
}
if (!(code in def)) {
if (assign === ":") {
if (c.defineParams) value.replace(c.defineParams, function(m, param, v) {
def[code] = {arg: param, text: v};
});
if (!(code in def)) def[code]= value;
} else {
new Function("def", "def['"+code+"']=" + value)(def);
}
}
return "";
})
.replace(c.use || skip, function(m, code) {
if (c.useParams) code = code.replace(c.useParams, function(m, s, d, param) {
if (def[d] && def[d].arg && param) {
var rw = (d+":"+param).replace(/'|\\/g, "_");
def.__exp = def.__exp || {};
def.__exp[rw] = def[d].text.replace(new RegExp("(^|[^\\w$])" + def[d].arg + "([^\\w$])", "g"), "$1" + param + "$2");
return s + "def.__exp['"+rw+"']";
}
});
var v = new Function("def", "return " + code)(def);
return v ? resolveDefs(c, v, def) : v;
});
}
function unescape(code) {
return code.replace(/\\('|\\)/g, "$1").replace(/[\r\t\n]/g, " ");
}
doT.template = function(tmpl, c, def) {
c = c || doT.templateSettings;
var cse = c.append ? startend.append : startend.split, needhtmlencode, sid = 0, indv,
str = (c.use || c.define) ? resolveDefs(c, tmpl, def || {}) : tmpl;
str = ("var out='" + (c.strip ? str.replace(/(^|\r|\n)\t* +| +\t*(\r|\n|$)/g," ")
.replace(/\r|\n|\t|\/\*[\s\S]*?\*\//g,""): str)
.replace(/'|\\/g, "\\$&")
.replace(c.interpolate || skip, function(m, code) {
return cse.start + unescape(code) + cse.end;
})
.replace(c.encode || skip, function(m, code) {
needhtmlencode = true;
return cse.startencode + unescape(code) + cse.end;
})
.replace(c.conditional || skip, function(m, elsecase, code) {
return elsecase ?
(code ? "';}else if(" + unescape(code) + "){out+='" : "';}else{out+='") :
(code ? "';if(" + unescape(code) + "){out+='" : "';}out+='");
})
.replace(c.iterate || skip, function(m, iterate, vname, iname) {
if (!iterate) return "';} } out+='";
sid+=1; indv=iname || "i"+sid; iterate=unescape(iterate);
return "';var arr"+sid+"="+iterate+";if(arr"+sid+"){var "+vname+","+indv+"=-1,l"+sid+"=arr"+sid+".length-1;while("+indv+"<l"+sid+"){"
+vname+"=arr"+sid+"["+indv+"+=1];out+='";
})
.replace(c.evaluate || skip, function(m, code) {
return "';" + unescape(code) + "out+='";
})
+ "';return out;")
.replace(/\n/g, "\\n").replace(/\t/g, '\\t').replace(/\r/g, "\\r")
.replace(/(\s|;|\}|^|\{)out\+='';/g, '$1').replace(/\+''/g, "");
//.replace(/(\s|;|\}|^|\{)out\+=''\+/g,'$1out+=');
if (needhtmlencode) {
if (!c.selfcontained && _globals && !_globals._encodeHTML) _globals._encodeHTML = doT.encodeHTMLSource(c.doNotSkipEncoded);
str = "var encodeHTML = typeof _encodeHTML !== 'undefined' ? _encodeHTML : ("
+ doT.encodeHTMLSource.toString() + "(" + (c.doNotSkipEncoded || '') + "));"
+ str;
}
try {
return new Function(c.varname, str);
} catch (e) {
/* istanbul ignore else */
if (typeof console !== "undefined") console.log("Could not create a template function: " + str);
throw e;
}
};
doT.compile = function(tmpl, def) {
return doT.template(tmpl, null, def);
};
}());
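For readers unfamiliar with doT, a minimal usage sketch of the template compiler vendored above (`it` is the default template variable; the org/points values are illustrative):

```javascript
// Compile once, render many times; {{=...}} interpolates, {{!...}} HTML-encodes.
var render = doT.template('<b>{{=it.org}}</b> contributed {{=it.pnts}} points');
render({ org: 'CIRCL', pnts: 42 }); // '<b>CIRCL</b> contributed 42 points'
```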

static/js/extendext.js Normal file

@@ -0,0 +1,132 @@
/*!
* jQuery.extendext 0.1.2
*
* Copyright 2014-2016 Damien "Mistic" Sorel (http://www.strangeplanet.fr)
* Licensed under MIT (http://opensource.org/licenses/MIT)
*
* Based on jQuery.extend by jQuery Foundation, Inc. and other contributors
*/
/*jshint -W083 */
(function (root, factory) {
if (typeof define === 'function' && define.amd) {
define(['jquery'], factory);
}
else if (typeof module === 'object' && module.exports) {
module.exports = factory(require('jquery'));
}
else {
factory(root.jQuery);
}
}(this, function ($) {
"use strict";
$.extendext = function () {
var options, name, src, copy, copyIsArray, clone,
target = arguments[0] || {},
i = 1,
length = arguments.length,
deep = false,
arrayMode = 'default';
// Handle a deep copy situation
if (typeof target === "boolean") {
deep = target;
// Skip the boolean and the target
target = arguments[i++] || {};
}
// Handle array mode parameter
if (typeof target === "string") {
arrayMode = target.toLowerCase();
if (arrayMode !== 'concat' && arrayMode !== 'replace' && arrayMode !== 'extend') {
arrayMode = 'default';
}
// Skip the string param
target = arguments[i++] || {};
}
// Handle case when target is a string or something (possible in deep copy)
if (typeof target !== "object" && !$.isFunction(target)) {
target = {};
}
// Extend jQuery itself if only one argument is passed
if (i === length) {
target = this;
i--;
}
for (; i < length; i++) {
// Only deal with non-null/undefined values
if ((options = arguments[i]) !== null) {
// Special operations for arrays
if ($.isArray(options) && arrayMode !== 'default') {
clone = target && $.isArray(target) ? target : [];
switch (arrayMode) {
case 'concat':
target = clone.concat($.extend(deep, [], options));
break;
case 'replace':
target = $.extend(deep, [], options);
break;
case 'extend':
options.forEach(function (e, i) {
if (typeof e === 'object') {
var type = $.isArray(e) ? [] : {};
clone[i] = $.extendext(deep, arrayMode, clone[i] || type, e);
} else if (clone.indexOf(e) === -1) {
clone.push(e);
}
});
target = clone;
break;
}
} else {
// Extend the base object
for (name in options) {
src = target[name];
copy = options[name];
// Prevent never-ending loop
if (target === copy) {
continue;
}
// Recurse if we're merging plain objects or arrays
if (deep && copy && ( $.isPlainObject(copy) ||
(copyIsArray = $.isArray(copy)) )) {
if (copyIsArray) {
copyIsArray = false;
clone = src && $.isArray(src) ? src : [];
} else {
clone = src && $.isPlainObject(src) ? src : {};
}
// Never move original objects, clone them
target[name] = $.extendext(deep, arrayMode, clone, copy);
// Don't bring in undefined values
} else if (copy !== undefined) {
target[name] = copy;
}
}
}
}
}
// Return the modified object
return target;
};
}));
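A short sketch of what the extra array-mode argument changes compared to plain `$.extend` (values illustrative):

```javascript
// 'concat' appends array members; 'replace' swaps in the whole new array.
var defaults  = { feeds: ['misp-1'] };
var overrides = { feeds: ['misp-2'] };
$.extendext(true, 'concat',  {}, defaults, overrides); // { feeds: ['misp-1', 'misp-2'] }
$.extendext(true, 'replace', {}, defaults, overrides); // { feeds: ['misp-2'] }
```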


@@ -187,30 +187,91 @@ function queryAndAddMarkers() {
     var coord = circleRadius._latlng;
     var dateStart = datePickersRadiusWidgetFrom.datepicker("getDate").getTime() / 1000;
     var dateEnd = datePickersRadiusWidgetTo.datepicker("getDate").getTime() / 1000;
-    $.getJSON(urlCoordsByRadius+"?dateStart="+dateStart+"&dateEnd="+dateEnd+"&centerLat="+coord.lat+"&centerLon="+coord.lng+"&radius="+radius_km, function(allList){
-        // remove old markers
-        for (var i in savedMarkerRadius) {
-            savedMarkerRadius[i].remove(); // remove marker
-        }
-
-        for (var listIndex in allList) {
-            var curMarker = allList[listIndex];
-            var dataText = "";
-            var coordJson = curMarker[1];
-            for (var dataI in curMarker[0]) {
-                var jsonData = JSON.parse(curMarker[0][dataI])
-                dataText += '<strong>'+jsonData.categ+': </strong> '+jsonData.value + "<br>"
-            }
-            var marker = L.marker([coordJson[1], coordJson[0]]).addTo(radiusOpenStreetMap);
-            savedMarkerRadius.push(marker);
-            marker.bindPopup(dataText, {autoClose:false}).openPopup();
-        }
-    });
+    var geo_param = {
+        dateStart: dateStart,
+        dateEnd: dateEnd,
+        centerLat: coord.lat,
+        centerLon: coord.lng,
+        radius: radius_km
+    };
+    $.ajax({
+        data: geo_param,
+        cache: false,
+        beforeSend: function(XMLHttpRequest) {
+            //$('.loading').show();
+            set_loading_status($('#geo_info_qry_btn'), true, 'Querying and fetching', 1);
+        },
+        success: function(data, textStatus) {
+            var allList = data;
+            set_loading_status($('#geo_info_qry_btn'), true, 'Drawing '+allList.length + ' results', 2);
+            // remove old markers
+            for (var i in savedMarkerRadius) {
+                savedMarkerRadius[i].remove(); // remove marker
+            }
+
+            for (var listIndex in allList) {
+                var curMarker = allList[listIndex];
+                var dataText = "";
+                var coordJson = curMarker[1];
+                for (var dataI in curMarker[0]) {
+                    var jsonData = JSON.parse(curMarker[0][dataI])
+                    dataText += '<strong>'+jsonData.categ+': </strong> '+jsonData.value + "<br>"
+                }
+                var marker = L.marker([coordJson[1], coordJson[0]]).addTo(radiusOpenStreetMap);
+                savedMarkerRadius.push(marker);
+                marker.bindPopup(dataText, {autoClose:false}).openPopup();
+            }
+            set_loading_status($('#geo_info_qry_btn'), false, allList.length + ' results');
+        },
+        error: function( jqXhr, textStatus, errorThrown ){
+            set_loading_status($('#geo_info_qry_btn'), false, 'Error: '+ errorThrown);
+        },
+        type: 'get',
+        url: urlCoordsByRadius
+    });
 }

 /* UTIL */
+function set_loading_status(jhtml, is_loading, text, loading_state) {
+    text = text === undefined || text === null ? '' : text;
+    if (is_loading) {
+        if (jhtml.data('default-text') === undefined) {
+            jhtml.data('default-text', jhtml.text());
+        }
+        var loading_icon = ''
+        switch(loading_state) {
+            case 1:
+                loading_icon = 'fa-spinner';
+                break;
+            case 2:
+                loading_icon = 'fa-circle-o-notch';
+                break;
+            case 3:
+                loading_icon = 'fa-refresh';
+                break;
+            default:
+                loading_icon = 'fa-circle-o-notch';
+                break;
+        }
+        var loadingIcon = $('<i class="fa fa-spin '+loading_icon+'"></i>');
+        jhtml.text(' '+text);
+        jhtml.prepend(loadingIcon);
+    } else {
+        jhtml.empty();
+        jhtml.text(text);
+        setTimeout(function() {
+            let old_text = jhtml.data('default-text');
+            jhtml.text(old_text);
+        }, 5000);
+    }
+}
+
 function days_between(date1, date2) {
     var ONEDAY = 60*60*24*1000;
     var diff_ms = Math.abs(date1.getTime() - date2.getTime());
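Usage sketch for the `set_loading_status` helper added above, mirroring how `queryAndAddMarkers` drives it (the result count is illustrative):

```javascript
// Spin the query button while fetching (state 1 = fa-spinner), then show the
// result count; the helper restores the button's original label after 5 s.
set_loading_status($('#geo_info_qry_btn'), true, 'Querying and fetching', 1);
// ... later, on success:
set_loading_status($('#geo_info_qry_btn'), false, '12 results');
```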


@@ -166,67 +166,42 @@ var sources = new Sources();
 sources.addSource('global');

 var ledmanager = new LedManager();
-var curNumLog = 0;
 var curMaxDataNumLog = 0;

-var source_log;
-function connect_source_log() {
-    source_log = new EventSource(urlForLogs);
-    source_log.onopen = function(){
-        //console.log('connection is opened. '+source_log.readyState);
-    };
-
-    source_log.onerror = function(){
-        console.log('error: '+source_log.readyState);
-        setTimeout(function() { connect_source_log(); }, 5000);
-    };
-
-    source_log.onmessage = function(event) {
-        var json = jQuery.parseJSON( event.data );
-        updateLogTable(json.feedName, json.log, json.zmqName);
-    };
-}
+var livelog;

 $(document).ready(function () {
-    createHead(function() {
-        if (!!window.EventSource) {
-            connect_source_log();
-        } else {
-            console.log("No event source_log");
-        }
+    $.getJSON(urlForHead, function(head) {
+        livelog = new $.livelog($("#divLogTable"), {
+            pollingFrequency: 5000,
+            tableHeader: head,
+            tableMaxEntries: 50,
+            // animate: false,
+            preDataURL: urlForLogs,
+            endpoint: urlForLogs
+        });
     });
 });

 // LOG TABLE
-function updateLogTable(feedName, log, zmqName) {
+function updateLogTable(name, log, zmqName, ignoreLed) {
     if (log.length == 0)
         return;

     // update keepAlives
-    ledmanager.updateKeepAlive(zmqName);
-
-    // Create new row
-    tableBody = document.getElementById('table_log_body');
+    if (ignoreLed !== true) {
+        ledmanager.updateKeepAlive(zmqName);
+    }

     // only add row for attribute
-    if (feedName == "Attribute" ) {
+    if (name == "Attribute" ) {
         var categName = log[toPlotLocationLog];
         sources.addIfNotPresent(categName);
         sources.incCountOnSource(categName);
         sources.incCountOnSource('global');
         updateChartDirect();
-        createRow(tableBody, log);
-        // Remove old row
-        while ($("#table_log").height() >= $("#panelLogTable").height()-26){ //26 for margin
-            tableBody.deleteRow(0);
-        }
-    } else if (feedName == "Keepalive") {
+    } else if (name == "Keepalive") {
         // do nothing
     } else {
         // do nothing

@@ -259,23 +234,6 @@ function getTextColour(rgb) {
     }
 }

-function addObjectToLog(name, obj, td) {
-    if(name == "Tag") {
-        var a = document.createElement('A');
-        a.classList.add('tagElem');
-        a.style.backgroundColor = obj.colour;
-        a.style.color = getTextColour(obj.colour.substring(1,6));
-        a.innerHTML = obj.name;
-        td.appendChild(a);
-        td.appendChild(document.createElement('br'));
-    } else if (name == "mispObject") {
-        td.appendChild(document.createTextNode('mispObj'));
-    } else {
-        td.appendChild(document.createTextNode('nop'));
-    }
-}
-
 function createRow(tableBody, log) {
     var tr = document.createElement('TR');

@@ -333,3 +291,588 @@ function createHead(callback) {
         callback();
     });
 }
/* LIVE LOG */
(function(factory) {
"use strict";
if (typeof define === 'function' && define.amd) {
define(['jquery'], factory);
} else if (window.jQuery && !window.jQuery.fn.Livelog) {
factory(window.jQuery);
}
}
(function($) {
'use strict';
// Livelog object
var Livelog = function(container, options) {
this._default_options = {
pollingFrequency: 5000,
tableHeader: undefined,
tableMaxEntries: undefined,
animate: true
}
options.container = container;
this.validateOptions(options);
this._options = $.extend({}, this._default_options, options);
// create table and draw header
this.origTableOptions = {
dom: "<'row'<'col-sm-12'<'dt-toolbar-led'>>>"
+ "<'row'<'col-sm-12'tr>>",
searching: false,
paging: false,
"order": [[ 0, "desc" ]],
responsive: true,
columnDefs: [
{ targets: 0, orderable: false },
{ targets: '_all', searchable: false, orderable: false,
render: function ( data, type, row ) {
var $toRet;
if (typeof data === 'object') {
$toRet = $('<span></span>');
data.data.forEach(function(cur, i) {
switch (data.name) {
case 'Tag':
var $tag = $('<a></a>');
$tag.addClass('tagElem');
$tag.css({
backgroundColor: cur.colour,
color: getTextColour(cur.colour.substring(1,6))
});
$tag.text(cur.name)
$toRet.append($tag);
break;
case 'mispObject':
$toRet.append('MISP Object not supported yet')
break;
default:
break;
}
});
$toRet = $toRet[0].outerHTML;
} else if (data === undefined) {
$toRet = '';
} else {
var textToAddArray = data.split(char_separator);
$toRet = '';
textToAddArray.forEach(function(e, i) {
if (i > 0) {
$toRet += '<br>' + e;
} else {
$toRet += e;
}
});
}
return $toRet;
},
}
],
};
this.DOMTable = $('<table class="table table-striped table-bordered" style="width:100%"></table>');
this._options.container.append(this.DOMTable);
this.origTableOptions.columns = [];
var that = this;
this._options.tableHeader.forEach(function(field) {
var th = $('<th>'+field+'</th>');
that.origTableOptions.columns.push({ title: field });
});
this.dt = this.DOMTable.DataTable(this.origTableOptions);
this.fetch_predata();
// add status led
this._ev_timer = null;
this._ev_retry_frequency = this._options.pollingFrequency; // sec
this._cur_ev_retry_count = 0;
this._ev_retry_count_thres = 3;
var led_container = $('<div class="led-container" style="margin-left: 10px;"></div>');
var led = $('<div class="led-small led_red"></div>');
this.statusLed = led;
led_container.append(led);
var header = this._options.container.parent().parent().find('.panel-heading');
if (header.length > 0) { // add in panel header
header.append(led_container);
} else { // add over the map
led.css('display', 'inline-block');
led_container.append($('<span>Status</span>')).css('float', 'left');
$('.dt-toolbar-led').append(led_container)
}
this.data_source = undefined;
this.connect_to_data_source();
};
Livelog.prototype = {
constructor: Livelog,
validateOptions: function(options) {
var o = options;
if (o.endpoint === undefined || typeof o.endpoint != 'string') {
throw "Livelog must have a valid endpoint";
}
if (o.container === undefined) {
throw "Livelog must have a container";
} else {
o.container = o.container instanceof jQuery ? o.container : $('#'+o.container);
}
// pre-data is either the data to be shown or an URL from which the data should be taken from
if (Array.isArray(o.preData)){
o.preDataURL = null;
o.preData = o.preData;
} else if (o.preData !== undefined) { // should fetch
o.preDataURL = o.preData;
o.preData = [];
}
if (o.tableHeader === undefined || !Array.isArray(o.tableHeader)) {
throw "Livelog must have a valid header";
}
if (o.tableMaxEntries !== undefined) {
o.tableMaxEntries = parseInt(o.tableMaxEntries);
}
},
changeOptions: function(options) {
var that = this;
Object.keys(options).forEach(function (optionName) {
that._options[optionName] = options[optionName];
});
},
fetch_predata: function() {
var that = this;
if (this._options.preDataURL !== null) {
$.when(
$.ajax({
dataType: "json",
url: this._options.preDataURL,
data: this._options.additionalOptions,
success: function(data) {
that._options.preData = data;
},
error: function(jqXHR, textStatus, errorThrown) {
console.log(textStatus);
that._options.preData = [];
}
})
).then(
function() { // success
// add data to the widget
that._options.preData.forEach(function(j) {
var name = j.name,
zmqName = j.zmqName,
entry = j.log;
updateLogTable(name, entry, zmqName, true);
switch (name) {
case 'Attribute':
that.add_entry(entry);
break;
case 'ObjectAttribute':
that.add_entry(entry, true);
break;
default:
break;
}
});
}, function() { // fail
}
);
}
},
connect_to_data_source: function() {
var that = this;
if (!this.data_source) {
// var url_param = $.param( this.additionalOptions );
this.data_source = new EventSource(this._options.endpoint);
this.data_source.onmessage = function(event) {
var json = jQuery.parseJSON( event.data );
var name = json.name,
zmqName = json.zmqName,
entry = json.log;
updateLogTable(name, entry, zmqName);
switch (name) {
case 'Attribute':
that.add_entry(entry);
break;
case 'ObjectAttribute':
that.add_entry(entry, true);
break;
default:
break;
}
};
this.data_source.onopen = function(){
that._cur_ev_retry_count = 0;
that.update_connection_state('connected');
};
this.data_source.onerror = function(){
if (that.data_source.readyState == 0) { // reconnecting
that.update_connection_state('connecting');
} else if (that.data_source.readyState == 2) { // closed, reconnect with new object
that.reconnection_logique();
} else {
that.update_connection_state('not connected');
that.reconnection_logique();
}
};
}
},
reconnection_logique: function () {
var that = this;
if (that.data_source) {
that.data_source.close();
that.data_source = null;
}
if (that._ev_timer) {
clearTimeout(that._ev_timer);
}
if(that._cur_ev_retry_count >= that._ev_retry_count_thres) {
that.update_connection_state('not connected');
} else {
that._cur_ev_retry_count++;
that.update_connection_state('connecting');
}
that._ev_timer = setTimeout(function () { that.connect_to_data_source(); }, that._ev_retry_frequency*1000);
},
reconnect: function() {
if (this.data_source) {
this.data_source.close();
this.data_source = null;
this._cur_ev_retry_count = 0;
this.update_connection_state('reconnecting');
this.connect_to_data_source();
}
},
update_connection_state: function(connectionState) {
this.connectionState = connectionState;
this.updateDOMState(this.statusLed, connectionState);
},
updateDOMState: function(led, state) {
switch (state) {
case 'connected':
led.removeClass("led_red");
led.removeClass("led_orange");
led.addClass("led_green");
break;
case 'not connected':
led.removeClass("led_green");
led.removeClass("led_orange");
led.addClass("led_red");
break;
case 'connecting':
led.removeClass("led_green");
led.removeClass("led_red");
led.addClass("led_orange");
break;
default:
led.removeClass("led_green");
led.removeClass("led_orange");
led.addClass("led_red");
}
},
add_entry: function(entry, isObjectAttribute) {
entry = this.sanitizeJson(entry);
var rowNode = this.dt.row.add(entry).draw().node();
if (this._options.animate) {
$( rowNode )
.css( 'background-color', '#5cb85c !important' )
.animate( { 'background-color': '' }, { duration: 1500 } );
}
if (isObjectAttribute === true) {
$( rowNode ).children().last()
.css('position', 'relative')
.append(
$('<it class="fa fa-th rowTableIsObject" title="This attribute belong to an Object"></it>')
);
}
// remove entries
var numRows = this.dt.rows().count();
var rowsToRemove = numRows - this._options.tableMaxEntries;
if (rowsToRemove > 0 && this._options.tableMaxEntries != -1) {
//get row indexes as an array
var arraySlice = this.dt.rows().indexes().toArray();
//get row indexes to remove starting at row 0
arraySlice = arraySlice.slice(-rowsToRemove);
//remove the rows and redraw the table
var rows = this.dt.rows(arraySlice).remove().draw();
}
},
sanitizeJson: function(dirty_json) {
var sanitized_json = {};
var that = this;
Object.keys(dirty_json).forEach(function(k) {
var val = dirty_json[k];
if (Array.isArray(val)) {
var clear_array = [];
sanitized_json[k] = val.map(function(item) {
return that.sanitize(item);
});
} else if(typeof val === 'object') {
sanitized_json[k] = that.sanitizeJson(val);
} else {
sanitized_json[k] = that.sanitize(val);
}
});
return sanitized_json;
},
sanitize: function(e) {
return $("<p>").text(e).html();;
}
};
$.livelog = Livelog;
$.fn.livelog = function(option) {
var pickerArgs = arguments;
return this.each(function() {
var $this = $(this),
inst = $this.data('livelog'),
options = ((typeof option === 'object') ? option : {});
if ((!inst) && (typeof option !== 'string')) {
$this.data('livelog', new Livelog(this, options));
} else {
if (typeof option === 'string') {
inst[option].apply(inst, Array.prototype.slice.call(pickerArgs, 1));
}
}
});
};
$.fn.livelog.constructor = Livelog;
}));
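Besides the direct `new $.livelog(...)` construction used in the ready handler earlier, the wrapper registered above also dispatches prototype methods by name; a sketch, assuming `head` holds the column list fetched from `urlForHead`:

```javascript
// First call instantiates the widget and stores it under the 'livelog' data key;
// later string calls dispatch to prototype methods such as reconnect().
$('#divLogTable').livelog({ endpoint: urlForLogs, tableHeader: head, tableMaxEntries: 50 });
$('#divLogTable').livelog('reconnect');
```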
/* Live log filter */
function recursiveInject(result, rules, isNot) {
if (rules.rules === undefined) { // add to result
var field = rules.field;
var value = rules.value;
var operator_notequal = rules.operator === 'not_equal' ? true : false;
var negate = isNot ^ operator_notequal;
value = negate ? '!' + value : value;
if (result.hasOwnProperty(field)) {
if (Array.isArray(result[field])) {
result[field].push(value);
} else {
result[field] = [result[field], value];
}
} else {
result[field] = value;
}
}
else if (Array.isArray(rules.rules)) {
rules.rules.forEach(function(subrules) {
recursiveInject(result, subrules, isNot ^ rules.not) ;
});
}
}
function cleanRules(rules) {
var res = {};
recursiveInject(res, rules);
// clean up invalid and unset
Object.keys(res).forEach(function(k) {
var v = res[k];
if (v === undefined || v === '') {
delete res[k];
}
});
return res;
}
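A worked example of the flattening performed by `recursiveInject`/`cleanRules`; the resulting object is the shape persisted in the `filters` cookie (field and values illustrative):

```javascript
var ruleTree = {
    condition: 'AND', not: false,
    rules: [
        { field: 'Attribute.type', operator: 'equal',     value: 'ip-dst' },
        { field: 'Attribute.type', operator: 'not_equal', value: 'domain' }
    ]
};
cleanRules(ruleTree); // => { "Attribute.type": ["ip-dst", "!domain"] }
```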
$(document).ready(function() {
var qbOptions = {
plugins: {
'filter-description' : {
mode: 'inline'
},
'unique-filter': null,
'bt-tooltip-errors': null,
},
allow_empty: true,
filters: [],
rules: {
condition: 'AND',
not: false,
rules: [],
flags: {
no_add_group: true,
condition_readonly: true,
}
},
icons: {
add_group: 'fa fa-plus-square',
add_rule: 'fa fa-plus-circle',
remove_group: 'fa fa-minus-square',
remove_rule: 'fa fa-minus-circle',
error: 'fa fa-exclamation-triangle'
}
};
// add filters and rules
[
'Attribute.category',
'Attribute.comment',
'Attribute.deleted',
'Attribute.disable_correlation',
'Attribute.distribution',
'Attribute.event_id',
'Attribute.id',
'Attribute.object_id',
'Attribute.object_relation',
'Attribute.sharing_group_id',
'Attribute.Tag.name',
'Attribute.timestamp',
'Attribute.to_ids',
'Attribute.type',
'Attribute.uuid',
'Attribute.value',
'Event.Org',
'Event.Orgc',
'Event.analysis',
'Event.attribute_count',
'Event.date',
'Event.disable_correlation',
'Event.distribution',
'Event.event_creator_email',
'Event.extends_uuid',
'Event.id',
'Event.info',
'Event.locked',
'Event.org_id',
'Event.orgc_id',
'Event.proposal_email_lock',
'Event.publish_timestamp',
'Event.published',
'Event.sharing_group_id',
'Event.threat_level_id',
'Event.Tag.name',
'Event.timestamp',
'Event.uuid',
'Org.id',
'Org.name',
'Org.uuid',
'Orgc.id',
'Orgc.name',
'Orgc.uuid'
].forEach(function(field) {
var tempFilter = {
"input": "text",
"type": "string",
"operators": [
"equal",
"not_equal"
],
"unique": true,
"id": field,
"label": field,
"description": "Perfom strict equality on " + field,
"validation": {
"allow_empty_value": true
}
};
qbOptions.filters.push(tempFilter);
});
var filterCookie = getCookie('filters');
var filters = JSON.parse(filterCookie !== undefined && filterCookie !== '' ? filterCookie : "{}");
var activeFilters = Object.keys(filters)
var tempRule = [];
activeFilters.forEach(function(field) {
var v = filters[field];
var tmp = {
field: field,
id: field,
value: v
};
tempRule.push(tmp);
});
qbOptions.rules.rules = tempRule;
updateFilterButton(activeFilters);
var $ev = $('#filteringQB');
var querybuilderTool = $ev.queryBuilder(qbOptions);
querybuilderTool = querybuilderTool[0].queryBuilder;
$('#saveFilters').click(function() {
var rules = querybuilderTool.getRules({ skip_empty: true, allow_invalid: true });
var result = {};
recursiveInject(result, rules, false);
updateFilterButton(Object.keys(result));
var jres = JSON.stringify(result, null);
document.cookie = 'filters=' + jres;
$('#modalFilters').modal('hide');
livelog.dt
.clear()
.draw();
livelog.fetch_predata();
livelog.reconnect();
})
$('#log-fullscreen').click(function() {
var $this = $(this);
var $panel = $('#panelLogTable');
var isfullscreen = $this.data('isfullscreen');
if (isfullscreen === undefined || !isfullscreen) {
$panel.detach().prependTo('#page-wrapper')
$panel.addClass('liveLogFullScreen');
$this.data('isfullscreen', true);
$panel.find('#divLogTable').css({'overflow-y': 'auto'});
livelog.changeOptions({tableMaxEntries: 300});
} else {
$panel.detach().appendTo('#rightCol')
$panel.removeClass('liveLogFullScreen');
$this.data('isfullscreen', false);
$panel.find('#divLogTable').css({'overflow': 'hidden'});
livelog.changeOptions({tableMaxEntries: 50});
}
});
});
function updateFilterButton(activeFilters) {
if (activeFilters.length > 0) {
$('#log-filter').removeClass('btn-default');
$('#log-filter').addClass('btn-success');
} else {
$('#log-filter').removeClass('btn-success');
$('#log-filter').addClass('btn-default');
}
}
function getCookie(cname) {
var name = cname + "=";
var decodedCookie = decodeURIComponent(document.cookie);
var ca = decodedCookie.split(';');
for(var i = 0; i <ca.length; i++) {
var c = ca[i];
while (c.charAt(0) == ' ') {
c = c.substring(1);
}
if (c.indexOf(name) == 0) {
return c.substring(name.length, c.length);
}
}
return "";
}
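The cookie round-trip, for reference (mirrors the `#saveFilters` handler above; the filter value is illustrative):

```javascript
document.cookie = 'filters=' + JSON.stringify({ 'Attribute.type': 'ip-dst' });
var active = JSON.parse(getCookie('filters') || '{}'); // { "Attribute.type": "ip-dst" }
```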


@@ -23,7 +23,17 @@ class MapEvent {
         this.specifName = json.specifName;
         this.cityName = json.cityName;
         this.text = this.categ + ": " + this.value;
-        this.textMarker = "<b>{1}</b><br>{2}".replace("{1}", this.country).replace("{2}", this.specifName+", "+this.cityName);
+        let underText = "";
+        if (this.specifName !== null && this.cityName !== null) {
+            underText = this.specifName+", "+this.cityName;
+        } else if (this.specifName !== null) {
+            underText = this.specifName;
+        } else if (this.cityName !== null) {
+            underText = this.cityName;
+        } else {
+            underText = "";
+        }
+        this.textMarker = "<b>{1}</b><br>{2}".replace("{1}", this.country).replace("{2}", underText);
     }
 }

@@ -215,10 +225,10 @@ function connect_source_map() {
     };

     source_map.onerror = function(){
         console.log('error: '+source_map.readyState);
+        source_map.close()
         setTimeout(function() { connect_source_map(); }, 5000);
     };
 }
-connect_source_map()

 $(document).ready(function () {
     $( "#rotation_wait_time_selector" ).change(function() {

@@ -240,4 +250,15 @@ $(document).ready(function () {
         ZOOMLEVEL = sel;
         mapEventManager.directZoom();
     });
+
+    if (!!window.EventSource) {
+        $.getJSON( urlForMaps, function( data ) {
+            data.forEach(function(item) {
+                var marker = L.marker([item.coord.lat, item.coord.lon]).addTo(myOpenStreetMap);
+                var mapEvent = new MapEvent(item, marker);
+                mapEventManager.addMapEvent(mapEvent);
+            });
+            connect_source_map()
+        });
+    }
 });

File diff suppressed because it is too large

static/js/query-builder.js Normal file

File diff suppressed because one or more lines are too long

@@ -145,7 +145,7 @@ function getTextColour(rgb) {
     }
 }

-// If json (from tag), only retreive the name> otherwise return the supplied arg.
+// If json (from tag), only retrieve the name> otherwise return the supplied arg.
 function getOnlyName(potentialJson) {
     try {
         jsonLabel = JSON.parse(potentialJson);


@@ -79,15 +79,21 @@ function updateDatePunch(ignore1, igonre2, org) { //date picker sets ( String da
             punchcardWidget.refresh();
             highlight_punchDay();
         } else {
-            punchcardWidget = $('#punchcard').punchcard({
-                data: data,
-                singular: 'login',
-                plural: 'logins',
-                timezones: ['local'],
-                timezoneIndex:0
-            });
-            punchcardWidget = punchcardWidget.data("plugin_" + "punchcard");
-            highlight_punchDay();
+            var data_max = Math.max.apply(Math, data.flat());
+            if (data_max === 0) { // no data, MISP's audit notification could be disabled
+                $('#punchcard').text('No login or MISP\'s audit notification is disabled.');
+            } else {
+                $('#punchcard').empty();
+                punchcardWidget = $('#punchcard').punchcard({
+                    data: data,
+                    singular: 'login',
+                    plural: 'logins',
+                    timezones: ['local'],
+                    timezoneIndex:0
+                });
+                punchcardWidget = punchcardWidget.data("plugin_" + "punchcard");
+                highlight_punchDay();
+            }
         }
     });
 }
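The new guard simply checks whether every cell of the punch-card matrix is zero before handing it to the plugin; a sketch with illustrative data:

```javascript
// data is a days-by-hours matrix of login counts.
var data = [[0, 0, 0], [0, 0, 0]];
var data_max = Math.max.apply(Math, data.flat()); // flatten, then take the max
if (data_max === 0) {
    // nothing to plot: either no logins, or MISP audit notifications are off
}
```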

static/pics/misp-logo.png Normal file

Binary file not shown.

@@ -0,0 +1,58 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="256"
height="128"
viewBox="0 0 67.733332 33.866668"
version="1.1"
id="svg8"
inkscape:version="0.92.1 r15371"
sodipodi:docname="0.svg">
<defs
id="defs2" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.4"
inkscape:cx="-18.174659"
inkscape:cy="139.70413"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:pagecheckerboard="true"
inkscape:window-width="1855"
inkscape:window-height="1056"
inkscape:window-x="65"
inkscape:window-y="24"
inkscape:window-maximized="1" />
<metadata
id="metadata5">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-263.13332)" />
</svg>


@@ -120,7 +120,7 @@
         <tbody id='bodyTablerankingModal'>
             {% for item in org_rank_list %}
             <tr data-rank={{ loop.index }}>
-                <td style='padding: 0px; text-align: right;'><img height='35px' width='70px' style="margin-right: 20px;" src="{{ url_for('static', filename='pics/rankingMISPOrg/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;"</img></td>
+                <td style='padding: 0px; text-align: right;'><img height='35px' width='70px' style="margin-right: 20px;" src="{{ url_for('static', filename='pics/rankingMISPOrg/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;" onerror="this.style.display='none'"</img></td>
                 <td>{{ item[1] }}</td>
                 <td>{{ item[2] }}</td>
                 <td>{{ item[3] }}</td>
@@ -148,7 +148,7 @@
     <!-- Modal trophy -->
     <div id="myModalTrophy" class="modal fade" role="dialog">
-        <div class="modal-dialog modal-lg" style="width: 1500px;">
+        <div class="modal-dialog modal-xl">
             <!-- Modal content-->
             <div class="modal-content">
                 <div class="modal-header">
@@ -172,7 +172,7 @@
                 <tr>
                     <td>
                         <div id="divBadge_{{ loop.index }}" class="circleBadgeSmall circlBadgeNotAcquired">
-                            <img height='48px' width='48' class="" style="margin-top: 3px;" src="{{ url_for('static', filename='pics/MISPHonorableIcons/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;"</img>
+                            <img height='48px' width='48' class="" style="margin-top: 3px;" src="{{ url_for('static', filename='pics/MISPHonorableIcons/1.svg')[:-5]}}{{ item[0] }}.svg" type='image/svg' style="margin: auto;" onerror="this.style.display='none'"</img>
                         </div>
                     </td>
                     <td style="padding-left: 15px;">{{ item[1] }}</td>
@@ -181,32 +181,35 @@
             </tbody>
         </table>
     </div>
-    <p style="font-size: 18px; display: inline;">Trophies: </p><p style="display: inline;">Shows your skills in information sharing </p><i> (earned via upvotes or sightings from other organisation)</i>
-    <div class="table-responsive">
-        <table class="table table-striped table-bordered">
-            <thead>
-                <tr>
-                    {% for title in trophy_title_str %}
-                    <th class="centerCell">{{ title }}</th>
-                    {% endfor %}
-                </tr>
-            </thead>
-            <tbody id='bodyTableTrophyModal'>
-                <tr>
-                    {% for perc in trophy_mapping %}
-                    <td class="centerCell">{{ perc }}</td>
-                    {% endfor %}
-                </tr>
-                <tr>
-                    {% for title in trophy_title_str %}
-                    <td>
-                        <input type='image' style="display: block; margin-left: auto; margin-right: auto;" height="64" width="64" src="{{ url_for('static', filename='pics/MISPTrophy/'+loop.index0|string+'.png') }}">
-                    </td>
-                    {% endfor %}
-                </tr>
-            </tbody>
-        </table>
+    <div>
+        <p style="font-size: 18px; display: inline;">Trophies: </p><a style="display: inline;" class="collapsed" data-toggle="collapse" href="#collapsibleTrophyInfo" aria-expanded="false">Shows your skills in information sharing <span class="fa fa-caret-down"></span></a><i> (earned via upvotes or sightings from other organisation)</i>
+        <div id="collapsibleTrophyInfo" class="table-responsive collapse">
+            <table class="table table-striped table-bordered">
+                <thead>
+                    <tr>
+                        {% for title in trophy_title_str %}
+                        <th class="centerCell">{{ title }}</th>
+                        {% endfor %}
+                    </tr>
+                </thead>
+                <tbody id='bodyTableTrophyModal'>
+                    <tr>
+                        {% for perc in trophy_mapping %}
+                        <td class="centerCell">{{ perc }}</td>
+                        {% endfor %}
+                    </tr>
+                    <tr>
+                        {% for title in trophy_title_str %}
+                        <td>
+                            <input type='image' style="display: block; margin-left: auto; margin-right: auto;" height="64" width="64" src="{{ url_for('static', filename='pics/MISPTrophy/'+loop.index0|string+'.png') }}">
+                        </td>
+                        {% endfor %}
+                    </tr>
+                </tbody>
+            </table>
+        </div>
     </div>
     <p style="font-size: 18px; display: inline;">Acquired trophies: </p>
@@ -219,11 +222,11 @@
             {% endfor %}
             </tr>
         </thead>
-        <tbody id='bodyTableTrophyModal'>
+        <tbody id='bodyTableTrophyModalOrg'>
             <tr>
                 {% for categ in trophy_categ_list_id %}
                 <td>
-                    <input type='image' id='trophy_{{categ}}' style="display: block; margin-left: auto; margin-right: auto;" height="64" width="64" src="{{ url_for('static', filename='pics/MISPTrophy/0.png') }}">
+                    <input type='image' id='trophy_{{categ}}' data-category='{{categ}}' style="display: block; margin-left: auto; margin-right: auto;" height="64" width="64" src="{{ url_for('static', filename='pics/MISPTrophy/0.png') }}">
                 </td>
                 {% endfor %}
             </tr>
@@ -231,6 +234,22 @@
         </table>
     </div>
+    <div class='allOrgRankingDiv'>
+        <table class="table">
+            <thead>
+                <tr>
+                    <th>Rank</th>
+                    <th>Trophy</th>
+                    <th>Points</th>
+                    <th>Org</th>
+                </tr>
+            </thead>
+            <tbody id='bodyTableThropyAllOrgRankingModal'>
+                <tr><td>Click on a category to view the global ranking</td></tr>
+            </tbody>
+        </table>
+    </div>
     </div>
     </div>
     </div>
@@ -510,6 +529,7 @@
     var url_getContributionOrgStatus = "{{ url_for('getContributionOrgStatus') }}";
     var url_getHonorBadges = "{{ url_for('getHonorBadges') }}";
     var url_getTrophies = "{{ url_for('getTrophies')}}"
+    var url_getAllOrgsTrophyRanking = "{{ url_for('getAllOrgsTrophyRanking')}}"
     var url_baseRankMonthlyLogo = "{{ url_for('static', filename='pics/rankingMISPMonthly/1.svg') }}";
     url_baseRankMonthlyLogo = url_baseRankMonthlyLogo.substring(0, url_baseRankMonthlyLogo.length-5);
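The new bodyTableThropyAllOrgRankingModal table is filled client-side from the getAllOrgsTrophyRanking endpoint exposed through url_getAllOrgsTrophyRanking above. A minimal, hypothetical sketch of what such a Flask handler could look like; the real handler lives in server.py and is not part of this diff, so the route body, query parameter, and sample data are assumptions for illustration:

# Hypothetical sketch of the endpoint behind url_getAllOrgsTrophyRanking;
# the helper below is a stand-in for the dashboard's contributor helper.
from flask import Flask, jsonify, request

app = Flask(__name__)

def get_all_orgs_trophy_ranking(categ):
    # stand-in data: [[rank, trophy, points, org], ...]
    sample = {'malware': [[1, 'gold', 120, 'OrgA'], [2, 'silver', 80, 'OrgB']]}
    return sample.get(categ, [])

@app.route('/_getAllOrgsTrophyRanking')
def getAllOrgsTrophyRanking():
    categ = request.args.get('categ', '')  # category clicked in the modal
    return jsonify(get_all_orgs_trophy_ranking(categ))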

templates/error_page.html (new file, 43 lines)

@ -0,0 +1,43 @@
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <meta name="viewport" content="width=device-width" />
<title>
Users - MISP
</title>
<!-- jQuery -->
<script src="{{ url_for('static', filename='js/jquery.min.js') }}"></script>
<!-- Bootstrap Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<!-- Custom CSS -->
<link href="{{ url_for('static', filename='css/sb-admin-2.css') }}" rel="stylesheet">
<!-- Bootstrap Core JavaScript -->
<script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
</head>
<body>
<div id="flashContainer" style="padding-top:50px; !important;">
<div id="main-view-container" class="container-fluid ">
</div>
</div>
<div style="width:100%;">
<table style="margin-left:auto;margin-right:auto;">
<tr>
<td style="text-align:right;width:250px;padding-right:50px"></td>
<td style="width:460px">
<div>
<img src="{{ url_for('static', filename='pics/misp-logo.png') }}" style="display:block; margin-left: auto; margin-right: auto;"/>
</div>
<div class="alert alert-danger" style="margin-top: 15px;">
{{ error_message }}
</div>
</td>
</tr>
</table>
</div>
</body>
</html>


@@ -26,7 +26,7 @@
     <script src="{{ url_for('static', filename='js/jquery.flot.resize.js') }}"></script>
     <!-- Bootstrap Core JavaScript -->
     <script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
-    <link href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
     <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery-jvectormap-2.0.3.css') }}" type="text/css" media="screen"/>
     <script src="{{ url_for('static', filename='js/jquery-jvectormap-2.0.3.min.js') }}"></script>
@@ -211,7 +211,7 @@
     <input type="text" id="datepickerRadiusFrom" size="20" style="">
     <input type="text" id="datepickerRadiusTo" size="20" style="">
     </strong>
-    <button type="button" class="btn btn-default leftSepa" onclick="queryAndAddMarkers()">Query</button>
+    <button id="geo_info_qry_btn" type="button" class="btn btn-default leftSepa" style="padding-left: 12px;" onclick="queryAndAddMarkers()">Query</button>
 </div>
 <div id="panelbody" class="panel-body" style="height: 100%;">
     <div id="radiusMap" style="width:100%; height: 51vh; position: relative; z-index: 1;"></div>


@@ -24,9 +24,28 @@
     <script src="{{ url_for('static', filename='js/jquery.flot.js') }}"></script>
     <script src="{{ url_for('static', filename='js/jquery.flot.pie.min.js') }}"></script>
     <script src="{{ url_for('static', filename='js/jquery.flot.resize.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/jquery-ui.min.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/jquery-ui.min.css') }}" type="text/css" rel="stylesheet">
     <!-- Bootstrap Core JavaScript -->
     <script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery.dataTables.min.css') }}">
+    <script src="{{ url_for('static', filename='js/jquery.dataTables.min.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/font-awesome.min.css') }}" type="text/css" rel="stylesheet">
+    <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery-jvectormap-2.0.3.css') }}" type="text/css" media="screen"/>
+    <script src="{{ url_for('static', filename='js/jquery-jvectormap-2.0.3.min.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/jquery-jvectormap-world-mill.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/doT.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/extendext.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/moment-with-locales.js') }}"></script>
+    <script src="{{ url_for('static', filename='js/query-builder.js') }}"></script>
+    <link href="{{ url_for('static', filename='css/query-builder.default.css') }}" type="text/css" rel="stylesheet">
 </head>
 <style>
@@ -42,8 +61,9 @@
     font-size: 12px;
     font-weight: bold;
     line-height: 14px;
-    border-bottom-left-radius: 3px;
+    border-radius: 3px;
     box-shadow: 3px 3px 3px #888888;
+    margin: 2px;
 }
 table {
@@ -123,6 +143,67 @@ small {
     font-weight: bold;
 }
+.led_green {
+    background-color: #ABFF00;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #89FF00 0 0px 6px;
+}
+.led_red {
+    background-color: #F82222;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #FF0303 0 0px 6px;
+}
+.led_orange {
+    background-color: #FFB400;
+    box-shadow: rgba(0, 0, 0, 0.2) 0 -1px 7px 1px, inset #304701 0 -1px 6px, #FF9000 0 0px 6px;
+}
+.led-small {
+    margin: auto auto;
+    margin-top: 6px;
+    width: 12px;
+    height: 12px;
+    border-radius: 50%;
+}
+.led-container {
+    text-align: center;
+    display: inline-block;
+}
+.led-container > span {
+    margin: auto 5px;
+}
+div.dataTables_scrollHead table.dataTable {
+    margin-top: 0px !important;
+}
+.dataTables_scrollBody thead tr {
+    visibility: collapse !important;
+}
+.liveLogFullScreen {
+    position: absolute !important;
+    top: 66px !important;
+    left: 15px !important;
+    right: 10px !important;
+    z-index: 990 !important;
+    bottom: -7px !important;
+    height: unset !important;
+}
+div.leaflet-bottom {
+    z-index: 900;
+}
+.rowTableIsObject {
+    position: absolute;
+    right: 15px;
+    top: 0px;
+    color: #3465a4;
+}
 </style>
 <body>
@@ -140,6 +221,7 @@ small {
                 <li><a href="{{ url_for('contrib') }}">MISP Contributors</a></li>
                 <li><a href="{{ url_for('users') }}">MISP Users</a></li>
                 <li><a href="{{ url_for('trendings') }}">MISP Trendings</a></li>
+                <li><a href="{{ url_for('logout') }}">Logout</a></li>
             </ul>
             <div id="ledsHolder" style="float: right; height: 50px;"></div>
@@ -198,7 +280,7 @@ small {
             </div>
             <!-- /.col-lg-6 -->
             <!-- /.col-lg-6 -->
-            <div class="col-lg-{{ size_dashboard_width[1] }}">
+            <div id="rightCol" class="col-lg-{{ size_dashboard_width[1] }}">
                 <div class="panel panel-default" style="margin-top: 15px; height: {{ pannelSize[2] }}vh;">
                     <div id="panelbody" class="panel-body" style="height: 100%;">
@@ -212,23 +294,12 @@ small {
                 <div id="panelLogTable" class="panel panel-default" style="height: {{ pannelSize[3] }}vh;">
                     <div class="panel-heading">
                         <i class="fa fa-tasks fa-fw"></i> Logs
-                        <div class="pull-right">
-                            <input id="checkbox_log_info" type="checkbox" value="info"> INFO
-                            <input id="checkbox_log_warning" type="checkbox" value="warning" checked="true"> WARNING
-                            <input id="checkbox_log_critical" type="checkbox" value="critical" checked="true"> CRITICAL
+                        <div style="display: inline-block; float: right;">
+                            <button id="log-filter" data-toggle="modal" data-target="#modalFilters" class="btn btn-xs btn-default" ><i class="fa fa-filter"></i></button>
+                            <button id="log-fullscreen" class="btn btn-xs btn-default"><i class="fa fa-expand"></i></button>
                         </div>
                     </div>
-                    <div id="divLogTable" class="panel-body" style="height: 98%; padding: 0px;">
-                        <div class="row" style="height: 100%;">
-                            <div class="col-lg-12" style="height: 100%;">
-                                <table class="table table-bordered table-hover table-striped" id="table_log">
-                                    <thead id="table_log_head">
-                                    </thead>
-                                    <tbody id="table_log_body">
-                                    </tbody>
-                                </table>
-                            </div>
-                        </div>
+                    <div id="divLogTable" class="panel-body" style="height: calc(100% - 46px); padding: 0px; overflow: hidden">
                     </div>
                 </div>
@@ -254,6 +325,25 @@ small {
     </div>
     <!-- /#wrapper -->
+    <!-- Modal -->
+    <div class="modal fade" id="modalFilters" tabindex="-1" role="dialog" aria-labelledby="myModalLabel">
+        <div class="modal-dialog modal-lg" role="document">
+            <div class="modal-content">
+                <div class="modal-header">
+                    <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button>
+                    <h4 class="modal-title" id="myModalLabel">Log filtering rules</h4>
+                </div>
+                <div class="modal-body">
+                    <div id="filteringQB"></div>
+                </div>
+                <div class="modal-footer">
+                    <button type="button" class="btn btn-default" data-dismiss="modal">Close</button>
+                    <button id="saveFilters" type="button" class="btn btn-primary">Save filters</button>
+                </div>
+            </div>
+        </div>
+    </div>
     <!-- Index -->
     <script>
         /* URL */
@@ -299,11 +389,6 @@ small {
     <script src="{{ url_for('static', filename='js/index/index_map.js') }}"></script>
     <script src="{{ url_for('static', filename='js/index/index_pie.js') }}"></script>
-    <link href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
-    <link rel="stylesheet" href="{{ url_for('static', filename='css/jquery-jvectormap-2.0.3.css') }}" type="text/css" media="screen"/>
-    <script src="{{ url_for('static', filename='js/jquery-jvectormap-2.0.3.min.js') }}"></script>
-    <script src="{{ url_for('static', filename='js/jquery-jvectormap-world-mill.js') }}"></script>
     <script type="text/javascript">
     </script>

templates/login.html (new file, 67 lines)

@ -0,0 +1,67 @@
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <meta name="viewport" content="width=device-width" />
<title>
Users - MISP
</title>
<!-- jQuery -->
<script src="{{ url_for('static', filename='js/jquery.min.js') }}"></script>
<!-- Bootstrap Core CSS -->
<link href="{{ url_for('static', filename='css/bootstrap.min.css') }}" rel="stylesheet">
<!-- Custom CSS -->
<link href="{{ url_for('static', filename='css/sb-admin-2.css') }}" rel="stylesheet">
<!-- Bootstrap Core JavaScript -->
<script src="{{ url_for('static', filename='js/bootstrap.js') }}"></script>
<link rel="stylesheet" href="{{ url_for('static', filename='css/font-awesome.min.css') }}" rel="text/css">
</head>
<body>
<div id="flashContainer" style="padding-top:50px; !important;">
<div id="main-view-container" class="container-fluid ">
</div>
</div>
<div>
<div style="width:100%;">
<table style="margin-left:auto;margin-right:auto;">
<tr>
<td style="text-align:right;width:250px;padding-right:50px"></td>
<td style="width:460px">
<div>
<img src="{{ url_for('static', filename='pics/misp-logo.png') }}" style="display:block; margin-left: auto; margin-right: auto;"/>
</div>
{% if authError %}
<div class="alert alert-danger">
{{ authErrorMessage }}
</div>
{% endif %}
<form action="" id="UserLoginForm" method="post" accept-charset="utf-8">
<br><legend>Welcome to MISP-Dashboard</legend><br>
<div class="input email required">
{{ form.username.label }}<br>
{{ form.username(size=32, maxlength=255, autocomplete="off", autofocus="autofocus") }}
</div>
<div class="input password required">
{{ form.password.label }}<br>
{{ form.password(size=32, maxlength=255, autocomplete="off") }}
</div>
<div class="clear"></div>
<p>{{ form.submit() }}</p>
</form>
</td>
<td style="width:250px;padding-left:50px"></td>
</tr>
</table>
</div>
</div>
</body>
</html>


@@ -201,7 +201,7 @@
     var url_getTopOrglogin = "{{ url_for('getTopOrglogin') }}";
     var url_getLoginVSCOntribution = "{{ url_for('getLoginVSCOntribution') }}";
     var url_getUserLoginsAndContribOvertime = "{{ url_for('getUserLoginsAndContribOvertime') }}";
-    var url_getTypeaheadData = "{{ url_for('getAllOrg') }}";
+    var url_getTypeaheadData = "{{ url_for('getAllLoggedOrg') }}";
     /* DATA FROM CONF */


@@ -1,8 +1,13 @@
 #!/usr/bin/env python3.5
 import configparser
-import redis
-import sys,os
 import datetime
+import os
+import sys
+
+import redis
+
+from helpers import geo_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +19,6 @@ serv_redis_db = redis.StrictRedis(
     port=6260,
     db=1)
 
-from helpers import geo_helper
 geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
 
 categ = 'Network Activity'


@@ -1,8 +1,14 @@
 #!/usr/bin/env python3.5
 import configparser
+import datetime
+import os
+import sys
+import time
+
 import redis
-import sys,os
-import datetime, time
+
+from helpers import trendings_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +20,6 @@ serv_redis_db = redis.StrictRedis(
     port=6260,
     db=1)
 
-from helpers import trendings_helper
 trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)


@@ -1,8 +1,14 @@
 #!/usr/bin/env python3.5
 import configparser
+import datetime
+import os
+import sys
+import time
+
 import redis
-import sys,os
-import datetime, time
+
+from helpers import users_helper
 
 sys.path.append('..')
 configfile = 'test_config.cfg'
@@ -14,7 +20,6 @@ serv_redis_db = redis.StrictRedis(
     port=6260,
     db=1)
 
-from helpers import users_helper
 users_helper = users_helper.Users_helper(serv_redis_db, cfg)
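All three test scripts above receive the same mechanical refactor: imports regrouped per PEP 8 (standard library first, then third-party, then local code), with the helper import hoisted next to the other imports instead of sitting below the Redis setup. A sketch of the layout they converge on; the helper shown is the one from the trendings script, and running it assumes the repo's helpers package is resolvable:

#!/usr/bin/env python3.5
# Import grouping used across the refactored test scripts (PEP 8):
import configparser   # -- standard library --
import datetime
import os
import sys
import time

import redis          # -- third-party --

from helpers import trendings_helper  # -- local package --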

updates.py (new file, 79 lines)

@ -0,0 +1,79 @@
import redis
import os
import configparser
import logging
DATABASE_VERSION = [
1
]
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
serv_log = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLog', 'db'))
serv_redis_db = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisDB', 'db'))
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
# logger
logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'update_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
handler = logging.FileHandler(logPath)
formatter = logging.Formatter('%(asctime)s:%(levelname)s:%(name)s:%(message)s')
handler.setFormatter(formatter)
update_logger = logging.getLogger(__name__)
update_logger.setLevel(logging.INFO)
update_logger.addHandler(handler)
def check_for_updates():
db_version = serv_redis_db.get(cfg.get('RedisDB', 'dbVersion'))
db_version = int(db_version) if db_version is not None else 0
updates_to_be_done = find_updates(db_version)
if len(updates_to_be_done) == 0:
update_logger.info('database up-to-date')
else:
for i in updates_to_be_done:
exec_updates(i)
def find_updates(db_version):
updates_to_be_done = []
for i in DATABASE_VERSION:
if db_version < i:
updates_to_be_done.append(i)
return updates_to_be_done
def exec_updates(db_version):
result = False
if db_version == 1:
result = apply_update_1()
if result:
serv_redis_db.set(cfg.get('RedisDB', 'dbVersion'), db_version)
update_logger.warning('dbVersion sets to {}'.format(db_version))
else:
update_logger.error('Something went wrong. {}'.format(result))
# Data format changed. Wipe the key.
def apply_update_1():
serv_redis_db.delete('TEMP_CACHE_LIVE:Attribute')
log_text = 'Executed update 1. Deleted Redis key `TEMP_CACHE_LIVE:Attribute`'
print(log_text)
update_logger.info(log_text)
return True
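updates.py gives the dashboard a small versioned-migration scheme: DATABASE_VERSION lists every known schema version, find_updates() keeps the ones above the dbVersion stored in Redis, and exec_updates() runs each one in order and bumps the stored version on success. A hypothetical sketch of how a second migration would slot in; the version number, function body, and Redis key below are invented for illustration:

# Hypothetical second migration following the updates.py scheme.
DATABASE_VERSION = [
    1,
    2,  # each future format change appends its version here
]

def exec_updates(db_version):
    result = False
    if db_version == 1:
        result = apply_update_1()
    if db_version == 2:
        result = apply_update_2()
    if result:
        serv_redis_db.set(cfg.get('RedisDB', 'dbVersion'), db_version)
        update_logger.warning('dbVersion sets to {}'.format(db_version))

# Data format changed again. Wipe the affected key (invented example).
def apply_update_2():
    serv_redis_db.delete('TEMP_CACHE_LIVE:Event')
    update_logger.info('Executed update 2. Deleted Redis key `TEMP_CACHE_LIVE:Event`')
    return True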

util.py (57 changed lines)

@@ -1,4 +1,6 @@
-import datetime, time
+import datetime
+import time
+from collections import defaultdict
 
 ONE_DAY = 60*60*24
 
@@ -6,7 +8,7 @@ def getZrange(serv_redis_db, keyCateg, date, topNum, endSubkey=""):
     date_str = getDateStrFormat(date)
     keyname = "{}:{}{}".format(keyCateg, date_str, endSubkey)
     data = serv_redis_db.zrange(keyname, 0, topNum-1, desc=True, withscores=True)
-    data = [ [record[0].decode('utf8'), record[1]] for record in data ]
+    data = [ [record[0], record[1]] for record in data ]
     return data
 
 def noSpaceLower(text):
@@ -16,7 +18,7 @@ def push_to_redis_zset(serv_redis_db, mainKey, toAdd, endSubkey="", count=1):
     now = datetime.datetime.now()
     today_str = getDateStrFormat(now)
     keyname = "{}:{}{}".format(mainKey, today_str, endSubkey)
-    serv_redis_db.zincrby(keyname, toAdd, count)
+    serv_redis_db.zincrby(keyname, count, toAdd)
 
 def getMonthSpan(date):
     ds = datetime.datetime(date.year, date.month, 1)
@@ -71,3 +73,52 @@ def getDateHoursStrFormat(date):
 
 def getTimestamp(date):
     return int(time.mktime(date.timetuple()))
+
+def sortByTrendingScore(toSort, topNum=5):
+    scoredLabels = defaultdict(float)
+    numDay = len(toSort)
+    baseDecay = 1.0
+    decayRate = lambda x: baseDecay*((numDay-x**2)/numDay)
+    for i, arr in enumerate(toSort):
+        timestamp = arr[0]
+        dailyData = arr[1]
+        for item in dailyData:
+            label = item[0]
+            occ = item[1]
+            scoredLabels[label] += occ*decayRate(i)
+
+    topList = [[l, s] for l, s in scoredLabels.items()]
+    topList.sort(key=lambda x: x[1], reverse=True)
+    topSet = [ l for l, v in topList[:topNum]]
+
+    # now that we have the top, filter out poor scored elements
+    topArray = []
+    for arr in toSort:
+        timestamp = arr[0]
+        dailyData = arr[1]
+        topDailyArray = list(filter(lambda item: (item[0] in topSet), dailyData))
+        dailyCombi = [timestamp, topDailyArray]
+        topArray.append(dailyCombi)
+
+    return topArray
+
+def getFields(obj, fields):
+    jsonWalker = fields.split('.')
+    itemToExplore = obj
+    lastName = ""
+    try:
+        for i in jsonWalker:
+            itemToExplore = itemToExplore[i]
+            lastName = i
+        if type(itemToExplore) is list:
+            return {'name': lastName, 'data': itemToExplore}
+        else:
+            if i == 'timestamp':
+                itemToExplore = datetime.datetime.utcfromtimestamp(
+                    int(itemToExplore)).strftime('%Y-%m-%d %H:%M:%S')
+            return itemToExplore
+    except KeyError as e:
+        return None
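Two notes on this file: the zincrby argument swap tracks redis-py 3.x, whose signature became zincrby(name, amount, value), and the dropped .decode('utf8') pairs with the decode_responses=True Redis clients introduced elsewhere in this merge. The new sortByTrendingScore weights each day's counts by an index-dependent decay factor before keeping only the overall top-N labels. A small usage sketch with invented sample data, assuming util.py is importable from the repo root:

from util import sortByTrendingScore

# Two days of [timestamp, [[label, count], ...]] entries.
days = [
    [1617000000, [['phishing', 3], ['botnet', 10]]],
    [1617086400, [['phishing', 8], ['ransomware', 2]]],
]

# With numDay=2 the decay factors are 1.0 (index 0) and 0.5 (index 1),
# so the scores come out botnet=10.0, phishing=7.0, ransomware=1.0 and
# only the top two labels survive in each day's filtered list.
print(sortByTrendingScore(days, topNum=2))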

zmq_dispatcher.py

@@ -1,85 +1,92 @@
 #!/usr/bin/env python3
-import time, datetime
-import copy
-import logging
-import zmq
-import redis
-import random
-import configparser
 import argparse
-import os
-import sys
+import configparser
+import copy
+import datetime
 import json
+import logging
+import os
+import random
+import sys
+import time
+
+import redis
+import zmq
 
 import util
-from helpers import geo_helper
-from helpers import contributor_helper
-from helpers import users_helper
-from helpers import trendings_helper
+import updates
+from helpers import (contributor_helper, geo_helper, live_helper,
+                     trendings_helper, users_helper)
 
 configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
 cfg = configparser.ConfigParser()
 cfg.read(configfile)
 
 logDir = cfg.get('Log', 'directory')
-logfilename = cfg.get('Log', 'filename')
+logfilename = cfg.get('Log', 'dispatcher_filename')
 logPath = os.path.join(logDir, logfilename)
 if not os.path.exists(logDir):
     os.makedirs(logDir)
-logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+try:
+    logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+except PermissionError as error:
+    print(error)
+    print("Please fix the above and try again.")
+    sys.exit(126)
 logger = logging.getLogger('zmq_dispatcher')
 
-CHANNEL = cfg.get('RedisLog', 'channel')
 LISTNAME = cfg.get('RedisLIST', 'listName')
 
 serv_log = redis.StrictRedis(
     host=cfg.get('RedisGlobal', 'host'),
     port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisLog', 'db'))
+    db=cfg.getint('RedisLog', 'db'),
+    decode_responses=True)
 serv_redis_db = redis.StrictRedis(
     host=cfg.get('RedisGlobal', 'host'),
     port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisDB', 'db'))
+    db=cfg.getint('RedisDB', 'db'),
+    decode_responses=True)
 serv_list = redis.StrictRedis(
     host=cfg.get('RedisGlobal', 'host'),
     port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisLIST', 'db'))
+    db=cfg.getint('RedisLIST', 'db'),
+    decode_responses=True)
 
+live_helper = live_helper.Live_helper(serv_redis_db, cfg)
 geo_helper = geo_helper.Geo_helper(serv_redis_db, cfg)
 contributor_helper = contributor_helper.Contributor_helper(serv_redis_db, cfg)
 users_helper = users_helper.Users_helper(serv_redis_db, cfg)
 trendings_helper = trendings_helper.Trendings_helper(serv_redis_db, cfg)
 
-def publish_log(zmq_name, name, content, channel=CHANNEL):
-    to_send = { 'name': name, 'log': json.dumps(content), 'zmqName': zmq_name }
-    serv_log.publish(channel, json.dumps(to_send))
-    logger.debug('Published: {}'.format(json.dumps(to_send)))
-
-def getFields(obj, fields):
-    jsonWalker = fields.split('.')
-    itemToExplore = obj
-    lastName = ""
-    try:
-        for i in jsonWalker:
-            itemToExplore = itemToExplore[i]
-            lastName = i
-        if type(itemToExplore) is list:
-            return { 'name': lastName , 'data': itemToExplore }
-        else:
-            return itemToExplore
-    except KeyError as e:
-        return ""
-
 ##############
 ## HANDLERS ##
 ##############
 
-def handler_log(zmq_name, jsonevent):
+def handler_skip(zmq_name, jsonevent):
     logger.info('Log not processed')
     return
 
+def handler_audit(zmq_name, jsondata):
+    action = jsondata.get('action', None)
+    jsonlog = jsondata.get('Log', None)
+    if action is None or jsonlog is None:
+        return
+
+    # consider login operations
+    if action == 'log': # audit is related to log
+        logAction = jsonlog.get('action', None)
+        if logAction == 'login': # only consider user login
+            timestamp = int(time.time())
+            email = jsonlog.get('email', '')
+            org = jsonlog.get('org', '')
+            users_helper.add_user_login(timestamp, org, email)
+    else:
+        pass
+
 def handler_dispatcher(zmq_name, jsonObj):
     if "Event" in jsonObj:
         handler_event(zmq_name, jsonObj)
@@ -87,17 +94,17 @@ def handler_dispatcher(zmq_name, jsonObj):
 def handler_keepalive(zmq_name, jsonevent):
     logger.info('Handling keepalive')
     to_push = [ jsonevent['uptime'] ]
-    publish_log(zmq_name, 'Keepalive', to_push)
+    live_helper.publish_log(zmq_name, 'Keepalive', to_push)
 
+# Login are no longer pushed by `misp_json_user`, but by `misp_json_audit`
 def handler_user(zmq_name, jsondata):
     logger.info('Handling user')
     action = jsondata['action']
     json_user = jsondata['User']
     json_org = jsondata['Organisation']
     org = json_org['name']
-    if action == 'login': #only consider user login
-        timestamp = int(time.time())
-        users_helper.add_user_login(timestamp, org)
+    if action == 'edit': #only consider user login
+        pass
     else:
         pass
 
@@ -123,7 +130,16 @@ def handler_conversation(zmq_name, jsonevent):
 def handler_object(zmq_name, jsondata):
     logger.info('Handling object')
-    return
+    # check if jsonattr is an mispObject object
+    if 'Object' in jsondata:
+        jsonobj = jsondata['Object']
+        soleObject = copy.deepcopy(jsonobj)
+        del soleObject['Attribute']
+        for jsonattr in jsonobj['Attribute']:
+            jsonattrcpy = copy.deepcopy(jsonobj)
+            jsonattrcpy['Event'] = jsondata['Event']
+            jsonattrcpy['Attribute'] = jsonattr
+            handler_attribute(zmq_name, jsonattrcpy, False, parentObject=soleObject)
 
 def handler_sighting(zmq_name, jsondata):
     logger.info('Handling sighting')
@@ -151,11 +167,8 @@ def handler_event(zmq_name, jsonobj):
     timestamp = jsonevent['timestamp']
     trendings_helper.addTrendingEvent(eventName, timestamp)
     tags = []
-    for tag in jsonobj.get('EventTag', []):
-        try:
-            tags.append(tag['Tag'])
-        except KeyError:
-            pass
+    for tag in jsonevent.get('Tag', []):
+        tags.append(tag)
     trendings_helper.addTrendingTags(tags, timestamp)
 
     #redirect to handler_attribute
@@ -169,6 +182,16 @@ def handler_event(zmq_name, jsonobj):
         else:
             handler_attribute(zmq_name, attributes)
 
+    if 'Object' in jsonevent:
+        objects = jsonevent['Object']
+        if type(objects) is list:
+            for obj in objects:
+                jsoncopy = copy.deepcopy(jsonobj)
+                jsoncopy['Object'] = obj
+                handler_object(zmq_name, jsoncopy)
+        else:
+            handler_object(zmq_name, objects)
+
     action = jsonobj.get('action', None)
     eventLabeled = len(jsonobj.get('EventTag', [])) > 0
     org = jsonobj.get('Orgc', {}).get('name', None)
@@ -180,11 +203,15 @@ def handler_event(zmq_name, jsonobj):
             action,
             isLabeled=eventLabeled)
 
-def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False):
+def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False, parentObject=False):
     logger.info('Handling attribute')
     # check if jsonattr is an attribute object
     if 'Attribute' in jsonobj:
         jsonattr = jsonobj['Attribute']
+    else:
+        jsonattr = jsonobj
+
+    attributeType = 'Attribute' if jsonattr['object_id'] == '0' else 'ObjectAttribute'
 
     #Add trending
     categName = jsonattr['category']
@@ -192,22 +219,9 @@ def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False):
     trendings_helper.addTrendingCateg(categName, timestamp)
     tags = []
     for tag in jsonattr.get('Tag', []):
-        try:
-            tags.append(tag)
-        except KeyError:
-            pass
+        tags.append(tag)
     trendings_helper.addTrendingTags(tags, timestamp)
 
-    to_push = []
-    for field in json.loads(cfg.get('Dashboard', 'fieldname_order')):
-        if type(field) is list:
-            to_join = []
-            for subField in field:
-                to_join.append(str(getFields(jsonobj, subField)))
-            to_add = cfg.get('Dashboard', 'char_separator').join(to_join)
-        else:
-            to_add = getFields(jsonobj, field)
-        to_push.append(to_add)
-
     #try to get coord from ip
     if jsonattr['category'] == "Network activity":
@@ -221,13 +235,19 @@ def handler_attribute(zmq_name, jsonobj, hasAlreadyBeenContributed=False):
         eventLabeled = len(jsonobj.get('EventTag', [])) > 0
         action = jsonobj.get('action', None)
         contributor_helper.handleContribution(zmq_name, jsonobj['Event']['Orgc']['name'],
-            'Attribute',
+            attributeType,
             jsonattr['category'],
             action,
             isLabeled=eventLabeled)
     # Push to log
-    publish_log(zmq_name, 'Attribute', to_push)
+    live_helper.publish_log(zmq_name, attributeType, jsonobj)
+
+def handler_diagnostic_tool(zmq_name, jsonobj):
+    try:
+        res = time.time() - float(jsonobj['content'])
+    except Exception as e:
+        logger.error(e)
+    serv_list.set('diagnostic_tool_response', str(res))
 
 ###############
 ## MAIN LOOP ##
@@ -243,15 +263,18 @@ def process_log(zmq_name, event):
 
 def main(sleeptime):
+    updates.check_for_updates()
     numMsg = 0
     while True:
         content = serv_list.rpop(LISTNAME)
         if content is None:
-            logger.debug('Processed {} message(s) since last sleep.'.format(numMsg))
+            log_text = 'Processed {} message(s) since last sleep.'.format(numMsg)
+            logger.info(log_text)
             numMsg = 0
             time.sleep(sleeptime)
            continue
-        content = content.decode('utf8')
+        content = content
         the_json = json.loads(content)
         zmqName = the_json['zmq_name']
         content = the_json['content']
@@ -266,17 +289,22 @@ dico_action = {
     "misp_json_attribute": handler_attribute,
     "misp_json_object": handler_object,
     "misp_json_sighting": handler_sighting,
-    "misp_json_organisation": handler_log,
+    "misp_json_organisation": handler_skip,
     "misp_json_user": handler_user,
     "misp_json_conversation": handler_conversation,
-    "misp_json_object_reference": handler_log,
+    "misp_json_object_reference": handler_skip,
+    "misp_json_audit": handler_audit,
+    "diagnostic_channel": handler_diagnostic_tool
 }
 
 if __name__ == "__main__":
     parser = argparse.ArgumentParser(description='The ZMQ dispatcher. It pops from the redis buffer then redispatch it to the correct handlers')
-    parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of second to wait before checking redis list size', default=5)
+    parser.add_argument('-s', '--sleep', required=False, dest='sleeptime', type=int, help='The number of second to wait before checking redis list size', default=1)
     args = parser.parse_args()
 
-    main(args.sleeptime)
+    try:
+        main(args.sleeptime)
+    except (redis.exceptions.ResponseError, KeyboardInterrupt) as error:
+        print(error)
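The dispatcher's core pattern is the dico_action table: every message popped from the Redis list names its ZMQ topic, and the table maps that name to a handler, with handler_skip as the explicit no-op. A stripped-down sketch of that pop-and-dispatch shape; Redis and the real handlers are stubbed out here, and the topic/payload split simplifies what the real process_log does:

# Minimal sketch of the name -> handler dispatch pattern (stubs only).
import json

def handler_keepalive(zmq_name, payload):
    print('keepalive from', zmq_name, payload['uptime'])

def handler_skip(zmq_name, payload):
    pass  # topics deliberately ignored

dico_action = {
    'misp_json_self': handler_keepalive,
    'misp_json_organisation': handler_skip,
}

def process_message(raw):
    the_json = json.loads(raw)                      # as queued by the subscriber
    topic, _, payload = the_json['content'].partition(' ')
    handler = dico_action.get(topic, handler_skip)  # unknown topic -> skip
    handler(the_json['zmq_name'], json.loads(payload))

process_message(json.dumps({
    'zmq_name': 'MISP Standard ZMQ',
    'content': 'misp_json_self {"uptime": 42}',
}))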

zmq_subscriber.py

@@ -1,34 +1,41 @@
 #!/usr/bin/env python3
-import time, datetime
-import zmq
-import logging
-import redis
-import configparser
 import argparse
+import configparser
+import datetime
+import json
+import logging
 import os
 import sys
-import json
+import time
+
+import redis
+import zmq
 
 configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
 cfg = configparser.ConfigParser()
 cfg.read(configfile)
 
 logDir = cfg.get('Log', 'directory')
-logfilename = cfg.get('Log', 'filename')
+logfilename = cfg.get('Log', 'subscriber_filename')
 logPath = os.path.join(logDir, logfilename)
 if not os.path.exists(logDir):
     os.makedirs(logDir)
-logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+try:
+    logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
+except PermissionError as error:
+    print(error)
+    print("Please fix the above and try again.")
+    sys.exit(126)
 logger = logging.getLogger('zmq_subscriber')
 
-ZMQ_URL = cfg.get('RedisGlobal', 'zmq_url')
 CHANNEL = cfg.get('RedisLog', 'channel')
 LISTNAME = cfg.get('RedisLIST', 'listName')
 
 serv_list = redis.StrictRedis(
     host=cfg.get('RedisGlobal', 'host'),
     port=cfg.getint('RedisGlobal', 'port'),
-    db=cfg.getint('RedisLIST', 'db'))
+    db=cfg.getint('RedisLIST', 'db'),
+    decode_responses=True)
 
 ###############
@@ -41,25 +48,31 @@ def put_in_redis_list(zmq_name, content):
     serv_list.lpush(LISTNAME, json.dumps(to_add))
     logger.debug('Pushed: {}'.format(json.dumps(to_add)))
 
-def main(zmqName):
+def main(zmqName, zmqurl):
     context = zmq.Context()
     socket = context.socket(zmq.SUB)
-    socket.connect(ZMQ_URL)
+    socket.connect(zmqurl)
     socket.setsockopt_string(zmq.SUBSCRIBE, '')
 
     while True:
         try:
             content = socket.recv()
             put_in_redis_list(zmqName, content)
+            print(zmqName, content)
         except KeyboardInterrupt:
            return
+        except Exception as e:
+            logger.warning('Error:' + str(e))
 
 if __name__ == "__main__":
-    parser = argparse.ArgumentParser(description='A zmq subscriber. It subscribes to a ZNQ then redispatch it to the misp-dashboard')
+    parser = argparse.ArgumentParser(description='A zmq subscriber. It subscribes to a ZMQ then redispatch it to the misp-dashboard')
     parser.add_argument('-n', '--name', required=False, dest='zmqname', help='The ZMQ feed name', default="MISP Standard ZMQ")
-    parser.add_argument('-u', '--url', required=False, dest='zmqurl', help='The URL to connect to', default=ZMQ_URL)
+    parser.add_argument('-u', '--url', required=False, dest='zmqurl', help='The URL to connect to', default="tcp://localhost:50000")
     args = parser.parse_args()
 
-    main(args.zmqname)
+    try:
+        main(args.zmqname, args.zmqurl)
+    except redis.exceptions.ResponseError as error:
+        print(error)
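With the ZMQ URL demoted from config to a CLI flag, one subscriber process can be pointed at each MISP instance, e.g. ./zmq_subscriber.py -n "MISP-EU" -u "tcp://misp.example.org:50000" (name and host invented for illustration); zmq_subscribers.py below automates exactly that fan-out.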

zmq_subscribers.py (new executable file, 74 lines)

@ -0,0 +1,74 @@
#!/usr/bin/env python3
import time, datetime
import logging
import redis
import configparser
import argparse
import os
import subprocess
import sys
import json
import atexit
import signal
import shlex
import pty
import threading
configfile = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'config/config.cfg')
cfg = configparser.ConfigParser()
cfg.read(configfile)
logDir = cfg.get('Log', 'directory')
logfilename = cfg.get('Log', 'subscriber_filename')
logPath = os.path.join(logDir, logfilename)
if not os.path.exists(logDir):
os.makedirs(logDir)
logging.basicConfig(filename=logPath, filemode='a', level=logging.INFO)
logger = logging.getLogger('zmq_subscriber')
CHANNEL = cfg.get('RedisLog', 'channel')
LISTNAME = cfg.get('RedisLIST', 'listName')
serv_list = redis.StrictRedis(
host=cfg.get('RedisGlobal', 'host'),
port=cfg.getint('RedisGlobal', 'port'),
db=cfg.getint('RedisLIST', 'db'))
children = []
def signal_handler(signal, frame):
for child in children:
# We don't resume as we are already attached
cmd = "screen -p"+child+" -X {arg}"
argsc = shlex.split(cmd.format(arg = "kill"))
print("\n\033[1;31m [-] Terminating {child}\033[0;39m".format(child=child))
logger.info('Terminate: {child}'.format(child=child))
subprocess.call(argsc) # kill window
sys.exit(0)
###############
## MAIN LOOP ##
###############
def main():
print("\033[1;31m [+] I am the subscriber's master - kill me to kill'em'all \033[0;39m")
# screen needs a shell and I am no fan of shell=True
(master, slave) = pty.openpty()
try:
for item in json.loads(cfg.get('RedisGlobal', 'misp_instances')):
name = shlex.quote(item.get("name"))
zmq = shlex.quote(item.get("zmq"))
print("\033[1;32m [+] Subscribing to "+zmq+"\033[0;39m")
logger.info('Launching: {child}'.format(child=name))
children.append(name)
subprocess.Popen(["screen", "-r", "Misp_Dashboard", "-X", "screen", "-t", name ,sys.executable, "./zmq_subscriber.py", "-n", name, "-u", zmq], close_fds=True, shell=False, stdin=slave, stdout=slave, stderr=slave)
except ValueError as error:
print("\033[1;31m [!] Fatal exception: {error} \033[0;39m".format(error=error))
logger.error("JSON error: %s", error)
sys.exit(1)
signal.signal(signal.SIGINT, signal_handler)
forever = threading.Event()
forever.wait() # Wait for SIGINT
if __name__ == "__main__":
main()
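zmq_subscribers.py (plural) is the process manager: it parses a misp_instances JSON array out of config.cfg and spawns one zmq_subscriber.py per entry inside the running screen session. A sketch of the configuration shape it expects, parsed the same way the script does; the two instances are invented examples:

# Sketch of the `misp_instances` value zmq_subscribers.py reads from
# config.cfg; the two entries below are invented examples.
import configparser
import json

cfg = configparser.ConfigParser()
cfg.read_string("""
[RedisGlobal]
misp_instances = [
        {"name": "MISP-main", "zmq": "tcp://localhost:50000"},
        {"name": "MISP-lab", "zmq": "tcp://10.0.0.5:50000"}]
""")

for item in json.loads(cfg.get('RedisGlobal', 'misp_instances')):
    # zmq_subscribers.py launches `zmq_subscriber.py -n <name> -u <zmq>`
    # in a new screen window for each entry.
    print(item['name'], '->', item['zmq'])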