Lookyloo is a web interface that lets you scrape a website and then displays a tree of the domains calling each other. https://lookyloo.circl.lu/

Thank you very much to the Tech Blog @ willshouse.com for the up-to-date list of user agents.

What is that name?!

1. People who just come to look.
2. People who go out of their way to look at people or something, often causing crowds and more disruption.
3. People who enjoy staring at other people's misfortune; oftentimes onlookers at car accidents.
Same as Looky Lou; often spelled as Looky-loo (hyphen) or lookylou.
In L.A., the lookyloos usually cause more accidents by not paying full attention to what is ahead of them.

Source: Urban Dictionary

Screenshot

Screenshot of Lookyloo

Implementation details

This code is heavily inspired by webplugin and adapted to use Flask as the backend.

The two core dependencies of this project are the following:

  • ETE Toolkit: A Python framework for the analysis and visualization of trees.
  • Splash: A lightweight, scriptable browser as a service with an HTTP API.

Cookies

If you want to scrape a website as if you were logged in, you need to pass your session cookies. You can do it the following way:

  1. Install Cookie Quick Manager
  2. Click on the icon in the top right of your browser > Manage all cookies
  3. Search for a domain, tick the Sub-domain box if needed
  4. Right click on the domain you want to export > save to file > $LOOKYLOO_HOME/cookies.json

Then, you need to restart the webserver; from then on, every cookie in that file will be available to the browser used by Splash.
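As a rough illustration of what happens with that file, here is a sketch of a loader, assuming the export is a JSON list of cookie objects. The key names ("Host raw", "Name raw", "Content raw") are assumptions about the Cookie Quick Manager export format; inspect your own cookies.json before relying on them.

```python
import json
import tempfile
from pathlib import Path

def load_cookies(path: Path) -> list:
    """Keep only the fields the Splash-driven browser is likely to need.

    Sketch only: the "* raw" key names are assumed, not guaranteed.
    """
    return [
        {
            'domain': entry.get('Host raw', ''),
            'name': entry.get('Name raw', ''),
            'value': entry.get('Content raw', ''),
        }
        for entry in json.loads(path.read_text())
    ]

# Demo with a minimal fake export, so the sketch runs without a real file:
with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as f:
    json.dump([{'Host raw': 'https://example.com/',
                'Name raw': 'session', 'Content raw': 'abc123'}], f)
cookies = load_cookies(Path(f.name))
print(cookies)  # [{'domain': 'https://example.com/', 'name': 'session', 'value': 'abc123'}]
```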

Python client

You can use pylookyloo as a standalone script or as a library; more details here.
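Under the hood, pylookyloo is a thin wrapper around the instance's HTTP API. As a rough sketch of the shape of an enqueue call (the /submit endpoint and the payload keys are assumptions; check the pylookyloo source for the real API), here is the request built by hand, without sending it:

```python
import json
from urllib.request import Request

# Hypothetical values: point this at your own Lookyloo instance.
instance = 'https://lookyloo.circl.lu'

# Assumption: a scrape is enqueued by POSTing JSON to <instance>/submit.
payload = json.dumps({'url': 'https://www.example.com'}).encode()
req = Request(f'{instance}/submit', data=payload,
              headers={'Content-Type': 'application/json'})

print(req.get_method(), req.full_url)  # POST https://lookyloo.circl.lu/submit
# urllib.request.urlopen(req) would actually send the request.
```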

Installation

IMPORTANT: Use pipenv

NOTE: Yes, it requires Python 3.6+. No, it will never support anything older.

NOTE: If you want to run a public instance, you should set only_global_lookups=True in website/web/__init__.py and bin/async_scrape.py to disallow scraping of private IPs.
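The point of that setting is to refuse targets that resolve to private or otherwise non-routable addresses. Conceptually, the check boils down to something like this sketch using the standard library (this is an illustration of the idea, not Lookyloo's actual code):

```python
import ipaddress
import socket

def is_global(hostname: str) -> bool:
    """Return True only if the hostname resolves to a publicly routable IP."""
    try:
        ip = ipaddress.ip_address(socket.gethostbyname(hostname))
    except (socket.gaierror, ValueError):
        return False
    return ip.is_global

# Private and loopback addresses are rejected:
print(ipaddress.ip_address('10.0.0.1').is_global)   # False
print(ipaddress.ip_address('127.0.0.1').is_global)  # False
```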

Installation of Splash

You need a running Splash instance, preferably in Docker:

sudo apt install docker.io
sudo docker pull scrapinghub/splash
sudo docker run -p 8050:8050 -p 5023:5023 scrapinghub/splash --disable-ui --disable-lua --disable-browser-caches
# On a server with a decent amount of RAM, you may want to run it this way:
# sudo docker run -p 8050:8050 -p 5023:5023 scrapinghub/splash --disable-ui -s 100 --disable-lua -m 50000 --disable-browser-caches

Install redis

git clone https://github.com/antirez/redis.git
cd redis
git checkout 5.0
make
cd ..

Installation of Lookyloo

git clone https://github.com/CIRCL/lookyloo.git
cd lookyloo
pipenv install
echo LOOKYLOO_HOME="'`pwd`'" > .env

Run the app

pipenv run start.py

Run the app in production

With a reverse proxy (Nginx)

pip install uwsgi

Config files

You have to configure the two following files:

  • etc/nginx/sites-available/lookyloo
  • etc/systemd/system/lookyloo.service

Copy them to the appropriate directories, and run the following command:

sudo ln -s /etc/nginx/sites-available/lookyloo /etc/nginx/sites-enabled
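For reference, the site file is essentially a reverse proxy passing requests to the application. A minimal sketch could look like the following; the server name, socket path, and the use of a uWSGI socket are assumptions to adapt to your setup:

```nginx
server {
    listen 80;
    server_name lookyloo.example.com;  # assumption: replace with your domain

    location / {
        include uwsgi_params;
        # assumption: the uWSGI socket path configured in the systemd unit
        uwsgi_pass unix:/path/to/lookyloo/lookyloo.sock;
    }
}
```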

If needed, remove the default site:

sudo rm /etc/nginx/sites-enabled/default

Make sure everything is working:

sudo systemctl start lookyloo
sudo systemctl enable lookyloo
sudo nginx -t
# If it is cool:
sudo service nginx restart

And you can open http://<IP-or-domain>/

Now, you should configure TLS (Let's Encrypt, and so on).

Use aquarium for a reliable multi-users app

Aquarium is an HAProxy + Splash bundle that allows Lookyloo to be used by more than one user at once.

The initial version of the project was created by TeamHG-Memex but we have a dedicated repository that fits our needs better.

Follow the documentation if you want to use it.

Run the app with a simple docker setup

Dockerfile

The repository includes a Dockerfile for building a containerized instance of the app.

Lookyloo stores the scraped data in /lookyloo/scraped. If you want to persist the scraped data between runs, it is sufficient to define a volume for this directory.
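For example, a bind mount for that directory could look like this (the image name, host path, and exposed port are examples, not values taken from the repository):

```shell
# Build the image from the included Dockerfile, then run it with the
# scraped data persisted on the host:
docker build -t lookyloo .
docker run -p 5000:5000 -v /data/lookyloo/scraped:/lookyloo/scraped lookyloo
```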

Running a complete setup with Docker Compose

Additionally, you can start a complete setup, including the necessary Docker instance of Splash, with Docker Compose and the included service definition in docker-compose.yml:

docker-compose up

After building and startup are complete, Lookyloo should be available at http://localhost:5000/

If you want to persist the data between different runs, uncomment the "volumes" definition in the last two lines of docker-compose.yml and define a data storage directory on your Docker host system there.
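Uncommented, that section might look like the following fragment; the service name, indentation, and host path are assumptions that must match your docker-compose.yml:

```yaml
# Hypothetical fragment: adapt the host path on the left to your system.
    volumes:
      - /data/lookyloo/scraped:/lookyloo/scraped
```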