Work in progress
Setting up NOCLook
This guide is written for Ubuntu 14.04.
Basic requirements
Start by installing some basic requirements and creating a new user; run these commands as a superuser.
$ sudo apt-get install python-setuptools git libpq-dev python-dev
$ sudo easy_install pip
$ sudo pip install virtualenv
$ sudo adduser --disabled-password --home /var/opt/norduni ni
# sudo apt-get install git-core python-virtualenv openjdk-6-jdk build-essential postgresql python-psycopg2 libpq-dev python-dev
We are using PostgreSQL, but you can use any SQL database that Django supports. See the Django database documentation for other supported SQL databases.
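As a sketch, a settings.py database block for SQLite instead of PostgreSQL could look like the following (the database file path is an assumption; adjust it to your install):

```python
# Hypothetical settings.py fragment using SQLite instead of PostgreSQL.
# The database file path below is an assumption, not from this guide.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': '/var/opt/norduni/norduni.db',
    }
}
```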
Create database
Set password for database user and create a new database
sudo -u postgres psql postgres
\password postgres
Write password
Write password again
Ctrl+D
sudo -u postgres createdb norduni
NORDUni repository
Get the NORDUni files.
$ sudo -u ni -i
$ pwd
/var/opt/norduni
$ git clone git://git.nordu.net/norduni.git
Python environment
Make a virtual python environment.
$ virtualenv norduni_environment
Making a virtual environment is also just a suggestion but it makes it easier to keep your system clean.
Python requirements
Install required python modules.
$ . norduni_environment/bin/activate
$ pip install -r norduni/requirements.txt
Django settings
Change the Django settings.
cd /opt/norduni/src/niweb/
cp generic_settings.py settings.py
vi settings.py
Change at least the following settings.
NIWEB_ROOT = '/opt/norduni/src/niweb/'

# Database settings
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'norduni',
        'USER': 'postgres',
        'PASSWORD': 'secret',
        'HOST': 'localhost'
    }
}

# Neo4j settings
NEO4J_RESOURCE_URI = '/opt/norduni/dependencies/neo4jdb/'
Neo4j >1.5 embedded with Python bindings
Install JPype and Neo4j-embedded.
Download jpype. (http://sourceforge.net/projects/jpype/files/)
pip install neo4j-embedded
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk/
pip install /path/to/JPype-version.zip
Enable a login method in urls.py.

# Django Generic Login
(r'^accounts/login/$', 'django.contrib.auth.views.login'),

# Federated login
#(r'^accounts/', include('niweb.apps.fedlogin.urls')),
python manage.py syncdb
python manage.py runserver 0.0.0.0:80
Now you should be able to browse to your machine's IP address and see the NOCLook app running.
It is time to collect and insert some data.
Deploying NOCLook
Comment out the static media url in /opt/norduni/src/niweb/urls.py.
# Static serve
#(r'^site_media/(?P<path>.*)$', 'django.views.static.serve',
#    {'document_root': settings.STATIC_DEV_MEDIA}),
Install nginx, postfix and gunicorn.
sudo apt-get install nginx postfix
pip install gunicorn
Create a gunicorn start file.
#!/bin/bash
set -e
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk/
LOGFILE=/var/log/ni/noclook.log
LOGDIR=$(dirname $LOGFILE)
NUM_WORKERS=1
# user/group to run as
USER=user
GROUP=group
cd /opt/norduni/src/niweb
source env/bin/activate
test -d $LOGDIR || mkdir -p $LOGDIR
exec env/bin/gunicorn_django -w $NUM_WORKERS \
    --user=$USER --group=$GROUP --log-level=debug \
    --log-file=$LOGFILE 2>>$LOGFILE
Configure nginx.
server {
    listen 80;
    root /opt/norduni/src/niweb;
    server_name ni.example.net;

    access_log /var/log/ni/noclook-access.log;
    error_log /var/log/ni/noclook-error.log;

    location /static/ {
        root /opt/norduni/src/niweb/;
        autoindex on;
        access_log off;
        expires 30d;
    }

    location / {
        proxy_pass_header Server;
        proxy_set_header Host $http_host;
        proxy_redirect off;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Scheme $scheme;
        proxy_connect_timeout 10;
        proxy_read_timeout 10;
        proxy_pass http://localhost:8000/;
    }
}
Install supervisord and set it up to start at boot with the following Upstart job.
easy_install supervisor
echo_supervisord_conf > /etc/supervisord.conf
description "supervisord"
start on runlevel [2345]
stop on runlevel [!2345]
respawn
exec /usr/local/bin/supervisord --nodaemon --configuration /etc/supervisord.conf
Add the noclook start script to the supervisor configuration.
[program:noclook]
directory = /opt/norduni/src/niweb/
user = user
command = /opt/norduni/src/start_noclook.sh
stdout_logfile = /var/log/ni/supervisor_logfile.log
stderr_logfile = /var/log/ni/supervisor_err_logfile.log
Collecting and processing network data
To insert data you need to stop any Python process that is using the Neo4j database. We hope that a future version will be able to open additional database instances in read-only mode, so that this step can be avoided.
NORDUnet has a GIT repository called nistore and it is cloned to /opt/nistore/.
To start, have a look at the NERDS README, then clone the NERDS project.
cd /opt/norduni/
mkdir tools
cd tools
git clone https://github.com/fredrikt/nerds.git
Juniper Configuration Producer/Consumer
The Juniper configuration producer can load Juniper configuration directly from the router via SSH or Juniper configuration files in XML format from disk.
[ssh]
user = view_account_user
password = not_so_secret_password

[sources]
remote = one.example.org
         two.example.org
         three.example.org
local = /var/conf/one.xml
        /var/conf/two.xml
        /var/conf/three.xml
The producer outputs JSON files in the following format.

"host": {
    "juniper_conf": {
        "bgp_peerings": [
            {
                "as_number": "",
                "group": "",
                "description": "",
                "remote_address": "",
                "local_address": "",
                "type": ""
            },
        ],
        "interfaces": [
            {
                "name": "",
                "bundle": "",
                "vlantagging": true/false,
                "units": [
                    {
                        "address": [
                            "",
                            ""
                        ],
                        "description": "",
                        "unit": "",
                        "vlanid": ""
                    }
                ],
                "tunnels": [
                    {
                        "source": "",
                        "destination": ""
                    }
                ],
                "description": ""
            },
        ],
        "name": ""
    },
    "version": 1,
    "name": ""
}
The JSON files can then be inserted using noclook_juniper_consumer.py.
Change the path at the top of the script to be able to import norduni_client.py.
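A minimal sketch of that path change, assuming the directory shown below is where norduni_client.py lives in your checkout (the exact directory is an assumption; use your own):

```python
import sys

# Assumed location of norduni_client.py; adjust to your checkout.
NORDUNI_SRC = '/opt/norduni/src/scripts'
if NORDUNI_SRC not in sys.path:
    sys.path.insert(0, NORDUNI_SRC)

# With the path in place, the consumer script's import can succeed:
# import norduni_client
```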
Edit the template.conf file with the correct path to the Juniper NERDS files.
[data]
juniper_conf = /path/to/juniper/json
nmap_services =
alcatel_isis =
noclook =
Then run:
python noclook_juniper_consumer.py -C template.conf
Alcatel-Lucent ISIS Producer/Consumer
Using the output from the "show isis database detail" on a Cisco router
connected to the Alcatel-Lucent DCN network, nodes and their neighbors
will be grouped.
To get a more human-readable result, use the IOS command "clns" to map the NSAP address to a hostname, e.g. clns host hostname NSAP_address.
You can also provide a mapping CSV file. The mandatory columns are
osi_address and name. All following columns will be added to the JSON
output.
osi_address;name;other1;otherN
47002300000001000100010001002060280DB11D;NU-SHHM-ILA-01;info1;infoN
The producer outputs JSON files in the following format.

"host": {
    "alcatel_isis": {
        "data": {
            "ip_address": "",
            "link": "",
            "name": "",
            "osi_address": "",
            "ots": "",
            "type": ""
        },
        "name": "",
        "neighbours": [
            {
                "metric": "",
                "name": ""
            },
        ]
    },
    "name": "",
    "version": 1
}
The JSON files can be inserted with noclook_alcatel_consumer.py.
Edit the template.conf file with the correct path to the Alcatel ISIS NERDS files.
Change the path at the top of the script to be able to import norduni_client.py.
[data]
juniper_conf =
nmap_services =
alcatel_isis = /path/to/alcatel/json
noclook =
Then run:
python noclook_alcatel_consumer.py -C template.conf
nmap Producer/Consumer
Using the nmap services producer you can scan a network or individual addresses. NORDUnet has a file with networks that is used via the "-iL networks_file" option added to NERDS_NMAP_OPTIONS in the run.sh file.
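As a sketch, the networks file can be hooked in like this in run.sh (the NERDS_NMAP_OPTIONS variable name comes from the producer; the networks file path is an assumption):

```shell
# run.sh fragment: append the -iL option so nmap reads targets from a file.
# The networks file path below is an assumption; point it at your own list.
NERDS_NMAP_OPTIONS="$NERDS_NMAP_OPTIONS -iL /opt/norduni/etc/networks_file"
```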
You need to install python-nmap from https://github.com/johanlundberg/python-nmap if the pip version gives you trouble.
Then you can scan your localhost with:
cd /opt/norduni/tools/nerds/producers/nmap_services
./run.sh . 127.0.0.1
You will find the JSON file in /opt/norduni/tools/nerds/producers/nmap_services/producers/json/.
"host" : {
    "." : {
        "os" : {
            "family" : "",
            "name" : ""
        }
    },
    "addrs" : [
        "127.0.0.1"
    ],
    "hostnames" : [
        "host.example.org"
    ],
    "name" : "host.example.org",
    "services" : {
        "ipv4": {
            "127.0.0.1": {
                "tcp": {
                    "1025": {
                        "product": "Microsoft Windows RPC",
                        "confidence": "10",
                        "name": "msrpc",
                        "proto": "unknown"
                    },
                    "1029": {
                        "product": "Microsoft Windows RPC over HTTP",
                        "confidence": "10",
                        "version": "1.0",
                        "name": "ncacn_http",
                        "proto": "unknown"
                    },
                }
            }
        }
    },
    "version" : 1
}
The JSON files can be inserted with noclook_nmap_consumer.py.
Edit the template.conf file with the correct path to the nmap services JSON files.
Change the path at the top of the script to be able to import norduni_client.py.
[data]
juniper_conf =
nmap_services = /path/to/nmap/json
alcatel_isis =
noclook =
Then run:
python noclook_nmap_consumer.py -C template.conf
CSV Site Producer/Consumer
The script produces JSON output in the NERDS format from the provided CSV file.
The CSV file needs to start with the name of the node and then the node type. After those two columns, any other node property may follow.
Start your csv file with a line similar to the one below.
name;node_type;node_property1;node_property2;...;node_property15
name;Host;site_type;address;area;postcode;city;country;floor;room;latitude;longitude;responsible_for;owner_id;telenor_subscription_id;comment
{
    "host": {
        "csv_producer": {
            "address": "",
            "area": "",
            "city": "",
            "comment": "",
            "country": "",
            "floor": "",
            "latitude": "",
            "longitude": "",
            "meta_type": "",
            "name": "",
            "node_type": "",
            "owner_id": "",
            "postcode": "",
            "responsible_for": "",
            "room": "",
            "site_type": "",
            "telenor_subscription_id": ""
        },
        "name": "",
        "version": 1
    }
}
The consumer script should only be run once, as it does not update existing sites, it only creates new ones.
The JSON file directory is then inserted into the database using noclook_site_csv_consumer.py.
Change the path at the top of the script to be able to import norduni_client.py.
Then run:
python noclook_site_csv_consumer.py -D /path/to/site_files/json
Daily database update
The producers are run with a cron job and the script noclook_consumer.py is used to run the three inserting/updating scripts (noclook_juniper_consumer.py, noclook_alcatel_consumer.py and noclook_nmap_consumer.py).
Change the path at the top of the script to be able to import norduni_client.py.
[data]
juniper_conf = /path/to/juniper/json
nmap_services = /path/to/nmap/json
alcatel_isis = /path/to/alcatel/json
noclook = # Used for loading backups.
Then run:
python noclook_consumer.py -C template.conf -I
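The daily run can be scheduled with cron; a hypothetical crontab sketch could look like the following (both paths and the producer wrapper script name are assumptions, not from this guide):

```
# Hypothetical crontab entries: run the producers at 01:00, then insert at 02:00.
0 1 * * * /opt/norduni/tools/run_producers.sh
0 2 * * * /var/opt/norduni/norduni_environment/bin/python /opt/norduni/src/scripts/noclook_consumer.py -C template.conf -I
```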
Setting up a local/development NOCLook
git clone https://git.nordu.net/norduni.git
git checkout neo4jdb-python

# Download neo4j docker image and start it
docker pull tpires/neo4j
docker run -d -v /path_to_repo/norduni/docker/neo4j.properties:/var/lib/neo4j/conf/neo4j.properties -v /opt/docker/neo4jdata:/var/lib/neo4j/data -p 7474:7474 tpires/neo4j

# Create the indexes with curl
curl -D - -H "Content-Type: application/json" --data '{"name" : "node_auto_index","config" : {"type" : "fulltext","provider" : "lucene"}}' -X POST http://localhost:7474/db/data/index/node/
curl -D - -H "Content-Type: application/json" --data '{"name" : "relationship_auto_index","config" : {"type" : "fulltext","provider" : "lucene"}}' -X POST http://localhost:7474/db/data/index/relationship/

# Create a virtualenv and activate it
virtualenv env
. env/bin/activate

# Install the python packages
pip install paver
pip install -r /path_to_repo/requirements.txt

# Create a settings.py from the template in /path_to_repo/src/niweb/niweb
cp /path_to_repo/src/niweb/niweb/generic_settings.py /path_to_repo/src/niweb/niweb/settings.py

# Sync the db
python /path_to_repo/src/niweb/manage.py syncdb
python /path_to_repo/src/niweb/manage.py migrate apps.noclook
python /path_to_repo/src/niweb/manage.py migrate actstream
python /path_to_repo/src/niweb/manage.py migrate tastypie

# Run the app
python /path_to_repo/src/niweb/manage.py runserver

# Optional: postgres instead of sqlite3, don't forget to change database settings in settings.py.
# Download postgres docker image and start it
docker pull orchardup/postgresql
docker run -d -p 5432:5432 -e POSTGRESQL_USER=norduni -e POSTGRESQL_PASS=docker -e POSTGRESQL_DB=norduni -v /opt/docker/postgresql_data/:/var/lib/postgresql/ orchardup/postgresql