TARGER can be installed in two ways: manually or with prebuilt Docker containers.
For installing TARGER, we recommend Python 3.5.
The models for the backend are not included and must be downloaded manually. Instructions and URLs can be found in the project readme.
Now clone this repository and open the cloned directory:

```shell
git clone https://github.com/webis-de/targer && cd targer
```
Manual installation is structured along the parts of the system architecture. Each part can be installed individually and on different machines.

_Hint: this document describes the same setup as the backend Dockerfile._
- Download the pre-trained models to the `models` directory:

  ```shell
  wget https://files.webis.de/data-in-production/data-research/acqua/targer/models/ \
    --recursive --level=1 \
    --no-directories --no-host-directories \
    --accept=h5,hdf5 --directory-prefix=models
  ```

  Afterwards there should be 8 `.h5`/`.hdf5` files in this folder.
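As a quick sanity check, the expected file count can be verified with a short Python snippet (a sketch; the `models/` path and the helper name are illustrative, not part of the repository):

```python
from pathlib import Path


def count_model_files(directory):
    """Count .h5/.hdf5 model files in the given directory (hypothetical helper)."""
    path = Path(directory)
    if not path.is_dir():
        return 0
    return sum(1 for f in path.iterdir() if f.suffix in {".h5", ".hdf5"})


if __name__ == "__main__":
    n = count_model_files("models")
    print(f"Found {n} model files" + ("" if n == 8 else " (expected 8)"))
```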
- Go to the backend directory:

  ```shell
  cd backend
  ```

- Clone the BiLSTM-CNN-CRF repository:

  ```shell
  git clone https://github.com/UKPLab/emnlp2017-bilstm-cnn-crf
  ```

- Install required Python packages:

  ```shell
  pip install -r emnlp2017-bilstm-cnn-crf/requirements.txt -r requirements.txt
  ```

- Move the Python source code into the cloned directory:

  ```shell
  mv backend.py Model.py ModelNewES.py ModelNewWD.py emnlp2017-bilstm-cnn-crf/
  mv BiLSTM.py emnlp2017-bilstm-cnn-crf/neuralnets/
  ```

- Move the configuration into the cloned directory:

  ```shell
  mv config.ini emnlp2017-bilstm-cnn-crf/
  ```

- Move the downloaded models into the cloned directory:

  ```shell
  mv models/*.h5 emnlp2017-bilstm-cnn-crf/models/
  ```

- Go to the cloned directory:

  ```shell
  cd emnlp2017-bilstm-cnn-crf
  ```
- Run the server in development mode:

  ```shell
  python backend.py
  ```

  Alternatively, you can use the backend with an Nginx web server and WSGI. Use the included `uwsgi.ini` to load it in the environment.

  _Note: The backend requires all models to fit into memory. Thus you won't be able to run the backend with less than 8 GB of free memory on your machine._
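For the Nginx route, a minimal site configuration that proxies to the uWSGI process might look like the following sketch. The socket path and port are assumptions for illustration, not values taken from the repository:

```nginx
# Hypothetical Nginx site configuration; socket path is an assumption.
server {
    listen 80;
    location / {
        include uwsgi_params;
        uwsgi_pass unix:/tmp/targer-backend.sock;
    }
}
```

The uWSGI side would then be pointed at the same socket when loading `uwsgi.ini`.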
- Go to the frontend directory:

  ```shell
  cd frontend
  ```

- Edit the backend URLs in `config.ini`, replacing them with the backend server hostname. If the backend runs on the same machine, you can use `localhost`.

- Install required Python packages:

  ```shell
  pip install -r requirements.txt
  ```

- Run the server in development mode:

  ```shell
  python frontend.py
  ```

  Alternatively, you can use the frontend with an Nginx web server and WSGI. Use the included `uwsgi.ini` to load it in the environment.

  _Note: Argument search will only work if the configured Elasticsearch holds the configured arguments index, i.e. [indexed previously](#setup-elasticsearch-indexing)._
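To illustrate the `config.ini` edit above, here is a hedged sketch of how such a file can be parsed with Python's `configparser`. The section and key names are hypothetical, not the actual ones used by the frontend:

```python
import configparser

# Hypothetical config.ini contents; the real section/key names may differ.
EXAMPLE = """
[backend]
url = http://localhost:5000
"""


def read_backend_url(text):
    """Parse config.ini-style text and return the backend URL."""
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return parser["backend"]["url"]
```

Replacing `localhost` in such a file with the backend host is all the edit amounts to.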
- Install and run Elasticsearch.

- Go to the batch processing directory:

  ```shell
  cd batch_processing
  ```

- Install required Python packages:

  ```shell
  pip install -r requirements.txt
  ```

- Run indexing:

  ```shell
  python index.py -host HOST -port PORT -index INDEX -input FILE
  ```
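Conceptually, indexing sends each input document to Elasticsearch as a bulk action. The following sketch shows how documents might be shaped for the Python client's bulk helper; the field layout and function name are assumptions, and the actual format produced by `index.py` may differ:

```python
def to_bulk_actions(documents, index):
    """Turn an iterable of documents into Elasticsearch bulk-action dicts
    (hypothetical shape for illustration)."""
    return [
        {"_index": index, "_id": i, "_source": doc}
        for i, doc in enumerate(documents)
    ]
```

Dicts of this shape could then be passed to `elasticsearch.helpers.bulk` together with a client connected to the configured host and port.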
- Go to the batch processing directory:

  ```shell
  cd batch_processing
  ```

- Install required Python packages:

  ```shell
  pip install -r requirements.txt
  ```

- Run argument labelling:

  ```shell
  python label_mp.py --model MODEL --workers N --input FILE --output FILE
  ```
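The `--workers` option fans the input out across several workers. The fan-out can be sketched as below, with a dummy labelling function standing in for the real model; a thread pool is used here purely for illustration, and the actual script may use processes instead:

```python
from concurrent.futures import ThreadPoolExecutor


def label_sentence(sentence):
    """Dummy stand-in for model inference: tag every token with 'O'."""
    return [(token, "O") for token in sentence.split()]


def label_all(sentences, workers):
    """Label sentences in parallel across the given number of workers."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(label_sentence, sentences))
```

`pool.map` preserves input order, so results line up with the input file regardless of which worker handled which sentence.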
After cloning the repository, the system can be installed and started with Docker Compose in the default configuration:

```shell
docker-compose up
```

Each part can also be launched individually with its respective `Dockerfile`.