Project for tracking farm animals. A sample YT video is available.
Requirements:
- Python >=3.7
- TensorFlow Object Detection API
Download the repository and install dependencies:
$ git clone https://github.com/burnpiro/farm-animal-tracking.git
$ cd farm-animal-tracking
$ pip install -r requirements.txt
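If you want to verify the environment before going further, a minimal sanity check like the sketch below can help; it assumes the TF Object Detection API is importable as the usual object_detection package (an assumption, depending on how you installed it):

```python
# Quick environment check (a sketch, not part of the repo).
import sys

# The project requires Python >= 3.7
assert sys.version_info >= (3, 7), f"Python 3.7+ required, found {sys.version}"

import tensorflow as tf
print("TensorFlow:", tf.__version__)

# The TF Object Detection API is typically importable as `object_detection`
# once installed from the TensorFlow models repository.
import object_detection  # noqa: F401
print("TF Object Detection API is importable")
```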
- Download the precompiled detection model weights from Google Drive
- Unzip the archive to model/detection_model
- Download the precompiled siamese model weights from Google Drive
- Unzip the archive to model/siamese/weights
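A quick way to confirm that both archives were unpacked into the paths the scripts expect (the paths listed above) is a check along these lines:

```python
# Check that the downloaded weights were unpacked into the expected folders
# (a sketch; paths taken from the setup steps above).
from pathlib import Path

for expected in ("model/detection_model", "model/siamese/weights"):
    path = Path(expected)
    status = "OK" if path.is_dir() and any(path.iterdir()) else "MISSING or empty"
    print(f"{expected}: {status}")
```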
To visualize animal detection on a video, use:
$ python show_prediction.py
or for an image:
$ python run_detection.py
To visualize animal tracking on a video, use:
$ python show_tracking.py --video=<path to video>
The dataset for training the model can be obtained from the PSRG website.
- Run:
docker-compose -f eda/docker-compose.yaml up
- Go to localhost:8001 and enter the token from the console
You can download the current best weights from Google Drive: MobileNetV2, EfficientNetB5, ResNet101V2. Put them into ./model/siamese/weights and use the path as the --weights parameter.
Make sure you have the cropped dataset in the ./data/cropped_animals folder. Please check the ./data/data_generator.py documentation for more info.
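As a quick sanity check that the cropped dataset is in place, a sketch like the one below counts images per subfolder; the one-subfolder-per-animal layout is only an assumption here, so treat ./data/data_generator.py as the source of truth:

```python
# Rough check of the cropped dataset layout (a sketch; the real loading logic
# lives in ./data/data_generator.py).
from pathlib import Path

root = Path("./data/cropped_animals")
assert root.is_dir(), f"Missing cropped dataset folder: {root}"

# Assumption: crops are grouped into one subfolder per animal identity.
for animal_dir in sorted(p for p in root.iterdir() if p.is_dir()):
    n_images = sum(1 for f in animal_dir.iterdir()
                   if f.suffix.lower() in {".jpg", ".jpeg", ".png"})
    print(f"{animal_dir.name}: {n_images} images")
```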
$ python train_siamese.py
Instead of running the script below manually (it requires ~30 GB of RAM), you can use the pre-generated train/test/concat files in ./data/visualization. Just select two files with the same postfix, vecs-$1.tsv and meta-$1.tsv; it is important to use the same postfix, otherwise the lengths won't match.
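To double-check that a chosen pair really shares a postfix, a small script like this (assuming one embedding/label per line and no header rows, which is an assumption) can compare the row counts before you proceed:

```python
# Verify that a vecs-/meta- pair has matching row counts (a sketch).
postfix = "train"  # hypothetical value; use whichever postfix you picked

with open(f"./data/visualization/vecs-{postfix}.tsv") as f:
    n_vecs = sum(1 for _ in f)
with open(f"./data/visualization/meta-{postfix}.tsv") as f:
    n_meta = sum(1 for _ in f)

assert n_vecs == n_meta, f"Length mismatch: {n_vecs} embeddings vs {n_meta} labels"
print(f"OK: {n_vecs} rows in both files")
```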
$ python helpers/generate_siamese_emb_space.py
Options:
- --datatype: either train or test (default: train), which data should be used for the embeddings
- --weights: string (default: siam-118_0.0633.h5), the weights file from the model/siamese/weights/MobileNetV2/ folder
This is going to produce two files:
- vecs.tsv - list of embeddings for the test dataset
- meta.tsv - list of labels for the embeddings
You can visualize those embeddings in the https://projector.tensorflow.org/ application. Just upload them as custom data (use the Load option).
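If you would rather explore the embedding space offline instead of in the projector, something along these lines works, assuming vecs.tsv holds one tab-separated embedding per row and meta.tsv one label per row (as described above):

```python
# Offline look at the embedding space (a sketch, assuming the TSV layout above).
import numpy as np

vecs = np.loadtxt("vecs.tsv", delimiter="\t")   # shape: (n_samples, emb_dim)
with open("meta.tsv") as f:
    labels = [line.strip() for line in f]       # one label per embedding

# Cosine similarity of the first embedding against all others.
norms = np.linalg.norm(vecs, axis=1)
sims = vecs @ vecs[0] / (norms * norms[0] + 1e-12)

# Closest neighbours of sample 0 (index 0 is the sample itself).
for idx in np.argsort(-sims)[:5]:
    print(f"{labels[idx]}\tsimilarity={sims[idx]:.3f}")
```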
$ cd data
$ python generate_tracking.py
This is going to produce tracking data from the videos so we can evaluate the model. Look for frames_tracking.json and pigs_tracking.json inside ./data/tracking/. For more details check the Wiki.
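The schema of these files is described in the Wiki; without assuming anything about it, you can at least confirm what generate_tracking.py produced with a quick inspection:

```python
# Peek at the generated tracking files (a sketch; see the Wiki for the schema).
import json
from pathlib import Path

for name in ("frames_tracking.json", "pigs_tracking.json"):
    path = Path("./data/tracking") / name
    with path.open() as f:
        data = json.load(f)
    # Only report the top-level structure; the exact schema is in the Wiki.
    if isinstance(data, dict):
        print(f"{name}: dict with {len(data)} keys, e.g. {list(data)[:3]}")
    else:
        print(f"{name}: {type(data).__name__} with {len(data)} entries")
```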
You can specify the weights for the model. Please use the weights file marked with the lowest number (the loss value).
$ python test_siamese.py
Options:
- --weights siam-118_0.0633.h5