SCALAR - Streaming ChALlenge plAtfoRm


SCALAR is the first platform of its kind for organizing machine learning competitions on data streams. It was used to run a real-time ML competition as part of the IEEE Big Data Cup Challenges 2019.

Features

Data stream mining competitions - With SCALAR you can organize real-time competitions on data streams. The platform is inspired by Kaggle, a platform for offline machine learning competitions.

Simple, user-friendly interface - SCALAR has a web application that lets you easily browse and subscribe to competitions. Its simple, intuitive design makes it easy to upload datasets and create competitions.

(Screenshot: competition dialog)

Live results and leaderboard - Since competitions run in real time, results are updated in real time as well. During a competition, you can follow the performance of your model in the web application. The leaderboard shows how you compare to other users, and a live chart shows the comparison with a baseline model.

Secure, bi-directional streaming communication - We use a combination of gRPC and Protobuf to provide secure, low-latency, bi-directional streaming communication between the server and users.

(Architecture diagram)

Freedom to choose a programming language - SCALAR lets users choose their preferred environment. The only requirement is the ability to communicate through gRPC and Protobuf, which are supported in many programming languages: Python, Java, C++, Go, and more. Additionally, SCALAR provides support for R. Beyond that, users are free to choose their own setup, environment, and additional resources to train better models.
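To make the participation flow concrete, below is a minimal Python sketch of a client that uses gRPC bi-directional streaming to receive instances and send back predictions. The module, service, message, and field names (stream_pb2, DataStreamerStub, sendData, Prediction, instance_id) are assumptions for illustration only; the authoritative definitions come from the .proto file distributed with the competition examples.

```python
# Minimal client sketch (illustrative only): the service, message, and field
# names are assumptions -- the real ones are defined in the .proto file shipped
# with the competition examples and compiled with grpc_tools.protoc.
import queue

import grpc
import stream_pb2        # hypothetical generated messages
import stream_pb2_grpc   # hypothetical generated service stub


def run(server_address: str, token: str) -> None:
    send_queue: "queue.Queue" = queue.Queue()

    def request_stream():
        # gRPC consumes this generator on a background thread and sends every
        # prediction we enqueue from the response loop below.
        while True:
            prediction = send_queue.get()
            if prediction is None:  # sentinel: end the outgoing stream
                return
            yield prediction

    # The platform advertises secure communication; a real client would supply
    # the TLS credentials expected by the server here.
    with grpc.secure_channel(server_address, grpc.ssl_channel_credentials()) as channel:
        stub = stream_pb2_grpc.DataStreamerStub(channel)
        responses = stub.sendData(request_stream(),
                                  metadata=[("authorization", token)])
        for message in responses:
            # Each message carries a new instance (and, later, feedback on past
            # predictions). Replace the constant with your model's prediction.
            send_queue.put(stream_pb2.Prediction(instance_id=message.instance_id,
                                                 value=0.0))
        send_queue.put(None)
```

In practice, the generated Python modules, the server address, and the authentication token (sent by email at registration) are taken from the examples provided with each competition.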


Getting Started

The project is written in Python and organized into Docker containers; each service runs in its own container.

Prerequisites

To run the platform locally, Docker is needed:

Install Docker

Docker Compose should also be installed:

Install Docker Compose

Running

The platform is run using Docker Compose.

Quick setup (to test the functionality of the platform)

  • Run setup.py and follow the instructions to set up the environment. The script will set the time zone and create the Docker network for all containers (a sketch of what this involves is shown after this list).

  • Once setup.py has finished successfully, the platform can be started with:

docker-compose up
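For reference, here is a minimal sketch of the kind of environment preparation setup.py performs, based only on the description above: recording the chosen time zone where Docker Compose can pick it up and creating a shared Docker network. The network name scalar_network, the .env file, and the default time zone are illustrative assumptions, not the script's actual values.

```python
# Illustrative sketch only -- the real setup.py may differ.
# Assumptions: Docker Compose reads TZ from a .env file, and all services
# attach to a shared external network (here called "scalar_network").
import subprocess
from pathlib import Path


def create_network(name: str = "scalar_network") -> None:
    # `docker network create` fails if the network already exists, so check first.
    existing = subprocess.run(
        ["docker", "network", "ls", "--format", "{{.Name}}"],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    if name not in existing:
        subprocess.run(["docker", "network", "create", name], check=True)


def write_timezone(tz: str) -> None:
    # docker-compose substitutes variables from .env, so containers can read TZ.
    Path(".env").write_text(f"TZ={tz}\n")


if __name__ == "__main__":
    timezone = input("Time zone [Europe/Paris]: ").strip() or "Europe/Paris"
    write_timezone(timezone)
    create_network()
```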

In-depth setup (if you want to use the platform to organize competitions and enable user registration):

  • Download the code locally, then adjust the config.json and docker-compose.yml files. More details can be found in config-ReadMe.txt and docker-compose-ReadMe.txt.

  • Set up an email account that will be used to send the registration confirmation message and authentication token. You will need to configure this account to allow access from less secure apps. For a quick start, update only the email information in config.json (a sketch of a configuration check appears at the end of this section).

  • In docker-compose.yml, update only the local paths used to mount persistent volumes, following docker-compose-ReadMe.txt.

  • Run setup.py and follow the instructions to set up the environment. The script will set the time zone and create the Docker network for all containers.

  • Once setup.py has finished successfully, the platform can be started with:

docker-compose up

This command will pull all necessary images and start the containers. When all services are up, the web application will be available at localhost:80.

To log in to the platform, you can use the default credentials admin:admin. A default Test data stream, for creating a test competition, is available under the Datastreams tab.

To get to know your way around the platform, use the Quick Start Guide. To create and participate in a competition, use the provided examples.
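As mentioned in the in-depth setup, it can help to sanity-check the email settings in config.json before starting the platform. The sketch below assumes a hypothetical layout with an email object containing address, password, and smtp_server keys; the actual structure is documented in config-ReadMe.txt.

```python
# Illustrative configuration check -- the key names below are assumed,
# not the real config.json schema; see config-ReadMe.txt for the actual fields.
import json
from pathlib import Path

REQUIRED_EMAIL_KEYS = ("address", "password", "smtp_server")  # hypothetical keys


def check_email_config(path: str = "config.json") -> None:
    config = json.loads(Path(path).read_text())
    email_settings = config.get("email", {})  # hypothetical top-level key
    missing = [key for key in REQUIRED_EMAIL_KEYS if not email_settings.get(key)]
    if missing:
        raise SystemExit(f"config.json is missing email settings: {', '.join(missing)}")
    print("Email configuration looks complete.")


if __name__ == "__main__":
    check_email_config()
```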

Authors

  • Nedeljko Radulovic
  • Dihia Boulegane
  • Albert Bifet
  • Nenad Stojanovic

Acknowledgments

Open source Docker containers were used:
