Super experimental, fully functional Loki API emulator built with Node.js, Fastify and ClickHouse.
Its APIs are compatible with Grafana Explore for querying and with paStash for log ingestion.
🔥 Beta Stage, Contributions Welcome! Do not use this for anything serious... yet!
The Loki API and its native Grafana integration are brilliant, simple and appealing - but we just love ClickHouse.
cLoki implements the same API functionality as Loki, buffered by a fast bulking LRU sitting on top of ClickHouse tables and relying on ClickHouse's columnar search and insert performance alongside its solid distribution and clustering capabilities for stored data. Just like Loki, cLoki does not parse or index incoming logs; instead it groups log streams using the same label system as Prometheus.
cLoki implements custom query functions for ClickHouse time-series extraction, allowing direct access to any table.
Convert columns to tagged time series using the emulated Loki 2.0 query format:
<aggr-op> by (<labels,>) (<function>(<metric>[range_in_seconds])) from <database>.<table> where <optional condition>
avg by (source_ip) (rate(mos[60])) from my_database.my_table
sum by (ruri_user, from_user) (rate(duration[300])) from my_database.my_table where duration > 10
Convert columns to tagged time series using the experimental `clickhouse` function:
clickhouse({ db="my_database", table="my_table", tag="source_ip", metric="avg(mos)", where="mos > 0", interval="60" })
| parameter | description |
|---|---|
| db | ClickHouse database name |
| table | ClickHouse table name |
| tag | column(s) to use as tags, comma separated |
| metric | aggregation function producing the metric values |
| where | WHERE condition (optional) |
| interval | interval in seconds (optional) |
| timefield | time/date field name (optional) |
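For example, the optional parameters combine as follows (the table, columns and `record_datetime` timefield below are hypothetical names for illustration):

```
clickhouse({ db="my_database", table="my_table", tag="source_ip", metric="max(mos)", where="mos > 0", interval="300", timefield="record_datetime" })
```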
Insert using Telegraf Input and display metrics and logs in Grafana without plugins
Clone this repository, install with npm and run using Node.js 8.x (or higher):
npm install
npm start
For a fully working demo, check the docker-compose example
CLICKHOUSE_SERVER="my.clickhouse.server" CLICKHOUSE_DB="my_data" CLICKHOUSE_AUTH="default:password" DEBUG=true node cloki.js
The following environment variables can be used to control cLoki parameters and backend settings.
| ENV | Default | Usage |
|---|---|---|
| CLICKHOUSE_SERVER | localhost | ClickHouse server address |
| CLICKHOUSE_PORT | 8123 | ClickHouse server port |
| CLICKHOUSE_DB | default | ClickHouse database name |
| CLICKHOUSE_TSDB | loki | ClickHouse time-series database name |
| CLICKHOUSE_AUTH | default: | ClickHouse authentication (user:password) |
| BULK_MAXAGE | 2000 | Max age for bulk inserts |
| BULK_MAXSIZE | 5000 | Max size for bulk inserts |
| BULK_MAXCACHE | 50000 | Max labels in memory cache |
| LABELS_DAYS | 7 | Max days before label rotation |
| SAMPLES_DAYS | 7 | Max days before time-series rotation |
| HOST | 0.0.0.0 | cLoki API IP |
| PORT | 3100 | cLoki API port |
| CLOKI_LOGIN | false | Basic HTTP username |
| CLOKI_PASSWORD | false | Basic HTTP password |
| READONLY | false | Read-only mode, no DB init |
| DEBUG | false | Debug mode |
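BULK_MAXAGE and BULK_MAXSIZE control when buffered rows are flushed to ClickHouse. A minimal sketch of that flush-on-size-or-age pattern (illustrative only, not cLoki's actual implementation; it assumes BULK_MAXAGE is in milliseconds):

```javascript
// Buffer rows and flush when either the batch size or the age limit is
// reached -- the pattern behind BULK_MAXSIZE and BULK_MAXAGE.
class BulkBuffer {
  constructor(flush, maxSize = 5000, maxAgeMs = 2000) {
    this.flush = flush;     // callback receiving the batched rows
    this.maxSize = maxSize;
    this.maxAgeMs = maxAgeMs;
    this.rows = [];
    this.timer = null;
  }
  push(row) {
    this.rows.push(row);
    if (this.rows.length >= this.maxSize) return this.drain();
    if (!this.timer) this.timer = setTimeout(() => this.drain(), this.maxAgeMs);
  }
  drain() {
    if (this.timer) { clearTimeout(this.timer); this.timer = null; }
    if (this.rows.length === 0) return;
    const batch = this.rows;
    this.rows = [];
    this.flush(batch);      // would issue one bulk INSERT to ClickHouse
  }
}

// Usage: with maxSize 3, the third push triggers an immediate flush.
const buf = new BulkBuffer((rows) => console.log(`flushing ${rows.length} rows`), 3, 2000);
buf.push({ fingerprint: 1n, timestamp_ms: Date.now(), string: 'zzz' });
buf.push({ fingerprint: 1n, timestamp_ms: Date.now(), string: 'zzz' });
buf.push({ fingerprint: 1n, timestamp_ms: Date.now(), string: 'zzz' });
// prints "flushing 3 rows"
```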
The ideal companion for parsing and shipping log streams to cLoki is paStash, with its extensive interpolation capabilities.
Loki API Functions are loosely implemented as documented by the Loki API reference.
- /loki/api/v1/push
- /loki/api/v1/query
- /loki/api/v1/query_range
- /loki/api/v1/label
- /loki/api/v1/label/name/values
- Basic Writes
- Label Fingerprints
- Sample Series
- JSON Support
- ProtoBuf Support
- Basic Fingerprinting
- Stream selector rules:
  - `=` exactly equal
  - `!=` not equal
  - `=~` regex match
  - `!~` no regex match
- Basic Search
- Labels (single key, multi key, AND logic)
- Samples (by Fingerprint match)
curl -i -XPOST -H 'Content-Type: application/json' http://localhost:3100/loki/api/v1/push --data '{"streams":[{"labels":"{\"__name__\":\"up\"}","entries":[{"timestamp":"2018-12-26T16:00:06.944Z","line":"zzz"}]}]}'
curl -i -XPOST -H 'Content-Type: application/json' http://localhost:3100/loki/api/v1/push --data '{"streams":[{"labels":"{\"__name__\":\"metric\"}","entries":[{"timestamp":"2018-12-26T16:00:06.944Z","value":"100"}]}]}'
# curl 'localhost:3100/loki/api/v1/query?query={__name__="up"}'
{
"streams": [
{
"labels": "{\"__name__\":\"up\"}",
"entries": [
{
"timestamp":"1545840006944",
"line":"zzz"
},
{
"timestamp":"1545840006944",
"line":"zzz"
},
{
"timestamp": "1545840006944",
"line":"zzz"
}
]
}
]
}
# curl localhost:3100/loki/api/v1/label
{"values":["__name__"]}
# curl 'localhost:3100/loki/api/v1/__name__/values'
{"values":["up"]}
CREATE TABLE time_series (
date Date,
fingerprint UInt64,
labels String,
name String
)
ENGINE = ReplacingMergeTree
PARTITION BY date
ORDER BY fingerprint;
CREATE TABLE samples (
fingerprint UInt64,
timestamp_ms Int64,
value Float64,
    string String
)
ENGINE = MergeTree
PARTITION BY toDate(timestamp_ms / 1000)
ORDER BY (fingerprint, timestamp_ms);
CREATE DATABASE IF NOT EXISTS loki;
CREATE TABLE IF NOT EXISTS loki.time_series (
date Date,
fingerprint UInt64,
labels String,
name String
) ENGINE = ReplacingMergeTree PARTITION BY date ORDER BY fingerprint;
CREATE TABLE IF NOT EXISTS loki.samples (
fingerprint UInt64,
timestamp_ms Int64,
value Float64,
string String
) ENGINE = MergeTree PARTITION BY toRelativeHourNum(toDateTime(timestamp_ms / 1000)) ORDER BY (fingerprint, timestamp_ms);
SELECT DISTINCT fingerprint, labels FROM loki.time_series
SELECT fingerprint, timestamp_ms, string
FROM loki.samples
WHERE fingerprint IN (7975981685167825999) AND timestamp_ms >= 1514730532900
AND timestamp_ms <= 1514730532902
ORDER BY fingerprint, timestamp_ms
SELECT fingerprint, timestamp_ms, value
FROM loki.samples
WHERE fingerprint IN (7975981685167825999)
AND timestamp_ms >= 1514730532900 AND timestamp_ms <= 1514730532902
ORDER BY fingerprint, timestamp_ms
INSERT INTO loki.time_series (date, fingerprint, labels, name) VALUES (?, ?, ?, ?)
INSERT INTO loki.samples (fingerprint, timestamp_ms, value, string) VALUES (?, ?, ?, ?)
cLoki is not affiliated with or endorsed by Grafana Labs. All rights belong to their respective owners.