
dashboard

[screenshot]

A data-aggregating dashboard, capable of periodically fetching, parsing, archiving, and plotting data.

Name

Do you have a good name idea for this project? Let me know!

How it works

This software periodically (at a customizable interval) makes a GET request to a given URL, applies every metric's JQL query to the JSON response, and inserts the extracted points into its embedded SQLite database. Each panel displays all gathered points (respecting the configured limits) without redrawing until the user interacts with the UI or the data changes. If no "x" query is specified, the current time is used (as a timestamp) for each sample's "x" coordinate, which makes this software especially useful for time series.
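The loop above can be sketched as follows. This is an illustrative Python sketch, not the project's Rust code: the function names are hypothetical, the HTTP fetch is simulated with a literal JSON payload, and the query is a simple key path standing in for a real JQL selector.

```python
import json
import sqlite3
import time

def extract(payload, y_query, x_query=None):
    """Walk the JSON payload along the query path; if no "x" query is
    given, use the current time as the x coordinate (the time-series default)."""
    y = payload
    for key in y_query:
        y = y[key]
    if x_query is None:
        x = time.time()
    else:
        x = payload
        for key in x_query:
            x = x[key]
    return (x, y)

# Embedded SQLite: no external database needed.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE points (metric TEXT, x REAL, y REAL)")

# In the real program this payload would come from the periodic GET request.
payload = json.loads('{"sensor": {"temperature": 21.5}}')
x, y = extract(payload, ["sensor", "temperature"])
db.execute("INSERT INTO points VALUES (?, ?, ?)", ("temp", x, y))

row = db.execute("SELECT metric, y FROM points").fetchone()
print(row)  # -> ('temp', 21.5)
```

Each configured metric would run this extract-and-insert step against the same fetched payload, which is how one fetch can feed many metrics.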

Features

  • parse JSON APIs with JQL syntax
  • embedded SQLite, no need for an external database
  • import/export metric data to/from CSV
  • split data from one fetch across many metrics
  • customize source color and name; toggle sources (visibility or fetching)
  • customize panels (size, span, offset)
  • reduce data points by averaging or sampling
  • per-source query interval
  • light/dark mode

Usage

This program works on a database stored in $HOME/.local/share/dashboard.db. By default, nothing is shown. To add sources or panels, toggle edit mode (top left). Once in edit mode you can:

  • Add panels (top bar)
  • Add sources (in source sidebar, bottom)
  • Edit panels (name, height, display options)
  • Edit sources (name, color, query, panel)

Installation

Run cargo build --release, then drop the resulting binary into ~/.local/bin. Done, have fun hoarding data!
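As shell commands, the install steps look roughly like this. The binary name `dashboard` is an assumption based on the project name; check `target/release/` after building.

```shell
# Build an optimized release binary.
cargo build --release

# Copy it into your local bin directory (binary name assumed).
cp target/release/dashboard ~/.local/bin/
```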