Build a searchable database for Correlator data in the pyerrors format.
# Pyerrors backlog
With this tool, we aim to make it easy to backlog correlation functions that can be read with pyerrors.
This is done in a reproducible way using datalad.
A dataset, automatically administered by the backlogger, holds data from different projects together.
Everything is catalogued in a searchable SQL database, which stores the paths to the respective measurements.
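To illustrate the idea, here is a minimal sketch of such a catalogue using Python's built-in `sqlite3` module. The table and column names are illustrative assumptions, not corrlib's actual schema:

```python
import sqlite3

# Hypothetical catalogue: one table mapping correlator metadata to the
# on-disk path of each measurement inside the dataset. Schema and column
# names are illustrative only.
con = sqlite3.connect(":memory:")
con.execute(
    """CREATE TABLE correlators (
        name TEXT,      -- correlator identifier, e.g. 'f_P'
        ensemble TEXT,  -- ensemble the measurement belongs to
        project TEXT,   -- originating project
        path TEXT       -- location of the file inside the dataset
    )"""
)
con.execute(
    "INSERT INTO correlators VALUES (?, ?, ?, ?)",
    ("f_P", "A653", "projectA", "projects/projectA/A653/f_P.json.gz"),
)

# A search then reduces to an ordinary SQL query over the metadata.
rows = con.execute(
    "SELECT path FROM correlators WHERE name = ? AND ensemble = ?",
    ("f_P", "A653"),
).fetchall()
print(rows[0][0])  # → projects/projectA/A653/f_P.json.gz
```

Because only paths and metadata live in the database, the measurement files themselves stay under datalad's version control.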
The original projects can be linked to the dataset, and the data can be imported using wrapper functions around the read methods of pyerrors.
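Such a wrapper boils down to a lookup-then-load pattern. The sketch below is self-contained and uses a stand-in reader callable; in practice one would pass one of pyerrors' read functions. All function and column names here are assumptions for illustration, not corrlib's actual API:

```python
import sqlite3


def load_correlator(con, name, ensemble, reader):
    """Look up a measurement's path in the catalogue and hand it to a
    reader callable (e.g. one of pyerrors' read methods). Hypothetical
    sketch; corrlib's real interface may differ."""
    row = con.execute(
        "SELECT path FROM correlators WHERE name = ? AND ensemble = ?",
        (name, ensemble),
    ).fetchone()
    if row is None:
        raise KeyError(f"no measurement for {name!r} on ensemble {ensemble!r}")
    return reader(row[0])


# Minimal demonstration with an in-memory catalogue and a trivial reader
# that just echoes the resolved path.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE correlators (name TEXT, ensemble TEXT, path TEXT)")
con.execute(
    "INSERT INTO correlators VALUES ('f_P', 'A653', 'projects/A/f_P.json.gz')"
)
print(load_correlator(con, "f_P", "A653", lambda p: p))
# → projects/A/f_P.json.gz
```

Keeping the reader a plain callable decouples the catalogue lookup from any particular file format pyerrors supports.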