Commit 7e8bd7cc authored by Lennard, committed by Raspberry

Initial commit

---
BasedOnStyle: Google
IndentWidth: 4
---
Language: Cpp
# Force pointers to the type for C++.
DerivePointerAlignment: false
PointerAlignment: Left
AllowShortFunctionsOnASingleLine: Empty
NamespaceIndentation: All
AllowShortIfStatementsOnASingleLine: Never
AlignConsecutiveAssignments: 'false'
ColumnLimit: 120
data/* filter=lfs diff=lfs merge=lfs -text
The MIT License (MIT)
Copyright (c) 2019 Marvin Pinto and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
# GitHub Automatic Releases
This action simplifies the GitHub release process by automatically uploading assets, generating changelogs, handling pre-releases, and so on.
## Contents
1. [Usage Examples](#usage-examples)
1. [Supported Parameters](#supported-parameters)
1. [Event Triggers](#event-triggers)
1. [Versioning](#versioning)
1. [How to get help](#how-to-get-help)
1. [License](#license)
> **NOTE**: The `marvinpinto/action-automatic-releases` repository is an automatically generated mirror of the [marvinpinto/actions](https://github.com/marvinpinto/actions) monorepo containing this and other actions. Please file issues and pull requests over there.
## Usage Examples
### Automatically generate a pre-release when changes land on master
This example workflow will kick in as soon as changes land on `master`. After running the steps to build and test your project:
1. It will create (or replace) a git tag called `latest`.
1. Generate a changelog from all the commits between this and the previous `latest` tag.
1. Generate a new release associated with the `latest` tag (removing any previous associated releases).
1. Update this new release with the specified title (e.g. `Development Build`).
1. Upload `LICENSE.txt` and any `jar` files as release assets.
1. Mark this release as a `pre-release`.
You can see a working example of this workflow over at [marvinpinto/actions](https://github.com/marvinpinto/actions/releases/tag/latest).
```yaml
---
name: "pre-release"

on:
  push:
    branches:
      - "master"

jobs:
  pre-release:
    name: "Pre Release"
    runs-on: "ubuntu-latest"
    steps:
      # ...
      - name: "Build & test"
        run: |
          echo "done!"

      - uses: "marvinpinto/action-automatic-releases@latest"
        with:
          repo_token: "${{ secrets.GITHUB_TOKEN }}"
          automatic_release_tag: "latest"
          prerelease: true
          title: "Development Build"
          files: |
            LICENSE.txt
            *.jar
```
### Create a new GitHub release when tags are pushed to the repository
Similar to the previous example, this workflow will kick in as soon as new tags are pushed to GitHub. After building & testing your project:
1. Generate a changelog from all the commits between this and the previous [semver-looking](https://semver.org/) tag.
1. Generate a new release and associate it with this tag.
1. Upload `LICENSE.txt` and any `jar` files as release assets.
Once again there's an example of this over at [marvinpinto/actions](https://github.com/marvinpinto/actions/releases/latest).
```yaml
---
name: "tagged-release"

on:
  push:
    tags:
      - "v*"

jobs:
  tagged-release:
    name: "Tagged Release"
    runs-on: "ubuntu-latest"
    steps:
      # ...
      - name: "Build & test"
        run: |
          echo "done!"

      - uses: "marvinpinto/action-automatic-releases@latest"
        with:
          repo_token: "${{ secrets.GITHUB_TOKEN }}"
          prerelease: false
          files: |
            LICENSE.txt
            *.jar
```
## Supported Parameters
| Parameter | Description | Default |
| ----------------------- | ----------------------------------------------------------------------------- | -------- |
| `repo_token`\*\* | GitHub Action token, e.g. `"${{ secrets.GITHUB_TOKEN }}"`. | `null` |
| `draft` | Mark this release as a draft? | `false` |
| `prerelease` | Mark this release as a pre-release? | `true` |
| `automatic_release_tag` | Tag name to use for automatic releases, e.g. `latest`. | `null` |
| `title` | Release title; defaults to the tag name if none specified. | Tag Name |
| `files` | Files to upload as part of the release assets. | `null` |
| `changelog_file` | Text you want to put in the release/changelog info. (Disables auto changelog) | `null` |
## Outputs
The following output values can be accessed via `${{ steps.<step-id>.outputs.<output-name> }}`:
| Name | Description | Type |
| ------------------------ | ------------------------------------------------------ | ------ |
| `automatic_releases_tag` | The release tag this action just processed | string |
| `upload_url` | The URL for uploading additional assets to the release | string |
### Notes:
- Parameters denoted with `**` are required.
- The `files` parameter supports multi-line [glob](https://github.com/isaacs/node-glob) patterns, see repository examples.
## Event Triggers
The GitHub Actions framework allows you to trigger this (and other) actions on _many combinations_ of events. For example, you could create specific pre-releases for release-candidate tags (e.g. `*-rc*`), generate releases as changes land on master (example above), build nightly releases, and much more. Read through [Workflow syntax for GitHub Actions](https://help.github.com/en/articles/workflow-syntax-for-github-actions) for ideas and advanced examples.
## Versioning
Every commit that lands on master for this project triggers an automatic build as well as a tagged release called `latest`. If you don't wish to live on the bleeding edge you may use a stable release instead. See [releases](../../releases/latest) for the available versions.
```yaml
- uses: "marvinpinto/action-automatic-releases@<VERSION>"
```
## How to get help
The main [README](https://github.com/marvinpinto/actions/blob/master/README.md) for this project has a bunch of information related to debugging & submitting issues. If you're still stuck, try and get a hold of me on [keybase](https://keybase.io/marvinpinto) and I will do my best to help you out.
## License
The source code for this project is released under the [MIT License](/LICENSE). This project is not associated with GitHub.
name: "Automatic Releases"
author: "marvinpinto"
description: "Automate the GitHub release process with assets, changelogs, pre-releases, and more"
inputs:
repo_token:
description: "GitHub secret token"
required: true
automatic_release_tag:
description: "Git tag (for automatic releases)"
required: false
draft:
description: "Should this release be marked as a draft?"
required: false
default: false
prerelease:
description: "Should this release be marked as a pre-release?"
required: false
default: true
title:
description: "Release title (for automatic releases)"
required: false
files:
description: "Assets to upload to the release"
required: false
changelog_file:
description: "Text you want to put in the release/changelog info. (Disables auto changelog)"
required: false
outputs:
automatic_releases_tag:
description: "The release tag this action just processed"
upload_url:
description: "The URL for uploading additional assets to the release"
runs:
using: "node12"
main: "dist/index.js"
branding:
icon: "git-merge"
color: "red"
name: Convert data to .mat file(s)

on:
  schedule:
    - cron: "0 1 * * *"

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          lfs: 'true'
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python3 -m pip install --upgrade pip
          python3 -m pip install numpy scipy
      - name: Convert data
        run: |
          python3 convert.py
      - name: release
        uses: "./.github/actions/automatic-releases"
        with:
          repo_token: ${{ secrets.GITHUB_TOKEN }}
          prerelease: false
          title: "Data"
          automatic_release_tag: latest
          changelog_file: ./CHANGELOG.md
          files: |
            data.zip
            *.txt
            out/*
CHANGELOG.md
profiling.log
data/data
data.zip
__pycache__/**
stages:
  - convert
  - upload
  - delete
  - release

variables:
  PACKAGE_REGISTRY_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/my_package/latest"

workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == "schedule" # only run this pipeline upon a schedule event

convert:
  stage: convert
  image: "python:3.10"
  script:
    - python3 -m pip install --upgrade pip
    - python3 -m pip install numpy scipy
    - python3 convert.py
  artifacts:
    paths:
      - data.zip
      - CHANGELOG.md

upload:
  stage: upload
  image: curlimages/curl:latest
  script:
    - 'curl --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --upload-file data.zip "${PACKAGE_REGISTRY_URL}/data.zip"'

delete:
  stage: delete
  image: curlimages/curl:latest
  allow_failure: true
  script:
    - |
      echo "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/releases/latest"
      curl --request DELETE --header "JOB-TOKEN: ${CI_JOB_TOKEN}" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/releases/latest"

release:
  stage: release
  image: registry.gitlab.com/gitlab-org/release-cli:latest
  script:
    - |
      release-cli create --name "Data" --tag-name "latest" --description "CHANGELOG.md" \
        --assets-link "{\"name\":\"data.zip\",\"url\":\"${PACKAGE_REGISTRY_URL}/data.zip\"}"
# Strain Gauge (DMS) Measurements from the Silo

## Data
- The measurement data up to the previous day can be downloaded under [Releases](https://gitlab.cvh-server.de/Lennard/messdatensilo/-/releases/latest).
- The data is stored as `.mat` files, one file per week.

## How it works
- A cron job runs the `scripts/run.bash` script after every reboot and every day at 0:00.
- `run.bash` then runs `python3 main.py`, which collects the data for one day and afterwards uploads it to gitlab.cvh-server.de.
- The `main.py` program continuously reads the data from the Arduinos, averages it over a configurable interval, and stores it in `data/data`.
- Shortly before midnight, `main.py` stops, renames `data/data` to `log.<year>-<month>-<day>_<hour>.log`, and deletes the oldest file if too many files exist (see the sketch after this list).
- The averaging interval, the number of files to keep, and other parameters of the program can be changed in `config.yml`.
- In addition, further log files are kept:
  - Logs from `python3 main.py` are written to `logs/*`.
  - Logs from `run.bash` are written to `bash.log`.
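`main.py` itself is stored in Git LFS and is not shown in this commit. As a rough sketch of the nightly rotation described above (the `BACKUP_COUNT` value mirrors `DataLogger.backupCount` in `config.yml`; the function name and timestamp details are assumptions, not the actual implementation):

```python
# Hypothetical sketch only - not the actual main.py, which is stored in LFS.
from datetime import datetime
from pathlib import Path

DATA_DIR = Path(__file__).parent / "data"
BACKUP_COUNT = 350  # mirrors DataLogger.backupCount in config.yml


def rotate_daily_log() -> None:
    """Rename data/data to log.<year>-<month>-<day>_<hour>.log and prune the oldest logs."""
    current = DATA_DIR / "data"
    if current.exists():
        stamp = datetime.now().strftime("%Y-%m-%d_%H")  # same pattern that convert.py parses
        current.rename(DATA_DIR / f"log.{stamp}.log")
    logs = sorted(DATA_DIR.glob("log.*.log"))
    for old in logs[:-BACKUP_COUNT]:  # keep only the newest BACKUP_COUNT files
        old.unlink()
```

Renaming with this date-based pattern keeps the files sortable by name, which is what `convert.py` relies on when grouping them into weeks.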
DataLogger:
  backupCount: 350 # number of datalogs to keep
  levels: # log levels for outputting to file and to stdout, respectively
    - INFO
    - WARNING

InfoLogger:
  backupCount: 10 # number of logs to keep
  maxBytes: 100000 # size of a single log in bytes
  levels: # log levels for outputting to file and to stdout, respectively
    - INFO
    - WARNING

Data:
  factors: [1.855, 0, 0.923, -1] # factors for the 4 DMS channels
  delta_time: 30 # time between logging data
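How these settings are consumed is not visible in this commit (`main.py` is stored in LFS). A minimal sketch, assuming the program reads the file with PyYAML, could look like this:

```python
# Minimal sketch, assuming config.yml is parsed with PyYAML - not confirmed by this commit.
import yaml

with open("config.yml") as f:
    config = yaml.safe_load(f)

factors = config["Data"]["factors"]        # per-channel scale factors for the 4 DMS readings
delta_time = config["Data"]["delta_time"]  # interval between logged (averaged) samples
file_level, stdout_level = config["InfoLogger"]["levels"]  # e.g. INFO for the file, WARNING for stdout
```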
"""Convert csv data into mat files to read into matlab.
Combines the files from one week and converts it into a single '.mat' file.
"""
from datetime import datetime, timedelta
from pathlib import Path
import glob
from time import strptime
from typing import Dict
import shutil
import numpy as np
import scipy.io
data: Dict[str, np.ndarray] = {} # a dict of numpy array's, one for each week
header = ["Timestamp"] + [f"dms{i+1}" for i in range(4)] + [f"temp{i+1}" for i in range(4)] + ["n"]
start_time: float = 0
def convertfunc(x: bytes) -> float:
global start_time
if start_time == 0:
start_time = datetime.strptime(x.decode("utf-8"), "%Y-%m-%d %H:%M:%S").timestamp()
return datetime.strptime(x.decode("utf-8"), "%Y-%m-%d %H:%M:%S").timestamp() - start_time
files = sorted(glob.glob(str(Path.joinpath(Path(__file__).parent, "data", "log.*.log"))))
for file in files:
date = datetime(*strptime(Path(file).suffixes[0][1:].split("_")[0], "%Y-%m-%d")[:6]) - timedelta(days=2)
week_start = (date - timedelta(days=date.weekday()) + timedelta(days=2)).strftime("%Y-%m-%d")
csv_data = np.genfromtxt(file, skip_header=1, delimiter=",", converters={0: convertfunc})
# either add the data from one day to already existing entry for that week or create new entry
if week_start in data.keys():
data[week_start] = np.vstack((data[week_start], csv_data))
else:
data[week_start] = csv_data
Path(f"{Path(__file__).parent}/out").mkdir(parents=True, exist_ok=True)
# save each week as seperate '.mat' file in 'out' folder
for week_start, arr in data.items():
scipy.io.savemat(
f"{Path(__file__).parent}/out/data.{week_start}.mat",
mdict={name: column for name, column in zip(header, np.split(arr, arr.shape[1], axis=1))},
)
# zip 'out' folder
shutil.make_archive("data", "zip", "out")
# Update CHANGELOG.md
with open(str(Path.joinpath(Path(__file__).parent, "CHANGELOG.md")), "w+") as f:
f.write("## Messdaten vom Silo von den Wochen:\n" + "\n".join(["- " + key for key in data.keys()]))