| tags | created | status | gist |
|---|---|---|---|
|  | 2026-03-07 | planning | Deep-dive into NanoClaw internals, covering message flow, container isolation, session lifecycle, and provider support. |
A single Node.js orchestrator process bridges WhatsApp (or other channels) to Claude Code running inside isolated containers. Each group gets its own container with filesystem isolation, persistent sessions, and file-based IPC.
WhatsApp (Baileys) → SQLite buffer → Polling loop → Container (Claude Agent SDK) → Response
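The buffer-and-poll step of this flow can be sketched as follows. This is a minimal illustration, assuming a hypothetical `messages` table and per-group IPC inbox directories; the real NanoClaw orchestrator is written in Node.js, and these names are not from its codebase.

```python
import json
import sqlite3
from pathlib import Path

# Hypothetical buffer table; incoming WhatsApp messages would be inserted here.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE messages ("
    "id INTEGER PRIMARY KEY, chat TEXT, body TEXT, handled INTEGER DEFAULT 0)"
)

def poll_once(ipc_root: Path) -> int:
    """Drain unhandled messages, writing each to its group's IPC inbox file."""
    rows = db.execute(
        "SELECT id, chat, body FROM messages WHERE handled = 0 ORDER BY id"
    ).fetchall()
    for msg_id, chat, body in rows:
        inbox = ipc_root / chat  # one container (and one inbox dir) per group
        inbox.mkdir(parents=True, exist_ok=True)
        (inbox / f"{msg_id}.json").write_text(json.dumps({"id": msg_id, "body": body}))
        db.execute("UPDATE messages SET handled = 1 WHERE id = ?", (msg_id,))
    db.commit()
    return len(rows)
```

File-based IPC keeps the orchestrator and the containerized agent decoupled: the container only needs a shared volume, not a network connection back to the host process.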
Documentation of our process for generating a personalized Rally Cross event voucher using AI image generation.
Create a printable voucher (DIN A4) for a Rally Cross event as a 16th birthday gift, featuring:
- Manga/Anime style illustrations of father and son (based on reference photos)
- Rally Cross action elements
- Event details: DRX Lauf 2 | 19.04.2025 | Estering, Buxtehude
To deploy Red Hat Developer Hub (RHDH) on OpenShift, multiple container images work together through the OpenShift Operator Lifecycle Manager (OLM). This document explains the different types of builds available and how they are distributed through various container registries.
Three main components are involved in a RHDH deployment:
- The RHDH application itself (the developer portal)
- The RHDH operator (manages the lifecycle of RHDH instances)
- The IIB index image (built by the Index Image Builder), a catalog of available operator versions
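OLM consumes the index image through a `CatalogSource` resource. A minimal sketch, assuming a hypothetical image reference (the real registry path depends on which build type you are deploying):

```yaml
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  name: rhdh-catalog                 # hypothetical name
  namespace: openshift-marketplace
spec:
  sourceType: grpc
  image: quay.io/rhdh/iib:latest     # hypothetical IIB image reference
  displayName: Red Hat Developer Hub Operators
```

Applying this (e.g. `oc apply -f catalogsource.yaml`) makes the operator versions listed in the index image available for installation from OperatorHub.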
Sample telemetry records (one JSON object per line):

```json
{"time":1669661619395,"telemetry":{"Clutch":1.0,"Brake":0.0,"Throttle":0.9991061,"Handbrake":0.0,"SteeringAngle":0.0944157839,"Rpms":0.0,"Gear":0,"SpeedMs":50.70474,"DistanceRoundTrack":4283.12158,"CurrentLap":1,"CurrentLapTime":110.148262}}
{"time":1669661619426,"telemetry":{"Clutch":1.0,"Brake":0.0,"Throttle":0.9991061,"Handbrake":0.0,"SteeringAngle":0.0944157839,"Rpms":0.0,"Gear":4,"SpeedMs":50.7033844,"DistanceRoundTrack":4284.81055,"CurrentLap":1,"CurrentLapTime":110.181595}}
{"time":1669661619457,"telemetry":{"Clutch":1.0,"Brake":0.0,"Throttle":0.9991061,"Handbrake":0.0,"SteeringAngle":0.09211275,"Rpms":0.0,"Gear":4,"SpeedMs":50.7513466,"DistanceRoundTrack":4285.65527,"CurrentLap":1,"CurrentLapTime":110.198265}}
{"time":1669661619487,"telemetry":{"Clutch":1.0,"Brake":0.0,"Throttle":0.9991061,"Handbrake":0.0,"SteeringAngle":0.08664401,"Rpms":0.0,"Gear":4,"SpeedMs":50.8654671,"DistanceRoundTrack":4287.34766,"CurrentLap":1,"CurrentLapTime":110.2316}}
```
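Records in this shape are straightforward to process as JSON lines. A small sketch that extracts `SpeedMs` and converts it to km/h (the field names are taken from the sample above; everything else is illustrative):

```python
import json

def speeds_kmh(lines):
    """Parse JSON-lines telemetry and return each record's speed in km/h."""
    # SpeedMs is in metres per second; multiply by 3.6 to get km/h.
    return [round(json.loads(line)["telemetry"]["SpeedMs"] * 3.6, 1) for line in lines]

sample = '{"time":1669661619395,"telemetry":{"SpeedMs":50.70474,"CurrentLap":1}}'
# speeds_kmh([sample]) → [182.5]
```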
```python
import re

import fastparquet


def hive_schema(filename):
    """Derive a Hive column list from a Parquet file's schema dump."""
    nl = ',\n'
    diag_parq = fastparquet.ParquetFile(filename)
    data = diag_parq.schema.text
    schema = filter(None, data.split('\n')[1:])              # drop the root line
    schema = [re.sub(r'^[^a-z]+', ' ', l) for l in schema]   # strip tree prefixes
    schema = [re.sub(r':', '', l) for l in schema]
    schema = [re.sub(r'BYTE_ARRAY.*', 'STRING', l) for l in schema]
    schema = [re.sub(r'INT64, TIMESTAMP_MICROS.*', 'TIMESTAMP', l) for l in schema]
    schema = [re.sub(r'INT64, TIMESTAMP_MILLIS.*', 'TIMESTAMP', l) for l in schema]
    return nl.join(schema)
```
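The regex pipeline can be exercised without a Parquet file by feeding it a sample schema dump. This sketch assumes fastparquet's `schema.text` format (a header line followed by indented `name: TYPE` entries); the sample text here is illustrative, not from a real file:

```python
import re

# Hypothetical two-column schema dump in the shape fastparquet prints.
sample = """- spark_schema:
| - time: INT64, TIMESTAMP_MILLIS
| - driver: BYTE_ARRAY, UTF8
"""

def to_hive_columns(text):
    """Apply the same transformations as hive_schema to an in-memory dump."""
    lines = filter(None, text.split('\n')[1:])               # drop the root line
    lines = [re.sub(r'^[^a-z]+', ' ', l) for l in lines]     # strip tree prefixes
    lines = [re.sub(r':', '', l) for l in lines]
    lines = [re.sub(r'BYTE_ARRAY.*', 'STRING', l) for l in lines]
    lines = [re.sub(r'INT64, TIMESTAMP_MICROS.*', 'TIMESTAMP', l) for l in lines]
    lines = [re.sub(r'INT64, TIMESTAMP_MILLIS.*', 'TIMESTAMP', l) for l in lines]
    return ',\n'.join(lines)

# to_hive_columns(sample) → ' time TIMESTAMP,\n driver STRING'
```

The result drops straight into a `CREATE EXTERNAL TABLE ... (<columns>) STORED AS PARQUET` statement.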