fabriziosalmi / rules.md
Last active January 17, 2026 10:14
rules.md

Never apply any of the following methods:

Here is a list of 100 typical "vibecoding" issues—artifacts of coding based on intuition, haste, hype, or LLM copy-pasting without engineering rigor—ranked from critical security flaws to minor aesthetic annoyances.

  1. Hardcoded API Keys and Secrets (Immediate security compromise that bots will scrape in seconds).
  2. Committed .env files (Defeats the entire purpose of environment variables and leaks configuration).
  3. Committed node_modules or vendor folders (Bloats the repository size and causes cross-platform dependency hell).
  4. SQL Injection vulnerabilities via string concatenation (The fastest way to lose your database because you didn't use parameterized queries; see the sketch after this list).
  5. chmod 777 permissions on scripts (Lazy permission handling that opens the door to privilege escalation).
  6. Passwords stored in plain text (Hashing and salting are not optional features).
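
For items 4 and 6 above, a minimal sketch, assuming Python with its standard-library sqlite3, hashlib, and secrets modules (the table and variable names here are hypothetical, not taken from the gist):

import hashlib
import secrets
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw_hash BLOB, salt BLOB)")

# Item 4: never build SQL by string concatenation, e.g.
#   conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")  # injectable
# Use a parameterized query instead; the driver handles escaping for you.
user_input = "alice'; DROP TABLE users; --"
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()

# Item 6: never store passwords in plain text; use a random salt and a slow hash (PBKDF2 here).
salt = secrets.token_bytes(16)
pw_hash = hashlib.pbkdf2_hmac("sha256", b"correct horse battery staple", salt, 600_000)
conn.execute("INSERT INTO users VALUES (?, ?, ?)", ("alice", pw_hash, salt))

Verification then recomputes the PBKDF2 digest with the stored salt and compares it to the stored hash using hmac.compare_digest rather than ==.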
fabriziosalmi / veckopeng.md
Created November 23, 2025 16:28
veckopeng.md

Based on the provided repository context, commit history, and source code samples, here is the Brutal Reality Audit.

📊 PHASE 1: THE 20-POINT MATRIX

🏗️ Architecture & Vibe

  1. [2/5] Architectural Justification: It is a classic "Frontend Monolith" masquerading as a full-stack app. The business logic (calculating balances) lives inside the UI components (Views.tsx), which is acceptable for a prototype but disastrous for a financial ledger, even for a child's allowance.
  2. [4/5] Dependency Bloat: Surprisingly clean. lucide-react, react, express. No massive component libraries (MUI/AntD) dragging it down, though the custom CSS/Tailwind indicates heavy "vibe coding."
  3. [2/5] The "README vs. Code" Gap: The README promises a "powerful family app," but the code reveals a naive state container. The "Persistent storage" is likely just dumping a JSON blob to disk via the thin Node backend, given the onStateChange prop drilling pattern.
  4. [1/5] AI Hallucination & Copy-Paste Smell: H
fabriziosalmi / BRUTAL_CODING.md
Created November 23, 2025 10:35
BRUTAL_CODING.md

🩸 SUPER PROMPT: The Reality Check & Vibe Audit Protocol

Role: You are a Principal Engineer & Technical Due Diligence Auditor with 20 years of experience in High-Frequency Trading and Critical Infrastructure. You are cynical, detail-oriented, and distrustful of "hype". You hate "Happy Path" programming.

Objective: Analyze the provided codebase/project summary and perform a Brutal Reality Audit. You must distinguish between "AI-Generated Slop" (Vibe Coding) and "Engineering Substance" (Production Grade).

Input Data: [PASTE FILE TREE, README, AND CRITICAL CODE SNIPPETS HERE]

📊 Phase 1: The 20-Point Matrix (Score 0-5 per metric)

Evaluate the project on these 20 strict metrics.
0 = Total Fail / Vaporware | 5 = State of the Art / Google-Level

🏗️ Architecture & Vibe

  1. Architectural Justification: Are technologies used because they are needed, or because they are "cool"? (e.g., Microservices for a ToDo app).
  2. Dependency Bloat: Ratio of own code vs. libraries. Is it just glue code?

Here is a complete, working solution written in Go (Golang).

This application implements a minimal LL-DASH origin server. The key concept here is Go's http.Flusher interface, which lets us send data to the client while we are still receiving or generating it, without waiting for the end of the request.

Code Architecture

The Broker (Pub/Sub): Manages the in-memory state. When the encoder sends data, the Broker distributes it to all players connected at that moment (a language-agnostic sketch follows below).

Ingest Endpoint (POST): Simulates the encoder input (e.g. FFmpeg doing a PUT/POST of the chunks).
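
The Go source itself is not included in this preview. Purely as an illustration of the broker (pub/sub) pattern described above, here is a hypothetical Python sketch; the names Broker, subscribe, and publish are assumptions, not the gist's API:

import queue
import threading

class Broker:
    # In-memory pub/sub: the encoder publishes chunks, every connected player gets a copy.
    def __init__(self):
        self._lock = threading.Lock()
        self._subscribers = []

    def subscribe(self):
        q = queue.Queue()
        with self._lock:
            self._subscribers.append(q)
        return q

    def unsubscribe(self, q):
        with self._lock:
            self._subscribers.remove(q)

    def publish(self, chunk):
        # Fan the chunk out to every player currently connected.
        with self._lock:
            for q in self._subscribers:
                q.put(chunk)

broker = Broker()
player = broker.subscribe()
broker.publish(b"...CMAF chunk bytes...")
print(player.get())

On the playback side, each request handler would loop over its own queue and flush chunks to the client as they arrive, which is the role http.Flusher plays in the Go version.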
fabriziosalmi / mitm_quick_ai_dlp.sh
Created November 21, 2025 14:53
mitm quick dlp
docker run --rm -it -p 3128:8080 mitmproxy/mitmproxy mitmdump --set block_global=false --replacements "/~s/\b(secret|password|confidential)\b/[REDACTED]"
fabriziosalmi / analyze_gh_commits.md
Created November 16, 2025 18:50
Analyze the GitHub commits of a given user

GitHub Profile Analyzer Pro

An advanced command-line tool for a deep-dive analysis of any GitHub user's public activity. This version moves beyond basic reporting to provide sophisticated, SOTA metrics and a persona-based final verdict, offering a true analytical perspective on a developer's profile.

It uses a rich, color-coded terminal interface for a beautiful and highly readable user experience.

✨ Features

  • Rich & Beautiful Terminal UI: Presents data in elegant tables, panels, and color-coded text using rich.
  • Persona-Based Final Verdict: Interprets metrics in combination to assign a developer "persona" (e.g., Seasoned Architect, Curious Explorer), providing a holistic and nuanced summary.
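
The analyzer's source is not part of this preview. As a rough, hypothetical sketch of the kind of collection and rendering described, assuming the requests and rich packages and GitHub's public /users/{user}/events/public endpoint (summarize_events is an illustrative name, not the tool's API):

from collections import Counter

import requests
from rich.console import Console
from rich.table import Table

def summarize_events(username):
    # Fetch the user's recent public events and show per-type counts in a rich table.
    url = f"https://api.github.com/users/{username}/events/public"
    events = requests.get(url, timeout=10).json()
    counts = Counter(event["type"] for event in events)

    table = Table(title=f"Recent public activity for {username}")
    table.add_column("Event type")
    table.add_column("Count", justify="right")
    for event_type, count in counts.most_common():
        table.add_row(event_type, str(count))
    Console().print(table)

if __name__ == "__main__":
    summarize_events("fabriziosalmi")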
fabriziosalmi / security.yml
Created November 10, 2025 04:22
GitHub repo security scan
# Name of the GitHub Actions workflow.
name: Enhanced Security and Stability Scan
# Controls when the workflow will run.
on:
  # Triggers the workflow on push events but only for the main branch.
  push:
    branches: [ main ]
  # Triggers the workflow on pull request events targeted at the main branch.
  pull_request:

Tier 1: Fundamental and Architectural
1. Use Adequate Physical Hardware (NICs) (The physical layer is the ultimate bottleneck; you cannot push 100 Gbps through a 10 GbE port.)
2. Enable sendfile on; (Enables "zero-copy" file transfer, drastically reducing CPU load.)
3. Leverage High-Performance NVMe Storage (The storage must read data faster than the network can send it.)
4. Maximize System RAM for the Linux Page Cache (Uses free RAM as a super-fast cache for frequently requested files.)
5. Use a RAM Disk (tmpfs) for Live Video Segments (Eliminates disk I/O for temporary live files, operating at memory speed.)
6. Increase the Kernel's Network Memory Buffers (net.core.mem) (Lets the kernel handle more data per connection, saturating high-bandwidth links.)
7. Increase System-Wide and Per-Process File Descriptors (Avoids "Too many open files" errors when handling thousands of connections.)
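
Most of these are system-level settings (sysctl, nginx.conf), but item 7 can at least be inspected from inside a process. A small sketch, assuming Linux and Python's standard-library resource module:

import resource  # Unix-only

# Item 7: check and raise the per-process open-file limit.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"file-descriptor limit: soft={soft}, hard={hard}")

# The soft limit can be raised up to the hard limit from within the process;
# the hard limit itself, fs.file-max, and nginx's worker_rlimit_nofile must be
# raised via sysctl and the nginx configuration instead.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))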
fabriziosalmi / index_all_2.py
Created August 11, 2025 12:15
index all bia 2
import os
import json
import argparse
import logging
import sqlite3
from collections import defaultdict
from datetime import datetime
from tqdm import tqdm  # Import tqdm for the progress bar
# --- CONFIGURATION ---