@StarKnightt
StarKnightt / liquid-effect-animation-prompt.md
Created February 19, 2026 15:01
Prompt to build an interactive liquid distortion effect component for React/Next.js using shadcn, Tailwind CSS & Three.js

You are given a task to integrate an existing React component in the codebase.

The codebase should support:

  • shadcn project structure
  • Tailwind CSS
  • TypeScript

If it doesn't, provide instructions on how to set up the project via the shadcn CLI and install Tailwind CSS or TypeScript.
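For a fresh project, the bootstrap might look like the following (a sketch using the current shadcn CLI; the flags, package names, and the `my-app` name are assumptions from the public docs, not part of the original prompt):

```shell
# Hypothetical fresh setup; assumes Node.js is installed.
npx create-next-app@latest my-app --typescript --tailwind --eslint
cd my-app
npx shadcn@latest init      # generates components.json and wires up Tailwind
npx shadcn@latest add card  # pull in components the effect can build on
```

After this, the project satisfies all three requirements above and components can be dropped into `components/`.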

@ariannamethod
ariannamethod / molequla.c
Created February 21, 2026 21:17
molequla.c — a dependency-free, single-file, continually-learning GPT organism in pure C. ontogenesis (25K→10M params), immune system, consciousness, swarm ecology, delta adapters, BLAS acceleration. part of github.com/ariannamethod/molequla
//go:build ignore
/*
* molequla.c
* A dependency-free, single-file, continually-learning GPT organism in pure C.
*
* Compile: gcc -O2 -o molequla molequla.c -lsqlite3 -lpthread -lm
* With BLAS: gcc -O2 -DUSE_BLAS -o molequla molequla.c -lsqlite3 -lpthread -lm -lopenblas
* macOS: gcc -O2 -DUSE_BLAS -o molequla molequla.c -lsqlite3 -lpthread -lm -framework Accelerate
*
@chr15m
chr15m / aider-convention-scraper
Last active February 21, 2026 21:16
Convert documentation URLs to Markdown "convention" files to be used by aider and other LLMs
#!/bin/sh
# Scrapes documentation from a URL and converts it to Markdown suitable for aider convention files
# to provide context to the LLM.
if [ $# -eq 0 ]; then
  echo "Usage: $(basename "$0") <URL> [URL...]"
  echo
  echo "Generate aider 'convention' Markdown context from documentation URLs,"
  echo "suitable for providing LLM context about a project's conventions and style."
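The conversion core might be sketched like this (naive tag-stripping only, to illustrate the shape of the pipeline; the actual gist presumably fetches each URL and runs a proper HTML-to-Markdown converter):

```shell
# Illustration only: naive tag stripping on an inline string. A real
# convention scraper would fetch each URL (e.g. with curl) and use a
# real HTML->Markdown converter instead of sed.
html='<h1>Project Conventions</h1><p>Use 2-space indentation.</p>'
printf '%s\n' "$html" | sed -e 's/<[^>]*>/ /g' -e 's/  */ /g'
```

The output is plain text suitable as rough LLM context, though real Markdown conversion preserves headings and lists that this sketch throws away.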
@ab2005
ab2005 / README.md
Last active February 21, 2026 21:14
Simple UVC camera capture using libuvc

Run:

  1. brew install libuvc
  2. gcc uvccam.c -luvc -o exx
  3. To get device info run: ./exx 1>/dev/null
  4. To save YUV422 to file: ./exx 2>/dev/null 1>video.yuv

Some ffmpeg experiments:

Record raw YUV422 from camera:
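One plausible form of that command (assuming macOS and ffmpeg's avfoundation input device; the original gist's exact invocation may differ):

```shell
# Sketch: capture ~5 s of raw YUV 4:2:2 from the default camera on macOS.
# On Linux, substitute the input with: -f v4l2 -i /dev/video0
ffmpeg -f avfoundation -framerate 30 -pixel_format yuyv422 -i "0" \
       -t 5 -f rawvideo -pix_fmt yuyv422 camera.yuv
```

Since the file has no container, playback needs the geometry restated, e.g. `ffplay -f rawvideo -pixel_format yuyv422 -video_size 640x480 camera.yuv`.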

@championswimmer
championswimmer / how-ai-agents-are-made.md
Last active February 21, 2026 21:11
How Personal AI Agents and Agent Orchestrators like OpenClaw or GasTown are Made

How Personal AI Agents and Agent Orchestrators like OpenClaw or GasTown are Made

Over the last few months, projects like Gas Town by Steve Yegge and OpenClaw by Peter Steinberger have made “AI agent orchestrators” feel suddenly mainstream. It is tempting to treat them as a new kind of intelligence, but under the hood they are still a small set of primitives wired together with discipline: an LLM API call, a state loop, tools, memory, and orchestration.

This raises a practical question: what is actually inside an “agent,” and how is it different from ChatGPT (a chat UI over a model) or coding tools like Claude Code (an agentic coding surface)? Gas Town’s README frames it as a “multi‑agent orchestrator”…
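Stripped of branding, the core of those primitives is just a state loop around a model call. A toy sketch (the `llm` function here is a stub; a real orchestrator would call a model API and dispatch tool calls from its replies):

```shell
# Toy agent loop: feed state to the model, read the reply, loop until
# the model signals it is done. llm() stands in for a real API call.
llm() {
  echo "DONE: $1"   # a real model would return text or a tool request
}

task="summarize the README"
while :; do
  reply=$(llm "$task")
  case "$reply" in
    DONE:*) echo "result:${reply#DONE:}"; break ;;  # terminal answer
    *)      task="$reply" ;;                        # feed state back in
  esac
done
```

Tools, memory, and orchestration are elaborations of this loop: tools are extra `case` branches that run commands, memory is whatever persists `task` between turns, and orchestrators run many such loops side by side.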

@velvet-shark
velvet-shark / openclaw-50-day-prompts.md
Last active February 21, 2026 21:11
OpenClaw after 50 days: all prompts for 20 real workflows (companion to YouTube video)

OpenClaw after 50 days: all prompts

Companion prompts for the video: OpenClaw after 50 days: 20 real workflows (honest review)

These are the actual prompts I use for each use case shown in the video. Copy-paste them into your agent and adjust for your setup. Most will work as-is or the agent will ask you clarifying questions.

Each prompt describes the intent clearly enough that the agent can figure out the implementation details. You don't need to hand-hold it through every step.

My setup: OpenClaw running on a VPS, Discord as primary interface (separate channels per workflow), Obsidian for notes (markdown-first), Coolify for self-hosted services.

@karpathy
karpathy / microgpt.py
Last active February 21, 2026 21:07
microgpt
"""
The most atomic way to train and run inference for a GPT in pure, dependency-free Python.
This file is the complete algorithm.
Everything else is just efficiency.
@karpathy
"""
import os # os.path.exists
import math # math.log, math.exp
(original: http://furukawablog.spaces.live.com/Blog/cns!1pmWgsL289nm7Shn7cS0jHzA!2225.entry)
(archived: https://web.archive.org/web/20061105073147/http://furukawablog.spaces.live.com/Blog/cns!1pmWgsL289nm7Shn7cS0jHzA!2225.entry)
November 04
My microcomputer journey: 30 years of Japanese personal computer history, part 1
Back when I was an Akihabara kid (a bit different from today's Akiba crowd, I think, though to an ordinary person we were probably the same breed), the microcomputer boys of the day hung out every day at places like Aster International, Computer Lab, Wakamatsu Tsusho, Bit-Inn, and Honda Tsusho in Akihabara, plus Moon Base and Tandy Radio Shack in Shinjuku, and the Aster International main store near Gyoenmae.
At the time, so-called single-board microcomputers costing around 100,000 yen, such as the TK-80, KIM-1, SCAMP, the Hitachi boards, and the L-Kit16, were in their heyday. S-100 bus computers like the Altair, IMSAI, SOL-20, North Star Horizon, and Cromemco were operated via teletype and paper tape and cost 500,000 to 800,000 yen, while machines fitted with an 8-inch floppy disk drive ran 1,000,000 to 1,500,000 yen. (For reference, this was an era when a university graduate's starting salary was around 100,000 yen a month.)