Astro + MDX · rsync deploy · Linode · Cloudflare DNS
This document defines a boring, sane, long-lived infrastructure for a content-driven website and blog.
The guiding principle is simple:
Build locally. Inspect everything. Deploy files. Serve static content.
No CI rebuilds. No Docker. No GitHub in the deploy path. No surprises.
⸻
These constraints govern every decision that follows.
- **The server is dumb**
  - Serves static files only
  - No Node, Bun, Docker, or build tooling on the server
- **The site is content**
  - Source of truth = files in a git repo
  - Every git commit is a potential release
- **Build happens locally**
  - Deterministic
  - Inspectable
  - Repeatable
- **Deploy is atomic**
  - Upload new files
  - Flip one symlink
  - Done
- **GitHub is not part of deployment**
  - Used only for revision control
  - No CI, no webhooks, no apps, no secrets
If any future "improvement" violates these rules, it is rejected.
⸻
Linode + Cloudflare DNS + HTTPS
Goal: https://tugtool.dev serves a static index.html over HTTPS.
- Linode VM
- Ubuntu 24.04 LTS
- Small instance is sufficient (static files only)
Cloudflare is used explicitly to simplify analytics later.
- Move `tugtool.dev` DNS to Cloudflare
- Create:
- A record → Linode IPv4
- AAAA record → Linode IPv6 (optional)
- Leave proxy ("orange cloud") off initially (DNS-only, origin-served)
Note (registrar vs DNS): If the domain is registered at Namecheap (or any registrar), that is separate from DNS hosting. To use Cloudflare DNS, you change the domain's nameservers at the registrar to the Cloudflare-provided nameservers. After that, DNS records are managed in Cloudflare.
Door open for later: If we enable the Cloudflare proxy later, this plan still works. The origin stays static; only caching/CDN behavior changes upstream.
Caddy handles HTTPS automatically via Let's Encrypt.
sudo apt update
sudo apt install -y debian-keyring debian-archive-keyring apt-transport-https curl
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/gpg.key' \
| sudo gpg --dearmor -o /usr/share/keyrings/caddy-stable-archive-keyring.gpg
curl -1sLf 'https://dl.cloudsmith.io/public/caddy/stable/debian.deb.txt' \
| sudo tee /etc/apt/sources.list.d/caddy-stable.list
sudo apt update
sudo apt install -y caddy

Check with:

sudo systemctl status caddy

Server layout:

/srv/tugtool/
  releases/
  current -> /srv/tugtool/releases/<release-id>

- Caddy serves `/srv/tugtool/current`
Goal: allow deploys without SSHing as root, while keeping the server "dumb" (static file serving only).
- Create a shared group: `site`
- Create a dedicated Unix user: `deploy` (member of `site`)
- Add `caddy` to the `site` group
- SSH: key-based auth only; multiple keys allowed (deploys can happen from any machine with an authorized key)
- Filesystem policy:
  - `/srv/tugtool` owned by `deploy:site` with group-readable permissions
  - `deploy` has write access to `/srv/tugtool/releases/` and can update the `current` symlink
  - Caddy has read access via `site` group membership
- No build tools on the server; the server never runs `node`, `pnpm`, etc.
# Create directory structure
sudo mkdir -p /srv/tugtool/releases
# Create site group and deploy user
sudo groupadd site
sudo useradd -m -g site -s /bin/bash deploy
sudo usermod -aG site caddy
# Set ownership and permissions
sudo chown -R deploy:site /srv/tugtool
sudo chmod -R u=rwX,g=rX,o= /srv/tugtool
sudo chmod 2750 /srv/tugtool /srv/tugtool/releases

Enable Linode's built-in monitoring for basic uptime and resource alerts:
- CPU, memory, disk usage alerts
- Network connectivity monitoring
- Configure email alerts for anomalies
This provides "good enough" monitoring without operational complexity.
Copy your public key to the deploy user. From your local machine:
# Option A: If you can still SSH as root
ssh-copy-id -i ~/.ssh/id_ed25519.pub deploy@tugtool.dev
# Option B: Manually (run on server as root)
sudo -u deploy mkdir -p /home/deploy/.ssh
sudo -u deploy chmod 700 /home/deploy/.ssh
echo "YOUR_PUBLIC_KEY_HERE" | sudo -u deploy tee /home/deploy/.ssh/authorized_keys
sudo -u deploy chmod 600 /home/deploy/.ssh/authorized_keys

Test SSH login as deploy (from local machine):

ssh deploy@tugtool.dev

Give deploy passwordless sudo (if you're logged in as deploy, you can use the box):
echo "deploy ALL=(ALL) NOPASSWD:ALL" | sudo tee /etc/sudoers.d/deploy
sudo chmod 440 /etc/sudoers.d/deploy

Test:
sudo whoami
# Should return: root (no password prompt)

Note: The deploy script won't need sudo — it runs as deploy, which owns /srv/tugtool. Sudo is for interactive maintenance tasks.
Edit SSH config to disable password auth and root login:
sudo tee /etc/ssh/sshd_config.d/hardening.conf << 'EOF'
PermitRootLogin no
PasswordAuthentication no
ChallengeResponseAuthentication no
UsePAM yes
EOF

Validate and restart:
Warning: Before doing this, confirm you can SSH as deploy with your key. If you lock yourself out, you'll need Linode console access (Lish) to recover.
sudo sshd -t && sudo systemctl restart ssh

sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh
sudo ufw allow http
sudo ufw allow https
sudo ufw enable

Verify:
sudo ufw status
# Should show: 22, 80, 443 allowed

Create test content (as deploy user):
sudo -u deploy mkdir -p /srv/tugtool/releases/hello
echo "hello world" | sudo -u deploy tee /srv/tugtool/releases/hello/index.html
sudo -u deploy ln -sfn /srv/tugtool/releases/hello /srv/tugtool/current

Write the complete Caddyfile:
sudo tee /etc/caddy/Caddyfile << 'EOF'
tugtool.dev {
    respond /health "ok\n" 200
    root * /srv/tugtool/current

    handle_errors {
        @notfound expression {http.error.status_code} == 404
        rewrite @notfound /404/index.html
        file_server
    }

    file_server
}

www.tugtool.dev {
    redir https://tugtool.dev{uri} permanent
}
EOF

Format and reload:
sudo caddy fmt --overwrite /etc/caddy/Caddyfile
sudo systemctl reload caddy

Verify:
curl https://tugtool.dev
# Should return: hello world

Deliverable: https://tugtool.dev serves a static page over HTTPS.
⸻
Goal: Every deployed version is identified by a 7-character git commit hash.
Release ID:
git rev-parse --short=7 HEAD

Server layout:
/srv/tugtool/releases/
a7c91e2/
f13b09d/
current -> /srv/tugtool/releases/f13b09d
This is:
- Unique
- Human-traceable
- Content-accurate
- Naturally ordered by git history
Each release contains a file: VERSION
Contents:
commit a7c91e2
Cutover is a single operation:
ln -sfn /srv/tugtool/releases/<commit> /srv/tugtool/current

- Instant
- Atomic
- Reversible
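In the deploy script, the cutover reduces to building one command string and running it over SSH. A minimal sketch (the helper name `cutover_command` is illustrative, not the script's literal code; the paths match the config constants defined later in this plan):

```python
import shlex

REMOTE_BASE = "/srv/tugtool"

def cutover_command(commit: str) -> str:
    """Build the single symlink flip that makes a release live.

    ln -sfn points the `current` symlink at the new release directory
    in one step; pointing it back at the previous release directory is
    the entire rollback procedure.
    """
    target = f"{REMOTE_BASE}/releases/{commit}"
    link = f"{REMOTE_BASE}/current"
    return f"ln -sfn {shlex.quote(target)} {shlex.quote(link)}"
```

The deploy script would run this string over SSH as the `deploy` user.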
Deliverable: Any commit can be made live or rolled back instantly.
⸻
Goal: Deploy the currently checked-out commit with one command.
- Input: current git commit
- Output: site live on
tugtool.dev - No SSH shells
- No manual steps
The script is written in Python 3.12+ (bash is inadequate for this complexity).
The script must:
- Assert clean working tree (hard failure — a dirty tree means the commit hash doesn't represent what you're deploying)
- Compute commit hash
- Run local build
- Validate content invariants locally (fail fast with excellent error messages):
- Slugs must be URL-safe and well-formed
- Slugs must be globally unique (single flat namespace)
- Compute content manifest hash (SHA256 of all file paths + contents) for verification
- rsync build output to `/srv/tugtool/releases/<commit>/` (idempotent — re-running deploys the same commit safely)
- Verify remote content matches local manifest hash
- Flip `current` symlink via SSH
- Prune old releases, keeping last 3 (easy to redeploy any old commit from git)
Re-deploy behavior: If a deploy fails mid-transfer or needs retry, running ./deploy again for the same commit will rsync the correct files over any partial state and verify integrity. No manual cleanup needed.
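The idempotency comes from how the transfer is invoked. A sketch of the command construction only (the helper name is illustrative; flags and destination match the rsync invocation listed in this plan):

```python
def rsync_command(commit: str, dist: str = "dist/") -> list[str]:
    """Build the rsync invocation for one release directory.

    --checksum forces content comparison, so re-runs repair partial
    uploads; --delete removes stray remote files. Together they make
    repeated deploys of the same commit converge on identical state.
    """
    dest = f"deploy@tugtool.dev:/srv/tugtool/releases/{commit}/"
    return ["rsync", "-avz", "--delete", "--checksum", dist, dest]
```

Running it with `subprocess.run(rsync_command(commit), check=True)` then raises on any transfer failure, satisfying the fail-fast rule.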
Implementation notes:
- Manifest algorithm must be deterministic:
  - Include relative file path + file contents for every file in `dist/`
  - Sort file paths bytewise before hashing
- Remote verification must use only baseline tooling on Ubuntu:
  - Prefer running a short `python3` snippet over SSH to compute the same manifest on the remote directory.
  - No server-side Node. No server-side build.
Python dependencies (keep minimal, pinned):

- `PyYAML` for frontmatter parsing/validation (slug checks, required fields, etc.)
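Frontmatter extraction is simple enough to sketch. The helper name below is illustrative; the extracted block would be handed to `yaml.safe_load` (PyYAML), and the split itself is pure stdlib:

```python
def split_frontmatter(text: str) -> tuple[str, str]:
    """Split an MDX file into (frontmatter_block, body).

    Frontmatter is the YAML between the first two '---' fences. The
    deploy script feeds the block to yaml.safe_load and then checks
    required fields such as slug and date.
    """
    if not text.startswith("---"):
        raise ValueError("post is missing frontmatter")
    # maxsplit=2 leaves any later '---' in the body untouched
    _, block, body = text.split("---", 2)
    return block.strip(), body.lstrip("\n")
```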
./deploy

No arguments. No modes.
Deliverable: A single, reliable deploy command.
- `./deploy` — executable Python script (chmod +x, shebang line)
- `requirements.txt` — PyYAML dependency (pinned)
REMOTE_HOST = "tugtool.dev"
REMOTE_USER = "deploy"
REMOTE_BASE = "/srv/tugtool"
RELEASES_DIR = f"{REMOTE_BASE}/releases"
CURRENT_LINK = f"{REMOTE_BASE}/current"
KEEP_RELEASES = 3
LOCAL_DIST = Path("dist")
CONTENT_DIR = Path("src/content/journal")

The script runs these steps in order:

- Assert clean working tree — `git status --porcelain`, hard fail if dirty
- Get commit hash — `git rev-parse --short=7 HEAD`
- Run build — `pnpm astro build`
- Validate content — parse MDX frontmatter, check slugs
- Compute manifest hash — SHA256 of sorted file paths + contents in `dist/`
- rsync to server — `rsync -avz --delete --checksum dist/ deploy@tugtool.dev:/srv/tugtool/releases/{commit}/`
- Verify remote hash — run `python3` snippet over SSH, compare hashes
- Flip symlink — `ln -sfn /srv/tugtool/releases/{commit} /srv/tugtool/current`
- Prune old releases — keep last 3, remove older
- Write VERSION file — `commit {hash}` in release directory
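Pruning deserves care because commit hashes do not sort chronologically. A sketch of the selection logic, assuming the caller has already listed release directories newest-first (e.g. by directory mtime) — the helper name is illustrative:

```python
def releases_to_prune(releases: list[str], live: str, keep: int = 3) -> list[str]:
    """Pick release directories to delete.

    `releases` is ordered newest-first. Keep the newest `keep` entries,
    and never delete the release `current` points at, even if it has
    somehow aged out of the keep window.
    """
    return [r for r in releases[keep:] if r != live]
```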
Slug Validation:
SLUG_PATTERN = re.compile(r'^[a-z0-9]+(?:-[a-z0-9]+)*$')

- Lowercase alphanumeric with hyphens only
- No leading/trailing hyphens, no consecutive hyphens
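Both slug rules fall out of one pass over the collected frontmatter. A sketch (the `check_slugs` helper is illustrative; the pattern is the one above):

```python
import re

SLUG_PATTERN = re.compile(r'^[a-z0-9]+(?:-[a-z0-9]+)*$')

def check_slugs(slugs: list[str]) -> list[str]:
    """Return error messages for malformed or duplicate slugs.

    An empty result means every slug is well-formed and globally
    unique across the single flat namespace.
    """
    errors = [f"invalid slug: {s!r}" for s in slugs if not SLUG_PATTERN.match(s)]
    seen: set[str] = set()
    for s in slugs:
        if s in seen:
            errors.append(f"duplicate slug: {s!r}")
        seen.add(s)
    return errors
```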
Manifest Hash (must be identical local and remote):
def compute_manifest_hash(dist_dir: Path) -> str:
    hasher = hashlib.sha256()
    files = sorted(
        (p.relative_to(dist_dir) for p in dist_dir.rglob("*") if p.is_file()),
        key=lambda p: str(p).encode('utf-8')
    )
    for rel_path in files:
        hasher.update(str(rel_path).encode('utf-8'))
        hasher.update(b'\n')
        hasher.update((dist_dir / rel_path).read_bytes())
        hasher.update(b'\n')
    return hasher.hexdigest()

Remote Verification:
- SSH into server, run identical Python hash algorithm
- Compare hex digests, fail if mismatch
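One way to keep the two computations identical is to ship the algorithm itself over SSH. A sketch under that assumption (the snippet string and `verify_command` helper are illustrative, not the script's literal code; host and paths match this plan's config):

```python
import shlex

# Inlined copy of the manifest algorithm; must stay byte-for-byte
# equivalent to compute_manifest_hash() above.
REMOTE_SNIPPET = (
    "import hashlib, pathlib, sys\n"
    "d = pathlib.Path(sys.argv[1])\n"
    "h = hashlib.sha256()\n"
    "files = sorted((p.relative_to(d) for p in d.rglob('*') if p.is_file()),\n"
    "               key=lambda p: str(p).encode('utf-8'))\n"
    "for rel in files:\n"
    "    h.update(str(rel).encode('utf-8')); h.update(b'\\n')\n"
    "    h.update((d / rel).read_bytes()); h.update(b'\\n')\n"
    "print(h.hexdigest())\n"
)

def verify_command(commit: str) -> list[str]:
    """Build the ssh invocation that prints the remote manifest hash."""
    release = f"/srv/tugtool/releases/{commit}"
    remote = f"python3 -c {shlex.quote(REMOTE_SNIPPET)} {shlex.quote(release)}"
    return ["ssh", "deploy@tugtool.dev", remote]
```

The deploy script would capture the printed digest, compare it to the local one, and abort on mismatch.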
main()
├── assert_clean_working_tree()
├── get_commit_hash() -> str
├── run_build()
├── validate_content() -> int
│ ├── extract_frontmatter(content) -> dict
│ └── validate_slug(slug) -> bool
├── compute_manifest_hash() -> str
├── rsync_to_server(commit) -> int
├── verify_remote_hash(commit, expected_hash)
├── flip_symlink(commit)
├── prune_old_releases() -> list[str]
└── write_version_file(commit)
Deploying commit a7c91e2...
[1/9] Checking working tree... ok
[2/9] Running build... ok (3.2s)
[3/9] Validating content... ok (4 posts)
[4/9] Computing manifest hash... ok (sha256:e3b0c442...)
[5/9] Uploading to server... ok (rsync: 42 files)
[6/9] Verifying remote content... ok
[7/9] Switching to new release... ok
[8/9] Pruning old releases... ok (removed: b2c91e3)
[9/9] Writing VERSION file... ok
Deployed! https://tugtool.dev is now serving commit a7c91e2
- Custom `DeployError` exception with descriptive messages
- Fail fast on any error
- Clear messages for: dirty tree, build failure, invalid slug, duplicate slug, rsync failure, hash mismatch
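A sketch of the fail-fast pattern using the `DeployError` class this plan names (the checker function and its exact message are illustrative):

```python
class DeployError(Exception):
    """Any violated invariant aborts the deploy with a clear message."""

def assert_clean_working_tree(porcelain: str) -> None:
    """Fail hard if `git status --porcelain` printed anything.

    A dirty tree means the commit hash would not describe the files
    actually being deployed.
    """
    if porcelain.strip():
        raise DeployError("Working tree is dirty:\n" + porcelain)
```

main() can then catch `DeployError` once, print the message, and exit nonzero.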
Files to create:
- `requirements.txt` with `pyyaml==6.0.1` (pinned)
Usage:
pip install -r requirements.txt

Script imports: subprocess, hashlib, pathlib, re (all stdlib) + yaml (PyYAML)
- Empty content directory (valid, deploy proceeds)
- Re-deploy same commit (idempotent)
- Partial previous deploy (rsync repairs)
- No SSH key configured (clear error from ssh)
- dist/ doesn't exist (check before hash, clear error)
Note: Full verification requires Phase 4 (Astro setup) to be complete. The build step needs pnpm astro build to work.
What can be verified now (Phase 3):
- Test dirty tree rejection:

echo "test" > test.txt
./deploy   # Should fail with "Working tree is dirty"
rm test.txt

- Test clean tree detection:

./deploy   # Should pass step 1, fail at step 2 (build) with "No package.json"
Verify after Phase 4 is complete:
- Full deploy test:

./deploy
curl https://tugtool.dev

- Test idempotency:

./deploy   # Run twice, should succeed both times

- Verify VERSION file:

ssh deploy@tugtool.dev cat /srv/tugtool/current/VERSION

- Verify manifest hash:
  - Deploy, note hash
  - SSH to server, run hash algorithm manually
  - Compare results
⸻
Set up Astro as a static site generator with Tailwind CSS v4, React (for icons only), and all required integrations.
- `.nvmrc` — Node version pin
- `package.json` — Dependencies and scripts
- `astro.config.mjs` — Astro configuration
- `src/styles/global.css` — Global styles with Tailwind + font definitions
- `src/layouts/BaseLayout.astro` — Base HTML layout
- `src/pages/index.astro` — Homepage
- `src/pages/404.astro` — 404 error page
- `public/robots.txt` — Allow all crawlers
- `public/fonts/` — Self-hosted font files (user provides)
.nvmrc:

22.11.0
pnpm init
pnpm add astro @astrojs/mdx @astrojs/rss @astrojs/sitemap @astrojs/react react react-dom
pnpm add tailwindcss @tailwindcss/vite
pnpm add remark-gfm lucide-react clsx

Ensure package.json has:
{
  "name": "tugtool-site",
  "type": "module",
  "scripts": {
    "dev": "astro dev",
    "build": "astro build",
    "preview": "astro preview"
  }
}

astro.config.mjs:

import { defineConfig } from 'astro/config';
import mdx from '@astrojs/mdx';
import sitemap from '@astrojs/sitemap';
import react from '@astrojs/react';
import tailwindcss from '@tailwindcss/vite';
import remarkGfm from 'remark-gfm';
export default defineConfig({
  site: 'https://tugtool.dev',
  output: 'static',
  integrations: [
    mdx(),
    sitemap(),
    react(),
  ],
  markdown: {
    remarkPlugins: [remarkGfm],
  },
  vite: {
    plugins: [tailwindcss()],
  },
});

mkdir -p public/fonts

Font files:
- Download Inter from fonts.google.com (provides TTF variable fonts)
- Typography.com fonts: deferred
Converting TTF to WOFF2 (recommended for smaller file size):
Install Google's official woff2 tool:
brew install woff2

Convert:
cd public/fonts
woff2_compress Inter-VariableFont_opsz,wght.ttf
woff2_compress Inter-Italic-VariableFont_opsz,wght.ttf

This creates .woff2 files (~350KB) alongside the TTFs (~875KB). Delete the TTFs after conversion if desired.
Alternative: Use TTF files directly. They work in all modern browsers; the only downside is larger file size.
Expected files after conversion:
public/fonts/
Inter-VariableFont_opsz,wght.woff2
Inter-Italic-VariableFont_opsz,wght.woff2
Create global styles with:
- Tailwind CSS import
- `@custom-variant dark` for dark mode support
- `@font-face` definitions for self-hosted fonts (reference files in `/fonts/`)
- `@theme inline` block with CSS custom properties for colors, radii, etc.
- `:root` and `.dark` blocks with light/dark color schemes
- `@layer base` with default styles for body, links, borders
- `.prose` styles for MDX content (code blocks, headings, blockquotes)
The specific color palette, theme variables, and prose styles are left as an exercise — adapt from an existing design system or create your own.
Create the base HTML layout with:
- Imports: site config, global.css
- Props interface: title, description, ogImage
- Head: charset, viewport, color-scheme meta, title, description, canonical URL, favicon, OG/Twitter meta tags, RSS link
- Theme detection script (check localStorage and prefers-color-scheme, apply `.dark` class)
- Body: gradient background overlay, sticky header with logo and nav, main content slot, footer
- Header: logo (with dark/light variants), nav links with current-path highlighting, theme toggle button
- Footer: copyright, RSS link, GitHub link
- Theme toggle script (toggle `.dark` class and persist to localStorage)
Content collection queries (for nav drawer, etc.) should be commented out until Phase 5.
Create the homepage with:
- Import BaseLayout and site config
- Hero section with headline, description paragraphs, CTA buttons
- Grid layout with text on left, hero image on right
- "Latest entries" section (commented out until Phase 5 content collections are configured)
Create a simple 404 page with:
- Import BaseLayout
- Heading, message, and links back to home and journal
User-agent: *
Allow: /
Sitemap: https://tugtool.dev/sitemap-index.xml
mkdir -p src/{pages,layouts,styles,lib,content/journal} public/{fonts,og}

Copy public assets (favicon, logos, OG images) from existing design or create new ones.
Ensure .gitignore includes entries for:
- Astro build output (`dist/`, `.astro/`)
- Node dependencies (`node_modules/`)
- Font files if licensed (`public/fonts/*.woff`, `public/fonts/*.woff2`, `public/fonts/*.ttf`)
Note: Inter is OFL-licensed and could be committed, but typography.com fonts are proprietary and should not be committed to public repos.
After implementation:
- Install Node (if needed):

nvm install
nvm use

- Install dependencies:

pnpm install

- Run dev server:

pnpm dev

Visit http://localhost:4321 — should see "tugtool.dev" homepage

- Test build:

pnpm build

Should create dist/ with static files

- Test preview:

pnpm preview

Visit http://localhost:4321 — should match dev

- Verify 404 page: visit http://localhost:4321/nonexistent — should show 404 page

- Full deploy test:

./deploy
curl https://tugtool.dev

Should deploy and serve the new homepage

- Verify sitemap generated:

ls dist/sitemap*.xml
curl https://tugtool.dev/sitemap-index.xml

- Verify fonts loaded (if font files provided):
  - Open browser dev tools → Network tab
  - Reload page
  - Confirm .woff2 files loaded from /fonts/
- Font files must be provided by user (Google Fonts download + typography.com web kit)
- If fonts aren't available yet, the site will fall back to system fonts gracefully
- Content collections (Phase 5) not set up yet — journal/ directory created but empty
- RSS feed will work once content exists
- Typography.com fonts are purchased/licensed. Do not commit font files to a public repository. If the repo is public, add `public/fonts/*.woff*` to `.gitignore` and document the required fonts in the README so collaborators know what to obtain.
- No external requests at page load
- No GDPR/privacy concerns (no Google Fonts CDN)
- Fonts versioned with the site
- Works offline
- Faster first contentful paint
⸻
Bring the full blog system online with content collections, MDX support, RSS, and all markdown niceties.
pnpm add remark-smartypants rehype-slug rehype-autolink-headings remark-toc

Add all remark/rehype plugins with proper ordering:
import remarkSmartypants from 'remark-smartypants';
import remarkToc from 'remark-toc';
import rehypeSlug from 'rehype-slug';
import rehypeAutolinkHeadings from 'rehype-autolink-headings';
// In markdown config:
markdown: {
  remarkPlugins: [
    remarkGfm,
    remarkSmartypants,
    [remarkToc, { heading: 'contents|table of contents|toc', maxDepth: 3, tight: true }],
  ],
  rehypePlugins: [
    rehypeSlug,
    [rehypeAutolinkHeadings, { behavior: 'prepend', properties: { className: ['anchor-link'], ariaHidden: true, tabIndex: -1 } }],
  ],
  shikiConfig: {
    themes: { light: 'github-light', dark: 'github-dark' },
  },
},

Define the journal collection schema:
import { defineCollection, z } from 'astro:content';

const journal = defineCollection({
  type: 'content',
  schema: z.object({
    title: z.string(),
    date: z.coerce.date(),
    slug: z.string(),
    tags: z.array(z.string()).default([]),
    description: z.string().max(200).optional(),
  }),
});

export const collections = { journal };

Notes:
- `slug` is required in frontmatter. The deploy script validates slug uniqueness.
- `date` supports multiple formats via `z.coerce.date()`:
  - ISO datetime: `2025-01-16T12:00:00`
  - Unix `date` output: `Fri Jan 16 10:54:10 PST 2026`
- Use a datetime (not just a date) to ensure unambiguous ordering when multiple posts are published on the same day.
Journal index page showing all posts:
- Import `getCollection` from `astro:content`
- Query all journal posts, sort by date descending
- Render as card list with title, description, date
- Use BaseLayout
Reference: Old implementation at /Users/kocienda/Mounts/u/src/tugtool.dev/src/pages/journal/index.astro
Individual post pages with dynamic routing:
- Use `getStaticPaths()` to generate routes from the frontmatter `slug` field
- Query post by slug, render MDX content
- Wrap in prose styling
- Include back link to journal index
Reference: Old implementation at /Users/kocienda/Mounts/u/src/tugtool.dev/src/pages/journal/[slug].astro
RSS feed generation:
- Use `@astrojs/rss` (already installed)
- Query all journal posts
- Map to RSS items with title, pubDate, description, link
- Use SITE config for feed metadata
Reference: Old implementation at /Users/kocienda/Mounts/u/src/tugtool.dev/src/pages/rss.xml.ts
Uncomment the "Latest entries" section:
- Uncomment content collection imports
- Uncomment the `latest` query
- Uncomment the "Latest entries" section HTML
Uncomment the latestForDrawer query (for future mobile drawer support).
Image wrapper component for MDX posts:
- Wraps the `astro:assets` Image component
- Enforces a required `alt` attribute
- Sets a sensible `sizes` default
- Handles content-hashed output
Per site-plan.md (hard rule): "No raw <img> tags in MDX posts. All post images must be imported and rendered via the shared wrapper component."
Style the heading anchor links:
.anchor-link {
  @apply text-muted-foreground opacity-0 transition-opacity;
}

h2:hover .anchor-link,
h3:hover .anchor-link,
h4:hover .anchor-link {
  @apply opacity-100;
}

Create src/content/journal/hello-world/index.mdx:
---
title: "Hello World"
date: 2025-01-16T12:00:00
slug: hello-world
tags: []
description: "First post to test the blog system."
---
# Hello World
This is a test post.
## Contents
## Section One
Some content here with "smart quotes" and -- dashes.
## Section Two
More content.

| File | Action |
|---|---|
| `package.json` | Add 4 dependencies |
| `astro.config.mjs` | Add remark/rehype plugins, shiki config |
| `src/content/config.ts` | Create (new) |
| `src/pages/journal/index.astro` | Create (new) |
| `src/pages/journal/[slug].astro` | Create (new) |
| `src/pages/rss.xml.ts` | Create (new) |
| `src/pages/index.astro` | Uncomment Phase 5 code |
| `src/layouts/BaseLayout.astro` | Uncomment Phase 5 code |
| `src/styles/global.css` | Add anchor link styles |
| `src/components/BlogImage.astro` | Create (required) |
| `src/content/journal/hello-world/index.mdx` | Create test post |
- Install dependencies:

pnpm install

- Run dev server:

pnpm dev

  - Visit http://localhost:4321 — homepage should show "Latest entries" with test post
  - Visit http://localhost:4321/journal — should list all posts
  - Visit http://localhost:4321/journal/hello-world — should render test post

- Test markdown niceties:
  - Smart quotes: straight quotes should render as curly quotes
  - Dashes: -- should render as —
  - TOC: ## Contents should be replaced with a linked list
  - Heading anchors: hover over h2/h3 to see anchor links
  - Code blocks: should have syntax highlighting

- Test RSS: visit http://localhost:4321/rss.xml — should show a valid RSS feed

- Test build:

pnpm build

  - Should complete without errors
  - Check `dist/journal/hello-world/index.html` exists

- Test deploy:

./deploy

  - Should validate slug, build, upload, verify
  - Visit https://tugtool.dev/journal/hello-world
- No draft feature per site-plan.md — preview locally before committing
- Slug is authoritative (from frontmatter, not filename) — deploy script validates
- Tag pages not implemented in this phase (can be added later)
- MobileJournalDrawer deferred (requires React component from old site)
⸻
Goal: Add analytics without touching infrastructure.
- Add one Cloudflare Web Analytics snippet to the site layout
- No server configuration
- No deploy process changes
- No runtime dependencies
Cloudflare observes traffic at the DNS/CDN layer.
Deliverable: Analytics with zero operational complexity.
⸻
- One git repo
- One Linode VM
- One static directory
- One deploy command
- Infinite releases via commit hashes
- Zero CI rebuilds
- Zero deployment drama
- Optional Cloudflare proxy/CDN later (no origin redesign)
This is as close as the modern web gets to:
"copy files to a server and be done"
while still delivering TLS, SEO, images, RSS, and analytics.