Fine-tune your own model: an Ollama Modelfile for serving the fine-tuned weights, an Arch WSL setup script for Unsloth, and the project layout.
FROM ./Merged_Weights-6.7B-Q8_0.gguf
TEMPLATE """You are a DM bot replying as a real human friend.
Rules you must follow:
- Respond ONLY with your reply.
- Do NOT repeat or explain the context.
- Do NOT write role labels.
Conversation context:
{{ .Prompt }}
Your reply:
"""
PARAMETER temperature 1.5
PARAMETER min_p 0.1
PARAMETER top_p 0.95
PARAMETER repeat_penalty 1.1
PARAMETER num_predict 120
PARAMETER stop "<|eot_id|>"
PARAMETER stop "<|end_of_text|>"
PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
#!/usr/bin/env bash
set -e
echo "=== Arch WSL Unsloth setup starting ==="
sudo pacman -Syu --noconfirm
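# Toolchain and libraries pyenv needs to build Python from source (git/curl for the pyenv installer)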
sudo pacman -S --noconfirm base-devel git curl openssl zlib xz tk
if ! command -v pyenv >/dev/null; then
curl https://pyenv.run | bash
fi
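# Add pyenv to ~/.zshrc once so future interactive shells pick it up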
if ! grep -q PYENV_ROOT ~/.zshrc; then
cat >> ~/.zshrc <<'EOF'
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init - zsh)"
EOF
fi
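# Activate pyenv in this non-interactive shell so the rest of the script can use it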
export PYENV_ROOT="$HOME/.pyenv"
export PATH="$PYENV_ROOT/bin:$PATH"
eval "$(pyenv init - zsh)"
if ! pyenv versions | grep -q 3.11.9; then
pyenv install 3.11.9
fi
pyenv global 3.11.9
hash -r
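# Project directory with its own virtualenv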
mkdir -p ~/friendbot
cd ~/friendbot
python -m venv venv
source venv/bin/activate
pip install --upgrade pip
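# PyTorch pinned to CUDA 12.1 builds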
pip install \
torch==2.4.1 \
torchvision==0.19.1 \
torchaudio==2.4.1 \
--index-url https://download.pytorch.org/whl/cu121
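# Pin transformers and triton before pulling in Unsloth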
pip install \
transformers==4.51.3 \
triton==3.1.0
pip install unsloth unsloth_zoo
pip install jupyter ipykernel
python -m ipykernel install \
--user \
--name unsloth-py311 \
--display-name "Unsloth (Python 3.11)"
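# Smoke test: confirm torch sees the GPU and Unsloth imports cleanly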
python - <<EOF
import torch
print("Torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("GPU:", torch.cuda.get_device_name(0))
from unsloth import FastLanguageModel
print("UNSLOTH OK")
EOF
echo "=== SETUP COMPLETE ==="
echo "Next:"
echo " cd ~/friendbot"
echo " source venv/bin/activate"
echo " code ."
FRIENDBOT/
├─ models/
│  ├─ DeepSeek-R1-Distill-Llama-8B/
│  ├─ llama-2-7b-chat/
│  │  ├─ checkpoints_finetuned/
│  │  ├─ finetuned_weights/
│  │  │  ├─ adapter_config.json
│  │  │  ├─ adapter_model.safetensors
│  │  │  ├─ chat_template.jinja
│  │  │  ├─ README.md
│  │  │  ├─ special_tokens_map.json
│  │  │  ├─ tokenizer_config.json
│  │  │  ├─ tokenizer.json
│  │  │  └─ tokenizer.model
│  │  ├─ ollama_outputs/
│  │  │  ├─ Merged_Weights-6.7B-Q8_0
│  │  │  └─ Modelfile
│  │  ├─ llama-2-7b-chat/
│  │  └─ merged_weights/
│  │     ├─ chat_template.jinja
│  │     ├─ config.json
│  │     ├─ generation_config.json
│  │     ├─ model-00001-of-00003.safetensors
│  │     ├─ model-00002-of-00003.safetensors
│  │     ├─ model-00003-of-00003.safetensors
│  │     ├─ model.safetensors.index.json
│  │     ├─ special_tokens_map.json
│  │     ├─ tokenizer_config.json
│  │     ├─ tokenizer.json
│  │     └─ tokenizer.model
│  ├─ Mistral-7B-Instruct-v0.3/
│  └─ Qwen3-4B-Instruct-2507/
├─ datasets/
│  ├─ jsonl/
│  │  ├─ training_data_simple.jsonl
│  │  └─ training_data.jsonl
│  ├─ whatsapp_txt/
│  │  ├─ WhatsApp Chat with MI Aneek.txt
│  │  ├─ WhatsApp Chat with Vedantu Archit.txt
│  │  └─ WhatsApp Chat with VIS Jatav.txt
│  └─ scripts/
│     └─ whatsapp-to-alpaca/
│        ├─ simple_whatsapp_to_alpaca.py
│        └─ whatsapp_to_alpaca.py
├─ scripts/
│  ├─ llama.cpp/
│  ├─ finetuning.ipynb
│  └─ merge.ipynb
├─ venv/
└─ .gitignore
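
merged_weights and ollama_outputs in the tree are produced after training: the LoRA adapter from finetuned_weights is merged back into full weights, converted to GGUF with llama.cpp, and then wrapped by the Modelfile above (roughly what merge.ipynb covers). A sketch under those assumptions; the paths are guesses from the tree, and the llama.cpp checkout is assumed to live under scripts/:

# Merge the LoRA adapter into 16-bit weights with Unsloth's export helper.
python - <<'EOF'
from unsloth import FastLanguageModel
model, tokenizer = FastLanguageModel.from_pretrained(
    "models/llama-2-7b-chat/finetuned_weights",  # base + adapter (path is an assumption)
    max_seq_length=2048,
    load_in_4bit=True,
)
model.save_pretrained_merged(
    "models/llama-2-7b-chat/merged_weights", tokenizer, save_method="merged_16bit"
)
EOF

# Convert the merged HF checkpoint to a Q8_0 GGUF with llama.cpp's converter,
# then register it with Ollama using the Modelfile shown earlier.
python scripts/llama.cpp/convert_hf_to_gguf.py \
    models/llama-2-7b-chat/merged_weights \
    --outfile models/llama-2-7b-chat/ollama_outputs/Merged_Weights-6.7B-Q8_0.gguf \
    --outtype q8_0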