| Disk Cache Engine | Addressing Pattern |
|---|---|
| Block-based | (fd, offset, buffer) |
| Object-based | (key, buffer) |
| IO Engine | Compatible Devices | Acceptable Addressing Pattern |
|---|---|---|
| Psync | FS, File | (fd, offset, buffer) |
| Uring | FS, File | (fd, offset, buffer) |
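The two addressing patterns in the tables above can be sketched as Rust traits. This is a minimal illustration only: the names (`BlockDevice`, `ObjectStore`, `MemBlock`, `MemObject`) and signatures are assumptions made for the sketch, not foyer's actual API.

```rust
use std::collections::HashMap;

/// Block-based engines address data by (fd, offset, buffer);
/// the in-memory stand-in below drops the fd for brevity.
trait BlockDevice {
    fn write_at(&mut self, offset: u64, buf: &[u8]);
    fn read_at(&self, offset: u64, len: usize) -> Vec<u8>;
}

/// Object-based engines address data by (key, buffer).
trait ObjectStore {
    fn put(&mut self, key: &str, buf: &[u8]);
    fn get(&self, key: &str) -> Option<Vec<u8>>;
}

/// Hypothetical in-memory block device: a flat byte array indexed by offset.
struct MemBlock {
    data: Vec<u8>,
}

impl BlockDevice for MemBlock {
    fn write_at(&mut self, offset: u64, buf: &[u8]) {
        let end = offset as usize + buf.len();
        if self.data.len() < end {
            self.data.resize(end, 0);
        }
        self.data[offset as usize..end].copy_from_slice(buf);
    }

    fn read_at(&self, offset: u64, len: usize) -> Vec<u8> {
        self.data[offset as usize..offset as usize + len].to_vec()
    }
}

/// Hypothetical in-memory object store: a key -> bytes map.
struct MemObject {
    map: HashMap<String, Vec<u8>>,
}

impl ObjectStore for MemObject {
    fn put(&mut self, key: &str, buf: &[u8]) {
        self.map.insert(key.into(), buf.to_vec());
    }

    fn get(&self, key: &str) -> Option<Vec<u8>> {
        self.map.get(key).cloned()
    }
}

fn main() {
    // Block-based: the caller must track where each entry lives.
    let mut blk = MemBlock { data: Vec::new() };
    blk.write_at(4096, b"entry");
    assert_eq!(blk.read_at(4096, 5), b"entry");

    // Object-based: the engine resolves placement from the key.
    let mut obj = MemObject { map: HashMap::new() };
    obj.put("cache-key", b"entry");
    assert_eq!(obj.get("cache-key").as_deref(), Some(&b"entry"[..]));
}
```

The practical difference: with (fd, offset, buffer) addressing the cache must maintain its own index from keys to disk locations, while (key, buffer) addressing pushes that bookkeeping into the engine.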
```python
#!/usr/bin/env ./venv/bin/python
import json
import os
import subprocess
import sys
from pathlib import Path

# Automatically regenerate the latest qbt_files.json on each run.
with open("qbt_files.json", "w") as f:
    subprocess.run(["qbt", "torrent", "list", "--format", "json"], stdout=f)
```
```rust
// Copyright 2025 foyer Project Authors
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
```
```bash
#!/bin/bash
set -ueo pipefail

TEST_DIR=$1
echo "benchmark disk mounted on ${TEST_DIR}"

echo "--> write throughput"
sudo fio --name=write_throughput --directory="${TEST_DIR}" --numjobs=8 \
  --size=10G --time_based --runtime=60s --ramp_time=2s --ioengine=libaio \
```
```
~/github/tsuna/contextswitch $ make all
gcc -pthread -march=native -O3 -mno-avx -D_XOPEN_SOURCE=600 -D_GNU_SOURCE -std=c99 -W -Wall -Werror -lrt -lpthread timectxsw.c -o timectxsw
gcc -pthread -march=native -O3 -mno-avx -D_XOPEN_SOURCE=600 -D_GNU_SOURCE -std=c99 -W -Wall -Werror -lrt -lpthread timectxswws.c -o timectxswws
gcc -pthread -march=native -O3 -mno-avx -D_XOPEN_SOURCE=600 -D_GNU_SOURCE -std=c99 -W -Wall -Werror -lrt -lpthread timesyscall.c -o timesyscall
gcc -pthread -march=native -O3 -mno-avx -D_XOPEN_SOURCE=600 -D_GNU_SOURCE -std=c99 -W -Wall -Werror -lrt -lpthread timetctxsw.c -o timetctxsw
gcc -pthread -march=native -O3 -mno-avx -D_XOPEN_SOURCE=600 -D_GNU_SOURCE -std=c99 -W -Wall -Werror -lrt -lpthread timetctxsw2.c -o timetctxsw2
./cpubench.sh
model name : AMD Ryzen 9
```
```
Running benches/chained_spawn.rs (target/release/deps/chained_spawn-c8dc8c936ec4fa3d)
Gnuplot not found, using plotters backend
chained_spawn/yatp::future/256
                        time:   [684.35 µs 684.77 µs 685.25 µs]
Found 4 outliers among 100 measurements (4.00%)
  3 (3.00%) high mild
  1 (1.00%) high severe
chained_spawn/yatp::callback/256
                        time:   [550.01 µs 551.86 µs 553.61 µs]
```
```rust
use std::time::{Duration, Instant};

use bcc::perf_event::PerfMapBuilder;
use bcc::{BccError, Kprobe, Kretprobe, BPF};
use itertools::Itertools;
use tokio::fs::read_to_string;
use tokio::sync::oneshot;

use crate::Args;
```
```bash
rm -rf /p44pro/foyer && cargo build --release && \
  RUST_BACKTRACE=1 RUST_LOG=warn ./target/release/foyer-storage-bench \
  --dir /p44pro/foyer --capacity 102400 --file-size 64 --get-range 10000 \
  --flushers 12 --reclaimers 12 --time 60 --writers 64 --w-rate 8 \
  --ticket-insert-rate-limit 500 --readers 1024 --r-rate 1 --runtime \
  --metrics --compression none --distribution zipf --distribution-zipf-s 0.8 \
  --disable-direct
```

```
Total:
disk total iops: 1818.0
disk total throughput: 402.4 MiB/s
disk read iops: 90.5
disk read throughput: 6.3 MiB/s
```
```rust
use std::{
    hint::black_box,
    sync::{
        atomic::{AtomicUsize, Ordering},
        Arc,
    },
    time::{Duration, Instant},
};

use itertools::Itertools;
```
```rust
use std::{collections::HashMap, fs::File, path::Path};

use csv::StringRecord;
use itertools::Itertools;

/// Open a CSV file and collect all of its records, panicking on I/O or parse errors.
fn open(path: impl AsRef<Path>) -> Vec<StringRecord> {
    let f = File::open(path).unwrap();
    let mut reader = csv::Reader::from_reader(f);
    // Headers are parsed here; the truncated remainder of the original presumably uses them.
    let _headers = reader.headers().cloned().unwrap();
    let records = reader.records().map(|res| res.unwrap()).collect_vec();
    records
}
```