@rathwell
Created March 2, 2026 22:47
```tsx
import { useState } from 'react';

interface Measurement {
  id: number;
  time: string;
  value: number;
}

function MeasurementViewer() {
  const [measurements, setMeasurements] = useState<Measurement[]>([]);
  const [stats, setStats] = useState({
    downloaded: 0,
    decompressed: 0,
    itemCount: 0,
    timeToFirstItem: 0
  });
  const [loading, setLoading] = useState(false);

  async function loadData() {
    setLoading(true);
    const startTime = Date.now();
    let firstItemTime = 0;

    try {
      // Step 1: Get a presigned URL from your API
      const response = await fetch('/api/large-query', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ startDate: '2024-01-01' })
      });
      const { url } = await response.json();

      // Step 2: Fetch directly from S3 using the presigned URL
      const s3Response = await fetch(url);

      // Step 3: Decompress the stream as it downloads.
      // Count compressed bytes BEFORE decompression: the chunks that come
      // out of DecompressionStream are already decompressed, so measuring
      // them would conflate the two numbers.
      let downloadedBytes = 0;
      let decompressedBytes = 0;
      const countCompressed = new TransformStream<Uint8Array, Uint8Array>({
        transform(chunk, controller) {
          downloadedBytes += chunk.byteLength;
          controller.enqueue(chunk);
        }
      });
      const decompressedStream = s3Response.body!
        .pipeThrough(countCompressed)
        .pipeThrough(new DecompressionStream('gzip'));
      const reader = decompressedStream.getReader();
      const decoder = new TextDecoder();
      let buffer = '';
      let totalItems = 0;

      // Step 4: Read chunks as they arrive
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        // We have data! This happens DURING the download, not after
        decompressedBytes += value.byteLength;
        buffer += decoder.decode(value, { stream: true });

        // Step 5: Parse complete JSON objects
        // (assuming NDJSON format: one object per line; the last,
        // possibly partial, line is carried over to the next chunk)
        const lines = buffer.split('\n');
        buffer = lines.pop() || '';
        const newItems: Measurement[] = lines
          .filter(line => line.trim())
          .map(line => JSON.parse(line));

        if (newItems.length > 0) {
          // Step 6: Display items IMMEDIATELY
          if (firstItemTime === 0) {
            firstItemTime = Date.now() - startTime;
            console.log(`First items visible after ${firstItemTime}ms!`);
          }
          // Track the count locally; reading measurements.length here
          // would hit a stale closure value.
          totalItems += newItems.length;
          setMeasurements(prev => [...prev, ...newItems]);
          setStats({
            downloaded: downloadedBytes,
            decompressed: decompressedBytes,
            itemCount: totalItems,
            timeToFirstItem: firstItemTime
          });
        }
      }

      // Process any remaining data (flush the decoder, then the buffer)
      buffer += decoder.decode();
      if (buffer.trim()) {
        setMeasurements(prev => [...prev, JSON.parse(buffer)]);
      }
    } catch (error) {
      console.error('Error streaming data:', error);
    } finally {
      setLoading(false);
    }
  }

  return (
    <div>
      <button onClick={loadData} disabled={loading}>
        Load Large Dataset
      </button>

      {stats.timeToFirstItem > 0 && (
        <div style={{
          background: '#d4edda',
          padding: '10px',
          margin: '10px 0',
          borderRadius: '4px'
        }}>
          <strong>βœ“ First data visible in {stats.timeToFirstItem}ms</strong>
          <br />
          <small>
            (Download still in progress:{' '}
            {(stats.downloaded / 1024 / 1024).toFixed(2)} MB compressed,{' '}
            {(stats.decompressed / 1024 / 1024).toFixed(2)} MB decompressed)
          </small>
        </div>
      )}

      <div>
        <p>Items loaded: {measurements.length}</p>
        {loading && <p>Streaming data...</p>}
      </div>

      <div style={{
        height: '500px',
        overflow: 'auto',
        border: '1px solid #ccc'
      }}>
        {measurements.map(m => (
          <div key={m.id} style={{
            padding: '8px',
            borderBottom: '1px solid #eee',
            animation: 'slideIn 0.2s ease-in'
          }}>
            <strong>ID:</strong> {m.id} |{' '}
            <strong>Time:</strong> {m.time} |{' '}
            <strong>Value:</strong> {m.value}
          </div>
        ))}
      </div>
    </div>
  );
}
```
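The carry-over buffering in Step 5 is the subtle part: a chunk boundary can land in the middle of a JSON object, so the last line is held back until the next chunk completes it. That logic can be isolated as a pure function for testing; the name `parseNdjsonChunk` is hypothetical, not part of the gist.

```typescript
// Hypothetical helper extracting the Step 5 buffering logic: takes the
// carried-over partial line plus a new decoded chunk, returns the complete
// parsed objects and the new partial line to carry forward.
function parseNdjsonChunk<T>(
  buffer: string,
  chunk: string
): { items: T[]; rest: string } {
  const lines = (buffer + chunk).split('\n');
  // The last element is '' if the chunk ended on a newline, otherwise a
  // partial line; either way it becomes the next carry-over buffer.
  const rest = lines.pop() ?? '';
  const items = lines
    .filter(line => line.trim())
    .map(line => JSON.parse(line) as T);
  return { items, rest };
}
```

Feeding the `rest` of one call into the next call reassembles objects split across chunk boundaries, which is exactly what the component's `buffer` variable does across `reader.read()` iterations.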
## What Actually Happens
```
Timeline for a 50MB dataset (5MB gzipped):
0ms: User clicks "Load"
100ms: Presigned URL received from Lambda
200ms: S3 fetch starts
500ms: First 256KB chunk arrives ← DECOMPRESSION STARTS
501ms: First chunk decompressed (2MB uncompressed)
502ms: First 1000 items parsed
503ms: USER SEES FIRST DATA! πŸŽ‰ ← While still downloading!
800ms: Second chunk arrives
801ms: More items appear in UI
1200ms: Third chunk arrives
1201ms: Even more items appear
...
5000ms: Download complete (5MB transferred)
5001ms: All data visible
Traditional approach would show nothing until 5000ms+
```
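This timeline assumes the S3 object is gzip-compressed NDJSON: one JSON object per line, compressed as a whole. A minimal sketch of producing such a payload on the server side (e.g. in the Lambda that writes the object), using Node's built-in `zlib`; `toGzippedNdjson` is a hypothetical helper, and the row shape mirrors the component's `Measurement` interface:

```typescript
import { gzipSync, gunzipSync } from 'node:zlib';

interface Measurement {
  id: number;
  time: string;
  value: number;
}

// Hypothetical producer: serialize rows as newline-delimited JSON,
// then gzip the whole payload before uploading it to S3.
function toGzippedNdjson(rows: Measurement[]): Buffer {
  const ndjson = rows.map(r => JSON.stringify(r)).join('\n') + '\n';
  return gzipSync(Buffer.from(ndjson, 'utf8'));
}
```

One detail worth noting: if the object were stored with a `Content-Encoding: gzip` header, the browser's `fetch` would transparently decompress the body itself, and piping the already-decompressed stream through `DecompressionStream('gzip')` would fail. Storing the object without that header keeps decompression explicit, as the component assumes.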
## Console Output You'll See
```
First items visible after 503ms!
Items loaded: 1000 (download: 0.25 MB, decompressed: 2.1 MB)
Items loaded: 2500 (download: 0.6 MB, decompressed: 5.2 MB)
Items loaded: 5000 (download: 1.2 MB, decompressed: 10.5 MB)
Items loaded: 8000 (download: 2.0 MB, decompressed: 16.8 MB)
...
Items loaded: 50000 (download: 5.0 MB, decompressed: 50 MB) - Complete!
```