@nns2009
Created September 1, 2025 17:57
Script to export all metadata (filename, date taken, size, etc.) from all media (photos and videos) in your Google Photos library
/*
Script to export all metadata (filename, date taken, size, etc.) from all media (photos and videos) in your Google Photos library.
Based on this script from Reddit user "HalBenHB":
https://www.reddit.com/r/googlephotos/comments/1lf6dz0/a_script_to_export_all_your_photo_video_metadata/
but enhanced to save all of the downloaded data and, importantly, to be resumable after failures.
The post at the link above also contains a script to export data of specific albums only (I didn't use or enhance it).
Uses "Google Photos Toolkit"
https://github.com/xob0t/Google-Photos-Toolkit
Instructions:
1) Install "Google Photos Toolkit" using the link above
2) Navigate to photos.google.com
3) Open your browser's Developer Console (usually by pressing F12).
4) Paste the entire script below into the console to launch the metadata download and export for the first time
5) If an error occurs, wait a while and let Google Photos rest. Then execute just this single line:
exportEntireLibrary();
DON'T paste/run the entire script again - that would wipe the variables holding the data you've accumulated so far.
You may re-run everything from the line "async function exportEntireLibrary() {" onward in case you make adjustments to the function.
Just don't re-run the global variable declarations - that wipes them clean.
6) The browser may prompt you for permission to save files
7) If, on the contrary, you want to start from scratch: re-run the entire script
(Setting `_continue = false` is intended to make `exportEntireLibrary()` restart from scratch on its own, but re-running the entire script is the sure way to reset everything.)
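If you DID accidentally re-run the whole script, note that the previous values of all eleven globals are snapshotted into `window._exportStateBackup` (by the `structuredClone` block near the top) just before the `let` declarations wipe them. A minimal restore sketch, using stand-in values in place of the real backup - in the actual console you would destructure all eleven variables straight from `window._exportStateBackup` and then call `exportEntireLibrary()`:

```javascript
// Stand-ins so this sketch is self-contained; in the real console these
// globals already exist (declared by the script):
let allMediaInfo, chunkI, fali_allItems, fali_nextPageId;
const _exportStateBackup = { // stand-in for window._exportStateBackup
  allMediaInfo: [{ fileName: 'IMG_0001.jpg' }], chunkI: 9,
  fali_allItems: [], fali_nextPageId: null,
};
// Destructuring assignment restores the wiped globals; the parentheses
// are required because the statement starts with '{':
({ allMediaInfo, chunkI, fali_allItems, fali_nextPageId } = _exportStateBackup);
console.log(chunkI); // 9
```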
Three files will be saved:
- Google_Photos_allLibraryItems.json (downloaded first; no filenames or file sizes, but contains some other data)
- Google_Photos_allMediaInfo.json (fetched in separate batch requests; no resolutions, and some other fields are missing)
- Google_Photos_Library_Export.csv (same data as `allMediaInfo.json`, reformatted as .csv as in the original script)
In my case (~64,000 media files) the respective file sizes are 34 MB, 22 MB, and 9.5 MB.
Note:
The script doesn't save intermediate progress to disk.
If you close or refresh the page, all of the data you've downloaded is gone and you'll need to start afresh.
I tested in Yandex Browser 25.6 (Chromium based) using Tampermonkey (https://www.tampermonkey.net/index.php?browser=chrome).
My collection turned out to be ~64,000 media items (more than expected).
The first launch loaded the full `allLibraryItems` (no filenames there) and 9 of the 13 5000-item chunks of `allMediaInfo`,
but failed on the 10th chunk with:
> Error in EWgK9e request: TypeError: Failed to fetch
so the ability to continue proved useful.
The second (continuation) launch finished successfully.
*/
let _continue = true; // if 'false', will restart from scratch every time you call 'exportEntireLibrary'
let _exportTest = false; // Export just 50 items, as a test that the script still works
// Backup of the previous values, in case you do accidentally re-run the entire script.
try {
window._exportStateBackup = structuredClone({
allLibraryItems,
mediaKeys,
mediaKeyChunks,
allMediaInfo, chunkI,
formattedData,
csvContent,
exportFinished,
fali_allItems,
fali_nextPageId,
fali_pageCount,
})
} catch (_err_) { } // first run: the variables don't exist yet, so referencing them throws a ReferenceError
let allLibraryItems; // Affected by '_continue'
let mediaKeys;
let mediaKeyChunks;
let allMediaInfo, chunkI; // Affected by '_continue'
let formattedData;
let csvContent;
let exportFinished;
let fali_allItems; // Affected by '_continue'
let fali_nextPageId; // Affected by '_continue'
let fali_pageCount; // Affected by '_continue'
// Re-running the rest of the script from this point won't cause variable data getting wiped
async function exportEntireLibrary() {
if (_continue && exportFinished) {
console.warn(`Nothing to continue, export already fully finished. Aborting`);
return;
}
const INFO_CHUNK_SIZE = _exportTest ? 50 : 5000;
if (!window.gptkApi) { console.error("Google-Photos-Toolkit core API not found."); return; }
console.log("--- Starting Full Library Export ---");
console.warn("This process will be very long for large libraries. Please be patient.");
try {
allLibraryItems = (_continue ? allLibraryItems : undefined) ?? await fetchAllLibraryItems(); // ternary, not &&: `false ?? x` evaluates to `false`, which would skip the refetch
if (!allLibraryItems || allLibraryItems.length === 0) { console.log("Library is empty."); return; }
downloadAsFile('Google_Photos_allLibraryItems.json', JSON.stringify(allLibraryItems, null, 2), 'data:application/json;charset=utf-8;');
console.log(`--- Fetching detailed metadata for ${allLibraryItems.length} items ---`);
mediaKeys = allLibraryItems.map(item => item.mediaKey);
mediaKeyChunks = splitIntoChunks(mediaKeys, INFO_CHUNK_SIZE);
console.log(`Data will be fetched in ${mediaKeyChunks.length} chunk(s).`);
allMediaInfo = (_continue ? allMediaInfo : undefined) ?? []; // ternary, not &&, so _continue === false actually resets these
chunkI = (_continue ? chunkI : undefined) ?? 0;
for (; chunkI < (_exportTest ? 1 : mediaKeyChunks.length); chunkI++) {
console.log(`Fetched details of ${chunkI}/${mediaKeyChunks.length} chunks ...`);
const chunkResult = await gptkApi.getBatchMediaInfo(mediaKeyChunks[chunkI]);
allMediaInfo.push(...chunkResult);
}
console.log(`Fetched details of ${chunkI}/${mediaKeyChunks.length} chunks`);
console.log("Formatting data into CSV format...");
formattedData = allMediaInfo.map(item => ({
"Filename": item.fileName, "Description": item.descriptionFull, "Media_Key": item.mediaKey,
"Date_Taken": item.timestamp ? new Date(item.timestamp).toISOString() : null,
"Date_Uploaded": item.creationTimestamp ? new Date(item.creationTimestamp).toISOString() : null,
"Size_Bytes": item.size, "Takes_Up_Space": item.takesUpSpace, "Space_Consumed_Bytes": item.spaceTaken,
"Is_Original_Quality": item.isOriginalQuality,
"Timezone_Offset": item.timezoneOffset,
}));
csvContent = convertToCsv(formattedData);
downloadAsFile('Google_Photos_Library_Export.csv', csvContent, 'data:text/csv;charset=utf-8;');
downloadAsFile('Google_Photos_allMediaInfo.json', JSON.stringify(allMediaInfo, null, 2), 'data:application/json;charset=utf-8;');
exportFinished = true;
console.log("--- ✅ Full Library Export Process Complete! ---");
} catch (error) { console.error("A critical error occurred:", error); }
}
async function fetchAllLibraryItems() {
// Ternary guards (not &&) so that _continue === false restarts from scratch
fali_allItems = (_continue ? fali_allItems : undefined) ?? [];
fali_nextPageId = (_continue ? fali_nextPageId : undefined) ?? null;
fali_pageCount = (_continue ? fali_pageCount : undefined) ?? 0;
console.log("Fetching all library items page by page...");
do {
fali_pageCount++; const page = await gptkApi.getItemsByUploadedDate(fali_nextPageId);
if (page?.items?.length > 0) { console.log(` - Page ${fali_pageCount}: ${page.items.length} items. Total: ${fali_allItems.length + page.items.length}`); fali_allItems.push(...page.items); }
fali_nextPageId = page?.nextPageId;
if (_exportTest) break;
} while (fali_nextPageId);
return fali_allItems;
}
function splitIntoChunks(array, chunkSize) {
const chunks = []; for (let i = 0; i < array.length; i += chunkSize) chunks.push(array.slice(i, i + chunkSize)); return chunks;
}
function convertToCsv(data) {
  if (data.length === 0) return "";
  const headers = Object.keys(data[0]);
  const rows = data.map(obj => headers.map(header => {
    const v = obj[header];
    if (v === null || v === undefined) return '';
    const s = String(v);
    // Quote fields containing commas, double quotes, or newlines (per RFC 4180)
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  }).join(','));
  return [headers.join(','), ...rows].join('\n');
}
function downloadAsFile(filename, text, mimeType) {
  // Triggers a download via a temporary <a href="data:..."> link
  // (handled ~34 MB files fine in the author's test)
  const e = document.createElement('a');
  e.setAttribute('href', `${mimeType},` + encodeURIComponent(text));
  e.setAttribute('download', filename);
  e.style.display = 'none'; document.body.appendChild(e); e.click(); document.body.removeChild(e);
}
exportEntireLibrary();