Tooly McToolface / benchmarks

How Tooly measures itself.

Most "free web tool" comparison pages are marketing in disguise. This one measures three things that actually matter — page weight, bytes uploaded, and task latency — and shows our numbers honestly, including where we're slower.

01 · Page weight
How heavy is each tool?

Every tool in the workshop is a single HTML file with inline CSS and JavaScript. No frameworks, no bundlers, no tree-shaking. The file you download is the file that runs. Here are the real sizes, measured with wc -c against the deployed files.

| Tool | HTML size | Gzipped | What fits inside |
| --- | --- | --- | --- |
| HEIC Liberator | 38 KB | ~9 KB | UI + drag-drop logic (libheif loaded on demand) |
| Image Smusher | 40 KB | ~10 KB | Canvas-based compression for PNG, JPG, WebP, AVIF |
| Favicon Foundry | 41 KB | ~11 KB | Canvas resizer + in-browser ICO writer + ZIP builder |
| JSON Fixer | 49 KB | ~12 KB | Parser, formatter, tree view, JSONC stripping |
| Timestamp Converter | 54 KB | ~13 KB | Full IANA timezone list + parsers for 5 formats |
| Format Flipper | 97 KB | ~28 KB | 9 format parsers bundled inline (JSON, YAML, TOML, CSV, TSV, XML, MD, HTML, SQL) |
Methodology
Raw byte counts are from the deployed HTML on Netlify. Gzipped estimates assume standard gzip HTTP compression; Netlify's default Brotli compression is slightly better, so these numbers are conservative. No external JavaScript except the Plausible analytics script (~1 KB) and Google Fonts CSS.

For comparison: a typical "free online tool" site running on a JavaScript framework with ads and analytics delivers 800 KB to 3 MB of JavaScript on first load. Tooly's heaviest tool is Format Flipper at 28 KB gzipped — and it parses nine file formats in nine different directions.

Why this matters

A 30 KB page loads over a 3G connection in roughly 200 ms. A 3 MB single-page app takes 8-15 seconds on the same connection. On office Wi-Fi at 50 Mbps you won't notice the difference. On the subway in Brooklyn or a train in rural Germany, you absolutely will.
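Those load-time figures follow from simple arithmetic. A rough sketch, ignoring round-trip latency, TCP slow start, and compression, so real-world loads sit somewhat above this floor:

```javascript
// Back-of-envelope transfer time: bytes over a link of a given speed.
// Ignores latency, TCP slow start, and compression -- a lower bound.
function transferMs(bytes, mbps) {
  return ((bytes * 8) / (mbps * 1e6)) * 1000;
}

// ~30 KB page vs a 3 MB bundle on a ~1.5 Mbps 3G-class link.
console.log(Math.round(transferMs(30 * 1024, 1.5)), "ms");       // 164 ms
console.log(Math.round(transferMs(3 * 1024 * 1024, 1.5)), "ms"); // 16777 ms
```

The gap only widens once latency and parse time are added: a 3 MB bundle still has to be decompressed and executed after it arrives, while a 30 KB page is essentially done when the bytes land.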

Tools should work when your internet is bad. That's when you need them most.

02 · Upload bytes
How many of your bytes leave your device?

The easy benchmark. For every tool in the workshop, the answer is the same: zero. Not "encrypted," not "we delete them after processing," not "GDPR compliant." Zero bytes of your files ever leave your browser.

0 bytes uploaded
Your files, your device, your control. Every tool runs entirely on your CPU via the File API and Canvas / Blob primitives.

0 cookies set
No session tracking, no advertising IDs, no "preferences" stored server-side. The only storage is localStorage, for per-tool settings you actually chose.

0 accounts required
No signup, no email, no free-trial limit, no "log in to export." The tools either work on load or they don't.
How to verify
Open any tool. Press F12, go to the Network tab, enable "Preserve log." Drop a file onto the tool, use it, export. Count the network requests originating from your file: there won't be any. The only requests you'll see are page-load requests (HTML, CSS, fonts) and one Plausible analytics ping.
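The same check can be scripted from the DevTools console. A minimal sketch using the standard resource-timing entries of the Performance API; the grouping-by-origin idea is illustrative, not a feature of the tools themselves:

```javascript
// Sketch for the verification step: tally the page's network requests
// by origin using standard resource-timing entries. Paste into the
// DevTools console on any tool page; on a Tooly page the only origins
// you should see are the page's own plus the Plausible analytics host.
function requestsByOrigin() {
  const counts = {};
  for (const entry of performance.getEntriesByType("resource")) {
    const origin = new URL(entry.name).origin;
    counts[origin] = (counts[origin] || 0) + 1;
  }
  return counts;
}

console.log(requestsByOrigin()); // empty outside a browser page context
```

Drop a file and use the tool before running it: any upload would show up as a new entry, and none ever does.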

This isn't a privacy promise. It's an architectural fact. There is literally no server in the pipeline that could receive your file. Netlify serves a static HTML file to your browser; from that moment forward, the tool runs on your machine. Tooly couldn't see your files if we wanted to.

03 · Task latency
How fast does the work finish?

This is the harder honest benchmark, because it depends on your device. Compressing a 4K image on a 2024 MacBook is wildly different from the same task on a 2019 Android phone. Instead of making a number up, this page runs an actual benchmark in your browser, right now.

[Live benchmark widget, running in your browser: page load time, image compress (1024×1024 test), JSON parse (50 KB doc), and timezone math (1000 conversions), each reported in milliseconds.]
These numbers come from your own CPU, right now. Refresh to rerun. No data is sent anywhere — the benchmark, like the tools, runs entirely in your browser.
Methodology
Each benchmark runs after the page is interactive. Image compression generates a 1024×1024 procedural image and runs canvas.toBlob(callback, 'image/webp', 0.85). JSON parse builds a 50 KB synthetic document and runs JSON.parse. Timezone math runs 1000 conversions across 10 different IANA zones using the same logic Timestamp Converter uses. All times are taken with performance.now().
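The JSON-parse leg of that methodology is easy to reproduce yourself. A sketch of the harness, assuming a synthetic document of this shape (the live benchmark's exact document differs); it runs in both browsers and Node, where performance is a global:

```javascript
// Build a ~50 KB synthetic JSON document, then time JSON.parse
// with performance.now(), mirroring the benchmark described above.
function buildDoc(targetBytes) {
  const rows = [];
  let i = 0;
  let doc = "[]";
  while (doc.length < targetBytes) {
    rows.push({ id: i++, name: `item-${i}`, tags: ["a", "b"], active: i % 2 === 0 });
    doc = JSON.stringify(rows); // re-measure until we cross the target size
  }
  return doc;
}

function timeParse(doc) {
  const t0 = performance.now();
  const parsed = JSON.parse(doc);
  return { ms: performance.now() - t0, items: parsed.length };
}

const doc = buildDoc(50 * 1024);
console.log(timeParse(doc));
```

On most modern hardware the parse itself lands well under a millisecond, which is why the live widget reports low single-digit numbers.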

A server-based tool has to add network latency to every operation: 100-300 ms round-trip on a good connection, more on mobile. Tooly's latency is exactly what your browser takes — nothing more. For most tasks in the workshop that means under 50 ms, which is faster than the blink of an eye.
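For the timezone leg, the core operation is rendering one UTC instant in multiple IANA zones. A sketch using the standard Intl.DateTimeFormat API; the zone list and formatting options here are illustrative, not the exact set Timestamp Converter uses:

```javascript
// Render one UTC instant as a wall-clock time in several IANA zones.
const zones = ["UTC", "America/New_York", "Europe/Berlin", "Asia/Tokyo"];

function wallClock(epochMs, timeZone) {
  return new Intl.DateTimeFormat("en-GB", {
    timeZone,
    hour: "2-digit",
    minute: "2-digit",
    hour12: false,
  }).format(new Date(epochMs));
}

const noonUtc = Date.UTC(2024, 0, 15, 12, 0); // 15 Jan 2024, 12:00 UTC
for (const z of zones) console.log(z, wallClock(noonUtc, z));
// Tokyo is UTC+9 year-round, so it prints 21:00 for this instant.
```

Because the browser ships the full IANA database behind Intl, a thousand of these conversions is cheap, and no timezone data ever has to be fetched from a server.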

04 · Where we're not the fastest
Honest caveats.

Three places where Tooly is measurably slower than the best paid alternative:

| Task | Tooly | Best paid alternative | Why |
| --- | --- | --- | --- |
| Image compression ratio | Canvas + default WebP/AVIF encoder | Squoosh's ported MozJPEG / libavif at highest effort | Browser encoders optimize for speed, not peak ratio. For ~5% better compression, Squoosh is still the answer. |
| Big HEIC batches | libheif.js in-browser (~500 ms per photo) | Native iOS Photos.app or cloud service | Browser WASM is impressive but no match for native hardware HEIC decode. Good enough for <50 files. |
| Huge JSON docs | Native JSON.parse in the main thread | Server-streaming parser (e.g. oboe.js) on a Node backend | Files over 50 MB will freeze the tab briefly. We're not a data pipeline. |

The workshop is optimized for the 95% case: tasks that happen ten times a week, on files measured in megabytes not gigabytes, where spending three minutes searching for the right tool costs more than the task itself. If you're processing a thousand images nightly, you want a command-line pipeline, not a browser tab. Tooly doesn't pretend otherwise.

05 · The real benchmark
What actually matters.

Raw speed numbers are fun to quote, but they're not the point. The real benchmark is: how quickly does the tool stop wasting your time? That breaks down into five questions that no performance monitor can measure for you:

Tooly's answer to every one of those is no-friction by design. That's the benchmark we actually care about. The page-weight and latency numbers are just the easy-to-measure shadow of that larger commitment.

These benchmarks are measured honestly and updated when the tools change. If you find a number that doesn't match reality — a tool that feels slower than claimed, or a page that's heavier than listed — tell me. Accuracy matters more than looking good.