Your Browser Is the Snitch
Cookies are the part of tracking everyone has heard of. Fingerprinting is the part that works after you reject them, clear them, run a VPN, and switch to a private window. Here is exactly how your browser sells you out — and what stops it.
The premise of fingerprinting
Every browser exposes hundreds of small facts about the device it is running on: the GPU, the OS version, the installed font list, the audio stack, the timezone, the available memory, the screen resolution. Individually, none of these are identifying. Combined, they form a near-unique signature that does not require any cookie, login, or local storage to follow you between sites.
The EFF's Cover Your Tracks tool (formerly Panopticlick) has been quantifying this since 2010. In a typical run on a desktop browser without anti-fingerprinting protections, the collected entropy is between 18 and 22 bits: enough to single you out of roughly 260,000 to 4 million people. On the global web that is uncomfortably specific.
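To make the bits concrete: n bits of identifying information partition users into about 2^n equally likely buckets, so your fingerprint alone singles you out of roughly 2^n people. A back-of-the-envelope sketch (`setSize` is an illustrative helper, not part of any tool):

```javascript
// n bits of entropy => ~2^n people share your exact fingerprint bucket.
const setSize = (bits) => 2 ** bits;

console.log(setSize(18)); // 262144
console.log(setSize(22)); // 4194304
```

This is why "bits of identifying information" is the number that matters on fingerprint-testing sites: each extra bit halves the crowd you can hide in.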
The five primary signals
1. Canvas fingerprinting
When the browser draws text or graphics to an HTML5 <canvas> element, the resulting pixel buffer depends on the GPU driver, the font rasterizer, sub-pixel anti-aliasing settings, and the OS rendering pipeline. Two devices drawing the same string render almost, but not exactly, the same image. Hashing the pixel data yields a stable identifier.
// The classic canvas fingerprint, ~6 lines
const ctx = document.createElement('canvas').getContext('2d');
ctx.textBaseline = 'top';
ctx.font = '14px Arial';
ctx.fillStyle = '#069';
ctx.fillText('how-quickly-daft-jumping-zebras-vex', 2, 15);
// sha256 here stands in for any digest, e.g. crypto.subtle.digest('SHA-256', ...)
const hash = sha256(ctx.canvas.toDataURL());
That hash is highly stable for a given browser+GPU+OS combination and survives cache clearing, VPN switches, and incognito mode.
2. WebGL fingerprinting
The browser exposes the GPU vendor and renderer string via WEBGL_debug_renderer_info: e.g. "ANGLE (NVIDIA, NVIDIA GeForce RTX 3080)". Combine that with shader compilation timing and a rendered pixel hash of a 3D scene and you get an extremely high-entropy signal. Tor Browser disables WebGL by default for exactly this reason.
3. AudioContext fingerprinting
The Web Audio API runs an oscillator through a compressor and reads back the resulting waveform. Floating-point math in the audio pipeline differs by CPU and OS, producing a stable hash. It is silent, runs in a few milliseconds, and is invisible to the user.
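The browser setup (an OfflineAudioContext driving an OscillatorNode through a DynamicsCompressorNode) is omitted here; what matters is the reduction step, which collapses the rendered buffer into a stable value. A minimal sketch, with `audioFingerprint` as an illustrative name:

```javascript
// Reduce the rendered audio buffer to one number. Tiny floating-point
// differences between CPU/OS audio pipelines shift the sum, so the result
// is stable per device but differs across devices.
function audioFingerprint(samples) {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += Math.abs(samples[i]);
  return sum.toFixed(6);
}

// With identical input the reduction is identical — determinism is the point:
audioFingerprint(new Float32Array([0.25, -0.5, 0.125])); // "0.875000"
```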
4. Font enumeration
JavaScript cannot ask "list every installed font" directly anymore, but it can render strings in a candidate font and measure the bounding box. If the box differs from the fallback font's box, the font is installed. Run this against a list of 500 common fonts and you get a discriminating set: corporate-issued laptops, Adobe Creative Suite users, and Asian-market devices all leave very distinct shapes.
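The width-comparison trick can be sketched independently of the canvas. `isFontInstalled` and `measure` are illustrative names; in a real browser, `measure` would wrap `ctx.measureText(...).width` with the candidate set as the canvas font.

```javascript
// Render the same probe string in '<candidate>, <fallback>' and in the
// fallback alone. If the widths differ, the fallback wasn't used, so the
// candidate font must be installed.
function isFontInstalled(candidate, fallback, measure) {
  const probe = 'mmmmmmmmmmlli'; // wide + narrow glyphs amplify differences
  const withCandidate = measure(probe, `72px ${candidate}, ${fallback}`);
  const withFallback = measure(probe, `72px ${fallback}`);
  return withCandidate !== withFallback;
}

// Fake measurer standing in for the canvas: pretend only 'Arial' exists.
const fakeMeasure = (text, font) =>
  font.startsWith('72px Arial') ? text.length * 40 : text.length * 43;

isFontInstalled('Arial', 'monospace', fakeMeasure);     // true
isFontInstalled('Helvetica', 'monospace', fakeMeasure); // false
```

Looped over a 500-font candidate list, each probe costs a fraction of a millisecond, which is why the whole scan finishes before the page looks loaded.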
5. The HTTP entropy layer
Even before JavaScript runs, your TLS handshake exposes a JA3/JA4 fingerprint based on cipher suite ordering, extensions, and supported groups. Your User-Agent, Accept-Language, and Sec-CH-UA Client Hints add more bits. Header order itself is identifying — curl, Firefox, and Chrome each emit headers in a recognizable sequence.
What actually defeats it
The good news is that the defense space is well-understood. The bad news is that most browser extensions claiming to "block fingerprinting" do nothing useful, because randomizing an attribute on every pageload raises your entropy rather than lowering it. You become more identifiable, not less.
The two strategies that work:
- Uniformity (the Tor approach): Make every user of the browser look identical. Tor Browser ships a fixed window size, fixed font list, fixed timezone (UTC), and disables WebGL, Canvas readback, and AudioContext by default. Every Tor user fingerprints the same.
- Per-origin keyed noise (the Brave approach): Brave's Farbling introduces deterministic noise into Canvas, WebGL, and AudioContext readbacks, keyed to a per-session, per-eTLD+1 secret. The same site sees a stable fingerprint within a session (so you don't break apps); a different site sees a different one. Cross-site linking breaks.
Firefox's privacy.resistFingerprinting (RFP) flag is the closest thing to Tor mode in a mainstream browser. It letterboxes the viewport to rounded dimensions (new windows open at up to 1000×1000), sets the timezone to UTC, and clamps a long list of APIs. It works, but it breaks enough sites that most users disable it within a day.
The fingerprinters in production
This is not theoretical. Three companies dominate commercial fingerprinting:
| Vendor | Used by | Signals |
|---|---|---|
| FingerprintJS | Major banks, ticket sellers, ad networks | 50+ including canvas, audio, fonts, math constants |
| ThreatMetrix (LexisNexis) | Most US banks, KYC providers | Device intelligence + behavioral |
| Iovation (TransUnion) | Lending, gambling, e-commerce | Device reputation graph spanning ~7B devices |
Their stated use case is fraud prevention. Their actual deployment is broader: ad networks buy access to the same identifiers and use them for cross-site re-identification. The legal distinction between "anti-fraud telemetry" and "behavioral advertising" is mostly a contract detail.
What you can do today
- Test yourself. Run coveryourtracks.eff.org and amiunique.org in your everyday browser. Note the entropy. Repeat in your "private" browser. The number that matters is "bits of identifying information," not "looks unique."
- Pick a uniformity browser for sensitive use: Tor Browser (highest), Mullvad Browser (Tor without the Tor network), or Brave with Strict shields.
- Stop installing extensions in your privacy browser. Every extension changes your navigator attributes, your CSP behavior, and sometimes your DOM. uBlock Origin is the rare exception worth the entropy.
- Disable JavaScript on sites that don't need it. NoScript or Tor's "Safer" mode collapses most fingerprinting to the HTTP layer alone.
- Don't randomize. If a tool claims to "shuffle your fingerprint on every visit," it is making you a unicorn. You want to be a sheep.
The bigger picture
Fingerprinting is a structural problem with the web platform, not a bug in any one browser. Browsers added these APIs (Canvas, WebGL, AudioContext, sensors) for legitimate reasons, and removing them now would break enormous swaths of the web. The realistic future is a continued back-and-forth: vendors add noise or restrictions, fingerprinters find new signals, vendors patch them. The W3C's Privacy CG is the current battleground — and it is far from settled.
For now: assume the browser is a snitch. Treat it like one. Use a uniformity browser when it matters, your everyday browser when it doesn't, and don't pretend a VPN alone is doing anything to stop this.