Thanks for sharing! A human-friendly summary of the gist of those results would be super helpful. This looks interesting, and you may have rekindled my interest in LibreWolf over BetterFox-hardened Firefox, but I'm not entirely clear on what your results mean in human terms.
Thank you for the kind words!
Ah, then the hope is that this curiosity will prompt you to dig into it yourself (for example by using the provided tool or taking inspiration from it) so that it starts making sense! I know it's an unconventional format to refrain from laying out my own opinions and analysis, but that's my thing today. There's so much "everyone knows" and so many vapid third-hand takes flying around these days that I think we would do well to actually verify (and pick up related knowledge in the process) rather than take forum comments and blog posts for gospel.
OK, all right, I can try. I guess I can point at one thing: the Mozilla telemetry at the very end. Doesn't that look very fine-grained if you look at the URLs (addresses) listed?
We can tell that many of the actions I took were communicated to the mothership for analysis and product improvement. Is this data really anonymized (or even anonymizable)? Is it a reasonable amount for a user who has not opted in? My professional and personal opinion: it is not.
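If you'd rather check that against the data than take my word for it, something along these lines will pull the telemetry submissions out of the HAR dump. It's only a sketch: `session.har` and the `incoming.telemetry.mozilla.org` host are my assumptions about the dump's name and Firefox's telemetry endpoint, not anything hard-coded in the tool.

```sh
# List every request that went to (what I believe is) Firefox's telemetry
# ingestion host; adjust the host filter and the .har file name as needed.
jq -r '.log.entries[].request.url
       | select(contains("incoming.telemetry.mozilla.org"))' session.har
```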
But! That's just one isolated example, and an extremely limited view. What about Zen? Chrome, Edge and Safari weren't included here at all. And it's not at all looking at what happens for a user who probably cares about this: one who goes into settings and disables all the telemetry. See, I just said that one thing about Mozilla telemetry, and now I'm going to have to run new tests and write reports about them for days just to set that record straight!
Maybe I'm odd, but I think it's many (100?) times easier and quicker to gain an understanding of the kind of stuff we're looking at here by getting hands-on than by having it explained verbally. And I'm concerned about the limited attention span so many people are afflicted with these days, and look at how long this comment is already. No, we're done with me telling you how it is; let's wrap this one up and get to the juicy stuff.
There's an expandable section, Basic test environment usage, under Testing procedure, but I realize now that it might be easy to miss...
Anyway, to start it: install podman and docker-compose (v2), then run `MITM_BROWSER=firefox-esr podman compose up --build`. That should be it.
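Spelled out as commands, it looks roughly like this. The install line is only an example for Debian/Ubuntu-ish systems; get podman and a compose v2 provider however your distro prefers.

```sh
# Install the prerequisites (package names vary by distro; make sure you end up with compose v2).
sudo apt install podman docker-compose

# From the checked-out repo: build the images and start the proxy + browser containers.
MITM_BROWSER=firefox-esr podman compose up --build
```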
Then the browser pops up (hopefully), you do your thing, and after you Ctrl+C in the console, it quits and the proxy dumps the recorded .har file, which contains all HTTP and websocket traffic that went through the proxy, in cleartext, as JSON. There are tools online that can help visualize it, I think, but nothing I can recommend off the bat. Simply `cat`ing it to the terminal or opening it in a text editor can be educational. Also try playing around with variations of the jq snippets and see if you can come up with questions of your own to answer. Or, if anything in my numbers makes you scratch your head or say "wait a minute", dig there.
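As a concrete starting point, here's the kind of one-liner I mean. Again a sketch: `session.har` stands in for whatever your dump ends up being named, and the paths simply follow the standard HAR layout (`.log.entries[]`).

```sh
# Count requests per host across the whole session, busiest hosts first.
jq -r '.log.entries[].request.url | sub("^https?://"; "") | split("/")[0]' session.har \
  | sort | uniq -c | sort -rn
```

From there you can vary the filter, e.g. select only URLs containing a host you're curious about, or pull out `.request.postData` to see what the payloads actually contain.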
In case you want to take a look at what the thing does before running it (trust me bro), these are the files involved when you run that compose up command:
- `compose.yml` - entrypoint
- `compose/proxy.compose.yml` - imported, for mitmproxy
- `Containerfile` (aka Dockerfile) - the browser