May-25-2020: update from the researcher:
Hi, I've done a retrospective analysis of Firefox versions, up until Firefox 76. All issues discussed in the original report appear to be resolved, except for one: the request initiated by the WebSocket API is not blocked [by Tracking Protection] when directed to a blacklisted domain
This issue is really just for myself, FYI. I'll fix the title one day, when I get the time to read the docs etc. and understand what is going on
Discussions
I had a quick skim of the reddit link and article last night, and
My assumption has always been that any persistent storage of website data can and will be used against you. That's why the default user.js essentially blocked everything (until recently, when we allowed first-party cookies). Anyway, that's all for now
PS: I do not care about ABP, Disconnect, SafeScript etc, I only care about uM, uBO (and Firefox)
@gorhill Thanks, feel free to chime in if and when you have time (and feel it necessary), just don't like talking behind yer back :)
Notes / observations: from https://wholeftopenthecookiejar.eu/
Edge (included because ... WTF?!!#&@# .. seriously? No QA team, I guess): "Enabling the option to block third-party cookies in Edge has no effect" - that's a bit of a big hole
FF: https://bugzilla.mozilla.org/show_bug.cgi?id=1447935 [Access Denied for now] - Various mechanisms can be leveraged to bypass Firefox Tracking Protection. This way, cross-site requests directed at blacklisted domains can be sent while this countermeasure is enabled
FF: https://bugzilla.mozilla.org/show_bug.cgi?id=1447933 [WONTFIX] - It is difficult for extension developers to distinguish requests initiated by browser background processes from requests initiated by websites.
FF: https://bugzilla.mozilla.org/show_bug.cgi?id=1433700 -> https://bugzilla.mozilla.org/show_bug.cgi?id=1453751 [Fixed in FF63 I think] - Requests to fetch the favicon are not interceptable by Firefox extensions
I also wanted to make a post about this but you beat me to it :)
It's nice to see that Gecko-based browsers did far better than the rest (at least for the most part). Good job, Mozilla! I mean, we already knew that or we wouldn't be using FF in the 1st place, but it's nice to see it confirmed once again.
One problem in FF and other Gecko-based browsers was this:
Contrasting to extensions of other browsers, almost every Firefox-based extension could be bypassed in the HTML category. In most cases, this was caused by a `<link>` element whose rel attribute was set to "shortcut icon". By further analyzing this case, we traced back the cause of this issue to an implementation bug in the WebExtension API. We found that the onBeforeRequest event did not trigger for requests originating from this link element. Although abusing this bug may not be straightforward, as it is only sent when a web page is visited for the first time, it does indicate that browsers exhibit small inconsistencies, which may often lead to unintended behavior.
This is bad because apparently it can leak cookies and could also be abused for tracking, e.g. with ETags. But the good news is that this issue was (allegedly) fixed in FF63. Probably worth checking if that's really the case.
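For context on the ETag vector: revalidation lets a server use the ETag itself as a cookie-less identifier. A minimal server-side sketch of the idea; all names here (`handleFaviconRequest`, the `uid-` format) are illustrative, not from the paper or any real server:

```javascript
// Sketch of ETag-based tracking: the server mints an identifier, hands it
// out as the ETag of e.g. a favicon, and the browser echoes it back in
// If-None-Match on every revalidation, re-identifying the visitor.
let nextId = 0;
function handleFaviconRequest(headers) {
  const echoed = headers["if-none-match"];
  if (echoed) {
    // Revalidation: the browser just re-identified itself, cookie-free.
    return { status: 304, trackedId: echoed };
  }
  const id = `"uid-${nextId++}"`; // ETags are quoted strings
  return { status: 200, etag: id, trackedId: id };
}

const first = handleFaviconRequest({});
const again = handleFaviconRequest({ "if-none-match": first.etag });
// first.status === 200 (new ID minted); again.status === 304 and
// again.trackedId === first.etag (same visitor recognized)
```

This is why an extension being unable to see favicon requests matters: it cannot strip the ETag or If-None-Match headers from them.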
Another problem they also mentioned is that extensions are prohibited from seeing requests from other extensions. I was already aware of that, and it's one of the things that bugs me the most with the current WE-APIs. Hopefully they'll change that behavior sooner rather than later, but I won't hold my breath :(
For extensions devs of privacy/security-related addons there's this important note if they want to make sure they'll catch everything:
In the JavaScript category, we found that most extensions could be bypassed with at least one technique: for the tracking extensions, only a single extension managed to block requests initiated by JavaScript.
Most prevalently, a bypass was made possible because of WebSocket connections. We found that a common mistake extension developers made was in the registration of the onBeforeRequest event. The bypassed extensions set the filter value to `["http://*/*", "https://*/*"]`, which would allow intercepting all HTTP requests, but not WebSockets, which use the `ws://` or `wss://` protocol. Hence, to be able to intercept all requests, the filter should include these protocols or use `<all_urls>`. Of course, the configuration of the manifest file should be updated accordingly.
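To make the quoted advice concrete, here is a sketch of why the narrow filter misses WebSocket traffic. In a real extension these patterns go into the filter argument of `browser.webRequest.onBeforeRequest.addListener()`; `schemeMatches` is my own simplified helper that only compares the scheme portion of a match pattern, not part of any API:

```javascript
// Simplified scheme check standing in for WebExtension match patterns.
function schemeMatches(patterns, url) {
  const scheme = new URL(url).protocol.slice(0, -1); // drop trailing ":"
  return patterns.some((p) =>
    p === "<all_urls>"
      ? // <all_urls> covers every scheme the API supports (simplified set)
        ["http", "https", "ws", "wss", "ftp", "file", "data"].includes(scheme)
      : p.startsWith(scheme + "://")
  );
}

const narrow = ["http://*/*", "https://*/*"];
const wide = [...narrow, "ws://*/*", "wss://*/*"];

schemeMatches(narrow, "wss://tracker.example/socket"); // false: bypassed
schemeMatches(wide, "wss://tracker.example/socket");   // true: intercepted
schemeMatches(["<all_urls>"], "wss://tracker.example/socket"); // true
```

The takeaway is the same as the paper's: either list `ws://*/*` and `wss://*/*` explicitly alongside the HTTP patterns, or just use `<all_urls>`.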
Same-site cookie - No bugs were found for Firefox’ implementation of this policy.
:1st_place_medal: :+1:
Most of the browsers that we evaluated have built-in support for suppressing cookies of third-party requests. Our results show that only the Gecko-based browsers (Firefox, Cliqz and Tor Browser) manage to do this successfully.
:+1: :+1:
although for 3rd-party cookie blocking FF still lacks (/leaks ?) in Redirects:

But if FF is used with uBO their Table 2 shows that Redirects makes "no request".
@gorhill is there something in uBO that blocks redirects or strips cookies from such requests? I'm not aware that there is such a thing.
Tracking Protection unfortunately failed miserably! An easy fix for most of the problems would be to block 3rd-party cookies/site-data when TP is enabled. @fmarier, is there anything you're allowed to tell us in regards to what you and your team make of these results, and maybe plans to fix those issues? I'd love to hear some insights!
is there something in uBO that blocks redirects or strips cookies from such requests?
No. According to the tests you can execute from their site, uBO would fail the "Redirects" category.
However, I think they may have run the test differently on their side.
This is how they describe the "Redirect" category:
Top-level redirects are often not regarded as cross-site requests, because stripping cookies from them would cause breakage of many existing websites. Nevertheless, we included them in our evaluation for the sake of completeness, because various scenarios exist in which top-level redirects can be abused. For instance, a tracker trying to bypass browser mitigations can listen for the blur event on the window element, which indicates that the user switched tabs. When receiving this event, the tracker could trigger a redirect to its own website in the background tab, which would capture information from the user and afterwards redirect him back to the original web page.
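The bypass the paper describes can be sketched in a few lines. This is written against a stand-in `win` object so the logic can run outside a browser; in a real page `win` would be `window`, and `tracker.example` is a placeholder:

```javascript
// Background-tab redirect trick: when the tab loses focus, silently
// navigate it to the tracker, which can later bounce the user back.
function installBlurTracker(win, trackerUrl) {
  win.addEventListener("blur", () => {
    // Tell the tracker where to redirect back to afterwards.
    const returnTo = encodeURIComponent(win.location.href);
    win.location.href = `${trackerUrl}?return=${returnTo}`;
  });
}

// Minimal fake window to demonstrate the flow.
const listeners = {};
const win = {
  location: { href: "https://example.com/article" },
  addEventListener(type, fn) { listeners[type] = fn; },
};
installBlurTracker(win, "https://tracker.example/collect");
listeners.blur(); // simulate the user switching tabs
// win.location.href is now the tracker URL with a return parameter
```

Because this is a top-level navigation, it carries first-party cookies, which is why 3rd-party cookie blocking alone doesn't stop it.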
On their site, the tests use EasyList's &act=ads_ to trigger blockers. This one filter won't trigger uBO's strict-blocking -- which is the blocking of a whole document. But typically tracking servers are filtered according to their hostname, and in such case uBO would be the only blocker to properly deal with such bypass. I suspect in their real tests they may have tested with &act=ads_$document to account for the fact that tracking servers are blocked through their hostname, not through some parameters in the URL. In such case, uBO is able to block top-level document (which does actually happen for all servers listed in filter lists), which is what the "Redirect" category is about.

That's the only explanation I can come up with for why uBO was deemed as having passed the "Redirect" category. If you wanted to get similar behavior in other blockers, you would be unable to, as far as I know.
Edit: From what I see in their framework source, "Redirect" category was tested using adition.com (see https://github.com/DistriNet/xsr-framework/blob/master/web/leaktest/s0/redirect/index.php), hence uBO would pass these tests since adition.com is filtered in EasyList and Peter Lowe's. adition.com is probably not used in their online tests because that would cause undue network traffic to the server.
^^ so the uBO pass mark was a false positive?
Not a false positive, a deserved pass mark -- see for yourself and try to navigate to https://adition.com/report/?leak=any, they used that server to run their tests. (you may have missed my update in my comment above yours).
Not a false positive, a deserved pass mark
it sounds like Redirects can be problematic depending on how a site uses it, and you said this yourself:
According to the tests you can execute from their site, uBO would fail the "Redirects" category.
I'd say the 1st line of defense is to enable 3rd-party cookie blocking in FF because that will stop everything in these tests except Redirects. Add uBO to that and Redirects should be covered in some or most cases. But it's not 100%, right?
What also remains problematic, AFAIK, is the issue with favicons, but that should be solved in FF63, when extensions are given the ability to see these requests and can, for example, strip ETags, block favicon requests to 3rd parties, strip cookies from such requests, or whatever else extensions want to do.
But it's not 100%, right?
Well if a tracking server is not in one of the lists then uBO's strict-blocking won't kick in.
This is why I've been pining for a WebExtension version of NoRedirect - earthlng, what was the bugzilla for that API you needed, the one with crickets and tumbleweeds? Maybe with this report we can push to get it done
Is this it: https://bugzilla.mozilla.org/show_bug.cgi?id=1352653 ? Would this allow you to build a web ext NoRedirect?
is there anything you're allowed to tell us in regards to what you and your guys make of these results and maybe plans to fix those issues?
Our plan is to fix the existing bypasses and then flip the default so that all requests go through TP unless specifically exempted (e.g. Firefox update). The details are in https://bugzilla.mozilla.org/show_bug.cgi?id=1207775.
@gorhill
Well if a tracking server is not in one of the lists then uBO's strict-blocking won't kick in.
Hm - I'm not sure that I understand. The Strict Blocking wiki page says that "uBO will block the web page served by a server found in one of the malware lists". But a tracking server is usually not on a malware list - so how could it trigger strict blocking? I'm confused. Or is that wiki page not up-to-date?
so how could it trigger strict blocking?
Any domain that is blocked with `$document`, like `||example.org^$document`, or blocked wholly, as `||example.org^`, will trigger it, because strict blocking is enabled by default.
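To make that concrete, a few illustrative filter forms (example.org as placeholder; `&act=ads_` is the EasyList filter from the discussion above, and `!` starts a comment in filter-list syntax):

```
! Navigating directly to the domain is strict-blocked (whole document):
||example.org^
||example.org^$document

! Matches only a URL parameter; a top-level navigation to the tracking
! server itself is NOT strict-blocked by this filter alone:
&act=ads_
```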
FYI: 1447935 is no longer access denied (not sure when that changed). Oh, didn't realize Francois opened up the ticket a month ago.
* AppCache API (caching a resource located on a blacklisted domain)
* Response headers (referring to the blacklisted domain)
- Link rel=next
- Link rel=prefetch
* HTML tags (referring to blacklisted domain)
- <link rel="shortcut icon" href="…">
- <link rel="apple-touch-icon image_src" href="…">
* EventSource API (referring to blacklisted domain)
* WebSocket API (opening a new web socket for the blacklisted domain)
* Fetch API, importScripts() used by ServiceWorker (referring to blacklisted domain)
Note: (in our user.js)
- browser.cache.offline.enable
- dom.serviceWorkers.enabled (or can control them in uM)

gorhill has us covered, assuming your config is "right"
What is the right config for uBO?
I'm not worried personally, as I lock down basically all persistent storage
How do you lock down all persistent storage?
What is the right config for uBO?
Nothing for you to do. Just config uBO however you like.
How do you lock down all persistent storage?
The cookie setting controls cookies, localStorage and IDB - and I block all cookies by default. AppCache is disabled. SWs are disabled (and "web workers" in uMatrix are default-blocked). And I clean on close, which most likely clears other shit, like cache, SW cache (if I had any), and other things
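For reference, a sketch of what that lock-down might look like as user.js prefs. The pref names are the standard Firefox ones as I understand them, not a copy of anyone's actual config, so double-check them against your release:

```js
// Lock-down sketch for the storage vectors discussed above.
user_pref("network.cookie.cookieBehavior", 2);          // block all cookies (also gates localStorage/IDB)
user_pref("browser.cache.offline.enable", false);       // disable AppCache
user_pref("dom.serviceWorkers.enabled", false);         // disable Service Workers
user_pref("privacy.sanitize.sanitizeOnShutdown", true); // clean on close
```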
updated OP