uBlock: 4chan malware domains getting past uBlock

Created on 26 Dec 2017 · 18 Comments · Source: gorhill/uBlock

As some of you may know, 4chan's owner Hiro added a script to 4chan that connected users to several malware domains. Luckily, the great Gorhill fixed this issue, but now it seems to have come back. Several users have reported that the malware domains are somehow connecting again when browsing 4chan with frames on. Blocking these domains in the filter list seems to be the only way to stop them for good.

http://www.4chan.org/frames


My settings
Windows 7 64-bit
Browser: Firefox 56.0
uBlock Origin version: 1.14.23b2

My filter lists

Default filter lists + Adguard base filters

Custom filters:
This is the list of domains that must be blocked to prevent them from connecting from your browser:

0.0.0.0 n0-r98d2.amgload.net
0.0.0.0 n1-r98d2.amgload.net
0.0.0.0 n2-r98d2.amgload.net
0.0.0.0 n3-r98d2.amgload.net
0.0.0.0 n4-r98d2.amgload.net
0.0.0.0 kz1d.piguiqproxy.com
0.0.0.0 kz1c.piguiqproxy.com
0.0.0.0 kz6d.piguiqproxy.com
0.0.0.0 kz6c.piguiqproxy.com
0.0.0.0 kz9c.piguiqproxy.com
0.0.0.0 kz9d.piguiqproxy.com
0.0.0.0 n0-r99d2.piguiqproxy.com
0.0.0.0 n3-r99d2.piguiqproxy.com
0.0.0.0 n6-r99d2.piguiqproxy.com
0.0.0.0 n7-r99d2.piguiqproxy.com
0.0.0.0 xk1n.amgload.net
0.0.0.0 xk6n.amgload.net
0.0.0.0 xk9n.amgload.net
0.0.0.0 xk9o.amgload.net
0.0.0.0 xk1o.amgload.net
0.0.0.0 xk2o.amgload.net
0.0.0.0 xk3o.amgload.net
0.0.0.0 xk6o.amgload.net
0.0.0.0 zx6s.smcheck.org
0.0.0.0 zx1s.smcheck.org
0.0.0.0 utraffic.engine.adglare.net
0.0.0.0 n2-r99d2.piguiqproxy.com
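
Side note: the entries above are in hosts-file syntax, which uBO accepts as-is. Equivalent static network filters would be shorter and would also catch any new subdomains; a sketch, assuming it is acceptable to block the parent domains outright:

```
||amgload.net^
||piguiqproxy.com^
||smcheck.org^
||adglare.net^
```

Note that ||adglare.net^ covers the whole adglare.net ad platform, not just the utraffic.engine subdomain listed above, so it may be broader than intended.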

duplicate


All 18 comments

Use this page only to report bugs, i.e. problems that make uBlock Origin stop working. Please close this issue and read this carefully before posting here again: CONTRIBUTING.md. Report domains to be blocked to the appropriate filter maintainers.

My bad. I'll close this issue.

But where should I report this then? Gorhill was the one who added the fix to the filters, so I thought this was the correct place to post this question.

Since it's a filter list issue, you can report it at uBlockOrigin/uAssets.

From the link @anewuser posted:

For filter-related issues, report on the respective filter list support site, or at uBlockOrigin/uAssets. Use the logger to diagnose/confirm filter-related issues. If something does not work properly with uBO enabled, the first step is to rule out filter-related issues.

I've been working on a solution for that class of issues in the past week, which is essentially to bring back script tag filtering to Firefox while working to fix #3069.

I have been using that site as a test case, and it works, even better than the legacy implementation as I have broadened the feature to filter out any DOM element (not just script tag) before the browser is allowed to parse the document.

Can't reproduce. Probably because of 4chan.org##script:inject(abort-current-inline-script.js, String.fromCharCode, /[0-9a-f]{40}..$/) present in uBlock Filters.

It would have to be investigated by the OP, but I suspect some may be suffering from a race condition that prevents the scriptlet from doing its job, especially those with other extensions injecting content scripts at document_start.

This is the main problem currently with script:inject, and also what #3069 is meant to solve for Firefox. Once this lands, uBO for Firefox will be way ahead of Chromium in terms of capabilities and reliability.

Indeed. The same filter has no effect on https://tvgid.ua/, so yes, a race condition is largely at play here. Any chance of it landing on Chromium?

Any chance of it landing on Chromium?

Only if they implement webRequest.filterResponseData: https://bugs.chromium.org/p/chromium/issues/detail?id=487422. If they implemented this, uBO-Extra would no longer be needed.

That one aims at enabling reading the response body; will it also enable modifying the response?

On Firefox you can read and modify, yes. You can do so asynchronously too, which allows you to stream the data to the parser. You always have the option to read the entire body, process it, and dump it to the parser, although that would have a performance impact.

I can still hard-code rules for Chromium; not much has changed since last year.
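
To illustrate the read-and-modify flow, here is a minimal sketch of Firefox's filterResponseData pattern. The scrubInlineScripts helper is a hypothetical stand-in, far cruder than uBO's real HTML filtering, and the URL patterns are placeholders:

```javascript
// Hypothetical scrubber: drop inline <script> elements whose body matches
// a needle, before the document ever reaches the HTML parser.
function scrubInlineScripts(html, needleRe) {
  return html.replace(
    /<script\b[^>]*>([\s\S]*?)<\/script>/gi,
    (tag, body) => (needleRe.test(body) ? '' : tag)
  );
}

// Sketch of wiring it to the response stream in a Firefox extension
// (requires the "webRequest" and "webRequestBlocking" permissions).
// This buffers the whole body for simplicity; streaming chunks through
// filter.write() as they arrive would perform better, as noted above.
function installHtmlFilter(needleRe) {
  browser.webRequest.onBeforeRequest.addListener(details => {
    const filter = browser.webRequest.filterResponseData(details.requestId);
    const decoder = new TextDecoder('utf-8');
    const encoder = new TextEncoder();
    let buffered = '';
    filter.ondata = event => {
      buffered += decoder.decode(event.data, { stream: true });
    };
    filter.onstop = () => {
      filter.write(encoder.encode(scrubInlineScripts(buffered, needleRe)));
      filter.disconnect();
    };
  }, { urls: ['*://boards.4chan.org/*'], types: ['main_frame', 'sub_frame'] },
     ['blocking']);
}
```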

I removed those host entries from my filters and blocked first- and third-party scripts/frames in the advanced menu. This seems to have stopped the domains from connecting, but the site's CSS does still break occasionally. I think Gorhill is right that this is a race condition.

Several other Firefox users are reporting the same problem on the technology board. It seems that it's even worse for Chrome users.
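
For reference, the point-and-click blocking described here corresponds to dynamic filtering rules roughly like the following (a sketch of uBO's rule syntax; scoping everything to 4chan.org is an assumption):

```
4chan.org * 3p-script block
4chan.org * 3p-frame block
4chan.org * 1p-script block
```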

Try 4chan.org##script:inject(abort-current-inline-script.js, XMLHttpRequest, 7c9e3a5d51cdacfcbe39622b93f069c13b55e7ae)

I have been using this one on 9anime too.
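
For context, abort-current-inline-script.js works by redefining the named property (here XMLHttpRequest) so that a read from an inline script whose text matches the needle throws, aborting that script. A minimal sketch of the idea; the currentSource callback is a stand-in for document.currentScript and exists only to make the sketch self-contained:

```javascript
// Sketch of the abort-current-inline-script idea: redefine a property so
// that a matching inline script aborts itself when it reads the property.
function abortOnMatch(owner, propName, currentSource, needleRe) {
  const original = owner[propName];
  Object.defineProperty(owner, propName, {
    get() {
      // The real scriptlet inspects document.currentScript.textContent;
      // currentSource() is a hypothetical stand-in for that.
      if (needleRe.test(currentSource())) {
        throw new ReferenceError(propName + ' read aborted');
      }
      return original;
    },
  });
}
```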

@uBlock-user thanks for the help. I tried that filter, but it didn't work.
I think Gorhill is right and a race condition is messing with the script:inject command, but I don't know what could be causing it. I turned off Tampermonkey and I'm not running any custom scripts.

The weird thing is that this seems to happen only when browsing 4chan with frames on. Maybe that's what's causing the filter trouble? So far the only way to stop these domains from connecting is blocking first- and third-party scripts/frames.

Works for me on Chromium; have you tried testing there?

Fixed with a9f68fe02f40.

The fix works only for Firefox, however, using uBO 1.14.23b3, and it then becomes a filter issue: 4chan.org##^script:has-text(7c9e3a5d51cdacfc).

Why not make the syntax $$ instead of ##^ so Adguard rules can be parsed?

I did consider it at first, and finally decided against it because the code to scan a filter for ##/#@# is already there. When that scan fails, the filter is deemed to be a static network filter.

Supporting a $$ anchor would require another scan of the filter before deciding whether it is a static network filter -- the most common occurrence. As I decided before, I prefer to reuse ##/#@#, just as I did for scriptlet injection, for the same reason. At least Adguard's and ABP's #%#, #$# and #?# do not cause extra scanning.

Using a special character following the already handled ##/#@# opens the door to supporting more new variations in the future (so long as they do not introduce ambiguity with CSS selector syntax), without having to keep adding and dealing with new sorts of anchors.
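
The scanning order described above can be sketched as follows; classifyFilter and its category labels are hypothetical illustrations, not uBO's actual parser code:

```javascript
// Sketch of the single-scan dispatch described above: one search for the
// cosmetic anchor "##"/"#@#"; when it fails, the filter is deemed a
// static network filter (the most common case).
function classifyFilter(raw) {
  const match = /#@?#/.exec(raw);
  if (match !== null) {
    // The character(s) right after the anchor select the sub-type:
    // a leading "^" marks an HTML filter, "script:inject" a scriptlet.
    const suffix = raw.slice(match.index + match[0].length);
    if (suffix.startsWith('^')) return 'html';
    if (suffix.startsWith('script:inject')) return 'scriptlet';
    return 'cosmetic';
  }
  return 'network';
}
```

A $$ anchor would instead force a second scan of every filter line before this network-filter fallback could be taken.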

Marking as duplicate of #3069, hence fixed for Firefox 57+.

For Chromium-based browsers, there is no solution until their extensions API allows for modifying the response body data on the fly.

