Chrome v64 DevTools Audits panel hangs forever without returning a result.
The Lighthouse v2.8.0 extension throws this error:
VM204:5 Refused to connect to 'http://example.com/' because it violates the following Content Security Policy directive: "default-src 'none'". Note that 'connect-src' was not explicitly set, so 'default-src' is used as a fallback.
__nativePromise.resolve.then._ @ VM204:5
Promise.then (async)
(anonymous) @ VM204:5
wrapInNativePromise @ VM204:3
(anonymous) @ VM204:21
VM204:5 Refused to connect to 'http://example.com/' because it violates the document's Content Security Policy.
Setting Content-Security-Policy: default-src 'none'; connect-src 'self' would work around the issue, but Lighthouse shouldn't break or depend on connect-src 'self' being present.
A related issue was opened previously: https://github.com/GoogleChrome/lighthouse/issues/2319
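For reference, a minimal sketch of that workaround (an assumed Node http demo server just for illustration, not part of my actual stack): relaxing connect-src to 'self' lets in-page requests to the page's own origin through while default-src 'none' still blocks everything else.

import { createServer } from "node:http";

createServer((_req, res) => {
  // Hypothetical demo server: same strict policy, but with connect-src 'self'
  // so fetch() calls to the page's own origin (e.g. /robots.txt) are allowed.
  res.setHeader(
    "Content-Security-Policy",
    "default-src 'none'; connect-src 'self'"
  );
  res.setHeader("Content-Type", "text/html");
  res.end("<!doctype html><title>CSP workaround demo</title>");
}).listen(8080);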
The same error also appears in getRobotsTxtContent (Lighthouse v3.0.3), resulting in "robots.txt is not valid" (see the sketch after the stack trace below):
Lighthouse was unable to download your robots.txt file
Refused to connect to 'https://example.com/robots.txt' because it violates the following Content Security Policy directive: "default-src 'none'". Note that 'connect-src' was not explicitly set, so 'default-src' is used as a fallback.
getRobotsTxtContent @ VM1773:7
__nativePromise.resolve.then._ @ VM1773:17
Promise.then (async)
(anonymous) @ VM1773:5
wrapInNativePromise @ VM1773:3
(anonymous) @ VM1773:31
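What seems to be happening, roughly (a hypothetical sketch of an in-page fetch, not Lighthouse's actual source): under default-src 'none' with no connect-src, the request is refused by the CSP before it is ever sent, so the promise rejects and the audit never sees an HTTP response.

// Hypothetical reconstruction of the in-page robots.txt fetch; the names here
// are illustrative, not Lighthouse internals.
async function getRobotsTxtContentSketch(): Promise<string | null> {
  try {
    const response = await fetch(new URL("/robots.txt", location.href).href);
    return response.ok ? await response.text() : null;
  } catch {
    // A CSP refusal lands here as a TypeError ("Failed to fetch"),
    // which ends up reported as "robots.txt is not valid".
    return null;
  }
}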

Any ETA on when this will be fixed?
https://web.dev/measure breaks on the same error, showing an invalid message.

You shouldn't say the error is on the customer's side when it is actually a Lighthouse incompatibility bug.
No ETA; this requires fetching outside the context of the page, which is not something Lighthouse can do in most environments at the moment.
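To illustrate the distinction (a sketch under the assumption of a Node-side runner; this is not an existing Lighthouse API): a request made from the runner process rather than from inside the page is never subject to the page's CSP.

import { get } from "node:https";

// Hypothetical runner-side helper: the request is made by Node, so the
// document's Content-Security-Policy cannot block it.
function fetchRobotsTxtOutsidePage(origin: string): Promise<string> {
  return new Promise((resolve, reject) => {
    get(new URL("/robots.txt", origin), (res) => {
      res.setEncoding("utf8");
      let body = "";
      res.on("data", (chunk: string) => (body += chunk));
      res.on("end", () => resolve(body));
    }).on("error", reject);
  });
}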
It took a while to discover this issue, and FWIW, a more descriptive error message would be helpful when the CSP blocks the request, as opposed to an HTTP error response (see the sketch below).
At a minimum, it would be helpful to document this case for the check under "Common errors include:"
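The two failure modes are distinguishable from inside the page, which is what would make a more descriptive message feasible (a hypothetical sketch, not a proposed Lighthouse patch): a CSP-blocked fetch rejects before any response exists, while an HTTP failure still resolves with a status code.

// Hypothetical helper; the name and wording are illustrative only.
async function describeRobotsTxtFailure(url: string): Promise<string | null> {
  try {
    const response = await fetch(url);
    if (response.ok) return null; // downloaded successfully
    return `robots.txt request failed with HTTP status ${response.status}`;
  } catch {
    // fetch() rejects (TypeError) when the request is blocked before being
    // sent, e.g. by the page's Content Security Policy or a network failure.
    return "robots.txt request was blocked before it was sent (possibly by the page's Content-Security-Policy)";
  }
}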