Julia: Obscure regression in 32-bit Julia 0.6.3

Created on 31 May 2018 · 8 comments · Source: JuliaLang/julia

In Julia 0.6.3, StatsBase.wmedian gives incorrect results on 32-bit. In the simplified function below, the result is 8.5 on 0.6.2 and 0.7.0-alpha, but 10.0 on 0.6.3. Everything is OK on 64-bit.

FWIW, the bug goes away just by adding @show statements.

function wmedian(v::AbstractVector{<:Real}, wt::AbstractVector{<:Real})
    midpoint = sum(wt)/2
    maxval, maxind = findmax(wt)
    if maxval > midpoint
        # a single weight dominates: its value is the weighted median
        v[maxind]
    else
        # walk the values in sorted order, accumulating weight
        permute = sortperm(v)
        cumulative_weight = zero(eltype(wt))
        i = 0
        for (_i, p) in enumerate(permute)
            i = _i
            if cumulative_weight == midpoint
                i += 1
                break
            elseif cumulative_weight > midpoint
                cumulative_weight -= wt[p]
                break
            end
            cumulative_weight += wt[p]
        end
        if cumulative_weight == midpoint
            # the midpoint falls exactly on a boundary: average the two straddling values
            middle(v[permute[i-2]], v[permute[i-1]])
        else
            middle(v[permute[i-1]])
        end
    end
end

julia> wmedian([1, 2, 4, 7, 10, 15], [1/3, 1/3, 1/3, 1, 1, 1])
10.0
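For reference, the expected 8.5 can be cross-checked by hand: the weights sum to 4, so the midpoint is 2.0, and the cumulative weight reaches exactly 2.0 at the value 7, so the median is the average of 7 and the next value, 10. A standalone sketch of that check (my own code, not StatsBase's):

```julia
v  = [1, 2, 4, 7, 10, 15]
wt = [1/3, 1/3, 1/3, 1, 1, 1]

p   = sortperm(v)           # v is already sorted, so p == 1:6
cw  = cumsum(wt[p])         # cumulative weights: 1/3, 2/3, 1.0, 2.0, 3.0, 4.0
mid = sum(wt) / 2           # 2.0

i = findfirst(w -> w >= mid, cw)   # first index reaching the midpoint (here 4)
# the midpoint lands exactly on a boundary, so average the straddling values
wm = cw[i] == mid ? (v[p[i]] + v[p[i+1]]) / 2 : float(v[p[i]])
# wm == 8.5
```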
32-bit regression


All 8 comments

Our 32-bit tests also fail with a weird last-digit difference in LineSearches.jl. I can report details later (I'm on mobile), but if I hardcode copies of the variables involved, the results are correct.

edit:
I haven't made this self-contained, but to show the problem I had the following in a function:

    a = ((phitest - phi_0)/alphatest - dphi_0)/alphatest  # quadratic fit
    @show a
    _a = 0.72
    _b = 2.0
    _c = 0.2
    _d = -8.0
    @show _a === phitest
    @show _b === phi_0
    @show _c === alphatest
    @show _d === dphi_0
    @show ((_a - _b)/_c - _d)/_c
    @show ((phitest - phi_0)/alphatest - dphi_0)/alphatest

with the output

a = 8.0
_a === phitest = true
_b === phi_0 = true
_c === alphatest = true
_d === dphi_0 = true
((_a - _b) / _c - _d) / _c = 8.000000000000002
((phitest - phi_0) / alphatest - dphi_0) / alphatest = 8.0

Pretty weird that the inputs compare identical with ===, yet the two calculations return different values!
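For what it's worth, 8.000000000000002 is exactly what strictly rounded Float64 arithmetic gives for these constants; a plausible explanation for the 8.0 result (my assumption, not established in this thread) is that the other code path kept intermediates in 80-bit x87 registers on 32-bit, so the rounding errors don't accumulate the same way:

```julia
# strict Float64: every operation rounds to 64 bits before the next one
x = ((0.72 - 2.0) / 0.2 - (-8.0)) / 0.2
# x == 8.000000000000002
# with 80-bit x87 intermediates, rounding happens only at the final store,
# and the same expression can come out as exactly 8.0
```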

Our 32-bit tests are failing because 0.5:0.1:0.7 is becoming 0.5:0.1:0.6. Maybe that's a smaller case to debug.

https://ci.appveyor.com/project/YingboMa/ordinarydiffeq-jl/build/1.0.62/job/o113qdrx1owjjfy2#L458
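The range case is nicely compact. Assuming the test just checks the length and endpoint (my sketch, not the actual OrdinaryDiffEq test), the correct behavior looks like:

```julia
r = 0.5:0.1:0.7
# on correct builds the range has three elements and ends at 0.7;
# the 0.6.3 32-bit binaries computed a length of 2 instead
length(r)     # 3
last(r)       # 0.7
collect(r)    # [0.5, 0.6, 0.7]
```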

Would someone with access to a 32-bit system be willing to run git bisect between v0.6.2 and v0.6.3? git bisect run with a minimal reproducing test case would make it easier. I'd do it myself but all of the machines I have access to are 64-bit.
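A git bisect run driver only needs to exit nonzero when the bug reproduces; a minimal sketch using the range case reported above (hypothetical script name):

```julia
# bisect_test.jl (hypothetical): exit 0 on good commits, nonzero on bad ones
ok = length(0.5:0.1:0.7) == 3 && last(0.5:0.1:0.7) == 0.7
ok || exit(1)   # git bisect run treats a nonzero exit status as "bad"
```

Something like `git bisect run ./julia bisect_test.jl` between v0.6.2 and v0.6.3 would then do the walking automatically.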

Sorry this didn't get caught. PackageEvaluator runs on 64-bit Ubuntu so it doesn't catch OS- or architecture-specific issues in packages.

I have tried replicating the problem on a 32-bit Fedora VM, and while I see it with the official binaries, I don't when building from source. I'm building with MARCH=pentium4, which AFAICT is what the official binaries use too. I'm not sure where the difference could come from.

For post-1.0 minor releases, we really need 32-bit PackageEvaluator runs (and ideally on several platforms, including Windows and OS X).

Any difference in the generated code? Julia lowering, LLVM IR, native code?

Good question. There are lots of small differences in the native code, but none in the LLVM IR. So it seems it's a bug in LLVM? See this gist, where the first "commit" contains results with the official binaries, and the second one results with my custom build.
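For anyone wanting to make the same comparison, each layer can be dumped per function. A toy sketch (hypothetical function, not the full wmedian; on 0.6 these macros live in Base, on 0.7+ in InteractiveUtils):

```julia
using InteractiveUtils  # provides @code_llvm / @code_native on Julia >= 0.7

f(x, y) = ((x - y) / 0.2 + 8.0) / 0.2

@code_llvm f(0.72, 2.0)    # LLVM IR: reported identical across the two builds
@code_native f(0.72, 2.0)  # native code: where the small differences showed up
```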

See #27402. New 32-bit binaries for Windows and Linux have been uploaded. Let me know if this fixes things for you.

Thanks, that did the trick!
