Julia: implement Float16 parsing in readdlm

Created on 17 May 2016 · 17 comments · Source: JuliaLang/julia

Try this. Create a text file with the contents:

0 1 2
3 2 1

Now try to load this file into Julia using readdlm(filename, Float16). It will give an error:

ERROR: at row 1, column 1 : ErrorException("file entry \"0\" cannot be converted to Float16")
in error at ./error.jl:21
in dlm_fill at datafmt.jl:315
in readdlm_string at datafmt.jl:272
in readdlm_auto at datafmt.jl:49
in readdlm at datafmt.jl:40
in readdlm at datafmt.jl:33

However readdlm(filename, Float64) works fine.
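For reference, here is what the working Float64 call looks like (a hypothetical session; data.txt is an assumed name for the file above):

julia> readdlm("data.txt", Float64)
2x3 Array{Float64,2}:
 0.0  1.0  2.0
 3.0  2.0  1.0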

This is problematic. In my particular case, I have a large matrix of small floating-point numbers, some of which are written as exact integers, and I don't want to load it as Float64 because of the memory cost.

I am using Julia v0.4.5.

Label: help wanted

Most helpful comment

NVIDIA GPUs do Float16 operations, and so does the Xeon Phi (as "half"s). In fact, on the Xeon Phi it works in SIMD vectorized operations to do double the number of calculations at a time as Float32. Just pointing that out because it means supporting Float16 can be crucial to targeting accelerator cards.

All 17 comments

Is there a workaround to load the matrix as Float16?

I think the problem here is that we don't have a native way of parsing Float16:

julia> parse(Float16,"0")
ERROR: MethodError: no method matching tryparse(::Type{Float16}, ::String)
Closest candidates are:
  tryparse{T<:Integer}(::Type{T<:Integer}, ::AbstractString, ::Int64)
  tryparse{T<:Integer}(::Type{T<:Integer}, ::AbstractString)
  tryparse(::Type{Float64}, ::String)
  ...
 in parse(::Type{Float16}, ::String) at ./parse.jl:159
 in eval(::Module, ::Any) at ./boot.jl:226

You could try reading it in as Float32.
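For example, a minimal sketch of that idea (assuming the file is saved as data.txt): read with a width readdlm can parse natively, then narrow the stored type.

# readdlm can parse Float32/Float64 natively; convert afterwards.
# Peak memory is briefly higher, but the result is stored as Float16.
A32 = readdlm("data.txt", Float32)
A16 = convert(Matrix{Float16}, A32)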

Alternatively, you could use the CSV.jl package, which would allow you to read it in as Float16; it actually parses the underlying value as a Float64 initially (using strtod), but then ensures a correct conversion to the storage type. Something along the lines of:

julia> csv = CSV.csv("/Users/jacobquinn/Downloads/test_float16.csv";delim=' ',types=[Float16,Float16,Float16])
Data.Table:
2x3 Data.Schema:
    col1,    col2,    col3
 Float16, Float16, Float16

    Column Data:
    Float16[0.0,3.0]
    Float16[1.0,2.0]
    Float16[2.0,1.0]

Float16 is not technically a "computational type" (according to IEEE), but we've gradually moved in the direction of just supporting all operations on it. If someone wants to implement Float16 parsing, that would be cool.

@StefanKarpinski Does that mean computations with Float16 may be slower than Float32/Float64?

Yes, since they are not implemented in any hardware that I'm aware of. We do Float16 operations by converting to Float32 and back.
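Concretely, the round-trip looks something like this (a sketch of the observable behavior, not the actual Base definitions):

a = Float16(1.5)
b = Float16(2.25)
# Arithmetic widens to Float32, computes, then narrows back to Float16:
a + b == Float16(Float32(a) + Float32(b))  # true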

NVIDIA GPUs do Float16 operations, and so does the Xeon Phi (as "half"s). In fact, on the Xeon Phi it works in SIMD vectorized operations to do double the number of calculations at a time as Float32. Just pointing that out because it means supporting Float16 can be crucial to targeting accelerator cards.

Ah, I didn't know this was a common type to be supported on GPUs. All the more reason to treat it as a first-class computational type.

When you say parsing, do you mean something like 1.0h3 === Float16(1000.0) (where h = half, from OpenCL, but other choices are possible)? Would this be Julia syntax or just in functions to read from a text file?

I think the specific case we've been discussing here is implementing parse(Float16, str::String)

OK, thanks. Of course, if it were to be treated as a "first-class computational type"... but I understand there is reluctance to add more to the language parser.

?

import Base: parse

# Workaround: parse the text as Float64 (which Base supports natively),
# then narrow the result to Float16.
function parse(::Type{Float16}, str::String)
    fp = 0.0
    try
        fp = parse(Float64, str)
    catch
        throw(ArgumentError(string("invalid number format \"", str, "\" for Float16")))
    end
    return convert(Float16, fp)
end
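
With that definition loaded, a quick check (hypothetical REPL session):

julia> parse(Float16, "0")
Float16(0.0)

julia> parse(Float16, "abc")
ERROR: ArgumentError: invalid number format "abc" for Float16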

has been fixed in a2a6d182dc2

It does seem to be fixed, but GitHub (and I) can't find that commit?

sorry, that was a tree hash. now corrected

Thanks, I think this can be closed then.
