Currently, building native Python modules seems to be a one-way street.
One can create the native module quite easily via:
cc_library(
    name = "foo",
    srcs = ["foo.cpp"],
    deps = ["@pybind11//:pybind11"],
)
but when trying to use it as a dependency, it is of course missing the py provider:
py_test(
    name = "testme",
    srcs = ["test.py"],
    deps = [":foo"],
)
Sadly, I also cannot add the cc_library to the srcs of a py_library.
Now my question is: how would one express this dependency?
For py_test to work properly, you would want the :foo directory to be added to imports.
Sadly, imports is currently not a provider but a simple string list. One possibility would be to introduce a PyImportsProvider. Looking at Attribute.Builder, it only works with a single Type, so adding the ability to have another Type would perhaps mean extending Attribute.Builder to work on a tuple of Types.
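For illustration, here is a minimal sketch in modern Starlark of what such a provider could look like. Note that PyImportsProvider does not exist in Bazel; every name below is hypothetical:

```python
# Hypothetical sketch only: none of these names exist in Bazel.
PyImportsInfo = provider(fields = {"imports": "depset of import path strings"})

def _py_native_library_impl(ctx):
    # Merge this rule's own import paths with those of its dependencies.
    transitive = [
        dep[PyImportsInfo].imports
        for dep in ctx.attr.deps
        if PyImportsInfo in dep
    ]
    return [PyImportsInfo(imports = depset(ctx.attr.imports, transitive = transitive))]

py_native_library = rule(
    implementation = _py_native_library_impl,
    attrs = {
        "deps": attr.label_list(),
        "imports": attr.string_list(),
    },
)
```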
Should imports be reworked?
So does py_library have to be extended?
Is a new py_native_library needed?
This might be related to, or depend on, https://github.com/bazelbuild/bazel/issues/146.
I've gotten this to work by building the extension as a cc_binary shared object (linkstatic = 1, linkshared = 1), and adding it in the 'data' section of py_library rules that depend on it.
How does using cc_binary differ from cc_library?
The data dependency does work but does not provide the import paths AFAIK.
AFAIK, the exact output of cc_library shouldn't be relied on: it could produce shared objects, a static library, or both, and the naming scheme isn't exactly pretty. It also may not be linked against the third-party libraries in your linkopts? Not sure about that.
Have you tried making a wrapper py_library in the same package? Something like this:
py_library(name = "foo", data = [":foo.so"], imports = ["."])
cc_binary(name = "foo.so", linkstatic = 1, linkshared = 1, srcs = [....])
You could make a macro to make it less awkward.
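For example, a minimal macro sketch (in a .bzl file; the name py_extension and its signature are made up):

```python
# Illustrative macro pairing the shared-object cc_binary with a py_library
# wrapper, so dependents only ever see the py_library.
def py_extension(name, srcs, deps = [], **kwargs):
    native.cc_binary(
        name = name + ".so",
        srcs = srcs,
        deps = deps,
        linkshared = 1,
        linkstatic = 1,
    )
    native.py_library(
        name = name,
        data = [":" + name + ".so"],
        imports = ["."],
        **kwargs
    )
```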
The problem arises as soon as you no longer have foo.so but @ext//some/completely/other:path.so. I haven't yet figured out how to get that into imports.
Sure -- that's why I suggested using the wrapper py_library. You can put imports on that, and the data dependency (the .so) should get pulled in transitively.
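In BUILD terms, the suggested wrapper would look something like the sketch below; the external label is taken from the comment above, and the right imports value is an assumption that depends on where the .so ends up in runfiles:

```python
py_library(
    name = "ext_foo",
    # The external .so gets pulled into runfiles transitively via data.
    data = ["@ext//some/completely/other:path.so"],
    # Assumption: adjust this to the directory the .so actually occupies in
    # runfiles; "." only works if it lands next to this package.
    imports = ["."],
)
```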
@ioctl0 Thanks that works fine for me.
@aehlig foo.so depends on
cc_library(
    name = "filesystem",
    srcs = ["lib/libboost_filesystem.so"],
    # Had to add this for the .so to get handled in runfiles. Is this by design or a bug?
    data = ["lib/libboost_filesystem.so"],
)
and unless I add the .so to data, it does not get copied to runfiles. Is that intended?
Are you building libboost_filesystem as a Bazel target, or including it in your build as a binary in your source tree? If the former, you could put it in `deps`, in which case Bazel will automatically statically link it into your .so, as long as you specified `linkstatic`. If the latter, you can probably remove it from `srcs`.
@ioctl0 The latter.
I think that kind of config is going to cause all kinds of trouble -- you might do better if you can include a static library built with -fPIC?
@ioctl0 It does. And boost_filesystem is easy to statically link. We do, however, have the same situation for freeimage, and statically linking that is really non-trivial. Currently there is no way around it, sadly.
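For what it's worth, a minimal sketch of the -fPIC static-archive approach suggested above; the archive path, and the assumption that it was built with -fPIC, are illustrative:

```python
# Sketch: wrap a prebuilt static archive (built with -fPIC; path illustrative)
# so it can be linked statically into the extension's .so.
cc_library(
    name = "filesystem_static",
    srcs = ["lib/libboost_filesystem_pic.a"],
)

cc_binary(
    name = "foo.so",
    srcs = ["foo.cpp"],
    linkshared = 1,
    linkstatic = 1,
    deps = [":filesystem_static"],
)
```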
Some notes, since I happened to look at this bug report:
- don't use cc_library, use cc_binary; we want to change cc_library in a way that will otherwise break you (i.e., remove the implicit .a and .so outputs)
Already slowly converting. Maybe this should be made clearer in the docs.
- if you want to use a precompiled file, there is no need for an intermediate cc_library; just add it to the data of the py_library directly (see the sketch after these notes)
The argument against that, for me, is that I have multiple targets (py_* as well as cc_*) that depend on the precompiled file, and it seems cleaner to link against a cc_* target in that case!?
- if you have multiple dependencies on the same cc_library, you can end up with multiply-defined symbols, which can break in subtle ways during runtime
We are aware ;)
- if we support native dependencies from py_library, we may add a native_deps attribute, instead of reusing deps; in either case, we'll likely link all dependent cc_library rules into a single .so at the py_binary level to avoid the aforementioned multiple definition problem
native_deps would be nice.
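For reference, a sketch of the direct-data option from the notes above; the file layout and names are illustrative:

```python
# Precompiled extension shipped directly via data, with no intermediate cc_library.
py_library(
    name = "uses_prebuilt",
    srcs = ["wrapper.py"],
    data = ["prebuilt/foo.so"],
    # Assumption: the .so is importable from the "prebuilt" subdirectory.
    imports = ["prebuilt"],
)
```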
If a cc_binary is used, how should one specify the include paths and link options (and the other build flags in distutils.sysconfig.get_config_vars) to use? Or, if distutils is invoked directly from a setup.py to build the shared-object libraries (e.g., via a genrule), how should it depend on other cc_library targets?
Possible dup of #701. In any case, marking P3 since this isn't happening this quarter.
Closing in favor of #701.