Bazel: ijar does not work with scala jars that contain macros

Created on 20 Nov 2015 · 63 comments · Source: bazelbuild/bazel

I get:

Exception in thread "main" java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file com/twitter/scalding/serialization/macros/impl/OrderedSerializationProviderImpl$

when compiling against an ijar'ed file that contains macros.

Maybe this is not easily solvable, since I guess the scala compiler has to run methods from the jars at compile time. It would be good to see if there is a way to tell that a method is a macro method; if it is, it should not be removed by ijar.

P4 team-Rules-Java bug

All 63 comments

I added a scala_macro_library target that does not run ijar to my awful hack attempts here:

https://github.com/twitter/scalding/commit/73fe8aff7181ef519d410f86d4b4b2eb7d78cf63

Pulling in @cushon and @eaftan for their expertise: would that be safe to fix? Is a macro method an actual Java construct, or a hack from Scala?

The build requirements for macros sound similar to Java annotation processors. The approach there is to create a java_plugin for the processor and add it to a library with the java_library.plugins attribute. Plugin deps don't get processed by ijar, so the code is available to run processors during the compilation.

Would the same approach work for macros?
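
For reference, a minimal sketch of the java_plugin wiring described above (all target and class names are hypothetical):

java_plugin(
    name = "my_processor",
    srcs = ["MyProcessor.java"],
    processor_class = "com.example.MyProcessor",  # hypothetical processor class
)

java_library(
    name = "uses_processor",
    srcs = ["Foo.java"],
    plugins = [":my_processor"],  # plugin code is not run through ijar
)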

I think I did exactly that by adding a scala_macro_library target that I don't run through ijar that I linked above.

I guess that + minimizing target size is the answer.

Isn't scala_macro_library just a scala_library that puts its runtime output on the library classpath of upstream targets? I was suggesting creating a scala_library.macros attribute, and adding the runtime jars of any targets in macros to scalac's library classpath.

But your approach works too, and if scalac doesn't distinguish between library and macro deps then maybe there's no point making the distinction in the build language.

Sorry, I misunderstood you. I actually tried that approach first, but my unfamiliarity with skylark made this approach easier.

It is strange for the consumer to have to know whether something has macros. The idea is that macros should behave like normal functions/methods, so I wouldn't want scala_library to have to declare macro dependencies. Declaring that a target contains macros seems like the right way to go. Perhaps that could be done with an attribute rather than the name.

It seems like this approach doesn't allow depending on a maven_jar that contains a macro. Is that correct?

How might I go about doing that?

@sixolet I assume you are using the scala rules here: https://github.com/bazelbuild/rules_scala

in that case, you can use the /jar:file target of a maven_jar rather than the /jar target.
for instance:
https://github.com/johnynek/bazel-deps/blob/master/3rdparty/workspace.bzl#L48
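
To illustrate (repository name, artifact, and version are hypothetical): with a maven_jar in the WORKSPACE, a target can depend on the raw jar via //jar:file instead of the ijar-processed //jar target:

maven_jar(
    name = "org_spire_math_spire_macros",
    artifact = "org.spire-math:spire-macros_2.11:0.13.0",
)

scala_library(
    name = "uses_spire",
    srcs = ["Foo.scala"],
    deps = ["@org_spire_math_spire_macros//jar:file"],  # raw jar, no ijar
)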

@johnynek Awesome, that helps. Thank you.

I think I'm having a similar problem, not with a library I'm producing, but with a library I'm consuming.

Here, I'm running into an issue with spire's Syntax companion.

At first I thought it was due to a conflicting version of machinist I had on my classpath, but I've since resolved that back to 0.3.0, which should be correct.

Context: a Bazel port of a Scala 2.11.7 sbt multi-project build.

error: java.lang.ClassFormatError: Absent Code attribute in method that is not native or abstract in class file spire/macros/Syntax$
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at scala.reflect.macros.runtime.JavaReflectionRuntimes$JavaReflectionResolvers$class.resolveJavaReflectionRuntime(JavaReflectionRuntimes.scala:16)
    at scala.reflect.macros.runtime.MacroRuntimes$MacroRuntimeResolver.resolveJavaReflectionRuntime(MacroRuntimes.scala:52)
    at scala.reflect.macros.runtime.MacroRuntimes$MacroRuntimeResolver.resolveRuntime(MacroRuntimes.scala:65)
    at scala.reflect.macros.runtime.MacroRuntimes$$anonfun$standardMacroRuntime$3.apply(MacroRuntimes.scala:35)

Doug, how are you using this library? Did you see the comment about using the :file target above?

Macros will not work when they are in java_library or scala_library.

Incidentally, this is one thing my dependency tool handles for you (in addition to locking all transitive versions and the shas of the jars).

I'm using java_library atm. We manage our dependencies with ivy (by way of ant) for historical reasons then vendor and check them into our git repo. Is there a recommended approach if you already have the jar locally?

Isn't java_import the recommended approach when the jar is local?

@ittaiz see @johnynek's comment

I'm looking for a way to work around the macro issue when I have a local jar. Is java_import included in that list?

TBH I don't know

java_import is the recommended bazel way for java jars that are local. But that is a java rule, which forces the use of ijar by the time the scala rules can see it (currently; maybe a future change to the skylark interface of the java rules would change that).

@softprops to work around, you should use a filegroup:

filegroup(
    name = 'file',
    srcs = ['some_jar_i_very_very_like.jar'],
    visibility = ['//visibility:public']
)
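
If I understand the workaround, a consuming scala target can then list the filegroup directly in deps, so scalac sees the full jar (package and target names hypothetical):

scala_library(
    name = "uses_macros",
    srcs = ["Bar.scala"],
    deps = ["//third_party/some_jar:file"],  # the filegroup above, full jar
)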

Following up: @johnynek, that worked wonders for me. In case anyone runs into the specific case of spire in the future, I'm leaving another example here for reference:

filegroup(
    name = "org_spire_math_spire_macros",
    srcs = ["path/to/spire-macros_2.11-0.9.0.jar"],
    visibility = ["//visibility:public"],
)

@damienmg @ulfjack Can we somehow disable ijar for maven_jar? The current workaround of using "//jar:file" breaks IDE support. Given that solving this issue is complicated, we need a workaround which allows using the IDE. cc @chaoren @helenalt

/cc @lberki @iirina

You can pass --nouse_ijars to Bazel, or put "build --nouse_ijars" in the .bazelrc. It's not currently possible to disable it on a rule-by-rule basis. Any idea why the scala compiler is introspecting the bytecode?

IIUC scala has an annotation processor-like plugin mechanism where the plugins are discovered off the compile-time classpath. If the compile-time classpath contains ijars, the plugins can't be executed.

Instead of disabling ijar, it should be possible to use java_common.create_provider to wire prebuilt scala jars (or the outputs of scala_library) into the build graph, and avoid the ijar processing that's built into java_import and java_library.

Are the plugins actually using those classes?

Yeah, my understanding is that the plugins are those classes; the plugin implementation is loaded from the compile-time classpath.

Ah, I see. Can we get upstream scalac to change that behavior? What are the plugins for?

scala has compile time macros. Those macros are code stored in jars. To expand the macro, the compiler needs access to the code, since a macro is nothing more than code that runs at compile time.

I don't know of any way to use ijar in this case, since you actually need all of the code to compile. So changing this behavior in scalac means, I believe, convincing them to remove macros from the scala language, which I think is a non-starter (and won't fix all the currently deployed jars that contain macros).

If we could pass an option to java_import that tells it not to run ijar, that would work.

I had assumed changing scalac was a non-starter, but conflating the compilation classpath with the plugin classpath doesn't seem like a great idea. Do plugins differ from annotation processors in ways that make using a separate classpath for plugins (similar to the processor path) infeasible?

If we could pass an option to java_import that tells it not to run ijar, that would work.

Have you investigated java_common.create_provider? That skylark API allows other JVM-based languages to interoperate with the built-in Java rules. It is a better solution to this problem than adding features to the native java_ rules to make them building blocks for scala support.

We are using java_common.create_provider in the rules, but I think the recent ask (https://github.com/bazelbuild/bazel/issues/632#issuecomment-312938533) was around maven_jar, which we don't control. We can certainly make our own scala_maven_jar repository rule, but it is a shame to have a duplicate path with only one minor difference.

What Liam is saying is that you can get the non-processed jar files from the java_common provider, and use those on the classpath.

So maybe have a rule that uses native.maven_jar but then exports as an output the non-ijar jar file (instead of the ijar one)? Did I understand correctly?

More or less. I think the idea would be that the scala rules can depend on maven_jar, but use the runtime classpath for compiling rather than the compile-time classpath. So no intermediate rules needed.

That's very interesting. The problematic part of this is that we have many cases where ijar does work with scala code, and so we'd like to preserve it (ijar has a lot of value).
Can we tell in the scala rules if a dependency is of type maven_jar or not? I think not.

No, you can't, but that's not the right distinction (conceptually). You need to know whether the jar is used for one of the scala plugins / macros at compile time. We do that in Java for Java compiler plugins (annotation processors), i.e., we actually keep separate classpaths for Java plugins, compile-time code, and runtime code. The question is whether scalac supports that, or whether they'd be willing to support it.

I don't know if it's supported (I'd guess not), but @johnynek might have a better answer.
The way this distinction is done today in rules_scala is that we allow the origin to declare scala_macro_library, and for those we skip ijar. We don't have this distinction at the call site.
I think that's also why @johnynek and I are trying to solve it from the declaration site.

Yeah, it is not like you want to use a macro not as a macro. So it seems to me that labeling the target itself (as we do by making scala_macro_library) is right: it is clear to our users that this is how you declare you are writing macros. Then consumers don't do anything different (indeed, macros can look like normal functions, so it is painful if consumers have to think about whether a library is a macro or not).

If I see foo.bar("bippy"), I have no way of telling that .bar is a macro without reading the code. This is also why I don't think it makes sense for scala to have a separate classpath for macros and libraries: it draws a distinction users don't know about, namely what is and is not a macro. If I migrate a function from being a macro to not, or vice-versa, I need to change the compiler invocations in complex ways. Additionally, macros are often used to generate interfaces (think Comparator) automatically for a given class (in scala, case classes, which are basically simple data records, commonly use this pattern). So if we had two compilation paths, the macro path would generally need everything that the compile path needs: the person who makes the interface that is being generated probably put the macro in the same jar, so we need both the interface and the macro to expand the macro and type-check it (and note, type checking happens before some macro expansion, another reason why these classpaths are not separate).

Well, you'll need to make sure that all libraries used by macros aren't processed with ijar. The tricky part is that you can end up at the same library through a macro rule and through a normal library rule, so the library can't decide whether to use ijar or not. That's easier to solve with a separate macro classpath.

Yeah, that's a corner case we are probably getting wrong now. I think we are adding such a dependency both as an ijar on the compile path and as the full jar (due to the macro dependency). So we should be more careful about removing the ijar, by aggregating the ijar replacements in the macro library.

I'm not sure if this update is better suited here or in the intellij issue, since both places suggested I build a scala_import rule.
I tried building a basic scala_import rule which takes a label and creates a java provider out of it. This passes compilation, but there is still no resolution in the IDE.

def _scala_import_impl(ctx):
    # Expose the raw jar as both the compile-time and runtime jar, so no
    # ijar is ever produced for it.
    java_provider = java_common.create_provider(
        compile_time_jars = depset(ctx.attr.jar_to_import.files),
        runtime_jars = depset(ctx.attr.jar_to_import.files),
    )
    return struct(
        providers = [java_provider],
    )


scala_import = rule(
    implementation = _scala_import_impl,
    attrs = {
        "jar_to_import": attr.label(allow_files=True),
    },
)

I have this reproduction repo where different commits show different approaches.
I tried combing through the java_import implementation but I suspect that the "secret sauce" it uses and which I need isn't exposed to skylark.
I might be way off in my small impl of course and just "blaming the compiler".
Would love to see it.

cc @dslomov who also suggested I create a java_import equivalent

"jar_to_import": attr.label(allow_files=True),

We look for a java attribute from java rules, and a scala attribute from scala rules.
You can probably just make it look like scalaattr from here.

@chaoren Thanks!
I tried adding it, got into a bit of a fight with the aspect and what it wants but got to the same place at the end. Sync passes successfully but imports are still red.

def _scala_import_impl(ctx):
    java_provider = java_common.create_provider(
        compile_time_jars = ctx.attr.jar_to_import.files,
        runtime_jars = ctx.attr.jar_to_import.files,
    )
    # Mimic the outputs struct the IntelliJ aspect expects, pointing both the
    # ijar and class_jar slots at the raw jar.
    rule_outputs = struct(
        ijar = ctx.attr.jar_to_import.files.to_list()[0],
        class_jar = ctx.attr.jar_to_import.files.to_list()[0],
    )
    scalaattr = struct(
        outputs = rule_outputs,
        # does not seem to help or hurt the IDE, so I commented it out:
        # compile_jars = depset(ctx.attr.jar_to_import.files),
        # transitive_runtime_jars = depset(ctx.attr.jar_to_import.files),
    )
    return struct(
        scala = scalaattr,
        providers = [java_provider],
    )


scala_import = rule(
    implementation = _scala_import_impl,
    attrs = {
        "jar_to_import": attr.label(allow_files=True),
    },
)

The implementation is pushed to the above repo; it is of course hacky, just to understand what's going on. The end game shouldn't include ctx.attr.jar_to_import.files.to_list()[0].

@ittaiz we should probably move this back to our issue as this is not directly related to the topic anymore.

I agree

Sorry for the stupid question: I'm having a hard time figuring out how to get scala macros in external libraries to work. I was able to get things compiling with @dep//jar:file, but I still get these errors at runtime:

java.lang.NoClassDefFoundError: enumeratum/Enum$class
[...]
Cause: java.lang.ClassNotFoundException: enumeratum.Enum$class

It's especially weird because I've confirmed that the JAR makes it to the runtime environment, is on the classpath, and has this class in it.

Can you share a minimal example that reproduces this?

This is not currently an issue for us (with the workarounds).

Yes, I'm putting it together now.

Here's a minimal test case:

https://git.sr.ht/~sircmpwn/scala-bazel-test/

bazel build //example:example and bazel build //example:test work correctly, but bazel test //example:test raises an error at runtime.

Tested with the following bazel versions:

Build label: 0.6.0- (@non-git)
Build target: bazel-out/local-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Fri Sep 29 14:42:59 2017 (1506696179)
Build timestamp: 1506696179
Build timestamp as int: 1506696179

Build label: 0.4.5
Build target: bazel-out/local-fastbuild/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Thu Mar 16 12:19:38 2017 (1489666778)
Build timestamp: 1489666778
Build timestamp as int: 1489666778

You are trying to use scala 2.12 jars with scala 2.11, aren't you?

Am I? I'm not sure how to tell. In this small test case I don't see where that's being set.

I recommend my own tool that we use at Stripe for setting up the third party directory:

https://github.com/johnynek/bazel-deps

which sets their versions (2.11 vs 2.12) in one place.

That seems to specify the dependency versions, but not the scala version for the project overall. Setting the deps to 2.11 did, in fact, fix the problem, though. Thanks for the help!

I will check out bazel-deps, stumbled upon it before but wasn't entirely sure how to integrate it. Will study it further.

The problem is how to use ijars as much as possible without breaking macros.

The solution, I believe, is to be found in the answer to this question.

I had assumed changing scalac was a non-starter, but conflating the compilation classpath with the plugin classpath doesn't seem like a great idea. Do plugins differ from annotation processors in ways that make using a separate classpath for plugins (similar to the processor path) infeasible?

There are two approaches to macros. For lack of better terms: "explicit" (C, C++, Java), and "implicit" (LISP, Scala). The latter intentionally conflates compilation + runtime. [Insert readability / boilerplate / runtime-reflection debate here.]

In the latter universe, a runtime function and a compile-time macro are identical to developers, both in invoking them and in package management (many times, a macro will require runtime support as well).

This problem gets more complicated in that macros can call anything on the compilation classpath during compilation, including vanilla Java jars.

Solution

Scala macros can only be defined in Scala, and they can only be invoked from Scala. Therefore, if done correctly, we can keep this in Scala land.

  1. ScalaInfo has a macro_classpath attribute.
  2. scala_library has a boolean macro attribute.
  3. If macro is true, ScalaInfo.macro_classpath is the transitive runtime classpath of the library.
  4. In either case, JavaInfo always uses ijar.
  5. When compiling Scala, the macro_classpath of deps precedes their compile classpath when passing them to the Scala compiler, causing it to prefer the full classes.

This is true to the spirit of macros, by having devs declare macro-ness only once, in the original library definition. And it allows ijar to be used as much as possible.

And optionally, if a jar contains only macro-related code, then it can be declared as neverlink, giving you a smaller runtime.
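
A rough Starlark sketch of that proposal, using today's API (the provider, its fields, and the attribute wiring are hypothetical, not an existing rules_scala interface; the scalac action and JavaInfo plumbing are elided):

# Hypothetical provider carrying the extra classpath.
ScalaInfo = provider(
    fields = {"macro_classpath": "full (non-ijar) jars needed to expand macros"},
)

def _scala_library_impl(ctx):
    dep_java = [d[JavaInfo] for d in ctx.attr.deps if JavaInfo in d]
    compile_jars = depset(
        transitive = [d.transitive_compile_time_jars for d in dep_java],
    )
    macro_jars = depset(transitive = [
        d[ScalaInfo].macro_classpath for d in ctx.attr.deps if ScalaInfo in d
    ])

    # (5) Macro jars precede the (possibly ijar'd) compile jars, so scalac
    # resolves the full classes first. The scalac action itself is elided.
    scalac_classpath = depset(transitive = [macro_jars, compile_jars])

    # (3) If this library declares macros, consumers must see its full
    # transitive runtime classpath on their macro classpath. (The library's
    # own output jar would be added here too; elided with compilation.)
    if ctx.attr.macro:
        macro_out = depset(transitive = [
            d.transitive_runtime_jars for d in dep_java
        ] + [macro_jars])
    else:
        macro_out = macro_jars
    return [ScalaInfo(macro_classpath = macro_out)]

scala_library = rule(
    implementation = _scala_library_impl,
    attrs = {
        "srcs": attr.label_list(allow_files = [".scala"]),
        "deps": attr.label_list(),
        "macro": attr.bool(default = False),  # (2) the boolean macro attribute
    },
)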

Paul, I might be totally mistaken here, but your design sounds very similar (exactly?) to what we have today: we have scala_library, which uses ijar, and scala_macro_library, which doesn't. How does your design improve things?

Two more notes:

  1. We're lacking ijars today in the whole external dependencies area, but we'll improve that soon.
  2. I don't remember where I read this, but I think there was talk somewhere in the scala ecosystem of a separate macro classpath. @johnynek, do you happen to remember where?

I don't recall that, @ittaiz. I haven't been very effective at getting most scala people interested in the problems bazel addresses (reproducible, large-scale builds). As such, the compiler people seem to think build is not really a problem they need to think about.

Changing that will likely mean paying Lightbend, or maybe Miles Sabin, to get PRs into scalac.

@ittaiz if that's true, then I think we're already at the perfect solution.

AFAIK, there are two differences with scala_macro_library/scala_import, though admittedly subtle:

  • They change the ijar usage for just that jar. E.g. if Apache commons lang is used by a macro lib, it doesn't work.
  • They unnecessarily cause non-Scala compilers (e.g. javac) to use the full jar.

In any case, whether or not it's been implemented in rules_scala, this approach doesn't require support from the Bazel java rules, so we can close or migrate this issue. (That is, assuming this is the right approach.)

Can you elaborate on the commons/macro example? Not sure I understand.

I think the issue is here not because of java rules but because this repo is also concerned with common infrastructure such as ijar. Makes sense?

I think Paul is right: we have a bug in scala_macro_library, in that all transitive runtime dependencies should become compile dependencies, since you are running a macro at compile time.

We just have a bug now. We should add a test for a macro that depends on another target and runs that macro, but this is really just a matter of a few lines' change and writing a test.

Maybe a good PR opportunity @pauldraper !
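
A minimal sketch of what that fix might look like (the helper name is hypothetical, and this is not the current rules_scala implementation), using the java_common.create_provider API discussed earlier: a macro library folds its deps' full transitive runtime jars into the compile-time jars it exports.

def _macro_exports(deps):
    # Hypothetical helper: a macro executes inside scalac, so every
    # transitive *runtime* jar of a macro library must appear as a full
    # (non-ijar) jar on the consumer's compile-time classpath.
    dep_infos = [d[JavaInfo] for d in deps if JavaInfo in d]
    runtime_jars = depset(
        transitive = [d.transitive_runtime_jars for d in dep_infos],
    )
    return java_common.create_provider(
        compile_time_jars = runtime_jars,  # full jars, deliberately not ijars
        runtime_jars = runtime_jars,
    )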

They change the ijar usage for just that jar. E.g. if Apache commons lang is used by a macro lib, it doesn't work.

Can you elaborate on the commons/macro example?

The example of a scala macro depending on apache commons is the issue described in bazelbuild/rules_scala#445, right?

The ScalaInfo.macro_classpath idea sounds good to me. I think the rule logic ends up looking fairly similar to how annotation processor classpaths are handled, until the point where the macro classpath is prepended to scalac's compilation classpath (instead of passed as a separate argument like javac's -processorpath).

@cushon, yep :)

I think the rule logic ends up looking fairly similar to how annotation processor classpaths are handled

That was the inspiration. Combined with the current scala_macro_library style, where macros are identified by the library rather than by its downstream dependents (Java plugins being the opposite).

Maybe a good PR opportunity @pauldraper !

:+1: Though some thought must be given to making sure strict deps / unused deps continue to work as expected.

Currently encountering this same issue when trying to use the macros defined in the Quill library. What's the status of this ticket?

@nickersoft it is not resolved (and as far as I know not likely to be resolved any time soon).

The solution is to use the :file dependency in the maven_jar. The bazel-deps tool manages this automatically.

Another solution is to use scala_import from rules_scala. The :file dependency isn't supported well in IntelliJ.
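
For reference, scala_import usage might look like this (target and jar names hypothetical; check rules_scala for the exact attribute set):

scala_import(
    name = "spire_macros",
    jars = ["spire-macros_2.11-0.13.0.jar"],  # full jar, kept out of ijar
)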
