I'm hitting a reflection issue with the Kafka consumer libs, and I can't tell whether it's something that can be worked around (and I'm just not applying the workaround properly).
The app I'm trying is here: https://github.com/tednaleid/rat/commit/cae18f0a6f0adc81295edcf0047bec5777b14589
When I compile that with graal 1.0.0-rc3:
./gradlew clean shadowJar
and then try to build a native image from that jar file (using a reflection config file that I'm probably using incorrectly):
native-image -H:ReflectionConfigurationFiles=graal_config.json -jar rat-core/build/libs/rat.jar
I get this error:
Error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Invoke with MethodHandle argument could not be reduced to at most a single call: java.lang.invoke.LambdaForm$MH.1181730677.invoke_MT(Object, Object, Object)
Trace:
at parsing org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:90)
full output:
Build on Server(pid: 15757, port: 58019)
classlist: 1,151.88 ms
(cap): 1,225.42 ms
setup: 1,647.19 ms
23:31:55.503 [ForkJoinPool-75-worker-6] WARN org.apache.kafka.common.utils.AppInfoParser - Error while loading kafka-version.properties :invalid stored block lengths
analysis: 5,852.41 ms
error: unsupported features in 2 methods
Detailed message:
Error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Invoke with MethodHandle argument could not be reduced to at most a single call: java.lang.invoke.LambdaForm$MH.1181730677.invoke_MT(Object, Object, Object)
Trace:
at parsing org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:90)
Call path from entry point to org.apache.kafka.common.record.CompressionType$3.wrapForInput(ByteBuffer, byte, BufferSupplier):
at org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:90)
at org.apache.kafka.common.record.DefaultRecordBatch.compressedIterator(DefaultRecordBatch.java:257)
at org.apache.kafka.common.record.DefaultRecordBatch.streamingIterator(DefaultRecordBatch.java:335)
at org.apache.kafka.common.record.MemoryRecords.toString(MemoryRecords.java:290)
at java.lang.String.valueOf(String.java:2994)
at java.lang.StringBuilder.append(StringBuilder.java:131)
at com.oracle.svm.core.amd64.AMD64CPUFeatureAccess.verifyHostSupportsArchitecture(AMD64CPUFeatureAccess.java:165)
at com.oracle.svm.core.JavaMainWrapper.run(JavaMainWrapper.java:154)
at com.oracle.svm.core.code.CEntryPointCallStubs.com_002eoracle_002esvm_002ecore_002eJavaMainWrapper_002erun_0028int_002corg_002egraalvm_002enativeimage_002ec_002etype_002eCCharPointerPointer_0029(generated:0)
Error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Unsupported method java.security.ProtectionDomain.getCodeSource() is reachable: The declaring class of this element has been substituted, but this element is not present in the substitution class
To diagnose the issue, you can add the option -H:+ReportUnsupportedElementsAtRuntime. The unsupported element is then reported at run time when it is accessed the first time.
Trace:
at parsing ch.qos.logback.classic.spi.PackagingDataCalculator.getCodeLocation(PackagingDataCalculator.java:164)
Call path from entry point to ch.qos.logback.classic.spi.PackagingDataCalculator.getCodeLocation(Class):
at ch.qos.logback.classic.spi.PackagingDataCalculator.getCodeLocation(PackagingDataCalculator.java:162)
at ch.qos.logback.classic.spi.PackagingDataCalculator.calculateByExactType(PackagingDataCalculator.java:123)
at ch.qos.logback.classic.spi.PackagingDataCalculator.populateFrames(PackagingDataCalculator.java:96)
at ch.qos.logback.classic.spi.PackagingDataCalculator.calculate(PackagingDataCalculator.java:58)
at ch.qos.logback.classic.spi.ThrowableProxy.calculatePackagingData(ThrowableProxy.java:142)
at ch.qos.logback.classic.spi.LoggingEvent.<init>(LoggingEvent.java:122)
at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:419)
at ch.qos.logback.classic.Logger.filterAndLog_2(Logger.java:414)
at ch.qos.logback.classic.Logger.error(Logger.java:530)
at org.apache.kafka.common.utils.KafkaThread$1.uncaughtException(KafkaThread.java:51)
at com.oracle.svm.core.thread.JavaThreads.dispatchUncaughtException(JavaThreads.java:485)
at com.oracle.svm.core.JavaMainWrapper.run(JavaMainWrapper.java:175)
at com.oracle.svm.core.code.CEntryPointCallStubs.com_002eoracle_002esvm_002ecore_002eJavaMainWrapper_002erun_0028int_002corg_002egraalvm_002enativeimage_002ec_002etype_002eCCharPointerPointer_0029(generated:0)
Error: Processing image build request failed
Looking at the failing line, CompressionType.java:90, it is calling invoke:
return (InputStream) SnappyConstructors.INPUT.invoke(new ByteBufferInputStream(buffer));
and the underlying SnappyConstructors.INPUT is defined as:
private static class SnappyConstructors {
    static final MethodHandle INPUT = findConstructor("org.xerial.snappy.SnappyInputStream",
            MethodType.methodType(void.class, InputStream.class));
}
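For reference, the MethodHandle pattern that trips up the analysis can be reproduced in a stand-alone sketch (the class name `HandleDemo` is mine, and `java.io.BufferedInputStream` stands in for `SnappyInputStream`, which isn't on the classpath here):

```java
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class HandleDemo {
    // Mirrors Kafka's findConstructor: resolve a constructor by class name at runtime.
    static MethodHandle findConstructor(String className, MethodType type) {
        try {
            return MethodHandles.publicLookup()
                    .findConstructor(Class.forName(className), type);
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    static String wrapDemo() {
        MethodHandle input = findConstructor("java.io.BufferedInputStream",
                MethodType.methodType(void.class, InputStream.class));
        try {
            // This polymorphic invoke() is what native-image's closed-world
            // analysis cannot reduce to a single call site.
            InputStream wrapped = (InputStream) input.invoke(
                    new ByteArrayInputStream(new byte[0]));
            return wrapped.getClass().getName();
        } catch (Throwable t) {
            throw new RuntimeException(t);
        }
    }

    public static void main(String[] args) {
        System.out.println(wrapDemo()); // prints java.io.BufferedInputStream
    }
}
```

On the JVM this runs fine; the reflection config only registers the class for the Class.forName lookup, which may be why registering SnappyInputStream doesn't help with the invoke() itself.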
I've attempted to add SnappyInputStream to the reflection config file as directed in https://github.com/oracle/graal/blob/master/substratevm/REFLECTION.md but I'm either doing something wrong or it isn't possible with the current release.
what I think I'm supposed to do is use a reflection file like:
[
  {
    "name": "org.xerial.snappy.SnappyInputStream",
    "methods": [
      { "name": "<init>", "parameterTypes": ["java.io.InputStream"] }
    ]
  }
]
as that is the constructor of SnappyInputStream that is being found, but that isn't working, and neither is using:
[
  {
    "name": "org.xerial.snappy.SnappyInputStream",
    "allDeclaredConstructors": true,
    "allPublicConstructors": true,
    "allDeclaredMethods": true,
    "allPublicMethods": true
  }
]
What would be really great from a GraalVM usability perspective is for an error like this:
Error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException:
Invoke with MethodHandle argument could not be reduced to at most a single call:
java.lang.invoke.LambdaForm$MH.1181730677.invoke_MT(Object, Object, Object)
to also emit the candidate targets in the format the reflection config file expects, for easy copy/pasting. native-image seems to know there are multiple choices, so presumably it has the list in hand at the time the error is thrown.
(I'm sure this is far more complicated than I'm making it sound; I'm just speaking to the usability angle without understanding the cost.)
Thank you for your report! It doesn't look like this is actually an issue with reflection, but rather a limitation of our implementation of method handles, so I'm assigning this one to @christianwimmer.
Ah! I was wondering whether it was a different issue, because native-image was not compiling successfully even with -H:+ReportUnsupportedElementsAtRuntime. As I understand that flag, the binary should still compile and then fail at run time when that branch of code is hit. But it doesn't do that; it still fails with the same error as above:
native-image --class-path rat-core/build/libs/rat.jar -H:ReflectionConfigurationFiles=./graal_config.json -H:Name=rat-native -H:Class=rat.Application -H:+ReportUnsupportedElementsAtRuntime
Build on Server(pid: 30703, port: 62657)
classlist: 1,837.56 ms
(cap): 1,584.83 ms
setup: 2,016.05 ms
analysis: 8,745.88 ms
error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Invoke with MethodHandle argument could not be reduced to at most a single call: java.lang.invoke.LambdaForm$MH.33904206.invoke_MT(Object, Object, Object)
Detailed message:
Error: com.oracle.graal.pointsto.constraints.UnsupportedFeatureException: Invoke with MethodHandle argument could not be reduced to at most a single call: java.lang.invoke.LambdaForm$MH.33904206.invoke_MT(Object, Object, Object)
Trace:
at parsing org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:90)
Call path from entry point to org.apache.kafka.common.record.CompressionType$3.wrapForInput(ByteBuffer, byte, BufferSupplier):
at org.apache.kafka.common.record.CompressionType$3.wrapForInput(CompressionType.java:90)
at org.apache.kafka.common.record.DefaultRecordBatch.compressedIterator(DefaultRecordBatch.java:257)
at org.apache.kafka.common.record.DefaultRecordBatch.streamingIterator(DefaultRecordBatch.java:335)
at org.apache.kafka.common.record.MemoryRecords.toString(MemoryRecords.java:290)
at java.lang.String.valueOf(String.java:2994)
at java.lang.StringBuilder.append(StringBuilder.java:131)
at com.oracle.svm.core.amd64.AMD64CPUFeatureAccess.verifyHostSupportsArchitecture(AMD64CPUFeatureAccess.java:165)
at com.oracle.svm.core.JavaMainWrapper.run(JavaMainWrapper.java:154)
at com.oracle.svm.core.code.CEntryPointCallStubs.com_002eoracle_002esvm_002ecore_002eJavaMainWrapper_002erun_0028int_002corg_002egraalvm_002enativeimage_002ec_002etype_002eCCharPointerPointer_0029(generated:0)
Error: Processing image build request failed
I'm seeing the same behavior on RC4 (I know this isn't marked as fixed, just confirming that it is still there).
Also trying to get this working, using kafka-clients 1.0.0 and Graal RC4, I sidestepped the Snappy issue by creating a patched kafka-clients that skips the availability MethodHandle/invoke check. Now there seems to be an issue with the deserializer config. This might not be the same issue, but I was also looking to get a native Kafka consumer running, so I was excited to see someone else working on it.
The stack trace here looks like a reflection issue, but I can't determine what is going wrong. The Kafka config looks like it builds the deserializer out of the string or class you give in its properties, but even when providing the class instances to the consumer it still fails.
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
StringDeserializer keyDeserializer = new StringDeserializer();
ByteArrayDeserializer valueDeserializer = new ByteArrayDeserializer();
consumer = new KafkaConsumer<>(props, keyDeserializer, valueDeserializer);
leads to -
Exception in thread "main" java.lang.reflect.InvocationTargetException
at java.lang.Throwable.<init>(Throwable.java:310)
at java.lang.Exception.<init>(Exception.java:102)
at java.lang.ReflectiveOperationException.<init>(ReflectiveOperationException.java:89)
at java.lang.reflect.InvocationTargetException.<init>(InvocationTargetException.java:72)
at com.oracle.svm.reflect.proxies.Proxy_1_BasicTailer_main.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.oracle.svm.core.JavaMainWrapper.run(JavaMainWrapper.java:173)
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.StringDeserializer for configuration key.deserializer: Class org.apache.kafka.common.serialization.StringDeserializer could not be found.
at java.lang.Throwable.<init>(Throwable.java:265)
at java.lang.Exception.<init>(Exception.java:66)
at java.lang.RuntimeException.<init>(RuntimeException.java:62)
at org.apache.kafka.common.KafkaException.<init>(KafkaException.java:31)
at org.apache.kafka.common.config.ConfigException.<init>(ConfigException.java:37)
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:715)
at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:460)
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:453)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62)
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75)
at org.apache.kafka.clients.consumer.ConsumerConfig.<init>(ConsumerConfig.java:481)
at org.apache.kafka.clients.consumer.KafkaConsumer.<init>(KafkaConsumer.java:635)
at com.nytimes.knowmg.BasicTailer.main(BasicTailer.java:45)
The part of parseType that looks like it is failing is, I think:
case CLASS:
    if (value instanceof Class)
        return value;
    else if (value instanceof String)
        return Class.forName(trimmed, true, Utils.getContextOrKafkaClassLoader());
    else
        throw new ConfigException(name, value, "Expected a Class instance or class name.");
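Given that `value instanceof Class` check, one hedged workaround idea (the class and method names below are mine, and this is untested against native-image) is to put the Class objects themselves into the properties instead of their string names, so the CLASS branch returns early and never reaches Class.forName. A stand-alone sketch of the relevant logic:

```java
// Simplified stand-in for the CLASS branch of ConfigDef.parseType, showing
// why a Class object short-circuits before any reflective lookup.
public class ParseTypeDemo {
    static Object parseClassValue(Object value) {
        if (value instanceof Class)
            return value; // no reflection needed
        else if (value instanceof String) {
            try {
                // This path needs the class registered in the reflection config
                // when running as a native image.
                return Class.forName(((String) value).trim());
            } catch (ClassNotFoundException e) {
                throw new RuntimeException(e);
            }
        } else {
            throw new IllegalArgumentException("Expected a Class instance or class name.");
        }
    }

    public static void main(String[] args) {
        // The Class-object form never touches Class.forName:
        System.out.println(parseClassValue(String.class) == String.class); // prints true
    }
}
```

With kafka-clients that would mean something like props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class); whether ConfigDef accepts the Class form for these keys in your version is worth verifying.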
Some time has passed since the latest comments in this issue, so I'll just hop in and ask if someone has made progress in compiling Kafka producers/consumers to a native image?
I have the exact same problem with Kafka's Snappy compression method invocation. Given that this thread is a year old, may I ask how you ended up solving this?
@joshuavijay The problem appears to be related to method handles. I asked in this issue if there is a workaround but haven't gotten a response or comment yet. I'd be really interested in finding a workaround.
I sidestepped the snappy thing by creating a patched kafka-clients that doesn't do the availability methodHandle/invoke check.
@dnfehren Do you still have that fork available? I am curious... I think perhaps if you do not put the class names as strings into the consumer properties, but only pass instances of the ser/des (as your example code also does), it will not have to do reflection.
But with graalvm 19.0.0 and kafka-clients 2.2.0 I am having trouble with sun.nio.ch.EPollArrayWrapper
@blak3mill3r sorry, too long ago, good luck though
FYI for my future self and for those coming from Google: if you are getting this error and wondering why on earth it is happening when your Main class is so simple...
You will get this error if you accidentally define your entry point without the static modifier 🤦‍♂️
public void main(String[] args) {
vs
public static void main(String[] args) {
Pretty weird error for this kind of mistake though? 🤷‍♂️ @cstancu
Good catch, and sorry for the missing check. In https://github.com/oracle/graal/blob/e4ac613c022c67c4504d5772de3ad5ced7a5271b/substratevm/src/com.oracle.svm.hosted/src/com/oracle/svm/hosted/NativeImageGeneratorRunner.java#L288 we check that the main method is public but not that it is static.
Any progress on this?
The underlying issue here is incomplete MethodHandle support. That is under development and is being tracked by https://github.com/oracle/graal/issues/2761.