
Update supported Spark and Java versions in installation guide #742

Open

andygrove opened this issue Jul 30, 2024 · 3 comments
Labels
documentation (Improvements or additions to documentation), enhancement (New feature or request), good first issue (Good for newcomers)

Comments

@andygrove
Member

What is the problem the feature request solves?

The installation guide currently says:

Requirements:

  • Apache Spark 3.3 or 3.4
  • JDK 8 and up
  • glibc 2.17 (CentOS 7) and up

We now support Spark 3.5 and have experimental support for Spark 4.0.

We do not support JDK 8 with Spark 4, so we should state which Java versions are supported for each Spark version to avoid confusion.
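
For illustration, the updated requirements section could look something like the following (the exact JDK mapping for each Spark version would need to be confirmed by maintainers):

  • Apache Spark 3.3, 3.4, or 3.5: JDK 8, 11, or 17
  • Apache Spark 4.0 (experimental): JDK 17 and up
  • glibc 2.17 (CentOS 7) and up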

Describe the potential solution

No response

Additional context

No response

@andygrove andygrove added the enhancement New feature or request label Jul 30, 2024
@andygrove andygrove added this to the 0.2.0 milestone Jul 30, 2024
@andygrove andygrove added documentation Improvements or additions to documentation good first issue Good for newcomers labels Jul 30, 2024
@adi-kmt

adi-kmt commented Aug 15, 2024

Hey @andygrove, could I take this up?

@andygrove andygrove removed this from the 0.2.0 milestone Aug 16, 2024
@justahuman1

Hi @adi-kmt, are you still working on this? I can take it over if not. I already have the changes ready; let me know, thanks.

justahuman1@111a6ba

@zemin-piao

zemin-piao commented Oct 9, 2024

Hey folks,

I installed the jar https://mvnrepository.com/artifact/org.apache.datafusion/comet-parent-spark3.5_2.12/0.3.0 on my cluster, where we run Spark 3.5 with Java 8.

When running Spark with the Apache DataFusion Comet jar included, the driver throws this exception:

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.UnsupportedClassVersionError: org/apache/comet/CometRuntimeException has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.spark.CometDriverPlugin.init(Plugins.scala:63)
	at org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(PluginContainer.scala:53)
	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
	at org.apache.spark.internal.plugin.DriverPluginContainer.<init>(PluginContainer.scala:46)
	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:210)
	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:193)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:579)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)

It seems that some classes in the Comet jars for Spark 3.5 are compiled with Java 11. May I ask whether this is expected, or whether something is wrong with my setup? I ask because the documentation mentions that Java 8 is supported.
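
(Class file version 55.0 corresponds to Java 11 and 52.0 to Java 8.) One way to check which Java release the published classes target is to read the class-file major version straight out of the jar. A minimal sketch in Java, assuming a local copy of the jar (the path and file name below are illustrative) and the class name taken from the stack trace above:

    import java.io.DataInputStream;
    import java.io.InputStream;
    import java.util.jar.JarEntry;
    import java.util.jar.JarFile;

    // Reads the class-file version of one class inside a jar.
    // Major version 52 = Java 8, 55 = Java 11, 61 = Java 17.
    public class CheckClassFileVersion {
        public static void main(String[] args) throws Exception {
            // Illustrative default; point this at your local Comet jar.
            String jarPath = args.length > 0 ? args[0]
                    : "comet-spark-spark3.5_2.12-0.3.0.jar";
            String className = "org/apache/comet/CometRuntimeException.class";
            try (JarFile jar = new JarFile(jarPath)) {
                JarEntry entry = jar.getJarEntry(className);
                try (InputStream in = jar.getInputStream(entry);
                     DataInputStream data = new DataInputStream(in)) {
                    int magic = data.readInt();            // class files start with 0xCAFEBABE
                    int minor = data.readUnsignedShort();
                    int major = data.readUnsignedShort();
                    System.out.printf("magic=0x%X, class file version %d.%d%n",
                            magic, major, minor);
                }
            }
        }
    }

A reported major version of 55 would match the Java 11 build that the UnsupportedClassVersionError above points to.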
