I'm trying to use Kyuubi, but I'd like to be able to add a jar at runtime (Spark engine mode). My current thinking is: use a REST API call to send the local jar as a binary to the driver; the driver then writes the corresponding jar file to disk, adds it to the SparkContext, and puts it on its own classpath. But when I look at the code, it seems that all communication is built on Hive's Thrift protocol. I'm not too familiar with this protocol, but it looks like it only supports a few message types. So is there any way I can extend the message types of the Thrift protocol? Or is there another way to do what I need?
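For illustration, here is a minimal sketch of the idea described above, assuming the jar bytes have already reached the driver somehow (the REST transfer itself is not shown, and the names and paths are placeholders):

```scala
// Hypothetical sketch: the driver has received jar bytes and registers the jar.
import java.nio.file.{Files, Paths}
import org.apache.spark.sql.SparkSession

object RuntimeJarSketch {
  def registerJar(spark: SparkSession, jarBytes: Array[Byte], fileName: String): Unit = {
    // Write the received bytes to a local file on the driver (location is illustrative).
    val localPath = Paths.get("/tmp", fileName)
    Files.write(localPath, jarBytes)

    // Ship the jar to the executors so tasks can load classes from it.
    spark.sparkContext.addJar(localPath.toString)

    // NOTE: addJar does not put the classes on the driver's own classpath;
    // loading them on the driver would need additional classloader work.
  }
}
```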
-
I suppose you are asking, "How to add jars from the client's local file system to the remote Spark engine?"
The typical solution is:
ADD JAR hdfs:///path/of/jar
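For context, a client can send that statement through Kyuubi's HiveServer2-compatible endpoint, so no new Thrift message type is needed. A rough sketch over JDBC, assuming the Hive/Kyuubi JDBC driver is on the classpath (the URL, credentials, and port are placeholders):

```scala
// Rough sketch: run ADD JAR through Kyuubi's HiveServer2-compatible JDBC endpoint.
import java.sql.DriverManager

object AddJarViaJdbc {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:hive2://kyuubi-host:10009/default" // placeholder endpoint
    val conn = DriverManager.getConnection(url, "user", "")
    try {
      val stmt = conn.createStatement()
      // The jar must already be on storage the remote engine can reach, e.g. HDFS.
      stmt.execute("ADD JAR hdfs:///path/of/jar")
      stmt.close()
    } finally {
      conn.close()
    }
  }
}
```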