I have some code that I want to use in different Spark projects. I'm using sbt to create the .jar file.
I saw this idea:

Put the shared code into another project that builds its own JAR file, and use that JAR in both projects.
But that post is old and aimed exclusively at Java. I'm wondering whether there is a better way to do this for my scenario.
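For context, here is a minimal sketch of how I understand that shared-JAR approach would translate to sbt (the project names, organization, and versions below are hypothetical placeholders, not from the original post):

```scala
// shared/build.sbt -- the shared library project (hypothetical names)
name := "spark-common"
organization := "com.example"
version := "0.1.0"
scalaVersion := "2.12.18"

// Mark Spark as "provided" so it is not bundled into the shared jar;
// the Spark runtime on the cluster supplies these classes.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1" % "provided"
```

After running `sbt publishLocal` in the shared project, each Spark project would then declare it as an ordinary dependency:

```scala
// build.sbt of a consuming Spark project (hypothetical coordinates)
libraryDependencies += "com.example" %% "spark-common" % "0.1.0"
```

Is this still the recommended pattern, or is something like an sbt multi-project build a better fit?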