Prepare 3.0.0 release
jtgrabowski committed Sep 24, 2020
1 parent ee0ba7a commit 8e5c5f6
Showing 6 changed files with 23 additions and 15 deletions.
9 changes: 7 additions & 2 deletions CHANGES.txt
@@ -1,6 +1,11 @@
3.0.0
* Update Spark to 3.0.1 and commons-lang to 3.9
* Remove full table scan performance regression (SPARKC-614)
* Restore PrefetchingResultSetIterator (SPARKC-619)
* Restore ContinuousPaging properties (SPARKC-606)
* Integration tests work with C* 4.0.0-beta (SPARKC-615)
* Fix USE <catalog> command (SPARKC-615)


***************************************************************************
3.0.0-beta
* Data Source v2 support
* Secondary artifact with shaded typesafe.config
15 changes: 9 additions & 6 deletions README.md
@@ -5,9 +5,9 @@
| What | Where |
| ---------- | ----- |
| Community | Chat with us at [Datastax and Cassandra Q&A](https://community.datastax.com/index.html) |
| Scala Docs | Most Recent Release (2.5.1): [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/2.5.1/connector/index.html#com.datastax.spark.connector.package), [Spark-Cassandra-Connector-Driver](https://datastax.github.io/spark-cassandra-connector/ApiDocs/2.5.1/driver/#package)|
| Latest Production Release | [2.5.1](https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.11/2.5.1) |
| Latest Preview Release | [3.0.0-beta](https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.0.0-beta) |
| Scala Docs | Most Recent Release (3.0.0): [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.0.0/connector/com/datastax/spark/connector/index.html), [Spark-Cassandra-Connector-Driver](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.0.0/driver/com/datastax/spark/connector/index.html)|
| Latest Production Release | [3.0.0](https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.12/3.0.0) |

## Features

*Lightning-fast cluster computing with Apache Spark&trade; and Apache Cassandra&reg;.*
@@ -63,6 +63,9 @@ development for the next connector release in progress.
## Hosted API Docs
API documentation for the Scala and Java interfaces is available online:

### 3.0.0
* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/3.0.0/connector/com/datastax/spark/connector/index.html)

### 2.5.1
* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/2.5.1/connector/#package)

@@ -80,10 +83,10 @@ This project is available on the Maven Central Repository.
For SBT to download the connector binaries, sources and javadoc, put this in your project
SBT config:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.5.1"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "3.0.0"

* The default Scala version for Spark 2.0+ is 2.11 please choose the appropriate build. See the
[FAQ](doc/FAQ.md) for more information
* The default Scala version for Spark 3.0+ is 2.12; please choose the appropriate build. See the
  [FAQ](doc/FAQ.md) for more information.
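
Putting the pieces together, a minimal `build.sbt` might look like the sketch below. The Scala patch version and the `provided` scope for Spark are assumptions here; adjust them to match your environment.

```scala
// Hypothetical minimal build.sbt for a Spark 3.0.x / Scala 2.12 project.
scalaVersion := "2.12.11"

libraryDependencies ++= Seq(
  // Spark is usually supplied by the cluster at runtime, hence "provided".
  "org.apache.spark" %% "spark-sql" % "3.0.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "3.0.0"
)
```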

## Building
See [Building And Artifacts](doc/12_building_and_artifacts.md)
@@ -32,7 +32,7 @@
public class CassandraJavaUtilTest {

/**
* Scala refelection type tags change the string reprsentation of some types, in scala 2.11 java.lang
* Scala reflection type tags change the string representation of some types, in scala 2.11 java.lang
* is included, in scala 2.12 it is removed. To remove this conflict we just always remove the java.lang
* portion
*/
8 changes: 4 additions & 4 deletions doc/0_quick_start.md
@@ -15,14 +15,14 @@ Configure a new Scala project with the Apache Spark dependency.

The dependencies are easily retrieved via Maven Central

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.0.0-beta"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.12" % "3.0.0"

The spark-packages libraries can also be used with spark-submit and the spark shell; these
commands will place the connector and all of its dependencies on the path of the
Spark Driver and all Spark Executors.

$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0-beta
$SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0-beta
$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0
$SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0

For the list of available versions, see:
- https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_2.12/
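
Once the connector is on the classpath via `--packages`, tables can be read directly from the shell. A short sketch, assuming a keyspace `test` with a table `kv` already exists and `spark.cassandra.connection.host` points at the cluster:

```scala
// Sketch only: the keyspace and table names below are placeholders.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

df.show()
```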
@@ -42,7 +42,7 @@ and *all* of its dependencies on the Spark Class Path. To configure
the default Spark Configuration pass key-value pairs with `--conf`

$SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
--packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0-beta
--packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions

This command would set the Spark Cassandra Connector parameter `spark.cassandra.connection.host` to `127.0.0.1`.
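
The same settings can also be applied programmatically when the `SparkSession` is built, rather than on the command line. A rough sketch (the application name is illustrative):

```scala
import org.apache.spark.sql.SparkSession

// Illustrative: equivalent of passing the two --conf options shown above.
val spark = SparkSession.builder()
  .appName("cassandra-quickstart")
  .config("spark.cassandra.connection.host", "127.0.0.1")
  .config("spark.sql.extensions", "com.datastax.spark.connector.CassandraSparkExtensions")
  .getOrCreate()
```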
2 changes: 1 addition & 1 deletion doc/13_spark_shell.md
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://repo1.maven.org/maven2/com/
```bash
cd spark/install/dir
#Include the --master if you want to run against a spark cluster and not local mode
./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.11:2.5.1 --conf spark.cassandra.connection.host=yourCassandraClusterIp
./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0 --conf spark.cassandra.connection.host=yourCassandraClusterIp
```
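
Inside the shell, the connector's RDD API becomes available after importing the connector implicits. A small sketch, assuming a keyspace `test` with a table `kv` exists on the configured cluster:

```scala
// Sketch: sc is the SparkContext provided by the shell.
import com.datastax.spark.connector._

val rdd = sc.cassandraTable("test", "kv")
println(rdd.count())
```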

By default Spark will log everything to the console and this may be a bit of an overload. To change this, copy and modify the `log4j.properties` template file
2 changes: 1 addition & 1 deletion doc/15_python.md
@@ -14,7 +14,7 @@ shell similarly to how the spark shell is started. The preferred method is now t

```bash
./bin/pyspark \
--packages com.datastax.spark:spark-cassandra-connector_2.11:2.5.1 \
--packages com.datastax.spark:spark-cassandra-connector_2.12:3.0.0 \
--conf spark.sql.extensions=com.datastax.spark.connector.CassandraSparkExtensions
```

