Merge pull request #10125 from NvTimLiu/release-tmp

Merge branch 'branch-23.12' into main [skip ci]

NvTimLiu authored Dec 29, 2023
2 parents 02cf1a0 + 9856aeb commit 180faac
Showing 63 changed files with 139 additions and 134 deletions.
7 changes: 6 additions & 1 deletion CHANGELOG.md
@@ -1,5 +1,5 @@
# Change log
-Generated on 2023-12-12
+Generated on 2023-12-29

## Release 23.12

@@ -118,6 +118,11 @@ Generated on 2023-12-12
### PRs
|||
|:---|:---|
+|[#10123](https://github.com/NVIDIA/spark-rapids/pull/10123)|Change version to v23.12.1 [skip ci]|
+|[#10122](https://github.com/NVIDIA/spark-rapids/pull/10122)|Init changelog for v23.12.1 [skip ci]|
+|[#10121](https://github.com/NVIDIA/spark-rapids/pull/10121)|[DOC] update download page for db hot fix [skip ci]|
+|[#10116](https://github.com/NVIDIA/spark-rapids/pull/10116)|Upgrade to 23.12.1|
+|[#9935](https://github.com/NVIDIA/spark-rapids/pull/9935)|Init 23.12 changelog [skip ci]|
|[#9943](https://github.com/NVIDIA/spark-rapids/pull/9943)|[DOC] Update docs for 23.12.0 release [skip ci]|
|[#10014](https://github.com/NVIDIA/spark-rapids/pull/10014)|Add documentation for how to run tests with a fixed datagen seed [skip ci]|
|[#9954](https://github.com/NVIDIA/spark-rapids/pull/9954)|Update private and JNI version to released 23.12.0|
8 changes: 4 additions & 4 deletions CONTRIBUTING.md
@@ -130,15 +130,15 @@ mvn -pl dist -PnoSnapshots package -DskipTests
Verify that shim-specific classes are hidden from a conventional classloader.

```bash
-$ javap -cp dist/target/rapids-4-spark_2.12-23.12.0-SNAPSHOT-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl
+$ javap -cp dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl
Error: class not found: com.nvidia.spark.rapids.shims.SparkShimImpl
```

However, its bytecode can still be loaded if the class name is prefixed with a `spark3XY` directory that is not part of the declared package name

```bash
-$ javap -cp dist/target/rapids-4-spark_2.12-23.12.0-SNAPSHOT-cuda11.jar spark320.com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
-Warning: File dist/target/rapids-4-spark_2.12-23.12.0-SNAPSHOT-cuda11.jar(/spark320/com/nvidia/spark/rapids/shims/SparkShimImpl.class) does not contain class spark320.com.nvidia.spark.rapids.shims.SparkShimImpl
+$ javap -cp dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar spark320.com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
+Warning: File dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar(/spark320/com/nvidia/spark/rapids/shims/SparkShimImpl.class) does not contain class spark320.com.nvidia.spark.rapids.shims.SparkShimImpl
Compiled from "SparkShims.scala"
public final class com.nvidia.spark.rapids.shims.SparkShimImpl {
```
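If you want to see where that bytecode actually lives, listing the jar contents works too. A minimal sketch, assuming the same dist jar path as in the commands above:

```bash
# List the parallel-world copies of SparkShimImpl packaged in the dist jar;
# each match should appear under a spark3XY/ prefix rather than at the
# top-level com/nvidia/... path.
jar tf dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar | grep 'SparkShimImpl.class'
```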
@@ -181,7 +181,7 @@ mvn package -pl dist -am -Dbuildver=340 -DallowConventionalDistJar=true
Verify `com.nvidia.spark.rapids.shims.SparkShimImpl` is conventionally loadable:
```bash
-$ javap -cp dist/target/rapids-4-spark_2.12-23.12.0-SNAPSHOT-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
+$ javap -cp dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar com.nvidia.spark.rapids.shims.SparkShimImpl | head -2
Compiled from "SparkShims.scala"
public final class com.nvidia.spark.rapids.shims.SparkShimImpl {
```
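As an extra sanity check (a sketch, assuming `-DallowConventionalDistJar=true` places the shim classes at their canonical package paths), the class file should now show up without a `spark3XY/` prefix:

```bash
# Expect a top-level entry such as com/nvidia/spark/rapids/shims/SparkShimImpl.class
jar tf dist/target/rapids-4-spark_2.12-23.12.1-cuda11.jar | grep '^com/nvidia/spark/rapids/shims/SparkShimImpl.class'
```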
2 changes: 1 addition & 1 deletion README.md
@@ -73,7 +73,7 @@ as a `provided` dependency.
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<scope>provided</scope>
</dependency>
```
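To stage the same artifact ahead of time (for example on a build machine without the project checked out), resolving it from Maven Central with the coordinates above should work; this is a sketch, not part of the documented workflow:

```bash
# Download the plugin jar into the local Maven repository
mvn dependency:get -Dartifact=com.nvidia:rapids-4-spark_2.12:23.12.1
```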
4 changes: 2 additions & 2 deletions aggregator/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../jdk-profiles/pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-aggregator_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Aggregator</name>
<description>Creates an aggregated shaded package of the RAPIDS plugin for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>aggregator</rapids.module>
4 changes: 2 additions & 2 deletions api_validation/pom.xml
@@ -22,11 +22,11 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shim-deps-parent_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../shim-deps/pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark-api-validation_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>api_validation</rapids.module>
2 changes: 1 addition & 1 deletion datagen/ScaleTest.md
@@ -44,7 +44,7 @@ $SPARK_HOME/bin/spark-submit \
--conf spark.sql.parquet.datetimeRebaseModeInWrite=CORRECTED \
--class com.nvidia.rapids.tests.scaletest.ScaleTestDataGen \ # the main class
--jars $SPARK_HOME/examples/jars/scopt_2.12-3.7.1.jar \ # one dependency jar just shipped with Spark under $SPARK_HOME
-./target/datagen_2.12-23.12.0-spark332.jar \
+./target/datagen_2.12-23.12.1-spark332.jar \
1 \
10 \
parquet \
4 changes: 2 additions & 2 deletions datagen/pom.xml
@@ -21,13 +21,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shim-deps-parent_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../shim-deps/pom.xml</relativePath>
</parent>
<artifactId>datagen_2.12</artifactId>
<name>Data Generator</name>
<description>Tools for generating large amounts of data</description>
-<version>23.12.0</version>
+<version>23.12.1</version>
<properties>
<rapids.module>datagen</rapids.module>
<target.classifier/>
4 changes: 2 additions & 2 deletions delta-lake/delta-20x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-20x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.0.x Support</name>
<description>Delta Lake 2.0.x support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-20x</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-21x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-21x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.1.x Support</name>
<description>Delta Lake 2.1.x support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-21x</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-22x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-22x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.2.x Support</name>
<description>Delta Lake 2.2.x support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-22x</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-23x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-23x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.3.x Support</name>
<description>Delta Lake 2.3.x support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-23x</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-24x/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-24x_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake 2.4.x Support</name>
<description>Delta Lake 2.4.x support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-24x</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark321db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark321db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 10.4 Delta Lake Support</name>
<description>Databricks 10.4 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-spark321db</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark330db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark330db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 11.3 Delta Lake Support</name>
<description>Databricks 11.3 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-spark330db</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark332db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark332db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 12.2 Delta Lake Support</name>
<description>Databricks 12.2 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-spark332db</rapids.module>
4 changes: 2 additions & 2 deletions delta-lake/delta-spark341db/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-spark341db_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Databricks 13.3 Delta Lake Support</name>
<description>Databricks 13.3 Delta Lake support for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.compressed.artifact>false</rapids.compressed.artifact>
4 changes: 2 additions & 2 deletions delta-lake/delta-stub/pom.xml
@@ -22,14 +22,14 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../../jdk-profiles/pom.xml</relativePath>
</parent>

<artifactId>rapids-4-spark-delta-stub_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Delta Lake Stub</name>
<description>Delta Lake stub for the RAPIDS Accelerator for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>

<properties>
<rapids.module>../delta-lake/delta-stub</rapids.module>
4 changes: 2 additions & 2 deletions dist/pom.xml
@@ -22,13 +22,13 @@
<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-jdk-profiles_2.12</artifactId>
-<version>23.12.0</version>
+<version>23.12.1</version>
<relativePath>../jdk-profiles/pom.xml</relativePath>
</parent>
<artifactId>rapids-4-spark_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Distribution</name>
<description>Creates the distribution package of the RAPIDS plugin for Apache Spark</description>
-<version>23.12.0</version>
+<version>23.12.1</version>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
2 changes: 1 addition & 1 deletion docs/configs.md
@@ -10,7 +10,7 @@ The following is the list of options that `rapids-plugin-4-spark` supports.
On startup use: `--conf [conf key]=[conf value]`. For example:

```
-${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.12.0-cuda11.jar \
+${SPARK_HOME}/bin/spark-shell --jars rapids-4-spark_2.12-23.12.1-cuda11.jar \
--conf spark.plugins=com.nvidia.spark.SQLPlugin \
--conf spark.rapids.sql.concurrentGpuTasks=2
```
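The same settings can be made persistent in `spark-defaults.conf` instead of being passed on every launch. A sketch, with an illustrative jar path that should be replaced by wherever the plugin jar is actually installed:

```bash
# Append the equivalent configuration to spark-defaults.conf
cat >> "${SPARK_HOME}/conf/spark-defaults.conf" <<'EOF'
spark.jars                           /path/to/rapids-4-spark_2.12-23.12.1-cuda11.jar
spark.plugins                        com.nvidia.spark.SQLPlugin
spark.rapids.sql.concurrentGpuTasks  2
EOF
```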
12 changes: 6 additions & 6 deletions docs/dev/shims.md
@@ -68,17 +68,17 @@ Using JarURLConnection URLs we create a Parallel World of the current version wi
Spark 3.0.2's URLs:

```text
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/spark3xx-common/
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/spark302/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/spark3xx-common/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/spark302/
```

Spark 3.2.0's URLs :

```text
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/spark3xx-common/
-jar:file:/home/spark/rapids-4-spark_2.12-23.12.0.jar!/spark320/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/spark3xx-common/
+jar:file:/home/spark/rapids-4-spark_2.12-23.12.1.jar!/spark320/
```
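Those URL roots map onto real directories inside the jar, so they can be inspected directly. A quick sketch, assuming the jar sits at the path used in the URLs above:

```bash
# Show a few entries from the common and Spark 3.2.0 parallel worlds
unzip -l /home/spark/rapids-4-spark_2.12-23.12.1.jar | grep -E '(spark3xx-common|spark320)/' | head
```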

### Late Inheritance in Public Classes
16 changes: 8 additions & 8 deletions docs/download.md
@@ -18,7 +18,7 @@ cuDF jar, that is either preinstalled in the Spark classpath on all nodes or sub
that uses the RAPIDS Accelerator For Apache Spark. See the [getting-started
guide](https://docs.nvidia.com/spark-rapids/user-guide/latest/getting-started/overview.html) for more details.

-## Release v23.12.0
+## Release v23.12.1
### Hardware Requirements:

The plugin is tested on the following architectures:
@@ -65,14 +65,14 @@ for your hardware's minimum driver version.
### RAPIDS Accelerator's Support Policy for Apache Spark
The RAPIDS Accelerator maintains support for Apache Spark versions available for download from [Apache Spark](https://spark.apache.org/downloads.html)

-### Download RAPIDS Accelerator for Apache Spark v23.12.0
+### Download RAPIDS Accelerator for Apache Spark v23.12.1
- **Scala 2.12:**
-- [RAPIDS Accelerator for Apache Spark 23.12.0 - Scala 2.12 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.12.0/rapids-4-spark_2.12-23.12.0.jar)
-- [RAPIDS Accelerator for Apache Spark 23.12.0 - Scala 2.12 jar.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.12.0/rapids-4-spark_2.12-23.12.0.jar.asc)
+- [RAPIDS Accelerator for Apache Spark 23.12.1 - Scala 2.12 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.12.1/rapids-4-spark_2.12-23.12.1.jar)
+- [RAPIDS Accelerator for Apache Spark 23.12.1 - Scala 2.12 jar.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/23.12.1/rapids-4-spark_2.12-23.12.1.jar.asc)

- **Scala 2.13:**
-- [RAPIDS Accelerator for Apache Spark 23.12.0 - Scala 2.13 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.13/23.12.0/rapids-4-spark_2.13-23.12.0.jar)
-- [RAPIDS Accelerator for Apache Spark 23.12.0 - Scala 2.13 jar.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.13/23.12.0/rapids-4-spark_2.13-23.12.0.jar.asc)
+- [RAPIDS Accelerator for Apache Spark 23.12.1 - Scala 2.13 jar](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.13/23.12.1/rapids-4-spark_2.13-23.12.1.jar)
+- [RAPIDS Accelerator for Apache Spark 23.12.1 - Scala 2.13 jar.asc](https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.13/23.12.1/rapids-4-spark_2.13-23.12.1.jar.asc)

This package is built against CUDA 11.8. It is tested on V100, T4, A10, A100, L4 and H100 GPUs with
CUDA 11.8 through CUDA 12.0.
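To check what a given node actually has before installing, `nvidia-smi` (shipped with the driver) reports the GPU model and driver version; a minimal sketch:

```bash
# Print GPU model and installed driver version on the current node
nvidia-smi --query-gpu=name,driver_version --format=csv
```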
@@ -81,9 +81,9 @@ CUDA 11.8 through CUDA 12.0.
* Download the [PUB_KEY](https://keys.openpgp.org/[email protected]).
* Import the public key: `gpg --import PUB_KEY`
* Verify the signature for Scala 2.12 jar:
-`gpg --verify rapids-4-spark_2.12-23.12.0.jar.asc rapids-4-spark_2.12-23.12.0.jar`
+`gpg --verify rapids-4-spark_2.12-23.12.1.jar.asc rapids-4-spark_2.12-23.12.1.jar`
* Verify the signature for Scala 2.13 jar:
-`gpg --verify rapids-4-spark_2.13-23.12.0.jar.asc rapids-4-spark_2.13-23.12.0.jar`
+`gpg --verify rapids-4-spark_2.13-23.12.1.jar.asc rapids-4-spark_2.13-23.12.1.jar`
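Taken together, the steps above amount to roughly the following; the `PUB_KEY` file name and jar locations are assumptions and should point at the files you actually downloaded:

```bash
# Import the public signing key, then check each jar against its .asc signature
gpg --import PUB_KEY
gpg --verify rapids-4-spark_2.12-23.12.1.jar.asc rapids-4-spark_2.12-23.12.1.jar
gpg --verify rapids-4-spark_2.13-23.12.1.jar.asc rapids-4-spark_2.13-23.12.1.jar
```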

The output of signature verify:

6 changes: 3 additions & 3 deletions integration_tests/README.md
@@ -254,7 +254,7 @@ individually, so you don't risk running unit tests along with the integration te
http://www.scalatest.org/user_guide/using_the_scalatest_shell

```shell
-spark-shell --jars rapids-4-spark-tests_2.12-23.12.0-tests.jar,rapids-4-spark-integration-tests_2.12-23.12.0-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
+spark-shell --jars rapids-4-spark-tests_2.12-23.12.1-tests.jar,rapids-4-spark-integration-tests_2.12-23.12.1-tests.jar,scalatest_2.12-3.0.5.jar,scalactic_2.12-3.0.5.jar
```

First you import the `scalatest_shell` and tell the tests where they can find the test files you
@@ -277,7 +277,7 @@ If you just want to verify the SQL replacement is working you will need to add t
assumes CUDA 11.0 is being used and the Spark distribution is built with Scala 2.12.

```
-$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.12.0-cuda11.jar" ./runtests.py
+$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.12.1-cuda11.jar" ./runtests.py
```

You don't have to enable the plugin for this to work, the test framework will do that for you.

@@ -389,7 +389,7 @@ To run cudf_udf tests, need following configuration changes:
Expand Down Expand Up @@ -389,7 +389,7 @@ To run cudf_udf tests, need following configuration changes:
As an example, here is the `spark-submit` command with the cudf_udf parameter on CUDA 11.0:

```
-$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.12.0-cuda11.jar,rapids-4-spark-tests_2.12-23.12.0.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.12.0-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.12.0-cuda11.jar" ./runtests.py --cudf_udf
+$SPARK_HOME/bin/spark-submit --jars "rapids-4-spark_2.12-23.12.1-cuda11.jar,rapids-4-spark-tests_2.12-23.12.1.jar" --conf spark.rapids.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.memory.gpu.allocFraction=0.3 --conf spark.rapids.python.concurrentPythonWorkers=2 --py-files "rapids-4-spark_2.12-23.12.1-cuda11.jar" --conf spark.executorEnv.PYTHONPATH="rapids-4-spark_2.12-23.12.1-cuda11.jar" ./runtests.py --cudf_udf
```

### Enabling fuzz tests