@@ -26,7 +26,7 @@ runtime 9.1 LTS or later.
## Checking your code for common issues
- Run `./lint.sh` from the project root directory to run various code style checks.
+ Run `make dev-lint` from the project root directory to run various code style checks.
These are based on the use of `prospector`, `pylint` and related tools.
## Setting up your build environment
@@ -45,6 +45,11 @@ Our recommended mechanism for building the code is to use a `conda` or `pipenv`
But it can be built with any Python virtual environment tool.
+ ### Spark dependencies
+ The builds have been tested against Spark 3.2.1, which requires Java 8 (OpenJDK 1.8.56 or a later Java 8 release).
+ The Databricks runtimes use the Azul Zulu build of OpenJDK 8, and we have used this in local testing.
+ These are not installed automatically by the build process, so you will need to install them separately.
+
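Since a matching Java 8 JDK must be installed by hand, a quick pre-build check can save a confusing failure later. The sketch below is a hypothetical helper (not part of the project's build scripts) that inspects the first line of `java -version` output and confirms it reports a 1.8.x release:

```shell
#!/bin/sh
# check_java8: succeed (exit 0) only if the given "java -version" first line
# reports a 1.8.x JDK. Vendor strings vary, but Azul Zulu and other OpenJDK
# builds all quote the version, e.g.: openjdk version "1.8.0_312"
check_java8() {
  case "$1" in
    *'"1.8.'*) return 0 ;;   # Java 8 found
    *)         return 1 ;;   # some other version (or no quoted version at all)
  esac
}

# In practice you would feed it the live output:
#   check_java8 "$(java -version 2>&1 | head -n 1)" || echo "Install a Java 8 JDK first"
check_java8 'openjdk version "1.8.0_312"' && echo "Java 8 detected"
```

This only pattern-matches the version string; it does not verify which vendor's build is installed.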
### Building with Conda
To build with `conda`, perform the following commands:
- `make create-dev-env` from the main project directory to create your conda environment, if using
@@ -70,7 +75,7 @@ To build with `pipenv`, perform the following commands:
- Run `make dist` from the main project directory
- The resulting wheel file will be placed in the `dist` subdirectory
- The resulting build has been tested against Spark 3.0.1
+ The resulting build has been tested against Spark 3.2.1
## Creating the HTML documentation