
Commit aae8bde

Changed release version to be 0.4.0 (#271)
* wip
* wip
* corrected version information
* wip
* wip
* updated build instructions
1 parent 4206b5c commit aae8bde

File tree

9 files changed: +17 additions, -11 deletions


CHANGELOG.md (+2, -1)

@@ -3,9 +3,10 @@
 ## Change History
 All notable changes to the Databricks Labs Data Generator will be documented in this file.
 
-### Unreleased
+### Version 0.4.0
 
 #### Changed
+* Updated minimum pyspark version to be 3.2.1, compatible with Databricks runtime 10.4 LTS or later
 * Modified data generator to allow specification of constraints to the data generation process
 * Updated documentation for generating text data.
 * Modified data distributions to use abstract base classes
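The constraints item above is the main functional change in 0.4.0. A minimal sketch of how it might look in use follows; the DataGenerator, withColumn and build calls are standard dbldatagen usage, but the constraints import and the withConstraints call are assumed names for the new API and should be checked against the 0.4.0 documentation.

# Sketch only: illustrates the constraints change noted in the changelog above.
# DataGenerator, withColumn and build are standard dbldatagen calls; the constraints
# import and withConstraints call are assumed names for the new 0.4.0 API and should
# be verified against the project documentation. Assumes an active SparkSession `spark`.
import dbldatagen as dg
from dbldatagen.constraints import SqlExpr   # assumed module/class name

spec = (dg.DataGenerator(spark, name="orders", rows=100000, partitions=4)
        .withColumn("order_id", "long", uniqueValues=100000)
        .withColumn("amount", "double", minValue=1.0, maxValue=500.0, random=True)
        .withConstraints([SqlExpr("amount > 0")]))   # assumed constraint-attachment method

df = spec.build()
df.show(5)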

CONTRIBUTING.md (+7, -2)

@@ -26,7 +26,7 @@ runtime 9.1 LTS or later.
 
 ## Checking your code for common issues
 
-Run `./lint.sh` from the project root directory to run various code style checks.
+Run `make dev-lint` from the project root directory to run various code style checks.
 These are based on the use of `prospector`, `pylint` and related tools.
 
 ## Setting up your build environment
@@ -45,6 +45,11 @@ Our recommended mechanism for building the code is to use a `conda` or `pipenv`
 
 But it can be built with any Python virtualization environment.
 
+### Spark dependencies
+The builds have been tested against Spark 3.2.1. This requires OpenJDK 1.8.56 or a later version of Java 8.
+The Databricks runtimes use the Azul Zulu version of OpenJDK 8 and we have used these in local testing.
+These are not installed automatically by the build process, so you will need to install them separately.
+
 ### Building with Conda
 To build with `conda`, perform the following commands:
 - `make create-dev-env` from the main project directory to create your conda environment, if using
@@ -70,7 +75,7 @@ To build with `pipenv`, perform the following commands:
 - Run `make dist` from the main project directory
 - The resulting wheel file will be placed in the `dist` subdirectory
 
-The resulting build has been tested against Spark 3.0.1
+The resulting build has been tested against Spark 3.2.1
 
 ## Creating the HTML documentation
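The new Spark dependencies section states that builds are tested against Spark 3.2.1 on Java 8, and that Java is not installed by the build. As a quick illustrative sanity check of a local environment (not part of the project's tooling), something like the following can be run from Python:

# Illustrative local check for the Spark 3.2.1 / Java 8 requirements described above;
# not part of the project's build tooling, and exact version strings vary by machine.
import subprocess
import pyspark

print("pyspark version:", pyspark.__version__)   # expect 3.2.1 or a compatible later version

# `java -version` prints its report to stderr
result = subprocess.run(["java", "-version"], capture_output=True, text=True)
print(result.stderr.strip())                     # expect an OpenJDK / Zulu 1.8.x build for Java 8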

README.md (+1, -1)

@@ -65,7 +65,7 @@ details of use and many examples.
 
 Release notes and details of the latest changes for this specific release
 can be found in the GitHub repository
-[here](https://github.com/databrickslabs/dbldatagen/blob/release/v0.3.6post1/CHANGELOG.md)
+[here](https://github.com/databrickslabs/dbldatagen/blob/release/v0.4.0/CHANGELOG.md)
 
 # Installation

dbldatagen/_version.py (+1, -1)

@@ -34,7 +34,7 @@ def get_version(version):
     return version_info
 
 
-__version__ = "0.3.6post1" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
+__version__ = "0.4.0" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
 __version_info__ = get_version(__version__)
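The get_version helper referenced in the hunk header is not part of this diff, so the exact structure of __version_info__ is not visible here. As a rough, hypothetical illustration of splitting a version string such as those above into numeric components:

# Hypothetical sketch only: the real get_version implementation is not shown in this
# diff, and its return structure may differ from this simple tuple.
import re

def split_version(version):
    # "0.4.0" -> (0, 4, 0); a suffix such as "post1" is dropped in this sketch
    return tuple(int(re.match(r"\d+", piece).group()) for piece in version.split("."))

print(split_version("0.4.0"))        # (0, 4, 0)
print(split_version("0.3.6post1"))   # (0, 3, 6)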

docs/source/conf.py (+1, -1)

@@ -32,7 +32,7 @@
 author = 'Databricks Inc'
 
 # The full version, including alpha/beta/rc tags
-release = "0.3.6post1" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
+release = "0.4.0" # DO NOT EDIT THIS DIRECTLY! It is managed by bumpversion
 
 # -- General configuration ---------------------------------------------------

python/.bumpversion.cfg (+1, -1)

@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 0.3.6post1
+current_version = 0.4.0
 commit = False
 tag = False
 parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+){0,1}(?P<release>\D*)(?P<build>\d*)
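The parse pattern above is how bumpversion decomposes version strings. Applied outside bumpversion, purely as a check, it splits the old and new versions from this commit as follows:

# Applies the parse pattern from .bumpversion.cfg above to the old and new version
# strings in this commit, showing how the named groups come apart.
import re

PARSE = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+){0,1}(?P<release>\D*)(?P<build>\d*)"

for version in ("0.3.6post1", "0.4.0"):
    print(version, "->", re.match(PARSE, version).groupdict())

# 0.3.6post1 -> {'major': '0', 'minor': '3', 'patch': '6', 'release': 'post', 'build': '1'}
# 0.4.0 -> {'major': '0', 'minor': '4', 'patch': '0', 'release': '', 'build': ''}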

python/dev_require.txt (+2, -2)

@@ -3,9 +3,9 @@
 numpy==1.22.0
 pandas==1.2.4
 pickleshare==0.7.5
-py4j==0.10.9
+py4j>=0.10.9.3
 pyarrow==4.0.1
-pyspark>=3.1.3
+pyspark>=3.2.1,<=3.3.0
 python-dateutil==2.8.1
 six==1.15.0
 pyparsing==2.4.7
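The dev requirements now constrain pyspark to a 3.2.1 through 3.3.0 window. A small illustrative check of whether a locally installed pyspark falls inside that window is shown below; it uses the packaging library, which is not itself one of the requirements listed above.

# Checks a locally installed pyspark against the ">=3.2.1,<=3.3.0" window from
# dev_require.txt. Uses the `packaging` library, which is not listed in the
# requirement files above, so this is only an illustrative check.
from packaging.specifiers import SpecifierSet
from packaging.version import Version
import pyspark

window = SpecifierSet(">=3.2.1,<=3.3.0")
installed = Version(pyspark.__version__)
print(f"pyspark {installed} within {window}: {installed in window}")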

python/require.txt (+1, -1)

@@ -5,7 +5,7 @@ pandas==1.2.5
 pickleshare==0.7.5
 py4j==0.10.9
 pyarrow==4.0.1
-pyspark>=3.1.3
+pyspark>=3.2.1
 python-dateutil==2.8.1
 six==1.15.0
 pyparsing==2.4.7

setup.py (+1, -1)

@@ -31,7 +31,7 @@
 
 setuptools.setup(
     name="dbldatagen",
-    version="0.3.6post1",
+    version="0.4.0",
     author="Ronan Stokes, Databricks",
     description="Databricks Labs - PySpark Synthetic Data Generator",
     long_description=long_description,
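Once the wheel produced by `make dist` has been built and installed, the version declared here can be read back from the installed package metadata. A short check, assuming dbldatagen has been pip-installed into the current environment:

# Reads the version declared in setup.py back from installed package metadata.
# Assumes the dbldatagen wheel has been installed into the current environment.
from importlib.metadata import version

print(version("dbldatagen"))   # expected to print 0.4.0 for this release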
