[SPARK-51260][SQL] Move V2ExpressionBuilder and PushableExpression to Catalyst module

### What changes were proposed in this pull request?

The class `V2ExpressionBuilder` is in the package `org.apache.spark.sql.catalyst.util`, but its source file lives in the SQL Core module. The file should live in the SQL Catalyst module instead.

Since `V2ExpressionBuilder` references the object `PushableExpression`, this PR moves `PushableExpression` to the Catalyst module as well.

### Why are the changes needed?

1. Code cleanup: removes the `../catalyst/util` folder under `sql/core`.
2. After the refactoring, `V2ExpressionBuilder` and `PushableExpression` can be accessed from the Catalyst module. This will be useful for new projects such as V2 constraints (see the usage sketch below).
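
A minimal usage sketch, assuming the post-move import path shown in the diff below; the `TranslateExample` object and its `translate` method are hypothetical names for illustration, not code from this PR:

```scala
import org.apache.spark.sql.catalyst.expressions.Expression
import org.apache.spark.sql.catalyst.util.PushableExpression
import org.apache.spark.sql.connector.expressions.{Expression => V2Expression}

// Hypothetical helper, for illustration only. PushableExpression.unapply runs
// V2ExpressionBuilder under the hood, so the first case matches only when the
// Catalyst expression can be translated into a DS V2 expression.
object TranslateExample {
  def translate(catalystExpr: Expression): Option[V2Expression] = catalystExpr match {
    case PushableExpression(v2Expr) => Some(v2Expr) // translatable: eligible for pushdown
    case _ => None                                  // not translatable: evaluate in Spark
  }
}
```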

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

GA (GitHub Actions CI)

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #50011 from gengliangwang/move.

Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
gengliangwang committed Feb 20, 2025
1 parent 948840b commit 2beb7ed
Showing 2 changed files with 8 additions and 9 deletions.
@@ -26,7 +26,6 @@ import org.apache.spark.sql.connector.catalog.functions.ScalarFunction
 import org.apache.spark.sql.connector.expressions.{Cast => V2Cast, Expression => V2Expression, Extract => V2Extract, FieldReference, GeneralScalarExpression, LiteralValue, NullOrdering, SortDirection, SortValue, UserDefinedScalarFunc}
 import org.apache.spark.sql.connector.expressions.aggregate.{AggregateFunc, Avg, Count, CountStar, GeneralAggregateFunc, Max, Min, Sum, UserDefinedAggregateFunc}
 import org.apache.spark.sql.connector.expressions.filter.{AlwaysFalse, AlwaysTrue, And => V2And, Not => V2Not, Or => V2Or, Predicate => V2Predicate}
-import org.apache.spark.sql.execution.datasources.PushableExpression
 import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.sql.types.{BooleanType, DataType, IntegerType, StringType}

@@ -451,3 +450,10 @@ object ColumnOrField {
     case _ => None
   }
 }
+
+/**
+ * Get the expression of DS V2 to represent catalyst expression that can be pushed down.
+ */
+object PushableExpression {
+  def unapply(e: Expression): Option[V2Expression] = new V2ExpressionBuilder(e).build()
+}
@@ -40,7 +40,7 @@ import org.apache.spark.sql.catalyst.plans.logical.{AppendData, InsertIntoDir, I
 import org.apache.spark.sql.catalyst.rules.Rule
 import org.apache.spark.sql.catalyst.streaming.StreamingRelationV2
 import org.apache.spark.sql.catalyst.types.DataTypeUtils
-import org.apache.spark.sql.catalyst.util.{GeneratedColumn, IdentityColumn, ResolveDefaultColumns, V2ExpressionBuilder}
+import org.apache.spark.sql.catalyst.util.{GeneratedColumn, IdentityColumn, PushableExpression, ResolveDefaultColumns}
 import org.apache.spark.sql.classic.{SparkSession, Strategy}
 import org.apache.spark.sql.connector.catalog.{SupportsRead, V1Table}
 import org.apache.spark.sql.connector.catalog.TableCapability._
@@ -866,10 +866,3 @@ object PushableColumnAndNestedColumn extends PushableColumnBase {
 object PushableColumnWithoutNestedColumn extends PushableColumnBase {
   override val nestedPredicatePushdownEnabled = false
 }
-
-/**
- * Get the expression of DS V2 to represent catalyst expression that can be pushed down.
- */
-object PushableExpression {
-  def unapply(e: Expression): Option[V2Expression] = new V2ExpressionBuilder(e).build()
-}
