Uniqueness Check #176
Replies: 3 comments
-
@rishujuneja1298 Not really, as long as you are able to return a column expression.
This will report an issue for every row that contains a duplicate in the column.
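If it helps, here is a minimal sketch of that idea in plain PySpark, assuming a custom check only needs to produce a per-row condition; the duplicate_condition name is hypothetical, and how DQX wraps it into a check is not shown here:

```python
from pyspark.sql import Column, functions as F
from pyspark.sql.window import Window


def duplicate_condition(col_name: str) -> Column:
    # Count how many times each value occurs, using a window instead of
    # groupBy so the result stays aligned one-to-one with the input rows.
    occurrences = F.count(F.lit(1)).over(Window.partitionBy(F.col(col_name)))
    # True for every row whose value appears more than once in the column.
    return occurrences > 1
```

A window avoids the groupBy problem from the original question: groupBy collapses rows into aggregates, while a window keeps one result per input row.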
-
Thanks for the feedback, it is working. But there is one thing: if I want to call the custom-defined checks, we need to have the argument globals(), which will pick up the custom-defined check. In my case, I have defined all my checks in a .yml file that also contains some other predefined checks.
And I am calling the function below to run DQX. If I include globals(), it fails with errors like "Error applying checks to dataframe: function 'is_not_null' is not defined: {'check':", and if I exclude globals(), it fails with the error "ERROR [root] Error applying checks to dataframe: function 'is_unique' is not defined:". How can we handle this case?
def apply_checks_by_metadata(input_df, check_path: str):
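For what it's worth, one way to keep a wrapper like that working is to make the predefined functions importable into the same namespace you pass in. The module paths and the exact apply_checks_by_metadata signature below are assumptions based on this thread, so treat it as a sketch rather than the documented API:

```python
import yaml
from databricks.sdk import WorkspaceClient
from databricks.labs.dqx.engine import DQEngine       # assumed module path
from databricks.labs.dqx.col_functions import *       # assumed: brings is_not_null, is_unique, ... into globals()


def apply_checks_by_metadata(input_df, check_path: str):
    # Load both the predefined and the custom check definitions from the .yml file.
    with open(check_path) as f:
        checks = yaml.safe_load(f)

    dq_engine = DQEngine(WorkspaceClient())
    # globals() now contains the predefined functions (via the wildcard import
    # above) as well as any custom check functions defined in this module.
    return dq_engine.apply_checks_by_metadata(input_df, checks, globals())
```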
-
Great. The uniqueness check should be released this week: #200. The error is expected if the check functions are not imported before calling the method. I agree it's a bit cumbersome, so I am changing this so that predefined functions are included by default, without the need for an import: #203
-
I have been trying to create a custom check function to perform a unique-value check on a column, but I am encountering errors when using a GroupBy clause. Are there any limitations there?