Apparently a JSON schema function already exists in geopetl, and it does this in a slightly better way by returning the schema as a dictionary with constraints.
However, it doesn't return precision or scale, which would allow us to use more precise data types such as smallint, bigint, smallfloat, or bigfloat; these can perform much better than a generic "numeric" type.
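As a rough sketch of what having precision and scale would enable, a helper could pick a narrower numeric type from those two values. The function name, type names, and cutoffs below are illustrative assumptions, not geopetl's actual API:

```python
def narrow_numeric_type(precision, scale):
    """Pick a narrower numeric type from precision/scale.

    Hypothetical helper: the cutoffs and type names here are
    illustrative, not what geopetl or the target database mandates.
    """
    if scale == 0:
        # Whole numbers: choose an integer width that fits the precision.
        if precision <= 4:
            return 'smallint'
        if precision <= 18:
            return 'bigint'
        return 'numeric'
    # Fractional values: choose a float width by precision.
    if precision <= 6:
        return 'smallfloat'
    if precision <= 15:
        return 'bigfloat'
    return 'numeric'
```

Without precision and scale in the returned schema, every numeric column has to fall through to the generic `numeric` branch.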
Remove this extraction method from databridge_etl_tools and merge it into geopetl: https://github.com/CityOfPhiladelphia/databridge-etl-tools/blob/master/databridge_etl_tools/oracle.py#L64-L80
Add functionality to use the geopetl schema in Carto, specifically using our logic here to decide on data types: https://github.com/CityOfPhiladelphia/databridge-etl-tools/blob/master/databridge_etl_tools/carto_.py#L211-L301
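A minimal sketch of that type-decision step, assuming each schema field comes back as a dict like `{'type': ..., 'precision': ..., 'scale': ...}`; the field format, type codes, and function name are assumptions for illustration, not geopetl's or carto_.py's actual output:

```python
# Hypothetical mapping from schema type codes to Carto column types.
CARTO_TYPE_MAP = {
    'text': 'text',
    'date': 'date',
    'geom': 'geometry',
}

def carto_type(field):
    """Decide a Carto column type from a schema field dict (sketch)."""
    ftype = field['type']
    if ftype == 'num':
        # If the schema carried precision/scale, numeric columns could
        # be narrowed; here we only split integer vs generic numeric.
        return 'integer' if field.get('scale', 0) == 0 else 'numeric'
    # Default to text for anything unrecognized.
    return CARTO_TYPE_MAP.get(ftype, 'text')
```

The point is that this decision logic would live in one place and consume the schema dictionary directly, instead of being re-derived in carto_.py.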
Update the Carto method here to instead use "etl.extract_table_schema": https://github.com/CityOfPhiladelphia/geopetl/blob/master/geopetl/oracle_sde.py#L20-L23
dbtools uploads the JSON schema here; change it as well: https://github.com/CityOfPhiladelphia/databridge-etl-tools/blob/master/databridge_etl_tools/oracle.py#L179