diff --git a/changelog b/changelog index 1a87fdb668..a5d34215ff 100644 --- a/changelog +++ b/changelog @@ -79,6 +79,9 @@ 29) PR #2556 towards #1351. Documents the new OPERATES_ON=dof LFRic kernel type. + 30) PR #2252 for #1990. Support extraction of kernels that import + variables from other modules. + release 2.5.0 14th of February 2024 1) PR #2199 for #2189. Fix bugs with missing maps in enter data diff --git a/config/psyclone.cfg b/config/psyclone.cfg index b8762efb44..0042391864 100644 --- a/config/psyclone.cfg +++ b/config/psyclone.cfg @@ -55,7 +55,7 @@ VALID_PSY_DATA_PREFIXES = profile, extract, read_only_verify, nan_test OCL_DEVICES_PER_NODE = 1 # Symbols imported from the following modules will be ignored when parsing -# and will not produce a warning message if they cannot be found +# and will not produce a warning message if they cannot be found. IGNORE_MODULES = netcdf, mpi # Settings specific to the LFRic (Dynamo 0.3) API diff --git a/doc/user_guide/psy_data.rst b/doc/user_guide/psy_data.rst index d46f213b69..53c0e595ea 100644 --- a/doc/user_guide/psy_data.rst +++ b/doc/user_guide/psy_data.rst @@ -170,7 +170,7 @@ linking the verification library. The application which uses the read-only-verification library needs to link in the infrastructure library anyway. -.. note: +.. note:: It is the responsibility of the user to make sure that the infrastructure files used during compilation of the read-only-verification library are also used when linking the application. Otherwise strange and @@ -255,3 +255,44 @@ An executable example for using the LFRic read-only-verification library is included in ``tutorial/practicals/LFRic/building_code/4_psydata`` directory, see `this link for more information `_. + + +.. 
_integrating_psy_data_lfric: + +Integrating PSyData Libraries into the LFRic Build Environment +-------------------------------------------------------------- +The easiest way of integrating any PSyData-based library into the LFRic +build environment is: + +- In the LFRic source tree, create a new directory under ``infrastructure/source``, + e.g. ``infrastructure/source/psydata``. +- Build the PSyData wrapper stand-alone in ``lib/extract/netcdf/lfric`` (which + uses NetCDF as the output format) or ``lib/extract/standalone/lfric`` (which + uses a standard Fortran binary format) by executing ``make``. The compiled + files themselves will not be used, but this step creates all the source + files (some of which are generated by Jinja). Do not copy + the compiled files into your LFRic build tree, since these files might be + compiled with an outdated version of the infrastructure files and be + incompatible with the files in a current LFRic version. +- Copy all processed source files (``extract_netcdf_base.f90``, + ``kernel_data_netcdf.f90``, ``psy_data_base.f90``, + ``read_kernel_data_mod.f90``) into ``infrastructure/source/psydata``. +- Start the LFRic build process as normal. The LFRic build environment will + copy the PSyData source files into the working directory and compile + them. +- If the PSyData library needs additional include paths (e.g. when using an + external profiling tool), add the required paths to ``$FFLAGS``. +- If additional libraries are required at link time, add the paths + and libraries to ``$LDFLAGS``. Alternatively, when a compiler wrapper + script is provided by a third-party tool (e.g. the profiling tool + TAU provides the script ``tau_f90.sh``), set the environment variable + ``$FC`` or, if the wrapper is only required at link time, the variable + ``$LDMPI`` to this compiler wrapper. + +.. warning:: + Only one PSyData library can be integrated at a time. Otherwise there + may be several modules with the same name (e.g. 
+ ``psy_data_base``), resulting in errors at compile time. + +.. note:: + With the new build system FAB, this process might change. diff --git a/doc/user_guide/psyke.rst b/doc/user_guide/psyke.rst index abb1ccac89..b4ec5e5480 100644 --- a/doc/user_guide/psyke.rst +++ b/doc/user_guide/psyke.rst @@ -457,13 +457,8 @@ code, it will create an output file for each instrumented code region. The same logic for naming variables (using ``_post`` for output variables) used in :ref:`extraction_for_gocean` is used here. -As in the case of e.g. :ref:`read-only verification -`, this library uses the pared-down LFRic -infrastructure located in a clone of PSyclone repository, -``/src/psyclone/tests/test_files/dynamo0p3/infrastructure``. -However, this needs to be changed for any user (for instance with -PSyclone installation). Please refer to the relevant ``README.md`` -documentation on how to build and link this library. +See :ref:`integrating_psy_data_lfric` for the recommended way of linking +an extraction library to LFRic. The output file contains the values of all variables used in the subroutine. The ``LFRicExtractTrans`` transformation can automatically @@ -495,14 +490,6 @@ optimisation of a stand-alone kernel. stores the variable names and will not be able to find a variable if its name has changed. -.. note:: If the kernel, or any function called from an extracted kernel - should use a variable from a module directly (as opposed to supplying - this as parameter in the kernel call), this variable will not be - written to the extract data file, and the driver will also not try to - read in the value. As a result, the kernel will not be able to - run stand-alone. As a work-around, these values can be added manually - to the driver program. Issue #1990 tracks improvement of this situation. - The LFRic kernel driver will inline all required external modules into the driver. 
It uses a ``ModuleManager`` to find the required modules, based on the assumption that a file ``my_special_mod.f90`` will define exactly one module @@ -531,6 +518,21 @@ paths (infrastructure files and extraction library) for the compiler, but these flags are actually only required for compiling the example program, not for the driver. +Restrictions of Kernel Extraction and Driver Creation +##################################################### +A few restrictions still apply to the current implementation of the driver +creation code: + +- Distributed memory is not yet supported. See #1992. +- The extraction code now writes variables that are imported from other + modules to the kernel data file, and the driver will read these values in. + However, if an imported variable is declared as private, its value + cannot be written to the file, and compilation will abort. + The only workaround is to modify the module in question and make all its + variables public. This mostly affects ``log_mod.F90``, but also a few + other modules. +- The new build system FAB will be able to remove ``private`` and + ``protected`` declarations in any source files, so no manual + modification of files will be required anymore (TODO #2536). 
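The ``ModuleManager`` mentioned above relies on a simple naming convention: a file ``my_special_mod.f90`` is assumed to define exactly one module of the same name. As a minimal sketch of that convention (``module_name_from_filename`` is a hypothetical helper for illustration, not part of PSyclone's API):

```python
# Hypothetical illustration of the filename-to-module-name convention that
# the ModuleManager relies on: "my_special_mod.f90" defines my_special_mod.

def module_name_from_filename(path: str) -> str:
    """Return the module name implied by an LFRic-style source filename."""
    base = path.rsplit("/", 1)[-1]        # strip any directory components
    for ext in (".F90", ".f90"):          # both common Fortran extensions
        if base.endswith(ext):
            return base[:-len(ext)]
    raise ValueError(f"not a Fortran source file: {path}")

print(module_name_from_filename("infrastructure/source/my_special_mod.f90"))
# prints: my_special_mod
```

If a source file breaks this one-module-per-file assumption, the module cannot be located from its name alone.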
Extraction for NEMO ++++++++++++++++++++ diff --git a/examples/gocean/eg5/extract/Makefile b/examples/gocean/eg5/extract/Makefile index f99bada396..db2c999cee 100644 --- a/examples/gocean/eg5/extract/Makefile +++ b/examples/gocean/eg5/extract/Makefile @@ -61,19 +61,16 @@ INF_INC = $(INF_DIR)/src INF_LIB = $(INF_DIR)/src/lib_fd.a ifeq ($(TYPE), netcdf) EXTRACT_DIR ?= $(PSYROOT)/lib/extract/netcdf/dl_esm_inf - READ_DIR ?= $(PSYROOT)/lib/extract/netcdf F90FLAGS += $$(nf-config --fflags) LDFLAGS += $$(nf-config --flibs) $$(nc-config --libs) GENERATED_FILES += main-init.nc main-update.nc else EXTRACT_DIR ?= $(PSYROOT)/lib/extract/standalone/dl_esm_inf - READ_DIR ?= $(PSYROOT)/lib/extract/standalone GENERATED_FILES += main-init.binary main-update.binary endif LIB_NAME = lib_extract.a -READ_KERNEL_DATA = $(READ_DIR)/read_kernel_data_mod.o \ # The two kernels used in the application. @@ -91,16 +88,16 @@ DRIVER_UPDATE = driver-main-update .PHONY: transform compile run run: compile - ./extract_test - ./driver-main-init - ./driver-main-update + ./$(NAME) + ./driver-main-init.$(TYPE) + ./driver-main-update.$(TYPE) compile: transform $(NAME) $(DRIVER_INIT).$(TYPE) $(DRIVER_UPDATE).$(TYPE) transform: psy.f90 -F90FLAGS += -I$(INF_INC) -I$(EXTRACT_DIR) -I$(READ_DIR) +F90FLAGS += -I$(INF_INC) -I$(EXTRACT_DIR) alg.f90 psy.f90: test.x90 extract_transform.py $(PSYCLONE) -nodm -api "gocean1.0" -s ./extract_transform.py\ @@ -112,15 +109,15 @@ $(NAME): $(INF_LIB) $(EXTRACT_DIR)/$(LIB_NAME) $(KERNELS) alg.o psy.o #TODO #1757: $(INF_LIB) is required because of the meta-data in the # kernel - once this is fixed, $(INF_LIB) can be removed. 
-$(DRIVER_INIT).$(TYPE): $(KERNELS) $(DRIVER_INIT).o $(READ_KERNEL_DATA) +$(DRIVER_INIT).$(TYPE): $(KERNELS) $(DRIVER_INIT).o $(F90) $(KERNELS) $(DRIVER_INIT).o -o $(DRIVER_INIT).$(TYPE) \ - $(INF_LIB) $(READ_KERNEL_DATA) $(LDFLAGS) + $(INF_LIB) $(EXTRACT_DIR)/$(LIB_NAME) $(LDFLAGS) #TODO #1757: $(INF_LIB) is required because of the meta-data in the # kernel - once this is fixed, $(INF_LIB) can be removed. -$(DRIVER_UPDATE).$(TYPE): $(KERNELS) $(DRIVER_UPDATE).o $(READ_KERNEL_DATA) +$(DRIVER_UPDATE).$(TYPE): $(KERNELS) $(DRIVER_UPDATE).o $(F90) $(KERNELS) $(DRIVER_UPDATE).o -o $(DRIVER_UPDATE).$(TYPE) \ - $(INF_LIB) $(READ_KERNEL_DATA) $(LDFLAGS) + $(INF_LIB) $(EXTRACT_DIR)/$(LIB_NAME) $(LDFLAGS) # The dl_esm_inf library $(INF_LIB): @@ -138,8 +135,6 @@ psy.o: $(KERNELS) # directory will fail. $(DRIVER_INIT).f90: psy.f90 $(DRIVER_UPDATE).f90: psy.f90 -$(DRIVER_INIT).o: $(READ_KERNEL_DATA) -$(DRIVER_UPDATE).o: $(READ_KERNEL_DATA) # Dependency to INF_LIB to make sure the mod file are available $(KERNELS): $(INF_LIB) @@ -152,10 +147,6 @@ $(KERNELS): $(INF_LIB) $(EXTRACT_DIR)/lib_kernel_data_netcdf.a: make -C $(EXTRACT_DIR) -$(READ_KERNEL_DATA): - make -C $(READ_DIR) - allclean: clean make -C $(INF_DIR) clean make -C $(EXTRACT_DIR) clean - make -C $(READ_DIR) clean diff --git a/examples/lfric/eg17/full_example_extract/Makefile b/examples/lfric/eg17/full_example_extract/Makefile index c9c01eee5b..d01c697d34 100644 --- a/examples/lfric/eg17/full_example_extract/Makefile +++ b/examples/lfric/eg17/full_example_extract/Makefile @@ -58,7 +58,7 @@ GENERATED_FILES += driver-main-init driver-main-init.F90 \ F90 ?= gfortran F90FLAGS ?= -Wall -g -ffree-line-length-none -OBJ = main_psy.o main_alg.o testkern_w0_kernel_mod.o +OBJ = main_psy.o main_alg.o testkern_w0_kernel_mod.o dummy_mod.o ifeq ($(TYPE), netcdf) EXTRACT_DIR ?= $(PSYROOT)/lib/extract/netcdf/lfric @@ -71,7 +71,6 @@ endif EXEC = extract.$(TYPE) EXTRACT_NAME ?= _extract EXTRACT_LIB = $(EXTRACT_DIR)/lib$(EXTRACT_NAME).a 
-READ_KERNEL_DATA_OBJ = $(EXTRACT_DIR)/../read_kernel_data_mod.o LFRIC_PATH ?= $(PSYROOT)/src/psyclone/tests/test_files/dynamo0p3/infrastructure LFRIC_NAME=lfric LFRIC_LIB=$(LFRIC_PATH)/lib$(LFRIC_NAME).a @@ -101,19 +100,16 @@ $(EXTRACT_LIB): $(LFRIC_LIB) # Dependencies main_psy.o: testkern_w0_kernel_mod.o $(EXTRACT_LIB) $(LFRIC_LIB) main_alg.o: main_psy.o -testkern_w0_kernel_mod.o: $(LFRIC_LIB) +testkern_w0_kernel_mod.o: dummy_mod.o $(LFRIC_LIB) -driver-main-update: LFRIC_INCLUDE_FLAGS += -I $(EXTRACT_DIR)/.. -driver-main-update: driver-main-update.o $(READ_KERNEL_DATA_OBJ) +driver-main-update: driver-main-update.o $(F90) $(F90FLAGS) $(LFRIC_INCLUDE_FLAGS) driver-main-update.o \ - $(LDFLAGS) -o driver-main-update + $(LDFLAGS) -o driver-main-update driver-main-init: LFRIC_INCLUDE_FLAGS += -I $(EXTRACT_DIR)/.. -driver-main-init: driver-main-init.o $(READ_KERNEL_DATA_OBJ) +driver-main-init: driver-main-init.o $(F90) $(F90FLAGS) $(LFRIC_INCLUDE_FLAGS) driver-main-init.o \ - $(LDFLAGS) -o driver-main-init - -driver-main-update.o driver-main-init.o: $(READ_KERNEL_DATA_OBJ) + $(LDFLAGS) -o driver-main-init %.o: %.F90 $(F90) $(F90FLAGS) $(LFRIC_INCLUDE_FLAGS) -c $< @@ -124,14 +120,14 @@ driver-main-update.o driver-main-init.o: $(READ_KERNEL_DATA_OBJ) # Keep the generated psy and alg files .precious: main_psy.f90 main_alg.f90 +# This dependency will make sure that read_kernel_data_mod was created +# (which will be inlined in the driver). +main_psy.f90: $(EXTRACT_LIB) main_alg.f90: main_psy.f90 -$(READ_KERNEL_DATA_OBJ): - $(MAKE) -C $(EXTRACT_DIR)/.. - %_psy.f90: %.x90 ${PSYCLONE} -s ./extract_transform.py \ - -d . -d $(EXTRACT_DIR) -d $(EXTRACT_DIR)/.. \ + -d . 
-d $(EXTRACT_DIR) \ -d $(LFRIC_PATH) \ -nodm -opsy $*_psy.f90 -oalg $*_alg.f90 $< diff --git a/examples/lfric/eg17/full_example_extract/dummy_mod.f90 b/examples/lfric/eg17/full_example_extract/dummy_mod.f90 new file mode 100644 index 0000000000..89862b0bea --- /dev/null +++ b/examples/lfric/eg17/full_example_extract/dummy_mod.f90 @@ -0,0 +1,67 @@ +! BSD 3-Clause License +! +! Copyright (c) 2023-2024, Science and Technology Facilities Council +! All rights reserved. +! +! Redistribution and use in source and binary forms, with or without +! modification, are permitted provided that the following conditions are met: +! +! * Redistributions of source code must retain the above copyright notice, this +! list of conditions and the following disclaimer. +! +! * Redistributions in binary form must reproduce the above copyright notice, +! this list of conditions and the following disclaimer in the documentation +! and/or other materials provided with the distribution. +! +! * Neither the name of the copyright holder nor the names of its +! contributors may be used to endorse or promote products derived from +! this software without specific prior written permission. +! +! THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +! AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +! IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +! DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +! FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +! DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +! SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +! CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +! OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +! OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. +! 
----------------------------------------------------------------------------- +! Author J. Henrichs, Bureau of Meteorology + +! This simple module is used to showcase and test the extraction of non-local +! module variables with the driver extraction. + +module dummy_mod + integer :: dummy_var1 + real :: dummy_var2 + real :: dummy_var3 = 3 + + public :: dummy_code + + interface dummy_code + module procedure dummy_code_1, dummy_code_2 + end interface + + contains + + subroutine dummy_code_1(a) + implicit none + integer :: a + dummy_var1 = dummy_var1 + 1 + end subroutine dummy_code_1 + + subroutine dummy_code_2(a) + implicit none + real :: a + dummy_var1 = dummy_var1 + 1 + end subroutine dummy_code_2 + + integer function dummy_func(a) + implicit none + integer :: a + dummy_func = a+1 + dummy_var2 + end function dummy_func + +end module dummy_mod diff --git a/examples/lfric/eg17/full_example_extract/testkern_w0_kernel_mod.f90 b/examples/lfric/eg17/full_example_extract/testkern_w0_kernel_mod.f90 index f5ee31ba73..8d73364980 100644 --- a/examples/lfric/eg17/full_example_extract/testkern_w0_kernel_mod.f90 +++ b/examples/lfric/eg17/full_example_extract/testkern_w0_kernel_mod.f90 @@ -32,6 +32,7 @@ ! Modified by J. Henrichs, Bureau of Meteorology ! Modified by I. Kavcic, Met Office + module testkern_w0_kernel_mod use argument_mod @@ -40,8 +41,15 @@ module testkern_w0_kernel_mod use constants_mod + ! This is used to showcase the ability of the kernel extraction + ! to write and for the driver creation to read non-local module variables + ! when importing them in the module scope + use dummy_mod, only: dummy_var1, dummy_code + implicit none + integer, public :: some_other_var + integer, parameter :: some_other_const = 123 private type, public, extends(kernel_type) :: testkern_w0_kernel_type @@ -64,6 +72,10 @@ module testkern_w0_kernel_mod subroutine testkern_w0_code(nlayers, fld1, fld2, chi1, chi2, chi3, & some_logical, ndf_w0, undf_w0, map_w0) + ! 
This is used to showcase the ability of the kernel extraction + ! to write and for the driver creation to read non-local module + ! variables when importing them in the kernel itself. + use dummy_mod, only: dummy_var2, dummy_var3, dummy_func, dummy_code implicit none integer(kind=i_def), intent(in) :: nlayers @@ -75,12 +87,19 @@ subroutine testkern_w0_code(nlayers, fld1, fld2, chi1, chi2, chi3, & integer(kind=i_def), dimension(ndf_w0) :: map_w0 integer(kind=i_def) :: i, k + real(kind=r_def) :: some_r + call dummy_code(1) + some_r = 0 do k=0, nlayers-1 do i=1, ndf_w0 - fld1(map_w0(i)+k) = fld1(map_w0(i)+k) + fld2(map_w0(i)+k) + some_r = some_r + 1 + fld1(map_w0(i)+k) = fld1(map_w0(i)+k) + fld2(map_w0(i)+k) & + + dummy_func(i) if (some_logical) then - fld1(map_w0(i)+k) = fld1(map_w0(i)+k) + 1 + fld1(map_w0(i)+k) = fld1(map_w0(i)+k) + 1 + dummy_var1 + dummy_var2 & + + some_other_var + some_r + dummy_var3 & + + some_other_const endif end do end do diff --git a/lib/extract/netcdf/dl_esm_inf/.gitignore b/lib/extract/netcdf/dl_esm_inf/.gitignore new file mode 100644 index 0000000000..b13e29fa31 --- /dev/null +++ b/lib/extract/netcdf/dl_esm_inf/.gitignore @@ -0,0 +1 @@ +read_kernel_data_mod.f90 diff --git a/lib/extract/netcdf/dl_esm_inf/Makefile b/lib/extract/netcdf/dl_esm_inf/Makefile index 140299f1ff..c75695d685 100644 --- a/lib/extract/netcdf/dl_esm_inf/Makefile +++ b/lib/extract/netcdf/dl_esm_inf/Makefile @@ -63,7 +63,8 @@ PROCESS_ARGS = -prefix=extract_ -types=int,real,double \ -dims=2 PROCESS = $$($(PSYDATA_LIB_DIR)/get_python.sh) $(PSYDATA_LIB_DIR)/process.py -OBJS = psy_data_base.o kernel_data_netcdf.o extract_netcdf_base.o +OBJS = psy_data_base.o kernel_data_netcdf.o extract_netcdf_base.o \ + read_kernel_data_mod.o default: $(PSYDATA_LIB) @@ -78,6 +79,9 @@ extract_netcdf_base.o: psy_data_base.o psy_data_base.f90: $(PSYDATA_LIB_DIR)/psy_data_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) $< > psy_data_base.f90 +read_kernel_data_mod.f90: 
$(LIB_TMPLT_DIR)/read_kernel_data_mod.jinja Makefile + $(PROCESS) $(PROCESS_ARGS) $< > read_kernel_data_mod.f90 + extract_netcdf_base.f90: $(LIB_TMPLT_DIR)/extract_netcdf_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-declare -generic-provide $< > extract_netcdf_base.f90 diff --git a/lib/extract/netcdf/lfric/.gitignore b/lib/extract/netcdf/lfric/.gitignore index 06f7d51238..43711f2e6e 100644 --- a/lib/extract/netcdf/lfric/.gitignore +++ b/lib/extract/netcdf/lfric/.gitignore @@ -1 +1,2 @@ kernel_data_netcdf.f90 +read_kernel_data_mod.f90 diff --git a/lib/extract/netcdf/lfric/Makefile b/lib/extract/netcdf/lfric/Makefile index 89ef0a0f93..3c08a5c316 100644 --- a/lib/extract/netcdf/lfric/Makefile +++ b/lib/extract/netcdf/lfric/Makefile @@ -75,8 +75,12 @@ default: $(PSYDATA_LIB) .PHONY: default clean allclean -$(PSYDATA_LIB): $(INF_LIB) $(OBJS) - ${AR} ${ARFLAGS} ${PSYDATA_LIB} $^ +# We add read_kernel_data_mod as a convenience for using this +# in LFRic: all required files for kernel extractions and drivers +# are then in this directory. That file does not need to be compiled; +# it will be included in the created driver programs. 
+$(PSYDATA_LIB): $(INF_LIB) $(OBJS) read_kernel_data_mod.f90 + ${AR} ${ARFLAGS} ${PSYDATA_LIB} $(OBJS) # Create LFRic infrastructure library $(INF_LIB): @@ -91,6 +95,9 @@ kernel_data_netcdf.f90: kernel_data_netcdf.jinja Makefile psy_data_base.f90: $(PSYDATA_LIB_DIR)/psy_data_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) $< > psy_data_base.f90 +read_kernel_data_mod.f90: $(LIB_TMPLT_DIR)/read_kernel_data_mod.jinja Makefile + $(PROCESS) $(PROCESS_ARGS) $< > read_kernel_data_mod.f90 + extract_netcdf_base.f90: $(LIB_TMPLT_DIR)/extract_netcdf_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-declare -generic-provide $< > extract_netcdf_base.f90 @@ -98,7 +105,8 @@ extract_netcdf_base.f90: $(LIB_TMPLT_DIR)/extract_netcdf_base.jinja Makefile $(F90) $(F90FLAGS) $(LFRIC_INCLUDE_FLAGS) -c $< clean: - rm -f *.o *.mod $(PSYDATA_LIB) psy_data_base.f90 extract_netcdf_base.f90 + rm -f *.o *.mod $(PSYDATA_LIB) psy_data_base.f90 extract_netcdf_base.f90 \ + read_kernel_data_mod.f90 allclean: clean $(MAKE) -C $(LFRIC_PATH) allclean diff --git a/lib/extract/netcdf/read_kernel_data_mod.jinja b/lib/extract/netcdf/read_kernel_data_mod.jinja index 7328c8976f..855ddeee10 100644 --- a/lib/extract/netcdf/read_kernel_data_mod.jinja +++ b/lib/extract/netcdf/read_kernel_data_mod.jinja @@ -37,7 +37,7 @@ ! ----------------------------------------------------------------------------- ! BSD 3-Clause License ! -! Copyright (c) 2022-2023, Science and Technology Facilities Council. +! Copyright (c) 2022-2024, Science and Technology Facilities Council. ! All rights reserved. ! ! 
Redistribution and use in source and binary forms, with or without diff --git a/lib/extract/standalone/dl_esm_inf/.gitignore b/lib/extract/standalone/dl_esm_inf/.gitignore new file mode 100644 index 0000000000..b13e29fa31 --- /dev/null +++ b/lib/extract/standalone/dl_esm_inf/.gitignore @@ -0,0 +1 @@ +read_kernel_data_mod.f90 diff --git a/lib/extract/standalone/dl_esm_inf/Makefile b/lib/extract/standalone/dl_esm_inf/Makefile index d510a3b5e4..4f1f5c5ade 100644 --- a/lib/extract/standalone/dl_esm_inf/Makefile +++ b/lib/extract/standalone/dl_esm_inf/Makefile @@ -61,7 +61,8 @@ PROCESS_ARGS = -prefix=extract_ -types=int,real,double \ -dims=2 PROCESS = $$($(PSYDATA_LIB_DIR)/get_python.sh) $(PSYDATA_LIB_DIR)/process.py -OBJS = psy_data_base.o kernel_data_standalone.o extract_standalone_base.o +OBJS = psy_data_base.o kernel_data_standalone.o extract_standalone_base.o \ + read_kernel_data_mod.o default: $(PSYDATA_LIB) @@ -76,6 +77,9 @@ extract_standalone_base.o: psy_data_base.o psy_data_base.f90: $(PSYDATA_LIB_DIR)/psy_data_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-declare $< > psy_data_base.f90 +read_kernel_data_mod.f90: $(LIB_TMPLT_DIR)/read_kernel_data_mod.jinja Makefile + $(PROCESS) $(PROCESS_ARGS) $< > read_kernel_data_mod.f90 + extract_standalone_base.f90: $(LIB_TMPLT_DIR)/extract_standalone_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-provide $< > extract_standalone_base.f90 diff --git a/lib/extract/standalone/lfric/.gitignore b/lib/extract/standalone/lfric/.gitignore index 85a9245fda..99eaf94920 100644 --- a/lib/extract/standalone/lfric/.gitignore +++ b/lib/extract/standalone/lfric/.gitignore @@ -1 +1,2 @@ kernel_data_standalone.f90 +read_kernel_data_mod.f90 diff --git a/lib/extract/standalone/lfric/Makefile b/lib/extract/standalone/lfric/Makefile index 1906360e2d..3ce3dea539 100644 --- a/lib/extract/standalone/lfric/Makefile +++ b/lib/extract/standalone/lfric/Makefile @@ -71,8 +71,12 @@ default: $(PSYDATA_LIB) .PHONY: default clean allclean 
-$(PSYDATA_LIB): $(INF_LIB) $(OBJS) - ${AR} ${ARFLAGS} ${PSYDATA_LIB} $^ +# We add read_kernel_data_mod as a convenience for using this +# in LFRic: all required files for kernel extractions and drivers +# are then in this directory. That file does not need to be compiled; +# it will be included in the created driver programs. +$(PSYDATA_LIB): $(INF_LIB) $(OBJS) read_kernel_data_mod.f90 + ${AR} ${ARFLAGS} ${PSYDATA_LIB} $(OBJS) # Create LFRic infrastructure library $(INF_LIB): @@ -87,6 +91,9 @@ kernel_data_standalone.f90: kernel_data_standalone.jinja Makefile psy_data_base.f90: $(PSYDATA_LIB_DIR)/psy_data_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-declare $< > psy_data_base.f90 +read_kernel_data_mod.f90: $(LIB_TMPLT_DIR)/read_kernel_data_mod.jinja Makefile + $(PROCESS) $(PROCESS_ARGS) $< > read_kernel_data_mod.f90 + extract_standalone_base.f90: $(LIB_TMPLT_DIR)/extract_standalone_base.jinja Makefile $(PROCESS) $(PROCESS_ARGS) -generic-provide $< > extract_standalone_base.f90 @@ -94,7 +101,8 @@ extract_standalone_base.f90: $(LIB_TMPLT_DIR)/extract_standalone_base.jinja Make $(F90) $(F90FLAGS) $(LFRIC_INCLUDE_FLAGS) -c $< clean: - rm -f *.o *.mod $(PSYDATA_LIB) psy_data_base.f90 extract_standalone_base.f90 + rm -f *.o *.mod $(PSYDATA_LIB) psy_data_base.f90 extract_standalone_base.f90 \ + read_kernel_data_mod.f90 allclean: clean $(MAKE) -C $(LFRIC_PATH) allclean diff --git a/psyclone.pdf b/psyclone.pdf index 47f9db3ea4..e4b8ffd450 100644 Binary files a/psyclone.pdf and b/psyclone.pdf differ diff --git a/src/psyclone/domain/lfric/lfric_extract_driver_creator.py b/src/psyclone/domain/lfric/lfric_extract_driver_creator.py index 32846e4059..9ec811f2e4 100644 --- a/src/psyclone/domain/lfric/lfric_extract_driver_creator.py +++ b/src/psyclone/domain/lfric/lfric_extract_driver_creator.py @@ -40,6 +40,10 @@ the output data contained in the input file. 
''' +# TODO #1382: refactoring common functionality between the various driver +# creation implementations should make this file much smaller. +# pylint: disable=too-many-lines + from psyclone.configuration import Config from psyclone.core import Signature from psyclone.domain.lfric import LFRicConstants @@ -293,7 +297,8 @@ def _flatten_reference(self, old_reference, symbol_table, old_reference.replace_with(new_ref) # ------------------------------------------------------------------------- - def _add_all_kernel_symbols(self, sched, symbol_table, proxy_name_mapping): + def _add_all_kernel_symbols(self, sched, symbol_table, proxy_name_mapping, + read_write_info): '''This function adds all symbols used in ``sched`` to the symbol table. It uses LFRic-specific knowledge to declare fields and flatten their name. @@ -306,9 +311,12 @@ def _add_all_kernel_symbols(self, sched, symbol_table, proxy_name_mapping): :param proxy_name_mapping: a mapping of proxy names to the original \ names. :type proxy_name_mapping: Dict[str,str] - + :param read_write_info: information about all input and output + parameters. + :type read_write_info: :py:class:`psyclone.psyir.tools.ReadWriteInfo` ''' + # pylint: disable=too-many-locals all_references = sched.walk(Reference) # First we add all non-structure names to the symbol table. This way @@ -369,6 +377,46 @@ def _add_all_kernel_symbols(self, sched, symbol_table, proxy_name_mapping): self._flatten_reference(reference, symbol_table, proxy_name_mapping) + # Now add all non-local symbols, which need to be + # imported from the appropriate module: + # ----------------------------------------------- + mod_man = ModuleManager.get() + for module_name, signature in read_write_info.set_of_all_used_vars: + if not module_name: + # Ignore local symbols, which will have been added above + continue + container = symbol_table.find_or_create( + module_name, symbol_type=ContainerSymbol) + + # Now look up the original symbol. 
While the variable could + # be declared Unresolved here (i.e. just imported), we need the + # type information for the output variables (VAR_post), which + # are created later and which will query the original symbol for + # its type. And since they are not imported, they need to be + # explicitly declared. + mod_info = mod_man.get_module_info(module_name) + sym_tab = mod_info.get_psyir().symbol_table + try: + container_symbol = sym_tab.lookup(signature[0]) + except KeyError: + # TODO #2120: This typically indicates a problem with parsing + # a module: the psyir does not have the full tree structure. + continue + + # It is possible that the external symbol name (signature[0]) + # already exists in the symbol table (the same name is used in + # the local subroutine and in a module). In this case, the + # imported symbol must be renamed: + if signature[0] in symbol_table: + interface = ImportInterface(container, orig_name=signature[0]) + else: + interface = ImportInterface(container) + + symbol_table.find_or_create_tag( + tag=f"{signature[0]}@{module_name}", root_name=signature[0], + symbol_type=DataSymbol, interface=interface, + datatype=container_symbol.datatype) + # ------------------------------------------------------------------------- @staticmethod def _add_call(program, name, args): @@ -402,7 +450,7 @@ def _add_call(program, name, args): # ------------------------------------------------------------------------- @staticmethod def _create_output_var_code(name, program, is_input, read_var, - postfix, index=None): + postfix, index=None, module_name=None): # pylint: disable=too-many-arguments ''' This function creates all code required for an output variable. @@ -412,6 +460,10 @@ def _create_output_var_code(name, program, is_input, read_var, the size of the _post variable) and initialised to 0. This function also handles array of fields, which need to get an index number added. 
+ If a module_name is specified, this indicates that this variable + is imported from an external module. The name of the module will + be appended to the tag used in the extracted kernel file, e.g. + `dummy_var2@dummy_mod`. :param str name: the name of original variable (i.e. without _post), which will be looked up as a tag in the symbol @@ -428,6 +480,9 @@ def _create_output_var_code(name, program, is_input, read_var, values, which are read from the file. :param index: if present, the index to the component of a field vector. :type index: Optional[int] + :param str module_name: if the variable is part of an external module, + this contains the module name from which it is imported. + Otherwise, this must either not be specified or be an empty string. :returns: a 2-tuple containing the output Symbol after the kernel, and the expected output read from the file. @@ -435,23 +490,36 @@ def _create_output_var_code(name, program, is_input, read_var, :py:class:`psyclone.psyir.symbols.Symbol`] ''' + # For each variable that is written, we need to declare a new variable + # that stores the expected value contained in the kernel data file; + # this variable has `_post` appended to the name (so `a` is the + # variable that is written, and `a_post` is the corresponding variable + # that has the expected results for verification). 
Since the written + # variable and the one storing the expected results have the same + # type, look up the 'original' variable and declare the _POST variable symbol_table = program.symbol_table - if index is not None: - sym = symbol_table.lookup_with_tag(f"{name}_{index}_data") + if module_name: + sym = symbol_table.lookup_with_tag(f"{name}@{module_name}") else: - # If it is not indexed then `name` will already end in "_data" - sym = symbol_table.lookup_with_tag(name) + if index is not None: + sym = symbol_table.lookup_with_tag(f"{name}_{index}_data") + else: + # If it is not indexed then `name` will already end in "_data" + sym = symbol_table.lookup_with_tag(name) # Declare a 'post' variable of the same type and read in its value. post_name = sym.name + postfix post_sym = symbol_table.new_symbol(post_name, symbol_type=DataSymbol, datatype=sym.datatype) - if index is not None: - post_tag = f"{name}{postfix}%{index}" + if module_name: + post_tag = f"{name}{postfix}@{module_name}" else: - # If it is not indexed then `name` will already end in "_data" - post_tag = f"{name}{postfix}" + if index is not None: + post_tag = f"{name}{postfix}%{index}" + else: + # If it is not indexed then `name` will already end in "_data" + post_tag = f"{name}{postfix}" name_lit = Literal(post_tag, CHARACTER_TYPE) LFRicExtractDriverCreator._add_call(program, read_var, [name_lit, @@ -476,7 +544,8 @@ def _create_output_var_code(name, program, is_input, read_var, # ------------------------------------------------------------------------- def _create_read_in_code(self, program, psy_data, original_symbol_table, read_write_info, postfix): - # pylint: disable=too-many-arguments, too-many-locals + # pylint: disable=too-many-arguments, too-many-branches + # pylint: disable=too-many-locals, too-many-statements '''This function creates the code that reads in the NetCDF file produced during extraction. 
For each: @@ -533,18 +602,30 @@ def _sym_is_field(sym): symbol_table = program.scope.symbol_table read_var = f"{psy_data.name}%ReadVariable" + mod_man = ModuleManager.get() # First handle variables that are read: # ------------------------------------- - for signature in read_write_info.signatures_read: + for module_name, signature in read_write_info.read_list: # Find the right symbol for the variable. Note that all variables # in the input and output list have been detected as being used # when the variable accesses were analysed. Therefore, these # variables have References, and will already have been declared # in the symbol table (in _add_all_kernel_symbols). sig_str = self._flatten_signature(signature) - orig_sym = original_symbol_table.lookup(signature[0]) - if orig_sym.is_array and _sym_is_field(orig_sym): + if module_name: + mod_info = mod_man.get_module_info(module_name) + sym_tab = mod_info.get_psyir().symbol_table + try: + orig_sym = sym_tab.lookup(signature[0]) + except KeyError: + # TODO #2120: We likely couldn't parse the module. + print(f"Error finding symbol '{sig_str}' in " + f"'{module_name}'.") + orig_sym = None + else: + orig_sym = original_symbol_table.lookup(signature[0]) + + if orig_sym and orig_sym.is_array and _sym_is_field(orig_sym): # This is a field vector, so add all individual fields upper = int(orig_sym.datatype.shape[0].upper.value) for i in range(1, upper+1): @@ -554,8 +635,22 @@ def _sym_is_field(sym): Reference(sym)]) continue - sym = symbol_table.lookup_with_tag(str(signature)) - name_lit = Literal(str(signature), CHARACTER_TYPE) + if module_name: + tag = f"{signature[0]}@{module_name}" + try: + sym = symbol_table.lookup_with_tag(tag) + except KeyError: + print(f"Cannot find symbol with tag '{tag}' - likely " + f"a symptom of an earlier parsing problem.") + # TODO #2120: Better error handling, at this stage + # we likely could not find a module variable (e.g. 
+ # because we couldn't successfully parse the module) + # and will have inconsistent/missing declarations. + continue + name_lit = Literal(tag, CHARACTER_TYPE) + else: + sym = symbol_table.lookup_with_tag(str(signature)) + name_lit = Literal(str(signature), CHARACTER_TYPE) self._add_call(program, read_var, [name_lit, Reference(sym)]) # Then handle all variables that are written (note that some @@ -568,13 +663,19 @@ def _sym_is_field(sym): # file. The content of these two variables should be identical # at the end. output_symbols = [] - for signature in read_write_info.signatures_written: + + for module_name, signature in read_write_info.write_list: # Find the right symbol for the variable. Note that all variables # in the input and output list have been detected as being used # when the variable accesses were analysed. Therefore, these # variables have References, and will already have been declared # in the symbol table (in _add_all_kernel_symbols). - orig_sym = original_symbol_table.lookup(signature[0]) + if module_name: + mod_info = mod_man.get_module_info(module_name) + sym_tab = mod_info.get_psyir().symbol_table + orig_sym = sym_tab.lookup(signature[0]) + else: + orig_sym = original_symbol_table.lookup(signature[0]) is_input = read_write_info.is_read(signature) if orig_sym.is_array and _sym_is_field(orig_sym): # This is a field vector, so handle each individual field @@ -585,13 +686,15 @@ def _sym_is_field(sym): sym_tuple = \ self._create_output_var_code(flattened, program, is_input, read_var, - postfix, index=i) + postfix, index=i, + module_name=module_name) output_symbols.append(sym_tuple) else: sig_str = str(signature) - sym_tuple = self._create_output_var_code(str(signature), - program, is_input, - read_var, postfix) + sym_tuple = \ + self._create_output_var_code(str(signature), program, + is_input, read_var, postfix, + module_name=module_name) output_symbols.append(sym_tuple) return output_symbols @@ -652,6 +755,7 @@ def 
_add_precision_symbols(symbol_table): if name not in ["r_quad", "r_phys"]] for prec_name in all_precisions: symbol_table.new_symbol(prec_name, + tag=f"{prec_name}@{mod_name}", symbol_type=DataSymbol, datatype=INTEGER_TYPE, interface=ImportInterface(constant_mod)) @@ -816,7 +920,7 @@ def create(self, nodes, read_write_info, prefix, postfix, region_name): self._import_modules(program.scope.symbol_table, schedule_copy) self._add_precision_symbols(program.scope.symbol_table) self._add_all_kernel_symbols(schedule_copy, program_symbol_table, - proxy_name_mapping) + proxy_name_mapping, read_write_info) root_name = prefix + "psy_data" psy_data = program_symbol_table.new_symbol(root_name=root_name, @@ -873,7 +977,7 @@ def collect_all_required_modules(file_container): # ------------------------------------------------------------------------- def get_driver_as_string(self, nodes, read_write_info, prefix, postfix, region_name, writer=FortranWriter()): - # pylint: disable=too-many-arguments + # pylint: disable=too-many-arguments, too-many-locals '''This function uses the `create()` function to get the PSyIR of a stand-alone driver, and then uses the provided language writer to create a string representation in the selected language @@ -909,20 +1013,9 @@ def get_driver_as_string(self, nodes, read_write_info, prefix, postfix, :returns: the driver in the selected language. :rtype: str - :raises NotImplementedError: if the driver creation fails. - ''' - try: - file_container = self.create(nodes, read_write_info, prefix, - postfix, region_name) - # TODO #2120 (Handle failures in Kernel Extraction): Now that all - # built-ins are lowered, an alternative way of triggering a - # NotImplementedError is needed. 
- except NotImplementedError: - # print(f"Cannot create driver for '{region_name[0]}-" - # f"{region_name[1]}' because:") - # print(str(err)) - return "" + file_container = self.create(nodes, read_write_info, prefix, + postfix, region_name) module_dependencies = self.collect_all_required_modules(file_container) # Sort the modules by dependencies, i.e. start with modules @@ -933,8 +1026,9 @@ def get_driver_as_string(self, nodes, read_write_info, prefix, postfix, sorted_modules = mod_manager.sort_modules(module_dependencies) # Inline all required modules into the driver source file so that - # it is stand-alone: + # it is stand-alone. out = [] + for module in sorted_modules: # Note that all modules in `sorted_modules` are known to be in # the module manager, so we can always get the module info here. @@ -983,12 +1077,6 @@ def write_driver(self, nodes, read_write_info, prefix, postfix, postfix, region_name, writer=writer) fll = FortLineLength() code = fll.process(code) - if not code: - # This indicates an error that was already printed, - # so ignore it here. - # TODO #2120 (Handle failures in Kernel Extraction): revisit - # how this is handled in 'get_driver_as_string'. 
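The dependency-ordering step above ("start with modules that have no dependencies") is in essence a topological sort. A minimal sketch of what a `sort_modules`-style routine must achieve, assuming a plain dict of dependencies rather than the real ModuleManager (all names illustrative):

```python
from collections import deque


def sort_modules(deps):
    """Order modules so each appears after everything it depends on,
    letting the inlined driver source compile in a single pass.

    `deps` maps a module name to the set of modules it uses. Unknown
    dependencies (e.g. external modules such as netcdf that are
    deliberately ignored) are filtered out first.
    """
    pending = {mod: {d for d in used if d in deps}
               for mod, used in deps.items()}
    ready = deque(mod for mod, used in pending.items() if not used)
    order = []
    while ready:
        mod = ready.popleft()
        order.append(mod)
        # Releasing 'mod' may make other modules ready:
        for other, used in pending.items():
            if mod in used:
                used.remove(mod)
                if not used:
                    ready.append(other)
    if len(order) != len(deps):
        raise ValueError("circular module dependency detected")
    return order
```

Any module list sorted this way can then be concatenated into the stand-alone driver file in order.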
- return module_name, local_name = region_name with open(f"driver-{module_name}-{local_name}.F90", "w", encoding='utf-8') as out: diff --git a/src/psyclone/domain/lfric/transformations/lfric_extract_trans.py b/src/psyclone/domain/lfric/transformations/lfric_extract_trans.py index f0a443690e..fbb648901f 100644 --- a/src/psyclone/domain/lfric/transformations/lfric_extract_trans.py +++ b/src/psyclone/domain/lfric/transformations/lfric_extract_trans.py @@ -148,13 +148,13 @@ def apply(self, nodes, options=None): my_options["region_name"] = region_name my_options["prefix"] = my_options.get("prefix", "extract") # Get the input- and output-parameters of the node list - read_write_info = ctu.get_in_out_parameters(nodes) + read_write_info = \ + ctu.get_in_out_parameters(nodes, collect_non_local_symbols=True) # Determine a unique postfix to be used for output variables # that avoid any name clashes postfix = ExtractTrans.determine_postfix(read_write_info, postfix="_post") my_options["post_var_postfix"] = postfix - if my_options.get("create_driver", False): # We need to create the driver before inserting the ExtractNode # (since some of the visitors used in driver creation do not diff --git a/src/psyclone/psyir/nodes/extract_node.py b/src/psyclone/psyir/nodes/extract_node.py index 0a6315efc3..8c28e777ce 100644 --- a/src/psyclone/psyir/nodes/extract_node.py +++ b/src/psyclone/psyir/nodes/extract_node.py @@ -55,13 +55,13 @@ class ExtractNode(PSyDataNode): ''' - This class can be inserted into a Schedule to mark Nodes for \ - code extraction using the ExtractRegionTrans transformation. By \ - applying the transformation the Nodes marked for extraction become \ + This class can be inserted into a Schedule to mark Nodes for + code extraction using the ExtractRegionTrans transformation. By + applying the transformation the Nodes marked for extraction become children of (the Schedule of) an ExtractNode. 
- :param ast: reference into the fparser2 parse tree corresponding to \ - this node. + :param ast: reference into the fparser2 parse tree corresponding to + this node. :type ast: sub-class of :py:class:`fparser.two.Fortran2003.Base` :param children: the PSyIR nodes that are children of this node. :type children: list of :py:class:`psyclone.psyir.nodes.Node` @@ -69,15 +69,19 @@ class ExtractNode(PSyDataNode): :type parent: :py:class:`psyclone.psyir.nodes.Node` :param options: a dictionary with options provided via transformations. :type options: Optional[Dict[str, Any]] - :param str options["prefix"]: a prefix to use for the PSyData module name \ + :param str options["prefix"]: a prefix to use for the PSyData module name (``prefix_psy_data_mod``) and the PSyDataType - (``prefix_PSyDataType``) - a "_" will be added automatically. \ - It defaults to "extract", which means the module name used will be \ + (``prefix_PSyDataType``) - a "_" will be added automatically. + It defaults to "extract", which means the module name used will be ``extract_psy_data_mode``, and the data type ``extract_PSyDataType``. - :param str options["post_var_postfix"]: a postfix to be used when \ - creating names to store values of output variable. A variable 'a' \ - would store its value as 'a', and its output values as 'a_post' with \ + :param str options["post_var_postfix"]: a postfix to be used when + creating names to store values of output variable. A variable 'a' + would store its value as 'a', and its output values as 'a_post' with the default post_var_postfix of '_post'. + :param options["read_write_info"]: information about variables that are + read and/or written in the instrumented code. + :type options["read_write_info"]: + py:class:`psyclone.psyir.tools.ReadWriteInfo` ''' # Textual description of the node. 
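The `post_var_postfix` option documented above exists to avoid clashes such as a program that already contains both `a` and `a_post`. How a clash-free postfix could be chosen can be sketched as follows — the numbering scheme is illustrative only; the real `ExtractTrans.determine_postfix` works on a ReadWriteInfo object rather than a set of names:

```python
def determine_postfix(existing_names, postfix="_post"):
    """Pick a postfix so that '<var><postfix>' collides with no
    existing symbol (sketch of the idea, not the PSyclone routine)."""
    candidate = postfix
    attempt = 0
    while any(f"{name}{candidate}" in existing_names
              for name in existing_names):
        attempt += 1
        candidate = f"{postfix}{attempt}"
    return candidate
```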
@@ -102,10 +106,13 @@ def __init__(self, ast=None, children=None, parent=None, options=None): # variable 'a' exists, which creates 'a_out' for the output variable, # which would clash with a variable 'a_out' used in the program unit). - if options: - self._post_name = options.get("post_var_postfix", "_post") - else: - self._post_name = "_post" + if options is None: + options = {} + + self._post_name = options.get("post_var_postfix", "_post") + + # Keep a copy of the argument list: + self._read_write_info = options.get("read_write_info") def __eq__(self, other): ''' @@ -150,15 +157,25 @@ def gen_code(self, parent): :type parent: :py:class:`psyclone.psyir.nodes.Node`. ''' - # Avoid circular dependency - # pylint: disable=import-outside-toplevel - from psyclone.psyir.tools.call_tree_utils import CallTreeUtils - # Determine the variables to write: - ctu = CallTreeUtils() - read_write_info = \ - ctu.get_in_out_parameters(self, options=self.options) - options = {'pre_var_list': read_write_info.read_list, - 'post_var_list': read_write_info.write_list, + if self._read_write_info is None: + # Typically, _read_write_info should be set at the constructor, + # but some tests do not provide the required information. To + # support these tests, allow creation of the read_write info here. + # We cannot do this in the constructor, since at construction + # time of this node it is not yet part of the PSyIR tree, so it + # does not have children from which we can collect the input/output + # parameters. 
+ + # Avoid circular dependency + # pylint: disable=import-outside-toplevel + from psyclone.psyir.tools.call_tree_utils import CallTreeUtils + # Determine the variables to write: + ctu = CallTreeUtils() + self._read_write_info = \ + ctu.get_in_out_parameters(self, options=self.options) + + options = {'pre_var_list': self._read_write_info.read_list, + 'post_var_list': self._read_write_info.write_list, 'post_var_postfix': self._post_name} parent.add(CommentGen(parent, "")) @@ -179,16 +196,24 @@ def lower_to_language_level(self): :rtype: :py:class:`psyclone.psyir.node.Node` ''' - # Avoid circular dependency - # pylint: disable=import-outside-toplevel - from psyclone.psyir.tools.call_tree_utils import CallTreeUtils - # Determine the variables to write: - ctu = CallTreeUtils() - read_write_info = \ - ctu.get_in_out_parameters(self, options=self.options) - - options = {'pre_var_list': read_write_info.read_list, - 'post_var_list': read_write_info.write_list, + if self._read_write_info is None: + # Typically, _read_write_info should be set at the constructor, + # but some tests do not provide the required information. 
To + # support these tests, allow creation of the read_write info + # here (it can't be done in the constructor, since this node + # is not yet integrated into the PSyIR, so the dependency tool + # cannot determine variable usage at that time): + + # Avoid circular dependency + # pylint: disable=import-outside-toplevel + from psyclone.psyir.tools.call_tree_utils import CallTreeUtils + # Determine the variables to write: + ctu = CallTreeUtils() + self._read_write_info = \ + ctu.get_in_out_parameters(self, options=self.options) + + options = {'pre_var_list': self._read_write_info.read_list, + 'post_var_list': self._read_write_info.write_list, 'post_var_postfix': self._post_name} return super().lower_to_language_level(options) diff --git a/src/psyclone/psyir/nodes/psy_data_node.py b/src/psyclone/psyir/nodes/psy_data_node.py index 2d8090991e..e4213fec9c 100644 --- a/src/psyclone/psyir/nodes/psy_data_node.py +++ b/src/psyclone/psyir/nodes/psy_data_node.py @@ -48,6 +48,7 @@ from fparser.two.parser import ParserFactory from psyclone.configuration import Config +from psyclone.core import Signature from psyclone.errors import InternalError, GenerationError from psyclone.f2pygen import CallGen, TypeDeclGen, UseGen from psyclone.psyir.nodes.codeblock import CodeBlock @@ -80,17 +81,17 @@ class PSyDataNode(Statement): :type ast: sub-class of :py:class:`fparser.two.Fortran2003.Base` :param children: the PSyIR nodes that are children of this node. These \ will be made children of the child Schedule of this PSyDataNode. - :type children: List[:py:class:`psyclone.psyir.nodes.Node`] + :type children: list[:py:class:`psyclone.psyir.nodes.Node`] :param parent: the parent of this node in the PSyIR tree. :type parent: :py:class:`psyclone.psyir.nodes.Node` :param options: a dictionary with options for transformations. 
- :type options: Optional[Dict[str, Any]] + :type options: Optional[dict[str, Any]] :param str options["prefix"]: a prefix to use for the PSyData module name \ (``prefix_psy_data_mod``) and the PSyDataType \ (``prefix_PSyDataType``) - a "_" will be added automatically. \ It defaults to "", which means the module name used will just be \ ``psy_data_mod``, and the data type ``PSyDataType``. - :param Tuple[str,str] options["region_name"]: an optional name to \ + :param tuple[str,str] options["region_name"]: an optional name to \ use for this PSyDataNode, provided as a 2-tuple containing a \ module name followed by a local name. The pair of strings should \ uniquely identify a region unless aggregate information is required \ @@ -217,7 +218,7 @@ def __eq__(self, other): @property def options(self): ''':returns: the option dictionary of this class. - :rtype: Dict[str,Any] + :rtype: dict[str,Any] ''' return self._options @@ -272,7 +273,7 @@ def create(cls, children, symbol_table, ast=None, options=None): :param children: the PSyIR nodes that will become children of the \ new PSyData node. - :type children: List[:py:class:`psyclone.psyir.nodes.Node`] + :type children: list[:py:class:`psyclone.psyir.nodes.Node`] :param symbol_table: the associated SymbolTable to which symbols \ must be added. :type symbol_table: :py:class:`psyclone.psyir.symbols.SymbolTable` @@ -280,13 +281,13 @@ def create(cls, children, symbol_table, ast=None, options=None): instrumented with PSyData calls. :type ast: :py:class:`fparser.two.Fortran2003.Base` :param options: a dictionary with options for transformations. - :type options: Optional[Dict[str, Any]] + :type options: Optional[dict[str, Any]] :param str options[prefix"]: a prefix to use for the PSyData module \ name (``prefix_psy_data_mod``) and the PSyDataType \ (``prefix_PSyDataType``) - a "_" will be added automatically. \ It defaults to "", which means the module name used will just be \ ``psy_data_mod``, and the data type ``PSyDataType``. 
- :param Tuple[str,str] options["region_name"]: an optional name to use \ + :param tuple[str,str] options["region_name"]: an optional name to use \ for this PSyDataNode, provided as a 2-tuple containing a module \ name followed by a local name. The pair of strings should \ uniquely identify a region unless aggregate information is \ @@ -393,7 +394,7 @@ def add_psydata_class_prefix(self, symbol): def region_identifier(self): ''':returns: the unique region identifier, which is a tuple \ consisting of the module name and region name. - :rtype: Tuple[str, str] + :rtype: tuple[str, str] ''' return self._region_identifier @@ -467,14 +468,49 @@ def _add_call(self, name, parent, arguments=None): :param parent: parent node into which to insert the calls. :type parent: :py:class:`psyclone.f2pygen.BaseGen` :param arguments: optional arguments for the method call. - :type arguments: Optional[List[str]] + :type arguments: Optional[list[str]] ''' call = CallGen(parent, f"{self._var_name}%{name}", arguments) parent.add(call) + # ------------------------------------------------------------------------- + def _create_unique_names(self, var_list, symbol_table): + '''This function takes a list of (module_name, signature) tuple, and + for any name that is already in the symbol table (i.e. creating a + name clash), creates a new, unique symbol and adds it to the symbol + table with a tag "symbol@module". It returns a list of three-element + entries: module name, original_signature, unique_signature. The unique + signature and original signature are identical if the original name + was not in the symbol table. + + :param var_list: the variable information list. + :type var_list: list[tuple[str, :py:class:`psyclone.core.Signature`]] + :param symbol_table: the symbol table used to create unique names in + :type symbol_table: :py:class:`psyclone.psyir.symbols.SymbolTable` + + :returns: a new list which has a unique signature name added. 
+ :rtype: list[tuple[str, :py:class:`psyclone.core.Signature`, + :py:class:`psyclone.core.Signature`]] + + ''' + out_list = [] + for (module_name, signature) in var_list: + if module_name: + var_symbol = \ + symbol_table.find_or_create_tag(tag=f"{signature[0]}" + f"@{module_name}", + root_name=signature[0]) + unique_sig = Signature(var_symbol.name, signature[1:]) + else: + # This is a local variable anyway, no need to rename: + unique_sig = signature + out_list.append((module_name, signature, unique_sig)) + return out_list + # ------------------------------------------------------------------------- def gen_code(self, parent, options=None): # pylint: disable=arguments-differ, too-many-branches + # pylint: disable=too-many-statements '''Creates the PSyData code before and after the children of this node. @@ -485,17 +521,17 @@ def gen_code(self, parent, options=None): :param parent: the parent of this node in the f2pygen AST. :type parent: :py:class:`psyclone.f2pygen.BaseGen` :param options: a dictionary with options for transformations. - :type options: Optional[Dict[str, Any]] + :type options: Optional[dict[str, Any]] :param options["pre_var_list"]: container name and variable name to \ be supplied before the first child. The container name is \ supported to be able to handle variables that are imported from \ a different container (module in Fortran). - :type options["pre_var_list"]: List[Tuple[str, str]] + :type options["pre_var_list"]: list[tuple[str, str]] :param options["post_var_list"]: container name and variable name to \ be supplied after the last child. The container name is \ supported to be able to handle variables that are imported from \ a different container (module in Fortran). - :type options["post_var_list"]: List[Tuple[str, str]] + :type options["post_var_list"]: list[tuple[str, str]] :param str options["pre_var_postfix"]: an optional postfix that will \ be added to each variable name in the pre_var_list. 
:param str options["post_var_postfix"]: an optional postfix that will \ @@ -505,14 +541,13 @@ # Avoid circular dependency # pylint: disable=import-outside-toplevel from psyclone.psyGen import Kern, InvokeSchedule - # TODO: #415 Support different classes of PSyData calls. invoke = self.ancestor(InvokeSchedule).invoke - module_name = self._module_name - if module_name is None: + global_module_name = self._module_name + if global_module_name is None: # The user has not supplied a module (location) name so # return the psy-layer module name as this will be unique # for each PSyclone algorithm file. - module_name = invoke.invokes.psy.name + global_module_name = invoke.invokes.psy.name region_name = self._region_name if region_name is None: @@ -540,10 +575,33 @@ if not options: options = {} - pre_variable_list = options.get("pre_var_list", []) - post_variable_list = options.get("post_var_list", []) + # Get the list of variables, and handle name clashes: a newly
Convert the lists + # of 2-tuples (module_name, signature) to a list of 3-tuples + # (module_name, signature, unique_signature): + + symbol_table = self.scope.symbol_table + pre_variable_list = \ + self._create_unique_names(options.get("pre_var_list", []), + symbol_table) + post_variable_list = \ + self._create_unique_names(options.get("post_var_list", []), + symbol_table) + pre_suffix = options.get("pre_var_postfix", "") post_suffix = options.get("post_var_postfix", "") + for module_name, signature, unique_signature in (pre_variable_list + + post_variable_list): + if module_name: + if unique_signature != signature: + rename = f"{unique_signature[0]}=>{signature[0]}" + use = UseGen(parent, module_name, only=True, + funcnames=[rename]) + else: + use = UseGen(parent, module_name, only=True, + funcnames=[unique_signature[0]]) + parent.add(use) # Note that adding a use statement makes sure it is only # added once, so we don't need to test this here! @@ -560,11 +618,11 @@ def gen_code(self, parent, options=None): parent.add(var_decl) self._add_call("PreStart", parent, - [f"\"{module_name}\"", + [f"\"{global_module_name}\"", f"\"{region_name}\"", len(pre_variable_list), len(post_variable_list)]) - self.set_region_identifier(module_name, region_name) + self.set_region_identifier(global_module_name, region_name) has_var = pre_variable_list or post_variable_list # Each variable name can be given a suffix. The reason for @@ -581,18 +639,27 @@ def gen_code(self, parent, options=None): # values of a variable "A" as "A" in the pre-variable list, # and store the modified value of "A" later as "A_post". 
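The clash handling introduced in this hunk — rename an imported symbol, remember it under a `name@module` tag, and emit a Fortran rename in the generated use statement — can be sketched with a toy symbol table. None of this is the real SymbolTable/UseGen API; it only illustrates the mechanism:

```python
from collections import namedtuple

Symbol = namedtuple("Symbol", "name")


class TinyTable:
    """Minimal stand-in for a symbol table with tag support."""

    def __init__(self, names):
        self._names = set(names)   # names already in scope
        self._tags = {}            # tag -> Symbol

    def find_or_create_tag(self, tag, root_name):
        if tag in self._tags:
            return self._tags[tag]
        # On a clash, append a counter to create a fresh name:
        name, i = root_name, 0
        while name in self._names:
            i += 1
            name = f"{root_name}_{i}"
        sym = Symbol(name)
        self._names.add(name)
        self._tags[tag] = sym
        return sym


def create_unique_names(var_list, table):
    """For each (module_name, name) pair, rename imports that clash
    with a local symbol, keyed by the 'name@module' tag."""
    out = []
    for module_name, name in var_list:
        if module_name:
            sym = table.find_or_create_tag(f"{name}@{module_name}",
                                           root_name=name)
            out.append((module_name, name, sym.name))
        else:
            # Local variable, no renaming required:
            out.append((module_name, name, name))
    return out


def use_statement(module_name, orig, unique):
    """Fortran 'use' line the generated code would need, with a
    rename when the unique local name differs from the import."""
    if unique != orig:
        return f"use {module_name}, only: {unique}=>{orig}"
    return f"use {module_name}, only: {orig}"
```

Because the renamed symbol is cached under its tag, repeated requests for the same imported variable always resolve to the same unique name.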
if has_var: - for _, var_name in pre_variable_list: + for module_name, sig, unique_sig in pre_variable_list: + if module_name: + module_name = f"@{module_name}" self._add_call("PreDeclareVariable", parent, - [f"\"{var_name}{pre_suffix}\"", var_name]) - for _, var_name in post_variable_list: + [f"\"{sig}{module_name}{pre_suffix}\"", + unique_sig]) + for module_name, sig, unique_sig in post_variable_list: + if module_name: + module_name = f"@{module_name}" self._add_call("PreDeclareVariable", parent, - [f"\"{var_name}{post_suffix}\"", var_name]) + [f"\"{sig}{post_suffix}{module_name}\"", + unique_sig]) self._add_call("PreEndDeclaration", parent) - for _, var_name in pre_variable_list: + for module_name, sig, unique_sig in pre_variable_list: + if module_name: + module_name = f"@{module_name}" self._add_call("ProvideVariable", parent, - [f"\"{var_name}{pre_suffix}\"", var_name]) + [f"\"{sig}{module_name}{pre_suffix}\"", + unique_sig]) self._add_call("PreEnd", parent) @@ -602,13 +669,17 @@ def gen_code(self, parent, options=None): if has_var: # Only add PostStart() if there is at least one variable. self._add_call("PostStart", parent) - for _, var_name in post_variable_list: + for module_name, sig, unique_sig in post_variable_list: + if module_name: + module_name = f"@{module_name}" self._add_call("ProvideVariable", parent, - [f"\"{var_name}{post_suffix}\"", var_name]) + [f"\"{sig}{post_suffix}{module_name}\"", + unique_sig]) self._add_call("PostEnd", parent) def lower_to_language_level(self, options=None): + # pylint: disable=arguments-differ # pylint: disable=too-many-branches, too-many-statements ''' Lowers this node (and all children) to language-level PSyIR. The @@ -620,17 +691,17 @@ def lower_to_language_level(self, options=None): PSyDataNode. :param options: dictionary of the PSyData generation options. 
- :type options: Optional[Dict[str, Any]] + :type options: Optional[dict[str, Any]] :param options["pre_var_list"]: container- and variable-names to be \ supplied before the first child. The container names are \ supported to be able to handle variables that are imported from \ a different container (module in Fortran). - :type options["pre_var_list"]: List[Tuple[str, str]] + :type options["pre_var_list"]: list[tuple[str, str]] :param options["post_var_list"]: container- and variable-names to be \ supplied after the last child. The container names are \ supported to be able to handle variables that are imported from \ a different container (module in Fortran). - :type options["post_var_list"]: List[Tuple[str, str]] + :type options["post_var_list"]: list[tuple[str, str]] :param str options["pre_var_postfix"]: an optional postfix that will \ be added to each variable name in the pre_var_list. :param str options["post_var_postfix"]: an optional postfix that will \ @@ -651,10 +722,10 @@ def gen_type_bound_call(typename, methodname, argument_list=None, :param str typename: the name of the base type. :param str methodname: the name of the method to be called. :param argument_list: the list of arguments in the method call. - :type argument_list: List[str] + :type argument_list: list[str] :param annotations: the list of node annotations to add to the \ generated CodeBlock. - :type annotations: List[str] + :type annotations: list[str] :returns: a CodeBlock representing the type bound call. 
:rtype: :py:class:`psyclone.psyir.nodes.CodeBlock` @@ -708,8 +779,14 @@ def gen_type_bound_call(typename, methodname, argument_list=None, if not options: options = {} - pre_variable_list = options.get("pre_var_list", []) - post_variable_list = options.get("post_var_list", []) + symbol_table = self.scope.symbol_table + pre_variable_list = \ + self._create_unique_names(options.get("pre_var_list", []), + symbol_table) + post_variable_list = \ + self._create_unique_names(options.get("post_var_list", []), + symbol_table) + pre_suffix = options.get("pre_var_postfix", "") post_suffix = options.get("post_var_postfix", "") @@ -741,25 +818,31 @@ def gen_type_bound_call(typename, methodname, argument_list=None, # values of a variable "A" as "A" in the pre-variable list, # and store the modified value of "A" later as "A_post". if has_var: - for _, var_name in pre_variable_list: + for module_name, sig, unique_sig in pre_variable_list: + if module_name: + module_name = f"@{module_name}" call = gen_type_bound_call( self._var_name, "PreDeclareVariable", - [f"\"{var_name}{pre_suffix}\"", var_name]) + [f"\"{sig}{module_name}{pre_suffix}\"", unique_sig]) self.parent.children.insert(self.position, call) - for _, var_name in post_variable_list: + for module_name, sig, unique_sig in post_variable_list: + if module_name: + module_name = f"@{module_name}" call = gen_type_bound_call( self._var_name, "PreDeclareVariable", - [f"\"{var_name}{post_suffix}\"", var_name]) + [f"\"{sig}{module_name}{post_suffix}\"", unique_sig]) self.parent.children.insert(self.position, call) call = gen_type_bound_call(self._var_name, "PreEndDeclaration") self.parent.children.insert(self.position, call) - for _, var_name in pre_variable_list: + for module_name, sig, unique_sig in pre_variable_list: + if module_name: + module_name = f"@{module_name}" call = gen_type_bound_call( self._var_name, "ProvideVariable", - [f"\"{var_name}{pre_suffix}\"", var_name]) + [f"\"{sig}{module_name}{pre_suffix}\"", unique_sig]) 
self.parent.children.insert(self.position, call) call = gen_type_bound_call(self._var_name, "PreEnd") @@ -774,10 +857,12 @@ def gen_type_bound_call(typename, methodname, argument_list=None, # Only add PostStart() if there is at least one variable. call = gen_type_bound_call(self._var_name, "PostStart") self.parent.children.insert(self.position, call) - for _, var_name in post_variable_list: + for module_name, sig, unique_sig in post_variable_list: + if module_name: + module_name = f"@{module_name}" call = gen_type_bound_call( self._var_name, "ProvideVariable", - [f"\"{var_name}{post_suffix}\"", var_name]) + [f"\"{sig}{module_name}{post_suffix}\"", unique_sig]) self.parent.children.insert(self.position, call) # PSyData end call diff --git a/src/psyclone/psyir/tools/call_tree_utils.py b/src/psyclone/psyir/tools/call_tree_utils.py index 350af9d235..255c86e1d5 100644 --- a/src/psyclone/psyir/tools/call_tree_utils.py +++ b/src/psyclone/psyir/tools/call_tree_utils.py @@ -229,14 +229,27 @@ def get_output_parameters(self, read_write_info, node_list, read_write_info.add_write(signature) # ------------------------------------------------------------------------- - def get_in_out_parameters(self, node_list, options=None): + def get_in_out_parameters(self, node_list, collect_non_local_symbols=False, + options=None): '''Returns a ReadWriteInfo object that contains all variables that are input and output parameters to the specified node list. This function calls `get_input_parameter` and `get_output_parameter`, but avoids the - repeated computation of the variable usage. + repeated computation of the variable usage. If + `collect_non_local_symbols` is set to True, the code will also include + non-local symbols used directly or indirectly, i.e. it will follow the + call tree as much as possible (e.g. 
it cannot resolve a procedure + pointer, since then it is not known which function is actually called) + and collect any other variables that will be read or written when + executing the nodes specified in the node list. The corresponding + module name for these variables will be included in the ReadWriteInfo + result object. For this to work it is essential that the correct + search paths are specified for the module manager. :param node_list: list of PSyIR nodes to be analysed. :type node_list: list[:py:class:`psyclone.psyir.nodes.Node`] + :param bool collect_non_local_symbols: whether non-local symbols + (i.e. symbols used in other modules either directly or + indirectly) should be included in the in/out information. :param options: a dictionary with options for the CallTreeUtils which will also be used when creating the VariablesAccessInfo instance if required. @@ -255,6 +268,9 @@ def get_in_out_parameters(self, node_list, options=None): read_write_info = ReadWriteInfo() self.get_input_parameters(read_write_info, node_list, variables_info) self.get_output_parameters(read_write_info, node_list, variables_info) + if collect_non_local_symbols: + self.get_non_local_read_write_info(node_list, read_write_info) + return read_write_info # ------------------------------------------------------------------------- @@ -283,12 +299,14 @@ def get_non_local_read_write_info(self, node_list, read_write_info): # by querying the module that contains the kernel: try: mod_info = mod_manager.get_module_info(kernel.module_name) - except FileNotFoundError: + except FileNotFoundError as err: # TODO #11: Add proper logging # TODO #2120: Handle error print(f"[CallTreeUtils.get_non_local_read_write_info] " f"Could not find module '{kernel.module_name}' - " f"ignored.") + # This includes the currently defined search path: + print(str(err)) continue # TODO #2435: once we have interface support, this will be @@ -322,7 +340,8 @@ def _resolve_calls_and_unknowns(self, outstanding_nonlocals, 
:type read_write_info: :py:class:`psyclone.psyir.tools.ReadWriteInfo` ''' - # pylint: disable=too-many-branches + # pylint: disable=too-many-branches, too-many-locals + # pylint: disable=too-many-statements mod_manager = ModuleManager.get() done = set() # Using a set here means that duplicated entries will automatically @@ -354,17 +373,29 @@ def _resolve_calls_and_unknowns(self, outstanding_nonlocals, print(f"[CallTreeUtils._resolve_calls_and_unknowns] " f"Cannot find module '{module_name}' - ignored.") continue - routine = mod_info.get_psyir().get_routine_psyir(signature[0]) - if routine: - # Add the list of non-locals to our todo list: - outstanding_nonlocals.extend( - self.get_non_local_symbols(routine)) - else: - # TODO #11: Add proper logging - # TODO #2120: Handle error - print(f"[CallTreeUtils._resolve_calls_and_unknowns] " - f"Cannot find symbol '{signature[0]}' in module " - f"'{module_name}' - ignored.") + + # Add the list of non-locals to our todo list. Handle + # generic interfaces: + all_routines = mod_info.resolve_routine(signature[0]) + for routine_name in all_routines: + # We need a try statement here in case that a whole + # module becomes a CodeBlock + try: + routine = mod_info.get_psyir().\ + get_routine_psyir(routine_name) + except AttributeError: + # TODO #2120: Handle error + routine = None + if routine: + # Add the list of non-locals to our todo list: + outstanding_nonlocals.extend( + self.get_non_local_symbols(routine)) + else: + # TODO #11: Add proper logging + # TODO #2120: Handle error + print(f"[CallTreeUtils._resolve_calls_and_unknowns] " + f"Cannot find symbol '{signature[0]}' in module " + f"'{module_name}' - ignored.") continue if external_type == "unknown": @@ -386,6 +417,19 @@ def _resolve_calls_and_unknowns(self, outstanding_nonlocals, outstanding_nonlocals.append(("routine", module_name, signature, access_info)) continue + # Check if it is a constant (the symbol should always be found, + # but if a module cannot be parsed then 
the symbol table won't + # have been populated) + sym_tab = \ + mod_info.get_psyir().symbol_table + try: + sym = sym_tab.lookup(signature[0]) + if sym.is_constant: + continue + except KeyError: + print(f"Unable to check if signature '{signature}' " + f"is constant.") + sym = None # Otherwise fall through to the code that adds a reference: # Now it must be a reference, so add it to the list of input- diff --git a/src/psyclone/psyir/tools/dependency_tools.py b/src/psyclone/psyir/tools/dependency_tools.py index d149146c35..af101023c8 100644 --- a/src/psyclone/psyir/tools/dependency_tools.py +++ b/src/psyclone/psyir/tools/dependency_tools.py @@ -51,6 +51,8 @@ from psyclone.psyir.nodes import Loop +# pylint: disable=too-many-lines + class DTCode(IntEnum): '''A simple enum to store the various info, warning and error codes used in the dependency analysis. It is based in IntEnum diff --git a/src/psyclone/tests/domain/lfric/lfric_extract_driver_creator_test.py b/src/psyclone/tests/domain/lfric/lfric_extract_driver_creator_test.py index 122fe3df7e..013f19886b 100644 --- a/src/psyclone/tests/domain/lfric/lfric_extract_driver_creator_test.py +++ b/src/psyclone/tests/domain/lfric/lfric_extract_driver_creator_test.py @@ -44,6 +44,7 @@ from psyclone.domain.lfric import LFRicExtractDriverCreator from psyclone.domain.lfric.transformations import LFRicExtractTrans from psyclone.errors import InternalError +from psyclone.line_length import FortLineLength from psyclone.parse import ModuleManager from psyclone.psyir.nodes import Literal, Routine, Schedule from psyclone.psyir.symbols import INTEGER_TYPE @@ -400,7 +401,7 @@ def test_lfric_driver_operator(): @pytest.mark.usefixtures("change_into_tmpdir", "init_module_manager") def test_lfric_driver_removing_structure_data(): '''Check that array accesses correctly remove the `%data`(which would be - added for builtins using f1_proxy$data(df)). E.g. the following code needs + added for builtins using f1_proxy%data(df)). E.g. 
the following code needs to be created: do df = loop0_start, loop0_stop, 1 f2(df) = a + f1(df) @@ -568,3 +569,175 @@ def test_lfric_driver_field_array_inc(): # does not need any of the infrastructure files build = Compile(".") build.compile_file("driver-field-test.F90") + + +# ---------------------------------------------------------------------------- +@pytest.mark.usefixtures("change_into_tmpdir", "init_module_manager") +def test_lfric_driver_external_symbols(): + '''Test the handling of symbols imported from other modules, or calls to + external functions that use module variables. + + ''' + _, invoke = get_invoke("driver_creation/invoke_kernel_with_imported_" + "symbols.f90", API, dist_mem=False, idx=0) + + extract = LFRicExtractTrans() + extract.apply(invoke.schedule.children[0], + options={"create_driver": True, + "region_name": ("import", "test")}) + code = str(invoke.gen()) + assert ('CALL extract_psy_data%PreDeclareVariable("' + 'module_var_a_post@module_with_var_mod", module_var_a)' in code) + assert ('CALL extract_psy_data%ProvideVariable("' + 'module_var_a_post@module_with_var_mod", module_var_a)' in code) + + filename = "driver-import-test.F90" + with open(filename, "r", encoding='utf-8') as my_file: + driver = my_file.read() + + assert ("call extract_psy_data%ReadVariable('module_var_a_post@" + "module_with_var_mod', module_var_a_post)" in driver) + assert "if (module_var_a == module_var_a_post)" in driver + + # While the actual code is LFRic, the driver is stand-alone, and as such + # does not need any of the infrastructure files + build = Compile(".") + build.compile_file("driver-import-test.F90") + + +# ---------------------------------------------------------------------------- +@pytest.mark.usefixtures("change_into_tmpdir", "init_module_manager") +def test_lfric_driver_external_symbols_name_clash(): + '''Test the handling of symbols imported from other modules, or calls to + external functions that use module variables. 
In this example the external + module uses a variable with the same name as the user code, which causes + a name clash. + + ''' + _, invoke = get_invoke("driver_creation/invoke_kernel_with_imported_" + "symbols.f90", API, dist_mem=False, idx=1) + + extract = LFRicExtractTrans() + extract.apply(invoke.schedule.children[0], + options={"create_driver": True, + "region_name": ("import", "test")}) + code = str(invoke.gen()) + + # Make sure the imported, clashing symbol 'f1_data' is renamed: + assert "USE module_with_name_clash_mod, ONLY: f1_data_1=>f1_data" in code + assert ('CALL extract_psy_data%PreDeclareVariable("f1_data@' + 'module_with_name_clash_mod", f1_data_1)' in code) + assert ('CALL extract_psy_data%ProvideVariable("f1_data@' + 'module_with_name_clash_mod", f1_data_1)' in code) + + # Even though PSyclone cannot find the variable, it should still be + # extracted: + + filename = "driver-import-test.F90" + with open(filename, "r", encoding='utf-8') as my_file: + driver = my_file.read() + + assert ("call extract_psy_data%ReadVariable(" + "'f1_data@module_with_name_clash_mod', f1_data_1)" in driver) + assert ("call extract_psy_data%ReadVariable(" + "'f2_data@module_with_name_clash_mod', f2_data_1)" in driver) + assert ("call extract_psy_data%ReadVariable(" + "'f2_data_post@module_with_name_clash_mod', f2_data_1_post)" + in driver) + + # While the actual code is LFRic, the driver is stand-alone, and as such + # does not need any of the infrastructure files + build = Compile(".") + build.compile_file("driver-import-test.F90") + + +# ---------------------------------------------------------------------------- +@pytest.mark.usefixtures("change_into_tmpdir", "init_module_manager") +def test_lfric_driver_external_symbols_error(capsys): + '''Test the handling of symbols imported from other modules, or calls to + external functions that use module variables. 
In this example, the + external module cannot be parsed by fparser (it contains syntax errors), + resulting in external functions and variables that cannot be found. + + ''' + _, invoke = get_invoke("driver_creation/invoke_kernel_with_imported_" + "symbols_error.f90", API, dist_mem=False, idx=0) + + extract = LFRicExtractTrans() + extract.apply(invoke.schedule.children[0], + options={"create_driver": True, + "region_name": ("import", "test")}) + code = str(invoke.gen()) + # Even though PSyclone cannot find the variable, it should still be + # extracted: + assert ('CALL extract_psy_data%PreDeclareVariable("non_existent_var@' + 'module_with_error_mod", non_existent_var' in code) + assert ('CALL extract_psy_data%ProvideVariable("non_existent_var@' + 'module_with_error_mod", non_existent_var' in code) + + filename = "driver-import-test.F90" + with open(filename, "r", encoding='utf-8') as my_file: + driver = my_file.read() + + # First check output of extraction, which will detect the problems of + # finding variables and functions: + out, _ = capsys.readouterr() + assert ("Cannot find symbol 'non_existent_func' in module " + "'module_with_error_mod' - ignored." in out) + assert ("Error finding symbol 'non_existent_var' in " + "'module_with_error_mod'." in out) + + # This error comes from the driver creation: a variable is in the list + # of variables to be processed, but its type cannot be found. + assert ("Cannot find symbol with tag 'non_existent_var@module_with_" + "error_mod' - likely a symptom of an earlier parsing problem." + in out) + # This variable will be ignored (for now, see TODO 2120) so no code will + # be created for it. The string will still be in the created driver (since + # the module is still inlined), but no ReadVariable code should be created: + assert "call extract_psy_data%ReadVariable('non_existent@" not in driver + + # Note that this driver cannot be compiled, since one of the inlined + # source files is invalid Fortran. 
+
+
+# -----------------------------------------------------------------------------
+@pytest.mark.usefixtures("change_into_tmpdir", "init_module_manager")
+def test_lfric_driver_rename_externals():
+    '''Tests that the non-local symbols used by a routine that renames an
+    imported symbol are reported correctly. Additionally, this also tests
+    a potential name clash: if the renamed symbol already exists in the
+    PSy layer, the symbol also needs to be renamed on import.
+
+    '''
+    # This example calls a subroutine that renames a symbol used from
+    # a different module, i.e.:
+    # use module_with_var_mod, only: renamed_var => module_var_a
+
+    _, invoke = get_invoke("driver_creation/invoke_kernel_rename_symbols.f90",
+                           API, dist_mem=False, idx=0)
+
+    ctu = CallTreeUtils()
+    read_write_info = ctu.get_in_out_parameters(invoke.schedule,
+                                                collect_non_local_symbols=True)
+    driver_creator = LFRicExtractDriverCreator()
+    code = driver_creator.get_driver_as_string(invoke.schedule,
+                                               read_write_info, "extract",
+                                               "_post", ("region", "name"))
+    # The invoking program also contains a variable `module_var_a`. So
+    # the `module_var_a` from the module must be renamed on import and it
+    # becomes `module_var_a_1`.
+    assert ("use module_with_var_mod, only : module_var_a_1=>module_var_a"
+            in code)
+    assert ("call extract_psy_data%ReadVariable("
+            "'module_var_a@module_with_var_mod', module_var_a_1)" in code)
+
+    # While the actual code is LFRic, the driver is stand-alone, and as such
+    # does not need any of the infrastructure files. The string also needs
+    # to be wrapped explicitly (which the driver creation only does
+    # when writing the result to a file).
+ build = Compile(".") + fll = FortLineLength() + code = fll.process(code) + build.string_compiles(code) diff --git a/src/psyclone/tests/psyir/nodes/extract_node_test.py b/src/psyclone/tests/psyir/nodes/extract_node_test.py index 0b1af9b99f..98f8a9e864 100644 --- a/src/psyclone/tests/psyir/nodes/extract_node_test.py +++ b/src/psyclone/tests/psyir/nodes/extract_node_test.py @@ -33,12 +33,73 @@ # ----------------------------------------------------------------------------- # Author A. B. G. Chalk, STFC Daresbury Lab # Modified S. Siso, STFC Daresbury Lab +# Modified J. Henrichs, Bureau of Meteorology # ----------------------------------------------------------------------------- ''' Performs pytest tests on the ExtractNode PSyIR node. ''' from psyclone.psyir.nodes import ExtractNode, Schedule from psyclone.psyir.symbols import SymbolTable +from psyclone.tests.utilities import get_invoke + + +def test_extract_node_constructor(): + '''Tests the constructor of the ExtractNode. ''' + # The constructor must have a read_write_info key + en = ExtractNode() + assert en.post_name == "_post" + assert en._read_write_info is None + + en = ExtractNode(options={'read_write_info': 1}) + assert en._post_name == "_post" + assert en._read_write_info == 1 + + # Add a schedule and test that we get the same object as extract body: + schedule = Schedule() + en.children.append(schedule) + assert en.extract_body is schedule + + +def test_extract_node_gen_code(): + '''Test the ExtractNode's gen_code function if there is no ReadWriteInfo + object specified in the options. 
Since the transformations will always + do that, we need to manually insert the ExtractNode into a schedule: + + ''' + _, invoke = get_invoke("1.0.1_single_named_invoke.f90", + "dynamo0.3", idx=0, dist_mem=False) + loop = invoke.schedule.children[0] + loop.detach() + en = ExtractNode() + en._var_name = "psydata" + en.addchild(Schedule(children=[loop])) + invoke.schedule.addchild(en) + + code = str(invoke.gen()) + expected = [ + 'CALL psydata%PreStart("single_invoke_psy", ' + '"invoke_important_invoke:testkern_code:r0", 17, 2)', + 'CALL psydata%PreDeclareVariable("a", a)', + 'CALL psydata%PreDeclareVariable("f1_data", f1_data)', + 'CALL psydata%PreDeclareVariable("f2_data", f2_data)', + 'CALL psydata%PreDeclareVariable("loop0_start", loop0_start)', + 'CALL psydata%PreDeclareVariable("loop0_stop", loop0_stop)', + 'CALL psydata%PreDeclareVariable("m1_data", m1_data)', + 'CALL psydata%PreDeclareVariable("m2_data", m2_data)', + 'CALL psydata%PreDeclareVariable("map_w1", map_w1)', + 'CALL psydata%PreDeclareVariable("map_w2", map_w2)', + 'CALL psydata%PreDeclareVariable("map_w3", map_w3)', + 'CALL psydata%PreDeclareVariable("ndf_w1", ndf_w1)', + 'CALL psydata%PreDeclareVariable("ndf_w2", ndf_w2)', + 'CALL psydata%PreDeclareVariable("ndf_w3", ndf_w3)', + 'CALL psydata%PreDeclareVariable("nlayers", nlayers)', + 'CALL psydata%PreDeclareVariable("undf_w1", undf_w1)', + 'CALL psydata%PreDeclareVariable("undf_w2", undf_w2)', + 'CALL psydata%PreDeclareVariable("undf_w3", undf_w3)', + 'CALL psydata%PreDeclareVariable("cell_post", cell)', + 'CALL psydata%PreDeclareVariable("f1_data_post", f1_data)'] + for line in expected: + assert line in code def test_extract_node_equality(): diff --git a/src/psyclone/tests/psyir/nodes/psy_data_node_test.py b/src/psyclone/tests/psyir/nodes/psy_data_node_test.py index da477e61a2..e4d0f0f0d3 100644 --- a/src/psyclone/tests/psyir/nodes/psy_data_node_test.py +++ b/src/psyclone/tests/psyir/nodes/psy_data_node_test.py @@ -36,19 +36,22 @@ ''' Module 
containing tests for generating PSyData hooks''' +import os import re import pytest +from psyclone.domain.lfric.transformations import LFRicExtractTrans from psyclone.errors import InternalError, GenerationError from psyclone.f2pygen import ModuleGen from psyclone.psyir.nodes import ( - PSyDataNode, Schedule, Return, Routine, CodeBlock) + CodeBlock, PSyDataNode, Schedule, Return, Routine) +from psyclone.parse import ModuleManager from psyclone.psyir.nodes.statement import Statement from psyclone.psyir.transformations import PSyDataTrans, TransformationError from psyclone.psyir.symbols import ( ContainerSymbol, ImportInterface, SymbolTable, DataTypeSymbol, UnresolvedType, DataSymbol, UnsupportedFortranType) -from psyclone.tests.utilities import get_invoke +from psyclone.tests.utilities import get_base_path, get_invoke # ----------------------------------------------------------------------------- @@ -517,3 +520,77 @@ def test_psy_data_node_lower_to_language_level_with_options(): for codeblock, string in zip(codeblocks, expected): assert string == str(codeblock.ast) + + +# ---------------------------------------------------------------------------- +@pytest.mark.usefixtures("change_into_tmpdir", "clear_module_manager_instance") +def test_psy_data_node_name_clash(fortran_writer): + '''Test the handling of symbols imported from other modules, or calls to + external functions that use module variables. In this example the external + module uses a variable with the same name as the user code, which causes + a name clash and must be renamed. 
+ + ''' + api = "dynamo0.3" + infrastructure_path = get_base_path(api) + # Define the path to the ReadKernelData module (which contains functions + # to read extracted data from a file) relative to the infrastructure path: + psyclone_root = os.path.dirname(os.path.dirname(os.path.dirname( + os.path.dirname(os.path.dirname(infrastructure_path))))) + read_mod_path = os.path.join(psyclone_root, "lib", "extract", "standalone") + + module_manager = ModuleManager.get() + module_manager.add_search_path(infrastructure_path) + module_manager.add_search_path(read_mod_path) + + _, invoke = get_invoke("driver_creation/invoke_kernel_with_imported_" + "symbols.f90", api, dist_mem=False, idx=1) + + extract = LFRicExtractTrans() + extract.apply(invoke.schedule.children[0], + options={"create_driver": True, + "region_name": ("import", "test")}) + + # First test, use the old-style gen_code way: + # ------------------------------------------- + code = str(invoke.gen()) + + # Make sure the imported, clashing symbols 'f1' and 'f2' are renamed: + assert "USE module_with_name_clash_mod, ONLY: f1_data_1=>f1_data" in code + assert "USE module_with_name_clash_mod, ONLY: f2_data_1=>f2_data" in code + assert ('CALL extract_psy_data%PreDeclareVariable("f1_data@' + 'module_with_name_clash_mod", f1_data_1)' in code) + assert ('CALL extract_psy_data%ProvideVariable("f1_data@' + 'module_with_name_clash_mod", f1_data_1)' in code) + assert ('CALL extract_psy_data%PreDeclareVariable("f2_data@' + 'module_with_name_clash_mod", f2_data_1)' in code) + assert ('CALL extract_psy_data%PreDeclareVariable("f2_data_post@' + 'module_with_name_clash_mod", f2_data_1)' in code) + assert ('CALL extract_psy_data%ProvideVariable("f2_data@' + 'module_with_name_clash_mod", f2_data_1)' in code) + assert ('CALL extract_psy_data%ProvideVariable("f2_data_post@' + 'module_with_name_clash_mod", f2_data_1)' in code) + + # Second test, use lower_to_language_level: + # ----------------------------------------- + 
invoke.schedule.children[0].lower_to_language_level()
+
+    # Note that at the moment we cannot apply fortran_writer() to the whole
+    # schedule, since LFRic does not yet fully support this. So we just
+    # lower each child individually:
+    code = "".join([fortran_writer(i) for i in invoke.schedule.children])
+
+    assert ('CALL extract_psy_data % PreDeclareVariable("f1_data_post", '
+            'f1_data)' in code)
+    assert ('CALL extract_psy_data % PreDeclareVariable("f1_data@'
+            'module_with_name_clash_mod", f1_data_1)' in code)
+    assert ('CALL extract_psy_data % PreDeclareVariable("f2_data@'
+            'module_with_name_clash_mod", f2_data_1)' in code)
+    assert ('CALL extract_psy_data % PreDeclareVariable("f2_data@'
+            'module_with_name_clash_mod_post", f2_data_1)' in code)
+
+    assert ('CALL extract_psy_data % ProvideVariable("f1_data@'
+            'module_with_name_clash_mod", f1_data_1)' in code)
+    assert ('CALL extract_psy_data % ProvideVariable("f2_data@'
+            'module_with_name_clash_mod", f2_data_1)' in code)
+    assert ('CALL extract_psy_data % ProvideVariable("f2_data@'
+            'module_with_name_clash_mod_post", f2_data_1)' in code)
diff --git a/src/psyclone/tests/psyir/tools/call_tree_utils_test.py b/src/psyclone/tests/psyir/tools/call_tree_utils_test.py
index 9f2ae9cfa7..ee47d69072 100644
--- a/src/psyclone/tests/psyir/tools/call_tree_utils_test.py
+++ b/src/psyclone/tests/psyir/tools/call_tree_utils_test.py
@@ -39,14 +39,16 @@
 ''' This module contains the pytest tests for the Routine class.
''' import os +import re + import pytest from psyclone.configuration import Config -from psyclone.core import Signature +from psyclone.core import Signature, SingleVariableAccessInfo from psyclone.domain.lfric import LFRicKern from psyclone.parse import ModuleManager from psyclone.psyGen import BuiltIn -from psyclone.psyir.nodes import (Reference, Schedule) +from psyclone.psyir.nodes import CodeBlock, Reference, Schedule from psyclone.psyir.tools import CallTreeUtils, ReadWriteInfo from psyclone.tests.utilities import get_base_path, get_invoke @@ -175,9 +177,11 @@ def test_call_tree_get_used_symbols_from_modules(): non_locals_without_access = set((i[0], i[1], str(i[2])) for i in non_locals) - # Check that the expected symbols, modules and internal type are correct: + # Check that the expected symbols, modules and internal type are correct. + # Note that a constant variable from another module is still reported here expected = set([ ("unknown", "constants_mod", "eps"), + ("unknown", "module_with_var_mod", "module_const"), ("reference", "testkern_import_symbols_mod", "dummy_module_variable"), ('routine', 'testkern_import_symbols_mod', "local_func"), @@ -235,7 +239,14 @@ def test_get_non_local_read_write_info(capsys): psyir, _ = get_invoke(test_file, "dynamo0.3", 0, dist_mem=False) schedule = psyir.invokes.invoke_list[0].schedule - # First call without setting up the module manager. 
This will result + # Set up the module manager with a search directory that does not + # contain any files used here: + kernels_dir = os.path.join(get_base_path("dynamo0.3"), + "kernels", "dead_end", "no_really") + mod_man = ModuleManager.get() + mod_man.add_search_path(kernels_dir) + + # Since the right search path is missing, this will result # in the testkern_import_symbols_mod module not being found: read_write_info = ReadWriteInfo() rw_info = ctu.get_non_local_read_write_info(schedule, read_write_info) @@ -243,7 +254,12 @@ def test_get_non_local_read_write_info(capsys): assert ("Could not find module 'testkern_import_symbols_mod' - ignored." in out) - # Now add the search path of the driver creation tests to the + # The search directories are absolute, so use a regex: + assert re.search("Could not find source file for module " + "'testkern_import_symbols_mod' in any of the " + "directories '.*kernels/dead_end/no_really'.", out) + + # Now add the correct search path of the driver creation tests to the # module manager: test_dir = os.path.join(get_base_path("dynamo0.3"), "driver_creation") mod_man = ModuleManager.get() @@ -271,6 +287,11 @@ def test_get_non_local_read_write_info(capsys): assert (('testkern_import_symbols_mod', Signature("dummy_module_variable")) in rw_info.write_list) + # Make sure that accessing a constant from a different module is + # not included: + assert (('module_with_var_mod', Signature("module_const")) + not in rw_info.read_list) + # Check that we can ignore a module: mod_man.add_ignore_module("constants_mod") rw_info = ReadWriteInfo() @@ -389,8 +410,8 @@ def test_call_tree_utils_inout_parameters_generic(fortran_reader): psyir = fortran_reader.psyir_from_source(source) loops = psyir.children[0].children - ctu = CallTreeUtils() read_write_info_read = ReadWriteInfo() + ctu = CallTreeUtils() ctu.get_input_parameters(read_write_info_read, loops) # Use set to be order independent @@ -441,3 +462,90 @@ def test_call_tree_utils_const_argument(): 
 # Make sure the constant '0' is not listed
 assert "0" not in read_write_info.signatures_read
 assert Signature("0") not in read_write_info.signatures_read
+
+
+# -----------------------------------------------------------------------------
+@pytest.mark.usefixtures("clear_module_manager_instance")
+def test_call_tree_utils_non_local_inout_parameters(capsys):
+    '''Tests the collection of non-local input and output parameters.
+    '''
+    Config.get().api = "dynamo0.3"
+    ctu = CallTreeUtils()
+
+    test_file = os.path.join("driver_creation", "module_with_builtin_mod.f90")
+    psyir, _ = get_invoke(test_file, "dynamo0.3", 0, dist_mem=False)
+    schedule = psyir.invokes.invoke_list[0].schedule
+
+    test_dir = os.path.join(get_base_path("dynamo0.3"), "driver_creation")
+    mod_man = ModuleManager.get()
+    mod_man.add_search_path(test_dir)
+
+    # The example does contain an unknown subroutine (by design), and the
+    # infrastructure directory has not been added, so constants_mod cannot
+    # be found:
+    rw_info = ctu.get_in_out_parameters(schedule,
+                                        collect_non_local_symbols=True)
+    out, _ = capsys.readouterr()
+    assert "Unknown routine 'unknown_subroutine - ignored." in out
+    assert ("Cannot find module 'constants_mod' - ignoring unknown symbol "
+            "'eps'." in out)
+
+    # We don't test the 14 local variables here; this was tested earlier.
+    # Focus on the remote symbols that are read:
+    assert (('module_with_var_mod', Signature("module_var_b"))
+            in rw_info.read_list)
+    # And check the remote symbols that are written:
+    assert (('module_with_var_mod', Signature("module_var_a"))
+            in rw_info.write_list)
+    assert (('module_with_var_mod', Signature("module_var_b"))
+            in rw_info.write_list)
+    assert (('testkern_import_symbols_mod', Signature("dummy_module_variable"))
+            in rw_info.write_list)
+
+
+# -----------------------------------------------------------------------------
+def test_call_tree_error_var_not_found(capsys):
+    '''Tests that trying to import a variable from a module that does not
+    contain the variable is handled correctly, i.e. a warning is printed
+    and the variable is otherwise ignored (TODO #2120).
+    '''
+    dyn_test_dir = get_base_path("dynamo0.3")
+    mod_man = ModuleManager.get()
+    mod_man.add_search_path(os.path.join(dyn_test_dir, "infrastructure"))
+
+    read_write_info = ReadWriteInfo()
+    ctu = CallTreeUtils()
+    sva = SingleVariableAccessInfo(Signature("a"))
+    ctu._resolve_calls_and_unknowns([("unknown", "constants_mod",
+                                      Signature("does_not_exist"), sva)],
+                                    read_write_info)
+    out, _ = capsys.readouterr()
+
+    assert "Unable to check if signature 'does_not_exist' is constant" in out
+
+
+# -----------------------------------------------------------------------------
+def test_call_tree_error_module_is_codeblock(capsys):
+    '''Tests that a module that cannot be parsed and becomes a codeblock
+    is handled correctly.
+ ''' + dyn_test_dir = get_base_path("dynamo0.3") + mod_man = ModuleManager.get() + mod_man.add_search_path(os.path.join(dyn_test_dir, "driver_creation")) + + cblock = CodeBlock([], "dummy") + mod_info = mod_man.get_module_info("testkern_import_symbols_mod") + # get_psyir returns the module PSyIR, which we need to replace with + # the codeblock in order to reproduce this error: + container = mod_info.get_psyir() + container.replace_with(cblock) + + ctu = CallTreeUtils() + sva = SingleVariableAccessInfo(Signature("a")) + read_write_info = ReadWriteInfo() + ctu._resolve_calls_and_unknowns( + [("routine", "testkern_import_symbols_mod", + Signature("testkern_import_symbols_code"), sva)], read_write_info) + out, _ = capsys.readouterr() + assert ("Cannot find symbol 'testkern_import_symbols_code' in module " + "'testkern_import_symbols_mod' - ignored." in out) diff --git a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/invoke_kernel_rename_symbols.f90 b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/invoke_kernel_rename_symbols.f90 new file mode 100644 index 0000000000..1c1e4a67f5 --- /dev/null +++ b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/invoke_kernel_rename_symbols.f90 @@ -0,0 +1,52 @@ +! ----------------------------------------------------------------------------- +! BSD 3-Clause License +! +! Copyright (c) 2023-2024, Science and Technology Facilities Council +! All rights reserved. +! +! Redistribution and use in source and binary forms, with or without +! modification, are permitted provided that the following conditions are met: +! +! * Redistributions of source code must retain the above copyright notice, this +! list of conditions and the following disclaimer. +! +! * Redistributions in binary form must reproduce the above copyright notice, +! this list of conditions and the following disclaimer in the documentation +! and/or other materials provided with the distribution. +! +! 
* Neither the name of the copyright holder nor the names of its +! contributors may be used to endorse or promote products derived from +! this software without specific prior written permission. +! +! THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS +! "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT +! LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS +! FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE +! COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, +! INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, +! BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; +! LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +! CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT +! LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN +! ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE +! POSSIBILITY OF SUCH DAMAGE. +!------------------------------------------------------------------------------- +! Author J. Henrichs, Bureau of Meteorology + +program invoke_kernel_with_imported_symbols + + ! Description: The kernel will call a subroutine which renames a symbol + ! it imports. This is used to test the handling of non-local renamed + ! variables in extraction and driver creation. 
+ use constants_mod, only: r_def + use field_mod, only: field_type + use testkern_rename_symbols_mod, only: testkern_rename_symbols_type + + implicit none + + type(field_type) :: f1, f2, m1, m2 + real(r_def) :: module_var_a + + call invoke(testkern_rename_symbols_type(module_var_a, f1, f2, m1, m2)) + +end program invoke_kernel_with_imported_symbols diff --git a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_renaming_external_var_mod.f90 b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_renaming_external_var_mod.f90 index 4ddab52e11..0a8aec5877 100644 --- a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_renaming_external_var_mod.f90 +++ b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_renaming_external_var_mod.f90 @@ -32,7 +32,7 @@ ! Author: J. Henrichs, Bureau of Meteorology ! This modules renames a variable it imports. This is used to test the -! handling of renaming variables in non-local variable in extraction and +! handling of renaming of non-local variables in extraction and ! driver creation. 
module module_renaming_external_var_mod diff --git a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_with_var_mod.f90 b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_with_var_mod.f90 index bdd5cea130..f68b2dc402 100644 --- a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_with_var_mod.f90 +++ b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/module_with_var_mod.f90 @@ -37,6 +37,7 @@ module module_with_var_mod integer :: module_var_a, module_var_b + integer, parameter :: module_const = 123 contains diff --git a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_import_symbols_mod.f90 b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_import_symbols_mod.f90 index fd5bcee2d9..c60a3df183 100644 --- a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_import_symbols_mod.f90 +++ b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_import_symbols_mod.f90 @@ -79,7 +79,7 @@ subroutine testkern_import_symbols_code(nlayers, ascalar, & ndf_w3, undf_w3, map_w3) use constants_mod, only: eps, i_def, r_def use module_with_var_mod, only: module_subroutine, module_var_a, & - module_function + module_function, module_const implicit none @@ -98,9 +98,10 @@ subroutine testkern_import_symbols_code(nlayers, ascalar, & real(kind=r_def), intent(in), dimension(undf_w3) :: fld4 real(kind=r_def) :: tmp - tmp = fld2(1)*fld3(1)*fld4(1) + ! 
Also test handling of intrinsics + tmp = fld2(1)*fld3(1)*fld4(1) + sqrt(ascalar) fld1(1) = eps * nlayers + tmp - dummy_module_variable = 1 + dummy_constant + local_func(1) + dummy_module_variable = 1 + dummy_constant + local_func(1) + module_const fld1(1) = module_function() call module_subroutine() call local_subroutine() diff --git a/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_rename_symbols_mod.f90 b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_rename_symbols_mod.f90 new file mode 100644 index 0000000000..a8c4685f48 --- /dev/null +++ b/src/psyclone/tests/test_files/dynamo0p3/driver_creation/testkern_rename_symbols_mod.f90 @@ -0,0 +1,90 @@ + ! ----------------------------------------------------------------------------- +! BSD 3-Clause License +! +! Copyright (c) 2017-2024, Science and Technology Facilities Council. +! All rights reserved. +! +! Redistribution and use in source and binary forms, with or without +! modification, are permitted provided that the following conditions are met: +! +! * Redistributions of source code must retain the above copyright notice, this +! list of conditions and the following disclaimer. +! +! * Redistributions in binary form must reproduce the above copyright notice, +! this list of conditions and the following disclaimer in the documentation +! and/or other materials provided with the distribution. +! +! * Neither the name of the copyright holder nor the names of its +! contributors may be used to endorse or promote products derived from +! this software without specific prior written permission. +! +! THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +! AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +! IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +! DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +! 
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +! DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +! SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +! CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +! OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +! OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. +! ----------------------------------------------------------------------------- +! Authors: R. W. Ford and A. R. Porter, STFC Daresbury Lab +! Modified: I. Kavcic, Met Office +! Modified: J. Henrichs, Bureau of Meteorology + +module testkern_rename_symbols_mod + + ! The kernel here calls a function that will rename a symbol it imports. + ! This is used to test the handling of non-local renamed symbols in the + ! extraction and driver creation. + use argument_mod + use fs_continuity_mod + use kernel_mod + + implicit none + + type, extends(kernel_type) :: testkern_rename_symbols_type + type(arg_type), dimension(5) :: meta_args = & + (/ arg_type(gh_scalar, gh_real, gh_read), & + arg_type(gh_field, gh_real, gh_inc, w1), & + arg_type(gh_field, gh_real, gh_read, w2), & + arg_type(gh_field, gh_real, gh_read, w2), & + arg_type(gh_field, gh_real, gh_read, w3) & + /) + integer :: operates_on = cell_column + contains + procedure, nopass :: code => testkern_rename_symbols_code + end type testkern_rename_symbols_type + +contains + + subroutine testkern_rename_symbols_code(nlayers, ascalar, & + fld1, fld2, fld3, fld4, & + ndf_w1, undf_w1, map_w1, & + ndf_w2, undf_w2, map_w2, & + ndf_w3, undf_w3, map_w3) + use constants_mod, only: eps, i_def, r_def + use module_renaming_external_var_mod, only: renaming_subroutine + implicit none + + integer(kind=i_def), intent(in) :: nlayers + integer(kind=i_def), intent(in) :: ndf_w1 + integer(kind=i_def), intent(in) :: ndf_w2 + integer(kind=i_def), intent(in) :: ndf_w3 + 
integer(kind=i_def), intent(in) :: undf_w1, undf_w2, undf_w3 + integer(kind=i_def), intent(in), dimension(ndf_w1) :: map_w1 + integer(kind=i_def), intent(in), dimension(ndf_w2) :: map_w2 + integer(kind=i_def), intent(in), dimension(ndf_w3) :: map_w3 + real(kind=r_def), intent(in) :: ascalar + real(kind=r_def), intent(inout), dimension(undf_w1) :: fld1 + real(kind=r_def), intent(in), dimension(undf_w2) :: fld2 + real(kind=r_def), intent(in), dimension(undf_w2) :: fld3 + real(kind=r_def), intent(in), dimension(undf_w3) :: fld4 + integer :: tmp + + call renaming_subroutine(tmp) + + end subroutine testkern_rename_symbols_code + +end module testkern_rename_symbols_mod diff --git a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_all b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_all index 6a7abdd43a..3f2d8441ae 100755 --- a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_all +++ b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_all @@ -2,6 +2,7 @@ # BSD 3-Clause License # # Copyright (c) 2020-2024, Science and Technology Facilities Council. + # All rights reserved. # # Redistribution and use in source and binary forms, with or without @@ -42,6 +43,7 @@ include Makefile.inc time_evolution_alg_mod_psy.f90: time_evolution_alg_mod.x90 solutions/extract_all_transform.py psyclone $(PSYCLONE_CMD) \ + -d . 
-d ../gungho_lib \ -s ./solutions/extract_all_transform.py -opsy time_evolution_alg_mod_psy.f90 \ -oalg time_evolution_alg_mod.f90 time_evolution_alg_mod.x90 diff --git a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_one b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_one index 5da5b1d5a2..152c2f2fc2 100755 --- a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_one +++ b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.extract_one @@ -42,6 +42,7 @@ include Makefile.inc time_evolution_alg_mod_psy.f90: time_evolution_alg_mod.x90 solutions/extract_one_transform.py psyclone $(PSYCLONE_CMD) \ + -d . -d ../gungho_lib \ -s ./solutions/extract_one_transform.py -opsy time_evolution_alg_mod_psy.f90 \ -oalg time_evolution_alg_mod.f90 time_evolution_alg_mod.x90 diff --git a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.nan_all b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.nan_all index 0bd10b9534..ccac90fec7 100755 --- a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.nan_all +++ b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.nan_all @@ -41,6 +41,7 @@ include Makefile.inc time_evolution_alg_mod_psy.f90: time_evolution_alg_mod.x90 solutions/nan_all_transform.py psyclone $(PSYCLONE_CMD) \ + -d . 
-d ../gungho_lib \ -s ./solutions/nan_all_transform.py -opsy time_evolution_alg_mod_psy.f90 \ -oalg time_evolution_alg_mod.f90 time_evolution_alg_mod.x90 diff --git a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.readonly_all b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.readonly_all index 2a6bb52e88..562df8fae3 100755 --- a/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.readonly_all +++ b/tutorial/practicals/LFRic/building_code/4_psydata/solutions/Makefile.readonly_all @@ -41,6 +41,7 @@ include Makefile.inc time_evolution_alg_mod_psy.f90: time_evolution_alg_mod.x90 solutions/readonly_all_transform.py psyclone $(PSYCLONE_CMD) -s ./solutions/readonly_all_transform.py \ + -d . -d ../gungho_lib \ -opsy time_evolution_alg_mod_psy.f90 \ -oalg time_evolution_alg_mod.f90 time_evolution_alg_mod.x90
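The Makefile hunks above all add `-d . -d ../gungho_lib` to the `psyclone` invocation, so that kernel source files can be located both in the current directory and in the shared `gungho_lib` directory. As a purely illustrative sketch of that kind of ordered directory search (a hypothetical stand-in, not PSyclone's actual lookup code), in Python:

```python
import os


def find_kernel_source(filename, search_dirs):
    """Return the path of the first match for ``filename`` in ``search_dirs``.

    Hypothetical illustration of an ordered ``-d``-style search: directories
    are tried in the order given, so an entry earlier in the list shadows a
    file of the same name later in the list.
    """
    for directory in search_dirs:
        candidate = os.path.join(directory, filename)
        if os.path.isfile(candidate):
            return candidate
    raise FileNotFoundError(
        f"{filename!r} not found in any of {search_dirs!r}")
```

With `-d . -d ../gungho_lib` the current directory is searched first, so a locally modified kernel takes precedence over the copy in the shared library directory.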