Get ansible_ resources from multiple remote states #24
I'll have to put some thought into this use case, but I don't think it's something that would be easy to support. The way this script pulls the state now, it just takes a dump of whatever state Terraform is configured to use. But remote state data isn't part of that state: data providers are populated at runtime and not persisted to the state file. You could still depend on values from remote state in the Ansible resources you configure within Terraform config, but all of the hosts for this plugin still need to be in the same project.
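For illustration, a minimal sketch of what the comment above describes, not the actual implementation: the `terraform state pull` call, the grep filter, the temp-file path, and the use of `ansible_host` as the example resource type are assumptions made for the example.

```sh
#!/usr/bin/env sh
# Sketch only: dump whatever state this project's configured backend points
# at, then look for ansible_* resources in the JSON dump.
cd "${ANSIBLE_TF_DIR:-.}" || exit 1

# Prints the state Terraform is configured to use, local or remote alike.
terraform state pull > /tmp/current.tfstate.json

# Count ansible_host resources in the dump. Per the comment above, data
# sources such as terraform_remote_state are populated at runtime and are
# not part of this dump, so their values cannot be read from here.
grep -c '"type": "ansible_host"' /tmp/current.tfstate.json
```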
A project can be divided into multiple tfstates (different directories for different purposes) while still being a single project as far as Terraform is concerned. I think the script reads backend.tf to find the tfstate with the ansible_ configuration, right? If so, couldn't you read ansible_ resources from all the tfstates pointed at by the backend.tf in every directory listed in ANSIBLE_TF_DIR (or ANSIBLE_TF_DIRS)?
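A hypothetical sketch of what that request amounts to (ANSIBLE_TF_DIRS does not exist today, and the script does not currently behave this way): iterate over a colon-separated list of project directories and pull the state that each directory's own backend configuration points at.

```sh
#!/usr/bin/env sh
# Hypothetical: ANSIBLE_TF_DIRS is the proposed (not an existing) variable.
# Each directory's backend config (e.g. backend.tf) tells Terraform where
# that project's state lives, so pulling per directory covers remote states.
IFS=:
for dir in ${ANSIBLE_TF_DIRS:-.}; do
    (
        cd "$dir" || exit 1
        terraform state pull > "/tmp/$(basename "$dir").tfstate.json"
    )
done
```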
Yes, in theory we could read multiple Terraform states from different project directories. But this is something altogether different from reading remote state data providers. Among other things, using multiple project directories requires that you have the Terraform config files checked out locally. Since the remote state provider uses outputs as a common interface between modules, this wouldn't be necessary with a remote-state configuration, but you would also be violating the outputs-as-dependencies contract from the Terraform design.

I'm pretty sure you could already handle multiple local projects with Ansible, by using its own support for multiple inventory sources. Maybe by passing it wrapper scripts that set the environment variable for each project and then call out to the inventory script:

/etc/ansible/ProjectA

```sh
#!/usr/bin/env sh
export ANSIBLE_TF_DIR=/home/somebody/projects/A
exec /etc/ansible/terraform.py
```

/etc/ansible/ProjectB

```sh
#!/usr/bin/env sh
export ANSIBLE_TF_DIR=/home/somebody/projects/B
exec /etc/ansible/terraform.py
```
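For example, assuming the wrapper scripts above are made executable and `site.yml` stands in for a real playbook, Ansible can take each wrapper as its own inventory source, or a directory containing both:

```sh
chmod +x /etc/ansible/ProjectA /etc/ansible/ProjectB

# Pass each wrapper as a separate inventory source...
ansible-playbook -i /etc/ansible/ProjectA -i /etc/ansible/ProjectB site.yml

# ...or keep them in one directory and point Ansible at the directory.
ansible-playbook -i /etc/ansible/inventories/ site.yml
```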
To clarify this point... no, the script does not try to parse any Terraform files. It calls the Terraform CLI to dump whatever state the project is configured to use.
I think it comes to the same thing. Before you can pull Terraform state, you need to know where the state is and how to access it; if it is not on local disk, backend.tf (for whatever remote state configuration) is what allows your script to pull the state. I'll give multiple inventory sources "as scripts" a try, using a different environment variable value for each. I want to thank you for this dynamic inventory script and the Terraform Ansible provider (a great solution).
Having Terraform configuration in different directories/accounts (for security, separation of duties, or something else) can be useful, so it would help if terraform-inventory allowed reading states from directories other than the main one (the current directory or the one defined by ANSIBLE_TF_DIR).

E.g. gathering ansible_ configuration from multiple remote states (multiple directories in ANSIBLE_TF_DIR), working like multiple terraform_remote_state data sources do in Terraform. Every directory would probably have its own backend with the AWS connection config to allow this design.
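As a purely hypothetical illustration of that layout (the directory names and output files are made up), each directory would carry its own backend/AWS config, and the outputs of each project could be collected much like chained terraform_remote_state data sources would expose them:

```sh
#!/usr/bin/env sh
# Hypothetical multi-project layout; directory names are placeholders.
for dir in /infra/network /infra/compute /infra/ansible-hosts; do
    (
        cd "$dir" || exit 1
        # Each directory has its own backend config, so this reads that
        # project's remote state and prints its outputs as JSON -- the same
        # values terraform_remote_state would expose to another project.
        terraform output -json > "/tmp/$(basename "$dir").outputs.json"
    )
done
```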