
Enhancing Specmatic for Contract Test Customization #1463

Open
Rajaneeshkumar opened this issue Dec 2, 2024 · 5 comments
@Rajaneeshkumar

Is your feature request related to a problem? Please describe.
I am currently exploring Specmatic for contract testing of our microservices. While I appreciate its ability to dynamically generate provider tests by simply passing an OpenAPI (YAML) file, I’ve encountered limitations that hinder its usability in certain scenarios:

1) Lack of mock data customization: Unlike the setExpectations feature for consumer-side contract tests, there is no straightforward way to provide mock data for specific requests during provider-side contract testing.
2) Dependency management across requests: Some test scenarios require prerequisites, such as creating a resource (e.g., generating an ID) that must be reused in subsequent requests. Specmatic doesn't seem to support this workflow seamlessly.

These limitations are making it difficult for me to decide whether to continue with Specmatic or switch to another tool that better fits our requirements.

Describe the solution you'd like
It would be highly beneficial if Specmatic provided more customization options and flexibility for generating provider-side contract tests.

Additional context
I have recently contributed to Specmatic and have a fair understanding of it:
https://github.com/znsio/specmatic/issues/1429

Your valuable suggestions would greatly help us move forward. This is a bit urgent, so I would appreciate your prompt response. Thank you!

@harikrishnan83 @samyakOO7

@pranavgawri pranavgawri self-assigned this Dec 3, 2024
@pranavgawri
Contributor

pranavgawri commented Dec 3, 2024

Hi @Rajaneeshkumar ,

Thank you for raising this detailed feature request. I appreciate you taking the time to share your specific use cases and challenges.

I've tagged this with the question label. I will review your requirements and get back to you with more detailed feedback later today.

We value your input in making Specmatic better.

Thanks,
Pranav

@pranavgawri pranavgawri added the question Further information is requested label Dec 3, 2024
@Rajaneeshkumar
Author

Thanks for checking this on priority, @pranavgawri!
Looking forward to your feedback, and I would be happy to hear about any workaround in the meantime.

@pranavgawri
Contributor

Hi @Rajaneeshkumar ,

Thanks for your patience.

  1. Regarding mock data customization: You can customize test data using external examples in JSON files. These examples work for both consumer and provider tests. Here's a link to the documentation: External Examples.
  2. For managing dependencies across requests: We're working on supporting OpenAPI Links, which address this exact scenario. Links let you define relationships between operations and reuse data (such as IDs) from previous responses. This is on our roadmap for this month, so I will update here once we roll it out.
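For reference, a link in plain OpenAPI 3 terms is declared on a response and uses a runtime expression to feed a value from that response into another operation. A minimal sketch of the idea (the paths, operationIds, and field names below are illustrative, not taken from any Specmatic project):

```yaml
paths:
  /products:
    post:
      operationId: createProduct
      responses:
        '201':
          description: Product created
          content:
            application/json:
              schema:
                type: object
                properties:
                  id:
                    type: string
          links:
            # Reuse the id from this response as the path
            # parameter of the GET operation below.
            GetProductById:
              operationId: getProduct
              parameters:
                productId: '$response.body#/id'
  /products/{productId}:
    get:
      operationId: getProduct
      parameters:
        - name: productId
          in: path
          required: true
          schema:
            type: string
      responses:
        '200':
          description: Product details
```

The `$response.body#/id` runtime expression is standard OpenAPI 3 syntax; how Specmatic will consume such links once the feature lands is up to the implementation described above.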

I hope this helps. Please feel free to let me know if you have any other questions.

Thanks,
Pranav

@Rajaneeshkumar
Author

Thanks for the update @pranavgawri !!

@pranavgawri pranavgawri added feature_request New feature and removed question Further information is requested labels Dec 11, 2024
@jhomswk

jhomswk commented Dec 23, 2024

I've been thinking about this use case for a while and would like to suggest an alternative solution, since I think OpenAPI Links will not be enough to cover all use cases. Say we have an example named "example_one", defined either inline in the OpenAPI specification or as an external example. It would be great to be able to control the setup and teardown phases for each test independently, like this:

class ProviderTest {
    
    // General configuration code...
    
    @BeforeExample("example_one")
    void set_up_example_one() {
        // prepare the system to execute example_one
    }
    
    @AfterExample("example_one")
    void tear_down_example_one() {
        // clean the system after the execution of example_one
    }
    
}

I think this approach would provide the following benefits:

  • It is similar to the standard testing workflow we are most used to.
  • Enables thinking about individual test cases in isolation, which reduces cognitive load.
  • Removes the need for the global setup steps required to prepare the system for test execution.
