Result batches (#59)
* execution of the tests is split up into batches

* version bump

* add result_batch_size into docs
amyasnikov authored Nov 21, 2023
1 parent 7207a9d commit 9301714
Showing 4 changed files with 20 additions and 7 deletions.
11 changes: 11 additions & 0 deletions docs/plugin_settings.md
@@ -27,6 +27,16 @@ The number of seconds the system will wait between executing each Compliance Test.

Compliance Test execution may cause a lot of DB queries, because a Compliance Test is dynamic by nature and the system cannot prefetch all the required instances before the test runs. If you find that the `Run Compliance Tests` script overwhelms your DB with queries, you can adjust this setting to spread the queries over time.
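In rough terms, the delay works like the following sketch (illustrative only; `compliance_tests` and `execute` are placeholders, not the plugin's actual API):

```python
import time

SLEEP_BETWEEN_TESTS = 0.02  # seconds, mirroring the plugin setting

for test in compliance_tests:        # placeholder iterable of Compliance Tests
    execute(test)                    # each test may issue several DB queries
    time.sleep(SLEEP_BETWEEN_TESTS)  # pause so the queries are spread over time
```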


### result_batch_size

*Default:* `500`

*Type:* `int`

Test execution and the production of Test Results are carried out in batches. As soon as a batch reaches its maximum size (set by this variable), all Test Results in that batch are written to the DB.
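In ORM terms the effect is roughly the following (the plugin relies on Django's `bulk_create`, as the change to `save_to_db` further down in this commit shows; the numbers are only an illustration):

```python
# Illustration: with result_batch_size = 500, saving 1200 unsaved
# ComplianceTestResult instances produces three INSERT queries
# (500 + 500 + 200 rows) rather than a single statement for all 1200 rows.
ComplianceTestResult.objects.bulk_create(results, batch_size=500)
```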


### store_reports

*Default:* `5`
@@ -66,6 +76,7 @@ PLUGINS_CONFIG = {
'validity': {
'git_folder': '/opt/git/',
'sleep_between_tests': 0.02,
'result_batch_size': 300,
'store_reports': 7,
'store_last_results': 8,
},
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "netbox-validity"
version = "1.4.0"
version = "1.4.1"
description = "NetBox plugin for vendor-agnostic configuration compliance"
authors = [
{name = "Anton Miasnikov", email = "anton2008m@gmail.com"},
3 changes: 2 additions & 1 deletion validity/__init__.py
@@ -19,7 +19,7 @@ class NetBoxValidityConfig(PluginConfig):
description = "Vendor-agnostic framework to build your own configuration compliance rule set"
author = "Anton Miasnikov"
author_email = "anton2008m@gmail.com"
version = "1.4.0"
version = "1.4.1"
base_url = "validity"
django_apps = ["bootstrap5"]
min_version = "3.4.0"
@@ -44,6 +44,7 @@ class ValiditySettings(BaseModel):
store_reports: int = Field(default=5, gt=0, lt=1001)
git_folder: DirectoryPath = Path("/opt/git_repos")
sleep_between_tests: float = 0
result_batch_size: int = 500


settings = ValiditySettings.parse_obj(django_settings.PLUGINS_CONFIG.get("validity", {}))
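For context, a minimal sketch of how values from `PLUGINS_CONFIG` reach this model via `parse_obj` (assumes a configured NetBox environment; the dict below is made up):

```python
from validity import ValiditySettings

# Keys present in PLUGINS_CONFIG["validity"] are validated by pydantic;
# anything omitted falls back to the defaults declared on the model.
settings = ValiditySettings.parse_obj({"result_batch_size": 300})
assert settings.result_batch_size == 300
assert settings.sleep_between_tests == 0  # default retained
```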
11 changes: 6 additions & 5 deletions validity/scripts/validity_run_tests.py
@@ -34,6 +34,7 @@
class RunTestsScript(SyncReposMixin, Script):

_sleep_between_tests = validity.settings.sleep_between_tests
_result_batch_size = validity.settings.result_batch_size

sync_repos = BooleanVar(
required=False,
@@ -142,8 +143,8 @@ def fire_report_webhook(self, report_id: int) -> None:
queue = webhooks_queue.get()
enqueue_object(queue, report, self.request.user, self.request.id, ObjectChangeActionChoices.ACTION_CREATE)

def save_to_db(self, results: list[ComplianceTestResult], report: ComplianceReport | None) -> None:
ComplianceTestResult.objects.bulk_create(results)
def save_to_db(self, results: Iterable[ComplianceTestResult], report: ComplianceReport | None) -> None:
ComplianceTestResult.objects.bulk_create(results, batch_size=self._result_batch_size)
ComplianceTestResult.objects.delete_old()
if report:
ComplianceReport.objects.delete_old()
@@ -157,9 +158,9 @@ def run(self, data, commit):
device_ids = data.get("devices", [])
if specific_selectors := data.get("selectors"):
selectors = selectors.filter(pk__in=specific_selectors)
results = [
*chain.from_iterable(self.run_tests_for_selector(selector, report, device_ids) for selector in selectors)
]
results = chain.from_iterable(
self.run_tests_for_selector(selector, report, device_ids) for selector in selectors
)
self.save_to_db(results, report)
output = {"results": {"all": self.results_count, "passed": self.results_passed}}
if report:
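Taken together, the last two hunks mean the per-selector results are no longer unpacked into a list inside `run`; the lazy iterable is handed to `save_to_db`, and the INSERTs are split into batches of at most `result_batch_size` rows. A condensed sketch (everything except `chain.from_iterable` and `bulk_create` is a stand-in):

```python
from itertools import chain

# One chained iterable over every selector's results, produced lazily...
results = chain.from_iterable(
    run_tests_for_selector(selector) for selector in selectors  # stand-in call
)
# ...which bulk_create then writes in batches of result_batch_size rows.
ComplianceTestResult.objects.bulk_create(results, batch_size=500)
```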
