This repository has been archived by the owner on Sep 16, 2020. It is now read-only.

Experimental import limits
Related to #17
joakimk committed Sep 25, 2016
1 parent 813f0ec commit 4ac2afd
Showing 2 changed files with 14 additions and 0 deletions.
12 changes: 12 additions & 0 deletions lib/toniq/job_importer.ex
@@ -18,10 +18,22 @@ defmodule Toniq.JobImporter do
defp import_jobs(enabled: false), do: nil
defp import_jobs(enabled: true) do
incoming_jobs
|> Enum.take(jobs_to_import_count)
|> log_import
|> Enum.each(&import_job/1)
end

joakimk (Author, Owner) commented on Sep 25, 2016:

The principle of only keeping a certain number of jobs in memory at a time is probably sound, but doing it like this won't work.

The Redis query needs to load only a limited set of jobs at a time for this to work. I tried a bit with SSCAN without success, but it seems doable.
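A rough sketch of the SSCAN idea mentioned in the comment above: page through the incoming-jobs set instead of loading every member at once. The function name, key, and `Exredis.query/2` call are illustrative assumptions, not Toniq's actual API.

```elixir
# Hypothetical sketch: fetch at most `limit` members of a Redis set by
# paging with SSCAN rather than reading the whole set into memory.
# COUNT is only a hint to Redis, so each page may be larger or smaller,
# and members can repeat across pages while the set is being modified.
defp take_incoming_jobs(redis, key, limit, cursor \\ "0", acc \\ []) do
  [next_cursor, members] = Exredis.query(redis, ["SSCAN", key, cursor, "COUNT", "100"])
  acc = acc ++ members

  cond do
    length(acc) >= limit -> Enum.take(acc, limit)  # enough jobs collected
    next_cursor == "0" -> acc                      # full iteration finished
    true -> take_incoming_jobs(redis, key, limit, next_cursor, acc)
  end
end
```

Since SSCAN only guarantees a full iteration eventually terminates, a caller would still need to deduplicate or tolerate re-importing a job that appears twice.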


defp jobs_to_import_count do
max_count = 500 # this would be in config
diff = max_count - Toniq.JobPersistence.jobs_count

if diff < 0 do
0
else
diff
end
end

defp incoming_jobs do
Toniq.JobPersistence.incoming_jobs
end
2 changes: 2 additions & 0 deletions lib/toniq/job_persistence.ex
@@ -23,6 +23,8 @@ defmodule Toniq.JobPersistence do
"""
def jobs(identifier \\ default_identifier), do: load_jobs(jobs_key(identifier), identifier)

def jobs_count(identifier \\ default_identifier), do: scard(redis, jobs_key(identifier)) |> String.to_integer

@doc """
Returns all incoming jobs (used for failover).
"""
Expand Down
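The new `jobs_count/1` above relies on `SCARD`, which returns the cardinality of a Redis set. With an exredis-style client the reply arrives as a binary, which is why the diff pipes it through `String.to_integer`. A minimal sketch of the same idea, assuming `Exredis.query/2` and a hypothetical key:

```elixir
# Hypothetical sketch, not Toniq's actual module: count jobs with SCARD.
# Assumes an exredis-style client where Exredis.query/2 returns binaries.
defp count_jobs(redis, jobs_key) do
  redis
  |> Exredis.query(["SCARD", jobs_key])  # reply is a binary such as "480"
  |> String.to_integer()
end
```

Clients such as Redix decode Redis integer replies to Elixir integers directly, in which case the `String.to_integer` step would be unnecessary.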
