Wall tests #17
Conversation
…ttom wall scenario
…nd two wall cases
…of the bottom when there is a wall
Comment had the answer for the PSE test. Removed the periodic correction part
… dissipation test.
Minor test edits
@RaulPPelaez This plot is to illustrate an issue I've been running into with DPStokes in libMobility. Here, the square points and the lines are reference data from the DPStokes repository, and the triangular points are from libMobility, generated from the bottom wall test on this branch. In general, libMobility lines up well, but the inset shows a small error between libMobility and the reference that is consistently of order 1e-3. The error does eventually go to zero. Brennan suspects this mismatch may be from suboptimal grid/kernel parameters. Do you agree? I'm going to attempt to integrate the parameter selection from DPStokes to see if that resolves the mismatch. Let me know if you have a different idea of what could be causing this systematic underestimation so I can check that out as well!
…t-working nbody test
… the correct constants from the Swan correction paper- the previous implementation was using constants based on a self mobility constant of 1/(8*pi*eta*a) instead of 1/(6*pi*eta*a)
I agree with Brennan; it is most probably suboptimal grid parameters.
We should be merging this before #19.
…ctor of 1/8pi in the original implementation instead of 1/6pi in the Swan paper.
@RaulPPelaez I haven't merged this yet as there are still problems with the DPStokes results not agreeing perfectly with the reference results, and Brennan and I think it should be possible to get the results to match to better than a tolerance of ~1e-3. Even when using the parameter selection code, there remains a persistent underestimation on the order of 1e-4. I have the reference code running from the DPStokes repository, and as far as I can tell, all the grid and kernel parameters are the same between the two simulations. Do you know if anything changed between the libMobility version of DPStokes and the original version that could be causing this mismatch? We aren't sure where to look to resolve this, as I've (mostly) ruled out the difference in parameters.
I cannot think of anything that could cause this given identical parameters.
Here's a relative error plot for the xx and zz trans-trans components of the mobility matrix. The plot is the same for double and single precision, with negligible changes in the exact values of the computed mobilities.
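For concreteness, here is a minimal sketch of an elementwise relative error of this kind, assuming the usual definition |M_lib - M_ref| / |M_ref| (the array names and values below are illustrative, not taken from the test suite):

import numpy as np

# Illustrative mobility values; in the tests these would be the xx or zz
# trans-trans components from libMobility and from the DPStokes reference data.
M_lib = np.array([0.9512, 0.8723, 0.6431])
M_ref = np.array([0.9514, 0.8725, 0.6434])

# Elementwise relative error, assuming the usual definition.
rel_err = np.abs(M_lib - M_ref) / np.abs(M_ref)
print(rel_err)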
Between the version from the reference and the current one used in libmobility there is this big commit in UAMMD: RaulPPelaez/UAMMD@c662443. The reference uses w=6 or 12, but this PR has w=4; I reckon that is taken into account?
…ts are now passing if the same kernel parameters are used as the ref implementation
@RaulPPelaez I finally think I figured out the rest of the issues. It turns out that all the reference data from the DPStokes repository was generated using the CPU implementations. Additionally, although I was using the reference data for the w=4 kernel, their script for the pair mobility had a bug that could force the wrong kernel to be used. I regenerated all the data in that repository using the GPU implementations and the correct kernels, and ended up with the relative error plots below, which Brennan and I are satisfied with.

The pair mobility plot (2nd plot) isn't great since I didn't mess with the formatting enough, so a few notes on that one:

The green curve with the error spikes is for two particles spaced 8R_h apart. The errors in that curve are consistently higher than the rest, but mostly small for z/R_h > 1.

The large spike in the bottom right plot is because that height is at the exact middle of the slit channel, so the mobility gets small quickly. The actual error there is on the scale of 1e-9, but the relative error appears larger.

To get these small relative errors, I had to use the exact same parameters as the DPStokes repository code. To avoid doing that in the future, I regenerated all the reference data using our implementations with our current default parameters in single precision, to use as the reference going forward.

If all this looks good, I think this branch can finally get merged. Let me know what you think!
I am not sure I am following. In these plots, are you comparing the GPU impl from the article with the GPU impl in libmobility? BTW, one difference between the CPU and GPU impls is that in the CPU the particles are spread to the cell corners in XY (pos_cell(i) = h*i), but in the GPU I spread to cell centers (pos_cell(i) = (i+0.5)*h). This will give you differences in the results due to the variation of the hydrodynamic radius inside a cell. You should be able to account for this by comparing CPU(particle_x, particle_y) with GPU(particle_x-0.5h, particle_y-0.5h).
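A minimal sketch of that half-cell shift, assuming a uniform grid of nx cells over a box of length Lx (all values below are illustrative, not from the PR):

import numpy as np

# Box length and number of cells along x (illustrative values).
Lx, nx = 32.0, 64
h = Lx / nx  # grid spacing

# The CPU code spreads relative to cell corners, pos_cell(i) = h*i, while the
# GPU code uses cell centers, pos_cell(i) = (i + 0.5)*h.  To compare like with
# like, shift the positions handed to the GPU implementation by half a cell
# in x and y.
pos = np.array([[1.0, 2.0, 0.5]])  # particle positions (x, y, z)
pos_for_gpu = pos - np.array([0.5 * h, 0.5 * h, 0.0])
print(pos_for_gpu)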
What we settled on in the past with Aleks regarding this "you have to change r_h or L" issue is to have a parameter that lets you select whether L can be changed or not. Note that by changing L you also change r_h due to periodic corrections.
I did regenerate the DPStokes references using libMobility, although I was able to use the old NBody references. What I did to have confidence when regenerating the reference was to first match the parameters used to generate the original reference and check that I reproduced the original reference closely.

If we want to have both a parameter selection that preserves r_h and one that preserves the box size, we need a way to choose between them.

Good point about modifying L.
What you did makes sense, I was not getting the full picture. My advice here is to keep it simple for now. Add a parameter called "allowChangingBoxSize" or something like that. If set to true (you can make this the default if you decide), change L so you can fix h; otherwise keep h at the nearest possible value (changing r_h). If the latter is too troublesome, then just make the module change L and document it accordingly. We can add that functionality when needed in the future.
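A rough sketch of the two modes being described, assuming the hydrodynamic radius is tied to the grid spacing by some kernel-dependent factor c_w (the function, names, and factor are illustrative, not the libMobility/DPStokes parameter selection):

def select_grid(L, r_h, c_w=1.2, allow_changing_box_size=True):
    """Illustrative only: not the actual libMobility algorithm.

    Assumes r_h is approximately c_w * h for the chosen kernel.
    """
    h_target = r_h / c_w              # spacing that would give the requested r_h
    n = max(1, round(L / h_target))   # the cell count must be an integer
    # (a real implementation would also snap n to an FFT-friendly size)
    if allow_changing_box_size:
        h = h_target                  # keep h, and therefore r_h, exact
        L_out = n * h                 # adjust the box to fit n cells
    else:
        h = L / n                     # keep the box, round h instead
        L_out = L                     # effective radius becomes c_w * h, not r_h
    return n, h, L_out

print(select_grid(32.0, 1.0, allow_changing_box_size=True))
print(select_grid(32.0, 1.0, allow_changing_box_size=False))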
… modifies the box size and another version that preserves it
@RaulPPelaez I've followed your suggestion and implemented a mode that changes the box size and one that preserves it.

I added the parameter to select which mode to use. I did it in setParameters() since I thought it made sense to go with where you set the box size.

Finally, I regenerated the ref data for hopefully the last time with the new default. Once again, I compared against the data from the original DPStokes repository to verify a decent match before regenerating.

Edit: I also changed the threshold for the mobility matrix symmetry test to 1e-5 to make them all pass. I think this is okay since, if I use the default parameters from the original DPStokes repository, they only pass this test at the 1e-5 level as well.
Sounds good, Ryker. Is this ready to merge?
@RaulPPelaez Ready to merge unless you find anything to change on review!
I have some nitpicks and comments, but it's great. Thanks, Ryker.
tests/test_wall_mobility.py
params = self_mobility_params[Solver.__name__]

ref_dir = "./ref/"
ref = scipy.io.loadmat(ref_dir + ref_file)
This line introduces scipy as a test dependency; it should be added to the environment.yml accordingly. It's a big one, though, so maybe the files should be converted to npy to be read with numpy.
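A one-off conversion along those lines might look like the following (file names are placeholders; scipy would only be needed for the conversion script, not for the tests themselves):

import numpy as np
import scipy.io

# One-time conversion so the tests only need numpy afterwards.
mat = scipy.io.loadmat("ref_data.mat")  # placeholder file name
arrays = {k: v for k, v in mat.items() if not k.startswith("__")}
np.savez("ref_data.npz", **arrays)

# In the test, load without scipy:
ref = np.load("ref_data.npz")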
Also agree that we should remove scipy as a dependency whenever possible - it's a buggy, bad library
I converted everything in the wall tests to use numpy, but it's worth mentioning that the fluctuation-dissipation test depends on scipy.
lol good point, but it's still good to reduce our dependence on garbage :)
…rance for the test of M being symmetric
@RaulPPelaez This PR should be ready to merge. I think I addressed all your comments, but let me know if there is anything else you notice! I talked with Brennan and we're deferring implementing rectangular boxes (Lx != Ly) for DPStokes to a later date, so for now initialization raises an error in that case.
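The guard mentioned there could be as simple as something like this (a sketch with illustrative names, not the actual libMobility initialization code):

def check_box(Lx: float, Ly: float) -> None:
    # Rectangular boxes (Lx != Ly) are deferred for DPStokes, so fail early.
    if Lx != Ly:
        raise ValueError("DPStokes currently only supports square boxes (Lx == Ly)")

check_box(32.0, 32.0)   # fine
# check_box(32.0, 64.0) # would raise ValueError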
# @pytest.mark.parametrize("numberParticles", [1,2,10])
@pytest.mark.parametrize("numberParticles", [10])
Why did you remove the 1,2 again? Is it failing, or does it just take a lot of time?
Whoops, I forgot to revert that despite looking through the files! That test is quite slow for DPStokes, so I often run only the 10-particle case if I'm running through the tests more than once.
One can mark tests or individual parameters with custom markers:
@pytest.mark.parametrize(
"numberParticles", [1, 2, pytest.param(10, marks=pytest.mark.slow)]
)
Then you can run with pytest -m slow tests*py or pytest -m "not slow" tests*py.
Technically you have to register these in a separate file, but using them as-is just works for me: https://docs.pytest.org/en/stable/how-to/mark.html
Something to keep in mind for the future, not worth it at this point IMO.
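For reference, registering the marker is only a few lines in a conftest.py using the standard pytest_configure hook (shown here as a sketch):

# conftest.py
def pytest_configure(config):
    # Register the custom "slow" marker so -m slow / -m "not slow" selections
    # work without "unknown marker" warnings.
    config.addinivalue_line("markers", "slow: marks a test as slow to run")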
That's fine, we will get around to it.
Thanks for your hard work Ryker!
Adding some tests for DPStokes (and maybe other solvers later) that compare to some reference results.