Add NDT sensor model #338
Conversation
@hidmic PTAL when possible. It's still a draft, but it might be good to get the review process going. I have a few concerns. Citing the NDT MCL paper:
I'm not doing a nearest neighbor search here, but we could preprocess the grid to make that work, or modify the grid <-> NDT map conversion script to take it into account. Likelihoods fall off so abruptly away from the mean that I don't expect it to matter much, but we would certainly see some boundary effects with the current approach. Regarding the way they suggest scaling the likelihood with I'd like to try a different version where
This way, our weights would have an upper bound of 1, which would make our lives easier. Also, I'd consider using the log likelihood instead of the likelihood to make the weighting function less spiky and keep us from giving poor scores to very close hits on a narrow distribution.
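A minimal sketch of the bounded-weight idea (illustrative only, not Beluga code; function names are made up): dropping the Gaussian normalization constant so the likelihood peaks at exactly 1 at the cell mean, plus the log-likelihood variant mentioned above.

```python
import numpy as np

def bounded_likelihood(x, mean, cov):
    """Gaussian likelihood scaled so its maximum (at the mean) is exactly 1.

    Omitting the 1/sqrt((2*pi)^k * det(cov)) normalization bounds every
    weight by 1 regardless of how narrow the cell's distribution is.
    """
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    mahalanobis_sq = d @ np.linalg.solve(cov, d)
    return float(np.exp(-0.5 * mahalanobis_sq))

def log_likelihood(x, mean, cov):
    """Log of the bounded likelihood: less spiky and numerically safer."""
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(-0.5 * (d @ np.linalg.solve(cov, d)))
```

Since the exponent is just minus half the squared Mahalanobis distance, the log variant avoids the exponential entirely.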
IIUC nearest neighbors here is just a performance optimization, so it's OK for a start if we don't do it. I do wonder about the detour in the algorithm from what Saarinen et al. propose. We are not applying NDT to the measurement like Biber et al. describe (i.e. discretizing space, fitting a normal distribution within each cell) before computing likelihoods. Why?
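For reference, the NDT step being described, discretizing space and fitting a normal distribution per cell, can be sketched roughly like this (a hypothetical helper, not a Beluga API):

```python
import numpy as np
from collections import defaultdict

def fit_ndt_cells(points, cell_size):
    """Group 2D points by grid cell and fit a (mean, covariance) per cell.

    Cells with fewer than 3 points are skipped, since their sample
    covariance is degenerate (the point-mass problem mentioned below).
    """
    buckets = defaultdict(list)
    for p in np.asarray(points, dtype=float):
        key = tuple(np.floor(p / cell_size).astype(int))
        buckets[key].append(p)
    cells = {}
    for key, pts in buckets.items():
        if len(pts) < 3:
            continue
        pts = np.stack(pts)
        cells[key] = (pts.mean(axis=0), np.cov(pts, rowvar=False))
    return cells
```

Applying this to the scan before matching is what distinguishes full NDT from treating each measured point as a point mass.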
And that can be problematic, numerically speaking. Working with degenerate, point-mass distributions won't make it any better (which is what we do here instead of using NDT in full). Even working with NDT may not be enough: both Biber et al. and Schulz et al. suggest having multiple map layers, offset from one another, to deal with boundary effects. We are not doing that either.
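The multi-layer trick, sketched very loosely (names and the averaging scheme are illustrative assumptions, not the papers' exact formulation): evaluate the likelihood on several grids whose origins are shifted by half a cell and blend the per-layer scores, so a point near a cell boundary in one layer lies near a cell center in another.

```python
import numpy as np

# Half-cell offsets for four overlapping 2D layers.
LAYER_OFFSETS = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]

def blended_likelihood(point, likelihood_fn, cell_size):
    """Average a per-layer likelihood over half-cell-shifted grids.

    `likelihood_fn` stands in for a lookup into one NDT layer; shifting
    the query point is equivalent to shifting that layer's grid origin.
    """
    point = np.asarray(point, dtype=float)
    scores = [
        likelihood_fn(point + cell_size * np.asarray(offset))
        for offset in LAYER_OFFSETS
    ]
    return float(np.mean(scores))
```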
Yeah, there's some handwaving to those scalings.
I don't follow you here.
We do
+1. That's something I've been wanting to try for a while. I have a strong suspicion that many of the heuristics that the other sensor models rely on when computing likelihoods can be traced back to numerical issues. Log likelihood and log odds representations may help. I would defer to a different PR though.
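A quick illustration of the kind of numerical issue being alluded to (a generic example, assuming per-beam likelihoods multiplied across a scan): the direct product underflows to zero in double precision, while the log-space sum stays finite and usable for weighting.

```python
import math

per_beam = [1e-200] * 3  # tiny but nonzero per-beam likelihoods

# Direct product underflows to exactly 0.0 in double precision,
# erasing all ranking information between particles.
product = 1.0
for p in per_beam:
    product *= p

# Summing logs keeps a finite, comparable score.
log_sum = sum(math.log(p) for p in per_beam)
```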
Second pass. Overall looks reasonable.
Signed-off-by: Ramiro Serra <serraramiro@ekumenlabs.com>
@hidmic ready to re-review. I'll create a ticket for this effort and leave it here if you're OK with it.
Overall LGTM. The coverage check is in a weird state.
Proposed changes
Adds a new sensor model, based on NDT transforms, that performs importance weighting by transforming the measurements.
Type of change
Checklist
Put an `x` in the boxes that apply. This is simply a reminder of what we will require before merging your code.
Additional comments
Anything worth mentioning to the reviewers.