Subsample parameter - Federated XGBoost (Histogram Based mainly) #1830
-
Hi there! How does the subsample parameter work in Federated XGBoost, mainly for the histogram-based approach? To my understanding, each party only holds its own local data, so I am wondering whether the row sampling is performed on local or on global information. Best regards.
-
@ZiyueXu77 can you help answer this question? Thanks in advance!
-
Histogram-based would be identical to the distributed version, so unlike tree-based, where tree training only accesses local data, I believe it will be based on global information. @rongou can comment further.
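To make the point above concrete, here is a minimal conceptual sketch (assuming a simple two-worker setup with made-up histogram values; this is not XGBoost's internal code) of why the histogram-based approach effectively works on global information:

```python
# Conceptual sketch only, not XGBoost internals: each worker builds gradient
# histograms from its local rows, and the histograms are summed across workers
# (an allreduce), so split finding sees globally aggregated statistics even
# though raw rows never leave their worker.
import numpy as np

# Hypothetical per-worker gradient histograms for one feature (one value per bin).
worker_histograms = [
    np.array([0.9, -0.2, 0.4, 0.1]),   # worker 0, built from its local rows
    np.array([-0.3, 0.8, 0.0, 0.5]),   # worker 1, built from its local rows
]

# Stand-in for the allreduce step: element-wise sum across workers.
global_histogram = np.sum(worker_histograms, axis=0)
print(global_histogram)  # splits are evaluated on this global view: [0.6 0.6 0.4 0.6]
```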
-
Yes, we sum the gradients across all workers, so the sampling would be based on the global gradient information.
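For reference, a minimal sketch of the standard XGBoost parameters involved; the same parameters apply in a distributed or federated run, where, per the above, the gradient information behind the sampling is the globally aggregated one. The file name and parameter values below are placeholders, not recommendations.

```python
import xgboost as xgb

# Placeholder local shard; in a federated run each worker loads its own data.
dtrain = xgb.DMatrix("train_local_shard.csv?format=csv&label_column=0")

params = {
    "objective": "binary:logistic",
    "tree_method": "hist",      # histogram-based training, as discussed above
    "subsample": 0.8,           # use 80% of rows when building each tree
    # Gradient-based row sampling (rows kept with probability proportional to
    # the magnitude of their gradients) additionally requires GPU training:
    # "device": "cuda",
    # "sampling_method": "gradient_based",
}

bst = xgb.train(params, dtrain, num_boost_round=100)
```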
-
Hi @rongou, Thank you for your answer, and sorry for the delayed response. Does it make more sense to sample based on global gradient information rather than local? I am wondering whether sampling based on local gradient information could be used for adaptation or personalization. I might be wrong here, though.
-
You can give it a try, but I suspect it'll just add more noise.
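Purely to illustrate the trade-off being discussed (this is not something implemented in XGBoost), the sketch below contrasts sampling weights normalized by a worker's local gradient mass with weights normalized by a hypothetical global gradient mass; the local variant skews toward the local shard's distribution, which is one way the extra noise could show up. All numbers are made up.

```python
# Illustrative only: local vs. global normalization of gradient-based sampling weights.
import numpy as np

local_abs_grads = np.array([0.9, 0.1, 0.4])  # |gradient| of this worker's rows
global_abs_grad_sum = 10.0                   # hypothetical sum over all workers' rows

# Local normalization: weights reflect only this worker's data distribution.
p_local = local_abs_grads / local_abs_grads.sum()

# Global normalization: weights are on the same scale for every worker.
p_global = local_abs_grads / global_abs_grad_sum

print(p_local)   # ~[0.64 0.07 0.29] -> skewed toward the local shard
print(p_global)  # [0.09 0.01 0.04]  -> comparable across workers
```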