Hyperparameter tuning process #2652
Unanswered
avramit5674 asked this question in Q&A
Python version (`python3 -V`): 3.5
NVFlare version (`python3 -m pip list | grep "nvflare"`): 2.3.8
NVFlare branch (if running examples, please use the branch that corresponds to the NVFlare version, `git branch`): No response
Operating system: Ubuntu 20.04
Have you successfully run any of the following examples?
Please describe your question
I am currently setting up XGBoost for regression tasks using NVFlare, and I need to configure the hyperparameters in the server configuration files. I aim to tune these hyperparameters based on the data from all clients. For example, I plan to use Bayesian optimization for hyperparameter tuning, which requires access to the training data distributed across all clients. How can I effectively utilize optimization methods for hyperparameter tuning with NVFlare in a decentralized data environment?
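One pattern that fits this setup (a sketch only, not an official NVFlare feature) is to keep the hyperparameter search on the orchestrating side and treat each candidate configuration as one federated XGBoost job: the clients train on their local data and report only validation metrics, so the optimizer never needs direct access to the distributed training data. The sketch below uses Optuna's default TPE sampler as the Bayesian-style optimizer; `run_federated_xgboost` is a hypothetical placeholder for submitting an NVFlare job configured with the proposed `xgb_params` and collecting the aggregated validation RMSE.

```python
import optuna


def run_federated_xgboost(xgb_params: dict, num_rounds: int) -> float:
    """Hypothetical placeholder: submit an NVFlare XGBoost job configured
    with xgb_params, wait for it to finish, and return the aggregated
    validation RMSE reported by the clients. Only metrics cross the
    network; the raw client data stays local."""
    raise NotImplementedError("wire this to your NVFlare job submission")


def objective(trial: optuna.Trial) -> float:
    # Search space for the regression hyperparameters: the same keys that
    # would otherwise be hard-coded in the job's configuration files.
    xgb_params = {
        "objective": "reg:squarederror",
        "eval_metric": "rmse",
        "tree_method": "hist",
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "eta": trial.suggest_float("eta", 0.01, 0.3, log=True),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "colsample_bytree": trial.suggest_float("colsample_bytree", 0.5, 1.0),
    }
    # One trial == one federated training run across all clients.
    return run_federated_xgboost(xgb_params, num_rounds=100)


study = optuna.create_study(direction="minimize")  # minimize validation RMSE
study.optimize(objective, n_trials=25)
print("Best hyperparameters:", study.best_params)
```

The key point for the "access to the training data" phrasing is that Bayesian optimization only needs an objective value per trial, not the data itself: each trial's score can be computed locally by the clients and aggregated (for example, a weighted average of per-client RMSE), so raw data never leaves the sites. If launching a full job per trial is too slow, the same loop could instead live inside a custom NVFlare controller that broadcasts candidate hyperparameters each round, but that would require writing a custom workflow.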
Replies: 1 comment

I am not familiar with Bayesian optimization for hyperparameter tuning. When you said "access to the training data distributed across all clients," what does this mean?