Hi all,
I want to access a HyperParameters artifact containing hyperparameter values computed in a previous run. I could then perform a hyperparameter search with some values held fixed (e.g., the number of layers, so that a previously trained model can still be loaded) while fine-tuning the others (e.g., the learning rate).
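For concreteness, the search space I have in mind looks roughly like this (a minimal KerasTuner sketch; num_layers, learning_rate, and the placeholder values are just examples):

import keras_tuner

def make_hyperparameters() -> keras_tuner.HyperParameters:
    hp = keras_tuner.HyperParameters()
    # Fixed to the value from the previous run, so the saved model
    # architecture still matches (3 is just a placeholder here).
    hp.Fixed('num_layers', value=3)
    # Left free so this run can fine-tune it.
    hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
    return hp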
I tried to use a Resolver component with a LatestArtifactStrategy and two input channels, one for the Model artifact and one for the HyperParameters artifact. I then passed the outputs of this Resolver component to the Tuner component in the following way:
from tfx.components import Tuner
from tfx.dsl.components.common.resolver import Resolver
from tfx.dsl.input_resolution.strategies.latest_artifact_strategy import LatestArtifactStrategy
from tfx.types import channel_utils, standard_artifacts, standard_component_specs

# Resolve the latest Model and HyperParameters artifacts from previous runs.
resolver = Resolver(
    strategy_class=LatestArtifactStrategy,
    model=channel_utils.as_channel([standard_artifacts.Model()]),
    hyperparameters=channel_utils.as_channel(
        [standard_artifacts.HyperParameters()]),
)

tuner = Tuner(
    examples=example_gen.outputs[standard_component_specs.EXAMPLES_KEY],
    schema=infer_schema.outputs[standard_component_specs.SCHEMA_KEY],
    module_file=module_file,
    base_model=resolver.outputs[standard_component_specs.MODEL_KEY],
    custom_config={
        standard_component_specs.HYPERPARAMETERS_KEY:
            resolver.outputs[standard_component_specs.HYPERPARAMETERS_KEY]
    },
)
Unfortunately, the Resolver component only succeeded in resolving the Model artifact, not the HyperParameters one.
One trick I used is to write a custom Trainer executor that loads the HyperParameters artifact and copies the 'best_hyperparameters.txt' file into the model directory. I can then access this file inside tuner_fn through the fn_args.base_model attribute, but this doesn't seem like a valid option.
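For reference, the reading side inside my tuner_fn looks roughly like this (a sketch only: load_best_hparams is a hypothetical helper, and the file name and location come from my workaround, not from a TFX API):

import json
import os

import keras_tuner
import tensorflow as tf

def load_best_hparams(base_model_uri: str) -> keras_tuner.HyperParameters:
    # 'best_hyperparameters.txt' is the file my custom Trainer executor
    # copied into the model directory; its location is my own convention.
    hparams_path = os.path.join(base_model_uri, 'best_hyperparameters.txt')
    with tf.io.gfile.GFile(hparams_path, 'r') as f:
        return keras_tuner.HyperParameters.from_config(json.loads(f.read()))

# Inside tuner_fn: hparams = load_best_hparams(fn_args.base_model)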
What is the recommended way to pass artifacts from previous runs to downstream components?
In this case, how can I access the hyperparameters used to train a model without having to hand-save them inside the model directory? Or how can I pass them to tuner_fn other than through custom_config?
Thanks