Instructions To Reproduce the Issue:
I have trained an ImageNet-pretrained ResNet on a custom dataset with 12 classes.
For training I used the following YAML file: training_yaml_file
In this YAML file I only changed the output dimension of the head layers:
During training, the top-1 accuracy for the 5 different heads ranged between 80 and 90%.
For running inference I used the following code, slightly adapted from the second part of the Inference_tutorial:
from omegaconf import OmegaConf
from vissl.utils.hydra_config import AttrDict
from vissl.utils.hydra_config import compose_hydra_configuration, convert_to_attrdict
# The base config is the custom linear-evaluation YAML specified below.
# All other options override that config.
cfg = [
    'config=custom_configs/eval_resnet_8gpu_transfer_in1k_linear_bart.yaml',
    'config.MODEL.WEIGHTS_INIT.PARAMS_FILE=/content/checkpoints/linear_superresnet12class/model_phase26.torch',
    'config.MODEL.FEATURE_EVAL_SETTINGS.EVAL_MODE_ON=True',  # Turn on model evaluation mode.
    'config.MODEL.FEATURE_EVAL_SETTINGS.FREEZE_TRUNK_AND_HEAD=True',  # Freeze both the trunk and the head.
    'config.MODEL.FEATURE_EVAL_SETTINGS.EVAL_TRUNK_AND_HEAD=True',  # Evaluate the trunk and the head together.
]
# Compose the hydra configuration.
cfg = compose_hydra_configuration(cfg)
# Convert to AttrDict. This method will also infer certain config options
# and validate the config.
_, cfg = convert_to_attrdict(cfg)
# Build the model
from vissl.models import build_model
from vissl.utils.checkpoint import init_model_from_consolidated_weights
from classy_vision.generic.util import load_checkpoint
model = build_model(cfg.MODEL, cfg.OPTIMIZER)
# Load the checkpoint weights.
weights = load_checkpoint(checkpoint_path=cfg.MODEL.WEIGHTS_INIT.PARAMS_FILE)
# Initialize the model with the checkpoint weights.
init_model_from_consolidated_weights(
    config=cfg,
    model=model,
    state_dict=weights,
    state_dict_key_name="classy_state_dict",
    skip_layers=[],  # Use this if you do not want to load all layers.
)
print("Weights have loaded")
# model.heads[0].clf.clf[0].weight
model.trunk.base_model._feature_blocks.conv1.weight[0][0][0]
Problem
Every time I load the model, the weights are different.
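A quick way to see this is to run the build/init steps above twice and compare a single trunk tensor; a minimal sketch reusing cfg and the imports above (the load_model_once helper is just for illustration):
import torch
def load_model_once():
    # Rebuild the model and re-apply the checkpoint, exactly as above.
    m = build_model(cfg.MODEL, cfg.OPTIMIZER)
    w = load_checkpoint(checkpoint_path=cfg.MODEL.WEIGHTS_INIT.PARAMS_FILE)
    init_model_from_consolidated_weights(
        config=cfg,
        model=m,
        state_dict=w,
        state_dict_key_name="classy_state_dict",
        skip_layers=[],
    )
    return m
m1, m2 = load_model_once(), load_model_once()
same = torch.allclose(
    m1.trunk.base_model._feature_blocks.conv1.weight,
    m2.trunk.base_model._feature_blocks.conv1.weight,
)
print("conv1 identical across two loads:", same)  # False means the checkpoint was not applied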
First of all, thank you for using VISSL and raising your question :)
(and sorry for the delay in my answer...)
It would seem from your description that the model is not loaded correctly and so the weights are random, hence the random accuracy at the end. Could you check the logs and grep for the line "Extra layers not loaded from checkpoint"? It will indicate whether the weights were loaded or not.
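For example, something along these lines would surface that message (the log path below is only an assumption; point it at wherever your run wrote its log):
from pathlib import Path
# Hypothetical log location; adjust to your own output directory.
log_file = Path("/content/checkpoints/linear_superresnet12class/log.txt")
for line in log_file.read_text().splitlines():
    if "Extra layers not loaded from checkpoint" in line:
        print(line)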
In addition, the loaded model reports a ResNeXt trunk, although it should be a ResNet. As a result of the changing weights, my output is quite random. Does anybody have a solution?
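As a quick sanity check on the trunk type, it may help to compare what the resolved config requests with what was actually built; a minimal sketch reusing cfg and model from above (assuming the standard MODEL.TRUNK.NAME config key):
# Which trunk does the resolved config ask for, and what was actually instantiated?
print(cfg.MODEL.TRUNK.NAME)            # expected: "resnet"
print(type(model.trunk.base_model))    # the trunk class that was actually built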
Environment:
Provide your environment information using the following command: