[Wait for #2846][FSU] Modify Load Logic at FSU @open sesame 01/07 09:50 #2854
NNTrainer currently loads trained weights via model->load(), which is called before inference.
However, FSU loads weights from flash memory layer by layer, only as far ahead as the configured look-ahead value, so weights must not be loaded before inference starts.
To reflect this change, I modified the code in LayerNode and added a new parameter called swap_parm.
commit summary
- commit 1: [Application] Update FSU SimpleFC Application : update the SimpleFC Application for easier testing and for the actual target environment
- commit 2: [FSU] Update layer weight load logic at FSU : update LayerNode so that it does not load weights at load()
Self evaluation:
Signed-off-by: Donghak PARK [email protected]