Does model.compile() initialize all the weights and biases in Keras (tensorflow backend)?

Tags: Tensorflow, Keras

Tensorflow Problem Overview


When I start training a model from scratch, there is no previously saved model, so I can use model.compile() safely. I have since saved the model to an h5 file via a checkpoint for further training.

Say I want to train the model further. I am confused at this point: can I use model.compile() here, and should it be placed before or after the model = load_model() statement? If model.compile() reinitializes all the weights and biases, I would have to place it before the model = load_model() statement.

From some discussions I have read, it seems that model.compile() is only needed when no model has been saved previously; once the model is saved, there is no need to call model.compile() again. Is that true or false? And when I want to predict with the trained model, should I call model.compile() before predicting?

Tensorflow Solutions


Solution 1 - Tensorflow

When to use?

If you're using compile, it must surely come after load_model(). After all, you need a model to compile. (PS: load_model automatically compiles the model with the optimizer that was saved along with it.)
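A minimal sketch of this behavior, using a toy one-layer model and random data (the file path is just an example): after load_model(), you can call fit() directly, with no compile() in between.

```python
import os
import tempfile

import numpy as np
from tensorflow import keras

x = np.random.rand(32, 4)
y = np.random.rand(32, 1)

# Build, compile, and briefly train a toy model.
model = keras.Sequential([keras.layers.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=1, verbose=0)

# Save everything (architecture, weights, optimizer state) to an h5 file.
path = os.path.join(tempfile.mkdtemp(), "toy_model.h5")
model.save(path)

# load_model() recompiles automatically, so training can resume directly.
restored = keras.models.load_model(path)
restored.fit(x, y, epochs=1, verbose=0)  # no compile() call needed
```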

What does compile do?

Compile defines the loss function, the optimizer and the metrics. That's all.

It has nothing to do with the weights and you can compile a model as many times as you want without causing any problem to pretrained weights.

You need a compiled model to train (because training uses the loss function and the optimizer). But it's not necessary to compile a model for predicting.
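As a quick sketch (toy model, random inputs assumed): an uncompiled model can predict, because prediction only runs the forward pass.

```python
import numpy as np
from tensorflow import keras

# Note: no compile() call anywhere.
model = keras.Sequential([keras.layers.Input(shape=(4,)), keras.layers.Dense(1)])

# predict() works on an uncompiled model...
preds = model.predict(np.random.rand(8, 4), verbose=0)

# ...but model.fit(...) here would raise an error, since training
# needs the loss function and optimizer that compile() defines.
```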

Do you need to use compile more than once?

Only if:

  • You want to change one of these:
    • Loss function
    • Optimizer / Learning rate
    • Metrics
    • The trainable property of some layer
  • You loaded (or created) a model that is not compiled yet, or your load/save method didn't preserve the previous compilation.

Consequences of compiling again:

If you compile a model again, you will lose the optimizer state.

This means your training will suffer a little at the beginning while the optimizer re-adjusts the learning rate, the momentum terms, etc. But there is absolutely no damage to the weights (unless, of course, your initial learning rate is so large that the first training step wildly changes the fine-tuned weights).
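This can be checked directly with a toy model: recompiling resets the optimizer but leaves every weight array bit-for-bit identical.

```python
import numpy as np
from tensorflow import keras

model = keras.Sequential([keras.layers.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(np.random.rand(32, 4), np.random.rand(32, 1), epochs=1, verbose=0)

before = [w.copy() for w in model.get_weights()]
model.compile(optimizer="adam", loss="mse")  # fresh optimizer state
after = model.get_weights()

# Weights are untouched by recompiling.
assert all(np.array_equal(a, b) for a, b in zip(before, after))
```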

Solution 2 - Tensorflow

Don't forget that you also need to compile the model after changing the trainable flag of a layer, e.g. when you want to fine-tune a model like this:

  1. load VGG model without top classifier

  2. freeze all the layers (i.e. trainable = False)

  3. add some layers to the top

  4. compile and train the model on some data

  5. un-freeze some of the layers of VGG by setting trainable = True

  6. compile the model again (DON'T FORGET THIS STEP!)

  7. train the model on some data
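The steps above can be sketched as follows. This uses weights=None so nothing is downloaded here (in real fine-tuning you would pass weights="imagenet"), and the classifier head and learning rates are illustrative choices:

```python
from tensorflow import keras

# 1. Load VGG without the top classifier.
base = keras.applications.VGG16(weights=None, include_top=False,
                                input_shape=(32, 32, 3))

# 2. Freeze all the base layers.
base.trainable = False

# 3. Add some layers on top.
model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10, activation="softmax"),
])

# 4. Compile (then train on your data with model.fit(...)).
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# 5. Un-freeze the base layers.
base.trainable = True

# 6. Compile again -- DON'T FORGET THIS STEP! A lower learning
#    rate is typical so the pretrained weights aren't wrecked.
model.compile(optimizer=keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy")

# 7. Train again with model.fit(...) on your data.
```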

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type             | Original Author    | Original Content on Stackoverflow
Question                 | Preetom Saha Arko  | View Question on Stackoverflow
Solution 1 - Tensorflow  | Daniel Möller      | View Answer on Stackoverflow
Solution 2 - Tensorflow  | today              | View Answer on Stackoverflow