How to count total number of trainable parameters in a tensorflow model?
Tags: Neural Network, TensorFlow

Problem Overview
Is there a function call or another way to count the total number of parameters in a tensorflow model?
By parameters I mean: an N-dimensional vector of trainable variables has N parameters, an NxM matrix has N*M parameters, etc. So essentially I'd like to sum the products of the shape dimensions of all the trainable variables in a TensorFlow session.
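The counting rule in the question can be illustrated with plain arithmetic (the shapes below are made up for illustration):

```python
# An N-dim vector contributes N parameters; an NxM matrix contributes N*M
n, m = 3, 4
vector_params = n       # shape (3,)   -> 3
matrix_params = n * m   # shape (3, 4) -> 12
total = vector_params + matrix_params
print(total)  # 15
```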
Neural Network Solutions
Solution 1 - Neural Network
Loop over the shape of every variable in tf.trainable_variables():
total_parameters = 0
for variable in tf.trainable_variables():
    # shape is a TensorShape made up of tf.Dimension objects
    shape = variable.get_shape()
    print(shape)
    print(len(shape))
    variable_parameters = 1
    for dim in shape:
        print(dim)
        variable_parameters *= dim.value
    print(variable_parameters)
    total_parameters += variable_parameters
print(total_parameters)
Update: I wrote an article to clarify the dynamic/static shapes in Tensorflow because of this answer: https://pgaleone.eu/tensorflow/2018/07/28/understanding-tensorflow-tensors-shape-static-dynamic/
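The counting logic itself doesn't need a TensorFlow session; it can be exercised with plain shape tuples (hypothetical shapes for a conv kernel, its bias, and a dense layer):

```python
# Hypothetical trainable-variable shapes: conv kernel, conv bias, dense weights
shapes = [(5, 5, 3, 32), (32,), (1152, 10)]

total_parameters = 0
for shape in shapes:
    variable_parameters = 1
    for dim in shape:
        variable_parameters *= dim
    total_parameters += variable_parameters

print(total_parameters)  # 5*5*3*32 + 32 + 1152*10 = 2400 + 32 + 11520 = 13952
```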
Solution 2 - Neural Network
I have an even shorter version: a one-line solution using numpy:
np.sum([np.prod(v.get_shape().as_list()) for v in tf.trainable_variables()])
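Since get_shape().as_list() returns a plain Python list in TF 1.x, the same expression can be checked with made-up shape lists standing in for the trainable variables:

```python
import numpy as np

# Hypothetical shape lists, as v.get_shape().as_list() would return them
shape_lists = [[64, 128], [128], [128, 10], [10]]
total = int(np.sum([np.prod(s) for s in shape_lists]))
print(total)  # 64*128 + 128 + 128*10 + 10 = 9610
```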
Solution 3 - Neural Network
Not sure if the answer given actually runs (I found you need to convert the dim object to an int for it to work). Here is one that works; you can just copy-paste the functions and call them (added a few comments too):
def count_number_trainable_params():
    '''
    Counts the number of trainable variables.
    '''
    tot_nb_params = 0
    for trainable_variable in tf.trainable_variables():
        shape = trainable_variable.get_shape()  # e.g. [D,F] or [W,H,C]
        current_nb_params = get_nb_params_shape(shape)
        tot_nb_params = tot_nb_params + current_nb_params
    return tot_nb_params

def get_nb_params_shape(shape):
    '''
    Computes the total number of params for a given shape.
    Works for any number of dimensions, e.g. [D,F] or [W,H,C]:
    computes D*F and W*H*C respectively.
    '''
    nb_params = 1
    for dim in shape:
        nb_params = nb_params * int(dim)
    return nb_params
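get_nb_params_shape only touches the TF API through int(dim), so it can be sanity-checked with a plain list standing in for a TensorShape:

```python
def get_nb_params_shape(shape):
    # Same logic as above; int() mirrors the tf.Dimension -> int conversion
    nb_params = 1
    for dim in shape:
        nb_params = nb_params * int(dim)
    return nb_params

print(get_nb_params_shape([3, 3, 64]))  # 3*3*64 = 576
```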
Solution 4 - Neural Network
Update April 2020: tfprof and the Profiler UI have been deprecated in favor of profiler support in TensorBoard.
The two existing answers are good if you're looking into computing the number of parameters yourself. If your question was more along the lines of "is there an easy way to profile my TensorFlow models?", I would highly recommend looking into tfprof. It profiles your model, including calculating the number of parameters.
Solution 5 - Neural Network
I'll throw in my equivalent but shorter implementation:
from functools import reduce

def count_params():
    """Print the number of trainable parameters."""
    size = lambda v: reduce(lambda x, y: x * y, v.get_shape().as_list(), 1)
    n = sum(size(v) for v in tf.trainable_variables())
    print("Model size: %dK" % (n / 1000,))
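The reduce-based size lambda works on any list of ints; with a made-up shape list:

```python
from functools import reduce

size = lambda shape: reduce(lambda x, y: x * y, shape, 1)
print(size([256, 256]))  # 65536
print(size([]))          # 1: the initializer guards against scalars with empty shape
```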
Solution 6 - Neural Network
If one prefers to avoid numpy (it can be left out for many projects), then:
all_trainable_vars = tf.reduce_sum([tf.reduce_prod(v.shape) for v in tf.trainable_variables()])
This is a TF translation of the previous answer by Julius Kunze.
Like any TF operation, it requires a session run to evaluate:
print(sess.run(all_trainable_vars))
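Outside a session, the same reduction is just a sum of products over the shapes; a framework-free sketch with assumed shapes:

```python
import math

# Assumed shapes: a 5x5 conv kernel over 3 channels with 16 filters, plus its bias
shapes = [(5, 5, 3, 16), (16,)]
total = sum(math.prod(s) for s in shapes)
print(total)  # 5*5*3*16 + 16 = 1216
```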
Solution 7 - Neural Network
Now you can use this:
from keras.utils.layer_utils import count_params
count_params(model.trainable_weights)
Solution 8 - Neural Network
model.summary()
Model: "sequential_32"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_88 (Conv2D)           (None, 240, 240, 16)      448
max_pooling2d_87 (MaxPooling (None, 120, 120, 16)      0
conv2d_89 (Conv2D)           (None, 120, 120, 32)      4640
max_pooling2d_88 (MaxPooling (None, 60, 60, 32)        0
conv2d_90 (Conv2D)           (None, 60, 60, 64)        18496
max_pooling2d_89 (MaxPooling (None, 30, 30, 64)        0
flatten_29 (Flatten)         (None, 57600)             0
dropout_48 (Dropout)         (None, 57600)             0
dense_150 (Dense)            (None, 24)                1382424
dense_151 (Dense)            (None, 9)                 225
dense_152 (Dense)            (None, 3)                 30
dense_153 (Dense)            (None, 1)                 4
=================================================================
Total params: 1,406,267
Trainable params: 1,406,267
Non-trainable params: 0
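The Param # column can be recomputed by hand. Assuming 3x3 kernels and a 3-channel (RGB) input, which is consistent with the numbers shown, each conv layer has kernel_h * kernel_w * in_channels * filters + filters parameters and each dense layer has inputs * units + units:

```python
# Recompute each layer's Param # (assumed 3x3 kernels, 3-channel input)
conv2d_88 = 3 * 3 * 3 * 16 + 16      # 448
conv2d_89 = 3 * 3 * 16 * 32 + 32     # 4640
conv2d_90 = 3 * 3 * 32 * 64 + 64     # 18496
dense_150 = 57600 * 24 + 24          # 1382424
dense_151 = 24 * 9 + 9               # 225
dense_152 = 9 * 3 + 3                # 30
dense_153 = 3 * 1 + 1                # 4

total = (conv2d_88 + conv2d_89 + conv2d_90 +
         dense_150 + dense_151 + dense_152 + dense_153)
print(total)  # 1406267, matching "Total params" above
```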