When using a pre-trained model, we often need to restore variables from checkpoint files.
A common error is "Tensor name not found in checkpoint".
This is a good time to check which variables are actually stored in the ckpt file:
import os
from tensorflow.python import pywrap_tensorflow

checkpoint_path = os.path.join(model_dir, "")  # fill in your checkpoint file name here
# Read data from the checkpoint file
reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
var_to_shape_map = reader.get_variable_to_shape_map()
# Print tensor names and values
for key in var_to_shape_map:
  print("tensor_name: ", key)
  print(reader.get_tensor(key))
This prints the tensor names and values stored in the ckpt file, and of course you can also inspect them while debugging in PyCharm.
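Once you know which names are actually in the checkpoint, you can restore only those variables into the current graph, which avoids the "Tensor name not found" error. The following is a minimal sketch of my own (not the original code), assuming TensorFlow 1.x and a hypothetical checkpoint prefix "model_dir/model.ckpt":

import os
import tensorflow as tf
from tensorflow.python import pywrap_tensorflow

checkpoint_path = os.path.join("model_dir", "model.ckpt")  # hypothetical prefix, adjust to your own files

reader = pywrap_tensorflow.NewCheckpointReader(checkpoint_path)
ckpt_vars = reader.get_variable_to_shape_map()  # {variable name: shape}

# Keep only the graph variables whose name and shape also exist in the checkpoint
restore_vars = [v for v in tf.global_variables()
                if v.op.name in ckpt_vars and v.shape.as_list() == ckpt_vars[v.op.name]]

saver = tf.train.Saver(var_list=restore_vars)
with tf.Session() as sess:
  sess.run(tf.global_variables_initializer())  # initialize everything first
  saver.restore(sess, checkpoint_path)         # then overwrite the matched variables with checkpoint values

Restricting var_list this way lets a partially matching pre-trained checkpoint be loaded without raising an error for the missing names.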
ADDITIONAL: to read the values saved in a TensorFlow model, use tf.train.NewCheckpointReader(model_dir).
A standard saved model consists of the following files; model_dir here is the prefix MyModel (with no suffix):

checkpoint
MyModel.data-00000-of-00001
MyModel.index
MyModel.meta
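If you do not want to type the prefix by hand, the "checkpoint" file in that directory records the most recent prefix and can be queried. A small sketch, assuming TensorFlow 1.x and that model_dir is the directory holding the files above:

import tensorflow as tf

model_dir = "./"  # hypothetical: the directory containing checkpoint, MyModel.data-*, MyModel.index, MyModel.meta
ckpt_prefix = tf.train.latest_checkpoint(model_dir)  # returns the prefix, e.g. "./MyModel", or None if nothing is found
print(ckpt_prefix)

reader = tf.train.NewCheckpointReader(ckpt_prefix)
print(reader.get_variable_to_shape_map())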
import tensorflow as tf
import pprint  # use pprint to improve print readability

NewCheck = tf.train.NewCheckpointReader("model")
Print all variables in the model
print("debug_string:\n") (NewCheck.debug_string().decode("utf-8"))
The output has three fields: name, data type, and shape.

Getting the value of a variable:
print("get_tensor:\n") (NewCheck.get_tensor("D/conv2d/bias"))
print("get_variable_to_dtype_map\n") (NewCheck.get_variable_to_dtype_map()) print("get_variable_to_shape_map\n") (NewCheck.get_variable_to_shape_map())
The above is based on my personal experience; I hope it can serve as a reference, and I hope you will continue to support me. If there is any mistake or anything that has not been fully considered, please do not hesitate to point it out.