TensorFlow Basics
2022-07-03 10:28:00 [JallinRichel]
Source: the TensorFlow official website
(This article is not yet complete.)
TensorFlow is an end-to-end machine learning platform. It supports:
- Multidimensional-array-based numerical computation (similar to NumPy)
- GPU and distributed processing
- Automatic differentiation
- Model construction, training, and export
- And more
Tensors (multidimensional arrays)
TensorFlow operates on multidimensional arrays, or tensors, represented as tf.Tensor objects. The two most important attributes are shape and dtype (see the sketch after the list):
- Tensor.shape: the size of the tensor along each of its axes
- Tensor.dtype: the data type of all elements in the tensor
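For illustration, a minimal sketch (not from the original article) of a tensor and these two attributes:

import tensorflow as tf

# A rank-2 tensor (a matrix) with two rows and three columns.
x = tf.constant([[1., 2., 3.],
                 [4., 5., 6.]])

print(x.shape)  # (2, 3): the size along each axis
print(x.dtype)  # <dtype: 'float32'>: the type of every element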
TensorFlow implements standard mathematical operations on tensors, as well as many operations specialized for machine learning.
Large computations can be slow on a CPU; to speed up your program, you can run these operations on a GPU.
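As a quick sketch of both kinds of operations (the example tensor is an assumption, not from the article):

x = tf.constant([[1., 2., 3.],
                 [4., 5., 6.]])

print(x + x)                      # element-wise addition
print(5 * x)                      # scalar multiplication
print(x @ tf.transpose(x))        # matrix multiplication
print(tf.nn.softmax(x, axis=-1))  # a machine-learning-specific operation
print(tf.reduce_sum(x))           # reduction across all elements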
Variable
tf.Tensor objects are immutable. To store model weights (or other mutable state), use a tf.Variable.
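A minimal sketch of the difference:

v = tf.Variable([0.0, 0.0, 0.0])

# Unlike a tf.Tensor, a tf.Variable can be updated in place.
v.assign([1., 2., 3.])
v.assign_add([1., 1., 1.])
print(v.numpy())  # [2. 3. 4.]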
Automatic differentiation
Gradient descent and related algorithms are a cornerstone of modern machine learning. TensorFlow implements automatic differentiation, which uses calculus to compute gradients. Typically you will use this to calculate the gradient of a model's error or loss with respect to its weights.
TensorFlow can compute gradients with respect to any number of non-scalar tensors simultaneously.
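The article gives no snippet here; as a sketch, the usual tool for this is tf.GradientTape (the quadratic g below is an illustrative assumption):

x = tf.Variable(1.0)

def g(x):
  return x**2 + 2*x - 5

with tf.GradientTape() as tape:
  y = g(x)

# dy/dx = 2x + 2, which is 4.0 at x = 1.
print(tape.gradient(y, x).numpy())  # 4.0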
Graphs and tf.function
You can use TensorFlow interactively, like any Python library. TensorFlow also provides tools for:
- Performance optimization: to speed up training and inference
- Export: so you can save your model when it's done training
These require that you use tf.function to separate your pure-TensorFlow code from Python:
@tf.function
def my_func(x):
  print('Tracing.\n')
  return tf.reduce_sum(x)
The first time you run the tf.function, although it executes in Python, it captures a complete, optimized graph representing the TensorFlow computations done within the function. On subsequent calls, TensorFlow executes only the optimized graph, skipping any non-TensorFlow steps. For inputs with a different signature (shape and dtype), the graph may not be reusable, so a new graph is generated instead.
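Calling the function above twice with the same input signature shows this: the Python print only runs during the initial trace.

x = tf.constant([1, 2, 3])

my_func(x)  # prints 'Tracing.' and returns 6
my_func(x)  # reuses the captured graph; nothing is printed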
These captured graphs provide two benefits:
- In many cases they provide a significant speedup in execution
- You can export these graphs using tf.saved_model, to run on other systems without a Python installation
Modules, layers, and models
tf.Module is a class for managing your tf.Variable objects and the tf.function objects that operate on them. The tf.Module class is necessary to support two significant features:
- You can save and restore the values of your variables using tf.train.Checkpoint (a sketch follows the list).
- You can import and export the tf.Variable values and the tf.function graphs using tf.saved_model.
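A hedged sketch of the first point (the attribute name v and the checkpoint prefix are made up for illustration):

v = tf.Variable(3.0)
checkpoint = tf.train.Checkpoint(v=v)
path = checkpoint.save('./ckpt/demo')  # writes checkpoint files to disk

v.assign(0.0)             # overwrite the value...
checkpoint.restore(path)  # ...then restore it from the checkpoint
print(v.numpy())          # 3.0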
Here is a simple example of exporting a tf.Module object:
class MyModule(tf.Module):
  def __init__(self, value):
    self.weight = tf.Variable(value)

  @tf.function
  def multiply(self, x):
    return x * self.weight

mod = MyModule(3)
mod.multiply(tf.constant([1, 2, 3]))
Save the Module:
save_path = './saved'
tf.saved_model.save(mod, save_path)
The resulting saved model is independent of the code that created it.
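The original stops at saving; loading it back (in this or another Python process) is roughly a one-liner:

reloaded = tf.saved_model.load(save_path)
print(reloaded.multiply(tf.constant([1, 2, 3])))  # [3, 6, 9]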
Training loops
Now, put all of this together to build a basic model and train it from scratch.
First, create some example data. This generates a cloud of points loosely scattered along a quadratic curve:
import matplotlib
from matplotlib import pyplot as plt

matplotlib.rcParams['figure.figsize'] = [9, 6]

x = tf.linspace(-2, 2, 201)
x = tf.cast(x, tf.float32)

def f(x):
  y = x**2 + 2*x - 5
  return y

y = f(x) + tf.random.normal(shape=[201])

plt.plot(x.numpy(), y.numpy(), '.', label='Data')
plt.plot(x, f(x), label='Ground truth')
plt.legend();
Build a model:
class Model(tf.keras.Model):
  def __init__(self, units):
    super().__init__()
    self.dense1 = tf.keras.layers.Dense(units=units,
                                        activation=tf.nn.relu,
                                        kernel_initializer=tf.random.normal,
                                        bias_initializer=tf.random.normal)
    self.dense2 = tf.keras.layers.Dense(1)

  def call(self, x, training=True):
    # For Keras layers/models, implement `call` instead of `__call__`.
    x = x[:, tf.newaxis]
    x = self.dense1(x)
    x = self.dense2(x)
    return tf.squeeze(x, axis=1)

model = Model(64)
plt.plot(x.numpy(), y.numpy(), '.', label='Data')
plt.plot(x, f(x), label='Ground truth')
plt.plot(x, model(x), label='Untrained predictions')
plt.title('Before training')
plt.legend();
Write a basic training loop:
variables = model.variables
optimizer = tf.optimizers.SGD(learning_rate=0.01)

for step in range(1000):
  with tf.GradientTape() as tape:
    prediction = model(x)
    error = (y - prediction)**2
    mean_error = tf.reduce_mean(error)
  gradient = tape.gradient(mean_error, variables)
  optimizer.apply_gradients(zip(gradient, variables))

  if step % 100 == 0:
    print(f'Mean squared error: {mean_error.numpy():0.3f}')
plt.plot(x.numpy(), y.numpy(), '.', label='Data')
plt.plot(x, f(x), label='Ground truth')
plt.plot(x, model(x), label='Trained predictions')
plt.title('After training')
plt.legend();
That works, but the tf.keras module provides common training utilities, so consider what is already available before writing your own training loop. The Model.compile and Model.fit methods implement a training loop for you:
new_model = Model(64)
new_model.compile(
    loss=tf.keras.losses.MSE,
    optimizer=tf.optimizers.SGD(learning_rate=0.01))

history = new_model.fit(x, y,
                        epochs=100,
                        batch_size=32,
                        verbose=0)

new_model.save('./my_model')
plt.plot(history.history['loss'])
plt.xlabel('Epoch')
plt.ylim([0, max(plt.ylim())])
plt.ylabel('Loss [Mean Squared Error]')
plt.title('Keras training progress');