PyTorch quantization-aware training (QAT) steps
2022-06-30 21:47:00 【Breeze_】
# QAT follows the same steps as PTQ, with the exception of the training loop
# before you actually convert the model to its quantized version
'''
Quantization-aware training steps:
  step1. Build a model
  step2. Fuse modules (optional)
  step3. Insert stubs (steps 1 and 3 can be combined)
  step4. Prepare (mainly: choose the backend architecture)
  step5. Train
  step6. Convert the model
'''
import torch
from torch import nn
backend = "fbgemm"  # running on an x86 CPU; use "qnnpack" if running on ARM
'''step1. Build a model'''
m = nn.Sequential(
    nn.Conv2d(2, 64, 8),
    nn.ReLU(),
    nn.Conv2d(64, 128, 8),
    nn.ReLU(),
)
"""step2. The fusion Fuse( An optional step )"""
torch.quantization.fuse_modules(m, ['0','1'], inplace=True) # fuse first Conv-ReLU pair
torch.quantization.fuse_modules(m, ['2','3'], inplace=True) # fuse second Conv-ReLU pair
"""step3. Insert stubs In the model ,Insert stubs"""
m = nn.Sequential(
    torch.quantization.QuantStub(),
    *m,
    torch.quantization.DeQuantStub(),
)
"""step4. Get ready Prepare"""
m.train()
m.qconfig = torch.quantization.get_default_qat_qconfig(backend)  # QAT needs the fake-quantize qconfig, not the PTQ observer-only one
torch.quantization.prepare_qat(m, inplace=True)
"""step5. Training Training Loop"""
n_epochs = 10
opt = torch.optim.SGD(m.parameters(), lr=0.1)
loss_fn = lambda out, tgt: torch.pow(tgt-out, 2).mean()
for epoch in range(n_epochs):
    x = torch.rand(10, 2, 24, 24)
    out = m(x)
    loss = loss_fn(out, torch.rand_like(out))
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(loss)
"""step6. Model transformation Convert"""
m.eval()
torch.quantization.convert(m, inplace=True)
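As a quick end-to-end sanity check of the same prepare → train → convert cycle, here is a condensed, self-contained sketch (my addition, not part of the original post). It uses a single Conv-ReLU layer, skips fusion, and substitutes one calibration forward pass for the real training loop, then runs inference on the converted int8 model:

```python
import torch
from torch import nn

backend = "fbgemm"  # x86; use "qnnpack" on ARM
torch.backends.quantized.engine = backend

# Steps 1-3: a one-layer model wrapped in quant/dequant stubs
model = nn.Sequential(
    torch.quantization.QuantStub(),
    nn.Conv2d(2, 64, 8),
    nn.ReLU(),
    torch.quantization.DeQuantStub(),
)

# Steps 4-5: prepare for QAT; a single forward pass stands in for training
model.train()
model.qconfig = torch.quantization.get_default_qat_qconfig(backend)
torch.quantization.prepare_qat(model, inplace=True)
model(torch.rand(8, 2, 24, 24))  # lets the fake-quant observers record ranges

# Step 6: convert to a real int8 model and run inference
model.eval()
torch.quantization.convert(model, inplace=True)
with torch.no_grad():
    y = model(torch.rand(1, 2, 24, 24))
print(y.shape)  # (1, 64, 17, 17): 24 - 8 + 1 = 17
print(y.dtype)  # torch.float32 -- DeQuantStub returns float outputs
```

Note that the weights are now stored in int8 inside the converted `Conv2d`, while the `DeQuantStub` turns the output back into an ordinary float tensor for downstream code.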