Using logistic regression and neural network to deal with complex binary classification problems
2022-07-29 09:05:00 【Salty salty】
import torch
import numpy as np
from torch import nn
from torch.autograd import Variable
import torch.nn.functional as F
import os
os.environ["KMP_DUPLICATE_LIB_OK"]="TRUE"
import matplotlib.pyplot as plt
%matplotlib inline
# Plot the decision boundary of a model
def plot_decision_boundary(model, x, y):
    # Set the minimum and maximum values and add some padding
    x_min, x_max = x[:, 0].min() - 1, x[:, 0].max() + 1
    y_min, y_max = x[:, 1].min() - 1, x[:, 1].max() + 1
    h = 0.01
    # Generate a grid of points with spacing h between them
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
    # Predict the function value for the whole grid
    z = model(np.c_[xx.ravel(), yy.ravel()])
    z = z.reshape(xx.shape)
    # Draw the contour and the training examples
    plt.contourf(xx, yy, z, cmap=plt.cm.Spectral)
    plt.ylabel('x2')
    plt.xlabel('x1')
    plt.scatter(x[:, 0], x[:, 1], c=y.reshape(-1), s=40, cmap=plt.cm.Spectral)
np.random.seed(1)
m = 400          # number of samples
N = int(m / 2)   # number of points per class
D = 2            # dimensionality
x = np.zeros((m, D))
y = np.zeros((m, 1), dtype='uint8')  # label vector: 0 means red, 1 means blue
a = 4
for j in range(2):
    ix = range(N * j, N * (j + 1))
    t = np.linspace(j * 3.12, (j + 1) * 3.12, N) + np.random.randn(N) * 0.2  # angle
    r = a * np.sin(4 * t) + np.random.randn(N) * 0.2                         # radius
    x[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
    y[ix] = j
plt.scatter(x[:, 0], x[:, 1], c=y.reshape(-1), s=40, cmap=plt.cm.Spectral)
x = torch.from_numpy(x).float()
y = torch.from_numpy(y).float()
w = nn.Parameter(torch.randn(2, 1))
b = nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([w, b], 1e-1)
def logistic_regression(x):
    return torch.mm(x, w) + b
criterion = nn.BCEWithLogitsLoss()
for e in range(100):
    out = logistic_regression(Variable(x))
    loss = criterion(out, Variable(y))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if (e + 1) % 20 == 0:
        print('epoch: {}, loss: {}'.format(e + 1, loss.item()))

def plot_logistic(x):
    x = Variable(torch.from_numpy(x).float())
    out = torch.sigmoid(logistic_regression(x))  # F.sigmoid is deprecated; use torch.sigmoid
    out = (out > 0.5) * 1
    return out.data.numpy()
plot_decision_boundary(lambda x: plot_logistic(x), x.numpy(), y.numpy())
plt.title('logistic regression')

We can clearly see that logistic regression cannot solve this complex binary classification problem: its decision boundary is a straight line, which cannot separate the two interleaved spirals. Next we use a neural network to solve it.
# Define the parameters of a two-layer neural network
w1 = nn.Parameter(torch.randn(2, 4) * 0.01)  # the hidden layer has 4 neurons
b1 = nn.Parameter(torch.zeros(4))
w2 = nn.Parameter(torch.randn(4, 1) * 0.01)
b2 = nn.Parameter(torch.zeros(1))
# Define the model
def two_network(x):
    x1 = torch.mm(x, w1) + b1
    x1 = torch.tanh(x1)
    x2 = torch.mm(x1, w2) + b2
    return x2
# SGD is stochastic gradient descent. To use torch.optim we build an optimizer
# object, which maintains the current parameter state and updates the
# parameters based on the computed gradients.
optimizer = torch.optim.SGD([w1, w2, b1, b2], 1.)
criterion = nn.BCEWithLogitsLoss()
# Train for 10000 iterations
for e in range(10000):
    out = two_network(Variable(x))
    loss = criterion(out, Variable(y))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if (e + 1) % 1000 == 0:
        print('epoch: {}, loss: {}'.format(e + 1, loss.item()))
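The comment on the optimizer above can be made concrete. For plain SGD, each call to optimizer.step() applies the update p ← p − lr · p.grad to every registered parameter. A minimal sketch of this (my own illustration, not from the original post):

```python
import torch

p = torch.tensor([2.0], requires_grad=True)
opt = torch.optim.SGD([p], lr=0.1)

loss = (p ** 2).sum()  # dloss/dp = 2p = 4.0
opt.zero_grad()
loss.backward()
opt.step()             # p <- 2.0 - 0.1 * 4.0 = 1.6
```

This is exactly what happens to w1, b1, w2, and b2 on every iteration of the training loop above, just with the gradients coming from the BCE loss instead of p².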

def plot_network(x):
    x = Variable(torch.from_numpy(x).float())
    x1 = torch.mm(x, w1) + b1
    x1 = torch.tanh(x1)
    x2 = torch.mm(x1, w2) + b2
    out = torch.sigmoid(x2)
    out = (out > 0.5) * 1
    return out.data.numpy()
plot_decision_boundary(lambda x: plot_network(x), x.numpy(), y.numpy())
plt.title('2 layer network')

We can see that for this complex binary classification problem the neural network is much better suited: the hidden layer lets it learn a nonlinear decision boundary that follows the two spirals, which a linear model like logistic regression cannot do.
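As a side note, the same two-layer model can be written more idiomatically with nn.Sequential, which registers the parameters automatically so we don't have to hand them to the optimizer one by one. A minimal sketch (my own, not from the original post, shown here on small random data):

```python
import torch
from torch import nn

# Same architecture as two_network above: 2 -> 4 (tanh) -> 1
model = nn.Sequential(
    nn.Linear(2, 4),
    nn.Tanh(),
    nn.Linear(4, 1),
)
optimizer = torch.optim.SGD(model.parameters(), lr=1.0)
criterion = nn.BCEWithLogitsLoss()

# A tiny smoke run on random data, just to show the training step
x = torch.randn(16, 2)
y = (x[:, 0] * x[:, 1] > 0).float().unsqueeze(1)
for _ in range(100):
    out = model(x)                # raw logits, as BCEWithLogitsLoss expects
    loss = criterion(out, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

On the spiral data, model(x) would simply replace the two_network(Variable(x)) call in the training loop above.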