[First blog] Notes on the P controller implementation in the Udacity course
2022-07-29 08:26:00 【Learn make me happy】
Preface
Tips: here you can add a rough summary of what this article records:
If the interviewer reading this happens to be interviewing me, please let me pass. (Just kidding.)
Tips: the main body of the article follows; the example below is for reference.
I. Why this blog?
To record my learning process and to exchange ideas with others.
II. Python implementation of the PID control algorithm from the Udacity course
1. Importing the libraries
These are used later, so let's go over them first. The code is as follows:
```python
import random
import numpy as np
import matplotlib.pyplot as plt
```
There is also this:
```python
# maximum recursion depth exceeded
import sys
sys.setrecursionlimit(3000)
```
I didn't really understand this at first, so I pasted the docstring of sys.setrecursionlimit here (I remember markdown has a way to insert code inline, but I forget how; I'll add it later), and now it makes a bit more sense. In short, it keeps the stack from blowing up.
```
setrecursionlimit(n)

Set the maximum depth of the Python interpreter stack to n.  This
limit prevents infinite recursion from causing an overflow of the C
stack and crashing Python.  The highest possible limit is
platform-dependent.
```
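To convince myself what the limit actually does, here is a minimal sketch (my own addition, not from the course) showing that a deep recursion fails under the default limit and succeeds once the limit is raised:

```python
import sys

def depth(n):
    # recurse n levels; raises RecursionError once the interpreter limit is hit
    if n == 0:
        return 0
    return 1 + depth(n - 1)

print(sys.getrecursionlimit())   # usually 1000 by default
sys.setrecursionlimit(3000)
print(depth(2500))               # fine now; would raise RecursionError under the default limit
```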
2. The Robot class and running the controller
This part is a bit more involved, so I'll put my comments directly into the code and copy the whole thing here, whichever way is most efficient:
```python
class Robot(object):
    def __init__(self, length=20.0):
        """
        Creates robot and initializes location/orientation to 0, 0, 0.
        """
        self.x = 0.0
        self.y = 0.0
        # TODO: this angle should be the angle between the vehicle's x axis and
        # the global X axis -- I'm not completely sure about that.
        self.orientation = 0.0
        self.length = length
        self.steering_noise = 0.0
        self.distance_noise = 0.0
        self.steering_drift = 0.0

    def set(self, x, y, orientation):
        """
        Sets a robot coordinate: position and orientation.
        """
        self.x = x
        self.y = y
        # % is the modulo operator here -- I first read it as a division and
        # puzzled over it for half a day. Don't jump to conclusions.
        self.orientation = orientation % (2.0 * np.pi)

    def set_noise(self, steering_noise, distance_noise):
        """
        Sets the noise parameters -- noise is everywhere.
        """
        # makes it possible to change the noise parameters
        # this is often useful in particle filters
        self.steering_noise = steering_noise
        self.distance_noise = distance_noise

    def set_steering_drift(self, drift):
        """
        Sets the systematic steering drift parameter.
        This probably accounts for play in the steering gear, and should also
        cover any initial offset.
        """
        self.steering_drift = drift

    def move(self, steering, distance, tolerance=0.001, max_steering_angle=np.pi / 4.0):
        """
        steering = front wheel steering angle, limited by max_steering_angle
        distance = total distance driven, must be non-negative
        """
        # clamp steering angles that exceed the physical limit
        if steering > max_steering_angle:
            steering = max_steering_angle
        if steering < -max_steering_angle:
            steering = -max_steering_angle
        # a negative distance makes no sense, so clamp it to zero
        if distance < 0.0:
            distance = 0.0

        # apply noise: Gaussian noise follows a normal distribution.
        # This is a blind spot of mine that I need to fill in later.
        steering2 = random.gauss(steering, self.steering_noise)
        distance2 = random.gauss(distance, self.distance_noise)

        # apply steering drift on top of the noisy steering value
        steering2 += self.steering_drift

        # Execute motion. I wasn't sure which model is used here, so I asked
        # Lao Wang to help derive the kinematic model -- see the notes below.
        turn = np.tan(steering2) * distance2 / self.length

        if abs(turn) < tolerance:
            # approximate by straight line motion
            self.x += distance2 * np.cos(self.orientation)
            self.y += distance2 * np.sin(self.orientation)
            self.orientation = (self.orientation + turn) % (2.0 * np.pi)
        else:
            # approximate bicycle model for motion
            radius = distance2 / turn
            cx = self.x - (np.sin(self.orientation) * radius)
            cy = self.y + (np.cos(self.orientation) * radius)
            self.orientation = (self.orientation + turn) % (2.0 * np.pi)
            self.x = cx + (np.sin(self.orientation) * radius)
            self.y = cy - (np.cos(self.orientation) * radius)

    def __repr__(self):
        return '[x=%.5f y=%.5f orient=%.5f]' % (self.x, self.y, self.orientation)
```
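A couple of notes to myself on the code above. `random.gauss(mu, sigma)` draws a single sample from a normal distribution with mean mu and standard deviation sigma, so the commanded steering angle and distance simply get jittered around their nominal values. As for the motion model, this is my reconstruction of the bicycle-model kinematics that `move()` appears to implement (read off the code itself, so treat it as a sketch): with wheelbase $L$, front-wheel steering angle $\delta$ and per-step distance $d$, the heading change is

$$
\beta = \frac{d}{L}\tan\delta .
$$

If $|\beta|$ is below the tolerance, the step is treated as straight-line motion; otherwise the car rotates about the instantaneous turning center with radius $R = d/\beta$:

$$
\begin{aligned}
c_x &= x - R\sin\theta, & c_y &= y + R\cos\theta,\\
\theta' &= (\theta + \beta) \bmod 2\pi,\\
x' &= c_x + R\sin\theta', & y' &= c_y - R\cos\theta'.
\end{aligned}
$$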
```python
############## ADD / MODIFY CODE BELOW ####################
# ------------------------------------------------------------------------
#
# run - does a single control run

robot = Robot()      # instantiate the robot
robot.set(0, 1, 0)   # set the initial state of the object


def run(robot, tau, n=100, speed=1.0):
    x_trajectory = []
    y_trajectory = []
    # TODO: your code here -- this is where the controller is implemented
    for i in range(n):
        cte = robot.y          # cross-track error: distance from the reference line y = 0
        steer = -tau * cte     # proportional control
        robot.move(steer, speed)
        x_trajectory.append(robot.x)
        y_trajectory.append(robot.y)
    return x_trajectory, y_trajectory


x_trajectory, y_trajectory = run(robot, 1)   # the second argument is the P gain
n = len(x_trajectory)
print(x_trajectory)
print(y_trajectory)

# plt.plot(x_trajectory, y_trajectory, 'g', label='P controller')
# Plotting. At first the figure didn't show up -- when this is run as a plain
# script, nothing is displayed until plt.show() is called at the end.
fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 8))
ax1.plot(x_trajectory, y_trajectory, 'g', label='P controller')
ax1.plot(x_trajectory, np.zeros(n), 'r', label='reference')
plt.show()
```
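In `run()` the cross-track error `cte` is simply the robot's y coordinate (the reference trajectory is the line y = 0), and the steering command is `steer = -tau * cte`, i.e. pure proportional control. As a small experiment of my own (not part of the course code), the sketch below re-runs the controller with a few different gains; a larger tau corrects the error faster, but a pure P controller keeps overshooting and oscillating around the reference:

```python
# My own comparison sketch (assumes the Robot class and run() defined above).
for tau in (0.1, 0.3, 1.0):
    r = Robot()
    r.set(0, 1, 0)                          # same initial offset as before
    xs, ys = run(r, tau)
    plt.plot(xs, ys, label='tau = %.1f' % tau)

plt.axhline(0.0, color='r', linestyle='--', label='reference')
plt.xlabel('x')
plt.ylabel('y')
plt.legend()
plt.show()
```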
Summary
Dear interviewer, whatever you make of me, nobody becomes an old hand without first being a novice. All I want to say is: veterans, take me along.