Comparison of Optical Motion Capture and UWB Positioning Technology in Multi-agent Cooperative Control Research
2022-07-31 13:47:00 【MocapLeader】
Human work has always emphasized teamwork. With the interdisciplinary development and integration of control science, computer science, and related disciplines, controlling a single robot, unmanned aerial vehicle, or unmanned ground vehicle can no longer meet the technical needs of the field. Instead, these agents are expected to complete multi-unit, multi-dimensional cooperative tasks, and the cooperative control of multi-agent systems has become a research hotspot in control theory, mathematics, communication, biology, artificial intelligence, and many other fields.
The two mainstream positioning technologies currently used for multi-agent cooperative control are optical motion capture and UWB (Ultra-Wideband). Each has its own characteristics.
Types of Data Obtained
In a cooperative control experiment, an optical motion capture system uses its cameras to capture the positions of the markers attached to each agent, computes the agent's position and attitude from them, and streams the data to the host computer through an SDK or VRPN. The control software on the host then processes the data and sends commands to the agents over a wireless link, achieving real-time control. UWB, by contrast, is a wireless carrier communication technology that uses a frequency bandwidth above 1 GHz; it is not strictly a positioning technology. Each agent must carry a signal-transmitting tag, and its location is obtained indirectly from the radio measurements. In comparison, optical motion capture provides both the position and the attitude of the agent, while UWB provides position only. With market demands becoming increasingly diverse and complex, an agent that can only translate in position is no longer sufficient: a drone is now expected not just to fly in a straight line, but to know its attitude in the air, adjust it at any time, and even perform rolls. For such requirements, UWB alone cannot meet the control needs.
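To make the motion capture data path more concrete, the following is a minimal sketch of how a host application might subscribe to one agent's 6-DoF pose over VRPN. It is not tied to any specific vendor SDK; the tracker name RigidBody1 and the server address 192.168.1.100 are hypothetical placeholders for whatever the motion capture software actually exposes.

```cpp
// Minimal VRPN client sketch: subscribe to one rigid body's pose stream.
// Assumes the motion capture software runs a VRPN server; the tracker name
// "RigidBody1" and address "192.168.1.100" are hypothetical placeholders.
#include <cstdio>
#include <vrpn_Tracker.h>

// Called whenever a new pose sample arrives from the server.
void VRPN_CALLBACK handle_pose(void* /*userData*/, const vrpn_TRACKERCB t)
{
    // t.pos  : position (x, y, z) in meters
    // t.quat : orientation quaternion (qx, qy, qz, qw)
    std::printf("sensor %d  pos=(%.4f, %.4f, %.4f)  quat=(%.4f, %.4f, %.4f, %.4f)\n",
                (int)t.sensor, t.pos[0], t.pos[1], t.pos[2],
                t.quat[0], t.quat[1], t.quat[2], t.quat[3]);
}

int main()
{
    // "TrackerName@host:port" -- 3883 is VRPN's default port.
    vrpn_Tracker_Remote tracker("RigidBody1@192.168.1.100:3883");
    tracker.register_change_handler(nullptr, handle_pose);

    // Pump the connection; each call delivers pending pose updates to the callback.
    // A real application would pace this loop inside its control cycle.
    while (true) {
        tracker.mainloop();
    }
    return 0;
}
```

The sketch links against the open-source VRPN client library; in a real setup the callback would feed the pose into the cooperative control loop rather than print it.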

Capture Accuracy and Latency
The differences go beyond the type of data provided. A UWB system deployed indoors, in typical civil and commercial use, achieves only roughly centimeter-level accuracy, and its transmission range is limited to about 10 m. Optical motion capture, by contrast, can reach sub-millimeter accuracy indoors, and the capture volume has no fixed upper limit: it scales with the venue and the number of cameras.
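To illustrate why UWB accuracy is tied to its ranging error, the following is a simplified 2D multilateration sketch: given the known positions of three fixed anchors and the measured tag-to-anchor ranges, the tag position is obtained by linearizing the range equations and solving a small linear system. The anchor layout and range values are made-up examples; real systems use more anchors and filtering.

```cpp
// Simplified 2D UWB multilateration sketch (illustrative values only).
// Given anchors a_i and measured ranges r_i, subtracting the range equation
// of anchor 0 from the others yields a linear system in the tag position.
#include <cstdio>

struct Point { double x, y; };

// Solve for the tag position from exactly three anchors (no noise handling).
bool trilaterate(const Point a[3], const double r[3], Point& tag)
{
    // Linearized equations: 2(a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2
    double A11 = 2.0 * (a[1].x - a[0].x), A12 = 2.0 * (a[1].y - a[0].y);
    double A21 = 2.0 * (a[2].x - a[0].x), A22 = 2.0 * (a[2].y - a[0].y);
    double b1 = r[0]*r[0] - r[1]*r[1]
              + a[1].x*a[1].x + a[1].y*a[1].y - a[0].x*a[0].x - a[0].y*a[0].y;
    double b2 = r[0]*r[0] - r[2]*r[2]
              + a[2].x*a[2].x + a[2].y*a[2].y - a[0].x*a[0].x - a[0].y*a[0].y;

    double det = A11 * A22 - A12 * A21;   // singular if the anchors are collinear
    if (det > -1e-9 && det < 1e-9) return false;

    tag.x = (b1 * A22 - b2 * A12) / det;  // Cramer's rule for the 2x2 system
    tag.y = (A11 * b2 - A21 * b1) / det;
    return true;
}

int main()
{
    // Hypothetical anchor layout (meters) and ranges measured for a tag at (2, 3).
    const Point anchors[3] = { {0.0, 0.0}, {8.0, 0.0}, {0.0, 8.0} };
    const double ranges[3] = { 3.606, 6.708, 5.385 };

    Point tag;
    if (trilaterate(anchors, ranges, tag))
        std::printf("estimated tag position: (%.3f, %.3f) m\n", tag.x, tag.y);
    return 0;
}
```

Because the position estimate is a direct function of the measured ranges, ranging noise of a few centimeters shows up at the same order of magnitude in the estimated position, which is consistent with the centimeter-level accuracy mentioned above.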
The optical motion capture system not only provides high-precision, low-interference measurements, but also streams the agents' position and attitude back to the host in real time. The overall latency of the data returned for each agent is low, down to a few milliseconds.
Hardware Comparison
In terms of hardware, UWB requires a radio-transmitting device to be installed on each agent, which adds complexity and uncertainty to the cooperative control process. An optical motion capture system only requires a few very light reflective markers to be attached to the agent; no other equipment is needed and no additional interference is introduced. This not only ensures accurate position and attitude information, but also keeps the cooperative control process undisturbed and provides a stable and reliable working space.
Typical Cases
Many teams in China are conducting research on multi-agent cooperative control. Professor Xia Yuanqing's team at the Beijing Institute of Technology has completed research on air-ground cooperative control. After repeatedly comparing different positioning technologies, the team adopted the NOKOV Metric optical 3D motion capture system as the positioning system for the study. Throughout the research, the NOKOV system provided the spatial position and attitude of the unmanned vehicles, allowing the work to proceed smoothly. The preliminary research was successfully completed and was reported by Xinhuanet.