Hisilicon 3559 universal platform construction: RTSP real-time playback support
2022-07-07 03:37:00 【Run! The bug is coming】
Preface
To build a general-purpose platform with complete functionality, RTSP support is naturally essential. Both H.264 encoding and the RTSP protocol we use for real-time transmission could each be singled out as a research direction in its own right; fortunately, as far as basic functional support goes, we can treat them as black boxes for now, which makes implementation much easier. Knowledge is endless, after all; we just take what we need.
RTSP
The Real Time Streaming Protocol (RTSP), defined in RFC 2326, is an application-layer protocol in the TCP/IP protocol suite, submitted as an IETF RFC standard by Columbia University, Netscape, and RealNetworks. The protocol defines how one-to-many applications can efficiently transmit multimedia data over IP networks.
Architecturally, RTSP sits above RTP and RTCP and uses TCP or UDP to complete the data transfer. Compared with HTTP, which delivers HTML, RTSP transmits multimedia data.
RTSP is a text-based protocol using the ISO 10646 character set with the UTF-8 encoding scheme. Lines are terminated with CRLF, and a message comprises a message type, headers, body, and length; a receiver may also interpret a bare CR or LF as a line terminator. Being text-based makes it easy to add optional parameters in a self-describing way, and SDP is used as the description language at the interface.
RTSP is an application-level protocol for controlling the delivery of real-time data. It provides an extensible framework that makes controlled, on-demand delivery of real-time data such as audio and video possible. Data sources include both live feeds and stored clips. The protocol is intended to control multiple data-delivery connections, to provide a means of choosing the delivery channel (such as UDP, multicast UDP, or TCP), and to provide a means of choosing delivery mechanisms based on RTP.
RTSP establishes and controls one or more time-synchronized streams of continuous media. Although interleaving the continuous media streams with the control stream is possible, RTSP itself typically does not deliver the streams; in other words, RTSP acts as a network remote control for the multimedia server. An RTSP connection is not bound to a transport-layer connection such as TCP: during an RTSP session, the client may open and close multiple reliable transport connections to the server to issue RTSP requests, and connectionless transport such as UDP can also be used. The streams controlled by RTSP may use RTP, but RTSP's operation does not depend on the transport mechanism used to carry the continuous media.
A quick sweep of the Baidu Baike entry can leave you in a fog. Setting the technical details aside, the big picture is simply this: play back the encoded audio/video in real time. Our sample already has a built-in function that saves the encoded audio/video to files, so we just call the RTSP library to complete the feature.
The porting process
The earlier venc sample analysis already covered SAMPLE_COMM_VENC_StartGetStream, the function used to save files. Unfortunately, that function lives in the common directory, and the Makefile pulls in every file under that directory at compile time. To avoid unnecessary trouble, do not modify the functions there beyond adding comments.
Also, out of personal habit, everything in the original sample that might need modification is rewritten in a separate copy. This deepens the impression and makes changes convenient, without any risk of affecting the sample files others use.
Save the encoded file
pthread_t gs_RtspVencPid;
static SAMPLE_VENC_GETSTREAM_PARA_S gs_stPara;
/*
 * Description : after linking the RTSP static library, used to play the
 *               encoded stream in real time
 * Parameters  : VeChn[] encoder channel numbers; s32Cnt channel count
 * Return      : creates thread PLATFORM_VENC_GetVencStreamRtsp, passing
 *               the structure gs_stPara
 * Note        : none
 */
HI_S32 PLATFORM_VENC_StartGetStreamRtsp(VENC_CHN VeChn[],HI_S32 s32Cnt)
{
HI_U32 i;
gs_stPara.bThreadStart = HI_TRUE;
gs_stPara.s32Cnt = s32Cnt;
for(i=0; i<s32Cnt; i++)
{
gs_stPara.VeChn[i] = VeChn[i];
}
return pthread_create(&gs_RtspVencPid, 0, PLATFORM_VENC_GetVencStreamRtsp, (HI_VOID*)&gs_stPara);
}
/******************************************************************************
 * Description : stop the RTSP real-time playback thread
 * Parameters  : none
 * Return      : 0 on success
 * Note        : none
 ******************************************************************************/
HI_S32 PLATFORM_VENC_StopGetStreamRtsp(void)
{
if (HI_TRUE == gs_stPara.bThreadStart)
{
gs_stPara.bThreadStart = HI_FALSE;
pthread_join(gs_RtspVencPid, 0);
}
return HI_SUCCESS;
}
/******************************************************************************
 * Description : save the stream to a file
 * Parameters  : pFd file pointer; pstStream frame stream structure
 * Return      : 0 on success
 * Note        : none
 ******************************************************************************/
HI_S32 PLATFORM_VENC_SaveStream(FILE* pFd, VENC_STREAM_S* pstStream)
{
HI_S32 i;
for (i = 0; i < pstStream->u32PackCount; i++)
{
fwrite(pstStream->pstPack[i].pu8Addr + pstStream->pstPack[i].u32Offset,
pstStream->pstPack[i].u32Len - pstStream->pstPack[i].u32Offset, 1, pFd);
fflush(pFd);
}
return HI_SUCCESS;
}
/******************************************************************************
 * Description : get the file suffix according to enPayload
 * Parameters  : enPayload audio/video payload type; szFilePostfix output
 *               file suffix
 * Return      : 0 on success, -1 on failure
 ******************************************************************************/
HI_S32 PLATFORM_VENC_GetFilePostfix(PAYLOAD_TYPE_E enPayload, char* szFilePostfix)
{
if (PT_H264 == enPayload)
{
strcpy(szFilePostfix, ".h264");
}
else if (PT_H265 == enPayload)
{
strcpy(szFilePostfix, ".h265");
}
else if (PT_JPEG == enPayload)
{
strcpy(szFilePostfix, ".jpg");
}
else if (PT_MJPEG == enPayload)
{
strcpy(szFilePostfix, ".mjp");
}
else if (PT_PRORES == enPayload)
{
strcpy(szFilePostfix, ".prores");
}
else
{
SAMPLE_PRT("payload type err!\n");
return HI_FAILURE;
}
return HI_SUCCESS;
}
/*
 * Description : after linking the RTSP static library, used to play the
 *               encoded stream in real time
 * Parameters  : p thread argument, actually a SAMPLE_VENC_GETSTREAM_PARA_S *
 * Return      : NULL
 * Note        : none
 */
HI_VOID* PLATFORM_VENC_GetVencStreamRtsp(HI_VOID* p)
{
HI_S32 i;
HI_S32 s32ChnTotal;
VENC_CHN_ATTR_S stVencChnAttr;
SAMPLE_VENC_GETSTREAM_PARA_S* pstPara;
HI_S32 maxfd = 0;
struct timeval TimeoutVal;
fd_set read_fds;
HI_U32 u32PictureCnt[VENC_MAX_CHN_NUM] = {0};
HI_S32 VencFd[VENC_MAX_CHN_NUM];
HI_CHAR aszFileName[VENC_MAX_CHN_NUM][64];
FILE* pFile[VENC_MAX_CHN_NUM];
char szFilePostfix[10];
VENC_CHN_STATUS_S stStat;
VENC_STREAM_S stStream;
HI_S32 s32Ret;
VENC_CHN VencChn;
PAYLOAD_TYPE_E enPayLoadType[VENC_MAX_CHN_NUM];
VENC_STREAM_BUF_INFO_S stStreamBufInfo[VENC_MAX_CHN_NUM];
prctl(PR_SET_NAME, "GetVencStream", 0,0,0);
pstPara = (SAMPLE_VENC_GETSTREAM_PARA_S*)p;
s32ChnTotal = pstPara->s32Cnt;
/******************************************************
 * step 1: check & prepare the save-file and venc fd
 ******************************************************/
if (s32ChnTotal >= VENC_MAX_CHN_NUM)
{
SAMPLE_PRT("input count invalid\n");
return NULL;
}
for (i = 0; i < s32ChnTotal; i++)
{
/* decide the stream file name, and open the file to save the stream */
VencChn = pstPara->VeChn[i];
s32Ret = HI_MPI_VENC_GetChnAttr(VencChn, &stVencChnAttr);
if (s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("HI_MPI_VENC_GetChnAttr chn[%d] failed with %#x!\n", \
VencChn, s32Ret);
return NULL;
}
enPayLoadType[i] = stVencChnAttr.stVencAttr.enType;
s32Ret = PLATFORM_VENC_GetFilePostfix(enPayLoadType[i], szFilePostfix);
if (s32Ret != HI_SUCCESS)
{
SAMPLE_PRT("PLATFORM_VENC_GetFilePostfix [%d] failed with %#x!\n", \
stVencChnAttr.stVencAttr.enType, s32Ret);
return NULL;
}
if(PT_JPEG != enPayLoadType[i])
{
snprintf(aszFileName[i],32, "./RTSP/RTSP_chn%d%s", i, szFilePostfix);
pFile[i] = fopen(aszFileName[i], "wb");
if (!pFile[i])
{
SAMPLE_PRT("open file[%s] failed!\n",
aszFileName[i]);
return NULL;
}
}
/* Set Venc Fd. */
VencFd[i] = HI_MPI_VENC_GetFd(i);
if (VencFd[i] < 0)
{
SAMPLE_PRT("HI_MPI_VENC_GetFd failed with %#x!\n",
VencFd[i]);
return NULL;
}
if (maxfd <= VencFd[i])
{
maxfd = VencFd[i];
}
s32Ret = HI_MPI_VENC_GetStreamBufInfo (i, &stStreamBufInfo[i]);
if (HI_SUCCESS != s32Ret)
{
SAMPLE_PRT("HI_MPI_VENC_GetStreamBufInfo failed with %#x!\n", s32Ret);
return (void *)HI_FAILURE;
}
}
/****************************************** step 2: Start to get streams of each channel. Start getting video streams for each channel ******************************************/
while (HI_TRUE == pstPara->bThreadStart)
{
FD_ZERO(&read_fds);
for (i = 0; i < s32ChnTotal; i++)
{
FD_SET(VencFd[i], &read_fds);
}
TimeoutVal.tv_sec = 2;
TimeoutVal.tv_usec = 0;
s32Ret = select(maxfd + 1, &read_fds, NULL, NULL, &TimeoutVal);
if (s32Ret < 0)
{
SAMPLE_PRT("select failed!\n");
break;
}
else if (s32Ret == 0)
{
SAMPLE_PRT("get venc stream time out, continue\n");
continue;
}
else
{
for (i = 0; i < s32ChnTotal; i++)
{
if (FD_ISSET(VencFd[i], &read_fds))
{
/*******************************************************
 * step 2.1: query how many packs are in one frame of
 * stream
 *******************************************************/
memset(&stStream, 0, sizeof(stStream));
s32Ret = HI_MPI_VENC_QueryStatus(i, &stStat);
if (HI_SUCCESS != s32Ret)
{
SAMPLE_PRT("HI_MPI_VENC_QueryStatus chn[%d] failed with %#x!\n", i, s32Ret);
break;
}
/*******************************************************
 * step 2.2: it is suggested to check u32CurPacks and
 * u32LeftStreamFrames at the same time, for example:
 *   if (0 == stStat.u32CurPacks ||
 *       0 == stStat.u32LeftStreamFrames) {
 *       SAMPLE_PRT("NOTE: Current frame is NULL!\n");
 *       continue;
 *   }
 *******************************************************/
if(0 == stStat.u32CurPacks)
{
SAMPLE_PRT("NOTE: Current frame is NULL!\n");
continue;
}
/*******************************************************
 * step 2.3: malloc the corresponding number of pack
 * nodes
 *******************************************************/
stStream.pstPack = (VENC_PACK_S*)malloc(sizeof(VENC_PACK_S) * stStat.u32CurPacks);
if (NULL == stStream.pstPack)
{
SAMPLE_PRT("malloc stream pack failed!\n");
break;
}
/*******************************************************
 * step 2.4: call the MPI to get one frame of stream
 *******************************************************/
stStream.u32PackCount = stStat.u32CurPacks;
s32Ret = HI_MPI_VENC_GetStream(i, &stStream, HI_TRUE);
if (HI_SUCCESS != s32Ret)
{
free(stStream.pstPack);
stStream.pstPack = NULL;
SAMPLE_PRT("HI_MPI_VENC_GetStream failed with %#x!\n", \
s32Ret);
break;
}
/*******************************************************
 * step 2.5: save the frame to a file
 *******************************************************/
if(PT_JPEG == enPayLoadType[i])
{
snprintf(aszFileName[i],32, "stream_chn%d_%d%s", i, u32PictureCnt[i],szFilePostfix);
pFile[i] = fopen(aszFileName[i], "wb");
if (!pFile[i])
{
SAMPLE_PRT("open file err!\n");
return NULL;
}
}
#ifndef __HuaweiLite__
s32Ret = PLATFORM_VENC_SaveStream(pFile[i], &stStream);
#else
s32Ret = SAMPLE_COMM_VENC_SaveStream_PhyAddr(pFile[i], &stStreamBufInfo[i], &stStream);
#endif
if (HI_SUCCESS != s32Ret)
{
free(stStream.pstPack);
stStream.pstPack = NULL;
SAMPLE_PRT("save stream failed!\n");
break;
}
/*******************************************************
 * step 2.6: release the stream
 *******************************************************/
s32Ret = HI_MPI_VENC_ReleaseStream(i, &stStream);
if (HI_SUCCESS != s32Ret)
{
SAMPLE_PRT("HI_MPI_VENC_ReleaseStream failed!\n");
free(stStream.pstPack);
stStream.pstPack = NULL;
break;
}
/*******************************************************
 * step 2.7: free the pack nodes
 *******************************************************/
free(stStream.pstPack);
stStream.pstPack = NULL;
u32PictureCnt[i]++;
if(PT_JPEG == enPayLoadType[i])
{
fclose(pFile[i]);
}
}
}
}
}
/*******************************************************
 * step 3: close the save-files
 *******************************************************/
for (i = 0; i < s32ChnTotal; i++)
{
if(PT_JPEG != enPayLoadType[i])
{
fclose(pFile[i]);
}
}
return NULL;
}
Makefile modifications
For the rtsp library, switch it to the cross-compilation environment (modify the rtsp library's Makefile) and recompile to regenerate the static library, then add the following to the HiSilicon Makefile:
RTSP_DIR ?= $(PWD)/../../rtsp_lib
INC_FLAGS += -I$(RTSP_DIR)
SENSOR_LIBS += $(REL_LIB)/librtsp.a
The paths depend entirely on where you put things; for now the static library stays in its default location and can be moved later.
RTSP porting
Referring to the demo, define the structures for loading the config file; this will be upgraded later!
#define MAX_SESSION_NUM 8   // maximum number of RTSP sessions
#define DEMO_CFG_FILE "platform.ini"
/* configuration file structure */
typedef struct demo_cfg_para
{
int session_count;
struct {
char path[64];
char video_file[64];
char audio_file[64];
} session_cfg[MAX_SESSION_NUM];
}demo_cfg;
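For reference, a platform.ini in the format load_cfg below expects might look like this (the mount points and file names are illustrative):

```ini
# one session per line: path is the RTSP mount point,
# video/audio name the recorded stream files
path=/live/chn0 video=RTSP_chn0.h264 audio=RTSP_chn0.alaw
path=/live/chn1 video=RTSP_chn1.h264
```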
static int flag_run = 1;
static void sig_proc(int signo)
{
flag_run = 0;
}
static int get_next_video_frame (FILE *fp, uint8_t **buff, int *size)
{
uint8_t szbuf[1024];
int szlen = 0;
int ret;
if (!(*buff)) {
*buff = (uint8_t*)malloc(2*1024*1024);
if (!(*buff))
return -1;
}
*size = 0;
while ((ret = fread(szbuf + szlen, 1, sizeof(szbuf) - szlen, fp)) > 0) {
int i = 3;
szlen += ret;
while (i < szlen - 3 && !(szbuf[i] == 0 && szbuf[i+1] == 0 && (szbuf[i+2] == 1 || (szbuf[i+2] == 0 && szbuf[i+3] == 1)))) i++;
memcpy(*buff + *size, szbuf, i);
*size += i;
memmove(szbuf, szbuf + i, szlen - i);
szlen -= i;
if (szlen > 3) {
//printf("szlen %d\n", szlen);
fseek(fp, -szlen, SEEK_CUR);
break;
}
}
if (ret > 0)
return *size;
return 0;
}
static int get_next_audio_frame (FILE *fp, uint8_t **buff, int *size)
{
int ret;
#define AUDIO_FRAME_SIZE 320
if (!(*buff)) {
*buff = (uint8_t*)malloc(AUDIO_FRAME_SIZE);
if (!(*buff))
return -1;
}
ret = fread(*buff, 1, AUDIO_FRAME_SIZE, fp);
if (ret > 0) {
*size = ret;
return ret;
}
return 0;
}
int load_cfg(demo_cfg *cfg, const char *cfg_file)
{
//cfgline: path=%s video=%s audio=%s
FILE *fp = fopen(cfg_file, "r");
char line[256];
int count = 0;
if (!fp) {
fprintf(stderr, "open %s failed\n", cfg_file);
return -1;
}
memset(cfg, 0, sizeof(*cfg));
while (fgets(line, sizeof(line) - 1, fp)) {
const char *p;
memset(&cfg->session_cfg[count], 0, sizeof(cfg->session_cfg[count]));
if (line[0] == '#')
continue;
p = strstr(line, "path=");
if (!p)
continue;
if (sscanf(p, "path=%s", cfg->session_cfg[count].path) != 1)
continue;
if ((p = strstr(line, "video="))) {
if (sscanf(p, "video=%s", cfg->session_cfg[count].video_file) != 1) {
fprintf(stderr, "parse video file failed %s\n", p);
}
}
if ((p = strstr(line, "audio="))) {
if (sscanf(p, "audio=%s", cfg->session_cfg[count].audio_file) != 1) {
fprintf(stderr, "parse audio file failed %s\n", p);
}
}
if (strlen(cfg->session_cfg[count].video_file) || strlen(cfg->session_cfg[count].audio_file)) {
count ++;
} else {
fprintf(stderr, "parse line %s failed\n", line);
}
}
cfg->session_count = count;
/*
 * Example cfg lines:
 *   path=/live/chn0 video=BarbieGirl.h264 audio=BarbieGirl.alaw
 *   path=/live/chn1 video=BarbieGirl.h264
 *   path=/live/chn2 audio=BarbieGirl.alaw
 */
printf("cfg->session_count:%d\n", cfg->session_count); // 3 with the example cfg above
fclose(fp);
return count;
}
/*
 * Description : thread used for RTSP real-time playback
 * Parameters  : NULL
 * Return      : none
 * Note        : loads the config file (DEMO_CFG_FILE), e.g.
 *               path=/mnt/sample/venc/RTSP video=RTSP_chn1.h264
 */
void *video_play_rtsp_task(void*arg)
{
const char *cfg_file = DEMO_CFG_FILE;
demo_cfg cfg;
FILE *fp[MAX_SESSION_NUM][2] = { { NULL } };
rtsp_demo_handle demo;
rtsp_session_handle session[MAX_SESSION_NUM] = { NULL };
int session_count = 0;
uint8_t *vbuf = NULL;
uint8_t *abuf = NULL;
uint64_t ts = 0;
int vsize = 0, asize = 0;
int ret, ch;
ret = load_cfg(&cfg, cfg_file);
demo = rtsp_new_demo(8554); // RTSP server socket
if (NULL == demo) {
SAMPLE_PRT("rtsp new demo failed!\n");
return 0;
}
session_count = 1;
for (ch = 0; ch < session_count; ch++)
{
if (strlen(cfg.session_cfg[ch].video_file)) {
fp[ch][0] = fopen(cfg.session_cfg[ch].video_file, "rb"); // open the video file
if (!fp[ch][0]) {
fprintf(stderr, "open %s failed\n", cfg.session_cfg[ch].video_file);
}
}
// fp[ch][1]: handle to the audio file
// if (strlen(cfg.session_cfg[ch].audio_file)) {
// fp[ch][1] = fopen(cfg.session_cfg[ch].audio_file, "rb");
// if (!fp[ch][1]) {
// fprintf(stderr, "open %s failed\n", cfg.session_cfg[ch].audio_file);
// }
// }
if (fp[ch][0] == NULL && fp[ch][1] == NULL)
continue;
session[ch] = rtsp_new_session(demo, cfg.session_cfg[ch].path); // the corresponding RTSP session
if (NULL == session[ch]) {
printf("rtsp_new_session failed\n");
continue;
}
if (fp[ch][0]) {
// bind the video source to the current request path
rtsp_set_video(session[ch], RTSP_CODEC_ID_VIDEO_H264, NULL, 0);
rtsp_sync_video_ts(session[ch], rtsp_get_reltime(), rtsp_get_ntptime());
}
printf("==========> rtsp://192.168.119.200:8554%s for %s <===========\n", cfg.session_cfg[ch].path,
fp[ch][0] ? cfg.session_cfg[ch].video_file : "");
}
ts = rtsp_get_reltime();
signal(SIGINT, sig_proc);
while (flag_run) {
uint8_t type = 0;
for (ch = 0; ch < session_count; ch++) {
// the source for this channel
if (fp[ch][0]) {
read_video_again:
ret = get_next_video_frame(fp[ch][0], &vbuf, &vsize);
if (ret < 0) {
fprintf(stderr, "get_next_video_frame failed\n");
flag_run = 0;
break;
}
if (ret == 0) {
fseek(fp[ch][0], 0, SEEK_SET);
if (fp[ch][1])
fseek(fp[ch][1], 0, SEEK_SET);
goto read_video_again;
}
if (session[ch]) // the frame is handed to this channel's session
    rtsp_tx_video(session[ch], vbuf, vsize, ts); // delivered once a client connects
type = 0;
if (vbuf[0] == 0 && vbuf[1] == 0 && vbuf[2] == 1) {
type = vbuf[3] & 0x1f;
}
if (vbuf[0] == 0 && vbuf[1] == 0 && vbuf[2] == 0 && vbuf[3] == 1) {
type = vbuf[4] & 0x1f;
}
if (type != 5 && type != 1)
goto read_video_again;
}
if (fp[ch][1]) {
ret = get_next_audio_frame(fp[ch][1], &abuf, &asize);
if (ret < 0) {
fprintf(stderr, "get_next_audio_frame failed\n");
break;
}
if (ret == 0) {
fseek(fp[ch][1], 0, SEEK_SET);
if (fp[ch][0])
fseek(fp[ch][0], 0, SEEK_SET);
continue;
}
if (session[ch])
rtsp_tx_audio(session[ch], abuf, asize, ts);
}
}
do {
ret = rtsp_do_event(demo);
if (ret > 0)
continue;
if (ret < 0)
break;
usleep(20000);
} while (rtsp_get_reltime() - ts < 1000000 / 25);
if (ret < 0)
break;
ts += 1000000 / 25;
printf("."); fflush(stdout); // flush so the progress dot appears immediately
}
free(vbuf);
free(abuf);
for (ch = 0; ch < session_count; ch++) {
if (fp[ch][0])
fclose(fp[ch][0]);
if (fp[ch][1])
fclose(fp[ch][1]);
if (session[ch])
rtsp_del_session(session[ch]);
}
rtsp_del_demo(demo);
printf("Exit.\n");
getchar();
return 0;
}