
Curl command

2022-07-07 01:42:00 Full stack programmer webmaster

Hello everyone, nice to meet you again. I'm the Full Stack King.

curl can be thought of as a command-line browser.

1. Ask the server for gzip compression:

curl -I http://www.sina.com.cn/ -H "Accept-Encoding: gzip,deflate"

2. Measure a web page's response times:

curl -o /dev/null -s -w "time_connect: %{time_connect}\ntime_starttransfer: %{time_starttransfer}\ntime_total: %{time_total}\n" "http://www.kklinux.com"
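The three numbers that this -w format prints can be post-processed in the shell; for instance, time_starttransfer minus time_connect roughly measures the time the server spent producing the response. A minimal sketch (the sample values below are made up, standing in for real curl output):

```shell
#!/bin/sh
# Sample output of the -w format above, hard-coded instead of hitting the
# network; in real use these lines would come from the curl run itself.
sample='time_connect: 0.012
time_starttransfer: 0.250
time_total: 0.410'

# starttransfer - connect ~= server processing time
result=$(echo "$sample" | awk -F': ' '
  /^time_connect/       { c = $2 }
  /^time_starttransfer/ { s = $2 }
  END { printf "server_time: %.3f", s - c }')

echo "$result"   # -> server_time: 0.238
```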

3. Check a website's availability by printing only the HTTP status code:

curl -o /dev/null -s -w "%{http_code}" "http://www.kklinux.com"
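The %{http_code} value can then drive a simple up/down decision in a script. A minimal sketch, assuming 2xx/3xx means "available" (the check_code helper is my own, not part of curl):

```shell
#!/bin/sh
# check_code: classify an HTTP status code string as "up" or "down".
# 2xx and 3xx count as available; everything else (000 = no answer) as down.
check_code() {
  case "$1" in
    2[0-9][0-9]|3[0-9][0-9]) echo "up" ;;
    *)                       echo "down" ;;
  esac
}

# In real use you would feed it the result of the command above:
#   check_code "$(curl -o /dev/null -s -w '%{http_code}' http://www.kklinux.com)"
echo "$(check_code 200) $(check_code 404)"   # -> up down
```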

4. Request with the HTTP/1.0 protocol (the default is HTTP/1.1): curl -0 …

1) Read a web page:
   curl http://www.linuxidc.com
2) Save the web page:
   curl -o page.html http://www.linuxidc.com
3) Use a proxy server and its port with -x:
   curl -x 123.45.67.89:1080 -o page.html http://www.linuxidc.com
4) Use cookies to record session information. The -D option stores the cookies from the HTTP response in a dedicated file, so while the page is saved to page.html, the cookie information is also stored in cookie0001.txt:
   curl -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.linuxidc.com
5) So how do you keep using the cookies left over from last time on the next visit? The -b option appends the saved cookies to the HTTP request:
   curl -x 123.45.67.89:1080 -o page1.html -D cookie0002.txt -b cookie0001.txt http://www.linuxidc.com

6) Browser information. The -A option lets you claim any browser identity you like for the visit:

curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -o page.html -D cookie0001.txt http://www.yahoo.com

The server receiving this request will believe you are running IE 6.0 on Windows 2000, when in fact you might be on a Mac! Likewise, "Mozilla/4.73 [en] (X11; U; Linux 2.2.15 i686)" tells the other side you are running Netscape 4.73 on Linux on a PC. Ha ha ha.

7) Another restriction servers often apply is checking the HTTP Referer. For example, you visit the home page first and then a download page it links to; on that second request, the Referer is the address of the page you successfully visited first. If the server sees a request for the download page whose Referer is not the home page's address, it concludes the request is hotlinking.

Hateful! And I just wanted to hotlink!

Fortunately, curl gives us an option to set the Referer: -e

curl -A "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)" -x 123.45.67.89:1080 -e "mail.yahoo.com" -o page.html -D cookie0001.txt http://www.yahoo.com

This fools the server into believing you clicked a link on mail.yahoo.com. Ha ha ha.

8) Downloading files with curl. As just mentioned, saving a page to a file uses -o, and downloading a file works the same way. For example:

curl -o 1.jpg http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG

Here is a new option: -O (uppercase O), used like this:

curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG

This keeps the server's file name for the local copy automatically.
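What -O does can be pictured in shell terms: it takes the last path component of the URL as the local file name. A tiny illustration (basename_of is my own helper for the sketch, not a curl feature):

```shell
#!/bin/sh
# basename_of: strip everything up to the last '/' — the name -O would keep.
basename_of() {
  printf '%s\n' "${1##*/}"
}

basename_of http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG   # -> screen1.JPG
```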

And it gets better. Suppose that besides screen1.JPG, you also need screen2.JPG, screen3.JPG, …, screen10.JPG. Do we have to write a script for that? No need! In curl you just write:

curl -O http://cgi2.tky.3web.ne.jp/~zzh/screen[1-10].JPG

Impressive, eh?

9) More on downloading:

curl -O http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG

The resulting downloads are ~zzh/001.JPG, ~zzh/002.JPG, …, ~zzh/201.JPG and ~nick/001.JPG, ~nick/002.JPG, …, ~nick/201.JPG. Convenient enough? But don't celebrate too soon: the files under zzh/ and nick/ share the names 001, 002, …, 201, so the later downloads overwrite the earlier ones with the same names. No problem, we have more tricks:

curl -o #2_#1.jpg http://cgi2.tky.3web.ne.jp/~{zzh,nick}/[001-201].JPG

Self-defined download file names, that is. #1 is a variable referring to the {zzh,nick} part, taking the value zzh the first time around and nick the second; #2 refers to the second variable, [001-201], which counts up from 001 to 201. With names defined this way, the results become:

original ~zzh/001.JPG  -> saved as 001_zzh.jpg
original ~nick/001.JPG -> saved as 001_nick.jpg

No more fear of name clashes.

9) Continuing with downloads. On Windows, tools like FlashGet give us parallel downloads and the ability to resume a broken transfer; curl is no slouch in either department. Say the download of screen1.JPG suddenly got cut off; we can resume it like this (note: resuming an interrupted transfer is -C -, not lowercase -c, which handles cookies):

curl -C - -O http://cgi2.tky.3web.ne.jp/~zzh/screen1.JPG

Of course, don't try to fool me with a file half-downloaded by FlashGet; a half-finished file from other download software may not be usable.

For downloading in blocks, use the -r option. An example: suppose we have http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 to download (Mr. Zhao's phone recital :D). We can issue:

curl -r 0-10240 -o "zhao.part1" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &
curl -r 10241-20480 -o "zhao.part2" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &
curl -r 20481-40960 -o "zhao.part3" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3 &
curl -r 40961- -o "zhao.part4" http://cgi2.tky.3web.ne.jp/~zzh/zhao1.mp3

That downloads in blocks; you just have to merge the pieces yourself afterwards. If you are on UNIX or a Mac, use cat zhao.part* > zhao.mp3; if you are on Windows, use copy /b to sort it out.

Everything above has been HTTP downloads; in fact, FTP works too.
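The merge step can be rehearsed without any network access. A self-contained sketch, with dummy files standing in for the four downloaded ranges:

```shell
#!/bin/sh
# Write four dummy parts in place of the real ranged downloads, then merge
# them exactly as described above. The glob sorts part1..part4 correctly
# because the numbers are single digits (part10 would sort before part2).
printf 'RANGE1-' > zhao.part1
printf 'RANGE2-' > zhao.part2
printf 'RANGE3-' > zhao.part3
printf 'RANGE4'  > zhao.part4

cat zhao.part* > zhao.mp3

cat zhao.mp3   # -> RANGE1-RANGE2-RANGE3-RANGE4
```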

Usage:

curl -u name:passwd ftp://ip:port/path/file

or the familiar

curl ftp://name:passwd@ip:port/path/file

10) The upload option is -T. For example, to send a file to an FTP server:

curl -T localfile -u name:passwd ftp://upload_site:port/path/

Of course, uploading a file to an HTTP server also works. For example:

curl -T localfile http://cgi2.tky.3web.ne.jp/~zzh/abc.cgi

Note that in this case the protocol used is HTTP's PUT method.

Speaking of PUT, that naturally brings up the other methods we haven't covered yet!

GET and POST must not be forgotten.

To submit a form over HTTP, the commonly used modes are POST and GET. GET mode needs no option at all; just write the variables into the URL. For example:

curl "http://www.yahoo.com/login.cgi?user=nickwolfe&password=12345"

POST mode's option is -d. For example:

curl -d "user=nickwolfe&password=12345" http://www.yahoo.com/login.cgi

which amounts to submitting a login request to this site. Whether to use GET mode or POST mode depends on how the program on the opposite server is set up.

One thing to note is file upload in POST mode. For example, given an HTML form like:

<form method="POST" enctype="multipart/form-data" action="http://cgi2.tky.3web.ne.jp/~zzh/up_file.cgi">
<input type=file name=upload>
<input type=submit name=nick value="go">
</form>

simulating it with curl takes this syntax (the field name upload comes from the form; localfile stands for the file you are sending):

curl -F upload=@localfile -F nick=go http://cgi2.tky.3web.ne.jp/~zzh/up_file.cgi

So much chatter, and in fact curl has many, many more tricks and uses. For example, using a local certificate with https:

curl -E localcert.pem https://remote_server

For another example, you can even use curl to look up a word in a dictionary through the dict protocol:

curl dict://dict.org/d:computer

Today I needed to check whether every domain name on the Hedgehog hosts had its ICP filing. wget felt awkward for the job, and then I discovered curl, this command-line tool. Its POST support turned out to be quite good; it is especially well suited to submitting information while varying the parameters. For me, needing to verify filing records for hundreds of thousands of domain names against miibeian.gov.cn, it is very practical, so I found this article worth reposting. My goal:

curl -d "cxfs=1&ym=xieyy.cn" http://www.miibeian.gov.cn/baxx_cx_servlet

Then filter the output, extract the filing number, set a flag, and store the domain name, filing number, and flag in the database.
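The batch run over many domains is then a small loop. A sketch under assumptions: the build_query helper and the domains.txt file name are mine, only the cxfs/ym parameter names come from the article, and the curl call is printed rather than executed so the sketch stays offline:

```shell
#!/bin/sh
# Build the POST body for one domain (parameter names from the article).
build_query() {
  printf 'cxfs=1&ym=%s' "$1"
}

# A stand-in domain list; in real use this would hold the hundreds of
# thousands of domains to verify.
printf 'xieyy.cn\nexample.cn\n' > domains.txt

while IFS= read -r domain; do
  query=$(build_query "$domain")
  # A real run would be:
  #   curl -d "$query" http://www.miibeian.gov.cn/baxx_cx_servlet
  echo "would POST: $query"
done < domains.txt
```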

Submitting POST data containing spaces with curl. I ran into a situation today: I wanted to use curl to log into a web page, and noticed by chance that the POST data contained a space.

For example, the username is "abcdef" and the password is "abc def", with a space in it. Submitting it the way I had been:

curl -D cookie -d "username=abcdef&password=abc def" http://login.xxx.com/

only produced a login-failure message.

So I checked the curl manual with man curl and found:

-d/--data (HTTP) Sends the specified data in a POST request to the HTTP server, in a way that can emulate as if a user has filled in a HTML form and pressed the submit button. Note that the data is sent exactly as specified with no extra processing (with all newlines cut off). The data is expected to be "url-encoded". This will cause curl to pass the data to the server using the content-type application/x-www-form-urlencoded. Compare to -F/--form. If this option is used more than once on the same command line, the data pieces specified will be merged together with a separating &-letter. Thus, using '-d name=daniel -d skill=lousy' would generate a post chunk that looks like 'name=daniel&skill=lousy'.

So instead:

curl -D cookie -d "username=abcdef" -d "password=abc def" http://login.xxx.com/

and this time the login succeeds. (Newer curl versions also provide --data-urlencode, which URL-encodes the value for you.)
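Since -d sends the data verbatim, another way out, besides splitting the fields, is to percent-encode the space yourself before handing the string to curl. A minimal sketch (the urlencode_spaces helper name is mine):

```shell
#!/bin/sh
# urlencode_spaces: replace every space with %20 so -d can carry it safely.
# (Only handles spaces; a full encoder would cover all reserved characters.
#  Modern curl can also do this for you with --data-urlencode.)
urlencode_spaces() {
  printf '%s' "$1" | sed 's/ /%20/g'
}

data="username=abcdef&password=$(urlencode_spaces 'abc def')"
echo "$data"   # -> username=abcdef&password=abc%20def
```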

(Editor: Flying Night)


Copyright notice: this is the blogger's original article; please do not reprint it without permission.

Publisher: Full Stack Programmer; when reprinting, please credit the source: https://javaforall.cn/116887.html Original link: https://javaforall.cn
