
The powerful paper download tool scihub-cn: download papers with one line of command


Introduction

Project address: GitHub - Ckend/scihub-cn, a Sci-Hub paper downloader that works from within mainland China's network environment.

What can this project do? In short, it lets you download the papers you need with a single command.

1. Preparation

Before starting, make sure you have a working Python and pip environment. (Installing Anaconda3 directly also works; creating a dedicated virtual environment is recommended.)

On Windows, open Cmd (Start, Run, type CMD), or run the commands inside an Anaconda3 virtual environment.

On macOS, open Terminal (Command + Space, then type Terminal) and enter the following command to install the package:

pip install scihub-cn

This pulls in a series of packages; when you see Successfully installed ..., scihub-cn has been installed successfully.

scihub-cn relies on the aiohttp module for concurrent downloads; the minimum supported Python version is 3.6.
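If you prefer an isolated environment, a minimal sketch using Python's built-in venv module (conda works just as well; the environment name is arbitrary) looks like this:

python -m venv scihub-env        # create a virtual environment
source scihub-env/bin/activate   # on Windows: scihub-env\Scripts\activate
pip install scihub-cn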

Project source code: GitHub - Ckend/scihub-cn.

2. How to use scihub-cn

2.1 Download a paper by DOI

First, let's try downloading a paper by its DOI:

scihub-cn -d 10.1038/s41524-017-0032-0

The downloaded paper is saved automatically in the current working directory.

To download it to a directory of your choice, just add the -o parameter:

scihub-cn -d 10.1038/s41524-017-0032-0 -o D:\papers

This downloads the paper into the papers folder on the D drive.

2.2 Download papers by keyword

Use the -w parameter to specify a keyword and download papers matching it:

scihub-cn -w reinforcement


As before, the -o parameter can be used to specify the output folder. By default the search uses Baidu Scholar; you can also use Google Scholar, publons, science_direct, and so on, by specifying the -e parameter:

scihub-cn -w reinforcement -e google_scholar

If Google Scholar cannot be reached directly, you can add a proxy with the -p parameter:

scihub-cn -w reinforcement -e google_scholar -p http://127.0.0.1:10808

When accessing an external data source, adding a proxy helps avoid errors such as Connection closed.

You can also limit the number of downloads; for example, to download 100 papers:

scihub-cn -w reinforcement -l 100
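These flags can be combined freely. The following is an illustrative combination of the flags documented above (the output path is just a placeholder): it searches Baidu Scholar, limits the results to 20 papers, and saves them to D:\papers:

scihub-cn -w reinforcement -e baidu_xueshu -o D:\papers -l 20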

2.3 Download a paper by URL

Given a paper's address, scihub-cn will try to download it: use the -u parameter to specify the paper's link.

scihub-cn -u https://ieeexplore.ieee.org/document/26502

3. Download papers in batches

Several batch download methods are available:

1. From a txt file listing the titles of all the papers.

2. From a txt file listing the URLs of all the papers.

3. From a txt file listing the DOIs of all the papers.

4. From a bibtex file.

For example, to download papers from a txt file of URLs:

scihub-cn -i urls.txt --url
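Here urls.txt is simply a plain text file with one paper URL per line, for example (reusing the example URLs that appear elsewhere in this article):

https://ieeexplore.ieee.org/document/26502
https://ieeexplore.ieee.org/document/9429985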

In the example run, the file contained four paper links, and all four papers were downloaded successfully.

Now try batch downloading from a txt file of DOI numbers:

scihub-cn -i dois.txt --doi


Parameter description

You can run scihub-cn --help to see more parameter descriptions:

$scihub-cn --help
... ...
optional arguments:
  -h, --help show this help message and exit
  -u URL input the download url
  -d DOI input the download doi
  --input INPUTFILE, -i INPUTFILE
                        input download file
  -w WORDS, --words WORDS
                        download from some key words,keywords are linked by
                        _,like machine_learning.
  --title download from paper titles file
  -p PROXY, --proxy PROXY
                        use proxy to download papers
  --output OUTPUT, -o OUTPUT
                        setting output path
  --doi download paper from dois file
  --bib download papers from bibtex file
  --url download paper from url file
  -e SEARCH_ENGINE, --engine SEARCH_ENGINE
                        set the search engine
  -l LIMIT, --limit LIMIT
                        limit the number of search result

Overview of main usage

Batch-download papers using DOIs, paper titles, or a bibtex file.

Supports Python 3.6 and above.

Installation:

pip install scihub-cn

Usage is as follows:

1. Given a bibtex file:

$scihub-cn -i input.bib --bib

2. Given a paper's DOI:

$scihub-cn -d 10.1038/s41524-017-0032-0

3. Given a paper's URL:

$scihub-cn -u https://ieeexplore.ieee.org/document/9429985

4. Given keywords (keywords are joined with _, e.g. machine_learning):

$scihub-cn -w word1_words2_words3

5. Given a txt file of paper DOIs, for example:

10.1038/s41524-017-0032-0
10.1063/1.3149495
$scihub-cn -i dois.txt --doi

6. Given a txt file of paper titles, for example:

Some Title 1
Some Title 2
$scihub-cn -i titles.txt --title

7. Given a txt file of paper URLs, for example:

url 1
url 2
$scihub-cn -i urls.txt --url

You can append -p (--proxy), -o (--output), -e (--engine), and -l (--limit) to any of these commands to specify a proxy, an output folder, a search engine, and a limit on the number of search results. The supported search engines are google_scholar, baidu_xueshu, publons, and science_direct.
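For instance, an illustrative combination of these flags (input.bib, the ./papers output folder, and the proxy address are placeholders reused from the examples above) might look like:

scihub-cn -i input.bib --bib -o ./papers -p http://127.0.0.1:10808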

For full details, see the source project: GitHub - Ckend/scihub-cn.

