scihub-cn, a powerful paper-downloading tool: download papers with a single command
2022-06-11 12:24:00
Introduction
Project address: GitHub - Ckend/scihub-cn: a Sci-Hub paper downloader usable from within mainland China
What can this project do? Just look at the picture: you can download the papers you need with a single command.

1. Preparation
Before starting, make sure you have a working Python and pip environment. (If you install Anaconda3 directly, it is best to create a virtual environment.)
On Windows, open Cmd (Start - Run - CMD), or run the commands inside an Anaconda3 virtual environment.
On macOS, open Terminal (Command + Space, then type Terminal) and enter the command to install the dependency:
pip install scihub-cn
This will download a series of packages; when you see Successfully installed ..., scihub-cn has been installed successfully.
scihub-cn relies on the aiohttp module for concurrent downloads; the minimum supported Python version is 3.6.
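Since the article suggests creating a virtual environment when using Anaconda3, here is a minimal sketch of doing so before installing (the environment name scihub-env and Python 3.9 are arbitrary choices, not from the original article):
conda create -n scihub-env python=3.9
conda activate scihub-env
pip install scihub-cn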
Project source code: GitHub - Ckend/scihub-cn: a Sci-Hub paper downloader usable from within mainland China
2. How to use scihub-cn
2.1 Download a paper by DOI
First, let's try downloading a paper by its DOI:
scihub-cn -d 10.1038/s41524-017-0032-0
The downloaded paper is saved automatically in the current folder:

To download it to any directory of your choice, just add the -o parameter:
scihub-cn -d 10.1038/s41524-017-0032-0 -o D:\papers
This downloads the paper into the papers folder on drive D.
2.2 Download papers by keyword
Use the -w parameter to specify a keyword and download papers that match it:
scihub-cn -w reinforcement
The result looks like this:

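Note that, as the --help output later in this article shows, multiple keywords are joined with underscores, for example:
scihub-cn -w machine_learning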
As before, the -o parameter is also supported for specifying an output folder. The default search engine here is Baidu Scholar (baidu_xueshu); Google Scholar, publons, science_direct, and others can be used instead by specifying the -e parameter:
scihub-cn -w reinforcement -e google_scholar
In case Google Scholar cannot be reached, you can add a proxy with the -p parameter:
scihub-cn -w reinforcement -e google_scholar -p http://127.0.0.1:10808
When accessing external data sources, adding a proxy helps avoid problems such as Connection closed errors.
You can also limit the number of downloads; for example, to download 100 papers:
scihub-cn -w reinforcement -l 100
2.3 Download a paper by URL
Given any paper URL, scihub-cn will try to download that paper; use the -u parameter to specify the link:
scihub-cn -u https://ieeexplore.ieee.org/document/26502
3. Download papers in batches
Several batch download methods have been added recently:
1. Download papers from a txt file containing the titles of all papers.
2. Download papers from a txt file containing the URLs of all papers.
3. Download papers from a txt file containing the DOIs of all papers.
4. Download papers from a bibtex file.
For example, to download papers from a txt file of paper URLs:
scihub-cn -i urls.txt --url
The result looks like this:

As you can see, the file contains 4 paper links, and all 4 papers were downloaded successfully.
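A urls.txt file of this kind simply contains one paper URL per line; a hypothetical example built from the two IEEE links that appear in this article:
https://ieeexplore.ieee.org/document/26502
https://ieeexplore.ieee.org/document/9429985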
Now try batch downloading from a txt file of DOIs:
scihub-cn -i dois.txt --doi
The result looks like this:

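Likewise, dois.txt lists one DOI per line (the same format is shown again in the usage overview below), for example:
10.1038/s41524-017-0032-0
10.1063/1.3149495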
Parameter description
You can run scihub-cn --help to see more parameter descriptions:
$scihub-cn --help
... ...
optional arguments:
-h, --help show this help message and exit
-u URL input the download url
-d DOI input the download doi
--input INPUTFILE, -i INPUTFILE
input download file
-w WORDS, --words WORDS
download from some key words,keywords are linked by
_,like machine_learning.
--title download from paper titles file
-p PROXY, --proxy PROXY
use proxy to download papers
--output OUTPUT, -o OUTPUT
setting output path
--doi download paper from dois file
--bib download papers from bibtex file
--url download paper from url file
-e SEARCH_ENGINE, --engine SEARCH_ENGINE
set the search engine
-l LIMIT, --limit LIMIT
                        limit the number of search result
Overview of main usage
Download papers in batches using DOIs, paper titles, or bibtex files.
Supports Python 3.6 and above.
Installation:
pip install scihub-cn
Usage is as follows:
1. Given a bibtex file (a minimal sketch of such a file appears after this list):
$scihub-cn -i input.bib --bib
2. Given a paper's DOI:
$scihub-cn -d 10.1038/s41524-017-0032-0
3. Given a paper's URL:
$scihub-cn -u https://ieeexplore.ieee.org/document/9429985
4. Given paper keywords (keywords are joined with _, e.g. machine_learning):
$scihub-cn -w word1_words2_words3
5. Given a txt file of paper DOIs, for example:
10.1038/s41524-017-0032-0
10.1063/1.3149495
$scihub-cn -i dois.txt --doi
6. Given a txt file with the titles of all papers:
Some Title 1
Some Title 2
$scihub-cn -i titles.txt --title
7. Given a txt file with the URLs of all papers:
url 1
url 2
$scihub-cn -i urls.txt --url
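For item 1, here is a minimal sketch of what an input.bib file might contain; the entry key and title are placeholders invented for illustration, and only the DOI is taken from this article (whether scihub-cn resolves the entry by DOI or by title is not covered here):
@article{placeholder2017,
  title = {Placeholder Title},
  doi   = {10.1038/s41524-017-0032-0},
  year  = {2017}
}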
You can append -p (--proxy), -o (--output), -e (--engine), and -l (--limit) to any of these commands to specify a proxy, an output folder, a search engine, and a limit on the number of search results. The supported search engines are google_scholar, baidu_xueshu, publons, and science_direct.
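For instance, these options can be combined in a single command; an illustrative example (the output folder papers and the limit of 20 are arbitrary choices, not from the original article):
scihub-cn -w machine_learning -e google_scholar -p http://127.0.0.1:10808 -o papers -l 20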
For more detailed usage, see the source project: GitHub - Ckend/scihub-cn: a Sci-Hub paper downloader usable from within mainland China