Using easyexcel to import tables to realize batch insertion of xlsx files ----- MySQL of Linux
2022-07-26 05:39:00 【kjshuan】
Introduction
The best-known Java frameworks for parsing and generating Excel are Apache POI and JXL, but both are very memory-hungry. POI does offer a SAX-mode API that mitigates out-of-memory problems to some extent, yet it still has flaws: for the 07 (xlsx) format, decompression and the storage of the decompressed data both happen in memory, so consumption stays high. EasyExcel rewrites POI's parsing of the 07 format: a 3 MB xlsx that still needs around 100 MB of memory with POI's SAX parser can be read with just a few MB using EasyExcel, and even larger files will not overflow memory. For the 03 (xls) format it relies on POI's SAX mode, adding a model-conversion layer on top so the framework is simpler and more convenient to use.
Prerequisite: a Nacos service running on the Linux host.
Step 1: import the pom dependency
<properties>
    <easyexcel.version>3.0.5</easyexcel.version>
</properties>
<dependency>
    <groupId>com.alibaba</groupId>
    <artifactId>easyexcel</artifactId>
    <version>${easyexcel.version}</version>
</dependency>

Other dependencies the project may need:
<properties>
    <mysql.version>5.1.38</mysql.version>
    <mybatis-plus.version>3.4.2</mybatis-plus.version>
    <druid.version>1.1.9</druid.version>
    <easyexcel.version>3.0.5</easyexcel.version>
    <fastdfs.version>1.26.4</fastdfs.version>
    <fastjson.version>1.2.83</fastjson.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>com.alibaba.cloud</groupId>
        <artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
    </dependency>
    <dependency>
        <groupId>com.baomidou</groupId>
        <artifactId>mybatis-plus-boot-starter</artifactId>
        <version>${mybatis-plus.version}</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>druid-spring-boot-starter</artifactId>
        <version>${druid.version}</version>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.alibaba/easyexcel -->
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>easyexcel</artifactId>
        <version>${easyexcel.version}</version>
    </dependency>
    <dependency>
        <groupId>net.oschina.zcx7878</groupId>
        <artifactId>fastdfs-client-java</artifactId>
        <version>1.27.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.github.tobato</groupId>
        <artifactId>fastdfs-client</artifactId>
        <version>${fastdfs.version}</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.13.1</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>${fastjson.version}</version>
    </dependency>
</dependencies>

Step 2: configure application.yml and connect to Nacos
server:
  port: 9090
spring:
  application:
    name: tc_manager
  cloud:
    nacos:
      discovery:
        server-addr: 192.168.64.118:8848
        username: nacos
        password: nacos
        namespace: public
  datasource:
    druid:
      url: jdbc:mysql://192.168.64.118:3306/tc_towercrane?useSSL=false&serverTimezone=Asia/Shanghai&characterEncoding=utf-8&autoReconnect=true
      username: root
      password: 3090_Cmok
      # Number of physical connections established on initialization. Initialization happens
      # on an explicit call to init() or on the first getConnection().
      initial-size: 3
      max-active: 30
      min-idle: 3
      #    datasource:
      #      # druid connection pool
      #      type: com.alibaba.druid.pool.DruidDataSource
      #      # database driver
      #      driver: com.mysql.jdbc.Driver
      #      # maximum number of pooled connections
      #      max-active: 20
      #      # number of physical connections established on initialization
      #      initial-size: 10
      # Maximum wait time (ms) when acquiring a connection. Once maxWait is configured, a fair
      # lock is enabled by default, which lowers concurrency; if needed, set the
      # useUnfairLock property to true to use an unfair lock instead.
      max-wait: 60000
      # This interval (ms) has two meanings:
      # 1. how often the destroy thread checks connections
      # 2. the basis for testWhileIdle (see the testWhileIdle property)
      time-between-eviction-runs-millis: 60000
      # Minimum time (ms) a connection must stay in the pool before it can be evicted
      min-evictable-idle-time-millis: 180000
      # SQL used to check that a connection is valid; must be a query. If validationQuery is
      # null, then testOnBorrow, testOnReturn and testWhileIdle have no effect.
      validation-query: select 'x'
      # Timeout for the validity check, in seconds
      validation-query-timeout: 1
      # Run validationQuery on every borrow; this degrades performance
      test-on-borrow: false
      # When borrowing a connection, validate it only if it has been idle longer than
      # minEvictableIdleTimeMillis; otherwise skip the check
      test-while-idle: true
      # Run validationQuery on every return; this degrades performance
      test-on-return: false
      # Cache prepared statements (PSCache). A big performance win for databases with
      # server-side cursors such as Oracle; usually recommended off for MySQL.
      pool-prepared-statements: true
      # Must be > 0 to enable PSCache; a value > 0 also forces poolPreparedStatements to true.
      # Druid does not have Oracle's problem of PSCache using too much memory, so a larger
      # value (e.g. 100) is acceptable.
      max-open-prepared-statements: 20
      # Close abandoned connections after this many seconds (1800 s = 30 minutes)
      remove-abandoned-timeout: 1800
      # Force-close connections that have not been used for a long time
      remove-abandoned: true
      # When enabled, each timeBetweenEvictionRunsMillis cycle also validates the minIdle
      # idle connections. See https://github.com/alibaba/druid/wiki/KeepAlive_cn
      keep-alive: true
      # connectProperties enables the mergeSql feature and slow-SQL logging
      connect-properties: druid.stat.mergeSql=true;druid.stat.slowSqlMillis=5000
      # Whether to stop retrying after an acquire failure; default false. If true, the pool
      # cannot reconnect even after the database recovers.
      break-after-acquire-failure: false
      # Number of automatic retries when acquiring a connection fails
      connection-error-retry-attempts: 1
      # Whether to return an error immediately when acquiring a connection fails
      fail-fast: true
      # Filters configured by alias. Common ones: stat for monitoring statistics,
      # log4j for logging, wall for SQL-injection defense.
      filters: stat,wall
mybatis-plus:
  mapper-locations: classpath:mapper/*.xml
fdfs:
  so-timeout: 3000
  connect-timeout: 1000
  thumb-image:
    width: 200
    height: 200
  tracker-list:
    - 192.168.64.118:22122

Write the entity class Tcinfos:
@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
public class Tcinfos {
    @TableId(type = IdType.AUTO)
    @ExcelIgnore
    private Integer tcid;
    @ExcelProperty("Tower type")
    private String type;
    @JsonFormat(pattern = "yyyy-MM-dd")      // pattern used when serializing to JSON
    @DateTimeFormat(pattern = "yyyy-MM-dd")  // pattern used when binding the date on input
    @ExcelProperty("Date of construction")
    private Date makedate;
    @ExcelProperty("Maximum height")
    private Integer maxhigh;
    @ExcelProperty("Maximum amplitude")
    private Integer maxrange;
    @ExcelProperty("Maximum load")
    private Integer maxweight;
    @ExcelProperty("Security employee number")
    private Integer secman;
    @ExcelProperty("Responsible employee number")
    private Integer resman;
}
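The "yyyy-MM-dd" pattern declared on makedate can be exercised with plain JDK classes. A minimal sketch (the sample date is made up) of the parsing and formatting that @DateTimeFormat and @JsonFormat respectively perform:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DatePatternDemo {
    public static void main(String[] args) throws ParseException {
        // Same "yyyy-MM-dd" pattern as on the entity's annotations
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        // Parsing, as Spring does when binding the incoming value
        Date makedate = fmt.parse("2022-07-01");
        // Formatting, as Jackson does when serializing to JSON
        System.out.println(fmt.format(makedate));
    }
}
```

If the cells in the xlsx hold real Excel dates rather than text, EasyExcel converts them to java.util.Date directly and the pattern only matters for the JSON round trip.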
Step 3: the controller layer
@RestController
@RequestMapping("/tcinfo")
public class TcinfosCtrl {
    @Resource
    private TcinfoStoreService tcinfoStoreService;

    /**
     * Batch insert
     * @param file the uploaded xlsx file
     * @return a JSON status string
     */
    @PostMapping(value = "/batchTowerCrane")
    public String batchTc(MultipartFile file) {
        tcinfoStoreService.batch_save(file);
        return BackinfoConf.BATCHADDTC_SUCCESS;
    }
}
The BackinfoConf class encapsulates the returned status messages:
public class BackinfoConf {
    public static final String FILEUPLOAD_SUCCESS = "{\"upload_status\":\"success\"}";
    public static final String FILEUPLOAD_FAIL = "{\"upload_status\":\"fail\"}";
    public static final String ADDTCCHECK_SUCCESS = "{\"addcheck_status\":\"success\"}";
    public static final String ADDTCCHECK_FAIL = "{\"addcheck_status\":\"fail\"}";
    public static final String MANUALADDTC_SUCCESS = "{\"manualaddtc_status\":\"success\"}";
    public static final String MANUALADDTC_FAIL = "{\"manualaddtc_status\":\"fail\"}";
    public static final String MODTC_SUCCESS = "{\"modtc_status\":\"success\"}";
    public static final String MODTC_FAIL = "{\"modtc_status\":\"fail\"}";
    public static final String BATCHADDTC_SUCCESS = "{\"batchaddtc_status\":\"success\"}";
    public static final String BATCHADDTC_FAIL = "{\"batchaddtc_status\":\"fail\"}";
}
Step 4: the service layer interface and implementation
public interface TcinfoStoreService {
    /**
     * Read the contents of an uploaded file
     * and save the data in batches.
     * @param file the uploaded xlsx file
     */
    void batch_save(MultipartFile file);
}
@Service
public class TcinfoStoreServiceImpl implements TcinfoStoreService {
    @Resource
    private TCinfosMapper tCinfosMapper;
    @Resource
    private FastFileStorageClient ffsc;
    @Resource
    private TcappendixMapper tcappendixMapper;

    @Transactional
    @Override
    public void batch_save(MultipartFile file) {
        try {
            // new ExcelReadListener(...) uses the parameterized constructor to inject the mapper.
            // sheet() with no arguments reads the first sheet; it can also take a sheet
            // name or index to select a specific one.
            EasyExcel.read(file.getInputStream(), Tcinfos.class, new ExcelReadListener(tCinfosMapper)).sheet().doRead();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

While reading the sheet, EasyExcel calls the listener once per row, encapsulating the row's data into an object of the specified type and passing it to us. In lower versions of EasyExcel a mapping problem can occur; when it does, it can be solved as follows: if the table data does not start right below the top row, specify the number of header rows with headRowNumber; if the columns are not in the expected positions, use @ExcelProperty on the entity attributes to map each attribute to its column name.

Step 5: the mapper layer interface and its XML mapping
@Mapper
// CRUD is handled by mybatis-plus, but BaseMapper provides no batch insert,
// so we declare a batch-insert method ourselves.
public interface TCinfosMapper extends BaseMapper<Tcinfos> {
    void batchSaveTcInfo(List<Tcinfos> tc);
}
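The mapper's XML uses a MyBatis <foreach> element to expand the list into one multi-row VALUES clause. To see roughly what gets generated, here is a small plain-Java sketch (not MyBatis itself) that assembles the same shape of statement, with ? standing in for the #{...} placeholders:

```java
import java.util.ArrayList;
import java.util.List;

public class ForeachDemo {
    public static void main(String[] args) {
        // One "(?,...)" row group per list element, joined by the separator ","
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            rows.add("(?,?,?,?,?,?,?)");
        }
        // Mirrors the static part of the <insert id="batchSaveTcInfo"> statement
        String sql = "insert into tcinfos "
                + "(type,makedate,maxhigh,maxrange,maxweight,secman,resman) values "
                + String.join(",", rows);
        System.out.println(sql);
    }
}
```

Because all rows travel in a single INSERT, a batch of 500 needs one database round trip instead of 500, which is where the speedup over per-row BaseMapper.insert calls comes from.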
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE mapper PUBLIC
"-//mybatis.org//DTD Mapper 3.0//EN"
"http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<!-- Batch-insert the list using MyBatis dynamic SQL -->
<mapper namespace="com.kgc.towercrane.tcmanager.mapper.TCinfosMapper">
    <insert id="batchSaveTcInfo">
        insert into tcinfos
        (type,makedate,maxhigh,maxrange,maxweight,secman,resman) values
        <foreach collection="list" item="tc" separator=",">
            (#{tc.type},#{tc.makedate},#{tc.maxhigh},#{tc.maxrange},#{tc.maxweight},#{tc.secman},#{tc.resman})
        </foreach>
    </insert>
</mapper>

Step 6: the ExcelReadListener class
package com.kgc.towercrane.tcmanager.common;

import com.alibaba.excel.context.AnalysisContext;
import com.alibaba.excel.event.AnalysisEventListener;
import com.kgc.towercrane.tcmanager.domian.module.Tcinfos;
import com.kgc.towercrane.tcmanager.mapper.TCinfosMapper;
import java.util.ArrayList;
import java.util.List;

public class ExcelReadListener extends AnalysisEventListener<Tcinfos> {
    // Batch size: insert every 500 rows read; values up to around 3000 also work well
    private static final int BATCH_SIZE = 500;
    // Buffer that accumulates each row as it is read
    private List<Tcinfos> tcs = new ArrayList<>();
    // The mapper used to persist each batch
    private TCinfosMapper tCinfosMapperss;

    // Parameterized constructor so the mapper can be injected from the service layer
    public ExcelReadListener(TCinfosMapper tcinfos) {
        this.tCinfosMapperss = tcinfos;
    }

    /**
     * Called automatically for each row that is read.
     * @param tcinfos the row, mapped to the entity type
     * @param analysisContext the read context
     */
    @Override
    public void invoke(Tcinfos tcinfos, AnalysisContext analysisContext) {
        System.out.println(tcinfos);
        // Buffer the row
        tcs.add(tcinfos);
        // Once the buffer reaches BATCH_SIZE rows...
        if (tcs.size() >= BATCH_SIZE) {
            // ...insert them into MySQL in one statement...
            tCinfosMapperss.batchSaveTcInfo(tcs);
            // ...and empty the buffer
            tcs.clear();
        }
    }

    /**
     * Called once after the whole file has been read (doAfterAllAnalysed).
     * If the file has, say, 3490 rows, invoke() inserts a batch every 500 rows,
     * leaving the last 490 uninserted; we save them manually here.
     * @param analysisContext the read context
     */
    @Override
    public void doAfterAllAnalysed(AnalysisContext analysisContext) {
        System.out.println("Finished reading the file!");
        // Guard against an empty final batch (e.g. a row count that is an exact
        // multiple of BATCH_SIZE), which would otherwise generate invalid SQL
        if (!tcs.isEmpty()) {
            tCinfosMapperss.batchSaveTcInfo(tcs);
            tcs.clear();
        }
    }
}

View the source: AnalysisEventListener is an abstract class.
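The per-500 flush arithmetic described above can be checked with a self-contained sketch (no EasyExcel involved): for 3490 rows we expect six full batches from invoke() plus a final batch of 490 from doAfterAllAnalysed, seven inserts in total:

```java
import java.util.ArrayList;
import java.util.List;

public class BatchFlushDemo {
    static final int BATCH_SIZE = 500;
    static int flushes = 0;
    static int lastFlushSize = 0;

    // Mirrors the listener: buffer rows, flush every BATCH_SIZE, flush the remainder at the end
    static void run(int totalRows) {
        List<Integer> buffer = new ArrayList<>();
        for (int row = 0; row < totalRows; row++) {   // invoke() per row
            buffer.add(row);
            if (buffer.size() >= BATCH_SIZE) {
                flushes++;
                lastFlushSize = buffer.size();
                buffer.clear();
            }
        }
        // doAfterAllAnalysed(): flush whatever is left
        if (!buffer.isEmpty()) {
            flushes++;
            lastFlushSize = buffer.size();
            buffer.clear();
        }
    }

    public static void main(String[] args) {
        run(3490);
        System.out.println(flushes + " flushes, last batch " + lastFlushSize + " rows");
    }
}
```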

Step 7: test the interface with Apipost or Postman (Apipost is used here)

Step 8: view the data with DataGrip

The batch insert succeeded!