Handling Excel with POI: study notes
2022-07-03 06:14:00 · Muyu
POI Study
First, import the dependencies:
<dependencies>
    <!-- xls (Excel 2003) -->
    <dependency>
        <groupId>org.apache.poi</groupId>
        <artifactId>poi</artifactId>
        <version>3.9</version>
    </dependency>
    <!-- xlsx (Excel 2007) -->
    <dependency>
        <groupId>org.apache.poi</groupId>
        <artifactId>poi-ooxml</artifactId>
        <version>3.9</version>
    </dependency>
    <!-- date formatting -->
    <dependency>
        <groupId>joda-time</groupId>
        <artifactId>joda-time</artifactId>
        <version>2.10.1</version>
    </dependency>
    <!-- testing -->
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
</dependencies>
The Excel 2003 format (.xls) can hold at most 65,536 rows per sheet. The Excel 2007 format (.xlsx) has no such limit.
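As a quick illustration of that limit (the class name XlsRowLimitDemo is made up for this sketch), HSSF refuses to create a row past index 65535 and throws an IllegalArgumentException:

import org.apache.poi.hssf.usermodel.HSSFWorkbook;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;

public class XlsRowLimitDemo {
    public static void main(String[] args) {
        Workbook workbook = new HSSFWorkbook();
        Sheet sheet = workbook.createSheet("limit");
        sheet.createRow(65535);          // last valid row index in the .xls format
        try {
            sheet.createRow(65536);      // one past the limit
        } catch (IllegalArgumentException e) {
            // HSSF rejects row numbers outside the range 0..65535
            System.out.println("Row 65536 rejected: " + e.getMessage());
        }
    }
}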
The structure of an Excel file:
- Workbook
- Worksheet
- Row
- Column
- Cell
Writing big files with HSSF
- Shortcoming: it can handle at most 65,536 rows; going beyond that throws an exception.
- Advantage: data is written to an in-memory cache during the process, the disk is not touched until the final one-time write, so it is fast.
Writing big files with XSSF
- Shortcoming: writing data is very slow and very memory-intensive; an out-of-memory error can occur with very large data sets, e.g. 1,000,000 rows.
- Advantage: it can write a fairly large amount of data, e.g. 200,000 rows.
Writing big files with SXSSF
- Advantage: it can write a very large amount of data, e.g. 200,000 rows and more, quickly and with low memory usage.
- Note: SXSSF streams rows to temporary files on disk, so dispose() should be called afterwards to delete them (see the sketch below).
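A minimal sketch of the SXSSF approach, assuming an illustrative window size of 100 rows (the class name SxssfSketch and the output file name big.xlsx are made up for this example): only the most recent rows stay in memory, older rows are flushed to a temporary file, and dispose() removes the temporary files once the workbook has been written.

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;
import java.io.FileOutputStream;
import java.io.IOException;

public class SxssfSketch {
    public static void main(String[] args) throws IOException {
        // keep at most 100 rows in memory; older rows are flushed to a temp file
        SXSSFWorkbook workbook = new SXSSFWorkbook(100);
        Sheet sheet = workbook.createSheet("big");
        for (int i = 0; i < 200000; i++) {
            Row row = sheet.createRow(i);
            row.createCell(0).setCellValue(i);
        }
        FileOutputStream out = new FileOutputStream("big.xlsx");
        workbook.write(out);
        out.close();
        // delete the temporary files SXSSF created while streaming
        workbook.dispose();
    }
}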
Working with Excel
Creating an Excel 2007 (.xlsx) file
package com.muyu.main;

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;
import org.joda.time.DateTime;

import java.io.FileOutputStream;
import java.io.IOException;

public class ExcelWriteTest {

    private static String path = "E:\\code\\IDEA code\\ControlExcel\\Excel-poi";

    public static void main(String[] args) throws IOException {
        // 1. Create a workbook
        Workbook workbook = new SXSSFWorkbook();
        // 2. Create a worksheet
        Sheet sheet = workbook.createSheet("Fan Xu practice sheet");
        // 3. Create a row
        Row row1 = sheet.createRow(0);
        // 4. Create cells
        Cell cell11 = row1.createCell(0);
        cell11.setCellValue("New content learned today:");
        Cell cell12 = row1.createCell(1);
        cell12.setCellValue("poi");

        // Second row
        Row row2 = sheet.createRow(1);
        Cell cell21 = row2.createCell(0);
        Cell cell22 = row2.createCell(1);
        String time = new DateTime().toString("yyyy-MM-dd HH:mm:ss");
        cell21.setCellValue("Statistics time:");
        cell22.setCellValue(time);

        // Write the workbook to disk
        FileOutputStream fileOutputStream = new FileOutputStream(path + "\\FanXu.xlsx");
        workbook.write(fileOutputStream);
        fileOutputStream.close();
    }
}
Writing a large amount of data to .xls (HSSF)
@Test
public void xls65536Test() throws IOException {
    // HSSF: executes quickly (uses org.apache.poi.hssf.usermodel.HSSFWorkbook)
    long begin = System.currentTimeMillis();
    Workbook workbook = new HSSFWorkbook();
    Sheet sheet = workbook.createSheet("66636");
    for (int i = 0; i < 65536; i++) {
        Row row = sheet.createRow(i);
        for (int j = 0; j < 10; j++) {
            Cell cell = row.createCell(j);
            cell.setCellValue(j);
        }
    }
    FileOutputStream fileOutputStream = new FileOutputStream(path + "\\xls65536Test.xls");
    workbook.write(fileOutputStream);
    fileOutputStream.close();
    System.out.println("over");
    long end = System.currentTimeMillis();
    double time = (double) (end - begin) / 1000;
    System.out.println(time);
}
// Output:
// over
// 1.776
Writing a large amount of data to .xlsx (XSSF)
@Test
public void xlsx65536Test() throws IOException {
    // XSSF is slow: writing 10,000 rows took 2.682 seconds
    long begin = System.currentTimeMillis();
    Workbook workbook = new XSSFWorkbook();
    Sheet sheet = workbook.createSheet("66636");
    for (int i = 0; i < 10000; i++) {
        Row row = sheet.createRow(i);
        for (int j = 0; j < 10; j++) {
            Cell cell = row.createCell(j);
            cell.setCellValue(j);
        }
    }
    FileOutputStream fileOutputStream = new FileOutputStream(path + "\\xlsx65536Test.xlsx");
    workbook.write(fileOutputStream);
    fileOutputStream.close();
    System.out.println("over");
    long end = System.currentTimeMillis();
    double time = (double) (end - begin) / 1000;
    System.out.println(time);
}
// Output:
// over
// 2.682
Optimization: writing a large amount of data to .xlsx with SXSSF
@Test
public void sxlsx65536Test() throws IOException {
    // SXSSF: writing 200,000 rows took 4.205 seconds
    long begin = System.currentTimeMillis();
    Workbook workbook = new SXSSFWorkbook();
    Sheet sheet = workbook.createSheet("66636");
    for (int i = 0; i < 200000; i++) {
        Row row = sheet.createRow(i);
        for (int j = 0; j < 10; j++) {
            Cell cell = row.createCell(j);
            cell.setCellValue(j);
        }
    }
    // write to its own file so the XSSF test output above is not overwritten
    FileOutputStream fileOutputStream = new FileOutputStream(path + "\\sxssf65536Test.xlsx");
    workbook.write(fileOutputStream);
    fileOutputStream.close();
    // SXSSF keeps data in temporary files; dispose() deletes them
    ((SXSSFWorkbook) workbook).dispose();
    System.out.println("over");
    long end = System.currentTimeMillis();
    double time = (double) (end - begin) / 1000;
    System.out.println(time);
}
// Output:
// over
// 4.205
Reading an Excel file
package com.muyu.main;

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.junit.Test;

import java.io.FileInputStream;
import java.io.IOException;

public class ExcelReadTest {

    private static String path = "E:\\code\\IDEA code\\ControlExcel\\Excel-poi";

    @Test
    public void getExcelValue() throws IOException {
        // open the file created by the write example above
        FileInputStream fileInputStream = new FileInputStream(path + "\\FanXu.xlsx");
        Workbook workbook = new XSSFWorkbook(fileInputStream);
        Sheet sheet = workbook.getSheet("Fan Xu practice sheet");
        Row row = sheet.getRow(0);
        Cell cell = row.getCell(1);
        System.out.println(cell.getStringCellValue());
        fileInputStream.close();
    }
}
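Note that getStringCellValue() only works when the cell really contains a string; numeric, date, or boolean cells make it throw an IllegalStateException. Below is a small helper sketch (the class name CellValueHelper is my own, not from the original code) that reads a cell according to its type, using the int cell-type constants of the POI 3.x API:

import org.apache.poi.ss.usermodel.Cell;
import org.apache.poi.ss.usermodel.DateUtil;

public class CellValueHelper {

    // Return the cell content as a String, whatever the cell type is.
    public static String cellToString(Cell cell) {
        if (cell == null) {
            return "";
        }
        switch (cell.getCellType()) {
            case Cell.CELL_TYPE_STRING:
                return cell.getStringCellValue();
            case Cell.CELL_TYPE_NUMERIC:
                // numeric cells may actually hold dates
                if (DateUtil.isCellDateFormatted(cell)) {
                    return cell.getDateCellValue().toString();
                }
                return String.valueOf(cell.getNumericCellValue());
            case Cell.CELL_TYPE_BOOLEAN:
                return String.valueOf(cell.getBooleanCellValue());
            case Cell.CELL_TYPE_FORMULA:
                return cell.getCellFormula();
            case Cell.CELL_TYPE_BLANK:
            default:
                return "";
        }
    }
}

A typical use would be System.out.println(cellToString(row.getCell(1))); inside the read test above.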