ENVI/IDL: Creating an HDF5 File and Writing Data (Writing GeoTIFF Files into an HDF File as an Example), with Detailed Explanation
2022-07-26 03:03:00 【Fried eggplant】
Contents
1. Learning content
Topic: create an HDF5 file yourself and write data into it.
——————————————————————————————————————————————————————————————————————————————————————
Basic workflow for creating an HDF5 file:
1. Specify the path and create the HDF5 file
2. Create a group to store data of the same type
3. Get a datatype id for the dataset
4. Get a dataspace id for the dataset
5. Create the dataset and get its id (creating an attribute follows similar steps)
6. Write data to the dataset
7. Close the file
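The seven steps above can be sketched in Python with the h5py library (an illustration only; the article's own code uses IDL's H5 routines, and the file and dataset names here are made up for the demo). h5py bundles the datatype and dataspace bookkeeping into a single `create_dataset` call:

```python
import numpy as np
import h5py

# 1. Specify the path and create the HDF5 file
with h5py.File("air_quality_demo.he5", "w") as f:
    # 2. Create a group for datasets of the same kind
    grp = f.create_group("DataSets")
    # 3-5. h5py infers the datatype and dataspace from the array,
    #      so creating the dataset is a single call; 6. data= writes it
    data = np.arange(12, dtype=np.float32).reshape(3, 4)
    dset = grp.create_dataset("demo", data=data,
                              compression="gzip", compression_opts=9)
    # Attributes are created and written the same way
    dset.attrs["resolution"] = np.array([0.1, 0.1, 0.0])
# 7. The file is closed automatically when the 'with' block exits
```

Note that h5py, like IDL, requires chunked storage for gzip compression; h5py picks a chunk shape automatically when `compression` is set.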
——————————————————————————————————————————————————————————————————————————————————————
2. Knowledge reserve
Below are the HDF5 functions you may need.
——————————————————————————————————————————————————————————————————————————————————————

—————————————————————————————————————————————————————————————————————————————————————
3. Programming
pro week_seven_study1
  ; Create an HDF5 file and write data into it.
  ; The data written here comes from a set of GeoTIFF files.
  ; Record the start time
  start_time = systime(1)
  ; Paths
  in_dir = 'D:/IDL_program/experiment_data/chapter_4'
  out_dir = 'D:/IDL_program/experiment_data/chapter_4/hdf/'
  if file_test(out_dir) eq 0 then file_mkdir, out_dir ; create the output directory if it does not exist
  ; Find the air_quality*.tiff files under in_dir and count them (count= returns the number found).
  ; Some TIFF files in the directory are not wanted, so '*.tiff' alone is not specific enough.
  tiff_path_list = file_search(in_dir, 'air_quality*.tiff', count=tiff_count)
  ; Get the file names without the extension
  tiff_name_list = file_basename(tiff_path_list, '.tiff')
  ; Create the HDF5 file with h5f_create(path), which returns the id of the new file
  create_path = out_dir + 'air_quality.he5' ; .he5 is a common suffix for HDF5 files
  h5_id = h5f_create(create_path)
  ; Create an HDF5 group to hold the datasets (the body data of the TIFF files)
  ; via h5g_create(Loc_id, Name). Loc_id can be a file id or a group id: pass
  ; the id of whatever object the group should live under. Here the group goes
  ; directly under the file, so the file id (h5_id) is passed in; the second
  ; argument is the group name.
  ds_group_id = h5g_create(h5_id, 'DataSets')
  ; Create a global attribute giving a brief description of this HDF5 file
  describe = 'Here are some data about air quality'
  ; Although you know the attribute value is a string, IDL's HDF5 interface
  ; needs an explicit datatype id: h5t_idl_create(Data) takes the data and
  ; returns a datatype id that matches it.
  global_att_type_id = h5t_idl_create(describe)
  ; HDF5 data is laid out like a spreadsheet, so IDL also needs to know how
  ; much space to reserve for writing: h5s_create_simple(dimensions) returns
  ; a dataspace id for the given dimensions.
  global_att_space_id = h5s_create_simple([1]) ; a single string needs only one element; for a c-column by r-row array pass [c, r]
  ; Create the attribute and get its id via
  ; h5a_create(Loc_id, Name, Datatype_id, Dataspace_id).
  ; Loc_id is the object the attribute belongs to. This is a global attribute
  ; on the file itself, so the file id goes in; to attach an attribute to a
  ; group, pass that group's id instead.
  global_att_id = h5a_create(h5_id, 'Describe', global_att_type_id, global_att_space_id)
  ; With the attribute created and its id in hand, write the value with h5a_write
  h5a_write, global_att_id, describe ; first argument: the attribute id; second: the data to write
  ; Release the ids used for the global attribute
  h5a_close, global_att_id
  h5s_close, global_att_space_id
  h5t_close, global_att_type_id
  ; Loop over the TIFF files, extracting each one's data and writing it into the new HDF5 file
  for tiff_i = 0, tiff_count - 1 do begin
    ; Read the TIFF file: the image data plus its georeference information
    tiff_data = read_tiff(tiff_path_list[tiff_i], geotiff=tiff_geo)
    ; Get the size (number of rows and columns) of the data
    tiff_size = size(tiff_data)
    tiff_column = tiff_size[1]
    tiff_row = tiff_size[2]
    ; Get the resolution (pixel scale) of the data
    tiff_resolution = tiff_geo.(0)
    ; Get the corner (tiepoint) information of the data
    tiff_corner = tiff_geo.(1)
    ; Create a dataset in the DataSets group created earlier to hold this TIFF's data.
    ; As before: get a datatype id, get a dataspace id, then create the dataset.
    ds_type_id = h5t_idl_create(tiff_data) ; pass the data that will go into the dataset
    ds_space_id = h5s_create_simple([tiff_column, tiff_row])
    ; Create the dataset and get its id. The dataset belongs under the group,
    ; so the group id is passed in, not the file id. The gzip keyword sets the
    ; compression level (0-9): the higher the value, the smaller the resulting
    ; file, but the longer the code takes to run. IDL requires the
    ; chunk_dimensions keyword whenever gzip is set.
    ds_id = h5d_create(ds_group_id, tiff_name_list[tiff_i], ds_type_id, ds_space_id, $
      chunk_dimensions=[tiff_column, tiff_row], gzip=9)
    ; Write the data into the dataset
    h5d_write, ds_id, tiff_data
    ; Create an attribute on the dataset just created (here, a resolution attribute)
    res_type_id = h5t_idl_create(tiff_resolution)
    res_space_id = h5s_create_simple([3]) ; the pixel-scale tag holds three values
    res_id = h5a_create(ds_id, 'resolution', res_type_id, res_space_id)
    ; Write the data into this attribute
    h5a_write, res_id, tiff_resolution
    ; In the same way you could add further attributes for the corner information; not demonstrated here.
    ; Release the ids opened in this iteration
    h5a_close, res_id
    h5s_close, res_space_id
    h5t_close, res_type_id
    h5d_close, ds_id
    h5s_close, ds_space_id
    h5t_close, ds_type_id
  endfor
  ; Close the open group and HDF5 file
  h5g_close, ds_group_id
  h5f_close, h5_id
  ; Record the end time and report
  stop_time = systime(1)
  print, 'Program time >>> ' + strcompress(string(stop_time - start_time)) + 's'
end
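Once the program has run, the quickest sanity check is to open the resulting .he5 file and walk its contents. Here is a hedged Python/h5py sketch; because the real paths above are machine-specific, it first builds a tiny stand-in file with the same layout (hypothetical names mirroring the program's output) and then inspects it:

```python
import numpy as np
import h5py

# Build a tiny stand-in file with the layout the program produces
# (names are hypothetical, chosen to mirror the code above)
with h5py.File("air_quality_check.he5", "w") as f:
    f.attrs["Describe"] = "Here are some data about air quality"
    g = f.create_group("DataSets")
    g.create_dataset("air_quality_20220726",
                     data=np.zeros((4, 3), dtype=np.float32))

# Inspect it the way you would inspect the real output
with h5py.File("air_quality_check.he5", "r") as f:
    print(f.attrs["Describe"])                  # the global attribute
    names = list(f["DataSets"])                 # dataset names in the group
    shapes = [f["DataSets"][n].shape for n in names]
    print(names, shapes)
```

The same inspection can of course be done in IDL with h5f_open/h5d_read, or interactively with the HDF Group's HDFView tool.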
——————————————————————————————————————————————————————————————————————————————————————
I am Fried Eggplant. You're welcome!