
Why chunks are required for compressed data


Why chunks are required for compressed data

MEYNARD Rolih
Hi,

I would like to know why HDF5 requires creating chunks for
compression.

For example, why is it not possible to compress a dataset without
chunking it?

Thank you,

Rolih

_______________________________________________
Hdf-forum is for HDF software users discussion.
[hidden email]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5

Re: Why chunks are required for compressed data

Elvis Stansvik


On 6 Sep 2016 at 6:33 PM, "MEYNARD Rolih" <[hidden email]> wrote:
>
> Hi,
>
> I would like to know why HDF5 requires creating chunks for compression.
>
> For example, why is it not possible to compress a dataset without chunking it?

I would assume that without chunks, the entire dataset would need to be decompressed just to access a single value. It's therefore better to create chunks, such that when accessing a value, only the chunk in which it lies must be decompressed.
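This tradeoff is visible directly in the h5py bindings: asking for compression forces a chunked layout, and h5py will pick a chunk shape automatically if you don't supply one. A minimal sketch, assuming h5py and numpy are installed (the file name is arbitrary):

```python
import h5py

with h5py.File("demo.h5", "w") as f:
    # Requesting gzip compression forces a chunked layout; h5py
    # auto-selects a chunk shape since none is given here.
    dset = f.create_dataset("data", shape=(1000, 1000), dtype="f4",
                            compression="gzip")
    print(dset.chunks)   # a tuple, not None: the dataset is chunked

    # An uncompressed dataset can stay contiguous (chunks is None).
    plain = f.create_dataset("plain", shape=(1000, 1000), dtype="f4")
    print(plain.chunks)
```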

Elvis

>
> Thank you,
>
> Rolih



Re: Why chunks are required for compressed data

Werner Benger
In reply to this post by MEYNARD Rolih
Hi,

  you can make a chunk as large as 4GB, which may cover the entire
dataset at once, if you prefer. For datasets larger than 4GB, chunking
them into sections that can be addressed with 32-bit indices becomes
important.

If you always read an entire dataset at once, then using a single chunk
may be just fine. However, if you ever use hyperslabs and want to read
only a part of a bigger dataset, then the ability to read, and even
more, to decompress only the parts that are of interest, instead of
needing to decompress the entire dataset, will be beneficial. Also, if
you have multiple chunks, then each chunk can be decompressed
independently, thus in parallel (though I don't know whether there are
filters implemented that way for serial HDF5).
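The hyperslab point above can be sketched with h5py (assumed installed, along with numpy; names and chunk shapes are illustrative). With explicit 100x100 chunks, a small slice touches only the one chunk it lies in, so only that chunk has to be read and decompressed:

```python
import numpy as np
import h5py

with h5py.File("chunks.h5", "w") as f:
    data = np.arange(1_000_000, dtype="f4").reshape(1000, 1000)
    # Explicit 100x100 chunks: each of the 100 chunks compresses and
    # decompresses independently of the others.
    f.create_dataset("data", data=data, chunks=(100, 100),
                     compression="gzip")

with h5py.File("chunks.h5", "r") as f:
    # This hyperslab lies entirely inside a single 100x100 chunk, so
    # only that chunk is fetched and decompressed, not the whole array.
    part = f["data"][10:20, 10:20]
    print(part.shape)   # (10, 10)
```

With a single dataset-sized chunk instead, the same slice would force decompression of the entire array.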

       Werner


On 06.09.2016 18:31, MEYNARD Rolih wrote:

> Hi,
>
> I would like to know why HDF5 requires creating chunks
> for compression.
>
> For example, why is it not possible to compress a dataset without
> chunking it?
>
> Thank you,
>
> Rolih

--
___________________________________________________________________________
Dr. Werner Benger                Visualization Research
Center for Computation & Technology at Louisiana State University (CCT/LSU)
2019  Digital Media Center, Baton Rouge, Louisiana 70803
Tel.: +1 225 578 4809                        Fax.: +1 225 578-5362

