Dynamically loaded filters in Java

Dynamically loaded filters in Java

Bálint Balázs
Hi,

I've been trying to write a dataset from Java with a custom-written dynamic filter applied; however, I get the following error message:

Exception in thread "main" hdf.hdf5lib.exceptions.HDF5DataFiltersException: Error from filter 'can apply' callback
at hdf.hdf5lib.H5._H5Dcreate2(Native Method)
at hdf.hdf5lib.H5.H5Dcreate(H5.java:1533)
at com.testing.H5TestFilter.createFile(H5TestFilter.java:60)
at com.testing.H5TestFilter.main(H5TestFilter.java:13)
HDF5-DIAG: Error detected in HDF5 (1.8.17) thread 0:
  #000: C:\autotest\HDF518ReleaseRWDITAR\src\H5Pdcpl.c line 929 in H5Pget_chunk(): can't find object for ID
    major: Object atom
    minor: Unable to find atom information (already closed?)
  #001: C:\autotest\HDF518ReleaseRWDITAR\src\H5Pint.c line 3381 in H5P_object_verify(): property list is not a member of the class
    major: Property lists
    minor: Unable to register new atom
  #002: C:\autotest\HDF518ReleaseRWDITAR\src\H5Pint.c line 3331 in H5P_isa_class(): not a property list
    major: Invalid arguments to routine
    minor: Inappropriate type

The filter works from other applications, such as Python or C++. The Java application also writes the dataset successfully if I don't apply the filter to it. When debugging the problem, I can see that the filter gets loaded, and the "can apply" callback is also called with seemingly correct parameters. However, when the filter then tries to read the chunk size of the dataset using

ndims = H5Pget_chunk(dcpl, 32, chunkdims);

I get the above error message.
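For context, the relevant part of the filter's can-apply callback looks roughly like this (a simplified sketch, not the actual filter source; the callback name is illustrative, and it assumes the HDF5 1.8 C headers):

```c
#include "hdf5.h"

/* Simplified sketch of a filter "can apply" callback (HDF5 1.8 API).
 * The function name is illustrative; only the H5Pget_chunk call matters. */
static htri_t can_apply(hid_t dcpl, hid_t type_id, hid_t space_id)
{
    hsize_t chunkdims[H5S_MAX_RANK];

    /* This is the call that fails with "not a property list" from Java. */
    int ndims = H5Pget_chunk(dcpl, H5S_MAX_RANK, chunkdims);
    if (ndims < 0)
        return 0; /* chunk layout unreadable: refuse to apply the filter */

    return 1; /* filter can be applied */
}
```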
I'm using the prebuilt 64bit binaries for Windows (VS 2013) of HDF Java 3.2.1 and HDF5 1.8.17, and the filter was also compiled with VS 2013 against HDF5 1.8.17. The machine is running Windows 7 64 bit version.

Any help or suggestion would be greatly appreciated to fix this problem.

Thanks a lot,
Balint

_______________________________________________
Hdf-forum is for HDF software users discussion.
[hidden email]
http://lists.hdfgroup.org/mailman/listinfo/hdf-forum_lists.hdfgroup.org
Twitter: https://twitter.com/hdf5

Attachment: H5TestFilter.java (3K)
Re: Dynamically loaded filters in Java

Allen Byrne
In order for plugin filters that require a callback into HDF5 to work, you must use the dynamic (shared) library stack. And be sure that the libraries/DLLs are in the path.
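Concretely, on Windows that means something like the following before launching the JVM (all paths below are placeholders for the actual install locations):

```bat
:: Paths are placeholders for your actual install locations.
:: Make the shared HDF5 and HDF-Java DLLs visible to the JVM:
set PATH=C:\Program Files\HDF_Group\HDF5\1.8.17\bin;%PATH%

:: Tell HDF5 where to find dynamically loaded filter plugins:
set HDF5_PLUGIN_PATH=C:\path\to\your\filter\plugin

:: Point the JVM at the JNI native libraries when running:
java -Djava.library.path="C:\Program Files\HDF_Group\HDFJava\3.2.1\lib" com.testing.H5TestFilter
```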

Allen


On Thursday, July 13, 2017 10:52:51 AM CDT Bálint Balázs wrote:

Re: Dynamically loaded filters in Java

Bálint Balázs
Hi Allen,

Thanks for the help. I recompiled HDF Java with BUILD_SHARED_LIBS=ON, and it works now.
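For reference, the configure step was roughly the following (the generator and source path shown are placeholders; the relevant switch is BUILD_SHARED_LIBS):

```bat
:: Generator and source path are placeholders; BUILD_SHARED_LIBS=ON is the key.
cmake -G "Visual Studio 12 2013 Win64" -DBUILD_SHARED_LIBS:BOOL=ON C:\path\to\hdfjava
cmake --build . --config Release
```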

Best,
Balint

2017-07-13 18:38 GMT+02:00 Allen Byrne <[hidden email]>: