I'll do my best with the available information (no code example was provided). I suspect that the name argument passed to H5Acreate_by_name is either missing (null/empty) or not recognised as a valid attribute name. So check that the buffer you pass contains a valid, not-yet-existing name for the attribute.
Once you've created the attribute, you should be able to write the XML string to it using H5Awrite().
In reply to this post by "André Walker-Loud <email@example.com>"
A simple example that reproduces the error goes a long way.
I can't follow your description.
Please include a description of your environment (OS, compiler, etc.).
The error message suggests that you are using MPI.
Are you using parallel HDF5? If so, how?
From: Hdf-forum [mailto:[hidden email]] On Behalf Of gmail
Sent: Wednesday, January 10, 2018 7:09 PM
To: [hidden email]
Subject: [Hdf-forum] xml string attribute write error
A colleague and I are getting an error when trying to write a string, which is a fully formed XML snippet, as an attribute in an HDF5 file.
Checks we have done:
1 - we manually wrote the string and, using .compare, confirmed that this manual string exactly matches the string we read from an XML buffer
1a - the manual string can be written to an attribute
1b - the original string raises an error when we write it
2 - we used .copy to make a fresh copy of the string from the XML buffer
2a - this copy also fails to write, with the same error
3 - we removed all newline characters from the string - this did not help
4 - we used the .substr method to chop the string into small pieces - this also did not help (every piece failed to write)
5 - we tried just the first 5 characters of the string, which contain no special characters - this still fails
The fact that 1a succeeds means we are probably missing something simple.
The error is
HDF5-DIAG: Error detected in HDF5 (1.8.16) MPI-process 1:
#000: H5A.c line 334 in H5Acreate_by_name(): no attribute name
major: Invalid arguments to routine
minor: Bad value
I am hoping someone here has had and fixed the same problem, as we are very stuck.