
Instants get not deserialized as expected sometimes #304

Open
droni1234 opened this issue Mar 1, 2024 · 3 comments

Comments

@droni1234

Hey guys,

I bumped into this issue and it took me some time to figure out what was wrong on my side.
Maybe you don't even consider this as an issue but let me explain first what happened on my side.

With the WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS flag disabled, timestamps are written in milliseconds:
1709296652087

If I then miss the READ_DATE_TIMESTAMPS_AS_NANOSECONDS flag and read this value back, I get an absurd Instant consisting of:
Seconds 1709296652087
and
Nanos 0

Any online epoch converter would recognize and treat this value as a millisecond timestamp.

I guess usually you would just split a timestamp at the dot and put the two parts into seconds and nanos, but I think theoretically both representations should be possible.
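
To illustrate the difference with plain java.time (my own illustration, not module code), the decimal form is unambiguous while the integer form forces a choice of unit:

import java.time.Instant;

class TimestampShapes {
	public static void main(String[] args) {
		// Decimal form "1709296652.087": the dot separates seconds from the fraction
		Instant fromDecimal = Instant.ofEpochSecond(1709296652L, 87_000_000L);

		// Integer form "1709296652087": the unit has to be guessed or configured
		Instant fromMillis = Instant.ofEpochMilli(1709296652087L);

		// Both print 2024-03-01T12:37:32.087Z
		System.out.println(fromDecimal + "\n" + fromMillis);
	}
}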

So do you think we could have a feature where we parse values based on their length (a rough sketch follows below)?
11 characters or fewer = Seconds
12 to 15 characters = Millis
16 to 18 characters = Micros
19 to 21 characters = Nanos

While writing this I realize it might feel less consistent, but I also think it could be an interesting flag to add; you be the judge.
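
A rough sketch of the heuristic I have in mind, assuming a hypothetical helper (parseByLength is not an existing Jackson API, just an illustration of the cutoffs above):

import java.time.Instant;

class LengthBasedInstantParsing {
	// Hypothetical heuristic: choose the unit from the number of digits
	static Instant parseByLength(String raw) {
		int digits = raw.startsWith("-") ? raw.length() - 1 : raw.length();
		long value = Long.parseLong(raw);
		if (digits <= 11) {
			return Instant.ofEpochSecond(value);                                          // Seconds
		} else if (digits <= 15) {
			return Instant.ofEpochMilli(value);                                           // Millis
		} else if (digits <= 18) {
			return Instant.ofEpochSecond(value / 1_000_000, (value % 1_000_000) * 1_000); // Micros
		} else {
			return Instant.ofEpochSecond(value / 1_000_000_000, value % 1_000_000_000);   // Nanos
		}
	}

	public static void main(String[] args) {
		System.out.println(parseByLength("1709296652"));    // treated as seconds
		System.out.println(parseByLength("1709296652087")); // treated as millis
	}
}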

@droni1234 changed the title from "Instants get not deserialized properly in jsr310" to "Instants get not deserialized as expected sometimes" on Mar 1, 2024
@cowtowncoder
Member

Instead of verbally explaining details, it would be helpful to have a piece of code to make sure it is clear exactly what settings (mapper, module configuration) and input (JSON) are being used.

As to length-based detection... I am a bit wary of heuristics like that.

But I also think the behavior is a mess, with the various settings that have been added: originally all handling did indeed assume milliseconds-since-1970-01-01 timestamps, but over time new settings were added by contributors who did not always understand (or, no one did) the full consequences of the combinations.

@droni1234
Author

I think usually, yes, you should be 100% exact about what your code produces, and enabling length detection may mess something up, but having an option to enable it could prevent strange errors like this one.

The following code contains a user mistake (mine originally): it is missing the deserialization flag. It more or less sums up the issue I had; I hope it makes sense.

import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.fasterxml.jackson.databind.json.JsonMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

import java.time.Instant;
import java.time.temporal.ChronoUnit;

class Scratch {
	public static void main(String[] args) throws Exception {
		JsonMapper mapper = JsonMapper.builder()
				.addModule(new JavaTimeModule())
				.configure(SerializationFeature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS, false)
				// User mistake: the matching read feature is left at its default (true)
				//.configure(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS, false)
				.build();

		Instant time = Instant.now().truncatedTo(ChronoUnit.MILLIS);

		// Serialized as a plain integer millisecond timestamp
		String mappedTime = mapper.writeValueAsString(time);

		// Read back with the default setting, the integer is interpreted as seconds
		Instant remappedTime = mapper.readValue(mappedTime, Instant.class);

		System.out.println(time.toString() + "\n" + remappedTime.toString());
	}
}

As a result we get a very oddly parsed value, since the millisecond timestamp has been read into the seconds field of the Instant:

2024-03-04T10:41:27.538Z
+56143-06-13T10:58:58Z
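
For reference, this is the configuration I should have used; disabling both features keeps serialization and deserialization symmetric (it simply enables the line that is commented out above):

JsonMapper mapper = JsonMapper.builder()
		.addModule(new JavaTimeModule())
		.configure(SerializationFeature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS, false)
		.configure(DeserializationFeature.READ_DATE_TIMESTAMPS_AS_NANOSECONDS, false)
		.build();

With that in place the round trip should print the same instant twice.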

I would be curious to hear what you think.

@winfriedgerlach

cf. #71
