2845
CFCs (mander.xyz)
[-] SpaceCowboy@lemmy.ca -5 points 8 months ago

You're saying "imagine" a lot there.

Were there concrete examples of critical software that actually would've failed? At the time I remember there was one consultant that was on the news constantly saying everything from elevators to microwaves would fail on Y2K. Of course this was creating a lot of business for his company.

When you think about it, storing a date in 6 bytes takes more space than using Unix time, which gives both date and time in four bytes. Y2K38 is the real problem. Y2K was a problem with software written by poor devs who were trying to save disk space while actually using more disk space than needed.

And sure, a lot of software needed to be tested to be sure someone didn't do something stupid. But a lot of it was indeed an exaggeration. You have to reset the time on your microwave after a power outage but not the date; common sense tells you your microwave doesn't care about the year. And when a reporter actually followed up with the elevator companies, it was the same deal. Most software simply doesn't fail when it's run in an unexpected year.

If someone wrote a time critical safety mechanism for a nuclear reactor that involved parsing a janky homebrew time format from a string then there's some serious problems in that software way beyond Y2K.

The instances of the Y2K bug I saw in the wild, the software still worked, it just displayed the date wrong.

Y2K38 is the real scary problem, because people who don't understand binary numbers don't understand it at all. And even a lot of people in the technology field think it's not a problem because "computers are 64-bit now." It doesn't matter how many bits the processor has; it's only the size that's compiled and stored that counts. And unlike some janky parsed string format, Unix time is a format I could see systems at power plants actually using.

[-] AA5B@lemmy.world 9 points 8 months ago* (last edited 8 months ago)

Some of the software at my employer at the time would have failed. In particular, I fixed some currency trading software.

[-] BorgDrone@lemmy.one 1 points 8 months ago

When you think about it storing a date with 6 bytes would take more space than using Unix time which would give both time and date in four bytes. Y2K38 is the real problem. Y2K was a problem with software written by poor devs that were trying to save disk space by actually using more disk space than needed.

This comes to mind:

You don’t store dates as Unix time. Unix timestamps indicate a specific point in time. Dates are not a specific point in time.

[-] SpaceCowboy@lemmy.ca 1 points 8 months ago

You also don't store dates in a string that you'll have to parse later. I've had to deal with MM-DD-YYYY vs. DD-MM-YYYY problems more times than I can count.

And you understand that you could have a date in unix time and leave the time to be midnight, right? You'd end up with an integer that you could sort without having to parse every goddamn string first.

And for God's sake if you insist on using strings for dates at the very least go with something like YYYY-MM-DD. Someone else may someday have to deal with your shit code, at the very least make the strings sortable FFS.
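For what it's worth, the sortability point is easy to demonstrate. A quick sketch in Swift (the sample dates are made up for illustration):

```swift
// The same three days in both string formats under discussion.
let ddMMyyyy = ["18-03-2024", "05-01-2024", "30-12-2023"]
let yyyyMMdd = ["2024-03-18", "2024-01-05", "2023-12-30"]

// A plain lexicographic sort of dd-MM-yyyy strings is chronologically wrong:
// 30-12-2023, the earliest date, sorts last.
print(ddMMyyyy.sorted()) // ["05-01-2024", "18-03-2024", "30-12-2023"]

// yyyy-MM-dd strings sort correctly as plain strings, because the most
// significant field comes first.
print(yyyyMMdd.sorted()) // ["2023-12-30", "2024-01-05", "2024-03-18"]
```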

[-] cqthca@reddthat.com 1 points 8 months ago

You don't have a line that checks the format and auto converts to your favorite?

[-] BorgDrone@lemmy.one 1 points 8 months ago* (last edited 8 months ago)

You also don’t store dates in a string that you’ll have to parse later

Depends. If the format is clearly defined, then there's no problem. Or you could use a binary format. The point is that you store day/month/year separately, instead of as a Unix timestamp.

And you understand that you could have a date in unix time and leave the time to be midnight, right?

No, you can't.

First of all, midnight in what timezone? A timestamp is a specific instant in time, but dates are not, the specific moment that marks the beginning of a date depends on the timezone.

Say you store the date as midnight in your local timezone. Then your timezone changes, and all your stored dates are incorrect. And before you claim timezones rarely change, they change all the time. Even storing it as the date in UTC can cause problems.

You use timestamps for specific instants in time, but never for storing things that are in local time. Even if you think you are storing a specific instant in time, you aren't. Say you make an appointment in your agenda at 14:00 local time, and you store this as a Unix timestamp. It's a specific instant in time, right? No, it's not. If the timezone changes so that, for example, DST goes into effect at a different time, your appointment could suddenly be an hour off, because that appointment was not supposed to be at that instant in time. It was supposed to be at 14:00 in the local timezone, so if the timezone changes, the absolute point in time of that appointment changes with it.
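A sketch of that appointment example in Swift (the timezones here are picked arbitrarily for illustration): resolving the same local 14:00 against two different timezones gives two different absolute instants, which is exactly why the timestamp alone can't represent it.

```swift
import Foundation

// "14:00 on 2024-03-18, local time" as calendar components: no timezone yet.
var appointment = DateComponents()
appointment.year = 2024; appointment.month = 3; appointment.day = 18
appointment.hour = 14; appointment.minute = 0

// Resolving the components to an absolute instant requires picking a
// timezone; different rules give different instants.
var calUTC = Calendar(identifier: .gregorian)
calUTC.timeZone = TimeZone(identifier: "UTC")!
var calNY = Calendar(identifier: .gregorian)
calNY.timeZone = TimeZone(identifier: "America/New_York")!

let instantUTC = calUTC.date(from: appointment)!
let instantNY = calNY.date(from: appointment)!

// New York is on EDT (UTC-4) on that date, so the same local 14:00 is
// four hours later as an absolute instant.
print(instantNY.timeIntervalSince(instantUTC)) // 14400.0
```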

[-] SpaceCowboy@lemmy.ca 1 points 8 months ago

First of all, midnight in what timezone? A timestamp is a specific instant in time, but dates are not, the specific moment that marks the beginning of a date depends on the timezone.

What are you talking about? The same problems apply no matter which format you're talking about. Depending on which side of the dateline your timezone is on you could wind up with different dates.

Does your janky string format of "18-03-2024" suddenly have to become aware of the timezone if I tack a "0:00" onto the end of it? Or maybe you always have timezone issues no matter what the precision of the time you want to store.

I think you've got it in your mind that you can't do anything other than Timestamp=getdate(), and that if it's a date only you have to use a string. That's not the case. You can translate a date into any number of formats; Unix time is one of them. I assure you that 1710720000 will translate to the same janky "18-03-2024" format you're using every single time, unless you deliberately mess with timezones in code where you admit you don't want to deal with timezones. But your string jankiness breaks simply by someone parsing it with MM-dd-yyyy, and that may not require anyone to do anything deliberate. Depending on the library being used and the localization settings of the OS, it can happen automatically. If your code breaks because someone has different OS settings than yours, you are writing bad code.

If the goal is to save space then your format uses 10 bytes, while the timestamp uses 4 (with Y2K38 problems) or 8 with 64 bit Epoch time. If you're not too worried about saving space (you really shouldn't be these days) then use the appropriate structs defined by the language you're using and the DB you're using.

Even this would be better than a string:

struct {
    int year;
    byte month;
    byte day;
}

Six bytes as opposed to 10, and there would be no confusion between the dd and MM parts of the string. It's still shit (use existing date libraries instead) but it still won't have as many problems as what you're doing. Seriously, anything is better than just dumping a date into a string. And as I say, the dd-MM-yyyy format is bad for multiple reasons.
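That layout translates directly to Swift, and sorting needs no parsing at all. A sketch (the type name is made up):

```swift
// Year/month/day as integer fields, roughly the six-byte layout above.
struct SimpleDate: Comparable {
    var year: Int32
    var month: UInt8
    var day: UInt8

    // Field-by-field integer comparison: no dd/MM ambiguity is possible.
    static func < (lhs: SimpleDate, rhs: SimpleDate) -> Bool {
        (lhs.year, lhs.month, lhs.day) < (rhs.year, rhs.month, rhs.day)
    }
}

let dates = [
    SimpleDate(year: 2024, month: 3, day: 18),
    SimpleDate(year: 2023, month: 12, day: 30),
    SimpleDate(year: 2024, month: 1, day: 5),
]
print(dates.sorted().first!.year) // 2023
```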

Though congratulations, you've convinced me that Y2K might've been a bigger problem than I thought, given how adamant you are about repeating the kinds of mistakes that caused those issues. I guess even when there are very obvious problems with how someone's doing things, they will insist on doing them that way even when all the problems are pointed out. I can imagine someone in the 80s and 90s pointing out the Y2K problem to someone writing the code and getting some arrogant bullshit about how only mid-level programmers worry about that. "Experts put dates in strings LOL!"

[-] BorgDrone@lemmy.one 1 points 8 months ago* (last edited 8 months ago)

Does your janky string format of “18-03-2024” suddenly has to become aware of the timezone

No, there is no timezone, and that is the entire point. In the majority of cases you just want to store the local date. The point is that a local date or time is not necessarily a fixed point in time. If I have drinks at 18:00 every Friday, that doesn't change when we switch to or from DST, it's still 18:00 in local time. I don't need a timezone, I know what timezone I live in.

Now, in cases where timezones do matter, for example if you have a Zoom meeting with someone from another country, you can store as local time + timezone. But this is still very different from storing a Unix timestamp. This meeting will be at a specific time in a specific timezone, and the exact moment in time will adjust when changes are made to that timezone. Again, a Unix timestamp does not allow for this, as it's always UTC.
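A sketch of that "local time + timezone" storage in Swift (the struct and the meeting details are invented for illustration):

```swift
import Foundation

// Store wall-clock components plus a timezone identifier, and resolve to an
// absolute instant only when needed.
struct ZonedMeeting {
    var components: DateComponents   // e.g. 2024-03-18 14:00
    var timeZoneID: String           // e.g. "Europe/Amsterdam"

    // Resolving against the current timezone rules means a later rule change
    // moves the absolute instant along with the local wall-clock time.
    func instant() -> Date? {
        guard let tz = TimeZone(identifier: timeZoneID) else { return nil }
        var cal = Calendar(identifier: .gregorian)
        cal.timeZone = tz
        return cal.date(from: components)
    }
}

var c = DateComponents()
c.year = 2024; c.month = 3; c.day = 18; c.hour = 14
let meeting = ZonedMeeting(components: c, timeZoneID: "Europe/Amsterdam")
// Amsterdam is on CET (UTC+1) on that date, so 14:00 local is 13:00 UTC.
print(meeting.instant()!.timeIntervalSince1970) // 1710766800.0
```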

I assure you that 1710720000 will translate to the same janky “18-03-2024” format you’re using every single time unless you deliberately mess with timezones in code

No, it doesn't. You can't convert it to any date unless you "mess with timezones", because 1710720000 is a specific moment in time and you have to provide a timezone when converting it to a date. You are mistaking the fact that some systems implicitly use UTC when converting for some sort of universal standard, which it's not.

Run the following Swift code:

let d = Date(timeIntervalSince1970: 1710720000)
print(d.formatted(date: .complete, time: .omitted))

You'll get a different date depending on your location.

If your code will break because someone has different OS settings than yours, you are writing bad code.

Yes, and your bad code will break simply because you are abusing a datatype for something beyond its intended use. If you want to store an absolute point in time, by all means use a Unix timestamp, but if you want to store a local time you should never use it, because it's not meant for that and it doesn't encode the information needed to represent a local time.

Even this would be better than a string:

struct {
    int year;
    byte month;
    byte day;
}

Yes, that's fine. I'm not arguing that you should store it as a string; I'm arguing that you should store it as individual components, in whatever format, instead of as seconds since the epoch. As long as the format is well specced it doesn't really matter. Strings are used all the time for representing dates, by the way. For example, ASN.1, which is used everywhere, stores dates and times as strings, and it's perfectly fine because the format is specified unambiguously.

Six bytes as opposed to 10

In what archaic system are ints still 4 bytes? Int is 64 bits, or 8 bytes, on any modern machine. If I read your format on a 64-bit machine, it'll break. Also, is that int little or big endian? Your code still breaks if you spec an int32 and you store your date on an x86 machine (little endian) and I read it on a big-endian machine. You know what's not ambiguous? "This time is stored as an ISO8601 string".
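The binary alternative is to make width and byte order part of the spec. A sketch in Swift of the six-byte layout with an explicit int32 year in network byte order (the function is invented for illustration):

```swift
// Serialize the year as an explicit 32-bit big-endian integer, then month
// and day as single bytes, so the wire format is identical on every machine
// regardless of the host's native int size or endianness.
func encode(year: Int32, month: UInt8, day: UInt8) -> [UInt8] {
    withUnsafeBytes(of: year.bigEndian) { Array($0) } + [month, day]
}

let bytes = encode(year: 2024, month: 3, day: 18)
// 2024 == 0x000007E8, so the big-endian bytes are 00 00 07 E8.
print(bytes) // [0, 0, 7, 232, 3, 18]
```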

[-] SpaceCowboy@lemmy.ca 1 points 8 months ago

Dude, a date is a fixed point in time... just has less accuracy than if a time is included.

In what archaic system are int’s still 4 bytes?

When you have more experience programming in more languages, you'll find that in a lot of modern languages an int is always 32 bits and a long is 64 bits. That doesn't change whether your system is 32 bits or 64 bits.

If I read your format on a 64-bit machine, it’ll break.

And this is exactly why many programming languages don't change the definition of int and long for different processor architectures.

You clearly don't have any experience with higher-level programming languages, which you should really look into. If you have so little understanding of the problems with dates and times, you should really only work in languages with a well-defined DateTime structure built in, so you won't get into trouble with all the edge cases and performance problems you're creating by not understanding why parsing date strings should be avoided whenever possible.

You know what’s not ambiguous ? “This time is stored as an ISO8601 string”.

Interesting that you were boldly claiming that experts use a dd-MM-yyyy format and now you're bringing up a format that starts with yyyy-MM-dd. Do you understand now why it's put into that order?

But yeah, check out high-level languages; they'll serialize dates into a standard format for you. Though I still have to put in serialization options to handle communications with partners that don't follow standards. All the time. I get enough headaches with dates in string formats when I can't avoid them that I know better than to use them when I can.

The meme you had that says that experts use dd-MM-yyyy is the wrong way around. Beginners use the built-in DateTime functionality that's offered by a high level language. Experts use this as well. It's only the mid tier devs that think they're going to come up with a better way on their own and get into the problems you're going to find yourself in.

[-] BorgDrone@lemmy.one 2 points 8 months ago

When you have more experience programming in more languages, you'll find that in a lot of modern languages an int is always 32 bits and a long is 64 bits

Once you gain some more experience you will realize that ‘a lot of’ is not good enough. Some languages do, some don’t. If you define a format, you don’t say ‘int’, you say something like “int32 in network byte order” (a.k.a. big endian).

Interesting that you were boldly claiming that experts use a dd-MM-yyyy format

Stop being willfully ignorant. I’ll repeat it once more: my claim is that you should store your dates as individual components, not as the time since an epoch. I don’t care how those components are stored as long as it’s clearly specced.

this post was submitted on 17 Mar 2024
2845 points (99.2% liked)
