
Comment by Joker_vD

15 days ago

> the average over the whole day was 49.975Hz which doesn't strike me as particularly bad.

A day, having 86_400 seconds in it, is equivalent to 4_320_000 pulses at 50 Hz. At 49.975 Hz, you only get 4_317_840 pulses, which is 2_160 pulses too few. A clock that assumes 50 pulses per second therefore loses 43.2 seconds over that one day.
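For anyone who wants to check it, here's the same arithmetic as a small Python sketch (the figures are just the ones from the numbers above):

```python
# A cycle-counting clock that assumes 50 pulses per second,
# fed by a grid whose daily average is 49.975 Hz.
NOMINAL_HZ = 50.0
ACTUAL_HZ = 49.975          # daily average from the quoted comment
SECONDS_PER_DAY = 86_400

expected_cycles = NOMINAL_HZ * SECONDS_PER_DAY   # 4_320_000
actual_cycles = ACTUAL_HZ * SECONDS_PER_DAY      # 4_317_840
shortfall = actual_cycles - expected_cycles      # -2_160 cycles

# The clock converts counted cycles back to seconds at the assumed
# 50 Hz, so the shortfall shows up directly as lost time.
drift_seconds = shortfall / NOMINAL_HZ
print(f"clock drift: {drift_seconds:+.1f} s/day")  # -> -43.2 s/day
```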

So, no, it's actually a pretty big discrepancy. Over here, anything more than 0.2 Hz off nominal is legally declared "degraded quality", and it's been debated for years that even this margin is far too wide, but the electricity providers/grid operators have successfully argued that they can't afford the upgrades.

Moral of the story: don't get cute when designing electronics. Just use an AC/DC power supply and put in a damn crystal oscillator like every other reasonable person.

Reply (from the author of the quoted claim):

I'm guessing you're being downvoted largely due to the "don't be snarky" rules.

You're right (by my maths too, which I've only just done) that it's a discrepancy of 43.2 seconds per day, which, as you say, is quite high.

However, it is my understanding that most grid operators are actually very good about maintaining a 50 Hz average over the day, specifically for devices that keep time off the mains frequency. I've heard they intentionally run the generators faster or slower at certain points in the day to get the daily average right.
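For what it's worth, that practice exists and is usually called "time error correction". Here's a toy sketch of the idea in Python; the schedule and frequencies are made up for illustration, not real grid data:

```python
# Toy time error correction: the grid runs slightly slow under daytime
# load, then deliberately fast overnight, so a cycle-counting clock
# ends the day on time even though the instantaneous frequency was
# never exactly 50 Hz.

# (hours, average frequency in millihertz) over one hypothetical day;
# integer mHz keeps the cycle counts exact.
schedule = [
    (18, 49_990),   # 18 h at 49.990 Hz: the clock falls behind
    (6, 50_030),    # 6 h at 50.030 Hz: the deficit is paid back
]

cycles_mhz = sum(hours * 3600 * f_mhz for hours, f_mhz in schedule)
expected_mhz = 24 * 3600 * 50_000

# Convert the leftover cycle deficit/surplus into clock error.
error_s = (cycles_mhz - expected_mhz) / 50_000
print(f"net clock error over the day: {error_s:+.1f} s")  # -> +0.0 s
```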

I used to have no issues with time drift on my microwave; it only started in the last few years.