I’ve recently gotten very into NTP with GPS and PPS as a fun personal project. Just a couple weeks ago I was reading about him on Wikipedia and I could relate to this quote (as no one else I’ve talked to about PPS has shown any interest):
> he enjoyed working on synchronized time because no one else was working on it, giving him his own "little fief"
Debian recently switched to NTPSec, and I was happy to see how familiar their website style was to the main NTP site. In the FAQ I found:
> [Q] Why do these web pages look so 1990s
> [A] Because that simple look is good for people with visual impairments, and as a tribute to Dr. David Mills, the original architect of NTP who is himself visually impaired. Dr. Mills has very particular ideas about Web visuals, and this site is carefully styled to resemble his NTP documentation pages.
I’ve never had an opportunity to meet him, but he has certainly made a positive impact on my life. Rest in peace Dr. Mills.
Same here, getting into PTP, you end up studying a lot about timing on computers, and Dr. Mills is one of the main players in building up modern timing foundations! RIP, and thanks for all your contributions!
He really was. I remember encountering him on comp.protocols.time.ntp two decades ago, and the breadth of knowledge he had on computers keeping time was astounding, both at the time and now that I look back at it.
here's that page
https://www.ntpsec.org/FAQ.html
I understand (and endorse) simplicity of design, but that page doesn't even have a link back to the home page. They've gone too far past accessibility into creating a less-accessible page...
And if you haven't looked at it yet, you should probably look at Network Time Security (NTS). https://www.netnod.se/knowledge-base/What-is-Network-Time-Se...
Which NTP site are you referring to? I’m keen to check out some examples.
ntpsec.org
The New Yorker published a piece on Network Time Protocol a little more than a year ago[0] - highly recommend it to anyone interested in how the internet works.
RIP Dave, and thank you.
[0] https://www.newyorker.com/tech/annals-of-technology/the-thor...
Discussed on HN back then:
https://news.ycombinator.com/item?id=33131195 (41 comments)
This is sad news. I worked (a little bit) with him when I added the adjtime system call to linux back in the 0.99 days.... He built stuff that worked and is run everywhere. That is a great legacy. He will be remembered.
Just curious: what is 0.99 in this context?
Almost certainly the Linux kernel version.
Dave Mills was helpful to me as a student. I did some research into NTP in 1999. I knew a little bit about it but not a lot and I said and did a lot of brash things (including sending query packets to every NTP server on the Internet). In response to my random poorly written mailing list and questions Dave answered me and gave me some useful pointers. I felt a little like I was talking to a celebrity.
NTP is a remarkable technology. Getting millisecond synchronization out of megahertz computers and barely-megabit computers was not easy. Honestly I'm not sure I would have thought it was even possible until I read the papers explaining how it worked. And Mills didn't just make it on his own, he helped create a small community of timekeeping experts on the Internet that persists to this day.
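For anyone curious how that synchronization works at the packet level, the core on-wire calculation is small enough to sketch. Here is a minimal Python illustration (my own, not code from Mills' papers) of the standard four-timestamp exchange:

    def ntp_offset_delay(t1, t2, t3, t4):
        """t1: client send, t2: server receive, t3: server send, t4: client receive."""
        offset = ((t2 - t1) + (t3 - t4)) / 2   # estimated client clock error
        delay = (t4 - t1) - (t3 - t2)          # round-trip network delay
        return offset, delay

    # Example: client clock 50 ms slow, 10 ms of symmetric one-way network delay.
    print(ntp_offset_delay(t1=100.000, t2=100.060, t3=100.061, t4=100.021))
    # -> offset ~ +0.050 s, delay ~ 0.020 s

Everything hard in NTP (the clock filter, peer selection, and the discipline loop) sits on top of that one estimate, which is where much of Mills' work went.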
I at first read this as him being your student and started wondering how long ago your 100th birthday was.
I hadn't heard from him in decades, but I knew him back when TCP/IP was starting up and his Fuzzballs were used as routers.
John Nagle
Fuzzballs?
https://en.wikipedia.org/wiki/Fuzzball_router
The Fuzzball paper: https://dl.acm.org/doi/pdf/10.1145/52324.52337
Are you John Nagle, of the Nagle algorithm?
https://news.ycombinator.com/item?id=9048947
I met Dave a few times in the 90s, first when he was visiting Peter Kirstein at UCL. I'd taken network time as simple to get right until that evening chatting with him in the pub. Fascinating discussion of what can go wrong - and a lot of patience with a young networking researcher who didn't know what he was talking about. I've had a high respect for the attention to detail he embedded in NTP ever since. RIP.
Took a class of his at University of Delaware around the turn of the century. He was a great professor who clearly had a love for the subject.
NTP was much more complex and nerdy than some of the other trivial protocol RFCs, especially by v3 (https://www.rfc-editor.org/rfc/rfc1305), which was the first one I read. A legend; RIP
I had Professor Mills in the mid 90s. His knowledge and application of hardware and software were truly impressive. A true hacker (in the finest sense of the word). RIP
Same, I took several of his classes in the late '90s at UD. I remember him fondly. RIP
If someone is interested, here is the Reference and Implementation Guide for the latest NTP version (2006, revised in 2010), written by Dave Mills: https://www.eecis.udel.edu/~mills/database/reports/ntp4/ntp4...
Quite well written, in my opinion.
His talk on the early internet https://youtu.be/08jBmCvxkv4?feature=shared
He influenced my career as much as Dennis and Ken did.
Our "nanokernel" paper brought NTP into the nanosecond domain and gave FreeBSD "timecounters".
But our true shared passion was Loran-C
Dave even invented the 16-pulse "tactical Loran-C" during the Vietnam War.
I borrowed his ISA card Loran-C receiver (serial #1 & only) and later I built two generations of SDR receivers, and he was so proud when I showed him this dancing pulse received with a cheap ARM chip:
https://phk.freebsd.dk/AducLoran/animation2.gif
And boy was he pissed when the USA shut down Loran-C; he really loved his "loudenboomers".
RIP
What is the difference between Loran-C and tactical Loran-C? I googled "loran-c vs tactical loran-c" but did not come up with anything. Thanks.
Tactical Loran, also known as "Loran-D", was a concept Dave was involved in during the Korean War. There's a PDF about it somewhere on the web.
Remember: this is way before GPS, so pilots could not just look at an instrument and know where their plane was, and many planes, both civil and military, would regularly get lost (meaning: lose track of where they were).
In friendly skies you could have your own Loran-C, but you couldn't expect the enemy to provide that for you, so "tactical" meant that you could rig up a navigation chain where you needed it, in a matter of days.
(In the end both the USSR and China set up Loran-C chains, some of them operating jointly with the US chains, and today the Russian "Chayka" and possibly the Chinese chains are the only ones left.)
So tactical Loran was basically Loran-C transmitters in containers or trucks, and because they would be much weaker than real Loran-C, being both power-constrained and having much smaller antennas, they used a 16-pulse code instead of an 8/9-pulse code.
Loran-C and -D were spread-spectrum transmissions decades before the theory was fleshed out, but it was already well understood that a longer pattern would improve S/N.
But Loran-C/D was just one of many DoD radionav systems that project 621B (=GPS) was supposed to kill, and eventually did kill.
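To put a rough number on the S/N point above, here is an illustrative Python sketch (the ±1 code, amplitude, and noise level are invented, not actual Loran parameters) of the matched-filter gain from a longer pulse pattern:

    import numpy as np

    rng = np.random.default_rng(0)

    def correlation_snr(n_pulses, trials=2000, amp=0.5, noise_sigma=1.0):
        """Average correlator-output SNR for a +/-1 code of length n_pulses."""
        code = rng.choice([-1.0, 1.0], size=n_pulses)
        peaks = []
        for _ in range(trials):
            rx = amp * code + rng.normal(0.0, noise_sigma, size=n_pulses)
            peaks.append(np.dot(rx, code))     # matched-filter output at alignment
        peaks = np.array(peaks)
        return np.mean(peaks) ** 2 / np.var(peaks)

    snr8, snr16 = correlation_snr(8), correlation_snr(16)
    print(f"8 pulses: {snr8:.1f}, 16 pulses: {snr16:.1f}, gain: {10 * np.log10(snr16 / snr8):.1f} dB")

Doubling the pattern length buys roughly 3 dB of processing gain, which matters when the transmitter is power-constrained and truck-mounted rather than a full-size Loran-C station.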
More about him: https://en.wikipedia.org/wiki/David_L._Mills
I just imagine all the ntp daemons becoming falsetickers for a moment
I contributed to NTP.
The manufacturers of the cheap device that I cajoled, within ntpd, into generating more than an order of magnitude more precision than they expected offered me decent money to write them a commercial driver. But I pointed out that they could just steer their customers towards NTP, since for most platforms it was already done!
We get the news from Vint Cerf himself.
Thanks Dave, rest in peace.
Giants. Seriously, I get this vibe when looking at all these people, the early internet culture. Sometimes I feel as if I was there... or rather, I really wish I had been.
Sad day for the Internet. Rest in peace, David L. Mills. May time keep going on forever for you, and may your energy be felt across time and space.
I'm sure the artifacts of your work will never be forgotten.
----
On a side note:
Last year I had a chat with one of the members of the early Web, and we agreed there's a serious issue of knowledge transfer to future generations of web devs.
Few people read books, and even when they do, the books written by technical people are not pedagogical enough to let the reader capture the author's tacit knowledge and experience well enough to produce new ideas of their own.
We are LOSING fundamental knowledge of the internet with every mind that dies. If you think that mailing lists, web archives, books, and blog posts are enough, then you're being naive.
At some point nobody will understand how the Web works. The curve the Web is on is not pretty.
This is extremely troubling to me, and I'm trying on the sidelines to find some way to run tacit-knowledge extraction with these people. Known techniques are CTA and ACTA (Cognitive Task Analysis and Applied Cognitive Task Analysis).
If you have any other idea, please let me know.
> At some point nobody will understand how the Web works
Sorry to be blunt, but that is absolute weapons-grade nonsense on multiple levels.
First, we aren't losing any knowledge of how the internet works; at least, not as far as I'm aware. Can you please explain what you mean? What knowledge have we lost? Are we unable to write networking stacks because some greybeards aged out?
Secondly, if you think the guys who wrote the first C compilers and implemented NTP have much of an idea how the 'modern internet' works even today (outside of what you can learn reading Beej's guide), you're wrong. I'd be happy to be proven wrong here too, but I struggle to see how folks like these would be useful on the team that implements, for example, the distributed caching algorithms used by Akamai.
I get your sentiment; it's definitely sad, and a 'passing of the guard' sort of feeling when the first engineers pass on, and for sure, they know a lot about their domains. But lamenting that 'nobody will understand how the web works' because no one cares about ISC BIND's implementation anymore is kind of bonkers.
I don't think we're losing knowledge of how the Internet works, but we're almost certainly losing knowledge of why it was done that way. I remember Bob Braden saying (and I paraphrase):
"When we designed the early Internet, we had a huge blank space to work in, and we agonized over what the best way to do things would be. Ever since, people have been filling in all the other parts of that space."
This was 20 years ago, but he's probably even more correct today. Of course they didn't get everything right by a long shot, but we're definitely losing the rationale for why things were done the way they were. As a result, it's quite common to stumble into old problems that had been engineered around before.
I read it more as "losing knowledge of why things are the way they are today" i.e. the earlier technical context & nuance that caused things to evolve in the way they have.
A nice example from a link in this thread is Dr. Mills's talk at UDel: https://youtu.be/08jBmCvxkv4?feature=shared It's packed with interesting context and history stretching back to 1968.
I see you've never met my coworkers: Akamai does employ a number of people with very long experience in the IETF world.
There are quite a few bad ideas that people pop up to propose time and time again, because they don't get why the net looks the way it does or the constraints on its evolution. The old timers also understand when things have changed enough to justify new things.
The W3C and the IETF both have a paucity of early- and middle-career participants. So where are all these people who understand how it works? Not making more standards to solve some real problems.
The Powersharing Series is a great first-person resource for the 1980s PC era: https://www.thepowersharingseries.com/
What can we do to stop this erosion of knowledge?
Reach out to folks who are in the last chapter of their life and collect the knowledge: StoryCorps [1] meets ArchiveTeam. Interview them, create or add to their Wikipedia page, and upload other artifacts to the Internet Archive.
[1] https://storycorps.org/
Truly a sad event. I never met him, but found his work to be so well explained, both in writing and in practice. I wrote him an email once and got an informative and kind response. I highly recommend folks read his website to learn how to write well and convey complexity in detail, as a story.
If I could save time in a protocol
The first thing that I'd like to do
Is to save every day
'Til eternity passes away
Just to spend them with you
I’ve never looked at NTP so I followed a link here to RFC 1305. I found this gem of humility.
> Note that since some time in 1968 the most significant bit (bit 0 of the integer part) has been set and that the 64-bit field will overflow some time in 2036. Should NTP be in use in 2036, some external means will be necessary to qualify time relative to 1900 and time relative to 2036
The 2036 problem is fixed in RFC 5905 as there is no doubt it will be needed.
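For the curious, the era arithmetic RFC 5905 describes for that rollover is tiny. A minimal Python sketch (my own illustration, not code from either RFC):

    NTP_UNIX_DELTA = 2_208_988_800        # seconds from 1900-01-01 to 1970-01-01

    def to_ntp_era(unix_seconds):
        """Split a Unix timestamp into (NTP era, 32-bit era offset)."""
        ntp_seconds = unix_seconds + NTP_UNIX_DELTA
        return ntp_seconds >> 32, ntp_seconds & 0xFFFFFFFF

    print(to_ntp_era(0))                  # Unix epoch -> (0, 2208988800)
    print(to_ntp_era(2_085_978_496))      # 2036-02-07 -> (1, 0): first second of era 1

The 32-bit seconds field still wraps in 2036; the era number simply isn't carried on the wire, so hosts disambiguate using their own rough knowledge of the current date.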
No NTP means no crypto.
Most crypto nowadays relies on having both computers' clocks in sync.
I learned it the hard way with OpenWrt routers disconnected from the internet.
Not exactly; timestamps commonly only matter for handling certificate expiry, and you'll still be fine if you're a few minutes out.
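A toy sketch of that failure mode (illustrative only; the dates are invented): validation compares the local clock against the certificate's notBefore/notAfter window, so a router that boots with its clock at the epoch rejects every current certificate as not yet valid.

    from datetime import datetime, timezone

    def cert_time_ok(not_before, not_after, now):
        """Is 'now' inside the certificate's validity window?"""
        return not_before <= now <= not_after

    not_before = datetime(2024, 1, 1, tzinfo=timezone.utc)
    not_after = datetime(2025, 1, 1, tzinfo=timezone.utc)

    print(cert_time_ok(not_before, not_after, datetime(2024, 6, 1, tzinfo=timezone.utc)))  # True
    print(cert_time_ok(not_before, not_after, datetime(1970, 1, 1, tzinfo=timezone.utc)))  # False: clock never set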
I had the opportunity to meet him at UD earlier last year, very bright man who was still actively working on many things.
FTR, the black bar atop the site header (as of Fri 01/19 at 10p ET) is intended to honor Dave Mills. See also https://news.ycombinator.com/item?id=39063870
I discovered NTP through Computerphile & Tom Scott.
What a great invention.
I enjoyed his writeups about the fuzzballs back in the day, I learned a lot and had fun doing it.
I took his class @ University of Delaware in 2002.
R.I.P. Professor Mills.
Extraordinary wizards. Founding fathers. True legends.
RIP
Thank you Dave! Rest in peace
Was there no black bar? Or did I miss it?
There should be. Mods are asleep.
Another black ribbon day for HN. RIP Dr Mills. The internet wouldn’t be the same were it not for you.
There should be one for Dave Mills. Another legend that deserves recognition for the foundations of the internet that hundreds of millions use today.
RIP Dave Mills.
Ought to be a black ribbon for Dave Mills...
They tried to apply the ribbon for a time, but the server's clock was wrong.
[flagged]
No. It's been more like "Pioneers of our industry are dying as they are now largely in their 70s, 80s, and 90s." If you don't know who they are, then either you're not part of the computer and software industry (fine), or you should take this as an opportunity to learn about the industry you're participating in rather than celebrating your own ignorance.
It is likely due to people who had their "prime" years overlap with early computing years.
Add 30, 40, or 50 years, and you have people we've never heard of in their 80s and 90s that built the foundation we enjoy today.
Or as they say, "standing on the shoulders of giants".
Either spend 10 seconds on Google, or show some respect by being silent.
[flagged]
Could have? Sounds like the bar lowering. What's your PhD in?