
Comment by sfifs

4 years ago

This is actually a great question and doesn't deserve to be downvoted. Indeed, this is one of the considerations that led me to leave the materials science research field after a couple of undergraduate projects with PhD candidates.

It turns out that the domain between angstroms (where we can computationally model atomic interactions accounting for quantum effects) and millimeters (where Newton's laws, and therefore standard mechanical engineering tools, can be used) is a vast computational desert.

Most of the features that determine bulk material properties develop in the micro domain (note the photographs in the article), and almost 20 years after I left the field, I don't believe there is yet any rigorous "first-principles" computational approach for it. In other words, materials are not uniform in the micro domain, and that is precisely where material properties develop.

So the materials research process becomes: hypothesize, create a material batch, test it 20 ways, rinse and repeat for a slightly different composition or process.

Even the software mentioned in the article (Thermo-Calc) is primarily empirical, with some very smart extrapolations and modeling added (note that the first step is experimental data capture [1]). It is definitely a massive step forward from when I was in the field, but it is not first-principles based modeling.

[1] https://thermocalc.com/about-us/methodology/the-calphad-meth...
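To make the "empirical with smart extrapolations" point concrete: in the CALPHAD approach, the Gibbs energy of each phase is written as an ideal-mixing term plus a polynomial excess term (a Redlich-Kister expansion for binary systems) whose interaction coefficients are fitted to experimental data rather than derived from first principles. A minimal sketch for a binary solution — the interaction parameters below are made up for illustration, not real assessed values:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def gibbs_mixing(x_b, T, L):
    """Molar Gibbs energy of mixing for a binary A-B solution (J/mol).

    CALPHAD-style: ideal entropy-of-mixing term plus a Redlich-Kister
    excess term whose coefficients L[k] are FITTED to experiments --
    this is where the empiricism lives.
    """
    x_a = 1.0 - x_b
    if x_a <= 0.0 or x_b <= 0.0:
        return 0.0  # pure component: no mixing contribution
    ideal = R * T * (x_a * math.log(x_a) + x_b * math.log(x_b))
    excess = x_a * x_b * sum(Lk * (x_a - x_b) ** k for k, Lk in enumerate(L))
    return ideal + excess

# Hypothetical fitted interaction parameters (J/mol), illustrative only.
L_params = [-20000.0, 5000.0]
g = gibbs_mixing(0.5, 1000.0, L_params)
```

A real assessment fits dozens of such parameters per phase against measured phase-boundary and calorimetric data, then extrapolates to multicomponent systems — powerful, but fundamentally anchored in experiment.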

Appreciate your use of "first-principles based modeling". In my program, that's what we meant by "model based", but usage in the AI community is quite different.

Your verbiage concisely captures what's so important about the concept.