Comment by dcre

2 months ago

Thanks for replying! I disagree that current LLMs can't help build tooling that improves rigor and lets you manage greater complexity. However, I agree that most people are not doing this. Some threads from a colleague on this topic:

https://bsky.app/profile/sunshowers.io/post/3mbcinl4eqc2q

https://bsky.app/profile/sunshowers.io/post/3mbftmohzdc2q

https://bsky.app/profile/sunshowers.io/post/3mbflladlss26