Comment by neonbrain

11 hours ago

The word "Testing" is a very loaded term. Few non-professionals, and even few professionals, fully understand everything it can mean.

Consider the following: Unit, Integration, System, UAT, Smoke, Sanity, Regression, API Testing, Performance, Load, Stress, Soak, Scalability, Reliability, Recovery, Volume Testing, White Box Testing, Mutation Testing, SAST, Code Coverage, Control Flow, Penetration Testing, Vulnerability Scanning, DAST, Compliance (GDPR/HIPAA), Usability, Accessibility (a11y), Localization (L10n), Internationalization (i18n), A/B Testing, Chaos Engineering, Fault Injection, Disaster Recovery, Negative Testing, Fuzzing, Monkey Testing, Ad-hoc, Guerilla Testing, Error Guessing, Snapshot Testing, Pixel-Perfect Testing, Compatibility Testing, Canary Testing, Installation Testing, Alpha/Beta Testing...

...and I'm certain I've missed dozens of other test approaches.
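To make the point concrete, here is a minimal sketch (function and names are purely illustrative, not from any particular codebase) of how far apart two of those terms can be: a unit test checks one known input against one expected output, while a toy fuzz test throws random garbage at the same function and only checks that it fails in a controlled way.

```python
import random

def parse_percent(s: str) -> float:
    """Parse a string like '42%' into a fraction."""
    if not s.endswith("%"):
        raise ValueError(f"expected trailing '%': {s!r}")
    return float(s[:-1]) / 100.0

def test_unit() -> None:
    # Unit test: one known input, one expected output.
    assert parse_percent("42%") == 0.42

def test_fuzz(trials: int = 1000) -> None:
    # Toy fuzz test: random input, and the only requirement is
    # "return a float or raise ValueError" -- never crash otherwise.
    rng = random.Random(0)  # seeded for reproducibility
    for _ in range(trials):
        junk = "".join(chr(rng.randrange(32, 127)) for _ in range(8))
        try:
            parse_percent(junk)
        except ValueError:
            pass  # rejecting garbage is acceptable behavior

test_unit()
test_fuzz()
```

Both are "testing", yet they answer entirely different questions about the same ten lines of code.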

There is no science to testing, no provable best way, despite many people's vehement opinions.

  • Why did you assume I'm talking about a "provable best way"? I meant that it doesn't make sense to talk simply about "testing" without clarifying what one means by it. If you assume that the absence of a "provable best way" implies a lack of utility, let me remind you that there is no "provable best way" for training LLMs either. Does that matter in practice?

You forgot hope-driven development and release processes, other optimism-based methods ("I'm sure it's fine"), and faith-based approaches to testing (ship and pray, ...). Customer-driven involuntary beta testing also comes to mind, as does "let's see what happens" 0-day testing before deployment. We also do user-driven error discovery, frequently.