Comment by gngeal
13 years ago
"Or creating an abstract-database-driven-xml-based UI framework to automate the creation of tabular data when you have under a dozen tables in the application."
I'm not an experienced programmer, and I'd use that approach anyway, simply because it makes sense (is there any alternative that isn't worse?). What does that mean? (Oh, and I'd skip XML. I abhor violence.)
Factories only make economic sense if you are planning on building a large number of widgets.
For small applications, dynamic form generation infrastructure dwarfs the actual business logic. It means writing a lot of code which isn't solving your business problem.
Your project has an extra layer of 'meta'. It's harder to debug. It decreases flexibility. Validation is hard for multi-field relationships. Special-casing that one form can require major infrastructure changes. The approach tends to be slower and buggier than the naive approach for all but the most form-heavy applications.
It's not just the form generation. I've given some thought to how one might work with an MVCC/MGA system such as PostgreSQL or Firebird to build an interactive system with a tabular display that keeps the beneficial properties of these transactional architectures (optimistic concurrency) and still works the way people expect "business apps with data grids" to work. Among other things, this implies data-edit conflict resolution and so on. The problem is that even if the way of going about it is straightforward, I'd never, ever want to write it even twice. Not just because it's extra work, but because generating the updating transaction in the manner recommended by the designers of these RDBMSs (a snapshot transaction to populate the UI and a read-committed transaction to post the edits) is error-prone.
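To make that last point concrete, here is a minimal sketch of the snapshot-read / read-committed-write pairing using psycopg2 against PostgreSQL. The "accounts" table, its "version" column, and the conflict handling are all invented for illustration; this is one way to wire it up under those assumptions, not necessarily the exact scheme the RDBMS designers recommend, and the fiddliness of getting it right everywhere is exactly the problem.

```python
# Hedged sketch: psycopg2 + a hypothetical "accounts" table with a "version"
# column used for the optimistic check. Table and column names are made up.
import psycopg2

dsn = "dbname=app"  # placeholder connection string

# 1. Snapshot transaction: populate the grid from one consistent view of the data.
read_conn = psycopg2.connect(dsn)
read_conn.set_session(isolation_level="REPEATABLE READ", readonly=True)
with read_conn, read_conn.cursor() as cur:
    cur.execute("SELECT id, name, balance, version FROM accounts")
    rows = cur.fetchall()           # what the grid displays, plus version tokens

# ...the user edits a row in the grid...
row_id, _name, balance, seen_version = rows[0]
new_balance = balance + 100

# 2. Read-committed transaction: post the edit, detecting concurrent changes.
write_conn = psycopg2.connect(dsn)
write_conn.set_session(isolation_level="READ COMMITTED")
with write_conn, write_conn.cursor() as cur:
    cur.execute(
        "UPDATE accounts SET balance = %s, version = version + 1 "
        "WHERE id = %s AND version = %s",
        (new_balance, row_id, seen_version),
    )
    if cur.rowcount == 0:
        # Someone else changed the row since our snapshot: this is where a
        # conflict-resolution UI would kick in instead of silently overwriting.
        raise RuntimeError("edit conflict on account %s" % row_id)
```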
Also, why would an extra layer of 'meta' make it harder to debug and decrease flexibility? I would have thought that it would enforce more structure by centralizing certain functionality, making it correct by design instead of relying on conventions being followed consistently in multiple places, and increase flexibility by separating aspects that you'd otherwise have to weave in throughout the code and later change in multiple places.
> I've given some thought to how one might work with a MVCC/MGA system ...
Yes, it's an interesting problem. That's exactly what makes it flypaper ;)
Less flexible: suppose your framework is doing dynamic form generation on top of wxPython. You could have a date picker as a radio box item if you wanted. However, it would probably not be a good use of time to support that in your MDA system.
Harder to debug: Usually, bugs in the auto-form-generation infrastructure are "trickier" than the bugs you would get if you just wrote the forms.
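To put the "less flexible" point above in concrete terms: a metadata-driven layer typically bakes in one widget per logical field type, something like the sketch below (the field spec and mapping are invented, and this assumes wxPython Phoenix; it is only meant to show the shape of the thing). A date-as-radio-box special case simply has nowhere to go without growing the framework.

```python
# Minimal sketch of the kind of type-to-widget mapping a form-generation layer
# tends to hard-code. The field spec is invented; in the real thing it would be
# derived from table metadata.
import wx
import wx.adv

FIELD_SPEC = [
    ("name",   "text"),
    ("active", "bool"),
    ("due_on", "date"),
]

WIDGETS = {
    "text": lambda parent: wx.TextCtrl(parent),
    "bool": lambda parent: wx.CheckBox(parent),
    "date": lambda parent: wx.adv.DatePickerCtrl(parent),   # always a date picker
}

class GeneratedForm(wx.Frame):
    def __init__(self, spec):
        super().__init__(None, title="Generated form")
        panel = wx.Panel(self)
        sizer = wx.FlexGridSizer(len(spec), 2, 4, 8)   # rows, cols, vgap, hgap
        for label, kind in spec:
            sizer.Add(wx.StaticText(panel, label=label))
            sizer.Add(WIDGETS[kind](panel), flag=wx.EXPAND)
        panel.SetSizerAndFit(sizer)

if __name__ == "__main__":
    app = wx.App()
    GeneratedForm(FIELD_SPEC).Show()
    app.MainLoop()
```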
A lot of the use cases for dynamic form generation are adequately served by automatic scaffolding in Django/Rails/TurboGears or, on the enterprisey side, by technologies like Oracle Forms and its successors.
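For the web-framework flavour of that, the Django admin is roughly the whole story in simple cases: register a model and you get the tabular listing and CRUD forms without writing a form layer of your own. The "Invoice" model below is invented for illustration.

```python
# Hedged sketch of Django's built-in scaffolding. "Invoice" and its fields are
# made up; normally this is split across an app's models.py and admin.py.
from django.contrib import admin
from django.db import models

class Invoice(models.Model):
    number = models.CharField(max_length=20)
    issued = models.DateField()
    paid = models.BooleanField(default=False)

admin.site.register(Invoice)   # list view, add/edit forms and validation for free
```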
I think you're missing the forest for the trees. The lesson is just: don't spend the time to generalize something unless doing so will save you time and effort. You'll have to use a lot of judgement to determine where the crossover points are.
For a small number of tables?
Just bite the bullet and render them by hand. It'll take less time than writing the abstract whatever-driven autogenerated UI framework, making it lay things out nicely, handling all the corner cases and auto-wiring that might not be needed in every case, and tweaking the generated output to look good for all the data.
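For contrast with the generated-form sketch earlier in the thread, here is roughly what "by hand" looks like for one made-up table in wxPython: dull, explicit, and trivially easy to special-case when that one form needs something odd.

```python
# Hand-written counterpart for a single hypothetical "invoice" form.
# Nothing is derived from metadata; every label and widget is spelled out.
import wx
import wx.adv

class InvoiceForm(wx.Frame):
    def __init__(self):
        super().__init__(None, title="Invoice")
        panel = wx.Panel(self)
        sizer = wx.FlexGridSizer(3, 2, 4, 8)   # rows, cols, vgap, hgap

        self.number = wx.TextCtrl(panel)
        self.issued = wx.adv.DatePickerCtrl(panel)
        self.paid = wx.CheckBox(panel)

        sizer.Add(wx.StaticText(panel, label="Number"))
        sizer.Add(self.number, flag=wx.EXPAND)
        sizer.Add(wx.StaticText(panel, label="Issued"))
        sizer.Add(self.issued, flag=wx.EXPAND)
        sizer.Add(wx.StaticText(panel, label="Paid"))
        sizer.Add(self.paid, flag=wx.EXPAND)

        panel.SetSizerAndFit(sizer)

if __name__ == "__main__":
    app = wx.App()
    InvoiceForm().Show()
    app.MainLoop()
```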
You know, this is where conciseness comes in handy. Another suspicion I have (in addition to the one I've voiced elsewhere here) is that complex, inflexible languages almost force people to write stuff "by hand". What exactly is the reason for a simple, small generative layer to be so complex that it can't beat hand-written stuff for two dozen tables? I'd understand if you were talking about Java or Cobol, for example, but I'd think that languages such as Ruby, Smalltalk, and Lisp would put the "now it's worth it to be generic" threshold quite a bit lower. I'm not sure exactly where, but certainly lower.
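For what it's worth, the generative layer really can be small in a dynamic language; here is a hedged Python sketch that derives a form/grid spec from live table metadata with SQLAlchemy reflection (the DSN, table name, and type-to-widget mapping are all illustrative). Whether a dozen lines like this beat hand-written forms is still the judgement call discussed above.

```python
# Hedged sketch: derive a (column, widget-kind) spec from live database metadata
# via SQLAlchemy reflection. Connection string and mapping are placeholders.
from sqlalchemy import create_engine, inspect, Boolean, Date, String

engine = create_engine("postgresql:///app")        # placeholder DSN
inspector = inspect(engine)

TYPE_TO_WIDGET = {Boolean: "bool", Date: "date", String: "text"}

def form_spec(table_name):
    """Map each column of a reflected table to a (name, widget-kind) pair."""
    spec = []
    for col in inspector.get_columns(table_name):
        kind = next((widget for sa_type, widget in TYPE_TO_WIDGET.items()
                     if isinstance(col["type"], sa_type)), "text")
        spec.append((col["name"], kind))
    return spec

# form_spec("accounts") could feed the same widget table a generated form uses;
# the question in the thread is whether even this little layer pays its way.
```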