Quality Code Redux, Standards, Dynamic Invocation and Dick Cheney

Bruce Wood, a frequent poster on the MS C# newsgroup, said it all:

"A mediocre standard is better than no standard at all"

Bruce was responding to an OP's desire to "innovate" by using a non-standard implementation of event handlers. He recalled his experience at university, where his professor asked, "What is the purpose of writing code?" After the usual answers (e.g., "To make the computer perform some task"), the professor's own answer was, "It's to make it clear to the next guy who reads the code how you solved the problem."


The point is, if the only objective is to make the computer perform some task, why not just use assembly language?

The answer, of course, is that in a multi-developer environment, or even one where you may someday leave and someone else will take over your code, you want to make it as easy as possible for others to understand not only what you did, but how you did it, and to be able to maintain that code. It is common for the insecure programmer to want to be the deus ex machina of the organization. As developers mature, they usually learn that good teamwork and shared coding standards are far more important.

Bruce finishes, "Hacking out funky code 'that works' costs your employer extra because it will be more expensive to maintain.... A good programmer will make it work, and make it clear how it works, so that it's easy to fix and modify in the future. That's what separates the professionals from the dilettantes."


Often we as developers learn some new technique and become so consumed by "how cool it is" that we forget Bruce's principle. And, as often as not, we don't think through the other implications, such as performance. For example, dynamically compiling custom assemblies from business rules stored in a database and executing them via Type.InvokeMember may be very cool, but as Joel Pobar points out in his MSDN article, if that technique sits in your application's "fast path" (i.e., it gets called repeatedly), you may find that you have shot yourself in the "Code Monkey Foot", so to speak.
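To make the cost concrete, here is a minimal sketch of that reflection-based pattern. The RuleCompiler helper, the Rules.DiscountRule type, and the Order class are all hypothetical stand-ins for whatever your dynamic-compilation scheme produces; the point is the repeated Type.InvokeMember call sitting in the fast path:

    using System;
    using System.Reflection;

    // Hypothetical: an assembly compiled at runtime from business rules kept in a database.
    Assembly rulesAssembly = RuleCompiler.CompileFromDatabase(connectionString);
    Type ruleType = rulesAssembly.GetType("Rules.DiscountRule");
    object rule = Activator.CreateInstance(ruleType);

    foreach (Order order in orders)   // the "fast path" -- executed thousands of times
    {
        // Late-bound call: the member is looked up by name on every single iteration.
        decimal discount = (decimal)ruleType.InvokeMember(
            "Execute",
            BindingFlags.InvokeMethod | BindingFlags.Public | BindingFlags.Instance,
            null,
            rule,
            new object[] { order.Total });
    }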

Joel's work, and additional material from Eric Gunnerson before him, show that an interface-based methodology is approximately 200 times faster, all other things being equal. Not only that, but the whole concept of interfaces makes for better quality programming that is more maintainable. I have found very few situations where it did not make sense to enforce an interface-based approach, even for 100% dynamically generated code.
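Here is the same loop sketched the interface-based way, under the same assumptions (the IDiscountRule contract is an illustrative name): the generated type implements an interface defined in an assembly both sides reference, so you pay for one cast up front and every call after that is an ordinary early-bound method call:

    // Shared contract, referenced by both the host application and the generated assembly.
    public interface IDiscountRule
    {
        decimal Execute(decimal orderTotal);
    }

    // ...
    object instance = Activator.CreateInstance(ruleType);
    IDiscountRule rule = (IDiscountRule)instance;      // one cast, paid once

    foreach (Order order in orders)
    {
        decimal discount = rule.Execute(order.Total);  // no reflection in the loop
    }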

You could say, for example, "Well, I can't use an interface because all my custom-compiled assemblies / classes require specialized sets of parameters." And I say, "Fine. Then create a public Hashtable called 'Parameters' in your interface, and populate it before calling the business method(s) on the interface." The implementation behind that object's Execute method knows what parameters and types it needs, and simply picks them out of its Parameters collection.
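One possible way to code that suggestion -- a hypothetical IBusinessRule interface exposing a Hashtable named Parameters alongside its Execute method. The ShippingRule class and the parameter names are purely illustrative:

    using System;
    using System.Collections;

    public interface IBusinessRule
    {
        Hashtable Parameters { get; }
        object Execute();
    }

    // A dynamically compiled rule picks whatever values it needs out of Parameters.
    public class ShippingRule : IBusinessRule
    {
        private readonly Hashtable parameters = new Hashtable();
        public Hashtable Parameters { get { return parameters; } }

        public object Execute()
        {
            decimal weight = (decimal)Parameters["Weight"];
            string destination = (string)Parameters["Destination"];
            return destination == "AK" ? weight * 1.5m : weight * 0.75m;
        }
    }

    public class Host
    {
        public static object RunRule(Type ruleType)
        {
            // Caller: populate the collection, then call through the interface as usual.
            IBusinessRule rule = (IBusinessRule)Activator.CreateInstance(ruleType);
            rule.Parameters["Weight"] = 12.5m;
            rule.Parameters["Destination"] = "AK";
            return rule.Execute();
        }
    }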

As Bruce so eloquently put it, " 'It works' are the words of a greenhorn".

Of course, we all owe a small debt of gratitude to Dick Cheney. He was the first to implement the "Fire and Forget" pattern!

Finally, just in case you think your data is secure, George Kurtz shows how incredibly easy it is to find confidential information using the Google search engine. Take the time to look at some of this stuff. Absolutely horrifying -- what's out there for the taking!

And, in similar but unrelated news, do you know that AT&T might owe you $21,000?

Something to think about.
