Nhibernate Fluent => hbm => SQL Schema

What luck for rulers that men do not think.  - Adolf Hitler

More experimentation with NHibernate, and we’re getting very close to Nirvana. The concept is (once again): develop your domain model first (similar to the “contract first” paradigm in SOA and web services), create the mappings, and then create the database schema for the persistence mechanism (SQL Server, whatever) last.

The Fluent NHibernate project’s API allows you to map entities in NHibernate in a more expressive and more testable manner than is typically possible. It gives you a clear path to take your POCOs (Plain Old CLR Objects) – standalone objects without any of the attribute “glop” or dependencies – and turn them into NHibernate .hbm XML mapping files. This is very cool because it means your objects can be marked [Serializable] and sent over the wire via WCF, etc., without any “[VendorNameHere] baggage” getting in your way, because they don’t really have any.

Once this is accomplished, you can use the built-in NHibernate Tools classes to create an actual SQL Server (or other RDBMS) Schema SQL script to generate the database tables that mirror what your domain model represents in your POCO code.

Additionally, the Automap Classes are moving closer and closer to the ability to take care of the mapping “all at once” with a single method call. You don’t even need to persist .hbm XML mapping files; the NHibernate runtime is capable of using the generated mappings in memory.
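By way of illustration, here is roughly what a fluent mapping looks like. The entity and property names are my own invention, and the Fluent NHibernate API was still evolving rapidly at the time of this writing, so treat this as a sketch rather than gospel:

```csharp
using System;
using FluentNHibernate.Mapping; // from the Fluent NHibernate project

// A plain POCO: no attributes, no NHibernate dependency, freely serializable.
[Serializable]
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// The mapping lives in a separate class, keeping the POCO itself clean.
public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
    }
}
```

The payoff is exactly the separation described above: the Customer class itself never references NHibernate, so it can travel over WCF unencumbered.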

Again, the whole point of this exercise is “Top Down” or “Objects First”. This flies in the face of the Microsoft data-centric approach of “database first” LINQ to SQL or Entity Framework development (which is not a bad thing, because it is eminently usable), and it will require further study by developers who understand the concept of “clean, unadulterated POCOs” that represent a true Domain Model approach to solving business problems for clients.

I’ll post more on this and probably write a comprehensive article to be posted on our eggheadcafe.com site complete with a downloadable VS 2008 solution.


IIS / ASP.NET Recycling “Deadlock Detected”

I like long walks, especially when they are taken by people who annoy me.   - Noel Coward

For some time I’ve been working on an ASP.NET web site issue where, almost like clockwork, once an hour the app recycles, and then about six minutes later it recycles again.

It took a long time to find it, but it turns out that “we’re our own worst enemy”. No, it wasn’t some external process like Task Scheduler on the box, running once per hour and hogging threads. It was me having so much fun doing FireAndForget pattern RPC server pings on every search that I was shooting myself in the foot! This is what happens when we put in some new cool “thing” and then six months later, when we start to see problems, we can’t remember what we did!

The ASP.NET “Deadlock detected” error shows up in your Application Event log. It’s kind of cryptic, but here’s the general cause: you have a ThreadPool per AppDomain. You may have a number of different operations going on (including just serving pages), all of which use up a ThreadPool thread. Usually, whatever is happening gets done pretty quickly, so the thread becomes available again immediately and there is never a problem.

But if it doesn’t, you can quickly end up using all of your ThreadPool threads, and then the very next page request that comes in is plumb “out of luck” – and your AppPool can thus cryptically recycle.

It could be unclosed connections, sockets, or in this case, the FireAndForget pattern which is a non-blocking way to send off some operation on a background thread.

The problem is, it’s still a real thread and it comes from the ThreadPool. And if it takes its sweet time, you could end up starving yourself out of ThreadPool threads and blowing up your app. While the error reported is “Deadlock detected”, my theory is that it may not actually be a deadlock at all – just simple thread starvation.

There is an elegant fix, the general pattern of which looks like so:

public static void DoPingOMatic(string title, string url)
{
    int workers = 0;
    int completion = 0;
    // How many ThreadPool worker threads are currently free?
    ThreadPool.GetAvailableThreads(out workers, out completion);
    if (workers > 10)
        FireAndForget(new PingOMaticDelegate(PingOMatic), new object[] { title, url });
}

What I’m doing above is very simple and happens at the method body level. I simply check that there are at least 10 worker threads available before I allow the method to attempt to use a ThreadPool thread. Of course, you can get more sophisticated (waiting, for example), but in this case, just checking that I’ve got at least “X” ThreadPool threads before we let ’er rip is sufficient.
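For completeness, here’s a bare-bones version of the FireAndForget helper itself. This is my own minimal sketch of the general pattern, not the original library code (which did proper cleanup via EndInvoke); the class name is hypothetical:

```csharp
using System;
using System.Threading;

public static class BackgroundUtil
{
    // Queue the delegate on a ThreadPool worker thread and return at once.
    // Exceptions are swallowed: a failed background ping should never be
    // allowed to take down the AppDomain.
    public static void FireAndForget(Delegate d, object[] args)
    {
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            try { d.DynamicInvoke(args); }
            catch { /* fire and forget: ignore failures */ }
        });
    }
}
```

Note that this is exactly why the throttle above matters: every call here consumes one of the same ThreadPool threads that serve your pages.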

Recursion: See “Recursion”

Doing this kind of RPC service pinging every time you get a search or other request can get you into an unwitting “internet recursion” folly with your website if you aren’t careful. Here’s what I mean: you decide to ping a bunch of RPC servers every time you get a search on your blog or website. What happens is that the RPC ping service (Pingomatic, whatever – see the article on eggheadcafe.com) dutifully goes out and makes a request of the URL you’ve pinged it with (e.g., http://www.yourblog.com?q=whatever). Your blog treats this RPC service request as a new search, and then pings the RPC servers with it, and… you get the picture. Talk about thread starvation!
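A cheap guard against that loop is to inspect the incoming request before deciding to ping at all. The user-agent substrings below are purely illustrative assumptions on my part; check your own IIS logs to see how the ping services actually identify themselves:

```csharp
using System;

public static class PingGuard
{
    // Returns true only when the request looks like a real user search,
    // so we don't re-ping the RPC servers for their own callback requests.
    public static bool ShouldPing(string userAgent)
    {
        if (String.IsNullOrEmpty(userAgent))
            return false;
        string ua = userAgent.ToLowerInvariant();
        // Hypothetical markers; tune these from your server logs.
        return !(ua.Contains("ping") || ua.Contains("bot") || ua.Contains("crawler"));
    }
}
```

Combined with the available-threads check, this breaks the cycle: a callback from a ping service never triggers a fresh round of pings.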

So, the next time you see this kind of IIS ASP.NET application pool recycling, search through your code carefully. The bottom line is, if you have any kind of background thread processing going on, your ThreadPool has only 50 worker threads by default, some of which may already be in use behind the scenes even though your code doesn’t explicitly engage them. It’s a good idea, therefore, to throttle whatever you’re doing, especially if it is likely to happen many times per minute.


Referencing a non-Silverlight Assembly in a Silverlight Project

“Bailout? Hey! I’m having trouble paying my bills. How about it?” -- Me

This one comes up a lot, and the bottom line is, you can’t do it.

But - I think it is important to understand why it won't work:

You cannot reuse non-Silverlight assemblies, since the desktop CLR and the Silverlight CLR are based on two different Frameworks. They are similar, but they aren’t “the same”. The BCLs (Base Class Libraries) that are referenced in each type of project may have similar names, but they are completely different physical files. You can, however, reuse the original code if it’s compatible and available to you in source code form.

In Visual Studio, you can create a new Silverlight class library project, then right-click to add an existing item. Click the arrow on the right side of the Add button and choose "Add As Link". This will not copy the .cs file to your new project, so when you modify the .cs file in one place, the other place is automatically updated. You can also have two separate projects, one a full-framework project and one a Silverlight project, that both use the same class files. You can even save a Silverlight .csproj file right next to the full-framework .csproj file (with a slightly different name) in order to promote code reuse.
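Under the hood, "Add As Link" just records a relative path in the project file. The path and file name below are made up for illustration; the key is the Link child element:

```xml
<!-- In the Silverlight project's .csproj -->
<ItemGroup>
  <Compile Include="..\MyFullFrameworkLib\Customer.cs">
    <Link>Customer.cs</Link>
  </Compile>
</ItemGroup>
```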

As far as being able to "add a reference" to an assembly that wasn't compiled under a Silverlight project: you cannot do it. However, you can add a reference from a full .NET CLR project to a Silverlight assembly.


Silverlight 2 / Visual Studio 2008 “The project type is not supported” error.


Recently I managed to discombobulate my Visual Studio 2008 installation, and had to remove and reinstall the little booger. Everything went pretty much OK, except that after reapplying Silverlight_Tools.exe (the combined developer installer), I tried to load an existing Silverlight 2.0 RTM project and got the “The project type is not supported by this installation” dialog.

I’m like, “Huh? The installation went perfectly with no errors”.  One fix I found that seems to work perfectly is to run “Devenv /setup”. For some reason this resets all the package loads and Silverlight Tools will be happy ever after.

In my short happy life as a .NET developer, I’ve seen several products where MSIEXEC ends up just sitting there in memory at the end of an install session, all dressed up with no place to go, and it just hasn’t quite finished the job.

While I’m on this subject, it might be appropriate to do a quick review of all the DEVENV.EXE switches (the full documentation is here, along with all parameters):

(Note: When you run Visual Studio 2008 on Windows Vista, you must run devenv as an administrator in order to use the /Setup and /InstallVSTemplates switches.)

/? (devenv.exe)
Displays a message box listing all devenv switches, with a brief description of each one.

/Build (devenv.exe)
Builds a solution using a specified solution configuration file.

/Clean (devenv.exe)
Cleans all intermediary files and output directories.

/Command (devenv.exe)
Executes the specified command after launching the Visual Studio integrated development environment (IDE).

/DebugExe (devenv.exe)
Opens the specified executable file to be debugged.

/Deploy (devenv.exe)
Deploys a solution after a build or rebuild. Applies to managed code projects only.

/Edit (devenv.exe)
Opens the specified file in an existing instance of Visual Studio.

/LCID (devenv.exe)
Sets the default language used for text, currency, and other values within the integrated development environment (IDE).

/Log (devenv.exe)
Starts Visual Studio and logs all activity to the specified log file for troubleshooting.

/NoVSIP (devenv.exe)
Disables the Visual Studio SDK developer license key on a developer workstation and then starts Visual Studio.

/Out (devenv.exe)
Specifies a file to store and display errors when you run, build, rebuild, or deploy a solution.

/Project (devenv.exe)
Identifies a single project within the specified solution configuration to build, clean, rebuild, or deploy.

/ProjectConfig (devenv.exe)
Specifies a project build configuration to be applied when you build, clean, rebuild, or deploy the project named in the /project argument.

/Rebuild (devenv.exe)
Cleans and then builds the specified solution configuration.

/InstallVSTemplates (devenv.exe)
Registers project or item templates that are located in <Visual Studio installation path>\Common7\IDE\ProjectTemplates\ or <Visual Studio installation path>\Common7\IDE\ItemTemplates\ so that they can be accessed through the New Project and Add New Item dialog boxes.

/ResetSettings (devenv.exe)
Restores Visual Studio default settings. Optionally resets the settings to the specified .vssettings file.

/ResetSkipPkgs (devenv.exe)
Clears all options to skip loading added to VSPackages by users wishing to avoid loading problem VSPackages, then starts Visual Studio.

/Run (devenv.exe)
Compiles and runs the specified project or solution.

/Runexit (devenv.exe)
Compiles and runs the specified project or solution, and then closes the integrated development environment (IDE).

/SafeMode (devenv.exe)
Starts Visual Studio in safe mode, loading only the default environment and services.

/Setup (devenv.exe)
Forces Visual Studio to merge the resource metadata that describes menus, toolbars, and command groups, from all available VSPackages.

/Upgrade (devenv.exe)
Updates the solution file and all of its project files, or the project file specified, to the current Visual Studio 2005/2008 formats for these files.

/UseEnv (devenv.exe)
Starts Visual Studio and uses the environment variables for PATH, INCLUDE, LIBS, and LIBPATH in the VC++ Directories dialog box.

Well, there they are! I sure didn’t know about all of them. Happy Holidays.


LINQ To SQL / Entity Framework / NHibernate ORM Top-Down, Objects First

People demand freedom of speech as a compensation for the freedom of thought which they seldom use. – Søren Kierkegaard

In the process of stumbling through LINQ To SQL to see if I would be able to represent a SQL Server database schema I created to provide storage for a hierarchical well-defined XML Schema for a commonly used utility object, I came to the realization that I was indeed doing everything completely backwards!

What I am saying is this:  ORM should be done by focusing on the OBJECTS FIRST, not the Database Schema! Unfortunately, most of the tools we have are data-centric, not object-centric.  Scott Allen has a post that clearly describes the debacle.

To my knowledge, there will not be any plain old CLR objects (POCOs) in Entity Framework. LINQ to SQL doesn’t yet have all the mapping capability to really separate the object model from the underlying database schema – and of course, you can use it with SQL Server only.

Now, this situation may improve in the future. But hey -- I need to write this application now, and I’m not about to hold my breath waiting.

Of all the .NET frameworks out there (including the current Microsoft offerings), the only mature offering that appears to be really capable of providing the mapping functionality I need is NHibernate. With NHibernate, I can create my objects (classes) first, create the mapping schema, and then use the SchemaExport class to create the matching database schema. Best of all, NHibernate is happy whether you do this with SQL Server, Oracle, or even SQLite. Pete Weissbrod has some excellent material that you can use as a study guide on how to do this.
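The schema-generation step itself is tiny once you have a populated NHibernate Configuration. Something along these lines (a sketch, assuming the dialect, connection string, and mappings are already set up; the exact Execute overloads vary a bit between NHibernate versions):

```csharp
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

// Build the Configuration from hibernate.cfg.xml (or however you configure it).
Configuration cfg = new Configuration().Configure();

// script=true echoes the generated DDL; export=true executes it against
// the configured database; justDrop=false emits CREATE as well as DROP.
new SchemaExport(cfg).Execute(true, true, false);
```

Point it at SQL Server, Oracle, or SQLite by swapping the dialect in the configuration; the object model never changes.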

The key feature with an ORM like NHibernate is what's called transparent and automated persistence:

A DataSet allows you to extract the changes performed on it in order to persist them. NHibernate provides a different feature: it can automatically persist your changes in a way that is transparent to your domain model. This means a complete separation of concerns between the persistent classes of the domain model and the persistence logic itself, such that the persistent classes are unaware of – and have no dependency on – the persistence mechanism.

An “Item” class, for example, will not have any code-level dependency on any NHibernate API. It doesn’t need funky attributes on the class or its methods that can make serialization and reuse over the wire via WCF difficult. In addition, NHibernate doesn’t require that any special base classes or interfaces be inherited or implemented by persistent classes, nor are any special classes used to implement properties or associations.

The bottom line for me is that in a system with transparent persistence, objects aren’t aware of the underlying data store; they need not even be aware that they are being persisted or retrieved. They can be serialized, sent over the wire, used for NUnit Tests, and more.

As Scott indicates in his post, developers need to learn how ORM identity maps work. Think objects first, data second. Start with your classes and business logic, and only then get the persistence and mapping to the database done. It was a hard lesson to learn because I’ve wasted a lot of time, partly because of a predilection for using the Microsoft flavor, but now I think I can safely say that “I got it”.


One morning, I shot a Chevrolet in my pajamas…

Government does not solve problems; it subsidizes them. – Ronald Reagan

The White House said today that "The legislation crafted in recent days aimed at helping the ailing U.S. automakers is an effective and responsible approach". 

I strongly disagree. It was hard enough to get policymakers to finally utter the word "recession." The Treasury just issued 4 week T-Bills at ZERO percent interest, for the first time in history. In the secondary markets, T-Bills were trading at a premium (meaning negative yields). You want to talk about deflation? People are so fearful that they’re willing to let the US Treasury hold their money and earn no interest at all.  The next challenge may be to get them to say "nationalization"  -- because that’s exactly what’s happening, baby!

That's pretty much what the government takeover of big chunks of the economy amounts to, in my opinion. Welcome to the USSRofA, Comrade!

Actions taken by the Democratic-led Congress and the outgoing administration — moves generally supported by President-elect Barack Obama — already have reversed decades of deregulation and privatization that Presidents Ronald Reagan, George W. Bush and George H.W. Bush all championed. It seems that when times get tough, the U.S. Government needs to step in and be the "lender of last resort". I say, “bullshit!”

Washington has taken a direct stake in  or orchestrated the takeover of banks, seized control of mortgage finance giants Fannie Mae and Freddie Mac, taken a controlling stake in insurer American International Group and now is poised in the final weeks of Bush's term to throw a multi-billion dollar lifeline to the troubled Big Three automakers. We can print money until our faces turn blue. But where is the solution to the problem? Do you see a solution? NOT! Credit markets are still frozen, Pal. And people are out on the fyookin’ street because they cannot pay their mortgages. And that’s after our Government has literally pissed away billions of your and my hard-earned tax dollars in an insanely misguided effort to “fix it”.

Congressional Democratic leaders and the White House just negotiated a bill to provide $14 billion in emergency short-term loans for Detroit and create a "car czar" to be named by Bush to dole out the loans and oversee restructuring.  Republicans may very well  ditch it, and I really hope they're successful. The Government simply does not have a very good track record of intervention in the business sector during hard times.

The problem is, this is simply throwing money at a sinking ship in an effort to "rearrange the deck chairs on the Titanic" without addressing the real baggage of bad decisions on long-term union contract agreements and other financial obligations that will still completely prevent US automakers from ever being competitive in global markets. 

The unions crippled the car industry because they were greedy and  didn’t think ahead. It’s not just the fault of the unions – the carmakers made reallyreallydumb decisions about what to produce and sell too.

I bet you that with the current deal, within just three months, GM will be back in D.C. asking for an additional 15 Billion of your and my hard earned money! Six months later, an additional 15 billion! What am I talking, GREEK? Come on, can we  THINK for a change?

"The government has no business managing car companies, even if temporarily", says Sen. Richard Shelby, R-Ala. "It's very un-Republican". And boy, is he right.

Nearly two years after Bush suggested that Detroit produce "a product that's relevant" -- rather than looking for a possible Washington bailout, the President now supports the emergency loans — after getting a concession from Democrats that the money would come from an existing program to help the industry retool its plants to make greener cars.

Longer-term proposals being developed by Obama and congressional leaders call for an equity stake for the government and a chance to dictate business decisions for years to come. This is totally wrong! The only way to fix the mess is to enable the automakers to restructure under Chapter 11 Bankruptcy, which will give them a chance to ask the Judge to nullify all the bad contracts and utterly stupid decisions they agreed to over the last 30+ years, and have a chance to really come out with a fresh start and have a viable industry.

I repeat: the pouring of taxpayer dollars into this Black Hole WILL NOT FIX THE UNDERLYING PROBLEM!

During World War II, the government seized railroads, coal mines, Midwest trucking operators and, briefly, retailer Montgomery Ward.

The federal government partially nationalized the nation's troubled railroads in the 1970s. Today, it still owns and runs Amtrak. What a DISASTER! You want more of this?

The government nationalized more than a thousand failed savings and loan institutions in the late 1980s and the early 1990s, modeling the effort on a government-run corporation that made loans and bought stock in distressed banks during the 1930s.

The Government has a very bad track record of intervening in failing industries. Don’t we get it? Can’t we THINK? It's time to stop the bullshit and let the automakers take bankruptcy. That’s what they need. It won’t be painless, to be sure. But it could work. It could be a lot less expensive to retrain and re-employ the displaced workers than to throw hard-earned taxpayer money down the sewer.  My two cents.


A Time to Reflect

"Everything you can imagine is real" -- Picasso

As we approach the end of another year, most people begin a period of more introspection and reflection about their inner feelings and aspirations, making donations to favored causes where appropriate, and hopefully spending more time thinking about the family unit and the blessings it provides.

I’ve already given thanks to my Twitter brethren for the enlightening 140 character or less pearls of wisdom and links they’ve provided. And I would like to extend the same thanks to readers here.

I want to wish great peace to anyone who reads or has read my UnBlog, a happy Thanksgiving and Holiday season, and don’t forget – work isn’t everything. Take time to nurture your family and loved ones, and make an effort to do good in your community. Not only now – but all year.

Commune with your family and friends, support those who need help with whatever resources you may have available to you, and – above all – be confident in your skills and your future.

Tough economic times will eventually pass. Hopefully, we will learn the lessons of history and be able to apply them to create a better future for ourselves and those who follow us.


Silverlight vs. Flash – Where’s the Fire?

Is fuel efficiency really what we need most desperately? I say that what we really need is a car that can be shot when it breaks down. –-Russell Baker

I had a chance to reflect a little bit today on Silverlight and Flash and I’d like to offer these observations.

I have some not insignificant experience with Flash. Many developers are too new to remember FutureSplash, but I remember it very well; I used it when it first came out, circa 1995. It was the predecessor of today’s Flash, which was ultimately purchased by Macromedia and is now, of course, subsumed into Adobe.

A lot of the stuff you read today revolves around “Flash vs. Silverlight”. The media loves controversy, and they’ll hammer on this subject ad infinitum – often to the extreme detriment of any real content. If you are on Twitter, the latest volley was about MLB (Major League Baseball) dropping Silverlight and going back to Flash for their video coverage. It seems like every Tom, Dick, and Harry blogger/Twitterer needed to “break the news”.

It was a joke – it got to the point where the signal-to-noise ratio made it uncomfortable to even find relevant content about “Silverlight”. Fortunately, they’ve started to give up and things are getting back to normal.

I have no idea what the facts are behind this arrangement, nor do I particularly care, because Silverlight is already “on a roll”, and it’s just the beginning. Flash is not going to go away, and I don’t want it to - that’s good because competition is good for everyone.

When Robbe Morris (my MVP partner on the eggheadcafe.com venture) and I worked for Sprint Telecenters back in 2000, Sprint spent over $500,000 to have some very sophisticated Flash developers put together a really hot-shot Flash animation web page for their home/business offering called “ION”. Needless to say, it no longer exists as a product. The animation was fantastic – but at the time, that’s all it did: be a Flash-y animation.

Flash was, and still is, used primarily as a delivery system for graphic animations and banner ads, and to a lesser extent as a UI front end for interaction with data on some back end such as a database. Flex has brought some notable changes, but I haven’t seen any compelling business scenarios built with it. Microsoft properties use Flash, and they’ll probably still be using it for quite a while. Sure, they want to promote Silverlight – but they aren’t stupid! Bloggers seem to want to expose some sort of hypocrisy on the part of Microsoft here, but Microsoft is simply being pragmatic, in my view. The Flash IDE has not changed much in all these years – it is still based on the “Timeline” concept. Silverlight is much different.

When I was able to use the Silverlight .NET Framework to perform fast custom binary serialization of strongly typed .NET objects, combined with in-memory zip compression, and send the resulting compact byte arrays back and forth over the wire from the client-side Silverlight application to a WCF service on the server side, that’s when I realized that my Flash friends cannot perform this kind of sophistication. Their infrastructure simply does not permit this kind of enterprise-level, “create your own transport channel” architecture at present. (Note in the comments to this post that I stand corrected in some areas, but you can draw your own conclusions.) Bear in mind, we are talking about custom binary serialization in Silverlight, and doing it in your language of choice – C#, VB.NET, Ruby, etc.
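To make the idea concrete, here is a stripped-down example of hand-rolled binary serialization that compiles against both the full CLR and the Silverlight CLR. The class is hypothetical, and I’ve left out the zip-compression step:

```csharp
using System;
using System.IO;

public class CustomerDto
{
    public int Id;
    public string Name;

    // BinaryWriter/BinaryReader exist in both the desktop and Silverlight
    // BCLs, so the same code runs on both ends of the wire.
    public byte[] ToBytes()
    {
        using (MemoryStream ms = new MemoryStream())
        using (BinaryWriter w = new BinaryWriter(ms))
        {
            w.Write(Id);
            w.Write(Name ?? String.Empty);
            w.Flush();
            return ms.ToArray();
        }
    }

    public static CustomerDto FromBytes(byte[] data)
    {
        using (MemoryStream ms = new MemoryStream(data))
        using (BinaryReader r = new BinaryReader(ms))
        {
            CustomerDto c = new CustomerDto();
            c.Id = r.ReadInt32();
            c.Name = r.ReadString();
            return c;
        }
    }
}
```

The resulting byte array is what gets compressed and handed to the WCF service; the service side runs the mirror-image FromBytes.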

The main difference with Silverlight is that it really leverages the developer base – both designers and coders – to use their .NET Framework skills to build line-of-business applications that can deliver enterprise-level interaction coupled with high quality video and vector graphics -- with animation -- in a subset of the .NET Framework that installs and runs on the user’s PC, in their Internet browser. You could clearly make the case that the single compelling reason to look at Silverlight is that you are already a .NET (or Mono) developer and don’t want to learn ActionScript (or that you like VB.NET or Ruby or Python!)

And unless you’ve been living in a cave, I don’t need to sell you on the fact that  .NET has matured to the point where most all of the major corporations around the globe are using it as a mainstream technology.

Moonlight, the Microsoft/Novell-supported *nix version of Silverlight, is about to make its debut. And the Eclipse Silverlight IDE is already available to try in beta.

Silverlight code is compiled. It runs (in several different programming languages now) up to 300 times faster than comparable Flash code, which currently relies on ActionScript.

The next generation of Silverlight will support Hi-Def video, adaptive streaming, and much more.

I still like Flash. But I just love Silverlight, because the productivity of Expression Blend, and being able to create applications in the familiar environment of Visual Studio 2008 (and even in the Visual Studio Express versions), is quite compelling.

Recently I subscribed to Netflix just so I could opt in to the Silverlight-delivered “Watch Instantly” feature, which is a pleasure to use. Full-screen, searchable, hi-def video of your favorite movies, streamed live right to your computer monitor. I like indie, comedy, and foreign films, and they have plenty. It’s comedic to see these Mac users (who are often hardcore Microsoft haters) having their jaws drop when they see it working so well on their Macs!

Don’t sell Silverlight short. You’d be making a big mistake. It’s not about Flash vs. Silverlight; it’s about RIA and good competition.


Why ASMX-Style Web References to WCF Services Don’t Serialize Numbers


Based on a casual googling of this problem, it appears that some developers have spent days trying to figure out why an integer they set on the generated proxy field comes over the wire into the actual WCF service as ZERO. I didn’t spend days figuring it out, but it certainly caused no end of annoyance and cursing until I did.

Let’s say you have a WCF service, and for one reason or another (maybe your app is on a handheld device and you cannot use “Add Service Reference”) you’ve set up an ASMX-style “Add Web Reference”. Your code in the generated proxy Reference.cs class may look like this:

public int Quantity {
    get {
        return this.quantityField;
    }
    set {
        this.quantityField = value;
    }
}

/// <remarks/>
[System.Xml.Serialization.XmlIgnoreAttribute()]
public bool QuantitySpecified {
    get {
        return this.quantityFieldSpecified;
    }
    set {
        this.quantityFieldSpecified = value;
    }
}

Note the extra XxxSpecified field, with the XmlIgnore attribute, that gets generated. If you do not set the QuantitySpecified Boolean field to “true”, your Quantity isn’t going to make it over the wire, period.

The extra XxxSpecified properties are added when you use an ASMX-style reference to a service that contains nullable value types, because ASMX does not directly support nullable types. The XxxSpecified property must be true for ASMX to know that the value is specified in the XML; otherwise it will assume that the value is null. For example, if your DateTime value contained DateTime.MinValue, how would you know whether that signals null? The same is true for reading property values: if XxxSpecified is false, then the related value must be considered to be null.

The only exception to this is strings. The documentation for this may be out there, but I sure could not find it.
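Here’s the pattern boiled down to a minimal stand-in class (my own mock of the generated proxy code, not the real Reference.cs) so you can see exactly what the client must do:

```csharp
// Mimics the shape of the ASMX-generated proxy class shown above.
public class OrderItem
{
    private int quantityField;
    private bool quantityFieldSpecified;

    public int Quantity
    {
        get { return this.quantityField; }
        set { this.quantityField = value; }
    }

    [System.Xml.Serialization.XmlIgnore]
    public bool QuantitySpecified
    {
        get { return this.quantityFieldSpecified; }
        set { this.quantityFieldSpecified = value; }
    }
}

// Client-side: set BOTH the value and the Specified flag, or the
// XmlSerializer treats Quantity as unspecified and it arrives as zero.
//   OrderItem item = new OrderItem();
//   item.Quantity = 5;
//   item.QuantitySpecified = true;
```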

Twitter Tip

Recently I saw somebody I follow on Twitter announce that they had started following everybody who follows them as an “experiment”. I’m not sure I would advise that, because it could bring the signal-to-noise ratio to an unacceptable level. When I get an email that somebody is now following me on Twitter, I go check them out. I look at their list of followers. If I don’t recognize many names of people I’m already following, I usually decline to follow them back unless I can clearly see that their content is of interest to me.

I’ve made a Twitter policy of only following people I’m interested in – fellow MVP’s, gurus in various disciplines such as Silverlight or WCF, and so on. The result is that I get a lot of very relevant content and very little “noise” – except for the inevitable “fembot” spammers – but it seems that Twitter has been doing an excellent job of putting those out of commission lately.

I also follow a few luminaries like Jeff Atwood, even though a lot of his tweets are not that relevant – because of his occasional “Twingers” that really make you think!


Does Web Censorship Software Make Sense?

Like many other .NET Developers, I try to work as efficiently as possible for the benefit of my employer and clients. I don’t take smoking breaks, I usually eat lunch at my desk in about 10 or 15 minutes, and I don’t waste my employer’s time needlessly surfing the net.

But many employers don’t seem to appreciate that. They look at what they see as a problem, and use web filtering software like WebSense to block employees from “doing bad things” under the guise of “improving productivity”.  You know what I say to that? Bull!

Look, there is always going to be a small minority of employees who are irresponsible, don’t have good work-ethic values, and will spend hours of their employers’ time surfing “unacceptable” web sites. But to punish everyone for the offenses of a few is like using a sledgehammer to swat a gnat.

If you block people from web sites, what do you think they’ll do instead? Their productivity won’t increase – the “Baddies” will just do something else, like playing Solitaire on their computers. Or they’ll find something else to do that wastes the employer’s time.

Scott Hanselman actually got his blog “banned” by WebSense because it got mindlessly categorized as “Personal Web Sites; Society and Lifestyles.” Can you believe this utter B.S.? WebSense, because of the nutty algorithms it uses, is intrinsically flawed.

For the majority of us “good guys”, being blocked in what sometimes seems a heavy-handed, indiscriminate manner is both an annoyance and a reduction in productivity.

Some companies prohibit employees from using IM clients like Live Messenger. At the company I work for, we all agree that it is a great productivity booster. Even the “higher-ups” use it. Sure, there may be a few abusers – but the benefits far outweigh the costs.

Occasionally when doing a search for technical resources on a programming problem, I end up on one of those WebSense “blocked” pages. At a client’s site recently, I found that the entire domain “blogspot.com” – which has thousands of legitimate technical blogs on .NET, Silverlight, WPF, WCF and other technologies (including this UnBlog) – is BLOCKED.

What on earth are these people thinking? Is it really about “productivity”, or just the heavy hand of self-appointed righteousness? What do you think?

Item of Note:

My “playground” Social Short Url / Search / Tagging site, http://ittyurl.net, now has over 958 user-submitted links related to Silverlight, which has been the site’s focus since about June of this year. Keep those new links coming, and check out the Webservice API, which now features a new Webmethod to discover and Feedburnerize feeds on any webpage that has “link rel=” feed discovery tags! By creating a Feedburner Url for your feed, you automatically enable Silverlight cross-domain requests!


Uninstall Internet Explorer 8 Beta 2 on Vista When not found in Programs And Features

“Code is read a lot more times than it is written.” – Anthony Moore, Framework Design Guidelines 2nd Ed

Can’t find Windows Internet Explorer 8 Beta 2 to uninstall in Programs And Features, even after checking “View Installed Updates”? No problem!

To uninstall Internet Explorer 8 Beta 2 in Windows Vista or in Windows Server 2008, do this:

1. Select and then copy the following command to the Clipboard:

FORFILES /P %WINDIR%\servicing\Packages /M Microsoft-Windows-InternetExplorer-8*.mum /c "cmd /c echo Uninstalling package @fname && start /w pkgmgr /up:@fname /norestart"

2. Click Start, and then type cmd.exe in the Start Search box. (If you have Start / Run enabled, just type cmd.exe in the “Run” Textbox).

3. In the list of programs, right-click Cmd.exe, and then click Run as administrator.

(If you are prompted for an administrator password or for confirmation, type the password, or click Continue.)

4. Right-click inside the Administrator: Command Prompt window, and then click Paste to paste the command that you copied in step 1.

5. Press ENTER to uninstall Internet Explorer 8 Beta 2. When the uninstall program is finished, REBOOT YOUR COMPUTER!

Verify that your earlier version of Internet Explorer was restored by checking the “Help/About” screen.


Bye-bye, Internet Explorer 8 Beta 2! I’m looking forward to IE 8, but I think I can wait until it’s released.


Global Crisis: Advice for Developers

At this point, there is no question in my mind that we’ve entered into a global financial crisis of epic proportions such that you will (hopefully) tell your grandchildren about the “Global Meltdown of 2008”. I kid you not.

Although over the years I have been a very diligent student of economic cycles and trends, I do not purport to hold some sort of “crystal ball” that makes me the Edgar Cayce of global economic trends. But one thing is sure: long-term economic cycles of boom-and-bust have turned decidedly negative, and this has broad, long-term implications for you and me as software developers. This is not a “US phenomenon” – this is truly global and touches every economy on the face of the earth. WHAT IS HAPPENING IN 2008 WILL REWRITE HISTORY.

I’m not going to bore you with economic theories and say who is to blame or claim to have a solution to the problems that face us. Rather, I want to take a few paragraphs to give my read on what is happening and how you can protect your ass.

The best “thermometer” of the credit situation is what is known in financial parlance as the “TED Spread”. Simply put, this is the difference in rates between US short-term 3-month Treasury Bills (considered a “riskless” security) and LIBOR (the London Interbank Offered Rate). LIBOR simply reflects the credit risk of lending US Dollars to commercial banks by other banks. Right now, the TED spread (even after all the inter-government intervention and global discount rate cuts this week) has hit around 400 basis points (4%) after hovering between 1% and 2% since last November. That’s the equivalent of a “choke hold” on the global financial markets. In sum, the global economy is HAVING A STROKE:
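Just so the arithmetic is clear, here is a quick sketch of how the spread is computed. The rates in the example are made-up illustrations, not actual quotes from the time:

```javascript
// TED spread = 3-month LIBOR minus the 3-month T-Bill yield,
// usually quoted in basis points (1 bp = 0.01%).
// NOTE: the rates passed in below are illustrative placeholders,
// not real market quotes.
function tedSpreadBps(liborPct, tbillPct) {
  return Math.round((liborPct - tbillPct) * 100);
}

console.log(tedSpreadBps(4.8, 0.8)); // roughly 400 bps: crisis territory
console.log(tedSpreadBps(1.9, 1.4)); // roughly 50 bps: more normal times
```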


What this means is that there is so much fear in the international banking markets that the credit system – as we know it – has virtually DRIED UP. FINIS. NADA. KAPUT. EFES. NO CREDIT. It’s DYSFUNCTIONAL. They’ve tried to fix it, and the patient is still terminal. Everything the doctors have tried DOESN’T WORK. Got the picture?

This has long-term effects on Wall Street, but it also trickles down to the “average Joe” – that’s you and me. Here’s what it means, in PLAIN ENGLISH:

  • The company you work for will probably NOT get new contracts. You could be out of a job, depending on what service sector and target market your company is in.
  • Companies will shortly be looking at ways to cut costs. That means lay-offs and terminations. If you aren’t “Senior” at your company, your job is AT RISK, RIGHT NOW. NOT TOMORROW - TODAY!
  • Companies will be delaying or aborting plans that involve any kind of incremental expenditure other than those that enable the company to SURVIVE.

The vendors that survive and prosper, in the main, will be those that offer a product or service that can either help a company SAVE MONEY, or MAKE MONEY. If your company / product is not able to clearly and efficiently spell out how you deliver this benefit, YOU ARE AT RISK.

My advice?

  • Polish up your resume. NOW!
  • Keep your ear to the ground to be aware of any potential changes around you.
  • Keep up your networking contacts (LinkedIn, recruiters, etc) and REOPEN the dialog.

Have fun, and good luck! This will pass, but it could take a long time (several years), and you need to be both vigilant and informed.



Sunspots. Ready to Chill Out?

Not only is the universe stranger than we imagine, it is stranger than we can imagine.
  - Sir Arthur Eddington

You probably haven’t heard much about Solar Cycle 24, the current cycle that our sun has just entered. However, if Solar Cycle 24 becomes a household term, our lifestyles could be taking a dramatic turn for the worse.

Solar Cycle 24 could mark a time of dramatic long-term change in the climate. According to geophysicist Philip Chapman, a former NASA astronaut, scientist and former president of the National Space Society, "It is time to put aside the global warming dogma, at least to begin contingency planning about what to do if we are moving into another little ice age."

In recent months the sun has lost its spots. By this point in the solar cycle, sunspots would ordinarily be present in significant numbers. If the sun does not soon revert to its "normal" behavior – and speculation is growing in the scientific community that it won’t – we could be entering a major new global cooling period.

This happened during the Little Ice Age, a period starting around 1625 and lasting for centuries, says NASA’s Goddard Space Center, which claims that the absence of sunspots is linked to the cold that then descended on Earth. During the coldest part of the Little Ice Age, a time known as the Maunder Minimum, astronomers saw only about 50 sunspots over a 30-year period (see chart below), less than one half of 1% of the sunspots that would normally have been expected. Other Minimums also corresponded to times of unusual cold.


During the Little Ice Age, the Thames froze over. In what had previously been a warm Europe, growing seasons in England and Continental Europe generally became short and unreliable, which led to shortages and famine. But these hardships were nothing compared to those of the more northerly countries. Glaciers advanced rapidly in Greenland, Iceland, Scandinavia and North America, rendering vast tracts of land uninhabitable. The Arctic pack ice extended so far south that several reports describe Eskimos landing their kayaks in Scotland. Finland’s population fell by one-third, Iceland’s by half, and the Viking colonies in Greenland were abandoned altogether -- as were many Inuit communities. The cold in North America spread so far south that in the winter of 1780 New York Harbor froze, enabling people to walk from Manhattan to Staten Island.

In the same way that the Earth cooled when sunspots disappeared, the Earth warmed when sunspot activity became pronounced. The warm period about 1000 years ago known as the Medieval Warm Period — a time of bounty in which grapes grew in England and Greenland was colonized — was also a time of high sunspot activity, called the Medieval Maximum. Since 1900, Earth has experienced what astronomers call “the Modern Maximum” — the 20th century has again been a time of high sunspot activity, accompanied by cries of "global warming" à la Al Gore.

But the 1900s are over, along with the high temperatures that accompanied them. The last 10 years have seen no increase in temperatures — they reached a plateau and then remained there — and the last year saw a precipitous decline. How much lower and for how long the temperatures will fall, if at all, no one yet knows — the science is far from settled on what drives climate.

Several renowned scientists have been predicting for some time that the world could enter a period of cooling right around now, with consequences that could be dire. “The next little ice age would be much worse than the previous one and much more harmful than anything warming may do,” says Dr. Chapman.

The four major agencies tracking Earth’s temperature, including NASA’s Goddard Institute, report that the Earth cooled 0.7 degree Celsius in 2007, the fastest decline in the age of instrumentation, putting us back to where the Earth was in 1930.

John Casey of the Space and Science Research Center: “The key difference for this next Bi-Centennial Cycle’s impact versus the last is that we will have over 8 billion mouths to feed in the next coldest years whereas we had only 1 billion the last time. Among other effects like social and economic disruption, we are facing the real prospect of the ‘perfect storm of global food shortages’ in the next climate change. In answer to the question, everyone on the street will be affected.”

As global cooling affects crop output and commodity prices, it also affects the global economy, causing a general global contraction in GDPs, swings in interest rates, and other more subtle effects. I’m not about to propose that it was the solar cycle that caused the recent seizures in the commercial paper markets that frightened the Fed into coming up with their current $700 Billion emergency bailout package, but I would not be surprised if it did. At any rate I have serious doubts that it was “the failed policies of the Bush Administration”. There are far greater forces at work here. See here for more on this.

Theodor Landscheidt, whose research on the effect of the sun’s interaction with the center of mass of our Solar System has enabled him to make some uncanny predictions, had this to say over 10 years ago: “We need not wait until 2030 to see whether the forecast of the next deep Gleissberg minimum is correct. A declining trend in solar activity and global temperature should become manifest long before the deepest point in the development. The current 11-year sunspot cycle 23 with its considerably weaker activity seems to be a first indication of the new trend, especially as it was predicted on the basis of solar motion cycles two decades ago. As to temperature, only El Niño periods should interrupt the downward trend, but even El Niños should become less frequent and strong. The outcome of this further long-range climate forecast solely based on solar activity may be considered to be a touchstone of the IPCC's hypothesis of man-made global warming.”

Unfortunately, Dr. Landscheidt died in 2004, so we no longer have the benefit of his potentially astute research.

So with the recent appearance of a solitary sunspot very late in the game, we are now at the beginning of Solar Cycle 24 (the scientific designation). Chill out, fire up your 10MPG gas-guzzling SUV, and take a nice drive in the soon to be cooler countryside.


Silverlight 2 RC0 : Important Considerations for Developers

There are hundreds of “Me Too” blog posts about this, so I won’t bore you with more of the same. There are, however, a few very important considerations about Silverlight 2 RC0 that developers need to know about. You can only find out about this stuff if you take the time to RTFM carefully. In this case that would be the info on Scott Guthrie’s blog and possibly on Tim Heuer’s blog as well. Pete Brown also does a great job of covering details, and he writes well. Finally, another smart person to follow would be Mike Snow, who is a Senior Software Design Engineer in Test (SDET) for Visual Web Developer Tools.


1)  RC0 is a developer release only. You cannot deploy RC0 applications to the web. They won't work. It's only for test environments where you want to ensure that existing or new applications will work with the final Silverlight 2 release.

To repeat: there is no end-user installable runtime for RC0, only the developer runtime with the developer tools. If you deploy an RC0 application to the web, your users will be greeted with unfriendly install messages taking them to installs for Beta 2, which will only confuse them. That's no good for you or for Microsoft.

2) The Blend SP1 update that all the announcements point to is for Blend 2 – the product – NOT the June 2008 Blend Preview. This means if you decide to install all the RC0 bits you’ll either need to be an owner of a licensed copy of the product (through MSDN Subscriber downloads for example) or you’ll need to install the free Trial of Blend 2, and then apply the SP1 update to that. Of course, if you have it, you will also need to uninstall the June 2008 Blend Preview.

3) As with any Release Candidate, spending 30 minutes carefully reading the Release Notes and Breaking Changes Word document can and will save you hours of frustration. R-T-F-M!  It’s taken me years to learn to do this, and I still occasionally slip up, but I gotta tell ya this is the best way to save time: Drop everything and Read The Friggin’ Manual First!

One recommendation I’ll make for a lot of developers who want to continue to work with BETA 2 and still be able to work on their code for the eventual Release version is to put these RC0 bits on a VPC image and do all your experimentation there, leaving the BETA 2 stuff on your regular development machine. I’ve got a VPC of Server 2008 on two of my machines. I can do anything I want with it, and if I don’t like what I’ve done or have screwed it up, all I have to do is extract the original 7Zip self-extractor of the image from my USB stick and I’m like “it never happened”.


Have fun with Silverlight and remember to practice Safe Development – especially if you’re doing PDD (Pajama Driven Development).


Learning Experiences: What Developers Want

“The last update to the Hypertext Markup Language — the lingua franca of the web — was the 4.01 specification completed in September, 1999.” –Digg Post

Recently I read a post on Jesse Liberty’s blog about getting flamed by some commenter who didn’t like what he was publishing. I responded in a comment that I thought he was doing just fine, and that you cannot expect to please everybody. But I also recommended that he put up one of those free poll “thingies” that would allow his visitors to vote on what they did want to see, and Jesse took me up on it.

He put up a comprehensive poll that allowed write-in suggestions. I thought it was very well designed. The preliminary results of some 250 responses (including mine) are quite revealing, I think:

“The results have held steady from the very beginning – Webcasts have overwhelmingly been the "last choice" for over 2/3 of users and in-depth tutorials have been the first choice; with short videos and short tutorials splitting the middle position.  In any case, there is almost total unanimity that all presentations should be at the Intermediate (300) level except for tutorials,  which nearly 2/3 of you think should be more advanced”

What this is saying is something that I myself have known and stated many times: Developers, when looking for learning resources on the web, overwhelmingly prefer to read in-depth tutorials (preferably with downloadable code too) rather than viewing Webcasts or downloading Podcasts. And, at least as far as reading Liberty’s material on Silverlight, they prefer that the tutorials be at the advanced level. Your “basic” level stuff usually comes from the articles and FAQs at the main silverlight.net web site.

For example, you can make a very nice screencast of “How to create a Silverlight Custom Control” (and people like Mike Taulty have done excellent ones, with pretty good voice narration). But the point is, you can also create an article / tutorial with screenshots showing the important stages and text explanations, and downloadable code -- and according to Liberty’s poll, this is what developers prefer. Oh, and did I mention that PODCASTS SUCK?

Item of Note

My “playground” short url and social tagging site, ittyurl.net, now has over 600 links on Silverlight! That’s right, over 600 user-submitted links, all searchable by tags, with nearly 30,000 clickthroughs to date. Thanks to all who have contributed. And a big thanks to my eggheadcafe.com site partner Robbe Morris for agreeing to host the site and get me out of “gate.com hosting company hell”! Boy did that company go down the crapper!


Virtual PC (VPC) techniques for developers

I’m starting my studies of Sharepoint and MOSS, so it occurred to me that creating a Microsoft Virtual PC image of Windows Server 2008 along with SQL Server 2008, Visual Studio 2008 SP1, and other useful tools would be a good idea. I’ve used VMWare and it’s great. But for a single developer who just wants a portable image that you can zip up and store on a USB stick, where you don’t need a whole virtualization infrastructure, VPC is ideal. One of the reasons I like it is that VPC doesn’t install a bunch of network drivers and Windows Services like VMWare Workstation or the free VMWare player.  VPC is nearly 100% “Self contained”, and doesn’t install any baggage at all on the host OS.

Also, Shawn Wildermuth told me that VPC runs faster, so I took him at his word.

The nice thing about all this is that the entire VPC control file and VHD expandable hard disk, with all of the software enumerated above, 7Zips down to just about 3GB – small enough to put on my $7.99 Kingston Data Traveler 4GB USB stick that I got from buy.com (with the Google Checkout $10 rebate) – and leaves plenty of room for other stuff!

You can snag VPC from here, and the SP1 update here. Sorry, I don’t remember if the SP1 is a full install or whether you have to install VPC first.

One of the common mistakes developers make is to assume that a dynamic VHD hard disk will expand indefinitely. Nope! It will only expand to the initial maximum size you set when you created it. However, don’t despair! VHDResizer is a free tool that will take care of it!


HINT: After using VHDResizer, you have to extend your C partition to include the new space! Go to Computer Management --> Disk Management. You should see the extra space you added to the right of the C: drive. Right-click on the C: drive, then select Extend Volume. It should automatically add the new space.
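For what it's worth, the same partition extension can also be done from the command line with diskpart – a sketch only, since the volume number varies from machine to machine (check the "list volume" output to find the one that maps to your C: drive before you extend):

```
diskpart
DISKPART> list volume
DISKPART> select volume 1
DISKPART> extend
```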

My copy of Server 2008 on the VPC image is fully activated, and all the latest Windows Updates are applied. I’ve put it out on a network share so any of our devs who want this can simply install VPC, unzip my “thing”, and they’re good to go!  One neat thing I did was to modify the login policy to display the Administrator password right on the login message, so nobody who uses this will have to worry about finding the password to log-in. This will run on either Windows XP or Windows Vista.  You can even drag a file from the Host OS over into an Explorer window on the VPC and it will copy right over.  Plus, don’t forget that VMWare is capable of reading MS VPC VHD disk images. Have fun!

Codeplex SVN support

Recently my eggheadcafe.com site partner and I installed VisualSVN Server (free) and the newest AnkhSVN Visual Studio 2008 SCC-compliant plug-in (also free), and things have been working out famously. Coincidentally, I just saw that Codeplex now supports TortoiseSVN out of the box with no “bridge” or other installs. That’s a real win!


Silverlight: Handy Dynamic Javascript Debugger Favorite

Often you need to be able to View Source on a page that has injected dynamic javascript to see what you did, and, surprise – it’s not there. Here is a neat way to view source (including any dynamically generated elements or script):




What this does is simply use the <xmp> (“example”) tag to literally render your “stuff” without parsing. You can add this to your browser’s favorites as an A-HREF link, and that way you can simply choose the favorite to view the complete source on any page. This is extremely useful when using the Silverlight Browser classes to manipulate the DOM of the underlying page. Here’s a sample link (this may not work, as Blogger does funny stuff, but you can still mouse over it):

Debug Js
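The actual bookmarklet seems to have gotten lost in this post, so here is my reconstruction of the trick – treat it as an illustration, not necessarily the original code:

```javascript
// Reconstruction (not necessarily the original): as a browser favorite,
// the A-HREF would be a javascript: URL along these lines:
//   javascript:document.write('<xmp>'+document.documentElement.innerHTML+'</xmp>')
//
// The heart of the trick is just wrapping the page's live markup in an
// <xmp> element, which the browser renders literally instead of parsing --
// so dynamically injected elements and script become visible:
function wrapInXmp(markup) {
  return '<xmp>' + markup + '</xmp>';
}

console.log(wrapInXmp('<div id="injected">added by script at runtime</div>'));
```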


Twitter - Social Microblogging Experiment (and Google Chrome)

I started out on Twitter maybe a year ago, then decided it was a waste of time. Then, for some reason, maybe 4 months ago I picked it up again. It might be that Witty (a very nice WPF Twitter client) came out; then I found the Java TinyTwitter app for my virtually prehistoric AUDIOVOX SMT5600 smartphone (which only runs Compact Framework 1.0). At any rate, I currently follow some 88 active Tweeters (Twitterers? Twitterinoes? Twitterheads?) and have some 62 or so following me. What I've done from the start is concentrate on following only people I know (either personally or through other communications), plus some extras whose work I am familiar with and who may or may not be familiar with me. I've found that a good many of the people I follow return the favor by following me. I've focused mainly on the .NET developer crowd. If you cast too broad a net, you are likely to be disappointed with the incredible amount of noise you've generated for yourself.

The net result for my efforts, which haven't taken up much time at all, has been a Twitter timeline that provides me with very current information about subjects that I am really interested in, e.g. most of these people are .NET developers like me. I'm really not interested in following the "gurus" - you know, the people who have 9,999 followers, most of whom they don't know from Adam. I also use the Twitter API to programmatically post new IttyUrl.net links to a special IttyUrl Twitter account. IttyUrl.net focuses mostly on Silverlight resources currently.

Certainly there is a signal-to-noise issue, in that there's a lot of garbage -- but there is also some pretty good information that gives me leads on stuff I want or need to follow up on. Another benefit is that you can post a question and often you'll get a very quick answer from somebody. You also find out about what other people are interested in, and to me, that is a big bonus, since besides being a traditional full-time .NET developer, I also write articles for eggheadcafe.com, which we've had up now since 2000. And of course, you have the inevitable "fembots" - users who follow you with a racy avatar of some hot female whose sole purpose is to promote some spammy MLM program about a useless get-rich-quick scam.

The key thing with Twitter is that as a microblogging platform you have to learn to compose your thoughts into a 140-character deal -- kind of a Hemingwayesque lesson in writing. There have been several times when I was excited or wasn't thinking and I "went over", and sometimes the result of what you post becomes kind of comical.

But all-in-all I have to say that so far, except for the constant "Fail Whales" (outages), Twitter is a success.



As I've UnBlogged before, the thing that is missing is a comprehensive API that provides not only Twitter, but also access to more traditional forums and IM, social tagging, and more.

What's been your experience with Twitter?


Google Chrome Download

Is Google really gonna take over the world with its new Gears-integrated “open source” browser, “Chrome”? The download page didn’t work when I visited today, but it was relatively easy putting together the download link by simply viewing the source of the page and inspecting the javascript.


Interestingly, I had read that Google pulled the download link out of their cache, and sure enough (at least for me) the installer executable did nothing – it didn’t work. You can see the executable appear in Process Explorer for about 4 seconds, and then it disappears. Maybe later?

Hmm, got a copy from another developer who isn’t running Vista – and his appears to be working. Meanwhile, Kaspersky Labs has disclosed the presence of a serious security flaw in Google Chrome...

Google's Matt Cutts on the Chrome EULA fiasco....


Why Podcasts Suck, Redux!


I weighed in on this subject some time ago here and I think it's high time for a rehash. Why? Because they just won't give up! I'm putting links to Silverlight stuff into IttyUrl.net, which finally has a good, fast new home (hasta-la-vista gate.com -- useless hosting company!) -- and now I keep coming up with these podcast promotions. NO, NO, NO!

Podcasts are linear; they are like TV, which has become virtually prehistoric for us Internet Geeks! A podcast cannot be indexed by Google, you cannot "Search" it to find the part you are interested in (if there actually is one!), and, except in rare cases, it's not professionally produced media - not by a long shot. And you certainly cannot copy code samples from a FYOOKIN' PODCAST! I mean, if I want to listen to the BBC audiobook narrated edition of Ernest Hemingway's "The Short Happy Life of Francis Macomber", that's professional media! Podcasts, "NOT"!

As I opined in my previous post, we live in a technological era where any Joe Schmoe garage band can burn their own CD or MP3 and distribute it. That doesn't mean it will be any good. Same with Podcasts! PODCASTS SUCK.


What we have, as I've observed, is a number of blog / print authors who are, in many cases,  really good in their own right -- but who just fall to pieces when they try to produce a podcast. They are (unfortunately) spending a lot of time and effort producing these media pieces when they could be writing really good articles with downloadable source code. But they're not, because they don't understand why PODCASTS SUCK!

Marshall McLuhan, whose work is viewed as one of the cornerstones of media theory, said "The medium is the massage". What did he mean? McLuhan was the John Cage of modern media. Cage stretched the concept of what media really is, but in the musical domain. I can remember when I was younger, driving up Route 202 in Stony Point NY to see John Cage's house, and thinking "Holy Sh*t!":


You can't see much in the photo above, but Cage's house was a semi-cylinder - like a beer can with shingles on it! This guy was so avant-garde that he was light-years ahead of the crowd, in everything he did.

I was in awe, because I believe that I understood who this guy really was.  Read the Wikipedia pages about these two giants, who set the stage for what is today, even though the Internet didn't even exist yet for either of them. They SAW the future, and they saw it with such genius vision, it is still just as relevant today!


I believe that both John Cage and Marshall McLuhan would have laughed at modern podcasts. I've also consulted with Dr. Dexter Dotnetsky, denizen of the deep at eggheadcafe.com, and he agrees. And, Dr. Dotnetsky would not play you wrong, dude!

Podcasts, even the "best of the best" by MVPs and Microsoft gurus -- many of whom I know personally -- are, in the main, essentially no more than mental masturbation with people breathing into the microphone and saying "Um" a lot. Sorry, but that's my considered opinion.

Oh, and did I forget to mention -- PODCASTS SUCK! Make no mistake, they really do. You want my unending appreciation and admiration? Write me an authoritative article with downloadable source code! An article that I can FIND - because Google et al. indexed it! I'm not gonna download your dumb Mp3 podcast to my device! Go ahead, comment, make your case why I am wrong -- but, PODCASTS SUCK!

Now, if you want to see some media that is linear like a podcast, but useful and really entertaining, try this Silverlight streaming video of Chick Corea with Miroslav Vitous (bass) and Roy Haynes (drums) playing Thelonious Monk's "Rhythm-A-Ning". On October 10th, Monk would have been 91 years old. To me, Monk is quite alive and well, and his work continues to inspire jazz musicians worldwide. Recorded live at the Blue Note. (More Monk Here)

Did you know that an unusually high percentage of programmers are also musicians - particularly the jazz flavor? It's true. I studied with jazz players in New York and San Francisco; used to play string bass and flute with the Robert Hunt Trio in New York, and I still keep up my flute chops today. It's easier to get three squares a day as a programmer, though!  Dream big, Be yourself, and Swing On! (Oh, and don't forget, PODCASTS ______).


Internet Explorer 8 Beta 2 out, and compatibility tags

Internet Explorer 8 Beta 2 was released today in multiple languages, with more to come in the next 30 days. So, this represents a move beyond the "BETA 1" developer preview stage (possibly by a long shot).

As can be expected, there will be a lot of pages and sites that want to opt-out of IE 8 “Standards” mode rendering. There are two ways to do this:

  • On a per-site basis, add a custom HTTP header

X-UA-Compatible: IE=EmulateIE7 (IIS 7.0 example:)
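The IIS 7.0 example appears to have dropped out of this post; in IIS 7 the custom header would go in web.config, something like this (a sketch of the customHeaders syntax):

```xml
<configuration>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Render the whole site as IE7 did -->
        <add name="X-UA-Compatible" value="IE=EmulateIE7" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```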


  • On a per-page basis, add a special HTML tag to each document, right after the <head> tag

<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE7" />

Implementing the HTTP header is useful if a site owner wants most of their site to render as it did in IE7 or if there are no plans to update site content. Inclusion of this header honors any Quirks mode pages that belong to the site.

Using the meta-tag on a per-page basis is beneficial when you want to opt-in specific pages to render as they did in IE7.

NOTE: The X-UA-Compatible tag and header override any existing DOCTYPE. Also, the mode specified by the page takes precedence over the HTTP header. For example, you could add the EmulateIE7 HTTP header to a site, and set specific pages to display in IE8 mode (by using the meta-tag with content="IE8").

I have no immediate plans to start using IE 8 as I’ve got plenty of other problems and challenges. But you are welcome to share advice or experiences by commenting here. On second thought, maybe I'll install it on my laptop which I don't use that frequently and on which I don't have mission-critical "stuff".

BTW Department

Have you ever wondered where the new SQL Server Database Publishing Wizard is? You know, the one that installs with Visual Studio 2008? Well, you can run it “standalone”, in case you were wondering: c:\Program Files\Microsoft SQL Server\90\Tools\Publishing\1.2\SqlPubWiz.exe. On a 64-bit machine, it's here: C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Publishing\1.2\SqlPubWiz.exe


Bugs in Silverlight?

Dr. Evil: Right, people you have to tell me these things, okay? I've been frozen for thirty years, okay? Throw me a frickin' bone here! I'm the boss! Need the info.

I've seen more than a few posts on the Silverlight Forums where people are complaining (or sometimes just asking for help / guidance) on issues where they appear to be attempting to "tax the system"  and thus expose what they believe is "a bug".  Hey - Silverlight has bugs - even release software does -- that's not the issue. But creating artificial programming situations where one can claim "It's a bug" is not always a  legitimate effort.

More often than not, this revolves around issues like "memory leaks" when attempting to set up some sort of "test" code that does some operation in a tight loop, or some similar operation that does not necessarily relate to what would likely happen in a "real world" Silverlight application. The poster then takes great glee in the assumption that they have "found a bug". Sometimes, it's just attention seeking, and other times it revolves around an incomplete understanding of what Silverlight is, and what its capabilities really are.

I think it's important to understand that the Silverlight runtime is based on a subset of the .NET Framework which is normally installed on the client in about 10 seconds - without the requirement to even restart the browser in most cases. In order to accomplish this, a lot of "stuff" had to be left out.

There are workarounds for some things, and for others, there are not any workarounds.

The key thing to remember here is that if you are designing a new Silverlight application that relies on assumptions you could normally make with some degree of confidence on the full .NET Framework, those assumptions may very well be "out the window" in Silverlight. It "is what it is" -- and you have to be able, and willing, to work with it. Not just that -- you must also be willing and able to invest the time to discover why you may not be able to do what you want.

What's important to keep in mind is, first, to test your assumptions extensively and see how they perform.

The second, and perhaps more important principle -- is to understand that in order to accomplish your objective, you may need to refactor your approach so that it does not tax the SL runtime - which is a subset of the full Framework that runs in the browser, on the client. 

For example, if you are going to create 10,000 UserControls in a tight loop, calling GC.Collect both immediately before and after each iteration -- and then proceed to complain about memory leaks -- ask yourself if this is a scenario that is really likely to play out in a real-world Silverlight application. If you think it is, then maybe the problem is that you simply need to revisit your original assumptions and make appropriate adjustments that come a little bit closer to "runtime reality".

The Silverlight forums are rife with this kind of "complainer post" -- but they also have a significant number of interesting and extremely useful posts and answers. I saw one user who signed up with the name "SILVERLIGHTSUCKS" -- and proceeded to make allegations that belied a true understanding of how Silverlight media streaming works, and why. You could see that one coming from a mile away :-).

I think Silverlight has huge potential for a variety of applications, including LOB apps that present exciting UIs for complex business scenarios. But like Flash / Flex, everything has its limitations. In general, developers need to invest the time to learn what these limitations are; then they will be able to ask more intelligent questions.

Visual Studio: Enable / Disable IE Script Debugging Tool

Rocky: There has already been two attempts on your life.
Bullwinkle: Don't worry, we'll be renewed.

If you are like me, you are always keeping your eye out for shortcuts and ways to make your life easier. One of the little annoyances of working with Visual Studio .NET is that it has no option to turn Client Script Debugging on or off – you have to open up Internet Explorer, go to Tools/Internet Options/Advanced and either check or uncheck the checkbox option.

All this checkbox does is control a Registry entry, so why not just make a little .vbs script and register it as an External Tool in Visual Studio?

The steps to do this are very simple. But first, you need to find out what Registry Key has your value.

The key is located at HKEY_USERS\XXXXXX\Software\Microsoft\Internet Explorer\Main\ where “XXXXXX'” could either be “.default” or one of the machine user identities such as “S-1-5-19”. Once you have identified where your actual “Disable Script Debugger” key is located, you can modify the short script below, and save it as “IEScriptDebug.vbs” in a convenient folder.  You should also make a second batch file, “IEScriptDebug.bat” that will run this, since Visual Studio doesn’t like to configure .vbs files as executables (even though they are), preferring .bat files instead.

Here’s the script:

Option Explicit
' NOTE: you will need to find the HKEY_USERS\XXXXXX\Software value of "XXXXXX" for your machine,
' where this key is located. It could be \.Default\. On this machine it is \S-1-5-19\
Const cHKU = "HKEY_USERS\S-1-5-19\Software\Microsoft\Internet Explorer\Main\"
Const cDSD = "Disable Script Debugger"
Dim objWSH
Set objWSH = WScript.CreateObject("WScript.Shell")
Dim strWSH
strWSH = LCase(objWSH.RegRead(cHKU & cDSD))
Dim strMSG
strMSG = cDSD & " = " & strWSH & vbCrLf & vbCrLf & "Toggle this value?"
If MsgBox(strMSG, vbYesNo, cDSD) = vbYes Then
    ' Flip the value: "no" means script debugging is enabled, "yes" means disabled
    If strWSH = "no" Then
        objWSH.RegWrite cHKU & cDSD, "yes"
    Else
        objWSH.RegWrite cHKU & cDSD, "no"
    End If
    MsgBox cDSD & " = " & objWSH.RegRead(cHKU & cDSD), vbInformation, cDSD
End If
Set objWSH = Nothing
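The companion batch file only needs a single line. Assuming you saved the script to C:\Tools (adjust the path to wherever you actually put it):

```bat
@wscript.exe "C:\Tools\IEScriptDebug.vbs"
```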


Once you’ve got your .vbs and .bat file saved, in Visual Studio, all you need to do is choose “Tools/ External Tools” and add it:



Now whenever you want to toggle Script Debugging, just hit "Tools / Script Debugging" in Visual Studio and you are good to go!



Today a fellow developer who works offsite sent out an email asking for help. He is apparently using a javascript “virtual keyboard” to help with accessibility on an ASP.NET app he’s working on.  In the sample “.htm” page, it seems to work great. But when he added everything to a new .ASPX page, the virtual keyboard jumps down in the page when invoked,  and this gets worse as you add more breaks to move the target textbox down the page. He states he knows  it’s “just some IE bug, but I am going nuts trying to find it”.

I opened the app. The .htm page works great. In the .aspx page, just as he describes, instead of the virtual keyboard rendering just below the input box, it renders at least two or three lines below where it is supposed to appear. Yet all the code, javascript, CSS -- is IDENTICAL! Or is it?

In the .htm page, the DOCTYPE declaration specifies XHTML STRICT. The default DOCTYPE from the .aspx page is XHTML TRANSITIONAL. Yep, it can sure make a difference. It's "Not a Bug" -- IE is simply doing its best to follow what you told it!
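For reference, these are the two declarations in question -- Strict in the standalone .htm page, Transitional in the default ASP.NET page template:

```html
<!-- XHTML 1.0 Strict: what the .htm sample page declared -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

<!-- XHTML 1.0 Transitional: what the default .aspx template declares -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
```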

This CSS will also help IE play nice:

<style type="text/css">
html, body
{
    overflow: auto;
}
</style>
The moral of the story is, “Don’t DuctTape your DocTypes!”


Silverlight 2 Beta 2: ConfigUnrecognizedElement issue and Fix


When you create a Service Reference in a Silverlight App to an ASMX WebService, you may get the: 

"An unhanded exception ('Sys.InvalidOperationException: ManagedRuntimeError error #4004 in control 'Xaml1': System.InvalidOperationException: ConfigUnrecognizedElement at System.ServiceModel.Configuration.BindingsSection.ReadXml(XmlReader reader)"

Exception. This "is a real bug". When your Visual Studio 2008 Silverlight app generates the ServiceReferences.ClientConfig file, it creates a customBinding section that the Silverlight runtime cannot parse. In fact, even the IDE marks the customBinding tag with squiggles, saying "The element 'bindings' has invalid child element 'customBinding'. List of possible elements expected: 'basicHttpBinding'."

This is because Silverlight doesn't understand SOAP 1.2, which is the default for ASMX webservices. DUH! To fix it, you can do this with WCF:

<binding name="myBinding">
  <textMessageEncoding messageVersion="Soap11" />
  <httpTransport />
</binding>

Or, with ASMX, you can do this:

<webServices>
  <protocols>
    <remove name="HttpSoap12" />
  </protocols>
</webServices>


Those of course would be in your server-side web.config file.

The problem here is that the tools are creating a custom binding that is unnecessary and improper. If Silverlight can't support SOAP 1.2  then the tools should not be creating this binding. Silverlight should also ignore any bindings it doesn't know about.

WCF is great, but for simple deployment and auto-documentation in the service discovery page, I still prefer ASMX.


FACTOID:  Count Basie would be 100 years old today. The guy made a huge contribution to American Jazz, and dozens of jazz greats came out of his band. Here’s a clip of his band doing Neil Hefti’s “Lil Darlin’”:

BTW - it was Neil Hefti who wrote the original Batman theme.


When Hosting Plans Go Bad...

Talk sense to a fool and he calls you foolish.  --Euripides

When I started using paid hosting for ASP.NET sites, I started out with CrystalTech and they were very good. Then I found Gate.com and they seemed like they had a pretty good deal going - SQL Server 2005, plenty of bandwidth (200GB), plenty of space, ASP.NET 2.0 and a control panel that would let you create custom subfolder IIS applications, set mime-types in IIS, and even your own custom 404 page - which I instantly turned into a custom UrlRewriting handler. All this for like $9.95 / mo, and a 25% discount on additional sites. Sweet. For an extra $5.00 I could go up to 500GB bandwidth.

Then, about a month ago, I got an email from gate.com promoting how they were going to have a new "improved" control panel page, new features, more this, more that -- you get the idea.

Turns out that they REMOVED features. They blew away my custom 404 handler and removed the ability to specify "my" page for HTTP 404. They removed my sub-applications AND the ability to create one. The sites have gotten dog-slow, and the tech support is like "lo babayit" (Hebrew slang for "nobody home").

It is pretty obvious that for one reason or another, they simply don't care about the customer any longer – and that’s a recipe for disaster for a hosting company (or really, any company that expects to stay in business).

You know what, Gate.com? The minute I can find a better hoster with a good price and the features I need, you bastards are HISTORY.

So, what about you? Got any HCHS's (Hosting Company Horror Stories)?


Silverlight: Some HttpWebRequest Headers don't work

Parents: Talk to your kids about Linux. Before somebody else does. -- XKCD

I saw a couple of posts where people were attempting to make GET or POST requests to some service (which, if not same-domain, had the required crossdomain.xml or clientaccesspolicy.xml file) and this service required BASIC authentication credentials.

Normally, you would add an "Authorization" header with the value "Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==" where the gobbledygook after the word "Basic" is username:password converted to a Base64 string. That's the standard HTTP Basic authentication scheme.
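You can verify that value yourself; here's a quick sketch in Python (the username:password pair is the classic example from the HTTP spec):

```python
import base64

def basic_auth_header(username, password):
    """Build the value for an HTTP Basic 'Authorization' header."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

print(basic_auth_header("Aladdin", "open sesame"))
# Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==
```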

Unfortunately, even though there is sample code illustrating this from Karen Corby here, it does not work (she does not actually add an Authorization header, just illustrates how to add a header). It turns out that the "Authorization" header is on the "restricted list". See here for a complete listing. So, if you have been trying to set this, or any similar header that is on the restricted list, you'll only get strange exceptions that make no sense at all; you may as well give up and look for a workaround.

Obviously, the easy workaround is to have your own proxy page or service that accepts the requisite username and password on the querystring (encrypted if desired, or as form post fields); your server-side code would make the full-framework HttpWebRequest -- where we do not have such restrictions -- and pass the results back to the Silverlight app. An annoyance to be sure, but "it is what it is". Here is an article I recently put together that provides an interesting workaround.
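To make the proxy idea concrete, here's a minimal sketch -- in Python rather than ASP.NET, and the function name is mine -- of the server-side half: the proxy builds the outbound request with the Authorization header that the Silverlight client is not allowed to set itself:

```python
import base64
import urllib.request

def build_proxied_request(url, username, password):
    """Server-side: build the outbound request carrying the restricted
    'Authorization' header on behalf of the Silverlight client."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Basic {token}")
    return req

# The proxy would then call urllib.request.urlopen(req) and relay the
# response body back to the Silverlight app.
```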


Visual Studio 2008 Service Pack 1 (release) now available

"In all recorded history there has not been one economist who has had to worry about where the next meal would come from." -- Peter Drucker

Here is a page with a bunch of related stuff, including a prep tool that will set you up in cases where you had a beta Service Pack installed:


The actual Service Pack link is about the 10th one down in the list, here:


I use UltraISO to extract this to the file system, and then you can just execute the "SPInstaller.exe" to start the install. A lot easier than burning a DVD. You can also mount the ISO using the VCD Control Tool or a similar utility such as Daemon Tools. But don't forget: if you ever need to do an update or repair, you need to do so from Control Panel, not from the original install source for VS 2008 -- and it will be looking for a drive letter and location. If you have space, I believe it's better just to keep folders containing the Visual Studio 2008 and Service Pack media on your hard drive.

There is also a new Silverlight_Chainer.exe installer for Silverlight 2 Beta 2 that plays well with the new release VS Service Pack. Look for a date of 8/11/2008 on this page:

http://www.microsoft.com/downloads/details.aspx?FamilyId=50A9EC01-267B-4521-B7D7-C0DBA8866434&displaylang=en

(86,461 KB, file version 9.0.30729.10) -- that's the new one! It checks for the SP1 RTM patches and installs VS90SP1-KB955214.

Oh, and lest I forget, here's the Readme:


So far, I have all this successfully installed on 2 machines running Vista, including one x64.

Here is a direct download link to the MSDN Library for Visual Studio 2008 SP1 (2209MB):


If you want to install VS2008 SP1 (release) and still work with Silverlight 2 applications, download the updated Silverlight tools and install them after you install VS2008 SP1.

I have also installed SQL Server 2008 as an upgrade to SQL Server 2005 on a 64-bit Vista OS with no issues to report. Other machines follow, with updates to this post as appropriate.

NOTE: if you are having issues around Silverlight, read Heath's post.

There is one additional issue that happened to me with a third PC: If you get errors during Service Pack 1 installation such as "unable to load package xx1234.msi", this is a patch problem. The first thing to try is to download the Microsoft .NET Framework 2.0 Registration Correction Tool package:



If this doesn’t work (and it didn’t for me) the next step would be to UNINSTALL Visual Studio 2008 and reinstall it fresh, then apply the Service Pack.

The fastest way to uninstall Visual Studio when all else fails:

An easy, reliable and very fast way to uninstall VS 2008 without having to wait an hour or more for the standard operations  is to use the Microsoft Windows Installer Cleanup Utility:

http://support.microsoft.com/kb/290301/en-us (download is midway down the page).

If you haven’t used it before, this presents you with a list of all software installations on your machine. You can select Visual Studio 2008 (any version), click the “REMOVE” button, and all registry entries pertaining to Visual Studio will be removed. If you want you can also delete the folder in C:\Program Files\Microsoft Visual Studio 9.0 but I suspect that won’t be necessary because when you install it again everything will be written over anyway. The installer will now think that VS2008 has never been installed before! Files don't really matter -- it's what's in the Registry that counts.

One final note:

Don't forget to temporarily disable your antivirus software while you are doing all this "stuff". I had F-Secure pop up an "access denial" on one file during my travels. You don't want that to happen when everything is 95% complete...

Have fun and don't forget to RTFM  on the readme file, before you jump in with both feet.

Some Interesting Silverlight Stats

According to Eweek, during the first week of the Beijing Olympics:

  • Silverlight was downloaded 8 million times per day
  • 13.5 million video streams were active
  • 16.9 million users visited the site
  • On August 11th, Silverlight delivered 250 terabytes of data
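As a rough back-of-envelope check (my arithmetic, assuming decimal terabytes spread evenly over 24 hours), 250 terabytes in a day works out to roughly 23 gigabits per second of sustained bandwidth:

```python
bytes_delivered = 250 * 10**12           # 250 terabytes (decimal) in one day
bits_delivered = bytes_delivered * 8
seconds_per_day = 24 * 60 * 60           # 86,400 seconds
avg_gbps = bits_delivered / seconds_per_day / 10**9
print(round(avg_gbps, 1))                # ~23.1 Gbit/s, averaged over the day
```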

I can remember a couple of times when the Silverlight "loading" thingies were spinning and no video got delivered, but that's still some pretty impressive numbers.