NHibernate Fluent => hbm => SQL Schema

What luck for rulers that men do not think.  - Adolf Hitler

More experimentation with NHibernate, and we’re getting very close to Nirvana. The concept is (once again): let’s develop our domain model first (similar to the “contract first” paradigm with SOA and web services), create the mappings, and then create the database schema for the persistence mechanism (SQL Server, whatever) last.

The Fluent.NHibernate project API allows you to map entities in NHibernate in a more expressive and more testable manner than you are typically able to. It gives you a clear path to take your POCOs (Plain Old CLR Objects) – standalone objects without any of the “glop” attributes or dependencies – and turn them into NHibernate .hbm XML mapping files. This is very cool because it means your objects can be marked [Serializable] and sent over the wire via WCF, etc., without any “[VendorNameHere] baggage” getting in your way, because they don’t really have any.
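As a quick sketch of what this looks like (the Customer class and its properties are hypothetical, and the exact Fluent NHibernate method names have shifted a bit between versions):

```csharp
using System;
using FluentNHibernate.Mapping;

// A plain POCO: no attributes, no special base class, freely [Serializable].
[Serializable]
public class Customer
{
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

// The Fluent mapping lives in a separate class, so the POCO stays clean.
public class CustomerMap : ClassMap<Customer>
{
    public CustomerMap()
    {
        Id(x => x.Id);       // identifier mapping
        Map(x => x.Name);    // simple property mapping
    }
}
```

The important design point is that all the mapping knowledge lives in CustomerMap; the Customer class itself never references NHibernate.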

Once this is accomplished, you can use the built-in NHibernate Tools classes to create an actual SQL Server (or other RDBMS) Schema SQL script to generate the database tables that mirror what your domain model represents in your POCO code.

Additionally, the Automap Classes are moving closer and closer to the ability to take care of the mapping “all at once” with a single method call. You don’t even need to persist .hbm XML mapping files; the NHibernate runtime is capable of using the generated mappings in memory.
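Here is a hedged sketch of what that single-call automapping looks like (the Customer type, the namespace filter, and the connection string are all assumptions, and the AutoMap entry points have been renamed across Fluent NHibernate releases):

```csharp
using FluentNHibernate.Automapping;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

public static class SessionFactoryBuilder
{
    public static ISessionFactory Build()
    {
        // Map every type in the domain assembly in one shot; no .hbm files on disk.
        var model = AutoMap.AssemblyOf<Customer>()
            .Where(t => t.Namespace == "MyDomain.Entities"); // assumption: entity namespace

        return Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2005
                .ConnectionString("Server=.;Database=MyDb;Trusted_Connection=True;"))
            .Mappings(m => m.AutoMappings.Add(model))
            .BuildSessionFactory();
    }
}
```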

Again, the whole point of this exercise is “top down” or “objects first” – this flies in the face of the Microsoft data-centric approach of “database first” LINQ-to-SQL or Entity Framework development (which is not a bad thing, because it is eminently usable) and will require further study by developers who understand the concept of “clean, unadulterated POCOs” that represent a true domain-model approach to solving business problems for clients.

I’ll post more on this and probably write a comprehensive article to be posted on our eggheadcafe.com site complete with a downloadable VS 2008 solution.


IIS / ASP.NET Recycling “Deadlock Detected”

I like long walks, especially when they are taken by people who annoy me.   - Noel Coward

For some time I’ve been working on an ASP.NET web site issue where, almost like clockwork, the app recycles once an hour, and then again about six minutes later.

It took a long time to find it, but it turns out that “we’re our own worst enemy”. No, it wasn’t some external process like Task Scheduler on the box, running once per hour and hogging threads. It was me having so much fun doing FireAndForget pattern RPC server pings on every search that I was shooting myself in the foot! This is what happens when we put in some new cool “thing” and then six months later, when we start to see problems, we can’t remember what we did!

The ASP.NET “Deadlock detected” message shows up in your Application event log. It’s kind of cryptic, but here’s the general cause: you have one ThreadPool per AppDomain. You may have a number of different operations going on (including just serving pages), all of which use up a ThreadPool thread. Usually, whatever is happening finishes quickly, the thread becomes immediately available again, and there is never a problem.

But if it doesn’t, you can quickly use up all of your ThreadPool threads, and then the very next page request that comes in is plumb “out of luck” – and your AppPool can thus cryptically recycle.

It could be unclosed connections, sockets, or in this case, the FireAndForget pattern which is a non-blocking way to send off some operation on a background thread.

The problem is, it’s still a real thread and it comes from the ThreadPool. And if it takes its sweet time, you could end up starving yourself out of ThreadPool threads and blow up your app. While the error reported is “Deadlock detected”, my theory is that it may not actually be a deadlock at all – just simple thread starvation.

There is an elegant fix, the general pattern of which looks like so:

public static void DoPingOMatic(string title, string url)
{
    int workers = 0;
    int completion = 0;
    ThreadPool.GetAvailableThreads(out workers, out completion);
    if (workers > 10)
        FireAndForget(new PingOMaticDelegate(PingOMatic), new object[] { title, url });
}

What I’m doing above is very simple and happens at the method-body level. I simply check that there are at least 10 worker threads available before I allow the method to attempt to use a ThreadPool thread. Of course, you can get more sophisticated (waiting, for example), but in this case, just checking that I’ve got at least “X” ThreadPool threads before we let ’er rip is sufficient.
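If you want a hard upper bound on concurrent pings, rather than a raw “threads available” check, a counting semaphore also works. This is only a sketch – PingOMatic here stands in for whatever background operation you are actually firing, and the limit of 5 is arbitrary:

```csharp
using System.Threading;

public static class ThrottledPinger
{
    // At most 5 pings in flight; excess requests are dropped, never queued,
    // so page-serving threads can't be starved by a pile-up.
    private static readonly Semaphore pingGate = new Semaphore(5, 5);

    public static void DoPingOMatic(string title, string url)
    {
        // WaitOne(0, false) returns immediately: true if a slot was free.
        if (!pingGate.WaitOne(0, false))
            return; // too busy - just skip this ping

        ThreadPool.QueueUserWorkItem(delegate
        {
            try { PingOMatic(title, url); }
            finally { pingGate.Release(); }
        });
    }

    private static void PingOMatic(string title, string url)
    {
        // placeholder for the real RPC ping
    }
}
```

The drop-instead-of-queue choice is deliberate: queuing the excess work would just recreate the starvation problem one layer down.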

Recursion: See “Recursion”

Doing this kind of RPC service pinging every time you get a search or other request can get you into an unwitting “internet recursion” folly with your website if you aren’t careful. Here’s what I mean: you decide to ping a bunch of RPC servers every time you get a search on your blog or website. The RPC ping service (Pingomatic, whatever – see the article on eggheadcafe.com) dutifully goes out and makes a request of the URL you’ve pinged it with (e.g., http://www.yourblog.com?q=whatever). Your blog treats this RPC service request as a new search, pings the RPC servers with it, and… you get the picture. Talk about thread starvation!
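A crude way to break the loop – purely a sketch, since the user-agent strings the ping services send are an assumption you should verify against your own IIS logs – is to skip the outbound ping whenever the incoming “search” looks like a crawler or ping-back rather than a human:

```csharp
using System;
using System.Web;

public static class PingGuard
{
    // Returns true if the request smells like a bot / ping-back crawl.
    // The "bot" and "ping" substrings are assumptions, not a standard.
    public static bool LooksLikePingBack(HttpRequest request)
    {
        string ua = request.UserAgent ?? string.Empty;
        return ua.Length == 0
            || ua.IndexOf("bot", StringComparison.OrdinalIgnoreCase) >= 0
            || ua.IndexOf("ping", StringComparison.OrdinalIgnoreCase) >= 0;
    }
}

// Usage in the search page:
//   if (!PingGuard.LooksLikePingBack(Request))
//       DoPingOMatic(title, url);
```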

So, the next time you see this kind of IIS ASP.NET application pool recycling, search through your code carefully. The bottom line is, if you have any kind of background thread processing going on, your ThreadPool only has 50 worker threads by default – some of which may already be in use behind the scenes even though your code doesn’t explicitly engage them. It’s a good idea, therefore, to throttle whatever you’re doing – especially if it is likely to happen many times per minute.


Referencing a non-Silverlight Assembly in a Silverlight Project

“Bailout? Hey! I’m having trouble paying my bills. How about it?” -- Me

This one comes up a lot, and the bottom line is, you can’t do it.

But - I think it is important to understand why it won't work:

You cannot reuse non-Silverlight assemblies, since the desktop CLR and the Silverlight CLR are based on two different frameworks. They are similar, but they aren’t “the same”. The BCLs (Base Class Libraries) referenced in each type of project may have similar names, but they are completely different physical files. You can, however, reuse the original code if it is compatible and available to you in source code form.

In Visual Studio, you can create a new Silverlight class library project, then right-click to add an existing item. Click the arrow on the right side of the Add button and choose "Add As Link". This will not copy the .cs file to your new project, so when you modify the .cs file in one place, the other place is automatically updated. You can also have two separate projects – one a full-framework project and one a Silverlight project – that both use the same class files. You can even save a Silverlight .csproj file right next to the full-framework .csproj file (with a slightly different name) in order to promote code reuse.
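For reference, “Add As Link” produces an entry like this in the Silverlight .csproj (the path and file name are illustrative):

```xml
<ItemGroup>
  <!-- The file physically lives in the full-framework project;
       only a link to it is stored here. -->
  <Compile Include="..\MyFullFxProject\Customer.cs">
    <Link>Customer.cs</Link>
  </Compile>
</ItemGroup>
```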

As far as being able to "add a reference" to an assembly that wasn't compiled under a Silverlight project, you cannot do it. However, you can add a reference from a full .NET CLR project to a Silverlight assembly.


Silverlight 2 / Visual Studio 2008 “The project type is not supported” error.


Recently I managed to discombobulate my Visual Studio 2008 Installation, and had to remove and reinstall the little booger. Everything went pretty much OK except after reapplying the Silverlight_Tools.exe (the combined developer installer), I tried to load an existing Silverlight 2.0 RTM project and got “The project type is not supported by this installation” dialog.

I’m like, “Huh? The installation went perfectly with no errors”.  One fix I found that seems to work perfectly is to run “Devenv /setup”. For some reason this resets all the package loads and Silverlight Tools will be happy ever after.

In my short happy life as a .NET developer, I’ve seen several products where MSIEXEC ends up just sitting there in memory at the end of an install session, all dressed up with no place to go, and it just hasn’t quite finished the job.

While I’m on this subject, it might be appropriate for a quick review of all the DEVENV.EXE switches (the full documentation is here, along with all parameters):

(Note: When you run Visual Studio 2008 on Windows Vista, you must run devenv as an administrator in order to use the /Setup and /InstallVSTemplates switches.)

/? (devenv.exe)
Displays a message box listing all devenv switches, with a brief description of each one.

/Build (devenv.exe)
Builds a solution using a specified solution configuration file.

/Clean (devenv.exe)
Cleans all intermediary files and output directories.

/Command (devenv.exe)
Executes the specified command after launching the Visual Studio integrated development environment (IDE).

/DebugExe (devenv.exe)
Opens the specified executable file to be debugged.

/Deploy (devenv.exe)
Deploys a solution after a build or rebuild. Applies to managed code projects only.

/Edit (devenv.exe)
Opens the specified file in an existing instance of Visual Studio.

/LCID (devenv.exe)
Sets the default language used for text, currency, and other values within the integrated development environment (IDE).

/Log (devenv.exe)
Starts Visual Studio and logs all activity to the specified log file for troubleshooting.

/NoVSIP (devenv.exe)
Disables the Visual Studio SDK developer license key on a developer workstation and then starts Visual Studio.

/Out (devenv.exe)
Specifies a file to store and display errors when you run, build, rebuild, or deploy a solution.

/Project (devenv.exe)
Identifies a single project within the specified solution configuration to build, clean, rebuild, or deploy.

/ProjectConfig (devenv.exe)
Specifies a project build configuration to be applied when you build, clean, rebuild, or deploy the project named in the /project argument.

/Rebuild (devenv.exe)
Cleans and then builds the specified solution configuration.

/InstallVSTemplates (devenv.exe)
Registers project or item templates that are located in <Visual Studio installation path>\Common7\IDE\ProjectTemplates\ or <Visual Studio installation path>\Common7\IDE\ItemTemplates\ so that they can be accessed through the New Project and Add New Item dialog boxes.

/ResetSettings (devenv.exe)
Restores Visual Studio default settings. Optionally resets the settings to the specified .vssettings file.

/ResetSkipPkgs (devenv.exe)
Clears all options to skip loading added to VSPackages by users wishing to avoid loading problem VSPackages, then starts Visual Studio.

/Run (devenv.exe)
Compiles and runs the specified project or solution.

/Runexit (devenv.exe)
Compiles and runs the specified project or solution, and then closes the integrated development environment (IDE).

/SafeMode (devenv.exe)
Starts Visual Studio in safe mode, loading only the default environment and services.

/Setup (devenv.exe)
Forces Visual Studio to merge the resource metadata that describes menus, toolbars, and command groups, from all available VSPackages.

/Upgrade (devenv.exe)
Updates the solution file and all of its project files, or the project file specified, to the current Visual Studio 2005/2008 formats for these files.

/UseEnv (devenv.exe)
Starts Visual Studio and uses the environment variables for PATH, INCLUDE, LIBS, and LIBPATH in the VC++ Directories dialog box.

Well, there they are! I sure didn’t know about all of them. Happy Holidays.


LINQ To SQL / Entity Framework / NHibernate ORM Top-Down, Objects First

People demand freedom of speech as a compensation for the freedom of thought which they seldom use. – Søren Kierkegaard

In the process of stumbling through LINQ to SQL to see whether I could use it to represent a SQL Server database schema I had created (to provide storage for a hierarchical, well-defined XML schema for a commonly used utility object), I came to the realization that I was indeed doing everything completely backwards!

What I am saying is this:  ORM should be done by focusing on the OBJECTS FIRST, not the Database Schema! Unfortunately, most of the tools we have are data-centric, not object-centric.  Scott Allen has a post that clearly describes the debacle.

To my knowledge, there will not be any plain old CLR objects (POCOs) in Entity Framework. LINQ to SQL doesn’t yet have all the mapping capability to really separate the object model from the underlying database schema – and of course, you can use it with SQL Server only.

Now, this situation may improve in the future. But hey -- I need to write this application now, and I’m not about to hold my breath waiting.

Of all the .NET frameworks out there (including the current Microsoft offerings), the only mature offering that appears to be really capable of providing the mapping functionality I need is NHibernate. With NHibernate, I can create my objects (classes) first, create the mapping schema, and then use the SchemaExport class to create the matching database schema. Best of all, NHibernate is happy whether you do this with SQL Server, Oracle, or even SQLite. Pete Weissbrod has some excellent material that you can use as a study guide on how to do this.
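A minimal SchemaExport sketch (the Customer type is a placeholder for one of your mapped entities, and the Execute overload has varied slightly between NHibernate versions):

```csharp
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

public static class SchemaBuilder
{
    public static void ExportSchema()
    {
        Configuration cfg = new Configuration();
        cfg.Configure(); // reads hibernate.cfg.xml for the dialect / connection
        cfg.AddAssembly(typeof(Customer).Assembly); // pick up the .hbm mappings

        // script: true  -> write the generated DDL to the console
        // export: false -> don't actually run it against the database yet
        new SchemaExport(cfg).Execute(true, false, false, true);
    }
}
```

Flip the export flag to true once you’ve eyeballed the generated DDL and you want the tables actually created.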

The key feature with an ORM like NHibernate is what's called transparent and automated persistence:

A DataSet allows you to extract the changes performed on it in order to persist them. NHibernate provides a different feature: it can automatically persist your changes in a way that is transparent to your domain model. This means a complete separation of concerns between the persistent classes of the domain model and the persistence logic itself, such that the persistent classes are unaware of – and have no dependency on – the persistence mechanism.

An “Item” class, for example, will not have any code-level dependency on any NHibernate API. It doesn’t need funky attributes on the class or its methods that can make serialization and reuse over the wire via WCF difficult. In addition, NHibernate doesn’t require that any special base classes or interfaces be inherited or implemented by persistent classes, nor are any special classes used to implement properties or associations.

The bottom line for me is that in a system with transparent persistence, objects aren’t aware of the underlying data store; they need not even be aware that they are being persisted or retrieved. They can be serialized, sent over the wire, used for NUnit Tests, and more.

As Scott indicates in his post, developers need to learn how ORM identity maps work. Think objects first, data second. Start with your classes and business logic, and only then get the persistence and mapping to the database done. It was a hard lesson to learn because I’ve wasted a lot of time, partly because of a predilection for using the Microsoft flavor,  but now I think I can safely say that “I got it”.


One morning, I shot a Chevrolet in my pajamas…

Government does not solve problems; it subsidizes them. – Ronald Reagan

The White House said today that "The legislation crafted in recent days aimed at helping the ailing U.S. automakers is an effective and responsible approach". 

I strongly disagree. It was hard enough to get policymakers to finally utter the word "recession." The Treasury just issued 4 week T-Bills at ZERO percent interest, for the first time in history. In the secondary markets, T-Bills were trading at a premium (meaning negative yields). You want to talk about deflation? People are so fearful that they’re willing to let the US Treasury hold their money and earn no interest at all.  The next challenge may be to get them to say "nationalization"  -- because that’s exactly what’s happening, baby!

That's pretty much what the government takeover of big chunks of the economy amounts to, in my opinion. Welcome to the USSRofA, Comrade!

Actions taken by the Democratic-led Congress and the outgoing administration — moves generally supported by President-elect Barack Obama — already have reversed decades of deregulation and privatization that Presidents Ronald Reagan, George W. Bush and George H.W. Bush all championed. It seems that when times get tough, the U.S. Government needs to step in and be the "lender of last resort". I say, “bullshit!”

Washington has taken a direct stake in  or orchestrated the takeover of banks, seized control of mortgage finance giants Fannie Mae and Freddie Mac, taken a controlling stake in insurer American International Group and now is poised in the final weeks of Bush's term to throw a multi-billion dollar lifeline to the troubled Big Three automakers. We can print money until our faces turn blue. But where is the solution to the problem? Do you see a solution? NOT! Credit markets are still frozen, Pal. And people are out on the fyookin’ street because they cannot pay their mortgages. And that’s after our Government has literally pissed away billions of your and my hard-earned tax dollars in an insanely misguided effort to “fix it”.

Congressional Democratic leaders and the White House just negotiated a bill to provide $14 billion in emergency short-term loans for Detroit and create a "car czar" to be named by Bush to dole out the loans and oversee restructuring.  Republicans may very well  ditch it, and I really hope they're successful. The Government simply does not have a very good track record of intervention in the business sector during hard times.

The problem is, this is simply throwing money at a sinking ship in an effort to "rearrange the deck chairs on the Titanic" without addressing the real baggage of bad decisions on long-term union contract agreements and other financial obligations that will still completely prevent US automakers from ever being competitive in global markets. 

The unions crippled the car industry because they were greedy and didn’t think ahead. It’s not just the fault of the unions – the carmakers made really, really dumb decisions about what to produce and sell, too.

I bet you that with the current deal, within just three months, GM will be back in D.C. asking for an additional 15 Billion of your and my hard earned money! Six months later, an additional 15 billion! What am I talking, GREEK? Come on, can we  THINK for a change?

"The government has no business managing car companies, even if temporarily", says Sen. Richard Shelby, R-Ala. "It's very un-Republican". And boy, is he right.

Nearly two years after Bush suggested that Detroit produce "a product that's relevant" rather than look for a possible Washington bailout, the President now supports the emergency loans – after getting a concession from Democrats that the money would come from an existing program to help the industry retool its plants to make greener cars.

Longer-term proposals being developed by Obama and congressional leaders call for an equity stake for the government and a chance to dictate business decisions for years to come. This is totally wrong! The only way to fix the mess is to enable the automakers to restructure under Chapter 11 Bankruptcy, which will give them a chance to ask the Judge to nullify all the bad contracts and utterly stupid decisions they agreed to over the last 30+ years, and have a chance to really come out with a fresh start and have a viable industry.

I repeat: the pouring of taxpayer dollars into this Black Hole WILL NOT FIX THE UNDERLYING PROBLEM!

During World War II, the government seized railroads, coal mines, Midwest trucking operators and, briefly, retailer Montgomery Ward.

The federal government partially nationalized the nation's troubled railroads in the 1970s. Today, it still owns and runs Amtrak. What a DISASTER! You want more of this?

The government nationalized more than a thousand failed savings and loan institutions in the late 1980s and the early 1990s, modeling the effort on a government-run corporation that made loans and bought stock in distressed banks during the 1930s.

The Government has a very bad track record of intervening in failing industries. Don’t we get it? Can’t we THINK? It's time to stop the bullshit and let the automakers take bankruptcy. That’s what they need. It won’t be painless, to be sure. But it could work. It could be a lot less expensive to retrain and re-employ the displaced workers than to throw hard-earned taxpayer money down the sewer.  My two cents.