10/31/2007

Entity Framework Goodness

I finally got around to the Entity Framework Beta 2, and I like it already. To play with this, you need these bits:

Entity Framework Beta 2:
http://www.microsoft.com/downloads/details.aspx?FamilyID=f1adc5d1-a42e-40a6-a68c-a42ee11186f7&displaylang=en

Tools, Aug 2007 CTP:
http://www.microsoft.com/downloads/details.aspx?FamilyId=09A36081-5ED1-4648-B995-6239D0B77CB5&displaylang=en

There is also a nice "Getting Started" piece on codeplex:

Getting Started:
http://www.codeplex.com/adonetsamples/Release/ProjectReleases.aspx?ReleaseId=7792

I decided to brave it and just roll a quickie on my own from the Northwind database. All you do is Add New Item / ADO.NET Entity Data Model, which gives you a wizard to choose what database objects you want modeled, and then you get a nice class diagram:

[Image: Entity Designer class diagram]

All your entity model code is built for you. So, say you want to get 10 products from the Products table via a nice little LINQ query:

// Get 10 products via an Entity Framework LINQ query
using (NorthwindModel.NorthwindEntities1 ent = new NorthwindModel.NorthwindEntities1())
{
    // ToArray() already returns NorthwindModel.Products[], so no extra cast is needed
    NorthwindModel.Products[] prod =
        (from pr in ent.Products select pr).Take(10).ToArray();
    GridView1.DataSource = prod;
    GridView1.DataBind();
}


LINQ is extremely cool. As with any new technology there is quite a learning curve, but it's really worth the effort. You can get into all kinds of complex LINQ queries and joins, and the vocabulary and syntax, especially combined with lambda expressions, are outstanding. I've been working out of the Albaharis' book from O'Reilly, "C# 3.0 in a Nutshell", which has extensive reference material on LINQ. They also have a free downloadable "LINQPad" executable which I highly recommend.
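Just to give a flavor of where you can take it, here's a quick sketch of a grouping query against the same generated model. This is illustrative only -- it assumes the Categories table was included in the model and that the generated Products entity exposes UnitPrice plus a Categories navigation property with a CategoryName field:

// Sketch: count Northwind products priced over $20, grouped by category name.
// Assumes Categories is in the model and "Categories" is the navigation property name.
using (NorthwindModel.NorthwindEntities1 ent = new NorthwindModel.NorthwindEntities1())
{
    var expensiveByCategory =
        (from p in ent.Products
         where p.UnitPrice > 20
         group p by p.Categories.CategoryName into g
         select new { Category = g.Key, HowMany = g.Count() })
        .OrderByDescending(row => row.HowMany);   // lambda-style ordering

    GridView1.DataSource = expensiveByCategory.ToList();
    GridView1.DataBind();
}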

Oh, and just to be clear: this is for Visual Studio 2008 (Orcas), NOT Visual Studio 2005.

Sweet!

10/28/2007

Google PageRank Crash of 2007: What's the Skinny?

"To the moon, Alice" -- Jackie Gleason (Ralph Kramden, The Honeymooners)

The blogosphere has been buzzing the last couple of days since everybody discovered that the Googly-Bear decided to update its PageRank algorithm (there had been hints for weeks before, to be sure -- e.g., Danny Sullivan, Oct. 7). Ah, "poor little me", huh?

Legions of very big blogging-related sites and commercial ventures -- Washingtonpost.com, Forbes.com, Engadget.com and SFGate.com -- noticed a downgrade in their PageRank. Some sites went up in PageRank. Our eggheadcafe.com site went from 5 to 6. One of my newest "playground sites", blogmetafinder.com, went from zero to a PageRank 3. This UnBlog remains unchanged, for now, at PR 5.

Most experts agree that the key determinant was the practice of "buying links" such as text link ads and the like. Apparently, Google just decided to close up this last little loophole, and they did it with ample warning too. Now, this is "big" - you may not grok it right away, but it's going to change the entire complexion of the web, and pretty quick, too. Hopefully, for the better...

As we all know, a whole industry has sprung up with the goal of helping website operators obtain the highest rankings for certain keywords in search engines and milk the most out of their AdSense accounts. Unfortunately, many of these SEO practices fall into the blue or black-hat SEO category, and that's what got the Googly-Bear very upset.

And, you don't wanna upset the Googly-Bear, because it's a very, very big one. And when it gets mad, a whole lotta things can change overnight, yes?

Here's the "thing": If I've got a web site that's got a PageRank "7" and it's because I've got a bunch of backlinks from spammy or questionable sites that sell links, what I really have is a PageRank 7 that' s ready to go down the toilet overnight. If those spammy or questionable sites get demoted, my PageRank that's depending on them goes straight into the potty.

There will always be the crowd that doesn't want to put in legitimate effort for gain - the people that focus on how to "game the system", get DiggBoss points, manipulate PageRank, whatever.

Bottom line: Do things the old fashioned way. Don't buy or sell links. Put real content on your site or blog. Promote yourself ethically along the published Webmaster guidelines. And you won't be crying in your beer. If your venture got demoted in PageRank, better do some serious soul-searching, because it's more than likely the problem sits between your keyboard and your chair.

It will be interesting to see what happens in the next 2 to 3 weeks as lawyers start jockeying around this ambulance... Congress, of course, will need to get involved...

In Other News

Welcome to $100 oil and $4.00 gas, which I predicted a year and a half ago. If you take the time to look at the historical Oil vs. Inflation chart (which is a couple of years old) it is easy to see that inflation tracks oil prices pretty closely. Pencil in $100 oil, and you get the picture. And the Fed can't do crap about it because our economy is already in shambles and the only trick they've got is to raise the Discount Rate. With the mortgage / housing market down the crapper, there is no way they can do that. What do you get? An INFLATIONARY RECESSION. I give it about 6 months.

10/27/2007

Visual Studio 2008 (ORCAS) "Project Creation Failed"

For a list of all the ways technology has failed to improve the quality of life, please press three. -- Alice Kahn

This error is due to some assembly binding redirects that the GAX (Guidance Automation Extensions) installer adds to the devenv.exe.config file.

Here is the fix, and it's an easy one:

1) Navigate to: C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE in Windows Explorer

2) Load devenv.exe.config in your favorite text editor.

3) Find this string: "Microsoft.VisualStudio.TemplateWizardInterface"

4) Comment out the element so it looks like this:

<dependentAssembly>
<!--assemblyIdentity name="Microsoft.VisualStudio.TemplateWizardInterface" publicKeyToken="b03f5f7f11d50a3a" culture="neutral" /-->
<bindingRedirect oldVersion="0.0.0.0-8.9.9.9" newVersion="9.0.0.0" />
</dependentAssembly>

5) Restart Visual Studio 2008. All fixed!

10/24/2007

A $15 Billion Bubble? You Decide...

With Microsoft (MSFT) buying a very small minority stake that values Facebook at $15 billion, it looks like "bubble or bust".

A lot depends on whether you believe Facebook is just the latest online fad—or whether the social network is building the next, great computing "Thing".

Facebook, which is closing in on 50 million members, theoretically promises to restore control over social networking -- over privacy, unwanted email, and virtual contact -- to the user.

If you divide the estimated valuation ($15 Billion) by the number of users that means each Facebook user is potentially monetized at about $300. Obviously, users aren't being monetized at that level now. I'm a Facebook user, and while it was kinda cool at first, seeing people that I know who have joined, and being invited to be "a friend" of some people that I respect or like, I already find that the bloom is off the face, if you will - and I really don't visit much at all anymore.

So, how does Facebook/Microsoft make money? Advertising, obviously. When Mark Zuckerberg talks about "social ads", I have a funny feeling that I am not gonna be there to click. It might be innovative, it might be the "next big thing" - but my guess is that if many users, like me, aren't really compelled to visit regularly, guess what? There isn't gonna be anybody to look at the ads, period. Not only that, but these social networking sites are quite transient. One day, probably soon, some young innovator is going to come up with something better (they're probably coding their asses off on it as I write) and Facebook (like MySpace) will start to really become passé.

Frankly, I'm getting the idea that Facebook friends are to remind you how many friends you might actually have if you were to spend time with real people in the real world.

So whether Facebook is worth $15 billion really depends on whether it can figure out a way to spin new kinds of online ads that work a lot better than anything we’ve seen before. My bet is: don't hold your breath. There is a broader concept at work here, as alluded to by an astute commenter below.

Déjà Vu all over again, a-la 2000? At least, NASDAQ isn't at 5,000...

10/19/2007

Un-Captcha Techniques Redux

I've spent a bit of time working on alternative CAPTCHA techniques, mostly because I've found that the vast majority of CAPTCHA offerings are non-intuitive and, even for people with excellent eyesight, don't always "make it" the first time around. I have pretty good vision, but I find myself constantly frustrated by stupid case-sensitive CAPTCHA requirements that I simply cannot pass on the first try, sometimes not the second, and occasionally not even the third.

Webmasters and site developers are like lemmings - they see something that somebody promotes, they copy it, they use it, but they DON'T THINK!

WTF? All one needs to do is look at Jeff Atwood's blog and you can see that he requires the user to type in a clearly readable "ORANGE" every time - and it works perfectly! The bots simply don't get it. It's easy to see, easy to read, and it shows how UTTERLY RIDICULOUS these various CAPTCHA images are to the user, and how they literally destroy the user experience!

One technique I pioneered was the use of an image-to-HTML CAPTCHA that renders as HTML.


But! There could be an even easier way:

This concept is based on the fact that most spam-bots are, in a word, "dumb".

Here is the technique:

1) Add an input field to your form having some interesting name such as "url":
<input name="url" type="text" value=""/>


2) Hide the input box with a CSS style rule so that real (human) users cannot see it:
<style>
.captchaStyle {
  display: none;
}
</style>

<p class="captchaStyle"><input name="url" type="text" value=""/></p>

3) In your code that processes the form, check whether the "url" form field contains any value. If it does, it's a bogus post: a bot saw the field and "thought" it was supposed to fill it in, so reject the post or set it up for moderation.


It works because genuine users cannot see a hidden input box on your form and therefore won't fill it in, while robots do see it, assume they need to provide a value, and fill it in.
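In ASP.NET terms the server-side check is only a few lines. Here's a minimal sketch of step 3 -- the page and handler names are illustrative, and "url" is the hidden trap field from step 1:

using System;
using System.Web.UI;

// Code-behind sketch of the honeypot check (names are illustrative).
public partial class ContactForm : Page
{
    protected void SubmitButton_Click(object sender, EventArgs e)
    {
        // A real visitor never sees the hidden field, so it should come back empty.
        string honeypot = Request.Form["url"];
        if (!String.IsNullOrEmpty(honeypot))
        {
            // A bot filled in the trap field: reject the post or queue it for moderation.
            return;
        }

        // ...process the legitimate submission here...
    }
}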

Go figure.

The Project XXX could not be opened because the Microsoft Visual C# 2008 compiler could not be created

If you have any experience installing Microsoft beta or CTP offerings and then uninstalling them to make way for a newer version, the above Visual Studio 2008 (Orcas) error may look familiar.

Surprisingly, the answer is to install the latest version of the Silverlight Tools here:
http://www.microsoft.com/downloads/details.aspx?FamilyID=b52aeb39-1f10-49a6-85fc-a0a19cac99af&DisplayLang=en

That may not be the only answer, but it seems to have worked for a number of people including me.

If you are having trouble uninstalling Visual Studio ORCAS because the MSI information got clobbered somehow, or it reports not being able to find some "network location" (DOH), one of the fastest ways to clean things up is to use some of the tools out there, many of which have been provided by installation guru Aaron Stebner. Here's a post by Brad Abrams that highlights two tools you can use.

MSIInv takes an inventory of all your MSI-installed software (even stuff you cannot see in Add/Remove or Programs and Features). It provides you with the product code GUID so that you can just run msiexec /x {product GUID}. This often solves the problem. You can also use MSIZap (also detailed in the post, with a download link).


I have also been able to use the MSICUU "cleanup" utility, which is even easier.

Have fun. It's us against them.

In other news, if you are concerned about the freedom of the Internet ("Net Neutrality"), read this about what Comcast is doing.

10/10/2007

Losing ASP.NET Sessions - Why Application Pools recycle

I've seen a more or less constant stream of questions on the asp.net newsgroup and the asp.net site forum messageboard, all of which revolve around the problem of "Why am I losing Sessions?". (BTW that is "losing" with one o, not "loosing"! Your pants might be loose, but if your Session goes away you can bet that you are losing it!).

Here's a summary of what I've learned about this; comments and additions are always welcome:

If your ASP.NET application crashes, has an unhandled exception, hangs or otherwise becomes brain-dead, it will cause the application pool to recycle. Sometimes your application pool recycles for no obvious reason. This is usually a configuration issue or it may be caused by your app performing file system operations in the application directory. Many times developers incorrectly set up SqlConnections so that they aren't properly closed and returned to the connection pool, and this can also cause your AppPool to recycle unexpectedly. When your AppPool recycles, you can kiss your InProc Sessions -- and everything else -- goodbye.
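On the SqlConnection point, here's the pattern I mean -- a minimal sketch only; the "Northwind" connection string name, the class name and the query are placeholders for whatever your DAL really does:

using System.Configuration;
using System.Data.SqlClient;

public static class ProductData   // illustrative class name
{
    public static int GetProductCount()
    {
        string connString =
            ConfigurationManager.ConnectionStrings["Northwind"].ConnectionString;

        // The using blocks guarantee Dispose() runs even if ExecuteScalar throws,
        // so the connection always goes back to the pool instead of leaking.
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Products", conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }
}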

Application pool settings

Looking at the properties for the application pool in IIS, you'll see the settings for "on purpose" recycling. In IIS6 these are:

  • Recycle worker processes (in minutes)
  • Recycle worker process (in requests)
  • Recycle worker processes at the following times
  • Maximum virtual memory
  • Maximum used memory

If you're running IIS5 or the IIS5 isolation mode you must look at the processModel element of machine.config. The properties you should pay attention to are:

  • memoryLimit
  • requestLimit
  • timeout

In IIS 7.0, you can recycle on a fixed interval, a fixed number of requests, or at specific times. There are also memory-based maximums for virtual and private memory, and additional configurable and runtime recycling events, including "unhealthy ISAPI".

When an application pool recycles, HTTP.SYS holds onto the client connection in kernel mode while the user mode worker process recycles. After the process recycle, HTTP.SYS transparently routes the new requests to the new worker process. Consequently, the client never "loses all connectivity" to the server; the TCP connection is not lost -- only state is lost (Application, Session, Cache, etc.).

memoryLimit

The default value of memoryLimit is 60. This value is only useful if you have a small amount of memory on a 32-bit machine. "60" means 60% of total system memory. So if you have 1 GB of memory, your IIS worker process will automatically restart once it hits memory usage of 600 MB.

requestLimit

This setting is "infinite" by default, but if it is set to 8000 for example, then ASP.NET will launch a new worker process once it has handled 8000 requests.

timeout

The default timeout is "infinite". This is where you set the lifetime of the worker process. Once the timeout is reached ASP.NET launches a new worker process, so setting this to "00:30:00" would recycle your application every 30 minutes.

Other properties

Another property within the processModel element that will cause your application pool to recycle is responseDeadlockInterval. If you have a deadlock, that's the real problem you need to worry about -- changing the responseDeadlockInterval setting won't do much to resolve it. You need to deal with the deadlock itself, find out why it's happening, and change your code.

File Change Notification

ASP.NET 2.0 depends on File Change Notifications (FCN) to see if the application has been updated, and depending on the magnitude of the change the application pool will recycle. If you or your application are adding directories to and removing them from the application folder, you will be restarting your application pool every time.

Altering the following files also causes an immediate restart of the application pool:

  • web.config
  • machine.config
  • global.asax
  • Any file in the /bin directory or subfolders

Updating .aspx files and the like causes a recompile, which eventually triggers a restart of the application pool as well. There is a property of the compilation element under system.web called numRecompilesBeforeAppRestart. The default value is 20, meaning that after 20 recompiles the application pool will recycle.
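If you want to find out which of these triggers is actually biting you, ASP.NET 2.0 will tell you at shutdown time. Here's a minimal sketch for Global.asax -- writing to Trace is just a placeholder sink; log it wherever you normally do:

using System;
using System.Web;
using System.Web.Hosting;

public class Global : HttpApplication
{
    protected void Application_End(object sender, EventArgs e)
    {
        // ShutdownReason reports values like ConfigurationChange,
        // BinDirChangeOrDirectoryRename and MaxRecompilationsReached.
        ApplicationShutdownReason reason = HostingEnvironment.ShutdownReason;
        System.Diagnostics.Trace.WriteLine("AppDomain shutting down. Reason: " + reason);
    }
}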

Workaround for the sub-directory issue

If your application actually requires adding and removing sub-directories you can use linkd to create what's called a directory junction:

1) Create a directory you'd like to exclude from FCN, e.g. c:\inetpub\wwwroot\MyWebApp\MyFolder
2) Create a separate folder somewhere outside the wwwroot, e.g. c:\MyExcludedFolder
3) Use linkd to link the two: linkd c:\inetpub\wwwroot\MyWebApp\MyFolder c:\MyExcludedFolder

Any changes made in c:\inetpub\wwwroot\MyWebApp\MyFolder will now actually occur in c:\MyExcludedFolder, so they will not be sensed by FCN.

Linkd only comes with the Windows XX Resource Kit, which is a pretty big download. But Mark Russinovich has "junction", which could be even better:

http://www.microsoft.com/technet/sysinternals/FileAndDisk/Junction.mspx

Is recycling the application pool good or bad?

If your app is coded properly, you shouldn't have to recycle the application pool. However, if you're dealing with a memory leak in your app and you need to buy time to fix it, then recycling the application pool could be a good idea. It's important to understand, though, that this is not a "fix" - it's just a Band-Aid until you find out what's causing the problem and correct your code. Unlike ASP.NET 1.1, in ASP.NET 2.0 an unhandled exception will unload the AppDomain, causing an application pool recycle. Consequently it is extremely important to ensure that your code follows best practices and doesn't generate unhandled exceptions except under the most extreme and unusual conditions.
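One place this commonly bites people is a stray exception on a background or ThreadPool thread -- in .NET 2.0 an exception that escapes such a thread takes the worker process (and all your sessions) down with it. A minimal defensive sketch, where the class name and the work itself are purely illustrative:

using System;
using System.Threading;

public static class CacheWarmer   // illustrative name
{
    public static void Start()
    {
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            try
            {
                // ...the actual background work goes here...
            }
            catch (Exception ex)
            {
                // Log it instead of letting it escape the thread and kill the process.
                System.Diagnostics.Trace.WriteLine("Background work failed: " + ex);
            }
        });
    }
}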

Additional Resources

http://msdn2.microsoft.com/en-us/library/aa719566(VS.71).aspx

http://msdn2.microsoft.com/en-us/library/aa720473(VS.71).aspx

http://forums.asp.net/t/623320.aspx

http://www.microsoft.com/technet/technetmag/issues/2006/01/ServingTheWeb/

http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/26d8cee3-ec31-4148-afab-b6e089a0300b.mspx?mfr=true

http://blogs.msdn.com/david.wang/default.aspx

10/09/2007

AJAX: Enough Already!

Is there such a thing as OverAJAXification?

[Image: Sys.WebForms.PageRequestManager exception dialog]

From the AJAX Sucks Department...

Jeesh. Everybody and their brother is sticking AJAX into their "stuff" - whether it's appropriate or not. They just did it at Codeplex.com. The search facility was just fine. You'd click a "next" link and the page would postback and right away you would get to see the next page of results.

Now, since they've supposedly "souped it up" with AJAX, what you get is a very long (sometimes 5 seconds or more) grey screen with an "updating" in-your-face graphic to look at, and then finally you get to see your next page of search results. To me, that's a lot more annoying and disruptive to the user experience than the slight flicker of a quick postback. But just as often, you may instead see the above Sys.WebForms.PageRequestManager exception dialog, which does absolutely NOTHING to improve the user experience.

Oh, and while I'm ranting, here's another nasty side effect of your "OverAjaxification": I put in a search term, and page through my results, and I'm on page 7, and I click "back" to go back to page 6, right? NOPE. You guys put me back on page 1! What if I click on a result to look at the project, and then click back to get back to my page of results? Same thing - you guys are now putting me back to the first page of results, without the sort that I chose. DOH! This is like Dan Rather and the Selectrics, man! Do you have to do this dumb stuff?

This is what happens when people are hot to showcase some technology but they don't THINK first. AJAX (excuse me - Remote Scripting), like any other technology, should be used with care and especially with great forethought as to its appropriateness within the specific presentation paradigm and web traffic load.

AJAX can definitely improve the user experience when used with care and in the right situation. But frankly, if it's going to bomb out or take longer for me to see my results than a simple postback, I'll vote for the old fashioned way every time.

The key thing is "form over substance"- we should always favor doing things in a usable, correct manner over "how they look". IF you can do both, then fine -- but make it stand up to the test first.

10/08/2007

Vista Upgrade: DVD Driver Problems FIX - Roxio DLA

The gods too are fond of a joke. - Aristotle

Recently I upgraded XP Pro to Windows Vista Ultimate. It wasn't until sometime later that I found I had lost my DVD drive. The driver appeared in Device Manager but it had that familiar yellow exclamation mark indicating a problem loading the driver.

After a bit of searching, I discovered that the culprit was Roxio CD/DVD software (e.g., "DLACDBHM.SYS" et al.), which is not Vista-compatible and thus blocks correct loading of the built-in Vista CD/DVD drivers.

The FIX:

First step: in C:\Windows\System32, find the DLA folder and delete it. That's their "Stuff". If you want to get sophisticated, you can search the registry for all keys that point to any of these .sys drivers and remove the entries.

Next, run this Registry Fix script courtesy of Doug Knox:

'Restore CD-Roms and DVD's to Explorer
'xp_cd_dvd_fix.vbs
'© Doug Knox - rev 04/14/2002
'Downloaded from www.dougknox.com
'based on cdgone.reg

Option Explicit
On Error Resume Next

Dim WshShell, Message

Set WshShell = WScript.CreateObject("WScript.Shell")

WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}\UpperFilters"
WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}\LowerFilters"
WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Cdr4_2K\"
WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Cdralw2k\"
WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Cdudf\"
WshShell.RegDelete "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\UdfReadr\"
Set WshShell = Nothing

Message = "Your CD/DVD-Rom drives should now appear in Windows Explorer." & vbCR
Message = Message & "You may need to reboot your computer to see the change."

MsgBox Message, 4096,"Finished!"

The above script should be saved as "xp_cd_dvd_fix.vbs" and you can execute it from Windows Explorer by double-clicking. This is a pretty old script, but the most important thing it does is remove the "UpperFilters" and "LowerFilters" entries.

Note: the above does not remove Roxio DLA - which is incompatible with Vista. Here are full instructions for that:

Note: These instructions apply specifically to Windows XP, but they work on Vista as well.
To perform a clean uninstallation of DLA:
1) Uninstall the current copy of DLA using the "Add/Remove Programs" control panel, if available.
2) Delete the "DLA" folder from your Program Files folder. (By default this is located inside: C:\Program Files\Sonic\)
3) Edit the Registry to remove entries relating specifically to DLA. (As a precaution, please export any registry entries before deleting them.)
a. To start the Registry Editor, click on "Start," click on "Run," type "regedit," and click "OK."
b. Open the "HKEY_CURRENT_USER" folder. (If this folder does not exist, proceed to step f.)
c. Open the "Software" folder.
d. Open the "Sonic" folder.
e. Right-click on the "Direct Access" folder and select delete.
f. Go to HKEY_LOCAL_MACHINE\Software\Sonic.
g. Right-click on the "Direct Access" folder and select delete.
h. Go to HKEY_CLASSES_ROOT\CLSID.
i. Right-click on the "{5CA3D70E-1895-11CF-8E15-001234567890}" folder and select delete. (The data for the "Default" setting in this folder should be "DriveLetterAccess.")
j. Go to HKEY_CLASSES_ROOT\Installer\Products.
k. Right-click on the "29FE602138E2958RCABC02843CBCD76A" folder and select delete.
l. Go to HKEY_CLASSES_ROOT\.
m. Right-click on the "VERITAS.DLAEventHandler" folder and select delete.
n. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Classes\CLSID.
o. Right-click on the "{5CA3D70E-1895-11CF-8E15-001234567890}" folder and select delete. (The data for the "Default" setting in this folder should be "DriveLetterAccess.")
p. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Classes.
q. Right-click on the "VERITAS.DLAEventHandler" folder and select delete.
r. Go to HKEY_LOCAL_MACHINE\Software\Microsoft\Shared Tools\MSConfig\startupreg.
s. Right-click on the "dla" folder and select delete.
t. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Explorer\AutoplayHandlers\Handlers.
u. Right-click on "VxDlaCdOnArrival" and select delete.
v. Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\.
w. Right-click on the "{1206EF92-2E83-4859-ACCB-2048C3CB7DA6}" folder and select delete.
4) Restart your computer.
5) Delete the "DLA" folder from the System32 folder. (This is located inside: C:\Windows\System32)

Reboot your machine, and you should now see your DVD drive, and Roxio DLA will be gone.

10/04/2007

Repository Factory, GridView and ObjectDataSource with WebService Layer

Quick Repository Factory tutorial with a WebService data layer feeding an ObjectDataSource on a web page. Also shows how to use the ObjectDataSourceView class to insert a new row in your GridView.


10/02/2007

Repository Factory Out

The P&P group has been listening to developers, and they've decoupled the Data Access Guidance Package from the WSSF etc. It really should have been available on its own from the git-go. The full source is available at codeplex.com here, and its called "Repository Factory".

You need to be careful, though. They haven't produced a simple MSI installer for this, even though the setup project is included in the downloadable "final" source code - so you'll need to build it. I have no idea why they do this, but hey -- let's not look a gift horse in the mouth, you know?

The other issue is a bit more fragile - the "Final Source Code" download that was available today at the upper right link on the page did not build for me. I fiddled with missing references for at least 20 minutes, and no-go. Finally, I went to the Source Code section and downloaded the most recent checked in build, number 10107 - and that one built in Release mode without a hitch. Also, you'll need to have both the GAT and the GAX installed. You can get the July drops of these from the links on-site, or if you have the January builds those should also work.

Now - what's so cool about this? Well, you can start any project, and enable the Guidance Package for Repository Factory (once you've installed the built MSI from the setup project in the solution) -- and you can specify a connection, project responsibilities, build your sprocs, your business entities, and Repository Factories, all nicely integrated with the Enterprise Library 3.1.

So for example, if I generate "stuff" for the Northwind Products table, all I need to do to get all products looks like so:

ProductsRepository rep = new ProductsRepository("Northwind");
List<Products> prods = rep.GetAllFromProducts();
this.dataGridView1.DataSource = prods;

Don't like Enterprise Library? No problem. David Hayden, an MVP here in FL who seems to have made a career around EntLib and the P&P / GAT stuff, shows how you can decouple the whole thing from Ent Lib and use SqlHelper - or any other DAAB that you already have.

If you aren't inclined to download and install all the prerequisites, or to attempt to build the MSI installer, I've cached a copy of it in my Windows Live SkyDrive public folder here. Help Yourself! (UPDATE: 10/3/2007 - they now have an MSI installer at codeplex.)

Incidentally, if you are having trouble uninstalling earlier versions of the Guidance Automation Toolkit (GAT) or the Guidance Automation Extensions (GAX), especially on Windows Vista, please don't drive yourself nuts trying to follow the incredibly sadistic lists of "Stuff" that so-called experts tell you that you have to do. Just download the MSICUU (Windows Installer Cleanup Utility). Once installed, you can run it from your Start menu, find the offending installed products that "won't uninstall", and remove all traces of them in the Registry. The newer versions should then install perfectly. The whole thing takes about 10 seconds to fix.

I can dig it. When I used the WSSF with this in it, it just got overly complex. But now I can produce BLL and DAL layers that my finicky clients will approve of, and know that they will be not only "best practices" -- but consistent and maintainable too.