Smartphone and Vista: "Look Ma, No USB!"

Recently I posted about how the Bluetooth connectivity to the new Windows Vista Device Center had been disabled by Microsoft in response to security concerns by corporate customers. That was partially inaccurate.

You can connect your SmartPhone to Windows Vista with Bluetooth (assuming of course you have a Bluetooth radio USB dongle on your desktop PC). The process is a bit weird, but hopefully this explanation will help:

1) First you have to enable Bluetooth on your device and make it "Discoverable".
2) On your desktop machine, in Control Panel, in "Bluetooth Devices" you need to add your device. The following pics show a successful add:

Now all you need to do is go into the Device Center main screen and click on your device; it will say "Waiting for device to connect". You may also need to go into the Bluetooth setup on your device and make it connect. A passkey is recommended, and once you fill that in on your device, you should see "Connected" on your desktop.

At that point, you can sync data, sync media files, or just browse the filesystem on your device and copy files back and forth, Windows Explorer-style, all using the features of the Windows Mobile Device Center.

I wish there were clearer instructions on how to do this that came with the product, but so far I haven't found them.


Da Dum, Da Dum, Saddam

"At 6:10 a.m., the trapdoor swung open. He seemed to fall a good distance, but he died swiftly. After just a minute, he was not moving. His eyes still were open but he was dead.
His body stayed hanging on the rope for another nine minutes as those in attendance broke out in prayer, praising the Prophet, at the death of a dictator."

There are surely two schools of thought about the above:

1) America toppled the Dictator and empowered his people to regain their destiny, and the Iraqi people did what they consider proper justice at their own will.

2) It was the Americans' fault; America has no business in Iraq; it was America that executed Saddam.

You will often also find the number 2 people in the cadre of folks who have something in their radical, left-wing psyches that leads them to believe that the U.S. Government was somehow involved in the 9/11 disaster, in some evil conspiratorial way.

So, now it's over. Like Hitler, Tojo, Stalin. Good vs Evil. At any rate, it certainly should be cathartic for the Iraqi people, many of whom have suffered dearly at the hands of this maniac.

Where do we go from here? Personally, I have no doubt that Saddam was one of the most evil dictators of the 20th century (and a portion of the 21st), and deserved to be hanged. Banishment would have been more humane, but we must remember that in that part of the world, punishment by death is more the rule than the exception.

-- And, it was their decision.

We still have 40,000 troops in Japan and 30,000 in Korea after 60 and 50 some-odd years respectively. The U.S. of A. is going to have a presence in Iraq for a very long time. Get over it.

The American Thinker article that one commenter astutely points out sums it up:

"It is interesting to note that those complaining about Saddam's death sentence do so in the safety knowing that they will probably never have to live in the culture he helped create, nor will they ever have Saddam Hussein as a neighbor. If Saddam were allowed to live, Iraqis who suffered under his regime would not have those same assurances."


Enterprise Library 3.0 Dec '06 CTP, About 2007...

The Enterprise Library 3.0 December 2006 CTP is out at CodePlex. Also in the Community Extensions, there are a number of cool add-ons and extensions that users have authored.

Even if you do not use the full EntLib in your work, this is really good "best practices" code to study, and I highly recommend it. It's a good way to get into patterns and building-block OOP coding.

About 2007...

2007 starts with MONDAY and also ends on MONDAY...
2007 has the highest number of SUNDAYS and SATURDAYS...
So enjoy the least working year in your life!


OpenID - Ready for Prime Time?

(Subtitle: "Reality.sys not found. Universe halted.")

I saw a video on OpenID by Simon Willison and checked out the main site. I did manage to find a server and client implementation for ".NET", but it was ported from a Mono project, there was no source code, and the original was written in Boo for .NET.

There seems to be a lot of activity around OpenID; one could make the case that this represents what Microsoft hoped to do with Passport (oh, wait, I think it's "LiveID" now), or at least what everybody else hoped to do with it.

The concept is pretty simple: you get a URI that uniquely identifies you and is difficult to spoof, and it lets you do single sign-on (or at least use the same credentials mechanism) at multiple sites.
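For example, in OpenID 1.x the URI you claim is just a page that advertises your provider with a couple of link tags in its HTML head (the URLs here are made-up placeholders, not a real provider):

```html
<!-- Your identity page, e.g. http://yourname.example.com/ -->
<head>
  <!-- Where relying parties go to verify you -->
  <link rel="openid.server" href="http://openid.example-provider.com/server" />
  <!-- Optional: delegate your chosen URI to an account at that provider -->
  <link rel="openid.delegate" href="http://yourname.example-provider.com/" />
</head>
```

Any site that accepts OpenID fetches your page, finds those tags, and bounces you to the provider to authenticate.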

I have no particular problem with Boo (I could use SharpDevelop, which supports it), but my real question is: why isn't this authored in an industry-standard certified CLI language such as C#? It just seems like so much stuff was cobbled together; I think at this point the best thing to do is wait.

I like the concept, and I'd implement it on sites I develop, but not without a lot more infrastructure support and less FUD.

A recent survey of the Fortune 1000 websites by Port80 Software shows that Microsoft's Internet Information Services (IIS) is being used by 54.9 percent of the companies, and the ASP.NET platform is being run by 48.4 percent of them. You can dispute the figures as biased, but the bottom line is: if you want to gain broad acceptance of a platform or standard, you have to embrace the entire market, not just the Penguinistas. This, to me, represents a gaping hole in whatever marketing strategy the OpenID crowd may have.

This is one of the trouble areas you can find with open-source initiatives: you get a kind of fragmentation where individual efforts go off in different directions, and although the intentions are noble, there may be no central authority to tie everything together, so you run the risk of having a real mess on your hands. I really hope they get it sorted out. I am a big proponent of open-source software and platforms; it's just that open source still requires leadership and proper management to succeed.

Just my $0.02 .


Boom! Boom! 5:30 PM

If you live within an hour or so of the Kennedy Space Center, this unmistakeable loud double-boom will really wake you up. I think geeks who live in Central Florida are just so much more tuned in to NASA, space exploration, and what technology is all about.

I completely forgot about the Shuttle trip - I've been so engrossed in my work. But that unmistakeable sound blast woke me, and within a half a second, I knew that they were coming home.

Flipped on the TV and watched a perfect landing, and also noted that my stress level, even though perhaps subliminal, just went down a notch for not having to worry about the space program - at least for now.

Think about it. We are sending people up into space to do science and follow our human destiny, and meanwhile back here on earth, brother is killing brother in a mindless universe of hatred.

If this doesn't represent the two most polarized opposites of the human species, I don't know what does.


IttyUrl.net in Beta

My latest creation, IttyUrl.net, takes the "Short Url" concept farther, and is oriented toward developers.

1) Turns long Urls into "Ittified" short Urls just like the dozens of similar sites do.
2) Automatically spiders the target page, returning the URL type (Feed or Page), the Title, and up to 200 TAGWORDS on the page, along with any custom tagwords you provide, and indexes them all.
3) Easy "bookmarklet" you can drag to the Toolbar or add to Favorites enables you to "Ittify" any page while viewing it.
4) RSS Feeds of your IttyUrls, or most recent site-wide IttyUrls.
5) Search by tags or Title keywords.
6) Tag Cloud feature.
7) Complete WebService API
8) Neat little script you can put on any page of your blog or web site (neutered here, remove the + signs):

Everything on the site is free! Comments, suggestions and criticism are welcome!

Try it! http://www.ittyurl.net


Windows Update and IIS Metabase Corruption?

I let the most recent Windows updates install on Server 2003 x64 this morning, and the first thing I noticed on reboot was that familiar but despised message box: "One or more services failed to start, have a looky at the Event Viewer, etc.".

Now, I can't be sure it was the update; I'm just mentioning this in case other poor souls want to corroborate, and because I know of at least one other situation where this may have occurred.

At any rate, the IIS Admin service would not start. A bit of investigation revealed that the IIS metabase file, "Metabase.xml", which resides in %SystemRoot%\system32\inetsrv\, was corrupted.

If you have enabled backups on your IIS metabase, there are three ways to restore a previous backup. You can do it from the IIS Admin snap-in (the preferred method); however, if (as in this case) the Admin Service won't start, you are pretty much S.O.L. on that one. Another way is to restore your backed-up System State, but in many cases that would bring along so much additional "bad baggage" that you really want to avoid it.

The third way is to look in the History folder, where you'll see a bunch of files with names like "MetaBase_0000003275_0000000000.xml". All you need to do is take the latest one (assuming it's not a backup of the corrupted file, of course), copy it into the inetsrv folder, and rename it to "Metabase.xml".

Presto! You're done, and the IIS Admin service should start. Of course, you may have to reconstruct some VRoots and stuff, but it's quick and pretty much failsafe.
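If you ever have to do this again, the "grab the newest history file and copy it back" step is easy to automate. Here's a minimal C# sketch - the class and method names are mine, and it assumes (as appears to be the case) that the history file names sort by their zero-padded version numbers:

```csharp
using System;
using System.IO;

public static class MetabaseRestore
{
    // History file names like "MetaBase_0000003275_0000000000.xml" sort
    // lexically by their zero-padded version number, so an ordinal sort
    // puts the most recent backup last.
    public static string PickLatest(string[] historyFiles)
    {
        Array.Sort(historyFiles, StringComparer.Ordinal);
        return historyFiles[historyFiles.Length - 1];
    }

    // Copies the newest history file over Metabase.xml. The IIS Admin
    // service should not be running (it won't start anyway if the
    // metabase is corrupt).
    public static void Restore(string inetsrvDir)
    {
        string[] files = Directory.GetFiles(
            Path.Combine(inetsrvDir, "History"), "MetaBase_*.xml");
        string latest = PickLatest(files);
        File.Copy(latest, Path.Combine(inetsrvDir, "MetaBase.xml"), true);
    }
}
```

Call it as, e.g., MetabaseRestore.Restore(@"C:\Windows\system32\inetsrv") and then start the IIS Admin service.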

IIS automatically maintains a record of metabase changes in history files that are saved to disk. You can also configure the number of metabase history files to save. By using the metabase history feature, you can revert the metabase through any number of changes to restore a particular configuration, or to see what has changed between revisions.

The History folder stores versioned copies of the MetaBase.xml and MBSchema.xml files. These copies can only be viewed by members of the Administrators group. The location of the History folder is systemroot\System32\Inetsrv\History.

When the in-memory metabase is written to disk, IIS checks to determine whether the number of history file pairs that are contained in the History folder exceeds the value of the MaxHistoryFiles property. If the number of history file pairs exceeds the value of the MaxHistoryFiles property, the oldest history file pair is deleted.

Now! The question becomes, how did my Metabase xml file, which has never been corrupted before, ever, get corrupted like this?


Why does Forms Authentication Fail When Migrating from ASP.NET 1.1 To 2.0?

The <machineKey> element in the Web.config file is used to control tamper-proofing and encryption of ViewState, forms authentication tickets, and role cookies.

ViewState is signed and tamper-proof by default. You can request encryption for pages that contain sensitive items in their ViewState by using the ViewStateEncryptionMode attribute. Forms authentication and role cookies are also signed and encrypted by default. You do not need to modify the default settings under normal usage scenarios, except for a few situations that developers should be aware of:

If your application is in a Web farm or if you need to share authentication tickets across applications, you need to manually generate encryption and hashing keys and specify them in the <machineKey> element, and NOT use the "autogenerate" default.

If you migrate an application from ASP.NET 1.1 to ASP.NET 2.0 and use Hashed passwords, the key material used to generate your hashes WILL CHANGE. Again, the solution is to have specified encryption and decryption keys in your <machineKey> element, and to keep these the same in your migrated application.

Judging from the number of "what's wrong" posts around this subject, it appears that Microsoft hasn't made this clear enough. There are a couple of KB's on the subject, but the problem is - most developers read KB's AFTER they have a problem, not as a "preventative measure". You give me 100 developers who have installed Visual Studio 2005, and I will show you at least 95 developers who never read the "Readme" file that accompanies the distribution! It's just human nature to RTFM as a last resort.

For ViewState, a hashed message authentication code (HMAC) is generated from the ViewState content and the hash is compared on subsequent requests.
The validation attribute of the <machineKey> controls this, and indicates which hashing algorithm to use. It defaults to SHA1, which uses the HMACSHA1 algorithm. Valid choices for hashing include SHA1 or MD5, although SHA1 is preferable because it produces a larger hash and is considered cryptographically stronger than MD5. The validationKey attribute of <machineKey> is used in conjunction with the ViewState content to produce the HMAC. If your application is installed in a Web farm, you need to change the validationKey from AutoGenerate,IsolateApps to a specific manually generated key value.

Here is an example element with manually-provided keys:
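(The key values below are illustrative placeholders only - never use published keys; generate your own:)

```xml
<machineKey
    validationKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    decryptionKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    validation="SHA1" />
```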


You can make yourself a nice little class to generate fresh keys, like so:

using System;
using System.Text;
using System.Security.Cryptography;

public static class GenerateKey
{
    public static string GetKey(int keyLength)
    {
        int len = 128;
        if (keyLength > 0)
            len = keyLength;
        // Each byte becomes two hex characters in the output.
        byte[] buff = new byte[len / 2];
        RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
        rng.GetBytes(buff);
        StringBuilder sb = new StringBuilder(len);
        for (int i = 0; i < buff.Length; i++)
            sb.Append(string.Format("{0:X2}", buff[i]));
        return sb.ToString();
    }
}

Pass in the desired key length (in hex characters) as the parameter to the GetKey method. Passing zero results in the default length of 128.


The Case of the Incredible Multiplying Email Alias

(-- or "How I Learned to be a Complete Idiot and Send Out Spam Via the 'Me Too' Effect...")

"A man does not exist until he is drunk." -- Ernest Hemingway

From the ReallyReallyDumb Department:

This one "takes the cake" for absolute stupidity! I wouldn't even know about this except for the fact that I once started work on a book for this publisher, a deal I eventually got out of, but they've never taken my email address off their "Authors" list.

This afternoon about 2 PM, I get about 20 emails from the "contacts@....." address of this publisher, who shall remain unnamed (you probably know who they are anyway). Apparently, some complete idiot set up this address to relay any mail sent to it to every author in the list (or maybe they didn't even know, which in my book still qualifies for "idiot" status), and then they decided to "test it" - even asking other people to "send it an email every 30 minutes". Good God! So somebody sends a mail with this alias in the CC list, right? BOOM! 200 people get a copy of it, and what do they do? You got it - they hit the reply button, which in many cases includes the alias in their CC list, and instead of 200 emails, now you have 800 more. Not to mention all the bounces from people who are "Lo Babayit" (Nobody Home), or whose mail servers send the bounce notifications BACK TO THE ALIAS - which of course, once again, replicates the entire process, ad nauseam.

Well, this started about 1 PM, and it's finally starting to die down about 5 PM. I've seen this a couple of times before, where the "Me too!" effect kicks in, creating absolute recursive havoc with news and mail servers (not to mention any poor slob who happens to have gotten their tail in the loop, like me).

N.B. -- Oops, spoke too soon. It's Friday morning, and there are another 60 or so from their own Postmaster address. Boy, did they create a mess!

Go figure. Lawn Chair Larry from Los Angeles wasn't even this dumb - at least he took along some sandwiches, a six pack of beer and a BB gun!


ASP.NET 2.0 vs PHP -- or PHP.NET?

I finished up my Multisearch Windows Vista Sidebar Gadget, which allows you to choose from multiple search providers and get your RSS Search results in your Sidebar gadget, and I chose the Google Blog Search provider and searched on ASP.NET and came up with this post on Digg.com (target of post):

http://www.modernlifeisrubbish.co.uk/article/why-not-dot-net where this fellow basically trashes ASP.NET in favor of PHP. Most of his reasoning in favor of PHP is really just personal preference; the reasons given are mostly inaccurate, or biased by a lack of knowledge about ASP.NET and the .NET platform.

There was one comment on the Digg posting, however, that I found revealing:

"You should check out Phalanger.


It integrates PHP with ASP.NET, pre-compiling your PHP into MSIL, the same way eAccelerator and others pre-compile scripts. The difference is, it's done by ASP.NET, and your scripts run on IIS6. With your scripts running on ASP.NET, they're running managed which protects you from a great deal of security issues.

As for performance, I suggest you give it a try, because our tests show PHP running under ASP.NET under IIS6 under Windows 2003 completely destroys the same hardware running PHP with eAccelerator under Apache Linux 2.6 or FreeBSD. Sounds hard to believe, but we're a shop full of linux/unix guys and all our stuff is currently on Apache and we gave ASP.NET a chance at running our PHP and the results were stunning.

Phalanger gives you the ability to use the .NET framework in your PHP scripts, and to use the code behind model of ASP.NET, but you don't have to use that stuff if you don't want to! It is 100% compatible with existing classic style PHP scripts, and PHP modules. (N.B. - it is - tried it, and it was a 100% total "no brainer")

I'm no microsoft fanboi, but if you're a real developer you shouldn't be a unix fanboi either, you should test things out and see what works best for you. I highly recommend you test Phalanger with your existing PHP codebase."

Well! I had already looked at Phalanger, but I didn't realize it was that far along, so I downloaded and installed it (including the Visual Studio 2005 IDE Integration support).

Let me tell you something: PHP is extremely popular, but it's still an interpreted language, like VBScript with classic ASP. When you combine the ease of use of PHP with compiled .NET Framework support, you have got a winning combo. I tried a couple of PHP web apps, and I was duly impressed. There is a checkbox in the installer that basically asks whether you want to support compiling classic PHP scripts; I checked it, and it works, right out of the box!

The included PHP.NET compiler will output standard .NET class library assemblies that can be used by any .NET application. Think about it - this is HUGE!

What does Phalanger do?

My extract from the features section of the help PDF:

Phalanger enables developers to painlessly deploy and run existing PHP code on an ASP.NET web server and develop cross-platform extensions to such code taking profit from the best from both sides.

Compatible with PHP 5.1, as well as with proposed features from the upcoming PHP 6, the object model in Phalanger enables you to combine PHP objects with .NET ones. It is possible to use and extend any .NET class in PHP, and also to consume classes written in PHP from an arbitrary .NET language.

From another point of view, Phalanger provides .NET programmers with a giant number of practical PHP functions and data structures - many of them reimplemented in the managed environment of the .NET Framework. The whole library of PHP functions and classes (including those implemented in PHP extensions) is accessible to a .NET programmer, together with type information.

The compilation of PHP scripts gives yet more power to the existing PHP web applications inside the Phalanger environment. All the static (run-time immutable) code in the scripts gets parsed and compiled only once and all following accesses to a page benefit from the unleashed execution of the native compilation of the script. Yet the usage of Phalanger is not limited to web applications. The compiler supports output of standalone executables or dynamic link libraries enabling you to create managed PHP console or windows applications, or library modules reusable from any other .NET Framework application.

Personally, I look at this from the standpoint that there is a huge codebase of high quality PHP stuff that I may wish to use without having to run PHP in interpreted (or even "eAccelerator") mode. With Phalanger, I can run this great codebase in ASP.NET with no hiccups at all and get the enormous perf boost that the ASP.NET model provides. And, I can host it in IIS just like any ASP.NET application.

If this isn't PHP.NET, I don't know what is.

In addition I will leave you with this snippet from Scott Guthrie's ("Mr. ASP.NET") blog from March (paraphrased for simplicity):

"Myspace had (in March) 65 million registered subscribers, and were registering 260,000 new users each day. According to the Media Metrix report (an independent analyst firm) MySpace.com had more page views in February than all of the MSN and Google sites combined. Umm!

They re-built and re-deployed their site on ASP.NET 2.0 shortly after it was shipped last year. Some statistics:

MySpace.com is now processing 1.5 Billion page views per day
MySpace.com handles 2.3 million concurrent users during the day
MySpace.com's average server CPU utilization went from 85% to 27% after moving (from another technology) to ASP.NET 2.0

The top-6 domains in terms of page-views in February according to Media Metrix were: 1) Yahoo, 2) MySpace, 3) MSN, 4) Ebay, 5) Google, and 6) Hotmail.

4 of the top 6 sites (MySpace, MSN, Ebay and Hotmail) run on IIS and Windows"

You PHP folks? Read up on it d00ds, WAKE UP and put that convenient "anti-Microsoft" stance aside for a bit, and open your mind. You can have your PHPCake and eat it, too... It's not just about scripting: it's about performance too.

It ain't null until I SAY it's null!

The fiasco around System.DbNull and "null" and Databases kind of reminds me of the "Hanes Lady" commercial where she is pulling the elastic of the briefs (Now that one was Marketing 101 exemplified -- how many TV ads do you really remember like that one?).

The typical forum or newsgroup post goes:

"When I insert a blank value into a SQL Server database for a DateTime column, my database is inserting 1/1/1900 even if I assign Null to the variable in my application."

When you are inserting data into a database, the ADO.NET data providers and your database may distinguish between a null object and an uninitialized value for a specific data type. In this case, inserting a null into a DateTime column causes the database to seed the field with the default initialized value, 1/1/1900. What you really want is to tell the database that the field in question should remain uninitialized. For that there is the System.DBNull class, and you use its Value property, e.g. "System.DBNull.Value".

To insert a row into your database and maintain the uninitialized state of the CancelDate field, you use code like this:

SqlCommand cmd = new SqlCommand();
cmd.Connection = conn;
cmd.CommandText = "INSERT INTO USERS (Name, RegisterDate, CancelDate) VALUES (@Name, @RegisterDate, @CancelDate)";
cmd.Parameters.AddWithValue("@Name", "FeeFiFoFum");
cmd.Parameters.AddWithValue("@RegisterDate", DateTime.Now);
// Use System.DBNull.Value to keep the CancelDate field uninitialized
cmd.Parameters.AddWithValue("@CancelDate", System.DBNull.Value);
cmd.ExecuteNonQuery();


I've seen a number of approaches to this, but one engineered by Adam Anderson is clearly the best. In .NET 2.0, generics let us have one function for all the data types:

public static class CastDBNull
{
    public static T To<T>(object value, T defaultValue)
    {
        return (value != DBNull.Value) ? (T)value : defaultValue;
    }
}

To use this:

// Cast to string; you could pass either String.Empty or null,
// depending on what you want for the default value:
string s = CastDBNull.To(dr[0], String.Empty);
// Same class and method, this time inferring int from the default value:
int i = CastDBNull.To(dr[0], 0);

Some programmers prefer to use nullable types to handle DBNull, the reasoning being that using null to represent DBNull is better than using a "magic number" such as 0 to indicate DBNull.

However, there are times when you can't use nullable types, because you need to know the difference between having "no data" and null data. If you select a field from a row with certain criteria, there might be no matching row (so your field value remains uninitialized - null), a row where that field's value is DBNull, or a row with a non-null value. In cases like this, where you can have three different kinds of results, nullable types are difficult to "make fit".
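For comparison, here's a minimal sketch of the nullable-type approach (the helper class and method names are mine, not from any library):

```csharp
using System;

public static class DbFieldHelper
{
    // Maps DBNull (or a null reference) to a null DateTime?,
    // otherwise casts the boxed value straight through.
    public static DateTime? ToNullableDateTime(object field)
    {
        return (field == null || field == DBNull.Value)
            ? (DateTime?)null
            : (DateTime)field;
    }
}
```

Usage would look like: DateTime? cancelDate = DbFieldHelper.ToNullableDateTime(dr["CancelDate"]); - but note this still can't distinguish "no row at all" from "row with a DBNull field", which is the three-state problem described above.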


Getting Dugg: An exercise in audience understanding

This past weekend I finished putting together my "programmers" version of the Myers-Briggs (MBTI) test for online consumption. Actually, the "real" MBTI can only be administered by a licensed practitioner, and it's trademarked. However, over the years, Keirsey and a number of others have refined their own well-researched versions of this test, and those are not trademarked.

Consequently, with a little study, and common sense about how the test is actually scored, it is possible to put together a highly accurate version of the MBTI. So I posted this, along with a nice chart that links to the Wikipedia page for each of the 16 personality types, as well as 16 detail pages with data accumulated from a number of sources, over on eggheadcafe.com on Sunday afternoon.

I also submitted it to Digg, mostly because I had a "hunch" that it would fit pretty well with the Digg geek herd mentality.

Well! Within 5 minutes it already had 10 Diggs, and as of this morning (Monday) it had some 1,400 Diggs, had made the front page, and had some 300 comments, most of which were favorable. I've never gotten any of my "stuff" dugg more than six or seven times, so this was an epiphany of sorts.

They say it's the title. Of course "10 Best Ways...", "Amazing..." and similar buzz phrases can get you dugg, but it really takes the herd mentality to make it to the Digg front page (and if you do, you'd better have your web server and your pages running lean and mean, or you won't be there long).

My catchy title starts with "Are you a programmer?", and it was posted to the Programming section. I guess that combination of title and the natural curiosity of being able to take a free personality test online must have hit the right "DiggNerve".

Anyway, Analytics reported some 27,000 page views just for Sunday - a day when most people are watching football, and they certainly aren't at work.

On Monday, Dec 11, the "article page" had garnered 55,687 page views for the day and was responsible for approximately 56 percent of the site-wide Adsense revenue for Monday - all from the single 300X250 ad that appears at the top of the article. But the residual effect should be good as well, since many of the visitors were brand new and will come back repeatedly to visit our site for more good content. The only real downside is that the ISP bitched about the extra bandwidth!


Standards, Schmandards! - OpenXml vs Open Document Format

Microsoft's Office OpenXML has been approved as an Ecma standard and will now also be submitted for consideration as an ISO international standard. Ecma International announced the approval of the new standard on Dec. 6 following a meeting of its general assembly. Ecma will also begin the fast track process for adoption of the Office OpenXML formats as an ISO international standard in January 2007.

The technical committee, which includes representatives from Apple, Barclays Capital, BP, The British Library, Essilor, Microsoft, NextPage, Novell, Statoil, Toshiba and the U.S. Library of Congress, also boasts the membership of Intel, which recently joined.

Naturally, criticism of the new OpenXML standard was quick, particularly from those who support the competing OpenDocument Format, which has already been approved as an ISO standard. For example:

Bob Sutor, the vice president of Open Source and Standards at IBM, said in a blog posting that IBM "voted no today in ECMA on approval for Microsoft's OpenXML spec. I think we have made it clear in the last few months why we think the OpenDocument Format ISO standard is vastly superior to the OpenXML spec."

In actual fact, IBM's "no" vote was THE ONLY "no" vote.

But Ecma clearly disagrees with their view, saying in a statement that an increasing number of organizations around the world are interested in achieving document processing interoperability and creating digital archives using open formats.

"The Office OpenXML (OpenXML) formats provide an international open standard for word-processing documents, presentations and spreadsheets that can be freely implemented across multiple applications and platforms, both today and in the future," it said.

Vendors, including Corel, Microsoft and Novell, have already announced implementations of the OpenXML standard in their applications, such as WordPerfect, OpenOffice and Microsoft Office 2007.

Wikipedia has a nice comparison of the two, er, "standards", along with two sets of "Advantages of XXX" lists here.

What do I think? I think it's almost over, and Microsoft won. One thing I have learned in my short happy career as a programmer and software architect is that just because something is open-source and doesn't have the name Microsoft in it anywhere doesn't always mean it's the best thing for the broadest population of users. If you are Microsoft, they are gonna bash you even when you try to do the right thing and support standards that will work for the "greatest good". In particular, I find Mr. Sutor's pronouncements somewhat two-faced: Microsoft voted "yes" for ODF at ISO, and offered no resistance, yet Sutor / IBM freely admit that it will take 3 revisions just to get ODF to be OpenXML compatible. But don't take my advice - read the Wikipedia comparisons and come to your own conclusions.

Oh, well... the nice thing about standards is that you get to pick the one you like the best...

Ready for Your Spanish-American War Tax Refund?

Er, "yippee" -- the IRS is going to return money collected from our phone bills that was supposed to pay for the Spanish-American War.

The Federal Excise Tax, which was enacted in 1898, amounts to about $3 per month on an average, say, $100/month phone bill. Heavy phone users might pay $100 or more per year. Yep, this was actually to pay for the Spanish-American War, and we've all been paying for "it" since 1898.

Fortunately, once this ludicrous tax started getting some legs in the press, no one could really defend it and the tax has indeed finally come to an end. We're even being offered refunds:

You are to claim the refund on the 2006 tax form that you file in 2007.

You can opt for a standard refund of $30 (if you have one exemption), $40 (if you have two), $50 (if you have three) or $60 (if you have more). This option requires no documentation from you.

If you have (or want to go through the trouble of procuring) your telephone bill statements from March 2003 to July 2006, you can get a refund based on amounts you were actually charged. In most cases, this can amount to a lot more than the standard refund -- perhaps as much as $100 to $300 for many of us. You need to fill out IRS Form 8913 for this.

Let's see, I'll get out my "Fawlty Math" calculator. 2006 minus 1898 is 108 years, times $36 a year, times 200 million Americans.... equals a $50 refund. Yep, works out PERFECTLY!

Folks, this completely, utterly idiotic example is just the tip of the iceberg. I have never liked US taxes. I've never gotten along well with Uncle Sam, and I don't like him any better now. Trust me, he can make your life miserable if you aren't careful.

The reason we have this incredible waste is that Americans put their elected representatives (Republican, Democrat, and Lieberman) into Congress and then go promptly to sleep, completely oblivious to the greedy abuse of power that ensues. Congress taxes, Congress spends, and it spends more than it gets, but who cares! It's kinda like:

"And it's 1, 2, 3 what are we taxin' for?
Who cares, I don't give a damn
Next stop is la-la land."

What happened to the "Contract With America"? The Republicans conveniently forgot about it, and that's why the stupid morons got booted out in the last election. No, I think the Iraq war midterm election thing was just a cover for plain bad governance. The American People are smart enough to realize we are going to be in Iraq for a very long time. Look, we still have 40,000 troops in Japan, and 30,000 in Korea after 60 and 50 years respectively. But, wars as a percentage of GDP are small compared to the reallyreallydumb stuff that Congress spends money on.

Democrats want things to be "more fair" by increasing the percentage of tax the rich pay, and that's total BS. If I make $2 million a year and you make $200 thousand, and both our tax rates are 20 percent, then I ALREADY pay ten times as much in taxes as you do, and there is no reason to increase my tax rate to 40%. Close the loopholes the rich use, and you can even lower all tax rates across the board.

What we NEED to do is lower the tax rates for EVERYBODY, WAKE UP, and stop letting Congress throw money down the fyookin' TOILET! Taxes SUCK. We don't need them, at least not the way it's set up now.

Maybe we could get the Chairman of the Federal Lubrication Board, Alan Greasepan, to fix things up?


More Windows Vista Goodness for Developers

There is a page on MSDN that details and has links to:

Visual Studio on Windows Vista FAQ

Visual Studio on Windows Vista Issue List - normal permissions

Visual Studio on Windows Vista Issue List - Elevated Permissions

Visual Studio.Net 2003 on Windows Vista

Also, you can use the Visual Studio and .NET Framework bug reporting site both to submit issues, and to look for issues already submitted, some of which may have fixes or workarounds.

PC World has a "Windows Vista FAQ" they call the "Ultimate Guide". It's a bit more consumer-ish, but worth a look.

And, "Windows Vista Security News" has some worthwhile stuff to look at.

In Other News...

Victoria's Secret, in an unusual environmentally sensitive move, announced they will be cutting down on the amount of paper used in their racy catalogs. They did not specify how this would be accomplished. Hmmm... skinnier models, maybe?


What's in a [domain] Name?

Domain names are - well - important as the lingua franca of the internet, so a quick review of some selected top-level domains may be appropriate.

The domain you choose has more ramifications than just search engine performance. The problem with strange TLDs is that:

  • They can confuse visitors
  • They are almost always more difficult to remember than .com, unless they spell something or sound like a word or phrase.
  • They can tend to make your organization or site appear less reputable than you actually are.

Here are some choices, and my comments:

.com
The ubiquitous "everything bagel", .com, is the TLD you want. Assuming, of course, the one you want isn't already taken! Some of the hardest .com domains to find are "short" ones. Try to find a .com domain like "whiz". You can't. Even the .net versions are already taken!

.net / .org
These two other non-country-specific TLDs are good second choices, if you can get one. But they lack the familiarity of a .com, and for some sites that's unacceptable. However, for certain applications or audiences a good easy-to-remember .net or .org address can be cool.

.us
This TLD was made popular by sites like del.icio.us. The .us TLD is great for those targeting a primarily US-based market. But unless, like del.icio.us, the full site url actually does something like spell a word, it probably has little appeal, because it simply won't ring off the tongue as "familiar".

.biz, .info
Early adoption by spammers and other less reputable operators has tarnished the .biz and .info domains.

.name
These are intended for individual use, but .name has never really caught on, and so it just doesn't "cut it". I could never imagine a "peterbromberg.name". Could you? Doesn't even sound like "The Internet".

The others

There are hundreds of other country-specific and industry-specific domains available, but most lack the recognition required, so for a global site it's usually safer to stick to a generic TLD. A few country codes have gained credence in niche areas, like the Federated States of Micronesia (.fm) for music sites, Tuvalu (.tv) for TV sites, and Tonga (.to) as in "kickme.to".

Stick to .com if you can. The .net TLD is a good second choice. If you're interested in the official list, here it is.


Windows Vista Defrag? NOT!

One of the so-called "nice" new features of Windows Vista is the "rebuilt" defrag engine. Problem is, I don't like it. Why? I like to SEE what's being defragged, and I like to SEE a visual representation of what my filesystem looks like.

The main reason for this is that I can choose different defrag methods (such as with O&O Defrag) and get better file ordering. Also, when I get to see what's happening, it helps me to identify files I know I don't need and I can delete them, and do a follow-up defrag.

Unfortunately, O&O doesn't have anything out for Vista yet (yes I know you can Orca the MSI, but I ain't doing that!). Diskeeper isn't ready for Vista either.

Frankly, I don't know what these people have been doing all this time. They knew Windows Vista was coming out, the defrag API has been readily available to them to get their products ready. What, were they waiting for Godot?

At any rate, Raxco, which makes PerfectDisk, has a free 150-day trial of their Vista-compatible beta, and it works great. Also, Auslogics has a free defrag that is Vista-compatible, but it doesn't offer control of the defrag method, or boot-time defragmentation.

Suggestion: Delete your Paging file before defragging (requires a reboot, and don't forget to press the "SET" button). You can then do the boot-time defrag, which takes care of your MFT and System files, and have a nice fast hard drive. Then, go back to Control Panel / System and restore your Paging file.

Always run the Disk Cleanup Wizard before defragging, and get rid of unwanted .tmp and .bak files.

Happy Vista-ing!


HTTP referer spoofing, cookies, User Agent strings

There was a post on one of the groups recently by a developer who was making a number of WebRequests for various pages, claiming that one of them would strangely fail.

Yet, this individual stated that if he pasted the respective URL into his browser, the page would come up just fine.

There are several things that could come into play here with various sites:

1) Many sites will reject a request whose User-Agent doesn't match one or more particular User Agent strings (here are some samples, if your memory is rusty). So you can add the UserAgent header to the WebRequest. I've even seen some wise-asses who detect Internet Exploder and give you a nasty message about how immoral you are, and how you should go download Firefox to become a real person, and how dare you try to view my site with... etc.

(Listen, Pal: I already figured out you are a Webtard, so I'm not going to bother to fire up my copy of Firefox to see your page, which I already know is worthless. Besides, your little shenanigan is currently restricting you to only about 15% of your potential audience. Go take a good course in Marketing.)
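As a minimal sketch of point 1 (the url and the User-Agent string below are just examples - substitute whatever the target site expects), setting the header from code looks like this:

```csharp
using System;
using System.Net;

public class UserAgentDemo
{
    // Build a request that identifies itself as IE 6 on Windows XP.
    // The User-Agent string here is a hypothetical example.
    public static HttpWebRequest CreateSpoofedRequest(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)";
        return request;
    }

    public static void Main()
    {
        HttpWebRequest request = CreateSpoofedRequest("http://www.example.com/");
        // GetResponse() would now send the spoofed header along with the request
        Console.WriteLine(request.UserAgent);
    }
}
```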

2) Many times a site is looking for a cookie. Perhaps they set it when you first visit, and subsequent pages look for it. So if you make a request for a "Subsequent page" without the required cookie, you get "Bupkis". Some cookie container code:

CookieContainer myContainer = new CookieContainer();

// the following line adds a cookie to the container, which will be sent for all urls on the domain myDomainstr
myContainer.Add(new Cookie("name", "value", "/", myDomainstr));

HttpWebRequest request1 = (HttpWebRequest)WebRequest.Create(httpUrlString);
request1.CookieContainer = myContainer; // use this same container for all requests
HttpWebResponse response1 = (HttpWebResponse)request1.GetResponse(); // you can check cookies on response1.Cookies

// next request coming --
// all cookies received on request1 are automatically included in this request from the same CookieContainer
HttpWebRequest request2 = (HttpWebRequest)WebRequest.Create(httpUrlString2);
request2.CookieContainer = myContainer;
HttpWebResponse response2 = (HttpWebResponse)request2.GetResponse();

3) Another common issue is redirects. Here are a couple of settings you can use:

webrequest.AllowAutoRedirect = [true|false];
webrequest.MaximumAutomaticRedirections = 30;

You can also capture the redirect url:

public virtual string GetRedirectURL(HttpWebResponse webresponse, ref string Cookie)
{
    string uri = "";
    WebHeaderCollection headers = webresponse.Headers;

    if ((webresponse.StatusCode == HttpStatusCode.Found) ||
        (webresponse.StatusCode == HttpStatusCode.Redirect) ||
        (webresponse.StatusCode == HttpStatusCode.Moved) ||
        (webresponse.StatusCode == HttpStatusCode.MovedPermanently))
    {
        // Get redirected uri
        uri = headers["Location"];
        uri = uri.Trim();
    }

    // Check for any cookies
    if (headers["Set-Cookie"] != null)
    {
        Cookie = headers["Set-Cookie"];
    }

    return uri;
} // End method

4) Another common technique (this one is real popular with "those" sites) is to check the HTTP Referer. That's available (in ASP.NET) with the Request.UrlReferrer property. They do this as a sort of "poor man's authentication" - the idea being that you got in at some "gateway" page with your credentials, and now they figure that you could only be requesting one of their pages from within one of their sites that you "got into", so they look for one or more referers. Here's some code to handle this in a WebRequest:

public string GetUrl(string url, string referer)
{
    // assumes a fully-qualified "http://" url
    HttpWebResponse webResp = null;
    HttpWebRequest HTTPGetRequest = null;
    StreamReader sr = null;
    string myString = String.Empty;

    HTTPGetRequest = (HttpWebRequest)WebRequest.Create(url);
    HTTPGetRequest.KeepAlive = false;
    HTTPGetRequest.Referer = referer;
    webResp = (HttpWebResponse)HTTPGetRequest.GetResponse();
    sr = new StreamReader(webResp.GetResponseStream(), Encoding.UTF8);
    myString = sr.ReadToEnd();
    sr.Close();
    webResp.Close();

    return myString;
}
Happy Spoofing!

Inspector Rae and the case of the Incredible Shrinking DIV

This one popped up on the asp.net newsgroup, and I think it's good for a chuckle!


More Windows Vista: The Saga Continues...

"Linux sucks twice as fast and 10 times more reliably, and since you have
the source, it's your fault."
-- from Google Codebase Search

I have Vista Ultimate running on two PC's now, my notebook, and my "Main machine" where it resides on one hard drive, dual booting with Windows Server 2003 Enterprise x64 Edition on the other drive.

I'm getting to like Vista so much that I've even changed the Outlook default .Pst file to the one from the x64 OS so that I'll have the same Outlook data whether I boot into Vista or Windows Server.

At this point, I have everything pretty much set up the way I want, and I've gotten past a few of Vista's quirks with security and such to the point where I feel happy with the OS.

There are some plus items I've noticed about Vista, and also a few minuses:


The Pluses:

1) It boots FAST. Much faster than Windows Server.
2) You can put it to Sleep (like "Hibernate"). Your computer's power light goes out, and it "looks like" it's off. But when you press CTRL-ALT-DEL, it comes to life in just seconds and all your stuff is still there. A big convenience feature.
3) It appears to be pretty good with memory and CPU management, especially with more than one program going.
4) Speech Recognition. They have made SUBSTANTIAL improvements. You can sit there with a headset mike and literally do anything you want, hands free. All it takes is a little study. "What can I say?"
5) the Copy folders/files dialog has been enhanced with conflict resolution and similar useful features.
6) For Developers, IIS 7 provides a number of enhancements, more programmatic API control, and improved stability, diagnostics, and throughput.

The Minuses:

1) The sidebar is still buggy and it's a real memory hog. However, I'll give it the benefit of the doubt for now, since I probably will be developing a few freebie Sidebar Gadgets soon!
2) The Windows Mobile Device thing that replaces ActiveSync doesn't support Bluetooth. Apparently, corporate customers bitched to Microsoft, and they took it out. That's the pits, and you can't even install ActiveSync anymore.
3) The Copy folder/files dialog, because of the enhancements and computations of estimated time, etc. is now considerably SLOWER than in Windows Server 2003/ Windows XP.
4) The Microsoft Marketing Machine ("MMM") has completely failed me in describing whether Vista "Ultimate" is really the highest-end server-type product, or whether there will actually be a Longhorn Server that is still to appear. I strongly suspect that I am not the only person who is confused about Microsoft's proclivity for tinkering with the nomenclature, to the total and utter confusion of the end user.

Anyway, there will be more, I just wanted to get this started, and it'll be updated over the long weekend and beyond.

N.B. -- My first Sidebar Gadget: "Feed Search".

Linux What?

As if that weren't enough, have you ever searched Google Codebase? There are some pretty funny results. You can search for "Windows Sucks", but you can also search for "Linux Sucks" and get 5 pages worth of results!


The Evolution of a Programmer

Usually around major holidays I become a bit more reflective and I "Reflected" recently on my kinda / sorta "evolution as a programmer".

I started programming seriously on an Apple IIe (and a bit on Commodore 64's) - at the time I was a broker with Merrill Lynch in Orlando, and I was fascinated by the Technical Analysis Department up in New York. These guys, like Bob Farrell and Phil Rettew, who've become legendary, would post their daily market calculations on the Quotron screens (for those who aren't old enough to know, the Quotron system was a hard-wired network with small monitors and keyboards for the brokers. The monitor had a screen with one glorious color - "puke green").

I started out copying down the TRIN, put/call and other indicators onto graph paper with colored pencils. It was fascinating (so fascinating that I eventually left the business when I realized I was more interested in technical analysis than sales!). While other brokers in the office would get excited about having some Muni bonds they could get a full point commission on, I was ranting about the put/call ratio making a new high.

There was an older broker in the office who had a Trash 80 with a printer. He wasn't much of a programmer, but he would let me write BASIC code that would actually print out these indicators as rudimentary graphs on perforated dot-matrix printer paper, and man - was I ever hooked! D00d, you wouldn't want to short GOOG at $500 without these graphs!

So I saved up and got my Apple IIe with the two floppies and the big cables and a dot-matrix printer. 64K of RAM! Anyway, I was fortunate because one of my customers was an engineer at what was then Martin Marietta in Orlando (now Lockheed) and he would help me with learning BASIC. I almost got divorced 'cause the Significant Other could not understand what I was doing up at 3 AM writing code...

So my first real entry into the programming world was driven by a need - to be able to analyze, manipulate via mathematical computations, and graph stock market data (this eventually led to a Ph.D. in economics).

Spaghetti Code

My first programs were what I now refer to as Linear Buitoni. Everything was "inline" - from front to back. Then I learned you could isolate code blocks into what were called Subroutines - that could be called from anywhere in your code. This was nice because if you had 10,000 points of daily data and you needed to compute a slow stochastic index on it, you could pass each day's data to your subroutine, have it do the computation, and get back the result for that day's worth of data.

Oh, and let's not forget the glorious "GOTO" statement. That REALLY made things fun! My Grandmother always said "The way to understand recursion is to understand recursion." Bless her heart - she managed to recurse her way up to 100 years old before she finally threw a stack overflow.

It wasn't until much later, working with FoxPro 2.6 and writing much more organized code for real business applications that I began to think in terms of "Structured Programming" concepts. Try writing an Australian Binary MLM commission program in Foxpro and you will learn what "structured" means, really fast.

And, although I dabbled in JAVA and C++ in the 90's, I never really got the concept of OOP (Object Oriented Programming) until .NET came out in 2000. That's when things changed. One of the biggest challenges for programmers who started out in, or are coming from, the Visual Basic "space" is UNLEARNING all the "bad habits" and crutches they have come to rely on. For many, this takes a continued, concerted effort, and some never get out of the bad programming habits and thinking. This is particularly unfortunate for VB.NET programmers, because Microsoft decided to bring forward a huge amount of "backwards compatibility" and "crutches" in the VB.NET programming language. If Option Strict and Option Explicit are left off by default, how many VB programmers do you think are going to "explicitly" learn why they should both be turned "on"?

Being largely self-taught as a programmer is not for everyone. You have to be really, really motivated to begin with. There are a couple of other skills I've acquired and carefully honed along the way that have helped me:

1) Learn to search on the web (google, Live.com, whatever). The web is your personal "RTFM". The better a searcher you are, the less frustrated you will be.

2) Learn how to ask questions - whether it's a post on a newsgroup, or a verbal question to a peer, you must be able to express yourself clearly and concisely, stating the exact problem and what information you need. When I look at some of these newsgroup posts (e.g., "Urgent-Please Help me") - posts that ask for nebulous things and have no sample code - I often wonder how anybody could even bother to respond.

So anyway, I thought I would share this tidbit of history. Happy Thanksgiving (or whatever your favorite holiday is). And, thanks for reading!


Yahoo, Google and Microsoft Team Up on Sitemaps

Yahoo, Google, and Microsoft have all announced that they’ve agreed to set a standard for sitemaps.

DiggSpeak Translation: "Amazing! Top Ten Reasons to use Sitemaps"

Sitemaps are those XML files that list all the pages on your Web site. Search engines like to have all the listings in one place so that a site can be indexed without anything being missed.

The protocol has now been released under Creative Commons, so any search engine can pick up on it if they like.

Most webmasters / developers and web site owners use sitemaps, and there is plenty of sample code to generate these dynamically.

We use sitemaps on our Eggheadcafe.com site, and I believe they result in much better indexing. Plus, you can specify how often the bots should crawl, and what the priority is of each item. For more complex sites, you can have a SiteMapIndex file in your website root, which has entries that point to any number of other individual sitemap files. So for example, you might have a messageboard or Forum section and create a separate sitemap for that out of your database daily. Then, you might have another sitemap for your "regular" content such as articles, that gets updated weekly. Your index file would point to both of these, and the bots will happily crawl them.
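As a sketch of that layout (the file names, urls, and dates below are made up for illustration), a SiteMapIndex in the website root might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemapindex.xml in the website root; the two files it points to
     are hypothetical examples (forum sitemap rebuilt daily, articles weekly) -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-forum.xml</loc>
    <lastmod>2006-11-16</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-articles.xml</loc>
    <lastmod>2006-11-12</lastmod>
  </sitemap>
</sitemapindex>
```

Each individual sitemap file then carries the `<url>` entries, with optional `<changefreq>` and `<priority>` elements - which is where you tell the bots how often to crawl and what matters most.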

Sitemap files can also be GZipped, which cuts down on bandwidth and the bots can load them faster. With .NET, just use the System.IO.Compression namespace.
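A minimal sketch of GZipping a sitemap with System.IO.Compression might look like this (the file names are hypothetical; the manual copy loop is because Stream.CopyTo didn't exist in .NET 2.0):

```csharp
using System;
using System.IO;
using System.IO.Compression;

public class SitemapGzip
{
    // Compress sourcePath into gzipPath (file names are up to you)
    public static void Compress(string sourcePath, string gzipPath)
    {
        using (FileStream source = File.OpenRead(sourcePath))
        using (FileStream target = File.Create(gzipPath))
        using (GZipStream gzip = new GZipStream(target, CompressionMode.Compress))
        {
            byte[] buffer = new byte[4096];
            int bytesRead;
            while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                gzip.Write(buffer, 0, bytesRead);
            }
        }
    }

    public static void Main()
    {
        // Hypothetical file names -- point these at your real sitemap
        Compress("sitemap.xml", "sitemap.xml.gz");
    }
}
```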

The nice thing about sitemaps is this - Bots only know how to do one thing: follow links. They can't follow Javascript; they can't follow images, and they can't follow dropdownlists of links either. It has to be an anchor "A" tag, with an "href" attribute to make the spiders happy. Yep, they are just "reallyreallydumb".

If you have content in your site that's not linked to from another page, or which only comes up because of a database search, it's not likely to get indexed at all. But if you put the url into a sitemap file, the bots will find it, crawl, and index it. Think about it - you may have content that is dynamically generated out of a database, and doesn't sit on the filesystem at all. With the correct sitemap element, that content, which ordinarily would be invisible to the spiders, will be successfully crawled and indexed by the major search engines. That means more hits, and more revenue if you serve advertising.

Sitemaps are your friend. Now, with Yahoo and Microsoft on the bandwagon, they'll be more important than ever.

Yahoo is expected to begin using your sitemap(s) on Thursday, with Microsoft picking them up early next year.


VISTA RTM: "Windows could not update the computer's boot configuration." - And BCDEDIT For Dummies

Vista RTM is out for MSDN subscribers, so I figured it would be as good a time as any this morning to install it on my second drive (the one where I had an x64 version of Windows XP that I hardly ever use.)

So I booted off the DVD and asked Vista to install itself "new" (not an upgrade) on this drive. I've already had some experience with this in the betas and I figured it would be cleared up by RTM, but no joy. About 85% through the expanding files phase you get a dialog that says "Windows could not update the computer's boot configuration." and that's the end of that.

Now there have been a number of so-called "Fixes" for this that involve a missing registry key, that go something like the following:

"This bug happens when partition manager is missing as upper filter for
disk. The following steps will fix this:

1. Open
HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E967-E325-11CE-BFC1-08002BE10318} using regedit.
2. Confirm that UpperFilters is empty and UpperFilters.bak is present.
3. Rename UpperFilters.bak to UpperFilters.
4. Reboot and re-attempt the upgrade"

However, in my case, this would not apply, since the Registry never comes into play - I am doing a clean install, not an upgrade.

So far, it's still very early in the "game" and the Windows Vista MS newsgroups have no information of any value on this (not for it happening on a clean install).

I suppose I could move my various folders and "Stuff" to the main drive temporarily, format the little booger, and try it that way, but that's an inconvenience I'd like to avoid.

Anyway, if you are finding this same issue comment here and we'll try to pin it down.

(P.S.) "Your upgrade may take several hours to complete." -- Jesus H. Christ! They aren't kidding, man! Go get a big spaghetti dinner and some wine!

N.B. - And the solution, as indicated above, was to copy my folders of "stuff" (music, videos, backups and other junque) to the main drive. Then, I booted off the Vista DVD, selected my second drive, allowed it to FORMAT the booger, and everything went fine from that point on. Moral of the story? Even with a clean install (not an upgrade) if your selected hard drive has folders and "stuff" on it that Vista doesn't like, it may not install until you clean it off and or format the drive.

HINT: Windows Server 2003 x64 can copy entire folders of files from one drive to another a lot faster than Windows Vista x86 can....

Now the only issue left is how to configure BCDEdit on the Vista hard drive to allow for booting into the legacy OS. Here's my post on the Windows Vista newsgroup, if anybody wants to follow the saga. Maybe, "BCDEDIT For Dummies"?

And the answer:

Within a couple of hours, John Barnes posted this:

"Download and install a copy of VistaBootPro or BCDEdit in Vista and with the
Vista drive first, set up a legacy os. After that if it doesn't load, on my
system I have always had to copy the ntldr, ntdetect.com and boot.ini files
from the other drive to the root of the Vista boot drive and adjust the
boot.ini as necessary to point to the NT based drive."

I had already "done that", but as I tried again, I realized that if you change the hard drive boot sequence in your BIOS, the entry in boot.ini (on the Vista drive) would need to be changed. Here's what fixed it (the old, and then the "new"):

[boot loader]
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003 Enterprise
x64 Edition" /fastdetect /NoExecute=OptIn
C:\CMDCONS\BOOTSECT.DAT="Microsoft Windows Recovery Console" /cmdcons

[boot loader]
[operating systems]
multi(0)disk(0)rdisk(1)partition(1)\WINDOWS="Windows Server 2003 Enterprise
x64 Edition" /fastdetect /NoExecute=OptIn
C:\CMDCONS\BOOTSECT.DAT="Microsoft Windows Recovery Console" /cmdcons

With rdisk(1) in the new, we are good to go! And of course, we can use VistaBootPro in Vista to make either one be the "default".

One other pointer: If you do an upgrade with Vista, it's going to leave nasty temporary folders filled with multi-megabytes of useless leftover junque on your hard drive. If you run the Disk Cleanup wizard, and select all the checkboxes, that will do a fine job of cleaning you up.

Disabling Those Nasty Security Dialogs

This is, of course, not recommended, so you should consider the implications carefully. In my case, I'm behind a firewall (my router's, not "Windows Firewall", which I do not need), nobody else uses the computer, and I've got Windows Defender and AntiVirus running on the machine.

As users have come to know, Windows Vista runs with User Account security checks on everything, and frankly, although it certainly makes sense, it can get a little ridiculous IMHO. So, to disable these nasty, scary dialogs:

(WARNING!! This will disable most of Windows Vista’s new security features):

Go to Control Panel.
Click Admin Tools.
Click System Config.
Click Tools.
"Disable UAC" - requires a reboot.

Bye, bye, fellas. This may also fix a whole slew of minor security-related issues with Access Denied messages on the wwwroot folder and subfolders, and debugging with Visual Studio 2005 in ASP.NET 2.0 with IIS 7.0.

And, don't forget, you Vista Ultimate folks - IIS 7 will now offer you the ability to do what you'd expect on a "real web server" (which it is) - have multiple web sites. Just add some Class C (192.168.0.XXX) IP addresses to your stack, make a new Website in IIS, and give it an IP address. You can put the friendly name (e.g. "mysite2") into the hosts file so you can now do "http://mysite2/"
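As a sketch (the IP address and site name here are just examples): after adding the address to your NIC's TCP/IP settings and binding the new IIS Web site to it, the hosts-file entry is one line:

```
# C:\Windows\System32\drivers\etc\hosts
# 192.168.0.50 is a hypothetical address you added to the stack and
# assigned to the new Web site in IIS Manager
192.168.0.50    mysite2
```

With that in place, "http://mysite2/" resolves locally to the second site.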

Now, you don't need to put up with that stupid restriction from Windows XP that only lets you have "one" real web site on the machine (not to mention the 10 connection limit, which you'll be pleased to note is also history).

The Final Analysis

When all the dust is settled, I just have one thing to say: Windows Vista is a solid, really awesome operating system. I like it. In fact, I kind of feel sorry for all those Anti-Microsoft, bash-Windows Penguinistas - they will never have an OS as robust, easy to use, and powerful as Windows Vista. Actually, they are secretly doing what "Fred" did in the image above - although they'll never admit it, because they are total dweebs! I have nothing against Linux - I use it, primarily to do "MONO" .NET development. But when you want to compare based on speed, ease of use, ease of configuration, and a whole host of other features, there is simply no comparison - Windows Vista is a clear winner. And when you start getting into this TCO - "Total Cost of Ownership" deal, the Penguinistas are gonna be left in the dust on this one. An AK-47 is no comparison with an M-16.

Einstein's Greatest Blunder Not a Boo-Boo?

In other news, the Hubble Space Telescope has shown that a mysterious form of energy first conceived by Albert Einstein, then rejected by the famous physicist as his "greatest blunder," appears to have been fueling the expansion of the universe for most of its history. Experts say the upcoming Hubble repair mission could extend our view up to 11 - 12 billion years back, which is getting pretty close to the "beginning". Dark energy.


What's Happening in the Browser Space?

"We need to stop problems when they are small"
-- Benjamin Netanyahu, referring to the Iranian nuclear effort

I thought it would be interesting to post some stats from google analytics on current browser usage. This info comes from our eggheadcafe.com site, which tends to attract a larger percentage of Microsoft devotees, so your mileage may vary.

First, a chart of major browser usage:

As can be seen above, Internet Explorer holds 74.06% of our visitor market, with Firefox at 23.05%. The version breakdown:

IE 6.0 - 76.84%
IE 7.0 - 22.68%

Firefox 2.0 - 46%
Firefox 1.5.0.8 - 31.65%
Firefox 1.5.0.7 - 13%

According to OneStat, the November 6 statistics:

The most popular browsers on the web are:

November 2006
1. Microsoft IE 85.24%
2. Mozilla Firefox 12.15%
3. Apple Safari 1.61%
4. Opera 0.69%
5. Netscape 0.11%

One of the things that irks me is that your typical Penguinista Anti-Microsoft Firefart aficionados are always pointing out that IE is full of security holes. Well, the Bugzilla database for Firefox developers currently has over 200 open entries, some of them marked "critical". If you were a hacker, would you go after the guys with 12 percent of the market, or the guys with 85%?

I think it will be interesting to see what happens as IE7 takes hold, as well as what happens when Vista gets into circulation.

There's one bright cloud on this horizon: If I code for IE and Firefox, I can be confident of reaching about 99% of our visitor market. Less is more!


Why I like Web Application Projects vs. WebSite Projects in Visual Studio 2005

Like myself, many developers found migrating Visual Studio .NET 2003 applications to the new Web site model in Visual Studio 2005 impractical, especially because precompiling (publishing) a Visual Studio 2005 Web site creates multiple assemblies. Lots of other complaints surfaced - too numerous to mention - but the good news is that "Mr. ASP.NET" (Scott Guthrie) and his team responded with the new Web Application Project add-in, and it's vastly improved, even over the original VS.NET 2003 model. This was all in response to developer feedback (or screams of bloody murder, if you prefer), and the final release came out about May of this year, just months after the initial release of Visual Studio 2005.

However, I see from forum and newsgroup posts that a significant number of developers have obviously either not yet found the Web Application Project add-in, or they aren't yet convinced of its benefits.

The new Web Application Project model is particularly well suited when:

  • You need to migrate large Visual Studio .NET 2003 applications
  • You need to control names of output assemblies
  • You need stand-alone classes to reference page and user control classes
  • You need to build a Web application using multiple Web projects
  • You need to add pre-build and post-build steps during compilation
  • You, like me, just decided you don't particularly like the WebSite app!
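On the build-events point: pre-build and post-build steps are just command lines stored in the project file, editable under Project Properties > Build Events. Here's a rough sketch of what lands in the .csproj; the file names and paths are purely hypothetical:

```xml
<PropertyGroup>
  <!-- Runs before compilation starts; path is hypothetical -->
  <PreBuildEvent>copy "$(ProjectDir)Resources\version.txt" "$(TargetDir)"</PreBuildEvent>
  <!-- Runs after a successful build; staging path is hypothetical -->
  <PostBuildEvent>xcopy /y "$(TargetDir)$(TargetFileName)" "C:\Staging\bin\"</PostBuildEvent>
</PropertyGroup>
```

This is exactly the kind of thing the project-less WebSite model can't give you, because there's no project file to hang the events on.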

Only files that are referenced in the project file are part of the project, are displayed in Solution Explorer, and are compiled during a build. Because there is a project file, some scenarios are more easily enabled:

  • You can subdivide one ASP.NET application into multiple Visual Studio projects.
    Omar Khan has put together some really good material on this in a three-part "blogathon" which is, in my opinion, first-rate.
  • You can easily exclude files from the project and from source code-control.
  • You get more flexibility and control over what happens when you use the Publish option from Visual Studio.

The compilation model for Web application projects is very similar to that in Visual Studio .NET 2003:

All code-behind class files and stand-alone class files in the project are compiled into a single assembly, which is placed in the Bin folder. Because this is a single assembly, you can specify attributes such as assembly name and version, as well as the location of the output assembly.
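For instance, the assembly name and version live in ordinary assembly-level attributes in an AssemblyInfo.cs file; the values below are only illustrative:

```csharp
// AssemblyInfo.cs - attribute values here are illustrative, not prescriptive
using System.Reflection;

[assembly: AssemblyTitle("MyWebApp")]
[assembly: AssemblyVersion("1.2.0.0")]
[assembly: AssemblyFileVersion("1.2.0.0")]
```

None of this is possible with the multi-assembly output of a precompiled WebSite project, where you don't control the generated assembly names at all.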

Certain other application scenarios are better enabled, such as the Model-View-Controller (MVC) pattern, because Web application projects allow stand-alone classes in the project to reference page and user control classes. This is one of the biggest complaints developers have with the WebSite project model: not being able to "find" their UserControls, because of the unique build and environment semantics of the APP_CODE folder.

To run and debug pages, you must build the entire Web project. However, building the entire Web application project is usually very fast, because Visual Studio employs an incremental build model that builds only the files that have changed. I've seen large projects that took upwards of a minute under the WebSite model compile in as little as two to three seconds under the new Web Application Project model, so having to recompile whenever you change code should rarely be an issue.

Edit and Continue only works with the built-in development server, but I rarely use it except for quick "demo" projects, so that's of no particular consequence to me.

Because all class files are compiled into a single assembly, only that assembly needs to be deployed, along with the .aspx and .ascx files and other static content files.

In this model, .aspx files are not compiled until they are run in the browser. However, when used with Web Deployment Projects (a downloadable add-in to Visual Studio 2005), the .aspx files can also be compiled and included in a single assembly for deployment.

Each time you deploy the single assembly produced in this model, you replace the code for all pages in the project. That's helpful to me, because I've experienced situations where all the precompiled and dynamically compiled assemblies and pages in a WebSite project can get "discombobulated", causing some real problems. You can also drop a "script only" page into your app and it will work just as it did with ASP.NET 1.1, something you cannot do with a WebSite project -- at least not one that's precompiled. And I don't know about you, but I never could get comfortable with having my "*.cs" files out there on the web server.

I've already converted most of my "stuff" to the Web Application Project model, and in my opinion (YMMV) it's made my work a lot easier - especially site maintenance.

For ASP.NET 2.0 developers who have some trepidation about migrating WebSite model projects, I would encourage you to look at the excellent set of tutorials and information that Guthrie and his team have put together.

There are also some known issues that developers should review. None of them have been problematic for me, but then again, everyone's enterprise environment is different, so it would be wise to review the list. The Web Application Project and templates are included in Visual Studio 2005 Service Pack 1 (SP1), which is still in beta, but I use it.

As with most other good things in life, taking some time to RTFM will help you to avoid unnecessary questions later. After all, if somebody drives up with a new Bentley Turbo and hands you the keys, you probably want to sit up in bed and read the owner's manual, right?


Usability Studies, My Butt -- and Office 2007 Installation Woes

If you have worked with Microsoft products to any degree (I have; I was actually a beta tester for Microsoft's BASIC COMPILER back in 1985, before some current script kiddies were even born), then you know that Microsoft (and, to be fair, many other vendors) has developed a finely-honed penchant for buzzwords and name-changing. A big ingredient of this seems to be the year (hopefully) that the software was introduced.

I think "Windows 95" was the first one, but I could be mistaken. Followed of course, by Windows 98, Windows 2000, Windows Server 2003, Office 97, Office 2000, Office 2003, and now - (gasp!) - Office 2007. Frankly, with all the issues in the last few years, I wish they'd just learn to drop the year off the names and come out with it WHEN IT'S READY.

I speak with great trepidation, since the RTM is downloading from my MSDN Subscription as I write this. It took a long time to get used to some of the nice features of, say, Excel 2003 - features that went horribly awry in Office 2007. For example, if I wanted to chart some data, the chart icon was right there, up at the top. All I had to do was select the columns of data (holding down the CTRL key to select multiple columns), hit the chart icon, choose a chart type and bingo! Nice chart. Where is it in Office 2007? Well, I actually did find it recently - you have to choose "Insert" and then it shows up in the choices. Of course, the icon is completely different, turning what used to be a pleasure into a real learning curve and a chore. It's taken me weeks to find the drawing tools so that I can draw crummy support / resistance lines on my stock charts.

How about if you just wanted to undo something? I used to be able to just hit ALT-E-U (Alt-Edit-Undo). Dang! It's not there now. In fact, I haven't been able to find it yet, and I'm probably gonna start using this puppy tonight. All I get is this weirdo semi-transparent popup saying something like "Office Alt-Key combination started - press correct key to continue". Well of course I started an alt-key combination! What the HELL did you do to it? I want it BACK!

Soon, I'll be switching to Windows Vista. I shudder to think about it, because I know they've got new "stuff" in there and some of the "Classic" options that I probably would want simply aren't provided anymore.

This is what "usability studies" do to perfectly good software, folks. So, here's the deal. Can I send you a bill for my loss of productivity while I am re-learning what you've supposedly "improved", moving me from the familiar into the unknown? How about my psychiatrist's bill (he only speaks Portuguese)?

Heh. Don't hold your breath.

N.B. Uh Oh! It installed fine on my notebook, which only had Office 2003. The upgrade was flawless. On my "main box", which had Office 2007 Beta, I got a dialog:

"Setup is unable to proceed due to the following error(s):
The 2007 Microsoft Office system does not support upgrading from a prereleased version of the 2007 Microsoft Office system.
You must first uninstall any prerelease versions.
Correct the issue(s) listed above and re-run setup."

This is after I uninstalled everything from the Beta, and even used the latest version of the MSICUU.EXE utility to "cleanup" traces, and rebooted.

Here is "one" answer:


If you have Office Web Components installed (even Office 2003) you need to uninstall that first!

Well! I tried that, and I tried using the MSICUU "cleanup" utility to remove all traces of anything with "12" in its name, and it all STILL DOESN'T WORK.

And the Final Score:

Well, the last and final "culprit" was Microsoft Expression Web Designer Beta. Apparently, this uses some "Office 12 Stuff" and had to be removed. Removal wasn't easy, though - there was no Add/Remove item for it in Control Panel. I had to stumble through the lovely MSIs in Windows\Installer by most recent date order, mousing over each until I found the two offenders with the right metadata descriptions.

Then, a right-click and "Uninstall" (or, MSIEXEC /x pathtopackage) and they were gone, and Office 2007 ("12") presented me with a nice "enter your product ID" dialog.
Peter: 1, Betas: ZERO! Yay. It's us against them, man, and we are gonna win!

N.B. Mark Dawson was kind enough to post a comment with a link to Jensen Harris' UI blog about the "helper" links from Office 2007. The comment system blew away the long link, so I'm adding it here:



It Works on My Machine!

How many times have you heard this one? Or it might have been stated "It works in my Browser". It doesn't matter.

Wannabe Code Monkeys do this all the time. When you are developing code on your machine, you have an environment with certain attributes and settings that will not always be the same in the target environment - the user's machine, or a webserver.

One of the newest offenses is where developers create web sites using the WebSite project model in Visual Studio 2005, using the built-in Development Web Server. There are minor inconsistencies in behavior between this ASP.NET webserver and the real IIS. In fact, if you develop on IIS in Windows XP, I bet you use an IIS application (a VRoot that is below the actual root of the site - since Windows XP IIS only offers "one" web site), even though you are developing a full site that will usually be at the web root of an IP address on the target production server. Again, the behavior will not always be the same. You have sessions, cookies, settings, relative paths and more, some of which may not match up or behave exactly the same way they did on your developer machine setup. You could have a web.config file in a higher folder with HttpModule settings that cause your application to blow up because you didn't think the settings would be inherited.
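That config-inheritance gotcha is worth a concrete illustration. A child application can explicitly opt out of an HttpModule it inherits from a parent web.config; the module name below is hypothetical:

```xml
<!-- web.config in the CHILD application -->
<configuration>
  <system.web>
    <httpModules>
      <!-- remove a module registered in a parent folder's web.config
           so it doesn't run (and blow up) in this application -->
      <remove name="ParentUrlRewriter" />
    </httpModules>
  </system.web>
</configuration>
```

Of course, the only way you find out you need this is by testing on a server laid out like production, which is the whole point.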

One of the marks of the professional developer is to keep this in mind and perform sufficient testing - including enlisting friends or co-workers to help - in order to determine without doubt that your creation will indeed work the way you intended it to work -- in production.

I remember the first time I uploaded a web page that had image tags in it. When I looked at the page from the webserver, no images. Silly me - I had FILE:/// references in there. Of course it worked "on my machine!". Fortunately, I was still around to fix it - I hadn't left for home yet. Problem? The only place I had ever tested it from was my development machine, where all the image tags seemed to work just fine!

My point: Never "assume" that because everything works fine on your machine, it will in deployment; test thoroughly to be sure. If it's a web application, test it from other people's machines and with different browsers / versions. And do a "Smoke Test". A smoke test is a set of steps a user must go through that ensures all the major functions and features of an application are working as designed. Where I come from, nothing is in "production" and "done" until it has passed a comprehensive smoke test. Above all: Stay tuned to station WDTA. When you are going to make a change to an application, always stop and ask yourself first: "What Does This Affect?".

"In Production" is the worst possible place to have to test software or web sites. Unfortunately, a significant percentage of outfits do just that. They don't have a "test" environment, they don't have a QA department, and they don't have a testing regimen. They don't know what a "smoke test" is. Contractors come and go, they push untested stuff into production, and there's nobody in control to ensure quality. This results in embarrassments and a lot of extra lost time that costs money. And if you own the company, shame on you, because you just threw some of your profits down the toilet, and you may have lost credibility with your customer(s).

Testing your "stuff" in production means that it will be your users and clients who find the boo-boos for you. At best, that's extremely unprofessional.

Are you sure that's what you want? Test your stuff. Test it well. Sleep better.


Open Source Software and the CPL

I love the concept of open source software. I've contributed to it, I use it, everybody is getting hip to it; even big folks like Microsoft, IBM, and Novell are helping.

But the one thing that gets my goat is those licenses. Good God! For something that's supposed to be free, have you ever seen so much legalese in your life? Not only that, but it seems every Joe Developer and his brother has to come up with a new one - "Common" this, GPL that.

Here's my take: The CPL (Cool Public License):

Cool Public License

This software is yours. Do whatever you want with it, call it whatever you want, use it any way you want.
I/we have no blame for anything that happens, and you can't sue me/us. Thank you, and G'Bye!

Now, isn't that refreshing?


JLCA 3.0 - "Java Language What?"

This probably should come under the Third Base: "I dunno" category. Recently I've been playing with various kinds of content "generators" and Wikipedia came into the crosshairs. Wikipedia has a policy that you can reproduce their content, and a substantial portion of their content is actually very, very good and well-researched. There are over 130 listed sites that reproduce Wikipedia content in one form or another, some giving proper attribution, and many not even bothering. Answers.com is one of the biggest, and they do a nice job of it.

The problem is, if you do a Wikipedia title search and get the results back as XML (which they offer), it has a content node filled with that God-awful MediaWiki markup. At that point you have to find a way to convert it to displayable HTML, or it's not going to look very pretty. To the best of my knowledge, nobody has written a "Wiki2HTML" parser in C#.

So, in keeping with my smart developer philosophy of "don't reinvent the wheel", I looked around for some conversion apps - any language, thank you! There are a few very good ones; in fact, about the best one is actually written in client-side Javascript. Another good one is in JAVA. Some in Ruby, PHP, Python, Perl. Boo, anyone? Halloween is over. But, no C#.
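To give you the flavor of what these converters do, here's a deliberately minimal sketch in the same client-side Javascript vein. It is NOT one of the real converters - real ones handle templates, tables, nesting and a hundred edge cases - and the "/wiki/" link prefix is just an assumption for illustration:

```javascript
// Minimal MediaWiki-to-HTML sketch: a chain of regex substitutions.
function wiki2html(text) {
    return text
        // '''bold''' must be handled before ''italic'' so the
        // triple-quote pairs are consumed first
        .replace(/'''(.+?)'''/g, "<b>$1</b>")
        .replace(/''(.+?)''/g, "<i>$1</i>")
        // == Heading == on a line by itself
        .replace(/^==\s*(.+?)\s*==\s*$/gm, "<h2>$1</h2>")
        // [[Page|label]] then [[Page]] internal links
        // (the "/wiki/" href prefix is an assumption)
        .replace(/\[\[([^|\]]+)\|([^\]]+)\]\]/g, '<a href="/wiki/$1">$2</a>')
        .replace(/\[\[([^\]]+)\]\]/g, '<a href="/wiki/$1">$1</a>');
}

console.log(wiki2html("== Intro ==\n'''Bold''' and ''italic'' with a [[Main Page|link]]."));
```

The real converters are essentially this idea scaled up by a couple orders of magnitude, which is exactly why porting one by hand is no fun.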

Now, JScript.NET is no easy task for a C# developer. Once you have script that starts using prototype and function, JScript.NET has no idea what to do with it - at least not "out of the box". So, you'd need to be a real Javascript expert - I mean GURU-level expert (and I am not) - to convert it.

The next thing we tried was the JAVA .java class files. Did you know that the Microsoft JAVA Language Conversion Assistant 3.0, which is built into Visual Studio 2005, will load an entire folder full of these babies and happily convert them to C#?

Yes, it will. It even does JSP. Never mind that it makes a struct with static readonly fields as constants instead of an enum - that kind of stuff you can fix. But when you get into some of these wild-ass Visitor patterns, well --. Let me just say this: It's one thing to get the code to compile. It's a whole other ballgame to get it to WORK! And I don't think it's so much the differences in the languages, which aren't that great. It's those dang Patterns those JAVA D00ds use! The poor conversion assistant starts recursing and ends up with its head stuck up its butt!
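To illustrate the constants issue (the type and member names below are invented for illustration, not actual JLCA output): for a Java "constants" class, the converter tends to emit something like the struct, where a hand-written C# port would use the enum:

```csharp
// Roughly what the conversion assistant emits (illustrative):
public struct Color_Fields
{
    public static readonly int RED = 0;
    public static readonly int GREEN = 1;
}

// What you'd normally write by hand in C#:
public enum Color
{
    Red = 0,
    Green = 1
}
```

The struct compiles and works; it just isn't idiomatic C#, so you lose type safety and switch-friendliness until you fix it up.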

I think I'll just stick with the client-side javascript and attach the output to a div tag for now!


Web Application Project Issues 101: "Could Not Load Type..."

One *Extremely* common newsgroup and forum post I've seen recently revolves around "double compilation" of stuff that was left in the APP_CODE folder when a project is migrated from WebSite mode to Web Application Project. For starters, I'll quote directly from "Mr. ASP.NET" Scott Guthrie's blog tutorial:

"VERY, VERY IMPORTANT: Because ASP.NET 2.0 tries to dynamically compile any classes it finds under the /App_Code directory of an application at runtime, you explictly *DO NOT* want to store classes that you compile as part of your VS 2005 Web Application Project under an "app_code" folder. If you do this, then the class will get compiled twice -- once as part of the VS 2005 Web Application Project assembly, and then again at runtime by ASP.NET. The result will most likely be a "could not load type" runtime exception -- caused because you have duplicate type names in your application. Instead, you should store your class files in any other directory of your project other than one named "app_code". This will be handled automatically by the "Convert to Web Application" command. This command will rename the folder Old_App_Code."

This should be very clear, but apparently (as one would expect) a lot of developers don't take the time to read the tutorials that kind Mr. Guthrie has literally toiled over for your benefit! I should mention as a footnote that you need to choose the context menu item "Convert to Web Application Project" at the PROJECT level in your Solution Explorer node in order for this to occur.

There's another little "gotcha" that I bet you have run into, and there's a fix for that too, if you just take the time to "RTFM":

VS 2005 Web Application Projects don't automatically support generating a strongly-typed Profile class proxy. However, you can use this free download to automatically keep your own proxy class in sync with the profile configuration. Now, I'll be the first to admit that this little guy can be very tricky in getting "Profile.Common" to come alive, but once you understand what it is doing, you'll have taken the first step to "Conversion"!

Moral of the story: Not to sound like a broken MP3, but "Read the Manual" (in this case, the nice tutorials!). Or, phrased another way: There are two kinds of professional developers in the world: Those who take the time to read the manual -- and those who take the time to read the manual.

There is no other way. More later.