2/29/2008

On Programming: Lateral vs. Vertical Thinking

This is an issue that I personally deal with almost every day: How developers approach a programming problem from the standpoint of THINKING.

I bring this up because a co-developer recently asked for my help overriding a base class property in an RSS library, and it turned out that the version of the library he was using was, more or less, defunct. Yet he doggedly kept trying various ways to override a base class property of type Image that returned a string, hoping to coerce it programmatically into returning a different type without having to disturb the base class library. This was ostensibly because the base type was already baked into a production application that used it, and he wanted to change its behavior without touching the base class library.

Sound familiar?

We, as programmers ("Developers", if you prefer) are paid to THINK. The way we think has, in great part, a deterministic influence on what we produce -- how innovative it may be, how well-suited it is to the problem domain, and how effective and profitable our efforts may be to the end user. That end user might be your boss, your company, the customer, or the general visitor from the Internet, as the case may be.

Many programmers simply have not learned how to THINK. Thinking is a science, and it can be learned through study of the right materials. Comprehensive thinking is simply NOT something that you are born with. You actually have to study and learn correct thinking strategies.

Vertical Thinking

Americans have a great propensity to use only vertical thinking techniques -- that is, logical / critical thinking. You could characterize vertical thinking as continuing to dig the same hole deeper to try to get under a fence. It's tunnel vision, characterized by a rigid, change-resistant "this is how we solve problems" development process.

Lateral Thinking

Lateral Thinking is:

  • seeking to solve problems by apparently illogical means
  • a process and willingness to look at things in a different way
  • a relatively new type of thinking that complements analytical and critical thinking
  • a fast, effective tool used to help individuals, companies and teams solve tough problems and create new ideas, new products, new processes and new services.
  • a term that is used interchangeably with creativity

Lateral Thinking, a technique pioneered by Dr. Edward de Bono, is a way of thinking that seeks a solution to an intractable problem through unorthodox methods or elements that would normally be ignored by logical thinking. De Bono divides thinking into two methods. He calls one 'vertical thinking', that is, using the processes of logic, the traditional, historical method. He calls the other 'lateral thinking', which involves disrupting an apparent sequence and arriving at the solution from another angle.

When you are faced with fast-changing trends, fierce competition, and the need to work miracles despite tight budgets, you need Lateral Thinking. I began studying and practicing lateral thinking techniques over 20 years ago, and I still find myself going "back to the Master's teaching" on a regular basis. Back then I checked out from a library (and never returned) one of Dr. De Bono's first books, the dog-eared copy of which remains in my library.

I'm not going to launch into a dissertation on why you should study Dr. De Bono's (and others') Lateral thinking techniques -- you can find plenty of good material, including audio books by De Bono and others, quite easily.

However, I will say this: Lateral Thinking techniques are proven to result in innovative new concepts in business and software development, and if you are not familiar with them, you owe it to yourself to devote some study to learn about the techniques.

A good place to start is De Bono's "Six Thinking Hats", which is available on audiobook (mp3) and in print.

So, putting on my Green Thinking Hat, I suggest we need to spend more time studying how to creatively come up with ideas for this solution.

2/28/2008

No Country for Old Text Ads...

I don't know why this never sank in. I think it went in one brain cell and out the other. Last year, around November, Google changed the clicking behavior of their AdSense text ads. It used to be that you could click on the title, the link, or the description portion and it would click through. Now you can only click on the title portion.

The net effect of this, according to the pundits, is that clickthrough rates for text ads dropped by as much as 60%. Clickthrough rates go down, revenue goes down.

However, image ads remain 100% clickable. The solution? Change your setup to serve only image ads. Now that the quality of Google's image ads has generally improved, it should not be an issue.

If you are using their new Custom Channels, you can actually do this without having to change any of your ad code that's in your pages.

As a general rule, it's a good idea to watch your CTR and eCPM figures carefully after making such a change - it doesn't work the same for every implementation.

Jeesh!

2/27/2008

ASP.NET "App_Data": Writing files vs Application Restarts

Most ASP.NET developers know that if you create a new file or modify any file in the application root or the /bin folder, or modify web.config, you will trigger an application restart. InProc Session, Application and Cache state go bye-bye. This is a major cause of "WTF" type newsgroup and forum posts by n00b developers who don't yet fully understand the ASP.NET runtime model and its rather complicated set of behaviors - which are by design.

However, there is a special folder, App_Data, that is designed not to respond to this filesystemwatcher behavior. In a Web Site project this folder is created by default. It is normally used for SQL Server MDF database files hosted via the SQLEXPRESS "User Instance" mechanism, or for XML files.

The good thing to know is that this works the same way (filesystemwatcher events are ignored by the ASP.NET runtime) for Web Application Projects. The only difference is that with a WAP you need to create the folder manually (except with VS 2008, which does create it):

Right-click the project node in Solution Explorer, choose "Add" from the context menu, then "Add ASP.NET Folder", and choose "App_Data".

You now have the same App_Data folder that would be created by default in a Web Site Project, and it will behave exactly the same. You can put SQL Server MDF files there, use the "User Instance" feature of SQLEXPRESS, and write files into this folder or its subfolders without causing the ASP.NET runtime to recycle your application while it is in operation.

If you have a site where dynamically generated content files need to be deposited into the file system within the IIS Application root without causing your IIS Application to recycle, this is the way to do it.
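
For example, here is a minimal sketch (my own, with made-up folder and file names) of dropping a generated file under App_Data from inside a page or handler without bouncing the AppDomain:

using System.IO;
using System.Web;

public static class AppDataWriter
{
    // Writes generated content under ~/App_Data, which the ASP.NET
    // file-change monitoring ignores, so the application does not restart.
    public static void SaveGeneratedFile(string fileName, string content)
    {
        string folder = HttpContext.Current.Server.MapPath("~/App_Data/generated");
        Directory.CreateDirectory(folder); // no-op if the folder already exists
        File.WriteAllText(Path.Combine(folder, fileName), content);
    }
}

Write that same file into /bin, by contrast, and you can kiss your InProc Session state goodbye.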

2/26/2008

Recession? Inflation? No Country for Old Muni Bond Insurers.

"There is an inverse relationship between reliance on the state and self-reliance."
-- William F. Buckley, Jr. (who died today at 82)

Gasoline prices, which for months lagged the big run-up in the price of oil, are suddenly rising fast. Some experts say they could hit $4 a gallon by spring. Diesel is hitting new records daily, and oil closed Tuesday at an all-time high of $100.88 a barrel. I may have been a year early in my predictions, but my views have not changed. I drive to work in a Toyota Corolla that gets 37 MPG. But that's not what I'm worried about.

"The effect of high oil prices today could be the difference between having a recession and not having a recession," says Kenneth Rogoff, a Harvard University economist. Wrong, Mr. Rogoff. The effect of high oil prices today could be the difference between the recession we are already in, and a depression of mind-boggling proportions. It's not just oil prices, or real estate down the toilet - its about the entire credit underpinnings of our economy about ready to go to Hell in a handbasket. Don't believe me? If AMBAC and MBIA don't get bailed out fast from the reckless endeavors they are guilty of, we gonna have some serious financial fit hittin' the shan -- very soon.

I see support on the Dow Industrials (if you just draw yourself a nice long-term trendline across all the lows) somewhere around 10,000. This recession won't be over until investors throw in the towel and say "Get me out - I can't take it anymore". That hasn't happened yet - it could take some months until we get there.

Compared with a year ago, producer prices were up 7.4 percent. That's the worst producer price inflation in the United States since 1981. I UnBlogged about inflationary recession over a year ago. Now you are seeing the clear signs of exactly that.

Home prices around the country are falling at an accelerating pace, suggesting no end is in sight for the housing meltdown. You like real estate? Wait awhile - you'll like it a lot better a year from now, when price deflation really kicks in.

It's probably too late to buy either gold or oil at these prices, but I suppose if you dollar cost average, it might make you feel better.

When the dust settles in a year or so, the whole landscape is gonna look a lot different. I'm an ex-Merrill Lynch broker, I studied economics data and statistics for years -- and I know the signs, having been through a couple of these before, including the crash of '87.

What will Happen?

If you want to know what will happen as we come out of a recession, it's necessary to understand the concept of industry group relative strength.  Industry groups that show the highest relative strength (price performance compared to the universe of all groups) are the ones that will lead out of the recession and be the best performers in the following bull market expansion cycle. You can get a lot of information about this stuff on the various financial web sites like MSN Money, Yahoo Finance, and many others. By carefully studying this information as it relates to your personal financial and job situation, you can gain a lot of insight into how to plan your career and your financial future.

If your job seems secure, better thank your lucky stars. And keep your resume polished up nice, just in case.

If you want a little history, see what I was saying in 2006.

Silverlight 2.0: Cross-Domain Access Redux

Scott Guthrie has been publishing some very cool "pre-release" blog posts about the upcoming Silverlight 2.0 release. One of the most interesting features is that cross-domain access will be allowed (think JSONP, Crockford's JSONRequest, and other cool ideas). Here's a short quote:

Cross Domain Network Access

Silverlight 2 applications can always call back to their "origin" server when making network calls (meaning they can call URLs on the same domain that the application was downloaded from). Silverlight 2 applications can also optionally make cross-domain network calls (meaning they can call URLs on different domains from where the application was downloaded from) when the remote web server has an XML policy file in place that indicates that clients are allowed to make these cross-domain calls.

Silverlight 2 defines an XML policy file format that allows server administrators to precisely control what access a client should have. Silverlight 2 also honors the default Flash cross domain policy file format - which means that you can use Silverlight 2 to call any existing remote REST, SOAP/WS*, RSS, JSON or XML end-point on the web that already enables cross-domain access for Flash clients.

Digg.com has a pretty cool set of Digg APIs that they publish over HTTP. Because they have a Flash cross-domain policy file in place on their server, we can call them directly from our Silverlight Digg client application (and not require us to tunnel back through our web-server to reach their APIs).

In order to implement the Flash cross-domain policy you need a file on your server named crossdomain.xml. Here is an example:

<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM "http://www.macromedia.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <allow-access-from domain="*" />
</cross-domain-policy>

You can see the little asterisk up there; it works in a manner similar to the <allow users="*" /> element in your web.config authorization section.

While this does require sites wishing to serve Flash / Silverlight content via an API to add a file to their site, it's still a big step in the right direction.
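
To make that concrete, here is a rough sketch (my own, not from Scott's post, and the endpoint URL is a placeholder) of what a cross-domain call from Silverlight 2 managed code might look like, assuming the remote site publishes a policy file like the one above:

using System;
using System.Net;

public class CrossDomainClient
{
    public void FetchTopStories()
    {
        WebClient client = new WebClient();
        client.DownloadStringCompleted += OnDownloadCompleted;
        // Silverlight checks the remote server's policy file before
        // letting this cross-domain request go out.
        client.DownloadStringAsync(new Uri("http://api.example.com/stories/topnews.xml"));
    }

    private void OnDownloadCompleted(object sender, DownloadStringCompletedEventArgs e)
    {
        if (e.Error == null)
        {
            string xml = e.Result;
            // Parse the XML (LINQ to XML works nicely here) and bind it to the UI.
        }
    }
}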

I haven't done much of anything yet with Silverlight because I've been waiting for what is now "2.0", with .NET Framework support and other features. But the time is coming soon! I predict Silverlight will become a major player in this space. It may never outstrip Flash - which by now is ubiquitous - but you will know about it. Major sites will start using the technology.

Jesse Ezell, who probably has some of the best .NET AND Flash credentials around, sums it up:

"...about the only thing Flash has going for it from my perspective is adoption. Adoption isn't hard to achieve, especially for the people that ship the operating system 90% of the world uses. As such, it's just a matter of time till that is no longer part of the equation. Adobe has a lot of work to do in the mean time, and the clock is ticking. Open sourcing Flex is a really good start in the right direction... unfortunately, Flex was built on top of the wrong platform from the start (something I told the Flex team while Flex was still in Alpha), so this last effort, while a good one, still might not be big enough to turn the tide that is coming. Now, this isn't to say that Flash isn't a great format and doesn't enable a lot of scenarios (like I said, my job is working with Flash and I'd be doing something completely different if it wasn't for Flash). So, Flash is great. Silverlight just solves a lot of the major problems that I've run into with Flash."

The bottom line for me is that with Silverlight 2.0, developers will be able to choose between writing powerful, rich desktop applications or powerful, rich browser applications, and we will no longer have the big skillset difference between the two. This is what the vision of the .NET framework was originally. Microsoft is continuing to expand on and deliver on that vision.

On Twitter, On Dasher!

Live Messenger is helpful. It's relatively unobtrusive, and if somebody IMs you while you're busy, people understand the delays. In fact, it was down for a good part of today and I really did miss the convenience (WebMessenger was down too). But Twitter! Twitter is downright harmful to you as a developer. I mean, I go on there and look at some of the people I'm following and there's just this puke-pot stream of garbage with people jabbering like teenage girls on the phone. Much content, little substance. No, thanks!

Visual LINQ

I was just corresponding with MVP Jon Skeet and he pointed me to his latest creation, Visual LINQ. Basically, Jon has the beginnings of a WPF visual LINQ Query Analyzer that animates the actual process of how a LINQ query expression is evaluated, right in front of your eyes. I believe this has real potential as a teaching / learning tool. Go, Sir Jon! Skeet's new book, "C# in Depth" from Manning, will be out soon, and we hope to have a feature article combined with a book giveaway for our top posters contest in March.
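
If you've never actually watched a query expression execute, here is the sort of trivial deferred-execution pipeline (my own example, not taken from Jon's project) that a visualizer like this can animate step by step:

using System;
using System.Collections.Generic;
using System.Linq;

class DeferredExecutionDemo
{
    static void Main()
    {
        string[] words = { "lateral", "vertical", "linq", "skeet" };

        // Nothing executes here -- the query is only a description of work to do.
        IEnumerable<string> query = from w in words
                                    where w.Length > 4
                                    select w.ToUpper();

        // Evaluation happens element by element as we iterate.
        foreach (string w in query)
        {
            Console.WriteLine(w);
        }
    }
}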

2/21/2008

Holy Sh*t, Batman! Windows Vista Service Pack 1 Installed!

I don't necessarily agree with everything I say.  - Marshall McLuhan

Jeesh! After months of having Windows Update not working because I volunteered to be a Beta Tester for Service Pack 1, I got an email from the MotherShip yesterday congratulating me for my participation, with links to the RTM real deal. You have to be very careful with this because on Microsoft Connect there are a whole bunch of links and what you want is the file that looks like this:

6001.18000.080118-1840_x86fre_Client_en-us-FRMCFRE_EN_DVD.iso

Make sure the .18000. build number is in there - that's the RTM. The download title should read:

"Windows Vista SP1 Client for X86 and X64 English and German"

So I burned the sucker to a DVD and ran the Setup.exe off of it from within Windows Vista. Batman! The Upgrade option was enabled! Holy Jamoca, could this be the start of something good? You cannot "undo" an RC Vista Service Pack whose "View Available Updates" entry is gone by trying to upgrade with the original Vista media - the upgrade option will be grayed out. But I guess since the build number on this new slipstreamed Vista is higher, you can do it! An Upgrade preserves all your existing software. You also must have an SP1 already installed to do this, for example one of the RCXX releases.

Anyway, it installed, Windows Update is working again, and now I've got a whole new Windows Vista with the Windows Server 2008 Kernel, and all my software is still there --er, kind of...

Yikes! Everything is back to Square 1 - UAC is enabled, file extensions in Explorer are hidden again, the Firewall is on, Windows Defender is trying to run at Startup - the whole shebang! Now I have to go back a whole year, try to figure out all the customizations I've made, and re-enable them.

Oy, vay! Alright. First we make Kosher de Chicken...

Tried to run SQL Server and Visual Studio 2008, and this is what I got:


[Screenshot: side-by-side configuration error dialog]

"The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log for more detail."

OK. Downloading and running the Microsoft Visual C++ 2005 Redistributable Package (x86) fixed the SQL Server issue.

The Event log entry for Visual Studio 2008 reads:

Activation context generation failed for "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe". Dependent Assembly Microsoft.VC90.CRT,processorArchitecture="x86",publicKeyToken="1fc8b3b9a1e18e3b",type="win32",version="9.0.21022.8" could not be found. Please use sxstrace.exe for detailed diagnosis.

Umm, they tested this Service Pack, right? Crap. There is no redist for VC++ 9.0, only 8.0. I'm running a repair on Visual Studio to see if that will fix the VC++ 9.0 runtimes. OK, now Visual Studio 2008 loads. Oh, crap! "COMException" trying to load a Web Application Project... Oh man -- it's because IIS 7 is no longer installed! Don't forget the Metabase and other compatibility stuff...

The moral of the story? If you upgrade Vista with the new slipstreamed SP1 DVD, you run the risk of having it clobber your VC++8.0 and VC++9.0 runtime libraries, and you may have to reinstall them. It isn't the end of the world, but it is a big annoyance. In addition, EXPECT to have to reinstall IIS 7.0 and run a full Repair on Visual Studio 2008 in order to get back the VC++ 9.0 runtime dlls.

Thank the Lord I've got an installation of Server 2008 on the other drive just in case... Jeesh! You can bet I won't be installing this on my Vista box at the office anytime soon, unless I've got a lot of time and everything I need all ready.

Some programs either will not run at all, or run with reduced functionality, under SP1. My advice? You better check that list FIRST! Microsoft has warned that this list is not "comprehensive" and asks people to get in touch with the maker of any affected software to fix problems. Translated from M-Speak: "We aren't responsible if our service pack F**ks up some of your software".

Your account is disabled

Now here is a kick in the head! Having finally gotten to first base with the desktop box, I installed (upgraded via slipstream) SP1 on my laptop. Everything went fine, except when I went to log in - "Your account is disabled" -- WTF? I had my account set to "Never expires"! Here is how I fixed it:

1) Boot off the DVD, until you get to the final screen. Then choose the "REPAIR" option, and run a Command Prompt.
2) Navigate to the windows\ERDNT folder (assumes you are using ERUNT, which I highly recommend). Navigate to a recent backup date folder.
3) "Copy SAM C:\Windows\System32\config\" - overwrites the SAM hive with your backup. If you don't have ERDNT, look for the "RegBackup" subfolder.
4) Reboot - your original account status is repaired.

Don't ask me why or how this happened. I just figure if it happened to me, then it can happen to others. Go figure!

N.B. As of today, 2/22/2008, I have SP1 installed on my box at work. There, I used the standalone installer and had very few problems.

Conclusion

Having gone through all this pain, I would not recommend using the Slipstreamed SP1 image to perform an upgrade to Vista Service Pack 1 unless absolutely necessary. The standalone EXE Installer is less problematic. Having said all this, Windows Vista does indeed perform better and faster with SP1. Really. I was all ready to repave and put in Windows Server 2008 as my primary workstation OS. Now, I think Vista will turn out to be OK.

Incidentally, to help kick off our eggheadcafe.com March promotions and newsletter for members, we decided to provide a "What I hate most about Windows Vista poll"! You can either vote or just look at the results. Enlightening!

2/18/2008

Portable Virtual Machine as Development Environment

The man who doesn't read good books has no advantage over the man who can't read them. --Mark Twain

We do a lot of stuff with Virtualization at work. Most all of this is done with VMWare Workstation. But the free VMWare Player makes all this stuff really quite portable.

I made a VM of Windows Server 2008 and activated it. Then I installed SQL Server 2005. This VM compressed down to less than 2 gigs - small enough to fit on one of those cool 2GB USB sticks, along with a copy of the free Player.

Need a development environment? Just unzip the little doozy into a folder, start the player, load the little puppy and off you go! Great for testing installations, doing stress testing that would normally require two or more machines (but all on one "real" machine) and so on.

Next, I added Visual Studio 2008 Team System. Now it's too big to fit on a small USB stick (about 2.8 GB), but I just burned it to a DVD instead. I even modified the logon screen message to remind me what the Administrator password is, lest I forget.

It even supports VNC remote desktopping.

I also have a VMX of Open SUSE with MONO for experimentation with the MONO platform. If I want, I can run both VMs at the same time.

Nice.

FileSystemWatcher Events and incomplete file errors

This is a question that has popped up numerous times, and I myself have had to deal with it:

You get a FileSystemWatcher event that a new file was created in a folder you are monitoring. So you try to process the file, but problems arise because the process that has written the file isn't finished writing it yet.

One way to handle this is to have the process that is writing the file also write a second small text file containing something like the name of the file just written and the datetime it was completed. You would get an event that this marker file had been written, and your code logic would take over from there. Of course, this assumes that you have sufficient control over the file-writing process to make this change.
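
A bare-bones sketch of that idea (the folder name and the ".done" extension are my own invention): the producer writes the data file first, then drops a tiny marker file naming it, and the watcher only reacts to markers:

using System;
using System.IO;

class MarkerFileWatcher
{
    static void Main()
    {
        // Watch only for the small ".done" marker files the producer writes last.
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\DropFolder", "*.done");
        watcher.Created += OnMarkerCreated;
        watcher.EnableRaisingEvents = true;
        Console.ReadLine(); // keep the sample process alive
    }

    static void OnMarkerCreated(object sender, FileSystemEventArgs e)
    {
        // The marker's contents (written by the producer) name the real data file.
        string dataFile = File.ReadAllText(e.FullPath).Trim();
        ProcessFile(Path.Combine(@"C:\DropFolder", dataFile));
        File.Delete(e.FullPath);
    }

    static void ProcessFile(string path)
    {
        Console.WriteLine("Processing " + path);
    }
}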

One newsgroup poster commented, "We're running a windows service which contains several filesystemwatchers. Sometimes we hit spikes of 20-50,000 files in a matter of seconds and each file can potentially take some time to process. There's two problems we had to solve because of this:

1. If you spend too long time processing each event, and events arrive quickly, the internal Win32 filesystemwatcher buffer will fill up and start dropping events. See the InternalBufferSize property in FileSystemWatcher docs for more information.

2. Files not being closed when the event is raised."

The solution for #1 was to spend as little time as possible in the event handler. They simply post the data to a queue and let a separate worker thread process it.

In the worker thread they check each file before processing it. Some files must be handled in the order they arrive, so the queue will stall until the file at the head of the queue is closed.

Other files can be handled in any order, and they tag them with the worker thread iteration id and push them back to the end of the queue. To prevent endless processing of an opened file in a tight loop, they use the iteration id to check if they should Thread.Sleep or not before processing the next item in the queue.

To check if a file is available/closed, they use the following code:

bool IsFileClosed(FileInfo fi)
{
    try
    {
        // Try to open the file exclusively; if the writer still has it open,
        // this throws and we know the file isn't ready to process yet.
        FileStream fs = fi.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
        fs.Close();
        fs.Dispose();
        return true;
    }
    catch
    {
        return false;
    }
}

Seems like a workable solution to me. It's unfortunate to have to sink to the level of using exceptions for business logic, but I can't think of a better way.
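
For what it's worth, here is roughly how the "post to a queue, process on a worker thread" part might be wired up in .NET 2.0/3.5-era code (my own sketch -- it reuses the IsFileClosed check above and glosses over the in-order requirement the poster mentioned):

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

class QueuedWatcher
{
    static readonly Queue<string> pending = new Queue<string>();
    static readonly object sync = new object();

    static void Main()
    {
        FileSystemWatcher watcher = new FileSystemWatcher(@"C:\DropFolder", "*.xml");
        watcher.Created += delegate(object sender, FileSystemEventArgs e)
        {
            // Do as little as possible here so the internal buffer doesn't overflow.
            lock (sync) { pending.Enqueue(e.FullPath); }
        };
        watcher.EnableRaisingEvents = true;

        Thread worker = new Thread(ProcessQueue);
        worker.IsBackground = true;
        worker.Start();
        Console.ReadLine();
    }

    static void ProcessQueue()
    {
        while (true)
        {
            string path = null;
            lock (sync)
            {
                if (pending.Count > 0) path = pending.Dequeue();
            }

            if (path == null)
            {
                Thread.Sleep(250); // nothing queued; don't spin
                continue;
            }

            if (!IsFileClosed(new FileInfo(path)))
            {
                // Still being written: push it back and give the writer a moment.
                lock (sync) { pending.Enqueue(path); }
                Thread.Sleep(250);
                continue;
            }

            Console.WriteLine("Processing " + path);
        }
    }

    static bool IsFileClosed(FileInfo fi)
    {
        try
        {
            using (fi.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None)) { }
            return true;
        }
        catch
        {
            return false;
        }
    }
}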

2/16/2008

Firefox 3.0 (Minefield): Pretty Slick

Don't worry about people stealing an idea. If it's original, you will
have to ram it down their throats.
- Howard Aiken

You can download the Firefox 3.0 Beta 3 nightly builds here. Frankly, I'm pretty impressed with this so far. It seems to load faster than the previous version, and it certainly does pass the Acid2 test with flying colors (as does Internet Explorer 8.0, which is still in closed BETA).

I predict that Firefox 3.0 will enable the Firefox browser to capture an even greater percentage of the overall browser market. I intend to install it on my Open SUSE Linux VM with MONO on it that I use for experimentation.

It's not about religion here - it's about a browser that performs well, doesn't blow up in your face on a site that serves FLV video or Silverlight, and generally doesn't cause you problems.

The Windows installer for this is only about 6.72 MB, installs in about 20 seconds, automatically imports all your Internet Explorer settings and bookmarks, and provides performance that, as far as I can see, is comparable or superior to IE 7.0...

And, of course, standards compliant too, hmm?

To be completely honest, I'm really getting tired of IE quitting on me because of this page or that, and I have wasted a considerable amount of time trying to "fix" its many little idiosyncrasies.

I like Microsoft, it has been developer-friendly and very good to me over the years. But, I am also a pragmatist. You give me a good tool, and I'll use it. My 2 cents.

2/14/2008

MicroHoo: Done Deal?

Microsoft has turned up the heat bigtime on the Yahoo deal. It looks to me like a direct "Take it to the shareholders" ploy, and if this succeeds (as it likely will) then this is going to be a "done deal".

There are some internal Microsoft shakeups which have been reported in recent hours that make this deal ever more likely.

Microsoft  announced the departure of several executives Thursday, among them a Silicon Valley veteran recruited to help fix its unprofitable Web business and one in charge of marketing Windows Vista, and the promotion of more than a dozen others across the company.

Microsoft Senior Vice President Pieter Knook will head a new Internet Services division that Vodafone Group Plc announced Tuesday. In this new role, Knook will direct the development and delivery of Vodafone's consumer Internet business.

Yahoo is reportedly talking with News Corp. about a possible deal that could save it from Microsoft's $44.6 billion takeover bid. But like the rumored merger talks with AOL, some analysts see Yahoo's maneuvers as a ploy to simply pry more money from Microsoft. And it looks like Microsoft is prepared to ante-up to meet the challenge, if necessary.

Microsoft has been working this deal, with varying results, for over a year, currently amid bigtime squawks from Google. But unless a suitor (NewsCorp, for example) materializes, this is going to be a "done deal" soon.

Microsoft apparently is willing to do "whatever it takes" to take on arch-rival Google. This is getting interesting!

What are the synergies? Yahoo is a major presence in content and eyeballs on the web. For developers, it will take considerable time to sort everything out. But Yahoo has some very interesting developer API's and I bet that Microsoft will try to capitalize on them. All this benefits you and me as developers, once the dust settles - which could take a year or more.

2/09/2008

Visual Studio 2008 Hotfix Rollup

Wisdom cannot and must not be sold for money -- Plato

This Hotfix rollup addresses a number of issues and is a good example of how Scott Guthrie and his crew are able to nimbly respond to various issues and push out easy-to-install updates that fix various bugs and issues. This is in keeping with their stated goal of more frequently releasing public patches that roll-up bug-fixes of commonly reported problems.

Scott explains more on his blog. The collection of fixes addresses issues with the HTML Editor source view performance, Design View performance, HTML editing, Javascript editing, web site build performance when there are a large number of assemblies in the /bin folder, and more.

After successful installation, if you select the Help->About menu item, there is an entry that says Hotfix for Microsoft Visual Studio Team System 2008 Team Suite – ENU (KB946581).

I noticed a definite improvement in Solution load time and more responsiveness in the IDE in general after installing this relatively small 2.6 MB fix.

There is so much new cool stuff packed into Visual Studio 2008, I feel like I am only beginning to discover much of it, and I've been using this product since the early BETAs. Everything from built-in WCF, Silverlight, AJAX, and much more.

2/07/2008

Compete.com vs Quantcast.com vs Alexa.com vs ...

"There are three kinds of lies: lies, damned lies and statistics." --Disraeli

In a recent post, I ended up getting into some off-topic comment flaming with a reader who, it turns out (IMHO), really wanted to play the "my site is more popular than your site" game. Really, this kind of "mental masturbation" is sort of childish. But it did make me think, and I thank him for that.

In the traffic measurement game, there are some upstarts (like Quantcast, Compete and others) that don't seem too different from Alexa.

Based on the evidence I've looked at from some of the pundits, there isn't much indication to suggest that Compete or Quantcast are better than Alexa. And we all know that Alexa's data has flaws. If you go by toolbar installs, these new services certainly have less data. They come up with "ISP relationships", "panels" and other inventions to show that they can compete, but I don't necessarily buy it.

Apparently Quantcast attempts to combine various data sources, including demographic information, to arrive at more accurate rankings. To help in this regard, sites are encouraged to install some JavaScript tracking code. Realistically -- who can expect every site to install this code?

I experimented with Quantcast, installing their code, and the site I put it on then had an increase in ranking. To me, this indicates that either the original or the "after javascript" numbers are inaccurate.

Some pundits have given Compete.com positive comments. But Matt Cutts of Google says no.

Regarding third-party traffic estimates and actual traffic, Technorati links win as the best predictor. Personally, I have found Google Analytics traffic stats to be very close to actual IIS logs in the small comparisons I've made. And that's a free service. It's just that you can't see my stats - only I can.

I think the bottom line here is that some services that aren't really supposed to be traffic measurement services are ending up predicting actual traffic better than those who make the claim that they "are" traffic measurement services.

Personally, I think Google is the best measurement of site popularity (traffic or not). They have the biggest installed toolbar base, and they index more "stuff" (backlinks, etc.) than anybody else. And, they have a proven measurement system. It's called PageRank. All the hot-shot SEO people hate them, but I think PageRank is about the best indicator out there. It's objective, it has been tuned to reject SEO "spam", and it has withstood the test of time.

And with Google dominating SERPs with up to 90% of my search engine traffic, I want my PageRank to be just as high as I can get it, thank you!

Where I work, at a NYSE listed IT consultancy, we're building a Search Engine Marketing (SEM) and Search Engine Optimization (SEO) practice and I get to use my programming skills to build some exceptionally cool measurement tools. Business is literally flying in the door.

It has taken Robbe Morris and me since 2004 (when PageRank first became widely known) to get Eggheadcafe.com up to a PR 6. That's not bad for a niche site that only deals with developer topics. This UnBlog, which is much younger, is currently a PR 5. But you have to understand that PageRank is logarithmic - there is a huge difference between a 5 and a 6.

That's my 2 cents.

Congratulations! You're getting a Liberal for President.

Now that Romney has dropped out, you basically are going to have three choices - either Obama or Clinton on the Democrat side (both Liberals), or McCain on the Republican side (also a Liberal). Sigh. I suppose I could just move to Canada, where they have a three party system and nobody can garner enough of the power to be able to accomplish anything.

Conservative commentator Ann Coulter last week said she would support Sen. Hillary Clinton over McCain. She said, "If he's our candidate, then Hillary is going to be our girl, because she's more conservative than he is... I think she would be stronger on the war on terrorism."

Yikes! Ann Coulter said THAT?

But of course, Hillary is losing. The day she admitted she'd written herself a check for $5 million, Obama's people were crowing that they'd just raised $3 million. His staff is happy -- they're all getting paid. (BTW, where in hell did she get the $5 Million?)

Regardless of what happens in November, you can be sure of three things: your taxes will go up, the Government will get bigger, and unless you elect McCain, you'll probably have a bigger chance of getting your ass blown off by a terrorist attack. Oh - and you better get used to it - because we are going to be in Iraq for the next 45 years.

Hint: I took the World's Smallest Political Quiz and came out -- gasp! -- a Libertarian.

Wasn't it John Adams that said, "Trust no man living with power to endanger the public liberty"?

N.B. I got an interesting comment asking how I was defining "Liberal". Here's one definition from the Urban Dictionary (actually it's two, pieced together):

Liberal: Adjective - Falling to the left on the political spectrum of the average person, being in favor of more government control of economic actions (such as making minimum wage laws), but less governmental control of personal actions (such as allowing protesting)

noun - Any person who is liberal, as described above.

Bill: We should fight poverty.
Jim: Yeah, but you wouldn't know how, because you're a stupid pot-smoking hippie naive liberal.
Bill: Well you're a heartless selfish conservative, and I hate your views!
Jim: I hate your views too!
Bill: Well, I feel better about fighting poverty now.

So there you have it.

2/05/2008

Entityize and ASCIIfy your XML text strings

2 is not equal to 3, not even for very large values of 2 -- Grabel's Law

I have a custom search facility that I use on a couple of different web sites where the search queries are stored in a database table in order to compute count statistics and also to generate a standard xml sitemap for the search engines to nibble on. Problem is, I don't know what users are going to enter as search terms.

From a purely search standpoint, I really don't care; if they enter gobbledegook Unicode glop and get back no search results, fie on them, right?

However, I need to clean this stuff before I store it in the database, since when I pull it out to generate my custom sitemap, I'll otherwise end up with illegal XML characters in the sitemap document. That means Google, Ask.com, Live.com and Yahoo are all going to choke on it, and I might as well not even have a sitemap if that happens.


So I put a couple of static cleanup methods into global.asax, which conveniently allows them to be called from any page:

// Usage: query = Global.Entityize(Global.ASCIIify(query));
// (ASCIIify needs a "using System.Text;" for StringBuilder.)

public static string ASCIIify(string str)
{
    StringBuilder sb = new StringBuilder();
    char[] chars = str.ToCharArray();
    for (int i = 0; i < chars.Length; i++)
    {
        char c = chars[i];
        if ((int)c < 128) // is within the ASCII charset
        {
            sb.Append(c);
        }
    }
    return sb.ToString();
}

public static string Entityize(string str)
{
    return System.Security.SecurityElement.Escape(str);
}




ASCIIify simply strips out anything that's not in the ASCII charset. Of course, you may not want to do this, so your solution may be different.

Entityize uses the convenient Escape method of the SecurityElement class - no need to write complicated "replace" code. By combining the calls to the two methods in a single line -- Global.Entityize(Global.ASCIIify(query)); -- I get a clean string that I can insert in the database and know that my Sitemap.xml files will be OK.
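
As a quick illustration (made-up input):

string query = "Jalapeño <recipes> & more";
string clean = Global.Entityize(Global.ASCIIify(query));
// clean is now "Jalapeo &lt;recipes&gt; &amp; more" -- safe to drop into the sitemap XML.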