7/29/2007

WCF Service Debugging with the serviceDebug element.

Learning the "WCF way to do stuff" that you already know how to do is hard enough - but being able to debug boo-boos and fix them along the way can be an important part of your toolset. Since WCF services can return complete exception detail of a fault back over to the client, an easy way to enable this is to add the debug element:

<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="serviceBehavior">
        <serviceDebug includeExceptionDetailInFaults="true" httpHelpPageEnabled="true" />
        <serviceMetadata httpGetEnabled="true" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
</system.serviceModel>

The serviceDebug element, with its includeExceptionDetailInFaults and optional httpHelpPageEnabled attributes, turns on this cool stuff for debugging.

So if an exception occurs, it gets marshaled back to the client where you can see it.
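
On the client, the extra detail surfaces as a typed fault you can catch. A minimal sketch, assuming a generated proxy named client with a hypothetical DoSomething operation (requires using System.ServiceModel):

try
{
    client.DoSomething();
}
catch (FaultException<ExceptionDetail> fault)
{
    // Detail is only populated when includeExceptionDetailInFaults="true"
    Console.WriteLine(fault.Detail.Message);
    Console.WriteLine(fault.Detail.StackTrace);
}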

See here for details.

Finally, Nicholas Allen has a wonderful downloadable GIF diagram of WCF configuration schema.

7/28/2007

Visual Studio 2008 (Orcas) Beta 2 Installation, Issues, Fixes

"A computer lets you make more mistakes faster than any invention in human history - with the possible exceptions of handguns and tequila." -- Anonymous

Well, I downloaded Visual Studio 2008 Beta 2. I mounted the image as a virtual DVD with the free VCDControl Tool on Windows Vista Ultimate, 32-bit. I uninstalled Orcas Beta 1 from Control Panel "Programs and Features", which does a chained uninstall of all the bits. Now I am ready to install Beta 2.

I click on SETUP.EXE on the virtual DVD drive and nothing happens. I try several times. OK, that doesn't work. Let's try executing the .msi instead. Uh-oh. The .msi tells me I have to install with Setup.exe, which I already know doesn't work. Reboot, try again, same BS.

FIX: Copy everything from the VCD drive into a folder on your physical hard drive, and run Setup.exe from there. That should fix it.

Everything else about my install went perfectly, and I didn't even need to repair Visual Studio 2005 after uninstalling Beta 1 of Orcas. Technology is grand when it works, eh? Let's do some ThreadPool work with anonymous delegates, some WCF, WPF, a little Silverlight and other goodness! They've put a lot of very cool and in some cases extremely useful new "stuff" into Orcas.

I think I am gonna like this product.

7/25/2007

Cell Phones Making People Sick --Is It All In The Mind?

"With WCF, all messages are SOAP messages" -- Juval Lowy

A lot of noise has been made lately about the supposed health dangers of WiFi and other types of wireless communications. The media stories about them are generally full of crummy reporting and bad science. The reports usually feature a person who claims to have "electrosensitivity" - that radiation from WiFi or mobile phone networks (or the phones themselves) makes them sick. I have some personal experience with this: my S.O. (Significant Other - who is perfectly normal in all respects) claims she has electrosensitivity. I remember one time when I was sitting in a chair on the other side of the living room and my cell phone vibrated (no noise at all) with an incoming voicemail; she suddenly startled and complained of a sharp pain in her leg. Coincidental, yes - but it doesn't prove anything.

Despite the claims, these people generally fail double-blind tests to see if they really can feel the presence of WiFi or other wireless networks, and several studies have now confirmed this. Researchers said that people claiming to have electrosensitivity weren't faking it, and really were displaying the symptoms they claimed -- but that the symptoms were brought about by the people's belief that they were being harmed by radiation, not by the radiation itself. Just two of 44 people claiming to be "electrosensitive" correctly determined when the wireless signals were being emitted in six out of six tests - a rate no better than the 5 out of 114 control participants who did the same.

In fact, there are well over thirty published studies looking into this question. The studies typically ask electrosensitive volunteers to record their symptoms in the presence of suspect devices like mobile phones. The trick, though, is that neither the researchers nor the subjects are told whether the devices are really on - i.e., the trial is double-blind. The thirty or so studies all do things a little differently, but generally center on this theme. Of the studies, only seven so far have shown a difference between on and off - that is, that the mobile phone had some sort of effect. However, five of these positive results could not be repeated by the same researchers, and the other two are thought to be statistical flukes. In other words, the vast majority of the experiments show that electrosensitivity has not been demonstrated to be caused by exposure to EMR-emitting devices.

Now, we can continue doing studies for the next 30 years, but as long as the studies are scientifically accurate and conducted fairly with no hidden agenda, I strongly suspect that the results will continue to be similar.

A systematic review of most of the studies that have been done concluded, "The symptoms described by “electromagnetic hypersensitivity” sufferers can be severe and are sometimes disabling. However, it has proved difficult to show under blind conditions that exposure to EMF can trigger these symptoms. This suggests that “electromagnetic hypersensitivity” is unrelated to the presence of EMF, although more research into this phenomenon is required."

For some strange reason this whole thing smacks of that "Al Gore Apocalypse" global warming theme. I leave you to draw your own conclusions. Maybe we should just ask Uri Geller to come in, bring his cell phone, and bend a few spoons over it, hmm?

7/24/2007

Nasty IM Spam Sites Comin' at ya - UPDATE!

NOTE: It only took the DNS provider an hour or so to shut these bastards down. Good Riddance! I'm leaving the post up for educational purposes. Don't give out your Windows Live ID (Passport) or other credentials to any site unless you absolutely know who they are. In this case, you are getting a message from a trusted friend to go visit this site. That's social engineering.

UPDATE 7/28/2007: These guys are back with a new domain and a new provider, and now they have EVEN MORE "FAQ"-like stuff to try to convince you that they aren't "phishing".
TRUST ME: THEY ARE:
http://www.messenger-tips.com/

Visit this site:
http://msnlive.bounceme.net/ (This domain was TURNED OFF by the DNS provider)

and it tells you who deleted or blocked you from their MSN (Live) Messenger contacts.

BUT DO NOT LOG IN!

This is one of the best examples of social engineering I've seen in a while. It looks really professional, doesn't it? You are going to log in with your Windows Live Messenger credentials (your Passport account, essentially - are you SURE you want to do this?). It will give you a list of all your contacts from the very beginning, and whether they are blocked or not. They could be using DotMSN - this is easy to do. Fine. Unfortunately, it doesn't stop there. It will proceed to message every account (as you!) telling them about itself. You have no idea what else it may do with your credentials. I repeat: you have no idea what they will do.

I quote from their site:
"Is it safe? Absolutely. Messenger-Tips.com does not save your mail address, your password or contact list. The data you enter is just used to retrieve the requested info and discarded immediately. If you still feel insecure change your password temporarily before using this tool. "

It must be true, right? After all, you just read it on the Internet! GET REAL!

RECOMMENDATION: IF YOU DIDN'T LISTEN TO ME, CHANGE YOUR PASSWORD.

Thanks to my friend John Bailey for the heads - up on this one.

BTW, here are the people that run this little spam /scam deal:

Domain Name: MESSENGER-TIPS.COM
Registrant: Virtus Offshore Investment Co. (private@voichaven.com)
Suite 2007, 20th Floor, The Century Tower, Ave Ricardo J. Alfaro, Panama City, Panama (PA), Tel. +507.2051616
Creation Date: 02-Apr-2007   Expiration Date: 02-Apr-2008
Domain servers in listed order: ns2.ipnames.net, ns1.ipnames.net
Administrative, Technical, and Billing Contacts: same as Registrant
Status: ACTIVE

Google and SEO: Some interesting facts

Google now "sees" underscores in URLS as word-separators(delimiters). Previously, in a URL like http://www.mysite.com/iphone_review.html Googlebot couldn't "see" the words iphone or review. Instead it read iphone_review as one word. Now, it will see iphone and review.

Google treats URLs with a query string the same as static URLs - as long as there are no more than two or three parameters in the URL, that is! Put another way, you won't take a hit in your Google rankings just because you have a question mark in your URL; just don't have more than two or three equals signs in it. So if you've been tormented about whether you should implement URL rewriting on your site or blog, don't fret.

The number of slashes in your URL (i.e. the number of directories deep your page is) isn't a factor in your Google rankings. Although it doesn't matter for Google, it is rumored to matter for Yahoo and MSN (Live Search).

The file extension in your URL won't affect your rankings. It doesn't matter whether you use .php, .html, .htm, .asp, .aspx, .jsp etc.

Contrary to rumors, Google does not use its status as a domain registrar to mine domain registration data as a ranking signal - though some self-proclaimed experts still believe Google is using WHOIS data that way.

If you want to get your blog into Google News, one of Google's requirements for inclusion is that the blog must have multiple authors. So if you want your blog to show up in Google News results, it needs to be a group blog.

Why don't all my pages get indexed by Google?

One of the classic crawling signals Google uses is the amount of PageRank on your pages.
So just because your site has been around for a couple of years (or you submitted a sitemap) doesn't mean that Google will automatically crawl every page on it. In general, getting good-quality links to pages on your site will help Googlebot know to crawl your site more deeply.

You might also want to look at your remaining unindexed URLs: do they have a lot of parameters (the Google crawler typically prefers URLs with one or two parameters at most)? Is a robots.txt rule blocking them? Can the unindexed URLs be reached easily by following static text links (no Flash, JavaScript, AJAX, cookies, frames, etc. in the way)?

Feed Reader Stats

It appears to me that Google Reader is cornering the feed reader market pretty well. Google Reader provides 37 percent of the readers of this UnBlog, with Bloglines providing another 16 percent and NewsGator coming in third at 11 percent.

I use Google Reader myself; I have it on my customized home page as a web part (or widget, or whatever you want to call it). I also use the IE7 built-in feed reader.

7/22/2007

Web standards? IE? Firefox? BS!

It's a rare person who wants to hear what he doesn't want to hear. - Dick Cavett

I am really getting tired of seeing all these "holier than thou" articles, comments and rants about how Internet Explorer sucks because it doesn't implement "web standards" correctly, and how Firefox "does".

Bottom line? A bunch of BS! Neither one does! Here are a few images from the Acid2 test, which deliberately invokes invalid CSS to see how browsers handle it:

IE TEST: [screenshot: IE's Acid2 rendering]

FIREFOX TEST: [screenshot: Firefox's Acid2 rendering]

REFERENCE IMAGE: [screenshot: the Acid2 reference image]
Bottom line? Users need to quit their political, opinionated rants and work to ensure that browser manufacturers all work together so that there is seamless, consistent behavior across browser brands - whether it be rendering, JavaScript, CSS, the DOM, or whatever. Standards take time to create, and a lot of thought goes into them. The standards are there for a reason.

Let's stop the BS. Browser manufacturers and users alike need to stop attacking each other and instead work to provide a consistent, standard set of browser behaviors for developers and users. You don't achieve world domination with web browsers. Everyone benefits. My two cents.

7/19/2007

ASP.NET: Prevent Long-Running Page from timing out

On some occasions you may have a database query or other operation that takes a long time. If it takes long enough, your ASP.NET Page may timeout.

Obviously, you want to optimize your SQL or other process, but if you can't, you can control the page timeout via the httpRuntime element's executionTimeout attribute. You can set the timeout value for a request in web.config. Note that executionTimeout is honored only when the compilation element's debug attribute is false - with debug="true", ASP.NET effectively disables the timeout. You can even change the timeout for one specific page:

<location path="yourpage.aspx">
  <system.web>
    <httpRuntime executionTimeout="180"/>
  </system.web>
</location>

The httpRuntime element is not explicitly defined in the Machine.config file or in the root Web.config file. However, the following settings are the default values initialized by the system. If you need to customize this section you must create it in your configuration file and define only those attributes that need customization:

<httpRuntime
  enable="true"
  executionTimeout="110"
  maxRequestLength="4096"
  requestLengthDiskThreshold="256"
  useFullyQualifiedRedirectUrl="false"
  minFreeThreads="8"
  minLocalRequestFreeThreads="4"
  appRequestQueueLimit="5000"
  enableKernelOutputCache="true"
  enableVersionHeader="true"
  requireRootedSaveAsPath="true"
  shutdownTimeout="90"
  delayNotificationTimeout="5"
  waitChangeNotification="0"
  maxWaitChangeNotification="0"
  requestPriority="Normal"
  enableHeaderChecking="true"
  sendCacheControlHeader="true"
  apartmentThreading="false"
/>

See http://msdn2.microsoft.com/en-us/library/e1f13641.aspx for more details.
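
If you would rather scope the timeout in code for a single request, HttpServerUtility exposes the same setting. A minimal sketch (the 180-second value just mirrors the web.config example above):

protected void Page_Load(object sender, EventArgs e)
{
    // Equivalent to web.config's executionTimeout, set per-request in code
    Server.ScriptTimeout = 180; // seconds
}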

Social Networking Sites and Session Objects

"I had a dream I was stuck in an elevator with Michael Bolton, Kenny G. and Yanni,
and I had a gun - with only one bullet."
-- Alan Rock (Jazz DJ)

One good answer to Alan's quote above would be to get them all to line up, one right behind the other... but I digress!

I have had a passing interest in the social networking buzz over the last couple of years, and the ones with API's are of course most interesting to me as a developer.

Digg is useful, and it has an API, but it has a narrower focus based on community ranking of posted "news" (articles, blog posts and what have you), rather than a concept of "communities" comprised of members who share common interests or traits.

One that is really starting to stand out is Facebook, which has a clean, uncluttered look and actually has features that could be considered useful - unlike, say, MySpace, which (to me) is just a cluttered mess of spam and "me too"-ism.

Of course the question that comes into play is "how long will it last?" It's one thing to have 30 million users sign up like rubberneckers at a car crash, but will the usage statistics persist? Probably only Steve Ballmer knows, as rumor has it he's been trying to acquire the sucker for a ridiculous sum (in the $B's).

Another reason why I'm starting to get interested in Facebook is that it can import your email contacts and use them to suggest potential "friends" - so when you join, you may be pleasantly surprised to find that a number of your contacts from LinkedIn, Hotmail, Yahoo or other mail services are already members.

In fact, I saw this morning that Rob Howard had just added me as a friend (that's a compliment, Rob!). But the real interest, I think, is the FaceBook API. You can go to codeplex.com and search on "Facebook" and you'll see that there are no less than four projects that deal with .NET-ized versions of the API.

Nikhil Kothari has done some work on this and also has a lot of good pointers to resources, which you can find here. Time permitting, I'll be doing some work on this to figure out sensible ways to integrate the Facebook API into one or more of my current "playground" sites. See you on Facebook!

Session Object Fun

I keep seeing various posts that misunderstand how Session works in ASP.NET. Session stores references - if you modify an object that's stored in Session, the Session copy is modified too, because they are the same object. Here's some reallyreallydumb code that brings it home quickly:


protected void Page_Load(object sender, EventArgs e)
{
    // Drop a GridView on the page, and create a data source...
    DataTable dt = new DataTable();
    dt.Columns.Add("Test");
    // add a row
    DataRow row = dt.NewRow();
    // give the column a value
    row["Test"] = "First";
    dt.Rows.Add(row);
    // store it in Session
    Session["dt"] = dt;
    // now modify the ORIGINAL object
    dt.Rows[0]["Test"] = "Second";
    // now bind our grid from the SESSION object
    GridView1.DataSource = (DataTable) Session["dt"];
    GridView1.DataBind();
    // it says "Second" -- in other words, our SESSION OBJECT was MODIFIED,
    // even though we never worked on it directly.
}

This behavior can cause a lot of slip-ups for developers who are not familiar with it.
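
If you actually want Session to hold an independent snapshot instead of an alias, copy the object before storing it. A minimal sketch using DataTable.Copy(), which clones both structure and data (continuing the example above):

// store a copy instead of a reference
Session["dt"] = dt.Copy();
// changing the original no longer touches the Session copy
dt.Rows[0]["Test"] = "Second";
// the snapshot in Session still says "First"
string snapshot = (string)((DataTable)Session["dt"]).Rows[0]["Test"];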

7/17/2007

Spam Filters Gone Wild 2007

One of the primary concerns you will have as a blogger or developer of content sites is filtering out porn and spam. The approach I've taken with ittyurl.net is basically to have a database table, "BADWORDS". This gets loaded into a string array at startup, and any time somebody wants to add a link - since the application spiders the page anyway to collect tags and metadata - I run the content through my IsBadWord method. The process is very fast, and it has worked extremely well: since about January 2007, when I put up the first beta of the site, I've only had to manually remove three or four links out of the several thousand that users have added.

Sometimes the sneaky little scumbags have a redirect to their porn/spam site from a "nice" page, and that of course is something you cannot foresee (unless, of course, you want your WebRequest to follow redirects -- it just goes to show they will stop at nothing in the dirty-tricks department!). Other times it was just drug stuff ("Phentermine", "Xanax", "Viagra" - you know the routine), and those weren't in my BADWORDS table -- although they are now! Here's some sample code:

public class Global : HttpApplication
{
    public static string[] BadWords;
    ....(snip)

    // Loads the BADWORDS table into a static string array at startup
    public static void PopulateBadWords()
    {
        DataTable dtBadWords = null;
        try
        {
            DataSet dsBadWords =
                SqlHelper.ExecuteDataset(ConnectionString,
                    "dbo.GetBadWords", null);
            dtBadWords = dsBadWords.Tables[0];
        }
        catch (Exception ex)
        {
            PAB.ExceptionHandler.ExceptionLogger.HandleException(ex);
            return;
        }
        BadWords = new string[dtBadWords.Rows.Count];
        for (int i = 0; i < dtBadWords.Rows.Count; i++)
        {
            BadWords[i] = (String) dtBadWords.Rows[i][0];
        }
        // also expose the list via Application state
        HttpContext.Current.Application["BadWords"] = BadWords;
    }
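
The IsBadWord method itself isn't shown above; here's a minimal sketch of what such a check might look like (the implementation, as opposed to the name, is my assumption):

    // Hypothetical implementation: a simple case-insensitive substring scan
    public static bool IsBadWord(string content)
    {
        if (BadWords == null || content == null) return false;
        string lowered = content.ToLower();
        foreach (string bad in BadWords)
        {
            if (lowered.IndexOf(bad.ToLower()) != -1)
                return true;
        }
        return false;
    }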



But other times you get a catch-22: this article, "Bloggers Bring in the Big Bucks", wouldn't go in because one of the people featured is named "Heather Cocks" (good God, what a name to have!). And of course, my badwords filter found it and DK-ed the entry. That's too bad, because I actually built the site for my own use - as a way to easily store, tag and make links searchable. In the process, I realized others might find it useful, so I expanded the concept and made it public. It even has a Web services API - http://ittyurl.net/IttyUrlService.asmx

Other problems I've found: it's one thing to have a CAPTCHA on your publication to deter automated spam bots, but what about maniacs? There are actually mentally disturbed people who deliberately hate-spam comments on blogs. I don't have any issue with people posting comments that disagree with my views on something; that's perfectly fine with me - I put my ego in my back pocket and publish their comment. But there are actually people who mount ad-hominem attacks, deliberately seeking out numbers of posts and putting their trash on them.

So, moderated comments come into play. It's an inconvenience, but I'm usually online most of the time, so I get an email and approve comments right away. I think the best answer may be a combination of spam filters and moderation: if there is a bad word, you get an email allowing you to look at the content and override your spam filter.

Bayesian filtering is another possibility - I've seen some pretty interesting C# code with Bayesian filtering, but as we all know these filters need to be "trained" - much like a neural network. In my case, that's probably overkill.
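
For the curious, the core "training" idea fits in a few dozen lines. A toy sketch of a naive Bayes scorer - every name and the +1 smoothing here are illustrative, not from any particular library:

using System;
using System.Collections.Generic;

public class TinyBayes
{
    private Dictionary<string, int> spamCounts = new Dictionary<string, int>();
    private Dictionary<string, int> hamCounts = new Dictionary<string, int>();
    private int spamDocs, hamDocs;

    // "Training" is just counting words in messages of known class
    public void Train(string text, bool isSpam)
    {
        Dictionary<string, int> counts = isSpam ? spamCounts : hamCounts;
        if (isSpam) spamDocs++; else hamDocs++;
        foreach (string word in text.ToLower().Split(' '))
        {
            int n;
            counts.TryGetValue(word, out n);
            counts[word] = n + 1;
        }
    }

    // Returns a crude spam probability via naive Bayes with +1 smoothing
    public double Score(string text)
    {
        double logSpam = Math.Log((spamDocs + 1.0) / (spamDocs + hamDocs + 2.0));
        double logHam = Math.Log((hamDocs + 1.0) / (spamDocs + hamDocs + 2.0));
        foreach (string word in text.ToLower().Split(' '))
        {
            int s, h;
            spamCounts.TryGetValue(word, out s);
            hamCounts.TryGetValue(word, out h);
            logSpam += Math.Log((s + 1.0) / (spamDocs + 2.0));
            logHam += Math.Log((h + 1.0) / (hamDocs + 2.0));
        }
        // convert back from log space to a 0..1 probability
        double ps = Math.Exp(logSpam);
        double ph = Math.Exp(logHam);
        return ps / (ps + ph);
    }
}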

It never ceases to amaze me how much spam there is in the Blogosphere - I sometimes get 200+ messages in my Yahoo mail spam folder and only 10 or 20 real messages. Yahoo and Gmail are both doing a pretty good job -- I hardly ever find legitimate mail in the spam folder. On the other hand, I do often find one or two spams in the inbox, especially when there is a new genetic mutation of some spam formula they haven't learned yet. Do these people really believe I need a bigger member? That I really want to buy bogus drugs online and give them my credit card? Oh, and here's a killer: "N33d m0ney right now? One-hour payday loans" -- like their l33t-speak is really gonna make it past the spam filter, huh?

It's pitiful. It's childish, and it's harmful. It's indicative of what our societal psyche has become - don't do any real work, don't care about other people, and just try to make money fast any way you can -- and you and I are bearing the cost of it in inefficiency and increased bandwidth consumption. Spam isn't just an annoyance. It has a real cost, and guess who's footing the bill?

7/14/2007

Effectively Promoting your Blog (or WebSite)

Over the last couple of years I've learned a few "tricks" that help to promote blogs and websites. Most of the time, the techniques work for both a blog and a website, especially if the site has at least one RSS feed. Blog and site promotion is an ongoing process; it takes time, and there is no "Magic Bullet". Use the following steps to start promoting your blog or website:

  1. Submit your blog's URL to each major search engine: Google, MSN (live.com), AOL, and Yahoo. Also go to
    http://www.1stopsubmit.com/
    http://www.submitexpress.com/
    http://siteexplorer.search.yahoo.com/
    and submit your URL. The first one will automatically submit your blog URL to over 50 search engines for free. They ask for a reciprocal link, but it is not required.
  2. Open http://www.blogmetafinder.com/ (our site) and submit your blog to as many of these blog and RSS feed directories as possible. Use your key phrase in the title and at least once in the description if you are able to enter one. This will take a long time, but the effort is well worth it.
  3. Open http://yahoo.com/ and set up an account if you don’t already have one. Add your RSS feed to your "My Yahoo" account as a news feed you want to monitor. You will need to add RSS Headline Module to your My Yahoo content if you don’t have it. For more info on how to do that, go here: http://my.yahoo.com/s/rss-faq.html. You will now add your blog into the RSS feed in My Yahoo. To do that, click on edit in the RSS feed section. Enter the URL of your blog. RSS will increase your blog’s reach. It is important that you include your blog’s content in an RSS feed to increase readership and distribution.
  4. Go to http://www.feedburner.com/ and subscribe your blog.
  5. It's also necessary to submit your blog to Internet directories because people might find you through these. One of the important web directories is http://dmoz.org/. It is hard to get listed there, but if you are successful, it will have a very positive effect.
  6. Add your blog to the blogtoplist.com website. Visitors may come to your website from resources like this.
  7. Post your comments on highly trafficked blogs; also put your blog link in your name or at the end of your comment, if they don't provide a form field for same. Make sure any comments you post are relevant and not "blog spam" or it will work in reverse!
  8. Write good-quality articles related to your blog's subject and submit them to free article directories like http://www.ezinearticle.com/, http://www.goarticles.com/ or Yahoo groups. If you write for a technical niche subject, there are sites that want your content. For example, our eggheadcafe.com site accepts articles about web development and .NET. If your stuff is good, we'll publish it and you can have a one-way link to your site from a PageRank 5 website.
  9. Try to update your blog once a week, if not daily; it will give you an advantage in terms of search engine ranking. Write a few hundred words (400 or so) and post to your existing blog regularly.
  10. Register with blog search engines - search engines that specialize in blogs: http://www.daypop.com/
    http://www.blogvision.com/
    http://www.blogsearchengine.com/
  11. Use the listing services to popularize your blog. There are lots of "blog directories".
    You can find most of these listed on our blogmetafinder.com directory -- along with many more. In fact if you join the forums as a member, it lets you download the entire directory for free!
  12. Register with Tracking Services – These services note when a blog has been updated and publish an ongoing list. They even keep track of the most updated and most visited weblogs. Plus you'll get to learn what pinging is. http://blo.gs/ http://www.weblogs.com/
  13. Create and increase the "Why Should I come back" value of Your blog. Everyone knows that blogs get returning visitors because of their constantly updated content. As long as you can efficiently and consistently increase this Return Value, it is virtually guaranteed that you'll be getting repeat visitors.
  14. Track Your Blog: Google Analytics is free, and offers some very sophisticated metrics.
  15. Join Technorati, and claim your blog there. It will generate traffic for you. Technorati makes it easier to find your blog through Technorati tags. Technorati is a large blog ranking engine; it will rank your blog by the number of links to your blog from various WebSites (e.g., "Authority"). The higher you rank in Technorati the more traffic you will generate for your blog.
  16. Ping the ping servers with your blog whenever you update with the following:
    http://www.ipings.com/
    http://kping.com/pings/ping.php
    http://www.pingomatic.com/
    http://pingoat.com/
    There are numerous other ping servers, including Yahoo's API. For .NET developers, this article provides "fire and forget" ping code in C# (a bare-bones sketch of the XML-RPC ping call itself appears at the end of this post).
  17. Study "SEO" - Search Engine Optimization. Learn how the search engines rank a site and also what criteria they use to downgrade a site. Don't get all hung up about "keywords" - Just write good content and the keywords will be there.
  18. Make sure your site has a sitemap.xml file that's updated preferably in real time. That's a "google" sitemap, not an "ASP.NET" sitemap. Google, Yahoo and MSN (Live.com) all read these map files to index your site better. If you cannot create a sitemap, consider a page that has a list of hyperlinks to every page on your blog or site - specifically for the crawlers to chew on.
  19. Look at your content: So-called "Web 2.0" sites make up the fastest-growing category on the Web--doubling their traffic over the last year, according to data presented by Nielsen//NetRatings.

    Web 2.0 sites--defined loosely as those allowing users to "talk" to their "friends" via e-mail, messaging, blogs, and other social media tools--ranked first in year-over-year growth in unique audience and Web pages viewed.

    Social networking sites with the highest traffic growth included Feedburner (385%), Digg.com (286%), MySpace (170%), Wikipedia (161%), and Facebook (134%).
    The Nielsen//NetRatings data also showed that engagement with Web 2.0 sites had grown over the last year, with retention rate increases of 10% at MySpace, 46% at Wikipedia and 20% at Facebook. Web 2.0 users also tend to be more active than typical Web users in online search, with 63.8 searches per month compared to 44.7 for the total market.

A lot of this advice may seem superfluous or not relevant. All of it is relevant; some actions have more effect, and a quicker effect than others. But they are all valid.
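
As promised in item 16 above, a bare-bones sketch of the standard weblogUpdates.ping XML-RPC call. The endpoint, blog name, and URL are placeholders, and real "fire and forget" code would do this on a background thread and parse the response for the error flag:

using System;
using System.IO;
using System.Net;
using System.Text;

class BlogPinger
{
    static void Main()
    {
        // weblogUpdates.ping takes the blog name and its URL
        string xml =
            "<?xml version=\"1.0\"?>" +
            "<methodCall><methodName>weblogUpdates.ping</methodName>" +
            "<params>" +
            "<param><value>My UnBlog</value></param>" +
            "<param><value>http://example.com/blog/</value></param>" +
            "</params></methodCall>";

        byte[] body = Encoding.UTF8.GetBytes(xml);
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://rpc.pingomatic.com/");
        req.Method = "POST";
        req.ContentType = "text/xml";
        req.ContentLength = body.Length;
        using (Stream s = req.GetRequestStream())
        {
            s.Write(body, 0, body.Length);
        }
        // the server replies with an XML-RPC response; dump it for inspection
        using (WebResponse resp = req.GetResponse())
        using (StreamReader rdr = new StreamReader(resp.GetResponseStream()))
        {
            Console.WriteLine(rdr.ReadToEnd());
        }
    }
}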

7/13/2007

Who's Winning the Search Engine War?

This will probably not come as a surprise to many, but my statistics show that Google is the clear leader across the board. Recently I checked some statistics on which search engines are sending traffic to my ittyurl.net site, which has really started to take off. The bottom line:

1. Google     6,782
2. Yahoo      1,666
3. AOL           76
4. Search        64
5. AltaVista     24
6. MSN           12

MSN is at the very bottom, man! My site has long since been submitted to all these search engines, and it has a standard XML Sitemap for them all to consume.

The sitemap is automatically updated at least once a day, and anybody who's got an RPC or other PING mechanism is getting notified of any new content in realtime.

Yahoo is making a showing, but the others are so far behind, it's a JOKE! Google is the clear winner, by a long shot.

You guys want my business? You need to crawl my site(s) and send traffic. It's a pretty simple equation to me! I'm the web site owner and producer, and I have the original content. Your job is to index it. If there wasn't any content to index, there wouldn't be any search engines. Ain't rocket science!

Finally, I leave you with this terrible true story of the Internet Crash of 2007:

Breaking News: All Online Data Lost After Internet Crash

Vista: KB935807 Windows Update Woes - and XML Escape

Windows Update is great -- when it works. There have been a few slip-ups in recent months, and this is another one. What I got was, basically, that the update failed to install. So I resorted to installing the list of updates one by one. All worked except KB935807. Finally, I downloaded the .msu from TechNet and installed it by double-clicking the file in Windows Explorer. That worked, except for one thing: Windows Update keeps showing up in the notification area, reporting that I need to install - you guessed it - KB935807!


There's some "stuff" appearing on the web about this, lots of people are getting it, it's a patch for the firewall, which service I keep disabled since my wireless router -- like most routers -- already has one built in. (Yikes - how many firewalls do we need, folks -- really?).

So far, this is the only "Fix" I've seen. I want to caution that I have not used this, nor do I know if it even works:

Follow the instructions below

1. Open an Administrator command prompt by right clicking on Start -> All Programs -> Accessories -> Command Prompt and selecting "Run as Administrator" and clicking "Allow" for the elevation prompt

2. In the command prompt, type the command below

a. fsutil resource setautoreset true C:\

Note: This assumes that C: is the drive in which Vista is installed. If it is installed on another drive like D: please change the drive letter appropriately

3. Reboot the machine

4. After reboot, please try to install the updates again and let me know if that resolves the issue.

If that doesn't work, also try to clear the Softwaredistrobution [sic] folder:

1. Open a command prompt with administrator access.

2. type net stop wuauserv

3. type start %windir%

4. Delete Softwaredistrobution folder

5. Go back to your command prompt and type net start wuauserv

6. Run Windows Update again and see if the issue has been resolved.

I am not going to rail about this one except to say that if you are doing something that is going to affect millions of users, wouldn't it be a good idea to test it thoroughly? If you have some info on this, feel free to post a comment.

From the "Did you know that..." Department:

Often you need to provide the XML-safe entities for markup that is included in a web.config element (such as an appSettings "value" attribute). You could always do five Replace calls, but there is an easier way:

In the System.Security namespace, the SecurityElement class has an Escape method that handles this.

The following table shows the invalid XML characters and their escaped equivalents:

Invalid XML Character    Replaced With
<                        "& lt;"
>                        "& gt;"
"                        "& quot;"
'                        "& apos;"
&                        "& amp;"

(I've put some spaces in my "Replace With" entries. I hope the reader is smart enough to understand why!)
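
A quick sketch of the call itself (SecurityElement.Escape is a static method that takes the raw string and returns the escaped version):

using System;
using System.Security;

class EscapeDemo
{
    static void Main()
    {
        string raw = "value=\"a < b & c\"";
        // Escape handles all five invalid XML characters in one pass
        string safe = SecurityElement.Escape(raw);
        // prints: value=&quot;a &lt; b &amp; c&quot;
        Console.WriteLine(safe);
    }
}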

7/11/2007

ASP.NET : Killer Viewstate Invasion

"They can hold all the peace talks they want, but there will never be peace in the Middle East. Billions of years from now, when Earth is hurtling toward the Sun and there is nothing left alive on the planet except a few microorganisms, the microorganisms living in the Middle East will be bitter enemies."
-- Dave Barry

Recently I was "trying out" some pages on my latest creation, BlogMetaFinder.com, and I noticed that paging of the Gridview on the main page was slow. Long story short -- View Source and I've got 975,000 bytes (In EditPlus, "Edit / Character Count") mostly all ViewState. "Yikes", I thought, "if I turn this off I'm gonna have to do custom paging with this SQLDataSource and all kinds of extra stuff". The data source is a little less than 2500 rows now, which is kind of "on the edge" of where you might not want to use the default paging in a GridView.

Anyway, I turned off ViewState on that page and also on the Master page, which was also contributing to FVS (Fat ViewState), and everything still works fine. I've written about this a couple of times (here's one article), and I could kick myself for forgetting about it.

Custom paging is a lot easier to do now with SQL Server 2005 because you've got the ROW_NUMBER() function, and there is plenty of sample code on how to do it. SqlDataSource doesn't play well with it, though; you need to use the ObjectDataSource. Obviously, if you aren't already using ObjectDataSource and you need to put in custom paging, there's going to be some serious refactoring going on...

Bottom line? Watch that ViewState. Most of the time you don't even need it - turn it off selectively on various controls and it lightens up the page quite nicely. ViewState is downright evil!
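
For reference, a minimal sketch of turning ViewState off selectively in code; GridView1 stands in for whatever control is fattening the page (the markup attribute EnableViewState="false" works equally well):

protected void Page_Init(object sender, EventArgs e)
{
    // The grid re-binds on every request anyway, so it doesn't
    // need ViewState to reconstruct itself on postback.
    GridView1.EnableViewState = false;

    // Or for the whole page:
    // this.EnableViewState = false;
}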

7/08/2007

Adsense E-Books, SEO, and Private Jets for Climate Change

"Reusability of code may not be one of the three pillars of OOP,
but it's one of the most important ones."
--Yogi Berra

There is no question that Google has totally dominated the context-sensitive advertising space with its Adsense product. It's easy to use, they've tuned it up nicely over the last three or so years, and now they have added the product-oriented "pay-per-action" advertising option in addition to "pay-per-click".

Another truism is that anytime you have a product or service that makes a big impact, cottage industries spring up around it. Adsense is no exception, and there are now gazillions of SEO sites, services, forums, and purveyors of tracking products -- and of course, dozens of e-books about Adsense. Most of these e-books are grossly overpriced, contain a lot of "fluff" around the "good stuff" (if there is much of it), and can be safely ignored.

But there is some "good stuff" to be learned. Here's my summary:

1. The best-performing Adsense ad layout is the 336x280 format.

2. The best place for it is just under the title of your content, and just before the main content - right in the middle of your content pane (not on the side).

3. The best-performing combination of link, border, background, URL and text colors is:

google_color_border = "333366";
google_color_bg = "ffffff";
google_color_link = "000080";
google_color_url = "008000";
google_color_text = "000000";

(the FFFFFF or white border is a close second)

4. The best link color performance is with navy, #000080.

5. The best URL color in the ad is #999999 - a dark gray.

6. The best border color is #333366, a muted grayish navy.

7. The best background color is #F9DEFC, a kind of light pinkish color. The background should, of course, blend with your page background, with white being the next best choice.

8. The top text color is #000001 (basically, black).

9. Traffic from different search engines yields different click-through ratios (CTR), with Froogle being the best (especially if you have product ads), then MSN, then Dogpile, then Yahoo.

10. Google's web search ads are not good performers.

The ad at the top of this post mirrors most of these findings.

So basically, depending on which e-book you buy for $79 or even $99, the above summarizes -- believe it or not -- 99 percent of what you are likely to get for your money (assuming the book isn't one of the outright ripoffs). Contrary to what many of these promoters claim about their custom tracking products and scripts, I believe the best way to track performance is to use channels, review the statistics in your Adsense login, and use Google Analytics, which is FREE (as in beer).

And above all: the best way to make money with Google Adsense (or any competitor) is to have real, unique content and plenty of high-PageRank links pointing at your stuff from other sites - and not to worry so much about all this keyword bullshit.

You read it here first! Save your money.

Private Jets for Climate Change, Anyone?

One of the Live Earth concert promoters was quoted: "Live Earth will produce about 74,500 tons of CO2. We would have to plant 100,000 trees to offset the effect of Live Earth."

Heh. No shit, Sherlock! Let's solve HIV/AIDS and poverty first, where we get $40 worth of return on every dollar invested, instead of the measly 4 cents from investing in Kyoto Treaty stuff. By the time we've got those done, the next Little Ice Age will have started kicking in (according to the real long-term climatic record) and we won't have to worry about global warming.

7/06/2007

Database-Agnostic or Database-Specific .NET Architecture?


This is one of those flaming debates which really derives from (or perhaps just implements the interface of) the "Stored Procedures Are Evil" debate. I title it thusly because that's the real issue in my mind: not whether one should use stored procs or parameterized textual queries, but whether code should be written using a provider model that allows any RDBMS to be "plugged in", or whether to exploit all the features of a particular database system and say "be damned" to the provider model. In other words: "this is the app, we take advantage of x, y and z special features of xyz-brand database that we wanted to use, and you can't switch databases with this app." So what the hell is wrong with that? For certain apps, that can be a good thing.

Here are a few juicy links on the original issue to get your blood boiling, if you haven't seen them already:

Bouma

Howard

Miller

DeBetta

Attwood

Personally, I can and have taken both approaches. I've used the Enterprise Library (3.0 and now 3.1) to have an infrastructure that is RDBMS-agnostic, as long as there is a provider written for the particular brand of database you want to use with it. In fact, although we didn't use it for the client, I reworked and unit-tested an Oracle provider (ODP) for it. I've used CodeSmith with NetTiers; I've used db4o (which doesn't even use an RDBMS); I've used SubSonic (a hybrid) and others. Do they handle full-text search? Nope. If you want that, you need to make a decision, like "this is the app" above, and stick to it.

I have no quibble with the "ORM d00ds" insisting on not using stored procs and doing all the database logic "not in the database" but within their mapping and DAL classes. An ORM framework is the type of app that you probably do want to be database-agnostic. The problem is, that's not the only architectural modality we ever face as developers. Look, for example, at SQL Server 2005: you've got Service Broker for transactional, MSMQ-like notifications and updates; you have SQL cache invalidation; you've got table variables, with which you can do some sophisticated, lightning-fast data manipulation that would be virtually impossible to perform from outside the database. You have built-in Web services, triggers, complex cascading-delete logic, user-defined functions - all of this stuff either cannot be reproduced outside the database, or is less efficient when it can be.

You've got updateable views, the XML data type, and the list goes on. And finally, you have CLR-hosted .NET code hooked into T-SQL stored procs, where literally the sky is the limit on what you can accomplish. In fact, I just finished reading a post on the MS C# newsgroup where Nick Paldino claims that CLR-hosted .NET UDFs for parsing delimited strings are faster than T-SQL code - and he is probably right.
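
To make that concrete, here's a minimal sketch of the kind of CLR table-valued UDF Nick is describing - the names and the trivial split logic are mine, not his:

using System.Collections;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public class StringUdfs
{
    // Exposed to T-SQL as a TVF: SELECT item FROM dbo.SplitString(@csv, ',')
    [SqlFunction(FillRowMethodName = "FillRow", TableDefinition = "item NVARCHAR(4000)")]
    public static IEnumerable SplitString(SqlString input, SqlString delimiter)
    {
        return input.Value.Split(delimiter.Value[0]);
    }

    // Maps each element returned by SplitString to one output row
    public static void FillRow(object obj, out SqlString item)
    {
        item = new SqlString((string)obj);
    }
}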

Are you gonna tell me that I have to "dumb down" SQL Server 2005 (or Oracle, or whatever my chosen RDBMS platform is) and use only the most basic features in my application so that it can conform to your narrow-minded view of the world, whose mantra states that "Stored Procedures Are Evil"? That's Honky-Code, man!

Let me try to boil it down to a basic theme: databases like Oracle and SQL Server 2005 aren't just "CRUD" any longer. They offer advanced features. If you, as a developer, are entrenched in this CRUD-and-portability thing, you may think that you are pretty advanced as a programmer, but the fact of the matter is you've put yourself into a box where you may never be able to take advantage of everything an advanced product offers. Some applications do fine with the provider-based, DB-agnostic "no sproc" model; other applications can and should be designed to take full advantage of the strong features of a particular brand of database - with the idea that you are NOT ever going to switch it to a different brand. The main lesson here is that when people start making blanket statements such as "Stored Procedures Are Evil", learn to run the other way. The best approach, in my view, is to look at each situation and avoid putting your thinking into restrictive boxes via "XYZ is evil" pronouncements.

That's my take on it. What do you think?

Practically Done Events

I recently heard from Jonathan Goodyear of ASPSoft, who tells me they are now featuring a new series of short seminars for .NET developers who can't "take a whole week off" for a conference. Here's the "Practically Done" page from Jon's site. Their first event features John Papa, who has an excellent reputation and has written some really good MSDN Magazine articles. Jon tells me that future sessions will feature AJAX and Silverlight. Something to keep an eye on! Here is a link to the event setup.

7/03/2007

ASP.NET: Modifying display of DataBound Items based on values of data

"Now you can spend like a drunken liberal on a government grant with our new flat-rate shipping." -- conservative online T-Shirt company

A common question you see on forums and newsgroups is "how do I do xyz with a databound Repeater field based on the value of the data?" (e.g., show an image, change the display text, etc.). There are several ways to do this; here are some samples from recent newsgroup posts oriented around the Repeater control:


1. Bound expressions:

<ItemTemplate>
<asp:Label runat="server" ID="Label1"
Text='<%# (int) Eval("DataField") > 0 ? "Greater" : "Less or Equal" %>'/>
</ItemTemplate>


2. Handling the ItemDataBound event (or RowDataBound for GridView):

protected void repeater_ItemDataBound(object sender, RepeaterItemEventArgs e)
{
    RepeaterItem item = e.Item;
    if (item.ItemType == ListItemType.Item ||
        item.ItemType == ListItemType.AlternatingItem)
    {
        Label label = (Label)item.FindControl("Label1");
        DataRow row = ((DataRowView) item.DataItem).Row;
        int value = (int) row["DataField"];
        label.Text = value > 0 ? "Greater" : "Less or Equal";
    }
}


3. Call a code-behind method on the bound data.


<ItemTemplate>
<asp:Label runat="server" ID="Label1" Text='<%# MyMethod(Container.DataItem) %>'/>
</ItemTemplate>

and in the code behind:

protected string MyMethod (object DataItem)
{
DataRowView dr = DataItem as DataRowView;
if (dr == null)
return "";
return dr["Column1"].ToString() + dr["Column2"].ToString();
}

Finally, congrats to friend and fellow MVP Stan Schultes on being featured on MVP Insider!

7/02/2007

Saving Rendered ASP.NET controls to files

This is a pattern I use frequently: if I have rendered some RSS-format search results to a page, I may want to save the generated HTML of the DataList (or a GridView, or any other UI display control) to a file, which can easily be read back later, assigned to an HtmlGenericControl, and attached to a PlaceHolder on an ASPX page - kind of like a file-system caching mechanism:

this.DataList1.DataBind();
System.IO.StringWriter oStringWriter = new System.IO.StringWriter();
System.Web.UI.HtmlTextWriter oHtmlTextWriter = new System.Web.UI.HtmlTextWriter(oStringWriter);
DataList1.RenderControl(oHtmlTextWriter);
System.IO.StreamWriter sr = null;
string fullFilePath = Server.MapPath("mySavedDataList1.htm");
try
{
    sr = new System.IO.StreamWriter(fullFilePath);
    string oStuff = oStringWriter.ToString();
    sr.Write(oStuff);
}
catch (Exception ex)
{
    // Exception Handler here
}
finally
{
    if (sr != null)
    {
        sr.Close();
        sr.Dispose();
    }
}


What this does is wrap an HtmlTextWriter around a StringWriter, then call the RenderControl method of the control whose contents we want to save, into the HtmlTextWriter. We then create a StreamWriter into our file path, call the ToString() method on the StringWriter to get out our content, and using the StreamWriter, we write the content to the file.

Later on, if there is a request for this specific content, instead of having to go to our database or make our WebClient search all over again, we can simply look to see if we have cached the content on the filesystem. If so, we only need to load it and display it.
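
And the read-back half of the pattern, as a minimal sketch - the post doesn't show this part, so PlaceHolder1 and the System.IO calls are my stand-ins:

// Later request: serve the cached fragment instead of re-querying
string fullFilePath = Server.MapPath("mySavedDataList1.htm");
if (System.IO.File.Exists(fullFilePath))
{
    string cachedHtml = System.IO.File.ReadAllText(fullFilePath);
    System.Web.UI.HtmlControls.HtmlGenericControl div =
        new System.Web.UI.HtmlControls.HtmlGenericControl("div");
    div.InnerHtml = cachedHtml;
    PlaceHolder1.Controls.Add(div);
}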

Coder's Block and Braille on Drive-Up ATMs

Q. Why do they have Braille dots on drive-up ATM machines?
A. Since Braille is required on the ones that are installed in walk-up locations, it is cheaper to make only one model.

At first the above question seems ridiculous, but once you get into "real world" economics, the true answer seems quite logical, doesn't it?

Did you ever get Coder's Block? It's like writer's block - you sit down knowing that you intend to work on something, and, well, for one reason or another, nothing happens. I generally don't ever get this at work -- the pressure of having to produce and being on the time-expense report means that although my performance might go up and down, at least I'm always producing.

However, I do get it when I am working on my own projects, and I've got a bit of it now. I'm working on a prototype "link purchase" system that includes an affiliate commission program, and I'm at the part where I need to integrate PayPal payments with IPN (Instant Payment Notification), which will be used to record a paid PayPal transaction in a database - and I'm going in circles. I'm trying to make it as generic as possible, so that all the values can be stored in a configuration file and it can be re-used for virtually any type of product or service sold via PayPal. I've got a sandbox account and credentials, and I am basically ready to go, but...

I keep downloading sample PayPal ASP.NET code (much of which is in VB.NET -- which "could" be part of the problem!), playing with it, and then abandoning it and scouring the Web looking for more. I'm sure I'll break out of the Coder's Block rut, but I wonder what advice experts may have on this subject!

I'm a big fan of "Don't reinvent the wheel" and so far I've found some pretty good open-source implementations of Paypal with IPN notification, but for some reason, they don't seem to want to "fit" this situation. If you have used a particular implementation that you like, feel free to comment here and we'll feature it.

N.B. Rick's link in his comment got chopped by Blogger, so I'll reproduce it here. It's definitely one of the better descriptions of how to handle IPN, too. His advice about skipping the Sandbox and just testing with very small prices is right on the "money".