OpenID Prime Time Redux, now with Microsoft!

"We don't all agree on everything. I don't agree with myself on everything..." -- Rudy Giuliani

Not too long ago, I opined about whether OpenID was really ready for Prime Time.

A short while ago, Microsoft announced a deal with JanRain, VeriSign and Sxip to develop integration between Microsoft CardSpace and the open-source OpenID project. In addition, the partners announced that Microsoft would be bringing its anti-phishing technology to the OpenID platform - important, because any time you have an open-source, open-standards Single Sign-On infrastructure, you are going to attract dastardly hackers who attempt to abuse it.

I believe that these are new technologies that will have a significant impact on the future of the Internet, identity and single-sign-on (SSO) as we know them.

OpenID is an open, decentralized, free framework for user-centric digital identity. It attempts to solve the problem of Web single sign-on - something Microsoft first attempted with Passport (now "LiveID"), only to meet stiff resistance. If you struggle to keep track of different usernames and passwords at the various websites where you have accounts, OpenID can help. With OpenID you are assigned a standard identifier (typically a URL) that you can use on all sites that support OpenID. For example, mine is "pbromberg.myopenid.com". But the key point is that OpenID is "designed not to crumble if one company turns evil or goes out of business".

Anyone can get started using an OpenID - get one at myopenid. Now the cool news (besides the MS-JanRain et al dealie) is that Jason Alexander has started working on a C# port of a Boo control that implements OpenID. Apparently Scott Watermasysk and Scott Hanselman have jumped on board, and they've started an open project on it at ... no, not codeplex.com, but Google Code!

I don't know Watermasysk personally; I do know he's good. But I do know Scott Hanselman - and I can tell you that if he will quit playing with all his gadgets and toys that he always has with him, and decide to work on this, good things will definitely happen fast.

I believe many developers are struggling with TFS for source control, and some have switched to SVN, which Google Code uses. We use SVN at work with the Ankh VS.NET plug-in, and so far I have no complaints about it at all. Umm, the price is pretty good too ....

I left a comment on their Google Code project site saying that I was willing to contribute, but I also suggested that they ought to search around first. There is suddenly a lot of interest in OpenID (and CardSpace integration), and now that Starship Seattle has jumped on it, I expect attention to ramp up big time, right away. I did hear back from Jason, and I got the impression that they have a mature plan to handle it.

In other news

How many times have you read a forum or newsgroup post that was so ReallyReallyDumb that you felt compelled to answer with the URL of a Google search on the subject? I know I have, if for no other reason than to apply the old "teach a man to fish" proverb. Now there's a site that will get the message across for you! Just JFGI (for OpenID)


Dynamically Loading an Image from an External Assembly

This one came up recently in the C# newsgroup, so I thought I'd take a quick crack at it:

private void button1_Click(object sender, EventArgs e)
{
    // requires: using System.Reflection; using System.IO; using System.Drawing;
    Assembly asm = Assembly.LoadFrom(System.Environment.CurrentDirectory + @"\AssemblyResources.dll");
    Stream strm = asm.GetManifestResourceStream(asm.GetManifestResourceNames()[0]);
    Bitmap b = (Bitmap)Image.FromStream(strm);
    pictureBox1.Image = b;
}

What you are doing is:

1) Assuming the "external" assembly is a managed assembly, its name is "AssemblyResources.dll", and it resides next to the executable (for ASP.NET you would have to use Server.MapPath, some combination with Request.PhysicalApplicationPath, or the like), we load the assembly into the AppDomain using the LoadFrom method.

2) We extract the resource into a stream using the GetManifestResourceStream method.

3) To make it easier to know the exact names of resources (which can be tricky) I use the GetManifestResourceNames method, which returns a string array of the full names, and just get the first ([0]) one.

4) Use the Image.FromStream method to get an image (cast as Bitmap) and

5) Assign it to your PictureBox's Image property.

I use this all the time, in ASP.NET Server Controls, so that I don't have to worry about deploying needed images or other resources in the right place - they are right there embedded in my assembly. Note that in this case the assembly has no methods, it is used solely as a container for resources.

Other uses for this: you can compress a database table saved as CSV data and store the file as an embedded resource. By adding the necessary code to decompress, your class can extract the CSV data, split the strings, and store them in a DataTable, which gives you SQL-like Select and Find features - in effect, an "embedded database" right in your product.
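Here's a minimal sketch of that "embedded database" idea. The class name and sample data are hypothetical, and the CSV string here stands in for what you'd actually pull (and decompress) from GetManifestResourceStream:

```csharp
using System;
using System.Data;
using System.IO;

class EmbeddedCsvDemo
{
    // Parse CSV text (first line = column names) into a DataTable.
    // In the real scenario you'd read the text from
    // asm.GetManifestResourceStream(...) after decompressing it;
    // the sample data below is made up for illustration.
    public static DataTable CsvToDataTable(string csv)
    {
        DataTable table = new DataTable("Csv");
        using (StringReader reader = new StringReader(csv))
        {
            string header = reader.ReadLine();
            foreach (string col in header.Split(','))
                table.Columns.Add(col.Trim());

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split(','));
        }
        return table;
    }

    static void Main()
    {
        DataTable t = CsvToDataTable("Id,Name\n1,Widget\n2,Gadget");
        // DataTable.Select gives you the SQL-like filtering mentioned above:
        DataRow[] hits = t.Select("Name = 'Gadget'");
        Console.WriteLine(t.Rows.Count);  // 2
        Console.WriteLine(hits.Length);   // 1
    }
}
```

From there, DataTable.Select and DataRowCollection.Find do the "embedded database" work for you.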


Microsoft Junking GotDotNet.com site

According to the site:

"We are phasing out GotDotNet for the following reasons:
Microsoft wants to eliminate redundant functionality between GotDotNet and other community resources provided by Microsoft

Traffic and usage of GotDotNet features has significantly decreased over the last six months

Microsoft wants to reinvest the resources currently used for GotDotNet in new and better community features for our customers.

Phase Out Schedule: The GotDotNet phase out will be carried out in phases according to the following timetable: (etc, etc)"

Basically what we are saying here folks, is "We don't think there's enough interest in this site, and we think we have other resources that are better, so take it or leave it".

Well! Sez who? I've got samples up there that have been downloaded OVER 47,000 times, and I myself have downloaded sample code contributed by others probably a hundred times or more.

Yes, I know you've got Codeplex.com. It's very nice - but it's not really the same, and it's a lot harder to "upload a sample" there - you have to start a whole new project.

You know, you could offer it out to the developer community and see if they would like to keep it going... Smells like something more political than common sense business to me!

Boo Hoo for you, Microsoft.

On the Plus Side

On the plus side, Microsoft is now making oft-requested hotfixes available for immediate download. Find your favorite here.


SQL Server: "Don't Reinvent the Wheel" Department

In your travels as a professional software developer - especially when you come into a new position and need to get used to a new enterprise and its programming-related environment (the tools, the programming style, the existing codebase, etc.) - you get to observe some of the repetitive coding patterns that people resort to in order to solve their problems.

One of the most common ones I've observed is that developers do not have a full understanding of how ADO.NET and connection pooling work. It seems that instead of seeking out and using best-practices code and techniques, some people, either through lack of knowledge or just plain stubbornness, feel compelled to "roll their own" DAL and database layers. Mistakes and poor design often result.

Back around 2001, I found the Microsoft Data Access Application Block ("SqlHelper") class, which quickly solved a whole bunch of data access problems for me, and in fact I still use it today - 5 years later (the v2 version, not the more recent provider-based version) for most of my data access with SQL Server. Knock-offs have been created for Oracle, etc. and I even wrote one for SQLite.

The SqlHelper class is one of the true gems of the Patterns and Practices group, IMHO. If you aren't familiar with it, you really owe it to yourself to take a look. In all this time, the only addition or change I've ever needed to make was to add a CommandTimeout feature, which isn't present in the existing code.

What is often surprising to me is how few developers actually use this as the "talk to the database" portion of their DAL code. Yo! Don't reinvent the wheel! But they still do, and often they make real design boo-boos in doing so. I've actually seen a custom connection pool one dev created because he wasn't fully aware of the best-practices technique of "open the connection using the same connection string, do the work, and immediately close the connection and let it go back to the pool" -- somebody had told him that there was a lot of overhead in creating and tearing down connections! Wait a minute -- I thought that's why they already give you the connection pool! The coolest thing about the SqlHelper class is that you can get a DataSet from a stored procedure with parameters in ONE LINE OF CODE -- and it will even cache the SqlParameters for you! DOH! You wanna reinvent the wheel? Knock yourself out -- I wouldn't hire you!

Here is an example of the above "one liner":

DataSet ds = SqlHelper.ExecuteDataset(connectionString, "dbo.StoredProcName", new object[] { paramValue1, paramValue2 /* etc. */ });

Here's a link to the class:
Data Access Application Block v2

VB.NET Folks, don't sweat. It comes in two flavors.

Scott Guthrie pointed out another tool that recently came out. I originally had modified the Admin assembly of their Web Data Administrator application to use it in a Windows Forms app for my Database Export Utility, but the new Database Publishing Wizard is much more sophisticated.

Here are a couple of links to additional resources courtesy of "Mr. ASP.NET":

Deploying SQL Database to Remote
Upload SQL File to Hoster and Execute

Finally, don't forget that SQL Server 2005 Service Pack 2 final is out!


Meet Jack Murtha: Official Saboteur of US peace efforts in Iraq (and parallels to programming)

Since last December, Mr. Murtha has become the hero of the antiwar crowd, and, as we've seen with other such individuals, scrutinizing their behavior is considered disrespectful. Few might recall (since many aren't even old enough to remember) that after the massive 1980 Abscam scandal, Mr. Murtha was named by the FBI as an unindicted co-conspirator.

Sez Murtha, "Once we get out of there, it will be more stable in Iraq." Oh, really? Jack Murtha is an idiot. If he has his way and we get out of there, even many left-wing Democrats agree that all hell will break loose, and we will be in much more danger than we are now - not only from Al Qaeda, but from Iran and other destabilizing influences.

Newt Gingrich: "It's conceivable that Murtha woke up one day a year ago and said, 'You know, if I don't start bashing America, and bashing the military, and repudiating everything I've stood for my whole life, these guys aren't going to allow me to be chairman of the committee that spends the money.'"

I have no problem with people wanting us to get out of Iraq. I want that too. But when you sabotage our best efforts through subterfuge and chicanery in such a manner as to not be held accountable for the results, that's not good public or foreign policy at all. Democrats? Where's the PLAN? I still don't see one.

The recent House vote, and the off-the-wall Liberal Democrats and spineless Republicans who supported it are at fault. These are the same liberals that believe that:

* Global warming exists as our fault just because Al Gore said it was a problem.

* President Bush went AWOL in the National Guard just because Dan Rather reported it on 60 minutes.

* All Christians are extremists.

* Universal healthcare will actually work here in the US.

* All conservatives are racist towards blacks and Mexican Americans.

* All conservatives are white, upper class males.

* And who protest for the sake of protesting, without a full and complete understanding of the issues.

I'm sorry to say it, but the American Left is not well -- it's driven by rage.

They're mad about losing the past two presidential elections, mad about losing their longtime monopoly on how news and information are disseminated. They're mad about increasing numbers of minority conservatives in the Republican Party. They're mad about patriotism, capitalism, religious freedom, our national heritage, and the war on terror.

Disagree? Sure, I expect you will. But, be careful, or I may sic Michelle Malkin and Ann Coulter on you.

Some advice on politics - and parallels to programming:

It is the same with politics, programming, and criticism. For us as software developers, telling someone their code is "total crap" is not helpful. Telling them they misspelled an XML element name, have several places where they've repeated code, and they need exception handling in all of their methods is much more helpful. It not only allows them to improve their current code, but it may help them out in the future as well.

It certainly takes more time to itemize mistakes, but it's worth it. Liberals can learn to be leaders by not criticizing "Bush the President" and instead offering constructive criticism and showing clearly defined, well-thought-out alternatives. Cutting off war funding or creating restrictions so onerous that they paralyze the Commander in Chief aren't "solutions".

It's easy to come into a situation and complain. It's easy to point out the flaws in everything. Democrats who complain and think the solution is a non-binding vote of no-confidence on the war in Iraq aren't providing any real help. Much the contrary -- they may be emboldening our adversaries and weakening our position with potential allies, unless they can present a clearly-defined path to success. Making policy by obstruction is the fastest path to failure.


The views expressed here are my own, except when they are someone else's. Business and politics are unpredictable and unsafe. The Internet is dangerous. Many blogs have been written about these dangers, and there's no way I can list them all here. Read the blogs. The Internet is covered in slippery slopes with loose, slippery and unpredictable footing. The RIAA can make matters worse. Patent trolls are everywhere. You may fall, be spammed or suffer a DOS attack. There are hidden viruses and worms. You could break your computer. There is wild code, which may be vicious, poisonous or a carrier of dread malware. E-mail can be poisonous as well. I don't do anything to protect you from any of this. I do not inspect, supervise or maintain the Internet, blogosphere, ISP's or other features, natural or otherwise. You are on your own. Figure it out for yourself.


FIX: Flash on Windows Vista

Lots of people (me included) have been complaining that Flash (such as for viewing videos on Youtube.com) doesn't work. You go and install Flash again (as prompted) and when you go back to view the video it's like you haven't done anything -- no Flash.

Here's the fix:

Navigate to C:\Windows\System32\Macromed\Flash

Then right-click over both 'Flash9b.ocx' and 'FlashUtil9b.exe', and choose Properties.

In Properties, choose the "Security" tab. Select the "Everyone" account (or add it if it isn't there), click the "Edit" button, check the "Full Control" box, and click "Apply". Then do the same for your own Windows local account.

Once you have done this for both files, run the FlashUtil9b.exe and it should install and update and tell you to restart. Do so and you should now find that YouTube and other sites that use Flash now work.



I see so much of this on forums and newsgroups that I almost feel compelled to say "something". It seems that somewhere along the way, new .NET developers read (or are told) that the DataReader is the "most efficient way" to get data out of a database. What happens next is they take this 100% literally and get themselves into all kinds of trouble - blowing up connection pools, trying to do things that you cannot do, and so on. The DataReader becomes the idée fixe, the de facto method for getting data, no matter what.

Yes! The DataReader IS the fastest way to get data out of a database. However, that comes with a price you need to evaluate: DataReaders hold the connection open. You cannot have a method that just "returns a DataReader" - leaving you free to blithely do whatever you want - unless CommandBehavior.CloseConnection was used in the initial SQL call and you are prepared to call the Close method on your reader.

In addition, DataReaders offer only ONE TIME firehose-style, forward-only access through the resultset. You cannot "reset" a DataReader; you cannot "go backwards". One time through, and that's it, pal! The ONLY other thing you can do is switch to the next resultset in the reader, if there is one. And if you do not explicitly call the Close method on the Reader (or the connection), preferably in a finally block, your app will come to a screeching halt when the 100 connections in the pool are gone. PLONK!

By contrast, the TableAdapter (in .NET 2.0) has a Fill method that uses a DataReader under the hood to fill the DataTable and automatically closes the connection for you - and now you've got a DataTable that you CAN do all those cool things with. If you are going to want to sort, filter, page, choose a row, or do any of a dozen other things, then FORGET about using DataReaders!

I should also add, based on one of the comments, that the TableAdapter is a control and comes with a lot of overhead. If you just want a DataTable programmatically, you can use the DataAdapter's Fill method (which also uses DataReader "under the hood") and get your table very efficiently. If you use the Patterns and Practices Group's SqlHelper class ("Microsoft.ApplicationBlocks.Data") you don't even have to worry about connections, because it cleans up after itself automatically.
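As a rough sketch of what the filled DataTable buys you over a one-pass reader (the table and data here are hypothetical, standing in for what a DataAdapter.Fill or SqlHelper.ExecuteDataset call would return):

```csharp
using System;
using System.Data;

class DataTableDemo
{
    // Hypothetical data; in real code a SqlDataAdapter.Fill(table) call
    // (or SqlHelper.ExecuteDataset) would populate the table from SQL Server,
    // opening and closing the pooled connection for you under the hood.
    public static DataTable BuildOrders()
    {
        DataTable orders = new DataTable("Orders");
        orders.Columns.Add("Id", typeof(int));
        orders.Columns.Add("Amount", typeof(decimal));
        orders.Rows.Add(1, 50m);
        orders.Rows.Add(2, 125m);
        orders.Rows.Add(3, 75m);
        return orders;
    }

    static void Main()
    {
        DataTable orders = BuildOrders();

        // Things a one-pass, forward-only DataReader can't do:
        // filter and sort the rows after the fact...
        DataRow[] big = orders.Select("Amount > 60", "Amount DESC");
        Console.WriteLine(big[0]["Id"]);  // 2 -- the 125.00 row sorts first

        // ...and iterate the same rows as many times as you like.
        foreach (DataRow r in orders.Rows) { /* first pass */ }
        foreach (DataRow r in orders.Rows) { /* second pass - try THAT with a reader! */ }
    }
}
```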

In my opinion, the slight additional overhead of filling a DataTable with the contents of the temporary DataReader is so small as to be inconsequential. This does not even address the fact that any good developer is usually able to cache most database data, another useful feature that is overlooked a good part of the time. I have finally trained myself to ask, every time I get data out of a database, "Can I cache this? If so, what kind of caching do I want to use?".
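A bare-bones sketch of that "can I cache this?" habit (the cache key and data are made up, and a real ASP.NET app would use the Cache object with an expiration policy rather than a plain Dictionary):

```csharp
using System;
using System.Collections.Generic;

class SimpleCache
{
    // Check the cache first; hit the (simulated) database only on a miss.
    static Dictionary<string, object> cache = new Dictionary<string, object>();

    public static int DatabaseHits = 0;

    public static object GetStates()
    {
        object data;
        if (!cache.TryGetValue("states", out data))
        {
            DatabaseHits++;  // the expensive call happens only once
            data = new string[] { "FL", "GA", "NY" };  // pretend DB result
            cache["states"] = data;
        }
        return data;
    }

    static void Main()
    {
        GetStates();
        GetStates();
        GetStates();
        Console.WriteLine(DatabaseHits);  // 1 -- the repeat calls came from cache
    }
}
```

For rarely changing lookup data (states, categories, and the like), this one check can eliminate the vast majority of your database round trips.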

In most cases, unless you know exactly what you are doing and why you need a DataReader, developers will be better served by FORGETTING ABOUT DATAREADERS, and using DataTables and DataSets.

Just my two cents.

FIREFOX / IE Word-Wrap, Word-Break, TABLES FIX

One of the most annoying things a developer (who would normally delight in writing code, not futzing with markup) can run into is rendering differences between browsers. The two biggest players are of course Internet Exploder and Firefart (as I lovingly like to refer to them). In most cases those two will account for 98 percent of your total site traffic.

IE has had a proprietary "word-break" style attribute for a long time, and it made it into the CSS 3.0 spec. But that doesn't necessarily help you with Firefox right now. Firefox literally (and I consider this completely idiotic, since it's been in their bug database for FIVE YEARS) does not have a reliable CSS style property to force table cell content to break in the middle of a word, in order to stop the content from expanding your table / div off the page or over other content on the page. Here's a partial fix, which will work for most tables and browsers. The style declaration (I call it my HardBreak class):


.HardBreak
{
    /* this will force a column to be no wider than 300px,
       breaking words in the middle of a long word if necessary. */
    width: 300px;
    white-space: pre-wrap;       /* css-3 */
    white-space: -moz-pre-wrap;  /* Mozilla, since 1999 */
    white-space: -pre-wrap;      /* Opera 4-6 */
    white-space: -o-pre-wrap;    /* Opera 7 */
    word-wrap: break-word;       /* Internet Explorer 5.5+ */
}


You could also replace the .HardBreak class selector above with the TD element selector, and the style would automatically apply to all TD elements. Remove the width attribute in that case, and set the width on the TD element itself.

The only issue I found with this is that if you want a table column to observe your stated width, you must apply this class to EVERY CELL IN THE COLUMN. Otherwise, it seems to behave nicely in IE, Firefox, and Opera. Also, you may need to remove or comment out the "white-space" declarations in the style, as they can mess up the rendering of lines in Mozilla browsers. I've left them in here, however.

This will force long urls (for example) that have no spaces in them to wrap to the specified width automatically (even breaking in the middle of a word) and not throw the widths of your tables or divs all to hell.

Really, this kind of immature crap from the browser manufacturers has to stop. It ain't rocket science to all get together and agree on CSS handling that works with everybody's product. Forget about the rivalry - what about the damned consumer! And please don't hand me this "Get Firefox, IE sucks" crap. In this case, IE has a working CSS property and Firefox didn't implement it, because it didn't make it into the CSS 2.0 spec - even though it's been in their bug database forever. Fortunately, the IE-specific property made it into CSS 3.0 - but you purists will still have to wait.


XAML, Expression Blend, and WPF

Microsoft is apparently moving forward as fast as it can with integrated "next gen" tools like Expression Blend that are XAML-compliant.

XAML has a lot of promise, especially because of its high level of adaptability to animation, Windows Forms-like controls, "play" content in the browser, and so on.

I do a lot of work with Maya on the side. Mostly, I take my digital photographic work and do "digital photo collage" by overlaying photos as the material attributes of various shapes, often replicated in 3D with MEL scripting, setting raytracing, refraction, reflection, lighting and other features, then doing a master render to a large TIFF file (sometimes 65MB). I have a friend who runs a studio here in Deland, FL, and he has an expensive giclée printer, so he can put these out as large-format prints on expensive paper for me (that is, the "good ones", 'cause this can cost $100 a print -- here is a link to a small-size one).

At any rate, Thomas Goddard came out with a Maya 8-compatible XAML export plug-in. I converted my NURBS objects to polygons and exported the view to XAML. Then I loaded the XAML document into Expression Blend. Surprise! In design view, it rendered very realistically. But - uh-oh - when I tried to compile it, it blew up with "Out of Memory" errors. Too bad; guess it needs more work! It has promise, though, I'll say that.


Did Chuck Norris Kill SOA?

It's fun to get a handle on what people are interested in on the web. At my ittyurl.net site, I have a ticker that shows the most popular internet searches for the day, constantly refreshed, and when you click on one, it takes you to the search results for that term on my search page - on site.

This is extremely interesting to me because it not only keeps "counts" of the most popular (and recent) searches for everyone to see, it also increments the count every time somebody clicks on one of the searches that are already there. So you get kind of a mixed bag - what people are searching for on the web in general, plus what of my current list of top searches are getting "re-clicked". Here's a sample of a recent top 20:

Saddam Hussein
aishwarya rai
grey's anatomy
cross-page postback
nfl draft
britney spears
james brown
beyonce knowles
new years revolution 2007

You can see quite easily that people are primarily interested in celebrities, Football, TV, movies, and gory stuff (Saddam Hussein). There are a few others that made it to the top 20 (web.config, for example) because they were already in the list (probably from me, although they could get into the list from anybody's search) and then a lot of people clicked on them.

I built this site primarily for myself - I wanted an easy way to keep lists of links (especially very long URLs), convert them to short URLs, and index / tag them so they'd be easily searchable. Then I thought: why not just make everything public, and let other people use it if they like it? So it's gaining popularity every day, and I get to "experiment" with my new ideas and get quick feedback on them. The main assembly for the whole site is still only 92K!

If I get a new idea, I code it up, test it, and redeploy the site. It takes about 1 minute. I have a lot of private metrics that only I (as admin) get to look at, so I get quick feedback on what works and what doesn't. Everything that happens on this site - every click, every referrer, every querystring - gets logged to the database, where I can easily do various kinds of data-mining.

So the bottom line of all this is that if you want to attract traffic to your .NET / programming-related material, you should probably consider tying it in some way to Chuck Norris. I mean, if you are coming out with a .NET SOA Toolkit, consider calling it the "Britney SOA Spears Toolkit". You'd get a lot more interested folks, huh?

Then again, you might just attract the wrong crowd... But you will never know unless you try it, right?

Einstein defined insanity as "doing the same thing over and over again, expecting different results". How many people (yourself included) have you seen doing this? People who go around in circles, attempting to solve a problem, always following the same pattern in the vain hope that some outcome will change? What you need to do is try different things, and measure the results. That's not insane at all. That's the way to make things change, and move forward.


SQLite, ADO.NET, Prepared Statements, Transactions and Enterprise Manager

Anybody who has read some of my "stuff" (particularly, articles at eggheadcafe.com) will know that I am a big aficionado of the SQLite database engine. Open source and Free, extreme portability, blazing speed, no installation, and many other features make this a really competitive solution for almost any project except the largest databases. I don't usually recommend software and add-ons, but recently I came across the SQLITE PRO Enterprise Manager.
NOTE: The above link is dead, apparently the guy gave up his domain. I'm posting the below link which is to the 3.61MB download of my original free version on my SkyDrive public folder.

Now this is the equivalent of Enterprise Manager for SQLite 3 databases, and it is slick. Import and Export from Access or even SQL Server and the guy even put in a little "blob viewer" so you can look at pictures you've stored in your tables. Oh, and did I mention the price? It's free.

Another issue with SQLite that comes up often (as well as with other Databases) is the issue of using ADO.NET Transactions and prepared statements for inserts.

Rather than launching into a lengthy discussion, I think it would be better to share the comments of Robert Simpson, the developer who single-handedly brought us System.Data.SQLite for ADO.NET 2.0. Robert and I have corresponded frequently; this comes directly out of his help file for the provider, and I am sure he would approve; here it is:

The Importance of Transactions

If you are inserting data in SQLite without first starting a transaction: DO NOT PASS GO! Call BeginTransaction() right now, and finish with Commit()! If you think I'm kidding, think again. SQLite's A.C.I.D. design means that every single time you insert any data outside a transaction, an implicit transaction is constructed, the insert made, and the transaction destructed. EVERY TIME. If you're wondering why in the world your inserts are taking 100x longer than you think they should, look no further.

Prepared Statements
Let's have a quick look at the following code and evaluate its performance:

using (SQLiteCommand mycommand = new SQLiteCommand(myconnection))
{
    int n;

    for (n = 0; n < 100000; n++)
    {
        mycommand.CommandText = String.Format("INSERT INTO [MyTable] ([MyId]) VALUES({0})", n + 1);
        mycommand.ExecuteNonQuery();
    }
}
This code seems pretty tight, but if you think it performs well, you're dead wrong. Here's what's wrong with it:

I didn't start a transaction first! This insert is dog slow!

The CLR is calling "new" implicitly 100,000 times, because I am formatting a string in the loop for every insert.

Since SQLite precompiles SQL statements, the engine is constructing and deconstructing 100,000 SQL statements and allocating/deallocating their memory.

All this construction and destruction involves about 300,000 more native-to-managed interop calls than an optimized insert.
So let's rewrite that code slightly:

using (SQLiteTransaction mytransaction = myconnection.BeginTransaction())
using (SQLiteCommand mycommand = new SQLiteCommand(myconnection))
{
    SQLiteParameter myparam = new SQLiteParameter();
    int n;

    mycommand.CommandText = "INSERT INTO [MyTable] ([MyId]) VALUES(?)";
    mycommand.Parameters.Add(myparam);

    for (n = 0; n < 100000; n++)
    {
        myparam.Value = n + 1;
        mycommand.ExecuteNonQuery();
    }

    mytransaction.Commit();
}
Now this is a blazing fast insert for any database engine, not just SQLite. The SQL statement is prepared one time -- on the first call to ExecuteNonQuery(). Once prepared, it never needs re-evaluating. Furthermore, we're allocating no memory in the loop and doing a very minimal number of interop transitions. Surround the entire thing with a transaction, and the performance of this insert is so far and away faster than the original that it merits a hands-on-the-hips pirate-like laugh.

Every database engine worth its salt utilizes prepared statements. If you're not coding for this, you're not writing optimized SQL, and that's the bottom line.

Many developers do not fully understand the importance of using prepared statements with parameters where stored procedures (the preferred method in almost all cases) cannot or should not be used. The importance of this, combined with the practice of wrapping repeated inserts or updates in a transaction, cannot be overstated. If the above all makes sense, then you'll like my next installment, "Did Chuck Norris kill SOA?".


FREE LINKS (as in "free beer")

"The significant problems we face cannot be solved by the same level of thinking that created them." -- Einstein

Yep. Review my UnBlog and get a free link! Increase your website or blog's PageRank (and ours) by building backlinks.

The rules are simple:

1) Write up a review about Peter Bromberg's UnBlog (at least 200 words) containing a link to BOTH my main page and this page, and post it on your blog or website. You can say whatever you want. I hope it's something nice, but it doesn't have to be.

2) Email me a link to your review once it is up (remove the ".nospam" from the email address).

3) I will respond by adding a link to your review (and of course, to your site or blog) on my links post, which will appear as an entry on the sidebar of every page of this UnBlog, as soon as I have a few links to list.

Oh - and, I probably don't need to say this, but I will: If you've got offbase content, adult or offensive material, I reserve the right to decline the link offer.

Links Follow:

IttyUrl.Net Free Links, short urls, and more
Nicholas's Blog-O-Rama
Mario's Blog


D00D! Wake Up, .NET WCF is HERE, man!

"There used to be a real me, but I had it surgically removed" -- Peter Sellers

I confess I must have either been asleep at the switch, or Microsoft is just throwing so many CTP's and BETAs at us poor slobs that I got sensory overload.

At any rate, WCF is here. They actually released it last November - Nov 8th, according to Clemens Vasters - and it's pre-baked into Windows Vista, all under the moniker of ".NET Framework 3.0". Even some hosting ISPs have already installed it for their customers.

OK, so what is WCF (Windows Communications Foundation) and why should I care, right?

Let me try to provide a short, meaningful explanation:

Do you think it would be useful to you as a programmer if they took ASP.NET Web Services, Web Service Enhancements, .NET Remoting, Enterprise Services, and System.Messaging, and rolled them all into this new integrated architecture, programming model, and runtime environment and provided a more productive SOA development platform for distributed systems that's easy to learn to use?

That's basically what WCF is. Oh, sure -- there's more to it than that, but the above short paragraph is basically all you need to know. So the next time you need to develop a WebService, or something that uses MSMQ or Remoting, and you'd like to make it interoperable, transport agnostic, and easy to implement, WCF will quickly become your friend. And in fact, you don't even have to touch any WSDL - you can write the service contract / interfaces in C#!
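For a taste, here's a minimal, hand-sketched web.config fragment for exposing such a service (the service and contract names are invented). A WCF endpoint is just the famous "ABC": an Address, a Binding, and a Contract:

```xml
<system.serviceModel>
  <services>
    <service name="MyApp.CalculatorService">
      <!-- A = address (here, relative to the host),
           B = binding (the transport/encoding stack),
           C = contract (the C# interface marked [ServiceContract]) -->
      <endpoint address=""
                binding="basicHttpBinding"
                contract="MyApp.ICalculator" />
    </service>
  </services>
</system.serviceModel>
```

Swap basicHttpBinding for, say, netTcpBinding or netMsmqBinding, and the same service code rides a different transport - that's the "transport agnostic" part.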

Experienced .NET developers, after their first in-depth look at WCF, will probably say what I said -- "This looks just like Remoting" -- which is true, although there is a lot more to it than that!

Now I'm not going to show sample code here. What I want to do is provide a series of resources that you can use as a starting point, and which have passed my "Bromberg's NO-FUD test":

Bromberg NO-FUD Seal of Approval WCF Links:

Microsoft .NET Framework (NetFx3) This is the "Main Page" where you can decipher the acronyms - WCF, WPF, WWE, NFL - you get the idea.

IDesign WCF Samples The IDesign people put out incredibly good work for training, and they make their samples available to the public. Highly recommended. (Hint: when you get your first email with the URL, copy the URL to the clipboard, revisit their list, mouse over the ones you want to download, and then just change the download ID in your pasted URL.)

Thom Robbins Blog Good stuff on WCF ServiceContract and OperationContract by a Microsoft insider.

Alexy Kovrin Video on creating WebServices with WCF. Also, WPF/E.

Al Alberto Debugging WCF apps.

Sam Gentile An old-timer who's really been around the block. Good for WCF, and a lot more.

Aaron Skonnard Service Factory for WCF (MSDN Magazine Article)

WebService Software Factory Page (MSDN)

Juval Lowy Build a Queued WCF Response Service (MSDN Magazine)

Greg McKinley WCF Bloggers and Forums

WCF Integration with NetTiers (CodeSmith - compatible templates)

DasBlonde Michele Leroux Bustamante (now calls herself "Indigo Girl" - thanks for the comment)

There will be more resources shortly, and if you have one you think I should post, feel free to comment. NOTE: Some articles and resources may be a bit behind the times, and namespaces, methods, etc. have changed - so "Developer Emptor".


Stateless Web, ASP.NET and AJAX

One of the most common things in the universe (besides hydrogen and bureaucrats) is the inability of the uninitiated ASP.NET developer to understand the stateless nature of HTTP. This is true not only for ASP.NET, but also for classic scripting platforms such as ASP and PHP.

I say this because we at eggheadcafe.com get lots of forum posts that revolve around this subject. Sometimes they center on attempts to write to the filesystem at the browser from server-side code (which has long since finished running), other times they ask how to tell if somebody has closed their browser, and there are many other variations in between.

But the common theme is a misunderstanding of just how disconnected the server and the browser really are in a web application. Disconnectedness and state are the two central issues web developers face, and they are the basis of the evolution of web applications to include IFRAMEs, javascript callbacks, the XMLHTTP object, and finally Remote Scripting ("AJAX") and JSON (JavaScript Object Notation).


Let's go over some basics: when you put a URL in the browser's address bar, or click a link in a web page, the browser makes an HTTP request for that resource. The two most common request types are GET and POST. GET transmits any additional information as querystring items, e.g. http://www.yoursite.com/pagename.aspx?username=Howard .

A POST occurs when a FORM element is submitted (via its submit method or a submit button). The browser packages the form's field names and values into the BODY of the HTTP request, not onto the querystring.
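On the wire, the two look roughly like this (host and field names from the example above):

```
GET /pagename.aspx?username=Howard HTTP/1.1
Host: www.yoursite.com

POST /pagename.aspx HTTP/1.1
Host: www.yoursite.com
Content-Type: application/x-www-form-urlencoded
Content-Length: 15

username=Howard
```

Same data either way - the only difference is whether it rides on the URL or in the request body.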

The server receives this request and routes it to the appropriate place (a web page, an image, a piece of javascript, etc.), and the resource is "served" back to the client (browser). In ASP.NET, an ASPX web page is processed via an instance of the Page class. Page builds the Control Tree, parses the tags, runs all the logic, and all this is assembled into POHTML (Plain Old HTML) and sent back to the browser.

At this point, the Page class instance at the server is GONE. Zippo, nada, efes, nuttin! It's GONE. If you need to "save" something - some "state" of this transaction - you have to have a separate way of storing it. The ASP.NET Page class is compiled, not interpreted as it is with script languages. But the key thing to understand is that once that page has been "served", the server-side code is gone, and the server has NO WAY to know anything about what may or may not be happening at the client browser until a new request comes in.

If you are a n00b, try to get this concept to stick in your mind, like concrete on the sides of the Grand Coulee Dam. It will help you later. The Web is STATELESS, without extra "stuff" in the equation.


So, what's the "extra stuff" we need?

Traditionally, state storage is performed in several ways - client side and server side.

Client-side state storage mechanisms include cookies, querystring items, and hidden form fields (the last of these only works in an HTTP POST scenario).
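As a quick illustration of the client side, here's a hedged sketch (the helper name is mine) of how script can read values back out of the cookie string the browser stores:

```typescript
// Parse a cookie string of the form "name1=value1; name2=value2" into a map.
// In the browser you would feed this document.cookie.
function parseCookies(cookieHeader: string): Record<string, string> {
  const jar: Record<string, string> = {};
  for (const pair of cookieHeader.split(";")) {
    const trimmed = pair.trim();
    if (trimmed === "") continue;          // skip empty fragments
    const eq = trimmed.indexOf("=");
    if (eq < 0) continue;                  // skip malformed entries
    jar[trimmed.slice(0, eq)] = decodeURIComponent(trimmed.slice(eq + 1));
  }
  return jar;
}

// e.g. parseCookies("theme=dark; user=Howard").user === "Howard"
```

Cookies travel with every request to the issuing site, which is exactly what makes them useful for state - and exactly why you keep them small.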

Server-side state storage mechanisms include Application, Session, and Cache, along with others such as static fields in the Global class, a State Service, or a database.


There are also hybrid mechanisms, such as a hidden IFRAME in a page whose src attribute is dynamically set (via client-side script) to a page on the server with specified querystring items; after the IFRAME has loaded the processed document, client script can use the information in it to update the HTML DOM of the page without the main page having to make a separate request to the server. In point of fact, this was the beginning of "AJAX", and people have been doing it in various forms since about 1996.
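A hedged sketch of the technique (the element id, page name, and helper are all invented): client script builds a querystring URL, points a hidden IFRAME at it, and reads the loaded document when it arrives:

```typescript
// Build the querystring URL the hidden IFRAME will request.
function buildRequestUrl(page: string, params: Record<string, string>): string {
  const pairs: string[] = [];
  for (const key of Object.keys(params)) {
    pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return page + "?" + pairs.join("&");
}

// In the browser (not runnable here), the rest of the trick looks like:
//   const frame = document.getElementById("hiddenFrame") as HTMLIFrameElement;
//   frame.onload = () => { /* read frame.contentDocument, update the DOM */ };
//   frame.src = buildRequestUrl("lookup.aspx", { username: "Howard" });
```

The main page never reloads; only the invisible frame makes the round trip.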

Later on, about 1998, Microsoft came out with Remote Scripting, which uses a combination of javascript, the XMLHTTP object, and/or a small Java applet to get information from the server and update the page in the browser without requiring a "reload" (a re-request to the server).

Later came JSRS (JavaScript Remote Scripting), made popular by Brent Ashley (circa 2000), where javascript is used to do pretty much the same thing as the hidden IFRAME.

The XMLHTTP object is simply a part of the MSXML ActiveX component that allows HTTP requests to be made outside the browser's normal page loads, and it can be driven from both server-side and client-side script. Actually, there is very little XML involved in most cases, so it's poorly named. From a Microsoft perspective, however, that name is what got it into the product, and so it "stuck". The rest is, well, history.

The first big implementation of Remote Scripting was for Outlook Web Access which came with Exchange Server, and although by current standards it could be considered clunky, it was pretty amazing at the time.

Later, Google started implementing advanced Javascript with arrays to do very cool things like Google Suggest. About this time the "other" browsers got religion and decided to follow Microsoft's lead and build their own "XMLHTTP" object into the browser. When that happened, interest in the whole concept soared, and people like Jesse James Garrett came along and decided it was something they could make money promoting to the hordes of knowledge-hungry web developers. So "AJAX" was born - essentially just a catchy new name for Remote Scripting. I've commented on this misnomer to death, so I won't beat a dead horse. Suffice it to say that even Microsoft got snookered into the buzzword hype and now calls its more sophisticated ATLAS Remote Scripting framework "Microsoft AJAX".

Finally, developers should be aware that IFRAME techniques require that the requested resource come from the SAME DOMAIN as the main page, due to security policies all browser makers have adopted.

Strangely, this "same domain" policy does not apply to the src of the SCRIPT tag, so enterprising developers have engineered ways to dynamically inject a script tag into the HTML document; when its src property is set via client script, the browser dutifully requests the resource EVEN IF IT IS FROM A DIFFERENT DOMAIN.

Developers have started using JSON as the return vehicle with this arrangement. JSON is simply text in JavaScript's own object-literal notation; when it is passed into a javascript callback method in the page, or the "eval" function is applied to it, you get JavaScript objects - which can be representations of virtually anything, including handy things like DataSets. This "cool stuff" is then used to update the DOM of the page without a reload.
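Here's a hedged sketch of how the pieces fit together (the function name, callback name, and URL are all invented): the server wraps its JSON payload in a call to a callback function the page has defined, so that simply loading the script executes it:

```typescript
// What the cross-domain server sends back: the JSON payload wrapped in a
// call to a callback function the requesting page has already defined.
function wrapAsScriptResponse(callbackName: string, payload: object): string {
  return callbackName + "(" + JSON.stringify(payload) + ");";
}

// In the page (not runnable here), the client side would be roughly:
//   (window as any).handleData = (data: any) => { /* update the DOM */ };
//   const tag = document.createElement("script");
//   tag.src = "http://other-domain.example/feed?callback=handleData";
//   document.getElementsByTagName("head")[0].appendChild(tag);
```

The browser fetches the script from the other domain, executes it, and the callback receives the data - no XMLHTTP, no same-domain restriction.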

In JSON, objects take on these forms:

An object is an unordered set of name/value pairs. An object begins with { (left brace) and ends with } (right brace). Each name is followed by : (colon) and the name/value pairs are separated by , (comma).

An array is an ordered collection of values. An array begins with [ (left bracket) and ends with ] (right bracket). Values are separated by , (comma).

A value can be a string in double quotes, or a number, or true or false or null, or an object or an array. These structures can be nested.

A string is a collection of zero or more Unicode characters, wrapped in double quotes, using backslash escapes.

A character is represented as a single character string.
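Putting those rules together, here's a small sample (the data is invented) along with the classic eval trick; modern script would use JSON.parse, but eval is how pages consumed JSON at the time:

```typescript
// A JSON string exercising the forms above: strings, a number, a boolean,
// a nested object, and an array.
const json =
  '{"name":"Howard","age":42,"active":true,' +
  '"address":{"city":"Orlando"},"tags":["a","b"]}';

// The classic trick: wrap the text in parentheses so eval treats it as an
// expression rather than a block, yielding a live object graph.
const obj = eval("(" + json + ")");

// JSON.parse(json) produces the same object graph without executing code,
// which is why it replaced eval once browsers supported it.
```

After the eval, obj.address.city is just an ordinary property you can read in script and push into the DOM.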

So there you have it. Some history, a little clarity (if that's actually possible with all these buzzwords and acronyms) and a bit of State, and JSON came along for the ride. If you want to find some good custom resources on JSON, put it in the custom search widget just below and take a look at the results.


IE7 - Vista: "Internet Explorer has stopped Working"

This one was a bitch. All of a sudden, for no reason at all, out of the blue, I get this dialog: "Internet Explorer has stopped working". I didn't install any new software - crap, I didn't do anything!

So here I am using FIREFART to go on the internet and find out what to do! Damn! Good thing I've got the sucker on the same machine (I'm not "anti-Firefox", I just don't use it that much except to check my page renderings).

The Fix

The fix (at least for me):

1) Go into Control Panel, and choose "Internet Options".
2) Under the "Advanced" tab, press the "Reset" button at the lower right.

Don't ask me why this happens, or why the fix works. That's a BUG, D00D - I don't care how you slice it!


Boston Mooninite Hoax Device for Sale on EBay

Christopher Budnick, a third-year student at Northeastern University and a member of the Harvard and Northeastern Free Culture student activist group, managed to retrieve one of the recent Turner Broadcasting Cartoon Network promo devices about 2 hours after the public bomb scare -- well ahead of the ATF.

Mooninite Photo from Boston Scare

Somebody on Wired News got a hold of him, and the following snippet illustrates:

"What do you think of the entire panic these caused? Do you think Beredovsky should be held criminally responsible?"

"I think it's pretty indicative of the mad culture we live within right now. If we were even lucky enough to be targeted by a mad bomber with the foresight to clearly illuminate, mark, and make public his explosives, homeland security still would have only found 10 of them within the first hour."

--The guy wants to help fund a local artist alliance with part of the proceeds, after paying off his student loans. Nice kid, and smart too.

We live in an altered state after 9-11. Schmucks like Turner Broadcasting need to step back and think "WHAT DOES THIS AFFECT" first. That's what we as programmers do.

It's cool to have neat promotional campaigns to bring attention to your TV show and/or movie. It's not cool to say things like "Well, we've had it going for 3 weeks in other cities", as if to imply that there is something wrong with the City of Boston, Massachusetts. If you want to be a good corporate citizen, you gotta stop acting like a fyookin' 14-year-old, even though that may be your primary audience.


Interfaces, Abstract Classes, and Inversion of Dependency

In Framework Design Guidelines (Cwalina and Abrams, Addison-Wesley), the authors have a section in Chapter Four entitled "Choosing Between Class and Interface" that is very revealing about the "behind the scenes" goings on during the development of the .NET Framework.

They say that, in general, classes are the preferred construct for exposing abstractions, the logic being that once you ship an interface, its set of members is baked in forever - any addition would break existing types that implement the interface.

Classes are much more flexible - you can add members to classes that have already shipped, and as long as the new method has a default implementation, existing derived classes continue to function undisturbed.

They provide an example (although not a very good one) of how difficult it would be to add timeout support for streams. All of the options have substantial development cost and usability issues.
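To make the tradeoff concrete, here's a hedged sketch in TypeScript (standing in for the book's C#; the type and member names are invented, loosely echoing the stream-timeout example):

```typescript
// A shipped interface: adding a new member here (say, setTimeout(ms: number))
// would instantly break every existing type that implements it.
interface IReader {
  read(): string;
}

// A shipped base class: a member added later, with a default implementation,
// leaves existing derived classes compiling and running untouched.
abstract class ReaderBase implements IReader {
  abstract read(): string;

  // Added in "version 2" - derived classes written against version 1
  // simply inherit this default and keep working.
  canTimeout(): boolean {
    return false;
  }
}

// Written against "version 1" of ReaderBase; unaffected by the addition.
class TextFileReader extends ReaderBase {
  read(): string {
    return "file contents";
  }
}
```

The interface contract is frozen at ship time; the class contract can grow, which is the crux of the guideline.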

It seems that one of the primary arguments for interfaces is that they allow separating contract from implementation. However, this incorrectly assumes that you can't separate contracts from implementation using classes. Abstract classes residing in a separate assembly from their concrete implementations are an excellent way to achieve this kind of separation.

Bottom line: "DO" favor defining classes over interfaces. Of course, this does not address the scenario of dynamically loading an assembly containing a class that implements a specified interface, casting the object to that I<Whatever> interface, and making method calls on it.

Inversion of Dependency

The Inversion of Dependency concept comes from work by Robert Martin.

First he defines "Bad Design" before getting to his concept. To Martin, "Bad Design" means the following three traits:

1. It is hard to change because every change affects too many other parts of the system. (Rigidity)

2. When you make a change, unexpected parts of the system break. (Fragility)

3. It is hard to reuse in another application because it cannot be disentangled from the current application. (Immobility)

Martin states that it would be difficult to demonstrate that a piece of software exhibiting none of those traits (i.e., one that is flexible, robust, and reusable, and that also fulfills all its requirements) has a bad design. Thus, we can use these three traits as a way to unambiguously decide whether a design is "good" or "bad".

Following this, Martin explains that it is the interdependence of the modules within that design that is the root cause of "bad design".

He illustrates with an example of a "Copy" Program that has two subprograms, "Read Keyboard" and "Write Printer" and shows how the Copy module is not reusable in any situation that does not involve a keyboard and a printer.

This leads to Martin's "Dependency Inversion Principle":

A. High-level modules should not depend upon low-level modules. Both should depend upon abstractions.

B. Abstractions should not depend upon details. Details should depend upon abstractions.
This leads us to the abstraction interfaces:


    READER (abstract)        WRITER (abstract)
           ^                        ^
    Keyboard Reader          Printer Writer

This arrangement allows us to plug ANY kind of Reader into the abstract READER interface, along with ANY kind of Writer into the WRITER interface, and achieve the kind of reusability we want from software. The Copy class depends upon neither the Keyboard Reader nor the Printer Writer at all. Thus the dependencies have been inverted: the Copy class depends upon abstractions, and the detailed readers and writers depend upon those same abstractions. Hence, "dependency inversion".
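A minimal sketch of Martin's Copy example with the dependencies inverted (in TypeScript rather than Martin's original language, and with a string-backed reader and buffer-backed writer standing in for the keyboard and printer):

```typescript
// The abstractions the high-level Copy routine depends on.
interface Reader {
  read(): string | null; // null signals end of input
}
interface Writer {
  write(ch: string): void;
}

// Copy knows nothing about keyboards or printers - only the abstractions.
function copy(reader: Reader, writer: Writer): void {
  let ch: string | null;
  while ((ch = reader.read()) !== null) {
    writer.write(ch);
  }
}

// Details depend on the abstractions, not the other way around.
class StringReader implements Reader {
  private i = 0;
  constructor(private readonly text: string) {}
  read(): string | null {
    return this.i < this.text.length ? this.text.charAt(this.i++) : null;
  }
}

class BufferWriter implements Writer {
  readonly chars: string[] = [];
  write(ch: string): void {
    this.chars.push(ch);
  }
}
```

To reuse copy with a socket, a file, or anything else, you write a new Reader or Writer; the Copy routine itself never changes.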

This all came about because I'm on a new job, where there happen to be people who are at least as smart as I am, and probably smarter. How refreshing!