Sunday, August 31, 2008

Introverted Programming: The Buzz-Oriented Architecture

It has been a while since the last article in my beloved Introverted Programming series. I consider that inappropriate, so here we go.

What is with all these new and newer approaches, the Service Oriented Architecture, Resource Oriented Architecture, Web Oriented Architecture, File Oriented Architecture, Thick Clients and Thin Clients, RESTful, SOAPless and Quo-Vadis Services? XP, CI, TDD, WPF, WCF, WTF and OMG technologies? Their successes and their failures?

What about a Get-Job-Done Oriented Architecture? I mean - one where the project was actually delivered, the developers got paid and the client managed not to lose his business as a result?

I bet that would make a good line in a resume...

Awesome video

No common sense allowed

The summer is over, but not quite! You can sense that when the Agilists' blogs fill up with fundamental ideas not yet diluted by details - which is inevitable once you get back down to Earth.

It is amazing how resistant the software industry is to common sense. Project after project fails, and still people start off on the same wrong foot in the same wrong direction, hoping that this time luck will finally turn. Technical debt and the understanding of software investment as an asset are the best concepts for highlighting this misconception.

Would you consider time, rather than lasting quality, the primary factor in house repairs? Unlikely, especially if it is your primary residence. Yet for software, "just get things done" is the rule rather than the exception.

Would you borrow money without knowing what the conditions are, only to find out that the interest is 100%, payable hourly? But it is considered OK to take design and coding shortcuts, thus increasing technical debt.

Would you advise a surgeon to save time by not washing his hands before an operation? As a good professional, he would most likely refuse, even if given an order. But a developer is expected to cut "luxurious" practices, such as unit testing and refactoring, when times are tight.

The construction industry has eventually learnt, at great cost, that there are some practices that are better obeyed than skipped. The software industry is still too young to have come to a similar realization. A collapsed building makes the news, while a collapsed software project is easier to sweep under the carpet. Of course, the culprits will be found and fired. And will lead another project somewhere else.

Friday, August 29, 2008

Encrypting sections of the Web.config file - the Continuous Integration way

The following algorithm is not the easiest way to protect your web.config (unlike this solution). It has a few advantages, though. First, we can replicate an RSA key container between multiple servers in a Web farm, thus providing a scalable solution. Second, we can automate encryption and container distribution with Continuous Integration. Developers can enjoy a readable web.config file while the CI script takes care of the encrypted production version.

1. Add the following section to the root of the <configuration> node:

<configProtectedData defaultProvider="MyRsaProvider">
  <clear />
  <add name="MyRsaProvider"
       type="System.Configuration.RsaProtectedConfigurationProvider, System.Configuration, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL"
       keyContainerName="MySiteKeys" useMachineContainer="true" />
</configProtectedData>
IntelliSense or ReSharper may complain about the keyContainerName and useMachineContainer attributes - don't believe it, they are just confused.

2. Now we can create the RSA container and make it exportable:

aspnet_regiis -pc "MySiteKeys" -exp
3. The next step is to encrypt the selected web.config sections:
aspnet_regiis -pef "connectionStrings" "c:\MySite\WebUI" -prov "MyRsaProvider"

In the same way you can encrypt system.web/membership, appSettings and any custom configuration section.

Make sure that string attribute values in the encrypted sections (including configProtectedData) are not split across several lines. Otherwise the aspnet_regiis tool will try to reproduce the carriage return symbols and encode them. This does not affect the application execution in any way, but it looks weird and confusing.
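Since the point of this exercise is to let CI produce the encrypted production version, the encryption step can be wrapped into a NAnt task along these lines (a sketch; the target name and the ${build.dir} property are my own inventions):

```xml
<target name="encrypt-config">
  <!-- Encrypt the connectionStrings section of the deployable site copy -->
  <exec program="aspnet_regiis">
    <arg line='-pef "connectionStrings" "${build.dir}\WebUI" -prov "MyRsaProvider"' />
  </exec>
</target>
```

Developers keep working against the readable web.config in source control; only the packaged copy gets encrypted.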

4. Save the key to an XML file in order to export it from your local PC to the Web server (UAT or Production):
aspnet_regiis -px "MySiteKeys" "c:\mykeyfile.xml" -pri

The -pri option forces both the private and the public key to be exported, thus enabling both encryption and decryption.

5. Now you have the encrypted web.config file and the key XML file. The first part - the deployable package creation - is done, and your application is ready to be deployed to the web servers.

For the next two steps - deployment and permission settings - the Continuous Integration script should be executed under an account with admin privileges.

Import the key container on each Web server:

aspnet_regiis -pi "MySiteKeys" "c:\mykeyfile.xml"
To make the Continuous Integration script more defensive, you can clean the container first. The task attempts to remove the key container and fails if the container wasn't found, so it should be made fail-tolerant (failonerror="false" in a NAnt script).
aspnet_regiis -pz "MySiteKeys"
Be careful not to run this task locally on the container source machine :).
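Put together, the defensive cleanup and the import might look like this in NAnt (a sketch; the target name is invented):

```xml
<target name="import-keys">
  <!-- Remove a stale container first; failonerror keeps the build alive if it isn't there -->
  <exec program="aspnet_regiis" failonerror="false">
    <arg line='-pz "MySiteKeys"' />
  </exec>
  <!-- Import the exported key pair on this server -->
  <exec program="aspnet_regiis">
    <arg line='-pi "MySiteKeys" "c:\mykeyfile.xml"' />
  </exec>
</target>
```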

6. Grant access to the key container to the ASP.NET account:

aspnet_regiis -pa "MySiteKeys" "DOMAIN\USER"

If, like me, you are never sure what identity the ASP.NET process runs under, test it by adding this code to any of your ASP.NET pages:
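A one-liner along these lines will do (a sketch; any ASPX page works):

```aspx
<%-- Prints the Windows identity the worker process runs under --%>
<%= System.Security.Principal.WindowsIdentity.GetCurrent().Name %>
```

On IIS 6 this typically shows something like NT AUTHORITY\NETWORK SERVICE, which is the account to pass to aspnet_regiis -pa.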


Sunday, August 24, 2008

Building a Web 2.0 Portal with ASP.Net 3.5

That's a lot of numbers, so it may give you the impression that this is another very specific story about the hottest technologies of today. Well, it is. You can pick up the most modern buzzwords, put them in your resume and sound profound and sophisticated in job interviews.

There is more, though. The clever code reveals the depths of traditional ASP.NET, which you were supposed to have known for at least the last couple of years to consider yourself a good ASP.NET developer. So, naturally, I found some stuff that was new to me.

And what is even better - the book guides you through building the framework step by step, explaining every decision, tradeoff and the implications of each choice made or rejected. It is a rare and thorough overview of an evolving architecture in progress. Invaluable, if you ask me, especially if you are deprived of the privilege of observing truly good architects in their natural habitat - a live project.

For a first glance, check the live example - it is awesome.

A perceived complexity

I heard this question during one of my interviews quite a while ago: "How do you handle high-demand database requests when you need to perform five or six joins on data tables?" ("data" is emphasized because user data tables, not lookup tables, were in scope - I clarified this specifically). The question itself is not important (I guess "that's stupid" was a good enough answer). The join details are not important either. What was important is that the question assumed the necessity of an unneeded complexity.

Honestly, it was the first time I gave it a thought. Look at the formalization of your applications - how many relations require a more sophisticated approach than "one-to-one" or "one-to-many"? To your surprise, you may find that it happens rarely, if ever. In most of the cases where I thought such complexity existed, I was proven wrong. And you know why? Because the object model of any intricacy comes down to the user-facing interface, and humans are notoriously unable to tackle the overcomplications you plan to unleash on them. That particular interview question was about building a robust forum portal, and the functionality looked overwhelming - all those sub-portals and forums with discussions, posts, comments, cross-links and other stuff. But any meaningful user function drilled down to the same parent-children routine: a forum with discussions, a discussion with posts, a post with comments and so on. The user doesn't care about the surrounding complexity while dealing with a convenient and comprehensible "one-to-many" entity.
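To illustrate, the "overwhelming" forum model above collapses into a chain of plain one-to-many relations (a sketch; table and column names are invented):

```sql
-- Each table references only its parent: no five-way joins required
CREATE TABLE Forum      (ForumId      INT PRIMARY KEY, Title NVARCHAR(200));
CREATE TABLE Discussion (DiscussionId INT PRIMARY KEY, ForumId INT REFERENCES Forum(ForumId), Title NVARCHAR(200));
CREATE TABLE Post       (PostId       INT PRIMARY KEY, DiscussionId INT REFERENCES Discussion(DiscussionId), Body NVARCHAR(MAX));
CREATE TABLE Comment    (CommentId    INT PRIMARY KEY, PostId INT REFERENCES Post(PostId), Body NVARCHAR(MAX));

-- Any user-facing screen needs just one parent-children join at a time
SELECT p.PostId, p.Body
FROM Post p
WHERE p.DiscussionId = 42;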

A higher level of complexity can even serve as an alarm bell - an indication that business rules are handled in the database. In that case, big joins are not your biggest problem.

Friday, August 22, 2008

Managing Continuous Database Integration with Red Gate tools. Part III. "A New Hope"

Yesterday I spilled some frustration (legitimate or not) over Red Gate licensing policies. It appears that whatever flaws the company has in this area, it compensates for them with a customer-oriented approach which, unspoiled as I am by Microsoft & Co, I find unprecedented.

Despite me calling them names, they took the effort to patiently explain their position and ways. Faced with such gallantry, I have no other choice but to apologize for my emotional thrusts and to admit that Red Gate is not "another greedy ignorant Apple Corporation software company". That's a big relief.

Similarly, the .NET Reflector future looks brighter now. The question, though, still remains - will it become "Red Gate Lutz Roeder's .NET Reflector" or stay "Lutz Roeder's Red Gate .NET Reflector"?

Thursday, August 21, 2008

Red Gate bought .NET Reflector

When I read the InfoQ article, my first reaction was panic. I have had experience with Red Gate which cannot be considered utterly positive. The parties' comments are mutually optimistic, but I still have a sinking feeling in my stomach.

I wouldn't be surprised at all if the "free community edition" evolves into a "free community 30-day trial". You will have to buy the Pro version, which will allow you to add new libraries for inspection, a .NET Reflector Toolbelt license for the disassembly functionality, and an Ultimate edition license if you actually want to translate the disassembled code from German to C#.

Managing Continuous Database Integration with Red Gate. Part II. "The Achievement"

When you are done bitching, the rest is easy. It is confirmed that you need nothing but the SQL Comparison SDK (currently in version 7). The bundled help projects are pretty good and offer valuable guidance. You will be using the SQL Compare API to generate the structure update script and SQL Data Compare for the data upgrade script (i.e. for lookup tables).

Another option is to purchase the whole SQL Toolbelt. Version 7.0 includes SQL Compare and SQL Data Compare Professional, which are necessary to run these tools from a command line. This approach will cost you a dear $1600, and your options are pretty much limited to batch scripting, so I decided to stick with the API.

You have to download the SQL Toolbelt and install SQL Compare, SQL Data Compare and the SQL Comparison SDK. You have to pay for the SDK license only (I guess this means that you won't be able to use the Toolbelt UI).

The project has to reference the RedGate.Shared.SQL, RedGate.Shared.Utils and RedGate.SQLCompare.Engine libraries, while the RedGate.Licensing.Client, RedGate.SQLCompare.ASTParser and RedGate.SQLCompare.Rewriter DLLs should reside side by side with those listed above in order for the project to compile. My test solution uses only SQL Compare, so I guess a couple more will be required for SQL Data Compare.

At some point I created a license.licx file, which contains one line:

RedGate.SQLCompare.Engine.Database, RedGate.SQLCompare.Engine
but I have migrated the solution from version 6.0 to 7.0 since then, and I am not sure if it is still required.

This is the core code:
private static string folder;   // script output folder
private static string server1;  // DEV server
private static string server2;  // BAT server
private static string db1;      // DEV database
private static string db2;      // BAT database
private static string file;     // script file name
private static bool waitForInput = true;
private static bool verbose = true;

static void Main(string[] args)
{
    if (!ResolveParameters(args)) return;

    try
    {
        using (Database widgetDEV = new Database(), widgetBAT = new Database())
        {
            var options = Options.Default;
            widgetDEV.Register(new ConnectionProperties(server1, db1), options);
            widgetBAT.Register(new ConnectionProperties(server2, db2), options);

            var dev_bat = GetDifferences(widgetDEV, widgetBAT);

            if (verbose)
                Write(null, dev_bat,
                      item => string.Format("{0} {1} {2}", ((Difference) item).Type,
                              ((Difference) item).DatabaseObjectType, ((Difference) item).Name));

            var script = GenerateScript(dev_bat);
            if (verbose) Write("Script length:", script);
            if (waitForInput)
                Console.WriteLine("Press [Enter]");
        }
    }
    catch (ApplicationException ex)
    {
        Console.WriteLine("ERROR OCCURRED: " + ex.Message);
    }
}

static string GenerateScript(Differences dev_bat)
{
    // Calculate the work to do using sensible default options
    var work = new Work();
    work.BuildFromDifferences(dev_bat, Options.Default, true);
    if (verbose) Write("Messages:", work.Messages, item => ((Message) item).Text);
    if (work.Warnings.Count > 0 && verbose)
        Write("Warnings:", work.Warnings, item => ((Message) item).Text);
    return (work.ExecutionBlock != null) ? work.ExecutionBlock.GetString() : null;
}

private static Differences GetDifferences(Database widgetDEV, Database widgetBAT)
{
    // Select every difference so it makes it into the generated script
    var dev_bat = widgetDEV.CompareWith(widgetBAT, Options.Default);
    foreach (Difference difference in dev_bat) difference.Selected = true;
    return dev_bat;
}
The omitted methods are simple but tedious argument extractors, file writers and loggers. The application compares a development database DEV (which has had some changes) with the testing/production database BAT and generates a one-way upgrade script for the latter. I have not yet found out how to include objects that were removed from the DEV database, or whether this feature is even required - this kind of functionality presents some potential danger.

The NAnt script calls the provided EXE with the following task:

<exec basedir="${CCNetWorkingDirectory}/Binaries/UpgradeGenerator"
      program="UpgradeGenerator.exe" timeout="10000">
  <arg line="s1=dbserverDEV s2=dvserverBAT
             db1=testdb_dev db2=testdb_bat f=update-${CCNetLabel}.sql"/>
</exec>
The rest is obvious. Once again, the sample projects provide great insight into the functionality.

There are a few things to consider, though.

My original intent was to keep the solution as source code, then check it out and compile it during the build. This would require installing the SQL Comparison SDK license on the build server; otherwise the tool tries to fire a licensing popup and effectively kills the build. So I ended up placing a compiled executable in Subversion, in the Binaries folder, which effectively made it a part of every tracked project.

Another mystery, which still remains unresolved, is the unexpected behavior of the application, which stays in memory after execution. As a result, the CC.NET worker thread hangs indefinitely and your build will show "Building" until you manually kill the generator process. As a workaround, I limited the timeout for the <exec> task (refer to the code above) and call the task with the failonerror="false" attribute. This is a dirty trick, but it does its job at this stage.

In conclusion, I note that while more sophisticated approaches are available, including writing your own NAnt task, even this simple approach works just fine.

Managing Continuous Database Integration with Red Gate. Part I. "The Bitching"

Red Gate tools will do whatever you can imagine you would need while developing, deploying and maintaining database-backed software. The products are great, but unfortunately the marketing types have clearly taken over the company.

I have never seen such ridiculously confusing pricing and licensing policies - we've spent a few weeks in negotiations and it is still unclear how what they sell maps to what we actually need. And why do we have to pay an additional $2500 to use the command line of a package for which we've already paid $1000? "Red Gate market segmentation" translates from Redgate Marketuanian as "marketing department job security".

Even if you made it through an extortion a purchasing transaction, the frustration is far from over. The installation application is a wreck of functionality. The site tells you that for our task we need the SQL Comparison SDK, which is priced and marketed as a stand-alone application. But it is not available as a separate download - it is part of the SQL Toolbelt, which is priced much higher. And if you download the Toolbelt and install only the SDK, you end up with a single "Getting started" HTML page and no DLLs. Duh!

Things that Every Software Architect Should Know

There is always something around that you must know to become (or to stay) a good professional. How can you find out what it is? Follow a trusted advisor. I just love Scott Hanselman's "Weekly Source Code" crusade, though I am rarely able to follow 100% of the implied beauty - but that's the whole reason why I like it.

Imagine my anxiety when the RSS reader brought in the "97 Things Every Software Architect Should Know" article. Holy @$%#, 97! What a fount of wisdom it must be! Surprisingly, the article didn't contain the 97 Things. Not even the original 10 Things. It was merely an announcement of the upcoming "Beautiful Code Or Whatever" book from O'Reilly. I assume the book will be a product of collective email-list wisdom, much like Spolsky's "Best Software Writing". It seems to be an accepted practice, so I have nothing against that. But if you weren't an original email-list subscriber, too bad. You can't just get those 97 Things - they're proprietary now. Buy the book.

P.S. All right, all right, it's an exaggeration. Some of the Axioms are available, though quite a few of them are questionable, and I wouldn't buy a book they make it into.

Tuesday, August 12, 2008

Beware the automated unit testing

As one of the core agile practices, unit testing can single-handedly become a deal breaker when selling Agile to a company. A test harness is a visible investment, while testing as a ubiquitous process is intangible, and the connection between the price and the benefit is not obvious.

If you apply the notions of Return on Investment and Total Cost of Ownership to the realities of software development, the emphasis will be on maintenance, quality and other "ephemeral" concepts. Organizations often pay attention to the Software Development Lifecycle, which doesn't take into account long-term maintenance, while the latter amounts to up to 90% of the total project costs. This, and the frequent misinterpretation of the PASS MADE principles, lures dilettantes into perceiving maintainability as a second-rate goal. "Saving" on testing and quality assurance in pursuit of short-term goals, organizations actually undermine quality, but they either do not understand that or see it as a cunning scheme to keep shaking the customer for more money.

The unit test harness (let alone TDD) plays a crucial role in keeping an application afloat during development and for a long time after it is done.

That said, I still think that management should be very careful when giving a green light to TDD and unit testing. The "light weight" of an Agile process does not mean "light control"; rather, it assumes that rigid discipline and self-control become self-replicating daily practices. If you made a commitment to the practice, you'd better stick to it. Nothing is worse than investing time and effort in test coverage and then abandoning it after the project is done. No doubt the tests have already helped along the way - minimizing integration and regression testing time and indirectly promoting better design. But by letting the harness rot in oblivion, the organization deprives itself of reaping the long-lasting benefits, and also gives itself a false sense of security from having actually implemented some of the practices (and will blame them for the failure afterwards).

So don't get into that relationship if you are not sure that you will stay committed.

Dos Tequilas por favor!

That's all I can say after a pretty wearisome flight...


- They are not going to host "Agile Varadero" any time soon.

- "Che" T-shirts are sold to the filthy rich imperialists only - I find the price of $19 quite excessive.


- Everything else that has an "ocean" or "sun" in it.

© 2008-2013 Michael Goldobin. All rights reserved