Monday, November 03, 2008

Job requirements

Maybe it is an upcoming fashion in the current economic situation, but recently I've seen the following skills required for a Senior Technical Lead position:

"Concentration required daily using sight, hearing and touch to operate a computer, respond to problems ….”

So I'd better start training my hearing, I thought, update my glasses prescription and practice touching (hopefully just for coding purposes). At first it seemed a disappointing omission that they didn't mention any requirements for the incumbent's butt, but I misunderestimated them:

“Sits at a desk for long period of time to perform the duties of the job."

Brilliant! The only question is what category those skills belong to. They are not quite technical and not exactly soft…

Thursday, October 30, 2008

Password text boxes and lean custom controls

A great article from Keith Brown about improving the original ASP.NET password field behaviour.

It is also a good hands-on example of building elegant custom controls and using control state - at least compared to my own, much messier attempts in this area.
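As a refresher, here is a minimal sketch (my own toy control, not Keith Brown's) of the control-state plumbing such a control relies on - the stashed value survives postbacks even when view state is disabled:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// Hypothetical control, for illustration only.
public class StickyTextBox : TextBox
{
    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        // Opt in to control state; unlike view state, the page cannot switch it off.
        Page.RegisterRequiresControlState(this);
    }

    protected override object SaveControlState()
    {
        // Pair our own piece of data with whatever the base control saves.
        return new Pair(base.SaveControlState(), Text);
    }

    protected override void LoadControlState(object savedState)
    {
        Pair pair = (Pair)savedState;
        base.LoadControlState(pair.First);
        Text = (string)pair.Second;
    }
}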

Wednesday, October 29, 2008

Bridging the gap

Agile interaction between a client and a development team is always about trust. It is easier to achieve with internal clients - they may have an already established relationship with the team, or at least they can be influenced. Bringing aboard an external client usually means that at some point the Agile process has to be translated into whatever the client is used to. Some unlucky Product/Project Manager has to act as a filter, accepting the Agile feed from the team and producing some kind of acceptable output for the client, and vice versa. When looking for vendors, companies rarely have in mind anything different from "fixed price for fixed features in fixed time". The software industry doesn't really have well-established mechanisms for gathering and sharing success and failure statistics, so it is widely believed that such contracts minimize risks and uncertainty. It seems to provide common ground for a bidding contest, but in reality vendors are forced to compete in wishful thinking with their fingers crossed.

But it looks like something is happening in the twilight zone. The Agile Contracts initiative aims to continue Alistair Cockburn's investigation of agile-tolerant vendor agreements. This hard-to-get experience is going to be summarized, and hopefully the community will get a solution acceptable both to Agile teams and to clients, who, for some weird but understandable reason, want work done for their money.

Thursday, October 23, 2008

Worktime music

Just imagine that pushy project manager or that ignorant colleague of yours performing the song or doing the dance, and you will be smiling instead of cursing.


The bearded guy probably served as an inspiration for Matt The Dancer.

Thursday, September 11, 2008

CU L8R

Apparently SOA goes WOA in JEE/.NET through JSR311 & WCF with REST TBA while SOAP MIA.

Good news, IMHO. Oh, BTW FYI.

LOL, MG

Tuesday, September 02, 2008

Configurable deployment of deployable configuration. Part II

Mark Needham (of ThoughtWorks) has a nice post about configuring builds for different environments. It is a very clean and elegant approach and a great addition to my collection!

Update: and another two about per-user configuration and overriding properties.

Monday, September 01, 2008

The Agile Overdose

Do you follow the Agile Movement? Do you like some of the recent trends?

The problem, I sense, arises from the obvious fact that more and more dedicated Agilists have moved from the actual development trenches to coaching and high-level consulting. The agile coach gets perceived as a spiritual leader, much like a Commissar in the Soviet Army. As such he has no assigned deliverables and does not code or design architecture, so he uses his idle time to take agile theories off the ground and "develop" them out of proportion. Inevitably they progress from the more or less fair "Design Up Front is Harmful" to the derailed "Requirements are Harmful" and "Don't Let Clients Influence the Project" hallucinations. Pass me that butt, dude! I saw the "Estimations are Harmful" one and I'm going on a trip to "Code is Harmful" country!

[Note] If you think that I am exaggerating, then search Google yourself. I had all the "practices" linked, but decided to be polite and not point a finger.

Maybe I am mistaken and those guys are sober. Then, I guess, it's all about money. The community has said whatever it could, the news is not so new anymore and a lot of companies that wanted an Agile process implemented have already done so. Now the army of agilista coaches has to compete for clients by inventing one buzzy theory after another. Personally, I would like to keep my faith in humanity and believe in the mushroom version.

P.S. BTW, the same thing is happening in the blogging community. Quite a few reputable IT bloggers have recently "outgrown" simple development. Haven't you noticed that too?

Sunday, August 31, 2008

Introverted Programming: The Buzz-Oriented Architecture

It has been a while since the last article of my beloved Introverted Programming series. I consider that inappropriate, so here we go.

What is with all these new and newer approaches, the Service Oriented Architecture, Resource Oriented Architecture, Web Oriented Architecture, File Oriented Architecture, Thick Clients and Thin Clients, RESTful, SOAPless and Quo-Vadis Services? XP, CI, TDD, WPF, WCF, WTF and OMG technologies? Their successes and their failures?

What about a Get-The-Job-Done Oriented Architecture? I mean one where the project is actually delivered, the developers get paid and the client manages not to lose his business as a result?

I bet that would make a good line in a resume...

Awesome video

No common sense allowed

The summer is over, but not quite! You can get a sense of that when Agilistas' blogs are filling up with fundamental ideas not yet diluted by details, which is inevitable once you get back down to Earth.

It is amazing how resistant the software industry is to common sense. Project after project fails, and still people start off on the same wrong foot in the same wrong direction, hoping that this time luck will finally turn. Technical debt and treating software investment as an asset are the best concepts for highlighting this misconception.

Would you consider time, rather than lasting quality, as the primary factor for house repairs? Unlikely, especially if it is your primary residence. But for software, "just get things done" is the rule rather than the exception.

Would you borrow money without knowing the conditions, only to find out that the interest is 100%, payable hourly? But it is OK to take design and coding shortcuts, thus increasing technical debt.

Would you advise a surgeon to save time by not washing his hands before an operation? As a good professional, he will most likely refuse, even if ordered. But a developer is expected to cut "luxurious" practices, such as unit testing and refactoring, when times are tight.

The construction industry has eventually learned, at great cost, that there are some practices better obeyed than skipped. The software industry is still too young to have come to a similar realization. A collapsed building makes the news, while a collapsed software project is easier to sweep under the carpet. Of course, the culprits will be found and fired. And will lead another project somewhere else.

Friday, August 29, 2008

Encrypting sections of the Web.config file - the Continuous Integration way

The following algorithm is not the easiest way to protect your web.config (unlike this solution). It has a few advantages, though. First, we can replicate an RSA key container between multiple servers in a Web farm, thus providing a scalable solution. Second, we can automate encryption and container distribution with Continuous Integration. Developers can enjoy a readable web.config file while the CI script takes care of the encrypted production version.

1. Add the following section to the root of the <configuration> node:

<configProtectedData defaultProvider="MyRsaProvider">
<providers>
<clear />
<add name="MyRsaProvider"
type="System.Configuration.RsaProtectedConfigurationProvider, System.Configuration, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL"
keyContainerName="MySiteKeys" useMachineContainer="true" />
</providers>
</configProtectedData>
Intellisense or ReSharper may complain about the keyContainerName and useMachineContainer attributes - don't believe them, they are just confused.

2. Now we can create the RSA container and make it exportable:

aspnet_regiis -pc "MySiteKeys" -exp
3. The next step is to encrypt the selected web.config sections:
aspnet_regiis -pef "connectionStrings" "c:\MySite\WebUI" -prov "MyRsaProvider"

In the same way you can encrypt connectionStrings, system.web/membership, appSettings and any custom configuration section.

Make sure that string attribute values in the sections to be encrypted (including configProtectedData) are not split across several lines. The aspnet_regiis tool will try to reproduce the carriage return symbols and encode them.


This does not affect the application execution in any way, but it looks weird and confusing.

4. Save the above key to an XML file in order to export it from your local PC to the web server (UAT or Production):
aspnet_regiis -px "MySiteKeys" "c:\mykeyfile.xml" -pri

The -pri option forces both the private and public keys to be exported, thus enabling both encryption and decryption.

5. Now you have the encrypted web.config file and the key XML file. The first part - the deployable package creation - is done and your application is ready to be deployed to the web servers.

For the next two steps - deployment and permission settings - the Continuous Integration script should be executed under an account with admin privileges.

Import the key container on the web servers:

aspnet_regiis -pi "MySiteKeys" "c:\mykeyfile.xml"
To make the Continuous Integration script more defensive, you can clean the container first. The task will attempt to remove the key container, failing if the container wasn't found, so it should be made fault-tolerant (failonerror="false" in a NAnt script).
aspnet_regiis -pz "MySiteKeys"
Be careful not to run this task locally on the container source machine :).

6. Grant the ASP.NET account access to the key container:

aspnet_regiis -pa "MySiteKeys" "DOMAIN\USER"

If, like me, you are never sure what identity the ASP.NET process runs under, test it by adding this code to any of your ASP.NET pages:

Response.Write(System.Security.Principal.WindowsIdentity.GetCurrent().Name)
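To tie steps 5 and 6 into the build, a minimal NAnt sketch might look like this (the container name, key file path and account are just the placeholders used above; aspnet_regiis is assumed to be reachable on the path or via basedir):

<target name="deploy-key-container">
  <!-- Clean up a stale container first; tolerate the failure when none exists. -->
  <exec program="aspnet_regiis" failonerror="false">
    <arg line='-pz "MySiteKeys"' />
  </exec>
  <!-- Import the exported key pair on the target web server. -->
  <exec program="aspnet_regiis">
    <arg line='-pi "MySiteKeys" "c:\mykeyfile.xml"' />
  </exec>
  <!-- Grant the ASP.NET process identity access to the container. -->
  <exec program="aspnet_regiis">
    <arg line='-pa "MySiteKeys" "DOMAIN\USER"' />
  </exec>
</target>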

Sunday, August 24, 2008

Building a Web 2.0 Portal with ASP.Net 3.5

That's a lot of numbers, so it may give you the impression that this is another very specific story about the hottest technologies of today. Well, it is. You can pick up the most modern buzzwords, put them in your resume and sound profound and sophisticated in job interviews.

There is more, though. The clever code shows off the depths of traditional ASP.NET, which you were supposed to have known for at least the last couple of years to consider yourself a good ASP.NET developer. So, naturally, I found some stuff that was new to me.

And what is even better - the book guides you through building the framework step by step, explaining every decision, tradeoff and implication of each choice made or rejected. It is a rare and thorough overview of an architecture evolving in progress. Invaluable, if you ask me, especially if you are deprived of the privilege of observing truly good architects in their natural habitat - a live project.

For a first glance, check the live example - it is awesome.

A perceived complexity

I heard this question during one of my interviews quite a while ago: "How do you handle high-demand database requests when you need to perform five or six joins with data tables?" ("data" is emphasized because user data tables, not lookup tables, were in scope - I clarified this specifically). The question itself is not important (I guess "that's stupid" would have been a good enough answer). The join details are not important either. What was important is that the question assumed the necessity of unneeded complexity.

Honestly, it was the first time I gave it a thought. Look at the formalization of your applications - how many relations require a more sophisticated approach than "one-to-one" or "one-to-many"? To your surprise you may find that it happens rarely, if ever. In most of the cases when I thought such complexity existed, I was proven wrong. And you know why? Because an object model of any intricacy eventually comes down to the user-facing interface, and humans are notoriously unable to tackle the overcomplications you plan to unleash on them. That particular interview question was about building a robust forum portal, and the functionality looked overwhelming - all those sub-portals and forums with discussions, posts, comments, cross-links and other stuff. But any meaningful user function drilled down to the same parent-children routine: forum with discussions, discussion with posts, post with comments and so on. The user doesn't care about the surrounding complexity while dealing with a convenient and comprehensible "one-to-many" entity.

A higher level of complexity can even serve as an alarm bell - an indication that business rules are handled in the database. In that case big joins are not your biggest problem.

Friday, August 22, 2008

Managing Continuous Database Integration with Red Gate tools. Part III. "A New Hope"

Yesterday I spilled some frustration (legitimate or not) over Red Gate licensing policies. It appears that whatever flaws exist in that area, the company compensates for them with a customer-oriented approach which I, unspoiled by Microsoft & Co, find unprecedented.

Despite me calling them names, they took the effort to patiently explain their position and ways. Faced with such gallantry I have no choice but to apologize for my emotional thrusts and to admit that Red Gate is not "another greedy ignorant Apple Corporation software company". That's a big relief.

Similarly, the .NET Reflector future looks brighter now. The question, though, still remains - will it become "Red Gate Lutz Roeder's .NET Reflector" or stay "Lutz Roeder's Red Gate .NET Reflector"?

Thursday, August 21, 2008

Red Gate bought .NET Reflector

When I read the InfoQ article my first reaction was panic. I have had experience with Red Gate which cannot be considered entirely positive. The parties' comments are mutually optimistic, but I still have a sinking feeling in my stomach.

I wouldn't be surprised at all if the "free community edition" evolves into a "free community 30-day trial". You will have to buy the Pro version, which will allow you to add new libraries for inspection, a .NET Reflector Toolbelt license for the disassembly functionality and an Ultimate edition license if you actually want to translate the disassembled code from German to C#.

Managing Continuous Database Integration with Red Gate. Part II. "The Achievement"

When you are done with the bitching, the rest is easy. It is confirmed that you need nothing but the SQL Comparison SDK (currently at version 7). The help and sample projects are pretty good and offer valuable guidance. You will be using the SQL Compare API to generate the structure update script and SQL Data Compare for the data upgrade script (e.g. for lookup tables).

Another option is to purchase the whole SQL Toolbelt. Version 7.0 includes SQL Compare and SQL Data Compare Professional, which are necessary to run these tools from a command line. This approach will cost you a dear $1600 and your options are pretty much limited to batch scripting, so I decided to stick with the API.

You have to download the SQL Toolbelt and install SQL Compare, SQL Data Compare and the SQL Comparison SDK. You have to pay for the SDK license only (I guess this means that you won't be able to use the Toolbelt UI).

The project has to reference the RedGate.Shared.SQL, RedGate.Shared.Utils and RedGate.SQLCompare.Engine libraries, while the RedGate.Licensing.Client, RedGate.SQLCompare.ASTParser and RedGate.SQLCompare.Rewriter DLLs should reside side by side with those listed above in order for the project to compile. My test solution uses only SQL Compare, so I guess a couple more will be required for SQL Data Compare.

At some point I created a license.licx file, which contains one line:

RedGate.SQLCompare.Engine.Database, RedGate.SQLCompare.Engine
but I have since migrated the solution from version 6.0 to 7.0 and I am not sure whether it is still required.

This is the core code:
private static string folder; //folder
private static string server1; //DEV Server 
private static string server2; //BAT server
private static string db1; //DEV database
private static string db2; //BAT database
private static string file; //script file name
private static bool waitForInput=true;
private static bool verbose=true;

static void Main(string[] args)
{
try
{
if (!ResolveParameters(args)) return;
EvaluateVariables();

using (Database widgetDEV = new Database(), widgetBAT = new Database())
{
var options = Options.Default;
widgetDEV.Register(new ConnectionProperties(server1, db1), options);
widgetBAT.Register(new ConnectionProperties(server2, db2), options);

var dev_bat = GetDifferences(widgetDEV, widgetBAT);

if (verbose)
Write(null, dev_bat,
item =>
string.Format("{0} {1} {2}", ((Difference) item).Type,
((Difference) item).DatabaseObjectType, ((Difference) item).Name));

var script = GenerateScript(dev_bat);
if (verbose) Write("Script length:", script);
WriteScriptToFile(script);
}
if (waitForInput)
{
Console.WriteLine("Press [Enter]");
Console.ReadLine();
}
}
catch (ApplicationException ex)
{
Console.WriteLine("ERROR OCCURED: " + ex.Message);
}
}

static string GenerateScript(Differences dev_bat)
{
// Calculate the work to do using sensible default options
var work = new Work();
work.BuildFromDifferences(dev_bat, Options.Default, true);
if (verbose) Write("Messages:", work.Messages, item => ((Message) item).Text);
if (work.Warnings.Count>0 && verbose)
Write("Warnings:", work.Warnings, item => ((Message) item).Text);
var script = (work.ExecutionBlock!=null)?work.ExecutionBlock.GetString():null;
return script;
}

private static Differences GetDifferences(Database widgetDEV, Database widgetBAT)
{
var dev_bat = widgetDEV.CompareWith(widgetBAT, Options.Default);
foreach (Difference difference in dev_bat) difference.Selected = true;
return dev_bat;
}
The omitted methods are simple but tedious argument extractors, file writers and loggers. The application compares a development database (DEV), which has had some changes, with a testing/production database (BAT) and generates a one-way upgrade script for the latter. At this point I haven't yet figured out how to include objects which were removed from the DEV database, or whether this feature is even required. This kind of functionality can present some potential danger.

The NAnt script calls the provided EXE with the following task:

<exec basedir="${CCNetWorkingDirectory}/Binaries/UpgradeGenerator"
program="UpgradeGenerator.exe" timeout="10000">
<arg line="s1=dbserverDEV s2=dvserverBAT
db1=testdb_dev db2=testdb_bat f=update-${CCNetLabel}.sql"/>
</exec>
The rest is obvious. Once again, the sample projects provide great insight into the functionality.

There are a few things to consider, though.

My original intent was to keep the solution as source code and check it out and compile it during the build. This would require installing the SQL Comparison SDK license on the server, otherwise it will try to fire a popup and effectively kill the build. So I ended up placing a compiled executable in Subversion, in the Binaries folder, which effectively made it a part of any tracked project.

Another mystery, which still remains unresolved, is the unexpected behavior of the application, which stays in memory after execution. As a result the CC.NET worker thread hangs indefinitely and your build will show "Building" until you manually kill the generator process. As a workaround I limited the timeout for the <exec> task (see the code above) and call the task with the failonerror="false" attribute. This is a dirty trick, but it does the job at this stage.

In conclusion, I note that while more sophisticated approaches are available, including writing your own NAnt task, even this simple approach works just fine.
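For the record, here is a hedged sketch of what that custom NAnt task could look like. The task and attribute names are made up for illustration; only the NAnt.Core plumbing (Task, [TaskName], [TaskAttribute]) is the standard extension API, and the body would call the same SQL Compare code shown above:

using NAnt.Core;
using NAnt.Core.Attributes;

// Hypothetical usage: <generateupgrade server1="..." server2="..." scriptfile="..." />
[TaskName("generateupgrade")]
public class GenerateUpgradeTask : Task
{
    [TaskAttribute("server1", Required = true)]
    public string Server1 { get; set; }

    [TaskAttribute("server2", Required = true)]
    public string Server2 { get; set; }

    [TaskAttribute("scriptfile", Required = true)]
    public string ScriptFile { get; set; }

    protected override void ExecuteTask()
    {
        Log(Level.Info, "Generating upgrade script " + ScriptFile);
        // Register both databases, call CompareWith() and BuildFromDifferences(),
        // then write the ExecutionBlock to ScriptFile - the same flow as in Main() above.
    }
}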

Managing Continuous Database Integration with Red Gate. Part I. "The Bitching"

Red Gate tools will do whatever you can imagine you would need while developing, deploying and maintaining database-backed software. The products are great, but unfortunately the marketing types have clearly taken over the company.

I have never seen such ridiculously confusing pricing and licensing policies - we've spent a few weeks in negotiations and it is still unclear how what they sell maps to what we actually need. And why do we have to pay an additional $2500 to use the command line of a package for which we've already paid $1000? "Red Gate market segmentation" translates from Red Gate Marketuanian as "marketing department job security".

Even once you have gone through an extortion a purchasing transaction, the frustration is far from over. The installation application is a wreck of functionality. The site will tell you that for our task we need the SQL Comparison SDK, which is priced and marketed as a stand-alone application. But it is not available as a separate download - it is part of the SQL Toolbelt, which is priced much higher. If you download the Toolbelt and install only the SDK, you will end up with a single "Getting started" HTML page and no DLLs. Duh!

Things that Every Software Architect Should Know

There is always something out there that you must know to become (or to stay) a good professional. How can you find it? Follow trusted advice. I just love Scott Hanselman's "Weekly Source Code" crusade; I am rarely able to follow 100% of the implied beauty, but that is the whole reason why I like it.

Imagine my anxiety when the RSS reader brought in the "97 Things Every Software Architect Should Know" article. Holy @$%#, 97! What a fount of wisdom it must be! Surprisingly, the article didn't contain 97 Things. Not even the original 10 Things. It was merely the announcement of the upcoming "Beautiful Code Or Whatever" book from O'Reilly. I assume that the book will be a product of collective email-list wisdom, much like Spolsky's "Best Software Writing". It seems to be an accepted practice, so I have nothing against that. But if you weren't an original email list subscriber, too bad. You can't just get those 97 Things - they're proprietary now. Buy the book.

P.S. All right, all right, it's an exaggeration. Some of the Axioms are available, though quite a few of them are questionable and I wouldn't buy a book they make it into.

Tuesday, August 12, 2008

Beware the automated unit testing

As one of the core agile practices, unit testing can single-handedly become a deal breaker when selling Agile to a company. A test harness is a visible investment, while testing as a ubiquitous process is intangible and the connection between the price and the benefit is not obvious.

If you apply the notions of Return on Investment and Total Cost of Ownership to the realities of software development, the emphasis will be on maintenance, quality and other "ephemeral" concepts. Organizations often pay attention to the Software Development Lifecycle, which doesn't take long-term maintenance into account, while the latter amounts to up to 90% of the total project costs. This, and the frequent misinterpretation of the PASS MADE principles, lures dilettantes into perceiving maintainability as a second-rate goal. By "saving" on testing and quality assurance in pursuit of short-term goals, organizations actually undermine quality, but either do not understand this or see it as a cunning scheme to keep shaking the customer for more money.

The unit test harness (let alone TDD) plays a crucial role in keeping the application afloat during development and long after it is done.

That said, I still think that management should be very careful when giving a green light to TDD and unit testing. The "light weight" of an Agile process does not mean "light control", but rather assumes that rigid discipline and self-control become self-replicating daily practices. If you make a commitment to the practice, you'd better stick to it. Nothing is worse than investing time and effort in test coverage and then abandoning it after the project is done. No doubt the tests have already helped along the way - minimizing integration and regression testing time and indirectly promoting better design. But by letting the harness rot in oblivion, the organization deprives itself of reaping the long-lasting benefits, and also gives itself a false sense of security for having implemented some of the practices (and will blame them for the failure afterwards).

So don't get into that relationship if you are not sure that you will stay committed.

Dos Tequilas por favor!

That's all I can say after the pretty wearisome flight...

Disappointments:

- They are not going to host "Agile Varadero" any time soon.

- "Che" T-shirts are sold to the filthy reach imperialists only - I find the price of $19 quite excessive.

Pluses:

- Everything else that has "ocean" or "sun" in it.

Thursday, July 31, 2008

Don't you hate DEMOs?

You've heard about a jolly good presentation which, to your deepest regret, you missed. But luckily a downloadable version is available, so you go there, get it, run it and... Cripes! The page promising the "self-explanatory code which will make it all clear for you" is followed by an empty one with a big fat "DEMO" and nothing else.

90% of Microsoft presentations seem to be like that - it is as if there is a corporate standard being enforced. A presenter can spend two thirds of the time presenting code, yet still not include the code snippets in the presentation itself or at least accompany the online version with the source code.

Please, don't do that.

P.S. The newly adopted screencasts suffer from the same problem, but there it is the joint fault of both presenter and operator. It is even more annoying to stare at the presenter while he is pointing out how "this class is neatly mapped to this tricky instance". Please don't do that either.

Wednesday, July 30, 2008

One more time about a "Triumph of the (group) will"

What is undoubtedly good about ThoughtWorks is that thoughts are definitely at work there.

Patrick Kua just had a post about group thinking vs. the thinking of groups. He has a lovely picture in it.

That was an easy one - Kathy Sierra had a great post back in 2007 about the "Dumbness of Crowds".

So the superiority of collective thinking is not a law of nature. The dullness of averaging is just one side of the problem, though. It is not just that a committee is incapable of producing the next Google - unified thinking can make a team miss the point.

I would argue for "Collective Intelligence" over the "Wisdom of Crowds" - a way of harvesting individual contributions in order to produce something the customer actually wanted.


P.S. And I just couldn't help it: In a team you reach the goal. In Soviet Russia goal reaches you :)

P.P.S. And of course Patrick's intention wasn't to advocate clone thinking, and he points this out in the comments. It's just that the original post allowed too much room for misinterpretation. You can have an opinion, but in Soviet Russia opinion has you!

Tuesday, July 29, 2008

In Soviet Russia...

In Extreme Programming, you continually test your code. In Waterfall, your code continually tests you.

© Tim Lesher & Yakov Smirnoff

Friday, July 25, 2008

Lazy TryParse

This is my second attempt at TryParse with C# 3.0 Extension Methods.

I just noticed that I am copy-pasting a lot of integer and boolean parsing in my current project. Maybe I just don't know any better, but I ended up using TryParse this way:

int intParsed;
value= int.TryParse(reader["Value"].ToString(), out intParsed) ? intParsed : 0;

or a little bit simpler:

int intParsed= int.TryParse(reader["Value"].ToString(), out intParsed) ? intParsed : 0;

I would appreciate it if somebody showed me a simpler solution, but so far it is what it is. I am stuck with .NET 2.0 for now, but if it were 3.5, I'd put together an extension method. It started as a pretty simple extension of the Int32 type, but quickly evolved into a more generic solution, which uses some kind of Duck Typing guessing:

public static class TryParseExtender
{
public static T LazyTryParse<T>(this T instance, object input)
{
Type type = typeof (T);
MemberInfo[] members = type.FindMembers(
MemberTypes.Method,
BindingFlags.Public | BindingFlags.Static,
(objMemberInfo, objSearch) => objMemberInfo.Name.Equals(objSearch.ToString()),
"TryParse"
);
foreach (MemberInfo info in members)
{
try
{
bool boolResult;
object[] paramArray=new[]{input, instance};
object objResult = ((MethodInfo) info).Invoke(instance, paramArray);
if (bool.TryParse(objResult.ToString(), out boolResult)) return (T)paramArray[1];
break;
}
catch {}
}
throw new ApplicationException(type+ " doesn't support TryParse");
}
}
It is not the prettiest code, but it does the job. The return statement assumes that types which expose TryParse can be cast back from the boxed Object to themselves. Unfortunately C# Extension Methods are not as powerful as Ruby's Monkey Patching, and I am already far enough into Ruby to recognize what power (and elegance) I am missing.

Here is the test harness for the extension, which also shows the usage patterns (generic inference makes the code look simpler):

[TestFixture]
public class TryParseExtenderTest
{
[Test]
public void TestIntParsing()
{
string input = "230";
int result = 0;
result = result.LazyTryParse(input);
Assert.AreEqual(230, result);
}

[Test, ExpectedException(typeof(ApplicationException),
ExpectedMessage = "System.String doesn't support TryParse")]
public void TestStringParsing()
{
int input = 230;
string result = "";
result = result.LazyTryParse(input);

Assert.AreEqual("230", result);
}

[Test]
public void TestBooleanParsing()
{
string input = "true";
bool result = true;
result = result.LazyTryParse(input);
Assert.AreEqual(true, result);
}

[Test]
public void TestDateTimeParsing()
{
string input = new DateTime(2008, 12, 1).ToString();
DateTime result = DateTime.Now;
result = result.LazyTryParse(input);
Assert.AreEqual(new DateTime(2008, 12, 1), result);
}
}

Monday, July 21, 2008

Prepare for the future

It's coming.

Warning: graphic images, weird sense of humor. Not suitable for all audiences. Actually - for anybody.

Clean, Lola, clean!

The CCleaner tool is the best friend of a contractor when it is time to leave.
It is also useful on a daily basis. I just figured out (after a year of procrastinating) what needs to be tweaked in the settings in order to avoid the annoying disappearance of the Windows Explorer settings. You know, when the sorting, appearance, etc. used to be set just the way you like - and suddenly it's all gone?


If you like it squeaky clean - go ahead, check all boxes, but leave the Window Size/Location Cache out.

An awesome, awesome tool (a rightful nominee of the SH2007UDPUTL for Windows list). I just ran it after a three-day break and it found 68 MB of trash.

Sunday, July 20, 2008

Worktime music: Florent Pagny

Ma liberté de penser

Caruso

Saturday, July 19, 2008

Shark!

The advocacy group The Shark Project holds a Photo Award. Here is the teaser:

I noticed that the workflow is missing some important steps (especially considering the subject) and I couldn't resist filling in the gap.

But reality is not that funny. Those fierce insatiable predators are unstoppable and claim more and more lives every year. I am talking about humans. The shark elimination rate is astonishing. It is hard to believe the statistics, but apparently between 38 and 200 million(!) sharks are killed annually. These numbers are beyond any comprehension: it is like losing the entire US population in a single year, Canada in two months, or having World War II occur every quarter.

The fear and loathing are understandable - we are challenged outside of our natural habitat. Unless you are a Navy SEAL, it is not really a fair game. A bear is much more dangerous, but we more or less understand our own land creatures, so we can put up with the necessity of staying away from their ground. If we ended up in the Amazonian jungle or in the Kenyan savannah in their better days, our chances of survival would be slimmer than when going into shark-infested waters, but who wants to eliminate the rainforest? We are so easily influenced by the media that, despite the fact that sharks are an important part of the ecosystem, the thought of protecting them is somehow unsettling. Saving bears is OK, but not sharks? What do we think of the guy who got injured after approaching a lion pride or a she-bear with a cub? A dumb schmuck! But a scuba diver attacked by a shark is always called a victim, because most of the time sharks dare to interrupt our holiest activity - entertainment. If sharks are part of the thrill, one should be ready for what comes of it.

According to the Australian Museum (Australia could win a "shark country" contest with her eyes closed), the human death toll for 1980-1990 breaks down by cause as follows:

Activity Total Deaths Average/Year
Crocodile Attacks 8 *** 0.7
Shark Attacks 11 ** 1
Lightning Strikes 19 * 1.7
Bee Stings 20 * 1.8
Scuba Diving Accidents 88 **** 8
Drowning/Submersions 3367 * 306
Motor Vehicle Accidents 32772 * 2979


* from the Australian Bureau of Statistics, Canberra
** from John West, Shark Attack File, Taronga Zoo, Sydney
*** from Dr Graham Webb, Darwin
****from Dr Doug Walker, Operation Sticky Beak, Sydney

Following this logic, we should hunt down Ford executives! Compared to the 60 shark attacks per year (with fewer than 10 fatal), sharks are taking casualties at a rate of 3,500,000 to 1.

Nature is a smart beast - this improbable survival despite the fierce genocide shows that life, uh... finds a way. We may be surprised one day, and not pleasantly.

Friday, July 18, 2008

Class Field Inheritance Pattern

Off topic: this is one difficult book to read. Not that it's written in tangled language (try reading this). It is just that the understanding of the content only comes with practice; otherwise the knowledge evaporates momentarily. But how often does one make an architectural decision at the enterprise level? And how often does one decide to use a new set of patterns? So the actual "reading" will take quite some time, and I have been reading it on and off for four years already. Yes, I am a slow reader.

The only downside is the examples. As a sports ignoramus, I have no idea why a Footballer would inherit from a Player while a Bowler should inherit from a Cricketer. So the examples are not from my real life, and they confuse me more than they explain. Instead of deriving the relations from the entities I have to go in the opposite direction. Bugger.

Now back to the title: "what object-relational pattern would be more lightweight than the Class Table Inheritance and less normalization-refuting than the Single Table Inheritance?"

Meet the Class Field Inheritance pattern.

The idea is that the base class shares a single table with children and the child-specific fields are serialized into a single field. You would use it if:

1. The base class has equal standing with the child classes and has to be instantiated itself.
2. Child classes have very few extra properties.
3. There is no requirement for an extensive analysis of the child-specific fields.
4. There is no requirement for the raw child-specific data to be available to database objects (e.g. reporting tools, functions or procedures).

None of these requirements is 100% solid. If performance is negotiable, concrete child properties can be lazily deserialized, thus enabling business-layer analysis. The raw data from the field can be "unfolded" using user-defined functions, providing concrete tables to join on if required.

The pattern is fast to implement - one stored procedure will serve it all, and the logical flow is very clear. One can go nuts with Strategies, Abstract Factories and Dependency Injection. Now tell me that it is not a beauty!
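A minimal C# sketch of the idea (my own hypothetical Player/Bowler classes, backed by a single shared table such as Player(Id, Name, PlayerType, ExtraData)):

public class Player
{
    public int Id { get; set; }
    public string Name { get; set; }

    // The base class has nothing extra to store; children override these two hooks.
    public virtual string SerializeExtraData() { return string.Empty; }
    public virtual void DeserializeExtraData(string data) { }
}

public class Bowler : Player
{
    public int WicketsTaken { get; set; }

    public override string SerializeExtraData()
    {
        // Everything child-specific is folded into the single ExtraData column.
        return "WicketsTaken=" + WicketsTaken;
    }

    public override void DeserializeExtraData(string data)
    {
        // Naive parsing just to show the shape; real code would use XML or similar.
        string[] parts = data.Split('=');
        if (parts.Length == 2 && parts[0] == "WicketsTaken")
            WicketsTaken = int.Parse(parts[1]);
    }
}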

Thursday, July 17, 2008

Attempting on Domain Driven Design. Again.

 

I think I keep making the same mistake when I try to read the fundamental Eric Evans work. This time I was advised to skip the first four chapters and hopefully, this will prevent me from dozing off again.

 

Also, I plan to sneak up on DDD with this podcast and a condensed version of the original DDD bible (106 pages versus 500). It is available for free (in PDF format) from InfoQ (go and subscribe to their beautiful personalized feed).

Should you fear or embrace dynamic languages?

Yesterday John Lam (the brain behind Microsoft's IronRuby effort) delivered an excellent (as usual) presentation for the Metro Toronto .NET User Group audience - "IronRuby: Should you fear or embrace dynamic languages?".

John is an exceptionally inspiring speaker. If you ever have a chance to attend his performance presentation - drop all your appointments and go! Don't worry even if the presentation is about Roman cubic art - he is notorious for deviating from the topic and you will hear about many other things. What he will most likely achieve is making you aspire to be better at whatever you do.

Ruby's prospects look good. So do .NET's. By the end of the presentation both questions from the title were answered for me. Those answers are most likely what John wanted to communicate, but the reasons behind them are [possibly|slightly] different: No - you shouldn't fear dynamic languages as a concept, and No - the average C# Joe should not be afraid of Ruby sneaking up on him and taking his job away. C# 3.0 matches Ruby's syntactic sugar in a lot of cases and there is (hopefully) more to come. And Yes - you must at least try Ruby. The presentation finally convinced me that doing so will make me a better developer. A better C# developer.

So go ahead and learn Ruby. By the time you are more or less done, IronRuby will have shipped (spring 2009 was mentioned) and you will be able to make an informed decision - get hooked on this new cool drug or stick with the good old one :)

Saturday, July 12, 2008

Sorting a grid with ObjectDataSource

ObjectDataSource is a lazy way to power a grid control (say, a GridView). Surprisingly, the Microsoft team didn't give much thought to any data source controls except SqlDataSource.

The "It just works" approach just doesn't work with ObjectDataSource sorting. I like using generic lists a lot but ObjectDataSource can not sort them: "The data source 'mySource' does not support sorting with IEnumerable data. Automatic sorting is only supported with DataView, DataTable, and DataSet". If they were aware of this flaw - why didn't they just fix it and provide nice sorting support?

Thus the hack is up to us.

The first step is supplying a Comparer for our custom class. And if you are rightfully too lazy to write a full set of Comparers, one per property, then reflection comes to the rescue:

public class ToyComparer : IComparer<Toy>
{
private readonly string _property;

public ToyComparer (string propertyName)
{
_property = propertyName;
}

public int Compare(Toy x, Toy y)
{
PropertyInfo property = x.GetType().GetProperty(_property);
if (property == null)
throw new ApplicationException("Invalid property " +_property);
return Comparer.DefaultInvariant.Compare
(property.GetValue(x, null), property.GetValue(y, null));
}
}

The class's list Select method uses the Comparer to sort the output list:


private static List<Toy> GetToys(string propertyName)
{
//list creation is omitted ...
if (!string.IsNullOrEmpty(propertyName))
{
list.Sort(new ToyComparer(propertyName));
}
return list;
}

The second step is to add a Select parameter to the ObjectDataSource:


<asp:ObjectDataSource ID="mySource" SelectMethod="GetToys" TypeName="Toy"...>
<SelectParameters>
<asp:Parameter Name="propertyName" Direction="Input" Type="String"/>
</SelectParameters>
</asp:ObjectDataSource>

And last but not least - we need to set the ObjectDataSource parameter and cancel the grid's Sorting event to prevent the exception:


protected void gridMaster_Sorting(object sender, GridViewSortEventArgs e)
{
mySource.SelectParameters["propertyName"].DefaultValue = e.SortExpression;
e.Cancel = true;
}

If necessary we can pass the sorting order the same way.
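A hedged sketch of that: the descending flag, the extra "descending" select parameter and the SortDirection mapping below are my additions, not part of the original example (and in practice you may need to track the ascending/descending toggle yourself once the event is cancelled):

// Comparer extended with a direction flag; descending simply negates the result.
public class ToyComparer : IComparer<Toy>
{
    private readonly string _property;
    private readonly bool _descending;

    public ToyComparer(string propertyName, bool descending)
    {
        _property = propertyName;
        _descending = descending;
    }

    public int Compare(Toy x, Toy y)
    {
        PropertyInfo property = x.GetType().GetProperty(_property);
        if (property == null)
            throw new ApplicationException("Invalid property " + _property);
        int result = Comparer.DefaultInvariant.Compare(
            property.GetValue(x, null), property.GetValue(y, null));
        return _descending ? -result : result;
    }
}

// In the Sorting handler, forward the direction through a second select parameter
// (assumes a matching "descending" parameter and GetToys overload exist).
protected void gridMaster_Sorting(object sender, GridViewSortEventArgs e)
{
    mySource.SelectParameters["propertyName"].DefaultValue = e.SortExpression;
    mySource.SelectParameters["descending"].DefaultValue =
        (e.SortDirection == SortDirection.Descending).ToString();
    e.Cancel = true;
}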

Friday, July 04, 2008

Subversion 1.5 + TortoiseSVN 1.5 + VisualSVN 1.5

Subversion 1.5 has been released, with TortoiseSVN 1.5 following.

Make sure that you read the release notes carefully, and if you are a happy user of VisualSVN, upgrade it to version 1.5 as well.

Introverted programming: Office Kung Fu - art of The Crouching Master

It is nice to work for a good company. It is enviable to be respected by your colleagues, to respect them in return and to look forward to the next workday. A fat paycheck would top this off and make it your dream job.

But beware the Office Kung Fu masters. It takes just one to contaminate a healthy collective. As soon as you detect the following main stances of Office Kung Fu, it is time to master them yourself. When the Big S hits the fan, your survival may depend on these very moves: Deny, Blame and Take Credit.

"It wasn't my fault but I have a pretty good idea..."

Deny - the main defence stance. Distance yourself from any failure, even a potential one, which is not (and especially if it is) your fault.
A simple "it's not me" defence is too low-powered. You may need to practice higher techniques: the phrase "I don't think this is OUR group" will likely recruit others to fight on your side (The Summoning Block). Master the swift transition to the attack; do not give your opponent any time to regroup.

Figure 1: The Master blocks the accusation and demonstrates readiness to lay blame in return.

 

"I have four reasons why these three individuals should be held responsible..."

Blame is a crushing attacking blow. Be the first to point fingers. Don't hesitate even if the accused is another Office Kung Fu Master - he/she has to repel your thrust first, and his/her response can easily be depicted as a personal attack on you, the very person who cares about the company so much. The Leech Spree is an example of an indirect attack, which can be executed with a phrase like "Somebody changed this file". This will ignite an avalanche of blame. Be careful when blaming groups - the other Master may use the Summoning Block against you.
Figure 2: Master prepares to attack three people at once and his blame is supported by four sound arguments - very powerful Blame By Authority attack.

 

Take credit - a powerful shift-of-power move. Don't cut others any slack. They will rob you of your (or their own) success behind your back, so hurry up and claim it first. Successfully applied, this attack will strengthen your position while weakening the opponent.

"We fixed that build!" "I hate to brag, but..."

Precede a blaming attack by taking credit first, thus multiplying the destructive effect (The Tsunami Swoosh). Conversely, you can emphasize your value to the organization by putting the blame first and taking credit immediately after (The Slapping Blossom).

While novices can use a highly energetic version (Fig. 3), a true master prefers the carefully calculated move (Fig. 4), which is likely to bring lasting recognition (The Humble Ploy).

 

 

The highest art of Office Kung Fu is to combine all these techniques within one sweeping motion: "It is not me, it was Nick's fault, but I successfully fixed the problem."

Deny, blame and take credit - and you'll never go home without a promotion.

Thursday, July 03, 2008

Programming geniuses and death by risk aversion

When you read about architecture astronautics and programming geniuses, it seems like a good idea to keep an eye on outbursts of creativity. There is some popular medicine out there, like KISS and YAGNI, which can help you cope with that insane yearning for Yet Another Framework.

But as usual the coin has two sides: you still need to empower your users with innovations. Setting computer dreamers free will result in stagnation, as their heavy all-inclusive solutions are difficult to maintain and difficult to advance. The more effort is put into a particular architecture, the more hesitant people are to take the risk of making radical changes or to let it go altogether, so eventually the solution becomes more of an obstacle than an enabler.

If there is a ready recipe for mixing evolution and revolution successfully, it is most likely heavily guarded by the corporation that got filthy rich implementing it.

Dependency injection and composition with extension methods

Muddling through the C# 3.0 updates, a colleague and I (inheritance junkies that we are) came upon an interesting observation. We started from a normal base class, MyClass:

public class MyClass
{
    public void Update()
    {
        MyProperty = "new value";
        DoSomething(this);
    }
}


Then we extended collections with an extension method (this nice code is shamelessly stolen from the Umbrella library):



public static ICollection<T> ForEach<T>(this ICollection<T> items, Action<T> action)
{
    if (items != null)
    {
        foreach (T item in items)
        {
            action(item);
        }
    }
    return items;
}


The intended inheritance-driven usage of this method was something like this:


var list = new List<MyClass>();
list.ForEach(a => a.Update());


Technically there was not much sense in implementing an extension method just to use it this way. So we went further, took the Update method's content out of the class and injected it through the extension method:

list.ForEach
(
    a =>
    {
        a.MyProperty = "new value";
        DoSomething(a);
    }
);



Now we do not really need the instance Update method anymore and can defer its implementation until we really need it.


Of course, the above implementation has significant limitations, especially where private or protected members are concerned. This code is also not exactly DRY-friendly. From another point of view, the dependency injection pattern contradicts the concept of the "black box" to some degree.


I wouldn't repeat that "LINQ is cool" but adopting this approach could bring some interesting results, what do you think?

Wednesday, July 02, 2008

Little Subversion trick to improve VisualSVN

VisualSVN is a great product. It is stable, intuitive, and if you know TortoiseSVN you pretty much know it all. Until (relatively) recently, Ankh was practically the only Visual Studio plug-in for Subversion. Unfortunately, it suffered from performance and reliability problems and lost a lot of trust. They claim the problems are solved, but I personally would need something very convincing to try it again. Like a direct order from management.

Unfortunately VisualSVN does not resolve the old problem with multiple working copies (you know - when you have a framework and share it between numerous solutions). For some reason neither Subversion nor Team System approves of this very DRY approach. As a result the Solution Explorer in Visual Studio doesn't track changes in a project which resides outside the solution's working folder. Not a big deal if the framework is stable, but it becomes pretty annoying if some work is still being done on it.

To overcome this problem in Team System you have to sacrifice a black lamb at the intersection of six roads at full moon and hire a Gemini-born Microsoft consultant within forty days. Subversion provides a more environmentally friendly solution - the svn:externals property.

svn propset svn:externals "Framework http://svn.repository/Framework/trunk" .

If you set this property on the main solution folder, the framework project will be checked out inside it and become a legitimate (and version-controlled) part of the solution. Instead of using a single physical copy of the framework project you will have multiple copies, one underneath each of the solutions, but all of them will be tracked against the single code base in SVN.


The svn:externals property still seems a little bit like a hack and has some stability issues and limitations with branching. Nevertheless it is a reasonable solution - the SVN properties can be managed by a build master and developers don't really need to do any extensive learning.

Tuesday, July 01, 2008

Intellisense in NAnt build file editor

This is a great article about adding NAnt Intellisense to the Visual Studio XML editor - a very useful feature. It also works perfectly with VS 2008 - just use Visual Studio 9 instead of 8. If the registry key in the article is truncated, it is 412B8852-4F21-413B-9B47-0C9751D3EBFB.

Unfortunately NAntContrib tasks are not included in the original schema but these articles provide some insight.
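Once the schema is generated and registered with the XML editor, the build file only needs to reference its namespace for Intellisense to kick in - a hedged sketch, since the exact namespace URI depends on your NAnt version and the schema you generated:

<?xml version="1.0"?>
<project name="MySite" default="build"
         xmlns="http://nant.sf.net/release/0.85.0/nant.xsd">
  <target name="build">
    <!-- With the namespace in place, the editor suggests tasks and attributes here. -->
    <echo message="Hello, Intellisense" />
  </target>
</project>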

Friday, June 20, 2008

Wish you were here

An absolutely awesome video! A simple but masterful piece which radiates positive energy. And the dancing style is quite catchy.


Here is the guy's site - http://www.wherethehellismatt.com (a much better quality video is available there). Toronto is not on the list yet, so we still have a chance to make history :)

Tuesday, June 17, 2008

Code bombs and team morale

Jeff Atwood has again provided some good reading for us: code-bombing from the dark room. This is the essence of why collective code ownership and even the infamous pair programming are actually helpful in shielding the programmer's ego.

It is much easier to give up on a 10-line method than on a whole over-engineered and obtrusive application layer. I am so used to admitting my ignorance casually that it has become a defence mechanism for my otherwise vulnerable and envious soul. I like writing frameworks, but the majority of them have never seen the light of day. In the eyes of others, exactly this practice is supposed to make you a better programmer. It probably did, because I avoided code-bombing my colleagues, prudently submitting mere suggestions instead. Most of the time these fruits of weekend labor were criticized, but some pieces were adopted after heavy rework, and that, indeed, made me a better developer. The critiques hurt a bit but kept me determined to go ahead and try to get those heartless bastards next time. How does "suicide code-bombing" sound?

Software development is a people business, and what is good for a cubicle drone will be good for the whole project. Code reviews, especially if they cover the results of months of work, are inefficient and leave people feeling more offended than enlightened. Daily cooperation allows people to admit and accept their mistakes with dignity, and confident people are much more open to learning. It is not even essential to have a Jedi programmer on board, as a team of generalizing specialists possesses enough steam power to propel the average skill level up and forward.

Sunday, June 15, 2008

Testing ASP.NET Ajax autocomplete extender with Watin

Back to our Autocomplete extender example.

One of the biggest reasons we hesitate to unleash the ASP.NET Ajax toolkit onto our project is potential performance shortcomings. A load test would be a nice addition to the test harness. We do not run Team System (not that we would have Visual Studio Test Edition available anyway), so we can only judge TFS load testing suckiness from the words of others.

The WatiN tool seems to be a very good candidate for black-box testing. My first attempt to run a simple test wasn't a success:

[Test]
public void FirstTest()
{
IE ie=new IE("http://localhost/Prototype.WebUI/TestDataEntry.aspx");
ie.TextField(Find.ByName("ctl00$contentMain$txtManufacturer")).TypeText("n");
Thread.Sleep(200);
ie.TextField(Find.ByName("ctl00$contentMain$txtManufacturer")).TypeText("nis");
Thread.Sleep(200);
ie.Close();
}

When the test ran, I could see the active fields highlighted in the IE window and the text being typed, but without any sign of Ajax. Either WatiN was missing events (which was unlikely, knowing Jeroen's thoroughness), or the ASP.NET Ajax techniques are too exotic to catch (which is more than possible given Microsoft's history of lacking attention to detail).

After a lengthy investigation and extensive Reflector code digging I found that the Autocomplete extender attaches itself to the KeyDown event of the textbox, so from that side we were more or less clear. Then I remembered my own adventures with ASP.NET testing and decided to kick other tires - to play with runtime event reanimation. To make a long story short, the following line brought my Ajax back to life for the WatiN testing:

txtManufacturer.Attributes.Add("onkeydown", "");

That's right - "attach nothing to KeyDown event" - and all will start working as expected. Now we can load test questionable Ajax performance.

I didn't have time to investigate the differences in the generated code, but I would start by blaming the ASP.NET Ajax code generation for this glitch. I bet that, like with the SQL Data Source Wizard bug, somebody somehow overlooked some possible combinations.

P.S. I can't wait to lay my hands on Component Art components.

Sunday, June 08, 2008

Worktime music: Apocalyptica

The Unforgiven

 

Seeman (ft. Nina Hagen)

Friday, June 06, 2008

Introverted Programming: So what the Japanese management model really teaches us?

Not so long ago I read a not-so-interesting opinion on one of those forums which traditionally criticize any kind of management style. It inspired some thinking, and such a rarity needed to be captured.

We are so used to invoking Japan's authority on business efficiency - but we hardly look beyond the magic of events. Of course - agile this and agile that, lean development, kanban and just-in-time manufacturing - it all seems to work. But if you think about the organization of The Japanese Firm, what comes to mind first? Loyalty. Not in the sense of an Omerta code, but rather in the sense of people identifying themselves with The Company and perceiving their employment as a lifetime commitment.

So where am I going with this? Maybe there is no mystery. Knowing the devotion, and remembering that career advancement in a Japanese company is traditionally based on seniority, it is safe to assume that the vast majority of management at all levels is home-brewed. And likely their success comes from a deep understanding of the business area and processes (which they have grown up through) rather than from mere formal education (which they perfect nevertheless) or magic business techniques (which they invent and employ based on that expertise).

Thursday, June 05, 2008

"#region is evil" Part II - my bad

OK, it is not always evil. If you have a class with 50 properties, there is little that can be done. Refactoring by grouping the properties into types would be gross overengineering. So pack this stuff into a #region - I'll look the other way.

Smuggling a dose of LINQ to the .NET 2.0

As soon as you get hooked on syntactic sugar like auto properties and object initializers, it is excruciatingly painful to go through withdrawal back in .NET 2.0. LINQ is even more addictive, and it is good to know that there are people around who can supply us with the good stuff if we find ourselves locked into a Visual Studio 2005 project. LINQBridge looks like a solution. The author is Joseph "C# 3.0 in a Nutshell" Albahari.
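A hedged sketch of what that buys you: with LinqBridge.dll referenced and the C# 3.0 compiler targeting .NET 2.0, ordinary LINQ-to-Objects just works (the names below are mine):

using System;
using System.Collections.Generic;
using System.Linq; // supplied by LINQBridge on .NET 2.0

class LinqBridgeDemo
{
    static void Main()
    {
        List<int> numbers = new List<int> { 5, 3, 9, 1, 7 };

        // Standard query operators come from LINQBridge's Enumerable implementation.
        IEnumerable<int> bigOnes = numbers.Where(n => n > 4).OrderBy(n => n);

        foreach (int n in bigOnes)
            Console.WriteLine(n);
    }
}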

P.S. His LINQPad is worth a look too - at least you don't have to fire up a fresh VS2008 project to try out LINQ expressions. It would be nice, though, to have Intellisense there.

Wednesday, June 04, 2008

Fill'er up

I understand that there are more complex cause-and-effect principles involved, but wouldn't it be funny if calling a towing service for your car became cheaper than actually driving it?

Thursday, May 29, 2008

Introverted Programming: File Oriented Architecture

Meet the new kid on the block: FOA - the File Oriented Architecture.

FOA contracts are simple - applications connect with each other assuming that the counterpart can be found in a certain location. In pseudo-code a contract would look like: "three folders up - two steps right - two folders down - get a Foo.Moo instance". FOA has vendor support - in a sense, Microsoft Web Site projects are examples of the FOA approach. FOA is a further development of the KISS principle - the KISS of the Web 2.0 (beta) era, KIverySS.

FOA is a very clean and controllable way to reuse code. "In-depth understanding of FOA principles" looks good in a resume (it has "Architecture" in it!). It is a much easier concept to understand than SOA or REST, so any hiring manager will be on the same page with you. Document-Driven Development tools, such as wordUnit, can be used to ensure the accuracy of binding links within a Continuous Integration process.

FOA beginners operate with full paths, like

AdminConsole console=
new AdminConsole("C:\Documents and Settigns\joe\MyDocuments\Foo\Moo\AdminConsole.aspx");

But a true master knows the power of relative paths:
AdminConsole console=new AdminConsole("..\..\..\Foo\Moo\AdminConsole.aspx");

This way the application's deployability is much higher. The master will also encapsulate the values in the configuration file, where they can be easily changed if some silly developer suddenly changes the structure of the bound application. Imagine the flexibility of deploying the application over web farms and clustered servers. We can construct relative paths to relative locations - the possible combinations are endless.

Setting up (or getting back) a default browser

Once you have installed Firefox and made it the default browser, it is tricky to let Internet Explorer step back in. It is a formidable achievement - to defeat Microsoft on its own home ground - but it may be a little annoying, especially for debugging.

So here is how you can help this poor Microsoftee to gain its glory back (system and Visual Studio settings are independent of each other):

1. For the system (at least for Win XP):

Control Panel -> Add or Remove Programs -> Set Program Access and Defaults -> Custom (the double chevron on the right side) -> choose a default Web browser

Just be aware that Firefox, pertinacious as it is, will add itself back to the Quick Launch menu as soon as it becomes the default again.

2. For the Visual Studio:

First, you have to switch to any aspx (or ascx) layout file.

"File" menu -> "Browser with..." -> set a default browser of your choice.

Enjoy the powers of "Lord of the Windows".

P.S. Everything is intermingled in this world. If you have tried to rid yourself of that annoying Windows Update "Reboot Now - Or Die From Chafing" popup, beware: the Automatic Updates service will restart after Internet Explorer is made the default browser for the system and will continue to nag you.

Sunday, May 25, 2008

Mac vs. PC - so what's the deal?

It is interesting that the PC world hasn't hit back at Apple for those Mac vs. PC commercials (Update: I was wrong - they did). Partially, I guess, it is because there is no such monolithic entity as an evil PC empire - so the campaign is more like "Mac vs. Everybody". But there is another side to it: Apple has never worked harder in favor of the PC. Put all the commercials together - what do you see?

Doesn't it look like Apple is actually arguing for choice?

One could retort that the ads can easily impress turtleneck-sweater-wearing, latte-drinking designer dudes, who run their Photoshop on Macs anyway. And that's cool - I'd trade my PC for a Mac any time if I were rendering pictures all day. But corporate CIOs do not need to be reminded about PC troubleshooting - they trust that with their engineers PC problems come and go easily.

So as a result the ads, witty as they are, have actually created a perception totally undesired by Apple:

 

OK, OK, not only artists run Macs - I know a whole bunch of deadly efficient (.NET/Ruby/C++/you-name-it) developers who use MacBooks. But it is interesting: what are the most popular applications for the Mac, aside from the Opera browser? I dare to surmise Boot Camp or Parallels. I even bet that you would find them on the majority of die-hard Maceteers' computers.

And corporations are evil anyway, so who cares what those CIOs think? It's all about users and their freedom! I support Jeff Atwood in his quest for a fair product.

 

And another omission from the Mac marketers - the Mac is portrayed as a leisure-time computer (I am just afraid to think where that Japanese-camera girl took the picture from). C'mon! Don't you have something else to do after work? Or friends you could listen to instead of an iPod?

 

And the bottom line - the Mac compares itself with the PC, but in reality the PC has no face. It's not fair. If we are to compare hardware with hardware and software with software, then we should do this:

compare personality:

 

or efficiency:

 
 
Also, all of this has happened before - in 1996. And where did Apple Inc. end up shortly after?

P.S. The Mac is OK, really. So is the PC. And there is no real difference between them worth being a zealot about.

Saturday, May 24, 2008

Pre-manufactured AJAX


Ajax-enable your web-server within minutes. No skill required.

Friday, May 16, 2008

.NET Framework 3.5 SP1 Beta 1 - use some non-beta stuff today

One aspect of the shiny tomorrow is going to be Ajax Script Combining Support - a very nice feature it will no doubt be. If you have ever fired up Firebug while debugging the ASP.NET Ajax Control Toolkit (which I like more and more every day), you probably saw quite a few scripts being pumped down from the server. This is how the load looks with just one Autocomplete Extender:

But fear not - you don't have to wait for the full Service Pack release and you don't have to suffer with crippled Betas. The script combining functionality is already available in a stable form - through the ASP.NET Ajax Control Toolkit. The ToolkitScriptManager class inherits from the standard ScriptManager and efficiently eliminates multiple requests, combining the scripts which have expressed their willingness to be combined in the form of a ScriptCombineAttribute.
This is the same page with a ToolkitScriptManager replacing the ScriptManager:

We saved a quarter of a second on a page with just one control - how much will the Ajax junkies gain?

Now back to Service Pack 1. Initially I thought that the ASP.NET team had just ported the brilliant Toolkit guys' idea into the upgrade. But from the preview it looks like they did their own ScriptManager extension. The new ScriptManager seems to provide more fine-grained adjustment, while ToolkitScriptManager is much simpler and more elegant. It is too early to say, and there is no way I am installing a Beta from Microsoft again.
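For reference, the markup swap itself is a one-liner - a minimal sketch with the usual Toolkit registration (IDs and tag prefix are arbitrary):

<%@ Register Assembly="AjaxControlToolkit" Namespace="AjaxControlToolkit" TagPrefix="ajaxToolkit" %>

<%-- Instead of: <asp:ScriptManager ID="scriptManager" runat="server" /> --%>
<ajaxToolkit:ToolkitScriptManager ID="scriptManager" runat="server" />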


© 2008-2013 Michael Goldobin. All rights reserved