Visual SourceSafe Tidbit

I administer my employer’s VSS and Team Foundation Server (TFS) instances as part of my job.  As a first-time administrator of these sorts of systems, there’s plenty I’m still learning.  Today I found out that moving folders doesn’t work unless the user trying to execute the move has “Destroy” permissions.

The SHAPE command is the bane of my existence

I inherited some classic ASP code not long ago that needed some enhancement.  The look-and-feel of the site is pretty nice, but under the covers there’s tons of the inline SQL I hate so much.  Worse than the inline SQL, though, is the previous developer’s use of the SHAPE command.  I’d never used it, even when I last wrote classic ASP with my own hands (around 2003), and once this project is over I hope never to see it again.  I’m really struggling to understand how it works (and why things were done this way), so I’m hoping this article will help me figure it out.  I’m grateful that someone took the time to develop a formal grammar for the command; it reminds me a lot of the BNF notation we learned as first-year computer science majors in college.
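For my own reference, here’s a minimal example of the kind of statement the grammar describes, using generic customer/order tables rather than anything from the actual codebase:

SHAPE {SELECT CustomerID, CompanyName FROM Customers}
APPEND ({SELECT OrderID, CustomerID, OrderDate FROM Orders}
        RELATE CustomerID TO CustomerID) AS rsOrders

Executed through the MSDataShape OLE DB provider, that yields a parent recordset of customers in which each row carries a child recordset of that customer’s orders in the rsOrders column.  That also explains why the pages I inherited walk nested loops over a single recordset instead of issuing separate queries.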

More on databases and business logic

This particular entry in the “forever war” of whether to use object-relational mapping or stored procedures does a better job than most in these ways:

  • It reframes the argument from an either-or choice to a situational one.
  • It broadens the scope of database objects in the discussion beyond stored procedures to include functions, triggers, views, constraints, and referential integrity.
  • It rates the suitability of each database object to a particular task.

I found the article especially relevant to my current job because of how many projects I’m responsible for that have huge amounts of business logic captured in stored procedures hundreds of lines long.  As far as I can tell, much of the reason for the length of these stored procedures is that they’re being used to represent workflows.  SQL doesn’t look like the best way to implement those to me, so I suspect I’ll be looking at Windows Workflow Foundation a lot more closely in the very near future.
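To make that suspicion concrete, here’s a rough sketch of what one of these workflows might look like as WF code instead of SQL.  Everything here is hypothetical (the workflow, its steps, and all the names are invented), and I haven’t written real WF code yet:

using System;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// Hypothetical two-step workflow; today this kind of flow lives in a
// several-hundred-line stored procedure full of status flags.
public class ReferralWorkflow : SequentialWorkflowActivity
{
    public ReferralWorkflow()
    {
        CodeActivity intake = new CodeActivity("Intake");
        intake.ExecuteCode += delegate { Console.WriteLine("referral received"); };

        CodeActivity review = new CodeActivity("Review");
        review.ExecuteCode += delegate { Console.WriteLine("referral reviewed"); };

        Activities.Add(intake);
        Activities.Add(review);
    }
}

class Host
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += delegate { done.Set(); };

            runtime.CreateWorkflow(typeof(ReferralWorkflow)).Start();
            done.WaitOne(); // block until the workflow completes
        }
    }
}

The appeal is that the sequence of steps is explicit and testable, instead of being implied by status columns and IF blocks scattered through SQL.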

Source Code Control

I came across this post from Joel Spolsky last week (though I’m just now getting around to blogging about it). We’re using Team Foundation Server for source code control at work, and we’ve managed to have two problems at once: check-ins that break the build, and check-ins too infrequent to leave a useful trail of changes.  While the applications we build at APS Healthcare are nowhere near the size of an operating system in lines of code, the branching-and-merging approach Spolsky describes would probably be useful to us.

Working with IT agencies

I have to deal with IT agencies a lot more now as a software development manager than I ever did in previous roles.  So coming across a blog post titled “IT Agencies and the Devil” was pretty funny.  If starting an IT agency is as simple as the author suggests, it certainly explains why there seem to be so many of them.

So far, I can single out one such agency for consistently providing high-quality people: Software Consortium.  The guys they’ve sent to work on the projects I’m responsible for have all turned out excellent code and been very good about knowledge-sharing.

NDbUnit

I’ve been a big fan of test-driven development (and unit testing) since I first learned about it a few years ago, but it wasn’t until this month that I learned about NDbUnit. This little library is a great value-add for unit tests that involve databases. Creating the XML test-data files it reads is tedious if done by hand, but that’s the only real drawback I’ve found so far in my limited experience with it.  It plays very nicely with NUnit, MbUnit, and TestDriven.NET.
You can get binaries and source code for NDbUnit from Quality Labs.  I also put together a (very) small sample project with Visual Studio 2005 that you can try out.
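For anyone who hasn’t seen it, the basic pattern looks roughly like this; the connection string, schema, and file names below are placeholders, not the sample’s actual values:

using NDbUnit.Core;
using NDbUnit.Core.SqlClient;
using NUnit.Framework;

[TestFixture]
public class ContactDataTests
{
    private SqlDbUnitTest db;

    [SetUp]
    public void SetUp()
    {
        // Point NDbUnit at the test database, the XSD describing the
        // tables under test, and the XML file holding the seed rows.
        db = new SqlDbUnitTest("Server=(local);Database=UnitTests;Integrated Security=SSPI;");
        db.ReadXmlSchema(@"..\..\TestData.xsd");
        db.ReadXml(@"..\..\TestData.xml");

        // Clear the affected tables and insert the seed rows so every
        // test starts from an identical database state.
        db.PerformDbOperation(DbOperationFlag.CleanInsertIdentity);
    }

    [Test]
    public void ListAll_ReturnsTheSeededRows()
    {
        // ...exercise the data-access code against the known state here...
    }
}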

Home computer backups

Some friends of mine in California have been discussing backup strategies over the past few days. They came across a post by Jeremy Zawodny on using Amazon S3 in addition to his existing backup strategies.

I’m still only using an external hard drive for my backups, and I don’t back up my machines regularly enough either. Before Seagate bought them (and before I joined the cult of the Mac), I had a lot of interest in a Mirra Personal Server. At the time it was Windows-only, but now it supports Mac OS X as well.  Recently, I started using Deja Vu for backing up the Mac mini.  So far, it’s been completely painless.

Continuous Integration

The practice is well-defined in a couple of articles, on Wikipedia and on Martin Fowler’s website.  But as long as I’ve been reading about this best practice, I’ve never seen it implemented at any of my past jobs (or my current one, for that matter).  Fortunately, one of the consultants I’m currently working with not only has it implemented, but carries the necessary software and test projects on a USB key from job to job.

Before I demonstrate it to the broader software team as a practice, I’m trying to get it working on my own machine.  Because he uses MbUnit instead of NUnit as part of his implementation, it took me a little longer to get the second of his six test projects working.  A little googling for NAnt and MbUnit yielded an article that listed five files to copy into NAnt’s bin directory.  Once I did that, the second test project worked fine.

Strangely, I could only find four of the five files in the list:

  • MbUnit.Core.dll
  • MbUnit.Framework.dll
  • MbUnit.Tasks.dll
  • QuickGraph.dll
  • QuickGraph.Algorithms.dll

MbUnit.Core.dll was the one missing from my installation, but copying the other four did the trick.
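With the DLLs copied, the NAnt target that runs the tests looks something like this.  I’m writing it from memory of the article, so treat the attribute names and the ${build.dir} property as approximate placeholders rather than gospel:

<target name="test" depends="build">
  <!-- Run every MbUnit test assembly and emit an HTML report. -->
  <mbunit report-types="Html"
          report-filename-format="TestReport"
          report-output-directory="${build.dir}\reports">
    <assemblies>
      <include name="${build.dir}\*.Tests.dll" />
    </assemblies>
  </mbunit>
</target>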

Why Unit Test

I came across a great post on unit testing today that covers not just why to unit test, but what to unit test. Differentiating software into “infrastructure” and “end user application” really highlights what sort of code benefits most from the approach. Another really useful bit of the article talks about turning bug reports into unit tests. It’s a very smart idea that I’ll try to use on my own code whenever it’s appropriate.
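The bug-report idea translates into code almost mechanically.  Here’s a sketch of the shape such a test takes; the calculator class and the bug number are invented for illustration:

using NUnit.Framework;

[TestFixture]
public class InvoiceCalculatorTests
{
    // Bug #1234: a discount over 100% produced a negative total.
    // Encoding the report as a test keeps the bug from coming back.
    [Test]
    public void TotalIsNeverNegativeWhenDiscountExceedsOneHundredPercent()
    {
        InvoiceCalculator calc = new InvoiceCalculator(); // hypothetical class
        decimal total = calc.Total(100m, 1.25m);          // price, discount rate
        Assert.IsTrue(total >= 0m, "Total should never drop below zero.");
    }
}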

MyGeneration and Gentle.NET

After last week’s post about the stored procs vs. ad-hoc SQL debate, I decided I’d take a look at MyGeneration and Gentle.NET if I could find the right project. A simple contact form seemed like a good place to start.

Anytime a form has dropdowns, I try to use database tables to populate them.  So I started by copying data for the dropdowns from an existing database to a new one for the project. Once I created a business logic assembly for the new classes to belong to, I fired up MyGeneration and ran the Gentle.NET business entity template. Creating the classes was very easy. The real challenge turned out to be referencing the right Gentle assemblies in my projects. Because I’d included references to Gentle.Common, Gentle.Framework, Gentle.Provider.SQLServer, and log4net in the business logic assembly, I thought that was all I needed to do. But the web project wouldn’t compile. To get the page working, I had to add Gentle.Common and Gentle.Framework references to the web project as well.
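For anyone curious, here’s a stripped-down sketch of what one of the generated entity classes looks like.  I’m reconstructing this from memory of Gentle.NET’s attribute-based mapping, so treat the attribute details and the ListAll helper as approximate rather than exact:

using System.Collections;
using Gentle.Framework;

[TableName("ContactType")]
public class ContactType : Persistent
{
    private int id;
    private string name;

    [TableColumn("ContactTypeId", NotNull = true), PrimaryKey(AutoGenerated = true)]
    public int Id
    {
        get { return id; }
        set { id = value; }
    }

    [TableColumn("Name", NotNull = true)]
    public string Name
    {
        get { return name; }
        set { name = value; }
    }

    // The ListAll helper that the page code below calls.
    public static IList ListAll()
    {
        return Broker.RetrieveList(typeof(ContactType));
    }
}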

Once I sorted out my reference issues (and added a Gentle.config file), finishing the contact form went very quickly. I developed this bit of code for loading any dropdown list:

private void LoadDropDown(ref DropDownList ddl, IList dropDownItems, string textField, string valueField)
{
    // Bind a list of business objects to the dropdown, using the given
    // property names for the displayed text and the submitted value.
    ddl.DataSource = dropDownItems;
    ddl.DataTextField = textField;
    ddl.DataValueField = valueField;
    ddl.DataBind();
}

Here’s how it’s called:

if (!Page.IsPostBack)
{
    // Populate each dropdown only on the initial page load.
    this.LoadDropDown(ref ContactTypeList, BLL.ContactType.ListAll(), "Name", "ContactTypeId");
    this.LoadDropDown(ref SubjectList, BLL.InformationType.ListAll(), "Name", "InformationTypeId");
    this.LoadDropDown(ref OrgTypeList, BLL.Audience.ListAll(), "Name", "AudienceId");
}