LINQ to FogBugz fun

Most Daily Parker readers can skip this (long) post about software. But if you're interested in C# 3.0, LINQ, or FogBugz, read on.

I use FogBugz's time tracking tool to provide traceability in my billing. If I bill a client 2.75 hours for work on a bug, I want the client to see the exact times and dates I worked on the bug along with all the other details. And because I also track non-billable time, and I often work in coffee shops or places like the Duke of Perth, I wind up with lots of tiny time intervals that I have to aggregate to produce a bill.

My time sheet today, for example, looks like this:

Start     End        Case  Title
7:11 AM   7:23 AM    901   Walking the dog (November)
8:18 AM   9:32 AM    950   FogBugz to LINQ project
9:32 AM   Stop Work  902   Blogging (November)

But what I need for QuickBooks looks like this:

Case                              Hours
901: Walking the dog (November)   0.20
950: FogBugz to LINQ project      1.23
902: Blogging (November)

(The last bit has no time because I'm still working on it.)
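
The conversion is just TimeSpan arithmetic. As a quick illustrative sketch (not part of the actual program), the 7:11 to 7:23 entry works out to 0.20 hours:

// 12 minutes = 12/60 of an hour = 0.20 hours
TimeSpan dogWalk = DateTime.Parse("7:23 AM") - DateTime.Parse("7:11 AM");
Console.WriteLine(dogWalk.TotalHours.ToString("0.00"));	// prints 0.20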

This is the perfect kind of thing to waste a few hours on while learning some new programming tricks. (Code here.)

First: the entity

To use LINQ to SQL in its pure form, you first have to create entity classes that pretty much exactly mirror the tables you're interested in. My target, the FogBugz timeinterval table, yields an entity that looks like this:

[Table(Name="timeinterval")]
class TimeInterval
{
	[Column(Name="ixInterval",IsPrimaryKey=true)]
	public int Identity;

	[Column(Name = "ixPerson")]
	public int PersonId;

	[Column(Name = "ixBug")]
	public int CaseId;

	[Column(Name = "dtStart")]
	public DateTime StartDate;

	[Column(Name = "dtEnd")]
	public DateTime EndDate;
}

Because I'm interested in the aggregate time spent on each case, I also created a simple structure to hold that info:

struct AggregateInterval
{
	public int CaseId;
	public double TotalHours;
}

Second: the console app

To use LINQ to SQL, you need to include a reference to the System.Data.Linq assembly, and import the appropriate namespaces:

#region Copyright ©2008 Inner Drive Technology

using System;
using System.Collections.Generic;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

#endregion

namespace InnerDrive.Research.FogBugz
{
	class Program
	{
		static void Main(string[] args)
		{

Next, set up the data context:

DataContext context = new DataContext(" (your connection string) ");
context.ObjectTrackingEnabled = false;
Table<TimeInterval> table = context.GetTable<TimeInterval>();

(I turned off object tracking because this is a read-only application. Setting ObjectTrackingEnabled to false improves performance, but the data context will throw an exception if you call DataContext.SubmitChanges().)
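
To make that trade-off concrete, here's an illustrative sketch (not part of the real program): queries still work normally with tracking off, but any attempt to save throws.

// Reads work fine with object tracking disabled...
TimeInterval first = table.FirstOrDefault();

// ...but writes don't; with ObjectTrackingEnabled set to false, this line would throw
// an InvalidOperationException instead of saving anything:
// context.SubmitChanges();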

I actually need two queries, one to get the table rows and another to aggregate them. The reason is that my aggregation depends on computing the total hours each interval represents, and LINQ to SQL can't translate that calculation into SQL. Here's the first query:

// FogBugz stores time as UTC; I want the time for today in Chicago, not London
DateTime startDate = DateTime.Today.ToUniversalTime();

var intervals =
	from interval in table
	where interval.EndDate <= startDate.AddDays(1) &&
		interval.StartDate >= startDate
	group interval by interval.CaseId
	into grouping
	select grouping;

The second query does the aggregation, transforming the first query into an IEnumerable<T> and returning AggregateInterval structs:

IEnumerable<AggregateInterval> aggregation =
	from grouping in intervals.AsEnumerable()
	select new AggregateInterval
	{
		CaseId = grouping.First().CaseId,
		TotalHours = grouping.Sum(t => t.EndDate.Subtract(t.StartDate).TotalHours)
	};
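
As an aside, the grouping key already is the case ID, so grouping.Key works just as well as grouping.First().CaseId and avoids walking the group an extra time. An equivalent sketch:

var aggregationByKey =
	from grouping in intervals.AsEnumerable()
	select new AggregateInterval
	{
		CaseId = grouping.Key,
		TotalHours = grouping.Sum(t => (t.EndDate - t.StartDate).TotalHours)
	};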

Neither query has fired yet, by the way. That's another cool thing about LINQ: deferred execution. Both queries fire when I output the data, which is trivially simple:

foreach (var item in aggregation)
{
	Console.WriteLine("Case {0} = {1:0.00}", item.CaseId, item.TotalHours);
}
Console.ReadLine();
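
(If you'd rather not wait for the foreach, a ToList() call runs the whole pipeline, SQL query included, right then. Illustrative only:)

// Forces immediate execution of both queries instead of deferring to the foreach:
List<AggregateInterval> results = aggregation.ToList();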

That's it. Share and enjoy.
