PowerPoint Linking to Excel Issue

When updating a link (right-click, Update Link) or editing a linked object (double-clicking the object) in PowerPoint, the user receives a dialog box (shown below) stating “The linked file was unavailable and can’t be updated”. Here is a document I wrote that describes how to fix this issue. We encountered this issue in Excel 2007 SP2.

New OLEDB Provider for Reading Excel 2007

I work for a financial company that uses a lot of Excel. Many of the business users here practically live in it. So we are constantly trying to figure out how to leverage Excel in our applications.

Do we just export data to Excel? If so, is it a snapshot/copy of the data, or do we build a connection to the backend data? What about importing data? Where is the boundary between using VBA and VSTO? Then if we pile SharePoint and Excel Services on this heap, it starts to get really interesting.

One of our technical frustrations has been that the OLEDB driver for reading Excel on the server was fairly lame. It made a lot of assumptions about the data that made it nearly unusable except in the simplest cases. Last week I found this updated provider for Excel 2007 and I am looking forward to giving it a deeper look. What I can say is that it did read in all my data rather easily. I just have not had time to play around the fringes much.

Download details: 2007 Office System Driver: Data Connectivity Components
http://www.microsoft.com/downloads/details.aspx?FamilyID=7554F536-8C28-4598-9B72-EF94E038C891&displaylang=en
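To make the idea concrete, here is a minimal sketch of how you would point the new driver at a workbook. The file path and sheet name are hypothetical; the provider name (Microsoft.ACE.OLEDB.12.0) and the Extended Properties syntax are what the 2007 connectivity components install, with HDR=YES telling the driver the first row contains column names.

```csharp
using System;

public class ExcelConnectionDemo
{
    // Builds an OLEDB connection string for the 2007 ACE provider.
    // "Excel 12.0 Xml" targets .xlsx files; HDR=YES treats row 1 as headers.
    public static string BuildExcelConnectionString(string pFilePath)
    {
        return "Provider=Microsoft.ACE.OLEDB.12.0;" +
               "Data Source=" + pFilePath + ";" +
               "Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";
    }

    public static void Main()
    {
        var aCnStr = BuildExcelConnectionString(@"C:\Data\Book1.xlsx");
        Console.WriteLine(aCnStr);

        // With the connectivity components installed you would then open an
        // OleDbConnection with this string and SELECT * FROM [Sheet1$].
    }
}
```

From there it behaves like any other OLEDB source: hand the string to an OleDbConnection and query the sheet as a table.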

UI Testing

I spent a bunch of time in the early Windows days trying to do UI testing the way this blog entry (UI Test Automation Tools are Snake Oil) talks about. Like him (or his clients), we used some really expensive tools and ended up not doing a very good job. I really like the thinking Michael is doing here. This is definitely where my head is at. The problem is that I am struggling with creating MVC-style applications.

Where I work now we just don’t build big applications. Instead we have lots of small applications that we deliver in weeks, not months or years. I have not found this pattern of doing software very conducive to building applications with lots of design. Now hold on a minute – that does not mean we don’t do design. We just don’t do lots of design. When an application is very small, how much design do you really need to do? Most of the applications tend to look like each other – read some data…munge it together…display it. We don’t do much data entry, which is an exception to the pattern of apps I have built over the years.

That is not to say we don’t have some big-ish applications. We do. Just that they are the exception. Could they do with more engineering? Absolutely! But we just don’t have the infrastructure (staff, mindshare, experience, etc) to do it that way. Of course there are people doing a high level of engineering here. It’s just that it’s not everyone – it’s not our default.

At first this was a hard pill to swallow (and it still makes me a little gassy at times). But it’s the nature/culture of the way we do things. It’s a model that works, but not in a scalable way. Sharing anything in this model is very hard – maybe I will blog later about how we do that.

Old Problem – New Context

[Here is an old post I found that I had never published. I think it's because I could not get the formatting of the code to look "right". This time I published from Word and the formatting was still not preserved - rats! Also, the article is not as complete as I would like it but I wanted to get the code into the blogosphere.]

[There are a few technical things wrong with this post. The first is that using Thread Local Storage is risky (at best) in an ASP.NET application. See http://www.hanselman.com/blog/CommentView.aspx?guid=320. The second is that I would prefer to use a different pattern than the one below; something a little more elegant. Lastly, after using LINQ much more, I don't know that this code is even a best practice; I need to noodle on this more.]

I keep thinking that there are like a dozen or so abstract problems that I solve and everything else is just another derivation from those. This is one of those dozen problems – what is context and why do we need it so often?

We have been using a lot of LINQ lately. While using LINQ to SQL I ran into a problem a few weeks ago and created a little hack to work around it. Now that we are starting to roll LINQ out to the rest of the developers, we need more than a hack for this issue. What is the issue? Well, it’s the same one we have had in the last four ADO stacks – how do I reuse a connection when I have a transaction open? The only times I have not had to work around this issue were in MTS/MTX and when we used the Microsoft Enterprise Library (EntLib).

The way MTX and EntLib get around this is that they are a “layer” in between me (you) and the underlying infrastructure. I remember Bernie’s old saying – “Interesting things can happen when you can get in between things”. Another way of saying this: layers and interception are a key part of any design.


So here is what I want the client/consuming code to look like.


const string CN = "Server=localhost;Database=AdventureWorks2008;Trusted_Connection=true";

using (var aTrx = new TransactionScope())
{
    var aCtx = new AdventureWorksDataContext(CN);

    var x = from p in aCtx.Persons
            where p.FirstName.StartsWith("A")
            select p;

    var y = x.First();

    var aCtx2 = new AdventureWorksDataContext(CN);

    var x2 = from p in aCtx2.Persons
             where p.FirstName.StartsWith("A")
             select p;

    var y2 = x2.First();

    aTrx.Complete(); // commit; without this the scope rolls back on Dispose
}

So I wrote a little class that caches connections so that they can be reused across LINQ statements. If we don’t reuse the connection, opening a second one inside the TransactionScope escalates the transaction to the Distributed Transaction Coordinator (DTC). We only want the cached connection when the statements need to be in the same transaction.


public static class TransactionalDataContext
{
    [ThreadStatic]
    private static Hashtable mConnections;

    public static IDbConnection DataContext(string pCnStr)
    {
        var aTrx = Transaction.Current;

        if (aTrx == null)
        {
            return CreateConnection(pCnStr);
        }

        if (mConnections == null)
        {
            mConnections = new Hashtable();
        }

        if (mConnections.ContainsKey(pCnStr))
        {
            return (IDbConnection)mConnections[pCnStr];
        }

        var aCn = CreateConnection(pCnStr);
        mConnections[pCnStr] = aCn;
        aTrx.TransactionCompleted += TransactionCompleted;

        return aCn;
    }

    private static IDbConnection CreateConnection(string pCnStr)
    {
        // cn str is name/value pairs with '=' and ';' separating each pair

        // get each of the pairs
        var aCnStrPairs = pCnStr.Split(new[] { ';' });

        // find the pair whose name is 'Provider' (compare the name, not the whole pair)
        var aProvPairEnum = aCnStrPairs.Where(
            i => i.Trim().StartsWith("Provider=", StringComparison.OrdinalIgnoreCase));

        IDbConnection aCn;

        if (!aProvPairEnum.Any())
        {
            // no provider specified so assume MS SQL Server
            aCn = new SqlConnection(pCnStr);
        }
        else
        {
            // get the provider name
            var aProvPair = aProvPairEnum.First();
            var aProvNvp = aProvPair.Split(new[] { '=' });

            // create a connection through the provider factory
            var aFactory = System.Data.Common.DbProviderFactories.GetFactory(aProvNvp[1]);
            aCn = aFactory.CreateConnection();
            aCn.ConnectionString = pCnStr;
        }

        // we open the cn so that LINQ does not - this also signals to LINQ not to close the cn
        aCn.Open();

        return aCn;
    }

    // handler for when the trx is complete
    private static void TransactionCompleted(object sender, TransactionEventArgs e)
    {
        try
        {
            // dispose of each connection
            foreach (DictionaryEntry aEntry in mConnections)
            {
                try
                {
                    ((IDbConnection)aEntry.Value).Dispose();
                }
                catch { }
            }
        }
        finally
        {
            // reset the collection, in case another transaction runs on this thread
            mConnections.Clear();
        }
    }
}
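The ADO.NET plumbing above can obscure the underlying idea, so here is the caching pattern in miniature, with the transaction and connection pieces stripped out. This is a sketch of my own, not part of the class above: a [ThreadStatic] table hands back the same object for the same key on the same thread, and Reset() plays the role that TransactionCompleted plays above.

```csharp
using System;
using System.Collections;

public static class PerThreadCache
{
    [ThreadStatic]
    private static Hashtable mItems; // one table per thread, created lazily

    public static object Get(string pKey, Func<object> pCreateFunc)
    {
        if (mItems == null)
        {
            mItems = new Hashtable();
        }

        // first request for a key creates the item; later requests reuse it
        if (!mItems.ContainsKey(pKey))
        {
            mItems[pKey] = pCreateFunc();
        }

        return mItems[pKey];
    }

    public static void Reset()
    {
        // mirrors the cleanup in TransactionCompleted above
        if (mItems != null)
        {
            mItems.Clear();
        }
    }
}
```

Two calls to Get with the same key return the same instance; after Reset the next call creates a fresh one. That is exactly the behavior that keeps a second LINQ statement on the same connection instead of escalating to the DTC.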


Brick Wall – Bang Head

I had 17 (the number is not important, except that it is more than one) Excel files I needed to get into a database, and since we happen to use SQL Server, I thought of SSIS. I was going to leave my trusty C# hammer in the tool bag for a more specialized tool. I was confident that even though I had not used SSIS for much (mostly trivial imports from SSMS), I could get something running pretty quickly.

Well I could not have been more wrong.

I spent the entire day working through a series of SSIS issues specific to the problem I was trying to solve. The final issue was that a couple of the cells in Excel have more than 255 chars of data; what a nightmare trying to get the Excel driver to read more. And now that it does, the data comes through as an NTEXT data type, which is practically useless when I really want a String. What a mess! Not to mention that I enlisted (wasted?) some help from other people (at least 4) who are much more knowledgeable on this technology than I am.
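For anyone who hits the same 255-character wall: the workarounds I know of involve the driver's type-guessing behavior. Treat the exact registry path below as an assumption from memory for the 2007 ACE driver, and verify it on your machine before relying on it.

    ; The Excel driver guesses each column's type by sampling the first 8 rows.
    ; Setting TypeGuessRows to 0 makes it scan the whole sheet, so long cells
    ; further down are detected and the column comes through as memo/NTEXT.
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\12.0\Access Connectivity Engine\Engines\Excel
        TypeGuessRows = 0 (DWORD)

    ; IMEX=1 in the connection string's Extended Properties tells the driver
    ; to treat mixed-type columns as text rather than guessing.
    Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1"

None of this changes the fact that long text arrives as NTEXT; it just makes the driver actually read it.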

The question I am asking myself is not whether to use a different tool (my trusty C# knife), but when I should have “cut bait and run”.

Every time I ran into an issue yesterday, it felt like I was getting closer and closer to being done. The problem is that it now feels like there was an infinite distance to travel; so closer is relative – I was never going to be done. At what point should I have realized this? Is one day too long to have been doing this? Is this just a case of arrogance and/or stubbornness?

Just one of those things that makes me say Hmmmm.

It’s hard to explain this type of situation to people who don’t do what I (we) do for a living. I wonder how much time is spent / wasted doing just this sort of thing: using the wrong tool for the job. How many people just keep banging a bent nail? Sure it works, but it’s so brittle you can never change it. I guess I could have just dropped a script component on the SSIS design surface and written the entire thing in C#. Too bad the debugger does not work. I am just too old to debug something using MessageBox.Show(). Geez! I am so anal about debuggers that a friend and I (he did most of the heavy lifting) wrote our own debugger for a Basic compiler we were using back in the early 90s!

I think I gave SSIS a fair chance. Now it’s time to get this thing done.

Pattern for Extension Methods?

The more extension methods I see and write, the more I am starting to see a loose pattern forming. If I write an extension method for IQueryable, then why not write something similar for IEnumerable? Sure, the implementation is different, but from the outside they both feel similar. Take for instance Distinct; why only have that on IQueryable? I ended up writing my own, which was not a big deal, but by the time I finished all the tests and a quick peer review I had spent almost four hours. Not to mention the email & SharePoint entry to everyone else in the org announcing a new release of our extensions DLL. Not bad, but just distracting. Here is the code for anyone that is interested. Sorry about the formatting; this editor is pretty lame for code.

/// <summary>
/// Takes a List of type T and a function as the key selector
/// and returns a unique list of T.
/// </summary>
/// <typeparam name="K">A class instance that uniquely identifies the current item. Strings work well.</typeparam>
/// <typeparam name="T">The type of items in the target and return lists.</typeparam>
/// <param name="pList">The input list. Not changed by this method.</param>
/// <param name="pGetUniqueKeyFunc">The function that returns a KEY that uniquely identifies the current item.</param>
/// <returns>A list of T where every item is unique by KEY.</returns>
/// <example>
/// List&lt;MyClass&gt; aDistinctList = aTargetList.Distinct(i => i.Id);
/// </example>
/// <remarks>
/// Worth noting that there is a Distinct method on IQueryable&lt;T&gt;, but it does not work
/// on List&lt;T&gt;. This is an entirely different implementation from that method.
/// The order of the original list is maintained; only duplicates are removed.
/// </remarks>

public static List<T> Distinct<K, T>(this List<T> pList, Func<T, K> pGetUniqueKeyFunc)
{
    if (pGetUniqueKeyFunc == null)
        throw new ArgumentNullException("pGetUniqueKeyFunc");

    if (pList == null || pList.Count == 0)
        return new List<T>(0);

    var aDistinctKeyHash = new Hashtable(); // the set of unique keys - used for lookup
    var aDistinctItemList = new List<T>();

    // go thru each item in the source list and get the key for that item from the delegate;
    // if the key does not exist in the hashtable then add it to the hashtable and the return list
    // rinse - repeat...
    foreach (var aItem in pList)
    {
        var aKey = pGetUniqueKeyFunc.Invoke(aItem);

        if (!aDistinctKeyHash.ContainsKey(aKey))
        {
            aDistinctKeyHash.Add(aKey, aItem);
            aDistinctItemList.Add(aItem);
        }
    }

    return aDistinctItemList;
}

eReading

Yes I have a couple of free minutes today. Just clearing some stuff out of my Inbox that has accumulated over the last 5 weeks of pounding out a cool POC.

A bunch of people here at work have bought a Kindle. A good friend uses his for both technical and personal reading. He was about to buy a v1 Kindle and opted to wait for v2. So far he is still on board. It gets me wondering if I would really use one or whether I just like the cool factor.

Honestly, I am not feeling like I would use it very much and it would be relegated to the long list of toys I end up giving away. Why? Because there are very few technical books I actually read cover to cover. I do much more browsing, skipping and flipping all of which just don’t work in the e-mediums. I still like the feel of books. I like to highlight and write in the margins. I like to flip through a book I have not picked up in a while and find something new or something I marked up previously.

Take Safari. Every programmer here at work has a Safari membership. Nice. But I just don’t use it that much for actually reading books. In fact, when I talked to the Safari folks originally, their main use case was not eReading but rather finding answers to questions or topics.

My other use case for Safari is what I used to use Borders and Barnes & Noble for: to check a book out before I buy it. This is sweet. First to Safari, then to a discount book seller or Amazon to buy it used (love this).

Back-ing into Enterprise Web Applications

Sometimes I just don’t know what to blog about. Is it just me? So when I don’t have anything I feel is new or original, I guess I need to go back to something old. I know I have blogged about building applications in the browser before. Maybe not on this blog, but certainly on one of the many outlets I have had over the years. This topic has come up again as I consider the implications of technologies like Silverlight 2.0 and AJAX on building enterprise applications.

I really like the browser, but I am not sure it makes a great application platform. It makes me wonder why it seems like we need all these “other” technologies to make it a better platform. Take for instance the Back button. When browsing I am a big fan of the Back button (actually I am still a keyboard kind of guy and I use the backspace key – old habits die hard). Whether you like it or not, I think you will agree it is a standard that a lot of people now understand.

So when I host an application in the browser, what does the Back button do? Or perhaps a better question is what does the user expect the Back button to do? Why would they think it would do anything different?

“Hey, I’m in the browser. There’s this big, prominent button with ‘Back’ written on it. Why would I not want to press that?”

As it turns out, technologies like Silverlight 2.0 and AJAX seem to totally ignore this. They ignore it to the extent that you lose a lot of navigation context if you use them. For instance, say I am traversing down through a hierarchy. I get some number of levels deep and I want to back up to a previous level; hit the Back button? Better not, if that traversal was being done using AJAX!

By the way, this is not an entirely new problem; I first encountered it when an application I was reviewing for a client made such extensive use of session state that the Back button did not do what the users expected. So we spent a fair amount of time and energy coming up with ways to preserve the Back button functionality. First, we found some hack that essentially disabled the Back button (ironically, by navigating forward). Second, we built our screens so that they used the query string to pass some basic context from one screen to the next; when you navigated back, the browser would pass all the context back (since all it typically saves is the URL). An additional benefit we got from using the query string was that nearly every page was bookmark-able. The only exception was edit mode; we did not try to “save” transitive or volatile states.
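The query-string approach above can be sketched in a few lines. This is illustrative only; the page name and context keys are hypothetical, and a real ASP.NET app would build these links in its navigation helpers.

```csharp
using System;
using System.Collections.Generic;

public class QueryStringContext
{
    // Serializes screen context into the URL so that Back and bookmarks
    // recover it for free: the browser saves the URL, and the URL carries
    // everything the screen needs to rebuild itself.
    public static string BuildUrl(string pPage, IDictionary<string, string> pContext)
    {
        var aPairs = new List<string>();

        foreach (var aEntry in pContext)
        {
            // escape both halves so values with '&', '=' or spaces survive
            aPairs.Add(Uri.EscapeDataString(aEntry.Key) + "=" +
                       Uri.EscapeDataString(aEntry.Value));
        }

        return pPage + "?" + string.Join("&", aPairs.ToArray());
    }
}
```

Navigating "back" to such a URL hands the page its full context; nothing volatile lives in session, which is exactly why these pages stay bookmark-able.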


Couple of side notes…

It seems that the Back button works much better in ASP.NET even though the context is not stored in the query string. I have not taken the time to figure out the exact details of how this works, but I imagine it has something to do with the browser caching pages and VIEWSTATE. Note to self: go figure this out someday.

I found this interesting article about a Java framework that handles the Back / Bookmark problem introduced by AJAX. I have to imagine that some resourceful person has done something similar on CodePlex (or the like).

There is a time reporting application (to remain anonymous) we use at my current job that makes fairly good use of AJAX, in the sense that it does improve the response time and interactivity of the application. The downside is that I continuously get burned by using the Back button. Argg!!

By contrast, the MSDN library site does a pretty good job of mixing the two paradigms such that when I hit Back, the “right” thing happens. Sure, neither of these is an enterprise application (in fact comparing them to one another is hardly fair) but the problem remains.

In my mind I am questioning why we are still using the browser for enterprise applications at all. I know why, but I don’t know why. I mean, the reason it’s so attractive is that it helps in two major areas: distribution and disaster recovery. Distribution is the whole thing that got us here in the first place. The DR aspect is something I have been realizing more and more in this post-Sept-11 world, where DR is something I have to consider more and more. It’s just easy to deploy the web application to two physical sites (managed by some content manager), one the primary and the other the secondary. Given a large enough pipe you can load balance between the two and have a scalable/HA solution to boot.

But why do they have to run in the browser? Why, why, why? Here is what I propose instead…

  1. Build an application as a native Windows .NET application based on a 3-tier architecture. No HTML but some technology like WinForms or WPF.
  2. For the middle tier I would like to use something like the Entity Framework and LINQ. I have a question as to whether I am going to have marshalling/serialization issues with this technology.
  3. The applications are deployed using something like ClickOnce deployment.
  4. In addition, I want to get rid of the requirement to have a specific .NET framework installed. I like the direction the Xenocode people are taking. Basically linking the .NET Framework into your application thereby bringing all the code and the dependencies down as part of download / install.

Just thought I would put some thoughts out here and see if anyone is actually listening. Hopefully you were able to hang on until this point. Regardless, it’s a place for me to vent; kind of like my diary.

When the Sh!t Hits the Fan

I am trying to get away for a two week vacation; which is in and of itself a challenge. You know the drill. In the weeks leading up to a longer vacation there is this tidal wave of work that needs to get done. Whether you are actually wrapping up work, handing it off or getting it to a state where it can wait until you get back. I am doing all three, as is usually the case. So by the time I leave on vacation I feel like I actually need one. Badly!

This week has been one of those weeks where I have been doing all of the above, and on top of it our source control system got hosed. I had a friend tell me that Mercury is in retrograde, and therefore she was not surprised that this happened when it did. Not sure I understand, but I am sure feeling the pain. I think Murphy and Mercury are definitely related. So Tuesday I made the call to switch from our current SCC system to a new one: Team Foundation Server.

Only problem is that the new one has way more features than the old one. So not only do we have a technical switch over we need to do some quick and minimal process modeling to get the system up and running. We took the small company approach to this and just locked a small group of the more senior developers in a room and told them to bang this out together. Things are looking pretty good except that we are waiting for a license to unlock the crippled version of the software. So close.

Ahh, I can almost feel the sand between my toes…