Shawn Wildermuth

Author, Teacher, and Filmmaker
.NET Foundation Board Member


Tagged with DataSets

Typed DataSets and App.Config...A Cautionary Tale

In this assembly, the designer created an app.config and a Settings.settings object.  All sounded good.  So in my ASP.NET 2.0 project, I set up the connection string in the web.config and called it "MyConnection".  This all worked until I deployed it to a server, when all hell broke loose.  After deployment, my code that did *not* use Typed DataSets (mostly DataSources) worked fine with my new "MyConnection" connection string...but...

Everywhere I used the Typed DataSets, they were failing to connect to the database.  When I looked into it, it seemed that the Typed DataSets were using the connection string from my dev box...but there was no app.config to be seen.  How were they getting that bad connection string?  Well, it seems that the connection string information is embedded in the assembly as the "default" connection string, to be used if the connection string can't be found in the configuration.  Ok, this is bad...I'd hate for my assembly to actually have stuff like my password embedded in it, but I doubt that happens.  I was using integrated security, so I haven't tested the password embedding yet.

But what is strange is that the connection string *was* in the web.config.  What gives?  Well, the Typed DataSet's generated code looks up my connection string with namespace and settings prefixes.  "MyConnection" became "MyApp.Properties.Settings.MyConnection" because the serialization of the name includes all of that.  Yeech...
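So the fix is to register the connection string under the fully prefixed name the generated code actually looks for. A minimal web.config sketch (the server and database names here are placeholders; "MyApp" stands in for your project's default namespace):

```xml
<configuration>
  <connectionStrings>
    <!-- The name your DataSources and hand-written code use -->
    <add name="MyConnection"
         connectionString="Data Source=prodserver;Initial Catalog=MyDb;Integrated Security=True" />
    <!-- The prefixed name the Typed DataSet's generated settings code looks for -->
    <add name="MyApp.Properties.Settings.MyConnection"
         connectionString="Data Source=prodserver;Initial Catalog=MyDb;Integrated Security=True" />
  </connectionStrings>
</configuration>
```

With the prefixed entry in place, the Typed DataSets stop falling back to the embedded default connection string.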


Upgrading Typed DataSets in Visual Studio 2005



TableAdapters, DataAdapters and Migrated Typed DataSets

In Visual Studio 2005, when you create a Typed DataSet, it automatically creates TableAdapters for you.  These are interesting objects that use a DataAdapter internally to make a more cohesive data access layer.  They will certainly help RAD developers get started.  I am not so sure how they will work long-term, though. 

One of the more interesting things that these new Typed DataSets do is store a link to connection information in the Typed DataSet.  This seems to be what the TableAdapters use to open their connections.  The problem is that if you migrate a Typed DataSet from 1.1, there is no way to insert this connection information.  Even if you could, the designer in Beta 2 doesn't allow you to attach TableAdapters to your existing Typed DataSet.  This means that if you want to use TableAdapters, you will need to re-do your Typed DataSets entirely.

I just finished an article about some of these migration issues.  I'll post a link when it gets published.


Well Thought Out Blog on Typed DataSets


Not sure how I missed this before.  I was very impressed by this discussion of the issues around Typed DataSets.  Yeah, sure, he agrees with most of my opinions...but I like hearing that as well as dissenting opinions.


DataSet.ClearBeforeFill? (UPDATED!)

In previous builds, the DataSet had a property that said whether it should be cleared whenever it is filled by a DataAdapter.  It seems to be missing in the latest builds.  I actually prefer having it, because the nature of DataSets (and often overlooked) is that successive DataAdapter.Fill calls allow a DataSet to grow incrementally.  New rows will be added, and existing rows will be updated (unless a row is dirty, in which case you would get an exception). 
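Absent such a property, the workaround is to clear the table (or DataSet) yourself before re-filling. A quick sketch of the incremental behavior (connection string and query are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

DataSet ds = new DataSet();
SqlDataAdapter da = new SqlDataAdapter(
  "SELECT CustomerID, CompanyName FROM Customers",
  "Data Source=.;Initial Catalog=Northwind;Integrated Security=True");

// Pull key information so Fill can match existing rows
da.MissingSchemaAction = MissingSchemaAction.AddWithKey;

da.Fill(ds, "Customers");  // first fill
da.Fill(ds, "Customers");  // second fill *merges*: existing rows are
                           // updated in place, new rows are appended

ds.Tables["Customers"].Clear();  // clear first if you want a fresh copy
da.Fill(ds, "Customers");
```

Note that the merge-on-refill behavior depends on primary key information being present; without it, successive fills simply append duplicate rows.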

Maybe my ranting to Steve Lasker paid off in a small way. 

The original Ladybug issue on this is here, if you have an opinion and want to vote to have it changed.  I've reopened it to figure out what they've done with it.


DataSets vs. Custom Entities Again...


I haven't had time to look at this new round of discussions about where DataSets fit into the data world.  I am still reformulating my ideas around DataSets after meeting with Microsoft and being told that they did not want to encourage use of DataSets in place of business objects. 

I have also been using CSLA.NET at a client, and it has some good ideas about entity mapping in general, though it has a number of well-documented downsides as well.  As Rocky would probably tell you, CSLA has its place in some architectures but was not meant to fit into all solutions.


The Death of Inherited Typed DataSets?

I've spent most of the last week in Redmond, seeing some new stuff and meeting up with old friends.  While I was there, I scheduled some time to sit down with Steve Lasker of the Visual Basic/Visual Studio team.  His team is in charge of the Typed DataSet in Whidbey. 

I met with him to discuss the inheritability of Typed DataSets for business logic.  If you've been reading much of what I have written in the last three years, you probably know how long I've advocated using Typed DataSets as a replacement for Business Objects.  Alas, Steve has finally convinced me that they never intended Typed DataSets to be inherited. 

In our discussion, he suggested that using Typed DataSets and registering for events to do simple business logic is where they envisioned the limit of that use.  They seem to be convinced that the DataSet can be too heavy for certain situations.  They've spent a lot of time in the Whidbey bits making object binding work much better than in the 1.x Framework. 
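The event-based approach Steve describes looks roughly like this: a sketch using a plain DataTable (the table, column, and rule here are made up for illustration):

```csharp
using System;
using System.Data;

DataTable customers = new DataTable("Customers");
customers.Columns.Add("CustomerID", typeof(int));
customers.Columns.Add("CreditLimit", typeof(decimal));

// Register for events to enforce simple business logic,
// rather than inheriting from the Typed DataSet
customers.ColumnChanging += delegate(object sender, DataColumnChangeEventArgs e)
{
  if (e.Column.ColumnName == "CreditLimit" && (decimal)e.ProposedValue < 0m)
    throw new ArgumentException("Credit limit cannot be negative");
};
```

The same ColumnChanging, RowChanging, and RowDeleting events are exposed by the generated table classes in a Typed DataSet, so the validation can live alongside the data without subclassing.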


Raise your hand if you know what DataAdapter.TableMappings is for...


I seem to be getting a lot of questions and reviewing a lot of code that isn't using TableMappings, and I wonder why.  For example, I see this occasionally:

DataSet ds = new DataSet();
SqlDataAdapter da = new SqlDataAdapter();
// Set up the adapter's SelectCommand here
da.Fill(ds);

// Rename the default "Table" after the fact
ds.Tables["Table"].TableName = "Customers";

The problem here is that you don't even need TableMappings for this; you could call the Fill like so:
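The snippet that originally followed is missing here; the overload being referred to is almost certainly Fill(DataSet, string), which names the result table directly. TableMappings does the same thing explicitly:

```csharp
// Option 1: the Fill overload names the table for you
da.Fill(ds, "Customers");

// Option 2: map the default "Table" name via TableMappings,
// so every plain Fill lands in "Customers"
da.TableMappings.Add("Table", "Customers");
da.Fill(ds);
```

Either way, the table comes out of the fill already named, with no after-the-fact rename required.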


Rocky and I Agree on n-Tier Development



Is Data Access Really This Hard?

I've been spending some time lately reviewing how companies are doing data access in .NET.  When I look at how most of them have crufted up solutions, I am amazed.  The model that Microsoft supports seems so obvious to me, but then I am neck-deep in it.  I'd like to hear from my readers about their specific experiences creating data access in .NET, with an eye to: why or why not use COM+ for transactions; Typed DataSets or DataReaders; Business Objects or Messages.  I am trying to understand where the community is.

Thanks in advance...


DataAdapters and Component Surfaces (or why I love using the toolset)

I always forget to blog this, but when I am doing a database project using Typed DataSets, I almost always use a Component Surface to build my DataAdapters interactively.  For example:

Here I add a component to my project:


Rocky Lhotka on DataSets and Web Services and why I think he's wrong...


After reading Rocky's blog about DataSets and Web Services, I am afraid that he is falling into the same trap that others have (including the eminently qualified Tim Ewald) with respect to DataSets.  DataSets work well in Web Services, but not by default.  As I mentioned in:


DataSet Updater Helper Method

For some time now I've been pushing the idea of doing DataSet updates using DataAdapters, with a 1-to-1 relationship between each DataAdapter and a logical data element (usually a table or stored procedure).  This is especially true when you are dealing with related tables in DataSets (the sweet spot for DataSets, IMHO).  I've continually forgotten to post this code that I use to do these updates.  The idea of this code is for the user to provide arrays of tables and DataAdapters that imply the order of the updates.  For example:

// Array of DataTables from a Typed DataSet (top-down order;
// the table and adapter names here are illustrative)
DataTable[] updateTables = new DataTable[] { ds.Customers, ds.Orders };
// Array of DataAdapters, one per table, in the same order
SqlDataAdapter[] updateAdapters = new SqlDataAdapter[] { customersAdapter, ordersAdapter };
// Call the Update Method
UpdateDataSet(updateTables, updateAdapters);

This implies the order so that the helper function can do the right thing, which is to delete bottom-up and insert/update top-down:

// Enforces that updates will be written in the right order:
// deletes bottom-up, inserts/updates top-down.
internal static void UpdateDataSet(DataTable[] tables, SqlDataAdapter[] adapters)
{
  // Validate the input
  if (tables.Length == 0 || adapters.Length == 0)
    throw new ArgumentException("You must send at least one table and adapter");
  if (tables.Length != adapters.Length)
    throw new ArgumentException("The number of tables and adapters must be identical");

  // Disable constraints until the end of the process
  tables[0].DataSet.EnforceConstraints = false;

  using (SqlConnection conn = DataFactory.GetConnection() as SqlConnection)
  {
    SqlTransaction tx = null;
    // Try and update the DataSet within a transaction
    try
    {
      // Open the connection
      conn.Open();
      // Start a transaction
      tx = conn.BeginTransaction();

      // Set the upper and lower bounds
      int min = tables.GetLowerBound(0);
      int max = tables.GetUpperBound(0);

      // Go through all the tables, and write the deletes (in reverse order)
      for (int x = max; x >= min; --x)
      {
        DataRow[] updatingRows = tables[x].Select("", "", DataViewRowState.Deleted);
        if (updatingRows != null && updatingRows.Length > 0)
        {
          adapters[x].DeleteCommand.Connection = conn;
          adapters[x].DeleteCommand.Transaction = tx;
          adapters[x].Update(updatingRows);
        }
      }

      // Go through all tables and write the inserts/updates (in forward order)
      for (int x = min; x <= max; ++x)
      {
        DataRow[] updatingRows = tables[x].Select("", "", DataViewRowState.Added | DataViewRowState.ModifiedCurrent);
        if (updatingRows != null && updatingRows.Length > 0)
        {
          adapters[x].InsertCommand.Connection = conn;
          adapters[x].InsertCommand.Transaction = tx;
          adapters[x].UpdateCommand.Connection = conn;
          adapters[x].UpdateCommand.Transaction = tx;
          adapters[x].Update(updatingRows);
        }
      }

      // Commit the transaction
      tx.Commit();

      // Mark all the changes as accepted
      for (int x = min; x <= max; ++x)
        tables[x].AcceptChanges();
    }
    catch (Exception ex)
    {
      if (tx != null) tx.Rollback();
      throw new ApplicationException("Failed to Update the database", ex);
    }
    finally
    {
      if (conn.State == ConnectionState.Open) conn.Close();
      if (tx != null) tx.Dispose();
      // Re-enable constraints now that the process is complete
      tables[0].DataSet.EnforceConstraints = true;
    }
  }
}

This will eventually make it into the PowerToys project, but I haven't had time to refactor it yet. HTH


Data Part 2: n-Tier...Gone Tomorrow

Recently I was talking with Rocky Lhotka and he said something interesting:

"Just when we got good at Client-Server, they switched things and had us doing n-Tier applications.  Just when we got good at n-Tier development, internet applications took off."

In my opinion, he is right.  It is interesting because client-server and n-Tier applications still exist, especially in enterprise development.  I think we're good at client-server and n-Tier by now.  The problem is that I think much of browser-based development attempts to apply n-Tier development directly. 


Data Part 1: Business Objects, Messages and DataSets...

I've had time to think about the nature of data in development lately.  I've been talking with Rocky Lhotka and Michael Earls (as well as a number of others) about the issues with dealing with data in applications. 

The first camp is all about writing Business Objects.  In this camp, you write classes that encapsulate the data, the data access, and the business rules about that data.  This camp has been the way to do it for years now.  It proliferated in the Client-Server and n-Tier architecture eras. 

Rocky Lhotka espouses his excellent CSLA.NET framework.  If you are going down the business object road, I wholeheartedly recommend it.  It is designed around allowing objects to exist locally or on separate servers through remoting.


Getting My Hands on the 64 bit Framework

I got to play with an Itanium 2 box at the PDC today. Instead of following their script, I did what I've wanted to do for months...create a huge DataSet. They had an interesting setup: you used a Pentium 4 box to develop code and then Terminal Service'd into a sixteen-way Itanium 2 machine to run it. The 64-bit JIT compiles the IL to 64-bit code from the same assembly that the 32-bit JIT used to create the 32-bit code.

I saw some interesting results:

With some encouragement from the Microsoft staff, we tested 4, 8, and 16 gig DataSets. They worked fine. There was a small problem with some internal issues with multiple threads and the DataSet, but that's to be expected. The 64-bit CLR is still pretty early on.


Things About Typed DataSet Generation I Never Noticed...

I have been thinking a lot about how Typed DataSets are generated, and was spelunking through the code again when it got me thinking. The Typed DataSet generator doesn't really generate the code based on the .xsd, but on the DataSet. It simply loads the .xsd into a DataSet, then interrogates the DataSet directly for everything (tables, columns, relationships, constraints).

So if the Typed DataSet Designer cannot handle something (like relationships *without* constraints, see below), but the DataSet schema allows it...simply create the DataSet in code and save the .xsd file to see what it produces! This gets around some fundamental problems with the designer. It does require that you start reading and understanding .xsd, but that is a useful skill to have anyway...right?

So my first revelation was how to add unconstrained relationships (no foreign key constraint, simply a way to navigate the data).  Since the designer does not allow this, I looked at the .xsd and found that the DataSet handles this with a schema annotation:

      <msdata:Relationship name="ta2t"
        msdata:parent="titles"
        msdata:child="titleauthor"
        msdata:parentkey="title_id"
        msdata:childkey="title_id" />

The five pieces of data in the msdata:Relationship element are the name plus the four pieces of data required when setting up a relationship (parent table, parent key, child table, child key). Pretty simple, huh!
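The create-the-DataSet-and-save-the-.xsd trick looks roughly like this (the table and column names are borrowed from the pubs sample database; treat them as illustrative):

```csharp
using System.Data;

DataSet ds = new DataSet("MyDataSet");
DataTable titles = ds.Tables.Add("titles");
titles.Columns.Add("title_id", typeof(string));
DataTable titleauthor = ds.Tables.Add("titleauthor");
titleauthor.Columns.Add("title_id", typeof(string));

// The last argument (createConstraints = false) makes this an
// unconstrained relationship: navigation only, no foreign key
ds.Relations.Add(new DataRelation("ta2t",
  titles.Columns["title_id"],
  titleauthor.Columns["title_id"],
  false));

// Save the schema, then feed the .xsd to the Typed DataSet generator
ds.WriteXmlSchema("MyDataSet.xsd");
```

The emitted .xsd will contain exactly the msdata:Relationship annotation shown above, which the generator happily consumes even though the designer cannot produce it.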