Tagged with SQL Server
As many of you may have heard, I recently launched http://GiveAQuiz.com, a new web site for creating and taking quizzes. The Data Team at Microsoft was great in helping me build the site, and I've written a whitepaper detailing how we used the Microsoft data stack to accomplish it. The whitepaper shows how we used these data technologies to build the site:
As many of you who've been following know, I use a simple database of Xbox game data as my example database. I usually remember to include it in the server project, but in some examples it's been forgotten. In other cases I've shipped a 2008 version of the database instead of the 2005 version. To address this, I've uploaded zipped versions of both the 2005 and 2008 databases for anyone to use for any reason. This includes my RIA Services sample (which uses a SQL Server 2008 version) and my MVVM MSDN article sample (which also uses a 2008 version). Each download includes an MDF and an LDF. If you have trouble attaching them to an existing server, try deleting the LDF file. Go grab them here:
Any questions/problems, post a comment!
It's been an exciting day here at the Wildermuth compound. I noticed that the old setup for my SQL Server Monitor project was broken (mostly the dependencies were wrong), so I figured I'd just open it up, tweak a couple of settings, and move along with my day. D'oh!
It seems I'd lost the source to that project. I use Subversion to keep my sources locally, but that project pre-dated my source control. What to do? Well, I rebuilt it using Reflector to recover the old code. Then I rebuilt the setup using Visual Studio 2008's Setup projects (which suspiciously don't seem to have changed since their inception in Visual Studio 2002!).
Hey everybody, look...it's not a Silverlight post ;)
I just finished listening to this interview with Peter Spiro, who discusses leadership on the SQL Server team, Shackleton, and WinFS. It's worth a listen as he's one of the smartest guys at MS (in my opinion) and has enough cred to back it up.
For those of us who have spent any time in the sample AdventureWorks database, I just found a handy Visio (or HTML) database diagram of it. It's clean and explains some of their ideas about using schemas in SQL Server. Worth a look if you've ever worked with the AW database.
I got to thinking about writing custom paging code with the new SQL Server 2005 ROW_NUMBER, RANK, and DENSE_RANK functions. I started playing around with code and ended up with this simple ASP.NET 2.0 example. It works with SQL Server 2005 and the Adventure Works sample database. The example uses ad-hoc SQL to make it easy to show how it works, but moving it into stored procedures would be simple.
I'd love any feedback!
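A minimal sketch of the ROW_NUMBER paging technique the example is built around. This is not the code from the download; it's Python with SQLite standing in for SQL Server 2005 (SQLite 3.25+ supports the same window function syntax), and the Product table is made up rather than pulled from Adventure Works:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Product (ProductID INTEGER PRIMARY KEY, Name TEXT, ListPrice REAL);
INSERT INTO Product (Name, ListPrice) VALUES
  ('Road Bike', 1200), ('Helmet', 35), ('Gloves', 20),
  ('Mountain Bike', 1500), ('Jersey', 50), ('Water Bottle', 5),
  ('Pedals', 80);
""")

def get_page(conn, page, page_size):
    """Fetch one page of products by filtering on ROW_NUMBER in a subquery."""
    first = (page - 1) * page_size + 1
    last = page * page_size
    return conn.execute("""
        SELECT Name, ListPrice FROM (
            SELECT Name, ListPrice,
                   ROW_NUMBER() OVER (ORDER BY Name) AS RowNum
            FROM Product
        )
        WHERE RowNum BETWEEN ? AND ?
        ORDER BY RowNum
    """, (first, last)).fetchall()

# Page 2 of 3-row pages: rows 4 through 6 in Name order.
for name, price in get_page(conn, 2, 3):
    print(name, price)
```

The key point is that ROW_NUMBER can't appear directly in a WHERE clause, so the ranked query is wrapped in a subquery (or a CTE) and the outer query filters on the row number range.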
I had an interesting discussion with some members of a class I am teaching right now about how paging is done. It looks like there are four ranking functions that, combined with Common Table Expressions, allow for paging. They can all include a separate ORDER BY clause (inside the OVER() clause) to specify how rows are ranked, so your result set can be sorted differently than it is ranked.
This function returns the sequential number of each row in the result. For example:
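The original example appears to have been lost from this post, so here is a hedged reconstruction of the idea, in Python with SQLite standing in for SQL Server 2005 and a made-up Scores table. It also shows how ROW_NUMBER differs from RANK and DENSE_RANK when there's a tie:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Scores (Player TEXT, Score INTEGER);
INSERT INTO Scores VALUES ('Ann', 90), ('Bob', 90), ('Cid', 80);
""")

rows = conn.execute("""
    SELECT Player,
           ROW_NUMBER() OVER (ORDER BY Score DESC) AS RowNum,
           RANK()       OVER (ORDER BY Score DESC) AS Rnk,
           DENSE_RANK() OVER (ORDER BY Score DESC) AS DenseRnk
    FROM Scores
    ORDER BY RowNum
""").fetchall()

# Ann and Bob tie at 90: both get RANK 1 and DENSE_RANK 1, but still get
# distinct row numbers; Cid gets RANK 3 (a gap) but DENSE_RANK 2 (no gap).
for row in rows:
    print(row)
```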
If you upgrade to RC1, be aware that SQL Server 2005 requires SP2 before it will work with Vista RC1. Yeah, I know there is no SP2 yet...but that's the case. It just doesn't work. I am trying to hack around the problem, so I'll let you know if I find a solution...
I have been attempting to try out some new software from Microsoft (including Glidepath and Visual Studio Database Edition). Both of these require SQL Server Express to be installed. Problem is that I install the Developer Edition of SQL Server 2005 (as well as 2000) because it is more full-featured than SQL Server Express. Why does Microsoft insist I have a third database server? Why can't it prompt me for which database to use, or at least attempt to find SQL Server 2005 as the default instance on the current machine? Just stoopid in my opinion. It's keeping me from trying out, and possibly exalting, these new interesting projects.
"There I said it..."
Recently I posted about Timestamps and CommandBuilders, and I got a well-informed reply from Luciano Evaristo Guerche concerning a related approach: using BINARY_CHECKSUM in SQL Server to do the same thing. I take Luciano's response to mean that if you can't use timestamps in the database (say, you don't have control over the schema), then BINARY_CHECKSUM is an improvement over the brute-force concurrency checking that CommandBuilders do by default.
I thought Luciano was right, but I wanted to prove it out. I ran some tests using a Typed DataSet and the Northwind Customers table:
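These aren't the original tests, but the pattern itself is easy to sketch. SQLite has no BINARY_CHECKSUM, so this Python sketch registers a CRC32-based stand-in as a user-defined SQL function; the Customers table mimics Northwind's column names but is created locally:

```python
import sqlite3, zlib

def binary_checksum(*cols):
    # CRC32 stand-in for SQL Server's BINARY_CHECKSUM over a set of columns
    return zlib.crc32(repr(cols).encode())

conn = sqlite3.connect(":memory:")
conn.create_function("BINARY_CHECKSUM", -1, binary_checksum)
conn.execute("CREATE TABLE Customers (CustomerID TEXT PRIMARY KEY, "
             "CompanyName TEXT, City TEXT)")
conn.execute("INSERT INTO Customers VALUES "
             "('ALFKI', 'Alfreds Futterkiste', 'Berlin')")

# 1. Read the row's checksum along with the data (the "original" state).
(orig_checksum,) = conn.execute(
    "SELECT BINARY_CHECKSUM(CompanyName, City) "
    "FROM Customers WHERE CustomerID = 'ALFKI'").fetchone()

# 2. Optimistic update: succeeds only if the row hasn't changed since the read.
first = conn.execute(
    "UPDATE Customers SET City = 'Hamburg' "
    "WHERE CustomerID = 'ALFKI' AND BINARY_CHECKSUM(CompanyName, City) = ?",
    (orig_checksum,)).rowcount   # 1 row: no conflict

# 3. A second writer still holding the stale checksum is rejected.
second = conn.execute(
    "UPDATE Customers SET City = 'Munich' "
    "WHERE CustomerID = 'ALFKI' AND BINARY_CHECKSUM(CompanyName, City) = ?",
    (orig_checksum,)).rowcount   # 0 rows: conflict detected
```

The win over the brute-force approach is that the UPDATE's WHERE clause compares a single checksum value instead of every original column value, while still catching changes to any of the checksummed columns.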
I was talking with a bright guy on the ADO.NET team today, and he told me that the DbCommandBuilder supports a new option called ConflictOption. This option tells the DbCommandBuilder to use one of three methods for detecting concurrency conflicts:
Unfortunately, the TableAdapters in Typed DataSets (v2.0) don't seem to use this when they create their concurrency checks (the TableAdapter code generator uses a DbCommandBuilder to create the Update/Delete statements).
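The three strategies differ only in which original values the generated UPDATE/DELETE statements compare. Here's a rough Python sketch of the WHERE clause each ConflictOption implies; the enum member names match System.Data.ConflictOption, but the column naming, parameter style, and `RowVersion` column are illustrative, not DbCommandBuilder's actual output:

```python
def build_where(key_cols, data_cols, option):
    """Approximate the WHERE clause each ConflictOption strategy generates."""
    if option == "OverwriteChanges":
        # Last writer wins: match on the primary key only.
        cols = list(key_cols)
    elif option == "CompareRowVersion":
        # Match the key plus a timestamp/rowversion column.
        cols = list(key_cols) + ["RowVersion"]
    elif option == "CompareAllSearchableValues":
        # Brute force (the default): match every original column value.
        cols = list(key_cols) + list(data_cols)
    else:
        raise ValueError(f"unknown option: {option}")
    return " AND ".join(f"[{c}] = @orig_{c}" for c in cols)

print(build_where(["CustomerID"], ["CompanyName", "City"], "OverwriteChanges"))
# [CustomerID] = @orig_CustomerID
```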
If you're in Atlanta and want to see me talk about SQL Server CLR Integration, come by the Atlanta C# Users Group on May 1st (that's today). The meeting starts at 6pm. Here's a link to the user group's site:
In a current project I am working on with a distributed team, we use a set of detached database files (mdf/ldf) to share the latest version of the database we're working on. I run SQL Server 2000 and 2005 on my local dev box, but since this customer is going to use SQL Server 2000, I've been trying to keep the work on SQL Server 2000.
No real problem so far. One of the devs on the project only has 2005 installed, but this isn't that big an issue and I've seen no compatibility problems so far...except...
Thanks to Bill Booth via the Windows Off Topic Mailing List, I learned that the DTC was being used for intra-database transactions with SQL Server 2000. Looking at the timings, using System.Transactions with SQL 2000 is through-the-roof slower than using traditional client-side transactions. Interestingly, SQL 2005 doesn't have this limitation. Lazar Florin has a great blog post that explains what is happening (found here).
Short story is that SQL 2000 can't automatically use the lightweight transaction manager inside of System.Transactions (SQL 2005 has promotable transactions and can use it fine). Luckily it is not too hard to make SQL 2000 behave (as seen in Lazar's blog entry). Great find, Bill!
I have so many SQL Server instances on my local machine and on other machines in my home office that I wanted one place to start and stop them all. I liked the start/stop functionality in the SQL Server agent, but I have MSDE instances and SQL Server 2005 instances running too, so a single place to do it all from the system tray was my goal. So I've created a simple .NET 2.0 application. I would have done it in 1.1 to make it more accessible for users, but there were some features I needed in 2.0 to make the app work. So if you have the .NET 2.0 Framework installed, check out this new app that puts control of multiple instances of SQL Server a mouse-click away:
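Under the hood, starting or stopping an instance boils down to controlling a Windows service: the default instance runs as MSSQLSERVER, and named instances (including MSDE ones) run as MSSQL$InstanceName. A minimal sketch of that core idea in Python, building `net start`/`net stop` commands; the real app presumably uses .NET's ServiceController instead:

```python
import subprocess

def service_name(instance=None):
    # Default instance -> MSSQLSERVER; named instances -> MSSQL$Name.
    return "MSSQLSERVER" if instance is None else f"MSSQL${instance}"

def control_command(action, instance=None):
    # action is "start" or "stop"; returns the `net` command to run.
    if action not in ("start", "stop"):
        raise ValueError(action)
    return ["net", action, service_name(instance)]

def control(action, instance=None):
    # Actually run it (Windows only, needs admin rights).
    return subprocess.run(control_command(action, instance), check=True)

print(control_command("stop", "SQLEXPRESS"))
# ['net', 'stop', 'MSSQL$SQLEXPRESS']
```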
I've been digging deep into SQL Server 2005's CLR integration over the last few days. I am surprised by several specific omissions. My guess is that they were dropped because of time constraints:
In addition, the current release of the InProc managed provider has made some interesting choices that differ from earlier versions:
It's going to be interesting to see where it goes from here. All of these changes have come just since the February CTP. I can't imagine it's going to stop here...
Forget everything I told you. It's all changed in the April CTP. I hope to re-give this talk soon in the Atlanta area with the new bits.
In response to Sahil Malik's recent post on CLR Types as UDTs in Yukon, I have to say I prefer Typed XML in Yukon to CLR types.
In Yukon there are two paths to creating user-defined types. The CLR path has some limitations (primarily the 8K size limitation). I am a big fan of the XML type path. Typed XML inside the server is a better way to create complex types in the database, IMHO. Typed XML is schema-based, which means the types in your database can be exposed externally as well if needed (like through web services; some of you might be seeing a pattern emerge here).