Writing tests for stored procedures
With all of the talk about unit tests and test-driven development, there is little talk about writing tests for stored procedures and other database code. I would argue that testing stored procs is just as important as (if not more important than) testing .NET code, because (a) nothing recompiles your SQL code to catch breakage when you change something, and (b) bugs in stored procs can lead to really bad things, like data corruption, that are hard to fix.
Writing tests for database code is not rocket science. There are no SQL testing frameworks to learn (at least none that I know of), but that’s okay, because plain ol’ T-SQL does the job just fine.
Let’s say I have a database that looks like this:
create table Users
(
UserId int not null primary key identity(1,1),
Username varchar(255)
)
create table Roles
(
RoleId int not null primary key identity(1,1),
RoleName varchar(255)
)
create table UsersInRoles
(
UserId int not null, -- foreign key to Users
RoleId int not null -- foreign key to Roles
)
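The comments on UsersInRoles describe foreign keys that the DDL above doesn’t actually declare. As a sketch (the constraint names are my own, not from the schema above), one way to add them, along with a composite primary key that also prevents duplicate role assignments:

```sql
-- Declare the relationships the column comments describe.
-- Constraint names here are illustrative.
alter table UsersInRoles add
    constraint PK_UsersInRoles primary key (UserId, RoleId),
    constraint FK_UsersInRoles_Users foreign key (UserId) references Users (UserId),
    constraint FK_UsersInRoles_Roles foreign key (RoleId) references Roles (RoleId)
```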
And let’s say that we were going to write a stored proc that looks like this:
create procedure [dbo].[IsUserInRole]
(
@UserId int,
@RoleId int
)
as
begin
if exists (select 1 from UsersInRoles where UserId = @UserId and RoleId = @RoleId)
return 1
return 0
end
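Because IsUserInRole communicates through its return code, a caller captures the result with `exec @result = ...`. A quick smoke check might look like this (the IDs are placeholders):

```sql
declare @result int
exec @result = dbo.IsUserInRole @UserId = 1, @RoleId = 1
print @result -- 1 if that user/role row exists, 0 otherwise
```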
We can easily write a test for this. The test might look like this:
create procedure [dbo].[unittest_IsUserInRole]
as
begin
begin transaction
-- set stuff up
declare @UserId int
declare @AnotherUserId int
declare @RoleId int
declare @AnotherRoleId int
insert into Users (Username) values ('test')
set @UserId = scope_identity()
insert into Users (Username) values ('another user')
set @AnotherUserId = scope_identity()
insert into Roles (RoleName) values ('test role')
set @RoleId = scope_identity()
insert into Roles (RoleName) values ('another test role')
set @AnotherRoleId = scope_identity()
insert into UsersInRoles (UserId, RoleId) values (@UserId, @RoleId)
-- test stuff
declare @result int
-- should return true if the user is in the role
exec @result = IsUserInRole @UserId, @RoleId
if @result <> 1
raiserror('Test failed: should return true if the user is in the role', 16, 1)
-- should return false if user is not in the role but is in another role
exec @result = IsUserInRole @UserId, @AnotherRoleId
if @result <> 0
raiserror('Test failed: should return false if user is not in the role but is in another role', 16, 1)
-- should return false if user is not in the role but another user is in the role
exec @result = IsUserInRole @AnotherUserId, @RoleId
if @result <> 0
raiserror('Test failed: should return false if user is not in the role but another user is in the role', 16, 1)
-- should return false if the user does not exist
declare @NonExistantUserId int
select @NonExistantUserId = max(UserId) + 1 from Users
exec @result = IsUserInRole @NonExistantUserId, @RoleId
if @result <> 0
raiserror('Test failed: should return false if the user does not exist', 16, 1)
-- should return false if the role does not exist
declare @NonExistantRoleId int
select @NonExistantRoleId = max(RoleId) + 1 from Roles
exec @result = IsUserInRole @UserId, @NonExistantRoleId
if @result <> 0
raiserror('Test failed: should return false if the role does not exist', 16, 1)
rollback transaction
end
Notice that I begin a transaction at the top of the test procedure and roll everything back at the end. That way, none of the test rows I insert are left behind after the test finishes. If one of my assertions fails, the test raises an error, but everything still gets rolled back, because a raiserror at severity 16 doesn’t stop execution.
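One caveat: an unexpected error that does abort execution (a conversion error, for example, or any error with XACT_ABORT on) can terminate the proc before the final rollback, leaving the transaction open. A hedged sketch of a more defensive shape, using try/catch (SQL Server 2005 and later):

```sql
begin try
    begin transaction
    -- ... the setup and assertions shown above go here ...
    rollback transaction
end try
begin catch
    -- roll back whatever is still open
    if @@trancount > 0
        rollback transaction
    -- re-raise so the calling test harness still sees a failure
    declare @ErrorMessage nvarchar(2048)
    set @ErrorMessage = error_message()
    raiserror(@ErrorMessage, 16, 1)
end catch
```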
The last step is to call the test proc from a standard .NET unit test using your favorite testing framework, which might look like this:
[TestFixture]
public class When_checking_to_see_if_a_user_is_in_a_role
{
    [Test]
    public void Run_IsUserInRole_tests()
    {
        // Execute the T-SQL test proc; any raiserror surfaces as a
        // SqlException, which fails this test.
        // (The connection string is environment-specific.)
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("unittest_IsUserInRole", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
That’s it! Now you can have peace of mind when you write T-SQL code too!
Jon, you have a 50-line test script for a 10-line piece of code.
I understand that this may give you peace of mind, but the logic of your test process is an order of magnitude more complex than the logic in your procedure. So where are errors most likely to occur, and where is most of the administrative overhead likely to land during refactoring and feature enhancement?
And your sproc was extremely simple, pulling from only one table. Many sprocs are EXTREMELY complex and involve multiple levels of conditional logic, perhaps based upon aggregations of data from half a dozen tables or more. The script to create such test data would be enormous, and would have to ensure that all the test data meets relational integrity requirements and does not violate constraints. In your example, for instance, what happens if there is a unique constraint on Username (there should be) and the username ‘test’ already exists?
While the example you cited is not outrageously impractical (though it is of correspondingly lesser value given the simplicity of the tested code), trying to implement this approach on a 500-line piece of SQL that runs against self-referencing, normalized tables would be quite a challenge.
I think the best way to handle this is to have a stable test dataset already in place, that is migrated along with any scripts applied to production. A test harness for a sproc would then need only run the sproc with preset values and verify the results.
You could do something similar with the test dataset and accomplish the same thing, but in my experience, when I do it that way I’m constantly breaking the tests, because they depend on so many different things that any change in the database breaks them. But you’re right, for really complicated procs, data conversions, etc., this is probably the best way.
I’d have to disagree with you on simpler procs. I wrote tests like this on my project last year, and I’m glad I did, because the procs I was testing were inserting ~2000 rows of production data and adding a lot more from inside triggers. I wouldn’t want to do that without tests, because if I screw it up, it’s really, really hard to fix (we actually did have a problem in that situation because I didn’t have all the requirements, so I spent a day cleaning up bad data). Plus, after I left that project they had to change those sprocs again, so I’m sure they were glad they had automated tests that told them they hadn’t broken everything.
Dealing with all of the stored procs on that project was a real pain. It wasn’t that they were written poorly (some of them were, but most seemed alright to me), but you never knew everything that they were supposed to do or whether you were breaking something. If they had automated tests written, I would’ve known right away what they were supposed to do and whether I was breaking them. To me, that is invaluable, especially since we are constantly coming into projects and leaving them soon after.
If you check out the Visual Studio Premium edition, there is the “Data Dude” feature, AKA the Database Edition. Within that, there is the capability to write database unit tests: http://msdn.microsoft.com/en-us/library/bb381703(v=vs.80).aspx