Monday 20 October 2014

What's it like to attend a community event like SQL Relay for the first time?

Following the recent SQL Relay event in Bristol, I had the opportunity to talk to a first time attendee, Claire, who has recently found herself in charge of the SQL Servers in her workplace. I asked her if she would mind sharing her thoughts on the event, so that others can get an insight into what it's like to attend for the first time.

What in the day did you most enjoy, and why?
Quite glad I got to see Alex Whittles' session [Using PowerPivot and DAX to Predict and Win Fantasy F1] because it was interesting to see how the power of Excel can be used, and seeing it applied in the way he did made it easier to understand.

So the real world example helped?
Yes, and quite a fun example too, and I do think it's something we could use as well.

Anything that didn't work in the event?
I don't know if it's because I arrived late to the start of the session [Claire had been speaking to a sponsor at the time, and had run into the start of the following session], but I found the BIML session quite hard to understand, as I felt I'd missed the introduction and found it quite hard to follow where he was going, but I can see it would be useful if you're trying to create multiple packages and don't want to run through and do the same thing again.

Had you come across BIML before?
No, although I had found something similar before, so I could recognise the concept, and it was nice to see you can build things in that way rather than just amending them.

Did you realise that was an intermediate session?
No, no I hadn't picked that up! I felt it was good, but I did feel a little behind.

There's often one session like that - particularly if it's not something you can use at work, and get familiar with, but it's good to introduce yourself to new concepts.
The NoSQL session, Richard Munn's session - I was a bit confused by it, I suppose, because everything we do is entrenched in SQL, so seeing outside that was hard; and then, seeing the three different pieces of software he was demonstrating, I was trying to work out how they could be relevant to what we do, and whether they would be better, but I found I couldn't clearly see.

So it's something you need a longer session to appreciate?
Yes, I think so, but it's great to see how the concepts work, and it's good to see what else is out there.

Yes, it's just a taster, isn't it?
Yes. And his presentation style was funny, which helped - he did make me laugh!
And the last session, the "fun" one [Dave Morrison's SQL Fun Time Hour], I wasn't really sure what to expect, because it didn't really describe what it was about before I went, but it was really interesting, really good, and I was hanging on his every word. There was so much in there that was really useful.

So that's the one you'd like to see more of?
Yes, it was all real life, little bits of every day stuff you do all day and don't necessarily appreciate what issues it throws up in the background. I did really appreciate that one.

So would you go to more community events in the future?
Yes, oh yes - well, if I'm allowed to!

Some of the events are at weekends, or evenings - would you make it in your own time?
That's a different type of "Allowed To"!

It sounds like a positive experience then?
Yeah, oh yeah, I'd definitely try to go again.
It was just like a series of lectures, wasn't it - you just choose whatever you think might be most applicable, and I like that, and it was friendly as well! I had thought it might be a bit more unfriendly, but it was fine.

Did that worry you - that it might be aimed at people who had been doing the job longer?
Yes, I know with some of the people I spoke to, the moment they spoke back to me I could only really smile as I didn't understand what they were saying, but I did also have conversations with people who I felt were in a similar position to me, which is good to know! And I was already expecting there would be fewer women there, so that wasn't a surprise.
I don't know if I feel strongly about the gender mix, but I didn't want to feel people were talking over my head just because I was female.

Did you find that?
No, I didn't get that at all.

I'm glad to hear that.
Yes, a good day, but now I just feel I could sleep for a year - it's a lot to take in and reflect upon. I just hope I've got enough notes to convince my employer to send me to another one!

Thanks for your time, Claire, and I hope to see you at future events.

Sunday 8 June 2014

Check constraints not what you thought they were?

I was recently adding a check constraint to a table in SQL Server, and noted some behaviour which surprised me, so I thought I'd share it with you. I've tried this on SQL Server 2012 and 2008 R2.

I added two fields which either both needed to be null or both needed to be filled in. (Now, it's possible this points to a normalisation issue - but that's not the point at hand.) So the check constraint simply needs to ensure that either:

  • Field A is null AND Field B is null
OR
  • Field A is not null AND Field B is not null 

So I used the following check constraint:


USE tempdb

CREATE TABLE AllOrNothing
    (
      id INT IDENTITY(1, 1),
      FieldA VARCHAR(100),
      FieldB VARCHAR(100)
    )


ALTER TABLE AllOrNothing ADD CONSTRAINT UK_AllOrNothing 
    CHECK (
             (FieldA IS NULL AND FieldB IS NULL) 
           OR 
             (FieldA IS NOT NULL AND FieldB IS NOT NULL)
           )


Now, what surprised me was what I saw when I queried sys.check_constraints for this table:



and again if I modify the constraint in SQL Server Management Studio:



The brackets have gone! Now, this doesn't affect the functionality, because AND takes precedence over OR anyway, but it will still have a SQL developer looking at it and wondering whether the intended logic is what has actually been put in place. (Of course, the best way to be sure is to unit test your code.)
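If you want to check this for yourself, a query along the following lines shows the definition text SQL Server has stored - a minimal sketch, assuming the AllOrNothing table above has been created in tempdb:

-- The stored definition has been normalised: the brackets around each AND pair are gone
SELECT name ,
       definition
FROM   sys.check_constraints
WHERE  parent_object_id = OBJECT_ID('dbo.AllOrNothing')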

I thought I'd blog about this, as whilst it doesn't change the function of what's been coded, it does change the form of it, which is unusual and was to me unexpected.


Thursday 24 April 2014

What do you use to version control your databases?

Many developers version control (AKA source control) the databases upon which they work, to give them the same benefits we receive when using source control for applications, as well as the added benefit of having the two systems version controlled together. But a recent tweet from Red Gate got me thinking about those that don't use source control for databases.



I asked Red Gate about the source of their figures, and they helpfully supplied the results of the polls on SQLServerCentral.com upon which they had based this. The figures make interesting reading, and I wanted to draw out a couple of trends. Here's a plot of the results:

A graph of the poll results supplied by Red Gate
"Which Version Control System do you currently use to store your database scripts?"
Results from SQL Server Central.com
24% of developers not using database source control actually seemed a little low to me - my experience is that a great number of companies don't use source control for their databases, though that discrepancy may be due to a number of things, not least that as a consultant I tend to visit companies that need assistance! Looking at the figures, however, I can see that the proportion of people not using source control has varied between 31% and 21% across the polls, and doesn't seem to be dropping dramatically. Now, this may be a different set of people each time, or it may be due to the self-selecting nature of such polls, but it is clearly a not-inconsiderable number of people.

I've also noted with relief that the number of users of VSS has dropped dramatically since support for it was discontinued - if you still use VSS, please consider migrating to a supported system. TFS is growing in use, probably due to the inclusion of licences with MSDN subscriptions.

The use of Git and Mercurial is also surprisingly low - perhaps the distributed nature of these systems does not lend itself as naturally to database source control, but I have used Git with database scripts in the past with no real issues.

I found these results interesting, and if you have not yet adopted source control for your databases, you may be interested to read my series of articles entitled Stairway to Database Source Control, which are currently being published on SQL Server Central.com.

Sunday 30 March 2014

Why learn from Pluralsight?

A little while ago, I recorded my first course with Pluralsight, an online training provider.

When discussing this with others, some have expressed reticence to try it, as access to the courses is sold on a subscription basis. In this post I want to tell you why I author courses and blog posts for Pluralsight, and share with you some ways that you can experience this training before paying, so you can ensure you like this way of learning before committing.

I came across Pluralsight about a year ago, as a library of developer-focussed learning. Since then, they've expanded and acquired other course libraries, covering the whole spectrum of IT learning, including Open Source and Cloud Computing, as well as commercial software. I still find them my go-to provider for the training I need (and you should regard continued learning in your field as a best practice). The online, modular model allows me to learn in the time I have - most courses are broken into modules of 20-40 minutes, so a module can be watched in a lunch break, and over several days you can cover a whole course. I find their approach to training conducive to learning, and the subscription model encourages you to learn and improve as much as you can, rather than each subject or area costing more money.

So how can you try Pluralsight training, and see my course as well as the other great courses in the library, if you can't get budget approval for a subscription immediately?

Well, firstly, you could sign up to a 10 day (200 minute) trial. This is free of charge, giving you time to cancel if you don't like it before it converts to a monthly subscription. But there are other ways to get this sort of training for free that you may want to explore.

Some courses are not charged for, and can therefore be accessed whether you have a subscription or not - for details, check these blog posts. There are also some free courses for children. And you may find that you already have access to a subscription which gives you the full Pluralsight training library - such as Microsoft's BizSpark, DreamSpark+ and WebsiteSpark programmes, which each give you 90 days' access.

Of course, Pluralsight courses are not the only way of continuing your learning, and I recommend that you attend community events such as SQL Saturdays for in-person learning, as well as your local SQL Server User Group.

UPDATE: If you are a member of any of the following industry programmes, you can get a year's free Pluralsight training by filling in this form: Microsoft MVP, EMC Elect, Google Developers Expert, VMware vExpert, Citrix CTP, Salesforce MVP, ASPInsiders, Friends of Red Gate, JavaOne Rock Star, Cisco Champions.

Wednesday 26 February 2014

Keeping tSQLt tests separate in SSDT

Users of SQL Test and the tSQLt framework upon which it is based have noted in the past that, because test objects are stored in the database, they can be difficult to differentiate from the objects under test. This has been a barrier to some users adopting tSQLt for unit testing, as it prevents the separate management of tests and raises the potential danger of tests being accidentally deployed to a production environment.

Recently I have been using tSQLt within SSDT (following this method by Ken Ross), and it occurred to me that this approach allows us to keep the code under test in a different project from our tests, and therefore overcomes the difficulty of keeping our tests as database objects. With the tests in a separate project within the same solution, we can choose to deploy either the tests together with the code under test, or simply the code itself without the tests. By configuring the test project so that it requires the code under test to be deployed at the same time and into the same database, we can set up a publish definition that includes the tests for our development machines, and another publish definition for just the code under test. It's the latter we would use to deploy beyond our development workstations, for example to UAT. It also means that if you deploy by DACPAC the tests have never been in that package, so it is ready to deploy to each environment without your needing to take any additional steps.

Of course, a Continuous Integration engine has access to both projects within the solution from source control, so it can either include the unit tests or not, depending on your desired build action.
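For example, once a development (tests-included) build has been deployed, the step that executes the tests can be as simple as the following - a minimal sketch, assuming the tSQLt framework has been deployed alongside the test project:

-- Run every tSQLt test deployed with the development build;
-- tSQLt.XmlResultFormatter can then output the results in a format a CI server can consume.
EXEC tSQLt.RunAll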

By having both the code under test and the tests themselves within the same solution, we can use the same source control process for both, keeping everything in one place and preventing drift between the tests and the code they exercise.

When developing databases in SSDT, using this method gives me the best of both worlds: tests which live with the database under development, including within source control, and yet which do not need to be removed from the database as part of, or after, the deployment process.

Saturday 15 February 2014

Book Review - Tribal SQL

Tribal SQL
How many of you have written a book? Probably not many. Yet how many people working with SQL Server have thought that they knew better, or that someone else would really appreciate how they do something? Probably a lot more.

Trying to encourage new authors to share a small, manageable amount of knowledge on a subject - like the lightning talk of the book world - is one of the aims of the book I'm looking at in this post: Tribal SQL.

It came about from a wish of Jen and Sean McCown (The Midnight DBAs) to let people read the insight and enthusiasm which is present in the less often heard corners of the SQL Community. I think they've done a good job in encouraging new voices.

The subjects covered are wide-ranging, and all are written in an informal, insightful manner, which makes the book work both as an introduction to areas into which you might not have previously strayed, and as reference material and a best-practice refresher for those who have.

I'd strongly recommend this book to Accidental DBAs, who may find themselves suddenly confronted with a lot of new requirements and challenges, not the least of which is a distinct lack of training budget! (I'd also encourage attendance at SQL Community events, like SQL Saturday Exeter which is being held in March in the South West of England, and is free to attend).

This book doesn't restrict itself to technical subjects (although it certainly covers them, for example in Storage Internals and Data Compression) but also takes in softer skills, such as project management and how to get a stellar performance review. There are more basic introductory topics too, such as Stephanie Locke's chapter on reports: how to gather requirements, build a good relationship with others in the organisation, and produce a great report for them.

I found the breadth really refreshing - the authors clearly want you to learn, and the fact that they've donated their royalties to charity really underlines their wish to help those around them.

If you aren't convinced, Red Gate have currently got a sample chapter on their website (UPDATE: as Melanie pointed out in the comments below, there are three sample chapters from the book on the Simple Talk website - Agile Database Development, Guerrilla Project Management for DBAs and SQL Server Storage Internals 101), and you can buy the book from Amazon - there's even an e-book available! The code samples are all available from the Tribal SQL website.


Disclosure: I was sent a review copy of this book free of charge by Red Gate Publishing, but otherwise received no remuneration or reward for this review.

Thursday 30 January 2014

tSQLt adds Function Mocking to the unit testing armoury

The latest release of tSQLt (version V1.0.5137.39257) was published last week, and you can download this version in the usual place. If you're using SQL Test with tSQLt, you can upgrade SQL Test to the latest tSQLt release by following these instructions.

(Note, if you've not come across tSQLt before, it's a database unit testing framework written in T-SQL, and there's a great article explaining why you might want to use it here. Red Gate have written SQL Test to help integrate the power of tSQLt with the SQL Server Management Studio environment. This article assumes you're familiar with tSQLt, but if not I'd strongly encourage you to look at it for unit testing your databases.)

I am pleased to see that this release adds function mocking, as well as a simpler way to rename test classes than the previous workaround, and the ability for tSQLt.ExpectException to check for a specific error number.
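As a quick aside on that last feature, a test can now assert on the error number alone - this is a minimal sketch, in which the procedure under test (dbo.WithdrawFunds) and the error number are invented purely for illustration:

EXEC tSQLt.NewTestClass @ClassName = N'AccountTests'
GO

CREATE PROC AccountTests.[test withdrawing more than the balance raises error 50001]
AS
    -- Expect the specific error number, regardless of the message text
    EXEC tSQLt.ExpectException @ExpectedErrorNumber = 50001

    EXEC dbo.WithdrawFunds @AccountId = 1, @Amount = 1000000
GO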

Let's look at the function mocking in a bit more detail. Consider a function which should add two numbers together. We would want to unit test the function itself in the normal way, but a test of a stored procedure which needs to be isolated from that function can now call the new tSQLt method tSQLt.FakeFunction.


CREATE FUNCTION dbo.AddTogether (@a INT, @b INT)
RETURNS INT
AS
BEGIN
    -- Note the deliberate bug: this should be @a + @b
    RETURN @a + @a
END;
GO

CREATE PROC dbo.Maths
AS
    SELECT dbo.AddTogether(1, 2) AS SumOfNumbers
GO


Now, we can see there is a bug in the function above (if we run the stored procedure, we will get a value of 2 returned), but let's ignore that function and proceed to test our stored procedure. Remember, because we are isolating from our dependencies, we don't expect our stored procedure to fail its unit tests.

The way that FakeFunction works is that you supply it with a stub function to use in place of the function being isolated. I suggest you put these stubs in the test class. So let's create a simple function that returns a static value of 3.

EXEC tSQLt.NewTestClass @ClassName = N'MathsTests'
GO

CREATE FUNCTION MathsTests.Fake (@a INT, @b INT)
RETURNS INT
AS
BEGIN
    RETURN 3
END
GO

Now we are ready to create our test:

CREATE PROC MathsTests.[test I get a value of 3 returned when I add 1 and 2]
AS
--Assemble
EXEC tSQLt.FakeFunction @FunctionName = N'dbo.AddTogether',
                        @FakeFunctionName = N'MathsTests.Fake'

CREATE TABLE MathsTests.Expected (SumOfNumbers INT)
CREATE TABLE MathsTests.Actual (SumOfNumbers INT)
INSERT MathsTests.Expected (SumOfNumbers)
VALUES (3)

--Act
INSERT MathsTests.Actual
EXEC tSQLt.ResultSetFilter 1, 'EXEC dbo.Maths'

--Assert
EXEC tSQLt.AssertEqualsTable @Actual = 'MathsTests.Actual',
                             @Expected = 'MathsTests.Expected'
GO
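
With everything in place, the test can be executed in the usual way - for example (assuming the test class name used above):

-- Run every test in the MathsTests test class
EXEC tSQLt.Run 'MathsTests'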

If we run this test, it passes, because the stored procedure under test returns a row with the expected value of 3: our code under test (the stored procedure) is isolated from the function which contains the bug. Of course, you would want to ensure that any module from which you isolate is itself properly tested, so that you catch issues such as this, but this somewhat contrived example shows how isolation makes the true cause of a failure easier to pin down.

The ability to fake functions in this way is a great addition to the unit test writer's armoury.