SQL Server Data Tools (SSDT) Lengthens Life of SQL Server 2005!

The arrival of SQL Server 2012 last week sparked a wave of positivity among people who have been running version 2005, held off on upgrading, and skipped both SQL Server 2008 releases. For a lot of folks like me, though, SQL Server 2012 also lengthens the lifespan of 2005 databases. Not that 2005 is dying, but the new tooling lends more than a helping hand to a lot of 2005's inherent goodies, so much so that some might even consider sticking with the version a little bit longer. Enter SQL Server Data Tools (SSDT), formerly code-named "Juneau", the newest tool Microsoft has provided for its legion of SQL Server professionals.

Developing for SQL Server 2005

Though SQL Server 2012 has a lot to offer, SQL Server 2005 is still very much a formidable data platform that will not go away anytime soon. My company still supports a lot of clients running SQL Server 2005 who might not be considering an upgrade anytime soon. And as a matter of strategy, we still very much believe we have to pick the lowest common denominator in our target space. So, just in time, SSDT gets initiated with a new project that must at least support SQL Server 2005.

Working Right Away

The last time I used SSDT was in SQL Server 2012 CTP3, for a presentation/demo. I downloaded the release earlier today, wasted no time, and had it installed smoothly. No hiccups. It is exciting to play around with a new tool on something that won't just be laid aside and forgotten after some tinkering. This time, SSDT gets to do real work right away with a small database.

Here are some of the goodies SQL Server 2005 users would get from a production release of SQL Server Data Tools:

  • Stand-alone SSDT (no SQL Server 2012 or Visual Studio 2010 needed). I was glad Microsoft made a stand-alone distribution for this tool. The last time I tried it, it was bundled with SQL Server 2012 CTP3. Now you don't need an installation of SQL Server 2012 or Visual Studio 2010 to use SSDT: the SSDT web installer installs the necessary SQL Server 2012 and Visual Studio 2010 SP1 components so it can run standalone. I installed it on a bare Windows 7 machine with no prior SQL Server or Visual Studio components, and it went in with no hassle.
  • SSDT supports SQL Server 2005 as a target platform. As I found before, SSDT fully supports development for SQL Server 2005 (the lowest version allowed). You get all the bells and whistles of SSDT even if your target deployment platform is only SQL Server 2005. SSDT tunes itself to the target, so you are assured that every object in your SSDT project is fully supported by 2005's engine and syntax.
  • T-SQL code analysis on build. The most common coding malpractices are easily detected. There are around 14 coding rules that can be checked for violations. It is up to you to weed out the problems as they appear as warnings (they can be set to appear as errors if you want to tidy up your T-SQL code).
  • Catches T-SQL design/compile-time errors. For users who were content with just SQL Server Management Studio, catching SQL compile-time errors isn't routine. Often, code errors surface only when executed against a database. For example, the statement "SELECT col1, col2 FROM table1 ORDER BY col1, col3", when parsed or included in a stored procedure, produces no error upon creation of the stored procedure. With SSDT, such errors can be resolved at design/compile time.
  • Import SQL Server 2005 databases. This comes in very handy as I attempt to clean up some old 2005 databases and probably copy some old stuff into the new project I am doing. I was able to import 2005 databases into SSDT projects with the goal of running some code analysis. As expected, I detected a lot of areas that aren't flagged by the old platform as potential causes of problems (e.g. unresolved references, data typing inconsistencies, possible points of data loss, etc.). I was also able to easily copy old structures and stored procedures into a new SSDT project I am working on.
  • Offline Development. Unless I need to get some data in and out of a database while developing my SQL Server objects, I can completely do my development without needing to connect to a development or production database.
  • It is FREE! Thank you Microsoft!
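The compile-time detection point above is easy to reproduce in SSMS. This is a sketch (table, column, and procedure names are all hypothetical) of how SQL Server's deferred name resolution lets a bad ORDER BY column slip into a stored procedure unnoticed when the referenced table does not yet exist at creation time; SSDT would flag the same reference as a build error:

```sql
-- The procedure references dbo.table1, which does not exist yet.
-- Deferred name resolution lets the CREATE succeed anyway.
CREATE PROCEDURE dbo.usp_BadOrderBy
AS
    SELECT col1, col2
    FROM dbo.table1
    ORDER BY col1, col3;   -- invalid column, but no error at creation time
GO

CREATE TABLE dbo.table1 (col1 INT, col2 INT);  -- note: no col3
GO

EXEC dbo.usp_BadOrderBy;   -- the error only surfaces now, at execution time
GO
```

In an SSDT project, the unresolved reference to col3 shows up in the error list the moment you build, long before anything runs against a live database.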

My initial feedback on SSDT is that it is a very helpful and very promising tool. For a first release, it has so many goodies that can alleviate tasks that used to be very tedious. In just a few hours of tinkering, I was able to fashion a database with ease.

There is a lot more to discover in SSDT. So far I have spent only a few hours of playing time, and I already feel my productivity has been tremendously boosted.

Indeed, SQL Server 2012 is now the flagship product of Microsoft when it comes to handling data … but with SSDT’s help, SQL Server 2005 is here to stay for a bit more.

For more information about SSDT, check out the links below:

The Philippine Eagle-Owl, the largest of all owls endemic to the Philippines. It stands almost 20 inches tall and preys on small animals like rodents and snakes.

DB Developer : Document Your Objects

There are some very useful features, almost unseen and forgotten, buried somewhere in the SQL Server space that can give significant benefits to both DBAs and DB developers in their efforts to make databases live longer. As a database developer, you might want to document each important object in your database (e.g. each column of every table) and store those notes somewhere. Often neglected, there is a facility that lets you keep those very important object notes (e.g. a column's build history, comments) inside the database itself.

It is very simple, and there are two ways to document your objects. First, you can use SSMS: each object's properties window has an Extended Properties page where you can add name/value notes.

You can also achieve the same effect using T-SQL. To add a comment, you can use the following statement:

EXEC sys.sp_addextendedproperty
    @name = N'Description',
    @value = N'This column denotes a bird''s habitat',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'MyTable',
    @level2type = N'COLUMN', @level2name = N'Habitat'
GO

You can then retrieve all of the objects' notes using the following statement:

SELECT objname, name, value
FROM ::fn_listextendedproperty (NULL, 'schema', 'dbo', 'table', 'MyTable', 'column', default)
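Should you need to revise or remove a note later, the companion procedures take the same addressing parameters. A sketch, assuming the same hypothetical dbo.MyTable table and Habitat column:

```sql
-- Revise an existing note in place
EXEC sys.sp_updateextendedproperty
    @name = N'Description',
    @value = N'Primary habitat of the bird (revised)',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'MyTable',
    @level2type = N'COLUMN', @level2name = N'Habitat'
GO

-- Remove the note entirely
EXEC sys.sp_dropextendedproperty
    @name = N'Description',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'MyTable',
    @level2type = N'COLUMN', @level2name = N'Habitat'
GO
```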

The result lists each documented object along with the note's name and value.

That’s all folks! Hope you learned something out of this! 🙂

**************************************************
Toto Gamboa is a consultant specializing on databases, Microsoft SQL Server and software development operating in the Philippines. He is currently a member and one of the leaders of Philippine SQL Server Users Group, a Professional Association for SQL Server (PASS) chapter and is one of Microsoft’s MVP for SQL Server in the Philippines. You may reach him by sending an email to totogamboa@gmail.com

DB Developer : Encryption and Memory Loss is a Dangerous Mix

When playing around with data encryption, you usually deal with keys, certificates, and passwords. Not knowing the role each one plays in protecting your data can lead to massive data loss. Not knowing how to keep them safe can likewise cost you your valuable data. Worst of all, not remembering exactly where you keep them will leave you stunned and helpless.

When dealing with encryption, especially powerful ones, you must not forget the following:

  • Familiarize yourself with how keys, certificates, and passwords work in relation to encrypting data. Testing is the key, but your first encryption tests should be done on dummy data. You wouldn't want to lose live production data through encryption just because, after you have successfully encrypted it, voila ... you don't know how to decrypt it, or things simply won't decrypt because you did something you were not supposed to do.
  • Just to reiterate: be sure you know how to decrypt data that you have encrypted. Don't assume ... see it unfold before your eyes. Decrypt it.
  • When you have become a genius in the process and art of encryption and decryption, you have to know where and how to keep your keys, certificates, and passwords. Be sure to jot down notes on the various circumstances surrounding the encryption (applications used, versions, configurations, etc.) and keep the notes with your keys, certs, and passwords.
  • Don't ever trust your memory. Age, alcohol, and even an accident can wipe out keys, certs, and passwords. I am sure you won't store your notes in your head, so keep them in some safe place. I dunno ... I have yet to figure out the safest place to keep these things other than my head; you might want to send some comments in case you find one. The last thing you want is that when it is finally time to decrypt something after a long, long time, the things you need are nowhere to be found.
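In SQL Server terms, "keeping them in a safe place" starts with actually backing the secrets up. A minimal sketch: the database name, certificate name, file paths, and passwords below are placeholders you would replace, and the passwords protecting the backup files must themselves be recorded per the notes above:

```sql
USE MyDatabase
GO

-- Back up the database master key; the password protects the backup file,
-- so it must be stored alongside your notes, not just in your head.
BACKUP MASTER KEY
    TO FILE = 'D:\secure\MyDatabase_master.key'
    ENCRYPTION BY PASSWORD = '<strong-password-1>'
GO

-- Back up a certificate together with its private key.
BACKUP CERTIFICATE MyEncryptionCert
    TO FILE = 'D:\secure\MyEncryptionCert.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\secure\MyEncryptionCert.pvk',
        ENCRYPTION BY PASSWORD = '<strong-password-2>')
GO
```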

I actually just stumbled upon an old hard disk containing a large cache of encrypted files of notes, code, and source files, and I could not recall the password. Getting older proves to be amusing each day. Tsk tsk! The least I can do now is blog the experience. All of this should be part of your data security and data recovery strategy plans. You don't want this to happen to your company's financial database, do you?

It's hard growing old because you get so forgetful! :))


DB Developer : Track Developmental Changes Of Your Database Tables

As a database developer, you need to track changes to your tables from the moment they are created, through the development and testing stages, to when you have deployed them and made modifications, and finally to when you deprecate them. Without the benefit of expensive tools that do this for you, you can keep track of structural changes to a database table in Microsoft SQL Server using just SSMS. For newbies out there, this has nothing to do with tracking data/content changes in a table; let us save that subject for another article.

One of the first things you need to do after a fresh install of SSMS is to enable the Auto generate change script option in one of SSMS' designers. To enable it, go to Tools > Options ... > Designers > Table and Database Designers and check the option.

After you have enabled the option, every time you save a change in the table designer, SSMS will prompt you with a file-save dialog containing a script of the changes. You may want to test the feature before you trust it.

Here, we can simulate some real-world scenarios and see if this feature holds up. Using the SSMS table designer, let us do the following:

1. Create TABLE1 with col1 and col2 in it. Save the change script as "TABLE1-BUILD1.SQL".

You can then send this script to a fellow developer or anyone who needs the same table as you do or probably start filling the table with data.

2. After some time, you may want to add col3 to TABLE1 to introduce some feature you had forgotten when you first created the table. We can call this modification BUILD2 and save the change script as "TABLE1-BUILD2.SQL".

Checking the content of the script will show that SSMS is clearly aware your table has data in it and does a good job preserving it. Fellow developers, customers, or any instance of the table you maintain that needs an upgrade to BUILD2 can safely use this script to modify BUILD1 instances of TABLE1.

3. In some cases, you may want to rename a column or deprecate some feature by deleting columns. Just do it in SSMS, hit save, and once again you get a new modification script. Save it as "TABLE1-BUILD3.SQL".

Once again, BUILD2 instances of TABLE1 can be upgraded to BUILD3 using script “TABLE1-BUILD3.SQL”.

You can repeat the cycle until you have BUILDX of TABLE1. To deprecate TABLE1 itself, though, you need to script out a DROP TABLE yourself, as the tool does not automagically generate DROP TABLE scripts.

Furthermore, you can create new instances of TABLE1 at whatever version or build you want. To create a new BUILD3 TABLE1, just run TABLE1-BUILD1, TABLE1-BUILD2, and TABLE1-BUILD3 sequentially.
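To make the build chain concrete, here is roughly what the three change scripts might contain. This is a sketch using the hypothetical TABLE1/col1..col3 names above; the designer's actual output wraps each step in transaction handling and, for changes it cannot do in place, rebuilds the table through a temporary copy:

```sql
-- TABLE1-BUILD1.SQL: initial creation
CREATE TABLE dbo.TABLE1 (
    col1 INT NOT NULL,
    col2 VARCHAR(50) NULL
)
GO

-- TABLE1-BUILD2.SQL: additive change; existing rows are preserved in place
ALTER TABLE dbo.TABLE1 ADD col3 DATETIME NULL
GO

-- TABLE1-BUILD3.SQL: a rename; SSMS may instead rebuild the table
-- through a Tmp_ copy for more complex destructive changes
EXEC sp_rename 'dbo.TABLE1.col2', 'col2_renamed', 'COLUMN'
GO
```

Running the scripts in order against an empty database yields a BUILD3 table; running only the later ones against an existing BUILD1 or BUILD2 instance upgrades it in place.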

CAUTION: If you update the table outside of this designer, the change won't be tracked. You still need to manually script those changes and incorporate them into a file.

Hope this helps!


 

SQL Server: Real Time Data Processing? When Not To …

In these times of Google, Yahoo, eBay, and CNN, whose massive Internet applications and services run in mammoth data centers filled with thousands of powerful servers processing millions of user requests and data streaming through high-bandwidth networks, it is almost automatic for us developers and database enthusiasts to think of processing data in real time whenever somebody asks how we can handle a massive amount of data in a very constrained, highly distributed, and unreliable environment. Is there such an environment? Let us see.

Let us imagine the rural banks in the country. These banks are small and can be found in the most remote of barangays. They are not so technology savvy and still use traditional means of communication. Perhaps only a few have big servers, while the rest use only personal computers. To make it more constrained, let us imagine that these rural banks use varying forms of networking technology: some on unreliable dial-ups and low-bandwidth leased lines. To make it even more pitiful, let us imagine that some can only afford decades-old floppy diskettes to transport data from one branch to another.

Here is the big question: can we come up with an affordable solution that would let them experience secure nationwide bank-to-bank interoperation?

Our edge in technology might convince us that the exercise would be futile: how can we interconnect them when not all of them are on a WAN and we have no assurance of a reliable network connection? We might also wonder how to integrate the branches that still use diskettes.

Yes, absolutely, we can cook up some system for them ... but there is one question we need to ask first: do we always have to process data in real time?

To some, yes. For me? NO.

Anyway, let us not delve too much into what that solution would look like. Instead, let us take a look at an old, reliable idea that has reappeared in some of the latest software offerings and that can help us solve our rural bank challenge.

I am referring to SQL Server 2005's Service Broker. Those familiar with MSMQ will easily comprehend this new feature of SQL Server. Service Broker is designed around the basic functions of sending and receiving messages (just like email). In its advanced form, messages can be secured, queued, and routed to specific destinations regardless of whether the communication channel is online or offline (again, just like email).

Imagine one rural branch (Quiapo Branch) sends a money-transfer message to another (Sulo Branch) to notify it that amount X can be released to an authorized recipient. The Quiapo Branch teller just clicks "Send Message" without knowing that their DSL line went down. Using Service Broker, the teller won't even notice the outage: Service Broker queues and keeps the message until the DSL line resumes, then sends it to the intended destination. If the DSL interruption lasts about an hour, the transaction is delayed by just a bit more than an hour.
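A minimal sketch of the moving parts on each side of that conversation (the message type, contract, queue, and service names are all hypothetical):

```sql
-- Shared plumbing, created in each participating database
CREATE MESSAGE TYPE [//Bank/MoneyTransfer]
    VALIDATION = WELL_FORMED_XML;

CREATE CONTRACT [//Bank/TransferContract]
    ([//Bank/MoneyTransfer] SENT BY INITIATOR);

-- On the Quiapo (initiator) side
CREATE QUEUE QuiapoQueue;
CREATE SERVICE QuiapoService ON QUEUE QuiapoQueue;

-- On the Sulo (target) side
CREATE QUEUE SuloQueue;
CREATE SERVICE SuloService ON QUEUE SuloQueue
    ([//Bank/TransferContract]);
GO

-- Sending the transfer notice; delivery is queued and retried
-- automatically if the link to Sulo is down.
DECLARE @dialog UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE QuiapoService
    TO SERVICE 'SuloService'
    ON CONTRACT [//Bank/TransferContract];

SEND ON CONVERSATION @dialog
    MESSAGE TYPE [//Bank/MoneyTransfer]
    (N'<transfer amount="5000" recipient="Juan dela Cruz" />');
GO
```

The SEND succeeds immediately against the local queue regardless of the network's state, which is exactly why the teller never notices the outage.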

With this, we give the rural banks the means to operate as if they were always online. They need not be bothered when they are offline; the system just resumes normal operations automatically. So having unreliable and very slow communication lines won't be much of an issue.

So who says that we always need to process data in real time?

I'll try to find time to come up with something that tells more about Service Broker. Perhaps somebody out there would want me to 🙂

——————————
Additional Information: Service Broker can be used in a lot of scenarios. It can even be used to satisfy complex custom data replication requirements. It can split data paths into multiple channels so that messages are distributed across as many queues as possible (or queued to a file and transferred onto floppy disks), increasing capacity and the chances of still transporting the data. This also means Service Broker can be used to scale applications. It is also an excellent fit when you want to integrate loosely coupled applications.


When Is The Right Time To Create Indexes

This case was raised by my colleagues in the company, and I thought it would be nice to write it up on my weblog. This isn't a technical article but rather a reflection on when I think is the right time to create indexes, based on the experiences I have had every time I design and use databases. I am not sure this counts as good practice, but it has worked for me ever since I got hooked on SQL Server back in version 6.5.

So when do I start creating indexes? Read on …

When I lay out my DB objects, I seldom define indexes right away, though of course I make notes on where and what kind of indexes I'll create later. I let the database drag on for quite some time during the development stage without my custom indexes, and I let myself write queries without their benefit. I also allow some time for the database to get populated with enough data. Most often, I start creating indexes when the system is almost feature complete and most of the complex queries have been written and run against a fair amount of data. We usually call this the load testing phase.

Why do I do this? Here are my two major reasons:

  • Indexes can improve your query performance drastically. During the query writing phase, having indexes can give you the impression that your scripts are blazingly fast. This can be worsened by the fact that, typically, your database contains just a token amount of data at that stage.
  • Since we all create optimal indexes based on the kind of queries SQL Server will most often encounter and process, creating indexes too early can give you a skewed impression of performance.

Both reasons can potentially affect your ability to detect query performance problems.

By creating the indexes at a later stage of development, you force your database to work harder without their benefit. Performance problems will begin to appear if you have crappy queries. Still, at this point, resist creating an index; exhaust your means of optimizing the query without one. If you do create indexes at this point for the sake of testing, just be sure to delete them after the test.
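That create-then-delete test cycle is one statement each way. A sketch, with a hypothetical dbo.Orders table and columns:

```sql
-- Temporarily add an index to confirm it is the fix, not a crutch
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId_Test
    ON dbo.Orders (CustomerId)
    INCLUDE (OrderDate, TotalAmount);
GO

-- ... rerun the slow query, compare plans and timings ...

-- Then remove it so development continues index-free
DROP INDEX IX_Orders_CustomerId_Test ON dbo.Orders;
GO
```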

Doing this gives you a certain level of assurance that you are writing optimized queries even without the benefit of indexes. You will appreciate this practice even more when there are many people on your team writing many queries. Once you finally add the indexes, your queries will be lightning fast.

So how about you? When do you create your indexes?
