Database Server Provisioning: Items Of Concern Using Commodity Servers

Lately, I have received a series of requests from clients and acquaintances asking how best to come up with the right server specification for their SQL Server-based applications.

After some substantial research, I came to the conclusion that preparing a new environment for SQL Server isn't an exact science. It is almost always followed by lots of questions whose answers are hard to come by. The complexity of coming up with a sound specification is heightened by how infrequently most of us are exposed to the task. I mean, how many of us are asked to specify a new server for a new SQL Server-based application on a daily or even weekly basis?

So I thought of compiling what I have stumbled upon on the Web, in online fora, and in interactions with various hardware vendors. This compilation should serve as a starting point for coming up with a sound configuration for a machine that will run SQL Server, and it will also serve as my guide and reminder of the things I need to be concerned about.

Today, we have so many options to choose from, but I will focus only on those that affect SQL Server's performance on commodity servers. So what is in the market today that needs our attention?

CAUTION: I'll assume the server will be dedicated solely to SQL Server. I won't cover virtualized scenarios here.

Processing Unit

  • 64-bit. Going 64-bit is the way to go for most of us. It allows us to go beyond the memory addressing limits of a 32-bit system, and SQL Server need not burden itself with the extra memory handling that 32-bit systems impose.
  • Speed. A faster core speed generally gives better performance, so I'd say buy the fastest per-core speed your money can buy.
  • Cache. A larger cache is better. Cache lets the CPU keep frequently used data closer to the processing unit, so the more it can accommodate, the better.
  • Thread. Today's processing units offer flavors with multi-threading in the cores. Extra threads give us more 'processors' (logical) than we actually have (physical) to work on a given task. SQL Server takes advantage of this. In fact, the maker of SQL Server states that the compute capacity of the 2 logical processors of a hyperthreaded core is greater than the compute capacity of the same core with hyperthreading disabled. So go for processors with threading.
  • Cores. Today's staple commodity servers have 4, 6, or 8 cores per socket. Go for the highest number your money can buy. Be aware, though, that starting with SQL Server 2012 the number of cores you have comes with licensing implications, so you need to check on these concerns too.
  • Sockets. Today's commodity servers have 1, 2, or 4 sockets. Though your application may need only a single processor, it is sensible to factor in growth and future expansion options. You may opt for a server with spare sockets for growth; do take note, however, that you may have difficulty acquiring additional processors for your server if you decide to expand after several years. If you have enough money, it also makes sense to fill the spare sockets with the processors you need up front. Be aware, too, that starting with SQL Server 2012 the number of sockets you have comes with licensing implications, so you need to check on these concerns as well.
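Putting the threads, cores, and sockets items together: the number of logical processors the operating system (and SQL Server) will see is simply the product of the three. A minimal sketch, using hypothetical counts for a mid-range commodity box:

```python
def logical_processors(sockets: int, cores_per_socket: int, threads_per_core: int) -> int:
    """Logical processors the OS (and therefore SQL Server) will see."""
    return sockets * cores_per_socket * threads_per_core

# Example: a 2-socket server, 6 cores per socket, hyperthreading on (2 threads/core).
print(logical_processors(2, 6, 2))  # 24 logical processors
```

This is also the number that matters when you later think about licensing and about settings such as the degree of parallelism.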

Memory
  • Size. More is better. Period! 🙂
  • Density. Go for denser memory modules so you don't quickly fill the limited number of memory slots meant for expansion. E.g., if your server has 4 memory slots and can handle up to 32GB max, and you need 16GB of memory, it is better to get two 8GB modules than four 4GB modules so you still have two slots left for expansion. Expanding to 32GB later would render your 4GB modules unusable.
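To illustrate the density point with the same hypothetical 4-slot, 32GB-max server, here is a small sketch comparing the two ways of reaching 16GB:

```python
def expansion_headroom(slots: int, max_gb: int, module_gb: int, needed_gb: int):
    """Return (modules used, free slots, capacity reachable later without discarding modules)."""
    modules = needed_gb // module_gb
    free = slots - modules
    # Later growth fills the remaining slots with the largest module the server supports.
    largest_module = max_gb // slots
    reachable = modules * module_gb + free * largest_module
    return modules, free, reachable

# Two 8GB modules: 2 free slots, can still grow to the 32GB max.
print(expansion_headroom(slots=4, max_gb=32, module_gb=8, needed_gb=16))  # (2, 2, 32)
# Four 4GB modules: no free slots, stuck at 16GB unless modules are thrown away.
print(expansion_headroom(slots=4, max_gb=32, module_gb=4, needed_gb=16))  # (4, 0, 16)
```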

Storage and Related Components

Storage is probably the slowest component in your server, so choosing the right combination of storage and related components can lead you to a better-performing SQL Server database server.

  • Capacity. In SQL Server terms, storage capacity isn't just disk size. SQL Server has various ways of utilizing storage: you can have multiple filegroups across several arrays or disks in multiple enclosures. Size is really dictated by what your application needs. Determine your size estimates first and keep enough buffer for growth. If you can expand capacity in the future without having to shut down and reconfigure your system, that, for me, is the most sensible path to take.
  • Disk Size. Individual disk size may or may not affect SQL Server, but note that size can affect seek speed: heads travel farther in larger-capacity drives. So go easy on the larger disks. If you are building an array of multiple drives, you may prefer building it with smaller-capacity drives.
  • SATA, SAS. To keep things simple: SAS drives are faster but more expensive than SATA drives.
  • SSD. Solid-state drives are the future, but the future is here now. Though I have not personally tried them yet in a production server, vendor and customer testimonies tout SSDs as superbly faster in a lot of cases, significantly increasing a server's performance. There are still reliability issues that I have read about, but things are getting better each day. I'd probably want to utilize SSDs for my tempdb.
  • RAID. There is more than enough information on RAID and SQL Server on the web. It is one component an SQL Server DBA needs to be concerned with, and familiarity is key. There are plenty of storage/server options that allow you to have multiple RAID controllers and build various array types; it is impossible to cover them all in this article, so it is always better to check with your vendor. However, I believe that in SQL Server, the more RAID arrays you have spread across several controllers, using the fastest, most reliable, and least dense disk drives, the better. This way, you can freely spread load, such as separating transaction logs, tempdb, and data files onto their own arrays. You may check other blogs on this matter.
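As a rough aid when sizing arrays, here is a simplified sketch of usable capacity for the common RAID levels. Real controllers vary (hot spares, stripe overhead, and so on), so treat these as back-of-the-envelope estimates only:

```python
def usable_gb(level: str, disks: int, disk_gb: int) -> int:
    """Approximate usable capacity (GB) for common RAID levels."""
    if level == "RAID0":
        return disks * disk_gb            # striping only, no redundancy
    if level in ("RAID1", "RAID10"):
        return disks * disk_gb // 2       # mirrored: half the raw capacity
    if level == "RAID5":
        return (disks - 1) * disk_gb      # one disk's worth of parity
    if level == "RAID6":
        return (disks - 2) * disk_gb      # two disks' worth of parity
    raise ValueError(f"unknown level: {level}")

# Six hypothetical 600GB drives:
print(usable_gb("RAID10", 6, 600))  # 1800
print(usable_gb("RAID5", 6, 600))   # 3000
```

The trade-off is the usual one: RAID10 gives up capacity for write performance and resilience, which is why it is a popular choice for transaction logs and tempdb.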

Ethernet Ports

  • Bandwidth. Servers with 1Gbps ports are now the norm, but I have seen some server models with 10Gbps ports. However, you can only take advantage of these high-volume ports if your network (cabling, switches, etc.) supports them. I would assume that within the datacenter, or at the least within your servers' rack/enclosure, you have the fastest links. SQL Server will take advantage of whatever bandwidth you have in your box.

Blades, Storage Area Networks, NAS and External Storage options

These options present a somewhat different set of concerns to an SQL Server DBA compared to standard/commodity servers, but the nuances of the common components I have mentioned above still apply. I would love to expand this article to cover these options. This is all for now!


Whiskered Tree-Swift, Hemiprocne comata (female)
Photographed in Mt. Palay Palay, Cavite

Toto Gamboa is a consultant specializing in databases, Microsoft SQL Server, and software development, operating in the Philippines. He is currently a member and one of the leaders of the Philippine SQL Server Users Group, a Professional Association for SQL Server (PASS) chapter, and is one of Microsoft's MVPs for SQL Server in the Philippines. You may reach him by sending an email to


Cleaning Up Old Database Dirt with SQL Server 2012

I am about to venture into transitioning a 10-year-old database that started with SQL Server 2000 and that surely has some (a lot of?) 'dirt' accumulated in it since way back in 2001. In the past (and I am sure it still is so with the 2012 edition), SQL Server has been very forgiving of all the sins of its handlers (DBAs, developers, users, and pretenders). It has allowed a lot of minor issues to creep into the database without constantly complaining and without compromising its ability to run in production. In a perfect world, those issues would have been eradicated a long time ago.

The 'dirt' has accumulated because SQL Server hasn't been pestering people that there is some cleaning up to do. In the real world, that is acceptable; we just can't stop the show after someone spills water on the stage floor. After 10 years, with better tools and ways to clean up dirt, it would be a worthwhile exercise so we can navigate easily through the challenges of modern times.

Enter SQL Server 2012: it comes packed with efficient and reliable tools to accomplish this task of cleaning things up. To start the cleaning process, though, I need to see the mostly unseen 'dirt'. SSDT comes in handy for this job.

Using SQL Server Data Tools (SSDT) As Dirt Probe

Here is how to begin identifying what needs to be cleaned.

Then allow SSDT to scan your database for ‘dirt’. I am actually referring to ‘errors’ and ‘warnings’!

After scanning, go to the Error List tab and VOILA!!! Now you see the 'dirt'.

Armed with a long, ugly list, you can now hire a cleaner to scrub off each and every bit of dirt you have identified.

That is all for now for the cleaning up!


SQL Server Data Tools (SSDT) Lengthens Life of SQL Server 2005!

The arrival of SQL Server 2012 last week sparked a semblance of positivity among people who have been using version 2005, who held off on upgrading and skipped the two SQL Server 2008 releases. For a lot of folks like me, though, SQL Server 2012 gave a big boost and now lengthens the lifespan of 2005 databases. Not that 2005 is dying, but the tooling lends more than a helping hand to a lot of 2005's inherent goodies, so much so that some might even consider sticking with the version a little bit longer. Enter SQL Server Data Tools (SSDT), formerly code-named "Juneau", the newest tool Microsoft has provided for its legion of SQL Server professionals.

Developing for SQL Server 2005

Though SQL Server 2012 has a lot to offer to anyone, SQL Server 2005 is very much a formidable data platform that will not go away anytime soon. My company still supports a lot of clients running SQL Server 2005 who might not be considering an upgrade anytime soon. And as a matter of strategy, we still very much believe in picking the lowest common denominator among our target space. So here, just in time, SSDT gets initiated with a new project that must at least support SQL Server 2005.

Working Right Away

The last time I used SSDT was with SQL Server 2012 CTP3, where I utilized it for some presentations/demos. I downloaded it earlier today, wasted no time, and had it installed smoothly. No hiccups. It gets me so excited to play around with a new tool on something that won't be laid aside and forgotten after some tinkering. This time, SSDT gets to do real work right away with a small database.

Here are some of the goodies SQL Server 2005 users would get from a production release of SQL Server Data Tools:

  • Stand-Alone SSDT (No SQL Server 2012, No Visual Studio 2010 needed). I was glad Microsoft had a stand-alone distro for this tool. The last time I tried it, it was blended with SQL Server 2012 CTP3. Now, you don't need an installation of SQL Server 2012 or Visual Studio 2010 to use SSDT. Without the two, the SSDT web installer installs the necessary SQL Server 2012 and Visual Studio 2010 SP1 components so it can run standalone. I installed it on a bare Windows 7 machine with no prior SQL Server or Visual Studio components, and it went in with no hassles.
  • SSDT Supports SQL Server 2005 as Target Platform. As I have tried before, SSDT fully supports development on SQL Server 2005 (lowest version allowed). You get all the bells and whistles of SSDT even if your target deployment platform is only SQL Server 2005. SSDT tunes itself to support 2005 so you are assured that every object you have in your SSDT project is fully supported by 2005’s engine and syntax.
  • TSQL Code Analysis On Build. The most common coding malpractices are easily detected. There are around 14 coding rules that can be checked for violations. It is all up to you to weed out the problems as they appear as warnings (they can be set to appear as errors) if you want to tidy up your TSQL code.
  • Catches TSQL Design/Compile Time Errors. For users who were content to use just SQL Server Management Studio, catching SQL compile-time errors isn't routine. Often, code errors are revealed only when executed against a database. For example, the statement "SELECT col1, col2 FROM table1 ORDER BY col1, col3", when parsed or included in a stored procedure, produces no error upon creation of the stored procedure. With SSDT, such errors can be resolved at design/compile time.
  • Import SQL Server 2005 Databases. This tool comes in very handy as I attempt to clean up some old 2005 databases and probably copy some old stuff for the new project I am doing. I was able to import 2005 databases into SSDT projects with the goal of running some code analysis. As expected, I was able to detect a lot of areas that aren't flagged by the old platform as potential causes of problems (e.g., unresolved references, data typing inconsistencies, possible points of data loss, etc.). I was also able to easily copy old structures and stored procedures into a new SSDT project I am working on.
  • Offline Development. Unless I need to get some data in and out of a database while developing my SQL Server objects, I can completely do my development without needing to connect to a development or production database.
  • It is FREE! Thank you Microsoft!

My initial feedback on SSDT is that it is a very helpful and very promising tool. For a first release, there are so many goodies that can alleviate tasks that used to be very tedious. In a few hours of tinkering, I was able to fashion out a database with ease.

There is a lot more to discover in SSDT. So far, I have spent only a few hours of playing time, and I already feel my productivity has been tremendously boosted.

Indeed, SQL Server 2012 is now the flagship product of Microsoft when it comes to handling data … but with SSDT’s help, SQL Server 2005 is here to stay for a bit more.

For more information about SSDT, check out the links below:


The Philippine Eagle-Owl. The largest of all owls endemic to the Philippines. It stands almost 20 inches and preys on small animals like rodents and snakes.

Wild Bird Photography – Calibrate Your Birding Lens

One of the ways for you to get super sharp bird photos is to know your gear like you know how crazy you are.

WARNING: This blog may have OFFENSIVE content if you have been feeling miserably lately. You may opt to skip reading this blog in its entirety, else, read at your own RISK!

Recently, my workhorse, the Canon 50D, was admitted to the hospital due to a sprained shutter mechanism (one of the consumable/replaceable parts of a Canon DSLR). Advertised by Canon to last 100,000 actuations, my shutter had consumed only around 65,000 shots when intermittent issues popped up with increasing frequency. First was a black vertical arc appearing on my photos against extreme backlight, and the last was an intermittent Error 30 during 6fps bursts. I must have overused it to the point of exhaustion. But I thought there were 35,000 actuations left for me? Aaah, the beauty of advertising.

To cut the story short, I had the shutter mechanism transplanted with brand-new parts. It took Canon Philippines a week to fix it, and now I have another 65,000 actuations to boot. My Canon 50D is now back in action.

The core of this blog, though, is the consequence of having a brand-new piece of mechanism in my birding rig. As soon as I got it back in working condition, I suspected that some shots were off from the sharpness I used to get. I figured the only way to confirm this sharpness issue was to go out birding, test the gear, and re-calibrate it if sharpness was indeed off.

Unknown to many, the bodies and lenses that you buy aren't perfect. The moment you have your lens mated with a camera, you may have brought with it some problems. The mating of a camera body and a lens produces a unique combination. Sometimes it is a perfect combination; sometimes both need some tuning so minute imperfections are threshed out. One of the most important concerns a birder needs to look at in his gear is auto-focusing. Perfect focus delivers super sharp images. In bird photography, a mis-focusing rig, however minute the error, gives you images that are sharp but not extremely sharp.

So if you really want extreme sharpness, you should know how to calibrate your camera and your lens to achieve perfect focus. You do calibration when one of these things happen:

  • after you combine a body and a lens (right after you have bought the set)
  • when something in the combination has changed (in my case, some mechanism that might affect focusing was replaced)
  • during some regular tune-ups (I typically re-tune my rig once every 6 months)
  • when you suspect sharpness is an issue

The good news is that it is very easy to have your rig finely tuned to achieve perfect focusing. First, you need to know if your camera body has a feature that allows you to calibrate auto-focus. My Canon 50D has one, and it is called Micro Focus Adjustment (MFA). I would suggest at this point that you check your camera's user manual to see if your camera indeed supports this. If your camera doesn't have this feature, sorry, you may have to bring your rig to your service center for proper tuning. If you do have the feature, check the manual on how to use it. This might differ from camera to camera depending on brand and model.

As a bird photographer, though, this is how I calibrated my camera and lens upon suspecting a sharpness issue caused, maybe, by the installation of a brand-new shutter mechanism.

NOTE: I purposely went to Angono, Rizal yesterday to shoot the Philippine Eagle-Owl and at the same time recalibrate my camera and lens to achieve perfect focusing. This is how I did the tuning:

  1. Mount your gear on a sturdy tripod. You must not move your tripod during the entire calibration process.
  2. There is no better substitute than to use the REAL THING and do it outdoors! Use bird feathers when calibrating your gear. Feather details are what you are after, so using a feather gives exactly the same feedback as shooting birds. Since the Philippine Eagle-Owl was readily available yesterday, I 'plucked' a feather from it and used it for calibration. Place the feather just a bit outside your MFD (minimum focusing distance). If you can't get a feather, you are not a resourceful bird photographer; I suggest you shoot landscapes and sunsets.
  3. Format your cam’s media card.
  4. Set Timer mode to shoot. 2 seconds would do. Go for 10 if you want.
  5. Set the rig to LIVEVIEW, the lens wide open and at its longest focal length. Zoom in on the feather at maximum magnification (10x for my Canon 50D). Focus manually or using contrast-detection focusing. This enables you to get visual confirmation of the sharpness of the feather details via your cam's LCD screen. Then shoot! Be sure that you produce an ultra sharp image using this process. Repeat Step 5 until you are able to produce a single ultra sharp image. This ultra sharp image will be used as your guide in achieving perfect focusing. Be sure to delete the ones that are not as sharp as can be, retaining only the single sharpest image.
  6. Set the lens to auto-focusing mode. Set the proper limiter (mine at 3.5 meters) if your lens has these switches. Set the lens wide open and at its longest focal length (for zooms).
  7. Set MFA to zero and shoot at the feather. Using your cam's LCD, view and compare the output with the previously saved ultra sharp image that was taken via LIVEVIEW. If you get the same ultra sharpness in both images, then congratulations for having a perfectly tuned and focusing rig; your tuning is DONE. However, if sharpness is doubtful, either increase or decrease the MFA setting and repeat Step 7. Do this until you achieve the same level of sharpness as the one you had using LIVEVIEW.
  8. Only after you are done can you move the tripod. If you have moved your tripod during the process, consider repeating from Step 1.

Here is the result of my calibration earlier using a real Philippine Eagle-Owl feather in Angono, Rizal. Click on the photo for high-resolution viewing.

Prior to my cam’s shutter mechanism getting replaced, my MFA setting was at +4. Long ago, it was at +8. Now with its brand new shutter, it is at +12.

This is all folks! I hope this article has given you an idea on how to know your gears better.

NOTE: No birds were harmed during the calibration process.

Check out my Philippine birds photos @ Facebook!

A Snapper, a Goby and a Barracuda!

On the weekend of January 28th, I woke up with an itch and found myself on the road to my favorite fishing spot in Cavite. It was past 8:00am when I arrived. Thinking that I was a bit late, I was just hoping that the day would be less lousy. After I parked, I wasted no time and rigged one rod for some live bait, skipping my usual routine of casting artificial lures for the first hour or two. My target bait was the promising, fast-becoming-a-favorite feathered river gar ('kansusuwit'). In a few minutes, I reeled in one. I immediately tossed it back into the water some 10 meters out, and as it splashed and thrashed in the water to rid itself of the hook, it was like whaaaam! In an instant, I got myself this:

I got myself a perfectly sized Pargo, or Mangrove Jack, officially called a Mangrove Red Snapper (Lutjanus argentimaculatus). As excitement engulfed my senses, I reached in to remove the hook when I got careless. It was able to snap at the tip of my thumb, and blood just oozed for like 5 minutes. I realized then why it was called a 'snapper'. It was really painful, but I was quite happy with what I landed: my first ever Pargo.

Then I got myself another lively kansusuwit, and as it landed back in the water, there was another strike from something big. This time, though, it got away. Then no strikes for the next hour or so.

I decided to change rigs and opted for some dough bait to see if some tilapias would bite. Then I got another bite. This time I got this:

This is the biggest Biya (Tank Goby, Glossogobius giuris) I have ever seen or caught.

For the next couple of hours, there was no sign that my day would be bountiful, so I just kept myself busy with a sandwich and some ice-cold soda.

Past lunch time, I decided to switch to live bait once again and got myself a kansusuwit. I tossed it back into the water, and after some minutes I got a huge strike but wasn't able to set the hook. As I retrieved to cast my bait farther, I got another strike, then a long pull. I moved to set the hook, but it got away. All that was left of my bait, however, was the head of the river gar; something hungry got the other half of the fish. I tossed back the half bait and got another strike. This time I was able to set the hook properly, and as I reeled in, the fish jumped out of the water and showed itself to me. This is what I caught:

I got myself a juvenile Great Barracuda (Sphyraena barracuda). I wasn't able to shoot a decent photo right after I landed this unexpected predator, so I shot this one after I got home. A few more hours of casting live baits got zero strikes.

It was getting late in the afternoon when I decided to park my mobile nearer to where I was, so I left my rod unattended. As I got back, the rod was violently bending and shaking, as I had forgotten to loosen the drag a bit. But before I could get to the rod, something snapped. As I retrieved, all I got was a freshly snapped branch of a mangrove tree, around half an inch in diameter, with the leading hook a bit straightened and embedded in the bark, and the trailing hook gone. It must have been some kind of a monster to do something like that to a size 2 hook tied to a 30-pound test line. After the incident, it was uneventful until I called it a day.

I went home a happy angler! 🙂

The Reckoning: The ‘Sword’ Just Got Bigger

On January 3, three days after I lost my topwater lure to some biggie, I went back for some reckoning … this time with Dennis, my bird photography buddy. I went to Ternate, Cavite; he went to Mt. Palay Palay for bird photography. I settled on a small riverbank adjacent to a lagoon where I had lost my P322 topwater lure to some fish that I failed to identify and land. The fish must have been huge to snap a 16-pound braid and take off with my one and only topwater lure.

Without my topwater lure, I decided to fish for some kansusuwit (Feathered River-garfish) as live bait. This species seems to be a perfect live topwater lure. It was actually my first time intentionally going for live baitfish and trying out these kansusuwits. I rigged my first rod to catch some kansusuwits with a size 10 hook and laced it with some diced shrimp. As soon as the bait landed on the water, a few of the river-garfish nearby raced for the shrimp, and in no time I reeled one in. Immediately, I carefully unhooked the fish and transferred it to my 2nd rod, which was rigged with a size 6 hook.

As soon as I tossed the baitfish some 10 meters onto the deep end of the riverbank, I got a strike and instantly set the hook. Then whaaaam … something very strong was pulling down the line. It fought strongly for some 30 seconds, then the line went loose. I thought it had gotten away, but as I reeled in some line, it pulled down my line again and showed its silvery body near the surface. This time, I reeled it in fast as excitement rushed through me. As I reeled it out of the water, I saw it was some needlefish. Then I realized it wasn't hooked; it had gotten itself entangled with the line and had not swallowed the baitfish. Just as I was about to grab it, it was able to free itself. Luckily, it fell on the sandbar and I was able to prevent it from getting back into the water.

Here it is, a 15.5-inch Spottail Needlefish (Strongylura strongylura):

With its not-so-large mouth, I figured the needlefish must have rolled itself after it snatched the bait, entangling itself with the line in the process.

Live-baiting for the rest of the day proved futile. I got not even a single strike after the needlefish.

Thinking my luck had run out, I shifted my focus to the wild 'saltwater' tilapias that are so plentiful nearby. With some worms, I was able to land a few. This was also my first time to catch this 'saltwater' variety.

Then I called up Dennis to see how he was doing; he had gotten something that was eventually identified as a rare migrant, an Asian Brown Flycatcher.

We went home very much contented with our catch.

As for me, it was a reckoning day indeed and I was able to land a bigger “sword”.

On The Last Day of 2011, I landed 13 ‘swordfish’!

A few days prior, I had been contemplating finding some new fishing site for wild fish species near my place. I tried those in the CCP Complex, and snagging tilapias was not for me. I thought of Valenzuela's, but its stressful traffic and crazy motorcyclists turned me off, and Subic is just too far and expensive.

I thought of trying Cavite as the next best place, so I found myself raring to go to Ternate. I used to go there for bird photography, so the place would be quite easy to navigate. I thought going to either Puerto Azul or Caylabne would be a treat.

I went to Puerto Azul to check out their pier, which had been unused for quite a long time, but security personnel barred me from fishing, saying their policy now prohibits recreational fishing due to some recent accidents involving foreign nationals. As fishing in Puerto is now prohibited, I thought of going to Caylabne, but the thought of paying P500 for entrance scared me away.

So I just roamed around Ternate and found myself at some mangrove-laden lagoon teeming with wild tilapias. As soon as I was able to deploy, I had my first cast using a topwater lure. As I retrieved it … it was like whaaam! A huge strike, then a snap. As I retrieved, only a few inches of my 16 lbs braided terminal and a swivel were left. "@#$*&%^&%@#$%^ Sayang!" I told myself.

Then I realized that it was already past 2pm and I still hadn't landed any after the big one that got away. I had deployed 2 rods, alternating various baits. I tried dough bait, peeled shrimp, artificial ones, but nada! I changed hook sizes down to the smallest and still no takers. Around 3pm, as the tide was rushing in, I switched back to a #6 with some peeled shrimp, when some river garfish took it. Then another, then another, till I landed a baker's dozen. Maybe their bellies were full early in the day, or was it the size of my hook? Or was it the bait? Was it because of the rushing tide? I am not sure, but that will be charged to experience.

I went home with a handful of river garfish that look very delicious and vowed to return to do some reckoning with the one that took my P322 topwater lure.

Here are my 13 ‘swordfishes’!

Feathered river-garfish (Zenarchopterus dispar). Locally called ‘kansusuwit’. Did I say feathered? Look at the long beaks! I am in familiar territory!

And here is what I did to some feathered friends with long beaks the next morning! 🙂

Some Major Changes to SQL Server Licensing in 2012 Version

Last month, Microsoft published changes to SQL Server's licensing model that take effect when the 2012 version hits gold. A number of major changes will likely catch everyone's attention. Here are those that caught my eye:

  • Editions now are streamlined to just three (3) flavors:
    • Enterprise
    • Business Intelligence (new)
    • Standard
  • Enterprise Edition will now be licensed per CPU core instead of the usual per CPU Socket (processor).
  • Server + CAL Licensing will only be available in the Standard and Business Intelligence flavors.
  • Core licenses will be sold in Packs of 2 cores.
  • The cost of a per-core license will be 1/4 the price of SQL Server 2008 R2's current per-processor license
  • The cost of Server CAL licensing has increased by more or less 25%
  • To license your server based on cores, Microsoft requires you to buy a minimum of 4 core licenses per CPU socket.
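Putting the core-licensing rules above together (a minimum of 4 core licenses per socket, sold in packs of 2), here is a sketch of how many licenses and packs a given box would need. The server configurations are hypothetical examples:

```python
import math

def core_licenses_needed(sockets: int, cores_per_socket: int) -> int:
    # Microsoft requires licensing a minimum of 4 cores per socket.
    return sockets * max(cores_per_socket, 4)

def packs_to_buy(licenses: int) -> int:
    # Core licenses are sold in packs of 2.
    return math.ceil(licenses / 2)

# A 2-socket, 6-core-per-socket server:
licenses = core_licenses_needed(2, 6)
print(licenses, packs_to_buy(licenses))  # 12 6

# An old 2-socket, single-core server still pays the 4-per-socket minimum:
print(core_licenses_needed(2, 1))  # 8
```

Note how the minimum makes older low-core-count hardware disproportionately expensive to license, which is the point of Scenario 1 below.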

These changes will definitely impact your upgrading plans. To carefully plot your licensing strategies, you can visit SQL Server 2012 Licensing Overview.

Where I See These Changes Have An Impact

  • Scenario 1: Upgrading from servers with dual cores or fewer. If you are upgrading from a very old server with 2 single-core CPUs, you need to buy 4 core licenses for each CPU. That means you have to buy 8 core licenses. It would be rather logical, when doing this, to upgrade your hardware as well. Besides, your hardware upgrade seems long overdue.
  • Scenario 2: Upgrading from servers with more than 4 cores. In the past, if one needed more CPU power for SQL Server, one only had to buy a new server with as many cores per processor as money could get, reinstall SQL Server, and be good to go … all without minding licensing issues and cost. With SQL Server 2012, you will be spending more as you go beyond 4 cores per socket. Going to 6 cores, you need 2 more core licenses; going to 8 and 10 cores, you need to buy 4 and 6 additional core licenses respectively.
  • Scenario 3: Upgrading from servers with Server + CAL licensing. There is a roughly 25% increase in the cost of CALs.
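The arithmetic behind the first two scenarios can be sketched in a few lines. This is only an illustration of the rules as I understand them from the licensing overview (the function names are mine, and actual prices are deliberately left out):

```python
# Sketch of SQL Server 2012 core-licensing math, assuming the two rules
# described above: core licenses come in packs of 2, and each CPU socket
# requires a minimum of 4 core licenses.
import math

def core_licenses_needed(cores_per_socket, sockets=1):
    """Core licenses required: one per physical core, but at least 4 per socket."""
    return sum(max(cores_per_socket, 4) for _ in range(sockets))

def packs_needed(licenses):
    """Core licenses are sold in packs of 2 cores."""
    return math.ceil(licenses / 2)

# Scenario 1: two single-core CPUs -> the 4-per-socket minimum kicks in.
print(core_licenses_needed(1, sockets=2))   # 8 licenses (4 packs)

# Scenario 2: one socket with 6, 8, or 10 cores -> 2, 4, or 6 licenses
# beyond the 4-core baseline.
for cores in (6, 8, 10):
    print(cores, core_licenses_needed(cores))
```

The takeaway is that per-core pricing only matches the old per-processor pricing up to 4 cores per socket; every pair of cores beyond that is an extra cost.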

For those of you still on version 2000 or 2005 who are planning to upgrade but find these 2012 changes unpalatable, you may want to rush an upgrade to SQL Server 2008 R2 instead and lock in that version’s perks and licensing model, before your resellers remove the older SKUs with their older licensing models from their sales portfolios and force you to buy SQL Server 2012.

Of course, SQL Server 2012 Express will still be FREE!

Juneau That You Are On Steroids With Denali?

Supporting multiple versions of SQL Server for a single application typically means headaches, and that is what I have been dealing with since I started making commercial applications (banking mostly on the benefits provided by T-SQL). We have not abstracted our databases behind non-TSQL objects in the mid-tier, for very valid and interesting reasons. In my current situation, I am dealing with SQL Server 2005, 2008, and 2008 R2, supporting bits of features from each version. Thank heavens all our clients have already upgraded from version 2000. My world, though, is still 80% SQL Server 2005, and since it remains a very formidable version, I don’t think I will be able to part ways with this build anytime soon. Case in point: clients cannot just easily upgrade to higher versions every time one comes out.

As a result, I am mired in managing our SQL Server development assets with various tools, anything that can help track and manage everything. I have always wished for a single tool that would address most of my SQL Server development concerns, if not all. All these years I have relied heavily on SSMS, supported by various other tools. When Database Projects arrived with Visual Studio, it got me interested, but it tanked every time I loaded it up with what I have; the last attempt was with Visual Studio 2010. With the sheer number of objects we have, capacity and performance were always the issue, and each time I had to fall back to SSMS without exploring those tools further.

When Juneau, now officially called SQL Server Developer Tools, or SSDT, was announced as one of the touted tools in the next version of SQL Server, codenamed “Denali”, I was pretty excited about the possibilities it would bring to my situation. After getting my hands on it with Denali CTP3, I can say it really got me excited. Though I haven’t tried steroids, based on the effects I have seen, SSDT (sounds like drug testing, haha) gets my nod despite some issues I encountered. I am aware that what I have is a preview build, and Microsoft will surely iron those out eventually. The benefits far outweigh the quirks I have seen; it would be worth upgrading for SSDT alone.

So what is in SQL Server Developer Tools?

I have barely scratched the surface, but here is what I have found that will have a big impact on my situation:

  • It is a unified environment/tool for SQL Server, Business Intelligence, and SSIS. Everything I need is provided within the SSDT shell; I don’t have to switch back and forth between Visual Studio, SSMS, and BIDS. Creating new reports and SQL Server objects such as tables, views, stored procedures, filegroups, full-text catalogs, and indexes can all be done within SSDT, and existing ones can be added too. When I had the chance to explore MySQL Workbench, I found it cool to have the objects I create scripted first before they are created physically. That capability is now available in SSDT.
  • “Intellisense support for TSQL” for SQL Server 2005. SSDT has not actually upgraded SQL Server 2005 and given it TSQL Intellisense; rather, Intellisense is available within SSDT itself regardless of your targeted SQL Server platform. With SSDT, you model your database schema/objects without the tool being connected to a live 2005 database. This way, you write your TSQL with Intellisense provided by SSDT, then publish your project to a SQL Server 2005 database by choosing 2005 as your target platform. Build/error checks are done before the database project is published.
  • SQL Server platform targeting. SSDT supports versions 2005, 2008, 2008 R2, Denali, and SQL Azure. You do your thing and simply select your target platform when you are done; during the build, you will be warned about platform violations.
  • Refactoring (rename, wildcard expansion, fully qualified naming). While writing TSQL, you can rename a field and SSDT propagates the change to all affected objects (tables, stored procedures, etc.). Neat. No more 3rd party stuff to achieve this. 🙂
  • Supports incremental project publishing. You can keep a development version of your database and upgrade your live version incrementally, either via a generated script or while connected to the database. It can even warn you when your changes would cause data loss. Very cool.
  • SSDT detects changes done outside of SSDT. It prompts you during publishing of a database project when it detects objects created outside the tool. This is necessary so you know if somebody is messing with your databases. I do wish, though, that SSDT could reverse engineer these detected objects when they are determined to be legitimate, as sometimes you tweak the database directly using other tools like SSMS.
  • Importing of an existing database schema (from 2005 to Denali) into a database development project. One need not start from an empty project; existing databases can be dealt with.
  • Scalability. I am glad SSDT successfully imported the more than 20,000 objects in our current database. My previous attempts with Visual Studio’s Database Projects (both 2008 and 2010) just hung midstream after hours of importing. SSDT took a little over 20 minutes to import our database on a lowly Denali test machine.
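To make the declarative model above concrete, here is a rough sketch of what an object script looks like inside an SSDT database project. The table and names below are entirely made up for illustration; the point is that the project stores plain CREATE statements, and on publish SSDT diffs them against the chosen target platform and generates the incremental change script for you:

```sql
-- Hypothetical table script as it would sit in an SSDT database project.
-- You edit only the CREATE statement; SSDT compares it against the target
-- database (2005 through Denali, or SQL Azure) and emits ALTERs on publish.
CREATE TABLE dbo.Client
(
    ClientId   INT           NOT NULL IDENTITY(1, 1),
    ClientName NVARCHAR(100) NOT NULL,
    CreatedOn  DATETIME      NOT NULL
        CONSTRAINT DF_Client_CreatedOn DEFAULT (GETDATE()),
    CONSTRAINT PK_Client PRIMARY KEY CLUSTERED (ClientId)
);
```

Renaming `ClientName` here, for instance, is where the refactoring support kicks in: SSDT updates the references in dependent views and stored procedures rather than leaving you to hunt them down.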

That is all for now, folks! Despite some unwanted quirks I encountered when using SSDT, I can still say it is worth looking at. I am just waiting for a decent Denali machine so I can load up my projects in SSDT and take things to the next level from there.

Toto Gamboa is a consultant specializing in databases, Microsoft SQL Server, and software development, operating in the Philippines. He is currently a member and one of the leaders of the Philippine SQL Server Users Group (PHISSUG), a Professional Association for SQL Server (PASS) chapter, and is one of Microsoft’s MVPs for SQL Server in the Philippines. You may reach him by sending an email to