Will the next version of Microsoft’s Windows Phone OS be codenamed blackberry? Or apple? 😛 Should Ballmer call it Microsoft apple, it would be my first time to own an apple. Hehehe. And one can imagine a news headline saying: millions and millions of Microsoft users now use apple! That would be bedlam. Then we’d probably hear Mac fanatics or Steve Jobs saying apple sucks! Ahh … that would surely perk up my day!
Author Archives: totogamboa
Expertise Bug Leads to Major Database Design Blunder
Often, we hear software developers (myself included) say, “yeah … we can definitely do that!” or “yeah … you have come to the right place, anything you want … we can build it for you!” Then just as often, we hear stories of software projects failing here and there. And now I begin to wonder how things could be worked out properly so that problems, or in the worst case, failures can be avoided.
What happened to me recently could probably be one of the reasons why software projects fail. For the past several months, I have been working on a system that concerns the health of people. The system is intended for use by doctors, dentists, physical therapists, nurses, or any practice or profession that deals with people’s health. The system, when done, will handle quite an extensive amount of data gathered from a good number of processes and sources, generated every minute of the day.
For the past several months, there was a good amount of communication between my group and the potential users of the system. A lot of time was spent on requirements discovery and gathering and on analysis, and we even subjected a lot of items to questioning just to sort things out clearly so everything would come out fine. A near-functional prototype had been established for several months, and it looked like everybody was happy with everyone’s progress … including the state of the system. Beta testing was conducted, with 6 major beta builds in the last few months, and it seemed not a thing was amiss. In fact, it was so close to done that the potential users were eager to have the system deployed and pioneered as soon as possible (that would have been around last month). I was into the finishing touches, like being in the process of putting icing on the cake.
Then … it was KABLAAAAAAMMMMMM!!! As the project’s database designer, I thought of something that hurled everything back to the drawing board. I had missed identifying one piece of information that should have been in the database design from the very start. Among the whole set of people contributing to make this happen, this one thing never showed any sign of being thought out. The medical guys involved in the project never thought of the item. The software development guys never had any clue. I never had any clue. And I have contemplated deeply, analyzing how I could miss something this important. The people who played the analyst role came up short in thinking about this (I am one of them). But I, being the database designer, blame myself for missing something so important.
After gathering myself, I came to the conclusion that the only way I could have easily identified or come across such a piece of information is probably if I were a doctor, a practicing one, and at the same time a database designer with lots of databases and experience tucked under my belt. I caught the missing piece from a totally unrelated event, not even related to what I was doing.
To cut the story short, what happened thereafter was that one table was added to the database structure, with one new column that would serve as a reference for 70% of the other tables. With the change, 70% of the SQL code was rewritten, 50% of the critical UI got revamped, lots of time was lost, and lots of sleepless nights were gained.
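To make the scale of that change concrete, here is a minimal sketch of the kind of retrofit described above, with sqlite3 standing in for SQL Server. The table and column names (`visit`, `care_context`, `context_pk`) are invented stand-ins, since the real schema can’t be divulged:

```python
import sqlite3

# Illustrative only: invented names stand in for the real (undisclosed) schema.
db = sqlite3.connect(":memory:")

# An existing table, designed before the missing concept was discovered.
db.execute("CREATE TABLE visit (visit_pk INTEGER PRIMARY KEY, patient TEXT)")

# The late-discovered piece of information gets its own reference table...
db.execute("CREATE TABLE care_context (context_pk INTEGER PRIMARY KEY, label TEXT)")

# ...and every dependent table (70% of them, in the story above) grows a
# column referencing it, which in turn forces the SQL and UI rewrites.
db.execute(
    "ALTER TABLE visit ADD COLUMN context_pk INTEGER "
    "REFERENCES care_context(context_pk)"
)

# Inspect the retrofitted table's columns.
columns = [row[1] for row in db.execute("PRAGMA table_info(visit)")]
```

One new column looks trivial in isolation; the cost is in the ripple, because every query, view, and screen touching the dependent tables has to learn about it.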
The moral of my story, though completely lacking in juicy technical details as I cannot divulge those for fear of legal ramifications and of ridicule (what, am I crazy? hahahha), is that software development / technical expertise can only take us so far, and domain expertise is clearly a desirable attribute to have, especially if you are a database designer. The funny reality though is that I can’t picture myself as both a doctor and a database person, which would eliminate this problem in the future, and this is probably one reason why there are just too many software projects that have failed. And I can still hear myself saying, “yeah … of course, anything you want, I can build it for you!”
The system I talked about here is almost done and is looking really good! And I don’t think I have missed anything more that would screw up my day. How about you? Being considered an expert, do you say you can do anything that is asked of you? 😛
1.5 Terabytes of Photography Gone and Back, and How Windows 7 Installs and Fixes Itself!
Last Thursday morning, Windows 7 popped an ugly message … cannot read Drive S:. After I closed the message, I immediately opened Windows Explorer and there was no more drive S:. Suddenly, a rush of panic engulfed my senses. It was in that drive that I recently consolidated all my photos, and yeah … 1.35TB of RAW files since I got into the digital photography madness. I loaded up Event Viewer and one item said, “The device, \Device\Harddisk1\DR1, has a bad block.” I loaded up Disk Management and drive S just wasn’t there. I rebooted, and my PC’s BIOS told me I had a bad disk.
After a fresh restart, I immediately opened Windows Explorer to check on drive S:. Thankfully it was there, but it could not be opened or accessed. I immediately went into recovery mode. The first thing I did was test whether my chances of recovery were high. I ran an old SanDisk tool, RescuePro, and recovered files without subjecting the faulty hard disk to any write operation. Though very effective, RescuePro just went on and dumped every file it could recover into a single folder, with the recovered files named 00001.cr2, 00002.cr2 … xxxxx.cr2. After a few files were recovered, I realized it would be a nightmare trying to check each file for its content. I cancelled RescuePro and ran TestDisk, which I had used with my faulty CF and SD cards before. This tool is very advanced in terms of disk/file recovery, but its UI is as old as those character-based DOS apps of the 80s. Running TestDisk, I was able to peer into the folder structure of the faulty hard drive and have each folder recovered to another 1.5TB disk. It took me almost 24 hours to have everything recovered. Yeah … ALL files were recovered.
Just as I thought my woes were over, while I was verifying whether each folder had indeed been recovered, my Windows 7 PC blue screened. If you had been just a street away from me at that time, you could have been deafened by the curses you would have heard from me. I restarted the PC and it said something about a missing BootMgr. What the !@#$!!!!! Upon further scrutiny, I realized the problem started a couple of years ago when I had this PC freshly formatted after adding some new hard drives. I remember that when I had the first HDD upgrade, I set the BIOS to boot from the new 1.5TB HDD instead of switching cables so that the boot order would match the corresponding disk ports. What happened was I had the following setup:
- BIOS Boot Order on Device 1
- Device 0, 320GB, Active Primary D:\, …
- Device 1, 1.5TB, System, Boot, C:\, ….
Earlier this year, I replaced the old 320GB with another 1.5TB and forgot to reset what was in the BIOS all these times. So I had the following setup:
- BIOS Boot Order on Device 1
- Device 0, 1.5TB (new)
- Device 1, 1.5TB (old)
Without this HDD crash incident, I would never have known that Windows 7 did the following when I had the PC reformatted right after installing the new 1.5TB HDD some months back.
- BIOS Boot Order on Device 1
- Device 0, 1.5TB (System, C:\, C:\Windows)
- Device 1, 1.5TB (Boot)
As you can see, the BIOS tells the PC to boot from Device 1. Since that drive had crashed, the PC could NOT find the necessary boot info, thus the “BOOTMGR is missing” message. I attempted to run BCDEdit, but the app hung as it accessed the faulty drive. I tried Windows Repair and all, to no avail: Windows Repair only managed to fix the partition issue but did not repair the boot miscue. All this until I found Hanselman’s blog post on BCDBoot, where he happened to be in a similar situation.
I immediately ran BCDBoot, restarted the PC, and changed the boot order to Device 0, but it just wouldn’t boot properly.
Thinking that BCDBoot had already corrected the boot miscue, I figured Windows Repair could do things differently this time. I popped in the Windows 7 installer, went into Repair mode, and voila … the PC booted normally. Checking Disk Management, my rig now says:
- BIOS Boot Order on Device 0
- Device 0, 1.5TB (System, Boot, Primary Partition, C:\, C:\Windows)
- Device 1, 1.5TB (Active, Primary Partition)
I then physically removed the faulty hard drive for one last reboot … and everything was back: all 1.35 terabytes of RAW files, plus some new knowledge on how Windows 7 installs and fixes itself!
Google+ … And What It Means For Photographers Like Me
First of all, BIG thanks to my friend Rhamille for sending me a Google+ invite.
I started with Flickr, using a Canon A40 point-and-shoot, and remained a paying customer for years to publish my photos, until I realized I wanted more. At first I thought Flickr was all that I wanted: hi-resolution, un-tampered photo quality, album management, peer feedback, specialized photo groups, and Explore!! Then I realized my friends, the people I also want my photos shown to, were not there.
Then Facebook came. I am actually a late Facebook adopter, as it took quite some time before people were able to convince me to take the social networking plunge. I resisted Facebook for a while, but as soon as I experienced the power of the “Like”, I never looked back. I still have my Flickr account, which I haven’t updated since August 2010, and it says, “Hey Toto Gamboa (Not Uploading Pics Here Anymore)! Your Flickr Pro account has expired. Don’t panic! You can only see 200 photos, but the others are safe & sound. You can see them if you renew.” As if I care if I don’t renew! 😛
I have used Facebook’s Photos for my photographs since last year, but it leaves so much to be desired, and there are lots of negative things a photographer can say about it. And you will always hear from everybody that Facebook is, first and foremost, a social networking site, not to be compared against photography-centric sites such as Flickr and the like. And did I mention that the first note I wrote on Facebook was about how disgusted I was with its photos? Lols! I had sworn that the moment another social networking site gave photos some importance, I wouldn’t hesitate to quickly adopt it.
And voila! G+ seems to be the answer to my prayers. Not to bore you with what Google+ is, but as a photographer, it’s like Facebook and Flickr rolled into one, and then some, and A LOT MORE! Here is my quick experience with Google+ Photos:
- (UPDATE as of Aug 02, 2011) Hi-resolution. Great implementation, I must say. Documentation says one can upload an image as large as 2048 x 2048. Though true, you can only view this size “as is” in your browser if your monitor is large enough to contain that big an image. Since most monitors are way smaller, you usually cannot: G+ adjusts the display size of your image to the size of your browser. The rationale is that you must be able to view a photo in its entirety without being forced to scroll horizontally and vertically. However, you can still get the 2048 x 2048 photo by downloading it. There is one quirk with this design though: since most monitors are oriented to landscape, vertically cropped/framed photos are resized on display so much that they end up really small. This is done so you can still view the photo in its entirety. I wish G+ had an option to turn off resizing for vertically framed photos.
- (UPDATE as of July 14, 2011) No compression, no resizing. I uploaded a 1280 x 720, 782KB photo, and Google+ retained all its gory and glory. It seems that when you upload a photo that is within Google+’s limits (as long as your browser size is capable of displaying your photo’s full resolution), no compression or resizing is done. Woooohooo! +1.
- Image linking is HTTPS-based, which assures me that my photo won’t be further degraded by any other means. You can check out my other blog post on this very issue.
- Linking with Picasaweb. This means that I can now reach another set of audience for my photos: the photography-centric ones, which I lost when I stopped using Flickr.
- EXIF. I don’t have any issues showing the shooting info of my photos to everybody, so this is a plus for me.
- (UPDATE as of July 20, 2011) Auto language translation. Ever wonder how to understand someone posting comments on your photo in Italian or French? Google+ does the translation for you!
- It’s FREE. No limits. You can upload as many photos as you like as long as you upload via the Google+ interface. If you upload via PicasaWeb, you are only limited to 1GB of space.
- And my friends, other than those photography-centric folks I interact with, can see and appreciate my photos like they do on Facebook.
Google+ is just 2 weeks old since invites started going out, and a lot of things remain to be seen. However, despite being in beta, it looks very promising, and most of the things I wanted from Facebook are there.
Of course, for things to get better, all my friends on Facebook need to join Google+ first! 🙂
Now it’s up to you to find out what’s in store with Google+ Photos! … check out my Google+ albums! 🙂
Check if your Internet Service Provider Degrades Photo Quality
Below are two photos linked to a single file on my server. The first one you see should be in its highest quality, while the second one should be exactly the same as the first IF YOUR INTERNET SERVICE PROVIDER DOES NOT DEGRADE IMAGE QUALITY by routing the image through a bandwidth optimization server. If the second image appears inferior in quality, then the ISP you are using is doing something to save on bandwidth.
Here is another test reference:
Pay attention to details like the edges of the tree and the bird, and also my signature. Check whether you can see some pixelized portions and discolorations.
To resolve this issue, host your photos on services that offer Secure Sockets Layer (SSL) for your site. This way, your photos will be encrypted as they travel from your host server to your viewer’s browser, and ISPs won’t be able to ‘touch’ or degrade your images to save on bandwidth. So far, this is the only way I can think of to circumvent this issue.
Hope this helps!
Thoughts on Designing Databases for SQL Azure – Part 3
In the first article of this series, I raised an issue considered to be one of sharding’s oddities: what would one do should a single tenant occupying a shard exceed that shard’s capability (e.g. in terms of storage and computing power)? The scenario I was referring to in the first article was that I opted to choose “country” as my way of defining a tenant (or sharding key). In this iteration, I’ll once again attempt to share my thoughts on how I would approach the situation.
Off the bat, I’d probably blurt out the following when asked how to solve this issue:
- Increase the size of the shard
- Increase the computing power of the machine where the shard is situated
In on-premise sharding implementations, throwing in more hardware is easier to accomplish. However, doing the above when you are using SQL Azure is easier said than done. Here is why:
- Microsoft limits SQL Azure’s database sizes to 1GB, 5GB, and 50GB chunks.
- The computing instance where a shard can reside in SQL Azure is finite as well.
I have heard unverified reports that Microsoft allows, on a case-to-case basis, increasing a SQL Azure database’s size beyond 50GB, and probably situating a shard on some fine special rig. This, however, leads to the question of how far Microsoft would allow each and every SQL Azure subscriber to avail of such special treatment. And it could probably cost one a fortune to get things done this way.
However, there are various ways to circumvent the issue at hand without getting special treatment. One can do the following:
- You can tell your tenant not to grow big and consume much computing power (Hey … Flickr does this. :P)
- You can probably shard a shard. Things can really get complicated, but any time of the day, one can chop a shard into pieces. Besides, at this point, you have probably eaten sharding for breakfast, lunch and dinner.
So how does one shard a shard?
In the first part of this series, I used sharding a database by country as an example. To refresh, here is an excerpt from the first article:
Server: ABC
Database: DB1
Table: userfiles

| userfiles_pk | user | uploaded file | country_code |
| --- | --- | --- | --- |
| 1 | john | file1 | grmy |
| 2 | john | file2 | grmy |
| 6 | edu | file1 | grmy |

Server: ABC
Database: DB1
Table: country

| country_code | country |
| --- | --- |
| grmy | germany |
| can | canada |
| ity | italy |

Server: CDE
Database: DB1
Table: userfiles

| userfiles_pk | user | uploaded file | country_code |
| --- | --- | --- | --- |
| 3 | allan | file1 | can |
| 4 | allan | file2 | can |
| 5 | allan | file3 | can |
| 9 | jon | file1 | can |
| 10 | jon | file2 | can |
| 11 | jon | file3 | can |

Server: CDE
Database: DB1
Table: country

| country_code | country |
| --- | --- |
| grmy | germany |
| can | canada |
| ity | italy |
In the sample above, the first shard contains data related only to grmy (germany) and the second shard contains data related only to can (canada). To break a shard further into pieces, one needs to find another candidate key for sharding. If there is none, as in the case of our example, one should create one. We can think of splitting up a country by introducing regions from within (e.g. split by provinces, by cities, or by states). In this example, we can pick city as our sharding key. To illustrate how, see the following shards:
Shard #1
Server: ABC1
Database: DB1
Table: userfiles

| userfiles_pk | user | uploaded file | country_code | city_code |
| --- | --- | --- | --- | --- |
| 1 | john | file1 | grmy | berlin |
| 2 | john | file2 | grmy | berlin |

Shard #2
Server: ABC2
Database: DB1
Table: userfiles

| userfiles_pk | user | uploaded file | country_code | city_code |
| --- | --- | --- | --- | --- |
| 6 | edu | file1 | grmy | hamburg |
By deciding to further subdivide a country into cities, where each city becomes a shard, the following statements would be true:
- The new sharding key is now city_code.
- Our shard would only occupy data related to a city.
- Various shards can be in the same server. Shards don’t need to be in separate servers.
- The increase in the number of shards would also increase the amount we spend on renting SQL Azure databases. According to Wikipedia, Germany alone has 2,062 cities; that is some serious monthly spending. However, this example is just for illustration purposes, to convey the idea of sharding. One can always pick or create the most practical and cost-effective key for further sharding, to address the issue of going beyond a shard’s capacity without the spending overhead caused by poor design choices.
- At a certain point in the future, we might exceed a shard’s capacity once again, breaking our design.
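The routing side of this design can be sketched as a simple lookup: given a row’s sharding keys, resolve which server and database holds its shard. This is an illustrative Python sketch only; the shard map contents and connection strings are invented to match the examples above, not actual SQL Azure API code:

```python
# Hypothetical shard map built from the examples above: Germany has been
# subdivided by city, while Canada still fits in one country-level shard.
SHARD_MAP = {
    ("grmy", "berlin"): "Server=ABC1;Database=DB1",
    ("grmy", "hamburg"): "Server=ABC2;Database=DB1",
    ("can", None): "Server=CDE;Database=DB1",
}

def resolve_shard(country_code, city_code=None):
    """Return the connection info for the shard holding this tenant's data."""
    # Prefer the finer-grained (country, city) key; fall back to the
    # country-level shard for countries that haven't been subdivided.
    for key in ((country_code, city_code), (country_code, None)):
        if key in SHARD_MAP:
            return SHARD_MAP[key]
    raise KeyError(f"no shard registered for {country_code}/{city_code}")
```

In a real system this map would live in a central lookup database rather than in code, so that splitting a shard further only means updating the map, not redeploying the application.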
**************************************************
Toto Gamboa is a consultant specializing in databases, Microsoft SQL Server, and software development, operating in the Philippines. He is currently a member and one of the leaders of the Philippine SQL Server Users Group, a Professional Association for SQL Server (PASS) chapter, and is one of Microsoft’s MVPs for SQL Server in the Philippines. You may reach him by sending an email to totogamboa@gmail.com
Wild Bird Photography – Oriental Skylark
One of the most uncommon yet conspicuous birds out there in the field is the Oriental Skylark (Alauda gulgula). Conspicuous in the sense that when it gets excited, it stands tall and proud with its crest raised despite its minute size, and gives a very loud, distinct shrill. However, you often get to see these birds in photographically boring dry open fields, as this species does not perch on bushes or trees. Under these circumstances, shooting the bird at an angle higher than its eye level, you will get the boring ground as its backdrop (as shown below).
In the few times I have encountered this species, I have often wished to capture it on camera properly. What I want is a creamy bokeh/background. To achieve this, you can’t shoot from an elevated angle (as shown above), unless the bird is on the edge of an elevated mass that would eliminate the ground as the backdrop or make the ground far enough away not to be included in your thin depth of field. Instead, you need to get down on all fours. This means you need to be in a prone position to achieve the effect (as shown below), which enables you to avoid shooting the ground as the bird’s backdrop.
Shooting the bird on the ground from a prone position gives one an eye-level shot where both the foreground and background seem to merge and melt, leaving the subject greatly emphasized. Below is a sample of this effect.
Shooting Disclosure
- Gears: Canon 50D, EF 400mm f5.6L, 2-Pound Rice Bag
- Settings: Shot @ f5.6, 1/640″, ISO320, Spot Metering, Auto White Balance, Aperture Priority, Cropped 16:9 to 3.6MP, RAW, Handheld, Prone Position
- Lighting: 8:36am Light, Overcast
- Others: Some very minimal sharpening and color vibrancy adjustments in Photoshop
Wild Bird Photography – Dollarbird
In wild bird photography, gear and skills really do matter. Both need to go hand in hand to produce the best snapshot of a beautiful avian subject. In the Philippines, both you and your gear will often be tested. For decades, our subjects here have been hunted everywhere, so birds here avoid human encounters as much as possible. This cautiousness adds to the fun and challenge of wild bird photography in the country. Because of it, 400mm lenses are considered short. You have to exert more effort in executing the shot to produce decent photographic captures at par with those who have bigger, longer, and generally better equipment. However, one cannot fret over what one has. Working a bit harder with what you have does the trick and solves some of the problems.
Compensate
Knowing that I have a short lens at only 400mm (Canon EF 400mm f5.6L) with a fairly slow speed (f5.6) and a camera known to produce noisy images at high ISOs, I need to do a lot more to compensate for my limitations. In this article, I detail how shooting a very difficult scene using a 2X teleconverter on a 400/f5.6 lens mounted on a Canon EOS 50D can still be accomplished with satisfactory results. Here is how:
Below is an uncropped 800mm shot of a Dollarbird (Eurystomus orientalis) from more than 30 meters away. I don’t normally use a 2X teleconverter, but I was tempted to because the bird perched motionless for minutes after I got several shots with just the bare 400mm lens mounted on my cam, and because of the sheer distance between me and the bird. Getting nearer was also impossible.
Here is a cropped version of the same image, cropped down to 2.8 megapixels.
Shooting Disclosure
- Gears: Canon 50D, EF 400mm f5.6L, Kenko Pro-DG 2X, Manfrotto 755X + Gimbal Head, 2-Pound Rice Bag, Remote Shutter
- Settings: Shot @ 800mm, f16, 1/40″, ISO320, Evaluative Metering, Auto White Balance, Full Manual, Cropped 16:9 to 2.8MP, RAW, Liveview, Remote Shutter
- Lighting: 8:42am Light, Overcast
- Others: Some very minimal sharpening and color vibrancy adjustments in Photoshop
Often, we hear discouraging comments on the use of 2X teleconverters. A 2X definitely degrades image quality, even when used with large-aperture wildlife lenses such as f2.8s and f4s. Using a 2X on an f5.6 lens would surely raise eyebrows. But when you are limited to shooting with what you’ve got, and in my case I only have a Canon EF 400mm f5.6L, you need to do a lot of compensating to get decent output from a lowly setup with a 2X. A couple of requirements for compensating effectively: you need to know how a photograph is made, and you need to know your gear’s capabilities and limitations very well.
In the above photo of the Dollarbird, despite the constraints I had during the time of the shoot, I still managed to get a decent shot. Here are the key ingredients in executing this shot:
- LiveView. Knowing that using a 2X on an f5.6 will force you to go full manual, using LiveView is one very effective technique. Of course, this only applied because the Dollarbird lingered long enough for me to set things up. Using LiveView in this scenario, one gets AUTOFOCUS via the contrast-detection method; my Canon EOS 50D allowed me to do this, and some other cameras would probably do the same. LiveView also allows you to visually zoom in to 10X on your LCD to get better confirmation that you have focused well on the subject. In the 50D, you can have both features work for you: you get to zoom in to your subject and get auto-focus.
- 2-Pound Rice Bag. At 800mm, even minute shaking is very visible. By increasing your LCD view to 10X (via LiveView), not only is the shake visible, it can make you dizzy :P. Putting weight on your rig dampens the effects of this shake. It also stabilizes your rig faster, so you get to shoot at the soonest possible time. In my case, I have this useful 2-pound weight functioning as a poor man’s image stabilizer. All I have to do is place the weight on top of the lens at its center of balance.
- Remote Shutter. Without a remote shutter, this scene is hard to execute. One can use the cam’s timer, though that is cumbersome.
The above key ingredients allowed me to capture this scene. Though it won’t surpass the quality of a shot using a bare lens at the same focal length, the result is decent enough to merit a space on my hard drive. Without any one of the three, it would be very hard to get a decent output from this scene. Compensating can do wonders, especially for photographers who don’t have those desirable longer and faster lenses and better camera bodies. The same techniques used here can be applied with better gear, of course. 🙂
Thoughts on Designing Databases for SQL Azure – Part 2
In the first article, I showed an example of how a database’s design could impact us technically and financially. And sharding isn’t all just about splitting up data; it also brings to the table a group of terrible monsters to slay. There are a lot of concerns that need to be considered when one attempts to shard a database, especially in SQL Azure.
NoSQL, NoRel, NoACID
In breaking things apart, one is bordering on clashing religions. One monster to slay is the issue of ACIDity. People discuss NoSQL, NoRel, and NoACID as one of the trends out there, and some even swear that these approaches are better than SQL. In my case, I prefer to call it NoACID, and it is not by any means more or less than SQL. I have NoACID implementations in some projects I have done, and I love SQL. To simplify, I’ll lump these trends together as NoX, as they commonly attempt to disengage from the realities of SQL.
For me, NoX is not a religion; it is simply a requirement. The nature of the app you build will dictate whether you need to comply with the principles of ACID (Atomicity, Consistency, Isolation, Durability). If ACID is required, it is required regardless of your data and storage engine or your preferred religion; you have to support it. Most cloud apps that we see, like Google and Facebook, could probably have ACID absent from their requirements list. Google is primarily read-only, so it does make sense to have data scattered over various servers on all continents without the need for ACID; by nature, ACID in this regard can be very minimal or absent. Facebook, on the other hand, is read/write intensive, and seems to be driven by a massive, highly sophisticated message queuing engine. Would ACID be required in Facebook? I am not quite sure about Facebook’s implementation, but the way I look at it, ACID can be optional there. ACID may well be present in operations concerning only one tenant, in the case of an FB account; outside of this, the absence of ACID could probably be compensated for by queuing and data synching.
If Facebook and Google decided to require ACID, they could be facing concerns about locking a lot of things, and while things are locked, latency could be one of the consequences. It is therefore very important to lay out firsthand whether ACID is a requirement or not. For a heavily transactional system, a sharded design presents a lot of obstacles to hurdle. In SQL Azure, this is even harder, as SQL Azure does not support distributed transactions the way we are used to with SQL Server. This means that if your transaction spans multiple shards, there is no simple way to do it, and ACID can be compromised. SQL Azure does, however, support local transactions, so you can definitely perform ACIDic operations within a shard.
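To illustrate the local-transaction guarantee, here is a minimal sketch using Python’s sqlite3 purely as a stand-in for a single shard (against SQL Azure itself you would use T-SQL’s BEGIN TRAN/COMMIT): as long as every statement in the unit of work runs on one shard connection, the transaction stays atomic.

```python
import sqlite3

# sqlite3 stands in here for ONE shard; the point is only that all
# statements in a unit of work hit a single shard connection.
shard = sqlite3.connect(":memory:")
shard.execute(
    "CREATE TABLE userfiles (userfiles_pk INTEGER, username TEXT, filename TEXT)"
)

# A local transaction: both inserts commit together.
with shard:
    shard.execute("INSERT INTO userfiles VALUES (1, 'john', 'file1')")
    shard.execute("INSERT INTO userfiles VALUES (2, 'john', 'file2')")

# A failing unit of work rolls back entirely; no partial write survives.
try:
    with shard:
        shard.execute("INSERT INTO userfiles VALUES (3, 'john', 'file3')")
        raise RuntimeError("simulated failure mid-transaction")
except RuntimeError:
    pass  # the connection context manager rolled the insert back

count = shard.execute("SELECT COUNT(*) FROM userfiles").fetchone()[0]
```

The gap described above is exactly what this sketch cannot show: a unit of work spanning two such connections (two shards) gets no equivalent all-or-nothing guarantee from SQL Azure, and that is where sharded designs get hard.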
To be continued…
Wild Bird Photography in the Philippines – Scaly Breasted Munia
One of the most common beautiful birds you will encounter in the Philippines is the Scaly-breasted Munia (Lonchura punctulata). It can be found almost anywhere in the Philippines. This species can also be one of the easiest to photograph, considering that it often stays close to human habitation. Where there are rice fields, you are guaranteed to see this bird.
Below is a photo I got on one of our sorties in San Juan, Batangas. I was patiently waiting for waders at a nearby pond when, a few meters from where I hid, this fellow showed up. It probably got curious about my presence, as it stayed a while and allowed me to get a bit closer and take some shots that I like.
Shooting Disclosure
- Gears: Canon 50D, EF 400mm f5.6L, Manfrotto 755X + Gimbal Head
- Settings: Shot @ 400mm, f5.6, 1/50″, ISO100, Spot-Metering, Auto White Balance, Aperture-Priority, Cropped 16:9 to 3.2MP, RAW
- Others: Some very minimal sharpening and color vibrancy adjustments in Photoshop
In mid-day harsh light, I usually go ISO100 to avoid getting a very fast shutter speed, so more light gets captured with the shutter open much longer. With ISO100, I also avoid some unwanted noise, and it enhances the creaminess of the background blur. Though it is hard to pull off and increases the risk of a blurred shot, I prefer to shoot at around 1/40 sec to 1/200 sec, as I almost always get better color in this range (provided I am on a sturdy tripod). I don’t know the technical reason, but I suspect that the more time I allow the camera to absorb light, the better the output I get. This is why I always try to bring the ISO down as much as I can, for as long as the shutter speed stays within my preferred range. With constant practice, blurring caused by shake and slow shutter speeds can be avoided. The bird allowed me to focus on its eye as it gave me a nice glance, as shown in the photo. I also got lucky that it perched on a really photogenic decaying branch with a good green background in the distance.
The bird gave me a few shots, but the above photo is the one I like most. It flew away the moment I attempted to get closer.