Photography: Getting Started

Some of us have already gotten our DSLRs early from Santa and are probably wondering how all the switches, buttons, levers, and settings work together to get that perfect photo. Since we now want to be in control of our cameras, I have come up with a simple guide to the things that matter most in capturing a photo. Whether you make the adjustments yourself or let the camera’s computer do it for you, it is essential to know some fundamentals of photography. Simply put, all we need to know for now are the basic elements that matter most in producing a photo.

Let’s get started!

Exposure is the process of controlling the amount of light that passes through our lens to our DSLR’s sensor. With too much light, we get an over-exposed photo: one that is just too bright, and in some cases washed out to almost pure white. On the other hand, with too little light, we get an under-exposed photo, which usually ends up dark to almost black. With the right amount of light, we get photos that look just as the scene did in our viewfinder or on the LCD screen.

Controlling the Light 

We might be wondering now: how do we control the amount of light that goes through our lens to our camera’s sensor? There are two ways to do this, and each involves controlling one important element:

  • Aperture. The aperture is basically a hole or opening in our lens that light passes through. When we talk about the aperture, we usually refer to how big or small the opening is, and by design, the size of the opening can be varied as we desire. In lens jargon, apertures are measured as shown in the following:
     

    Aperture    Size of hole
    f3.5        large
    f4          medium
    f5.6        small
    f10         smaller

You may have noticed that the larger the number after the “f”, the smaller the hole. That may seem like a confusing measurement convention, but that is how things are in photography and we just have to accept it. Humans always have a knack for confusing others.

  • Shutter / Shutter Speed. A shutter is a simple mechanism that opens to allow light coming through the lens to reach the sensor, and closes to block it. When our camera is about to take a photo, the shutter starts out closed, opens to let light in, then closes again. The time the shutter stays open can be long or short and is measured in terms of speed; thus we have a factor known as shutter speed.
    Shutter Speed (in seconds)        Speed
    1/200s or 200th of a second       faster
    1/60s or 60th of a second         fast
    1/30s or 30th of a second         slow
    1/5s or a 5th of a second         slower

By knowing how to control these two elements, (1) the aperture and (2) the shutter speed, we now have the means to control the amount of light that goes through our lens to the camera’s sensor. We just have to remember the following:

  • The larger the hole, the more light will come in.
  • The longer the shutter stays open before it closes, the more light will come in.
  • If the scene we are photographing is dark, we either set our aperture to its largest setting so more light will come in, or slow down our shutter speed so the shutter stays open longer. We can also do both.
  • If the scene we are photographing is too bright, we can do the opposite.

By correctly mixing the two, we will achieve correct exposure. 
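
To get a feel for how the two trade off, here is a rough worked example. The numbers are nominal (real lenses and shutters round them slightly), but the idea holds: the light reaching the sensor is roughly proportional to the shutter time divided by the square of the f-number, so different aperture and shutter speed combinations can let in about the same amount of light.

    light let in  ~  shutter time / (f-number x f-number)
    f4 at 1/60s    ->  (1/60) / 16 = 1/960
    f5.6 at 1/30s  ->  (1/30) / 32 = 1/960   (about the same exposure)
    f4 at 1/30s    ->  (1/30) / 16 = 1/480   (twice the light, one stop brighter)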

To do this ourselves on our DSLR, we set exposure control to Manual and keep adjusting the aperture and shutter speed settings while shooting until we are familiar with the correct settings for the intensity of light in a certain scene. Of course, if you feel this is too much work, you can let your camera’s computer figure out the right aperture and shutter speed settings by using Program or Auto mode. There are other means to help us get correct exposure, but for now, this is probably all we need to know about how our camera works at its core.

Hope this helps. More to come!

Happy shooting!


Rationalizing Cloud Computing

For the past couple of years, people have been hot on the cloud computing bandwagon. As with the birth of any new trend, people tend to fall into misconceptions that eventually get them burned. Some plainly embrace something without bothering to find out what is in it for them.

As a co-owner of a very small ISV trying to grapple with trends in computing and to make things work for clients and stakeholders, I always submit to the pressure of looking into what a new trend can offer, cloud computing included. During this time, I have looked closely into what cloud computing brings to the table and how my company can take advantage of it. I have looked into Azure and non-Azure offerings and have attempted to assemble a design for our company’s products that can take advantage of the good stuff in the cloud.

Though my company is definitely a cloud computing enthusiast, with some efforts actually under way to take advantage of the cloud, as far as I am concerned it won’t be for everybody. Recently, though, there has been a rise in people investigating their possibilities with the cloud. This is maybe due to Microsoft betting the farm on Azure, or perhaps there are just too many misconceptions peddled here and there. In most cases, I see wrong assumptions being tossed into every cloud discussion I have had. Clients too are asking me about the possibility of being in the cloud based on wrong assumptions fed to them.

But before I get into those misconceptions, I would highlight the traits that made me and my company embrace cloud computing:

  • Cloud is the great equalizer. For very small ISVs like my company, I see the cloud as leveling the playing field against the big guys, at least on the issue of having to spend upfront on system infrastructure to offer something to my market. For under $100 a month, a small ISV can start big.
  • Reach. My company would be able to increase reach without much overhead.
  • Scalability. The fact that I would be able to scale as the need arises makes cloud computing definitely for companies like mine.
  • Administration. Being a very small ISV, system administration tasks such as deployment, support, and maintenance take too much of a toll on our internal resources. An easily accessible, uniform environment like the cloud would allow my company to increase the number of clients we attend to without adding too much stress on our internal resources.

There are other niceties that made me embrace cloud computing, but the items mentioned above are the major factors that I believe can only be achieved through the cloud.

As convinced as I am that cloud computing is one area my company should invest in, I am also convinced that cloud computing is definitely not for everybody and definitely not for every situation out there, at least in its current incarnation. I will list down the five misconceptions I most often encounter. My rationalization of these misconceptions is based on what I know from the efforts my company is doing in the cloud, mostly Azure-based, plus a few non-Azure ones.

Top 5 Misconceptions I Often Encounter

  1. When I move my existing application from on-premise to the cloud, my app will become scalable. Well, NOT without rewriting your app to scale out and be cloud-ready. In the cloud, scaling is not automatic. If your application is not designed to scale out, it will not scale out in the cloud. To scale out, you have to know how to partition your storage across multiple storage instances, and those instances can be spread geographically across continents. One has to start from scale-out design patterns, probably from scratch. You are lucky if your current application is already designed to scale out.
  2. I can easily move my current on-premise application to the cloud. Contrary to the peddled notion that you can easily move a current on-premise (legacy) application to the cloud, it really depends on what you expect to gain from the cloud. Azure, for example, supports moving legacy applications to the cloud by using virtual machines to host them (e.g. hosting a Win Forms based app). But this approach, for me, is inefficient unless all you want is to move from on-premise to the cloud to get rid of system administration issues while ignoring other implications such as performance, latency, and security. You would also have to figure out new issues as simple as backing up your data in the cloud. And if your legacy app relies on features available in on-premise environments, you might encounter problems running it in the cloud, as current generation clouds do not support all the features that are staples on-premise.
  3. I have been designing on-premise apps for years; that is enough skill to get me to the cloud. It is, if you were designing on-premise apps using technologies and patterns similar to those available in the cloud. If not, you have to look at the cloud differently. If you want to write apps for the cloud, you have to think cloud. Scaling is probably the one issue that will force you to think cloud. If we are used to designing our apps with a single-instance mindset, it is about time to think differently and think multiple. Of course, no one is stopping us from writing apps for the cloud that way, but that is not how real cloud apps are supposed to work. A single-instance design won’t scale even if you run it on a current generation cloud. In SQL Azure, we have a 10GB database size limit. If we exceed that ceiling, what would our apps do? This seemingly easy-to-answer question could probably discourage anyone from embracing the cloud.
  4. I just have to design apps like I used to and deploy them in the cloud, and the technology behind it takes care of the rest. This can’t be. If you were into scale-out designs by now, chances are you wouldn’t think of simply writing an app and deploying it over the cloud; chances are you have already deployed a cloud app, or you didn’t realize that your design approaches are very much like those implemented over the cloud. For most, though, I doubt that is the case. Most on-premise apps are designed not to scale out but to scale up. For example, most SQL Server implementations just take advantage of new hardware to scale as load is added. Intrinsically, Microsoft SQL Server is not designed to automatically load balance, say, a single query statement. One has to learn how to partition tables behind a single view, with the underlying partitions spread across a federation of servers, so SQL Server can spread the work of a query (see the sketch after this list). This is not supported in SQL Azure, though. And even if it were supported in SQL Azure in the future, it is not the scaling wonder the cloud provides; your application still sees only a single instance of a view. With today’s multi-million users accessing a single web application at one point in time, you can only scale so much. With cloud computing, your scaling possibilities are like crazy: you don’t have limits in terms of servers, CPUs, or storage. Having resources like this forces you to rethink the way you design your applications. The cloud isn’t about the technologies behind it. For me, it is one big design consideration. It is about the way you design your applications to take advantage of the seemingly unlimited resources found in the cloud.
  5. Any app can or should be placed in the cloud. No one will stop you from placing anything in the cloud, regardless of how it ends up. However, there are applications that are better off outside the cloud. For example, if you have an app with one massive synchronous process that you think has no way of being broken down into several pieces, you might as well stick to scaling up on premise (unless your cloud offers hybrid services that let you scale up with a very powerful server). For apps requiring heavy information security, if you aren’t comfortable with the notion that someone from your cloud provider might get into your highly secured data, you might as well keep things on premise. There are plenty of other applications that may not be practical to place in the cloud.
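
To illustrate the point in item 4, here is a minimal sketch of the kind of SQL Server distributed partitioned view I am referring to. The table, database, and linked-server names are made up purely for illustration; the idea is simply that each member table holds one slice of the data, enforced by a CHECK constraint, and a view glues the slices together with UNION ALL:

    -- Member table on the local server; a sibling table for 2010 lives on another server.
    CREATE TABLE dbo.Orders_2009
    (
        OrderID    int NOT NULL,
        OrderYear  int NOT NULL CHECK (OrderYear = 2009),
        CustomerID int NOT NULL,
        CONSTRAINT PK_Orders_2009 PRIMARY KEY (OrderID, OrderYear)
    );

    -- The partitioned view: the application queries dbo.Orders as if it were one table.
    CREATE VIEW dbo.Orders
    AS
        SELECT OrderID, OrderYear, CustomerID FROM dbo.Orders_2009
        UNION ALL
        SELECT OrderID, OrderYear, CustomerID FROM Server2.SalesDB.dbo.Orders_2010;

Notice that the application still sees a single view over a federation of servers, which is exactly the point above: this is stretching scale-up, not the kind of scale-out design the cloud asks for.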

It is very important that the cloud is understood clearly before everyone gets excited. While I am embracing it, I am welcoming it with a cautious attitude. Being relatively new, the cloud comes with too many considerations to think through and too many questions that don’t yet have clear answers.

**************************************************
Toto Gamboa is a consultant specializing in databases, Microsoft SQL Server, and software development, operating in the Philippines. He is currently a member and one of the leaders of the Philippine SQL Server Users Group, a Professional Association for SQL Server (PASS) chapter, and is one of Microsoft’s MVPs for SQL Server in the Philippines. You may reach him by sending an email to totogamboa@gmail.com

Wild Bird Photography in the Philippines – Part 2

This is, so far, a three-part series on what wild bird photography is to me. I will probably evolve this series over time to keep it as current as possible.
Wild Bird Photography in the Philippines – Part 1
Wild Bird Photography in the Philippines – Part 3

Previously, on Wild Bird Photography in the Philippines – Part I, I discussed how I got into it and the things one needs to have to start photographing wild birds.

In this article, I’ll list down where you can often find birds. And you bet it right: you have probably been thinking it is always in a zoo. 🙂 You may have noticed, though, that I prefixed bird photography with the word ‘wild’ in my previous article, and this is because bird photography can also include photographing birds in captivity. It may be cute to photograph birds inside a cage, but wild bird photography is for the real bird photographers.

WHERE THE BIRDS ARE

The next most asked question regarding this interest is where I get to photograph these birds. In the Philippines, there are over 600 species of birds you can shoot, and the number has been increasing with new sightings of species that don’t usually range in the country. Some are commonly seen, some seldom seen, and some have never been photographed. With the country’s 7,100+ islands, one can imagine how dispersed our avian friends are across this archipelago. Some birds can only be found on certain islands. For example, a Tiger Shrike (Lanius tigrinus), which can be found in some East Asian countries, has been recorded in the country only once, in Jolo, Sulu, sometime in 1887, according to A Guide to the Birds of the Philippines. So if one is really bent on taking chances and going on a Tiger Shrike expedition, one should go to Jolo, Sulu.

So far, I have only been to a few places to do some serious bird photography. The farthest north I have been is Pagudpud, Ilocos Norte, and the farthest south is Negros Occidental. I am hoping to go to more birding sites as I progress.

So where do I find birds?

There are lots of common and unusual places where birds can be found. You just have to know and find the reasons why they are there. First, there are different types of birds, and each type has its specific habitat. For example, there are what we call shorebirds, and they are usually found along our shorelines. Second, we have to know why they get attracted to certain places. It could be that there are lots of food and water sources in the area, or they may feel secure in one particular area.

On the other hand, knowing when birds aren’t in an area can help a lot too. If the area doesn’t have any fruit trees, you know there is less chance of seeing a fruit-eating bird. If the area allows rampant hunting, some birds will probably leave once they feel threatened. In most cases, though, they just get shot, so you won’t find birds there.

So, back to our question, where do we find birds?  I’ll try to list down the usual places where one can find birds.

  1. Where There Are Trees/Vegetation. It is almost guaranteed that where there are significant clusters of trees and vegetation, there will be birds. Trees can be found almost everywhere, but seeing them in significant clusters is becoming an issue nowadays due to unnecessary tree-cutting. Forests still represent an ecosystem where trees abound, and one can go to various types of forests to photograph birds: there are lowland forests, forests at higher altitudes, and forests that dot our coastlines. Some are natural and some are man-made. All these types of forests provide ecosystems where different kinds of birds can thrive.
  2. Near Bodies of Water. Other potential grounds for finding birds are areas near bodies of water such as streams, rivers, lakes, and shores. Like any other living thing on this planet, birds need water. Even small potholes of water offer birds some comfort. The likelihood of seeing a good number of bird species increases when these bodies of water are near trees and vegetation, or near food sources.
  3. Near Food Sources. Other good sites where birds congregate are areas with enough food for them to thrive. Some birds feed on nectar, so where there are flowers, they are also there. Some birds eat insects, so where there are lots of insects, birds could be there. Some birds love fish, and one knows where to find those. You just have to know a bird’s diet and find the places where it can fatten its belly.
  4. Where They Can Build Their Nests. Now this is a more difficult kind of place to find, as most birds hide their nests from view. Some build their nests on rocks, some on the ground, some on the sand, some on a tree branch, and some even build on man-made structures such as tall buildings. Some birds don’t even build their nests in the country. 🙂 One needs to study birds more closely to be able to locate their nests.
  5. Where Humans Are. Some birds have adapted well to people. And where people go, they go there too. These birds usually scavenge human leftovers and wastes.

One can find birds virtually everywhere, but one needs to know certain bird characteristics to be effective in finding specific species.

Wild Bird Photography in the Philippines – Part 1
Wild Bird Photography in the Philippines – Part 3

Check out my album of Philippine birds!

Why Am I Still Stuck with T-SQL?

Not a day passes without something new coming out of software companies like Microsoft, and it has been a challenge for application developers like me to keep up. I happened to start my all-Microsoft stack effort during the heyday of Visual Basic 3 and Access. Prior to that, I was a Borland kind of kid, mesmerized at how neat my code looked when printed on reams of continuous paper.

I was really fast then at absorbing new software technologies for application development and related products, but I am no longer the kid I used to be. I am now a slow, slumbering oldie when it comes to absorbing software development technologies. I actually envy the guys who are using technologies that just came out of the baking oven. They seem so smart, figuring out new things so fast. Isn’t that great?

And every time I get to mingle with these developers, they wonder why I am still stuck with Transact-SQL (T-SQL). Every time... and it never fails. It makes me wonder why I am stuck with T-SQL when there are a lot of new alternatives, and it begs the question of whether I am still relevant with the times. Well, let’s see.

Am I in? Or Out?

My first serious brush with T-SQL was when I was contracted to develop a school library system. I thought of using VB3 + Access, but being a fast kid then, I opted for the latest and ended up using VB4 and SQL Server 6.5. I was mostly writing VB code, using DAO/ADO to connect to my databases, and still using client-side cursors when going through my tables here and there. With my Clipper-heavy and Access background, I had trouble adapting to SQL’s set-based processing and mindset. In no time, though, I was able to absorb T-SQL and began moving some of my data manipulation code to stored procedures.
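
To give an idea of what that looked like, here is a minimal sketch of the kind of set-based stored procedure I started writing back then. The table and column names are hypothetical, not the actual library schema:

    -- Return the copies of a title that are currently available for loan.
    CREATE PROCEDURE dbo.GetAvailableCopies
        @Title varchar(200)
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT b.BookID, b.Title, c.CopyNumber
        FROM dbo.Book AS b
        INNER JOIN dbo.BookCopy AS c ON c.BookID = b.BookID
        WHERE b.Title LIKE @Title + '%'
          AND c.IsOnLoan = 0;
    END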

When VB5 came out, I decided to upgrade the school library application with an improved database structure and all data manipulation in stored procedures. This time, there was no more non-T-SQL code for my data, and I was able to re-use some T-SQL code from the app’s previous version.

VB6 came out and Microsoft touted a better data access component in RDO. Around that time, I was able to get more libraries to use my system, so I virtually kept up with anything new from Microsoft. I upgraded the front-end portion of my library system while re-using all my stored procedures.

Shortly after, the Web’s irresistible force dawned on me, so I took up ASP and VBScript and migrated a portion of my library application to face the web. During this time, I also upgraded to SQL Server 7.0. I had some inline SQL code that was prone to SQL injection, but I was able to retain all my stored procedures.

When .NET came out, I had my library system upgraded to an entirely new schema, platform and language (ASP.NET, Windows Forms, ADO.NET, SQL Server 2000/2005). This time, I hired somebody else to code it for me.

Never Obsolete

With all the changes I made to the library system, the only technology that remained constant was T-SQL. In most cases, I was able to re-use code. In all cases, I was able to take advantage of re-using my T-SQL experience... all this while I managed to bulk up my knowledge of T-SQL.

VB4 is gone; my T-SQL is still here. VB5 is gone; my T-SQL is still here. VB6 is gone; my T-SQL is still here. ASP is gone; still my T-SQL is here. DAO, ADO, and RDO are gone but my T-SQL remained. I moved from VB to C#, yet I am still using T-SQL.

Today we have ADO.NET, and we have LINQ. Soon they will be gone (I have heard LINQ to SQL is now deprecated). And tomorrow I can probably move from ASP.NET to Silverlight or something else new, or from ASP.NET to PHP, but I have the feeling I will still be using T-SQL. Microsoft is even going back to something as basic as tables and queues with Azure, but it can’t ignore T-SQL; thus we have SQL Azure.

Betting My Future

I am inclined to think that my investment in T-SQL has already paid back immensely, and I am still reaping the benefits. With the advent of cloud computing, and with various players offering various cloud technologies and services, I can’t help trying to identify which parts of the cloud computing stack will survive the onslaught of constant change and manage to stay relatively stable. I am afraid some of our current investments in other technologies won’t be as useful in the cloud, but I am betting my future once again on SQL.

What about you?


Data and Its Impact on Database Design & Development

I have always relied on realistic test data every time I develop my databases. Wearing an analyst’s hat while also being an application designer/developer has made it easier for me to improve the way I gather realistic data for validating my understanding of the things I do in connection with application development.

Having a good set of realistic data has always contributed greatly to the success of the applications I have designed and developed. It gives me a better understanding of the business aspect of whatever realm a software project covers. However, I have realized that it is not easy to come up with a good chunk of realistic test data for application development, and this is probably true for others as well. For one, we are not the intrinsic experts in most of the fields we deal with when we design and develop applications.

One: you can’t just make up a set of realistic data randomly and instantly. You need to understand the vast array of reasons, rules, components, variables, constraints, linkages, and circumstances to confidently fill in the blanks. You need full understanding, and the time to develop it. And you need to produce the data with as much variation as possible so future refactoring is kept to a minimum. Though this is no easy task, it is achievable to a certain degree.

Two: you need to be very familiar with the business aspect of the territory you are about to deal with. I say it is difficult because most of us are not natural experts in the fields we are assigned to; we are asked to understand decades-old, highly specialized, and fully refined business processes, and to come up with something within an unreasonable amount of time, usually against a deadline. For example, to build an application for hospital management, one needs to fully absorb how things flow through a hospital. I don’t have any idea how hospital operations work, so when someone asks me if I can build a system for one, I readily say that I still have to figure that out.

So in this article, I want to share my workflow on how I develop or acquire test data for use in any of my application development efforts.

Knowing The Business

I can’t overemphasize that this is the hardest part. Unless we are part of the business, things can be a lot harder than we think. I always allot plenty of time for discovering the business. You can immerse yourself, learn the lingo, mingle with a systems analyst who happens to know the business, hire a tutor (consultant), or work with someone knowledgeable enough to get you up to speed. Gather as many materials and people as you can to fill you in. Always remember: no one knows everything, not even an insider, and there are a lot of ways to look at something. Document and record conversations if allowed. I am assuming, though, that we all have the capacity to absorb all this.

In most of the efforts I have had, this phase occupies 20-30% of our effort, time, and resources, or a bit more. When designing a database, this is where I get to know my core tables and what shape and form they will eventually take. This is also when the first instances of test data get identified and created.

In my current area of expertise (I am currently hooked on developing solutions for the academic sector), knowing the business I specialize in has already taken me more than a decade, and I have yet to fully cover the entire array of entities, functions, circumstances, and issues one can find in a school environment. There is still so much to learn, and how schools operate is constantly evolving; their cycle of existence is just as dynamic as, if not more dynamic than, that of other types of organizations. It gets even more interesting when you deal not with just one school but with dozens.

Once you reach a certain comfort level where you can say something meaningful about the business, you need to go up a notch.

Simulate and Accumulate

Try to go over the business’s various cycles and processes. This time, simulate and accumulate data as you go. The amount of varied data you get depends on how good you are at simulating the business. In a hospital management system, how does the business start? What happens after it starts? When and how do things end? For example, business starts the moment a patient walks into the hospital. The hospital gets the name of the patient and some pertinent information like telephone number, address, medical status, etc. When the patient is ready for medical attention, he gets attended to by doctors. The hospital then needs to know who attended to the patient, what their findings were, and what they recommended. Will the patient be confined? Where? What happens afterwards: when is the patient sent home, and when will he settle his bills? Does the hospital’s business end when the patient is sent home?

As you go over the processes, you spot indicators of the pieces of data that will help you fully understand the business, in this particular example hospital management (the details raised in the walk-through above), and you fill them in with actual data:

                telephone number:  999-9999
                address: #1, malacanang street, cebu, philippines
                medical status: psychologically unstable
                doctor who attended to the patient: dr. arrovo, md
                findings: medical status confirmed to be true
                recommendations: for treatment
                place of confinement: psychiatric ward
                when is the patient sent home: June 30 2016
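
Turning those first bits of simulated data into a first cut of core tables might look something like the sketch below. The names and types are made up for illustration, and the speculation step that follows will reshape them:

    CREATE TABLE dbo.Patient
    (
        PatientID     int IDENTITY(1,1) PRIMARY KEY,
        LastName      varchar(100) NOT NULL,
        FirstName     varchar(100) NOT NULL,
        TelephoneNo   varchar(20)  NULL,
        Address       varchar(200) NULL,
        MedicalStatus varchar(100) NULL
    );

    CREATE TABLE dbo.Admission
    (
        AdmissionID        int IDENTITY(1,1) PRIMARY KEY,
        PatientID          int NOT NULL REFERENCES dbo.Patient(PatientID),
        AttendingDoctor    varchar(100) NULL,  -- a single field for now; revisited below
        Findings           varchar(500) NULL,
        Recommendation     varchar(500) NULL,
        PlaceOfConfinement varchar(100) NULL,
        DateSentHome       date         NULL
    );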

Speculate

After you get the initial chunk of data, challenge everything you have. Ask some more. Have it looked at by others. In all likelihood, the chunk of data you have will change shape after this exercise.

  • Do I need only one telephone number? Do I need to know if the number is permanent or temporary?
  • Do I need to get only one address?
  • Do I only need to keep track of the doctors who took care of the patient? How about nurses? How about non-medical personnel? Do I need to know what each of them did?

Why do we need to speculate? Why can’t we just follow what is stated in the customer’s requirements sheet? You see, in most cases customers don’t know what they want. They only realize they need something other than what they initially thought after we are done with everything, and by then we have often already deployed the final version of the application. Speculation reduces the need to refactor so soon.

Of course, we don’t have to build what is not asked of us. But it is very important that we cover our bases; we don’t want surprises after we have delivered. What is good about digging deeper is that you shield yourself from unwanted surprises near or at the end. Refactoring so soon means you have failed to some degree. The usual recourse is for development teams to fall back on the agreed requirements sheet to shield themselves from additional customer requests, and when the customer refuses to provide new funding for those requests, the applications we have developed suffer. In the end, no one wins. We might have delivered what was asked of us, but will our applications be as effective?

If we fail to ask all the questions we can, it becomes very difficult to adjust later. For example, if we stopped asking, or failed to ask, whether we only need to keep track of doctors, our database and application design would be different than if we had asked questions like the ones presented above. We could have allotted just a single field for the doctor’s name, only to realize later that we need more.
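
Here is a rough sketch of that difference, again with hypothetical names: the single doctor field above versus a separate table that can record any number of attendants, medical or otherwise, each with a role.

    -- Before speculating: one column on dbo.Admission.
    --     AttendingDoctor varchar(100) NULL
    -- After speculating: any number of attendants per admission.
    CREATE TABLE dbo.AdmissionAttendant
    (
        AdmissionID int NOT NULL REFERENCES dbo.Admission(AdmissionID),
        AttendantID int NOT NULL,          -- doctor, nurse, or non-medical personnel
        Role        varchar(50) NOT NULL,  -- e.g. 'Attending Doctor', 'Nurse'
        ActionTaken varchar(500) NULL,
        CONSTRAINT PK_AdmissionAttendant PRIMARY KEY (AdmissionID, AttendantID, Role)
    );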

The art of speculating isn’t easy though. It takes a while to develop your keen sense of the unknown but once you have it, you can’t get rid of it.

Re-Use

Another source of realistic data that might be available to us is data from old production systems. If we are lucky enough to have it, it can be one of the best sources of data out there. However, do take note that the reason we are here is to replace the old system. This means the data you get from this source might lack key pieces of information essential to building the new system. Worse, the data you get may be messed up. It is very important to be able to tell when the data you are using has captured most of what you need to come up with better designs.

So how do we use existing data from the system we are about to replace? My preferred approach is still to get to know the business and do some initial simulation and speculation so I can produce an initial structure for my database. The data build-up then comes from migrating the old data into the new structure, which almost always requires writing a data migration program. In my opinion, though, getting an excellent set of old production data to aid us in coming up with a new system is one of the most rewarding luxuries. You are in for a rude awakening if what you have instead is a messy, problematic set of old data (I’d probably refuse to consider it ‘old production data’); you might as well throw it away and start from scratch.
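
As a simple illustration of what such a migration program ends up doing, here is a minimal T-SQL sketch. The old and new schema names are hypothetical; the point is that old rows get mapped, trimmed, and cleaned into the new structure:

    -- Copy patients from the old system into the new structure,
    -- trimming stray spaces and turning empty strings into NULLs along the way.
    INSERT INTO NewHospital.dbo.Patient (LastName, FirstName, TelephoneNo, Address)
    SELECT LTRIM(RTRIM(p.surname)),
           LTRIM(RTRIM(p.firstname)),
           NULLIF(LTRIM(RTRIM(p.tel_no)), ''),
           NULLIF(LTRIM(RTRIM(p.home_address)), '')
    FROM OldHospital.dbo.patients AS p;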

By going through this exercise, you gain the following:

  • You would see how old data fits in your new database structure.
  • You will encounter fields in your new structure not being filled with data from the old. Then you can probably attempt to fill them based on your interpretation of the new system.
  • You will also encounter data from the old system that doesn’t seem to fit in the new. This happens when you fail to consider and include the same information in the new structure, or when you have misrepresented it there.

It is an awesome feeling when you have finally mapped the old structure to the new one. Having a realistic volume of a realistic set of data for application development is a very welcome treat and drastically speeds up a lot of things.

Generate

Today, there are equally effective tools that can help produce a set of realistic data in any volume we desire. You may also opt to write the data generator yourself. The hardest and most crucial part is coming up with a data generation plan that produces a realistic set of data. Of course, you still need to understand the business and produce the initial database structure, and then figure out how to generate data by coming up with a plan. The actual generation is a no-brainer: you just let the tool do its job and voila, you have instant data.
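
For example, a trivial hand-rolled generator in T-SQL might look like the sketch below, reusing the hypothetical patient table from earlier. A real data generation plan would vary names, addresses, and dates far more deliberately:

    -- Generate 10,000 dummy patients using a numbers CTE.
    WITH Numbers AS
    (
        SELECT TOP (10000)
               ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
        FROM sys.all_objects AS a
        CROSS JOIN sys.all_objects AS b
    )
    INSERT INTO dbo.Patient (LastName, FirstName, TelephoneNo)
    SELECT 'Surname-' + CAST(n AS varchar(10)),
           'Firstname-' + CAST(n AS varchar(10)),
           '999-' + RIGHT('0000' + CAST(n AS varchar(10)), 4)
    FROM Numbers;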

So that is about it, folks. If you have a different workflow, you might want to share it so I can improve mine further. 🙂