There is a lot of chatter around "Big Data" these days, particularly about the speed at which we need to process it. But we may be missing the real issue, putting the cart before the horse again: an abundance of available data does not mean we are ready to analyze it. We must first examine the care that was taken in collecting and collating that data. The old adage "garbage in, garbage out" is more relevant today than ever before.

If we rush to process data at "warp" speed, we may be setting ourselves up for a colossal nightmare. Processing bad data at the speed of light may help us go out of business at the speed of light. Before you develop a strategy for analyzing data that has been "thrown" onto the shelves indiscriminately, verify the integrity, or lack thereof, of that data. Your first strategy should be one that validates the data and then sustains standards to preserve that integrity.
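That "validate first, then sustain standards" idea can be sketched as a simple gatekeeping step that rejects suspect records before they ever reach an analysis pipeline. This is only an illustrative sketch; the field names (`id`, `amount`, `date`) and the specific rules are hypothetical examples, not anything prescribed above.

```python
# A minimal sketch of "validate before you analyze": run basic integrity
# checks on each record and quarantine anything that fails.
# The field names and rules here are hypothetical examples.

from datetime import datetime

REQUIRED_FIELDS = ("id", "amount", "date")

def validate_record(record):
    """Return a list of integrity problems found in one record (empty = clean)."""
    problems = []
    # Completeness check: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append(f"missing {field}")
    # Type/range check: amounts must parse as a non-negative number.
    try:
        if float(record.get("amount", "")) < 0:
            problems.append("negative amount")
    except ValueError:
        problems.append("amount is not numeric")
    # Format check: dates must match an agreed standard (ISO 8601 here).
    try:
        datetime.strptime(record.get("date", ""), "%Y-%m-%d")
    except ValueError:
        problems.append("bad date format")
    return problems

def partition(records):
    """Split records into (clean, rejected) so bad data never enters analysis."""
    clean, rejected = [], []
    for rec in records:
        issues = validate_record(rec)
        if issues:
            rejected.append((rec, issues))
        else:
            clean.append(rec)
    return clean, rejected

records = [
    {"id": "1", "amount": "19.99", "date": "2013-05-01"},  # clean
    {"id": "2", "amount": "-5", "date": "2013-05-02"},     # negative amount
    {"id": "3", "amount": "oops", "date": "05/03/2013"},   # bad amount and date
]
clean, rejected = partition(records)
print(len(clean), len(rejected))  # → 1 2
```

Running the checks at ingestion time, and logging what was rejected and why, is what turns a one-off cleanup into a sustained standard.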