Problem: Determine how many people are at the beach on the 4th of July.
Solution 1: Big Data Analytics-minded
Take a fleet of quadcopter drones and have them take thousands of snapshots of the people on the beach. Download many gigabytes of jpegs. Use fancy facial recognition software to identify individuals, de-duplicate the result set, and make a best-guess estimate. Give a number within a confidence interval.
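Here's a rough sketch of what that guessing looks like in code (a toy simulation that assumes face signatures reduce to comparable IDs and uses a simple bootstrap for the confidence interval; nothing here is a real recognition pipeline):

```python
import random
import statistics

# Toy stand-in for the drone photos: each "snapshot" is a set of face IDs.
# In reality these would come from facial recognition over gigabytes of
# jpegs; here they are random draws from an unknown crowd, with overlap.
random.seed(7)
TRUE_CROWD = 5000                       # unknown to the analyst
snapshots = [
    {random.randrange(TRUE_CROWD) for _ in range(50)}   # ~50 faces per photo
    for _ in range(200)                                  # 200 photos
]

def dedup_count(photos):
    """Union all the faces seen, i.e. de-duplicate the result set."""
    seen = set()
    for faces in photos:
        seen |= faces
    return len(seen)

# Bootstrap over the photo set to put a rough interval around the guess.
# Note the guess only covers people who were photographed at least once,
# so it systematically undercounts the real crowd.
estimates = sorted(
    dedup_count(random.choices(snapshots, k=len(snapshots)))
    for _ in range(200)
)
low, high = estimates[5], estimates[-6]   # middle ~95% of resampled estimates
print(f"best guess: ~{int(statistics.median(estimates))} people "
      f"(roughly {low} to {high}); true crowd was {TRUE_CROWD}")
```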
Solution 2: Data Warehouse-minded
Block off all access to the beach. Don't let anyone in until they swipe their driver's license or photo ID. Query the database, get an exact count.
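The warehouse-minded version is a one-liner once the structure exists. A minimal sketch, with sqlite3 standing in for the warehouse and a hypothetical beach_entries table fed by the ID swipes:

```python
import sqlite3

# Stand-in warehouse: every swipe at the gate lands in one table that was
# structured, up front, to answer exactly this question.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE beach_entries (
        license_id TEXT NOT NULL,
        entered_at TEXT NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO beach_entries VALUES (?, ?)",
    [("D-1001", "July 4 09:15"),
     ("D-2002", "July 4 09:20"),
     ("D-1001", "July 4 13:05")],   # same person re-entering after lunch
)

# One query, one exact answer; COUNT(DISTINCT ...) absorbs the re-entries.
(count,) = conn.execute(
    "SELECT COUNT(DISTINCT license_id) FROM beach_entries"
).fetchone()
print(f"exact count: {count} people")   # -> exact count: 2 people
```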
Big data analytics is what it is because it makes guesses from data that was never structured to answer the specific question. Data warehousing is what it is because you deliberately structure the domain of data collection according to a purpose. In an ideal world, all big data analysis guessing evolves into data warehouse structure.
So 'big data analytics' essentially means inefficient unstructured data + smart guessing. All of the credit card transactions in the world are data warehouse structured, and have always been. But that's not 'small data'.
To get a longer explanation of some of the details of data warehousing that would bore, but satisfy, an expert like me, read this article. But do so with the following understanding. Everything that 'big data analytics' is will become a data management subset of future data warehousing. In other words, SQL has mastered all we need to know about set logic and transformations in database tech. It *is* the standard semantic level, as permanent as HTML is to web development. Everything the unstructured database technologies now do under 'NoSQL' will be assimilated. Resistance is futile. At the core of the technology, what's too fast today will not be tomorrow. What's too voluminous today will not be in the future.
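To make the 'set logic' point concrete, here is a small illustration of the standard set operators SQL already gives you (again sqlite3 as the stand-in engine; the tables are made up):

```python
import sqlite3

# The "set logic" SQL standardizes: UNION, INTERSECT and EXCEPT over two
# hypothetical result sets (beach visitors on two different days).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE saturday (visitor TEXT);
    CREATE TABLE sunday   (visitor TEXT);
    INSERT INTO saturday VALUES ('ann'), ('bob'), ('cat');
    INSERT INTO sunday   VALUES ('bob'), ('cat'), ('dee');
""")
for op in ("UNION", "INTERSECT", "EXCEPT"):
    rows = conn.execute(
        f"SELECT visitor FROM saturday {op} SELECT visitor FROM sunday"
    ).fetchall()
    print(op, [r[0] for r in rows])
# UNION     -> everyone seen on either day
# INTERSECT -> seen both days
# EXCEPT    -> Saturday-only visitors
```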
At Full360, we already combine appropriate technologies in our DW + BI Elastic framework to bridge the gap. There's no data too fast, too wide, or too big for us.