
Monday, October 10, 2016

On a Third Age of Data


In Datanami, by Jeff Cobb

The Third Age of Data has arrived. Today, an estimated 1 trillion sensors are embedded in a nearly limitless landscape of networked sources, from health monitoring devices to municipal water supplies, and everything in between. The massive amounts of data being generated hold the promise of ever-greater insight, but only for those who successfully ingest, process, and harness the flood of information. Now more than ever, scalability and real-time analytics have become essential for companies that want to meet business demands and stay ahead of the curve.

To understand how we came to generate so much information, let’s rewind 20 to 30 years to the First Age of Data. Traditional IT infrastructure was designed around data that was predominantly created by humans: emails, documents, business transactions, databases, records, and the like. This data was primarily transaction-oriented, built around back-office databases and once-a-day batch processing. For example, when a bank generated a customer statement, the statement was produced on a mainframe, stored in a traditional database, transferred across a storage area network, and eventually ended up in your mailbox. Many large, well-established companies made a name for themselves during this era of data. ...
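To make the once-a-day batch pattern concrete, here is a minimal sketch in Python, assuming a hypothetical back-office table named transactions with account_id, amount, and posted_at columns (sqlite3 stands in for the mainframe-era database and storage described above); it illustrates the general pattern, not the specific systems the article has in mind.

    # Minimal sketch of a nightly statement batch job, assuming a hypothetical
    # "transactions" table with account_id, amount, and posted_at columns.
    import sqlite3
    from datetime import date

    def run_nightly_statement_batch(conn: sqlite3.Connection,
                                    statement_date: date) -> dict[str, float]:
        """Aggregate one day's transactions into per-account statement totals."""
        rows = conn.execute(
            "SELECT account_id, SUM(amount) FROM transactions "
            "WHERE posted_at = ? GROUP BY account_id",
            (statement_date.isoformat(),),
        )
        # In the era described, these totals would be written back to storage,
        # printed, and mailed; here we simply return them.
        return {account_id: total for account_id, total in rows}

    if __name__ == "__main__":
        conn = sqlite3.connect(":memory:")
        conn.execute(
            "CREATE TABLE transactions (account_id TEXT, amount REAL, posted_at TEXT)")
        conn.executemany(
            "INSERT INTO transactions VALUES (?, ?, ?)",
            [("acct-001", 120.00, "2016-10-10"),
             ("acct-001", -45.50, "2016-10-10"),
             ("acct-002", 300.25, "2016-10-10")],
        )
        print(run_nightly_statement_batch(conn, date(2016, 10, 10)))

The key point of the sketch is that the job runs once for the whole day's data rather than reacting to each event as it arrives, which is the contrast the article draws with today's real-time analytics.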
