The Emerging Database Market
Attempts to manage the information explosion of the past decade have given rise to many new databases, but have also contributed to higher Total Cost of Ownership (TCO), non-standardized tools across entire companies, and a business user community that is frustrated by the inability to get the data it needs. Large and small companies feel the pain everywhere and are employing everything at their disposal to harvest and mine data.
- Corporate IT is trying to fine-tune existing solutions that are already complex and highly customized.
- Lines of business (LOBs) are doing their own thing, driven by sometimes well-placed distrust of overburdened IT, while deadlines loom.
- Data goes underutilized because expert value-add trails off once the data is stored, leaving the “last mile” of data access to ill-equipped users.
Wise companies are strategically investing in information with an understanding of market trends such as these:
- With hardware innovations and the price of memory dropping precipitously, in-memory capabilities will become the corporate standard for the near future, especially for traditional databases, where disk I/O is the undisputed weak link and bottleneck.
- Scale-out, file-based architectures will have a place in the organization for unstructured high-volume data, which presents a great opportunity for organizations to differentiate.
- The data warehouse remains a key component of the architecture, but it will be surrounded by many analytic data marts that demand richer analytic features and higher levels of performance.
- In-memory technology in the database market will drive convergence between what were known as OLTP and OLAP. Data duplication and materialized aggregates will be replaced with fast virtual views, and the divide between transactional and analytical processing will blur over the long term.
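The trade-off behind that last trend can be illustrated with a toy sketch (hypothetical names and data, not any specific product): a materialized aggregate duplicates data and must be maintained on every write, whereas an in-memory "virtual view" simply computes the answer on demand.

```python
# Toy illustration: materialized aggregate vs. on-demand "virtual view".
from collections import defaultdict

sales = []                             # the transactional (OLTP) table
totals_by_region = defaultdict(float)  # materialized aggregate (duplicated OLAP copy)

def insert_sale(region, amount):
    sales.append((region, amount))
    # Every write pays a maintenance cost to keep the aggregate in sync.
    totals_by_region[region] += amount

def virtual_view_total(region):
    # With the data in memory, a scan is fast enough to skip the copy entirely.
    return sum(amount for r, amount in sales if r == region)

insert_sale("EMEA", 100.0)
insert_sale("EMEA", 50.0)
insert_sale("APAC", 75.0)

assert totals_by_region["EMEA"] == virtual_view_total("EMEA") == 150.0
```

Both paths return the same answer; the in-memory approach just drops the redundant copy and the write-time bookkeeping.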
The Operational and Analytical Divide
The data explosion that’s occurred in the past decade and the resulting headache to manage it all has created a knock-on explosion in the technology industry, with more new databases created to solve this problem than you can shake a stick at.
NoSQL databases, columnar databases, data warehouse appliances, NewSQL databases, Hadoop, in-memory databases, cloud-based databases and more were all created because traditional row-oriented, hard disk-based relational databases cannot handle the volume, velocity and variety of data in the analytics and big data eras. This is all good news until you come to actually invest in some of these new database technologies and realize that they are all fundamentally different at an architectural level and will therefore impact your applications and infrastructure differently.

[Figure: Typical company progression to redundant analytical data]
While business is undoubtedly still about storefront, supply chain, and transacting business, analytics are now the primary way to set a company apart and help it project the future and find ways it can amplify its trajectory. Information architectures reflect this advantage, with more and more redundancy of data, more structure, an ever-increasing information ecosystem, and significant analytic features being added to the DBMS. These features include a columnar orientation to the data, in-database analytics and significant use of data in memory, which will be explored in the next post.
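The appeal of a columnar orientation can be sketched in a few lines. This is a simplified illustration with made-up data, not how any particular DBMS lays out storage: an aggregate over one attribute in a row store touches every field of every record, while a column store reads only the one contiguous array it needs (which also compresses far better).

```python
# Row store: each record is stored together; summing one column still
# touches every field of every row.
rows = [
    {"id": 1, "region": "EMEA", "amount": 100.0},
    {"id": 2, "region": "APAC", "amount": 75.0},
    {"id": 3, "region": "EMEA", "amount": 50.0},
]
row_total = sum(r["amount"] for r in rows)

# Column store: each attribute is a contiguous array; the same aggregate
# reads only the "amount" column.
columns = {
    "id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [100.0, 75.0, 50.0],
}
col_total = sum(columns["amount"])

assert row_total == col_total == 225.0
```

Same answer either way; the column layout simply narrows how much data an analytic query has to read.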
This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet. I’ve been compensated to contribute to this program, but the opinions expressed in this post are my own and don’t necessarily represent IBM’s positions, strategies or opinions.