The Business and Technology Benefits of In-Memory Computing #BISUM
- Big Data and the BI Consumer #BISUM: http://goo.gl/2XDz4V
- Turning Information into a Strategic Asset #BISUM: http://goo.gl/NTgC6t
- Return on Investment: Making Our Projects Make Sense #BISUM: http://goo.gl/LHWIjx
- Does the Data Scientist Have Mojo? #BISUM: http://goo.gl/2gOkHP
- The Business and Technology Benefits of In-Memory Computing #BISUM: http://goo.gl/bvbmvQ
And one for Sunday. “The Business and Technology Benefits of In-Memory Computing” was led by Colin White of BI Research and Paul Clark.
Over the past five decades, a few key technologies have had a dramatic and disruptive impact on both IT and the business: OLTP systems in the 1960s, relational technology in the 1980s, data warehousing in the 1990s, and now big data. Enabling these technologies has required significant innovation not only in software but also in hardware.
We looked at how hardware has evolved over the years to provide the foundation for new software innovation, focusing on how today’s large memory spaces and in-memory computing technologies are delivering huge business benefits: models and analyses that once took hours or days now run in seconds.
Organizations are upgrading their IT architectures to take advantage of the low-latency processing that in-memory computing offers. Regardless of company size, in-memory is the best way to meet high-performance requirements, and its price is dropping with the increased availability of high-performance multi-core processors.
Hard disk is about 5 cents per gigabyte, SSD is about $1 per gigabyte, and DRAM is about $10 per gigabyte. This suggests hard disks may be on the way out, like tape, even in midsize businesses. But you must take a total-cost-of-ownership approach; it’s not just the cost of the hardware.
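To make the spread concrete, here is a minimal back-of-the-envelope sketch in Python using the rough per-gigabyte figures above (actual prices vary by vendor and over time, and this covers raw media only, not full TCO):

```python
# Illustrative only: raw media cost per tier, using the rough
# per-gigabyte figures quoted above. Prices vary by vendor and time.
PRICE_PER_GB = {
    "hard disk": 0.05,   # ~5 cents/GB
    "ssd":       1.00,   # ~$1/GB
    "dram":     10.00,   # ~$10/GB
}

def media_cost(capacity_gb: float) -> dict:
    """Return the raw media cost of holding capacity_gb on each tier."""
    return {tier: capacity_gb * price for tier, price in PRICE_PER_GB.items()}

if __name__ == "__main__":
    for tier, cost in media_cost(1_000).items():  # a 1 TB working set
        print(f"{tier:>9}: ${cost:,.2f}")
    # hard disk: $50.00, ssd: $1,000.00, dram: $10,000.00
    # A real TCO comparison must also add power, cooling, administration,
    # and software licensing -- not just the media.
```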
But beware: most software does not fully exploit multi-core processors. It may pick up some of the benefits incidentally, but few products are genuinely core-aware.
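The gap between core-oblivious and core-aware code is easy to demonstrate. Here is a minimal, hypothetical Python sketch (the names and workload are mine, not from any product) that runs the same CPU-bound aggregation serially and then partitioned across all available cores:

```python
# A minimal sketch of core-aware processing: the same aggregation run
# serially and then fanned out across CPU cores. Illustrative only.
import os
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk: range) -> int:
    """CPU-bound work on one partition of the data."""
    return sum(i * i for i in chunk)

def serial_total(n: int) -> int:
    # Core-oblivious: one thread, one core, no matter how many are available.
    return partial_sum(range(n))

def parallel_total(n: int, workers: int = 0) -> int:
    # Core-aware: split the data into one partition per core.
    workers = workers or os.cpu_count() or 1
    step = n // workers
    chunks = [range(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 10_000_000
    assert serial_total(n) == parallel_total(n)
    print(f"identical results; parallel version used {os.cpu_count()} cores")
```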
Hardware is giving us massive amounts of processing power and quantum leaps in performance, and in-memory computing is one of them. DB2 10.5 with BLU Acceleration, for example, is built to take advantage of these chips.
Putting data in memory does not, by itself, make a system an in-memory system; a true in-memory platform is designed around memory-resident data rather than being a disk-era engine with a large cache. This article from InformationWeek speaks to this:
“Databases and data warehouses as we know them, with their spinning disks and related I/O overhead, are not the future. Big data analytics and the database management systems supporting next-generation, low-latency transactional systems demand a new in-memory approach.”
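Here is a minimal, hypothetical Python sketch of that distinction (the layout and names are mine, not how DB2 BLU or any particular product works): both variants keep the data entirely in RAM, but only the second is organized for memory, as a dense column scanned sequentially rather than row objects chased through pointers.

```python
# Both variants hold the data in RAM; only the second is *designed*
# for memory. Illustrative only, not any product's actual design.
import array

N = 1_000_000

# Variant 1: a disk-era row layout that merely happens to be cached in RAM.
rows = [{"id": i, "amount": i % 100, "region": "EMEA"} for i in range(N)]

def row_scan_total() -> int:
    # Pointer-chasing across scattered per-row objects: cache-unfriendly.
    return sum(r["amount"] for r in rows)

# Variant 2: an in-memory columnar layout -- each column is one dense array.
amount_col = array.array("l", (i % 100 for i in range(N)))

def column_scan_total() -> int:
    # Sequential scan over contiguous memory: what in-memory engines exploit.
    return sum(amount_col)

assert row_scan_total() == column_scan_total()
```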
That’s a wrap on the semi-formal sessions at #BISUM. It’s hard to convey the full immersive experience. The greatest learning was in the one-on-ones and table conversations. I’m going to enjoy the rest of the day in Medford and look forward to putting #BISUM into practice starting today.
This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet. I’ve been compensated to contribute to this program, but the opinions expressed in this post are my own and don’t necessarily represent IBM’s positions, strategies or opinions.