White Papers Plus

Transitioning from PostgreSQL to an Analytical Database for Higher Performance and Massive Scale

In today’s data-driven world, where effective decisions depend on a company’s ability to access information in seconds or minutes rather than hours or days, selecting the right analytical database platform is critical.

Read this McKnight white paper to learn:

  • Which criteria to consider for an analytical database
  • The process for transitioning away from PostgreSQL
  • Transition success stories from Etsy, TravelBird and Nimble Storage

Link to paper.

Sector Roadmap: Unstructured Data Management 2017

This Sector Roadmap is focused on unstructured data management tool selection for multiple uses across the enterprise. We eliminated any products that may have been well positioned and viable for limited or non-analytical uses, such as log file management, but deficient in other areas. Our selected use cases are designed to remain highly relevant for years to come, so the products we chose needed to match all of these uses. In general, we recommend that an enterprise only pursue an unstructured data management tool capable of addressing a majority or all of that enterprise’s use cases.

In this Sector Roadmap, vendor solutions are evaluated over five Disruption Vectors: query operations, search capabilities, deployment options, data management features, and schema requirements.

Link to report (fee).

Sector Roadmap: Modern Enterprise Grade Data Integration 2017

This Sector Roadmap is focused on data integration (DI) selection for multiple/general purposes across the enterprise.

Vendor solutions are evaluated over six Disruption Vectors: SaaS Applications Connectivity, Use of Artificial Intelligence, Conversion from any format to any format, Intuitive and Programming Time Efficient, Strength in DevOps, and Shared Metadata across data platforms.


Link to report (fee).

Sector Roadmap: Modern Master Data Management 2017

This Sector Roadmap is focused on master data management (MDM) selection for multiple data domains across the enterprise. In this Sector Roadmap, vendor solutions are evaluated over seven Disruption Vectors: cloud offerings, collaborative data management, going beyond traditional hierarchies, big data integration, machine learning-enabled, APIs and data-as-a-service, and onboard analytics.

Link to report (fee).

Vertica Predictive Maintenance Testing Trial

This is NOT a white paper (though it does include documentation) but rather a test drive we built for predictive maintenance, something on the minds of many these days.

Experience how Vertica enables you to store sensor data from multiple cooling towers across the USA in near real time and predict equipment failure ahead of time to provide continuity of service. In this AWS Test Drive, we create an instance of a Vertica cluster and generate readings from multiple cooling towers in real time, storing them in Vertica. The test drive also includes a web-based dashboard that interacts with Vertica, leveraging machine learning algorithms such as logistic regression to predict the risk of failure and prevent downtime. You will have four hours to play with, query, and analyze the dataset.
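To give a feel for the idea behind the dashboard’s risk scores, here is a minimal pure-Python sketch of logistic regression on synthetic cooling-tower readings. The feature names and data are hypothetical illustrations, not the test drive’s actual code or Vertica’s implementation:

```python
import math
import random

def sigmoid(z):
    """Logistic function: maps a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit logistic-regression weights with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log loss with respect to the score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def failure_risk(w, b, reading):
    """Predicted probability that a tower with this reading is heading to failure."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, reading)) + b)

# Synthetic readings: (temperature deviation, vibration level); label 1 = later failed.
random.seed(42)
healthy = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(50)]
failing = [(random.gauss(3, 1), random.gauss(3, 1)) for _ in range(50)]
X, y = healthy + failing, [0] * 50 + [1] * 50

w, b = train_logistic(X, y)
print(failure_risk(w, b, (4.0, 4.0)))    # hot, vibrating tower: risk near 1
print(failure_risk(w, b, (-1.0, -1.0)))  # cool, quiet tower: risk near 0
```

In Vertica itself, training and scoring of this kind run in-database over the stored sensor readings, so the model is applied where the data lives rather than in application code.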

Contact us for the “Vertica Predictive Maintenance Testing Trial”.

Moving to a Software-as-a-Service Model

This is a series of four blog posts.

If you’re a software vendor moving to a SaaS business model, whether by creating new product lines (from scratch or by adding cloud characteristics to existing products) or by converting an existing product portfolio, the transition will impact every aspect of the company, right down to its DNA.

In these posts, William addresses the top four considerations for choosing the database in the move. The database selection is critical and acts as a catalyst for all other technology decisions. The database needs to support both the immediate requirements and future, as-yet-unknown ones. Ideally, the DBMS selection should be one of the first technology decisions made for the move.

Link to posts.

Moving Analytic Workloads to the Cloud: A Transition Guide

Recent trends in information management see companies shifting their focus to, or considering for the first time, a cloud-based solution for their data warehouse and analytic environment. In the past, the only clear choice for most organizations has been on-premises data solutions, oftentimes using an appliance-based platform. However, the costs of scale are gnawing away at the notion that this remains the best approach for some or all of a company’s analytical needs.

According to market research, through 2020 spending on cloud-based Big Data Analytics technology will grow 4.5x faster than spending on on-premises solutions. Due to the economics and functionality, use of the cloud should now be a given in most database selections. The factors driving data projects to the cloud are many.

Additionally, the multitudinous architectures made possible by hybrid cloud mean the question is no longer “Cloud, yes or no?” but “How much?” and “How can we get started?” This paper reflects on the top decision points in determining how deeply to move into the cloud and what you need to do to be successful in the move. This could be the move of an existing analytical workload or the organization’s first move to the cloud. It’s “everything but the product selection.”

Link to report (GigaOM membership or fee required for full report).

Request for Information (RFI) Guide: Moving Analytic Workloads to the Cloud

Recent trends in information management see companies shifting their focus to, or considering for the first time, a cloud-based solution for their data warehouse and analytic environment. In the past, the only clear choice for most organizations has been on-premises data solutions, oftentimes using an appliance-based platform. However, the costs of scale are gnawing away at the notion that this remains the best approach for some or all of a company’s analytical needs. Additionally, the multitudinous architectures made possible by hybrid cloud mean the question is no longer “Cloud, yes or no?” but “How much?” and “How can we get started?”

This RFI will reflect on the top questions you should ask in making your product selection when moving information management and your analytical workload to the cloud.

Link to report (GigaOM membership or fee required for full report).

Ten Mistakes to Avoid in Data Maturity and Modernization

Companies everywhere are realizing that data is a key asset that can directly impact business goals. Yet, in some enterprises, awareness of data’s value doesn’t translate into increased data maturity and modernization. Often treated as a drag-along to budgeted applications, data architecture can be accidental or happenstance, a casualty of a lack of focus. The opportunity now exists to influence the future and undertake highly data-focused projects in more modern, scalable, and usable ways. In this Ten Mistakes to Avoid, William McKnight identifies the misguided practices that cause the most friction in modernization efforts and the journey to higher data maturity. He offers tips on how to mature the environment that supports the asset upon which competition is forged today: data.


Link to report (TDWI membership required).

Analytics in Action with Teradata Business Analytics Consulting

This study, written by industry analyst Richard Hackathorn of Bolder Technology, Inc. and William McKnight of McKnight Consulting, examines the business value that Teradata Business Analytics Consulting engagements generate for client companies. Drawing on case studies from different industries, it documents the key insights and trends behind this value generation, along with recommendations for pursuing successful business analytics consulting engagements.

Link to report.