Old business issues drive a spate of data modernisation programmes


By Mervyn Mooi, director at Knowledge Integration Dynamics (KID)

The continual evolution of technology is felt keenly in the data warehousing and business intelligence fields, and it is apparent that many organisations are currently on a modernisation track.

But why now? Behind it all is the exponential growth and accumulation of data, and businesses are actively seeking to derive value from that data in the form of information and insights. They need this for marketing, sales and performance measurement, and to help them face other business challenges. All the key performance indicators and buzzwords are there: wallet share, market growth, churn, return on investment (ROI), margin, survival, customer segments, competition, productivity, speed, agility, efficiency and more. These are business factors that require stringent management for organisational success.

Take a look at Amazon’s recommended lists and you’ll see how crucial these indicators are. Or peek into the rewards programme of a local bank, retailer or other financial institution.

Social media has captured the limelight in terms of new data being gathered, explored and exploited, but it is not the only source. Mobility, cloud and other forms of big data, such as embedded devices and the Internet of Things, collectively offer a smorgasbord of potential. Many companies are mining it for gold, while others must entrench such value in their IT and marketing capabilities if they are to remain in the game. Monetisation of the right data, information and functionality, at the right time, is paramount.

The tech vendors have been hard at work to crack the market and give their customers what they need to get the job done. One of the first things they did was introduce new concepts for working with data using the old technologies: tactical strategies such as centres of excellence, enterprise resource planning, application and information integration, sandboxing and more. They also realised the need to bring the techies out of the IT cold room and put them in front of business-people, so that the business could get the reports it needed to be competitive, agile, efficient and all the other buzzwords. That had limited success.

In the meantime the vendors were also developing modern, state-of-the-art technologies. The old process of having techies write reports that were fed to business-people on a monthly basis was not efficient, not agile, not competitive and generally not what the business needed. What it needed were tools that could hook into any source or system, that business-people could access and massage themselves, and that could be relied upon for off-the-shelf integration and reporting. Besides that, big data was proving complex and required a new, usable strategy that would be scalable and affordable to both the organisation and the man on the street.

Hadoop promised to help that along. Hadoop is an open source framework that can deliver additional benefits, such as better return on investment, by spreading work across clusters of low-cost servers. And it can chew through petabytes of information quickly. The key is integrating Hadoop into mainstream analytics applications.
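Hadoop’s core idea can be sketched in a few lines. The toy Python below simulates the map, shuffle and reduce phases of a classic word-count job entirely in memory – real Hadoop distributes these same steps across a cluster – and the function names are illustrative, not part of any Hadoop API.

```python
from collections import defaultdict

def mapper(line):
    # Emit (word, 1) pairs, as a Hadoop Streaming mapper would per input line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Sum the counts for one key, as a reducer receives a grouped key.
    return word, sum(counts)

def run_job(lines):
    # Simulate the shuffle phase: group mapper output by key, then reduce.
    groups = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            groups[word].append(count)
    return dict(reducer(w, c) for w, c in sorted(groups.items()))

result = run_job(["big data big value", "data at scale"])
```

Because each mapper and reducer works independently on its own slice of data, the same logic scales out across many cheap machines, which is where the ROI argument comes from.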

Columnar databases make clever use of the properties of the underlying storage technologies, enabling compression economies and making searches through the data quicker and more efficient. There is a lot of techie mumbo jumbo that makes it work, but suffice to say that searching data puts the highest overhead on systems and networks, so it is a natural area to address first.
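The compression economies come from storing each column contiguously, so that repeated values sit next to each other. A minimal Python sketch, using run-length encoding as one illustrative technique (real columnar engines combine several encodings):

```python
def rle_encode(column):
    # Run-length encode a column: runs of repeated values collapse to (value, count).
    runs = []
    for value in column:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1
        else:
            runs.append([value, 1])
    return runs

# Row store: every field of every row. Column store: each field kept contiguously.
rows = [("ZA", 100), ("ZA", 250), ("ZA", 90), ("UK", 300), ("UK", 40)]
country_col = [r[0] for r in rows]
encoded = rle_encode(country_col)

# A scan such as COUNT(*) WHERE country = 'ZA' touches only the runs,
# not every row, which is why columnar scans are cheaper.
za_count = sum(n for v, n in encoded if v == "ZA")
```

Five row values compress to two runs here; on millions of clustered rows the saving, and the reduction in data scanned per query, is far larger.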

NoSQL is also known as “Not only SQL” because it provides storage and retrieval modelled not only on the tables common to relational databases, but also on columns, documents, key-value pairs, graphs, lists and more. Its designs are simpler, it scales better horizontally – improving the ability to add low-cost systems to boost performance – and it offers better control over availability.
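As a rough illustration of the key-value and document models, here is a toy Python store; the class and method names are hypothetical and not drawn from any particular NoSQL product:

```python
class DocumentStore:
    # Toy document store: schemaless, JSON-like dicts addressed by a key.
    # Documents in the same store need not share fields.
    def __init__(self):
        self._docs = {}

    def put(self, key, doc):
        self._docs[key] = doc

    def get(self, key):
        return self._docs.get(key)

    def find(self, field, value):
        # Full scan by field; real stores add secondary indexes for this.
        return [k for k, d in self._docs.items() if d.get(field) == value]

store = DocumentStore()
store.put("cust:1", {"name": "Thandi", "segment": "retail"})
store.put("cust:2", {"name": "Sipho", "segment": "corporate", "churn_risk": 0.2})
retail = store.find("segment", "retail")
```

Note that the second document carries a field the first lacks: that schema flexibility, plus partitioning keys across cheap nodes, is the horizontal-scaling story in miniature.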

Data appliances are just as the name suggests: plug-and-play data warehousing in a box – systems, software and the whole caboodle. Just pop it in and, presto, you have a ton more capacity and capability. These appliances employ massively parallel processing and fast in-memory techniques.

Those technologies, and others like them, solve the original business issues mentioned upfront. They deliver the speed of analytics that companies need today. They give companies the means to gather, store and view data differently, which can lead to new insights. They can grow or scale as the company’s data demands change. Techies and business-people alike are more productive using the new tools, and they bring a whole raft of potential ROI benefits. ROI, let’s face it, is becoming a bigger issue in environments where demands are always growing, never diminishing, and where financial directors are increasingly furrow-browed with an accumulation of nervous tics.

Large businesses aren’t about to rip out their existing investments – there’s the implicit ROI again – but will rather evolve what they have. The way organisations are working to change reporting and analytics, though, will have an impact on the skills that they require to sustain their environments. Technical and business tasks are being merged and that’s why there’s growing demand for so-called data scientists.

Data scientists are expected to be the do-it-all people, covering everything from data sourcing and discovery through to delivering insightful, sentiment-based intelligence. They are unlike traditional information analysts, data stewards and report writers, who had distinct roles and responsibilities in the data and information domains.