The notion of “Big Content” has been given a much-needed identity and visibility over the past year or so. It is no longer sufficient for content professionals to talk about content only in a generic sense, because the challenges change according to the scale of the content. While Big Content seems easy for people to grasp in bits and pieces (such as specific items, types, or categories of content), it is difficult to comprehend and guide in the aggregate, due to its scale, variation, and web of complex relationships with diverse audiences and organizational missions. Without a proper framework, Big Content can be too big to fathom.
The IT consultancy Gartner has promoted the concept, and various vendors are starting to talk about how they address Big Content. Most large organizations create and publish vast quantities of content – quantities that are intrinsically difficult to manage effectively. The discipline of content strategy has been addressing the general issues of content effectiveness, but so far it hasn’t focused extensively on the issue of content scale. Even though the same goals and principles of content strategy apply to both small scale content (thousands of content items) and large scale content (millions of content items), the tactics needed are fundamentally different. Small scale content can be managed in a labor-intensive way and understood without extensive technical expertise; large scale content can’t be. The designation “Big Content” helps to highlight how fundamentally different the approach needs to be when coordinating truly large scale content. Content strategist Rahel Anne Bailie notes: “Big Content is consideration of content beyond the copy, and even beyond the content. It’s a consideration of the infrastructure and related elements that support the production and management of content.”
It’s tempting to view Big Content as an extension of Big Data; after all, both claim to be “big,” and data and content sound similar. Indeed, Gartner asserts: “Big Content is still Big Data.” For some analysts, vast quantities of content look a lot like mountains of customer data: content is just unstructured data that, if cleaned up, can be mined for insights. Gartner argues: “Unstructured content represents as much as 80 percent of an organization’s total information assets. While Big Data technologies and techniques are well suited to exploring unstructured information, this ‘Big Content’ remains underutilized and its potential largely unexplored.”
But Big Content is different from Big Data, and needs a different framework. While it is true that content can be mined for insights, the core value of content is different from that of the transactional and other data that is commonly the focus of Big Data analysis. Unlike “data” (for example, the rows and columns of customer and product data stored in a data warehouse), “content” – the articles, videos, photos, and user comments – is created for humans to read, watch, and enjoy, not for machines to mince and digest. Enabling machines to process this content is an important issue to address, but only to the extent that such processing delivers something that matters for customers.
So far, many discussions of Big Content focus entirely on the analysis of existing content (often user generated), rather than on the management, dynamic creation, and delivery of new content. Such a business intelligence focus asks how Big Content can inform business decisions (which may have nothing to do with publishing better content), rather than how to manage and deliver large scale content to better meet user needs. There is nothing wrong with analyzing content to inform executive decisions about investment or customer care, but the potential of Big Content needs to be far bigger.
Where the Big Data mindset views content as a means to obtaining business analytics (what are people doing with the content?), the Big Content mindset looks to business analytics as a means to producing better content (what content is needed?). Where Big Data techniques are about consuming large scale legacy information, approaches to Big Content should be generative, examining how new information for audiences can and should be created on a large scale. It’s not good enough to simply analyze what you get from your customers or your business divisions: you should be planning what you want to say to them as part of a large scale dialog. Audience-facing content, not internally facing business intelligence, is what creates the superior customer experiences that drive customer loyalty and revenue.
In forthcoming posts, I’ll explore approaches to addressing Big Content – what is useful today, what could be useful in the future, what content tools need to offer, and what emerging approaches are still lacking.