Why social media and user-generated content are set to dominate email and Web
By Kym Reynolds
Social media has completely transformed the way people create, consume and share content. As these behaviors reshape how people interact with one another, brands need to adapt how they communicate, market and position products accordingly.
The challenge, as companies collect ever more data, is accessing and analyzing it in a meaningful way.
Big on data
Rather than pushing companies straight to a single customer view, marketing solution providers should encourage a first step: the single marketer view, in which marketers can see every touch point their customer has interacted with, every ad they may have been exposed to, and every offer they responded to.
The challenge is the influx of data: staying on top of it, making the most of it, and delivering the best possible outcome.
There is no doubt in my mind that we live in a world where data is now currency.
The good news is that I think the most sophisticated marketing solutions now provide access to that data and help analyze it.
We are entering a golden age where the question is what you want to do with that data and which problems you want to solve with it, and the answer will change on a case-by-case basis for every marketer.
What I have noticed in this golden age of data and marketing technology is that the work has become resource-light while the output has grown far greater.
Personalization should now take less time than sending out a batch-and-blast email. Understanding your data and acting on it should take less time than simply looking at it as a whole.
An increasingly dominant source of big data is social media. Analyzing it poses several challenges: the data is both voluminous and high-velocity.
Social data is analyzed for various business needs, ranging from reputation management, trend analysis and prediction to personalized marketing.
Other societal and intelligence applications involve the identification of emerging social memes, communities, cliques and coded messages.
The latest trend is in location-aware applications such as crowd-sourced navigation services.
Social media can be analyzed on multiple dimensions.
Content analysis involves examination of the actual content of a message or post.
Social network analysis looks at the data from a graphical or topological perspective – in other words, who is posting, their followers and so on.
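As a minimal sketch of the graph perspective, the example below builds a hypothetical "who follows whom" edge list (all account names are invented) and computes follower counts, the simplest topological signal for spotting influential accounts. Real platforms operate on graphs with millions of edges and richer metrics such as centrality.

```python
from collections import Counter

# Hypothetical edge list of (follower, followed) pairs.
edges = [
    ("alice", "brand_x"), ("bob", "brand_x"),
    ("carol", "brand_x"), ("bob", "alice"),
    ("carol", "alice"),
]

# In-degree (follower count): accounts followed by many others
# are candidate influencers worth monitoring.
followers = Counter(followed for _, followed in edges)

print(followers.most_common(2))  # [('brand_x', 3), ('alice', 2)]
```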
Locational analysis tracks users and their activity based on various types of external and mobile sensors providing geo-locational information.
Sophisticated analysis methodologies use machine-learning algorithms that consolidate inputs or evidence from all these perspectives.
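One way to picture this consolidation is a weighted combination of per-perspective scores. The weights below are hand-set stand-ins purely for illustration; an actual system would learn them from labelled examples (e.g. with logistic regression) rather than fix them by hand.

```python
# Illustrative weights over the three perspectives described above;
# a real system would learn these from training data.
WEIGHTS = {"content": 0.5, "network": 0.3, "location": 0.2}

def relevance_score(signals):
    """Combine per-perspective scores (each in [0, 1]) into one score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

post = {
    "content": 0.9,   # e.g. text mentions the product favourably
    "network": 0.4,   # e.g. author's follower count, normalised
    "location": 1.0,  # e.g. posted near a store
}
print(round(relevance_score(post), 2))  # 0.77
```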
To extract actionable information from social media, it is necessary to analyze language usage at a finer granularity than simple keyword filtering, which is the technique used by several social media listening platforms.
Natural language processing (NLP) technology is necessary to distinguish true customer intent to buy a product or service as opposed to general opinions being expressed.
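The contrast can be sketched with two invented posts: keyword filtering flags anything mentioning the topic, while even a crude purchase-intent pattern separates a buyer from a commenter. The regex here is a deliberately simplistic stand-in; production NLP systems use parsing and trained classifiers, not keyword rules.

```python
import re

posts = [
    "Looking to buy a new phone this weekend, any recommendations?",
    "I think phone prices are getting ridiculous these days.",
]

# Keyword filtering flags both posts identically: same topic, very
# different commercial value.
keyword_hits = [p for p in posts if "phone" in p.lower()]

# A crude stand-in for intent detection: purchase-intent phrasing
# rather than topic keywords alone.
INTENT = re.compile(r"\b(looking to buy|want to buy|shopping for)\b", re.I)
intent_hits = [p for p in posts if INTENT.search(p)]

print(len(keyword_hits), len(intent_hits))  # 2 1
```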
In our global society, much of the communication is in languages other than English, posing additional challenges to content analysis.
Simply translating data into English will lose much of the nuanced information, such as sentiment. It is necessary to mine multilingual social media through native-language processing, which can deal with the vagaries of slang and code-switching.
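A toy illustration of why native-language processing matters: the per-language sentiment lexicons below are invented two-word examples, but they show how a single code-switched post can be scored token by token against each language's own vocabulary instead of being translated first. Real systems use full native-language models, not word lists.

```python
# Toy per-language sentiment lexicons (illustrative entries only).
LEXICONS = {
    "en": {"great": 1, "awful": -1},
    "es": {"genial": 1, "fatal": -1},  # "fatal" is negative slang in Spanish
}

def sentiment(tokens):
    """Score a code-switched token stream against every native lexicon."""
    return sum(lex.get(t, 0) for lex in LEXICONS.values() for t in tokens)

# A code-switched post mixing English and Spanish in one message.
post = "the new update is genial but battery life is awful".split()
print(sentiment(post))  # 0  (one positive term, one negative term)
```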
To build a scalable social media analysis platform, several skill sets are needed.
The team must include data scientists, machine learning experts and computational linguists as well as information retrieval (search) specialists.
Data scientists take a holistic view of the problem, focusing on the target analytics and tracing the requirements back to required content and scalable algorithms.
Since it can be daunting to build such a team in-house, there are alternatives that companies may choose.
IBM’s Watson Content Analytics platform provides a comprehensive set of technologies and tools that can be leveraged to build custom analytics.
Traditional business intelligence players such as SAS are also offering analytics from both structured and unstructured data sources.
However, these solutions come with a cost, both in terms of time and resources that may be prohibitive for small and midsize organizations.
Some content providers, for example Datasift, already supply enriched data, such as data pre-analyzed and tagged with demographics.
Several companies also provide locational or positioning services. This leads to solutions that are partly outsourced and partly built in-house.