The 4 killer stats from the ESOMAR 3D conference

I was only able to attend one day of this conference. For me, this is without doubt the most useful research conference of the year, so I am sorry I can only give you half the story, but here is what I brought back with me: 4 interesting stats, 3 new buzzwords and 1 stray fact about weather forecasting.

350 out of 36,000: This is how many useful comments Porsche managed to pick out from analysing 36,000 social media comments about their cars, so the cost-benefit analysis of this runs a bit short. This was probably the headline news for me from the ESOMAR 3D conference: no existing piece of text analytics technology seems capable of intelligently processing this feedback. Every single one of these comments had to be read and coded manually. I was shocked. I thought we were swimming in text analytics technology, but apparently most of the existing tools fall short of the real needs of market researchers right now (I spot one big fat opportunity!).

240 hours: This was the amount of time spent, again conducting manual free text analysis, by IPSOS OTX to process data from 1,000 Facebook users for one project (and from this they felt they had really only scratched the surface). As Michael Rodenburgh from IPSOS OTX put it, "holy crap, they know everything about us". There are, he estimated, 50 million pieces of data associated with these 1,000 users that it is possible to access if the end user gives you a one-click permission in a survey. He outlined the nightmare it was to deal with the data generated from Facebook: just deciphering it is a task in itself, and none of the existing data analytics tools we have right now, like SPSS, are capable of even reading it. There were lots of excellent insights in this presentation, which I think deservedly won best paper.

0.18: This is the correlation between aided awareness of a brand and purchase activity, measured in some research conducted by Jannie Hofmyer and Alice Louw from TNS, i.e. there effectively is none. So why do we bother asking this question in a survey? Far better just to ask top-of-mind brand awareness, which apparently correlates at a much more respectable 0.56. We are stuffing our surveys full of questions like these that don't correlate with any measurable behaviour. This was the key message from a very insightful presentation. They were able to demonstrate this by comparing survey responses to real shopping activity by the same individuals. We are also not taking enough care to ask a tailor-made set of questions of each respondent, one that gleans the most relevant information from each of them. A buyer and a non-buyer of a product in effect need to do 2 completely different surveys. Jannie senses that the long, dull online surveys we now create are akin to fax machines and will be obsolete in a few years' time. Micro surveys are the future, especially when you think about the transition to mobile research. So we need to get the scalpel out now and start working out how to optimise every question for every respondent.
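For readers who want to sanity-check figures like these against their own data, here is a minimal sketch of how a correlation coefficient such as the 0.18 or 0.56 above would be computed. The respondent data in the example is entirely made up for illustration; the TNS study's actual data and method are not public here.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-respondent data: 1 = aware / purchased, 0 = not.
aided_awareness = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
purchases       = [0, 1, 0, 0, 1, 0, 1, 0, 1, 0]
print(round(pearson_r(aided_awareness, purchases), 2))
```

A correlation near 0, as with the aided-awareness figure, means knowing the survey answer tells you almost nothing about the behaviour.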

50%: The average variation between the claimed online readership of various Dutch newspapers, as published by their industry JIC, and the readership levels measured from behavioural tracking of PC and mobile activity conducted by Piet Hein van Dam from Wakoopa. The difference was so big that he went to great lengths to clean and weight the behavioural measurement to account for the demographic skew of his panel, but found this brought the data not closer to the industry figures but in fact further away. Having worked in media research for several years I am well aware of the politics of industry readership measurement processes, so I am not surprised how "out" this data was, and I know which set of figures I would use. He pointed out that cookie-based tracking techniques in particular are really falling short of delivering any kind of sensible media measurement of web traffic. He cited the "unique visitors" statistic published for one Dutch newspaper website and pointed out that it was larger than the entire population of the Netherlands.
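The "weighting for demographic skew" mentioned above is a standard panel technique. As a rough sketch (all shares and panelists below are hypothetical, not Wakoopa's actual cells or figures): each panelist is weighted by their demographic cell's population share divided by its panel share, so over-represented groups count proportionally less in the readership estimate.

```python
# Hypothetical age-cell shares in the population vs. in the panel.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
panel_share      = {"18-34": 0.50, "35-54": 0.30, "55+": 0.20}

# Weight = population share / panel share for each cell.
weights = {cell: population_share[cell] / panel_share[cell]
           for cell in population_share}

# Hypothetical panelists: (age cell, visited the newspaper site?)
panel = [("18-34", True), ("18-34", False), ("35-54", True),
         ("55+", False), ("55+", True)]

weighted_readers = sum(weights[cell] for cell, read in panel if read)
weighted_total = sum(weights[cell] for cell, _ in panel)
print(f"weighted readership: {weighted_readers / weighted_total:.0%}")  # → 60%
```

As the presentation showed, weighting like this can only correct for composition; it cannot reconcile two measurements when one of them (e.g. cookie-based counting) is itself inflated.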

Note: Forgive me if I got any of these figures wrong - many of them were mentioned in passing and I did not write them all down at the time - so I am open to corrections and clarifications if I have made any mistakes.

3 New buzzwords

Smart Ads: the next generation of online advertising, with literally thousands of variant components that are adapted to the individual end user.

Biotic Design: A technique pioneered by Yahoo that uses computer modelling to predict the stand-out and noticeability of content on a web page. It is used to test advertising and page design, and we were shown how close to real eye-tracking results this method could be. We were not told the magic behind the black-box technique, but it looked good to me!

Tweetvertising: Using tweets to promote things (sister of textvertising).

One stray fact about weather forecasting

Predicting the weather: We were told by one of the presenters that although we have supercomputers and all the advances delivered by the sophisticated algorithms of the Monte Carlo method, if you want to predict what the weather is going to be like tomorrow, the most statistically reliable method is still to look at what the weather is like today, compare it to how it was yesterday, and then draw a straight-line extrapolation! I also heard that 10 human beings asked to guess what the weather will be like, operating as a wisdom-of-the-crowds team, could consistently outperform a supercomputer's weather prediction programmed with the 8 previous days of weather activity. Both of these "facts" may well be popular urban myths, so I do apologise if I might be passing on tittle-tattle, but do feel free to socially extend them out to everyone you know to ensure they become properly enshrined in our collective consciousness as facts!
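The "straight-line extrapolation" forecast described above is simple enough to write in one line: take today's reading, add the change since yesterday, and call that tomorrow. The temperatures in the example are made up.

```python
def extrapolate(yesterday, today):
    """Straight-line forecast: assume the trend between the last two
    observations continues unchanged into the next period."""
    return today + (today - yesterday)

# Hypothetical temperatures in degrees C.
print(extrapolate(14.0, 16.0))  # warming trend continues → 18.0
print(extrapolate(20.0, 20.0))  # no change → 20.0 (pure persistence)
```

When yesterday equals today this collapses to the classic "persistence" forecast - tomorrow will be like today - which is the baseline the presenter was alluding to.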

Big data and the home chemistry set


Are we all dodos? I heard a couple of people tell us at the ESOMAR 3D conference that we are perilously close to extinction, that we market researchers are dodos. In fact this has been a bit of a common theme at many of the conferences I have attended in the last few years: a prediction of the terminal decline of research as we know it. The message is that our industry is going to be hit by a bus, with the growth of social media and the big boys like Google and Facebook and IBM muscling in to our space. We are also, in many parts of the world, facing tough economic times and tightening budgets.

Yet despite all this, it appeared that this was the best-attended 3D conference ever, and it's not just this isolated conference either. I have been going to research conferences all around the world over the last year and they all seem to be seeing growing numbers of attendees. All I can sense from these conferences, and particularly at this event, is an industry brimming with confidence and ideas.

So are we all putting on a brave face? Are we naively sleepwalking into the future? I don't think so...


The macro-economics of market research

Over the last decade we have seen a near exponential growth in the data being generated by the world's population. It is literally pouring out of the internet and our mobile phones. We also have an ever increasing range of innovative ways to measure and analyse things, ranging from geo-location tracking right through to sensors attached to our heads. We are able to measure almost everything we do. So who is going to do it?

We learnt at this conference how hopeless computers are at actually thinking: they lack the ability to really intelligently analyse data to the quality and standard needed to glean real market research insights. In every presentation we saw the critical contribution that human beings made. It leads me to believe that with all this expansion of the data pool and the means of measuring things, more people are going to be needed to make sense of it all. And who is best at doing this? Ultimately I believe it is market researchers. Media companies and consultancy firms might think they are in with a shout, but I genuinely don't believe they have the core competences needed. Market research is all about working out how to measure things, gathering and analysing data, and using this to deliver insights. That is what we specialise in; that is what we are expert at. No other industry is better placed to capitalise on all this free-flowing data than ours.

The big thing I think we need to focus on is developing new tools to process big data. Right now it looks like we have oil tankers full of information waiting on our doorstep, and the research industry is currently attempting to use tools that look like they are from a home chemistry set to try and process it into fuel.


We need to develop more tools to refine all this information and learn the skills to do it. That is our challenge: grasping the technology that is out there and adapting it for our needs.

But I have every confidence we can. I believe we are the best industry out there at cross-communicating ideas, and with that comes innovation. There are lots of people who bemoan the lack of innovation in our industry, but again I see quite the opposite: I see an industry racing to innovate. Step back and look at how things have developed in the last couple of years alone. Look at what we can do already in the field of mobile research, and how many research companies have so quickly moved into offering social media research solutions. Look at some of the fantastic tools that have emerged, like facial emotion measurement and neuro/biometric monitoring; look at what we are learning and embracing from the field of behavioural economics; look at crowdsourcing and the success of MROC communities, and how we have developed technology to serve them. Look at some of the new text analytics tools that are emerging.

I think the market research industry is more than capable of adapting to these needs and I feel it has a big future.


