July 21, 2009

A Generation Ahead

Josh Chasin
Principal
KnotSimpler

I’ve been in media research for 29 years. I’ve been an executive at Arbitron, and President/CEO at Simmons. So, by now I might know a thing or two about this business. When I joined Comscore in 2007, it was largely because I had long considered the company to be the most innovative in the audience measurement space. I wanted to be part of a team doing great, pioneering, innovative work, in an industry too often plagued by inertia and “old school” thinking. In fact, the first thing I did upon joining Comscore was to draft a presentation about the methodology, titled “21st Century Audience Measurement.”

Even given my expectations, though, when I joined Comscore I experienced quite a bit of culture shock. I had always seen media research as a business in which introducing a material change or innovation was a lot like steering an ocean liner. At Comscore I learned quickly that for an audience measurement company with a bona fide engineering and technology pedigree, there was no ocean liner. There was only continuous development and innovation.

One of Comscore’s innovations that had particularly impressed me in the years before I joined was the company’s recognition, from the beginning, that using a 20th century telephone calling center to recruit an Internet panel was at best untenable, and at worst preposterous. Instead, in the marketplace we talk about online recruitment coupled with a smaller, offline, randomly-recruited calibration panel, which together provide the largest possible samples with a minimum of bias. This technique allows us, for example, to include all types of households in our panel - including cell-only households (20% of the population) and cell-primary households (another 14%, who have landlines but receive all or almost all of their calls on a cell phone). These segments of the population are routinely omitted from traditional RDD recruitment; Comscore has been including them for years. In addition, the sample sizes that this methodology yields allow us to report audiences for tens of thousands of web entities each month.
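To make the calibration idea concrete, here is a minimal sketch in Python of one standard way a small, randomly recruited calibration panel can be used to weight a large online-recruited panel - simple post-stratification on phone status. The field names, strata, and panel counts below are illustrative assumptions, not our production methodology.

```python
from collections import Counter

# Illustrative post-stratification: weight a large online-recruited panel
# so its phone-status mix matches the randomly recruited calibration panel.
# All names and numbers here are hypothetical.

# Phone-status shares estimated from the random calibration panel,
# treated as the best available estimate of the population mix.
calibration_share = {"landline_primary": 0.66, "cell_primary": 0.14, "cell_only": 0.20}

# A toy online-recruited panel of 1,000 people, skewed toward landline homes.
online_panel = (
    [{"id": i, "phone": "landline_primary"} for i in range(800)]
    + [{"id": i, "phone": "cell_primary"} for i in range(800, 900)]
    + [{"id": i, "phone": "cell_only"} for i in range(900, 1000)]
)

# Share of each stratum within the online panel itself.
counts = Counter(p["phone"] for p in online_panel)
online_share = {k: v / len(online_panel) for k, v in counts.items()}

# Post-stratification weight = population share / panel share.
# Under-represented groups (here, cell-only homes) are weighted up.
for p in online_panel:
    p["weight"] = calibration_share[p["phone"]] / online_share[p["phone"]]

weights = {k: round(calibration_share[k] / online_share[k], 2) for k in calibration_share}
print(weights)  # cell-only homes are 10% of this toy panel but 20% of the population -> weight 2.0
```

A production weighting scheme would of course span many more variables than phone status, but the principle is the same: the random calibration panel supplies the targets, and the big online panel supplies the scale and granularity.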

Nielsen Online, meanwhile, turned up its nose at us, holding steadfast to the belief that the old ways were the best ways. They countered MMX with a panel recruited by phone(!) - a sample about one tenth the size of ours, reporting on about one tenth the number of entities.

Needless to say, this won us a lot of business.

Last week Nielsen Online announced, essentially, that they were giving up the ghost. In what they claimed was innovation, they announced that they are moving to the same technique Comscore invented ten years ago: online recruitment, coupled with a random offline calibration panel. In the headline of the press release, they claimed that this would now be the largest online reporting panel in the U.S.

I initially saw the press release when Carl Bialik, aka the Wall Street Journal’s Numbers Guy, forwarded it to us asking for a comment. Naturally the first thing I pointed out was that the MMX U.S. in-tab for June was 300,000 - fully 30% more panelists than Nielsen was touting. Our MMX panel size has been increasing as a natural outgrowth of our program for constant improvement. We have one million Comscore panelists in the U.S., and a subset of these qualify each month for MMX reporting in-tab (the rest are available for other Comscore services and analytic work). Because we have been improving our technology for identifying the demographics of the person using the computer at any point in time (fully two thirds of Internet users in the U.S. are on multi-user machines), many more sample persons have been able to pass our rigorous MMX reporting criteria and qualify for in-tab.

Nielsen seemed to be caught off guard by this development; no doubt they had been devoting a lot of time and money to the initiative to adopt our methodology, and had expected to leapfrog us in panel size. A Nielsen spokesman even backtracked quickly to a statement that “size isn’t everything” - which forces us to ponder the wisdom of touting the benefit of panel size in the headline of the press release in the first place. (By the way: size - and representativeness - do matter if you want to report on more entities with less sampling error.)
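For the statistically inclined, here is the back-of-the-envelope math behind that parenthetical. Under simple random sampling - an idealization of any real panel - the standard error of an estimated reach proportion shrinks with the square root of the in-tab size, which is why a larger, representative panel can report smaller entities at a given precision:

```latex
\[
SE(\hat{p}) \;\approx\; \sqrt{\frac{\hat{p}\,(1-\hat{p})}{n}}
\]
% Example: a site reaching \hat{p} = 1\% of users, measured with an
% in-tab of n = 300{,}000, has SE \approx \sqrt{0.01 \times 0.99 / 300{,}000}
% \approx 0.018\%. Cut n to a tenth and that error grows by \sqrt{10} \approx 3.2.
```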

Incidentally, our competitor touts reporting on 30,000 entities with its “new” methodology. In the June 2009 U.S. MMX release, Comscore reported on 70,000 entities; in our three-month rolling average report, we reported on 95,000 entities.

To be fair, I think Nielsen was legitimately vexed by the announcement of MMX 360 in May. Readers of this blog will know that 360 is Comscore’s groundbreaking move to panel-centric hybrid measurement, integrating data from our panel with census-level server data. Advertisers, agencies, and publishers are embracing this approach widely, and many of the world’s best media brands are already beaconing for us.

MMX 360 brings together the two primary datasets in online metrics - panel measurement, which provides a 360-degree view of the behavior of a sample of persons, and site-centric server data, which provides a “census” view of activity for a given web entity. The panel data provides person-level insights like demographics, time spent, engagement, and cross-site duplication (which are necessary for buyers and sellers of advertising to disentangle total ad impressions into reach versus frequency). The server data provides a census of all activity occurring at the website’s servers - every server call.
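To spell out the reach-versus-frequency point: server logs can count total impressions, but only person-level data can say how many distinct people those impressions reached, and therefore how often the average person was exposed. The underlying identity is standard media math, not anything proprietary:

```latex
\[
\text{Impressions} \;=\; \text{Reach} \times \text{Average Frequency}
\qquad\Longrightarrow\qquad
\text{Average Frequency} \;=\; \frac{\text{Impressions}}{\text{Reach}}
\]
% Example: 10{,}000{,}000 impressions delivered to 2{,}000{,}000 unique
% people is an average frequency of 5 exposures per person; the same
% impressions reaching 5{,}000{,}000 people is a frequency of 2.
```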

Of course the server data needs to be filtered to exclude non-human traffic (bots, spiders) and ineligible traffic (e.g., redirects and other non-valid traffic we often see tabulated in publisher server counts, but which we exclude from MMX). And, in the calculation of unique visitor numbers, server data must also be adjusted for the deleterious impact of cookie deletion. Then it must be parsed by country of origin, so that we only include U.S.-based traffic in the U.S. MMX, and Canada-based traffic in the Canada MMX, and so on. But once properly filtered, server data can enrich and enhance the quality of the overall MMX offering.
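As an illustration of those filtering steps - with hypothetical rules, field names, and an assumed cookie-deletion adjustment factor, not our production logic - a pass over raw server calls might look something like this in Python:

```python
import re

# Hypothetical sketch of the filtering steps described above:
# drop non-human and ineligible traffic, keep one country's traffic,
# then adjust unique-cookie counts for cookie deletion.

BOT_PATTERN = re.compile(r"bot|spider|crawler", re.IGNORECASE)
INELIGIBLE_STATUS = {301, 302, 307}  # redirects and similar non-valid calls

def filter_server_calls(calls, country="US"):
    """Keep only human, eligible, in-country server calls.

    Each call is a dict with (illustrative) keys:
    'user_agent', 'status', 'geo_country', 'cookie_id'.
    """
    kept = []
    for call in calls:
        if BOT_PATTERN.search(call["user_agent"]):
            continue  # non-human traffic (bots, spiders)
        if call["status"] in INELIGIBLE_STATUS:
            continue  # redirects / ineligible traffic
        if call["geo_country"] != country:
            continue  # keep only the reporting country's traffic
        kept.append(call)
    return kept

def estimated_unique_visitors(calls, cookie_inflation=1.15):
    """Raw unique cookies overstate unique people when cookies are deleted
    and reset; divide by an (assumed) inflation factor to adjust."""
    raw_unique_cookies = len({c["cookie_id"] for c in calls})
    return raw_unique_cookies / cookie_inflation

calls = [
    {"user_agent": "Mozilla/5.0", "status": 200, "geo_country": "US", "cookie_id": "a"},
    {"user_agent": "Googlebot/2.1", "status": 200, "geo_country": "US", "cookie_id": "b"},
    {"user_agent": "Mozilla/5.0", "status": 302, "geo_country": "US", "cookie_id": "c"},
    {"user_agent": "Mozilla/5.0", "status": 200, "geo_country": "CA", "cookie_id": "d"},
]
print(round(estimated_unique_visitors(filter_server_calls(calls)), 2))
```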

MMX 360 hybrid data won’t necessarily match a publisher’s internal server data, for all the reasons cited above. But for the first time, publishers will be able to understand and empirically reconcile the differences between their internal audience data and Comscore’s syndicated audience data. In addition, the trends in MMX data from month to month will correlate even more closely with trends in publishers’ internal data. Also, the hybrid approach will enable us to report at an even more granular level, and to introduce more frequent and timely data releases. The clients I’ve talked to on both the publisher and the agency side are uniformly excited about these developments.

And in case you’re wondering, the MRC audit process for the 360 component of MMX has already begun. (If this has whetted your appetite for MMX 360, why not click through to Comscore Direct and sign up?)

On reflection, I think that Nielsen Online had dedicated so much time and effort to their massive sample catch-up initiative that they did not see MMX 360 coming. MMX 360 is the next generation of online audience measurement; and the 360 announcement pre-empted their news.

I wouldn’t be surprised to see Nielsen begin steering the ocean liner in a new direction. One can only wonder where we’ll be by the time they get here.