July 22, 2014
Case Study

Marketing Analytics: How a drip email campaign transformed National Instruments' data management

SUMMARY: Metrics, and the data behind those metrics, are the lifeblood of digital marketing and email campaigns. But the initial analysis of the raw data sets is not as straightforward as it might seem, as National Instruments found out with the global rollout of a drip email campaign.

Read on to find out why a change in analysts caused a precipitous drop in the reported conversion rate, and how the team responded to the challenge by creating a more robust and consistent data handling and management process.
by David Kirkpatrick, Manager of Editorial Content

One of the key changes digital channels and strategies have brought to marketing has been a wealth of marketing campaign data that can be analyzed and turned into useful information to improve or affect future campaigns.

Within digital marketing, the email channel has long offered marketers data in the form of open rates, clickthrough rates, bounce rates and a number of other metrics. One challenge presented by all this data is that, in order to become useful, it has to be analyzed, and analyzed in a meaningful fashion.

Read on to find out how a relatively simple drip email campaign led National Instruments to re-examine its data analysis process and begin a transformation in data analysis across the enterprise.

CHALLENGE

National Instruments is a global B2B company with a customer base of 30,000 companies in 91 countries. It produces automated test equipment and virtual instrumentation software. In 2012, National Instruments conducted about 800 email campaigns per month.

Ellen Watkins, Manager, Global Database Marketing Programs, National Instruments, found the trigger that truly transformed how her team handled data in a drip email campaign for the global rollout of LabVIEW, which Watkins described as National Instruments' "flagship product."

As the campaign evolved from beta testing to a full rollout, the reported conversion rate for the program shifted in what could be considered an alarming fashion.

The end result of the National Instruments team truly understanding how data analysis can affect a campaign's reported metrics was a transformation in its data handling and reporting. This included more rigorous documentation, clearer standards on how email conversion was defined, and the creation of a "single source of truth" for National Instruments' data.

CAMPAIGN

Many MarketingSherpa case studies go into great detail on the actual campaign. In this case, the focus is a little different: the drip email campaign covered in Step #1 below was the catalyst for a full re-evaluation of data analysis and handling at National Instruments, providing insight into just how important data management is when determining results metrics.

Step #1. Execute the drip email campaign

Because the full email campaign involved National Instruments' flagship product, it initially ran as a beta program in several of the company's markets, including:
  • United States

  • United Kingdom

  • India

After the beta program, the campaign was rolled out globally across National Instruments' entire marketplace.

The campaign also included several programs, each reaching out to a specific stage of the product's sales funnel, including:
  • Awareness phase

  • Consideration phase

  • Evaluation phase

  • Customer success phase

For example, the evaluation phase email campaign was a 30-day program with a multi-touch email stream, and the goal of the program was to convert prospects who were evaluating the software into paying customers.

The email stream was triggered by an evaluation form submission and included seven touches (a scheduling sketch in code follows this list):
  1. Immediately after the form submission

  2. Day 1

  3. Day 7

  4. Day 14

  5. Day 21

  6. Day 28

  7. Day 30
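
For illustration only, here is a minimal sketch of how a trigger-based schedule like this might be expressed in code. The structure and names are hypothetical assumptions, not National Instruments' actual marketing automation setup.

```python
from datetime import date, timedelta

# Offsets, in days, from the evaluation form submission.
# Touch 1 (offset 0) sends immediately after the form is submitted.
TOUCH_OFFSETS = [0, 1, 7, 14, 21, 28, 30]

def schedule_touches(form_submitted: date) -> list[date]:
    """Return the send date for each of the seven touches in the stream."""
    return [form_submitted + timedelta(days=offset) for offset in TOUCH_OFFSETS]

# Example: a prospect submits the evaluation form on March 3, 2013.
for touch, send_date in enumerate(schedule_touches(date(2013, 3, 3)), start=1):
    print(f"Touch {touch}: {send_date}")
```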

This stream of lead nurturing emails was compared against what had previously been a single-touch email during the evaluation phase.

During beta testing, conversion was 8% compared to the previous single-touch email's conversion rate of 4% — a promising result for the team.

The first data analytics challenge occurred when the seven-touch program rolled out globally and conversion dropped from 8% to 5%. Even so, the team wasn't overly concerned, since the program went from a relatively limited beta test to emails sent to 40 countries, and conversion was still better than the previous single-touch campaign.

Step #2. Discover issues in the data handling and analysis process

Watkins said the team brought in a new analyst after the program launched globally. The change was made out of necessity, not simply for the sake of bringing in fresh eyes.

The new global database marketing analyst, Jordan Hefton, also had to build a new data analysis methodology out of necessity. She had access to the reports the previous conversion rates were based on, but no documentation on how those numbers were reached.

Hefton had to perform her own analysis on the same data and came up with a different conversion rate than the previous analyst — 2%.

Watkins said, "First of all, we [asked,] 'What? Why?' because we expected to actually see improved numbers because the data was tighter."

Stephanie Logerot, Database Marketing Specialist, National Instruments, agreed: "When I first saw the number change, I was a bit freaked out."

Step #3. Clearly define the team structure

Watkins and Department Manager Rosanna Martinez built the database marketing team with clearly defined roles across three basic functional areas:
  • Strategists — marketing and journalism background providing content and creativity

  • Automation — the technical skill set, the builders

  • Analysts — the data minds and data crunchers

Watkins said gaining approval throughout National Instruments for tightened data analytics continues to be part of an ongoing evolution at the company.

"The business was asking for solid data analytics and we needed to provide it, so we made it happen within our own team," Watkins said.

Logerot added there was some initial skepticism about adding dedicated data analytics resources to the employee headcount, and the team essentially had to ask the leadership team to trust them. The team proved that trust was warranted by delivering results that actively helped in making big decisions on content and strategy.

Step #4. Determine how metrics will be calculated

At this point, Hefton tried to determine why the metrics she calculated indicated a lower conversion rate than the previous analyses had.

As best as she could determine, the first group of metrics included numbers pulled together by other analysts on the team that were "a little more general."

Hefton added, "I came in and tightened the criteria a lot. Essentially, we are looking at [the fact that] email performance didn't change a lot, but our conversions did. That was the issue."

The tightened criteria were a better way of looking at the business and marketing ROI of the email program, according to Hefton.

A requirement was put into the data analysis rules: If a contact became a customer of LabVIEW, they had to meet the new criteria in order to be considered a conversion for Marketing.

"It sounds like an elementary thing when you're talking about it now," Hefton said. "But, just the way that our data was structured in affixing dates to our purchase, or to a quote, and then tying that back to when marketing material was sent to [the prospect], that analysis was something we hadn't really done before."

An important aspect of the tightened criteria for calculating conversion was requiring that a contact actually engage with the email program, as measured by opens and clickthroughs, in order for that contact to count toward the conversion metric. A simplified sketch of this logic follows.
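
To make the rule concrete, here is a minimal sketch of the tightened conversion check, assuming a hypothetical contact record. The field names and structure are illustrative assumptions; National Instruments' actual data schema was not disclosed.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Contact:
    # Hypothetical fields for illustration; not the actual schema.
    email_sends: list[date]        # dates program emails were sent to the contact
    engagements: list[date]        # dates the contact opened or clicked a program email
    purchase_date: Optional[date]  # date of the LabVIEW purchase, if any

def is_marketing_conversion(contact: Contact) -> bool:
    """Tightened criteria: the purchase must follow a marketing touch,
    and the contact must have actually engaged (opened or clicked)."""
    if contact.purchase_date is None:
        return False
    touched = any(send <= contact.purchase_date for send in contact.email_sends)
    engaged = any(event <= contact.purchase_date for event in contact.engagements)
    return touched and engaged

# Example: a contact who received touches, clicked once, then purchased.
buyer = Contact(
    email_sends=[date(2013, 3, 3), date(2013, 3, 10)],
    engagements=[date(2013, 3, 10)],
    purchase_date=date(2013, 3, 28),
)
print(is_marketing_conversion(buyer))  # True
```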

Step #5. Collaborate with internal resources to further improve the process

The team didn't settle for just improving its own calculations; it also wanted to learn from other groups within National Instruments. So the team explored ways of leveraging internal resources, for example, in learning more about large-scale data collection.

"We have a business intelligence group here that we definitely pulled some knowledge from in terms of systematic expertise in dealing with massive amounts of data," Hefton explained.

She said the business analysts and business intelligence groups helped the team identify how to more accurately measure the return on investment for the email program.

Logerot added that the level of detail was "revolutionary" at National Instruments in terms of being able to see what quotes and leads are being tracked from individual programs.

Step #6. Create a single, enterprise-wide version of the truth

Hefton said a personal challenge was not to become siloed as an analyst, slicing and dicing the data as she alone saw fit. Getting the team, and then outside groups such as Product Marketing, involved in the process with real input into the final decisions was important.

These new partnerships within National Instruments have helped to produce analytics that resulted in better business decisions, and since these calculations weren't created in a vacuum, there is greater buy-in and trust in the numbers.

For example, Database Marketing began using the business intelligence group's definitions of when a lead is generated and when the tracking of a conversion begins. The database marketing team now starts its reports on the same date as other business units. Previously, these units were operating in silos.

"We modified some of what we did to match the rest of the business so that we could have one view [of the data]" Watkins explained.

Step #7. Continue to collaborate with each new analytics report

This new level of collaboration didn't stop once the new definitions of metrics were established. Each new report now requires a fresh round of collaboration.

The reports are distributed on a quarterly basis. First, they are shared internally with strategists such as Logerot, and then the strategists take the reports to other business units after a team review to ensure everyone is on the same page regarding the numbers and business implications of the report.

The report eventually receives full internal transparency by being posted to the internal community at National Instruments.

"What I'm seeing is a bridge is starting to happen. Not only aligning on the definition and the documentation and the start times, and the quotes and orders that affected a particular report," Watkins said, "but we are also starting to see, slowly but surely, bridging within these analytics teams. That's really the next evolution — it's how do we get the database analyst team talking to the Web team more."

Hefton added that, at the beginning, there was a lot of hand-holding when presenting the new reports featuring the tighter data-handling criteria. There were meetings to become aligned internally within the database marketing team, and then outreach to other business units such as the product marketing team.

From there, alignment even extends to each global region with quarterly data analytics reports. There is dedicated time to review the report on a global scale so that every region and every country can drill into their specific programs.

RESULTS

The drip email campaign highlighted earlier was measured on "evaluation conversion." In other words, for a complex product like this, the initial goal was to get prospects to evaluate the product and its cost. So the KPIs tracked were conversion-to-quote, and then ultimately conversion-to-purchase, according to Watkins.

Logerot said email metrics for the campaign are above National Instruments' benchmarks. This campaign during Q1 2013 realized a 30% open rate, 19% open-to-click and 5.8% unique clickthrough rate.

Watkins said these numbers are some of the best email metrics she's seen during her time as manager of global database marketing programs at National Instruments.
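
As a quick consistency check, these three figures chain together: opens are a share of delivered emails, clicks are a share of opens, and unique clickthrough is the product of the two. The sketch below assumes a hypothetical send volume, since the actual campaign volume was not disclosed; the small gap versus the reported 5.8% is rounding in the published percentages.

```python
# Hypothetical volume for illustration; the real send count was not disclosed.
delivered = 100_000
opens = round(delivered * 0.30)  # 30% open rate -> 30,000 opens
clicks = round(opens * 0.19)     # 19% open-to-click -> 5,700 unique clickers

unique_ctr = clicks / delivered  # 5,700 / 100,000 = 0.057
print(f"Unique clickthrough rate: {unique_ctr:.1%}")  # 5.7%
```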

The conversion rate for purchases was 2%, but that metric reflects the newer, more rigorous data-handling criteria. Now, the database marketing team can track its campaigns with full confidence in the reporting, and it can also prove exactly how well, or how poorly, those campaigns are performing.

A key understanding throughout this entire process was the realization that database marketing was operating in its own silo, and now the team is actively working to be more integrated across the entire enterprise.

Watkins said that this is part of a chain reaction of "where we're headed as an organization anyway — which is standardization, simplification, consistency and really trying to get accurate, consistent data."

Campaign Team

National Instruments
Jordan Hefton, Global Database Marketing Analyst
Stephanie Logerot, Database Marketing Specialist
Ellen Watkins, Manager, Global Database Marketing Programs

(Team update: Watkins' current title is Global Marketing Ops Training Manager; and both Logerot and Hefton have moved on to other opportunities.)

Related Resources

Analytics: How metrics can help your inner marketing detective

Marketing Research Chart: Top data analysis challenges for landing page optimization

Marketing Analytics: 4 tips for productive conversations with your data analyst

Marketing Analytics: 20% of marketers lack data


