Keeping the Trains Running on Time: A Case Study in Data Capture
Right at the end of the process of preparing its first campaign codes through Claravine, Amtrak ran into a serious problem. The email pattern had been identified and selected without issue, and the new codes and classifications followed suit. But when the landing page URL was entered for each of these email links, Claravine’s interface lit up with errors. My phone rang, and the users asked me what they had done wrong.
Nothing; they had done nothing wrong. Yet Claravine had detected that the codes were still not being captured. Diligent investigation uncovered a single JavaScript error on the landing page that, though unrelated to the core analytics implementation, had created a domino effect that blocked the Adobe Analytics data transmission entirely. No data sent meant that no code could be captured.
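The failure mode is easy to reproduce in miniature. The sketch below is hypothetical page code, not Amtrak’s actual implementation: it assumes the analytics call shares an execution path with the broken code, so an exception thrown by an unrelated widget halts the script before the tracking beacon ever fires.

```javascript
// Hypothetical illustration of an unrelated error blocking analytics.
let beaconSent = false;

function sendAnalyticsBeacon() {
  // Stand-in for an analytics page-view call (e.g. an Adobe Analytics hit).
  beaconSent = true;
}

function unrelatedWidget() {
  // A bug in unrelated page code: touching a property of undefined.
  const el = undefined;
  el.classList.add("active"); // throws TypeError
}

try {
  unrelatedWidget();     // fails here...
  sendAnalyticsBeacon(); // ...so this line is never reached
} catch (err) {
  console.log("beacon sent:", beaconSent); // beacon sent: false
}
```

The page looked fine to a human visitor, which is exactly why a verification step like Claravine’s catches what a manual checklist cannot.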
Had this campaign been launched two weeks earlier, before Amtrak had transitioned to Claravine, those marketers would have captured absolutely no data about their campaign, despite having followed their inherited checklist with meticulous care. After the page was repaired, the verification step was repeated within Claravine; each link passed with flying colors.
Although this specific launch plan was temporarily derailed, the reporting data was absolutely worth saving. It always is. The ability to step back after any campaign and dispassionately judge its worth is the basis of the entire science of experience optimization. But until companies move to fully automate the tracking code process, valuable time that should be focused on which marketing efforts did or didn’t work is instead spent on what did or didn’t track.
It’s time to fully automate the tracking code process, so we can finally shift the conversation from what did or didn’t track, to what did or didn’t work.