Wednesday, January 21, 2015
Trust your instincts when you're working with Adobe Test&Target. Recently a customer of mine applied a Test&Target A/B test to part of their site. We let it run for close to a month, and at the end we were ready to declare a winner, but both the IA on the project and I thought the results were odd. A button at the bottom of the left-hand column, below the page fold, was a clear winner over a button at the top of the right-hand column. It made no sense, but we trusted the analytics that came back and began making plans to update our experience.

It turns out that a few days into our test, another dev team applied a change to the header, which changed the ID of a div container and essentially broke our test: only the left-hand column button was being rendered. So after the right-hand column button ran away with the early portion of the test, the left-hand column button became the only experience served, which allowed it to catch up and surpass the right-hand experience.

The moral of the story: don't just blindly trust the statistics that Test&Target generates when you see something that doesn't add up.

Lessons Learned:

1) It's important to know what the other development groups are touching. This was an innocuous change that shouldn't have impacted us, but it did.

2) Be sure to target your T&T tests to a very specific tag ID to avoid these kinds of situations.

3) Trust your instincts; when you think your results are odd, go looking for a failure in the test.
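One way to catch this kind of breakage early is a sanity check in the experience code itself: before treating a variant as live, verify that every container ID the test depends on still exists on the page. This is a minimal sketch in plain JavaScript, not Test&Target's own API; the ID names and the `lookup` parameter are hypothetical stand-ins for `document.getElementById`.

```javascript
// Sanity check: an experience should only count itself as rendered
// when every container it depends on is still present in the DOM.
// `lookup` stands in for document.getElementById so the check can
// run outside a browser; the IDs below are hypothetical examples.
function experienceCanRender(requiredIds, lookup) {
  var missing = requiredIds.filter(function (id) {
    return !lookup(id);
  });
  return { ok: missing.length === 0, missing: missing };
}

// Simulated pages before and after a header change renames a div ID.
var beforeChange = { "header": true, "left-col-cta": true, "right-col-cta": true };
var afterChange  = { "hdr-v2": true, "left-col-cta": true, "right-col-cta": true };
var required = ["header", "right-col-cta"];

var ok = experienceCanRender(required, function (id) { return !!beforeChange[id]; });
var broken = experienceCanRender(required, function (id) { return !!afterChange[id]; });

console.log(ok.ok);          // true: all required containers present
console.log(broken.ok);      // false: the renamed header div is gone
console.log(broken.missing); // ["header"]
```

Had a check like this fired an alert (or flagged the visit in analytics) the day the header changed, the skewed results would have surfaced in days rather than at the end of the month.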