When split testing with a web tool like Optimizely or Mixpanel, you may sometimes see inconsistent results, even with relatively large sample sizes. At OpenSky, we sometimes see test cases’ performance converge and then swap places over time.
If your volume supports it, a quick solution is to run an A/A/B/B test (four simultaneous variants, duplicating each option) instead of a simple A/B test. This way, if either the A’s or the B’s are inconsistent with their duplicate, you know the test itself is wonky. (It’s a somewhat easier alternative to running the test twice, at least from a cross-team perspective.)
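One way to make “inconsistent with their duplicate” concrete is a two-proportion z-test between each pair of identical arms: since A1 and A2 (and B1 and B2) serve the same experience, a significant difference between them suggests a problem with the test setup rather than with the variants. The sketch below is a minimal stdlib-only illustration of that idea; the function names and the normal-approximation test are our own assumptions, not part of any particular tool’s API.

```python
import math


def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for equal conversion rates (normal approximation)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return 1.0
    z = (conv_a / n_a - conv_b / n_b) / se
    # Phi(z) = 0.5 * (1 + erf(z / sqrt(2))); two-sided p-value
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))


def aabb_sanity_check(a1, a2, b1, b2, alpha=0.05):
    """Each arm is a (conversions, visitors) tuple.

    The duplicate arms (A1 vs A2, B1 vs B2) serve identical experiences,
    so they should NOT differ significantly. If either pair does, flag
    the whole test as suspect ("wonky") before trusting the A-vs-B result.
    """
    p_aa = two_proportion_p_value(*a1, *a2)
    p_bb = two_proportion_p_value(*b1, *b2)
    return {"p_aa": p_aa, "p_bb": p_bb, "wonky": p_aa < alpha or p_bb < alpha}


# Hypothetical numbers: both duplicate pairs agree, so the
# underlying A-vs-B comparison is worth taking seriously.
result = aabb_sanity_check((120, 1000), (118, 1000), (150, 1000), (155, 1000))
```

Note the asymmetry in how the p-values are used here: a *low* p-value between duplicates is bad news, the opposite of how you’d read the headline A-vs-B comparison.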