In most large organizations, tasks such as optimizing yield (revenue per page view), user retention, sign-up rate, web server performance (speed), sales per page view, newsletter open rate, and user experience are spread across several teams.

Each team does its own A/B or multivariate testing, but in most cases there is no centralization of these efforts, and no global predictive model associated with the A/B testing activities. By contrast, small organizations can test multiple levers across the entire spectrum at the same time, and do a better job even without any statistical model. Here's what we did, and the results:
  • replace Flash banner ads with animated GIFs ==> significant user experience improvement for IE users (particularly for users with a brand new computer)
  • remove the "visitor map" ==> increased sign-up rate, as users are more comfortable about their privacy
  • remove calls (e.g. JavaScript, widgets) to external websites ==> increased web page delivery speed and improved user experience (users feel more secure, higher propensity to sign up)
In the past, we increased the sales per page view ratio by 30% and revenue per page view by 80%.
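As a rough illustration of how a single lever like the "visitor map" removal could be evaluated, here is a minimal sketch of a two-proportion z-test on sign-up rate in Python. The visitor and conversion counts are hypothetical placeholders, not figures from this post.

```python
# Minimal sketch: two-proportion z-test for a single A/B lever,
# e.g. sign-up rate with vs. without the "visitor map" widget.
from statistics import NormalDist

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Return the relative lift of B over A and the two-sided p-value."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical 50/50 traffic split: A keeps the widget, B removes it.
lift, p = ab_test(conversions_a=180, visitors_a=10_000,
                  conversions_b=225, visitors_b=10_000)
print(f"lift: {lift:+.1%}, p-value: {p:.4f}")
```

A small organization can run a test like this independently on each lever (banner format, external calls, page speed) and simply keep the variants that win, without any centralized model.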

Replies to This Discussion

Nice, simple approach. Common sense remains king; too many executives get fanatical about this 'data driven' thing without having the relevant knowledge. Way too often people aren't even thinking about performance's relationship to improving yield ... frightening!

Eric Peterson wrote a thoughtful whitepaper called 'Successful Website Testing Practices'. The link to the paper can be found on the page below. You'll have to give them your email if you're interested.

http://eon.businesswire.com/portal/site/eon/permalink/?ndmViewId=ne...
