Being a data-driven company, we at Coursera appreciate the importance of being able to A/B test various hypotheses. At any given time, we’ll be running various tests on the site and analyzing results to pick the winning variants. These tests range from small ones – changing the default tab names on the learner’s course page – to larger ones involving more systemic changes, such as experimenting with different types of visualizations for the data analytics dashboards that we provide our instructors. We use the results of these tests to continually iterate on and build better products, both for our learners and for our instructors and university partners.
So as part of our external tech-talk series, we were excited to host Ya Xu from LinkedIn to hear about their experimentation platform. The talk, attended by well over 200 engineers and data scientists from across the Bay Area, started by highlighting the importance of A/B testing. This was followed by a discussion of the broad range of considerations that go into designing a software platform for managing web-based experiments, along with the pros and cons of other platforms used in the industry (Bing, Google). The last part of the talk covered various gotchas learned from tests run at LinkedIn and Microsoft, particularly around significance testing.
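As a minimal illustration of the kind of significance testing involved in evaluating an A/B test, the sketch below runs a two-proportion z-test on conversion counts for a control and a variant. The numbers and function name are hypothetical, not from any of the talks:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Test whether variant B's conversion rate differs
    significantly from variant A's (two-sided)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 480/10,000 conversions in control,
# 560/10,000 in the variant.
z, p = two_proportion_z_test(480, 10_000, 560, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

One of the gotchas the talk touched on is that repeatedly peeking at a p-value like this while a test is still running inflates the false-positive rate, which is part of why a managed experimentation platform is valuable.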
We also had Tom Do, head of analytics at Coursera, talk about various tests run on our site, along with some interesting results. We have also just started building our next-generation A/B testing platform, with tighter integration with our metrics and dashboard pipeline.
The videos of both talks, along with the slides, are great resources to start learning about A/B testing. We already have an exciting lineup of external tech-talks set up, starting with one about Mesos towards the end of the month. We hope to see you at these talks soon!