For journalists, writing the best headlines has always been a core element of the profession. The challenge is knowing which headline readers will find most tempting.
In marketing, A/B testing is often used to compare the effectiveness of different landing pages. These tests are usually prepared well in advance, and performing them takes some effort.
But what about the daily life of a newsroom? Could editors test two alternative headlines with readers on the fly, and then choose the version that more readers clicked on? All in a matter of minutes?
Those were the questions discussed between editors at Norway's largest newspaper, Aftenposten, and its team of developers at Schibsted Tech Polska. Aftenposten wanted to explore whether A/B testing could be integrated into the editorial workflow of the newsroom.
– For journalists it is very important to be as close to their readers as possible. Our job as programmers is to develop tools that help them do so and make it easier for them to produce better content, says Robert Tekielak, who leads Aftenposten's team at Schibsted Tech Polska.
Editorial A/B tool
Now the first version of an A/B tool is being tested, allowing editors to quickly try two different story headlines on a smaller group of readers before choosing the final one.
– The title on the front page is in many cases the first contact point for the reader, emphasizes Robert Tekielak.
A/B tool in four parts
The first version of the editorial A/B tool was developed to test headlines on Aftenposten's mobile site.
Robert Tekielak explains that the tool is composed of four parts:
- The A/B tool admin panel. This is where journalists apply test scenarios to their stories. They can test two or more versions of a headline against each other, and the test stops when each version has had 1000 impressions, or any other number set by the tester. The key metric for the final choice of headline is the click-through rate.
- The A/B tool engine. The heart of the tool, with all the logic needed to perform the tests: for instance, drawing a variation for each user, presenting the previously drawn headline to returning users, and caching.
- Integration between the mobile front-end and the A/B tool engine.
- The statistics engine. All data from the tests are stored here. The statistics engine receives data from the mobile front-end; the data are then cleaned and saved in formats that are easy for the A/B tool to use. Data are also saved in raw format for future analysis. The A/B tool engine uses the Stats API to retrieve the data it needs.
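The engine's "draw the variation" step can be sketched with a deterministic hash, so that a returning user always sees the same headline without any server-side session state. This is only an illustration of the idea; the function names and hashing scheme are assumptions, not Aftenposten's actual implementation:

```python
import hashlib

def draw_variant(user_id: str, test_id: str, variants: list) -> str:
    # Hash the (test, user) pair so the same user always lands in the
    # same bucket for a given test, even across visits.
    digest = hashlib.sha256(f"{test_id}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]
```

Because the assignment is a pure function of the user and test identifiers, caching the drawn headline becomes trivial and no variant table needs to be stored per user.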
– We have tried to develop a future-proof solution, says Tekielak.
– That is why the statistics engine is completely separated from the tool engine. This makes it easier to later introduce tests in other areas and to reuse the data for other purposes.
Easy for journalists
Running a headline test is a quick and easy operation for the editor: he or she enters an alternative headline, and by default each of the two alternatives is presented to 1000 readers to see which gives the higher click-through rate.
The editor is notified when the test is over. Then he/she can set the winner as the new headline, run a new test or return to the original headline. Until a decision is made, the A/B tool will present the original headline.
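The decision logic described above can be sketched in a few lines: keep serving the original headline until every variant has reached its impression cap, then pick the variant with the highest click-through rate. This is a minimal illustration under assumed data shapes, not the tool's actual code:

```python
def pick_winner(results: dict, min_impressions: int = 1000):
    """results maps variant name -> (impressions, clicks).

    Returns the winning variant, or None while the test is still
    running (in which case the original headline stays live).
    """
    if any(imp < min_impressions for imp, _ in results.values()):
        return None
    return max(results, key=lambda v: results[v][1] / results[v][0])
```

For example, with 50 clicks vs. 80 clicks over 1000 impressions each, variant "B" wins; if either variant is still short of 1000 impressions, the function returns None.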
Aftenposten's team explains which technologies were used to develop the tool:
– The statistics backend consists of a Flume agent, a Python parser and a Clojure web application. Flume gathers data from the tracking pixel, moves it to Amazon Kinesis for real-time processing, and also archives it in Amazon S3. The parser aggregates the data from Kinesis and stores it in a Redis cache. Finally, the web application uses the Redis data to expose REST services with statistics for running tests, says Agnieszka Frolik, senior software engineer.
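The aggregation step in that pipeline boils down to counting impressions and clicks per (test, variant) pair. The sketch below uses an in-memory dict where the real parser would write to Redis, and the event-record shape is an assumption for illustration:

```python
from collections import defaultdict

def aggregate(events):
    """events: iterable of (test_id, variant, event_type) tuples,
    as they might arrive from the tracking pixel via Kinesis."""
    stats = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    for test_id, variant, event_type in events:
        key = (test_id, variant)
        if event_type == "impression":
            stats[key]["impressions"] += 1
        elif event_type == "click":
            stats[key]["clicks"] += 1
    return dict(stats)
```

Keeping the counters keyed by (test, variant) makes it cheap for the REST layer to answer "how is test X doing?" with a single lookup per variant.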
– The biggest challenge in the A/B tool is to determine whether the alternative headline actually affects users' behavior or whether the variations are random. We implemented two algorithms to exclude random chance. The first calculates a required sample size based on the Baseline Conversion Rate (the control group's expected conversion rate), the Minimum Detectable Effect (the minimum relative change in conversion rate you would like to be able to detect) and the desired statistical significance. The second compares results using a z-score test, reveals software engineer Przemysław Pióro.
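Both calculations are standard in A/B testing and can be sketched with the stdlib alone. The sample-size formula below is the usual two-proportion power calculation, with z-values fixed for 95% significance and 80% power; the exact formulas and parameters Aftenposten's tool uses may differ:

```python
from math import sqrt, erf, ceil

def sample_size(baseline: float, mde: float) -> int:
    """Impressions needed per variant, given a baseline conversion
    rate and a relative Minimum Detectable Effect (e.g. 0.2 = 20%)."""
    z_alpha, z_beta = 1.96, 0.8416  # 95% two-sided significance, 80% power
    p1 = baseline
    p2 = baseline * (1 + mde)
    p_bar = (p1 + p2) / 2
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p1 - p2) ** 2
    return ceil(n)

def z_test(clicks_a: int, n_a: int, clicks_b: int, n_b: int):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf, to avoid a scipy dependency
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

With a 10% baseline click-through rate and a 20% relative MDE, the required sample lands in the high three thousands per variant; note this is well above the tool's default of 1000 impressions, which is why the significance check matters before declaring a winner.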
– The connection between the mobile front page and the A/B tool engine is provided by a Bower-compatible package, which is optimized to run as fast as possible to eliminate the flicker effect and ensure a smooth user experience, describes Maksymilian Bolek, software engineer.
For the developers at Schibsted Tech Polska, the A/B tool is part of a bigger effort: helping journalists at Aftenposten become truly data-driven.
The ambition is to be able to use data about how users interact with the content in a more sophisticated way.
A similar initiative has been to explore how real-time click-stream data can be used to help the journalists produce more timely content.
Here is a presentation Robert Tekielak gave at an internal “super demo” in the fall of 2014: