More nuggets from I’m Feeling Lucky to help us understand Sergey Brin better. Douglas’s experience running the first set of banner ads for Google made him realize how data-driven the company’s founders were. Google had barter arrangements with Netscape and Go2Net, so the marketing department needed to come up with banner ads. “How many can you have by tomorrow?” Sergey asked. “Why don’t you start with a hundred.” Douglas threw himself at the task, roping in a copywriter and an artist. All the banners were critiqued. The comments from the founders never seemed to end, until one fine day Sergey said: “We should try a bunch of these out and see how they perform. We are wasting time by not running them and getting the ultimate feedback — clickthroughs.”
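The “ultimate feedback” Sergey wanted boils down to simple arithmetic: clicks divided by impressions for each banner. As a purely illustrative sketch (the banner names and numbers below are made up, not from the book), the bookkeeping Douglas goes on to describe might look like this in Python:

```python
# Hypothetical illustration: compute click-through rate (CTR) per banner
# and flag the underperformers for replacement. All figures are invented.

# Each banner maps to (impressions, clicks).
banners = {
    "banner_01": (120_000, 540),
    "banner_02": (118_500, 1_310),
    "banner_03": (121_200, 220),
}

def ctr(impressions, clicks):
    """Click-through rate: clicks as a fraction of impressions."""
    return clicks / impressions if impressions else 0.0

# Rank banners best to worst by CTR.
ranked = sorted(banners.items(), key=lambda kv: ctr(*kv[1]), reverse=True)
for name, (imps, clicks) in ranked:
    print(f"{name}: {ctr(imps, clicks):.3%} CTR ({clicks}/{imps})")

# Pull anything performing below, say, half the median CTR.
rates = sorted(ctr(*v) for v in banners.values())
median = rates[len(rates) // 2]
pulled = [name for name, v in banners.items() if ctr(*v) < 0.5 * median]
print("pull and replace:", pulled)
```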
Eventually the ads began to run, and Douglas says the pressure on him seemed to ease once he started the practice of plugging data about which ads were attracting higher click-through rates (CTRs) into a spreadsheet and hand-delivering it to Larry and Sergey. He would pull the ads that were not working and replace them with new banners. “I became a convert to the power of data persuasiveness,” Douglas writes.

Google is now justly famous for A/B testing its headlines and designs. Its popular Google Analytics tool, offered free to website owners, even features a fairly extensive A/B testing function.

HOW A/B TESTING BEGAN AT GOOGLE

Wired magazine ran an article some time ago on the origins of A/B testing at Google. The rotation of banner ads narrated above by Douglas Edwards is, of course, not part of that history. This is what the Wired article said:

<< At Google — whose rise as a Silicon Valley powerhouse has done more than anything else to spread the A/B gospel over the past decade — engineers ran their first A/B test on February 27, 2000. They had often wondered whether the number of results the search engine displayed per page, which then (as now) defaulted to 10, was optimal for users. So they ran an experiment. To 0.1 percent of the search engine’s traffic, they presented 20 results per page; another 0.1 percent saw 25 results, and another, 30. Due to a technical glitch, the experiment was a disaster. The pages viewed by the experimental groups loaded significantly slower than the control groups, causing the relevant metrics to tank. But that in itself yielded a critical insight — tenths of a second could make or break user satisfaction in a precisely quantifiable way. Soon Google tweaked its response times and allowed real A/B testing to blossom. In 2011 the company ran more than 7,000 A/B tests on its search algorithm. Amazon.com, Netflix, and eBay are also A/B addicts, constantly testing potential site changes on live (and unsuspecting) users. >>

A short sketch of how such a traffic split can be implemented appears at the end of this post.

‘DON’T INVOLVE EVERYONE; IT’S A WASTE OF TIME’

The UI team was debating names and designs for the browser button, a piece of code that allowed users to add Google search links to their browsers, writes Douglas. The mailing list had swelled to 10 people, and Sergey found it ridiculous. Instead of exhaustive user testing and internal deliberations, Sergey said, Google should just put up the service and test it when it got the chance. “Don’t forget. We can change it at the drop of a hat,” he said. ‘Launch first, iterate later’ was Sergey’s perspective.

Then he expounded his philosophy. “I don’t think we should have any meetings about a project like this,” he said, “or any group emails except the one to users announcing the launch. Having everyone involved in every issue is not a good use of anyone’s time.”

(To be continued. Vignettes taken from the book I’m Feeling Lucky by Douglas Edwards)
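As promised above, here is a minimal sketch of the mechanics the Wired piece describes: routing a fixed fraction of live traffic into experimental arms by hashing a stable identifier into buckets. Everything here (the function name, the cookie IDs, the choice of SHA-256) is an illustrative assumption, not Google’s actual implementation; only the 0.1 percent arm sizes and the 20/25/30 results-per-page variants come from the article.

```python
import hashlib

# Hypothetical sketch of deterministic traffic splitting for an A/B test.
# Hash a stable identifier (e.g. a cookie ID) into 1000 buckets, so each
# bucket holds 0.1% of traffic. Buckets 0-2 get the experimental page
# sizes from the 2000 experiment; everyone else stays on the control.
ARMS = {0: 20, 1: 25, 2: 30}  # bucket -> results per page
CONTROL_RESULTS = 10
NUM_BUCKETS = 1000

def results_per_page(user_id: str) -> int:
    """Deterministically assign a user to a control or experiment arm."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % NUM_BUCKETS
    return ARMS.get(bucket, CONTROL_RESULTS)

# The same user always lands in the same arm, so their experience stays
# stable and per-arm metrics (latency, clicks) can be compared cleanly.
for uid in ["cookie-1842", "cookie-0007", "cookie-9131"]:
    print(uid, "->", results_per_page(uid), "results per page")
```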