
March 18, 2008

SES New York: Landing Page Testing & Tuning

By Brian Cosgrove

Moderator: Sage Lewis, (www.sagerock.com)

This session had only one speaker, which is atypical for an SES conference. Fortunately, the speaker came with a slide deck full of examples to illustrate each point and plenty of enthusiasm to keep the entire room engaged.

Who should design your website?

Marketers? IT? No! Visitors should design your website! You get thousands of people who can test your experiments: guinea pigs willing to give you answers about your site.

Case Studies:

Tim presented a number of test scenarios from various sites. For example, he showed that RealAge.com received a 40% lift in conversion rate once the best-performing form was identified.

The headline, the length of the questions, the look of the button… these were all factors for the site’s registration page. The point is that a number of subtle changes meant $3 million to the bottom line. Thinking streamlined, shorter, and simpler is a good way to get a page to convert. A big green round-edged button doesn’t seem to hurt either. In many cases a radical simplification is the best option.
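The arithmetic behind a lift figure like RealAge’s is simple. A sketch with hypothetical rates and traffic (the session didn’t share the underlying numbers):

```python
# Hypothetical numbers to show how a lift like RealAge's is computed;
# the actual rates and traffic were not given in the session.
baseline_rate = 0.050   # 5.0% of visitors convert on the old form
variant_rate = 0.070    # 7.0% convert on the streamlined form

lift = (variant_rate - baseline_rate) / baseline_rate
print(f"Relative lift: {lift:.0%}")  # Relative lift: 40%

# At scale, a small rate change compounds into real money:
monthly_visitors = 500_000      # hypothetical traffic
value_per_conversion = 25.0     # hypothetical dollars per conversion
extra = monthly_visitors * (variant_rate - baseline_rate) * value_per_conversion
print(f"Extra revenue/month: ${extra:,.0f}")  # Extra revenue/month: $250,000
```

The takeaway is that lift is relative: a two-point absolute improvement on a 5% baseline is a 40% lift.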

The Matrix:

Tim’s Matrix is a calculation to decide whether people’s needs are being met.

The Matrix = Roles x Tasks x AIDA

AIDA stands for Awareness, Interest, Desire, Action.

In essence this means:

Getting the Right People

through the Right Activity

in the Right Order.
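The matrix is literally a cross product: every (role, task, AIDA-stage) cell is a place where a visitor’s need is either met or missed. A minimal sketch, with hypothetical roles and tasks for an airline site (the AIDA stages come from the session):

```python
from itertools import product

# Hypothetical roles and tasks; the AIDA stages are from the session.
roles = ["new visitor", "returning customer"]
tasks = ["book a flight", "check in", "track a flight"]
aida = ["Awareness", "Interest", "Desire", "Action"]

# The Matrix = Roles x Tasks x AIDA
matrix = list(product(roles, tasks, aida))
print(len(matrix))  # 2 roles x 3 tasks x 4 stages = 24 cells
for role, task, stage in matrix[:3]:
    print(f"{role} / {task} / {stage}")
```

Even a modest site yields dozens of cells, which is why auditing them systematically beats guessing.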

Example Roles:

Southwest Airlines had a number of actions that a new or returning visitor might want to perform. They made sure to organize their homepage to reflect these roles.

Common Awareness Problems:

Banner Ads can be distracting and could lead users away from the primary point of the site.

Entry pop-ups are annoying and invasive.

A cluttered home page, such as Adorama’s with 146 links, is confusing and overwhelming.

A site with good awareness will focus on categories. That is, if you have a lot of crap, let people focus on the subset of crap that they care about.

Keys to Creating Awareness:

  1. Stop screaming at your visitors – Flashing banners or lots of competing visual elements will drive a negative response.
  2. Eliminate choices – Fewer choices put more prominence on each one.
  3. Unclutter what remains – A clean interface simplifies choice.

Rules of Web Awareness:

  1. If you cannot find something easily, it doesn’t exist.
  2. If you emphasize too many items, all of them lose importance.
  3. Any delay increases frustration.

Typical Desire Activities: Research and Compare.

Example: A user may go to a shoe store and research the options by a number of criteria, such as text, category, brand, size, color, “On Sale,” or “New.”

A site that is unhelpful for the research component of desire is Zappos. Its search feature lends itself to zero-result pages that make you reenter your search criteria (or fill out a form to get updates on new sizes…).

Rules of Web Desire:

  1. Make me feel appreciated
  2. Make me feel safe
  3. Understand that I am in control

Action Stage Consideration:

Brand Strength – Some users buy on brand. This is more the result of long-term efforts.

Previous Resource Investment (“satisficing”) – Users tend to settle for the first good-enough option; maybe yours is the next one that comes along.

The total solution- Users may be looking for the all-in-one value: availability, customer service hours, return policy, price, free shipping, etc…

Risk reducers & credibility:

These concepts are different. Risk reducers eliminate things that would scare a user away. Credibility increases the likelihood that this site is the best place to convert.

Unhelpful Risk Reducers:

Trust and credibility symbols below the fold or placed as an afterthought.

Helpful Risk Reducers:

Petsmart put their HackerSafe symbol in the upper left, where a logo would normally appear.

Credibility and Validation

A lead form on the left side of the page is complemented by a list of high-profile customers on the right.

Rules of Web Action:

  1. Get out of my way.
  2. Make it easy.
  3. Don’t surprise me.

Bad Web Action:

Overstock.com’s checkout screen is reconfigured when you click radio buttons indicating whether you’re a new or returning customer. A better design would focus on the new customer first and make any registration occur after the checkout process.

Transaction interruptions, such as a popup during checkout, will drive down conversion. Don’t ask the customer “Would you like fries with that?” through a popup when they have their wallet open. Don’t interrupt the checkout process.


Most tuning methods don’t take into account the interactions between elements. For example, the headline “Ferraris are Fast” would go well with an image of a fast-moving car. It would not go well with an image of a car wrapped around a tree. A picture of a car wrapped around a tree would go well with the headline “Volvos are Safe” if it accompanies a story of a person walking away from a horrific accident.

That is: it’s not the picture, it’s not the headline, it’s the context in which they appear.

The best setting for a variable depends on its context, and it’s best to maximize positive interactions. Not only do interactions exist, they can be very strong. Ignoring them will lead to suboptimal results. A/B split and multivariate/Taguchi testing assume that there are no interactions.
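The Ferrari/Volvo point can be made concrete with numbers. A sketch with made-up conversion rates: analyzing each variable on its own (main effects only, the way interaction-blind methods do) picks a recipe that is far from the true best combination.

```python
# Made-up conversion rates illustrating a strong interaction:
# the best headline depends on which image it appears with.
rates = {
    ("Ferraris are Fast", "speeding car"): 0.060,
    ("Ferraris are Fast", "wrecked car"):  0.010,
    ("Volvos are Safe",   "speeding car"): 0.020,
    ("Volvos are Safe",   "wrecked car"):  0.055,
}

def avg(value):
    """Main-effect average for one headline or image, ignoring interactions."""
    matched = [r for combo, r in rates.items() if value in combo]
    return sum(matched) / len(matched)

# Interaction-blind analysis: pick each variable's best value independently.
best_headline = max(["Ferraris are Fast", "Volvos are Safe"], key=avg)
best_image = max(["speeding car", "wrecked car"], key=avg)
print("Main effects pick:", (best_headline, best_image))
# -> ('Volvos are Safe', 'speeding car'), which converts at only 2.0%

# Looking at full combinations finds the true winner:
print("True best recipe: ", max(rates, key=rates.get))
# -> ('Ferraris are Fast', 'speeding car'), which converts at 6.0%
```

The main-effect averages favor “Volvos are Safe” and the speeding-car image separately, yet that pairing is one of the worst recipes on the page.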

A-B Split Test:

Test one variable at a time (with 2 or more values), and send equal traffic to all versions.

- Very simple to implement

- Requires at least 10 conversions/day to get worthwhile results.
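One common way to judge an A/B split is a two-proportion z-test; this is an assumed approach, since the session didn’t prescribe a specific statistic. A sketch with hypothetical counts from a couple of weeks of traffic:

```python
from math import sqrt, erf

# Hypothetical A/B results after two weeks of equal traffic split.
conv_a, n_a = 200, 5000   # baseline: 4.0% conversion
conv_b, n_b = 260, 5000   # variant:  5.2% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
# Pooled rate and standard error under the "no difference" hypothesis
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
# Two-sided p-value via the normal CDF (erf-based, no SciPy needed)
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print(f"z = {z:.2f}, p = {p_value:.4f}")
```

With fewer conversions per day the same relative difference would yield a much smaller z and an inconclusive p-value, which is the practical reason behind the 10-conversions-a-day floor.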


Multivariate/Taguchi Test:

Test several variables at the same time, ignoring interactions.

The scope requires identifying the size of the test in terms of total unique recipes. For example, 12 variables with a total of 38 different values can lead to 552,960 different versions of the page. This type of testing needs more than 50 conversions a day to get valuable results.
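The recipe count is just the product of each variable’s number of values. One hypothetical breakdown of values per variable that reproduces the session’s figures:

```python
from math import prod

# Hypothetical per-variable value counts (the session's actual
# breakdown wasn't given); these 12 counts sum to 38 values and
# multiply out to the 552,960 recipes quoted in the example.
values_per_variable = [2, 2, 2, 2, 3, 3, 3, 4, 4, 4, 4, 5]

print("Variables:    ", len(values_per_variable))   # 12
print("Total values: ", sum(values_per_variable))   # 38
print("Total recipes:", prod(values_per_variable))  # 552960
```

Because recipes multiply while values only add, even modest tests explode combinatorially, which is why full-factorial testing of every recipe is rarely practical.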

Tuning Pitfall #1: Ignoring Your Baseline

- Always devote some bandwidth to your current version (the baseline)

Tuning Pitfall #2: Not Collecting Enough Data

When considering the numbers, remember that some degree of variance is inherent in chance: with a small sample, a measured 90 can be statistically indistinguishable from 100 as much as a third of the time. An inadequate sample size leads to very wide, overlapping error bars. When the sample size is ramped up, the bars get narrow and stop overlapping. Sample size matters.
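The narrowing of error bars with sample size follows directly from the standard error of a proportion. A sketch with a hypothetical 5% true conversion rate, comparing a small and a large sample (a 95% interval is roughly ±2 standard errors):

```python
from math import sqrt

# Standard error of a measured conversion rate at two sample sizes.
# The 5% rate and the sample sizes are hypothetical.
p = 0.05
for n in (200, 20_000):
    se = sqrt(p * (1 - p) / n)           # standard error shrinks as 1/sqrt(n)
    lo, hi = p - 2 * se, p + 2 * se
    print(f"n={n:>6}: {p:.1%} +/- {2*se:.2%}  ({lo:.2%} to {hi:.2%})")
```

A 100x larger sample narrows the interval by a factor of 10 (square-root scaling), which is why “collect more data” is the cure for overlapping error bars.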



Copyright 2006, 2007, 2008 SearchMarketingGurus.com