Naoshi Yamauchi on Testing and Optimization | Triangle AMA Analytics Training Camp

by Stacey Alexander on October 20, 2010

in Social Media

Naoshi Yamauchi is Director of Analytics at Brooks Bell Interactive. You can follow him on Twitter @yamalytics.

“Don’t settle for average”

Why test?

We optimize our daily lives without knowing it.

Examples:

  • Emails: labels, folders.
  • Figure out the most effective ways to commute.
  • Desk space at work. Anything to make it more convenient, efficient and comfortable.
  • Morning routine: Naoshi has four kids – two sets of twins. Analytics lesson: that’s a lot of kids. He breaks his morning routine into phases and has figured out ways to optimize each; for example, he gets the breakfast bowls out the night before.

We often fail to carry that diligence over to our online marketing efforts. The average ratio of money spent on marketing to money spent optimizing it is 92:1.

Scenario: Retail clothing store

If you’ve allocated $5,000 for marketing, that ratio means you’ve spent roughly $4,946 on ads, the people who push your ads, etc., and only about $54 on optimizing your store.
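A quick back-of-the-envelope version of that split, as a sketch; the only inputs from the talk are the 92:1 ratio and the $5,000 budget:

```python
# Apply the 92:1 spend-to-optimization ratio to a $5,000 marketing budget.
budget = 5_000

ad_spend = budget * 92 / 93            # ads, creative, people pushing the ads
optimization_spend = budget * 1 / 93   # testing and optimization

print(f"Ads & promotion: ${ad_spend:,.0f}")            # ≈ $4,946
print(f"Optimization:    ${optimization_spend:,.0f}")  # ≈ $54
```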

Why don’t we test and optimize?

  1. We don’t know how
    1. Best way to learn is to do. Start testing.
    2. If you don’t miss, you’re not pushing the envelope. Be bold.
  2. We’re complacent
    1. 2010 is the year of demand for “A/B testing.”
    2. Your competitors are coming, so get optimizing.
  3. Too much red tape
    1. IT puts the “it” in “sit”. Oftentimes IT becomes the bottleneck for how often companies can test.
    2. Creative resources need to be available to make things you want to test.
    3. Budget needs to allocate a portion to testing.
    4. Takeaway: Outsource to companies that can do these things for you, allocate resources, or try new tools.
  4. We can’t sell the value
    1. This is the most underrated skill of a web analyst or testing group. You have to be able to speak the language.
    2. Be persuasive.
    3. Connect it to the bottom line.

Meat of Testing

5 Ts of Testing:

  1. Time
  2. Traffic
  3. Team: right people in right place
  4. Trust data
  5. Technology

Rate of optimization

Factors that determine your rate of optimization:

  1. Traffic: up
  2. Conversion rate: up
  3. # of tests: down
  4. % lift/depression: up

A/B testing vs. MVT

A/B testing splits traffic between two options, changing one variable at a time. Multivariate testing (MVT) splits traffic between combinations of options, changing multiple variables at once.

Experts prefer to start with A/B testing. It’s holistic, it produces big wins and losses, and it’s a good start to any new experiment.

MVT has more combinations. You can see the synergy between elements (which elements work better together). It produces smaller wins and losses, and it’s a good next step after A/B testing.
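A minimal sketch of the difference in how traffic gets split, assuming Python, a made-up hash-based bucketing function, and illustrative element names (none of which come from the talk):

```python
import hashlib
from itertools import product

def bucket(user_id: str, variants: list[str], salt: str = "exp1") -> str:
    """Deterministically assign a user to one of the variants."""
    digest = hashlib.md5(f"{salt}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# A/B: one variable (the headline), traffic split between two options.
ab_variants = ["headline_A", "headline_B"]
print("A/B bucket:", bucket("user-42", ab_variants))

# MVT: several variables at once; traffic is split across every combination,
# so you can see which elements work together (2 x 2 x 2 = 8 recipes here).
headlines = ["headline_A", "headline_B"]
buttons = ["green_button", "orange_button"]
images = ["lifestyle_photo", "product_photo"]
mvt_recipes = ["|".join(combo) for combo in product(headlines, buttons, images)]
print("MVT recipe:", bucket("user-42", mvt_recipes, salt="exp2"))
```

Because MVT splits the same traffic across many more recipes, each recipe sees fewer visitors, so it generally needs more traffic or more time to reach a conclusion.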

Conversion factors: good > bad

Good

  • Motivation – reason to convert.
  • Relevance – to their search.
  • Value proposition
  • Offer

Bad

  • Anxiety
  • Friction

Technology

The demand for testing is growing rapidly. New tools are coming on the scene all the time to perform testing.

The unfortunate reality is that it takes companies a long time to start testing (64% of companies take 3 months to run their first test). They do little testing, and the tools they invest in are underutilized.

It’s more about people executing the right strategies than about investing in the right tools. When testing, use common sense:

  1. Define a goal first – what do you want to do?
  2. Assess your current situation – what’s happening right now?
  3. Segment – find the right target to go after.
  4. Impact factor – what are the big ones? Start with those.
  5. Figure out what type of test.

Scenario:

If your goal is to pick up ladies—but your current situation is that you have no ladies—pay attention to segmentation. Don’t go to a sports bar during the draft. Go somewhere the ladies are more likely to be. The impact factor is your ride: the van isn’t working. Your A/B test is as follows: A – add accessories to the van; B – get a sports car.

When testing, leave your ego at the door. Results will often surprise you.

If you’re not getting enough traffic to test, there are tools that will project how long it will take you to reach statistical significance.

Don’t run tests for more than a month. Otherwise things like seasonality, which are beyond your control, will become factors.
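As a rough sketch of what those projection tools are doing (the function name, the 3% baseline, the 20% lift, and the 1,000-visitors-a-day figure are all assumptions, not from the talk), using the standard two-proportion sample-size approximation at 95% confidence and 80% power:

```python
import math

def visitors_needed(baseline_rate: float, relative_lift: float) -> int:
    """Approximate visitors needed per variant to detect a relative lift in
    conversion rate at 95% confidence and 80% power (two-proportion z-test)."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha, z_beta = 1.96, 0.84          # 95% confidence (two-sided), 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Example: 3% baseline conversion, hoping to detect a 20% relative lift,
# with 1,000 visitors a day split across two variants.
per_variant = visitors_needed(0.03, 0.20)
days = math.ceil(per_variant * 2 / 1000)
print(f"~{per_variant:,} visitors per variant, roughly {days} days of traffic")
```

If the projection runs well past a month, test a bolder change or a higher-traffic page rather than letting the test drag on.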

Tips

  • Test what you can. Don’t sit on your butt and not do anything.
  • Visual Website Optimizer (VWO) makes it easy.
  • Results sting sometimes…but it’s still fun.
  • The more you test, the more you learn, the better you’re going to get at it.

Best Practices

  1. Above the fold – Put the Call to Action (CTA) above the fold, but there are times when this doesn’t apply. Example: if your end goal is for someone to register an account but your article runs well below the fold, you’ll get a better conversion rate with the CTA at the end of the article.
  2. Big & Bold – Not always better. Just because you think the content is the best doesn’t mean the majority of your users do. If you make it big and bold, it may backfire on you.
  3. Shiny button – Make the buttons almost irresistible. Sometimes it’s better to create text with hyperlinks. The message on the button can be off-putting. Test and see.

Takeaways:

  1. People more than technology.
  2. Learn to sell.
  3. Use common sense when you’re testing. Use offline logic online.
  4. Test and learn. Only way to learn is to do.
  5. Have fun.

Comments

Morgan Siem October 23, 2010 at 9:33 pm

“When testing, leave your ego at the door. Results will often surprise you.” –> SO TRUE! Let the data speak for itself.
