What You're Doing Wrong
In Your CRO Program
Shiva Manjunath, CRO Guru
Hi!
I’m Shiva
I’m a CRO Program Manager at Gartner
I’ve been running CRO programs for 5+ years across B2B and B2C
I make CRO memes to open conversations about experimentation
What I want to talk about
I want to talk to you about not just running experiments, but optimizing your CRO program for maximum efficiency.
Agenda
- Value of Experimentation
- Collaboration
- Test to Learn
- Always Be Testing
- Data Collection
- Politics
Value of Experimentation
What exactly does CRO mean to everyone outside of it?
How do your coworkers see CRO?
I Googled CRO To Figure Out How People Were Defining It
Value of Experimentation
CRO ≠
- It’s not BCO (Button Click Optimization)
- It’s not always ‘Conversion Rate Optimization’
- It’s experimentation to optimize the customer journey
- I call it Customer Experience Optimization
- Yes. I’m a CEO
- It’s risk mitigation
Do your coworkers know this?
They should...
Experiments Are Complicated To Run
- The visual editor also simplifies the experimentation process
- Experimentation involves… a lot
  - Marketing/Brand approval
  - Design
  - Research
  - ...
Collaboration
An isolated CRO program is like buying a Ferrari, only to keep it in the garage.
Collaboration
- The whole is greater than the sum of the parts
- It’s a mutualistic relationship (both benefit)
- You need them just as much as they need you
Collaboration With UX
- UX research gives you valuable data
- Experimentation is data (when you test to learn)
- Tips for Collaboration:
  - Weekly syncs to loop them in on test updates, and they update you on UX projects
  - Use UX research as data for your experiments
  - Validate UX by running their designs through experiments
Collaboration With UX
Data from CRO ↔ Data from UX
Collaboration With Engineers
- Mitigate risk in experiment/code rollout overlaps
- Run more compelling tests client-side
- Tips for Collaboration:
  - Weekly syncs to loop them in on test updates, and they update you on dev projects
  - Identify potential overlaps where code rollouts could break running experiments
  - Identify more creative ways to run more interesting tests based on your testing roadmap
Collaboration With Brand/Marketing
- CRO can verify (and support) marketing efforts
- CRO needs to stay within brand standards to create a unified customer journey
- Tips for Collaboration:
  - Bi-weekly syncs to loop them in on test updates, and they update you on marketing efforts
  - Partner to identify ways to ‘test to learn’ more about your audience and feed that data into their marketing efforts
Test to Learn
Because winning isn’t as important as learning
“I have not failed. I've just found 10,000 ways that won't work.”
—Thomas A. Edison
“Test to learn, not test to win. If you’re testing to learn, you will win far more than if you just test to win.”
—Shiva Manjunath
What Does ‘Test to Learn’ Mean?
- It means never ‘losing’
  - You just paid for a learning
- It means actually winning more
  - Compounding learnings = higher chance of winning
- It means having specific winning concepts to communicate to brand/UX
  - Iterating is far easier because of the learnings from each test
Test to Learn vs Test to Win
Test to Win:
- Hypothesis: We believe a video of our product in use is valuable for our users in the middle of the product page
- We believe the video will be most helpful by putting the video in the middle of the product page
Test to Learn:
- Hypothesis: We believe a video of our product in use is valuable for our users in the middle of the product page
- We don’t even know if the video is engaging to our users - we will put the video above the fold on the product page
Test to Learn vs Test to Win
Test to Learn:
- Hypothesis: We believe a video of our product in use is valuable for our users in the middle of the product page
- We don’t even know if the video is engaging to our users - we will put the video above the fold on the product page
This is better because:
- You will learn very quickly how your audience reacts to the video (good or bad)
- You can iterate from here!
How Do You Start ‘Testing to Learn’?
- Have strong hypotheses focused on ‘learning’ rather than winning
- Don’t be afraid to ‘lose’
- Iterate, iterate, iterate!
- Caveat: That doesn’t mean every test you run is a test to learn
  - It’s a reframe: try to ‘learn’ with every test you can!
Always Be Testing
Having no test downtime ensures maximum winnability
1 in 7 A/B tests is a winning test
That stinks.
Always Be Testing
- If only 1 in 7 tests ‘wins’, it’s a numbers game to hit winners! (see the sketch after this list)
- You must balance quality with quantity of tests
  - Bad: 50,000 button color tests
  - Also Bad: 1 really cool, disruptive landing page test
    - But it took you 6 months to build it.
    - Also, it loses.
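To make the numbers game concrete, here is a minimal sketch (not from the original deck) of how the odds of hitting at least one winner grow with test volume, assuming an independent ~1-in-7 win rate per test:

```python
# A minimal sketch of the "numbers game": if each test independently has a
# ~1-in-7 chance of winning, the odds of landing at least one winner climb
# quickly with the number of tests you manage to run.
# (The 1/7 win rate and independence are simplifying assumptions.)

def chance_of_at_least_one_winner(num_tests: int, win_rate: float = 1 / 7) -> float:
    """P(at least one winner) = 1 - P(every test loses)."""
    return 1 - (1 - win_rate) ** num_tests

for n in (1, 5, 10, 20):
    print(f"{n:>2} tests -> {chance_of_at_least_one_winner(n):.0%} chance of at least one winner")

# Approximate output: 1 -> 14%, 5 -> 54%, 10 -> 79%, 20 -> 95%
```

The jump from roughly 14% at one test to roughly 95% at twenty is the whole argument for keeping the testing pipeline full.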
Balance Quality vs. Quantity of Tests
- Higher-quality, more complex tests will require more time to dev/design
- Simpler tests you can run quicker, and learn quicker
Always Be Testing
- If only 1 in 7 tests ‘wins’, it’s a numbers game to hit winners!
- You must balance quality with quantity of tests
  - Bad: 50,000 button color tests
  - Also Bad: 1 really cool, disruptive landing page test
    - But it took you 6 months to build it.
    - Also, it loses.
- The way to balance them: parallelism
Parallelism
- Parallelism means doing these concurrently:
  - Building tests
  - Running tests
  - Analyzing past tests
- This is bad: one test at a time, strictly in sequence (Building Test → Test Running → Analyzing Results, then the next Building Test → Test Running → Analyzing Results)
Parallelism
- This is parallelism! Overlapping streams: while one test is running, the next is being built and the previous one is being analyzed (Building Test / Test Running / Analyzing Results staggered across several tests at once)
- Compared side by side over the same time window, the staggered timeline fits in double the amount of tests run! (a rough throughput sketch follows)
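As a rough illustration of why the staggered timeline wins, here is a hypothetical sketch; the one-week build, two-week run, and one-week analyze durations are assumptions made for the example, not numbers from the deck:

```python
# Rough throughput sketch: serial vs. pipelined experimentation.
# Durations below are illustrative assumptions (in weeks).
BUILD, RUN, ANALYZE = 1, 2, 1

def tests_completed(weeks: int, pipelined: bool) -> int:
    if pipelined:
        # While one test is running, the next is being built and the previous
        # one is being analyzed, so a test finishes roughly every RUN weeks
        # once the pipeline is full.
        return max(0, (weeks - (BUILD + ANALYZE)) // RUN)
    # Serial: fully build, run, and analyze one test before starting the next.
    return weeks // (BUILD + RUN + ANALYZE)

quarter = 13  # weeks
print("serial:   ", tests_completed(quarter, pipelined=False))  # 3 tests
print("pipelined:", tests_completed(quarter, pipelined=True))   # 5 tests
```

The exact counts depend on your build and runtime assumptions; the point is that overlapping the three phases roughly doubles throughput in the same calendar time.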
Data Collection
You can never have too much data
Let’s do a quick thought exercise
Country     Total Cases   New Cases   Total Deaths   New Deaths
Indonesia   2491          +218        209            +11
Thailand    2220          +51         26             +3
Serbia      2200          +292        58             +7
Finland     2176          +249        27             -1
México      2143          +253        94             +15
UAE         2076          +277        11             +1
Snapshot of Coronavirus Data in 2020
ZOMBIES!!!!! (Finland reporting -1 new deaths - do you actually know how that metric is being counted?)
Defining Your Metrics
- Your interpretation of the data hinges on your understanding of how the metrics are set up
- Do you know how your metrics are defined/set up?
- Are your metrics working the way you intend them to work?
  - Regular audits to ensure they’re still working properly (a quick sketch of one follows this list)
  - Collaboration with engineering
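As one possible shape for those regular audits, here is a minimal sketch; the daily-row structure and field names (sessions, conversions, add_to_carts) are illustrative assumptions, not part of the original deck:

```python
# A minimal metric-audit sketch: flag rows that can't be right, so broken
# tracking gets caught before it quietly feeds an experiment readout.
daily_rows = [
    {"date": "2024-05-01", "sessions": 1200, "conversions": 48, "add_to_carts": 180},
    {"date": "2024-05-02", "sessions": 1150, "conversions": -3, "add_to_carts": 160},  # suspicious
]

def audit(rows):
    issues = []
    for row in rows:
        if row["conversions"] < 0:
            issues.append(f"{row['date']}: negative conversions (zombie data?)")
        if row["conversions"] > row["sessions"]:
            issues.append(f"{row['date']}: more conversions than sessions")
        if row["add_to_carts"] < row["conversions"]:
            issues.append(f"{row['date']}: fewer add-to-carts than purchases - check tagging")
    return issues

for issue in audit(daily_rows):
    print(issue)
```

Running a check like this on a schedule, with engineering reviewing the failures, is the kind of regular audit the bullet above is pointing at.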
Look At The Right Metrics
- CRO ≠ fixation on conversion rate
- Ensure you’re looking at lifetime value metrics
  - e.g. AOV, repeat purchases, etc.
- Microconversions are important!
Look At The Right Metrics - Microconversions
- Microconversions (e.g. visit to product page, add to cart, etc.)
  - Help you define ‘behavior’
- Macroconversions (e.g. purchase complete)
  - Drive $$$
- Microconversions help tell you why something happened
- Define microconversions before test launch! (a sketch of what that can look like follows this list)
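Here is a hypothetical sketch of defining macro- and microconversions up front, before launch; it is not tied to any particular testing tool, and the experiment and event names are made up for illustration:

```python
# Declaring a test's metrics before launch forces the "what will we learn?"
# conversation while the test can still change. Names are illustrative only.
experiment_metrics = {
    "experiment": "pdp_video_above_fold",
    "macro": ["purchase_complete"],   # drives $$$
    "micro": [                        # explains the 'why' behind the macro result
        "video_play",
        "video_watched_50_percent",
        "add_to_cart",
        "proceed_to_checkout",
    ],
}

# Share this with UX and engineering during the build so every event is
# actually firing before traffic hits the test.
print(experiment_metrics["micro"])
```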
Look At The Right Audiences
- Make sure you’re segmenting your audiences - not everyone needs the same exact experience
- Example segments (a segmentation sketch follows this list):
  - Desktop vs. Mobile
  - Source (Bing vs. Google, Paid Search vs. Direct traffic)
  - Current vs. New customers
  - Specific categories
  - Experiment-specific segments
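An illustrative sketch of tagging visitors with analysis segments at exposure time so results can later be cut by segment; the visitor fields and segment names are assumptions, not from the deck:

```python
# Tag each exposed visitor with segments at experiment-exposure time so the
# readout can be broken down later. Field names are illustrative assumptions.
def assign_segments(visitor: dict) -> list[str]:
    segments = []
    segments.append("mobile" if visitor.get("is_mobile") else "desktop")
    segments.append("paid_search" if visitor.get("utm_medium") == "cpc" else "direct_or_organic")
    segments.append("returning_customer" if visitor.get("orders", 0) > 0 else "new_customer")
    return segments

print(assign_segments({"is_mobile": True, "utm_medium": "cpc", "orders": 2}))
# ['mobile', 'paid_search', 'returning_customer']
```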
Supplement Your Experiment Data
- Qualitative data is helpful to supplement quantitative data
  - Heatmaps to support adding a new element = more attention paid to it
  - Running a survey during the experiment
- This is why collaboration with UX is so critical!
Politics
Because collaboration means you have to keep your stakeholders happy!
Playing ‘Nicely’ As The Experimenter
- Hearing all test ideas!
  - Understand the ‘why’
  - No test idea is a bad test idea
- Transparency into the CRO program to foster participation
- Bringing excitement/gamification to CRO
How to Defeat A HiPPO
- HiPPO: Highest Paid Person’s Opinion
- Strategies to vanquish the HiPPO:
  - Loop HiPPOs in early during the test build/design process
    - Is it the results? Or the design?
  - Understand their POV
    - Identify what they’re looking to solve (e.g. a test which lifts AOV when they want to improve CVR)
So What Did We Learn?
Wrap Up
- Value of Experimentation: Make sure you’re communicating the value of experimentation regularly!
- Check Your Data: Your data can be your friend, or your worst enemy. Make sure it’s accurate, and make sure you know exactly what you’re tracking. Track microconversions!
- Test to Learn: Position your tests to put you in the best position to learn. Learnings are your friends on the way to winning. Plus, learnings are valuable outside of the experimentation function.
Does anyone have any questions?
shivamanjunath91@gmail.com
Find me on LinkedIn - Shiva Manjunath!
Thanks!
Credits
Presentation template by Slidesgo
Icons by Flaticon
Images & infographics by Freepik
Author introduction slide photo created by katemangostar - Freepik.com
Big image slide photo created by jcomp - Freepik.com
Text & Image slide photo created by rawpixel.com - Freepik.com
Text & Image slide photo created by Freepik
