How to A/B Test Your Landing Page (Without Going Broke)

7 min read
By PageRekt Team

A/B testing doesn't require enterprise tools or a huge budget. Here's how to test what matters and actually improve conversions.

A/B testing sounds expensive. Optimizely wants $50k/year. VWO wants $30k. You just want to know if changing your headline will work.

Here's how to A/B test without selling a kidney.

What Actually Matters

Most A/B tests are bullshit. Testing button colors is a waste of time when your value proposition sucks.

Test these (in order):

  1. Value proposition - Your headline and subheadline
  2. Call to action - What you're asking people to do
  3. Social proof - Which testimonials convert best
  4. Page length - Long-form vs short-form
  5. Pricing presentation - How you show your price

Button color? Font choice? Emoji vs no emoji? Test these last (or never).

Free A/B Testing Tools

You don't need enterprise software. Use what you have.

Google Optimize (Free)

  • Integrates with Analytics
  • Visual editor (no code needed)
  • Good for basic tests

Vercel/Netlify Edge Functions (Free tier)

  • Split traffic at the edge
  • Zero performance impact
  • Requires coding (sketch below)
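
If you go the edge route on Vercel with Next.js, the split is a few lines of middleware. A minimal sketch, assuming version B of the page lives at /b and you want a 50/50 split (both are assumptions, adjust for your stack):

```typescript
// middleware.ts — split landing-page traffic 50/50 at the edge (Next.js on Vercel).
import { NextRequest, NextResponse } from 'next/server';

export const config = { matcher: '/' }; // only split the landing page

export function middleware(request: NextRequest) {
  // Keep returning visitors in the same bucket via a cookie.
  let bucket = request.cookies.get('ab-bucket')?.value;
  if (bucket !== 'a' && bucket !== 'b') {
    bucket = Math.random() < 0.5 ? 'a' : 'b';
  }

  // Version A is the page as-is; version B is served from /b without changing the URL.
  const response =
    bucket === 'b'
      ? NextResponse.rewrite(new URL('/b', request.url))
      : NextResponse.next();

  response.cookies.set('ab-bucket', bucket, { maxAge: 60 * 60 * 24 * 30 });
  return response;
}
```

Netlify Edge Functions can do the same job, just with their own handler API.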

Plausible/Simple Analytics ($9-19/mo)

  • Event tracking
  • Manual A/B testing
  • Privacy-focused

Just Use URL Parameters (Free)

  • Version A: yoursite.com
  • Version B: yoursite.com?v=b
  • Track conversions manually (example below)
  • Ghetto but works
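
If your analytics tool supports custom events (Plausible does once its script is on the page), a few lines of client-side code can tag each conversion with the variant. A rough sketch, assuming Plausible is installed and your signup button has the id "signup" (both assumptions):

```typescript
// Read the variant from the URL (?v=b) and attach it to the conversion event.
// Assumes the Plausible snippet is already on the page and exposes window.plausible.
declare const plausible: (event: string, options?: { props: Record<string, string> }) => void;

const variant = new URLSearchParams(window.location.search).get('v') === 'b' ? 'B' : 'A';

document.getElementById('signup')?.addEventListener('click', () => {
  plausible('Signup', { props: { variant } }); // break results down by the "variant" prop
});
```

Then compare conversions per variant in your analytics dashboard.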

Start with the free option. Upgrade when you're making money.

How to Actually Run a Test

Step 1: Pick ONE thing to test

Don't change the headline AND the CTA AND the image. You won't know what worked.

Change one thing. Test it. Learn from it.

Step 2: Define success

What are you measuring?

  • Email signups?
  • Demo requests?
  • Purchases?

Pick one metric. Ignore everything else during the test.

Step 3: Calculate sample size

You need enough traffic for statistical significance.

Quick math (per variant, to detect roughly a 20% relative lift at 95% confidence):

  • 1% conversion rate → need ~40,000 visitors
  • 5% conversion rate → need ~8,000 visitors
  • 10% conversion rate → need ~4,000 visitors

Don't have that traffic? Test bigger changes. Small tweaks need big sample sizes.
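
Those round numbers come from the standard two-proportion sample-size formula. If you want to plug in your own baseline rate and lift, here's a back-of-the-envelope sketch (95% confidence and 80% power are assumed defaults, not something the article prescribes):

```typescript
// Rough per-variant sample size for a conversion-rate A/B test.
// Two-proportion approximation: n ≈ (z_alpha + z_beta)^2 * [p1(1-p1) + p2(1-p2)] / (p2 - p1)^2
function sampleSizePerVariant(baselineRate: number, relativeLift: number): number {
  const zAlpha = 1.96; // 95% confidence, two-sided
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p2 - p1) ** 2);
}

console.log(sampleSizePerVariant(0.01, 0.2)); // ≈ 42,600 — the "~40,000" above
console.log(sampleSizePerVariant(0.05, 0.2)); // ≈ 8,100
console.log(sampleSizePerVariant(0.10, 0.2)); // ≈ 3,800
```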

Step 4: Run the test for at least 2 weeks

Traffic varies by day of week. Run Monday through Sunday, twice.

Ending a test on Friday gives you incomplete data.

Step 5: Implement the winner (or don't)

Winner has to be statistically significant (95% confidence minimum).

If results are close? Keep the original. Changing stuff for a 2% lift isn't worth the risk.
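
"Statistically significant" here just means the result clears the 95% bar on a two-proportion z-test. If you're tracking numbers in a spreadsheet, a minimal sketch like this (normal approximation, two-sided) is enough to sanity-check a result:

```typescript
// Two-proportion z-test: is B's conversion rate different from A's at ~95% confidence?
function isSignificant(
  visitorsA: number, conversionsA: number,
  visitorsB: number, conversionsB: number,
): boolean {
  const pA = conversionsA / visitorsA;
  const pB = conversionsB / visitorsB;
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  const z = (pB - pA) / standardError;
  return Math.abs(z) >= 1.96; // two-sided, 95% confidence
}

// Example: 4,000 visitors per variant, 200 vs 250 conversions (5% vs 6.25%).
console.log(isSignificant(4000, 200, 4000, 250)); // true (z ≈ 2.4)
```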

Tests Worth Running

Test 1: Benefit-driven vs Feature-driven Headline

  • Version A: "Project Management Software with Kanban Boards"
  • Version B: "Ship Projects 2x Faster"

The benefit version usually wins. But test it.

Test 2: Social Proof Placement

  • Version A: Testimonials at bottom
  • Version B: Testimonials right after headline

Putting proof near the CTA often boosts conversions 15-30%.

Test 3: CTA Copy

  • Version A: "Get Started"
  • Version B: "Get My Free Roast"

Specific CTAs outperform generic ones. But test your audience.

Test 4: Long vs Short Page

  • Version A: 500-word page
  • Version B: 2000-word page

B2B and expensive products often need long-form. E-commerce might not. Test it.

Common A/B Testing Mistakes

Mistake 1: Stopping tests too early

You got 50 conversions and version B is winning by 10%. Stop the test?

No. Run it for 2 full weeks minimum. Early wins often disappear.

Mistake 2: Testing too many things

5 tests running simultaneously. You're splitting traffic so much that none reach significance.

Run 1-2 tests at a time. Get clear results.

Mistake 3: Not having a hypothesis

"Let's try a blue button" isn't a hypothesis.

"Blue buttons will convert better because they contrast with our green brand" is a hypothesis.

Know WHY you're testing something.

Mistake 4: Ignoring mobile

70% of your traffic is mobile. You're testing the desktop version.

Test mobile separately or use mobile traffic only.

The Ghetto A/B Test Method

No tools. No budget. Still want to test?

  • Week 1: Run version A. Track conversions.
  • Week 2: Run version B. Track conversions.

Compare results. Account for traffic differences.
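
"Account for traffic differences" means comparing conversion rates, not raw conversion counts, because the two weeks will almost never get identical traffic. A tiny sketch with hypothetical numbers:

```typescript
// Compare conversion rates, not raw counts — week B got less traffic but converted better.
const weekA = { visitors: 1200, conversions: 54 }; // hypothetical numbers
const weekB = { visitors: 950, conversions: 52 };

const rateA = (weekA.conversions / weekA.visitors) * 100; // 4.5%
const rateB = (weekB.conversions / weekB.visitors) * 100; // ~5.5%

console.log(`Week A: ${rateA.toFixed(1)}%  Week B: ${rateB.toFixed(1)}%`);
// Raw counts say A "won" (54 vs 52); the rates say B converted better.
```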

It's not perfect. But it's better than guessing.

When to Stop Testing

You've run 10 tests. 8 showed no significant difference. 2 made things worse.

Maybe your page is already good enough. Maybe testing isn't your problem.

Focus on traffic instead. A good landing page with no traffic = $0.

The Reality

Most landing pages don't need A/B testing. They need basic fixes.

If your page:

  • Has no clear value proposition
  • Loads in 8 seconds
  • Looks like shit on mobile
  • Has zero social proof

Fix those first. Then test.

Want to know what's broken before you start testing? Get roasted by PageRekt. Our 6 personas will tell you exactly what to fix first.

Try it free, or get the full Deep Dive roast for $26.99 (one-time) or $29.99/month unlimited.

Tags

ab testing, landing page testing, conversion optimization, split testing, cro testing

Ready to roast your landing page?

Get brutally honest feedback from 6 AI bear personas. Find out what's broken, what's working, and how to convert better.