BotRender

Making SPAs search engine friendly, one render at a time.

© 2025 BotRender. All rights reserved.

    BotRender Best Practices

    Optimize your BotRender integration for maximum SEO performance, faster indexing, and better search engine visibility.

    Pro Tip: Following these best practices can improve your SEO scores by up to 40% and reduce page indexing time by 60%. Start with status codes and page readiness for immediate impact.
    Choose the Right Rendering Strategy
    Prefer worker-first caching for crawlers to avoid slow real-time rendering and ranking risks; enable real-time only when you accept latency trade-offs.
    Performance

    Worker-First (Recommended)

    Serve cached HTML to bots; on cache miss, return 404 quickly and queue a render. Optionally fall back to SPA so first crawl still gets content.

    • Consistent sub-200ms responses to crawlers
    • Avoids crawler timeouts and crawl-budget penalties
    • Stable under concurrency spikes and third-party latency
    • Use recache/pre-warm APIs for fresh content
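    The worker-first flow can be sketched as a small handler. The in-memory Map and queue array below are stand-ins for a real HTML cache and render queue, and names like handleBotRequest are illustrative, not BotRender APIs:

    ```javascript
    // Stand-ins for your edge HTML cache and background render queue
    const htmlCache = new Map();
    const renderQueue = [];

    function handleBotRequest(url) {
      const cached = htmlCache.get(url);
      if (cached) {
        // Cache hit: serve prerendered HTML well under 200ms
        return { status: 200, body: cached };
      }
      // Cache miss: answer fast and queue a render for the next crawl
      renderQueue.push(url);
      return { status: 404, body: '' }; // or fall back to the SPA shell
    }
    ```

    The key design choice is that a miss never blocks the crawler; the render happens out of band and the next visit is a hit.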

    Real-Time Rendering (Optional)

    Synchronously render on bot requests. Higher latency and risk of timeouts; use with caution.

    • Expect seconds of latency during cold starts/heavy pages
    • Burst bot traffic can overload origin/middleware
    • Restrict to a few critical routes and set strict timeouts
    • Prefer pre-warming over synchronous rendering when possible
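    The "strict timeouts" bullet can be implemented with a simple race. The renderFn parameter, the fallback shell, and the 2-second budget below are illustrative assumptions, not BotRender defaults:

    ```javascript
    // Reject a render promise that exceeds its time budget
    function withTimeout(promise, ms) {
      let timer;
      const timeout = new Promise((_, reject) => {
        timer = setTimeout(() => reject(new Error('render timed out')), ms);
      });
      return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
    }

    async function renderOrFallback(renderFn, url, budgetMs = 2000) {
      try {
        return await withTimeout(renderFn(url), budgetMs);
      } catch {
        // Timed out or failed: serve the SPA shell instead of stalling the crawler
        return '<!-- SPA fallback -->';
      }
    }
    ```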
    Handle Status Codes Properly
    Ensure search engines receive the correct HTTP status codes for different page states
    SEO

    404 Not Found Pages

    Return proper 404 status for missing pages

    <meta name="prerender-status-code" content="404">

    301 Redirects

    Handle permanent redirects correctly

    <meta name="prerender-status-code" content="301">
    <meta name="prerender-header" content="Location: https://new-url.com">

    Dynamic Status Codes

    Set status codes programmatically based on content

    // JavaScript example: append the meta tag without re-parsing <head>
    // (innerHTML += would re-parse the head and drop attached listeners)
    if (pageNotFound) {
      const meta = document.createElement('meta');
      meta.setAttribute('name', 'prerender-status-code');
      meta.setAttribute('content', '404');
      document.head.appendChild(meta);
    }
    Control Page Rendering Timing
    Ensure all dynamic content loads before BotRender captures the page
    Performance

    Prerender Ready Flag

    Control when BotRender considers the page fully loaded

    <script>
      // Initially set to false
      window.prerenderReady = false;
      
      // Set to true when all content is loaded
      Promise.all([
        loadCriticalData(),
        loadAsyncComponents()
      ]).then(() => {
        window.prerenderReady = true;
      });
    </script>
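    One hardening step worth adding to the pattern above: if a loader hangs, prerenderReady stays false and the render times out. The sketch below fails open after a hard deadline; the function name, deadline, and injectable target (defaulting to window in the browser) are illustrative assumptions:

    ```javascript
    // Flip prerenderReady when all loaders settle, or after a deadline
    function signalWhenReady(loaders, deadlineMs, target = window) {
      target.prerenderReady = false;
      const timer = setTimeout(() => { target.prerenderReady = true; }, deadlineMs);
      const done = () => {
        clearTimeout(timer);
        // Fail open: an incomplete snapshot beats a render that never completes
        target.prerenderReady = true;
      };
      return Promise.all(loaders).then(done, done);
    }
    ```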

    Wait for API Calls

    Delay rendering until critical API requests complete

    async function waitForData() {
      try {
        const response = await fetch('/api/critical-data');
        if (!response.ok) throw new Error(`HTTP ${response.status}`);

        // Render content with data
        renderComponent(await response.json());
      } finally {
        // Signal ready even if the request fails, so the render never hangs
        window.prerenderReady = true;
      }
    }
    Optimize Caching Strategy
    Use smart caching to improve performance and reduce costs
    Performance

    Strategic Cache Expiration

    Set appropriate cache times based on content update frequency

    • Static content: 24-48 hours
    • Dynamic content: 1-6 hours
    • Real-time content: 15-30 minutes
    • Emergency updates: Use API recache
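    The tiers above can be encoded as a small lookup table. The route patterns below are hypothetical; map them to your own URL structure:

    ```javascript
    // Map URL patterns to cache TTLs following the tiers above
    const CACHE_TTL_HOURS = [
      { pattern: /^\/docs\//,      ttl: 48 },  // static content
      { pattern: /^\/blog\//,      ttl: 6 },   // dynamic content
      { pattern: /^\/inventory\//, ttl: 0.5 }, // near-real-time content
    ];

    function ttlFor(path) {
      const rule = CACHE_TTL_HOURS.find(({ pattern }) => pattern.test(path));
      return rule ? rule.ttl : 24; // conservative default
    }
    ```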

    API-Based Recaching

    Trigger recaching when content changes

    // Recache when content updates
    const recacheUrl = async (url) => {
      await fetch('https://api.botrender.com/recache', {
        method: 'POST',
        headers: {
          'Authorization': 'Bearer YOUR_API_TOKEN',
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({ url })
      });
    };
    
    // Trigger after content update
    await updateArticle(articleId);
    await recacheUrl(`/articles/${articleId}`);
    Comprehensive Bot Detection
    Ensure all relevant crawlers and AI platforms are detected
    SEO

    Complete User Agent List

    Include all major search engines, social platforms, and AI crawlers

    • Search engines: Google, Bing, Yahoo, DuckDuckGo, Baidu, Yandex
    • Social platforms: Facebook, Twitter, LinkedIn, Pinterest
    • AI platforms: ChatGPT, Claude, Perplexity, Bard
    • SEO tools: Ahrefs, SEMrush, Moz

    Future-Proof Detection

    Use patterns that catch new bot variants

    const isBot = (userAgent) => {
      const botPattern = /(googlebot|bingbot|slurp|duckduckbot|baiduspider|yandexbot|facebookexternalhit|twitterbot|linkedinbot|pinterestbot|chatgpt-user|gptbot|claude-web|perplexitybot|ahrefsbot|semrushbot)/i;
      return botPattern.test(userAgent);
    };
    Performance Best Practices
    Optimize your pages for faster rendering and better SEO scores
    Performance

    Critical Resource Loading

    Prioritize loading of essential content

    • Load above-the-fold content first
    • Defer non-critical JavaScript
    • Optimize images with lazy loading
    • Minimize render-blocking resources
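    In markup terms, the first two bullets often come down to two attributes. The paths below are placeholders:

    ```html
    <!-- Defer non-critical JavaScript so it doesn't block first paint -->
    <script src="/js/analytics.js" defer></script>

    <!-- Lazy-load below-the-fold images -->
    <img src="/img/footer-banner.jpg" loading="lazy" alt="Footer banner">
    ```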

    JavaScript Optimization

    Reduce JavaScript execution time for faster rendering

    // Optimize heavy computations
    const optimizeRendering = () => {
      // Use requestIdleCallback for non-critical work
      requestIdleCallback(() => {
        performHeavyCalculations();
      });
      
      // Critical rendering path
      renderEssentialContent();
      window.prerenderReady = true;
    };
    Handle Cookie Banners Smartly
    Manage cookie consent for better crawler experience
    SEO

    Disable for Crawlers

    Hide cookie banners from search engine bots

    // Detect if request is from a crawler
    const isCrawler = () => {
      const userAgent = navigator.userAgent || '';
      return /bot|crawler|spider|crawling/i.test(userAgent);
    };
    
    // Only show cookie banner to real users
    if (!isCrawler()) {
      showCookieBanner();
    }

    Server-Side Detection

    Handle cookie banners at the server level

    // Express.js example
    app.use((req, res, next) => {
      // Guard against a missing User-Agent header
      const isBot = /bot|crawler|spider/i.test(req.get('User-Agent') || '');
      res.locals.showCookieBanner = !isBot;
      next();
    });
    Optimize Structured Data
    Ensure schema markup is properly rendered for search engines
    SEO

    Dynamic Schema Generation

    Generate schema markup based on page content

    const generateSchema = (pageData) => {
      const schema = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": pageData.title,
        "description": pageData.description,
        "author": pageData.author,
        "datePublished": pageData.publishDate
      };
      
      const script = document.createElement('script');
      script.type = 'application/ld+json';
      script.textContent = JSON.stringify(schema);
      document.head.appendChild(script);
    };

    Schema Validation

    Ensure schema markup is valid before rendering

    • Test with Google's Rich Results Test
    • Validate JSON-LD syntax
    • Include all required properties
    • Use specific schema types when available
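    A lightweight pre-flight check before injecting JSON-LD can catch obvious omissions. The required-field list below is an assumption for the Article type; Google's Rich Results Test remains the authority:

    ```javascript
    // Reject obviously incomplete Article schema before injecting it
    function isPlausibleArticleSchema(schema) {
      const required = ['@context', '@type', 'headline', 'datePublished'];
      return schema !== null
        && typeof schema === 'object'
        && required.every((key) => Boolean(schema[key]));
    }
    ```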
    Expected Performance Improvements
    Following these best practices typically results in measurable improvements:

    • 40% better SEO scores (Core Web Vitals improvement)
    • 60% faster indexing (reduced crawl time)
    • 25% lower costs (optimized caching strategy)
    Implementation Checklist
    Follow this checklist to ensure you've implemented all best practices

    ✅ Essential Setup

    • Configure proper status codes for 404s and redirects
    • Implement window.prerenderReady for dynamic content
    • Set up comprehensive bot detection patterns
    • Optimize cache expiration times for your content

    🚀 Advanced Optimization

    • Hide cookie banners from search engine crawlers
    • Implement API-based recaching for content updates
    • Generate dynamic structured data/schema markup
    • Monitor and optimize JavaScript execution time

    Ready to Optimize Your Integration?

    Start implementing these best practices today and see immediate improvements in your SEO performance and search engine visibility.

    • View Integration Examples
    • Troubleshooting Guide