Cloudflare

Deploy Hyperterse on Cloudflare using Workers (serverless functions), or use Cloudflare as a CDN/proxy in front of your Hyperterse deployment. Because Hyperterse is a single executable binary, the typical pattern is to run it on an origin server and let Cloudflare’s edge platform handle routing, caching, and security in front of it.

Option                     | Best For
---------------------------|--------------------------------------------------
Cloudflare Pages Functions | Full-stack apps with Hyperterse
Cloudflare Workers         | Serverless API endpoints
Cloudflare Tunnel          | Secure proxy to an external Hyperterse deployment
DNS Proxy                  | CDN/proxy in front of Hyperterse

Option 1: Cloudflare Pages Functions (Recommended)

Deploy Hyperterse using Cloudflare Pages Functions, which provides a full-stack deployment with edge computing.

  1. Export your Hyperterse bundle

    Terminal window
    hyperterse export -f config.terse -o dist
  2. Create a Pages Function

    Create a function that wraps Hyperterse:

    functions/api/[[path]].js
    export async function onRequest(context) {
      const { request, env } = context
      const url = new URL(request.url)
      // Pages Functions cannot execute the Hyperterse binary that the exported
      // script embeds (base64-encoded), so proxy requests to an external
      // Hyperterse instance instead (set HYPERTERSE_URL accordingly), or use
      // Cloudflare Workers (see Option 2).
      const hyperterseUrl = env.HYPERTERSE_URL || 'http://localhost:8080'
      // Forward the request, preserving the path, query string, method, headers, and body
      const response = await fetch(`${hyperterseUrl}${url.pathname}${url.search}`, {
        method: request.method,
        headers: request.headers,
        body: request.body,
      })
      return new Response(response.body, {
        status: response.status,
        headers: response.headers,
      })
    }
  3. Configure environment variables

    In wrangler.toml or Cloudflare dashboard:

    wrangler.toml
    [env.production.vars]
    # External Hyperterse instance the Pages Function proxies to
    HYPERTERSE_URL = "https://hyperterse.example.com"
    DATABASE_URL = "postgresql://user:pass@host:5432/db"
  4. Deploy

    Terminal window
    wrangler pages deploy dist --project-name=hyperterse
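
Once the deployment finishes, you can sanity-check the function with a request to its route. The hyperterse.pages.dev hostname below is a placeholder for your actual Pages URL, and the path is simply forwarded as-is to HYPERTERSE_URL:

Terminal window
curl https://hyperterse.pages.dev/api/docs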

Option 2: Cloudflare Workers

Deploy Hyperterse as a Cloudflare Worker for edge computing:

  1. Create a Worker

    worker.js
    export default {
      async fetch(request, env) {
        // Workers cannot run the Hyperterse binary directly; it would need to be
        // compiled to WebAssembly (Workers support WASM). Until then, proxy requests
        // to an external Hyperterse instance. See the wrangler.toml sketch after
        // these steps for the HYPERTERSE_URL binding.
        const url = new URL(request.url)
        // Forward the request, preserving the path and query string
        return fetch(`${env.HYPERTERSE_URL}${url.pathname}${url.search}`, {
          method: request.method,
          headers: request.headers,
          body: request.body,
        })
      },
    }
  2. Deploy

    Terminal window
    wrangler deploy
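
The Worker above reads env.HYPERTERSE_URL, so it needs a variable bound in its configuration. A minimal wrangler.toml sketch, where the project name, compatibility date, and URL are placeholders to adjust for your account:

wrangler.toml
name = "hyperterse-proxy"
main = "worker.js"
compatibility_date = "2024-01-01"

[vars]
HYPERTERSE_URL = "https://hyperterse.example.com"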

Option 3: Cloudflare Tunnel

Connect your external Hyperterse deployment securely using Cloudflare Tunnel. This is ideal when you want to deploy Hyperterse on another platform but still use Cloudflare’s edge network:

┌──────────┐      ┌──────────────┐      ┌──────────────┐      ┌──────────┐
│  Client  │─────▶│  Cloudflare  │─────▶│  Hyperterse  │─────▶│ Database │
│          │      │    Tunnel    │      │  (External)  │      │          │
└──────────┘      └──────────────┘      └──────────────┘      └──────────┘

Cloudflare Tunnel provides secure connectivity without exposing your origin server to the internet. This is the recommended approach for production deployments.

  1. Deploy Hyperterse on your origin server

    First, deploy Hyperterse on your preferred platform (Railway, AWS, DigitalOcean, etc.). Make note of the internal URL or IP address where it’s running.

  2. Install cloudflared

    Install the Cloudflare Tunnel daemon (cloudflared) on your origin server:

    Terminal window
    # macOS
    brew install cloudflare/cloudflare/cloudflared
    # Linux
    wget https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64
    chmod +x cloudflared-linux-amd64
    sudo mv cloudflared-linux-amd64 /usr/local/bin/cloudflared
  3. Authenticate cloudflared

    Log in to Cloudflare to authenticate the tunnel:

    Terminal window
    cloudflared tunnel login

    This opens a browser window to authorize the tunnel.

  4. Create a tunnel

    Create a new tunnel for your Hyperterse deployment:

    Terminal window
    cloudflared tunnel create hyperterse

    Note the tunnel ID that’s displayed.

  5. Configure the tunnel

    Create a configuration file for your tunnel:

    ~/.cloudflared/config.yml
    tunnel: <tunnel-id>
    credentials-file: /path/to/<tunnel-id>.json

    ingress:
      # Route traffic for your domain to Hyperterse
      - hostname: api.example.com
        service: http://localhost:8080
      # Catch-all rule (must be last)
      - service: http_status:404

    Replace <tunnel-id> with your actual tunnel ID, and api.example.com with your domain.

  6. Run the tunnel

    Start the tunnel daemon:

    Terminal window
    cloudflared tunnel run hyperterse

    For production, run it as a systemd service:

    /etc/systemd/system/cloudflared.service
    [Unit]
    Description=Cloudflare Tunnel
    After=network.target
    [Service]
    Type=simple
    User=cloudflare
    ExecStart=/usr/local/bin/cloudflared tunnel run hyperterse
    Restart=always
    RestartSec=5
    [Install]
    WantedBy=multi-user.target
  7. Configure DNS

    In the Cloudflare dashboard:

    • Go to DNS settings for your domain
    • Create a CNAME record named api (or your chosen subdomain) pointing to <tunnel-id>.cfargotunnel.com
    • Ensure proxy is enabled (orange cloud icon)
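
Before starting the tunnel as a service, it can help to validate the ingress rules and then enable the systemd unit from step 6. The commands below assume a systemd-based Linux host; the DNS record from step 7 can optionally be created from the CLI instead of the dashboard:

Terminal window
# Validate the ingress rules in ~/.cloudflared/config.yml
cloudflared tunnel ingress validate
# Optionally create the DNS record from step 7 via the CLI
cloudflared tunnel route dns hyperterse api.example.com
# Reload systemd and start the tunnel service
sudo systemctl daemon-reload
sudo systemctl enable --now cloudflared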

Option 4: DNS Proxy

If your Hyperterse server is already publicly accessible, use Cloudflare as a CDN/proxy in front of it:

┌──────────┐      ┌──────────────┐      ┌──────────────┐      ┌──────────┐
│  Client  │─────▶│  Cloudflare  │─────▶│  Hyperterse  │─────▶│ Database │
│          │      │ (Proxy/CDN)  │      │   (Origin)   │      │          │
└──────────┘      └──────────────┘      └──────────────┘      └──────────┘
  1. Add DNS record

    In Cloudflare dashboard:

    • Go to DNS settings
    • Add an A record (if you have an IP) or CNAME record pointing to your Hyperterse server
    • Enable proxy (orange cloud icon)
  2. Configure SSL/TLS

    • Go to SSL/TLS settings
    • Set encryption mode to “Full” or “Full (strict)”
    • Cloudflare will automatically provision SSL certificates
  3. Verify connectivity

    Test that requests are proxied correctly:

    Terminal window
    curl https://api.example.com/docs
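
To confirm that requests are actually going through Cloudflare rather than hitting the origin directly, check for the cf-ray response header that Cloudflare adds to proxied responses:

Terminal window
curl -sI https://api.example.com/docs | grep -i cf-ray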

Caching

Configure Cloudflare to cache read-heavy endpoints such as /llms.txt and /docs:

  1. Create a Page Rule or Transform Rule

    In Cloudflare dashboard, go to Rules → Page Rules or Transform Rules.

  2. Cache GET requests to documentation

    Create a rule:

    • URL pattern: *api.example.com/docs* or *api.example.com/llms.txt
    • Setting: Cache Level → Cache Everything
    • Edge Cache TTL: 1 hour
  3. Don’t cache POST requests

    Ensure POST requests to /query/* are not cached:

    • URL pattern: *api.example.com/query/*
    • Setting: Cache Level → Bypass
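
To verify the caching rules, inspect the cf-cache-status header that Cloudflare adds to responses: cached documentation routes should report HIT (or MISS on the first request), while query routes should report BYPASS or DYNAMIC. The /query/users path below is a placeholder for one of your query endpoints:

Terminal window
curl -sI https://api.example.com/llms.txt | grep -i cf-cache-status
curl -sI -X POST https://api.example.com/query/users | grep -i cf-cache-status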

Rate Limiting

Protect your API from abuse with Cloudflare rate limiting:

  1. Go to Security → WAF → Rate limiting rules

  2. Create a rate limit rule

    • Rule name: “Hyperterse API Rate Limit”
    • Match: http.request.uri.path contains "/query/"
    • Rate: 100 requests per minute
    • Action: Block or Challenge
  3. Apply the rule

    Save and deploy the rule. Cloudflare will now limit requests to your query endpoints.
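
One way to confirm the rule is active is to send a burst of requests above the limit and tally the status codes; with a 100 requests/minute limit and the Block action, roughly the last 20 of 120 rapid requests should come back as 429. The /query/users path below is again a placeholder for one of your query endpoints:

Terminal window
for i in $(seq 1 120); do
  curl -s -o /dev/null -w "%{http_code}\n" -X POST https://api.example.com/query/users
done | sort | uniq -c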

Security

Enable Cloudflare’s security features for additional protection:

  1. Web Application Firewall (WAF)

    • Go to Security → WAF
    • Enable managed rulesets
    • Configure custom rules if needed
  2. Bot management

    • Go to Security → Bots
    • Enable Bot Fight Mode (free) or Bot Management (paid)
    • Configure bot score thresholds
  3. DDoS protection

    • DDoS protection is automatically enabled
    • Configure additional settings in Security → DDoS
  4. SSL/TLS settings

    • Go to SSL/TLS
    • Set encryption mode to “Full (strict)” for maximum security
    • Enable “Always Use HTTPS”
    • Enable “Minimum TLS Version” → TLS 1.2
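
If you prefer to script these settings, the same toggles are exposed through the Cloudflare zone settings API. A rough sketch, assuming $ZONE_ID and $API_TOKEN hold your zone ID and an API token with zone settings permissions:

Terminal window
# Enable "Always Use HTTPS"
curl -X PATCH "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/settings/always_use_https" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  --data '{"value":"on"}'
# Require TLS 1.2 or newer
curl -X PATCH "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/settings/min_tls_version" \
  -H "Authorization: Bearer $API_TOKEN" \
  -H "Content-Type: application/json" \
  --data '{"value":"1.2"}'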

Performance

Optimize Cloudflare settings for better performance:

  • Auto Minify: Enable HTML, CSS, and JavaScript minification
  • Brotli compression: Enable Brotli for better compression ratios
  • HTTP/2: Automatically enabled
  • HTTP/3 (QUIC): Enable for faster connections

Beyond these settings, putting Cloudflare in front of Hyperterse brings:

  • Global edge network: Low-latency responses from 300+ locations worldwide
  • DDoS protection: Automatic protection against attacks
  • Free SSL: Automatic SSL certificates for all domains
  • Caching: Edge caching for read-heavy endpoints
  • Workers: Serverless edge computing for dynamic requests
  • Cost-effective: Free tier available, pay-as-you-scale

Troubleshooting

502 Bad Gateway: Your origin server may be down or unreachable. Check that Hyperterse is running and accessible.

SSL errors: Ensure your origin server has a valid SSL certificate if using “Full (strict)” mode, or use Cloudflare Tunnel.

Caching issues: Clear Cloudflare cache in the dashboard (Caching → Purge Cache) if you see stale content.

Tunnel won’t connect: Verify the tunnel credentials and configuration, and check the tunnel’s status with cloudflared tunnel info hyperterse.

Worker execution limits: Cloudflare Workers have execution time limits. For long-running queries, consider using Cloudflare Tunnel to an external server or optimizing your queries.