Mudskipper bb9dcd1797 perf: reduce web requests with caching, bot control, and deferred JS
Add robots.txt to block aggressive AI scrapers (GPTBot, Bytespider,
SemrushBot, etc.) and set crawl delays for legitimate search engines.
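A minimal robots.txt along these lines (bot names taken from the message; the delay values and the choice of Bingbot are illustrative — note that Googlebot ignores Crawl-delay, so its rate is managed in Search Console instead):

```
# Block aggressive AI/SEO scrapers outright
User-agent: GPTBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: SemrushBot
Disallow: /

# Rate-limit crawlers that honor Crawl-delay (e.g. Bing)
User-agent: Bingbot
Crawl-delay: 5

# Everyone else: allowed, but politely throttled
User-agent: *
Crawl-delay: 5
```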

Add Cache-Control headers for HTML pages (1h TTL with
stale-while-revalidate) so repeat visits are served from Netlify's
CDN edge cache instead of origin.
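On Netlify this is typically done in netlify.toml (or an equivalent `_headers` file). A sketch matching the TTL described, with the URL pattern and stale-while-revalidate window as assumptions:

```toml
# Cache HTML at the CDN edge for 1h; serve stale for up to a day
# while revalidating in the background. The "/*" pattern and the
# 86400s stale window are illustrative choices.
[[headers]]
  for = "/*"
  [headers.values]
    Cache-Control = "public, max-age=3600, stale-while-revalidate=86400"
```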

Defer below-fold video components from client:load to client:visible
so their JS chunks load only when scrolled into view; bots and users
who don't scroll never trigger those requests.
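In an Astro template the change is a one-word directive swap (the component name here is hypothetical):

```astro
<!-- Before: hydrates and fetches its JS chunk on page load -->
<VideoPlayer client:load />

<!-- After: hydration and the chunk request are deferred until
     the component scrolls into the viewport -->
<VideoPlayer client:visible />
```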

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-03 11:44:54 +11:00