mirror of
https://github.com/audacity/audacity.github.io.git
synced 2026-04-13 21:29:15 -05:00
Add robots.txt to block aggressive AI scrapers (GPTBot, Bytespider, SemrushBot, etc.) and set crawl delays for legitimate search engines.

Add Cache-Control headers for HTML pages (1h TTL with stale-while-revalidate) so repeat visits are served from Netlify's CDN edge cache instead of origin.

Defer below-fold video components from client:load to client:visible so their JS chunks only load when scrolled into view; bots and users who don't scroll never trigger those requests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
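The robots.txt the commit adds is not shown on this page. A minimal sketch of the pattern the commit message describes, using only the bot names it lists; the crawl-delay value and the Bingbot entry are illustrative assumptions, not the repository's actual file:

```
# Block aggressive AI / SEO scrapers (names from the commit message)
User-agent: GPTBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: SemrushBot
Disallow: /

# Throttle remaining crawlers (delay value is illustrative)
User-agent: Bingbot
Crawl-delay: 10

User-agent: *
Crawl-delay: 10
```

Note that Googlebot ignores the Crawl-delay directive; Google's crawl rate is managed through Search Console instead.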
26 lines
561 B
Plaintext
# HTML pages — 1 hour cache, serve stale while revalidating
/*
  Cache-Control: public, max-age=3600, stale-while-revalidate=86400

# Hashed build assets — immutable forever
/_astro/*
  Cache-Control: public, max-age=31536000, immutable

/fonts/*
  Cache-Control: public, max-age=31536000, immutable

/favicon.ico
  Cache-Control: public, max-age=604800

/favicon.svg
  Cache-Control: public, max-age=604800

/*.svg
  Cache-Control: public, max-age=86400

/*.pdf
  Cache-Control: public, max-age=604800

/sitemap-*.xml
  Cache-Control: public, max-age=3600