Commit
Merge pull request #66 from open-sdg/robots-txt
Robots.txt files to disallow crawlers
brockfanning authored Jul 20, 2023
2 parents 912cf47 + 436fbf1 commit 20bd3b2
Showing 4 changed files with 10 additions and 0 deletions.
3 changes: 3 additions & 0 deletions .github/workflows/deploy-to-production.yml
@@ -49,6 +49,9 @@ jobs:
       - name: Build data
         run: |
           python scripts/build_data.py
+      - name: Place public files
+        run: |
+          cp public/robots-prod.txt _site/robots.txt
       - name: Deploy to GitHub Pages
         uses: JamesIves/[email protected]
         with:
3 changes: 3 additions & 0 deletions .github/workflows/deploy-to-staging.yml
@@ -43,6 +43,9 @@ jobs:
       - name: Build data
         run: |
           python scripts/build_data.py
+      - name: Place public files
+        run: |
+          cp public/robots-staging.txt _site/robots.txt
       - name: Deploy to GitHub Pages
         uses: JamesIves/[email protected]
         with:
2 changes: 2 additions & 0 deletions public/robots-prod.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
2 changes: 2 additions & 0 deletions public/robots-staging.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
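
The two new files are identical: a wildcard `User-agent` with `Disallow: /` blocks all paths for all compliant crawlers. As a sanity check, this behavior can be verified with Python's standard-library `urllib.robotparser` (the `example.com` host below is a placeholder, not part of the commit):

```python
from urllib import robotparser

# The robots.txt content added by this commit (same for prod and staging)
robots_txt = """User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every path is disallowed for every user agent
print(parser.can_fetch("Googlebot", "https://example.com/"))        # False
print(parser.can_fetch("*", "https://example.com/some/page.html"))  # False
```

Since the workflows copy the file into `_site/` after the data build but before the deploy step, the published site serves it at `/robots.txt`, which is the only location crawlers consult.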
