Robots.txt Setup for Angular, Vue, Django, and PHP
Every framework handles static files differently. Angular builds to a dist folder. Django has a static file pipeline. PHP apps just serve from the document root. The correct way to add robots.txt depends entirely on which framework you're using and how it serves files. Here's the setup for each major one.
Angular: Put It in the src/ Directory
In an Angular project, add robots.txt to the src/ directory. Then update your angular.json to include it in the build output:
"assets": [ "src/favicon.ico", "src/assets", "src/robots.txt" ]
When you run ng build, Angular copies the file into the build output (dist/, or a project subfolder such as dist/my-app depending on your CLI version), and it gets served at /robots.txt by your web server.
If you're using Angular Universal (server-side rendering), the same approach works — the file is served as a static asset from the dist/browser directory, which your Express server handles before Angular routes.
Verify by running the dev server (ng serve) and visiting localhost:4200/robots.txt. If it loads, the config is correct.
Vue.js (Vite and Vue CLI): Static Assets Folder
For Vue projects using Vite (Vue 3 default), add robots.txt to the public/ directory at the project root. Vite copies everything in public to the build output without processing it. No config changes needed.
```
my-vue-app/
  public/
    robots.txt   <- Add it here
  src/
  vite.config.js
```
For Vue CLI projects (Vue 2 / older Vue 3 setups), the same applies: public/robots.txt gets copied to the output directory during build.
For Nuxt.js specifically (the Vue equivalent of Next.js), the setup mirrors Next.js Pages Router: put robots.txt in the static/ directory (Nuxt 2) or public/ directory (Nuxt 3). Nuxt 3 also supports a server/routes/robots.txt.ts server route for dynamic generation.
Django: Two Approaches (URL Route or Static File)
Approach 1: URL route (recommended)
Define a view that returns the robots.txt content and wire it to the /robots.txt URL:
```python
# views.py
from django.http import HttpResponse

def robots_txt(request):
    content = """User-agent: *
Disallow: /admin/
Disallow: /account/
Sitemap: https://yoursite.com/sitemap.xml"""
    return HttpResponse(content, content_type="text/plain")
```
```python
# urls.py
from django.urls import path
from .views import robots_txt

urlpatterns = [
    path("robots.txt", robots_txt, name="robots_txt"),
    # ... other URLs
]
```
This gives you dynamic generation and easy editing. You can pull settings from your Django config (like SITE_URL) rather than hardcoding the domain.
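To see what pulling from settings looks like in practice, here is a minimal sketch. The helper name build_robots_txt and the SITE_URL setting are illustrative assumptions, not Django APIs; keeping the string-building in a pure function makes it easy to unit-test without a Django test client.

```python
# A pure helper that assembles the robots.txt body; no Django imports,
# so it can be tested in isolation.
def build_robots_txt(site_url="", disallowed=("/admin/", "/account/")):
    lines = ["User-agent: *"]
    lines += [f"Disallow: {p}" for p in disallowed]
    if site_url:
        lines.append(f"Sitemap: {site_url.rstrip('/')}/sitemap.xml")
    return "\n".join(lines)

# In views.py you would wire it up roughly like this:
#   from django.conf import settings
#   from django.http import HttpResponse
#
#   def robots_txt(request):
#       return HttpResponse(
#           build_robots_txt(site_url=getattr(settings, "SITE_URL", "")),
#           content_type="text/plain",
#       )
```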
Approach 2: Static file
Place robots.txt in a static files directory and configure your web server (Nginx/Apache) to serve /robots.txt from there. Note that Django's staticfiles app serves files under STATIC_URL (typically /static/), not at the site root, so this approach requires web server configuration — an alias or location rule for /robots.txt — not just Django config.
Laravel and Plain PHP: Document Root
Plain PHP: Add robots.txt directly to your document root (usually public_html or www). Your web server serves it automatically at /robots.txt. No framework configuration needed.
Laravel: Add robots.txt to the public/ directory. Laravel's public folder is the document root, so everything there is served directly by the web server:
```
my-laravel-app/
  public/
    index.php
    robots.txt   <- Add here
  app/
  resources/
```
For environment-specific robots.txt in Laravel, you can create a route and controller instead of a static file:
```php
// routes/web.php
Route::get('/robots.txt', function () {
    $content = app()->environment('production')
        ? "User-agent: *\nAllow: /\nSitemap: https://yoursite.com/sitemap.xml"
        : "User-agent: *\nDisallow: /";

    return response($content, 200)->header('Content-Type', 'text/plain');
});
```
Verification Checklist for Any Framework
After setting up robots.txt in any framework, run through this checklist:
- Start your development server and load localhost/robots.txt — confirm it returns plain text content, not HTML
- Check that the Content-Type header is text/plain (view in browser DevTools > Network tab)
- Build for production and check the output directory — confirm robots.txt exists in the build output
- Deploy and check yoursite.com/robots.txt live
- If the file isn't loading after deploy, check your web server config — some server setups require explicit rules to serve files without known extensions from specific paths
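The first two checklist items are easy to script. A small sketch using only the Python standard library — the check_robots helper is an illustration, not part of any framework, and the URL you pass it is whatever host you are checking:

```python
import urllib.request

def check_robots(base_url):
    """Fetch /robots.txt and return (ok, content_type, body_preview)."""
    url = f"{base_url.rstrip('/')}/robots.txt"
    with urllib.request.urlopen(url) as resp:
        status = resp.status
        ctype = resp.headers.get("Content-Type", "")
        body = resp.read().decode("utf-8", errors="replace")
    # Passes only if the file loads and is served as plain text, not HTML.
    ok = status == 200 and ctype.startswith("text/plain")
    return ok, ctype, body[:200]
```

Run it against your dev server (for example check_robots("http://localhost:4200")) and again after deploying to production.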
Framework build processes sometimes get updated and change how static files are handled. After major framework upgrades, re-verify that your robots.txt is still being served correctly.
Frequently Asked Questions
Where does robots.txt go in an Angular project?
In the src/ directory, added to the assets array in angular.json. Angular copies it to the dist folder during build.
How do I add robots.txt to a Vue project built with Vite?
Put robots.txt in the public/ directory at the project root. Vite copies everything in public to the build output without modification.
Can Django serve robots.txt as a static file?
Yes, but a URL route view is simpler and doesn't require web server configuration changes. Both approaches work.
Does Laravel have a built-in robots.txt?
No. Add the file manually to the public/ directory, or create a route/controller for dynamic generation.
What if my framework rewrites all URLs to index.html or index.php?
Many SPA frameworks do this. Robots.txt is a static file and must be served before the URL rewriting kicks in. Configure your web server to serve robots.txt directly before passing requests to the application.

