Commit
Add a robots.txt file.
Add a static robots.txt file just prohibiting all crawlers.

Signed-off-by: Julien Goodwin <jgoodwin@studio442.com.au>
laptop006 authored and alinefm committed Jan 21, 2015
Parent: 5ab71df · Commit: fbd2536
Showing 3 changed files with 10 additions and 0 deletions.
4 changes: 4 additions & 0 deletions src/kimchi/config.py.in
@@ -245,6 +245,10 @@ class KimchiConfig(dict):
             'tools.staticfile.on': True,
             'tools.staticfile.filename': '%s/images/logo.ico' % paths.ui_dir
         },
+        '/robots.txt': {
+            'tools.staticfile.on': True,
+            'tools.staticfile.filename': '%s/robots.txt' % paths.ui_dir
+        },
         '/help': {
             'tools.staticdir.on': True,
             'tools.staticdir.dir': '%s/ui/pages/help' % paths.prefix,
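The new `/robots.txt` entry follows the same per-path static-file pattern as the existing favicon rule: each URL maps to a config dict that turns the static-file tool on and points it at a file on disk. A minimal sketch of that pattern in plain Python (the `ui_dir` value here is a hypothetical stand-in for Kimchi's `paths.ui_dir`):

```python
# Sketch of the per-path static-file configuration pattern used above.
# "ui_dir" is a hypothetical stand-in for paths.ui_dir in Kimchi.
ui_dir = "/usr/share/kimchi/ui"

static_routes = {
    '/images/logo.ico': '%s/images/logo.ico' % ui_dir,
    '/robots.txt': '%s/robots.txt' % ui_dir,
}

# Expand each route into the tool-style config dict the server consumes.
config = {
    url: {
        'tools.staticfile.on': True,
        'tools.staticfile.filename': filename,
    }
    for url, filename in static_routes.items()
}

print(config['/robots.txt']['tools.staticfile.filename'])
# → /usr/share/kimchi/ui/robots.txt
```

Keeping the route-to-file mapping in one dict makes adding further static files (like this robots.txt) a one-line change.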
4 changes: 4 additions & 0 deletions ui/Makefile.am
@@ -16,3 +16,7 @@
 # limitations under the License.
 
 SUBDIRS = css images js libs pages spice-html5
+
+uidir = $(datadir)/kimchi/ui
+
+dist_ui_DATA = robots.txt
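The `dist_ui_DATA` primary tells automake to ship `robots.txt` in the distribution tarball and install it into `$(uidir)`. Assuming the conventional autotools default of `datadir = $prefix/share` (the actual prefix is configure-time dependent), the install path resolves like this:

```shell
# Resolve where automake installs dist_ui_DATA, given the rule
# uidir = $(datadir)/kimchi/ui. The prefix below is a hypothetical
# default; ./configure --prefix can change it.
prefix=/usr/local
datadir="$prefix/share"
uidir="$datadir/kimchi/ui"
echo "$uidir/robots.txt"
# → /usr/local/share/kimchi/ui/robots.txt
```

This matches the `paths.ui_dir` lookup used by the `/robots.txt` route in config.py.in, so the served file is the one automake installed.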
2 changes: 2 additions & 0 deletions ui/robots.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
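These two lines disallow every path for every user agent: `User-agent: *` matches all crawlers and `Disallow: /` covers the whole URL space. The effect can be checked with Python's standard-library robots.txt parser (the bot name and host below are hypothetical examples):

```python
from urllib import robotparser

# Parse the exact two lines added in ui/robots.txt.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Any user agent, any path: fetching is disallowed.
# "ExampleBot/1.0" and the localhost URLs are hypothetical.
print(rp.can_fetch("ExampleBot/1.0", "http://localhost:8000/"))
# → False
print(rp.can_fetch("ExampleBot/1.0", "http://localhost:8000/help/"))
# → False
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.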
