
Within this new file, disallow everything, and begin it with a comment that explains its purpose: "# This file is returned for /robots.txt on staging servers". This allows us to create a file specifically for a staging server without touching the live version of robots.txt. The next step is to write a configuration that says, "If this website is staging or development, serve this file instead of robots.txt." This special rule is best written within the Apache config file.
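The post doesn't show the Apache rule itself, so here is a sketch of how such a rule could look. The filename robots-staging.txt and the "staging." hostname prefix are hypothetical stand-ins, not names from the original article:

```apache
# Sketch only - "robots-staging.txt" and the "staging." hostname test are
# hypothetical; substitute the filename and host check from your own setup.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # On hosts whose name starts with "staging.", answer requests for
    # /robots.txt with the staging file instead of the live robots.txt.
    RewriteCond %{HTTP_HOST} ^staging\. [NC]
    RewriteRule ^/?robots\.txt$ /robots-staging.txt [L]
</IfModule>
```

Because the rewrite only fires on the staging hostname, the live site keeps serving its normal robots.txt untouched.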

Step 1 - Modify Robots.txt file

Create a new file titled . It's a smart approach: we never run the risk of forgetting to delete the file once we go live.
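The new file's contents only need to block every crawler; the standard disallow-everything robots syntax is:

```
# This file is returned for /robots.txt on staging servers
User-agent: *
Disallow: /
```

`User-agent: *` applies the rule to all crawlers, and `Disallow: /` blocks the entire site.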

The goal will be to create a document that prevents search engines from showing your staging server in their search results. When you create a Bitnami Wordpress site, it comes with a /robots.txt file by default, but that file is not easy to find. If you're interested in locating its path, run a find command with grep: sudo find / | grep robots.txt

Although we were able to find a working copy of robots.txt above, this is not where we should modify the file. Instead, we should create a new robots.txt the "Bitnami Way". Depending on how you intend to install Wordpress, there are two possible places to add a robots.txt. If you access your Wordpress website as then place the file here: /opt/bitnami/apache2/htdocs If you access your Wordpress website as then place the file here: /opt/bitnami/apps/wordpress/htdocs

I absolutely love the solution presented by Henning Koch.
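The choice between the two install layouts can be sketched in a few lines of shell. The site_mode variable is my own illustration, not part of Bitnami; the two paths are the stock Bitnami locations mentioned above:

```shell
# Sketch: choose the Bitnami document root based on how the site is reached.
# "site_mode" is a hypothetical variable for illustration only.
site_mode="root"   # "root" -> http://<host>/ ; "subdir" -> http://<host>/wordpress/
if [ "$site_mode" = "root" ]; then
    docroot=/opt/bitnami/apache2/htdocs
else
    docroot=/opt/bitnami/apps/wordpress/htdocs
fi
echo "Place the staging robots file under: $docroot"
```

Whichever directory applies to your install, that is where the staging robots file (and any rewrite target) must live so Apache can serve it.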
In this blog entry, I will show you how to modify the robots.txt file for a Wordpress staging server. The Bitnami package does it all, but there are a few things you'll need to modify, including the robots.txt file.

Bitnami offers Wordpress for AWS Cloud, which is great for developers who are not interested in focusing on DevOps topics such as installing PHP, Apache, and Wordpress.
