Virtual Robots.txt is an easy (i.e. automated) solution to creating and managing a robots.txt file for your site. Instead of mucking about with FTP, files, permissions, etc., just upload and activate the plugin and you're done.
By default, the Virtual Robots.txt plugin allows access to the parts of WordPress that good bots, like Googlebot, need to reach. Other parts are blocked.
If the plugin detects an existing XML sitemap file, a reference to it will be automatically added to your robots.txt file.
If a physical robots.txt file exists on your site, the web server will serve it directly and WordPress will never process the request, so there is no conflict.
For sub-folder installations of WordPress: out of the box, no. Because WordPress lives in a sub-folder, it won't "know" when someone requests the robots.txt file, which must be served from the root of the site.
No, it doesn't.
By default, the virtual robots.txt is set to block WordPress files and folders that don’t need to be accessed by search engines. Of course, if you disagree with the defaults, you can easily change them.
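As an illustration, the defaults described above typically produce output along these lines. This is a hedged sketch, not the plugin's literal output: the exact rules depend on your plugin version, and the sitemap line only appears if a sitemap is detected (the `example.com` URL is a placeholder):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

The `Allow` line for `admin-ajax.php` is a common WordPress convention so that bots can still fetch resources some themes load through that endpoint, even though the rest of `/wp-admin/` is blocked.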
Virtual Robots.txt automatically creates a robots.txt file for your site. Your robots.txt file can be easily edited from the plugin settings page.