Remember to Update Robots.txt and File Permissions

Last updated on 30 June 2021 at 6:56pm


Deploying an app can be a boring and cumbersome task. There are a lot of things to consider, and one of them is the visibility and permissions of your directories and files.

The other day I was going through Laravel's dependencies. I don't know why; maybe I wanted to see if there were any unnecessary ones. Anyway, as I was searching for one of the dependencies' GitHub pages, I found a website that had its vendor folder indexed. I think they had the wrong permissions or something. This got me thinking: since a lot of websites use Laravel, I could search for a popular dependency on Google to see if I would find other apps with the same issue. I searched for a random one, and I was not wrong. There were plenty of results.

Here, try it yourself (enter this as your search query in Google or DuckDuckGo): site:*.*/vendor/sebastian/type

sebastian/type is a library that ends up in the vendor folder of a default Laravel installation (it is pulled in by PHPUnit, one of Laravel's dev dependencies).

Using this search query, you can identify some of the websites that are poorly configured.

This sort of search filtering can also be used to find sites that have not changed the default WordPress admin route (/wp-admin or /wp-login.php). Here is the query: site:*.*/wp-admin

Solution

Well, I'm not a security expert, but fixing these permissions is not such a hard task. Here is a Stack Overflow answer that explains the permissions you should set.
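
As a quick reference, here is a minimal sketch of what that usually looks like for a Laravel app. The path /var/www/your-app and the www-data user are assumptions; substitute your own app root and whatever user your web server runs as.

```sh
# Assumed layout: app root at /var/www/your-app, web server running as www-data.
cd /var/www/your-app

# Your deploy user owns everything; the web server group can read it.
sudo chown -R $USER:www-data .

# Directories 755 and files 644: readable by the server, writable only by you.
sudo find . -type d -exec chmod 755 {} \;
sudo find . -type f -exec chmod 644 {} \;

# Only storage and bootstrap/cache need to be writable by the web server.
sudo chmod -R ug+rwx storage bootstrap/cache
```

The important part is that nothing is world-writable, and the web server only gets write access to the directories the framework actually needs to write to.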

Also consider renaming your admin routes and adding them to your robots.txt file, which tells search engine crawlers not to crawl those routes.
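
For illustration, a robots.txt like the one below (placed at the root of your site) would cover the default WordPress paths from the query above; if you have renamed the admin route, list the new path instead. Keep in mind that robots.txt is only a request to well-behaved crawlers, not access control.

```
# Ask crawlers to skip admin routes and the vendor folder.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
Disallow: /vendor/
```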