Hey guys, welcome to part 4 of our SEO for beginners tutorial journey! Today's session is a slightly longer one, but an important one nevertheless! We are going to talk about two files every site should have on its server. The first file we will look at is the robots.txt file, and after that we will cover the sitemap.

The robots.txt file!

Now you must be wondering: what does this file do, and what is its purpose? It is quite simple, actually. The robots.txt file is a file that robots (or bots, as you can call them) read so they know what permissions they have on your site. For example, when you submit your site to Google so that they can index it, they send their crawl bot to your site to analyse it. So what is site indexing?

Site indexing happens when you submit your site for review by Google. They send their bot to scan your site for all of its content, links and code, and then add you to their search engine according to what the bot has found.

It is important to understand what site indexing is. Google works by adding the pages of your site to categories. When someone searches for a term on Google, it goes through all the pages associated with the category that person is searching for, and then returns results based on the best pages for that category.

All that in a fraction of a second… And they still can’t predict the weather properly… Ha. Ha.

So when the Google bot comes to crawl your site for content and information so that it can classify you, it will first and foremost look for a robots.txt file. The purpose of this file is to answer the following questions for the bot:

  1. Am I allowed to visit this site?
  2. If I am allowed, are there restrictions on which pages I can crawl?
  3. Or am I allowed to crawl all the pages?
  4. Are there any bots that aren’t allowed at all?

The robots.txt file serves as a permission file for bots. Some bots are malicious: they come to your site to scan it for vulnerabilities. The robots.txt file can sometimes help discourage them, but keep in mind that it is only a set of instructions. Well-behaved bots respect it, while malicious bots can simply ignore it, so it is important to be aware of these bots and not rely on robots.txt as your only protection.

Where do you get this robots.txt file? It is simple: anyone can make one easily. You can make one yourself if you wish. All you have to do is the following:

  1. Open a blank .txt file.
  2. Insert the following:
    User-agent: *
    Disallow:
  3. Save it as robots.txt (all lowercase, since some servers are case sensitive).
  4. Upload it to the / folder of your server, or as other people call it, the root.
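
Once it is uploaded, you can quickly check that everything worked by opening the file in your browser. A quick sketch, with example.com standing in for your own domain:

https://www.example.com/robots.txt

If your file’s contents appear at that address, the bots will find it there too.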

You must be aware that robots.txt files support different permission settings. The setting I gave you above allows all robots to scan your site freely. You might have some sensitive content or folders that you would like to exclude. These are the settings available, and they should fit most situations:

Settings :

To exclude all robots from the entire server

User-agent: *
Disallow: /

To allow all robots complete access

User-agent: *
Disallow:

(or just create an empty “/robots.txt” file, or don’t use one at all)

To exclude all robots from part of the server

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /junk/

To exclude a single robot

User-agent: BadBot
Disallow: /

To allow a single robot

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
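
One more handy line before we wrap up this file: robots.txt also supports a Sitemap line that tells bots where to find your sitemap (more on that file in a moment). A small sketch, with example.com standing in for your own domain:

User-agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml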

That concludes the information needed to create a robots.txt file and upload it to your server properly. If you need further assistance, feel free to send us a message on our Facebook! Now let’s move on to the sitemap file!

The sitemap!

The sitemap file is a really simple part of this tutorial. It is exactly what its name indicates: a map of your website. It is used so that robots and humans can better understand the structure of your site. It helps them understand the levels and categories that each page belongs to. Here is an example of the sitemap for MYDesigns:

[Image: the MYDesigns sitemap]

As you can see, our sitemap is really simple; our website is only one page, so the sitemap is tiny. The file itself can be generated by many online sitemap creators. Feel free to use the one we used, at the following link: www.xml-sitemaps.com
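
If you are curious what the file looks like inside, here is a minimal sketch of a sitemap.xml for a one-page site, with example.com standing in for your own domain and a made-up date:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-01</lastmod> <!-- placeholder date: when the page last changed -->
  </url>
</urlset>

Each <url> entry is one page of your site; a bigger site simply lists more of them.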

The site also provides a lot of information about sitemaps. It is important to have one on your site, mainly because Google’s crawl bots like to have one so they can better understand your structure.

The easier you make life for Google’s bot, the better your SEO score will be.

Once you have obtained your sitemap, upload it to the root folder of your server, then submit it through your Google webmaster account. Voilà! The sitemap is done!

This concludes part 4 of our SEO tutorial journey. On this note, happy holidays and a happy New Year! Till next time!