Want to make your site more visible?
Want to create a way to let search engines know about all of your pages?
Want to let Google, Yahoo!, MSN, Ask, and others know about your content?
Well, now you can by completing two easy tasks:
- Create an XML sitemap
- Create a robots.txt file
What is an XML sitemap?
An XML sitemap is a listing of all of your website's pages in a simple XML format. The file uses XML markup to define each page's URL, along with optional details such as when the page was last modified and how often it changes. The main point of creating an XML sitemap is to capture all of your website's pages. If you have a dynamic website that is constantly growing through member registration, new products, or blogging, you need to update the XML sitemap regularly to let the search engines know about your new pages. If you need help creating an XML sitemap, check out this automated XML sitemap submission service. After you have created your XML sitemap, only one step remains: adding a robots.txt file.
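To give you an idea of the format, here is a minimal sitemap.xml sketch. The domain and date are placeholders, and the lastmod, changefreq, and priority elements are optional under the sitemaps.org protocol:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomain.com/</loc>
    <lastmod>2009-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.yourdomain.com/about.html</loc>
  </url>
</urlset>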
What is a Robots.txt file?
Every search engine has robots, also known as bots, crawlers, or spiders. These programs constantly visit websites in the hope of finding new content to index. However, some webmasters do not want all of their content indexed. That was the original purpose of the robots.txt file: to keep robots out of certain directories. More recently, the robots.txt standard was extended to include the sitemap location, essentially pointing the robots to a list of your pages.
If you do not have a robots.txt file, simply open Notepad and type the following:
Sitemap: [Sitemap_Location]
Where [Sitemap_Location] = http://www.yourdomain.com/sitemap.xml
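If you also want to keep the robots out of certain directories, you can combine Disallow rules with the Sitemap directive in the same file. Here is a sketch of a complete robots.txt; the /private/ directory is just a placeholder:

User-agent: *
Disallow: /private/
Sitemap: http://www.yourdomain.com/sitemap.xml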
When you are all done, upload these two files to your root folder so you can reach them in a web browser by going to:
- http://www.yourdomain.com/sitemap.xml
- http://www.yourdomain.com/robots.txt
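If your site is dynamic, regenerating the sitemap by hand gets tedious, so you may want to script it. Below is a rough Python sketch that rebuilds sitemap.xml from a list of URLs; the URLs shown are hypothetical, and in practice you would pull them from your database or CMS:

# generate_sitemap.py - a minimal sketch that writes sitemap.xml
# from a hard-coded list of URLs.
from xml.sax.saxutils import escape

# Hypothetical page list; replace with your own site's URLs,
# ideally pulled from your database or CMS.
urls = [
    "http://www.yourdomain.com/",
    "http://www.yourdomain.com/about.html",
    "http://www.yourdomain.com/products.html",
]

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        f.write("  <url>\n")
        f.write("    <loc>%s</loc>\n" % escape(url))  # escape &, <, > in URLs
        f.write("  </url>\n")
    f.write("</urlset>\n")

Run it whenever you add pages, then re-upload the generated sitemap.xml to your root folder.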
If you have any questions about increasing your page visibility, let us know.