It is important that your website can be found through search engines. Google uses Googlebot to crawl and index websites and to determine how they rank. Here are some ways to control crawling and indexing so you can decide which parts of your website are favored:
- Use robots.txt to tell Googlebot which parts of your site to skip (see the first example after this list)
- Add a nofollow (or noindex) meta robots tag to individual pages (example below)
- Change your crawl rate using Google Search Console
- Delete irrelevant or outdated content so crawl activity is spent on pages that matter
- Restrict access to pages you don’t want Google to see, such as private client information, by placing them behind authentication (example below)
- Use robots.txt to block only your images from being crawled (example below)
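For example, here is a minimal robots.txt, placed at the root of your domain; the /private-clients/ path is just a placeholder you would swap for your own directories:

```
# robots.txt — hypothetical example; adjust paths to your site
User-agent: Googlebot
Disallow: /private-clients/

# All other crawlers may fetch everything
User-agent: *
Disallow:
```

Keep in mind that robots.txt controls crawling, not indexing; a blocked URL can still appear in search results if other sites link to it.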
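The meta robots tag goes in the head of an individual page. A sketch, assuming an HTML page you control:

```html
<!-- Keep this page out of Google's index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```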
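One common way to restrict access is HTTP basic authentication. A minimal sketch for an Apache .htaccess file, assuming a .htpasswd file you have already created (the path shown is a placeholder):

```
# .htaccess — hypothetical example; create the user file with the htpasswd tool
AuthType Basic
AuthName "Private client area"
AuthUserFile /var/www/.htpasswd
Require valid-user
```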
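To block only images, target Google's image crawler, Googlebot-Image, rather than Googlebot itself:

```
# Block all images from Google Images while normal pages stay crawlable
User-agent: Googlebot-Image
Disallow: /
```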
When you are done, verify that requests claiming to come from Googlebot are genuine: compare the requesting IP addresses in your server logs against Google’s published list of Googlebot IPs, or run a reverse DNS check (sketch below). This protects you from malicious bots pretending to be Googlebot. Also review your crawl statistics in Google Search Console (Settings > Crawl stats); they will help you organize a content marketing strategy.
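Here is a minimal Python sketch of the reverse DNS check, assuming you feed it IP addresses pulled from your server’s access logs. Per Google’s guidance, a genuine Googlebot hostname ends in googlebot.com or google.com, and a forward lookup of that hostname should return the original IP:

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Return True if `ip` verifies as Googlebot via reverse + forward DNS."""
    try:
        # Reverse DNS: ask what hostname the IP claims
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP
        _, _, forward_ips = socket.gethostbyname_ex(hostname)
        return ip in forward_ips
    except OSError:  # lookup failed; treat as unverified
        return False

# Hypothetical usage with an IP taken from an access log
print(is_googlebot("66.249.66.1"))
```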
Need help implementing these changes on your website? Call 248.528.3600 and we can help.