haleakua · Member · Joined Mar 7, 2015
Google Webmaster Tools offers a range of useful data to improve your site's SEO. Beyond helping with link-building tasks, it shows a lot of information about how Google sees your site.
In this thread I will show you some easy tips for finding that data in Google Webmaster Tools and using it to improve your SEO.
1. Fix duplicate meta tags and titles
If you open the "Search Appearance" tab and then "HTML Improvements", you will see a series of sub-items related to meta descriptions and title tags. This is an essential aspect of your page's on-site SEO. If you find duplicate meta descriptions, missing title tags, or title tags that are too short or too long, go through your pages and fix these issues.
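Before (or alongside) checking the report, you can audit pages yourself. Here is a minimal sketch in Python using only the standard library; the length thresholds are common rules of thumb, not official Google limits, and the sample HTML is hypothetical:

```python
from html.parser import HTMLParser

# Rule-of-thumb length limits (assumptions, not official Google limits)
TITLE_MIN, TITLE_MAX = 30, 60
META_MIN, META_MAX = 70, 160

class TitleMetaChecker(HTMLParser):
    """Extract the <title> text and the meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_page(html):
    """Return a list of warnings about title / meta description length."""
    parser = TitleMetaChecker()
    parser.feed(html)
    warnings = []
    if not parser.title:
        warnings.append("missing title tag")
    elif not TITLE_MIN <= len(parser.title) <= TITLE_MAX:
        warnings.append(f"title length {len(parser.title)} outside {TITLE_MIN}-{TITLE_MAX}")
    if not parser.meta_description:
        warnings.append("missing meta description")
    elif not META_MIN <= len(parser.meta_description) <= META_MAX:
        warnings.append(f"meta description length {len(parser.meta_description)} outside {META_MIN}-{META_MAX}")
    return warnings

# Hypothetical page with a too-short title and no meta description
print(check_page("<html><head><title>Tips</title></head><body></body></html>"))
```

Running this over your pages lets you fix the same problems the "HTML Improvements" report flags, without waiting for Google to recrawl.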
2. Check the ranking of your pages
In "Search Traffic" > "Search Queries" you will see which keywords are driving traffic to your website. If those keywords have little to do with your business, the audience they attract is probably not made up of prospects. Another aspect to keep in mind is the average position of your pages, since the number of impressions and clicks you get depends largely on it.
3. Check the crawl error pages
The crawl errors report, which you can find in the "Crawl" section, shows the pages that a user or Googlebot itself tried to access but could not. This often has to do with URLs that are not friendly. Try to eliminate unnecessary parameters from your URLs so that they are SEO friendly and do not cause access problems.
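One simple cleanup is dropping the query string and fragment from a URL. A minimal sketch with Python's standard `urllib.parse`; the example URL and the `sessionid` parameter are hypothetical, and in practice some parameters (pagination, filters) may be worth keeping, so review each case before rewriting:

```python
from urllib.parse import urlsplit, urlunsplit

def strip_parameters(url):
    """Return the URL without its query string or fragment."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

# Hypothetical URL with tracking/session parameters
print(strip_parameters("https://example.com/products?sessionid=42&sort=asc"))
```

If you do rewrite URLs like this on a live site, remember to 301-redirect the old parameterized versions to the clean ones so existing links keep working.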
4. Submit a sitemap
The sitemap is a very important part of any website. It tells Google which parts of your site should be crawled and indexed. It is usually a file named sitemap.xml. Submitting it allows Google to discover and crawl pages that it was previously having difficulty reaching.
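A minimal sitemap.xml looks like this (the URLs and dates are placeholders; the namespace is the standard one from the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2015-03-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

Upload it to your site's root and submit it under "Crawl" > "Sitemaps" in Webmaster Tools.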
5. Create your robots.txt
The robots.txt file gives crawlers hints about which pages they should not crawl. Google may or may not take it into account. Usually, we create a robots.txt file to tell Google that certain pages add little value and we do not want them crawled or indexed.
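A small example robots.txt (the paths and domain are placeholders). Note that Disallow only blocks crawling; to reliably keep an already-known page out of the index, a noindex meta tag on the page itself is the better tool:

```
# Apply to all crawlers
User-agent: *
Disallow: /search/
Disallow: /tmp/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Place the file at the root of your domain (https://example.com/robots.txt) so crawlers can find it.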
What about your site? Are you carrying out these actions in Google Webmaster Tools to improve your website's internal SEO? Have you created your sitemap and robots.txt files? Do you have duplicate content?
I'd like to hear your thoughts!