404 Error when checking cache of website

cogxim

New member
Joined
Aug 9, 2018
Messages
8
Points
0
My website has been indexed by Google, but when I check its cache it gives a 404 error. Why is this? I have tried everything: resubmitting the sitemap, Fetch as Google, and more. At first the error appeared on only some pages; now a 404 error shows on all pages.
My website URL is cogxim.com
An answer would be really appreciated.
 

Shivam Tripathi

New member
Joined
Aug 14, 2018
Messages
6
Points
0
I am facing the same problem. You just have to update your sitemap and you should see the result within 15 minutes. Try it once.
 

Marc van Leeuwen

Premium Member
Joined
May 29, 2016
Messages
1,132
Points
63
Looking at the second line of your robots.txt, you need to remove it.

Code:
cogxim.com/robots.txt
Remove the highlighted line below (the bare Disallow: directive) and your web pages will be cached by Google.
Code:
User-agent: *
[COLOR="#FF0000"]Disallow:[/COLOR]
Disallow: /finance.htm
Disallow: /contact.html
I hope it helps!
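If you want to double-check what Googlebot is actually allowed to crawl once the file is updated, one option (not something mentioned above, just a sketch using Python's standard urllib.robotparser; the https scheme is an assumption) is:

Code:
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (URL taken from this thread).
rp = RobotFileParser()
rp.set_url("https://cogxim.com/robots.txt")
rp.read()

# True means Googlebot may crawl the URL; False means a Disallow rule blocks it.
print(rp.can_fetch("Googlebot", "https://cogxim.com/"))
print(rp.can_fetch("Googlebot", "https://cogxim.com/finance.htm"))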
 

cogxim

New member
Joined
Aug 9, 2018
Messages
8
Points
0
Hi Marc,
I have removed the Disallow line from my robots.txt file; hopefully Google will cache my website now. Instead of Disallow, can we use Allow?
 

Marc van Leeuwen

Premium Member
Joined
May 29, 2016
Messages
1,132
Points
63
This is what you should add to your robots.txt file:

Code:
User-agent: *
Disallow: /finance.htm
Disallow: /contact.html
Disallow: /automobile.html
Disallow: /services.html
Disallow: /visage.html
Disallow: /downloads.html
Disallow: /news_events.htm
Disallow: /offshore_support.htm
Disallow: /e_commerce.htm 
Disallow: /petrogenius.htm
Disallow: /transport.htm
Disallow: /privacy_policy.htm
Disallow: /software_consultancy.htm
Allow: /
After that, resubmit your sitemap, use the fetch tool, and build some quality backlinks; Google will then cache your site faster.
I hope it helps!
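If you want to sanity-check a draft robots.txt like the one above before uploading it, one way (again just a sketch with Python's standard urllib.robotparser; only a few of the suggested rules are included here) is to feed the draft lines straight to the parser:

Code:
from urllib.robotparser import RobotFileParser

# A few of the draft rules from the post above, tested locally before uploading.
rules = """\
User-agent: *
Disallow: /finance.htm
Disallow: /contact.html
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://cogxim.com/"))             # expected: True
print(rp.can_fetch("Googlebot", "https://cogxim.com/finance.htm"))  # expected: False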
 

cogxim

New member
Joined
Aug 9, 2018
Messages
8
Points
0
I have resubmitted the robots.txt using Allow: /. I hope it works.
 

simicartan

New member
Joined
Aug 1, 2018
Messages
25
Points
0
A weird thing happens to my site too. When I fetch as Googlebot, it shows that my site has 404 errors, but I cannot find where the 404 pages are linked from in order to fix them, which means there is no trace of the error pages at all. How can Googlebot still index the error pages? I know it sounds weird.
 

cogxim

New member
Joined
Aug 9, 2018
Messages
8
Points
0
It's still not working, Marc. I have made the changes and waited, but with no result.
 

Marc van Leeuwen

Premium Member
Joined
May 29, 2016
Messages
1,132
Points
63
Have you tried using "Fetch as Google" for your website, and what result does it return?

After using the fetch tool, you can click on Partial or Complete under the Status column to see details of which parts of your site are blocked for robots.
 