Unsolved: Blog archive pages in Crawl Error Report
-
Hi there,
I'm new to Moz Pro and have a question. My scan shows archive pages as having crawl issues, but this is because Yoast is set up to block robots on these pages.
Should I be allowing search engines to crawl these pages, or am I fine to leave them as I have it set up already?
Any advice is greatly appreciated.
Marc -
@fcevey
If blog archive pages are showing up in the crawl error report, it means that search engine bots are encountering issues while attempting to crawl and index those pages. To address this, work through the checks below (a small verification script is sketched after this list):
Check URL Structure: Ensure that the URLs for your blog archive pages are correctly formatted and follow best practices. Avoid special characters, and use a logical and organized structure.
Update Sitemap: Make sure that the blog archive pages are included in your website's XML sitemap. Submit the updated sitemap to search engines using their respective webmaster tools.
Robots.txt File: Review your website's robots.txt file to ensure it's not blocking search engine bots from crawling your blog archive pages. Adjust the file if needed.
HTTP Status Codes: Check if the archive pages return the correct HTTP status codes (e.g., 200 OK). Crawl errors might be triggered if pages return 4xx or 5xx status codes.
Internal Linking: Ensure that there are internal links pointing to your blog archive pages. This helps search engines discover and index these pages more effectively.
Redirects: If you've recently changed the URL structure or migrated your website, implement proper redirects from old URLs to new ones to maintain SEO authority.
Server Issues: Investigate if there are any server-related issues causing intermittent errors when search engine bots try to access the blog archive pages.
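To make the robots.txt and status-code checks above concrete, here is a minimal sketch in Python. It assumes Python 3 with the requests package installed; the domain and archive paths are placeholders rather than Marc's actual site, and the "noindex" check is only a rough string match for the directive Yoast typically adds to a page.

```python
# Minimal sketch: for each archive URL, report whether robots.txt allows
# crawling, what HTTP status the page returns, and whether a noindex
# directive appears in the response. Assumes Python 3 and `requests`.
from urllib.robotparser import RobotFileParser

import requests

SITE = "https://www.example.com"      # placeholder domain
ARCHIVE_URLS = [                      # placeholder archive paths
    f"{SITE}/2023/05/",
    f"{SITE}/category/news/",
    f"{SITE}/author/marc/",
]

# Load and parse the site's robots.txt once.
robots = RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()

for url in ARCHIVE_URLS:
    allowed = robots.can_fetch("*", url)                # robots.txt rule check
    response = requests.get(url, timeout=10)            # HTTP status check
    x_robots = response.headers.get("X-Robots-Tag", "not set")
    has_noindex = "noindex" in response.text.lower()    # rough meta-robots check
    print(url)
    print(f"  robots.txt allows crawling: {allowed}")
    print(f"  HTTP status: {response.status_code}")
    print(f"  X-Robots-Tag header: {x_robots}")
    print(f"  'noindex' found in HTML: {has_noindex}\n")
```

If the script reports a 200 status and a noindex directive rather than a robots.txt block, the crawl warning is most likely just surfacing Yoast's intentional setting for those archives.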
-
Blog archive pages in the crawl error report indicate problems encountered while crawling or indexing the archive pages of a blog or website. These errors need attention to ensure that all content remains accessible to users and search engines.
-
@mhenshall The decision to allow search engines to crawl archive pages in Yoast SEO, or to leave them as they are currently configured, depends on your specific goals and needs.
If the archive pages contain valuable and relevant content for both search engines and users, allowing them to be crawled could enhance the visibility of that content in search results. However, if the content on the archive pages is not important or is duplicated from other pages, blocking crawling could be a valid option to prevent indexing issues and improve the user experience.
I would recommend evaluating the content on the archive pages and considering how their visibility in search engines will be affected by allowing or blocking crawling. You can use tools like Google Search Console to monitor how Google is indexing those pages and make informed decisions based on the data.
Keep in mind that configuring Yoast SEO is a strategic decision that should align with your SEO goals and your website's needs.
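One way to put data behind the Search Console suggestion above is the Search Analytics API. The sketch below is a minimal example, assuming the google-api-python-client and google-auth packages are installed and that a service account JSON key has read access to the property; the key file path, property URL, date range, and the /category/ archive path pattern are all placeholders, not values from this thread.

```python
# Minimal sketch: pull clicks/impressions for archive-style URLs from the
# Google Search Console Search Analytics API to judge whether those pages
# earn any search traffic. All identifiers below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"         # placeholder key file
PROPERTY = "https://www.example.com/"     # placeholder Search Console property

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

body = {
    "startDate": "2024-01-01",            # placeholder date range
    "endDate": "2024-03-31",
    "dimensions": ["page"],
    "dimensionFilterGroups": [{
        "filters": [{
            "dimension": "page",
            "operator": "contains",
            "expression": "/category/",   # assumed archive path pattern
        }]
    }],
    "rowLimit": 100,
}

response = service.searchanalytics().query(siteUrl=PROPERTY, body=body).execute()
for row in response.get("rows", []):
    page, clicks, impressions = row["keys"][0], row["clicks"], row["impressions"]
    print(f"{page}: {clicks} clicks, {impressions} impressions")
```

If the archive pages earn essentially no impressions, leaving them noindexed is usually the safer call; if they attract meaningful traffic, it may be worth letting them be crawled and indexed.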
-
Related Questions
-
What steps should I take to address damage to my website, including malware insertion and content theft?
This question asks what steps are needed to mitigate damage inflicted on a website, including malware insertion and content theft. The asker is looking for practical solutions and strategies to repair the damage and safeguard the site's integrity going forward.
Technical SEO | | ralphbaer0 -
What Tools Should I Use To Investigate Damage to my website
I would like to know what tools I should use, and how, to investigate damage to my website in2town.co.uk. I hired a person on a freelance platform to do some work on my website, but they damaged it; they were later removed from the platform because of all the complaints made about them. They also put backdoors on websites, including mine, and added content. I also had a second problem where my content was being stolen. My site always did well and had lots of keywords in the top five and ten, but now they are not even in the top 200. This happened in January and February. When I write unique articles, they are not showing in Google, and I need to find out what the problem is and how to fix it. Can anyone please help?
Technical SEO | | blogwoman10 -
Pages with Duplicate Content Error
Hello, a duplicate content error appeared in the crawl results for my Shopify store, but these products are unique. Why am I getting this error? Can anyone please explain why? screenshot-analytics.moz.com-2021.10.28-19_53_09.png
Moz Pro | | gokimedia0 -
How do .ae and .com domains compare for SEO performance in the UAE region?
I have a domain for my UAE-based project, https://mydubaiseo.com/; however, one of my colleagues suggested going with a .ae domain instead. If we change the domain as suggested, will we see results sooner than with the .com domain? Which domain, .com or .ae, ranks faster in the UAE if the same SEO strategy is followed?
Technical SEO | | 0eup.ombitao0 -
Conflict in reported link data
I have a competitor in a campaign - the campaign report shows 181 linked domains BUT the site explorer report shows only 15 linked domains. Which is correct? And if the site explorer is correct - how do we fix the campaign report? Screenshot: http://imgur.com/m9xFv
Moz Pro | | robertdonnell0 -
Too many on-page links
One of my SEOmoz Pro campaigns has given me the warning "Too many on-page links", and the page in question is my HTML sitemap. How do I resolve this, since I obviously need my sitemap? How do I get around this?
Moz Pro | | CompleteOffice1 -
On-Page URL
Hopefully I am missing something basic... I can't see how to specifically add and delete On-Page reports. It seems like running a report adds it, but how do I delete one? Also, how does one change the URL for a report? I have re-organized some pages and can't seem to get the on-page report to keep my URL change. Here is what I tried: from the On-Page report card for a keyword, I changed the URL and ran the test. The test runs OK, but if I navigate back to the summary my old, bad URL is still there.
Moz Pro | | Banknotes0 -
On Page Report Card... with or w/o local modifiers?
Hey all! So I am curious how you recommend using the "on page report card" (which is really helpful) along with the concept of local modifiers. For example, here is a term I am going after: business forums, but really I care about a specific location: business forums | Greensboro NC. The advice I typically hear is to do your keyword research and page optimization for the primary term, then tack on your local modifiers after. So which do you run reports on? Probably both is the best answer, eh? Obviously my local sites won't have a shot at ranking nationally/internationally for such a broad term as "business forums", especially with some monster sites out there with some serious clout. This is more of a best-practices question. Thanks dudes.
Moz Pro | | nsmcseo20