Very Old Pages Creeping Up - Advice
-
We are currently seeing very old pages, dating back 5+ years, suddenly appearing in Moz. We don't get traffic from these links anymore, and I doubt they still hold any weight.
Currently they resolve to a 404 page. Would there be any worth in redirecting these links?
-
Thanks for the reply, this makes a lot of sense. Very much appreciated.
-
Makes a lot of sense. You'd need to crawl all the backlinks using something like ScrapeBox Free Link Checker or Screaming Frog (with custom XPath extraction settings) to see which of these links are still live. For the live ones, you'd want to run something like URL Profiler (with paid API keys for Moz, Majestic and Ahrefs) over the links to see whether any of them are worth anything, by bulk-fetching metrics and aggregating them in your own spreadsheet with a weighted custom formula. Once that was done, you could decide which pages to keep and redirect those. The rest you could handle separately or simply ignore.
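If you'd rather script the live-check step than use a desktop tool, here is a minimal Python sketch of that part only. It assumes you already have the backlink URLs exported to a plain text file (backlinks.txt) and that example.com stands in for your own domain; the metrics/weighting step via the paid APIs is out of scope here.

```python
# Minimal sketch (assumed workflow): check which known backlink URLs are
# still live and still contain a link to your domain. "backlinks.txt" and
# "example.com" are placeholders for your own export and domain.
import requests

TARGET_DOMAIN = "example.com"  # hypothetical: the domain the backlinks point to

def check_backlinks(path="backlinks.txt"):
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    results = []
    for url in urls:
        try:
            resp = requests.get(url, timeout=10,
                                headers={"User-Agent": "link-audit-sketch/0.1"})
            # Crude check: does the page source still mention our domain?
            still_links = TARGET_DOMAIN in resp.text
            results.append((url, resp.status_code, still_links))
        except requests.RequestException:
            results.append((url, None, False))
    return results

if __name__ == "__main__":
    for url, status, links in check_backlinks():
        print(f"{status}\t{'links to us' if links else 'no link found'}\t{url}")
```

Rows that come back 404/410, or that no longer mention your domain, are the candidates you can drop from the audit before spending API credits on them.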
-
This stems from Link Explorer; pretty much all of the links are from external sites outside of our control.
-
If you're talking about Moz's crawler, it might mean that you still have links to those pages on your own site even though you have removed the pages. Cleaning up links to 404s can help your 404 count decrease, both in Search Console and with Moz's crawler.
If the pages have no SEO weight or traffic, redirecting them may be a waste of time. Just remove the links instead.
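As a rough illustration of that cleanup step (a hedged sketch, not a Moz feature), the snippet below scans a handful of your own pages for internal links whose targets return 404 so the stale links can be removed. The site URL and page list are placeholders you would swap for your own, for example from your sitemap.

```python
# Minimal sketch (assumed setup): find internal links on your own pages that
# point at URLs returning 404, so the stale links can be removed.
# SITE and PAGES are placeholders you would replace (e.g. from your sitemap).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "https://www.example.com"          # hypothetical: your own site
PAGES = [SITE + "/", SITE + "/about/"]    # pages to scan for outgoing links

def find_dead_internal_links(pages):
    dead, checked = [], {}
    for page in pages:
        html = requests.get(page, timeout=10).text
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(page, a["href"])
            if urlparse(target).netloc != urlparse(SITE).netloc:
                continue  # only audit internal links
            if target not in checked:
                # HEAD is cheap; some servers reject it, see the note below
                checked[target] = requests.head(
                    target, timeout=10, allow_redirects=True).status_code
            if checked[target] == 404:
                dead.append((page, target))
    return dead

for source, broken in find_dead_internal_links(PAGES):
    print(f"{source} -> {broken} (404)")
```

Note that some servers reject HEAD requests; swapping requests.head for requests.get is a safe fallback if you see a lot of 405s.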
Related Questions
-
Paginated Pages Page Depth
Hi Everyone, I was wondering how Google counts page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with an infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep. Using Moz's blog as an example, is https://moz.com/blog?page=2 treated as being on the same level, in terms of page depth, as https://moz.com/blog? If so, is it the <link rel="prev" href="https://site.com/blog" /> and <link rel="next" href="https://site.com/blog?page=3" /> markup that helps Google recognize this? Or does Google treat page depth the same way that DeepCrawl is showing it, with the blog posts on page 2 being +1 in page depth compared to the ones on page 1, for example? Thanks, Andy
Intermediate & Advanced SEO | | AndyRSB0 -
Should I even do product page schema?
If I have no reviews/ratings on the page itself, no special or limited-time offers, and just a regular product page with a standard price, is there any way to do product schema without it getting flagged for errors? Google's Structured Data Testing Tool threw me an error when I tested it without any of those: "One of offers or review or aggregateRating should be provided." And even if it's possible, is there any point?
Intermediate & Advanced SEO | | SearchStan0 -
Pillar pages and blog pages
Hello, I was watching this video about pillar pages https://www.youtube.com/watch?v=Db3TpDZf_to and tried to apply it to my own site, but I find it impossible to do (but maybe I am looking at it the wrong way). Let's say I want to rank for "Normandy bike tour". I created a pillar page about "Normandy bike tour"; what would the topics of the subpages boosting that pillar page be? I know they should be questions people have, but in the tourism industry they don't have any; they just want us to make them dream! I thought about doing more general blog pages about things such as: places to rent a bike in Normandy or in XYZ city (related to biking), or the landing sites in Normandy (not related to biking). Is that the way to do it? What do you recommend? Thank you,
Intermediate & Advanced SEO | | seoanalytics0 -
Redirecting homepage to internal page (2nd Tier page)
We are planning to experiment with redirecting our homepage to one of our 2nd-tier pages, i.e. example.com to example.com/page. We need this page to rank well, but it doesn't have many internal links or external backlinks, so we opted for this redirect. The advantage of this page is that it has the keyword we want to rank for in its URL ("page" in example.com/page). Will this help or hurt us in SEO? I think we are missing the keyword in our root domain, so we're interested in highlighting this page. Thanks, Satish
Intermediate & Advanced SEO | | vtmoz0 -
Redirect old "not found" url (at http) to new corresponding page (now at https)
My least favorite part of SEO 😉 I'm trying to redirect an old URL that no longer exists to our new website, which is built on https. The old URL: http://www.thinworks.com/palm-beach-gardens-team/ New URL: https://www.thinworks.com/palm-beach-gardens/ This isn't working with my standard process of the quick redirection plugin in WP or through .htaccess, because the old site URL is at http and not https. Any help would be much appreciated! How do I accomplish this, where do I do it, and what's the code I'd use? Thank you Moz community! Ricky
Intermediate & Advanced SEO | | SUCCESSagency0 -
Does an H1 have to be at the top of a page?
Because H1 "may" carry some weight with Google does it have to be placed at the top of the page? Can I place it towards the bottom of the page instead in normal body size? My goal is to keep the main keywords in the H1 but create a much friendlier title for the customer to read at the top of the page.
Intermediate & Advanced SEO | | PottyScotty0 -
Removing old versions of a page.
Recently one of my good friends updated his iWeb-based screen printing site to WordPress per my recommendation. This update has helped dramatically boost his rankings to #3 for most local keywords. The new site is now V5 of his site, but all older iWeb versions are still on the FTP. There are a total of 209 pages on the FTP, as versions of about 30 actual pages. The pages have changed significantly with each update, leaving very little duplicate content, but the old ones are still in Google's index. Would it hurt the rankings to clean up these older versions and 301 redirect them to the new versions, or should we leave them? The site, for reference, is: http://takeholdprinting.com
Intermediate & Advanced SEO | | GoogleMcDougald0 -
SEOmoz crawls all my pages
SEOmoz crawls all my pages, including ".do" pages (all web pages reached after sign-up). Because of this, it uses up my entire 10,000-page crawl quota and gets exposed to duplicate pages. Google is not crawling the pages that users reach after sign-up, because these are private pages for customers, I guess. The main question is how we can limit the SEOmoz crawl bot. If the bot can stay out of the ".do" Java extensions, it'll be perfect for starting the SEO analysis. What do you think? Cheers. Example: .do Java extension (after sign-up page; Google can't crawl): http://magaza.turkcell.com.tr/showProductDetail.do?psi=1001694&shopCategoryId=1000021&model=Apple-iPhone-3GS-8GB Normal page (Google can crawl): http://magaza.turkcell.com.tr/telefon/Apple-iPhone-3GS-8GB/1001694/.html
Intermediate & Advanced SEO | | hcetinsoy0