GFD_Chris
@GFD_Chris
Job Title: Senior SEO Manager
Company: Go Fish Digital
I am a Senior SEO Manager for the Go Fish Digital team. I work with unique problems and advanced search situations to help clients improve organic traffic through a deep understanding of Google's algorithm and web technology. I love looking into interesting search problems! Feel free to reach out at chris.long@gofishdigital.com
Latest posts made by GFD_Chris
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
Unlikely. As long as they're returning 404 errors, you should be OK. Maybe update your disavow file and you should be good to go!
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
It's tough to say without seeing the site. Overall, it's unlikely to be an issue if you don't use that string anywhere. We usually see this more with broken relative URLs. Maybe a third-party site is using that string.
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
From what I can tell, this probably isn't the reason for the drops. I'd go back and ensure that any URLs that changed are 301 redirecting to the correct destination URL. I'd also ensure that no pages associated with high-volume keywords have been removed.
For your issue, Google is likely finding some broken URLs, possibly from your internal linking structure. Perform a crawl of the site and see if you can find "Inlinks" to those broken pages. If so, you can work with dev to eliminate the issue.
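Purely as an illustration (not something from the original thread), here's a minimal Python sketch for spot-checking that changed URLs 301 to the right destinations; the redirect_map entries are placeholder URLs you'd swap for your own old/new pairs:

```python
# Hypothetical spot-check: confirm old URLs 301 to their intended destinations.
import requests

# Placeholder mapping of old URL -> expected destination URL.
redirect_map = {
    "https://www.example.com/old-page/": "https://www.example.com/new-page/",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location == expected:
        print(f"OK     {old_url} -> {location}")
    else:
        print(f"CHECK  {old_url} returned {resp.status_code} -> {location or 'no Location header'}")
```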
-
RE: Duplicate content? other issues? using vendor info when selling their products?
So it's generally not a best practice but definitely more par for the course in the eComm world. It's not uncommon to see sites that strictly use descriptions scraped from manufacturer feeds.
Ideally, your pages will contain 100% unique content. If this isn't possible, I generally advise clients to do the following:
- Find dynamic ways of adding unique content (similar products, categories etc)
- Add review functionality: This creates unique UGC content
- Have a better product page UX than your competitors. Emphasize key information in the design and ensure that all information required to decide on a purchase is on the page.
-
RE: Sitemap use for very large forum-based community site
Agreed, you'll likely want to go with option #2. Dynamic sitemaps are a must when you're dealing with large sites like this; we recommend them for all of our clients with larger sites. If your forum content is important for search, then it's definitely important to include, as that content likely changes often and might sit naturally deeper in the architecture.
In general, I'd think of sitemaps from a discoverability perspective rather than a ranking one. The primary goal is to give Googlebot an avenue to crawl your site's content regardless of internal linking structure.
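If it helps to picture option #2, here's a minimal sketch of a dynamically generated sitemap, assuming your forum can hand back a list of thread URLs and last-updated dates; get_recent_threads() and the example URLs are placeholders for your own data source:

```python
# Minimal sketch of a dynamically generated XML sitemap for forum threads.
# get_recent_threads() is a placeholder for your forum's own data source.
from datetime import date
from xml.sax.saxutils import escape

def get_recent_threads():
    # Stand-in data; in practice this would query the forum database.
    return [
        {"url": "https://www.example.com/forum/thread-1/", "updated": date(2020, 1, 15)},
        {"url": "https://www.example.com/forum/thread-2/", "updated": date(2020, 1, 14)},
    ]

def build_sitemap(threads):
    entries = []
    for t in threads:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(t['url'])}</loc>\n"
            f"    <lastmod>{t['updated'].isoformat()}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

print(build_sitemap(get_recent_threads()))
```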
-
RE: Rel=Canonical Vs. 301 for blog articles
If the current plan is to create new product sites, then a 301 redirect is probably the way to go. You're right that canonical tags can technically be ignored, and 301 redirects will send stronger consolidation signals. The biggest con is that the content can't exist in two places. So if the parent sites would benefit from having that content as well, then canonical tags are worth looking into.
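As an illustration of the difference in signals (not part of the original answer), here's a quick way to see which one a URL is actually sending; the URL below is a placeholder:

```python
# Rough check: does a URL send a 301 (server-level signal) or a rel=canonical hint (page-level)?
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/blog/article/"  # placeholder

resp = requests.get(url, allow_redirects=False, timeout=10)
if resp.status_code in (301, 308):
    print("Permanent redirect to:", resp.headers.get("Location"))
else:
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    print("Canonical tag:", canonical["href"] if canonical else "none found")
```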
-
RE: Re-direct Irrelevant (high ranking) blog articles?
I agree with you that I would probably leave them up. Redirecting those posts would likely sacrifice your ranking positions as you mentioned.
Your best bet might just be to create a new Google Analytics segment that excludes the entire blog, or at least those two posts. For your core reporting, you could just use that segment. That should allow you to get the traffic but report your core KPIs on more relevant pages.
-
RE: Using one domain for email and another domain for your website, but redirects...
Nope! Your email domain shouldn't have any impact on your site's SEO.
-
RE: Set Up htaccess File
Hey Bob!
Would be happy to take a look at the project for you. You can email me at chris.long@gofishdigital.com
-
RE: Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
It's tough to tell without seeing the help & marketing pages. How similar are they? Generally, these are different, as marketing pages tend to talk more about user benefits while help pages are more tutorial-based. As long as they aren't 1:1 matches (or very close), it's likely they can both exist.
In the rare event that the help center pages are an exact duplicate of existing marketing site pages, then in theory you should be able to 404/redirect those pages and not worry about them. If there's already another version of the page on the site, then there's no need to manage it in two places.
Feel free to reach out if you have any questions!
Best posts made by GFD_Chris
-
RE: Use Internal Search pages as Landing Pages?
That's a great question. If you have pages that are generating revenue and ranking really well, I'd be hesitant to remove them from the index. Like the article mentions, Wayfair generates a huge amount of search traffic through these auto-generated internal search pages. If these are considered high quality and are ranking well in Google, I would probably recommend leaving them alone.
If you want to trim some of these down, I'd use Google Analytics to find ones that aren't generating organic traffic/revenue. You could consider adding the "noindex" tag to those.
In general it is best practice to remove internal search pages from Google as they can contribute to a large amount of index bloat. However, I wouldn't reduce any that you see are performing well.
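Purely for illustration, a noindex is just a meta robots tag, so you can spot-check whether your internal search pages already carry one; the URLs below are placeholders:

```python
# Hypothetical spot-check: which internal search URLs already carry a meta robots noindex?
import requests
from bs4 import BeautifulSoup

search_urls = [
    "https://www.example.com/search?q=blue+widgets",  # placeholder URLs
    "https://www.example.com/search?q=red+widgets",
]

for url in search_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    content = robots.get("content", "") if robots else ""
    status = "noindex" if "noindex" in content.lower() else "indexable"
    print(f"{status:10} {url}")
```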
I'd be happy to take a look if you have any other questions!
-
RE: Re-direct Irrelevant (high ranking) blog articles?
I agree with you that I would probably leave them up. Redirecting those posts would likely sacrifice your ranking positions as you mentioned.
Your best bet might just be to create a new Google Analytics segment that excludes the entire blog, or at least those two posts. For your core reporting, you could just use that segment. That should allow you to get the traffic but report your core KPIs on more relevant pages.
-
RE: Does data-bind hurt SEO?
Basically, those tools aren't reading the rendered DOM, but Google can, which is why it can see your site's title tags, H1s, etc. Your site uses client-side rendering, which Google is able to process. Notice how, if you go to a given page and click "View Source", none of the page's content appears.
While it appears Google is reading the content in the pages I looked at, I would definitely look into this more to see if Google is able to crawl/index the content on all of your site's pages. Client side rendering is less reliable than SSR so there might be instances where Google isn't reading sections of your content.
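To make the "View Source" point concrete, here's a rough sketch that checks whether a given piece of content appears in the raw, pre-JavaScript HTML; the URL and the text to look for are placeholders:

```python
# Quick check: is this text in the raw (pre-JavaScript) HTML that "View Source" shows?
import requests

url = "https://www.example.com/some-page/"          # placeholder
expected_text = "Your important H1 or body copy"    # placeholder

raw_html = requests.get(url, timeout=10).text
if expected_text in raw_html:
    print("Found in raw HTML - not dependent on client-side rendering.")
else:
    print("Not in raw HTML - Google must render JavaScript to see it.")
```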
-
RE: Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
Hey!
Is there a particular reason that you want the help articles out of the index? That content may be useful to current users who are querying Google for how to use your solution. It also could be useful to potential users who are looking for specific functionalities. We generally recommend that our SaaS clients index this content.
In terms of a time investment, it's probably still important, especially if your existing users are interacting with the documentation. Personally, to start I'd prioritize any items that can be scaled. I might start with:
- Removing the global "noindex" tag
- Fixing mixed content signals
- Removing global 4xx/3xx
- Fixing individual 4xx on your highest traffic pages
Try implementing anything globally first and then work toward page-level fixes.
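As a rough sketch of what a scaled check could look like (the help-center URLs are placeholders), you could sample pages and flag noindex tags, mixed content, and non-200 status codes in one pass:

```python
# Rough audit sketch: flag noindex tags, http:// resources on https pages, and non-200 status codes.
# The URL list is a placeholder for a sample of your help-center pages.
import requests
from bs4 import BeautifulSoup

pages = ["https://help.example.com/article-1/", "https://help.example.com/article-2/"]

for url in pages:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = robots and "noindex" in robots.get("content", "").lower()

    # Mixed content: http:// assets referenced from an https page.
    mixed = [
        tag.get("src") or tag.get("href")
        for tag in soup.find_all(["img", "script", "link"])
        if (tag.get("src") or tag.get("href") or "").startswith("http://")
    ]

    print(f"{url}: status={resp.status_code}, noindex={bool(noindex)}, mixed_content={len(mixed)}")
```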
-
RE: Links from a penalised site.
Hey Tim!
Definitely an interesting question. If the links are tagged as "nofollow", then you should be OK in general. Even if they weren't, there's a good chance that Google is simply discounting the value of the links.
If you're really worried about it, you could test adding the domain to your disavow file and waiting to see if there are any positive shifts in rankings during this time. In our experience with disavowing domains, if a site doesn't pass the eye test of "this adds value to my site", then it's generally safe to disavow.
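If you do run that test, the disavow file itself is just plain text ("#" comments, "domain:" lines, or full URLs). Here's a small, hypothetical helper for appending a domain without duplicating entries; example.com is a placeholder:

```python
# Small helper sketch: add a domain to a Google disavow file without duplicating entries.
# The disavow format is plain text: "#" comments, "domain:example.com" lines, or full URLs.
from pathlib import Path

def add_domain_to_disavow(domain, path="disavow.txt"):
    file = Path(path)
    lines = file.read_text().splitlines() if file.exists() else []
    entry = f"domain:{domain}"
    if entry not in lines:
        lines.append("# added after review of low-quality links")
        lines.append(entry)
        file.write_text("\n".join(lines) + "\n")
    return lines

add_domain_to_disavow("example.com")  # placeholder domain
```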
If you post the link I'd be happy to take a look!
-
RE: Does data-bind hurt SEO?
No problem! A good golden rule of JavaScript SEO is to always SSR where possible. Let me know if you have any other questions!
-
RE: Phasing in new website on www2 domain - 301 plan
So if I understand it correctly, you're going to be incrementally adding new pages on the www2 subdomain while content still exists on the www subdomain. This will be done gradually until all of the www content can be 301 redirected to www2?
If that's the case, there are a few other things that would be helpful to know:
- What's the expected timeline to get all of the new www2 pages live?
- How similar will the www2 content be to the www content?
- Is the TLD staying the same and only the subdomain changing?
Ideally, everything would be added to the production site and redirected all at once.
However, if that isn't an option, I'd probably try to implement the redirects from www to www2 incrementally as well. Otherwise, Google will be able to crawl/index content from both the www and www2 subdomains, leading to duplicate content issues. I'd try to keep the website architecture fairly consistent between the two to preserve the UX/equity signals between the two subdomains.
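As a rough illustration of that duplicate content risk (the paths below are placeholders), you could periodically check whether the same path is returning a 200 on both subdomains during the transition:

```python
# Hypothetical check during a staggered migration: is the same path live (200) on both
# the old www host and the new www2 host at once? If so, it's a duplicate content risk
# unless the www version 301s to www2.
import requests

paths = ["/", "/products/", "/about/"]  # placeholder paths

for path in paths:
    old = requests.get(f"https://www.example.com{path}", allow_redirects=False, timeout=10)
    new = requests.get(f"https://www2.example.com{path}", allow_redirects=False, timeout=10)
    if old.status_code == 200 and new.status_code == 200:
        print(f"DUPLICATE RISK  {path}: both versions return 200")
    elif old.status_code in (301, 308):
        print(f"REDIRECTED      {path}: www -> {old.headers.get('Location')}")
    else:
        print(f"OK/OTHER        {path}: www={old.status_code}, www2={new.status_code}")
```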
It's tough to give insights without more information, so I'd be happy to chat more about this!
-
RE: Huge number of crawl anomalies and 404s - non-existent urls
Unlikely. As long as they're returning 404 errors, you should be OK. Maybe update your disavow file and you should be good to go!
-
RE: My last site crawl shows over 700 404 errors all with void(0 added to the ends of my posts/pages.
In the HTML of your pages, there's a link with "javascript:void(0)" in the href, and it appears that Google is getting into those. If possible, remove that link or take it out of an <a> element. Otherwise, you should be OK; those pages should 404.
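If it's useful, here's a rough sketch for finding those pseudo-links in your rendered HTML; the URL is a placeholder:

```python
# Sketch: find anchor tags whose href is a javascript:void(0) pseudo-link.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-post/"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    if a["href"].strip().lower().startswith("javascript:void"):
        print("Found pseudo-link:", a.get_text(strip=True) or "(no anchor text)")
```

Swapping that <a> for a <button> usually keeps the behavior without leaving a crawlable-looking href behind.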
-
RE: "5XX (Server Error)" - How can I fix this?
Good question! I've seen this happen with a few clients.
Here is my process for reviewing 5xx errors:
- Manually check a handful of reported 5xx URLs and note their status codes. Upon manual inspection, you'll often find that these pages return different status codes (200, 4xx).
- Perform a crawl using Screaming Frog and watch the "Status Codes" report. Watch to see if URLs go from getting crawled as 200 to 5xx status codes.
- If this is the case, the issue might be that your hosting can't handle the requests getting sent to your site. If not, and your URLs are getting reported as 200, it might have been a temporary status the crawler found.
- If possible, check your log files to see if Googlebot is returning a large number of 5xx errors.
Overall, if you only find a few 5xx errors in Search Console and your log files but don't find them when you crawl the site, you're probably OK. However, if a crawl consistently reveals them or you see a large number in your log files, then it's a high-priority item to flag to your developers. You may need to consider upgrading your hosting or rolling back any recent development changes that were made. Definitely lean on your developers' insights here.
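For the log file step, here's a rough sketch of counting 5xx responses served to Googlebot in an Apache/Nginx combined-format access log; the log path and format are assumptions about your setup:

```python
# Sketch: count 5xx responses served to Googlebot in a combined-format access log.
# The log path and format are assumptions; adjust to your hosting setup.
import re
from collections import Counter

line_re = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')
errors = Counter()

with open("access.log") as log:  # placeholder path
    for line in log:
        if "Googlebot" not in line:
            continue
        match = line_re.search(line)
        if match and match.group("status").startswith("5"):
            errors[match.group("path")] += 1

for path, count in errors.most_common(10):
    print(f"{count:5}  {path}")
```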