Product Documentation Causing 23-40K issues
-
One of my biggest hurdles at my company is our Product Documentation library, which houses thousands of pages of publicly accessible, indexed content covering old and new versions of our product. Every time a product name changes, the URL changes and the old one 404s, so I typically see hundreds of 404s every few months from this site. It's housed off our main domain. We have 23,000+ duplicate pages, 40,000 missing meta descriptions, and 38,000 of these issues due to this library. It isn't built the same way as our main content, with proper page titles and meta descriptions, so everything is defaulted and duplicated. I'm trying to make the case that this is a problem, especially as we migrate our site to a new CMS next year.
Does anyone have suggestions for dealing with this in the short term and the long term? Is it worth asking the owners of that section of content to write page titles and meta descriptions for 40,000 pieces of content? They don't see the value of SEO or the problems these issues can cause.
It needs to be publicly accessible, but it's not highly ranked content; it's really for customers who want to know more about the product. I worry, though, that it's hurting other parts of our site, given the sheer volume of duplicate content, meta description, and page title issues.
-
Hi there,
As far as your platform goes, product name changes simply shouldn't be causing 404s, and this can be avoided relatively easily by including the product ID at the end of the URL. The name can then change, but the product ID remains the identifier used to load the product on the page.
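To illustrate the pattern, here's a rough Python sketch using Flask with a made-up product lookup (the route, IDs and field names are all assumptions, not anything specific to your platform or Vignette):

from flask import Flask, abort, redirect, url_for

app = Flask(__name__)

# Hypothetical store: product_id -> current name-based slug.
PRODUCTS = {
    "12345": {"slug": "widget-pro", "name": "Widget Pro"},
}

@app.route("/docs/<slug>/<product_id>")
def doc_page(slug, product_id):
    product = PRODUCTS.get(product_id)
    if product is None:
        abort(404)
    # The ID is the stable identifier; if the name (slug) has changed,
    # 301 to the current URL instead of letting the old one 404.
    if slug != product["slug"]:
        return redirect(url_for("doc_page", slug=product["slug"], product_id=product_id), code=301)
    return f"Documentation for {product['name']}"

The slug is only cosmetic here: renaming a product just changes what the canonical URL looks like, and every old URL keeps resolving via the ID.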
With regards to your 40K pages without meta titles or descriptions, it's going to be almost impossible to fix that manually. It sounds as though you need to establish a business case, which you could do by fixing a few hundred of them (starting with the ones that get the most traffic) and seeing whether it leads to any improvement. It might not have much impact, since it sounds as though these pages aren't doing well in search as it is, although I agree there's a chance that so many poorly optimised pages are hurting your overall rankings.
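If you do run that pilot, the titles and descriptions for a library like this can usually be templated rather than written by hand. Here's a minimal sketch in Python; the pages.csv export, its column names and the "Acme Docs" branding are assumptions to adapt to whatever your CMS can actually dump:

import csv

def build_title(row):
    # e.g. "Widget Pro 4.2 - Installation Guide | Acme Docs"
    return f"{row['product']} {row['version']} - {row['topic']} | Acme Docs"

def build_description(row):
    text = (f"Documentation for {row['topic'].lower()} in {row['product']} {row['version']}: "
            "setup, configuration and troubleshooting.")
    return text[:160]  # keep descriptions within a typical snippet length

with open("pages.csv", newline="", encoding="utf-8") as f:
    # Sort by traffic so the first few hundred rows are the pages worth fixing first.
    rows = sorted(csv.DictReader(f), key=lambda r: int(r["monthly_visits"]), reverse=True)

with open("meta_updates.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["url", "title", "meta_description"])
    for row in rows[:500]:
        writer.writerow([row["url"], build_title(row), build_description(row)])

Even if the content owners won't write unique copy, getting them to expose product, version and topic fields per page is usually enough to kill the duplicate-title problem.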
The challenge you face sounds more political/strategic than technical, though. Either SEO has actual or potential value to your business or it doesn't. If the content producers aren't versed in SEO and aren't focused on maintaining it or producing optimised pages and content, then you probably have an uphill battle ahead of you to get them to focus on it.
Good luck,
George
-
Hi Caitlin,
Unfortunately, the site is structured so that any time a product version or name changes, a new path is created in our CMS (an old system called Vignette), a new URL is generated, and the old one breaks. Because hundreds of these happen with each new product release, I get resistance from the web developers on my redirect requests: one reason is that they'd have to set each redirect up manually, the other is concern about site performance. I had to push hard just to get the trailing-slash vs. non-trailing-slash versions of the higher-ranking pages on our site redirected, and that was nowhere near as many pages as this library.
I know my question is pretty broad. I'm just curious whether anyone out there has run into similar issues and how they made the case that it needed to be fixed. Or, if redirects are the only answer, will that many redirects negatively affect performance? Since we're moving to a new CMS where hopefully this won't be as big an issue, is it best to just take the hit now? As we migrate, all those links will eventually break anyway, and making the case to redirect 40,000 URLs might be even harder.
Because these are low-ranking pages, should I suggest moving this library off the website's root domain?
-
Hello!
Unfortunately it is difficult to give you a concrete answer without an understanding of your CMS and website structure. However, one thing did stand out to me: you mentioned above that you receive hundreds of 404s every few months. Is there any reason why you are not implementing 301 redirects for these? With a 301 redirect in place, a user who tries to navigate to a page that would otherwise 404 is automatically sent to a closely related page instead.
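If the objection is that hundreds of redirects are too much manual work, they can usually be generated in bulk from a simple old-URL/new-URL export and loaded as a map at the web-server level. Here's a rough sketch in Python (the redirects.csv file and the output format are assumptions; adapt them to whatever your server or new CMS actually consumes):

import csv
from urllib.parse import urlparse

def load_redirects(path="redirects.csv"):
    """Read old_url,new_url pairs into an {old_path: new_url} map."""
    mapping = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            mapping[urlparse(row["old_url"]).path] = row["new_url"]
    return mapping

def write_redirect_rules(mapping, out_path="redirect_map.txt"):
    """Write one '301 old_path new_url' rule per line in a generic format."""
    with open(out_path, "w", encoding="utf-8") as f:
        for old_path, new_url in sorted(mapping.items()):
            f.write(f"301 {old_path} {new_url}\n")

if __name__ == "__main__":
    write_redirect_rules(load_redirects())

Because the server can treat a map like this as a single lookup per request rather than evaluating hundreds of individual rewrite rules, a large redirect file shouldn't normally be the performance problem the developers are worried about.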
^Caitlin