Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
SEO for Online Auto Parts Store
-
I'm currently doing an audit for an online auto parts store and am having a hard time wrapping my head around their duplicate content issue. The current setup is this:
- The catalogue starts with the user selecting the year of their vehicle
- They then choose the brand (so each of the year pages lists every single brand of car, creating duplicate content)
- They then choose their model of car and then the engine
- This then takes them to a page listing every type/category of product they sell (so every model/engine combination has the exact same content!). This amounts to literally thousands of pages being seen as duplicates
It's a giant mess. Is using rel=canonical the best thing to do? I'm having a hard time seeing a logical way of structuring the site to avoid this issue.
Anyone have any ideas?
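For illustration - the URLs here are made up, not the client's actual structure - the rel=canonical approach I'm weighing would put something like this on each of the duplicate deep pages:

```html
<!-- Hypothetical example: one of the thousands of identical listing pages
     (e.g. /2014/honda/civic/2-0l/) pointing search engines at the single
     version of the parts-category listing that should be indexed. -->
<link rel="canonical" href="https://example-parts.com/parts-categories/" />
```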
-
First, is this content dynamic? Depending on how the user progresses through these choices, a search engine may not even see this information.
Second, as long as those aren't the pages you are trying to rank, I can't see them having that big of an impact (if any) on your overall SEO. There is a difference between pages that happen to share the same content and pages that search engines treat as duplicate content.
Using rel=canonical would be another way to save your skin, along with making sure that the selector links themselves are nofollowed (as long as there is some other way to get to the deeper pages).
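For example - made-up URL, just to show the idea - each selector choice could be marked up along these lines, while an HTML sitemap or normal category navigation still links to the deep pages so they stay reachable:

```html
<!-- Hypothetical selector link: hint to crawlers not to follow it. -->
<a href="/2014/honda/civic/2-0l/" rel="nofollow">2.0L 4-cylinder</a>
```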
Personally, for the sake of SEO and user experience (which is far more important), I'd talk to the company about creating a dynamic selection wizard - it could be a popup - that takes the user to the right HTML page once they've made all their choices.
The HTML pages would be open, crawlable, structured, and sitemapped, and the wizard would be a dynamic widget loaded in from a page that is disallowed in robots.txt - helping both the human and spider experience. Just my two cents!
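Roughly - the paths below are made up, purely to illustrate - each static parts page would stay crawlable and self-canonicalised, while the wizard widget loads from a path blocked in robots.txt (e.g. Disallow: /wizard/):

```html
<!-- Hypothetical static category page: open, crawlable, listed in the XML sitemap. -->
<link rel="canonical" href="https://example-parts.com/brake-pads/" />

<!-- The year/make/model/engine wizard is pulled in from a path that robots.txt
     disallows, so crawlers index the static pages rather than the selector itself. -->
<div id="vehicle-selector"></div>
<script src="/wizard/vehicle-selector.js" defer></script>
```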
Best of luck!
Related Questions
-
Is the NitroPack plugin Black Hat SEO for speed optimization?
We are getting ready to launch our redesigned WP site and were considering using the NitroPack performance optimization plugin, until some of our developers started sounding the alarm. Here is what some in the SEO community are saying about the tool:

The rendering of a website running the NitroPack plugin in page-metric test tools is based entirely on the inline CSS and JS in the HTML file, without taking into account the numerous additional CSS and JS files loaded on the page. As a result, the final metric score does not include evaluation and parsing of those CSS and JavaScript files. In other words, a lot of websites using NitroPack never become interactive in the page-metric tools, because all interactivity is derived from JavaScript and CSS execution - so their "Time to Interactive" and "Speed Index" should effectively be reported as infinite.

Would Google consider this Black Hat SEO and start serving manual actions to sites using NitroPack? We are not ready to lose our hard-earned Google ranking. Please let me know your thoughts on the plugin. Is it simply JS and CSS "lazy loading" that finally offers a real-world implementation that yields fantastic results, or is it truly a Black Hat attempt at cheating Google PageSpeed Insights numbers? Thank you!
-
How Do SSL Certificates Affect SEO?
Does an SSL certificate really affect SEO? How? Why? According to my hosting provider (ganje.host), "https" improves SEO! As far as I know, it decreases speed. So how does it improve SEO when my site is slower than before?
-
Tags - Good or bad for SEO
We are getting Moz errors for duplicate content because tag pages share the same blog posts. Is there any way to fix this? Are these errors bad for SEO, or can I simply ignore them? We are also getting Moz errors for missing descriptions on tag pages. I am unsure how to fix these, as our WordPress site does not actually give us a page for each tag where we can put in a description. I have heard that having tags can be good for SEO. (We don't mind having several links that show up when searching for us on Google...) As far as the SEO goes, I am not sure what to do. Does anyone know the best strategy?
-
Effect of ™ and ® in titles for SEO
I am looking at adding the trademark and registered trademark symbols to some of my titles. I think this might help with click-through rate. From what I have found, this shouldn't have an effect on SEO unless it makes the title too long. Is this correct? Stephen
-
Will shortening the amount of text on my pages affect its SEO performance?
My website has several pages with a lot of text that gets pretty boring. I'm looking at shortening the copy on each page and then, within the updated, shortened copy, integrating more target keywords naturally. Will shortening the current copy have a negative effect on my SEO performance?
-
Do Blog Tags affect SEO at all anymore?
We're trying to standardize the use of tags on our site amongst writers/editors, and I'm trying to come up with a list of tags they can choose from to tag posts with - and to tell them to use no more than 10 (absolute maximum) per post. We are also in the process of migrating to a new CMS, and have 8 defined categories that will all have their own landing page within our "News" section. TLDR: Do blog tags have any impact on SEO anymore? Are they solely meant to help users find articles related to popular topics, or does creating a tag for a popular topic help to improve organic visibility? Full Question: With the tag standardization, I want to make sure we're creating the most useful and effective tags, and the UX/SEO sides of my brain are conflicted. To my understanding, creating a tag about a high-volume topic in an industry helps establish the website's relevance for that topic with Google and other search engines, but the tag feed page (ex: http://freshome.com/tag/home-protection/) isn't really meant for organic search visibility. So my other question is: is it worth it to noindex the tag pages in the robots.txt? Will that affect any benefit to increased relevance for Google (if there is any)? I'm interested to hear others' thoughts and suggestions. Thanks in advance!
-
SEO audit on a beta site
Hi there, Is there much point conducting an SEO site audit on a site that has not yet launched and is protected behind a login? Presumably none of the usual SEO tools (Moz, Screaming Frog, etc.) can crawl the site because it is all locked behind a login. Would it be better to launch it and then do a site audit? Thanks
-
Best SEO structure for blog
What is the best SEO page/link structure for a blog with, say, 100 posts that grows at a rate of 4 per month? Each post is 500+ words with charts/graphics; they're not simple one-paragraph postings. Rather than use a CMS, I have a hand-crafted HTML/CSS blog (for tighter integration with the parent site, some dynamic data effects, and in general to have total control). I have a sidebar with headlines from all prior posts, and my blog home page is a one-line summary of each article. I feel that after 100 articles the sidebar and home page have too many links on them. What is the optimal way to split them up? They all cover the same niche topic that my site is about. I thought of having the sidebar and home page show only the most recent 25 posts, and then creating an archive directory for older posts. But categorizing by time doesn't really help someone looking for a specific topic. I could tag each entry with 2-3 keywords and then make the sidebar a sorted list of tags. Clicking on a tag would then show an intermediate index of all articles that have that tag, and then you could click on an article title to read the whole article. Or is there some other strategy that is optimal for SEO and the indexing robots? Is it bad to have a blog that is too hierarchical (where articles are three levels down from the root domain) or too flat (if there are hundreds of entries)? Thanks for any thoughts or pointers.