Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Why add a product ID in the URL?
-
Hello!
shop.com/en/2628-buy-key-origin-the-sims-4-seasons/
Why do people use a product ID in the link? Is there any advantage, like better ranking or something else?
-
Hi kh-priyam,
Many people have a product ID in the link because many CMSs automatically create this ID in the URL; unless you install a plugin for friendly URLs, the ID stays there.
There is no ranking advantage to having a product or category ID. In my opinion it doesn't matter whether you have an ID as long as the rest of the URL is friendly, but some people believe URLs without IDs are better, so I recommend leaving the product ID out of your URLs.
Greetings
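A sketch of why the ID usually ends up there: a typical CMS route keeps the numeric ID in the path so the record can be looked up directly and two products with the same name never collide. The helper below is a hypothetical illustration of that pattern, not any particular CMS's code:

```python
import re

def product_url(product_id: int, name: str, lang: str = "en") -> str:
    """Build a friendly URL that keeps the product ID for uniqueness."""
    # Lowercase the name and collapse everything that isn't a letter
    # or digit into single hyphens to get a readable slug.
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    # Prefixing the numeric ID guarantees the path is unique even when
    # two products share a name, and lets the shop fetch the record by
    # ID instead of searching on the slug text.
    return f"/{lang}/{product_id}-{slug}/"
```

With the product from the question, `product_url(2628, "Buy Key Origin The Sims 4 Seasons")` produces `/en/2628-buy-key-origin-the-sims-4-seasons/`, i.e. the exact URL shape being asked about.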
Related Questions
-
What is the best strategy for SEO on discontinued products on e-commerce sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over three years. We want to clean up the catalog and remove all the listings older than two years that do not generate any sales. What is the best practice for removing thousands of listings from an e-commerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | JimJ
-
Do we lose backlinks and Domain Authority when we change our domain name?
We have one well-performing domain (4M monthly visitors) and now want to change the domain name (a brand change, like SEOmoz to Moz). I have general knowledge of domain-change precautions such as 301 redirection. My concern is about backlinks and DA: how can I prevent any loss from an SEO point of view (backlink loss)? Do I need to change all the backlinks at the source, or is redirection enough to get all the referral traffic from those backlinks?
White Hat / Black Hat SEO | HuptechWebseo
-
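To make the 301 approach concrete: a single-hop, path-preserving redirect from every old URL to its new-domain equivalent is the standard migration pattern, and the external backlinks themselves do not need to be edited at the source. A minimal sketch of such a redirect map, with placeholder host names:

```python
from urllib.parse import urlsplit, urlunsplit

OLD_HOST = "www.seomoz.org"  # placeholder: old brand domain
NEW_HOST = "moz.com"         # placeholder: new brand domain

def redirect_target(old_url: str) -> str:
    """Map an old-domain URL to its new-domain equivalent for a 301.

    Keeping the path and query identical means every backlink pointing
    at the old URL passes through exactly one redirect hop.
    """
    parts = urlsplit(old_url)
    if parts.hostname != OLD_HOST:
        return old_url  # not our old domain; leave untouched
    return urlunsplit((parts.scheme, NEW_HOST, parts.path,
                       parts.query, parts.fragment))
```

A server would issue `301 Moved Permanently` with `redirect_target(request_url)` as the `Location` header; keeping it to one hop avoids diluting the redirected link equity through chains.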
Robots.txt file in Shopify - Collection and Product Page Crawling Issue
Hi, I am working on one big eCommerce store with more than 1,000 products. We just moved platforms from WP to Shopify and are getting a noindex issue. When I checked robots.txt I found the code below, which is very confusing to me. I don't understand the meaning of these rules:
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
As I understand it, my robots.txt disallows search engines from crawling and indexing all my product pages (collection/*+*). Is this the rule that is affecting the indexing of product pages? Please explain how this robots.txt works in Shopify, and once my pages are crawled and indexed by Google, what is the use of Disallow? Thanks.
White Hat / Black Hat SEO | HuptechWebseo
-
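For reference, the quoted rules can be checked mechanically with Python's standard-library robots.txt parser. In robots.txt, "+" is a literal character (only "*" and "$" act as wildcards for Google), and %2B/%2b are simply its percent-encoded spellings, so the three variants cover one and the same prefix. As a prefix rule, `Disallow: /collections/+` only blocks collection URLs whose path continues with a "+"; it does not touch /products/ pages. The store domain below is a placeholder:

```python
from urllib.robotparser import RobotFileParser

# The exact Disallow lines from the question, with a user-agent line
# so the parser can attach the rules to an entry.
robots_lines = """\
User-agent: *
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_lines)

base = "https://shop.example.com"  # placeholder store domain

# Collection URLs whose path continues with "+" are blocked...
can_plus = rp.can_fetch("*", base + "/collections/+summer-sale")    # False
# ...but plain collection pages and product pages are not affected.
can_all = rp.can_fetch("*", base + "/collections/all")              # True
can_product = rp.can_fetch("*", base + "/products/sample-product")  # True
```

As far as I know, Shopify joins tag filters in collection URLs with "+", so rules like these exist to keep those faceted, duplicate-content URLs out of the index; a store-wide noindex on product pages would have to come from somewhere else, such as meta robots tags, not from these lines.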
Are IDs in URLs good for SEO? Will SEO submission sites allow such URL submissions?
Example URL: http://public.beta.travelyaari.com/vrl-travels-13555-online
This is our site's beta URL; we are going to implement it on our live site, where it will look like "https://www.travelyaari.com/vrl-travels-13555-online". We have added the keywords in the URL ("VRL Travels"). The problem is that there are multiple VRL Travels operators, so we made the URL unique with a unique ID, "13555". That way we know exactly which VRL Travels it is, and it also solves URL duplication. From a user/SEO point of view, the URL still has readable text/keywords: "vrl travels online". Can some Moz experts tell me whether this will affect SEO performance in any manner? Will SEO submission sites accept this URL? Meanwhile, I tried submitting this URL to Reddit etc., and it got accepted.
White Hat / Black Hat SEO | RobinJA
-
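As an illustration of how such an ID disambiguates listings, a route handler can recover the unique record ID from the readable slug. The pattern below is a guess at the route shape based only on the example URL, not Travelyaari's actual code:

```python
import re

# Hypothetical route pattern "<name-slug>-<numeric id>-online",
# mirroring the example URL in the question.
LISTING_PATTERN = re.compile(r"^/(?P<slug>.+)-(?P<id>\d+)-online$")

def parse_listing_path(path: str):
    """Split a listing path into its text slug and unique numeric ID.

    The ID is what actually identifies the record; the slug exists for
    readers and keywords, so several operators named "VRL Travels" can
    coexist without duplicate URLs.
    """
    m = LISTING_PATTERN.match(path)
    if m is None:
        return None
    return m.group("slug"), int(m.group("id"))
```

For the example URL, `parse_listing_path("/vrl-travels-13555-online")` yields `("vrl-travels", 13555)`: the keywords stay visible in the path while lookup happens by ID.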
How to improve PA of Shortened URLs
Why do some shortened URLs (bit.ly, ow.ly, goo.gl) have a PA above 40? I have tried everything to improve the PA of my shortened URLs, such as Facebook shares, retweets, and backlinks to them, but I still have a PA of 1. Check out this URL, https://moz.com/blog/state-of-links, in Moz OSE and you will see many 301 links from shorteners. I asked many SEO experts about this but no one answered, so today I subscribed to Moz Pro for the solution. Please give me the answer.
White Hat / Black Hat SEO | igains
-
Vanity URLs Canonicalization
Hi, so right now my vanity URLs have a lot more links than my regular homepage. They 301 redirect to the homepage, but I'm thinking of canonicalizing the homepage, as well as the mobile page, to the vanity URL. Currently some of my sites show a vanity URL in the SERPs and some do not. This is my way of nudging Google to list them all as vanity URLs, but I thought I would get everyone's opinion first. Thanks!
White Hat / Black Hat SEO | mattdinbrooklyn
-
Duplicate keywords in URL?
Is there such a thing as keyword-stuffing URLs? For example, the domain turtlesforsale.com has a directory called turtles-for-sale that houses all the pages on the site, so every page starts with turtlesforsale.com/turtles-for-sale/. Good or bad idea? The owner is hoping to capitalize on the keyword "turtles for sale" being in the URL twice and to rank better for that reason.
White Hat / Black Hat SEO | CFSSEO
-
Forcing Google to Crawl a Backlink URL
I was surprised that I couldn't find much info on this topic, considering that Googlebot must crawl a backlink URL in order to process a disavow request (i.e. Penguin recovery and reconsideration requests). My trouble is that we recently received a great backlink from a buried page on a .gov domain, and the page has yet to be crawled after four months. What is the best way to nudge Googlebot into crawling the URL and discovering our link?
White Hat / Black Hat SEO | Choice