URL Parameters to Ignore
-
Hi Mozers,
We have a glossary of terms made up of a main page that lists out ALL of the terms, and then individual pages per letter of the alphabet that limit the results to that specific letter. These pages look like this:
https://www.XXXX.XXX/publications/dictionaries/XXX-terms?expand=A
https://www.XXXX.XXX/publications/dictionaries/XXX-terms?expand=B
https://www.XXXX.XXX/publications/dictionaries/XXX-terms?expand=C
https://www.XXXX.XXX/publications/dictionaries/XXX-terms?expand=D
etc.
If I'd like Google to remove all of these "expand=" pages from the index, such that only the main page is indexed, what is the exact parameter that I should ask Google to ignore in Search Console?
"expand=" ?
Just want to make sure! Thanks for the help!!!
-
I agree with what is said above; in addition, you could also add the ignore parameter in GSC, since the parameter is basically adjusting the page content. It's a bit unclear how much information that really sends to the crawlers, but it probably can't hurt.
-
Hi!
What billbill369 said is correct, but it will only prevent Google from crawling those pages.
My suggestion is to use a canonical tag on every URL that has a parameter, pointing to the correct URL (the one without parameters).
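For example, a minimal sketch of what that could look like in the <head> of each parameterized page, using the glossary URL from the question as a stand-in for the real main page:

```html
<!-- Sketch: placed on https://www.XXXX.XXX/publications/dictionaries/XXX-terms?expand=A (and B, C, D...) -->
<link rel="canonical" href="https://www.XXXX.XXX/publications/dictionaries/XXX-terms" />
```

That way the lettered pages can still be crawled, but the signals should consolidate on the main page.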
For further reading:
- SEO Best Practices for Canonical URLs + the Rel=Canonical Tag - Whiteboard Friday
- Consolidate duplicate URLs - Google Search Console Help
Hope it helps.
Best luck.
GR. -
This should work; put it in your robots.txt:
User-agent: *
Disallow: /*?expand=
Related Questions
-
Doubts about the technical URL structure
Hello, first we had this structure:
Category: https://www.stoneart-design.de/armaturen/
Subcategory: https://www.stoneart-design.de/armaturen/waschtischarmaturen/
Often I see this: https://www.xxxxxxxx.de/badewelt/badmoebel/
But I have heard it has something to do with layers so Google can index it better. Is that true? Is "Badewelt" an extra layer? So I thought maybe we could change this to:
https://www.stoneart-design.de/badewelt/armaturen/
https://www.stoneart-design.de/badewelt/armaturen/waschtischarmaturen/
And after seeing that, I thought we could also do it like this, so the keyword is on the left, and replace "badewelt" with just a "c" placed at the end:
https://www.stoneart-design.de/armaturen/c/
https://www.stoneart-design.de/armaturen/waschtischarmaturen/c/
I don't understand anymore which one is best; to me it seems to be the last one. The reason was this: it looks like keyword stuffing to me (see attached picture), and Google did not index the same URL at the same time, so I thought we could solve it with this. Also, we could use the word "whirlpools" only in the main category, and in the subcategories only the type, without "whirlpools" in the text. Thanks, regards, Marcel
Technical SEO | HolgerL
-
Shortening URLs
Technical SEO | ATP
Hello again Mozzers, I am debating what could be a fairly drastic change to the company website and I would appreciate your thoughts. The URL structure is currently as follows:
Product pages: www.url.co.uk/product.html
Category pages: www.url.co.uk/products/category/subcategory.html
I am debating removing the /products/ section, as I feel it doesn't really add much and lengthens the URL with a pointless word. This does mean, however, redirecting about 50-60 pages on the website. Is this worth it? Would it do more damage than good? Am I just being a bit OCD, and it won't really have an impact? As always, thanks for the input.
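If it helps, here is a minimal sketch of how those 50-60 redirects could be collapsed into a single rule in an Apache .htaccess file. This assumes the site runs on Apache with mod_rewrite enabled and that every old URL simply loses the /products/ segment (www.url.co.uk is the placeholder from the question):

```apache
RewriteEngine On
# Hypothetical rule: 301 /products/category/subcategory.html
# to /category/subcategory.html, i.e. drop only the /products/ prefix
RewriteRule ^products/(.+)$ /$1 [R=301,L]
```
-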
AJAX and High Number Of URLS Indexed
I recently took over as the SEO for a large ecommerce site. Every month or so our Webmaster Tools account is hit with a warning for a high number of URLs. In each message they send there is a sample of problematic URLs. 98% of each sample is not an actual URL on our site but an AJAX request URL that users are making. This is a server-side request, so the URL does not change when users make narrowing selections for items like size, color, etc. Here is an example of what one of those looks like: Tire?0-1.IBehaviorListener.0-border-border_body-VehicleFilter-VehicleSelectPanel-VehicleAttrsForm-Makes
We have over 3 million indexed URLs according to Google because of this. We are not submitting these URLs in our sitemaps; Googlebot is making lots of AJAX selections according to our server data. I have used the URL parameter handling tool to target some of the parameters that were set to "Let Google decide" and changed them to "No URLs", so that URLs with those parameters are not indexed. I still need more time to see how effective that will be, but it does seem to have slowed the number of URLs being indexed. Other notes:
1. Overall traffic to the site has been steady and even increasing.
2. Googlebot crawls an average of 241,000 URLs each day according to our crawl stats. We are a large ecommerce site that sells parts, accessories and apparel in the powersports industry.
3. We are using the Wicket framework for our website.
Thanks for your time.
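If the aim is simply to keep those Wicket AJAX request URLs out of the crawl, a robots.txt sketch along these lines might help. This assumes every such request contains IBehaviorListener in the query string and that nothing you actually want crawled matches that pattern:

```
User-agent: *
# Hypothetical pattern for Wicket AJAX listener URLs such as
# Tire?0-1.IBehaviorListener.0-border-border_body-...
Disallow: /*IBehaviorListener
```

It would be worth checking a few real URLs against the rule (for example in Search Console's robots.txt tester) before relying on it.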
Technical SEO | RMATVMC
-
Second URL
Hi, we have a .com and a .co.uk. The main website is the .co.uk; we also have a landing page for the .com. If we redirect the .com to the .co.uk, will it create duplicate content? It may seem like a silly question, but I want to be sure that visitors can't access our website at both URLs, as that would be duplicate content. Thanks in advance, John
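A sketch of what that redirect could look like if the .com sits on an Apache server you control (example.com and example.co.uk are placeholders for the real domains):

```apache
# Hypothetical .htaccess on the .com: send every request to the .co.uk with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]
```

With a 301 in place, visitors and crawlers can no longer load the pages on both hostnames, so the two addresses shouldn't be treated as duplicates.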
Technical SEO | Johnny4B
-
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
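For reference, the sitemap side of it is just swapping the entry so only the new address is listed once the 301 is live (placeholder URL below); the value transfer itself is generally understood to be driven by Googlebot following the redirect rather than by the sitemap:

```xml
<!-- Sketch with a placeholder URL: after the 301 is in place, only the new URL stays in the sitemap -->
<url>
  <loc>https://www.example.com/new-page/</loc>
</url>
```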
Technical SEO | RyanOD
-
Block url with dynamic text in
I've just run a report and I have a lot of duplicate page titles, most of which seem to be the review pages. I use Magento, and my normal URL would be something like blah-blahtext.html, but the review URL is something like blah-blahtext/reviews/category/categoryname. So I want to block the /reviews URL part, as no one ever leaves reviews and it's not something I will be using in the future. Also, I have a dynamic navigation which creates URLs that look like product-name.html?size=2&colour=14; these are also creating duplicate URLs. Any way to fix this? While I'm asking, does anyone have any tips for Magento?
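A robots.txt sketch for the patterns described above; this assumes the review URLs always contain a /reviews/ segment and that the size parameter always comes first in the query string (adjust the patterns if not):

```
User-agent: *
# Hypothetical pattern for review pages like blah-blahtext/reviews/category/categoryname
Disallow: /*/reviews/
# Hypothetical pattern for layered-navigation URLs like product-name.html?size=2&colour=14
Disallow: /*?size=
```

For the parameter URLs, a canonical tag pointing back to product-name.html is another common option.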
Technical SEO | Beermonster
-
Someone is redirecting their url to mine
Hello, I have just discovered that a company in Poland, www.realpilot.pl, is directing their domain to ours, www.transair.co.uk. We have not authorised this, nor do we want it. I have contacted the company and the webmaster to get it removed. If you search for the domain name www.realpilot.pl, we (www.transair.co.uk) come up top. My biggest worry is that we will get penalised by Google for this redirect, as it appears to be done using some kind of frame. Does anyone know anything about this kind of thing? Many thanks, Rob Martin
Technical SEO | brightonseorob
-
Do links count if they have no href parameter?
A SEOmoz report indicates that we have a large number of links on our pages, mainly due to an embedded mega drop-down and lots of product display options that are user-activated but otherwise hidden. Most of the links have the parameter href="#", because the links are used in combination with jQuery to trigger actions. It is still possible to trigger the actions without the href parameter, so the question is: do links without href parameters count towards the total number of links on the page, since a link without an href parameter is actually an internal page link? Our site (this version of the site has not had empty tags removed): http://emilea.be/
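For illustration only (hypothetical markup, assuming jQuery is loaded), this is the kind of difference being described; the handler works the same whether or not the anchor carries an href:

```html
<!-- Variant 1: anchor used purely as a jQuery trigger -->
<a href="#" class="size-option">Size 42</a>

<!-- Variant 2: the same anchor with the href attribute dropped -->
<a class="size-option">Size 42</a>

<script>
  // Identical jQuery handler for both variants
  $('.size-option').on('click', function (e) {
    e.preventDefault();
    // reveal the hidden product display options here
  });
</script>
```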
Technical SEO | Webxtrakt