Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
New theme adds ?v=1d20b5ff1ee9 to all URLs as part of caching. How does this affect SEO?
-
The new theme I am working on adds ?v=1d20b5ff1ee9 to every URL. The theme developer says it's a server setting issue. GoDaddy support says it's part of caching and is becoming prevalent in new themes.
How does this impact SEO?
-
Thanks!
I turned off "Geolocate (with page caching support)" and, as you said, it corrected the problem.
Thanks again.
Bob
-
Hi Bob,
I second Paul. His answer is a good one. Hope we helped you.
Sincerely,
Dana
-
Just FYI - the advice to remove query strings from static resources in that WordPress article is the proverbial Very Bad Idea. If you want a full explanation, let me know, but trust me - don't.
There's a world of difference between static files like CSS and JavaScript carrying version variables and having those variables on page URLs.
You should have self-referential canonical tags on every page on your site anyway, which would take care of the duplicate URL issue created by the variables added to each URL, but there are still many other reasons why they're bad for SEO and usability, as Dana points out.
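For anyone unfamiliar with the term, a self-referential canonical tag is a single line in the page's `<head>` that points back at the clean version of that page's own URL (the example.com address below is hypothetical):

```html
<!-- In the <head> of https://example.com/shop/widgets/ -->
<!-- Tells search engines to consolidate ranking signals on the clean URL, -->
<!-- even when the page is reached as /shop/widgets/?v=1d20b5ff1ee9 -->
<link rel="canonical" href="https://example.com/shop/widgets/" />
```

Most WordPress SEO plugins output this tag automatically, which is why it usually mops up the duplicate-URL issue without extra work.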
Paul
-
You have a configuration choice in your WooCommerce settings that is causing this, Bob.
You've got the default customer location in settings set to "Geolocate (with page caching support)". This causes the variable to be added to the URL in order to enable the geo-location for each customer. Turn it off and the variable will no longer be added.
And yes, this is a disaster for SEO, as Dana explains. It will also badly foul your Analytics data, and it even borks your site's internal search.
Hope that makes sense?
Paul
-
Hi again Bob,
Take a look at this thread on how to remove query strings from static resources... I believe your answer is there.
https://wordpress.org/support/topic/how-to-remove-query-strings-from-static-resources
Dana
P.S. Why is this a problem for SEO? A couple of reasons:
1. It's highly likely your content will get shared both with and without the query parameter. This will effectively split your link equity between two versions of the same page.
2. Google Search Console is very bad at understanding that the page without the query string is the same as the page with it, so you'll likely get a lot of duplicate content notifications.
3. From an end-user standpoint, it's just plain ugly... and end-user experience matters to SEO, right? I understand that's somewhat facetious, but it's your business, right? You want it to look like a good, solid, high-quality, professional site. Ugly query parameters scream "I hired my 21-year-old nephew to build me a WordPress site."
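To make point 1 concrete: to a crawler, the two variants are simply different URL strings, even though they serve the same page. A quick sketch (hypothetical example.com URLs) of how stripping the cache parameter collapses both variants to one canonical URL:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url, strip_params=("v",)):
    """Return the URL with cache-busting query parameters removed.

    `strip_params` lists the parameter names to drop; "v" matches the
    ?v=1d20b5ff1ee9 variable described in this thread.
    """
    parts = urlsplit(url)
    # Keep every query parameter except the cache-buster(s).
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in strip_params]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment)
    )

# Both variants of the same page collapse to one canonical URL:
print(canonical_url("https://example.com/shop/?v=1d20b5ff1ee9"))
# https://example.com/shop/
print(canonical_url("https://example.com/shop/?page=2&v=1d20b5ff1ee9"))
# https://example.com/shop/?page=2
```

This is exactly the consolidation a self-referential canonical tag asks search engines to do for you, so links pointing at either variant count toward the same page.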
-
Hi Bob,
What CMS are you working with? Once you answer that I might be able to help a little more.
Dana