Does Google Read URLs if They Include a # Tag? Re: SEO Value of Clean URLs
-
An ECWID rep, responding to an inquiry about why ECWID URLs are not customizable, stated that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
Basically, all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
That is a snippet from a conversation in which ECWID said that dirty URLs don't matter beyond the hash...
Is that true? I haven't found any rule saying that Google or other search engines (Google is really the most important one) don't index, read, or place value on the part of a URL after the #.
-
Thanks, Sachin.
So, basically, on sites that use ECWID for their ecommerce, only the main pages of the actual website get indexed, and not the product pages that ECWID generates (the part from the hash onward)?
Essentially, Google is NOT indexing any products, because ECWID uses an existing page on the website and shows the products there.
Is that correct? For example, if you look at the XML sitemap for the running boards site we used as an example, you will see there are only 10 pages on it. However, there are over 1,000 different types of running boards sold on the site, each with its own page that populates after the # in the URL: http://www.runningboards4less.com/index.php?option=com_xmap&view=xml&tmpl=component&id=1
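For anyone who wants to verify that count, here is a quick sketch that lists every URL entry in that sitemap; it assumes the sitemap is still live and uses the standard sitemaps.org XML format:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = ("http://www.runningboards4less.com/index.php"
               "?option=com_xmap&view=xml&tmpl=component&id=1")

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# Sitemap entries live in the standard sitemaps.org namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
locs = [loc.text for loc in tree.getroot().findall("sm:url/sm:loc", ns)]

print(len(locs), "URLs in the sitemap")
for loc in locs:
    print(loc)
```

None of the product URLs (the #! part) can appear in a sitemap like this, because a sitemap entry is a full URL and the fragment is not part of what the server serves.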
-
Traditionally, search engines ignore everything after the hash because it usually points to content contained on the same page or URL. Therefore, those additional URLs should not get indexed (only the part before the hash should). In my experience, they completely disregard anything after the # in a URL.
However, it is always advisable to use clean URLs, as both search engines and people prefer them over complicated ones. Clean URLs improve usability, making your URLs easier to remember and share. Another benefit of a simple URL is that other sites are more likely to link to it, because doing so is easier.
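To illustrate why the fragment is invisible to the server (browsers never send it in the request), here is a minimal sketch using Python's standard library with the example URL from the question:

```python
from urllib.parse import urlsplit

# The ECWID-style URL from the question above
url = ("http://www.runningboards4less.com/general-motors"
       "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891")

parts = urlsplit(url)

# A browser only sends scheme + host + path (+ query) to the server;
# the fragment stays on the client and is handled by JavaScript.
print(parts.netloc + parts.path)  # www.runningboards4less.com/general-motors
print(parts.fragment)             # !/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
```

Everything after the # lives only in that fragment, which is exactly the part that search engines traditionally never request as a separate URL.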
-
Anyone? Bueller? Bueller?
Also, if anyone knows how to modify ECWID URLs so that they are "clean", please chime in...
-
Thank you for your response. I am not implying that it is indexing a "separate" URL. I am referring to the SEO value of a proper "clean" URL for the specific page. ECWID doesn't allow its users to create custom URLs.
If I were creating a URL for the page I listed above, I would make it something like ****.com/chevy-van, NOT .com/#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
My question concerns the low value, or complete lack of value, of a URL like the long one above, and whether the statement made by the ECWID rep is factual.
-
These are called AJAX URLs: URLs containing a hash fragment, e.g., www.example.com/index.html#mystate, where #mystate is the hash fragment.
Regarding the above-mentioned URL: it uses a hash-bang (#!), not a plain hash, which is what makes AJAX/JavaScript pages crawlable. A basic # only indicates a location on a page (an anchor), so it does not get indexed as a separate URL.
You can find detailed information here: https://support.google.com/webmasters/answer/174992?hl=en
https://support.google.com/webmasters/answer/174993
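To make the hash vs. hash-bang distinction concrete, below is a rough sketch of the translation step described in those Google docs: a crawler supporting the AJAX crawling scheme rewrites a #! URL into an _escaped_fragment_ request, which the server is expected to answer with an HTML snapshot. (Google has since deprecated this scheme.) The helper function and percent-encoding choice here are illustrative assumptions, not ECWID's or Google's actual code:

```python
from urllib.parse import quote, urlsplit

def escaped_fragment_url(url: str) -> str:
    """Sketch of how an AJAX-crawling crawler maps a #! (hash-bang) URL
    to the _escaped_fragment_ URL it requests from the server instead."""
    parts = urlsplit(url)
    base = url.split("#", 1)[0]
    if not parts.fragment.startswith("!"):
        # A plain #anchor carries no crawlable state; the crawler ignores it.
        return base
    state = quote(parts.fragment[1:], safe="")  # percent-encode the state (assumed encoding)
    separator = "&" if parts.query else "?"
    return f"{base}{separator}_escaped_fragment_={state}"

print(escaped_fragment_url(
    "http://www.runningboards4less.com/general-motors"
    "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891"
))
# http://www.runningboards4less.com/general-motors?_escaped_fragment_=%2FClassic-Pro-Series...
```

If the server cannot answer such requests with a rendered snapshot of the product page, the state behind the #! stays effectively invisible to the crawler.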
Hope this helps!